The first place the CHP stored the car was in a garage that promptly burned down, with only the wreckage of the old car left standing. Chalking this up to bad luck, the CHP continued to use the car, taking it to high schools as a visual aid for the dangers of reckless driving. En route to one school, the car broke loose from the truck hauling it and crashed into another vehicle, causing a fatal accident. Undeterred by these bad omens, the CHP took the car to another school, where it fell on a student, breaking their hip. In total, the Spyder fell off the trailer that carried it three times, once crushing a truck driver. Not only did the car give law enforcement trouble, but it also made life difficult for criminals. Two thieves tried to steal the bloodstained seats and steering wheel from the wreck. Instead of memorabilia, all they got were injuries.2
The mystery of James Dean’s Spyder, which went missing shortly after its supposed reign of terror, returned to the headlines in 2015 when forty-seven-year-old Shawn Reilly revealed that as a child he had helped his father hide a sports car behind a wall. During the task, Shawn sustained a scar on his thumb that remains to this day. The Volo Auto Museum in Illinois gave Reilly a lie-detector test to substantiate his claim that he and his father, along with George Barris, hid the Spyder.
He passed the test.
Whether or not cars can be possessed or cursed, they are undoubtedly an integral part of our lives. As technology rockets ahead, self-driving cars have become more than just fiction. When Stephen King wrote
In 1985, researchers determined that 57 percent of traffic crashes were due to human error, while only 2 percent were due to the vehicle alone.
The US Department of Transportation first released guidelines in 2016 to give direction to developers of self-driving technology, and has updated them three times since. The guidelines emphasize safety, tech neutrality, and regulatory consistency, and they encourage companies to submit information about their approaches to safety. But they didn’t create any rules around testing self-driving cars. Today, states oversee testing and have taken wildly different approaches. California, for example, requires developers to obtain a license to test and to annually submit detailed information on their testing activities, which is released to the public. (Many in the industry argue that the data collected by the state doesn’t provide a clear or accurate picture of their activities.) Arizona, by contrast, only requires companies to submit information about how their vehicles will interact with law enforcement and emergency services, and to guarantee in writing that their vehicles are autonomous. After that, Arizona does not monitor the vehicles, though its governor reserves the right to revoke any company’s ability to operate if something goes wrong. Governor Doug Ducey did just that in 2018, kicking Uber’s self-driving operations out of the state after a testing vehicle struck and killed a woman on foot.3
Forty-nine-year-old Elaine Herzberg died when she was struck by a self-driving Uber with a human operator in the driver’s seat. The National Transportation Safety Board attributed her tragic death to the vehicle’s inability “to classify an object as a pedestrian unless that object was near a crosswalk.”4 Because Herzberg was jaywalking, the system couldn’t recognize her as a person; the Uber concluded she was a bike or an “other” and braked only one second before hitting her. Unfortunately, the operator inside the vehicle was not alerted by the system, and was streaming TV on her phone rather than watching the road. This mix of human and technological error cost Elaine Herzberg her life. Because of this accident, Uber has programmed its more recent cars to recognize jaywalking humans.
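The flaw the NTSB described can be sketched as a classification rule that conditions the “pedestrian” label on location. The snippet below is a deliberately simplified, hypothetical illustration of that kind of rule and its fix; none of the names or logic come from Uber’s actual software.

```python
# Hypothetical sketch of the flaw the NTSB described: a perception rule
# that only labels a person-shaped object "pedestrian" near a crosswalk.
# Purely illustrative; not Uber's actual code.

def classify_flawed(shape: str, near_crosswalk: bool) -> str:
    """Flawed rule: a person away from a crosswalk falls through to
    weaker guesses such as 'bicycle' or 'other'."""
    if shape == "person" and near_crosswalk:
        return "pedestrian"
    if shape == "bicycle":
        return "bicycle"
    return "other"

def classify_fixed(shape: str, near_crosswalk: bool) -> str:
    """Corrected rule: a person is a pedestrian regardless of where
    they are crossing."""
    if shape == "person":
        return "pedestrian"
    if shape == "bicycle":
        return "bicycle"
    return "other"

# A jaywalker is person-shaped but not near a crosswalk:
print(classify_flawed("person", near_crosswalk=False))  # other
print(classify_fixed("person", near_crosswalk=False))   # pedestrian
```

The point of the sketch is that the failure was not in detecting the object at all, but in a brittle assumption baked into the labeling logic, which in turn delayed the decision to brake.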