A long piece in Wired contains an interview with Rafaela Vasquez, the tragic figure who was behind the wheel when Elaine Herzberg became the first bystander to be killed by a self-driving car. Self-driving car developers, including Uber ATG, have been unwilling to acknowledge the true effects of the Herzberg crash in Tempe, but it has transformed the industry’s understanding of what is at stake in testing on public roads. Safety, neglected for years, has since 2018 been forced to the front of innovators’ minds.
I’ve written before about the lessons from the crash and the difficulty of ascribing blame. Any crash is the product of multiple causes. We all, but especially self-driving car companies, tend to blame human error. So even though Rafaela Vasquez clearly did things wrong, she has become, to use Madeleine Elish’s phrase, a ‘moral crumple zone’ for a wider system. The Wired piece goes some way towards rehabilitating her, and in doing so, reveals some important new details.
First, there are insights about the sort of hidden labour on which tech companies routinely depend. Vasquez had done a range of tech work…
“moderating grisly posts on Facebook, she says; tweeting about Dancing With the Stars from ABC’s Twitter; policing social media for Wingstop and Walmart.”
… before taking her ghost work to Uber’s ‘Ghost Town’. She and her colleagues were given training, but the well-known hazards of automation complacency received little attention. Uber’s aim was to “crush miles”: the company boasted of running 84,000 miles per week, often repeating the same loop, emphasising quantity over quality to impress management and investors. Other safety drivers had been caught looking at their mobile phones while behind the wheel, but Uber prioritised crushing miles and saving money over safety culture. Just before the crash, it had reduced the number of people in the car from two to one as the automation improved.
The blame game afterwards has been unedifying:
“You can’t put the blame on just that one person,” says the Pittsburgh manager. “I mean, it’s absurd.” Uber “had to know this would happen. We get distracted in regular driving,” the manager says.
Another insider told the reporter that the company was
“very clever about liability as opposed to being smart about responsibility.”
Once the lawyers got involved, the opportunity for real learning was cut off. Both the company and Arizona’s government, which had been so desperate to lure the company to Phoenix and was so shocked, SHOCKED! that the company could misbehave, have been hit by subsequent lawsuits, but Vasquez remains a soft target.
The Wired piece suffers, as much of Wired’s reporting does, from a breathless need to emphasise the inevitability and desirability of tech. Vasquez was clearly excited by and supportive of a self-driving future, but the framing of her story is depressing. The piece concludes:
“To reach that purported future, we must first weather the era we’re in now: when tech is a student driver… And inevitably, as experts have always warned, that means crashes”
To call crashes inevitable is a different sort of tragedy: a fatalistic technological determinism that will jeopardise future innovation. Elaine Herzberg’s death was the result of choices that are becoming increasingly clear. Things could have gone differently, and, in the light of what we now know, they should go differently in future.