The world’s first robot car death was the result of human error — and it can happen again

The damage from the Uber crash will have far-reaching consequences

On November 19th, the National Transportation Safety Board (NTSB) released the results of its investigation into the 2018 fatal Uber crash in Tempe, Arizona, widely believed to be the world’s first death caused by a self-driving car.

But rather than slap the cuffs on Uber’s robot car, investigators instead highlighted the many human errors that culminated in the death of 49-year-old Elaine Herzberg. And they sounded a warning: it could happen again.

“If your company tests automated driving systems on public roads, this crash, it was about you,” NTSB chair Robert Sumwalt said in his opening statement at yesterday’s hearing.

When the board read aloud its findings on the probable cause of the Tempe crash, the first person to be blamed was Rafaela Vasquez, the safety driver in the vehicle. Vasquez was never called out by name, but her failures as a watchdog for the automated driving system were put on stark display by the NTSB.

The safety driver was never called out by name, but her failures were put on stark display

In the minutes before impact, Vasquez was reportedly streaming an episode of The Voice on her phone, in violation of Uber’s policy banning phone use. In fact, investigators determined that she had been glancing down at her phone, and away from the road, for over a third of the total time she had been in the car up until the moment of the crash.

Vasquez’s failure “to monitor the driving environment and the operation of the automated driving system because she was visually distracted throughout the trip by her personal cell phone” was cited as the primary cause of the crash. But she shares the blame with her employers at Uber, where a woefully inadequate safety culture also contributed to Herzberg’s death, the board said. The federal government likewise bore its share of responsibility for failing to better regulate autonomous car operations.

[Image: Uber self-driving crash screen capture. Credit: ABC 15]

“In my opinion they’ve put technology advancement here before saving lives,” NTSB board member Jennifer Homendy said of the National Highway Traffic Safety Administration (NHTSA), which is responsible for regulating vehicle safety standards.

At the time of the crash, Uber’s Advanced Technologies Group had no corporate safety plan or guiding document that identified individual employee roles and responsibilities to manage safety, said Michael Fox, senior highway accident investigator at the NTSB. The company also lacked a safety division and did not have a dedicated safety manager responsible for risk assessment and mitigation. In the weeks before the crash, Uber made the fateful decision to reduce the number of safety drivers in each vehicle from two to one. That decision removed important redundancy that could have helped prevent Herzberg’s death.

Not only was Vasquez alone in the car at the time, but her complacent attitude toward the car’s automated driving system also contributed to the crash. And that complacency was badly misplaced. The car detected Herzberg crossing the street with her bicycle 5.6 seconds before impact. But even though the system continued to track her right up until the crash, it never correctly identified her as a human being on the road, nor did it accurately predict her path.

“Automation complacency... needs to be in everyone’s vocabulary.”

“Automation complacency... needs to be in everyone’s vocabulary,” said Bruce Landsberg, an NTSB board member.

Vasquez was completely unaware of this failure until it was too late. One of the implicit lessons of the Uber crash and the subsequent NTSB investigation is that safety drivers are underutilized, said Mary “Missy” Cummings, director of the Humans and Autonomy Lab at Duke University. Uber and other companies that test self-driving cars typically hire independent contractors as safety drivers to ride around in the cars and generate miles. They are seen as little more than bodies in seats. Instead, they should be treated as critical partners in a testing protocol who can give very useful feedback, Cummings said.

“Of course, this would cost money,” she said. “Despite everyone’s lip service that safety is paramount, no one that I know of is supporting safety drivers in this way.”

Uber’s aggressive corporate culture — its drive to test autonomous vehicles even though the technology wasn’t ready — has been exposed not only through this investigation, but also through the lawsuit brought by Waymo, the self-driving company spun out of Google, which accused Uber of stealing its self-driving trade secrets.

[Photo: Uber]

“The driver had one chance to save her life,” one former Uber ATG employee who was with the company at the time of the crash told The Verge, “but Uber had dozens.”

Uber ATG was under enormous pressure to show results to the new CEO, Dara Khosrowshahi, who was reportedly considering shutting the division down due to mounting R&D costs. That pressure led the division to cut corners, though the company has since made significant strides to address those mistakes.

The NTSB’s board members saved their most blistering assessments for the federal government. Homendy blamed the NHTSA for prioritizing technological advancement over saving lives, and she called the agency’s voluntary guidance so “lax” as to be “laughable.”

The agency’s voluntary guidance is so “lax” as to be “laughable”

The voluntary safety guidelines were first established under President Obama, who feared that restrictive rules governing the testing of self-driving cars could stifle innovation. Those rules have been made even more lax under President Trump, who went further by eliminating an all-star federal committee on vehicle automation that was meant to serve as a “critical resource” for the Department of Transportation. Trump axed the committee without even telling some of its members, The Verge recently reported.

So far, only 16 companies have submitted voluntary safety reports to the NHTSA, many of which amount to little more than “marketing brochures,” said Ensar Becic, project manager and human performance investigator in the NTSB’s Office of Highway Safety. That represents only a fraction of the more than 60 companies testing self-driving cars in California alone.

“I mean you might as well say we would like your assessments, but we’re really not requiring it,” Homendy said during the hearing. “So why do it?”

The Department of Transportation has issued three versions of its automated vehicle safety guidance, and it plans to issue a fourth version that incorporates lessons learned from the Tempe crash, said Joel Szabat, acting under secretary of policy at the Department of Transportation, during a Senate hearing on November 20th. (The original document is called “Automated Driving Systems: A Vision for Safety.” “They should rename it ‘A Vision for Lack of Safety,’” Homendy quipped.)

Today, there are no federal laws requiring AV operators to demonstrate the safety of their vehicles before testing them on public roads, or to provide data on disengagements or failures in their automated driving systems. The government’s role is only reactive: ordering a recall if a part is defective or launching an investigation in the event of a crash.

“They should rename it ‘A Vision for Lack of Safety.’”

In its final report, the NTSB recommends changing that. AV operators should be required, not encouraged, to submit safety assessments if they want to test their vehicles on public roads, it argues, and there needs to be a process for ongoing evaluation to determine whether AV operators are sticking to their safety goals.

But the day after the report was released, the NHTSA’s top administrator testified at a Senate hearing that Congress should pass a law to speed the deployment of fully self-driving cars that lack traditional controls like steering wheels and pedals. Currently, the agency is only allowed to exempt a total of 25,000 vehicles from federal motor vehicle safety standards a year.

“As we’re hearing from industry, that cap may be too small,” said NHTSA acting administrator James Owens.

The previous attempt to lift the restrictions on cars without human controls flopped. Democrats in the Senate blocked the bill, citing inadequate measures to ensure safety. A second attempt is in the works, but it remains to be seen whether it can muster enough votes to pass.

“Just because Uber and their operator then behaved badly, everybody else should not be penalized.”

Overly restrictive federal regulations at this stage of a rapidly changing technology will very likely cause significantly more harm than good, said Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University. “Just because Uber and their operator then behaved badly, everybody else should not be penalized,” he said.

Autonomous vehicles are supposed to save lives, not take them. And if people don’t trust the companies that are building the technology, then the life-saving potential of self-driving cars will never materialize. The cars will roam the streets, empty and underutilized, until the operators pull the plug.

Polls show that half of US adults think automated vehicles are more dangerous than traditional vehicles operated by people. Opinions are already hardening, and it’s unclear what can be done to undo the damage the Uber crash has already caused.