
Tesla Autopilot, distracted driving to blame in deadly 2018 crash

A damaged crash attenuator also contributed to the driver’s death

Illustration by Alex Castro / The Verge

The National Transportation Safety Board said Tuesday that Tesla’s Autopilot driver assistance system was one of the probable causes of a fatal 2018 crash into a concrete barrier. The safety board also said the driver was playing a mobile game while using Autopilot before the crash, and investigators determined he was overly confident in Autopilot’s capabilities.

The safety board arrived at those probable causes after a nearly two-year investigation into the crash. NTSB investigators also named a number of contributing factors, including that the crash attenuator in front of the barrier was damaged and had not been repaired by California’s transportation department, Caltrans, in a timely manner. Had the crash attenuator been replaced, investigators said Tuesday, the driver, Walter Huang, likely would have survived.

The NTSB shared its findings at the end of a three-hour-long hearing on Tuesday. During the hearing, board members took issue with Tesla’s approach to mitigating the misuse of Autopilot, the National Highway Traffic Safety Administration’s lax approach to regulating partial automation technology, and Apple — Huang’s employer — for not having a distracted driving policy. (Huang was playing the mobile game on a company-issued iPhone.)

“We urge Tesla to continue to work on improving their Autopilot technology”

“In this crash we saw an over-reliance on technology, we saw distraction, we saw a lack of policy prohibiting cell phone use while driving, and we saw infrastructure failures, which, when combined, led to this tragic loss,” NTSB chairman Robert Sumwalt said at the end of the hearing on Tuesday. “We urge Tesla to continue to work on improving their Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken where necessary. It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars.”

The investigators’ findings

On March 23rd, 2018, Huang was traveling south on US-101 using Autopilot on his way to work in Mountain View, California. He eventually approached a section of the highway where State Route 85 begins by splitting off to the left of US-101. He was in the left-most HOV lane thanks to the clean air sticker that electric vehicle owners are eligible for.

As the exit lane for State Route 85 started to split off to the left, the Autopilot system in Huang’s Model X briefly lost sight of the lines marking his lane. (Investigators showed photos on Tuesday of the worn-down lane markers.) Autopilot started following the right-most lane marker of the exit lane, and Huang’s Model X steered into the “gore area” that separates the exit lane from the rest of the highway. A second or so later, his Model X smashed into the damaged crash attenuator and the concrete barrier. He died a few hours later at a local hospital.

Investigators recapped the crash during Tuesday’s hearing, presenting evidence that the NTSB made public last week. At the end of the presentation, and after a few hours of questions from the five members of the safety board, the team of investigators presented 23 findings, and made nine new safety recommendations, in addition to naming the probable causes.

One of the team’s findings was that the crash was made possible, in part, because of the limits of Autopilot’s vision-based processing system. Tesla CEO Elon Musk has long argued that autonomous cars don’t need LIDAR (a laser sensor that can build a real-time 3D model of the world), and so Autopilot is designed around a system of cameras, as well as ultrasonic sensors and a forward-facing radar. That reliance on cameras has limits, investigators said Tuesday, and the way Huang’s car drifted out of the HOV lane is an example of those limits. In fact, as the investigators found, Huang’s car had done this same dangerous maneuver multiple times in the days and weeks before his crash.

In addition, the investigators said that Tesla’s method of making sure drivers are paying attention while using Autopilot — using a torque sensor to measure force on the steering wheel — “did not provide an effective means of monitoring the driver’s level of engagement with the driving task.”

The NTSB says Tesla’s driver monitoring is inadequate

NTSB investigators also found that the Model X’s forward collision warning system didn’t alert Huang to the coming impact, nor did it slow the vehicle down at all; in fact, the Model X sped up before impact because the system thought it was free to resume the 75-mile-per-hour cruise control speed Huang had set. The NTSB said Tesla had not designed these emergency systems to handle a situation like this. It also placed some blame on the National Highway Traffic Safety Administration for not requiring companies like Tesla to make those systems work in a crash like this.

If Tesla doesn’t add new safeguards that limit the use of Autopilot outside of its advertised applications, investigators wrote, then the “risk for future crashes will remain.”

The investigators gave similar weight to distracted driving’s role in Huang’s death. They found that he was playing a mobile game prior to the crash, and said that was “likely” the reason why he didn’t try to turn away from the barrier. The team said that countermeasures, like limiting distracting features or locking out smartphones entirely, would help lower the rate of crashes tied to distracted driving.

Apple does offer a mode that disables many of the iPhone’s features while the user is driving, but one NTSB board member, Thomas Chapman, said he was “frankly unaware I had such an option on my phone.”

“And that, of course, is the point: it makes more sense to turn on such a feature as the default setting, to better ensure users are aware such an option exists,” Chapman said.

Making things worse was Huang’s apparent overconfidence in Autopilot’s capabilities, evidenced by his willingness to play a mobile game while operating the car. Tesla itself has said in the past that overconfidence is a risk for drivers using Autopilot, and it is one of the reasons the company constantly reminds owners in its cars’ manuals to pay close attention while using the driver assistance system.

Sumwalt was particularly stunned by Huang’s apparent overconfidence in Autopilot, though, especially after learning that he had dealt with Autopilot failing in that same area before.

“This kind of points out two things to me. These semi-autonomous vehicles can lead drivers to be complacent, highly complacent, about their systems. And it also points out that smartphones manipulating them can be so addictive that people aren’t going to put them down,” he said midway through Tuesday’s hearing.

Nine new recommendations

The new recommendations were issued to multiple parties, but not to Tesla — perhaps because the company has still not officially responded to the safety board’s recommendations from a 2017 investigation into a different Autopilot-related fatal crash, something Sumwalt criticized Tesla for on Tuesday.

Instead, the NTSB appears happy to try to influence the decisions of Tesla and other automakers by asking other government agencies to step in and regulate.

One member said NHTSA’s crash testing, which doesn’t examine features like Autopilot, is “pretty worthless”

The first four recommendations were aimed at NHTSA. The NTSB asked NHTSA to start testing automakers’ forward collision avoidance systems, and especially how they deal with “common obstacles” like “traffic safety hardware, cross-traffic vehicle profiles, and other applicable vehicle shapes or objects found in the highway operating environment.” The current version of NHTSA’s evaluation process, known as the New Car Assessment Program, doesn’t take any of this into account, which some board members were not happy about. (At one point, board member Michael Graham held up an NCAP evaluation of a car one of his staffers owns and said it was “pretty worthless.”)

The safety board asked NHTSA to start evaluating the limits of Autopilot, and to determine the likelihood that it will be misused. If safety defects are identified, they wrote, NHTSA should use its regulatory authority to make sure Tesla “takes corrective action.”

The NTSB also asked NHTSA to work with the Society of Automotive Engineers to draw up standards for driver monitoring systems that would “minimize driver disengagement, prevent automation complacency, and account for foreseeable misuse of the automation,” and to require that technology in all vehicles with Autopilot-like features.

On the distracted driving side, the NTSB recommended that Apple adopt a policy banning the nonemergency use of smartphones and tablets in company-owned vehicles, or while doing company work. And it asked the Occupational Safety and Health Administration to help increase employers’ awareness of the dangers of distracted driving, and to step in when employers don’t comply. The board also recommended that smartphone manufacturers develop better in-car “do not disturb” modes that “automatically disable any driver-distracting functions when a vehicle is in motion” (but allow the device to be used in an emergency), and make that the default setting.

Is anybody listening?

One of the pervasive themes at Tuesday’s meeting, and in the NTSB’s findings and recommendations, is that the safety board believes companies and other government agencies are basically exploiting its lack of regulatory authority.

The NTSB is an independent government agency, but it only has the authority to investigate and make recommendations. It’s still up to these other parties to act. And according to the NTSB, that’s not really happening.

Tesla still hasn’t formally responded to NTSB recommendations issued 881 days ago. NHTSA hasn’t properly responded to a previous NTSB recommendation to improve the New Car Assessment Program. And Caltrans has gone 1,044 days without formally responding to an NTSB recommendation about improving its crash attenuator replacement process.

“It is frankly disheartening.”

“I am disturbed that in this report we call out a number of recommendations that are very late in being responded to,” Sumwalt said at Tuesday’s hearing. “Again, this is how we effect change, through our recommendations. It is frankly disheartening.”

This exasperation was codified, too, as the NTSB on Tuesday “reiterated” five previous recommendations that various governing bodies and companies have not met. To Tesla, for instance, it once again asked that the company add safeguards limiting the use of Autopilot to the situations it was designed for. The board also re-recommended that Tesla develop a better driver monitoring system.

Sumwalt and the other board members held little back when expressing their frustration with NHTSA and Tesla throughout the hearing. They accused the former of basically abandoning its role as a regulatory body when it comes to systems like Autopilot (“I wanted to join expressing disappointment for the lack of leadership” from NHTSA, Chapman said at one point) and Tesla for how it’s acted in that regulatory vacuum. Sumwalt even took issue with the name Autopilot, saying he has “a personal belief that the term Autopilot may not be, from a safety point of view, the most well-branded name.”

But since the NTSB can’t force any of the changes it’s seeking, it’s unlikely that anything will change in the near term. And until it does, the safety board will probably have to keep investigating crashes like the one discussed on Tuesday.

In fact, when the NTSB released the documents in the Huang investigation last week, it also published documents related to its probe of another fatal crash involving Autopilot that happened last year. The full report is expected to be released in the next few weeks.