    Tesla crash involving Autopilot prompts federal investigation

    The driver says she was using Autopilot, but there is no evidence the system failed

    Illustration by Alex Castro / The Verge

    The National Highway Traffic Safety Administration is investigating a Tesla Model S crash that may have involved the electric car maker’s semi-autonomous Autopilot software, according to Reuters. The crash took place last week in Utah, and photos of the totaled Model S made headlines around the country, despite the driver walking away with only a broken ankle.

The driver, a 28-year-old woman who has not been identified, told police she was driving 60 mph at the time, had Autopilot engaged, and was looking at her phone with her hands off the steering wheel until just before the crash occurred. She rear-ended a fire truck at a red light, according to police. However, it is not yet clear whether the NHTSA is investigating the crash specifically because of Autopilot, as the agency's statement does not say why it is looking into the matter.

    There is no evidence Autopilot failed, and we don’t yet know why the NHTSA is investigating

    “The agency has launched its special crash investigations team to gather information on the South Jordan, Utah crash,” NHTSA said in a statement given to Reuters today. “NHTSA will take appropriate action based on its review.”

A report from the South Jordan Police Department in Utah details how the driver was not following standard Autopilot protocol: she removed her hands from the steering wheel more than a dozen times and was looking at her phone prior to the crash. So while there is no evidence Autopilot failed in any way here, NHTSA may be looking into the case to make a more definitive determination before the crash can conclusively be attributed to driver error. The Tesla Model S monitors drivers only by measuring resistance in the steering wheel, and the car will provide visual alerts if it detects that the driver's hands have been off the wheel for a certain amount of time.

Tesla declined to comment specifically on the NHTSA investigation. However, a company spokesperson said, “When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times. Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents.” Tesla did confirm that Autopilot was engaged when the driver in Utah rear-ended the fire truck.

Tesla CEO Elon Musk has been highly critical of the media coverage of the crash, saying journalists are unfairly focusing on Tesla crashes for sensationalist reasons and questioning why the numerous conventional road deaths that occur every day are not covered as vigorously. Musk did, however, admit in a follow-up tweet that Autopilot “certainly needs to be better & we work to improve it every day.”

    Musk has said in the past that Autopilot reduces the chances of a driver getting into an accident and as a result saves lives, and Tesla has reiterated the point in public statements even following fatal crashes. (Three people have died while using the feature, but never because of it.) There is no concrete data on the safety of Autopilot, so Tesla said earlier this month that it will start publishing quarterly reports to outline Autopilot’s performance as it pushes forward on its plan to release a fully self-driving system by next year. Musk has also often criticized Autopilot users who find themselves in accidents.

    “When there is a serious accident it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency,” Musk said on the analyst call in which he announced the quarterly Autopilot reports. “They just get too used to it. That tends to be more of an issue. It’s not a lack of understanding of what Autopilot can do. It’s [drivers] thinking they know more about Autopilot than they do.”

A report from The Wall Street Journal on Monday revealed that, while Autopilot may make drivers safer and crashes tend to happen only when its guidelines are not followed, Tesla rejected more advanced driver monitoring features on its cars that could prevent Autopilot from being used recklessly. Tesla executives, including Musk, reportedly rejected the features on the grounds that they might be too difficult or expensive to implement, might annoy drivers, or might not work as intended. The monitoring features were said to involve eye tracking using infrared sensors and cameras, as well as more vigorous alerts and sensors to keep a driver's hands on the steering wheel. Musk later said on Twitter that eye tracking was not used because it was ineffective.