Tesla crash involving Autopilot prompts federal investigation


The National Highway Traffic Safety Administration is investigating a Tesla Model S crash that may have involved the electric carmaker’s semi-autonomous Autopilot software, according to Reuters. The crash took place last week in Utah, and photos of the totaled Model S made headlines around the country, even though the driver walked away with only a broken ankle.

The driver, a 28-year-old woman who has not been identified, told police she was driving 60 mph at the time, had Autopilot engaged, and was looking at her phone with her hands off the steering wheel until just before the crash occurred. She rear-ended a fire truck at a red light, according to police. However, it is not yet clear whether NHTSA is investigating the crash specifically because of Autopilot, as the agency does not say in its statement why it’s looking into the matter.

“The agency has launched its special crash investigations team to gather information on the South Jordan, Utah crash,” NHTSA said in a statement given to Reuters today. “NHTSA will take appropriate action based on its review.”

A report from the South Jordan Police Department in Utah details how the driver was not following standard Autopilot protocol, including removing her hands from the steering wheel more than a dozen times and looking at her phone prior to the crash. So while there is no evidence that Autopilot failed in any way here, NHTSA may be looking into the case to make a more definitive determination before anyone can conclusively say the driver was at fault. The Tesla Model S monitors drivers only by measuring resistance on the steering wheel, and the car will provide visual alerts if it detects that the driver’s hands have been off the wheel for a certain amount of time.

Tesla declined to comment specifically on the NHTSA investigation. However, a company spokesperson said, “When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times. Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents.” Tesla did confirm that Autopilot was engaged when the driver in Utah rear-ended the fire truck.

Tesla CEO Elon Musk has been highly critical of the media coverage of the crash, saying journalists are unfairly focusing on Tesla crashes for sensationalist reasons and questioning why the numerous conventional road deaths that occur every day are not covered as vigorously. Musk did, however, admit in a follow-up tweet that Autopilot “certainly needs to be better & we work to improve it every day.”

Musk has said in the past that Autopilot reduces the chances of a driver getting into an accident and as a result saves lives, and Tesla has reiterated the point in public statements even following fatal crashes. (Three people have died while using the feature, but never because of it.) There is no concrete data on the safety of Autopilot, so Tesla said earlier this month that it will start publishing quarterly reports to outline Autopilot’s performance as it pushes forward on its plan to release a fully self-driving system by next year. Musk has also often criticized Autopilot users who find themselves in accidents.

“When there is a serious accident it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency,” Musk said on the analyst call in which he announced the quarterly Autopilot reports. “They just get too used to it. That tends to be more of an issue. It’s not a lack of understanding of what Autopilot can do. It’s [drivers] thinking they know more about Autopilot than they do.”

A report from The Wall Street Journal on Monday revealed that, while Autopilot may make drivers safer and crashes tend to happen only when its guidelines are not followed, Tesla rejected more advanced driver monitoring features that could prevent Autopilot from being used recklessly. Tesla executives, including Musk, turned down the features on the grounds that they might be too difficult or expensive to implement, might annoy drivers, or might not work as intended. These monitoring features were said to involve eye tracking using infrared sensors and cameras, as well as more vigorous alerts and sensors to keep a driver’s hands on the steering wheel. Musk later said on Twitter that eye tracking was not used because it was ineffective.

Comments

Wait, so Utah has roads with red lights, where you can drive 60?

I’m really over Musk’s smug attitude.

There are multiple roads in Utah with speed limits of 55 or 60 with stoplights. Granted, this being Utah, that doesn’t stop people from going 70+.

At first, I was about to say that they really need to do something. But then I read this:

The driver, a 28-year-old woman who has not been identified, told police she was driving 60 mph at the time, had Autopilot engaged, and was looking at her phone with her hands off the steering wheel until just before the crash occurred.

Autopilot or not, your phone should not be in your hand while driving. You should always be paying attention.

Owners should have to take IQ tests; it seems the majority of them are idiots.

Autopilot or not, your phone should not be in your hand while driving. You should always be paying attention.

I agree but I think if you give people a system to take over the driving an inevitable outcome is they’re not always going to pay attention.

This is my issue with the state of "auto pilot" right now. I don’t think I’ll use any of these half measures. Either fully autonomous or nothing.

Side note, my buddy just hit a car trying to use his "auto parallel park" in his new VW.

I don’t think I’ll use any of these half measures

Nobody is forcing you to NOT pay attention. You could have autopilot plus pay attention as designed. That’ll only increase safety.

Elon Musk is right: Autopilot is simply cruise control 2.0. Right now there are those who abuse it, yet Tesla gets blamed for their negligence?! No excuses if the driver doesn’t heed road safety laws because they think Autopilot means "it’ll drive itself".

But if you think Tesla is in the wrong, then Apple, Google, and Samsung should be held responsible for texting while driving deaths since they make smartphones.

Tesla should consider changing the name of its technology. "Autopilot" suggests that it is capable of more than it actually does. (The driver is of course still at fault here.)

This has been discussed already and confirmed by actual… aircraft pilots – the use of the ‘Autopilot’ word here is 100% relevant. Just look up in the dictionary what an autopilot is.

Indeed, autopilot in an aircraft just means ‘flying in a straight line’, everybody should keep that in mind.

However that is not what the general public thinks autopilot means – and that’s the thing that matters (the public’s perception of that name).

That said, Tesla needs to design for the LCD (lowest common denominator) user in this case, or they’ll just keep getting hounded with this bad PR (possibly pushed along by fossil fuel or other auto mfr interests), which is starting to snowball into a perception problem they don’t need.

Agreed. When I think of an airplane flying on autopilot I imagine the pilot and co-pilot asleep in the cockpit. Then a flight attendant wakes them up just before landing so they can look like they were in control the whole time as they wave goodbye to the deplaning passengers.

aircraft pilots – the use of the ‘Autopilot’ word here is 100% relevant

That would be fine if Tesla were selling Autopilot to trained pilots. However, they’re selling it to general consumers, and general consumers think autopilot = drives automatically.

Doesn’t help that they state in big bold letters that it has full self-driving hardware right at the top of their website.

Doesn’t help people don’t look up the word in dictionaries either.

It’s hilarious: SF-homage maker Musk’s fancy cars foiled by the science fiction definition of autopilot…

I have said this multiple times, but zero (ZERO) Tesla owners are confused about what AUTOPILOT means and its limitations. You realize that within five minutes of taking possession of the car and engaging Autopilot. The only people confused about the capability of Autopilot are those who’ve never used it.

Imagine I sold you a widget called "Automobile". How long would you be confused about whether it makes you "automatically" "mobile" or human interaction is required?

Whatever happened to personal responsibility? Sorry, but if you think your car can drive itself because it has a feature called Autopilot, then you are stupid. There are no self-driving cars on the market, and you should actually know what you’re buying if you spend $60,000.

Nothing happened to personal responsibility. Tesla drivers are still held responsible in an accident even if Autopilot was engaged.

This is not an aircraft. An aircraft is surrounded by air (void). Can you tell the difference?

Ok, now let’s focus on what the general consumer thinks autopilot does. That is 100% relevant in this case.

Yeah, because every potential Tesla owner should know exactly what autopilot means for aircraft, or they should look up in the dictionary every word and buzzword they find in marketing materials. /s

This is an overused and terrible argument: aircraft pilots spend hundreds of hours in training to deal with autopilot failures and make sure they don’t over-rely on it. If you mandated the same training for people using Tesla’s "autopilot", do you think there’d still be any buyers?

Ducky is right.

Which is what makes all of this so annoying.

Drivers: Keep your hands on the wheel, and your own vehicle under control. That’s your responsibility at all times, no matter what. It’s your own fault if you crash because you stopped doing those things.

Tesla: stop pretending that your cars are possessed by some magic computer-god. It’s your fault that people are crashing.

It’s your fault that people are crashing.

Well… no, it’s not their fault, that’s the point.

No, it’s stupid people’s fault for crashing.
