Tesla’s Autopilot steered car toward barrier before deadly crash, investigators say

Illustration by Alex Castro / The Verge

A navigation mistake by Autopilot contributed to the grisly death of a Tesla Model X owner in Mountain View, California, according to a preliminary report released today by the National Transportation Safety Board.

Apple engineer Wei “Walter” Huang was traveling south on US Highway 101 on March 23rd when his Model X P100D smashed into the safety barrier section of a divider that separates the carpool lane from the off-ramp to the left. The front end of his SUV was ripped apart, the vehicle caught fire, and two other cars crashed into the rear end. Huang was removed from the vehicle by rescuers and brought to Stanford Hospital, where he died from injuries sustained in the crash.

The agency says that Huang’s hands were detected on the steering wheel for a total of 34 seconds, on three separate occasions, in the 60 seconds before impact. NTSB also confirms Tesla’s position that the vehicle did not detect the driver’s hands on the steering wheel in the six seconds before the crash. There were two visual alerts and one auditory alert for the driver to place his hands on the steering wheel, but those alerts were made more than 15 minutes before the crash.

Huang’s Model X was following a lead vehicle using adaptive cruise control and autosteer, traveling about 65 mph, eight seconds before the crash. A second later, the Tesla began a left steering movement. The Tesla then stopped following the lead vehicle, and its speed increased from 62 mph to 70.8 mph. No braking or evasive steering was detected prior to impact.

A Tesla spokesperson declined to comment on the preliminary report, and instead pointed to the company’s prior statement on the deadly crash. In that statement, the company said that a damaged safety barrier, called a crash attenuator, had contributed to the severity of the crash. Tesla also said that Huang had “about five seconds and 150 meters of unobstructed view” of the concrete divider with the crushed safety barrier before the incident. Huang’s family has retained a law firm and is exploring its legal options, local reports say.

NTSB confirms the attenuator had been damaged in the previous week when a Toyota Prius crashed in the same location. The damage likely made the attenuator ineffective and contributed to Huang’s death.

The battery from the totaled Model X began to smoke later that afternoon while in the impound lot, according to the NTSB report. “The battery was monitored with a thermal imaging camera, but no active fire operations were conducted,” NTSB says. “On March 28, [five] days after the crash, the battery reignited.” Firefighters responded and extinguished the blaze.

The report follows the decision by NTSB to boot Tesla from its investigation into the deadly crash, which the agency claims was because Tesla had released “investigative information before it was vetted and confirmed by” the agency. Tesla CEO Elon Musk also reportedly hung up on the head of the agency during a heated call concerning the investigation.

The NTSB said Tesla is still a party in two other ongoing investigations into non-fatal accidents: one from January 22nd, 2018 involving Autopilot, and one from last summer involving a battery fire. In May, Musk vowed to begin releasing a quarterly safety report about Autopilot.

Comments

Steering "toward" a barrier is a different thing than "centering itself in the lanes as they diverged where there happened to be a barrier". One version implies intent.

Intent is the wrong word

I don’t think it is the wrong word… Autopilot has generally been a lane keep assist system… the intent of the developers is to steer as needed to stay at a "safe" distance from one of the lines. It is getting more sophisticated though, especially with recent updates.

Fantastic article talking about why auto-drive systems ignore stationary objects (and run into them) at higher speeds:

https://arstechnica.com/cars/2018/06/why-emergency-braking-systems-sometimes-hit-parked-cars-and-lane-dividers/

No, ‘towards’ does not imply intent. It literally just means the direction. The Autopilot system was tricked by the lane split and sped up while steering towards the barrier.

I would agree the driver is mostly responsible, but the autopilot system has some responsibility here as well.

At this point I think that each Tesla owner should be required to sit through some training when they pick up their car, to make sure they understand the limitations of the technology, because clearly too many use it incorrectly.
Since the system can monitor compliance, perhaps there should be some kind of three-strikes-and-you’re-out policy, at which point you have to retrain to regain access.
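
To make that concrete, here is a minimal sketch in Python of what such a strike-based lockout could look like. This is purely hypothetical; nothing here reflects Tesla’s actual software, and every name (ComplianceTracker, record_ignored_alert, and so on) is invented for illustration.

# Hypothetical sketch of a "three strikes" Autopilot lockout policy.
# None of this reflects Tesla's actual software; all names are invented.

from dataclasses import dataclass

@dataclass
class ComplianceTracker:
    max_strikes: int = 3      # ignored alerts allowed before lockout
    strikes: int = 0
    locked_out: bool = False

    def record_ignored_alert(self) -> None:
        """Count a hands-on-wheel alert that the driver ignored."""
        if not self.locked_out:
            self.strikes += 1
            if self.strikes >= self.max_strikes:
                self.locked_out = True  # feature stays off until retraining

    def complete_retraining(self) -> None:
        """Driver completed refresher training; restore access."""
        self.strikes = 0
        self.locked_out = False

    def autopilot_available(self) -> bool:
        return not self.locked_out

# Example: three ignored alerts disable the feature until retraining.
tracker = ComplianceTracker()
for _ in range(3):
    tracker.record_ignored_alert()
assert not tracker.autopilot_available()
tracker.complete_retraining()
assert tracker.autopilot_available()

Since the car already logs ignored hands-on-wheel alerts, the counting itself would be the easy part; the hard questions are policy ones, like what counts as a strike and how retraining would work.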

There’s a reason why most companies haven’t bothered with Tesla’s approach. The consensus among them is full self-driving or bust. Humans are just not trustworthy at paying attention and being ready to take over at any time while not driving. I’m assuming Autopilot beeps all the time asking people to take over, so they just start ignoring it after a while. Obviously I have no proof of this, but it would explain why none of these deaths ever involves a new Tesla owner trying Autopilot for the first time.

I’m not sure educating people will help the situation much.

I think most car companies hadn’t implemented it when Tesla first did, but now most that have a reasonable version of the tech do. For example, didn’t GM just announce that their version, called Cruise-something, is coming to more vehicles beyond Cadillac?

Super Cruise. All Cadillacs will get it in 2020. Many of the components used for it are already used in assistive driving technologies common in GM vehicles, so while Super Cruise isn’t on its way to Chevrolet yet, you can see a trickle-down effect happening in lower models/brands.

Tesla is partly to blame for this lack of education. They were only too happy to call their system "Autopilot" and market it as autonomous tech, only to start pulling back after the issues started piling up.

Not only that, but they included easter eggs that could only be activated while Autopilot was engaged, encouraging driver distraction, and told auto journalists about it.

I remember they had a page on their site for Autopilot that said, in huge letters, "full self driving." Then they had some subtext in tiny type explaining that it’s not full self-driving.

They still have that page… and it is https://tesla.com/autopilot

Sorry, for some reason I thought they were smart enough to take that down. They should seriously replace that with an educational video about autopilot. If they even care about their reputation at all.

I don’t know what the point of Autopilot is if you have to hover your hands over the wheel at all times and grab onto it once in a while. Try just holding your hands up over your steering wheel and see how long that’s really practical.

So at any second the Tesla might just dive off the road. Wouldn’t that, in itself, be stressful? Hands hovering as you’re driving, just waiting for the car to screw up and do something. You pay thousands for this feature. I think I’d rather just drive myself, with my hands resting on the wheel, knowing at all times what I’M going to be doing. I’m not going to surprise myself and drive off the road for the hell of it.

Tesla collects large amounts of data from Tesla owners, including those who use Autopilot. This is one of the reasons they can determine things like hand placement on the steering wheel; this data is then used to make improvements in the system. Unlike many auto manufacturers, Tesla decided to release Autopilot before it was capable of truly autonomous operation. As Tesla explains to drivers, Autopilot is not intended to replace human control over a vehicle and should never be used for such purposes. That said, drivers will use Autopilot as they see fit, and some use it under circumstances for which it was not intended.

There is an argument to be made that autonomous technology should not be included in vehicles until it has been tested and found to be safe under a variety of driving conditions. But at some point, companies must test their autonomous driving technology under normal, "real life" conditions. Perhaps this should be done with trained technicians until the technology is very mature, and only then released to the public, in order to prevent drivers from using the technology in ways it was never intended.

That said, car companies that push the boundaries understand that this is a selling point that will attract buyers, particularly early adopters. People want to be the first in their neighborhood to have autonomous technology in their car. I agree with other commenters who have suggested that Tesla ought to provide a couple of warnings to people who misuse Autopilot, after which the technology should be switched off if it continues to be misused.

If you understand anything about the word "autopilot" in its two most common applications, on aircraft and ships, you know it doesn’t mean let the machine take over and stick your head up your ass. But for some reason, in cars, some people seem to think autopilot means they don’t have to pay attention…

I’ll tell you the reason: quite a few people do think autopilot in an aircraft literally means "let the machine take over and stick your head up your ass".

What the system actually is is totally irrelevant to this conversation. What’s relevant is what people perceive it to be. And for most people, that perception is totally off. Tesla should have realized that.

Everything you said was correct up until "Tesla should have realized that." Tesla has no obligation or responsibility to protect people from their own stupidity. However, they do go out of their way to do so, by educating new owners and with the audible tones. Personally, I like Tesla for the battery. If I owned a Tesla I would not purchase Autopilot; I have no use for semiautonomous vehicles. Full control or fully autonomous, I don’t see the point of anything in between.

The way I see it, any company has a responsibility to not be misleading.

I’m not disputing that they’re doing their best to educate new owners. What I’m not comfortable with is their marketing, which is geared way too much toward promoting autonomous driving that is not actually implemented. Even though, afaik, they’ve never been dumb enough to actually claim any of their cars is currently fully autonomous, they insist on promoting their "full self-driving capable hardware" and that Autopilot branding.

Their website even goes as far as stating this:

All you will need to do is get in and tell your car where to go. If you don’t say anything, the car will look at your calendar and take you there as the assumed destination or just home if nothing is on the calendar. Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed. When you arrive at your destination, simply step out at the entrance and your car will enter park seek mode, automatically search for a spot and park itself. A tap on your phone summons it back to you.

Please note that Self-Driving functionality is dependent upon extensive software validation and regulatory approval, which may vary widely by jurisdiction. It is not possible to know exactly when each element of the functionality described above will be available, as this is highly dependent on local regulatory approval. Please note also that using a self-driving Tesla for car sharing and ride hailing for friends and family is fine, but doing so for revenue purposes will only be permissible on the Tesla Network, details of which will be released next year.

Even with that disclaimer at the end, this is seriously misleading. The disclaimer should be much more clearly worded. To me, that kind of communication about what is effectively an advanced driver assistance system is utterly irresponsible.

To be clear I’m not claiming they have any legal responsibility here. I’m just talking about ethical responsibility. Different people have different ethics so heh, your opinion is as valid as mine!

With all due respect, this is Tesla’s fault. Look at their advertisements for Autopilot… they’re only too happy for people to think that they can drive with their heads up their asses.

https://tesla.com/autopilot FULL SELF-DRIVING! (thanks @dwightk)

Not sure where you’re getting your opinion on this, but that’s pretty much what autopilot is. The plane will keep a set speed, will turn to the heading you want, climb or descend at the rate you want, etc. Pilots definitely don’t need to keep their hands on the yoke, haha.

Tesla is partly to blame for this lack of education. They were only too happy to call their system "Autopilot"

Not this again. Tesla owners are not confused about the capability of the feature because of the name "Autopilot." It’s not like their cars were chauffeuring them autonomously every day and then one day one crashed into a barrier. The bloody thing needs attention and correction every minute or two. We are clearly aware it’s not "self-driving."

Agreed that some subsection of owners think this is more of an autonomous system than a driver-assist technology, and I think Tesla bears some of the blame for that. That’s why I think a system where your access to the technology is revoked when it is misused makes some sense. It’s analogous to having your license suspended when you rack up too many moving violations. And I would imagine that Tesla could easily accomplish this via their OTA updates.

There’s a reason why most companies haven’t bothered with Tesla’s approach. The consensus among most is full self driving or bust.

Ummmm… wut? This is the exact opposite of the strategy of nearly every other car company. My current car has many features that include some degree of self-driving, but nothing close to a full self-driving mode. It will parallel park itself, keep a constant distance from the car in front of it, brake to avoid a collision if I don’t, warn me if I’m drifting out of my lane, and even warn me if it thinks I’m getting tired (a camera monitors my eyes). Small, incremental steps toward self-driving are the norm; again, the exact opposite of what you’re saying.
