Tesla didn’t fix an Autopilot problem for three years, and now another person is dead

Image: NTSB

On May 7th, 2016, a 40-year-old man named Joshua Brown was killed when his Tesla Model S sedan collided with a tractor-trailer that was crossing his path on US Highway 27A, near Williston, Florida. Nearly three years later, another Tesla owner, 50-year-old Jeremy Beren Banner, was also killed on a Florida highway under eerily similar circumstances: his Model 3 collided with a tractor-trailer that was crossing his path, shearing the roof off in the process.

There was another major similarity: both drivers were found by investigators to have been using Tesla’s advanced driver assist system Autopilot at the time of their respective crashes.

Autopilot is a Level 2 semi-autonomous system, as defined by the Society of Automotive Engineers, that combines adaptive cruise control, lane keep assist, self-parking, and, most recently, the ability to automatically change lanes. Tesla bills it as one of the safest systems on the road today, but the deaths of Brown and Banner raise questions about those claims and suggest that Tesla has neglected to address a major weakness in its flagship technology.

There are some big differences between the two crashes. For instance, Brown and Banner’s cars had completely different driver assistance technologies, although both are called Autopilot. The Autopilot in Brown’s Model S was based on technology supplied by Mobileye, an Israeli startup since acquired by Intel. Brown’s death was partly responsible for the two companies parting ways in 2016. Banner’s Model 3 was equipped with a second-generation version of Autopilot that Tesla developed in house.

That suggests that Tesla had a chance to address this so-called “edge case,” or unusual circumstance, when redesigning Autopilot, but it has, so far, failed to do so. After Brown’s death, Tesla said its camera failed to recognize the white truck against a bright sky; the US National Highway Traffic Safety Administration (NHTSA) essentially found that Brown was not paying attention to the road and exonerated Tesla. It determined he set his car’s cruise control at 74 mph about two minutes before the crash, and he should have had at least seven seconds to notice the truck before crashing into it.

Federal investigators have yet to make a determination in Banner’s death. In a preliminary report released May 15th, the National Transportation Safety Board (NTSB) said that Banner engaged Autopilot about 10 seconds before the collision. “From less than 8 seconds before the crash to the time of impact, the vehicle did not detect the driver’s hands on the steering wheel,” the NTSB said. The vehicle was traveling at 68 mph when it crashed.

In a statement, a Tesla spokesperson phrased it differently, changing the passive “the vehicle did not detect the driver’s hands on the steering wheel” to the more active “the driver immediately removed his hands from the wheel.” The spokesperson did not respond to follow-up questions about what the company has done to address this problem.

In the past, Tesla CEO Elon Musk has blamed crashes involving Autopilot on driver overconfidence. “When there is a serious accident it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency,” Musk said last year.

The latest crash comes at a time when Musk is touting Tesla’s plans to deploy a fleet of autonomous taxis in 2020. “A year from now, we’ll have over a million cars with full self-driving, software, everything,” he said at a recent “Autonomy Day” event for investors.

Those plans will be futile if federal regulators decide to crack down on Autopilot. Consumer advocates are calling on the government to open up an investigation into the advanced driver assist system. “Either Autopilot can’t see the broad side of an 18-wheeler, or it can’t react safely to it,” David Friedman, vice president of advocacy for Consumer Reports, said in a statement. “This system can’t dependably navigate common road situations on its own and fails to keep the driver engaged exactly when needed most.”

Car safety experts note that adaptive cruise control systems like Autopilot rely mostly on radar to avoid hitting other vehicles on the road. Radar is good at detecting moving objects but not stationary objects. It also has difficulty detecting objects that are not moving in the car’s direction of travel, such as a vehicle crossing the road.

The vehicle’s software sometimes ignores radar returns from detected objects to avoid “false positives,” said Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University. Without that filtering, the radar would “see” an overpass and report it as an obstacle, causing the vehicle to slam on the brakes.
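For illustration, here is a minimal, hypothetical Python sketch of the kind of stationary-object filtering Rajkumar describes. None of the names, thresholds, or sign conventions come from Tesla; the point is simply that a trailer crossing the car’s path produces the same near-zero ground-frame radial motion as an overpass, so a naive filter discards both.

```python
# Illustrative sketch only, not Tesla's software: a naive version of the
# false-positive filter Rajkumar describes. Radar returns whose ground-frame
# motion along the line of sight is near zero look like fixed roadside
# clutter (overpasses, signs), so they are dropped -- which is also what a
# trailer crossing perpendicular to the car's path looks like.

from dataclasses import dataclass
from typing import List


@dataclass
class RadarReturn:
    label: str             # for the demo printout only
    range_m: float         # distance to the object, in meters
    range_rate_mps: float  # Doppler measurement: negative means the gap is closing


def keep_moving_targets(returns: List[RadarReturn],
                        ego_speed_mps: float,
                        threshold_mps: float = 1.0) -> List[RadarReturn]:
    """Drop returns that appear stationary in the ground frame.

    range_rate + ego_speed is roughly zero for anything not moving along the
    radar's line of sight: bridges, overhead signs -- and crossing trailers.
    (Threshold and logic are hypothetical; real systems fuse radar with
    cameras, maps, and tracking over time.)
    """
    return [r for r in returns
            if abs(r.range_rate_mps + ego_speed_mps) > threshold_mps]


ego_speed = 30.0  # roughly 68 mph, in meters per second
scene = [
    RadarReturn("overpass",         150.0, -30.0),  # stationary: filtered, as intended
    RadarReturn("crossing trailer",  80.0, -30.0),  # also looks stationary: filtered, missed
    RadarReturn("slower lead car",   60.0, -10.0),  # moving in our direction: kept
]

for r in keep_moving_targets(scene, ego_speed):
    print("kept:", r.label)
```

In practice, production systems combine radar with camera detections and object tracking over time, which is why Rajkumar points to the vision side as the other half of the problem.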

On the computer vision side of the equation, the algorithms that use the camera output need to be trained to detect trucks that are perpendicular to the vehicle’s direction of travel, he added. In most road situations, there are vehicles to the front, back, and sides, but a perpendicular vehicle is much less common.

“Essentially, the same incident repeats after three years,” Rajkumar said. “This seems to indicate that these two problems have still not been addressed.” Machine learning and artificial intelligence have inherent limitations. If sensors “see” what they have never or seldom seen before, they do not know how to handle those situations. “Tesla is not handling the well-known limitations of AI,” he added.

Tesla has not yet explained in detail how it intends to fix this problem. The company releases a quarterly report on Autopilot safety, but that report is short on details. That means experts in the research community don’t have the hard data that would allow them to compare Autopilot’s effectiveness to that of other systems. Only Tesla has full visibility into Autopilot’s logic and source code, and it guards those secrets closely.

“We need detailed exposure data related to when, where, and what conditions drivers are leveraging Autopilot,” said Bryan Reimer, a research scientist in the MIT Center for Transportation and Logistics, in an email to The Verge, “so that we can begin to better quantify the risk with respect to other vehicles of a similar age and class.”

Other Tesla owners have spoken out about Autopilot’s trouble perceiving trucks in the vehicle’s path. An anonymous Twitter user who goes by the handle @greentheonly “hacked” a Model X and posts observations on Twitter and YouTube. They did this to “observe Autopilot from the inside,” they said in an email to The Verge. In March, their Model X encountered a tractor-trailer perpendicular to their path, much like Brown and Banner did. The vehicle would have tried to drive underneath the truck had the driver not intervened.

According to @greentheonly’s data, the semi was not marked as an obstacle. But they decided not to tempt fate: “I did not try to approach the trailer and see if any of the inputs would change (but I bet not).”

Comments

There is something so beautiful about a 4 seat convertible… they should have worked on that instead of falcon wing doors

The car makes itself a convertible on demand, what more could you want?

What a classy comment on an article about death. Gross.

At least he didn’t say that Tesla’s working on a special Jayne Mansfield Signature Edition Model 3.

Happy I’m not the only ghoul whose initial thought was, "man that looks cool as a convertible."

I know! The white Tesla-synthetic seats really shine in the sunlight. The trunk space would be gone, but rear-wheel-drive Teslas have massive frunks and the rear seats leave plenty of storage. I think it would be a revolutionary, cool, practical car, unlike anything else. No one else but Tesla could pull off something as cool and practical as that… a 4-door convertible with plenty of storage and a short wheelbase. I’m in love.

Alternative title: and now 9 more people are alive than would have been anticipated based on the miles (>1 billion) travelled.

Also would note based on the Twitter thread when the semi has side skirts (which are required legally in Europe, and improve safety regardless of whether you’re using autopilot) there is no issue. Clearly Tesla needs to work on this, but at the same time I wonder if the US could just join the rest of the civilized world in requiring side skirts.

Here is one (of many) examples of someone without autopilot dying as a result of these poorly designed vehicles: https://www.currentargus.com/story/news/crime/2018/11/02/houston-woman-dies-lea-county-fatal/1862442002/

So while this is an issue for autopilot, it’s also an issue for humans it appears.

I agree entirely. Setting aside the fact that it would make things easier for Tesla (and that shouldn’t be the deciding factor in rule changes), it would be safer for every single person on the road, be it dumb 50-year-old cars, cyclists, or pedestrians.

It’s not just Tesla’s fault. People die even with “conventional” cars. Around 4,000 in the USA. See “IIHS Crash Tests Show Side Underride Guards on Trucks Save Lives”…

It’s actually 40,000 deaths annually in the USA.

How do they improve safety?

If he was driving 74 mph into that trailer, I don’t think side skirts would’ve helped much.

This is a confused and poor argument.

The problem isn’t whether Tesla’s self-driving system is leading to a net gain or net loss of life, but whether the loss of life is obviously preventable (or at the very least, whether Tesla could have done more to try to prevent it). The discussion isn’t whether to completely stop autonomous vehicle development (I have not seen a single person arguing that), but whether Tesla should implement better features (features that other car manufacturers already implement, so it’s not like they are impossible, or even hard, to build) to reduce the unnecessary loss of life.

The first line of defense in avoiding loss of life is always the driver. First responsibility belongs squarely with the driver, especially since this is a known problem. Why would any sane driver take their hands off the wheel and eyes off the road?

Over 37,000 people die in auto accidents in the US every year. Tesla’s cars and driver assist features do reduce accidents. But The Verge sees fit to write about only one of those deaths and position it as an indictment of Tesla.

That’s basically dishonest reporting.

Once again you’re refusing to engage the actual argument. This isn’t about who has "first responsibility", this is about whether the loss of life was unnecessary and preventable.

It’s the reason why seatbelts have warning systems even though it’s the driver’s responsibility to use the seatbelt; it’s the reason why Fiat Chrysler is getting sued for the poorly designed parking system on the Jeep Grand Cherokee; it’s the reason why car companies (and airplane makers, and anyone building something that can cause multiple loss of life) have multiple redundancies even if the “first responsibility” belongs to the user.

It’s a childish argument to say, “We’re not going to do everything we can to reduce the loss of life, because the ‘first responsibility’ lies with the driver.”

There are lots of unnecessary and preventable deaths. See: all the cars that don’t get 5-stars on every crash metric or are sold without AEB. We know that those cars could be designed at additional expense, and yet they are still allowed to be sold despite them being more dangerous, causing additional fatalities.

Or alternatively, look at literally any car that isn’t a model 3: https://electrek-co.cdn.ampproject.org/i/s/electrek.co/wp-content/uploads/sites/3/2018/10/m3-nhtsa-blog-09272018-e1538965297188.jpg?resize=1292,1000

What would it cost Tesla to stop advertising their driver assist feature as "Autopilot"?
What would it cost for Tesla to implement a software update that slowly stops the car if the driver’s hands are off the steering wheel?
How much does the cheapest Model 3 you can actually buy cost vs the cheapest car in other automaker’s line ups?

At least try to engage the argument in good faith.

Well, what would it take to have Audi not call it traffic jam pilot? Or Nissan to stop calling it pro pilot?

What would it cost Nissan, Audi, Volvo, Mercedes, Lexus, etc. to implement a feature where the car slows down after not detecting hands?

(PS: if you fail to respond to the request fast enough to put your hands on the wheel the feature is disabled until you stop and reboot the feature)

The cheapest Model 3 is $35,400. I don’t know what the cheapest Mercedes, Audi, Volvo, BMW, Maserati, Porsche, or Land Rover is, but I would expect around the same price or quite a lot higher. Still, in 2019 not a single other manufacturer has AEB on 100 percent of their cars, while it has been standard on every Tesla sold for a couple of years now.

Calling it Autopilot does not mean you can ignore what the car is doing; you still have to monitor a plane while the autopilot is engaged. The real question here is how many tractor-trailers have pulled out in front of a Tesla whose driver was doing what they are told to do the moment they enable Autopilot, and ended up causing a fatal accident? I don’t have access to the numbers, but I would guess that, since all fatal Tesla accidents are reported worldwide, the number is zero.

People who think we should protect the entire population from their own stupidity are leading to a snowball effect that is creating a population that contains more stupidity to protect from. Misuse of Tesla’s product by its driver and poor driving by the commercial vehicle’s operator are the only causes of death in both of these situations.

This. It seems that if pulling out in front of a car results in a collision, it is by definition an unsafe maneuver. Ticket the crap out of them truckers, and perhaps a manslaughter charge on top.

The car already does slow to a stop if the driver is inattentive long enough. It will also disable AutoPilot features if the driver is found to be inattentive. Check your facts.

Tesla spends $0 on advertising. And it provides multiple, unambiguous warnings when the user uses AP.
“User convenience” has to be taken care of when designing any system. If you make AP too strict (switching off if you take your hands off the wheel), people would stop using it altogether, leading to more deaths.

We could make cars with “breathalyzers” installed, cars that won’t start if the seat belt is not buckled… but why don’t we do that?

Two things.

First, you assume Tesla has done nothing since the first death. While the fact is that we simply don’t know what progress Tesla has made, I think it’s much more likely that they have been working on it than not.

Second, seat belts are a great example to bring up. At first, simple lap belts were an option. Next, they became required by law. Then, three-point belts were required. Then manufacturers began having warning tones and lights. Then later those lights and tones became required.

This took place over DECADES. Although the first seat belts were installed in 1949, the first seat belt LAW wasn’t passed until 1970. Many thousands died during this period of development and progress.

I am not in favor of tens of thousands dying every year in car accidents. I’m human. I also know for a fact that tens of thousands will die every year.

I also believe that every engineer, software or hardware, working on vehicle safety, in every auto company including Tesla, is trying as hard as they can to improve safety.

To believe otherwise, I think, says more about the person holding that belief than it does about automotive engineers. It takes an enormous amount of callous hatred to believe that these engineers are not working on safety.

And I pity the people whose hearts are so cynical and hardened that they hold these beliefs.

Basically dishonest? It’s completely dishonest. This may be the most ridiculous article title I’ve ever seen.
