Tesla will regularly release data about the safety of Autopilot, Elon Musk says

Quarterly reports will outline Autopilot’s performance as the company pushes for full self-driving functionality — and a shared autonomous fleet by the end of 2019

Illustration by Alex Castro / The Verge

Tesla will publish quarterly reports about the safety of its Autopilot driver assistance feature, CEO Elon Musk announced on a call with analysts Wednesday. Musk didn’t elaborate on exactly what the reports will entail, and a representative for Tesla declined to add any further detail. But the move could represent a major change in how the company treats data related to Autopilot, which is typically closely guarded.

Musk argued on the call that there is “no question” that Autopilot reduces the chance of a driver getting in an accident, something both he and his company have often claimed in the past. “The statistics are unequivocal that Autopilot improves safety,” he said. Publishing these statistics about Autopilot’s performance will let the public know “exactly what Autopilot’s safety [level] is,” Musk said. “Is it getting better, is it getting worse?”

Musk said he wants to let people know if Autopilot is getting better or worse, and by how much

It has not been easy to determine exactly how much Autopilot improves the safety of drivers, or even how to measure that in the first place. The most common figure Tesla and Musk use when making claims about Autopilot’s safety is that it was “found by the U.S. government to reduce crash rates by as much as 40%.” This statistic comes from the report that was filed at the conclusion of the National Highway Traffic Safety Administration’s investigation into the 2016 death of Joshua Brown, who was using Autopilot when his Tesla Model S crashed into a tractor-trailer. But the veracity of the statistic has recently come under fire, and today NHTSA distanced itself from the claim. Tesla declined to comment on the news.

Musk announced the plan to increase transparency around Autopilot after repeatedly criticizing press coverage of the company’s driver assistance feature, something he’s done on analyst calls in the past. Autopilot has faced increased scrutiny after a driver of a Tesla Model X died while using the driver assistance feature on a California highway in March.

“I think there’s 1.2 million automotive deaths per year, and how many do you read about? Basically none of them,” Musk said, referring to global statistics. “But if it’s an autonomous situation, it’s headline news. And the media fails to mention that, actually, they shouldn’t really be writing this story, they should be writing a story about how autonomous cars are really safe. But that’s not the story that people want to click on. So they write inflammatory headlines that are fundamentally misleading to the readers.” Musk didn’t clarify which reports he takes umbrage with, or whether he’s including local news coverage of vehicle fatalities in his criticism.

The problem, Musk says, isn’t just that these “inflammatory headlines” are “misleading” to readers. He also argued that they’re shaping public policy. “Regulators respond to public pressure, and the press,” Musk said. “So if the press is hounding the regulators and the public is laboring on misapprehension that autonomy is less safe because of misleading press, then this is where I find things, the challenge for predicting it to be very difficult.”

Musk argued that “negative” press coverage of Autopilot is affecting public policy around self-driving cars

The “it” that Musk was referring to was the timeframe for the launch of the shared, autonomous fleet of Teslas that he referenced in the second company “master plan” released in 2016. Despite the difficulty of guessing how fast (or slow) regulations around self-driving cars will move in the coming years, Musk did say that, on the technical side, he thinks a shared fleet of autonomous Teslas, one that he said would be like a mix of “Uber, Lyft, and Airbnb,” will “probably be ready by the end of next year.”

Before Tesla gets there, though, it still needs to roll out full self-driving capability to the cars it is making and has already made. While Tesla currently offers customers the ability to pre-pay for full self-driving capability, the current version of Autopilot still more closely resembles advanced driver assistance systems like Cadillac’s Super Cruise.

Meanwhile, a coast-to-coast demonstration of Tesla’s full self-driving capabilities has been delayed. Musk recently promised that the cross-country drive is coming this year, though, and that the same capability will be made available to paying customers soon. Cars currently being made by Tesla should be able to handle full autonomy, Musk said, though he reiterated that the company may need to swap in more powerful computers to handle the processing power required.

While Tesla is often cagey about the number of miles driven using Autopilot, Musk did say on the call that overall usage is increasing. For cars equipped with the feature, a third or “maybe half” of highway miles in “some regions” are now driven using Autopilot, he said.

“But then of course when there’s negative news in the press, then that dips,” he added. “And then I was like, okay, this is not good, because people are reading things in the press that cause them to use Autopilot less, and then that makes it dangerous for our customers. And that’s not cool, that’s why I get upset.”

Musk has made this argument in the past: that “negative” press coverage about Autopilot could scare people into using the feature less, which in turn could put more people in harm’s way. But near the end of the call, the Tesla CEO made a new argument about something he says the press gets wrong. Journalists often claim that a lack of understanding is to blame for Autopilot accidents, Musk said, but he believes the opposite is true.

“When there is a serious accident it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency,” Musk said. “They just get too used to it. That tends to be more of an issue. It’s not a lack of understanding of what Autopilot can do. It’s [drivers] thinking they know more about Autopilot than they do.”

Overconfidence in the system does seem to be a problem, whether it takes the form of drivers hopping into the passenger seat of a Tesla or, as the company says happened in the fatal crash in March, a driver receiving and ignoring numerous prompts to retake control of the car.

Some experts think problems like these could be mitigated with driver monitoring systems, like how Super Cruise watches drivers’ eyes to make sure they’re paying attention. In the Model S, X, and 3, Tesla monitors driver attention only by measuring resistance in the steering wheel. While it’s thought that a small camera in the Model 3 might someday be used for driver monitoring, it has not yet been activated. A lack of robust driver monitoring systems was one of the criticisms laid out in the National Transportation Safety Board’s investigation into Brown’s death. Back then, the NTSB recommended that Tesla, along with other automakers, find ways to monitor driver attention that go beyond detecting steering-wheel engagement.