Tesla privately admits Elon Musk has been exaggerating about ‘full self-driving’

‘Elon’s tweet does not match engineering reality’

Elon Musk at the Axel Springer Award ceremony in Berlin. Photo by Britta Pedersen-Pool/Getty Images

Tesla CEO Elon Musk has been overstating the capabilities of the company’s advanced driver assist system, the company’s director of Autopilot software told the California Department of Motor Vehicles. The comments were revealed in a memo released by legal transparency group PlainSite, which obtained the document through a public records request.

It was the latest revelation about the widening gap between what Musk says publicly about Autopilot and what Autopilot can actually do. And it coincides with Tesla coming under increased scrutiny after a Tesla vehicle without anyone in the driver’s seat crashed in Texas, killing two men.

“Elon’s tweet does not match engineering reality per CJ”

“Elon’s tweet does not match engineering reality per CJ. Tesla is at Level 2 currently,” the California DMV said in the memo about its March 9th conference call with Tesla representatives, including the director of Autopilot software CJ Moore. Level 2 technology refers to a semi-automated driving system, which requires supervision by a human driver.

In an earnings call in January, Musk told investors that he was “highly confident the car will be able to drive itself with reliability in excess of human this year.” (The DMV appears to have been referring to these January comments, which Moore mischaracterized as a tweet.)

Last October, Tesla introduced a new product called “Full Self-Driving” (FSD) beta to vehicle owners in its Early Access Program. The update let drivers use Autopilot’s partially automated driver assist system on city streets and local roads. The Early Access Program serves as a testing platform to help iron out software bugs. In the DMV memo, Tesla said that as of March 9th there were 824 vehicles in the pilot program: 753 belonging to employees and 71 to non-employees.

Musk has said the company is handling the software update “very cautiously.” Drivers are still expected to keep their hands on the steering wheel and to be prepared to assume control of their Tesla at any time. But he has also offered lofty predictions about Tesla’s ability to achieve full autonomy that conflict with what his own engineers are telling regulators.

Tesla representatives told the DMV that the company is unlikely to achieve Level 5 (L5) autonomy, in which its cars can drive themselves anywhere, under any conditions, without any human supervision, by the end of 2021. From the memo:

The ratio of driver interaction would need to be in the magnitude of 1 or 2 million miles per driver interaction to move into higher levels of automation. Tesla indicated that Elon is extrapolating on the rates of improvement when speaking about L5 capabilities. Tesla couldn’t say if the rate of improvement would make it to L5 by end of calendar year.

This isn’t the first time that Tesla’s private communications with the DMV have contradicted Musk’s public declarations about his company’s autonomous capabilities. In March, PlainSite published communications from last December between Tesla’s associate general counsel Eric Williams and the chief of the California DMV’s autonomous vehicles branch, Miguel Acosta. In the correspondence, Williams notes that “neither Autopilot nor FSD Capability is an autonomous system, and currently no comprising feature, whether singularly or collectively, is autonomous or makes our vehicles autonomous.” In other words, Tesla’s FSD beta is self-driving in name only.

“Tesla couldn’t say if the rate of improvement would make it to L5 by end of calendar year”

(Al Prescott, acting general counsel at Tesla, was also involved in the December meeting with the DMV. Prescott has since left Tesla for LIDAR maker Luminar.)

Tesla and Musk have long been criticized for overstating the capabilities of the company’s Autopilot system, which in its most basic form can keep a Tesla centered in its lane, including around curves, and adjust the car’s speed based on the vehicle ahead. Brand names like Autopilot and FSD have also helped foster an environment in which Tesla customers are misled into believing their vehicles can actually drive themselves.

There have been a number of fatal crashes involving Tesla vehicles with Autopilot enabled. The latest took place in Spring, Texas, where two men were killed after their Tesla smashed into a tree. Local law enforcement said there was no one in the driver’s seat at the time of the crash, leading to speculation that the men were misusing Autopilot. Tesla later claimed that Autopilot was not in use at the time of the crash and that someone may have been in the driver’s seat after all.

The US National Highway Traffic Safety Administration and the National Transportation Safety Board are both investigating the crash, along with dozens of other incidents involving Tesla’s Autopilot. Tesla, which has dissolved its press office and typically doesn’t respond to media requests, did not respond to a request for comment.