Over the weekend, avid Twitter user Elon Musk said that Tesla’s “long awaited” Version 9 of Autopilot would begin rolling out this August. “To date, Autopilot resources have rightly focused entirely on safety,” Musk tweeted. “With V9, we will begin to enable full self-driving features.”
This would appear to put Musk on track to fulfill the promise he made two years ago to offer “full self-driving” capabilities to Tesla owners by 2019. At the time, Musk said that all of his company’s vehicles would ship with the hardware necessary for “full self-driving.” That meant customers interested in the feature could shell out $3,000 for an add-on to the $5,000 “Enhanced Autopilot” option.
But what does “full self-driving” mean? A Tesla spokesperson said Musk’s tweet was the extent of the company’s comments. Tesla has been a leader in semi-autonomous driving, racing ahead of legacy automakers by releasing beta systems to customers. The idea is that these betas increase safety (such as with forward collision and lane departure warnings) and maintain Tesla’s edge in the market. But the approach has also increased risks for Tesla.
“That issue is better in latest Autopilot software rolling out now &amp; fully fixed in August update as part of our long-awaited Tesla Version 9. To date, Autopilot resources have rightly focused entirely on safety. With V9, we will begin to enable full self-driving features.” — Elon Musk (@elonmusk) June 10, 2018
There have been a number of car crashes recently involving Tesla vehicles using Autopilot, three of which resulted in fatalities. Federal investigators recently issued a preliminary report on one fatal crash in Mountain View, Calif., in which Autopilot was reported to have made a navigational mistake that contributed to the incident.
Tesla has also been slow to roll out updates to Autopilot. The most recent over-the-air software update, in March 2017, included improved Autosteer for speeds up to 90 mph and automatic lane changing. Tesla also settled a class-action lawsuit this week with owners who alleged Autopilot was “essentially unusable and demonstrably dangerous.”
Consumer Watchdog recently blasted Tesla for what it calls “deceptive and unfair practices in advertising and marketing” of Autopilot. The group’s critique suggests that Tesla’s problem is partly one of marketing: drivers hear “Autopilot” and assume they can let their attention wander while using it, and if Tesla simply changed the name, many of the problems associated with the feature might be cleared up.
Indeed, every time there is a crash involving Autopilot, the company issues a statement reminding drivers “to keep their hands on the wheel and maintain control of the vehicle at all times.” (Musk has been featured in news segments using Autopilot without his hands touching the steering wheel.)
Tesla’s system uses an array of eight cameras, 12 ultrasonic sensors, and a forward-facing radar to detect objects and obstructions on the road. This hardware is paired with Tesla’s vision and neural net system, which enables vehicles in the company’s fleet to continuously learn and improve, based on software trained on the billions of miles of road data collected by its vehicles.
But Tesla does not use LIDAR, the laser-based sensors that many self-driving operators consider a crucial piece of their hardware stack. Musk has called LIDAR “a crutch” for the self-driving industry and has defended Tesla’s strategy of achieving “full autonomy” using only cameras, radar, and ultrasonic sensors.
Musk’s self-driving tweet has already sent Tesla watchers (including both long and short sellers of the company’s stock) into a feverish bout of speculation. Short sellers, of course, are convinced that this is another dose of Musk-inspired fantasy, while Tesla’s fans are counting the days until they can take naps behind the wheel of their cars.
Let’s assume for now that Musk isn’t saying that Tesla will enable its cars to become fully driverless, like Waymo’s driverless minivans, which rely on LIDAR. Autopilot Version 9 might enable improved image recognition, “seeing” objects such as traffic lights, stop signs, pedestrians, and other road features — after all, Tesla recently hired deep learning and computer vision expert Andrej Karpathy to help Autopilot with its detection abilities.
Some critics have called on Tesla to ramp down its risky use of Autopilot until the bugs can be fixed. Sam Abuelsamid, senior analyst at Navigant Research, said recently that Tesla should “stop using their customers as guinea pigs and disable Autopilot until they have it working properly.” Instead, based on Musk’s tweet, it sounds like Tesla is preparing to take its Autopilot experiment to the next level.