Tesla CEO Elon Musk is finally admitting that he underestimated how difficult it is to develop a safe and reliable self-driving car. To which the entire engineering community rose up as one to say, “No duh.”
Or at least that’s how it should have happened in a just world. Instead, all the Tesla sycophants and ass-kissers on Twitter told Musk to keep up the good work, that they believed in him, and encouraged him to hurry up and roll out the latest version of his “Full Self-Driving” software that, it’s worth pointing out, does not enable a Tesla vehicle to drive itself without input from the driver.
Musk has a long history of overpromising and underdelivering when it comes to his company’s so-called “Full Self-Driving” software. He did it in 2018, when he promised that the “long awaited” V9 (Version 9) would begin rolling out in August. He did it again in 2019, proclaiming that “a year from now” there would be “over a million cars with full self-driving, software, everything.”
He was back at it again this past weekend, promising that “FSD 9 beta is shipping soon,” with an added “I swear!” just in case you had any doubts about his solemnity.
There’s no question that Tesla is more willing than its competitors to test beta versions of its Autopilot driver assist feature on its customers in the interest of gathering data and working out any bugs in the system. And Tesla customers are mostly fine with this, routinely flooding Musk’s mentions begging to be white-listed for the current version of Full Self-Driving. This has helped contribute to Tesla’s public reputation as a leader in autonomous driving, despite its vehicles continuously falling short of what most experts would agree defines a self-driving car.
Tesla says Autopilot is safe — it releases quarterly reports with selectively reported data that it says proves this — but that it also requires constant input from the driver in order to work. Meanwhile, AV companies like Waymo have real driverless vehicles on public roads giving rides to passengers. Waymo, like practically every AV company, uses a combination of different sensors, like radar, lidar, and cameras, to ensure there are redundancies in case of system failures. Tesla recently switched to a camera-only sensing system. The contrast between Tesla and every other company pursuing self-driving technology couldn’t be more stark.
I, for one, am all for Musk taking as long as he wants with the release of V9. Let the cake bake for as long as it needs, in my opinion, especially after viewing videos like the one that just came out of China of a Tesla Model 3 on Autopilot utterly failing to take a sharp turn and crashing into a ditch.
An anonymous Twitter user who posts “hacks” of Tesla’s Autopilot under the handle @greentheonly recreated the scenario to demonstrate how the company’s driver assist feature struggles with these sharp turns. With an overlay of Tesla’s Autopilot display running in the corner of the screen, greentheonly shows how the vehicle “actually outputs various alerts before the eventual ‘take over we are giving up.’” Other times, the car slows down enough and manages to take the turn safely.
A system that fails to take a sharp turn in “half the cases” should not inspire a great amount of confidence! Quite the opposite actually. The number of open investigations into vehicle crashes involving Tesla Autopilot seems to be growing in inverse relation to customer expectations about Musk’s ability to deliver on the promises he’s been making (and breaking) for years now.
Musk isn’t alone in coming to the realization that self-driving cars are hard. Nearly the entire industry once predicted that by now our roads would be swarming with self-driving cars, only to later admit it had underestimated how complicated it was to get cars to drive themselves safely and reliably.
To which we can now say to Musk, “Welcome to the party, pal.”