
Elon Musk still doesn’t think LIDAR is necessary for fully driverless cars

‘In my view, it’s a crutch’


Elon Musk. Photo by Asa Mathat

Elon Musk is not a fan of LIDAR, the laser sensor that most tech and car companies see as an essential component for self-driving cars. In an earnings call on Wednesday, Musk reiterated his dislike of LIDAR and defended Tesla’s strategy of achieving “full autonomy” using only cameras, radar, and ultrasonic sensors.

LIDAR, which stands for light detection and ranging, has become a common fixture on self-driving cars operated by companies like GM and Alphabet’s Waymo. But Musk has long argued that LIDAR is too expensive and too bulky for Tesla’s vehicles.

“In my view, it’s a crutch that will drive companies to a local maximum that they will find very hard to get out of,” Musk said. He added, “Perhaps I am wrong, and I will look like a fool. But I am quite certain that I am not.”

It’s quite a departure from how other self-driving vehicle operators view LIDAR. Waymo and Uber are battling it out in federal court right now over Waymo’s belief that Uber stole its LIDAR designs to get ahead in the autonomous car race. “Laser is the sauce,” former Uber CEO Travis Kalanick wrote on a whiteboard in 2016, a photo of which was submitted as evidence earlier this week. But not for Tesla.

Laser is not the “sauce” for Tesla

Musk said Tesla is trying to solve a much bigger problem: passive optical recognition. That’s why Tesla is banking on cameras as the key long-term hardware component in autonomous vehicle development. With their ever-increasing pixel resolution and low price point, camera sensors are seen as indispensable for both advanced driver assistance systems (like Tesla’s Autopilot) and fully autonomous systems. For Tesla, cameras are everything.

The only problem is that Tesla had a very high-profile split with its main camera supplier, Mobileye, in 2016, in the wake of a fatal crash involving a Tesla driver who was using Autopilot. With the introduction of the second-generation Autopilot in October 2016, Tesla dropped Mobileye’s computer vision technology, which was powered by the EyeQ3 chip, and replaced it with its own computer vision system, “Tesla Vision,” running on Nvidia’s Drive PX 2 onboard computer. Many experts note that Autopilot 2.0 still lacks much of the functionality of the original version, which relied more heavily on Mobileye’s vision system.

But Musk said he’s confident that Tesla can get to full autonomy because of the company’s “sophisticated neural net,” which he says will be able to “see through” adverse conditions like fog, rain, dust, and snow. Before Teslas can start driving autonomously, the company needs to collect a lot of data to prove to customers (and regulators) that the technology is safe and reliable. So its cars run Autopilot in “shadow mode,” which lets Tesla gather statistical data on the software’s false positives and false negatives. In shadow mode, the car doesn’t take any action, but it registers when it would have.
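Tesla hasn’t published how shadow mode works under the hood, but the core idea is straightforward: run the autonomy stack on live sensor data, compare what it would have done to what the human driver actually did, and log any disagreement without ever touching the controls. The Python below is a minimal, hypothetical sketch of that loop; every name in it (Frame, Disagreement, run_shadow_mode, the action strings) is invented for this example and does not reflect Tesla’s actual code.

```python
# Hypothetical sketch of "shadow mode" data collection.
# Not Tesla's actual implementation; all names are invented.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Frame:
    timestamp: float       # seconds since start of drive
    sensors: object        # stand-in for camera/radar/ultrasonic data
    driver_action: str     # what the human driver actually did

@dataclass
class Disagreement:
    timestamp: float
    proposed: str          # what the software would have done
    actual: str            # what the driver did

def run_shadow_mode(frames: List[Frame],
                    planner: Callable[[Frame], str]) -> List[Disagreement]:
    """Run the planner on each frame without actuating anything.

    Logged disagreements can later be labeled as false positives
    (the software would have intervened when the driver didn't) or
    false negatives (the driver intervened when the software wouldn't).
    """
    log: List[Disagreement] = []
    for frame in frames:
        proposed = planner(frame)  # e.g. "brake", "steer", "none"
        if proposed != frame.driver_action:
            log.append(Disagreement(frame.timestamp, proposed,
                                    frame.driver_action))
        # Crucially, no command is ever sent to the vehicle controls.
    return log

# Toy usage: a planner that never intervenes, against two recorded frames.
frames = [Frame(0.0, None, "none"), Frame(0.1, None, "brake")]
print(run_shadow_mode(frames, lambda f: "none"))
# -> one Disagreement at t=0.1: the driver braked, the software would not have
```

Aggregated across a large fleet, disagreement logs like these are exactly the false positive and false negative statistics Musk described.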

The Tesla CEO couldn’t help but take a swipe at his competitors and their approach to autonomous driving, saying he found it “quite puzzling” that so many companies rely on LIDAR to help their cars “see.” To be sure, Tesla ranked last in a recent scorecard of the 19 companies developing self-driving cars. Analysts noted that even Tesla’s own suppliers, like Nvidia, have expressed doubt that the computing hardware they sell to Tesla can reliably support full automation.