Tesla's Autopilot semi-autonomous technology gives drivers the ability to take their hands off the wheel, while the car effectively drives itself on the highway. At launch, drivers didn't even need to touch the wheel to show that they were still awake and in the driver's seat. It's impressive technology and perhaps the most advanced autonomous tech available in a commercial automobile today. But that doesn't mean it's a good implementation of autonomous technology.
While Tesla says its technology is Level 2 autonomous — a combination of two technologies designed to make driving easier — some automotive industry experts, including Ford CEO Mark Fields, believe Autopilot is a Level 3 technology. That means it's designed to take over "safety-critical functions" from the driver. That's my impression from testing the system on a long road trip last month. I was able to drive on the highway for minutes at a time without touching the wheel at all. The driver is supposed to pay attention to the road in case of unexpected developments, but, as long as everything is going well, the car can keep itself in lane and at the appropriate speed. It's those unexpected developments that can be the problem.
"It gives you the impression that it's doing more than it is," says Trent Victor, senior technical leader of crash avoidance at Volvo, in an interview with The Verge. "[Tesla's Autopilot] is more of an unsupervised wannabe." In other words, Tesla is trying to create an semi-autonomous car that appears to be autonomous.
Victor says Volvo believes that Level 3 autonomy, where the driver needs to be ready to take over at a moment's notice, is an unsafe solution. Because the driver is theoretically freed up to work on email or watch a video while the car drives itself, the company believes, it is unrealistic to expect that driver to retake control in time when something goes wrong. "It's important for us as a company, our position on autonomous driving, is to keep it quite different so you know when you're in semi-autonomous and know when you're in unsupervised autonomous," he says.
Volvo's Drive Me autonomous car, which will launch in a public pilot next year, is a Level 4 autonomous car — this means not only will it drive itself down the road, but it is capable of handling any situation that it comes across without any human intervention. As a result, the human doesn't need to be involved in the driving at all. If something goes wrong, the car can safely stop itself at the side of the road.
"In our concept, if you don't take over, if you have fallen asleep or are watching a film, then we will take responsibility still," says Victor. "We won't just turn [autonomous mode] off. We take responsibility and we'll be stopping the vehicle if you don't take over." Unsaid here is that in its current "beta" incarnation (which customers have to pay thousands of dollars to enable) Tesla's Autopilot can suddenly turn itself off if it gets into trouble, and the driver must take over immediately or bad things can happen.
"We take responsibility."
"That's a really important step in terms of safety, to make people understand that it's only an option for them take over," says Victor. Volvo is "taking responsibility both for crash events, and we're also programming it for extreme events like people walking in the road even where they're not supposed to be. There's a massive amount of work put into making it handle a crash or conflict situations."
It comes down to a difference in design philosophy between Tesla and Volvo. Tesla believes drivers can be trusted to make the appropriate decisions about their vehicle, while Volvo wants to keep drivers from ever putting themselves in a position where its autonomous tech can get them into trouble.
At the end of the day, it may not be up to manufacturers or buyers: regulators are in the thick of writing rules for self-driving systems. An NHTSA hearing to solicit public comments is taking place in Silicon Valley today.