Tesla issued a recall notice today for 2 million vehicles in the US to address a “defect” with Autopilot, the company’s groundbreaking and controversial advanced driver-assist system. Safety experts say the recall will likely make Autopilot harder to misuse.
Harder, but not impossible.
“It’s progress,” said Mary “Missy” Cummings, a robotics expert who wrote a 2020 paper evaluating the risks of Tesla’s Autopilot system, “but minimal progress.”
Cummings said the National Highway Traffic Safety Administration missed an opportunity to force the company to address concerns around Tesla owners using Autopilot on roads where it wasn’t intended to work. Last week, The Washington Post published an investigation linking at least eight fatal or serious crashes to Tesla drivers using Autopilot on roads it couldn’t “reliably navigate.”
Under the recall, Tesla will issue a software update to some 2 million cars in the US — nearly every vehicle it has ever sold in the country — that will increase the number of warnings and alerts when drivers are not paying attention.
Autopilot, which comes standard in all Tesla vehicles, bundles together a number of features, including Traffic-Aware Cruise Control and Autosteer. Autosteer is intended to be used only on limited-access freeways, unless it's operating in tandem with the more sophisticated Autosteer on City Streets.
According to the recall, the software update should address the concerns around using Autosteer on roads where it wasn’t intended to work.
“If the driver attempts to engage Autosteer when conditions are not met for engagement, the feature will alert the driver it is unavailable through visual and audible alerts, and Autosteer will not engage,” the recall document states.
Tesla is also expanding its "three strikes and you're out" system to include Autopilot for the first time. Previously, only drivers using the more expensive and expansive Full Self-Driving feature could be locked out of the system for failing to pay attention to the road. Now the policy applies to Autopilot users as well. According to the recall, drivers will face "eventual suspension from Autosteer use if the driver repeatedly fails to demonstrate continuous and sustained driving responsibility while the feature is engaged."
But Cummings, who served for a year as a senior advisor for safety at NHTSA, isn’t convinced this will be enough to prevent future incidents. “It’s very vague,” she said.
That’s likely because the recall was the result of a two-year-long negotiation between NHTSA and Tesla, said Phil Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies autonomous vehicle safety. The company didn’t concur with the agency’s findings but ultimately agreed to issue the recall — indicating some important elements may have been left out of the recall’s purview.
“This has all the earmarks of a compromise to get the remedy out and avoid another year of negotiation between NHTSA and Tesla,” he said. “So the remedy will likely not be as robust as NHTSA would like to see.”
Cummings agrees. “Tesla likely fought back,” she said. “NHTSA wants very much recalls to be voluntary so Tesla probably used that as a bargaining chip.”
Of course, the nature of the recall itself has some safety experts calling this a huge missed opportunity. Allowing Tesla to push an over-the-air software update ignores many of the structural defects with Autopilot, said Sam Abuelsamid, principal research analyst at Guidehouse Insights.
Tesla's driver monitoring system, which includes torque sensors in the steering wheel to detect hand placement and an in-cabin camera to track head movements, is inadequate and can be easily fooled, Abuelsamid said. The torque sensors are prone to false positives, as when drivers trick the system by hanging a weight on the steering wheel that mimics the torque of a resting hand, and false negatives, as when the sensors fail to detect a driver's hands because they are holding the wheel steady.
Meanwhile, the camera, which only went into use for Autopilot driver monitoring in 2021, doesn’t work in low-light conditions, he noted. Other automakers use infrared sensors that can detect depth and work in low-light situations. Consumer Reports demonstrated recently that Tesla’s cameras could be tricked into thinking there was someone in the driver’s seat when there wasn’t.
“This absolutely could have gone another way,” Abuelsamid said. “NHTSA could do its job and actually force Tesla to do a recall and install robust driver eye and hand monitoring and true geofencing of the system or disable Autosteer altogether if they cannot do a hardware update.”
“NHTSA has continually dropped the ball when it comes to Tesla,” he added. “Sadly, given the agency’s history in dealing with Tesla, this was likely to be the best case outcome.”