Consumer Reports said Thursday it was “easily” able to trick Tesla’s Autopilot system to operate without anyone in the driver’s seat. The publication’s test came amid questions about the safety of the company’s advanced driver assist system in the aftermath of a fatal crash in Texas in which authorities said there was no one behind the steering wheel.
Using a weighted chain attached to the steering wheel to simulate the pressure of the driver’s hands, two Consumer Reports researchers were able to use the steering wheel dial on a Tesla Model Y to accelerate from a full stop, and then “drive” around on a closed-course test track for several miles — all while sitting in the passenger seat and backseat. They stopped the vehicle by again using the dial to bring the speed back down to zero.
Tricking the Tesla to operate without someone behind the wheel was as simple as keeping the driver’s seatbelt buckled, not opening the driver’s side door during the test, and using the weight to simulate hands on the steering wheel.
“The car drove up and down the half-mile lane of our track, repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat,” Jake Fisher, CR’s senior director of auto testing, said in a statement. “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”
Fisher also warned against attempting to trick Autopilot in a similar fashion, stressing that the experiment should not be tried by anyone but trained professionals. In addition to conducting the test on a closed course, CR had safety crews standing by and never exceeded 30 mph. “Let me be clear: Anyone who uses Autopilot on the road without someone in the driver seat is putting themselves and others in imminent danger,” Fisher said.
The track on which CR conducted its test had painted lanes, which Tesla CEO Elon Musk has claimed Autopilot needs in order to operate. His response to the crash in Spring, Texas, in which two men were killed in what authorities have described as a driverless Tesla Model S, noted that the road they were on did not have painted lines. However, some Tesla drivers have demonstrated that Autopilot works on roads without painted lanes. Federal crash investigators are now examining the crash in Texas.
Musk also claimed that data logs recovered from the crashed Model S “so far show Autopilot was not enabled.” But research has shown that Autopilot can turn off unexpectedly without notifying the driver. “Autopilot makes mistakes, and when it encounters a situation that it cannot negotiate, it can immediately shut itself off,” Fisher said. “If the driver isn’t ready to react quickly, it can end in a crash.”
Tesla warns that drivers need to keep their eyes on the road and hands on the wheel at all times, though the automaker famously refuses to include a more robust driver-monitoring system (like infrared eye tracking, for example) to ensure its customers are following safety protocols. Autopilot is considered a Level 2 “partially automated” system by the Society of Automotive Engineers’ standards, which requires that drivers keep their hands on the wheel and eyes on the road.
But this hasn’t stopped many Tesla owners from abusing Autopilot — sometimes going so far as to film and publicize the results. Drivers have been caught sleeping in the passenger seat or backseat of their Teslas while the vehicle speeds down a crowded highway. A Canadian man was recently charged with reckless driving after being pulled over for sleeping while traveling at 150 km/h (93 mph).
Federal traffic investigators have found that Autopilot contributed to a number of fatal crashes in the past, and the families of deceased drivers have sued Tesla for wrongful death. For his part, Musk blames driver overconfidence. “When there is a serious accident it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency,” Musk said in 2018.
Consumer Reports has had a bumpy relationship with Tesla over the years. In 2015, the publication broke its own rating system in its effusive praise of the Model S P85D. But that love affair started going south almost immediately when it surveyed about 1,400 Tesla owners and used that data to project a “worse-than-average overall problem rate” for new buyers over the life span of the vehicle. As a result, it pulled its coveted “recommended” rating for the Model S. It did the same for the Model 3, citing “declining reliability.”