A Tesla Model S owner in Alberta, Canada, was charged with dangerous driving after being pulled over for apparently sleeping while his car traveled at 150 km/h (93 mph). The case raises questions about Tesla’s partially automated driving system, Autopilot, and driver complacency.
On July 9th, the Royal Canadian Mounted Police said they received a complaint of reckless driving on Highway 2 near Ponoka in Alberta. The 2019 Tesla Model S “appeared to be self-driving,” police said, “traveling over 140 km/h, with both front seats completely reclined and both occupants appearing to be asleep.”
Officers pursued the vehicle with their emergency lights flashing, at which point it “automatically began to accelerate,” eventually reaching 150 km/h, police said. After the vehicle was pulled over, the driver, a 21-year-old man from British Columbia, was charged with speeding, and his license was suspended for 24 hours for driving while fatigued. He was later also charged with dangerous driving.
“Although manufacturers of new vehicles have built in safeguards to prevent drivers from taking advantage of the new safety systems in vehicles, those systems are just that — supplemental safety systems,” Superintendent Gary Graham of Alberta RCMP Traffic Services said in a statement. “They are not self-driving systems, they still come with the responsibility of driving.”
A spokesperson for Tesla did not respond to a request for comment. Autopilot is a Level 2 partially automated system that combines adaptive cruise control, lane keeping assistance, self-parking, and, most recently, the ability to automatically change lanes. It uses a suite of sensors, including eight cameras, radar, and ultrasonic sensors, to automate some driving tasks, but it requires the driver to remain engaged in order to operate.
Traffic investigators have found that the automaker’s Autopilot system contributed to a number of fatal crashes in the past, and the families of deceased drivers have sued Tesla for wrongful death.
Tesla CEO Elon Musk has blamed crashes involving Autopilot on driver overconfidence. “When there is a serious accident it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency,” Musk said in 2018. But critics argue that by marketing the system as “Autopilot,” Tesla encourages the very driver inattention Musk describes.
It’s unclear to what extent the Tesla owner in Canada was misusing Autopilot. Tesla has said the advanced driver assist system will only work when it detects a driver’s hands on the steering wheel. If a driver’s hands aren’t detected, the display behind the wheel will begin to flash, followed by audible warnings, and eventually, Autopilot will disable itself.
Since Autopilot launched in 2015, Tesla owners have sought out new and creative ways to trick it. People couldn’t wait to upload videos of themselves sitting in the back seat while their cars drove “autonomously” down the highway. Tesla responded by updating its software to require drivers to keep their hands on the steering wheel, which seemed like a smart fix until one driver figured out that all you needed to do to fool the system was wedge an orange against the wheel to simulate the pressure of a human hand.
“Autopilot Buddy” was a piece of magnetic plastic that attached to the steering wheel to create the impression that the driver’s hands were resting on it. Federal regulators issued a cease-and-desist order to prevent its sale.
People love tricking technology, even if it could cost them their lives.