
The federal government doesn’t know how to regulate Tesla’s autopilot software


Autonomous vehicles are still years away from hitting the streets, but before they do, they’ll be put through a rigorous set of tests by federal regulators to ensure they meet the government’s slowly evolving standards for road safety. Tesla Motors, though, has been able to get its autopilot technology into consumer hands much faster, thanks to lightning-quick, smartphone-style software updates that allow Model S owners to bravely test it out while it’s still in "beta."

Government agencies tasked with regulating the automotive industry have a blind spot when it comes to modern vehicle software — the kind that allows the Palo Alto-based automaker to introduce such self-described beta features in its vehicles without having to get a thumbs-up from the feds. But this new reality — a reality where carmakers can introduce new features and fix system bugs remotely — raises questions about liability, safety, and the ability of historically bureaucratic organizations to keep pace with innovation in the automotive industry.

Modern vehicle software is a blind spot for regulatory agencies

This lack of oversight came to light after Model S owners began posting videos on YouTube of the autopilot features in action — all the while exhibiting inadvisable and sometimes straight-up stupid behavior. One owner posted a video of himself reading the newspaper while driving his Tesla. Sure, the road was blissfully traffic-free, but the driver was tempting fate by ignoring Tesla’s explicit instructions "to remain engaged and aware when Autosteer is enabled" and to keep his hands on the steering wheel at all times.

Other videos showed the Model S reacting unpredictably when Autosteer was engaged, jerking into traffic or pulling the wheel unexpectedly in response to other approaching cars. In every case, the cars displayed this aberrant behavior while their drivers were flouting Tesla’s steering-wheel advice. (Considering the allure and the promise of the self-driving car, it’s kind of difficult to blame them.)

Tesla leaned into the technological-breakthrough element of its new software when describing the autopilot as a "unique combination of cameras, radar, ultrasonic sensors and data to automatically steer down the highway, change lanes, and adjust speed in response to traffic." In other words, the company was telling the world that its technology would make driving safer by taking some of the human element, and human error, out of the equation.

CEO Elon Musk is insistent these features are in beta, presumably to drive home that the onus is still on the driver to keep control of the steering wheel and pay attention. In an earnings call Tuesday, he acknowledged the videos of drivers misusing Autosteer, saying that the company is planning "some additional constraints" around when the feature can be enabled in order to "minimize the possibility of people doing crazy things with it."

Jeffrey Miller, associate professor of engineering practice at the University of Southern California and a member of the Institute of Electrical and Electronics Engineers, said Tesla’s beta software raises a host of questions for regulators.

"Beta software typically means that a company is not fully releasing this to the public," Miller told The Verge. "They are releasing it to people who are willing to test it with the understanding there will be bugs in it."

Indeed, Tesla’s software itself is buggy in places, though the company argues its over-the-air software updates allow it to fix those bugs. But Tesla is able to skirt the "defective" label because it instructs drivers to keep their hands on the wheel.

During Tuesday’s earnings call, Musk said he wasn’t aware of any accidents caused by Autopilot, only accidents prevented by the beta feature. But Tesla was able to upgrade its Model S in the U.S. without any regulatory approval, while European and Asian drivers will need to wait for their respective governments to sign off before gaining access to the self-driving function. Why is that?

Essentially, auto regulators in the U.S. don’t view software upgrades as any different from other creature comforts a car manufacturer may introduce, like self-closing doors and trunks. The National Highway Traffic Safety Administration (NHTSA), which is part of the Department of Transportation, says its mission is to "save lives, prevent injuries, [and] reduce vehicle-related crashes."

Safety systems, computers, and lines of code are endlessly intertwined

But it’s more complicated than that: in a modern car, safety systems, computers, and lines of code are endlessly intertwined. In November, the agency announced that it was updating its 5-Star vehicle safety ratings to include automatic emergency braking as a recommended safety technology. Since 2011, it has also added electronic stability control, forward collision warning, lane-departure warning, and rearview camera systems to its list of "recommended" technologies.

The agency "applies performance requirements to the regulated system as a whole, and typically does not develop requirements for specific elements within the regulated system such as its software," a spokesman told The Verge. "As with any new vehicle feature, manufacturers are free offer it. If defective however, [the] agency can pursue a recall." AndNHTSA certainly has its eye on autonomous and semiautonomous features as they crop up in the marketplace: in June, the agency announced a preliminary investigation into the 2014 Jeep Grand Cherokee, after complaints that the vehicle’s automatic brakes have been triggering for no reason.

So far, despite a few bugs, the only thing truly defective about Tesla’s autopilot option is the way some drivers are using it. But the government seems aware that it is being outpaced by innovators like Elon Musk and Tesla. The NHTSA spokesperson added, "We are currently assessing the need for additional standards as it relates to software and vehicle electronics in general."


The instrument cluster of a Tesla Model S with autopilot enabled. (Tesla Motors)

A spokesperson for Tesla did not respond to a request for comment. But other vehicle manufacturers — and aspirational manufacturers — are certainly thinking about the benefits of fully autonomous cars over semi-autonomous ones like Tesla’s. In its most recent monthly report on its self-driving car program, Google said it was committed to producing a fully autonomous car, arguing that people couldn’t be trusted to take control of semi-autonomous cars fast enough to avoid accidents.

Miller argued that regulators could lead the way in developing safety standards for beta updates and fully self-driving cars as they start to come online. But they likely won’t.

"Very rarely do we get proactive laws. They’re always reactive," he said. "Right now we have an opportunity to get in front of the technology. We’ve had these small, incremental releases to the public, and we’ll continue to see small, incremental releases until we see a completely driverless vehicle around 2019, 2020."

He added, "But this is the time that we need these regulatory agencies to say, ‘We’ve four years before we’re projecting one of these vehicles are released to the consumer market. Let’s come up with the laws first before that happens.'"