Waymo pulls back the curtain on 6.1 million miles of self-driving car data in Phoenix

Over 21 months in Arizona, Waymo’s vehicles were involved in 47 collisions and near-misses, none of which resulted in serious injuries

In its first report on its autonomous vehicle operations in Phoenix, Arizona, Waymo said that it was involved in 18 crashes and 29 near-miss collisions during 2019 and the first nine months of 2020. 

These crashes included rear-enders, vehicle swipes, and even one incident in which a Waymo vehicle was T-boned at an intersection by another car traveling at nearly 40 mph. The company said that no one was seriously injured and “nearly all” of the collisions were the fault of the other driver. 

The report is the deepest dive yet into the real-life operations of the world’s leading autonomous vehicle company, which recently began offering rides in its fully driverless vehicles to the general public. Autonomous vehicle (AV) companies can be black boxes, with most firms keeping a tight lid on their metrics and only demonstrating their technology to the public in the most controlled settings. 

Indeed, Waymo, which was spun out of Google in 2016, mostly communicates about its self-driving program through glossy press releases or blog posts that reveal scant data about the actual nuts and bolts of autonomous driving. But in this paper, and another also published today, the company is showing its work. Waymo says its intention is to build public trust in automated vehicle technology, but these papers also serve as a challenge to other AV competitors. 

“This is a major milestone, we think, in transparency,” said Matthew Schwall, head of field safety at Waymo, in a briefing with reporters Wednesday. Waymo claims this is the first time that any autonomous vehicle company has released a detailed overview of its safety methodologies, including vehicle crash data, when not required by a government entity. “Our goal here is to kickstart a renewed industry dialogue in terms of how safety is assessed for these technologies,” Schwall said.

The two papers take different approaches. The first maps out Waymo’s multilayered approach to safety, which includes three layers: 

  • Hardware, including the vehicle itself, the sensor suite, the steering and braking system, and the computing platform;
  • The automated driving system’s behavioral layer, covering driving competencies such as avoiding collisions with other cars, successfully completing fully autonomous rides, and adhering to the rules of the road;
  • Operations, like fleet operations, risk management, and a field safety program to resolve potential safety issues. 

The second paper is meatier, with detailed information on the company’s self-driving operations in Phoenix, including the number of miles driven and the number of “contact events” Waymo’s vehicles have had with other road users. This is the first time that Waymo has ever publicly disclosed mileage and crash data from its autonomous vehicle testing operation in Phoenix. 

The public road testing data covers Waymo’s self-driving operations in Phoenix from January 2019 through September 2020. The company’s fleet includes approximately 600 vehicles. More than 300 of them operate in an approximately 100-square-mile service area that includes the towns of Chandler, Gilbert, Mesa, and Tempe — though its fully driverless cars are restricted to an area roughly half that size. (Waymo hasn’t disclosed how many of its vehicles operate without safety drivers.)

Between January and December 2019, Waymo’s vehicles with trained safety drivers drove 6.1 million miles. In addition, from January 2019 through September 2020, its fully driverless vehicles drove 65,000 miles. Taken together, the company says this represents “over 500 years of driving for the average licensed US driver,” citing a 2017 survey of travel trends by the Federal Highway Administration.
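
A quick back-of-the-envelope check shows how that arithmetic lands. This is a minimal sketch: the roughly 12,000 miles-per-year figure below is an assumption in the ballpark of FHWA averages for licensed US drivers, not a number taken from Waymo’s paper.

```python
# Back-of-the-envelope check of the "over 500 years" comparison.
# The annual-mileage figure is an assumption roughly in line with FHWA
# averages for licensed US drivers, not a number from Waymo's paper.
total_miles = 6_100_000 + 65_000   # safety-driver miles + fully driverless miles
avg_annual_miles = 12_000          # assumed miles per average driver per year
print(total_miles / avg_annual_miles)  # ~513.75, i.e. over 500 years
```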

Waymo says its vehicles were involved in 47 “contact events” with other road users, including other vehicles, pedestrians, and cyclists. Eighteen of these events occurred in real life, while 29 were in simulation. “Nearly all” of these collisions were the fault of a human driver or pedestrian, Waymo says, and none resulted in any “severe or life-threatening injuries.”

The company says it also counts events in which its trained safety drivers assume control of the vehicle to avoid a collision. Waymo’s engineers then simulate what would have happened had the driver not disengaged the vehicle’s self-driving system to generate a counterfactual, or “what if,” scenario. The company uses these events to examine how the vehicle would have reacted and then uses that data to improve its self-driving software. Ultimately, these counterfactual simulations can be “significantly more realistic” than simulated events that are generated “synthetically,” Waymo says.
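
Waymo hasn’t published the internals of this process, but the general shape of a counterfactual replay can be sketched in a few lines. Everything below is hypothetical and illustrative; the names, objects, and methods are assumptions, not Waymo’s actual software.

```python
# Hypothetical sketch of a counterfactual ("what if") replay. This
# illustrates the general technique only; none of these objects or
# methods correspond to Waymo's actual software.
def replay_counterfactual(disengagement_log, driving_stack, simulator):
    """Replay a logged disengagement as if the safety driver never took over."""
    scene = simulator.load_scene(disengagement_log)  # rebuild the scene from logs
    while not scene.finished():
        frame = scene.sensor_frame()         # sensor data for this timestep
        command = driving_stack.plan(frame)  # the AV plans; no human takeover
        scene.step(command)                  # advance the simulated world
    return scene.outcome()  # e.g. "no contact" or a collision with impact speed
```

The outcome is the point: whether contact occurs at all and, if so, at what speed. The simulated rear-ender with a 1 mph impact speed described below is exactly this kind of counterfactual result.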

This use of simulated scenarios sets Waymo apart from other AV operators, said Daniel McGehee, director of the National Advanced Driving Simulator Laboratories at the University of Iowa. That’s because it allows Waymo to go deeper on a variety of issues that may contribute to a crash, such as sensor reliability or the interpretation of particular images by the vehicle’s perception software. “They’re really going beyond regular data,” McGehee said in an interview. “And that’s very new and very unique.”

Waymo says the majority of its collisions were extremely minor and occurred at low speeds. But the company highlighted eight incidents that it considered “most severe or potentially severe.” Three of these crashes occurred in real life and five only in simulation. Airbags were deployed, either actually or in simulation, in all eight incidents. 

In the paper, Waymo outlines how “road rule violations” of other drivers contributed to each of the eight “severe” collisions:

[Gallery: Waymo’s diagrams of the eight most severe collisions. Image: Waymo]

The most common type of crash involving Waymo’s vehicles was the rear-end collision. Waymo said it was involved in 14 actual and two simulated fender-benders, and in all but one, the other vehicle did the rear-ending.

The one incident where Waymo rear-ended another vehicle was in simulation: the company determined that the AV would have rear-ended another car that swerved in front of it and then braked hard despite a lack of obstruction ahead — which the company says was “consistent with antagonistic motive.” (There have been dozens of reports of Waymo’s autonomous vehicles being harassed by other drivers, including attempts to run them off the road.) The speed of impact, had it occurred in real life, would have been 1 mph, Waymo says. 

Waymo’s vehicles often drive hyper-cautiously or in ways that can frustrate a human driver — which can lead to fender-benders. But Waymo says its vehicles aren’t rear-ended more frequently than the average driver. “We don’t like getting rear ended,” Schwall said. “And we’re always looking for ways to get rear ended less.”

The only crash involving a fully driverless Waymo vehicle, without a safety driver behind the wheel, was also a rear-ending. The Waymo vehicle was slowing to stop at a red light when it was rear-ended by another vehicle traveling at 28 mph. An airbag deployed in the vehicle that struck the Waymo vehicle.

Just one crash took place with a passenger aboard a Waymo vehicle, in the Uber-like Waymo One ride-hailing service that has been operating since 2018. By early 2020, Waymo One was providing 1,000 to 2,000 rides every week. Most of these rides had safety drivers, though 5 to 10 percent were in fully driverless vehicles. The crash occurred when a Waymo vehicle with a safety driver behind the wheel was rear-ended by a vehicle traveling around 4 mph. No injuries were reported. 

Waymo was also involved in 14 simulated crashes, plus one actual collision, in which two vehicles collided at an intersection or while turning. These types of crashes, called “angled” collisions, are important because they account for over a quarter of all vehicle collisions in the US and nearly a quarter of all vehicle fatalities, Waymo says. The one actual, non-simulated angled collision occurred when a vehicle ran a red light at 36 mph, smashing into the side of a Waymo vehicle that was traveling through the intersection at 38 mph.

Fortunately, the “most severe” collision only took place in simulation. The Waymo vehicle was traveling at 41 mph when another vehicle suddenly crossed in front of it. In real life, the safety driver took control, braking in time to avoid a collision; in the simulation, Waymo’s self-driving system didn’t brake in time to prevent the crash. Waymo determined it could have reduced its speed to 29 mph before colliding with the other vehicle. The company says the crash “approaches the boundary” between two classifications of severe collisions that could have resulted in critical injuries. 

Self-driving car safety has drawn additional scrutiny after the first fatal crash in March 2018, when an Uber vehicle struck and killed a pedestrian in Tempe, Arizona. At the time, Waymo CEO John Krafcik said his company’s vehicles would have avoided that fatal collision. 

The vast majority of cars on the road today are controlled by humans, many of whom are terrible drivers — which means Waymo’s vehicles will continue to be involved in many more crashes. “The frequency of challenging events that were induced by incautious behaviors of other drivers serves as a clear reminder of the challenges in collision avoidance so long as AVs share roadways with human drivers,” Waymo says at the conclusion of its paper. AVs are expected to share the road with human drivers for decades to come, even under the rosiest predictions about the technology.

There’s no standard approach for evaluating AV safety. A recent study by RAND concluded that in the absence of a framework, customers are most likely to trust the government — even though US regulators appear content to let the private sector dictate what’s safe. In this vacuum, Waymo hopes that by publishing this data, policymakers, researchers, and even other companies may begin to take on the task of developing a universal framework. 

To be sure, there is currently no federal rule requiring AV companies to submit information about their testing activities to the government. Instead, a patchwork of state-by-state regulations governs what is and isn’t disclosed. California has the most stringent rules, requiring companies to obtain a license for different types of testing, disclose vehicle crashes, list the number of miles driven, and report how frequently human safety drivers were forced to take control of their autonomous vehicles (an event known as a “disengagement”). Unsurprisingly, AV companies hate California’s requirements.

What Waymo has provided with these two papers is just a snapshot of a decade’s worth of public road testing of autonomous vehicles — but a very important one nonetheless. Waymo’s competitors, including Argo, Aurora, Cruise, Zoox, and Nuro, publish blog posts detailing their approach to safety and submit data to California as part of the state’s AV testing program, but disclose little beyond that. With these publications, Waymo is throwing down the gauntlet for the rest of the AV industry, the University of Iowa’s McGehee said. 

“I think it will go a long way to force other automated driving companies to reveal these kinds of data moving forward,” he said, “so when things go wrong, they provide a framework of data that is available to the public.”

Not all companies are proceeding with as much caution as Waymo. Tesla CEO Elon Musk recently called Waymo’s approach to autonomous driving “impressive, but a highly specialized solution.” Last week, his company released a beta software update called “Full Self-Driving” to a select group of customers. Musk claimed it was capable of “zero intervention drives,” but within hours of the release, videos surfaced of Tesla vehicles swerving to avoid parked cars and having other near misses.

Years ago, Waymo considered developing an advanced driver-assist system like Tesla’s “Full Self-Driving” version of Autopilot but ultimately decided against it after becoming “alarmed” by the negative effects on the driver, Waymo’s director of systems engineering Nick Webb said. Drivers would zone out or fall asleep at the wheel. The experiment in driver assistance helped solidify Waymo’s mission: fully autonomous or bust. 

“We felt that Level 4 autonomy is the best opportunity to improve road safety,” Webb added. “And so we’ve committed to that fully.”