Waymo pulls back the curtain on 6.1 million miles of self-driving car data in Phoenix

Photo by Vjeran Pavic / The Verge

In its first report on its autonomous vehicle operations in Phoenix, Arizona, Waymo said that it was involved in 18 crashes and 29 near-miss collisions during 2019 and the first nine months of 2020.

These crashes included rear-enders, vehicle swipes, and even one incident in which a Waymo vehicle was T-boned at an intersection by another car traveling at nearly 40 mph. The company said that no one was seriously injured and that “nearly all” of the collisions were the fault of the other driver.

The report is the deepest dive yet into the real-life operations of the world’s leading autonomous vehicle company, which recently began offering rides in its fully driverless vehicles to the general public. Autonomous vehicle (AV) companies can be a black box, with most firms keeping a tight lid on measurable metrics and only demonstrating their technology to the public under the most controlled settings.

Indeed, Waymo, which was spun out of Google in 2016, mostly communicates about its self-driving program through glossy press releases or blog posts that reveal scant data about the actual nuts and bolts of autonomous driving. But in this paper, and another also published today, the company is showing its work. Waymo says its intention is to build public trust in automated vehicle technology, but these papers also serve as a challenge to other AV competitors.

“This is a major milestone, we think, in transparency,” said Matthew Schwall, head of field safety at Waymo, in a briefing with reporters Wednesday. Waymo claims this is the first time that any autonomous vehicle company has released a detailed overview of its safety methodologies, including vehicle crash data, when not required by a government entity. “Our goal here is to kickstart a renewed industry dialogue in terms of how safety is assessed for these technologies,” Schwall said.

The two papers take different approaches. The first maps out Waymo’s multilayered approach to safety, which the company describes in three layers.

The second paper is meatier, with detailed information on the company’s self-driving operations in Phoenix, including the number of miles driven and the number of “contact events” Waymo’s vehicles have had with other road users. This is the first time that Waymo has ever publicly disclosed mileage and crash data from its autonomous vehicle testing operation in Phoenix.

The public road testing data covers Waymo’s self-driving operations in Phoenix from January 2019 through September 2020. The company has a fleet of approximately 600 vehicles. More than 300 of them operate in an approximately 100-square-mile service area that includes the towns of Chandler, Gilbert, Mesa, and Tempe — though its fully driverless cars are restricted to an area that is only half that size. (Waymo hasn’t disclosed how many of its vehicles operate without safety drivers.)

Between January and December 2019, Waymo’s vehicles with trained safety drivers drove 6.1 million miles. In addition, from January 2019 through September 2020, its fully driverless vehicles drove 65,000 miles. Taken together, the company says this represents “over 500 years of driving for the average licensed US driver,” citing a 2017 survey of travel trends by the Federal Highway Administration.
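That “500 years” equivalence is simple arithmetic. Here is a quick sanity check; note the average annual mileage below is an illustrative assumption, since Waymo doesn’t state the exact figure it took from the FHWA survey:

```python
# Back-of-the-envelope check of Waymo's "over 500 years of driving" claim.
# AVG_ANNUAL_MILES is an assumed ballpark, not Waymo's exact input.
AVG_ANNUAL_MILES = 12_000      # rough estimate for an average US driver

supervised_miles = 6_100_000   # Jan-Dec 2019, with trained safety drivers
driverless_miles = 65_000      # Jan 2019 - Sep 2020, fully driverless

total_miles = supervised_miles + driverless_miles
driver_years = total_miles / AVG_ANNUAL_MILES
print(f"{total_miles:,} miles is about {driver_years:.0f} years of average driving")
```

Under that assumption the fleet’s mileage works out to a bit over 500 driver-years, consistent with the claim in the paper.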

Waymo says its vehicles were involved in 47 “contact events” with other road users, including other vehicles, pedestrians, and cyclists. Eighteen of these events occurred in real life, while 29 were in simulation. “Nearly all” of these collisions were the fault of a human driver or pedestrian, Waymo says, and none resulted in any “severe or life-threatening injuries.”

The company says it also counts events in which its trained safety drivers assume control of the vehicle to avoid a collision. Waymo’s engineers then simulate what would have happened had the driver not disengaged the vehicle’s self-driving system to generate a counterfactual, or “what if,” scenario. The company uses these events to examine how the vehicle would have reacted and then uses that data to improve its self-driving software. Ultimately, these counterfactual simulations can be “significantly more realistic” than simulated events that are generated “synthetically,” Waymo says.
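Waymo hasn’t published its simulation code, but the counterfactual workflow it describes can be sketched: replay the logged scene from the moment of takeover, substitute the planner’s intended behavior for the safety driver’s action, and check for contact. The toy 1-D model below is entirely hypothetical — the class, field names, and constant-deceleration physics are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Disengagement:
    """A logged safety-driver takeover, reduced to 1-D distances along the lane."""
    av_speed_mps: float        # AV speed (m/s) when the driver took over
    gap_m: float               # distance (m) to the conflicting road user
    planner_brake_mps2: float  # deceleration the self-driving stack would command

def counterfactual_contact(event: Disengagement) -> bool:
    """Replay the event as if the driver had NOT taken over.

    Contact occurs if the AV's stopping distance under the planner's own
    braking exceeds the available gap (constant-deceleration approximation).
    """
    stopping_distance = event.av_speed_mps ** 2 / (2 * event.planner_brake_mps2)
    return stopping_distance > event.gap_m

# A takeover the planner would also have handled: stops in ~8.3 m with 15 m of gap.
print(counterfactual_contact(Disengagement(10.0, 15.0, 6.0)))

# A takeover that becomes a simulated "contact event": needs 27 m but has only 20 m.
print(counterfactual_contact(Disengagement(18.0, 20.0, 6.0)))
```

In the real system, events flagged as counterfactual contacts are the ones that feed back into software improvements; everything here beyond that loop is a simplification.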

This use of simulated scenarios sets Waymo apart from other AV operators, said Daniel McGehee, director of the National Advanced Driving Simulator Laboratories at the University of Iowa. That’s because it allows Waymo to go deeper on a variety of issues that may contribute to a crash, such as sensor reliability or the interpretation of particular images by the vehicle’s perception software. “They’re really going beyond regular data,” McGehee said in an interview. “And that’s very new and very unique.”

Waymo says the majority of its collisions were extremely minor and at low speeds. But the company highlighted eight incidents that it considered “most severe or potentially severe.” Three of these crashes occurred in real life and five only in simulation. Airbags were deployed in all eight incidents.

In the paper, Waymo outlines how “road rule violations” by other drivers contributed to each of the eight “severe” collisions. (Diagrams of the eight incidents: Waymo)

The most common type of crash involving Waymo’s vehicles was the rear-end collision. Waymo said it was involved in 14 actual and two simulated fender-benders, and in all but one, the other vehicle did the rear-ending.

The one incident where Waymo rear-ended another vehicle was in simulation: the company determined that the AV would have rear-ended another car that swerved in front of it and then braked hard despite a lack of obstruction ahead — which the company says was “consistent with antagonistic motive.” (There have been dozens of reports of Waymo’s autonomous vehicles being harassed by other drivers, including attempts to run them off the road.) The speed of impact, had it occurred in real life, would have been 1 mph, Waymo says.

Waymo’s vehicles often drive hyper-cautiously or in ways that can frustrate human drivers, which can lead to fender-benders. But Waymo says its vehicles aren’t rear-ended more frequently than the average driver’s. “We don’t like getting rear-ended,” Schwall said. “And we’re always looking for ways to get rear-ended less.”

The only crash involving a fully driverless Waymo vehicle, without a safety driver behind the wheel, was also a rear-ending. The Waymo vehicle was slowing to stop at a red light when it was rear-ended by another vehicle traveling at 28 mph. An airbag deployed in the vehicle that struck the Waymo vehicle.

Just one crash took place with a passenger in a Waymo vehicle, in the Uber-like Waymo One ride-hailing service that’s been operating since 2018. By early 2020, Waymo One was doing 1,000 to 2,000 rides every week. Most of these rides had safety drivers, though 5 percent to 10 percent were fully driverless vehicles. The crash occurred when a Waymo vehicle with a safety driver behind the wheel was rear-ended by a vehicle traveling around 4 mph. No injuries were reported.

Waymo was also involved in 14 simulated crashes in which two vehicles collided at an intersection or while turning. There was also one actual collision. These types of crashes, called “angled” collisions, are important because they account for over a quarter of all vehicle collisions in the US, and nearly a quarter of all vehicle fatalities, Waymo says. The one actual, non-simulated angled collision occurred when a vehicle ran a red light at 36 mph, smashing into the side of a Waymo vehicle that was traveling through the intersection at 38 mph.

Fortunately, the “most severe” collision only took place in simulation. The Waymo vehicle was traveling at 41 mph when another vehicle suddenly crossed in front of it. In real life, the safety driver took control, braking in time to avoid a collision; in the simulation, Waymo’s self-driving system didn’t brake in time to prevent the crash. Waymo determined it could have reduced its speed to 29 mph before colliding with the other vehicle. The company says the crash “approaches the boundary” between two classifications of severe collisions that could have resulted in critical injuries.

Self-driving car safety has drawn additional scrutiny since the industry’s first fatal crash in March 2018, when an Uber test vehicle struck and killed a pedestrian in Tempe, Arizona. At the time, Waymo CEO John Krafcik said his company’s vehicles would have avoided that fatal collision.

The vast majority of cars on the road today are controlled by humans, many of whom are terrible drivers — which means Waymo’s vehicles will continue to be involved in many more crashes. “The frequency of challenging events that were induced by incautious behaviors of other drivers serves as a clear reminder of the challenges in collision avoidance so long as AVs share roadways with human drivers,” Waymo says at the conclusion of its paper. AVs are expected to share the road with human drivers for decades to come, even under the rosiest predictions about the technology.

There’s no standard approach for evaluating AV safety. A recent study by RAND concluded that in the absence of a framework, customers are most likely to trust the government — even though US regulators appear content to let the private sector dictate what’s safe. In this vacuum, Waymo hopes that by publishing this data, policymakers, researchers, and even other companies may begin to take on the task of developing a universal framework.

To be sure, there is currently no federal rule requiring AV companies to submit information about their testing activities to the government. Instead, a patchwork of state-by-state regulations governs what is and isn’t disclosed. California has the most stringent rules, requiring companies to obtain a license for different types of testing, disclose vehicle crashes, report the number of miles driven, and report the frequency at which human safety drivers were forced to take control of their autonomous vehicles (known as a “disengagement”). Unsurprisingly, AV companies hate California’s requirements.

What Waymo has provided with these two papers is just a snapshot of a decade’s worth of public road testing of autonomous vehicles, but a very important one nonetheless. Many of Waymo’s competitors, including Argo, Aurora, Cruise, Zoox, and Nuro, publish blog posts detailing their approach to safety and submit data to California as part of the state’s AV testing program, but not much else. With these publications, Waymo is throwing down the gauntlet to the rest of the AV industry, the University of Iowa’s McGehee said.

“I think it will go a long way to force other automated driving companies to reveal these kinds of data moving forward,” he said, “so when things go wrong, they provide a framework of data that is available to the public.”

Not all companies are proceeding with as much caution as Waymo. Tesla CEO Elon Musk recently called Waymo’s approach to autonomous driving “impressive, but a highly specialized solution.” Last week, his company released a beta software update called “Full Self-Driving” to a select group of customers. Musk claimed it was capable of “zero intervention drives,” but within hours of the release, videos surfaced of Tesla vehicles swerving to avoid parked cars and other near misses.

Years ago, Waymo considered developing an advanced driver-assist system like Tesla’s “Full Self-Driving” version of Autopilot but ultimately decided against it after becoming “alarmed” by the technology’s negative effects on drivers, Waymo’s director of systems engineering Nick Webb said. Drivers would zone out or fall asleep at the wheel. The experiment in driver assistance helped solidify Waymo’s mission: fully autonomous or bust.

“We felt that Level 4 autonomy is the best opportunity to improve road safety,” Webb added. “And so we’ve committed to that fully.”

Comments

By the way, Waymo is not the world’s leading autonomous vehicle company. There is no such thing, just as there is no world’s leading phone company.
You’d need to select a metric to determine the leading company (revenue, profit, miles driven, Level 4 availability).
You could write “a leading autonomous vehicle company,” which is true, but there isn’t really a good metric to compare the different companies. Each of them has its own advantages and disadvantages.

Number of miles driven?

That would be Tesla.

Whether or not you consider their autopilot implementation good enough is another story but they have the most miles of "autonomous driving" out of any company simply due to the massive fleet size and beta testing with customers.

There is no clear metric to define who is best.

You can’t take your hands off the wheel in a Tesla though. You can go from point A to point B in a Waymo van and never even sit in the driver’s seat. As a metric to define who’s best that one seems… fairly obvious to me.

Waymo operates in a very limited, laser mapped area. It’s not that difficult to design a system to operate in a controlled environment. Tesla’s system is designed to work across the country, dealing with all kinds of different conditions.

Not to mention that even though Waymo vehicles don’t have someone in the driver’s seat, they still have safety observers sitting at consoles, ready to take over if a vehicle gets stuck.

There is no clear metric to define who is best.

No, but there is a pretty clear metric to define what "autonomous driving" is, and Tesla does not rise to that standard even with their latest beta: https://www.nhtsa.gov/technology-innovation/automated-vehicles#topic-road-self-driving

As for what defines "best", I would put deaths and injuries pretty far up there in the criteria, and Tesla doesn’t rate very well there either.

I wouldn’t put profitability in there; just because it’s profitable doesn’t mean the tech is good. I dunno what other company is as close as Waymo is by all accounts to full autonomy, so I dunno why it’s not “the world’s leading autonomous vehicle company.” Uber isn’t, and Tesla damn sure isn’t.

Uber isn’t, and Tesla damn sure isn’t.

I don’t think many would argue with your Uber evaluation since they’re basically doing the Waymo/Cruise thing but with less apparent progress. But Tesla’s standing is up for debate since their approach is entirely different and there’s no apples-to-apples comparison to make. They have no lidar, but a million data gathering vehicles. They have less simulated miles but more IRL miles. How can anyone compare these with certainty?

Waymo might be at 95% FSD in Tempe, while Tesla might be 60% globally. Will Waymo hit 100% in Tempe before Tesla hits 100% globally? If so, will Tesla reach 100% before Waymo can learn the other regions of the world?

But one thing is pretty certain: if they both achieve FSD, Tesla will be in a much better position than Waymo. They can instantly turn on FSD for a million cars that are already on the road, and they can immediately produce more at a much larger scale and for much less cost.

What Tesla has is great but it’s just a fancied up version of ADAS and I don’t think it’s in the same ballpark. I would argue Waymo is safer and further along, so I dunno what other metric you’d bother with comparing them to Tesla. That there are more Teslas? Sure, but they aren’t actually autonomous.

I agree Waymo appears to be "safer and further along" in the Arizona neighborhood where it operates. But they’re unable to drive at all in the rest of the world (more or less). So again, they’re two completely different approaches, each with the possibility for success even if you "don’t think [Tesla is] in the same ballpark" based on your intuition.

You ever search for cats on Google Images? There is always one pic that is way off. I think that’s how Tesla’s approach will always be until they add maps and lidar. Right now they are selling to drivers of cars so it doesn’t matter, but it will be years before Tesla has cars on the road without drivers. It is estimated that the cost of some lidars will come down to under $100. It won’t be costly and will be safer. I think a mixture of cameras and lidar with maps is the way to go. The safer the better.

Lidar doesn’t give you a clear image of the world around you. All it gives you is a mess of points. You still have to apply an image over top of that to know what the vehicle is actually seeing. An image recognition system still has to be developed. Lidar just adds another layer of complexity.

We don’t use lidar to drive our cars. All people use to drive right now are two cameras on a slow gimbal. Tesla’s solution uses eight cameras for a simultaneous 360 degree view around the vehicle. There is no reason that they can’t refine the software to make a camera only system work.

Waymo has a disengagement rate of once per 13,000 miles driven. That’s over 100x the miles between disengagements I’m getting on my Tesla, even when it’s doing the extremely basic task of staying in the same lane and not colliding with the car in front of it. Hopefully the FSD beta is better… but for now I would still say Waymo’s “latest stable version” is clearly better.

Yeah, but you can only use Waymo in a very limited area. So if you had Waymo’s system it would be way better, but you wouldn’t be able to activate it (unless you happen to be in that city).

Two points.

First, the geofencing limitation imposed by Waymo is primarily a business decision, not a technical one, as they are trying to build public trust and a sense of quality in their product. While the Phoenix area is the only place you can publicly ride in a Waymo car, they have been test-driving Waymo cars for several years in over 25 cities, including ones in Michigan, Florida, and Washington, testing them in rain, snow, fog, ice, wind, and even dust storms. Here’s a blog about how their cars handled heavy fog and dust storms:

https://medium.com/waymo/waymo-and-the-weather-9ddd66ee61a

Second, while technically there is no geographic restriction on Tesla, in reality there are a lot of situations where it doesn’t work. From personal experience, if the lane markers are too hard to see or if the road turns too much, the Tesla will disengage itself quite often, force itself to drive really slowly (I’ve had it say it will only drive 30–40 mph in a 50 mph zone), or refuse to engage (the grey steering wheel icon that turns blue when you engage will completely disappear if the car feels it cannot enable Autopilot).

So de facto, Tesla’s solution is also quite limited by geography. Plus, it gets blinded when the sun is low in the sky, so if you are commuting on an east–west road in the morning or early evening, it will disengage. It also disengages in rain and snow, and sometimes randomly at night.

As I said, I have high hopes for the new FSD beta… but I don’t know how they get around the fact that their cameras get blinded by the sun (whereas Waymo’s lidars are, by design, much less vulnerable to having the sun shine into their “eyes”).

So far there is no real universal self-driving (that works everywhere).
So the world’s leading self-driving company is the company that will be first to achieve self-driving everywhere (in the Western world), even during rain.
And which company is this?
Well, we don’t know. We can’t compare Tesla and Waymo, because we don’t have the data to compare them. There is some US data, like miles driven without intervention, but this metric doesn’t say much. I think Waymo scored about 1,000 times better than BMW, but BMW did some much harder experiments and does most of its testing outside the US anyway.

I would say disengagement rate is a decent metric, as I treat any time a driver needs to explicitly override the car as some type of failure of the system. Waymo’s disengagement rate is 0.076 per 1,000 miles, or roughly once per 13,000 miles.
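The two forms of that rate are the same number viewed two ways; converting between them is a single division (values taken from the figures quoted above):

```python
# Disengagement rate quoted two ways: events per 1,000 miles vs. miles per event.
rate_per_1k_miles = 0.076
miles_per_disengagement = 1_000 / rate_per_1k_miles
print(f"one disengagement every {miles_per_disengagement:,.0f} miles")
```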

As the owner of a Tesla, I can tell you that my personal disengagements are MUCH more frequent than once per 13,000 miles (more like once per 130 miles or less), even though all the car is doing is staying in the same lane and not crashing into the car in front of it. And that’s despite the fact that the majority of those miles are at highway speeds, which lets a Tesla rack up lots of miles of easy driving in a short period of time. Because I drive during commute times when the sun is low in the sky, my Tesla frequently disables its Autopilot feature due to the sun blinding the cameras.

Even that’s not a great metric. A more risk-taking algorithm would disengage less.

Huh? Typically it isn’t the algorithm that says “HELP!”; the human driver thinks the algorithm is doing something stupid/risky and takes control. The problem is perhaps the other way around: the safety driver will let the car drive like a grandma just so it doesn’t cause an accident.

Actually, the problem with this metric is that the situations can’t be compared. Every car company can teach its car to drive perfectly around the block, so the car can get a perfect score over thousands of miles if it just keeps driving around the block.
Waymo’s environment is relatively easy. They selected a city with little rain or snow, one built for cars. It’s really easy to drive there, and they did a lot of optimization to make it work well.

That’s incorrect. While Waymo opened to the public in one city, the majority of their miles come from the ~25 other cities they are testing in.

They test in rain

They test in snow

Even in a haboob (intense dust storm)

The key thing is that they don’t want any bad PR from a crash like Uber or Tesla had earlier, even if the fault is another human’s.

Right now, while the technology is unproven, revenue and profit are meaningless; safety and convenience are what matter.

Convenience is a huge pain to evaluate. Where you can use it, how jerky the drive is, how fast the drive is and so forth… tons of variables you can’t realistically estimate and compare.
On the other hand, safety can be compared easily, the most relevant metric is accidents+interventions per mile – it doesn’t matter if you do millions of miles and a driver needs to fix stupidity every second intersection.
Sure, you might claim safety is easy to game by driving too slow and stopping faster (= worse convenience), but then you get rear-ended more often so I believe it should mostly balance out.

In terms of troubles/mile, Waymo is decent, but I don’t know the stats of Cruise, Uber, and the tons of others limited to ~40 km/h to see how they compare. They are ALL currently ahead of the car companies.
But nobody knows how to reach L5. It might be Waymo or any of their competitors slowly expanding their reach until they are available globally. It might also be a car company that iteratively adds functionality until its cars reach L5.

"On the other hand, safety can be compared easily, the most relevant metric is accidents+interventions per mile"
No, only if they all drive in the same environment. Miles themselves can’t really be compared, since some are much harder or easier to drive than others.

As an occasional motorcycle rider, T-Bones at intersections scare the crap out of me. I look forward to more AVs on the road!!!

Seriously – human drivers are the worst. I enjoy driving but I’d give it up in a second to not have to worry about others behind the wheel, especially while biking around NYC.
