Logic (and data) suggests that well-developed self-driving cars are safer than manually driven ones, considering that humans are prone to mistakes and poor decision-making. But a study published this week by the University of Michigan's Transportation Research Institute suggests the exact opposite, saying that "the current best estimate is that self-driving vehicles have a higher crash rate per million miles traveled than conventional vehicles."
The study, drawing on data compiled from 2012 through September of this year, covers a razor-thin slice of driving: Google's operations in Mountain View and Austin, Texas; a single cross-country trip conducted by Delphi; and a single 550-mile trip from Audi. (Of the 10 companies approved to operate self-driving vehicles in California, only three were included, due to a lack of publicly available data for the others.) From that, it found that the self-driving cars crashed 9.1 times per million miles of travel, compared to 1.9 crashes per million miles for conventional vehicles, while also acknowledging that the self-driving cars had a lower fatality rate (0 compared to 0.01). Adjusted for typical rates of underreporting car accidents, the crash rate for manually driven cars rises to 4.1 per million miles, while self-driving cars, which face stringent reporting requirements, remain unchanged at 9.1. Even with that adjustment, the self-driving crash rate is more than twice as high.
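The arithmetic behind that comparison can be sketched in a few lines. This is an illustrative back-of-the-envelope calculation using only the figures cited above; the per-million-mile normalization function is a generic assumption, not the study's actual methodology:

```python
def crashes_per_million_miles(crashes: float, miles_traveled: float) -> float:
    """Normalize a raw crash count to a rate per one million miles of travel."""
    return crashes / miles_traveled * 1_000_000

# Rates cited in the study (already expressed per million miles):
SELF_DRIVING = 9.1           # stringent reporting requirements, so no adjustment
CONVENTIONAL_REPORTED = 1.9  # raw reported rate for manually driven cars
CONVENTIONAL_ADJUSTED = 4.1  # after correcting for typical underreporting

# Even after the underreporting adjustment more than doubles the
# conventional rate, the self-driving rate is still over twice as high.
print(SELF_DRIVING / CONVENTIONAL_ADJUSTED)  # roughly 2.2
```

The key point the adjustment captures is asymmetric reporting: minor crashes in conventional cars often go unreported, while the test programs must log every incident, so the raw comparison is biased against the self-driving fleet even before questions of sample size arise.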
But by the study's own admission, these results may mean absolutely nothing at all; in fact, the crash rate may actually be higher for conventional vehicles, the exact opposite of its stated conclusion. For one thing, the data set for the self-driving cars is roughly 0.0000002 times the size of the data set for manually driven cars. Furthermore, all of the self-driving vehicles in the study are prototypes designed for testing and development, not real-world use.
There's also the consideration that, as far as we know from the data available, a self-driving car has never been at fault in a collision. And there's a reasonable explanation for why the crash rate has been higher in this very small data set: where a human driver might swerve or accelerate to escape an impending rear-end collision, even if doing so means driving unsafely, a self-driving system won't break the rules to save itself. It has no misgivings about taking the hit, relying on safety systems to keep passengers safe in a fender-bender.
What does that mean for the real world? Right now, nothing — by the time self-driving cars are commercialized in any large-scale way, they'll be more refined than they are today. (This also explains why Tesla's Autopilot occasionally acts erratically, and why Tesla calls it a "beta" and insists that drivers keep their hands on the wheel.) But it does suggest that self-driving cars could be victims of collisions when they're substantially outnumbered by human drivers. In the distant future, when autonomous driving is inevitably the norm, the crash and injury rates will universally decline.
But, as the study states, "we currently cannot rule out, with a reasonable level of confidence, the possibility that the actual rates for self-driving vehicles are lower than for conventional vehicles," a caveat that undercuts its own headline conclusion. It's hard to get past that line.