Uber’s self-driving car showed no signs of slowing before fatal crash, police say

Uber’s self-driving car was traveling at 40 mph when it struck a 49-year-old woman in Tempe, Arizona, on Sunday night, and showed no significant signs of slowing down, police said today.

The pedestrian, Elaine Herzberg, was transported to the hospital, where she later died from her injuries. Herzberg, who was pushing a bicycle across the street, “may have been” homeless, Tempe Police Sgt. Ronald Elcock said in a press conference. A safety driver, identified as 44-year-old Rafael Vasquez, was behind the wheel; there is no sign that the driver was impaired, police said. An Uber spokesperson confirmed that Vasquez is employed as a safety driver by the ride-hailing company. The vehicle was traveling in autonomous mode at the time of the crash.

The crash late Sunday evening has drawn intense scrutiny from the national media. It was likely the first time a pedestrian has been killed by an autonomous vehicle. Proponents of the technology have championed self-driving cars as a potential antidote to the tens of thousands of traffic fatalities that occur each year, while some safety advocates have expressed concern about the speed at which these vehicles are being pushed onto public roads for testing.

Image: ABC 15

Uber suspended all of its self-driving testing in cities across the country in the wake of the crash. Volvo and Toyota, which have self-driving partnerships with Uber or were negotiating deals, have declined to comment on the future of their relationships with the company.

The crash is the most significant fatal incident involving a self-driving vehicle since a Tesla driver was killed in 2016 while his vehicle was in semi-autonomous Autopilot mode. This case will arguably draw even more scrutiny, as it involves a more advanced vehicle and a more controversial company. The National Transportation Safety Board has dispatched a team to Tempe to investigate.

Herzberg was crossing the street mid-block when she was struck by the self-driving Uber, police said. “The safety of our citizens here in Tempe is of the utmost importance,” Elcock said. In reminding citizens to use crosswalks, Elcock added, “None of us ever want to go through this ever again, using the crosswalks will definitely limit this from happening again.”

Of course, manufacturers often tout the ability of autonomous vehicles to “see” beyond what normal human drivers can see, thanks to an expensive array of cameras, radars, and LIDAR sensors powering the car’s perception. And safe street advocates often bristle at law enforcement’s description of crash circumstances that appear to place blame on the victims.

The crash will likely lead to an intense round of finger-pointing among law enforcement, regulators, tech experts, and the auto industry. And since Uber’s self-driving cars are roving data recorders, there will be high-level interest in seeing what happens to the footage and crash stats captured by the vehicle itself.

The investigation will be similar to those of normal car accidents, Elcock said. Once it concludes, the case will be referred to the Maricopa County Attorney, who will determine whether to bring charges. When asked to describe what it means to be in “autonomous mode,” Tempe police deferred to Uber.

Comments

If that is indeed an image of the site, I don’t see anything that would have obstructed the sensors on the Uber car, and it looks like there would have been room to give the victim a wide berth.

Right, which could indicate that the car didn’t detect her, but in that case, shouldn’t the safety driver have done something? These vehicles still have difficulty detecting certain things; they may be able to see a person riding a bike or a pedestrian just fine, but a pedestrian walking a bike may not be detected properly.

But it still should have been detected as an obstruction regardless. If “person walking a bike” isn’t in the database, the sensors should default to a basic obstruction protocol.

Oh definitely. The SDC I worked on had low-level LiDAR that simply slammed on the brakes if anything was in range (other sensors should detect and avoid objects well before they’re in that range). That was a lower-speed, less sophisticated vehicle, but this accident shouldn’t have happened.
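For illustration, a last-resort failsafe of the kind described above might look something like the sketch below: it ignores object classification entirely and commands full braking on any LiDAR return inside the vehicle’s stopping envelope. All the names, thresholds, and numbers are hypothetical, not any real vendor’s stack.

    def emergency_stop_check(lidar_points, speed_mps,
                             max_decel=7.0, latency_s=0.25,
                             corridor_half_width=1.2):
        """Return True if full braking should be commanded.

        lidar_points: iterable of (x, y) returns in the vehicle frame,
        x forward and y lateral, in meters. All thresholds are guesses.
        """
        # Distance covered during system latency, plus braking distance
        # v^2 / (2a) once the brakes bite.
        stop_dist = speed_mps * latency_s + speed_mps ** 2 / (2 * max_decel)
        for x, y in lidar_points:
            # Any return inside the driving corridor and the stopping
            # envelope triggers braking, whether it was classified or not.
            if 0.0 < x < stop_dist and abs(y) < corridor_half_width:
                return True
        return False

    # At 40 mph (~17.9 m/s) this demands braking on anything closer than
    # roughly 27 m directly ahead.
    print(emergency_stop_check([(20.0, 0.3)], speed_mps=17.9))  # True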

It seems even more interesting that the Volvo itself didn’t brake. The XC90’s onboard safety-assist systems would have stopped that car.
So, do they remove those systems when using their self-driving setup so that they don’t interfere?
This seems like a trivial scenario that pretty much any modern car would not have let happen.

They turn them off as they’d interfere with the supposedly superior system.

I don’t think the Volvo one works at 40 mph. I had a VW whose system only worked below 20 mph (I may be wrong, but it certainly didn’t work at higher speeds). It would only alert the driver.

It certainly would have detected her, though; mine used to detect bushes when parking.

Thx, didn’t know they turn it off. The emergency braking assistant works up to 100 mph, at least it does on my wife’s VW. And it has already saved my ass a few times. However, the active front assist for pedestrians only works up to 20 or 30 mph, but emergency braking would have triggered.

But it seems the woman crossed in front of the car without warning, which would explain why no systems triggered a stop.

The sign next to the road seems suspicious, though it may have nothing to do with the accident.
Fact is that both the driver and the autopilot missed a pedestrian. Neither reacted before impact so, however this happened, it probably wasn’t a simple matter of detection and avoidance.

Uber, always screwing up the party by bringing the drugs

I really don’t see how Uber/the driver can’t be at fault now. Maybe the pedestrian was walking parallel to the road, then suddenly switched to crossing? But even in that case the car and driver should’ve been tracking her before she entered the road, and there seems to be plenty of room to swerve. If the car was unable to handle that situation, it should not have been on public roads yet.

If someone was walking on the side of the road and then crossed in front of me, away from a legal crossing zone, it would legally not be my fault if I hit them.

I might feel awful about it. But if this person did illegally cross traffic while the car was operating within the rules of the road… well, only one person was doing something foolish.

Even if the person was being foolish, these cars should do a better job of detecting and avoiding. Otherwise, in a sea of thousands of such self-driving cars in the near future, any encroachment onto the road would mean certain death, whereas a human driver might at least apply the brakes.

We simply don’t know enough to say whether the car worked properly. If the lady walked into the car’s path at the last moment, no driver (human or autonomous) could have stopped the vehicle. It could be a failure of the tech as well. We don’t know. But we should know soon. I’m sure they have all the data they need to figure it out.
I find it interesting that police are considering bringing charges. Who do you charge when the car is driving itself?

I would say easily the guy behind the wheel. His entire purpose is to make sure these things don’t happen.

But not really. I see the safety driver as a person who takes over if the supposedly superior self-driving system signals that it’s failing (this may not be right), or if the car is in a known situation where the self-driving doesn’t work. We humans are slow, even when we should be super focused.

Hopefully the data will tell what happened.

He was not driving the car. The police said the autonomous mode was driving. He was a passenger. I’ve never heard of a passenger in a vehicle being found criminally liable for hitting a pedestrian.

And either way, the law in Arizona seems to be that Uber is responsible for the actions of the car, not the safety driver.

Yes, Uber would be liable.

Cars can’t read minds.

But they can: we know the mass and habits of pedestrians, so we can map a probability zone around a detected ped for anticipated shenanigans. Ditto bikes, ditto unexpected obstructions. This one wasn’t likely dashing anywhere, not with a loaded bike. It sure feels like there’s a flaw in the implementation.
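One way to read “map a probability zone”: treat every tracked pedestrian as able to reach any point within max speed times planning horizon of their last position, and flag the plan if that reachable circle touches the car’s path. The toy sketch below assumes exactly that; real planners use far richer motion models, and every name and number here is made up for illustration.

    import math

    def path_conflicts(ped_xy, car_path_xy, max_speed_mps=2.0,
                       horizon_s=3.0, car_half_width=1.0):
        """True if the pedestrian's worst-case reachable circle overlaps
        any waypoint on the planned path. Parameters are guesses:
        2 m/s is a brisk walk, 3 s a short planning horizon."""
        # Worst-case distance the pedestrian could cover, padded by
        # half the vehicle's width.
        radius = max_speed_mps * horizon_s + car_half_width
        px, py = ped_xy
        return any(math.hypot(px - x, py - y) <= radius
                   for x, y in car_path_xy)

    # A pedestrian 5 m to the side of a waypoint still conflicts:
    # 2 m/s * 3 s = 6 m of reachable motion plus vehicle width covers it.
    print(path_conflicts((30.0, 5.0), [(30.0, 0.0), (45.0, 0.0)]))  # True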

Agreed. Or simply a mechanical failure of some type.

Right. The cars should expect that anyone might suddenly jump into their path, and they should be able to tell the moment a person changes direction.

The circumstances of the encroachment matter.
If someone darts into the street in such a way that a human driver has no time to react, it’s as good as suicide. Our intuition and identification abilities are far better, but our ability to pay attention is terrible.
An automated car may try to avoid what it sees, but this depends on it having the time to physically react. The difference is that an autopilot has no excuse of being distracted, but there are scenarios that are impossible to win even with superhuman reflexes (see the rough numbers below).
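Some back-of-envelope arithmetic for that “time to physically react” point, using rough textbook assumptions (7 m/s² hard braking on dry asphalt, a 1.5 s human reaction versus a 0.2 s system latency):

    MPH_TO_MPS = 0.44704
    v = 40 * MPH_TO_MPS                   # ~17.9 m/s, the reported speed

    decel = 7.0                           # m/s^2, hard braking, an assumption
    brake_dist = v ** 2 / (2 * decel)     # ~22.9 m once the brakes bite

    human_total = 1.5 * v + brake_dist    # ~49.7 m with a 1.5 s reaction
    machine_total = 0.2 * v + brake_dist  # ~26.4 m with a 0.2 s latency

    print(f"human:     {human_total:.1f} m")
    print(f"automated: {machine_total:.1f} m")
    # Someone stepping out 15 m ahead is inside even the automated
    # stopping distance: no reflexes, human or otherwise, can avoid
    # that impact by braking alone.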

The danger is that, because the media highlights autopilot accidents, these systems may get an unfair reputation even if they are superior drivers.

True what you say, of course, but I think that if we are really going to give computers the wheel because they are better drivers, then they should be able to handle some amount of foolishness… jamming on the brakes when someone steps off the curb in front of the vehicle.

The big tell in this case is that there’s no evidence of the car braking/slowing (per police reports). If that’s the case, it tells us that the car probably didn’t detect the woman at all.
