Uber halts self-driving tests after pedestrian killed in Arizona

Image: Uber

A female pedestrian was killed after being struck by an autonomous Uber vehicle in Arizona, according to ABC 15. In response, Uber has pulled all of its self-driving cars from public roads in the state as well as in the cities of San Francisco, Toronto, and Pittsburgh.

The crash occurred near Mill Avenue and Curry Road early Monday morning in Tempe, Arizona, police confirm. The Uber vehicle was headed northbound when a woman walking across the street was struck. The woman was taken to the hospital, where she later died from her injuries. Early reports suggested that she may have been a bicyclist, but that was not the case. Police have identified her as 49-year-old Elaine Herzberg.

Uber confirms that the vehicle was traveling in autonomous mode with a safety driver behind the wheel during the crash. That would make the pedestrian one of the first known victims of a crash involving a self-driving car.

The female safety driver was the only person inside the vehicle at the time of the crash, and did not sustain any injuries, an Uber spokesperson said. The company declined to disclose any additional details about the driver’s identity or qualifications. The vehicle is still in police possession. Data from the vehicle’s many cameras and sensors will no doubt prove useful to investigators when determining the cause of the crash.

The self-driving Uber vehicle involved in a fatal crash in Tempe, Arizona.
Image: ABC 15

“Our hearts go out to the victim’s family,” an Uber spokesperson told The Verge. “We are fully cooperating with local authorities in their investigation of this incident.”

This wasn’t the first crash involving one of Uber’s autonomous vehicles in Tempe. Almost a year ago, a self-driving Uber was knocked onto its side after another vehicle failed to yield and hit it. There were no injuries during the incident.

Uber began testing its self-driving cars in Arizona in February 2017 after California’s Department of Motor Vehicles revoked the registrations of the company’s fleet operating in San Francisco. Uber had refused to apply for a $150 permit to test the vehicles in the city.

Update March 19th, 12:58PM ET: Uber CEO Dara Khosrowshahi tweeted his condolences to the family of the victim.

Update March 19th, 1:28PM ET: The US National Transportation Safety Board has opened an investigation into the crash and is sending a small team to Tempe, according to Bloomberg.

Here’s the full statement from the Tempe Police Department regarding the incident.

On March 18, 2018 at approximately 10pm, Tempe PD responded to a traffic collision on Curry Road and Mill Avenue in Tempe, Arizona. The vehicle involved is one of Uber’s self-driving vehicles. It was in autonomous mode at the time of the collision, with a vehicle operator behind the wheel. The vehicle was traveling northbound just south of Curry Road when a female walking outside of the crosswalk crossed the road from west to east and was struck by the Uber vehicle. The female was identified as 49-year-old Elaine Herzberg. Herzberg was transported to a local area hospital where she passed away from her injuries. Uber is assisting and this is still an active investigation.

Comments

This changes everything.

Agreed. I doubt this changes the overall safety case for self-driving cars (no one said they were perfect, just that they cause far fewer incidents per mile driven than human-driven vehicles), but it comes right as legislation and regulations were being approved so quickly that it almost seemed the vehicles would be everywhere by 2019, if not the end of this year. Now, even if this unfortunate death was an act of God incident, it will likely slow politicians and bureaucrats from passing those new rules, at least for the time being.

act of God incident,

God does not exist so how can he drive a car?

Act of God is a legal term for incidents outside of human control where no one can be held responsible, typically sudden natural disasters, being used illustratively in my comment. It does not presuppose a higher being of any sort.

But this accident is clearly not outside of human control. The car being autonomous means it is controlled by software created by humans.

…being used illustratively in my comment.

What did or did not cause the incident is actually not important to the message of my comment.

30,000 people die every year in car accidents caused by humans = Act of God

1 person dies in potentially life saving AV research = Danger to humanity

Don’t mean to be tone deaf, but this should not slow down research into this. On the contrary, we need it now more than ever, as driving is one of the most dangerous things we do every day.

I hope it doesn’t slow down research, but what should affect the pace of research and what will affect it are not necessarily the same thing. Depending on how the event is covered and how long it lingers in the mainstream, it could make politicians less willing to approve the regulations and laws that would help research and adoption. That’s all I was saying.

I think it should only slow down Uber, as there needs to be a thorough investigation to figure out why this happened and how to prevent it from happening again.

Particularly given their (being generous) "cavalier" attitude towards regulation.

Right! Waymo seems methodical in its push toward fully autonomous vehicles; Uber just seems to want to get rid of human drivers to boost profits ASAP.

Exactly. Uber’s continued existence relies on them getting driver-less cars working (and even then they’ll need to trim fat) before they run out of venture-capital, while Waymo has the purse of Google to suckle on.

I agree that it should not slow down research, but this thing was rolling around city streets. That’s not just research. This thing is in active trials. When a human hits another human, it’s actually not an act of god, it’s negligence. Someone did something wrong, whether it’s the driver or the person that was hit. Here, we have a person and a vehicle. It’s possible the person that was hit did something dumb and walked out in front of the vehicle in a way that no driver, whether a person or computer, could do anything to avoid hitting her. It’s also possible the self-driving vehicle screwed up. Obviously we’ll need to wait and see. But I am very wary of allowing these things on the road at this point. I think we’re kind of playing with fire. If we come to find out the vehicle was the problem (or even if it’s ambiguous), I hope Uber does the right thing and is very generous with the family of the woman that was killed.

Why are you "wary" now and not yesterday or tomorrow?

Why are you not wary of human drivers, who are well known to be prone to negligence?

You know nothing about the circumstances of the accident, but you already have an opinion and a solution.

"We’re kind of playing with fire": yes, like every day for millions of years. Nothing is certain, the future is not under your control, every machine can break, every human can go crazy, and any day a meteor or a roof could fall on you. You don’t know.

What we know is we can work to improve stuff. All of us, you too.

Of course. I’ve been wary since I heard they were taking these out on the streets. My concern didn’t start upon hearing of this death. And of course, I’m wary of human drivers as well. I’ve been hit at least 3 times in the last 5 years. But humans are a known quantity.

I’m not opposed to self-driving cars as a concept at all, in fact, I can’t wait. I’m just not sure we’re there yet. From the reading I’ve done on the matter, we’re not as close as Uber might be thinking we are, and I think we’re jumping the gun on putting these things on the road right now. I’m all for them testing these in a controlled environment; I just don’t know that they’re ready for live trials. Also, I’m not sure how to square the idea that this lady, who has nothing to do with Uber, AI, or self-driving cars, is an acceptable loss for progress. I mean, if an Uber employee gets killed during testing, they were taking on that risk and knew what they were getting into. Not so for this pedestrian.

Read my comment again, I don’t already have an opinion much less a solution. Like I said, we don’t know who was at fault (it could have been the car, it could have been the pedestrian). I’m just wary.

But this accident is clearly not outside of human control. The car being autonomous means it is controlled by software created by humans.

Even with human drivers actively controlling the car, it is possible for accidents to happen that are unavoidable – a tree falling down on the road right in front of you, a person deciding to drive the wrong way down a highway, etc.

Even with a computer controlling a car, an obstruction can be introduced so suddenly that the car cannot change its direction and speed quickly enough to avoid a collision, because its momentum means it can’t stop or turn instantaneously.

But it is clear that this incident is not one of those cases.

So you now agree with the "act of God" definition that insurance companies use to ascribe fault. If not, an internet search for "act of god insurance" will help.

I can imagine situations where this incident is not one of those cases, but from the available information on what happened, I don’t see how it is clear that this is not one of those cases. Can you link to an article with details on the specifics of the accident described here?

Is that clear?

I didn’t think investigators were at the point yet where they had even attributed fault, let alone divulged enough information to determine whether software tweaks could have avoided this.

There is no evidence to support an ‘act of god’ situation. Either the software failed to do its job or it was a situation where the collision was physically unavoidable no matter who or what was controlling the car.

There is no evidence to support an ‘act of god’ situation. Either the software failed to do its job or it was a situation where the collision was physically unavoidable no matter who or what was controlling the car.

I’m not sure who are you trying to argue with. All of this came from you claiming that it was "clearly not" an act of god situation after replying to a post that simply considering the possibility of such a situation. Here again is the post you originally replied to.

[…] even if the unfortunate death here was an act of God incident, this will likely slow down politicians and bureaucrats from passing those new rules, at least for the time being.

Your posts seemed to claim you knew something else… and now you have backtracked.

This is a perfect "iamverysmart" reddit thread posting. Strange is working hard for the money today.

the word " god " annoys him. He refuses to admit it’s a just a legal term, here.

Perhaps what annoys Dr. Strange is that he thinks he is god. In which case:

He is an all knowing god, and therefore knows the exact circumstances of the accident and he does know who is at fault.

Or he is not an all knowing god, and all he does know with certainty is that he was not directly involved in causing the accident.

If either of these two statements is true, it’s not an act of god. But that assumes there is only one god; if there is more than one, it could be classified as "an act of another god".

you don’t know.

It’s way too premature for you or anyone on this forum to have an opinion, a belief, an idea, or whatever. You were not there! Let the police, the engineers, and the insurers do their jobs.

You have a job to do too.
