Fatal Crash Leads Uber to Halt Autonomous Car Tests

Uber suspended its autonomous vehicle test program in the U.S. Monday after a woman was struck and killed by one of the company’s test vehicles in Tempe, AZ. The company suspended its test programs in San Francisco, Phoenix, Pittsburgh, and Toronto, according to The Washington Post. It was the first known fatality tied to autonomous vehicle testing in the U.S.

Uber Autonomous Vehicle
An Uber autonomous test vehicle

The victim, 49-year-old Elaine Herzberg, was struck by an Uber vehicle operating in autonomous mode while she was crossing the road. A safety driver was behind the wheel, as required in most testing environments, but the car was driving itself at the time. The Post reported that Tempe police said she was “walking outside of the crosswalk.”

“Our hearts go out to the victim’s family,” Uber said in a statement. “We are fully cooperating with local authorities in their investigation of this incident.”


I’m of multiple minds on this issue. On the one hand, true road safety will almost assuredly only be found once 100% of vehicular traffic is AI-controlled. The harsh reality is that the only way to get there is through testing.

As Timothy Carone, an associate teaching professor specializing in autonomous systems at the University of Notre Dame, told The Post:

It’s going to be difficult to accept the deaths … but at some point you’ll start to see the curve bend. The fact is these things will save lives and we need to get there. Hopefully it happens much faster and with a much shorter time scale [than it took with aviation].

On the other hand, Uber has exemplified the rules-are-for-suckers-and-chumps approach to this testing process, rushing to markets like Arizona where they can test without having the filthy fingers of government nannies tell them what to do. When it comes to harsh realities, the truth is that government nannies save lives in just about every major industry.

All the safety precautions we take for granted in agriculture, food, automobiles, aviation, medicine, pharmaceuticals, mining, shipping, trucking, trains, paint (how’s that lack of lead working out for you?), and just about everything else were all developed either at the behest of or in partnership with government regulators.

Surely autonomous vehicles need the same sort of oversight. Personally, I’m hoping this tragic incident leads to federal guidelines and rules governing the development of autonomous vehicle systems.

16 thoughts on “Fatal Crash Leads Uber to Halt Autonomous Car Tests”

  • Sadly, I have to report the Uber camera video is a crock. It shows an unlit street with the pedestrian appearing out of inconsistent shadows at the last second. A resident of the area drove the same street to show that it is well lit and how doctored the Uber video appears to be.

    Compare the official story https://www.youtube.com/watch?v=sFrZLVjueXo
    with this video https://www.youtube.com/watch?v=CRW0q8i3u6E
    and see if you come to the same conclusion.

This accident should have been avoided by the automated driver without requiring any human intervention. The shadowed patch from which the pedestrian supposedly emerged simply does not exist in reality.

Apparently Uber recently reduced the number of required humans in the car from two to one, which was probably why there was no human intervention; the likelihood of two observers being distracted at the crucial moment would have been much lower. The pedestrian was three-quarters of the way across the road before impact.

Seems to me that Uber suspending testing (unnecessary but responsible behaviour) signals a change of corporate attitude for Uber. It’s now clear that no human driver would have avoided this fatality, and it would be unreasonable to expect an automated driver to do better here, even though we fully expect the technology to outperform humans in many circumstances as it develops. Just not in this case.

Uber did nothing wrong here. It showed unexpected public awareness in its response to the incident, and the self-driving technology was technically vindicated. Reporting seems balanced and uncharacteristically unemotional. The public reaction is yet to be gauged; that will take time, but so far it seems reasonable.

    Without minimising the tragedy, I’m encouraged that common sense seems to be prevailing here.

I lived in the Orlando, FL area for several years. It is an area where pedestrians often leave common sense at home; the concept of walking to a crosswalk and waiting for the pedestrian right-of-way light mostly goes unused. Even though one could rule such an accident the pedestrian’s fault, I wouldn’t have felt good at all about hitting or killing one. At night it was an almost daily occurrence for some pedestrian dressed in black to cross the six lanes of 50 mph traffic at any point along RT 192.

Here is what I did: whenever I saw a shadow along the side of the road or in the median, I assumed a pedestrian was attempting to cross. Sometimes it proved to be a bush or other object, but many times I was thankful that I had slowed and moved over. To be sure, human drivers do not always use caution on the road, and they should be held accountable.

Current accident-avoidance systems are too dumb to be tested on public roads. Safe-driving courses stress awareness of your surroundings and anticipating that another driver, a pedestrian, or a child will do something that puts you on a collision course with them. I would be a whole lot less pessimistic if DOT certification rules for these auto-cars were published. Can they pass a ball-bouncing-toward-the-road test? If not, they should be kept off most streets.

  • For the record my wife and I are waiting for a fully autonomous car. At that point we will trade in the cars we have. Driving just isn’t that much fun any more. I’d much rather just get in, say “Go to Target” and then not worry about it.

    1. My wife and I are taking public transit more and more. Fortunately we are a 15-minute walk from a big transit center and even closer to bus stops that connect there. It helps that the San Diego area finally has a pretty good system, with dedicated bus ramps onto the freeways, some dedicated bus lanes, and even more carpool lanes that the buses use. Having a senior pass is very cost-effective: $18 a month for unlimited bus and light rail (rural routes excepted).

  • I note that the pedestrian was jaywalking when struck by the vehicle, though the collision certainly could have occurred in a crosswalk. Maybe the law should be no driverless vehicles on surface streets.

      1. Yes. A pedestrian is not supposed to cross unless it is safe to do so, even in a crosswalk with the light in their favor.

  • Hopefully the resulting lawsuit will convince Uber, and any other company testing these vehicles, of their responsibilities to test the technology in a controlled environment.

  • There is a real problem with testing a product that puts people’s lives at risk without their consent and without their knowledge. This incident clearly shows that they aren’t close to being ready to run these tests on public roadways. People shouldn’t be used as crash dummies: “Oops, that didn’t work, so let’s make an adjustment and try again.”

    Pedestrians are going to be one of the most difficult problems for these systems to solve. There has to be a better way to develop them without putting pedestrians at such risk, and it doesn’t matter that she was outside the crosswalk. We have a collision-avoidance system on our vehicle, and I have learned not to trust the thing. It sees things that are not there and fails to see things that are a danger.

    The peripheral vision of these systems is not trustworthy, and the intelligence is not even close to being ready. These systems should have to demonstrate that they respond before people are in their path. For example: a ball rolling or bouncing toward the road should trigger a response, because children often chase after them, and any pedestrian near the edge of a road should trigger deceleration.

    This incident says they are not ready. I’m not sure about the logic of allowing companies such as Uber to test unproven technology on public roadways.

    1. “Any pedestrian near the edge of a road should trigger deceleration.”

      Really? Because I’ve driven in a lot of cities where that would bring all traffic to a stop. Heck in Minneapolis it’s normal for people to stand in the street next to the curb waiting for the light to change.

  • Autonomous mode or not, why didn’t the human driver stop the car? Did the driver expect the car to stop itself and so did nothing, in which case we need to improve the testing process, or was the pedestrian very hard to see for both the automated systems and the human?
