Tesla Autopilot Death a Stark Reminder that Autonomous Vehicles Are Still Early

2 minute read
| Editorial

The death of a Tesla driver whose car was in “Autopilot” mode is a tragic reminder that self-driving cars are in their infancy. We have far, far to go before self-driving vehicles are the norm, though that day remains inevitable.

The National Highway Traffic Safety Administration (NHTSA) opened an investigation into the death of the driver of a 2015 Tesla Model S. The investigation “calls for an examination of the design and performance of any driving aids in use at the time of the crash.”

Inside a Tesla Model S cockpit (Credit: Tesla)

Tesla acknowledged the death and the investigation in a blog post titled “A Tragic Loss.” In that post, Tesla wrote:

This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.

Neither Tesla nor the NHTSA named the driver, but The New York Times noted that the Florida Highway Patrol identified him as Joshua Brown, 40, of Canton, Ohio. The newspaper also said Mr. Brown was a Tesla enthusiast who posted many videos of his car in Autopilot mode.

In one of those videos, Mr. Brown described an incident in which he said Autopilot saved his car from a collision:

Tesla Model S autopilot saved the car autonomously from a side collision from a boom lift truck. I was driving down the interstate and you can see the boom lift truck in question on the left side of the screen on a joining interstate road. Once the roads merged, the truck tried to get to the exit ramp on the right and never saw my Tesla. I actually wasn’t watching that direction and Tessy (the name of my car) was on duty with autopilot engaged. I became aware of the danger when Tessy alerted me with the “immediately take over” warning chime and the car swerving to the right to avoid the side collision.

That video is pretty cool:

Much to Learn

We don’t yet know the circumstances of Mr. Brown’s death. We don’t know if it was a problem with Autopilot, driver error even though Autopilot was engaged, a freak accident that defies understanding, or something else altogether.

We do know that autonomous vehicles are far from perfected. Google, Tesla, Detroit, Germany, at least one genius in his basement, and even Apple are working on this challenge. But much work remains to be done and many regulations have yet to be developed. It will be many years before autonomous vehicles are even a sizable minority of vehicles, let alone the norm. This accident—tragic though it is—will hopefully neither derail nor slow that process down.

In the meantime, my heart goes out to Mr. Brown’s family and friends.

Comments
JoelS

I agree with those who express condolences to the Brown family. Unfortunately, I predict that we will find that Mr. Brown was not paying attention (watching a video???) and was speeding madly. Automation is not a cure for poor judgement.

brilor

The critical question should not be “is the system perfect?”, but “does the system introduce failure modes that a human driver is not vulnerable to?” and “how do those failure modes (if any) contrast with the failure modes it mitigates (if any)?” Agreed. Having Autopilot at least as good as a human for a given set of circumstances should be the target. The liability issues noted by other posters are sure to be important as the regulation around this technology evolves. As others noted, getting the driver’s agreement is one thing but agreement from vehicle passengers and folks not inside…

Nom

The antidote to Tesla and Mr. Musk’s hubris is the American tort system. Except that it’s a system that trends towards blaming the party with the most money, not necessarily the party responsible for the wrongdoing. The really interesting (if you’ll pardon the word choice) legal situation will be when a semi-autonomous vehicle causes serious injury to a third party who is behaving according to the legal requirements. As sad as it is, in this situation the victim (Mr Brown) was the person nominally in control of the vehicle, and thus voluntarily assumed the risk. I find this comment interesting:…

aardman

“Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled.” This is certainly not enough to absolve Tesla of any responsibility or liability. This is an agreement between Tesla and the driver. Well, what about third parties who could get injured or traumatized by a malfunctioning autopilot? Were they asked if they were okay with an unproved, potentially lethal, self-driving mass of metal, plastic and glass sharing the road with them, hurtling along at 70 miles per hour? The antidote to Tesla and…

ibuck

“We don’t yet know the circumstances of Mr. Brown’s death. We don’t know if it was a problem with Autopilot, driver error even though Autopilot was engaged, a freak accident that defies understanding, or something else altogether.” Using the link you supplied to Tesla’s web page, they state: “…the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.” So we don’t know the Tesla’s…

aardman

Will Tesla get the same standard of scrutiny (albeit at a smaller scale) that Takata got for those airbags? I’m not holding my breath.

Jamie

I’m in agreement with both of you. The hype around current technology is off the charts, and I can’t believe we are fast-tracking so much and rewriting laws to favor what is essentially in very early beta and has such profound real world consequences. The greed and arrogance in so many modern companies is truly shocking. Equally shocking is our government’s eagerness to be complicit in it all. Hopefully public conscience and reason will win out.

aardman

Not surprised. I am actually amazed that Tesla was allowed to deploy an autopilot system without any effort by the feds to subject it to comprehensive testing. Time for the tech industry to realize that their traditional product testing and proving procedure is not appropriate in situations where glitches lead to real crashes. I.e. the kind where metal twists, glass shatters, and blood spills.