Autonomous (self-driving) cars will certainly be built to a high level of reliability and safety. However, there will still be the rare case when an autonomous car goes awry and there's a crash. What will happen then?
Autonomous cars are upon us. Sure, it'll be a few years before they are pervasive, but we're already on the leading edge of the technology. For some interesting and informative background, I refer you to the February 2016 issue of Car & Driver, p. 58, "Going to the Dogs." From the intro:
So while we're getting worse behind the wheel, the sensors and algorithms capable of saving us from ourselves are getting better. And although we're not convinced this will ever yield totally hands-off personal transportation, scores of manufacturers are working feverishly to prove us wrong.
Including, we have surmised, Apple.
Even more interesting than autonomous cars with passengers onboard to monitor them is the concept of what one might call a driverless car. For example, see "Man Summons Tesla Via Apple Watch, Makes Your Car Look Like A Fossil." In that example, the car simply moves itself from the garage to the driveway, but eventually freight companies will think seriously about driverless trucks. See "Self-Driving Trucks Could Rewrite The Rules For Transporting Freight."
All this raises, in my mind, the question of who bears responsibility when one of these driverless vehicles makes a terrible mistake and property is damaged or, perhaps, a life is lost.
Of course, the first step is to pave the way for these vehicles to operate legally on the road. This is more than a technical blessing by the government. There are legal ramifications. See, for example, "Federal Government Will Treat Google’s Driverless Car System as a Legal Driver."
A Fine Distinction
Even if a driverless car has the authority to be on the road as a "legal driver," there will still be the question of liability, and that has to fall back on the owner of the vehicle. Until I can set up an interview with an expert, however, I do have an opinion.
I suspect that these autonomous vehicles will have to prove themselves, in testing, to the satisfaction of the car insurance companies. Once the vehicles meet those standards, they can be insured against the very small probability of a failure, bug, or network outage that causes an accident. If that all works out, prospective customers will be comfortable with both the purchase and affordable insurance.
After all, as the Car & Driver article cited above points out, we are already at the mercy of our cars' computers: stability control, anti-lock brakes, collision avoidance, and so on.
I am sure that, in the future, some high-profile court cases will fine-tune the laws. With Toyota, we've already seen how governments handle dangerous software bugs, so there is ample legal and regulatory precedent to build on.
From what I've read, it appears that all the parties concerned are proceeding carefully, working out the technology and the legalities. You'll insure your autonomous car just as you insure yourself (and family) against a mental lapse. The difference may well be that cars prove themselves much better drivers than humans. Fewer lives will be lost.
And then a day will come when a human being who wants to drive a car on the highway without computer oversight (gasp!) will have to pass a very special test and hold a special driver's license. After all, who will want a human threat to all those expensive autonomous cars and freight trucks ... on the loose?