
If A Driverless Car Crashes, Who's Liable? - antr
http://www.npr.org/blogs/money/2013/03/08/173766352/if-a-driverless-car-crashes-whos-liable
======
Irregardless
The same entities who would be responsible if engines started randomly falling
out of cars: automakers.

If driving becomes a feature of the car itself and that feature fails in a way
that causes an accident, there's no other answer. Maybe they'll find some
kind of loophole where the self-driving feature is only covered for X years
under a warranty? I don't think it would be too far-fetched for them to argue
that the sensors for automated driving can deteriorate over time, at which
point the consumer's insurer would become responsible for damages caused by
any failure.

That being said, it might be in our best interest (or Google's?) to shoulder
some of the liability if this issue would otherwise be a major hurdle
preventing the adoption of self-driving technology by automakers.

~~~
Shivetya
However, engines falling out of cars do not injure other parties. Driverless
cars have the possibility of injuring others. Currently the blame would reside
with the operator, but I am pretty sure certain Texas courts would let the
creator of the car, the software, the electronics, and whoever else had deep
pockets, get sued and lose.

Unless they totally remove the ability for the driver to take over control of
the car, I do not see how you remove fault from the person in that seat.

~~~
fossuser
I'd think an engine failure has a similar capacity to injure other parties.

The automation has to reach the point of stability where such failures happen
about as rarely as engine failures.

------
damoncali
_In the 1980s, the rising threat of liability prompted vaccine manufacturers
to pull out of the business. So Congress stepped in and created a new system
for people who are injured by vaccines. Cases are handled in special hearings,
and victims are paid out of a fund created by a special tax on vaccines._

That is interesting. I can't imagine the car manufacturers will be able to
handle the costs of the liability by themselves (using our current model of
car insurance), and you have to wonder if they should have to (how much is the
"driver" responsible for? Is a "driver" in Manhattan riskier than one in
Wyoming?).

It's also interesting that we might not even care if it weren't for the fact
that we all have to pay for car insurance now. I mean, nobody wonders who's
going to cover the liability of a subway system. Perhaps this will wind up as
a government service, out of reach of the personal injury lawyers. Or like
guns, which enjoy some specific protections from lawsuits.

~~~
eksith
Vaccines don't compare well to cars, since mere _exposure_ to cars doesn't
really cause injury or death, unless they're in motion or you were in the car
while it was running (carbon monoxide poisoning).

Although so many people drive, and there are so many accidents, driving isn't
an imminent threat to all people. Mere exposure to a disease is often all
that's needed for it to spread to an individual's family and their
families/friends and so on until we have an exponential infection rate. This
puts vaccines on the critically-needed list, so the government stepped in to
mitigate the greater harm of lacking them altogether.

I still think it will be a case of shared blame for driverless cars, although
my hope is that blame would be largely unnecessary. If driverless cars can
reach the same reliability as autopilots in aircraft, this will save lives.

~~~
Kliment
The mere exposure to cars can cause injury or death. How about getting run
over on the street? How about getting poisoned by exhaust fumes? It is an
imminent threat to all people. Driverless cars are likely to be safer, and
hopefully eventually cleaner, but ask the people who had to breathe that crap
back when there was lead in gas whether exposure to cars is dangerous or not.

~~~
eksith
Hence the... "Unless they're in motion or were in the car while running
(carbon monoxide poisoning)."

~~~
Kliment
Ah, I see what you mean now. That was an extremely confusing sentence.

------
wtvanhest
Liability would be determined by a court on a case-by-case basis. The
Deepwater Horizon oil spill will be a good guide: multiple manufacturers
involved in a difficult court battle.

1) First, you would determine which vehicle malfunctioned. If a tire blowout
is the malfunction rather than the automated driving system, the court would
need to assign damages to A) the tire manufacturer for the blowout and B) the
automation company for not designing a system that detects the blowout and
prevents the crash...

In this example, 100% of the blame may fall on the tire manufacturer if it is
determined that there was no way for the automated system to do anything else:
a nearby cliff with an oncoming truck, for example.

Most cases should assign damages based on percent of fault.
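
The percent-of-fault idea amounts to a simple proportional split. A minimal
sketch, with invented parties, percentages, and dollar figures:

```python
# Hypothetical sketch: splitting damages among parties by percent of fault,
# as a court might after apportioning blame. All names and numbers invented.

def allocate_damages(total_damages, fault_shares):
    """Split total damages according to each party's fault percentage."""
    assert abs(sum(fault_shares.values()) - 1.0) < 1e-9, "shares must sum to 100%"
    return {party: round(total_damages * share, 2)
            for party, share in fault_shares.items()}

# Say a court finds the tire maker 70% at fault and the automation vendor 30%.
awards = allocate_damages(250_000, {"tire_maker": 0.70, "automation_vendor": 0.30})
print(awards)  # {'tire_maker': 175000.0, 'automation_vendor': 75000.0}
```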

Over time, legislators will probably set rules to make legal actions happen
faster and more efficiently.

This entire process will be lengthy, expensive and challenging. But, the
overall net benefit of going through the process will be driverless cars, and
we can all agree that the benefits of driverless cars will be amazing.

Note: Google Chrome says the word "driverless" is misspelled; they should add
it to their dictionary.

------
venus
In theory, self-driving cars should all perform equally within the same range
and, probably, within the same manufacturer - you'd assume they'd put the same
system into each car, and keep it updated to their latest best effort. In
fact, this principle should perhaps be enshrined in law.

Therefore, individual owners do not need to be taken into account. Simply
calculate the average cost of an accident and multiply it by the likelihood of
that maker's system causing a crash, and send the owners a bill each year.

It doesn't even really need to be determined who "caused" the crash in each
specific instance. It should be plainly obvious, statistically, which self-
driving systems are more likely to be involved in accidents - premiums would
be weighted to reflect that, which would encourage a virtuous cycle of
improvement.
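
The per-maker premium described above is just expected annual accident cost.
A sketch with made-up figures (crash rates and costs are hypothetical):

```python
# Illustrative sketch of the per-maker premium idea: charge every owner the
# expected annual accident cost of their maker's driving system.

def annual_premium(avg_accident_cost, crashes_per_car_year):
    """Expected annual liability cost per car = avg cost x crash rate."""
    return avg_accident_cost * crashes_per_car_year

# Maker A's system crashes once per 2,000 car-years; Maker B's once per 500.
premium_a = annual_premium(20_000, 1 / 2_000)   # 10.0 per year
premium_b = annual_premium(20_000, 1 / 500)     # 40.0 per year
```

The safer system's owners get the smaller bill, which is the improvement
incentive the comment describes.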

~~~
ed209
But some people will be on the road more often than others, so the likelihood
of an incident will be different.

I think it would be much better to add a premium onto the fuel the car uses,
with all claims going to a central organisation.

Self-driving cars should level out the risk we currently have across different
driving demographics, i.e. a teenager in a car is as likely to crash as
someone who has been on the road for 20 years. So it's not like you'd be
paying more "liability coverage" than you should.

~~~
venus
Ah yes, that is a good point. Fuel use would be a reasonable start as a proxy
for kilometers driven, but why not just go the whole way and measure the
actual kilometers?

I agree about the central organisation. I don't believe private insurance
companies add much value and with the removal of personal driver fault they
add no value at all. There is no reason it can't be managed via a not-for-
profit, competent central organisation similar to any other aspect of
registration.

~~~
ed209
the reason I prefer measuring energy use rather than distance travelled is
that it also taxes less efficient cars.

safe + efficient is the goal for transport in my view!

------
toddmorey
One thing the article didn't discuss: the massive amount of data that
driverless cars will collect and store. The "automotive black boxes" will help
us analyze and simulate accidents in ways we just can't now. As the
responsibility for driving switches to the manufacturers, so will the
liability. However, with fewer accidents and better data (when an accident
does occur) the industry should be well positioned to bear the burden.

------
wgirish
It's hard for IT folks NOT to 'Download & Apply' the latest release to get
'that' feature. I have learned over a period of time to wait for a stable
release. It's possible that the car maker releases a bug which, let's say,
"doesn't apply the brakes" in time. What happens when 100,000 people download
the latest release automatically at 3:00 AM? Who is responsible?

~~~
dsr_
Some obvious implementation details that spring to mind...

Whoever owns the code, owns the car's liability. It is in the best interests
of the manufacturer to be able to detect a vehicle not running a signed
release, and require an explicit liability acceptance from the owner. Doing so
whenever registration changes hands is probably a requirement, too. Or disable
the ability to change the code unless you ask for an individual key that
requires you to accept liability when you use it.

No reason to roll out to all the 2017 Toyota AutoCorolla Xv examples at once
-- after QA acceptance, do staged roll-outs to increasing group sizes. Best if
the first few hundred were all Toyota employees.

Roll-back to the previous version or a "known good" release, should be simple
and obvious.
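
The staged roll-out above can be sketched by hashing each VIN into a stable
bucket and widening the eligible fraction wave by wave. The wave sizes, VINs,
and employee list here are all hypothetical:

```python
# Sketch of staged roll-outs: wave 0 goes to employee cars only; later waves
# widen eligibility by a stable hash bucket, so a car's wave never changes.
import hashlib

WAVES = [0.0, 0.001, 0.01, 1.0]  # fraction of the fleet eligible per wave

def bucket(vin: str) -> float:
    """Map a VIN to a stable value in [0, 1)."""
    digest = hashlib.sha256(vin.encode()).digest()
    return int.from_bytes(digest[:8], "big") / 2**64

def eligible(vin: str, wave: int, employee_vins: set[str]) -> bool:
    """Decide whether this car receives the update in the given wave."""
    if wave == 0:
        return vin in employee_vins
    return bucket(vin) < WAVES[wave]

# The final wave (fraction 1.0) covers the whole fleet.
```

Roll-back would then just mean pointing every bucket back at the previous
signed release.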

~~~
gte910h
"The first group that we'll roll out to is the cars that drive your children.
Cheers!"

Interesting, if not very threatening

------
yk
I actually think that the entire problem is that cars currently have drivers.
The current situation is that drivers are forced to behave reasonably, but
because there is a human in the loop, they will act less than optimally due
to distractions (or malice). On the other hand, a driverless car always
performs as designed, and therefore there are far fewer possible failure
modes involved. In fact, I think essentially the only two failure modes are
failures due to improper testing and design flaws (the manufacturer is
responsible) or due to neglected maintenance or genuinely unforeseeable
conditions (the owner's insurance is liable). So I think that a driverless
car should be viewed like a malfunctioning coffee machine instead of like a
car that is involved in an accident.

------
adventured
Automakers and car insurance companies will not go quietly into the night
when it comes to allowing drivers to shift liability. They both profit too
dramatically from things as they are to just let drivers suddenly stop
carrying the coverage.

We might very well wake up to a comedy of drivers still having to carry their
full insurance on their driverless cars (it'll be argued that it's part of the
privilege and responsibility of owning a vehicle).

------
Datsundere
This is not a problem: if other cars with the same program don't crash, it's
not the programmer's fault, and the owner of the vehicle will pay. But if
others also crash in similar situations, then the programmer/company will pay.

That said, I will never get a self driving car. I like automation in
everything except cars. Driving should be a fun experience.

~~~
Tuna-Fish
In a few short years, laws will prevent you from driving on the public roads
without some level of automation.

There are something like >30k deaths on the highways per year these days.
That's the equivalent of all American Vietnam War casualties every two years.
The highways are more dangerous than any war the US has been in since WWII.
And a very large proportion of those deaths can be called "innocent" -- people
not at fault in their own deaths in any way.

There really is only one reason this is accepted by the public. Because there
is no alternative. There is no policy you can establish that would stop the
deaths that would not also inconvenience everyone too much.

Once self-driving cars come down in price enough to be a real possibility,
every highway death where a non-automated car was involved turns from
something sad but inevitable into something that is squarely blamed on the
existence of that traditional car: "This wouldn't have happened if Datsundere
hadn't put his personal enjoyment above the safety of everyone else on the
road."

Since car makers will be happy to lobby for this (why not? they get to sell
everyone something new), since there will be enough relatives of dead people
that grassroots support will be strong, since the federal government has
control over roads, and since there are no strong lobbies to oppose it, this
will pass as law as soon as it's actually feasible to replace or refit all
the cars on the road.

The best you can hope for is a car that will allow you to turn off the
automation in situations the car deems safe, with the car automatically taking
over if it feels like something is wrong.

Beyond that, the only place you will have a fun driving experience in the
future is on the track.

~~~
to3m
I think you underestimate the extent to which people truly, honestly don't
believe that it will happen to them, provided they are in control of the car.
A self-driving car just doesn't give you that assurance. Besides, who would
trust their adorable children, the lights of their very lives, to a robot? A
mindless, unthinking automaton, a heartless killing machine with no soul?

~~~
crusso
I think you underestimate how pervasive the demonstrated superiority of
autonomous vehicles will be.

Right now, we just have some videos showing fancy parallel parking:
http://www.gizmag.com/just-how-good-will-autonomous-automobiles-get-watch-this/15096/

By the time legislation is being contemplated, everyone will have seen groups
of autonomous vehicles moving in near perfect synchronization through lights,
around obstacles, etc. The statistics will be overwhelming. The evidence will
be all around us and only the last few hold-outs will need to be convinced
with financial incentives (insurance rates) or legal ones.

------
AnthonyMouse
The insurance company. How is this different than no fault insurance, or
accidents caused by any other mechanical failure?

~~~
Spooky23
No fault only applies to personal injury.

If I'm driving along, over-correct on a turn and smash into your fence, it's
my responsibility. If I do nothing wrong, but my tire blows out, I'm still
responsible.

If my tire blows out because the repair shop over-inflated it, or because
there is a material defect in the tire, that's where responsibility lands.

The question is, when my robot car overcorrects (i.e. driver error), am I
responsible, or is the manufacturer? And in what circumstances is the
manufacturer negligent?

I'd dare say that once automated cars move out of California and Nevada and
start showing up in Minnesota and New York, things will get complex quickly.

~~~
drivingmenuts
If you are not in control of the car at the moment of the accident, then the
manufacturer is liable for providing faulty equipment.

Good intentions do not do away with liability.

For that matter, why would you even purchase insurance for a vehicle like
that? It should be supplied by the manufacturer.

Possibly even the car should not be owned by you, since you are not 100% in
control at all times. But that's debatable, since people are attached to car
ownership.

~~~
stonemetal
_is liable for providing faulty equipment._

What if a street racer modifies their car? What if you didn't perform routine
car washing, so the sensors were unable to sense? What if the car was in a
wreck last month and the repair shop didn't do as good a job as they should
have? What if the car is 30 years old and stuff naturally wears out? There
are so many variables that "the manufacturer pays" is maybe a first
approximation to an answer, but nowhere near the whole answer.

~~~
drivingmenuts
All of that is part of the reason why I will never own a robot car until they
hammer out those details. Currently, when I buy a car, the manufacturer/dealer
specifies what they are responsible for and for how long. The rest is on me,
so I have to purchase driver's insurance.

A robot car? I'm not the driver. The manufacturer is, by robot proxy. So,
perhaps the manufacturer should be insuring the car. After all, at that point,
I'm just a passenger.

------
rayiner
Just like with human drivers: insurance companies. It's just the same as no
fault auto insurance always is.

~~~
jackalope
Yes. If self-driving cars significantly reduce accidents, insurance companies
can use that as an incentive to lower rates. The rates for cars that can only
be driven manually might even skyrocket.

~~~
crusso
Just imagine once drivers of autonomous vehicles take manual drivers to court
for causing fatal accidents "that could have been avoided by using a cheaply
available autonomous vehicle". The word "negligence" will be thrown around.
Some lawsuits will succeed in some states. Insurance rates for manual driving
will rocket skyward, making buying that new autonomous car or kit for your
current car cheaper than buying the insurance for 2 years.
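
The two-year break-even claim is simple arithmetic. A sketch with invented
dollar figures (kit cost and premiums are hypothetical):

```python
# Back-of-the-envelope sketch: if manual-driving premiums climb high enough,
# an autonomous retrofit kit pays for itself within two years.

def breakeven_years(kit_cost, manual_premium, autonomous_premium):
    """Years until the premium savings cover the retrofit kit's cost."""
    yearly_savings = manual_premium - autonomous_premium
    return kit_cost / yearly_savings

years = breakeven_years(kit_cost=6_000, manual_premium=4_000,
                        autonomous_premium=1_000)
print(years)  # 2.0
```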

------
ryusage
Is it possible for a manufacturer to just provide a statement upfront when you
buy the car that they can't guarantee that the car is 100% accident-proof, and
you use the car at your own risk? That might put people off initially, but as
people get more exposure and realize the cars are more or less safe, they'd be
willing to use them and just get insurance like they currently do.

------
lucian1900
Who cares? It will almost never happen, compared to how often humans crash
cars.

There will be some convention and eventually laws about this.

------
squozzer
Joint and several liability = almost everyone.

------
martinced
It's much more complex than that. We're in 2013 and there are _still_ major
companies being breached due to buffer-overflow-based security exploits.
Buffer overflows.

That's just one example: there are countless zero-day exploits and, if
anything, security has only been getting worse and worse, with more and more
insecure technologies that we rely upon daily.

What makes you think these cars are going to be secure? What makes you think
the company providing the roadmap data is going to be secure? What makes you
think these cars are going to be running on a separate network, and what
makes you think that network would be secure?

Regarding liability, thinking that "automakers" are going to be the ones
liable is an over-simplification: just look at the latest Boeing. It's back
and forth between at least three companies to determine who's liable.

Who is going to be liable when there is a human death involving a security
exploit in the OS / software stack in one of these cars?

The question is not _if_, it's _when_, btw.

~~~
danielweber
There isn't just one OS in these cars running a monolithic instance of
DRIVE.EXE.

If one sensor sees a person and another sensor does not, the car doesn't flip
a coin before driving through that spot. It doesn't drive through that spot.

The real failure mode for automated cars will be how easy it is to make them
stop in their tracks because someone fooled just one of the sensors into
seeing a human.
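
The conservative fusion rule described here (sensor disagreement resolves to
"stop", never a coin flip) can be sketched in a few lines; the sensor names
are hypothetical:

```python
# Sketch of conservative sensor fusion: a spot is treated as clear only if
# EVERY sensor agrees no person is there. One dissenting sensor stops the car.

def spot_is_clear(sensor_readings: dict[str, bool]) -> bool:
    """True only when no sensor reports seeing a person in the spot."""
    return not any(sees_person for sees_person in sensor_readings.values())

# Lidar sees a person, the camera and radar do not: the car still stops.
readings = {"lidar": True, "camera": False, "radar": False}
print(spot_is_clear(readings))  # False
```

This also illustrates the failure mode in the comment above: fooling any one
sensor into "seeing" a human is enough to halt the car.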

