
If an autonomous machine kills someone, who is responsible? - muon
http://www.guardian.co.uk/science/blog/2009/aug/19/autonomous-machines-systems-report
======
stijnm
I don't want to forgo a stimulating intellectual discourse, but isn't it
obvious?

Replace 'autonomous machine' with 'child' and it is clear a parent is
responsible. For autonomous machines these are the owners. Machines may be
autonomous, but if a real person is liable, then I am sure there will be
plenty of checks and tests done to prevent any disasters.

In fact, the results of the accounting scandals not so long ago may be a good
template. Investors were fully dependent on this autonomous, faceless,
financial machine to churn out profits. That blew up in their face and they
had no-one to blame. One result was the enactment of Sarbanes-Oxley which set,
amongst other things, criminal penalties on individuals.

That is good motivation: accountability.

~~~
eru
Interesting. However, you forget one thing: parents are not automatically
liable for damage their children do. In Germany, for example, as long as
parents do not violate their obligatory duty of supervision, neither they nor
their children can be made liable for any acts of the children.

~~~
stijnm
Your example implicitly shows that you do agree someone needs to be made
liable for the child through 'obligatory supervision' - because if you don't
provide that supervision, you will be accountable.

You say 'as long as parents do not violate'. Indeed, as long as you do what is
required to ensure nothing goes wrong ("bonus pater familias"), but an
accident still happens, then society/German Law accepts no one is liable.

Your argument seems to point more to: How do you define someone as liable and
how do you enforce it?

~~~
roundsquare
If we take the case of the robot surgeon, then the person/institution who made
the decision to use the robot would presumably be responsible (e.g. in this
case, the hospital). The 'obligatory supervision' in this case would mean
running regular tests, getting it repaired, etc...

I think in most cases we can come up with someone who is responsible for the
robot. The person who owns it will generally be the person responsible but
there will be exceptions (e.g. if I borrow your robot, break the "don't attack
people" software and then use it).

If, after doing all the maintenance, the robot surgeon still kills someone,
then you would need to investigate. Did something go wrong with the robot? If
so, is this a defect in design that the manufacturers should have thought of?
If so, blame them. If not, well, no one is to blame (unless it happens with
the next model... where they should have learned from the past). I suspect
we'd also end up seeing recalls of robots when flaws are detected, as we do
with cars occasionally.

------
gdp
Software defects kill people on an infrequent but fairly regular basis. Legal
precedent and licensing agreements suggest that this is the software user's
fault, not the manufacturer's.

So I assume we'll carry on with this preposterous state of affairs and just
extend it to software-in-machines killing people autonomously, 'cause a defect
in that _still_ wouldn't be the manufacturer's fault, right?

~~~
patio11
We also have fairly well-developed laws and case law on animal attacks. If you
keep a chihuahua in a jurisdiction which does not have strict liability, and
it bites someone, you are likely not at fault unless you had reason to know it
was prone to violence AND you took insufficient steps to restrain it. Your
local jurisdiction may have strict liability ("It is the owner's fault,
period") or may have designated certain breeds as "aggressive dogs" whose
owners are presumptively liable.

You could see this happening with the law of machines, too. Roomba crushes
someone? Awful but lawful. KillBot9000 mistakes a troop of Girl Scouts for
armed revolutionaries? You're probably going to end up liable.

------
anigbrowl
I don't know, but expect to see Hofstadter and Searle debate by proxy in a
courtroom, because Searle's Chinese Room argument is exactly the sort of thing
you could put over on a typical jury.

~~~
stijnm
I had not heard of this before, but very interesting!

For those wanting more info: <http://en.wikipedia.org/wiki/Chinese_room>

~~~
anigbrowl
If you found it interesting then you really need to read
<http://en.wikipedia.org/wiki/G%C3%B6del,_Escher,_Bach> ...and indeed, pretty
much everything else he has written.

It seems to me that the long slow debate between these two philosophers (add
in Daniel Dennett on Hofstadter's side) turns on the fundamental philosophical
question of where consciousness begins and ends, which has considerable
ramifications beyond the field of artificial intelligence. I reject Searle's
argument for reasons too long to go into here, but he is a very interesting
philosopher in his own right.

While I'm on this topic, I may as well also mention Stanislaw Lem's essay _Non
Serviam_, excerpted here
(<http://themindi.blogspot.com/2007/02/chapter-19-non-serviam.html>) from _A
Perfect Vacuum: Perfect reviews of nonexistent books_. Much, perhaps most, of
Lem's output is concerned with the nature of consciousness and autonomy
explored through the medium of fiction, just as Philip K. Dick tends to
explore the same questions from a psychological standpoint.

------
akd
This question exists because humans _demand_ a cause for each effect, and a
"responsible party" for every tragedy. Sometimes, things are just "accidents."

A colleague of my dad's tripped and fell while walking down a set of _three_
stairs, hit his head, and died. It was nobody's fault -- it was just bad luck.

~~~
gdp
But that ignores the idea that somebody designs and builds an autonomous
machine.

If an airplane autopilot sends a plane full of people into a mountain due to a
defect in the software, who is liable?

~~~
sokoloff
The FAA (and common sense) treats an autopilot as just a fancy interface to
the actual control surfaces of the aircraft, meaning the pilot-in-command is
still responsible.

Even on autopilot, I'm supposed to be monitoring the systems, keeping track of
where we are, where we're headed, the weather, etc. If my autopilot sends me
towards a mountain, it's on me (practically and legally) to fix it.

~~~
roundsquare
Sure, but if you see some bad weather and decide to turn off autopilot... and
the system won't let you, then the company is to blame (legally, though
practically you'd obviously do everything you can to make sure you survive).

The decision would need to be made case by case in courts to determine who (if
anyone) is responsible. At least in court systems like the US has. For very
different systems, I'm not sure.

~~~
sokoloff
The certification process for autopilots, even for light single-engine
aircraft, is designed to prevent "autopilot can't be turned off or overridden"
scenarios.

Can it prevent all of them? No, but mine has a mandatory self-test that has to
complete before it can be engaged. There is a dedicated breaker with a collar
so I can kill power to just the autopilot; failing that, I can kill the power
to all avionics; failing that, kill all electrical power; and failing all
that, each of the servos has a clutch that can easily be overridden by even
the daintiest pilot. If the autopilot is commanding a left turn that I don't
want, I can mechanically override it, or I have four electrical ways to shut
it off. That takes "autopilot can't be disengaged" way, way down on the list
of things to worry about, especially when compared to flying behind a single
piston engine with no anti-icing equipment.

If I fly my airplane into a mountain, the NTSB report will say: Probable
cause: The pilot in command's failure to maintain clearance from terrain,
resulting in in-flight collision with mountainous terrain. They may add:
Contributing causes were: <some list of lesser causes such as weather,
lighting conditions, failure of navigation equipment, projected spatial
disorientation, and if they have evidence to support it, autopilot malfunction
or more likely, pilot's incorrect or insufficient reaction to that
malfunction>. The likelihood that S-tec [maker of my autopilot] is found
liable is precisely 0.00%.

~~~
roundsquare
Wow... taking the autopilot example too literally.

I suppose there is no need for me to point this out, but I'm not a pilot and
have never flown anything, with or without autopilot. I was just using this as
an example of "something going wrong with the machine" to show that there are
times you would blame the pilot and times you would blame the company who
built the autopilot. If autopilot is a bad example, replace it with some other
autonomous machine... or pretend that we were back in the stone ages of
autopilots when things could go wrong. Or, I'll make up a new example where
the autopilot all of a sudden inverts the plane and then sets it right before
you get a chance to turn it off, thereby injuring one of the passengers. Let's
not get too caught up in the specifics of autopilots, aviation law, etc...

That said, having a number of failsafe mechanisms (like the ones you
described) would be a critical part of any autonomous system, especially in
the early stages, which would, of course, affect who we end up blaming.

~~~
sokoloff
I recognized that you weren't confining yourself specifically to autopilots,
but I felt that describing the autopilot failsafes (even on a $50K family
airplane), as well as the case law surrounding what people might consider an
"autonomous system" but which case law considers "fancy controls under command
of the pilot", was useful to the discussion.

I wasn't trying to argue against any particular point of yours.

------
yannis
The autonomous machine should be brought to court and be allowed to defend
itself before any judgement is passed. It should be allowed access to legal
aid, even pro deo counsel.

It is not impossible, actually, for a legal system - one that over time has
passed judgement on witches, dogs and donkeys, used torture to extract
confessions, and in some parts of the world demands its high priests wear
funny clothes and wigs - to be modified to suit the new circumstances.

In the meantime, if you're developing software for automating trains, aircraft
or heavy construction machinery, get plenty of Professional Indemnity
Insurance. If you're developing software for drones, missiles or nukes, don't
worry: there will be nobody left alive to sue you if something goes wrong!

~~~
roundsquare
I don't think autonomous here is the same as sentient/intelligent/etc... It
just means a machine that does something without human intervention. E.g. if a
Roomba takes off your toe, you can't take it to court.

------
estacado
God.

