

Should autonomous cars kill their owner to save multiple other lives? - maxcan
http://www.popsci.com/blog-network/zero-moment/mathematics-murder-should-robot-sacrifice-your-life-save-two?src=SOC&dom=fb

======
muaddirac
In the long term, if all cars are autonomous and communicating, it's
reasonable to think that most situations could be handled by carefully
coordinated group action so that no one is injured. (Especially in the example
given in the article - if only one car has a problem, the others should
compensate.)

In the near term, though? For legal reasons it might be necessary to hand
control immediately to a human driver to let them make the decision.

Or perhaps far, far more stringent inspection requirements could help mitigate
some of this?

~~~
JoeAltmaier
If civil-rights-conscious Americans demand the right to manually control their
cars, then it's possible an unpredicted human action (an abrupt stop or turn)
could upset the autonomous grid and create unsolvable problems - cars going
too fast to adapt in time.

Either the entire grid would have to slow down enough for any unanticipated
human action (which are unlimited in variety), or they would have to 'push the
envelope' and accommodate only the likely behaviors.

This could very likely result in facing the issue raised by OP: do the
autonomous cars sacrifice themselves, or hit the human driver while
preemptively engaging safety devices for their own drivers?

The designers of the autonomous network will have to face this issue
immediately. It's not some remote possibility; it's part of the very fabric of
the network.

------
chrisBob
There is always a balance. At some point these types of algorithms will
probably be regulated, and that will take some of the pressure off of both
owners and manufacturers.

Given the choice I would usually pick me over you, but I would also be ok with
buying a car that would choose a tree over a group of kids. This is especially
true given that the autonomous car is safer for me overall, even if it takes
that one split-second choice away.

------
johngalt
This article goes out of its way to create hand-wringing 'what-if' scenarios,
completely ignoring the fact that car accidents would be practically
eliminated by autonomous vehicles.

It's perfectly reasonable for a group of cars to all take action with their
owners' self-preservation in mind first and foremost. If you're worried about
the 'greater total number of deaths', look at human drivers.

~~~
thematt
I still think it's an interesting question to think about. There's going to be
a (long) period of time during which autonomous vehicles share the road with
human-controlled vehicles, and the elimination of all/most accidents is not
likely during that transition.

------
dllthomas
It seems to me that - socially - we should regard this as the responsibility
of the driver rather than the automaker. They chose to drive in that place at
that time, they maintain their vehicle, and they have chosen some balance
between safety and heroism (either by tweaking a setting or by picking what
car they buy). It's the driver's agency at work.

~~~
krapp
At first, sure. But the goal of autonomous cars is to eventually remove driver
agency altogether. What happens when all cars are fully autonomous? Will they
collectively weight their passengers by "killability", perhaps based on age,
insurance premium, marital status, social media status, etc?

~~~
dllthomas
I don't follow. You're proposing there will be driverless cars pursuing their
own agenda, as opposed to the agenda of some human or group of humans who have
- in some sense - deployed them? That sounds unlikely before the robot
uprising.

~~~
krapp
I don't know about "pursuing their own agenda" but making their own decisions
- yes, probably. There isn't much point to autonomous cars which are
essentially drones.

~~~
dllthomas
Decisions don't create agency. My router makes decisions about what packets to
route (and where) and what packets to drop. It does so based on a policy I set
and those decisions are an _extension_ of my agency. If I configure my car to
prefer saving me to saving 100 schoolchildren (or pick a car with that
configuration over the alternative), I'm responsible for that.
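
That point can be made concrete with a toy sketch: the machine only executes a policy its owner configured, so the "decision" traces back to the configurer. Everything here (the policy names, `choose_action`, the risk numbers) is invented for illustration, not any real vehicle API:

```python
# Illustrative only: a device executing an owner-set policy. The owner,
# not the device, is responsible for which option gets picked.

OCCUPANT_FIRST = "occupant_first"
MINIMIZE_HARM = "minimize_harm"

def choose_action(policy, options):
    """Pick the crash-avoidance option the configured policy prefers.

    options: list of dicts like
        {"name": ..., "occupant_risk": 0..1, "bystander_risk": 0..1}
    """
    if policy == OCCUPANT_FIRST:
        # Owner chose self-preservation: minimize occupant risk,
        # breaking ties by bystander risk.
        key = lambda o: (o["occupant_risk"], o["bystander_risk"])
    else:
        # Owner chose the "heroic" setting: minimize total expected harm.
        key = lambda o: o["occupant_risk"] + o["bystander_risk"]
    return min(options, key=key)

options = [
    {"name": "swerve_into_tree", "occupant_risk": 0.6, "bystander_risk": 0.0},
    {"name": "brake_straight",   "occupant_risk": 0.1, "bystander_risk": 0.8},
]

print(choose_action(OCCUPANT_FIRST, options)["name"])  # brake_straight
print(choose_action(MINIMIZE_HARM, options)["name"])   # swerve_into_tree
```

Same code, different outcomes - the only thing that changed is the setting the owner picked.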

~~~
JoeAltmaier
Currently there's no requirement in law to be a hero. Some famous court case
about a taxi driver co-opted by a bank robber or some such.

Autonomous car algorithms must face this issue immediately. Choosing to serve
the driver/owner's interest will be a compelling sales point.

~~~
dllthomas
I wasn't speaking about what is (or should be) required by law. I was speaking
about our social reaction, which certainly has a relationship with law but is
not the same thing.

If it's acceptable for a person, in the heat of the moment, to decide to save
themselves at a cost to others (once they've considered what other options are
available to the best of their ability), then it's _acceptable_ to decide to
put in place a system that does the same, assuming it is (at least) comparably
capable of considering other options.

If it's heroic for a person to, in the heat of the moment, decide that it's
important to save another person even at risk to themselves, then it's also
heroic for them to put in place a system that does the same.

Of course, opting out of the whole thing and sticking with a human-driven car
is also a choice, and once autonomous cars are sufficiently better than human
drivers it may itself become an unacceptable choice.

In the long run, the law should probably catch up to wherever we've arrived at
socially and forbid more egregiously unacceptable choices, but I don't see any
reason we should start there.

~~~
JoeAltmaier
But to sell an autonomous car, you have to survive lawsuits. That's the
paramount issue in front of manufacturers. The engineering issues are simple
in comparison.

What's acceptable or not will be decided in a court, unfortunately. And like
most other traffic-related issues, some default rule (driver in rear at fault,
etc.) will be built over many cases over time.

Unfortunately, it's unlikely a reasonable rule will prevail. Instead some
enforceable rule will win out. Regardless, early manufacturers of autonomous
cars are likely to be sued to oblivion in the process.

~~~
dllthomas
What's decided in a court depends in large part on social judgements. We
should be deciding what's reasonable and convincing others (and being
convinced - as a part of that collective deciding), not whining about things
that haven't happened yet.

------
serf
Now there's a selling point.

"Hyper-aggressive BMW AI takes out 15 pedestrians and causes $250,000 in
collateral city-wide damage. Reason cited: danger to owner's career prospects
from possible late business lunch arrival. Thankfully the owner is reported to
be safe (and punctual)."

------
smrtinsert
Yes, and knowing that, I'm sure everyone will buy an autonomous car, so that
it will save the many, and not their family.

~~~
EpicEng
I don't know why this is getting downvoted. Sorry, but our most basic human
instinct is that of survival. By that I mean _my_ and _my family's_ survival,
not yours. I'm not saying that I wouldn't sacrifice myself to save many other
people (no one really knows what they would do until they actually have to
make that call), but I'm not going to buy something that will sacrifice my
wife and child because a computer thinks it's the best outcome.

~~~
fivethree
Congrats, you've exposed the failings of human logic and ethics!

