
MIT's Moral Machine wants you to decide who dies in a self-driving car accident - aburan28
http://thenextweb.com/cars/2017/01/16/mits-moral-machine-wants-you-to-decide-who-dies-in-self-driving-car-accidents/
======
udfalkso
I understand the discomfort that people have with the notion that an
autonomous car will make such life and death decisions. However, I feel like
it's _obviously_ better for such decisions to be made ahead of time in a
thoughtful, collaborative manner than by one human driver in less than a
second when something goes wrong.

The truth, however, is that the decision will end up being made by some opaque
neural net that's been trained on the actions of human drivers in
relevant situations. It'll just "react" the same way that we do. Coding
specific logic for such edge cases is probably unrealistic.
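
A minimal sketch of what that end-to-end approach can look like, behavioral
cloning in PyTorch; the network architecture, feature size, and control
outputs here are all hypothetical:

    # Hypothetical behavioral cloning: fit a policy network to logged
    # (sensor features -> human controls) pairs, so its reactions in edge
    # cases are whatever the training data implies, not explicit rules.
    import torch
    import torch.nn as nn

    policy = nn.Sequential(            # sensor feature vector -> controls
        nn.Linear(128, 64), nn.ReLU(),
        nn.Linear(64, 2),              # outputs: [steering, braking]
    )
    opt = torch.optim.Adam(policy.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    def train_step(sensor_batch, human_controls):
        """One gradient step toward imitating recorded human reactions."""
        opt.zero_grad()
        loss = loss_fn(policy(sensor_batch), human_controls)
        loss.backward()
        opt.step()
        return loss.item()

Nothing in a model like this encodes "who should die"; it just reproduces the
statistics of how people actually drove.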

------
mixedCase
As far as tech goes, this is mostly a pointless exercise.

The car is supposed to know it's in a low-speed urban environment where there
may be zebra crossings, so it adjusts its speed accordingly and always has
time to brake.
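
A rough sketch of that constraint, capping speed so the total stopping
distance (reaction plus braking) fits inside the clear sensing range; the
reaction time and deceleration figures here are assumptions:

    # Hypothetical safe-speed bound: choose v so that
    # v*t_react + v^2 / (2*decel) <= clear distance ahead.
    def max_safe_speed(clear_m, react_s=0.1, decel_mps2=7.0):
        """Largest speed (m/s) that still allows a full stop within clear_m."""
        # Solving v*t + v^2/(2a) = d gives v = a*(-t + sqrt(t^2 + 2d/a)).
        t, a, d = react_s, decel_mps2, clear_m
        return a * (-t + (t * t + 2 * d / a) ** 0.5)

    # ~21 m/s (~77 km/h) with 35 m of clear view; ~11 m/s (~40 km/h) with 10 m.
    print(max_safe_speed(35.0), max_safe_speed(10.0))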

If a pedestrian makes an illegal crossing, the car should try to protect its
occupants first, which is the most natural response for a human driver. Not to
mention autonomous cars will be more likely to turn these unavoidable
accidents into avoidable ones.

One can entertain all sorts of situations, but so far I haven't found one
without an answer. It always boils down to not needing a "split-second
morality system".

------
I_am_neo
This isn't a question of morals, it's obviously a question of property damage
and insurance premiums. Anyone suggesting otherwise simply misses the point of
what makes our great wonderful 'modern' society actually run day to day.

------
aaronhoffman
Either way, the car owner is responsible, not the programmer.

~~~
esrauch
How can the car owner be responsible when they have made no decision (and had
no opportunity to do so)?

~~~
aaronhoffman
It boils down to property rights and contracts.

They decided to buy the car, they decided to operate it, they decided where to
go and when.

~~~
mixedCase
That's only true in a libertarian dystopia with "caveat emptor" as its motto
and no contract in place guaranteeing safety as long as the car works
properly.

In the real world, it depends on the situation.

In this particular situation, if the car was driving so fast that it did not
have time to brake, the fault most likely falls on the manufacturer for
writing software that defaults to reckless driving in an urban environment.
Immediately after, some of the fault might fall on the city or the inspectors
that approved the manufacturer, if it is concluded that the approval was given
either knowing the software behaved like this or after failing to account for
it in their testing procedure.

If there was a mapping error, it depends on the map provider's license to the
car manufacturer.

All other situations (car breaking down due to lack of maintenance, breaking
down unexpectedly, pedestrian making an illegal move, etc.) are already
handled by current law (obligatory "some exceptions may apply").

~~~
aaronhoffman
I agree that in our current situation voluntary interaction is considered
harmful.

But even in your description, it still boils down to Property Rights and
Contracts (not described in those terms, however). It may be that the
programmers or business owners will be forced into contracts that hold them
liable. IMO, that would be unfortunate.

