
MIT Moral Machine - edwinjm
http://moralmachine.mit.edu/
======
grahamel
I don't always understand the focus on this. It's not something people are
tested on when learning to drive, nor something assessed on the driving
test.

How should self-driving cars react to this? The car should be programmed not
to get into the situation in the first place. We should be developing systems
that continually monitor vehicles for faults - kind of like an ongoing MOT -
and can react to the first signs of something going wrong, rather than
waiting for a yearly test. This would prevent the brakes-failing or
stuck-accelerator situations more often than not.
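
A minimal sketch of what that continuous monitoring could look like - every
sensor name and threshold here is made up purely for illustration, not taken
from any real vehicle diagnostics system:

    import time

    # Hypothetical continuous health monitor: instead of a yearly inspection,
    # compare live sensor readings against degradation thresholds every cycle
    # and escalate at the first sign of trouble.

    THRESHOLDS = {
        "brake_pressure_kpa": 550.0,   # below this, braking is degrading
        "throttle_response_s": 0.30,   # above this, actuator may be sticking
    }

    def read_sensors() -> dict:
        """Stub for the vehicle's diagnostics bus; returns current readings."""
        return {"brake_pressure_kpa": 540.0, "throttle_response_s": 0.12}

    def check_health(readings: dict) -> list[str]:
        faults = []
        if readings["brake_pressure_kpa"] < THRESHOLDS["brake_pressure_kpa"]:
            faults.append("brake pressure trending low")
        if readings["throttle_response_s"] > THRESHOLDS["throttle_response_s"]:
            faults.append("throttle response sluggish")
        return faults

    while True:
        for fault in check_health(read_sensors()):
            # React early: limit speed, alert the occupant, schedule service.
            print("FAULT:", fault)
        time.sleep(1.0)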

Should the car plough into the people or the obstacle? Well, it should be
safe enough to crash without causing serious harm to the occupants - which is
something Tesla have put a lot of work into - and it also shouldn't
accelerate towards points in the road where conditions can change suddenly.
Guy stumbling along the side of the road? Drive as if he'll fall into the
road at any moment. Road ahead partly blocked? Drive as if the whole road
could suddenly become blocked, and so on.

Driving is all about reading the road and being aware of what can change
around you, which is what self-driving cars can and should be doing better
than people.

~~~
jwalton
Yeah, I like that this self-driving car is advanced enough to know whether or
not the person it's about to run over is fit, or a criminal, but it can't
work out that it should slow down before running into a concrete barrier in
its own lane. (And how does this moral car know so much about the personal
lives of the people it's murdering, when I can't even get my car to
consistently play music from my phone?)

