
MIT website makes you decide who a self-driving car should kill in an accident - cocktailpeanuts
http://www.businessinsider.com/mit-research-crowdsource-moral-decision-making-self-driving-cars-2016-9
======
cocktailpeanuts
I've seen this nonsense being shared on Twitter over and over, and now even MIT treats it as some huge dilemma.

They make you think there's only two choices (hit the pedestrians or hit the
car), but you know what? You can just hit the guard rail.

And what's funny is, that's exactly the choice most human beings would make
too. Which is why I think this is such bullshit--the two choices given
don't even include the most likely outcome for normal human beings
/ machines. It's just an infographic made to generate traffic, and
for some reason people think it makes sense and are sharing it everywhere.

Why would MIT follow this sensationalist fad? Do they really think there are
only two choices in this picture?

