
The trolley problem (Relevant to autopilot mode in cars) - neya
https://en.wikipedia.org/wiki/Trolley_problem
======
imagist
This isn't relevant to autopilot in cars; it's just fear-mongering.

Humans have driven cars for a century now and animal-driven vehicles for
millennia before that. I have yet to find a single documented case of the
trolley problem occurring in real life with a vehicle. This is simply a
problem that does not exist.

------
kozak
And the discussion wouldn't be complete without this link:
[https://www.facebook.com/TrolleyProblemMemes/](https://www.facebook.com/TrolleyProblemMemes/)

~~~
soneca
And this guy solved the trolley problem in a very elegant way:
[https://www.youtube.com/watch?v=-N_RZJUAQY4](https://www.youtube.com/watch?v=-N_RZJUAQY4)

------
mcguire
The most interesting variant of the problem:

" _Suppose that a judge or magistrate is faced with rioters demanding that a
culprit be found for a certain crime and threatening otherwise to take their
own bloody revenge on a particular section of the community. The real culprit
being unknown, the judge sees himself as able to prevent the bloodshed only by
framing some innocent person and having him executed._ "

~~~
nat4ever
Thinking it through, one develops a sense that all of the interesting things
in this branch of thought experiments tend to shake out postmortem.

In the heat of the moment, what seems like a plausible bargain threatens to
reveal grotesque details amid the wreckage.

So, the trolley problem itself usually becomes most interesting when
inspecting the identities of any casualties. e.g. What about famous people?
What about small children?

In this case, the threat is the possibility of it later being proven that the
condemned was sacrificed only to placate mob rule. e.g. What if the real
criminal is found later, somehow?

Acting with a tendency to ignore the postmortem consequences is probably how a
machine will operate. Machines don't have reputations. They don't consult
emotion to inform their decisions.

A machine might not question which teddy bear was _the most favorite_ teddy
bear, when organizing a garage sale.

------
pseudometa
This is only interesting to think about; this nuanced level of logic would
never actually be programmed into self-driving cars. Once you go past the
avoid-hitting-things level, lawsuits come into play from many vectors, along
with high risks of false positives. The podcast Hello Internet covered this
topic, and I'm convinced it is far more relevant to one's personal prejudices
in everyday life.

------
r721
Relevant book: [https://www.amazon.com/Would-You-Kill-Fat-Man/dp/0691154023/](https://www.amazon.com/Would-You-Kill-Fat-Man/dp/0691154023/)

