

Would a Google car sacrifice you for the sake of the many? - 147
https://medium.com/@dweinberger/e9d6abcf6fed

======
japhyr
I have been thinking about this the last few days. My first thought is that I
would accept being sacrificed so that the overall impact of any collision is
minimized.

When I get into a car, I'm not really worried about my chance of getting into
an accident on this one trip. I'm interested in my overall chances of getting
in a serious accident over the long term. I would happily drive in a car that
will sacrifice me for the greater good, knowing that every other car out there
is programmed the same way. I will get in a car each day knowing that my
overall chances of surviving the trip are pretty good.

That said, I do pay attention to short-term factors such as impending severe
weather and unusual road conditions.

The real question is, how do we help people who are less statistically minded
accept cars that reason like this? The current US political climate does not
seem like one that will accept this level of rationality easily.

~~~
jgeorge
My first thought is that I would never get into a personal vehicle if the
decision to "sacrifice myself for the greater good" were not mine to make. [1][2]

How do you program "greater good" in a split-second decision making scenario?
How do you decide who wins and who loses, and by what criteria?

I'll answer those rhetorical questions for you: you don't.

The guy who darts into traffic, causing a network of cars to decide to kill me
in order to minimize "overall" damage to humanity? You've killed a person with
a family, with people who rely on me to provide for them. Suppose the guy who
darts into traffic is running from the police after committing a serious
crime. How do you programmatically determine that his life is worth more than
mine, or more than the lives of the others who may be harmed by swerving
instead of just running him over in the first place?

You don't. Not with a set of programmed rules in a decision-making system that
has no ability to take external variables into account while making that
decision.

That guy jumping into the road, with the cop chasing him whom I can see on the
sidewalk? I'm running over him instead of killing myself in the process. If I
see that the "guy in the road" is the entire soprano section of the Angelic
Children's Choir, then maybe I aim for the light pole and pick a deity to pray
to.

I do not trust ANY piece of software to make that decision for me, or for
anyone else, for any reason, in any situation.

[1] I have intentionally ditched my car in a way that I thought would result
in my own demise in order to minimize damage to others. It was my choice to do
so.

[2] I understand that by getting into a vehicle I don't control (plane, train,
bus, taxicab, etc.) I am handing that decision-making process over to someone
else. But that someone else is also human, with the same ability to process
external input that I have, and I trust them not to sacrifice both of us
foolishly.

