
The Amorality of Self-Driving Cars - benbreen
https://nplusonemag.com/online-only/online-only/the-amorality-of-self-driving-cars/
======
CM30
I'm going to be extremely unpopular for saying this, but I still think the
self-driving car problem should be solved by having the cars value the people
inside them more than those on the road around them. Is that immoral? Maybe,
but if I'm buying a car, I like to think it's looking out for me first and
someone else seven-hundred-and-fifty-sixth.

Of course, this is rather less of an issue than it seems, since the article
itself points out that other, more conventional safety features are being
included that lower the risk of accidental collisions, and the cars might not
be going fast enough for this sort of issue to be a concern in most instances
(read: those not involving a motorway or autobahn). And in those cases, well,
anyone who might be in the path of such a car in such a situation might be
best classed as 'too dumb to live', since high-speed roads are not good places
for pedestrian crossings or playing chicken.

But in the 0.00001% of cases where this issue might arise, the cars should
honestly just value the driver and passengers. Like every other tool and item
in human society, for good or bad alike.

~~~
ChuckMcM
I actually got a rather deep insight from the article, which was "How would an
autonomous car get itself into such a situation?"

Such a car would not drive so close to a truck that it could not simply stop
should the truck lose its cargo. It would not approach a tunnel at a speed
from which it could not brake to avoid an obstruction. _People_ do that
because they ignore risk in favor of some perceived time advantage, but robots
don't.

If we imagine the sets of circumstances that require a life-and-death decision
to occur while driving safely, suddenly the number becomes vanishingly small.
Things like "an earthquake occurs and the bridge you are on loses a section,"
or "a power transformer explodes and drops a power line across the road," or
"a plane crashes into the ground in front of you." In such extreme situations,
how the _human_ in control would perform is practically a random number
function. That a robot would minimize the loss of life seems like the
"correct" choice regardless.

~~~
Tiktaalik
Suppose autonomous cars actually drive extremely safely, and this results in
them going 20 km/h under the speed limit. Are drivers going to be happy about
the length of their commute increasing while regular car drivers (thanks to
human recklessness) whiz by in the left lane at 130 km/h? I think you'd see a
lot of human drivers get frustrated and turn off the AI.

~~~
lisivka
We already have fully automatic transportation machines: lifts. Are you happy
with them? Would you use faster lifts in exchange for a higher risk of injury
or death? Would you want such faster lifts for others, or for your family?

IMHO, safety will win. Human life is much more valuable than wasted time.
50 km/h is considered a safe speed for a human driver in the city. Computers
are faster, so 60-70 km/h will be safe for them.

~~~
bryanlarsen
The maximum safe speed on any street where a kid could be hidden behind a
parked car is 30 km/h, no matter whether the driver is human or robot.

A 30 km/h collision with a pedestrian is almost always survivable. Anything
faster isn't.

Robots can react faster, but cars still take time to stop. Most streets don't
have more than a couple of feet of distance between moving traffic and parked
cars. Reaction time is meaningless there.
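The physics behind this is easy to sketch. A minimal back-of-the-envelope
calculation (the deceleration and reaction-time figures are assumed round
numbers, not measured ones):

```python
def stopping_distance(speed_kmh, reaction_s, decel_ms2=7.0):
    """Total stopping distance in meters: reaction distance plus
    braking distance v^2 / (2a). decel_ms2 = 7.0 roughly models
    hard braking on dry pavement."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v * reaction_s + v ** 2 / (2 * decel_ms2)

# A robot reacting in 0.1 s still needs ~5.8 m to stop from 30 km/h;
# a human reacting in 1.5 s needs ~17.5 m. From 50 km/h even the
# robot needs ~15 m, which is far more than a couple of feet.
```

Even with near-instant reaction, the braking distance alone dwarfs the gap
between moving traffic and parked cars, which is the point above.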

~~~
lisivka
Robots can adjust the car's speed to the situation, especially in electric
cars with regenerative braking. Robots can use statistics or incident history
to improve safety.

Robots can also use other aids that a human driver cannot, because (s)he is
busy driving: "laser" pointers that project the car's direction of travel onto
the road, like the "mark my lane" light used by cyclists; directed
audio/video/vibration alarms to warn of a potential crash; or safety arms at
the front of the car (like human arms), released by the computer to pick up a
child right before impact, to absorb the hit from a larger distance and more
slowly, to change the car's direction, or to remove something from the road.

Humans cannot do that, but robots can.

------
PantaloonFlames
All this hand-wringing about the decision a driverless car will make when
faced with the trolley problem is pointless.

Driverless cars will result in a MASSIVE reduction in deaths of drivers and
passengers, and a MASSIVE reduction in harm to pedestrians, non-drivers, and
the environment.

Driverless cars do not look at their cell phones. Driverless cars do not drive
drunk. Driverless cars are not in a hurry, not distracted, not upset by what
their boss just said, or by the kids in the back seat. Driverless cars will
not err as humans do. Driverless cars will reduce the incidence of "trolley
problems" incurred in practice by 2 orders of magnitude or more.

Driverless TRUCKS will allow cargo to travel at night, when humans are less
likely to be on the road. Which will reduce congestion, which will allow
people to get where they're going on time when they ARE on the road.
Driverless trucks will KNOW when a car is behind them; there is no "blind
spot", ever. And driverless trucks will signal to the car behind them, "you
are too close, back off". And it will happen, and no one will get upset about
it.

Driverless cars will not accelerate pointlessly from a stoplight, only to slam
on the brakes a second later, because they did not see the line of traffic
ahead. Driverless cars will not engage in petty competition with other
driverless cars. Driverless cars will be like Buddhist drivers - steady as she
goes, no stress, no drama. That means less fuel consumption, less road wear,
less environmental damage.

Driverless cars will also give mobility to those who are unable to drive -
people with eyesight challenges or physical coordination challenges.

Anyone fretting about the ethics of driverless cars along the lines of this
article is completely, utterly, radically missing the forest for the trees.

Driverless cars will be a huge boon for modern society.

(Don't get me started about the idiocy of the statement that "car sales are
engineered to bring safety features to the expensive models". Has this author
completely missed the lessons of capitalism?)

~~~
Noos
Driverless cars will still err, though. They will err like software. The car
won't be distracted, but if a software fault causes a Google car to classify a
certain make of car as invisible due to its lidar signature, or causes a
glitch in the algorithm, the result will be far worse. Human error is
significant, but software error can probably be more catastrophic, because it
can affect a huge subset of cars at once, far more than any single human error
can.

~~~
Ensorceled
Driver error kills about 33K people a year in North America. That's how
freaking bad this "software glitch" would need to be.

------
mc32
However it works out, on the whole there will be fewer deaths and injuries
with autonomous cars. So we can try to weight the different scenarios to give
one a handicap over another, but this pales in comparison to all the lives
saved by having autonomous cars.

~~~
Doctor_Fegg
Yet even autonomous cars will kill more people than bicycles, trains, and
walking infrastructure.

Yes, I know, I know, suburban sprawl, rural communities, etc. But rather than
perpetuating car-centric design and the inevitable, though fewer, deaths,
maybe we should concentrate on designing walkable, bikeable, liveable cities
instead.

~~~
mc32
I love walking as much as anyone. And I love efficient, convenient mass
transit, but I realize there is still a place for cars. Tokyo, Paris, etc.
have very good transit and are very walkable, yet cars are still necessary.
With autonomous vehicles we can utilize capacity better, reducing the number
of vehicles needed to transport a given number of people.

------
blazespin
Talk about a pointless edge case. It's so incredibly rare that you can just
have the car protect the passenger and that's fine. My biggest concern about
self-driving cars is the ability to mess with the image recognition software.
How hard would it be to flash a piece of material at a lidar/radar and confuse
it so much that it crashes, goes into a DoS, or thinks a minivan full of
babies is cruising down the road towards it?

------
Eric_WVGG
I loved that this article focused on the peculiar de facto morality that has
evolved around car culture.

I quit driving about fifteen years ago; I primarily get around by bicycle,
with the occasional train or Uber. You really need to be locked out of a car
for a while to appreciate the weird sense of entitlement that drivers get.

When the streets are full of cars and traffic is slow, well, that's just a
fact of life; you mutter and deal with it. But if a bicycle delays a driver's
commute by a few seconds, that's license to go completely bananas. Somehow the
mere fact that someone owns a car makes them feel their time is worth more
than that of others, that society owes them a speedy drive, convenience, and
free parking anywhere.

When people are regularly encased in hermetically sealed wheeled bubbles, they
turn into sociopaths.

~~~
visarga
As a driver I often feel cyclists are reckless in how they ride on busy city
streets. I have seen a guy enter the path of a speeding car with no regard for
his own safety. Cyclists are not protected, and I am acutely aware of the
risks around them, which makes me nervous. As a cyclist, I am afraid to take
these risks, so I keep to sidewalks and parks.

Unfortunately, when I walk on sidewalks I often feel threatened by cyclists.
They ride too fast, too close to people. They feel entitled and reckless
because they have wheels and we have only legs. Cyclists sharing a sidewalk
are just as bad as, or worse than, drivers sharing a road with cyclists.

The solution would be dedicated cycling paths, but those are a rarity in my
city, and many of the ones that exist are just one meter wide, marked right on
the sidewalk, reducing the space for walking. Sometimes the sidewalk is too
narrow and people have to step into the cycling path in order to pass, and
that's where a self-entitled cyclist would run you over just to teach you not
to step on his path. That's why I often walk with some anxiety; you never know
who's on wheels and what they think they are entitled to do.

~~~
sonthonax
You should be nervous while driving a car; you should be hyper-vigilant,
checking for those whom your two-tonne mass of iron could easily kill. Driving
a car should be like flying a commercial plane: it's far too much
responsibility for the average person to have.

I'm nervous cycling my bicycle around London, surrounded by entitled drivers
who complain about cyclists at every opportunity but who don't see the problem
with texting while driving at speed.

The solution would be to completely get rid of cars in cities, and to only
allow those who are physically disabled, or those who are highly trained, to
drive. How long is your commute? 10 miles? That's a 30-minute cycle journey
each way (I did that before a hit-and-run road-rage driver stopped me from
cycling).

------
CM30
On another note, it could also be interesting to think about the legal
situation here. If a driverless car prioritises the owner/passengers, then I
can see a few court cases where the families of people killed by one would
want to sue the manufacturer. And if it works the other way around, it could
be the families of the passengers suing the company instead.

At which point we have to let the court system decide how a driverless car
should act in this kind of situation. And then it gets even more complicated,
since different countries and states (and even towns) might come to different
conclusions. Does your car now need to know how to act in every possible US
state and every country in the world, in case it's taken there and gets into
an accident?

~~~
csydas
Legally, it will take a while for most jurisdictions to catch up to the idea
of driverless cars, though in the case of passenger death in an autonomous
car, there will probably be some complicated legal waiver about what does and
does not constitute a fault in the software versus what is just an accident.

More likely than not, there will be federal regulations in the US as to how
the cars must behave, and states can elect whether or not to simply allow the
cars, much like how there are national emissions standards.

------
calsy
I always think of these things as tech dreams vs. reality. No matter the
publicity, no matter the hype, the reality is you will need to change the
perception of a global public that has been driving cars for a century. You
need to convince the world that it is safe to ride in a car not controlled by
a person. I can't see this happening for some time, decades even.

Not only that, but driverless cars will disrupt government revenue streams,
the workforce, law enforcement, etc. Every industry will be affected, and I
don't think people are just going to welcome these changes with open arms.

Sure, the tech news is gushing with praise, which fuels publicity for these
big companies, but in the real world that media will turn on you in an instant
when an incident occurs. There will be a large, active, hostile contingent of
the public who depend on driving jobs to feed their families, and they won't
just sit by and let their jobs become redundant. This isn't Uber, not by a
long shot; this is massive global disruption.

Like any tech venture, it needs to scale successfully to gain worldwide
dominance: prove that it works in a particular environment and expand. Sure,
companies will love the cost savings that come from driverless tech, but the
general public, wow, that will be a fight.

~~~
JulianMorrison
Society changes fast when a big shift like this comes along. People knew
horses, they were used to them. Boom, gone, when the mass market motor car and
motor tractor replaced them. Society adapted so hard that a horse went from a
daily sight, to a symbol of the past.

People will complain. Laws will be passed in backward places. Truckers will
protest. Boom, gone.

------
lifeformed
I don't get the controversy about this at all. Prioritize the passengers'
lives every time.

If someone is walking around on the road, they are putting themselves in an
extremely dangerous position. Whether or not they deserve or intend it, they
are the ones with the responsibility not to be there. It's unfortunate if they
die, but it's their "fault", at least more than the passenger's. If a bunch of
kids run down a highway into oncoming traffic, are all the cars supposed to
kill themselves?

Sure, people will accidentally be stuck in the middle of a road. If you're
unlucky enough to be in that situation, that sucks, but you can't ask someone
else to die for it. Accidents are a part of reality; we should minimize
unfairness rather than minimize loss of utility.

One person getting unlucky and dying is bad, but one person dying because
someone else got unlucky is worse, in my opinion. That's twice the unfairness.

------
mslot
I don't believe that humans make split-second (binary?) decisions based on
morality, only survival. Most of the time, people's instinct tells them to get
off the road if there are no alternatives. That actually seems like a
reasonable way to increase the chance of survival. Self-driving cars should
probably do the same.

------
tim333
>At some point, my car is going to have a choice between hitting a semi and
hitting a minivan full of kids.

No, it's not. I doubt that has ever come up for a human driver, and it
probably won't for autonomous drivers either. If it does, both humans and
robots will probably think "oh shit", hit the brakes, and hope for the best.

------
venomsnake

      double getMaxSpeed(double clear_road_ahead) {
          return speedAtWhichStoppingDistanceIs(clear_road_ahead / 3);
      }

      enforceTopSpeed(getMaxSpeed(clear_road_ahead));
    

Engineering solved morality once again.

~~~
mcbits

        try {
            enforceTopSpeed(getMaxSpeed(clear_road_ahead));
        }
        catch (CatastrophicBrakeFailureException ex) {
            // ???
        }

~~~
venomsnake
turnOffEngine();

The engine turned off can be used very effectively as a brake.

~~~
mcbits
Better catch EngineControlException. Oh, and TireShotBySniperException. Brake
failure is just an example. There are dozens of sensors and subsystems, with
innumerable combinations of failures that can put the vehicle in a state where
some kind of collision is unavoidable. It will have to decide what to do.

Humans end up calling randomPanic() at some point, so maybe that's what the
vehicle should do to avoid accusations that it "intentionally" harmed one
party or another.
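An alternative to randomPanic() is to make the failure response deterministic.
A rough sketch (all names here are invented for illustration; the general idea
resembles what AV engineers call a "minimal risk maneuver"):

```python
class VehicleFault(Exception):
    """Base class for unrecoverable hardware/software faults."""

class BrakeFailure(VehicleFault): pass
class EngineControlFailure(VehicleFault): pass

def minimal_risk_maneuver(fault):
    # The same deterministic degraded-mode plan for every fault type,
    # so the vehicle's behavior under failure is predictable and
    # auditable rather than "intentionally" favoring anyone.
    return ["hazard_lights_on", "engine_braking", "pull_to_shoulder"]

def drive_step(action):
    try:
        return action()
    except VehicleFault as fault:
        return minimal_risk_maneuver(fault)

def brakes_gone():
    raise BrakeFailure("hydraulic pressure lost")
```

The design choice is that every unhandled fault funnels into one conservative
maneuver instead of a per-fault (or random) decision.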

------
gengkev
Self-driving cars that have even the slightest hope of considering such
questions in split-second situations will, undoubtedly, already be better than
any human driver. So when we get to the point where we have to actually ask
questions like these, they'll still be an improvement over jerking the wheel
in a random direction and praying. By then, perhaps, we will be able to
investigate car crashes with the same rigor with which we investigate airplane
crashes.

------
ommunist
Technology is amoral. Business is amoral. These things cannot be put into the
moral dimension. Only humans are moral beings (fortunately, that can be
fixed). Essentially, modern planes are nearly 100% self-flying, and look: far
fewer human lives are lost in plane accidents than in car accidents every
year.

------
senthilnayagam
Self-driving cars are robots, period.

When in doubt, follow the 3 laws for robots.

The first generation of self-driving vehicles should focus on safe driving in
comparison to typical human drivers.

Like the carpool/HOV+ lanes we have now, we will start having autonomous
lanes.

I expect cargo trucks will switch to autonomous first, due to the mundane
nature of the work and the long driving hours.

~~~
krapp
> When in doubt, follow the 3 laws for robots

Asimov's 3 Laws were a plot device, not a serious attempt to codify ethics.
They were designed to fail for dramatic effect, and his stories were about
them _failing constantly,_ because they presented the apparent paradox of a
perfectly rational being appearing to behave irrationally.

The only law Asimov's robots actually adhere to is the Law of Unintended
Consequences - and I believe this is the actual lesson for autonomous car
engineers: the more powerful and complex an AI is, the more likely it is to
fail in ways that can't be predicted.

------
elcct
The Amorality of typewriters. Oh wait...

