
Uber Not Criminally Liable in Death of Woman Hit by Self-Driving Car - abhisuri97
https://www.npr.org/2019/03/06/700801945/uber-not-criminally-liable-in-death-of-woman-hit-by-self-driving-car-says-prosec
======
rayiner
> In the six seconds before impact, the self-driving system classified the
> pedestrian as an unknown object, then as a vehicle, and then as a bicycle, a
> preliminary report from the National Transportation Safety Board explained.
> While the system identified that an emergency braking maneuver was needed to
> mitigate a collision, the system was set up to not activate emergency
> braking when under computer control.

The prosecutor made the wrong call here. This part is absolutely criminal
negligence. Putting a “self driving” car out there that doesn’t have emergency
braking enabled (apparently because it creates too many false positives) is an
unjustifiable risk. Working emergency braking should be the first thing
perfected, before the computer gets to control the car.
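
The configuration described in the quoted report boils down to a gate that
silently discards the planner's braking request. A minimal sketch of that
logic (hypothetical names and structure, not Uber's actual code):

```python
# Hypothetical sketch of the gating the NTSB report describes -- not
# Uber's actual code. The planner can *request* emergency braking, but a
# mode flag drops the command, and no alert is raised to the operator.
from dataclasses import dataclass

@dataclass
class PlannerOutput:
    emergency_brake_needed: bool  # planner decided a hard stop is required

def actuate(plan: PlannerOutput, under_computer_control: bool) -> str:
    if plan.emergency_brake_needed:
        if under_computer_control:
            # The fatal configuration: the request is discarded and, per
            # the report, the operator is not even alerted.
            return "no_action"
        return "emergency_brake"
    return "normal_driving"

# At the moment the system determined braking was needed:
print(actuate(PlannerOutput(emergency_brake_needed=True),
              under_computer_control=True))  # -> no_action
```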

~~~
rsj_hn
Arizona law is pretty friendly to motorists. Generally speaking, you're not
supposed to jaywalk and when you do, you are taking your life in your own
hands and the car that hits you is not liable. It's even worse for the
pedestrian -- even at an intersection, if you walk quickly into oncoming
traffic you do not have the right of way and are responsible for anything that
happens to you. It's just not the case that the car is assumed to be at fault
in situations like these -- the pedestrian is assumed to be at fault. The
pedestrian is not automatically given the right of way and is basically never
given the right of way outside of an intersection. This is different than the
laws in other states like California.

Situations like drunk driving can make a difference, but in this case that's a
hard claim to make. Maybe if the car was swerving out of its way to hit her,
you could make that case. But arguing that it wasn't doing enough to stop --
that's a tough sell given AZ law. The car had the right of way, it was not
acting maliciously, and she walked right in front of it. According to AZ law,
asking the car to make what might be a dangerous last second swerve into
another lane or slamming on the brakes to avoid hitting her is not a legal
obligation as these are unsafe maneuvers. There might be another car behind
them or other cars/pedestrians alongside them so the law in AZ doesn't require
these types of high risk actions to avoid hitting pedestrians that walk right
in front of oncoming traffic.

Moreover this person was jaywalking at night in an area without street lights,
without even bothering to look both ways, across a median, wearing a black
hoodie, through a high speed road right in front of a car that had proper
headlights and was going a steady speed.

Anyone familiar with AZ law knew that Uber wasn't going to be charged. You may
think the law itself is wrong and that cars should be legally required to take
last second high risk evasive maneuvers. These are all tradeoffs to which
reasonable people will disagree, but changing the law is the job of the
Arizona legislature and not of the prosecutor, who made the correct decision,
even if it seems like the wrong decision to you.

~~~
argonaut
Minor correction: the area was actually very well lit with street lights. The
Uber dashcam video was just poor quality so it gave the impression it was
really dark.

https://arstechnica.com/cars/2018/03/police-chief-said-uber-victim-came-from-the-shadows-dont-believe-it/

~~~
pvaldes
This proves that the area is well lit currently, but not necessarily that it
was well lit on the day and at the hour of the accident.

It was a very notorious case. I would expect the authorities to have quietly
improved the lighting in the area and taken other corrective measures in all
those months.

~~~
Felz
I actually used to bike that road all the time 3-4 years ago. The bridge
section before it could be disturbingly dark, but the point where she was
crossing was decently lit, as I recall. It was also somewhere I would never
consider crossing, mostly because there was a crosswalk just up ahead.

Still, if I absolutely had to cross there, it was definitely a spot to watch
carefully for cars, but one with fairly long sightlines.

------
vichu
This potentially sets a precedent as a legal blocker for public adoption of
self-driving cars. If I could serve a one-year prison sentence for faulty
code or a deprecated LIDAR sensor, I don't see any scenario in which I would
be willing to leave my fate in the hands of a self-driving car manufacturer.
Just as with a chauffeur: if they are driving and commit vehicular
manslaughter, I would not expect to be deemed guilty.

It seems to me that it would benefit self-driving car companies to own up to
liability, since widespread adoption is in their interest. For example, Volvo
is in line with this idea and has publicly stated that it would accept full
liability in fully autonomous operation modes[0].

In this case, I do think that some liability lies with the driver, as they
were tasked specifically with preventing situations like this. What is not
clear is whether this task is even humanly feasible given reaction times,
and, based on that, whether Uber has been criminally negligent. Given this, I
am surprised that the prosecutor seems to have absolved Uber of any blame.

[0] https://www.media.volvocars.com/global/en-gb/media/pressreleases/167975/us-urged-to-establish-nationwide-federal-guidelines-for-autonomous-driving

~~~
usaphp
> If I am to serve a 1 year prison sentence for faulty code

I wonder why a programmer would go and work on code when he is always in
danger of going to prison for every mistake he makes. We all know that bugs
in software happen and it's impossible to write bug-free code.

~~~
cco
Architects and engineers seem to go to work every day knowing that if they
are criminally negligent they could face prison.

~~~
tonyarkles
My apologies for potentially hijacking this, but... this is exactly why the
term "software engineer" bothers me. Yes, the software you write isn't likely
to cause a shopping mall to collapse on a crowd of people, but there can be
huge financial and societal responsibility here, and yet, almost every
software license in existence completely disclaims that. Engineering comes
with a tremendous amount of ethical and legal responsibility.

~~~
avs733
But innovation!

Fail fast!

...sorry for the sarcasm, but this is a message student engineers internalize
in part because it is pushed by companies they want to work for but can't
explain why. They don't have strong boundaries between why it might be a
justifiable philosophy for Facebook but not for Boeing.

------
8bitsrule
A woman on a road was killed. By a car with no driver.

There was a person sitting in the driver's seat, but that person was in no way
engaged in maintaining safe operation. That person was hired by Uber.

As the judge, therefore, I would certainly assign -most- liability to Uber for
putting that screw-off behind the wheel in the first place. Uber's driverless
car killed a woman.

The charge would be negligent homicide.

~~~
WillPostForFood
How do you feel about the Amtrak engineer being charged in the Philadelphia
derailment? Does Amtrak deserve most liability for putting the engineer in
charge of the train, or does the engineer deserve the blame for failing the
most basic duties of their job?

https://www.washingtonpost.com/news/dr-gridlock/wp/2018/02/06/criminal-charges-reinstated-against-engineer-in-deadly-2015-philadelphia-amtrak-crash/

~~~
8bitsrule
I'm not familiar with the details there. IIRC, the engineer was going around a
curve much faster than it was rated for.

Whatever. Failing to detect that an employee is not fit to do a job which
potentially jeopardizes many lives is a very serious failure. I might -want-
to ask which executive the company will expect to do the time.

Airlines and railroads seem to prefer to blame their operators, particularly
when the operators are killed and unable to defend themselves. In this case,
however, the vehicle had no operator to blame.

~~~
youeseh
The article mentions that the negligent operator was watching a TV show.

------
kbos87
Let's take a look at some of the ingredients that led to this:

1) A pedestrian crossing the street likely expected an approaching driver to
see them and slow down. Typical behavior on the part of the average pedestrian
and the average driver, whether it's a misdemeanor or not.

2) An engineer at Uber made the decision to put tech on the road without
emergency braking capabilities, likely using the justification that a safety
driver would be there to intervene when needed.

3) A safety driver in an autonomous vehicle that behaves the right way 99% of
the time grows complacent.

The pedestrian may have been jaywalking and the safety driver may have been
abdicating their duties. The safety driver especially isn't an innocent actor
here. But the parties with by far the most responsibility are the engineers
and managers at Uber.

This set of circumstances was completely foreseeable, but they still decided
to take the risk and put this technology out on the roads. I for one don't
want to be a part of Uber, or anyone else's great experiments. Spend the
money, spend the time, and figure this out in a controlled environment before
subjecting the rest of us to the negative impacts of your selfish ambition.
Someone at Uber deserves time in prison for this.

~~~
mannykannot
While I agree that disabling the emergency braking was a foolish and
irresponsible decision, and the most important single technical failure
leading to this tragic outcome, I don't see prison time as a reasonable
response, as I doubt that there was a malicious disregard for safety. If the
persons most responsible for that decision still think they did no wrong,
however... I'm not sure what the ideal response would be.

Beyond that, I do not want to see an extension of the practice where low-level
employees take all the blame for a corporate culture that encourages bad
behavior.

------
r00fus
On one hand, I think Uber was reckless and wantonly created the conditions for
this incident.

On the other hand, I don't want this to be the death knell for autonomous
driving experiments.

On the gripping hand, it's possible that a large settlement might have been
the best outcome from this tragedy for the family of the woman in question.
They gain nothing from Uber's criminal liability. (edit: clarity)

~~~
derekp7
I saw a post on Reddit at the time of this incident, from someone who claimed
to be a former Uber backup driver. The claim was that Uber was too strict with
regards to cell phones, and that if you were caught interacting with your
phone while on the job you were immediately terminated.

I'd like to know how true this is, and if there is a better source for this
claim.

~~~
FireBeyond
> The claim was that Uber was too strict with regards to cell phones, and that
> if you were caught interacting with your phone while on the job you were
> immediately terminated.

Why is that too strict? Distracted driving is rapidly becoming the biggest
cause of motor vehicle collisions (MVCs).

------
userbinator
After all these stories I strongly believe that "self-driving, except when
it's not" is even worse than manually driving, because supervising a computer
is not too far from a driving instructor supervising a student driver; most of
the time it's OK, but you have to be extremely alert to catch the times when
it makes a fatal mistake. On the other hand, if you're "manually" driving, you
are fully in control of and can anticipate the situation, thinking ahead with
what's next. A self-driving car, like a student driver, won't tell you what
it's going to do and ask whether that's OK --- it just does and you have to be
_very_ alert to instantly take control to correct when things go wrong.

Autopilots work in planes because the operators are extremely well-trained,
and the reaction times needed are measured in seconds or even minutes. In a
car, it's less than a second.

~~~
Rebelgecko
I think there's actually some debate about how just the right amount of
automation can be dangerous in planes. Automation works great in nominal
conditions but can leave pilots unprepared to handle bad situations.

There was a really interesting article I read about this regarding the Air
France plane that crashed a few years ago (I think it was this:
https://www.vanityfair.com/news/business/2014/10/air-france-flight-447-crash).
Similar things even came into play with the recent Lion Air crash, although
there was also a lot of negligence and crappy maintenance.

------
jimktrains2
> While the system identified that an emergency braking maneuver was needed to
> mitigate a collision, the system was set up to not activate emergency
> braking when under computer control.

> Instead, the car's system relied on the human operator to intervene as
> needed. "The system is not designed to alert the operator," the report
> notes.

How are they not negligent for this, specifically?

~~~
henryfjordan
The law doesn't require automatic brakes and alerts on cars. Think about how
many clunkers those laws would make illegal. You can't hold Uber to different
legal standards, but we can certainly say they were negligent from a moral
standpoint.

~~~
jimktrains2
> The law doesn't require automatic brakes and alerts on cars.

The law isn't written for the case where the car is under its own control.
The driver is assumed to be in control of the vehicle. However, if the car is
also _a_ driver, neutering its ability to stop, or even to ask for
assistance, in an emergency should be tantamount to disabling the human
driver's brakes.

------
perfunctory
> The Arizona Republic has reported that the driver, 44-year-old Rafaela
> Vasquez, was streaming the television show The Voice in the vehicle in the
> minutes before the crash.

I guess we shouldn't be surprised any more, as even the drivers of
non-self-driving cars do the same sometimes. It's scary how often, when I
look in the rear-view mirror, I see drivers behind me looking down at their
phones. This is one of the reasons I almost stopped driving.

------
dkarl
Most driving is done by humans. Humans are terrible at it and kill people
unnecessarily all the time. That's the standard to beat. Speaking as someone
who rides a bike on city streets, I really don't give a shit how many people
Uber kills, as long as it's less than human drivers would. This whole thread
smells like Monday morning quarterbacking and people throwing aside reason
under stereotype threat. (Techies are in love with technology, techies are in
thrall to startup narratives and oblivious to social responsibility.
Everyone's anxious to disprove that.)

In fact, the mistake Uber made here was relying on a human being to do a job
that is routine and boring the vast bulk of the time but occasionally requires
life-saving decisions that depend on attentive awareness of the surroundings.
That's the same mistake our entire civilization makes a million times a day.
The fact that the operator had fewer responsibilities than a normal driver
probably magnified the problem, but it's the same problem that makes driving
fundamentally dangerous. She thought she was doing a good enough job and then
oops, guess not, somebody's dead.

That happens every day without robot drivers involved. The standards we hold
autonomous driving technology to should reflect this insane status quo.

------
pteredactyl
My thoughts go out to her family. And the many many people who die every day
because of reckless drivers and accidents. Driverless or not. Really sucks to
lose someone due to no fault of their own (other than being there).

To add: driverless or not, most who kill others using a car are not
criminally liable. Oftentimes they're not even liable beyond the state
insurance minimum, unless the victim sues for personal assets. In California,
that minimum is $100k, which is one of the lowest in the USA.

~~~
thoman23
> to lose someone due to no fault of their own

? She walked in front of a car on the expressway in the dark.

~~~
nkurz
You have a reasonable point that the victim probably bears some responsibility
for crossing in the middle of a block in the dark, but I'm pretty sure that it
was a divided city street with a 35 MPH speed limit rather than what is
typically thought of as an expressway: "the preliminary police investigation
determined that the car was speeding—going 38 mph in a 35 mph zone when the
crash occurred"
(https://www.curbed.com/transportation/2018/3/20/17142090/uber-fatal-crash-driverless-pedestrian-safety).

------
mnm1
Unbelievable. I wonder how many other self driving cars are out there designed
to not brake even when they clearly detect an object in their path. It's
ridiculous to rely on a human to intervene in such situations. This was a
completely preventable homicide, yet Uber gets away with it. As I've said
before, the easiest way to get away with murder is to commit it under the
protection of a corporation. True, this is manslaughter, but the principle
still stands. I bet they will go after the driver now and try to place blame
on her despite her having an impossible job that should simply not exist
because it can't be carried out. So they get away with manslaughter and a
precedent is set for other car companies that no matter how negligent your
system and operations are, you can kill people and get away with it as long as
you find a patsy to sit behind the wheel and take the blame. With this kind of
attitude, I hope self driving cars never make it to market. I no longer think
they will be safer than human drivers because there is no incentive for these
companies to make them safe, let alone safer.

~~~
zeroname
> I bet they will go after the driver now and try to place blame on her
> despite her having an impossible job

What's impossible about watching the street and hitting the brakes if
necessary, while testing a prototype self-driving car? I would say if you
can't even do that, you shouldn't be in that seat.

This isn't the case where just one person is responsible. The system shouldn't
have been configured that way. The driver should've been alert. The woman
shouldn't have crossed the road like that. _Everyone_ was negligent.

~~~
cgidriver
It happened because one company was greedy and negligent, and has the money
to pay the bill. The question is, will the jury really make them pay?

~~~
WillPostForFood
It happened just as much because one driver/monitor wasn't doing their job.

------
true_tuna
Those cars were nowhere near ready to be on the roads. And given how fast the
program spun up there’s no way the drivers were properly trained and vetted. I
personally had several run-ins with the Uber cars in San Francisco before they
got their California registrations revoked. I worked on 3rd in SoMa and saw
them out as I would walk to lunch. They didn’t even try to yield to
pedestrians in crosswalks during right turn on red (when pedestrians clearly
have right of way). I had more than one near miss. I shouted at the driver and
he grabbed the wheel in panic. He went around the block to try again and the
car did exactly the same thing. It was like they were blind to anything
smaller than a car. Uber knew it too, because the failure to yield was
reported to Uber in person at their garage on 3rd and Harrison. Uber (and
anyone who behaves like that) has no business running a self-driving program.
If they start the program back up I guarantee they will continue to kill
pedestrians. If you see one coming get away quick. I speak from personal
experience and I’m not joking even a little bit.

------
smallgovt
I wonder why Uber (and other self-driving car companies) don't have remote
employees monitoring the road who can take over in case the appointed driver
doesn't react quickly enough.

The problem with these driver-assisted self driving cars is that the driver is
unlikely to be paying attention at any given time since the system is 99%
reliable.

By adding remote monitoring, you could even have multiple people monitoring
each car. It might be impossible to safely steer the car from a remote
location, but they could surely activate the brakes and/or trigger other
simple directives to drastically reduce the chances of a fatal collision.

Given that the average driver reaction time in a car collision is 2.3s [1], I
doubt network latency would pose much of a problem, and cost surely isn't an
issue for these companies. A remote person could also use the car's cameras to
gain a superior field of vision (especially at night time) when compared to
the in-car driver.

[1] https://copradar.com/redlight/factors/IEA2000_ABS51.pdf
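
The latency claim can be sanity-checked with a rough budget. Only the 2.3 s
mean reaction time comes from the linked paper; the other numbers are
assumptions for illustration:

```python
# Rough latency budget for the remote-monitor idea. Only the 2.3 s mean
# driver reaction time is from the cited study; the rest are assumptions.
mean_driver_reaction_s = 2.3   # in-car driver, from the linked IEA paper
network_rtt_s = 0.10           # assumed cellular round-trip time
video_pipeline_s = 0.15        # assumed camera -> encode -> stream delay
remote_human_react_s = 1.5     # assumed: remote watcher primed to brake

remote_total_s = network_rtt_s + video_pipeline_s + remote_human_react_s
print(f"remote: {remote_total_s:.2f} s vs in-car: {mean_driver_reaction_s:.2f} s")
```

On these assumed numbers the remote path comes out ahead, but the hard part
remains keeping the remote watcher attentive at all.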

~~~
userbinator
If someone sitting in the vehicle can lose attention, there is not much hope
for someone watching from afar on a TV (even if you're streaming at something
like 4k 120fps --- which would be an _immense_ amount of bandwidth to handle
for each car --- it's still not much more immersive than watching someone
playing a driving game.)

~~~
smallgovt
Maybe they can make it more like a video game, where you're constantly saving
cars from collisions, and you don't know which is a simulation and which is
real. Give them bonuses based on what percentage of collisions they
successfully deter.

~~~
dymk
And let's have Facebook moderators be shown a small percentage of images known
to be of rape or beheadings just to test if they're flagging images correctly.
You see how that might be a bad idea?

------
gudok
Interaction between humans and AI is far from perfect. It seems that it would
be easier for people to adapt to AI rather than the opposite. I feel that
someday every traffic participant (including pedestrians) will be required to
carry a tracking device. These devices will communicate with each other and
prohibit or allow actions, e.g. making a turn or crossing a road. Eventually
they will make traffic rules as we know them today obsolete. No traffic
lights, no road signs, and no crossings anymore. Every action will be
controlled by the device. And, of course, they will automatically report and
fine law-breakers.

Is this the bright future of the humankind? Or is this a setting for a new
dystopian book?

------
NoPicklez
Interesting reading the article. From a risk perspective, I would think Uber
would have assessed the risks associated with using an Uber self-driving
solution, particularly the risk of disabling the automated controls that stop
the car when it detects an emergency, given that it has this capability.

But I agree with the verdict. It's just strange that the vehicle has the
ability to detect a potential collision but cannot apply emergency braking to
prevent it.

~~~
mnm1
Of course, the prosecutor is judge and jury in this case because they are not
pressing charges. I'm almost certain Uber would be held liable by any jury
who heard that they intentionally turned off the braking that could have
saved this woman's life. Then again, it is Arizona, so it's not surprising
that this is not even being prosecuted. Uber must be bringing a ton of money
into the state to get off so easily.

~~~
jcranmer
I'm not so sure. Uber's defense here amounts to blaming the safety driver for
the crash. They can argue that their actions improved the safety of the
self-driving car (by reducing erratic movements due to false positives) so
long as
the safety driver is performing their job properly. Absent a smoking gun email
that says "yep, we know this is unsafe, we don't care" (basically the Ford
Pinto scenario), I don't think a guilty verdict is near-certain.

------
Quarrelsome
and this ladies and gentlemen is a major reason why the US is the best place
in the world to do business and specifically bleeding edge stuff.

------
xutopia
Blame the driver... of course the driver was told that the car drove
itself... she was watching a show on her phone when it happened, so Uber is
putting the blame on her.

I don't think the driver was trained properly to operate that semi-automated
driving system. Is the fault lying with her, or is Uber just trying to lay
the blame on her to avoid paying anything?

~~~
jimktrains2
Trains have had dead man's switches for nearly 50 years at least. You can't
give people the job of paying attention if there is nothing to pay attention
to 80%, 90% of the time. It's not reasonable to me to expect someone to be at
peak alertness while doing nothing at all.
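
The train-style dead man's switch mentioned above can be sketched in a few
lines (an illustrative toy, not any real rail or vehicle system):

```python
# Toy dead man's switch: if the operator gives no input within the
# timeout, the vehicle must brake. Illustrative sketch only.
class DeadMansSwitch:
    def __init__(self, timeout_s: float):
        self.timeout_s = timeout_s
        self.last_ack_s = 0.0

    def acknowledge(self, now_s: float) -> None:
        # Called whenever the operator presses the pedal/button.
        self.last_ack_s = now_s

    def should_brake(self, now_s: float) -> bool:
        return (now_s - self.last_ack_s) > self.timeout_s

dms = DeadMansSwitch(timeout_s=30.0)
dms.acknowledge(now_s=0.0)
print(dms.should_brake(now_s=10.0))  # False: operator responsive
print(dms.should_brake(now_s=45.0))  # True: 45 s of silence -> stop
```

The point is that the vehicle demands periodic proof of attention, rather
than assuming the human is watching.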

~~~
smileysteve
It is reasonable, however, that they not be watching a cell phone when they're
being paid to be a backup driver.

The optics for the driver were really bad. They weren't just not paying
attention, they weren't "just" eating and drinking (something accepted in
American driving for the last 60 years), they weren't staring blankly into the
desert, they didn't fall asleep. No, they were doing the one thing that has
received the most public messaging for the last decade.

~~~
jimktrains2
> The optics for the driver were really bad.

Oh don't get me wrong. I don't feel like the driver is blameless. I just mean
that people not paying attention when they don't have to do anything isn't a
new problem or one without what seems to be an adequate solution.

------
rb808
The law around this is so important. When your car hurts someone, whose fault
is it? What if it's completely self-driving, like this one? What if there is
a bugfix which you didn't apply? What if a software upgrade causes a crash?
What if lane assist caused you to swerve into something?

So many grey areas.

~~~
ams6110
> When your car hurts someone, whose fault is it?

Fault is a loaded word. It's your liability. That's why you have insurance.

~~~
hcurtiss
It's not always the driver's liability. Sometimes the pedestrian is at fault.

~~~
fucking_tragedy
Sometimes the manufacturer is at fault, as well.

------
helthanatos
Isn't most of the point of computers driving us that they can react to things
we can't? I don't know why that bit would be turned off... seems seriously
problematic. Secondly, that looks like a really bad place to be pushing your
bike at night.

~~~
baroffoos
When you are traveling at 60 km/h+, no amount of computers can save you when
even reacting instantly doesn't leave enough distance to slow down. Cars,
self-driving or not, are extremely unsafe. Sure, it's a bad place to be
walking at night, but it's quite problematic that we accept that walking
around without looking both ways at every step is punishable by instant
death.

~~~
lugg
If reacting instantly is not enough to slow down, you were driving way too
fast for the conditions, period. Defensive driving is a solved problem.

Unless you're talking about times where things really are out of the domain
of the car -- people jumping in front. But nobody really cares about that
non-problem.

Cars unsafe? Compared to what? Cheeseburgers? Most of the time when I drive,
I don't die and I don't kill people. That seems pretty safe to me.

Regarding liability in this case, I'm not sure where I sit; probably on the
side where Uber is liable. It's their company, and no amount of "it's new so
it doesn't count" hand-waving will change my mind.

But the way I feel about this reflects my bias toward strong consumer
protections.

I'm curious what on earth the argument is that lets Uber off the hook here.

~~~
baroffoos
>If reacting instantly is not enough to slow down you were driving way too
fast for the conditions, period. Defensive driving is a solved problem.

Why does our law not reflect this? If this were a solved problem, then all
roads that people walk near would be 40 km/h, but instead we have people
riding bikes and walking across roads with 60 km/h traffic, and they
regularly get killed. You might be able to safely drive your car and not die
most days, but it's shockingly common for people to die on our roads because
we prioritize getting to work 30 seconds faster over reducing death.

~~~
lugg
It does reflect this.

Just to be clear, what I meant by "solved problem" was that it has a known
solution; it's not NP-hard, so to speak.

And while the problem is solved, the implementation is far from widespread.
Defensive driving lessons are not forced on everyone.

If you're a bus driver and rear-end someone, you lose your job, specifically
because you should have kept enough distance to stop if they suddenly brake.

> shockingly common

This phrase doesn't mean anything on its own. Compared to what? People die
all the time. That doesn't mean it's more common than the norm you expect,
which is what you're trying to convey.

I was actually thinking about this problem more and more.

There is no reason a self-driving car can't prevent incidents like this.
There isn't really any situation where you want to crash into anything;
allowing the car to do so is an error.

Defensive driving states if you can't see if there is something that might
cross into the road around that blind corner then you need to reduce speed to
the point you can safely edge around the corner and see what is oncoming. If
that makes for a crappy drive then fix the roads, cut the trees down, make
roads straighter and pedestrians more visible.

~~~
baroffoos
>Defensive driving states if you can't see if there is something that might
cross into the road around that blind corner then you need to reduce speed to
the point you can safely edge around the corner and see what is oncoming. If
that makes for a crappy drive then fix the roads, cut the trees down, make
roads straighter and pedestrians more visible.

Yes, if self-driving cars did this they would be fine, but I doubt they will
be built like this, because people probably don't want their car to slam on
the brakes every time someone walks toward the edge of the street, or to
slow to a crawl every time they get to a corner. Virtually no drivers slow
down to a safe speed when going around a corner; it's just that 99.9% of the
time there is nothing around the corner.

The Uber car that killed that person was driving far above its ability. From
what I read about it, these self-driving cars are unable to tell at high
speed whether something in the distance is in the way or not, so they assume
that any stopped object in the distance must not be in the way, because
slamming on the brakes every time there is a tree on the side of the road is
not ideal.

------
edgartaor
Let's say that in the future every car is a self-driving car, and they are
more reliable, so accidents are reduced by 99%. How do we deal with the
remaining 1%? Whose fault is it? The manufacturer's? The developer's?

~~~
rhacker
There will be plenty of money to pay for the 12,500 people that die because
of those systems. (WHO says there are currently 1.25M deaths each year from
road accidents; if that is reduced by 99%, we'll have 12,500 dead.) Even if
that costs $20B per year in liability, a company like Alphabet/Waymo can pay
for it with the amount of money they are making from the actual self-driving
tech.
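
Checking the arithmetic in that comment (the WHO figure and the 99%
reduction are both taken from the comment itself, not new data):

```python
# Sanity check of the figures quoted above.
annual_road_deaths = 1_250_000         # WHO estimate cited in the comment
remaining = annual_road_deaths // 100  # a 99% reduction leaves 1%
liability_budget = 20e9                # the $20B/year figure above

print(remaining)                     # 12500 deaths per year
print(liability_budget / remaining)  # 1600000.0 -> $1.6M per fatality
```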

~~~
pmoriarty
_" There will be plenty of money to pay for the 12,500 people that die
because of those systems."_

How much is a life worth?

~~~
JumpCrisscross
> _How much is a life worth?_

Value is relative. Nothing has intrinsic financial worth. “What is a life
worth to the average person in a society” is a well-researched and answered
question.

------
tobyhinloopen
UBER is not criminally liable, but THE DRIVER might be, since she was watching
a TV show. Subtle but important detail. Looking at the comments, I suspect
many comment without reading the article.

------
aasasd
Funny enough, this is about the fifth attempt to post this story to HN, with
different links. It's getting barely any response.

~~~
dajohnson89
Those previous attempts were successful. Perhaps you mean, "attempts to get
upvoted to the front page"?

~~~
smallgovt
Yeah, he's saying it's suspicious that this didn't get more upvotes. I agree.

~~~
dajohnson89
What's the angle exactly? A shadow force of HN mods, colluding with Uber, to
prevent this story from getting upvotes?

------
npip99
None of the top comments make any sense to me. I simply don't understand them.
Anyway, here is my understanding of the subject:

(1) Yes, Uber should have trained the driver better to look at the road. By
trained, I mean there should have been a sticky note on the wheel saying "PAY
ATTENTION OR YOU WILL KILL SOMEONE"

(2) Yes, the driver absolutely should have to pay some penalty for this, if
Uber told her that she should have been paying attention (which they most
certainly did). Watching a video while in a self-driving car is IDENTICAL to
watching a video in a normal car. Modern self-driving cars are NOT fully
autonomous, and they SHOULD be viewed as IDENTICAL to cruise control for all
legal considerations and thought experiments. Most of the top comments, which
are blindly attacking self-driving cars, are not making this analysis.

However, (1) and (2) do not justify the lack of logic displayed by most of the
top comments here. They seem to violate the following:

(a) There is no logical difference between a person accidentally killing
someone and the self-driving car accidentally killing someone. Actually,
because the car is already known to not be fully autonomous, this already is a
case of a person accidentally killing someone. However, even if the car were
fully autonomous, we MUST consider the ODDS of an accident. None of the
other comments are doing this. There are always some odds of an accident, so a
single specific accident means nothing by itself. Literally nothing. This post
doesn't even mean anything. What SHOULD be posted is "Self-driving cars with
humans at the wheel kill X people per road-hour. Human-only cars kill Y people
per road-hour." If X > Y, then yes, we have a fking problem. But without that
information, we have literally nothing to think about or process.

(b) Disabling the safety feature of the car is not a fk'ing concern, at all.
Almost no cars have these features; only expensive ones do. Turning an
expensive car into a normal car is NOT something you can be sued for, or
even something anyone should care about. Who is to blame for the situation
simply does not depend on this fact. I don't understand why people are
discussing this aspect.

(c) Do you see the image? Maybe it's not showing everything, but as far as I
can see there isn't a light there. Wtf is she doing crossing the road, without
waiting for the cars, if there's no light there? As a city dweller who is
constantly found in the middle of streets, trying to cross parts of the road
that don't have stop lights, I just can't fathom ever being in her position.
Maybe my city has less considerate drivers than her city, but if I tried to
cross roads without a red light blocking cars, and didn't consciously give
right of way to the traffic, I would die sometime this week.

(d) Many people seem to quote "self-driving system classified the pedestrian
as an unknown object, then as a vehicle, and then as a bicycle" as if it meant
"this self-driving system is complete sh*t; it's all Uber's fault". Makes no
damn sense. As a side note, if you've ever worked with a neural net, especially
on video as opposed to still photos, you already understand that the quoted
sentence means nothing. That's just how they work, and there will always be
milliseconds in between frames where the system reassigns an object's
identification. But anyway, this is not relevant. The self-driving part could
have been completely off, or disabled. The car is SUPPOSED to be driven by the
driver, and any deviation from that is the fault of either the driver or
Uber's training of the driver. Whether it's the former or the latter is exactly
where and how suing should be directed and handled. Nothing else matters,
despite most comments putting much emphasis on many other aspects of the
situation.

~~~
InclinedPlane
Many dashcams do poorly in low-light conditions; Uber's footage is not
representative of what that section of road actually looks like. Compare it to
this:
[https://www.youtube.com/watch?v=1XOVxSCG8u0](https://www.youtube.com/watch?v=1XOVxSCG8u0)

Additionally, it DOES NOT MATTER whether the section of road was well lit or
not. Does not matter. Because the pedestrian was already well into the road
when the Uber vehicle hit her. Cars, as many people familiar with them may be
aware, have headlights, which are meant to be used while driving at night.
If you can't stop fast enough to avoid hitting something at the farthest
limits of your headlights, then you are driving too fast for conditions,
whether a human or a computer is driving. As I said, the road wasn't actually
dark, because the dashcam footage is deceiving; but even if the footage were
accurate, it would be immaterial to this particular case.

------
dana321
I know it's bad having a machine-controlled car kill someone, but it will
happen and we must accept it; my guess is that in the long term it will happen
far less often than with human operators.

~~~
smallgovt
The problem is that deaths per mile driven are currently much higher for self-
driving cars than for humans. We don't have to just accept it. Progress can be
made while still being safe -- at least, safer than how the free market would
act on its own.

> The NHTSA reports a fatality rate of 1.25 deaths per 100 million miles[2],
> twenty five times the [4 million miles] Uber has driven.

[1]
[https://news.ycombinator.com/item?id=16620736](https://news.ycombinator.com/item?id=16620736)
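A back-of-envelope version of the comparison being made here, taking the quoted figures at face value (one fatality over the quoted ~4 million Uber miles; a sample size of one, so this is suggestive at best):

```python
# Rough comparison of the fatality rates cited above. One death over
# ~4 million miles is a sample of one, so treat the ratio with care.
human_per_100m_miles = 1.25    # NHTSA figure quoted above
uber_miles = 4e6               # miles Uber is quoted as having driven
uber_fatalities = 1

uber_per_100m_miles = uber_fatalities * 100e6 / uber_miles
print(uber_per_100m_miles)                          # 25.0
print(uber_per_100m_miles / human_per_100m_miles)   # 20.0
```

On these numbers Uber's rate would be roughly 20x the human baseline, though a single fatality makes the estimate statistically very weak.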

~~~
smileysteve
One of the issues is that the car's stock emergency-braking system was
disabled.

~~~
JimmyAustin
Absolutely not an issue. The last thing you want when developing a prototype
piece of software for a one-ton chunk of metal is another bit of software
plugged in that you don't control or understand.

~~~
hopler
That's nonsense. The car had a human backup driver and human pedestrians, all
not under the autonomous car's control.

An automatic emergency brake doesn't prevent a computer or human driver from
braking when it wants to. If the autonomous system can't even handle
interacting with an emergency brake, the prototype belongs in a lab, not on a
public road.

------
sneak
I’m not a lawyer, so forgive my ignorance, but how can a corporation be
criminally liable for anything ever? It’s not like you can charge a
corporation with a crime or put a corporation in prison. Corporations can’t
commit crimes because they cannot act; only humans can.

~~~
anateus
The Ltd or LLC after many company names refers to the limited liability that
those forms of companies confer, which protects individuals within them from
personal liability. Corporations were intentionally made a lot like people
even before Citizens United. There's also a concept called piercing the
corporate veil that is intended specifically for penetrating these legal
protections in certain cases.

~~~
sneak
That limited liability is a civil one. How would you even charge a corporation
with a criminal act, e.g. murder? Corporations cannot act, regardless of their
personhood. Piercing the corporate veil, again, is a civil matter.

------
wand3r
Self-driving vehicle development has so many difficult problems to solve. The
technology itself is extremely difficult to create and relies on an extremely
complex compromise in which vehicles are allowed to be (and are) operated by
code but ultimately babysat by drivers.

I think self-driving technology is worth the risk, and worth the large amount
of property damage and human injury that comes with it. However, this is a
societal issue and there needs to be some sort of referendum on it. While
many parallels can be drawn with space travel, those projects were tightly
contained and involved individuals who were specifically trained and informed
of the potential dangers. There is no way to really do this at scale for self-
driving tech, which necessarily demands an expansive ecosystem that can't be
tightly controlled, nor can consent be inferred from the unknown number of
people involved in this experiment.

~~~
gotocake
_I think self-driving technology is worth the risk and worth the large amount
of property damage and human injury that comes with it._

I’m guessing you don’t see your property or life as being worth sacrificing
for this “noble” goal, right? I’m a little horrified at people who gleefully
volunteer their fellow humans to suffer or die in the name of progress. If
however you want to personally volunteer your person and property in the name
of technological advancement, that would be a different matter.

~~~
saulrh
"Are you willing to risk your life to improve this" is a leading question.
What I think you meant to ask is something more like "are you willing to
improve safety with your daily risk or not". I am already risking my life on
the road every day. Last month I was in a Lyft and the driver decided to go
around a car that was blocking an intersection and in the process swung the
front of the car through the path of an oncoming light-rail train. I've known
a couple of people who got hit by drunk drivers; they didn't sign up for
anything
at all. So what's the difference? There are multiple fatal automobile
accidents per day _per major metro area_. At least make those deaths mean
something.

In fact, we already do this, all over the place. Clinical trials: yeah, those
are more heavily regulated, but the possibility of relatively increased risk
there is far greater; I would not expect a modern autonomous vehicle to be
ten or a hundred times more likely to kill its passengers than a human driver
is. How about legislation? The drug legislation that's ruining some states
is nothing more than some politician spending their constituents' lives to
signal their own perceived moral virtue to their party. Or maybe construction
projects: every time we put up a lighter, cheaper bridge, we are saving time
and money by risking the lives of every person who takes that bridge instead
of finding a job on the near side of the river. Or allergic reactions to
medications, including vaccines! Negative reactions to vaccines are incredibly
rare, but you bet your ass that I will choose to bet that someone doesn't get
Guillain-Barré instead of letting them spread measles.

Alternatively, excuse me while I fight dirty for a bit. Let's turn your
argument around. I believe that it is effectively certain that autonomous
vehicles will save lives in the long run. Slowing that down costs lives. So
why are you so willing to spend those lives? What right do you have to spend
the lives of the people who would have been saved by autonomous cars because
you thought that it was more important to preserve people's right to get hit
by drunk drivers today?

This is what civilization _is_. Just do the math.

~~~
gotocake
_" Are you willing to risk your life to improve this" is a leading question.
What I think you meant to ask is something more like "are you willing to
improve safety with your daily risk or not". I am already risking my life on
the road every day. Last month I was in a Lyft and the driver decided to go
around a car that was blocking an intersection and in the process swung the
front of the car through the path of an oncoming light-rail train. So what's
the difference? There are multiple fatal automobile accidents per day per
major metro area. At least make that mean something._

Actually I meant what I said, because I responded to someone explicitly
endorsing not only risk, but _the large amount of property damage and human
injury that comes with it._ As for Lyft, you _choose_ to get into that car; I
can choose to get into a car or not as well. The lady who was run down by Uber
didn’t make any such choice. In general, choice and informed consent are the
big difference and the big deal when we’re talking about unproven technology
being developed on our streets; not because no other option exists, but
because it’s quicker and cheaper.

 _In fact, we already do this. All over the place. Clinical trials._

People _volunteer_ for them, and are fully informed of the risks and benefits,
they don’t volunteer random other people. This is the very key difference I
was pointing out. As I said, if you want to be a guinea pig in the name of
progress, that’s your right and I’ll support and laud you for it. Just don’t
volunteer me.

~~~
saulrh
 _as for Lyft, you choose to get into that car, I can choose to get into a car
or not as well_

How far do you want to take this? I accepted the risk of idiot drivers not
checking before turning right when I chose to walk to work. I accepted the
risk of drunk drivers drifting over the double yellow when I decided to drive
to work. That train driver accepted the risk of up-close traumatic experiences
when they took the job. Children accept the risk of being driven to school by
their parents or walking across the street in the morning to catch the school
bus.

I have to get from my apartment to my office every morning. I have to get from
my office to my apartment every evening. So what am I supposed to do? I can't
walk that far. Biking is just as dangerous as taking a taxi. I have bad knees
and public transit only works for me a couple times a week. If I get behind
the wheel I'm on the same road as drunk drivers and idiot drivers, and for
medical reasons I don't have that option anyway. If I take a lyft or a taxi
I'm being driven by some rando off the street. So, what, if I get run over by
a fucking drunk driver one night, it's totally my fault, and I accepted that
risk by _fucking existing_? _I don't have a choice_. I did not _accept_ this
risk. This risk was _forced on me_.

I don't care which risk I'm taking. I don't care which one is being forced on
me. I don't care who is forcing it on me. I care about how _much_ risk I am
accepting. And so:

 _deal when we’re talking about unproven technology being developed on our
streets_

This is the only part of your argument that matters. The rest is an emotional
distraction. _If_ you can demonstrate that autonomous vehicles are a _greater_
risk than the alternative, _then_ you get to say that they're dangerous. And I
don't think that you can say that. Why do you think that you get to volunteer
a decade of people to being run over by drunk drivers and idiots? Because that
is what you are doing - autonomous vehicles may be an unproven risk, but
humans are such a long-standing and thoroughly proven risk that people have
just become _inured_ to it.

