
Uber backup driver indicted in 2018 self-driving crash that killed woman - runnr_az
https://www.phoenixnewtimes.com/news/uber-backup-driver-in-phoenix-indicted-over-fatal-self-driving-car-crash-in-18-11494111
======
alextheparrot
This job reminds me a lot of when I lifeguarded in high school during the
summer. You sit on a chair in the sun, waiting for something to happen. The
job is not stimulating, which further increases the difficulty of recognising
if someone is in trouble - try watching some rescue training videos [0]. While
a lot of the interventions a lifeguard does are prophylactic (“No running!” vs
administering first aid to someone who tripped and split their skull open),
interventions did happen a few times a month. To make sure those interventions
happen when needed, pools implement processes to better manage the risk.

These processes seem to be the key difference between that job and this one.
Rotations every 30 minutes, a manager actively observing for engagement, and
overlapping zones of coverage were all instrumental in keeping fatalities at
zero - I can recall examples where each of those saved at least one person’s life.
Redundancy, supervision, and engagement are what make a pool safe. While I
think this woman failed to do her job and may be punished for it, it is
important to question why your community pool has better safeguards than an
experimental car.

[0] [https://youtu.be/4sFuULOY5ik](https://youtu.be/4sFuULOY5ik)

~~~
randycupertino
I think the key difference is the backup driver was watching a video on her
phone when the accident happened. I was a summer lifeguard on the ocean for
many years (SOYA summer! Sit On Your Ass Summer), and we would have been fired
if we browsed our phones while on duty.

~~~
Hokusai
> I think the key difference is the backup driver was watching a video on her
> phone when the accident happened.

Blaming the individual would not solve the problem. The question that the
parent comment tries to answer is "what do we do so the backup driver does not
watch their phone?".

That would save lives. Blaming the individual and moving on will kill more
people in the future.

~~~
_AzMoo
We should absolutely be blaming the individual in this case. They were
negligent in their stated job role which led directly to the death of a
bystander. We should also be finding fault with the processes and finding
solutions for them, but this does not absolve the driver of their
responsibility.

~~~
onetimemanytime
>> _We should absolutely be blaming the individual in this case._

Extreme case: Your employer tells you to stay alert in a super-boring job for
8 straight hours or else a person can die or whatever. But that may be an
impossibility due to how our brains work.

I'd like to know how many hours she was sitting there and how often she had
to intervene that day/week/month. Uber should maybe have had another person
checking on the drivers now and then. Say, one for every 10 drivers...

~~~
dependenttypes
If she could not do the job then she should have resigned.

Edit: there was another post by [redacted] but it disappeared, it does not
even show as deleted, weird.

It said: "This isn't some ride along in a consumer grade EV. They were
gathering data to program the car with. The emergency braking systems were not
active. The person that was supposed to be monitoring the vehicle knew this."

~~~
dang
When a commenter deletes their comment, it disappears.

It would probably be more respectful not to copy what they posted along with
their username. Actually, I think we'd better redact the username from your
quote. People sometimes have important personal reasons for deleting things.
The odds aren't high that it matters but the impact could be high if it did.

~~~
dependenttypes
It would probably be more respectful if you would not mess with my posts;
now even I do not know who made said comment. Guess I should start signing my
posts and keeping backups of them.

People sometimes have important personal reasons for deleting things, but this
does not mean anything. It is not as if "x posted y on twitter and deleted it
afterwards" or "the page was edited/deleted, here is an archive.org link" is
uncommon on HN, nor is it as if a stalker would not be able to scrape someone's
HN posts instantly as they were posted.

> it disappears.

In my experience they show as [deleted] but I guess this is only for posts
that have replies.

~~~
Wowfunhappy
> In my experience they show as [deleted] but I guess this is only for posts
> that have replies.

Kind of. Technically, posts that have replies _can't_ be deleted, so what
sometimes happens is the poster edits their post and replaces all the text
with "[deleted]".

~~~
krapp
>Technically, posts that have replies can't be deleted, so what sometimes
happens is the poster edits their post and replaces all the text with
"[deleted]".

Which should also be prohibited, because it's just as destructive. You
shouldn't be able to edit a post with replies either.

~~~
Wowfunhappy
I don't know - if my posts were locked after the first reply, my contributions
to HN would be definitely worse. I almost always find confusing typos and
grammar errors, and things that just could have been stated better, after
initially posting a comment, and I use edits to fix those problems.

Yes, ideally I would just do more revising before posting a comment the first
time, but I don't seem to work that way.

I think the two-hour window is a good compromise, and if anything I really
wish it was longer. Yes it has downsides, but I really think they're
outweighed by the good.

------
shajznnckfke
I think it’s useful not to label this person’s job “driver” but instead “fall
guy”. Their job is to sit around all day doing absolutely nothing. But in the
rare event where the car fails, they need to suddenly become alert and fix it,
or take the blame for the machine’s failure. I think it’s more accurate to
describe this transaction not as a form of labor but as an indemnity, an
assumption of liability, paid for via a wage.

For train drivers, who face a similar challenge, a solution has been invented:
vigilance systems that require them to respond to periodic attention checks
and detect when they aren’t alert. No such system was present here. Uber’s
system wasn’t designed to work - it was designed to protect Uber by shifting
the blame for failure.
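For illustration, such a vigilance device is essentially a watchdog timer: the operator must periodically confirm alertness, and a missed confirmation forces a safe stop rather than assuming the human is still watching. A minimal sketch in Python (the class, method, and parameter names are invented for illustration, not any real system's API):

```python
import time

class VigilanceMonitor:
    """Sketch of a train-style vigilance ("dead man's") device.

    The operator must acknowledge within `timeout_s` seconds of the last
    acknowledgment; a missed deadline triggers `on_timeout` (e.g. a
    controlled stop) instead of assuming the human is still alert.
    """

    def __init__(self, timeout_s, on_timeout):
        self.timeout_s = timeout_s
        self.on_timeout = on_timeout
        self.last_ack = time.monotonic()
        self.stopped = False

    def acknowledge(self):
        # Operator pressed the alertness button: reset the deadline.
        self.last_ack = time.monotonic()

    def poll(self, now=None):
        # Called periodically by the control loop; `now` is injectable for testing.
        now = time.monotonic() if now is None else now
        if not self.stopped and now - self.last_ack > self.timeout_s:
            self.stopped = True      # latch: fire the safe stop only once
            self.on_timeout()

# Example: a missed acknowledgment commands a safe stop.
events = []
monitor = VigilanceMonitor(timeout_s=5.0, on_timeout=lambda: events.append("safe_stop"))
monitor.poll(now=monitor.last_ack + 2.0)   # within the window: nothing happens
monitor.poll(now=monitor.last_ack + 6.0)   # deadline missed: safe stop fires
```

The point of the latch and the hard deadline is that the system fails toward stopping, instead of trusting an unverified human to be paying attention.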

~~~
Aeolun
It is hard to imagine they weren’t aware that their job involved doing nothing
(but paying attention to the road).

I’d be inclined to give them a pass if they were looking at the road and just
inattentive (because that’s expected, like you say), but I find it hard to
sympathize with someone watching a show on their phone, explicitly ignoring
their _one_ job.

~~~
clairity
that's the fundamental attribution error again, attributing poor
judgement/negligence to the driver, who's the visible actor, rather than the
inherent systemic flaws designed by other unseen actors who hold greater
responsibility.

~~~
baddox
Would you say the same thing for a chauffeur or truck driver who killed
someone because they were using their phone while driving?

~~~
abduhl
When some people learn to drive they can take courses with instructors who sit
in the passenger seat and have a brake pedal. When the student driver runs
over and kills a pedestrian while the instructor is on their phone, is it the
instructor or the student who should face liability? Why not both?

~~~
lkdsajflkfdsjg
I'm fairly sure the instructor is at least partly liable.

~~~
Ichthypresbyter
In England, certainly, the person supervising a learner driver is subject to
the same laws about alcohol, mobile phone use, etc. as someone actually
driving, and can be convicted on a charge of aiding and abetting any crime the
learner driver commits. This is the case regardless of whether they are a
professional driving instructor with their own brake pedal or a parent
supervising their child driving the family car.

~~~
glaberficken
In Portugal there is an interesting law regarding this: The
instructor/examiner is always liable for an accident during driving
lessons/exams, except if the accident resulted from an action where the
learner disobeyed a direct order by the instructor. In that case the learner
takes responsibility.

------
slg
>Uber made a series of development decisions that contributed to the crash’s
cause, the NTSB said...Uber deactivated the automatic emergency braking
systems in the Volvo XC90 vehicle and precluded the use of immediate emergency
braking, relying instead on the back-up driver.

If the driver committed homicide, it sure sounds like Uber is also guilty of
homicide.

~~~
clusterfish
Deactivating this optional feature is somehow worse than buying a car without
such a feature in the first place? The latter is neither illegal nor immoral.

The safety driver had an actual job to do, he wasn't there "instead" of
automatic emergency braking - which is not certified for driverless operation
btw. But he was distracted with a phone instead of looking at the road.

The halo effect here is unreal.

~~~
bb611
The American legal system places much more emphasis on acts you may have
committed than omissions, and tends to avoid compelling action.

So yes, in an American court, disabling a proven safety feature is
significantly worse than simply purchasing a vehicle without the feature.

The safety driver failed at their job, but the NTSB clearly lays significant
blame for that failure on Uber, who should know well that humans are poorly
suited to monitoring automated systems, and committed acts and omissions that
increased the likelihood of an accident.

~~~
tomalpha
This brings to mind the classic Trolley Problem:

[https://en.wikipedia.org/wiki/Trolley_problem](https://en.wikipedia.org/wiki/Trolley_problem)

The scenario is notably different, but it does dig into the issues around acts
vs omissions and how we perceive them.

------
jedberg
I mean, this makes sense. She was watching a video on her phone while driving.
It was her literal job to know that the car might make mistakes and correct
for them, so she should have known that she still had to pay attention as
though she were actually driving.

~~~
brudgers
It makes sense because holding users personally responsible for the
inadequacies of self-driving products externalizes all the risk for companies
selling self-driving systems. The Uber system detected something in the road
and proceeded _because_ it did not recognize it. That's how it is designed.
Otherwise the car would never go very far without stopping because the system
does not recognize most things it detects.

To put it another way, the self-driving system did not alert the driver that
it had detected something and did not know what it was. It wasn't an
emergency, it was the car's normal operation.

~~~
bastawhiz
The system was, by its very nature, not ready for production. That's why it
had a safety driver in the first place. It's crazy to argue that the safety
driver should have been alerted... the whole point of having them there is to
handle failures of the system, including ones where the system fails to detect
an issue.

If my skydive instructor doesn't deploy the backup parachute because I, the
student, didn't alert them that the primary chute failed to deploy, it's
entirely their fault if we hit the ground at terminal velocity.

If a lifeguard is working at a public pool watching Netflix on their phone and
a kid drowns, you can't argue that the kid should have splashed more.

~~~
liability
Production? It wasn't even ready for testing on the unsuspecting public.

~~~
bastawhiz
I don't disagree. But that's a separate issue. The safety driver provably did
not even make a good-faith attempt to perform their function. It's not
possible to know whether they would have been able to avoid the tragedy that
occurred, but it's a certainty that in any universe where they were watching
TV on their phone, they would not have improved the outcome.

------
samcheng
That poor person is Uber's "moral crumple zone"

[https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2757236](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2757236)

Fitting, given Uber's reputation for questionable morality...

~~~
clusterfish
Guy was using his phone not paying attention to the road when testing
experimental equipment on a public road. He was there specifically to mitigate
potential malfunctions of said experimental equipment, but he was reading
reddit or whatever instead of keeping lookout.

I can't see how this is not clear cut negligence.

~~~
ardy42
> Guy was using his phone not paying attention to the road when testing
> experimental equipment on a public road. He was there specifically to
> mitigate potential malfunctions of said experimental equipment, but he was
> reading reddit or whatever instead of keeping lookout.

> I can't see how this is not clear cut negligence.

1) The driver is a woman.

2) While I think the driver does bear some fault here, they don't bear _all_
the fault. Uber designed an unsafe system that relied on an unnatural amount
of vigilance from a single person while simultaneously discouraging that
vigilance [1]. They didn't design the car to shut down when one of it's
critical safety components (the driver) was not operating correctly, and they
didn't even give that driver amphetamines or something to increase their
vigilance to artificial levels.

[1] Basically: pay close attention to a boring process while doing absolutely
nothing for hours on end. I'm pretty sure that's a classic "humans suck at
this" task.

~~~
clusterfish
That "unnatural amount of vigilance" is just sitting in a comfy seat and
looking at the road ahead of you go by. There is nothing unnatural about that.
Lots of people do similar or more boring tasks just fine.

This isn't a truck driver falling asleep from exhaustion caused by aggressive
scheduling. You don't "accidentally" take out your phone when your job is
looking ahead. That's deliberate negligence.

If you want to require eye sensors to detect distraction, by all means, pass a
law about it. Maybe include regular non-AI cars too. #1 cause of accidents
over here.

~~~
vkou
> That "unnatural amount of vigilance" is just sitting in a comfy seat and
> looking at the road ahead of you go by.

Doing that for five minutes is easy.

Can you do that for eight hours? Day after day? Without a single lapse in
attention?

This is a much harder job than the truck driver's - because the trucker
constantly has to make microadjustments to correct for road conditions, which
keeps him engaged.

People's brains don't work the way you think they do.

~~~
Erlich_Bachman
Many people in this thread assert this notion. Do you have any data for it,
BTW? How can you claim this so confidently? Have you driven a self-driving
car? Many people (myself included) find it quite easy to monitor the road in
one for long periods of time. If the driver was not one of them, she shouldn't
have taken the job and risked innocent lives.

~~~
vkou
How are you so confident that you are different from all the Tesla owners
whose self-crashing autopilots drove their vehicles into stationary objects,
fire trucks, semis, etc?

What makes you confident that you are actually good at it, and are not a
victim of Dunning-Kruger? Do you regularly find yourself in the process of
stopping your self-driving car from crashing into things?

Or has your car simply not crashed yet?

~~~
Erlich_Bachman
> all the Tesla owners whose self-crashing autopilots drove their vehicles
> into stationary objects

All three of them? Among the millions of Tesla vehicles out there and the
hundreds of millions of miles driven? Is that even a considerable risk when
compared to the general (non-zero!) risks of driving?

For me personally, I simply know when I watch the road and when I don't. For
some people this might be hard, for others not so much. I am aware of when I
pull out my phone or distract myself and when not.

~~~
vkou
I'm sure you're aware of when you've pulled out your phone, or are playing
with your infotainment system. (You should also stop doing it, it's negligent
and illegal.)

But are you just looking, or are you actually seeing the road? How many times
have you taken control from your car as it was doing something stupid and
dangerous?

Unless the answer to that second question is 'I do it all[1] the time, and I'm
batting 20/20', what makes you confident that you'll catch the next instance?

[1] If that's really the case, you should probably short TSLA, it doesn't
sound like their car can safely operate.

------
blackbrokkoli
This discussion reminds me an awful lot of Don Norman's example regarding
human error:

> _Air Force_ : It was pilot error - the pilot failed to take corrective
> action

> _Inspector General_ : That's because the pilot was probably unconscious.

> _Air Force_ : So you agree, the pilot failed to correct the problem.

We can go and blame the driver all day long but that will not actually solve
anything. Was it neglect? Probably-maybe, I'm not a lawyer. But that is not
the point. _Why_ did the driver look at her phone? How can we prevent that?
What other, similar failure modes are there?

We did this dance with pilots, truck drivers, forklift operators, lifeguards,
factory workers, and God knows how many more. It is frankly quite disappointing
that HN is overwhelmingly like "this time, we'll just blame the operator!"...

~~~
tobyhinloopen
The operator was watching a video on her phone. That’s not failure to take
action; that’s intentionally putting yourself in a position to not take action.
~~~
TeMPOraL
It's also a well-known and completely predictable behavior for humans. In just
about every other job requiring constant vigilance, there are multiple factors
mitigating this failure mode (shift lengths counted in minutes, multiple
observers, automated attention checking devices that shut down the machine if
not reacted to, other employees ensuring the observers pay attention, etc.).

Uber should absolutely get the blame for creating this situation in the first
place; letting a self-driving car out with _only_ a backup driver as safety,
and with her phone on her to boot, should not even be allowed.

------
davidhyde
> "The vehicle operator’s prolonged visual distraction, a typical effect of
> automation complacency, led to her failure to detect the pedestrian in time
> to avoid the collision."

This is quite an important point. It was very dangerous of Uber to disable
the car’s built-in collision avoidance system and to have nothing to replace
it but the backup driver, especially when there is a non-zero risk of
automation complacency. I’m not trying to cover for the driver, as they were
clearly at fault. The evidence seems overwhelming. However, Uber shouldn’t be
cleared of fault just because the backup driver is found guilty.

~~~
cyrux004
For a development vehicle, I am assuming they bypassed the car's stock
functionality. I am sure they had their own braking system that was supposed
to stop for cars, pedestrians, cyclists, etc. It didn't function correctly at
the time.

~~~
davidhyde
> "Uber had disconnected the Volvo's factory-installed crash avoidance system.
> While the Uber vehicle's autonomous system did detect Herzberg before the
> impact, the vehicle — and Uber — relied on Vasquez to take action if an
> emergency arose."

The Uber software can detect an imminent collision but relies on the backup
driver to act on it. For everything else the car is designed to drive itself.
This is why "automation complacency" should not be ignored.

------
yholio
The six levels of vehicle autonomy are marketing bullshit. They imply failure
of automation is acceptable and the human can be left to pick up the slack.

In reality, there is no spectrum of automated driving, with the human doing
less and less. Driving is not an instantaneous activity like swimming; it
requires planning and strategizing, and the decisions made in the past
influence situational awareness. I plan an overtaking maneuver based on a
myriad of factors
and have escape routes already planned if things don't go well.

When the automated driver fails a human driver cannot be expected to just drop
in and correct the mistakes - by that time, they are uncorrectable and the
ramp-up time of the human driver far exceeds the available time in most real
life situations. In the overtake example, if the autopilot fails when a 20 ton
truck is approaching, I have no recourse, I have no idea if an attempt to
brake and regroup will be successful because I did not plan the maneuver.
Therefore, computer drivers cannot be allowed to fail in such a scenario.

In practice, there are only two real levels: 1. autonomous; and 0. non-
autonomous, with various automation that helps the driver while he remains
fully in control and the sole decision maker. What could constitute a self-
driving spectrum is the type of road where full automation is expected to
work: from restricted, instrumented roads where only other similar vehicles
are allowed, for the dumbest self-driving modes, up to full self-driving on
general-purpose roads shared with human drivers and pedestrians.

But the idea that you can achieve level 5 self driving by incrementally
improving the systems designed to aid the driver is a dangerous pipe dream.

~~~
bob1029
I still think general AI (i.e. AI so advanced that we would consider
protecting each instance as we would a human life) is a prerequisite for truly
autonomous driving on our complex roadways today. This is the kind of game
theory that would make me feel more comfortable sharing the road with these
kinds of cars.

I completely agree with the assertion that the grey area in between is where a
bunch of people are going to wind up dead.

~~~
Aeolun
> I completely agree with the assertion that the grey area in between is where
> a bunch of people are going to wind up dead.

But will more people end up dead than in the scenario where we had no ‘self-
driving’ cars at all?

Humans are much more accident prone than robots in all the research on this
I’ve seen.

~~~
toast0
The thing is, most of the deaths from self-driving will be seen as
preventable, if only a human had been driving. That's going to make it look
bad, even if the numbers are lower.

If you want to reduce deaths, I think you really want to invest in things like
automatic emergency braking, monitoring driver attention, and safely stopping
the vehicle if the driver is incapacitated. Having the computer supervise the
human means the human is engaged in driving and aware of the circumstances for
the most part; a computer supervising a human can act with great speed if the
situation warrants, but a human supervising a computer is likely not to react
so quickly.
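That supervision pattern can be sketched as a simple escalation policy: the longer the driver-monitoring system sees inattention, the stronger the intervention, ending in a safe stop. A hypothetical sketch (the thresholds and stage names are illustrative, not any shipping system's):

```python
from enum import Enum

class Stage(Enum):
    NOMINAL = 0   # driver attentive: no intervention
    WARN = 1      # audible/visual alert
    SLOW = 2      # reduce speed, hazard lights
    STOP = 3      # bring the vehicle to a safe stop

# Illustrative thresholds: seconds of continuous inattention before each stage.
WARN_AFTER, SLOW_AFTER, STOP_AFTER = 2.0, 5.0, 8.0

def escalate(inattentive_s):
    """Map continuous inattention time to an intervention stage.

    The computer supervises the human: it never waits for the driver to
    notice a problem, it acts on the driver's measured state.
    """
    if inattentive_s >= STOP_AFTER:
        return Stage.STOP
    if inattentive_s >= SLOW_AFTER:
        return Stage.SLOW
    if inattentive_s >= WARN_AFTER:
        return Stage.WARN
    return Stage.NOMINAL
```

The design choice is the same one toast0 describes: the fast, tireless party (the computer) watches the slow, fallible party (the human), not the other way around.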

------
btbuildem
This story has all the key elements that come up in discussions around
autonomous vehicles (thankfully omitting that inane trolley problem). To me,
it is a preview of what is likely to happen when adoption is at scale.

The "driver" (or rather "liability scapegoat") will bear the responsibility
when things go wrong -- simply because auto makers or service operators have
more legal resources than an individual.

Any person left to "monitor" the vehicle will not be paying attention most of
the time. Even a conscientious, mindful and diligent person will tune out
sooner or later; the task at hand is almost perfectly tailored to make you
drift off. Maybe we can look to train engineers to see how that problem is
tackled?

Further development to improve the self-driving capabilities of vehicles is
likely to plateau at some arbitrary milestone, because pushing further will
cost more than the court battles and keeping the favour of public opinion.

~~~
crazygringo
> _Any person left to "monitor" the vehicle will not be paying attention most
> of the time._

I'm not actually convinced that's true. Heck, people drive regular cars
"mostly on autopilot". Haven't you ever "stopped paying attention" and then
realized you don't even remember the last 5 minutes of driving?

But it's not really a safety issue, because your brain is very capable of
automatically and instantly detecting anything out of the ordinary to alert
you to, like something in the road in front of you.

Literally all you have to do is just _keep your eyes on the road_. You don't
have to be paying constant "conscious" attention... you just have to _have
your eyes in the right place_ so your brain can automatically alert you if
something's not right. Which it is _excellent_ at doing.

Now I'm not saying that somebody can do it for 36 hours straight or without
occasional breaks or whatever. But the idea that this is somehow inherently a
task we're not suited for is just ludicrous.

In this particular case though, it doesn't even have anything to do with her
capabilities of sustained attention. She _chose to watch a TV show on her
phone_. She actively and consciously chose to utterly neglect her safety
duties, the entire purpose of her job.

------
yodon
As systems become more reliable, it becomes progressively harder for humans to
stay focused and maintain the concentration required to respond in the way
this driver was expected to respond.

Yes, the driver is at fault for failing to maintain vigilance, but the
engineers who built this system are far more at fault for designing a system
whose operating parameters (long stretches of nothing to do culminating in
either more nothing to do or a sudden life or death emergency requiring full
concentration to detect and process and prevent) are well outside the
operating envelope of the human in the system.

The driver was hired to perform a task that was almost guaranteed to result in
a deadly injury while some driver was behind the wheel. She didn't have any
way to know that, but the engineers did.

~~~
weaksauce
> The driver was hired to perform a task that was almost guaranteed to result
> in a deadly injury while some driver was behind the wheel. She didn't have
> any way to know that, but the engineers did.

I think the culpability lies a bit more on her than you imply... she was
watching a video on her phone instead of surveying the road/instruments for
dangers.

~~~
yodon
By virtue of being an HN reader, you are almost certainly much better educated
and much more highly skilled than the person Uber hired to babysit their
vehicles during testing. Set the alarm on your watch to go off in two hours.
Start looking at your monitor to see if any pixels fail in the next two hours.
Ask yourself how far into that 2-hour task you were able to maintain
vigilance. Now try to do it day after day. Was she at fault? Yes. Were the
Uber engineers who designed this testing system at fault too? Absolutely.

~~~
weaksauce
> Was she at fault? Yes. Were the Uber engineers who designed this testing
> system at fault too?

Those two tasks are quite different. One is driving and observing new stimuli
and the other is watching paint dry. I don't think anyone could do the latter
without adequate breaks.

I never implied that the guilt was solely on her... she's culpable to an
extent, as is Uber for not having two backup drivers and/or more breaks to
stay fresh. I don't know if they had a written policy about it, but I can't
imagine being on the phone watching a movie was allowed.

------
tunesmith
One of the other front page articles right now is the PDF about Judea Pearl's
book, and it describes exactly this case, including that there was a roughly
six second window. And that the car normally had a feature to stop in those
cases, but that the engineers (who I take to not be the driver) turned it off
due to false positives.

I'm still unclear on how they find liability here - should it all fall on the
driver? You could argue "but for" the driver's failure to pay enough attention
to stop in those six seconds, and you could also argue "but for" the
engineers' decision to turn off the safety feature.

~~~
jbay808
False positives are dangerous, because cars stopping suddenly and unexpectedly
can be dangerous. The engineers probably were right to disable that, if the
false positive rate was too high.

~~~
GeorgeTirebiter
IF the engineers told the driver, "Hey, we disabled pretty much all the safety
features on your vehicle because we can't figure out how to make them work --
so be Extra Careful!!!" --- well, maybe. But shifting the blame onto a low-
wage worker to shield Corporate Hubris seems... wrong.

~~~
true_religion
It's no more responsibility than any other low-wage driving job would have.

You can put someone in a car without collision avoidance or even anti-lock
brakes, and it's okay to ask the driver to use their own skill and judgment
instead.

------
ogre_codes
If this wasn't a self driving car and it was just a normal driver texting
while driving at night, it would never have gone to court. The driver would
have said "I didn't see her", and the police wouldn't have even charged her.
This is particularly true since the victim was homeless.

~~~
onion2k
Here in the UK we have a specific law that doubles the penalties for careless
driving if you're using a phone. If you ran someone over you'd be charged with
'causing death by reckless driving' and you'd probably spend some time in
prison. Saying "I didn't see her" would be an admission of guilt.

~~~
ogre_codes
Wasn't considering the phone at all. In a lot of places in the US, using a
phone is _also_ a distraction likely to get you jail/a conviction. (In some
places it's considered driving impaired.)

The accident would just never have gotten the scrutiny it did. Unless the
driver came out and mentioned they were using their phone, the whole incident
would have just vanished. They certainly wouldn't have bothered getting a
subpoena to check the driver's cell records.

~~~
onion2k
You're suggesting that if a driver _runs someone over and kills them_ the
police wouldn't do a basic check like looking at the driver's cell phone
record? It's no wonder people all over the US want to defund the police if
they're _that_ incompetent. That would not happen here in the UK.

~~~
ogre_codes
Just in general, if you ever want to kill someone in the US, run them over
with a car. So long as nobody sees you do it and you aren't drunk or do
something silly to make it obvious it was deliberate, it's unlikely you will
be convicted.

If they are on a bike, it's even easier to get away with.

------
rossdavidh
From the NTSB report: "The vehicle operator’s prolonged visual distraction, a
typical effect of automation complacency, led to her failure to detect the
pedestrian in time to avoid the collision."

In other words, while this person should not have been doing what she was
doing, we can expect it to happen again if we let self-driving vehicles on the
road again. But, Uber (who can afford a much better lawyer than the driver)
was found not to have committed a crime. I think they were at least as
negligent as the driver, in this case.

------
itronitron
The Volvo's emergency braking system would have prevented that accident if
Uber had not deactivated it.

~~~
delfinom
It also would have helped if Uber had used quality vision cameras instead of
whatever garbage they were using while attempting to say "look at how unlit
and dark the road was!!". Meanwhile, it was a pretty well-lit city street when
people visited it.

~~~
MertsA
That footage was not what the computer used for vision. It was solely from a
dash camera separate from the self-driving system. Uber's sensor platform was
more than adequate.

------
simon_000666
There's a good Black Mirror episode that sorta covers this:
[https://en.wikipedia.org/wiki/Smithereens_(Black_Mirror)](https://en.wikipedia.org/wiki/Smithereens_\(Black_Mirror\))

Is it the driver's fault? Is it the fault of the company that put them in a
situation that makes it more than likely to happen? Is it big tech's fault for
systematically addicting people to dopamine and nudging them into the belief
that they can successfully multi-task without problems?

There are a couple of open questions for me: Did Uber take proper care to
train and onboard this person and make it clear that their role was to be
attentive at all times? Why did they disable the built-in autostop without a
back-up? Why was there no system monitoring and alerting on the attentiveness
of the driver? Surely that is easier than fully automating the vehicle.

I wonder: if she had been distracted by lighting a cigarette instead of
checking a mobile phone, how would that change the case?

These are the types of legal grey zones we will see more and more of with
increasing automation of transport, weapons etc.

My own opinion is that ultimate culpability is with all three parties.

~~~
Thorrez
I wonder if the Black Mirror episode was based on this crash.

------
rbtprograms
I think fault, specifically when it comes to litigation, is something that
will end up holding back self-driving cars (at least in the USA). If there is
an accident or someone is injured, who ends up getting sued? How does
insurance resolve these claims?

Please note that I am not saying I think someone needs to get sued. I just
think that the concept of liability is deeply ingrained in the American legal
and insurance systems, and I foresee a future where we cannot come to terms
with who is liable in different situations, which ends up preventing the
adoption and progression of self-driving vehicles.

~~~
rvnx
Perhaps a solution would be a liability chain? Get sued personally as the
vehicle operator, and then sue the car manufacturer yourself?

This way people will be pushed to pick the safest cars, or to adapt their
attitude when the self-driving functionalities are unsafe.

~~~
thelean12
This doesn't help if you go to jail for the actions of the self driving car.
(Generally. I'm not talking about the specific case linked).

Also, consumers can't be trusted to pick the safest cars. Most consumers
choose on price, not safety features.

~~~
baddox
How do you define and determine whether a consumer "chose on price" or "chose
on safety features"? Surely consumers consider both the price and the
desirable features when they purchase a car, just like for every other
purchase. I suspect that most consumers wouldn't buy a $100 car that has a 50%
chance of exploding every time you enter it, and most consumers wouldn't buy a
$500,000 car that was 5% safer than a $40,000 car.

~~~
thelean12
The point is that you shouldn't be able to sell or buy a $100 car that has a
50% chance of exploding every time you enter it. Or more to the point, you
shouldn't be able to sell a shitty self driving system. The post I was
responding to was basically saying "let the free market figure out which car
has a better self driving system" which is a horrible way to go about it.

------
ck2
If a coder writes sloppy code that causes a car to mow someone down, or a
dirty/broken sensor sends bad info, shouldn't there be liability if a company
is going to sell "self driving" cars?

As someone who has to run and bike on roads without sidewalks and use the side
lane, I am not sure whether self-driving cars are going to save me from the
ever-increasing number of drunk/high/distracted drivers, or whether some bug
will decide I'm fine to graze at high speed because it classified me as just
trash or something.

------
dathinab
The problem is that if you have nothing to do, it's _really_ hard to stay
concentrated, even if you stare at the road instead of your phone.

It's like saying:

\- we drive for you so that you don't have to concentrate

\- if you don't concentrate you might end up in prison

IMHO self-driving cars are only viable if the car manufacturer is responsible
for mistakes made by the (AI) driver they employ.

Which sadly also means they would only be available on an abbo basis.

It's a different thing for an assistant, though it's not always clear when a
car with an assistant becomes a self-driving car.

~~~
jmercouris
abbo is not English, many people will not know that you mean "subscription"

------
bryanrasmussen
So thinking about this "car going the speed limit even if conditions warrant
going slower" thing, it makes me ask some questions:

1\. Are there any statistics on how often road conditions do not match speed
limits? I mean governmental statistics.

2\. Are there any statistics on the same, but at night? (I would assume speed
limits are more likely to be higher than conditions warrant at night.)

3\. If 1 and/or 2, does Uber have data on how often drivers take over the car
because conditions do not match the posted speed limit?

4\. Does Uber have any sort of policy regarding drivers who take over
frequently? It just seems to me, given what I know of corporate mentality,
that Uber as a whole (or some managers hoping to up their metrics) might
penalize or get rid of drivers who took over 'too' frequently in comparison to
other drivers. Obviously taking over is good for the machine learning, but you
can't reward it either, or people will start taking over when they shouldn't.
I think there are probably incentives for the company to reduce how often
drivers take over.

Aside from that I was thinking - if you're a lawyer on HN and think this
person is taking the fall somewhat unfairly, there might be arguments in this
thread that would be worth sending to the lawyer representing the driver (his
name is given in the article) - I of course mean worth it from a legal
standpoint, because there might be expertise in some of these posts that the
lawyer would not be aware of unless informed.

------
aronpye
Isn’t the pedestrian who got hit at least partially responsible? They were
crossing a live road, unlit, and not at a designated crossing. I get that the
driver has a duty to mitigate / avoid an accident if possible, but the
circumstances didn’t involve the driver running a red light or driving into a
designated pedestrian area.

I’m curious as to whether her defence in court will fall along those lines, as
it surely can’t be a case where she is entirely at fault. I would have thought
it would be 50/50 liability, as both parties had equal opportunity to avoid
the accident. The driver should have seen the pedestrian, and the pedestrian
should not have jaywalked.

I get that this may be an unpopular line of reasoning, but please comment if
you disagree, don’t just downvote.

------
j_walter
It has been a while since this happened, but at the time I thought the video
showed that the woman came out between two cars right before the Volvo hit
her. Perhaps more information came out during the indictment, but there have
been many cases in the past where even drunk drivers were cleared of the more
serious charges when it was determined that even an attentive driver wouldn't
have had enough time to avoid a crash. Will be interesting to see how this
plays out.

~~~
stefan_
You mean the video from a dashcam that Uber released to deliberately make it
appear as if it was pitch-black and this woman came into frame a second before
impact?

No, indeed it turns out this is a very well lit, wide open area and your eyes
would have had no problem seeing and predicting her path from many many feet
away. As of course could the $$$ industrial cams Uber _actually_ uses to
inform its vision.

~~~
bzb5
Are you suggesting Uber maliciously edited the dashcam video to hide the
jaywalker? Or that a better, human-readable (ie not raw sensor data) version
of the video exists and Uber has not released it?

~~~
molf
Yes, it appears that Uber may have edited the video. Even if they didn't edit
it and just used a very shitty dashcam, it's still incredibly misleading.

There are no alternate versions of the dashcam available. However, there are
comparisons of the same location made by other drivers that make it pretty
clear this is a fairly well-lit stretch of road. [1] It looks nothing like the
released Uber footage.

[1]: [https://arstechnica.com/cars/2018/03/police-chief-said-
uber-...](https://arstechnica.com/cars/2018/03/police-chief-said-uber-victim-
came-from-the-shadows-dont-believe-it/)

------
thrownaway954
but isn't the whole point of autonomous driving that you can be doing other
things while the car is driving itself??? cause, you know, if you have to pay
attention, then technically you're driving.

not for nothing, but they are blaming the wrong person and it upsets me that
they are charging them in the first place. i love how the company is not at
fault yet the employee now has to pay the price. it sickens me how our justice
system works. most of the time people get charged specifically cause the da
wants to make a name for themselves, and i see it with this case.

btw... uber better have their entire legal team helping this person out, cause
this could REALLY set a precedent to the point where it could kill this
technology. imagine if self-driving cars really do become a thing and this
happened to you.

personally i think the biggest mistake we made as a society is putting too
many automated features in vehicles. drivers are becoming way too lax and
dependent rather than alert.

------
foobar1962
Uber disabled the Volvo's auto crash-avoidance feature. That should amount to
some culpability.

Who is going to want to work as an autonomous vehicle backup driver now? I
don't think it's reasonable to expect a person to be attentive 100% of the
time, given the nature of the task (just sit and wait for a problem).

------
seesawtron
>>Tempe police found in its investigation that Vasquez's personal cell phone
had been streaming a TV show at the time of the crash.

I am curious how they figured this out. Were ISPs made to share the internet
history for this device?

------
thedudeabides5
No comment on her particular case, but interesting to think that buried in the
End User License Agreement is fine print that says (to some effect):

"If you rely on our product for autonomous driving and accidentally kill
someone, images from your car's internal cameras will be used against you in
the court of public opinion, as well as the court of law."

Like, I'm not sure this is something people who buy a 1996 Toyota Camry really
have to worry about.

[https://images1.phoenixnewtimes.com/imager/u/blog/11494117/v...](https://images1.phoenixnewtimes.com/imager/u/blog/11494117/vasquez-1.jpg)

------
z3t4
Good to know that the safety equipment in the car (camera etc.) is there for
the car company's protection, not the driver's or pedestrians' protection.

------
pvelagal
I am wondering whether the company responsible for the self-driving
software/hardware should be liable? If not 100%, then to what extent?

~~~
ping_pong
This is going to be what happens in the future, if Autopilot runs over a
person. Tesla will do their best to prove that your hands weren't on the
wheel, and they weren't at fault, just like when that Apple engineer died on
101.

~~~
pvelagal
I think insurance companies have a role to play here. They should refuse
coverage when the driver is not operating the vehicle 100% of the time. To
them it is two drivers splitting responsibility. Insurance/law enforcement
must demand additional coverage for self-driving cars: a separate coverage for
the "software driver". In addition, DMVs should test each self-driving car for
a special driver's license, just as they test a regular driver.

------
kag0
The car was speeding before it hit the woman, and the driver was most probably
watching TV before/as it happened. I can see how this would be bad press for
self-driving cars, but this really seems like the error is on the driver.
Maybe Uber failed to adequately communicate to her what her role was in the
car, but otherwise it seems like pretty open and shut negligence.

~~~
fernly
The article says the collision happened at 39 mph. Was that over the limit for
that location?

If so, the (design of the) automated car should be partly at fault. Self-drive
cars should never exceed a speed limit.

The defense attorney can argue that if the car had gone at legal speed for the
location, the driver would have had more time to react, while the victim might
have survived a slower impact. To which the prosecutor might reply, the safety
driver had responsibility to keep the speed legal, along with everything else.
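The intuition that a lower speed buys the driver time and the victim survivability can be made concrete: reaction distance grows linearly with speed, but braking distance grows with its square. A rough back-of-the-envelope sketch in Python, using assumed illustrative values for reaction time (1.5 s) and braking deceleration (7 m/s², roughly dry pavement) that do not come from the article or the NTSB report:

```python
# Rough stopping-distance model: a fixed reaction time at constant
# speed, followed by constant deceleration to a stop. All parameter
# values are illustrative assumptions, not crash-investigation data.

MPH_TO_MS = 0.44704  # miles per hour -> metres per second


def braking_distance(speed_mph, decel=7.0):
    """Distance covered while braking: v^2 / (2a)."""
    v = speed_mph * MPH_TO_MS
    return v * v / (2 * decel)


def stopping_distance(speed_mph, reaction_s=1.5, decel=7.0):
    """Reaction distance (v * t) plus braking distance."""
    v = speed_mph * MPH_TO_MS
    return v * reaction_s + braking_distance(speed_mph, decel)


if __name__ == "__main__":
    for mph in (30, 39, 45):
        print(f"{mph} mph -> ~{stopping_distance(mph):.0f} m to stop")
```

Because the braking term scales with v², slowing from 39 mph to 30 mph cuts braking distance by roughly 40% under these assumptions, which is the gist of the "more time to react, slower impact" argument.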

~~~
throwaway0a5e
The speed limit is simply irrelevant from a safety perspective.

The number on a sign is just that, a number. Sure you can use it in the
absence of all else but it should be tossed if there's any other information
to use.

Self-driving cars should be going as fast as whatever is normal for the rest
of the traffic in those conditions and at that time. Creeping along at 55mph
in the right lane while 75mph traffic dodges you is anything but safe. Safety
is achieved through homogeneous traffic flow, and in practice that means not
having a minority of traffic going very fast or very slow, nor a bimodal
distribution.

Modern civil engineering doctrine is to set the speed limit based on traffic
speed and to control the traffic speed with the road design/features so "but
the law" holds about as much water as a net if the goal is safety.

That said, the court cares about the letter of the law (as it should) so your
proposed defense may in fact be beneficial.

Edit: Whether I care matters not.

~~~
turtle-san
>The speed limit is simply irrelevant from a safety perspective.

That depends on the type of road. On an interstate or highway your argument
holds, but in urban and suburban areas, where there are side-streets, stop-
signs, traffic lights, crosswalks, bicyclist, pedestrians that is not the
case.

The faster a driver is going, the less time they have to respond to new
information; this is clear. I also argue that with most cars going fast and
not giving the car in front of them enough space, a car is less likely to slow
down when it should (like when passing a bicyclist, or making a turn across a
crosswalk before checking who is in it or approaching it). I also argue, as
does the commenter below me, that anomalously speeding vehicles can be
dangerous to anyone using or crossing a road with a decision-making process
that relies on cars going a certain speed.

My experience as a bicyclist on some roads, where there are 4 narrow lanes (2
in each direction), with a 30 mph speed limit but where the cars drive 40-45
mph, is that I can have two cars both pass me at the same time on my left,
passing very close, because the car farthest on the right does not slow down
and get into the left lane.

My experience as a walker has cars turning into my crosswalk at high speeds
all the time. Left-hand turns are particularly bad.

I would also argue that higher speed limits are more stressful for everyone
involved. Not nice for bicyclists, pedestrians, exercisers, scooter riders,
people going for a stroll or sitting outside, or even those living next to
such a road. Not even nice for other drivers who just want to go the speed
limit and not get run off the road by impatient drivers. For this reason as
well (quality of life in urban and suburban areas) I argue that speed limits
matter.

------
kfarr
ITT: Fundamental Attribution Error. Suggested reading: Normal Accidents by
Charles Perrow

------
modzu
was the vehicle operator an uber employee testing this thing? or like a
regular uber driver? the article doesn't say, but i do know she was
transgender. how is that relevant?

------
chasd00
i agree with the comments that say she's at fault. However, and hindsight is
always 20/20, there should have been two people in the car.

------
rudiv
When Ambani's driver went to jail in place of his son his family got crores. I
wonder how much Uber's gonna give this person and/or their family?

~~~
scott31
Why should Uber give them anything?

~~~
rudiv
Why shouldn't Uber give them anything? Also, who's them?

------
miduil
What a transphobic article, omg. Why highlight that the person is transgender?
Why her deadname? So rude. (These are rhetorical questions, please don't
answer them.)

~~~
miduil
Of course you're getting down voted on HN for pointing out transphobia. :(

------
paul7986
Disgusting that Uber isn't getting the same, along with the heads who spun/put
this killing machine on the road.

Pays to be rich and well connected! Gross!!!

