
Arizona Uber crash driver was 'watching TV' - room271
https://www.bbc.co.uk/news/technology-44574290
======
rayiner
Uber should get charged with manslaughter. I don't say that lightly. This
isn't an example of programmer error, where an alpha-version self-driving
system failed. That would've been unfortunate, but not grossly negligent.
Here, Uber put a "driver" in control of the car that _by design_ wasn't able
to avoid collisions with pedestrians. According to the NTSB, Uber disabled
_Uber's own_ emergency braking system (not just the built-in Volvo one) “to
reduce potential for erratic behavior.”[1] That rises to gross, criminal
negligence.

It is no defense to say that Uber also told a human driver to be present. If
Uber had directed a five year old to drive the SUV, with a back-up driver
ready to take control, I think everyone would agree that qualifies as gross
negligence notwithstanding the presence of the back-up driver. (To be clear, I
think some risk in the name of progress is acceptable. But it's one thing to
use testing on public roads to work out the kinks in the software. It's
another to put vehicles on the road that _you know_ cannot perform the basic,
essential functions of driving.)

[1] [http://www.latimes.com/business/autos/la-fi-uber-arizona-ntsb-20180524-story.html](http://www.latimes.com/business/autos/la-fi-uber-arizona-ntsb-20180524-story.html)
("However, Uber also disabled its own emergency braking function whenever the
test car was under driverless computer control, 'to reduce potential for
erratic behavior.'").

~~~
xvf22
Criminal negligence seems to me more appropriate than manslaughter. They
disabled safety features and replaced them with their own, which failed to
function correctly. The whole testing program seems to have been rushed, and
corners were cut.

~~~
yock
The devil is in the details, as they say, but what you describe is not
negligence. They didn't disable a safety feature and do nothing; they replaced
a safety feature with something else. You'd need to make the case that this
something else was deficient in some material way _and_ that management
shipped this feature knowing it was materially deficient for this to be
negligence.

Why don't we instead start with the individual who was supposed to be
monitoring the road and the vehicle's systems? It's well known at this point
that they in fact _were_ negligent in their duty.

~~~
woodrowbarlow
they disabled both the built-in emergency braking system and their replacement
emergency braking system. they absolutely did disable a safety feature and did
not have anything in its place.

~~~
yock
They still had a person in the car who was supposed to be operating, in
effect, as an emergency braking system.

~~~
rayiner
As you acknowledge, the human was only the secondary driver. That meant that
Uber put a primary driver in the car that was _designed to not stop for
pedestrians._

------
jackvalentine
> The Tempe police report said the crash was "entirely avoidable" if the Uber
> operator, Rafaela Vasquez, had been watching the road while the car was
> operating autonomously.

And thus should end the bewildering “but you couldn’t see her until the last
moment” and “you don’t understand how bad cameras are in low light” arguments.

~~~
spuz
Yes, I don't understand why so much focus has been put on the failure of the
technology rather than the failure of the backup driver. We all know that
self-driving technology does not exist yet; that is why the backup driver is
there. If you have driven at night, you will know that a human can see much
further than what is shown on the video footage of this incident, so it
should be clear that the human backup driver was to blame. Unfortunately for
her, the fact that she was not only not watching the road but was distracted
by a TV show makes any claim that she was not negligent very hard to defend.

~~~
finnthehuman
> Yes I don't understand why so much focus has been put on the failure of the
> technology rather than the failure of the back up driver.

It was well known all along that humans are poorly equipped to remain actively
engaged in situations that require split-second reactions, while exerting zero
effort or control for hours on end.

Air traffic is as safe as it is because the accident analysis always pushes
past "pilot error" as the root cause. If one pilot can make the error, why
won't another? Did the system set them up to fail? Even if it didn't, could
the system be enhanced such that the failure (or class of failures) could be
prevented?

In the case of the human in Uber's self-driving car, they were set up for
failure, and in such a well-known way that any claim that Uber was not
negligent is very hard to defend.

~~~
jryle70
I believe all other driverless tests work the same way: sensors, but with a
human driver required at all times. Would you also consider Waymo, Cruise, et
al. negligent?

The technologies are not ready; they are being tested. Aircraft got to today's
level of safety only after many iterations and improvements, both in
technology and in process.

A terrible tragedy happened in this case. Hopefully it will help save a lot of
lives in future.

~~~
rayiner
There is a difference between putting technology out there that is "not ready"
and putting technology on the road that fundamentally doesn't do the basic
things that a car must do, such as brake for pedestrians. It's an unfortunate
but avoidable tragedy if a self-driving system fails due to a computer bug.
Here, Uber _disabled_ the self-driving system's emergency braking function:
[http://www.latimes.com/business/autos/la-fi-uber-arizona-ntsb-20180524-story.html](http://www.latimes.com/business/autos/la-fi-uber-arizona-ntsb-20180524-story.html).
As far as I know, Waymo is at least theoretically capable of braking for a
pedestrian in autonomous mode.

~~~
malandrew
The overwhelming majority of vehicles on the road today don't brake for
pedestrians. Only a few models of cars sold in the past few years have
features to automatically engage braking.

[https://www.consumerreports.org/car-safety/where-automakers-stand-on-automatic-emergency-braking-pledge/](https://www.consumerreports.org/car-safety/where-automakers-stand-on-automatic-emergency-braking-pledge/)

------
msravi
I suspect that it's harder to take over and avoid a collision in a self-driving
car than in a car that you're continuously in control of. You first have
to recognize that the system is failing or about to fail, and that has to
happen well in advance for you to take appropriate action. That doesn't seem
like a reliable failover procedure, even if the person behind the wheel is
paying attention.

~~~
narrowtux
> You first have to recognize that the system is failing or about to fail

For that to work, I'd like cars with features like this to show you what
they're seeing. Tesla does this a bit by showing all the cars it has detected
on the dashboard, but it should go further, and other car brands should do
the same.

Without that, you're right: you cannot be sure whether the car is going to
react to something on its own.

~~~
Someone1234
I'm legitimately surprised nobody has a HUD that simply displays all detected
objects/potential obstacles. They have this data, just pipe it up to a HUD.

~~~
KineticLensman
If the driver has to watch a very complex display, why not present it in a
format they will understand without training, such as a processed video image?
Here [1], [2] are some (very optimistic) mock-ups. There was another one
released in the last few days, but I can't find the link. Showing anything
like the raw data, with rapidly changing risk annotations, would get tiring
very quickly, especially as it masks the view the human is actually able to
understand: the raw view.

[1] [https://futurism.com/self-driving-car-video/](https://futurism.com/self-driving-car-video/)

[2] [https://next.reality.news/news/augmented-reality-cars-companies-tech-driving-us-into-future-0182485/](https://next.reality.news/news/augmented-reality-cars-companies-tech-driving-us-into-future-0182485/)

------
Jaruzel
A related BBC article[1] states:

 _"A toxicology test carried out on Ms Herzberg after the accident returned
positive results for methamphetamine and marijuana.

She did not look before crossing the road in a poorly lit area and was wearing
dark clothes, the NTSB report says. And the reflectors and lights on her bike
were at right-angles to the Uber car's path."_

Even though the self-driving software failed to recognise her, and while
totally not excusing the Uber driver's lack of attention, you cannot rule out
that the pedestrian would have been hit even under normal circumstances with a
non-automated car.

Although this is a sad event, the pedestrian does carry a certain amount of
blame here. It also shows that the biggest blocker to effective self-driving
vehicles is people, not technology.

---

[1]
[https://www.bbc.co.uk/news/technology-44243118](https://www.bbc.co.uk/news/technology-44243118)

~~~
mental1896
Why are we treating the jaywalking here with such a light touch? If the
pedestrian had crossed at a crosswalk, we wouldn't be having this discussion
at all.

~~~
ceejayoz
Jaywalking shouldn't carry a death sentence.

If the car can't handle a jaywalker, it can't handle a toddler running into
the street. That's a problem that shouldn't be hand-waved away.

~~~
vorpalhex
Can a human driver handle those things?

I had a near-collision when I was headed to work one day. I was going the
speed limit on a major four lane road. I noticed ahead of me that a car was
stopped in a neighboring lane. It wasn't clear if it had broken down or what,
so I started slowing down. Cue a group of kids jaywalking, and running in
front of my car causing me to slam on my brakes.

Had I not seen or reacted to the already stopped car (or had it simply not
been there), I probably could not have stopped in time. Not that I wouldn't
have absolutely tried my best, but physics dictates that a compact SUV going
at 45 mph doesn't come to a stop immediately.
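A rough back-of-the-envelope sketch bears this out. All figures here are assumed for illustration (a 1.5 s perception-reaction time and a dry-road friction coefficient of about 0.7), not taken from the incident report:

```python
# Rough stopping-distance estimate: reaction distance plus braking distance.
# Assumed values (reaction time, friction coefficient) are illustrative only.

def stopping_distance_m(speed_mph, reaction_s=1.5, friction=0.7, g=9.81):
    """Total distance in metres to perceive a hazard and brake to a stop."""
    v = speed_mph * 0.44704                      # convert mph to m/s
    reaction_dist = v * reaction_s               # distance covered before braking starts
    braking_dist = v ** 2 / (2 * friction * g)   # from v^2 = 2*a*d, a = friction * g
    return reaction_dist + braking_dist

print(round(stopping_distance_m(45), 1))  # → 59.6 (metres, roughly 195 ft)
```

Under these assumptions the car covers roughly 60 m between seeing the hazard and stopping, which is why having already slowed down for the stopped car made the difference.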

That doesn't mean we shouldn't evaluate whether Uber seriously messed up and
should face dramatic consequences - it certainly seems like they disabled a
critical safety system that would have saved the pedestrian's life.

Rather, we shouldn't make grandiose statements like "Jaywalking shouldn't
carry a death sentence" that don't really touch on the facts of the case. We
should instead ask, "Did Uber's negligence cause this to result in a death
when it shouldn't have been one?"

~~~
SmellyGeekBoy
> Can a human driver handle those things?

I'll point out that in this case the crash was "entirely avoidable" according
to the official police report.

------
tga
Contrary to many here, I feel that it's perfectly reasonable to have a backup
human driver _if they are trained to act accordingly_.

The Japanese pointing-and-calling technique comes to mind as a good example of
keeping drivers engaged: they would have to continuously, actively point at
dangers and at the car's appropriate response.

[https://www.youtube.com/watch?v=9LmdUz3rOQU](https://www.youtube.com/watch?v=9LmdUz3rOQU)
(quite fascinating to watch)

Combine this with short sessions (not driving around for hours with nothing to
do), and I think the driver would have had a reasonable chance of preventing
this accident.

~~~
will4274
> I feel that it's perfectly reasonable to have a backup human driver if they
> are trained to act accordingly.

Agreed. I think the right analogy is an airplane pilot. Operating an airplane
and operating an autonomous car both require special training and professional
operators. Replace car crash with plane crash. My judgement is predicated on:
did Uber train their drivers? Did Uber evaluate their drivers before allowing
them to operate on their own? What policies did Uber have in place for
drivers? What punishment did drivers typically receive for violating policies?
Did Uber occasionally review footage of the driver to verify driver
compliance?

It's hard to expect a layperson to pay attention for hours on end without
input, but somebody appropriately trained should be able to do so.

------
soziawa
> The Tempe police report said the crash was "entirely avoidable" if the Uber
> operator, Rafaela Vasquez, had been watching the road while the car was
> operating autonomously.

Absolutely no surprises there. Everyone who has ever taken a picture in the
dark should have known this after seeing the footage from the car.

~~~
uptown
The videos shot in the same location by another camera give another version of
how well-lit those roads are:

[https://arstechnica.com/cars/2018/03/police-chief-said-uber-victim-came-from-the-shadows-dont-believe-it/](https://arstechnica.com/cars/2018/03/police-chief-said-uber-victim-came-from-the-shadows-dont-believe-it/)

Visibility looks considerably better than what the Uber video portrays.

~~~
andyv
I live there, that road is well-lit. The Uber video is misleading.

------
pjc50
Proposed solution: it should be an offence for the manufacturer to describe a
car as "self-driving" or "autonomous" if it is not capable of doing so
_entirely_ by itself. Systems which rely on the car driving 99% of the time
and then throwing up its hands in order to make the human responsible for the
crash are a ridiculous abdication of responsibility.

This system would have to be described as "driver assist".

~~~
dzhiurgis
Do you actually think a single line of description is going to change human
behaviour?

~~~
HenryBemis
YES!! Unfortunately, people take marketing buzzwords at face value. Since the
industry is not yet ready to release a true "autopilot", it should not be
using these words. People die so that Uber, Tesla and others can abuse these
words.

Someone died that day because that "safety operator" was watching TV. Imagine
how 'extra happy' the cyclist's family is right now.

~~~
acdha
We already have plenty of evidence that this is true, too: people have died
using the Uber and Tesla systems, while nobody has treated the Volvo safety
system Uber disabled as absolving them of responsibility for driving, and
those systems have been in drivers' hands for years now.

Not calling it an “autopilot” is a huge psychological cue that it's an
emergency safety measure rather than something you should rely on.

------
wepple
Any vehicle that is not the highest level of autonomy (ie has no steering
wheel, or no need for one) should have systems in place to verify that a human
driver is alert, in charge of the vehicle, and able to respond immediately -
eye focus cameras, steering wheel sensors, confirmation prompts, etc.

Is there a valid reason this shouldn't be law?

~~~
megaman22
Then what's the point? Might as well just drive a car the way we always have
then, and avoid all this complicated crap.

~~~
icebraining
The point is to improve the vehicle until it _is_ at the highest level of
autonomy.

~~~
megaman22
Well, fine, but I'm gonna run old vehicles that haven't been infected by this
until then.

Having to babysit the self-driving car is a non-starter.

------
O1111OOO
Another facet of this is how quickly people adopt new technologies - whether
they're proven to be safe (or good) or not. Personally, I've evolved into not
trusting technology, not trusting the people behind most of the technology
being produced today.

In contrast, this woman was so quickly at ease (I wonder what she was told
beforehand, during her training) that she felt comfortable enough to watch TV.
I also wonder, as this kind of tech progresses, how it will be sold to the
public. Perhaps the same: "we take your security seriously"...

------
linsomniac
Paying a "safety driver" to sit in the car seems like a small price to pay if
it means this (minimum wage?) person takes the manslaughter charge instead of
Uber.

~~~
sandworm101
>> minimum wage?

Lol. You can get a real non-robot driver for that price. The whole point is to
get some sort of contractor non-employee to work below the minimum wage.

------
Spivak
Can we put this issue to bed once and for all? Humans are not sufficiently
equipped to act as a 'backup driver' in emergency situations and any system
which relies on such a thing for safety is inherently unsafe.

Doesn't matter if you glue our hands to the steering wheel and hold our eyes
open; if we're not doing anything 99% of the time, we won't be ready to react
with split-second timing to recover from some failure.

------
mcguire
The NTSB preliminary report directly contradicts this article.

[https://www.ntsb.gov/news/press-releases/Pages/NR20180524.aspx](https://www.ntsb.gov/news/press-releases/Pages/NR20180524.aspx)

On-scene police reports are often unreliable. That is why the NTSB does not
speculate before the investigation is completed.

------
helmsb
I had an Uber driver who was watching an extremely graphic and violent movie
on a phone mounted directly in his field of view. I had him drop me off early
and reported it to Uber. From talking to friends, it's apparently becoming
more and more common.

------
hi41
This incident reminds me of a post Nicholas Carr wrote. We are offloading
critical activities to automation, but at critical times human expertise is
needed to resolve dangerous situations. The driver put too much confidence in
the automation and watched a TV show on her phone.

------
Molaxx
So they'll drop it on the driver?! The whole setup was an accident waiting
to happen. Uber executives should be held responsible, or this will happen
again and again.

~~~
FireBeyond
Right. Uber's position will be / is, simultaneously: one, "it's not our
fault, that's why we had the human driver there"; and two, "in order to cut
costs, we used to have a human passenger there to record anomalies while
driving - we axed that and now require the human 'driver' to do so, whilst
also being responsible for the safety of the vehicle".

------
bitL
Why manslaughter, though? Wasn't the victim the real offender, crossing a road
at an unmarked spot without paying attention to oncoming traffic? It's surely
a safety failure of the self-driving car tech, but the offense came from the
victim. Especially given that some human drivers can't avoid similar
situations either (see some videos of Chinese driving)...

~~~
ollie87
Maybe it's different where you live, but where I live the only place where a
pedestrian doesn't have right of way is a motorway.

~~~
emodendroket
One of the most jarring things about visiting Niagara Falls in Ontario was a
big sign at the crosswalk warning that vehicles, rather than pedestrians, have
the right of way. I don't see any good justification for making things this
way except to basically give a free pass to anyone who kills a pedestrian.

~~~
wilsonnb2
It's always made perfect sense to me that vehicles _should_ have right of way
on roads. They're intended for vehicles. Giving pedestrians the right of way
on the road makes about as much sense as giving cars the right of way on the
sidewalk.

~~~
emodendroket
The crosswalk is made for vehicles? How is anyone ever to cross the street?
Why shouldn't the person operating the dangerous machine requiring a license
be subject to greater responsibility, for that matter?

~~~
bitL
[http://www.ncsl.org/research/transportation/pedestrian-crossing-50-state-summary.aspx](http://www.ncsl.org/research/transportation/pedestrian-crossing-50-state-summary.aspx)

"Arizona: Vehicles must yield the right-of-way to pedestrians within a
crosswalk that are in the same half of the roadway as the vehicle or when a
pedestrian is approaching closely enough from the opposite side of the roadway
to constitute a danger. Pedestrians may not suddenly leave the curb and enter
a crosswalk into the path of a moving vehicle that is so close the vehicle is
unable to yield. Pedestrians must yield the right-of-way to vehicles when
crossing outside of a marked crosswalk or an unmarked crosswalk at an
intersection. Where traffic control devices are in operation, pedestrians may
only cross between two adjacent intersections in a marked crosswalk."

~~~
emodendroket
This particular incident concerns Ontario.

