
Waymo self-driving car crashed when driver fell asleep and triggered manual mode - fixermark
https://qz.com/1410928/waymos-self-driving-car-crashed-because-its-human-driver-fell-asleep/
======
polskibus
That tester detected a serious bug in the whole concept, inadvertently risking
his life during testing. He should've been promoted.

~~~
resonanttoe
Agreed, but let's not get ahead of ourselves.

The statement is that the "driver no longer works for Waymo", not that "they
were fired for the incident."

That feels like a deliberate choice of words to me.

~~~
randyrand
clearly picked up by a competing firm for presumably a large salary increase

~~~
nmeofthestate
Headhunted by Casper Mattresses.

------
feefie
On 26 Sep. 1982 in 'Knight of the Phoenix' when Michael falls asleep KITT
automatically takes over.

Google should have a camera pointing at the driver and use machine learning to
detect someone who is asleep or medically impaired, automatically taking over
if the car starts to leave its lane.

Perhaps the driver could configure where they want the car to go in such a case
(if asleep: continue to final destination; if medical issue: drive to nearest
hospital and call "Parent" or "Spouse" on cell phone and explain the
situation; if smart watch can detect alcohol in your sweat therefore
unconsciousness likely due to intoxication: drive to a trusted friend and call
ahead, etc.).

At the very least, if the person appears to be asleep it could ask "Are you
sure you want to return to manual control?". It should be able to tell the
difference between someone sleeping and someone alert wanting control back.
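The if-asleep/if-medical routing described above amounts to a configurable
policy table mapping driver state to an action. A minimal sketch, assuming a
hypothetical driver-state classifier; all state names and actions here are
invented for illustration:

```python
# Hypothetical sketch of the driver-state policy described above.
# The states, actions, and defaults are invented for illustration.
from enum import Enum, auto

class DriverState(Enum):
    ALERT = auto()
    ASLEEP = auto()
    MEDICAL_EMERGENCY = auto()
    INTOXICATED = auto()

# Configurable policy: driver state -> action the car should take.
POLICY = {
    DriverState.ASLEEP: "continue_to_destination",
    DriverState.MEDICAL_EMERGENCY: "drive_to_hospital_and_call_contact",
    DriverState.INTOXICATED: "drive_to_trusted_friend_and_call_ahead",
}

def choose_action(state: DriverState) -> str:
    """Return the configured action; an alert driver gets control back."""
    return POLICY.get(state, "return_manual_control")
```

The "Are you sure?" confirmation in the last paragraph would then just be a
guard in front of the `return_manual_control` branch.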

It's been some decades since fiction inspired us, but we're getting close!

~~~
andrewflnr
Maybe later, but there are an awful lot of fuzzy heroic heuristics in there.
For now, I'd say the only safe thing to do when the driver falls asleep is
come to a safe stop.

~~~
Aeolun
A safe stop on the highway?

~~~
andrewflnr
Under the circumstances, I'd accept just slowly rolling to a stop with the
hazard lights on. People behind the car will hate it, but probably no one will
die.

~~~
erezyehuda
Sadly, I can still imagine someone crashing into a static car on the highway.
Normal-speed continuation, mid-lane stopping, and attempting to navigate to
the side of the road all seem to have some pretty heavy risks. I do wonder if
a low-speed continuation would be in any way suitable.

------
k_sze
And this is why engineers invented the dead man's switch a long time ago:
[https://en.wikipedia.org/wiki/Dead_man%27s_switch](https://en.wikipedia.org/wiki/Dead_man%27s_switch)

The basic idea is that the switch can only be maintained in a non-braking
state by an alert, not-incapacitated person. If you press too hard, it brakes;
if you don't press hard enough, it brakes. You have to apply just enough
force, and that's something only an alert person can do.

Clearly, using the gas pedal to turn off auto-pilot is not good enough.
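The force-band behavior described above can be sketched in a few lines; the
thresholds and names below are invented for illustration:

```python
# Sketch of the dead man's switch force band: braking is suppressed only
# while the applied force stays inside a window that an alert, upright
# person can hold. Threshold values are invented for illustration.
MIN_FORCE = 2.0   # below this, the grip is too weak (driver asleep/slumped)
MAX_FORCE = 8.0   # above this, the grip is too hard (driver collapsed onto it)

def should_brake(applied_force: float) -> bool:
    """Brake unless the force is inside the 'alert driver' window."""
    return not (MIN_FORCE <= applied_force <= MAX_FORCE)
```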

By the way, we now have a pilot program in Hong Kong where cameras installed
in buses use computer vision to detect whether a bus driver is dozing off. The
system then automatically vibrates the driver's seat and sounds an alarm to
wake the driver up.

~~~
dsamarin
How do you expect a driver to maintain a dead man's switch the entire time
they are driving in manual mode?

Or, conversely, suppose the switch must be maintained while the car is in
autonomous mode: should the car pull over and stop on the side of the freeway?
In terms of user experience I would expect the car to continue to the
destination. The commenter discussing the use of sensors to detect the state
of the driver and act accordingly makes more sense.

~~~
jackvalentine
> In terms of user experience I would expect the car to continue to the
> destination.

For a production car, sure. But these are test drivers in test cars.

------
lolsal
I feel like an old man shaking my fist at the slow march of inevitable
progress - but - this is exactly why 'drive assist' or any other flavor of
automated driving that is not 100% automation is not useful to me. If I have
to pay attention and babysit piloting the vehicle, I might as well be driving
(assuming I'm of sound enough mind and body to actually drive).

~~~
icebraining
Waymo agrees with you, that's why their custom prototype car (the Firefly)
didn't even have a steering wheel.

~~~
dekhn
The Firefly was a joke. They had to install steering wheels shortly after
getting them. The vehicles saw very little use.

~~~
icebraining
I disagree it was a "joke"; I'd say it was a concept car - their vision of the
future. They need steering wheels because they're still testing them, but the
Firefly is the goal.

~~~
dekhn
they made a few, then had to install wheels after it was pointed out that it
was illegal in CA to drive without them, i.e. a lack of basic due diligence.
And they've reverted to conventional cars as a platform because that makes
more economic sense. Fine to have a goal, but if you never get there and make
basic bungles along the way, it's a joke.

------
fernly
I love this understated line:

"He regained alertness once the car crashed..."

Yeah I bet he did.

------
Pxtl
That's tricky. So you need to be able to take control of the car quickly in an
emergency, but at the same time you want the human pseudodriver to be free to
be distracted or drowsy, and so in a state where they could accidentally hit
this button.

~~~
bryanlarsen
That would be classified as "Level 3" autonomous, aka "driver must be
available to take over controls". Waymo has publicly stated that it feels
Level 3 autonomy is unsafe and that its goal is Level 4, "driver not expected
to take control".

Waymo currently has Level 3 cars, which is why they employ safety drivers and
haven't made their cars publicly available.

~~~
dmoy
I thought they also had some L4 cars without steering wheels and stuff. Those
must not have manual mode (right?! Or what? How?).

I have no idea how to know exactly though. Waymo is pretty tight lipped about
this. I work at Google and I don't know for sure.

~~~
williamscales
They had the little bubble cars running around for a while that didn't appear
to have a steering wheel but I believe they were controllable via a joystick.

------
throwbacktictac
>> At a company meeting to discuss the incident, one attendee reportedly asked
whether safety drivers were on the road too long, and was told that drivers
can take a break whenever they need to.

I hope they also have reasonable mandatory breaks. It would seem that there is
pressure to stay on the road as long as your peers, which could lead to
drivers not taking breaks when they should.

~~~
lovich
"Whenever they need to" for breaks usually translates to "whenever they need
to find another job". HR/PR might say it, and they might even believe it, but
the company will incentivize management for output. Those managers are going
to be pushing their employees to do as much as possible with as few breaks as
possible.

Unless they set up a completely separate chain of command that is in charge of
validating whether an employee is legitimately being fired, it's always going
to end up following the incentive structure.

------
userbinator
_The dozing driver didn’t respond to any of the vehicle’s warnings, including
a bell signaling the car was in manual mode and another audio alert_

Even if he did respond, there's still the problem of whether he can respond
_in time_. Driving normally requires sub-second reaction times.

------
xoa
While the lack of serious injuries here makes it easier to be objective about
it, I think it's actually good to see one of the suspicions about self-driving
cars borne out in real-life testing: this is an area where safety isn't a
smooth linear progression but more of a step function, or something with a
valley before rising. That means Waymo's approach is in fact the best one:
either stick to Level 1, maybe a bit of Level 2, or bite the bullet, go
directly to 5, figure out how to bring the statistical rates there (combined
with the positive utility function) to a point where they're flat-out superior
to human drivers, and _then_ sell that. This is actually unusual for much of
the history of tech, I think, where the (not unwarranted) common wisdom tends
to be that gradual improvement, live testing, and maturing over years is more
effective than aiming for big leaps all at once. But there are exceptions that
prove rules, and this seems to be one of them: overall safety is a combined
function of human driver quality and car AI quality, but the two are
non-linearly inversely linked, in that the more the AI takes over, the more
the human loses the ability to stay focused and practiced, and furthermore has
to deal with context shifts, which are inherently slow and would be required
exactly when things are most serious. So it's a lot more all-or-nothing.

None of that should take away from the value of pursuing it, however. Humans
are in fact bad drivers, and have large numbers of failure modes, like getting
inebriated or sleepy or angry, that a system can simply avoid entirely. But
arguably even _more_ important than safety, which is what tends to come up
most, is _utility_ (after all, we put up with vastly less safe cars and driver
practices for decades and decades before the present time). Personal
arbitrary-point-to-point mechanized transportation is effectively required for
full participation in much of the modern world; it's just the danger and
expense that makes it a quasi-right rather than a full one. The young and
elderly being unable to take advantage is already a suboptimal state of
affairs, and it's frankly a waste of everyone's valuable and limited human
brain time that they're required to act as uncreative biocomputers in that
role. It's all been worth it anyway, because it's just that valuable, but
removing the human's required role is really just fulfilling what the tech
should have been like from the beginning, if our synthetic mechanical
development hadn't far outstripped our synthetic information collection and
processing development. A certain amount of ongoing casualties and damage is
entirely acceptable to achieve a societal improvement of that level of value,
even if it weren't much safer at all.

------
swlkr
I'm starting to think it might be cheaper to lay down more rail lines or
something, like Japan. I don't mind driving the last few miles, or biking.

~~~
icebraining
You might not mind, but someone who can't drive or bike does.

~~~
fixermark
... as does the person whose house gets knocked down to make room for the new
rail line.

------
paul_milovanov
apropos, Tim Harford is an economist and an author of multiple exceptional
books on economics targeted to the general public, not a reporter. He manages
to show how deeply fascinating economics is and that economic phenomena are
part of the fabric of not just our societies, but the very natural world. I
highly recommend his "Undercover Economist" and other books.

~~~
kwhitefoot
He also does a rather good podcast on the BBC, "More or Less: Behind the
Stats":
[https://podcasts.files.bbci.co.uk/p02nrss1.rss](https://podcasts.files.bbci.co.uk/p02nrss1.rss)

------
Razengan
This instantly evokes ideas about distinct technologies from different sources
coming together to form a greater whole:

Things like the Apple Watch monitoring your awareness and activity level, and
sending it to the car which uses that information to influence how it drives,
or when it vibrates the seats to keep you awake or wake you up, and so on.

------
jessriedel
Given that this was a crash with no injuries, I think Waymo's severity-
weighted crashes per mile are better than a typical human's. Anyone know the
numbers on this?

------
wiradikusuma
No surprise there. Some people, e.g. me, will feel sleepy in the passenger
seat.

------
jacknews
doh!

There's a lot more 'understand humans' to add to these systems before they're
really ready.

------
kalleboo
It's so sad how these companies spend billions on developing this technology
and then still cheap out on the test drivers by having them drive solo, for
hours that are too long (and presumably at minimum wage).

~~~
Fricken
Waymo's vehicle operators make $20 an hour. The driver was about an hour into
his shift when he dozed off.

~~~
toomuchtodo
The lack of any requirement for driver input is what causes the operator to
drift off or fall asleep. It's the same reason they put test/dummy positive
weapon images that aren't physically present into the x-ray image stream at
security checkpoints: to keep the operator alert.

If you have a human overseer in the seat, you want to find ways to continue to
keep them engaged even if it isn't necessary for the vehicle's autonomy.
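The x-ray trick mentioned above (sometimes called threat image projection)
amounts to mixing known synthetic positives into the operator's stream at a
low rate as a vigilance check. A minimal sketch; the function and names are
invented for illustration:

```python
# Sketch of 'threat image projection': randomly inject known synthetic
# positives into an operator's review stream so misses can be measured.
# All names and the rate are invented for illustration.
import random

def inject_synthetic(stream, synthetic_items, rate=0.02, rng=None):
    """Yield (item, is_synthetic) pairs, mixing in synthetic positives."""
    rng = rng or random.Random(0)  # fixed seed here for reproducibility
    for item in stream:
        if rng.random() < rate:
            # Inject a known positive before the next real item.
            yield rng.choice(synthetic_items), True
        yield item, False
```

An operator who stops flagging the injected positives is, by construction, no
longer paying attention.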

~~~
Fricken
Well, the driver did provide input: he touched the gas pedal, which disengaged
the autonomous system and put the vehicle in manual mode, and that's
presumably what you want to have happen.

When it's time to disengage the vehicle, you don't want to be fighting for
control with the computer; manual override needs to be on a hair trigger in
case the computer makes a sudden error. Unfortunately, so long as there's a
human in the loop, the propensity for human error is going to be there. I
honestly don't know what Waymo could do to prevent this; multiple bells and
whistles went off.

Waymo takes this stuff very seriously, they absolutely do not want to have to
face the music should one of their vehicles get into an at-fault accident or
suffer some other kind of catastrophic failure.

I just did a Google News search for "asleep at the wheel", and in the last
week a few people have been killed, a few more hospitalized, and the more
minor incidents don't even get reported. Waymo sincerely wants to solve this,
but there's an awkward no-man's-land between no autonomy and full autonomy
where Vigilance Decrement[1] is an issue.

Fortunately this was a minor accident. Waymo's test fleet reportedly covers
25,000 miles a day, and in regular day to day driving minor accidents happen
about once every 160,000 miles. So if incidents like this are happening once a
week for Waymo, that's about par for the course.

[https://en.wikipedia.org/wiki/Vigilance_(psychology)#Vigilan...](https://en.wikipedia.org/wiki/Vigilance_\(psychology\)#Vigilance_decrement)
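The back-of-the-envelope comparison above checks out: at 25,000 miles a day
the fleet covers about 175,000 miles a week, which at the cited rate of one
minor accident per 160,000 miles works out to roughly one per week.

```python
# Back-of-the-envelope check of the accident-rate comparison above,
# using the figures cited in the comment.
fleet_miles_per_day = 25_000          # reported daily test-fleet mileage
miles_per_minor_accident = 160_000    # typical human rate cited above

weekly_miles = fleet_miles_per_day * 7
expected_minor_accidents_per_week = weekly_miles / miles_per_minor_accident
print(round(expected_minor_accidents_per_week, 2))  # -> 1.09, i.e. ~1/week
```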

------
village-idiot
Poor UX, poor safety controls.

At this stage in the game, these prototype vehicles should absolutely detect
whether or not their operators are paying attention, since they’re part of the
safety system of the vehicle.

Also, did they not learn from Uber’s fatal crash? A sole operator in these
vehicles is a _terrible_ idea. One person to monitor road conditions and one
person to operate any computer equipment should be the standard until we hit
level IV or V autonomy.

~~~
duxup
What's the point if you have to pay attention?

I'm being serious when I ask. I question whether people can safely use these
vehicles in these modes where they're supposed to not interact ... but pay
close attention at the same time.

I think it is going to have to be full autonomy, or not at all.

~~~
mikestew
_I question whether people can safely use these vehicles in these modes_

Of course they can't. Watch people who've been looking at their phone while
the light's red. Watch how they drive after it turns green. Some of them drive
away like they're drunk, lurching and weaving, like they're reorienting
themselves to the driving task. On the rare occasions I get even slightly
engaged with the phone at a light, it's almost like I have to "reboot" back to
driving mode when the light goes green.

The part of the "throwing it in the human's lap at the last second" that
bothers me is that a lot of context gets missed when it's time to grab the
wheel. At least for me, I get context a little bit at a time. That guy in the
left lane is looking at his phone more than the road...<a few seconds
later>...big truck's going to want in this lane when his ends, leave
room...<more seconds>...I see taillights 12 cars ahead, what's going on? And
Level 3 is going to just hand me the wheel without me knowing any of that?
Yeah, I see why Waymo is shooting for Level 4.

~~~
duxup
Indeed.

You also miss the interaction between humans. If another driver and I look at
each other, we're at least aware of our existence together and I know what is
up.

If a driver doesn't look at me ... does he know? Does his car know? If he is
suddenly given the wheel ... I don't think humans can handle that.

