
California DMV: Self-driving cars must have driver behind wheel - not_that_noob
http://www.mercurynews.com/business/ci_29262037/california-self-driving-cars-must-have-driver-behind
======
donpdonp
It's nonsensical to think a human can take over at a moment's notice during a
particular maneuver. It's the threat of death or an accident that makes me pay
attention to every moment of driving. On autopilot I won't have that kind of
attention. It's simply a trait of the human attention span.

That being said, every self-driving car needs a manual override, because there
are plenty of situations that can't be easily explained to a computer. The
override could be done with a steering wheel and gas pedal, or just a joystick
mode on the touchscreen pad.

For instance, someone has a big empty lot and wants to move a car from row
A to row B. The new spot is exactly 50 feet to the right. I just want to back
the car up, then go forward while steering over to the right a bit. Row B has
no 'street address' to program into a route computer. I could click on a map,
but the map doesn't know where on the grass I want to drive.

Even with lidar, etc., there are grey areas that self-driving algorithms are
going to have trouble with. Drop a sheet of tissue in the path of a self-
driving car moving at 60MPH. The lidar will see it as a solid object and might
decide a very dangerous plan - maximum braking - is the best response, when
going through it is actually the best plan.

~~~
not_that_noob
This already happens today with cruise control - and yes, the driver will have
to be paying attention while driving, at least in this version of the rules.

~~~
mindslight
Cruise control still requires one to mostly drive, and the only "moment's
notice" thing a human would do with cruise control would be to stomp the brake.

Regardless of what any rule says, stand-by humans will not be paying attention
and ready to drive. Even if reading/sleeping/daydreaming were successfully
policed, the human will still be out of touch with how the car responds to
controls.

Think of the first time you drive after not having driven for a few months -
now imagine being already moving at highway speed.

------
Animats
CA DMV's position is basically "let's try it with human backup for 3 years,
then re-evaluate". That's reasonable enough.

Tesla may have set the field back. They released their "autopilot" as a beta,
didn't provide sensors to enforce driver hands-on-wheel, shipped a system that
works well only on freeways but will engage on other roads, and claimed that
if it crashes, it's the driver's fault. Everybody else shipping similar
capabilities (BMW, Mercedes, Cadillac, etc.) has put more restrictions on them
(such as a hands-on-wheel sensor) to avoid over-reliance on the automation.
Now Tesla has had to remove some features from their "autopilot".[1]

Google may be able to get approval for their 25MPH max speed mini-car
operating without driver hands-on-wheel, on the basis of the slow speed.
Google's system is much more advanced, and has a detailed model of what
everything else moving around it is doing.

Expecting the human driver to take over if the automation fails will not work.
The limits of human monitoring are well known from aircraft automation. It
takes seconds to tens of seconds for a pilot to grasp the situation and
recover properly if the autopilot disconnects in an upset situation. That's
for pilots, who are selected and trained much better than drivers, and who go
through elaborate simulator training in which failures are simulated. Watch
"Children of the Magenta"[2], where a chief pilot at American Airlines talks
to his pilots about this.

[1] [http://www.bizjournals.com/sanjose/news/2015/12/16/tesla-to-limit-self-driving-functions.html](http://www.bizjournals.com/sanjose/news/2015/12/16/tesla-to-limit-self-driving-functions.html)

[2] [https://www.youtube.com/watch?v=pN41LvuSz10](https://www.youtube.com/watch?v=pN41LvuSz10)

~~~
knughit
Planes are more complex than cars. Drivers react to road surprises far faster
than 10 seconds when deciding to brake or swerve.

~~~
truncate
That reaction is not necessarily right; it's just a reflex, which can produce
more disastrous results than no reaction at all. Reflexes can't be trusted
much, particularly when I'm not fully aware of the situation around me (and I
think one is unlikely to be fully aware of the situation in a self-driving car).

------
chrisfosterelli
> That cautious approach requires that the cars have a steering wheel, and a
> licensed driver must be ready to take over if the machine fails.

The article takes the tone that this will drastically limit the technology,
but honestly at this point a restriction like this seems completely
reasonable.

The cars are _not good enough_ to completely drive themselves in all
scenarios. At times, they will decide they can't handle a situation and safely
ask a driver to take over. This is becoming more and more rare, but it's
definitely still the case. Until the technology is proven to basically never
need human intervention, it seems reasonable to require a person who can drive
to be present.

~~~
Lawtonfogle
For cases where the car detects a malfunction, this probably won't be a big
deal. But in the most extreme situations, where a human needs to take over
and stopping the car won't fix things, will a human be able to take over
quickly enough to solve the issue? We are often talking about split-second
decisions, made by a human who likely wasn't paying any more attention to the
road than a passenger.

Imagine a car where the passenger's side is fully set up to drive, and with
the push of a single button, the driver can swap all control to the passenger.
I'm having a hard time thinking of situations where this is better than just
stopping the car.

~~~
mamon
Hmm, that makes me wonder how driving lessons work in the US. In Poland it is
a legal requirement that a car used for that purpose have a second set of
pedals on the passenger side, where the instructor sits.

When I took driving lessons, my instructor was perfectly capable of pressing
the brake or grabbing the wheel in time when I was about to do something
stupid. I don't see why a manual override of an autonomous car should be harder.

I do see, however, that this can easily defeat the purpose of using an
autonomous car: if the driver must be fully concentrated the whole time,
ready to react, then it is even worse than driving a normal car.

~~~
telotortium
It works the same way in the US. However, in that case, the instructor is
normally paying attention to the road and the student's driving, to grade them
if nothing else. Usually, passengers can't be relied on for that level of
attention.

------
abduhl
How will this impact one of the key arguments for self-driving cars:
elimination of DUIs?

If a driver is required behind the wheel, able to take over at a moment's
notice, then the self-driving car's purpose is defeated - you cannot do work
while the car drives, you can't use the car as a sober ride home from the bar,
you can't take a nap on your commute, etc.

This is a legislated death sentence for self-driving cars as we envision them.

~~~
gkoberger
While I agree that it defeats many of the purposes (including Uber-style
services), remember that this legislation is in response to current state of
self-driving cars. Once they get better, legislation can become more lax.

~~~
abduhl
As a general rule of thumb, legislation never becomes more lax.

~~~
gkoberger
That's a silly thing to say. Uber and AirBnb were illegal when they launched.
You couldn't use your phone on an airplane during takeoff until recently. The
original speed limit for cars was "no faster than walking".

With the lobbying power of Google, Tesla, Apple, and all the car
manufacturers? It will change.

------
not_that_noob
Even though it doesn't look like it, this is actually great news since this
provides a path for Google et al to get out there and build broad customer
comfort around the technology.

And frankly, I don't ever foresee driverless cars that don't have some form of
manual override - there will always be situations that won't make sense to the
computer but make perfect sense to me.

==== Edited to add this clarification, as people on HN seem overly ready to
pick nits. ====

The larger point is that humans will be humans and will never buy a car that
will interfere with their freedom to do what they think is right at that
moment.

And if you want a better example - maybe I want to speed to get to the
hospital in an emergency. Maybe I want to block another car that's leaving the
scene of a crime. You get the idea.

~~~
pc86
> _park in this no parking zone blocking traffic for 30 seconds while I run to
> get something._

No? Either park legally or go without. You are literally saying to everyone
else "me saving 5 minutes by not having to park is more important than
whatever it is you're doing."

~~~
mindslight
Actually since the "driver" had not left the car, it would be considered
"standing" not "parking", at least by MA law.

~~~
knughit
Only if the "driver" can decide to move out of the way if needed, without the
passenger directing it

------
mtgx
Sounds like a good idea for the first 10 or so years of self-driving cars.
Remember, not all self-driving cars will be made by Google or Tesla. Some will
even be made by spaghetti-code Toyota, cheating Volkswagen, or we-send-
updates-over-unencrypted-HTTP BMW. It's best not to jump in head-first
(literally?) on this.

And it's not just about the "safety" code of these self-driving cars, either.
Many of them probably still won't care too much about hiring top-notch
security experts in the first few years, to protect their cars against remote
hacking (which could be much easier with self-driving cars if they are
connected to the Internet).

------
rabboRubble
If I have to be responsible for AI and mechanical errors, I don't want an AI
driving my car.

Either I want to read a book or take a nap, completely zoned out, or I want to
be alert and paying attention to the road.

Sitting there doing nothing bored and disengaged, not able to read a book, not
actively engaged in the activity of driving, and also held responsible by the
state for failures is the worst of both worlds.

~~~
mquander
If that AI has, say, one-third the risk of accidents that I would have, then I
want it driving my car even if I enjoy it slightly less.

~~~
rabboRubble
It's not that I would enjoy the experience less; it's that the experience of
just sitting there would render me an inattentive dullard with very poor
response to a mechanical or AI failure. If I'm supposed to be responsible per
the State of California, I need to be paying attention. Just sitting there,
inactive, mile after mile, hour after hour, doing nothing, maybe for months or
even years at a time, I will be woefully unprepared to respond to an emergency.

I actually gave up driving for 16 years then restarted when I last moved. What
I found when I restarted was that, while physically I could operate the car,
my decision making skills when faced with a problem were non-existent. I could
see problems coming my way, but the reflexive decision making was gone. I had
to relearn the decision making process.

Designing the system to place responsibility on humans to take over only
occasionally will ensure that the system creates catastrophic results when
humans are found to be unprepared for the responsibility.

------
daniel-cussen
Brilliant idea on the DMV's part. I'm reminded of Douglas Adams's thought, in
The Hitchhiker's Guide to the Galaxy, that solving a problem ("Now that it's
perfectly air-conditioned...") shouldn't allow the removal of redundancy
("...you won't need to open the windows, ever."). Because the problem won't
REALLY be solved in all cases, and if the driver finds himself in an ugly
situation where he cannot act, there's no steering wheel with which to handle
the convenient killing machine he's riding. Can you imagine the horror?

~~~
nulltype
We can include an emergency bike in the trunk so you can still get where you
are going.

~~~
daniel-cussen
Bikes have the highest practical value of 100% computer-free objects, so this
looks like a reasonable solution.

------
joesmo
This would be a huge setback if approved, simply because once approved, such
things are unlikely to change even when the time comes that human drivers are
not only unnecessary but the most dangerous component in all situations. I
understand the state's motives: liability, fear of bad press, keeping traffic
cops employed (and thus the biggest avenue for random arrests), etc., but none
of them justify limiting the technology simply to serve the state's rather
short-sighted, selfish needs. At the same time, the people who could benefit
most from driverless cars - the elderly, the disabled, youth, etc. - are shit
out of luck. And all this without a single accident yet attributed to
driverless cars (afaik).

------
mindslight
This is the exact wrong idea. Frankly, consumer-ready self-driving cars should
be _prohibited_ from changing into "manual mode" without a user request. And
the _manufacturer_ should be liable for an at-fault accident in auto-drive
mode. Until these conditions can be satisfied, self-driving cars are simply
not ready for prime time.

The proposed rule will lead to proliferation of shoddy "99%" software designed
around an expectation that a human can take over at a moment's notice. But
humans don't work that way, so any such exceptional condition will likely
result in a crash. The only acceptable failure procedure for a self-driving
car is to stop.

~~~
dawnbreez
See my comment above on two situations where coming to a stop is actually more
dangerous.

TL;DR: If the car fails while driving on a highway or crossing a busy
intersection, stopping will cause a wreck.

~~~
mindslight
Sure, coming to a stop is dangerous in those situations if other drivers don't
react appropriately. It's not a great idea, but I also don't mean that self-
driving cars should be doing such emergency stops regularly. And obviously
they should do it only as an extreme last resort, when things like pulling
over have failed.

I still think it's more reasonable to stop in either of those two places than
to suddenly hand control of a car traveling 75mph to a human who hasn't driven
in a year or longer.

edit: concrete example: You're biking along. Would you rather have A. the car
in front of you come to an abrupt stop, or B. a car passing you at twice your
speed be controlled by a drowsy driver who doesn't know where the edges of his
car are and doesn't have a feel for how the steering wheel reacts?

~~~
dawnbreez
A fair point. The argument really comes down to this: Is it likely that a
malfunction will occur while the auto-car is closely followed by another car?

~~~
mindslight
Well, I wouldn't say "likely" comes into it. I'm explicitly describing what
happens with a choke_and_die() function if say the software crashes with no
recovery, or major sensors are damaged - unpredictable by definition. The "I
don't know how to navigate this situation" case would be handled by gracefully
pulling over, and should _rarely_ happen because developers would be
encouraged to work out the kinks rather than having such a terrible user
experience be common.

My point is that giving developers an easy out of forcing a human to take over
just encourages sloppy non-robust software. And if there's anything that
consumer technology companies are good at creating, it's non-robust software
that has no place in a car.
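The failure hierarchy being argued for here can be sketched in a few lines of Python. This is a toy illustration, not anyone's actual control software: the failure categories and the policy (unrecoverable crash or sensor loss triggers a last-resort stop, "can't navigate" triggers a graceful pull-over, and manual mode is entered only on explicit user request) are taken from the comments above, while the names and return strings are invented.

```python
from enum import Enum, auto

class Failure(Enum):
    SOFTWARE_CRASH = auto()   # unrecoverable crash - unpredictable by definition
    SENSOR_DAMAGE = auto()    # major sensors lost
    CANNOT_NAVIGATE = auto()  # "I don't know how to handle this situation"

def failure_response(failure, user_requested_manual=False):
    """Pick a fallback action. Manual mode only ever follows a user request."""
    if user_requested_manual:
        return "manual_mode"
    if failure in (Failure.SOFTWARE_CRASH, Failure.SENSOR_DAMAGE):
        # Extreme last resort: controlled stop in lane.
        return "emergency_stop"
    if failure is Failure.CANNOT_NAVIGATE:
        # Should be rare if developers work out the kinks.
        return "pull_over_gracefully"
    return "continue_autonomous"
```

The key design point is what's absent: there is no branch that silently hands control to a standby human.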

------
thinkingkong
Does the requirement mean there must be a control unit at all times? It would
be neat to keep it as an option but also allow the console to hide the wheel,
perhaps replacing it with some smaller fly-by-wire replacement.

~~~
tomschlick
Maybe not hands-on-the-wheel control, but I would at least want a wheel that
is ready to go just by grabbing it, same with the brakes. If they collapsed or
folded into the dash, that could lead to issues where a software glitch
doesn't let you take back control and the robot overlords end you.

------
jryan49
I feel like it could be more unsafe this way. What if a human gets scared,
takes control, and overreacts when the car is properly handling the situation?
Might cause more accidents.

EDIT: reworded sentence so it actually made sense :P

~~~
maxxxxx
The human who drives a self-driving car now should know what he's doing. These
cars are prototypes for now, and whoever takes them on the road should be
responsible.

------
state
When I read the geohot article [1] earlier, I was thinking that there was no
way what he was working on was legal. I'd be curious to have someone with more
expertise weigh in, though.

Does this make comma.ai [1] or cruise [2] more acceptable in the eyes of the
state?

1 -
[https://news.ycombinator.com/item?id=10744206](https://news.ycombinator.com/item?id=10744206)
2 - [http://www.getcruise.com/](http://www.getcruise.com/)

~~~
Animats
California rules (summarized):

A manufacturer may conduct testing of autonomous vehicles on public roads in
California if:

(a) The manufacturer is conducting the testing.

(b) The vehicle is operated by an autonomous vehicle test driver who is an
employee ...

(c) The manufacturer can cover $5M in damages via insurance, bond, etc.

(d) Manufacturer’s Testing Permit from DMV.

Current California permit holders:

    
    
        Volkswagen Group of America
        Mercedes Benz
        Google
        Delphi Automotive
        Tesla Motors
        Bosch
        Nissan
        Cruise Automation
        BMW
        Honda
        Ford

------
ctoth
Welp. As a blind person who has been looking forward to self-driving vehicles
since the 2004 DARPA grand challenge, this is a not totally unexpected kick in
the balls.

------
ars
Most driving is fairly routine, but without strong AI I don't see how a self
driving car can handle things like:

There is a car accident on the side of the road, and there is a human cop
waving cars one at a time to drive past on the wrong side of the road.

I am part of a funeral procession, or a police-escorted parade, and I need to
drive through red lights in order to stay in the procession. (For that one, at
least, I know from the start to take manual control - except Google wants a
car with no wheel, and in a world with lots of such cars, people won't know
how to drive.)

Or, there is an ambulance behind me sirens wailing and in order to get out of
the way I must drive illegally for a moment (through a red light, or on the
wrong side of the road).

How about something as simple as driving the car up some ramps so I can
service it? How would you do that without both a steering wheel and experience
in how to drive?

I don't see how a self driving car can handle any of those.

~~~
cheez
Self-driving cars should only need to handle 90% of driving tasks. For the
rest, a qualified human can and should be required to take over.

~~~
ars
That leaves humans without enough experience to actually know how to drive. It
takes years of driving to get good at it.

If people drove only 1/10 as much as they do now, no one would ever get good at it.

It's called _"The Paradox of Automation"_ and it's a huge problem.

Basically, the better the automation, the less often humans do anything, but
that means when humans _do_ have to do something, it's much more critical than
usual. Which is exactly the kind of situation the humans have no experience
with, since they take control so rarely.

Basically, some automation is good, but a lot of automation is actually
detrimental to safety. For each industry you have to find that cutoff point
and do no further automation past it, until you are able to reach 100%
perfection.
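The cutoff-point argument can be made concrete with a toy model. Every number below is invented for illustration: the machine drives some fraction of miles at a low per-mile crash rate, humans drive the rest, and the human per-mile rate is assumed to grow with the automation fraction (rustier humans handling the hardest leftover cases).

```python
def expected_crash_rate(automation_fraction,
                        machine_rate=0.01,
                        base_human_rate=0.1,
                        handoff_penalty=9.0):
    """Toy per-mile crash rate under partial automation (all numbers made up).

    The human rate inflates with the automation fraction: less practice,
    and the cases left to humans are exactly the hard ones.
    """
    human_rate = base_human_rate * (1 + handoff_penalty * automation_fraction)
    return (automation_fraction * machine_rate
            + (1 - automation_fraction) * human_rate)
```

With these made-up parameters, 50% automation is riskier per mile than no automation at all, and only full automation beats the all-human baseline - the paradox in miniature. The real cutoff obviously depends on numbers nobody has yet.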

------
lutorm
Maybe the cars should be required to pass a CA driver's test? If it's good
enough to test a driver, why not a car?

~~~
Zikes
This reminds me of the recent discussion about Baidu's expulsion from an image
recognition competition. They submitted too frequently, causing their machine
learning algorithm to solve specifically for the test, rather than for general
purpose image recognition.

In your proposed scenario, knowing the CA driver's test constraints, the
engineers could easily (and perhaps even inadvertently) design a driver AI
capable of passing the test with flying colors yet failing horribly once faced
with real-world driving.

That's not to say that's actually what would happen, only that the test would
be rather useless for certifying a self-driving car.
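The teaching-to-the-test failure mode above can be shown with a deliberately silly sketch: a "model" that memorizes a fixed benchmark scores perfectly on it while knowing nothing else. The filenames and labels are invented.

```python
# A fixed, known benchmark - analogous to a published driver's test.
known_test = {"stop_sign.png": "stop", "yield_sign.png": "yield"}

def memorizing_model(image):
    """Aces the fixed benchmark by lookup; clueless on anything new."""
    return known_test.get(image, "no idea")

# Perfect score on the benchmark...
test_accuracy = sum(memorizing_model(x) == y
                    for x, y in known_test.items()) / len(known_test)
# ...yet the model has learned nothing about inputs outside it.
```

A real system wouldn't memorize so literally, but the same effect appears whenever developers can iterate against a fixed evaluation, which was the point of the Baidu disqualification.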

~~~
uremog
Another win for Test Driven Development!

------
yarrel
, servant walking 20 paces in front carrying red flag.

