
California Proposes Rules For Autonomous Cars - phreeza
http://www.motorauthority.com/news/1073910_california-proposes-rules-for-autonomous-cars
======
tatsuke95
> _"The vast majority of accidents are due to human error."_

Obviously, as there is very little alternative to human error right now. But
there's no reason to believe that to be the case when you introduce autonomous
vehicles. There's a whole new variable.

Still, a required step for progress.

~~~
kennystone
Manufacturing defects, bad roads, poor signage - Government has optimized
these pretty well.

~~~
burgerbrain
Weather conditions as well. Black ice is one hell of a thing.

~~~
MaxGabriel
An interesting note from Sebastian Thrun--I think in the second office hours
of his CS373 course--Google's self-driving car can't currently drive in snow,
because snow covers up the road and the sensors can't see the lane lines
(though the car can adjust for rain).

------
kroo
Here's the proposed text, for anyone who's interested:
[http://leginfo.ca.gov/pub/11-12/bill/sen/sb_1251-1300/sb_129...](http://leginfo.ca.gov/pub/11-12/bill/sen/sb_1251-1300/sb_1298_bill_20120223_introduced.pdf)

Edit: As I read it, this is pretty straightforward, doing the following
things:

\- Makes it explicitly legal to operate an autonomous car on public roads, if
your car has met a safety standard yet to be devised.

\- Authorizes the establishment of safety standards for autonomous vehicles by
the California Highway Patrol.

\- Until these standards are devised, it does not prohibit autonomous cars
from operating on CA public roads.

"Autonomous Cars" in this case are defined fairly narrowly: a car capable of
driving "without active control and continuous monitoring of a human
operator".

------
adriand
Prediction: at some point, there will be an accident involving an autonomous
car. The event data recorder (aka the "black box") will indicate that the
human operator took control of the vehicle before the accident occurred. The
driver/passenger, however, will claim this was not true, and a lawsuit will
commence where there will be claims that the EDR was hacked or that the car
manufacturer/software provider modified the EDR to falsely blame the human in
the event of an accident.

~~~
brnstz
More generally, how can we evaluate an autonomous car's effectiveness in
avoiding an accident, if there is always a human sitting in the driver's seat?

I think most drivers would instinctively take control of the car if they felt
in danger, whether or not it's statistically in their interest.

------
VengefulCynic
I would really love this if it creates a minimally-invasive legal framework to
enable innovation and curb nonsense like pedestrians jumping in front of cars
and trying to sue the manufacturer when they get hit (hopefully while
providing a mechanism where manufacturers can indemnify their vehicles without
too much liability danger in such cases).

On the other hand, I have to say that I don't exactly have the greatest level
of faith in the California Assembly based on past performance. Here's hoping
they buck the trend and establish a framework to encourage rather than inhibit
innovation.

~~~
DanBC
Have cameras on the car. Have them record into a 120-second buffer. When the
car detects a crash, write the buffer to disk with a timestamp.
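
The buffering scheme could be sketched like this (a toy Python illustration;
the frame rate, class name, and file naming are all made up):

```python
# Toy sketch of the crash-buffer idea: keep only the last 120 seconds
# of frames in a fixed-size ring buffer, and dump it to disk (with a
# timestamp) only when a crash is detected. All names are illustrative.
import time
from collections import deque

FPS = 30
BUFFER_SECONDS = 120

class CrashRecorder:
    def __init__(self):
        # deque with maxlen discards the oldest frame automatically
        self.buffer = deque(maxlen=FPS * BUFFER_SECONDS)

    def on_frame(self, frame):
        self.buffer.append(frame)

    def on_crash(self):
        # Persist the buffered frames with a timestamp; nothing older
        # than 120 seconds ever exists.
        stamp = time.strftime("%Y%m%d-%H%M%S")
        return f"crash-{stamp}.dat", list(self.buffer)

recorder = CrashRecorder()
for i in range(FPS * 200):          # simulate ~200 s of driving
    recorder.on_frame(i)
name, frames = recorder.on_crash()
print(len(frames))                  # only the last 120 s survive: 3600
```

One nice property of the fixed-length buffer is that it partially answers the
privacy objection below: the car never holds more than two minutes of footage
unless a crash actually happens.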

~~~
robk
Ugh, as a driver I'd prefer not to have all my actions recorded regardless of
whether it's beneficial to me. This reeks of invasion of my privacy and
possibly opens me up to prosecution if the camera were subpoenaed in a
criminal proceeding for something unrelated to the car's function. Same reason
I don't leave a GPS trail everywhere by choice.

~~~
DanBC
In the UK we have automatic number plate recognition and a nationwide network
of CCTV. Some places can recognise a stolen car and dispatch police within 60
seconds. (Notably the City of Westminster and Heathrow airport.)

> _I'd prefer not to have all my actions recorded_

Fair enough, but it's too late for some countries.

~~~
burgerbrain
I mean no offence (I fully recognize that the UK is a sovereign state with
rules that largely satisfy its own culture and people), but I feel that if we
universalize this idea ( _"Fair enough, but it's too late for some
countries."_ ) the result will be nothing more than a race to the bottom.

------
drewblaisdell
Does anyone know if Nevada or California have discussed operating an
autonomous vehicle while intoxicated yet?

~~~
sheraz
:-) The SEO kings are going to be waaaaayyy out in front on those terms now.
Shall we start the domain rush?

------
mwerty
Are there companies working on this besides Google?

~~~
k-mcgrady
Car companies. Google certainly wasn't the first to be working on self-driving
cars, they just got a lot of time in the tech press for it. The big car
companies have been working on it for years. I can't find any details but I
remember hearing about some companies working on it about 8-10 years ago. More
recently BMW have been working on it:
[http://www.motorauthority.com/news/1072117_on-the-road-with-...](http://www.motorauthority.com/news/1072117_on-the-road-with-bmws-self-driving-car-video)

~~~
Game_Ender
There is a difference in complexity though. Most systems developed by the
automotive companies are just a step above highway lane keeping and adaptive
cruise control. Google's system handles city roads with pedestrians and more
complex traffic patterns.

There is a cost to that, of course: Google's system uses $100k of sensors and
isn't well integrated into the body of a car. This means it's farther off from
being seen in a production car. I could see it being sold as an aftermarket
add-on to specific cars, maybe even only to business customers who then want
to sell the service an automated car can provide.

~~~
k-mcgrady
You're right there is a difference. Google has better software. The car
companies have better cars. If Google does continue to push the technology I
hope they license it. I dread to think what an actual Google car would be
like. It's an interesting market that's been progressing for years. It seems
now that we are very close to seeing it in production models.

------
agumonkey
Is there any article/video (I didn't investigate) showing multiple autonomous
cars interacting? I'd be curious to see if there's any behavioral resonance
leading to epic havoc.

~~~
state_machine
DARPA Urban Challenge (2007):
[http://www.google.com/search?tbm=vid&q=darpa+urban+chall...](http://www.google.com/search?tbm=vid&q=darpa+urban+challenge)

Several teams competed, entering driverless cars which interacted with each
other and some vehicles operated by real people on a closed course. Carnegie
Mellon's Boss car completed all the DARPA conditions and won the grand prize.

~~~
agumonkey
Thanks a lot. I wish there was a higher density of cars. Also, different teams
=> different systems... that might introduce good jitter to avoid systemic
locks. Anyhow, it feels pretty safe... can't wait to see it become standard.

~~~
Game_Ender
One issue that I haven't seen discussed much is the risk of system-wide
control instabilities if everything is automated, as you have alluded to. I
also don't know if anyone is working on preventing crosstalk and interference
when you have several dozen vehicles, all with similar lasers and radars, in
close proximity. If you operate two of the popular Velodyne lasers (used on
the Google vehicles) near each other, you get a good amount of noise on both
sensors.

------
duaneb
Excellent. The sooner we get the laws into place, the sooner autonomous cars
become a reality. Does anyone know if California's laws are modeled after
Nevada's?

~~~
kroo
Looking at
[http://www.leg.state.nv.us/register/2011Register/R084-11I.pd...](http://www.leg.state.nv.us/register/2011Register/R084-11I.pdf),
it looks like Nevada decided to define a class of license (G) for operation of
autonomous vehicles. The CA legislation just makes operating an autonomous car
legal if it meets certain safety and performance criteria, which it doesn't
define (instead, it asks the CHP to come up with these rules). The way this bill
is drafted, it looks to me like CA would be much less restrictive of
autonomous car operation than Nevada.

------
noduerme
Has anyone written or thought deeply about all the ways that self-driving cars
could be tricked or hacked into causing accidents, kidnapping passengers,
driving off cliffs, running people over, or otherwise creating havoc? Seems
like it was just a couple years ago that Toyota was recalling cars over a
"sudden acceleration" problem.

A system like this is only going to be as good as the data coming to the car,
and given the knowledge that all cars will react a certain way to a certain
stimulus, it's a lot easier to design a low-tech hack that would kill a lot of
people. Here's one that comes to mind:

Given a two-lane road with a narrow shoulder and an embankment, place a small
boulder on the right side of each lane. A human being will either swerve off
the embankment or rip their transmission out on the rock. What's a bot going
to do?

~~~
SoftwareMaven
Or you could fill a truck up with diesel and fertilizer and blow it up. This
technology is far more likely to reduce deaths associated with cars than
increase.

Go back to the beginning of the 20th century and think how much damage could
be caused by creating a nationwide electric grid. People could electrocute
people at will. Personally, I'm kind of glad we went ahead with it.

~~~
noduerme
[EDIT] Again, I really don't understand why I'd be downvoted for asking a
legitimate question like this. It's rather disconcerting. While it's great to
think about a future in which this technology is safe and widely available, it
seems to me I'm being attacked for simply asking basic security questions that
I would ask about any large system that was going through trials prior to mass
rollout. And rather than hearing any specific answers, I'm just being attacked
and voted down. I think anyone who's spent time in IT would probably ask these
things first before they deployed a new system in their company office, so I
don't think it's unreasonable to ask them about a potentially game-
changing social innovation.[/EDIT]

That's true. But blowing up a diesel truck is hard to do remotely, and
requires someone to actively attack at a specific place and time.

I'm not sure why I got downrated for my post; I'm just asking, doesn't this
create a lot of security holes and attack vectors that need to be studied
before allowing it? I mean, I haven't heard anything about security, at all.
The focus seems to be on a safe driving experience, but where's the white
paper on counter-hacking measures? Can you imagine if Google launched Gmail
without any kind of plan to mitigate stolen passwords or hijacked accounts? As
it is, there are plenty of people who _do_ have their accounts hijacked.
Luckily, that doesn't lead to collisions and deaths.

Consider for a moment how many people run Windows and IE, with the latest
security updates, who are still vulnerable to zero day exploits. Consider how
many don't update their software and get swept up in botnets a few days later.
Now imagine each and every compromised PC has physical control over 2-3 tons
of rolling aluminum and steel, that can go anywhere on a public highway, with
human beings inside it.

An attacker who had taken control of a botnet of compromised autonomous cars
could drive swarms of them wherever they wanted by remote control.

Now, rather than downvote me, tell me what security protocols will be in place
to prevent the scenarios I've outlined.

~~~
kelnos
No network access. The computer that controls the car doesn't need to talk to
anything other than the car. That eliminates a slew of attack vectors right
there.

~~~
noduerme
Is "no network access" going to be part of the legal framework under which the
vehicles operate? Presumably they need to download maps from somewhere, along
with traffic updates, road hazards, etc. Most new cars have network
capabilities as it now stands. So it's unreasonable to assume that they won't
have any network access. And as we know, anything with network access can
eventually be rooted.

~~~
icebraining
My guess is that the basic systems (keeping the car from going off-road or
colliding, letting people take over, etc.) will run on a real-time barebones
OS on an embedded system, with a very simple and well-defined interface to a
full machine that runs all the crap like the UI, navigation systems and such.

I wouldn't discount having your car stolen remotely, but hijacking with humans
inside is unlikely to work, and so is crashing into things.
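
That two-tier split could look something like this (a hypothetical Python
sketch; the SafetyController name, command set, and speed limit are all
invented):

```python
# Hypothetical sketch of the two-tier split: a minimal safety layer
# exposes only a narrow, validated command interface to the untrusted
# navigation/UI machine. All names and limits here are invented.
MAX_SPEED_MPS = 35.0

class SafetyController:
    """Runs on the isolated real-time system; has final authority."""
    def __init__(self):
        self.obstacle_ahead = False   # fed by on-board sensors only

    def request(self, cmd, value):
        # The navigation/UI machine can only *request*; every request
        # is validated against sensor state and hard limits here.
        if cmd == "set_speed":
            if self.obstacle_ahead:
                return 0.0            # sensors veto the request
            return min(max(value, 0.0), MAX_SPEED_MPS)
        raise ValueError("unknown command")  # no other surface exposed

ctl = SafetyController()
print(ctl.request("set_speed", 50.0))  # clamped to 35.0
ctl.obstacle_ahead = True
print(ctl.request("set_speed", 20.0))  # vetoed: 0.0
```

The point of the narrow interface is that even a fully rooted navigation
machine can only issue requests the safety layer is free to refuse.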

~~~
waterlesscloud
If it's navigating according to a map that's downloaded from a network, upload
a faulty map to it. Then it will go wherever you want it to.

~~~
Game_Ender
These aren't "dumb" cars. For example, Google's have a high-fidelity laser
range finder that builds dense 360° point clouds 15 times a second. If the car
is given a bad map which sends it into a building or other cars, the collision
avoidance system will recognize that fact before an accident occurs and stop
the vehicle.

~~~
waterlesscloud
It's a lot more complicated than that. The car can't always stop when the map
disagrees with the sensor. There are any number of situations where that's a
bad idea.

~~~
zaphar
If a cars sensor says there is a wall in front of the car and it goes with the
map then some coder some where made a mistake. If the car sensor says there is
a cliff in front of the car and it goes with the map some coder some where
made a mistake.

replace wall and cliff with obstacle/dangerous environment of your choice and
the sentence will always end with some coder some where made a mistake. It's
really is as simple as that. sensor wins over map when it comes to avoiding a
crash. What possible condition can you come up with that would make it
desirable for a car to ignore it's sensors and go with what a map says is
supposed to be in front of it?
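
The priority rule could be sketched as something like this (a toy Python
illustration; the function and action names are made up, not anyone's actual
control logic):

```python
# Toy illustration of "sensor wins over map": the obstacle sensor
# overrides the map unconditionally for collision avoidance, while a
# map disagreement on open road degrades gracefully instead of
# stopping the car dead. All names are invented.
def next_action(sensor_sees_obstacle, map_says_clear):
    if sensor_sees_obstacle:
        # Sensor wins unconditionally: brake/avoid, then replan.
        return "avoid_and_replan"
    if not map_says_clear:
        # Map disagrees but sensors see open road: proceed cautiously
        # and flag the map as stale rather than halting.
        return "proceed_and_flag_map_stale"
    return "follow_route"

print(next_action(True, True))    # avoid_and_replan
print(next_action(False, False))  # proceed_and_flag_map_stale
```

Note the middle branch: it's one answer to the "nonsensical map means you go
nowhere" objection raised below, since a map/sensor mismatch on clear road
need not mean stopping.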

~~~
waterlesscloud
If the car stops every time the map differs from the sensors, then I just give
you a nonsensical map and you go nowhere. Or if the map is outdated, which
will of course happen.

If sensors say the road turns and you go with the sensors, what happens to the
navigation? Eventually they become irreconcilable. The car will completely
lose track of where it actually is, having only local (and perhaps some
limited amount of historical) sensor data.

The scenarios aren't just limited to "STOP or CRASH", there's a lot of subtle
ways things can go wrong.

~~~
buu700
> _If the car stops every time the map differs from the sensors, then I just
> give you a nonsensical map and you go nowhere._

..Yeah, as opposed to driving nonsensically? I think I'll take the car that
defaults to whatever won't kill everyone around me.

> _Or if the map is outdated, which will of course happen._

Assuming the cars download new maps on a regular basis, I'd consider this
situation pretty unlikely on any official road or highway (unless we are to
accept that in certain locations every car will consistently stop driving).

However, if this situation were to occur, option one: sync with the latest map
data; failing that (network issues, etc.), option two: pull to the side of the
road, stop, and enter manual mode.
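
That fallback order amounts to a tiny decision rule, something like (purely
illustrative; the names are made up):

```python
# Sketch of the fallback order for an outdated map: try to resync
# first; if the network is down, pull over safely and hand control
# back to the human. Names are invented for illustration.
def handle_map_mismatch(network_ok):
    if network_ok:
        return "resync_map"        # option one: fetch the latest map
    return "pull_over_manual"      # option two: stop safely, go manual

print(handle_map_mismatch(True))   # resync_map
print(handle_map_mismatch(False))  # pull_over_manual
```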

> _If sensors say the road turns and you go with the sensors, what happens to
> the navigation? Eventually they become irreconcilable. The car will
> completely lose track of where it actually is, having only local (and
> perhaps some limited amount of historical) sensor data._

What do you mean by this? Google Maps and most GPS navigation systems
recalculate routes perfectly fine.

