
On Google's self-driving car acident rates - gwern
http://ideas.4brad.com/google-releases-detailed-intervention-rates-and-real-unsolved-problem-robocars
======
brianmcconnell
Licensed pilot here (learned to fly in 1987). People need to look at self-
driving cars in the way pilots use an auto-pilot.

Auto-pilot is most useful in two situations. 1) long cross country legs where
there is not much flying to do (just maintain heading and altitude), so A/P
frees the pilot up to manage other systems, enjoy the view, etc, and
alleviates fatigue, 2) flying a precision instrument approach, reducing the
risk that the pilot will succumb to spatial disorientation in the setup for
(usually manual) landing.

With cars, auto-drive capability will be useful in reducing accidents in two
modes: 1) long duration highway driving where fatigue is a big issue, 2)
intervening to prevent a distracted driver from causing an accident (rear end
collision etc).

I'd be perfectly happy with a car that can drive itself in cruise mode on the
interstate, but requires an alert driver on local roads (with the added bonus
that if I am about to slam into something, it will brake to avoid or lessen
the impact).

Something for the liability crowd to consider: self-driving cars won't be able
to avoid every potential mishap, but they will be able to reduce their
severity. A car that can automatically brake to reduce its speed by 25% just
before impact will cut its kinetic energy by roughly half, and the
potential for injury by more still.
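The arithmetic behind that claim is just the v-squared dependence of kinetic energy; a quick sketch:

```python
# Kinetic energy scales with the square of speed (KE = 0.5 * m * v**2),
# so cutting speed by 25% removes more than 25% of the crash energy.
def energy_fraction_removed(speed_cut):
    """Fraction of kinetic energy removed by reducing speed by `speed_cut` (0..1)."""
    return 1 - (1 - speed_cut) ** 2

print(energy_fraction_removed(0.25))  # → 0.4375, i.e. roughly half
```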

~~~
stillsut
Your comment made me think: Could driverless cars kill the domestic airline
industry?

If I could get from NY to Florida in less than a day in my driverless car, and
at sub-$2-gas, who would want to do the security check-in, luggage ripoff,
flight delay thing when going less than 1000 miles?

~~~
knughit
Are you familiar with Amtrak.com?

~~~
grillvogel
at least on the west coast, amtrak routes are slower than driving and more
expensive than flying

------
underbluewaters
I think it's pretty clear that the bar for safety must be much higher (10x)
than human-level for these to be accepted by consumers. Auto accidents are
very common, and the first time someone has an accident with a car like this
everyone they know will hear about it. It will be terrifying. If these are
only as safe on average as a human driver then nearly everyone is going to
have at the very least a 2nd-hand negative experience.

The emotional response to these accidents is not going to be entirely
irrational either. If I have a minor accident with my traditional truck, I'm
going to probably have a good understanding of what went wrong and how I can
prevent future collisions. With an autonomous vehicle... software upgrade? I'd
rather take responsibility for my own safety if that's the case.

~~~
thrownaway2424
The idea that humans learn well from accidents stands contrary to all
available evidence.

~~~
ajkjk
What evidence? Certainly individual humans (anecdotally: like myself) do.

~~~
tyre
Look at insurance premiums. Premiums go up after an accident because it is
more likely that you are a poor driver and likely to have another accident.

~~~
snowwrestler
Insurance companies raise premiums after accidents because there is an excuse
to do so. It has nothing to do with your likelihood of having another accident,
and everything to do with charging you more so that they can attract new
customers with a lower price. Premiums go up even if the driver is found to be
not at fault.

~~~
aetherson
I've been involved in several reported accidents that were judged to be fully
the fault of the other driver, and my rates have never increased.

~~~
Spooky23
They will drop you for being unlucky if your accidents are too closely spaced together.

~~~
kyllo
Relatedly, my home insurance company dropped me for being unlucky enough to
get burglarized.

------
abalone
What about manufacturer liability? That's the critical question here.

It can't just be 2X or even 10X safer than humans if the manufacturer is
liable for the accident. That would EXPLODE the cost of these cars and
bankrupt companies -- even if just a small fraction of today's fatal accidents
due to driver error became Google's liability.

Consider that even single-vehicle accidents, which today might be the driver's
fault, could result in a manufacturer lawsuit.

This is why I think _fully_ autonomous without any human to intervene is a
pipe dream, because the safety level required for that is pretty close to
perfectly bugfree in an almost infinitely complex world of roads and
conditions. Just for liability reasons.

These test cars of course have human copilots. But it remains to be seen if
that really will grant manufacturers immunity for accidents. If the car is
perfect 99.999% of the time, wouldn't that train the human to trust it and not
pay as close attention? And then miss that 0.001% of the time when it hurts
someone? Would a judge find it reasonable for a human to stay vigilant and
responsible for that fatal corner case bug that cropped up after two years of
perfect autonomous driving?

Liability is The. Critical. Question.

~~~
DenisM
There is no explosion. The cost of collisions today is reflected in the car
insurance, so about $100/mo. If the manufacturer were to absorb this liability
at current incident rates, it would make cars about 30% more expensive
(assuming $300/mo loan payments). If Google reduces the (incidence * impact)
of collision by 10x, the cost will come down to $10/mo, or about 3% of the
total car cost.
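The parent's estimate can be sketched directly (the $100/mo premium and $300/mo loan payment are the commenter's assumed figures, not data):

```python
premium = 100.0       # assumed monthly cost of collision liability, priced via insurance
loan_payment = 300.0  # assumed monthly car loan payment

# If the manufacturer absorbs liability at today's incident rates:
share_now = premium / loan_payment
# After a 10x reduction in (incidence * impact):
share_10x = (premium / 10) / loan_payment

print(f"{share_now:.0%} now, {share_10x:.0%} after a 10x reduction")  # → 33% now, 3% after a 10x reduction
```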

And in return for the modest price increase you get a _self-driving car!_

~~~
abalone
You're forgetting about tort damages. Insurance does _not_ typically cover the
cost of killing or seriously injuring someone. Those damages go into the
millions, far exceeding the maximum coverage. Cases of serious injury and
death often end up bankrupting the responsible individual.

Also consider that there are more people who can sue per accident, because
humans aren't responsible anymore. Today if you're driving the car and get in
an accident and it's your fault, you can't sue anyone. Now you can. That's one
more party per accident, plus a whole category of single-vehicle accidents that
now result in lawsuits.

~~~
dragonwriter
> You're forgetting about tort damages. Insurance does not typically cover the
> cost of killing or seriously injuring someone. Those damages go into the
> millions, far exceeding the maximum coverage. Cases of serious injury and
> death often end up bankrupting the responsible individual.

And you're forgetting that, to the extent that such injuries can be laid at
the feet of manufacturers' defects, manufacturers are _already_ liable for
them, and since plaintiffs are unlikely to be able to fully collect from
drivers, manufacturers will often be sued _now_, so self-driving cars don't
actually change things that much.

> Also consider that there are more people who can sue per accident, because
> humans aren't responsible anymore.

Incorrect.

> Today if you're driving the car and get in a accident and it's your fault,
> you can't sue anyone.

Incorrect. If you get in an accident and suffer damages and you allege that
this is due to a manufacturer's defect, this is a basis for suit against the
manufacturer. Now, you won't _win_ if the manufacturer can show by a
preponderance of the evidence that the accident was your fault rather than
theirs. [0]

Similarly, quite likely, in a self-driving car regime, if your car crashes you
can sue the manufacturer and claim a defect was responsible, but if they can
prove it's the fault of something else (such as your own failure to properly
maintain the vehicle), they won't be liable to you.

[0] [http://www.nolo.com/legal-encyclopedia/product-liability-
cla...](http://www.nolo.com/legal-encyclopedia/product-liability-claims-
defective-cars-29648.html)

~~~
tanker
I think your logic is correct, but your implied outcome is probably wrong.

>"If you get in an accident and suffer damages and you allege that this is due
to a manufacturer's defect, this is a basis for suit against the
manufacturer."

Right now, this is an edge case. The self driving car may cause it to become
common.

~~~
dragonwriter
> The self driving car may cause it to become common.

Yes, because _exactly_ those cases where the driver (as operator, as distinct
from the -- often the same person's role as -- owner-as-maintainer) is
responsible now become the manufacturer's responsibility. But, again, that's
exactly what insurance covers.

And, again, we know the cost of that liability -- it's the cost of driver's
insurance.

~~~
snowwrestler
Driver's insurance is the cost of liability _to drivers_ , whose ability to
pay damages is practically capped by their own insurance coverage.

A major corporation does not have the same practical ability to cap their
damages. I am not convinced it's a straight comparable cost as you keep
saying. The two risks are not exactly comparable, thus I would not expect the
insurance costs to be the same.

------
absherwin
Minor crashes are even more frequent than the article estimates. Thus, the
real human accident rate is even higher: probably between 1 per 24,000 and
1 per 87,000 miles.

The VTI driving study[1] equipped 100 cars with sensors and was therefore able
to measure all crashes experienced. It directly measured 1 crash per 24000
miles. If we extrapolate based on the 17.4% police report rate, that suggests
1 per 87000 miles.

[1][http://www.nhtsa.gov/DOT/NHTSA/NRD/Multimedia/PDFs/Crash%20A...](http://www.nhtsa.gov/DOT/NHTSA/NRD/Multimedia/PDFs/Crash%20Avoidance/Driver%20Distraction/100Car_ESV05summary.pdf)
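A sketch of the arithmetic behind that range, combining the article's one police-reported accident per 500,000 miles with the study's 17.4% report rate:

```python
# Lower bound: crash rate directly measured by the 100-Car study's onboard sensors.
measured_miles_per_crash = 24_000

# Upper bound: if only 17.4% of crashes produce a police report, then one
# reported accident per 500,000 miles implies this many miles per crash overall:
report_rate = 0.174
extrapolated_miles_per_crash = round(500_000 * report_rate)

print(measured_miles_per_crash, extrapolated_miles_per_crash)  # → 24000 87000
```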

~~~
aetherson
That's an interesting study. I'd be cautious about drawing a one-line
conclusion from it. It is, however, a fascinating full read, and not very long
and not very technical, so I'd encourage people to read the whole thing.

Note the narrow demographic data and geographic data.

------
huangc10
I live in Mountain View and I see Google self-driving cars around all the time
(day and night). My personal feeling is that they look "different" and my
attention will deliberately focus on them (which may or may not affect my
driving when I'm around them).

Why not test with cars that look "normal" (think early-2000s Corolla)?
Wouldn't this further decrease the chance of possible accidents?

I guess what I'm saying is, whose decision was it to make it look like a toy
car and not just an actual regular car that doesn't divert my attention off
the road?

*edit for better readability

~~~
grillvogel
google is only capable of designing things that look like they were made by and
for 5 year olds

~~~
a_c_s
In this case the more non-threatening the cars are, the more likely people are
to accept them.

------
zachwooddoughty
Google cache:
[http://webcache.googleusercontent.com/search?q=cache:9UNd6-l...](http://webcache.googleusercontent.com/search?q=cache:9UNd6-lo4fwJ:ideas.4brad.com/google-
releases-detailed-intervention-rates-and-real-unsolved-problem-
robocars+&cd=1&hl=en&ct=clnk&gl=us)

~~~
Fuzzwah
I'm going to use this as a place to mention how ridiculous it seems to me that
drupal displays the database details on the error page:

> the username is drupal and the database server is localhost

I'm sure this isn't a huge information leak, but it still seems kind of silly
to me.

~~~
SomeCallMeTim
I used to use Drupal. At this point I never, ever recommend people use it.

Pregenerating most of your pages a-la Jeckyll is just so much better that I
don't know why anyone ever thought generating all of their pages with PHP was
a good idea.

I remember thinking it was a good idea, to be honest. But at this point I look
back and see how mistaken I was.

~~~
enzanki_ars
To help people trying to Google/understand the parent comment, it is spelled
Jekyll.

~~~
SomeCallMeTim
Foo. Too late to edit.

Thanks for the clarification.

------
DanFeldman
The CA DMV also publicly reports all accidents involving autonomous vehicles
[1]. Most accidents are caused by drivers taking over control manually, or by
other actors on the road driving erratically (or rear-ending the self-driving
vehicles). The entries on the page report Google, Delphi, and Cruise
Automation as having incidents, though the damage is minor in every case, if
there is any at all.

[1]([https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/auton...](https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/autonomousveh_ol316))

------
bduerst
>How does that number compare with humans? Well, regular people in the USA
have about 6 million accidents per year _reported to the police_ , which means
about once every 500,000 miles.

Aren't accidents only reported to police if there is a hit & run, or some
other criminal activity?

Seems like the wrong metric for comparison, given the way they define the
self-driving "accidents" and that the majority of human fender benders are not
reported.
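For reference, the article's per-mile figure is consistent with dividing total US vehicle miles traveled (roughly 3 trillion per year; that figure is my assumption, not from the article) by the 6 million reported accidents:

```python
annual_vmt = 3_000_000_000_000  # rough annual US vehicle miles traveled (assumed)
reported_accidents = 6_000_000  # police-reported accidents per year (from the article)

print(annual_vmt // reported_accidents)  # → 500000 miles per reported accident
```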

~~~
dsp1234
In some (most? all?) states, all accidents over a certain dollar amount or that
cause injury to any party are required to be reported to the police.

But the lack of police reports is why the article mentions that insurance
companies believe the accident rate is higher than the 'official' number.

Anecdotally, the number of times that someone has hit my car, and then asked
me not to report it in exchange for cash is non-zero.

------
aresant
When it comes down to differentiation, what do I care about if a robot is
driving me around?

Safety. Period.

The equation to get to safety is sensors + computing + map / map geometry.

The sensor differentiation will matter for a while, some OEMs will do very
well, but ultimately it will just boil down to the software.

So how is the future of this space not just "everybody licenses Google's
platform"?

If life/death safety is the differentiator (and measurable) I feel like the
writing is on the wall. Google is going to dominate this.

Maybe Musk sneaks in there with his brash approach of using his drivers as
test subjects but his position as a competing auto manufacturer seems less
compelling than supporting an "Android-like" effort from Google.

~~~
dperfect
> The equation to get to safety is sensors + computing + map / map geometry.

The equation to get to safety is actually what you mentioned + roads where
100% of vehicles are tied in to a common network orchestrating each vehicle's
movement in relation to the others and the environment. Without that,
differentiation in software is likely to lead to unanticipated dangerous
scenarios, just as the HFT market is susceptible to flash crashes... except on
the road, they will be _real_ crashes with the potential to be more
destructive than human-caused accidents.

Hopefully when that network becomes a reality, we can build it on top of some
kind of open, distributed platform (perhaps related to blockchain technology)
that no single entity can abuse.

~~~
jessaustin
That seems really brittle. The baseline for every vehicle should be safety in
every situation. (Sometimes that might mean: pull over to the side of the road
and stop.) Only once every vehicle is proved safe, can certain combinations of
vehicles negotiate compromises to safety in pursuit of other goals like speed.

~~~
dperfect
(responding to your comment and the other comment about a possibility of
attackers)

I agree that the baseline should be independent, autonomous, safe operation of
each vehicle without any network/cooperation, but in time it's going to go
beyond that. I have no doubt that it will happen.

Once independent operation is proven to be reasonably safe and dependable for
everyday use, it will have achieved a certain degree of safety and efficiency
- significantly better (at least in safety) than humans, but still prone to
occasional accidents (regardless of who/what is at fault), especially when
human drivers are still allowed to drive on the same roads.

At that point, people will demand even higher levels of safety (similar to the
nines of high availability), and greater efficiency (i.e., speed).
Unfortunately, with human drivers sharing the same road, those two demands are
at odds with each other; greater efficiency inherently reduces safety. At some
point though, the demand will be great enough that we'll have to adopt
something better - something with a higher level of coordination between
vehicles, and where only computer-controlled vehicles can participate.

It certainly won't be pushed or adopted universally in one step, and it'll
likely _never_ cover all road infrastructure. It will be adopted first on high
throughput routes, much like you have private toll roads today. Instead of
paying a toll (or perhaps in addition), your car will enter the roadway by
joining the network. Without network compatibility, you'll be unable to use
the road.

Once you've joined the network, however, your vehicle can literally drive at
breakneck speeds, avoiding other traffic by incredibly small margins, and all
with significantly higher safety than non-networked roads.

Of course having a single network controlling vehicles opens up a possibility
for attacks, just as modern commercial aircraft can be vulnerable to attack.
It will be an issue, but not one without reasonable mitigation strategies and
best practices. The "network" in question may not even be centralized in
nature, but rather a decentralized, local mesh network with safeguards in
place for each vehicle to detect and respond appropriately to situations where
the network seems to be guiding vehicles outside safe bounds. There are a lot
of ways that it could be implemented, but it'll be important (at least to me)
that it's not all controlled by one government/corporation/entity, but instead
rolled out as a set of interoperable industry standards.

~~~
jessaustin
Why such confidence in the future adoption of this system that you stipulate
will be both dangerous and inconvenient? This might work as a scifi plot
device, but it's not predictive.

[EDIT:] What if a moose crosses the road?

------
chad_strategic
Next time you pull up to a stop light, look to the left and then right. I'm
willing to bet the drivers on the left and/or right are looking at their phones.

It's getting worse... the phone, Bluetooth, the dashboard computer, traffic,
impatient drivers, etc.

I for one can't wait for these self-driving cars; they have got to be better
than the distracted drivers we have now.

~~~
Thirdegree
>Next time you pull up to a stop light, look to the left and then right. I'm
willing to bet the drivers on the left and/or right are looking at their phones.

And the dude between them is busy staring at his neighboring drivers rather
than the road!

------
tjl
Is Google only testing in California? It seems that the data for accidents was
given for the whole of the US, but it should be compared to California only. I
don't know if it's noticeably different or not, but I'd think that snow and
ice would increase the accident rate in some states.

Snow and ice I think would provide a challenge for self-driving cars.

~~~
illegalsmile
Turns out snow and ice provide challenges for cars driven by humans!

------
bertil
I am surprised that Google has not started running their software in a passive
recording mode on human-driven cars, to learn. They could increasingly get
feedback like “Our software would not drive that fast/that close to that curb”
and be exposed to far more dangerous and tough situations than their own fleet
currently encounters.

~~~
occamrazor
They have a vast number of human-operated vehicles with extensive telemetry
data: Street View cars.

------
fweespeech
[http://webcache.googleusercontent.com/search?q=cache:9UNd6-l...](http://webcache.googleusercontent.com/search?q=cache:9UNd6-lo4fwJ:ideas.4brad.com/google-
releases-detailed-intervention-rates-and-real-unsolved-problem-
robocars&num=1&hl=en&gl=us&strip=1&vwsrc=0)

Cache. It's down for me.

------
joe_the_user
How safe does a self-driving car have to be before it is allowed on the road?

I would say that's going to be a societal decision rather than an engineering
decision.

An interesting point is that the introduction of the private automobile itself
was an imposition on the social space of that time, and it was wildly unsafe,
with automobile accidents still being one of the leading causes of death in
this country.

If it's determined that self-driving cars will be allowed, ordinary drivers
will be forced to adjust to their presence and will have to learn their
quirks.

Certainly, cellular phones are realistically being used by a significant
fraction of drivers today, and if the accident rate has gone up at worst only
slightly because of that (not enough to outweigh other safety measures), it's
because non-cell users have adapted to the presence of the cell users, however
annoying that might be.

~~~
rhino369
>How safe does a self-driving car have to be before it is allowed on the road?

I'd argue that self-driving cars shouldn't just have to be safer than average
drivers. They should be safer than average drivers with computer driving
assistance. It's already happening: we can take the self-driving programs and
use them with human drivers to get a best-of-both-worlds scenario. Of course
we'd have to mandate that all new cars come with computer driving assistance.

The counter argument is that we accept the risk of cars now, why not let self
driving cars have the same level of risk. My answer is that the risk of cars
is so great we either shouldn't be accepting it or are only accepting it
because there isn't an easy alternative. A lot of Americans die on the roads.

~~~
maxerickson
It depends on the distribution of drivers that cause incidents. If the average
(mean) driver is involved in an average number of incidents, then fine, base
the safety estimation on them. If the average driver is involved in a below
average number of incidents, the safety standard should be tuned to the
drivers that are involved in incidents.
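A toy illustration of that point (the numbers are invented): when incidents are concentrated in a small subset of drivers, the mean incident count overstates what the typical driver does.

```python
# Hypothetical incidents per driver over some period; most drivers have none,
# a few account for nearly everything.
incidents = [0, 0, 0, 0, 0, 0, 0, 0, 1, 9]

mean = sum(incidents) / len(incidents)
typical = sorted(incidents)[len(incidents) // 2]  # middle-of-the-pack driver

print(mean, typical)  # → 1.0 0
```

Matching the safety bar to the mean would under-protect against the small group actually causing the incidents.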

~~~
melling
When we calculate averages, are we including the number of drunk driving
accidents, for example? 10,000 Americans are killed every year due to that. If
we calculate equivalent accident rates that include impaired humans, I'm not
sure that's good enough. Computer + human should be able to address this
problem, right? Reduce speed, pull over and stop, if someone can't drive
between the lines or they are driving the wrong direction on a road.

------
iamleppert
All of the brouhaha about self-driving cars is moot until real user testing
begins. I'm not talking about driving a Google-mobile around the streets of
Mountain View with an engineer in the driver's seat.

These vehicles (I would hope) are designed to drive around people in different
circumstances. They're software for use by people, and until we have realistic
tests with different kinds of people, ages, driving experience levels, etc.,
it's all academic.

I'm somewhat wondering if all the self driving car stuff by Google and Tesla
is primarily a vehicle (no pun intended) for marketing rather than actual tech
that will yield a real product.

~~~
bertil
It seems, from previous reports by Google, that their ability to understand
discrepancies between sensors relies on knowing the road very well: not just
the map, with nuances like “cars can go there, but it's mainly a pedestrian
street,” but also local undocumented habits. They mentioned details suggesting
that they would have to drive a lot in new cities to learn enough to make
their car safe.

The one that I remember is that they started driving in Austin and came
across a very local species of vehicle with a unique habit: the hipster fixie
rider with his stand-stop motion at red lights. Committed bikers do not have
freewheels and stay standing in place not by putting their feet down but by
rocking back and forth; that motion was new and not interpreted clearly by the
car, which hesitated, reading it as a false start. It's since been fixed. The
article didn't specify whether the shirt pattern had any meaningful weight in
the interpretation.

------
willvarfar
I can't wait for self driving cars. End of that discussion each time we eat
out about who is going to drive us home...

------
TazeTSchnitzel
If the humans always take the wheel in dangerous situations, how do you know
the cars are safe? The cases where they are most likely to get into trouble
are the cases where the AI is not in operation!

------
epaga
*accident

------
tim333
>we have to figure out just how to test these vehicles so we can know when a
safety goal has been met. We also have to figure out what the safety goal is.

Or you can just crack on and try them with users, like Tesla.

------
dsfyu404ed
The average human driver is consciously forgoing the option of driving in the
hyper-conservative manner that the google cars do...

------
f137
250,000 miles on average between accidents? Does anybody else think this is an
unbelievably high number?

------
vonnik
typo in the headline. we need one more 'c' in accident.

------
VLM
Why bother?

In a world of massive un- and under-employment, Mechanical Turk and massive
connectivity, increasing income inequality, and generationally declining
standards of living, a handful of millionaires will sit in their unimaginably
expensive self-driving cars, and five Mechanical Turk remote drivers will
cooperate on a 3-of-5 vote basis to drive the millionaire.

Most brainpower is underutilized, the quantity is increasing, and it's
manufactured by unskilled labor. There eventually comes a peak-automation
point where it's better and cheaper for the economy and culture to just hire a
dude to do it.

In the world of the future, only Ivy League grads will have real jobs and
they'll have all the money. What's so inherently awful about someone in the
favelas getting a job as a driver, especially if it's cheaper and safer?

~~~
oofabz
The world unemployment rate is 8% according to the CIA World Factbook. What's
going to make this increase so dramatically? We've had industrial automation,
computers, and the internet for decades now, and people still have jobs.

Also, why would you want someone remote-controlling your car instead of sitting
in the driver's seat? Wireless connectivity is way too flaky. Tunnels block the
signal, rural areas have poor coverage, and equipment breaks and requires
maintenance. It would be insanity to bet your life on a cellular signal.

