
A Closer Inspection of Tesla’s Autopilot Safety Statistics - stardust83
https://medium.com/@mc2maven/a-closer-inspection-of-teslas-autopilot-safety-statistics-533eebe0869d
======
olliej
The author explicitly states that they believe that Tesla is correct to only
include miles driven with autopilot enabled, specifically they say:

"I agree with Tesla’s methodology on Autopilot mileage because the road
conditions under which a partial autonomy system is rated for operation
(highways, clear lane markings, etc) are systematically different from
manually-driven miles."

But if that were the case, you must also only include crash data from manually
driven cars in those same circumstances. And the majority of automobile
accidents do not occur on freeways -- which are the most autopilot-friendly
roads being driven.

We can't easily work out that traffic split, but we can do something else that
should be just as effective:

Compare accidents per million miles driven in autopilot-enabled cars to
accidents per million miles driven in regular vehicles. If autopilot does
meaningfully improve driving safety, we should expect the average for
autopilot-enabled cars to be lower than for other cars by a statistically
significant margin.
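
That comparison can be sketched as a two-sample Poisson rate test. All the
counts below are hypothetical placeholders, not actual Tesla or NHTSA figures:

```python
from math import sqrt

def rate_comparison_z(crashes_a, miles_a, crashes_b, miles_b):
    """z-score for the difference in crash rates (crashes per mile)
    between two fleets, using a pooled rate under the null hypothesis
    that both fleets crash equally often per mile."""
    rate_a = crashes_a / miles_a
    rate_b = crashes_b / miles_b
    pooled = (crashes_a + crashes_b) / (miles_a + miles_b)
    se = sqrt(pooled / miles_a + pooled / miles_b)
    return (rate_a - rate_b) / se

# Hypothetical counts: fleet A (autopilot-enabled) vs. fleet B (everyone
# else). A |z| above ~1.96 would be significant at the 5% level.
z = rate_comparison_z(300, 1.3e9, 2_000_000, 3.2e12)
```

With exposures this large, even a modest rate difference produces a huge
z-score -- the hard part is getting honest numerators and denominators, not
the test itself.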

If we want to get specific: per [1], more than 50% of accidents occur at
intersections. Given that Tesla's autopilot sample will dramatically
under-represent intersection crashes, we could go to the extreme and say that
their statistics should be less than half the crash rate of non-autopilot
vehicles before any safety benefit is counted at all. (This is an
exaggeration, but given they're not being honest I don't really care.)

Personally I'd also control for the class of car we're comparing -- I suspect
the stats for high-end cars differ from those for low-end cars, which differ
again from sports cars, etc.

[1]
https://www.fhwa.dot.gov/research/topics/safety/intersections/

~~~
rhino369
No, she is saying you shouldn’t just count the miles where autopilot is turned
on. Her reasoning is that you can’t compare autopilot-on miles to regular cars
because there is no comparable data for regular cars.

That might be acceptable if you compared Tesla autopilot capable cars to other
luxury cars, where the only significant difference in safety is autopilot.

But that wouldn’t account for things like: maybe certain Tesla safety features
save a lot of lives but get canceled out by a very dangerous autopilot.

And she doesn’t compare to similar cars.

~~~
taneq
> That might be acceptable if you compared Tesla autopilot capable cars to
> other luxury cars, where the only significant difference in safety is
> autopilot.

This is the comparison they should have done. Comparing with just the national
average includes all the poorly maintained 20-year-old beaters being driven by
teenagers.

------
w0mbat
They would do better to look at the few serious autopilot crashes Tesla has
had. They tend to involve the car driving straight into a large stationary
object, in situations where a much cheaper car with a basic automatic
emergency braking system (AEB) would refuse to collide and would just stop.

It looks like Tesla’s over-sophisticated system, which builds a live 3D map of
the whole environment, filters out large non-moving objects too aggressively,
believing them to be things like overhead bridges that it can pass under.

It needs a supplemental and independent AEB system based purely on highly
directional forward facing sensors which can confirm that the path ahead is
clear. This is so cheap that it is becoming common on mainstream cars like
Hondas and Toyotas. The current model S probably already has the sensor
hardware onboard to enable this extra redundant safety system in software and
save lives.

~~~
kalleboo
> They tend to involve the car driving straight into a large stationary
> object, in situations where a much cheaper car with a basic automatic
> emergency braking system (AEB) would refuse to collide and would just stop

Would it?

https://www.wired.com/story/tesla-autopilot-why-crash-radar/

> Volvo's semi-autonomous system, Pilot Assist, has the same shortcoming. Say
> the car in front of the Volvo changes lanes or turns off the road, leaving
> nothing between the Volvo and a stopped car. "Pilot Assist will ignore the
> stationary vehicle and instead accelerate to the stored speed," Volvo's
> manual reads, meaning the cruise speed the driver punched in. "The driver
> must then intervene and apply the brakes.” In other words, your Volvo won't
> brake to avoid hitting a stopped car that suddenly appears up ahead. It
> might even accelerate towards it.

> _The same is true for any car currently equipped with adaptive cruise
> control, or automated emergency braking_. It sounds like a glaring flaw, the
> kind of horrible mistake engineers race to eliminate. Nope. These systems
> are designed to ignore static obstacles because otherwise, they couldn't
> work at all.

> “You always have to make a balance between braking when it’s not really
> needed, and not braking when it is needed,” says Erik Coelingh, head of new
> technologies at Zenuity, a partnership between Volvo and Autoliv formed to
> develop driver assistance technologies and self-driving cars. He's talking
> about false positives. On the highway, slamming the brakes for no reason can
> be as dangerous as not stopping when you need to.

~~~
mhandley
I don't work on these radar systems, but have played a lot with ultrasound for
robotics. It seems possible these radar systems give almost continuous false
positives. You could get a pretty strong return from potholes, slightly sunken
drain covers, etc. Using doppler to reject any return that matches the car's
own road speed just leaves the returns from moving vehicles. You can then
track speed vs distance to sanitize that data. It becomes a tractable problem
then to detect that the car in front is slowing quicker than you are. But
telling the difference between a stationary car and a slightly misaligned
bridge expansion joint is probably not trivial.
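
A toy version of that doppler clutter rejection (purely illustrative -- real
radar pipelines are far more involved) shows why a stopped car is
indistinguishable from ground clutter at this stage:

```python
def filter_radar_returns(returns, ego_speed_mps, tolerance_mps=1.0):
    """Drop returns whose closing speed matches the ego vehicle's own
    speed, i.e. objects stationary in the world frame (potholes, drain
    covers, bridges)... and, unavoidably, stopped cars too.

    Each return is a (range_m, closing_speed_mps) pair."""
    return [(rng, v) for rng, v in returns
            if abs(v - ego_speed_mps) > tolerance_mps]

# At 30 m/s: a pothole closes at 30, a stopped car also closes at 30,
# and a lead car doing 25 m/s closes at 5.
returns = [(40.0, 30.0), (80.0, 30.0), (60.0, 5.0)]
moving_only = filter_radar_returns(returns, ego_speed_mps=30.0)
# Only the moving lead car survives; the stopped car went out with the clutter.
```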

~~~
zip1234
The Tesla has multiple different sensors, including ultrasound, radar, and
cameras. Fusing information from the different sensors should help correct for
such errors.
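
A minimal sketch of what such cross-sensor confirmation could look like
(purely illustrative, not Tesla's actual logic): require agreement from at
least two independent sensors before treating an obstacle as real.

```python
def confirm_obstacle(radar_hit, camera_hit, ultrasound_hit):
    """Majority vote across three independent sensors: a single-sensor
    detection (e.g. radar clutter from a bridge expansion joint) is
    rejected, while agreement between any two sensors confirms the
    obstacle."""
    votes = int(radar_hit) + int(camera_hit) + int(ultrasound_hit)
    return votes >= 2
```

The trade-off is the mirror image of the radar problem above: requiring
agreement suppresses false positives, but it also delays braking when only
one sensor sees a real obstacle.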

------
jaclaz
There is a factor that is never mentioned.

The actual "fleet" of Tesla (and of the others experimenting in the
autonomous vehicle field) is made up exclusively of fairly new and perfectly,
or nearly perfectly, maintained vehicles.

Besides and before any other consideration, the way a modern, newish and
well-maintained car handles (steers, brakes, etc., particularly in an
emergency situation) is hardly comparable with the way the "average" car does
-- think of unbalanced brakes, worn-down tyres, etc., but also of the
undeniable truth that your - say - 1998 pickup won't ever be as stable as a
sports car.

Since there is no real data about the condition and capabilities of the
comparison group (all the other vehicles, and their drivers), each and every
comparison tends to favour Tesla for reasons that have nothing to do with the
actual automation.

Even an "internal" comparison (Teslas on autopilot vs. Teslas on manual
driving) wouldn't, IMHO, be very representative, since Tesla drivers are - if
not an elite - a distinct group of people: not too young, not too old,
possibly with an interest in or passion for cars. "The rest" comprises all
licensed drivers, including those with little experience and a tendency to
risk too much, and elderly people with slower reaction times.

~~~
boznz
The only elite thing about most Tesla owners is that they are richer than
most other drivers. It does not mean they are better drivers.

------
makomk
"Delaying the roll-out of partially-autonomous vehicles costs lives. This
conclusion assumes that (1) automakers make steady progress in improving the
safety and reliability of their partially autonomous vehicles and (2) drivers
are comfortable enough with monitoring the partially-autonomous vehicles so
that new sources of error associated with the transition to and from manual
and autonomous control do not increase fatality rates."

In other words, if you assume (1) something which we can't actually know
unless we have the ability to predict the future, and (2) something we already
have strong reason to believe isn't true, then we must roll out partially
autonomous vehicles as soon as possible and anyone who questions this is
basically killing people.

~~~
tedsanders
The previous sentences are worth quoting as well:

>This model shows that rolling out just as safe or a little safer partially-
autonomous vehicles by 2020 will save 160,000 more lives over 50 years than a
scenario that waits until 2025 to roll out almost perfect autonomous vehicles.
Delaying the roll-out of partially-autonomous vehicles costs lives.

Sure, rolling out autonomous vehicles will save lives if you _assume_ that
they are safer than humans. But right now all the evidence points to
autonomous vehicles being substantially more dangerous than humans. Waymo
reports a disengagement roughly every 5,500 miles, and estimates that
something like 10% of disengagements would have led to a collision.

https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/disengagement_report_2017

And taken as a whole, self-driving cars have already killed one person with
only about 10 million miles driven. It would take the average human driver
more than 100 million miles to kill someone.
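
The back-of-the-envelope arithmetic behind those numbers, using the
5,500-mile and 10% figures quoted above. Note the two sides count different
events (any collision vs. a fatality), so this bounds how far apart the
records could be rather than giving a like-for-like comparison:

```python
miles_per_disengagement = 5_500
collision_fraction = 0.10  # estimated share of disengagements that avoided a crash

# Implied miles per would-be collision for the autonomous fleet.
miles_per_av_collision = miles_per_disengagement / collision_fraction  # 55,000

# Rough human baseline from the comment above: one fatality per ~100M miles.
miles_per_human_fatality = 100_000_000

gap = miles_per_human_fatality / miles_per_av_collision  # ~1,800x
```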

Edit: My comment refers to safety of level 4/5 autonomy, not the current Tesla
Autopilot.

The real interesting question is whether it's worth it to deploy dangerous
cars, risking today's lives to help save future lives. It's not that
interesting to ask whether deploying hypothetically safer cars will save
lives.

~~~
stardust83
Why do you think only 10 million miles have been driven with Autopilot? This
article says a billion+ miles have been driven with Autopilot cars:
https://electrek.co/2016/11/13/tesla-autopilot-billion-miles-data-self-driving-program/

~~~
rhino369
Autopilot isn’t self-driving. It’s glorified cruise control.

And Tesla’s had two autopilot fatalities. And Uber had one.

~~~
zip1234
Does the Uber one count in the same statistical category, since it was a
pedestrian? I thought those were their own category.

------
tomohawk
It's good that there's starting to be more focus on whether partially
autonomous vehicles are actually safer, but what about other aspects?

We have social networks now that have "terms of service" and that kick out
people they disagree with. Is it too far out there to suppose that an
autonomous vehicle will come with similar "terms of service"? What if the
vehicle refuses to drive you to a competitor's store? Or to a political rally
that the car company doesn't like? Or to a gun store? Or to a religious
gathering?

Since most drivers control and own their cars, these are not concerns today,
but they likely will be. It seems like these are bigger decisions to hash out
than whether a car is merely safe. It doesn't matter if the car is safe, if it
safely takes away your freedoms.

------
Animats
Tesla has data on how much time their cars spend on autopilot. If Tesla wants
to promote their crash rate, they need to disclose the raw data.

Total vehicle miles are a bad denominator for a system that only works right
on freeways. The accident rate of interest is autopilot miles on freeways vs.
all vehicle miles on freeways. Here's the US data summary for all
vehicles.[1] See table 35, which breaks out divided-highway data.

The focus on fatal collisions is misplaced. There are far more non-fatal
collisions, which provide much more information. Evaluating Tesla's
"autopilot" is about measuring driving error, not crash survivability.

[1]
https://crashstats.nhtsa.dot.gov/Api/Public/Publication/812384

~~~
grogers
There needs to be at least some emphasis on fatalities. If autopilot reduces
total collisions but increases fatalities, you might draw the wrong conclusion
about its safety. It's early, but from the incidents we know of already, it
seems like a decent probability that this is the situation Tesla is in: a
lower crash rate, but with those crashes more likely to be fatal.

------
stardust83
The RAND article says the most lives would be saved if Autopilot were rolled
out once it was 10% better than humans. However, we won’t have statistical
confidence until billions of miles are driven. If we think this problem can
be solved, we have to take the risk and go for it.
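
The "billions of miles" claim can be sanity-checked with a one-sided Poisson
power calculation. A sketch, assuming a human baseline of roughly 1.1
fatalities per 100 million miles (an approximate US figure), 5% significance
and 80% power:

```python
from math import sqrt

def miles_to_show_improvement(baseline_per_100m=1.1, improvement=0.10,
                              z_alpha=1.645, z_beta=0.84):
    """Exposure (in miles) needed for a one-sided test to show the AV
    fatality rate sits `improvement` below a known human baseline,
    using the normal approximation to the Poisson."""
    lam0 = baseline_per_100m / 100e6   # human fatalities per mile
    lam1 = (1 - improvement) * lam0    # hypothesized AV fatality rate
    # Solve lam0*T - z_alpha*sqrt(lam0*T) = lam1*T + z_beta*sqrt(lam1*T) for T
    return ((z_alpha * sqrt(lam0) + z_beta * sqrt(lam1)) / (lam0 - lam1)) ** 2

# Comes out in the tens of billions of miles -- hence "billions of miles".
```

Detecting small improvements in an already-rare event rate is what drives the
exposure requirement: the denominator scales with the square of the rate
difference.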

------
erentz
One thing I don’t understand about the current direction of autonomous
vehicles is the avoidance of more of a “positive control” type of setup. I
grant it’s more complicated and means you won’t have it right away, but
shouldn’t we come up with highway design standards that make autonomous
vehicles safer, then a signaling method to say “this is on” or, during
construction, “this is off”? It just seems a smarter, more incremental way to
go. That way you can engage “autopilot” only in conditions it’s known to work
under.

------
cma
Tesla's stats compare autopilot-equipped Teslas against a national average
that includes motorcycles. They are deflecting.

------
f_allwein
"humanity could be ushered into a new economy where driving is a hobby, only
for sunny days along clear roads with a view. The struggles and tedium of the
daily commute could be handled by autonomous vehicles, traffic accidents could
fall to nil, passengers could focus on working and relaxing in their mobile
offices, and the elderly, disabled, and blind could have considerable mobility
and autonomy."

Sounds remarkably like a world with good public transport.

~~~
jaysonelliot
"Good public transport" is an oxymoron.

The best you can hope for is public transport that will get you where you're
going on time.

There's no public transportation that will provide privacy, consistent
comfort, or even a seat. Never mind the most basic problem with public
transportation—the other passengers. Any time you get into an enclosed metal
box with an arbitrary number of random people, you roll the dice. You could
have a quiet, safe ride, or you could end up with loud music, obnoxious body
odors, food spilling on you, bags hitting you in the face, or someone vomiting
all over the floor. Any long-time New Yorker has their fair share of subway
and bus stories.

Public transportation is a vital part of any city. But it's not a "relaxing
mobile office," it's not always easy or convenient for the elderly and
disabled, and you can't have it as the only option.

~~~
f_allwein
Well, you can certainly get some work done on a train once you've figured out
a commute that gets you a seat. Granted, that is not an option for everyone.

On the other hand, what would the world look like if we all used autonomous
cars? It seems we would end up with the same traffic jams, or worse. If people
were willing to share their cars, the jams would be a bit lighter. But
remember that any means of public transport uses less space than cars.
http://humantransit.org/2012/09/the-photo-that-explains-almost-everything.html

~~~
mmt
> But remember any means of public transport uses less space than cars

I think this is indicative of the PR problem that transit advocates have and
that the GP is trying to point out: the advocates keep focusing on moving
people from point A to point B.

Transit advocacy always seems to be about X thousand passengers per hour and
saving Y thousand square feet of real estate. Quality of life of those
passengers doesn't get a mention.

Granted, creature comforts may not be all that high on everyone's priority
lists, but I'm confident they are for many, especially as they get to middle
age and have the financial means to vote with their wallets.

