
How Safe Is Tesla Autopilot? - sitkack
https://betterembsw.blogspot.com/2016/09/how-safe-is-tesla-autopilot.html
======
rhino369
The 94 million mile mean is for all types of driving, in all types of
situations, with all types of drivers, in all types of cars.

First and foremost, highway driving is much, much safer than regular
driving. There are no intersections (which cause 40% of all accidents), there
is high visibility, turns are gentle, there are no pedestrians, and the
traffic pattern is super simple. Autopilot is only doing the "easy" driving.

Second, that 94 million miles figure includes motorcycle deaths. Motorcycles
have 34X more fatalities per mile driven. That juices the number of fatalities
per mile.

Demographics matter too. Tesla owners are disproportionately older than
average (not a lot of teens driving Teslas). They are also more educated and
richer. These people tend to be safer behind the wheel on average.

Teslas also have a lot of safety features that should reduce fatalities to
well below the median anyway.

If you want to test whether Autopilot is safer, you have to test it against a
control group that is identical except that it didn't use Autopilot.
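Beyond the control-group problem, the sample is also far too small to say much. A rough stdlib-only sketch of the point, using an exact Poisson confidence bound (the 130-million-mile figure is Tesla's claimed Autopilot mileage at the time; treat it as an assumption):

```python
import math

def poisson_upper(k, conf=0.95, tol=1e-9):
    """Exact upper confidence bound on a Poisson mean given k observed
    events: the mean mu at which P(X <= k) falls to alpha/2, by bisection."""
    alpha = 1.0 - conf

    def cdf(mu):
        # P(Poisson(mu) <= k), summed directly (fine for small k)
        return sum(math.exp(-mu) * mu ** i / math.factorial(i)
                   for i in range(k + 1))

    lo, hi = 0.0, 1000.0
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if cdf(mid) > alpha / 2.0:
            lo = mid  # mu still too small: upper tail not yet reached
        else:
            hi = mid
    return (lo + hi) / 2.0

# One fatality in ~130 million Autopilot miles (Tesla's figure, an assumption):
miles = 130e6
upper_events = poisson_upper(1)            # ~5.57 expected fatalities
upper_rate = upper_events / miles * 100e6  # per 100 million miles
print(f"95% upper bound: {upper_rate:.2f} fatalities per 100M miles")
```

The national benchmark of 1 per 94M miles is about 1.06 per 100M miles, so a single fatality in 130M miles is statistically consistent with a rate several times *worse* than the benchmark, as well as with rates far better. One event just can't settle the question.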

~~~
gnicholas
Completely agree with most of these points, but the demographics one is
actually a mixed bag. Fatalities plotted against driver age follow a U-shaped
curve [1], with younger and older drivers being the most likely to cause a
fatality. The U is higher on the younger driver side of the curve, but this is
likely due in part to the fact that teens have less money and therefore drive
cheaper/less safe cars.

Because Tesla owners probably skew older, we might expect to see slightly more
fatalities with Teslas than with other vehicles. The male/female fatality
rates [2] are also relevant, and I would guess that Teslas are driven more by
men than women.

(My estimation of Tesla demographics is based on my commute down Sand Hill
road, so not scientific but probably the highest-traffic area for Teslas.)

Lastly, I haven't seen data showing that wealthier people are safer drivers
than poorer people. Source? It's believable, but I have looked for data on
this in the past and not found this particular conclusion.

1: [https://www.aaafoundation.org/sites/default/files/2012OlderD...](https://www.aaafoundation.org/sites/default/files/2012OlderDriverRisk.pdf)

2: [http://www.cbsnews.com/news/men-vs-women-who-are-safer-drive...](http://www.cbsnews.com/news/men-vs-women-who-are-safer-drivers/)

~~~
jsprogrammer
We don't know if we have accurate fatality numbers for Teslas though. IIRC,
the first fatality was unreported for a couple months.

------
Animats
That's a superficial analysis. Tesla's MTBF has been discussed here before.
For one thing, Tesla's auto-driving is only expected to work on freeway-type
highways, and fatality rates for those are lower than for all driving.

Tesla's "autopilot" now has a second fatality, in China.[1] There's dash cam
video. "When it was approaching the road sweeper, the car didn't put on the
brake or avoid it," a police officer said in the CCTV report. "Instead, it
crashed right into it." Take a look at the photo. It's possible that the big
rotating brush behind the street sweeper didn't produce a radar return.
Remember, Tesla's radar is only at bumper height; Teslas are radar-blind at
windshield height. The rear profile of the street sweeper truck, which is high
off the ground, may not have been recognized as a vehicle by Mobileye,
especially since the street sweeper is throwing up dust. Video: [2]

So far, there seem to be at least two Tesla autopilot defects that can kill
people:

#1 is a high obstacle. That was the trailer crash fatality. It's also the
auto-parking failure where the vehicle didn't stop for a truck with ladders
overhanging the rear.[4] Teslas are radar-blind at windshield height, and if
it doesn't look like a car or truck rear end, the Mobileye vision system
won't see it.

#2 is a left side obstacle partially blocking a lane. There are two videos of
crashes in this situation.[2][3] The "how big is the open space" vs the "how
wide is the vehicle" analysis may be defective.

#1 is the result of an under-sensored system with a huge blind spot. #2 may be
a software bug.

Looking forward to the NTSB report. Nothing new from them yet.

[1] [http://www.nytimes.com/2016/09/15/business/fatal-tesla-crash...](http://www.nytimes.com/2016/09/15/business/fatal-tesla-crash-in-china-involved-autopilot-government-tv-says.html?_r=0)

[2] [http://www.dailymail.co.uk/news/article-3790176/Shocking-das...](http://www.dailymail.co.uk/news/article-3790176/Shocking-dashcam-footage-shows-Tesla-Autopilot-crash-killed-Chinese-driver-futuristic-electric-car-smashed-parked-lorry.html)

[3] [https://www.youtube.com/watch?v=qQkx-4pFjus](https://www.youtube.com/watch?v=qQkx-4pFjus)

[4] [http://www.roadandtrack.com/new-cars/car-technology/news/a29...](http://www.roadandtrack.com/new-cars/car-technology/news/a29133/tesla-self-driving-crash-summon-autonomous/)

~~~
mdorazio
Tesla disputes that the China fatality was due to Autopilot (they say it may
not have been active). But it doesn't actually matter whether the sweeper was
throwing up dust: Autopilot simply can't detect and avoid stationary objects
in some situations, regardless of ground clearance. Here's an example of it
hitting a normal van stopped in the left lane:

[https://electrek.co/2016/05/26/tesla-model-s-crash-autopilot...](https://electrek.co/2016/05/26/tesla-model-s-crash-autopilot-video/)

~~~
Animats
Tesla didn't get telemetry data from the vehicle, so they have no idea what
happened yet. (Does Tesla get telemetry data from cars in China at all? Does
that get through the Great Firewall? Do Tesla cars in China report to a server
in China or back to the US, or what?)

~~~
Animats
If you watch the video, the car is tracking the lane very accurately. Either
the driver is working very hard to stay precisely in lane while not watching
for obstructions, or the autopilot system is driving.

------
gnicholas
Good background on statistics and sample size. Also relevant is the fact that
Teslas are (1) heavier than the average car, (2) newer than the average car in
America, and (3) more expensive than the average car. All of these factors are
not only correlates of—but actually cause—better vehicle safety outcomes. For
more detailed numbers, see
[http://www.thedailybeast.com/articles/2016/07/14/why-tesla-s...](http://www.thedailybeast.com/articles/2016/07/14/why-tesla-s-cars-and-autopilot-aren-t-as-safe-as-elon-musk-claims.html) (disclaimer: I am one
of the co-authors).

~~~
vkou
> One Tesla owner describes this Catch-22, after being told that a crash was
> her fault because she turned off Autopilot by hitting the brakes: “So if you
> don’t brake, it’s your fault because you weren’t paying attention,” she told
> The Wall Street Journal. “And if you do brake, it’s your fault because you
> were driving.”

If this is true, this is a fairly damning statement about the reliability of
autopilot - it calls any statistical claims Tesla provides into question.

~~~
J-dawg
The "Catch-22" doesn't make a lot of sense, really.

In either case you _are_ driving. Until we legally recognise autopilots as
being fully in control of the car, the driver is _always_ driving.

Her statement does make me wonder how relaxing it would really be to drive a
car with autopilot, though. If I can't sit back and trust it 100%, I think I'd
rather just be fully engaged in driving the car.

~~~
gnicholas
You're not alone—humans are not very good at constantly monitoring systems,
which is essentially what we are asked to do in these early days of "autopilot":
[http://www.csmonitor.com/Business/In-Gear/2016/0827/NASA-thi...](http://www.csmonitor.com/Business/In-Gear/2016/0827/NASA-thinks-Tesla-Autopilot-is-a-bad-idea)

~~~
J-dawg
Thanks for the link, really interesting. It does seem that there's a huge
amount to overcome as autopilot moves from driver aid to fully autonomous.

I'm starting to think the ideal compromise (for now) is a car where I'm in
control, but with lots of systems that cut in when I make a mistake, rather
than the other way around.

------
ChicagoBoy11
Great article on the statistical nuance of their claims. But even before you
get that far, you also need to address the fact that the national highway
statistic they are comparing against is highly misleading. As gnicholas
mentioned, the age, price, geographic distribution, weather conditions in
which Autopilot runs, and typical owner characteristics of a Tesla all differ
sharply from those of the average car on the highway. And I'd suspect that if
you zeroed in on proper comparables, all of these factors would drive the
average highway safety way up.

So not only is their claim statistically unproven, they are also setting
themselves an artificially low bar.

~~~
deegles
> all of these factors would drive the average highway safety way up.

Up? Why not down? More data could easily show the opposite.

~~~
ChicagoBoy11
What I was trying to get at is that the benchmark Tesla is using for
comparison (1 fatality per 94 million highway miles) takes into account all
highway vehicles. Tesla is a high-end, expensive, luxury vehicle, and
Autopilot also does not function in certain conditions (like heavy rain). It
seems to me that if you took into account only other luxury vehicles, and
excluded really adverse driving conditions, the 94-million-mile number would
probably be higher, and I think that would be a fairer comparison for Tesla
to use if they want to claim the car is safer than human driving.

Are you trying to say that you think taking these factors into account would
actually make the comparables comparatively less safe?

~~~
deegles
gnicholas@ linked to this article that explains the factors:
[http://www.thedailybeast.com/articles/2016/07/14/why-tesla-s...](http://www.thedailybeast.com/articles/2016/07/14/why-tesla-s-cars-and-autopilot-aren-t-as-safe-as-elon-musk-claims.html)

From what I understand, if you limit the stat to a) cars made in the last 2
years, b) luxury cars, and c) the conditions where Autopilot is mostly used,
the fatalities per million miles would be lower, i.e., 94 million is too low
a number.

~~~
ChicagoBoy11
Right, that's what I was saying! I guess we just misread each other, but this
is absolutely right - their comparison should use a higher denominator than 94
million!!!
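The back-and-forth above reduces to simple arithmetic. A toy sketch (only the 94M figure comes from the thread; the other benchmark values are made-up placeholders, not real statistics) of why a higher miles-per-fatality denominator makes the comparison tougher for Tesla:

```python
# Benchmarks in miles per fatality (higher = safer baseline to beat).
# Only the 94M national figure is real; the rest are illustrative placeholders.
benchmarks_miles_per_fatality = {
    "all US vehicles, all roads": 94e6,
    "highway driving only (placeholder)": 180e6,
    "late-model luxury cars on highways (placeholder)": 400e6,
}
tesla = 130e6  # miles per fatality implied by Tesla's claim at the time

verdicts = {}
for label, benchmark in benchmarks_miles_per_fatality.items():
    verdicts[label] = "ahead of" if tesla > benchmark else "behind"
    print(f"vs {label}: Tesla is {verdicts[label]} the benchmark")
```

With the all-vehicle baseline Tesla looks ahead; against a baseline restricted to Autopilot-like conditions (whatever the true number is), the same 130M figure could easily fall behind.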

------
etendue
I think this point from the post deserves emphasis, for consideration in
future discussions:

> If Tesla makes a "major" change to software, pretty much all bets are off as
> to whether the new software will be better or worse in practice unless a
> safety critical assurance methodology (e.g., ISO 26262) has been used. (In
> fact, one can argue that _any_ change invalidates previous test data, but
> that is a fine point beyond what I want to cover here.) Tesla says they're
> making a dramatic switch to radar as a primary sensor with this version.
> That sounds like it could be a major change. It would be no surprise if this
> software version resets the clock to zero miles of experience in terms of
> software field reliability both for better and for worse.

I am admittedly a broken record on this point, but Tesla moves very fast and
very nimbly on a system that should be under design controls. It would be
very educational to learn about their development process and how it maps to
ISO 26262.

------
codingdave
The naming of Autopilot is unfortunate, because it is not one... and that
name may give drivers an inclination to trust it more than they should.

~~~
danielsju6
Am I missing something with this argument? It's consistently made in these
anti-Tesla threads here and on Reddit, but from my understanding an aircraft's
autopilot is actually dumber than Tesla's. You set bearing, altitude, and
speed, and the plane will adjust to and keep that course—that's it. Some will
land in optimal conditions, but they're unable to taxi. Also, a great many
plane crashes are due to "handoff" issues.

So it seems to me that Tesla's Autopilot is aptly named.

~~~
ChicagoBoy11
The problem with the naming has nothing to do with the technical capabilities,
but with how the name shapes human-machine interaction. You are absolutely
right that aircraft autopilot is a lot "dumber" in a sense. But there is no
ambiguity in the fact that when the pilot turns on the autopilot, he can --
and does -- expect the computer to take over all flight operations.

By naming the system Autopilot, Tesla has created that same expectation in
its customers. Not only that, but it has at best turned a blind eye to
reports and videos and stories of customers who willfully take their hands
off the wheel, sit in the back seat, etc., and at worst encouraged an
erroneous public perception of how the Autopilot system should be used.

The handoff issues you cited for airplanes are far more complicated when it
comes to cars, where decisions necessarily have to be made in a much shorter
period of time. But all of that seems really downplayed by Tesla's marketing,
which chooses to aggressively present the system as being close to full
driving autonomy, rather than as a gentle aid to monotonous highway driving.

The disconnect between their official documentation and their marketing is
immense. That is what is potentially dangerous.

~~~
mikeash
"But there is no ambiguity in the fact that when the pilot turns on the
autopilot, he can -- and does expect -- the computer to take over all flight
operations."

What do you mean by this? There are many autopilots which won't take over all
flight operations. Some only handle a single control axis, or two. More
sophisticated ones can indeed handle everything, but that's not the only kind
out there.

If people _think_ that's what all autopilots do, then the name is still a
problem, but that's a bit different.

~~~
jedmeyers
An autopilot in newer light GA aircraft, like the Bendix/King KAP-140, is
able to hold, climb, or descend to an altitude and maintain a heading based
on GPS input, so it does handle the hands-on flight operations, leaving the
pilot to maintain awareness and program the GPS.

~~~
mikeash
Looks like the KAP-140 comes in a single-axis version too.

------
mhneu
The naming of Autopilot likely originated with Tesla's marketing department.

Just as the blog post this article critiques was likely written by content
marketers, or even more likely by a collaboration between top-level marketing
and engineering officers. The success of the Autopilot feature is a critical
aspect of Tesla's business, and all communications about it certainly rise to
the C-suite.

Edit: the success of self-driving cars in 2016 is more about reputational and
liability risk than it is about engineering effort. A self-driving car rollout
effort by any company is being driven (no pun intended) by legal and
marketing, not engineering.

------
Tharkun
The article is a good summary of Statistics 101.

------
trackofalljades
TL;DR for every piece written about this: it's still safer than a human
driving alone, period. When used properly, it's vastly safer, and when used
improperly, it's probably still safer, but only just... The thing to consider
is that the kind of driver who's selfish and dumb enough to misuse this kind
of thing is a high-risk driver without it and would probably still be worse.

~~~
wiremine
Not disagreeing, but is there any hard, independent third-party research to
back up this claim? As an engineer, I tend to lean this way. As a father of
two, I want to see some numbers to back up statements like "still safer than a
human driving alone, period."

~~~
jfoutz
Hard to find current statistics. This [1] seems pretty believable. Not sure
if it's "good" or not, but there you go. It's interesting that, just
eyeballing it, mass seems to be the dominant factor. The Crown Vic is
discontinued, but was fairly inexpensive (it looks like it was originally
$25-30,000), compared to a C-Class at $30-35,000, which had more fatalities.

Of course, this is all 5 years old, so maybe there are better safety features
available now.

There are a couple of things that may or may not be believable. Maybe it's
possible for one or a few drivers to greatly improve the global traffic
system [2]. I believe it, but I can't find a real study that proves it. I do
think a handful of self-driving cars could smooth out those compression
waves, just by driving sanely.

Finally, purely anecdotally, a friend of mine is a truck driver. We get to
catch up over dinner every few months. He sees people doing crazy things
multiple times a day. He sees the helicopter ambulances on a weekly basis,
passing wrecks in either direction. His assertion: if people would just stay
calm and consistent, a lot fewer people would die. It's either someone not
paying attention, or someone being super aggressive.

I, too, would like to see a study. I would bet quite a bit of money that a
highway with _only_ autopilot drivers would have zero wrecks.

I guess the real question is: does sensor failure (device failure, snow,
things like that) dominate wrecks? Or is it interaction with unpredictable
(aggressive, inattentive) drivers?

[1] (pdf)
[http://www.cbsnews.com/htdocs/pdf/Driver_DeathRates_sr5001_e...](http://www.cbsnews.com/htdocs/pdf/Driver_DeathRates_sr5001_emb.pdf)

[2] [http://trafficwaves.org/](http://trafficwaves.org/)

