
Misfortune - dwaxe
https://www.tesla.com/blog/misfortune
======
cjensen
From Tesla's statement: _"contrasted against worldwide accident data,
customers using Autopilot are statistically safer than those not using it at
all"_

I think this is a foolish statement: fatal accidents in the US are 1.3 per
100M miles driven. Tesla reports Autopilot has driven 100M miles. That's not
enough data to draw this kind of conclusion.
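To put numbers on "not enough data": a quick Poisson sketch (my own back-of-the-envelope, using only the 1.3-per-100M figure above) shows that one fatality in 100M miles is entirely consistent with the ordinary human-driver rate:

```python
import math

# If Autopilot miles had EXACTLY the average US fatal-accident rate,
# how likely would it be to see at most 1 fatality in 100M miles?
lam = 1.3  # expected fatalities in 100M miles at the average rate

def poisson_cdf(k, lam):
    """P(X <= k) for a Poisson(lam)-distributed count."""
    return math.exp(-lam) * sum(lam**i / math.factorial(i) for i in range(k + 1))

p = poisson_cdf(1, lam)
print(f"P(at most 1 fatality at the average rate) = {p:.3f}")  # ~0.627
```

A ~63% chance of this outcome under the "no better than average" hypothesis means the record so far can't distinguish "safer" from "average" at all.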

There's a parallel here with software testing. I've seen bugs that happen 25%
of the time, for example, where the test takes 10 minutes to run. Our test guys
have great intentions, but if they test 5 times and see no bug, they think the
bug is gone. There is no instinctive understanding that they have insufficient
data to draw a conclusion.
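The arithmetic behind that intuition gap (using the 25%-repro, 5-runs numbers from the comment):

```python
import math

# A bug that reproduces 25% of the time can easily survive 5 clean test runs.
p_repro = 0.25
runs = 5
p_miss = (1 - p_repro) ** runs
print(f"Chance of 5 clean runs despite the bug: {p_miss:.1%}")  # ~23.7%

# Runs of all-clean results needed before "bug is gone" reaches 95% confidence
needed = math.ceil(math.log(0.05) / math.log(1 - p_repro))
print(f"Clean runs needed for 95% confidence: {needed}")  # 11
```

Nearly a 1-in-4 chance of a false all-clear after 5 runs; you'd need 11 consecutive clean runs to be even 95% sure.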

~~~
cperciva
_I think this is a foolish statement: fatal accidents in the US are 1.3 per
100M miles driven. Tesla reports Autopilot has driven 100M miles. That's not
enough data to draw this kind of conclusion._

Also, not all miles are equal. I'd hope that people are enabling Autopilot
primarily while driving on straightish roads with good driving conditions --
in other words, at times when the rate of fatalities with a human driver is
far less than 1.3 per 100M miles.

~~~
usaphp
I thought most fatalities happen on the highway during good driving conditions,
where the road is so smooth and boring that humans lose attention, fall asleep,
or stare at their phones. Speed is much higher on those roads, and there is a
bigger chance a car will roll over or suffer a bigger impact.

~~~
flohofwoe
At least in Germany, driving on the Autobahn is about 3x to 4x safer than
driving on other road types:
[https://en.wikipedia.org/wiki/Autobahn#Safety](https://en.wikipedia.org/wiki/Autobahn#Safety),
not sure how it compares to the US, but "fast and boring roads" seem to be
safer.

~~~
whazor
Germany is an unsuitable country for comparison, as it has many highways
without speed limits. I would argue that this invalidates the boring part.

~~~
jsemrau
Seems you have never been to Germany ;-) No offense. Most highways have a
speed limit; only some have none.

And most of the highways that don't have a speed limit are quite narrow so
while it is possible, no sane driver would do so.

If you look at highway A5 as an example: this highway has 8 lanes, which makes
driving fast quite dull since you cannot really feel the speed relative to
the environment.

~~~
MagnumOpus
The figure I've seen is 50% of the Autobahn without a speed limit, 25% with
permanent limits (due to noise control, permanently high traffic volume, or
intersections) and 25% with temporary limits (depending on building works,
weather, congestion). Of course the perception is probably worse, because more
people use the congested parts with speed limits, and due to the limits they
spend more time on those parts...

------
Dwolb
I didn't really have a problem with Tesla or Autopilot's latest issues until I
re-read this sentence:

>Autopilot was not operating as designed and as described to users:
specifically, as a driver assistance system that maintains a vehicle's
position in lane and adjusts the vehicle's speed to match surrounding traffic.

My problem is with Autopilot's branding - it's called AUTOPILOT.

The name isn't "Maintain Lane Position" or "Cruise Distance" or something
boring that describes it better - it has AUTO in the name.

Typical drivers aren't airline pilots who complete thousands of hours in
flight training and have heavily regulated schedules. We're just people who
are busy and looking for solutions to our problems.

If Tesla doesn't want people to think Autopilot functions as crash
avoidance/smart vehicle control/better than humans in all situations, or to
blame Tesla for accidents (whether human or machine is at fault), it should
come up with a less sexy name.

~~~
prostoalex
Isn't the plane's auto-pilot pretty much a pilot assistance system designed to
keep the plane at the specified altitude and follow a straight line (heading
bug on older systems, GPS coordinates in the flight plan on modern ones)?

It's not designed for collision avoidance, runway taxiing, emergency
situations, short/soft field landings or departures. It's occasionally used
for normal landings (according to [https://www.quora.com/How-often-are-
commercial-flights-lande...](https://www.quora.com/How-often-are-commercial-
flights-landed-using-autopilot)) but it doesn't seem prevalent.

~~~
dingaling
_"The Avionic Flight Control System (AFCS) provides manual or automatic modes
of control throughout the total flight envelope from take-off to landing."_

Lockheed L1011, 1972. Flight trials led to demonstration of a fully automated
trans-continental flight, from rest to rest. Pilots did not touch the
controls.

Incidentally, it was also the only airliner certified for operational Cat
IIIC autoland, with _zero_ visibility. This was frequently used at London
Heathrow, but needed a ground-control radar to guide the pilots to the gate
once the aircraft had stopped itself on the runway.

Aircraft autopilots are technically capable of completely controlling the
flight but are restricted from doing so by technical provision (e.g. lack of
rearward-facing ILS/MLS for departure) or regulatory caution (e.g. not
executing TCAS collision-resolution automatically, even though every FBW
Airbus can do this).

~~~
Animats
The technology exists, but there's reluctance to give it too much authority.
That's changing, especially in the military.

Full-authority automated ground collision avoidance is now on many F-16
fighters. It's a retrofit, and 440 aircraft had been retrofitted by 2015.
First avoided crash was in 2015.[1] Here's a test where the pilot goes into a
hard turn and releases the controls, and the automated GCAS, at the last
second, flips the plane upright and goes into a steep climb.[2] Because this
is for fighters, it doesn't take over until it absolutely has to. The pilot
can still fly low and zoom through mountain passes. The first users of this
system, the Swedish air force, said "you can't fly any lower". It's a
development of Lockheed's Skunk Works.

This technology is supposed to go into the F-35, but it's not in yet.

This may eventually filter down to commercial airliners, but they'd need more
capable radars that can profile the ground. This is not GPS and map based;
it's looking at the real world.

[1] [http://aviationweek.com/defense/ground-collision-
avoidance-s...](http://aviationweek.com/defense/ground-collision-avoidance-
system-saves-first-f-16-syria) [2]
[https://www.youtube.com/watch?v=aPr2LWctwYQ](https://www.youtube.com/watch?v=aPr2LWctwYQ)

------
Animats
I haven't found US figures, but for the UK, motorway driving has a far lower
fatality rate than non-motorway driving. "Although motorways carry around 21
per cent of traffic, they only account for 6 per cent of fatalities and 5 per
cent of injured casualties. In 2015, the number of fatalities on motorways
rose from 96 deaths to 110."[1] Since Tesla's "autopilot" is only rated for
motorway (freeway) driving, it should be compared against motorway fatality
rates, which are about a third of general driving rates. So a realistic
estimate of the fatal accident rate for human freeway driving is maybe 0.3 per
100 million miles driven. Tesla is doing much worse than that.
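The ratio behind that estimate can be checked directly from the quoted UK figures (my own restatement of the comment's arithmetic):

```python
# UK motorways: 21% of traffic but only 6% of fatalities.
traffic_share = 0.21
fatality_share = 0.06
relative_risk = fatality_share / traffic_share  # ~0.29, i.e. about a third

# Applying that ratio to the US all-roads average of 1.3 per 100M miles
us_rate = 1.3
freeway_estimate = us_rate * relative_risk
print(f"Motorway relative risk: {relative_risk:.2f}")
print(f"Estimated US freeway rate: {freeway_estimate:.2f} per 100M miles")
```

That yields roughly 0.37 per 100M miles, consistent with the ~0.3 figure above (and assumes, loosely, that the UK motorway/non-motorway split transfers to US freeways).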

To rate an automatic driving system, you want to look at accident rates, not
fatality rates. Accident rates reflect how well the control system works.
Fatality rates reflect crash survivability. Tesla needs to publish their crash
data. That's going to be disclosed, because the NHTSA ordered Tesla to submit
that information.

[1] [http://www.racfoundation.org/motoring-
faqs/safety#a5](http://www.racfoundation.org/motoring-faqs/safety#a5)

~~~
baby
I don't think it's fair to compare UK and US figures. In the UK you actually
have to make an effort to get a driver's license.

~~~
Animats
I've found New York State figures.[1] But they don't have a breakdown by
fatal/nonfatal. Like the UK data, they show much lower accident rates on
divided, limited access highways.

California has a fatality rate of 0.94 per 100 million miles traveled.[2]
That's lower than the US average. But it's not broken down by freeway/non
freeway. (You can request a login to query the database directly and download
data, and it might be possible to compute freeway accident rates.)

[1] [https://www.dot.ny.gov/divisions/operating/osss/highway-
repo...](https://www.dot.ny.gov/divisions/operating/osss/highway-
repository/Table2_2014.pdf) [2] [https://www.chp.ca.gov/programs-
services/services-informatio...](https://www.chp.ca.gov/programs-
services/services-information/switrs-internet-statewide-integrated-traffic-
records-system/switrs-2013-report)

------
nl
I don't get why people are so eager to defend Tesla Autopilot. We've had
Andrew Ng call it irresponsible[1] and Fei-Fei Li say she wouldn't let it
drive with her family in the car[2]. These aren't anti-tech luddites, but
people with a very good understanding of the current state of the art.

I love Tesla, but they are SO weak at taking criticism or realising when they
make a mistake.

[1] [http://electrek.co/2016/05/30/google-deep-learning-andrew-
ng...](http://electrek.co/2016/05/30/google-deep-learning-andrew-ng-tesla-
autopilot-irresponsible/)

[2] [http://a16z.com/2016/06/29/feifei-li-a16z-professor-in-
resid...](http://a16z.com/2016/06/29/feifei-li-a16z-professor-in-residence/)
(you'll need to listen to the podcast though)

~~~
wbillingsley
> I don't get why people are so eager to defend Tesla autopilot.

I think there's a strong recognition that self-driving vehicles, when they can
be made to happen generally, will be a significant social good. And that it's
tricky to get there unless society is willing to put _something_ out onto the
streets.

It's taken fifty or more years of popular human-driven vehicles to get to the
stage that most of our cars are pretty safe, and quite a lot of effort in
improving road design too.

Eventually, though, I suspect it won't be solved until we redesign the roads.
A significant part of rail safety is that the signalling system can sense
whether there is a train on a stretch of line (via the rather simple
technique that the axles form an electrical connection between the two rails).
Right now, it's as if we're trying to do autonomous traffic by an ant colony
model -- independent agents that know nothing about each other except what
they can sense. Which is always going to be harder than if the road can help
them out too.

~~~
pm90
Rail safety and autonomy were designed without the use of advanced machine
learning/computer vision that we have today. Also, redesigning highways seems
like a rather expensive proposition: the US is already not investing enough in
its existing infrastructure.

~~~
wbillingsley
Tarmac / asphalt only has a service life of 26 years. (And resurfacing after
13 years.) So within the sort of timeframes governments are already used to
for infrastructure construction (eg, HS2 is due for completion in 2033),
almost all the road surfaces will already have been replaced anyway.

The trick is to do it economically. Do some of the major trucking routes
first, as well as common city roads, i.e., automate trucks and buses first.
Especially as, to begin with, you'd probably want to exclude bicycles, horses,
etc., so that means not doing every road.

But that's just speculation.

------
_sentient
This is from July 6th BTW. There has since been a fair amount of back-and-
forth on this between Musk and Stephen from Fortune. This episode has also
granted us this particularly delightful AMA on reddit, wherein Stephen roundly
ignores comments calling out the questionable links between recent Fortune
coverage and the Kochs' ongoing crusade against renewable energy:
[https://www.reddit.com/r/IAmA/comments/4rqa6q/hey_i_am_steph...](https://www.reddit.com/r/IAmA/comments/4rqa6q/hey_i_am_stephen_gandel_a_senior_editor_at/)

~~~
nl
I don't see why the Kochs' campaign against electric cars and renewables is
particularly relevant. Driverless cars can be powered by fossil fuels too.

The problem here is Tesla's Autopilot implementation. I think it is fair that
questions are asked.

------
cubaia
"That Tesla Autopilot had been safely used in over 100 million miles [...]
That contrasted against worldwide accident data, customers using Autopilot are
statistically safer than those not using it at all."

That is such a weak statistical claim that it borders on the disingenuous.

Previous discussion:
[https://news.ycombinator.com/item?id=12082893](https://news.ycombinator.com/item?id=12082893)

------
ghughes
Belligerent as usual. I wonder if Musk writes these himself.

edit: I'm being downvoted for this, but I wasn't using "belligerent"
negatively here; I was wondering aloud whether Tesla's characteristically
aggressive approach to damage control is the result of direct involvement from
Musk. Doesn't seem that crazy to imagine that it is.

------
alfredxing
Previous discussion here:
[https://news.ycombinator.com/item?id=12046444](https://news.ycombinator.com/item?id=12046444)

------
smegel
I don't get Tesla. What is so special about them? Why does everyone want one?
If the demand for electric vehicles is so high, why haven't all the big makers
already started offering electric versions of their little hot hatches? That
way you get an electric vehicle AND build quality based on decades and
decades of quality control engineering. Autopilot seems to be a curio rather
than a drawcard, but again, surely the big makers, with access to billions of
metric readouts from existing cars, would be best placed to develop AI to
control them. Is it the Elon Musk factor? Seems to be a strange reason to buy
a car. The fan factor might be justified for an Apple car... but I just don't
get the buzz and hype around Tesla.

~~~
__david__
Have you ever driven one? They are nice cars that don't burn fossil fuels. The
super-car level acceleration is a huge draw for some. The range is also more
or less unmatched, as far as I know. Their software upgrade policy makes it
feel like the car is always up-to-date with the latest stuff—something that I
can't ever picture other car makers ever doing. Also you don't have to deal
with car dealerships (and almost every encounter I've had with a car
dealership has been negative).

Are they perfect cars? No. But the company operates in a fundamentally
different manner than the rest of the auto industry, and that is exciting to
some.

~~~
evgen
They are nice cars that burn fossil fuels somewhere else, preferably places
where rich people and SV types do not live. As their PR spin after the recent
"autopilot" fatality shows, they are just another car company with much better
marketing.

~~~
ferbivore
Power plants are vastly more efficient than car engines.

~~~
evgen
I never said they were less efficient, but they do pollute. As for the ones
that do not use fossil fuels, they are few and far between at the moment. The
bulk of our electricity production is still gas and coal. Toronto may get a lot
of green electricity from Quebec hydro, but please do not claim that this is
anything but an outlier at the moment.

Electric cars are more efficient than ICE vehicles and they pollute less, but
they are not powered by rainbows and dreams of utopia. Zero emissions at the
tailpipe is nice, but don't try to claim that these cars are not following
that grand SV tradition of moving the negative externalities somewhere else so
that someone else can pay the cost.

------
taneq
I think their points here are valid, but I must admit that it's starting to
shake my confidence the way that, every time something bad happens, they
instantly respond with such strident defensiveness.

------
aerovistae
This is from 10 days ago. Why am I finding it at the top of HN _now_ ?

~~~
dayaz36
Because HN loves to bash on Tesla and since nothing negative has come out in
the last ten days they felt the need to re-discuss a two week old article

~~~
y4mi
Every comment section on HN about Tesla and Musk in general was extremely
positive until they started to cover up their mistakes. Seriously, these
mistakes can result in governments outlawing autonomous driving outright.

~~~
aerovistae
"Cover up their mistakes"? Which part of the fact that fewer accidents occur
with autopilot on than with it off is arguable for you?

~~~
muraiki
Tesla has not shown that that is the case, despite proclaiming so. I've
commented enough in this thread already, but see
[http://www.rand.org/pubs/research_reports/RR1478.html](http://www.rand.org/pubs/research_reports/RR1478.html)

I really would like to hear Tesla's response to this criticism of their
apparently flawed statistical analysis.

------
euske
Is there such a thing as a universally acknowledged definition of "car
safety"? When you start combining the words "statistically" and "safe", I feel
that the statement loses its scientific rigor. To me "safe" is a very vague
term, mostly used in a subjective context. It makes me wonder if their
assumption was really valid when they were testing this feature.

~~~
SideburnsOfDoom
> Is there such a thing as the universally acknowledged definition of "car
> safety"

you might try the "deaths per passenger mile" metric, or "deaths per million
passenger-miles".

On that metric, long-distance air travel is very safe, as one trip transports
hundreds of people a very long way, and motorcycles are at the other end of
the scale.

See:

[http://www.bustle.com/articles/83287-are-trains-safer-
than-p...](http://www.bustle.com/articles/83287-are-trains-safer-than-planes-
statistics-are-clear-about-which-mode-of-transportation-is-safest)

------
abalone
Labeling this a "statistical inevitability" obscures the issue. It seems clear
from numerous demos posted to YouTube that some autopilot users are very
comfortable taking their hands off the wheel. That's an outright violation of
the TOS. Yet, lots of people do it. Some erroneously refer to it as "hands
free" mode.[1] Some even _observe the fact that they're not supposed to do
it_ while doing it.[2]

The human factors here are tough. But safe design needs to account for human
factors. The enthusiast community seems especially prone to over-trusting the
autopilot, and that's something Tesla should be examining in their safeguards.

[1]
[https://www.youtube.com/watch?v=2geQ4hvvkNA](https://www.youtube.com/watch?v=2geQ4hvvkNA)

[2]
[https://www.youtube.com/watch?v=8H1qUhpjE5M](https://www.youtube.com/watch?v=8H1qUhpjE5M)

------
free2rhyme214
Once again this confirms Aaron Swartz was right about the news -
[http://www.aaronsw.com/weblog/hatethenews](http://www.aaronsw.com/weblog/hatethenews)

------
0xmohit
> there is no evidence to suggest that Autopilot was not operating as designed

Obviously, a dead man wouldn't be available to testify.

> That Tesla Autopilot had been safely used in over 100 million miles of
> driving by tens of thousands of customers worldwide, with zero confirmed
> fatalities and a wealth of internal data demonstrating safer, more
> predictable vehicle control performance when the system is properly used.

[https://en.wikipedia.org/wiki/Lies,_damned_lies,_and_statist...](https://en.wikipedia.org/wiki/Lies,_damned_lies,_and_statistics)

------
Theodores
> We self-insure against the risk of product liability claims, meaning that
> any product liability claims will have to be paid from company funds, not by
> insurance.

Why do they do this? I can understand it when government property is not
insured, e.g. the UK Civil Service, as the enterprise is so vast and general
taxation can fill the gaps. I can also understand that some things can't be
insured, e.g. nuclear power plants. But why does Tesla "vertically integrate"
insurance, particularly given that the product is statistically likely to kill
someone in due course?

~~~
morgante
Many large corporations self-insure. If you have the assets to pay out
expected claims, it makes sense to avoid paying premiums.

~~~
eru
And even if you don't have the assets, you can self-insure the first few
million USD, and use reinsurance for the rest.

That's what insurance companies do for risks with a long tail.

Think of it as buying car insurance with a very high excess. Those can be very
cheap.
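A minimal sketch of that layering (the attachment point and limit here are hypothetical numbers, and `split_loss` is my own illustrative helper, not real insurance-industry code):

```python
# Layered cover: self-insure losses up to an attachment point, and pass
# anything above it to a reinsurer, capped at the layer limit.
def split_loss(loss, attachment, limit):
    """Return (retained, reinsured) shares of a single loss."""
    retained = min(loss, attachment)
    reinsured = min(max(loss - attachment, 0), limit)
    return retained, reinsured

# e.g. self-insure the first $5M, reinsure the next $95M
print(split_loss(2_000_000, 5_000_000, 95_000_000))   # (2000000, 0)
print(split_loss(30_000_000, 5_000_000, 95_000_000))  # (5000000, 25000000)
```

The company absorbs every routine claim in full and only pays reinsurance premiums for the rare tail events, which is exactly the "high excess" trade described above.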

~~~
danieltillett
I have always wished that I, as an individual, could reinsure my long-tail
risk. I am happy to take on the risk I can afford to lose; just let me pass on
the risk I can't.

~~~
ISL
If you chat with the right insurance agent, that might be possible. While it
wouldn't go by the name 'reinsurance', look for a policy with a large
deductible and a large maximum payout.

~~~
Someone
I recently did for a sports club. It wasn't an option.

Thing is, insurers live by the law of large numbers
([https://en.wikipedia.org/wiki/Law_of_large_numbers](https://en.wikipedia.org/wiki/Law_of_large_numbers)).

Cutting away the high-probability, low-payout parts of the insurance decreases
the number of payouts significantly. That does mean variance in payouts goes up.

So, insurers will either need to find lots of new customers to get N up again,
or relatively high amounts of capital to survive those high payouts.

If they think they cannot find those customers, they need more capital. To
finance that, they need more income, which means charging you more, which
means fewer customers, which means charging you even more, etc.

I'm sure you could get that insurance through Lloyd's, but it wouldn't be cheap.
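A rough simulation of that law-of-large-numbers point (all claim sizes and frequencies invented for illustration): strip the frequent small claims from a book and what remains is dominated by rare large payouts, so the insurer's payout volatility per unit of expected payout rises.

```python
import random

random.seed(1)

def year_total(include_small):
    """Total payouts for one simulated year across 1000 policies."""
    total = 0.0
    for _ in range(1000):
        if include_small and random.random() < 0.20:
            total += 3_000            # common small claim
        if random.random() < 0.001:
            total += 1_000_000        # rare catastrophic claim
    return total

full = [year_total(True) for _ in range(500)]       # normal book
tail_only = [year_total(False) for _ in range(500)] # big excess: tail risk only

def cv(xs):
    """Coefficient of variation = stdev / mean, i.e. relative volatility."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / len(xs)
    return var ** 0.5 / m

print(f"Relative volatility, full book:      {cv(full):.2f}")
print(f"Relative volatility, tail-only book: {cv(tail_only):.2f}")
```

The tail-only book shows markedly higher relative volatility, which is why the insurer needs either many more such customers or more capital, and prices the policy accordingly.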

~~~
morgante
I'm fairly certain there are policies designed for high net worth individuals
who can cover the first million in losses themselves but don't want to be
exposed to long tail risk. I've definitely heard people talking about it,
particularly for real estate.

------
pastagawins
It could end up in historic litigation.

You know the story of the microwave and the cat inside: the old lady who just
wanted a quick way to dry her beloved pet.

It's the same thing with the guy driving his Tesla with Autopilot on. He just
believed in the marketing campaign.

------
known
Tesla's success depends on
[https://en.m.wikipedia.org/wiki/Energy_density#Energy_densit...](https://en.m.wikipedia.org/wiki/Energy_density#Energy_densities_of_common_energy_storage_materials)

------
XJOKOLAT
Of course, I sympathize with any casualties. However, this seems to be a
question of responsibility in my eyes.

If you're not aware that potential dangers still exist when you step into a
car, you shouldn't be driving the car (which is a shame as it's a fantastic
car).

Sorry. Tesla is not at fault here, however much people want to frame it that
way.

The Model S is not some magical car designed by aliens. It's a machine.
Problems may occur. We are not at the autonomous-vehicle stage yet. However,
Autopilot is a damned comfortable upgrade compared to the old cruise control.

I can't believe people are blaming Mr Musk or the marketing department for
people not taking responsibility or being careful when they get into a car, as
they should in any car, especially any car with autopilot-like capabilities.

------
Alexey_Nigin
While I agree that Tesla's article is not perfectly logical and its marketing
campaign is not impeccable, I would like to demonstrate that people at Tesla
Motors have a point.

1\. "STATISTICALLY SAFER" CUSTOMERS. Yes, this statement makes no sense. One
fatal crash is not a large enough sample size to draw such a conclusion.
However, this article was aimed not at Hacker News readers, but at average
buyers. Most of them do not have a firm grasp of high school math, so for them
"statistically safer" means just "don't worry." And indeed there are reasons
for them to worry, given that independent news agencies continually publish
hysterical things (It is “a wake-up call!” “Reassess” self-driving cars! The
crash “is raising safety concerns for everyone in Florida!” [1]). Tesla's
response was nothing but a necessary defence. Or did you expect them to say,
"You know, there is not enough data yet, so let's wait until 10 or so more
people die, and then we will draw conclusions"? That would be much more
logical, but I feel that customers wouldn't like it.

2\. WHY IT IS CALLED "AUTOPILOT." This is just marketing. They couldn't sell
it under the name "The Beta Version Of The System That Keeps Your Vehicle In
Lane As Long As You Keep Your Hands On The Steering Wheel And Are Ready To
Regain Control At Any Moment™." And honestly, I do not think that even
relatively stupid customers will just press the button and hope for the best
without reading what the Autopilot is all about in advance.

In my opinion, it is now a difficult time for Tesla, and we should not
criticise it for trying to stay afloat.

[1] [http://www.vanityfair.com/news/2016/07/how-the-media-
screwed...](http://www.vanityfair.com/news/2016/07/how-the-media-screwed-up-
the-fatal-tesla-accident)

EDIT: You might think that the phrase "trying to stay afloat" is unnecessary
pathos, since a single crash, even coupled with a bunch of nonsense news
articles, cannot lead to anything serious. However, history shows it can. In
2000, Concorde crashed during takeoff, killing everyone on board [2]. The
event was caused by metal debris on the runway, not by some problem with the
plane itself. Nevertheless, Concorde lost its reputation as one of the safest
planes in the world. Passenger numbers plummeted, and Concorde retired
three years later. That crash is the number one reason why it now takes so
many hours to get from Europe to America.

[2]
[https://en.wikipedia.org/wiki/Air_France_Flight_4590](https://en.wikipedia.org/wiki/Air_France_Flight_4590)

------
calgaryeng
Sue them for libel then?

------
dayaz36
This is two weeks old and is on the front page of HN for the first time on a
Saturday night... a little late to the discussion.

