
Misfortune - runesoerensen
https://www.teslamotors.com/blog/misfortune
======
chollida1
If you've never used Tesla's Autopilot, it's a weird feeling.

It's not a fully self-driving car, and it's obviously not strictly human
controlled. Unfortunately, it ends up being more mentally taxing to use this
hybrid approach than to just drive yourself.

Consider highway driving. In normal, human-powered mode you are in full
control, so if you see brake lights half a kilometer up ahead you can
disengage cruise control and react on your own; everyone who has driven is
comfortable with this.

With this assisted driving, the car doesn't slow down right away, and it's not
clear whether the car just can't see the tail lights lighting up yet or
whether it has decided it doesn't need to react yet, so you start to
second-guess the car:

\- Should I drive, or should I leave it to the car?

\- What if it turns when I grab the wheel? Will I make things worse by
driving?

\- Does the car even see the object up ahead? How can I tell? It's impossible
to expect the car to warn you about an object that it can't even see.

It just becomes more taxing to use the hybrid approach to driving, and so I
don't use it at all. I have no doubt that the Tesla Autopilot is safer than
driving unassisted, but I also have no doubt that it can royally screw up.

I've come to the conclusion that some assisted driving, like automatic braking
for obstacles you are about to hit, is good, but the kind of assisted driving
where the car can almost drive for you is not anywhere near ready. If you
follow Tesla's rules on how to use it, it actually makes driving harder :(

~~~
eclipxe
I have a completely different opinion. What do you have your following
distance set to? I have mine set to 6 or 7 and it starts slowing down well
ahead of when I would slow down.

I think you're overthinking things - just let it do its thing, but be ready
to resume control if needed. I drive 90% of the time on Autopilot, and it
makes driving exponentially more relaxing and safer.

~~~
argonaut
Yes, but that's the problem. Just "letting it do its thing" and
relaxing/trusting/ _expecting_ it to "do the right thing" is what leads to
accidents like this.

And if you trust Autopilot to do the right thing, and get into a fatal
accident, Tesla will say you should have been paying attention!

~~~
sethjgore
Isn't that similar to saying horseless carriages cannot be as good as horses?
Our experience of driving is never compared to the experience of riding in a
carriage or even harnessing a horse. Driving is considered a discrete
experience in its own class. Perhaps our fears and expectations of mechanical
error ignore our overconfidence in our own driving. Are we always in control
of our attention? Do we always make the most accurate observations as we
drive?

Why do we treat the experience of being driven by Autopilot as something
comparable to the actual action of human, hand-feet-sight-controlled driving
(steering wheel, acceleration/deceleration, dashboard/windshield/mirrors)?

We cannot trust the horse to do the same things as the car, nor can we trust
the car to be the same as the horse.

We are on the cusp of moving responsibility from human-engineered transport
(the car) to machine-engineered transport (Autopilot), a transfer that
previously went from animal to human-engineered transport (horse to car).

We will believe what establishes our worldview and ignore what demolishes the
same foundations.

~~~
argonaut
No, it's not the same. Not sure why you think that analogy even makes sense.

------
thesimon
Context is probably [http://fortune.com/2016/07/06/tesla-autopilot-crash-
material...](http://fortune.com/2016/07/06/tesla-autopilot-crash-material-ceo-
elon-musk/)

SEC statements usually cover a lot of ground, so I doubt it's very
surprising.

But I'm still not convinced by their argument about the safety of
Autopilot.

>That contrasted against worldwide accident data, customers using Autopilot
are statistically safer than those not using it at all.

I can only cite German data [0], but on the Autobahn there is a death every 90
million miles driven. And that includes drunk drivers, old cars, old drivers,
etc. Tesla's Autopilot being safe for over 100 million miles doesn't really
sound that much better, especially considering the smaller sample size (see
the rough sketch below). And in this case, the death could probably have been
avoided by not using Autopilot.

[0]:
[https://www.adac.de/_mmm/pdf/statistik_4_5_Unfallgeschehen_S...](https://www.adac.de/_mmm/pdf/statistik_4_5_Unfallgeschehen_Strassenarten_42780.pdf)
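
To make the sample-size point concrete, here is a rough back-of-envelope
sketch (my own, not from Tesla or the article) of how wide the statistical
uncertainty is around a single fatality in roughly 100 million miles. It
assumes Python with scipy and uses an exact Poisson confidence interval; the
100-million-mile exposure and the one-death-per-90-million-miles Autobahn
figure are simply the numbers quoted above.

    # Back-of-envelope: uncertainty around ~1 fatality in ~100 million miles.
    from scipy.stats import chi2

    autopilot_miles = 100e6   # assumed exposure ("over 100 million miles")
    deaths = 1                # the single fatality under discussion

    # Exact 95% Poisson confidence interval on the number of deaths
    lo = chi2.ppf(0.025, 2 * deaths) / 2
    hi = chi2.ppf(0.975, 2 * (deaths + 1)) / 2

    # Convert to a rate and report as "miles per death"
    rate_lo, rate_hi = lo / autopilot_miles, hi / autopilot_miles
    print(f"95% CI: one death per {1 / rate_hi:,.0f} to {1 / rate_lo:,.0f} miles")

    autobahn_rate = 1 / 90e6  # deaths per mile on the Autobahn, as cited above
    print("Autobahn rate inside the CI:", rate_lo <= autobahn_rate <= rate_hi)

With only one event, the interval spans roughly one death per 18 million miles
up to one per 4 billion miles, so it cannot distinguish Autopilot from the
Autobahn baseline in either direction.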

~~~
jsprogrammer
Tesla itself claims knowledge of a statistical inevitability that a collision
will occur while Autopilot is engaged.

Some questions now are: What did Tesla do to warn of this inevitability? What
did they do to mitigate it? What are they doing to mitigate it? Is there any
way for users to reduce their chance of being a count in the collision
statistic, other than reading and following all relevant disclaimers and
warnings?

~~~
eclipxe
Yeah, they can pay attention, even while Autopilot is engaged.

------
suprgeek
Note that another possible Autopilot Crash is now being investigated:
[http://electrek.co/2016/07/06/nhtsa-probing-tesla-model-x-
ro...](http://electrek.co/2016/07/06/nhtsa-probing-tesla-model-x-rollover-
accident-pa-autopilot-involve-tesla-updates-statement/)

As I have noted before:
[https://news.ycombinator.com/reply?id=12012688&goto=item%3Fi...](https://news.ycombinator.com/reply?id=12012688&goto=item%3Fid%3D12011419%2312012688)

Tesla really needs to tread carefully here to avoid giving self-driving tech
as a whole a bad name.

You CANNOT "move fast and break things" in this particular case.

~~~
cocotino
>Tesla really needs to tread carefully here to avoid giving self-driving tech
as a whole a bad name.

What does it matter if it cannot be proven to be better than regular driving?

~~~
jsprogrammer
Valuations of companies in the space may drop unless they can provide an
alternative prospectus.

~~~
cocotino
As someone who owns no stock, I don't care about that. I thought we were
talking about advancing mankind and that kind of thing...

~~~
Aelinsaar
Talk is just talk, follow the money.

------
vessenes
Tesla is surprisingly aggressive with media organizations. I think it's one of
their hallmarks actually -- I'm not sure why they are this way, but I would
guess that they have over the years hardened up in response to one-sided press
(I'm thinking of the NYT article claiming their car died and Top Gear in
particular).

At any rate, it's refreshing to read rebuttals to journalists. My own
experience is that it is very rare that journalists both want to get a story
right and have time to get it right; the vast majority of modern 'news' is
written by glorified bloggers on a deadline.

~~~
poof131
Personally, I think Tesla and Musk should shut up. This bantering back at the
media might have been “new” and “cool” with other topics, but not when it
involves somebody’s death. While it’s fine to respond to factual inaccuracies,
the catchy headlines (“Misfortune”, “A Tragic Loss”) and the condescending,
confrontational tone are totally out of place. Stick to the facts. Talk about
how you take this very seriously, are going to look into it further, but do
believe Autopilot is safe.

After a decade in the military, where bad things happen frequently, I couldn’t
imagine public relations acting in this manner: "We bombed the wrong house,
but this happens, it's war, the media just doesn't understand". Don’t give
people the impression you care more about your business than their lives.
Don’t dismiss someone’s death as a statistic.

I want Tesla to succeed, but they are starting to leave a bad taste in my
mouth. I remember my XO saying, “never get into a shouting match with an
idiot, bystanders can’t tell the difference.” Ignore the media if you can’t
respond appropriately. The real misfortune here is that somebody died in what
was probably a preventable accident, not a poorly sourced Fortune article.

~~~
etendue
Wanted to say that, in addition to agreeing with what you wrote here, I've
noticed myself frequently nodding in agreement with your other nicely worded
and considered comments.

I prefer your version, but I would like to share a slightly different
formulation of that aphorism that is equally amusing: never argue with idiots,
they'll bring you down to their level and then beat you with experience.

------
iamleppert
The "worldwide accident deaths" numbers he quotes and the "do the math"
framing make for a totally disingenuous argument. It's nowhere near an
apples-to-apples comparison.

The fact of the matter is, a Tesla driver is not your regular driver. A better
test would be to give everyone with a Honda Civic some kind of autopilot.

The data they do have is a curiosity at best and cannot be compared vis-à-vis
regular Joe Blow accident data.

~~~
sverige
Wait, what? Do you mean that buying a Tesla makes you a good driver, or the
kind of driver who watches Harry Potter movies while tooling down the highway?

------
joshdickson
These comments are absurd.

Fortune did not "assume that Tesla had complete information from the moment
this accident occurred." Fortune merely noted that there was a long period of
time between the time of the accident and when the information was disclosed.
Fortune is not saying that Tesla _did_ have knowledge of what happened, it's
saying that Tesla _should_ have had knowledge. And Fortune is saying that
Tesla should have investigated more quickly _especially_ considering the
impending stock sale. Perhaps they did not because Tesla feels as though the
death is not "material." Musk's logic on whether the death was material is so
convoluted it defies any sort of logical counterargument, and is ultimately
for the courts to decide.

Regarding whether or not the accident was an Autopilot failure: of course,
_technically_ it appears that the truck should not have turned, but this is
what happens in normal traffic. We do not design autonomous driving systems
with the idea that nobody ever breaks driving rules or laws; rather, we design
them with the idea that every crazy situation we can think of, and a lot that
we cannot, is likely to happen, and we should attempt to avoid or lessen the
impact of a crash whenever possible. When Musk says that there is "no
evidence to suggest that Autopilot was not operating as designed and as
described to users," I seem to have missed the marketing material where they
say that the features work only when other drivers follow traffic laws. It's
one thing if a construction crane drops on top of the vehicle (which cannot be
reasonably avoided); it's another when a truck crosses in front of you (where
we can take various evasive measures to lessen the impact of the crash even if
we cannot entirely prevent it, none of which were taken in this case).

Regarding vehicle accident statistics: as someone who worked in active safety
for several years (i.e., systems like radar-based crash avoidance that did not
trigger in this case), when we look at these numbers to make comparisons, the
entire idea as of 2016 is that when you combine a human driver with a lot of
electronic help, you are far safer than driving without that assistance or
than if the electronics were driving for you. But Musk only compares
Autopilot, which is only usable in certain scenarios, with the entirety of
vehicular death data (which, by the way, largely comes from much cheaper,
older cars with worse safety systems that lead to more deaths). We know for a
fact that Autopilot lulls drivers into becoming distracted and doing other
things, taking their eyes off the road, etc., because it does not require
drivers to keep their hands on the steering wheel (see the NYTimes report
today). The simple, obvious fact of the matter is that a Tesla would be far
safer if it were using all of its accident avoidance technology _and_ required
the driver to be engaged with the steering wheel. Yet it does not.

As someone with experience with this technology (though not at/with Tesla), I
would greatly like to see them push a software update that requires you to
have your hands on the wheel while Autopilot is engaged. Saying that you
should do so is not enough when the tools to enforce it are there. It is clear
and obvious that this would make the technology even safer. Yet rather than
accept even the tiniest shred of responsibility, Tesla has entirely closed
ranks and blamed everyone else. I, for one, am disappointed in Tesla as a
whole and Elon Musk in particular.

~~~
omarforgotpwd
Tesla is pretty clear with users that Autopilot is little more than an
advanced cruise control feature and that you can't rely on it. The whole
Autopilot name is just marketing. It is really no different than a Mercedes
with the same features getting into a crash. As these systems are widely
deployed, people will continue to die. What's important is that the rate of
deaths per million miles is lower than with Autopilot off, and that's already
true today.

~~~
joshdickson
I worked on the Mercedes system and have driven it for thousands of miles. How
the systems approach enforcement is entirely different. If you remove your
hands from the wheel for more than a second or two, the system turns off. I
can watch dozens of videos of people with their hands off the wheel of their
Tesla while speaking into a camera on YouTube. I consider that very different.

~~~
omarforgotpwd
Perhaps but at some point software has to let people take their hands off the
wheel. It's very hard to say when the software is "good enough" to let people
do that but it's a leap we need to make to get to where we're going.

~~~
studentrob
> at some point software has to let people take their hands off the wheel

At that point, drivers shouldn't be expected to put their hands back on the
wheel. They should be in the backseat, which is Google's plan.

There isn't time to put your hands back on the wheel in the event of a car
accident. Planes get minutes to react to errors; cars get seconds. Car
drivers are more akin to train conductors who, these days, have complex
dead man's switches that ensure the conductor is still focused on driving
the train.

