
Autopilot not on during crash in PA, Musk says it would have prevented accident - maratd
http://electrek.co/2016/07/14/autopilot-tesla-model-x-crash-pa-elon-prevented-accident/
======
osteele
I wonder if the driver thought the autopilot was on. A few times I've thought
I'd enabled it, only to discover I hadn't when I noticed the car starting to
drift toward the edge of the lane. It's an especially easy mistake to make
when traffic-aware cruise control is on, since it provides some degree of
automation but less than you think.

If you're using the autopilot to provide smoother control, and as a second set
of eyes on the road, then thinking it's on when it's off isn't of much
consequence. There's been speculation that the May fatality involved using
autopilot as a _replacement_ for real-time human oversight. Whether this
driver had the autopilot on or off, the root cause – thinking that the
presence of the autopilot makes it safe to ignore the road – was the same. If
the autopilot was off and the driver thought it was on, though, then one of
the suggested solutions – requiring hands on the wheel – might not have
addressed this case. Or maybe any cases: I imagine someone who thinks the
autopilot relieves them of responsibility is just going to rest a hand on the
wheel while they zone out anyway.

~~~
niftich
> If you're using the autopilot to provide smoother control, and as a second
> set of eyes on the road, then thinking it's on when it's off isn't of much
> consequence

I understand the sentiment behind this statement, but I'm not sure I agree.

Compare with simple cruise control -- ordinarily, the driver needs to provide
constant gas pedal input to overcome inertia at highway speeds. If for a
moment the driver thinks that cruise control is on but it's actually off, the
car will begin to appreciably coast, making it clear that it's off. The
'failure mode' of this scenario is rarely surprising or dangerous to the
driver or others, and easy to recover from.

Whereas features like automatic lanekeeping, blind-spot warning, and pre-
collision emergency braking... those are features that _need_ to work unless
explicitly disabled, because the failure modes all end in a potential accident.

~~~
foxyv
There is more than one failure mode for cruise control.

E.g.:

* Cruise control leaves the throttle fully open.

* Cruise control fails to disengage.

* Cruise control compression-brakes on ice, causing skidding.

* Cruise control activates without the driver's knowledge.

For an example see here: [http://www.abc.net.au/news/2009-12-16/driver-recalls-freeway...](http://www.abc.net.au/news/2009-12-16/driver-recalls-freeway-cruise-control-horror/1180688)

------
salgernon
It's probably allowed in the T&C of owning a Tesla, but is anyone else
bothered by Musk calling out details of log files that "normals" probably
aren't aware of?

Should people expect any privacy with regard to this data? Maybe just share it
with the NTSB and let them issue the statement that this accident was
entirely not Tesla's fault?

~~~
cperciva
_It's probably allowed in the T&C of owning a Tesla, but is anyone else
bothered by Musk calling out details of log files that "normals" probably
aren't aware of?_

It's quite common in investigations of plane crashes that the aircraft
manufacturer is involved in the investigation and makes public statements
(usually to the effect of "we've ruled out any problems with our plane... must
be those darn pilots again"). The situation with cars is a bit different in
that it's (usually) private individuals rather than professionals, but I'd say
that the track records of car safety vs. aircraft safety definitely point
towards following the model used for air crash investigations.

------
ricw
At last some "positive news" with regard to autopilot. As much as I love the
feature and its progressiveness, I'm afraid the authorities might shut it
down because of public safety concerns. That might have some truth to it in
the short term, though in the long run it will almost certainly be superior to
the average human safety record.

On another note, it's interesting how often the autopilot is blamed despite
not being on. Thankfully that can easily be verified. The damage has been
done nonetheless.

~~~
slaunchwise
>> I'm afraid the authorities might shut it down because of public safety
concerns

Yup. Dead people are certainly a public safety concern. Pesky dead people!

~~~
imgabe
Like the 30,000+ dead people from car accidents caused by human driver error
every year?

------
boznz
Logs are a great thing. However, I've had situations where, after pointing out
someone's error (and obvious lie), I was required to show auditors how the
data could not have been tampered with (difficult because, let's face it, it
could have been).

I wonder how that works in these situations; obviously a good/dodgy lawyer
could easily spin the record as having been tampered with, so I'd want it rock
solid.
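For what it's worth, the usual answer is to make logs tamper-evident rather than tamper-proof, e.g. by chaining each entry's MAC to the previous one so that altering any earlier record invalidates everything after it. A minimal sketch (the function names and key handling here are illustrative assumptions, not anything Tesla actually does):

```python
import hashlib
import hmac

SECRET_KEY = b"example-key"  # assumption: the logger holds a private signing key

def append_entry(chain, message):
    """Append a log entry whose MAC chains to the previous entry's MAC."""
    prev_mac = chain[-1][1] if chain else b"genesis"
    mac = hmac.new(SECRET_KEY, prev_mac + message.encode(), hashlib.sha256).digest()
    chain.append((message, mac))

def verify_chain(chain):
    """Recompute every MAC in order; returns False if any entry was altered."""
    prev_mac = b"genesis"
    for message, mac in chain:
        expected = hmac.new(SECRET_KEY, prev_mac + message.encode(), hashlib.sha256).digest()
        if not hmac.compare_digest(mac, expected):
            return False
        prev_mac = mac
    return True

log = []
append_entry(log, "12:01 autopilot engaged")
append_entry(log, "12:02 brake pedal pressed")
print(verify_chain(log))  # True: chain intact
log[0] = ("12:01 autopilot OFF", log[0][1])
print(verify_chain(log))  # False: tampering detected
```

This only proves tampering to someone who trusts the key holder, of course; if the key holder (here, the manufacturer) is the suspected tamperer, you'd want the chain anchored somewhere out of their control.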

~~~
jrockway
That's why we have trials, right? If the facts were always perfectly accurate,
there would be no work for juries.

In the real world... I imagine that the "dodgy lawyer" would provide a theory
as to why the data is wrong, and Tesla's lawyer would provide a theory as to
why it's right, and then it's up to the jury to decide what's true and not
true.

------
Animats
Remember, a touch to the brake disables Tesla's "autopilot". Looking forward
to the NTSB report.

Where is this data from? It would seem to require physical access to the
vehicle. Was Tesla given access to a wrecked vehicle that's the subject of
NHTSA and NTSB investigations?

~~~
williamscales
Surely they must stream the logs back to Tesla as they're taken.

~~~
greglindahl
Things like "Car just had an accident" get sent immediately -- Tesla will call
the owner and/or 911 -- but it appears that most logs accumulate on the car or
are downloaded when the car is attached to WiFi.

------
mjevans
I didn't look up the exact tweet; however, based on the precise language of
the linked article, it's possible that the following occurred:

* Driver engaged the autopilot (assist)

* Driver kept hands off the wheel long enough for the Tesla to disengage the autopilot.

* Driver may have instinctively reacted by applying acceleration to stay at speed, without further reacting.

* Vehicle crashed due to lack of steering control; autopilot was not engaged, and the driver was reckless for not maintaining control and situational awareness of the vehicle.

If I needed to get a new vehicle today (and there wasn't a waiting list), I
think I'd probably get a Tesla; however, I don't think I'd use the automation,
as I agree with the Google self-driving car stance on this form of technology.
For that same reason I don't even like contemporary cruise control.

~~~
osteele
When the Tesla autopilot disengages because it doesn't detect your hands on
the wheel, it takes a good many seconds, with increasingly obnoxious
dashboard and audio warnings. It's hard to miss if you're paying any attention
at all.

Is “contemporary cruise control” constant-speed, or traffic aware? Having
driven tens of thousands of miles with each, I feel that the first increases
the chance of an accident and the second decreases it (unless you let it lull
you into ignoring the road). Tesla's cruise control has some blind spots
(going around a curve into stopped traffic), but they're complementary to my
own blind spots (the car in front suddenly decelerates; zoning out in stop-
and-go traffic); my subjective impression is that the two of us together are
much safer.

------
cperciva
I wonder if we'll ever see Tesla's autopilot defaulting to a "watching and
ready" mode, whereby it detects signs of an inattentive driver (hands not on
the steering wheel, drifting within the lane, etc.) and _turns itself on_ to
bring the car to a safe stop.

~~~
Washuu
That would be amazing for those who were having a sudden unexpected medical
issue.

~~~
cperciva
That's definitely the extreme case of "inattentive driver". There are
different tradeoffs though -- if the driver is having a heart attack, you want
to pull over and signal the emergency ASAP, whereas if the driver is merely
distracted it's probably better to only take over when there's a danger to be
averted.

------
nashashmi
Seems like he intended to slam on the brakes and instead slammed on the
accelerator. And he turned the wheel left to avoid the obstacle.

If autopilot auto disengages, it is not the fault of the driver, but the fault
of the design.

I wonder what we need more of ... better autopilot design, or better pilot
training.

