
An Update on Last Week’s Accident - runesoerensen
https://www.tesla.com/blog/update-last-week%E2%80%99s-accident
======
jVinc
I really hate the whole Tesla angle of "technically we are not lying but we
know people are going to misremember and misrepresent what we are writing".

Take for instance:

> The driver had received several visual and one audible hands-on warning
> earlier in the drive...

What this means is that during this incident there were _no visual or audible
cues that a crash was about to happen_.

What they are saying is that while he was driving there was an earlier point,
at which the car did not crash, where the car gave visual and audible cues.
And you have to ask: why the hell is that relevant? It isn't. They are stating
it as a fact because they know some people will incorrectly claim that the car
warned the driver prior to the crash.

In their eagerness to explain what happened, why aren't they talking about
what actually went wrong? Why did the car drive straight into a barrier? If
their claim is that it was caused by the driver not having his hands on the
wheel for 5 seconds, then they need to fold the company and give up trying to
create self-driving cars. That is _not_ acceptable for a self-driving system.

The fact that the barrier was damaged intensified the accident, but that does
not in any way excuse their system driving straight into it. And no, you can't
get out of culpability by claiming statistical superiority. That's like a
gang member trying to get out of jail after killing a rival gang member
because his gang statistically kills fewer people. Tesla has put a product in
the hands of consumers; when it kills those consumers they need to step up to
the plate and be honest about their fuckups, not just blame the drivers, blame
the infrastructure and point to statistics.

~~~
sweden
It's widely known that RADAR systems are not able to detect stationary
objects, only moving ones. Actually, they are able to detect them, they just
can't tell what they are, so a barrier in the middle of the road and a traffic
sign on the side of the street would look the same to the RADAR system.

But to be honest, I am more worried about the markings on the road than about
the autopilot's inability to foresee the accident:
[https://imgur.com/a/hAeQI](https://imgur.com/a/hAeQI)

What's wrong with the US road administration? Why does this even look like a
driving lane? Where are the obvious markings? It's a very misleading road
layout; I am curious how many accidents happen there every year.

This is how I expect this kind of thing to look:
[https://i.imgur.com/dfZehmd.gif](https://i.imgur.com/dfZehmd.gif)

Given how the road looks, it makes more sense why Tesla is reinforcing
the fact that the driver wasn't paying attention to the road.

Edit: Since people are curious about the limitations of the RADAR, the manual
of the car mentions this limitation:

"Traffic-Aware Cruise Control cannot detect all objects and may not
brake/decelerate for stationary vehicles, especially in situations when you
are driving over 50 mph (80 km/h) and a vehicle you are following moves out of
your driving path and a stationary vehicle or object is in front of you
instead."

You can also read here that Volvo's system faces the same problem:
[https://www.wired.com/story/tesla-autopilot-why-crash-
radar/](https://www.wired.com/story/tesla-autopilot-why-crash-radar/)

~~~
bufferoverflow
> _It's widely known that RADAR systems are not able to detect stationary
> objects, only moving ones. Actually, they are able to detect them, they
> just can't tell what they are_

That doesn't make any sense. Radar systems do detect stationary objects just
fine. In fact, from the point of view of the moving car, almost nothing is
stationary.

I wouldn't be surprised if you're correct about the markings on the road. I
almost crashed into a temporary barrier driving at night, because there were
two sets of lane markings - the old ones, and the ones going around the
barrier. Construction workers simply didn't bother to erase the old ones.

~~~
sweden
It is mentioned in the manual of the car:

"Traffic-Aware Cruise Control cannot detect all objects and may not
brake/decelerate for stationary vehicles, especially in situations when you
are driving over 50 mph (80 km/h) and a vehicle you are following moves out of
your driving path and a stationary vehicle or object is in front of you
instead."

You can also read here that Volvo's system faces the same problem:
[https://www.wired.com/story/tesla-autopilot-why-crash-
radar/](https://www.wired.com/story/tesla-autopilot-why-crash-radar/)

~~~
saalweachter
This is not a RADAR problem, it’s a world modeling problem.

So you’re sending out pulses and listening for echoes, which tells you how far
away something is in a particular direction. You correlate subsequent pulses
to say whether the object is moving toward you or away. If you have a car 50 m
ahead of you every time you ping, everything is good. Now that car suddenly
swerves around a car stopped in front of it, and your ping off that object
says 60 m. A crash is less likely, your model thinks! The object in front of
you is rapidly speeding away! By the time it realizes it isn’t, boom, crash.
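
Here's a minimal sketch of that failure mode (a toy, single-return tracker
with made-up numbers, not any real automotive stack):

    # Toy tracker: ego car at 30 m/s, lead car pacing us 50 m ahead.
    # At t=3 the lead swerves away, revealing a stopped car at 60 m.
    ranges = [50.0, 50.0, 50.0, 60.0, 30.0]  # one radar return per second
    prev = None
    for t, r in enumerate(ranges):
        if prev is not None:
            range_rate = r - prev  # m/s; positive = "object pulling away"
            if range_rate >= 0:
                print(f"t={t}s range={r}m: receding or steady, no threat")
            else:
                ttc = r / -range_rate  # naive time-to-collision
                print(f"t={t}s range={r}m: closing, TTC ~{ttc:.1f}s")
        prev = r

The jump from 50 m to 60 m reads as "threat receding"; only the next return
(60 m down to 30 m) reveals a closing object, with about one second of
time-to-collision left.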

~~~
adamlett
That’s not how it works, in my limited understanding. Radars don’t just detect
the positions of objects, but also their relative speed. For this, they take
advantage of the Doppler effect. When a pulse gets reflected, its frequency
rises if the object is moving towards the radar, and falls if it’s moving
away.
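
As a rough sketch of the numbers involved (the carrier frequency is an
assumption, typical of automotive radar):

    # Doppler shift of a radar return: f_shift = 2 * v_rel * f_carrier / c
    # (factor of 2 because the wave travels out and back).
    C = 3.0e8           # speed of light, m/s
    F_CARRIER = 77e9    # common automotive radar band, Hz (assumed)

    def doppler_shift_hz(v_rel_mps: float) -> float:
        """Positive v_rel = closing on the object; returns shift in Hz."""
        return 2.0 * v_rel_mps * F_CARRIER / C

    # A stationary barrier approached at 30 m/s shifts the return by
    # ~15.4 kHz - easily measurable, which suggests the failure is in
    # what the tracking software ignores, not in the radar itself.
    print(doppler_shift_hz(30.0))  # ~15400.0 Hz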

~~~
saalweachter
You could be right, either in this case or other cases. I'm entirely inferring
what autos are doing based on their failure case (not detecting stationary
cars when the car in front of you swerves around them); that sounds a helluva
lot like mistaking the new car for the old.

Again, not saying it's a limitation of RADAR; it sounds like a deficiency in
the way they're using it.

------
obeattie
To me there is a stunning lack of compassion and decency in the response to
these incidents by both Tesla and Uber.

Uber made sure to point out that the victim of their incident was homeless.
Tesla is pointing out how the driver received unrelated cues earlier in the
journey. None of this information is relevant. They’re trying to bamboozle the
reader in an effort to improve their image at the expense of victims who
can’t defend themselves.

I don’t understand why it is so impossible for these companies to act humbly
and with a sense of dignity around all this. I don’t expect them to accept
responsibility if indeed their technology was not to blame, but frankly that
isn’t for them to decide. Until the authorities have done their jobs, why not
show remorse, regardless of culpability, as any decent human would?

~~~
zeroxfe
I agree with the lack of compassion bit -- I think the messaging could have
been far more empathetic. However...

> Uber made sure to point out that the victim of their incident was homeless.
> Tesla is pointing out how the driver received unrelated cues earlier in the
> journey. None of this information is relevant.

I don't understand how you can equate those first two lines. Uber's
observation is clearly irrelevant, but the fact that the Tesla driver received
multiple "get your hands back on the wheel" notifications, as close as six
seconds before the accident, seems very relevant to me.

~~~
obeattie
> the fact that the Tesla driver received multiple "get your hands back on the
> wheel" notifications, as close as six seconds before the accident

But that isn't what their statement says. It says the victim had his hands off
the wheel for six seconds before the crash, and that he received hands-on
warnings "earlier in the drive." It does not say that during those crucial six
seconds he was being warned. Nor does it explain the fact that the car plowed
into a barrier.

~~~
irq11
_”Nor does it explain the fact that the car plowed into a barrier.”_

This is critical. No matter how many notices you give, slamming into something
at speed is the wrong answer. It’s almost unbelievable that they’re trying to
use that as an excuse.

~~~
heliodor
I'm finding it hard to believe the wording of this crucial sentence was so
misleading by accident. Upon first reading it, the wording seemed a little
off, and it seemed to me that the two statements should have been two separate
sentences instead.

~~~
irq11
Even then, I fail to see how it’s relevant information.

I’m supposed to trust an “autopilot” that warns me six seconds before it slams
me into a wall?

~~~
wilun
In their next statement about the next person they kill: 'The Autopilot
clearly both stated in vocal synthesis and by displaying on the internal
screen: "WARNING - I'll attempt to kill you in six seconds". Yet the driver
did not manage to regain control against the rogue AI. He is clearly at fault
for having died, and his family should be fined to recover the cost of the
investigation.' :P

------
andrewstuart2
Really not much additional info here aside from the number of warnings the
driver had. Also, the standard "Tesla is 10x safer" metrics that get pulled
out each time a crash gets sensationalized.

What I think they fail to address, especially in this case, is that the
autopilot did something a human driver who was paying attention would never
do. Autopilot does a great job of saving people from things even a wary driver
would miss, much less a negligent one, but the fatal accidents that show up in
the statistics are not from people who are fully watching the road missing the
fact that there's a concrete barrier with yellow safety markings directly in
their path, and hitting it head on for no good reason (e.g. evasive action
because of another driver, or a stupid last-minute "oh crap, that's my
exit").

I want autopilot to succeed, and I want Tesla (and Musk) to succeed, and for
the sake of their public image they have to realize that this isn't an average
accident statistic, a lapse in attention or evasive maneuvering. It's a car
that seemingly plowed right into a concrete barrier while still under complete
control. That's not a mistake a healthy human would make.

~~~
hedora
> the autopilot did something a human driver who was paying attention would
> never do.

Then how did the crushed barrier get crushed _before_ the Tesla hit it?
Clearly, the stretch of road is unsafe enough to trick human drivers (and,
clearly, Tesla should improve).

~~~
andrewstuart2
I'm obviously not privy to the details of the prior crash, but I'm pretty sure
any healthy human would not _deliberately_ drive into the barrier.

The most likely scenarios I can think of would be not paying attention and
drifting into the barrier, attempting to avoid a car merging into the lane
(and not paying enough attention), or being struck by another car and being
forced into the barrier.

~~~
Robotbeat
I think the answer is that the path into the barrier looks just like an actual
lane, and that this was enough confusion that a human driver apparently had
made the same exact mistake a week earlier.
[https://imgur.com/a/iMY1x](https://imgur.com/a/iMY1x)

[https://techcrunch.com/wp-content/uploads/2018/03/screen-
sho...](https://techcrunch.com/wp-content/uploads/2018/03/screen-
shot-2018-03-27-at-7-51-49-pm.png)

~~~
praptak
This divider looks rather unforgiving. Is it common for dividers in the US to
be just plain concrete blocks without anything to prevent rapid deceleration?

~~~
voxadam
If you look at the imgur link you'll see an impact barrier that had previously
been crushed and not reset or repaired.

~~~
yani
I don't see it.

~~~
tstrimple
It's in the techcrunch image, not the imgur. The barrier with the yellow face
is designed to collapse.

[https://techcrunch.com/wp-content/uploads/2018/03/screen-
sho...](https://techcrunch.com/wp-content/uploads/2018/03/screen-
shot-2018-03-27-at-7-51-49-pm.png)

~~~
arethuza
That still looks pretty risky to me - I compared a similar junction here in
the UK near to where I live (on the M90 in Fife) and it has about 40 impact
attenuation barrels in a triangle.

~~~
sgc
Those are also present in many locations in the US, although they seem to be
phasing them out so I think they are considered old tech. I don't know the
performance specs, but new cars are a lot smaller and safer than old cars -
the same is probably true here too.

------
oldgradstudent
In keeping with the "Autopilot" terminology, this was "Controlled Flight into
Terrain".

Tesla demonstrates their usual abuse of statistics. 1.25 fatalities per 100
million miles is the average across all types of terrain, conditions, and
cars.

The death rate on rural roads is much higher than in urban areas. The death
rate on highways is much lower than average. The death rate with new cars is
much lower than average.

The autopilot will not engage in the most dangerous conditions. This alone
will improve the fatality rate, even if the driver does all the work.

Tesla cars are modern luxury cars. They appear to have done a great job
building a well-constructed, safe car. This does not mean their autopilot is
not dangerous.

~~~
URSpider94
Note: Tesla is comparing the fatality rate of autopilot-equipped cars with the
average accident rate. They are NOT only counting miles where autopilot is
used, they are counting all miles driven by their cars. It could be that Tesla
drivers are inherently safer than the general population, or that the cars
themselves are much safer. What they are not doing (at least with the data) is
making a statistical claim about autopilot safety.

~~~
oldgradstudent
That's a very good point. But it raises a new question: why didn't Tesla
compare the safety record of Teslas with Autopilot against Teslas without
Autopilot?

~~~
ricardobeat
They have done so. As mentioned in their previous blog post, _Tesla's crash
rate was reduced by 40% after introduction of Autopilot, based on data
reviewed by NHTSA_.

------
chaboud
"The driver had about five seconds and 150 meters of unobstructed view of the
concrete divider with the crushed crash attenuator, but the vehicle logs show
that no action was taken."

This seems to be trying to suggest that the driver had 6 seconds of clear
driving-toward-death time to correct for the car's actions without explicitly
making such a ridiculous statement (while also throwing in the crushed crash
attenuator). If the car makes a quick change in direction due to an autopilot
error, a driver at speed would have very little time to make an effective
correction.
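
For scale, a back-of-envelope calculation of my own (the reaction time and
deceleration are assumptions, not figures from the statement):

    # What do "five seconds and 150 meters" imply?
    distance_m, time_s = 150.0, 5.0
    v = distance_m / time_s            # 30 m/s ~= 67 mph ~= 108 km/h
    reaction_s = 1.5                   # typical perception-reaction time
    decel = 8.0                        # m/s^2, hard braking on dry road
    stopping_m = v * reaction_s + v**2 / (2 * decel)
    print(round(v * 2.23694), round(stopping_m))  # ~67 mph, ~101 m

So a fully attentive driver could in principle have stopped, but reaction
time alone eats a third of the margin; a driver surprised by the car's own
steering would have had far less.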

Depending on the system behavior, it could be akin to having a passenger reach
over and yank the wheel. I'd honestly rather Tesla just said "ongoing
investigation" instead of being so transparently evasive.

~~~
utnick
Yup, this also means that Tesla's radar and camera systems had 5 seconds to
realize the car was driving straight into an unmoving concrete barrier and did
nothing...

~~~
Someone1234
What's really odd is, in my Subaru, which has EyeSight, if I drove straight
towards that barrier the brakes would engage. EyeSight uses two visual
cameras, about two feet apart, and uses the parallax between them to judge
depth/distance (in the same way human eyesight judges depth).
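
A minimal sketch of that geometry (the focal length here is a made-up but
plausible number):

    # Stereo depth from parallax: depth = focal_length * baseline / disparity
    FOCAL_PX = 1400.0   # focal length in pixels (assumed)
    BASELINE_M = 0.6    # ~two feet between the two cameras

    def depth_m(disparity_px: float) -> float:
        """Distance to a point whose image shifts disparity_px between views."""
        return FOCAL_PX * BASELINE_M / disparity_px

    # A barrier ~100 m out still yields ~8 px of disparity with these
    # numbers; depth error grows as the disparity shrinks toward zero.
    print(round(depth_m(8.4)))  # ~100 m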

So even if Tesla's autopilot steered the vehicle towards that divider,
shouldn't the auto-braking system have engaged to avert the accident? Tesla
vehicles also have visual image cameras as well as radar based ones, so the
information density is even higher than Subaru's system.

I guess what I am getting at is: is auto-braking disabled while autopilot is
on? Wouldn't leaving auto-braking enabled (particularly using the visual
cameras) offer a second layer of safety if autopilot made an error?

I raised the same issue when a Tesla with autopilot on drove straight into the
side of a truck and the driver was decapitated. The discussion was all about
"well radar couldn't distinguish it from road signs!" while ignoring that a
Tesla has visual (optical) cameras front and center.

~~~
ricardobeat
In the case of the truck, it was established that neither the driver nor the
cameras would have identified the side of the trailer against the sky, and
it's still ultimately the driver's responsibility to be alert and avoid such
crashes. See the report here: [https://static.nhtsa.gov/odi/inv/2016/INCLA-
PE16007-7876.PDF](https://static.nhtsa.gov/odi/inv/2016/INCLA-
PE16007-7876.PDF)

~~~
chaboud
From the report “NHTSA’s crash reconstruction indicates that the tractor
trailer should have been visible to the Tesla driver for at least seven
seconds prior to impact.”

Unless someone is operating trucks with adaptive optical camouflage, I think
that the driver would be expected to note the presence of a big rig. Binocular
cameras and radar may be expected to do the same, too.

(Note: Tesla models, to the best of my knowledge, do not have binocular
cameras)

------
hcnews
> In the moments before the collision, which occurred at 9:27 a.m. on Friday,
> March 23rd, Autopilot was engaged with the adaptive cruise control follow-
> distance set to minimum.

Would've loved to know the exact moments we are talking about here. Is it 5
seconds or 5 minutes?

> The driver had received several visual and one audible hands-on warning
> earlier in the drive and the driver’s hands were not detected on the wheel
> for six seconds prior to the collision.

What does this mean? Did Tesla want to hand control over to the driver? Or
was this just a normal no-hands warning?

> The driver had about five seconds and 150 meters of unobstructed view of the
> concrete divider with the crushed crash attenuator, but the vehicle logs
> show that no action was taken.

Again, it's hard to know if Tesla wanted to give up control or not. I hope
they aren't just saying that because the driver's hands were off the steering
wheel, the crash is his fault for not obeying the agreement.

~~~
sp332
If you ignore it long enough, the autopilot will disengage entirely. The car
will drive in a straight line and coast to a stop.

~~~
userbinator
_The car will drive in a straight line and coast to a stop._

In this case, it would be more like "drive in a straight line and... collide
to a stop."

Perhaps braking to a stop at a reasonable rate would be the right thing to do,
given that it surely should've detected it was going to hit something?

~~~
sp332
But according to Tesla, the autopilot was engaged. The car's behavior when
disengaged is not the question here.

------
riffic
Hey Tesla, traffic and vehicle safety advocates do not call crashes
“accidents”.

[http://www.roadpeace.org/take-action/crash-not-
accident/](http://www.roadpeace.org/take-action/crash-not-accident/)

[https://www.crashnotaccident.com](https://www.crashnotaccident.com)

~~~
FireBeyond
Yeah, when driving an emergency vehicle, in my case an ambulance and fire
engine, you take a course that used to be called Emergency Vehicle Accident
Prevention.

It's now called Emergency Vehicle -Incident- Prevention.

------
hartator
> In the US, there is one automotive fatality every 86 million miles across
> all vehicles from all manufacturers. For Tesla, there is one fatality,
> including known pedestrian fatalities, every 320 million miles in vehicles
> equipped with Autopilot hardware. If you are driving a Tesla equipped with
> Autopilot hardware, you are 3.7 times less likely to be involved in a fatal
> accident.

"Equipped with Autopilot" is different than running with Autopilot. As far as
I know, all model S fatal accidents except one was with Autopilot engaged.
Making Autopilot far more dangerous than manual driving from a statistical
standpoint.

~~~
pooya13
They all had windshield wipers as well. Maybe that is the cause?

------
bladers
Very deceptively crafted sentence. If I drove on AP for 15 minutes and was
alerted to put my hands on the wheel 2 minutes in, it would still fit Tesla's
specially crafted statement.

It makes it look like the user was warned and didn't respond by putting their
hands on the wheel, but it's completely false. The AP drove him straight into
the barrier with no warning.

~~~
jijojv
[https://www.tesla.com/blog/update-last-
week%E2%80%99s-accide...](https://www.tesla.com/blog/update-last-
week%E2%80%99s-accident)

> very deceptively crafted sentence.

> The AP drove him straight into the barrier with no warning.

Exactly - anyone who owns autopilot knows this.

~~~
imh
It seems gross that they spend so much of this saying "yeah he died and it may
be our fault, but statistically you're still better off as our customer."

The Atlantic had a relevant piece on the morality there this morning:
[https://www.theatlantic.com/technology/archive/2018/03/got-9...](https://www.theatlantic.com/technology/archive/2018/03/got-99-problems-
but-a-trolley-aint-one/556805/)

And that's assuming they can identify that autopilot is statistically safer.
If autopilot isn't used in the same conditions as all human driving, the rates
aren't apples to apples comparisons, and the "you are 3.7 times less likely to
be involved in a fatal accident" figure they imply is misleading.

~~~
cma
They didn't even control for wealth. People that can afford a Tesla and the
auto pilot add on are much less likely to be driving on drugs, alcohol, etc.
It doesn't say anything about the statistics when auto-pilot is engaged, it
just says cars with auto pilot. And it doesn't give the statistics for Teslas
without auto-pilot (Teslas have other safety factors that might mean auto-
pilot isn't responsible for the mortality reduction, etc.)

~~~
icebraining
_People that can afford a Tesla and the auto pilot add on are much less likely
to be driving on drugs, alcohol, etc._

Is that true? I couldn't find statistics that grouped by income or wealth.

~~~
cma
Also look at occupation. Tesla owners are disproportionately software
engineers. Software engineers are one of the lowest car insurance risk pools
(and were, even before any of them started using auto-pilot).

------
nafizh
"The driver had received several visual and one audible hands-on warning
earlier in the drive and the driver’s hands were not detected on the wheel for
six seconds prior to the collision."

Shouldn't you have more audible warnings than visual if you are aware that not
complying will lead to an accident?

~~~
andrewprock
This feels particularly evasive. If you want to claim that your technology is
10x safer, you should be able to back that up with self-insurance, and not try
to put the blame on the driver.

If the autopilot does not think it is capable of driving safely, it should
pull over to the shoulder of the road, not beep and flash.

~~~
1123581321
It doesn't seem like pulling to the right would be safe in this situation. I
think that automatically slowing down would be, though. Even slowing at a rate
similar to a human letting their foot off the gas (to avoid unexpected
behavior) would reduce the chance of fatality in a crash.

~~~
andrewprock
People pull over to the shoulder safely every day. If an autopilot car cannot
do this, it should not be on the road.

~~~
1123581321
People do not pull over to the shoulder when they are between lanes in
(presumably) busy traffic, which is where you have to be to hit a median.

I am not sure if Tesla's current system, which is still just cruise control
with steering, should ever be changing lanes unprompted. I don't have a Tesla
but I would not want my car's cruise control to do that right now. I do think
that pulling over in safe conditions is something that a self-driving system
needs to be able to do at some point on the path to full autonomy.

------
snarfy
There have been too many times when driving that I've needed to make eye
contact with either another driver or a pedestrian in order to decide how to
proceed. Can an autopilot differentiate between a 6 year old chasing a ball
into the street unaware and an 8 year old chasing the same ball but who also
sees the car coming? Or someone patiently waiting to cross the street versus
another who sees they are missing their bus and darts across the road anyway?
I see it in their face and body language and respond differently. Can
autopilot do that? Until it can, I don't trust it. At best it should be
restricted to roads without pedestrians, like freeways.

------
mannykannot
In several places, this statement is missing some important information, which
leads to the impression that the story is being spun to imply the driver had
more responsibility in this crash than may have been the case.

The statement says "The driver had received several visual and one audible
hands-on warning earlier in the drive and the driver’s hands were not detected
on the wheel for six seconds prior to the collision. _The driver had about
five seconds and 150 meters of unobstructed view of the concrete divider with
the crushed crash attenuator, but the vehicle logs show that no action was
taken._ " (my emphasis.)

What is not stated here is how long before the impact it was clear that the
car was off course and was not going to make a correction - i.e. how much time
would an attentive driver have had to realize that something was going wrong?
Tesla may not yet know the answer to that question, but if it does not, I do
not think it should suggest that the driver had five seconds to respond (the
only information I have read on this matter is Tesla's own statements, so I do
not know if other reporting on the crash shows that this was the case.)

Hands off the wheel does not prove that the driver is inattentive. More
importantly, hands on the wheel does not show that the driver _is_ attentive.
IMHO, something at least as effective as eye tracking is necessary to
establish that. Furthermore, Tesla's own statement indicates that it is
allowing too long a period of inattentiveness, even for highway driving.

It is reasonable for Tesla to use statistics to justify the utility of their
systems, so long as they compare like with like. Unfortunately, this statement
is cagey about that: "If you are driving a Tesla equipped with Autopilot
hardware, you are 3.7 times less likely to be involved in a fatal accident."
Compared to what? Teslas without such hardware? Regardless of whether the
hardware is being used? Depending on what they are comparing it to, these
statistics might be on account of simpler, more reliable safety features of
the vehicle, or even owner demographics.

Finally, the absence of a working crash attenuator, while a tragic fact, has
no bearing on the issue of Autopilot's reliability.

------
jehlakj
I find it terrifying how autopilot cars give the driver a false sense of
security. When a car has a statistically proven claim that it is safe, drivers
are more likely to focus less on the car. It’s like comparing the reaction
times of a manually driving individual to those of a passenger. What seems to
be 6 seconds may in reality feel like a split second due to the lack of
context. I want to say that there are overhead costs to getting back into
driving mode when you haven’t been fully aware of your surroundings. This
happens to a much lesser degree when I’m using cruise control, but I still
avoid it when there is a fair number of cars on the road.

If I have to be well aware of the warnings that the system puts out, I might
as well stick to no autopilot. The chances are low, but any chance is enough
to keep me from using it. The roads are designed for humans, after all.

------
prepend
“For Tesla, there is one fatality, including known pedestrian fatalities,
every 320 million miles in vehicles equipped with Autopilot hardware. If you
are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less
likely to be involved in a fatal accident.”

Their stat is for miles in vehicles equipped with autopilot, not miles driven
by autopilot. So if their claim were that a Tesla equipped with autopilot is
safer than a standard car, this would be valid.

The comparison should be with miles driven with autopilot engaged. They must
know this number because Tesla tracks everything.

It’s unethical of them to publish such misleading stats.

Cynically, I also think that the fatality rate of autopilot miles is either
really bad or just not comparable due to too few miles. Either way, they
should reveal this.

~~~
ricardobeat
True, they know this number, it is public and has been ratified by transit
authorities; see pages 10-11: [https://static.nhtsa.gov/odi/inv/2016/INCLA-
PE16007-7876.PDF](https://static.nhtsa.gov/odi/inv/2016/INCLA-
PE16007-7876.PDF)

~~~
slavik81
Why do they use weasel words and misleading statistics when they have this? A
crash rate of 1.3 per million miles before installation and 0.8 per million
miles after sounds like a slam dunk for the autopilot's statistical safety.
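
Checking those figures (my arithmetic, using only the numbers quoted above):

    # NHTSA-reported crash rates, per million miles
    before, after = 1.3, 0.8
    print(f"{(before - after) / before:.0%}")  # ~38%, i.e. the quoted ~40%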

------
United857
The biggest failure here is Tesla's marketing department. Calling it autopilot
implies full level 5 autonomous driving in the average person's mind. Very few
will read the fine print.

In reality, the current state of it is level 2 or so -- like an enhanced
cruise control. You still absolutely can't take your eyes off the road, which
is apparently what happened here.

------
yashap
I can’t help but feel that Tesla, Uber and others are being crazy reckless
with their self driving technology, maybe even criminally so. Google have put
massively more effort into their self-driving tech, AFAIK massively more
engineering and QA hours (by probably better engineers and ML people), and
they still don’t consider their tech ready for sale to the general public.
Tesla seem to be shipping an MVP, which is insane for safety critical
software.

Tesla claim “Teslas equipped with autopilot have a 3.7x lower fatality rate
compared to the average car,” but that’s a very weak claim. It doesn’t say
anything about the safety of the autopilot tech itself, as it’s lumping in
“normal driver” stats. Also, do new, expensive cars simply have lower than
average fatality rates in general?

I think self-driving cars will be great for society, but we need to be super
cautious about shipping them. Google seem to be very cautious, Tesla seem to
be very reckless. If it’s found they shipped autopilot without strong evidence
that it’s much safer than human drivers, and it’s out there killing people,
IMO Tesla execs should go to jail.

------
dzink
I have a lot of respect for Tesla, but this incident convinced me not to buy
one any time soon. I have children with car seats. The fires that break out,
with the battery right under the middle, make it impossible to save someone in
the driver seat, let alone in the back.

More importantly, self driving cars are not the same - Waymo has lidar, Tesla
and Uber don’t (correct me if wrong). If a system is clearly struggling with
the detection of still objects in the middle of a highway, on a sunny day, it
cannot be trusted. If I ever bought a Tesla, I wouldn’t use autopilot unless
their hardware proves reliable.

~~~
lostmsu
I thought the fire started because the rescue team cut someone out of the car.

~~~
userbinator
If you look at the pictures, I don't think much cutting was needed, and
certainly any that might be would be nowhere near the batteries located
beneath the car; the front end basically disintegrated completely from the
impact.

------
gonesilent
See also this interview. The driver said he had problems with autopilot in the
same location. [http://abc7news.com/automotive/exclusive-autopilot-part-
of-t...](http://abc7news.com/automotive/exclusive-autopilot-part-of-tesla-
crash-investigation-i-team-rides-in-model-x-to-site/3284757/)

------
newnewpdro
Wow, that's a disgusting amount of spin and statistical misuse.

I had intended to buy TSLA after their next awful earnings call and hold it
long, now I'm not sure I want anything to do with this business.

------
lifeisstillgood
I have two quite different takes on this and the recent Uber crash

1\. Simply stop using automated driving in the real world until we have clear
gold standards (something like billions of hours of simulated driving against
real-world data collected by _other system makers' sensors_). The accidents so
far have been characterised by "that should not have been possible", not "that
was a really hard edge case".

2\. Accident reports make horrific reading (and are especially upsetting for
families). But as with reports to the FAA, there are benefits - openly sharing
crash data makes (ok, should make - see 1.) the next round safer. Secondly,
just reading these things makes the humans involved more ... aware.

Some time ago I visited Ireland and was amazed to hear the local news
announcing recent car deaths - it was a mandated thing, to try and raise
awareness and reduce future accidents - it apparently worked.

------
Tempest1981
The other factor is:

At 9:30 am, the left lane (lane #2, that stays on 101) is usually packed. He
may not have been able to merge in from the #1 lane (HOV-to-Hwy85-flyover
lane) into that left lane. Especially if there was a big speed differential.

Although 150 meters is plenty to stop in the median.

(Do we know which lane he was in before hitting the divider?)

------
bambax
> _In the US, there is one automotive fatality every 86 million miles across
> all vehicles from all manufacturers. For Tesla, there is one fatality,
> including known pedestrian fatalities, every 320 million miles in vehicles
> equipped with Autopilot hardware. If you are driving a Tesla equipped with
> Autopilot hardware, you are 3.7 times less likely to be involved in a fatal
> accident._

This is incomplete and therefore very misleading. One has to compare what's
comparable: what is the sample size for each? That is, how many miles are
driven by Tesla vehicles per year, versus how many miles total for all US
vehicles on US roads?

It seems Tesla puts up around 3.2 billion miles/year, vs 3.2 trillion miles
for all US vehicles; while both numbers are big, that's a 1000x difference,
or 3 orders of magnitude.

Tesla numbers seem to imply there are around 10 people killed per year using a
Tesla or being hit by a Tesla (because 3.2 billion / 320 million = 10). So, if
a Tesla with 5 people on board crashes into a wall this year, and kills
everyone inside, that will represent a _50% increase_ in the fatality rate.
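
A quick sketch of that fragility (rough figures from this comment, not
official data):

    # Small samples make the headline rate fragile.
    tesla_miles = 3.2e9           # miles/year, Autopilot-equipped fleet
    fatal_rate = 1 / 320e6        # fatalities per mile (Tesla's figure)
    fatalities = tesla_miles * fatal_rate           # ~10 per year
    one_bad_crash = (fatalities + 5) / tesla_miles  # add one 5-fatality crash
    print(fatalities, one_bad_crash / fatal_rate)   # 10.0, 1.5x the old rate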

And the phrase "3.7 times less likely" is also very misleading. People who
drive Teslas usually wouldn't be driving dangerous vehicles (such as small
4-door cars, which are the deadliest) if they didn't have a Tesla. They would
probably drive a German car (BMW, Audi, Mercedes), which are the safest cars
in the US today.

> _There are about 1.25 million automotive deaths worldwide. If the current
> safety level of a Tesla vehicle were to be applied, it would mean about
> 900,000 lives saved per year._

This is another level of nonsense entirely. "Worldwide"? A lot of things
happen worldwide; in some countries there are more scooters than cars; in
others, there are many very unsafe and poorly maintained country/dirt roads.
The Tesla numbers are shady enough when limited to the US, but expanding them
for a worldwide comparison is ludicrous.

------
sp332
I notice they don't mention that the driver complained that the autopilot had
trouble with this specific spot on the highway a few times before.
[https://news.ycombinator.com/item?id=16719736](https://news.ycombinator.com/item?id=16719736)

~~~
sp332
Late edit: According to [https://arstechnica.com/cars/2018/03/tesla-says-
autopilot-wa...](https://arstechnica.com/cars/2018/03/tesla-says-autopilot-
was-active-during-fatal-crash-in-mountain-view/) Tesla has no record of his
complaints.

------
biktor_gj
I don't care about Tesla at all, not a fanboy or anything, but I don't
understand all the blaming of Tesla for this accident. So, the guy had
complained 'between 7 and 10 times' about Autopilot malfunctioning there, yet
he kept on using it. Autopilot disengaged before the crash, giving visual and
audible alerts before turning off, yet he didn't take the wheel, so he
probably was doing something else instead of paying attention in a zone where
he said the thing failed... How many times do they have to say that even if
you're using AP you need to stay focused on the road? I'm sorry for the guy,
but if you don't trust a system that's as critical as this you either don't
use it or you're very cautious when using it.

~~~
yougotit
Ha, you just fell for Tesla PR. Maybe you should read their statement again.
It is for gullible people like you that they create these misleading
statements.

~~~
kj65557
No need for ad hominem attacks. You make no rebuttals to the points made by
the comment above and just dismiss it as PR. Just because something is (or
isn't) PR doesn't mean that it's false.

~~~
yougotit
I love people on this site. It is literally the only site where you respond to
a comment and people start listing fallacies. My answer stands. Numerous
people in other comments have explained how Tesla carefully crafted the
statement so that gullible people would think it said something different and
this guy fell for it. Better start listing some more fallacies in this
response! Here is a list if you need help
[https://en.wikipedia.org/wiki/List_of_fallacies](https://en.wikipedia.org/wiki/List_of_fallacies)

------
houqp
The thing I don't understand is: the driver had complained about AP not
working at this same location multiple times in the past, yet he still kept
his hands off the wheel for more than 5 seconds?

~~~
dlp211
We don't know that is what happened. Tesla is using weasel wording to confuse.

------
rkagerer
If the driver had "five seconds and 150 meters of unobstructed view of the
concrete divider", then didn't the car, too?

I'm not disputing their statistics, but reading past the first two paragraphs
of this article left a sour taste in my mouth. I feel like a better message
ought to have been "we're going to learn from this, and won't stop improving
until the crash rate is 0".

~~~
dlp211
You should dispute the use of statistics here. They make no attempt to control
for various factors and compare apples to apples.

------
zadwang
I just wouldn’t buy a Tesla at this point because: 1) the autopilot is not
reliable. The accident shows clearly that it can dive straight into the wrong
path without giving warning. This is unacceptable, however small the
probability. 2) the battery explosion makes any accident ever so much more
dangerous. 3) Tesla’s response is disgusting. It may be true that
statistically or on average it is safer, but I am not a number. If I drive
responsibly I can beat the statistic. Plus, relying too much on the comfort of
statistics may be contributing to the lack of human intervention. 4) price is
also a factor, but the least important one compared to the above three.

~~~
nugi
The ONLY reason I don't already own one (even before this) is their draconian
anti-repair strategy, remote bricking, and other heavy-handed tactics against
'owners'. Apparently you are just paying for the privilege of borrowing one
from Tesla.

------
IkmoIkmo
I hate how Tesla framed this as a statistics game.

It's like an amazing heart doctor one day suddenly stabbing his patient in the
heart with a scalpel, killing him, and then saying... well, sometimes I
malfunction, but by and large I cure patients 10x more often than my
colleagues.

That isn't okay and it's important to talk about that strange behaviour to fix
it.

~~~
polar8
Sure it’s strange behavior, but I’d still choose him over his colleagues.

------
tuna-piano
Their statement blames:

-The driver

-The crushed barrier

-Statistics

But they do not take responsibility themselves. It's apparent and in my
opinion shameful.

If Tesla's goal here is just to share the facts, why didn't they state what
AutoPilot did? In addition to statements like "...the driver’s hands were not
detected on the wheel for six seconds prior to the collision.", wouldn't it
make sense to have a statement like "AutoPilot seems to have made a grave
error here, and steered the car into the barrier."

They never admit that their car didn't perform properly. This reads like a
carefully crafted, legal-approved statement - not the more human statements
that we see many times after tragedies. This absolutely lowers my opinion of
Tesla by a decent amount.

"We feel very bad about what happened... and want to take responsibility and
do what's right" \- A quote from the Walmart CEO after a Walmart truck
accident injured Tracy Morgan and killed another man. I wish Tesla could stand
up and say the same in this case (if the evidence points that way, which it
appears to).

[https://www.cbsnews.com/video/walmart-ceo-on-tracy-morgan-
ca...](https://www.cbsnews.com/video/walmart-ceo-on-tracy-morgan-car-accident-
defends-companys-truck-drivers/)

~~~
sp332
It was a median in the middle of the highway. The car probably continued in a
straight line instead of "steering into" a barrier.
[https://imgur.com/a/iMY1x](https://imgur.com/a/iMY1x)

~~~
subinsebastien
Why was the car not able to detect and avoid the collision in such a simple
situation (from a human POV)? We have seen Tesla's autopilot perform very
well in much more complex scenarios than this. Ref:-
[https://www.youtube.com/watch?v=FadR7ETT_1k](https://www.youtube.com/watch?v=FadR7ETT_1k)

~~~
wcoenen
Stationary obstacles are a harder problem, because there are so many
stationary objects and features around a road that have to be ignored.

This autopilot incident is similar to the one with the tractor-trailer[1] in
that regard.

[1] [https://www.theguardian.com/technology/2016/jun/30/tesla-
aut...](https://www.theguardian.com/technology/2016/jun/30/tesla-autopilot-
death-self-driving-car-elon-musk)

~~~
oldgradstudent
Or the one in China with the street sweeper:
[https://www.youtube.com/watch?v=fc0yYJ8-Dyo](https://www.youtube.com/watch?v=fc0yYJ8-Dyo)

Or the one with the fire truck:
[https://www.theverge.com/2018/1/23/16923800/tesla-
firetruck-...](https://www.theverge.com/2018/1/23/16923800/tesla-firetruck-
crash-autopilot-investigation)

------
breatheoften
I think there is the possibility of finding a standard response that genuinely
serves the maximum future good.

In my mind — the standard response to such accidents should be to set up an
endowment dedicated to the study of techniques promoting the enhancement of
autonomous safety, named after the victim, in the geographical region of the
accident. Build a damn research facility near the accident site — spread the
valuable high tech knowledge across generations and demographics — along with
the understanding that the ultimate mission being pursued is the safest
autonomous systems possible.

This kind of response would go over really well I think and have a lot of
benefits beyond just PR. There are very real possibilities that accident rates
will not be uniform across driving regions due to the inability of models to
optimize for the uneven distribution of risk sources across all driving
scenarios. This gets _really evil_ if profit motive is allowed to be the
only mechanism driving the development of vehicle safety models. I cringe for
a future where car accidents are biased to occur more often in poor or ethnic
neighborhoods ...

------
empath75
> Tesla Autopilot does not prevent all accidents – such a standard would be
> impossible – but it makes them much less likely to occur. It unequivocally
> makes the world safer for the vehicle occupants, pedestrians and cyclists.

These kinds of unequivocal statements are just begging for a lawsuit, imo.
They simply don't know that this is true.

~~~
IncRnd
Exactly. They are taking both sides of an argument to get credit but not
take blame. They want credit for allegedly reducing crashes, but they don't
want to be held responsible for driving into a wall.

~~~
duxup
Even if it was us... it probably wasn't!

~~~
IncRnd
Yes! That's what I said, when I forgot to say that.

------
tptacek
According to the IIHS, which is sponsored by insurance companies and rates
cars for safety, for model years 2009-2012, the following cars have had
_zero_ fatalities:

* Kia Sorento

* Lexus RX 350

* Subaru Legacy

* Mercedes Benz GL

* Audi A4 4WD

* Honda Odyssey

* Toyota Highlander

* Toyota Sequoia

------
bastijn
> “In the past, when we have brought up statistical safety points, we have
> been criticized for doing so, implying that we lack empathy for the tragedy
> that just occurred.”

I have to say that is exactly how I felt reading the piece, and I’m by no
means easily offended by these kinds of stories. The whole piece reads like a
story Uber would have published, throwing out all sorts of excuses for why it
isn’t their fault but rather the big bad world being in their way. No matter
if it is true or not, this is not the kind of tone people want from you right
now. They could have presented the same information and shown some empathy,
especially having been criticized before.

------
_pmf_
> After the logs from the computer inside the vehicle were recovered, we have
> more information about what may have happened.

Why does Tesla have access to the car after the accident at all?

> The driver had received several visual and one audible hands-on warning
> earlier in the drive

No. The logs likely show that signals to trigger the warnings were issued;
they most likely do not indicate whether those had any effect (a minor issue,
since they most likely did have an effect, but it still presents the likely
"happy case" as a fact).

------
1024core
Apparently, the victim had complained to Tesla repeatedly that the autopilot
would veer towards the barrier: [http://abc7news.com/automotive/i-team-
exclusive-victim-who-d...](http://abc7news.com/automotive/i-team-exclusive-
victim-who-died-in-tesla-crash-had-complained-about-auto-pilot/3275600/)

On the other hand, if I knew the autopilot was trying to do such things, I
would not use it so much.

~~~
fastball
Yeah, and it looks like the AP got confused by the lack of attenuation
barrier, so a double whammy -- the AP probably would've avoided it if the
barrier had been there, __and __it would 've saved his life.

Honestly, I agree with Tesla in that maybe blame shouldn't be directed towards
them. They should be held responsible for the crash, but not the death -- if
you want to save lives, those barriers should be replaced __immediately __,
and that is something CalTrans should definitely do.

------
linsomniac
As far as the autopilot giving several audible and visual warnings earlier in
the drive, and not detecting his hands on the wheel 6 seconds prior to the
crash...

The Tesla model S detection of hands on the steering wheel is, in my
experience, not very reliable. It often detects my hands off the wheel if I'm
holding it lightly, ready to take over. And it often detects me taking control
and disengages if I am holding it firmly. And if I'm paying attention to
traffic, I will about a fifth of the time not see the visual warning in the
dash.

These additional pieces of data can't really provide information on whether
the driver was paying attention.

However, I am really curious about the reports that this driver had several
times reported to Tesla that the autopilot was trying to drive him into this
barrier. If so, I wonder why Tesla hasn't reported that fact. And why the
driver would be using autopilot in that area at all, let alone not paying
attention while doing so.

------
imh
>semiautonomous Autopilot

Seems like an oxymoron that's getting drivers into danger.

>Tesla said its vehicle logs show the driver’s hands weren’t detected on the
wheel for six seconds before the collision, and he took no action despite
having five seconds and about 500 feet of unobstructed view of a concrete
highway divider.

And it's not the first time.

~~~
userbinator
_and he took no action despite having five seconds and about 500 feet of
unobstructed view of a concrete highway divider._

I suspect he may have been distracted or otherwise not paying direct
attention, thinking "the car will drive itself, I know I need to watch it but
I'll probably be fine if I don't watch it _all_ the time"... and just happened
to pick a very wrong time to not pay attention.

...and even when he did realise he was going to crash, there might've been a
very strong
[https://en.wikipedia.org/wiki/Buridan%27s_ass](https://en.wikipedia.org/wiki/Buridan%27s_ass)
effect which prevented him from quickly deciding which way to turn the wheel.

------
mentos
I wonder if Tesla can recreate the crash environment, given the data captured
from their sensors, to reproduce the failure in a simulation.

See this article about a company attempting to do this in Unreal Engine 4

[https://www.unrealengine.com/en-US/blog/51vr-crafts-
photorea...](https://www.unrealengine.com/en-US/blog/51vr-crafts-
photorealistic-city-streets-in-unreal-engine-for-ai-
training?sessionInvalidated=true)

Imagine a future where every car uploads its sensor data to reconstruct a
growing 3D model of the road network. Using this network, AI can simulate
millions of miles of driving without ever risking a physical real-world crash.
Any update to the car's autopilot system could first prove itself against 1
billion miles of simulation before ever being deployed to the 'real' world.

~~~
thebruce87m
I would hope the serious players are already doing this.

------
devit
Comparing the accident rate of autopilot Teslas to all cars is absurd and
disingenuous, since obviously Tesla drivers are wealthier than average (they
bought an expensive car), and thus much less likely to have accidents in the
first place (car better maintained, more intelligent, more risk averse, less
stressed, less dangerous behavior, etc.)

They are also admitting that their system can be used as a self-driving system
(you can not have your hands on the wheel, and the warnings can apparently be
ignored), and that it failed to act as such for 5 seconds and 150 m before
hitting a static object.

Again, there should be criminal liability for them, and they should not be
allowed to offer any system that is autonomous enough to be used without hands
on the wheel but that can't correctly self-drive the car.

------
reubenswartz
The _one thing_ I want from a self driving system is to not drive at full
speed into an object straight ahead of me.

Knowing that the auto pilot system had problems on this stretch of road, I
don’t know if it makes good sense to trust it, but this shows we have a ways
to go on the most basic test.

------
jacquesm
> The driver had received several visual and one audible hands-on warning
> earlier in the drive and the driver’s hands were not detected on the wheel
> for six seconds prior to the collision.

This would be hilarious if it wasn't so tragic. _So did Tesla's autopilot_!

------
jwtadvice
Reading between the lines:

> The Tesla brand name takes this seriously. Question the headlines, opinions,
> etc.

> You are safe using Tesla. What happened to this person won't happen to you.

> The driver made some questionable decisions; the blame isn't uniquely
> Tesla's. Other safety mechanisms not under Tesla's control also failed.

> Statistically, Tesla Autopilot is safer than not using Autopilot.

> We accept that we will not be able to prevent all deaths. You need to too.
> Tesla has saved lives in addition to taking them. Focus on this please.

> We care about safety.

------
simion314
I am wondering: if 100 Teslas had passed that spot at that time, would all 100
have hit the barrier? Tesla should reveal why their cars can't detect solid
objects and drive straight into them.

------
MaikuMori
Now that we're set on introducing more and more autopilot and autonomous
vehicles, why don't we add sensors and beacons to our infrastructure to make
all these things easier? Nothing has really changed since traffic lights.

It would help if barriers and construction obstacles had beacons in them. This
is not to replace cameras and radars but to help augment their data whenever
possible.

I think there is an opportunity for governments to help solve autopilot and
autonomous vehicle challenges.

------
perilunar
Why isn't the end of the barrier tapered instead of vertical e.g.:

[http://fortressfencing.com.au/pub/media/catalog/product/d/b/...](http://fortressfencing.com.au/pub/media/catalog/product/d/b/db80-08.jpg)

(I know there was a previously destroyed crush zone, but still, better to be
lifted and deflected than brought to a sudden stop.)

~~~
perilunar
Answering my own question: "These crashes often led to vehicles flying at high
speed into the very objects which guardrails or barriers were supposed to
protect them from in the first place. To address the vaulting and rollover
crashes, energy-absorbing terminals were developed."

[https://en.wikipedia.org/wiki/Traffic_barrier#Barrier_end_tr...](https://en.wikipedia.org/wiki/Traffic_barrier#Barrier_end_treatments)

------
theclaw
I can understand this happening if the Tesla was following a human driver with
“follow-distance set to minimum,” and that driver dodged the barrier, meaning
the Tesla had no time to brake and could not move out of the solid-white
lines.

According to another comment here the Tesla manual states that if you’re
following another car that moves to reveal a stationary object, the autopilot
may not be able to stop in time.

------
Neil44
The driver has ultimate responsibility for operating their car safely.
Autopilot does not change that, it says so in the manual.

~~~
dlp211
And yet, that is not how Tesla advertises autopilot.

------
michjedi
Does anyone have any data on the % of accidents of autopilot vs drivers who
don't speed? I am wondering because I recently saw some statistics showing
that a majority of drivers normally speed by at least 10 mph, and that the
majority of crashes occur when people are driving over the speed limit.

------
diebir
Is this going to be inherent and unavoidable for machine-learning-based
self-driving cars? Perhaps self-driving is never going to become a reality,
just because of problems like this (unless/until the road infrastructure
supports them explicitly).

------
booblik
How much can we trust this data, provided by Tesla? Maybe the sensors that
tell if the hands were on the wheel malfunctioned? Maybe he tried to wrestle
the car back into control? It would be helpful to have a video recording of
the driver instead.

~~~
Gustomaximus
A level of skepticism is always healthy. Also recognise there are endless
'maybes'. Anyone who works with data knows there has to be a level of trust
and common sense, otherwise you hit analysis paralysis.

And on 'trust', Tesla have a history of being open and upfront. Organisations
tend to repeat a culture. So for Uber I would closely question their
statements given an ongoing history of unethical behaviour across many levels.
It seems likely they will lie or show limited truth to reach an agenda.
Tesla's history seems open and honest.

~~~
FireBeyond
Yeah, they've been open and upfront about things like the fact you can't
maintain them yourself (they refuse to provide service manuals except where
required by law in MA, where even then you pay a three digit fee to -view-,
not get, a copy, and have to make an appointment).

Or the fact that they will remotely downgrade your car (to an older software
version) and remove features of it (disable ethernet ports) if they find you
"snooping" and discover references to new models.

That's the kind of openness and honesty you laud them for?

------
Jerry64545
Forget about software/AI bugs; my understanding is that there is hardware
(lidar or radar) in a Tesla. Shouldn't the car have stopped when its hardware
sensed that it was definitely going to hit the wall?

------
animex
I just cancelled my Model 3 reservation based on the way this is being spun.

------
lancewiggs
Imagine if every single fatality caused by a car had this much scrutiny. They
should, and while Tesla is copping it, I’m very happy that safety in autos is
actually a topic of interest.

------
bob1847
The dude didn’t have his hands on the steering wheel for over 5 seconds?
Madness.

------
AKifer
Humans do not understand such arguments in such a situation. Hope you will do
better business on Vulcan, dear Tesla.

------
RayVR
PLEASE stop comparing crash statistics for a car with an MSRP of $80-150k to
the general population. Sample bias (among other problems) is such a basic
consideration that Tesla ignoring it makes me sick. This company will do/say
anything to manipulate public perception. People on HN should at least attempt
to see through this rather than buying it because Tesla gives them the warm
fuzzies.

[https://www.technologyreview.com/s/601849/teslas-dubious-
cla...](https://www.technologyreview.com/s/601849/teslas-dubious-claims-about-
autopilots-safety-record/)

------
matte_black
Personally, I would only use a self-drive feature on the middle lanes of a
highway if I was going to be distracted with something else; there’s too much
bullshit going on at the edges for it to be safe.

~~~
octorian
As someone who drives this exact stretch of road every day, in a Tesla, I
agree. This lane is the end of a long stretch of road with either _zero_ or
_minimal_ shoulder, where even the slightest glitch could cause a scrape on
the side of your car. I'm always paranoid driving this stretch.

This concrete barrier is also quite a distance past where this left exit lane
starts to separate from the rest of US-101, so there are really only two ways
you'd risk hitting it. First, and probably most common, is people who change
their minds last-second about taking that exit. Second, would be a serious AP
screw-up (like what might have happened here).

FWIW, I've _never_ had AP engaged when driving through this spot, and almost
instinctively disable it when taking a highway off-ramp/change-over.

------
cctt23
_In the moments before the collision, which occurred at 9:27 a.m. on Friday,
March 23rd, Autopilot was engaged..._

The rest may be true and may be germane, or it may not be. The only part that
matters to me, until the results of the NTSB investigation are made public, is
that the autopilot did this.

~~~
dingaling
The fact that Tesla even made a press release of this nature disturbs & annoys
me.

When the NTSB is investigating an aircraft or train accident we don't see the
vehicle manufacturer rushing out a "well we told him to take control"
statement and a dump of telemetry. At most they state "we are of course
working with the authorities to establish the cause and any corrective
actions" until the investigation is completed.

Tesla needs to mature from a blame-the-user software-industry mentality to
something more befitting their responsibilities.

------
domevent
This is 100% anecdote: A friend of mine test drove a Tesla, and said that when
he was in the far right lane on a highway, the autopilot would get a little
sketchy at off ramps. It didn’t ever turn off or anything; he described it as
“jiggling”, like it wasn’t sure what to do when the right lane marker
disappeared. He found that less than inspiring, although he did find the
acceleration intoxicating.

------
fwgwgwgch
I have a question: why keep AP on at all? Are they beta testing it and using
the metrics? Because I don't think it's the difference between someone buying
a car and not.

------
bladers
I repeat.

This is a very deceptively crafted sentence. If I drove on AP for 15 minutes
and was alerted to put my hands on the wheel 2 minutes in, it would still fit
Tesla's specially crafted statement.

It makes it look like the user was warned and didn't respond by putting their
hands on the wheel, but it's completely false. The AP drove him straight into
the barrier with no warning.

But this deceptive statement worked as Tesla carefully planned. Look at the
Jalopnik title: "Tesla Says Autopilot Was On Before Fatal Model X Crash, But
That Driver Didn’t Abide Warnings"

~~~
dang
Please don't post duplicate comments to HN. It lowers signal-noise ratio and
makes it hard to merge threads.

------
tardo99
I get downvoted reliably for taking issue with Tesla and Musk, but here I
largely agree with them. People need to consider such incidents in the context
of the overall danger of driving.

That said, Tesla needs to maintain its PR in order to do their next capital
raise shortly or they may be out of luck.

