
Tesla’s Autopilot found partly to blame for 2018 crash on the 405 - casefields
https://www.latimes.com/business/story/2019-09-04/tesla-autopilot-is-found-partly-to-blame-for-2018-freeway-crash
======
dreamcompiler
Fire truck driver here: People crash into parked fire trucks and ambulances
and cop cars _all the damn time_. That's why we park the Big Red Truck at an
angle behind the accident -- so that when the 2-ton car crashes into the
30-ton truck, the car will bounce off and the people behind the truck (us)
won't be injured.

I don't disagree that Tesla's software is partly to blame here, but the null
experiment also has a lousy track record.

~~~
TheSpiceIsLife
This one time I got stopped at a fruit fly control point on the border between
South Australia and Victoria, near Bordertown.

The person staffing the control point indicated with hand signals for me to
stop on the highway in the lane I was in.

So I slowed down to walking speed and pulled off the highway into the car park
next to the office.

The agent then lectured me about not following instructions and that she could
fine me for disobeying.

I then calmly explained to the agent the reason I didn't stop in the lane on
the highway: _drivers of cars and heavy vehicles are known to have a terrible
habit of not seeing other vehicles stopped on the road._

~~~
newnewpdro
It always blows my mind when a highway patrol scolds me for not stopping
immediately on the side of the road, instead pulling off at the first
exit/intersection to make it safer for us.

I'd estimate it's about 10% of the time. On the other hand, I've also had
them express appreciation for doing the same thing.

~~~
cortesoft
Jesus, how many times have you been pulled over, that 10% of the time is more
than once!?

~~~
newnewpdro
There was a period of a few years where I drove a track-prepped race car on
the street daily, with out of state plates for emissions reasons, and it
resulted in a _lot_ of police encounters.

~~~
SOLAR_FIELDS
Have there been any studies of this? I think it’s well known that if you drive
a sports car you get pulled over more often, but I drive a beat-up pickup in
the southern US and almost never get pulled over even though I’m frequently
5-10 MPH over the speed limit. It would be interesting to see numbers for a
pickup vs. a normal mid-range sedan vs. a sports car. Maybe throw in out-of-state
plates as an extra variable.

~~~
Washuu
I drive a Subaru WRX STI, which is one of the most expensive vehicles to
insure due to it being one of the most ticketed and crashed vehicles. I have
never been pulled over while driving it. I have been pulled over in my
Wrangler many times while obeying all laws, but never gotten a ticket from it.

[https://insurify.com/insights/car-models-with-the-most-speeding-tickets/](https://insurify.com/insights/car-models-with-the-most-speeding-tickets/)

------
Animats
_Figure 7 depicts the movement of the Tesla and the two lead vehicles in the
last 15 seconds before the crash. In the left panel—covering 15 to 8 seconds
before impact—the Tesla is accelerating from 9 to 18 mph while following a
lead vehicle (red car) at 30 to 46 feet. In the middle panel—covering 7 to 4
seconds before impact—the Tesla is traveling at a constant 21 mph while
following a lead vehicle (green car) at a distance decreasing from 148 to 108
feet. In the right panel—covering the last 3 seconds before impact—the Tesla
is accelerating from 21 to 31 mph, with no lead vehicle and a forward collision
warning half a second before impact._

...

 _About 0.49 second (490 milliseconds) before the crash, the system detected a
stationary object in the Tesla’s path. The forward collision warning
activated, displaying a visual warning and sounding an auditory warning to the
driver. By the moment of impact, the Tesla had accelerated to a speed of 30.9
mph._

...

 _AEB did not activate during the event, and data show no driver-applied
braking or steering before the crash. Tesla’s AEB is a radar/camera fusion
system that is designed for front-to-rear collision mitigation or avoidance.
According to the company, the system requires agreement from both the radar
and the camera to initiate AEB; complex or unusual vehicle shapes can delay or
prevent the system from classifying vehicles as targets or threats._ [1]

Notice the last line. The system _still_ has to recognize an obstacle as being
visually car-like before braking to avoid hitting it. If it's not recognized
as a vehicle, the car will not stop.
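The agreement requirement the brief describes amounts to an AND gate across sensors. Here is a minimal sketch of that failure mode; this is an illustration of the logic, not Tesla's actual code, and every name and threshold is invented:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    is_threat: bool    # did this sensor classify the object as a threat?
    confidence: float  # the sensor's own confidence, 0..1

def should_brake(radar: Detection, camera: Detection,
                 threshold: float = 0.9) -> bool:
    # AEB requires agreement from BOTH sensors. If the camera cannot
    # classify an unusual shape (a fire truck parked at an angle) as a
    # vehicle, braking never triggers, even though the radar has a
    # strong return from a solid object dead ahead.
    return (radar.is_threat and radar.confidence >= threshold and
            camera.is_threat and camera.confidence >= threshold)

radar = Detection(is_threat=True, confidence=0.95)   # sees the truck
camera = Detection(is_threat=False, confidence=0.2)  # can't classify it
print(should_brake(radar, camera))  # False: no braking
```

The AND gate suppresses false positives (phantom braking on overhead signs is the usual radar problem), at the cost of suppressing true positives whenever one sensor fails to classify the obstacle.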

This was in January 2018, but it was a 2014 vehicle, which would mean the old
Mobileye vision based collision system, right? We know Mobileye works that
way, trying to draw boxes around car rear ends, because people have bought
them and made videos showing what they see. Did that get fixed? Can it be
fixed with a vision-based system? This is the big weakness of a no-LIDAR
system.

Here, the driver had 3 seconds to react, because the car ahead turned out of
the lane to evade the fire truck well before reaching it. Compare this 2017
video of a Tesla hitting a construction barrier. The car ahead didn't turn
until just before the barrier, only giving the driver under 1 second to
react.[2] Same failure to detect a lane obstruction, but not enough time for
the driver to do anything about it.

[1]
[https://www.ntsb.gov/investigations/AccidentReports/Reports/...](https://www.ntsb.gov/investigations/AccidentReports/Reports/HAB1907.pdf)

[2]
[https://www.youtube.com/watch?v=VTdcWnGnnJQ](https://www.youtube.com/watch?v=VTdcWnGnnJQ)

~~~
matt-attack
It only highlights how _incredibly dumb_ these autopilots are. They’re
endlessly diligent but completely fail at the _necessary_ skill of
_understanding_.

What is the point of any of this if the car, with all of its sensors,
knowingly plows into a stationary red 30-ton obstacle? It’s something that a
4-year-old human would understand _instantly_ as an object to be avoided at
all costs.

I really think folks don’t realize how critical all of our human understanding
is to driving.

~~~
dahfizz
This is completely ridiculous. Adults drive into stationary objects all the
time. 4 year olds walk into stationary objects all the time.

As far as I'm concerned, there is only one criterion that matters when
evaluating self driving cars: is it better than human drivers?

The fact that we are still discussing this _one_ accident that happened _a
year ago_ indicates to me that the answer is a resounding yes.

~~~
toast0
Well, we could also talk about the time the Tesla accelerated into a gore
point when the lead car followed the lane to the overpass. Or the couple of
times the Tesla drove into a semi trailer. Or the several other stationary
emergency vehicles they've hit. At least the semis were moving.

We don't really know how well the Tesla automation does vs human drivers,
because the statistics are complex, and Tesla only gives us deceptive
summaries.

The capabilities that Tesla cars have aren't that much different than other
luxury cars, but the other manufacturers don't have NTSB investigations,
because their marketing isn't convincing people the car can drive itself.

~~~
FireBeyond
> The capabilities that Tesla cars have aren't that much different than other
> luxury cars, ... because their marketing

It's actually somewhat impressive, if frustrating, how well Tesla's marketing
works.

In a previous article here a couple of weeks ago, a Tesla owner was touting
"how much better" Tesla's blind spot monitoring was than "other
manufacturers'" because it took speed differentials into account, even though
my 2015 Audi A4 did exactly the same thing. If you were faster than someone in
your blind spot, it wouldn't activate. If speeds were similar, it'd activate
at close range. If they were vastly outpacing you (I'd see it when adjacent to
an HOV lane), it'd activate well within a safe distance to factor in the speed
differential.
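The behavior described above reduces to a simple rule where the alert range scales with closing speed. This is a sketch of my reading of the comment, not Audi's or Tesla's actual algorithm; every threshold here is invented for illustration:

```python
# Hypothetical speed-differential blind-spot logic.
def blind_spot_alert(own_speed: float, other_speed: float, gap: float) -> bool:
    """Speeds in mph, longitudinal gap to the other car in feet."""
    closing = other_speed - own_speed  # how fast they're gaining on you
    if closing <= 0:
        return False               # you're outpacing them: no alert
    if closing < 5:
        return gap < 15            # similar speeds: alert only at close range
    return gap < 15 + 6 * closing  # fast closer: alert much earlier

print(blind_spot_alert(65, 60, 10))   # you're faster -> no alert
print(blind_spot_alert(65, 67, 10))   # similar speeds, close -> alert
print(blind_spot_alert(65, 85, 100))  # HOV traffic closing fast -> alert
```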

While there are "dumb" Blind Spot sensors, multiple Tesla owners were of the
fervent belief that this was an example of how unique Tesla's technology is.

~~~
Fins
Even 2011 has it, and I am sure it wasn't the first year it was available
either.

------
Confusedcius
The driver was found to have been using his phone, over-relied on the
autopilot system, and lied about what he was doing when questioned. Sadly, we
can't trust these automated driving systems 100% yet, so folks need to stay
attentive behind the wheel. More accidents like this will push lawmakers to
create laws that could slow down automated driving development.

~~~
c3534l
The best way to stay attentive behind the wheel is to actually be the one
driving. There's virtually no way of being both attentive and passive for long
periods of time. If a driver can't check their email or whatever on their
phone while autopilot is on, then autopilot is not safe to put in cars. And
while I'm okay with not-exactly-safe for most things people willingly consume
or use, driving is not an area where it's okay to roll out a feature that may
cause people to stop paying attention to the road.

~~~
chrisseaton
> There's virtually no way of being both attentive and passive for long
> periods of time.

How do aircraft pilots manage it?

~~~
organsnyder
Extensive training, restriction of hours worked, health restrictions (pretty
sure my ADHD would prevent me from being a commercial pilot)...

Also, my understanding is that aircraft autopilot systems are designed to
allow for pilot response to take up to three seconds. Three seconds on the
highway is a lifetime.

~~~
anticensor
Car driving also has health preconditions.

------
KT-222
Did the sun play a role in this? This picture from the report shows a low sun
directly ahead. (The southbound 405 heads SSE in Culver City; the accident was
at 8:40 AM.)

[https://imgur.com/a/OaHNqdQ](https://imgur.com/a/OaHNqdQ)

Assuming it was taken ~1 hour after the accident, the sun would have been
lower and to the left at the time of the accident. The slightly askew fire
truck might have been almost directly facing the sun, with its prominent rear
features in as much shadow as possible. Obviously there were other factors
involved (yes, ultimately the driver's fault), but do Tesla's cameras have the
dynamic range to look directly at a low sun and also see details in shadows?
Originally I thought this was a high-speed accident and am surprised to learn
it happened at 20-30 mph.

------
jmpman
If Tesla is unable to brake for a stationary fire truck with 3+ seconds of
visibility, why should I be confident enough to pay $6k for “Full Self
Driving”? Recognizing a stopped fire truck (effectively a wall) in front of me
is a prerequisite for me to invest further. A part of me believes the public
is being intentionally misled about the capabilities, and Tesla’s lack of
transparency, coupled with their financial incentive to obscure, makes me more
cynical.

What Tesla needs to do is to classify every known accident, and then recreate
the accident (video please) followed by a recreation with their fixes. If the
scenario can’t be prevented in software, it needs to be disclosed to the
buyer.

Would Tesla crash into that same fire truck again with the newest software? I
suspect yes. I even suspect there are a bunch of engineers sitting around
Tesla confident that crashing into that fire truck was the right thing to do.

Heck, I think Tesla should be required to put an asterisk next to every self-
driving claim: one for every known scenario where their product has failed,
with links to the crash reports, including full video evidence. The public
should have a right to know exactly where/when/why the system failed, at least
until those failures are prevented in software from happening again.

~~~
WhompingWindows
Why should anyone buy a feature for $6k that basically gets them nothing until
_a future point_ when they could buy it anyways?

~~~
grecy
It will be more expensive later.

~~~
klwejchkwejrhc
The thing is, it almost certainly still won't exist later.

------
hnburnsy
Report...

Highway Accident Brief: Rear-End Collision Between a Car Operating with
Advanced Driver Assistance Systems and a Stationary Fire Truck

[https://www.ntsb.gov/investigations/AccidentReports/Pages/HA...](https://www.ntsb.gov/investigations/AccidentReports/Pages/HAB1907.aspx)

~~~
c22
Thanks for the link. Interestingly, the NTSB's page appears unable to scroll
at all with javascript disabled, but I was still able to view all the content
by resizing the viewport(?) in both Firefox and Chrome. A very odd design
decision...

------
jakeogh
This shifting blame to the cloud is a bad trend. It's the driver's fault.

~~~
olliej
... but the driver was the car...

~~~
sjwright
Not legally. Has Autopilot got a driver’s license?

The fact that software was in control at the time is no more relevant than the
guy who asks his twelve year old son in the passenger seat to reach over and
steer while the driver rifles through stuff on the back seat.

Responsibility for an accident doesn’t ever rest with whatever entity is
steering the car, it rests with the licensed driver. (Which could be software,
when it gets good enough.) Anything else is madness.

~~~
wbl
This report is about determining what features are inherently dangerous. It is
very relevant that autopilot can't detect vehicles.

~~~
jakeogh
A common marketing trick is to steal a word by naming your product after it.
Suddenly you have Autopilot when it's really "Autopilot". Or better, "Tesla
Autopilot".

~~~
sjwright
Yes, a respectable company like Chrysler wouldn't use such an inaccurate term.

[https://i2.wp.com/www.curbsideclassic.com/wp-content/uploads/2018/01/Chrysler-1958-Auto-Pilot-Brochure-03.jpg](https://i2.wp.com/www.curbsideclassic.com/wp-content/uploads/2018/01/Chrysler-1958-Auto-Pilot-Brochure-03.jpg)

------
quotemstr
I worry much more about public overreaction to rare incidents slowing and
overregulating self driving technology than I do about the rare crash here and
there. Overall statistical safety is what matters, and statistics goes out the
window when the public emotionally fixates on specific mishaps and demands
regulation that slows and cripples overall-safe technology.

Everyone believes, without reason, that _he_ is a good driver and will always
beat those janky autopilots. HN threads about autonomous vehicles are always
full of unrepresentative anecdotes and angry moral denunciations of
technology. It's sad.

~~~
tim333
Overall statistical safety just now for Tesla autopilot is probably worse than
plain human driving. You can reasonably argue though that allowing research
now will save lives in the future.

------
aduitsis
NTSB Report here:
[https://ntsb.gov/investigations/AccidentReports/Reports/HAB1...](https://ntsb.gov/investigations/AccidentReports/Reports/HAB1907.pdf)
(found it here: [https://www.theverge.com/2019/9/4/20849499/tesla-autopilot-crash-culver-city-2018-ntsb-report](https://www.theverge.com/2019/9/4/20849499/tesla-autopilot-crash-culver-city-2018-ntsb-report))

Copying: _When the last lead vehicle changed lanes—3 to 4 seconds before the
crash—revealing the fire truck on the path of the Tesla, the system was unable
to immediately detect the hazard and accelerated the Tesla toward the
stationary truck. By the time the system detected the stationary vehicle and
gave the driver a collision warning—0.49 second before impact—the collision
was imminent and the warning was too late, particularly for an inattentive
driver. The AEB system did not activate. Had the driver been attending to the
driving task, he could have taken evasive action to avoid or mitigate the
collision._

So there you have it: "particularly for an inattentive driver".
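The arithmetic makes "too late" concrete. A back-of-envelope sketch using the report's impact speed; the 1.5 s perception-reaction time and 0.7 g braking are my own textbook-typical assumptions, not figures from the report:

```python
# How much road does a 0.49 s warning buy at 30.9 mph?
MPH_TO_FPS = 5280 / 3600             # 1 mph = ~1.467 ft/s

speed_fps = 30.9 * MPH_TO_FPS        # ~45.3 ft/s at impact
warning_distance = speed_fps * 0.49  # ground covered during the warning

reaction_time = 1.5                  # assumed perception-reaction time, s
decel = 0.7 * 32.174                 # assumed braking decel, ft/s^2
stopping_distance = (speed_fps * reaction_time
                     + speed_fps**2 / (2 * decel))

print(f"covered during the 0.49 s warning: {warning_distance:.0f} ft")   # ~22 ft
print(f"needed by an alert driver to stop: {stopping_distance:.0f} ft")  # ~114 ft
```

Under these assumptions the warning buys about 22 feet against the roughly 114 feet an already-alert driver would need, which is why the report singles out the inattentive driver: the 3-4 seconds before the warning were the only usable window.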

Not really an expert in the field, but I think a major problem with self-
driving in general is handing over from an autopilot to a human. This is
something that cannot be easily solved just by throwing smarter technology at
the problem. Even if autopilot technology manages to be reliable 99.9% of the
time, this won't be good enough, because people will invariably think they can
rely on it 100% of the time and fall asleep or get distracted. So this 0.1%
can be potentially catastrophic, because you cannot go from being asleep to
full alertness in 0.5 seconds. Besides, even if one _actually wants_ to
maintain alertness, it may be harder to be alert and not doing anything than
being alert and doing something (driving).

So, unless autopilots can truly handle anything thrown at them, this will be a
real issue. Although there is much willingness for people to convince others,
and to be convinced, that this kind of autopilot is very close to being a
reality, I suspect it is not.

~~~
olau
There was an interview with George Hotz from comma.ai where he talked about
this.

His fix is a very attentive driver-monitoring setup: if you are not paying
attention, it will barf at you. IIRC he also talked about how this driver
monitoring is worth something in itself to trucking companies, since it means
fewer damage claims.

------
torpfactory
I think we should look at this probabilistically. Yes, autopilot should be
improved to prevent this, but no system is perfect. Certainly not human
drivers. IMO the correct comparison is whether Teslas are on average more
likely to hit firetrucks while on autopilot than normal vehicles while being
driven by humans.

This data most likely won't be forthcoming, but I think it's important to keep
in mind that we should be looking at accidents fleet-wide and not in
isolation. Autonomously driven cars will have accidents. More importantly,
their failure modes will be different from human drivers'. However, as soon as
they are even a small margin safer than human drivers, their adoption will
save lives. Besides, the systems will continue to get better. Human drivers
have proven stubbornly difficult to improve.

We ought not reject autonomous driving technology only because it sometimes
gets in accidents that appear to be strange or arbitrary to us. I consider the
carnage perpetrated by human drivers to be arbitrary, but most people just
take death on the roads as a fact of life.

------
joe_momma
Hey Elon, why don't you build a car course and fill it with all the things
your cars have failed to avoid. Why haven't you made some OTA for large
obstacle non car thing?

------
alexanderh
Even with Autopilot, everyone I know who owns one treats it like their little
baby... they don't dare risk it at all. The people doing stuff like this are
either really stupid, or really rich... or both. I have never seen anyone I
know drive recklessly with Autopilot.

~~~
ModsCtrlideas
I was looking to get a Leaf or a Cadillac in a few years, when their autopilot
systems show up on used vehicles.

Isn't the point that its... Autopilot?

------
kazinator
> _The vehicle’s design “permitted the driver to disengage from the driving
> task”_

Which vehicle's design does not permit a driver to disengage from the driving
task?

~~~
olliej
Autopilot. By design it encourages drivers to stop paying any attention: the
fact that it needs a "signal the driver" mechanism to try to maintain driver
focus indicates that the core design produces this behavior.

If the core design produces that behavior in a large enough portion of the
user base to require such a feature, that's a fairly good indication that the
core design is causing a problem.

~~~
m463
You are answering the opposite of the question.

 _all_ cars allow you to stop paying attention.

I think the idea behind autopilot is that it reduces your workload (like
cruise control, or automatic wipers, or automatic headlights or any other
labor-saving device)

~~~
olliej
Tools like autopilot _encourage_ lack of attention.

The problem isn't "you don't have to pay attention to X", it is "you don't
have to pay attention to _anything_ during normal operation". It is human
nature, if not outright physiology, that makes it incredibly difficult to keep
someone engaged with, and paying attention to, a task they are not actually
doing.

The problem with halfway solutions like autopilot isn't that they control the
car some of the time, it's that they can, essentially without warning, _stop_
controlling it.

Self driving cars will happen. But the current model is essentially "do enough
of the driving to maximize the likelihood that the human 'driver' is not going
to pay attention, but don't do enough to make their lack of attention safe".

My statement isn't "cars allow you not to pay attention", it is "systems like
autopilot actively encourage a lack of attention". That's backed up by the
need for alert systems triggered by obvious signs that the driver isn't paying
attention, like no longer holding the wheel. If the system didn't, in general
usage, result in a lack of attention, such a system would not be necessary.

------
navigatesol
The Tesla damage control astro-turfing is out in full force.

Between the surveillance state they've built and their complete disregard for
safety, Silicon Valley will eventually be forced to atone for this from the
public at large. Make sure you're on the right side of history.

~~~
tim333
Historically the right side has generally been technology making people's
lives better, at least in material terms such as life expectancy, diseases
cured, and so on. They still moan though.

