
Tesla ‘on Autopilot’ slams into parked fire truck on freeway - lakisy
https://www.mercurynews.com/2018/01/22/tesla-on-autopilot-slams-into-parked-fire-truck-on-freeway/
======
frgtpsswrdlame
You know I take issue with Tesla on these incidents. They're trying to have it
both ways. They market it as 'autopilot' to appeal to the dream of complete
self-driving but if an accident actually happens it can never be the fault of
autopilot. After all, the driver is required to be paying attention, so if
they didn't override Autopilot it's obvious they weren't paying attention.
Really, I'd like to see the government step in and force Tesla to market
their product more honestly.

~~~
jsjohnst
I couldn’t agree more. It baffles me how Tesla can get away with such blatant
false advertising (it’s anything but “autopilot” in any layman’s sense),
especially in light of the multiple bad accidents in which it was shown to be
at fault.

~~~
sidcool
In an interview, Elon mentioned that 'Autopilot' was a name chosen to be akin
to flight autopilots. Pilots are never expected to take their eyes off the
controls.

This crash is unfortunate and should be investigated. But I don't think the
Autopilot naming is the problem.

~~~
ucaetano
> Pilots are never expected to take their eyes off the controls.

Pilots do so all the time, because airplanes aren't flying in a congested 2D
space. Airspace is mostly empty, with very large distances between planes and
features, allowing for plenty of early warning.

And an airplane's "autopilot" has nothing to do with Tesla's Autopilot. Their
purpose and working mechanisms are completely different.

The airplane analogy is pure bullshit.

~~~
manicdee
The aircraft autopilot is pretty much the same thing. It steers along the
designated course, maintains attitude, and relies on the pilot to be aware of
conditions.

It operates the controls but relies on the human operator for supervision.
Pilots talk about “flying in the magenta” and exactly the same thing applies
to Tesla Autopilot.

Believing that autopilot means something different is the bullshit attitude
that has to change.

Warren VanderBurgh gave a classic presentation, Children of the Magenta:
[https://youtu.be/pN41LvuSz10](https://youtu.be/pN41LvuSz10)

~~~
nightski
An aircraft autopilot is not designed to automatically avoid collisions with
other traffic or animals. Its purpose is only to maintain course, speed, and
altitude. Even in VFR conditions, where autopilots are useful, continuous
scanning of your surroundings is necessary.

Meanwhile a Tesla "autopilot" is designed to drive in traffic with other
vehicles.

~~~
jsjohnst
> An aircraft autopilot is not designed to automatically avoid collisions

The A380 begs to differ...

[http://www.airbus.com/newsroom/press-releases/en/2009/08/easa-certifies-new-autopilot-flight-director-tcas-mode-for-a380.html](http://www.airbus.com/newsroom/press-releases/en/2009/08/easa-certifies-new-autopilot-flight-director-tcas-mode-for-a380.html)

~~~
nightski
Yes, I was aware of TCAS. There are some cool technologies out there. But I'm
pretty sure that of all aircraft with autopilots, the number with this
technology is extraordinarily low, especially if you consider only personal
aircraft and not ones designed to carry over 1,000 passengers.

~~~
jsjohnst
The topic of conversation in this thread is about the system that the general
population thinks about when they hear “airline autopilot”. Most folks would
naturally think of commercial passenger planes and their (believed)
capabilities.

------
michaelmior
> The California Highway Patrol and Culver City Fire Department confirmed the
> southbound Tesla had struck the fire truck, but could not immediately
> confirm whether the vehicle had been on Autopilot.

It sounds like it's not 100% confirmed that Autopilot was at fault. The first
sentence of the article also reads "_reportedly_ on 'Autopilot'".

~~~
ProfessorLayton
The more I think about it, the more I think that there should be an outward
signal, such as a special light, that a car is in self-driving mode. We
already require signals when changing lanes, reversing, and for electric cars
to be audible at low speeds (starting in 2019). Those nearby shouldn’t have to
guess if a car is under computer control.

~~~
SheinhardtWigCo
What do you think human drivers should do differently when driving near a car
that is under computer control?

~~~
sjwright
The equally interesting question is what you think _computer drivers_ should
do differently when driving near a car that is under computer control.

For example, if your computer-controlled car knows that the car behind is
also computer-controlled, it could be more aggressive on the brakes to avoid
a forward collision without causing a rear collision.

~~~
greglindahl
Lots of cars have automatic braking these days, and I bet exactly zero of them
care about the car behind them.

~~~
sjwright
Correct, that's the situation today.

Tomorrow, the situation might be that a computer controlled car could see that
the car behind is tailgating and the car in the next lane is computer
controlled — then determine the safest action would be to swerve into the next
lane rather than brake suddenly...

~~~
greglindahl
That's getting into trolley problem territory. I'd prefer that my car not do
crazy shit.

~~~
sjwright
There's no way to avoid the trolley problem as long as you're developing self-
driving cars. You _can_ wilfully ignore the trolley problem and let the
algorithm do whatever it does, but that doesn't make it go away.

And I would dispute that emergency braking suddenly while being tailgated is
less "crazy shit" than swerving into the next lane where your presence will,
with an exceptional degree of confidence, be immediately detected and
compensated for.

To clarify, the scenario in my mind is one that would be bold but not
dangerous; something a good human driver might do with 360° situational
awareness. In this instance it could be the difference between swerving to
miss an obstacle and a multiple-car collision — even if your self-driving car
can stop in time, the tailgater is likely to slam into you, which in turn
slams you into the obstacle.

Or to change the situation a bit, if the "tailgating" car happened to be a
self driving car with an aggressive profile, that could swing the decision
back towards braking. Point is, knowing that a car is being driven by a
computer can aid decisions made by other cars.
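The brake-or-swerve trade-off described above could be sketched as a toy decision function. Everything here is hypothetical for illustration — the function name, the inputs, and the rules are invented, not anything a real vehicle implements:

```python
def choose_evasive_action(tailgater_close, tailgater_automated,
                          adjacent_lane_automated):
    """Pick an evasive maneuver when a sudden obstacle appears ahead.

    Toy model of the trade-off in the thread: hard braking is safest in
    isolation, but with a human tailgater close behind, swerving into a
    lane occupied by a computer-controlled car (which will reliably
    detect and yield) may avoid a chain collision.
    """
    if not tailgater_close:
        # Nobody close behind: braking hard risks nothing extra.
        return "brake"
    if tailgater_automated:
        # An automated follower can be trusted to brake in time too.
        return "brake"
    if adjacent_lane_automated:
        # Human tailgater behind, cooperative car beside: swerve.
        return "swerve"
    # Worst case: no safe escape lane, brake anyway.
    return "brake"
```

The point of the sketch is only that the "is the neighbor computer-controlled?" bit changes the output — which is exactly the information an outward signal would provide.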

------
dreamcompiler
Firefighter here. This kind of collision happens all the time -- although it's
usually a manually-driven car. This is why we park the BRT* behind the
accident and we try to stay in front of the truck. A moving 4000-lb car is no
match for a stationary 30,000-lb fire truck.

*Big Red Truck
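A back-of-the-envelope check of that mass mismatch, assuming a perfectly inelastic collision (a simplification real crashes only roughly approximate):

```python
def post_collision_speed(m_car_lb, v_car_mph, m_truck_lb):
    """Speed of the combined wreck after a car hits a stationary truck,
    modeled by conservation of momentum in a perfectly inelastic
    collision. The pound units cancel, so no conversion is needed."""
    return m_car_lb * v_car_mph / (m_car_lb + m_truck_lb)

# A 4,000 lb car at 65 mph into a stationary 30,000 lb fire truck:
# the combined wreck moves off at under 8 mph, so the truck barely
# budges while the car absorbs nearly all of the speed change.
print(round(post_collision_speed(4000, 65, 30000), 1))  # ~7.6 mph
```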

~~~
greglindahl
Yeah, it's pretty amazing how tough that truck is compared to the car.

------
jread
As a current Tesla owner and frequent autopilot user I can attest to this
being a scenario where autopilot has not worked for me - traveling at high
speed and approaching a stopped vehicle from a distance. I now always take it
off autopilot when this occurs because it will not brake in time.

~~~
nikkwong
From a technical perspective—why would Tesla not have taken more care to
address this scenario? It seems like low hanging fruit in comparison to other
challenges that autopilot has to be able to handle.

~~~
benfar
They did address it, starting with software 8.0. See
[http://www.tesla.com/blog/upgrading-autopilot-seeing-world-radar](http://www.tesla.com/blog/upgrading-autopilot-seeing-world-radar).
Chances are that evidence will show that the vehicle detected the fire
truck, gave audible warning, and then started braking. That's all if the
driver had not disabled forward collision warning and automatic emergency
braking. They are standard safety features and unless disabled are active
during manual driving as well as when Autopilot is active. I don't work for
Tesla, but I am very familiar with this type of technology and Tesla vehicles'
owner manual descriptions of operation.
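The escalation sequence described (detect, then audible warning, then automatic braking, each step skipped if the driver disabled that feature) might look roughly like this. The thresholds and names are invented for illustration, not Tesla's actual calibration:

```python
def forward_collision_response(time_to_collision_s,
                               fcw_enabled=True, aeb_enabled=True):
    """Toy escalation ladder for an obstacle ahead.

    Returns the list of actions taken as time-to-collision shrinks.
    The 3.0 s and 1.5 s thresholds are illustrative only.
    """
    actions = []
    if fcw_enabled and time_to_collision_s < 3.0:
        actions.append("audible_warning")    # warn the driver first
    if aeb_enabled and time_to_collision_s < 1.5:
        actions.append("automatic_braking")  # then brake automatically
    return actions
```

Under this model, the logs benfar predicts would show the warning firing before the braking — and nothing at all if the driver had disabled both features.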

------
taneq
As I understand it (and please correct me if I'm wrong) the automatic
emergency braking (AEB) system is separate from the autosteer system on a
Tesla. It seems to me that this kind of incident is a result of AEB failure,
not of autosteer failure.

I wonder how Tesla's AEB stacks up against similar systems on other vehicles?
There are a lot of vehicles on the road now with automatic braking, but we
don't see every single frontal collision on such cars reported as some kind of
systemic failure.

~~~
benfar
Tesla's AEB likely triggers earlier than many others. See this example:
[http://youtu.be/FadR7ETT_1k](http://youtu.be/FadR7ETT_1k). The vehicles
ahead were moving, but after
[http://www.tesla.com/blog/upgrading-autopilot-seeing-world-radar](http://www.tesla.com/blog/upgrading-autopilot-seeing-world-radar)
was implemented in software 8.0, I believe the same was possible for
already-stopped vehicles, too. We'll find out in the accident report, and
perhaps directly from Tesla after they retrieve the vehicle's data.

------
drtz
A Tesla slammed into the back of a stopped firetruck at 65mph and the driver
walked away uninjured.

Why am I not already driving a Tesla?

~~~
FireBeyond
Because most modern vehicles are made to the same crash safety standards and
would likely have a similar outcome?

The Tesla isn't magical, made of adamantium, or such.

Not to sound overly snarky, but as someone who has been a firefighter /
paramedic for the best part of a decade, I've certainly seen my fair share of
MVAs up close.

~~~
Doxin
It does, however, have an impressive crash rating. Let's not pretend all
modern cars are equal in crash safety.

------
rootusrootus
I wonder if it is imprudent to create too many vehicles with partial
self-driving capability before we have ones that can do 100% of it. They're
capable enough that you're tempted to take your attention away from the road,
but not capable enough to reliably avoid accidents on your behalf. Too many
people aren't wise enough to use partial autonomous technology without
abusing it.

~~~
ocdtrekkie
People have been safely and intelligently using cruise control for... decades?
It is a partial automation solution that takes over half of what you have to
manage when controlling the car (how fast you're going).

I suspect the primary problem is not the partial autonomous technology, but
the marketing which is convincing people they can trust it. Nobody ever told
you your cruise control means you don't have to pay attention, so very few[1]
incidents happen due to misuse of it.

First and foremost, Tesla should never have been allowed to release
"Autopilot" under that name. The very name gives a false impression of what
it can do.

[1]I know you can find one. Humans are remarkably stupid.

~~~
aardvark291
> "People have been safely and intelligently using cruise control for...
> decades?"

More than half a century in mass-market automobiles.
[https://en.wikipedia.org/wiki/Cruise_control#History](https://en.wikipedia.org/wiki/Cruise_control#History)

------
js2
This is the driver's fault. But how the heck did both the driver and the
autopilot not see a bright red truck in broad daylight?

------
ojosilva
To me, Tesla Autopilot accidents are like airliner crashes: they lead to a
safer system, one that in the future will avoid the same mistake again.
Compare that to human-operated vehicles: we keep getting into the same
accidents over and over. The fact that Autopilot data gets sent back to the
factory, analyzed, and fed back into the production line is a game changer
that will help Tesla stay ahead of the competition for a while to come. Right
now the rate and importance of Autopilot system iterations are an order of
magnitude higher than any other auto-safety evolution from the past.

------
manicdee
I take issue with people who insist on abrogating responsibility. The Tesla
tells you to keep your hands on the wheel.

Pilot training covers techniques for avoiding automation dependency (aka
letting the autopilot kill you and all your passengers). Perhaps we need the
same for car drivers: if the car were as good a driver as you, it wouldn't
give you the option to drive.

[https://youtu.be/pN41LvuSz10](https://youtu.be/pN41LvuSz10)

------
NDizzle
You know how stoplights respond to emergency vehicles' flashing lights? I
think autopilot systems need to get more in tune with those systems as well.
An alert needs to fire when the car detects an emergency vehicle.

I've heard of people not turning off their cruise control when going through
an accident area. Boneheads. This is no different - just involving the thing
after cruise control.

------
cypherpunks01
Seems fairly impressive that nobody was killed.

Will this turn out to be another "brightly lit sky" autopilot bug, like the
2016 semi truck incident?

[https://www.tesla.com/blog/tragic-loss](https://www.tesla.com/blog/tragic-loss):
"Neither Autopilot nor the driver noticed the white side of the tractor
trailer against a brightly lit sky, so the brake was not applied"

~~~
JustSomeNobody
Wait, so are non-(human-)visible wavelengths not part of the autopilot system?

~~~
kuschku
Tesla, to save money, is trying to build Autopilot from regular video cameras
in the visible spectrum. EDIT: Some models still contain other sensors, but
Musk has been trying to go purely optical for a long time.

~~~
jeffasinger
I really think they're going to have to consider using forward facing radar as
well.

~~~
toomuchtodo
They do. After the Florida crash, they began leaning even more heavily on the
forward-facing radar.

[https://www.tesla.com/blog/upgrading-autopilot-seeing-world-radar](https://www.tesla.com/blog/upgrading-autopilot-seeing-world-radar)

------
spraak
Just the other day I was using the Tesloop [1] service from near LAX and the
driver was praising the autopilot feature. I thought it was really neat at the
time, but I would be much more hesitant about it now :/

[1] [https://tesloop.com/](https://tesloop.com/)

------
imron
I guess it's not all bad news - a collision with a stationary object at 65mph
and the driver was able to walk away

> Because of the force of the impact, firefighters advised the Tesla driver
> that he should be taken for a medical evaluation, but he showed no
> significant injuries and refused treatment,

~~~
justasitsounds
Bet the driver was voluntarily impaired?

~~~
taneq
Are you implying they may not have been sober? (Which is a fair guess if they
can afford a Tesla but want to avoid a medical examination, I guess.)

~~~
justasitsounds
That is what I was implying - but people do act irrationally in the aftermath
of a traumatic event like that

------
euske
I was thinking: is it possible to attach a weird control gizmo to your car,
call it autopilot, and blame the device when it crashes?

It is not allowed in Japan (where I live) because of the draconian vehicle
inspection program, but I guess in some US states it is more liberal.

~~~
CodeWriter23
Or as an excuse for driving drunk
[https://mobile.twitter.com/CHPSanFrancisco/status/9544189332...](https://mobile.twitter.com/CHPSanFrancisco/status/954418933225762816)

------
sosuke
"The California Highway Patrol and Culver City Fire Department confirmed the
southbound Tesla had struck the fire truck, but could not immediately confirm
whether the vehicle had been on Autopilot."

So was it on Autopilot or not?

------
throwaway42312
This is worrying. The Florida case was understandable, but I fail to
understand how a big red vehicle with flashing lights could have been missed
by Tesla's vision system.

------
Clanan
How obvious is the data recording on a Tesla? There's always a chance an
inattentive driver will blame the smart car.

~~~
smt88
It's pretty granular, almost like a flight recorder. They've defended
themselves against similar claims before by publishing details from the car's
logs.

~~~
FireBeyond
Yeah, though some of their defenses have been a little ... on the nose.

"Autopilot was not active at the time of collision, according to logs"...
because it was deactivated by hard manual braking a moment before, because
Autopilot was about to cause a hard collision...

------
inamberclad
Level 2 autonomy is killer - literally. It's like no one there has seen
Children of the Magenta.

------
vortico
The driver was uninjured, and his car (which most people on this forum love)
appears to be totalled. Why is this an interesting news story?

