
Federal safety official slams Tesla, regulators for misuse of its Autopilot tech - velmu
https://www.latimes.com/business/story/2020-02-25/tesla-autopilot-crash-hearing
======
tristanb
I don't agree with many of the points here. My partner bought a Tesla; I had
little interest in them. I now would buy no other car. The autopilot has
changed how I drive forever. We regularly take long three-hour drives to get
to a scenic spot that I would never consider doing in another car. With
autopilot engaged, I am monitoring everything - probably with more accuracy
than if I was driving, due to the decreased cognitive load. I don't get tired
like I do in another car, with all the negative effects that come with that.

I'm a safe driver, and autopilot makes me better at driving. I am not surprised
that if you don't follow the instructions, bad things can happen. (I believe
statistically less often than with meat-powered driving.) That is on the
human, not the technology.

The guy in the article was playing a phone game. He crashed into a barrier he
had had problems with before.

Same guy, playing a phone game, whilst on cruise control in a BMW. Should we
regulate that? No, we should regulate idiots in cars.

~~~
allovernow
>With autopilot engaged, I am monitoring everything - probably with more
accuracy than if I was driving, due to the decreased cognitive load.

This sounds like a false sense of security. I find it extremely difficult to
believe that a typical human can maintain their attention glued to the road in
a purely passive manner for hours on end. Even if you were paying attention,
without your hands constantly on the steering wheel you would have a slowed
reaction time where milliseconds matter.

More importantly, this kind of vigilance defeats the purpose of autopilot. I'm
not necessarily saying you're doing anything dangerous, since the autopilot
function seems to be fairly safe, but your apologism comes across a bit like
an alcoholic claiming they drive better when drunk.

~~~
zaroth
It's actually less exhausting than manual driving.

With manual driving you have to focus predominantly on one specific task -
keeping your car between the lines. Bends in the road, poorly drawn lines,
irregularities in the road surface, gusts of wind, all conspire to make this
task require constant attention. Look in the rear view mirror for a couple
seconds and then look back ahead and you might find that you've drifted, etc.

With AutoPilot the task has changed. You are no longer concerned with minute
adjustments to the wheel, and you no longer have to maintain a constant high
priority task of staying centered in the lines. Now, instead, you perform
higher order functions of route planning, estimating traffic flow, observing
other drivers and whether they are paying attention or drifting toward you,
etc.

So by no means are you passive at all. It's actually an extremely engaging,
but significantly less monotonous, form of driving. I find it quite pleasant,
and not at all hard to stay engaged.

If I'm going to play a game on my phone, it's because I'm choosing to risk my
life and the lives of the drivers/passengers around me, not because I couldn't
handle looking out the window to pay attention.

~~~
eaurouge
> You are no longer concerned with minute adjustments to the wheel, and you no
> longer have to maintain a constant high priority task of staying centered in
> the lines. Now, instead, you perform higher order functions of route
> planning, estimating traffic flow, observing other drivers and whether they
> are paying attention or drifting toward you, etc.

Staying centered in the lane should be muscle memory for an experienced
driver. The higher order functions you mentioned are all things you should be
doing regardless, with or without autopilot. I can perform those tasks just
fine while driving a stick shift, most drivers can.

The other side of the coin is that reduced engagement in the driving process
can lead to lapses in concentration. That's not just my experience, it's the
same opinion expressed by F1 drivers tasked with bringing the lead car home in
an unchallenging grand prix.

~~~
LeifCarrotson
I've driven about 250,000 miles in the last 10 years (a lot of trips for work,
coupled with a 50-mile commute for all but the last 2). 2 of 3 personal
vehicles and 2 of the 5 work trucks were manual transmission.

Apparently it takes more than that to be an "experienced driver", because
lanekeeping and speed control leave me very fatigued after a 3-hour drive. I
became accustomed to eating in at diners for lunch and sitting down for dinner
halfway home, but other co-workers were not fatigued and could go through the
drive-through and eat in the truck.

It's definitely easier than it was, and I don't have to think about the
mechanics much at all anymore when backing up a 5th wheel or controlling a
truck sliding in the snow, but this human doesn't have a chunk of brain that
can do lane centering automatically. That is not a universal brain function;
every human can drive, but it takes a lot more energy and concentration for
some of us. My stress levels have gotten lower and my life has gotten a lot
better since I realized that and cut my commute down to a 2.5-mile trip with
~monthly travel instead...

~~~
AstralStorm
Still, you can have lane keeping without all the rest of the alleged autopilot
problems, along with automatic speed control and emergency braking.

The system is supposed to intervene immediately when it detects you're not
responding. Keeping hands on the wheel is not good enough; an active attention
system, similar to the dead-man's switches used on trains, should be employed,
and a sharper one at that.

Something as simple as a steering wheel button that must be pressed within
half a second of a signal lighting up, or else the automatic emergency system
engages.

Perhaps even in all cars, not just automated ones, though it's hard to
implement safely without automatic steering and a lane keeping system.
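The logic being proposed is just a deadline check. A minimal sketch of that
kind of dead-man's-switch attention cycle, with all names, timings, and the
`check_attention` function being illustrative assumptions rather than any real
vehicle's API:

```python
from typing import Optional

# Hypothetical attention check: a signal lights up, and the driver must
# press a steering-wheel button within a deadline, or the car falls back
# to an automatic emergency response. Purely illustrative.

ATTENTION_DEADLINE_S = 0.5  # "press within half a second"


def check_attention(prompt_time: float, press_time: Optional[float]) -> str:
    """Classify one attention-check cycle.

    prompt_time: when the signal light came on (seconds)
    press_time:  when the button was pressed, or None if never pressed
    """
    if press_time is None:
        return "engage_emergency"  # no response at all
    if press_time - prompt_time <= ATTENTION_DEADLINE_S:
        return "ok"
    return "engage_emergency"  # too slow; treat as inattentive


print(check_attention(10.0, 10.3))  # pressed 0.3 s after the light: ok
print(check_attention(10.0, 10.9))  # 0.9 s is past the deadline
print(check_attention(10.0, None))  # never pressed
```

The hard part in practice isn't this check; it's choosing prompt intervals and
deadlines that catch inattention without becoming the kind of nag drivers
defeat with devices like the one linked elsewhere in this thread.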

------
dmitrygr
I remain amazed that Tesla can keep marketing "autopilot" and "full self
driving" as such, despite the obvious fact that it has killed, and will
continue to kill, people silly enough to believe the hype.

Hopefully the NTSB recommendations on this include better and clearer
explanations that full self-driving is anything but, and that those
recommendations get implemented into serious laws with serious penalties.

I get that it is easy to outsell your rivals when you can market fairy tales.
I just don't think it's okay when people start getting killed because of it.

~~~
nullc
I recently commented to a new Tesla buyer that their car seemed nice but I
couldn't see myself ever spending so much money on a car. They responded by
gushing that self-driving would be available within a couple years and then
the car would be making money and would pay for itself in no time so the car
was actually better than free.

I asked why they wouldn't just wait until it was actually available, and they
claimed that at that point Tesla would raise the prices to hundreds of
thousands per car because of all the income the cars would earn.

It took me a little effort to not respond by backing away slowly.

Are these beliefs widespread, and is Tesla promoting them? Or was I just
talking to an isolated weirdo?

~~~
tenacious_tuna
anecdotal evidence with n=1, but I'm a Tesla owner who refuses to pay for the
"Full Self Driving" tech, expressly because I think it's way further off and
overhyped compared to what I'd actually want for the cost.

I especially think the "your car becomes a taxi and pays for itself" gimmick
is beyond fantasy. I chalk it up to Musk being an actually mildly insane CEO,
which has benefits to go with its downsides.

As for the cost of the car, I wanted an EV with more range than other
manufacturers provided. (I also make use of the longer range on a regular
basis--probably about 2 times a month?) (edit:) I also wanted a car that Went
Fast. Not practical in any real sense, but a lot of fun.

There are around a dozen Tesla owners at my company. As far as I know, none of
them are banking on their cars being a taxi and earning tons of dough. At
least one has the FSD upgrade, and uses it primarily to back his car out of
the garage on its own so his kids don't bang up the doors when they get in.

~~~
lavezza
Also a Tesla owner (Model 3, have the FSD upgrade). I think Elon has a
particular way of seeing the future. There are people that say FSD will never
happen. But do they really mean that? Will people in 10,000 years be using a
steering wheel?

Elon sees FSD happening between 2020 and 12020. There is no known reason that
it can't happen. Now let's say that if we work real hard we can do it in 5
years instead of 5,000. If it takes 15 or 50 or 100 years, no matter. That's
basically the same, but the earlier we start, the sooner we have it.

Of course, there are those who think there is some spiritual/religious
component to true AI that we'll never be able to simulate. But Elon either
disagrees or thinks that level of AI isn't required for FSD.

~~~
dmitrygr
That's fine. Then don't say "your car can self drive" and don't accept money
for a "full self driving package" till you can actually DELIVER that. Or at
least add a disclaimer saying "* this may never happen, no promises but thanks
for the money, lol"

~~~
omgwtfbyobbq
To be fair, they do state that FSD features will be released in the future
based on achieving better reliability than humans over billions of miles and
after regulatory approval has been obtained.

_All new Tesla cars have the hardware needed in the future for full self-
driving in almost all circumstances.

...[blahblahblah FSD features]...

The future use of these features without supervision is dependent on achieving
reliability far in excess of human drivers as demonstrated by billions of
miles of experience, as well as regulatory approval, which may take longer in
some jurisdictions. As these self-driving capabilities are introduced, your
car will be continuously upgraded through over-the-air software updates._

[https://www.tesla.com/autopilot](https://www.tesla.com/autopilot)

If FSD as described doesn't happen, I'm sure there will be lawsuits and
refunds just like there was with AP1.

[https://www.reuters.com/article/us-tesla-autopilot-
lawsuit/t...](https://www.reuters.com/article/us-tesla-autopilot-
lawsuit/tesla-settles-class-action-lawsuit-over-dangerous-autopilot-system-
idUSKCN1IQ1SH)

~~~
t0mas88
But the refund after the settlement was $20 to $280, while these people paid
$5,000 for a previous version of Autopilot (now it's even more).

~~~
omgwtfbyobbq
The settlement was for the delays in delivering features, not because Tesla
didn't deliver features. I'm sure Tesla would have had to pay significantly
more had they not delivered the features promised.

[https://techcrunch.com/2018/05/25/tesla-settles-class-
action...](https://techcrunch.com/2018/05/25/tesla-settles-class-action-suit-
over-autopilot-claims-for-5m/)

------
Animats
The fundamental problem with Tesla's driver assistance system is
straightforward - it won't brake for anything other than the rear end of a
car. So far, Teslas on autopilot have hit, at speed, at least two
semitrailers, a fire truck, a street sweeper, and various other objects partly
obstructing a lane. The NTSB has pictures of most of these.[1] In most cases
there's no braking at all - the thing just plows into the obstacle at full
speed.

Musk's decision not to use LIDAR is paid for in blood.

[1]
[https://www.ntsb.gov/news/events/Pages/2020-HWY18FH011-BMG.a...](https://www.ntsb.gov/news/events/Pages/2020-HWY18FH011-BMG.aspx)

~~~
ebg13
> Musk's decision not to use LIDAR is paid for in blood.

LIDAR is not the only way to record depth information. Saying that LIDAR is
some kind of savior device is heavily into "solve all your fusion problems
with this one weird trick" territory.

~~~
totalZero
Don't be ridiculous. It's more like "see better with different glasses."

~~~
ebg13
Do you have a paper comparing state-of-the-art flash LIDAR to real-time
high-density stereo that makes you think it's better? Because my experience
with the field has been that lasers aren't better for this except in very
specific non-real-world scenarios.

~~~
ip26
Why is a paper required? Many of Tesla's early self-driving accidents involved
low contrast scenes where the system was unable to distinguish, e.g., a white
object from a wall, or a blue object from the sky. This is always a weakness
with optical, even for our own eyeballs.

------
gok
From the CNBC article [1]:

> The NTSB also called out Huang’s employer, Apple, for failing to set a
> strict policy for its employees banning non-emergency use of mobile devices
> while driving

Yeah the real problem here is insufficient employer paternalism during
commutes.

[1] [https://www.cnbc.com/2020/02/25/ntsb-calls-out-tesla-and-
app...](https://www.cnbc.com/2020/02/25/ntsb-calls-out-tesla-and-apple-for-
neglecting-driver-safety.html)

~~~
mr_toad
I see a lot of comments here and elsewhere that amount to exhortations to
protect people from their own stupidity, whether that be Tesla, or the
Government, or whoever.

Never underestimate the ingenuity of idiots. No matter how carefully you
design your safety protocols, some genius-level moron will find a way of
getting themselves killed.

~~~
quadrifoliate
The entire point of driver licensing is to "protect people from their own
stupidity" by teaching them how to drive. Now you have companies using nudge-
nudge-wink-wink terms like "Autopilot" that purport to be able to drive the
car for you, while also saying in their EULA or whatever that _actually_ , it
doesn't work like that.

I think it's quite reasonable, from the point of view of preventing _false
advertising_, to stop this. Removing all references to "Autopilot" and similar
and replacing them with sensible descriptions like "Driver Steering Assist",
like the other car manufacturers are using, would definitely satisfy me.

------
erik_landerholm
My friend has one and uses the auto-pilot responsibly; it works very well.
Until someone shows me concrete evidence that autopilot is responsible for an
inordinate number of accidents, at a rate higher than normal human driving,
this seems like a huge waste of time and fear mongering.

~~~
ravedave5
That's the thing - Tesla tracks these stats, and driving with auto-pilot has
fewer accidents -
[https://www.tesla.com/VehicleSafetyReport](https://www.tesla.com/VehicleSafetyReport)

~~~
czzr
I looked at the first report in your link. It says:

“we registered one accident for every 3.07 million miles driven in which
drivers had Autopilot engaged.”

And then: “By comparison, NHTSA’s most recent data shows that in the United
States there is an automobile crash every 479,000 miles”

This is true. It’s also a complete lie.

It’s set up to make you think that this proves Autopilot is safer than human
drivers, but it compares a national stat for all cars in all possible driving
conditions to a Tesla stat for high end cars in good weather freeway driving.
The latter are inherently much safer.

But so many people are falling for this blatant lying - plenty of examples in
this thread alone.

I hate that this works for them, and that it just encourages others to lie
similarly. I wish we had higher standards, as a society.

~~~
ceejayoz
Given the funny numbers, I'd also be interested in the number of accidents
right _after_ an autopilot disengage.

Does the car disengaging autopilot and smashing into something a quarter
second later count as an "autopilot engaged" crash?

------
BooneJS
The concrete is terrible there. At night it's hard to see the painted lines,
but the lines made by the concrete totally bend towards that left exit. It's
challenging to tell which lines to follow to stay in your lane.

I don't mean to give Tesla a pass for marketing autopilot, but that section of
road is hard for humans too.

~~~
swiley
I'm not sure the driving problem is defined well enough to be solvable and
this is a great example of that.

That's probably why ML gets used so much: you can't really prove it's
incorrect, but it works a lot of the time, so you can get away with selling it
until someone dies.

------
KingMachiavelli
> Two, the cushion already was severely damaged. After a Toyota Prius crashed
> into it two months earlier, the length of the attenuator was shortened,
> offering less protection against the 3-foot-tall concrete median wall behind
> it.

Maybe repairing the crash mitigation devices at frequent and deadly hazards
should be a higher priority.

Or maybe better marking the 'gore' area, since in its current state it's
simply an empty lane that looks identical to a real lane (i.e. if you can't
see the divider, it looks like a normal lane).

I don't get why we aren't looking into slightly changing our roads to better
work with self-driving cars when all it takes is a little more paint.

~~~
dmitrygr
> I don't get why we aren't looking into slightly changing our roads to better
> work with self driving cars when all it takes is a little more paint.

Because 99.9999% of drivers are human and have no issues seeing the concrete
barriers and not hitting them head on at full speed. Until self driving is
better than humans in _all_ cases it will not be acceptable.

Nobody will spend money to adapt roads to self driving cars until they
comprise a significant percent of cars (read: decades from now, if ever)

~~~
drewrv
> Because 99.9999% of drivers are human and have no issues seeing the concrete
> barriers and not hitting them head on at full speed.

In this particular case, a human driver did crash head-on into the barrier
somewhat recently.

~~~
cjhopman
According to [https://dot.ca.gov/programs/traffic-
operations/census/traffi...](https://dot.ca.gov/programs/traffic-
operations/census/traffic-volumes/2017/route-101), ~125,000 cars drive through
that area per day (scroll to "JCT. RTE. 85", divide by 2). According to the
report, the previous collision was 2 months earlier, so ~7.5Million cars
passed through there. 99.9999% doesn't seem that far off.
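That back-of-the-envelope figure checks out. A quick sketch using only the
numbers in the comment above (taking "2 months" as 60 days):

```python
# Figures from the comment above: ~125,000 vehicles/day through the
# interchange, and one human-driven crash into the barrier ~2 months
# before the Tesla crash.
cars_per_day = 125_000
days = 60

total_cars = cars_per_day * days        # 7,500,000 vehicles in 2 months
no_crash_pct = 100 * (1 - 1 / total_cars)

print(f"{total_cars:,} vehicles passed; {no_crash_pct:.5f}% did not hit it")
```

One crash in ~7.5 million passes leaves the "didn't hit it" share above
99.9999%, consistent with the estimate above.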

------
free652
The problem is that the autopilot almost works, so drivers don't need to pay
attention most of the time.

~~~
dmitrygr
This is actually a well-studied phenomenon (as it happens, studied by the
NTSB). Automation that is 99% reliable kills! Anything less is unreliable
enough that you notice and watch it carefully; anything more is reliable
enough to actually rely on. It is like the uncanny valley of automation.

~~~
rcMgD2BwE72F
>Automation that is 99% reliable kills!

It also saves more lives than it kills people. So?

~~~
falcolas
> It also saves more lives than it kills people. So?

Do you have data that the rest of us don't - that is, data which identifies
incursions (near-crash interactions which do not end in a crash) where
vehicular automation acts correctly and human drivers do not?

If not, then you really can't make that statement.

You can say that "Tesla's luxury cars running under autopilot show fewer
crashes per mile than average for all cars", and be accurate, but that's not
"saving lives".

------
Glyptodon
I think it's bizarre that they call out Apple. Like what, they add "thou shalt
not game and drive" to a document somewhere or put it in a memo, and magically
things like this wouldn't happen?

------
Rebelgecko
I dunno if LA Times changed the headline after this was submitted to HN, but
it's actually "NTSB slams Tesla, Apple and regulators over a fatal Autopilot
crash"

The "slamming" of Apple seems particularly lame. Partially it's because they
make iPhones, but the head of the NTSB also seems to be implying that an
employer has a responsibility to prevent its employees from engaging in risky
behaviors when they're off the clock.

~~~
FireBeyond
> an employer has a responsibility to prevent its employees from engaging in
> risky behaviors when they're off the clock

_IF_ (and I absolutely don't know in this case, but we've all certainly heard
of it before) an employer expects that you will be responsive just because you
have a phone with email / Slack access, etc. - let alone if they factor that
into performance reviews, whether informal or formal - then perhaps they do
(though certainly a minority, and, again, I have no idea about this particular
instance and employer).

------
noahtallen
I thought autopilot did have a mechanism to get the driver’s attention when
they stop touching the wheel?

~~~
crmrc114
Just googled this and found this nifty little bit o' kit
[https://www.autopilotbuddy.com/](https://www.autopilotbuddy.com/)

Guess we are really going to need eye tracking in any proper system that does
this much 'autopilot' for drivers.

~~~
nathanaldensr
This is hilarious:

 _> Do Not Use the AutopilotBuddy on public streets. The Autopilot Buddy is
not for street use, it is for closed track use only. The AutopilotBuddy Nag
Reduction Device is no longer available to USA Residents._

Look, ma! I'm driving on a track with _no hands_! What a joke.

~~~
Dylan16807
You wouldn't? It sounds like a fun thing to try.

(Yes, obviously you wouldn't be going anywhere near full speed.)

------
davidgardner
"In a 2017 study for Rand Corp., researchers assessed 500 different what-if
scenarios for the development of [self-driving] technology. In most, the cost
of waiting for almost-perfect driverless cars, compared with accepting ones
that are only slightly safer than humans, was measured in tens of thousands of
lives. “People who are waiting for this to be nearly perfect should appreciate
that that’s not without costs,” says Kalra, a robotics expert who’s testified
before Congress on driverless-car policy."
[https://www.bloomberg.com/news/features/2019-10-09/tesla-
s-a...](https://www.bloomberg.com/news/features/2019-10-09/tesla-s-autopilot-
could-save-the-lives-of-millions-but-it-will-kill-some-people-first)

------
zaroth
The CDC reports that every day in the US about 9 people are killed, and more
than 1,000 injured due to _distracted_ driving. [1] Annually, over 1 million
people die globally in car crashes.

Driving is both the source of trillions of dollars of economic value annually,
as well as the cause of ~$1 trillion a year in economic damages (including
pain, suffering, and loss of life).

A focused and alert human is, overall, fairly awesome at driving, even in
surprisingly adverse conditions. Humans, however, can also be lazy, easily
distracted, and make irresponsible choices.

Cars to some extent try to protect humans against making these bad choices
(like seat belt warnings), but they also respect that humans are not to be
subjugated by a machine, and that means even being allowed to make bad
decisions.

ADAS systems like AutoPilot and Level 3-5 self-driving are the most promising
solution to the daily devastation that occurs on our roadways. This is
literally a multi-trillion dollar problem. For an ADAS system to be most
effective, it must be reliable and predictable, and it _also_ must be easy to
use. An ADAS system cannot increase driver safety if the driver turns it off,
or won't turn it on because it's too annoying.

As a Tesla owner I can definitively state that the safest way for me to drive
is with AutoPilot enabled, with my hands on the wheel, and looking out the
damn windows. This is how cars are supposed to be driven, and it's how Tesla
consistently tells its drivers to drive the car. While Tesla consistently and
repeatedly instructs drivers to keep their hands on the wheel and to remain
alert, it doesn't stop humans from taking their focus off the road, just like
humans do consistently with or without AutoPilot.

My point is basically this: it does not necessarily make driving _more safe_
to make AutoPilot impossible to abuse. The way most drivers that I see on the
road are operating their vehicle, it should be illegal to drive _without_
AutoPilot.

In short, I think it's important to understand that a technology can literally
save lives while still not saving _every_ life. It's a tragedy every time
someone dies on the roads, and it will be a miracle when we have technology
fully deployed that can prevent 99% of those deaths.

[1] -
[https://www.cdc.gov/motorvehiclesafety/distracted_driving/in...](https://www.cdc.gov/motorvehiclesafety/distracted_driving/index.html)

------
dmitrygr
Some of the info NTSB released, direct from them:
[https://www.ntsb.gov/news/press-
releases/Pages/NR20200211.as...](https://www.ntsb.gov/news/press-
releases/Pages/NR20200211.aspx)

------
veeralpatel979
I wonder why they chose the name Autopilot!

The name seems to imply the tech does something that it doesn't, though in all
fairness to Tesla, they are very clear about the fact that Autopilot does
require an attentive human at the wheel.

------
neonate
[https://archive.md/5nRO8](https://archive.md/5nRO8)

------
clouddrover
Bloomberg article on the same:
[https://www.bloomberg.com/news/articles/2020-02-24/tesla-
to-...](https://www.bloomberg.com/news/articles/2020-02-24/tesla-to-face-
fresh-autopilot-scrutiny-after-company-snubs-ntsb)

------
deegles
If I was a passenger in a Tesla, I would not let the driver turn on Autopilot.
If they refused, I would get out and get a rideshare.

~~~
tristanb
Have you tried it?

~~~
ceejayoz
I can say "I'd rather not take cyanide" without having to try it first.

~~~
tristanb
Sure but cyanide definitively kills you, and you're talking about a product of
which (it appears) you have had no first hand experience.

~~~
ceejayoz
I have no first hand experience with cyanide.

------
agumonkey
maybe this will make Elon Musk think twice

------
crmrc114
Good. For a car that uses AI to control windshield wipers (
[https://www.inverse.com/article/39877-elon-musk-tesla-ai-
wip...](https://www.inverse.com/article/39877-elon-musk-tesla-ai-wipers) ) you
cannot tell me that Tesla can't be bothered to do something like simple driver
attention checking.
[https://en.wikipedia.org/wiki/Driver_drowsiness_detection](https://en.wikipedia.org/wiki/Driver_drowsiness_detection)

Tesla does not want to fix this issue - because they have to know that a large
number of their buyers want the car specifically so they can abuse the
assistive technology.

When a rental Ford Fusion yells at me on Adaptive Cruise the moment it doesn't
sense my interaction with the wheel, you have got to be kidding me that Tesla
cannot take the lead on this technology.

~~~
xxpor
Autopilot yells at you too if you don't interact with the steering wheel.

~~~
crmrc114
Totally, but when they're using AI to control wipers, it's sad they can't be
bothered to respond to safety regulators when they could be leaders in this
tech by simply adding a camera system that monitors the driver, like some
other automakers are experimenting with implementing.

~~~
jsight
Tesla has a camera that they could use for this. They claim that its
performance was not great. I hope that some comparisons are made between
vehicles so equipped vs the ones that are not, in order to provide further
evidence.

Although that is difficult, as in many cases the "cooperative" manufacturers
have not yet implemented such features. They have only promised to implement
them, sometimes in vehicles years away from production.

~~~
filoleg
Using the camera for that purpose might be a bit problematic (in terms of a
technical challenge, not as that adjective would normally be used by twitter
mobs).

One very common scenario (that I have no idea how it would manage to handle)
is someone driving with sunglasses on.

~~~
totalZero
GM's Super Cruise watches drivers' eyes and there are some videos showing that
it works for people wearing sunglasses. (I'm sure it varies based on the lens
material and polarization.)

