
I Just Drove Eight Hours on Tesla Autopilot - nreece
https://www.bloomberg.com/news/articles/2016-08-10/i-just-drove-8-hours-on-tesla-autopilot-and-lived-to-tell-the-tale
======
steven777400
Put aside the silly title and the inconsistency between the subtitle and
content ("the only real problem was the driver behind the wheel" vs "The
sensors failed to register a rapidly accelerating gold-colored SUV beside me
and would have driven directly into its path had I not quickly taken over")
and this article has a really good point to make:

As we get closer but not entirely to autonomous vehicles, figuring out how to
communicate to users (at what point do we stop calling them drivers?) the
necessary level of engagement (and notifying them when that level changes) is
going to be increasingly critical.

~~~
protomyth
> the necessary level of engagement (and notifying them when that level
> changes) is going to be increasingly critical.

Is there an "uncanny valley" style term for this?

~~~
arjunnarayan
There should be, and the best candidate term is "Children of the Magenta", a
term describing pilots who are too dependent on the Magenta lines on their
instrumentation.

99% Invisible did a good podcast episode on this a year ago:
[http://99percentinvisible.org/episode/children-of-the-magenta-automation-paradox-pt-1/](http://99percentinvisible.org/episode/children-of-the-magenta-automation-paradox-pt-1/)

Their follow-up podcast episode talks explicitly about this problem (first
seriously discussed in the context of airplane automation) becoming important
in the context of self-driving cars, and how Google is very aware of this
(hence Google's push for 100% automated cars, since the middle valley is so
dangerous).

------
abcampbell
_"Consider this: If a Toyota driver had standard cruise control set for 70
miles per hour on the highway and failed to take over and reduce speed for a
25 mph turn, would we blame the cruise control for the resulting crash?
Relinquishing full control to Autopilot is no different."_

That's why Toyota calls it _cruise control_ and not _autopilot_.

~~~
jads
>An autopilot is a system used to control the trajectory of a vehicle without
constant 'hands-on' control by a human operator being required. Autopilots do
not replace a human operator, but assist them in controlling the vehicle,
allowing them to focus on broader aspects of operation, such as monitoring the
trajectory, weather and systems

Autopilot doesn't mean fully autonomous. It was never that in aviation either;
originally it basically just controlled altitude and speed. I'd say Tesla's
use of "autopilot" is pretty spot on.

~~~
detaro
The difference is that "keep height, speed and direction" is a pretty safe way
to operate an airplane at altitude: there are no sudden obstacles, and there
is a lot of infrastructure to make sure traffic stays nicely separated. A
human reaction time of a few seconds is fine.

It's not for a ground vehicle in normal environments.

~~~
dingo_bat
If I'm not wrong, an autopilot can also follow a set route, and if the airport
supports it, it can land autonomously too. Tesla's use of the term is highly
misleading.

~~~
antmldr
No, but granted we're getting into semantics. Feature 1 you describe is
provided by the Flight Management System[0], and feature 2 is provided through
Autoland[1].

[0]: [https://en.wikipedia.org/wiki/Flight_management_system](https://en.wikipedia.org/wiki/Flight_management_system)

[1]: [https://en.wikipedia.org/wiki/Autoland](https://en.wikipedia.org/wiki/Autoland)

------
jgamman
when you call it Autopilot and not SuperiorCruiseControl you've set
expectations. It's no use trying to re-define the term more accurately after
the fact. They'll survive but the halo got dented.

~~~
delinka
The Professionals® who actually use a real autopilot tell us that Tesla's is
aptly named -- it's limited in its functionality and is designed to _assist_
the operator of the machinery and to reduce fatigue.

The Problem® is one of public perception/branding. Hollywood has convinced the
overwhelming preponderance of the population that airplanes on autopilot can
taxi away from the terminal, navigate to the runway, avoid collisions with
buildings, vehicles and other planes, line up on the runway, take off,
navigate to the destination, line up with the correct runway from the correct
direction, land the plane, and taxi to the final terminal.

Naming your not-fully-autonomous driving software "autopilot" is going to
create a perception problem. Educating the public after the fact is an uphill
battle.

~~~
hueving
Read the post-mortem of the Air France flight from Brazil to Paris. The pilots
did not even know the basic error recovery conditions of flying the plane they
were in (recognizing dual input, recognizing a stall) because they so rarely
flew the plane. Auto pilot can handle navigation to destination, lining up
with the runway after being given the approach vectors by the pilot that
he/she received from ATC.

If auto pilot in Tesla were used like auto pilots in modern commercial air
travel, it would cover 95% of the operation time of the cars.

~~~
mikeash
One pilot, singular, did not know how to recover. The other two did. The one
who was at the controls attempted to perform a correct recovery. He was
thwarted by the unbelievably idiotic design of the Airbus control system, in
which conflicting inputs are simply averaged, and the only indication that
something strange is happening is a warning light.

Edit: regarding the pilot who did the wrong thing, I don't think it's fair to
say that he didn't know how to recover, either. He almost certainly knew how
to recognize and recover from a stall. He was a glider pilot, where stalls are
a matter of course. It seems that he panicked, which prevented him from
thinking clearly and taking the proper actions. People do that sometimes, and
a big reason you have two pilots is so the other one can step in when one of
them gets stupid. The fatal flaw with the Airbus was the system which failed
to adequately inform the other pilot of the first pilot's stupid actions. (In
any sensibly designed aircraft, the two sets of controls will move together,
providing physical feedback of each pilot's inputs to the other pilot.)

~~~
hueving
>and the only indication that something strange is happening is a warning
light.

In addition to the warning light there is an audio warning "DUAL INPUT" and
tactile feedback (stick vibration). The fact that the pilots did not recognize
this condition feeds into my point about them not flying the plane nearly
often enough to actually understand how it behaves in anything other than an
ideal condition.

------
flashman
It looks bad that the Tesla didn't detect the SUV. But on the flip side, if
the SUV had also had Autopilot, maybe it would have been a less dangerous
situation. And when more cars have autopilot systems, maybe they'll be better
able to communicate their proximity to one another.

------
nols
The problem isn't autopilot, it's overconfidence in autopilot. The driver who
was killed was allegedly watching a movie and speeding when he crashed.
Tesla's response is that autopilot is still in beta and people shouldn't rely
on it too heavily -- oh, and by the way, here's a story of a man whose Tesla
drove him to the hospital after he had abdominal pain. There are plenty of
YouTube videos of people screwing around while using autopilot, with a few
near misses when autopilot screwed up.

When building a several-ton vehicle you need to design for human flaws, and
Tesla hasn't. Every other manufacturer with adaptive steering and autopilot-
esque features has created attention checkers to ensure the driver doesn't
rely too heavily on autopilot. If one of them released a beta autopilot the
NHTSA would be on the phone in record time telling them to knock off the
foolishness.

Tesla makes very cool cars, and the technologies they and other autonomous car
companies are developing will absolutely save lives, but it's stupid to ignore
the human factor until you've developed a car that 100% doesn't need a human
driver.

------
tbrock
The author states that autopilot made him a better driver but I can't see how.

If the car is doing most of the work that requires a person's attention then
they will pay attention to something else.

This will prove fatal when the sensors fail or there is some unforeseen
condition.

Humans are, at this point, able to detect and possibly correct for more
errors. Sure, the car would be able to react more quickly, but maybe it won't
react at all.

------
DannyBee
Am I wrong in thinking that the problems complained about here are precisely
because it uses sensors and a camera and not lidar?

(I did a bit of research, and reading around says "probably", but i trust HN's
opinion a bit more :P)

~~~
Animats
The situational awareness of this thing is not good. The author points out
that it didn't notice a rapidly approaching car prior to a lane change. The
only side-looking sensors Tesla has are ultrasonics, which are too short
range.

The Tesla sensor suite is:

- a bumper-level 2D radar, good for not rear-ending cars in front. Not good
for obstacles at windshield height. Will detect a car. Might detect a bicycle.
Won't detect most pedestrians.

- a forward-facing camera using Mobileye software. This 1) recognizes lane
lines, 2) tries to range the car ahead by putting a rectangle around the
recognized rear end, and 3) reads speed limit signs. It may be able to
recognize pedestrians. (Unclear. See [1])

- some ultrasonic sensors which detect obstacles to the side and rear at
short range. Mostly useful for parking.

This just isn't enough. Tesla isn't profiling the road, and will drive off a
cliff if the lines lead there, as the parent article points out. We had that
figured out in the DARPA Grand Challenge a decade ago, where the vehicles
didn't need road lines.

Compare Google's system.

[1] [https://electrek.co/2016/06/26/tesla-autopilot-tests-pedestrians-auto-emergency-braking-video/](https://electrek.co/2016/06/26/tesla-autopilot-tests-pedestrians-auto-emergency-braking-video/)

~~~
flukus
Could a human have detected the vehicle? Our only sensor is line of sight
optical (with a few mirrors) and near misses when merging are very common.

~~~
pcl
From reading the article, it sounds like the human _did_ detect the vehicle,
and that's what prevented a collision.

~~~
flukus
Yeah, but the article wasn't clear on the exact circumstances. If the SUV was
accelerating from behind, it could have been occluded by another car. The
human may only have seen it because they were free from concentrating on
other things, etc.

------
cesarb
I like this video, which shows the Tesla autopilot in action with the driver-
side display (which is usually hidden by the wheel) visible:
[https://www.youtube.com/watch?v=ZetTg73Ebyo](https://www.youtube.com/watch?v=ZetTg73Ebyo)

Looking at it, it seems that the autopilot mostly adds another layer of
indirection. Instead of driving directly, the driver gives the autopilot
orders like "maximum speed 60" or "move one lane left/right". The driver still
has to pay attention to what's happening around to give it the correct orders.
In the example given in the article, the driver still has to check whether the
lane is free before giving the car an order to move into that lane.

------
tbihl
To me this is analogous to streets that are dangerous because pedestrians and
cyclists are seldom seen. Drivers come to expect very simple patterns, and
roads are engineered with forgiving design, and so drivers know they can be
distracted with no issues.

Then something changes.

------
deegles
I believe there will be another death in a Tesla vehicle with Autopilot
enabled within a year. As drivers get more used to using it, they will get
more complacent. More features and improvements will roll out, allowing
almost-but-not-quite complete inattention, which will increase drivers' trust
in the Autopilot, ironically putting them in more danger from the 1-in-100,000
situations that humans handle easily.

Full autonomy is the only safe mode long term.

~~~
ux-app
I agree. It's a dangerous game Tesla is playing.

------
ams6110
Interesting that this story made the front page and the story today of the
Tesla autopilot crash in China that was posted several times did not.

------
p1esk
The scariest bit from the article: the driver who died was very familiar with
the autopilot limitations.

~~~
chc
Most people who die due to inattentive driving are aware of their car's
limitations.

~~~
p1esk
Now imagine a whole bunch of drivers who are not very familiar with autopilot
limitations, especially if the autopilot works great 99.99% of the time.

~~~
clarky07
If it works 99.99% of the time, that will likely be better than the average
human driver.

~~~
p1esk
You're missing the point. The current Tesla "autopilot" is not ready, and is
not intended, for full time driving. Not yet. However, it might appear to work
well, and some Tesla drivers will forget that it's not ready. That's the
problem with half-finished "autopilots".

------
jfoster
I wonder to what extent the Tesla not seeing the SUV was due to it being gold
colored. There must also be colors of cars that humans are less likely to see,
too. I wonder if certain vehicle colors might be phased out to aid road safety
in the near future.

------
chollida1
I wrote the following a little while ago, just after the autopilot crash was
reported. Since then my feelings on autopilot haven't changed: it's the
future, but right now it actually makes things harder due to the
uncanny-valley situation it's currently in.

If you've never used the Tesla autopilot, it's a weird feeling.

It's not a fully self-driving car, and it's obviously not strictly
human-controlled. Unfortunately, it ends up being more mentally taxing to use
this hybrid approach than to just drive yourself.

Consider highway driving: in normal, human-powered mode you are in full
control, so if you see brake lights half a kilometer up ahead you can
disengage the cruise control and react on your own. Everyone who has driven is
comfortable with this.

With this assisted driving, the car doesn't slow down right away, and it's not
clear if the car just can't see the tail lights lighting up yet or if it's
decided it doesn't need to react yet. As such, you start to second-guess the
car:

- should I drive or should I leave it to the car?

- what if it turns when I grab the wheel? Will I make things worse by
driving?

- does the car even see the object up ahead? How can I tell? It's impossible
to expect the car to tell you about an object that it can't even see.

It becomes just more taxing to use the hybrid approach to driving, and as such
I don't use it at all. I have no doubt that the Tesla autopilot is safer than
driving, but I also have no doubt that it can royally screw up.

I've come to the conclusion that some assisted driving, like auto-braking for
obstacles you are imminently about to hit, is good, but the kind of assisted
driving where the car can almost automatically drive for you is not anywhere
near ready. If you follow Tesla's rules on how to use it, it actually makes
driving harder :(

~~~
mikeash
I just did about 400 miles on autopilot today (and thousands previously) and I
find it to be the total opposite. It's infinitely more relaxing than fully
manual driving.

You still have to pay attention and decide, but the minutiae of making tiny
steering movements is gone, leaving more capacity for higher level observation
and planning.

The second guessing you describe simply doesn't happen for me. I'm capable of
taking over and driving competently in any situation autopilot can handle. I
never try to guess whether the car can see something; if it fails to respond
in a way I like, I just take over. If I want to brake for a car and autopilot
isn't braking, I brake. There's no need and no reason to wait for the car to
maybe do it on its own or maybe not.

Disengaging autopilot, performing some maneuver, and reengaging it takes no
significant time or effort beyond the maneuver itself. If you're sitting there
wanting the car to react to something and it hasn't yet and you're waiting to
see if it will, you're doing it all wrong.

~~~
derefr
> If you're sitting there wanting the car to react to something and it hasn't
> yet and you're waiting to see if it will, you're doing it all wrong.

Okay, but what if you're the sort of prematurely-conscientious person who
holds a door for someone who's still 30 seconds away from the door? You'll
_always_ be doing things before the autopilot gets to them, because you're
doing them before you're "supposed" to do them (which in normal human driving
is almost never a problem, so there's nothing discouraging people from this.)
Thus, the autopilot will never get to do anything, so there's no point to
turning it on.

~~~
jfoster
Suppose you're right about that type of person and autopilot never getting to
do anything. I think it's still worth having it on. People make errors. The
one time that person misjudges something and is about to hit something on the
road, there's a good chance autopilot will see it and brake. It only needs to
happen once to be worth it.

