
Four hundred miles with Tesla’s autopilot forced me to trust the machine - jonbaer
http://arstechnica.com/cars/2016/05/four-hundred-miles-with-teslas-autopilot-forced-me-to-trust-the-machine/
======
Animats
From the article: _"Auto-steer had to wait for the freeway, where I toggled
it on. Unfortunately, I toggled it right back off. The portion of I-45 near my
home is undergoing a multi-year $6 billion rebuild, with a long stretch
spanning several exits being destroyed, reconfigured, widened, and otherwise
"improved." The result is a multi-mile segment of road that snakes back and
forth across what will eventually be both sides of the finished project. For
now, there's no consistent surfacing, no real lane markers, and nothing for
the lane-keeping system to see."_

 _"Lane keeping was almost excellent. The car kept itself squarely planted in
the middle of the lane without ping-ponging between the edges like some
earlier systems did. The car would still sometimes "lunge" at exit ramps when
lane markings would fall away to the right or left, though the behavior was
never enough to make me grab the wheel and disengage."_

So Tesla just has lane keeping and smart cruise control, like everybody else.
NHTSA Level 2. But they're way ahead in PR.

Google's cars can handle construction, changed lanes, offramps, and city
streets, and they follow navigation directions. That's NHTSA Level 3.

Handoff from automation to human in emergency situations is still a big
problem. Here's such a case [1]: a car with autopilot rear-ended a stopped car
on the freeway. Why? The driver tapped the brake while approaching the stopped
car.
This disengaged the autopilot, including the emergency braking function, under
the assumption the driver was now in charge. Tesla claims the driver should
have understood that, as of firmware version 6.2, the auto-braking function
was disabled by any manual braking.

[1] http://arstechnica.com/cars/2016/05/another-driver-says-teslas-autopilot-failed-to-brake-tesla-says-otherwise/

~~~
daveguy
Having emergency braking turn off in the event of a manual brake tap is
definitely a failure on Tesla's part. Emergency braking should not only stay
on, it should brake harder (up to the expected traction limit) and then, of
course, engage ABS if the traction limit is exceeded. Emergency braking should
be on unless explicitly disabled (just like ABS and traction control are on by
default in cars that have them).
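
A minimal sketch of that policy, with hypothetical names and thresholds (this
is not Tesla's actual firmware logic): the driver's brake input and the
emergency-braking request are combined, so a light tap can never cancel a hard
automatic stop.

```python
# Hypothetical braking-arbitration sketch: a manual brake tap must not
# disable automatic emergency braking (AEB). Names and thresholds are
# invented for illustration, not taken from any real vehicle firmware.

TRACTION_LIMIT_G = 0.9  # assumed peak deceleration on dry asphalt

def brake_command(aeb_request_g: float, driver_request_g: float,
                  wheels_slipping: bool) -> dict:
    """Combine AEB and driver braking; never let one cancel the other."""
    # Take the stronger of the two requests, capped at the traction limit.
    decel = min(max(aeb_request_g, driver_request_g), TRACTION_LIMIT_G)
    # Past the traction limit, ABS modulates brake pressure.
    return {"decel_g": decel, "abs_active": wheels_slipping}

# A brake tap (0.2 g) during an AEB event (0.8 g) still brakes hard:
cmd = brake_command(aeb_request_g=0.8, driver_request_g=0.2,
                    wheels_slipping=False)
```

Under this kind of arbitration, the firmware-6.2 behavior described in the
parent comment (any manual braking disables auto-braking) is impossible by
construction.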

~~~
beamatronic
>> Emergency braking should be on unless explicitly disabled

I'm trying to think of a scenario, outside of a mechanic's garage, in which
emergency braking would _ever_ be disabled. This should really be mandatory on
all cars.

~~~
ctdonath
Which promptly trains all drivers into a "meh" attitude toward braking,
leaving it to the computer - and then they're surprised into paralysis (and a
subsequent crash) when it doesn't work.

~~~
kalleboo
Another user linked a Fifth Gear video testing the auto-braking systems in
Volvo/Mercedes/VW cars. In it, they say the auto-braking systems only engage
with hard braking at the last second, specifically so you don't get blasé
about it the way you would with smooth, controlled braking. It's not a mode
you want to rely on, because it really feels like you're going to crash.
[https://www.youtube.com/watch?v=PzHM6PVTjXo](https://www.youtube.com/watch?v=PzHM6PVTjXo)

~~~
LoSboccacc
That's interesting - I can see the reasoning behind it, but what happens on a
slippery road?
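
A back-of-the-envelope answer: idealized stopping distance is v²/(2μg), so
halving the friction coefficient roughly doubles the road a last-second hard
brake needs. A quick sketch (textbook physics, not any car's actual control
logic):

```python
# Idealized minimum braking distance d = v^2 / (2 * mu * g).
# Lower friction (mu) means the same last-second hard brake needs
# far more road - textbook physics, not vehicle firmware.

G = 9.81  # gravitational acceleration, m/s^2

def braking_distance_m(speed_mps: float, mu: float) -> float:
    """Minimum stopping distance at friction coefficient mu."""
    return speed_mps ** 2 / (2 * mu * G)

v = 27.8                               # ~100 km/h in m/s
dry = braking_distance_m(v, mu=0.9)    # dry asphalt: ~44 m
wet = braking_distance_m(v, mu=0.4)    # wet/slippery: ~98 m
```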

------
mhandley
I was chatting with a friend who runs a haulage company recently. Some of
their new trucks come with auto-braking features. He told me they had to have
this disabled. The truck would be driving along in the slow lane on a motorway
minding its own business, and suddenly a car would dive across the front of
the truck from the fast lane, aiming for an exit. The truck would go to
maximum auto-brake, trying to avoid a collision that wasn't going to happen,
even though the car was crossing only a few feet in front. Not only was this
dangerous for the manually driven truck behind, it caused too much damaged
cargo and too many corresponding insurance claims. Apparently car drivers do
this all the time. Anyway, I thought it was interesting that the correct
action here was to not auto-brake in some situations that looked clearly
dangerous.

~~~
aedron
The car should react to something closing in rapidly in front of it, which
was not the case in that situation. Seems like just a poor implementation.
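
The closing-rate idea can be sketched with a simple time-to-collision check (a
hypothetical illustration; real systems fuse radar/camera tracks, but the
principle is the same): an object a few feet ahead whose gap is steady or
growing should never trigger the brakes.

```python
# Hypothetical sketch: trigger emergency braking on time-to-collision
# (closing rate), not on mere presence in the lane. A car cutting
# across a few feet ahead but pulling away never closes the gap.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact at the current closing speed; inf if opening."""
    if closing_speed_mps <= 0:      # gap is steady or growing
        return float("inf")
    return gap_m / closing_speed_mps

def should_brake(gap_m: float, closing_speed_mps: float,
                 ttc_threshold_s: float = 1.5) -> bool:
    return time_to_collision(gap_m, closing_speed_mps) < ttc_threshold_s

# Car diving across the truck's nose, 3 m ahead but opening the gap:
print(should_brake(gap_m=3.0, closing_speed_mps=-2.0))   # False
# Stopped car ahead, closing at 20 m/s from 25 m out (TTC = 1.25 s):
print(should_brake(gap_m=25.0, closing_speed_mps=20.0))  # True
```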

~~~
amooiman
This is how my Subaru behaves, and it feels right. The only false positive
stopping I've encountered is someone coming from the opposite direction
turning left across my path. And even then, it only happens when they're
close enough that if they weren't moving, it would be a problem.

------
j2bax
How far out are we from truly autonomous personal vehicle travel? I want to
set my route and play a game or work or sleep. I want the vehicle to
automatically stop and charge when it needs to. If I'm feeling like it, I can
get out and use the restroom.

I would be so much more likely to make frequent cross country trips to visit
family and so much less likely to ever book a flight unless I'm needing to
cross water. I wonder how this would hit the airlines in the US.

~~~
lettergram
I'm guessing it would impact airlines minimally. I'm in California and my
family is in Illinois. Even on the best route, speeding the whole way, it
takes me twenty hours. Hell, just to go to San Diego from San Francisco is 7
hours minimum (it took me 12 last Friday).

People fly to save time, and if you're going to a hub city (e.g. Chicago, Los
Angeles, New York, Atlanta), it will likely be cheaper to fly as well. I just
booked a flight from San Francisco to Chicago for $148 (driving would easily
be $250).

~~~
j2bax
If you are going from one non-hub city to another, the overall air travel
time goes up substantially (as does the cost). It will still be
a good deal faster than driving cross country, but if driving is relatively
stress free and I can do it through the night, I'm going to be far more likely
to skip the stressful lines, running between gates, delays and high prices of
air travel. I don't think I'll be alone.

~~~
Heliosmaster
So, like taking a coach bus? In Europe it is very common to do so.

~~~
maxerickson
In areas of the US with high population density, bus service is pretty okay
(more or less the Northeast Corridor).

The bus service from Houston to Miami is a little less okay, but that's
because it's 1,900 km and won't have a lot of users (Google suggests public
transport would take 34 hours: a train and a couple of buses).

------
kendallpark
This is a major reason I'd love to own a Tesla (or a car with similar features).
Right now I have to avoid driving long distances at night because I have a
hard time staying awake. While I would have no intention of taking a full-on
nap while on autopilot, it would make sitting behind a wheel a lot safer
during long hauls between cities.

~~~
ars
No, that would make it much LESS safe.

Instead of forcing yourself to be fully awake, as you do now, you will allow
yourself to semi-snooze until you hit a situation the computer can't handle -
and then you are in real trouble.

~~~
developer2
This is exactly why I wish these cars would be banned from being on the road
at all in their current state. Some people are assuming it means you don't
have to pay as much attention. In the hands of your average consumer, these
cars are not safe. As an _above average_ user of technology, I would not trust
myself to remain in constant control of such a vehicle. I certainly don't
trust the average or below average consumer to handle it!

Most who claim they would remain attentive are not aware that they are making
false claims, and this likely includes a lot of HN visitors. The better the
cars get at remaining on autopilot, the more complacent drivers will become. A
really good autopilot is training its users into trusting it to handle
everything. If your average commute results in autopilot practically never
kicking in, your brain will naturally - regardless of your intent - find
something else to occupy your attention. Reading a book in your lap, checking
the weather, grooming yourself in the mirror, or even just spacing out and
daydreaming.

Frankly I'm of the opinion that self-driving cars are something that will
happen in 50+ years. When cities are built with "roads" that are actually
tracks; tracks that are designed to handle _only_ these cars and nothing else.
I imagine we will not own such cars; they'll be more like subway/metro public
transit systems, but with individual pods that hold a small number of people
and with hundreds/thousands of small stations to embark/disembark from.

These vehicles should not be allowed until there _is no manual takeover_. If
the vehicle cannot operate at all times on autopilot, it should not be on the
road.

~~~
arcticfox
Is there any evidence that these cars, abused as they are, are less safe when
under autopilot than the average driver on the road?

This is the only reference I've seen on that:
http://electrek.co/2016/04/24/tesla-autopilot-probability-accident/

~~~
developer2
>> when under autopilot

I care _a lot_ about what happens when autopilot disengages.

I truly and honestly believe it is a biological, psychological fact, that
autopilot will result in less attentive occupants in the vehicle. We're going
to blame "drivers" - who are really now just passengers, expected to
sometimes take the wheel - for accidents caused by autopilots disengaging,
because the driver "should have been in control". Except that most people are
going to be mentally unable to maintain the kind of focus needed to be able to
take over a failed autopilot.

Let me put it this way: I believe that today, if my mother were to cause an
accident, she would have done something _tangible_ to be deserving of being
"at fault". In a future where my mother can be considered at fault for killing
someone because she failed to take over an autopilot in under a second... that
is not a future I want to experience.

I'm afraid of the moment we begin blaming people for loss of life caused by
technology expecting us to interfere when it is innately difficult to do so.

------
extr
Everyone is giving examples of situations where they think an autonomous car
will have trouble but many of them boil down to the same thing. Cars will not
truly be able to drive as well as humans until they can model humans as other
rational agents and interpret subtle environmental cues (like nonstandard
signage). Maybe this still isn't strong AI, but it's pretty damn close. So in
that regard "fully autonomous" cars are still a ways off, more like decades
than years. But I've always thought the industry will still push forward
quickly; the core tech is finally getting good enough that there is a strong
incentive to just go in and brute-force many of the little logic "defects"
inherent in the current state of AI. If it can work really, really well, but
only for 70% of weather/driving conditions, that still has huge value.

~~~
colejohnson66
Four-way stop signs were really confusing for me when I first learned how to
drive. It seems there are so many nuances about when you should or shouldn't
go. Is there even an algorithm available to handle them?

~~~
extr
According to a NYT article [1], they can already do that. I'm sure by this
point they can model other vehicles' behavior in common situations like this.
But ktRolster brings up a good point: there are many situations where a glance
is all that is used or needed to signal intent. In the article they mention
that at four-way stops they do this by just inching forward and testing the
waters, but obviously you don't want to inch forward into a pedestrian or
bicyclist. Makes me wonder if we might want to standardize the method of
communicating intentions to pedestrians across all self-driving cars.

[1] http://www.nytimes.com/2015/09/02/technology/personaltech/google-says-its-not-the-driverless-cars-fault-its-other-drivers.html
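
For what it's worth, the textbook right-of-way rule is easy to state, even
though real planners (per the article) layer "inch forward and test the
waters" on top of it. A hypothetical sketch of the rule, not Google's actual
implementation:

```python
# Textbook four-way-stop rule: whoever stopped first goes first; on a
# tie, yield to the vehicle on your right. Hypothetical illustration,
# not any real planner. Directions are the approach compass points.

RIGHT_OF = {"N": "W", "E": "N", "S": "E", "W": "S"}  # approach -> right

def my_turn(me: str, my_arrival: float, others: dict) -> bool:
    """others maps approach direction ('N','E','S','W') -> arrival time."""
    for direction, arrival in others.items():
        if arrival < my_arrival:        # they stopped first
            return False
        if arrival == my_arrival and RIGHT_OF[me] == direction:
            return False                # tie: yield to the right
    return True

# I arrived from the south; the car on my right (east) arrived at the
# same time, so I yield - while it is free to go:
print(my_turn("S", 1.0, {"E": 1.0}))   # False
print(my_turn("E", 1.0, {"S": 1.0}))   # True
```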

------
repler
With the exception of changing lanes automatically, my Jeep Cherokee does all
this stuff.

It does, however, automatically adjust your speed to match the lane you're
changing to.

At the end of the day, it's all about sensors and software. I'm sure Tesla has
better engineers than Fiat Chrysler but I'm also pretty sure it's off the
shelf for them and they just bolt it into the car at this point.

Good stuff, though. Can't wait to see full point A to point B navigation.

------
fma
"The car had already seen a 60mph (100km/h) speed limit sign, so it set that
as its target speed. "

So... what about those signs where kids spray-paint a 35 into an 85? Will it
zoom to 85?

~~~
educar
If kids spray paint 35 to 85, can a human detect it? If a human can, I am sure
a machine can as well (at some point).

~~~
startling
The question is whether it does, not whether it's possible.

~~~
tobltobs
The human or the machine?

------
dragontamer
Good article. It captures the wonder and awe of this without going full-tilt
fanboy, and it keeps competing technologies in view to help the audience stay
aware of where the other players are.

------
chestervonwinch
Just curious - at one point, he is going 85 mph in a 75 zone, but the car is
on autopilot. He mentioned he had the cruise set at 80. Who is liable for
accidents in this situation?

~~~
DanFeldman
For now, and at this level of automation (Level 2), it is the driver's fault.
When cars reach higher levels of automation (which generally means the car
deciding more things, like lane changes and target cruise speed), it becomes
the car's fault and, by extension, the manufacturer's.

~~~
ctrlalt_g
Is that written, or just your opinion (professional or otherwise)?

~~~
kalleboo
For one, Volvo has decided to take responsibility for accidents that happen
during Autopilot driving:
http://www.volvocars.com/intl/About/Our-Innovation-Brands/IntelliSafe/IntelliSafe-Autopilot/News/Volvo-Cars-responsible-for-the-actions-of-its-self-driving-cars

------
huuu
After watching this and a Volvo video, I wonder what happens when you touch
the steering wheel while in autopilot mode.

It's nice that you can check your mail or read a newspaper in autopilot mode,
as advertised, but doesn't this create the danger of touching the steering
wheel and turning autopilot off while still holding a newspaper in your hand?

Also, I can imagine people falling asleep while the car is driving.

~~~
kqr
What? Are they actually suggesting you can read the newspaper while driving,
as long as you have auto-throttle, auto emergency braking and lane keeping on?

~~~
huuu
[https://www.youtube.com/watch?v=xYqtu39d3CU](https://www.youtube.com/watch?v=xYqtu39d3CU)

In this Volvo video the driver opens a notebook, makes notes, and also
watches a movie.

Both Tesla and Volvo have an autopilot mode that still needs an alert driver,
but they advertise it like autonomous driving.

Edit: Volvo will indeed launch Level 4 autonomous driving, so they are a step
ahead of Tesla. But it's confusing because they also call it autopilot:
http://www.volvocars.com/au/about/innovations/intellisafe/autopilot

------
thinkMOAR
Still waiting for the Knight Rider feature where you can talk to your watch
to have the car rescue you, I mean, pick you up :)

------
magoon
Auto-braking is mandatory in all cars by 2020.

