
Tesla Model S adaptive cruise control crashes into van - aresant
https://www.youtube.com/watch?v=qQkx-4pFjus
======
simonsarris
This is a known issue. Bottom right of page 64:

[https://www.teslamotors.com/sites/default/files/Model-S-Owners-Manual.pdf#page=64](https://www.teslamotors.com/sites/default/files/Model-S-Owners-Manual.pdf#page=64)

> Warning: Traffic-aware cruise control may not brake/decelerate for
> stationary vehicles, especially in situations when you are driving over 50
> mph (80 km/h) and a vehicle you are following moves out of your driving path
> and a stationary vehicle or object is in front of you instead. Always pay
> attention to the road ahead and stay prepared to take immediate corrective
> action. Depending on Traffic-Aware Cruise Control to avoid a collision can
> result in serious injury or death.
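For intuition, a commonly cited reason for this behavior (a general property of radar-based ACC, not anything confirmed about Tesla's implementation) is that the radar filters out returns whose absolute speed is near zero, because a stopped vehicle is hard to distinguish from roadside clutter like signs and bridges. A hypothetical sketch of that filter; the function name and threshold are invented for illustration:

```python
def should_track(ego_speed_mps: float, relative_speed_mps: float,
                 min_target_speed_mps: float = 2.0) -> bool:
    """Decide whether a radar return looks like a moving vehicle worth tracking.

    relative_speed_mps is the Doppler-measured closing speed (negative when
    closing). A target whose absolute speed (ego + relative) is near zero is
    treated as stationary clutter and ignored. Threshold is made up.
    """
    absolute_speed = ego_speed_mps + relative_speed_mps
    return abs(absolute_speed) >= min_target_speed_mps

# A van stopped dead ahead while the ego car does 25 m/s (~56 mph):
# relative speed is -25, absolute speed is 0, so the return is filtered
# out -- exactly the scenario the manual warns about.
```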

~~~
codemac
Good to hear that Tesla knows it's an issue.

Bad to hear that they released it knowing this was an issue, because I
wouldn't expect myself to be able to react to this.

Humans are not capable of handling these types of nuances well at speed. Being
an active driver and being an inactive driver that needs to stay alert enough
to catch the few cases where you run into "known issues" is just not going to
work well.

As I'm typing it I'm suddenly having a wave of guilt for all kinds of software
bugs I've written where the answer was "it's a known issue".

~~~
jsight
The Tesla is not a self-driving car at this point. The fact that people rely on
this as the primary means of braking is completely crazy.

~~~
falcolas
When you only have to brake in 1% of the use cases, and the "Autopilot"
handles the other 99%, is it really that crazy that a driver does not
instinctively recognize that 1%?

Or is it crazy to expect a user to be able to react instantly to the 1% use
case?

~~~
ricardobeat
This is not very different from any normal car with cruise control.

~~~
falcolas
A normal car with normal cruise control does not keep itself in a lane. Nor
does it adjust for the speed of other cars. All it does is make your speed a
constant; you still have to steer and initiate any and all braking or
overtaking maneuvers.

~~~
ricardobeat
Some do adjust speed. The point being, neither is a fully automated auto-pilot
- the human behind the wheel is always responsible for stopping the vehicle or
steering away from obstacles in an emergency.

------
honkhonkpants
There are a bunch of these videos on YouTube. The best part about them is
watching the Tesla Owners' Club (Internet Comment Subdivision) crawl out of
the woodwork to explain away why the software worked perfectly and if only the
meatbag behind the wheel had read section 74.3(f) of the operating
instructions they would OF COURSE have known about this.

~~~
aggie
While I would hope automakers don't push the technology before it's ready, and
relying on disclaimers does seem more CYA than an actual safety measure, it's
easy to be defensive when an issue like this has a high probability of being
treated irrationally. A meatbag in a conventional vehicle getting into an
accident doesn't make the news. An autonomous vehicle completing a trip
without incident doesn't make the news. The relative accident rates of
autonomous vehicles and meatbag-piloted vehicles don't make the news. These
are sensitive times for public acceptance of the technology.

------
bargl
This is another case of regulation being outpaced by technology; I feel the
current law doesn't properly cover it. My opinion is that if a car
manufacturer provides a service that autonomously controls the braking,
acceleration, or steering of your car, they should be responsible for any
accidents that arise because of it.

It is not reasonable to expect humans sitting behind the wheel of a car that
covers 90% of situations to pay attention for the other 10%. This sort of
situation puts the human at ease and gives them false confidence in the
vehicle.

Even if the fine print SAYS they are in control, Tesla is adding a layer
between the human and the car that delays their reaction a little more.
Because humans are not trained to drive in a buggy ass car, they aren't
trained to respond to this sort of thing.

Liability for these sorts of accidents should fall on Tesla, and they should
not be using their drivers as guinea pigs. If a driver wants access to this
autonomous mode they should have to sign a release, because by default Tesla
should be responsible. Again, the law isn't current with this.

I also want to say I love what Tesla is doing, but I think the bad PR from
irresponsibly integrating autonomous driving when it isn't ready (or when they
don't have the proper sensor bank to make it work) is going to delay public
acceptance of self-driving cars in general.

Tesla's self-driving car is to Google's self-driving car as the Blue Origin
landing was to SpaceX's landing. But in general people don't know or see the
difference; to them they're just cars that drive themselves or rockets that
land. Also, Google's car is typically operated with a trained driver behind
the wheel.

I also want to say I'm a huge proponent of self-driving cars.

~~~
FussyZeus
The technology in question is cruise control, not autopilot or auto-steer or
anything of the sort. Mercedes has offered radar-based cruise control since
the late '90s, and the principle is the same: the vehicle keeps pace with
traffic, and if you watch the video, that's exactly what it did. The driver
maintains full control of the vehicle, and it's clear from the sound of the
conversation that the driver wasn't paying attention to the road; ergo, 100%
the driver's fault.

~~~
bargl
Does it brake and accelerate? If yes, then the car is responsible for braking
and accelerating and any damage done while it is braking and accelerating.

Does it turn? No, then no responsibility for turning.

If I can rely on the car to brake and go, then I should be able to RELY on the
car to do so. This seems like a feature that makes people complacent and will
end up hurting self-driving car adoption in the long run.

Look at how many people harp on the few accidents Google has been in, and they
have a well-trained driver in it. Tesla is just putting a half measure in
place so they can leverage their users as testers while pawning off
responsibility, when those users turn out to be TERRIBLE testers.

~~~
FussyZeus
> Does it brake and accelerate? If yes, then the car is responsible for
> braking and accelerating and any damage done while it is braking and
> accelerating.

Except that this car's manual, along with every other car I've ever owned and
seen that had active cruise control, specifically states that the car will not
stop for stationary objects. People need to RTFM. It comes with the vehicle
for a reason.

------
cjensen
I'm not an owner, but even I know that the Adaptive Cruise control is for use
on clear highways and can't cope with much.

This was a situation where the driver plainly needed to take over, had plenty
of time to take over, and the issue was sufficiently obvious that even a non-
alert driver who was looking forward would see they needed to take over.

There's an argument to be made that relying on partial automation is a
fundamentally bad idea because the driver will not become sufficiently alert
in time to save themselves. This is not an example of such a situation.

~~~
soccerdave
As the driver said in his comments, when he'd seen the system stop 1,000 times
in the past he had a false sense of security that it was actually going to
stop for him like it was supposed to.

~~~
cjensen
If the driver had used it off-highway 1,000 times in the past, but then the
Tesla made an error off-highway, is it Tesla's fault or the driver's fault?

This isn't even a Tesla issue: most car makers now ship cars with adaptive
cruise control with stop-and-go. If they stop 95% of the time in this
situation, do we really have to ban them because some drivers insist that past
experiences constitute a valid test?

------
Aelinsaar
It's still a great track record, but I can't shake my misgivings about
automotive autopilot beta testing being an open-road thing. This seems like
begging for trouble, with the biggest issue being the occasional ambiguity
about a momentary brake-tap turning a system off.

~~~
holyoly
Adaptive cruise control is not autopilot. It does not steer for you. It's not
in beta; it's been a production feature on high-end cars for 10 or so years.

Regular cruise control has always deactivated with a momentary brake tap. Been
this way since cruise control started coming on cars.

~~~
falcolas
Based on my experience, when adaptive cruise control is braking, manually
braking has unexpected consequences.

A tap can disable the current braking completely, resulting in an unexpected
change in velocity. Frequently a bad idea, since the ACC is braking for a
reason.

Applying normal braking pressure can add your braking pressure to the existing
braking pressure, resulting in an abnormally large braking force, sometimes
enough to initiate ABS on dry roads. Very disorientating, and it makes you
unpredictable to drivers behind you.

The solution is to tap, then press a second later, but this requires you to be
aware of the need to take manual control early in the situation.
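The two interactions described above can be sketched as a toy function. This is purely hypothetical (the function name, threshold, and g-force numbers are invented for illustration, not any manufacturer's actual logic):

```python
def driver_brake_result(acc_braking_force_g: float, driver_force_g: float,
                        tap_threshold_g: float = 0.1) -> float:
    """Toy model: net braking force after driver pedal input while ACC brakes.

    A light tap (below tap_threshold_g) cancels the ACC entirely, so braking
    collapses to the driver's near-zero input. A firm press stacks on top of
    the ACC's force, producing an abnormally large total.
    """
    if driver_force_g < tap_threshold_g:
        return driver_force_g                      # ACC disengages completely
    return acc_braking_force_g + driver_force_g    # the two forces add up

# Tap: braking collapses from 0.4 g of ACC braking to a 0.05 g tap --
# an unexpected surge forward. Firm press: 0.4 g + 0.5 g = 0.9 g total,
# plausibly enough to trip ABS on dry pavement.
```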

------
Corrado
The video has been marked as Private on YouTube and is no longer available to
view. Does anyone have a mirror?

~~~
wrmsr
ish
[https://media.giphy.com/media/3o6EhCnV9fgZQgw6xG/giphy.gif](https://media.giphy.com/media/3o6EhCnV9fgZQgw6xG/giphy.gif)

~~~
whamlastxmas
Wow, that's it? I was imagining a crumpled ball of steel from a 70 MPH impact
and potentially deaths.

------
heptathorp
I cannot wait to be killed by some idiot putting 100% faith in his Tesla to do
his job of driving for him.

~~~
toomuchtodo
Would you prefer an 18-year-old snapchatting at 107 MPH in a Mercedes with a
pregnant passenger, leaving her crash victim with permanent brain damage?

[https://www.washingtonpost.com/news/morning-mix/wp/2016/04/28/lawsuit-blames-snapchat-for-107-mph-crash-in-mercedes-caused-by-teen-girl-using-speed-filter/](https://www.washingtonpost.com/news/morning-mix/wp/2016/04/28/lawsuit-blames-snapchat-for-107-mph-crash-in-mercedes-caused-by-teen-girl-using-speed-filter/)

The point is that no matter how many accidents autopilot gets into, it's safer
than human drivers. And it will only improve.

Autopilot could kill ~30K people a year, and that would still be fewer than
the number of people killed by human drivers in the United States alone.

~~~
centizen
I'm not sure how this story is relevant; it's well known that human drivers
are more dangerous than self-driving cars. This particular case, while tragic,
is not unique.

But this isn't a self driving car - it's automated cruise control. It's not
going to let you snapchat and drive safely. You still need to be aware, and
rhetoric that presents it as a solution to distracted driving is dangerous,
leading to situations exactly like this one.

Sure it's still safer; that's not the point.

~~~
toomuchtodo
> Sure it's still safer; that's not the point.

That's entirely the point.

~~~
centizen
When it is safe enough you can operate the vehicle without also being a
responsible driver, then I'd agree with you. But it's not, as demonstrated by
this thread.

------
ntrepid8
Mine picks up the stationary vehicles 9 out of 10 times, but you do have to
watch it. This happens most frequently at stop-lights on roads where the speed
limit is above 55 MPH and I'm approaching the stationary car stopped at the
light.

------
SFJulie
Too much assistance make people less aware of danger especially when there are
slight perturbations to the handled case:

It happens in airplane industry (where people forgot some basics because they
use to much autopilots)

It happens in science when people use too much black magic statistic metadata
(and forget to check the details in the slight differences of inputs).

But be aware that more automation of dangerous industry also will have the
same effect ; mass transport, nuclear plant, chemical plants, medical
practice, energy grids.

Well, aren't coding with google&SO and/or frameworks kind of a growing
autopilot for most coders?

------
bryanlarsen
Read the YouTube comments -- the OP has responded to many questions there.

------
whamlastxmas
More accurate title: Tesla successfully emergency brakes to gently tap the
rear end of a van instead of ramming it in what would be a potentially fatal
accident

------
purpleidea
Who's got a mirror? Video is private.

------
ethbro
From the sounds, it seemed to detect it at the last moment. Can Model S owners
confirm?

Also, there are variable settings for braking, right? Has anyone tested what
happens if they're intentionally set too low to prevent accidents?

So many experiments I'd run with a Model S and an EM mocked up crash pad...

------
donald123
Apparently Tesla drivers are just beta testers.

~~~
chipperyman573
They made that pretty clear when they released a beta version of the software.

~~~
jjn2009
"production ready" software seems very difficult without all the data they are
gathering from the beta release.

------
sschueller
The car appears to accelerate as soon as the car in front drives around the
van. I would think it would wait a second to reevaluate the situation before
rapidly accelerating toward the van.

------
arprocter
I noticed a recent Merc TV ad had a disclaimer along the lines of '*vehicle
may not automatically brake in all circumstances'

------
revelation
Cruise control in the left lane? Not paying attention? Ticks all the boxes.

~~~
dsfyu404ed
Cruise control that will AUTOMATICALLY DETECT AND FOLLOW THE SPEED LIMIT(!!!)
in the LEFT(!!!) lane.

HR would have a field day with all this box checking

------
cloudjacker
got a mirror?

------
iamleppert
Wow, what kind of awful sensor technology and low sampling rate are they using
for this? You could probably build something that worked better with an
Arduino.

Vehicles equipped with these features are actually more dangerous, based upon
the preliminary data. I'm all for new technology, but if you want to use
clearly "beta" features -- and that's putting it lightly, as it's clear there
are definite hardware/sensor/sampling problems here, not software -- you
should be required to carry high-risk insurance.

It's not fair to the rest of us drivers who have to be on the road. We did not
agree to be Tesla's beta testers too, and it's only a matter of time before
someone is seriously injured or killed by one of these "features".

I'm sure someone here will chime in and say, "but it's safer than a human
driver, a reckless teenager." To that I say: that's why teenagers' insurance
rates are so high! We need to increase the insurance rates here to reflect the
risk this reckless, half-baked autonomy presents to both the driver and the
general public.

~~~
firethief
> You could probably build something that worked better with an Arduino.

What are you waiting for?

~~~
toomuchtodo
OP is waiting for their confidence bias to subside.

[https://en.wikipedia.org/wiki/Overconfidence_effect](https://en.wikipedia.org/wiki/Overconfidence_effect)

