
Tesla owner almost crashes on video trying to recreate fatal Autopilot accident - evo_9
https://electrek.co/2018/04/02/tesla-fatal-autopilot-crash-recreation/
======
eddyg
Previous discussion from a few hours ago:
[https://news.ycombinator.com/item?id=16732436](https://news.ycombinator.com/item?id=16732436)

Also worth noting that this video is from a highway in Illinois, not
California.

------
TallGuyShort
Note that I'm not saying Tesla Autopilot is ready to bet lives on, etc, etc
etc... but I think we also need to address the problem of bad lane lines for
humans too. The right side of the divider was virtually invisible. Anyone
fully paying attention should see the right-side lane markers continuing
straight and stay with them, and should spot the barrier very early (although
I usually see those with a lot more reflective markings and barrels to absorb
energy), but bad lane markings can definitely throw off imperfect humans
plenty too. The I-15 in Utah is notorious for this. Areas that have been under
constant construction for years have layers of previous temporary lane
markings that were badly removed and weren't clear to begin with, and you can
be driving along with several lanes of traffic whose drivers suddenly get
confused about which lane they should be following and start converging into
the same lanes.

Clear lane markings are not a requirement just for autopilot.

~~~
orev
I agree that the lines should be fixed, however humans rely on things other
than lane lines, and autopilots should too. When you can't see the lines, you
get cues from other cars, your estimated distance from both sides of the road,
sign placement overhead or at the side of the road, etc...

If we need to repaint all roads to make them work for auto-driving cars, we
might as well just move to one of those simple systems like where a robot
follows a pre-drawn line on a piece of paper. Maybe throw some bar codes in
there so the car knows where it is.

If the cars can't deal with environments that even humans can handle, I'm not
sure what the point is anymore. I'd expect a self-driving car to see through
things like fog and darkness better than a human.

~~~
TallGuyShort
>> If we need to repaint all roads to make them work for auto-driving cars

No I'm saying the section in the video is in need of repainting even for
humans.

~~~
orev
And I did not disagree, however if auto-driving cars can't deal with that
situation at least as well as humans, then we have a problem. Obviously we're
at 1.0, so things will get better.

------
jonheller
This is pretty shocking, more so than any other autopilot-related news or
videos I've seen. It's something that no normal driver paying attention would
ever do.

This further confirms for me the feeling that autopilot in cars won't become
even remotely common for another decade, especially in any parts of the
country with poor weather.

~~~
loceng
From my understanding, the driver who died in the crash had previously
reported problems in this area. I wonder if it was this specific crash site,
and if the person was aware of it, why they didn't pay more attention there -
and stop trusting it. I can understand how easy it is to be uncertain of
something and still have enough trust, or rather a thread of hope, that
"nothing bad will happen to me."

I also wonder, if it's true that the person who died in the crash reported
the area, what process Tesla followed. Any area with a reported problem
should IMHO immediately trigger warnings for other drivers approaching the
same spot until the issue is resolved - even disabling Autopilot there
completely until Tesla has fixed it. I'm sure that would make people safer
than the temporary inconvenience it would cause; I'd hope people would take
that attitude anyway.

~~~
floatingatoll
If the driver previously reported problems on a section of roadway and was
then inattentive when driving that section of roadway, that’s a perfect setup
for a reckless driving ticket. They knew a problem existed and recklessly
ignored it. Autopilot does not excuse this in any way whatsoever.

------
amluto
> Then it seems like Autopilot’s Autosteer stayed locked on the left line even
> though it became the right line of the ramp. The system most likely got
> confused because the line was more clearly marked than the actual left line
> of the lane.

This doesn’t surprise me at all. I test drove an “autopilot 2” car once on a
local road at 30-ish MPH. It appeared to try to drive centered between the
lane line and the curb, totally ignoring the fact that the right half of the
“lane” was parking. It then tried to drive me into a parked car and later into
a trash can. Unless they’ve improved dramatically, this is the same thing.

Autopilot seems to be a mediocre adaptive cruise with mediocre steering
assistance that pretends it can actually drive.

~~~
neo4sure
Autopilot should only be used on freeways...

~~~
amluto
Then the software shouldn’t allow it to be enabled off of freeways. And it
should be able to dodge big concrete and metal dividers.

~~~
neo4sure
Then read the user manual and use only accordingly.

------
martinald
Tesla have made it sound as if the driver ignored the warning, but their
statement didn't actually say that. It said that earlier in the drive he had
ignored a warning, but it was phrased cleverly so that it sounds like he
ignored a warning during the accident.

Feel free to correct me if I'm wrong, but I'm 99% sure that Tesla managed
some pretty spectacular weasel wording there.

~~~
stiGGG
Yes, their statement definitely left a lot of room for interpretation. I was
surprised that most media turned it into a story that took Tesla at least a
little out of the line of fire. Especially here in Germany, where the media
(and most car enthusiasts) generally want to see Tesla go bankrupt asap.

~~~
lsaferite
> Especially here in Germany, where the media (and most car enthusiasts)
> generally want to see Tesla go bankrupt asap.

Why?

~~~
stiGGG
Actually there are multiple reasons.

First of all, a lot of people here believe that our cars are the best in the
world and that no one can even come close. A lot of this is based on
tradition; remember that the gasoline and diesel engines were both invented
by Germans. A foreign company getting so much hype just doesn't fit that
worldview, especially an American one. There is no general bad reputation for
American products here (rather the opposite), except for cars: American cars
are generally considered poorly made and technologically far behind. A Tesla
(who are they anyway?) more innovative than a Mercedes, BMW, Audi or even
Porsche??? That which must not be, cannot be.

Then there is still a general rejection of electric cars by most people; it's
ridiculous how much fake news gets spread around that topic in the media
here, always to worship the diesel engine as the holy grail.

The only reasonable point is that there is a big fear that electric cars will
cause a lot of people to lose their jobs, because they are much easier to
build. Keep in mind that the automotive industry is the heart of Germany's
economy. But instead of accepting that technologies change over time and
supporting and pushing our industry to make a good transition and keep its
worldwide market share, they still hope this can somehow be prevented
entirely.

------
notheguyouthink
Man, I panicked just _watching_ that. I can only imagine how I'd react if an
"autopilot" started beeping, forcing me to take over in an already bad
situation.

I'd love to own a car with all these sensors, but currently all I'd want to
use them for is keeping distance to cars in front of me, and stopping if
needed/etc - ie, advanced cruise control.

~~~
DKnoll
I'm terrified to drive a car that could decide to fully apply the brakes if
it deems it necessary. Imagine it falsely detecting a human in the middle of
the highway while a car is following you closely. A very unlikely scenario,
some would say impossible... but I'd rather just pay attention to the road
myself instead of getting my car to do it for me.

~~~
actsasbuffoon
Though eventually that hypothetical car behind you will also have autonomous
braking, so it'd be fine. It's just a matter of getting adoption rates high
enough.

------
lebski88
Unrelated to the autopilot stuff - this junction seems like an absolute menace
to me. The lines are confusing and that crash barrier seems purpose designed
to kill people. I've only driven a bit on US roads but I don't remember seeing
junctions as bad as this. Is this common or an outlier?

~~~
rurounijones
I had to watch the video twice because I thought the driver had messed up and
dropped the camera before the important bit. Only on my second viewing did I
realise that the white line was no longer the divider for the lane.

You can _just_ make out the white lines of the lane going to the right, and
the chevrons between them and the offramp, but, forgetting all the autopilot
stuff, I could easily see a tired driver going straight into that barrier, or
even a non-tired one in bad visibility taking cues from the white lines.

------
EADGBE
FWIW, this article is worded as if it's the same exit. It's not; the video
attached is in Chicago.

Ramps like this are ubiquitous across the US; a self-driving computer unable
to parse faded lane markings is alarming, to say the least.

------
gapo
If this is a reproducible 'bug', then Tesla's stance that Autopilot is
"...certified by US Govt to reduce accidents by 40%..." and that "That does
not mean that it perfectly prevents all accidents — such a standard would be
impossible — it simply makes them less likely to occur." is going to severely
discourage a lot of people from using Autopilot.

Tesla's expectation that a driver put their hands back on the steering wheel
within 6 seconds or risk death is not going to bode well either.

Also, "Internal data confirms that recent updates to Autopilot have improved
system reliability." [2] is going to come under a lot of questioning. What
standards/tests do autopilots even go through?

[1] [https://www.tesla.com/blog/what-we-know-about-last-weeks-accident](https://www.tesla.com/blog/what-we-know-about-last-weeks-accident)
[2] [https://www.tesla.com/blog/update-last-week%E2%80%99s-accident](https://www.tesla.com/blog/update-last-week%E2%80%99s-accident)

~~~
loceng
The distribution of collision types will shift as well, so there could be a
"500%" increase in one type of collision alongside a "20,000%" decrease in
another.

------
jonawesomegreen
Shouldn't autopilot at least stop the car because of the barrier in front of
it? Even if it can't figure out the lane configuration?

~~~
markstos
Stationary objects are less likely to be recognized than moving ones:
[https://www.wired.com/story/tesla-autopilot-why-crash-radar/](https://www.wired.com/story/tesla-autopilot-why-crash-radar/)
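The Wired article's gist can be sketched in a few lines. This is a toy
illustration, not Tesla's actual code: automotive Doppler radar pipelines
commonly discard returns whose speed over the ground is near zero, because
stationary clutter (overhead signs, bridges, guardrails) would otherwise
trigger constant false braking, and a crash barrier looks exactly like that
clutter. All names and thresholds below are made up for the example.

```python
EGO_SPEED_MPH = 65.0  # our own car's speed (hypothetical scenario)

def speed_over_ground(relative_speed_mph: float) -> float:
    """Radar measures closing speed relative to our car; adding our own
    speed recovers the target's speed over the ground."""
    return EGO_SPEED_MPH + relative_speed_mph

def is_tracked(relative_speed_mph: float, threshold_mph: float = 2.0) -> bool:
    """Keep only targets that are actually moving; drop stationary clutter."""
    return abs(speed_over_ground(relative_speed_mph)) > threshold_mph

# A car ahead doing 60 mph closes at -5 mph relative to us: kept.
print(is_tracked(-5.0))    # True
# A fixed barrier closes at -65 mph (speed over ground 0): filtered out.
print(is_tracked(-65.0))   # False
```

The uncomfortable consequence, which the article spells out, is that the
filter that keeps the car from slamming the brakes under every overpass is
the same filter that makes it blind to a concrete divider.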

~~~
loceng
I wonder why these systems aren't linked to GPS and maps; surely you'd be
able to calculate that driving straight into the gap between two fork options
isn't going to end well.

~~~
danbtl
GPS by itself isn't accurate enough to determine what lane you're in.

Even WAAS GPS [1] (used by autopilots in airplanes) is only specified to be
accurate to within 25 ft, although the measured accuracy is usually around 3
ft.

[1]
[https://en.wikipedia.org/wiki/Wide_Area_Augmentation_System](https://en.wikipedia.org/wiki/Wide_Area_Augmentation_System)
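The arithmetic behind this point is simple: a typical US freeway lane is
about 12 ft wide, so lane-level localization needs an error radius well under
half a lane width. A minimal sketch (the lane width and the "consumer GPS"
figure are assumptions for illustration, the WAAS figures are from the
comment above):

```python
LANE_WIDTH_FT = 12.0  # typical US interstate lane (assumed)

def can_resolve_lane(error_radius_ft: float) -> bool:
    """Lane-level localization needs the error radius to be comfortably
    under half a lane width; otherwise the fix can land in either lane."""
    return error_radius_ft < LANE_WIDTH_FT / 2

for name, err_ft in [("consumer GPS", 16.0),
                     ("WAAS spec", 25.0),
                     ("WAAS typical", 3.0)]:
    print(f"{name}: +/-{err_ft} ft -> lane-level? {can_resolve_lane(err_ft)}")
```

Only the best-case WAAS figure clears the bar, and a safety system can't rely
on best-case accuracy, which is why lane keeping leans on cameras rather than
GPS alone.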

------
TYPE_FASTER
You can see the left lane line stay blue on the dash.

That's a really dangerous exit. The dark color on the end of the divider
blends in with the background.

I would have thought their radar would see the lane divider. Either it
didn't, or their software elected to follow the lane, or that car does not
have radar hardware, or it is running an older version of Autopilot
([https://www.tesla.com/blog/upgrading-autopilot-seeing-world-radar](https://www.tesla.com/blog/upgrading-autopilot-seeing-world-radar)).

Or this is proof that their software doesn't use the output of the radar.

Without code and data, it's all just conjecture.

------
millzlane
It was reported that the other fellow who died a few days ago also knew of
this bug before he was killed. I wonder if he too died while trying to
reproduce it, and if so, could you really hold Tesla at fault?

You know there is a deadly flaw in a product (imagine a car without anti-lock
brakes), but you continue to try to reproduce a deadly bug in the software.
Is the maker of the car at fault?

~~~
EADGBE
Could absolutely be true.

The problem I see with this software issue is that the production environment
isn't fully reproducible for testing/debugging.

We're debugging live code in production, with the ability to completely ruin
lives with the wrong logic.

In comparison, anti-lock brake tests can be done on an abandoned airstrip.

------
EADGBE
As someone not familiar with Autopilot or the programming required to
"self-drive", it appears the car is confused by the left-most solid white
line of the right-most non-exit lane. It's as if it follows that line
entirely.

------
dustinmoorenet
First off, I want to say that Tesla looks bad in this case. But at what point
can we start blaming the cities and states for not properly marking roads?
That crash attenuator was already compressed before the accident, so did some
human driver do the same as the Tesla?

------
shofu
This scares me because it looks like a pretty common edge case, something you
would easily think of. How could a bug like that, in such a high-stakes
scenario, not be caught by testing??

------
senectus1
Jeepers, it seems so obvious, doesn't it?

It's following that white line, and at a guess, the profile of that
stationary sudden stop at the end just doesn't fit within its target
detection range.

------
arcaster
Leave it to shitty Boston roads to nearly kill occupants and confuse Tesla's
auto-pilot!

