
Tesla driver killed in crash with Autopilot active, NHTSA investigating - davidiach
http://www.theverge.com/2016/6/30/12072408/tesla-autopilot-car-crash-death-autonomous-model-s
======
NeutronBoy
> In the blog post, Tesla reiterates that customers are required to agree that
> the system is in a "public beta phase" before they can use it

I absolutely disagree with this, and it should not be used as a 'get out'
clause by Tesla. If you work with non-technical people on technical issues
on a day-to-day basis, you'll understand why: non-technical people literally
don't understand what stuff like this means. They'll read it, then say 'Oh,
but they installed it in my car anyway so it must be safe', and use it anyway.

~~~
gshulegaard
Beware clickbait and intentional inflammatory posturing. I noticed The Verge
selectively quoted Tesla without directly linking to the source (although it
was part of Elon's tweet):

[https://www.teslamotors.com/blog/tragic-loss](https://www.teslamotors.com/blog/tragic-loss)

The full context of the agreement:

"It is important to note that Tesla disables Autopilot by default and requires
explicit acknowledgement that the system is new technology and still in a
public beta phase before it can be enabled. When drivers activate Autopilot,
the acknowledgment box explains, among other things, that Autopilot “is an
assist feature that requires you to keep your hands on the steering wheel at
all times," and that "you need to maintain control and responsibility for your
vehicle” while using it. Additionally, every time that Autopilot is engaged,
the car reminds the driver to “Always keep your hands on the wheel. Be
prepared to take over at any time.” The system also makes frequent checks to
ensure that the driver's hands remain on the wheel and provides visual and
audible alerts if hands-on is not detected. It then gradually slows down the
car until hands-on is detected again."

So the key here is that Tesla autopilot is a driver assist but is positioned
such that the driver needs to remain alert and able to assume control of the
vehicle at any moment. So when it is said, "Neither Autopilot nor the driver
noticed..." it is critical to note that ultimately the driver failed to
control their vehicle. Personally, I find this reasonable.
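
Purely as a hypothetical sketch of the escalation Tesla describes (visual
alert, then audible alert, then a gradual slowdown until hands-on is detected
again) - function name and thresholds are invented here, not Tesla's actual
logic:

```python
def autopilot_tick(hands_on, hands_off_seconds, speed_mph):
    """One control-loop tick: return (alerts, new_speed_mph).

    Hypothetical escalation: visual alert, then audible alert, then
    a gradual slowdown until hands-on is detected again.
    """
    alerts = []
    if hands_on:
        return alerts, speed_mph  # hands detected: no intervention
    if hands_off_seconds > 5:
        alerts.append("visual")
    if hands_off_seconds > 10:
        alerts.append("audible")
    if hands_off_seconds > 15:
        # Bleed off ~1 mph per tick rather than braking hard.
        speed_mph = max(0.0, speed_mph - 1.0)
    return alerts, speed_mph
```

Note the obvious gap in any scheme like this: it escalates on hands-off, but
a driver can keep a hand on the wheel without watching the road at all.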

~~~
saulrh
> The system also makes frequent checks to ensure that the driver's hands
> remain on the wheel and provides visual and audible alerts if hands-on is
> not detected. It then gradually slows down the car until hands-on is
> detected again

I don't care much about the user agreement, but this should mean that the
driver had their hands on the wheel when this happened. Do we know if it
requires _both_ hands on the wheel, or can you placate the car with one hand
while you read reddit with the other?

~~~
gshulegaard
Seeing as I have seen videos of drivers sleeping in traffic with autopilot
engaged, I imagine there may be a way to evade/circumvent this check entirely.

So your question is sound, and I don't know the answer...but saying "A
distracted driver ran into a truck trailer" is less catchy.

~~~
tim333
It seems the driver was watching Harry Potter on a video player. Not sure if
he was touching the steering wheel.
[http://www.telegraph.co.uk/news/2016/07/01/tesla-autopilot-crash-driver-was-watching-harry-potter-movie-wit/](http://www.telegraph.co.uk/news/2016/07/01/tesla-autopilot-crash-driver-was-watching-harry-potter-movie-wit/)

------
netinstructions
Tesla's statement[1] provides some more details

> What we know is that the vehicle was on a divided highway with Autopilot
> engaged when a tractor trailer drove across the highway perpendicular to the
> Model S. Neither Autopilot nor the driver noticed the white side of the
> tractor trailer against a brightly lit sky, so the brake was not applied.

Failing to detect the white side of a tractor against a brightly lit sky is
something I can see camera/image based sensors struggling with, but not LIDAR
based sensors.

[1] [https://www.teslamotors.com/blog/tragic-loss](https://www.teslamotors.com/blog/tragic-loss)

~~~
at-fates-hands
This is going to be a major morality issue with self driving cars.

What if a deer crosses the road? Will the car hit the deer to save the life of
the driver? Avoid the deer to avoid an accident? Or save the deer and run into
the ditch, possibly harming the driver?

Unfortunately, this technology is not capable of making those decisions, and
if ultimately the driver is responsible, then maybe we should just keep it
that way.

~~~
massysett
"Or save the deer and run into the ditch, possibly harming the driver?"

This is not even a slight morality issue. The person is more important than
the deer.

"Unfortunately, this technology is not capable of making those decisions"

Of course it is. Program the machine so that its primary concern is the
protection of human life. Whether its secondary concern should be protection
of property or protection of deer is a more interesting question. But running
into the ditch to save the deer while threatening harm to the driver is
simply not an option.

~~~
arprocter
Replace 'deer' with 'child' and there's suddenly more of a quandary

------
GrinningFool
My sympathies to the people who built the systems that make Autopilot
possible. Even knowing it was statistically bound to happen eventually, this
has to weigh heavily from the "damn, I should've thought of this
once-in-130-million-miles corner case" perspective.

~~~
jholloway7
Unlike many other fatal crashes, however, they presumably now have data from
the corner case they can use to regression test every future release of the
system.

------
themgt
A company's "public beta phase" system named "Autopilot" just drove a customer
head-first into a trailer and killed them, at least in part due to inadequate
sensor hardware that seems unlikely a software update can remedy. Tesla really
ought to take a step back and consider the damage they may do to their brand
if they start killing customers and blaming the dead person for having trusted
a Tesla product.

------
michaeldwan
Oddly, the guy who was killed got a dash cam video of Autopilot avoiding
another collision earlier this year
[http://www.theverge.com/2016/6/30/12072634/tesla-autopilot-crash-autonomous-mode-viral-video](http://www.theverge.com/2016/6/30/12072634/tesla-autopilot-crash-autonomous-mode-viral-video)

------
nathanaldensr
I wonder what the driver was doing when the vehicle crashed. Tesla says that
the driver is ultimately responsible for controlling the vehicle, but then
provides a system that is likely to result in the driver _not_ paying
attention--sleeping, reading, or otherwise being distracted. How can Tesla and
the governments that license drivers resolve this situation?

~~~
henrikschroder
Any concession means that the driver somehow isn't ultimately responsible,
which is a situation no one wants to end up in. It's also ridiculous.

~~~
semi-extrinsic
Both Mercedes and Volvo have publicly stated they are taking full
responsibility for any accidents which occur during use of their semi-
autonomous systems. Tesla needs to step up, or it will hurt their public
image.

~~~
henrikschroder
That's probably very easy to say before you have a product on the market.
Let's see if they change their tune afterwards.

I also wonder if that would encourage more risky behaviour from users or less.

~~~
semi-extrinsic
Have you seen a 2016 Mercedes E-class? It has autonomous systems on
essentially the same level as a Tesla, just without the flashy "Autopilot"
branding. The new S-class coming out next year is going to have a lot more
autonomous features on top of that, including Lidar (which Tesla lacks).

~~~
cynix
> on essentially the same level as a Tesla

Actually, it appears to be nowhere near the level of a Tesla.

[http://mashable.com/2016/06/16/2017-mercedes-benz-e-class-first-drive/](http://mashable.com/2016/06/16/2017-mercedes-benz-e-class-first-drive/)

~~~
semi-extrinsic
From the article you linked:

> Turns out, the 2017 E-Class is just playing coy when it comes to self-
> driving capabilities. It is the first production car to receive an
> autonomous driving license in Nevada.

So the technical system (hardware+software) is as good as or better than a
Model S's, but Mercedes believes it is irresponsible to call that Autopilot
and pretend the car is fully autonomous, so they've put in restrictions like
requiring some tiny driver input once every 60 seconds so the car knows
you're paying attention.

------
roadnottaken
"This is the first known fatality in just over 130 million miles where
Autopilot was activated," states a post on Tesla's corporate website. "Among
all vehicles in the U.S., there is a fatality every 94 million miles.
Worldwide, there is a fatality approximately every 60 million miles. It is
important to emphasize that the NHTSA action is simply a preliminary
evaluation to determine whether the system worked according to expectations."
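
Back-of-the-envelope, converting those quoted figures into a common rate
(fatalities per 100 million miles; no controlling for road type, vehicle age,
or driver demographics):

```python
# Miles-per-fatality figures quoted in Tesla's post.
miles_per_fatality = {
    "Autopilot (to date)": 130e6,
    "U.S. all vehicles": 94e6,
    "Worldwide": 60e6,
}

for label, miles in miles_per_fatality.items():
    rate = 100e6 / miles  # fatalities per 100 million miles
    print(f"{label}: {rate:.2f} per 100M miles")
```

Roughly 0.77 vs 1.06 vs 1.67 per 100M miles - though with a single Autopilot
fatality, the first number has enormous uncertainty.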

~~~
burnguy123
From what I understand, the autopilot can only be used on highways. I would
suspect that highway travel is much safer per mile in the first place. Are we
comparing highway driving with autopilot to regular mixed city/highway
driving?

~~~
seizethecheese
Highways are the most dangerous environment: high speed combined with cross
traffic. City streets usually don't involve enough speed for crashes to be
lethal.

~~~
malka
What? Highways in the US have cross traffic? In my country, the ABSENCE of
such traffic is almost the definition of a highway.

~~~
ferrisford
It's common to use the term 'freeway' or 'Interstate' (meaning Interstate
Highway/Freeway) to refer to controlled access highways in the US.

There are lots of highways in the US that are just two lanes, one in each
direction, with a speed limit of 55 mph and cross traffic from smaller roads
whose drivers are supposed to stop before turning onto the highway in either
direction. These are what we usually mean by the word 'highway'.

------
karyon
discussion is here:
[https://news.ycombinator.com/item?id=12011419](https://news.ycombinator.com/item?id=12011419)

------
techthroway443
Should we be calling the technology Autopilot? I feel like the name implies
more autonomy than it provides, inadvertently misleading people.

~~~
United857
Tesla itself refers to it as Autosteer in the in-car UI (but Autopilot in its
marketing materials).

------
ibrahima
I can see why Volvo criticized Tesla for rolling out a system that's not fully
autonomous. If drivers are willfully distracted because they think the car
will handle any situation, that's a very dangerous place to be in. Though it's
not clear what the driver was doing in this situation, so it's possible that
this would have been a fatal crash in any other car without autopilot as well.

------
tlrobinson
_Tesla reiterates that customers are required to agree that the system is in a
"public beta phase" before they can use it_

I'm happy to let you all beta test this one for me.

------
johansch
[http://jalopnik.com/volvo-engineer-calls-out-tesla-for-dangerous-wannabe-au-1773519459](http://jalopnik.com/volvo-engineer-calls-out-tesla-for-dangerous-wannabe-au-1773519459)

~~~
icebraining
All he's saying is that the Tesla system requires an attentive driver, which
is something that Tesla themselves have stressed. It's just a PR piece.

~~~
johansch
So that's why they called the feature "Autopilot"?

~~~
cynix
Because, just like an airplane's autopilot feature requires an attentive
pilot, this driver assistance feature requires an attentive driver.

------
wldcordeiro
So if I read it right, the trailer merged from a lane next to the Tesla? So
the trucker had the Tesla in a blind spot? The article is vague.

~~~
tfinniga
More information here: [https://www.levyjournalonline.com/police-beat.html](https://www.levyjournalonline.com/police-beat.html)

Relevant section: In a separate crash on May 7 at 3:40 p.m. on U.S. 27 near
the BP Station west of Williston, a 45-year-old Ohio man was killed when he
drove under the trailer of an 18-wheel semi. The top of Joshua Brown’s 2015
Tesla Model S vehicle was torn off by the force of the collision. The truck
driver, Frank Baressi, 62, Tampa was not injured in the crash. The FHP said
the tractor-trailer was traveling west on US 27A in the left turn lane toward
140th Court. Brown’s car was headed east in the outside lane of U.S. 27A. When
the truck made a left turn onto NE 140th Court in front of the car, the car’s
roof struck the underside of the trailer as it passed under the trailer. The
car continued to travel east on U.S. 27A until it left the roadway on the
south shoulder and struck a fence. The car smashed through two fences and
struck a power pole. The car rotated counter-clockwise while sliding to its
final resting place about 100 feet south of the highway. Brown died at the
scene. Charges are pending.

Here is a google maps view of the accident location:
[https://goo.gl/maps/SSKyoxhoaxp](https://goo.gl/maps/SSKyoxhoaxp)

~~~
Jemmeh
Thanks for the link to more info and the map! I drew on it to show the path in
Paint.

[http://imgur.com/gallery/aWDK0y9/new](http://imgur.com/gallery/aWDK0y9/new)

