
Self-Driving Tesla Was Involved in Fatal Crash, U.S. Says - reuven
http://www.nytimes.com/2016/07/01/business/self-driving-tesla-fatal-crash-investigation.html
======
robbrown451
They really need to scrap this idea that people can sit there and not have to
do anything for long periods, and still be alert.

It's not reasonable to expect that -- and this sort of thing will happen when
you rely on that system working.

~~~
brian-armstrong
This is an interesting point. Seems like it'd be useful if the car made you
drive for 5 minutes out of every 30 of your trip.

Edit: Also, I realize that technically you are always driving. By driving here
I mean in the active sense -- no automatic steering or cruise control.

~~~
nojvek
My Toyota RAV4 has adaptive cruise control and can auto-steer back into the
lane if I'm too far off. I wish it could auto-follow a lane; it has everything
it needs to make that happen.

But I also like that they dumbed it down so that I have to keep a hand on the
steering wheel; otherwise it complains loudly.

Car crashes should be investigated with the rigour of aeroplane crashes. The
only way to make things safer for the next driver is to learn as much as
possible from the failure here and fix it.

I also strongly believe that cars should pass a rigorous test of edge cases
and be scored on that.

------
jakozaur
Even if self-driving cars become 10x safer than humans, I doubt they can
eliminate all deadly accidents. Though many people die or are severely injured
on the roads every day, each machine failure is treated as major news.

Similarly, plane accidents are perceived as far more alarming, and that
perception persists over decades: "A sold-out 727 jet would have to crash
every day of the week, with no survivors, to equal the highway deaths per year
in this (USA) country."
[http://anxieties.com/flying-howsafe.php#.V3YVE5N95AY](http://anxieties.com/flying-howsafe.php#.V3YVE5N95AY)
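
A quick back-of-the-envelope check on that quote, with figures that are my own
rough assumptions (a 727 seated roughly 130-190 passengers, and US highway
deaths ran around 50,000 a year in the era the quote dates from):

    # Sanity check on the "727 a day" quote; both numbers are assumptions.
    seats_per_crash = 140    # mid-range 727 configuration, assumed
    crashes_per_year = 365   # one fatal crash per day, no survivors
    print(seats_per_crash * crashes_per_year)  # 51100: close to the ~50k annual deaths of that era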

What does that mean for AI? Either we create AI that is 1000x safer than
humans, or self-driving cars will face harsh public and regulatory opposition.

~~~
kkennis
While flying is certainly safer than driving, that statistic speaks more to
the gross volume of cars driven vs. planes flown than anything else.

Planes and self-driving cars are perceived as dangerous due to the lack of
control. If a plane is about to crash, there's not much to do besides hold on
and pray. If your car accelerates when it's supposed to brake, you'd better be
paying attention and act fast to avoid a crash. With your hands on the wheel,
you're always in a position to act, and it's easier to believe that there's
always something you can do to avoid disaster.

~~~
exDM69
These statistics are typically deaths per passenger-mile travelled or
similarly normalized. Or do you mean that highway traffic is more dangerous
because there are more cars in a smaller space?

------
steejk
This was inevitable given that Tesla calls the system an 'Autopilot' when in
fact the driver must keep their hands on the wheel and monitor it full time.

~~~
iamphilrae
I hope the pilots in the planes I fly in are doing that too!

~~~
madaxe_again
No, you don't, unless you want an exhausted pilot trying to land in a
crosswind at the end of a journey.

Aviation autopilot is very much "set waypoints, sit back": if anything goes
awry it lets you know, loudly, and you then have time to deal with whatever
needs dealing with. The whole reason autopilot was developed was to combat
fatigue-related accidents - and it works.

~~~
akira2501
> "set waypoints, sit back"

"and maintain situational awareness." Some failures can happen quickly, and
the plane can change modes unexpectedly. Plus the warnings you receive may not
be accurate.. see Air France 447.

You can sit back, but your role is to constantly check your instruments, your
performance, your route, your radios and your environment.

~~~
madaxe_again
Yes, but you don't need to sit there with your hands on the yoke, feet on the
pedals, wired to the nines waiting for disaster to strike. I used to take a
book with me on longer flights (when you're sitting alone in a plane for hours
flying over cloud, things get tedious), since short of having a wing fall off
there isn't all that much that can go very suddenly wrong - and most of the
"very suddenly wrong" situations will kill you regardless. Keep one eye on the
instruments, mutter intermittent obscenities at the oil pressure gauge - but
mostly relax.

------
yk
I really do not like Tesla's strategy of lulling drivers into dozing off while
driving. However, the Guardian article on the accident claims that Teslas have
driven 130 million miles in autopilot mode, while the overall rate is closer
to one fatal accident per 100 million miles. [0] It is statistically
complicated to measure anything from a single event, and Ars had a nice
article on the problems for self-driving cars [1], but I would interpret this
as: Tesla's autopilot together with a human driver is as good as (with some
indication of being slightly better than) an unassisted human driver.
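
To see how little a single event pins down, here is a minimal Python sketch
(my own illustration, not from either article) of an exact Poisson confidence
interval for one fatality observed in 130 million miles:

    import math

    def poisson_cdf(lam, n):
        """P(X <= n) for X ~ Poisson(lam)."""
        return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(n + 1))

    def bisect(f, lo, hi, iters=100):
        """Root of f on [lo, hi], assuming exactly one sign change."""
        for _ in range(iters):
            mid = (lo + hi) / 2.0
            if f(lo) * f(mid) <= 0:
                hi = mid
            else:
                lo = mid
        return (lo + hi) / 2.0

    def poisson_ci(k, alpha=0.05):
        """Exact (Garwood) two-sided CI for a Poisson mean, k events observed."""
        lower = 0.0 if k == 0 else bisect(
            lambda l: (1 - poisson_cdf(l, k - 1)) - alpha / 2, 1e-9, 100.0)
        upper = bisect(lambda l: poisson_cdf(l, k) - alpha / 2, 1e-9, 100.0)
        return lower, upper

    # One fatality in ~130 million Autopilot miles (the Guardian's figure):
    low, high = poisson_ci(1)
    print(low / 1.3, high / 1.3)  # ~0.02 to ~4.3 fatalities per 100 million miles

The 95% interval spans roughly 0.02 to 4.3 fatalities per 100 million miles,
which comfortably contains the roughly 1-per-100-million human baseline; one
event simply cannot distinguish the two.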

Apart from that, it is important to remember that fatal accident statistics
are not just numbers, but tabulations of individual tragedy. My condolences to
his family, but sadly we cannot expect autonomous cars to be perfect on the
first try.

[0] [https://www.theguardian.com/technology/2016/jun/30/tesla-aut...](https://www.theguardian.com/technology/2016/jun/30/tesla-autopilot-death-self-driving-car-elon-musk)

[1] [http://arstechnica.com/cars/2016/04/car-makers-cant-drive-th...](http://arstechnica.com/cars/2016/04/car-makers-cant-drive-their-way-to-safety-with-self-driving-cars/)

~~~
frankchn
There is only one data point from Tesla, so I won't read too much into it
either way.

That said, all Model S are expensive and fairly new cars with good safety
features compared to the average car on the road today. Thus, I wonder if it
would be better to compare the 1 fatality / 130 million miles statistic to
Model S (and S-Class / 7 Series) fatality rates while not in Autopilot mode.

~~~
c0g
A comparison also needs to account for the fact that Autopilot is really only
used on highway-like roads, which are much safer than most other types of
driving.

~~~
mSparks
Not sure I agree.

You may be more likely to have an "accident" elsewhere.

But I'm reasonably sure that you are much more likely to die at highway
speeds.

This is one of the great things about Tesla logging everything, though: the
data is there to "prove" it one way or the other.

~~~
c0g
I'm basing my assertions off things like this:
[http://www.npr.org/2009/11/29/120716625/the-deadliest-roads-...](http://www.npr.org/2009/11/29/120716625/the-deadliest-roads-are-rural)

~~~
mSparks
> The roads traveled least are the nation's deadliest roads

Two factors for that, though. The first is bad road markings, which computers
will be infinitely better at handling.

But more important is the response time of emergency services and letting
someone know there's been an accident.

Even minor injuries end up fatal if no one finds you trapped in the car for a
week, and every second counts in more serious collisions.

There is only so much manufacturers can do about either.

DUI and driver fatigue, as you will see, are among the big killers. (And
before you think "yeah, but I don't drive drunk or tired," remember you aren't
the only person on the road.)

I think it's just about being careful with what is meant by "safer" and "other
types of driving".

You are more likely to have an accident in busy periods on busy roads, but you
are much more likely to die if you have an accident somewhere quiet or on
high-speed roads.

~~~
c0g
Bad road markings are very hard for current self-driving systems to handle.

Your other points back up the claim, though: self-driving cars need to be
compared on a like-for-like mortality basis (same roads, same demographics,
similar safety features). My guess is they're currently considerably less
safe.

------
girvo
> _that view is now in question_

Does anyone else find it fascinating that for computers to be considered
"safe" they must be as close to 100% safe as you can get, while equivalent
human drivers have fatal accidents every single day?

~~~
madaxe_again
I wouldn't say "fascinating" so much as "irritating", as this is more about
journalists wanting to get a juicy story where there isn't one, rather than a
lesson on human nature.

Statistically, autopilot is orders of magnitude safer than humans.

If there's a story here, it's about how the US doesn't require underrun
protection on trucks, thus making them death traps, but that wouldn't be
anywhere near as good for circulation as a "rise of the machines, they want to
kill you" story.

~~~
arghnoname
I don't think we can say statistically whether it is safer than humans right
now or not.

~~~
madaxe_again
150,000,000 miles on autopilot, one fatality. Humans, it's 66,000,000 miles
per fatality, in the US. Admittedly the sample is small, but still, it
suggests an advantage over humans.

~~~
arghnoname
Apples and oranges. The statistic you cite covers all cars and roads, as far
as I'm aware. High-end vehicles with comparable safety features, on roads
where autopilot is likely to be engaged, would be a more comparable baseline.

I wouldn't be surprised if autopilot is safer, but I think it is overstating
things to say this is a statistical fact.

------
argonaut
Whether Tesla has a lower accidents/mile rate (or is some factor better or
worse) is a red herring [1].

The key factor in whether this is a legitimate flaw in Tesla's system is:
_could an alert, competent driver easily avoid the accident?_ An alternate
formulation of this criterion is: would it be relatively easy for an engineer
to hard-code a brute-force solution to the edge case, if they really wanted
to?

Obviously we'll need to see the crash video to verify, but my guess is the
answer is yes: this is a legitimate flaw in Tesla's system. Tesla's system
should have been able to detect a car traveling perpendicular to the highway
[2]. It was unfortunately not sophisticated enough to detect a car that an
alert human would have noticed in their peripheral vision, nor to perform the
simple action a human would have taken (slamming on the brakes). In fact, if
they ever want to scale up to a full urban-capable system, this is a situation
they need to be able to handle.
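
For illustration only, here is a toy version of such a hard-coded check:
constant-velocity prediction of a tracked object's position relative to the
ego car. Every name and threshold below is invented for the sketch and does
not reflect Tesla's actual software:

    from dataclasses import dataclass

    @dataclass
    class Track:
        """Hypothetical fused sensor track, in the ego vehicle's frame."""
        x: float   # metres ahead of the ego vehicle
        y: float   # metres to the left of the ego vehicle's path
        vx: float  # object velocity along x, m/s
        vy: float  # object velocity along y, m/s (a crossing car has large |vy|)

    def crossing_threat(track, ego_speed, lane_half_width=2.0,
                        horizon=4.0, dt=0.1):
        """Flag an object predicted to sit directly in our path within
        `horizon` seconds, assuming constant velocities."""
        for i in range(int(horizon / dt) + 1):
            t = i * dt
            rel_x = track.x + (track.vx - ego_speed) * t  # distance still ahead of us
            rel_y = track.y + track.vy * t                # lateral offset from our path
            if 0.0 < rel_x < 5.0 and abs(rel_y) < lane_half_width:
                return True  # predicted overlap with our corridor: brake hard
        return False

    # A truck 60 m ahead and 16 m left of our path, crossing at 8 m/s,
    # while we travel at 30 m/s (~67 mph):
    print(crossing_threat(Track(x=60.0, y=16.0, vx=0.0, vy=-8.0), 30.0))  # True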

Examples of accidents where I wouldn't blame Tesla's system: the other car
swerves across the center divide and hits you head-on, the other car is
speeding and rear-ends you, the car in the next lane swerves into you, a tire
blows out and you hit a tree, etc. Although in the long term we would ideally
have systems that could handle these situations too.

[1] We need to wait for more miles and (inevitable) accidents to be logged
rather than rely on this one data point. Also my guess is highway driving is
less fatality prone than urban driving, on a per-mile basis.

~~~
exDM69
I don't think that autonomous cars are anywhere near the point where
large-scale deployment is feasible. In particular, I think there needs to be a
cross-vendor car-to-car or car-to-road communication protocol. Until then,
autonomous cars will only work as long as there are relatively few of them or
they run in controlled conditions like major highways.

When I'm in traffic, I notice myself communicating with other people a lot.
Most commonly it's just brief eye contact ("I have seen you"), but it enables
smooth and safe traffic for everyone.

Just blindly following rules and reacting to your immediate surroundings won't
work at large scale. There are plenty of situations where that would end in a
deadlocked traffic jam (e.g. a 4-way intersection with no right of way; see
the toy sketch below).

This is by no means an impossible task; I just have not heard of an effort to
solve it yet.
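
To make the deadlock concrete: if all four cars follow "yield to the car on
your right", each waits forever. Any shared, deterministic tie-breaker fixes
it; the one below is entirely made up (no such protocol exists) and only shows
the shape of a solution:

    # Hypothetical arbitration: stopped cars broadcast an ID; lowest goes first.
    def who_goes_first(vehicle_ids):
        return min(vehicle_ids)

    print(who_goes_first([412, 87, 909, 333]))  # 87 proceeds; the rest re-arbitrate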

~~~
argonaut
I rarely communicate with other drivers (other than with signal lights and
brake lights), so I don't actually think it's _necessary_ for self-driving
cars. The only use of communication is settling right of way at stop signs,
and this can be resolved by seeing which car moves forward into the
intersection first.

------
Animats
Previously discussed on YC at [1].

The obituary of the driver, Joshua Brown, is now available. Age 40. Former
Navy explosive ordnance disposal specialist. Worked with Navy SEALs. Studied
physics and computer science. CEO of an Internet of Things startup.[2] It
would be hard to find someone more qualified to operate a vehicle with Tesla's
"autopilot". Placing the blame on that driver probably isn't going to fly.

The NHTSA is investigating.[3]

[1] [https://news.ycombinator.com/item?id=12011419](https://news.ycombinator.com/item?id=12011419)

[2] [http://www.detroitnews.com/story/business/autos/2016/06/30/t...](http://www.detroitnews.com/story/business/autos/2016/06/30/tesla-driver-crash-technology-company/86579674/)

[3] [http://www-odi.nhtsa.dot.gov/owners/RecentInvestigations](http://www-odi.nhtsa.dot.gov/owners/RecentInvestigations)

------
emp
No mention of the fact that the human-driven truck pulled out in front of the
car. From the description it sounds like the car should have been braking
hard; one doesn't just hit the side of a truck. It seems the truck driver
didn't see the oncoming car either.

~~~
throwaway2016a
This is key. The truck was sideways in front of the car on a divided highway.
That is not normal, and all the situations I can think of where that could
happen involve really quick, unexpected moves on the truck driver's part, at
high speed.

Combine that with the fact that the truck was high enough to be over the hood
of the car (and therefore probably above the height of many of the sensors).

I don't think I, as a human, would have done any better.

------
mrb
The NHTSA is full of smart people. They _know_ that a Tesla in self-driving
mode is safer than when driven by flesh and blood. Data proves it. We have
millions of hours of self-driving Teslas and only _one_ fatal accident.

Therefore I predict that this accident (and other future ones) will not
significantly slow down the development and adoption of self-driving cars...

~~~
thesimon
>Data proves it.

[citation needed]

Tesla's press release says this is the first fatality in 130 million miles of
Autopilot driving, whereas vehicles overall have a fatality every 94 million
miles.

But that figure includes old cars, old drivers, drunk drivers, etc.

Doesn't seem that much safer.

~~~
Hermel
I don't get why you are being downvoted. There is not enough data to prove
anything yet.

------
torvald
This came to mind: «Teslas are so safe that they make headlines every time one
crashes.»

-
[https://www.reddit.com/r/Showerthoughts/comments/4n9tbv/tesl...](https://www.reddit.com/r/Showerthoughts/comments/4n9tbv/teslas_are_so_safe_that_they_make_headlines_every/)

------
foobarbecue
I'm kind of surprised to learn that the autopilot was relying on visual
sensors (since I haven't read much about the Model S). I wonder if onboard
lidar or radar could have prevented this accident.

~~~
jakobegger
I think the Tesla does have a front-facing radar, but radar isn't as precise
as a visual camera, so the autopilot probably uses a combination of sensors.

EDIT: From the Tesla website:

> Every single Model S now rolling out of the factory includes a forward
> radar, 12 long range ultrasonic sensors positioned to sense 16 feet around
> the car in every direction at all speeds, a forward looking camera, and a
> high precision, digitally controlled electric assist braking system.

[https://www.teslamotors.com/de_AT/blog/dual-motor-model-s-an...](https://www.teslamotors.com/de_AT/blog/dual-motor-model-s-and-autopilot?redirect=no)

~~~
throwaway2016a
Even "up"?

I know my car has all its sensors mounted low. I'm not a radar expert: if the
side of the truck was significantly above where the radar is looking, would it
be able to see it?

------
rolandukor
I think at these early stages of autopilot driving, drivers should always
assume they are the ones driving, as they are sharing the roads with mere
mortals behind the wheel. There are reports that the driver may have been
watching a movie on the screen. I would have thought the driver should be
unable to watch a movie on the screen while driving, even in autopilot mode.
For example, the Range Rover has a dual-view screen where the front passenger
can watch a movie, but the driver cannot see or hear what's going on even
though (s)he is looking at the same screen.

------
meeper16
Networks of hackable cars are a bad idea.

------
mtgx
I love Tesla, but make Tesla liable for this. I don't care that "the driver
was supposed to be careful - it's in the license agreement!" or whatever. If
you offer any kind of "self-driving" technology, even if it's only for
changing lanes, then the auto-maker should be liable for the accidents that
happen because of that. Don't put the blame on the driver.

Also, something I've said before: I wouldn't be a "beta tester" of
self-driving technology for at least the first 10 years. I think that's
stupid, but hey, I'm glad others are willing to die to prove the technology
for me, I guess. Even if you trust companies like Google, Tesla, or Apple to
do it right, the hype could spill over to other companies, like say Fiat, who
are clueless about software but will also offer "self-driving" tech, and
people will believe it's _just as good_, when it may be far from it.

Make them pay for every accident that happens because of their technology, and
you'll see how quickly they improve it and how carefully they release it. No
other regulation may be needed. Other regulations may not be sufficient on
their own anyway; auto-makers will just do the minimum necessary to comply,
and people could still end up dead under those regulations.

~~~
kogepathic
> I love Tesla, but make Tesla liable for this.

This is such a slippery slope, especially in lawsuit crazy USA.

> I wouldn't be a "beta tester" of self-driving technology for at least the
> first 10 years.

Yes, I totally agree.

> Make them pay for every accident that happens because of their technology,
> and you'll see how quickly they improve it and how carefully they release
> the technology.

I don't agree that the civil legal system is the right way to solve this. I
also don't believe software developers for Tesla should be criminally charged
for the death of drivers if it's shown that the software developers were not
negligent.

What I really want to see happen is for autonomous cars to have their software
independently and fully audited before it ships in production vehicles.

This way, there is a neutral third party, certified by the NTSB, that
certifies whether a self-driving car is road legal.

We hold drivers responsible if they kill someone while behind the wheel. I
think we're setting ourselves up for a dangerous precedent if we also expect
to hold software developers criminally liable. This is why the software in
self-driving cars must be audited and certified: it places a legal burden on
software developers to provide safe software, and it removes the criminal
liability component should there be a crash.

Treat self-driving car crashes less like a witch hunt and more like an
airplane accident: in airplane accidents, no one is trying to point fingers;
everyone is simply trying to determine the cause and how it can be avoided in
the future.

> Other regulations may not be sufficient on their own anyway, and the auto-
> makers will just do the minimum necessary to comply with them

Exactly. This is why tougher regulations are needed. IIRC even ECU code in ICE
cars is currently audited. This emerged during the Toyota accelerator
lawsuit.[1]

[1]
[https://news.ycombinator.com/item?id=9643204](https://news.ycombinator.com/item?id=9643204)

~~~
vertex-four
Part of the issue, of course, in America, is... who's going to pay everything
necessary to get an injured person's life back on track?

