
A Tragic Loss - runesoerensen
https://www.teslamotors.com/blog/tragic-loss
======
BinaryIdiot
I mean I understand Tesla has to make a statement here and I understand they
want to assure everyone that it's not really their fault, but to title a post
"A Tragic Loss" and then spend the majority of the post discussing all of your
car's safety features and how it wasn't your fault just seems tone deaf and
distasteful to me.

Maybe they had to do it for legal reasons, I don't know (I'm certainly not a
lawyer), and I'd love to own a Tesla, but couldn't they have worded this a
little more sympathetically and a little less like a lawyer?

~~~
angusb
It's not just about Tesla though.

They have to defend autopilot not only to protect the brand but to protect the
public's perception of autonomous vehicles in general.

Self driving tech is poised to save many, many lives. So from a utilitarian
perspective, it's probably justified to take extraordinary measures to make
sure reactionary media and public whim don't kill it off, however
uncomfortable that might seem in the short term.

Whilst this case is incredibly sad (and I don't want to downplay that in any
way), if you're trying to minimise the overall number of fatal crashes,
exonerating the tech is the priority (if it is truly not at fault).

~~~
YeGoblynQueenne
>> Self driving tech is poised to save many many lives.

Based on what?

~~~
pacificmint
> Based on what?

Human drivers are very bad. We kill over 32,000 people a year (and that's a
60-year low; just a few years ago it was over 40,000).

So if self driving cars are just a little better than humans, they will save
many lives. If they are a lot better than us, they will save many more.

~~~
YeGoblynQueenne
>> if self driving cars are (...)

That's a hypothetical. You say autonomous cars are "poised" to save many lives
- not even avoid many deaths, actively "save".

That's very strong language to use based on an "if" that nobody yet knows the
answer to with any certainty.

That is, unless you are privy to some information that the rest of us are not?
Are you?

What is it that you base your assertion on? Do you base it on anything or is
it just a pretty turn of phrase you thought sounded cool?

~~~
Nadya
Not who you are questioning.

There are already videos of assisted-driving cars avoiding accidents which
likely would have been fatal. We already know they can potentially save lives.
It's something we already accept for flying: autopilot in planes has resulted
in deaths of crew and passengers, yet we keep using it because it is a net
win. So we have precedent in at least one mode of transportation.

Drivers get T-boned or rear-end other people all the time. Those are two
scenarios where driver-assist could have applied the brakes, _actively_
saving lives despite driver inattentiveness. You'll also find no shortage of
Liveleak clips of people being killed by drivers who failed to see them and
brake in time. Many of these are scenarios that Google's self-driving cars
have the technology to avoid.

~~~
brokenmachine
Planes don't have to deal with semi-trailers or wildlife appearing suddenly in
their path regularly either.

Autopilot in a plane doesn't have to deal with obstacles besides the odd
mountain here and there.

So I'd say that in many (most?) ways it's easier and safer to autopilot a
plane than a car, and as you say, even planes with autopilot have failed.

~~~
protomyth
> Planes don't have to deal with semi-trailers or wildlife appearing suddenly
> in their path regularly either

Planes most certainly do have to deal with wildlife (e.g. birds) appearing
suddenly in their path regularly.

~~~
abduhl
This is a ridiculous statement. Are you seriously saying that birds regularly
fly at over 30000 ft? Or are you trying to imply that autopilot during
landing/takeoff actively tracks birds and tries to avoid them?

In other words, the plane autopilot doesn't "deal" with wildlife at all unless
you count running into it. Which is apparently what Tesla autopilot does as
well.

~~~
protomyth
I am saying that planes using autopilot (e.g. terrain-following radar and
Autoland) have to deal with birds, and yes, they hit them.

------
jacquesm
This is why driving AI is 'all or nothing' for me.

Assisted systems will lead to drivers paying less attention as the systems get
better.

The figures quoted by Tesla seem impressive but you have to assume the
majority of the drivers are still paying attention _all the time_. As auto-
pilots get better you'll see them paying attention less, and then the accident
rate will go up, not down, at least for a while until the bugs are ironed out.

Note that this could have happened to a non-electric car just as easily, it's
a human-computer hybrid issue related to having to pay attention to some
instrument for a long time without anything interesting happening. The longer
the interval that you don't need to act the bigger the chance that when you do
need to act you will not be in time.

~~~
NamTaf
This is what I've now said 3 or so times in various autopilot threads. It has
to be an all or nothing thing. Part of responsible engineering is engineering
out the many and varied ways that humans can fuck it all up. Look at how UX
works in software. Good engineering eliminates users being able to do the
wrong thing as much as possible.

You don't design a feature that invites misuse and then use instructions to
try to prevent that misuse. That's irresponsible, bad engineering.

The hierarchy of hazard control [1] in fact puts administrative controls at
the 2nd-to-bottom, just above personal protective equipment. Elimination,
substitution and engineering controls all fall above it.

Guards on the trucks to stop cars going under are an engineering control and
also perhaps a substitution - you go from decapitation to driving into a wall
instead. It's better than no guards and just expecting drivers to be alert -
that's administration - but it's worse than elimination, which is what you
need if you provide a system where the driver is encouraged to be inattentive.

User alertness is a very fucking difficult problem to solve and an extremely
unreliable hazard control. Never rely on it, ever. That's what they're doing
here and it was only a matter of time before this happened. It's irresponsible
engineering.

edit: My source for the above: I work in rail. We battle with driver
inattention constantly because like autopilot, you don't steer but you do have
to be in control. I could write novels on the battles we've gone through just
to keep drivers paying attention.

[1]:
[https://en.wikipedia.org/wiki/Hierarchy_of_hazard_control](https://en.wikipedia.org/wiki/Hierarchy_of_hazard_control)

~~~
jacquesm
> I could write novels on the battles we've gone through just to keep drivers
> paying attention.

Please do, and link them here. I'd be very interested in reading about your
battles and I figure many others would too. This is where the cutting edge is
today and likely will be for years to come so your experience is extremely
valuable and has wide applicability.

~~~
NamTaf
Here's a comment from 5 months ago about one example - not me personally but
it's one of the major case studies in the AU rail industry - that covers
exactly this topic. It also sort of morphs into general discussion about
alertness tools in physical design.

[https://news.ycombinator.com/item?id=11017034](https://news.ycombinator.com/item?id=11017034)

------
daveguy
A direct reply from Elon Musk on twitter about why radar did not recognize the
white side of a trailer across the road when the camera missed it:

"Radar tunes out what looks like an overhead road sign to avoid false braking
events"

[https://twitter.com/elonmusk/status/748620607105839104](https://twitter.com/elonmusk/status/748620607105839104)

EDIT:

When the "overhead sign" comes down below the vehicle's overhead clearance,
the signal should not be masked. There should have been _some_ braking action
in this case. If there was not, then the Tesla autopilot is unsafe. This is
the same blind spot discussed a few months ago that caused a Tesla to run into
a parked trailer using summon mode:

[https://news.ycombinator.com/item?id=11677760](https://news.ycombinator.com/item?id=11677760)

This seems like a serious flaw in autopilot functionality. Trailers are not
that rare.

I would be interested if "autobrake/autofollow" functions of other car
companies have similar problems.
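
As a concrete illustration, here is a minimal sketch of what clearance-aware
masking might look like (Python; every name and constant is invented for
illustration and is not Tesla's actual logic): a return should only be
discarded as an "overhead sign" if its estimated height actually clears the
roof of the car.

    import math

    RADAR_HEIGHT_M = 0.5       # assumed low bumper mount
    VEHICLE_CLEARANCE_M = 1.6  # assumed roof height plus margin

    def safe_to_ignore(range_m, elevation_deg):
        """True only if the radar return would still clear the roof of
        the car, i.e. it really is an overhead structure."""
        height_m = RADAR_HEIGHT_M + range_m * math.tan(math.radians(elevation_deg))
        return height_m > VEHICLE_CLEARANCE_M

Under a rule like this, a sign gantry five meters up is ignored while a
trailer bed at roughly 1.2 m is not; if Musk's description is accurate, the
shipped filter did not make that distinction here.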

~~~
sologoub
It also seems strange that the rest of the tractor trailer was not picked up
by the radar when it was entering the road in the first place - the front part
that is pulling the trailer is typically lower and should not have been
mistaken for a road sign or otherwise.

~~~
daveguy
It may have. AI isn't quite to the level of understanding "object permanence"
yet. So, if you would let a 2-year-old drive, you should be fine with
autopilot. The more I think about this the more I think it will set self-
driving back 10 years.

------
spenvo
The top comment in another thread, which has been marked as a dupe[0], pointed
out how marking a feature in cars as "beta" is irresponsible.

What's beyond the pale IMO is that when auto-pilot was first demonstrated (at
the unveil event) - "hands on the wheel" was not part of the story.
Journalists and (what appeared to be) Tesla employees were using the feature
without hands on the wheel. It looked like Tesla cashing-in on the positive PR
without correctly framing the limitations of the tech.

Furthermore, Tesla includes sensors to map the entire surroundings of their
cars, but why can't they include sensors to ensure customers have hands on the
wheel? (update: comment says they do, but the check frequency is low. why
can't it be high?!) It's not just the driver's life at stake, it's everyone
else on the road--Tesla should disable this feature on cars [unless it
ensures] drivers' hands are on the wheel. Engineers/execs at other companies
taking a more responsible approach must be furious at the recklessness on
display. One death is too many.

Tesla Auto-pilot fail videos:
[https://www.youtube.com/results?search_query=tesla+autopilot...](https://www.youtube.com/results?search_query=tesla+autopilot+fail)

It's incredibly unfair to other drivers on the road to let someone else use
beta software that could cause a head-on-collision.

[0] -
[https://news.ycombinator.com/item?id=12011635](https://news.ycombinator.com/item?id=12011635)

~~~
fdsaaf
> they do, but the check frequency is low. why can't it be high?!

Because that makes it less useful. I am a Tesla owner. I am an adult capable
of monitoring the car and taking control when autopilot gets confused. My hand
being on the wheel at all times is neither a necessary nor a sufficient
condition for verifying that I am paying attention and am ready to take over
control.

> One death is too many.

I am sick and tired of absolutist statements about risk. Why do you allow cars
on the road at all? Why allow cars to have cupholders? Why are drive-through
restaurants legal? You make utility-risk trade-offs all the fucking time.

~~~
spenvo
> My hand being on the wheel at all times is neither a necessary nor a
> sufficient condition for verifying that I am paying attention and am ready
> to take over control.

Then you are not using the feature the way Tesla says you should, the only way
Tesla says it's safe to be used.

> I am sick and tired of absolutist statements about risk.

Recklessly rolling out tech is screwing the industry-at-large, given the
regulatory hurdles that must be overcome.

>> One death is too many.

You took this out of context. When Tesla makes no genuine, up-front attempt to
educate users on how to use Auto-Pilot--yes, one death is too many.

Correctly-deployed autonomous driving stands to save thousands of lives
annually; what's at risk is some overeager company @#!$ing the regulatory
efforts by being irresponsible at scale.

~~~
derefr
You're ignoring the other half of the quoted sentence.

> a necessary _nor a sufficient_ condition

As in, not only can you pay attention without having your hand on the wheel,
but you can also have your hand on the wheel without paying attention. Having
your hand on the wheel is "safety theatre"—something done because it makes you
_feel like_ you're in control, not something you do because it actually _puts_
you in control.

Cars could do a lot of things to _actually_ ensure the driver is in control. A
start would be administering a breathalyzer-like automated _attention_ test
(dual-N-back?) before the car will start, and then every few hours of
continuous operation after that. Tired drivers would be pulled over to the
side of the road and told to take a nap and resume in the morning. (This would
probably automatically cover drunk drivers as well, now that I think of it.)
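
For concreteness, a toy version of such a test might look like this (Python;
purely hypothetical, and a real implementation would have to be much harder
to game):

    import random

    def attention_check(n=2, trials=12, pass_rate=0.8):
        """Toy single N-back test: say whether the current letter matches
        the one shown n steps earlier. (A dual-N-back, as suggested above,
        runs two such streams at once; one stream keeps the sketch short.)"""
        stream, correct = [], 0
        for i in range(trials):
            letter = random.choice("ABCD")
            stream.append(letter)
            if i >= n:
                match = stream[i - n] == letter
                answer = input(f"{letter} - same as {n} back? [y/n] ") == "y"
                correct += (answer == match)
        return correct / (trials - n) >= pass_rate

The car would refuse to start, or pull over, whenever this returns False.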

~~~
caf
Some models of Volkswagen now include driver fatigue detection, based on
analysing steering inputs (and possibly other signals?). It just warns you to
pull over, though.

------
archagon
This is on the road to being off topic, but still relevant given some of the
commentary in this thread:

It makes me a bit sad that the political zeitgeist in the tech community is
leaning towards "acceptable losses" when it comes to accidents in automated
cars, to the point of pre-emptively expressing disdain at ordinary people
reacting negatively to such news. I sense it's going to become harder and
harder for us to talk about our worries and skepticism regarding automated
driving, since the louder voices claim it will all be worth it in the end.
Surely — _surely_ — you're on the side of _less death_? But personally, I find
the utilitarian perspective distasteful. We're perfectly happy to let
technology (literally) throw anonymous individuals under the bus as long as
fewer people die overall, but what if it's _you_ that gets hit by an auto? What
if it's someone you care about, not Anonymous Driver On TV? The point is that
humanity is not a herd to be taken as a whole; every life has rights,
including the right not to be trampled by algorithmic decisions or software
bugs for the betterment of all. (Sure, you could argue just as well that we
have the right not to be run over by drunk and otherwise negligent drivers,
but at least this kind of death is not methodical and has some legal
recourse.) I feel this perspective needs a strong voice in the tech community
too, to counter the blind push forward at the expense of human lives.

Now, this isn't necessarily what happened in _this_ case, but I find Tesla's
behavior in these kinds of situations to be creepy and self-serving, at best.
Is every death going to come with a blog post describing how much safer
automated features are compared to human drivers? Every auto-related casualty
is, and should be, a massive event, not a minus-one-point on some ledger in
Elon Musk's office.

~~~
ModernMech
> It makes me a bit sad that the political zeitgeist in the tech community is
> leaning towards "acceptable losses" when it comes to accidents in automated
> cars

Yes, this is another thing that makes me so angry about this situation.

Back in the day, I worked on a project to give autonomy to power wheelchair
systems. In order to get it to market, we had to go through FDA approval,
which involved testing the equipment in every conceivable scenario. We had to
show how it performed in rain, snow, dust storms, sunlight, etc. and it had to
be incredibly rigorous, defining all limitations, and how the user would be
impacted if the systems failed.

If we came to the FDA and said "Our system works in many scenarios, but we've
found a pathological case where on certain sunny days it will crash into
certain brightly colored objects, resulting in almost certain death for the
user," we would have been summarily rejected. Non-starter.

And yet we're willing to put a system with the above disclaimer (which should
have been known to Tesla engineers. This is sensors 101.) on the market with
absolutely no oversight? Seriously?

~~~
jfrisby
Except that the car wasn't autonomous here.

The human behind the wheel had been explicitly instructed to keep hands on the
wheel and feet on the pedals -- and that it was his responsibility to override
the machine's judgement if circumstances warranted.

~~~
rasz_pl
Yet all the PR material talks about autonomous driving; the car drives itself,
wink wink.

~~~
jfrisby
PR is PR -- and yes, the ultimate vision is to get to trustworthy un-assisted
self-driving.

And yet, before you can engage this mode the Tesla gives you a very stern
warning that it's NOT perfect, and you DO need to keep an eye on what
decisions it's making and you MUST keep your hands on the wheel.

We're not talking pages of obtuse legalese, we're talking about a single
screenful of clear, straightforward warning text.

Also, as a _licensed_ driver, you have obligations that you are expected to be
mindful of, not the least of which is _maintaining control of your vehicle at
all times_.

Are some people going to ignore it? Unfortunately, that's likely. But despite
that reality, Tesla's autopilot is apparently doing better in terms of
accidents per 100m miles driven than unassisted drivers.

------
kafkaesq
_This is the first known fatality in just over 130 million miles where
Autopilot was activated. Among all vehicles in the US, there is a fatality
every 94 million miles._

Given that Autopilot is generally activated under safer, more routine driving
scenarios -- decent weather, regular suburban traffic and so on; for which we
can naturally expect significantly lower fatality rates -- it doesn't sound
like a particularly good "batting average" so far. Especially
since we've been promised until now that self-driving cars will ultimately be
not just incrementally safer, but _categorically_ safer than human-piloted
vehicles.

~~~
aetherson
Also of note: Tesla often claims that their cars perform very well in an
accident. It may be that a crash that would result in a fatality in the median
vehicle on the road does not produce a fatality in a Tesla. So the salient
number is not "fatalities per vehicle mile among all vehicles in the US," it
is "fatalities per vehicle mile among non-autopilot-driven Teslas."

------
petercooper
_Neither Autopilot nor the driver noticed the white side of the tractor
trailer against a brightly lit sky,_

I'm intrigued that the color is relevant in the _car's_ case - wouldn't it be
using some sort of radar to detect and map objects rather than vision? I
appreciate I am probably missing something.

~~~
mikeash
Tesla's Autopilot uses a radar unit, a front-facing camera, and twelve
ultrasonic sensors on the front and back bumper. The ultrasonic sensors are
short range and are for detecting adjacent cars and obstacles while parking,
the camera is used for detecting lanes, and the camera and radar work together
to detect cars.

The radar is low to the ground and probably doesn't pick up a trailer that's
high off the ground. The camera could, but not if contrast is too low. (And
I'm not sure if the software is able to recognize the side of a trailer
anyway.)
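
A hedged sketch of why that combination can fail (invented names and
threshold; Tesla's actual fusion logic is not public): if braking requires
the camera and radar to agree, an obstacle that each sensor is individually
unsure about is never confirmed.

    def obstacle_confirmed(radar_hit, radar_flags_overhead, camera_contrast):
        """Hypothetical 'both sensors must agree' fusion. Requiring
        agreement suppresses false braking events, but a high trailer
        (radar: 'overhead, ignore') with a white side against a bright
        sky (camera: contrast too low) defeats both at once."""
        radar_ok = radar_hit and not radar_flags_overhead
        camera_ok = camera_contrast > 0.3  # invented threshold
        return radar_ok and camera_ok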

~~~
curiousgal
Would an infrared camera have picked up the trailer?

~~~
mikeash
Good question. I wouldn't be surprised if the existing hardware already
incorporated some infrared, but I really have no idea.

~~~
jdmichal
The sensitivity of most, if not all, digital camera sensors extends into the
infrared spectrum. I don't know if it's a _useful_ part of the spectrum for
this purpose, nor whether Tesla uses image processors that maintain the full
sensor data. Would definitely be an interesting thing to work with, from a
tech point of view.

------
Animats
" _Neither Autopilot nor the driver noticed the white side of the tractor
trailer against a brightly lit sky, so the brake was not applied._ "

Now that's blaming the user for a design flaw. There is _no_ excuse for
failing to detect the side of a semitrailer. Tesla has a radar. Did it fail,
was its data misprocessed, or is the field of view badly chosen? The single
radar sensor is mounted low on the front of the vehicle, and if it lacks
sufficient vertical range, it might be looking under a semitrailer. That's no
excuse; semitrailers are not exactly an uncommon sight on roads.

I used an Eaton VORAD radar in the 2005 Grand Challenge, and it would have
seen this. That's a radar from the 1990s.

I want to see the NTSB report. The NTSB hasn't posted anything yet.

~~~
neurotech1
NTSB would only investigate a major road transportation crash, unless
specifically requested. This mishap doesn't appear on their docket.[0]

Tesla is under a regulatory requirement to report all relevant mishaps to the
NHTSA, even if they occur outside the US.

[0]
[http://www.ntsb.gov/investigations/AccidentReports/Pages/hig...](http://www.ntsb.gov/investigations/AccidentReports/Pages/highway.aspx)

~~~
studentrob
> NTSB would only investigate a major road transportation crash, unless
> specifically requested. This mishap doesn't appear on their docket.[0]

It seems to be on their radar now,

[http://www.detroitnews.com/story/business/autos/2016/07/08/n...](http://www.detroitnews.com/story/business/autos/2016/07/08/ntsb-
launches-probe-fatal-autopilot-tesla-crash/86873328/)

------
simonsarris
> What we know is that the vehicle was on a divided highway with Autopilot
> engaged when a tractor trailer drove across the highway perpendicular to the
> Model S. Neither Autopilot nor the driver noticed the white side of the
> tractor trailer against a brightly lit sky, so the brake was not applied.

Tragic no doubt, but I'm relieved that this was not an "Autopilot did
something very very wrong" story.

Autopilot has the potential to save a large number of lives (I'm sure Tesla
execs are thinking about touting "estimated lives saved by autopilot" if the
numbers work out, after a few billion miles), so I hope incidents like this
don't hamper public perception and therefore research.

~~~
georgecmu
_Autopilot engaged when a tractor trailer drove across the highway
perpendicular to the Model S. Neither Autopilot nor the driver noticed the
white side of the tractor trailer against a brightly lit sky, so the brake was
not applied._

How is this not an "Autopilot did something very very wrong" story?

Tesla autopilot drove the vehicle full-speed into a tractor-trailer, which
wasn't registered by its sensor systems. It's great that Tesla already has an
explanation for the failure, but the situation as described is far from an
improbable edge case. What they are saying is that the autopilot can't detect
white objects when a bright light is present -- that's a pretty serious
limitation to overlook.

~~~
alkonaut
The autopilot doesn't drive the car, the driver does. The autopilot is a help
just like cruise control. Tesla are very clear (apart from the naming of the
feature) that the driver must still drive the car.

The autopilot will avoid obstacles it can detect and the driver is expected to
avoid others (since he has hands on the wheel and eyes on the road the whole
time).

I think the misleading thing about Tesla's Autopilot in its current state is
that they call it "Autopilot" while other manufacturers call this kind of
assist feature e.g. "Lane Assist".

A distance-sensing cruise control can't be trusted not to crash into the back
of the vehicle in front either, not least because it can't be sure it can
"see" it. I suspect motorcycles and other small vehicles may fool radars, just
as this case seems to have been about contrast and colors fooling a camera
sensor.

~~~
mdorazio
That's some PR firm-level explaining. First, autopilot is not just lane
assist, it's lane assist + adaptive cruise control + additional features (it
can change lanes if you tell it to). Other OEMs have purposely shied away from
enabling an autopilot-like feature set together, even though it's technically
possible, because they are absolutely terrified of something exactly like this
situation happening.

Second, you're completely ignoring that autopilot will very much entirely
drive the car by itself for minutes at a time (reports are anywhere from 5 to
15 depending on software version), which will obviously encourage drivers to
not pay attention to the road. I don't care about disclaimers, I don't care
about nag screens or chimes, and I definitely don't care about some warranty
text that flashes on the screen. If you make a car that can drive itself you
encourage drivers to let it do just that and you should be prepared for that
eventuality. Period.

~~~
Retra
This autopilot is very new and fairly experimental technology. You're
basically demanding that this not be true somehow.

~~~
ceejayoz
Very new and fairly experimental technology that can kill you tends to be
fairly heavily regulated. New drugs, rockets, aircraft, etc. don't generally
get released via software update to the general public. They go through very
regimented, restricted, contracted trials.

Tesla risks getting regulators to crack down on new car features across the
entire industry if they're not careful.

------
Animats
Here's the NHTSA investigation information. This is the beginning of the
process that leads to recalls.

INVESTIGATION Subject: Automatic vehicle control systems

    
    
        Date Investigation Opened: JUN 28, 2016
        Date Investigation Closed: Open
        NHTSA Action Number: PE16007
        Component(s): FORWARD COLLISION AVOIDANCE
        Manufacturer: Tesla Motors, Inc.
    

SUMMARY: ODI has identified, from information provided by Tesla and from other
sources, a report of a fatal highway crash involving a 2015 Tesla Model S
operating with automated driving systems ("Autopilot") activated. This
preliminary evaluation is being opened to examine the design and performance
of any automated driving systems in use at the time of the crash.[1]

INVESTIGATION RESUME

    
    
        Manufacturer: Tesla Motors, Inc.
        Products: 2015 Tesla Model S
        Population: 25,000 (Estimated)
    

Problem Description: A fatal highway crash involving a 2015 Tesla Model S
which, according to Tesla, was operating with automated driving systems
(“Autopilot”) engaged, calls for an examination of the design and performance
of any driving aids in use at the time of the crash.[2]

[1] [http://www-odi.nhtsa.dot.gov/owners/RecentInvestigations](http://www-
odi.nhtsa.dot.gov/owners/RecentInvestigations) [2] [http://www-
odi.nhtsa.dot.gov/acms/cs/jaxrs/download/doc/UCM5...](http://www-
odi.nhtsa.dot.gov/acms/cs/jaxrs/download/doc/UCM530776/INOA-PE16007-7080.PDF)

------
sverige
Bottom line: Lots of work to do if the car can't see a semi-trailer in front
of it. I would be willing to say that it's much more likely that the driver
would have noticed in time if Autopilot wasn't on. It's likely that most
average drivers will think so too, which pushes the acceptance of this
technology further into the future.

And Tesla's statement does nothing to alleviate these reasonable doubts about
putting your life and the lives of your family and friends in the hands of
automotive software engineers.

------
vkou
Is this the new motto for autonomous vehicles? Move fast and break bones?

With these minor bugs, Tesla seems to be doing a solid job of poisoning the
well for self-driving cars. I'd like to see them explain how their competitors
should not be tarred with the same brush, once the political backlash hits.

It doesn't matter how many disclaimers you give before you turn on autopilot -
a driver who was focused on driving the car (as opposed to letting autopilot
cruise) would probably have noticed a tractor driving across the road on a
bright, sunny day.

It's a dangerous system. Instead of arguing about the trolley problem, I'd
first like to see a car be good at making decisions that save its passengers.

~~~
krschultz
I'm personally very worried about the poisoning the well angle. Tesla is
playing fast and loose with this compared to Google and the results could
strangle the whole field.

------
mercurialshark
The loss is of Joshua D. Brown, 40, of Canton, Ohio, a former Navy SEAL and
technologist. Josh was a member of the Naval Special Warfare Development Group
(SEAL Team Six) prior to founding Nexu Innovations.

[http://www.legacy.com/obituaries/ohio/obituary.aspx?pid=1799...](http://www.legacy.com/obituaries/ohio/obituary.aspx?pid=179994314)

------
CobrastanJorji
> It is important to note that Autopilot is...still in a public beta phase...

No, it's not important to note that. You should not be able to hide behind the
word "Beta" for systems that could kill people. Either you're willing to let
people risk their lives on your software or you're not, and you were.

~~~
curiousgal
Judging from the description, the accident would've happened either way.

~~~
vkou
The accident probably wouldn't have happened if the driver was focused on
driving the car... as opposed to letting the car drive itself.

When the autopilot acts correctly 99.9% of the time, you get incredibly
complacent and inattentive. When you don't have an autopilot, you are paying
far more attention to the road.

------
johngalt
The thing about self driving cars is that every accident will have a wealth of
information regarding how it occurred and the decision making that went
into it. Once we understand and correct the problem, every subsequent car will
be safer. This is not the case with humans, who regularly fail to learn from
consequences to other humans.

Imagine if the first time someone fell asleep at the wheel and crashed, you
could just tell everyone "hey don't fall asleep at the wheel". And it just
never happened again.

------
mathattack
Tragic. Good that they own it, though I'm not thrilled with this:

 _It is important to note that Tesla disables Autopilot by default and
requires explicit acknowledgement that the system is new technology and still
in a public beta phase before it can be enabled. When drivers activate
Autopilot, the acknowledgment box explains, among other things, that Autopilot
“is an assist feature that requires you to keep your hands on the steering
wheel at all times,” and that “you need to maintain control and
responsibility for your vehicle” while using it. Additionally, every time that
Autopilot is engaged, the car reminds the driver to “Always keep your hands on
the wheel. Be prepared to take over at any time.” The system also makes
frequent checks to ensure that the driver's hands remain on the wheel and
provides visual and audible alerts if hands-on is not detected. It then
gradually slows down the car until hands-on is detected again._
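
Mechanically, the behavior described there amounts to a watchdog loop. A
minimal sketch, with every name and timing constant invented (Tesla does not
publish them); the check interval is exactly the knob questioned elsewhere in
this thread:

    import time

    def hands_on_watchdog(car, check_interval_s=5, grace_s=15):
        last_hands_on = time.monotonic()
        while car.autopilot_engaged():
            if car.hands_on_detected():
                last_hands_on = time.monotonic()
                car.clear_alerts()
            elif time.monotonic() - last_hands_on > grace_s:
                car.show_visual_alert()
                car.sound_audible_alert()
                car.reduce_speed_gradually()  # until hands-on returns
            time.sleep(check_interval_s)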

~~~
jackpirate
This statement is also disingenuous:

 _This is the first known fatality in just over 130 million miles where
Autopilot was activated. Among all vehicles in the US, there is a fatality
every 94 million miles. Worldwide, there is a fatality approximately every 60
million miles._

The wording implies that these numbers are directly comparable, but they are
not. I would guess that Autopilot is much more likely to be activated in safe
situations. But the other statistics include all situations.
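
The single-data-point problem can be made precise: with one observed fatality,
an exact Poisson 95% interval on the underlying rate is enormous. A quick
check (Python with scipy; the 130 million figure is from the post):

    from scipy.stats import chi2

    k, miles = 1, 130e6
    lo = chi2.ppf(0.025, 2 * k) / 2        # ~0.025 expected events
    hi = chi2.ppf(0.975, 2 * (k + 1)) / 2  # ~5.57 expected events
    print(miles / hi, miles / lo)  # one fatality per ~23M to ~5.1B miles

So even before the selection effect, the data are consistent with Autopilot
being several times worse, or dozens of times better, than the 94-million-mile
average.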

~~~
mozumder
Also, the US fatal accident rate is one death per 108 million miles, which
includes motorcycles and teens and drunk people.

Not only that, you'd expect drivers in the higher-income demographic of US
auto customers to have fewer fatal accidents.

The real comparison should be between a Model S and cars like the Mercedes
S-Class, since this is all about car marketing anyway.

~~~
gjem97
> Not only that, you'd expect drivers in the higher-income demographic of US
> auto customers to have fewer fatal accidents.

What? Why?

~~~
mozumder
Because they can afford cars with stronger safety features.

~~~
twinkletwinkle
They can also afford cars with stronger danger features, like turbochargers.

They also might be more sheltered from the consequences of their actions and
likely to be reckless while driving.

Your assumption may even be correct, but I don't think it's obvious.

~~~
mozumder
Rich people are also older, and older people have low accident rates.

In the end, the accident rate comparison to the global population is
completely wrong.

------
greybox
Entitled "A tragic loss" but reads like a massive disclaimer...

~~~
maxvu
I didn't read it like that. It felt like it was directly addressing the Tesla
owner spooked by the news.

Also, if I were the target of the PR shenanigans pulled on Tesla, I would want
to use precise language, too.

~~~
twinkletwinkle
"If our imperfect software kills you, too, rest assured that we will offer our
empty condolences to your family while simultaneously using technical language
to absolve ourselves of any possible fault"

Do you think that reassures Tesla owners?

------
davidiach
It seems the driver was Joshua Brown, the guy who posted the viral video of
how his Tesla avoided a crash on the highway some months ago. Really sad.
[http://www.theverge.com/2016/6/30/12072634/tesla-
autopilot-c...](http://www.theverge.com/2016/6/30/12072634/tesla-autopilot-
crash-autonomous-mode-viral-video)

------
tachim
"What we know is that the vehicle was on a divided highway with Autopilot
engaged when a tractor trailer drove across the highway perpendicular to the
Model S. Neither Autopilot nor the driver noticed the white side of the
tractor trailer against a brightly lit sky, so the brake was not applied."

That's really disingenuous. There's no way to know for certain what the driver
did or didn't see, but there are a lot more salient features for them to go by
than the white trailer, like the fact that the cab had driven across
the road or that the wheels of the trailer were square in the driver's lane. A
much more likely explanation is that the autopilot set expectations too high,
the driver was not paying attention (like in all the youtube videos of the
system in action), and the autopilot failed and crashed into the truck.

------
brianstorms
As a pre-autopilot S owner, I've been skeptical of autopilot since the day it
was announced. I think folks put too much trust in it, and edge cases, as
anyone who's developed software/hardware knows, are the gremlins that are
almost impossible to completely account for. Edge cases are the things that
have kept me from being excited about autopilot. Even 99.99999% reliability
means eventually you run into an edge case. Simple math.
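
The "simple math", spelled out (illustrative numbers, and treating each mile
as an independent trial is itself a simplification):

    p_fail = 1 - 0.9999999  # 1e-7 per-mile failure rate (illustrative)
    miles = 130e6           # the fleet mileage Tesla cites
    print(p_fail * miles)             # expected failures: 13.0
    print(1 - (1 - p_fail) ** miles)  # P(at least one failure): ~0.999998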

I also think that Tesla does not do anywhere near enough to educate brand-new
owners, especially the non-techie later-adopter owners that are starting to
buy Teslas in droves. They're turned loose with these cars, and there are
enough weirdnesses with the usability and user experience of driving and
operating the car that it catches up with people. Add to that the cognitive
distractions of 21st century life (esp mobile devices) and you have the
makings of new kinds of "normal accidents."

I suspect that some time soon it will be a requirement that either DMVs issue
new rules for having a license to operate a vehicle with autonomous
capability, or they'll require the manufacturer to certify that the buyer has
gone through a certified course of instruction and training offered by the
manufacturer or its designate.

It may well be that in this case, the victim, who was clearly a Tesla
enthusiast and knew a lot about autopilot, may have passed any such testing by
DMV or manufacturer with flying colors. Not the point. In general, as Tesla
and other manufacturers make autonomy more mainstream, we're going to see
these edge case situations more frequently.

Me, I view autopilot as a dangerous feature that I would never, ever trust. As
a set of incredible SENSORS that help me with second-by-second situational
awareness while driving, fine. As a replacement for my driving, hell no.

------
gordon_freeman
Why is nobody talking about whether it is safe to put beta software of such an
advanced autopilot system in the real world? I mean, they are giving people
the choice to turn on this Autopilot functionality as if it were just another
driver-assistance safety feature like 'Honda Sensing', but in reality it is a
much more advanced system, and it seems Tesla is doing this to use the data to
feed its upcoming autonomous technology. But at what cost!

UPDATE: Just read this news on NYTimes:

"The traffic safety agency is nearing the release of a new set of guidelines
and regulations regarding the testing of self-driving vehicles on public
roads. They were expected to be released in July."

Good and much needed step from NHTSA here.

~~~
fdsaaf
To exactly what standard do you hold beta features before release? For the
sake of the future of humanity, we can't get into a situation where nothing
new is allowed so long as someone, somewhere might get hurt.

~~~
ceejayoz
> To exactly what standard do you hold beta features before release?

With stuff that could easily kill you? If it were a drug, there'd be an
ethical review and progressively larger trials in carefully selected groups of
people receiving informed consent. Something like that might be more
appropriate than "your car just updated! enjoy!" beta software.

~~~
fdsaaf
First of all, given the number of miles Teslas have driven before this
incident, any study would have seen zero accidents. Second, drug studies are
_also_ too cautious, and many useful drugs are denied to the public on the
basis of rare and tolerable side effects. (For example, look at weight loss
drugs.)

~~~
ceejayoz
Zero _fatal_ accidents, but I suspect such a trial would've revealed a lot of
near misses and other "adverse events" with the autopilot. There are certainly
plenty of YouTube videos - I recall seeing one where it tries to jerk into
oncoming traffic.

I'd be interested in specific weight loss drug examples. There are a number of
them available on the market right now.

------
_vk_
This sucks, but it seems likely that we've had the first victim of autopilot-
induced inattentiveness.

From the deceased's viral video
([https://www.youtube.com/watch?v=9I5rraWJq6E](https://www.youtube.com/watch?v=9I5rraWJq6E)):

>I actually wasn't watching that direction and Tessy (the name of my car) was
on duty with autopilot engaged. I became aware of the danger when Tessy
alerted me with the "immediately take over" warning chime and the car swerving
to the right to avoid the side collision.

Combine this with the strange claim that Tesla put in their press release:

>Neither Autopilot nor the driver noticed the white side of the tractor
trailer against a brightly lit sky, so the brake was not applied.

It's hard to see how one can miss a tractor trailer "against a brightly lit
sky" when one is paying attention. The human visual system isn't that bad. A
likelier explanation is that he just wasn't looking. :(

------
TaylorGood
As most of you have already seen, Autopilot has detected similar situations:
[https://www.youtube.com/watch?v=9X-5fKzmy38](https://www.youtube.com/watch?v=9X-5fKzmy38)

~~~
ModernMech
I don't know if autopilot really "saved the day" there. Maybe for an
incompetent driver (which is most, so this is a net positive for safety).

But as a human driver, I would have taken human psychology into account and
guessed the turning car was going to take advantage of that gap, thus causing
me to approach much more cautiously.

I don't know how fast that tesla was going, but I don't think I'd be doing
more than 5-10 mph in that kind of traffic. I would also expect cars in the
right lane to unexpectedly switch into the left. It seems to me the Tesla was
overconfident and going about 25 mph.

edit: I feel like my tone here is too harsh (mainly because my personal
reaction is that this autopilot feature is professionally irresponsible). Let
me pull back my emotions a little and try again:

The only reason the autopilot feature here "saved the day" is because it was
driving in an irresponsible manner in the first place. As the comment below me
points out, it was driving at 45 MPH, compared to the 5-15 MPH you should be
driving. For a defensive human driver, there shouldn't have been a close call
in the first place.

The Tesla is driving irresponsibly because it cannot take into account human
behavior. It stopped in time in this scenario, but there's obviously a point
at which the stopped car could turn and the Tesla could not physically stop in
time. This can happen at any speed, but the 45 MPH collision is much much
worse than a 10 MPH collision.

I'd also like to point out that the autopilot was likely breaking the law. The
speed limit on that road is probably 55 - 65 MPH, but the speed limit indicates
the maximum speed _in perfect conditions_. When it's dark and raining and
heavy traffic, going the speed limit is in fact speeding. If the Tesla had hit
the turning car, the driver of the Tesla would probably have been cited for
speeding, even at 45 MPH.

~~~
sampo
There were lots of lights and reflections, and it was night, and the turning
car was black. Watching the video the first time, I didn't see the car before
it was already in front of the camera. Watching the video a second time and
knowing what to expect, I of course see the car in advance.

~~~
ModernMech
> There were lots of lights and reflections, and it was night

All those factors indicate the car was going too fast given the conditions --
autopilot or not. It appears that in general, the Tesla AP does not take road
and environmental conditions into account (or weigh them heavily at least). In
this case, there was no collision in spite of the conditions. But this should
have been a giant red flag for Tesla engineers. Unfortunately, the story at
hand tells us that they're not doing their job.

> Watching the video the first time, I didn't see the car before it was
> already in front of the camera.

I didn't either, not at first. But given just the first frame and knowing a
collision was about to be avoided, my first reaction was the autopilot was
going to prevent a rear-end due to a short stop, or it was going to stop as
someone merged in from the right lane.

But you and I are not the driver of this car. The driver should have known the
highway was not divided, should have known of the possibility for cross
traffic, and he should have adjusted his behavior accordingly. I guarantee you
there was also a sign (before the start of the video) indicating there is an
opportunity for cross traffic ahead. Given that, those white headlights (they
look reddish in the video, but in real life they would be bright white) should
caution the driver that a car is probably looking to turn.

The autopilot did not take any of this context into account, and that's
exactly why someone is dead today. It's operating with superhuman reaction
speed, but with the common sense and wisdom of a 10 year old.

------
UnoriginalGuy
Tesla sat on this for two months and then released it on the 30th of June.

Why is that date important? It is the end of a financial quarter in the US,
which runs from the 1st of April until today. It is also a holiday weekend.

So while, unsurprisingly, everyone here wants to defend Tesla's tone, it is
hard not to look at the date of this release with a lot of cynicism. They've
waited for a great day to bury bad news, and to hide any financial costs.

No doubt all the replies below will claim this is coincidence that Tesla
announced this at the end of a quarter on a holiday weekend late in the day.

------
camkego
The main issue is that once you take away 90% of the requirement for a human
to focus on the task at hand (driving), they are going to ignore the remaining
10% where an exception occurs.

------
ChuckMcM
Very sad. I once designed a robot that used infra-red collision detection to
avoid obstacles; my home-built stereo cabinet, painted flat black, registered
as open space and the robot drove right into it at full speed. It was a quick
lesson on sensor fallibility for me.

I've also observed human pilots on busy roads nearly colliding with obstacles
when driving toward the setting sun, the visor pulled down and still trying to
shade their eyes.

But robots aren't humans, and they don't have to rely only on vision; there
are so many ways to "look" that it seems like we should have several different
ways of identifying obstacles. Different spectra at least.

~~~
cs2818
I have had similar experiences working in robotics and I would imagine anyone
else who has worked with designing navigation, mapping, or obstacle avoidance
for robots would also be well aware of the shortcomings of each type of
sensor.

Even in the confines of indoor environments, with simple problems like
transparent surfaces, specific problematic fabrics, and acoustic panels, it is
necessary to fuse several types of sensor data to provide reasonably reliable
detection of the environment.

I would hope that as sensors and output analysis techniques continue to be
developed and costs decrease, creators of autonomous systems would incorporate
a wider variety of sensors into their products.

------
TheMagicHorsey
I think it's a bit ridiculous that the Tesla feature is called an autopilot
when it only covers a subset of circumstances a human could handle. I don't
see how a human being would not apply brakes when they see a truck
perpendicular across a highway ... it can't be missed. The guy was not paying
attention simply because the Tesla smoothly handles 95% of circumstances ...
something in the other 5% occurred and he relied on the machine because of the
smooth previous experience ... and died.

The robot led him to believe it was more competent than it really was.

------
blinkingled
I don't feel like Autopilot will ever be able to take over 100% of the time -
there are so many corner cases, so many unexpected/illegal maneuvers etc. to
deal with that at least in the initial stages it would be wise to accept those
limitations and not market it as Autopilot - instead make it DriverAssist or
something.

It should really only take over in situations the driver failed to handle -
asleep at wheel causing car to shift lanes when it is not safe - loud buzzer
and lane correction to stay in the lane, not seeing object in the front - auto
brake with buzz etc. Then it would be a whole different story if it failed -
the mistake was the driver's to begin with, even if driver assist could not
correct it. And it will still save a lot of lives, no doubt.

The whole idea that complex technology like this can guarantee a 100% safe and
reliable outcome 100% of the time is, I think, one hell of an overpromise that
will continue to cause people to become even more unaware and distracted than
your usual car driver already is. And as this shows, it will not always end
well.

Edit: A Volvo engineer had a similar opinion. From The Verge: "Some autonomous
driving experts have criticized Tesla for introducing the Autopilot feature,
with a Volvo engineer saying the system 'gives you the impression that it's
doing more than it is.'" That's exactly right.

------
suprgeek
Here is the meat of the Post:

"Neither Autopilot nor the driver noticed the white side of the tractor
trailer against a brightly lit sky, so the brake was not applied. The high
ride height of the trailer combined with its positioning across the road and
the extremely rare circumstances of the impact caused the Model S to pass
under the trailer, with the bottom of the trailer impacting the windshield of
the Model S"

1) The sensors did not "see" the trailer due to bright sunlight & <XYZ>

2) The collision was at the windshield level and above, but not at the front

Neither of these sounds like an "extremely rare" occurrence. That part is just
pure PR spin.

Request for TESLA - You are doing great things to move the whole industry
forward. Do not F* it up by being too aggressive here.

1) Do not rely on purely visual cues; use (multiple) radars/LiDARs/something
that works irrespective of visible light conditions - so an additional radar
unit on top that checks for obstacles at the height of the car would make
sense.

2) Improved Crash sensors

Till then, DISABLE the system except in very limited circumstances, even if
the owner tries to activate it. This is NOT a "Move Fast and Break Things"
situation. Some poor soul lost his life because Tesla's inability to foresee
this situation lulled him into a false sense of complacency (and partly his
own inattention).

------
chdir
If autonomous cars can lead to a significant reduction in road fatalities,
then aren't we headed in the right direction?

I was hoping that the discussion here would consider the outcome of autonomous
vs non-autonomous cars in totality instead of having an emotionally charged
outburst against a big corporation.

Yes, maybe this is not the right time for Tesla to go all defensive, but
that's expected & that's how almost any company would respond. Why waste all
the energy discussing only that?

Yes, people will get careless with autonomous cars, but don't they do it with
regular cars too ? Cellphone, coffee, makeup, etc. So I disagree that
autonomous cars have to be perfect.

An automobile is like a weapon in the driver's hand. If they aren't careful,
they can hurt someone including themselves. Autonomous systems are there to
help reduce the chances of mundane accidents due to the driver being
distracted or less than capable. They are not there to replace a human yet.

My point of view is that if a human can't avoid a collision, don't expect the
autonomous system to fill the gap. It's there to catch "some" of the errors
that a human might make; it's not a foolproof safety net.

~~~
userbinator
_Yes, people will get careless with autonomous cars, but don't they do it
with regular cars too? Cellphone, coffee, makeup, etc. So I disagree that
autonomous cars have to be perfect._

Yes they do, but trying to do something else while driving is additional
cognitive load --- they still have to be paying _some_ attention to driving,
whereas "it just drives itself" automation allows a complete diversion of
attention. In other words, the automation encourages carelessness.

~~~
chdir
> automation encourages carelessness

Is that a hypothesis, or are there statistics on accidents & fatalities to
back it up? Curious, not denying what you say. I don't have statistics either.

------
YeGoblynQueenne
So, reading this made me finally realise what I don't like about autonomous
cars.

tl;dr: stupid bugs can now kill you dead.

You know how AI systems, when they fail, they really, really fail? Like, when
you ask Google Image Search for "unprofessional hair" and it returns images of
black womens' hair because it doesn't have a clue? Or when you ask Google
Translate to translate "swallows" to Greek and it comes back with the Greek
work for "swallowing" (true story) because it has no idea what either word
means, just that they're somehow similar, ish?

That sort of thing. When AI fails, it fails in ways that nothing with a brain
will ever deign to fail. People don't crash into the sides of barns because
they mistook them for the open sky. People don't run over old ladies because
they thought they were flowerbeds.

Human drivers fail in situations where they don't have time to react - but
even given all the time in the world there are situations where a program will
simply derp, and derp and derp again until the end of time. And if this
happens in the middle of a life or death situation, people die. The same
happens with autopilots on planes.

So this is what, well, bugs me. I don't guess I get to choose how I die
(unless I do it myself) but I'm pretty sure I don't want to die because of a
dumb bug in a stupid machine. Therefore, I don't want no cars being "smart"
around me. I don't want to be the one driving them and I don't want others
driving them around me.

The current tech simply can't avoid completely brain-dead mistakes and no road
vehicle should be given autonomy until that changes.

------
Splines
All I can say is that I would not want to be a Tesla engineer on the Autopilot
feature.

And I know what you're thinking - it's not because I think some poor engineer
is going to get the axe (ok, maybe that's not what you're thinking). It's
because I would not want to be responsible for the code that needs to deliver
millions of people billions of miles in their car. It's a daunting
responsibility.

------
davidw
I'm not particularly a Tesla fan and have no skin in that game, but I think
people can both be defensive of the technology and genuinely, humanly sad
about the loss of life. I don't see this as being cynical.

------
rsp1984
_Neither Autopilot nor the driver noticed the white side of the tractor
trailer against a brightly lit sky, so the brake was not applied._

It appears to me like this is a case that shows the limitations of stereo
vision as the main sensor system guiding the Tesla autopilot. I would even
venture to say that it could have possibly been avoided using LiDAR
technology. According to various interviews however Elon Musk is not a big fan
of LiDAR [1]:

 _“I don’t think you need LIDAR. I think you can do this all with passive
optical and then with maybe one forward RADAR,” Musk said at a press
conference in October. “I think that completely solves it without the use of
LIDAR. I’m not a big fan of LIDAR, I don’t think it makes sense in this
context.”_

[1] [http://www.techinsider.io/difference-between-google-and-
tesl...](http://www.techinsider.io/difference-between-google-and-tesla-
driverless-cars-2015-12)

~~~
mandevil
LIDAR puts pretty severe limits on top speed. Basically, LIDAR has range
limitations which means it can't detect anything further than <mumble mumble>
away reliably (I last worked on a driving robot project a decade ago, forgot
the exact distance). But the range, versus human-comfortable braking distance,
works out to a top speed of roughly 30kph. Go faster and your car will detect
the obstacle too late to avoid hitting it. Since Autopilot is trying to solve
a problem at highway speeds (100kph+), I don't think it's a good fit for what
Musk is trying to do.
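
A back-of-envelope version of that calculation (Python; the range,
deceleration and reaction time are all assumed, since the exact figure is
forgotten above):

    import math

    a = 2.0        # m/s^2, comfortable braking (assumed)
    t_react = 1.0  # s, sensing/processing delay (assumed)
    R = 30.0       # m, assumed reliable LIDAR detection range

    # Stopping distance v*t_react + v^2/(2a) must fit within R;
    # solve the quadratic for the largest safe speed v.
    v = -a * t_react + math.sqrt((a * t_react) ** 2 + 2 * a * R)
    print(v * 3.6, "km/h")  # ~33 km/h, in the ballpark of the quoted 30kph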

Now whether what Musk is trying to do is a good idea is a different
question...

------
mxfh
It's a blind spot. Dozens of cyclists get mowed down by right-turning trucks
every year in Germany alone, and nobody cares about implementing a mandatory
technological solution (beyond upgrading mirrors and optional camera systems)
for that well-defined space and situation at relatively slow speeds, because
of the costs to the logistics companies.

At least with autonomous cars everybody still seems to care about
eliminating more or less rare errors altogether, because we seem less
forgiving of technological error than of human error. And adding fitting
sensors and detection algorithms to eliminate that blind spot is relatively
easy and cheap if the car already has a sensor network infrastructure.

I doubt that similar accidents with human drivers would even get their own
statistical category.

------
funkysquid
I don't think they should have said anything at all in this case, beyond their
condolences.

1. Autopilot didn't cause the accident, but they're sort of making it sound
like it did by bringing it up

2. If people are arguing that autopilot contributed by making the driver feel
like they didn't need to pay as much attention, the only way to defend against
that is to blame the driver - why would you do that?

3. It's sort of invading the privacy of someone who recently died by
detailing the crash, making it about Tesla. I don't believe other car
companies would issue a PR release if one of their cars got in a crash, even
with accident avoidance features on?

It somehow manages to feel self serving and make Tesla look bad at the same
time.

------
dikaiosune
I think it would have read a little better if the last paragraph was earlier.
It ends up sounding tawdry after the massive disclaimer.

------
danso
Worth pointing out that this crash happened on May 7:

[http://www.nytimes.com/2016/07/01/business/self-driving-
tesl...](http://www.nytimes.com/2016/07/01/business/self-driving-tesla-fatal-
crash-investigation.html)

It's a little bit odd that the press release's sole _specific_ reference to
time -- "We learned _yesterday_" -- is in its 3rd word...which, for me, made
me think the accident happened yesterday, or shortly before...not _two months_
before.

------
java-man
I could never understand why Tesla needed the auto-pilot feature. They have
too much on their plate already. What possible benefit could it bring to
justify adding such a high risk decision?

------
ChicagoBoy11
I wrote this on a comment thread 260 days ago when the autopilot features were
introduced.

>A common phrase in aircraft cockpits nowadays is "What the heck is it doing
now?" as pilots have migrated from actually flying the plane to simply being
glorified systems managers. While planes have become so, so, so much safer
because of all this automation, pilots uncertainty regarding autopilot
functioning is a major concern nowadays, and the reason for several accidents.
There are very interesting HCI challenges around properly communicating to the
pilot/driver "what the heck it is doing" and clearly communicating just how
much control the human has or doesn't have at any given point. This
"announcement" certainly doesn't inspire any confidence that they have really
thought this through deeply enough (I think they probably have, but it should
be communicated as such). As a huge Tesla fan, I can't help but feel like I
need to hold my breath now and make sure something terrible doesn't happen
because of this, and it ends up leading to even more regs setting us back on
the path to full car automation.

I really hate having been right. But from the ensuing discussion here about
the rightness or wrongness of labeling features "beta" in the car, to the
disconnect between how the technology was "meant" to be used and how it was
portrayed by the company itself, it is pretty clear they really fucked up.

The huge discrepancy between the cautionary tone of this press release and
everything that came before it is a great reminder that while some of us may
be really worried about preventing AI from having unintended consequences in
our society, there are at the same time many more pressing issues that we must
address that are very much caused by very human motivations and actions.

------
itg
The fact that Tesla will blame their users for anything that goes wrong makes
me avoid considering one. Along with all the hype they try to sell.

But what disappoints me the most is how the "tech" crowd will defend Tesla
every time. Someone lost their life because Tesla can't put out a decent
semi-autonomous driving system, and we are supposed to brush it aside?

~~~
rasz_pl
Works for Apple every time.

BTW the iPhone 6/6+ will probably get a recall in a couple of years (if the
sheep complain loudly enough) because, once again, there is a design defect
(like in every other Apple product: nvidia gpus, you're holding it wrong, no
cooling in laptops, moisture-sensitive parts positioned right next to ports,
etc). The Touch IC is not supported by a metal RF shield like in all previous
phones, and its solder joints crack from the normal flexing of the whole phone
body = grey bar on top of the LCD and dead touch.

------
JTenerife
The reason why most other manufacturers haven't released their autonomous
driving is not that their systems aren't as sophisticated as Tesla's, but that
all the systems out there aren't ready yet. Tesla is stellar in many ways, but
releasing their autopilot was way too risky.

------
niftich
Other premium automakers have automatic braking and pre-collision systems that
take effect regardless of whether the (enhanced) cruise control is engaged.

We have seen from other incidents that Tesla's autobraking will not engage
when autopilot is disabled (potentially by tapping the brakes), which is
contrary to widespread user expectation -- disclaimers notwithstanding.

However, in this particular case, it was the obstruction detection that
failed. Based on their blog post disclaimer, it appears to be a visual sensing
system; an eye-height radar-based system like those used by most other
manufacturers would have correctly detected the obstruction.

Is there an industry standard (yet) for the efficacy and behavior that these
systems must meet?

~~~
michalnaka
Which manufacturers use eye-height radar-based systems?

~~~
niftich
I stand corrected about the term 'eye-height' -- apparently no manufacturer I
could credibly find mounts their radar at eye height.

Mercedes-Benz uses two different radar sensors: "the DISTRONIC radar is
configured to monitor three lanes of a motorway to a range of up to 150 metres
with a spread of nine degrees, the new [DISTRONIC PLUS] 24-Gigahertz radar
registers the situation immediately ahead of the vehicle with a spread of 80
degrees and a range of 30 metres." [1]

Volvo has a combination of grill-mounted 15° FOV radar and a 48° FOV greyscale
camera behind the top of the windshield [2].

Volkswagen uses radar [3], but not sure where it's mounted. Honda's CMBS is a
radar in the grill [4]. Subaru uses two cameras mounted behind the top of the
windshield [5], and no radar, albeit radar is used with some of their other
safety systems.

[1] [http://www-nrd.nhtsa.dot.gov/pdf/esv/esv20/07-0103-O.pdf](http://www-
nrd.nhtsa.dot.gov/pdf/esv/esv20/07-0103-O.pdf) [2] [http://www-
nrd.nhtsa.dot.gov/pdf/esv/esv20/07-0450-o.pdf](http://www-
nrd.nhtsa.dot.gov/pdf/esv/esv20/07-0450-o.pdf) [3]
[http://en.volkswagen.com/en/innovation-and-
technology/techni...](http://en.volkswagen.com/en/innovation-and-
technology/technical-glossary/umfeldbeobachtungssystem_front_assist.html) [4]
[https://techinfo.honda.com/rjanisis/pubs/om/JA0606/JA0606O00...](https://techinfo.honda.com/rjanisis/pubs/om/JA0606/JA0606O00325A.pdf)
[5] [http://www.roadandtrack.com/new-cars/news/a6852/subaru-
camer...](http://www.roadandtrack.com/new-cars/news/a6852/subaru-camera-
controlled-cruise/)
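
As a rough sanity check on those quoted figures, the coverage width of a sensor cone at a given range is just w = 2 * r * tan(fov/2). A quick sketch (plain geometry only, not any manufacturer's actual coverage model):

    import math

    def beam_width(range_m, fov_deg):
        # Width of a symmetric sensor cone at a given distance.
        return 2 * range_m * math.tan(math.radians(fov_deg) / 2)

    print(beam_width(150, 9))   # long-range DISTRONIC: ~23.6 m at 150 m
    print(beam_width(30, 80))   # short-range 24 GHz radar: ~50.3 m at 30 m

Note that the long-range nine-degree cone doesn't span three motorway lanes (~11 m) until roughly 70 m out, which is presumably why a separate wide-angle short-range sensor covers the situation immediately ahead.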

------
alkonaut
After reading a lot of the comments here it's obvious that the best thing
Tesla could do for the safety of this feature is to simply rename it "Magic
Cruise Control" or "CoPilot" or something to that effect.

------
scythe
> Neither Autopilot nor the driver noticed the white side of the tractor
> trailer against a brightly lit sky,

So this got me wondering: are there areas where retinae still do better than
cameras at certain kinds of discrimination problems? Nature still has human
technology beat on hardware in a few categories (particularly joints), but I
didn't expect eyes to be one of them. When I look at or near a bright light, I
can see objects close to the light much better than I can in a photograph, but
I always assumed this was only because I had a cheap camera.

The corresponding question is: how expensive is a camera that would have
prevented this accident?

~~~
marvin
Cameras are obviously worse in situations where there is very high contrast
(e.g. when you are facing the sun), because the digital representation of
color/luminosity has a far narrower range than that of the eye. Uncertain
whether this has any practical consequences, but it seems likely.
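
To put rough numbers on that: dynamic range is usually quoted in stops, where each stop doubles the representable contrast, so the ratios diverge exponentially. A small sketch using commonly cited ballpark figures (assumptions, not measurements):

    # Each photographic "stop" doubles the representable luminance,
    # so contrast ratio = 2 ** stops.
    for label, stops in [("typical consumer camera sensor", 10),
                         ("high-end camera sensor", 14),
                         ("human eye, static scene", 14),
                         ("human eye, with adaptation", 20)]:
        print(f"{label}: ~{2 ** stops:,}:1")

A few extra stops is the difference between "white trailer against a bright sky" registering as two distinct luminance levels and both clipping to the same saturated value.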

------
fdsaaf
I'm really not looking forward to the moralistic haranguing that's going to
happen over the next few days. One commentator complained that the tech
community is adopting an "acceptable losses" posture. I don't see a problem
with this attitude -- anything else is unrealistic and is an argument against
technological progress.

If our ancestors had adopted the attitude that we could do nothing not
perfectly safe, we'd have never left the damn savanna. We'd have certainly
never allowed aircraft to fly.

------
United857
I couldn't find any other details about the accident, but it seems from the
description like it was a side collision at an intersection.

In order for that to happen, it's likely that one of the vehicles had to
violate a stop sign or red light.

Note that the Autopilot _isn't_ aware of these things (it won't stop unless a
car in front stops). If the Tesla was the vehicle at fault, then most likely
it was unawareness of this limitation on the part of the driver (or
complacency thinking there wouldn't be any traffic).

------
ebbv
I count myself as a big Tesla fan, but this announcement is too heavy on the
disclaimers and ass-covering and too light on empathy and sympathy.

------
dgudkov
It's a big mistake to allow autonomous driving at speeds that are potentially
fatal -- the technology is still not tested enough. There should've been a
good decade of autonomous driving at speeds of strictly no more than 10-20 mph
(15-30 kph), and only then a gradual increase. It was clearly a marketing rush
for sensational features to boost sales.

------
agumonkey
The situation is a bit confusing: did the tractor cross in a hurry just before
impact? Was the victim really blinded by the colors? Maybe it's just
vocabulary, but a tractor trailer seems huge and hard to ignore.

Even though I'm eager to see SDV benefits, Autopilot seems like way too much
hassle for such possible consequences. An overengineered auto cruise.

------
BoarAndBuck
Condolences to the family. Since he was "a friend to Tesla" there might not be
a lawsuit. If every Tesla owner is considered a friend, there might never be a
lawsuit. These things are bound to occur. Remember Morrison and "Keep your
eyes on the road...and your hands upon the wheel!"

------
ikeboy
[http://electrek.co/2015/12/22/man-dies-tesla-model-s-
crash-d...](http://electrek.co/2015/12/22/man-dies-tesla-model-s-crash-dump-
truck-first-death/)

Interesting: it says the Model S was rated the safest of any car tested.

------
mariust
I think that the radar is limited in the angles it scans: it should cover 180
degrees left to right and 180 degrees from the road surface to the top. I have
a hunch that right now it's just 180 left to right and somewhere around 30-45
degrees from road to top.
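
To illustrate that hunch with simple geometry: a low-mounted radar with a narrow vertical field of view can pass entirely under a high trailer floor at close range. A minimal sketch -- the mounting height and vertical FOV below are assumptions for the illustration, not Tesla's actual specs:

    import math

    def beam_top_height(dist_m, mount_height_m=0.5, half_vfov_deg=5.0):
        # Height above the road of the upper edge of a level,
        # forward-pointing beam at a given distance (flat road).
        return mount_height_m + dist_m * math.tan(math.radians(half_vfov_deg))

    # With a trailer floor ~1.2 m off the ground, the beam's upper
    # edge stays below it until roughly 8 m out:
    for d in (5, 10, 20):
        print(d, "m ->", round(beam_top_height(d), 2), "m")

By the time the trailer body is inside such a beam, there is very little stopping distance left, which is one plausible reason a crossing trailer is a hard case for a bumper-height radar.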

------
a3n
> It is important to note that Tesla disables Autopilot by default and
> requires explicit acknowledgement that the system is new technology and
> still in a public beta phase before it can be enabled.

The driver of the truck wasn't asked to sign that.

------
ascotan
Being an early adopter for something that can kill you seems quite risky.

~~~
JoeAltmaier
And yet, folks take restless-leg syndrome pills and Viagra without a second
thought. As a society we're OK with some risk.

------
yardie
Wow. I'm speechless.

I do hope they are able to learn more from this. This is just a bad
coincidence of environment meeting chance, and someone died because of it.

I do think computer vision still has a bit further to go. A white trailer
against a white sky shouldn't be a problem. But every day I'm impressed with
how human sight can find and lock onto the most obscure details. I've been
sun-blinded before and yet I knew something was in front of me, because I
noticed a grey shadow on grey asphalt and immediately hit the brakes.

We're almost there but not quite.

~~~
ModernMech
> I do hope they are able to learn more from this. This is just a bad
> coincidence of environment meeting chance, and someone died because of it.

Nope, this is completely expected when unproven technology is packaged as a
push-button feature and shipped in consumer products. I don't work for
Tesla, but I am a robotics engineer who has worked on exactly this technology,
and I saw this coming from a mile away. Tesla should have as well.

~~~
ejk314
Oh they know. It's just that the expected value added to their car sales, even
factoring in the negative PR and potential lawsuits, is >0.

------
gohrt
> The high ride height of the trailer

Is non-standard ride-height a safety failure? Should trailers have crash
panels to avoid drive-unders?

~~~
mdorazio
They already do in the back of the trailer where most collisions are likely to
take place (it's typically a welded set of cross-beams that go down to roughly
bumper height). I think too few collisions are of this side-on type to make
it worthwhile to add side panels as well. It's kind of one of those
"how safe is safe enough?" situations where you have to compromise at some
point.

~~~
mthoms
Elsewhere in this thread someone mentions that side rails on semis are
mandatory in Europe. Makes sense.

------
aab0
This sounds exactly like the previous Autopilot failure! A high-up truck where
the Tesla slides under and crashes.

------
hristov
I am not sure what happened in this situation, but I have to warn any HNers
that have teslas or might get behind the wheel of one -- do not rely on the
autopilot!!! If you are using the autopilot you must be in a heightened state
of alertness, not a lowered one. The autopilot can get you into a dicey
situation, and
then you have to take over control and react very fast.

~~~
DennisAleynikov
I hope you read the release and find out what happened. I look forward to a
thorough investigation into the sensors that failed in this case. I may be
wrong, but a majority of HN readers with Teslas understand that it's a
glorified cruise control that can stay within a lane up to a certain degree.
It's an autopilot in the same sense that autopilots exist for airplanes -- and
airliners do NOT fly WITHOUT PROFESSIONAL PILOTS. It might be that the bar
needs to be raised for driver's licenses for semi-autonomous vehicles.

------
Artlav
Somehow this reminds me of Dr. Daystrom defending the M-5 computer, back in
the original series.

~~~
spb
For reference: [http://memory-
alpha.wikia.com/wiki/The_Ultimate_Computer_(ep...](http://memory-
alpha.wikia.com/wiki/The_Ultimate_Computer_\(episode\))

------
robbiet480
It's all over CNBC right now. Tesla is down 2.58% in after-hours trading
after closing up 0.99%.

~~~
devy
Care to share a link? Google News doesn't turn up any CNBC results yet as of
now.

~~~
robbiet480
Was watching CNBC live, they devoted maybe 15-20 minutes to it.

------
vinceguidry
If Autopilot had been on the truck, the accident would have never happened.

------
BoarAndBuck
Sad

------
return0
Tesla should not allow autopilots. They have nothing to gain by the negative
publicity they will get from cases like these. Regardless of how unlucky the
driver was, Tesla can always be blamed for relaxing the driver's reflexes by
giving a false sense of safety. It doesn't matter if it was turned off by
default or hidden in menus.

What other companies use this kind of driving-assist system? The ones I know
of only hint to the driver, and disengage if the driver does not respond.

Got to be careful about these things. You don't put beta stuff in cars. It's
like your doctor running drug trials on you. They should figure out a way for
autopilot to engage the driver and keep his eyes on the road.

~~~
BookmarkSaver
>Tesla should not allow autopilots. They have nothing to gain by the negative
publicity they will get from cases like these.

This is literally the only way to develop such systems. And regardless of
incidents like this, these systems will be developed sometime in the fairly
near future and whichever company is on the leading edge of them will command
a massive new market that will probably start growing incredibly rapidly right
after it takes off.

~~~
mdorazio
This is just factually incorrect. Google is developing self-driving tech just
fine without releasing it to consumers as a "beta". So is Delphi. So,
presumably, is Uber now. Personally I am also of the opinion that releasing
autopilot in its current state (especially with the weak sensors) was not a
good move in terms of safety for drivers. Great move for PR and gaining data,
yes, but we can now make a very plausible argument that it has killed at least
one person.

~~~
Retra
And you think when Google, Delphi, and Uber start selling cars, everything is
going to work perfectly?

~~~
mdorazio
Of course not, but it will work a lot better than autopilot, with
significantly lower accident rates. No system will ever be perfect, but there
is a huge difference between privately testing something to the best of your
abilities before releasing it, and turning on a beta feature easily capable of
killing drivers via a software update protected by a disclaimer screen.

~~~
Retra
What reason do you have to believe that? What are you basing this on?

Just because something is in development longer doesn't mean the quality is
better. Furthermore, Tesla might be gaining some critical information with
their early release. And who's to say Tesla _didn't_ test this stuff to the
best of their abilities?

'Significantly lower accident rates' compared to what, exactly? The highly
accurate accident rates collected from the many decades of experience with
Tesla accidents that never happened?

~~~
mdorazio
I'm basing it on what those companies are actually doing right now in the real
world. Both companies have paid drivers babysitting their "driverless" cars on
real roads right now, and they will continue to test them this way, out of the
hands of the public, for years, gathering data in a safe way over millions of
miles driven. Because that's the way to gather data when real lives are at
stake. Both companies are also adamant about having LIDAR and additional
sensors in their vehicles that Tesla has decided aren't needed. Neither
company has or will release a version to the public until they actually think
it is safe to turn control over to the car entirely.

Now compare that to Tesla - beta software and sub-par sensors that can't
detect stationary objects in the road (there was a recent case where a Model S
on Autopilot hit a stalled car on the freeway despite plenty of time to stop),
can't detect obstacles at windshield level (there was a recent case where a
Model S hit a parked trailer when summoned), and apparently have trouble
distinguishing between sky and lightly colored objects in full daylight (the
most recent accident). This is a company that is prioritizing technology
releases ahead of human lives.

Significantly lower accident rates compared to Model S Autopilot, if that
wasn't obvious. In the future, you will take accidents per miles driven in
autonomous mode and compare between Tesla, Google, Apple, Uber, etc. and then
see which one has the lower rate. I am saying that Tesla will be the worst.

------
zallarak
No other auto maker would do this. Please correct me if I'm wrong, but this
makes me respect Tesla even more.

It's also a reminder to engineers to have an extremely high bar of excellence.

EDIT: thanks for telling me I'm wrong. This was a self-serving blog post on
their part to do damage control.

~~~
dvcc
This was entirely defensive and self-serving on behalf of Tesla. It's a
general PR release, companies do these all the time. What made this so
special?

~~~
fixermark
The fact that they could.

Imagine how many press releases there'd be per day if another auto company
felt compelled to do one for each accident their cars were involved in.

~~~
coralreef
This release only happened because Tesla knew it would make headlines, and
wanted to get out in front before a sensationalized story gets out first.

Other auto accidents are non-interesting to the public, but this one involved
automated driving.

~~~
dragontamer
Ehhh?

The Tesla suspension issue with non-disclosure agreements _also_ made front
pages, just a week or two ago.

[http://www.cnbc.com/2016/06/11/tesla-revises-
nondisclosure-c...](http://www.cnbc.com/2016/06/11/tesla-revises-
nondisclosure-clause-as-musk-accuses-customers-of-fraud-on-suspension-
claims.html)

Tesla's blog is their primary marketing / damage control arm. Anything they
publish there becomes instant news.

