
Tesla confirms “Autopilot” crash in Montana - kempbellt
http://www.ktvq.com/story/32427989/tesla-confirms-autopilot-crash-in-montana
======
Animats
With three accidents on "autopilot" in a short period, the defects in Tesla's
design are becoming clear.

1. Tesla's "hands on wheel" enforcement is much weaker than their
competitors'. BMW, Volvo, and Mercedes have similar systems, but after a few
seconds of hands-off-wheel, the vehicle will start to slow. Tesla allows
minutes of hands-off time; one customer reports driving 50 miles without
touching the wheel. Tesla is operating in what I've called the "deadly valley"
- enough automation that the driver can zone out, but not enough to stay out
of trouble.

The fundamental assumption that the driver can take over in an emergency may
be bogus. Google's head of automatic driving recently announced that they
tested with 140 drivers and rejected semi-automatic driving as unsafe. It
takes seconds, not milliseconds, for the driver to recover situational
awareness and take over.

2. Tesla's sensor suite is inadequate. They have one radar, at bumper height,
one camera at windshield-top height, and some sonar sensors useful only during
parking. Google's latest self-driving car has five 3D LIDAR scanners, plus
radars and cameras. Volvo has multiple radars, one at windshield height, plus
vision. A high-mounted radar would have prevented the collision with the
semitrailer, and also would have prevented the parking accident where a Tesla
in auto park hit beams projecting beyond the back of a truck.

Tesla is getting depth from motion vision, which is cheap but flaky. It cannot
range a uniform surface.
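
The "uniform surface" limitation is inherent to matching-based depth estimation: range comes from finding where a patch from one frame reappears in the next, and a textureless patch matches everywhere equally well, so the disparity (and hence the distance) is undefined. A toy sketch of the ambiguity (purely illustrative, not any real vision pipeline):

```python
# Toy illustration of why depth-from-motion fails on uniform surfaces:
# patch matching has no unique minimum when the patch is textureless.

def best_match_offsets(patch, row):
    """Return every offset in `row` that minimizes the
    sum-of-absolute-differences against `patch`."""
    n = len(patch)
    scores = {}
    for off in range(len(row) - n + 1):
        scores[off] = sum(abs(patch[i] - row[off + i]) for i in range(n))
    best = min(scores.values())
    return [off for off, s in scores.items() if s == best]

textured = [10, 80, 30, 200, 5, 90]   # distinctive pattern
uniform  = [50, 50, 50, 50, 50, 50]   # blank wall

# A textured patch matches at exactly one offset:
#   best_match_offsets(textured[:3], textured) -> [0]
# A uniform patch matches everywhere, so range is ambiguous:
#   best_match_offsets(uniform[:3], uniform)   -> [0, 1, 2, 3]
```

With no unique match there is no disparity to triangulate, which is why a featureless truck side or a washed-out sky can defeat a vision-only ranging system.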

3. Tesla's autopilot behavior after a crash is terrible. It doesn't stop. In
the semitrailer crash, the vehicle continued under power for several hundred
feet after the trailer sheared off the top of the Tesla as it drove
underneath. Only when it hit a telephone pole did the car stop.

The Pennsylvania Turnpike crash is disturbing, because it's the case the
"autopilot" is supposed to handle - a divided limited-access highway under
good conditions. The vehicle hit a guard rail on the side of the road. That
may have been a system failure. Too soon to tell.

The NTSB, the air crash investigation people, have a team investigating Tesla.
They're not an enforcement agency; they do intensive technical analysis.
Tesla's design decisions are about to go under the microscope used on air
crashes.

Tesla's spin control is backfiring. They tried to blame the driver. They're
being sued by the family of the dead driver, and being investigated by the
NHTSA (the recall people), and the NTSB. In comparison, when a Google self-
driving car bumped a bus at 2mph, Google admitted fault, took the blame, and
Urmson gave a talk at SXSW showing the data from the sensors and discussing
how the self-driving car misjudged the likely behavior of the bus.

~~~
toomanythings4
>Tesla allows minutes of hands-off time; one customer reports driving 50 miles
without touching the wheel.

>They tried to blame the driver.

In this incident, the driver was using autopilot in a fashion it should not be
used: a twisting road at high speed. The driver IS at fault.

>They're being sued by the family of the dead driver

This is no proof of anything. You can find a lawyer to sue anybody over
anything.

>and being investigated by the NHTSA (the recall people), and the NTSB

Well, of course they are. They're supposed to. Again, no proof of anything
Tesla did wrong.

I'm not saying there isn't a problem with this. I'm saying your reasoning is
wrong.

I'm also curious as to how many autopilot sessions take place every day. If
there have been three crashes like this one, where the driver is at fault, out
of a million sessions, then that's one thing not considered so far.

~~~
TetOn
>In this incident, the driver was using autopilot in a fashion that it should
not be; twisting road at high speed. The driver IS at fault.

I disagree on this point. In any system that's supposed to be sophisticated
enough to drive the car but also carries a giant caveat like "except in
these common driving situations..." failure is not instantly down to "user
error." Such a system should gracefully refuse to take over if it's not
running on an interstate or another "simple enough" situation; otherwise, as
many have noted, the threshold for "is the driver paying sufficient attention"
should be set much, much lower.

That the system is lulling users into bad decisions is not automatically the
fault of said users. Some blame, maybe most of the blame, has to fall on the
autopilot implementation. When lives are on the line, design should err on the
side of overly restrictive, not "we trust the user will know when not to use
this feature and if they are wrong it is their fault."

~~~
Vraxx
It's not lulling anybody into anything! The guy was given multiple warnings by
the car that what he was doing was unsafe, the last of which was extremely
pertinent to the crash. To quote the article: "As road conditions became
increasingly uncertain, the vehicle _again_ alerted the driver to put his
hands on the wheel". This is after the initial alerts telling him that he
needed to remain vigilant while using this feature. This isn't some burden of
knowledge about when to use the system and when not to; it's plenty capable of
knowing when it's not operating at peak performance, and it lets you know. At
_that_ point, I'm willing to shift blame to the driver.

~~~
TetOn
I just want to see "the vehicle again alerted the driver to put his hands on
the wheel" followed by "and Autopilot then pulled to the side of the road,
turned on the hazard lights, and _required_ the driver to take over."

~~~
foolfoolz
so far we have been lucky the accidents have only impacted the driver. is it
going to take the first accident where an autopilot car kills another innocent
driver before people see the problem?

~~~
Nadya
When autopilot kills more innocent drivers than human drivers do, you can
point me to a problem. Time will tell if it is better or worse, and the track
record up until May/June was going pretty well.

I'd rather 10 innocent people die to freak autopilot incidents than 1,000
innocent people die because people in general are terrible at operating a
2,000-3,500lb death machine. Especially because everyone thinks of themselves
as a good driver. It's fairly obvious not everyone is a good driver - and that
there are more bad drivers than good.

Maybe I only see things that way because there have been four deaths in my
family due to unaware drivers which could have easily been avoided by self-
driving vehicles. All of them happened during the daytime, at speeds slower
than 40mph, and with direct line of sight. Something sensors would have
detected and braked to avoid.

------
aresant
I wrote this on a previously buried thread:

Naming their system "autopilot" goes beyond the marketing dept overstepping
and into the territory of irresponsibility.

The "average" definition of autopilot per Wikipedia is "autopilot is a system
used. . . WITHOUT CONSTANT 'HANDS-ON' control by a human operator being
required."

And yet Tesla's disclaimer states:

"Tesla requires drivers to remain engaged and aware when Autosteer is enabled.
Drivers must keep their hands on the steering wheel."(2)

The disconnect between the name and the actual function is needlessly
confusing and probably irresponsible.

They should change the name to "driver assist" or something more accurate,
given that misinterpreting the system's function can lead to the death of the
driver or others.

(1)
[https://en.wikipedia.org/wiki/Autopilot](https://en.wikipedia.org/wiki/Autopilot)

(2)
[https://www.teslamotors.com/presskit/autopilot](https://www.teslamotors.com/presskit/autopilot)

~~~
staunch
If it's a good enough name for airplanes, it's a good enough name for cars.
Both require users to be ready to take control at any moment.

~~~
witty_username
Pilots (require certification and extensive training) are different from car
drivers.

~~~
wvenable
Flying a plane is more difficult and therefore requires more training. But
driving a car also requires licensing (certification) and training.

~~~
darpa_escapee
I had to pass a 20 question multiple choice test, drive at 15mph on a short
course and parallel park to get my license at 17 years old. That was my only
certification process. Unless laws change, I will never have to go through a
training or certification process again in my life.

Comparing pilot certification to a driver's is disingenuous.

~~~
wvenable
Thankfully, it's not that easy in all countries. For my test, 3 failures in
any category was an instant fail. It's fairly hard to pass the first time.

Although if US drivers are so untrained, perhaps auto-pilot is even more
relevant.

------
tmptmp
It's sad. I am a big fan of Tesla and had/still have very high hopes for them,
but they're fading. It seems Tesla is hell-bent on committing suicide. This
way, they don't need the fossil-fuel car lobby to destroy them. BMW,
Volkswagen, and the other _dirty-gas-guzzler-producers_ would be chuckling
now.

Why the hell are they so bullish on auto-pilot? I cannot see any sense in it.
They should concentrate more on the battery, charging-station, and
efficiency/performance parts of the equation. Instead, I am seeing more effort
being thrown, aggressively, into not-so-relevant areas.

Semi-automatic auto-pilot is an idea that even Google rejected. Tesla should
learn fast and get back to its core business: battery-powered electric cars,
period - not battery-powered _automatic_ electric cars.

I'd be happy if they prove this wrong.

edit: many typos and grammatical mistakes, sorry.

~~~
return0
> The BMW, Volkswgen and the likes of dirty-gas-guzzlers-producers would be
> chuckling now.

Why? Autonomous cars =/= electric cars

~~~
tmptmp
With Tesla in the picture, there is a lot of pressure on BMW, Volkswagen, etc.
to invest in EV technology. So they would be chuckling now because, with Tesla
out of the competition (or with Tesla becoming weaker), they would have an
easier time pushing their _dirty-gas-guzzlers_ and reaping loads of _easy
profits_.

With Tesla out, they can even squash electric car efforts altogether, and the
money thus "saved" - which they would otherwise have to invest in EV research
- will be plundered as _fat bonuses_ for the higher-ups (Volkswagen style).

I may seem to be exaggerating, but after the recent Volkswagen _pollution
scandal_, many will agree with this. Tesla is better, Elon Musk is better in
terms of innovation, and their direction seems more promising for the public
good. They are disrupting the fossil fuel giants, who are hell-bent on
garnering as much money as they can at whatever cost.

I know I may (turn out to) be foolishly naive, believing too much in Tesla and
Musk, but I find myself helpless here.

~~~
return0
> Tesla is better, Elon Musk is better

Perhaps you live in a bubble; most car makers have barely raised an eyebrow at
Tesla as a competitor so far. Making ton-loads of batteries is not necessarily
better than making efficient, safe, and less-polluting "gas guzzlers".

From the outside, it seems there is a push to revive the American car industry
through EVs, supported by lenient US regulators eager to see results. Of
course, time will tell, and in the process hopefully we will all get safer and
more eco-friendly cars.

------
a3n
I don't understand why Tesla is allowed to release a safety critical system on
to the roads when it requires buyers to acknowledge its not-finishedness.
They're putting beta systems into an uncontrolled environment. No driver in
the nearby lanes has been given that choice.

~~~
Draiken
Am I the only one that thinks people should be held responsible for their own
choices?

This argument is completely bogus. No driver in the nearby lanes is aware that
another driver is drunk, a bad driver, or emotionally unstable.

The entire idea of beta testing is to have your system in an "uncontrolled
environment". The system has bugs, as any beta system does, and whoever uses
it should be fully responsible for it. Yes, fully responsible.

Tell me, what's the difference between using a beta system and not caring
(leaving hands off the wheel, etc.) and drunk driving? In both situations the
driver is choosing to put everyone at risk, nobody knows about it until after
an incident, and if there is an incident, whose fault is that? The beer
company? The car company?

If the driver assistance made cars crash because it ignored a command from the
driver or something like that, then sure... it's 100% Tesla's fault. But
clearly, in all of these accidents, the drivers were careless in the extreme,
leaving their hands off the wheel for several minutes.

We can discuss if they should try to improve how they handle people that
remove their hands from the wheel, but that doesn't change the fact that
ultimately the choice to be careless was made by the driver, not the car
manufacturer.

~~~
gkya
"Tell me what's the difference between using a beta system and not caring
(leaving hands off the wheel, etc) and drunk driving?"

The latter is a clearly defined criminal offence, whereas the former is
currently unclear. The driver is certainly liable for reckless driving, but it
is also a fact that the marketing seems to have suggested the feature was
usable here.

~~~
Draiken
And does that remove the driver's responsibility?

~~~
UnoriginalGuy
Does it remove Tesla's responsibility?

You're literally now comparing Autopilot to criminal offenses and asking if it
should be allowed; it seems like you're suggesting Autopilot should be
illegal.

Sure, drivers suck, this is a known bug in humans. I'd like to see Autopilot
changed to counteract the human limitation rather than acting like humans
should be perfect for a system to work.

This is how you get into bad UI design ("it is NOT my UI that sucks, the users
just don't understand it! Read the manual!"). Except in this case people can
literally die.

~~~
gkya
What's criminal is abuse of the autopilot, not the autopilot itself. Just like
with alcohol: drink whatever you like, just don't get behind the wheel.

------
taneq
> "As road conditions became increasingly uncertain, the vehicle again alerted
> the driver to put his hands on the wheel," said Tesla. "He did not do so and
> shortly thereafter the vehicle collided with a post on the edge of the
> roadway."

If the autopilot's confidence was dropping that rapidly, why did it not slow
the vehicle? If it hits zero confidence it should halt completely until the
driver restarts the car in manual mode, and possibly disable itself until
checked out.
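
A graceful-degradation policy along those lines could be sketched as follows. This is purely illustrative: the confidence thresholds and the `Vehicle` interface are invented for the sketch, not anything Tesla ships.

```python
# Sketch of a confidence-based fallback policy for a driver-assist system.
# All thresholds and the Vehicle interface methods are hypothetical.

def autopilot_step(vehicle, confidence, hands_on_wheel):
    """Degrade gracefully as lane-tracking confidence drops."""
    if confidence >= 0.9:
        return "cruise"                    # normal operation
    if confidence >= 0.6:
        vehicle.alert_driver()             # warn early, while still safe
        vehicle.reduce_speed(factor=0.8)   # shed speed before trouble
        return "degraded"
    # Confidence too low: hand back control, or stop if the driver
    # isn't engaged.
    if hands_on_wheel:
        vehicle.disengage_autopilot()
        return "manual"
    vehicle.hazard_lights_on()
    vehicle.controlled_stop()              # halt until restarted in manual mode
    return "halted"
```

The key design choice is that speed comes off as confidence falls, rather than the system carrying full speed until the moment it gives up.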

~~~
readams
I think you'll find suddenly braking in the middle of the road poses its own
dangers.

~~~
Matthias247
The "Autopilot" (aka Lane Assist) in my VW does that. If you don't get your
hands on the wheel after it asks you to (within about 10-15 s), and after a
first warning, it will hit the brakes hard for a fraction of a second.

So it's not actually stopping, but it gives you a clear signal that you should
do something, and it would wake you up if you fell asleep. I don't know if it
will come to a stop if you also ignore this warning, because that's not
something you want to test with other participants on the road behind you.

I think that's the important difference: this system actively discourages the
driver from fully relying on it, while Tesla's implementation does not seem
to.
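
The escalation described above can be modeled as a simple timer-driven policy. A sketch, with timing constants taken loosely from the description (the 10-15 s figure) and everything else invented:

```python
# Sketch of an escalating hands-off-wheel warning sequence, loosely
# modeled on the VW lane-assist behavior described above.
# Timing constants and action names are hypothetical.

WARN_AFTER = 12.0        # seconds hands-off before the first warning (~10-15 s)
BRAKE_JOLT_AFTER = 17.0  # brief brake pulse if the warning is ignored

def warning_action(hands_off_seconds):
    """Map continuous hands-off time to an escalating response."""
    if hands_off_seconds < WARN_AFTER:
        return "none"
    if hands_off_seconds < BRAKE_JOLT_AFTER:
        return "visual_and_audible_warning"
    # A hard, fraction-of-a-second brake pulse to rouse the driver.
    return "brake_pulse"
```

The point of the escalation is that ignoring the system gets physically uncomfortable within seconds, which is exactly what discourages full reliance on it.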

~~~
danso
Apologies for not googling this myself: what mechanisms do autopilot programs
have for detecting traffic conditions behind them, in terms of
camera/radar/algorithms? Are they less robust than the mechanisms governing
detection in front of the vehicle?

~~~
Matthias247
I guess the current lane-assist and adaptive cruise control systems don't do
anything at all. Most cars only have ultrasonic parking sensors in the back
and perhaps a parking camera, neither of which has any range. They probably
rely on the fact that vehicles behind them should keep a normal safety
distance, and that following cars are responsible for avoiding rear-end
collisions.

------
NamTaf
This is going to keep happening until Tesla change their approach from a move-
fast-and-break-things startup philosophy to a sound engineering one that puts
safety-in-design front-and-centre.

Traditional real-world engineers have been drilled in this for longer than
living memory. When the software guys bring their software approaches to the
real world, this is what happens.

From 12 days ago:
[https://news.ycombinator.com/item?id=12013713](https://news.ycombinator.com/item?id=12013713)

~~~
dayaz36
You do know Tesla has the highest safety rating of any car ever rated in
history right? [https://www.teslamotors.com/blog/tesla-model-s-achieves-
best...](https://www.teslamotors.com/blog/tesla-model-s-achieves-best-safety-
rating-any-car-ever-tested)

~~~
mattlondon
Er - that is wrong.

2014 Model S got 5-star/82%[0] (where 100% is best) adult occupant protection
on EuroNCAP. For comparison, the smaller, lighter, cheaper 2015 Honda Jazz got
5-star/93%[1]

Other cars are significantly better than this [2]. E.g. 2016 Alfa Romeo Giulia
5-star/98%, 2016 Toyota Prius 5-star/92%, 2015 Volvo XC90 5-star/97%, 2014
Porsche Macan 5-star/88%. I could go on, but the Model S is not the safest car
ever tested - it has approximately the same level of crashworthiness as the
2014 Smart Fortwo (4-star/82%) [3] according to EuroNCAP when looking at adult
occupant safety.

Cars have come a long way in the past decade or two, though - it's amazing to
see "small" cars like the Jazz that don't even get a cracked windscreen when
basically driving into a wall at 40mph. For comparison, check out the 2000
Citroen Saxo folding up like a tin can... [http://www.euroncap.com/en/ratings-
rewards/latest-safety-rat...](http://www.euroncap.com/en/ratings-
rewards/latest-safety-ratings/en/results/citro%C3%ABn/saxo/15511)

[0] -
[http://www.euroncap.com/en/results/tesla/model-s/7897](http://www.euroncap.com/en/results/tesla/model-s/7897)

[1] -
[http://www.euroncap.com/en/results/honda/jazz/21484](http://www.euroncap.com/en/results/honda/jazz/21484)

[2] - [http://www.euroncap.com/](http://www.euroncap.com/)

[3] -
[http://www.euroncap.com/en/results/smart/fortwo/7894](http://www.euroncap.com/en/results/smart/fortwo/7894)

~~~
icc97
The differences between the Tesla and the Jazz are a bit strange. The main
difference in score between the two appears to be the side impact from a pole
- which is all 'good' for the Jazz, but 'weak' for the Tesla in the body
section. But if you look at the videos of the two cars after the pole crash,
the Tesla is almost completely intact (29 kph) [1] and the Jazz is half
destroyed (32 kph) [2]. The Jazz impact was 10% faster - but the difference in
damage is remarkable.

It's also worth noting that whereas the child rating is higher for the Jazz,
the actual 'Crash Test Performance' was rated Good for the Tesla but only
Adequate for the Jazz. The lower test score for the Tesla is because it
doesn't have Isofix installed.

    
    
[1]: https://youtu.be/7uhhYKJWEBw?t=1m4s
[2]: https://youtu.be/PHptsZglju8?t=2m18s

~~~
tdaltonc
The point of a crash test is not to show that the vehicle is super rigid and
impervious to damage. It is to show that a human inside the car is safe. To
that end, the car _should_ "crumple." It shouldn't remain 100% rigid, nor
should it fall apart like tissue paper. It should absorb as much kinetic
energy as possible so that the human doesn't have to.

The Jazz is doing the right thing.

------
the_common_man
Nobody reads manuals and follows instructions. I feel Tesla is going to learn
this lesson the hard way.

From the article:

"We specifically advise against its use at high speeds on undivided roads," it
said. Tesla states clearly in its owner's manual that drivers should stay
alert and keep their hands on the wheel to avoid accidents when the Autopilot
feature is engaged.

~~~
Bromskloss
> drivers should stay alert and keep their hands on the wheel

What is then the point of having an autopilot?

~~~
CaptSpify
And this is why I don't think they should call it "autopilot". The name is
misleading, imo

~~~
coredog64
How about "copilot"?

~~~
kirrent
CoPilot? A copilot is someone who can completely take over for you when you're
feeling tired, as opposed to an autopilot, which requires pilot or skipper
supervision. I think "copilot" would be a far worse term, were it not for the
fact that the name doesn't really matter too much.

------
meddlepal
Tesla needs (or is going to be forced) to issue a recall soon. Putting "beta"
technology in a car isn't OK when it puts drivers and others on the road in
danger. This is especially problematic because people have a preconceived
notion of what "auto pilot" means, and just because Tesla says it's not ready
for prime time and that you should therefore "keep your hands on the wheel"
doesn't make it OK to put into production.

~~~
michaelmcmillan
I thought the entire autopilot could be replaced with an over-the-air software
update?

~~~
_Codemonkeyism
See the comments above: if the sensor suite is insufficient, over-the-air
software updates won't help.

~~~
azernik
But OTA updates can disable the feature, which is the essential safety
response right now. Even if they _wanted_ to recall and add sensors, I don't
think that would make this system safe enough - the bigger issue is the human
factor involved in a not-quite-good-enough autonomous vehicle.

~~~
_Codemonkeyism
With the PS3/Linux decision, I'm not sure they can just 'disable' the feature.

~~~
azernik
Not sure exactly what you're referring to, but if it involves hacking your
Tesla to re-enable it:

a) that will vastly reduce the level of usage, and b) the OTA update could
very well completely remove the code from storage, making hacking it a much
more involved process than just enabling a flag.

~~~
haikuginger
Sony is settling a class-action suit with people who bought PS3s while relying
on the "Other OS" feature. So, it's not about technical ability to re-enable
the Autopilot feature; it's about being liable to their customers who relied
on the feature being present when they shelled out $70k+.

~~~
azernik
Ah. Indeed. I suspect that if they do disable this, though, it'll be my NTSB
mandate, so they won't have much of a choice - similar to the VW thing. They
might _then_ have to deal with the resulting lawsuits, but I can see many
scenarios where they don't have a chance to avoid them. Tesla _did_ , after
all, sell a product with an advertised feature that isn't ready for prime-
time.

------
agumonkey
Musk is losing his mind. Blaming the user won't work for long with life-
critical systems. Unless he expects backing from the NRA, putting complex and
dangerous devices that can't escape danger themselves into the hands of the
average joe will create incidents. And even if he's legally not to blame, I
don't think he's going to sell cars for long with such images in the public
mind.

I hope he's not putting too much hubris into the AutoPilot thing and trying to
push it at any cost. Soon there'll be so many corner cases, AP will be "best
used while parked with nobody inside or outside the car, preferably with the
engine stopped too".

The reason I liked Tesla/Musk before is that he pushed quality and safety
unconditionally. I'm starting to think it was mostly a nerd honeymoon phase.
Either revamp the logic (sensor recall and software) or disable it.

------
pluma
> "As road conditions became increasingly uncertain, the vehicle again alerted
> the driver to put his hands on the wheel," said Tesla. "He did not do so and
> shortly thereafter the vehicle collided with a post on the edge of the
> roadway."

This is the real danger of autopilot: not that it is unreliable, but that
humans are lazy and assume rare events (like dangerous situations that result
in a crash without intervention) won't happen to them.

I'm not sure autopilot is a good middle ground between assistance and full
automation exactly because of what these accidents show. It basically makes
the typical ride so uneventful and mind-numbingly boring drivers stop paying
attention and become unable to react appropriately when a dangerous situation
does occur.

Cruise control had similar problems: it reduces speeding-related accidents
(because drivers are less likely to micro-optimize their speed) but it
increases the risk of rear-ending because drivers don't have to stay as alert
for the majority of the driving (and then miss those situations in which they
need to react quickly).

Humans are not good at staying attentive for extended periods of time,
especially when nothing ever happens. Self-driving cars can fix this by
removing the need for human intervention, but autopilot seems like one of
those technologies that should be safer in theory but ultimately fails to
consider the human factor.

------
cloakandswagger
> "No force was detected on the steering wheel for over two minutes after
> autosteer was engaged," said Tesla, which added that it can detect even a
> very small amount of force, such as one hand resting on the wheel.

Why doesn't the "autopilot" turn off if no hands are detected on the wheel?

~~~
lucb1e
Well, the whole autopilot feature is silly if you need to keep your hands on
the wheel. Then you might as well steer yourself - or am I missing something?
(I don't know details about this feature, I just know it steers automatically
on highways.)

Until full driver attention is no longer needed, enabling "autopilot" and
saying in the manual "PS. you still need to stay as alert as you normally
would _and_ keep your hands on the wheel" (i.e. no benefits from having this
feature) is just asking for accidents.

Imagine, as you suggest, it does turn off when there's no hand on the wheel.
People are just going to rest a hand on it and still not pay attention. What's
the next step, tracking people's eyes? Or shall we just not enable autopilot
in the first place, if it can't do its only job?

~~~
Corrado
I think the value of "autopilot" for me would come in increased awareness when
my attention is fading. When I'm coming home at 2AM from a gig and the road is
long, straight, flat, and boring, my attention wanders. Sure, I start out
strong, but soon my eyes just don't focus correctly anymore. I roll down the
windows and turn up the radio to stay awake but sometimes I don't realize how
I got from here to there.

That 30 minute trip could end in disaster or, using something like
"autopilot", it could end with me pulling into my garage without issue. I tend
to think that it will be _very_ useful in these types of situations.

~~~
studentrob
> Sure, I start out strong, but soon my eyes just don't focus correctly
> anymore. I roll down the windows and turn up the radio to stay awake but
> sometimes I don't realize how I got from here to there.

I understand what you're saying, and that it is unlikely that something will
happen, but Tesla still recommends full attention in this scenario. You need
to be fully attentive at all times when driving, even with an assisted driving
system. Tesla states this clearly in their manual. If something were to come
across the road that the car were not prepared for, it may beep at you, or it
may not.

This seems to be what happened to the Florida driver. He likely did not think
he was abusing the system. He probably crossed that road hundreds of times
before on autopilot as part of a commute. And, neither the driver nor the
system pushed the brake.

If I were you and got into an accident, I know afterwards my brain would
search for a reason why this happened and how I can help it not happen again
both to me and to others. The solution is to not drive when tired, and to use
the system as designed. An additional thing we can do is put in a few measures
to check that the driver is still alert, like requiring hands on the wheel at
more frequent intervals.

------
freyir
Autonomous vehicles will require engineering that's more like NASA's and less
like a weekend hackathon's. Given Musk's association with SpaceX, I was
hopeful we'd get the former, but it's starting to look like the autopilot
solution was released half-baked.

The current SV engineering climate -- youth beats experience, move fast and
break things, release early and often -- works great for photo sharing apps
but not great when lives are at stake.

~~~
tdaltonc
Actually, I think SpaceX and NASA think quite differently about things. NASA
has to engineer toward maintaining their national mandate; every launch is a
PR game targeting the American public and Congress. SpaceX instead has to
worry about keeping customers happy. I think SpaceX has a simpler job there.
Sourcing, manufacturing, deployment, insurance, etc. all seem simpler for a
private company. They've shown that they're not shy about breaking their toys
once the customer's payload is off, if they think they have a good chance of
learning something.

SpaceX hasn't had people in their vehicles yet. Maybe they'll be more careful
when they do. Or maybe not. I can't find the quote right now, but I think it
was Sagan who said that part of what makes exploration so captivating is that
it is dangerous; maybe we shouldn't make safety the number one priority
because it might slacken our awe.

------
danhak
From the Article:

 _> The driver in Montana was headed from Seattle to Yellowstone National Park
when he crashed on a two-lane highway near Cardwell. ..."It's a winding road
going through a canyon, with no shoulder"_

From the Tesla Owners Manual:

 _> Traffic-Aware Cruise Control is primarily intended for driving on dry,
straight roads, such as highways and freeways._

...

 _> Autosteer is particularly unlikely to operate as intended in the following
situations:

• When driving on hills.

• The road has sharp curves or is excessively rough._

~~~
cmotzakut6
Can autopilot not tell a straight road from a curvy road? If it can, why does
it apply throttle when it detects that it is driving on a type of road it
cannot handle?

~~~
snuxoll
A driver is still supposed to remain alert and have their hands on the wheel;
this allows Tesla to collect telemetry on autopilot so it _CAN_ work on
improving the software to handle these conditions better. I think the flaw
here is that the hands-on requirement is much more lenient than it should be.
If it allowed the driver to keep their hands off for two full minutes (it _is_
supposed to slow down if a driver isn't keeping their hands on the wheel),
then that's a very poor design choice.

------
ModernMech
I can't even use my car's (Mazda) built in GPS without the car being in park.
And yet Tesla allows autopilot without hands on the wheel? Insane.

~~~
studentrob
You have to call your product "disruptive" to get the free pass

------
dkarapetyan
Hindsight is 20/20, no pun intended. Why was this shipped at all? I mean the
self-driving features. This has now done way too much long-term damage and
impacted all self-driving initiatives across the board. Just as they opened up
their patents to benefit everyone everywhere by ushering in the age of
low/zero-emission vehicles, they should have done the same for the self-
driving features. These should have baked in the lab for another 3-5 years.

------
raverbashing
"It's a winding road going through a canyon, with no shoulder," Shope told
CNNMoney. The driver told Shope the car was in Autopilot mode, traveling
between 55 and 60 mph.

Wow, really?

This is not the Autosteer's fault, it's pure negligence

~~~
npizzolato
If the autopilot couldn't safely handle going 55-60mph, it should have slowed
down to speeds that it could handle. If it couldn't safely handle those
conditions, it should have safely slowed to a stop. Failing ungracefully is
completely autopilot's (and Tesla's) fault.

~~~
takk309
55-60 mph may be too fast for some of the curves in this area. There are
curves with an advisory speed of 45 mph. From the description of the road and
the area, I
would guess it is MT-2 between Cardwell and Three Forks (I live in this area).

[https://www.google.com/maps/@45.8317611,-111.9106714,3a,75y,...](https://www.google.com/maps/@45.8317611,-111.9106714,3a,75y,26.62h,80.69t/data=!3m6!1e1!3m4!1snmhJ6A_zvKFmo2QkBLsViQ!2e0!7i3328!8i1664)

------
yladiz
> "We specifically advise against its use at high speeds on undivided roads,"
> it said. Tesla states clearly in its owner's manual that drivers should stay
> alert and keep their hands on the wheel to avoid accidents when the
> Autopilot feature is engaged.

Most people don't read the manual unless what they're doing actively doesn't
work. Especially with press surrounding Autopilot saying it's basically the
best self driving car and is completely trustworthy (even if they don't
explicitly state that, it's still implied), it's not unthinkable that someone
would completely trust the car and not worry about sensors. The car should be
much more proactive at getting the user's attention, either by actively
slowing the car down to some safe speed with cautions on and/or significantly
lowering the time limit of taking their hand off the wheel before any
indicators appear. I'm not sure about the former, but the latter is done in
every other car that has this kind of technology (e.g. cars with Lane Assist)
and the time is very low, around 10 seconds.

------
electic
The most glaring flaw here is the name - Autopilot. It gives the driver the
impression this is a true self-driving system. I have seen videos of people
sleeping in their Tesla while Autopilot is engaged. A more appropriate name
might be "co-pilot"

------
ben_jones
Ironically Tesla, a company willing to invest long-term in its products, chose
short term revenue in the form of their Marketing team boasting of "autopilot"
in their vehicles, in exchange for violating long-term public trust and
responsibility.

------
eorge_g
Tesla is getting hundreds of millions of miles logged for R&D on their
autopilot feature. It makes a lot of sense business-wise to launch this tech
when they did.

It's clear from their PR response with the semi-truck accident that they see
fatalities as an acceptable consequence as long as they can say they have
fewer accidents per mile than 'un-assisted' drivers.

The question is: do we share their belief that this is ethical?

------
basicplus2
If the auto pilot calls for the driver to be on alert and place their hands on
the wheel and the driver doesn't, surely the auto pilot should stop the car

------
snuxoll
I really think Tesla needs to make autopilot's "hands-on" requirement less
lenient; the driver had his hands off the wheel for a full two minutes and
the car didn't bark at him? I think this is the first incident where
autopilot was involved with an inattentive driver (maybe I'm wrong and the
other, non-fatal one was too), and it is probably the one that is going to
put the screws to Tesla, unfortunately.

~~~
ctrl_freak
> this is the first incident where autopilot was involved with an inattentive
> driver

The other (fatal) crash involved an inattentive driver watching a Harry Potter
movie:

[https://www.theguardian.com/technology/2016/jul/01/tesla-
dri...](https://www.theguardian.com/technology/2016/jul/01/tesla-driver-
killed-autopilot-self-driving-car-harry-potter)

~~~
ars
The latest data is that he was not watching anything.

The claim came from the truck driver, who was hoping to avoid responsibility
for crossing a highway in an unsafe manner.

------
3pt14159
The real issue is whether "Autopilot" is safer than a human. If the number of
deaths per distance travelled for similar road conditions is lower, then
Tesla should be allowed to continue the way they are doing things. It's
obvious to me that the way other systems do this (by slowing down if the user
doesn't put his or her hands on the wheel frequently) leads to people not
using the feature because it is less useful, which may, paradoxically, leave
a worse driver at the wheel.

Just the stats, ma'am. Are they safer or not? If so, then it's fine by me.
What we gain from self-driving cars in terms of real world usage data is
extremely valuable.

~~~
wepple
Instead of aiming for "the same number or slightly fewer" accidents than a
human, why not have humans give oversight to prevent autopilot errors, and
autopilot assist where humans might make a mistake?

Actually try to let both computers and people excel where they do best, and
massively decrease road accidents?

~~~
tdaltonc
That sounds good too. I think 3pt14159's implicit question was "should Tesla
be allowed to continue doing this the way they want?", not "is Tesla taking
the best of all possible approaches to car automation?"

------
_ph_
Tesla should contact the design team which creates the manuals for Ikea to
create a purely graphical "owners manual" clearly describing what the
autopilot can and cannot do. Admittedly, a lot of the Tesla marketing seems to
indicate otherwise, but the autopilot is just a very sophisticated lane and
distance/speed assistant. It seems to handle the typical _freeway_ very well,
but does not promise to handle most other driving situations.

For example, it does not handle crossing traffic, stationary obstacles, or
debris on the road. Off the freeway it is entirely the responsibility of the
driver to keep the car safe; the autopilot cannot (by design) handle most of
those situations.

The discussion of having the hands on the wheel is a bit of a red herring. Having
the hands on the wheel limits the amount of distracting things you can do
while driving, but the key point is paying attention to what is going on.
Hands off on an empty highway, why not? But when a truck appears on the
horizon with some potential to cross - better be prepared to take action. The
same applies to small country roads, and especially when the autopilot starts
to warn the driver.

------
trhway
>"It's a winding road going through a canyon, with no shoulder," Shope told
CNNMoney. The driver told Shope the car was in Autopilot mode, traveling
between 55 and 60 mph ...

in SUV (Model X)... are people trying to win Darwin award?

~~~
sriram_sun
Agree. I was shocked to read the actual speed it was going at. However, the
insurance companies might be starting to worry.

------
obj-g
I just _really_ don't understand the whole self-driving car craze. What is so
attractive about it? Why do people want it so badly? Personally, I like
driving, rarely even use cruise control. I would _never_ fully trust these
auto-pilots. Ever. So what's the point? If I can't crawl into the backseat and
take a nap and wake up at my destination, then what's the use?

~~~
gajjanag
I agree with these sentiments, but for slightly different reasons.

One of the main selling points is that commuters can save time (the focus on
self driving cars, not trucks/rail). However, long commutes are mostly a
constraint imposed by suburban sprawl and associated planning failures. The
same applies to automated delivery mechanisms from shops, again of limited use
in sufficiently dense communities. In such a scenario, this feels a lot like
optimizing 1% of execution time in a program.

For large scale transport of goods or people, the same situation applies for
the economics - the labor of the driver should be a small fraction of the
operating costs. There could be some safety benefits, but this remains
uncertain. I found [https://www.bloomberg.com/view/articles/2015-05-28/cars-
will...](https://www.bloomberg.com/view/articles/2015-05-28/cars-will-drive-
themselves-before-freight-trains-do) quite interesting in this respect.

Seeing what happened with the invention of the automobile, I won't be
surprised if self driving cars become the norm somehow. In the most extreme
case there might be an outright ban on non autonomous cars at some point in
the future. This is something I certainly don't look forward to.

------
iask
IMHO, autopilot means "vehicle drives by itself without my involvement"...it's
not autopilot if my hands have to be on the wheel. It seems that this
technology is Cruise Control v2.

Having the hands on the wheel only gives the car manufacturers leverage in
denying there is any problem on their end.

------
dimino
What the hell is happening in this comment thread? Do we not all realize that,
per-mile-driven, Tesla's auto pilot is _much_ safer than human driving?

I thought we were prepared to deal with the _inevitable_ crashes, but based on
the chicken littles here, maybe I was wrong.

~~~
studentrob
> Do we not all realize that, per-mile-driven, Tesla's auto pilot is much
> safer than human driving?

Good question. It's not safer. Tesla's quoted statistic of one death per 94
million miles driven in the US is not as useful as it could be. That includes
motorcycle deaths and roads and conditions in which the Tesla AP is not likely
to be engaged. When you limit it to divided roads, the rate is more like 1
death per 150 million miles driven [1], which makes the Tesla less safe than
your average American vehicle, and that still includes motorcycles.

[1]
[https://www.reddit.com/r/technology/comments/4sgxv1/while_te...](https://www.reddit.com/r/technology/comments/4sgxv1/while_tesla_appears_surprised_over_criticism_of/d59eazl)
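For concreteness, the arithmetic behind that comparison can be sketched as
follows. The 94M and 150M figures are the ones quoted above; the ~130 million
Autopilot miles per fatality is Tesla's widely reported number from the time,
included here as an assumption:

```python
# Back-of-the-envelope fatality-rate comparison, using the figures
# discussed in this thread (not official statistics).

def deaths_per_100m_miles(deaths: int, miles: float) -> float:
    """Normalize a fatality count to deaths per 100 million miles."""
    return deaths * 100e6 / miles

# Tesla's reported figure: one Autopilot death in ~130M Autopilot miles.
tesla_rate = deaths_per_100m_miles(1, 130e6)

# US average across all vehicles and roads: ~1 death per 94M miles.
us_all_roads = deaths_per_100m_miles(1, 94e6)

# Restricted to divided roads (where Autopilot is typically engaged):
# ~1 death per 150M miles, per the linked estimate.
us_divided = deaths_per_100m_miles(1, 150e6)

print(f"Tesla Autopilot:      {tesla_rate:.2f} deaths per 100M miles")
print(f"US avg, all roads:    {us_all_roads:.2f} deaths per 100M miles")
print(f"US avg, divided only: {us_divided:.2f} deaths per 100M miles")
```

Against the all-roads baseline Autopilot looks safer, but against the
divided-road baseline it looks worse, which is the point being made above.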

~~~
phaemon
> Good question. It's not safer.

I think it is. Given the number of Autopilot miles driven, you'd only expect
one death so far, so the fact that we have one doesn't really tell us anything.

However, _injuries_ from car crashes are (fortunately!) much more common -
roughly 60 times as likely. So you'd expect to have about 60 Autopilot
incidents resulting in injury by now. I don't really keep up with Tesla news
but, based on the reaction to this story (where the guy wasn't even injured),
I'm going to assume the actual number of injuries is tiny.
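The expectation argument can be sketched as a quick calculation. The round
numbers are the ones from this comment, and treating rare crashes as
independent events (a Poisson approximation) is an assumption layered on top:

```python
import math

# Round numbers from the comment: ~100M Autopilot miles logged,
# a baseline of ~1 death per 100M miles, and injuries ~60x as
# common as deaths.
autopilot_miles = 100e6
death_rate_per_mile = 1 / 100e6
injury_to_death_ratio = 60

expected_deaths = autopilot_miles * death_rate_per_mile      # ~1
expected_injuries = expected_deaths * injury_to_death_ratio  # ~60

# Under a Poisson model with mean ~1, seeing at least one death is
# quite likely, so one fatality by itself tells us little.
p_at_least_one_death = 1 - math.exp(-expected_deaths)        # ~0.63

print(f"Expected deaths:   {expected_deaths:.1f}")
print(f"Expected injuries: {expected_injuries:.0f}")
print(f"P(>=1 death):      {p_at_least_one_death:.2f}")
```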

What do you think?

~~~
studentrob
> What do you think?

I think it's not safer based on existing data. Musk was pretty quick to tout
that autopilot is safer over 100 million miles driven. It turns out his
statistic was not as useful as it could have been. There was no mention of
conditions in which autopilot is not likely to be engaged such as adverse
weather or non-divided roads. Plus some deaths are from motorcycles, which, it
seems, should somehow be excluded from the comparison.

On top of that, it might make more sense to compare the Tesla models' safety
records to cars in a similar pricing class. Comparing a Tesla to a beater from
1990 isn't so useful. So there are different degrees of comparison. Tesla's
headline-ready numbers do not tell the full story.

I don't feel badly about drawing different conclusions from the same data
Tesla used in its response to the incident. I think they overlooked a few
things. You're certainly free to intuit what you like.

> I don't really keep up with Tesla news but, based on the reaction to this
> story (where the guy wasn't even injured), I'm going to assume the actual
> number of injuries is tiny.

Tesla does not make this information available, so I don't think it is worth
speculating. Effort would be better spent encouraging regulators to demand
more detailed reports from companies that sell vehicles equipped with driver-
assistance systems. Until all companies are required to report such
information, Tesla is unlikely to do so. We can turn the screws on Tesla too,
and I think they would be served by lobbying for tighter regulations to create
a level and safe playing field for car companies and drivers alike, but I
wouldn't count on them to spend much extra effort on holding themselves
accountable in such a fashion. They're already having a tough time meeting
their production demands, and Musk is still setting aggressive targets for
complete autonomy that no other car company, including Google, is even close
to. I think Musk has said full autonomy is 2 years out, whereas Google says
it's between 3 and 30 years. Mobile Eye has a 5-10 year plan.

~~~
dimino
It sounds like you're willing to speculate/entertain assumptions (e.g.
regarding miles of "relatable" road usage) when it harms Tesla, but not when
it might help them.

~~~
studentrob
Must every critical argument I make be balanced with something positive in
order to demonstrate I'm not 100% against Tesla?

Tesla has definitely done great things. They're just not front and center in
the news at the moment.

I think it's best to look at the data or issues being discussed rather than
some hidden motive. When I discuss politics, I don't just say "I'm voting for
Billy", I try to say, "I like Billy's position on X because Y". Same thing
here.

~~~
dimino
It's not about balance, it's about you willing to make guesses regarding the
_bad_ things Tesla has done, and not make guesses about the potential _good_
things.

This isn't a "let's give 50% time to the creationists" kind of thing, this is
a "let's not stretch only one side of the truth" kind of thing.

You said the equivalent of, "I'm voting for X because of my speculation about
Y but I refuse to speculate about Z as it does not fit my current argument."

~~~
studentrob
> It's not about balance, it's about you willing to make guesses regarding the
> bad things Tesla has done, and not make guesses about the potential good
> things.

You're being pedantic. I can venture guesses about good things too. They're
just not pertinent to the focus of this article. Constructive criticism, for
me, is a positive thing.

> You said the equivalent of, "I'm voting for X because of my speculation
> about Y but I refuse to speculate about Z as it does not fit my current
> argument."

I gave some raw analysis of the statistics drawn, further here [1], and all
you can do is complain that I am somehow voting against Tesla. I never said
any such thing. I said we should demand more transparency from them about
accident data. I made some further speculation about Tesla's likelihood to do
this on their own because it supports the argument that _we_ should be part of
this conversation.

To expand on that, I think we should ask this of all companies that enable
driver-assist. Currently, California requires monthly reports on accidents
involving fully autonomous vehicles. I think it makes sense to do that for
every car [2].

If you think Tesla and other companies will become more transparent and you
don't feel the need to do anything, great, that's your prerogative.

[1]
[https://news.ycombinator.com/item?id=12087603](https://news.ycombinator.com/item?id=12087603)

[2]
[https://news.ycombinator.com/item?id=12084226](https://news.ycombinator.com/item?id=12084226)

------
hop
I think lawsuits and government intervention will kill self-driving cars for
the next decade or two. Even if autonomous cars are 10x better, that's still
hundreds of deaths per year that will be attributed to a half dozen
autonomous car makers' software.

~~~
studentrob
Tesla's not the only one working on this. Google still has a good program
that nobody seems upset about.

------
rayiner
For folks apologizing for Tesla: people ignore instructions and prompts from
their computers or gadgets. That's not the driver's fault: it's part of the
design problem that engineers have to deal with. As an engineer, you take the
world as you find it.

I had a professor in college who specialized in airplane cockpit design. Her
philosophy was that everything is a design failure. If the pilot pushes the
wrong button, the design must be fixed to keep that from happening. It would
seem to me that principle applies with even more force when we're talking
about consumer products operated by random people rather than trained pilots.

------
DanBC
I'd be interested to see how Google sees this.

Paving the way for autonomous driving by making crashes something that people
expect sometimes?

Or putting people off autonomous driving by making people think crashes are
not acceptable at all, ever?

~~~
studentrob
Google has a completely different strategy. They're aiming for 100% automation
via a more gradual testing process before making any sort of automation
available to consumers.

I believe their thought is that the closer we get to complete automation, the
less likely a driver is to remain aware. We're only at roughly 10% automation
right now and drivers are already taking their eyes off the road. When we get
to 90%, humans will have an even tougher time retaining attentiveness. Watch
their talks to be sure.

Google's self driving car group gives some awesome transparency reports every
month, including details about every accident [1]. It's like they're ready to
become their own company.

Fortune had a good article critiquing Tesla's strategy vs. Google's and other
car companies' [2]. The author says Tesla is being defensive and resistant to
public critique. Other companies expect pushback from the public and
incorporate that into their product offerings.

[1]
[https://www.google.com/selfdrivingcar/reports/](https://www.google.com/selfdrivingcar/reports/)

[2] [http://fortune.com/2016/07/11/elon-musk-tesla-self-
driving-c...](http://fortune.com/2016/07/11/elon-musk-tesla-self-driving-
cars/)

~~~
schiffern
> _Google 's self driving car group gives some awesome transparency reports
> every month, including details about every accident_

Google isn't doing this out of the goodness of their heart. They have always
been required by law to send autonomous vehicle accident reports to the DMV,
which are made available to the public on the DMV's website.

[https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/testi...](https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/testing)

So _the transparency itself_ is mandatory. Google merely chose to control the
message and get free PR by publishing an abbreviated summary on their website
too. Judging by this comment, it's working!

For example, here's the full report on Google's accident in February, the
first one where the autonomous car was the vehicle at fault:
[https://www.dmv.ca.gov/portal/wcm/connect/3946fbb8-e04e-4d52...](https://www.dmv.ca.gov/portal/wcm/connect/3946fbb8-e04e-4d52-8f80-b33948df34b2/Google+Auto+LLC+02.14.16.pdf?MOD=AJPERES)

~~~
studentrob
> So the transparency itself is mandatory. Google merely chose to control the
> message and get free PR by publishing an abbreviated summary on their
> website too. Judging by your comment, it's working!

Hahaha I guess so! :-D

There should be a similar requirement for transparency of driver-assist
vehicles. Clearly, the companies are not giving this up themselves until
they're all forced to do it at once.

------
chadlavi
"It's a winding road going through a canyon, with no shoulder"

Doesn't sound like the recommended use case for autopilot? I thought you were
only supposed to use it on a limited-access throughway?

Anyway, good to hear no one died this time, but both of these accidents do
sound like there's a lot of user culpability involved. It seems like it would
be reasonable on Tesla's end, since all Teslas know where they are, to limit
autopilot to only roadways where it is reasonable to use it.

------
dsfyu404ed
Can someone who has a Tesla answer this for me:

Does the autopilot routinely alert for short duration events where it's
confused but the situation passes quickly and it's not a problem or is there a
very low false positive rate and an alert is pretty much a guarantee that
you'll need to take over?

Basically, does the system train you to ignore most alerts because of a high
false positive ratio?

------
ilamont
_" It's a winding road going through a canyon, with no shoulder," Shope told
CNNMoney. The driver told Shope the car was in Autopilot mode, traveling
between 55 and 60 mph when it veered to the right and hit a series of wooden
stakes on the side of the road._

Thank goodness no one was hurt.

But I have to ask, what was the driver thinking? Autopilot late at night on a
winding mountain road with no shoulder?

A question for Tesla: Why isn't Autopilot disabled when it detects dangerous
conditions? TFA said it alerted the driver to put hands on wheel. If there is
no response, what is the protocol? At what point will the vehicle attempt to
pull over and stop (for instance, in the event of a medical emergency or
conditions are simply too dangerous)?

------
lucb1e
What I don't understand is why Tesla doesn't whitelist roads where this
feature can be used, since the manual sets very specific conditions.
Especially given that the feature is currently enabled, they must know which
roads work with high confidence values.

Or, suppose they wanted to do the right thing from the start but don't have
this dataset. It's not as if we don't have OpenStreetMap (or as if they
couldn't buy a commercial dataset), in which you can easily filter for
straight highways.

With this, it could probably even work without the driver being attentive,
which would be a killer feature over the current state of affairs (currently
it has no benefits whatsoever if you follow the instructions, which is
probably why almost nobody does).
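As a rough illustration, a whitelist like this could be a simple check
against road metadata. The tag names below follow real OpenStreetMap
conventions (`highway`, `oneway`), but the allowed classes and the policy
itself are invented for the example, not anything Tesla actually does:

```python
# Sketch of a road whitelist built from OpenStreetMap-style way tags.
# Policy (which road classes permit hands-off assistance) is invented
# for illustration; a production system would need far more signals
# (curvature, shoulder width, construction zones, map freshness...).

ALLOWED_HIGHWAY_CLASSES = {"motorway", "trunk"}  # limited-access roads

def autopilot_allowed(way_tags: dict) -> bool:
    """Return True if this road segment's tags pass the whitelist."""
    if way_tags.get("highway") not in ALLOWED_HIGHWAY_CLASSES:
        return False
    # Divided roads in OSM are usually mapped as separate one-way
    # carriageways, so require an explicit oneway tag here.
    if way_tags.get("oneway") != "yes":
        return False
    return True

print(autopilot_allowed({"highway": "motorway", "oneway": "yes"}))  # True
print(autopilot_allowed({"highway": "residential"}))                # False
print(autopilot_allowed({"highway": "trunk", "oneway": "no"}))      # False
```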

~~~
studentrob
> What I don't understand is why Tesla doesn't whitelist roads where this
> feature can be used, since the manual sets very specific conditions

I think they want the data of drivers attempting to drive with autopilot on
these roads.

Tesla does, I've read, restrict how fast autopilot can drive on unapproved
roads, so it appears they already have them mapped out.

They're using customers as guinea pigs for their tech.

~~~
kylec
Why not have Autopilot running and monitoring the sensors to learn the road,
but not have it actually take control of the vehicle? Even if the driver is
manually in control the Autopilot system could still be learning.

~~~
studentrob
Tesla may have begun by doing that. Recall that last October autopilot was
introduced "over the air", so, customers who already had vehicles could use it
immediately. Presumably, Tesla was collecting data up until that point.

I imagine Tesla is generating more sales with a usable autopilot. It generated
quite a bit of media hype and they are having trouble keeping up with
production demands.

So, Tesla chose to introduce this feature to increase sales and get more data
on how drivers interact with the system. It seems excessively risky to me. I'm
not building a self-driving system myself, nor running a huge company, so I
can't say much about what it takes to do that. I can say I would not like to
drive on the road next to a vehicle that can unexpectedly make the wheel turn,
forcing its driver to take corrective action at a moment's notice.

------
c0g
If your design fails one user, blame the user. If it keeps failing, blame the
design.

------
jacquesm
Tesla is trying to solve too many problems at once. That's a risky strategy.

------
helmett
TIL: car crashes happen. but because it's Tesla it gets 100x the attention

~~~
ModernMech
That's not really why this is getting attention. The core issue here is that
Tesla is shipping a product called "autopilot", which gives the expectation
that it is in fact an autopilot. In the background, people are hearing that
autonomous cars are "3 years away!!!" so the expectation is that the Tesla
Autopilot feature is the real deal.

Thus you have people watching movies, falling asleep, or driving on dangerous
roads using a technology that nowhere near delivers what the name promises.
The result is people are dying and getting hurt by an immature technology
being pushed too hard by an irresponsible company.

~~~
cynix
> The core issue here is that Tesla is shipping a product called "autopilot",
> which gives the expectation that it is in fact an autopilot.

An autopilot system is defined as a system that assists, but does not replace,
the human operator of a vehicle. Therefore Tesla's system is, in fact, an
autopilot.

~~~
kbart
From wiki: _" An autopilot is a system used to control the trajectory of a
vehicle without constant 'hands-on' control by a human operator being
required."_

That's the clear opposite of what Tesla's manual suggests. If it requires
hands on the wheel and constant attention, call it what it is -- a driver
assistance system, not an autopilot.

~~~
cynix
The actual implementation only requires periodic, not constant,
hands-on-wheel contact, although it's in the driver's interest to pay
attention constantly, as is evident from this accident.

------
hanklazard
The autopilot feature seems like one that should be taken through rigorous
testing approved by the appropriate regulatory structure before being allowed
out on the roads. The fact that Tesla has been able to simply update firmware
to include this experimental new feature on their vehicles as they please
feels very "wild west" and as much as I don't tend to prefer regulation, this
feels like the proper place for it. By the way, this has been my feeling on
the matter before any autopilot crashes were reported.

~~~
eatporktoo
The real problem is: how do you actually design a test that will catch these
issues? If you have 3 crashes in millions of miles, you are pretty unlikely
to find those errors during testing.

I wish Tesla were more open about the way they actually test their Autopilot
software.

~~~
danso
While automated driving is a relatively new field, the science and data of
traffic conditions and accidents is quite deep. You don't have to blindly wait
until your beta users get into accidents. In the case of the first fatality,
how is it possible that Tesla engineers were unaware of a situation in which
a light-colored vehicle might show up on days when the sky is bright?

Of course they were aware of that situation. They chose to deprioritize that
to reduce the incidence of false positives that came from overhead highway
signs. Nothing wrong with attempting to squash false positives, which can lead
to deadly situations themselves. But there was nothing preventing Tesla from
doing adequate testing to determine what unintended consequences their
software modifications would entail. They don't have to wait for someone to
get decapitated to realize that there's a trade-off in reducing false
positives from overhead signs.

------
readams
We'll just have to see how liability laws and juries treat these technologies.
We may well see a situation where thousands of lives could be saved from
automation, but any accident leads to millions in lawsuits and the technology
is killed. Hopefully this isn't the outcome.

Airbags occasionally kill or injure people also despite on balance saving
lives. So there is precedent for this sort of math, but this is a very
different kind of technology and not an ordinary engineering tradeoff.

~~~
HillaryBriss
Takata airbags have been in the news the last several years because at least
two people have died and more than a hundred have been injured.

I've heard commenters opine that this "could mean the end of Takata as a
company." But I haven't heard anyone say this "could mean the end of airbags
as a feature in cars."

I guess what the Tesla autopilot needs is a lot of _good_ publicity every time
it _avoids_ an accident which a human driver would otherwise have actually
caused.

But where would anyone get the counterfactual for those cases? Does anyone
even have information about that?

~~~
Meekro
> I guess what the Tesla autopilot needs is a lot of good publicity every time
> it avoids an accident which a human driver would otherwise have actually
> caused.

Sounds like a plan! Let's start right now. Here's an example of the autopilot
dodging a reckless driver and avoiding a collision that many human drivers
wouldn't have avoided:
[https://youtu.be/9I5rraWJq6E](https://youtu.be/9I5rraWJq6E)

~~~
studentrob
That's a video by the guy who died. Some have argued he could've been more
aware of the oncoming truck, given the short amount of time it had to make
its desired transition from left to right. Also, the camera was positioned
farther forward than the driver's eyes.

------
Fej
When is Tesla going to stop blaming the drivers? Whether it's their fault or
not, it looks bad.

To be honest I don't feel comfortable with that kind of data collection
anyway. I'll never even think about getting a Tesla until they give us an
option to turn their tracking systems off. It's creepy and unnecessary.

------
antaviana
Looking at the recent crash reports (driver watching Harry Potter, driver
refusing to take over in winding road), it seems that it is just a matter of
time for a news headline about a Tesla crash report with nobody on the front
seats and a couple making out on the back seats.

------
jaimex2
I think the next software update needs to place limitations on where auto
pilot can be deployed.

Heck, mine the data from streetview and disable activation on roads where it
can't work 100%

------
seesomesense
What I find just as disturbing is the highly conflicted M&A scam he is running
using SolarCity. The SolarCity scam exposes him as a dishonest scamster.

~~~
studentrob
That one is crazy. No way that goes through. His job as CEO of SolarCity is to
shop around for the best price from other acquirers. Michael Dell did this
when he combined two of his companies and still got fined for it.

Maybe I'm drinking the Fortune Kool-Aid. Their article seemed logical to me [1]

[1] [http://fortune.com/2016/06/22/tesla-elon-musk-
solarcity/](http://fortune.com/2016/06/22/tesla-elon-musk-solarcity/)

------
noonespecial
This feels like the beginning of air travel again. Each new rule and system
update has a tombstone behind it.

------
return0
Great, more data points.

~~~
yladiz
That's kind of tone deaf, don't you think? Just because the crash might
provide some useful data, doesn't mean it's something to look positively upon.

~~~
return0
Sorry, I was being sarcastic. I consider what Tesla is doing at this stage to
be outright criminal behavior.

------
dllthomas
[http://www.gocomics.com/bloomcounty/2012/07/29](http://www.gocomics.com/bloomcounty/2012/07/29)

------
andrewclunn
What the hell is there to hit in Montana?

------
sergiotapia
Really irresponsible of Tesla to name it autopilot.

My wife calls me crazy for not even using cruise control on our car.

~~~
ars
Cruise control is to keep the car at an even speed when going up/down hills.
What does that have to do with safety?

~~~
wvenable
Feet aren't on the pedals, which is slightly more dangerous than feet on the
pedals. Kinda like hands on the wheel is less dangerous than hands off the
wheel (at least until automation becomes better at driving than humans).

~~~
ars
It's not like your foot is on the brake anyway - it's on the gas.

So keep your foot "active" and ready for use, even with cruise control.

Personally I lightly rest it on the gas. (Not hard enough to actually do
anything, but in the right position to react to things.)

------
kelvin0
Being careless with Autopilot on a Tesla qualifies one for a Darwin award, right?

------
kilroy123
I hate to say it, but there are going to be some deaths on the road to full
autonomous cars. (no pun intended)

The problem I see is, Tesla is the only one actively testing autonomous
driving.

The fact that it took over 100 million miles driven before we started to see
accidents is a good sign.

~~~
itg
Or you can test the cars the way Google does, without involving the public and
having deaths on your hand from overhyping your product.

[https://www.google.com/selfdrivingcar/reports/](https://www.google.com/selfdrivingcar/reports/)

~~~
yes_or_gnome
There are tens of thousands of people either walking or driving the streets of
Mountain View and Sunnyvale where Google is testing its self-driving cars. So,
the members of the public are at risk.

But, I understand that you mean behind the wheel. Which is funny, because the
Google cars do not have a wheel. As I'm sure you may know, instead of a
traditional steering wheel they have a horizontally mounted valve-style wheel
with a handle. It looks a little bizarre when you first see it, but I would
imagine it could be a safer, smoother alternative to the current standard
steering wheel.

Unfortunately, I couldn't quickly provide an image; I've seen it multiple
times on my walk to and from work.

~~~
studentrob
> There are tens of thousands of people either walking or driving the streets
> of Mountain View and Sunnyvale where Google is testing its self-driving
> cars. So, the members of the public are at risk.

There is a _huge_ difference between Google operating a handful of cars in
certain areas and Tesla selling 10-20 thousand vehicles per month that can be
operated by untrained consumers anywhere in the US.

Tesla is using its customers as data-generating guinea pigs. In return, Tesla
may become a guinea pig itself by showing other companies how not to progress
towards autonomous vehicles.

~~~
Retra
>untrained consumers

 _Licensed drivers._ Drivers whose license indicates that they will take
responsibility for the actions of any vehicle they control, and that they
understand how to operate any features which may put themselves or others at
risk.

~~~
studentrob
> Drivers who have a license that indicates that they will take responsibility

The fact that customers are licensed does not mean the seller is free from any
and all regulation. Gun owners are licensed, and guns are still required to
have safety switches. Cars have a long history of being regulated. [1]

Basically, anything that can contribute to deaths is going to be monitored
closely by consumer protection bureaus, and will probably be heavily
regulated.

[1] [http://www.pophistorydig.com/topics/g-m-ralph-nader1965-1971/](http://www.pophistorydig.com/topics/g-m-ralph-nader1965-1971/)

~~~
Retra
I didn't mean to imply otherwise. Tesla probably could do a much better job
informing its customers of what they are getting into, because they certainly
aren't in a position to figure it out themselves. And Tesla can definitely
make its software better.

What I'm saying is, if your car injures someone while you sit in the driver's
seat, and you could easily have taken a foreseeable action to prevent it, then
you are at least partially responsible.

~~~
studentrob
> What I'm saying is, if your car injures someone while you sit in the
> driver's seat, and you could easily have taken a foreseeable action to
> prevent it, then you are at least partially responsible.

Oh absolutely. No question there for me at this time.

There may yet be a class action or something that reveals unjust conduct by
the driver-assist companies. I agree that right now, all other things being
perfect, if a driver of one of these cars is in an accident, then some person
is responsible.

