
Tesla in autopilot mode crashes into parked Laguna Beach police cruiser - extesy
http://www.latimes.com/local/lanow/la-me-ln-tesla-collision-20180529-story.html
======
abalone
I keep wondering what the customer benefit of Level 2 autopilot is if not to
lower your attention and relax your mind. Tesla's "out" is that drivers are
supposed to retain full attention and oversight of the autopilot system -- but
then if you strictly follow this rule, what is the benefit of autopilot?

I can see the benefit _to Tesla_ and future Tesla customers of essentially
crowdsourced fleet learning. But what is the benefit _right now_ if you
strictly follow the rule of remaining alert enough to intervene at a second's
notice?

In previous threads the best explanation I got from Tesla owners was that it
frees you from the "details of physically driving" so you "can now supervise
instead."[1] That just seems suspect. Supervising autopilot to the degree
required to correct sudden mistakes seems, to me, to be probably very close to
the mental load of the details of steering yourself.

[1]
[https://news.ycombinator.com/item?id=17151116](https://news.ycombinator.com/item?id=17151116)

~~~
userbinator
_but then if you strictly follow this rule, what is the benefit of autopilot?_

Indeed. When reading Tesla's statement,

"The Palo Alto-based automaker, led by Elon Musk, has said it repeatedly warns
drivers to stay alert, keep their hands on the wheel and maintain control of
their vehicle at all times while using the Autopilot system."

...one can't help but think, "then _WTF is it good for_, if I still have to
drive the car myself while using it?"

 _Supervising autopilot to the degree required to correct sudden mistakes
seems, to me, to be probably very close to the mental load of the details of
steering yourself._

I'd say the mental load is _higher_ than manually driving --- not unlike being
a driving instructor to a student driver who is OK most of the time but
occasionally makes a possibly fatal mistake that you need to watch out for and
quickly correct. Normally, a car will go in a straight line with no steering
input, but the autopilot almost "has a mind of its own" and can steer itself
in the wrong direction.

This is subtly different from cruise control, which lets you concentrate on
steering and rest your foot on the brake so you can use it quickly if needed.
Cruise control doesn't encourage lowering attention, but enhances it.

~~~
slg
>This is subtly different from cruise control, which lets you concentrate on
steering and rest your foot on the brake so you can use it quickly if needed.
Cruise control doesn't encourage lowering attention, but enhances it.

What has led you to this conclusion? Autopilot is really just advanced cruise
control. You can treat it the exact same way by leaving your foot on the brake
and leaving a hand on the wheel. If people aren't using it that way, that is
certainly something Tesla should try to fix. But I am just not sure I buy the
argument that cruise control is inherently a safer technology than cruise
control plus auto steering.

~~~
toxik
Because the fully-assisted driving ("autopilot") removes the need for
interaction completely. Your brain will stop spending power on backseat
driving, because the brain is really lazy--and clever about being lazy too.

~~~
slg
>Because the fully-assisted driving ("autopilot") removes the need for
interaction completely.

But it isn't fully assisted and doesn't completely remove the need for
interaction. It just is another step forward from cruise control that is more
assisted and requires less interaction. I am not sure why everyone assumes the
sweet spot for driver assistance ends immediately after adaptive cruise
control and before cruise control plus lane keeping.

>Your brain will stop spending power on backseat driving

That is an interesting phrase to use considering backseat drivers have zero
responsibility or control of a car but still manage to pay attention.

~~~
abalone
Backseat drivers pay less attention than frontseat drivers, obviously.

Your characterization of autopilot as “just cruise control” is fundamentally
incorrect. Mere cruise control still demands significant and continuous
attention and manual control on your behalf (steering). Autosteer makes the
leap to fully automatic, which opens the door to not paying attention for long
stretches of time.

~~~
slg
I am honestly not sure how to discuss this topic when in the same thread Tesla
gets ridiculed for blurring the lines between Autopilot and full self driving
while people who are critical of the system say things like Autopilot is
"fully automatic", "fully-assisted driving", or "removes the need for
interaction completely".

From a practical perspective, Autopilot still needs to be monitored. But more
relevant to this discussion, the user still needs to interact with the car to
keep Autopilot engaged. It is possible to do that while paying very little
attention. It is also possible to drive a car with cruise control while paying
very little attention. Both systems can be abused. I have already said Tesla
should do a better job of trying to crack down on the abuse of Autopilot. But
I don't know why it is controversial to say that if used properly both systems
can lead to greater safety by allowing the driver to focus on other aspects of
driving.

~~~
abalone
You're mischaracterizing the issue. Autopilot _should_ be monitored but it can
go for long stretches without manual intervention (as if fully automatic).
Mere cruise control, on the other hand, _requires_ manual steering.

You're just trying to make them sound similar, e.g. "if used properly" or
"both can be abused". But there is an ocean of difference. It is vastly easier
to ignore a system that is capable of driverless operation for long periods of
time.

------
iamtew
Is the article available to Europeans somewhere? All I'm getting is a message
with this:

> Unfortunately, our website is currently unavailable in most European
> countries. We are engaged on the issue and committed to looking at options
> that support our full range of digital offerings to the EU market. We
> continue to identify technical compliance solutions that will provide all
> readers with our award-winning journalism.

On the other hand, I'm wondering how much I want to read an article from a
website where they _must_ track me when I just want to read something...

~~~
sschueller
This is so stupid; they are still infringing on the rights of European
citizens who just happen to not be in Europe. For example, I can get to the
page, but I am in Switzerland, which is not part of the EU yet smack in the
middle of Europe.

~~~
lagadu
The GDPR's scope is EU (or EEA, I can't recall) residents in EU territories;
it has no provisions at all for EU citizens who reside outside the territory.

~~~
ggus
I'm in San Marino and I can read the page, yet San Marino is not in the EU
but is within the scope of the GDPR.

------
confiscate
I don't get it. Why is it so hard to detect stationary objects in front of
you? Wouldn't that be the "hello world" of any self-driving tech?

Every time I ask this question, the response is "radar as a technology has too
much noise, not accurate enough to detect stationary objects".

But it DOES detect stationary objects. Otherwise no one would even turn on
Tesla AP at all.

Why is it so hard to detect stationary objects in front of you?

~~~
mentat
Subaru EyeSight does this reliably and actually auto-brakes correctly. Going
from that to APv2 has been a real step back. There's a curve on 92W where it
consistently tries to accelerate you into an embankment. I think Elon just
doesn't understand the real-world reality of how bad AP is.

~~~
djsumdog
It really bothers me that manufacturers are pushing incremental updates to
car safety systems. Does anyone besides me think this is a terrible idea?

~~~
Gustomaximus
Aren't cars doing that today with their physical safety systems?

How else would they do it than incremental?

~~~
InternetOfStuff
> How else would they do it than incremental?

How about... not?

I'm not being disingenuous here.

If there's autonomous tech, then I find it scary that it might change its
behaviour. You see, I've learned how (say) Autopilot responds in certain
situations. I know when to trust it, and when not to.

If this behaviour changes (perhaps even if it changes for the better!), that's
a very dangerous thing. Some of my experience is now invalidated, but I don't
know which part. But it mostly works as before, which gives me a false sense
of security.

Changing the driving behaviour of a vehicle, especially in totally un-obvious
ways, is super dangerous, and such changes must have very strong
justifications.

Just think of that poor guy whose Tesla drove into the divider. Oh look, it
behaves just like before -- except for its lane following behaviour.

~~~
Gustomaximus
I don't understand how you would expect not to do incremental improvement.
Build it perfectly the first time? That's not realistic.

Pretty much everything humans do is incremental improvement. I understand your
point that we don't want to introduce weaknesses in 'upgrades', but this
should be a discussion about how to create robust QA rather than stopping
incremental improvement.

~~~
InternetOfStuff
> I don't understand how you would expect not to do incremental improvement

I wonder if we're talking past each other? Sure, incrementally improve during
engineering. But don't incrementally change the safety-critical behaviour of a
car once it's in the customer's hands.

> Build it perfectly the first time? That's not realistic.

Absolutely agreed.

But for subjects such as autopilots, predictability may very well trump
improvement.

I'm assuming that the initial iteration is already reasonably safe and useful
(if it isn't it shouldn't have been shipped, right?).

If the behaviour changes unpredictably (how will you predict autopilot changes
caused by an update?), this may well be less safe than keeping the existing
system, whose quirks the driver has now learned.

> Pretty much everything humans do is incremental improvement. I understand
> your point that we don't want to introduce weaknesses in 'upgrades'.

The nasty thing is that even a change to objectively superior behaviour may be
problematic, because the driver will expect the car to behave one way, when it
now behaves a different way (regardless of merit).

Not even a changelog would help in such cases. I feel like the autopilot ought
to inform the driver in each situation where it used to decide one way, and
now decides another way. That might be a reasonably safe way of allowing
incremental change, but is likely too annoying for most people.

> but this should be a discussion about how to create robust QA rather than
> stopping incremental improvement.

Robust QA is certainly a must.

(Side note: I used to work in safety-critical automotive projects, so I may
have a more intimate understanding of the issues.)

------
cherioo
People always lament how Tesla tries to mislead customers about its
capabilities, but is there actually data showing this is truly the case among
Tesla owners? How many "attention needed" beeps are needed before a customer
can be considered reasonably informed that it is not fully self-driving?

Even among non-Tesla cars, whose manufacturers don't try to "mislead", 71
percent of people believe automatic emergency braking can avoid all crashes
[0]. Is this percentage higher or lower for Tesla, and is that difference
warranted?

[0] [https://newsroom.aaa.com/2016/08/hit-brakes-not-self-
braking...](https://newsroom.aaa.com/2016/08/hit-brakes-not-self-braking-cars-
designed-stop/)

[1] Quote from above "When traveling at 45 mph and approaching a static
vehicle, a scenario designed to push systems beyond the stated limitations,
the systems designed to prevent crashes reduced speeds by 74 percent overall
and avoided crashes in 40 percent of scenarios. In contrast, systems designed
to lessen crash severity were only able to reduce vehicle speed by 9 percent
overall."

~~~
eyeing_see
These cars are driving on public roads, so the question is not whether the
user is informed but whether they are compliant. It doesn't take many ignored
beeps to learn that a particular customer is not using the feature as
intended, and to deactivate it for the safety of others. Offer to turn it back
on after the driver watches a training video at the dealer and signs some
papers once; escalate to a mandatory counselling session after that; on the
third strike, the feature is permanently off. Put it in the TOS so that
asshole drivers can't sue and win in court. The customer should not be king
when the safety of others is on the line.

~~~
vsl
So speeding is the manufacturer’s fault too? Informed by the car, non-
compliant driver...

Oh, and Tesla does disable Autopilot if you ignore the hands-on-wheel
warnings.

------
mikeash
How many non-Autopilot cars have crashed into parked vehicles? It happens a
lot, to the extent that a lot of states have “move over” laws explicitly
designed to reduce it.

Does Autopilot actually make the problem worse, or is it just more newsworthy?

~~~
DannyBee
If you go look at IIHS, they have tests that will tell you exactly how good
non-Autopilot cars are at this kind of crash avoidance:

I cannot find any current (2018) car that crashes into the parked car.

(I only spot-checked prior years.) I found one in 2014 (Audi Q5) that hits the
car at 1 mph when going 12 mph.

[http://www.iihs.org/iihs/ratings/vehicle/v/audi/q5-4-door-
su...](http://www.iihs.org/iihs/ratings/vehicle/v/audi/q5-4-door-suv/2014)

By contrast, the 2014 subaru outback avoids collision even at 25mph.

[http://www.iihs.org/iihs/ratings/vehicle/v/subaru/outback-4-...](http://www.iihs.org/iihs/ratings/vehicle/v/subaru/outback-4-door-
wagon/2014)

~~~
mentat
The Subaru Outback's collision avoidance is vastly superior to AP's, which
probably says a lot about software quality. It certainly doesn't have better
sensors.

~~~
mikeash
They're equally capable according to that test, and it's not optional
equipment on the Tesla.

~~~
mentat
My experience over a few thousand miles of driving says they're not, but
sure, whatever.

~~~
tgb
How often are you attempting to drive into parked cars?

~~~
mentat
Cars in front of you decide to do random things like slam on the brakes or
make sudden turns. EyeSight consistently would warn early, then brake as much
as necessary. I only had it do emergency braking twice, and both would have
been accidents, as I'd misjudged the distance and speed of other objects. They
have you test-drive towards a trash can as part of the sale, and it works, so
it's not just cars in a lane. More interestingly though, the warnings,
especially as early as they happen, made me a better driver fairly quickly.
Tangentially, the warning sound on the Outback is far more compelling than on
the Tesla. Weird UX thing.

------
Animats
Has Tesla blamed the driver yet? They might get away with that here. The road
is not divided and is unsuitable for their lane-keeping system. The pavement
markings are unusual.[1]

Of course, as usual, their obstacle detection failed to detect a stationary
obstacle that didn't look like the rear end of a car in the same lane. We need
a minimum standard for a vehicle which takes automated control of steering and
braking. It must reliably stop for obstacles.

[1] [https://goo.gl/maps/BjqTZoD5Yws](https://goo.gl/maps/BjqTZoD5Yws)

~~~
joshuakcockrell
Here's the exact spot on that road that matches the picture in the article
[https://goo.gl/maps/pj3X4gAuG672](https://goo.gl/maps/pj3X4gAuG672)

------
RobLach
I remember reading people saying that insurance premiums will plummet for
Teslas because of how much safer they will be on the road.

Today I read Tesla is getting into the insurance business because premiums are
getting out of hand. It makes some sense since internally they’ll have more
data they can use to deny claims.

It’s interesting how reality plays out.

[https://electrek.co/2018/05/29/tesla-insuremytesla-
insurance...](https://electrek.co/2018/05/29/tesla-insuremytesla-insurance-
model-s-most-expensive-car/)

~~~
mikeash
I’m skeptical that it’s because premiums are getting out of hand. The report
that Teslas are the most expensive to insure only looked at the top of the
line version of each model. For the Model S they looked at the P100D, which
costs something like $140,000, and is one of the fastest accelerating street
cars ever made. I’m not surprised that’s expensive to insure, but that doesn’t
mean a more mundane one will be.

~~~
tim333
They seem to have impressive repair costs. If I was an insurer I'd quote high
because of that, not because they were more likely to crash. (eg
[https://cleantechnica.com/2018/05/20/heres-what-7000-of-
dama...](https://cleantechnica.com/2018/05/20/heres-what-7000-of-damage-looks-
like-on-a-tesla-model-3/))

------
jread
I live in Laguna, cycled by this accident today, own a Tesla, and can attest
to consistent autopilot issues in this section of the highway. The road widens
with a turnoff to the right, and in my experience that is exactly the
direction autopilot wants to go every time, rather than follow the center
line, even when it is tracking a lead vehicle that does so. There are no lines
to the right, so I'm not quite sure why it does this. But I can understand how
an inattentive driver might end up in this situation, because until this point
in the road autopilot works great.

------
olliej
I really wish all the car manufacturers would stop with this bullshit "you
still have to be driving it" self driving nonsense.

Claiming that the driver is still responsible for driving the car, while using
a system designed to encourage the "driver" to not pay attention, is a dumb
idea, and all I can see is this kind of thing leading to regulations that
delay actual self-driving cars.

I'm not a self-driving car fanboy or anything. I don't think it's just around
the corner, but it's clearly going to happen /eventually/ and I can't imagine
it being more dangerous than regular drivers. But this kind of "self-driving
but not" feature feels like the sort of thing that invites regulation by
government agencies, and sufficiently "dumb" mistakes from these systems seem
like the sort of thing that triggers reactive over-regulation.

~~~
AceJohnny2
Blame Elon Musk. I'm serious.

He's been promising "self-driving" for years, except the damn technology isn't
ready/safe enough for how people are clearly using it. The manufacturer puts a
warning label on it, but the usage model/UI is clearly flawed.

His engineers know this, which is why Tesla's self-driving group has had such
a high turnover rate over the last few years. Musk keeps overriding them.

~~~
admyral
No offense but this is exactly the attitude that inhibits progress and led the
auto industry to languish in mediocrity for years. If you wait until the
technology would prevent all accidents, you'd be waiting forever.

~~~
api
It needs to be as good as an average human. Do we have enough stats to compare
yet?

~~~
admyral
As good as a human at what? It's 100% better at not falling asleep behind the
wheel or driving under the influence, which statistically are the most common
causes of traffic accidents.

------
iamleppert
“Come check out our really awesome manual auto-pilot! It’s fully manual
automatically!”

------
danso
Assuming that every major AP collision makes the news, it is pretty amusing
that the latest non-fatal ones have all coincidentally involved emergency
vehicles. I imagine it must be related to how such vehicles have the right to
park just about anywhere, including on roadways where drivers do not expect
parked vehicles.

~~~
gwern
The photo looks like it was parked in a marked parking spot (those white
crossed lines indicating each slot along the curb).

~~~
goshx
Half on the sidewalk, half on the road.

~~~
gwern
After being crashed into. Why would they have driven up onto the curb when
there's so much empty space and all the other cars in the photo are parked
fully in the street? Not to mention, being further in should make it easier to
avoid, not harder.

------
cr4zy
One way to avoid these types of crashes IMO is anomaly detection. It's quite
simple to do anomaly detection in pixels using modern deep pixel prediction
nets like PredNet. In my experiments you get a few seconds lead time on
something like a car cutting you off (the car starts to head out of the lane
before actually crossing it for example). This allows alerting the driver, and
with a full windshield HUD you could even highlight the anomalous pixels on
the windshield. The nice thing about this is that it can be trained in an
unsupervised manner on all the available data. Some important details are to
find anomalies in object bounding boxes, using something like Tensorflow's
object detection pretrained net. Otherwise buildings with lots of striations
would light up the anomaly detector. Also, you should detect anomalies in a
human colorspace like CIELAB so that white cars (#fff) are not artificially
weighted as more anomalous.

Finally, you could use this as input to a planner like Model Predictive
Control where a higher cost is incurred for approaching anomalous objects.
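
For the colorspace point specifically, here's a minimal sketch of my own (not
from PredNet, and the frame values and threshold are illustrative): compare a
predicted frame against the actual frame in CIELAB and flag pixels whose
Delta-E exceeds a threshold, so a white car (#fff) isn't over-weighted the way
a raw RGB distance would weight it:

```python
import numpy as np

def srgb_to_lab(rgb):
    """Convert sRGB values in [0, 1] to CIELAB (D65 white point)."""
    rgb = np.asarray(rgb, dtype=float)
    # Undo the sRGB gamma curve to get linear RGB
    linear = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    # Linear RGB -> XYZ using the standard sRGB/D65 matrix
    m = np.array([[0.4124564, 0.3575761, 0.1804375],
                  [0.2126729, 0.7151522, 0.0721750],
                  [0.0193339, 0.1191920, 0.9503041]])
    xyz = linear @ m.T / np.array([0.95047, 1.0, 1.08883])  # normalize by D65 white
    f = np.where(xyz > (6 / 29) ** 3, np.cbrt(xyz), xyz / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

def anomaly_mask(predicted_rgb, actual_rgb, threshold=10.0):
    """Flag pixels whose Delta-E 76 distance between the predicted and
    actual frame exceeds the threshold."""
    delta_e = np.linalg.norm(srgb_to_lab(actual_rgb) - srgb_to_lab(predicted_rgb), axis=-1)
    return delta_e > threshold

# Toy example: the predictor expected uniform grey pavement, but the actual
# frame contains an unexpected white patch.
predicted = np.full((4, 4, 3), 0.5)
actual = predicted.copy()
actual[1:3, 1:3] = 1.0  # 2x2 white patch the predictor did not anticipate
mask = anomaly_mask(predicted, actual)
print(int(mask.sum()))  # 4 anomalous pixels
```

A real pipeline would get the predicted frame from the video-prediction net
and restrict the comparison to detected object boxes, as described above.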

~~~
toxik
Great, you solved AD with this one simple trick!

------
theCricketer
Elon Musk has constantly underestimated the difficulty of autonomous driving.

This video ([https://youtu.be/wsixsRI-
Sz4?t=1h18m28s](https://youtu.be/wsixsRI-Sz4?t=1h18m28s)) shows Elon Musk, two
years ago, saying the following:

"I basically consider autonomous driving to be a solved problem".

"A Model S and Model X can drive with greater safety than a person, already.
Right now."

"We are less than two years away from complete autonomy".

------
Shank
Totaled? Maybe insurance-totaled, but that police cruiser looks relatively
intact from the picture. The driver's side door is open and it doesn't look
like it was crumpled in any way in that zone. The rear driver's side passenger
door looks like it took more damage, and the entire frame of the cruiser still
looks like it's in relatively okay condition. The Tesla is in the same shape.

I'm not trying to downplay the impacts of the driver or Autopilot here, but
even if a police officer was in the cruiser, it doesn't look like it would
have been as catastrophic as the article implies.

------
jfim
From TFA:

> "Tesla has always been clear that Autopilot doesn't make the car impervious
> to all accidents, and before a driver can use Autopilot, they must accept a
> dialogue box which states that 'Autopilot is designed for use on highways
> that have a center divider and clear lane markings,'" a Tesla spokesperson
> said in an emailed statement.

Does this dialog box appear every time the car is started, or is it a one-time
thing, just like terms of service?

~~~
zwily
One time thing. The display does tell you to pay attention and keep your hands
on the wheel on every activation though.

~~~
xvf22
Lovely... But if it works great 99.9% of the time it may just lull you into a
false sense of security. Not saying the driver isn't at fault but this is
especially true for Tesla where a firmware update may change autopilot
behaviour in ways the driver doesn't expect.

------
oliv__
If you need to stay alert and have your hands on the wheel, why is the damn
thing called "Autopilot"!? This should not be legal.

------
pmontra
Los Angeles Times blocked Europe because of GDPR. Here's a link we can read
from there [https://www.reuters.com/article/us-tesla-autopilot/tesla-
hit...](https://www.reuters.com/article/us-tesla-autopilot/tesla-hits-parked-
california-police-vehicle-driver-blames-autopilot-idUSKCN1IU2SZ)

------
dmitrygr
Good reading: [https://www.vanityfair.com/news/business/2014/10/air-
france-...](https://www.vanityfair.com/news/business/2014/10/air-france-
flight-447-crash)

Basically this is a well-known issue in aviation. As automation gets better
and better, humans rely on it more and more and get less and less able to
handle even minute failures in it. Additionally humans are _absolutely
terrible_ at being able to be dropped into a complex situation and have to
make immediate rational decisions about it - thus the failure mode of "in case
of error, tell human to handle it" is a bad idea if time between "tell human
about it" and "crash into things" is under ten seconds or so. This has been
the subject of many NASA studies and NTSB reports and the above article does a
good job presenting this info in a form a layman can understand.

There are currently no known easy solutions, sadly.

------
fijal
I enjoy how "EU" means "outside of the US". I'm in South Africa and it's
blocked for the EU.

~~~
mkj
Maybe they're using browser timezones

~~~
Symbiote
That wouldn't be enough, there is EU territory outside Europe.

(French Guyana, Martinique, Réunion etc.)

------
ucaetano
2 fire trucks, 1 police cruiser.

Tesla really is sticking it to the man!

~~~
mertd
Usually other types of vehicles get a handsome ticket when they stop somewhere
they should not.

------
readhn
They should name the feature "smart cruise control". Case closed. It does not
imply self-driving capability but rather a more advanced cruise control
system. Which is basically what it is.

Nobody is going to set cruise control to a certain speed and look away from
the road!!!

~~~
mikeash
The numerous drivers I see texting every day would beg to differ.

------
keerthiko
Tesla could still salvage this somewhat by loudly rebranding Autopilot to
"Cruise Control 2.0" and issuing a serious apology.

But somehow they (or just Elon?) seem unreasonably averse to doing that. The
man is losing hero-status by the minute.

------
melling
“You are about to hit a solid object. Stop.”

Why can’t we solve the most basic use case? We don’t have the sensors and it’s
all done with cameras?

~~~
userbinator
Too many false positives; sensors would only be usable for cars that moved a
few km/h at most. Even if you're driving normally, various other objects
appear in your path all the time.

~~~
twblalock
And yet the forward collision warning on a lot of inexpensive cars like Hondas
would have warned the driver and then hit the brakes in this situation.

We clearly don't have an epidemic of cars with forward collision warning
systems slamming on their brakes for no reason, so those systems must be able
to deal with the false positives somehow.

------
TangoTrotFox
Articles like this should really include incident rates per mile driven for
the self-driving vehicle and compare them against human rates. Without that,
this feels a lot like an intentional effort to mislead. What matters are the
crash rates. E.g., how many humans crashed into an unmoving vehicle yesterday?
An immense number, but of course they also drove far more miles as well. So
the question is exactly how many miles?

The first milestone for vehicular automation is not perfection, but to simply
be better, and in the worst case - comparable, to humans. Given the media's
tendency to try to make big news out of every single self driving vehicle
crash, it seems like we're already approaching (if not passing) this first
milestone, as these articles are relatively rare while these vehicles have
driven hundreds of millions of miles in autopilot. I can't find the latest
number, but back in October 2016 Tesla already had 222 million miles under
autopilot. It's safe to assume it's now some substantial multiple of that.
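
The arithmetic being asked for is trivial; a minimal sketch, using the 222
million Autopilot miles cited above and a purely hypothetical incident count
(the real count is unknown):

```python
def incidents_per_million_miles(incidents, miles):
    # Normalize a raw incident count by exposure so fleets of very
    # different sizes can be compared at all.
    return incidents / (miles / 1_000_000)

# 222 million Autopilot miles is the October 2016 figure; the incident
# count of 15 is purely hypothetical, for illustration only.
rate = incidents_per_million_miles(15, 222_000_000)
print(round(rate, 3))  # 0.068
```

The same normalization has to be applied to the human baseline before any
comparison is meaningful.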

Some time it would be interesting to see media releases around the time we
transitioned from horses/carriages to automobiles, to see how the state of the
media has changed. Or perhaps it has always been this way. _"I will add, that
the man who never looks into a newspaper is better informed than he who reads
them; inasmuch as he who knows nothing is nearer to truth than he whose mind
is filled with falsehoods & errors." - Thomas Jefferson, 1807_

------
cameldrv
Tesla Autopilot doesn't see stationary objects. It sees moving objects with
its radar, and tries to avoid hitting them. It sees lane markings with its
camera, and it tries to stay between them. Sometimes lane markings are
confusing, worn, or inconsistent. If that happens, your Tesla will smash you
at 70mph into a concrete barrier/police car. It will not slow down if it is
uncertain about the lane markings, it will not beep if there is a stationary
object ahead.
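
The moving-objects-only behaviour is usually attributed to Doppler clutter
rejection in automotive radar. A toy sketch (entirely illustrative, not
Tesla's code) of why a stationary obstacle falls out of the track list:

```python
def filter_radar_returns(returns, ego_speed, tol=1.0):
    """Toy Doppler clutter filter: a return whose closing speed equals the
    ego speed is stationary in the world frame and is discarded as likely
    clutter (overpasses, signs, parked cars)."""
    tracked = []
    for r in returns:
        world_speed = ego_speed - r["closing_speed"]  # 0 => stationary object
        if abs(world_speed) > tol:
            tracked.append(r)
    return tracked

returns = [
    {"id": "lead car", "closing_speed": 5.0},         # slower car ahead: kept
    {"id": "parked cruiser", "closing_speed": 31.0},  # stationary: dropped
]
print([r["id"] for r in filter_radar_returns(returns, ego_speed=31.0)])
# ['lead car']
```

Anything with zero world-frame speed is indistinguishable from roadside
clutter in the radar return alone, so it gets thrown away.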

------
djsumdog
Let's make the roads out of rubber and the wheels out of pavement. I feel like
this is the backwards mentality with autonomous vehicles. I wrote a while ago
about how they simply cannot solve the transportation problem, at least not in
a way that's cheaper than just building mass transit:

[https://penguindreams.org/blog/self-driving-cars-will-not-
so...](https://penguindreams.org/blog/self-driving-cars-will-not-solve-the-
transportation-problem/)

I realize this particular case is different with us talking about
assistance/safety features, but I still feel we're going the wrong direction
here. These features reduce your awareness. They're like a teenage driver you
constantly have to monitor, but worse.

They teach people to be comfortable and less alert with systems that have
potentially fatal bugs that we can't yet identify.

We can build mass transit now, at a fraction of the cost of developing the
tech for automated vehicles. There are so many corner cases, with highways,
city driving conditions that simply cannot pan out for autonomous vehicles
without a lot of testing; and a lot of people getting injured and dying.

I wish people would shelve this pipe dream for now. Maybe once we have really
solid rail transport again in the US, like we used to decades ago, and rich
and poor people have a means to get to work even in smaller cities, then we
can work on frivolous junk that will benefit the few who can afford it. Right
now we just run the risk of killing more people who can't.

~~~
tim333
We've had mass transit for a century. I don't see much harm in trying
something new that may save a million lives per year.

~~~
djsumdog
But we haven't, not in America. Sure the rest of the world does, but we've
lost most of our mass transit over the past century.

Knowing you can get on a train at 2am and make it home does play a significant
role in reducing drunk driving, and can benefit people of all income levels.

------
shanghaiaway
Another Autopilot crash. The list so far:

1. Hebei, China -- fatal crash.
[https://jalopnik.com/two-years-on-a-father-is-still-fighting-tesla-over-aut-1823189786](https://jalopnik.com/two-years-on-a-father-is-still-fighting-tesla-over-aut-1823189786)
2. Florida -- fatal; Joshua Brown crashes into a truck.
[https://jalopnik.com/tesla-driver-in-fatal-florida-crash-got-numerous-warnin-1796226021](https://jalopnik.com/tesla-driver-in-fatal-florida-crash-got-numerous-warnin-1796226021)
3. Mountain View -- fatal Model X crash.
[https://www.mercurynews.com/2018/03/30/tesla-autopilot-was-on-during-deadly-mountain-view-crash/](https://www.mercurynews.com/2018/03/30/tesla-autopilot-was-on-during-deadly-mountain-view-crash/)
4. Reference to an Autopilot crash in Hayward.
[http://abc7news.com/automotive/i-team-exclusive-tesla-crash-in-september-showed-similarities-to-fatal-mountain-view-accident/3302389/](http://abc7news.com/automotive/i-team-exclusive-tesla-crash-in-september-showed-similarities-to-fatal-mountain-view-accident/3302389/)
5. You You Xue crashes in Greece.
[http://abc7news.com/automotive/millbrae-driver-says-tesla-model-3-was-in-autopilot-when-it-crashed-in-greece/3525746/](http://abc7news.com/automotive/millbrae-driver-says-tesla-model-3-was-in-autopilot-when-it-crashed-in-greece/3525746/)
6. Utah Autopilot crash; Tesla driver and firetruck driver injured.
[https://www.sfgate.com/business/technology/article/Unclear-if-Tesla-s-Autopilot-engaged-in-crash-12913423.php](https://www.sfgate.com/business/technology/article/Unclear-if-Tesla-s-Autopilot-engaged-in-crash-12913423.php)
7. Texas Autopilot crash -- "Car crashed, then continued to accelerate."
[https://www.dallasnews.com/business/autos/2016/08/25/dallas-man-says-tesla-crashed-continued-accelerate-using-autopilot-kaufman](https://www.dallasnews.com/business/autos/2016/08/25/dallas-man-says-tesla-crashed-continued-accelerate-using-autopilot-kaufman)
8. SF VC Autopilot crash -- she took over control just before the crash, and Tesla blamed her for being in control when it happened. In her words: "It's your fault if you take control; it's your fault when you do not take control."
[https://www.digitaltrends.com/cars/tesla-autopilot-related-crash-dispute-california/](https://www.digitaltrends.com/cars/tesla-autopilot-related-crash-dispute-california/)
9. Laguna Canyon Road crash into a parked vehicle; driver injured.
[https://twitter.com/LBPD_PIO_45/status/1001541486146547717](https://twitter.com/LBPD_PIO_45/status/1001541486146547717)
10. Fender-bender crash video on YouTube.
[https://www.youtube.com/watch?v=qQkx-4pFjus&feature=youtu.be](https://www.youtube.com/watch?v=qQkx-4pFjus&feature=youtu.be)
11. Wisconsin Model X crash.
[https://www.carcomplaints.com/news/2018/insurance-company-sues-tesla-model-x-crash.shtml](https://www.carcomplaints.com/news/2018/insurance-company-sues-tesla-model-x-crash.shtml)
12. Culver City crash into a fire truck.
[http://abc7.com/traffic/2-federal-agencies-investigate-tesla-crash-in-culver-city/2985342/](http://abc7.com/traffic/2-federal-agencies-investigate-tesla-crash-in-culver-city/2985342/)
13. Autopilot crash in Germany.
[https://www.reuters.com/article/tesla-germany-crash/tesla-crashes-into-bus-in-germany-driver-says-used-autopilot-idUSL8N1C55Y7](https://www.reuters.com/article/tesla-germany-crash/tesla-crashes-into-bus-in-germany-driver-says-used-autopilot-idUSL8N1C55Y7)
14. Model X crashes into a semi in California; not many details.
[https://electrek.co/2017/03/27/tesla-model-x-autopilot-crash-semi/](https://electrek.co/2017/03/27/tesla-model-x-autopilot-crash-semi/)
15. Laguna Canyon Road crash into a semi at the same spot as (9), April 10, 2017.
[http://ktla.com/2018/05/29/tesla-on-autopilot-crashes-into-laguna-beach-police-patrol-vehicle/](http://ktla.com/2018/05/29/tesla-on-autopilot-crashes-into-laguna-beach-police-patrol-vehicle/)

There are probably more crashes that have not been reported prominently and
not disclosed by Tesla either.

~~~
askafriend
This list is misguided.

How many crashes do we have from people using regular cruise control features?
Is the incident rate per mile when using Autopilot from Tesla higher or lower
than regular cruise control?

Those are the right questions in my opinion.

~~~
shanghaiaway
Why are you asking me? Ask Tesla! And they won't answer. Don't be a fan boy.

~~~
askafriend
My point was that your list doesn't answer the most important questions and
trying to proxy any information from a selected list like that is flawed
thinking.

~~~
shanghaiaway
Tesla does not answer the most important questions about their own product.

~~~
askafriend
That's a fair position to hold, I concede that.

------
mirimir
I can't figure out the geometry of this accident. It looks like the Tesla hit
the cruiser near the left rear wheel. And the cruiser was pushed sideways over
the curb. But its right rear tire looks to be inflated.

So why would the Tesla have been heading toward the roadside? Was the cruiser
perhaps parked in the middle of the roadway?

------
lph
How does Tesla fail so badly at basic collision avoidance?? Surely the LIDAR
is able to detect an imminent collision, but at least in the recent fire truck
and lane-split barrier incidents, the cars didn't even slow down?! That seems
really fundamental.

Edit: so they don't use lidar... Is this a machine vision failure, then?

~~~
Cogito
My understanding is that they detect moving obstacles much more easily than
stationary ones.

This is because almost everything in the environment is stationary to the car,
so it is hard to distinguish between stationary things in your way and
stationary things just off to the side.

So if the car in front of you slows down, it's obvious that you'll hit it if
you don't slow down.

A car parked partially in your lane might just look like regular noise to the
sensors. That is, using the current technology, if they (correctly) classified
the parked car as an obstacle then they would (incorrectly) classify so many
non-dangerous situations as obstacles so as to make the system infeasible; the
car would emergency brake constantly as it drives down the road.

Some would say that until those obstacles can be classified correctly such
technology should not be used at all, but I think it's probably doing more
good than harm at the moment and hopefully a good solution can be found
quickly. Even if they do fix this problem soon, I'm sure we'll hear plenty of
stories where something goes wrong anyway, but hopefully they won't seem so
obviously avoidable (who hits a parked car?)
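One commonly cited mechanism behind this tradeoff is velocity-based clutter rejection: returns that appear stationary relative to the ground are dominated by signs, barriers, and roadside parked cars, so they get filtered out. A toy sketch of that idea (purely illustrative; the speeds, threshold, and function are made up, and this is not any vendor's actual pipeline):

```python
# Toy sketch of velocity-based clutter rejection, one reason driver-assist
# stacks struggle with stationary obstacles. All numbers are illustrative.

def is_tracked(closing_speed, ego_speed=25.0, threshold=2.0):
    """Track a radar return only if the object moves relative to the ground.

    closing_speed: how fast the return approaches us, in m/s.
    A stationary object closes at exactly ego_speed, so its implied
    ground speed is ~0 and it gets discarded as roadside clutter.
    """
    ground_speed = abs(ego_speed - closing_speed)
    return ground_speed > threshold

# A car ahead doing 15 m/s closes at 10 m/s -> tracked.
print(is_tracked(10.0))   # True
# A parked car (or stopped fire truck) closes at ego speed -> dropped,
# which is exactly the failure mode described above.
print(is_tracked(25.0))   # False
```

Raising the threshold drops more real obstacles; lowering it floods the tracker with roadside clutter, which is the "emergency brake constantly" scenario.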

~~~
s_m_t
How dumb of an idea is it to attach the camera/lidar/what have you to a 2 axis
rail and move it around at a high speed to introduce some parallax movement to
the scene?

~~~
Cogito
I assume you could achieve the same thing with less complexity and more
cheaply using a fixed system.

Either having more than one sensor or using mirrors or something.

In fact, I'm sure that the camera systems in place are already doing this to
some degree. I think it's more that it's not yet good enough at picking out
these stationary objects that are in the path of travel.

------
SriniK
I really wish everyone had a chance to use the autopilot feature and judge
themselves. I use it every single day and trust the machine.

The issue is that, as with every complex system, there are a few ins and outs
that users need to know, like lane-merge buffer lines, sharp lane merges by
another vehicle in front of view, etc. Within your first few rides (fewer than
10), users can pretty much tell when it works and when it might have issues.

On labeling the feature: it is clearly marked BETA, and the wiggle prompt is
shown in pretty much all of the confusing cases, so I never miss an alert.

My only gripe is that, with all these incidents, Tesla will be forced to
update the feature to make it less user-friendly, or to remove it completely.
I genuinely feel safe and come home every single day with less stress. Elon's
tantrums are not helping the situation either.

~~~
yellow_postit
Blaming the users doesn’t seem like a winning strategy. If the system needs
users “guessing” about its behavior, maybe it should be gated behind a test,
with periodic checkpoints as the software's behavior changes.

------
calgoo
I liked a comment from the hosts of The Grand Tour (paraphrasing): "When the
owners of the car companies that promote these self-driving cars will allow
the cars to drive them along the Yungas Road in Bolivia without a steering
wheel, then it can be classified as safe."

------
ryeguy_24
This is another problem with the “imperfect specialist”. Fortunately and
unfortunately, we don’t need to grow our own vegetables anymore. Most of us
have adopted the concept of buying vegetables from “specialists” who grow them
more efficiently than we can. So we use these specialists and detach
ourselves from the growing process altogether. Then we receive a batch of
romaine lettuce that comes with E. coli and die. This doesn’t happen often,
but we’ve delegated this task to someone else and inherited the imperfections
of their process. I’d argue vegetable growing is pretty damn safe these days;
I’m OK with that risk. But I don’t think Tesla is anywhere near worth the
delegation of my driving. I’ll keep my brain turned on for now.

------
sundvor
Step 1: Stop calling it autopilot.

------
vblord
In the car's defense... I think the police cruiser looked a lot like a highway
median.

------
shyn3
The fix is to implement self driving bumper cars.

------
Tomte
I wish all those advanced assist functions in today's upper middle-class cars
would trickle down to cheaper cars faster. That would give us much more safety
now.

It seems Tesla's Autopilot cannot reliably do much more than conventional lane
assist, etc.

Tesla could disable autonomous lane changes and autonomous changes of road,
requiring driver input for all changes of direction (everything that is not
smoothing out and staying in the lane), and it would still be really good.

But Tesla needs the impression of being game-changing, not just best-in-class,
for financial-market reasons.

I believe authorities should order recalls and force Tesla to disable all
traces of autonomy.

~~~
Maxion
The thing is, a Tesla crashing while running on Autopilot is more newsworthy
than a Tesla crashing while being driven by a human. We should not let
anecdotal evidence drive our fears.

~~~
Tomte
I don't see how this comment could possibly be a reply to mine.

------
alphabettsy
Tesla and the driver are at fault here. Tesla for not prohibiting operation on
the types of roadways where it is not designed to be used and the driver for
using it where it’s not designed to be used.

------
iamgopal
Why they not partner with Google? I think they have much better talent pool at
AI to handle this. I hope they are using all the crash data to make it more
robust.

~~~
wmf
Because Google/Waymo is using lidar which adds ~$150K to the cost of the car.

~~~
brandon
The cost of the Lidar hardware was down to $7,500 well over a year ago and may
have come down even more in the intervening time:
[http://www.businessinsider.com/googles-waymo-reduces-
lidar-c...](http://www.businessinsider.com/googles-waymo-reduces-lidar-
cost-90-in-effort-to-scale-self-driving-cars-2017-1)

------
paul7986
Pay through the nose to be a billionaire's guinea pig so he can go down in
history as a pioneer of robot cars while you go up in flames... HELL NO!!!

------
TekMol
All I see is this:

"Unfortunately, our website is currently unavailable in most European
countries. We are engaged on the issue and committed to looking at options
that support our full range of digital offerings to the EU market. We continue
to identify technical compliance solutions that will provide all readers with
our award-winning journalism."

Maybe we should develop a habit here on HN of not linking to geo-fenced sites?

------
TeeWEE
"Unfortunately, our website is currently unavailable in most European
countries. We are engaged on the issue and committed to looking at options
that support our full range of digital offerings to the EU market. We continue
to identify technical compliance solutions that will provide all readers with
our award-winning journalism."

I can't read the article. Weird. Is this because of GDPR?

~~~
exolymph
Yep, it's because of GDPR.

------
DanBC
> The Palo Alto-based automaker, led by Elon Musk, has said it repeatedly
> warns drivers to stay alert, keep their hands on the wheel and maintain
> control of their vehicle at all times while using the Autopilot system.

That Tesla is taking so long to learn what "human factors" are is painful,
especially since HF has been part of other industries for decades.

------
jijojv
Also discussed at the RealTesla
[https://www.reddit.com/r/RealTesla/comments/8n20d9/this_morn...](https://www.reddit.com/r/RealTesla/comments/8n20d9/this_morning_a_tesla_sedan_driving_outbound/)

~~~
jumelles
I never knew about that subreddit, thanks!

------
flyGuyOnTheSly
My sinister side is tingling...

How convenient for non-autopilot auto manufacturers... to have a competitor
out there boasting semi-autopilot capabilities...

I wonder how many of these "autopilot crashes" could be machinations... if not
already, then in the future perhaps?

~~~
cm2012
Or, the incredibly more likely and predicted option is that you shouldn't
promote a self driving car that requires constant attention from the driver.
Comments on every HN thread have literally warned about this for _years_.

------
dingo_bat
To successfully develop self-driving cars without spending NASA-moon-mission
levels of time and money, we need to put a value on a human life. Maybe even
different values for passengers, drivers, pedestrians, other motorists,
police officers, etc. Then the risk, time, and money to spend are easily
defined. If some task will take a software engineer (paid $150k/year) half a
year to develop but only increases safety enough to save one pedestrian per
decade, and a pedestrian is valued at $50k, then maybe the dev can work on
something else. The numbers are made up just to illustrate the point. But this
is how Tesla, or any other self-driving-car maker, can quickly bring self-
driving tech to the public in a cost-effective manner and within reasonable
time frames.
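The comment's arithmetic can be made explicit. Using only its own made-up numbers, a sketch (the function and both scenarios are illustrative, not a real valuation methodology):

```python
# Back-of-envelope cost-benefit check for a safety feature, using the
# comment's made-up numbers. Not a real methodology.

def net_benefit(engineer_salary, dev_years, lives_saved, value_of_life):
    """Expected value of building the feature: lives saved minus dev cost."""
    dev_cost = engineer_salary * dev_years
    return lives_saved * value_of_life - dev_cost

# Comment's numbers: $150k/yr engineer, half a year of work, 1 pedestrian
# saved per decade, pedestrian valued at $50k -> don't build it by this metric.
print(net_benefit(150_000, 0.5, 1, 50_000))      # -25000
# With a value of a statistical life near $6M, the same task clears easily.
print(net_benefit(150_000, 0.5, 1, 6_000_000))   # 5925000
```

The sign flip shows how completely the conclusion hinges on the value-of-life input, which is the reply's objection below.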

~~~
rubidium
Your guessed numbers, even though made up, show a concerningly low regard for
human life.

Since they're required to, the US Department of Transportation puts it at just
over $6 million.

Priceless is of course the right answer, but engineering-wise you're in the
$5-10 million range.

------
mannykannot
That disclaimer wears thinner each time someone does something stupid like
this. Tesla can keep repeating it until a third party gets killed or maimed,
or it can act responsibly.

------
rasz
"Elon Musk, has said it repeatedly warns drivers to stay alert"

Its AutoPilot, why the hell are all those people expecting it to Pilot the car
Auto matically? ....

------
itchyjunk
Just like humans are self driving and self crashing the car, "auto-pilot" is
doing the same. Machines are too good at learning from humans. /s

------
soziawa
> Cota said. > Cota said. > Cota said.

Three times in a row, has this article been written by a robot? It doesn't
feel like something a human would write.

------
Shivetya
They need to rein this in fast before the NHTSA forces their hand. Supposedly
they're already on that agency's bad side, and the last thing Tesla needs is a
big recall or stop-sale. They will upset some of their fans, but in the end it
may be safer to restrict its usage via OTA update.

~~~
jessaustin
Yes, Tesla is leveraged such that it is far more vulnerable to a crisis of
confidence than any of the traditional manufacturers.

------
mymythisisthis
When will the right mirror be replaced with a camera and a dedicated screen?
Until the 1980s, cars didn't even have a mirror on the right side. Getting rid
of the mirror would make cars more aerodynamic. It would also make cars safer:
no more knocking over cyclists with the right-side mirror. And it would make
cars cheaper to maintain, since mirrors are expensive to replace.

------
mrtest002
In the last 24 hours, about 100 people died in car crashes, yet two Autopilot
crashes a year apart draw "why does this keep happening?" I realize the
numbers are not directly comparable. Still, 100 people dying a day is seen as
an act of God, while a 100% driverless society with one person dying a week
would probably get these cars banned.
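For scale, the comment's own illustrative numbers side by side (these are the comment's hypotheticals, not real statistics):

```python
# Illustrative only: the comment's two scenarios on a per-week basis.
status_quo_per_week = 100 * 7   # ~100 road deaths per day today
driverless_per_week = 1         # the hypothetical all-driverless fleet

print(status_quo_per_week)                             # 700
print(status_quo_per_week // driverless_per_week)      # 700x fewer deaths
```

Even a 700-fold improvement, the comment argues, might not survive the different standard applied to machine-caused deaths.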

------
xfactor973
I could be wrong but that looks like an autopilot 1 car which is an older
version of the tech that they're not really developing anymore. The newer
autopilot 2 and 2.5 is what they're going forward with for now. That also
doesn't look like a highway and autopilot isn't designed to handle that yet.

------
aviv
Tesla made a huge mistake naming this feature "Autopilot".

~~~
readhn
ditto

------
SCAQTony
More overpromising, with deadly consequences. He has overpromised on delivery
dates and Autopilot's abilities, and he has underestimated capital
requirements and the timelines for trips to Mars.

Perhaps Tesla should consider a new CEO, with Elon moving to a more
figurehead position?

------
__m
Elon Musk: „Fake News!“

------
hagreet
Unavailable in most European countries. #GDPRproductivity

------
torgian
Sounds like Tesla has the right idea?

------
SpecialistEMT
Acab :D

------
readhn
How is the Autopilot feature still allowed to be called "autopilot"? It
mistakenly leads people to believe that the car can drive itself. The name
literally means "a system used to control the trajectory of an aircraft
without constant 'hands-on' control by a human operator being required."

After multiple deaths, the head of marketing at Tesla should now be criminally
prosecuted (or whoever is the big shot who signed off on this name).

~~~
Filligree
Its capabilities are about on par with an airplane autopilot.

I have a lot of problems with Tesla, including the way they market this, but
the _name_ is accurate.

~~~
dirkgently
Well, then they should also provide the same rigorous training that pilots
receive, or make Teslas available only to those who are equally trained.

~~~
Filligree
That, I can agree with. One might also argue that a mere autopilot does not
imply good enough capabilities to work in a cluttered, dense environment such
as roads.

------
fareesh
Autopilot is like the steering wheel, brake, stickshift, etc. You don't get to
blame an accident on it. Airplanes have autopilot as well. We live in a world
where movies like jackass are made, based on real people and their attitudes.
It's not reasonable to suggest that safety labels or terminology can fix
everything. Some people are plain irresponsible and foolish.

------
TeeWEE
Everybody is so hard on Tesla. And I do agree that the name "AutoPilot" gives
the wrong impression of what it can do.

But think about it like Elon Musk would: if AutoPilot is _statistically_ safer
than a car without AutoPilot, then it's good enough to be sold on production
cars. Tesla cars are statistically safer than normal cars.

Of course, picking one crash and saying AutoPilot sucks is not statistically
sound. I do agree that Tesla should improve on this. But crashing into a
parked car probably means the driver was playing too much with his phone or
the car's touch screen. Tesla should have systems in place that detect when
the driver is not paying attention (something better than touching the
steering wheel).

