
Tesla issues strongest statement yet blaming driver for deadly crash - otalp
http://abc7news.com/automotive/exclusive-tesla-issues-strongest-statement-yet-blaming-driver-for-deadly-crash/3325908/
======
Animats
Tesla's system demands _more_ from drivers than manual driving. The driver has
to detect automation failures after they occur, but before a crash. This
requires faster reaction time than merely avoiding obstacles.

Here's a good example - a Tesla on autopilot crashing into a temporary road
barrier which required a lane change.[1] This is a view from the dashcam of
the vehicle behind the Tesla. At 00:21, things look normal. At 00:22, the
Tesla should just be starting to turn to follow the lane and avoid the
barrier, but it isn't. By 00:23, it's hit the wall. By the time the driver
could have detected that failure, it was too late.
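
To put rough numbers on that window, here's a back-of-the-envelope sketch in
Python. The speed and reaction-time figures are illustrative assumptions on my
part, not measurements from the video:

    # Distance covered during a typical perception-reaction time at freeway
    # speed. All figures are assumptions for illustration only.
    MPH_TO_MPS = 0.44704

    speed_mps = 65 * MPH_TO_MPS   # ~29 m/s
    reaction_s = 1.5              # commonly cited perception-reaction time
    window_s = 2.0                # roughly 00:21 "normal" to 00:23 impact

    print(speed_mps * reaction_s) # ~44 m travelled before any corrective input
    print(speed_mps * window_s)   # ~58 m: the entire failure window
    # Perception-reaction alone consumes roughly three quarters of the window,
    # leaving almost no distance in which to actually steer or brake.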

Big, solid, obvious orange obstacle. Freeway on a clear day. Tesla's system
didn't detect it. By the time it was clear that the driver needed to take
over, it was too late. This is why, as the head of Google's self driving
effort once said, partial self driving "assistance" is inherently unsafe. Lane
following assistance without good automatic braking kills.

This is the Tesla self-crashing car in action. Tesla fails at the basic task
of self-driving - not hitting obstacles. If it doesn't look like the rear end
of a car or truck, it gets hit. So far, one street sweeper, one fire truck,
one disabled car, one crossing tractor trailer, and two freeway barriers have
been hit. Those are the ones that got press attention. There are probably more
incidents.

Automatic driving the Waymo way seems to be working. Automatic driving the
Tesla way leaves a trail of blood and death. That is not an accident. It
follows directly from Musk's decision to cut costs by trying to do the job
with inadequate sensors and processing.

[1]
[https://www.youtube.com/watch?v=-2ml6sjk_8c](https://www.youtube.com/watch?v=-2ml6sjk_8c)

~~~
maxxxxx
" Tesla's system demands more from drivers than manual driving. The driver has
to detect automation failures after they occur, but before a crash. This
requires faster reaction time than merely avoiding obstacles."

I have been thinking that too. At what point do you decide that the autopilot
is making a mistake and take over? That's an almost impossible task to perform
within the available time.

~~~
corney91
If I was driving a car like that, I think I'd feel safer if there were a
"confidence meter" available. I have no idea how these auto-drive systems
work, but I'm guessing there's some metric for how confident the car is that
it's going in the correct direction. Exposing that information as a basic
percentage meter on the dash somewhere would make me feel more confident
about using the auto-drive.

I can understand why this wouldn't happen from a business perspective, and
it's also presumably not as simple to implement as I'm implying, but I can't
think of a better way to get around the uncertainty of whether the car's
operating in an expected way or not.

~~~
shawabawa3
If they had the ability to measure confidence in the system, they would use it
to issue warnings or disable autopilot.

The "confidence meter" would almost certainly be at 100% right up until it
crashes into an obvious obstacle.

~~~
corney91
> If they had the ability to measure confidence in the system, they would use
> it to issue warnings or disable autopilot

Yeah, that's basically what I'm asking for. Maybe a warning light at a certain
threshold would be a better default; personally, I'd still find a number more
trustworthy, though.

> The "confidence meter" would almost certainly be 100% right up until it
> crashes into an obvious obstacle

If image recognition algorithms have associated confidence levels, I'd be
surprised if something more complicated like road navigation was 100% certain
all the time.

~~~
salawat
The problem is, given a neural network, the decided course of action IS at
100% confidence.

Unless you design the network to specifically answer "how similar is this?"
and pair it with a training set and the output of another executing neural
net, a "confidence meter" isn't possible.

This is one of the trickier bits to wrap one's mind around with neural
networks. The systems we train for specialized tasks have no concept of
'confidence' or being wrong. There is no meta-awareness that judges how well a
task is being performed in real-time outside of the training environment.

Humans don't suffer from this issue (as much) due to the mind-bogglingly large
and complex neural nets we have for dealing with the world. Every time you set
yourself to practicing a task, you are putting a previously trained neural
network through further training and evolution. You can recognize when you are
doing it because the process takes conscious effort. You are not 'just doing
<the task>'; you are 'doing <the task> and comparing your performance against
an ideation or metric of how you WANT <the task> to be performed'. This
process is basically your prefrontal cortex and occipital lobe, for instance,
tuning your hippocampus and your sensory and motor cortices to perfect a
tennis swing.

When we train visual neural networks, we're talking about the level of
intelligence supplied by the occipital lobe and hippocampus alone. Imagine
that every time you hear about a neural net, a human was lobotomized until
they could perform ONLY that task with any reliability.

Kinda changes the comfort level with letting one of these take the wheel,
doesn't it?

Neural nets are REALLY neat. Don't get me wrong. I love them. Unfortunately,
the real world is also INCREDIBLY hard to safely operate in. Many little
things that humans 'just do' are the results of our own brains 'hacking'
together older neural nets with a higher level one.

~~~
Eruditass
Have you actually worked on applying neural networks in the real world?

Calibrating the probabilities of machine learning algorithms is an old
problem. By the nature of discriminative algorithms and increasing model
capacities, yes, training typically pushes outputs to one extreme. But a ton
of information is still maintained, and it can be properly calibrated for
downstream ingestion; anyone actually trying to integrate these into real
applications should be doing so.

Most recently:

[https://arxiv.org/abs/1706.04599](https://arxiv.org/abs/1706.04599)
[https://arxiv.org/abs/1802.03916](https://arxiv.org/abs/1802.03916)
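
For anyone curious what that calibration looks like in practice, temperature
scaling from the first paper is a one-parameter post-hoc fix. Here's a minimal
sketch in NumPy, with a toy grid search standing in for the paper's LBFGS
optimization (a sketch of the idea, not production calibration code):

    # Temperature scaling (Guo et al., arXiv:1706.04599): fit one scalar T on
    # held-out data so softmax(logits / T) yields calibrated probabilities
    # instead of overconfident ones.
    import numpy as np

    def softmax(z):
        z = z - z.max(axis=1, keepdims=True)  # numerical stability
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)

    def nll(logits, labels, T):
        """Average negative log-likelihood of true labels at temperature T."""
        p = softmax(logits / T)
        return -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()

    def fit_temperature(val_logits, val_labels):
        """Pick the temperature that minimizes NLL on a validation set."""
        return min(np.linspace(0.5, 5.0, 91),
                   key=lambda T: nll(val_logits, val_labels, T))

    # Usage: T = fit_temperature(val_logits, val_labels), then report
    # softmax(test_logits / T).max(axis=1) as the "confidence meter" value.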

------
tc
Tesla probably shouldn't be saying anything about this at all, even just to
avoid giving it more news cycles. But if they were going to say something,
here's what they should have said the first time.

----

We take great care in building our cars to save lives. Forty thousand
Americans die on the roads each year. That's a statistic. But even a single
death of a Tesla driver or passenger is a tragedy. This has affected everyone
on our team deeply, and our hearts go out to the family and friends of Walter
Huang.

We've recovered data that indicates Autopilot was engaged at the time of the
accident. The vehicle drove straight into the barrier. In the five seconds
leading up to the crash, neither Autopilot nor the driver took any evasive
action.

Our engineers are investigating why the car failed to detect or avoid the
obstacle. Any lessons we can take from this tragedy will be deployed across
our entire fleet of vehicles. Saving other lives is the best we can hope to
take away from an event like this.

In that same spirit, we would like to remind all Tesla drivers that Autopilot
is not a fully-autonomous driving system. It's a tool to help attentive
drivers avoid accidents that might have otherwise occurred. Just as with
autopilots in aviation, while the tool does reduce workload, it's critical to
always stay attentive. The car cannot drive itself. It can help, but you have
to do your job.

We do realize, however, that a system like Autopilot can lure people into a
false sense of security. That's one reason we are hard at work on the problem
of fully autonomous driving. It will take a few years, but we look forward to
some day making accidents like this a part of history.

~~~
patcheudor
> It's a tool to help attentive drivers avoid accidents that might have
> otherwise occurred.

This needs far more discussion. I just don't buy it. I don't believe that you
can have a car engaged in auto-drive mode and remain attentive. I think our
psychology won't allow it. When driving, I find that I must be engaged; on
long trips I don't even enable cruise control, because taking the accelerator
input away from me is enough to cause my mind to wander. If I'm not in control
of the accelerator and steering while simultaneously focused on threats,
including friendly officers attempting to remind me of the speed limit, I
space out fairly quickly. In observing how others drive, I don't think I'm
alone. It's part of our nature. So then, how is it that you can have a car
driving for you while simultaneously remaining attentive? I believe the two
are so mutually exclusive as to make it ridiculous to claim that such a thing
is possible.

~~~
fapjacks
I have to preface my post to say that I think developing self-driving
automobiles is so important that it's worth the implied cost of potentially
tens of thousands of lives in order to perfect the technology, because that's
what people do: make sacrifices to improve the world we live in so that future
generations don't have to know the same problem. But I think you're right. I
think the "best" way to move forward until we have perfected the technology is
not something that drives for you, but something that will completely take
over the millisecond the car detects that something terrible is about to
happen. People will be engaged because they _have_ to be engaged, to drive the
car. The machine can still gather all the data and ship it off to HQ to
improve itself (and compare its own decisions to those of the human driver,
which IMO is infinitely more valuable). But if there's one thing the average
person is terrible at, it's reacting quickly to terrible situations. You're
absolutely right that people can't be trusted to remain actively engaged when
something else is doing the driving. Great example with the cruise control,
too.

~~~
mozumder
People love their cars too much.

Cars should just be phased out in favor of mass transit everywhere.

Yes, you can live without the convenience of your car. No really, you can.

Now think about how you would enable that to happen. What local politicians
are you willing to write to, or support, in order to enable a better mass
transit option for you? And how would you enable more people to support those
local politicians that make that decision?

This is the correct solution, since the AI solution of self-driving cars isn't
going to happen. Their fatality rates are going to remain high.

~~~
Silhouette
_Yes, you can live without the convenience of your car. No really, you can._

Maybe, but unless you can change the laws of nature, you can't build a mass
transit system that can serve everyone full-time with reasonable efficiency
and cost-effectiveness, and that's just meeting the minimum requirement of
getting from A to B, without getting into all the other downsides of public
vs. private transportation in terms of health, privacy, security, etc.

~~~
mozumder
OK. Anything else you want to make up?

Let's see what that imagination can craft.

~~~
Silhouette
There's no need to make anything up. Mass transit systems are relatively
efficient if and only if they are used on routes popular enough to replace
enough private vehicles to offset their greater size and operating costs (both
physical and financial). That usually means big cities, or major routes in
smaller cities at busier times.

Achieving 24/7 mass transit, available with reasonable frequency for journeys
over both short and long distances, would certainly require everyone to live
in big cities with very high population densities. Here in the UK, we only
have a handful of cities with populations of over one million today. That is
the sort of scale you're talking about for that sort of transportation system
to be at all viable, although an order of magnitude larger would be more
practical. All of those cities have long histories and relatively inefficient
layouts, which would make it quite difficult to scale them up dramatically
without causing other fundamental problems with infrastructure and logistics.

So, in order to solve the problem of providing viable mass transit for
everyone to replace their personal vehicles, you would first need to build,
starting from scratch or at least from much smaller urban areas, perhaps 20-30
new big cities to house a few tens of millions of people.

You would then need all of those people to move to those new cities. You'd be
destroying all of their former communities in the process, of course, and for
about 10,000,000 of them, they'd be giving up their entire rural way of life.
Also, since no-one could live in rural areas any more, your farming had better
be 100% automated, along with any other infrastructure or emergency facilities
you need to support your mass transit away from the big cities.

The UK is currently in the middle of a housing crisis, with an acute lack of
supply caused by decades of under-investment and failure to build anywhere
close to enough new homes. Today, we're lucky if we build 200,000 per year,
while the typical demand is for at least 300,000, which means the problem is
getting worse every year. The difference between home-owners and those who are
renting or otherwise living in supported accommodation is one of the defining
inequalities of our generation, with all the tensions and social problems that
follow.

But sure, we could get everyone off private transportation and onto mass
transit. All we'd have to do is uproot about 3/4 of our population, destroy
their communities and in many cases their whole way of life, build new houses
at least an order of magnitude faster than we have managed for the last
several decades, achieve total automation in our out-of-city farming and other
infrastructure, replace infrastructure for an entire nation that has been
centuries in development... and then build all these wonderful new mass
transit systems, which would still almost inevitably be worse than private
transportation in several fundamental ways.

~~~
mijamo
Why so big, though? I lived in a 25,000-person town in Sweden and did not need
a car more than a few weekends per year. There were 5 bus lines for local
transport, and long-distance buses and trains ran with quite high frequency.

And that's not taking into account the fact that a bicycle is a very viable
way to move around in cities of fewer than 200,000 inhabitants.

I have actually never owned a car; I just rent one once in a while to go
somewhere regular transport doesn't get me. I have lived in Sweden, France and
Spain, in 10 cities from 25,000 to 12 million inhabitants. I never felt
restricted. I actually feel much more restricted when I drive, because I have
to worry about parking, which is horrible in both Paris and Stockholm. Many
people I know, even in rural Sweden or France, don't own a car because it is
just super costly and the benefit is not worth it. It's very much a
generational thing, though, because my friends are mostly around 26-32,
whereas nearly all the people I know over 35 own a car, even if they don't
actually have that much money and sometimes complain about it.

~~~
Silhouette
You've almost answered your own question, I think. Providing mass transit on
popular routes at peak times is relatively easy. It's more difficult when you
need to get someone from A to B that is 100 miles away, and then back again
the same day. It's more difficult when you are getting someone from A to B at
the start of the evening, but their shift finishes at 4am and then they need
to get home again.

To provide a viable transport network, operating full-time with competitive
journey times, without making a prohibitive financial loss or being
environmentally unfriendly, you need a critical mass of people using each
service you run. That generally means you need a high enough population
density over a large enough urban area that almost all routes become "main
routes" and almost all times become "busy times".

------
txcwpalpha
I used to be in awe of Teslas and I have always been a big fan of Musk, and
even after the crash I still imagined myself one day buying a Tesla. But after
this response, I've completely lost respect for the company, for Musk, and I
don't think I'll ever buy one or advocate for someone buying one.

The driver had previously reported to Tesla _7 to 10 times_ that there was an
issue with autopilot, but Tesla told him "no there is no issue". There is also
video evidence of this same issue happening in other parts of the world, under
similar road conditions. But again, Tesla's response has been "there is
no issue."

And now their response is "the driver knew there was an issue but he used
autopilot anyway"? Seriously? Either there is an issue or there isn't, Tesla.
At first you said there is no issue, and now you're saying there _is_ an
issue? And as the cherry on top, you're blaming the driver for continuing to
use the feature _even after you told him repeatedly that it's okay to use_?

For shame, Tesla. For shame.

~~~
whistlerbrk
> The driver had previously reported to Tesla 7 to 10 times

I understand this is not a popular opinion on news forums anywhere, nor do I
write this to absolve Tesla, in my view, of their poor response or of their
poor choice in naming the system 'autopilot' to begin with.

But at what point does corporate or government responsibility end and personal
responsibility start? If I knew something didn't work, and that something
could kill me, and I had reported the issue 7 to 10 times, I would be watching
like a hawk for the issue to recur, or more likely just not using it at all.

People have this insane idea that just because something isn't their fault
they don't have to take corrective action in order to avoid the consequences.
It is better to be alive than to be correct.

~~~
gamblor956
In the US, we have laws that require products to work as advertised. If they
don't, generally strict product liability applies to the manufacturer, seller,
and all middleman (in this case, they're all Tesla), regardless of whether the
customer-victim was misusing the product.

In this case, there is no evidence that the driver was misusing the car. But
there is a carload of evidence that Autopilot failed.

At this point, the only thing Tesla is accomplishing with these public
statements is adding more zeros to the eventual settlement/damage award.

~~~
greglindahl
The driver was required to remain alert.

~~~
LoSboccacc
The car should then pull itself over to the emergency lane and stop when user
disengagement is detected.

~~~
Matheus28
That's a very naive comment. What if there's no emergency lane? Should it just
crash towards whatever is on the right side of the car? :)

Tesla's autopilot already slows down and then stops if the user doesn't touch
the steering wheel for a period of time, but it gives plenty of warning to the
user before that happens, as it should.

~~~
LoSboccacc
So let me see, what are the alternatives here?

Inconveniencing the driver and making it a learning experience, vs. crashing
into concrete and killing him.

Well, maybe you are the naive one. Machinery shouldn't just give warnings. We
already learned those lessons in industrial automation settings.

The average “just user error, bro” mentality seen on Hacker News is what
scares me the most. That is not how you build safe machines, far from it,
and at this point my only hope of not being killed by an autonomous car in the
next 30 years is for the regulatory hammer to come crashing down full force on
the wanton engineers who are treating loss of life as user error.

------
jijojv
This is gonna be interesting, as apparently the "Barrier" [1] bug is only in
newer AP2 models, per the latest A/B tests done by cpddan (who did the very
first crash reproduction scenario in Chicago [2]).

[https://www.youtube.com/watch?v=WX0bR_EQ47E](https://www.youtube.com/watch?v=WX0bR_EQ47E)
Tesla Model X with AP2.5, SW 2018.6.1

[https://www.youtube.com/watch?v=6zK2Om8Q0IA](https://www.youtube.com/watch?v=6zK2Om8Q0IA)
MS 75, AP1, SW 2018.6.1

1\.
[https://www.reddit.com/r/teslamotors/comments/8a0jfh/autopil...](https://www.reddit.com/r/teslamotors/comments/8a0jfh/autopil..).
"works for 6 months with zero issues. Then one Friday night you get an update.
Everything works that weekend, and on your way to work on Monday. Then, 18
minutes into your commute home, it drives straight at a barrier at 60 MPH."

2\.
[https://www.youtube.com/watch?v=6QCF8tVqM3I](https://www.youtube.com/watch?v=6QCF8tVqM3I)

~~~
ansible
I get that the vision system might have been fooled by how the lane widens
before the split.

What I don't get is why the radar doesn't pick up the barrier itself. It
should have been getting a nice, healthy return signal from it, lots of metal
at all kinds of angles. What I'm saying is, this is not a stealthy barrier.

~~~
frafra
Radar is not active if the speed is higher than a specific threshold; it seems
it is written in the manual.

~~~
ansible
> _Radar is not active if the speed is higher than a specific threshold; it
> seems it is written in the manual._

I'm having trouble understanding why that would be the case. Why would anyone
not want the radar active when going down the highway?

It is not as if it doesn't work at highway speed...

~~~
thebluehawk
That's because he is wrong. The radar is active at all speeds. In my
experience, the radar basically only identifies cars and car shaped objects.

~~~
ansible
Radar on the Teslas, if it works like the systems I am familiar with, just
detects "things". Things that have a decent radar return. This class of things does
indeed include cars and other vehicles, but should also include walls and
such.

It is likely that they're using at least 2-D radar, so the system should be
able to discriminate between an object directly in the path of travel, and one
that is off to the side.
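
To make that discrimination concrete, here's a hedged sketch in Python. The
corridor width is an assumed number, and this is a toy geometric check, not
any real automotive radar stack:

    # With a 2-D return (range + azimuth), each target's lateral offset tells
    # you whether it sits in the travel path. Corridor width is an assumption.
    import math

    LANE_HALF_WIDTH_M = 1.8  # assumed half-width of the travel corridor

    def in_path(range_m, azimuth_deg):
        """True if the target's lateral offset falls within the corridor."""
        lateral_m = range_m * math.sin(math.radians(azimuth_deg))
        return abs(lateral_m) <= LANE_HALF_WIDTH_M

    print(in_path(80.0, 0.5))  # True: barrier dead ahead (~0.7 m offset)
    print(in_path(80.0, 6.0))  # False: sign off to the side (~8.4 m offset)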

------
eridius
I have to wonder why Tesla keeps doubling down on victim-blaming. Is this a
case of Elon Musk being insecure about criticism (since he's demonstrated a
tendency in the past to react poorly to it)? Or is Tesla trying to actually
lower their legal liability?

Edit: To clarify, in most of these crashes, it's usually a combination of
Tesla's fault and the driver's fault. Which is to say, the driver could have
avoided the crash, but Tesla could also have avoided it, and the driver's
biggest fault lies in trusting the "autopilot" more than they should. In the
case of this particular crash, I certainly agree that the driver should have
been paying more attention to the road, especially because they already knew
the autopilot had issues at that particular point, and if they were paying
attention they could have avoided the crash. However, the fact that autopilot
can't handle that barrier correctly is a problem especially because the driver
reported that exact issue to Tesla multiple times in the past (heck, Tesla
probably should have blacklisted that particular GPS location to force the
user to take control when approaching that point, if they can't handle the
barrier correctly in autopilot). Similarly, Tesla allows drivers the most
freedom to ignore the road of any of the "autopilot"-like systems, and they
continue to call their system "autopilot"¹, both of which only serve to make
this sort of crash much more likely.

¹Yes I know it's technically correct, but it doesn't match what the general
public thinks when they hear the term "autopilot". It gives the wrong
impression that the system is smarter and more reliable than it actually is.

~~~
toomuchtodo
Tesla is asserting and reinforcing that it is not liable for poor driver
decisions. If you use any vehicle safety aid while not paying attention, and
you die or kill someone while in control of the vehicle, that is your
liability.

Whether they call their system "Autopilot" is immaterial. _You_ are the
responsible party as the driver, and this is made clear both at purchase time
and during vehicle use.

~~~
watwut
What they call the system drives user expectations and trust. They call it
autopilot so that people buy it with those expectations in mind. It is
consequential, and thus a legitimate subject of critique.

~~~
Robotbeat
Calling it autopilot implies you have to pay attention. It implies it's NOT
self-driving or autonomous any more than airplanes (which virtually all have
some sort of autopilot) are.

~~~
Fricken
The name is irrelevant. What's relevant is that 56% of commercial pilots have
admitted to falling asleep while flying[1].

People suck at remaining alert and attentive when doing boring shit. So when
autopilot was all over the road and screwing up every 30 seconds, it wasn't
much of a problem. Now that it can navigate highways for an hour or more
without needing a safety critical intervention, we have a problem.

[1][http://www.bbc.com/news/uk-24296544](http://www.bbc.com/news/uk-24296544)

~~~
Robotbeat
What percent of commercial pilots also fly with a copilot?

Additionally, I'm not a professional driver and I don't have any autopilot on
my car, but I've definitely briefly fallen asleep while driving late at night.
What do you think rumble strips are for?

Such an extremely low number of (but extremely highly publicized) fatalities
is not reason enough to say autopilot is a net negative. In fact, the extreme
media focus that every fatality during autopilot causes should goad
improvement of autopilot safety until it's far better than human drivers.

(And let's not pretend that this autopilot error doesn't also happen with
human drivers... The only reason this failure was fatal is that less than
two weeks earlier, a human driver crashed into the same barrier--likely due to
the same confusion--and destroyed the crash attenuator's effectiveness, since
the highway department was slow to reset it.)

~~~
Fricken
In the article I posted it mentions that of the 56% of pilots who admitted
falling asleep, 29% also admitted to having woken up to a sleeping copilot.

There is no path from Autopilot as it is to full autonomy without doing a
major shift in strategy. Google figured out very early on that the incremental
approach is unfeasible. Most of the other big car companies at the outset were
intending to develop autonomy incrementally as well, but then they thought
about it for a few minutes.

~~~
Robotbeat
Why aren't major shifts in strategy possible? LIDAR may become very cheap
(i.e. $1000, much less than drivers paid for the full self-driving package).
Improvements in radar may also help. Requiring much lower speeds (driving like
a Waymo grandma) and specific conditions (i.e. dry, well-lit, well-mapped) are
also feasible.

As the technology improves, conditions can be relaxed for full self-driving.

The very fact that in this case, the driver had multiple problems with this
same spot, and that other drivers also noticed problems in this area, points
to an obvious mitigation strategy: flag problem areas geographically for
autopilot, develop specific strategies that address the problem, and verify it
has been fixed by examining how other Teslas on the road in "ghost" mode (i.e.
with the human driving but the software pretending to self-drive) respond to
the changes.

Tesla, due to its massive and well-connected fleet, has a lot of tools in the
toolbox to address these problems, in some cases more than Google (although I
agree Waymo's self-driving capability is currently more impressive).

~~~
sah2ed
> flag problem areas geographically for autopilot, develop specific strategies
> that address the problem, and verify it has been fixed by examining how
> other Teslas on the road in "ghost" mode

This is what I have been thinking as well, except it appears that Tesla
doesn't currently have a way to handle such scenarios.

Ideally, they should be recording when a driver makes a sharp swerve to
correct autopilot's naïve lane marker following algorithm, then use that
information to inform future drives using autopilot on that stretch by
disabling autopilot as that patch of road is approached, while they work on a
reliable fix.

~~~
Robotbeat
Tesla's new Waze-like real-time traffic system as part of its mapping software
update could help handle this. If the autopilot system (engaged or in ghost
mode) generates an error signal (either binary or some variable degree of
error) that is pushed to the traffic system along with the traffic data, it
could easily alert other drivers (and hopefully the autopilot system) of such
problems.

Considering they have enough data to do realtime traffic analysis, they should
easily have enough data to determine problem areas for the autopilot.
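
For illustration, here's a sketch of what that geographic flagging could look
like. The cell size and threshold are invented parameters, and this makes no
claim about how Tesla's fleet telemetry actually works:

    # Bucket driver-override / ghost-mode error events into coarse grid
    # cells; warn or disengage when an approaching cell crosses a threshold.
    from collections import Counter

    CELL_DEG = 0.001    # ~100 m grid cells at mid-latitudes (assumed)
    FLAG_THRESHOLD = 3  # events before a cell is a problem spot (assumed)

    event_counts = Counter()

    def cell(lat, lon):
        return (round(lat / CELL_DEG), round(lon / CELL_DEG))

    def report_event(lat, lon):
        """Fleet telemetry: a driver override or error signal fired here."""
        event_counts[cell(lat, lon)] += 1

    def autopilot_allowed(lat, lon):
        """Approaching cars check the shared map before trusting autopilot."""
        return event_counts[cell(lat, lon)] < FLAG_THRESHOLD

    # After a few fleet reports near the same gore point (hypothetical
    # coordinates), autopilot there would warn or hand back control:
    for _ in range(3):
        report_event(37.4106, -122.0750)
    print(autopilot_allowed(37.4107, -122.0751))  # False once flagged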

------
exotree
This is a very upsetting response. Yes, the driver could have paid more
attention, and yes, he shouldn't have activated Autopilot in an area where he
knew Tesla's autopilot was not working correctly.

However, Tesla calling it "Autopilot" should no longer be allowed. It's clear
that it is most certainly not that; customers are not treating this
feature in a careful and cautious fashion, and I do believe that's due to
Tesla overselling Autopilot's abilities. In addition, its warning systems are
obviously not enough to get drivers to take corrective action and ensure their
hands are on the wheel. (Why you can take your hands off the wheel at all when
engaging autopilot is a mystery to me.)

~~~
supreme_sublime
I see your point on calling it "Autopilot", but if you think about the
application of the term in terms of aviation, I think it makes more sense.
Autopilot in planes won't land the plane for you. It basically will just keep
you at a certain altitude and heading.

I think that Tesla's Autopilot function is comparable to planes. That's
probably how they will justify it. However there might be a legitimate point
in that people don't have that kind of understanding of what "Autopilot"
means.

~~~
mikepurvis
Aviation autopilots handle flying duties in a giant expanse of open sky. It's
a way, way easier job to do than navigating a highway from behind a steering
wheel.

So even if the "capabilities" may be comparable between the two systems, the
amount of supervision they require is not, so I don't think they should use
the term. It may be that "lane-keeping cruise control" isn't nearly as sexy,
but it doesn't establish unrealistic expectations which get people killed.

------
otalp
Full statement:

"We are very sorry for the family's loss.

According to the family, Mr. Huang was well aware that Autopilot was not
perfect and, specifically, he told them it was not reliable in that exact
location, yet he nonetheless engaged Autopilot at that location. The crash
happened on a clear day with several hundred feet of visibility ahead, which
means that the only way for this accident to have occurred is if Mr. Huang was
not paying attention to the road, despite the car providing multiple warnings
to do so.

The fundamental premise of both moral and legal liability is a broken promise,
and there was none here. Tesla is extremely clear that Autopilot requires the
driver to be alert and have hands on the wheel. This reminder is made every
single time Autopilot is engaged. If the system detects that hands are not on,
it provides visual and auditory alerts. This happened several times on Mr.
Huang's drive that day.

We empathize with Mr. Huang's family, who are understandably facing loss and
grief, but the false impression that Autopilot is unsafe will cause harm to
others on the road. NHTSA found that even the early version of Tesla Autopilot
resulted in 40% fewer crashes and it has improved substantially since then.
The reason that other families are not on TV is because their loved ones are
still alive."

~~~
ajross
I'm normally an automation booster, but this is the wrong PR take for sure. I
mean, to take this at face value it sounds like Tesla is saying its autopilot
doesn't actually work and that if you don't continue to drive your car you're
going to die. Maybe that will work in court, but it's not going to sell a lot
of cars.

I mean... clearly (hopefully) this isn't the attitude being exhibited _inside_
the company. We all know there's a big strike team or whatnot that's been
assembled to figure out exactly what happened, and how. We all know that it's
probably a preventable failure. So that's what we need to see, Elon. Not this
excuse making.

~~~
rootlocus
> it sounds like Tesla is saying its autopilot doesn't actually work and that
> if you don't continue to drive your car you're going to die

No, it doesn't just "sound like", it's actually

> extremely clear that Autopilot requires the driver to be alert and have
> hands on the wheel. This reminder is made every single time Autopilot is
> engaged. If the system detects that hands are not on, it provides visual and
> auditory alerts.

~~~
deelowe
> Autopilot requires the driver to be alert and have hands on the wheel

Clearly that's not the case, because if it "required" it the car would cease
to function if you took your hands off the wheel. Instead, the Tesla continues
to drive itself for some time and then after a few seconds warns you, but
that's about it. If you want to see what a real requirement looks like, go
explore any of the other car makers that have rolled out similar systems at
this point. Those really, actually do _require_ "the driver to be alert and
have hands on the wheel."

Has everyone forgotten Tesla's marketing from when they first rolled this
poorly implemented system out?

------
DannyBee
I don't get taking this position. Even if they don't want to engage on the
autopilot issue, the hands-on-wheel vs. crash-avoidance issue is a lost cause
already (i.e., in most states, they'd lose in any lawsuit on it).

If I take any 2018 model year car with front crash avoidance (outside of a
Tesla :P) and try to drive it into a barrier like this, depending on speed,
most will avoid the collision, and all of them will slow down (AFAICT - I went
through the summaries by category on iihs.org).

None of them require you have your hands on the wheel to do that.

In most jurisdictions, to show a defective design, it's enough to show the
existence of an alternative design that would have worked (it doesn't usually
even have to have been implemented), and that it is feasible, economically
feasible, and would still perform the original function. Almost all of these
cases live and die by the cost of that design and whether it's "too much".

Unfortunately for Tesla, there are tons of them here; they all almost
certainly cost _less_ than what Tesla is doing, and they all already exist. So I'm
positive any lawsuit will either settle, or they will show plenty of them
would have worked, by driving cars into an identical setup barrier, and Tesla
will lose.

(This assumes, of course, Tesla can't force them into a jurisdiction with
weird defective design rules)

------
abalone
Tesla is recklessly using its own customers to train their fleet for level 5.
AT BEST their response is: drivers need to be 100% as attentive as they would
be without autopilot. _What then is the point of autopilot,_ except to train
the fleet?

They are in complete denial that autosteering can seduce reasonable people
into lowering their guard and mentally relaxing. Again, what other immediate
user benefit would it provide? What would a reasonable person expect autopilot
to do?

Isn’t it contradictory to offer an automation feature while claiming that
legally it does ZERO automation for you?

Note: collision avoidance and emergency braking are on by default even without
autopilot. “Autopilot” here really means the autosteering feature.

~~~
deepGem
I think the phrase 'Auto-pilot' is a misnomer. It's a lead-on phrase that
confuses people into taking their eyes off the road. They should have named it
'driver assist' or something mundane that people wouldn't get confused by.
Auto-pilot, to any person, means that the car will steer automatically, which
it does, but it also kills you. At least by now, it should be obvious that a
half-baked auto-steering or auto-pilot is a colossal mistake. Admit that you
made a mistake and be done with it.

Tell people explicitly not to use auto pilot. Tesla can still use the onboard
cameras and everything else for training the AI.

~~~
abalone
Yeah, Tesla’s response to that is that it’s like a plane: autopilot means you
still have to be at the controls.

But it’s nothing like a plane. Concrete walls don’t suddenly appear in the
sky. The required reaction time to avoid an autopilot collision in some of
these videos is _seconds_.

------
timewarrior
I own a Model X. I love the car in many ways. However, I also think the
automation is stupider than a 2-year-old. Would you trust a 2-year-old to
drive your car, or to decide when to close your garage door?

I use Auto-Steer only when the road is empty and reasonably straight, and when
I am in the middle lane of a road with at least 3 lanes. Otherwise I don't
use it. It might also be fine to use it in stop-and-go traffic on a divided
highway, because there is no way you will die at those speeds even if you are
in an accident.

But sadly, some people trust the car automation with much more than what it is
capable of. This leads to accidents like these. And I would blame Tesla here.
Tesla has clearly been misleading in marketing this feature. Not everyone is
very savvy and they sometimes take what a company says at face value. I have
really smart friends who have seen the AutoPilot link
[[https://www.tesla.com/autopilot](https://www.tesla.com/autopilot)] and think
that Tesla already has these advanced features.

In the end, Tesla is just slightly more advanced than the autonomous features
of other manufacturers. However, Tesla claims to be much more advanced than
the others. Because it claims a lot more credit than it deserves, it's only
fair that it also faces criticism proportionately when the system fails.

Normally, I am against unnecessary regulation. However, in this case, I think
it is important to bring in regulations around consistent messaging and
3rd-party ratings for autonomous features from all manufacturers. We can't have "Move fast
and break things" kill people. It's not the same thing as a social app losing
a few posts.

Tesla should rename its feature to "Drive Assist+" or whatever, because that
is all it is.

On an ending note - I feel sad that the deceased continued to use Autopilot at
this spot when he already knew it had failed at the same spot multiple times.
If someone well educated, tech savvy and already aware of the problem can fall
for this, I worry that a lot more people might do worse.

~~~
cpt1138
If the usage is so subjective, I can't even understand why it's allowed. You
seem to be saying it's no better than a two-year-old driving. You create a
situation in which it's "ok" for a two-year-old to drive? We have laws that
state no 2-year-olds are allowed to be behind the wheel. Why is this
2-year-old allowed at all?

~~~
darkstar999
The same reason cruise control is allowed.

~~~
rohit2412
I think the solution is to hold Elon Musk/Tesla liable for their misleading
statements on FSD, Tesla Network, AP2 and their timelines about future
capability.

Future promises, if made, should come with renaming the product to highlight
its actual capabilities.

~~~
akvadrako
That's potentially very complicated. As long as drivers are required to
purchase insurance and insurance is required to pay out fair sums for
accidental death, the accountability should come automatically in the form of
higher rates.

~~~
rohit2412
Do you also hold the same view regarding medicines? That the FDA shouldn't
regulate medicines, and manufacturers can make insane promises? After all, we
love free market over human lives.

------
matt2000
The thing that sometimes gets lost in the debate around Tesla autopilot (IMO)
is that it's possible for self-driving cars to be theoretically better than
human drivers, AND for Tesla's implementation of that concept to be overall
worse.

From the recent accidents, it seems clear that Tesla autopilot is just not
that great in its current form. Even if it's following a line that it mistakes
for a lane marker, how is it OK that it can't stop for a large obstacle in
front of the car when given plenty of distance before a crash? How is this not
a totally unacceptable type of failure for a system like this?

~~~
thebluehawk
Autopilot is not an implementation of self-driving cars... You have a
completely wrong understanding of this system.

It's basically cruise control that has some lane keeping abilities. If you put
your (non-TACC) car in cruise control on the freeway and then ignored
everything around you, you would rear-end someone, and it's not cruise
control's fault. If you put a lane-keeping assist car (like a Tesla or others)
in that mode and completely ignored the road, it would crash eventually.
AutoPilot is cruise control with some lane keeping abilities; it is not
self-driving. You choose when to activate it (just like cruise control), and
every single time you do, it tells you to pay attention and keep your hands on
the wheel.

~~~
brain_dev
> It's basically cruise control that has some lane keeping abilities.

And this is exactly why calling it Autopilot is immoral and dangerous.

------
zaidf
_" The reason the other families aren't on TV are because their loved ones are
still alive."_

Wow, this sounds like a horrible, tasteless burn to end the statement with.
And I am a Tesla fan.

~~~
zmarty
And where exactly does it say that?

~~~
marcell
The statement is here:
[https://twitter.com/dannoyes/status/983899361779236864/photo...](https://twitter.com/dannoyes/status/983899361779236864/photo/1)

------
jijojv
[https://www.cnbc.com/2018/01/31/apples-steve-wozniak-doesnt-believe-anything-elon-musk-or-tesla-say.html](https://www.cnbc.com/2018/01/31/apples-steve-wozniak-doesnt-believe-anything-elon-musk-or-tesla-say.html)
"Man you have got to be ready — it makes mistakes, it loses track of the lane
lines. You have to be on your toes all the time," says Wozniak. "All Tesla did
is say, 'It is beta so we are not responsible. It doesn't necessarily work, so
you have to be in control.'"

~~~
axaxs
You don't beta test with human lives.

~~~
tjomk
Or you don't use beta features where the ultimate mistake can cost you your
life.

~~~
dclowd9901
People don’t get this when you call the feature “Autopilot”.

~~~
sokoloff
It is comparable to an airplane autopilot, but not to the public misconception
of an autopilot. An airplane autopilot is another complex control, not another
pilot.

If I command my autopilot to fly me into terrain or into a thunderstorm, it
will do so. If I command it to climb at a vertical speed that the engine power
can’t support, it will fly the plane into an aerodynamic stall. If I’m flying
on autopilot, I still have to pay close attention and still log pilot time as
“sole manipulator of the controls”. If I crash on autopilot, it’s still
overwhelmingly [human] “pilot error”, not “autopilot error”.

People hear autopilot and incorrectly assume it’s, “Hey George, fly me to
LaGuardia and wake me upon landing.”

------
ypzhang2
The issue with the statements from Tesla is that they seem to prioritize PR
and obfuscation over relevant factual statements.

The fact that the driver received disengagement warnings earlier in the drive
is not very relevant to the accident per se. Might as well say the driver
received multiple disengagement warnings during his ownership of the Tesla. It
is not relevant to the actual circumstances of the accident. Note that they do
not say "he received disengagement warnings right before the accident and
chose not to respond". Tesla instead says that his hands were not detected on
the wheel 6 seconds prior to the accident. Okay, so that's the case, but
obviously the Autopilot system allows that to happen, as disengagement
warnings were not active at that time; therefore THE SYSTEM WAS OPERATING
WITHIN ITS ESTABLISHED BOUNDS.

This leaves us with: the autopilot system drove into the barrier while the
driver wasn't paying attention. He was operating the vehicle within the
defined bounds of the semi-autonomous system, but he violated the expectation
to "pay attention to the road and be ready to take control". He did not
receive any system warning before the accident; he simply chose not to correct
the system's behavior before the accident happened. Everything else just seems
to be kicking up dust and obfuscating the facts of the situation.

The issue with Tesla's semi-autonomous system, and to a certain extent other
systems, is that there is a poor boundary between system responsibilities and
user responsibilities. This causes issues around the boundary where sometimes
the autonomous system will accomplish something, but sometimes it can't and
the user must. Because of this fuzzy boundary, I think it's almost more
important to invest in disengagement logic, equipment, and programming than in
pure self-driving technology. A good example of investment in this arena is
actually GM's Super Cruise, which seems to have MUCH better user engagement
detection and much sharper boundaries than systems like Tesla's.

~~~
toast0
> The issue with the statements from Tesla is that they seem to prioritize PR
> and obfuscation over relevant factual statements.

The issue with their statements is that they have no problem throwing their
customers under the bus. The proper release is simply "We are investigating
the circumstances of this tragic event in cooperation with the NTSB. something
something nice words about the family"

If pushed, refuse to discuss the issue while it's under investigation. This is
a perfectly justifiable position, lots of groups don't make statements while
an issue is under investigation; and you let the NTSB report throw the
customer under the bus.

------
imh
It should be noted that the 40% crash reduction they keep quoting comes from
an extremely simplistic analysis and occupies only a brief paragraph of the
full report.
report.
[https://www.scribd.com/embeds/337007075/content?start_page=1...](https://www.scribd.com/embeds/337007075/content?start_page=1&view_mode=scroll&access_key=key-
DgnugphNmXkiFVNAWAvV)

It's not nearly as definitive as people (and Tesla) seem to claim.

 _edit:_ The full analysis (from page 10 of the link):

>5.4 Crash rates. ODI analyzed mileage and airbag deployment data supplied by
Tesla for all MY 2014 through 2016 Model S and 2016 Model X vehicles equipped
with the Autopilot Technology Package, either installed in the vehicle when
sold or through an OTA update, to calculate crash rates by miles travelled
prior to^21 and after Autopilot installation.^22 Figure 11 shows the rates
calculated by ODI for airbag deployment crashes in the subject Tesla vehicles
before and after Autosteer installation. The data show that the Tesla vehicles
crash rate dropped by almost 40 percent after Autosteer installation.
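
The arithmetic behind that before/after comparison is trivial, which is part
of the point. Here's a sketch with made-up counts (the report does not publish
the underlying numbers):

    # Made-up counts for illustration only.
    def crashes_per_million_miles(airbag_deployments, miles):
        return airbag_deployments / (miles / 1e6)

    before = crashes_per_million_miles(130, 100e6)  # 1.30 per million miles
    after = crashes_per_million_miles(80, 100e6)    # 0.80 per million miles

    print(1 - after / before)  # ~0.38, i.e. "almost 40 percent"
    # Note what's missing: no control for road type, weather, driver mix, or
    # vehicle age -- exactly why the headline number is easy to over-read.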

~~~
waitwhatt
I would argue that the people who can afford a Tesla are rich, and thus likely
well educated and safer drivers.

Poor people with bad cars, no education and unsafe driving practices probably
have more accidents.

I don't think these stats mean much.

~~~
zaroth
I keep seeing statements to this effect, and they are incorrect. The
comparison is between _Tesla_ cars with and without the AutoPilot feature
installed.

The statistic has nothing to do with people not driving a Tesla.

“...airbag deployment crashes in the subject Tesla vehicles _before and after_
Autosteer installation. The data show that the Tesla vehicles crash rate
dropped by almost 40 percent after Autosteer installation.”

~~~
gamblor956
The bigger problem is that their study seems to have revealed that while
crashes went down, the severity of the crashes went up. People weren't dying
in their Teslas before Autosteer, but now we're looking at 4 deaths and
counting in the space of 2 years, all directly attributable to the use of
Autopilot.

~~~
zaroth
Okay, let's move the goalposts.

> People weren’t dying in their Tesla’s before Autosteer

This is false. Although Tesla fatalities tend to be a bit over the top, like
driving off cliffs or stolen-car-escape-gone-wrong situations, there have
certainly been non-Autopilot-related fatalities in Teslas.

> but now we're looking at 4 deaths and counting in the space of 2 years

This is false. There have been 2 fatalities in Teslas which are known to have
been operating with Autopilot at the time. There was also one fatality in
China where the family claims autopilot was on at the time, but Wikipedia
states this was not confirmed.

> all directly attributable to the use of Autopilot.

I take issue with “directly attributable” simply based on NTSB (a neutral 3rd
party if ever there was one) disagreeing with that. They found Autopilot was
_not_ at fault in the Florida crash. I think a neutral fact-finder would defer
to the NTSB and rank your claim as false.

But finally, you disregard the obvious flaw in your logic, which is you can’t
possibly know how many potentially fatal crashes or maimings have been avoided
due to AutoPilot, except the one thing we do know is that _airbags in Teslas
are deploying 40% less with the feature than without_.

------
TaylorAlexander
I am a huge fan of Elon Musk and a small TSLA shareholder, but there is
something about Tesla and Musk’s response to this that feels so wrong. It
seems two-faced. On the one hand, they shout about how great and safe their
technology is, and on the other hand they blame someone for dying when they
chose to rely on that technology. Is it safe or unsafe? I know it’s not that
simple, but this response still saddens me.

I know they must have people who want to bring them down through legal action.
And I think they want to establish a precedent that they are not liable in any
crashes out of a desire to protect the corporation. But still, this response
seems so cold, and so uncaring, that I am reminded that Tesla is still a
corporation that in the end must protect itself.

I want self driving cars, but not a world without compassion. Why is it that
we seem forced to choose one versus the other?

EDIT: Immediately after hitting send, two Model 3’s passed me one after the
other, before any other vehicles passed. The universe is strange.

~~~
robotkdick
I hope you're not driving... or using autopilot... while writing this
(kidding)

~~~
TaylorAlexander
Hah, nope just walking through a Mountain View parking lot.

------
loser777
I think a good litmus test in this situation is "will this end up in an
engineering ethics textbook in 10 years?" Unfortunately, Tesla seems to be
taking the Ford Pinto route here. "It's only dangerous if you aren't paying
full attention 100% of the time" doesn't sound too far off from "it's only
dangerous if you get in a rear-end collision."

Driving with autopilot sounds like being a driving instructor with your own
set of controls: you have to be ready to react at less than a moment's notice,
yet feel comfortable to hand off controls most of the time. Most people are
not trained driving instructors, so I can't see how autopilot is not
inherently unsafe.

------
coenhyde
This is a disappointing response from Tesla. I understand why they are trying
to limit liability but they fucked up. They need to take responsibility. The
autopilot actively veered towards the barrier. That's very different from
autopilot failing to avoid a danger.

------
awestley
I had an early version of the Model S with the autopilot (P85D). Once it was
made available I noticed a DECREASE in the reliability after every update.
From day one it was great. It would be glued to the car in front of it and
kept me safe. Over the three years I owed it, it only got worse. My theory was
that they tried to make the car drive too much like a human. Rather than going
safely down the middle of the lane when passing it would try to give a little
more space and thus get closer to the median. It only got worse from there. I
no longer have the Model S, but it was the best car I ever owned. That said, I
won't get another until there are huge improvements to the autopilot.

------
neya
At first, I thought the news was click bait. I watched the video fully to
understand whether a company like Tesla would really say something to this
extent.

    Tesla's auto-pilot requires the driver to be alert and have hands on the wheel.

    The crash happened on a clear day with several hundred feet of visibility ahead,
    which means the only way for this accident to have occurred is
    if Mr. Huang was not paying attention to the road.

This is the official response. I'm completely baffled and at a loss for words.
I feel very sorry for the family, and I hope they sue the sh*t out of Tesla
just for this arrogant statement.

And as for Tesla, you should fire the guy that wrote this. If this official
response was something approved by Musk, I just lost all respect for him.

Personally, this is why I am a hardcore manual guy. I drive a stick/manual. I
don't even trust automatics to do the job right, let alone some software.
There's no way I'm trusting a self-driving car. I personally opine that
driving is really about planning ahead. If you drove a stick, you would know
you need to shift down before a steep curve to maintain traction and to reduce
your inertia so you can turn safely. Or, if you're planning to overtake
someone, or you foresee a situation that requires braking, you downshift.

As inefficient as gearboxes + clutches + ICEs are, the ability to plan ahead
and downshift correctly is an exclusive human edge that I strongly believe
computers can't mimic. If they could, we wouldn't have automatics (in 2018)
that can't downshift correctly even with all that data from their sensors.
Even the most expensive automatics with some sort of software fail to
downshift correctly.

Unfortunately, in the driving world, it's not enough if your car can drive 99%
accurately, as even one instance of a wrong downshift/upshift can kill you
easily.

I strongly believe driving is best left to humans; there are simply too many
variables to be reliably handled by mathematical algorithms. Sometimes it just
doesn't "feel" right, and maybe it's OK to just trust our instincts.

------
mnl
"Well, too bad another of our customers made a mistake and killed himself
because he didn't get what 'autopilot' means (for us). Deal with it. Yours,
Tesla"

You all know the game Elon is playing here, what's the point of arguing?
Either you're OK with that or you aren't. It's that simple.

Imagine Ford doing this every time. Oh, the outrage...

------
axaxs
This is so frustrating to me. 50 years ago we had the ability to fly to the
moon. A played-out trope, sure. But today we can't even do a basic 'don't run
into shit' on a perfectly clear day. Physics, in this way, has been a solved
problem for decades. The fact that people are dying from hitting stationary
objects should be criminal. At the very least, regulators should bar these
companies from even participating.

~~~
lsc
> But today we can't even do a basic 'don't run into shit' on a perfectly
> clear day.

So, uh, the emergency braking systems on other cars... so-called 'AEB'
systems, or what Honda calls the Collision Mitigation Braking System... would
that have caught this?

I would expect this sort of thing "big object in road" to trigger the AEB
system... even if it didn't stop me, I'd expect it to trigger before the
collision and slow me down. Are my expectations off? or is it just that the
tesla system, 'passive sensors only, with machine classification' isn't as
good?

I guess people rely on the Tesla system more than the Honda system due to
advertising, so it could be that both systems make errors at similar rates;
it's just that advertising means the Tesla system is triggered more often.

~~~
javagram
Honda's documentation for the Civic doesn't indicate it would stop in this
case.

 _" Collision Mitigation Braking System™ (CMBS™) Can assist you when there is
a possibility of your vehicle colliding with a vehicle or a pedestrian
detected in front of yours."_

I think like Tesla, it would run right into it - the systems have been tuned
to reduce false positives and so only detect and stop on objects matching
vehicles or pedestrians, not arbitrary objects like a road barrier.

The difference being, the Civic system doesn't give people nearly the same
sense of false assurance that the Tesla one does...

~~~
petre
The way traditional automakers are doing it is the proper way to do it -
cautiously and responsibly. Incrementally implementing driver assistive
technology instead of shipping an _autopilot_ which doesn't deliver and
crashes the car into obstacles. Traditional automakers are used to screw ups
that lead to recalls and lawsuits so they're quite cautious.

The Hyundai Ionic assistive tech slows down the car to a stop if you take your
hands off the wheel.

------
abalone
May I ask a very simple question here. I’ve never heard a cogent answer.

What exactly is the point of autopilot if you have to remain 100% attentive
with your hands on the wheel? What is the benefit?

I get that cruise control lets you take your foot off the pedal, letting your
foot physically rest. And it is my understanding that emergency braking and
collision avoidance are active on Teslas at all times, even without
Autopilot. So that basically leaves autosteering.

So, what is the benefit of autosteering when you have to have your hands on
the wheel and eyes on the road with exactly the same attentiveness as you
would without it? What is it doing for you, now, at level 2, apart from (a)
“gee whiz look at what it can almost do” and (b) seducing you into trusting it
so you can mentally relax?

~~~
dayaz36
It trains itself for levels 3, 4, and 5.

------
justsomedood
What is the point of Autopilot? It's not self-driving. You're not supposed to
let go of the steering wheel and you have to be completely alert to take over
if there are problems. Isn't it easier to just not use it in the first place?

------
enduser
“NHTSA found that even the early version of Tesla Autopilot resulted in 40%
fewer crashes ...”

True

“... and it has improved substantially since then.”

Non-specific to the point of being subjective. Untrue according to many HW2
users.

Bad logic.

------
dqpb
> "Autopilot requires the driver to be alert and have hands on the wheel."

By that definition every car I've ever driven has been fully self-driving.

~~~
Robotbeat
It's called autopilot, not self-driving.

~~~
hashbig
Autopilot is a misleading product name. It implies automation of the process
of driving.

It doesn't function like an aviation autopilot, and there is no co-pilot.
Borrowing a technical term from aviation here is meaningless and dangerous.

~~~
Robotbeat
It functions similarly to an autopilot (i.e., follows terrain, but not
perfectly). Even general aviation aircraft have them and don't require
co-pilots.

------
wehadfun
We need FAA-style oversight on self-driving cars.

~~~
Robotbeat
This isn't self-driving, though. It isn't called self-driving.

~~~
brain_dev
How else would you describe an Automatically Piloted Vehicle then?

If it's not self-driving, don't call it Autopilot, and don't lull people into
a false sense of security. Even with alarms to try and keep the driver
engaged, people are inherently going to place more faith in 'Autopilot' than
'Enhanced Cruise Control' or something similarly named.

The marketing around this feature is reckless, and Tesla + Musk should be
ashamed. To the point that I would consider working on that team to be
immoral.

For what it's worth, this is coming from a big Musk fan.

~~~
Robotbeat
It’s called autopilot because “autopilot” already has a technical meaning
which doesn’t imply self-driving. Heck, some aircraft autopilots are so
primitive they only keep the aircraft in level flight without terrain
following (although some do that). No version of autopilot means the pilot
doesn’t have to pay attention. This was all _already established_ through
broad use before Tesla introduced the term to cars.

The same linguistic games you’re trying to play for “autopilot” apply just as
well to “cruise control”, or worse, since that term didn’t have an already-
understood technical definition when introduced to cars.

------
SCAQTony
First off, Tesla has no business calling it autopilot; it's overpromising.
Call it driver assist to be clearer, because if you have to keep your hands
on the steering wheel and your eyes on the road, that is not autopilot:

"... Automatic Pilot: noun: a device for keeping an aircraft on a set course
without the intervention of the pilot. ..."

------
blackrock
It sounds like you need life insurance, as we enter into the age of alpha-
tested autonomous cars.

You need it whether you are a driver or a passenger inside a robotic car.

And you also need it when you are just a pedestrian, while walking your bike
on the road. Because you never know when a robotic car is going to run you
over.

------
mentos
Most surprising part of the crash to me was not that the car followed the
wrong lane markers but that it had no concept of a completely static barrier
in its path.

This should have ended with the driver calling Tesla to complain about
whiplash from last-minute braking, not with the car plowing full speed into a
concrete barrier and tearing its front off.

Tesla will definitely now make a point of updating their tests for this case,
and similar deaths will definitely be avoided in the future. That is value
that has been added to Tesla's system at the cost of Mr. Huang's life. If I
were a lawyer on this case, I'd probably take that position.

"What does Tesla feel Mr. Huang's life is worth for this improvement to their
system?"

------
InternetOfStuff
Unless they have other data to back up their claim of the hands not being on
the wheel, such as camera images, Tesla isn't making a statement of fact.
Rather they are recounting what imperfect sensors might be telling them
(that's convenient in the upcoming lawsuit).

To give perspective: I have a car (BMW) with lane-keeping "autopilot", which
often gets wrong whether I have my hands on the wheel.

It seems to me that it senses resistance to rotation to determine hands-
onness. If my hands are very relaxed, just letting the steering wheel do its
thing, apparently the sensor can't pick them up anymore.

I had my hands on the wheel, dammit car! Stop beeping and flashing at me all
accusatory.
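
If that guess is right, the failure mode falls out of a very simple check.
Here's a hypothetical sketch (threshold and names invented, not BMW's or
Tesla's actual algorithm) of torque-based hands-on detection:

    HANDS_ON_TORQUE_NM = 0.3  # assumed threshold, purely illustrative

    def hands_detected(torque_samples_nm):
        # Hands only register if the driver resists the wheel's movement.
        # A relaxed grip that follows the wheel produces near-zero torque,
        # so it reads as hands-off even when hands are on.
        return any(abs(t) > HANDS_ON_TORQUE_NM for t in torque_samples_nm)

    print(hands_detected([0.05, 0.02, 0.04]))  # False: relaxed grip looks hands-off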

------
fipple
Wow, Tesla instantly went from my dream car (having replaced Bentley) to Do
Not Buy. I simply will not buy a car or other dangerous equipment from a
company that doesn’t care if I die and only cares about their brand.

------
skywhopper
Tesla's marketing around this feature is completely at odds with what they say
when a problem happens. If you are required to be paying attention with hands
on the wheel at all times, then the "Autopilot" name itself is fraudulent,
not to mention all the blather about how it's safer than a human driver. Even
if self-driving cars were imminent (they aren't), this sort of blatant
misrepresentation by the likes of Tesla (a company that surely doesn't need
the lure of self-driving to sell its cars) would set back consumer interest
and adoption by years.

------
thsowers
> "Tesla is extremely clear that Autopilot requires the driver to be alert and
> have hands on the wheel."

I am a Tesla fan, and although I previously didn't understand the criticism
for the name, I now see some negatives of a name like "autopilot". IMO it
implies non-intervention (and yes I know Tesla warns about this etc), but at
the current stage of technology and regulation, maybe "autopilot" wasn't the
_best_ naming choice?

------
BlueGh0st
>Tesla sent Dan Noyes a statement Tuesday night that reads in part, "Autopilot
requires the driver to be alert and have hands on the wheel...

No, it doesn't. Maybe the agreement when you use that feature does, but if
they really wanted to/cared enough, they would make it so it only works while
your hands are on the wheel.

I'm still on the fence and think commercially releasing this very experimental
technology as a product to the public is incredibly irresponsible.

~~~
thelastidiot
It is absolutely irresponsible. Meanwhile, Facebook is in the news as a
reminder that companies that unethically exploit the lack of regulation in
emerging technologies to win higher customer adoption will, at some point,
have to answer to Congress and be held responsible.

------
josto
The line about the fundamental legal/moral principle being a broken promise
is dead wrong. Who wrote this? Tesla apparently has a horrible PR firm.

The fundamental principle is the standard of care. One could say it was
reasonably foreseeable this would happen, since the driver reported it! Tesla
was negligent, and their best hope would be to show comparative fault by the
driver. But forget negligence.

Products liability is strict liability. Someone died here. Causation is the
only issue, and IMO it looks bad for Tesla. "The fundamental premise of both
moral and legal liability is a broken promise, and there was none here" - a
very, very stupid statement on their part. They should have quietly settled
this. Bad PR and legally perilous. As a shareholder, this makes me question
their competence to deal with these matters. They should fix the issue and
settle, avoiding all the discovery and everything about Mobileye and the
radar. Horrible strategy by Tesla. It's like they take it personally that
Autopilot failed. People are dumb. It doesn't matter if they warned him to
keep his hands on the wheel. Someone died. There was an alternative design
that was safer. Whoever wrote that statement should be fired.

------
arh68
Insurance rates will probably go up on "self-crashing cars" if this trend
continues.

For some reason, people aren't dying so dramatically due to cruise control. I
love cruise control, because it helps me focus on "driving" (which I consider
steering + observing). The Tesla autopilot feature seems to do the opposite:
it helps you focus on _not driving_. This ain't Volvo, folks.

~~~
Robotbeat
Cruise control DOES cause deaths: [http://www.indianainjury.com/trucking-
accident-verdict-leads...](http://www.indianainjury.com/trucking-accident-
verdict-leads-change-cruise-control-policy/)

...but it's not Tesla, so a single death isn't plastered over every news site
and discussed ad nauseam by tech sites.

EDIT: More cruise control deaths:
[https://www.theguardian.com/business/2016/nov/24/skoda-
drive...](https://www.theguardian.com/business/2016/nov/24/skoda-driver-
decapitated-in-stuck-cruise-control-mystery) [https://www.levylaw.com/cruise-
control-accidents-defective-p...](https://www.levylaw.com/cruise-control-
accidents-defective-product-liability-lawsuits-levy-phillips-a-konigsberg/)
[https://www.thesun.co.uk/news/5645195/m1-motorway-crash-
lorr...](https://www.thesun.co.uk/news/5645195/m1-motorway-crash-lorry-driver-
cruise-control-hands-free-call-eight-dead/)

~~~
arh68
The last one with the truck does count, I agree. If it enables
inattentiveness, it's likely to cause harm.

The Skoda driver makes less sense: if you are trying (like he was) to stop,
brakes should always outpower the engine. Unless the brakes failed. Toyota
went through a huge ordeal just like that (possibly floormats, possibly ECU
stack overflow, who knows). An attentive & trained driver should be able to
handle it. The truckers were willfully ignoring what pretty much every driving
manual tells you. If you drive at night without lights, in a blizzard, with
summer tires, I don't see how you can blame the car manufacturer if you hit
something. But if all you do is consistently go over potholes, and
consistently lose power steering & lights when the serpentine belt comes
loose, I think you can blame the manufacturer.

------
tompetry
I don't get it. Doesn't autopilot mean that the car is steering itself?
"Perfect" or not, if the car is steering itself, what is the point of having
your hands on the wheel? I get the point about being alert and needing to
take over if you see an issue, but I would fear that I'd be fighting the
autopilot by keeping my hands on the wheel while cruising along.

Can somebody explain?

------
zamalek
The Tesla warns you, what, 6 times before disabling Autopilot. Why not have
the car engage the hazard lights and gradually slow down with each warning,
eventually coming to a full stop? This plays directly into human impatience,
likely resulting in hands on the steering wheel after the first warning. In
the event that the driver does not react, there is probably something wrong
(e.g. a seizure) and the car should come to a complete stop for safety. This
is a very real trolley scenario, because a stopped car on the highway is
asking to be rear-ended, but it's still far better than a car traveling at
60mph with an incapacitated driver.
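
A toy sketch of that escalation policy (numbers and behavior assumed here,
not anything Tesla actually ships): each ignored warning ramps the target
speed down toward zero, with hazards on from the first one.

    def escalation(cruise_mph, warnings_ignored, max_warnings=6):
        # Hazard lights come on as soon as one warning is ignored.
        hazards_on = warnings_ignored > 0
        # Linear ramp: full cruise speed at zero warnings, full stop at the last.
        fraction = max(0, max_warnings - warnings_ignored) / max_warnings
        return cruise_mph * fraction, hazards_on

    for n in range(7):
        mph, hazards = escalation(60, n)
        print(f"ignored warnings: {n} -> target {mph:.0f} mph, hazards on: {hazards}")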

My last idea on how to eliminate these types of casualties was met with fair
criticism (TL;DR - battery ejection)[1]. Am I missing anything with this idea?

[1]:
[https://news.ycombinator.com/item?id=16720680](https://news.ycombinator.com/item?id=16720680)

------
cmurf
I think the name "Autopilot" matters less than how it works. Do the raw
(whatever Tesla and NTSB have access to in a lab) recordings include hand
position, temperature and pressure? Or is the raw data a binary state
true/false similar to the in-car UI experience?

Back to autopilot: there's a wide range of autopilots in airplanes. Single-
engine planes might not have one unless they're IFR equipped. There's no
requirement to have one, but they are useful for reducing cockpit workload.
Perhaps the most common is just a simple heading-hold autopilot, where you
use a knob to set a bug on the directional gyro and the autopilot holds that
heading. Literally all it does is hold a heading. Not altitude, not speed,
not power. And then way on the opposite end of the spectrum are autopilots
certified for ILS CAT IIIc landings, which include autoland.

------
xHopen
I have to say I was a firm supporter of Tesla's Autopilot, but you know, this
is ridiculous. Tesla should be ashamed, say sorry, admit "our system is crap,
it's a demo version and it doesn't work", and that's it. The rest is history.
I was about to get a new Tesla; not anymore.

------
blackrock
If the guy hadn't used the Tesla Autopilot, he would still be alive today.

I don't know how else to clearly say this.

When you are travelling at 65 MPH, then there is no ability to safely react,
should the car decide to go berserk.

You probably have less than 0.25 seconds to react. And even if you had your
hands on the wheel, there is just no way for you to safely process the
situation, and react to safely take control of the vehicle.

And if you do react, then you might end up over-reacting, which would cause a
different kind of accident. And which, Tesla would blame on you, for human
error for over-reacting.

This is not a good situation. The only way to survive, is to not play.

I have hundreds of thousands of miles of driving experience, and I as a human
would never have gotten into this situation.

------
sorokod
Given the nature of the product, the strategy for handling this must have been
ironed out a while ago.

This is it.

------
JonasJSchreiber
Maybe the cost of improvement is failure in this case. For Tesla to fix the
issue of a crossing tractor trailer not being positively detected, there had
to be an instance of a Tesla hitting a crossing tractor trailer. Maybe, for
road-work barriers that change position as the work progresses to be avoided,
there needs to be an impetus for Tesla engineers to build in detection of
these cases.

This is all a way of saying that I reject the notion that autonomous vehicles
are less safe than manually driven vehicles. This isn't an impulsive reaction
either; data compiled by the NHTSA has borne this out.

~~~
toast0
The choice isn't really autonomous vehicles vs manually driven vehicles. The
choice is autonomous vehicles vs manually driven vehicles vs manually driven
vehicles with computer assistance.

It turns out that humans are really bad at paying attention to be able to take
over for an autonomous system; but computers are pretty good at paying
attention in order to take over for a human driven system.

I do believe that eventually autonomous vehicles could be safer than a
manually driven vehicle with computer assistance, but Teslas are not
currently autonomous vehicles, despite the perception that many of their
drivers apparently have.

------
oblib
As far as I know, Tesla has never endorsed letting their cars drive
themselves for more than a short time, and only when traffic and road
conditions are optimal. I'd be fine with anyone showing me otherwise, but I'm
pretty sure I've seen statements specifically saying do not use the Autopilot
in city highway traffic conditions.

The driver is the only one to blame in this case. That's not a cold or
heartless thing to say, it's just a fact.

Tesla cannot remain silent about this. They have to tell people to not do what
this driver was doing. It could have been worse. He could have killed others
driving nearby.

~~~
FireBeyond
> As far as I know Tesla has never endorsed letting their cars drive
> themselves for more than a short time

They used to only require you to put your hands on the wheel every 15 minutes,
before being "encouraged" to reduce that to 1 minute.

------
philliphaydon
"We know that he's not the type who would not have his hands on the steering
wheel, he's always been (a) really careful driver," said Will.

Isn’t it obvious he didn’t have his hands on the wheel?!? If he did he would
have corrected the car.

I’m not saying Tesla or the driver are to blame. I know very little about this
incident cos I haven’t followed it. But can’t say he always has his hands on
the wheel when they’d clearly not the case.

Likewise everyone says he apparently reported it to Tesla 7-10 times? But he
still felt the need to not pay attention on this particular stretch of road he
was worried about?

------
rootusrootus
This reads like they're making a legal argument but in the court of public
opinion. Maybe it will be effective, but as a PR move I am not at all
impressed. Best to leave the legal wrangling for the real courtroom and stick
to "We feel terrible about this tragedy, and have no further comment." when
speaking to the public. Blaming the victim, especially when he is dead, is bad
form. I have to think a reputable PR firm would understand this, which makes
me think Musk is forcing his team to make these kinds of statements.

------
dayaz36
>"We know that he's not the type who would not have his hands on the steering
wheel, he's always been (a) really careful driver," said Will.

I don't understand. The victim's family is trying to make the case that the
driver had his hands on the steering wheel, but blames Tesla's AP at the same
time? So are they saying AP drove off the road against his will while he was
trying to steer the car? Because that would be ridiculous.

------
cynusx
They should probably take a page from Waze and allow drivers to report areas
where Autopilot doesn't function properly. Fleet learning, but not
crowdsourcing?

~~~
zmarty
They technically do, in a Tesla you can report a bug using your voice while
you are driving. Now, I don't know if/how they act on the bug reports.

------
lumberjack
Tesla gambled with their lane assist by calling it Autopilot, in the hopes
that it would give them good PR as the first car manufacturer with a self-
driving car. It failed spectacularly, with multiple deaths. Most (all?) of
the crashes are the result of the impossible nature of the task for the human
driver. You cannot react fast enough when Autopilot brain-farts on a
seemingly trivial stretch of road.

They fucked up big.

------
ilitirit
Given the facts, who/what else would be to blame?

This IMO has more to do with PR than anything else.

Well... marketing too. I've never felt comfortable with the term "auto pilot"
as it is used in the context of Tesla's vehicles. This is potentially a
finicky topic, but I think there should be better standards and legislation
around what can reasonably be marketed as "auto pilot".

------
robotkdick
GM's recent TV advertisement shows the driver engaging their version of
autopilot and crossing his hands...I was like woah! That would never make it
through legal if they weren't very very confident about their tech. Contrast
that image with Tesla's demand that drivers keep their hands on the wheel and
it seems that GM is miles ahead of everyone on autopilot.

~~~
kodis
From what I've read, GM is less confident about their system's ability to
operate under as wide a variety of situations as Tesla's. Yes, it operates in
a hands-free manner, but does so only on well-mapped, uninterrupted divided
highways, and even then it requires that the driver pay attention. It does
this by way of the IR lights and camera mounted in the steering wheel, which
only allow the driver to look away from the road for a few seconds at a time.

Given the bad press that Tesla gets after any Autopilot driven accident, GM
seems to have made a safe choice here.

~~~
gamblor956
GM's AV systems are light-years ahead of Tesla's. Pretty much everyone in the
AV world is, including Uber (the recent accident notwithstanding). However, GM
is taking the safe and conservative approach: restrict the usage of a feature
that could potentially kill its users to a specific set of situations in which
extraordinary testing has demonstrated that it won't. And that's exactly the
right way to do it until they get to true self-driving.

------
pvaldes
As most routes in a normal journey are repetitions, maybe the car could learn
from a human first. The owner drives their normal route a few times, and the
car uses this info to train the autopilot on the same route, paying attention
to the differences between what the car would have done and what the human
really did.
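
A toy sketch of that comparison (data structures and threshold invented, not
any real system): sample steering along the route, then flag the points where
the planner's proposal diverges from what the human actually did.

    def deviations(human_steering, planner_steering, threshold_deg=5.0):
        # Each list holds steering angles (degrees) sampled along the route;
        # large disagreements are candidate training examples.
        return [
            (i, h, p)
            for i, (h, p) in enumerate(zip(human_steering, planner_steering))
            if abs(h - p) > threshold_deg
        ]

    human   = [0.0, 1.2, 3.5, 12.0, 2.0]  # the human steered hard at sample 3
    planner = [0.0, 1.0, 3.0,  2.0, 2.0]  # the planner would have kept straight
    print(deviations(human, planner))     # [(3, 12.0, 2.0)] -> a spot to learn from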

------
samfisher83
They need to change the name of the feature from autopilot to lane assist.
That might keep people paying attention to the road.

------
sunstone
The self-driving tech is trying to do everything 'by itself', in that it's
trying to drive perfectly without help, but in the end roads will probably
have to meet the tech part way. That is, the roads will need some (hopefully
not too expensive) adjustments or additions to make the self-driving tech
more reliable.

------
thelastidiot
Driving a Tesla on "auto-pilot" is like taking my 16-year-old kid out for a
driving lesson while I am in the passenger seat, sweating and sometimes
hoping for the best when she makes a mistake that could be fatal. And as a
software engineer, I have more faith in my daughter than in Tesla's software.

------
jaakl
The good news is that Tesla does not manufacture toys for children.
Otherwise, for them, a label like "not suitable for children under 3 years,
must be used carefully, etc." would make every deadly mistake just another
user's fault.

------
discordance
We heavily regulate factors/distractions that may reduce a driver's attention
to the road (alcohol, drugs, mobile phones, etc.).

If Autopilot still requires a user to monitor the road, but results in a
reduction in driver attention to the road, then why is it not regulated like
mobile phone usage?

------
RyanShook
The Apple Watch is fully submersible and in reality waterproof, but Apple is
smart enough not to call it waterproof, for liability reasons. Tesla has
chosen to market their product as “autopilot” and will have to face the
consequences of that marketing decision.

------
basicplus2
"She told the ABC7 News I-Team that Walter complained that his Tesla's
Autopilot had steered toward that same barrier on several occasions, she
recognized the location and the blue Model X."

------
plussed_reader
I think autopilot is worth it if it cuts down the drunk driving rate.

I can't imagine the edge cases that result in fatality for Autopilot would be
greater than the damage done by drunk driving.

------
mrb
Blaming the autopilot for the death of Walter Huang is like blaming the bullet
for a firearm-related death. The man at the wheel or the man at the trigger is
ultimately responsible.

~~~
rohit2412
What if Smith and Wesson announced that their guns will update over the air
and suddenly the bullets will become non-lethal and in fact save lives, as
they'll include water from the fountain of youth?

Because I don't see much difference from how Elon Musk makes misleading and
clearly false statements.

------
tshannon
I agree that this statement seems cold, and blames the driver, but from a
legal perspective, can it be anything else?

They can't admit anything or publicly take any responsibility, because it
could be used against them in the court case, right?

Even in their statement they don't say "Autopilot is not perfect"; they say
"According to the family, Mr. Huang was well aware that Autopilot was not
perfect".

The only thing they actually admit in the statement is that "Tesla is
extremely clear that Autopilot requires the driver to be alert and have hands
on the wheel".

It's cold and heartless, but anything else opens them up to liability right?

~~~
dragonwriter
> I agree that this statement seems cold, and blames the driver, but from a
> legal perspective, can it be anything else?

Yes.

The law doesn't require, or even create an incentive for, them to make a
statement at all.

The law doesn't require, or create an incentive for, them to include any
attribution of blame or responsibility in any statement they do make.

This is a ham-fisted attempt at PR management by directing the _public_ blame
for the incident to the driver, not something made necessary by legal
requirements or incentives related to legal liability.

~~~
tshannon
Thanks, my comment was a genuine question. If that is the case, then their PR
statement is indeed very cold.

------
mtgx
This is why we need government investigators to look at crash/accident data
(after the fact) and we need to ensure that the data hasn't been tampered with
by anyone, including the car makers.

We can't just rely on the company's words in a crash. _Of course_ the company
has all the incentive to lie or at least be as misleading as possible or try
to minimize its blame as much as possible, to avoid liability or bad press.
That should surprise absolutely no one. That's why we need to make sure that
the company's bias is taken out of the equation altogether.

~~~
TAForObvReasons
This is Silicon Valley. "move fast and break things"

~~~
neolefty
Common in early car and aircraft cultures as well. Always a tension between
innovation and safety.

I don't mean that as a cop-out. Rather that there's no single "right answer".
We are choosing something on a spectrum of risk vs reward.

Personally, although I have no direct experience, it's starting to look like
Tesla is being too risky. Yes, they're safer than human-only drivers, but the
mistakes they're making are too algorithmic. They're not _enough_ safer to
call Autopilot a success.

Contrast that with Waymo, who are paranoid-ly safe. They seem to be taking the
long view that building trust (and not taking any chances to lose that trust)
is as important as technical achievements.

Edit: Or, at some other extreme of poor citizenship, Uber, who seem to have
gone off the deep end of "Keep up appearances of having a self-driving
research program even though it's a disaster internally."

~~~
TAForObvReasons
I firmly believe that a technology deployed in a potentially life-threatening
context like a car or plane or pacemaker deserves a much more conservative
development approach. "Move fast and break things" isn't appropriate when
human lives are at stake. Waymo is applying the appropriate conservatism;
Tesla and Uber are playing fast and loose by comparison.

~~~
snthd
The dilemma is that slow development costs lives as well :-/

~~~
chaboud
Unfortunately, that sentiment, wielded as a shield by the ambitious, papers a
veneer of virtue over the callous disregard of cutting corners with other
people's lives.

The reality is that careful development is _expensive_. If Uber and Tesla were
racing to save lives, they'd open up their work to the public. They're racing
to make the most money possible in the end. Part of that equation is cost.
Another is maximizing the utility of available resources.

They only need to be safe enough to not get shut down or lose substantial
market share (ongoing capital).

~~~
saalweachter
I think the exact logical flaw is that saying "slowing down development for
testing costs lives" _assumes the solution will be validated as correct_.
Waiting a year to deploy the correct solution for want of testing costs the
lives that solution could have saved in that year, yes, but pushing one
correct and four incorrect solutions immediately into production - when the
incorrect solutions each cost twice as many lives as could have otherwise been
saved - and re-evaluating after a year kills far more people than are saved.
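
To make that arithmetic concrete (with an assumed figure, purely
illustrative):

    S = 100  # lives/year the one correct solution would save (assumed)

    # Strategy A: test for a year, then deploy only the correct solution.
    # The cost of waiting is the S lives it could have saved that year.
    cost_wait = S

    # Strategy B: deploy all five candidates now; the four incorrect ones
    # each cost 2*S lives/year, while the correct one saves S.
    cost_rush = 4 * (2 * S) - S  # net lives lost in the first year

    print(cost_wait, cost_rush)  # 100 vs 700: rushing kills seven times as many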

------
AcerbicZero
I suspect many of the same arguments being made in this thread would have been
made about cruise control when it was first introduced.

------
cowpig
Does HN have a policy against posting things that autoplay video/sound? If
not, I believe it should (or at least require a tag).

------
aviv
The naming of their self-driving feature "autopilot" will go down in history
as Tesla's biggest mistake.

------
kgc
Tesla should just change Autopilot to Copilot and reserve the Autopilot name
for when they achieve full self driving.

------
beedogs
Things like this have me fully convinced that we're never going to see
ubiquitous driverless cars in my lifetime.

------
lafar6502
Missing good old cars already, or still excited about meeting Tesla's lawyers
every time something happens to you?

------
kyberias
What exactly does Tesla intend Autopilot to be used for, if the driver has to
keep their hands on the wheel?

------
oldpond
The solution is to require ALL vehicles to have self driving features like
collision avoidance.

------
ersh
Waiting for Elon to be questioned in Congress too.

------
dasanman
Wow this is a terrible article

------
jumelles
Not a good look for Tesla.

------
lovetrump
You would think that a self-driving car would be aware of any obstacle
anywhere around the car (360 degrees), and aware of it far enough ahead to
have time to react... where are their sensors looking if they can't even see
fixed obstacles in front of the car...

------
mluu510
He complained about the Autopilot being unreliable, yet he still chose to use
it again and again. I think the negligence was on his part, then.

------
stevew20
I vote that ABC News has its journalism license revoked! All in favor?

------
joering2
Autopilot means it will pilot itself... automatically.

What Tesla is actually offering is a "guidance assistant". Huge difference.

Their terminology is either very confusing at best, or blatantly deceiving at
worst.

~~~
jhall1468
Twice now, people are using absolutely stupid arguments to defend the driver
here. He _knew_ what Autopilot was and STILL took his hands off the wheel.

Autopilot is a name in the same way Amazon.com isn't a damn rain forest. It's
a term taken from aviation, but "autopilot" in a plane and "Autopilot" in a
Tesla aren't even remotely the same thing. And literally everyone understands
that, yet any time this occurs we start blaming semantics.

This guy wasn't confused. He explicitly stated his car was having trouble with
this section of the road and he trusted his car to handle it. It got him
killed. The word "autopilot" did not get him killed.

~~~
cjhopman
> STILL took his hands off the wheel

That's not clear. The car didn't detect his hands on the wheel, but that
doesn't mean his hands weren't on it. Tesla's implementation of hand detection
(like many of their other systems) is rather unreliable.

~~~
kid0m4n
So he had his hands on the wheel and took no action for 6 seconds while
driving straight at the barrier. That doesn't make it look any better for the
driver, TBH.

~~~
ProfessorLayton
Well, the Tesla had the same amount of visibility and drove straight into the
barrier. Victim blaming doesn't get Tesla off the hook.

~~~
jhall1468
Tesla is a fucking computer. The dude that took his hands off is a human. One
of those has actual intelligence. You don't get to blame user error on the
computer because someone died.

He. Took. His. Hands. Off. The. Wheel.

------
josecurioso
The reason Tesla needs to make this kind of statement is very simple: a guy
in a Renault has this accident and it is just routine; Renault won't see
their shares fall and there will be next to no coverage of the incident. In
the case of Tesla, any accident gets to the front page of Reddit, trends on
Twitter, and is shared on Facebook, let alone printed in newspapers. These
statements are needed to fight that amount of bad PR.

~~~
chaboud
Well, to be fair, if someone in the US is able to find a running Renault that
can go faster than 30mph under its own power, it's pretty exceptional.

I think that Tesla's response is tone-deaf and ill-advised. I'm not presently
a Tesla shareholder, but I do have a Model 3 reservation coming up soon.

Guess which one of those this public statement makes me want to change...

~~~
semi-extrinsic
I think this classic Renault will do a little over 30mph.

[https://www.hemmings.com/classifieds/cars-for-
sale/renault/r...](https://www.hemmings.com/classifieds/cars-for-
sale/renault/r5-turbo-2/2082048.html)

~~~
chaboud
There was one of these tooling around the campsites at Le Mans a few years ago
(probably there every year). Lamborghinis, Ferraris, TVRs. No. The R5 Turbo
got all of the crowds. It was awesome.

Nonetheless, I haven’t seen a Renault in the US, in person, in a few years.

------
sabareesh
I am not sure why there is so much anti-Tesla sentiment here on Hacker News.
I am a new Model 3 owner, and I have been informed so many times that
"Enhanced Autopilot" is nothing other than advanced cruise control, so my
attention is always on the road. This is great technology, and Tesla could
not be more obvious about how to use it responsibly.

~~~
buvanshak
>I am not sure why there is so much anti Tesla sentiment here in HackerNews.

I just hope it is not limited to HN, and that there is more anti-Tesla
sentiment out there as well.

They deserve every bit of it.

~~~
sabareesh
Please provide more info on why "they deserve every bit of it". I respect HN;
that is why I am so surprised. As for other places on the Internet, there is
so much going on to manipulate the stock price.

~~~
buvanshak
Can you list the unwarranted ones among all the Tesla hate you see?

