
Apple engineer killed in Tesla crash had previously complained about autopilot - jelliclesfarm
https://www.kqed.org/news/11801138/apple-engineer-killed-in-tesla-crash-had-previously-complained-about-autopilot
======
fxtentacle
I avoid new technology precisely because I'm an engineer.

I wonder if it's just me, but when new technology is introduced and hyped,
I usually take a quick look at implementations, research, and talks just to
get an idea of what the state of the art really looks like.

As a result, I have become the late adopter among my group of friends because
the first iteration of any new technology usually just isn't worth the issues.

You know this effect when you open a new book and immediately spot a typo? I
felt that way looking at state of the art AI vision papers.

The first paper would crash, despite me using the same GPU as the authors.
Turns out they got incredibly lucky not to trigger a driver bug causing random
calculation errors.

The second paper converted float to bool and then tried to use the gradient
for training. That's just plain mathematically wrong: a step function's
gradient is zero almost everywhere, so there is nothing to backpropagate.
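To make the point concrete, here's a minimal sketch (my own illustration, not the paper's code) showing that a hard float-to-bool threshold has zero gradient at every smooth point, so gradient-based training gets no learning signal through it:

```python
def step(x):
    # Hard threshold: float -> bool -> float, the binarization in question
    return float(x > 0.0)

# Central finite differences approximate the derivative; away from the
# jump at 0 the step function is flat, so every estimate comes out 0.0.
eps = 1e-6
for x in [-1.0, 0.5, 2.0]:
    grad = (step(x + eps) - step(x - eps)) / (2 * eps)
    print(x, grad)  # gradient is 0.0 at every smooth point
```

In practice, work that binarizes activations usually substitutes a surrogate gradient (e.g. a straight-through estimator) rather than differentiating the step itself.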

The third paper only used a 3x3 pixel neighborhood for learning long-distance
motion. That doesn't work; I cannot learn about New York by walking around in
my bathroom.
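A back-of-the-envelope sketch (my own illustration, not from any of the papers) of why a 3x3 neighborhood is hopeless for long-range motion: the receptive field of stacked stride-1 3x3 convolutions grows by only 2 pixels per layer.

```python
def receptive_field(n_layers, kernel=3):
    # Receptive field of n stacked stride-1 convolutions: 1 + n * (kernel - 1)
    rf = 1
    for _ in range(n_layers):
        rf += kernel - 1
    return rf

print(receptive_field(1))    # 3: a single 3x3 layer sees 3 pixels across
print(receptive_field(100))  # 201: ~100 layers needed to relate pixels 200px apart
```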

That gave me the gut feeling that most people doing the research lacked the
necessary mathematical background. AI is stochastic gradient descent
optimization, after all.
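For readers unfamiliar with the term, stochastic gradient descent is nothing more exotic than this loop: a toy sketch minimizing f(w) = (w - 3)^2 using a noisy gradient estimate.

```python
import random

random.seed(0)
w = 0.0   # parameter to optimize
lr = 0.1  # learning rate

for _ in range(100):
    noise = random.gauss(0.0, 0.01)  # "stochastic": the gradient is only estimated
    grad = 2.0 * (w - 3.0) + noise   # noisy gradient of (w - 3)^2
    w -= lr * grad                   # descend

# w ends up close to the true minimizer, 3.0
```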

Thanks to TensorFlow, it is nowadays easy to try out other people's AI. So I
took some photos of my road and put them through the state-of-the-art computer
vision AIs trained on KITTI, a self-driving car dataset of German roads.
None of them could even track the wall of the house correctly.

So now I'm afraid to use anything self-driving ^_^

~~~
Alupis
This whole self driving, fully autonomous thing has become some sort of
strange faux-religion.

Its followers see no other possible solution as acceptable.

Instead of putting our heads together to make really good driver assistance
technologies and being satisfied, the darn thing needs to also drive itself
everywhere, otherwise we're leaving something on the table! Until we have zero
deaths, we cannot stop demanding self-driving, as if somehow AI were a magical
solution that's always better than humans.

Forget better frame, chassis, and body panel design to protect pedestrians! We
won't need them if AI never hits anyone.

Forget better braking systems that apply themselves automatically. We don't
need that if AI can always avoid the need for sudden stops.

Forget seat belt enhancements since that'll just inhibit nap time in my self
driving car.

No, forget it all. My car needs to drive me all by itself, everywhere I need
to go, no matter how hard or impossible of a problem that is. Driver assist is
boring... Level 5 is sexy, and that's what I want!

~~~
neuronic
> This whole self driving, fully autonomous thing has become some sort of
> strange faux-religion.

What really bugs me, as someone working in consulting as a hands-on backend
developer with a previous background in technical consulting at an accounting
Big4 (student job), is the insane amount of PR, politics and marketing talk by
people who have ABSOLUTELY NO IDEA what they are talking about. I witnessed
politicians and C-level industry people talking out of their asses to ... idk
... drive stocks? Look smart in the face of Tesla? PR? No clue.

Self-proclaimed experts in magazines and talk shows raving about how AI is
going to change everything. I had colleagues telling customers about the
magnificent rise of AI and none of them could even spell "gradient descent".
Backed by a law or accounting degree they KNOW that self-driving is just
around the corner and they are very vocal about it while easily impressed by
tightly controlled demos at some international tech fair. Everyone just seems
to fall into the hype trap without a single brain-byte spent on researching
the actual issues. What's saddest is actual engineers/technical people not
doing their due diligence and informing themselves, BUT THEN GOING OUT TO TELL
THEIR NON-TECHNICAL FRIENDS ABOUT THE AUTONOMOUS FUTURE. Ugh.

I gave a short internal talk at work about self-driving cars, for people from
other backgrounds with a light, superficial interest sparked by "Tesla" and
"the hype". They were surprised to see that we are likely decades away
from _actual_ Level 4 (not the marketing garbage that some companies put out)
because even a slight change in weather can really fuck with all systems on
the roads right now.

~~~
xkemp
Predicting with All-Caps confidence that autonomous driving is "decades away"
is _at least_ as indefensible as overly optimistic predictions were.

~~~
snowwrestler
I think autonomous driving advocates would do well to look at the history of
computer handwriting recognition, an easier technical problem with lower
consequences that received significant investment over decades. But it has
never gotten good enough to succeed in the marketplace against alternatives.

Why? It never exceeded consumer expectations, which are extremely high for
automated systems. Even a correctness rate of 99.9% means multiple errors per
day for most people. Consumers expected approximately zero errors, despite not
being able to achieve that themselves, sometimes even with their own
handwriting!
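A quick back-of-the-envelope check of the "multiple errors per day" claim (the daily writing volume below is an assumed figure, purely for illustration):

```python
# A note-heavy user might write a few thousand characters per day (assumed).
chars_per_day = 3000
accuracy = 0.999  # the impressive-sounding recognition rate

expected_errors = chars_per_day * (1 - accuracy)
# Roughly 3 misrecognized characters every single day, forever --
# exactly the kind of residual error rate consumers refuse to tolerate.
```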

Because handwriting is made by humans, there is some percentage of it that
simply cannot be reliably recognized at all. But people hold that against
computers more than other people because computers are supposed to be labor
saving devices.

Likewise, because roads are made by people and other cars are driven by
people, a self-driving car will never be able to be perfectly safe. But
that is essentially what advocates are promising.

That’s especially true if people expect the same level of convenience,
especially in terms of time. People speed and take risks all the time when
driving, in the name of saving time. I think it’s likely that an autonomous
car optimized for safety would also be a car that just takes a lot longer to
get anywhere.

Speed matters. It’s a big reason we all use touch keyboards on our phones
instead of handwriting recognition.

~~~
fortran77
Handwriting recognition works very well if you can capture the actual strokes
used while writing and not just the end result.

~~~
nradov
I doubt it. My handwriting is at least average neatness, and stroke based
recognition systems still make multiple errors per sentence. It's just a
frustrating waste of time and now that we have touch screen keyboards there's
no longer any point to handwriting recognition.

The only handwriting recognition system which ever worked correctly with a low
error rate was Palm Graffiti. It forced the user to learn a new shorthand
writing style designed specifically to avoid errors.

[https://en.wikipedia.org/wiki/Graffiti_(Palm_OS)](https://en.wikipedia.org/wiki/Graffiti_\(Palm_OS\))

~~~
snowwrestler
The secret to Palm Graffiti's market success was that it hacked user
expectations.

Because it asked users to learn a new way of writing, when the recognition
failed, users were more likely to blame themselves, like, "Oh, I must have not
done that Graffiti letter right, I'll try again."

But when it came to recognizing regular (i.e. natural) handwriting, users
believed inherently (i.e. somewhat unconsciously) that they already knew how
to write, and the machine was new, so mistakes were the machine's fault.

------
salimmadjd
This is sad and tragic.

I remember going with a friend on a Tesla test drive. He was eager to try the
Autopilot system. And within a few minutes we got freaked out by what seemed
to us like unpredictable behavior, and we turned it off.

I think the engineer should have never relied on it. Especially if he had
concerns with the system.

At the same time, "Autopilot" is misleading and Tesla does bear some
responsibility here. Even if it's purely a marketing term, it's designed to
upsell you on a half-baked technology.

They should call it "Driver assist" or something like "cruise control plus".

It's a heartbreaking story.

~~~
PopeRigby
I think self driving cars are a ridiculous solution to a problem we created
and that doesn't need to exist. The US building out the country for cars wasn't a
good way to do transportation. If we just had trains going from city to city
and subways in the cities, we wouldn't even need self driving cars.

~~~
tiborsaas
I never owned a car, but going anywhere, anytime, at will is a very real thing
and cars are the obvious solution. Humans are terrible at safe driving and
self driving will eventually solve this. I don't see how this is not the
future of transportation.

~~~
hugh-avherald
> Humans are terrible at safe driving

I'd dispute this. I'd say humans are excellent at many aspects of it. The high
mortality is simply because driving is inherently dangerous and we do a lot of
it.

I imagine the solution is basically what has happened over the last 60 years:
gradual changes to improve safe driving.

~~~
tiborsaas
"Nearly 1.25 million people die in road crashes each year, on average 3,287
deaths a day.

An additional 20-50 million are injured or disabled."

[https://www.asirt.org/safe-travel/road-safety-facts/](https://www.asirt.org/safe-travel/road-safety-facts/)

~~~
borkt
"Over 90% of all road fatalities occur in low and middle-income countries,
which have less than half of the world’s vehicles." Not really a relevant
statistical baseline here. Self-driving cars aren't going to help in areas
where people can't afford them.

~~~
tiborsaas
Not in the near future. Think 100+ years. I can totally imagine owning a human
driven car to be the luxury choice.

------
danielinoa
According to this article:
[https://www.caranddriver.com/news/a30877577/driver-tesla-mod...](https://www.caranddriver.com/news/a30877577/driver-tesla-model-x-crash-complained-autopilot)

"Data from his phone indicates that a mobile game was active during his drive
and that his hands were not on the wheel during the six seconds ahead of the
crash."

~~~
spookthesunset
Gotta love how quick Tesla is to throw their customers under the bus using
telemetry data the customer cannot access on their own..

~~~
ztjio
Their customer is perfectly aware of whether he has a video game going on in
his hands instead of a steering wheel. No telemetry needed from that
perspective.

~~~
spookthesunset
Yeah but it is auto pilot! It can drive itself!!

You have no idea how many Tesla drivers will openly admit to taking their
hands off the wheel and eyes off the road in order to take off a coat, tend to
a child, etc. I work with somebody who claimed to do just that routinely.

Here is somebody from a few days ago bragging about taking their hands off the
wheel and eyes off the road:
[https://news.ycombinator.com/item?id=22231927](https://news.ycombinator.com/item?id=22231927)

~~~
revscat
People already do the things you listed, though. Hell, when my kids were in
car seats I frequently had to reach around behind me to soothe them via touch.
AutoPilot would have been a godsend, because I wouldn’t have had to stress out
so much about juggling two things at once.

This is the part where you call me irresponsible, right? It seems like every
discussion on this topic follows that pattern. The fact remains, though, that
humans are in cars, and have all the weirdness that goes along with it. AP
makes it easier. A lot.

~~~
riffraff
> This is the part where you call me irresponsible, right?

Well, yes. But let's say everybody does things like this: the problem is that
in a normal car you'll try to keep the hands-off time to a minimum, keep one
hand half on the wheel, or just give up and stop the car.

If you rely on autopilot you could lower your attention for a longer time, and
you can't rely on autopilot to actually do the right thing; it may work fine
ten times and then crash your car into another on the eleventh, killing both
you and your baby.

The stress you feel is a _good_ thing.

~~~
revscat
So are you saying autopilot is, overall, a bad thing? Because it seems the
argument you are making is that the world is worse for having it available, an
argument I find perplexing.

~~~
unionpivo
Yes, at the moment. But that's just my personal opinion, which I hold because
a lot of people get a false sense of security using it and stop paying enough
attention.

Look at this guy. He even apparently noticed this behavior on this road
before, but still wasn't paying attention and got killed.

You either need 100% self driving or 0%.

(Driver assist is fine, as long as it's assist, and requires your attention.)

------
siljon
He complained once before about this stretch, and yet he kept driving on
autopilot at the same location? I would never dare to do that; it sounds
suicidal. I would also never dare to fully trust Autopilot, and I think they
should change the official name to driver assistance until Autopilot is
actually ready.

Which is how Tesla wants to treat it, according to the article: "Tesla says
Autopilot is intended to be used for driver assistance and that drivers must
be ready to intervene at all times."

~~~
lawnchair_larry
I would not be quick to make assumptions like this. He could have experienced
the issue like 1 in 20 times, and intended to remind himself to watch out
when he got to that spot. And maybe he forgot this time. It’s easy to get into a
trance on long repetitive drives. It isn’t like he immediately did back to
back drives and changed nothing the second time.

And that’s the problem with an “autopilot 99.9% of the time, but still pay
attention just in case” feature.

~~~
mavhc
Which is why Waymo never released half-self-driving technology: their early
tests showed that even when they told Google engineers that they were being
recorded and would be fired if they didn't watch the road, within a few weeks
a lot of them trusted the software too much.

~~~
scep12
Source?

~~~
mavhc
[https://www.vox.com/new-money/2017/7/5/15840860/tesla-waymo-...](https://www.vox.com/new-money/2017/7/5/15840860/tesla-waymo-audi-self-driving)

------
bmat
For every tragedy, there are tragedies avoided. I can attest to a few. In the
last 10,000 miles, Autopilot has: safely swerved to avoid a car that would
have sideswiped me, preemptively braked after detecting the 2nd car ahead (not
visible to me) had slammed on its brakes, and avoided colliding with a
completely stopped vehicle in the center lane of the freeway.

And FWIW I've never felt misled about Autopilot's capabilities. I started off
skeptical and it's since earned my partial trust in limited scenarios.
Ironically its hiccups actually make me more attentive since, ya know, I don't
want it to kill me.

~~~
hvidgaard
That squarely puts Autopilot in the "Driver Aid" category. That is fine, just
don't go telling people that it can drive unassisted.

~~~
Erlich_Bachman
> just don't go telling people that it can drive unassisted.

No one does. Literally no one. No owner, not Tesla themselves, not their
website, not their sellers. Why do people keep bringing this up. Everybody
knows it is not full self-driving, yes we all know, yes and? Where does this
zealotry come from?

~~~
hvidgaard
Their marketing was for a long time structured such that you got the idea
that it was a literal autonomous driving ability. They've only toned it down
after multiple accidents, and I'm sure that's the reason so many people still
believe it to this day.

~~~
Erlich_Bachman
So the "problem" you're referring to is solved? Their marketing is good now?
Then I just think there is no reason to keep repeating old claims that are no
longer relevant, that don't even refer to anything anymore and only create a
lot of confusion.

(Let's just assume it was a problem, even though I don't remember ever being
fooled by it or seeing any commercial or anything without a disclaimer "NOT
FULLY AUTONOMOUS SELF-DRIVING, REQUIRES DRIVER ATTENTION".)

~~~
hvidgaard
They should start with not using a misleading name, Autopilot.

~~~
radium3d
Autopilot is not a misleading name actually. You, the press and others just
have ingrained a false definition of the word. "An autopilot is a system used
to control the trajectory of an aircraft, marine craft or spacecraft without
_constant_ manual control by a human operator being required. _Autopilots do
not replace human operators, but instead they assist them in controlling the
vehicle._ "

~~~
hvidgaard
If everyone defines Autopilot as an automatic system, shouldn't the definition
in the dictionaries be updated? That was a rhetorical question, because it
happens all the time.

Besides, Tesla claims that it can drive you unassisted on the highway, so
they are very much coining that definition as well.

~~~
Erlich_Bachman
Everyone does not. This is why you see so many people have a problem with
how you seem to use that word.

~~~
hvidgaard
Not "everyone" in the pedantic sense; it's pointless to argue semantics.
Tesla claims when you order a car that it has

> Full Self-Driving Capability

And in my experience 9 out of 10 people take that literally.

------
crazygringo
Yesterday here on HN there was a top thread about how an open dataset used for
training self-driving cars is rife with missing labels and mislabeling. [1]

Quite a few commenters insisted the issue was a non-issue -- that all datasets
are noisy and mislabeling occurs all the time, that nobody's building actual
self-driving cars from it, that surely if an object isn't labeled in one frame
it'll be labeled in the next.

Now obviously we don't know what the cause of this particular crash was.

But I will say that I found people's willingness to defend widespread sloppy
labeling in training sets used for literal _high-speed life and death
situations_ rather shocking.

And I hope that crashes like this serve to remind us of the far greater
responsibility we have with regard to quality and accuracy when building ML
models for self-driving cars, compared to when we're merely predicting how
likely a credit card customer is to pay their next bill on time, or which ad
is likely to be most profitable.

[1]
[https://news.ycombinator.com/item?id=22298882](https://news.ycombinator.com/item?id=22298882)

~~~
bagacrap
The consensus was that the training set in question was not used for any real
life situations.

The only player in autonomous vehicles that doesn't understand the gravity of
the technology is Tesla/Musk.

------
harrisonjackson
Tesla says autopilot works but that the driver should be ready to intervene. This
is terrifying - like having a kid riding shotgun that might just reach out and
turn the wheel while you are on the highway.

Anyone who has driven with autopilot, how quickly might it react to a
perceived obstacle? Would it take a hard turn into a median faster than a
person could reasonably react if it thought there was something in the road or
that the road took a hard left?

 _Edit_ "Ready to intervene" can mean a lot of things: ready to take over in
traffic or inclement weather vs. a firm grip on the wheel to resist rogue hard
turns.

Actual quote from the article

>Tesla says Autopilot is intended to be used for driver assistance and that
drivers must be ready to intervene at all times.

~~~
ip26
I can't help but think "ready to intervene" would require an even higher level
of alertness than just driving the car yourself, as you have to always be
prepared for it to do something completely unexpected on a dime, and to
notice without the feedback you normally get through the wheel, pedals, etc.

Even worse, if you are taking an action the autopilot should have taken but
didn't, you are axiomatically already late no matter how fast you react.

~~~
perl4ever
"you have to always be prepared for it to do something completely unexpected
on a dime"

This is inherently absurd, even if you remove the last three words. You could
write an essay, or a book, on why this is an example of a class of impossible
problems. Or, inverting it, it's practically the generic framework of all
catastrophes.

~~~
pferde
You are being downvoted because your reply is basically an unnecessarily long
"nuh-uh!", a dismissal without any counterargument. Wrapping that in a
string of fancy words does not change the fact that you are not providing any
argument at all.

~~~
mistahenry
As far as I can tell, the previous commenter was in agreement with the
sentence he quoted, going so far as to say that the “on a dime” stipulation
isn’t even necessary to consider the idea that we “must always be prepared for
it to do something unexpected” an absurd reality

~~~
pferde
Hm, after reading and re-reading the post half a dozen times, I realized it
can indeed be interpreted that way. Now I'm confused. :)

------
throwaway5752
There are going to be Autopilot crashes, and people should disable it if they
are uncomfortable with putting their lives in the hands of Tesla's software. I
would be. But I see no reason not to believe what they have on
[https://www.tesla.com/VehicleSafetyReport](https://www.tesla.com/VehicleSafetyReport)

It shows Teslas have far fewer accidents than the average car, and that
Teslas with Autopilot enabled are substantially safer per mile than with it
disabled (25-50% more miles driven per accident).

The article says a Prius hit the same place as this Tesla the week before. But
the NTSB isn't investigating Toyota.

~~~
truncate
I do believe autopilot has the potential to be safer than driving manually.
But I'm not sure what to take away from that safety report.

(1) Comparing against all other cars in the US (and across all states)
introduces lots of sampling bias. Teslas are much more common in states like
California than in, say, Tennessee. Tennessee accounts for many more accidents
in general than California, which adds up in the overall stats but not in the
Tesla stats.

(2) What kind of variation do we have between Tesla drivers and other car
drivers? As the comment above mentions, Tesla drivers may be more responsible
drivers.

(3) What counts as an accident? Minor collisions (fender benders) vs. major
collisions. More specifically, I want to know how Autopilot reacts when it
encounters something it has never seen before (like the crash in the article).
Humans can handle that pretty well IMO, given they are not way beyond the
speed limit, and all kinds of random shit happens on the road.

(4) The Autopilot vs. manual Tesla stats are promising IMO. However, a little
more information would be nice to have, like the proportion of the 1 billion
logged miles that are Autopilot miles.
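Point (1) is worth making concrete. Here's a toy sketch (all numbers invented) of how a fleet concentrated in low-accident regions can look safer in aggregate even when it is no safer within any region:

```python
# Crashes per million miles in each region; identical for both fleets,
# i.e. neither fleet is actually safer where it drives.
crashes_per_million_miles = {"region_a": 1.0, "region_b": 4.0}

tesla_mix = {"region_a": 0.9, "region_b": 0.1}  # Teslas clustered in region A
fleet_mix = {"region_a": 0.4, "region_b": 0.6}  # national fleet skews to B

def aggregate_rate(mix):
    # Mileage-weighted average crash rate across regions
    return sum(share * crashes_per_million_miles[r] for r, share in mix.items())

print(aggregate_rate(tesla_mix))  # about 1.3: looks much safer in aggregate
print(aggregate_rate(fleet_mix))  # about 2.8: looks worse, same per-region rates
```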

~~~
Brakenshire
There’s also a sampling bias for the kinds of roads where Autopilot is
available, and the kinds of circumstances where people are most likely to
trust and use it.

------
aerovistae
I was quite confused by this. He knew the car was taking a consistently
erroneous action that was likely to have fatal consequences, but he continued
to use autopilot on that same stretch anyway.

To be clear, I think Tesla has been cavalier and careless in their approach to
autopilot.

But Walter Huang knew what he was doing and did it anyway. His death is
largely by his own hand. I don't see this as different from people who
knowingly take other careless risks and pay a price.

~~~
mixmastamyk
Think of the "dumb" folks who take a selfie on top of a skyscraper and then
slip off. It's not until you are careening toward the abyss that the reality
of the situation fully sinks in. Up until then, it's only theoretical.

------
smallstepforman
As a software engineer, all I have to do is imagine that my colleagues wrote
the autopilot code, and that would keep me from ever turning the feature on.

~~~
WilliamEdward
This kind of thing makes me think software engineering should require an
actual accreditation like a formal engineering job does.

~~~
asdff
Try to avoid putting your life in the hands of closed source software.

~~~
mkl
Wouldn't that make it impossible to drive almost any modern vehicle? Or fly in
any modern passenger aeroplane?

~~~
bagacrap
Or use the healthcare system

------
thbr99
Don't text while on Level 3 autopilot, which is a nascent technology. An
engineer should know better. I wonder why a Level 3 system is allowed to be
marketed as an 'autopilot'.

~~~
Balgair
> An engineer should know better

Maaaaybe? I'm not a hydraulic engineer and won't pretend to be one. So I hire
plumbers when things go wrong. If they screw up, I don't tend to blame myself
though. I think the same logic would apply to the deceased.

~~~
quelltext
For some reason "Apple engineer" is used a lot (see title) in discussions
around this on both ends (i.e. "he was an engineer and thus knew his stuff/was
credible, and he voiced some concerns to friends/family, so let's agree with
it being Tesla's fault" vs. "he was an engineer and should have
known/anticipated the system's caveats"). The thing is you cannot have it both
ways, so I think (and I think we agree) we should not appeal to the driver
being an engineer at all.

~~~
totalZero
Being an engineer should only add value to his comments about an engineered
technology. His complaint was made from a position of software expertise and
familiarity with technology.

His inability to anticipate the system's caveat that led to his death is a
testament to the unpredictability (from the user's perspective) of the
vehicle's behavior.

------
adaisadais
Tesla has some blame in this for sure, but Caltrans (the maintainers of the
101) are equally if not more to blame.

California is rich in capital but poor in infrastructure. Moving here from
South Carolina (where taxes are much lower and roads are paltry), I was
shocked at how dismal a state many of the roads are in, and at the sheer
amount of time repairs take to complete.

Patrick Collison of Stripe actually wrote a blog piece(1) condemning SF for
their lack of speed when it comes to road maintenance.

There is no end in sight. Until we as a nation hold our governments
accountable for road maintenance and proper engineering, we will never have
any great resolution.

(1) [https://patrickcollison.com/fast](https://patrickcollison.com/fast)

~~~
mgoetzke
Many US citizens I talk to seem to have the stance that they do not want to
pay any taxes and that taxation is theft.

I am sure California and others could do more and be held accountable (every
service provider should and must be), but that attitude has to change too.

~~~
chillacy
CA already has some of the highest taxes in the nation but the roads and
public transportation aren't leaps and bounds better than the cheaper states.
I'm not convinced raising taxes would help without fixing the accountability
issue.

------
curiousgal
This happened 2 years ago.

It was a left-hand exit (why are those even a thing?).

The separator did not have a working crash attenuator because a car had
previously crashed into it, I wonder why.

No one ever dies in car accidents.

Hence, Tesla is super bad for marketing its autopilot according to HN.

~~~
toast0
There's a left-hand exit there because there's an interchange between
freeways, and the HOV lane exits on the left, to enable free flowing HOV
traffic to avoid merging through general traffic to get to a right exit.

At this interchange, I've seen lots of human drivers change between the
exiting HOV lane and the continuing HOV lane (or vice versa) much later than
is safe; presumably because they were surprised that their lane was exiting,
or they noticed the exit too late, but didn't want to miss it. I haven't seen
anyone drive in between the two lanes as if it was a lane, though; and usually
the late lane changers are braking, not accelerating.

It is unfortunate that the crash attenuator wasn't reset. Looking at the docs
for a similar attenuator[1], it seems like resetting is somewhat involved, and
you'd need a trained and properly equipped repair crew to do it, even if the
time required is not that much. Scheduling is probably an issue.

[1]
[https://www.dmtraffic.com/assets/sci_smart_cushion_design_an...](https://www.dmtraffic.com/assets/sci_smart_cushion_design_and_installation_manual_2015.pdf)
page 9-13

------
voodooranger
Breaking news from 2018? There’s nothing new here, including the deceased
driver’s alleged complaints.

~~~
thedance
The news is that the NTSB released new information. Unfortunately their site
is now down.

~~~
LatteLazy
I'm not the person you replied to, but is this new info? I could have sworn I
read all this last year...

------
StylusEater
Why don’t we introduce some of the same requirements for drivers as we have
for private pilots (think traditional biennial flight reviews)? Not only
would we most likely reduce fatalities, accidents and injuries but we’d
probably also eliminate many unsafe drivers from the road while placing a
forcing function on public transit to increase usage. We’d probably also see a
knock on effect to the training market and increase employment in that field.
We might also see an increase in tax/fee income for government entities to
help reduce reliance on gasoline taxes. Obviously I haven’t done a rigorous
analysis but it would seem like a win all around in my humble opinion.

~~~
simonswords82
Can't speak for the FAA but the CAA only requires me to sit with an instructor
for one hour every two years to renew my SEP rating.

It's not onerous and I think you're right, a sit down with an instructor every
two years to correct any minor bad habits before they get out of control is an
excellent idea.

------
gwbas1c
Regular Tesla Autopilot driver here:

I just don't get this situation. With Autopilot, it takes a tiny amount of
pressure on the steering wheel to take over. The pressure is so small that,
when Autopilot jerks the wheel, you're more likely to turn it off.

So, assuming the driver was paying attention, with his hands on the wheel,
this makes no sense. The only way the accident makes sense is if the driver
fell asleep, wasn't paying attention, or accidentally turned off autopilot.

And falling asleep, or not paying attention, is a real risk in any car.

(Note: What I remember from older articles about this topic is that the car
had been nagging him for a while to put his hands on the wheel.)

------
ailideex
Maybe there is something wrong with the coffee I have had today but I utterly
fail to see the novelty or notoriety in the title as a whole or any subset of
it. Can someone maybe point it out to me?

People complain a lot. Engineers are people. People still do things they
complain about. People die in crashes. Cars crash. Teslas are cars. What am I
missing?

~~~
mcrae
I mean like, he had previously complained about the car swerving towards a
barrier on the 101. He later crashed his car _into the same barrier_ while on
Autopilot which took his life.

I think the intuition is that like, if a piece of technology repeatedly shows
itself to drive your car into a concrete barrier, maybe don't play video games
while passing that same concrete barrier.

~~~
Galaxeblaffer
... while using the same auto pilot that earlier steered your car towards
certain doom

------
nojvek
I’ve been using the comma.ai EON for a while and am very happy with it.
They’ve done the “what it knows, what it knows it doesn’t know” part very
well.

The system will beep hard at you when the vision system doesn’t have
confidence. It only has confidence on well-lit, well-marked roads. Otherwise
it yells at the human to take over.

Take your eyes off the road and it yells at you. The disengagement rate is
pretty high, but I like it. I know it does the 405 and I-5 highways well, and
that’s where I use it.

Easy to understand the limitations. They also make you explicitly say yes to
all the limitations on first start.

------
DennisP
I'm not convinced we should _entirely_ blame Tesla, given that humans have
been crashing into the same barrier:

> In the three years before the Tesla crash, the device was struck at least
> five times, including one crash that resulted in fatalities. A car struck it
> again on May 20, 2018, about two months after the Tesla crash, the NTSB
> said.

The article also says the engineer had complained about his Tesla veering
towards this particular barrier. I don't understand why he still relied on the
autopilot while driving past it.

------
Dumblydorr
This is from 2018, and the driver didn't even have his hands on the wheel,
even though he knew there were problems at that spot. Very careless driving.
Personally, I wouldn't play a mobile game while driving a vehicle, never mind
while trusting software that's still in development.

------
ww520
I have a car with the "autopilot" feature like the Tesla, but they call it
adaptive cruise control and lane assist. The adaptive cruise control works
pretty well; I guess it only needs to check the radar to see what's in front
to slow down. The lane assist is problematic in recognizing different kinds of
lane markings and would steer off course from time to time. I guess the
current state of "autopilot" technology is simply not there.

~~~
_-___________-_
A lot of rental cars I've used in Italy have this lane assist feature. Italian
roads often have... inconsistent or incomplete... markings. After the first
day driving I had to Google how to switch it off, because it tried to crash
the car several times. I don't understand how these features can be launched
in this state.

~~~
rhlsthrm
Anecdotally to the contrary, in California where the lane markings are quite
good in most areas, the Tesla AP works great.

~~~
_-___________-_
It would seem sensible for the "autopilot" to have knowledge of where the lane
markings are known to be reliable, and only enable itself there. The vast
majority of the world's roads have lane markings that absolutely cannot be
relied upon.

------
Four8Five
I had a scary experience in a Model S on auto-pilot going over a bridge where
the car swerved left and almost hit the concrete barrier. Not entirely sure
what happened but since then, I've been hesitant to use auto-pilot 100% of the
time.

~~~
jryan49
Why would you use it at all at that point? If this guy had heeded his
experience the first time autopilot almost crashed his car, he'd probably
still be alive.

------
anonsivalley652
Sigh. It's like the NTSB's complaints about the FAA: tombstone, reactive
regulation rather than proactively implementing recommendations.

I don't mean to dogpile on Tesla in particular; I think autonomous driving
will eventually be far, far safer in the long term than human-error piloting,
but it won't be outlawed because of egos.

>>> Q: Is it still the case that there are several classes of insufficiently-
solved autonomous driving problems? IIRC one of the big ones is/was
recognizing stationary objects in the motorway when traveling at high speed
(80 mph / 129 kph). Shouldn't one of the "prime directives" of autonomous
driving be that it cannot operate at speeds higher than it can stop or avoid
objects and people? These conditions include blind curves in the hills,
children darting out from in front of parked cars, fog, rain or
headlights/darkness.

IIRC, another difficult problem is recognizing pedestrians or people in
unusual situations: winter clothing, different heights, skin tones, or people
doing things like walking on stilts or riding a unicycle.

PS: when a type of system becomes relied on pervasively, so-called "edge-
cases" become everyday occurrences.

~~~
ironmagma
That would be all speeds. The worst case is that the computer doesn’t
recognize an object as being on a collision path until after impact is
imminent, which could be due to the object being filtered out as rain,
classified as vegetation, improper predicted trajectory... any time you set up
rules for self driving cars, things get very fuzzy very fast.

------
neilpa
Does anyone have a link to the actual NTSB documents linked in the article?

> [https://www.ntsb.gov/news/press-
> releases/Pages/NR20200211.as...](https://www.ntsb.gov/news/press-
> releases/Pages/NR20200211.aspx)

Looks like the ntsb.gov site is currently down and they haven't been indexed
on archive.org yet.

------
dkarl
Obviously Tesla is going to hide behind the consumer's decision here.
Autopilot scared the shit out of this guy on multiple occasions, and he kept
turning it on and trusting it with by far the most deadly thing he did every
day. Odd choice when you put it that way, but one a lot of people make, I'm
sure, maybe one that I would make.

I really hope the government isn't too aggressive in hindering companies from
doing self-driving car R&D on public roads, but I would be fine with the
government saying consumers shouldn't be used as guinea pigs for systems that
are less safe than typical human drivers. Anecdotal evidence can't establish
that self-driving cars are less safe than humans (even if it establishes that
they make mistakes that humans wouldn't) so I would expect a numbers-driven
standard, but self-driving technology shouldn't be unleashed on consumers
until it achieves at least a human level of safety.

------
HoustonRefugee
I don't trust my car. I roll down the windows if I leave the keys in it for
any reason, so I don't get locked out when it decides to auto-lock itself
while I've left it running or stepped out for whatever reason. I sure am not
trusting it to drive itself. Maybe in 2150, when the technology is hammered
out.

------
jug
I wonder if we get skewed opinions about autopiloted cars because every story
about them blows up, while people accidentally killing themselves in cars
don’t.

------
microcolonel
Tesla's autopilot has now actively hurtled their cars at full speed into
concrete barriers on more than one occasion.

They really need to work on this, or they will ruin self-driving for everyone.
This is basically a new class of collision; it is not the ordinary way people
die on highways. If they cannot avoid actively smashing their customers into
solid concrete barriers at over 100 km/h, on major roads _near their company
headquarters_, then I don't know how they can make this feature so accessible
to a broad customer base.

------
zed88
The fundamental problem lies in the usage of the word 'autopilot' for a
glorified cruise control system.

------
mng2
I spent some time looking through the NTSB docket today. I was a little
surprised that Apple was willing to provide logs from a development model
iPhone. Evidence seems to point to the victim playing a game while commuting
-- in my opinion this is a major indictment of Tesla's approach to self-
driving, where the user is lulled into a false sense of security.

Another docket was updated today, a Tesla crash in Florida eerily similar to
the one a few years ago: semi turning across the highway, Tesla goes under the
trailer. Not a great failure mode.

------
WheelsAtLarge
Question: if the guy thought the Autopilot was failing, why did he continue to
use it?

~~~
taneq
It's the curse of SAE Level 3. The driver is responsible for monitoring a
system that works fine 99% of the time; they start to feel a false sense of
security, and when it does fail they're not paying as much attention as they
need to be.

This is also why SAE autonomy levels are a poor measure of a vehicle's
capabilities, but that's another rant.

~~~
qeqeqeqe
It's still safer than not having it.

~~~
taneq
Arguably no, it's not. This is why Waymo stopped working on level 3 and
switched to level 4/5 systems. It's also why most vehicle manufacturers up
until now have only supplied ADAS features which provide backup to the human
driver (lane departure warnings, emergency braking assistance etc.) rather
than features which drive while using the human driver as backup.

There are a few coming out soon (SuperCruise, Pro Pilot) that will offer some
degree of level 2/3 operation. It'll be very interesting to see how that goes
liability-wise.

------
_ph_
The headline alone should show what is odd about this case.

* Tesla clearly states that the autopilot is a driving aid and should not be used unobserved.

* The engineer complained about the autopilot not working perfectly at the very place of the crash. (Quite understandable given the state of the road and the near absence of lane markings in videos of the crash site.)

* Evidence seems to show that prior to the crash he did not pay attention while driving on autopilot at this very part of the road.

So I really don't get why anyone would trust their life to a driving assist
instead of paying attention and closely supervising the system.

Where I think the lawsuit has merit and a good chance of succeeding is the
state of the road and its maintenance by Caltrans. They are in my eyes liable
for the deadly outcome of the crash, and for there being a crash at all:

* Left exits are by themselves a safety issue and should not exist (they are extremely rare here in Europe).

* The lane markings were almost non-existent. Potentially dangerous sections of road should have very good lane markings.

* As a consequence, it seems to be common for cars to crash into this divider. That there was a "reusable" crash attenuator on it should show the dangers of this section clearly.

* The crash attenuator was not active, which led to the deadly nature of the crash, as a car had crashed into it a week before. No other protections were in place (traffic cones, barrels).

So this section is demonstrably very dangerous: human drivers regularly crash
into the divider, there are no traffic cones (or they have been destroyed by
people almost crashing into the divider), and it was not secured after the
last crash. The exit and the left lane should have been closed to traffic as
long as the crash barrier was not in place.

~~~
ardit33
Nice, blaming the victim there.....

Bad pavement markings and work-in-progress sections exist on almost every
major highway... it is not an exceptional case, and it should not be treated
as such. The expectation is that the technology should work, or give you
enough advance warning to take control...

There have been plenty of videos where the Tesla autopilot seems to just
steer straight towards a side wall, with no warning whatsoever...

A lidar would/might have prevented it... it would have served as a warning
that the car was about to smash straight into a solid wall and that the
visual cues (lines on the pavement) were misleading... I don't think a
camera-only system is good enough for automated driving... I feel Elon is
beta testing with people's lives...

~~~
_ph_
Considering the victim has complained that the system does not work well at
this very spot, the question stands, why did the victim use the system
unobserved at this very spot?

And independently of who was in control of the car, I showed why this section
was in a miserable state of maintenance and dangerous by its very setup, as
humans regularly crash into it. So this section is _dangerous_ and should not
exist in such a dangerous form, especially as the installed safeties were not
active at the time of the accident.

------
reallyscared
The thing about self-driving cars is that even though we may get to a point
where they are statistically better at driving than humans, the reasons they
crash will be unrelated to the driver. This means that humans, thanks to
modern life, are even further removed from any concept of self-determination.

Being a responsible person and not driving while drunk, or driving in a
reckless manner will no longer improve your chances of not being seriously
hurt in a car crash. While driving on the roads already removes a lot of self-
determination due to the fact that you can't control all the high-speed
objects flying around you, there was at least some degree of improvement
possible by being responsible.

Once self-driving cars are available, whether you live or die will be up to
random chance. Some people will be better off due to the self-driving cars,
and others will be worse off, depending on if they were above the average or
below in terms of their driving behaviour.

------
lmilcin
Well, being an engineer and understanding how fraught with problems AI is
(well, it is not AI, it is just ML), I would not be handing my life over to
the algorithm yet.

I understand these things are probably a necessary step, but maybe it
shouldn't yet be possible for a car to drive in complex traffic without the
steering and attention of a human driver.

------
mgoetzke
Why are there still these death traps on highways? No matter whether driving
on Autopilot, lane assist, or manually, it is statistically quite likely that
a car can and will drift off for a second.

I have trouble thinking of any such places on the German Autobahn where
drifting off would lead to a full head-on collision with a concrete barrier.

------
crca
Tesla publishes their safety numbers in terms of accidents per x millions of
miles driven. If you believe their numbers then Autopilot in its current
configuration is clearly safer than a purely human operated vehicle. Having
driven on Autopilot for about 85% of the total time I’ve spent in a Tesla,
it’s been a life changer. But obviously YMMV.

It really sucks that this specific interchange has cost so many lives - and
it’s apparent that there’s some frailty of the code that makes autopilot
vehicles more susceptible.

In general when it comes to crashes on Autopilot, it’s important to keep
perspective. If you take your hands off the wheel and look at your phone for 6
seconds in any other car you’re going to have a bad time. On Autopilot, it
took a confluence of bad road design, poor road maintenance, and an unlikely
software fault to initiate a crash.

~~~
mettamage
> If you take your hands off the wheel and look at your phone for 6 seconds in
> any other car you’re going to have a bad time.

Which is why you don't do it in any other car. I don't know much about
autopilot, but because we as humans don't know when it can go wrong, it's
easier to get it into situations it wasn't designed for. This is not true for
the 6 seconds any other car scenario as we will be predicting what the
steering wheel and traffic in front of us might do.

~~~
crca
Very true. Tesla has done a lot to try and make it clear to drivers that they
must remain vigilant - in this particular case it sounds like the driver
ignored several auditory and tactile warnings to resume control of the vehicle, in
addition to the standard warnings when Autopilot is activated. Short of
changing the name I think they’ve done everything they can to manage
expectations for Autopilot once you’re in the car, but humans don’t always
work like that.

~~~
riffraff
I do think changing the name would be the responsible thing to do.

If you call something a knife you can't 100% blame the customers if they try
to use it to cut stuff, even if you say "Knife™ should not be used for
cutting"

------
shiftpgdn
IMHO the fault ultimately lies with Caltrans. Poorly drawn lane markers
resulted in a human driver hitting the barrier just days before destroying the
arresting system. If the lanes had been painted correctly and the arresting
system replaced the Tesla driver would not have died.

~~~
Aloha
I think an auto driving system needs to have a _higher_ margin of safety than
a human driver, not the same or worse.

~~~
perl4ever
People say a self-driving car is safer than the average person. I say, well,
I'm above average, because I don't drink or text while driving, for starters.
So people say "well you just _think_ you're a better driver - everyone thinks
that and on average they're wrong". And I say, assume I'm lying about not
drinking or texting, it's still a lot cheaper and more efficient to stop doing
that than to buy a new car with the latest tech.

If a self-driving car isn't safer than a _sober professional driver_ , then
wouldn't it be logical to just have people ride the bus instead of wasting
resources on tech? I knew a bus driver who went a million miles without an
accident.

------
Trias11
"Safety remains Caltrans top priority," he said.

Where are these corp dummies trained to claim that <the thing we totally
screwed up> is our top priority? This keeps repeating after every incident,
exposing ignorance and a lack of care and attention to essential things.

------
dreamcompiler
I am an expert in AI. Learned the symbolic kind before the AI winter, then
worked on neural nets (stealthily) during the AI winter.

Autonomous driving cannot yet handle all the edge cases, and when it fails
people die. What's worse is when it fails it cannot explain why it failed, so
we just throw more training data at it and hope for the best. That's not
engineering; that's ML cargo cult voodoo.

Modern ML is great for finding family members in your photo collection, but
nobody dies when it labels your dog your grandmother.

Tesla makes great cars but I never use their autopilot. They should focus on
what they do best and give up this obsession with autonomous driving.

------
albertTJames
If you know your autopilot malfunctions on a certain highway segment, having
observed it malfunction several times, why would you still use it? Suppose
you were a brain surgeon who observed that your brand new ultrasonic scalpel
caused bleeding, reported it, but decided to keep using it. The day a patient
dies of a hemorrhage post-surgery, would it be the fault of the company
producing the ultrasonic scalpel, or yours? Those clowns admit to knowing it
was faulty exactly where it crashed and are still suing Tesla. This habit of
Americans suing every time they have an issue is exhausting.

------
shaneprrlt
Makes you wonder if the point of technological progress is to actually make
human existence safer and easier or if it's just to push up a stock price. I'm
still hopeful it's the former.

We'll hear the same canned response about how many lives have been saved, but
just veering off into a concrete wall spontaneously? That's completely fucked.

The fact that millions of these vehicles are deployed with this software
running should concern every other driver on the roadway. Remember when Tesla
said 1 million robotaxis by 2020? o_O

------
scoot
This was mentioned in the article discussed here when the family first sued:

 _Tesla issued a statement placing the blame on Huang and denying moral or
legal liability for the crash.

“According to the family, Mr. Huang was well aware that Autopilot was not
perfect and, specifically, he told them it was not reliable in that exact
location, yet he nonetheless engaged Autopilot at that location."_

[https://news.ycombinator.com/item?id=19802284](https://news.ycombinator.com/item?id=19802284)

------
PerilousD
I use the radio in my car probably less than 1 percent of the time I'm
driving. Most of the time I'm listening to streaming media: podcasts,
SoundCloud, etc. So why do all the accident reports stress DATA streaming on
the phone, as if that isn't pretty normal these days? It clearly doesn't mean
texting and driving; even texting can be done purely hands-on-wheel, eyes
straight ahead, just using voice. It's like the press is still living in
2002, not 2020.

------
32gbsd
The last stat I saw was something along the lines of 3k people dying in
crashes every day. I personally "almost crash" at least 4 times a day.
Driving is a dangerous game.

~~~
ksml
You almost crash at least 4 times a day?? I have no stats but I don't think
that's normal. You should be driving much more defensively if that's the case.

~~~
32gbsd
The drivers around here are very aggressive while at the same time bad at
calculating cause and effect of their actions

~~~
ksml
Yikes. Not sure where you are located, but that sounds pretty frightening.
Best wishes!

------
ryanmarsh
I drive with autopilot (auto steer as Tesla calls it) daily. Frankly it’s a
mixed bag. Anyone who has used it with any frequency should know how dangerous
it can be.

I use it because it usually brakes faster than I do when traffic suddenly
slows down. It also slams on the brakes at random points on a 3 mile stretch
of highway on my way home. For some reason it thinks the speed limit is
suddenly 45 and brakes aggressively. I’m afraid one day I’ll get rear ended
when it does that.

------
u801e
I've always thought that having a person take over from a computer when
driving is a fundamentally flawed premise. People, by nature, are not going to
devote enough attention to a task they're not actively involved in and they
won't be able to take action fast enough to avoid an incident because of the
time it takes to context switch from whatever they were doing to handling the
situation that just came up.

------
anjc
I hope people aren't commonly doing what this poor guy did, turning autopilot
on and taking their hands off the wheel at _71mph_. At this speed, the
stopping distance is greater than the max effective distance of some of the
forward looking cameras, and a collision means almost certain death. I'm
surprised people will trust any autopilot system at a potentially fatal speed.
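
For a sense of scale, here is a back-of-envelope stopping-distance estimate at that speed (the 1.5 s reaction time and 0.8 g braking deceleration are assumed dry-pavement figures, not sourced from the article):

```python
# Back-of-envelope stopping distance at 71 mph: distance covered during
# the reaction time plus the braking distance from v^2 = 2*a*d.
MPH_TO_MS = 0.44704  # miles per hour to meters per second
G = 9.81             # gravitational acceleration, m/s^2

def stopping_distance_m(speed_mph, reaction_s=1.5, decel_g=0.8):
    v = speed_mph * MPH_TO_MS            # speed in m/s
    reaction = v * reaction_s            # distance before braking starts
    braking = v**2 / (2 * decel_g * G)   # distance while braking
    return reaction + braking

print(round(stopping_distance_m(71)))    # ~112 m (about 370 ft)
```

Over 100 meters of total stopping distance leaves very little margin if a stationary obstacle is only detected at close range.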

------
kraig911
Fail fast, fail often doesn't work when your life is on the line. I'm sure
Tesla did a good job testing, but this is one of those cases where the real
world is simply hard to fully account for. Mind you, the self-driving car is
probably way better than a comparable human driver. Why do we not feel bad
when a human accidentally kills himself from faulty programming too?

------
dwild
Reading this article I got a big flash when I read "slammed into a concrete
barrier", because it nearly happened to me a few weeks ago, and it shook me
quite a bit to think of the implications. It wasn't a self-driving car,
though; it was just a bit of aquaplaning. But I had still let myself get
distracted enough to drift too close to the concrete barrier, and I wasn't
aware enough of my environment to notice the water accumulating on the side
of the road from the melting ice and snow.

That didn't make me think that cars are too complex and should no longer be
driven; it made me think that I should be more careful in my driving. Yet
here we are, arguing whether this is a signal that we shouldn't allow the
autopilot feature.

Autopilot or not, YOU are still the driver. If there's a mistake, YOU should
be responsible for it. As long as there's a steering wheel, and I believe
there will be one for a pretty long time, you are still the one responsible
for making sure these things don't happen. If you can't manage that, don't
use autopilot, just like I won't use a car if I can't manage to drive.
Getting a driving license may seem like a formality, but it's not; it's a
test to see whether you are apt to drive.

~~~
allovernow
>Autopilot or not, YOU are still the driver. If there's a mistake, YOU should
be responsible for it. As long as there's a steering wheel, and I believe
there will be one for a pretty long time, you are still the one responsible
for making sure these things don't happen

But that totally defeats the purpose of autopilot...

~~~
dwild
> But that totally defeats the purpose of autopilot...

Of a true autopilot, sure, but for now they are all pretty far from being
one. For now its only purpose is to let you put less effort into driving.

------
throwaway724
I'm increasingly of the opinion that technology like this will never see broad
regulatory approval. I just can't see an argument where any agency will be
comfortable with acceptable losses, and I don't see any endgame in which this
technology does not result in an infinitesimal, but non-zero amount of fatal
errors.

~~~
keanzu
On a regulatory basis, true self-driving only needs to outperform humans.
Humans are good at driving but imperfect; we have a nasty tendency to get
ourselves into accidents, both fatal and non-fatal.

My thought was that the public would fear and refuse to use the technology but
some brave souls are jumping right in.

My opinion: full self driving is only a matter of time. At some point in the
future insurance companies will figure out that self driving is less dangerous
than human drivers and start offering cheaper premiums and deductibles to self
driving only cars.

~~~
semi-extrinsic
The fact about current car safety is that it's already really quite good. In
modern cars and "autopilot-feasible conditions" you are talking well below 1
fatality per billion vehicle miles travelled with regular human drivers.

This means that if a model has sold 1 million cars, they each need to drive
100,000 miles _with autopilot enabled_ before the insurance company has
enough statistics to say "this is safer than a human".
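
A rough sanity check of those numbers (the 0.5-fatalities-per-billion-miles human baseline here is my own illustrative assumption, not a sourced figure):

```python
# How many miles does a 1-million-car fleet accumulate at 100,000 miles
# each, and how many fatalities would a human-driver baseline predict
# over that distance? (Baseline rate is an assumption for illustration.)
cars = 1_000_000
miles_per_car = 100_000
human_rate = 0.5 / 1e9  # assumed fatalities per vehicle mile

total_miles = cars * miles_per_car            # 100 billion miles
expected_fatalities = total_miles * human_rate
print(total_miles, expected_fatalities)       # 100000000000 50.0
```

With roughly 50 expected fatalities at the human baseline over that mileage, an autopilot fleet would need a meaningfully lower observed count before the difference is statistically convincing.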

~~~
mtgx
No, the "facts" you keep reading about (from the same companies trying to sell
you on the technology) are extremely misleading.

Tesla, for instance, computes that statistic against "average driving", but
average driving happens in cities, while most Teslas enable Autopilot on
highways, which is where Tesla recommends enabling it, too.

Accidents happen much less often on highways, so of course this "statistic"
looks better. Put Autopiloted Teslas in cities and then see how that
statistic fares. My guess is it will become much, much worse.

A more telling statistic is that even Waymo, which is about an order of
magnitude better than anything else on the market, has an "incident" where a
human driver needs to intervene every 5,000 miles. For everyone else, a human
driver needs to intervene every few hundred miles.

That's far from the "self-driving" technology we were promised.

Two relevant posts from someone that used to lead the Waymo project, before it
was named Waymo:

[https://www.forbes.com/sites/bradtempleton/2019/06/10/gmcrui...](https://www.forbes.com/sites/bradtempleton/2019/06/10/gmcruise-
leaks-show-them-ridiculously-behind-waymo-its-time-for-better-more-public-
metrics/)

[https://www.forbes.com/sites/bradtempleton/2019/04/18/are-
ro...](https://www.forbes.com/sites/bradtempleton/2019/04/18/are-robocar-
teams-doing-safety-theater-or-acceptable-risk/)

[https://www.theguardian.com/technology/2017/apr/04/uber-
goog...](https://www.theguardian.com/technology/2017/apr/04/uber-google-waymo-
self-driving-cars)

~~~
sillysaurusx
It pains me to see you banned. You've been with HN a long time.

I compared your recent comments to 6 months ago. They seem a bit better now.

Why not just email and apologize? We all have bad days. Why let a string of
bad days tank your 8-year history?

Regardless of what you decide, I wanted to leave an encouraging comment. At
least one person is thinking of you and cheering you on. Good luck.

~~~
dang
If you go back to stirring up drama on HN with offtopic meta comments, we will
ban you again, much sooner than the years-long, hundreds-of-emails, dozen-
accounts process it involved last time. That was more agony than any other
user has single-handedly managed to cause on this site, and we won't go
through it again. No more of this please—nothing of the kind—nada—period.

------
crad
So guy notes autopilot malfunctions in a specific area of road, uses it
anyway, and dies. Am I missing something?

------
LoSboccacc
I don't understand: why did he keep trusting the autopilot on that stretch of
road after the first incident?

------
xbmcuser
So autopilot was having problems at a particular place every time, and he was
still using it. As much as Tesla is to blame, the driver is too. Why the fuck
would you trust something to work if you yourself have experienced it not
working multiple times?

~~~
trixie_
Not just autopilot; regular people were confused by the road too: "In the
three years before the Tesla crash, the device was struck at least five
times, including one crash that resulted in fatalities."

------
frankus
I can’t help but think that our culture-wide experiment in the normalization
of deviance when it comes to speed limits (both how they are set and how they
are observed) has something to do with this.

A crash at 55 mph would have 40% less energy than at 71 mph.
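
Since kinetic energy scales with the square of speed, that figure is a one-liner to verify:

```python
# Kinetic energy goes as v^2, so the ratio of crash energies depends
# only on the two speeds (mass cancels out).
ratio = 55**2 / 71**2  # energy at 55 mph relative to 71 mph
print(f"{1 - ratio:.0%} less energy at 55 mph than at 71 mph")  # 40%
```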

~~~
jbtule
The NTSB report information is public, from the NTSB docket,
[https://www.ntsb.gov/investigations/Pages/HWY18FH011.aspx](https://www.ntsb.gov/investigations/Pages/HWY18FH011.aspx),
he wasn't traveling at 71 miles per hour.

"At 3 seconds prior to the crash and up to the time of impact with the crash
attenuator, the Tesla’s speed increased from 62 to 70.8 mph, with no precrash
braking or evasive steering movement detected."

The autopilot accelerated into the barrier.
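
For a sense of scale, the implied average acceleration over those final 3 seconds works out to a modest but sustained push (a quick sketch from the NTSB figures, nothing more):

```python
# Average acceleration implied by the NTSB numbers: 62 -> 70.8 mph
# over the 3 seconds before impact.
MPH_TO_MS = 0.44704                 # miles per hour to meters per second
dv = (70.8 - 62) * MPH_TO_MS        # change in speed, m/s
a = dv / 3                          # average acceleration, m/s^2
print(f"{a:.2f} m/s^2 (~{a / 9.81:.2f} g)")  # 1.31 m/s^2 (~0.13 g)
```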

~~~
frankus
Could that be because it effectively registered a lane change by the car it
was following and was accelerating to the requested speed?

Or are you suggesting that the Autopilot accelerated beyond the top speed set
by the driver?

~~~
jbtule
You are right "traffic-aware cruise control speed was set to 75 mph". It was
traveling slower due to traffic.

------
agounaris
I don't get it! He complained that it was malfunctioning but he continued
using it?

------
empath75
Tesla should not be allowed to market their cars as self driving or call the
product autopilot. It’s a fancy driver assist system that will sometimes kill
you if you actually take Tesla marketing bullshit at face value.

------
fedxc
How can Tesla be liable for this? You should not trust software to drive your
car without human intervention. I don't understand the US culture where
everyone is entitled to sue everyone.

~~~
hyperbovine
Short answer is that the U.S. relies on litigation to fill the void left by a
lack of regulation. Tesla wouldn't be liable for this in Europe because
they're not even allowed to sell this technology there in the first place:

[https://www.cnet.com/roadshow/news/tesla-model-s-model-x-
aut...](https://www.cnet.com/roadshow/news/tesla-model-s-model-x-autopilot-
europe-regulations/)

------
1024core
I don't understand something. He had complained in the past about AutoPilot
steering the car towards the divider. _Then why was he still using the
AutoPilot in that situation??!_

------
jmspring
Suing Caltrans for failing to maintain the roads? Probably the same people
that voted for Newsom and the redirection of road funds to high-speed rail
boondoggles.

------
focus2020
Open-sourcing the software and datasets so their effectiveness can be
verified, while restricting commercial use, would help build confidence in
autopilot.

------
mattcantstop
I have a Tesla and think Autopilot is neat. But I never rely on it, because it
is blatantly not safe enough to be relied upon.

------
jonplackett
Obviously this is completely Tesla’s fault. But you gotta ask - if he knew it
was malfunctioning there, why keep using it there?

------
holstvoogd
My god, the endless stream of 'tEslA BAd!' ITT. Are you all just mad you
didn't buy stock?

This man already knew autopilot would have trouble in this spot and yet chose
to play on his phone. He is an idiot.

And to those who say 'they shouldn't call it autopilot if it can't do full
self driving!': I suggest you go read up on autopilot. Straight from the
definition: 'Autopilots do not replace human operators, but instead they
assist them in controlling the vehicle.'

------
blackrock
Autopilot works great!

Until the moment it kills you. Then you become the statistic.

Better buy some life insurance with that expensive robotic car.

------
0000011111
It looks like more non-Autopilot drivers have died at this location than
Autopilot drivers.

------
vardump
I thought at first this was some new crash, but it actually occurred in
_2018_.

------
hurricanetc
Complains about autopilot.

Plays video game in the car with hands off the wheel.

Well, okay.

------
KurtMueller
I can sense the fruition of a conspiracy theory here...

------
freepor
Complains to his wife that Autopilot steers him towards concrete barrier;
continues to use Autopilot.

------
imvetri
Sorry to hear that.

------
terminaljunkid
People have these vague notions about what technology can do because
"influential" people in our industry read sci-fi badly like children.

------
runako
I am intrigued by Teslas, and am considering purchasing one. However, I would
feel better about the purchase if there was absolutely no autopilot software
or hardware on board. Ideally that would mean _zero_ , including non-autopilot
builds of all system and control software so there is no chance the car will
ever engage autopilot.

Is it possible to buy a Tesla completely devoid of autopilot, or is this type
of accident potentially a risk in all of them?

~~~
keanzu
It is an extra cost option. Don't specify it and it won't be enabled. The
hardware and software are still present in the vehicle, just turned off.

There's never been a Tesla incident, that I know of, where the system _has
turned itself on_. Every single crash has been after a deliberate engagement
by the driver.

~~~
Balgair
Unintended accelerations in Nissans were blamed on bit-flips due to cosmic
rays. I think that one actually was true in the end. So unless you'd like to
test the CPUs near nuclear reactors and electron guns, I'd say that there is a
(cosmically) small risk of it turning on despite not enabling it.

~~~
remmargorp64
Wow, that's crazy! I didn't even think about that, but I can totally see how
it could be possible. I imagine that the only reliable way to prevent it would
be to have multiple redundant systems that operated via consensus.

------
sneak
At the risk of sounding callous: how many Tesla drivers have previously
complained about autopilot and not been killed whilst using it?

How many non-Tesla drivers were killed in the same timespan by driving
manually?

This seems like a very naked appeal to emotion, and not relevant to the
situation.

