
Tesla Model 3 Drives into Overturned Truck in Apparent Autopilot Failure - elliekelly
https://jalopnik.com/tesla-model-3-drives-straight-into-overturned-truck-in-1843827507
======
cashsterling
Stuff like this is completely expected of nonlinear, high-dimensional
controllers (and neural networks used for object detection and decision
analysis fit into this category)...

NNs are not fit for high-predictability, high-safety-factor control. I wonder
if, in fact, we cannot construct a NN controller with the level of unusual
object detection and edge-case scenario recognition required to match an
average human driver's ability to recognize and react appropriately to
unusual situations. I mean to say that I think this is something like an NP-
hard problem... to make it one log better you need 100x the data and 100x the
NN dimensionality. Model complexity explodes...
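
To make the scaling intuition concrete (the exponent here is purely
illustrative, not a measured value): if error falls as a power law in
dataset size,

    \epsilon(D) \propto D^{-1/2}
    \quad\Rightarrow\quad
    \frac{\epsilon(100D)}{\epsilon(D)} = 100^{-1/2} = \frac{1}{10}

i.e. one order of magnitude of error reduction costs two orders of magnitude
of data, before counting the extra model capacity needed to absorb it.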

I also think we are spending a lot of time and effort solving the wrong part
of the transportation problem. Cars pose a thermodynamic problem for society
--> using a 1500kg car to move 1-2 80kg people is very energy inefficient
(the payload is only ~5-10% of the moving mass) and resource intensive to
manufacture.

We need remote work, more home delivery options (e-truck/vehicles), greater
incentive for electric bikes, and better/more public transportation.

my 0.02...

~~~
jvanderbot
Cars should not detect "obstacles". They should detect roads. They should not
seek _exceptions_ to driveable space through classification- or segmentation-
first strategies. They should only detect driveable space. Full stop.

You either detect a full-stopping-distance's worth of clear road through
reliable (non visual) ranging with dense enough samples to preclude wheel-
catching voids ... or you will continue to run into "objects" which slipped
through your classifier.

I can accept reducing this margin (e.g. in dense traffic flow) once the
system has first been shown to work with the full margin.
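
As a minimal sketch of that policy (thresholds and helper names are
illustrative, not anyone's production code):

    def stopping_distance(v, decel=7.0, t_react=0.5):
        """Metres needed to stop from v m/s: reaction travel plus braking."""
        return v * t_react + v * v / (2 * decel)

    def path_is_clear(ground_returns, v, max_gap=0.5):
        """ground_returns: distances (m) of range returns confirmed to be
        road surface along the planned path. Clear means contiguous positive
        evidence out to the stopping distance; a gap in the returns is
        treated as an obstacle, never the other way around."""
        need = stopping_distance(v)
        prev = 0.0
        for d in sorted(ground_returns):
            if d - prev > max_gap:
                return False  # hole in the evidence: assume not driveable
            prev = d
            if prev >= need:
                return True
        return False

The point is the default: absence of positive evidence means "stop", so an
obstacle never has to be recognized by a classifier to be avoided.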

~~~
rimliu
Ok, the wind blows an empty paper bag onto the road. A human will see by the
way it moves that it is empty and safe to drive over. What will AI see? A
driveable road or a road with a rock on it? I still think that without a
"model of the world" no AI will be capable of driving. And that model
requires AGI.

~~~
taneq
I'll take a self-driving car that makes a controlled stop for a wind-blown
paper bag over a self-driving car that ploughs into an overturned truck.

~~~
syspec
I agree, but stopping on the highway for a paper bag could itself lead to a
serious chain of accidents.

~~~
brutt
It's because human beings are unable to keep a safe distance.

~~~
totalZero
Scapegoating of humans who make soft errors doesn't do anything to make
autonomous driving safer.

------
bronco21016
Somewhere along the line the marketing of Autopilot fell apart. As an airline
pilot when I hear the word autopilot I think of a system that generally drives
the aircraft along a programmed routing and vertical profile. It is most
definitely not a set-it-and-forget-it device. Training repeatedly stresses
that someone must be monitoring the actions of the autopilot at all times and
it’s taken very seriously as it can rapidly place you in a very undesirable
state.

Somewhere along the way Tesla Autopilot became marketed as a fully self-
driving system freeing us to tune out. I think the criticism about the
capabilities of the system could be toned down some but we most definitely
should be critical of what capabilities are advertised and what instruction is
given for the person operating the system. I think it’s very likely that an
Autopilot like system could make driving significantly safer but only if we
figure out the human interaction aspects and fully and truthfully communicate
the capabilities of that system.

~~~
teej
> Somewhere along the way Tesla Autopilot became marketed as a fully self-
> driving system freeing us to tune out

Do you own a Tesla, or have you ever considered buying one? This is a
misconception I hear from folks who never planned to own one, but not from
owners.

~~~
ascorbic
They literally call it "full self driving". I had a Model 3 with FSD last
year, and it's really pretty pointless beyond a party trick. The actually
useful stuff (lane-keeping and dynamic cruise control) is available in the
base autopilot option, and is not much different from features in other high
end cars. The FSD features are so unreliable as to be totally useless. Lane-
changing was often scary, with it not spotting vehicles in the blind spot more
than once. It somehow managed to miss exits on Navigate on Autopilot, and in
any case the feature is available in so few places it's barely worth using
unless you're only driving on highways. They also don't make it clear that
advanced summon is basically useless outside the US, because it only works
when you're within 10 meters of the car.

When I had to return that car when I changed jobs, I immediately ordered a new
Model 3, because it's a great car and I really like driving it, but I didn't
get the FSD option because it's dangerous, useless, and a waste of money. I've
not missed it once. Now the only issue is that whenever I get in the car I'm
reminded of Elon Musk's antics, which makes me cringe a little, but that's
another story.

~~~
AnssiH
> They also don't make it clear that advanced summon is basically useless
> outside the US, because it only works when you're within 10 meters of the
> car.

Hmm, is >10m a geoblocked feature or is <10m somehow more useful in US than
elsewhere?

~~~
ascorbic
It's a UN regulation that the US doesn't enforce. Having just checked, it's
actually just 6m.

------
morningcovfefe
This is not a self-driving car and it is not cruise control (which demands
you pay attention to the lane ahead of you). This is a savant 8-year-old
driving your car. It seems to behave fine in common cases, but if you relax
you might well die when an uncommon case comes up that this child has no
basis to handle. I personally would rather have an actual 8-year-old take the
car, since then I would keep the correct amount of attention on the road out
of well-founded fear of the limitations of the child's abilities.

~~~
colinmhayes
Do we have any evidence that shows 8-year-olds can't drive if given enough
practice? I mean, as long as they can see and reach the pedals, I'd say
driving is simple enough that they could do it if there aren't any
distractions.

~~~
tgb
I'd be interested in whether dogs could be trained. I know they have
literally been trained to drive cars, but not, as far as I'm aware, to such a
level that you'd let them on the highway. But my impression is that they are
better at focusing on the task than humans, when well trained, and maybe
would outperform them, given some navigational guidance. Seems like something
for a steampunk novel.

~~~
outworlder
> I'd be interested in whether dogs could be trained

Trained? Don't think so. I mean, there are dogs riding skateboards, so I
believe that, given the proper ergonomic controls, they could learn how to
drive a vehicle.

Driving a vehicle safely (as in, not stopping in the middle of a highway to
investigate an interesting object) would be a greater challenge.

But as far as navigating a space? Sure, they are amazing at that and far
better than anything we can come up with in the near future. Heck, if you
hooked up an insect brain's inputs and outputs, you would have a system
that's better than what we currently use.

~~~
s1artibartfast
Similarly, cultured rat neurons can operate a small robot and avoid
obstacles.

[https://www.youtube.com/watch?v=1-0eZytv6Qk](https://www.youtube.com/watch?v=1-0eZytv6Qk)

------
elihu
The car did eventually detect the obstacle and brake prior to impact,
otherwise it would have been a lot worse.

What seems weird to me from the video is that there was a person beside the
road (perhaps the driver of the overturned truck) waving at cars to stop, and
the Tesla didn't detect the pedestrian and slow down, even though he/she
appeared to be standing on the edge of the car's lane. It's hard to tell for
sure, but it looks like the pedestrian would have been hit if he/she hadn't
stepped out of the way.

~~~
BrentOzar
> The car did eventually detect the obstacle and brake prior to impact

The driver is claiming he braked, not the car: "Huang says he slammed on the
brakes once he noticed the truck, however, it was too late to stop the nearly
two-ton sedan traveling at reportedly 68 miles per hour."

[https://www.thedrive.com/news/33789/autopilot-blamed-for-
tes...](https://www.thedrive.com/news/33789/autopilot-blamed-for-teslas-crash-
into-overturned-truck)

~~~
dawnerd
Driver wasn’t paying attention, not surprising. You have to actively ignore
the warnings to pay attention or hack together a rig to put torque on the
wheel.

~~~
perl4ever
I don't understand how paying attention can prevent this sort of thing.

Let's say you're paying attention with all your might. You see something up
ahead, like an overturned tractor trailer. You expect autopilot will sense it,
but you're preparing for it not to. But how do you know it's _not_ going to
sense it? Well, obviously, when it hasn't started reacting in a reasonable
time. But then it's too late!

In order for a human to have time to figure out that the software isn't
working it would have to routinely operate with (much) slower reflexes than a
human. But then it would be useless, as you can't slow down normal driving,
and in any case, that's not how they designed it.
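
Rough numbers make the point (all figures illustrative): at 68 mph, about 30
m/s, suppose you give the system 2 s to show it is reacting before concluding
it isn't, then need 1.5 s to take over, braking at 8 m/s^2:

    d = v(t_{confirm} + t_{takeover}) + \frac{v^2}{2a}
      = 30(2.0 + 1.5) + \frac{30^2}{2 \times 8} \approx 161\,\mathrm{m}

That is over 160 m of road consumed just deciding who is driving.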

The hypothesis that people have unreasonable expectations of something called
"Autopilot" seems unnecessary and irrelevant to me.

~~~
Sevaris
Ideally there'd be obvious visual feedback that reflects the current state of
the Autopilot and what it recognizes. Something that can't be ignored (front
and center on the windshield maybe) and that can give the driver sufficient
information to make a decision about whether to interfere or not. This doesn't
seem to be on the radar for anybody though. It's like everybody assumes
they'll figure out a 99.99% working driving AI where driver intervention is
never necessary.

It's even more egregious with Tesla, who is already shipping a half-baked
self-driving feature in its current state where accidents like in the article
are inevitable.

I could even imagine the windshield outlining everything that the Autopilot
sees. If you don't see an outline around an object, you know the Autopilot
isn't aware of it. No idea how technically feasible that is.

~~~
perl4ever
This sort of problem can be generalized, I think, to a lot of software design
problems, including the sort of things I'm assigned at work (even though
they're very mundane and boring, as I'm not a real software engineer).

People say "make a program that does <thing> for me". And maybe you whip up
something that does 50%, or 80% or 99.5%. But that only makes them more
unhappy when it fails. A partial solution is no solution.

So, you have to come up with a model for humans to be augmented by the
software. Rather than trusting it to make the decisions, the software needs to
take a large amount of data and clarify and distill it so humans can more
easily make the decisions.

But people always want to avoid making decisions. That's why managers/leaders
have so much power even though they tend to be despised and seem incompetent.

------
jannyfer
When something similar happened in 2016, I read that Autopilot will tune out
bright objects against a bright sky.

[https://www.forbes.com/sites/dougnewcomb/2016/07/18/elon-
mus...](https://www.forbes.com/sites/dougnewcomb/2016/07/18/elon-musk-tweets-
tesla-will-make-changes-to-autopilot/)

I’m imagining that a low-dynamic-range camera may see certain
tunnels/underpasses’ exits as bright white objects blocking the road. The
above article refers to a now-deleted Musk tweet that says something similar.

~~~
simonklitj
Your comment made me remember that multiple Tesla engineers quit and spoke out
about the insufficient sensors a couple of years ago:
[https://www.technologyreview.com/2017/08/25/149475/some-
tesl...](https://www.technologyreview.com/2017/08/25/149475/some-tesla-
engineers-think-autopilot-isnt-safe/)

------
Animats
Didn't even slow down. Humans often brake late, but seldom fail to brake at
all.

It's grossly irresponsible of Tesla to ship a system that can't detect big,
solid obstacles. This happens over and over. So far, they've hit a crossing
semitrailer, a street sweeper, a car at the side of the road, a fire truck at
the side of the road, and two road barriers. This has been going on for five
years now, despite better radars and LIDARs.

~~~
wcoenen
> Didn't even slow down.

The car in the video does seem to be braking strongly 1 second before the
collision. There's a cloud emanating from the wheels.

> Humans often brake late, but seldom fail to brake at all.

Humans do sometimes fail to brake at all, e.g. when they fall asleep or have
their eyes on their phones.

~~~
dragontamer
[https://www.youtube.com/watch?v=LfmAG4dk-
rU&t=46](https://www.youtube.com/watch?v=LfmAG4dk-rU&t=46)

There are no brake lights on the Model 3. None at all. Based on this "reverse
video", my conclusion is that the M3 doesn't even brake.

-------

Somewhat worrying is the pedestrian (the truck driver?) who jumps back to get
out of the way of the M3. Not only did the M3 fail to see the truck, it
didn't seem to see the pedestrian either.

~~~
Groxx
That's a daytime video with a crappy camera at an abnormal-for-other-drivers
angle. It's entirely reasonable for them to not be visible, both due to the
low intensity difference and due to the angle not being what lights optimize
for.

Also note all the cars that followed and _clearly_ hit their brakes. They're
invisible or nearly-invisible, except when they go through the overpass's
shadow.

------
fnord77
> For any human driver paying even the slightest bit of attention, this
> accident is almost an impossibility, assuming the driver had the gift of
> sight and functional brakes.

No. A friend of mine was driving home up 101. There was a large boat sitting
in her lane. She thought her brain was playing tricks on her and did not
believe there was a boat sitting on the highway. And she crashed into the
boat.

~~~
moralestapia
Well then ...

A friend of mine was driving home up 101. There was a large boat sitting in
her lane. She thought her brain was playing tricks on her but eventually
figured out there was actually a boat sitting on the highway. She was able to
avoid the boat.

That's the problem with rumors.

------
bra-ket
I think autopilot is a misnomer in this case; it should have been called a
brainless driving assistant.

On a serious note, it's not that another edge case was missing in the
training data, but that what we call "common sense" is fundamentally missing
in the neural-network-based approach, and no amount of parameters, model
permutations and GPU power will compensate for that.

~~~
tylerhou
This isn't a neural network edge case/common sense issue; this is an issue
with the radar sensors that Tesla is using for driver assist.

Specifically, radar is noisy — it will constantly detect non-existent
"stationary" objects in front of the car. Driver assist systems therefore
ignore any objects which are stationary on the road — if it didn't, then the
car would stop erratically.
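
A hedged sketch of that filtering logic (field names and thresholds are
assumptions for illustration, not Tesla's actual code):

    from dataclasses import dataclass

    @dataclass
    class RadarTrack:
        range_m: float
        closing_speed_mps: float  # positive when the gap is shrinking

    def is_stationary(track, ego_speed_mps, tol=1.0):
        # A target closing on us at exactly our own speed is fixed in the
        # world frame: an overpass, a road sign, a multipath ghost... or an
        # overturned truck.
        return abs(ego_speed_mps - track.closing_speed_mps) < tol

    def actionable_tracks(tracks, ego_speed_mps):
        # The phantom-braking trade-off: dropping stationary returns removes
        # the false alarms and, with them, this entire class of real ones.
        return [t for t in tracks if not is_stationary(t, ego_speed_mps)]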

~~~
sandworm101
>> Driver assist systems therefore ignore any objects which are stationary on
the road

If true, that is totally unacceptable. Any mountain road can have fallen
rocks. Any city road can have a child's toy sitting stationary in the road.
And EVERY road can at any time have a live person, a fallen
motorcyclist/bicyclist, lying crumpled and stationary in the middle of the
road. Stationary objects, whether seen on radar, camera or both, cannot be
ignored.

If the radar 'sees' such an object for more than a heartbeat or two, the car
should take action. That's what any reasonable person does when they aren't
sure what they are looking at.

~~~
renewiltord
Why do you say this? I'm at the wheel so I can always act. The driver assist
makes this easier to do by helping out but I'm still here at the wheel and so
I don't need perfection, I just appreciate the help.

~~~
rtkwe
People aren't machines and when something works fine 95+% of the time they'll
get complacent and an uncommon event will eventually catch some people out.

~~~
renewiltord
My Subaru has Eyesight and I don't think I ever take my hands off the wheel to
let it drive, though it does lane-keep. It's a comfort feature.

I wonder if I would get more complacent in a Tesla.

~~~
rtkwe
Yeah there's a whole different level of automation between your car and what's
being advertised by Tesla. EyeSight does do a bit of driving for you at higher
speeds but it's not the stop and go capable 'Autopilot' of the Tesla.

------
mindfulplay
This is getting a bit ridiculous. The Silicon Valley mindset being applied to
safety-critical products is really bad. I hope they regulate the crap out of
Tesla, and God, I hope they do the same to comma.ai, whose founder seems even
more batshit crazy.

~~~
tim333
Comma.ai's system actually has an eyeball tracker that, I think, stops things
if you are not looking where you are going, which would likely prevent this
kind of crash.

------
mindfulplay
The level of respect that someone like Andrej Karpathy gets is jarring: he
should publicly apologize instead of acting all saintly (oh look ma, I built
a toy with 6 cameras, I am so proud of myself) while talking about deep
learning or whatever crackpot stuff they are cooking up, thinking they are
solving the world's "problems".

Pick your "heroes" properly HN crowd.

~~~
rcMgD2BwE72F
Was Autopilot even activated?

------
rck
Haven't seen anyone mention the human that almost gets hit. If the person
standing in the lane (the truck's driver maybe?) hadn't stepped out of the
way, the Tesla would have run him over without slowing down... Looks like
multiple clear perception system failures.

~~~
LeoPanthera
Or, autopilot was not engaged. I haven't seen any evidence yet that it was
turned on at all.

~~~
dagmx
Regardless, the car should have emergency braking. As it stands, it seems
like Tesla struggles with novel stationary objects right in front of the car.

It seems like their training is more geared to known object types in front,
and arbitrary objects on either side.

~~~
rcMgD2BwE72F
Yet, in most third-party tests and evaluations, Tesla's AEB gets better
results than the rest of the industry (with a 94% safety rating:
[https://www.euroncap.com/en/results/tesla/model-3/37573](https://www.euroncap.com/en/results/tesla/model-3/37573)).
Mmh...

------
nihil75
Who was it that said (and I'm paraphrasing): "Self-driving is easy, you could
build an AI yourself that will work 90% of the time. It's writing an AI that
works 100% of the time that's difficult"?

He built a car that drove cross country.

It's not "cameras vs. LIDAR", it's about software that needs to work
flawlessly.

I love the "Lean" development process and "just get it out there", but I
wouldn't write banking software that way.

Or autopilots, for that matter. A blatant disregard for human lives.

~~~
legohead
100% is impossible. It's a question of how many 9s you are willing to accept.
And LIDAR would add a nine or two, but Musk is just being stubborn.

~~~
justapassenger
He's stubborn, but realistically I don't think they have any choice.
Retrofitting all the cars they sold with FSD with a set of LIDAR sensors (or
refunding those purchases) would likely bankrupt the company.

Buying a car with FSD hardware years before we even know what's needed to get
FSD is like buying a PC spec'ed to run Duke Nukem Forever in 1997.

~~~
gonesilent
As a five-day-new Tesla driver playing with Autopilot, you can just feel the
thing is overworked. The road I live on, HWY 36 in California, was a joke on
Autopilot. The car can't do it; it has to slow well below the speed limit.
Every other turn it tries to drive off the road and gives up steering. Also,
roads with cracks that are black with pavement sealer confuse the car to no
end. The car is color blind, and the vision system gives it no time to react.

------
randcraw
Assuming autopilot was indeed engaged, this seems to be clear evidence that
cameras alone are insufficient to perceive big objects in a car's path OR
judge its distance to them.

This car traveled in a straight line in near-perfect weather for at least 10
seconds toward a large motionless object blocking its lane and the next one,
but _still_ overlooked the danger.

Imagine if this had happened in bad weather or at night, where risk of failure
is substantially greater. This degree of incapability implies that vision
alone is not only inadequate to perceive obstacles ahead that should be
obvious to any human, but _profoundly_ inadequate.

~~~
svachalek
Humans do it with vision alone, do they not? This seems to be a failure of
processing, not data.

~~~
thefounder
How do you think objects are detected with "vision" by Autopilot? Are you
considering classifying all the objects in the world in all possible
situations so that the dumb computer can avoid them? So far that seems to be
the plan.

~~~
skellington
You don't need to train with every object in the world to train in the general
concept of road occluders.

Autopilot users are ALWAYS supposed to watch the road and take control in
unusual circumstances. Like an autopilot system in an airplane, it will fly
you straight into a mountain if you give it that course and don't monitor
where you are.

~~~
thefounder
>> Autopilot users are ALWAYS supposed to watch the road and take control in
unusual circumstances.

It looks like there is no single general concept of road occluders. If
Autopilot can't avoid a big object in the middle of the road, what is it good
for? Tesla claims to provide more than an airplane autopilot; they say its
tech would avoid that mountain.

------
darwinwhy
I did a search for "lidar" in this thread and found nothing. But this seems
like the kind of obstacle that lidar would detect every time?

I'm not an autonomous vehicle engineer, so can anyone more qualified comment
on whether this crash is a symptom of Tesla's decision to avoid lidar?

~~~
MPSimmons
The number of obstacles that LIDAR could prevent hitting, which Autopilot
would not pick up, over the average life of a car, is very small.

Whether LIDAR would be worth the money is, I suppose, an academic exercise.
You could calculate the number of Teslas involved in such crashes, and divide
the price of a Tesla by the price of LIDAR, to decide if it is worth it (I
suspect not, at current prices), but ultimately, things will happen even with
LIDAR. It isn't a silver bullet.

I'm on my second Tesla. I'm totally biased. It has saved me from inattention
several times. I probably wouldn't _actually_ have crashed any of those, but I
might have. Regardless, I'm glad I have it, and I am safer with it than
without it. Sure, I could screw up and crash on autopilot, but it would be
easier to crash without autopilot.

~~~
Dylan16807
If the LIDAR is the critical piece to get the car above level 2 automation on
the highway, then that's worth quite a bit of money.

~~~
booi
afaik LIDAR is not useful in even minor precipitation (rain/snow/fog). If the
car is dependent on LIDAR at highway speeds then this needs to be addressed.

Being dependent on cameras is similar to a human situation and is (hopefully)
no worse.

~~~
Dylan16807
If the car can let me stop paying attention to the road, but only when there's
no precipitation, that's still a big improvement over a car that always makes
me pay attention. At least in most climates.

------
pengaru
Tesla's marketing, particularly of autopilot, has been criminally misleading
IMHO.

------
djstein
Should the car have slowed down by itself? Yes. Was the driver not paying
attention, without their hands on the wheel? Obviously. It is driver
assistance; you agree to the terms and conditions; you still have to DRIVE
the car.

~~~
toomuchtodo
Lots of cars still have a hard time with automatic emergency braking.

[https://www.caranddriver.com/features/a24511826/safety-
featu...](https://www.caranddriver.com/features/a24511826/safety-features-
automatic-braking-system-tested-explained/) (Car and Driver: We Crash Four
Cars Repeatedly to Test the Latest Automatic Braking Safety Systems)

> To understand the strengths and weaknesses of these systems and how they
> differ, we piloted a Cadillac CT6, a Subaru Impreza, a Tesla Model S, and a
> Toyota Camry through four tests at FT Techno of America's Fowlerville,
> Michigan, proving ground. The balloon car is built like a bounce house but
> with the radar reflectivity of a real car, along with a five-figure price
> and a Volks­wagen wrapper. For the tests with a moving target, a heavy-duty
> pickup tows the balloon car on 42-foot rails, which allow it to slide
> forward after impact.

> The car companies don't hide the fact that today's AEB systems have blind
> spots. It's all there in the owner's manuals, typically covered by both an
> all-encompassing legal disclaimer and explicit examples of why the systems
> might fail to intervene. For instance, the Camry's AEB system may not work
> when you're driving on a hill. It might not spot vehicles with high ground
> clearance or those with low rear ends. It may not work if a wiper blade
> blocks the camera. Toyota says the system could also fail if the vehicle is
> wobbling, whatever that means. It may not function when the sun shines
> directly on the vehicle ahead or into the camera mounted near the rearview
> mirror.

> There's truth in these legal warnings. AEB isn't intended to address low-
> visibility conditions or a car that suddenly swerves into your path. These
> systems do their best work preventing the kind of crashes that are easily
> avoided by an attentive driver.

> The edge cases cover the gamut from common to complex. Volvo's owner's
> manuals outline a target-switching problem for adaptive cruise control
> (ACC), the convenience feature that relies on the same sensors as AEB. In
> these scenarios, a vehicle just ahead of the Volvo takes an exit or makes a
> lane change to reveal a stationary vehicle in the Volvo's path. If traveling
> above 20 mph, the Volvo will not decelerate, according to its maker. We
> replicated that scenario for AEB testing, with a lead vehicle making a late
> lane change as it closed in on the parked balloon car. _No car in our test
> could avoid a collision beyond 30 mph, and as we neared that upper limit,
> the Tesla and the Subaru provided no warning or braking._

Emphasis in last paragraph mine.

Disclaimer: Model S, X, Y owner, TSLA investor.

~~~
FireBeyond
> Toyota says the system could also fail if the vehicle is wobbling, whatever
> that means.

Strange phrasing but they're most likely referring to cross-winds, such that
the camera (or the computer behind it, rather) cannot determine whether a
movement was a vehicle impinging on the path of travel, versus a momentary
shift of angle of attack of the camera vehicle.

------
argonaut
This demonstrates the superiority of GM's Super Cruise approach. They have
interior cameras that make sure the driver is looking forward (you're allowed
to look away for brief periods). Tesla only forces you to keep one hand
applying some pressure to the wheel - you could easily be on your phone with
your other hand. GM's approach is much more relaxing and prevents accidents
like this. That said, there's no reason Tesla can't easily adopt this
approach in the future.

------
tqi
Do ML models for self driving cars start from an assumption that it is safe to
proceed forward unless it identifies an obstacle, or does it assume that it is
unsafe to proceed forward unless it identifies that the path is clear?

------
joshuahedlund
I remember early on when Tesla was touting some kind of crashes/deaths per
mile average or something compared to the US average... Anyone have the
current numbers on that?

~~~
alexslobodnik
Directly from Tesla

<<In the 1st quarter, we registered one accident for every 4.68 million miles
driven in which drivers had Autopilot engaged. For those driving without
Autopilot but with our active safety features, we registered one accident for
every 1.99 million miles driven. For those driving without Autopilot and
without our active safety features, we registered one accident for every 1.42
million miles driven. By comparison, NHTSA’s most recent data shows that in
the United States there is an automobile crash every 479,000 miles.

Total overall miles and crashes were significantly reduced in this
quarter.[0]>>

[0]
[https://www.tesla.com/VehicleSafetyReport#:~:text=Q1%202019,...](https://www.tesla.com/VehicleSafetyReport#:~:text=Q1%202019,every%201.76%20million%20miles%20driven).

~~~
tqi
One thing to keep in mind is that the miles driven with Autopilot engaged are
not the same in nature as the ones driven without it (i.e. mostly highway
miles vs. city-street driving), and likely also differ from the miles driven
without active safety features.

~~~
gnramires
Agreed, those numbers alone don't convince it's safer.

Also, personally I would have a complication: would I be satisfied with being
safer than _average_? I don't know if I'm average. I tend to pay attention and
never have driven under influence (alcohol or w/e), or when sleepy. So a
little better than average would not bring me peace of mind.

I think some kind of situational statistic would be relevant here; maybe they
could have some part of the fleet as controls with AP disabled (and offer some
kind of reward), then the overall accident rate can be compared better.

------
ajankelo
Am I the only Tesla owner that finds the blame on autopilot a little hard to
believe? My car never reaches the point where I feel that I don’t have to pay
attention.

~~~
johnghanks
I'm blown away at how trusting of AP and FSD people are. I've only used AP
once or twice in the 6 or so months I've had my 3 and it consistently scares
the shit out of me.

It's not _bad_ per se, but it drives like a teenager and that has resulted in
a fundamental distrust of the system on my end.

------
stcredzero
How about a limited LIDAR that only looks straight ahead? (Perhaps in a 5x5
degree wedge?) That's likely to be far cheaper than a full LIDAR set, and
would provide useful data about a barrier in the vehicle's path.
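
For scale, simple geometry puts the wedge at roughly two lane widths at
highway ranges:

    w = 2R\tan(\theta/2) = 2 \times 100\,\mathrm{m} \times \tan(2.5^\circ)
      \approx 8.7\,\mathrm{m}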

~~~
zucker42
Somehow I doubt this problem is as simple as "add a LIDAR" considering there's
probably at least one smart person at Tesla who has thought of that idea. Sure
you might be right, but my impression is that the 1 in a million cases will
still exist.

~~~
tpmx
There would still be that Musk person dictating no LIDARs, so, no.

~~~
WrtCdEvrydy
Adding LIDAR would make the millions of cars on the road a liability. You'd
have to make those people whole and basically recall the cars. If you can
'make it work' with cameras, you save millions.

~~~
tpmx
Seems like they've sold about 1 million cars to date.

[https://www.theverge.com/2020/3/10/21172895/tesla-one-
millio...](https://www.theverge.com/2020/3/10/21172895/tesla-one-million-cars-
production-model-y)

------
Plutonsvea
Ah, an outlier. I don't see any significance to this story other than being a
$TSLA trigger word... I've seen many people, including truck drivers, get
hurt as a result of relying too much on technology (lane assist,
auto-headlights, etc.). This is why they're all labelled as assistive
technologies.

The driver should have seen a gigantic truck blocking his entire lane, and
though it was a failure of Autopilot not to detect it, Autopilot shouldn't be
receiving all the blame for something that is inherently not its job to do
100% perfectly.

~~~
MattGaiser
Claim a car is “full self driving“ and it’s news when it can’t drive itself.

~~~
ggreer
Tesla doesn't claim that their current cars are full self driving. They claim
that future software will allow FSD on some models sold today.

------
Erlich_Bachman
Many of the comments seem to jump onto the "autopilot doesn't work, we told
you so"-train. I think it is important to remember and be clear that no
autopilot system will be 100% safe, nor is any human driver 100% safe. What is
interesting is looking at statistics of how often these accidents happen, and
comparing that statistic to human drivers.

There are other issues like the way we calculate statistics: for instance we
shouldn't just calculate "miles driven with autopilot engaged", without
looking at how it was disengaged, what would have happened if it wasn't
disengaged at that time, whether that would lead to an accident, etc.

There are issues, yes, but some commenters are acting like they think that
this accident is what happens every time there is an obstacle, and as if the
autopilot does not have functional obstacle detection at all. This is not the
case. This is an exception, a failure in the obstacle detection, not the rule.
The fact that this news is getting so much traction is a testament to that. If
the autopilot was not detecting any obstacles, we would have news articles
like this every week.

------
fastball
Is it not necessary to confirm with a source besides the driver (e.g. Tesla)
that Autopilot was actually on?

I feel like "I had Autopilot on and it didn't work" is a pretty convenient
excuse if it's _not_ on and you have an accident all on your own.

Because to me that is entirely plausible, yet we're all operating under the
assumption of (and having lengthy discussions about) this "failing" of
Autopilot.

------
pintxo
Forget the truck; there seems to be a person (the truck driver?) standing
some 50m in front of the truck, waving the oncoming cars to the side.

And the Tesla seems to completely ignore the fact that a human is standing
basically right in its planned trajectory.

Missing the truck, fine. But why is the system not reacting to a human on the
highway? A human standing on a dedicated highway is likely a 99% signal for
trouble ahead.

------
kirillzubovsky
Why is this stuff getting upvoted? All we see is a Tesla that rammed into an
obstacle on the highway, without evidence of anything else. We don't know
that it was on Autopilot, and the driver in this case would say anything to
try and get his insurance money back. The car started to brake when it saw an
inevitable collision, but otherwise it's pure speculation.

------
cjdell
I have owned a Tesla Model 3 in the UK for almost a year now. The Auto Pilot
is only useful on dual carriageways and straight roads and still very
frequently suffers from "phantom braking". It should be obvious to any half
decent driver that concentration is not optional.

Having said that, the Auto Pilot has saved me from a couple of near misses:

- It has detected the car in front braking suddenly and reacted before my
brain had even registered.

- It has nudged the car sideways when a car (in my blind spot) moved too
close.

At the moment it is a driving aid, a second set of eyes, not a driver
replacement. It is useful as that, but Tesla (as much as I love them) could
really do to be more honest in the marketing of their "Full Self Driving"
packages.

I expect it will get substantially better, but not in the time frame that
Musk says it will.

~~~
globular-toast
> Having said that, the Auto Pilot has saved me from a couple of near misses

I'm interested to know, have you ever been involved in any collisions on the
road before owning the Tesla? How long have you been using the road?

~~~
cjdell
Only 1 accident that was my fault (I was young and stupid). Been driving for
17 years and have had near misses before.

I don't know for sure if I would have crashed if it wasn't for Auto Pilot, but
it's nice to know that it's there, even if it makes mistakes from time to
time.

The other factor is that I'm driving more now that I have the Tesla, which
probably isn't too surprising (at least until the lockdown).

~~~
globular-toast
Whenever I hear about these safety features saving people from near misses I
always wonder how they've got this far driving without dying. For example,
lane keep assist. If you can't stay in a lane you can't drive. It's such a
basic requirement. So I can only assume that people are deviating from their
lanes simply _because_ these safety features exist.

~~~
cjdell
That may be true of some people. I don't think I've become complacent. It's
impossible to tell what would have happened if the car hadn't helped in those
situations.

The car does have advantages over humans, like for example, the 8 cameras
mounted around the car, making blind spot collisions less likely.

If someone comes out of their lane into mine and happens to be in my blind
spot then there's not much I can do. I do turn my head when changing lanes,
but I can't see in all directions at the same time. And I'm human, I will make
mistakes from time to time.

The car can also react far more quickly than a human nervous system ever
could. In situations when I've needed to brake for someone, often I've moved
my foot to the brake pedal and found that it is already depressed.

I think safety features are a net win overall. I'm sure seat belts and air
bags have caused problems, but in the end we have decided we're better with
them than without.

------
tlapinsk
It seems pretty clear that while the Autopilot technology may be cool or fun,
it is nowhere near ready for full self-driving, despite how Tesla has
marketed it. Cameras + radar + neural networks may not be enough - no matter
how cool the underlying software stack is.

Similarly, I also question other self-driving car manufacturers' (Waymo,
Cruise, Uber) approaches using a Lidar/Radar/Camera suite.

The long tail of the problem is immense and going to take years to solve with
the current technology.

To me it's clear that a new set of technologies, more dynamic and responsive
when going into unknown situations, is going to be needed to make it
"foolproof" (if that is even possible at all).

Disclaimer: This is coming from someone who naively thought full-self driving
cars would already be on the road by now back in 2015

------
risyachka
These accidents don't really matter as long as the ratio of accidents avoided
to accidents caused is greater than one.

People make a fuss when Autopilot fails, criticize that it's based on
cameras, etc. But when an accident is avoided, no one even notices.

------
mysterEFrank
That's thankfully far out of distribution. Why does Tesla get so much slack
for incidents like this? Uber had to shut down its self-driving lab for
almost a year when one of its vehicles failed and killed a pedestrian.

~~~
jcranmer
Why _should_ Tesla get any slack? This particular circumstance looks like it
should be within the typical guidelines for Automatic Emergency Braking
systems to detect and react to--the sorts of systems which can be found in
consumer models of cars.

Uber shut down its self-driving lab in part because the circumstances were so
squarely in the realm of "the car should be capable of handling it" that
excess caution was warranted in figuring out why a very easy case was missed.

------
accurrent
Personally I believe we should have fail safe back ups. A radar/lidar would
have prevented such an accident from happening. You can't prove that a
controller based on a neural network wont collide. But you can prove that a
simple lidar based backup will not collide with a stationary object. Neural
networks are great for complex decision process, but they cannot 100%
gaurantee safety. Given the tesla pricepoint, and the falling price of lidars
I don't see why one cannot be installed. Personally, I think it should be
mandatory in all self driving cars to have a provable failsafe.
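
A minimal sketch of such a failsafe, assuming a direct forward range
measurement (all numbers illustrative):

    def must_brake(min_forward_range_m, speed_mps, decel=6.0, margin_m=5.0):
        """Invariant: if the nearest forward return is inside our braking
        distance plus a margin, command full braking, unconditionally."""
        braking_distance = speed_mps ** 2 / (2 * decel)
        return min_forward_range_m < braking_distance + margin_m

Because the rule is a few lines of arithmetic over a direct measurement, it
can be analyzed exhaustively, which is what makes it provable in a way a
trained network never is.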

~~~
jayd16
The Model 3 has a front facing radar.

------
blacksoil
I'm not at all comfortable with camera-only, minimal-sensor approaches in
self-driving cars. While they can train a neural network with millions or
even billions of sample images from real-life tests, I do think they're still
bound to fail in some corner cases. Remember FB's face detection that turned
out to fail on faces of people of certain races? I think there's still some
possibility that the NN in a self-driving car would fail to recognize corner
cases such as a grandma crossing a street in a particularly unusual
combination of clothes and colors.

------
anlqo
Looking forward to the day that they find out that LIDAR actually damages
human eyes.

Yes, I know that the current doctrine is that it does not, but these things
tend to get reversed after a decade.

------
mixmastamyk
A few years ago I found it hard to believe driver assist systems were doing
all sorts of massive complicated (potentially buggy) processing instead of
"large object directly ahead ---> slow/stop." Folks gave all sorts of reasons
why that wasn't good enough.

I'm going to reiterate here that I don't want anything more complicated than
very simple forward collision prediction (resulting in slow down) aka
"Automatic Emergency Braking" in any car I purchase.

------
627467
Unpopular opinion: anything that promotes (directly or indirectly) _less_
driver engagement with driving should just be banned from civilian roads.

I'm sure one day, when everyone is being driven by all-sensing self-driving
vehicles, road deaths will have massively decreased. Until then,
technologists will just have to learn how to get there in a much more
controlled environment.

A legal waiver or T&Cs should not be the only thing 'protecting' the next
self-driving victim.

~~~
jmpeax
> Unpopular opinion: Anything that promotes (directly or indirectly) less
> driver engagement with driving should just be banned from civilian roads.

I take it you turn off your phone and the car radio when driving, and also
tell all of your passengers they must also turn off their phones and maintain
strict silence.

------
fortran77
Does the budget "Tesla 3" have the same sensors and autopilot functions as the
full-featured "Model S" or are they stripped-down?

~~~
joenathanone
Same sensor suite.

------
whoisthemachine
I see a lot of mention of neural networks in relation to Autopilot. However,
it's hard to imagine that those pursuing automated driving wouldn't combine
neural networks with normal old rules and safety thresholds (i.e. "just
brake" if a new, sufficiently large, mostly uniform object fills much of the
sensors).

Is this not the case?

------
dmode
I have a Model 3. It still does phantom braking. Elon's FSD claims are
laughable, and this story is just another data point.

------
DoingIsLearning
Is the Tesla 3 using stereo cameras _only_?

I thought most of these ADAS systems had some form of sensor fusion,
combining multiple systems: stereo camera + front radar + long-range
ultrasonics, etc.

This is a scenario that would be safely avoided by an entry-level AEB system;
is there something special about Tesla's design that I am missing?

edit: typos/clarity

~~~
tim333
I think they have a front radar. The trouble with those is differentiating
stationary stuff like bridges from stationary stuff like overturned trucks.

~~~
DoingIsLearning
I think rejecting low-hanging bridges is one of the scenarios sensor OEMs
specifically calibrate for with auto manufacturers. It's not rocket science;
it's a tuning/calibration process for that specific vehicle's dimensions and
layout.

Front radar calibration is not a new problem and is well understood in the
auto industry.

These accidents really raise the question: assuming they do have a front
radar, what is Tesla doing so differently that allows a competitor's entry-
level AEB system to outperform them?

To me it seems that NHTSA, TUV, etc. should ban Tesla's Autopilot on public
roads until they can show evidence of its safety.

------
Causality1
Why is it that auto-safety systems just can't handle objects in the road?
That's all I personally need them for. I try not to drive in such a way as to
be vulnerable to bad behavior from other cars, but it'd sure be nice not to
plow into nearly invisible semi-truck tires lying in the road at night.

------
yumraj
It seems that none of the airbags deployed (mentioned in the story), per:

[https://twitter.com/jsin86524368/status/1267423319495606272](https://twitter.com/jsin86524368/status/1267423319495606272)

~~~
teraflop
It's hard to judge from the video, but the car doesn't appear to have been
going very fast, and it looks like the side of the truck crumpled a bit to
absorb the impact.

Airbag inflation is a violent, potentially injurious, last-ditch effort to
prevent fatalities. If the airbags didn't inflate but the driver walked away
from the accident, then not activating was probably the safer option.

------
robszumski
Why don't they have 6 super-cheap single-beam lidars in some sort of pattern
to detect this type of thing? It doesn't have to be a giant spinning sphere
on top of the car. A small moving prism in one axis?

~~~
WrtCdEvrydy
Because Tesla's CEO says he wants a camera-only solution.

------
lvs
The report that airbags failed to deploy should be a huge red flag for the
NTSB. I hope they are sending a team to investigate. Airbag deployment
failure should not happen in any modern vehicle.

------
ashtonkem
Apparently the airbags didn’t deploy, which is a _serious_ problem.

~~~
LeoPanthera
If the car was already braking, and it hit a soft enough obstacle (in this
case, the _roof_ of the truck), the G force may not have been enough to
trigger them, which would be correct behavior.

~~~
ashtonkem
Possibly, but the first images of the wrecked vehicle seemed severe enough to
warrant airbags.

~~~
rconti
Thankfully, airbags do not deploy based on what we internet commentators deem
"severe", but, instead, based on data from accelerometers.
------
siliconunit
Continuous laser telemetry should always be available and active; anything
bouncing back at a certain speed and distance starts autonomous braking... No
need to identify anything.

------
ronreiter
This is literally the you-had-one-job of self-driving cars.

------
jayd16
Does Tesla use Autopilot for emergency brake assist as well? This seems much
worse than what's in a lot of other cars in terms of automatic braking.

------
ap46
Their underbody sonar & image processing should have picked it up before the
attempted braking... hope they put out a full postmortem.

------
kevinsundar
There are so many ethical issues[1] with Tesla marketing autopilot as self
driving given it makes mistakes as grave as this.

[1][https://arxiv.org/abs/1901.06244](https://arxiv.org/abs/1901.06244)

------
deegles
Under what conditions could a crash unequivocally be Autopilot’s fault? This
looks like it will be blamed on the driver.

~~~
gpm
If autopilot created a situation in which a crash was inevitable with no
warning to the user. E.g. if it suddenly grabbed the steering wheel and
swerved right resulting in the car spinning out in only a few hundred ms for
no good reason.

Even if autopilot caused a crash such as the above, you still have to run a
cost-benefit analysis of whether or not it prevents more crashes than it
causes.
Considering the failure rate of humans it's ok for autopilot to have a non-
negligible failure rate, and it doesn't really matter if those failures happen
in the same places as the human failures.

~~~
deegles
Wouldn't Tesla say "you should have had your hands on the wheel, therefore
it's not Autopilot's fault"?

~~~
gpm
Hands on the steering wheel doesn't mean affirmative control. That's why the
example includes putting the car in an unrecoverable scenario faster than an
attentive person can react.

------
m0zg
I get how a computer vision system missed this unusual situation, but how did
a _radar_ miss it?

------
ksec
460 comments right now. Lots of criticism of Autopilot and marketing. Five
(very brief) mentions of Elon Musk.

I am not sure what to make of that. Maybe people feel that what he has
achieved with SpaceX means he can get away with whatever he says/promises
about Tesla.

Or maybe I am just hypercritical of him (with respect to Tesla only).

------
bequestry
Yikes, the airbag didn't go off... can the NTSB look into this case?

------
m0zg
That's called a "false negative", ouch.

------
lumberingjack
Worst part is no airbags worked, so the whole car was FUBAR.

------
ChrisClark
Every radar-based distance-following system has a disclaimer that it ignores
stationary objects. It will be great when it's purely based on vision.

------
solarkraft
What's the accident rate per distance driven now? How does it compare to a
human?

------
philipov
Did the passenger die?

~~~
milansuk
From article: "Luckily, nobody was seriously hurt, but the Model 3 was trashed
and covered with what one source called “sauce”"

------
rowawey
Teslas have been marketed unsafely and promote unsafe behavior. They cannot
see many classes of stationary objects, especially at high speed. And because
of Dear Leader's over-reliance on "magic bullet" IP investment, they're not
going to change how it works voluntarily.

------
xiphias2
The most important part for me is that the driver is uninjured. Sure,
autopilot made a huge mistake, but Model 3 with Autopilot still has a very
good safety record.

~~~
rconti
Are you suggesting that the car is intelligent enough to only plow into
massive stationary objects if they'll crumple adequately so as to prevent
driver injury?

A better argument would be about overall occupant safety vs driver-controlled
vehicles per mile travelled.

~~~
xiphias2
No, I was quite worried about the driver being dead, and the article hadn't
said anything about it, or had hidden it somewhere.

It's just basic human empathy that in an accident the safety of people should
be the no. 1 concern, and then the safety of the system should be addressed.

------
NicoJuicy
I somehow agree with Musk that a self-driving car should be able to work with
only cameras, as humans do with their eyes.

If there are 2 cameras, depth can be calculated (e.g. 2 eyes), and Tesla is
in the best position to achieve this (lots of data).
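
The textbook relation for calibrated, rectified cameras with focal length f,
baseline B and pixel disparity d is

    Z = \frac{fB}{d},
    \qquad
    \Delta Z \approx \frac{Z^2}{fB}\,\Delta d

so depth error grows quadratically with distance, which is the hard part for
stereo at highway ranges.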

Additional sensors make the system way more complex and most of them have the
same issue: it doesn't work in the worst weather conditions.

They should, however, seriously examine why the system failed and how they
can improve depth vision with their cameras. The failure of the system in
broad daylight is a severe fault.

And I'm sure they will improve this.

Until full autopilot, which is still a long way off, Autopilot is driver
assistance, and drivers should be aware of that.

~~~
rsynnott
The issue isn’t depth perception (someone with no functional depth perception
can drive!) so much as a lack of, for want of a better term, common sense. AI
is a bit of a misnomer; these cars are obviously not sentient, and don’t have
the same abilities as a human. If self-driving cars are possible at all with
current/near future tech (I’m dubious personally), they won’t be solely
dependent on cameras.

~~~
NicoJuicy
Why not? Humans can drive in less-than-optimal situations with 2 eyes as
sensors.

And the goal is to have a solution that can drive in the same situations as a
human.

What sensor is missing, then, and why?

PS: My statement mostly questions LIDAR as an additional sensor; I'm not sure
if AI/ML is good enough yet either, but we'll see. I'm pretty sure LIDAR
isn't as important as many people think it is.

And for all of Musk's faults, you can't deny he is correct in a lot of cases.

Almost everyone questions the LIDAR decision, but loses sight of the simplest
question.

What's missing with only cameras, if humans can do it?

~~~
jfk13
> What's missing with only cameras, if humans can do it?

Intelligence.

~~~
NicoJuicy
> What sensor is missing then and why?

But the discussion at hand is about LIDAR/sensors. Please read the entire
comment...

Not about whether "AI is capable enough currently".

