
Tesla’s Autopilot was engaged when Model 3 crashed into truck, report states - Tomte
https://www.theverge.com/2019/5/16/18627766/tesla-autopilot-fatal-crash-delray-florida-ntsb-model-3
======
femto113
"our data shows that, when used properly by an attentive driver who is
prepared to take control at all times, drivers supported by Autopilot are
safer than those operating without assistance"

This is nonsense. The whole core of the argument against this sort of system
is that it _induces_ inattention. The fact that Tesla consistently refuses to
share full data on identical models driving with Autopilot turned on, enabled
but not turned on, and not enabled is extremely suggestive that the overall
safety rates with Autopilot on are not better.

(edited for formatting)

~~~
yourMadness
That might very well be the case.

Still, we as a society usually decide not to ban things that hurt only
irresponsible people. There is an exception to this for children, but they are
already banned from driving.

If the Autopilot crashes start resulting in a number of deaths outside of the
driver's car, that might shift public opinion.

~~~
femto113
Lawn Darts only killed one person before they were banned, I think Autopilot
is at 3 or 4 and counting, and Uber's driverless system has killed at least
one pedestrian.

~~~
slezyr
Never heard about Lawn Darts.

> In the previous eight years, 6,100 people had been sent to the emergency
> room due to lawn darts in the U.S. Out of that total, 81% were 15 or
> younger, and half of them were 10 or younger. On the week the commission
> voted to ban the product, an 11-year-old girl in Tennessee was hit by a lawn
> dart and sent into a coma.

From wikipedia. It's a bad example.

~~~
femto113
It once was the canonical X in "we banned X why haven't we banned Y" (where Y
is something like "assault rifles" or "peanuts") but perhaps only for people
of a certain age. A more recent example: the Boeing 737 MAX was grounded after
two accidents out of a few hundred thousand flights.

------
modi15
Beats me why Tesla Autopilot hasn't been shut down already. If the car is on
Autopilot then people will get used to it and use that time to check their
mail, call a friend, or simply doze off. Some of these people will end up
getting killed, and 'we told you so' simply doesn't cut it.

EDIT: The accident rate is a statistic thrown out by Tesla and only serves the
purpose it's meant to serve. Do we have any statistics comparing accidents
caused when the car is on Autopilot vs. similar cars driven by similar drivers
under similar conditions without Autopilot?

~~~
bdamm
Because the accident rate is still lower than it is with humans driving.

~~~
gamblor956
Objectively not true. Autopilot is only used in a small subset of driving
situations that are less risky than driving overall. And yet in the last 8
months alone, Autopilot has caused 3 easily avoidable deaths.

Autopilot is getting _worse_ not better with time, and Tesla's tendency to
introduce regressions (as with the Bay Area crash) means any temporary
improvements in driving AI are just that--temporary.

~~~
dangus
> And yet in the last 8 months alone, Autopilot has caused 3 easily avoidable
> deaths.

This number by itself is absolutely meaningless without more context.

In 2017, 37,133 people in the United States died in an automotive crash. It is
_expected_ that some number of Tesla owners will die in a crash, regardless of
currently engaged safety features.

Tesla has produced around 600,000 vehicles in total, including over 100,000
Model 3 vehicles.

I'm not making any claim or dispute regarding the effectiveness of Autopilot,
whether or not it's improving. I'm just saying, making that statement you made
is meaningless at least and misleading at most.

~~~
gamblor956
_I'm just saying, making that statement you made is meaningless at least and
misleading at most._

I'm just pulling a page from Elon Musk and Tesla's marketing department.

------
lawguy
Here's a recap of the pertinent NTSB's conclusions from the last time this
happened. How many will apply again this time?

“There was sufficient sight distance to afford time for either the truck
driver or the car driver to have acted to prevent the crash.”

“The Tesla’s automated vehicle control system was not designed to, and did
not, identify the truck crossing the car’s path or recognize the impending
crash; consequently, the Autopilot system did not reduce the car’s velocity,
the forward collision warning system did not provide an alert, and the
automatic emergency braking did not activate.”

“If automated vehicle control systems do not automatically restrict their own
operation to those conditions for which they were designed and are
appropriate, the risk of driver misuse remains.”

“The Tesla driver was not attentive to the driving task, but investigators
could not determine from the available evidence the reason for his
inattention.”

[https://www.ntsb.gov/investigations/AccidentReports/Reports/...](https://www.ntsb.gov/investigations/AccidentReports/Reports/HAR1702.pdf)

------
toomuchtodo
Doesn't appear to differ from previous Tesla autopilot accidents where the
vehicle couldn't detect a tractor trailer crossing the path of travel with
cameras or front facing radar. Interesting that Autopilot had only been
activated for 10 seconds prior to the incident occurring.

I am not a domain expert, but front-facing radar and cameras alone are clearly
inadequate for forward object detection [1] (unless we're going to
mandate radar detectable skirts and images superimposed on the sides of
tractor trailers for vehicle detection). Would solid state LIDAR (essentially
laser ranging I suppose, as you're not building a 360 degree point cloud)
solve this particular edge case? Some automakers are including laser
headlights in their vehicles [2] (although I don't believe this is yet
approved for the US market); would it not make sense to convert the headlights
into front laser ranging sensing systems along with their illumination
function?

[1] [https://www.caranddriver.com/features/a24511826/safety-
featu...](https://www.caranddriver.com/features/a24511826/safety-features-
automatic-braking-system-tested-explained/)

[2] [https://www.osram.com/am/specials/trends-in-automotive-
light...](https://www.osram.com/am/specials/trends-in-automotive-
lighting/laser-light-new-headlight-technology/index.jsp)

~~~
_ph_
I don't think we can conclude that cameras are inadequate for forward object
detection, only that the processing systems used so far have been. This could
be a matter of how the neural networks are trained and how much processing
power the computing units have. Radar is limited as long as trailers in the US
are allowed to have such high gaps to the ground. A lower side barrier would
both help collision-detection systems and reduce the severity of the crash
itself.

~~~
darkpuma
> _" Radar is limited, as long as trailers in the US are allowed to have such
> high gaps to the ground."_

One of these crashes involved a firetruck, which unlike a semi-trailer, does
not have a large gap underneath it.

Radars are limited period. The radars used by Tesla lack the angular
resolution to distinguish between a stationary object next to the road (e.g. a
building or parked car) with a stationary object parked right in the middle of
the road.

~~~
_ph_
_One of these crashes involved a firetruck, which unlike a semi-trailer, does
not have a large gap underneath it._

That fire truck wasn't crossing the road, it was at the roadside, which
creates a different situation. The problem with high trailers is that they
look very much like a bridge to a radar. In any case, a proper side barrier
can prevent deadly accidents.

~~~
darkpuma
If the radar cannot distinguish between a bridge with five meters of clearance
and a "bridge" (trailer) with a single meter of clearance, then the radar
simply lacks the angular resolution to be effective. It's exactly the same
issue as being unable to distinguish objects on the side of the road from
objects in the middle of the road.

It's insufficient angular resolution for the application, no matter which way
you slice it.
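To put rough numbers on the angular-resolution point (my own back-of-envelope, not from the thread, with the 100 m range and clearance heights chosen as illustrative assumptions): from about 100 m out, roughly highway braking distance, the underside of a 5 m bridge and a 1 m trailer differ by only a couple of degrees in elevation, which is about what a radar would need to resolve to tell them apart.

```python
import math

def elevation_angle(height_m: float, distance_m: float) -> float:
    """Elevation angle in degrees from road level to an object's underside."""
    return math.degrees(math.atan2(height_m, distance_m))

# Assumed numbers: 5 m bridge clearance, 1 m trailer clearance, seen from 100 m.
distance = 100.0
bridge = elevation_angle(5.0, distance)   # underside of an actual bridge
trailer = elevation_angle(1.0, distance)  # underside of a trailer "bridge"
delta = bridge - trailer                  # resolution needed to tell them apart

print(f"bridge: {bridge:.1f} deg, trailer: {trailer:.1f} deg, delta: {delta:.1f} deg")
```

The gap only gets smaller as the car closes in and braking distance runs out, which is the "no matter which way you slice it" point above.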

------
localhost
It seems like this is a case where Autopilot thought that the truck was a
stationary object. Based on my reading of the press release, the truck was
turning across the car's path, which would look like a stationary "bridge" to
the model and subsequently be ignored.

This seems to be a hard problem: you don't want the model to false-positive on
a stationary object on the side of the road and slam on the brakes.

~~~
Majromax
> Based on my reading of the press release, it seemed like the truck was
> turning across the car's path, which should look like a stationary "bridge"
> to the model and subsequently ignored.

So if I understand you properly, an adversarial attack against Tesla
autopilots would be to suspend a ladder across the road at windshield height?

"Vulnerable to being clotheslined" seems like a bit of an oversight.

~~~
jedberg
How many human drivers would be able to avoid that same thing though? Most of
the reason we can safely move cars is because most people follow the rules and
aren't murderers. There are plenty of "attacks" you can do against human
drivers too.

~~~
dsfyu404ed
I agree with your general message but basically no human drivers are going to
plow into the side of a a semi truck or a partial lane obstruction with no
attempt to afford it.

~~~
Woden501
Really? So no one ever is going to take a quick glance at their phone, not see
the truck start to pull out, and do the exact same thing this Model 3 did? No
one? I can say with 100% certainty that this exact same thing has happened at
least once in history to a normal driver who simply got distracted.

------
frenchie4111
I am beginning to get frustrated with news surrounding both Tesla Autopilot
crashes and self-driving car crashes. I understand that we are seeing lots of
news about it because it is new, and people are scared that self-driving cars
are going to kill people. But can you imagine if an article trended every time
someone got into an accident while using cruise control?

~~~
jocker12
This is about surrender responsibility to a piece of software and trust it
100% with your life and/or your family members lives or take responsibility
and engage in driving the car. The news, and legitimately so, are covering the
unusual - surrender your abilities, judgments and consequently and directly
your life, to faulty saftware and hardware.

------
djanogo
I have said this to my friend: so far all Autopilot accidents have resulted in
the Tesla driver's death, but there is a person out there, hopefully not a
child, whose car a Tesla on Autopilot will ram, and their death will put an
end to this stupid advertising/naming with its fine-print excuse.

