
NHTSA’s Implausible Safety Claim for Tesla’s Autosteer Driver Assistance System [pdf] - firebacon
http://www.safetyresearch.net/Library/NHTSA_Autosteer_Safety_Claim.pdf
======
firebacon
> The calculation of accurate crash rates of this type depend on reliable
> counts or estimates of both airbag deployment crashes as well as the mileage
> travelled exposing vehicles to the risk of a crash. But after obtaining the
> formerly secret, underlying data through a lawsuit filed under the Freedom
> of Information Act (FOIA) against the U.S. Department of Transportation, we
> discovered that the actual mileage at the time the Autosteer software was
> installed appears to have been reported for fewer than half the vehicles
> NHTSA studied. For those vehicles that do have apparently exact measurements
> of exposure mileage both before and after the software’s installation, the
> change in crash rates associated with Autosteer is the opposite of that
> claimed by NHTSA – if these data are to be believed.

------
zaroth
Having trouble following what they are saying... the specific terminology over
when mileage records were taken versus when autopilot was installed is
confusing.

They identify a subset of vehicles and run an analysis on just those vehicles,
which seems to show that airbag deployments increased for those cars, and then
run a regression finding that the more miles driven, the less likely an airbag
is to deploy. They state this is unexpected but don’t explain what was wrong
with their data to cause it.

The headline numbers that they do present are in a footnote:

“The summary of Column AX (Miles before Autosteer) is given in this worksheet
as 64,788,137. The summary of Column AY (Miles after Autosteer) is stated to
be 235,880,377. We counted 86 airbag deployments “before Autosteer” in this
Table and 192 deployments “after Autosteer.”

Basic math shows the airbag deployment rate decreasing from one deployment per
~753k miles to one per ~1.23m miles, right?
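That back-of-the-envelope check can be reproduced in a few lines, using only the totals quoted in the footnote above (TFA's whole point, of course, is that the "before" total may be badly undercounted):

```python
# Totals quoted from the report's footnote (Columns AX/AY)
miles_before = 64_788_137      # "Miles before Autosteer"
miles_after = 235_880_377      # "Miles after Autosteer"
deploys_before = 86            # airbag deployments before Autosteer
deploys_after = 192            # airbag deployments after Autosteer

# Exposure miles per airbag deployment, before vs. after
rate_before = miles_before / deploys_before
rate_after = miles_after / deploys_after

print(f"before: one deployment per {rate_before:,.0f} miles")
print(f"after:  one deployment per {rate_after:,.0f} miles")
# before: one deployment per 753,350 miles
# after:  one deployment per 1,228,544 miles
```

So taken at face value the headline totals do show deployments per mile falling after Autosteer; the dispute is over whether the "before" denominator is real.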

The gist of TFA is that those “before” autopilot miles are significantly
undercounted. Without understanding more what the labels “Previous Mileage
before Autosteer Install” and “Next Mileage after Autosteer Install” mean, and
what it means when they are not reported, it’s hard to say.

But there do seem to be several significant cases where the airbag deployment
count prior to Autosteer increased for certain cohorts of vehicles where there
was _no_ increase in total mileage before Autosteer. Obviously a car with zero
miles can’t have crashed, so this calls the entire calculation into question.

~~~
ejstronge
> Having trouble following what they are saying... the specific terminology
> over when mileage records were taken versus when autopilot was installed is
> confusing

I think this is one of the main points of the article. The NHTSA had the same
data, and made a very sensational graph, without any clarification on what the
headings mean. The authors essentially recapitulated the initial analysis and
then figured out its shortcomings.

> They identify a subset of vehicles and run an analysis on just those
> vehicles which seems to determine airbag deployments increased for those
> cars

The article clearly explains why that subset was the most important. This was
the only subset for which complete data was available.

> and then run a regression that states the more miles driven the less likely
> an airbag is to deploy. They state this is unexpected but don’t explain what
> was wrong with their data to cause it.

Actually, they do offer explanations in their discussion. They even refer the
reader to the discussion right after they post the table. (i.e., _See more on
this topic in the Discussion section_ ).

Also, see my initial point; this is not 'their data' so much as what was
relied upon by NHTSA to make the 2016 claim about autosteer safety.

> Without understanding more what the labels “Previous Mileage before
> Autosteer Install” and “Next Mileage after Autosteer Install” mean, and what
> it means when they are not reported, it’s hard to say.

See my first point. However, they show pretty reasonably that those two
headings could mean 'recorded mileage right before the update' and 'recorded
mileage right after the update'.

> Obviously a car with zero miles can’t have crashed, so this is what calls
> the entire calculation into question.

This is another one of their points. They state multiple times that, _if the
data can be trusted_ , it doesn't support the NHTSA's claim.

\--

The article is actually very careful on all the points you raised, and would
merit a close reading before being dismissed.

~~~
manicdee
The NHTSA was using different data, since they had the non-redacted version.

The author uses the redacted version, doesn’t attempt to find an explanation
for anomalies, proceeds to analyse data based on faulty initial assumptions,
and compounds this by using a cherry picked example where the poorly
understood numbers best match their faulty assumptions.

When there is no explanation for “airbags deployed before 0km on odometer”
there is no point analysing the rest of the data. It’s all bogus, especially
the stuff that looks like it makes sense.

The fundamental fault of the author’s analysis is trusting data that has been
shown to be untrustworthy. In the same set of data, which has some cars deploying airbags
before ever being driven, the author assumes that the data which matches their
model of validity is magically more accurate and thus worthy of study.

The author continues to trust this data because it supports their biases, even
while pointing out the errors in the rest of the data. How is this not cherry
picking?

The entire study is bogus because it starts with bogus data and attempts to
make claims about reality based on a subset of bogus data that can be
misinterpreted as less bogus.

It’s like picking the few pages of manuscript from the infinite monkeys room
which have meaningful phrases on them and claiming that obviously monkeys are
really good writers.

------
notroot
This bit about “before Autosteer install” airbag deployment counts coming from
vehicles with 0 miles of “before Autosteer install” distance is certainly odd,
if true. "That is, simply because the data are missing for the Previous
Mileage before Autosteer Install and the “Next Mileage after Autosteer
Install,” NHTSA’s method of calculation assumes that all of the exposure
mileage must belong to the “after Autosteer” category. The three airbag
deployments without any exposure mileage in the “before Autosteer” category
show this is not the case." [pg.11-12]
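The allocation rule being quoted can be sketched with two entirely hypothetical vehicles (the real per-vehicle columns are redacted, so the numbers here are made up): when the install-mileage fields are missing, every exposure mile gets credited to "after Autosteer", even for a car whose crash happened "before".

```python
# Hypothetical rows: (total_odometer_miles, mileage_at_autosteer_install)
# None stands in for the redacted/missing install-mileage fields.
vehicles = [
    (40_000, 10_000),  # install point known: 10k before, 30k after
    (40_000, None),    # install point missing
]

before_miles = 0
after_miles = 0
for total, install in vehicles:
    if install is None:
        # The method described: missing install mileage means the whole
        # exposure is counted as "after Autosteer" -- so any "before"
        # crash for this car is left with zero matching exposure miles.
        after_miles += total
    else:
        before_miles += install
        after_miles += total - install

print(before_miles, after_miles)  # 10000 70000
```

Under this rule the "before" denominator can only shrink, which is exactly how you end up with before-Autosteer deployments attached to zero before-Autosteer miles.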

~~~
notroot
The plot thickens there in Figure 3 where they suggest mistreatment of
deployments/miles that occur between `Prev Mileage before Autosteer Install
Reported` and `Next Mileage after Autosteer Install Reported` "result in
estimates of the true exposure that are statistically biased downward –
resulting in crash rates that are somewhat too large." [pg.13]

~~~
notroot
And then, in Figure 4, 15 deployments are counted before installation while no
additional mileage is counted for the before-installation group. [pg.16] Could
the NHTSA group really have done that? It gives us an incredible rate of 15/0
deployments/mi without Autosteer, which violates some very important laws of
arithmetic, statistics and common sense.

I'm going to hold my breath until we see the data these claims are based on.

------
danbr
I worked in a research lab to help collect hundreds of thousands of miles of
non-biased, real-world, Tesla specific driving data.

I know the data is there to help prove or squash claims like this. I hope it
gets out there one day.

~~~
rjdagost
Having had a front-row seat to the data, what's your take? Do you think
Tesla's Autosteer helps, hinders, or keeps accident rates unchanged?

~~~
rohit2412
Not him, but compared to an unassisted human, a human-supervised autosteer
might be better. That's no endorsement of driverless autosteer, though.

But what we should compare is supervised autosteer against an assisted human,
i.e. Autosteer versus plain lane-keeping assist. If Tesla offered such a
service, we would be able to compare. Until then, everybody is just comparing
unreasonably, in my opinion.

~~~
danbr
Something like this.

> Having had a front-row seat to the data, what's your take? Do you think
> Tesla's Autosteer helps, hinders, or keeps accident rates unchanged?

From my subjective viewpoint of the data - Tesla's Autosteer overall helps to
reduce accidents. But only when the operator is fully aware of the system's
functions and alerts.

Driver assistance systems are wonderful technology and should be standardized
like seatbelts, airbags and ABS are. The underlying issue is related to its
slow adoption and use, and not a lack of functionality. How the alerts and
information about impending caution/take-over-required/accident situations are
presented to the driver is the main hurdle to widespread adoption.

Tesla autosteer has a "Delightfully Counter-Intuitive" (as Musk might say)
manner of teaching drivers how the system functions and what its abilities and
limits are, therefore reducing accident rates - in my naive (data) opinion.

Main point being, driver assistance systems could help reduce accident
rates... as long as drivers know how to use them, how they function, and what
all their literal bells and whistles mean, to the point where you're always
supervising the system (whether you realize it or not).

~~~
AndrewBissell
> From my subjective viewpoint of the data - Tesla's Autosteer overall helps
> to reduce accidents. But only when the operator is fully aware of the
> system's functions and alerts.

You can't just throw that second qualifier on there. Whether Autopilot helps
reduce accidents is _100% dependent_ on whether users take it as an excuse to
let their attention wander from the road. If the behavior of many Tesla fans
on Twitter is any indication, they already think it's fine to treat it like a
full self-driving system:
[https://twitter.com/Scobleizer/status/1092477048482717696](https://twitter.com/Scobleizer/status/1092477048482717696)

And that's to say nothing of the fact that _Elon Musk himself_ was shown on 60
Minutes letting Autopilot drive, in traffic, with his hands off the wheel and
Lesley Stahl in the passenger seat. The user's manual may say to keep your
hands on the wheel, and in court when faced with an Autopilot-related lawsuit
Tesla always argues that drivers are 100% responsible for the car at all
times, but Musk & Tesla are nonetheless happy to irresponsibly encourage the
false perception that their cars are self-driving.

IMO constant in-car monitoring of driver attention (i.e. a camera focused
directly on the driver's face) is the only way these
"kinda-self-driving-but-not-really-when-it-counts" systems can be deployed
safely.

~~~
manicdee
How much better is “check the user is looking at the road” than “check the
user has hands on the wheel”? Is the improvement in driver attention
measurable?

~~~
rohit2412
I say we should shut down the "kinda self driving but not when it counts"
systems, and have them only for accident protection. Basically "self driving
only when absolutely needed"

------
gok
They obtained the data by FOIA but didn't post it anywhere? They did post
their lawsuit documents though.

~~~
mikeortman
A FOIA request can be made by mail. You don't have to sue to get the data
unless the NHTSA was being a total jerk.

~~~
tamalesfan
"You don't have to sue"

There is literally an app[1] to track the progress of the hundreds of FOIA
lawsuits outstanding at any given moment. The federal government routinely
denies FOIA requests and forces applicants to pursue them in court. If
attributing this to Trump would help you accept this reality, then have a look
at this[2]. Agencies are deliberately not investing in IT systems that would
facilitate FOIA, according to MIT[3].

1\. [http://foiaproject.org/lawsuit/](http://foiaproject.org/lawsuit/)

2\. [http://foiaproject.org/2018/01/16/lawsuits-trump-first-year/](http://foiaproject.org/2018/01/16/lawsuits-trump-first-year/)

3\. [https://www.pcworld.com/article/3097047/software/the-fbi-is-...](https://www.pcworld.com/article/3097047/software/the-fbi-is-using-outdated-it-to-foil-foia-requests-lawsuit-alleges.html)

Now that your objection to a key premise of this story has been
comprehensively discredited, perhaps you might make the necessary world view
adjustment and reconsider your conclusion.

BTW, the foiaproject links above concern only federal FOIA; similar state laws
exist, and state-level FOIA lawsuits are legion. Your governments are actively
thwarting FOIA at every level.

------
manicdee
What’s the size of the error on this estimate, given the quality of the input
data? To one significant digit, are we talking about an error margin larger or
smaller than the entire data set?

------
mcguire
The NHTSA's researchers didn't publish their study in a peer reviewed
journal?!?!

~~~
mcguire
I'm not one to complain about downvotes, but this entire episode seems to be
due to science-by-press release.

 _Has_ anyone got a full report on the NHTSA study?

~~~
AndrewBissell
NHTSA vigorously argued in court that they needed to keep the data secret
because its release posed a threat of "competitive harm" to Tesla. The authors
of this new study had to file a FOIA request and then sue when it was
initially denied by NHTSA. In the absence of the data, I'm not sure what the
editors of a peer-reviewed journal would have been able to review.

------
jaspergilley
This seems very fishy. A seemingly not-previously-established website publishes
a study critical of Tesla. Gas-auto and oil interests have been known to do
this kind of thing before; I wouldn't be surprised if this were the latest
iteration.

~~~
AndrewBissell
Hey look, it's the same line trotted out every time someone suggests that
Teslas are not the safest, most reliable cars on Earth, that Autopilot is not
just one OTA update away from Level 5 self-driving, and that Elon Musk is not
a modern-day Thomas Edison about to end climate change and bring us all to
Mars:

"GOTTA BE THE OIL & GAS INDUSTRY!"

This is really just a way for Tesla and Elon Musk fans to inject a cloud of
distracting squid ink into discussions of any remotely damaging info.

If you are looking for some third party validation, Edward Niedermeyer is a
journalist who has followed Tesla & Elon Musk closely for years and is taking
this study very seriously:
[https://twitter.com/Tweetermeyer/status/1094318155285946368](https://twitter.com/Tweetermeyer/status/1094318155285946368)

~~~
omarforgotpwd
Andrew is one of the trolls who spends all day attacking Tesla on twitter. Go
look him up, Andrew Bissell. Top commenter is right, this document is highly
suspect. In any case an analysis from 2017 is irrelevant given how much the
software has changed. It is sad that Andrew would see people die to help his
Tesla short position.

Ah, looks like he just protected his tweets. Probably worried about his
employer or federal investigators looking into what he’s spending all his time
on.

[https://mobile.twitter.com/arbissell?lang=en](https://mobile.twitter.com/arbissell?lang=en)

~~~
AndrewBissell
^ The attempt to intimidate critics into silence with threats to their
employment is also a common tactic among Tesla bulls. A bit of behavior
learned from their Dear Leader after Elon Musk chased Montana Skeptic off of
Twitter and Seeking Alpha.

The idea that federal investigators are taking any interest in Tesla shorts is
OTOH pure Tesla bull fantasy. (Actually, that's not quite true: the SF SEC
reportedly spoke with one short seller to gather tips about the company.)

Anyone who wants to follow me without tweeting that my skepticism of Autopilot
is tantamount to murder, and that "God will judge me" for it (as Omar here
did), is welcome to send me a follow request.

~~~
omarforgotpwd
Damn Andrew, I didn’t think you had such thin skin. Enjoy the echo chamber.
Happy to unblock each other if you ever feel like listening to viewpoints that
are different than your own. And nobody’s calling your boss, I doubt anyone
really cares. They are trolling you.

------
omarforgotpwd
The origins of this report are suspect.

At worst it is saying someone at NHTSA did the calculations incorrectly, a
bold claim they admit even they can’t make for certain because of redactions.

Someone calculating the data incorrectly doesn’t make AutoPilot any more or
less safe than it has been since the NHTSA reviewed the data in 2007. In any
case an analysis from 2007 is hardly relevant today given how much the
software has changed.

~~~
ejstronge
> The origins of this report are suspect.

Potentially, but the data is now available and you can repeat the analysis
yourself, which is a better position than we were in two days ago.

> a bold claim they admit even they can’t say for certain people because of
> redactions.

I'm not sure where they claim this - their lawsuit actually asks for all
communications related to the NHTSA figure in question.

> Someone calculating the data incorrectly doesn’t make AutoPilot any more or
> less safe than it has been since the NHTSA reviewed the data in 2007.

Note the year in question is _2016_.

In fact, we did not know how safe autosteer technologies were - this report
shows that the report so many of us had relied on was, at best, poorly
constructed.

> In any case an analysis from 2007 is hardly relevant today given how much
> the software has changed.

Again, the correct year is _2016_. This could be true - hopefully Tesla will
release a response with the data made available for independent analysts to
assess the safety of their autosteer technology.

~~~
manicdee
This report shows nothing, since the dataset was heavily redacted, the author
cherry-picked clean-looking data from a garbage dataset, and the author’s
“findings” depend entirely on assuming the data is accurate, having just spent
the entire report describing how inaccurate the data is.

