NHTSA’s Implausible Safety Claim for Tesla’s Autosteer Driver Assistance System [pdf] (safetyresearch.net)
81 points by firebacon 12 days ago | 39 comments





> The calculation of accurate crash rates of this type depend on reliable counts or estimates of both airbag deployment crashes as well as the mileage travelled exposing vehicles to the risk of a crash. But after obtaining the formerly secret, underlying data through a lawsuit filed under the Freedom of Information Act (FOIA) against the U.S. Department of Transportation, we discovered that the actual mileage at the time the Autosteer software was installed appears to have been reported for fewer than half the vehicles NHTSA studied. For those vehicles that do have apparently exact measurements of exposure mileage both before and after the software’s installation, the change in crash rates associated with Autosteer is the opposite of that claimed by NHTSA – if these data are to be believed.

Having trouble following what they are saying... the specific terminology over when mileage records were taken versus when autopilot was installed is confusing.

They identify a subset of vehicles and run an analysis on just those vehicles which seems to determine airbag deployments increased for those cars, and then run a regression that states the more miles driven the less likely an airbag is to deploy. They state this is unexpected but don’t explain what was wrong with their data to cause it.

The headline numbers that they do present, in a footnote:

“The summary of Column AX (Miles before Autosteer) is given in this worksheet as 64,788,137. The summary of Column AY (Miles after Autosteer) is stated to be 235,880,377. We counted 86 airbag deployments “before Autosteer” in this Table and 192 deployments “after Autosteer.”

Basic math shows the airbag deployment rate falling from one per ~753k miles to one per ~1.23m miles, right?
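Spelled out, that back-of-the-envelope calculation, using only the mileage and deployment totals quoted from the footnote above, looks like:

```python
# Totals quoted from the report's footnote.
miles_before = 64_788_137      # Column AX: miles before Autosteer
miles_after = 235_880_377      # Column AY: miles after Autosteer
deploys_before = 86            # airbag deployments "before Autosteer"
deploys_after = 192            # airbag deployments "after Autosteer"

# Miles driven per airbag deployment, before and after the install.
print(miles_before / deploys_before)  # ≈ 753,350 miles per deployment
print(miles_after / deploys_after)    # ≈ 1,228,544 miles per deployment
```

Taken at face value, the rate improves after Autosteer; the article's contention is that the "before" mileage denominator is badly undercounted, which would flip this result.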

The gist of TFA is that those “before” autopilot miles are significantly undercounted. Without understanding more what the labels “Previous Mileage before Autosteer Install” and “Next Mileage after Autosteer Install” mean, and what it means when they are not reported, it’s hard to say.

But there do seem to be several significant cases where the airbag deployment count prior to Autosteer increased for certain cohorts of vehicles with no corresponding increase in total mileage before Autosteer. Obviously a car with zero miles can’t have crashed, which is what calls the entire calculation into question.


> Having trouble following what they are saying... the specific terminology over when mileage records were taken versus when autopilot was installed is confusing

I think this is one of the main points of the article. The NHTSA had the same data, and made a very sensational graph, without any clarification on what the headings mean. The authors essentially recapitulated the initial analysis and then figured out its shortcomings.

> They identify a subset of vehicles and run an analysis on just those vehicles which seems to determine airbag deployments increased for those cars

The article clearly explains why that subset was the most important. This was the only subset for which complete data was available.

> and then run a regression that states the more miles driven the less likely an airbag is to deploy. They state this is unexpected but don’t explain what was wrong with their data to cause it.

Actually, they do offer explanations in their discussion. They even refer the reader to the discussion right after they post the table. (i.e., See more on this topic in the Discussion section).

Also, see my initial point; this is not 'their data' so much as what was relied upon by NHTSA to make the 2016 claim about autosteer safety.

> Without understanding more what the labels “Previous Mileage before Autosteer Install” and “Next Mileage after Autosteer Install” mean, and what it means when they are not reported, it’s hard to say.

See my first point. However, they show pretty reasonably that those two headings could mean 'recorded mileage right before the update' and 'recorded mileage right after the update'.

> Obviously a car with zero miles can’t have crashed, so this is what calls the entire calculation into question.

This is another one of their points. They state multiple times that, if the data can be trusted, it doesn't support the NHTSA's claim.

--

The article is actually very careful on all the points you raised, and would merit a close reading before being dismissed.


The NHTSA were using different data, since they had the non-redacted version.

The author uses the redacted version, doesn’t attempt to find an explanation for the anomalies, proceeds to analyse the data based on faulty initial assumptions, and compounds this by using a cherry-picked example where the poorly understood numbers best match those faulty assumptions.

When there is no explanation for “airbags deployed before 0km on odometer” there is no point analysing the rest of the data. It’s all bogus, especially the stuff that looks like it makes sense.

The fundamental fault of the author’s analysis is trusting data that is demonstrably untrustworthy. In the same set of data in which some cars deploy airbags before ever being driven, the author assumes that the data which matches their model of validity is magically more accurate and thus worthy of study.

The author continues to trust this data because it supports their biases, even while pointing out the errors in the rest of the data. How is this not cherry picking?

The entire study is bogus because it starts with bogus data and attempts to make claims about reality based on a subset of bogus data that can be misinterpreted as less bogus.

It’s like picking the few pages of manuscript from the infinite monkeys room which have meaningful phrases on them and claiming that obviously monkeys are really good writers.


"[They] run a regression that states the more miles driven the less likely an airbag is to deploy. They state this is unexpected but don’t explain what was wrong with their data to cause it"

This is covered in the Discussion section, but not with much clarity. I think it makes more sense if you invert it:

    cars that have been in a crash accumulate fewer miles than cars that have not been in a crash
This effect seems obvious to me, since a car can't be driven while it is being repaired (and will never be driven again if it was totaled).
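The inverted reading above can be demonstrated with a toy simulation (every number here is made up purely for illustration): even when the per-mile crash risk is identical for every car, the cars that crash end up with fewer recorded miles, so a naive look at crashes versus mileage will find that more miles "predicts" fewer crashes.

```python
import random

random.seed(0)
PLANNED_MILES = 100_000   # miles a car would accumulate if it never crashed (illustrative)
CRASH_RATE = 1 / 50_000   # constant per-mile crash hazard, identical for every car (illustrative)

crashed, clean = [], []
for _ in range(10_000):
    # Distance to the first crash under a constant hazard is exponentially distributed.
    first_crash_at = random.expovariate(CRASH_RATE)
    if first_crash_at < PLANNED_MILES:
        crashed.append(first_crash_at)  # the car stops accumulating miles when it crashes
    else:
        clean.append(PLANNED_MILES)     # no crash: full planned mileage is recorded

avg_crashed = sum(crashed) / len(crashed)
avg_clean = sum(clean) / len(clean)
print(avg_crashed < avg_clean)  # True: crashed cars show fewer miles despite equal risk
```

The comparison is guaranteed here by construction: every crashed car's odometer is truncated below the mileage every clean car records, even though the underlying risk per mile never differed.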

This bit about "before Autosteer" airbag deployment counts coming from vehicles with zero miles of "before Autosteer" distance is certainly odd, if true. "That is, simply because the data are missing for the Previous Mileage before Autosteer Install and the “Next Mileage after Autosteer Install,” NHTSA’s method of calculation assumes that all of the exposure mileage must belong to the “after Autosteer” category. The three airbag deployments without any exposure mileage in the “before Autosteer” category show this is not the case." [pg. 11-12]

The plot thickens there in Figure 3 where they suggest mistreatment of deployments/miles that occur between `Prev Mileage before Autosteer Install Reported` and `Next Mileage after Autosteer Install Reported` "result in estimates of the true exposure that are statistically biased downward – resulting in crash rates that are somewhat too large." [pg.13]

And then, in Figure 4, 15 deployments are counted before installation while no additional mileage is counted for the before-installation group. [pg. 16] Could the NHTSA group really have done that? It gives us an incredible rate of 15/0 deployments/mi without Autosteer, which violates some very important laws of arithmetic, statistics, and common sense.
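As a sanity check, here is what that Figure 4 cohort implies if you plug the numbers in naively (the 15-deployment/0-mile figures are the article's; the handling is just an illustration of why the cohort is arithmetically impossible as recorded):

```python
deploys_before = 15  # deployments counted "before Autosteer" in the Figure 4 cohort
miles_before = 0     # yet no "before Autosteer" mileage is recorded for the same cohort

# A per-mile crash rate for this cohort is simply undefined: 15 crashes over 0 miles.
try:
    rate = deploys_before / miles_before
except ZeroDivisionError:
    rate = float("inf")
print(rate)  # inf: either the deployments or the mileage must be miscategorized
```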

I'm not going to hold my breath waiting to see the data these claims are based on.


I worked in a research lab to help collect hundreds of thousands of miles of non-biased, real-world, Tesla specific driving data.

I know the data is there to help prove or squash claims like this. I hope it gets out there one day.


Having had a front-row seat to the data, what's your take? Do you think Tesla's Autosteer helps, hinders, or keep accident rates unchanged?

Can't speak for the data, but can speak as a user of autopilot. I've probably put ~20k miles on autopilot, 5k of those miles on highways.

I think the biggest benefit, safety-wise, is that it takes less mental focus to stay in the lane, which gives me the ability to spend more time focusing on my surroundings. It's easier for me to keep up with the traffic behind me (like if a car has crept up into my blind spot) and any potential traffic about to cross my path (like a possible red-light runner coming in hot). In heavy traffic, it's super helpful; the auto cruise control is very human-like and will build a gap when someone cuts you off instead of suddenly making one with a slam on the brakes.

There are AP users who prefer to use their time checking on that text message. Those people don't help the statistics / human gene pool.


Not him, but compared to an unassisted human, human-supervised autosteer might be better. That's no endorsement of driverless autosteer, though.

But what we should compare is supervised autosteer against an assisted human, i.e. autosteer versus lane-keeping assist. If Tesla offered such a service, we would be able to compare. Until then, everybody is just making unreasonable comparisons, in my opinion.


Something like this.

>Having had a front-row seat to the data, what's your take? Do you think Tesla's Autosteer helps, hinders, or keep accident rates unchanged?

From my subjective viewpoint of the data: Tesla's autosteer overall helps to reduce accidents, but only when the operator is fully aware of the system's functions and alerts.

Driver assistance systems are wonderful technology and should be standardized the way seatbelts, airbags, and ABS are. The underlying issue is slow adoption and use, not a lack of functionality. How alerts and information about impending caution, take-over-required, and accident situations are presented to the driver is the main hurdle to widespread adoption.

Tesla autosteer has a "Delightfully Counter-Intuitive" (as Musk might say) way of teaching drivers how the system functions and what its abilities and limits are, thereby reducing accident rates - in my naive (data) opinion.

Main point being, driver assistance systems could help reduce accident rates... as long as drivers know how to use them, how they function, and what all their literal bells and whistles mean, to the point where you're always supervising the system (whether you realize it or not).


> From my subjective viewpoint of the data: Tesla's autosteer overall helps to reduce accidents, but only when the operator is fully aware of the system's functions and alerts.

You can't just throw that second qualifier on there. Whether Autopilot helps reduce accidents is 100% dependent on whether users take it as an excuse to let their attention wander from the road. If the behavior of many Tesla fans on Twitter is any indication, they already think it's fine to treat it like a full self-driving system: https://twitter.com/Scobleizer/status/1092477048482717696

And that's to say nothing of the fact that Elon Musk himself was shown on 60 Minutes letting Autopilot drive, in traffic, with his hands off the wheel and Lesley Stahl in the passenger seat. The user's manual may say to keep your hands on the wheel, and in court when faced with an Autopilot-related lawsuit Tesla always argues that drivers are 100% responsible for the car at all times, but Musk & Tesla are nonetheless happy to irresponsibly encourage the false perception that their cars are self-driving.

IMO constant in-car monitoring of driver attention (i.e. a camera focused directly on the driver's face) is the only way these "kinda-self-driving-but-not-really-when-it-counts" systems can be deployed safely.


How much better is “check the user is looking at the road” than “check the user has hands on the wheel”? Is the improvement in driver attention measurable?

I say we should shut down the "kinda self driving but not when it counts" systems, and have them only for accident protection. Basically "self driving only when absolutely needed"

> a human supervised autosteer might be better

Or it might not, so it would be good to have more data and see which features help and which maybe don't.

The question is why the data is secret. I understand why Tesla does not want to share the training data, but I am referring to the safety records and related statistics.


it’s a massive win for safety, no doubt in my mind. Speaking as a user.

They obtained the data by FOIA but didn't post it anywhere? They did post their lawsuit documents though.


A FOIA request can be made by mail. You don't have to sue to get the data unless the NHTSA was being a total jerk.

"You don't have to sue"

There is literally an app[1] to track the progress of the hundreds of FOIA lawsuits outstanding at any given moment. The federal government routinely denies FOIA requests and forces applicants to pursue them in court. If attributing this to Trump would help you internalize this reality, then have a look at this[2]. Agencies are deliberately not investing in IT systems that would facilitate FOIA, according to MIT[3].

1. http://foiaproject.org/lawsuit/ 2. http://foiaproject.org/2018/01/16/lawsuits-trump-first-year/ 3. https://www.pcworld.com/article/3097047/software/the-fbi-is-...

Now that your objection to a key premise of this story has been comprehensively discredited, perhaps you might make the necessary world view adjustment and reconsider your conclusion.

BTW, the foiaproject links above are concerned only with Federal FOIA; similar state laws exist and the number of FOIA lawsuits at the state level are legion. Your governments are actively thwarting FOIA at every level.


TFA clearly details how they had to sue to get the data...

What’s the size of the error on this estimate given the quality of the input data? At one significant digit are we talking an error margin larger or smaller than the entire data set?

The NHTSA's researchers didn't publish their study in a peer reviewed journal?!?!

I'm not one to complain about downvotes, but this entire episode seems to be due to science-by-press-release.

Has anyone got a full report on the NHTSA study?


NHTSA vigorously argued in court that they needed to keep the data secret because its release posed a threat of "competitive harm" to Tesla. The authors of this new study had to file a FOIA request and then sue when it was initially denied by NHTSA. In the absence of the data, I'm not sure what the editors of a peer-reviewed journal would have been able to review.

This seems very fishy. A seemingly not-previously-established website publishes a study critical of Tesla. Auto and oil & gas interests have been known to do this kind of thing before; I wouldn't be surprised if this was the latest iteration.

Hey look, it's the same line trotted out every time someone suggests that Teslas are not the safest, most reliable cars on Earth, that Autopilot is not just one OTA update away from Level 5 self-driving, and that Elon Musk is not a modern-day Thomas Edison about to end climate change and bring us all to Mars:

"GOTTA BE THE OIL & GAS INDUSTRY!"

This is really just a way for Tesla and Elon Musk fans to inject a cloud of distracting squid ink into discussions of any remotely damaging info.

If you are looking for some third party validation, Edward Niedermeyer is a journalist who has followed Tesla & Elon Musk closely for years and is taking this study very seriously: https://twitter.com/Tweetermeyer/status/1094318155285946368


Elon Musk is far greater an inventor/entrepreneur than Edison ever was. If Elon accomplishes a fraction of what he's set out to in his life, history books from the year 2500 will relegate Edison to a footnote in the chapter about Musk.

That’s a big claim. Can you back it up with some evidence? Edison played a key role in at least three technologies which transformed life in the 20th century. Musk has led some incremental improvements which are nice but I’m skeptical that he’ll realistically match, much less eclipse, that impact.

Just because they are crazy, doesn't mean they are wrong

In this case, the safety researchers in question have previously held many legacy automakers' feet to the fire, and Sean Kane has past association with Ralph Nader. Oil & gas stooges these are not.

Andrew is one of the trolls who spends all day attacking Tesla on twitter. Go look him up, Andrew Bissell. Top commenter is right, this document is highly suspect. In any case an analysis from 2017 is irrelevant given how much the software has changed. It is sad that Andrew would see people die to help his Tesla short position.

Ah, looks like he just protected his tweets. Probably worried about his employer or federal investigators looking into what he’s spending all his time on.

https://mobile.twitter.com/arbissell?lang=en


^ The attempt to intimidate critics into silence with threats to their employment is also a common tactic among Tesla bulls. A bit of behavior learned from their Dear Leader after Elon Musk chased Montana Skeptic off of Twitter and Seeking Alpha.

The idea that federal investigators are taking any interest in Tesla shorts is OTOH pure Tesla bull fantasy. (Actually, that's not quite true: the SF SEC reportedly spoke with one short seller to gather tips about the company.)

Anyone who wants to follow me without tweeting that my skepticism of Autopilot is tantamount to murder, and that "God will judge me" for it (as Omar here did), is welcome to send me a follow request.


Damn Andrew, I didn’t think you had such thin skin. Enjoy the echo chamber. Happy to unblock each other if you ever feel like listening to viewpoints that are different than your own. And nobody’s calling your boss, I doubt anyone really cares. They are trolling you.

The origins of this report are suspect.

At worst it is saying someone at NHTSA did the calculations incorrectly — a bold claim they admit even they can’t say for certain because of redactions.

Someone calculating the data incorrectly doesn’t make AutoPilot any more or less safe than it has been since the NHTSA reviewed the data in 2007. In any case an analysis from 2007 is hardly relevant today given how much the software has changed.


> The origins of this report are suspect.

Potentially, but the data is now available and you can repeat the analysis yourself, which is a better position than we were in two days ago.

> a bold claim they admit even they can’t say for certain because of redactions.

I'm not sure where they claim this - their lawsuit actually asks for all communications related to the NHTSA figure in question.

> Someone calculating the data incorrectly doesn’t make AutoPilot any more or less safe than it has been since the NHTSA reviewed the data in 2007.

Note the year in question is 2016.

In fact, we did not know how safe autosteer technologies were - this report shows that the report so many of us had relied on was, at best, poorly constructed.

> In any case an analysis from 2007 is hardly relevant today given how much the software has changed.

Again, the correct year is 2016. This could be true - hopefully Tesla will release a response with the data made available for independent analysts to assess the safety of their autosteer technology.


This report shows nothing, since the dataset was heavily redacted, the author cherry-picked clean-looking data from a garbage dataset, and the author's "findings" are entirely dependent on assuming the data is accurate, having just spent the entire report describing how inaccurate the data is.


