New Vehicle Safety Report Reveals Teslas Using Autopilot Are 10x Safer (notateslaapp.com)
21 points by toomuchtodo on Feb 24, 2023 | 23 comments



How much of this is Autopilot only being used in safer situations?

There have also been reports of Autopilot disengaging seconds before a crash, so those crashes technically didn't occur while Autopilot was active.


Much like (in Australia at least), every time a car crashed that was being pursued by a police vehicle, the police would report that "the pursuit had been called off shortly before the crash" (and media would subsequently publish said report without question).

Back to Tesla - what sort of warning is given when the autopilot is going to disengage?


One way to analyze this is to ignore whether Autopilot is on at the moment of a crash and make a gross comparison of accident rates over mileage for Autopilot-equipped vehicles. A side stat, if possible, would be to see whether there is a correlation between the amount of Autopilot on-time and the accident rate (again ignoring whether AP is on at the moment of a crash).

This answers a different question than whether Autopilot is effective when on; it asks whether Autopilot capability plus the human employment of it creates an observable effect on accident rates.
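
A rough sketch of what that gross comparison could look like, assuming a hypothetical per-vehicle dataset with total miles, Autopilot-on miles, and crash counts (the field names and numbers are illustrative, not anything Tesla actually publishes):

    import pandas as pd

    # Hypothetical per-vehicle records for AP-equipped cars.
    df = pd.DataFrame({
        "vehicle_id":  [1, 2, 3, 4],
        "total_miles": [12_000, 8_000, 20_000, 15_000],
        "ap_on_miles": [3_000, 0, 12_000, 1_500],
        "crashes":     [0, 1, 0, 1],
    })

    # Gross comparison: crashes per million miles across AP-equipped vehicles,
    # ignoring whether AP was on at the moment of each crash.
    fleet_rate = df["crashes"].sum() / df["total_miles"].sum() * 1e6
    print(f"fleet crash rate: {fleet_rate:.1f} per million miles")

    # Side stat: does a vehicle's share of AP-on miles correlate with its crash rate?
    df["ap_share"] = df["ap_on_miles"] / df["total_miles"]
    df["crash_rate"] = df["crashes"] / df["total_miles"] * 1e6
    print(df[["ap_share", "crash_rate"]].corr())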


Highway miles are 10x safer than non-highway miles, and AP may be selectively engaged in only the safest situations.

A population study would be tricky to do, because all recent Teslas have Autopilot and Teslas as a whole are newer cars driven by rich people, with a heavy bias towards California. Further, Teslas have automatic emergency braking independent of Autopilot.

So you can’t compare accident rates as a whole, but you could compare accident rates of, say, new Mercedes or BMWs with Teslas in California, which have roughly the same capabilities and are driven by the same rich people (though possibly slightly older).
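
A minimal sketch of that kind of cohort comparison, with made-up crash counts and mileage and a simple Poisson rate ratio (real work would still need matching on driver age, region, model year, and so on):

    import math

    # Made-up cohort data: (crashes, millions of miles driven).
    tesla_ca  = (40, 250.0)   # hypothetical: new Teslas in California
    luxury_ca = (70, 300.0)   # hypothetical: new Mercedes/BMWs in California

    def rate(crashes, mmiles):
        return crashes / mmiles

    rr = rate(*tesla_ca) / rate(*luxury_ca)

    # Rough 95% CI for the rate ratio (log-normal approximation).
    se = math.sqrt(1 / tesla_ca[0] + 1 / luxury_ca[0])
    lo, hi = rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)
    print(f"rate ratio {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")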


Entirely. People pointed this out to Tesla (that Autopilot can only be engaged on the freeway, and that drivers will engage it only in situations they feel comfortable with) each of the previous times they released such numbers, and they haven't corrected it, which tells you all you need to know about what the corrected numbers would show.


From the methods section, they're counting any crash where it was on within the 5 seconds before the crash; I've no idea whether that's reasonable or not, and it would be interesting to know what happens to the numbers if you push that out to 20s.

> To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed. (Our crash statistics are not based on sample data sets or estimates.) In practice, this correlates to nearly any crash at about 12 mph (20 kph) or above, depending on the crash forces generated.
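
A small sketch of what that counting rule might look like over hypothetical crash records, with the window as a parameter so you could push it from 5s to 20s (the fields are invented for illustration, not Tesla's actual schema):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Crash:
        airbag_deployed: bool
        # Seconds between AP deactivation and impact; 0.0 means AP was
        # still active at impact, None means AP was never engaged.
        ap_off_seconds_before_impact: Optional[float]

    def counts_as_ap_crash(c: Crash, window_s: float = 5.0) -> bool:
        """Count a crash against Autopilot if AP was active at impact or was
        deactivated within window_s seconds before impact, gated on an
        airbag/active-restraint deployment, per the quoted methodology."""
        if not c.airbag_deployed:
            return False
        if c.ap_off_seconds_before_impact is None:
            return False
        return c.ap_off_seconds_before_impact <= window_s

    crashes = [
        Crash(True, 0.0),    # AP on at impact
        Crash(True, 3.5),    # disengaged 3.5s before impact: counted at 5s
        Crash(True, 12.0),   # disengaged 12s before: only counted at 20s
        Crash(False, 1.0),   # no airbag deployment: excluded either way
    ]

    for window in (5.0, 20.0):
        n = sum(counts_as_ap_crash(c, window) for c in crashes)
        print(f"window {window:>4}s -> {n} crashes attributed to AP")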


Until Tesla (or any other manufacturer) assumes legal liability for any crashes via autopilot, this is all empty posturing. If you believe in your product, then put your money where your mouth is, end of story. Until then, please don't waste your breath, because that's incontrovertible proof that you don't believe in the safety of your product.


This. When you buy geohot's $500 hack, you can posture as much as you want. But when you pay $10k, you assume you get some guarantees and insurance.


Legal liability is something to be decided by local laws? Wouldn't insuring the cars more cheaply be closer to "money where your mouth is"? Anyone using this: is their insurance cheaper? https://www.tesla.com/insurance


Anything that improves crash statistics over those produced by human drivers is an improvement. It does not have to be perfect, but it does need to show it's safer than the average legal human driver.


If I were Tesla, I would do it by extending insurance coverage when Autopilot is used. If they're right, they'll make a killing.


Car companies have improved safety significantly even beyond regulated requirements. None in history have assumed legal liability for crashes.

Therefore no car company that currently exists believes in safety?


Toyota, quite famously, was ordered to pay a $1.2 billion fine for the unintended acceleration issue.

How much more liable do you want?

Go ahead and nitpick how it's not strict legal liability.


This is a different situation: the car can (supposedly) drive itself. If the car could drive itself perfectly, then it would make 100% sense for the manufacturer to assume liability for accidents under Autopilot through its own insurance. Eventually this will become the case: disengaging Autopilot will automatically increase your monthly insurance premium.


Presumably a car manufacturer would be liable if a safety-critical part malfunctioned, say the brakes.

Autopilot is like brakes that the maker says "they usually work but no guarantees". A higher autonomy level system would be like brakes that actually are guaranteed to work.


> In the 3rd quarter, one crash was recorded for every 6.26 million miles driven using Autopilot. Teslas not using Autopilot technology logged one crash for every 1.71 million miles driven.

Autopilot might be safer, but this statistic is just a textbook example of selection bias. To compare one to the other, crashes in places where Autopilot doesn't work need to be excluded.
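
A toy illustration of the selection-bias point, using made-up numbers in which Autopilot provides zero safety benefit and is simply engaged on safer roads:

    # Made-up assumptions: highway miles are 10x safer than city miles,
    # and Autopilot is only ever engaged on the highway.
    city_rate    = 1 / 1.0e6    # 1 crash per 1M city miles
    highway_rate = 1 / 10.0e6   # 1 crash per 10M highway miles

    ap_rate     = highway_rate                          # AP miles: all highway
    non_ap_rate = 0.7 * city_rate + 0.3 * highway_rate  # manual miles: mostly city

    print(f"miles per crash with AP:    {1 / ap_rate / 1e6:.2f}M")
    print(f"miles per crash without AP: {1 / non_ap_rate / 1e6:.2f}M")
    print(f"apparent safety ratio:      {non_ap_rate / ap_rate:.1f}x")

Even with no benefit from the system itself, the road mix alone produces a multiple-x gap of the same flavor as the 6.26M vs 1.71M figure.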


https://www.tesla.com/VehicleSafetyReport

> Update (January 2023): We are proud of Autopilot’s performance and its impact on reducing traffic collisions. The benefit and promise of Autopilot is clear from the Vehicle Safety Report data that we have been sharing for 4 years. As part of Tesla’s commitment to continuous improvement, recent analysis led us to identify and implement upgrades to our data reporting. Specifically, we discovered reports of certain events where no airbag or other active restraint deployed, single events that were counted more than once, and reports of invalid or duplicated mileage records. Including these events is inconsistent with our methodology for the Vehicle Safety Report and they will be excluded going forward. These upgrades in data analysis reinforce the positive impact that Autopilot has on vehicle safety. To ensure the accuracy of our reporting, we updated all collision rates historically to account for these upgrades, including the baseline collision rates for the United States based on currently available NHTSA and FHWA data. (Note that for purposes of the baseline collision rates in the United States, an automobile crash is one that involves at least one passenger vehicle, light truck, SUV or van that is 10,000 pounds or less, as classified by available federal data.) The end result is that, when Autopilot is active, the collision rates are even lower than we previously reported.


I don't understand how this is possible. I recently rode in a Tesla with FSD engaged. In a span of 10 minutes it tried to run a red light, failed to change lanes, and failed to get off the freeway offramp. These were all vanilla scenarios with perfect San Diego weather and well-lit, well-labeled paths and signs.

Occam's razor, in a situation where a hype man has promised step-function improvements for well over a decade, is that this is not an honest assessment, or is an assessment not based on honest data.


I don't own a Tesla, but my understanding is that Autopilot is basically marketing speak for what everyone else calls lane assist and smart cruise control, while FSD is when the car is supposed to be able to drive for you. In other words, this report is saying that human drivers plus automated safety features are safer than human drivers alone, in conditions where the Tesla will let someone enable Autopilot in the first place.


There are a few hundred thousand users at this point. If this stuff is so dangerous, there should statistically be some evidence for that in the form of lots of accidents happening with these cars.

Instead we have Tesla suggesting that these cars are actually more safe. Of course they are very biased and you'd be well advised to take that with a grain of salt. There have been some incidents of course but overall it seems not a whole lot of bad stuff is actually happening. This, despite people insisting this is super dangerous, isn't working, cannot possibly be working because of reasons (imagined or real), etc.

Meanwhile, other manufacturers now have proper self-driving cars on the roads without a driver. Those too have had some incidents, but not a whole lot of fatal ones. Lots of naysayers, and yet the doom they foretell just doesn't seem to actually happen.


The problem is that Tesla's safety is only compared to the average car. It should be compared to a car in the same segment. The average car in the US is more than 12 years old and is probably under-maintained.


Following GPS instructions is 10x safer than me reading a map. Except that one time when it told me to make a sharp right, off the side of a mountain. Not sure this is very valuable.


Not sure how much the blog article is adding to the original: https://www.tesla.com/VehicleSafetyReport



