
I feel 'A Tragic Loss' sets the tone for the highly technical overview that follows; instead of the technical breakdown seeming tone-deaf, it carries that tone through to the concluding paragraph.

Unfortunately (or perhaps fortunately), Tesla is having to educate people, and it is being very clear so as not to leave room for panic and unwarranted fear-mongering.

I would say it is as to the point as it can be, and that it is heartfelt.



In addition, it had important information on how other Tesla drivers can use the Autopilot feature more safely. The article combines condolences with helpful safety information to prevent this kind of thing from happening again. Also, you don't see GM making a blog post every time someone passes away in a car they make.


Then again, GM sells more vehicles in a week than Tesla has sold in its entire history.


This.

Notice how quick they are to bring up "this is the first fatality in 130 million miles", but AFAICT, traffic fatalities in the US occur at a frequency of around 10 per billion miles [1], which is about the same rate.

So even though this is just one data point, it's spot on the current average for fatality frequency in normal cars.

Furthermore, to gather enough statistics to be able to say with confidence "semi-autonomous Teslas are/aren't safer than normal cars", Tesla either needs to increase their sales volume by several orders of magnitude, or we'll have to wait for ten years.
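
To put numbers on how little one data point tells you, here's a quick sketch (my own, not from the post), using the exact Poisson interval for a single observed event:

  from scipy.stats import chi2

  miles = 130e6                 # Autopilot miles at the time of the post
  deaths = 1
  print(deaths / miles * 1e9)   # point estimate: ~7.7 per billion miles

  # Exact 95% Poisson interval for an observed count k:
  lo = chi2.ppf(0.025, 2 * deaths) / 2          # ~0.025 events
  hi = chi2.ppf(0.975, 2 * (deaths + 1)) / 2    # ~5.57 events
  print(lo / miles * 1e9, hi / miles * 1e9)     # ~0.2 to ~43 per billion miles

That interval comfortably contains the ~10 per billion average for ordinary cars, along with rates several times better or worse.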

[1] https://en.wikipedia.org/wiki/List_of_motor_vehicle_deaths_i...


Is there a statistical difference in accident rate between high and low income populations? If so, do we know what the fatality/miles rate is in the group that would be able to afford a Tesla? When I read the blog post, I couldn't help but think that while Tesla's rate might be smaller than the national average, it is also not a fair comparison to draw conclusions from.


IDK about statistical difference, but the mechanical difference between a brand new high-end full-size car and a twenty year old Honda Civic is going to be massive. Just look at the downward slope of that curve of fatalities per billion miles in each year. Also remember that those numbers are averages over the entire car population in each year, so most cars in the 2015 data are quite a bit older than 2015.

If we had data for the "high end modern full-size car" category, it would most definitely be lower than 10 fatalities per billion miles.


The IIHS compares car model fatality rates: http://www.iihs.org/iihs/topics/driver-death-rates

If you want to run the numbers to compare the Tesla to other cars (they measure driver fatalities per million registered vehicle years), do this:

130,000,000 miles / 12,000 average miles per year per car = 10,833 years.

That's 10,833 registered years for the Model S.

The IIHS site normalizes to deaths per million registered years, so scale up by 1,000,000 / 10,833 ≈ 92.3. Since we have one death so far, the Tesla Model S is now at about 92.3 driver deaths per million registered car years.

Let's compare to other car models:

  Ford Taurus 2WD: 20 deaths
  Hyundai Accent: 86 deaths
  Lexus LS 460 2WD: 18 deaths
  Acura TL 2WD: 5 deaths
  Audi A4 4WD: 0 deaths (yes, a bunch of car models had zero driver deaths)
  And average for ALL 2011 vehicles: 28 deaths.
So, right now, it looks like the Tesla Model S is about 4x deadlier than the average car when using Autopilot, and about 10-20x deadlier than the safest cars.
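
The same back-of-the-envelope in runnable form (a sketch; the 12,000 miles/year figure and the single death are the assumptions above):

  AUTOPILOT_MILES = 130_000_000
  MILES_PER_YEAR = 12_000        # assumed average annual mileage per car
  DEATHS = 1

  registered_years = AUTOPILOT_MILES / MILES_PER_YEAR      # ~10,833
  tesla_rate = DEATHS * 1_000_000 / registered_years       # ~92.3

  # IIHS driver deaths per million registered vehicle years (2011 models);
  # the Audi A4 4WD is omitted because its zero deaths give no finite ratio.
  iihs = {"Ford Taurus 2WD": 20, "Hyundai Accent": 86,
          "Lexus LS 460 2WD": 18, "Acura TL 2WD": 5,
          "All 2011 vehicles": 28}
  for model, rate in iihs.items():
      print(f"vs {model}: {tesla_rate / rate:.1f}x")

Against the all-vehicle average this prints ~3.3x, which is where the "about 4x" above comes from.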


Only when these automated cars can complete a journey __without__ handing control back to the human should those miles count toward the "miles driven under Autopilot" claim. Until then, consider all automated cars a considerable risk to other road users.


How are you defining automated? No truly autonomous cars are available yet, so they're all some form of driver assist. How are these a risk to other road users, and at which point do they become a risk and not a safety benefit?

Auto-pilot is semi-autonomous... I gather you consider that a risk...

What about Subaru EyeSight? It can autonomously brake. I can drive for hours on the freeway without ever hitting the brake or gas, even in stop-and-go traffic. It has warned me multiple times of a sudden slowdown just as I noticed it. It hasn't yet saved me from anything, but if in any of those circumstances I'd been distracted (even for a legit reason), it could have. It once warned me of a motorcycle in my path on a very dark night, when the dim tail-light of the motorcycle was overwhelmed by the much brighter tail-lights of the cars to its left and right. I would have seen it soon enough, but the car saw it first. Is that a risk to other road users? Yet it's automated by some definition.

What about various other manufacturers' lane keeping features? That's automation to some degree.

What about ABS? Stability control? Again... automation to some degree.

All of these have their detractors, yet I think statistics clearly show an improvement with each level of driver assist.


>So, right now, it looks like the Tesla Model S is about 4x deadlier than the average car when using Autopilot, and about 10-20x deadlier than the safest cars.

How can you draw such a conclusion from a single data point? That's beyond ridiculous.


Seems as valid (and invalid) as the implied conclusion here:

> This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.


Fair enough. I think the framing matters a bit though.

"4 Times Deadlier!" certainly comes across as sensationalist based on such flimsy data (never mind that the real figure is actually 3.2 and the parent chose to round up rather than down to serve his/her purpose).

Whereas Tesla's statement comes across more as a statement of fact (albeit quite self-serving; no question about that).


The 130 million miles is not Model S overall, but Model S with autopilot engaged. It's also only the first 130 million miles, so the sample size is currently 1. This will be much more interesting to look at in 5 years, and at the moment it tells us nothing about Model S.



Having the same fatality statistic was not enough for Google to bring autonomous driving into production. Many people think it should be at least an order of magnitude safer than manual driving.


Sure. The problem with this is statistics: you need to have your autonomous cars travel at least 100 billion miles before you can confidently say they're an order of magnitude safer.

A normal car drives about 100 000 miles in 10 years. You then need one million autonomous cars driving around for a decade before you even know whether they're safer!
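
A rough sketch of why (my own illustration, assuming fatalities follow a Poisson process at the hypothetical 10x-safer rate):

  import math

  BASELINE = 10 / 1e9       # ~10 fatalities per billion miles (US average)
  TARGET = BASELINE / 10    # an order of magnitude safer

  for miles in (130e6, 1e9, 10e9, 100e9):
      expected = TARGET * miles    # expected deaths at the safer rate
      # A 95% interval on a Poisson count is roughly lambda +/- 2*sqrt(lambda),
      # so the relative uncertainty on the measured rate is ~2/sqrt(lambda).
      rel = 2 / math.sqrt(expected)
      print(f"{miles / 1e9:6.2f}B miles: ~{expected:5.1f} expected deaths, "
            f"rate known to +/-{rel * 100:.0f}%")

At 130 million miles the rate is uncertain to within several hundred percent; only around 100 billion miles does it tighten to ~20%, enough to separate the two rates cleanly.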


You don't need an equal sample size to determine if autonomous cars are safer. But if you're comparing to all the cars on the road today, you need quality data that matches the distribution of cars today. Google isn't going to prove that autonomous cars are safer if they only test in Palo Alto in ideal weather conditions.


That's very true, but if you know your product well enough, and you perform well conducted experiments, you can come up with a good estimate for this without actually having your cars travel 100 billion miles.


Sure. The main issue with that is the unknown unknowns. In hypothetical cars an order of magnitude safer than today, we're talking about events so rare that a single (hypothetical) accident which kills all 7 passengers due to the Tesla misinterpreting an obscure Hungarian road sign meaning "bridge out ahead" will significantly influence your safety statistics.


There are enough YouTube videos of people falling asleep in a Tesla on the highway, and of a Tesla switching lanes and almost causing a head-on crash. These are known problems.

Maybe a billion miles are needed for a fatal crash, but for small mistakes it's much less.


> A normal car drives about 100 000 miles in 10 years.

Can you give a source for that number? It seems a bit low given warranty numbers.


My insurance estimates the average driven per year is around 12k miles, so 100,000 miles in 10 years is about right, or a little low.


Your statistics seem to be at odds with the statistics quoted in their post:

"Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles."


1/94 million is very close to 10/1000 million, which is what I said.


Close is not the same, and neither is the same as 1/134 million. Of course, I have no idea if the differences are statistically significant, and it's likely impossible to tell given the lack of data on Tesla Autopilot at the moment (it's new).


Well, of course it's not statistically significant; 1/134 is based on a single data point. At this level of confidence, all these numbers are equal. If this were the first fatality in 1,000 million miles, it might be an indication of higher safety. Might.


Yeah, but that's not how people's emotions usually work.

Although US foreign policy and intelligence decisions were more or less directly responsible for 9/11, you don't lecture people at Ground Zero about the dangers of blowback and not being aware of your government's activities.

There's no such thing as "condolences, but...", and Elon Musk has shown a pattern of this kind of tone-deafness before. Not to say he's not brilliant and not to say autonomous cars won't ultimately save many lives, just that he should either listen to his PR department if he overruled them, or fire his PR department if they did this without him.


"There's no such thing as "condolences, but...", and Elon Musk has shown a pattern of this kind of tone-deafness before. Not to say he's not brilliant and not to say autonomous cars won't ultimately save many lives, just that he should either listen to his PR department if he overruled them, or fire his PR department if they did this without him."

Excellent, Doc. Further to the point about these statements and Musk's leadership: not too long ago, Tesla would include in its public statements the fact that nobody had been killed in a Tesla. I found it completely inappropriate, since we all knew it was sadly just a matter of time. Again they play with fire, as another death with Autopilot will really wreck their statistics.


And regardless, this was primarily an after-the-fact technical analysis with information relevant to other Tesla owners; its primary purpose was not a formal announcement of a death. The news typically handles that role.


> auto-pilot feature

There is no such feature; there is a MobilEye _lane following_ system that Tesla repackaged under the Autopilot(TM) brand.

Say it out loud: lane following. Mercedes sold this same tech almost 10 years ago; the difference is that Tesla felt pressure and started believing its own marketing lies about autonomous driving.


They call it autopilot. Words mean things. That word creates an impression that it is more capable than it really is.


The hardware might be the same, but the compute has evolved.


GM's cars don't have autopilot driving their passengers into a semi (that I know of)


I had posted the same comment before I saw this one (I deleted mine).

I would assume that the higher-end GM cars do have an automatic braking collision avoidance system by now, since that's a pretty standard feature above a certain price point. And I would assume it's not foolproof either, but at least it doesn't lead the driver to think he can take his hands off the wheel and not pay attention to what's happening on the road.


> at least it doesn't lead the driver to think he can take his hands off the wheel and not pay attention

Tesla reiterates this every time and makes you confirm you understand that this is not the case. If the driver did it anyway, that's completely on them for being irresponsible.


It's called Autopilot™.


The autopilot in planes works the same way: the pilot is strongly recommended to keep hands on the controls, and the autopilot will release control if conditions fall outside its functional envelope.


Pilots are trained professionals in the field where autopilot is relevant; consumers are not.


And even though they are highly trained, they still keep their attention on the plane's operation.

And pilots still make mistakes when adjusting to a context where the autopilot has unexpectedly handed control back to the humans.


They should not be using a blog post to convey critical safety information to their customers.


There's a warning that makes it clear that you need to maintain your full attention on the road, when you enable Autopilot and every single time you take your hands off the wheel. If anything, giving users specific scenarios where Autopilot may fail would take away from that fact, and make it seem like there are other cases where it's ok to take your eyes off the road.


All that matters is: did the technology fail? I don't care about crash rates in other cars; even if the impact had been different, the only fact that matters in this case is whether the tech failed. If so, is it safe to leave on, or should it be disabled across the board until it cannot fail in this scenario again?

One failure and they will take a minor publicity and money hit; two and it's going to be devastating.


Whether it performed as designed or not really isn't relevant. The salient point here is Tesla's promotion of an "auto-pilot" system that encourages users to take their attention off of the road and put too much faith in a glorified lane-following system.

Now, this is pure speculation, but I really can't imagine running into the side of an 18-wheeler crossing the highway unless A) I was traveling at a reckless speed (unlikely given that "auto-pilot" was on) or B) I wasn't paying attention to the road.


For one thing, it looks like the car kept driving after the crash, hit two fences and a pole, then tried to get back on the road before it shut down. Sounds like a pretty big screwup.

https://img.washingtonpost.com/blogs/the-switch/files/2016/0...


Well, again, the Tesla Autopilot is not a Google-car AI pilot system.

IMO, human (customers') expectations require more than a mere autopilot.



