IIHS: Model S has higher collision claim frequencies than comparable luxury cars (iihs.org)
118 points by otalp on April 2, 2018 | 164 comments



This isn't great news for Tesla. A 30-second video also recently surfaced in which someone reproduces the conditions of this week's fatal Autopilot accident:

https://www.youtube.com/watch?v=6QCF8tVqM3I

This could also be bad for Tesla if this is what occurred. Interestingly, in this video the chevron lines are present, something people have been claiming would have averted the accident. It's also notable that no warning alarm sounded at any stage.


Wow, that video is pretty damning.

(a) Even though the driver was getting a warning to put his hands on the wheel, the car should NOT continue driving forward and piloting itself at all with no driver response and insufficient certainty to continue navigating itself.

(b) There was a very obvious hard barrier right in front of the car. Why didn't the collision avoidance detect it and halt the car?

(c) The car was clearly "pulled" to the left by the bright exit lane stripe and missing highway-side stripe.

Simply following stripes seems an absurdly primitive and unreliable technique to me. The fact that the car could be "distracted" this way suggests to me that it isn't getting (or making good use of) a lot of information that should be available to it: the behavior of other traffic; the presence of other obstacles and roadside geometry (like the ignored barrier, which was on a collision path); the trajectory on previous trips (and any conclusions drawn then under better conditions); the road sign which indicated a fork (and which was also itself on a direct collision path); the crosswise striping in between the two forks, which the car was driving over; and the striping on the other edge of the lane, and the lanes beyond, which followed the correct trajectory.

Each of those things is a significant clue as to the real path of the lane, and all of them are machine-detectable, processable pieces of information. Even with nontrivial uncertainty about each of them, the totality of information, IMO, should be more than enough to clearly discern the correct lane, especially if efficiently used: Kalman-filter-like information synthesis can take a lot of very uncertain inputs and synthesize a very certain picture.
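The kind of synthesis being described can be sketched in a few lines. This is a toy illustration, not anything resembling Tesla's actual stack: the cues, offsets, and variances are all invented for the example, and inverse-variance weighting is just the scalar special case of a Kalman measurement update.

```python
# Toy fusion of several noisy, independent estimates of the car's lane-center
# offset (in meters). Each cue contributes (estimate, variance); a lower
# variance means the cue is trusted more. Inverse-variance weighting is the
# scalar form of a Kalman measurement update.

def fuse(estimates):
    """estimates: list of (value, variance) -> (fused value, fused variance)."""
    total_precision = sum(1.0 / var for _, var in estimates)
    fused = sum(val / var for val, var in estimates) / total_precision
    return fused, 1.0 / total_precision

# Hypothetical cues: the bright left stripe alone says "steer left" (-1.2 m),
# but traffic behavior, roadside geometry, and the previous-trip trajectory
# all say "stay put". Every number here is made up for illustration.
cues = [
    (-1.2, 0.5),   # left stripe (misleading here)
    (0.10, 0.2),   # trajectories of surrounding traffic
    (0.00, 0.2),   # barrier / roadside geometry
    (0.05, 0.2),   # trajectory recorded on previous trips
]

offset, variance = fuse(cues)
print(round(offset, 3))  # about -0.1: the agreeing cues outvote the stripe
```

Three mediocre cues that agree end up outweighing one confident cue that doesn't, which is exactly the "holistic picture" argument above.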

Instead the machine ignored the holistic picture and "target fixated" on the stripe. This apparently has a disappointingly long way to go before it is a reliable system.


You don’t understand how these systems work. The optical/radar combo can’t see stationary objects well when the car is going more than 30 MPH, it’s a sea of false positives. This is true of every self driving system on production cars today, 3D Lidar might address it but that’s up to $50K a car.


> The optical/radar combo can’t see stationary objects well when the car is going more than 30 MPH

That seems pretty damning.


I don't know what became of it but there was already a crash with dashcam video example of this in 2016: https://www.autoblog.com/2016/09/14/fatal-tesla-crash-china-...


No it’s not, because when you turn on Autopilot or any similar system, it explains these limitations. These systems are very good at lane keeping and avoiding traffic at highway speeds, but depend on the driver for detection of stationary obstacles.

At typical city-street speeds, they are much better at detecting stationary objects.


So these companies are pushing self-driving cars while the hardware is not there yet; these cars would hit a brick wall and ignore it because it may be a false positive. I am not even sure whether Teslas have any radar installed, though.

Do you have a link about what the radars actually installed on these cars can and can't do?


> So this companies are pushing the self driving cars while the hardware is not there yet,

Yes. You know why? Because they can. That's why...


Autopilot isn’t a self driving system (yet), it’s a driver assistance technology. It explains this carefully to users when activated.


Teslas made since October 2014 have a radar.

https://www.tesla.com/blog/upgrading-autopilot-seeing-world-...


There is no production car with "self driving system" available right now.

There are just "assistants", and they can't and should not drive on their own.


Then why does Tesla, on their order page, offer this? >> Full Self-Driving Capability >> This doubles the number of active cameras from four to eight, enabling full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver.

They also seemingly purposely make it very vague as to what is actually available when you pay for it today (nothing), what might be available during your lifetime (it will try to kill you) and when it might be actually ready (never)... /s


Because simply offering the option for sale helps portray Tesla as forward-thinking? The option itself is basically a marketing mechanism at this time.

The FSD option doesn't actually do anything right now or even add any extra hardware.

According to their Q4'17 update letter,[0] "In October 2016, we began equipping all Tesla vehicles with hardware needed for full self-driving capability, including cameras that provide 360 degree visibility, updated ultrasonic sensors for object detection, a forward-facing radar with enhanced processing, and a powerful new onboard computer." All new Teslas manufactured since then include the same Hardware II (or the incremental "II.5" version), which includes the eight cameras referenced under the "Full Self-Driving Capability" option in their configurator. Enhanced AutoPilot activates certain software functionality (what people generally understand to be "AutoPilot"), while the "Full Self-Driving Capability" package doesn't actually do anything right now. You're just purchasing the right to unlock a future FSD update for a $1,000 discount today that may or may not come out before you sell the car.

Basically, Tesla is betting that they'll be able to develop and deploy FSD in a reasonable (for consumers; from a developer and regulatory standpoint, it's incredibly aggressive) timeframe using the hardware already present on its cars. In the interim, FSD orders just sit on Tesla's books as deferred revenue because "FSD" hasn't actually been delivered yet.

As an aside, it's actually kind of impressive that ~40% of buyers[1] have opted for the FSD option despite the fact that it doesn't exist yet. Offering the option today involves some pretty significant risk if there are significant delays or Tesla runs into unexpected development hurdles it can't easily get around. It could easily turn into a class action nightmare.

0. http://ir.tesla.com/secfiling.cfm?filingID=1564590-18-2956

1. https://www.inc.com/kevin-j-ryan/35000-people-have-paid-for-...


Well, that certainly does say quite a bit about the mentality of Tesla buyers, who are willing to pay a few extra grand for something that not only does not exist, but may not even ever exist. And hell, they aren't even paying for any extra hardware.

But it sure does make Tesla look progressive alright. And upon a cursory, or even second reading, the option does certainly read like you're getting something. That's probably where those 40% come from. Basically, this is a behavior I would expect from some shady operation on Kickstarter, not from a major corporation...


This video is much more important than anything that was mentioned in the posted article. There are just as many questions about the numerous variables involved in the IIHS study as there are in Tesla's fudged statistics that they always cite. However there is little argument with that video.

At some point there really needs to be a legal discussion about whether Tesla's argument that the driver must always be alert and responsible for the car is a realistic standard in which to hold people when semi-autonomous systems are engaged.


> At some point there really needs to be a legal discussion about whether Tesla's argument that the driver must always be alert and responsible for the car is a realistic standard in which to hold people when semi-autonomous systems are engaged.

It seems completely predictable that it wouldn't be, especially when they call it "autopilot."


I have addressed this elsewhere in this thread, so let me just copy and paste that comment.

>To be fair this is mostly a problem with the general public's understanding of the term autopilot and not with the actual systems that have used that name before. Aeronautical autopilots would probably only be classified as level 1 or maybe level 2 [autonomy] when that term was first coined and used to describe them. Even now I am not sure if we would classify the average autopilots you find on a commercial airliner as being fully at level 4.

I don't think Tesla's naming it autopilot should have any impact on any legal discussions. I am not sure the driver should be responsible for everything that happens to the car while autopilot is engaged. However I think it is a completely fair and reasonable expectation that a driver understands what autopilot is capable of doing before they engage it. I don't think Tesla (at least through their marketing, I have heard some questionable things about their salespeople) has been misleading about that in any way. I don't have a lot of faith in people handling a Tesla responsibly if the only thing they know about autopilot is that it is named autopilot.


HN mods really don't like when you paste comments, FYI. The best way to refer to something you already wrote is a link to that comment.

Personally I think Tesla should bear some responsibility for the name. People don't know what aeronautical autopilots do and don't do. Laypeople think autopilot does the driving for you.


Yes, but when you watch documentaries about pilots in cockpits, they routinely enable the autopilot and begin talking to each other, eating, taking notes, and doing other things, and are not constantly looking out of the window. Looked at from this perspective, people feel comfortable looking at their phone or eating while Tesla's Autopilot is enabled.

Pilots can do that because there are no other planes in the sky on their route: humans on the ground are taking care of that.


It doesn't matter what it's called. People think it's autopilot, and most of the time it does drive itself. This causes people to lose attention. This has been replicated across different systems - cars, planes, trains, test equipment, etc etc etc. It's a pretty robust finding: humans need things to do if you want them to maintain attention.


As I understand, there are devices that check if you're paying attention in many autopilot systems for precisely this reason: https://news.ycombinator.com/item?id=11017034


> At some point there really needs to be a legal discussion about whether Tesla's argument that the driver must always be alert and responsible for the car is a realistic standard

This should have been done before these things were ever allowed to be put on roads.

Zuck is right. We naively trust the world at large to work for our benefit automatically. We truly are "dumb fucks".


Note: This isn't at the same location as last week's accident. Last week's accident was in Mountain View, CA; while this one looks like it's in Lake Station, Indiana.

Edit: it's in Chicago not Indiana, see replies


Indeed. This one also has chevron markings, whose absence people were pointing to as a partial cause of the other accident.

This occurring in multiple locations isn't a comfort, it means the issue is more widespread than a single location.


It's at one of the busiest highway splits in Chicagoland, on the Ryan where it splits off to the Skyway. This isn't a random stretch of road; this is something an automated driving system has to be able to get right.


No, it's the 90/94 split on the Dan Ryan in Chicago: https://www.google.com/maps/@41.7763111,-87.6292446,3a,51.6y...


Those lane markings are awful. Ordinary paint, and worn out.

Compare some from the UK.

Bumpy paint, so the wheels vibrate and so they show up in the wet. Red reflectors, and barriers that would push the car back onto the road rather than into the bridge support.

https://goo.gl/maps/TUf1xzLUN5N2

This is normal, though there aren't many roads like this within cities, so there's usually space for a long, grassy strip.


I have never in my entire life been more terrified than when I drove a car from Edinburgh to Elgin; there is nobody in the world who will ever convince me that UK roads are safer than those in the US. I don't understand how every driver in the UK isn't already dead.


The UK is the 4th, 5th or 3rd safest country in the world for road fatalities, depending which metric you prefer (per capita, per vehicle, by distance driven).

Maybe you aren't used to concentrating while driving? (I'm only half joking; an American friend once made a similar remark when he visited me.)

https://en.wikipedia.org/wiki/List_of_countries_by_traffic-r... (I ignored microstates when counting.)


I was definitely concentrating, and also in a semi-permanent state of yelling in terror. I don't understand why cars don't just fall off the sides of some of those roads and down into the steep ravines that appear to be the UK's (or at least Scotland's) answer to a shoulder.


Drive serpentines in southern France. Visibility ahead is only twenty meters or so and oncoming traffic regularly cuts the corner. So you suddenly find another car closing in fast on your lane!


Holy shit, not sure I would have been so interested in proving the problem that I would have taken my hands off the wheel like that.

I can really see why Tesla would try to deny this so hard, seeing it happen yourself is frightening.


What evidence have you seen of Tesla ‘trying to deny this so hard’?

I've read both of their press releases on the incident [1] [2] and I didn't see any attempts at denial in them.

[1] https://www.tesla.com/es_ES/blog/what-we-know-about-last-wee...

[2] https://www.tesla.com/es_ES/blog/update-last-week%E2%80%99s-...


From the article: “Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of. There are over 200 successful Autopilot trips per day on this exact stretch of road.”

It seems like Tesla is trying to explain-away that particular incident as a one-off statistical outlier, and not the fault of the autopilot system. Appealing to statistics like that seems like an attempt to deny responsibility to me.


That rationalization actually makes it _worse_ for me; it suggests they have no idea why this happened.

But unless the person doing the video faked it, there's no question what happened: the car is driving down I-94 and veers out of its lane towards a concrete median until it's stopped by the operator.


Given this video, I'm expecting Tesla to disable Autopilot across the entire fleet within another couple of hours.

Anything less would be denial and would warrant some head honcho at some regulator with teeth, and probably the FBI, turning up at Musk's house and dragging him away.

From your second link, in Tesla's own words:

"The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken."

What they didn't write is that the car also had an unobstructed view of the crash attenuator.

Stop calling it Autopilot. It's adaptive cruise control, so call it that.


They did not deny it, but they also don't tell us what we want to know: whether the car tried to stop, whether it detected an obstacle, whether it actually sounded the alarm in that moment or minutes before. They only stated the facts that are not damning for them.


"Our data shows that Tesla owners have driven this same stretch of highway [thousands of times with no trouble]" reads as a denial, implying the crash was a one-off non-reproducible fluke, which the video certainly disproves.


I think that the S and X have three forward-looking cameras, center-mounted. Without a binocular baseline with which to establish distance, ranging of on-center obstructions could be difficult. Even if an edge is visible, the frame-to-frame variation is low, especially for a black rubber sheet at night.
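The geometry behind the binocular-baseline point is easy to make concrete. Stereo ranging relies on disparity, d = f·B/Z, which shrinks with the baseline B; the focal length and baselines below are invented for illustration, and the 150 m distance is the sight distance from Tesla's own statement quoted elsewhere in this thread.

```python
# Stereo disparity: d = f * B / Z, with f the focal length in pixels,
# B the camera baseline in meters, Z the object distance in meters.
# Tightly clustered cameras (tiny B) produce sub-pixel disparity for a
# distant on-center object, which gets lost in image noise.

def disparity_px(focal_px, baseline_m, depth_m):
    return focal_px * baseline_m / depth_m

focal = 1000.0    # assumed focal length in pixels
barrier = 150.0   # meters: the sight distance Tesla's statement cites

print(disparity_px(focal, 0.10, barrier))  # 10 cm baseline -> ~0.67 px
print(disparity_px(focal, 1.50, barrier))  # 1.5 m baseline -> 10 px
```

Sub-pixel disparity at highway sight distances is why cameras clustered behind the mirror can't range an on-center obstacle the way widely spaced stereo pairs could.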

I'd also be curious about the radar absorption of that rubber impact barrier.

This, in part, is why Musk's assertion that this hardware would be sufficient for level 4 autonomy was so ridiculous. It really wasn't the kind of language that a responsible engineer would use.


> I think that the S and X have three forward-looking cameras center-mounted.

And I think that, given Tesla's known propensity for logging _everything_, they would have video footage from those cameras saved somewhere. I wonder whether it will be submitted as evidence, so we can understand what the car "saw" and "understood", from optical as well as non-optical sensors...


I want to know if the car braked automatically, or the driver had to brake to avoid colliding with the barrier.

Also, the warning to put hands on the wheel was a joke. It's incredibly subtle and would only be noticed by someone looking for it, which is probably not the people that most need to respond to the warning.


> I want to know if the car braked automatically, or the driver had to brake to avoid colliding with the barrier.

You can see from the dashboard icons that he disengaged the Autopilot (by taking control). If it had detected an imminent crash it would have made warning sounds.


That's not a crash warning at all. That is the usual warning, roughly every minute, to jiggle the wheel if the car can't detect you holding it firmly.


The initial message that pops up is somewhat easy to miss, but the white flashing at the edges of the instrument panel gets my attention every time. It’s effective enough that I’ve never received the audible alert that comes next.

(I don’t use my phone while driving, so my experience is based on looking at the road. Maybe it’s easier to miss when looking downward rather than forward.)


This is about as good a bug report as you can file.

Can't wait to see Tesla close the issue in Jira with a "Cannot reproduce."


Christ, this video alone has eliminated any desire I may have had in getting a Tesla once and for all. For me, the one defining feature was self driving capability, but now I see it can’t even do that very safely.


Yep. That was scary. You can feel the panic. And even after prompt intervention, that car came pretty close to the median!


It looks like it’s following the strong lane marker that ends up being the left-hand part of the marked out space in front of the barrier. I wonder if having a good line on the right-hand side of that space would help.

That said, someone purposefully doing this and stopping in the middle of the highway is stupidly dangerous.


Yes, it looks like it completely ignored the right lane marker and just focused on the left one, something a human would be unlikely to do.

> That said, someone purposefully doing this and stopping in the middle of the highway is stupidly dangerous.

As the saying goes, "it had to be done" --- to show the defect, before someone else gets killed, and the driver was clearly paying attention here. It stopped pointed right at the divider, a position no car would normally be in anyway, and there weren't many other cars around.


Note that human-driven cars have frequent accidents in the Mountain View location, so I wouldn't assume that you understand why humans do or don't have problems here.


If I was driving on that stretch of highway, I'd be pretty confused by the line markings. I wouldn't crash into the barrier, but I might drift towards it a bit. The lack of a solid white line on the right side of the chevrons is dangerous and inexplicable.

That said, self-driving cars have to be able to handle poor markings like this.


There is a solid white line on the right side, it's just faded. The chevrons are quite clear, however, so it's pretty damning that the car ignores the indication that it's cruising along a not-a-lane.


That video is damning-- no doubt. The Tesla should recognize the big barrier and stop or adjust lane well in advance.

But what is equally damning is that I can't tell that the Tesla has done anything wrong until it's moments away from hitting that barrier. Who the hell decided that was a sensible way to mark the road?

I would strongly prefer no markings to that confidence-inducing single white line, which years of driving have trained me to stay sensibly adjacent to, and which will lead me like a lemming to my death.


"also interesting that no warning alarm sounded at any stage."

At 19 seconds, you can see that the dashboard says to put hands on the wheel. Does that not trigger an audible alarm by default in a Tesla?


> Does that not trigger an audible alarm by default in a Tesla

No. That's a standard warning that triggers every few minutes if the car can't sense your hands on the wheel. If you don't place your hands on the wheel at that point, then there will be an audible alarm. If you ignore that, the car will slowly reduce speed and come to a halt.
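The escalation described here is essentially a small state machine. A sketch of that logic, with the states taken from the comment above but all timing thresholds invented (the thread itself disagrees about whether the nag fires after seconds or minutes):

```python
# Hands-off-wheel escalation as described in this thread: visual nag,
# then audible alarm, then a gradual slowdown to a halt.
# Threshold values are placeholders, not Tesla's real timings.

def escalation_state(hands_off_s, nag_s=60, alarm_s=90, stop_s=120):
    """Map seconds without detected hands to the assist system's state."""
    if hands_off_s < nag_s:
        return "normal"
    if hands_off_s < alarm_s:
        return "visual_warning"
    if hands_off_s < stop_s:
        return "audible_alarm"
    return "slowing_to_stop"

for t in (10, 70, 100, 150):
    print(t, escalation_state(t))
```

The key design point being debated in this subthread is only the first threshold: how long the purely visual stage lasts before anything audible happens.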


> every few minutes

*every few seconds (12 seconds, I believe)


It's definitely every few minutes. I can attest to that from personal experience. Alternatively, you can just keep your hands on the wheel and then the warning doesn't come up.


Not unless you ignore it, then it will beep, then flash red. I think it doesn’t make an audible sound initially so it proves you are paying attention the whole time rather than only when it beeps at you.


I guess that they should add some unit tests to their vision stack.


A human sees signs at 7s and 24s and the (admittedly not very clear) split of the lane at 27s as well as the yellow sign in the distance indicating the divider, while the autopilot sees a solid white line on the left and continues following it.

It probably would've been fine if there was another white line at the split, like this (marked in red here):

https://files.catbox.moe/10p2hh.JPG


There is another white line at the split, following the trajectory you've drawn. At the junction of the two lines, it's badly faded, but you can see at :32 and better when he looks up after :34 that the line is there, it's just not as freshly painted.

That's not an unusual road condition at all; also: you'd think there'd be some logic reconciling the line it thinks it has to follow with the fact that an entire lane of traffic on the Ryan (the left-most I-94 lane) doesn't abruptly end.


The lane divider on the right is also quite visible; considering that its internal maps should tell it the road has a left exit, I'd expect that it should be able to guess that it should be holding to the right edge instead of the left edge of the lane.

This does bring up a theory: the lane keeping is biased towards holding itself to the left edge of lanes, on the basis that most exits are right exits and therefore the right edge will tend to diverge a ways before making a new lane. I'm curious how it would handle the split at the mixing bowl in Springfield, VA: https://www.google.com/maps/@38.7735345,-77.1816237,3a,19.8y... (functionally a left exit, although signed as if it were a right exit). The solid white line doesn't actually denote any lane exit boundaries there...


Hate to say it Tesla but you ain't ready for primetime. Neither is Uber.

Waymo is on a whole other level.

But I really would love to see more collaboration rather than competition in this area where safety is critical. Pardon the pun - why does everyone need to reinvent the wheel? And also encumber each other with patents?

Open source beats proprietary competitive stuff like this.


Well, why would they do that unless the regulatory authorities made them?

Is Waymo really "there"? It seems like everywhere I turn I'm seeing things that suggest that all of this technology has been hyped far beyond its actual capabilities.


According to a recent New York Times article, Waymo's vehicles are currently averaging 5,600 miles between driver interventions.

For Uber this number was 13

5600 miles starts becoming "I need to use the wheel only 4 times a year" territory.
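The arithmetic behind that "four times a year" framing, for anyone who wants to check it (the ~13,500 miles/year figure is a commonly cited US-average estimate, not from the article):

```python
# Sanity-check the intervention-rate arithmetic quoted above.
miles_per_intervention = 5600      # Waymo figure from the NYT article

# "4 interventions a year" implies this much annual driving:
implied_annual_miles = 4 * miles_per_intervention
print(implied_annual_miles)        # 22400

# At a more typical ~13,500 miles/year, the same rate gives ~2.4/year:
print(13500 / miles_per_intervention)
```

So "4 times a year" assumes a heavier-than-average driver; a typical one would see closer to 2-3 interventions per year at Waymo's quoted rate.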


> 5600 miles starts becoming "I need to use the wheel only 4 times a year" territory.

It sounds like you're saying I will very likely die 4 times a year.

If anything, perhaps the better performing algorithms will lead to more complacency (inattention) on the part of the driver...

Sort of a Catch-22? (Or is there a better analogy I'm not thinking of.)

Call me a Luddite, but I think I would prefer to steer the car myself, thank you. When someone says, self-driving car, I assume they mean a taxi, bus or train: all well-proven technologies.


Well, to be fair, Waymo drivers aren't dying 4 times a year; it could be that the car just pulls over.

I agree that this isn't enough for full autonomy. And that partial autonomy can lead to false security. But I bet there are a lot of drivers who have more than 1 incident every 5k miles.


Are all 4 interventions safety-critical? They may simply be a case of the car having trouble finding the way out of a parking lot for example.


OMG this is exactly how it happened in the CA crash...


Weird. youtube-dl will not download this video. Can anyone else confirm?


That was also the first thing I thought of. Worked at my end. Ping me if you want a copy, email in my profile.


Works for me


The day that Musk decided to call his driver-assist system "Autopilot" he lost my respect as an engineer and my business as a consumer.

It's a great brand for a L4 - L5 system, not for Tesla's realistic capabilities as a L2 (1) - it sets a dangerous precedent in the consumer's mind.

Their tepid response to this incident outlining that the driver was not fully engaged so Tesla is not at fault, is EXACTLY how the system is used by the average consumer due to the way it's branded and marketed.

(1) https://www.caranddriver.com/features/path-to-autonomy-self-...


The right brand for this should've been "Child driver". Because if a driving system needs to be controlled all the time, it's very much like a child sitting on dad's lap and trying to drive the car.


I would say it’s pretty accurate if you are familiar with how plane autopilots actually work.


How many car drivers do you think are familiar with the details of how plane autopilots really work?

If Tesla had called this something like "lane-assist adaptive cruise control" I think your average Joe would have had much more realistic expectations around how the technology actually functions.


> If Tesla had called this something like "lane-assist adaptive cruise control" I think your average Joe would have had much more realistic expectations around how the technology actually functions.

Which would be foolish on their part, since it would equate their system with that found on other cars. The systems found on other cars are garbage to the point of being unusable. The system that Tesla has is in a different class.


If by "unusable" you mean that they actually work, and do not actively try to kill you, then yes, that's what they have.

Tesla's system is exactly the same thing, except that it is marketed as some fancy "autopilot" with "self-driving hardware" or whatever BS Musk is pushing to fanboys blinded by the whole "look, ma, no emissions" thing.

But of course Tesla can't call it that, and not something fancy like "autopilot" because otherwise what could they market? A glorified golf cart (granted, a pretty fast golf cart) that combines a luxury car price with a Yugo quality? That's not a winning combo.


Well, it's either a "glorified golf cart" or you're a short seller with a financial incentive to spread the nonsense in your post. I'm going to bet on the latter =)


Certainly your right. Alas, no financial involvement with Tesla either way.

Also, it is a very expensive golf cart, although less comfortable than some.


Is this true? I thought other luxury car brands were now shipping self-driving features of similar capability to Tesla's at this point.

Or maybe it's just the sensors being the same...


Not just luxury cars, even economy cars are shipping with adaptive cruise control, automatic lane-keeping, and automatic collision avoidance, and (at least the ones I've tried) work fantastically. On the highway you really don't have to do anything.


Luxury car brands ship with a limited feature set that reflects an appropriate conservatism. For example, they can detect cars in your blind spot and warn you, and they can apply the brakes if someone stops short ahead of you. They have many of the required sensors but are waiting for the tech to mature.


This is not true. The BMW 5 and 7 Series have adaptive cruise control and lane centering. I had a Tesla and now have a 5 Series, and its lane centering is on par with Tesla's Autopilot. The only thing it lacks in the US is auto lane change (but that works in other countries).


Non-luxury cars (Subaru) are shipping with more than that (lane following, adaptive cruise control).


Mercedes' system is quite good, to the point that I got tired of waiting for my Model 3 reservation to come up and got a Mercedes plug-in hybrid instead. It's really close to Tesla's AP.


Yep. Drive Pilot on Mercedes and the Driver Assistance Plus packages on BMWs are just about as good as AP1 on Teslas.


> The system that Tesla has is in a different class

Indeed! The Tesla system appears to have a much higher probability of killing you! I don't see headlines about "$carbrand cruise control out of control, kills driver"


You don't see those headlines because short sellers aren't obsessed with those brands like they are with Tesla. Not because those brands are actually safer or don't have accidents.

"Tesla is, by a wide margin, the No. 1 short stock in terms of money at risk and exposure within the global auto industry"

https://www.marketwatch.com/story/teslas-roaring-start-to-20...


But what does it have to do with there being quite a few (better selling, cheaper) models with 0 fatalities?

Hell, cruise control in a Hyundai actually alarms and brakes if you try to drive into a stationary object. But they don't have Saint Elon at the helm...


> But what does it have to do with there being quite a few (better selling, cheaper) models with 0 fatalities?

Pretty sure there are no car models out there with 0 fatalities that have been released for more than a year.

How much are you getting paid to spread this FUD? I genuinely want to know. If the gig is sweet enough, I might want to get in on the racket =)


This breaks the HN guidelines and is a bannable offense regardless of how wrong some other commenter is. Please don't do this again.

https://news.ycombinator.com/newsguidelines.html


Alas, there were statistics posted here on HN in one of the Tesla accident threads [0]. And these were even for cars older than Tesla, cheaper than Tesla, and selling possibly order of magnitude more vehicles (quite possible for Hondas and Toyotas on the list) than Tesla.

Wouldn't it be sad if you had invested your life's savings in TSLA upon hearing Saint Musk preaching his BS, and now need to defend that investment against all facts?

(yeah, there is some sarcasm above, but your accusations of FUD and being paid for it strongly smell of spiked Kool-Aid)

[0] https://news.ycombinator.com/item?id=16723782


> Alas, there were statistics posted here on HN in one of the Tesla accident threads

3 year period, 6 years ago. Zero citations, just a blanket statement.

Not only that, it's not believable. The Honda Odyssey sells in the hundreds of thousands per year. Nevermind the other cars.

There are over 30,000 vehicle fatalities per year in the US, but those models that have been around for years and are extremely popular with consumers ... have had zero fatalities?

Let me guess, it's because they're not Teslas. Who's really drinking the Kool-Aid?

> your accusations of FUD and being paid for it strongly smell of spiked Kool-Aid

I'm happy to admit when I'm wrong, but I'm getting tired of the lying.


Yes, sure, everybody is lying, only Saint Elon tells the truth.

To go straight to the source, http://www.iihs.org/iihs/sr/statusreport/article/50/1/1

And we are talking about older cars that might not even have all the supposed safety features a "luxury" (not that it really is one) golf cart like Tesla supposedly has.

Sorry, but you really should lay off the Kool-Aid. Not to say that Tesla is a death trap, but it is far from being the safest car on the road, no matter what electric fanboys might want to believe.


I chided the user who accused you of being a shill, but I need to chide you as well. You've been using HN for flamewars, which is really not what we want on HN. You've also crossed into personal attack in this comment, and in others previously. Would you please read https://news.ycombinator.com/newsguidelines.html and take the spirit of this site to heart when commenting here?


Sorry.


You don’t see the headlines because cruise control is an old technology. There were news about people misusing it when it became mainstream.


Almost no one is familiar with how plane autopilots work. And in a plane you don't drive into a barrier or off the road if you don't pay attention for a few seconds like you can in a car, unless you're flying very close to the ground. A plane can be said to be flying itself if it's just keeping its course, but to drive a car you need to pay more attention to the surroundings.


>Almost no one is familiar with how plane autopilots work.

OK...

>And in a plane you don't drive into a barrier or off the road if you don't pay attention for a few seconds like you can in a car, unless you're flying very close to the ground. A plane can be said to be flying itself if it's just keeping its course, but to drive a car you need to pay more attention to the surroundings.

I think you just proved your first point here.


Actually, no.

For most of the flight, pilots have enough time (and are often required) to go over paper checklists. With the Tesla Autopilot, taking your eyes off the road for 5 seconds gets you killed.


The skies and the seas are a lot different than the roads. They're mostly empty, while land is full of obstacles.

You can tell a drone autopilot to circle at 600ft and go to lunch. It will be there when you get back. The equivalent for a Tesla might be circling the block for an hour unattended.

The Tesla is far more advanced, but it achieves less, because the environment requires more.


Flight paths used while aircraft autopilots are engaged tend to have significantly fewer solid-object barriers.

That should be sufficient reason to not call it Autopilot.

We aren't flying here, it's a ground vehicle.


Unfortunately, most people associate the word 'autopilot' with the way Otto performed in the movie Airplane!...


What % of Tesla owners have a pilot's license?


If Tesla's "autopilot" is turned into a martial arts master

https://www.youtube.com/watch?v=h_vvI26NnwE


Airplane autopilot is designed to drive cars into solid barriers? That's news to me.


Or a boat autopilot.


To be fair this is mostly a problem with the general public's understanding of the term autopilot and not with the actual systems that have used that name before. Aeronautical autopilots would probably only be classified as level 1 or maybe level 2 when that term was first coined and used to describe them. Even now I am not sure if we would classify the average autopilots you find on a commercial airliner as being fully at level 4.


Modern aircraft autopilot systems are often used in an L1-L3 way by that scale, but are capable of at least L4.

The flight-management systems of modern commercial aircraft can, for example, receive a flight plan expressed as a set of waypoints and information about the destination airport, and then fly the plane along that route and conduct a fully automated landing at the destination.


I think the concepts of L1-L4 don't have much meaning in the absence of an environment where the machine is constantly reacting to the behavior of other actors and environmental hazards.


> this is mostly a problem with the general public's understanding of the term autopilot

In other words, it's the customer's fault. And it's arguable that Tesla benefited from this misunderstanding.


There is a difference between the customer being at fault in controlling the car and a customer being at fault for not learning the basics about features of the car. I don't think we can make the first a universal rule, but I wouldn't think it is controversial for the second to be universal.


I agree that customers should learn how their cars work.

But if we all engineered products based on how customers should behave, the human race would be extinct.


I can’t say much about the accident rate, but the claim severity can be attributed to the fact that anything more than a scratch on a Tesla is often counted as a total loss. Just look at the various Tesla salvage channels on YouTube and you’ll see how silly it is.

This is what happens when no one can fix your car but Tesla, and Tesla doesn’t repair what most body shops would fix in a day.


I don't know for other areas, but here in bay area Tesla only does service, not collision repairs. I took both my Model S, and my wife's Porsche to the same repair facility, and I can tell you that both were eye wateringly expensive, and so well over my deductible that I really just didn't care.

So, I'm not sure whether to feel bad that the Tesla's claims are higher on average, or take pride that I'm getting more for my dollar from the insurer when I get it fixed?


As far as I know, Tesla handles minor cosmetic damage for you; they will hand it off to a local body shop. The problem is that anything that won’t buff out can’t be repaired, because a body shop, even a certified one, can’t buy, say, a new door from Tesla.

The only way to fix a Tesla is essentially via salvage, and there are plenty of people who buy scrapped Teslas and make a frankencar. Some might opt to recertify it with Tesla so they can get updates and service, but you don’t have to do so for the car to drive and even charge from the network.


So in your example, sounds like Tesla is still not doing any body work, they're just referring the customer to a local body shop, which sets their own prices.

Repair centers can certainly buy brand new official parts from Tesla. When my nose cone was fixed, the entire front bumper assembly (including new Autopilot sensors) was ordered from Tesla. This is pretty much the rule, not the exception, and Tesla seems to operate no differently than other car manufacturers when it comes to the repair process, from what I've seen.

I don't see much about the Tesla that would make it more likely to salvage other than the battery pack though. Structural repair to a Tesla seems about the same as any other aluminum bodied car, and while there are shops that will do it (and get all the certified parts from Tesla), it just often doesn't make sense to spend $20-30k to fix a car whose used market value with a salvage title will be maybe double that?


I'm starting to feel like this whole thing is a case of competitors in the auto/energy industry smelling blood in the water after the Model 3 production issues. I'm seeing a spike in "Why Tesla is doomed" type articles recently...


But the non-Tesla electric cars also had more severe (though fewer) claims. Part of that is probably that an electric car is just plain more expensive than its ICE counterpart right now, and my experience is that the repair cost of a vehicle scales with its capital cost.


Tesla refuses to allow third party repair or parts and has difficulty making things. So costs are crazy.

A colleague had to have his towed 300 miles and wait forever for a rear-end fender bender, which necessitated a claim as the cost was a few thousand bucks. With a normal car, he’d probably just pay out of pocket to avoid the insurance bullshittery.


Tesla refuses to allow third party repair or parts

Interesting. I think this is, or will soon be, illegal in the EU - we have legislation around driving more openness from manufacturers to 3rd party repairs. I have to admit to not knowing the details, but I do recall reading about this.

EDIT - here you go:

https://en.wikipedia.org/wiki/Block_Exemption_Regulation

http://ec.europa.eu/competition/sectors/motor_vehicles/legis... (PDF, Sorry)

Looks like Tesla's behaviour in this sense is on the wrong side of EU law.


It seems silly to avoid filing an insurance claim when the other driver is insured and at fault. Even a minor bumper repair on a cheap vehicle is >$800 now. I've been rear ended 3 times in California and the amount of insurance bullshittery was pretty low.


This doesn't sound true. I had an accident and paid out of pocket at a licensed body shop. Tesla performs ZERO body work for you.


> Tesla performs ZERO body work for you.

And as such, has a very limited number of "approved" body shops that can source the parts for repairs.

See: https://www.youtube.com/watch?v=18MItauAgKo


You have to use a certified shop.

There was only one such shop in my state at the time and the backlog was pretty significant. It made more sense to ship the car out even further.


This is not correct. Tesla does not do body work.


Tesla does cosmetic body work. They don’t do it in house, but they do handle it for you. Anything more than that they will refuse to touch, and it will likely get totaled by your insurer.

You can try to repair it on your own and recertify it with Tesla, but it takes quite a bit of time and it ain’t free.


I've had 1 body repair to my Tesla. My insurance didn't total the car, and there were several local certified body shops to choose from. It was the same process as with other cars, I've never had a dealer do body work.


Not to correct you but to add context:

I recently had body work done through the Tesla service center. I dropped my car off, got a loaner, and they took mine to a nearby body shop. (They told me up front they would be doing it this way.) Picked it up a couple days later where I dropped it off.


You also need to compare Tesla aluminum-bodied cars with aluminum-bodied cars. Which are more expensive to repair. The IIHS article doesn't say what cars they compared Model S to, and aluminum-bodied cars are rare enough that it's likely not very comparable.

Mind you, that doesn't affect the claims-per-mile statistic.


The IIHS considers the Tesla Model S a "Large Luxury Car" in its classification system. Other "Large Luxury Cars" as classified by IIHS include...

* Acura RLX

* Audi A6

* Audi A7 (not rated)

* Audi A8 (not rated) (aluminum body)

* Bentley Continental GT (not rated) (aluminum body)

* BMW 5 Series

* BMW 6 series (not rated)

* BMW 7 series (not rated)

* Cadillac CTS

* Cadillac XTS

* Genesis G80

* Genesis G90

* Infiniti Q70

* Jaguar XJ (not rated) (aluminum body)

* Lexus GS

* Lexus LS (aluminum body)

* Lincoln Continental

* Maserati Ghibli

* Maserati Quattroporte (not rated) (aluminum body)

* Mercedes-Benz CLS-Class (not rated)

* Mercedes-Benz E-Class (partial aluminum body)

* Mercedes-Benz S-Class (not rated) (partial aluminum body)

* Porsche Panamera (not rated) (aluminum body)

* Volvo S80

* Volvo S90

-----

The main issue is the IIHS doesn't test many vehicles that can exceed a $100,000 MSRP, as they're a small portion of the overall market.


And the Tesla is furthermore an outlier, because for a long time it was the only luxury EV in widespread use. In a couple more years, there should be a lot more data points on "luxury electric vehicles".


Tesla doesn’t do collision repair. They have a program to certify third party body shops for that.


Tesla handles that for you afaik, and even then it’s limited, as those body shops can’t get replacement parts from Tesla other than windshields and lights. So if you need to replace a body part, you’re out of luck.


Tesla doesn’t handle it for you at all, and a full range of replacement parts is available, although sometimes they can take a long time to arrive.


"...and it accumulates more miles on average per day than other battery-powered vehicles, a new HLDI report shows."

Oh, no shit. A really fun to drive car, that can do ~300 miles per charge, is driven more than the Nissan Leaf @ ~150 MPC


> The higher claim severity for the Tesla Model S may possibly be attributed to the battery replacement cost of approximately $16,000.

This seems like a pretty important metric to me.

> Injury related coverages were also not included due to the small numbers of claims associated with the electric vehicles.

I would like to see more of this with the comparison.


Tesla is the epitome of Silicon Valley thinking applied to manufacturing. Tremendous achievements that no one thought possible, but tremendous ugliness under the hood.


From what I can see, their achievements are selling costly cars with manufacturing problems, bland interior to a dedicated fan base at a loss.

I can see why nobody thought it possible, but it isn't tremendous.


Drive the car. It is tremendous.


Interesting in light of tesla's claim that autopilot improves safety


Tesla compares their crash rate to, and I quote, "all vehicles from all manufacturers" [1], rather than to comparable vehicles, or even just western vehicles. The IIHS statistics say they're more accident-prone than other luxury cars and other electric cars. It would be interesting to compare Autopilot's crash statistics to other driver assist / AEB systems, rather than to all the 10-year-old base-model cars on the road.

1: https://www.tesla.com/blog/update-last-week%E2%80%99s-accide...


Tesla's PR department is obviously full of shit.

Edit: Which PR department isn't? Didn't Edward Bernays invent PR as a form of commercial propaganda for use in times of peace?


> Tesla's PR department is obviously full of shit.

Elon Musk is full of shit.

Little by little, more and more people will come to understand that.


I'm not surprised. Tesla's Model S replaced Toyota's Prius as the eco-conscious high-tech status symbol vehicle to own. Owner surveys show a lot of people came over to Tesla from ordinary cars rather than from other luxury cars or muscle cars. Someone coming from a compact hybrid that does 0-60 in 11 seconds, to a huge sports car that does it in 2.5 seconds, is probably more likely than the average driver to get into an accident.


I've been to a talk by the Progressive guy with all of their data, and he says the only strong predictor of accidents is sharp braking. Not acceleration. Do you have any data points to offer?


I don't have actual statistics I can share, but work in the automotive industry and have talked fairly extensively with insurance companies about data sharing as part of usage based insurance. There are actually multiple predictors of accident-prone drivers. Sharp braking is one, but so are frequent lane changes, insufficient following distance, and abrupt speed increases.

Distracted driving is a bigger issue than all of these, but is very hard to actually detect as a first-order effect with current cabin sensors.


I'm really skeptical of this theory being pushed in the article, without further data to back it up. Maybe the Model S has a higher claim frequency because its owners would be more nitpicky of their $100k+ car, than a $30k Leaf? Shouldn't there be other data showing that performance of luxury cars also leads to higher frequency claims?

Personally, I see a lot of Teslas on the road every day, and the people who were used to driving a Prius or minivan seem to drive their Tesla in pretty much the same manner.


While one could argue that Tesla drivers are, indeed, less picky than people who paid $60K+ for a real luxury car (if they were equally picky, one look at Tesla's build quality and they'd be off to a BMW dealership or something)...

But really, if we are talking about people driving 100K cars, we have to compare them with people driving other expensive cars, not a driver of a second-hand Leaf.


Clearly the vast majority of Teslas on the road are P100Ds in ludicrous mode 100% of the time.

If memory serves the average model S is more like 5 seconds 0-60, which is quick but there's a _world_ of difference separating 2.5s from 5s 0-60.


So, I don't care about the claim severity, as a consumer (unless we're talking deaths versus scrapes). They're clearly talking about costs, which will drive premiums, but are certainly related to parts/labor costs and availability, which is solvable long-term.

Apparently I also don't care about claim FREQUENCY because the article notes that the frequency difference goes down (but does NOT go away) when controlling for miles travelled. What? How on earth does 'frequency' not have miles as a denominator? Okay, fine, they must consider it on a per-day or per-year basis, but as a consumer, again, I don't care.

As the consumer, I really care about whether passengers fare worse in the average collision, or are more likely to be in a collision.

Again, parts costs will come into line over the long term. This is a game the established manufacturers are already good at, Tesla will have to step up to catch up.


> On average, Teslas travel three more miles per day than other large luxury cars, HLDI found

Would like to know what 3 miles is in percentage terms. Could account for higher collision claim frequencies.


3 x 365 = 1095 extra miles a year. A typical person drives 10,000-15,000 miles per year (USA).
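A quick back-of-the-envelope sketch in Python (assuming the 10,000-15,000 miles/year range above, which is my assumption about typical US mileage, not a figure from the article):

```python
# Back-of-the-envelope: 3 extra miles/day relative to typical US annual mileage.
extra_per_year = 3 * 365                    # 1095 extra miles per year
typical_low, typical_high = 10_000, 15_000  # assumed typical US annual mileage range

pct_vs_low = extra_per_year / typical_low * 100    # vs. a 10k-miles/year driver
pct_vs_high = extra_per_year / typical_high * 100  # vs. a 15k-miles/year driver
print(f"~{pct_vs_high:.1f}% to ~{pct_vs_low:.1f}% more miles per year")
```

So the extra mileage alone works out to roughly 7-11% more annual exposure.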


What has changed since Tesla claimed their cars are so much safer that the insurance costs can be lowered below standard rates?


All the comments in this thread seem to be implying that AutoPilot causes crashes, thus leading to higher insurance claims. I don't know if that's true, but that's not what this post says.

Correlation does not imply causation. It can be true that Teslas cost more to insure without Tesla being "at fault" for the discrepancy. When they compared electric vehicles with their ICE counterparts, they found that "the seven electric vehicles with exact conventional counterparts had lower claim frequencies and higher claim severities than their comparison vehicles". Electric vehicles had fewer claims, but when there was a claim it tended to be more severe. I assume this is because if there is damage to the battery pack, replacing that part is extremely expensive. So it's not surprising that when Teslas crash they can be expensive to fix, given that claim severity tends to be higher with electric cars and that Tesla is a small luxury car maker.

The other thing you have to think about is selection bias. All the groups you're sampling are not equal. Because Teslas are known for being extremely fast, they might tend to attract people who want to... drive their car fast. To make a scientific comparison you'd want to randomly assign people to a Tesla or non-Tesla to get a comparable sample population.

The other thing they mention is that "Teslas also are on the road more than comparable large luxury cars. On average, Teslas travel three more miles per day than other large luxury cars, HLDI found." Obviously, if a car drives more miles than another car, it's going to be proportionally more likely to crash given a consistent probability of collision per mile, right? Since Teslas don't need gas, the more miles you drive the more money you save -- and the more financial sense they make. Therefore it's not surprising that Tesla owners tend to drive their cars more, as the financial incentives would tend to favor people who drive more -- especially with free unlimited Supercharging on early Model S and X. Even someone who might not have driven a lot before will be more likely to drive more if the price of doing so is reduced -- that's just the basic law of demand. Plus, if you had a Tesla wouldn't you want to drive it all the time? People are leaving their Bolt in the garage while they drive their Teslas more than comparable luxury cars, and yet what we take away from the data is "AutoPilot is causing crashes".

I know this won't make a difference in how the media and public perceive this, and this line of thinking will just be seen as a defense of Tesla, but I don't think that higher claim frequencies for Model S means Tesla has necessarily done anything wrong, just on a purely statistical basis.
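The exposure point is easy to illustrate with a toy calculation (all numbers below are made up for illustration, not from the HLDI report): two cars with an identical per-mile collision risk, but different annual mileage, show different per-year claim frequencies.

```python
# Toy model: identical claim risk per mile, different annual mileage.
# All numbers are hypothetical, chosen only to illustrate the exposure effect.
risk_per_mile = 1 / 500_000  # assumed constant probability of a claim per mile

annual_miles = {"Model S": 15_000, "other luxury": 15_000 - 3 * 365}
expected_claims = {car: miles * risk_per_mile for car, miles in annual_miles.items()}

for car, freq in expected_claims.items():
    print(f"{car}: {freq:.4f} expected claims per year")
# The higher-mileage car shows a higher per-year claim frequency
# even though its per-mile risk is exactly the same.
```

This is why per-year claim frequency alone can't tell you whether the car (or its driver) is actually riskier per mile driven.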


I find it hard to believe that people who can afford a Tesla are that concerned about the price of gas.


They're probably less concerned about the cost of driving on average, but knowing that something costs less does tend to make you do it more all other things being equal.


If that extra driving averages to only 3 miles a day it might as well be explained by people in SFBA (Tesla's major market as I understand) just not being able to afford to live within a reasonable commuting distance.


I prefer battle-tested tech to something so shiny with such high risks. I hypothesize riding a horse is safer than driving a Tesla.


> Under collision and property damage liability coverages, the seven electric vehicles with exact conventional counterparts had lower claim frequencies and higher claim severities than their comparison vehicles.

I bet the higher frequency of claims is either because people who buy electrics are worse drivers or simply more litigious. Probably both.


>I bet the higher frequency of claims is either because people who buy electrics are worse drivers or simply more litigious. Probably both.

I think you should re-read what you are quoting. My reading is that the electric cars that were not made by Tesla had fewer claims than their ICE counterparts.


Indeed. I misread it as “higher” on both. In that case it’d be the reverse, with electric drivers being statistically better drivers.


What's up with the hatchet job? Is that normal for IIHS?



