14. A moment for pause. Fourteen incidents of a Tesla slamming into an emergency vehicle stopped on the road.
14 times. And somehow, the Tesla PR machine wants to say this is not about FSD.
Can anyone point to an advanced technology that had a similar "oops" in deployment and was not taken back to the drawing board? Did the Edsel have a similar failure rate at scale? The F-100 truck fuel-tank thing? Is that comparable?
Remember a decade ago when the vibe around here was that we were pretty close on technology to being able to save 30,000 lives a year in the US, but we'd likely be held up massively by overreactive outcry and legislation the first time a self-driving car killed someone?
Incredible how wrong we collectively were on both counts. That said, if we're gonna focus on the number 14 here, it still needs to be contextualized.
Stealing top comment, the title seems to have changed. Originally said "Tesla slams into fire truck, 1 dead. 14th Tesla-with-FSD/emergency vehicle crash"
Might want to wait until it's determined if the driver had FSD/Autopilot enabled at the time before implying it is about FSD.
> California Highway Patrol Officer Adam Lane said it was not clear whether the driver may have been intoxicated or whether the Tesla Model S was operating with automation or driving assistance features.
What is the rate at which non-Tesla vehicles crash into stationary emergency vehicles? Because if the rate is the same on a relative scale then it seems moot, but if it is significantly higher then there would be a good reason to be suspicious.
Obviously there are no statistics for "brand X crashes into fire truck" so unless you have data to the contrary, you can assume that the rates are more or less the same.
Drunk drivers in non-FSD vehicles can't pretend to be capable even in predictable conditions. So to some extent, FSD is precisely why drunk drivers in Teslas with it enabled are possibly more of a risk in unpredictable situations.
Ever wondered how many gun related accidents involve alcohol?
Enough to make it a law to move over one lane if you see an emergency vehicle on the shoulder. And enough to put it on signs on the highway all the time.
How many times have Toyotas, Mazdas, etc. slammed into emergency vehicles stopped on the road? What evidence was there that FSD was even in use in this case?
If we're going to be pedantic, FSD stands for Full Self Driving, which isn't available in Teslas yet. Full Self Driving Beta, which requires drivers to agree that they will fully supervise the car, is what is available right now.
Help me understand something though, and this is an honest question, because this seems to come up a lot... Are you saying that because it's called FSD, the driver in these situations should be absolved of responsibility for the crash (assuming FSD was involved in this)?
> Help me understand something though, and this is an honest question, because this seems to come up a lot... Are you saying that because it's called FSD, the driver in these situations should be absolved of responsibility for the crash (assuming FSD was involved in this)?
My personal opinion on this matter is that the courts will end up having to determine this in the fullness of time. And that will come after too many people are hurt/killed by insufficient automation being negligently marketed into fully-autonomous usage. But it's those losses that will push the matter into the courts' hands.
I agree with everything you're saying there. I wonder, and this is largely an academic point rather than disagreeing with you, if the sort of people who would lean too heavily on "FSD" would also lean too heavily on "lane keeping assist", with similar outcomes.
But I think that the day in court that you mention above is going to go very badly for Tesla, largely because of the FSD name.
Why can't Tesla let go of the FSD name? I'd guess it's entirely the ego at the top. I think everyone agrees that FSD is not at all capable of full self driving. I know Musk wants to get there and thinks they can, but in the interim I don't understand why they don't just start calling it something more in line with the actual capabilities... "Driving assist", or revive the "Mad Max" theme they used for how aggressive the mode is, maybe "Full Thunderdome"?
Maaaaybe, nobody wants to tell Elon how bad FSD really is, so they have his cars remotely piloted so nobody gets fired? Honestly, that wouldn't surprise me at this point.
A moment for pause seems unnecessary, but perhaps some perspective is warranted.
Automatic Emergency Braking only has a 30-80 percent success rate, depending on speed and conditions. It's sort of irresponsible and unnecessarily inflammatory to call out Tesla when this technology across all manufacturers isn't a panacea. The driver is still ultimately responsible for the vehicle, regardless of the safety or driver-assist system used, as US law stands today.
Tesla pushed a software update (feature first seen in version 2021.24.12) that slows the vehicle if emergency-vehicle lights are detected by the forward-facing cameras. Again, no safety system is perfect, and no other auto manufacturer does this.
> AAA took a range of modern vehicles equipped with AEB to determine just how well these systems work in practice. The testing concerned T-bone crashes and incidents with cars turning left across traffic, as well as rear-end crashes at 30 and 40 mph. The tests were performed with a selection of popular SUVs, with AAA selecting 2022 models of the Honda CR-V, Ford Explorer, Chevrolet Equinox, and Toyota RAV4 for the project.
> At 30 mph, AAA's testing found that AEB was able to avoid a collision entirely 17 times out of 20, an 85 percent success rate. In the other three cases, speed was reduced by 86 percent prior to impact. Performance was worse at 40 mph, with AEB only avoiding a crash six times out of 20 tests, a 30 percent success rate. Impact speed was reduced by 62 percent on average in the 14 tests where a collision did occur.
> The result shows that rear-end frontal collisions were reduced by 27% for cars with AEB, compared to cars without the system [6]. Many existing studies have shown that AEB can reduce rear-end collisions by 25%–50%
For comparison, GM killed 124 people with an ignition switch they knew for a decade was not fit for purpose. They recalled 30 million vehicles, and as part of a Deferred Prosecution Agreement, agreed to forfeit $900 million to the United States government.
Other manufacturers are very careful to remind users it's assistive technology. Tesla believers seem to want to promote it as autonomously capable, or soon to be. And there are few people willing to tolerate 40 mph speeds on highways, Tesla or otherwise, let alone 30.
14 feels to me like a horrendously large number for one vehicle manufacturer but your point is valid: what's the rate of analogous event for Volvo, or other market leading AEB?
The idea that this problem is solved by making smart devices smarter at recognising flashing lights suggests they're looking for low-cost fixes to the consequences, much as Boeing wanted for the Max.
> Autopilot, Enhanced Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.
Remember, as US law stands today, the driver is always responsible for the vehicle, and therefore safety systems do not need to be foolproof. They mitigate and reduce risk; they cannot eliminate it. If the argument is that these measures are insufficient, note that they pass muster with the NHTSA, which has neither prevented further sales of driver assist nor ordered a recall to disable the system. Tesla vehicles have a dead-man switch via steering-wheel torque detection and an in-cabin attentiveness-detection mechanism using a camera, which will kick you out of Autopilot for the remainder of your drive if your attention is not being provided (including looking at your phone for more than a few seconds). I'm at a loss as to what more could be expected.
14 sounds like a lot, but more people than that will die in the next 4 hours assuming 40k/year US road fatalities. The idea of zero deaths is incompatible with our societal risk budget to continue to have light vehicle mobility available.
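The "more people than that will die in the next 4 hours" claim checks out arithmetically. A minimal sketch, assuming a uniform rate of 40,000 US road fatalities per year (the figure cited above):

```python
# Sanity-check: at 40k deaths/year, how many deaths in a 4-hour window?
deaths_per_year = 40_000
hours_per_year = 365 * 24

deaths_in_4_hours = deaths_per_year / hours_per_year * 4
print(f"{deaths_in_4_hours:.1f}")  # roughly 18, i.e. more than 14
```

So even a steady-state average puts the 4-hour toll above 14.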
Today, fire truck accidents are so frequent and fatal that they rank as the second-leading cause of on-the-job deaths for firefighters. Up to 25 percent of annual line-of-duty firefighter fatalities are attributable to motor vehicle crashes and collisions. Traffic accidents kill more firefighters than smoke, flames, or building collapses; in fact, the only cause for more line-of-duty firefighter deaths is heart attacks from overexertion.
Where does the "14th Tesla with FSD emergency vehicle crash" in the title come from?
The article never says this vehicle was using FSD. It says it might have been using "automation" or autopilot. And the "14" figure comes from autopilot too
> The National Highway Traffic Safety Administration is investigating how Tesla’s Autopilot system detects and responds to emergency vehicles parked on highways. At least 14 Teslas have crashed into emergency vehicles while using the system.
Also considering FSD does not support highways yet (it switches to autopilot), it's almost guaranteed that it was not using FSD.
The article does kind of conflate autopilot and FSD as well though, when it mentions the recall (which is for FSD, not for autopilot).
No amount of engineering can change the basics of physics. A fire truck has a rolling weight in the ballpark of 50,000 lbs. A Tesla has a ballpark weight of 4,000 lbs.
You need a lot of V to make up for that difference in M.
On top of that, a fire truck is typically built on a really tough commercial chassis, which means a lot of rigidity.
Somewhat related, fascinating take: a school bus driver did a TikTok saying, roughly: for its size, a school bus is a relatively lightweight vehicle. We don't have Mansfield bars because we don't want to absorb the impact of you running into the back of us and pushing us into children. Instead, you go underneath and the children have a better chance of surviving. Pretty sobering.
It's worth separating the momentum M*V you're referring to from the kinetic energy (1/2)*M*V^2 that's dissipated in deformation during the collision.
Hmm I'm not quite satisfied even by my clarification. Car and truck chassis strength needs to scale with something closer to M*V^2, so the truck chassis is multiple times stronger.
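To put rough numbers on the momentum-vs-energy distinction above, here's a quick sketch using the ballpark masses from the thread (50,000 lb truck, 4,000 lb car) and an assumed 65 mph highway speed; none of these figures come from the article:

```python
# How fast would the fire truck need to go to match the momentum (M*V)
# or kinetic energy (0.5*M*V^2) of the car at highway speed?
# Masses are the thread's ballpark figures; 65 mph is an assumption.
LB_TO_KG = 0.4536
MPH_TO_MS = 0.44704

car_mass = 4_000 * LB_TO_KG      # ~1,814 kg
truck_mass = 50_000 * LB_TO_KG   # ~22,680 kg
car_speed = 65 * MPH_TO_MS       # assumed highway speed, in m/s

car_momentum = car_mass * car_speed
car_energy = 0.5 * car_mass * car_speed ** 2

# Truck speed needed for equal momentum, and for equal kinetic energy:
v_same_p = car_momentum / truck_mass
v_same_e = (2 * car_energy / truck_mass) ** 0.5

print(f"equal momentum: {v_same_p / MPH_TO_MS:.1f} mph")  # ~5.2 mph
print(f"equal energy:   {v_same_e / MPH_TO_MS:.1f} mph")  # ~18.4 mph
```

In other words, the truck barely has to be rolling to match the car's momentum, which is why the sedan takes essentially all of the deformation.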
I don't know about Teslas, but with cars in general, getting good scores on crash tests means that the people in the car are more likely to live. One way this is accomplished is by letting the car absorb more of the kinetic energy (using crumple zones, etc), so the cars themselves end up coming out of it much worse.
A fire truck is a 30-ton hunk of metal. That mass disparity really does make a huge difference in collisions. Pretty much any sedan-sized vehicle is going to be obliterated in such a collision, regardless of crash test scores.
Aren't interstate roads highways in the US? I don't think it matters what your safety score is if you crash into a stationary object at 100-120 km/h, the car obviously gets destroyed.
Because crash tests measure outcomes of passengers, not of the car itself. And even the safest vehicles usually fare poorly when the other vehicle is way heavier.
From the video in the linked page, it doesn't look like it went under. The B and C pillars are largely intact, as is the roof between them. The front end up to the A pillar is gone, and the space between A and B pillars is pretty squished.
This isn't the first time that FSD failed to detect a shiny red fire truck or an emergency vehicle.
But once again it has killed the driver instantly, which tells us it is not at all ready for general use, even if we gave Tesla more time to make their wild claims of Level 5 FSD and robo-taxis come true.
This add-on is an unsafe contraption, deceptively marketed as 'Full Self Driving' when in fact it is a 'Fool's Self Driving' experience. This AI system puts the driver and other drivers on the road at risk.
Now clearly, per Claim 4 [0], it is no safer than a human driver (in fact it is worse), slamming straight into the truck or hitting a small child-sized mannequin at normal speed with FSD on.
Not saying that FSD doesn't have problems, but, at this time:
> California Highway Patrol Officer Adam Lane said it was not clear whether the driver may have been intoxicated or whether the Tesla Model S was operating with automation or driving assistance features.
Original HN title implied otherwise (mods have updated it).
OK, I'll play Devil's Advocate. How do we know that all 14 of these crashes had FSD enabled? And do they mean FSD, or do they mean Autopilot? They are not the same thing.