Tesla Cybertruck Drives Itself into a Pole, Owner Says 'Thank You Tesla' (thedrive.com)
207 points by computerliker 35 days ago | 219 comments



I've owned a Model 3 for years now, and FSD is scary as hell. We haven't paid for it -- and we won't -- but every time we get a free trial of it (most recently this past Fall), I give it a whirl, and I end up turning it off. Why? Because it does weird shit like slow down at an intersection with a green light. I don't feel like I can trust it at all, and it makes me more anxious than just using standard auto-steer and cruise control (which still ghost-brakes sometimes). I don't get why anyone uses FSD.


Don't even get me started. Here's a list of things my model Y regularly does:

- Try to accelerate to 45mph in a parking lot b/c it was within 10ft of the road

- Decelerate from highway speeds suddenly to 30mph, as though it saw something it might hit (I stopped it at 30-ish and hit the gas)

- Decelerate to 50mph because of "emergency vehicles" even though there were no vehicles around (sometimes it mistakes lights that strobe b/c they are seen through median dividers as "emergency lights")

- Take up two lanes because they gradually separated and the car thinks it should stay evenly between the left and right divider line

- Choose absolutely bonkers speed limits, like 30mph on two-lane country highways.

- Stop on the highway with a big red screen and a message that says "Take control now fatal error"

- Not so much a problem any more, but when I was first getting used to it, it would beep a message at me, then scold me for looking at the message (and not the road), then ask me to do some kind of hand grip on the wheel to prove I'm paying attention, but I have to look at the message to figure out what it wants.

My wife tells me "Just keep your foot on the gas to keep up the speed and your hands on the wheel to keep it in line" and I am just left wondering what FSD is for


I drove from Memphis to Nashville in my “long range” Tesla 3, which has an amazing range of almost 160 miles at 70 mph. FSD would periodically do something crazy and then ask me why I disengaged, adding further excitement to my drive.

I am now absolutely convinced that we will have full self-driving from Tesla when we have a beautiful wall all the way from the east to the west coast along both the Mexican and Canadian borders. Both will be beautiful.


My Tesla Model 3 on "autopilot" (just keeping speed) will ghost-brake if it sees some cars merging into an adjacent lane. Really dangerous; nobody expects a car decelerating from 130kph down to 50 on the nearly empty Autobahn. My previous car (a VW) got that right. Overall it is a nice car (with a massive and increasing brand-toxicity problem), but how can one trust "full self driving" if it can't even keep speed in supposedly trivial cases where the driver has control?


> what FSD is for

Hype & Marketing


Also boosting the stock price.

Musk has been lying about FSD for a decade and the lies boosted the stock price as intended:

https://motherfrunker.ca/fsd/

https://electrek.co/2024/08/24/tesla-deletes-its-blog-post-s...

It turns out lying works.


And it’s no secret he has been lying either. A few years ago, Tesla’s most senior liaison to California’s road-safety folks was on the record saying, hey, we’re years away from true FSD, while Musk was at the very same time telling investors they were going to crack it and deliver it within a year.


Eeesh. Yeah, I was afraid it might still be that bad after 10 years of promises oscillating between "this year" and "within three years".

At the time, 2016, I trusted their promotional video showing it driving hands-free; I'm not going to make the mistake of taking them at their word again after it was revealed to have not been as it appeared: https://www.businessinsider.com/tesla-faked-video-in-2016-pr...

> I am just left wondering what FSD is for

The vision and promise, or the actually demonstrated use case?

The demonstrated use case is to charge people more money for the same product.

The vision? That is exactly what Musk keeps saying: in principle, a self-driving car never gets tired or drunk, so it can be safer than the mean human even if it only operates at the level of the median human. And it wouldn't need to be limited to median human level, as the whole fleet could learn from every member, so gain experience a million times faster than any human.

But at this point, I'm sufficiently skeptical of all of this, that I think they (and everyone else) should be banned from direct observation of the entire fleet's cameras — it's a huge surveillance network operating on every public road and several private ones.


Why on earth are you using FSD in parking lots?


What else would the "Full" in "Full Self Driving" mean?


Why are you using full self driving as part of your driving?

Weird ask imo.


I hope you guys got rid of your Teslas. Preferably under a press. They are a danger to the people around you.


They'd probably just replace it with another car, defeating the purpose.


Other cars don’t claim to offer FSD. The limitations are more explicit, therefore it’s safer for people around it.


If you were going to get rid of your Tesla for that reason, then you'd know about the limitations. Getting rid of it creates a chance for it to go to someone who did not.


Not based on their safety record.



Got any source for that, or anything to refute the reference below showing that Teslas are actually the most dangerous in terms of fatal accidents per mile driven, at twice the average?



Those are about the safety of people in the car. The comment saying they were dangerous was about people not in the car.


Based on reading that one, it seems like the takeaways are fairly opposed.


“The study's authors make clear that the results do not indicate Tesla vehicles are inherently unsafe or have design flaws. [...] “The models on this list likely reflect a combination of driver behavior and driving conditions, leading to increased crashes and fatalities,” iSeeCars executive analyst Karl Brauer said in the report.”

See also: https://www.iseecars.com/most-dangerous-cars-study#v=2024:~:...

It's also interesting in the context of this argument that they exclude models older than 2018.


I wonder if driver behavior is influenced by Tesla’s design of certain systems, in which case Tesla vehicles would be inherently unsafe and have design flaws to the extent that unsafe driver behaviors are enabled or encouraged.

“I wonder” is facetious here. Of course driver behaviors are influenced by Tesla marketing and design choices, and of course Teslas being the Most Dangerous Car Brand is an indication that Teslas are inherently unsafe and have design flaws.


I'm honestly willing to chalk a big part of the increased fatalities up to their awful UI/UX, which incessantly draws your attention away from the road for even minor things (adjusting mirrors, windshield wipers, etc.).


In the same way as a 911 driver's behaviour is influenced by Porsche's design of certain systems, etc.


Absolutely and that’s probably one of the reasons a 911 is 4th on the Cars With the Most Frequent Occupant Fatalities list.

But Porsche doesn’t claim to be The Safest Car Brand Ever and doesn’t claim to be making cars that can be put into a mode where it Full Self Drives?


Just like every other vehicle on the road.


It's for the share price. Just like it was for uber.


What’s amazing to me is that FSD to this day cannot recognize active school zones. Even my 6-year-old Audi’s onboard cameras can do that. If you put FSD on and go through a school zone, the Tesla will happily zip through at full speed, completely ignoring it.


It's terrifying that companies are allowed to beta test buggy software out in the real world by shooting huge machines with sharp pointy corners through schools.


I hope this isn't too off-topic, but I'm always amazed at how many otherwise smart people hold the naive belief that FSD is remotely close because 99.9% of the time it works fine.

Self-driving in my opinion will require an AI that is, if not very close to, an AI capable of general intelligence.

Why?

Because in the real world, to be able to drive a car as well as a human across all of the edge cases a human can handle, you probably need something approaching general intelligence.

Humans understand that a person isn't just something with 4 limbs, but can also be that thing that looks like a white sheet with eyes by the side of the road on Oct 31st. And it's these types of weird edge cases that humans instinctively understand, because they have a deep world model to reason with, something the narrow FSD AI systems we currently have lack.

When you think about what humans need to do when driving it's so much beyond just watching the road and turning a wheel that it seems almost absurd to imagine our current AI is anywhere near capable of handling all of the edge cases humans currently are.

And I also don't buy the argument that the goal should simply be to reduce the total number of accidents per mile... I'd grant that it's very possible FSD could reduce the total number of accidents per mile driven, because most miles are driven in the much narrower environment of highway driving. And there AI probably could do a better job than a human on average, once you factor human tiredness and distractibility into the equation. But no one is going to be comfortable with FSD occasionally plowing into a group of kids outside a school just because the total number of people who die in road traffic accidents is statistically reduced on a per-mile basis.

I'd be interested if anyone strongly disagrees.


I think you can implement self-driving without general AI, but it has to be really defensive and err on the cautious side, and that means it can't travel faster than 10 to 25 kph, like a bicycle basically. That car would have a "safe zone" around it, monitored with radar and/or lidar, and if anything enters just outside that "safe zone", the car stops before hitting anything.
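A minimal sketch of that stop-before-contact policy (all names and thresholds here are illustrative assumptions, not any vendor's actual system):

  # Sketch of the "safe zone" policy described above: cap speed at
  # bicycle-like levels and stop before anything enters the inner ring.
  SAFE_ZONE_M = 5.0      # hard stop boundary around the vehicle
  CAUTION_ZONE_M = 12.0  # start slowing when anything enters this ring
  MAX_SPEED_KPH = 25.0   # the 10-25 kph ceiling from the comment

  def target_speed_kph(nearest_obstacle_m: float) -> float:
      if nearest_obstacle_m <= SAFE_ZONE_M:
          return 0.0  # full stop: never make contact
      if nearest_obstacle_m <= CAUTION_ZONE_M:
          # scale speed linearly between the inner and outer rings
          frac = (nearest_obstacle_m - SAFE_ZONE_M) / (CAUTION_ZONE_M - SAFE_ZONE_M)
          return MAX_SPEED_KPH * frac
      return MAX_SPEED_KPH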

The market for such cars would be very limited IMO.


Totally agreed with the first paragraph, but after that you're ignoring the existence of Waymo today, which people generally feel comfortable in and around where it operates. Elon's marketing is so strong that even the doubters feel like Tesla is the leader of the pack.


> But no one is going to be comfortable with FSD occasionally plowing into a group of kids outside a school because statistically the total number of people who die in road traffic accidents is reduced on a per mile basis.

I really don't see why not. Since those deaths must be counted too, if it is still safer even with that factored in, then it can't be something that happens more than extremely rarely.


But when are these edge cases? Are they like the edge cases when I need 4-wheel drive? Basically never, and if I did need it I just wouldn't go there, or would rent a car for the day or something.


Most safety problems, including the child in a Halloween costume, can be solved by the advanced AI technique called "Don't hit anything. If it looks like you're about to hit something, slow down or stop."

Trouble is, when the company deliberately ties one hand behind its back by insisting on camera-only vision, it is never going to be perfect at not hitting stuff. Either multispectral imaging, radar, or lidar would help avoid edge cases like the Halloween costume. The camera might not even realize there's a three-dimensional object in front of it if there's snow on the ground. Stupid, stupid, stupid.


That "don't hit anything under any circumstances" is incompatible with the USA-typical "stand your ground" and "right of way" philosophy of doing things.


> Trouble is, when the company deliberately ties one hand behind its back by insisting on camera-only vision, it is never going to be perfect at not hitting stuff.

I have a Y with FSD. I think that underplays it a little. Yes, they don't have LIDAR, but LIDAR is very expensive and fragile. I can understand that.

But they also removed the windscreen rain sensors in favour of using the cameras. Consequently you can be driving down the highway on a clear day, and the windscreen wipers will start. They also don't have ultrasonic distance sensors. Consequently the car won't warn you about short bollards at the corners of the car. It seems to be completely unaware of them.

Unlike LIDAR, these sensors cost almost nothing, and these are high-end cars. Keeping them until the camera version was reliable would have seemed prudent. In fact they could have used them to improve the camera version, say by comparing what the rain sensor said to the camera output. It seems like they've allowed an AI religion to cloud their engineering decisions.
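As a rough illustration of that cross-checking idea, you could log disagreement between the legacy rain sensor and a camera model to find the camera's blind spots (a sketch with hypothetical names; this is not Tesla's actual stack):

  # Hypothetical sketch: score a camera rain classifier against the old
  # hardware rain sensor, treating the sensor as a cheap label source.
  from dataclasses import dataclass
  from typing import Callable, List

  @dataclass
  class Frame:
      image: bytes      # raw camera frame
      sensor_wet: bool  # what the legacy rain sensor reported

  def disagreement_rate(frames: List[Frame],
                        camera_rain_prob: Callable[[bytes], float]) -> float:
      """Fraction of frames where the camera model and rain sensor disagree."""
      disagree = sum(
          (camera_rain_prob(f.image) > 0.5) != f.sensor_wet for f in frames
      )
      return disagree / len(frames)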


It's so surprising to hear about these issues with FSD, and it makes me nervous even though I haven't encountered any problems on v13. I regularly use it back and forth between work and home, mostly during rush hour, with a lot of difficult merges and weird situations.


I suspect part of the difference is what kind of roads you are on. Whenever I'm in the Bay area (or Southern CA in general), I'm amazed by the quality of the roads. The pavement is even and smooth and the lines are crisp, fresh paint that is easy to see.

Meanwhile in the Midwest, we have potholes, uneven roads, sometimes roads with different surfaces mixed together (gray concrete with black asphalt patches). Lines are often badly worn by the weather and road salt and can be quite difficult to see.

I strongly suspect, though with no evidence, that FSD has more problems on roads that are in poor condition.


I had to reread your comment multiple times because I couldn’t believe I was reading someone complimenting Bay Area and SoCal roads.

I say that in the most innocent, sorta joking way possible considering I curse those very roads at every turn. Maybe I have to break out of the bubble!


I agree. I've used FSD v13 on a Model Y with hardware version 4 for a couple of months now. Checking my mileage, that's over 2,000 miles, most of which was with FSD enabled (road trips on interstates, highways, backroads, two-lane country roads without lane markings, etc.). It's been absolutely fantastic.

Even my parents and sister use FSD v13 regularly now in their Teslas.

It's come a long way from the early days when I first started testing it.

It makes me wonder how many people are using Autopilot (included as standard) instead of FSD on a newer Tesla with the new AI hardware.

It's pretty wild to be able to start from park. Tap a button, and go.

Just the other day, it managed merging onto the interstate and then immediately changing 7 lanes to the left to merge onto the next interstate exit heading north. It performed flawlessly.



I test drove a model 3 ~3 years ago and FSD was terrifying. I had no idea what it could or couldn’t do. IMO Hyundai (and others with similar features) have it perfect with adaptive cruise + active lane assist. I know exactly what it can do, it does 90% of the driving on long trips, and it doesn’t do so much that I’m tempted to put too much trust in it.


Prepare for Mayhem:

"Tesla Plans Robotaxis in Texas by June, in a State with No Regulations" - https://www.auto123.com/en/news/tesla-robotaxi-texas-regulat...


inb4 autonomous only zones in cities


It's crazy, because any negative criticism of FSD will have a ton of fanboys pouring out of the walls to tell you how great it is, how great the latest update is, how your anecdotal "evidence" is not typical, etc.

Except all you have to do is go try it and it becomes clear to any layperson that it's probably getting there but, and this is really crucial, it's not there yet.


I feel like FSD has been "getting there" since before even Tesla started marketing it. I remember Google's early self driving cars and everyone thought they were only a few years away from being practical.

I think FSD definitely has utility, but not in the hands of laypeople. There are still far too many edge cases that it just doesn't handle well, and your average person can't be trusted to stay alert and attentive while using a feature so heavily marketed as not needing either of these things.


> I remember Google's early self driving cars and everyone thought they were only a few years away from being practical.

...and that turned out to be a little optimistic, but they really are "there" now IMHO in San Francisco. I rode in one for the first time last year. Subjectively, I felt safer moving through San Francisco traffic in the self-driving car than I do when I'm driving there myself or when I'm being driven by a human in a Lyft. It was attentive, cautious, and smooth, and I got there in a reasonable time with no fuss. And crucially, I see a notable lack of stories about it making dangerous decisions, despite the total passenger miles.

Why is Waymo there now and not Tesla? I think a combination of factors, including: (a) the head start, (b) the willingness to use LIDAR and RADAR to overcome limitations, (c) the focus on self-driving (they design and operate self-driving systems; they don't manufacture electric cars), (d) the service model (easier problem to focus on a mapped region with good weather and monitor everything vs. sell a car expected to work anywhere/anytime without (as much?) telemetry), (e) frankly, caring more about safety and less about hype. Of those differences, the "head start" one is shrinking relatively speaking, but the others will likely remain significant enough that I don't expect to trust Tesla's systems any time soon.


[flagged]


Weird comment considering that the most common failure point of that contraception method is _failure to actually use it_. Other than that they're pretty much unrivaled in safety.


[flagged]


I mirror the previous poster's experience: the Waymos drive better than a good number of people I know, and to the best of my knowledge haven't had any issues with pedestrians, except being attacked by them once or twice.

So I'm not sure what you're referring to? Did I miss a story?


> Did I miss a story?

Just wondering the same myself. Found this: <https://www.washingtonpost.com/technology/2024/12/30/waymo-p...> but the videos don't seem as persuasive to me as the text:

* the ones on the left and right show a Waymo passing by a pedestrian standing on the sidewalk next to the crosswalk while filming. As a human, I wouldn't have stopped, as there was no indication the pedestrian intends to enter the crosswalk.

* the one in the middle shows a human-driven car passing right by a pedestrian filming from a narrow median. I think the human driver probably should have stopped, as that narrow median is a stranger and more dangerous place for the pedestrian to hang out than the sidewalk. The Waymo in the middle lane, given that the human-driven car didn't stop? Don't see why they would, how would the pedestrian have reached them safely? I suppose there's an argument they should because one driver stopped at a crosswalk legally obligates other drivers to as well, but the human I mentioned was ahead of them...

As I understand it, the law is that the car (regardless of driver) needs to yield to a pedestrian who is crossing the street. But the pedestrian is responsible for that first step into the street. And think about it...how often do you see someone standing near the roadway, even near a crosswalk? I think often enough that traffic just wouldn't move if cars were expected to assume pedestrians were going to jump in front of them.

The text "I tried sticking one foot out, crossing in both heavy and light traffic, waving at the car and even pushing a baby stroller (without my baby!)." is more worrying, but...pics or it didn't happen? (In particular their suggesting it misbehaved in these videos makes me doubt the accuracy of their reported text...)

The article also says: "No Waymo car has hit me, or any other person walking in a San Francisco crosswalk — at least so far. (It did strike a cyclist earlier this year.)"


They seem safer than human drivers.


Which is why the fan boys always tell you that it's the next version that will fix all the bugs.


It's "the next version will fix all the bugs" all the way down.


It's like C++ :-)


I was going to say Java…


Interestingly common trait in fan-boyism, the <thing> is always just a few steps away from being right.


Also, a fairly mundane reality of technological progression.


Exactly... this is HN of course, so I expect nothing less. My favorite is when I frequently (as recently as a couple of days ago) get comments like "go see some videos on youtube before commenting like that" :D Soooo funny. The thing is absolute garbage, but Elon can sell garbage better than anyone that ever lived.


I like that you've built a strawman for anyone who might disagree with you. "Oh there aren't any positive reviews they're just fanboys." You may as well write "Anyone who disagrees with me is wrong."


I don't get it, what do you expect them to do? Just reinforce your view? The data is pretty clear on how many people use FSD without issues. What's equally weird is you guys preemptively smearing folks who defend FSD. Nothing will make you guys change your minds, I guess.


For me, evidence would work. That's why California's requirement to report miles driven, accidents, and disengagements is so valuable: it's a standard way to compare, measure, and regulate. And when Tesla hides from that and instead moves to the unregulated Texas market, it predisposes me to think they are shying away from gathering exactly the evidence that would convince me it's safe. If it works for you, great. But this difference makes me happy to live in California.


> the data is pretty clear how many people use FSD without issues.

What data? The data Tesla chooses to share with you?

It wouldn't surprise me to find out this incident had FSD disengage moments before colliding with the pole, thus preserving a "100% FSD-safe" driving record.


Who else can share the data Tesla is collecting??


1) Tesla has not actually shared real datasets, only cherry picked snippets and highlights.

2) This is only the data Tesla chooses to share, at times when Tesla chooses to share it.

They could have other auditors validate the data. They could release full datasets on regular schedules. They could do a lot of things, but they don't. They give little highlights of results without explaining methodologies. They pick seemingly random timeframes. They're not open with this information in the slightest.


> the data is pretty clear how many people use FSD without issues.

Serious question - what data? And who is supplying that? Tesla? And what is the emotional human equivalent for the level of confidence that we should assume is "safe"? 1% error rate? 0.5%? 0.00001%?


Who should share the data Tesla is collecting? Tesla? Or should someone steal their data and share it?


It’s a little hit or miss (pun intended). I’ve used it off and on in my model 3 for the past 2 years and the rate of improvement has been significant.

Still though it has quirks.

On long trips, I LOVE it. LOVE it. Being able to just tap in and relax, make phone calls, listen to an audiobook, etc is so nice. The first time I ever used it I had to leave early from the All Things Open conference in Raleigh because I was getting sick. Having it essentially drive me home for 5 hours when I wasn’t well, including stopping to charge, was a huge relief.

It’s also great in traffic jams where you’d otherwise be dealing with stop and go traffic until you get through it. Just tap in and relax til you’re on the other side.

Day to day driving, it’s a little more iffy. I’ve dealt with seemingly random slowdowns on otherwise empty roads. It feels odd especially because it’s sudden.

Early on it would have difficulty on roads without well marked lines too.

I’ve never felt like it was going to run into an object though. Usually it errs on the “too cautious” side and I just take over to get where I’m going quicker.


> Being able to just tap in and relax, make phone calls, listen to an audiobook, etc is so nice.

My 12 year old Ford Focus does that


Not while it drives you hours to your destination without your intervention…


The transmission still works?


It's a manual, and it works fine.


I always slow down a little bit when speeding through intersections, just in case I need to react to someone illegally putting themselves in my path.

It was this guy's fault for not monitoring the car, but also Tesla's for using a doublespeak name like Full Self Driving.

If FSD is a statistically significant enough risk factor for injury above Teslas that don't use it, it should be banned.


Well, FSD is officially in beta. Supposedly it will actually be safe once it reaches release; which decade that might be, your guess is as good as mine.


I'm on AI4 v13 and haven't had a safety intervention in several thousand miles. It's incredible and extremely smooth.


While we're sharing anecdotes, I have FSD (13 now) on my model Y and love it. I was anxious at first and remain very guarded while using it (which you're obligated to do anyway) but it's taken away a lot of the tedium and fatigue of commuting and long highway driving. I occasionally use it door-to-door but often turn it off on certain roads, where I'll do a better job avoiding potholes, for example. It's not done anything unsafe, but it did change lanes once without signaling and I had to intervene. No one was around and the lines were hard to see, so perhaps that's why. Overall it feels like a far safer driver than a lot of people I've been in the car with.


I feel this way too. It was scary the first couple of weeks, but I'm glad I gave it more of a chance. Over time you learn when to trust it and in which situations you need to take over, and then it just becomes a normal part of driving, with less stress on me since my brain energy is spent looking for problems instead of keeping up with traffic or staying in the lane.

Really a human + AI hybrid experience.


I too have FSD 13 on my CT and use it for 99% of driving with no issues. I have done a number of long city to city drives (30-100+ miles) with zero interventions. The roads would be 10,000x safer if every car was using FSD, even in extreme edge cases like the original post.


How do you like the CT? Is it your first Tesla?


It’s fantastic and Tesla service has been really good for the one issue I had over the last 12 months. My wife bought a Model Y in 2021 so it’s the second Tesla in the house.


There are upsides, definitely. For slow moving traffic FSD can remove so much of the tedium of matching speed, spacing, or stop-and-go.

I like the level 1 to level 3 features: Lane keeping, emergency braking (when there's something there), adaptive speed control, etc. But a new minivan has all those too.

For long highway driving it does remove 99% of the things I hate, but there's 1% of the time it just annoys the hell out of me, and it tarnishes the whole experience.


> While we're sharing anecdotes, I have FSD (13 now) on my model Y and love it.

At what ratio of good anecdotes to bad anecdotes should we trust it? For me, the ratio has to be astonishingly high, such that if there are a few people in the discussion saying it did something suspect (much less dangerous), they're always going to be the ones I listen to. Not that I'm doubting your experience; it's just not enough to outweigh the other.

Obligatory xkcd: https://xkcd.com/937/


I have no horse in this race at all, but people seem less likely to tell good anecdotes. There's little interesting in "I used it, it worked adequately."


I guess proper statistics are the thing.


> Thank you @Tesla for engineering the best passive safety in the world. I walked away without a scratch.

This could have easily killed an innocent pedestrian or bicyclist. How is this the best safety engineering? If FSD failed, there should have been some secondary system to detect the imminent collision and apply the brakes.


Clearly the owner showed that this aspect is not important to them when they ordered the Cybertruck.


The US has no pedestrian safety regulations at all for car design. Some have been proposed but it's 2025 and still nothing enacted.


And with the current administration’s attitude towards consumer protection you’ll never see a meaningful change in safety regulation.


Given the last three weeks, I would expect rules that are actively pedestrian-hostile.


> If the FSD failed there should have been some secondary system to detect an imminent collision and apply brakes.

There actually is. The Automatic Emergency Braking functions separately from FSD and can prevent collisions in some cases. It doesn't work 100% of the time so I wouldn't rely on it, but at least it works as well as or better than competitors' systems.


It ran into a pole… I don’t imagine other cars’ safety systems would fail to brake there


It is passive-aggressive sarcasm. If you say mean things about Tezla on X, there is a chance you may be banned/sued/delisted, especially if it involves a crash. So everything has to be couched in false praise. Nobody really thinks the Cybertruck does better in a crash than a Merc or BMW. It's just something said by the poster in order to get their story to a wider audience.


I hope that's the case. It almost read like someone trying to still believe in God right after their mom died.


It does not seem to be the case.

> Big fail on my part, obviously. Don't make the same mistake I did. Pay attention. It can happen. I follow Tesla and FSD pretty closely and haven't heard of any accident on V13 at all before this happened. It is easy to get complacent now - don't.

> I do have the dashcam footage. I want to get it out there as a PSA that it can happen, even on v13, but I'm hesitant because I don't want the attention and I don't want to give the bears/haters any material.

https://x.com/MrChallinger/status/1888546351572726230


Full tweet is below and it doesn't sound like sarcasm. He even says he doesn't want to give the haters ammunition.


>He even says he doesn't want to give the haters ammunition

That part was what made me question if it was real! "Don't want to give the haters ammo" at the tail end of a story about how his $100k pickup truck drove into a lamp post.

What exactly does he think he's doing?


A description and a picture is one thing - a video would be on every news station by the end of the day.


Passive safety is the art of engineering cars so that when they do crash, the occupants are unharmed.

What you're asking for, though, is definitionally impossible: obviously the cameras didn't detect the obstacle, so FSD or no, they can't react to it. The actual solution would be to do what every other car maker with self-driving pretensions does and augment the cameras with LIDAR or other sensors.


> Passive safety is the art of engineering cars so that when they do crash, the occupants are unharmed.

Judging by the (illegal-in-Europe) design, passive safety is the only safety the Cybertruck has, and the safety of others has absolutely zero importance. Fits with how the rest of the world sees the typical American as well, so maybe not a big shocker.

> What you're asking for, though, is definitionally impossible

Why is it impossible for the car to stop (legally, obviously) if it fails to merge, or even after it hits the curb, instead of continuing straight ahead like nothing happened?


>passive safety is the only safety Cybertruck has

Not even remotely true


Europe's safety is optimized for its environment: mostly narrow, crooked, and crowded streets with a lot of pedestrians. Most use cases for a pickup truck that's only sold in North America are in the part of America where you're much more likely to crash into a tree, deer, fence post, etc than you are a person.


I see lots of SUVs in Europe, which are light-truck sized (and are more often than not built on top of light truck chassis). That plus the preponderance of trucks in US cities suggests to me that it's mostly a cultural and regulatory issue, not a matter of driving environment.


I can't think of a single SUV on sale in 2025 that is built on top of a truck chassis?


Aren't the Wagoneer, Escalade, Yukon and Tahoe all body-on-frame based on a truck chassis?


The OP said 'in Europe'. I'm only in one bit of Europe admittedly, but I haven't seen any of those on the road. I don't think they are generally available.


> Most use cases for a pickup truck that's only sold in North America are in the part of America where you're much more likely to crash into a tree, deer, fence post, etc than you are a person.

Not many trees, deer, fence posts in the Costco parking lot compared to people.


Dunno; last time I was in SF, I saw one of these absurd items (they are even sillier in person) right in the city, lots of people around. If they’re so rural-adapted, perhaps they shouldn’t be allowed to enter built-up areas.


Really? Someone should tell all of the suburban and city-dwelling truck owners that.


> Passive safety is the art of engineering cars so that when they do crash, the occupants are unharmed.

Passive safety usually is defined as reducing the risk of injury or death to vehicle occupants in an accident AND also protecting other road users. You left off the second part.


The post has 8M views as of this writing. The owner doesn’t want the message to go viral because then Tesla gets flak for it. He takes the blame. He wants to share the message of FSD’s fallibility with everyone. He praises Tesla for safety.

My head hurts with how oxymoronic this is. My best guess is he wants to critique Tesla without triggering the ego and arrogance of its owner. “Thank you sir for doing great work and for fixing this problem in the future.”


> My best guess is he wants to critique tesla without triggering the ego and arrogance of its owner.

Well I imagine that since Musk was handed the keys to various government agencies and installed his henchmen, you can see why you'd want to tread lightly and kiss the ring. Such a wonderful future this is becoming.


The very fact that this is a Cybertruck owner already tells you you're dealing with a fanboy (aka cult member). Edit: an oversimplification of course, but not far off from the truth.


You can buy weird or unique cars without being into a certain culture or group.


Not Cybertruck IME


It costs over $100,000. That is a decision you do not make for no reason.


And my guess is there's a touch of ideology to this person that questioning Tesla and FSD's fundamental safety would hurt. "I screwed up" does a lot less to cause cognitive dissonance than "Something I believe is wrong"

There's a lot of possible flavors to that ideology, it COULD be right wing political affinity, but it also could be a belief that technology is superior to human judgement, or that self driving cars are the future, or it could just be that spending 6 figures on an ugly pickup wasn't a waste of money.


I think you're reading way too much into how people think.


I think he's trying to get special attention without passing blame onto the company. The best way to get help is through kindness (at the outset), not accusation (even if it is the company's fault).


Worth reading the actual tweet, not just the article's truncation of it:

> Soooooo my @Tesla @cybertruck crashed into a curb and then a light post on v13.2.4.

> Thank you @Tesla for engineering the best passive safety in the world. I walked away without a scratch.

> It failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down or turn until it had already hit the curb.

> Big fail on my part, obviously. Don't make the same mistake I did. Pay attention. It can happen. I follow Tesla and FSD pretty closely and haven't heard of any accident on V13 at all before this happened. It is easy to get complacent now - don't.

> @Tesla_AI how do I make sure you have the data you need from this incident? Service center etc has been less than responsive on this.

> I do have the dashcam footage. I want to get it out there as a PSA that it can happen, even on v13, but I'm hesitant because I don't want the attention and I don't want to give the bears/haters any material.

> Spread my message and help save others from the same fate or far worse.

https://x.com/MrChallinger/status/1888546351572726230


How is this not a cult? If a Toyota or Mercedes veered off into a light pole, would their owners write the same tweets?

Unless this is sarcasm, which in the current times could all too easily be construed as serious.


Most FSD users are in it because they see how great things could be if self-driving existed, and are willing to put in the risk to get the algorithms trained.


The risk is mostly on the people outside without any say on the matter.


How does that risk differ from human drivers? Other than the fact that the training would affect many vehicles vs just one driver, and the fact that humans are only supervised with possible intervention for 6hrs total.


The problem is that it's not just them putting in the risk: it's everyone else around them.


As callous as it is, I think this is par for the course of human history. Progress demands sacrifice :/


"Some of you may die but that is a sacrifice I am willing to make."


I used to be. Now I just want my damn money back.


Right, the whole point was to avoid an article drawing attention to all the negative news. Which is funny, because this thread is immediately full of complaints.


"Big fail on my part, obviously" WTF??? Big fail to buy Tesla and even bigger to use FSD. With my almost 20 year old car I don't have to worry about none of this BS.


The cybertruck owner is clearly only interested in their own safety. Luckily, in my country cybertrucks are not allowed on the road, for other people's safety.


Weirdly, they seem more concerned for Tesla the company than they are for themselves.


One doesn't buy a truck like that out of concern for other people.


Driving such a beast seems to give off a very "I'm worth more than others" feeling. I'm puzzled; what drives this behavior?


Cybertrucks seem like the modern-day Harley motorbikes: a very expensive way to signal an antisocial personality.


>in my country cybertrucks are not allowed on the road

Not in my EU country either; however, they have already been spotted on the streets with valid license plates. There are loopholes everywhere, usually if you classify it as a commercial utility vehicle for a business instead of a passenger car. There are plenty of people with money and no scruples.


Some cars, when I see photos of them smashed up, I get very sad. NA Miata, Corvette C4, etc. A totaled Cybertruck, honestly, good riddance. It is an extraordinarily difficult vehicle to love.

Very glad to hear no pedestrians got hit. Really hope the driver takes some kind of lesson away from this experience.


I really doubt they are taking any lessons from this. This is the author's second Cybertruck crash in a month.

If I didn't know better, I'd think they were trying to farm engagement.


>This is the author's second Cybertruck crash in a month.

Does anyone have a source? If true then this defense of Tesla that we see now is even more bizarre.


This is a pro-Tesla/Elon account, but the screenshots are legitimate.

https://nitter.net/WholeMarsBlog/status/1889098514061492517#...

The whole thing is funny because the guy who is so vehemently defending Tesla and FSD, despite totaling his car, is being targeted by Tesla fanboy accounts as a fraud. Twitter is really bottom-of-the-barrel garbage.


>This is the author's second Cybertruck crash in a month.

JFC


I drive one and I love it! I wanted a large, robust self-driving family EV and as a bonus it looks unique compared to the blobs that everyone else drives.

It's sad that we CT drivers seem to be caught in a crossfire between Tesla and Tesla/Elon haters, when all we want to do is enjoy our cars.


> It's sad that we CT drivers seem to be caught in a crossfire between Tesla and Tesla/Elon haters, when all we want to do is enjoy our cars.

The good news is that it’s just the ones who drive their truck with the now-known-to-be-faulty FSD turned on who are caught in the crossfire. (I’d say they are actually and should be in the “crosshairs”, but that’s moot.) Doesn’t have to be you.


This is because Tesla‘s implementation hasn’t worked, doesn’t work, can’t work, won’t work, won’t ever work, and has been a decade-long intentional fraud from a con artist that was designed to pump up a meme stock. THAT worked.


I am no fanboy, but I have it on a 2023 model Y. It works incredibly well. It is actually already amazing. You all are just blind from Elon hate.


> not a fanboy

> you're all blind from Elon hate

This is very ironic.


[flagged]


What statistics? Let’s see it


And how


Another proof that self-driving cars with human backups should never be allowed on public roads. They are going to be used unsafely because the design encourages such behaviour.


Specifically, SAE level 3 should be explicitly prohibited. Humans have proven, over and over, that they can’t handle that level of alertness while also not driving.


I'm not sure level 3 is a problem. In L3, the car is required to initiate the handover and give you a long time to take over. This may or may not end up OK (witness the Mercedes eyes-off, hands-off Traffic Jam Assist). If the car can do something reasonable 85% of the time when the human is unavailable, dead, or drunk and can't take over within 25 seconds, that would probably be OK.
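A toy version of that Level 3 handover logic, using the 25-second figure above (illustrative only, not any manufacturer's implementation):

  # Toy SAE Level 3 handover sketch: the system drives, must request a
  # takeover with ample warning, and falls back to a minimal-risk
  # maneuver (e.g. pull over and stop) if the human never responds.
  TAKEOVER_WINDOW_S = 25.0  # the "25 seconds" from the comment above

  def handover_action(seconds_since_request: float,
                      driver_responded: bool) -> str:
      if driver_responded:
          return "HAND_CONTROL_TO_DRIVER"
      if seconds_since_request < TAKEOVER_WINDOW_S:
          return "KEEP_DRIVING_AND_ESCALATE_ALERTS"
      return "MINIMAL_RISK_MANEUVER"  # human unavailable: stop safely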

Level 2+, though, is a big worry. It fails enough to be dangerous, but many of these systems fail too little for humans to effectively monitor them.


Waymo has been operating near-flawlessly for years now in some busy, complicated cities. I don't see self-driving tech as the problem. It's been proven to work. I see irresponsible companies as the problem.


Waymo has conditionals (type of car, what roads it will drive on, range, etc.) on how and where it can operate; FSD's conditionals are much less stringent.

I'm still waiting for Waymo to safely drive in the snow.


Waymo is level 4 while FSD is level 3. Waymo won't drive in the snow, because they know they cannot fully automate it safely in all situations. FSD will yolo any situation and just suddenly hand you the wheel whenever it likes, resulting in situations like the linked article.


> I'm still waiting for Waymo to safely drive in the snow.

I'm not, at least not especially. Technology doesn't need to be flawless to transform society. We put up with lots of limitation when we live in places with serious winters. Why should driverless cars be impervious?


You didn't read where he said "self driving cars with human backups"


If it needs a human backup then that's not a self-driving car though is it? That's an already-obsolete concept.


I wouldn't say "never", however it's clear we're not there yet.


if you want to see the true horror, check out https://comma.ai. $2000 and plugs into most cars made in the past few years, works by using cracked security for the cars it is "compatible" with. These people are on the road next to you with a car being driven by a single smartphone camera. They sell it as "chill driving" but they have a discord where people just flash custom FSD firmware.


By "cracked security" do you mean something more than the fact that it plugs into the CAN bus?

At least they are not pretending to offer anything more than level-2 adaptive cruise control and lane centering.


Ah yes, the compensating behavior theory all over again. Replace "seat belts" with "driver assist"

> This paper investigates the effects of mandatory seat belt laws on driver behavior and traffic fatalities. Using a unique panel data set on seat belt usage rates in all U.S. jurisdictions, we analyze how such laws, by influencing seat belt use, affect traffic fatalities. Controlling for the endogeneity of seat belt usage, we find that it decreases overall traffic fatalities. The magnitude of this effect, however, is significantly smaller than the estimate used by the National Highway Traffic Safety Administration. Testing the compensating behavior theory, which suggests that seat belt use also has an adverse effect on fatalities by encouraging careless driving, we find that this theory is not supported by the data. Finally, we identify factors, especially the type of enforcement used, that make seat belt laws more effective in increasing seat belt usage.

[0] http://www.law.harvard.edu/programs/olin_center/papers/pdf/3...


It's clearly not comparable, as wearing a seat belt might make you feel more safe, but it doesn't actively encourage you to pay less attention to the road by its design. The act of driving is roughly the same with or without seat belts. Driver assist or self driving drastically alters how you drive.


And yet FSD and even lane assist are going to be safer than the driver near you who is scrolling and typing away on their smartphone

Don’t mistake my post as a defense of FSD or Tesla. They’ve been lying about their capabilities for what feels like a decade.

I don’t want to see FSD and human drivers share a road. I want all cars to be meshed and communicating their intents with vehicles around them to avoid collisions. We will never see that in our lifetime


It's terrible that cell phone distraction is not prosecuted as harshly as DUI. Consequently, it's totally normalized. Get drunk and plow your car into a bunch of people, killing them? Many states treat this as a serious felony, up to and including charges like "DUI Murder." Plow into the same bunch of people while scrolling Instagram? It's "Oopsie-doopsie! Accidents happen!" The worst you'll get is something like "negligent vehicular manslaughter" which is less than a year in jail.


Judging by how often people press the emergency stop button on the escalators where I live, I fear that relying on the sincerity of strangers (and their cars) is maybe not a viable solution.


Allocating dedicated FSD roads is a terrible future. It will basically kill cities. I find myself agreeing with most of the arguments here: https://youtube.com/watch?v=040ejWnFkj0


We already have dedicated FSD roads, and they work quite well; we just usually call them "subways" or "light rail".

I don't see much point in building additional FSD roads for the inefficient, non-platooned, low-capacity, rubber-wheeled trolleycars Tesla makes, though.


I wish people would just write down their arguments instead of making a 53 minute video of it. I don't have time to watch all that :(


At some point these reckless drivers need to start going to jail. I realize it's not going to happen in the US because the government has been captured, but there's clearly some missing messaging where these drivers don't get the point that they need to be paying attention and not tweeting on their phone while their car drives into a lamppost.


why stop at FSD users? this should apply to all drivers in general. if they cause an accident they need to go to jail.


... they do already, if the infraction is deemed serious enough?


Interesting comments from X:

Snowball: "So FSD failed but you still managed to find a way to praise Tesla. You failed too for not taking over in time. But your concern isn't for the lives of third parties that You and FSD endangered. No, you are worried about Tesla getting bad publicity. You have misplaced priorities."

Jonathan Challinger (the driver who crashed): "I am rightly praising Tesla's automotive safety engineers who saved my life and limbs from my own stupidity, which I take responsibility for. [...]"

Fair points from both sides I think.


If he's taking responsibility, why did he specify a semantic version?


There was a post on BlueSky about this the other day. Someone linked a picture of the intersection: https://bsky.app/profile/pickard.cc/post/3lhtkghk6q224

It is worth noting that this picture is a reply to a screenshot of someone saying the following:

  > I've lived in 8 different states in my life and most roads I've seen do everything they can to prevent human error (or at least they do once the human has shown them what they did wrong). The FSD should not have been fooled this easily, but the environment was the worst it could have been, also.

  - Tweet source: https://x.com/MuscleIQ2/status/1888695047044124989

I point this out because I think the biggest takeaway here is how often people will bend over backwards to reach the conclusion they want, rather than update their model to fit the new data (akin to Bayesian updating, for the math nerds). While this example is egregious, I think we should all take a hard look at ourselves and question where we do this too. None of us is free of resistance to changing our beliefs, yet overcoming it is probably one of the most important things we can do if we want to improve things.

If we have any hope of not being easily fooled by hype, of differentiating real innovation from cons, of avoiding Cargo Cults, then this seems to be a necessity. It's easy to poke fun at this dude, but are we all certain that we're so different? I would like to think so, but I fear making such a claim repeats the same mistake I/we are calling out.


> And rather than being upset with Tesla for selling him a smart dumpster on wheels, the driver took blame for the incident, saying he should have been paying attention.

In all fairness he really should have been paying attention.

You don't get to abdicate your responsibility to Team Elon because reasons. At the end of the day you will be sitting in the defendant's chair while Tesla will just quietly settle out of court.


Abdicate, not advocate.

And just because it's the driver's responsibility doesn't mean it's not also Tesla's.


Honestly, I think the only innocent party here is the light pole.

Just can't win as a light pole https://abc13.com/suspected-drunk-driver-milwaukee-wisconsin...


I never knew of such an outcome. That is probably why it is advised not to agree to anything immediately after an accident.


I think it would be wise to physically test as many corner cases as possible under extreme conditions. At night, in the snow, going down a hill, birds flying across the road at the same moment a baby robot crawls on to it.


This, obviously, is one of the ongoing efforts and goals of Tesla AI, and the reason they collect so much data. There's some talk about it in the Lex Fridman podcast(s)[1].

[1] https://www.youtube.com/watch?v=cdiD-9MMpb0


Fine, that's great and all if you're into that.

But if that's what you need to build a FSD product, then you shouldn't be releasing the existing FSD product onto public streets.


Since Waymo has crashes too, is there a crash-free self-driving company? Is self-driving something that should not be pursued? Is there a way to achieve it without real-world use (requiring attentive drivers, of course)?


Man, why are Tesla fanboys like this... I said /existing/. I do believe Tesla will pull it off, and it's a noble goal; it's just that their track record for safety with this tech has been far from ideal so far. Honestly, it looks like they're going to get away with it, which makes this all a moot point.

But, let's lay out some facts.

First of all, as far as I can tell, Waymo has not had any at-fault crashes! It has had some safety issues, but its crash rate is substantially lower than FSD's per mile. Even if it has had crashes where it was at fault, we're talking 1 or 2 at-fault crashes over 40 million+ miles!

Meanwhile, Tesla's latest and greatest plows into lamp posts in non-adverse conditions. Fantastic.

Second of all, unlike Tesla, Waymo accepts responsibility and liability for crashes caused by its software. This is actually really crucial, since right now Tesla is pushing that onto the drivers, while selling it as a perfectly safe beta piece of software that is "Full Self Driving". Oxymorons all over the place.

> Is self driving something that should not be pursued?

I never said self driving shouldn't be pursued. What I'm saying is that Tesla's FSD implementation should not be used on public roads, as it currently stands.

> Is there a way to achieve it without real world use cases (requiring attentive drivers of course)?

Waymo created test tracks and then logged many hundreds of thousands of miles with human safety drivers actively monitoring the system. This isn't perfect, of course, but they had a substantially better safety record than Tesla.

In summary: the issue isn't if Tesla can do it or not (I'm leaning towards "yes"), or if it's worth doing (it is). The issue is /how/ they're doing it and /who/ they're exposing that risk to.


I'm not interested in, nor do I own, a Tesla. I don't directly own stocks in Tesla. My employment is in no way related to Tesla, but it is in finding corner cases for real world problems.

It's difficult to address your response, since your interpretation of my question is so misaligned with where my question was coming from.

> What I'm saying is that Tesla's FSD implementation should not be used on public roads, as it currently stands.

This is the catch 22 that my questions were framed around.

Note that Waymo (1 accident per 2.7M miles) [1] and Tesla (1 per 6M) [2] have significantly lower accident rates than bare humans (1 per 0.36M), with FSD about 2x lower than Waymo. This suggests that running it as-is will prevent accidents.

[1] https://theavindustry.org/resources/blog/waymo-reduces-crash...

[2] https://www.tesla.com/VehicleSafetyReport (this comparison may not be exactly equal)
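Taking those headline figures at face value (and they are not measured identically, as the caveat on [2] notes), the implied ratios work out as follows:

  # Back-of-envelope with the miles-per-accident figures cited above.
  human = 0.36e6  # ~1 accident per 0.36M miles
  waymo = 2.7e6   # ~1 per 2.7M miles
  tesla = 6.0e6   # ~1 per 6M miles (Tesla's own report)

  print(waymo / human)  # ~7.5x fewer accidents per mile than humans
  print(tesla / human)  # ~16.7x
  print(tesla / waymo)  # ~2.2x, the "2x lower than Waymo" figure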


I'm assuming OP is suggesting that Tesla needs to test these conditions, not the end user on a public road where innocent lives are at risk...

I could be wrong though.


> Tesla needs to test these conditions

What are "these conditions" exactly? Tesla has 1.3 billion miles of data (impossible to collect without crowdsourcing far beyond the number of Tesla employees). There are probably thousands of instances of something very similar to the conditions seen here, but something made this a corner case/failure. Or it's just not possible with the current tech.

You can't opt out of data collection when self driving is enabled. Tesla is aware of every disengagement, with the sensor data categorized and added to their tests, if it's found interesting. They also include synthetic data to manufacture scenarios. They are testing everything they can, and there's a chance this will be added to their tests.

> where innocent lives are at risk

I think a separate permit should be required for self-driving, with that permit easily revoked for attention violations. Luckily, in the eyes of the law, the self-driving doesn't exist: drivers still go to prison if they hit someone.


Yeah, they also really need videos of the inside of your garage, and videos of you with your family in the garage while the car isn't running, apparently.


Was this statement made before or after the driver was contacted by Tesla?


Musk is currently claiming that Tesla will have driverless robotaxis on the road in June 2025.[1]

Be afraid. Be very afraid.

Tesla is in a bind. They've been promising self driving Real Soon Now since 2016, with occasional fake demos. Meanwhile, Waymo slowly made it work, and is taking over the taxi and car service industry, city by city.

This is a huge problem for Tesla's stock price and Musk's net worth. Now that everybody in automotive makes electric cars, that's not a high-margin business any more. Tesla is having ordinary car company problems - market share, build quality, parts, service, unsold inventory. Tesla pretends they are a special snowflake and deserve a huge P/E ratio, but that's no longer the reality.

Tesla doesn't want to test in California because of "regulation". This is bogus. The California DMV is rather lenient on testing driverless cars, and California was the first state to allow them. There was no new legislation, so DMV just copied the procedures for human drivers with a few mods. Companies can get a "learner's permit" for testing with a safety driver easily, and quite a few companies have done that. The next step up is the permit for testing without a safety driver, which is comparable to a regular driver's license. It's harder to get, and there are tests. About a half dozen companies have reached that point. No driving for hire at that level. Finally there's the deployment license, which Waymo and Zoox have. That's like a commercial drivers license, and is hard to get and keep. Cruise had one, but it was revoked after a crash where someone was killed.

That's what really scares Tesla. The California DMV can and will revoke or suspend an autonomous driving license just like they'd revoke a human one. Tesla can't just pay off everyone involved and go on.

Waymos are all over San Francisco and Los Angeles, dealing with heavy traffic, working their way around double-parked cars, dodging bikes, skateboarders, and homeless crazies, backing out when faced with an oncoming truck in a one lane street, and doing OK in complex urban settings. Tesla has never demoed that level of performance. Not even close.

[1] https://www.reuters.com/technology/tesla-robotaxis-by-june-m...


News update: Musk's attempt to buy OpenAI is referred to by an analyst firm as a "distraction" from Tesla's poor performance.[1] "Even with TSLA meeting its June 2025 timeline for driverless cars in (Texas), we still see TSLA as one of several autonomous technology providers, suggesting competition on price and performance."

Tesla's P/E ratio is currently 328. Ford's is around 9; GM's is around 8. Evaluated as a mature car company, TSLA is maybe 20x-30x overpriced compared to the rest of the industry. A hype injection is needed to keep the price up, and Optimus and fake self-driving aren't enough.
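For the arithmetic, assuming a "fair" mature-automaker multiple somewhere in the 10-15 range (an assumption, not a figure from the comment):

  # Rough check of the "20x-30x overpriced" claim.
  tsla_pe = 328
  print(tsla_pe / 15)  # ~22x a generous mature multiple
  print(tsla_pe / 10)  # ~33x a less generous one
  print(tsla_pe / 9)   # ~36x Ford's multiple
  print(tsla_pe / 8)   # ~41x GM's multiple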

[1] https://www.investors.com/news/tesla-stock-distraction-elon-...


The front end of the truck is impressively smashed. It must have been going quite fast?


There isn’t anything rigid in the front, so it’s designed to crumple like that to absorb as much energy as possible.

I think if a car with an engine had hit it, the pole would have been knocked over rather than T-boning the front end.


If the car had a driver in it the pole wouldn't have even been knocked over.


Your defective autopilot will drive you into a post and you will be grateful.


xAI has determined your existence to be inefficient after you said something detrimental to the Tesla share price.


"Big fail on my part, obviously.”

Hello? Whether it's Full Self-Driving or not, it's always your fault.


I've had a 2017 Model X since new that came with FSD. I had Tesla upgrade the FSD computer (for free), and drove like a granny during the FSD trial period, when you needed a certain "score" (mostly dictated by not cornering or braking hard) to be eligible for FSD.

I try it every major release, and am disappointed every time. In situations where I'd be confident, it is overly cautious. In situations where I'd be cautious, it's overly confident and dangerous.

I think its best use is to keep the car in the lane while I'm distracted by something (pulling out a sandwich to eat, etc). And it seems like newer Teslas have eye tracking, so it might not even be useful for that.


Is there a term for when society shifts the frame of reference on every dimension, so that anything that would once have been deemed odd or wrong now naturally ends up positive?


Overton window.

It originated as a political term, but can apply to social norms too.


Overton is, as best I understand it, a one-dimensional phenomenon: shifting the average. Here it's as if everything is slightly rotated on every axis.


Doublethink? Dystopian?


I've seen the tweet before, and the issue I have is: one person is claiming FSD crashed their car. No video, no other evidence.

I'm not saying it wasn't FSD, but it is a possibility FSD wasn't even enabled.


As a general rule, a person who knows the patch level of FSD in their Cybertruck is a person who is about to say something unimaginably stupid.


“Think of how stupid the average person is, and realize half of them are stupider than that.”

  ― George Carlin


Think of how stupid the average person is, and then realize that intelligence seems to be normally distributed, so most people are approximately that smart, with only a small number being much dumber or smarter than that.
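A quick check of that intuition: for a normal distribution, about 68% of the population falls within one standard deviation of the mean.

  import math
  # Fraction of a normal distribution within +/-1 standard deviation of
  # the mean: erf(1/sqrt(2)) ~= 0.683, i.e. ~68% of people are roughly
  # "average", with the tails thinning out fast on either side.
  print(math.erf(1 / math.sqrt(2)))  # 0.6826894921370859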


So a normal distribution is pretty much like a straight graph with some edges/outliers? Learn something new every day :)


I often wonder how far Tesla would have progressed FSD if it wasn’t for the zealous hatred of LIDAR.


Why is this death trap still legal?


“It failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down or turn until it had already hit the curb. Big fail on my part, obviously.”


Social media has made people so insane they actively incriminate themselves for views.


"my significant other repeatedly beat the living shit out of me over and over again... obviously my fault, I deserved it..."


This is a particularly extreme case of ‘I love my Tesla, but…’


I’m mostly impressed that the pole held up.


Exactly my first and 1,345th thought! :)


"Full Self" Driving


Obviously he was... taking a nap.


One spooky sub is https://old.reddit.com/r/CyberStuck

FSD is clearly not even beta quality.

People keep saying it's "trying to commit suicide",

and it's being fixed on the fly at the cost of everyone else's lives.

But now they are removing federal reporting requirements, so buyers will NEVER know.


> removing federal reporting requirements

Got a link for that one?


https://www.reuters.com/business/autos-transportation/trump-...

I couldn't find anything suggesting this has been put in place yet though.



