
This looks like the vehicle tried and failed to get the driver to engage and pulled over as best it could. There is no shoulder there so…

Let’s wait to get more info as to what the driver was doing and if they were incapacitated.

The FSD beta is pretty aggressive about making sure the driver is paying attention, via the steering sensors, the in-cabin camera, and the touchscreen.



> if they were incapacitated.

You can see them get out of the vehicle and walk around a bit in the video[0], so I guess they were not incapacitated - see around the 46s mark in the video. The driver also claimed they were using FSD at the time the car performed the maneuver.

[0] https://www.reddit.com/r/teslamotors/comments/108kpgo/footag...


another angle: https://twitter.com/kenklippenstein/status/16128488720611287...

The driver was on I-80. Unless this is a Tesla employee, Tesla FSD is not enabled on interstate highways at this time; when you're on a major no-stops, no-traffic-light highway, it defaults to Autosteer+TACC, with the driver's subscription likely only enabling the auto-lane-change feature. All this means the likely culprits are either TACC stopping on its own as if there were a car there, or the AEB system performing an emergency stop to avoid hitting a (phantom) person or car; but based on the footage, it looks like a TACC failure, since it doesn't look like the kind of emergency stop you'd see in IIHS videos[0].

0: https://youtu.be/liQxaeGPHJg


FSD vs navigate on autopilot is a pretty small technical detail on the highway. Once you turn on FSD beta, from the driver's perspective FSD and autopilot are the same thing - turned on with the same input (double tap on the right stalk), with the same stuff shown on the display.

Navigate on autopilot can change lanes on the highway and follow highway exits without user intervention.

IMO when someone says they're using "FSD" they mean they're subscribed/paid for FSD, have the FSD beta turned on, and are letting the car drive itself (whether that's FSD on city streets, or navigate on autopilot on the highway.)


> turned on with the same input (double tap on the right stalk), with the same stuff shown on the display.

Small nit, but this isn't true. FSD will show the newer, much more detailed preview. When switching to highway mode it will revert to the old, plain grey lines preview.


> pulled over as best it could

If that is its best then FSD has no business being on the road.

It was erratic and didn't clearly signal to other cars what it was trying to do.


IMO the car signaled, changed lane and stopped without slamming the brakes. It didn't collide with anything; the pileup was formed by human drivers driving too close and too fast for their reaction time. If it had been a human driver who saw a kitten or a rock in the road, they would probably have slammed the brakes.

I'm not defending Tesla's FSD, which I think is inferior because of Tesla's reluctance to use LIDAR in addition to cameras. Current car AIs are imperfect; they need as much info as they can get if we expect them to drive more safely than humans.


If it caused a crash, it stopped too quickly. Unless there was an immediate, life-threatening need to stop, it should not have stopped at all and should have continued until it was safe to do so. Any halfway competent driver could plainly see there is no safe place to stop there, and you should never stop there unless your car is literally on fire or the road bed is missing.


It also stopped in a bad spot for human eyes. From the rear-facing shot, it looks like it was just past the transition from bright light into a dark tunnel. Human eyes need a bit to adjust to that. It's quite possible, especially as the backup grew, that people literally couldn't see the brake lights.

That's not to excuse anyone, but it may have been a contributing factor.


I've seen videos of some other system (Mercedes? VW?) stopping when the driver had cruise control on but wasn't paying attention.

It put on its hazards and gradually slowed to a stop. Nothing fast or abrupt. Easy for any driver paying attention to recognize and be ready for.

It was nothing like what happened in this video.

Personally I doubt this was the car trying to stop safely due to driver inattention. It seems more likely to be an Autopilot/FSD bug, or phantom braking all the way to a stop at the worst possible time.

Or terrible driving by the driver if they had been in full control and are lying about FSD.


A week ago, I Turo'd a Model Y for a week. We were driving around at 35 mph on a two-way road with a bike lane.

We were going past a bicyclist in their lane appropriately. I had Autopilot engaged. As we passed the cyclist, they moved over towards us slightly, but never left the bike lane.

The car suddenly braked to a full stop, and it was a pretty quick stop.

I can understand that the car may have thought the cyclist was going to move in front of it, but the scenario wasn't one in which a human would have slammed the brakes. My wife commented that we're lucky there weren't cars behind us, because they might not have reacted as quickly.

To me, this shows the potential danger of these safety systems' false positives. Not that I won't continue to use the systems; I believe that overall they make me safer than I'd be without them.

But as always, the danger isn’t the car I’m in, it’s the one driven by another driver who’s not paying attention.


Stopping in the left lane of the Bay Bridge where the lanes are narrow and there is no shoulder is extremely dangerous. You can see the predictable results.

If the sole problem was that the driver wasn't paying attention, the least it could have done would have been to put on the hazards and very slowly come to a stop. It would have been even better if it simply continued until there was a safe shoulder to stop on. The chance of having an accident by stopping on the bridge is much higher than just driving on autopilot to the end of the bridge and then pulling over.

If the problem is that this would create a moral hazard where drivers stopped holding onto the steering wheel when on bridges with no shoulder, the car could play loud obnoxious noises inside the car when this happened.
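
For what it's worth, here's a rough sketch in Python of the kind of fallback behavior I mean (all names and actions are hypothetical illustrations, not anything Tesla actually ships): escalate the alarms and hazards immediately, but only pull over and stop once there is actually a safe place to do it.

    # Hypothetical sketch of a driver-unresponsive fallback; not Tesla's actual logic.
    from dataclasses import dataclass

    @dataclass
    class VehicleState:
        speed_mps: float          # current speed, metres per second
        driver_responsive: bool   # wheel torque / cabin camera says driver is attentive
        shoulder_available: bool  # map or perception says a safe stopping spot exists

    def fallback_step(state: VehicleState) -> list[str]:
        """Actions a fallback controller might take for one control step."""
        if state.driver_responsive:
            return ["continue_normal_operation"]
        # Driver unresponsive: make the situation loud and visible first.
        actions = ["sound_interior_alarm", "enable_hazard_lights"]
        if state.shoulder_available:
            # Somewhere safe to stop: pull over and decelerate gently.
            actions += ["change_lane_toward_shoulder", "decelerate_gently"]
        else:
            # No shoulder (e.g. a bridge): keep moving with traffic at reduced
            # speed rather than stopping in a live lane.
            actions += ["hold_lane", "reduce_speed_slightly"]
        return actions

    # Unresponsive driver, no shoulder: keep going, don't park in the lane.
    print(fallback_step(VehicleState(25.0, driver_responsive=False, shoulder_available=False)))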


Right, it's dangerous, but it's equivalent to a flat tire or something else causing any other car to stop. I only skimmed the video once, but it looked like the car slowed down quite a bit, and as others mentioned, a car slowing in front of you, even in a congested area, doesn't excuse others from following too closely, going too fast, or not paying attention, because something could happen to any car that causes it to stop somewhat quickly.

I was watching the video, and the whole time I kept thinking that if I saw my car slowing down for no reason (and my Autopilot, not FSD, has slowed down ("ghosted") before), I would have immediately taken control. Why didn't the driver do that?

> the car could play loud obnoxious noises inside the car when this happened.

I'll be interested to hear what happened there, but I know that if you don't toggle the wheel or take action when on Autopilot, the car will flash blue and beep at you to take control, and if you keep ignoring it, it'll disengage Autopilot until you stop and park the car.

I haven't used "FSD" much at all, so I'm not 100% familiar with it, but there seems to be a host of problems here, and the software powering "FSD" is but one of them. It saddens me, though, that people aren't asking the best question here, which is: why are we driving everywhere in the first place?


> IMO the car signaled, changed lane and stopped without slamming the brakes

Which is erratic and reckless.

If it was an emergency situation, it should have put on its hazard lights, waited until all surrounding cars had passed, then changed lanes and decelerated very slowly until it stopped.

All while aggressively alerting the driver to what is happening.


Right, even if someone is incapacitated, putting the lives of that person and others on the road in danger is still not the right move. It doesn't matter what the driver is doing; any driving system that puts other people on the road in danger does not belong on the road.


Other systems will also stop if no driver input is detected. They slow down much more gradually and turn on the hazards so other drivers can react safely.


Because this is the first car pileup we've ever seen.

Interesting: because a computer was involved, we are incensed. "No business being on the road"... but for the millions of other car crashes, it's "oh yeah, well, humans make mistakes!"

Tesla's claim of being "safer than human drivers" is as much a testament to their technology as it is to how poor human drivers are.



