
"I watched a few hours of early unedited footage posted by Tesla owners who received the new software. The software made a number of mistakes, including two incidents where a Tesla seemed to be on the verge of colliding with another vehicle before the driver intervened."

What, Tesla still can't detect big, obvious obstacles reliably? That's pathetic. A half-dozen cases of running into stationary obstacles at full speed should have taught them something.

Running into obstacles at full speed, with no braking, is quite rare for human drivers. Usually, there's braking, but too late. Mercedes once did a study showing that over half of accidents would not have occurred if braking started about 100ms earlier.
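As a rough back-of-envelope (my numbers, not the study's), here's what an extra 100ms of braking is worth:

    # What an extra 100 ms of braking buys, with assumed numbers:
    # 50 km/h city speed, 8 m/s^2 braking, and an obstacle sitting
    # right at the edge of the full stopping distance.
    v0 = 50 / 3.6                    # initial speed, ~13.9 m/s
    a = 8.0                          # braking deceleration, m/s^2
    delay = 0.1                      # the extra 100 ms of doing nothing

    full_stop = v0**2 / (2 * a)      # distance needed to stop: ~12.1 m
    lost = v0 * delay                # distance covered before braking: ~1.4 m
    remaining = full_stop - lost     # braking room actually left: ~10.7 m

    # residual speed at the obstacle: v^2 = v0^2 - 2*a*d
    v_impact = (v0**2 - 2 * a * remaining) ** 0.5
    print(f"{v_impact:.1f} m/s ({v_impact * 3.6:.0f} km/h) at impact")
    # ~4.7 m/s (~17 km/h): 100 ms is the difference between stopping
    # in time and a real collision.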

Tesla's approach means the driver has nothing to do until something goes wrong. Then they have to react in under a second. That just doesn't work. That's been known for decades.

Great video on cockpit automation: "Children of the Magenta" (1997).[1] It's an American Airlines chief pilot talking to his pilots about automation-induced accidents. The aviation industry started dealing with over-reliance on imperfect automation a long time ago.

[1] https://vimeo.com/159496346



Instead of trusting this anecdote, why don't you jump over to YouTube and search for "Tesla FSD" to see what it really looks like? There are lots of very instructive videos from people in the beta program.


Exactly. There's real-world footage of the vehicle's behavior, alongside Tesla's on-screen visualization of the vehicle's current understanding of its environment.

It's rather cheap to take trivial shots at Tesla (no matter how bad their product might be) by comparing it to a 23-year-old understanding of human-machine automation in an entirely different context such as aviation.

Unfortunate that you're downvoted, but apparently even asking people to watch some videos of real-world usage [1] before making up their minds is too much.

[1] https://www.youtube.com/results?search_query=tesla+fsd+beta&...


Can the footage be trusted? From my understanding, Tesla has only rolled this out to a small number of users, and browsing the YT search results you've provided, I see that almost all of the results are Tesla superfan accounts.

I don’t think it’s unreasonable to disregard this until you see an objective assessment in the field.


There is an NDA. One tester has said that it [at least in part] says "you can't livestream drives"[0]. Other details of the NDA are sparse, but given the many close calls, the bad-performance situations, and the multiple bursts of rapid uploads [1,2], chances are they aren't screening videos.

0: https://youtu.be/AkexMo_jdcQ?t=346

1: https://i.judge.sh/enormous/Flim/chrome_RIGtEw3oE3.png (Chuck Cook)

2: https://i.judge.sh/sturdy/Derpy/chrome_NZrYtjHAeP.png (Tesla Owners Silicon Valley)


It's fair to question the motivations of the FSD beta testers. They were obviously hand-picked: for maximum safety, for an adequate public profile, for having committed large sums of money to prove their allegiance, and for being existing fans of the company. So they're likely to be biased.

But that's a long way from alleging that they've doctored their videos, or that Tesla itself has censored them. There are many, many instances of footage showing near-crashes that were avoided only because the driver intervened. If you were the censor, would you allow those?

Compare that to Waymo or Cruise or anyone else. They are obviously heavily censoring and have no independent owners recording footage, nor sharing it like this. It's a huge step up in much needed transparency.


> Compare that to Waymo or Cruise or anyone else. They are obviously heavily censoring and have no independent owners recording footage, nor sharing it like this. It's a huge step up in much needed transparency.

You clearly haven't searched YouTube. https://www.youtube.com/c/JJRicksStudios is one channel that frequently posts videos of Waymo rides. There are others if you search, though not at the same volume as Tesla.


What you need to know about the performance of various autonomous vehicles is not observable through a bunch of YouTube videos. The improvements are happening at the statistical margins; that's where 99.9% of the work is. YouTube videos won't show the difference between Google's cars seven years ago and today. You need granular statistical data to make any kind of informed judgment.
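A quick sketch of why (with assumed failure rates, not real data):

    # Why hour-long videos can't distinguish systems whose serious
    # failure rates differ by 10x. The rates below are assumptions
    # for illustration only.
    avg_speed_mph = 30           # assumed typical city driving speed
    rate_a = 1 / 10_000          # system A: one serious failure per 10k miles
    rate_b = 1 / 100_000         # system B: one per 100k miles

    for name, rate in [("A", rate_a), ("B", rate_b)]:
        miles_per_failure = 1 / rate
        hours = miles_per_failure / avg_speed_mph
        print(f"system {name}: ~{hours:,.0f} hours of footage per failure")

    # system A: ~333 hours per failure; system B: ~3,333 hours.
    # Either way, a handful of hour-long videos will typically show
    # zero serious failures, which tells you nothing about a 10x
    # difference. Only aggregate disengagement data can separate them.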

That Tesla is only now getting to the point where it can manage a few miles in moderately complex traffic without fucking up, while many other videos still show failures and close calls, should tell you that Tesla has a long way to go. There are tens of thousands of driving scenarios that all need special attention.


Yes. Check out this one.[1]

Watch the display for what it is and isn't seeing. Awareness of oncoming traffic is almost nonexistent, except when the FedEx truck is reflected in some windows; then it detects the reflection as a vehicle. Although the streets are almost empty, the driver takes over frequently.

This is way below what Waymo and Cruise can do.

[1] https://youtu.be/6G1Z2J3WUSg?t=231


I've always thought that before self-driving can get anywhere close to complete, it should be just as easy and safe to drive by looking only at the Tesla screen. If the vehicle's sensors can't pick things up, the vehicle can't react to them.


Look closer. The oncoming traffic is detected; the bright yellow visualization on the dashboard screen is just difficult to make out in the video.


At 4:10, the delivery truck isn't seen until it has mostly pulled out. On the right at the traffic lights, it can't see humans.

The traffic passing at the intersection is missed most of the time.

At 5:22, it drives onto the wrong side of the road into stationary traffic.

That is amateur stuff right there.


I did look closer. If you bring the image into a Photoshop-like program and adjust the levels, you can see a yellow square where one oncoming car is. You can also see yellow squares elsewhere. It's a compression artifact.
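If anyone wants to check this themselves, a quick levels stretch does roughly what Photoshop's levels sliders do (the filename here is hypothetical; grab any screenshot of the video):

    # Stretch the levels of a frame so faint yellow UI elements stand
    # out from compression noise. Uses Pillow; "fsd_frame.png" is a
    # hypothetical frame grab from the video.
    from PIL import Image, ImageOps

    frame = Image.open("fsd_frame.png").convert("RGB")

    # autocontrast clips the darkest/brightest 2% of pixels and
    # stretches the rest, approximating a manual levels adjustment.
    stretched = ImageOps.autocontrast(frame, cutoff=2)
    stretched.save("fsd_frame_levels.png")

    # If the "detection" square shows up in several unrelated spots
    # after stretching, it's compression noise, not perception output.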


That is terrifying. The next 60 seconds after that are crazy bad.

"That is not ideal"...

We need to legislate this before more people die.


Nothing scary happens in the next 60 seconds. The car obviously wouldn't go into the opposite lane if there were traffic, and given a few more seconds it would return to the right shoulder. The driver is jumpy and takes over before the car has time to correct anything.


what?!

First, I don't think it's at all obvious that the car wouldn't go into the opposite lane; it doesn't even see all the cars, as OP pointed out, and Teslas have rammed stationary solid objects at full speed.

Being able to 'see' standard lane markings is such a basic requirement!

The car shouldn't have to correct.

If this were a student driver during a test, I'm guessing they would fail. It's been a while and I don't remember the scoring, but there are multiple really bad mistakes here.


Teslas have been on the road for years and there are zero occurrences of one turning into live traffic. You're underestimating the state of its self-driving abilities; there are no visible lane markings at that point in the video, and it's an error you might see a human make.


"It decides to go in the opposite lane. That is not ideal."

Twice in a row. At two different left turns. Yeah, not ideal.


Is this really all that important? There are three other cars ahead of the Tesla at this red-light intersection. If the system can react within tens or hundreds of milliseconds to a changing environment, it would actually make sense to reduce compute while stopped at a light surrounded by other vehicles, saving power for when the vehicle is in motion.

Edit: It seemed to handle that intersection just fine, but a couple of turns later it ends up on the wrong side of the road, which is quite bad. That said, what proof do you have that Waymo/Cruise would be any better at this?


Here's an hour of a Cruise vehicle driving in San Francisco.[1] Watch it deal with a trash bag that fell out of a dumpster, a left turn across heavy opposing traffic, double-parked cars, cars pulling out of parking spaces, a FedEx truck changing lanes across their path... Makes Tesla look like amateur hour.

[1] https://youtu.be/EZ75QoAymug?t=122


Since they rely on pre-mapping, that’s the equivalent of a carnival ride on rails and not really comparable to what Tesla is doing (unsupervised training on billions of unknown road miles).



Being able to detect oncoming traffic seems pretty damned important to me! If it’s not detecting vehicles across the line, what assurance do I have that it’ll detect vehicles that cross the line?


What you're describing is a capability that even most commercially available automatic emergency braking systems don't handle well. Several studies have pitted the half-dozen or so cars with AEB against each other, and they've all performed about the same. Apologies for the lack of citations since I have to run, but we should at least compare Tesla to the rest of the industry, or judge its actual AV solution (FSD, which has been in limited beta for only a month or two).



