The False Promises of Tesla’s Full Self Driving (theverge.com)
35 points by ra7 9 months ago | 15 comments



It’s really unfortunate how much impact these promises have had. A lot of people believe them, even plenty of technical people I know. Lots of people [wrongly] believe that Teslas are self-driving and that Tesla is the leading company in the field.


And just who do you believe is the leader? You've never [recently] been in a Tesla with FSD, have you? Tesla FSD is amazing.


Waymo and Cruise make a car autonomous, allowing it to drive with no driver in the seat for tens of thousands of miles, on average, between incidents.

Tesla FSD requires a driver in the seat to be operated safely and, as stated on Tesla’s own website [1], explicitly does not make the car autonomous. Tesla deliberately does not label FSD as an Automated Driving System (ADS), which lets it avoid mandatory NHTSA [2] and CA DMV [3][4] incident reporting requirements, and Tesla does not have and has never had a driverless testing permit. For that matter, although Tesla does hold a permit for ADS testing with a safety driver, it has not reported any such testing in years.

In addition, Tesla FSD is lucky to go a few tens to low hundreds of miles between safety-critical interventions (hard to get reliable numbers since Tesla does not report official numbers). Waymo and Cruise with a safety driver average tens of thousands [5]; literally 100x-1000x better.
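
To make the order-of-magnitude claim concrete, here is a back-of-envelope sketch (Python); the specific per-mile figures are illustrative assumptions drawn from the ranges above and from [5], not official numbers:

    # Back-of-envelope comparison of miles per safety-critical intervention.
    # Both figures are assumptions: Tesla publishes no official numbers, and
    # the Waymo/Cruise figure is the order of magnitude reported in [5].
    tesla_fsd_miles_per_intervention = 100    # assumed: "tens to low hundreds"
    waymo_miles_per_disengagement = 17_000    # assumed: "tens of thousands"

    ratio = waymo_miles_per_disengagement / tesla_fsd_miles_per_intervention
    print(f"Waymo goes roughly {ratio:,.0f}x farther between interventions")
    # 17,000 / 100 = 170x; sweeping the assumptions across the stated ranges
    # spans roughly 30x to 10,000x, which is where the 100x-1000x claim sits.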

This analysis also ignores qualitative differences in ability, like how Tesla FSD still cannot recognize basic signs such as “Do Not Enter” and “One Way”. It lacks even basic functionality.

[1] https://www.tesla.com/support/autopilot

[2] https://www.nhtsa.gov/laws-regulations/standing-general-orde...

[3] https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...

[4] https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...

[5] https://thelastdriverlicenseholder.com/2023/02/17/2022-disen...


Cruise/Waymo precisely map out their driving areas, and if anything changes too much, the cars shut down. A TV news crew took a ride in one, and the car parked diagonally in the road; they had to wait 20 minutes for the company to send out a human driver.

Tesla is attempting something different: driving anywhere, just like a human can drive anywhere. It's a more difficult problem so it takes longer.


Tesla FSD is not autonomous anywhere. It operates 100x-1000x worse than Waymo and Cruise, which are themselves still a factor of 10x-100x short of safe general commercial deployment.

Saying that Tesla is trying to solve a harder problem when they are a factor of roughly 10,000x (the product of those two gaps) from solving the “easier” ones is the height of wishful thinking.

Besides, Waymo and Cruise are solving the general problem. The baseless assertion that driving the breadth of a city like San Francisco or Phoenix is somehow not representative of general driving is ludicrous on its face. Humans who learn in one city generally have the ability to drive in most other cities. The skills are largely transferable.

Waymo and Cruise take a slow and measured approach, validating in a well-defined and constrained domain, not because they could not operate everywhere 1,000x better than Tesla FSD, but because allowing a system only 1,000x better than Tesla FSD on the road without careful supervision is fucking criminal.


...or forever?


It's already good enough to make me happy.


I haven't, although I've watched a good number of videos from recent betas.

In terms of traditional automakers and systems that anyone can buy, Tesla is the leader (although I believe Hyundai has a very good highway system, though not city). But in terms of self-driving, they're hard to rank precisely; they're certainly nowhere near, say, Waymo.

Plenty of people enjoy FSD and understand that they still have to correct it every so often, but to a lot of people "self-driving" means just that.


Twitter is full of videos of FSD drives and they seem to work very well now. You can see the software improvements from older drive videos on the same channel (running the same hardware).

I think it's important to look at the current state of FSD beta, not the past state from more than a year ago.


But they still aren't anywhere close to Level 3 self-driving (let alone Level 5, which Elon promised). What's the point of self-driving if you have to constantly monitor it?


Constantly monitoring FSD is nowhere near as much work as driving. It's also 5x safer than a human alone.


There is no auditable evidence that it is 5x safer than a human. The 5x figure is self-reported by the Tesla marketing department [1].

This “safety report” consists of three unverified “miles per crash” ratios with no underlying raw data. They do not even publish the raw number of “miles” or “crashes” used in their analysis.

It is a “safety report” where they cannot even be bothered to publish the numerator or the denominator, let alone their methodology, analysis, control of variables, or accounting for bias. No self-respecting scientist or engineer would be caught dead issuing such a deficient report.
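
For contrast with what even a bare-minimum report would contain, here is a minimal sketch (Python, with hypothetical numbers, since Tesla publishes none): the raw counts plus some statement of uncertainty.

    import math

    # Hypothetical counts -- Tesla does not publish these.
    crashes = 200                # numerator: crashes in the reporting period
    miles = 1_000_000_000        # denominator: miles driven in that period

    rate = crashes / miles * 1_000_000           # crashes per million miles
    # Crude 95% interval via a normal approximation to the Poisson count.
    half_width = 1.96 * math.sqrt(crashes) / miles * 1_000_000
    print(f"{rate:.2f} +/- {half_width:.2f} crashes per million miles")

Even this sketch says nothing about road type, weather, fleet age, driver demographics, or when the system hands control back, i.e. the confounders a real methodology section would have to address.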

In contrast, here is the report by Waymo on their first million driverless miles [2]. As you can see, it follows the basic structure of a research paper. It offers a detailed description and analysis of every individual crash. The difference in reporting is the gap between a grade school report and a proper research paper.

It is an abomination that Tesla is allowed to publish such unsupported and misleading “statistics” just so they can more effectively market an incomplete and unsafe product to unsuspecting consumers.

[1] https://www.tesla.com/VehicleSafetyReport

[2] https://storage.googleapis.com/waymo-uploads/files/documents...


Tesla was informed in August 2022 that it would run down children in the road [1].

Tesla was informed in November 2022 that it would ignore “Do Not Enter” and “Road Closed” signs [2].

Tesla was informed in November 2022 that it would ignore school bus stop signs [3].

Tesla was subsequently informed in November 2022 it would still run down children in the road [4].

Tesla was subsequently informed in February 2023 that it would still do all of these things [5].

In March 2023, a North Carolina student exiting a school bus with its stop sign extended was hit by a Tesla operating in ADAS mode [6]. Tesla confirmed ADAS mode usage in the NHTSA SGO database under report ID 13781-5100 [7].

These safety defects continue to be present as noted in July 2023 [8].

They continue to be present, along with critical deficiencies in the Tesla DMS (driver monitoring system), as noted in August 2023 [9].

FSD is critically deficient and has seen no meaningful improvement in critical, showstopping deficiencies for months, some of which have caused demonstrable, avoidable harm. This is unacceptable for a safety-critical product.

[1] https://dawnproject.com/the-dawn-projects-new-advertising-ca...

[2] https://dawnproject.com/new-dawn-project-safety-tests-reveal...

[3] https://dawnproject.com/tests-reveal-that-tesla-fsd-will-dri...

[4] https://dawnproject.com/new-dawn-project-safety-tests-show-t...

[5] https://dawnproject.com/watch-all-angles-of-the-dawn-project...

[6] https://www.wral.com/howard-gene-yee-51-was-driving-a-2022-t...

[7] https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/SGO-2021-01_In...

[8] https://dawnproject.com/the-dawn-project-publishes-dangers-t...

[9] https://dawnproject.com/dawn-project-video-shows-teslas-driv...


Great, quote the Dawn Project. You know that's a failing Autopilot competitor that's faking videos. Look it up on YouTube; there are analysis videos showing that the Dawn Project is lying.


They do not make an Autopilot competitor. Lie number 1.

They did not fake the videos.

The original claim that FSD was not enabled was a lie made up before the raw video was released. As soon as the raw video was released, proving it was enabled, they were forced to recant in the face of the video evidence of their lies. Lie number 2.

The Tesla content creators then claimed the steering wheel was being moved because hands were on the wheel (even though Tesla says this is how it is meant to be used). The video does not clearly show wheel torque, so they were able to lie with impunity. Subsequent videos then showed hands clearly off the wheel before impact, making it impossible for the Tesla content creators to lie about that. Lie number 3.

The Tesla content creators then claimed the accelerator was being pressed because the accelerator was not on video, allowing them to lie with impunity. Subsequent videos then showed the accelerator in frame with the foot clearly off, making it impossible for the Tesla content creators to lie about that. Lie number 4.

The Tesla content creators you point at are habitual liars who consistently and deliberately lie about what is not visible, only to be repeatedly proven wrong through more thorough video documentation, while the Dawn Project has been shown to be truthful and accurate in all cases. Truly, sunlight is the best disinfectant.



