
That's not an unreasonable conclusion, but it's a lot messier than that.

The real issue is comparing like miles to like miles: Autopilot miles are supposed to be mostly highway miles in good conditions, which is (probably) when the fewest accidents per mile occur. But a breakdown of accidents by speed, weather, traffic, etc. doesn't exist, or at least I'm not aware of one. It's further complicated by demographics: older, more affluent drivers - the kind likely to buy a Tesla - are safer drivers to begin with. And it's confounded by the fact that Tesla is not, in my opinion, a trustworthy company: they push OTA updates without warning owners, which can reintroduce old bugs (https://www.techspot.com/news/79331-tesla-autopilot-steering...). A lack of regression testing for a safety-critical system is just terrifying.
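To make the mileage-comparison pitfall concrete, here's a toy calculation. All the numbers are made up purely for illustration; the point is that two drivers with identical per-road-type crash rates can show very different aggregate rates just because of where their miles accumulate (Simpson's paradox):

```python
# Toy illustration: aggregate crashes-per-mile can mislead when the
# road mix differs. All figures below are hypothetical.
# Units: miles in millions, crashes as raw counts.
human = {"highway": {"miles": 50, "crashes": 100},   # 2.0 per M mi
         "city":    {"miles": 50, "crashes": 400}}   # 8.0 per M mi
autopilot = {"highway": {"miles": 95, "crashes": 190},  # 2.0 per M mi
             "city":    {"miles": 5,  "crashes": 40}}   # 8.0 per M mi

def overall_rate(breakdown):
    """Aggregate crashes per million miles across all road types."""
    miles = sum(v["miles"] for v in breakdown.values())
    crashes = sum(v["crashes"] for v in breakdown.values())
    return crashes / miles

print(overall_rate(human))      # 5.0 crashes per million miles
print(overall_rate(autopilot))  # 2.3 crashes per million miles
```

Both drivers here crash at exactly 2.0 per million highway miles and 8.0 per million city miles, yet the aggregate makes "Autopilot" look more than twice as safe, solely because its miles are concentrated on highways. That's why the headline miles-driven comparison proves very little without the per-condition breakdown.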

Now admittedly, you came back with a reasonable response and I'm throwing a litany of "yeah, but" rebuttals at you. Do I believe Tesla Autopilot, used properly, has the potential to make driving safer in certain situations? Probably. The main problem is the human element: making sure drivers actually monitor the car, informing them accurately about what Autopilot can and cannot do, and so on. There are also issues with how Tesla validates the technology, not just how it improves it. It's the gross overpromising (honestly, I believe it's probably fraud, but I can't be sure) that makes me despise Tesla as a company. I can admit they make a product a lot of people like, but I think a lot of those people like it because they're misinformed.



