Hacker News

AP3 is not going to be nearly good enough. It's just the next step in keeping the "we sell full self-driving hardware as a feature, but it's not ready yet" promise alive. It's not years away, it's decades away. And for a frame of reference, the computers (the actual processors, RAM, storage, etcetera) in cars like Waymo's and Cruise's cost more than the cars themselves, so it's not an insignificant cost. That's completely ignoring the lidar issue and the fact that the state of the art just isn't good enough even when you have an extra $100k in hardware, have spent billions training your models, and have remote operators to help you out of sticky situations.

But yeah, okay, Elon Musk said it's going to happen. He also said they were going to have it years ago and we're still waiting on that cross-country trip from New York to Los Angeles that never materialized.




We know how it ends though.

We'll see them roll out an error-prone stop-sign and traffic-light detection as the FSD beta, and everyone will act like that was what was promised. Just like lane-keeping assist is "Autopilot".


> We know how it ends though.

I realize this is a tangent, but I feel like it ends with Tesla going bankrupt before they ever come close to building a self-driving car, or even before the narrative wears off for the general public. I believe a criminal investigation for negligent manslaughter should be brought against Tesla over the fraudulent advertising and the consequential deaths, but I have no expectation of that ever happening.

I just get really angry when people lie, then people die, and nobody does anything about it. As another commenter pointed out, Elon Musk himself was still demonstrating the technology as completely autonomous just a few months ago.


I guess others don't agree, but I do. People are dying while using Autopilot, deaths that are likely preventable, and yet very little has been done about it. What's even worse is that they're advertising safety claims based on false data, and you have to fight through the courts just to get access to the document containing the redacted data.

https://www.wired.com/story/tesla-autopilot-safety-statistic...

But it seems nobody cares.


That article says:

"The upshot is that Autopilot might, in fact, be saving a ton of lives. Or maybe not. We just don’t know."

...which doesn't support the claim that banning Tesla's autopilot would be a net positive.

Maybe the effect is zero-sum: banning Autopilot would prevent the deaths where Autopilot did wrong, but other people would die in situations where Autopilot would have saved them from a crash.

If Tesla's Autopilot is demonstrably net negative, then yes, it should be banned, and banning it would save lives.

But if it's currently zero-sum, we should absolutely allow it, and allow it to be improved, because improvements to the system will probably tip it over into net positive.

It is absolutely concerning that the company is trying to spin the tech as already being net positive without any clear evidence of that, yes, I agree. If the tech were net negative and they were trying to cover that up, that would be even worse. But that's not the position we're in.


You are correct that the data is redacted, but I think it's reasonable to assume it's pretty bad for Tesla. If the data painted Tesla in a favorable light, they would be shouting it from the rooftops. They would do everything they could to let everybody know: "here is the data proving we are safer."

Instead they are burying it behind legal procedures and doing everything they can to make sure nobody knows what the actual data says.

I'll admit I don't know anything with certainty. But it's more than just a guess that Autopilot is not what it's advertised to be.


Wikipedia lists 3 Tesla Autopilot deaths since January 2016. In that time, about 4 million people have died in road accidents caused by humans. I care about the deaths, but 4,000,000 > 3.


This is an absurd misinterpretation of the data. What matters is miles driven per death, not the raw number of deaths. Did you know more people die driving each year than from having bullets implanted in their brains? And yet I'd still rather get in a car than shoot myself in the face.
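To make that concrete, here is a minimal sketch of the normalization step. Every number in it is a made-up placeholder (neither the mileage figures nor the rates are real Tesla or NHTSA data); the only point is that raw death counts must be divided by exposure before you can compare them.

```python
# Illustrative sketch only: raw death counts mean nothing without exposure.
# Every figure below is a made-up placeholder, not real Tesla/NHTSA data.

def deaths_per_billion_miles(deaths: int, miles: float) -> float:
    """Normalize a death count by miles driven (the exposure)."""
    return deaths / (miles / 1e9)

# Assumed, purely illustrative figures:
human_deaths, human_miles = 4_000_000, 3.0e13    # many drivers, many miles
autopilot_deaths, autopilot_miles = 3, 1.0e9     # small fleet, few miles

human_rate = deaths_per_billion_miles(human_deaths, human_miles)
autopilot_rate = deaths_per_billion_miles(autopilot_deaths, autopilot_miles)

# Comparing 4,000,000 to 3 tells you nothing; comparing the two rates
# (deaths per billion miles) is at least the right units.
print(f"human: {human_rate:.1f}/bn mi, autopilot: {autopilot_rate:.1f}/bn mi")
```

With different (real) mileage figures the comparison could come out either way, which is exactly why the per-mile rate, not the count, is the number to argue about.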

If you'd actually care to educate yourself, you can start by reading the article I linked in my original comment.


The article seems to say maybe the rate dropped 40% or 13%, but

>Now NHTSA says that’s not exactly right—and there’s no clear evidence for how safe the pseudo-self-driving feature actually is.

Which doesn't seem so terrible. Personally I'm optimistic that as the systems get better they can roll them out to other cars and make a dent in the 1.3m/year global deaths.


That's not an unreasonable conclusion, but it's a lot messier than that.

The real issue is comparing miles driven to similar miles driven: Autopilot miles are only supposed to be highway miles in good conditions (which is when the fewest accidents occur... well, probably). But a breakdown of accidents into categories such as speed, weather, and traffic does not exist (or at least I am unaware of one). It's further complicated by demographics: older, more affluent drivers, the kind likely to buy a Tesla, are safer as well. And it's confounded by the fact that Tesla is not a trustworthy company, at least in my opinion; they will push OTA updates without warning owners, which can reintroduce old bugs (https://www.techspot.com/news/79331-tesla-autopilot-steering...). A lack of regression testing for a safety-critical system is just terrifying.
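The mileage-mix problem above is a textbook Simpson's paradox, and a toy sketch shows how it bites. Every figure here is invented purely for illustration: if Autopilot miles are concentrated on highways (the safest miles), its pooled rate can look better than human drivers' even when it is worse within every road type.

```python
# Toy Simpson's-paradox sketch; every figure here is invented.
# (deaths, miles) per road type -- hypothetical numbers only.
autopilot = {"highway": (8, 1.0e9), "city": (4, 1.0e8)}
human     = {"highway": (5, 1.0e9), "city": (30, 1.0e9)}

def rate(deaths, miles):
    """Deaths per billion miles."""
    return deaths / miles * 1e9

def aggregate_rate(strata):
    """Pool all strata, then compute one overall rate."""
    total_deaths = sum(d for d, _ in strata.values())
    total_miles = sum(m for _, m in strata.values())
    return rate(total_deaths, total_miles)

# Within each road type, the hypothetical Autopilot is WORSE:
for road in ("highway", "city"):
    assert rate(*autopilot[road]) > rate(*human[road])

# ...yet pooled together it looks BETTER, purely because its miles
# are almost all easy highway miles:
print(aggregate_rate(autopilot), aggregate_rate(human))
```

This is why an aggregate "Autopilot miles are safer" statistic proves nothing until the per-condition breakdown is published.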

Now admittedly, you came back with a reasonable response and I am throwing you a litany of "yeah, but" rebuttals. Do I believe Tesla Autopilot has the potential, when used properly, to make driving safer under certain conditions? Probably. The main problem is the human element: making sure drivers are actually monitoring the car, informing them correctly of what Autopilot can and cannot do, etcetera. There are also issues with how Tesla not only improves the technology but validates it. It's the gross overpromising (honestly, I believe it is probably fraud, but I cannot be sure) that makes me despise Tesla as a company. I can admit they make a product a lot of people like, though I think many of them like it because they are misinformed.


Facebook: move fast and break things

Tesla: drive fast and break people.



