Hacker News
Regulator reports 'critical safety gap' in Tesla Autopilot linked to crashes (cnbc.com)
31 points by belter 24 days ago | 9 comments



It seems like the regulators have finally caught up to these systems. I'm starting to wonder if these Level 2 systems can even be made safe. You're effectively trying to engineer around human behavior, and there's no guarantee you can get people to use them safely.


I remember them talking about this after the Uber crash - that there’s just no way for a person to remain engaged after hours of driving without incident. Although I think she was watching TV so…


Reading this morning that China is going to let Tesla do FSD there. It's a great testing ground because when people start using it, you can get whatever results the Chinese government wants to get. Good results get kept in, crashes are a result of something other than FSD and don't get reported in the data. Win-Win!


Well, by definition their entire purpose is to reduce the attention needed for driving, which is an invitation for distraction.

I’d argue a distracted driver is better off on Autopilot than without it, but there are more cases of abuse, and I doubt anyone has measured and compared net outcomes. Ergo it’s yet another moral panic about rocket-man-bad.


Well Tesla has four months to solve it...

"Elon Musk announces Tesla will unveil a ‘robotaxi’ on August 8" - https://edition.cnn.com/2024/04/05/business/elon-musk-tesla-...


Am I reading the article correctly that the "critical safety gap" is the fact that Autopilot allows people to lose focus by only nagging them with sounds instead of, say, stopping or randomly disengaging Autopilot?


I don’t think you’re reading it right. The article ends on this quote:

> “People are dying due to misplaced confidence in Tesla Autopilot capabilities. Even simple steps could improve safety,” Koopman said. “Tesla could automatically restrict Autopilot use to intended roads based on map data already in the vehicle. Tesla could improve monitoring so drivers can’t routinely become absorbed in their cellphones while Autopilot is in use.”

One of the failures of Autopilot is that it doesn’t restrict its use to only the environments in which it was tested and “proven” safe (enough, anyway).


Hyundai allows me to enable ADAS on any road too, so why the moral panic around Tesla, then?




