
You seem to be using "regulated" to mean "forbidden" here, but maybe I'm misreading, so can you elaborate?

Thousands of people die on the road every year today; should we forbid cars entirely?

> And no, failover control is not acceptable given the past incidents and deaths.

How many incidents/deaths per mile driven? How does that compare to all the other transportation systems?



Why are people so committed to the idea that self-driving cars are anywhere near human standards? It just seems like a groundless assertion of faith to me.

Professional drivers can go for a million miles without an accident, and I don't believe anyone's autonomous driving software can get within an order of magnitude of that without a disengagement, even in favorable conditions.

You may disagree that a disengagement is equivalent to a human having an accident, but I strongly feel that it is. In either case, the driver reached a point where it was definitively unable to determine an appropriate next action.


I don't think most people believe they are near human standards now. But there is the prospect of them getting there before too long.


https://www.greencarreports.com/news/1119936_tesla-fatal-cra...

“Perhaps that’s because, as it turns out, Teslas on Autopilot could have a higher fatal accident rate than those driven entirely by humans.”



