Why are people so committed to the idea that self-driving cars are anywhere near human standards? It just seems like a groundless assertion of faith to me.
Professional drivers can go for a million miles without an accident, and I don't believe anyone's autonomous driving software can get within an order of magnitude of that without a disengagement, even in favorable conditions.
You may disagree that a disengagement is equivalent to a human having an accident, but I strongly feel that it is. In either case, the driver, whether human or software, reached a point where it was definitively unable to determine an appropriate next action.
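The order-of-magnitude comparison above can be made concrete with a quick sketch. All figures here are hypothetical placeholders chosen only to illustrate the arithmetic, not measured data:

```python
import math

def miles_per_failure(total_miles: float, failures: int) -> float:
    """Miles driven per failure event (accident or disengagement)."""
    return total_miles / failures

# Hypothetical: a professional driver logging 1,000,000 miles per accident.
human_rate = miles_per_failure(1_000_000, 1)

# Hypothetical: an AV fleet logging 500,000 miles with 50 disengagements.
av_rate = miles_per_failure(500_000, 50)

# How many orders of magnitude separate the two rates.
gap = math.log10(human_rate / av_rate)
print(f"human: {human_rate:,.0f} mi/event, "
      f"AV: {av_rate:,.0f} mi/event, "
      f"gap: {gap:.1f} orders of magnitude")
# → human: 1,000,000 mi/event, AV: 10,000 mi/event, gap: 2.0 orders of magnitude
```

With these made-up numbers the software is two orders of magnitude short of the human benchmark, which is the shape of the claim being made, even if the real figures are disputed.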
Thousands of people are dying on the road every year today; are we just going to forbid cars entirely?
> And no, failover control is not acceptable given the past incidents and deaths.
How many incidents/deaths per mile driven? How does that compare to all the other transportation systems?