Everything anyone could say about bad AI driving could be said about bad human drivers. Nevertheless, Waymo has not had a single fatal accident despite many millions of passenger miles and is safer than human drivers.
Everything? How about legal liability for the car killing someone? Are all the self-driving vendors stepping up and accepting full legal liability for the outcomes of their non-deterministic software?
Thousands have died directly due to known defects in manufactured cars. Those companies (Ford, among others) are still operating today.
Even if driverless cars killed more people than humans do, they would see mass adoption eventually. As it stands, they are subject to far higher scrutiny than human drivers, and even so they make fewer mistakes, avoid accidents more often, and can't get drunk, tired, angry, or distracted.
There is a fetish for technology that we are sometimes not aware of. On average there might be fewer accidents, but if specific accidents that a human would have prevented now happen, people will sue. And who will take the blame? The day the company takes the blame is the day self-driving truly exists, IMO.
But even if they can theoretically be hacked, so far Waymos are safer and more reliable than human drivers. The biggest danger to someone riding in one is a vandal destroying it for vindictive reasons.
In the bluntest possible sense: who cares about assigning blame, if we can make roads safer?
Liability in traffic collisions is basically a solved problem through the courts, and at least in the UK, liability for automated vehicles is assigned in law to the vendor (more accurately, there's a statutory list of who is responsible for what; I'm not certain whether it's possible to assume legal responsibility without being the vendor).