Legally, who is liable when a self-driving car makes a mistake? Say it's egregious and clearly the fault of the car, and it, say, kills someone?
In California, at least, Waymo is required to carry $5 million in liability insurance, and the state has a law holding the manufacturer responsible in lieu of a driver. That said, this framework has barely been tested, since there have been so few incidents; the only "major" one (in CA) is still in court.