Typically, when we test experimental products to gauge their safety and efficacy, we engage in informed consent with the participating parties.
If everyone involved knew the risks, accepted them and moved forward, I'd agree with your premise.
However, if I am walking down the street and I'm run over by a rogue autonomous vehicle, I didn't give consent.
I don't think anyone would be as blasé as they are with autonomous vehicle deaths in a different situation.
For example, if a potential cure for heart disease were tested by dumping it into the public water supply, and people died as a result, would we have posters here shrugging it off by saying that testing such drugs will inevitably result in deaths?
> If everyone involved knew the risks, accepted them and moved forward, I'd agree with your premise.
Do you consent to the risks of letting 16 year olds drive? They're high, but you don't get the option.
I understand not agreeing to extremely-risky self-driving cars.
But if they can beat the very lax standards we use to license humans, that should be good enough.
Requiring them to be infinitely more perfect than humans is nonsense. If a car drives you over, do you really care if it was a robot or a human driving? I don't. The most I can ask for is a universal bar. And all the evidence I've seen is that Waymo is meeting that bar.
> For example
People would object because that's a stupid way to test and unrelated to the job of delivering safe water. If you want to talk about real water treatments, we do make tradeoffs!
> Do you consent to the risks of letting 16 year olds drive? They're high, but you don't get the option.
There's a very low bar for revoking their driving privileges if they prove themselves to be a danger.
Society necessitates that people drive. Society does not necessitate that Company X gets autonomous vehicles on the roads by target date Y so that their investors are happy.
> But if they can beat the very lax standards we use to license humans, that should be good enough.
"If". We have some very lax standards for what we consider intelligible English, yet Alexa can't set a timer correctly when I tell it to.
> Requiring them to be infinitely more perfect than humans is nonsense.
Who is proposing this?
> If a car drives you over, do you really care if it was a robot or a human driving? I don't. The most I can ask for is a universal bar.
Do you apply this accident causation blindness universally? Do you care if a person that hits you was drunk or lacked a driver's license vs driving diligently and licensed?
> People would object because that's a stupid way to test and unrelated to the job of delivering safe water. If you want to talk about real water treatments, we do make tradeoffs!
Some people might object that allowing unproven autonomous vehicles onto the street is stupid, but they choose not to use that word in an effort to have a respectful discussion.
> Typically, when we test experimental products to gauge their safety and efficacy, we engage in informed consent with the participating parties.
The state consented on your behalf. You, in fact, automatically "consented" to all sorts of dangerous and dubious experiments, including democracy itself, when you became a resident. The whole idea that self-driving cars are dangerous and experimental has no basis in reality anyway; by all accounts Waymo's cars are remarkably safe. But even if they were dangerous, Waymo is operating with the full blessing of the Arizona government.
> Society necessitates that people drive. Society does not necessitate that Company X gets autonomous vehicles on the roads by target date Y so that their investors are happy.
Of course society does not "necessitate" anything. Society is not some natural phenomenon like gravity that operates in necessity. And there are many, many people who would point out that they do not agree with and certainly do not consent to America's dangerous obsession with car ownership that kills 50k Americans a year and has tremendous economic and ecological consequences. But alas.
The social contract is not carte blanche allowance for anything to happen. There's a feedback loop involved, in which the governed can give or revoke consent.
> Society is not some natural phenomenon like gravity that operates in necessity.
However, people are driven by natural phenomena like the conservation of energy, and thus need to eat. For most people in the US, if they want to eat, it is necessary to drive to work.
> And there are many, many people who would point out that they do not agree with and certainly do not consent to America's dangerous obsession
> There's a very low bar for them to pass to have their driving privileges revoked if they prove themselves to be a danger.
Robot privileges can be revoked too.
> Society necessitates that people drive. Society does not necessitate that Company X gets autonomous vehicles on the roads by target date Y so that their investors are happy.
Society necessitates that people use cars to get places. You can 1:1 replace human driving hours with autonomous driving hours.
>> Requiring them to be infinitely more perfect than humans is nonsense.
> Who is proposing this?
Anyone who says that self-driving deaths are 'unacceptable' is requiring self-driving cars to be infinitely more perfect than humans.
> Do you apply this accident causation blindness universally? Do you care if a person that hits you was drunk or lacked a driver's license vs driving diligently and licensed?
Being drunk alters your ability to drive. They would be under the bar.
If someone lacks a license but would have qualified, I guess I don't really care.
> Some people might object to allowing unproven autonomous vehicles onto the street as stupid, but choose not to use that word in effort to have a respectful discussion.
In a way, we're discussing that right now. We're in a thread filled with posters who do not want to revoke those rights on the off chance that more dead people now will prevent even more people from dying in the future.
> Society necessitates that people use cars to get places. You can 1:1 replace human driving hours with autonomous driving hours.
This is a generous hypothetical. Society certainly necessitates that people drive, as there is no other way.
It is not true to say that we can 1:1 replace human driving with autonomous driving; the article in the OP is evidence of this. It is just as likely that autonomous driving will never reach 1:1 parity with humans.
> Anyone who says that self-driving deaths are 'unacceptable' is requiring self-driving cars to be infinitely more perfect than humans.
If this is your takeaway, I implore you to give this perspective more than a passing thought so that you can reply without turning it into a straw man argument.
> Being drunk alters your ability to drive. They would be under the bar.
I'm not sure what you're trying to say here, can you clarify?
> If someone lacks a license but would have qualified, I guess I don't really care.
Would you care if they qualified, but had their license revoked, perhaps for hitting people with their car before they hit you?
> All drivers are unproven at first.
Thankfully, we train and test these drivers on closed courses, where injury to uninvolved people is minimized, before we allow them onto the open road. We also closely supervise and restrict why, when, how, and what they can drive.
> In a way, we're discussing that right now. We're in a thread filled with posters who do not want to revoke those rights on the off chance that more dead people now will prevent even more people from dying in the future.
Some people are willing to trade more deaths now for fewer deaths later. But don't take that as proof that Waymo's cars actually will cause more deaths; they've been pretty safe so far.
I'm not arguing that more deaths are acceptable, I'm arguing that some deaths are acceptable if we're going to be consistent with current road policies.
> It is not true to say that we can 1:1 replace human driving with autonomous driving
You misunderstood the 1:1. I mean that you can take particular driving hours and replace them 1:1. That's what the article is about, even. I'm not claiming it will replace all human driving.
> If this is your takeaway, I implore you to give this perspective more than a passing thought so that you can reply without turning it into a straw man argument.
It seems pretty simple to me. "Are you willing to allow self-driving cars that will kill people, if the number of deaths per mile is under some threshold?" What am I missing? I don't want to strawman people, I just want a realistic assessment of risk.
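To make the "under some threshold" question concrete, the comparison is just two rates. Here's a toy sketch in Python; the annual-deaths figure is the 50k number cited upthread, the US vehicle-miles-traveled figure is a rough assumption, and the fleet numbers are purely hypothetical:

```python
# Back-of-envelope version of the "universal bar" argument:
# is a fleet's fatality rate per mile under the human baseline?
# All figures here are illustrative assumptions, not real fleet data.

HUMAN_DEATHS_PER_YEAR = 50_000   # figure cited upthread
HUMAN_MILES_PER_YEAR = 3.2e12    # rough US vehicle-miles traveled (assumption)

# Human baseline: deaths per mile driven (~1.6 per 100 million miles)
human_rate = HUMAN_DEATHS_PER_YEAR / HUMAN_MILES_PER_YEAR

def under_the_bar(fleet_deaths: int, fleet_miles: float) -> bool:
    """True if the fleet's deaths-per-mile is below the human baseline."""
    return fleet_deaths / fleet_miles < human_rate

# Hypothetical fleet: 1 death over 100 million miles -> clears the bar.
print(under_the_bar(1, 1e8))
```

The disagreement in this thread isn't really about the arithmetic; it's about whether clearing that bar is a sufficient condition for putting the fleet on public roads.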
> Would you care if they qualified, but had their license revoked, perhaps for hitting people with their car before they hit you?
Yes, because it means they went under the bar...
> Thankfully, we train and test these drivers on closed courses where injury to uninvolved people is minimized before we allow them to go on the open road.
Your experience is very different from mine. I trained entirely in public areas. I don't even know where I could find a closed course.
> Anyone who says that self-driving deaths are 'unacceptable' is requiring self-driving cars to be infinitely more perfect than humans.
That's a distortion of what I said. Furthermore, it's pretty laughable to have my internet comment treated like some kind of legally enforceable policy.
Last I checked, I'm not Queen of the world whose word is law.