Interesting take. I mostly disagree, but let me see if I can explain.
> I took the position recently that if the code you write directly leads to the death of someone, you should be held accountable for it.
I don't think that follows. Just because you are responsible for the death in question does not imply accountability or wrongdoing.
> But in this particular case, I disagree that Uber should be held criminally liable.
Why not? That seems inconsistent with your previous statement, precisely because of what I mentioned: responsibility and accountability don't map directly onto each other.
> The actual person responsible is the human driver that did not stop.
This implies there is a single person responsible; I think there is far more going on in this case.
> The article states that this individual will be facing criminal charges. I think it’s rather cut and dry in this case; driver’s attention was split, and they were criminally negligent as a result. That argument makes more sense to me.
I tend to agree in this situation, but just because the "driver-not-driver-unless-you-kill-someone-driver" was found criminally responsible, that doesn't absolve Uber of its responsibility or accountability.
Someone at Uber decided that their self-driving car should not engage a braking procedure when it detects it is about to hit an object. To me, that is an utter failure. It's so bad I'd call it negligent.
They also failed to train their user properly. Not that they really could have: it's an absurd premise to believe the user of a self-driving car will pay the same 100% attention that a normal driver does.
This is why I find it negligent to hold the position that the car shouldn't execute a braking procedure when it identifies an imminent collision. Doing that without also implementing a driver alert system is an abomination; a rough sketch of what I mean is below.
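To put some shape on it (this is purely hypothetical logic I'm making up, not Uber's actual software): even if you deliberately disable automatic emergency braking, the bare-minimum fallback is to scream at the safety driver the instant a collision is predicted.

```python
# Purely illustrative sketch, not Uber's code. All names are mine;
# the point is the missing fallback, not the implementation details.

AUTO_EMERGENCY_BRAKING = False  # the disputed design decision

def on_collision_predicted(time_to_impact_s: float) -> None:
    """Hypothetical hook called by the perception stack when an impact is predicted."""
    if AUTO_EMERGENCY_BRAKING:
        apply_emergency_braking()
    else:
        # If you turn braking off, this is the *least* you owe the person in the seat.
        alert_safety_driver(f"Collision predicted in {time_to_impact_s:.1f}s -- take over")

def apply_emergency_braking() -> None:
    print("commanding maximum safe deceleration")  # placeholder for the real actuator call

def alert_safety_driver(message: str) -> None:
    print(f"ALERT: {message}")  # placeholder for a loud audible/visual alarm

on_collision_predicted(1.3)  # example: with braking disabled, at least the alarm fires
```

Shipping the first branch turned off, without the second branch existing at all, is the part I can't get past.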
I know there is a lot of nuance here and I don't hold the position that companies should be held liable for every little crash a car has on the road.
What I do believe is that if they want to operate at the scale they're planning to, we need to be damn fucking serious about the regulatory hoops they have to jump through.
Just look at air traffic control software, or better yet NASA's development processes; that is the kind of scrutiny we need to apply to self-driving car development.
The potential loss of life, at that scale, is far too great to trust profit-driven entities to take our lives seriously on their own.
They will work out the cost of a recall versus the cost of the settlements and play the numbers; we need to make that kind of behavior impossible.
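For illustration only, with figures I'm inventing on the spot: the calculation I'm worried about looks something like this, and it's exactly the spreadsheet that regulation has to make irrelevant.

```python
# Made-up figures, purely to illustrate the "play the numbers" incentive.
fix_cost_per_car = 50                 # dollars, hypothetical
cars_on_road = 1_000_000
expected_fatalities = 100             # hypothetical estimate if the flaw ships
settlement_per_fatality = 400_000     # dollars, hypothetical

recall_cost = fix_cost_per_car * cars_on_road                     # $50,000,000
do_nothing_cost = expected_fatalities * settlement_per_fatality   # $40,000,000

if do_nothing_cost < recall_cost:
    print("the spreadsheet says: don't fix it")    # the outcome we have to forbid
else:
    print("the spreadsheet says: issue the recall")
```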
Just in case it wasn't clear, I'm personally all for self-driving cars; I think they will be a massive improvement on the status quo. I just don't want to end up regretting that position.