Uber may not be at fault, legally speaking. That's up to the legal authorities to decide.
However, as a society and civilization, and even more so, as engineers and scientists, we are going to expect that the autonomous car matches or exceeds human-level performance in critical situations like this.
Therefore the time spent investigating, understanding, and discussing the root causes of the accident is well spent. Accidents like these generally do not happen due to a single factor, and we need to understand all the contributing factors if we want to make autonomous driving systems more reliable.
At the very least we need to understand whether the pedestrian appeared in the other sensors' data in a way a human could have identified, and if so, whether the autonomous system matched or exceeded human-level performance in detecting the pedestrian, and if the pedestrian was indeed detected, why the autonomous driving system failed to respond to the situation.
Surely not? Cars are routinely driven by people who are not owners, and liability for traffic offences (including that the vehicle must be insured) is with the driver.
In my experience, typically only minor infractions like parking violations are assigned to the registered owner of the vehicle, but in other cases – accidents, running red lights, etc. – the driver is liable regardless of who owns the car.
No. The general rule is that negligence is required to be held responsible. If I let my next-door neighbor borrow my car to go to the grocery store, and he hits someone, I'm not responsible. Unless the person can prove "negligent entrustment", i.e. it was irresponsible just to let this person borrow my car, e.g. they're a habitual drunk, or blind, or 11.
However, most auto liability insurance covers whoever you permit to drive the vehicle, so the owner's policy does typically cover the fender bender on the way to the grocery store.
Correct, the owner's insurance policy is the primary coverage when the owner lends their car to a 3rd party. Obviously in the case of a moving violation the driver is at fault and receives the penalty, but damage is still covered by the owner's policy. In the case where the other driver is at fault, that car's owner's insurance is liable.
Einstein was right about some things and wrong about some things. Which work of Einstein are you referring to when you imply that a peer review could have detected the error in his work?
Why is the backup human driver staring at the dashboard most of the time? Should he not be closely watching the road? It appears he lost a few valuable seconds right before the collision because his attention was diverted to the interior of the car rather than focused on the street.
00:04 - The entire body of the pedestrian is visible
00:05 - Collision occurs
If we assume that the computer only had this video as input, then it had only 2 seconds to avoid the collision, which would likely not have been enough to stop. But the fact that there was no sign of slowing or braking in those 2 seconds is pretty alarming.
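A rough back-of-envelope sketch of why a 2-second window is too short to stop. All the numbers here are assumptions, not facts from the video or the investigation: a speed of roughly 40 mph (~17.9 m/s), hard braking at ~7 m/s² on dry asphalt, and ~0.5 s of detection-to-brake latency.

```python
# All values are assumed for illustration, not taken from the actual incident.
speed = 17.9    # m/s, assumed ~40 mph
decel = 7.0     # m/s^2, assumed hard-braking deceleration on dry asphalt
latency = 0.5   # s, assumed delay between detection and brakes engaging
window = 2.0    # s, time from pedestrian becoming visible to impact

# Total time needed to reach a full stop after detection
time_to_stop = latency + speed / decel

# Speed remaining at impact if braking starts after the latency
braking_time = window - latency
impact_speed = max(0.0, speed - decel * braking_time)

print(f"time needed for a full stop: {time_to_stop:.1f} s (window: {window:.1f} s)")
print(f"estimated speed at impact with braking: {impact_speed:.1f} m/s")
```

Under these assumptions a full stop takes about 3 seconds, so the collision itself could not have been avoided in 2 seconds; but braking would still have cut the impact speed by more than half, which is exactly why the complete absence of braking is the alarming part.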
But it seems unlikely that this video was the only input to the computer. Did the car have multiple cameras able to "see" bright as well as dark objects at night? I would imagine that a self-driving car operating at night would use multiple cameras (tuned to various levels of brightness) so that it could match human-level vision.