
Google's self-driving car is the victim in a serious crash - openmosix
https://www.engadget.com/2016/09/24/googles-self-driving-car-is-the-victim-in-a-serious-crash/
======
sounds
I think nobody questions the need for fully autonomous cars. I do think there
is still a lot of discussion needed in terms of autonomous car policy.

What insurance requirements must be in place for a manufacturer, such as
Tesla, when their drivers are routinely taking their eyes off the road?

What insurance must be carried by the _manufacturer_ of the autonomous driving
equipment? What effect does this have on the insurance carried by the "human
in command"? (I was going to say "driver" but that didn't seem quite right.)

What additional legal protections must be in place for the _inevitable_ day
when software chooses to kill a human?

What additional legal protections must be in place for the day when, because
of software, a human's life is saved -- will society even know? Should Tesla
publish a monthly estimate of the total number of trips driven by their cars
in which an intoxicated "human in command" was able to get home? (Then anyone
could compare it against how well humans did before an autonomous system was
available.)

What legal protections must be in place before Google is allowed to provide
autonomous driving service "in the cloud"? If the cell connection suddenly
drops at just the wrong moment, should Google pay the costs of the ensuing
accident? (Ok, ok, maybe Google isn't planning on putting the driving software
"in the cloud." I'm just saying...it's not like the owner of the vehicle will
be any wiser. Then they'll be dead.)

~~~
Animats
The NHTSA laid down policy on this last week. At Levels 1 and 2, it's the
driver's fault. At Levels 3-5, the driver can sit back and relax and it's the
car manufacturer's fault.

The Tesla fatal crashes (Tesla is Level 2) have produced lawsuits, but no
decisions yet.

~~~
sounds
Thanks, and I agree with your quick analysis: no decisions yet. The NHTSA's
policy will almost certainly need to be tested in court as well, since on the
surface it appears to over-simplify some critical things.

------
dsfyu404ed
"Google also stressed that red light violations are the "leading cause" of car
crashes in US cities, and that 94 percent of those are due to human mistakes."

Last I heard the speed trap towns and insurance companies were justifying
their agenda by saying speed was dangerous. Before that it was the
prohibitionists (MADD) and universities using drunk driving to push their
agenda.

I'd wager that tailgating and distracted driving cause a lot more car crashes
(they're far more common behaviors, even if the average instance is less
risky), barring some absurd definition of car crashes that rules out most
rear-end collisions. IIRC, other articles have mentioned Google complaining
about distracted drivers. Why the sudden change of priorities? Is a company
Google owns trying to win a speed camera contract or something?

------
parent5446
I don't know if I necessarily agree with the article's conclusion that self-
driving needs to become the rule. Sure, if all cars were self-driving, this
accident would not have happened, and that's the dream. But if the Google car
had not been self-driving, i.e., if it had just been you or me in the car, I
doubt I would have been able to avoid this crash. If automatic sensors can't
see a car speeding through a red light, there's no way I will.

~~~
dsfyu404ed
Do we know if the van was traveling crossways or was sitting at a red arrow?

Not many people will run a red light six seconds after it turns red. Many more
people will floor it right as the light changes to cut off opposing traffic
and take a left turn on a red left arrow.

edit: never mind, the robocar got hit on the left.

