
Police chief: Uber self-driving car “likely” not at fault in fatal crash - sohkamyung
https://arstechnica.com/cars/2018/03/police-chief-uber-self-driving-car-likely-not-at-fault-in-fatal-crash/
======
lumberjack
The recurring theme here is "autonomous car fails to read human
intentions/behaviour". This has been a question in the past, with people
saying autonomous cars drive unnaturally through intersections because they
cannot figure out the intentions of other human drivers. The issue in this
particular case is that the car can react faster once the pedestrian is
detected as an obstacle, but it will take longer than a human driver to figure
out that the pedestrian is going to become an obstacle. So overall reaction
time might not be quicker.

Can a NN learn enough human behaviour to predict a "pedestrian about to walk
in front of the car" situation without somehow possessing a very high level of
reasoning? I think yes. But there will always be some edge cases. One other
interesting aspect is that different driving cultures will probably require
different training for each region.

~~~
JoeAltmaier
_THIS_ My friend Gautam Sinha was driving to lunch through an uncontrolled
intersection and neatly judged two cars hesitating at crossroads. I
complimented him - "Well done!". He said "That was nothing. I've done two
cars, a motorcycle and a dog. Two dogs!" In India, you hesitate, you're going
nowhere. How will autonomous cars ever survive there?

------
dcgudeman
For the industry's sake, I hope this is the case. Although tragic, I can’t
help but wonder how many pedestrians have been killed today by human drivers.

~~~
TheGrumpyBrit
While I agree with your inference that autonomous vehicles are still safer
than human drivers, this incident throws doubt on a lot of my previous
assumptions.

A human driver, in the scenario as reported, can be expected to take a certain
amount of time to react to an unexpected event, such as a pedestrian walking
out in front of the vehicle. An autonomous driver, IMO, doesn't have that luxury. That
vehicle should be aware of every other object it's sharing a length of road
with (sidewalks and medians included). It should be tracking every other
object at all times, anticipating a change of course. It should know the
proximity of other vehicles, and be able to calculate whether it has space to
swerve if necessary, and whether the following vehicle has enough stopping
distance in an emergency. It should know EXACTLY what it will do in response
to any of the potential hazards it can detect. It should absolutely not be
exceeding the speed limit under any circumstances.
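The braking-margin check described above can be sketched with basic kinematics. This is purely an illustrative model, not anything from Uber's actual planner; the friction coefficient, reaction times, and the ~17 m/s (roughly 38 mph) speed are all assumptions.

```python
def stopping_distance_m(speed_mps: float,
                        reaction_time_s: float = 0.0,
                        friction_coeff: float = 0.7) -> float:
    """Total distance travelled before stopping: distance covered during
    the reaction time, plus braking distance v^2 / (2 * mu * g).
    mu ~ 0.7 is a common assumption for dry asphalt."""
    g = 9.81  # gravitational acceleration, m/s^2
    reaction_distance = speed_mps * reaction_time_s
    braking_distance = speed_mps ** 2 / (2 * friction_coeff * g)
    return reaction_distance + braking_distance

# An autonomous vehicle reacting near-instantly at ~17 m/s (assumed speed):
av_distance = stopping_distance_m(17.0, reaction_time_s=0.1)
# A human driver with a typical ~1.5 s perception-reaction time:
human_distance = stopping_distance_m(17.0, reaction_time_s=1.5)
print(f"AV:    {av_distance:.1f} m")
print(f"Human: {human_distance:.1f} m")
```

Even with identical brakes, the reaction-time term alone adds roughly 24 m at that speed, which is why "track everything, decide in advance" matters so much for the AV's claimed safety advantage.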

I suspect this incident will highlight one of the areas where investigators
are not currently up to speed with the technology. You can't treat an incident
like this as if it were a human driver with multiple dashcams, because "It was
dark and she came from nowhere" doesn't cut it. For an autonomous vehicle
accident, the video isn't enough - investigators need to see and understand
all the sensor readings taken by the vehicle in the lead up to the accident.
Did the car detect the pedestrian? If not, why not? Why didn't it notice the
change of course? How long before impact did it identify that a collision was
imminent? Why didn't it brake?

Unlike with a human driver, there's no human factor to consider. Every decision
that vehicle made was based on rules. So "The Uber was not at fault" isn't
enough - every day, thousands of people are involved in accidents where
they're not at fault. That doesn't mean they couldn't have avoided it, and
part of being a good driver is allowing for other people doing dumb things.
The investigation needs to highlight exactly what decisions that vehicle made
which resulted in this tragedy, and what can be changed to make sure it isn't
repeated.

~~~
pintxo
Sounds like the NTSB is going to get a new job?

~~~
TheGrumpyBrit
I think accident investigators could certainly take some tips from them when
it comes to autonomous vehicles. They need to understand exactly what data is
collected by these vehicles, and how they can use that data as part of their
investigation. There should also be a full root cause investigation into every
single accident involving an autonomous vehicle. These protocols really need
to be in place now, with training and funding of accident investigators
written in as part of the operating license for each city, because if we wait
a decade, we'll get a lot more resistance from the operators.

------
Symmetry
Figuring out how to investigate accidents involving autonomous vehicles (and
this certainly won't be the last) is going to be important for law
enforcement.

~~~
sirmoveon
Figuring out? What's there to figure out? It's a computer that captures live
footage of its surroundings. I'm pretty sure it stores at least the last 3
minutes before an unknown event or collision, similar to a black box in an
airplane. It's just a matter of legally requiring them to provide such footage.

~~~
Symmetry
Well, there's the chain of custody of the live footage to worry about and
ensuring it hasn't been tampered with. And cops have to be trained in how to
look at it, there are probably special tools. In the long run we're going to
have an FAA equivalent for autonomous cars with standardized black boxes and
formal accident investigation committees and so forth so hopefully every
accident where an autonomous car is at fault can be the last of its type. But
currently we're still figuring out how all this works, which means both that we
don't want all that structure yet and that the relevant authorities have to
learn by doing.

------
lnsru
Likely too big to fail? A human driver would at least have attempted emergency
braking, even if it was too late.

I can remember Continental testing an emergency braking assistant in 2011.
There was braking and accelerating noise near their office every day for
weeks. They tested with a giant rectangle and a dummy pedestrian, and
apparently it worked well. Did Uber turn this feature off?

Edit: typos

~~~
Sumaso
Can you really make that statement? Have you seen the video? There was a human
inside the car; they could have hit the brake, but they didn't.

It seems like you're making a pretty big assumption that a human driver would
at least have tried emergency braking, even if it was too late.

~~~
lnsru
It’s natural to hit the brakes. Sometimes that can kill you, like braking hard
on the highway in the rain to avoid a fox or a dog.

This failure mode (a pedestrian suddenly appearing in front of the car) is
quite common. I drive 10k miles yearly and encounter it at least once a year.
Last time it was an elderly gentleman who fell off his bicycle from the
pavement onto the street. The time before, it was a boy with a scooter. It
happens often, and the scenario is easy to test in a closed area.

~~~
TheGrumpyBrit
I don't buy the "jumping in front of the car" part. Pedestrians pushing
bicycles laden with bags are not known for being especially nimble. Crossing
without looking, sure, but assuming the pedestrian was crossing the road,
she'd be standing about where the bicycle seat is. That gives half a bike's
worth of space before she's actually in front of the car - let's say a quarter
of a second if she's going particularly quickly.

I'll forgive a human driver for not noticing a pedestrian and reacting in a
quarter of a second. But all the marketing videos have trained us to hold an
autonomous car to a higher standard. Certainly a lack of lighting isn't an
excuse. The car should have known she was there, identified her as a potential
hazard, and been prepared to take appropriate action should her course change.
Speeding, along with the fact that the car made no attempt to stop before the
collision, is a "back to the drawing board" level of failure.

~~~
zlynx
Who just walks out in front of headlights on a dark street at night? Who DOES
THAT?

This appears to me to be a suicide.

~~~
TheGrumpyBrit
Suicidal people wouldn't typically try to take their bicycle and their
shopping with them. But honestly, the reason she was there isn't really
important. The reason the Uber couldn't avoid her is a far more important
question in terms of where we go from here. It should have detected her as a
potential hazard. It should have been prepared to stop. It shouldn't have been
exceeding the speed limit.

A good driver will look back at what they did after an accident to see if they
could have done anything to prevent it, even if they weren't at fault. Uber
need to do exactly the same thing. Fortunately, unlike a human driver, they
have significantly more data than an adrenaline-fuelled mind and some grainy
dashcam footage to work with.

