
Uber self-driving crash: Footage shows moment before impact [video] - JabavuAdams
http://www.bbc.com/news/world-us-canada-43497364
======
JabavuAdams
We don't yet know why the driver was looking down and to the right.

What this raises for me is -- shouldn't self-driving car companies be using
ex-test pilots during autonomous driving? Of course, I'm assuming that the
driver wasn't one.

That's the personnel issue. On a technical front, what was she looking at? Her
own phone? The company-designed UI for the self-driving car? Does Uber use
LIDAR? If so, did the victim show up on it -- could there be an obstacle
alarm? What about IR cameras? This was obviously a very difficult visibility
situation. If we really want self-driving cars to be better than humans, can
we get away with just visible-light cameras, or do we need better sensors?

We are all far too blasé about the risks of driving, and of driving
distracted, from constant familiarity. It seems that pilots, and especially
test pilots, might have the experience and discipline to stay alert during
long autonomous drives.

One might argue (certainly moving forward) that using regular drivers vs.
test-pilots is negligent.

~~~
BinaryIdiot
> We don't yet know why the driver was looking down and to the right.

Looked like they were playing on a phone to me.

------
mslev
I'm not sure how closely we should analyze this footage. It's relatively low-
quality dashcam footage, and I highly doubt it's representative of what the
car "sees" or of what a human would see in similar conditions.

That being said, Uber's LIDAR should've seen the woman in the road, right?
Does the range not go beyond the lane ahead?

~~~
TYPE_FASTER
Published vs. real-world LIDAR ranges are radically different. Even if the
LIDAR sensor is looking 360 degrees around the car, the Uber software may only
be looking at a small cone in front of the car.

The challenge is: what do you do when something is moving toward the vehicle's
path? You could track every object in front of the car, but since people walk
toward the road, then stop, you can't just apply full brakes every time
something is moving toward the road.
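
To make the geometry concrete, the naive version of that rule looks something
like this (a toy sketch in Python, not anything from Uber's stack; the
coordinates and thresholds are made up):

    # Toy sketch, not Uber's code. Positions are car-centric:
    # x = lateral offset from the car's path (m), y = distance ahead (m).
    def should_brake(prev_xy, curr_xy, car_speed, dt,
                     horizon=4.0, margin=1.0):
        """Brake if the object and the car will reach the object's
        lane-crossing point at roughly the same time."""
        x, y = curr_xy
        vx = (curr_xy[0] - prev_xy[0]) / dt   # lateral speed (m/s)
        if x * vx >= 0:
            return False                      # not converging on our path
        t_cross = -x / vx                     # object reaches our lane
        t_car = y / max(car_speed, 0.1)       # we reach the object
        return t_car < horizon and abs(t_cross - t_car) < margin

A pedestrian who walks toward the curb and then stops satisfies that check
for a moment, so a rule this naive would brake constantly.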

It's really, really hard to get this right: we humans can look at people
standing next to a road and, based on movement and body language, predict
whether they will step into it.

And since there will be cases where pedestrians don't look and walk in front
of a vehicle going 40 mph, autonomous technology will be limited by the
physics of stopping a vehicle with the brakes fully applied.
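
For a sense of those physics, here's a back-of-the-envelope stopping distance
at 40 mph (assuming dry asphalt with a friction coefficient around 0.7 and a
1.5 s perception-reaction delay -- textbook-style numbers, not measurements
from this crash):

    # Rough stopping distance at 40 mph; mu and the reaction time are
    # textbook-style assumptions, not data from this incident.
    v = 40 * 0.44704              # 40 mph -> ~17.9 m/s
    mu, g = 0.7, 9.81             # tire-road friction, gravity
    reaction_s = 1.5              # perception-reaction delay (s)
    braking_m = v**2 / (2 * mu * g)        # ~23 m of hard braking
    total_m = v * reaction_s + braking_m   # ~50 m overall

So even a system that detects perfectly still needs on the order of 20+ m of
clear warning at that speed; detect any later and you can only shave off
impact speed.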

It's impossible to figure out what the Uber vehicle should have, or could
have, done autonomously without looking at the code, and all the data.

~~~
TYPE_FASTER
Also challenging: looking ahead around a corner. You have to be able to
predict, at least somewhat, where the vehicle is going, so you need an
accurate map of the road and an accurate fix on where you are on it.
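
In sketch form, that means bending the look-ahead along the mapped road
rather than along the car's heading (again illustrative only; assumes a
lane-centerline polyline and a localized position):

    import math

    # Illustrative only: return the stretch of mapped lane centerline
    # within `horizon` meters of travel ahead of the car, so perception
    # can follow the road around the corner instead of a straight cone.
    def upcoming_path(centerline, pose_xy, horizon=60.0):
        i = min(range(len(centerline)),
                key=lambda k: math.dist(centerline[k], pose_xy))
        path, travelled = [centerline[i]], 0.0
        for a, b in zip(centerline[i:], centerline[i + 1:]):
            travelled += math.dist(a, b)
            if travelled > horizon:
                break
            path.append(b)
        return path

Any error in the map, or in the position fix, shifts that whole window
sideways, which is why both have to be accurate.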

------
Beamer92
I can't be alone in thinking that the woman walking the bicycle didn't seem to
be in a crosswalk...

Like, ok so autonomous driving isn't perfect, and yeah the backup driver maybe
should've been paying attention to the road instead of whatever she was doing.
Fine. But I can't be alone in thinking it looks at least partly like the
woman's own fault. Right? Walking your bike, in the dark, across a road
without looking, when you don't seem to have right of way, is just not a great
idea. Plenty of human drivers would probably have hit her too, albeit possibly
at a lower speed while slamming on the brakes.

I don't mean to blame the victim here entirely; obviously I don't think she
deserved to die. I just can't help but think that Uber isn't entirely at fault
here. Maybe Uber can make a car that's statistically a safer driver than
humans, but you can't out-engineer the human ability to invent ways to get
around rules other humans put in place.

~~~
BinaryIdiot
The problem is that the video is deceptively dark. Video cameras don't pick up
nearly the same amount of light as human eyes do, and there are even photos
circulating that show this area as significantly brighter (to the point where
a human could see the entire area really, really easily).

Also, some areas give pedestrians the right of way regardless of where they
are. Granted, there are plenty of exceptions to that, and this may be one of
them, but regardless of who is at fault, this looks avoidable if a human had
been driving, IMO.

Regardless, it's pretty insane that their software didn't pick up this person
at all. That's terrifying, IMO. It shows me the system isn't ready for a
driver to be using it while not paying attention, like this driver (who kept
playing on their phone and looking down). People, deer, all sorts of crazy
things are going to jump out into the road, and you need to be able to handle
that scenario. This person didn't even jump out; they were already in the road
and were likely very visible.

------
LinuxBender
I would argue these are not self-driving cars. I would expect a self-driving
car to have multiple compute nodes in the trunk running something more
advanced than a gaming engine, a voting system, and numerous sensors: LIDAR,
infrared, thermal, sonar, and others.
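
By a voting system I mean something like the N-of-M agreement used in
avionics: independent channels each make a call, and the vehicle acts on the
quorum. A minimal sketch in Python (sensor names illustrative):

    # Sketch of simple majority voting across redundant detectors,
    # in the spirit of avionics. Sensor names are made up.
    def obstacle_by_quorum(detections, quorum=2):
        """detections: sensor name -> did it see an obstacle?"""
        return sum(detections.values()) >= quorum

    readings = {"lidar": True, "ir_camera": True, "radar": False}
    brake = obstacle_by_quorum(readings)   # True: 2 of 3 agree

That way a single flaky sensor can't silently veto a detection.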

An autonomous car that is expected to operate in locations and conditions
where humans often fail should be advanced enough to know everything around
it, and then some.

~~~
jey
That's a software quality issue, not one of having the right hardware. We went
to the moon using pretty weak hardware and clever software.

~~~
LinuxBender
I think it is both. Human lives are at risk, so I expect solid,
battle-hardened software and hardware.

------
jey
Dupe:
[https://news.ycombinator.com/item?id=16643056](https://news.ycombinator.com/item?id=16643056)

