
Tempe Police Release Video of Uber Accident - austinkhale
https://twitter.com/tempepolice/status/976585098542833664?s=21
======
atonse
How did LIDAR and IR not catch that? That seems like a pretty serious problem.

It's clear from the video that a human driver actually would've had more
trouble since the pedestrian showed up in the field of view right before the
collision, yet that's in the visible spectrum.

When I argue for automated driving (as a casual observer), I tell people about
exactly this sort of thing: a computer can look in 20 places at the same time;
a human can't. A computer can see in the dark; a human can't.

Yet this crash proves that all the equipment in the world didn't catch a very
obvious obstruction.

~~~
jackpirate
_It's clear from the video that a human driver actually would've had more
trouble since the pedestrian showed up in the field of view right before the
collision, yet that's in the visible spectrum._

That's not at all clear to me. I don't know too much about cameras, but it
looks to me like the camera is making the scene appear much darker than it
actually is.

In the video, you can see many street lights projecting down onto the ground,
and the person was walking in the gap between two streetlights. The gap
between street lights (and hence the person) was in the field of view of the
camera the entire time; they just weren't "visible" in the camera because of
the low lighting. I'm confident my eyes are good enough that I would have been
able to see this person at night in these lighting conditions. (Whether I
could have reacted in time is another question.) It seems to me like the
camera just doesn't have the dynamic range needed for driving in these low
light conditions, which is a major problem.

~~~
jeffmould
I have to agree. Just like a normal camera has issues in low light, it is
clear that this camera is understating how well-lit the road ahead actually
was.
While I can't say confidently that I would have been able to stop to prevent
hitting them, watching the video in full screen does lead me to believe that I
would have seen them and been able to apply the brakes at least enough to
reduce the impact. Also, watching the video of the interior it is clear the
driver was looking at his phone or doing something else just prior to the
impact. This alone leaves me skeptical to just how much could have been done
to prevent this accident.

~~~
walrus01
This is pretty much the experience I have with my dash cam, a Yi. In its
recorded video, its automatic exposure control makes it look like everything
outside of the headlight cone is pitch black, but it is actually not. I have
seen deer and possums by the side of the road, and debris etc, that did not
show up when I later checked the video for the same period. There is enough
spillover light from modern headlights that a human _whose eyes are dilated
and adjusted to dark conditions_ will see a pedestrian standing on the median,
stepping off it, crossing the inner lane towards the car's current lane. More
than enough time to begin to brake and possibly swerve. I have dodged animals
in a situation similar to this.

~~~
sundvor
Yep, the exposure control / sensor quality of the dash cam in the video was
rubbish. My own Blackvues produce far, far better results than that. Just look
at how _nothing_ is illuminated by street lights; this clearly has the effect
of making the poor rider appear to come "out of nowhere". Also agree it
appeared the driver was on a smartphone most of the time, thus not in control
of the vehicle, and thus had no business being on the road, as these are
systems _UNDER TEST_.

If that's the best Uber can produce then they ought to hang their heads in
shame. Unless it was doctored with... as I find it hard to believe they'd put
such rubbish quality cameras in their trials.

~~~
panarky
Do you trust Uber to provide all the data, or would they selectively produce
data favorable to them?

Do you trust Uber to provide unedited raw video, or would they process it to
increase contrast, make it appear that nothing was visible in the dark areas
of the frame, reduce the resolution, drop frames, etc.?

~~~
sharkoz
It's funny how the internal camera which shows how distracted the driver was
has way better night vision than the external road camera...

~~~
Piskvorrr
The key here is _contrast_; plus, an IR light works great at 2 feet, but at
60 feet... not so much.

~~~
usrusr
The internal camera (let's be honest and call it _the scapegoat camera_ ,
because that's the only practical use for human "safety drivers" when they are
not permanently engaged) must take almost all its light from IR, because we
don't see anything of the smartphone screen glare that the eye movement so
clearly hints at.

~~~
YeGoblynQueenne
I don't think the driver is looking at her smartphone. I think she's checking
the car's monitor (as in a computer screen). Although to be fair, that should
be showing the car's view of its surroundings so I don't know what's going on
there.

Edit: Nevermind. Someone posted a picture of the car's interior, below and
there's no computer screen.

~~~
thisacctforreal
Link?

~~~
YeGoblynQueenne
Sorry - I can't find it. This thread has grown rather.

------
aecs99
I currently work full-time in the self-driving vehicle industry. I am part of
a team that builds perception algorithms for autonomous navigation. I have
been working exclusively with LiDAR systems for over 1.5 years.

Like a lot of folks here, my first question was: "How did the LiDAR not spot
this?". I have been extremely interested in this and kept observing images and
videos from Uber to understand what could be the issue.

To reliably sense a moving object is a challenging task. To
understand/perceive that object (i.e., shape, size, classification, position
estimate, etc.) is even more challenging. Take a look at this video (set the
playback speed to 0.25):
[https://youtu.be/WCkkhlxYNwE?t=191](https://youtu.be/WCkkhlxYNwE?t=191)

Observe the pedestrian on the sidewalk to the left. And keep a close eye on
the laptop screen (held by the passenger on right) at the bottom right.
Observe these two locations by moving back and forth +/- 3 seconds. You'll
notice that the height of the pedestrian varies quite a bit.

This variation in pedestrian height and bounding box happens at different
locations within the same video. For example, at the 3:45 mark, the height of
the human on the right wearing a brown hoodie keeps varying. At the 2:04 mark,
the bounding box estimate for the pedestrian on the right side appears to be
unreliable. At the 1:39 mark, the estimate for the blue (Chrysler?) car
turning right jumps quite a bit.
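
To illustrate (my own sketch, not anything from Uber's actual stack): jitter
like this is commonly tamed with temporal smoothing, e.g. an exponential
moving average over per-frame estimates. Real trackers use Kalman filters or
similar, but the idea is the same:

```python
# Illustrative sketch only: blend each new per-frame estimate with the
# running state instead of trusting any single frame.

def smooth(heights, alpha=0.3):
    """Exponentially smooth a sequence of noisy height estimates (meters)."""
    est = heights[0]
    out = []
    for h in heights:
        est = alpha * h + (1 - alpha) * est
        out.append(est)
    return out

noisy = [1.7, 1.9, 1.4, 1.8, 1.6]  # hypothetical per-frame heights
print([round(h, 2) for h in smooth(noisy)])
```

A lower alpha damps the jitter more, at the cost of reacting more slowly to
genuine movement, which is exactly the trade-off that matters for a pedestrian
stepping into the road.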

This makes me believe that their perception software isn't robust enough to
handle the exact scenario in which the accident occurred in Tempe, AZ.

I think we'll know more technical details in the upcoming days/weeks. These
are merely my _observations_.

~~~
noobermin
Alright, so given your observations, which I don't doubt, here's a question I
have: why have a pilot on public roads?

If Uber's software wasn't robust, why "test in production" when production
could kill people?

~~~
rkangel
Disclaimer: I am playing Devils Advocate and I don't necessarily subscribe to
the following argument, but:

Surely it's a question of balancing against the long term benefit from widely
adopted autonomous driving?

If self driving cars in their current state are at least close to as safe as
human drivers, then you could argue that a short term small increase in
casualty rate to help development rate is a reasonable cost. The earlier that
proper autonomous driving is widely adopted, the better for overall safety.

More realistically, if we think that current autonomous driving prototypes are
_approximately_ as safe as the average human, then it's definitely worthwhile
- same casualty rate as current drivers (i.e. no cost), with the promise of a
much reduced rate in the future.

Surely "zero accidents" isn't the threshold here (although it should be the
goal)? Surely "improvement on current level of safety" is the threshold?

~~~
Slartie
You can make the argument with the long-term benefits. But you cannot make it
without proper statistically sound evidence about the CURRENT safety of the
system that you intend to test, for the simple reason that the other traffic
participants you potentially endanger are not asked if they accept any
additional risk that you intend to expose them to. So you really need to be
very close to the risk that they're exposed to right now anyway, which is
approximately one fatal accident every 80 million miles driven by humans,
under ANY AND ALL environmental conditions that people are driving under. That
number is statistically sound, and you need to put another number on the other
side of the equation that is equally sound and on a similar level. This is
currently impossible to do, for the simple fact that no self-driving car
manufacturer is even close to having multiple hundreds of millions of miles
traveled in self-driving mode in conditions that are close enough to real
roads in real cities with real people. Purely digital simulations don't count.
What can potentially count in my eyes is real miles with real cars in "stage"
environments, such as a copy of a small city, with other traffic participants
that deliberately subject the car to difficult situations, erratic actions, et
cetera, of which all of them must be okay with their exposure to potentially
high-risk situations.
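
A quick sanity check on that baseline (my own back-of-the-envelope, assuming
fatalities follow a Poisson process at the 1-per-80-million-miles human rate
cited above):

```python
import math

# Sketch: how many fatality-free miles a fleet would need before we can say,
# at a given confidence, that its fatality rate is no worse than the human
# baseline. With zero observed fatalities over N miles, the bound comes from
# P(0 events) = exp(-N / 80e6) <= 1 - confidence.

HUMAN_MILES_PER_FATALITY = 80e6  # figure from the comment above

def miles_needed(confidence=0.95):
    """Fatality-free miles needed to bound the rate at the human baseline."""
    return -math.log(1 - confidence) * HUMAN_MILES_PER_FATALITY

print(f"{miles_needed():.3g}")  # roughly 2.4e8 miles at 95% confidence
```

In other words, even a flawless fleet would need on the order of a quarter
billion fatality-free miles before the comparison is statistically sound,
which supports the point that no manufacturer is close to that today.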

Of course that is absurdly expensive. But it's not impossible, and it's the
only acceptable way of developing this high-potential but also highly
dangerous technology up to a safety level at which you can legitimately make
the argument that you are NOT exposing the public to any kind of unacceptable
additional risk when you take the super-convenient and cheap route of using
the public infrastructure for your testing. If you can't deal with these
costs, just get the fuck out of this market. I'm also incapable of entering
the pharmaceuticals development market, because even if I knew how to mix a
promising new drug, I would not have the financial resources to pay for the
extensive animal and clinical testing procedures necessary to get this drug
safe enough for selling it to real humans. Or can I also just make the
argument of "hey, it's for the good of humanity, it'll save lives in the long
run and I gave it to my guinea pig which didn't die immediately, so
statistically it's totally safe!" when I am caught mixing the drug into the
dishes of random guests of a restaurant?

------
chrsstrm
The description given before the video was released painted a picture in my
mind that the woman was on the median and "suddenly" entered the roadway in
front of the vehicle. I pictured someone darting across the road directly in
front of the car, with no way to stop in time.

This video shows a completely different scenario. The woman started on the
median, but the vehicle was in the #2 lane. She wasn't visible to the naked
eye but she also wasn't darting into traffic and had to cross the #1 lane
before even being in the path of the vehicle. A human driver certainly would
have difficulty stopping in time, but why did the sensor package not pick her
up? This doesn't appear to be the close call we were told it was. To me, this
seems like exactly the scenario that autonomous driving vehicles are intended
to prevent.

~~~
maxerickson
I think _this seems like exactly the scenario that autonomous driving vehicles
are intended to prevent._ may be optimistic.

Lots of people are _hoping_ they will prevent such scenarios, the motivation
for fielding them is more related to making lots of money.

~~~
adamweld
I disagree, if you look at any of the Waymo videos you can see they have great
recognition of every car, pedestrian, and obstacle within 100 meters of the
car, and most of that data (point cloud lidar/radar at the very least) would
work in pitch black without even headlights. Thus it's extremely reasonable to
expect that autonomous vehicles will easily prevent scenarios like this.

Inclement conditions, defensive driving, etc. are much harder to work with but
this should have been cake.

~~~
_up
There was also a recent lawsuit where Uber pledged to not use any lidar tech
and software from Waymo anymore. Maybe they had to deactivate Lidar.

~~~
asteli
The LIDAR in question on this vehicle is an HDL-64, a commercial unit from
Velodyne. The lawsuit is entirely unrelated.

------
Keats
Someone on Reddit posted a comparison of the same spot at night but with a
good camera: [https://m.imgur.com/a/PM7uu](https://m.imgur.com/a/PM7uu) from a
video at
[https://www.youtube.com/watch?v=1XOVxSCG8u0](https://www.youtube.com/watch?v=1XOVxSCG8u0)

That paints the whole accident in a very different light from the video from
Twitter. Would any human not notice someone crossing in the good camera
screenshot?

~~~
themgt
Yep, also this HDR image of the scene at night:
[https://imgur.com/gallery/XQrAB](https://imgur.com/gallery/XQrAB)

Really hope the cops didn't just look at the low-dynamic range dashcam video
and decide it's case-closed here. I would not want one of these Uber cars on
the streets with me.

~~~
heartbreak
NTSB is investigating the crash, so surely they'll conduct adequate forensics
work.

~~~
criddell
I think Uber should be forced to share _all_ of the sensor data.

Historically, building and engineering codes advance in response to disasters.
This mindset needs to be applied to robots that are going to be operating in
public spaces. Incidents like this are an opportunity for all self-driving car
developers to get better and there's a public safety interest in that
happening.

------
cameldrv
Pathetic and sad performance by the vehicle and "safety" driver. The woman
does not "appear out of nowhere", she was in the roadway for some time. The
woman was not wearing all black, had red hair, and her shoes were reflective.
Even if we are to believe that their camera is this crappy, they still have
the lidar, and it appears brakes were not applied. Even 500ms of braking *
0.8g = 9mph. That might have saved her life. Ultimately, if Uber's car cannot
see and avoid a pedestrian crossing the street at a walking pace at night, it
should not be operating at night.
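
The braking arithmetic above checks out (assuming the 0.8 g figure, which is
roughly what a modern car can manage on dry pavement):

```python
# Sketch: speed shed by braking at a constant deceleration for a fixed time.
G = 9.81              # gravitational acceleration, m/s^2
MPH_PER_MPS = 2.23694

def speed_shed_mph(brake_time_s, decel_g):
    """Speed reduction (mph) from decelerating at decel_g for brake_time_s."""
    return decel_g * G * brake_time_s * MPH_PER_MPS

print(round(speed_shed_mph(0.5, 0.8), 1))  # ~8.8 mph, matching the ~9 mph above
```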

~~~
Gatsky
What I don't understand is why the pedestrian didn't see or hear the car. She
seems totally oblivious. Surely she would have seen the headlights. It is
possible she had some sensory impairment, either visual or auditory.

~~~
mattlondon
This happens all the time at night where I live. People cross the road and you
don't see them until you're quite close.

Cars are highly visible to pedestrians (due to headlights). I think
pedestrians assume that they themselves are equally as visible to the car
driver too - "I can see them, so they must be able to see me." is the
subconscious line of thought I think.

Often though pedestrians get "lost" in the lights of oncoming vehicles or the
brake lights of cars ahead, or are just in your blind spots.

The other thing is, I've noticed that people who have never learnt to drive
often don't really appreciate how cars act and are not very good at estimating
the distances involved. You've probably noticed this when crossing streets -
some people take some eye-raising risks when crossing the road while others
wait ... guess which ones are the drivers who are experienced with how
cars/drivers act!

My guess here is the poor woman was stationary in the median thinking "Should
I cross?" as the car was approaching. She probably then decided she had time
to make it across before the Uber car got there so started to cross the
street, perhaps assuming the vehicle was further away than it was, or would
slow down since the "driver" would see her. We've all done this I am sure ...
sometimes with time to spare, sometimes cutting it a bit close.

Perhaps if this was what happened, the Uber car "saw" her stationary in the
median and disregarded her as a stationary road-side object and carried on
until it was too late. IIRC these systems need to disregard static objects at
the side of the road (e.g. signs, trees etc) otherwise they'd be slamming the
brakes on all the time. Perhaps the initial movement she made into the road
was disregarded as "sensor noise" (e.g. you can see objects "jumping around" a
bit in this video from Uber:
[https://www.youtube.com/watch?v=WCkkhlxYNwE](https://www.youtube.com/watch?v=WCkkhlxYNwE))
from what it thought was a static object, but that would not explain the
apparent total non-reaction from the vehicle when she was clearly in the path.

I've driven a XC90 a few times - it has auto-lane keeping when using cruise
control on the highway. Perhaps that was what was driving, not the Uber
system.

Tragic stuff regardless of what happened.

~~~
coding123
I think your guess is dead on - it's what I had imagined the scenario was
before I saw the video. I see this all the time, and the social contract
between car and pedestrian is the car slows. Heck even half the time people do
this they don't even look at the car - it's quite crazy - I never do that
mostly because I make the assumption that I will be the unlucky shit that was
hit when someone decided to bite down on a messy Carl's Jr burger.

Regardless of how I do this, it's a pretty accepted contract and definitely
still piles the fault on the SDV in my opinion.

------
brokenmachine
Wow, that self driving car is total crap if it can't pick up that obstacle.
Doesn't get much more obvious than that. Slowly walking from the left lane
into your lane on a fairly straight road.

I bet it was more visible to the human eye than on that video as well. You'd
have to see someone crossing the road like that.

Of course the pedestrian wasn't doing the smartest thing but I believe a human
driver would have at least hit the brakes had they been actually looking
forward and paying attention.

I thought the actual advantage of self-driving cars is that they're always
meant to be looking forward and paying attention. That doesn't appear to have
happened in this case.

~~~
jonknee
> You'd have to see someone crossing the road like that.

You could say the same thing about a car with headlights on coming towards
you... From the little evidence we have it seems like the pedestrian,
driver/monitor and the car all should have done better.

~~~
brokenmachine
That pedestrian obviously didn't have good situational awareness, but are we
happy to let self-driving cars mow down any unaware obstacle on the road?
Broken-down cars/deer/children/roadworks?

Honestly, that video proves to me that this tech is nowhere near ready. That
was the most obvious of obstacles and the car did not even slow down.

I believe a competent human would have at the very least slowed down in that
situation, and likely stopped completely.

~~~
dqpb
> that video proves to me that this tech is nowhere near ready.

It proves that Uber's tech is nowhere near ready. Based on all the stupid shit
I see Uber engineers saying on Blind app, I have very little faith in their
maturity, professionalism, and culture.

I would bet that Google is doing a better job than Uber is.

~~~
drited
Thought you needed an Uber email address to join the Uber Blind app
conversation?

~~~
mrep
He probably means the forums where anyone from any company can participate.

------
JohannesH
In my mind this accident is on Uber no matter how you interpret the video.

Scenario 1: Lets say the pedestrian was visible to the naked eye and sensors.
The model and safety operator still didn't "see" her and act in time. Who to
blame? Uber.

Scenario 2: The lighting and environment was in a condition where neither the
model nor the safety operator could see the road more than 10 feet in front of
the car, yet neither thought it to be irresponsible to go at full speed. Who
to blame? Uber.

I believe that:

1\. The sensors should have detected the pedestrian from far away even if the
lighting was bad at the time. I mean that's sort of the point with autonomous
vehicles, that they are better at perceiving their surroundings and can make
better decisions on how to act quicker than any human could.

2\. The safety operators are not engaged enough in their tasks to be
effective. I think people underestimate how boring it must be to have a job
where 99% of the time you should do absolutely nothing other than stare
straight ahead and be ready for a situation like this. This problem is hard
to solve.
Maybe we should be training the model on closed tracks and only release on the
real roads when it passes some sort of test where it is put through various
scenarios. Like a driving instructor for AIs.

For those of you who think the pedestrian is to blame: I agree that the
pedestrian might have made a bad decision by expecting the cars to brake, but
these situations occur all the time. She didn't dart across the road
or jump in front of the car suddenly. Yesterday I helped an elderly guy across
4 lanes of traffic which took about 1 minute. All you can do in that situation
is to hope you are visible to the drivers and that they will stop before
running you over.

~~~
yaps8
This.

Many people comment that the lighting is poor and that a human might
understand what's happening too late. This is debatable and misses the point:
if visibility is bad, you (and it applies with full force to automated
drivers) should reduce your speed accordingly, maybe with the exception of
freeways where you are not expected to encounter pedestrians.

~~~
singingboyo
> maybe with the exception of freeways where you are not expected to encounter
> pedestrians.

Freeways require you to reduce speed in those cases as well, possibly even
more so. This is pretty well demonstrated by the pileups that occur every
winter, and by rarer incidents caused by dense fog.
Pileups at freeway speeds are much worse than the 2, maybe 3 car crashes that
occur on city streets.

Yes, there are no pedestrians, but one person hitting a pothole or large puddle
and spinning out is all it takes to get a pileup if you're not careful.

------
stefan_
If there is something that isn't going to help the public perception of
autonomous cars at all, it's releasing a compressed to shit capture of another
video showing a single camera angle from dozens.

I would say it's a deliberate attempt to manipulate if I didn't also strongly
believe that ignorance on the part of the police department has led them to
believe that autonomous cars could even exit a parking lot without data from
many more cameras than this one, not to mention the vastly more useful LIDAR
on top.

(That's before you consider the video angles shown here are just for dashcam
purposes. The real cameras for the autonomous driving are in the sensor array
_on top of the roof_ )

~~~
jonathanyc
Yeah. Does the Uber car really capture video at 480p and 15fps? Also only
releasing the video conveniently ignores the fact that these cars have IR and
LIDAR. The pedestrian is hard to see in this video essentially because it is
dark and they are wearing dark clothing. Neither of these are at all obstacles
to LIDAR and IR, and the video at least shows us that the road is clear of
obstructions.

~~~
jon_richards
>Does the Uber car really capture video at 480p and 15fps?

Probably not, but it isn't like the inputs to the self-driving models really
need to be better than that. Lower resolution helps your processing time a
lot, and there's little point in capturing frames faster than you can process
them.

~~~
ClassyJacket
I'm sure at least the collision avoidance part of the system would need to
poll at a much higher rate than 15fps. That's up to 67ms latency you're
adding. With enough miles that delay could kill people.
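
To put that latency in distance terms (my own back-of-the-envelope, using the
38 mph figure the police reported elsewhere in this thread):

```python
# Sketch: distance covered between frames at a given speed and frame rate.
MPS_PER_MPH = 0.44704

def meters_per_frame(speed_mph, fps):
    """Distance (m) traveled during one frame interval."""
    return speed_mph * MPS_PER_MPH / fps

print(round(meters_per_frame(38, 15), 2))  # ~1.13 m between frames at 38 mph
print(round(meters_per_frame(38, 60), 2))  # ~0.28 m at 60 fps
```

At 15 fps the car moves more than a meter blind between samples, which is a
meaningful fraction of a lane width.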

~~~
jon_richards
Average human reaction speed is around 215ms. Not an apples-to-apples
comparison because humans can react much faster to continuous situations
(humans have a timing accuracy of around 9.5ms) while a machine-learning model
is limited to only reacting once per frame, but still.

~~~
Tobba_
If you want to compare against human "sample rate", it'd be equivalent to at
_least_ ~200 FPS (in order to get the same timing accuracy with a camera).
Sure, the signal takes a moment to plumb its way through, but that's
irrelevant to spotting objects.

If they're actually feeding data at 15 FPS into their ML model, then what the
fuck were they expecting? Correlating movements at those framerates would be
nigh-impossible.

Relying on ML for this is already comically irresponsible, but that'd just be
ridiculous.

~~~
michaelcampbell
> If you want to compare against human "sample rate", it'd be equivalent to
> at least ~200 FPS

Where does this figure come from?

~~~
Tobba_
My ass, mostly. I'm extrapolating based on monitor framerates and how
accurately we can see the velocity of fast-moving objects, and that I can spot
a timing difference of ~5ms reliably.

Human eyes are _almost_ comparable in terms of a framerate based on neuron
spiking rates, which max out somewhere around 250-500Hz. Obviously that's not
directly comparable, but it gives an idea of how well we can deal with moving
objects.

------
YeGoblynQueenne
So now we all know: Elaine Herzberg did not run out in front of the car, as
the police said; she was walking at a normal pace. She was not in the shadows;
camera footage is typically darker than human vision. And the reason the
first thing the driver knew of the crash was the sound of it is that she
didn't have her eyes on the road.

And all the car's sensors, its superior perception of its environment, and
its superhuman reaction times were no use without a human-level understanding
of its environment to go with them. It couldn't tell that there was a person
crossing the road in front of it; and even if it could, it didn't have a
concept of what a person is and why it should try to avoid them. In any case,
it just didn't know what to do about it.

So can we now please roll back the off-the-charts hype about self-driving cars
being safer than human drivers? It's abundantly clear that this is not yet the
truth (not yet. _yet_ ). That's just not the state of the art, at this point.
We're like the people jumping off towers with crazy "flying" apparatus, in the
18th century, because they were convinced they could fly that way.

Or maybe we should just stop pretending that what we really care about is
safety and admit we just want to have cool tech toys to play with, no matter
the consequences.

~~~
andygates
You're missing the (admittedly comfy) narrative that the other tech companies
are being diligent and careful while Uber are a bunch of cowboys rushing alpha
code out into public, who only change stuff when they get caught. The autocar
fan's worst nightmare.

~~~
YeGoblynQueenne
I noticed that trend in the comments, yes. Unfortunately, the real issue with
what Uber or Waymo (or anyone else) are doing is the limitations inherent in
the technology itself: specifically, machine learning for object recognition
and identification and for learning complex behaviours.

The limitation is a bit technical, but basically: in principle, machine
learning is possible under certain assumptions, as laid out by Valiant in his
PAC-learning paper ("A Theory of the Learnable"), especially the assumption
that a training sample will have the same distribution as unseen data. Under
this condition, machine learning can be said to work and we can look at
performance metrics and be happy they look good.

Well, except that the real world has no obligation to operate under our
experimental assumptions, so once you deploy machine learning systems in the
real world, their performance goes down, because you haven't seen nearly
enough of the data you really need to see, in the lab.

And, if you attach such assumptions to safety-critical systems, then you're
taking an unknown and unquantifiable risk. Or in other words, you're putting
peoples' lives in danger.

And that's _everyone_ who uses machine learning to train cars to drive in
real-world conditions. Not just Uber.

~~~
317070
Yes, but that is not specific to machine learning. Humans have also learned to
drive under the assumption that future observations will somewhat be alike
what they have seen in the past. And yes, that is putting peoples' lives in
danger.

But that has nothing to do with machine learning. It has to do with all
control systems, human or machine.

~~~
YeGoblynQueenne
The point is that machine learning algorithms' decisions always have some
amount of error, and that this error goes way up in the real world.

The auto-car industry's marketing claims that self-driven vehicles are safer
than humans just because computers have faster "reaction times" (they probably
mean faster retrieval from memory).

But if your reaction is completely wrong, it doesn't matter how fast you
react. Reacting very fast with very high error will just cause a very fast
accident, and make it harder for puny humans' reflexes to avoid it, to boot.

------
TrainedMonkey
There was almost a second where the woman was clearly visible and yet the car
did not attempt to emergency brake. I would not expect the vehicle to stop
safely, but it could definitely have slowed down and reduced the energy of
the collision.

This is not criticism of self-driving tech; I would not expect an alert human
driver to avoid the collision either, due to limited reaction time. With
technology, however, we should be able to do better than humans, particularly
when it comes to reaction times. Clearly, there is still some work to do.

~~~
ajross
Do we know the vehicle didn't brake? There's no audio to tell us anything, and
a car takes "almost a second" to rock forward on its suspension when reacting
to brake input too. FWIW, I think I might see the nose bobbing a bit just
before the collision, but honestly can't tell.

I certainly don't think you can say "did not attempt to brake" based on the
evidence at hand, basically.

~~~
ars
> Do we know the vehicle didn't brake?

Because the nose of the car doesn't go down. Especially in an emergency
braking situation the nose of the car will go significantly down, and you
would see that in the video.

~~~
user5994461
The video is cut before the impact. I don't think we can judge what happened
in the last second that is missing.

~~~
shock-value
There definitely isn't a full second missing before impact. Maybe a tenth of a
second.

------
baking
This was the original story: "After the Uber collision, the car continued
traveling at 38 miles per hour, according to the Tempe police chief"

[https://www.bloomberg.com/news/articles/2018-03-21/for-self-driving-cars-seeing-everything-isn-t-always-enough](https://www.bloomberg.com/news/articles/2018-03-21/for-self-driving-cars-seeing-everything-isn-t-always-enough)

In other words, the car _never_ detected the pedestrian and never slowed down
on its own. This has nothing to do with _when_ it saw her. It clearly didn't,
yet both the camera and the LIDAR should have been able to.

~~~
ikeyany
I feel bad for Waymo and all the guys taking the necessary safety precautions.
This is going to set the industry back quite a bit. Thanks Uber, and RIP
Elaine.

~~~
InclinedPlane
I don't. Yes, I think Waymo is probably taking very nearly all the necessary
safety precautions. But how would we know? It's simply a matter of trust,
which is not good enough. And as we see here, the problem isn't Waymo; the
problem is every other fly-by-night automated vehicle operator out there. We
should all know how the sausage gets made in this industry and it is
absolutely not through a process of meticulous rigor and studious observance
of industry best practices. We all know that software systems get made via
mountains of short-cuts and compromises amidst an environment of half-assed
development practices.

The only way to fix this is to have regulations which actually enforce quality
control. That means code audits, it means a lot more process and a lot more
oversight, all of which is going to be a drag but which is necessary when
developing systems on which so many lives could depend. If you look at the
regulatory requirements for aviation software or even video poker machines
they will really put into perspective how little is being done now with
autonomous vehicles.

------
DoreenMichele
A number of comments here have touched on the fact that she was apparently
homeless. I spent 5.7 years homeless. Well before that, I had a college class
on homelessness and did an internship at a homeless shelter. I am author of
the website the San Diego Homeless Survival Guide.

I've made a few comments in other discussions about some of the ways her
status may have contributed to this tragedy. I'm just going to link to them
with a short identifying blurb. Hopefully, taking them out of context won't
make this go weird places.

Potential suicidal tendencies:

[https://news.ycombinator.com/item?id=16633727](https://news.ycombinator.com/item?id=16633727)

Potentially jaywalking dangerously and darting out of nowhere:

[https://news.ycombinator.com/item?id=16625242](https://news.ycombinator.com/item?id=16625242)

Possibly poor health contributed to her death:

[https://news.ycombinator.com/item?id=16625076](https://news.ycombinator.com/item?id=16625076)

I am leaving this here in part because "She was high or crazy" is a common
stereotype about homelessness and it tends to not be a compassionate view.
There are myriad ways her status could have contributed to the situation
without her being either high or crazy. Yes, she could have also been one (or
both) of those two things. But plenty of housed people get high or have mental
health issues and we don't hand wave off their deaths as "they must have been
high or crazy."

So, my hope is to be respectful of people whose feeling is her status could
have contributed while putting out hopefully better information. Yes, her
status may have contributed. But there is no reason to complete that line of
thinking with "because homeless people are usually crazy and/or junkies."

There is a huge shortage of affordable housing in this country. A link backing
that up is provided in one of my previous comments. These days, a lot of
homeless people are just poor and can't afford rent, even while employed.

------
danso
Wow. Given that the Uber AV was in the right lane and the victim was crossing
left-to-right, I had thought it impossible for the police to claim that the
victim couldn't be seen until it was too late. But I didn't expect the road to
be so dark, given what we saw in the accident photos (which might have been
over-exposed?) and Google Maps, which showed a lot of street/sidewalk
lighting.

In the moment that we can fully see her, she does look unambiguously like a
person walking a bike across the road (reports say there were plastic bags on
the bike, but they weren't obvious/obstructive in the camera view). Is the
AV's LIDAR expected to detect this kind of thing, even if it's too dark for
human eyes?

The video of the Uber driver doesn't look great for the driver. I mean she
doesn't look particularly engaged -- but I suspect that's what _most of us_
would look like at the wheel. But she definitely seems to be looking
downwards, right at the moment of impact.

Unless some other incriminating info is discovered, I hope that the driver
isn't the sole focus of punishment (doesn't help that she's a convicted armed
robber, albeit years ago). Being able to brake in time for the victim seems
difficult even in most ideal and alert conditions. And I have to think human
operators are going to suffer complacency when 95-99% of the time they never
have to actually drive -- making that switch seems to be a situation ripe with
problems.

I don't mean that Uber execs/testers/engineers (again, assuming there isn't
other incriminating evidence) should be scapegoats. I hope the result involves
regulations that add more transparency to reporting (especially in Arizona),
and public debate about the expectations of AV and AI.

~~~
somebodynew
The dynamic range of small sensor digital cameras is so poor that it is
impossible to judge whether or not she would have been visible to the naked
eye based on the video.

~~~
danso
That's a great point. I did think, looking at the video, that the street was
unnaturally dark. The camera seems to be exposed for night lights and the
headlights too.

------
jmharvey
As presented, this looks like a pretty classic "overdriving your headlights"
situation.

Even though they're made of retroreflective material, only two lane divider
dashes at a time can be seen in the video, indicating something like 50 feet
of visibility. Stopping from 38 mph takes 70+ feet from the time the brakes
are applied (and human reaction time adds quite a bit more). Things (people,
animals, stopped cars, road hazards) appear in the road fairly routinely. If
you can't stop inside the area you can see, you're operating your vehicle
recklessly.
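
The stopping-distance claim above can be checked with a quick back-of-the-envelope calculation. The 0.7 g deceleration and 1.5 s reaction time below are generic textbook figures, not measurements for this vehicle or driver:

```python
# Rough stopping-distance math for 38 mph, using generic textbook
# assumptions (0.7 g braking, 1.5 s reaction), not measured values.
MPH_TO_FTS = 5280 / 3600   # mph -> feet per second
G_FT = 32.2                # gravitational acceleration, ft/s^2

def stopping_distance_ft(speed_mph, decel_g=0.7, reaction_s=1.5):
    """Return (reaction distance, braking distance) in feet."""
    v = speed_mph * MPH_TO_FTS
    reaction = v * reaction_s
    braking = v ** 2 / (2 * decel_g * G_FT)
    return reaction, braking

reaction, braking = stopping_distance_ft(38)
print(round(reaction), round(braking))  # 84 ft reacting, 69 ft braking
```

Braking alone eats roughly 70 feet, consistent with the figure in the comment, and reaction time adds more than the braking itself.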

~~~
hcknwscommenter
I think you are missing a key point here. Have you ever been on a street with
streetlights and had difficulty seeing past two lane divider dashes? The point
is that the camera footage is vastly under-representing the distance a human
eye would have been able to see.

~~~
Declanomous
Furthermore, if there are street lights, and you go through an unlit area, you
should be able to detect the presence of an obstacle because it will be
obscuring the view of the road ahead.

Uber's technology clearly sucks, to the extent that I doubt they are even
using Lidar. However, the woman driving the vehicle should be arrested for not
paying attention. If she had been looking at the road for more than a small
fraction of the time, she surely would have seen the person on the road.

------
lewis500
This is not what I expected at all. The original news reports based on the
police comments made it seem like she was on the side of the road and then
randomly darted out. That's not something I would really expect AV's to be
able to cope with yet, if ever.

Instead what we see is a scenario that happens all the time: a pedestrian who
is sleepy or perhaps with mental problems or on some substances...in any case
not hyper alert...crossing a road without looking. It's not that busy at that
time of night, and I'm sure I have done the same thing. It's very discouraging
and angering that these cars are being driven without the capability to handle
it. I feel
very disappointed. To me it's not super important whether an ordinary driver
would've stopped: ordinary people drive drunk, exhausted and/or distracted all
the time; but I hoped that AV's were already better than that.

------
mrtksn
Okay, isn't this the kind of situation where the machine was supposed to
excel?

The dynamic range of the video is very low so it looks like the victim comes
out of complete darkness but weren't all these sensors supposed to see the
obstacle even in complete darkness?

BTW, please consider the low dynamic range of the video when commenting on the
human's ability to avoid that accident. After driving in the dark for a while
your eyes will adapt and you'll be able to see much more details in the
shadows than a regular video camera can record.

------
HankB99
This seems like a simple case. I would have expected the driver assist in my
2017 Subaru to have reacted to something in the road. I'm surprised that the
much more sophisticated self driving system did not.

~~~
darkerside
I wonder if this was affected by the fact that the pedestrian was in another
lane up until the very last second. Perhaps the car detected the pedestrian
but failed to consider it an obstacle since it wasn't in its direct path. It
could be unexpectedly difficult to account for pedestrian crossing speed if it
caused automated cars to stop when a car in the next lane happened to "wobble"
towards the automated car's lane.

~~~
smogcutter
I dunno, if it decided to ignore the pedestrian because she was in the other
lane that's extremely troubling. The pedestrian was moving laterally across
the road. If the car has detected that, it should infer that she might
_become_ an obstacle very shortly.

Driving is all about predicting the future. Think of every time you've been
able to tell that someone is going to change lanes even though their blinker
is off, or slowed down when a ball bounces into the street because you know
there's going to be a kid following it. If the car isn't capable of that, it's
not ready for public streets.

~~~
galdosdi
What's most troubling is it doesn't even matter if the uber car thought the
slow object in the other lane was a pedestrian, a car, a tree, whatever. Even
if it thought the object was say, the most "normal" thing it could be, another
car, this would still be a special situation requiring action. Even without
knowing the object's classification, the estimated speed is enough to decide.
Why would a car be stopped in the middle of a lane on a fast road? It should
be treated as an obstacle that could grow to the side. After all, it could be
a police, tow, construction, or disabled vehicle, and a cop or tow worker
might be about to walk to the side.

Actually many states now require by law that you get as far as possible from a
lane with a disabled vehicle, as many human accidents have happened.

I am convinced uber has been basically pretending to do the mountains of
careful and sophisticated crap waymo actually has gone to great lengths to do,
and is just racing to put anything out so they can keep stringing investors
along as far as they can before the jig is up. Well the jig is up now.
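
The kinematics being described are simple enough to sketch: a constant-velocity check of whether a laterally moving object reaches your lane before you pass it. All of the numbers below are illustrative guesses, not data from the incident:

```python
def paths_conflict(ped_offset_m, ped_speed_ms, car_dist_m, car_speed_ms):
    """Does an object moving laterally toward our lane, starting
    ped_offset_m from the lane edge, reach the lane before we pass it?
    Assumes constant speeds for both -- a deliberately crude model."""
    if ped_speed_ms <= 0:
        return False
    t_enter_lane = ped_offset_m / ped_speed_ms  # time to reach our lane
    t_car_arrives = car_dist_m / car_speed_ms   # time until we pass her
    return t_enter_lane <= t_car_arrives

# Someone walking a bike at ~1.4 m/s, 3 m outside our lane, with the
# car 40 m away at 17 m/s (38 mph): the paths do conflict.
print(paths_conflict(3.0, 1.4, 40.0, 17.0))  # True
```

Even this crude model flags the conflict a couple of seconds out, which is the point: a slow lateral mover near your lane deserves action regardless of what it is classified as.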

~~~
HankB99
I think target fixation also contributes to accidents involving emergency
vehicles on the shoulder.

Back to Uber: the number I hear is that they are striving to do better than
13 miles per intervention to prevent an incident. For Waymo, this is over
5,000 miles. I'm convinced too.

------
Robin_Message
Why are safety drivers not:

\- working in pairs, so there is social pressure, conversation, and two pairs
of eyes to increase alertness and safety?

\- doing shifts of 30 - 45 minutes at most [1] (although they could
potentially swap back and forth with a co-driver)

\- issued a dumb-phone for emergencies and searched for entertainment devices
(it's good enough for Amazon warehouse staff)

\- being monitored by the driver-facing camera, with training and termination
for drivers who can't hack it

\- monitored automatically for attention using eye tracking or other methods,
with the car safely stopping if lack of attention is detected

\- required to take over on a random, regular basis for a short period to keep
them engaged and attentive (and obviously, the car keeps driving if they don't
take over, but they are marked down)

Due to the boredom, it is an extremely demanding job, but the way it is being
done is clearly not good enough.

[1] I can't find anything published about how long the shifts are, but I'm
guessing they are longer.

~~~
ng-user
All of those suggestions will increase cost and decrease profit. There's no
law mandating any of them, so why would they go above and beyond for 'safety'
reasons?

I understand it's the ethical decision but at the end of the day for-profit
companies are only interested in one thing: maximizing profits.

~~~
stordoff
Robin_Message basically says a lot of what I was thinking, but you also need
to consider what maximises profits in the long term (and self driving cars are
a long term project) - is it doing the minimum now to reduce costs, or is it
taking all reasonable precautions to avoid the public perception building that
self-driving cars are unsafe?

------
imh
The video makes it look like the woman popped out of the shadows, and that
this was unavoidable. But that's not what a person would really see.
People's eyes have great dynamic range. Take this picture for example (not
mine):

[https://cdn-images-1.medium.com/max/2000/1*OQtewILwLl-EssbWY...](https://cdn-
images-1.medium.com/max/2000/1*OQtewILwLl-EssbWYKF6sA.jpeg)

What you'd see in real life is much closer to the edited version on the right,
while unedited pictures (the version on the left or the uber video released)
would make it seem like you can't see shit. A driver paying attention probably
would have seen this person from far away. At the very least, the video
doesn't convince me this was unavoidable.
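
The clipping effect being described can be illustrated with a toy model of a linear sensor exposed for the brightest part of the scene. The luminance numbers are made up for illustration:

```python
import math

def camera_code(luminance, exposure_peak, bits=8):
    """Map scene luminance to a linear sensor code value, with the
    exposure set so that exposure_peak hits full scale."""
    full_scale = 2 ** bits - 1
    return min(full_scale, round(full_scale * luminance / exposure_peak))

# Hypothetical luminances: headlight-lit road vs. the unlit gap
# between streetlights, roughly 1000:1 apart.
headlit, shadow = 200.0, 0.2
print(camera_code(headlit, exposure_peak=200.0))  # 255: fully exposed
print(camera_code(shadow, exposure_peak=200.0))   # 0: crushed to black

# The eye's response is roughly logarithmic, and a 1000:1 ratio is
# only about 10 "stops" -- well within what adapted vision resolves.
print(round(math.log2(headlit / shadow), 1))
```

The shadow region quantizes to zero in the video even though an adapted eye would still distinguish detail there, which is exactly the left-image/right-image difference above.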

------
slavik81
Why is the car driving at full speed if it can't see the road ahead? Does the
car know that visibility is low? What other sensors are they using? Did the
car notice the pedestrian once they were fully lit up?

This video raises so many questions. I think we're going to be revisiting this
incident over and over again in papers, reports and eventually textbooks.

~~~
snowpanda
>Why is the car driving at full speed if it can't see the road ahead?

This is the best question to ask.

Why? Because going through all the comments, there seem to be 2 trains of
thought:

1) LIDAR should have seen it. (car's fault).

2) It's the pedestrian's fault.

Neither of these points matters, because the real question is, like you said:
if the car couldn't have seen it, then why drive at full (or actually above
legal) speed?

------
f055
Move fast and break things. Fake it till you make it. Drink the koolaid. All
hail Silicon Valley. Blah.

The ugly truth is, when looking at the high-end camera photos of the accident
spot, a human driver had a 50-50 chance of avoiding this accident. This car
didn't even react. But that's not the worst thing. The worst thing is that the
infrared or heatcam tech is fairly advanced and would have seen this person
even in pitch black. But it seems Uber chose to do "testing on production" and
released an experiment onto the roads. It's well known that lidar tech has
problems during rains, snows and low light. Sure, driving it is great for
machine learning, but at what cost? Especially when everyone expects so much
more from this tech than to equal human drivers. The goal should be to remove
accidents from the equation. But, especially for Uber, the goal is just to
profit.

------
erobbins
A straight road, in dry conditions, obstructed by an entire person and a
bicycle with REFLECTORS on it, and the car doesn't see her?

I think whoever wrote this code should be in prison, quite honestly, along
with their entire management chain. I suspect many people will feel the same
way. This is a really really bad look for automated vehicles in general. It
also backs up my contention that anything less than full 100% automation is
worse than no automation at all, as the driver was complacent and had less
than zero chance of taking over.

~~~
siliconwrath
It certainly does look bad for self driving vehicles, and I definitely feel
terrible that a life was lost from this.

However, how does one develop a self driving vehicle that’s 100% automated
without the ability to test in real driving conditions? Despite this accident,
self-driving vehicles have a fairly safe driving record for the number of
miles and time they have been active.

Details on California accident rates for self driving vehicles, for example,
show mostly minor fender benders despite more frequent accidents:
[http://journals.plos.org/plosone/article?id=10.1371/journal....](http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0184952)

~~~
danso
It's a difficult question. Because we cannot just dive into it assuming that
the toll (in life/safety) of beta testing AVs on public roads will result in a
net benefit within a reasonable timespan. The human driving fatality rate is
~1 death per 100 million miles. Uber has 2 million miles driven and 1
fatality. It's obviously unfair to extrapolate and say that Uber has 50x the
fatality rate of normal driving. But that means we have to keep testing Uber
AVs on public roads.

What if an Uber AV accidentally kills someone at the 2.5M mark? That's still
not enough data to statistically compare apples to apples. Maybe the next 100M
miles of Uber testing is fatality free...that still wouldn't be completely
enough (right? I'm not great at stats but I would think we need at least a
billion?). Of course, it could go the other way, with Uber AVs killing someone
every 1M miles.

As a general tech optimist, I'm inclined to think tech will get better,
overall. But let's face it, that's not a given. And in the meantime, it's
likely the tech upper-class won't be the ones who suffer the most while tech
improves. The case at hand being the prime example: a homeless recently-
imprisoned woman was killed.

Earlier today someone submitted an interesting RAND study that argued that the
testing time for autonomous vehicles to meet statistical reliability for
safety testing would be on the order of decades, or even centuries, and there
would still be no guarantee that AVs would be safer. I'm hoping RAND is just
being really pessimistic here...

[https://www.rand.org/pubs/research_reports/RR1478.html](https://www.rand.org/pubs/research_reports/RR1478.html)
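
The statistical intuition can be made concrete with a simple Poisson model, using only the figures quoted above (≈1 fatality per 100M miles for human drivers, 2M Uber miles). These are the comment's numbers, not an official dataset:

```python
import math

def p_at_least_one_fatality(miles, rate_per_mile):
    """P(>= 1 event) under a Poisson model with the given rate."""
    expected = miles * rate_per_mile
    return 1 - math.exp(-expected)

# 2M miles at the human baseline of ~1 per 100M miles:
p = p_at_least_one_fatality(2e6, 1 / 1e8)
print(round(p, 4))  # 0.0198 -- about a 2% chance
```

A single event is far too little data to pin down Uber's true rate, which is the comment's point, but it does show why the observation is uncomfortable: under the human baseline, one fatality this early would be a roughly 1-in-50 outcome.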

------
highscaler
After comparing the released video and a nighttime video [1] on the same
stretch of road, I think the released video very likely showed a lot less
visibility of the victim due to the dynamic range of the camera. In [2] the
victim's shoes became visible. Note the lamp post with a traffic sign to the
right. In [3] captured from [1] note the same lamp post on the right edge and
the storm drains on the curb, not far from the lamp post and not that dark.
[4] shows the victim was very close to the storm drains. The camera that
captured the released video had to adjust to the bright headlight of the
vehicle. IMO the victim would have been a lot more visible to a human driver.

[1]
[https://www.youtube.com/watch?v=zEaTdYJExq8&t=8m50s](https://www.youtube.com/watch?v=zEaTdYJExq8&t=8m50s)
from user niftich's comment
[https://news.ycombinator.com/item?id=16644146](https://news.ycombinator.com/item?id=16644146)

[2] [https://imgur.com/QfN1XVT](https://imgur.com/QfN1XVT)

[3] [https://imgur.com/Zdkc4PL](https://imgur.com/Zdkc4PL)

[4] [https://imgur.com/ZHE9wYz](https://imgur.com/ZHE9wYz)

------
bigtunacan
Clearly there are still issues with the car. The sensors should have picked up
on this. I would be surprised if the automated emergency braking in my Honda
wouldn’t have picked this up, based on past experience of a close call at
night where it saved my ass.

I want to focus a bit though on the human failure here. Uber has been piloting
this program with human backup drivers just for situations like this until
there is enough confidence in the maturity and safety of these cars.

Whether this driver could have prevented the collision if their eyes were on
the road, we will never know. What is clear is the driver was looking down,
most likely on their cellphone.

For that there is negligence on both the part of the driver and on the part of
Uber. Uber is recording the interior of these cars, so there should have been
some type of review process in place to capture this type of behavior from the
drivers so it could be addressed.

~~~
51Cards
While I do agree there is a level of responsibility on the driver's part here
I will also say that I think it's equally unreasonable to expect someone to
stay alert in that situation. Human brains need stimulus and interaction to
stay alert and focused. To me this is as much a failure of understanding how
the human mind functions... not just on Uber's part, but on the part of all
the companies that think the human will stay alert while the car does the
work. Our brains just don't work that way. Once the brain starts to trust the
car, it disengages and wanders off onto other distractions.

------
Lewton
I really hope Tempe has the balls to revoke Uber's AI driving license.

This is completely unacceptable. Their approach to AI driving has been
terrible from the start. They don't have the data, nor the expertise to do
this in a safe way.

This is not the first time they've fucked up royally

[https://www.theverge.com/2017/2/25/14737374/uber-self-
drivin...](https://www.theverge.com/2017/2/25/14737374/uber-self-driving-car-
red-light-december-contrary-company-claims)

------
grizzles
It strikes me that a good way to test self driving cars would be to simulate
people doing random dangerous things like walking in front of the car and
seeing how good the car is at avoiding an accident. Then you give the car less
and less time to see how well it can eg. respond to a giant mannequin being
thrown in front of it.

Following that train of thought, an accident like this could have been caused
by fear of corporate liability.

Because you would eventually have to set some threshold where the car would
not respond to an errant pedestrian for the safety of the occupants.
Especially in wet weather. A good lawyer could argue that is when the car
software decided to kill the pedestrian. So, Uber's AV team might have just
said "she was jaywalking" to avoid writing code that dealt with situations
outside of the rules of the road because it might in the future look
controversial, even if the addition of that code may have the effect of
sparing more lives than it takes.

------
nkurz
1) I'd hope and assume that an autonomous vehicle would have better sensors
than a single mounted video camera. It does have other sensors that it uses,
right? Were they all operational? What do they show? Combined, do they have
dynamic response comparable to the human eye?

2) The road seems really dark. Are the headlights intentionally dimmed here?
At night, on a empty road with no opposing traffic, shouldn't the headlights
normally be on bright, rather than dimmed? Does the autonomous vehicle control
this switch or the human driver?

3) I feel like a human driver would have been likely to swerve left to avoid
the human. They may still have hit the rear of the bicycle, but no one would
have died. Does the car even attempt to prioritize collisions with inanimate
objects over living ones? Does it at least attempt to slow down if a collision
seems inevitable?

4) If you cannot stop in time to avoid an obstacle on the road after it is
illuminated by your headlights, it would seem clear that you are travelling
"too fast for conditions". Either you need brighter lights, or a you need to
slow down. Is this in dispute?

~~~
ubernostrum
I'm a broken record here, but:

Seriously. Go spend some time in /r/roadcam on reddit to get yourself
accustomed to what a typical dashboard camera video looks like. People here
are drawing all sorts of conclusions based on seeing an unfamiliar type of
video, and that's a really bad thing.

The two biggest things to get used to are:

1\. Dashcams do a terrible job in dark or low-light scenes. The car's
headlights, and the streetlights, were almost certainly functioning just fine;
the scene only looks as dark as it does because that's the best the camera can
do. A dashcam video of a dark/low-light incident will always look
significantly darker than how a human would perceive it when actually present.

2\. Dashcams often have a wider field of view than their human operators.
Which means you often see obstacles in the video before they'd be in the field
of view of a human driver (this causes no end of "well _I_ saw that coming way
before the driver did!" snark in /r/roadcam discussions).

And in theory this vehicle has infrared on board, and a LIDAR system scanning
the road ahead for trouble. Any such system should have had zero difficulty
detecting the pedestrian in time to stop or avoid. The fact that it didn't is
a very bad sign for Uber's implementation.

~~~
stefan_
Here is what even a GoPro can do at night:

[https://www.youtube.com/watch?v=p0HamWCVrFI](https://www.youtube.com/watch?v=p0HamWCVrFI)

~~~
user5994461
Not a fair comparison. The GoPro is static in a well-lit corner. Those are
ideal conditions.

The same camera in motion, facing moving lights, might never be able to
adjust its sensitivity.

------
Treblemaker
The first question that comes to my mind watching this is: why didn't the
_pedestrian_ see the _car_? It looks as though she didn't even look up until
the car was only a few feet away from her. I'm not just trying to "blame the
victim". The car "should" have responded sooner. But see-and-avoid works both
ways. If you're choosing to cross a road in the dark, you, too, have a
responsibility to understand drivers' limited visibility and act
accordingly. My point is that this is a teachable moment for not only SDV
developers, but also for pedestrians.

~~~
jonathanyc
The pedestrian died, not the SDV developers. Calling this a “teachable moment
... also for pedestrians” is rather disrespectful.

~~~
cm2187
I think the only circumstance where a pedestrian can cross a road without
watching for cars is when protected by a traffic light, and even then that's
not wise. A pedestrian crossing a road outside of any crosswalk at night and
not watching for cars is playing russian roulette. That doesn't mean the
driver (or self-driving system) shouldn't have spotted her as well to prevent
the accident. But it is clearly a key factor enabling the accident.

~~~
tzs
> I think the only circumstance where a pedestrian can cross a road without
> watching for cars is when protected by a traffic light, and even then that's
> not wise.

You are definitely correct about even then you should check. And once you
check and see it is clear... _keep_ _checking_ as you cross.

An acquaintance of mine from college was killed crossing Colorado Blvd in
Pasadena, California late on a Friday afternoon, in a crosswalk at a fully
controlled intersection, with the cross traffic having a red light.

Colorado Blvd is a major street, and late on a Friday afternoon would have a
fairly high density of cars.

The car that hit him was going something like 80 mph. At the time he started
crossing, that car would have been 3 blocks away. Even if it had been the only
car on the road, at that distance there would be no way to judge the speed,
and any car that far away traveling anywhere near legal speeds would be far
enough away to not be a danger to any normal pedestrian crossing.

With the other cars that were on the road on a late Friday, it probably would
not even be possible to see the car that hit him when he entered the
crosswalk. There would have been several cars stopped at the intersection
obstructing his view, plus cars at the intersections further up the road, or
in transit between the intersections.

His only chance would have been to keep checking oncoming cars as he crossed,
even after all the cars actually at or near his intersection had stopped.

Almost no one does that. Mostly once we see everyone nearby stop we just
concentrate on cars that are turning and so might enter the crosswalk even
though the light is red (assuming we are in a right turn on red jurisdiction).

Lesson #1: Treat each step as you cross the street as if it is your first step
into the street. Do your full "is it safe to cross" scan constantly.

Lesson #2: Cars very far away at the time you start to cross can make it to
you before you finish, even if traffic seems heavy enough that there is no way
they could go fast enough to reach you. Your scan needs to look out farther
than you think it needs to.

------
alva
On the one hand, the cyclist became visible in the footage at a very late
stage. If that is how the driver's eye saw it, they would have very little
time to react; however, they may have been able to make a sharp swerve.

On the other, the operator was clearly distracted, regardless of whether they
would have been able to avoid this.

~~~
bobbles
You're getting downvoted but the guy was obviously reading his phone the
entire time...

~~~
david-gpu
We don't know if he was looking at a display showing the road from the
perspective of lidar, etc.

~~~
joosters
So? It's their job to be the human lookout in the car, not watching a TV
screen.

~~~
david-gpu
If they didn't want drivers looking at those displays, why would they put them
there? Either looking at those displays is part of the job, or the display
should automatically turn off. I'm guessing it's the former.

------
eanzenberg
Although it seems from the dashcam footage that the scene is dark and the
bike/pedestrian comes out of nowhere, humans are able to handle extreme low-
light situations and I don't doubt that this pedestrian would have been
visible to an attentive driver.

If autonomous cars cannot handle low-light conditions given the current state
of hardware, either from visible, ir, or lidar, then they need to run in the
daytime only.

~~~
pdonis
_> humans are able to handle extreme low-light situations_

But this wasn't an extreme low light situation. It was a situation with street
lights alternating with extreme low light regions. Under those conditions the
human eye will not be dark adapted and your low light vision won't be very
good.

~~~
eanzenberg
Ive driven at night and the video is not representative of what I see. With
your car's headlights and the road lights, there are no blind spots on the
road. If there is not enough street illumination and my headlights cannot
reach far enough ahead I turn the high beams on and travel cautiously.

~~~
pdonis
_> With your car's headlights and the road lights, there are no blind spots on
the road._

I agree that is the optimum, but I don't think all roads are optimal in this
respect.

 _> If there is not enough street illumination and my headlights cannot reach
far enough ahead I turn the high beams on and travel cautiously._

From what I can see in the video, this is what a reasonably cautious human
driver would have been doing on the street shown, since it does not appear to
have sufficient light to avoid blind spots. It does not appear that the Uber
car had its high beams on. But, as has been noted in other comments in this
thread, the car had other means besides visible light (LIDAR and IR) for
seeing the pedestrian, so it should have been able to do better than a human
driver at avoiding the collision.

~~~
eanzenberg
I would love to see what the lidar/IR system was seeing. I'm not sure that
will be released if it's obvious Uber fucked up.

~~~
linkregister
I doubt that NTSB would fail to ensure that footage got released.

------
titanix2
Two obvious things from the footage: the pedestrian is invisible until the
last moment, and the "driver" is focused on something else, probably a
smartphone.

Given how little time there was between when the woman became visible and the
impact, a driver action may not have been able to avoid it, but it will be
interesting to see whether his responsibility is engaged or not.

Anyway, one advantage of autonomous cars is the theoretical ability to have
faster-than-human reactions and extra-human sensors. The system should have
been able to catch the problem.

~~~
badestrand
> the pedestrian is invisible until the last moment

In this crappy video, yes. Maybe it was scaled down massively or made darker
before being handed to the police; we don't know. The pedestrian might have
been visible to a normal person.

~~~
sjg007
The police should ask for the raw footage, and at any rate the DA will if
there's any doubt. They will probably engage a forensic computer tech expert
as well, and possibly ASU professors/engineers. But from a black-box
perspective it's clear that the system did not detect anything. Still, IANAL,
but I would guess there will be a reckless driving charge for the driver
here, mitigated by the pedestrian failing to yield to the car by not crossing
at a crosswalk. The human driver was clearly not paying attention, and there
is video evidence of that. If the driver had kept their eyes on the road and
the accident had still happened, then probably no charges would have been
filed.

------
cozzyd
The dynamic range in the video is probably misleading compared to the sensor
capabilities or what adjusted eyes would see.

If the car can't see things in time to stop, it's probably moving too fast for
the conditions.

------
trhway
The onus is on the driver to ensure the path ahead is clear. Uber clearly
violated the 3-second rule [i.e. reaction time plus braking time, with the
braking time dominating the equation]. Uber was driving too fast for the
conditions. If you have only 30m of clear view, you can't drive faster than
10m/s. Driving faster, as Uber did here, is basically driving blind: reckless
and negligent. The woman was in the headlights for only 1 second, thus Uber
was driving 3 times faster than the maximum safe speed in those conditions
(specifically, given the power and angle of their low-beam headlights).
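The sight-distance arithmetic above can be sketched as a quick check (the
3-second total stopping budget and the 30m of clear view are the comment's
assumptions, not measured values):

```python
# Quick check of the "3-second rule" arithmetic: with reaction plus braking
# taking roughly 3 seconds total, the maximum safe speed is about one third
# of your clear sight distance per second.

def max_safe_speed_mps(sight_distance_m: float, stopping_budget_s: float = 3.0) -> float:
    """Rough maximum safe speed (m/s) for a given clear view ahead."""
    return sight_distance_m / stopping_budget_s

# 30 m of clear view (roughly low-beam range) gives ~10 m/s (~22 mph).
print(max_safe_speed_mps(30.0))  # 10.0
```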

>the pedestrian showed up in the field of view right before the collision

It takes the AZ police and a $70B corp together to conclude that the woman
appeared too fast. Guess what? If people and objects appear too fast in your
field of view, then you're driving too fast.

------
linkregister
In Uber’s own tech blog, it describes its use of LIDAR to generate a wire
frame of each object and track it. The lighting should be irrelevant for the
vehicle, while it is essential for the driver.

From the below article: _“Last fall, Uber officials showing off their vehicles
on Mill Avenue said their radar and lidar were able to detect objects,
including jaywalkers, as far as 100 yards away and avoid collisions.”_

Something went really wrong here, and I look forward to the release of the
rest of the data. I’m wondering why the woman, who had a large enough
silhouette, wasn’t detected by the software.

This is a better source:
[https://www.google.com/amp/s/amp.azcentral.com/story/4476480...](https://www.google.com/amp/s/amp.azcentral.com/story/447648002)

~~~
jozzas
I suspect she was picked up by the sensors and categorised as an object. These
are absolute fundamentals, and I can't fathom the car working at all if it
can't do this. If it was being driven with a limited set of sensors or a
malfunctioning sensor, hoo boy the NTSB will have a few things to say.

I reckon it's more likely that this was due to a software issue where she was
flagged as "not a collision concern" or similar, either because she was in the
other lane of a 2-lane road, or because her path wasn't predicted accurately
due to the odd returns from a bicycle in profile at that range (it may have
thrown off their regular pedestrian detection). In any case, a terrible shame.

------
ineedasername
Okay, so Tempe police aren't going to be able to get & grok the raw data on
how the AI was handling things. So how does that shake out in an
investigation? It seems like they'll have to rely very heavily on the good
faith of Uber (hah!) to honestly interpret & communicate the truth of that
side of the story.

Releasing the video so soon without the equally important (probably more so)
AI side is almost deliberately misleading, and at best shows Tempe ill-equipped
to deal with the investigation if they believe that video answers more
questions than it raises, and I can't believe they don't know better.
The video screams "Blame the poor light/lazy driver/idiot pedestrian!", at
least for viewers that aren't savvy about the tech involved.

~~~
Maybestring
The NTSB is opening an investigation, and Congress critters are watching.

The Tempe police aren't going to lead the crash analysis. If history is any
guide, they'll probably have NASA doing code reviews.

------
markc
Watch the footage carefully and you'll see that the car actually tracks to the
_right_ immediately before impact.

When centered there's about 20" between the sides of the car and the lane
markers. (120" lane - 80" car) / 2.

At the moment of impact, the victim appears to be between 3' and 4' from the
right lane marker. To avoid the fatality, the car would have only had to track
left 2' from center.

So my question is - what was the last possible moment such an avoidance
maneuver (veer left) could have been initiated?

I think that's an interesting factor in assessing whether this tragedy was
avoidable or not. It certainly seems like a technical failure that the car was
easing slightly to the right and not braking or veering left.
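The clearance arithmetic above, as a minimal sketch (the 120" lane, 80" car,
and 3-4 ft position estimates are the comment's own assumptions):

```python
# Lane-clearance arithmetic from the comment, all figures in inches.
LANE_WIDTH_IN = 120
CAR_WIDTH_IN = 80

# Gap between each side of a centered car and the lane markers.
side_gap_in = (LANE_WIDTH_IN - CAR_WIDTH_IN) / 2  # 20"

# If the victim was ~3.5 ft (42") from the right marker, she overlapped the
# car's path by roughly 42" - 20" = 22", i.e. about the 2 ft of leftward
# track the comment estimates would have avoided the impact.
victim_from_right_marker_in = 42
overlap_in = victim_from_right_marker_in - side_gap_in
print(side_gap_in, overlap_in)  # 20.0 22.0
```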

~~~
vesrah
Could have been getting into the right hand turn lane to head down Mill Ave
that blends in right there.

------
lhorie
When I cross-reference the lane lines at 0:03 with google maps[1], I get an
estimate of the car being about 60ft-70ft away from the victim when I was
first able to spot her shoe in the video. Some googling[2] says human reaction
at 40mph would've taken 59ft, with an additional 80ft of actual stopping time.

This seems in line with the police report that "it would have been difficult
to avoid this collision in any kind of mode (autonomous or human-driven)"

Maybe an SDV with superhuman reaction time should've been able to stop some
10ft after hitting her, and maybe if it had night-vision-like sensors it
should've been able to see her from farther away, but my impression is that it
would've been impossible for a regular human driver to avoid this accident.

Given that the SDV didn't attempt to reduce speed before she was visible under
regular light, I'm forced to assume the sensors do require light in order to
detect obstacles and that there wasn't enough light to activate the sensor
before the headlights illuminated her.

[1]([https://www.google.com/maps/@33.4369852,-111.9436199,113m/da...](https://www.google.com/maps/@33.4369852,-111.9436199,113m/data=!3m1!1e3))

[2][https://www.google.com/search?q=stopping+distance+40mph&rlz=...](https://www.google.com/search?q=stopping+distance+40mph&rlz=1C5CHFA_enUS746US746&oq=stopping+distance+40mph&aqs=chrome..69i57j0l5.5503j0j7&sourceid=chrome&ie=UTF-8)
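The cited figures are consistent with standard stopping-distance kinematics;
here is a sketch, assuming a 1-second reaction time and roughly 0.67 g of
braking (both values chosen as assumptions to match the linked numbers):

```python
# Reproduce the ~59 ft reaction + ~80 ft braking figures for 40 mph using
# standard kinematics: reaction distance = v * t, braking distance = v^2 / 2a.

MPH_TO_FTS = 5280 / 3600   # mph -> ft/s
G_FTS2 = 32.17             # gravitational acceleration, ft/s^2

def stopping_components_ft(speed_mph: float, reaction_s: float = 1.0,
                           decel_g: float = 0.67):
    v = speed_mph * MPH_TO_FTS
    reaction_ft = v * reaction_s                # distance covered before braking
    braking_ft = v ** 2 / (2 * decel_g * G_FTS2)  # v^2 / 2a
    return reaction_ft, braking_ft

reaction, braking = stopping_components_ft(40)
print(round(reaction), round(braking))  # 59 80
```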

~~~
burger_moon
Isn't the speed limit only 35, not 40, on that road, unless we're looking at
different roads? If the driver were paying attention they could turn the wheel
a bit and split the left lane (there doesn't appear to be anyone else on the
road) and avoid the person, who was halfway through the lane at the time of
impact. It might not clear the person, but clipping the back of the bike would
certainly be better than a direct impact. You wouldn't even have to jerk the
wheel and swerve at such a slow speed, and on a dry road with modern TCS even
over-correcting is handled to limit what your input does. Of course, all this
is just playing armchair quarterback, so who knows if someone, including
myself, would have been able to react in time. It's more of a thought
experiment about how an alternative could have occurred.

I'm curious how the visibility differs between what we see on this camera and
what a human actually sees in person.

~~~
lhorie
According to Google Maps, the speed limit is 45 northbound (which is the way
the car was going), 35 southbound. The report was that the car was going 38,
but I could not find stopping-distance estimates for that exact number, so
give or take a little.

Looking at the video, it seemed like the driver reacted (facially) about a
second after she shifted her gaze from some device to the road. It took me
several runs of replaying the video to narrow down the time between when I
first saw her shoe and collision to about 2 seconds. Also, recall this was at
10pm. In my opinion, swerving or braking with a 1-2 second notice is
_extremely_ hard, especially if it's late and you're tired. To be perfectly
honest, if it was me, I don't think I would've been able to react at all
before the collision.

------
mlinsenbardt
It appears that Uber may have disabled/overridden Volvo's built-in safety
platform, "City Safety". City Safety automatically brakes for obstructions at
up to 50mph.

[https://www.youtube.com/watch?v=gcqktnqlf10&feature=youtu.be...](https://www.youtube.com/watch?v=gcqktnqlf10&feature=youtu.be&t=18)

[https://support.volvocars.com/en-CA/cars/Pages/owners-
manual...](https://support.volvocars.com/en-CA/cars/Pages/owners-
manual.aspx?mc=v526&my=2016&sw=15w46&article=b9c7f9b0caf57af1c0a80151666ac460)

------
archi42
A lot of people seem to mention how LiDAR should have seen her coming:
Generally speaking, that's true. But it's difficult. In a single frame the
system would hopefully detect _something_ on the left of the lane, but
wouldn't know what it is (if they use AI it's probably not trained for "person
shoving bike packed with plastic bags").

If you have the object detection in place, you still need a trajectory for
that object; if you don't know the exact angle you're looking at that object
it's difficult. And moving at any speed there are plenty of things that could
go wrong with the calculation, especially for slow objects. Also remember
you're seeing a lot of oncoming traffic (for which you don't want to do an
emergency brake) and cross traffic moving onto the road, letting you pass
before they enter your lane (for which you might want to slow down a bit).
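As a toy illustration of the trajectory problem described above (not any real
perception stack), a constant-velocity extrapolation from two observed frames:

```python
# Minimal constant-velocity trajectory prediction: extrapolate an object's
# future position from its last two observed positions. Real systems fuse
# many frames and sensors; this only illustrates how sensitive the predicted
# path is to the velocity estimate.

def predict_position(track, dt_ahead, frame_dt=0.1):
    """Extrapolate (x, y) position dt_ahead seconds into the future."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    vx = (x1 - x0) / frame_dt
    vy = (y1 - y0) / frame_dt
    return (x1 + vx * dt_ahead, y1 + vy * dt_ahead)

# A pedestrian 30 m ahead, crossing from the left at ~1.4 m/s: two seconds
# out, she is predicted to be right at the edge of our lane. Noisy returns
# (e.g. from a bicycle in profile) would corrupt the velocity estimate and
# hence the predicted crossing point.
track = [(-3.0, 30.0), (-2.86, 30.0)]
x, y = predict_position(track, 2.0)
print(round(x, 2), y)  # -0.06 30.0
```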

Maybe the bags reflected LiDAR light causing a wrong classification... Maybe
their detection noise was too high and the quick fix was to turn down
sensitivity.

Whatever happened exactly (and I hope we get more information), this is a
terrible accident, because the object in question was not part of some
controlled test, but a human being left dead.

Still, the technology is very promising, but companies should be more careful
with testing; have drivers be aware of the road (is that IR light from the
camera or a smartphone?), and maybe do more testing in controlled
environments.

As a software engineer, I would only release an autonomous car into the wild
after I trust it enough to walk into its path on a dark road (with enough
space to brake; suicides cannot be avoided)... I wonder if the Uber engineers
would?

~~~
hnaccy
I don't know how you can watch this video and then say that they should "maybe
do more testing in controlled environments".

This was a complete failure and Uber's cars should be immediately banned from
testing on public roads and there should be investigation into their practices
to ensure public safety while testing their vehicles. If they rushed it trying
to play catch up with Waymo they should be punished.

Personally I see this as a Therac-25 situation.

~~~
archi42
It's a "maybe" because I don't have any more data points than what's present
in the video - only assumptions.

But I strongly suspect this could have been avoided by more thorough testing
(e.g. using a contraption to shove various inanimate objects into the path of
the car at various speeds [both object and car] in various light conditions).
And _iff_ a proper investigation identifies the people responsible for this
accident (engineers doing dirty fixes, managers neglecting safety, the driver
just being an unaware dummy; you name it), they should face the same
consequences a human driver would.

And yes, their self-driving cars should stay off the road until the fault has
been found and fixed. I never claimed the opposite. (Aren't they? IMHO it's
the obvious thing to do; but then we're talking about Uber and their poor
sense of responsibility...)

Perhaps "maybe" has too weak a connotation in this context (as a non-native
speaker I sometimes find it difficult to pick the right word for exactly what
I want to imply), for which I apologise if I misled you.

------
carlsborg
According to this article, the algos only track objects that have been
classified as "moving" and the initial classification is what matters.

"Let’s say that there is a huge red fire truck idling at the side of the road.
An autonomous car can’t see the fire truck if the fire truck suddenly pulls
out into traffic. Why? Autonomous cars have a stop and start problem. They
don’t have unlimited processing power, so they cut down on calculations by
only calculating the potential future location of objects that seem to be in
motion. Because they are not calculating the trajectory for the stationary
fire truck, only for objects in motion (like pedestrians or bicyclists), they
can’t react quickly to register a previously stationary object as an object in
motion.

If a giant red fire truck is parked, and it suddenly swings into traffic, the
autonomous car can’t react in time."

[http://www.nextgov.com/emerging-tech/2018/03/self-driving-
ca...](http://www.nextgov.com/emerging-tech/2018/03/self-driving-cars-still-
dont-know-how-see/146797/)
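As an illustration (not Uber's actual code) of the failure mode the quoted
article describes, where only objects already classified as moving get
trajectory predictions:

```python
# Illustrative only: if trajectory prediction is restricted to objects whose
# estimated speed exceeds a threshold, a stationary object that suddenly
# moves has no predicted path until its classification catches up.

def objects_to_predict(tracked, speed_threshold_mps=0.5):
    """Return only the tracked objects currently classified as moving."""
    return [obj for obj in tracked if obj["speed_mps"] > speed_threshold_mps]

scene = [
    {"id": "cyclist", "speed_mps": 4.0},
    {"id": "parked_fire_truck", "speed_mps": 0.0},  # about to pull out
]
print([obj["id"] for obj in objects_to_predict(scene)])  # ['cyclist']
```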

~~~
chopin
That's worrying. If that's really the case, autonomous cars shouldn't be on
the road.

------
dschuler
I've noticed that pedestrians in the US tend not to look at cars coming their
way when they cross intersections. I'm not sure if it's an attitude of
"you're supposed to stop for me" and they're willing to bet on that happening
every time, or if they're really unaware of the danger of a few tons of steel
coming at them.

This always struck me as very strange as someone from Europe - they teach kids
in elementary school to look both ways and not to underestimate the speed of a
car.

There are about 2500 pedestrian deaths in the US every year [1], it's just
tragic and some of these fatalities could certainly be avoided :/

Edit: Since sgc mentioned 6-6.5k pedestrian deaths per year in the EU, which
didn't sound any better than the US, I tried to come up with pedestrian
fatalities per 100k population (data for 2012 or 2013):

    
    
      | Country | Fatalities per 100k | % pedestrian | Pedestrian fatalities  per 100k |
      |---------+---------------------+--------------+---------------------------------|
      | USA     |                10.6 |         14.1 |                          1.4946 |
      | Italy   |                 6.1 |         16.2 |                          0.9882 |
      | Germany |                 4.3 |         16.7 |                          0.7181 |
      #+TBLFM: $4=$2*$3/100
      http://apps.who.int/gho/data/node.main.A997?lang=en
      http://apps.who.int/gho/data/node.main.A998?lang=en
    

This example really speaks for itself.
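The table's last column (already computed by the #+TBLFM formula above) can be
reproduced as:

```python
# Pedestrian fatalities per 100k = total fatalities per 100k * pedestrian share.
rows = {"USA": (10.6, 14.1), "Italy": (6.1, 16.2), "Germany": (4.3, 16.7)}
for country, (per_100k, pedestrian_pct) in rows.items():
    print(country, round(per_100k * pedestrian_pct / 100, 4))
# USA 1.4946 / Italy 0.9882 / Germany 0.7181
```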

[1] [https://www.npr.org/2017/03/30/522085503/2016-saw-a-
record-i...](https://www.npr.org/2017/03/30/522085503/2016-saw-a-record-
increase-in-pedestrian-deaths)

~~~
sgc
I'll bite!

According to
[https://ec.europa.eu/transport/road_safety/sites/roadsafety/...](https://ec.europa.eu/transport/road_safety/sites/roadsafety/files/pdf/statistics/historical_country_transport_mode_graph.pdf),
there are 6000-6500 pedestrian deaths per year in the EU (the data is only in
graph form). EDIT: here is the raw data:
[https://ec.europa.eu/transport/road_safety/sites/roadsafety/...](https://ec.europa.eu/transport/road_safety/sites/roadsafety/files/pdf/statistics/historical_country_transport_mode.pdf)

As someone who has spent a fair amount of time in both places, I can say that
the US pedestrian feels more entitled due to an _almost_ universal pedestrian
legal right of way outside of controlled intersections (stop lights). It's a
foolhardy attitude, and completely antithetical to the way people are taught
(which is: look right, then left, then right again before stepping off the
curb). But Italians often do the same because they wouldn't ever cross the
street otherwise, and there will be a small group of risk takers adding to the
statistics in every country.

~~~
user5994461
There is also a pedestrian right of way in Europe, at least in the countries I
know of.

I think that European drivers are convicted when they hurt a pedestrian. I
don't think that Americans get convicted as often or as long. That could
explain why they don't drive as carefully.

~~~
sgc
How could you possibly come to that interpretation of the data? There are 2.5x
more pedestrian fatalities in the EU. It would appear that US roads are safer
for pedestrians, and of course drivers are convicted if they are at fault.

------
hn_throwaway_99
This is definitely not a case of someone jumping out into the road, as was
previously reported. Sure, jaywalking at night is a bad idea, but this is
exactly the type of thing an automated driving system should catch and avoid.

~~~
in_cahoots
I thought the whole initial response from the police was weird. They went out
of their way to basically say she jumped out in front of the car and that the
accident was unavoidable. They even told us that the woman was homeless, which
I viewed as a subtle way to hint that maybe she wasn't in her right mind.

It just felt like they were saying everything possible to absolve Uber, in a
way that’s very different from how most crashes are treated.

~~~
brokenmachine
100% agree. The Police Chief even said the video would "potentially shift the
blame to the victim herself, rather than the vehicle."

If that's not very strange language to use for a public servant describing a
video that shows a fatal vehicle-vs-pedestrian accident, I don't know what is.

I'm not in the US so I don't know, but people in another thread were making
comments that that is often how it is handled in the US, the crash victim is
blamed somehow. It was implied by one person that police want to avoid
paperwork.

------
lern_too_spel
It certainly looks like a human driver would have braked enough to make this a
minor injury accident. It's amazing that the additional sensors of the Uber
car were not enough to avoid the accident entirely.

~~~
gsam
I wouldn't say minor, but a reasonably attentive driver would have probably
slammed the brakes, even if it was too late.

------
eric_b
The Uber car was a Volvo XC90. They are supposed to have some of the best
pedestrian avoidance safety features in existence. My understanding is that a
stock Volvo would be expected to catch this situation and avoid or stop. Did
Uber override these systems? Or does it just not work very well?

~~~
mkempe
I had the same question, especially because several relatives and I own Volvos
and pay attention to their safety promises. One would assume that Uber cannot
disable Volvo's safety systems...

However, it appears that Volvo's "City Safety" detection of pedestrians or
bicyclists requires _clear visual_ identification. So it works in daytime, or
at night with strong illumination, but not in relative darkness. [1]

One other thing that seems off in the video is that the headlights don't
illuminate much distance ahead.

[1] [https://support.volvocars.com/en-CA/cars/pages/owners-
manual...](https://support.volvocars.com/en-CA/cars/pages/owners-
manual.aspx?mc=v526&my=2018&sw=17w46&article=c3d7a163845903b5c0a801513bd0d7f3)

------
brandonmenc
I lived in that area of Tempe for nearly a decade, as recently as two years
ago.

That is an extremely well-lit stretch of road - there are so many lights on
that road that it's practically daylight conditions at night.

The video is far, far darker than what you'd actually see.

------
arduinomancer
I feel like a human would've been able to at least brake a little or swerve in
this situation?

Not avoiding the collision but perhaps lessening the damage?

It looks like the car doesn't even try to slow down.

------
ChuckMcM
This time it actually has the video. And it is a poignant reminder that there
is no such thing as a too reflective vest for bicyclists.

~~~
jonknee
> there is no such thing as a too reflective vest for bicyclists

And to not jaywalk a bike across a street in the dark!

~~~
freehunter
Since I've seen this be a very controversial statement here in the past, I'd
just like to point out to everyone that American drivers on four lane roads
are _not_ expecting pedestrians to be in the middle of the road and not at a
crosswalk. That is not a normal thing to happen in America, and no one expects
it to happen. It is a very rare situation.

\--edit: okay guys... nowhere did anyone argue that you shouldn't be watching
for pedestrians crossing. Stop yelling "gotcha!" like you caught me in a trap.
What I'm arguing against is the common refrain on these articles that there is
no such thing as jaywalking and that the pedestrian has the right to the road
over the car. Maybe it's true in Europe, judging by comments on previous
articles, but it's not common in the US. It's illegal.

Congratulations on expertly knocking down your own strawman, though.

~~~
jakeogh
I have driven in the US for over 20 years and have the opposite experience. I
am constantly looking for people and animals (deer are quite dangerous to hit,
and they move _way_ faster), and I lost count long ago of how many times I
have been in very similar or worse situations.

It's not even remotely rare. Drive on campus or downtown in AZ for more than a
few minutes.

~~~
freehunter
Yes, obviously you should watch for objects unexpectedly crossing the road. No
one argued you shouldn't. The point is, some people have and will still argue
that the pedestrian has more right to the road than the driver and pedestrians
should be allowed to cross anywhere they please. But that's not how it works
in the US, crossing roads like this is illegal unless it's done at a
crosswalk, and normally people are not stupid enough to do it with oncoming
traffic.

~~~
jakeogh
The proponents of pre-programmed cars would be wise to stop blaming "humans
breaking the rules" for their mistakes.

~~~
freehunter
You're not even arguing against what I've said. You are not the intended
audience for my comment.

~~~
jakeogh
"obviously you should watch for objects unexpectedly crossing the road. No one
argued you shouldn't."

Strawman. I argued the opposite. You made the grossly incorrect assertion
"That is not a normal thing to happen in America, and no one expects it to
happen. It is a very rare situation."

If pre-programmed cars can't handle random objects in unexpected places then
they are uber DOA.

------
duiker101
Shouldn't the car have been driving with its brights on? I mean... she was
already crossing, so she didn't come out of a corner or something, and I have
a feeling that with brights on she could have been seen. Also, there don't
seem to be any other cars coming the opposite direction, or anything else that
would be a reason for the car not to have its brights on.

~~~
jayess
No, this is an urban area just north of downtown Tempe:

[https://www.google.com/maps/place/N+Mill+Ave+%26+E+Curry+Rd,...](https://www.google.com/maps/place/N+Mill+Ave+%26+E+Curry+Rd,+Tempe,+AZ+85281/@33.4336598,-111.9422532,16z/data=!4m5!3m4!1s0x872b0931b4d9dd05:0x5d55a20356caaf8!8m2!3d33.4377022!4d-111.9433046)

~~~
sjg007
I dunno I would've probably had them on if there was no opposing traffic.

------
rootusrootus
Regardless of how suddenly she appears in the camera, it is pretty logical
that she was probably visible to the human driver if he had been watching the
road. The car was not going nearly fast enough on a stretch of road like that
to be overdriving the visibility provided by the modern headlights of a
brand-new Volvo. I bet she was visible to the driver in plenty of time to have
stopped.

I'm an optimist, but still I think we are much farther away from legitimate
self-driving cars than a lot of people think. And every accident like this
where the computer vastly underperforms a human driver is a huge PR blow to
the technology.

~~~
smileysteve
I agree, and I particularly like that you mention this is a brand new Volvo. I
regularly drive much faster on roads that are likely to have deer, and this
driver is clearly not paying attention. I think the driver should be
considered for manslaughter charges (for which, as they were an employee, Uber
is civilly liable).

~~~
rootusrootus
I am not totally onboard with the idea of manslaughter charges, but in
principle I'm with you. I am nervous about how the liability question gets
worked out. When a human behind the wheel is directly liable for their actions
we at least have some incentive for good behavior. When we detach that
liability and assign it generically to a company instead, then it feels like
our options for coercive behavior shaping have been significantly reduced. We
can't put the corporation in jail, and history shows that we rarely punish
corporations significantly; even large fines are frequently pretty small when
you look at the budget of the corporation involved.

------
dogweather
An alert human driver would still have had a bad crash, but might not have
killed the pedestrian.

On a two-lane road, you cannot simply swerve into the other lane w/out
checking. I (and many other drivers?) would have just slammed on the brakes.
But at that road's speed limit, I believe the brakes wouldn't have stopped the
car in time, and a collision would still have occurred.

The pedestrian+bicycle was a direct obstacle, not at some slim and oblique
angle. So it is odd that the car's collision detection didn't pick up on it.

tldr; a human driver may or may not have done better.

~~~
perennate
Although a collision may still have occurred if a human slammed on the brakes
immediately after seeing the pedestrian, even a 10mph slowdown may have turned
the fatality into a minor injury. ProPublica has a graph showing the frequency
of fatalities in crashes at different impact speeds: the occurrence of
fatalities more than halves between 38mph and 28mph impacts:
[https://www.propublica.org/article/unsafe-at-many-speeds](https://www.propublica.org/article/unsafe-at-many-speeds)

The article says that the vehicle "doesn’t appear to slow down". If the data
shows that it actually didn't slow down, it seems like there is a critical bug
in Uber's software that has killed a pedestrian.
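To illustrate the point above with rough kinematics (the 0.67 g braking figure
and the half-second of braking time are assumptions, not data from the crash):

```python
# Even a late, partial brake application meaningfully cuts impact speed:
# impact speed = initial speed - deceleration * braking time.

MPH_TO_FTS = 5280 / 3600   # mph -> ft/s
G_FTS2 = 32.17             # gravitational acceleration, ft/s^2

def impact_speed_mph(initial_mph, braking_time_s, decel_g=0.67):
    """Remaining speed (mph) after braking_time_s seconds of hard braking."""
    v_fts = initial_mph * MPH_TO_FTS - decel_g * G_FTS2 * braking_time_s
    return max(v_fts, 0.0) / MPH_TO_FTS

# Half a second of hard braking from 38 mph already brings the impact speed
# down to about 31 mph, where the fatality rate is substantially lower.
print(round(impact_speed_mph(38, 0.5)))  # 31
```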

~~~
dogweather
Yep,

> An alert human driver would still have had a bad crash, but might not have
> killed the pedestrian.

------
908087
Side note related to this incident:

When did the "futurology" crowd become so militant? I've seen multiple people
on multiple forums blaming the victim here, with some even going as far as to
say that they "deserved to get hit". Other rather disgusting comments have
been along the lines of "you have to break a few eggs to make an omelette".

Since so many of these people are outwardly saying that they're comfortable
with people dying to "train the AI" (their words), I wonder if any of them
would volunteer themselves or someone they care about to play the role of
"broken egg"?

Others have been blaming low light conditions, which is ridiculous for so many
reasons I wouldn't even know where to start.

The general feeling I get, is that they simultaneously acknowledge that this
technology is still developing, but refuse to admit that it isn't already
perfect. If they had their way, and blame for incidents was always shifted
somewhere else, autonomous cars would never improve.

Not a good look, and we're in for a very messy dystopian future if the lowest
common denominator is already willing to blindly trust what they believe to be
"AI".

------
ars
Direct link:
[https://twitter.com/TempePolice/status/976585098542833664](https://twitter.com/TempePolice/status/976585098542833664)

The interior view tells you everything you need to know about why anything
except perfection in self driving cars will lead to more accidents.

The driver, specially selected and trained for exactly this job, does not look
at the road! If someone specially trained for that won't do it, the general
public certainly will not.

Also, in the video you cannot see the bicycle until it's almost too late. But
I have better night vision than this camera; I feel that as a human I would
have seen the bike. (For example, the sides are pitch black in the video, but
in similar lighting I can see stuff to the side.)

But the real issue is that self driving cars should have better vision than a
human, not worse!

~~~
jerf
"The driver specially selected and trained for exactly this job, does not look
at the road!"

Humans gonna human. Our brains are trained by millions of years to not pay
attention to things not directly important. It's hard enough sometimes to keep
our attention on the road when we're doing the driving and the driving is too
easy, like being alone on a highway at night. Expecting humans to passively
observe driving while making no decisions and taking no actions... well...
could _you_ watch eight straight hours of video footage from a car, tomorrow?
I know I couldn't. I couldn't even before HN/Reddit/instant entertainment was
constantly in my pocket, let alone now.

The idea that sitting a human in an otherwise-automated car will do anything
after the first few hours is absurd. It's a feel-good measure and nothing
more. I wouldn't have expected any other result.

~~~
ineedasername
Yeah, there's that videogame about driving a bus constantly, on an empty road,
for hours and hours. It occasionally veers off, so you have to pay attention,
but that's all. That game should be a minimum "driving" test for safety
drivers. (Not really, but something like that)

~~~
darklajid
Desert Bus. Not quite Twitch stream material..

I agree that it's an apt comparison, except the Uber seems a worse experience.
In Desert Bus you have to correct the bus: stupid, boring work, but something
necessary to do, or you fail.

There seems to be nothing at all to do in that Uber?

~~~
gowld
_ahem_ [https://desertbus.org/](https://desertbus.org/)

~~~
darklajid
I .. was aware of that, but you literally pay for Schadenfreude and charity
there. I sincerely believe that no streamer can make money or get a following
with a schedule of Desert Bus runs.

Nor do I want to believe that there are people out there, bored out of their
minds enough to click on anything, anything(!) on Twitch, who would pick
Desert Bus over literally any other thing for more than three minutes.

It's painful. People like seeing others suffer, I guess.. But these streams do
serve a purpose in this thread: The poor players do look exhausted and ready
to die after a short while already. Which brings me back to my original point:
These Desert Bus gamers have something to do (correct the bus repeatedly) and
in your example a whole community who engages them. The Uber driver ... has
nothing at all?

------
ashwinpp
Apart from all the questions about the camera, resolution, and culpability, we
need to rethink what training and evaluation of autonomous cars mean with
respect to rare incidents.

Training the cars on data collected from roads is heavily biased towards
incident-free conditions. This gives no training signal or feedback for rare
occasions such as this one. If a learning algorithm is deciding what to do
(assuming hand-coded rules are brittle, and hence one would want to learn to
handle these scenarios), then it perhaps has no training data for them.

Evaluating the cars by incidents per million miles is fine, but it doesn't
tell us anything about how the incidents would have been handled had they
happened. There is no incentive for the learning algorithm to slow the car
down to prevent a fatality if all it cares about is whether an incident
happened, not its severity.

One possible solution: autonomous cars are trained in real-life simulations
(using realistic lighting conditions, dummies, and what not) to be able to
handle rare incidents, and they are also required to pass regulatory testing
in similarly realistic conditions to verify their behavior in rare incidents
before they are allowed on actual roads.

~~~
linkregister
Luckily, self-driving companies agree.

All of the major ones use test tracks with live hazards, including the
scenarios you mentioned.

Waymo hazard testing
[https://www.google.com/amp/sanfrancisco.cbslocal.com/2017/08...](https://www.google.com/amp/sanfrancisco.cbslocal.com/2017/08/24/google-
autonomous-car-spinoff-waymo-builds-town-as-test-track/amp/)

Uber hazard response
[https://youtu.be/LM8Zw8fiPXE](https://youtu.be/LM8Zw8fiPXE)

Uber test track
[https://www.google.com/amp/s/amp.businessinsider.com/ubers-f...](https://www.google.com/amp/s/amp.businessinsider.com/ubers-
fake-city-pittsburgh-self-driving-cars-2017-10)

Cruise Automation’s test track [https://www.autoevolution.com/news/general-
motors-starts-an-...](https://www.autoevolution.com/news/general-motors-
starts-an-autonomous-car-development-team-builds-it-a-test-track-108721.html)

As far as driving on public roads, I tend to agree. As do the companies. They
all have a safety driver behind the wheel, ready to take over.

There was clearly something that went wrong in this tragedy, but it wasn't a
reckless disregard for safety on anyone's part that led to it.

------
timcederman
The worst part of all of this is that the XC90 (the car involved) has
emergency pedestrian braking built in (like many new cars today), yet based on
the video it appears to have been disabled so as not to interfere with Uber's
autonomous driving system.

------
noahmbarr
Can’t believe all those redundant sensors didn’t catch this obstacle.

------
AngeloAnolin
Lots of mention of LIDAR here, as well as IR sensors, etc.

Unless Uber comes out with the full technical specifications of what is
mounted on the car to detect motion and obstructions on the road, all of this
is mere speculation.

The vehicle in question would be in the custody of the authorities
(impounded), so they should be able to investigate it in detail. Uber should
provide as much technical documentation on the vehicle as possible to aid the
investigation (especially if they are certain it was more the fault of the
pedestrian than of the self-driving vehicle (SDV)).

SDV companies and advocates should probably pool resources to buy a parcel of
land, build it out into a city-like setting, and run edge-case tests: people
suddenly crossing, animals, objects, difficult lighting conditions. This would
allow better simulation, and advance the technology toward autonomous cars
that are safe and reliable on the road.

Update: Added safe and reliable.

~~~
kyrra
Waymo has 91-acres already[0]. They are selling/donating that land to the
"California AutoTech Testing, Development and Production Campus", which will
expand to 300 acres and allow others to use it too[1].

[0] [https://www.wired.com/story/google-waymo-self-driving-car-
ca...](https://www.wired.com/story/google-waymo-self-driving-car-castle-
testing/)

[1] [http://wardsauto.com/industry/samsung-joins-waymo-
autonomous...](http://wardsauto.com/industry/samsung-joins-waymo-autonomous-
testing-site)

------
cmpolis
It isn't obvious from the video whether the human driver would have had time
to react, but they certainly did not help their chances by being inattentive.
What is the point of having a human driver if they are not going to be
watching the road or able to react?

If they aren't a passenger, are they merely there to get around regulation
that requires a human in test autonomous driving vehicles?

------
andrei_says_
This exact situation happened to me on an unlit road 3 years ago. I swerved
into the opposite lane and avoided the collision, without thinking. I was just
lucky that my reflexes kicked in in time and that there was no oncoming
traffic.

I had to stop and shake out all the adrenaline and rage after that.

Real question, what would have happened to me, a human driver, if I had hit
the person?

~~~
thinkloop
Police ruled it wouldn't be your fault.

~~~
andrei_says_
This would be an incredible relief. This has been haunting me.

I didn’t have video however.

------
borkt
Can't tell if the camera has poor contrast resolution or if it was indeed that
dark. With the dynamic range available on digital sensors, it seems to me that
even without some form of FLIR it should still be easier for sensors to
capture this than for eyes, at any rate. Do self-driving vehicles use any of
this, especially at night?

------
nanis
This is pure speculation, but it is my working theory: the car's so-called
"AI" thought the thing in front of it was a four-legged animal (e.g., see
[https://goo.gl/images/3qAmiM](https://goo.gl/images/3qAmiM)).

In most states, braking for animals in a way that might endanger others on the
road is illegal. So, it decided to go ahead and hit the thing in front of it
on purpose.

I am not sure if the people who program the firmware for these contraptions
have ever driven cars in difficult conditions. Between Teslas ramming into
trucks and Uber cars taking out poor homeless people for a bonus score, it
seems to me they are programming things, but they have no idea how this stuff
is supposed to function outside of ideal parameters.

Sure, provide the driver with more information. But, taking the driver out of
the responsibility equation is stupid.

~~~
madez
> In most states, braking for animals in a way that might endanger others on
> the road is illegal.

If that is true, then that is a very bad law. If something appears in front of
a car I'm driving, the first thing I do, before anything else, is brake. And
so should you, and everybody else. First, you save precious reaction time by
braking first and thinking second. Second, it might be a human, and thus worth
much more than a damn car. Third, even if it's not a living thing, it's not
uncommon for humans to step into the street to go after it.

Any dangers from sudden braking are avoided by mandating sufficient space
between cars. If a car brakes suddenly and the car behind it crashes into it,
the driver of that second car is responsible for the damages, because they
evidently didn't keep enough distance to the car in front of them to react in
time to dangerous situations.
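A back-of-the-envelope sketch of this sufficient-distance argument: if both
cars can brake equally hard, the trailing driver mainly needs to cover the
distance travelled during their own reaction time. The reaction time here is
an assumed figure:

```python
def min_following_distance(speed_ms, reaction_s=1.5):
    """Distance covered before the trailing driver even touches the brake
    (assumes both cars brake equally hard, so only reaction time matters)."""
    return speed_ms * reaction_s

for kmh in (50, 100):
    v = kmh / 3.6  # convert km/h to m/s
    print(f"{kmh} km/h: keep at least {min_following_distance(v):.0f} m")
```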

~~~
nanis
The intent behind such laws is simple: You must not endanger others to save an
animal. These are good laws, even if those without actual driving experience
may not understand them.

As the Globe and Mail puts it: "It doesn't matter why the chicken crossed the
road — your responsibility is to the people around you."[1]

    
    
        "If you're forced to choose between getting
        rear-ended and hitting a small animal — hit
        the animal."
    
        "People end up in serious crashes with other
        cars because they let their emotions take over
        when they see a bunny, DiCicco says."
    
        "Drivers need to be prepared to make these
        tough decisions and not get in a 12 car pileup
        because they saw a baby deer and thought of
        Bambi," he says.
    

My unfounded speculation is that the RC-car jockeys who programmed how the car
should react under various conditions did not fully understand what such laws
mean (probably having no real-life driving experience in ambiguous
conditions), and that the car's sensors saw the human with the bicycle but
thought it was an animal.

This is pure speculation, but it makes sense to me. My gut feelings are
somewhat validated by every single comment here that tries to make this this
poor woman's fault.

PS: You might correctly deduce that I do not think "self-driving cars" is a
good idea. I think everyone needs to be reintroduced to Asimov.

[1]: [https://www.theglobeandmail.com/globe-
drive/culture/commutin...](https://www.theglobeandmail.com/globe-
drive/culture/commuting/what-should-you-do-if-you-see-an-animal-on-the-
road/article19494060/)

~~~
madez
> These are good laws, even if those without actual driving experience may not
> understand them.

No, they are not. You didn't even touch the arguments I gave.

If there is a risk of collision with the car behind you, then not you but the
driver of the car behind you is at fault for not keeping a safe distance to
account for their reaction time and brake performance.

~~~
nanis
> but the driver of the car behind you is at fault

That will be small consolation if you cause harm to others' persons or
property when trying to avoid an animal such as a deer. That is the practical
upshot.

If you hit a deer, insurance will pay you under comprehensive coverage and
your rates will not go up. On the other hand, if, because of your attempt to
save the deer's life, you hit a guardrail and smash your car, and get hurt in
the process, you'll be lucky to collect anything.

The main point I tried to make above is this: I am speculating that the car's
"AI" decided the person walking a bicycle fit the profile of an animal, and
that this interacted with a ruleset embodying this principle in a simplistic
way, because the code for these stupid self-driving cars is being written by
people with little to no experience driving in less-than-ideal conditions.

Also, a Tesla S cannot see a stopped firetruck:
[https://www.wired.com/story/tesla-autopilot-why-crash-
radar/](https://www.wired.com/story/tesla-autopilot-why-crash-radar/)

------
jonknee
I suppose we'll need to wait for some standardization in the sensor
department, but it would be great if autonomous car makers could share the
sensor data for collisions so that everyone can learn from the mistakes. Test
data is great, but there's nothing like real world problems.

------
phyller
This was not what I was expecting. If this had been an accident with a normal
vehicle, I think the driver would have been found to be at least partially at
fault, and guilty of involuntary manslaughter. But I'm not a lawyer.

I thought the victim jumped out into the street, but that clearly isn't the
case. The pedestrian always has the right of way, a human would have seen her
and slowed and swerved, and no one should be driving so fast that they can't
stop for an obstacle that is just coming into view. And like many others
mentioned, I'm sure she was visible to the human eye where the camera's
exposure was set up to not wash out what was in the headlights.

Uber screwed up on this, maybe they were just lucky it didn't happen earlier.

~~~
tzs
> The pedestrian always has the right of way

That's a very common misconception, but in fact in the majority of states in
the US vehicles have the right of way on roads over pedestrians outside of
marked crosswalks and crosswalks at an intersection [1].

I think this misconception arises because many people overestimate what
privileges and protections having the right of way confers. Some people seem
to think having it means that you can go come hell or high water, no matter
what the people who are supposed to yield are doing.

It is not nearly as powerful as that.

All that having the right of way actually means is that others that want to
use the same resource as you are supposed to let you go first. If they ignore
that and try to go ahead of you, it doesn't mean you can unsafely force your
way ahead of them.

For example, if I'm standing beside a highway, nowhere near an intersection or
a marked crosswalk, looking toward the other side clearly wanting to cross,
the cars have the right of way in most states. I'm supposed to wait until
there is a gap in the traffic sufficient to let me cross safely.

If no such gap comes and I start to step out into the street the cars have to
stop even though they have the right of way. If police witness this they can
ticket me for something like "failure to yield".

[1] [http://www.ncsl.org/research/transportation/pedestrian-
cross...](http://www.ncsl.org/research/transportation/pedestrian-
crossing-50-state-summary.aspx)

~~~
phyller
Yea good points. I didn't mean "right of way" as a legal term, but as a common
phrase which means you are responsible as a vehicle driver to not hit people,
even if they are doing stupid, illegal things. If you've had a bad day at work
you don't get to speed up and swerve into little Johnny as he runs across the
street after his ball, then tell his parents "he shouldn't have been in the
street" and drive away.

------
lmilcin
Please mind that the camera does not have the dynamic range a healthy person's
eyes have. There are also other subtle signals that humans can use that
computers will still have problems using. I have personally avoided a few
collisions with pedestrians solely because they occluded some faraway light,
or because of a very subtle flicker of some part of their clothing.

I don't see how a car relying solely on vision (it does not seem to react to
the pedestrian as if it were "seeing in the dark") can replace a human at the
current level of technology.

If this is true and the car was not able to use LIDAR/radar/IR cameras etc. to
augment the visible spectrum, then I don't think this car should have been
allowed to drive fully autonomously.

------
peterwwillis
Collisions between pedestrians, bicyclists and cars would be impossible if we
restricted the roadway. Collisions of vehicles between lanes, including the
opposite and crossing lanes, would also be impossible.

You don't even need self-driving cars to do it, and it would greatly reduce
the number of serious accidents and fatalities. You could start with just the
roads with the most accidents.

And by the way, nobody seems to have mentioned that _assisted driving
technology_ could be added to cars today. We already have cars that emergency-
brake for us. They could also warn us of upcoming red lights, fast approaching
obstructions in the roadway, or unsafe actions on the road, such as letting
the car drift lanes without a turn signal.

------
uptown
I'm surprised the vehicle didn't detect an object moving into its field of
view. What are the expectations from the driver under the terms of Arizona's
self-driving car laws? Are they expected to remain engaged as a driver, or are
they permitted to allow the car to perform all of the driving? It's impossible
to say whether an attentive driver would have changed the outcome of this
scenario -- largely because a highly compressed video doesn't really convey
what the driver was actually capable of seeing -- but it's very clear from the
video that the driver was more focused on a phone (or some other device) and
was not an active participant in driving the vehicle.

~~~
danso
I'm definitely interested in the legal expectations too. If it turns out LIDAR
completely dropped the ball here, it might not be the case (given AZ's lax
rules) that Uber would be penalized for LIDAR that was substandard.

Presumably, there'd be regulations for software too, classifiers and AI
decision making. Has any jurisdiction set standards in this?

------
SpikeDad
I think this brings up an interesting philosophy for autonomous driving
technology - is it feasible if it's "only as good as human driving"?

I mean, I've seen comments that a human might not have seen the pedestrian. Is
that a "defense" of driving AI? That it's about as good as human driving?

Seems to me the public won't accept driverless cars unless there is
significant evidence that it's much better than human drivers - after all
human drivers make a lot of mistakes and cause a lot of fatalities. I don't
think any other claims of the advantages of AI driving could offset any
negative publicity of injury or death.

Just a thought in the context of this tragedy.

~~~
dfxm12
I think experience is the best teacher. In a perfect world, Uber will look at
the data and determine why this happened. The AI will learn, and it won't
happen again. Compare that to the countless new human drivers entering the
road every year. That's not a defense or an excuse, but an explanation of
reality.

I know we don't live in a perfect world, but regardless, we can't change what
happened here, so it's best to learn something from it.

~~~
kbwt
> The AI will learn

[https://xkcd.com/1838/](https://xkcd.com/1838/)

------
aianus
However bad the car was or wasn't at detecting the victim, it's nothing
compared to how bad the human victim was at detecting the car (which
presumably had its lights on coming down an empty street at night).

Imagine if the victim was driving a car...

~~~
brokenmachine
What's your point? She wasn't driving a car. Imagine if the Uber hardware was
installed in a petrol tanker. It's just as pointless a discussion.

If self driving cars can't be trusted to not smash into any obstacle that
appears such as a deer/roadwork/child/runaway shopping cart, then they
shouldn't be permitted on the road.

~~~
aianus
The point is that humans who have poorer sensors and do even stupider things
are allowed to drive. So the bar for self-driving is a lot lower than people
around here make it sound.

~~~
perennate
I actually believe that the kind of situation shown in the video (pedestrian
walking across road at night in the path of a car, where car has to brake hard
to avoid a collision) happens much more frequently than most people think. And
most of the time the pedestrian comes out fine because most drivers are
looking ahead most of the time, and someone walking across the road is a big
abnormal event that is easy for a human to notice and react to.

I don't have any data to support this, other than a vague memory of doing
something similar many years ago. Although if I remember correctly, that was a
more major road that was probably better lit. But I still find it hard to
believe that human vision would have been as lousy as the camera's.

~~~
brokenmachine
I agree. I can remember quite a few times I've had to slow down.

One time there was a bar fight that spilled onto the road I was driving along.
I slowed down as I could see the commotion on the sidewalk, and lucky I did
because they ended up in the path of my car.

This self driving car would have killed quite a few people in that case it
seems.

------
static_noise
Do you remember the reaction test where a ball rolls onto the road and you
have to hit the brakes?

This situation is more akin to a ball already lying on the road as you drive
over it.

This situation is very clear-cut. There is an object (a subject, even) on the
road directly in front of the vehicle, and the car just drives straight into
it. It doesn't even swerve. It doesn't brake. Nothing. This is a total failure
of the system.

What if there is an accident and people are standing on the road for some good
reason? What if it is a deer in the headlights? What if a kid is running
loose? Is the vehicle supposed to just hunt them down?

------
relyks
If the car was able to pick up the person with the sensors, but the system
failed to stop the collision, will the programmers who designed the autonomous
system be held ethically accountable for the bug that led to the death? O_O

~~~
sjg007
Yes. I am not sure if self driving cars are subjected to regulatory aspects
such as the SDLC in FDA regulated industries but they probably should be (as
should embedded computers in autos in general).

------
goldenkey
The other commenters here have done a good job of examining this incident
under the lens of computer vision and AI/ML automated driving.

So I would like to take a moment to remind everyone, especially pedestrians
but also drivers: to ignore the road is to play Russian roulette. It doesn't
matter where you grew up, even if it was a place where drivers always stopped
for pedestrians and were oh so polite. Even a car at a stoplight could
malfunction, or the person driving it could err... Always keep your distance
and track anything heading in your direction. What happened here is a tragedy,
but it raises the question: at what point do we classify risky behavior as
negligence? Russian roulette is clearly suicide, but crossing a street at
night without paying attention to headlights in the distance isn't? I'd say
the relative risks are vastly different but the outcome similar.

What is even worse for life-loving pedestrians is that cars can have their
headlights turned off at night (newer models usually can't, but older ones
can). Add in an EV and the car is invisible and silent... Clearly, someone
driving under those conditions starts to look like a potential murderer. But
similar scenarios are worth thinking about. Really, anyone walking the street
at night should be aware of the non-obvious cues of motor vehicles.

------
connorgutman
Some thoughts from a Tempe / Phoenix pedestrian:

Arizona has the highest pedestrian death rate in the United States. Drivers
are incredibly reckless here, and for people like myself who do not drive,
jaywalking is never an option. I will not excuse Uber's mistake here. There
must be repercussions for their lack of oversight. However, I would like to
defend Arizona's loose legislation surrounding self-driving testing on our
streets. The honest answer is that the only way forward for self-driving
adoption is through real-world data. It's the fastest way to reach global
adoption. The faster that companies like Uber and Waymo can improve their
technology, the faster we can see a reduction of human drivers on our roads.
While this accident was horrible and must serve as a wakeup call, it must be
acknowledged that there were 224 pedestrian deaths across Phoenix in 2017. In
the past week alone there were 12 pedestrian deaths. This figure has actually
increased over the years and makes Arizona a unique data point for auto
companies. I am in no way suggesting that Uber should face zero repercussions.
However, as a local pedestrian, if you gave me the option to magically switch
every vehicle here over to Uber's technology with current specifications I'd
take it in a heartbeat. I interact with these vehicles every day, and feel
safer with them still.

------
hiisukun
Since I didn't find it in a Ctrl+F, I'll chime in to say that one of the
advantages of self-driving cars is the scrutiny that accidents will come
under, and how the results of that investigation can improve the driving of
all other self-driving cars.

While many fatal car accidents are scrutinized to an appropriate level
throughout the various countries across the world, many are not. And when they
are examined in detail and the vehicle had a fault, the systems in place to
issue recalls and defects are quite robust and reasonably effective, although
time consuming. When the driver is at fault, the local judicial system (if it
exists) can attempt to rectify that behaviour through the usual means - fines,
imprisonment, training, what have you.

But with a self-driving car, the autonomous systems can be updated after a
collision like this is understood, improved, and distributed to existing
vehicles at a scale and rate far surpassing anything possible with humans
driving cars individually in every city across the world.

I don't think self-driving cars are flawless in concept or mandatory for all
circumstances, but having been to fatal car accidents that are eerily similar
to one another on a few too many occasions, I can welcome large scale
discussions on what went on in a death such as this one - because the positive
outcomes can be replicated elsewhere, hopefully saving someone from the same
fate.

~~~
usgroup
Typically, models don't work like that: changing the model or its training
alters its behaviour in general, and needs to be tested against overall
performance. So while crash data may go into a dataset that is then used to
better evaluate or train different aspects of a system, it has none of the
luster of maximising the learning from every accident.

The human mind is quite unique in its ability to decide where one model stops
and another one starts.

------
kozikow
The way I see it, this fatality was a calculated risk for Uber. Uber realizes
how far ahead Waymo is and decided to drop too many safety constraints in
developing its self-driving technology.

Any safety regulations based on this fatality are going to hit Waymo just as
much as Uber, but also slow Waymo down and give Uber time to catch up. Any
financial fine would be tiny compared to the total money invested in
self-driving technology.

------
mrlinx
Isn't the driver clearly looking at a phone in his lap?

------
BartBoch
The issue I am having with this is that you can easily see something break the
sight line between the car and the lights slightly to the left. Mind that the
video is not only of pretty low resolution, it also does not render true black
properly. A human observing the road would be able to catch that and, as a
precaution, take their foot off the gas and put it on the brake pedal. Those
small changes make the difference.

~~~
stevenwoo
That's a great point, and something we use in driving even in daytime:
sometimes you are driving on a curvy mountain road and see something move
through the leaf cover, and know to take precautionary steps, slow down, and
take extra care to stay in the lane. I don't know much about it, but I wildly
speculate that some of this might fall into general AI, though.

------
John_KZ
It's time to set the precedent by jailing those who deemed the system was
ready for real-world use.

This woman's death will not be in vain. I did not expect such a terrible fuck
up so early on, but I'm glad it clearly demonstrates how badly made autonomous
vehicles are. They're not going to be ready to hit the streets for decades,
unless of course we decide that Uber's profit margins are worth more than
human life.

------
tedd4u
A few people have commented along the lines of "I probably couldn't have
stopped in time even if I had been driving." I think it's important as a
driver to hit the brakes as hard as you can even if you don't think you'll be
able to avoid the collision, because the impact energy is reduced greatly even
by a small amount of deceleration (kinetic energy scales with the square of
speed). The probability of death in a car/pedestrian collision drops from over
80% to under 10% by slowing from 40 mph to 20 mph. [1]
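The arithmetic behind this can be sketched as follows; the 7 m/s²
deceleration is an assumed figure for hard braking on dry pavement, not a
measured one:

```python
import math

# Rough sketch of why braking always helps: kinetic energy scales with v^2,
# so even partial deceleration before impact sheds a lot of energy.

def impact_speed(v0_ms, decel_ms2, distance_m):
    """Speed remaining after braking over `distance_m` (0 if stopped)."""
    v_squared = v0_ms ** 2 - 2 * decel_ms2 * distance_m
    return math.sqrt(v_squared) if v_squared > 0 else 0.0

MPH = 0.44704                      # meters per second, per mph
v0 = 40 * MPH                      # ~17.9 m/s
v = impact_speed(v0, 7.0, 15.0)    # hard braking over 15 m before impact
energy_ratio = (v / v0) ** 2       # fraction of kinetic energy remaining
print(f"impact at {v / MPH:.0f} mph instead of 40 mph")
print(f"remaining kinetic energy: {energy_ratio:.0%}")
```

Even a modest braking distance before the impact removes most of the kinetic
energy, which is exactly why slamming the brakes is worth it even when a
collision looks unavoidable.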

Also, I think it's likely the human eye would have been able to see the woman
before the camera does. The camera needs her to be fully covered by the
headlight beam. I know when I drive at night I focus my attention on the area
between the road ahead and the fringe of the beam as I scan center, left and
right. The area outside the beam coverage never looks fully black to me. I
definitely can perceive obstacles outside the main beam coverage.

[1]
[http://www.humantransport.org/sidewalks/SpeedKills.htm](http://www.humantransport.org/sidewalks/SpeedKills.htm)

~~~
pdonis
_> I think it's important as a driver to hit the brakes as hard as you can
even if you don't think you'll be able to avoid collision_

It's good to hit the brakes, yes, but the best way to avoid almost all
collisions is to steer, not just to brake. Braking combined with steering
sharply to the left (since the pedestrian was moving to the right) would have
been the best response to this situation.

(Note that, before anti-lock brakes became common, "hit the brakes as hard as
you can" was _not_ good advice, precisely because it would lock the wheels and
prevent you from steering. In a car with anti-lock brakes, that's not an
issue.)

~~~
chillwaves
Depending on your speed, that is very bad advice. It sounds like a recipe for
losing traction and fishtailing (potentially into other cars, people, or even
stationary objects).

There is only so much traction afforded by the contact patch of your tires.
You can brake, you can turn, you can speed up, but I would not recommend doing
more than one at a time.
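This "traction budget" intuition corresponds to the friction-circle model; a
minimal sketch, with an assumed dry-asphalt friction coefficient:

```python
import math

MU = 0.9  # assumed tire/road friction coefficient on dry asphalt

def within_traction(brake_g, steer_g, mu=MU):
    """True if combined braking + steering demand (in units of g) fits
    inside the friction circle, i.e. the tires keep their grip."""
    return math.hypot(brake_g, steer_g) <= mu

print(within_traction(0.9, 0.0))  # braking alone at the limit: holds
print(within_traction(0.9, 0.3))  # limit braking plus steering: slides
```

The model says you can split the budget between braking and steering, but the
vector sum of the two demands cannot exceed the available grip.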

~~~
pdonis
_> There is only so much traction afforded by the contact patch on your tires_

If you have anti-lock brakes, they are designed and calibrated to know how
much to brake to be just short of losing traction. That's what they're for--to
allow you to brake and steer at the same time without having to have special
training to know exactly how to balance the two.

If you don't have anti-lock brakes (but it's very rare now not to have them in
a car of recent manufacture), then it's still possible to brake and steer at
the same time, but it's not easy or intuitive to balance the two to maintain
control. It can be done, but you basically have to be a professional driver.

~~~
chillwaves
You cannot argue against the laws of physics.

It is inadvisable for any non-professional driver to attempt maximum braking
and also steer away. It is just not safe.

~~~
pdonis
_> You cannot argue against the laws of physics._

Sure. Anti-lock brakes work in accordance with the laws of physics.

 _> It is inadvisable for any non-professional driver to attempt maximum
braking and also steer away. It is just not safe._

This is true for a vehicle that does not have anti-lock brakes. It is _not_
true for a vehicle that does have anti-lock brakes. As I said before, that is
precisely what anti-lock brakes are for: to allow you to just push the brake
pedal as hard as you can while also being able to steer (because the anti-lock
brakes translate "push the brake pedal as hard as you can" into "don't apply
the brakes hard enough for the wheels to lock"). And the reason why anti-lock
brakes were invented and put into cars was precisely that you need to _both_
brake _and_ steer to avoid most collisions.

~~~
chillwaves
I looked into this further, since I was trained not to steer and brake
together. I understand anti-lock brakes, but they do not answer the question
of breaking traction; they just stop your wheels from locking up (which does
maximize traction vs. locked wheels).

The answer to the question is ESC (Electronic Stability Control), which is
mandated by law in all US vehicles as of 2012. ESC controls braking on
individual wheels to prevent over/under-steering, which was my primary concern
about swerving in emergency situations.

This short video is a great illustration, it helped me understand how the
system works. Looks like my information about swerving is out of date.

[https://youtu.be/L1qt84c2KN0](https://youtu.be/L1qt84c2KN0)

Previously I only found this study that claims people were not driving or
braking properly with ABS resulting in more single car crashes due to over-
steering while braking.

[http://www.smartmotorist.com/traffic-and-safety-
guideline/im...](http://www.smartmotorist.com/traffic-and-safety-
guideline/improper-steering-endangers-drivers-with-abs.html)

------
blackrock
This is just the raw camera input. The question here is, what did the software
see?

What did the sensors register? Is their system using an ultrasonic sensor? Did
this detect anything? Commercial ultrasonic sensors can detect something at
least 15' away. So the software could have at least slowed down to minimize
the collision impact.
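For scale: a 15-foot detection range is far shorter than the braking distance
at the speeds discussed in this thread, so such a sensor alone could only
shed speed, not stop the car. A rough sketch, with an assumed hard-braking
deceleration:

```python
FOOT = 0.3048   # meters per foot
MPH = 0.44704   # meters per second, per mph

def braking_distance(v0_ms, decel_ms2=7.0):
    """Distance to a full stop under constant deceleration: v^2 / (2a)."""
    return v0_ms ** 2 / (2 * decel_ms2)

v0 = 40 * MPH  # ~17.9 m/s, a speed in the range discussed in this thread
d = braking_distance(v0)
print(f"full stop needs ~{d / FOOT:.0f} ft, vs ~15 ft of ultrasonic range")
```

So a short-range return arriving 15 feet out leaves only a few tenths of a
second of braking, which is why the longer-range LIDAR and radar are the
sensors that matter here.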

What did the LIDAR see? Did their software detect an object?

This accident happened on a clear night. There was no rain. There was no snow.
The pedestrian crossed the street at a perfect 90-degree angle to the
sidewalk. The pedestrian did not jaywalk and cross diagonally. The pedestrian
even walked the bike across the road. Thus, the pedestrian performed a legal
road crossing.

I don't think a human driver would have made this same mistake. When I drive
on a dark road, I occasionally flash my high beams to ensure that there is no
obstacle in front of me for hundreds of feet. That is how I, as a human, can
safely determine that the road is clear for me to proceed.

What's on trial here, is the safety of the software system developed by Uber.
And on this instance, their system failed, and a human died.

~~~
macintux
I believe this isn't even "input", it's just a dash cam. The data available to
the car hasn't been released yet.

------
InclinedPlane
Someone recorded driving the same section of road at night using a cell phone
camera and you can see how much better that camera reproduces the real-world
visibility of the situation:
[https://youtu.be/1XOVxSCG8u0?t=26](https://youtu.be/1XOVxSCG8u0?t=26)

That stretch of road was not particularly poorly lit, the pedestrian would
have been abundantly visible to anyone paying attention, and it should have
been picked up by the Uber vehicle's vision system as well, let alone their
sensor suite.

I'm going to go out on a limb and guess that the pedestrian that got hit has
crossed at that area many times in the past. There is actually a pedestrian
crossing on the median there, but it is purely decorative and posted with
signage saying not to use it. However, the area has pretty good sight lines
down the road and it does seem to be well lit, so a pedestrian who cavalierly
crosses through traffic would likely nevertheless have a very low chance of
getting hit.

------
jacksmith21006
This reminds me of a Google story from two years ago.

"A Cyclist's Track Stand Befuddled One of Google's Self-Driving Cars"

[https://gizmodo.com/a-cyclists-track-stand-totally-
befuddled...](https://gizmodo.com/a-cyclists-track-stand-totally-befuddled-
one-of-googles-1727024285)

Perfect example of a corner case. I do NOT believe you can be "kind of"
driving, so having a safety driver means little. The software has to be able
to handle this type of situation or should not be on the road.

I get that it was crazy for this lady to be in the middle of the road in
complete darkness. But this is the EXACT situation where you would have
thought an SDC would perform far better than a human.

What is really needed is some way to replay this data against other
companies' algorithms and see what they would do.

I'd love to see what Google would have done. I also worry that in trying to
keep up with Google, these companies are doing very unsafe things. Uber would
be far better off just using Google's technology.

------
AndyKelley
I thought self driving cars had radar. How is this in any way excusable? This
looks like it should have been trivial to detect and avoid.

------
jacksmith21006
This video just reinforces that you can NOT be "kind of" driving. Google has
been pushing this point for a while and I completely agree.

------
krick
Oh, come on, I think all these comments about Uber PR are way, way too harsh.
Sure, the camera sensor might be making video appear darker than it actually
was, and I'm surprised LIDAR didn't help, and sure something might and should
be improved. But given the number of accidents happening in clear daylight
in pretty damn obvious situations without any automatics whatsoever, purely
because of humans — it's unfair to make a case of robot-auto unreliability out
of it. I don't know what would be the chance of human driver hitting this
girl, but I'm guessing high. If anything, the pedestrian _sure as hell should
've seen_ the car! Apparently though, she didn't. Why? Well, because.

The thought of myself driving this car manually in this situation actually
makes me feel scared. I _might_ have been going slower… I _might_. Fuck, I'm
actually panicking trying to imagine that.

~~~
oldgradstudent
As bad as they are, human drivers in the US have 1.25 fatalities every 100
million miles, across all road conditions and pedestrian behavior.

Uber just crossed the 2 million mile mark and it already has a fatality, in a
case where a competent driver could have, at the minimum, slowed down by a few
miles per hour to reduce the severity of the crash enough to result in an
injury instead of a death.

There's no excuse here. The Uber self-driving car should not be anywhere near
a public road, where children, the elderly, or distracted people can be found.
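For scale, the comparison above can be sketched as a rough Poisson
calculation (treating fatalities as independent rare events at the quoted
human rate, which is a simplifying assumption):

```python
import math

# How likely would a fatality be in 2 million miles if Uber's cars were
# merely as safe as human drivers (1.25 deaths per 100 million miles)?
human_rate = 1.25 / 100e6       # fatalities per mile
uber_miles = 2e6
lam = human_rate * uber_miles   # expected fatalities at the human rate
p_at_least_one = 1 - math.exp(-lam)

print(round(lam, 3))             # 0.025 expected fatalities
print(round(p_at_least_one, 3))  # ~0.025 chance of one or more by now
```

Under that toy model, a human-equivalent fleet would have had only a few
percent chance of producing a fatality this early, which is the commenter's
point made quantitative.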

------
tbabb
IIRC, high-resolution thermal infrared cameras are ITAR controlled-- which
means they are not available for consumer applications above an abysmal
resolution (something like 320x280).

Public safety like this seems like a great impetus to do what was done for
GPS, and declassify this crucial (and now really quite mature) technology for
the general public.

~~~
wilsonjchan
You can easily buy a VGA FLIR camera in the US. They'll probably run your name
through the denied-party list to make sure you don't have a history of
diverting products to countries we don't like.

FLIR will sell you cryo-cooled cameras domestically as well.

[https://www.flir.com/browse/rd-and-science/high-
performance-...](https://www.flir.com/browse/rd-and-science/high-performance-
cameras/fixed-advanced-level-cameras/?page=2)

------
u801e
Based on the video, it took about 1.5 seconds from when the pedestrian came
into view till the vehicle collided with them.

At 38 mph, a car is traveling about 56 feet/second. That means that the car
traveled about 84 feet in that timeframe. VOL (Visual Aim on Left) headlamps
are supposed to be aimed 2.1 inches below headlamp level at a distance of 25
feet [1]. If the headlamps on the vehicle were about 2 feet off the ground,
then they should have been able to light up the roadway about 286 feet ahead.

Had the headlamps been aimed properly, then the driver could have seen the
pedestrian about 5 seconds in advance. That would have given the driver about
1 second to react and 4 more seconds to slow down or change direction to avoid
the collision.

[1]
[http://www.danielsternlighting.com/tech/aim/aim.html](http://www.danielsternlighting.com/tech/aim/aim.html)
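The arithmetic in this comment can be sketched as a quick back-of-envelope
check (the 2.1-inch drop per 25 feet and the 2-foot mount height are the
comment's own figures; the 38 mph speed is from its first paragraph):

```python
MPH_TO_FPS = 5280 / 3600  # feet per second in one mph

def beam_reach_ft(mount_height_in: float, drop_in: float = 2.1,
                  at_ft: float = 25.0) -> float:
    """Distance at which a beam aimed drop_in inches down per at_ft feet
    reaches the road surface, given the headlamp mount height."""
    slope = drop_in / at_ft          # inches of drop per foot of distance
    return mount_height_in / slope   # feet until the beam hits the ground

reach = beam_reach_ft(mount_height_in=24.0)   # headlamps ~2 ft off the ground
speed_fps = 38 * MPH_TO_FPS                   # ~56 ft/s at 38 mph

print(round(reach))                 # ~286 ft of lit roadway
print(round(reach / speed_fps, 1))  # ~5.1 s of warning at 38 mph
```

This reproduces both numbers in the comment: about 286 feet of illuminated
road, or roughly 5 seconds of warning at the car's speed.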

~~~
usaphp
> they should have been able to light up the roadway about 286 feet ahead

But when that person was 286 feet away from the vehicle - she was also way off
to its left. Headlights mostly light up what’s ahead of you, not what’s on
your side. Unless I am missing something.

~~~
u801e
> Headlights mostly light up what’s ahead of you, not what’s on your side.

Good headlamps should have a uniform and widely distributed beam pattern which
should be able to light up at least the lane of traffic you're in, as well as
the shoulder and several lanes to the left.

------
Hendrikto
Not to say there isn't a problem here... But walking across the street in
dark clothes, without any light or reflectors, at a random spot does not seem
like the best idea to me.

If I have a picnic on the train tracks and get hit by a train, maybe you
should consider my stupid decision as well when evaluating the safety of
trains.

~~~
ht85
People don't expect trains to avoid them if they step on the rails.

Recently I almost got hit by an electrical scooter. Ambient noise was very
low, and I relied on sound alone until very late, and saw it at the very last
moment.

I've learned, and I won't rely on sound anymore. If cars occasionally don't
react to them, people will learn too, and completely stop jaywalking in
unclear situations. But it will also cost many lives before everyone has had
a chance to adapt.

------
mattigames
It's incredible how badly the self-driving system failed in this instance;
it's clear that any self-driving system (Uber's or any other) -including any
update/change to its software- should be audited by the government AND all
major tech universities (e.g. MIT) before a single one of those cars goes
onto the public streets.

------
confiscate
Wow, the pedestrian just walked onto the road without so much as a glance in
the direction of oncoming cars.

Sure, it's the car's responsibility to stop. But still--it's almost like the
pedestrian didn't care at all. She never saw the car throughout the entire
incident. She never saw it coming. She didn't care to.

~~~
itronitron
it's almost as if she has done this many times before and never been hit by
cars being driven by real people

~~~
confiscate
yes, that's why she should keep doing it and not expect anything to ever
happen! Who cares? Not her. She doesn't care.

------
danso
Anyone here have expertise on the current state of law governing autonomous
vehicles? I know AZ is supposed to be lax compared to California, so I wonder
if there is any situation in which Uber could be at fault (besides a civil
suit from victim's family)? Let's say that the car's LIDAR was indeed
dysfunctional, or obviously inferior compared to the industry standard. It
could be a hardware problem, or the software attached -- e.g. the Uber AV
classified the victim as a large paper bag. What legal framework is there to
assign fault to company/manufacturer in an accident like this, other than
design/planning that is obviously malicious or grossly negligent?

In other words, is there any situation where Uber (or Waymo, etc) could be
found legally at fault in a state like Arizona, no matter how badly their
vehicle performs in an accident?

~~~
linkregister
There’s nothing in the tort or criminal justice system that prevents
responsible parties from being held responsible for negligently causing the
death of a person in the realm of self-driving cars. If Arizona prosecutors
don’t have enough for a case now, they might later after the NTSB
investigation. No charges have been brought, so they could be brought up
later.

I would suspect that the backup driver would bear the criminal responsibility
for any collision the car got into. The general consensus among the other HN
commenters is that the driver could not have reacted better, though the LIDAR
should have detected the woman.

In that case, the woman’s family would have a strong wrongful death case
against Uber.

(I’m not a lawyer)

~~~
danso
What you wrote made me realize that regulations/laws about
equipment/performance/computational standards would probably be managed at the
federal level anyway (e.g. NHTSA). Though America is not necessarily the lead
in autonomous travel, right? How far along are nations like Singapore and
Germany in having legal frameworks for autonomous safety?

------
gophile
Why is the backup human driver staring at the dashboard most of the time?
Should he not be closely watching the road? It appears he lost a few valuable
seconds right before the collision because his attention was diverted to the
interior of the car rather than focused on the street.

------
devit
Driver not looking at the road (as is obvious in any automated vehicle).

I might be wrong, but it looks like either the headlights are not at their
maximum setting, they are broken, or the camera is underexposing the picture.

And complete software or hardware failure at detecting the obstacle.

Definitely not a combination that should be driving on public roads at night.

That said, the pedestrian is crossing outside of a crosswalk at night without
looking at the road, so they are at fault as well.

Note however that even a terrible driving system like this one is obviously
far more likely to hit an inattentive pedestrian than an attentive one (and
in fact more likely than a good driver, who tends to have accidents under
more exceptional circumstances than "neither party was looking at the road"),
so that doesn't excuse Uber's fault, and actually makes their position worse.

------
jbritton
I would hope self driving cars have better sensors. There are high dynamic
range cameras that simultaneously take multiple exposures and composite them
in real time. The dynamic range in this video is horrible. An HDR camera
should have made her visible much sooner. Further, why not use larger CMOS
sensors, something like those on a DSLR camera? Finally, the headlights appear
fairly ineffective too. I would hope these cars are using Xenon headlights or
something that does a much better job illuminating the road.

bi-xenon example:
[https://www.youtube.com/watch?v=QNjckOUEqWo](https://www.youtube.com/watch?v=QNjckOUEqWo)

HDR video at night example:
[https://www.youtube.com/watch?v=GyaSHGQc49s](https://www.youtube.com/watch?v=GyaSHGQc49s)
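As a toy illustration of the multi-exposure compositing idea mentioned above,
here is a minimal exposure-fusion sketch in NumPy. This is not how any
automotive HDR sensor actually works; the mid-gray target and the weighting
sigma are arbitrary assumptions, chosen only to show the principle of
favoring well-exposed pixels from each frame:

```python
import numpy as np

def fuse_exposures(frames, well_exposed=0.5, sigma=0.2):
    """Toy exposure fusion: weight each pixel by how close it is to mid-gray,
    then blend the frames. frames: list of float arrays in [0, 1]."""
    stack = np.stack(frames)                    # shape (n_frames, H, W)
    w = np.exp(-((stack - well_exposed) ** 2) / (2 * sigma ** 2))
    w /= w.sum(axis=0, keepdims=True) + 1e-12   # normalize weights per pixel
    return (w * stack).sum(axis=0)

# A short exposure crushes shadows; a long exposure clips highlights.
dark = np.array([[0.02, 0.45], [0.01, 0.50]])
bright = np.array([[0.40, 0.99], [0.35, 0.98]])
fused = fuse_exposures([dark, bright])
# The shadow pixel (0.02 vs 0.40) leans toward the readable bright exposure.
```

Real-time automotive HDR does this in silicon with simultaneous multi-gain
readout rather than sequential frames, but the fusion principle is similar.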

~~~
zamalek
Xenon lights should be required on all new car sales. My new car has them and
the difference is incredible. Not only can I see far more, but signage
_really_ pops. I've noticed that they are also much easier to look at when
approaching. It is an incredible safety feature.

------
SlowBro
Whatever the outcome of this case, autonomous cars are not going away. That
cat is already out of the bag.

As for this accident, I think it will be shown in time that while machines
can make mistakes, they make fewer mistakes than humans. Six years
ago I heard that CSX (a major rail line in the U.S.) was actively pursuing
autonomous trains. I thought the idea was horrible, until the programmer
working on the project informed me that machines guiding trains have been
proven to make fewer mistakes than humans guiding trains. It keeps the human
element of error out of the equation.

There is a certain amount of fear these events generate, but did you notice
that the same level of media coverage was not given for the dozens of similar
accidents that happened in the past 24 hours with non-autonomous vehicles?

~~~
hnaccy
> I think it will be shown in time that while machines can make mistakes,
> they make fewer mistakes than humans.

This is religious thinking.

>There is a certain amount of fear these events generate, but did you notice
that the same level of media coverage was not given for the dozens of similar
accidents that happened in the past 24 hours with non-autonomous vehicles?

This is a known unethical company testing dangerous technology on public
streets and killing someone. By the numbers Uber's self driving cars are far
worse than human drivers when it comes to safety, and the video shows their
autonomous system has serious issues. Why defend them?

If you're a fan of self-driving cars you should be slamming Uber, if anything
is going to hurt the technology it's shitty companies playing fast and loose
with their tech resulting in people's deaths.

~~~
SlowBro
> If you’re a fan of self-driving cars

I’m actually not. I’m agnostic about the technology, and I don’t care for Uber
either for or against, and am not defending them nor am I interested in
attacking them.

Just being realistic, looking at the big picture. Realistically speaking, this
tech will be improved in time.

~~~
hnaccy
>nor am I interested in attacking them.

Why not?

~~~
SlowBro
I have better things to do with my time. My comment wasn't aimed at any one
company. I think you misread my comments as fanboy cheering for autonomy, but
the whole point of my comment was that the technology cat is out of the bag.
That's just reality.

------
deft
To the Uber shills and other defenders of straight-up manslaughter by a
machine (breaking the first rule of robotics): I thought it was the woman's
fault and tooootaaallly not a failure on Uber's part. Hopefully this puts the
final nail in the coffin of a terrible business. Sad someone had to die.

------
blunte
That is one super dark patch, but lack of visible light shouldn't be a problem
for IR?

And whether the human driver/observer could have reacted fast enough to
prevent death is a question, but it would help if the human in the car had
been looking out the window more than looking down at her? phone.

------
yalogin
What are the legal implications for the safety driver in this one? We don't
have laws covering self-driving accidents, right? So if the authorities
determine that the "car" is at fault, would that mean the driver at the wheel
would be implicated?

------
bunderbunder
One thing that I haven't seen a whole lot of commenting on is how distracted
the safety driver appears to be in the interior shots. He looks bored, and his
attention seems to be directed everywhere _but_ the roadway.

This raises some serious concerns about the extent to which "safety drivers"
actually increase the safety of self-driving car tests. If that's how they
normally behave when a car's in self-driving mode, then it seems to me more
like they are creating a false sense of security that mostly just serves to
make the public feel better about the risk of half-baked self-driving
equipment being tested on the public roadways.

------
squarefoot
Something that has tickled my brain since yesterday: are their obstacle
detection systems active or passive? In other words, do they emit IR to
illuminate nearby obstacles, or do they rely exclusively on visible light
projectors on board and street lights? I ask because emitting IR would likely
interfere with or even blind speed cameras (which use IR themselves to take
night photos), so I expect this to be either forbidden or strictly regulated.
Which brings me to the question: if the car uses IR, and therefore detects
it, could a speed camera's IR beam, or any other IR device in the vicinity,
have interfered with that car's obstacle detection?

------
osrec
That is utterly terrible. The car should have reacted to the pedestrian
comfortably. I think something in the change in lighting confused the sensors
(the road appears to go from having street lights to not having them at the
point of contact).

~~~
brokenmachine
I would have thought that it should be irrelevant if the street lights are
there or not with LIDAR.

~~~
osrec
Not if the algo is making statistical inferences from a visual input as well.
I feel it may have clouded the "perception" of what was actually there.

~~~
brokenmachine
Then it's clearly not safe to be on the road at night.

------
gophile
Timeline:

00:03 - The feet of the pedestrian are visible

00:04 - The entire body of the pedestrian is visible

00:05 - Collision occurs

If we assume that the computer only had this video as input, then the computer
only had 2 seconds to avoid the collision. That would be unavoidable for the
computer. But the fact that there was no sign of slowing down or braking in
these 2 seconds is pretty alarming.

But it seems unlikely that this video was the only input to the computer. Did
the car have multiple cameras to "see" bright as well as dark objects at
night? I would imagine that a self-driving car driving at night would use
multiple cameras (tuned to various level of brightness) so that the car can
match human level vision?

~~~
teraflop
> the computer only had 2 seconds to avoid the collision. That would be
> unavoidable for the computer.

At the time of the accident, the car was traveling about 40mph. A Volvo XC90
is capable of braking from 60mph to zero in about 2 seconds.

[https://www.autoevolution.com/news/2016-range-rover-sport-
fa...](https://www.autoevolution.com/news/2016-range-rover-sport-fails-
braking-test-falls-way-behind-volvo-xc90-audi-q7-video-99010.html)
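A rough sanity check of this exchange (assuming roughly 0.9 g of peak
braking, a made-up but plausible figure for a modern SUV on dry asphalt, and
ignoring reaction time, since a computer's should be near-zero):

```python
# Back-of-envelope stopping time/distance from a given speed under constant
# deceleration. decel_g = 0.9 is an assumed peak braking figure, not a
# measured value for the XC90.
G = 9.81            # m/s^2
MPH_TO_MS = 0.44704

def stopping(speed_mph: float, decel_g: float = 0.9):
    v = speed_mph * MPH_TO_MS
    a = decel_g * G
    return v / a, v * v / (2 * a)   # (time to stop in s, distance in m)

t, d = stopping(40)
print(round(t, 1), round(d, 1))  # roughly 2.0 s and 18.1 m (~59 ft)
```

Under these assumptions the car could indeed have shed nearly all of its
speed within the ~2 seconds the pedestrian was visible in the released video,
which is why the total absence of braking is the striking part.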

------
kakarot
Here is a YouTube URL for anyone like me who cannot seem to make Twitter
videos work with adblockers.

[https://www.youtube.com/watch?v=Cuo8eq9C3Ec](https://www.youtube.com/watch?v=Cuo8eq9C3Ec)

------
dumbfounder
Questions to be answered about a specific accident (not about whether we
should have autonomous cars at all):

Who is legally at fault?

Would an average human driver have avoided/lessened this accident?

Should the car have reasonably performed better?

------
urmish
Is the video slowed down? Also, they need to release a better-quality
version. If I was behind that wheel, I don't think I would've seen that lady,
to be quite honest. The place where the lady was crossing the road was in the
shadow of nearby trees or buildings. If one is going to cross a road like
this, they should at least use a well-lit part of the road. Also, why did she
have her back turned to the traffic if she was jaywalking on such a big road?
Very poor decisions on her part. I am willing to bet Uber gets off scot-free
on this one, and rightly so.

~~~
bigiain
1) I wouldn't be surprised if that is the "best quality" they have.

2) I also suspect that human eyeballs would have a different view of the
light/dark portions of what's depicted there, and especially eyeballs would
have probably had a much higher chance of detecting movement in peripheral
vision than that video gives any hint of.

~~~
makomk
I expect the original was probably much better quality. Twitter's video
compression is terrible.

------
aribeiro
The video is very damning. Either a problem of camera contrast or a big bug.
And having a backup human driver is useless.

There is a terrible ethical dilemma here. Humans are currently at a death
rate of about 1 per 10^8 (hundred million) miles travelled. Waymo, which is
best in class, is at 1 human engagement every 5,000 miles per the California
data. Of course, most of those were likely minor. But there are more than 4
orders of magnitude there to allow for plenty of surplus dead people. These
things shouldn't be on the streets!

------
aurizon
It looks like Uber is hiring irresponsible homeless people to save $$. It is
hard to say if I would have seen her. I have been driving accident-free for
63 years; however, when I drive, my eyes are on the road, I do not doze off
or look at accidents I pass, so I would have slowed dramatically, and
possibly stopped. Some frame-by-frame analysis versus time/velocity etc.
should answer some questions. LIDAR does not depend on visible-spectrum
light, and should have seen the woman, although the frame is mostly open
space?

------
mannykannot
The interior shot seems to support the idea that it is not feasible for semi-
autonomous systems to rely on an alert human monitor to handle the difficult
bits. Passive alertness is not a human trait.

------
rfolstad
What's the point of a safety driver whose eyes are not on the road? It's
pretty easy to monitor whether the safety driver is paying attention or
looking down every second like this safety driver was.

------
throw7
1\. What a clear failure for self-driving/collision avoidance technology,
especially in conditions (night time) that we've been told are much better for
machines to "see" in.

2\. "safety driver" useless if he's distracted. what's the point if he's not
actively engaged?

3\. Where does this video come from? Is this raw video from the cheapest
Chinese dashcam Uber found on the shelf and slapped into the system so they
can get ahead of the story and plausibly deny in certain situations?

------
kuon
I'm surprised the footage looks so bad, I wonder if it was altered to look
darker to help lessen Uber responsibility.

Is there a technical way to check if some video filters were applied?

------
nimbius
Oof. As a motorcycle rider, looking at the video is hard. I'm always vigilant
but sadly, I'd probably have plowed right through the woman. The street
lights seem to do absolutely nothing.

People are unpredictable...however this kind of crossing feels familiar. In
Los Angeles we have transients that are known to wander through 5 lanes of
empty street at 5 AM with impunity. I've been hard-stopped by wagon trails of
shopping carts laden with old rags and garbage, being pulled by an elderly
sisyphean figure cloaked in old comforters and tarpaulins. I've also had the
unfortunate luck to see a garbage truck plow through a stolen utility cart
from Home Depot, conveniently parked by a homeless woman in the center of an
off ramp as she was engaged in a furious battle with unseen forces under an
adjacent bridge.

Transients are also struck and killed on our highways semi-annually, not to
mention the occasional car-through-a-homeless-camp:
[https://www.cbsnews.com/news/los-angeles-homeless-camp-
hit-c...](https://www.cbsnews.com/news/los-angeles-homeless-camp-hit-car-
jumped-california-freeway/)

My point is, it's easy to sensationalize a computer's miscalculation. Humans
in control may have also made this mistake.

~~~
brokenmachine
And yet you didn't make that mistake when you were in a very similar
situation.

I don't think I would have hit that lady pushing the bike. If I saw her slowly
meandering into the left lane like that, I would have slowed down. The car
just plowed ahead at full speed. In fact, it plowed ahead, at night, at 3mph
_over_ the speed limit.

I've seen my own dashcam videos that look very similar to that, and in reality
things are much more visible than what the dashcam shows.

------
rmu09
State of the art emergency assist systems in normal driver-operated cars would
have seen this woman/bike and started to brake the vehicle. Even without any
assist systems, it should be possible for a human driver to see this
bike/woman. It was dark, but otherwise conditions were excellent.

I would suppose that the sensory capabilities of a driverless vehicle are
above those of normal cars. There is clearly something very wrong with Uber's
technology.

~~~
eyeing_see
And since human vision is better than a dashcam's, this car's emergency-
assist system, the safety driver, could probably have caught this as well, if
they had paid attention to the road.

That is something that production-grade technology can ensure with head and
eye tracking. Some non-autonomous cars have this feature to stop people from
dozing off, and good semi-autonomous systems, like GM's Super Cruise, will
refuse to enter or stay in self-driving mode unless the driver is paying
attention.

------
kochb
We're still in the early days of this, but given this video it appears that
there was a hazard in the road that should have been detectable by a road-
safe system. In this case, the system failed to recognize a person, and
they're dead.

I'm not a legal expert, but it seems plausible that this isn't going to be
just a sad tragedy with no one at fault - in the near future we could see
actions such as negligent homicide charges.

------
neom
For reference, here is a marketing video of Ford self driving at night:
[https://www.washingtonpost.com/video/business/technology/pro...](https://www.washingtonpost.com/video/business/technology/project-
nightonomy-ford-tests-autonomous-car-in-pitch-
black/2016/04/11/ba00def0-ffe8-11e5-8bb1-f124a43f84dc_video.html?utm_term=.7a524de367b1)

------
lmilcin
You can see the human driver take her eyes off the phone and immediately
recognize the situation.

Yet I see no hard deceleration. That would have been immediately visible, as
it would jerk the driver forward from her seated position, yet you can't see
this happening.

I understand that once the pedestrian came into the light cone, the computer
should have reacted almost instantly to a very obvious threat situation, yet I
see nothing.

------
mnm1
2+ seconds from the time you first see the person on this really crappy video
is enough time for a human to react and a car to stop from 35mph. I don't
blame the driver. Trying to stay alert for hours just for an incident like
this is impossible and futile. I do blame Uber's shitty tech. Even this low-
res camera sees the person in time and yet it never even applies the brakes
from what I can see.

This should have been avoided.

------
thesumofall
I really hope this will be looked at professionally and without bias. The
initial statement from the police sounded anything but. If they looked at the same
evidence as we do now, it would indicate a severe lack of understanding of the
difference between camera pictures and human vision. Plus, who spread the
rumor on the very first day that the victim was a homeless person? What does
that have to do with anything?

------
w_t_payne
The automotive cameras that I have worked on seemed to have a better low-light
performance than this video suggests.

Some of the more recent OmniVision focal plane arrays (imagers) are really
very good in low light conditions (for the price).

Of course, it is difficult to tell without a side-by-side comparison, but I
would have thought that the equipment could/should have been better than this.

Who was the supplier of this camera?

------
Balgair
A lot of commenters are talking about the differences between the eye and a
sensor. Here is a brief overview of those differences with references at the
bottom: [https://medium.com/photography-secrets/whats-the-
difference-...](https://medium.com/photography-secrets/whats-the-difference-
between-a-camera-and-a-human-eye-a006a795b09f)

~~~
linkregister
I didn’t see anything about light adjustment time in that article. Can the
human eye adjust between high and low light conditions faster than a camera?

~~~
Balgair
The article talks about that

------
aviv
The quality of the video is completely irrelevant. The car should have
stopped. It's as simple as that. This is night self-driving mode 101.

------
bensonn
I realize the self-driving/lidar/AI/IR is more interesting to the HN audience
but what about the pedestrian? How can the pedestrian not see, anticipate and
react to a 4000 pound SUV, on the road (where cars normally are), with
headlights (I assume they were on), driving right towards her?

Did the pedestrian not look both ways? The pedestrian could have easily gone
to the crosswalk, waited a few seconds for the car to pass and then jay-
walked, at least stopped in the left lane when they realized the car wasn't
slowing down, hurried to get across, or something.

Based on the video it looks like the pedestrian was just as clueless as the
"driver", only looking up in surprise at the last second and making no move to
avoid the collision. Maybe it sounds cruel blaming the person that died but
they certainly played the largest part in their death.

I am making no excuse for Uber, I know nothing about self driving cars or the
tech behind them. But come on- how can a 150 pounds person walk into the
middle of a road without making any effort to check for their own safety? It
seems similar to stories of people wearing headphones and walking down the
middle of a train track.

As a 200 pound ball of squishy flesh and my life on the line, I feel it is my
job to be very aware, very cautious, very careful when entering the domain of
2 ton speeding metal objects, especially in the dark.

------
jonknee
With all the logged sensor data, this should be the best-documented fatal
collision in history; it can be replayed again and again with different
software to see whether it was avoidable with the current hardware
configuration.

Obviously a sad situation, but it's reassuring that the knowledge from one
crash can lead to all other autonomous vehicles learning to avoid it in the
future.

------
kyle-rb
I'd be really interested to know if there have been any 'close calls' before
now, where a similar malfunction occurred, but the human driver caught it in
time and manually hit the brakes. Not that I blame this driver; I just think
that any incident where the human driver had to take control to avoid an
accident is really just as bad as this.

------
kodis
I sure couldn't have reacted quickly enough to have prevented this collision.
I'd expect the car to have better reflexes than I do, but given the little
time between when the pedestrian first showed up on camera and when the
collision occurred, I doubt that there was anything the car could have done,
regardless of the sophistication of its software.

~~~
stefan_
An autonomous vehicle obviously shouldn't exceed a speed at which its
stopping distance exceeds its vision range, since next time it might as well
be a tree trunk. That's sometimes difficult for humans to gauge, but then we
want autonomous cars precisely so they can make these calculations (and they
are much better equipped to).
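The "never outdrive your sensing range" rule above can be sketched
numerically: pick an assumed reaction time and braking deceleration (both
hypothetical figures here), and solve for the largest speed whose
reaction-plus-braking distance still fits inside the sight distance:

```python
import math

def max_safe_speed(sight_m: float, t_react: float = 0.5,
                   decel: float = 7.0) -> float:
    """Largest v (m/s) satisfying v*t_react + v^2/(2*decel) <= sight_m,
    i.e. the positive root of the quadratic in v."""
    return decel * (-t_react + math.sqrt(t_react**2 + 2 * sight_m / decel))

for d in (20, 40, 80):
    v = max_safe_speed(d)
    print(d, round(v * 3.6, 1))  # sight distance (m) -> max speed (km/h)
```

With these assumed parameters, halving the sensing range cuts the safe speed
by much less than half (braking distance grows with the square of speed),
which is exactly the calculation a machine can do continuously and a human
cannot.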

~~~
ashelmire
That’s a good point. But in this situation I think we’d probably find that the
driver wasn’t at fault.

------
vinniejames
I'm more than a little disappointed to see that the victim walked all the way
across the left-side lane, unnoticed, before being struck, while the monitor
appeared to be more interested in her phone, or whatever she was continually
looking down at, instead of keeping an eye on the road and vehicle
performance, which I imagine is what she was being paid to do.

------
Myrmornis
A human driver would have braked and swerved left into the other lane; there
was obviously no oncoming traffic.

A human driver would have seen the woman much better than this camera footage
makes out.

The Uber attendant was sadly not paying attention, which is going to look very
bad for the self-driving car industry, even if it is very hard to ask humans
to pay attention and do nothing.

------
jonthepirate
Does anybody really think it's a good idea for Lyft, Uber, Tesla, etc. to
develop safety systems for the open roads separately, in isolation? Who
benefits from that?!

I'll bet a Tesla would not have run this person over. This is where the
Government should step in and force all of these AV manufacturers to implement
a common safety system / protocol.

------
differentView
As expected, when there's a death, there will be overreaction. I'm surprised
the Hacker News crowd is part of it.

------
sdzin
Ethical question for HN...

Certainly, some team of engineers is tasked with analyzing the cause of this
failure and providing potential solutions for future releases. What is the
best way to do this without trivializing the fact that a human life was lost?
Does anyone have experience with analyzing the aftermath of some event which
caused death?

------
StreamBright
It would be really good to have the IR camera recording as well as the
internal state of the model that drives the car.

------
quilombodigital
I don't know exactly how autonomous cars work, but is it possible there was a
warning (beep) before the system tried to stop the car, she got confused, hit
the accelerator, disabling the system, and hit the pedestrian by accident?
How do autonomous cars decide what to do when there is an obvious obstacle
and the driver hits the accelerator?

------
sergiotapia
Anyone have a mirror? I can't play videos on twitter for some reason.

Mirror: [https://streamable.com/vllgl](https://streamable.com/vllgl)

She wasn't wearing anything visible on a pitch-black street at night. It seems
like there were about 7 meters before her shoes were even visible.

------
isuckatcoding
Relevant from a few weeks back:[https://www.theglobeandmail.com/globe-
drive/culture/technolo...](https://www.theglobeandmail.com/globe-
drive/culture/technology/the-ethical-dilemmas-of-self-
drivingcars/article37803470/)

------
Rapzid
This thread is a case study on the Dunning–Kruger effect. I hope a lot of self
reflection occurs once the actual studies and, you know, facts get around
about human vision, peripheral perception, low light peripheral perception,
reaction times, pedestrian fatality statistics, this accident...

------
Tempest1981
I wonder how the Uber engineering team is feeling. I've had bugs in my code,
but they've never killed anyone.

I guess the feeling may depend on how rushed vs. well-tested they feel their
software is. But no amount of "I told you it wasn't ready" is going to repair
the psychological damage.

------
RandomCSGeek
The real problem here is not that the tech is imperfect, I'd actually expect
it to be imperfect given it's not been even a decade for this tech. The real
problem is that this imperfect tech is on public roads. How do they even get
the permission to test drive on public roads?

------
arthurofbabylon
I remain curious about the set of detection mechanisms in place - if this
video is the only one, I’d say Uber is culpable (of course it’s not the only
one), and if it’s not the only detection mechanism, then where’s the data from
the others?

Another note: I haven’t heard criticism about the driver’s inattention...

~~~
perennate
I think the inattention is just expected now. If you have to do absolutely
nothing 99.99% of the time, then it just doesn't seem possible to constantly
focus on the road. At least if you are driving down an empty highway you still
have to hold the wheel and press the accelerator. For this reason, I think
level 0 systems ("may momentarily intervene but has no sustained vehicle
control") hold much more promise for the near future (next decade or two), and
I wish there was more focus on getting these systems widely deployed.

------
dmicah
Did the vehicle have high beams enabled?

~~~
freehunter
I certainly wouldn't have high beams on here. It's a lit road and it seems to
be in the city, so you'd likely be encountering other cars often enough that
you'd spend more time turning the high beams on and off than you would
controlling the speed and direction of your vehicle.

------
amorphid
What happens when the human grabs the wheel? Is it possible the car
could/would have stopped, or reduced the danger to the pedestrian, but the
human actually overrode any emergency corrective measure the car would have
taken? Did the human prevent the car from working as intended?

------
olefoo
If the autonomobile was speeding... it is very definitely at fault.

And given that it's Uber we can expect that there's an email chain where a
programmer is told to ignore the map layer with speed limits on it.

I hope the prosecutor (best case) or plaintiff's lawyer (worst case) finds it
and makes Uber suffer.

------
xvf22
It's very disingenuous to release this video. This is not the picture the car
saw nor the human. That coupled with the fact that no radar or lidar data was
released makes me wonder why the police seem to be bending over backwards to
absolve Uber.

~~~
cozzyd
Police are nearly always primarily motorists and therefore biased to the
motorist perspective.

~~~
abraae
Police are also nearly always pedestrians, so what?

~~~
galdosdi
Baloney. In most USA cities the vast majority of police officers on patrol are
in radio motor patrol cars, not footposts. Are you even from the USA? Or is
there a city that is an exception?

Or are you repeating the trite idea that all motorists are also pedestrians
because they have to walk a few feet in a parking lot or driveway at the start
and end of their trip? That is ridiculous if so -- travelling a few feet in a
parking lot, a place designed to be safe for pedestrians, is very different
from having no choice but to walk miles of dangerous high speed roads like the
one in question today because it's the only way to get home

------
foxh0und
Watching someone die, and then watching someone else watch the same person
die, is really harrowing.

------
tdesilva
It'd be nice if some telemetry came out, when the car started to brake,
steering inputs, etc.

~~~
brokenmachine
Even the police already reported that the car did not brake.

------
oxymoran
Yea that video pretty much sums it up. The car is definitely the proximate
cause. If I were handling the claim I would go 90% liability to the car for
driver inattention and careless negligence and 10% to the pedestrian for
jaywalking.

------
blhack
I have to wonder if HID headlights contributed to this problem as well. The
headlights have such a sharp, defined "edge". The woman was not visible until
she was within the bright portion of the lights (which isn't very big).

------
coding123
This may be unfair, but LISTEN UP REGULATORS: it's time to have standardized
tests for SDVs before they are allowed to be considered road ready. And I hate
to say this, but WAYMO is the company to design the test suite.

------
kristianov
I don't think Uber with their "Hustlin'" culture should be allowed to run
field tests that may endanger human lives. We simply cannot "move fast and
break things" when it comes to self-driving vehicles.

------
LeicaLatte
What about the logs? Do companies like AppDynamics, Datadog support self-
driving cars yet? Any companies out there providing a platform to do this? So
important for debugging, auditing, compliance, legal, etc.

------
gerbilly
I'm waiting for the 'grassroots' campaign to get every human and dog to wear
locator beacons to make them easier to spot by self driving cars.

Maybe Cambridge Analytica is already working up a facebook campaign for it.

------
nraynaud
I passed there tonight, the place is incredibly well lit and the person was
probably visible from before the underpass, and probably visible the entire
time she was on the road because of the geometry of the curve.

------
mactitan
Elaine Herzberg probably casually jaywalked here before, but human drivers
slowed. This time she was surprised by a robot killer that violated Asimov's
first law. Safe AVs are coming, but why so impatient?

------
juanbyrge
Wow, this is inexcusable. I hope Uber is found liable for this death. Someone
walking across the road at night? That is literally the first test case I
would add if I were building a self-driving car AI.

------
csours
If anyone is wondering how to avoid accidents like these while YOU are driving
- look for shadows. You may not see someone wearing dark clothes, but you can
see the shadow they project change as you move.

------
foobarbazetc
Hmmmmm... looks like something went wrong with object detection in the dark.

------
emodendroket
It's completely predictable that the human driver would stop paying attention,
which is what makes these "semi-autonomous" systems worse than systems that
aren't at all autonomous.

------
triggercut
There are a lot of questions around the vision side of this, but I have a
slightly different question, the answer to which probably points to a greater
systemic issue in the overall solution.

Why is a car, the speed of which is regulated by the autonomous driving
system, breaking the legal speed limit?

Even accounting for errors/accuracy of odometers/speed cameras, this car was
traveling at a speed in excess of the legal speed limit for that section of
road, increasing both all the underlying risks of operating a 1-2t motor
vehicle at 60+km/hr, and the material impacts from those risks.

In a relatively new technology, why is the decision to trade safety for speed
being made?

In any technology where safety (and ultimately human life) is concerned, why
are we testing in production?

------
jryan49
This almost leads me to believe the LIDAR part doesn't even work...

------
cjensen
Driver was obviously on cell phone.

Since this is still in development, Uber should have assumed that the self
driving software _would_ fail unexpectedly.

If a human sees something work 1000 times, they will expect it to work every
time. Uber should have anticipated overconfidence in the software. They should
have coached the driver on this. They should have reviewed footage of safety
drivers to ensure that safety drivers are complying with instruction.

This was not an accident. This was carelessness. I blame Uber. Uber's "just
get it done" cultural attitude is incompatible with developing safe self-
driving software. There's a reason that Google, with 10X the number of miles
on the road, has not killed anyone.

------
Gustomaximus
Is it me or does it seem the safety driver is on their mobile phone?

------
kazinator
Obviously a "hello, world" failure of self-driving.

I wouldn't blame a human for not seeing that person, but better should be
expected of the tech.

However:

* there doesn't appear to be a single reflective device on the bike. For instance, the usual spoke-mounted reflectors that are stock equipment on even low-end bikes do not seem to be there.

* the woman seems completely oblivious to the car's approach. She doesn't react at all but keeps casually walking with the bike right until the moment of impact. She must not have been looking in the direction of traffic at all and is mowed down completely by surprise, like someone sucker-punched in a bar. (Was this someone with disabilities? Visual or hearing impairment? Developmental?)

* I think here is the road where this took place: [https://www.google.co.in/maps/@33.4351488,-111.9415554,3a,60...](https://www.google.co.in/maps/@33.4351488,-111.9415554,3a,60y,323.19h,90t/data=!3m9!1e1!3m7!1sOJwMRRXHT2K5i9cHVvrmkA!2e0!7i13312!8i6656!9m2!1b1!2i38) The scene of the accident is a little bit forward of here. Utterly not a place to be crossing at night in a way that is completely oblivious to the presence of vehicles and far away from the intersection. Note that this is a one-way double lane; there is only one direction in which to look out for cars.

This kind of badly behaved, suicidal pedestrian is a challenge to drivers even
in daylight. However, you would expect precisely this sort of situation to be
among the highest priority test cases for self-driving tech.

BTW: here is a shot of a sign forbidding pedestrians from crossing at almost
that exact spot:
[https://www.google.co.in/maps/@33.4362927,-111.9424451,3a,15...](https://www.google.co.in/maps/@33.4362927,-111.9424451,3a,15y,181.71h,83.76t/data=!3m9!1e1!3m7!1s4fS1O8O3KI3F7Tw5Mip-
ww!2e0!7i13312!8i6656!9m2!1b1!2i19) She wouldn't have seen that one because
she probably crossed the other lane already; she's coming from the median. How
did this person live to 49?

~~~
jes5199
Jesus Christ, show some decorum.

------
foobaw
Would love to see the log files of the car to see why and how this occurred.
I'm pretty sure it's known internally - but is Uber obligated to release them
for the public?

------
brudgers
The bicycle is visible in the video during 0:01/0:22. The glint of the
reflector appears when there are two full white divider lines between the
camera and the victim.

------
tuxguy
Did anybody else think that the human operator in the driver's seat should
always be looking ahead, rather than being distracted by looking down or
below?

------
_pmf_
Volvo's pedestrian detection works exactly for these cases, and it's exactly
these cases that have been hailed as cases where AI will be better.

------
GoToRO
It looks to me that the car's lights are too low. Probably if you are not 100%
driving you don't feel like you need to adjust the lights.

------
codedokode
Why did the pedestrian walk into the traffic? She should have seen the car.
Probably she got used that the cars stop to let her cross the street.

------
lolsal
I knew the woman was coming up in the video and I didn't see her until it
would have been too late.

The driver definitely seemed distracted by a device of some kind.

~~~
foobarbazetc
That’s only because there’s no light on her until the end.

The car hardware should have seen her. If it doesn’t work in the dark it
shouldn’t be on the road imho.

~~~
lolsal
> That’s only because there’s no light on her until the end.

Yes, quite true.

> The car hardware should have seen her. If it doesn’t work in the dark it
> shouldn’t be on the road imho.

I tend to agree. I think unfortunately this video indicates that the car did
not perform _worse_ than a human. It's still valuable to have cars that
perform _better_ than humans, even if they aren't perfect.

~~~
foobarbazetc
What do you think based on this video:

[https://twitter.com/brianawhitney/status/976591851384745984?...](https://twitter.com/brianawhitney/status/976591851384745984?s=21)

The woman with the bike wasn’t moving fast at all. I dunno...

~~~
nemothekid
Even in that over-exposed video, at 35mph, I think you would have to be
extremely lucky to not hit her. From her coming into view to collision is
~700ms. Subtract average human reaction time (250ms), and that leaves a little
less than half a second to swerve.

A human would have made this mistake, but I don't know enough about LIDAR to
understand if a computer should have made this mistake. Is night vision poor
on these machines?
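Those figures can be sanity-checked with quick arithmetic. A minimal sketch, assuming the ~700 ms window and 250 ms reaction time above and the reported 35 mph:

```python
# Back-of-the-envelope: distance covered during the visibility window
# at the reported speed, and how much time is left after reacting.
MPH_TO_MS = 0.44704           # metres per second per mph

speed = 35 * MPH_TO_MS        # ~15.6 m/s
visible_window = 0.700        # s, pedestrian coming into view to impact
reaction_time = 0.250         # s, average human reaction time

distance_total = speed * visible_window        # metres covered in the window
time_to_act = visible_window - reaction_time   # seconds left after reacting

print(f"car covers {distance_total:.1f} m; {time_to_act * 1000:.0f} ms left to act")
```

About 11 m of travel with under half a second left to steer or brake, which matches the "extremely lucky" framing.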

~~~
JustSomeNobody
Then slow down. If you can't stop or swerve when something becomes visible to
you then you're driving too fast regardless of the posted speed limit. Period.

------
MarkMc
Just curious: How many times has an Uber car hit a pedestrian or cyclist and
NOT killed him/her? Can we see footage from those accidents?

------
johnnyb9
Setting aside the fault of the vehicle in this case, why didn't she get out of
the way? Would she not have seen the car coming at 38 MPH?

~~~
theclaw
The report said she was homeless, it's possible she was ill.

------
EamonnMR
That look of horror on the driver's face... I can't say a human driver would
have managed to stop or swerve any quicker either.

~~~
emodendroket
Uh, really? You don't think a human driver could have taken _any_ corrective
action if they were paying attention to the road? The risk I see with these
semi-autonomous systems is precisely that they invite the sort of inattention
seen here. The brakes, at least, should have been applied.

------
kozikow
The way I see it is that this fatality was a calculated risk for Uber. Uber
realizes how much Waymo is ahead of them and decides to drop too many safety
restraints in development of the self-driving technology.

The fatality will likely not cause big direct losses to Uber. Any safety
regulations that result are going to hit Waymo just as much as Uber, slowing
Waymo down and giving Uber time to catch up. Any financial fine would be tiny
compared to the total money invested in self-driving technology.

------
edgartaor
Well, obviously something did not work properly. Is there log data about this
incident? Can the problem be "debugged"?

------
anotheryou
Is that the crappy video the car uses, too?

Looks so low-res and like they added contrast until you really can't see the
person any more.

------
baxtr
I drive an VW with a pedestrian emergency brake system that is made for these
situations. I wonder if it would have worked

------
tyng
There is I think a broader legal/philosophical question here:

Are we to judge the car's ability to avoid the pedestrian on human standards
or AI/machine standards?

From the video, it looks like this could be a situation where a human driver
could not avoid the accident (due to poor light, not enough reaction time,
etc.) but a machine should be able to avoid it, given its multiple cameras, IR
sight, and much faster processing speed.

Greater power comes with greater responsibility?

------
ysleepy
Why is the resolution so low?

Why is the FPS so low?

Why don't we see the IR/LIDAR footage?

Was the brightness adjusted after the fact to look favourably?

Why didn't the vehicle issue a full stop once the woman was clearly visible?

I feel PR handled.

~~~
cesarb
My guess would be that this is an off-the-shelf dashcam, completely
independent from the self-driving cameras and sensors. Being an off-the-shelf
dashcam means extracting and publishing the video from it is easy, which is
why it came out first. It also explains why it has video from inside the car;
many off-the-shelf dashcam models have both a forward-facing and a backwards-
facing camera.

Personally, I'm more annoyed at the blurring of the telemetry data at the
bottom of the video. What was blurred probably included the dashcam model,
date/time, GPS coordinates, calculated speed, and so on.

The output from the other sensors might take more work to extract and convert
into an understandable form, if they are even available after the fact; they
might be immediately used in the control loop and then discarded, instead of
being stored.

~~~
djsumdog
This makes the most sense. It's probably a redundant system: off the shelf
dash cam not connected to anything but power. The police could easily extract
the sdcard.

The actual car data will need to be collected and released by the NTSB and
will probably come in a few weeks.

------
UncleEntity
Makes me wonder: if the lady had been riding her bike in the lane, would she
still have gotten plastered by the robocar?

------
Tomminn
How do we know that this footage hasn't been artificially darkened by Uber
before they gave it to police?

~~~
Tobba_
It essentially is. Even with that crappy autoexposure, she would probably be
visible if the resolution and framerate wasn't terrible, and the compression
hadn't taken out whatever detail remained.

------
jayess
I feel sorry for the driver. Now his face is plastered all over the internet
forever simply for doing his job.

------
cure
Why was the car speeding? Doing 38 mph in a 35 mph zone. Are these things
programmed to systematically speed?

~~~
Aspyre
It was a 45 mph zone. The police statement was wrong.

------
cek
"There is no such thing as an accident. Only negligence."

This was no accident. The jaywalker was negligent for crossing the street as
she did.

Uber is negligent for not having autonomous vehicle standards that clearly
exceed what humans can do (I don't think a human could have avoided her based
on the video).

Not sure about the driver. Depends on his training/job etc...

~~~
taneq
The 'safety driver' wasn't even looking out the windscreen. If your job is to
act as a backup for an untrusted system, you are responsible.

------
fudged71
I do wonder if perhaps Uber made the video darker before releasing. Or used a
camera that is not the main sensing camera. Because I would hope the cars are
smart enough to increase ISO to get more detail at night time.

~~~
sitkack
1) It does NOT show what the LIDAR sees.

2) They should be using a dark adapted / night vision camera and probably are,
the video feed presented is just that, the best one that "presents".

3) They should be using thermal infrared to spot living things

4) They were severely over driving their headlights if what we are seeing is
"reality" as seen by the onboard computer.

I want to know how much access to the hardware Uber had after the accident,
including physical and remote. I also want to see the streaming logs and full
provenance.

~~~
TylerE
It's Arizona. I just checked: the high today was 92F, and the pavement would
almost certainly be warmer than that. How's a thermal camera supposed to
detect 98F against a background that is probably +/- 5 degrees of that?

~~~
dragonwriter
> The high today was 92F.

Probably not relevant to conditions at 10pm on Monday, when it was between 70F
and 57F (the range for 6pm Monday to 12am Tuesday.)

> Pavement would almost certainly be warmer than that.

Sure, when it was 92F the pavement would probably be 140+F.

> How's a thermal camera supposed to detect 98F on a background that is
> probably +/- 5 degrees of that?

At night, the pavement would be much cooler than a human; at the high
temperature you report, it would probably be much hotter. There's a range in
between where the problem you describe would occur, sure, but that's neither
at the high nor, more to the point, in the conditions when the accident
occurred.

~~~
TylerE
So, if the asphalt is 140F when it's 92F, what's it when it's 65? About 100?

~~~
xxpor
The reason it gets so hot is heat radiated from the sun. It'd be much closer
to air temp at 10 pm, well after sunset.

------
dfee
What no one seems to be talking about is that a person was walking their bike
across a divided 4 lane road that in many places would be called a highway.
This is not a pedestrian friendly crosswalk, and it’s not high noon.

You can tell by the tail lights of the car in front (first second of video)
that a pedestrian would be able to see a car coming.

Which begs the question - why did the human step out in front of the car? Is
there culpability there, too? If a person intentionally puts themselves into a
deadly situation, how should AI handle this?

We’re all looking at the cars, but let’s keep in mind that crossing a dark
divided highway in front of a car you can see coming is a really really bad
idea.

~~~
horsecaptin
You make an interesting point regarding responsibility. Say that person were:

- Mentally deficient.

- A child suddenly running, because that's what children do.

- A blind person making a mistake.

There are countries where if a pedestrian is crossing the road anywhere that
isn't a designated crossing zone, then they are responsible for their fate.

In Canada and the USA, I believe that pedestrians "always" have right of way
and drivers are supposed to be as vigilant as possible.

Of course, some pedestrians take this to heart and suicidally jump onto a
road. In the US, this is more prevalent in walkable cities and college towns.

~~~
mdavidn
Pedestrians do not always have the right of way.

However, drivers must avoid collisions with pedestrians, even when the vehicle
has right of way.

Here's the relevant vehicle code in Arizona:
[https://www.azleg.gov/ars/28/00793.htm](https://www.azleg.gov/ars/28/00793.htm)
[https://www.azleg.gov/ars/28/00794.htm](https://www.azleg.gov/ars/28/00794.htm)

------
csomar
Is it me, or is the street lighting beyond mediocre? It is weak and alternates
between a lit portion and a literally black portion.

But even then, the driver was busy with his cellphone. The accident might have
happened anyway due to the bad lighting, but it could have been avoided had
the driver been alert.

~~~
lobster_johnson
Coming from Europe, I've been surprised at how many places in the US don't
have streetlights at all. Driving between Maryland and Virginia, there are
4-lane highways that are _completely_ dark for long stretches at a time. I'm
assuming this is a money-saving concern, since the US has such a huge amount
of area to cover.

Back in my native country, which is much smaller, I've only rarely encountered
this kind of darkness, and it's usually on small, isolated country roads. When
I got my license, my instructor's company had a special spot, an hour outside
the city, where they always went for dark-driving lessons, because almost all
of the road network is (exceedingly well) illuminated.

~~~
web007
I prefer the situation of no lights other than headlights.

I'm from Maryland, and my neighborhood is off a country road with a single
streetlight at the head of the neighborhood. If the weather is doing anything
(fog, rain, snow) even a little bit then it's completely awful to drive
through that intersection.

If the light didn't exist, I could keep my vision adaptation the same before /
during / after, and could compensate for inclement weather the whole time. As-
is, I go from very-dark to bright-light and back to very-dark over a hundred
feet or so, and it makes it very hard to adapt safely.

------
juliand
Is it me or the car seems to steer slightly to the right, just before the
impact ?

------
sytelus
I hereby name this the "jack-in-the-box" test case for self-driving cars. No
self-driving car should be allowed on the road unless it can detect and avoid
a collision with a person literally popping up out of the ground 20 ft away
while the car is driving at 40 mph in total darkness.

------
JustSomeNobody
Never outrun your headlights should be a rule for self driving vehicles as
well.

------
rubicon33
Wow... This is REALLY bad for self-driving cars. HOW did it not catch this?
I'm a huge proponent of self-driving cars, but I'm shocked that the car
literally did NOTHING to avoid her. I know for a fact I would have seen her
and potentially swerved (or at least slammed on the brakes).

------
basicsbeauty
Alternate systems perform way better. Granted, this is broad daylight.

------
Jnr
Seems like Uber is using pretty bad radars, software and drivers.

~~~
oldgradstudent
And management.

------
_RPM
So, Uber self driving cars also have humans operating them?

------
newnewpdro
Driver wasn't even paying attention to the road.

------
ashelmire
She crossed at a spot that was pitch black. Sneakers come into view maybe 40
feet away. Clearly visible at 20. Is that enough time to stop at that speed?
Not sure I would have stopped in time.

~~~
jonathanyc
You don’t need to stop. Slowing from 40 to 30mph decreases the fatality rate
by an order of magnitude. Maybe the car shouldn’t have been going above the
speed limit if the conditions were so poor that it couldn’t plausibly see
humans walking right in front of it.
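The impact-speed point can be illustrated with the standard constant-deceleration model v² = v0² − 2ad. The deceleration and sight-distance numbers below are assumptions for illustration, not figures from the incident:

```python
import math

DECEL = 7.0          # m/s^2, assumed hard braking on dry asphalt (~0.7 g)
HAZARD_DIST = 20.0   # m, assumed distance at which braking begins
MPH_TO_MS = 0.44704

def impact_speed_mph(initial_mph):
    """Remaining speed at the hazard after braking over HAZARD_DIST,
    via v^2 = v0^2 - 2*a*d (zero if the car stops short)."""
    v0 = initial_mph * MPH_TO_MS
    v_squared = v0 * v0 - 2 * DECEL * HAZARD_DIST
    return math.sqrt(v_squared) / MPH_TO_MS if v_squared > 0 else 0.0

for mph in (30, 35, 40):
    print(f"braking from {mph} mph -> impact at {impact_speed_mph(mph):.0f} mph")
```

Under these assumptions the car stops short from 30 or 35 mph but still hits at around 14 mph from 40, which is why a few mph of initial speed moves the fatality odds so much.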

~~~
freehunter
That's an argument intended only to shift blame and distract. The car was
going 38 in a 35 which is well within the legal margin of error on a
speedometer[1]. For all we know, the speedometer read exactly 35.

There's enough blame to go around between the driver who was obviously texting
and the woman illegally jaywalking, you don't need to invent a false
controversy.

[1] [https://www.caranddriver.com/features/speedometer-
scandal](https://www.caranddriver.com/features/speedometer-scandal)

~~~
user5994461
I recall reading that the speed guns where I am from have a tolerance of 5
km/h or 5%, whichever is greater.

3 mph is beyond the margin for error, you'd get fined for driving that
quickly.

~~~
freehunter
I mean... I even provided a source and everything. And you threw that all away
with a glib "nah I read somewhere that you're wrong".

Please read the link. I may still be wrong, but by my math, the car was a
Volvo XC90 which means the acceptable margin of error on the speedometer is
+/\- 3 miles per hour.

~~~
user5994461
Our points are complementary. The errors of the speedometer and the error of
speed gun may add up.

The police will take off 5 km/h on the recorded speed, when editing the fine.
That's supposed to cover the margin for error.

You're out of luck if your car meter is 5 km/h off and you were trying to
drive at the speed limit.

------
bearded_goat
streamable mirror:
[https://streamable.com/vr5j6](https://streamable.com/vr5j6)

------
tamaharbor
I don't know if I would have seen her either.

------
thrownaway954
bottom line was that the person hit was crossing in the middle of the street
(jaywalking), so they are to blame for the accident, not the car. while the
technology _should have_ seen them, i don't blame it. this is essentially the
same as someone breaking into your house, getting hurt, and suing you. the
person is breaking the law, yet you're to blame... that wouldn't be fair to
you, so it isn't fair to blame the car in this incident.

------
gowld
Focusing on the car-automation and driver may be missing the point in this
scenario.

@jedweeks: Nearly 2 miles between crosswalks, that's why. Street was designed
to kill pedestrians.

------
zerostar07
Average people will expect robot cars to avoid these. It must also be a
terrible experience for the woman who was in the car.

------
TeeWEE
I don't trust Uber. Thats it.

------
scrame
Well, that's upsetting.

------
notananthem
Daaaang. That is monstrous.

------
dylan-m
This is such a blatant attempt to manipulate public opinion before the actual
data appears. Even if Uber's software _is_ ridiculously using visible light
for its decision-making, it should definitely be taking into account what it
doesn't know. A system like this should not be guessing. There is a giant
swath of black in the video; the camera doesn't know what's there. So, the car
should be moving so it has a short enough stopping distance to avoid a
collision. (Or at least flash the high beams, but that would be pointless,
because obviously it isn't actually using this camera. Uber may be run by
reckless psychopaths, but their engineers aren't stupid).
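That "short enough stopping distance" constraint can be sketched numerically: solve stopping distance = v·t + v²/(2a) for v given the visible distance. The latency and deceleration values here are assumptions for illustration, not anything known about Uber's system:

```python
import math

DECEL = 7.0      # m/s^2, assumed hard-braking deceleration on dry asphalt
LATENCY = 0.5    # s, assumed detect-and-brake latency of the whole system
MPH_PER_MS = 1 / 0.44704   # mph per (m/s)

def max_safe_speed_mph(visible_m):
    """Fastest speed at which v*LATENCY + v^2/(2*DECEL) <= visible_m,
    i.e. the positive root of v^2/(2a) + v*t - d = 0."""
    at = DECEL * LATENCY
    v = -at + math.sqrt(at * at + 2 * DECEL * visible_m)
    return v * MPH_PER_MS

for d in (20, 40, 60):
    print(f"{d} m of visibility -> about {max_safe_speed_mph(d):.0f} mph max")
```

With only ~20 m of usable visibility that comes out near 30 mph, so a system that genuinely accounted for what it couldn't see would have been driving well under the posted limit.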

~~~
URSpider94
The police department released the video, not Uber.

~~~
dylan-m
Mm, fair enough, that's my inner conspiracy theorist happening :b (Every part
of the US government is obviously in cahoots with Uber as part of a complex
scheme to take over the world, right?). Okay, probably not intentional on
their part, but I think it's really disappointing that this is the video they
had to share. The video itself is manipulative, and surely they would realize
that. It provides some context, but the prevailing focus is "look how much
darker those pixels are than those other pixels!", which just isn't useful.

~~~
java_script
No conspiracy needed. AZ wants to appear friendly to self driving car
companies (e.g. they already don't regulate them as hard as CA.). A single
undesirable being taken out isn't gonna change that.

Plus it's not like AZ has a long history of pedestrian friendliness anyway.
It's practically engineered to harm them, with 2mi between crosswalks

------
davidorff
Maximum Uberdrive

------
rhacker
Uber is at fault.

------
creaghpatr
Did not enjoy that video auto playing in my twitter feed.

------
itronitron
wow, Uber's system sucks big time

------
vbezhenar
So who's going to jail for this murder?

------
imrankhan0036
Awesome

------
teliskr
if it bleeds it leads

------
major505
Yup. I would certainly have hit her. Only saw her in the last second.

------
CamperBob2
Something that's worth pointing out as well is what happens to low-light
contrast sensitivity as we age. I believe there's something like a 50% loss in
night vision between the ages of 20 and 50.

Having watched the video, I seriously doubt I'd have seen the pedestrian in
time to stop for her. I don't know that I would even feel that sorry for her,
frankly -- there is only so much we can (and should) do to protect drunks and
idiots from hazards like crossing the street at night without looking.

Finally, had the driver been paying attention, at least some blame could
probably be assigned to DOT's stubborn refusal to bring their lighting
standards into the 21st century. Cars in other countries are equipped with
noticeably better lighting tech than US cars are allowed to use. It looks like
you'd be overdriving these headlights at any speed over 15 MPH or so.

~~~
dang
We detached this subthread from
[https://news.ycombinator.com/item?id=16644056](https://news.ycombinator.com/item?id=16644056)
and marked it off-topic.

~~~
CamperBob2
Nice. Let a hundred flowers bloom, right?

~~~
dang
Actually it was more flamebait than off topic, though the two are related.

------
jedberg
While I'm surprised the sensors on the car didn't catch the cyclist, I'm
pretty sure that if I had been driving and paying full attention, I would have
hit the bike too.

Personally, given this video, I'd still be totally comfortable with a self
driving car on the road with no safety driver (who clearly didn't make a
difference here anyway, not that they could have).

~~~
danso
I agree in that I consider the safety driver to be irrelevant in this
situation, because of the reaction time needed to take control, never mind hit
the brakes, never mind starting from a complacent state of mind.

But as others have pointed out, it's hard to tell what the driver could have
seen just from the dashcam video, which for starters has much lower dynamic
range than human vision. In the accident scene photos/videos, the ambient
street light is much more visible [0]. I'm guessing driving down that Arizona
road at night can't possibly be as pitch dark as the camera shows it, given
how that area seems to be used for entertainment venues/festivals.

[0] [http://sanfrancisco.cbslocal.com/2018/03/19/uber-self-
drivin...](http://sanfrancisco.cbslocal.com/2018/03/19/uber-self-driving-car-
fatal-crash-tempe-arizona/)

------
foobarbazetc
Here’s a better video:

[https://twitter.com/brianawhitney/status/976591851384745984?...](https://twitter.com/brianawhitney/status/976591851384745984?s=21)

The car should have seen that.

~~~
paraxisi
How is this video "better"? This is clearly somebody holding a camera
recording a screen, and it seems to have much more artifacting and brightness
changes.

------
manicdee
To all the armchair experts: try recreating this accident scene in the real
world.

Sodium vapour (monochromatic) lamps, then a large darker area with a
pedestrian in dark clothing, with the car's headlights dipped.

You will be alarmed at how much stuff you can put on the road and not see,
because the road has a lighting pattern that makes it less safe than no lights
at all.

With no lights at all, and a hedge between carriageways, the car would have
had high beams on and the driver may have seen the pedestrian with plenty of
time to brake and either avoid collision altogether or reduce the impact to
non-fatal energy.

Now go recreate the scene for yourself and see how it works for you.

~~~
tachim
Nope. That person + bike would have been completely trivial to detect with the
LIDAR those cars have mounted on top.

~~~
stale2002
The comparison should be against humans. It is humans that autonomous vehicles
are being compared against.

Because the alternative to an AV is a human. So the only thing at all that
matters is "is this better or worse than a human".

~~~
tachim
There are two explanations for the video published:

1\. They _DO_ have cameras onboard with the exposure settings necessary to see
the woman in the video, and have not released the video because it makes for
bad PR.

2\. They _DO NOT_ have such cameras, in which case they should have their
self-driving permit revoked because this failure mode is completely
predictable to anybody working in this space. And anyway their LIDAR should've
detected the person.

------
bcheung
Looks like somewhere between 1.0 and 1.2 seconds between when I can see the
pedestrian in the video and when the collision occurred.

I don't think a normal human, even if they were paying attention to the road,
could have braked or swerved in time to avoid hitting them.

Not sure if visibility is better for a real human than that video but from
what I remember about the eye test at the DMV it is extremely lax. They had a
vision test sign with a really big font and it was really close. I was
appalled that they let people drive who can't see the vision test sign on the
opposite side of the room -- even that was ridiculously easy to read.

Also, even if they reacted within 0.1 seconds (unlikely) I doubt the car would
have been able to slow down in that timeframe.
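That intuition can be made concrete with a rough kinematics sketch. All inputs here are assumptions for illustration: a vehicle speed of ~40 mph (not stated in the thread), a 1.2 s visibility window as estimated above, and dry-road braking at ~7 m/s².

```python
import math

def impact_speed(v0, gap_s, reaction_s, decel):
    """Speed (m/s) remaining when the car reaches the obstacle, given
    initial speed v0 (m/s), a visibility window of gap_s seconds, a
    driver reaction time, and constant braking deceleration (m/s^2)."""
    gap_m = v0 * gap_s            # distance to the obstacle when first visible
    react_m = v0 * reaction_s     # distance covered before braking begins
    if react_m >= gap_m:
        return v0                 # impact happens before the brakes engage
    brake_m = gap_m - react_m     # distance actually available for braking
    v2 = v0 * v0 - 2 * decel * brake_m
    return math.sqrt(v2) if v2 > 0 else 0.0

v0 = 40 * 0.44704                 # assumed ~40 mph, converted to m/s

# Even a superhuman 0.1 s reaction still leaves an impact at ~6.7 m/s (~15 mph)
print(round(impact_speed(v0, 1.2, 0.1, 7.0), 1))

# A typical 1.5 s reaction time means the brakes never engage at all
print(impact_speed(v0, 1.2, 1.5, 7.0) == v0)
```

Under these assumed numbers, even an implausibly fast reaction only scrubs off a little over half the speed before impact, and a realistic reaction time leaves no braking at all, which matches the commenter's point.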

I'm surprised IR and LIDAR didn't pick it up though. Would be useful to have
the other data released. Video is only a small part of what the car sees and
excluding this information is a huge disservice.

Not sure if they already do, but it would be nice, no, imperative, for car
companies to collect "black box" data from collisions and share it with all
other interested parties. This is a dataset that would be extremely beneficial
for all self-driving AI engineers.

