
Uber’s self-driving car could not detect pedestrians outside of a crosswalk - notlukesky
https://www.theregister.co.uk/2019/11/06/uber_self_driving_car_death/
======
Strilanc
> _5.6 seconds before impact, it classified her as a vehicle. Then it changed
> its mind to “other,” then to vehicle again, back to “other,” then to
> bicycle, then to “other” again, and finally back to bicycle._

System can't decide what's happening.

> _It wasn’t until 1.2 seconds before the impact that the system recognized
> that the SUV was going to hit Herzberg_

System is too slow to realize something serious is happening.

> _That triggered what Uber called “action suppression,” in which the system
> held off braking for one second_

A hardcoded 1 second delay during a potential emergency situation. Horrifying.

I bet they added it because the system kept randomly thinking something
serious was going to happen for a few milliseconds when everything was fine.
If you ever find yourself doing that for a safety-critical piece of software,
you should stop and reconsider what you are doing. This is a hacky patch over
a serious underlying classification issue. You need to fix the underlying
problem, not paper over it.
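
For comparison, here's a toy sketch (hypothetical code, nothing to do with Uber's actual implementation) of the difference between a blanket suppression window and a simple debounce:

```python
# Hypothetical sketch, not Uber's code: two ways to handle a noisy
# "collision imminent" signal sampled at 20 Hz. A blanket one-second
# hold-off delays every response, including real emergencies; requiring
# a few consecutive positive frames filters millisecond-long false
# positives while still reacting within a tenth of a second.

def brake_time_with_suppression(signal, dt=0.05, hold_off=1.0):
    """Ignore detections until `hold_off` seconds have elapsed."""
    for i, fired in enumerate(signal):
        if fired and i * dt >= hold_off:
            return i * dt  # seconds until braking actually starts
    return None

def brake_time_with_debounce(signal, dt=0.05, needed=3):
    """Brake once the detection persists for `needed` consecutive frames."""
    streak = 0
    for i, fired in enumerate(signal):
        streak = streak + 1 if fired else 0
        if streak >= needed:
            return i * dt
    return None

sustained_hazard = [True] * 40   # a real obstacle, seen continuously
glitch = [True, False] * 20      # single-frame false positives

print(brake_time_with_suppression(sustained_hazard))  # a full second lost
print(brake_time_with_debounce(sustained_hazard))     # brakes after 0.1 s
print(brake_time_with_debounce(glitch))               # correctly ignores noise
```

The point isn't the specific numbers; it's that "wait a fixed second" and "require a consistent signal" are completely different safety trade-offs.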

How is this not the title of the story? This is so much worse than the "it
couldn't see her as a person, only as a bicycle". At least the car would still
try to avoid a bicycle, in principle, instead of blindly gliding into it while
hoping for the best.

> _with 0.2 seconds left before impact, the car sounded an audio alarm, and
> Vasquez took the steering wheel, disengaging the autonomous system. Nearly a
> full second after striking Herzberg, Vasquez hit the brakes._

And then top it off with systemic issues around the backup driver not actually
being ready to react.

~~~
chooseaname
>> 5.6 seconds before impact, it classified her as a vehicle. Then it changed
>> its mind to “other,” then to vehicle again, back to “other,” then to
>> bicycle, then to “other” again, and finally back to bicycle.

The system should have started applying the brakes at this point. If a
3,500 lb vehicle can't decide what it is about to hit, it needs to slow down
(to a stop if necessary).

> That triggered what Uber called “action suppression,” in which the system
> held off braking for one second

This is borderline criminal negligence.

> with 0.2 seconds left before impact, the car sounded an audio alarm, and
> Vasquez took the steering wheel, disengaging the autonomous system. Nearly a
> full second after striking Herzberg, Vasquez hit the brakes.

Why were there no alarms going off at 5.6 seconds when the vehicle was
confused!!!??

SMH. This is just ... I'm flabbergasted.

~~~
a_t48
> The system should have started applying brake at this point. If a 3500lb
> vehicle can't decide what it is about to impact, it needs to slow down (to a
> stop if necessary).

> Why were there no alarms going off at 5.6 seconds when the vehicle was
> confused!!!??

That sort of depends on the specifics of how their obstacle prediction and
avoidance works - having a fuzzy view of what exactly something is at 5.6
seconds out is probably OK. The important bit is that it notices the obstacle
moving out into the road and stops for it. Classification is not needed to
avoid objects. The key words here are actually "at each change, the object's
tracking history is unavailable" and "without a tracking history, ADS
predicts the bicycle's path as static", which is a horrible oversight.
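
To make that oversight concrete, here's a toy tracker (hypothetical, loosely paraphrasing the failure mode described in the NTSB report, not real perception code) in which wiping the history on reclassification forces a "static" prediction:

```python
# A track keeps a position history; a velocity estimate needs at least two
# samples. If reclassification wipes the history, the predictor falls back
# to "static" and extrapolates no motion at all, even for an object walking
# steadily across the road.

class Track:
    def __init__(self):
        self.history = []  # (t, x) samples

    def observe(self, t, x, reclassified=False):
        if reclassified:
            self.history = []  # the oversight: new class, history discarded
        self.history.append((t, x))

    def predict(self, t_future):
        if len(self.history) < 2:
            # No motion estimate available: treat the object as static.
            return self.history[-1][1]
        (t0, x0), (t1, x1) = self.history[-2], self.history[-1]
        v = (x1 - x0) / (t1 - t0)
        return x1 + v * (t_future - t1)

# Pedestrian moving 1 m/s across the lane, reclassified at every frame:
churned = Track()
for t in range(5):
    churned.observe(t, float(t), reclassified=True)
print(churned.predict(10.0))  # still at x=4.0 -- predicted "static"

# The same motion with a stable classification:
stable = Track()
for t in range(5):
    stable.observe(t, float(t))
print(stable.predict(10.0))  # correctly extrapolates to x=10.0
```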

>> That triggered what Uber called “action suppression,” in which the system
>> held off braking for one second

> This is borderline criminal negligence.

Yeah, it is. Even though it was too late to swerve, there was still enough
time to brake and reduce the speed of impact. _This_ is probably where an
alarm should have fired - the car is now in a really unsafe state and the
safety driver should absolutely know about it.
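
A back-of-the-envelope check of that (my own numbers, not from the report: assuming roughly 19 m/s, about 43 mph, and a firm 8 m/s² of braking):

```python
import math

# Rough kinematics, not a simulation: v_impact^2 = v0^2 - 2*a*d.
def impact_speed(v0, decel, distance):
    """Speed on reaching `distance` under constant braking (0.0 if stopped)."""
    v_sq = v0**2 - 2 * decel * distance
    return math.sqrt(v_sq) if v_sq > 0 else 0.0

v0 = 19.0        # m/s, roughly the reported travel speed (assumed)
gap = v0 * 1.2   # metres covered in the last 1.2 s with no braking

print(round(gap, 1))                          # ~22.8 m to the pedestrian
print(impact_speed(v0, 8.0, gap))             # 0.0 -- it stops just short
print(round(impact_speed(v0, 8.0, 10.0), 1))  # still moving if braking starts late
```

Under those assumptions, hard braking from 1.2 s out stops the car just short, which is consistent with Volvo's later conclusion that its stock system might have dramatically reduced the impact speed or avoided the collision altogether.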

Disclaimer - Up until a few months ago I worked at a major (non-Uber) self-
driving car company.

~~~
mysterydip
Here's a quick way to get safety in your collision avoidance system up to
spec: randomly choose coders of the system to be the objects avoided during
tests.

~~~
roenxi
You might be able to _literally_ count on two hands the number of accidents
where we have this level of transparency into the thought processes and
sensory data of the 'driver'. And control over what happens next time. There
is no doubt that the engineering data gathered here and in other accidents is
going to contribute to a massive (truly massive, unspeakably massive,
enormously massive - all such adjectives are appropriate) reduction in both
actual deaths from car accidents and statistical deaths from the huge amount
of time people waste driving.

The big picture is so overwhelmingly positive that even if the engineers were
purposefully running people over to gain testing data it would still probably
be a large net win for greater society. Thankfully there is no call for such
reckless data gathering.

If anything, punishments in this case should be much more lenient than normal
rather than breaking out the cruel and unusual measures.

~~~
vpribish
I think you are absolutely correct, and I'm disappointed again at the broader
HN downvoters, who lack any sense of perspective. It's like the hackers here
all turned into fuzzy-headed luddites.

~~~
puranjay
Come on. It's one thing to be run over by a fellow human being, it's another
to be run over by a system developed by a corporation in the pursuit of
greater profits.

As a species, we've long accepted that living around each other poses some
hazards and we've made our peace with it.

But to stretch that agreement to a multi-billion dollar corporation that only
wants to make money off it? That's too much to ask for.

~~~
glofish
I think the "for-profit" by a "multi-billion dollar corporation" thought
process is clouding your judgment.

It is such a trite and overused argument.

Why does it feel so different to you if the accident was caused by negligent
human texting, versus an engineer making decisions in code?

If anything that engineer was most likely operating in much better faith than
the texting, perhaps drunk human driver.

~~~
puranjay
It's a rather complicated question. Intention matters a lot when it comes to
dealing with on-road incidents. If there is no intention, we deem it an
"accident"; otherwise we deem it manslaughter. The punishments - by courts
and by society - are much harsher for the latter.

Can code be "accidental"? Surely not. Someone had the intent to write and
deploy it. If it fails, it's not "accidental"; the system was just
programmed that way.

So the question is: are we okay with for-profit companies _intentionally_
writing software that can lead to deaths?

------
brudgers
The car hit Elaine Herzberg.

    
    
      Elaine's Obituary
    
      Elaine Marie Wood-Herzberg, 49, of Tempe, AZ passed away 
      on March 18, 2018.  A graveside service took place on 
      Saturday, April 21, 2018 at 2:00pm at Resthaven/Carr-
      Tenney Memorial Gardens in Phoenix.
    
      Elaine was born on August 2, 1969 in Phoenix, AZ to Danny 
      Wood and Sharon Daly.
    
      Elaine grew up in the Mesa/Apache Junction area and 
      attended Apache Junction High School. 
    
      Elaine was married to Mike Herzberg until he passed away.
    
      She was very creative and enjoyed drawing, coloring, and 
      writing poems.  She always had a smile on her face for 
      everyone she met and loved to make people laugh.  She 
      would do anything she could to help you, and was there to 
      listen when you needed it.
    
      Elaine is survived by her mother Sharon, of Apache 
      Junction, AZ; her father Danny, of Mayor, AZ; her son 
      Cody, of Apache Junction, AZ; her daughter Christine, of 
      Tempe, AZ; her grandchildren Charlie, Adrian, and Madelyn; 
      her sister Sheila, of Apache Junction, AZ; and many other 
      relatives.
    

But a homeless person is only ten points. So let's remember the car.

[https://www.sonoranskiesmortuaryaz.com/obituary/6216521](https://www.sonoranskiesmortuaryaz.com/obituary/6216521)

[https://afscarizona.org/2018/05/14/the-untold-story-of-elain...](https://afscarizona.org/2018/05/14/the-untold-story-of-elaine-herzberg/)

~~~
sschueller
This is why I will never use any service from Uber. I may be insignificant,
but at least I know I am doing my little part not to support this disgusting
"move fast and break things" culture, which now includes actual casualties.
Everyone involved except the lowest pawns gets away with it, so the next
company can do it even worse.

~~~
shadowgovt
It's a choice you can make, but it's worth noting that the alternative to
getting SDCs working isn't "nobody gets hit"; the alternative is "we keep
having humans operate multi-ton vehicles, with their known wetware attention
and correction flaws, and an average of 3,200 people die a day."

That doesn't imply Uber has the right chops to solve the problem, but I hope
_someone_ does.

~~~
romwell
False dichotomy, my friend.

Look at Europe (or heck, NYC) for alternatives to a car-dominated society:

* walkable, mixed-use, dense neighborhoods

* public transportation (rail-based, in particular)

* car-free streets (and cities)

The solution to traffic deaths is not self-driving cars. It's moving away from
the Levittown-style suburbs that have proliferated across the US since WWII.

~~~
shadowgovt
Cool, so tear down American civic architecture to its bedrock and rebuild it
from scratch.

Not impossible, but less than likely in the short horizon. We'll probably get
working SDCs sooner.

~~~
romwell
True that, but not mutually exclusive.

We used to have streetcars. We tore them down. We can put them back.

~~~
shadowgovt
And a self-driving streetcar _is_ a more constrained problem to solve than a
self-driving car.

~~~
romwell
Yup! Still nontrivial, but actively worked on[1].

[1][https://www.wired.com/story/siemens-potsdam-self-driving-str...](https://www.wired.com/story/siemens-potsdam-self-driving-street-car-tram/)

------
michalc
As a UK resident, I find the omission of even considering a pedestrian in the
road quite unexpected and, unfortunately, overly car-centric.

There isn’t really the term “jaywalking” here. It’s just “crossing the road”.
I’m not sure on exactly who has what legal responsibility, but it certainly
feels like pedestrians should look out for cars when crossing, and drivers
should look out for pedestrians.

~~~
tialaramex
Legally the pedestrian has no responsibility except that they're prohibited
from entering certain areas specifically legislatively set aside for motor
vehicles like motorways (approximately "freeways").

Drivers are required to give "due care and attention" to driving, which can
be demonstrated by following the Highway Code. That code tells them
pedestrians might do things they don't expect, and to assume that if it's not
clear what's going to happen, then yes, there is a pedestrian behind that
obstruction, and they are going to run into the road in front of you,
whereupon hitting them would be your fault.

For example when I was a child I got off my first bus home from secondary
school, and ran straight into the road in front of a car I couldn't see
because the bus was in the way. The horrified driver was legally responsible
for that, even though she hadn't intended to hit me. I believe she would have
been automatically billed by the authorities for the cost of shipping me to a
hospital to have my broken leg set and so on.

Clearly it is in some sense my fault that happened, but on the other hand it's
not me choosing to drive a huge steel box at 30mph past a bunch of idiot
children...

~~~
Cd00d
Why didn't your school bus have a stop sign deployed? Those exist for exactly
this risk. Running a school bus's stop sign is, I believe, a more serious
violation than running an ordinary posted stop sign.

~~~
tarsinge
Unfortunately they don’t exist in a lot of countries.

~~~
seanmcdirmid
Do they exist in any country other than the USA?

~~~
tekknik
Plenty:

[https://en.wikipedia.org/wiki/School_bus_by_country](https://en.wikipedia.org/wiki/School_bus_by_country)

~~~
seanmcdirmid
So in Asia, none of those are public school buses, especially in China, where
they are all private buses painted yellow with no special treatment (no stop
sign popping out of the side). Similar differences exist in Europe. I guess
my point is, none of these systems come even close to resembling what we have
in the USA and Canada.

------
WhompingWindows
"The most glaring mistakes were software-related. Uber’s system was not
equipped to identify or deal with pedestrians walking outside of a crosswalk.
Uber engineers also appear to have been so worried about false alarms that
they built in an automated one-second delay between a crash detection and
action. And Uber chose to turn off a built-in Volvo braking system that the
automaker later concluded might have dramatically reduced the speed at which
the car hit Herzberg, or perhaps avoided the collision altogether."

Wow, dangerously and maliciously negligent engineering. Another reason Uber is
a company to be wary and critical of. It's hard to say whether their rushed
and corner-cutting approach has set back SDCs, but those engineers should
change their ways now that a death is on their conscience.

~~~
bdcravens
> a death is on their conscience

How much negligence is required before a death is on their criminal record?

~~~
me_me_me
Ask Boeing about that.

~~~
krapht
In this case, isn't the blame on Vasquez, the safety driver? If a plane
crashes while on autopilot, ultimately we should blame the human pilot since
they retain master control.

~~~
stlHusker
Given the nature of the system, when would she have been notified that she
needed to take control? Additionally, what did Uber tell her about how the
system operated? Ultimately it is on Uber for the lack of serious
consideration. Rolling out level 2 & 3 autonomous driving systems on a wide
scale, where driver attention (especially from non-experts in an open world)
is still required, is absolutely dangerous given human nature. Most people
believe that you have to jump straight to level 4 & 5 autonomy, which
represent fully autonomous systems.

Think of it like this: "Hey, let's take this person who isn't an employee of
the company, who gets paid a pittance in a "gig" economy where extreme
"hustle" is needed, and who probably isn't an attentive driver anyway, and
put them as a critical backstop in a system that is still a prototype and
that they most likely don't understand."

The Uber engineering team lacked or ignored human factors experts.

~~~
wlll
I think it depends who was legally in control of the vehicle. I assume (and I
realise it could be different, but this is just creating a model) that the
safety driver would legally be considered to be in control of the vehicle, and
as such responsible for the crash. She did after all have the ability to
prevent the crash had she not been negligent.

I assume for instance that if I were to use Tesla AutoPilot on the road in the
UK (I don't have a Tesla so I haven't looked into this) and my car crashes
into someone while a) it's enabled and b) I'm not paying attention that I am
still 100% at fault.

Until self-driving cars can legally be in control of themselves, absolving
occupants of any responsibility, I'd assume that this is, or at least should
be, the case.

I don't think Uber is clean in this to be clear, I suspect they were cutting
corners to stay competitive, and I just don't trust them at all to make
decisions that are in the interest of the general public, but the direct
criminal responsibility seems to lie with the safety driver, even though
it seems that Uber should be sueable for _something_.

~~~
jakeogh
"legally be in control of themselves"

Never. They have no skin in the game. If they did, locking them inside a car
for their life would be illegal.

~~~
wlll
At some point I suspect laws will change if self driving cars become good
enough. I don't know where the liability will lie, perhaps the car
manufacturers, or the insurance companies.

------
fabian2k
With a disabled emergency braking system and an inability to handle rather
common situations like people jaywalking, these cars really shouldn't have
been driving in public. That is the kind of stuff that I'd expect to be tested
and implemented on private grounds.

Additionally, reducing the number of people in the car to one when the car is
pretty much by design not capable of handling emergency situations by itself
is quite reckless.

~~~
dmix
They shouldn't have been driving in public with someone who wasn't paying
attention 100% of the time, as any regular driver would. The longer quote:

> Also, don't forget: the SUV's emergency braking system was deliberately
> disabled because when it was switched on, the vehicle would act erratically,
> according to Uber. The software biz previously said “the vehicle operator is
> relied on to intervene and take action," in an emergency.

The question, then, is whether there was proper training and communication to
the test drivers that it's never okay to look down at your phone, or whether
that was simply an unrealistic expectation. Or whether the hours were too
long, or testing at night, etc.

It said there were about 5 seconds, which should have been more than enough
for a human test driver to hit the brakes, which was their stated job.

~~~
Symmetry
In this case the safety driver was watching their phone, which was playing a
movie, so a bit beyond the normal level of attention wandering you'd expect.
But even for people trying their best, paying attention to a road for hours
without ever having to provide any input isn't something you can reasonably
expect. NHTSA level 3 autonomy is just a bad idea; we need to go straight
from 2 to 4.

~~~
carlmr
I think so, too. It's just asking something that humans weren't designed for.

------
wongarsu
So Uber fails to predict even completely linear paths traveled by objects if
the system isn't trained on or expecting that specific type of object? That
sounds like an even bigger issue than "we didn't think pedestrians could exist
outside designated crossings"

~~~
madamelic
What's crazy to me is that an "Other" object isn't flagged for a complete stop
or a major alert to the "safety" driver. This seems like a major case for when
the software should stop being autonomous, when it isn't sure.

Likely the story is that their software flags a lot of things as "Other"
without figuring them out upon closer inspection.

~~~
sp332
There was an emergency braking feature that would engage when a collision was
imminent. It fired 1.3 seconds before hitting Herzberg. But Uber had disabled
that function because there were too many false positives, so the car just
alerted the driver (who was not paying attention) instead.
[https://www.washingtonpost.com/news/dr-gridlock/wp/2018/05/2...](https://www.washingtonpost.com/news/dr-gridlock/wp/2018/05/24/ntsb-self-driving-uber-did-not-have-emergency-braking-turned-on/)

~~~
Piskvorrr
Alerting a human (who wasn't _even_ in the driver's feedback loop at the
time) 1.3 seconds before the collision? That amounts to "hey, look at this
unavoidable crash we're driving into!"

------
fredley
Jaywalking was invented by the car industry, I wonder if they'll double down
on it again: [https://www.vox.com/2015/1/15/7551873/jaywalking-history](https://www.vox.com/2015/1/15/7551873/jaywalking-history)

~~~
wongarsu
Maybe works for Uber and GM, but in Europe jaywalking doesn't exist. If they
want to sell cars here they will have to deal with being responsible for
almost every pedestrian they hit, because that's the standard we already set
for human drivers.

~~~
skrause
> _but in Europe jaywalking doesn't exist_

Europe is not a country and each European country has different laws:
[https://en.wikipedia.org/wiki/Jaywalking#Europe](https://en.wikipedia.org/wiki/Jaywalking#Europe)

~~~
wongarsu
But it is a somewhat homogeneous region with similar laws, as supported by
that Wikipedia article: in general you can cross roads anywhere unless it's a
motorway or you are within 100 meters of a designated crossing (any closer and
the laws diverge on what's acceptable). That's just how we talk about the US
treating things a certain way despite the different states having varying laws
with countless exceptions and special cases.

~~~
Piskvorrr
In which case, this _would_ have been a jaywalking infraction in many European
countries, as a designated crossing is nearby.

However, it is not the role of a driver (or their car) to serve death
penalties for that.

~~~
josefx
According to Wikipedia, the "nearby" here was 110 meters, so slightly outside
the range for most European countries. She most likely used a set of crossing
paths on the area between both halves of Mill Ave, so she started even
further south of it.

~~~
masklinn
> your nearby is 110 meters, so slightly out of the range for most European
> countries.

« Every » rather than « most », 100m seems to be the Eastern European limit
and going down the list in western or northern Europe the limits seem far
lower, usually 20 to 50. Furthermore pedestrians not following this often does
not absolve drivers from any (let alone all) responsibility.

------
hirundo
When the software I write fails, and it does, someone doesn't get to read a
web page. Nobody dies or even gets very inconvenienced. This story reminds me
of how good a thing that is. After 40 years of practice I'm still not
competent to write such high stakes code. I hope the people who do write it
are far more competent than me, but still at least as skeptical about their
own capability.

~~~
AgloeDreams
An interesting trap is people in the field of social media (FB/ Twitter) not
realizing that they are in that high-stakes environment where bad privacy
settings and leaks can cause death.

~~~
twitter_anon
Some of us do, as we come from (or work with people who come from) states that
will perpetrate violence against dissidents.

Unfortunately, there still aren't enough of us. We're trying to change that.

------
jefftk
The NTSB just released their report:
[https://dms.ntsb.gov/public/62500-62999/62978/629713.pdf](https://dms.ntsb.gov/public/62500-62999/62978/629713.pdf)

"If the collision cannot be avoided with the application of the maximum
allowed braking, the system is designed to provide an auditory warning to the
vehicle operator while simultaneously initiating gradual vehicle slowdown. In
such circumstance, ADS would not apply the maximum braking to only mitigate
the collision."

"Certain object classifications— other—are not assigned goals. For such
objects, their currently detected location is viewed as a static location;
unless that location is directly on the path of the automated vehicle, that
object is not considered as a possible obstacle. Additionally, pedestrians
outside a vicinity of a crosswalk are also not assigned an explicit goal.
However, they may be predicted a trajectory based on the observed velocities,
when continually detected as a pedestrian."

~~~
DougBTX
> continually detected as a pedestrian

Interesting, from the article and this, it sounds like the system can't
maintain position tracking of an object if its classification changes. So
even if it could detect a pedestrian, something ambiguous like a pedestrian
pushing a bike might have no motion tracking data from one moment to the
next, and the car would have no ability to predict its trajectory.

~~~
tjungblut
I had the same impression from reading the table. Looking at the map and
trajectories, though, how could the human not see the car coming? Or did she
think the car would stop or slow down? The same question holds for the
vehicle operator.

edit after reading the other reports: the victim apparently was under the
influence of methamphetamine and the vehicle operator was busy watching Hulu.

~~~
jcranmer
The pedestrian was about 3/4 of the way across the road when she was struck,
and was walking a bicycle that was partially laden with goods. That suggests
that quick evasion on the pedestrian's part would have been somewhat
difficult, but given that the road was empty of other vehicles, there was a
long clear sight distance to the pedestrian, and there was ample space to
maneuver, any reasonable driver would have been able to stop or switch lanes
to evade the pedestrian.

The driver was not paying attention to the road and was incapable of
performing a timely emergency maneuver (be it a stop or lane change).

~~~
SiempreViernes
Yeah but when you cross the street you generally keep an eye out for traffic,
and the sight-line is such that if the car had headlights on it should have
been visible for more than 6 seconds before the collision. The victim might
well have expected the car to yield somewhat.

However, I don't know if the car did have any external lights on, so it might
have been hard to see until it was closer than 6 seconds away.

------
isostatic
It wasn't that the sensors were confused or something - the AI had no idea
how to cope with pedestrians in the road, for whatever reason, so it threw
them into the "Other" category.

At least 98% of objects are categorised, the rest don't matter. /s

~~~
Gupie
Isn't the bigger problem that it did not classify the object as being on a
collision path? It doesn't matter what the object is: brake if you are going
to hit it.
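
A classification-free check like that is easy to state (hypothetical sketch; a real system also needs distance and velocity estimation, which is the hard part):

```python
# No classification anywhere: anything in the lane with a short enough
# time-to-collision triggers braking, whatever the object turns out to be.

def time_to_collision(ego_speed, obstacle_distance, obstacle_speed=0.0):
    """Seconds until impact at the current closing speed, None if opening."""
    closing = ego_speed - obstacle_speed
    if closing <= 0:
        return None
    return obstacle_distance / closing

def should_emergency_brake(ego_speed, obstacle_distance, ttc_threshold=2.0):
    ttc = time_to_collision(ego_speed, obstacle_distance)
    return ttc is not None and ttc < ttc_threshold

# ~43 mph (19 m/s), unknown object 25 m ahead: TTC ~1.3 s -> brake.
print(should_emergency_brake(19.0, 25.0))   # True
print(should_emergency_brake(19.0, 100.0))  # False: TTC ~5.3 s, plenty of time
```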

~~~
ForHackernews
> It doesn't matter what the object is, brake if you are going to hit it.

The built-in emergency braking system from Volvo does this [0] but Uber
deliberately disabled it (presumably because it conflicted with their self-
driving rig).

[0] [https://www.media.volvocars.com/global/en-gb/media/pressrele...](https://www.media.volvocars.com/global/en-gb/media/pressreleases/38073)

~~~
rightbyte
At least the Volvo system's output could be used as a sanity check or
something.

If you hit the accelerator harder when the Volvo system brakes, you override
it. It should be fairly easy to integrate as a backup.

However, I guess the Volvo system might not look far enough ahead for those
speeds?

------
more_corn
I can personally attest that they couldn't see you IN the crosswalk either.
One nearly ran me down because it failed to yield to pedestrians in the
crosswalk when turning right.

While I was standing there talking about it, the driver looped the car around
the block and tried again. It did the exact same thing the second time. One of
the people it happened to walked into the office to report the bug. The Uber
engineers there poo-pooed the concern and never did anything to fix it. This
was before they killed the pedestrian. The fact that they utterly failed to
create adequate safety systems when repeatedly warned, shows they are
absolutely not capable of doing this safely.

~~~
jacquesm
Oh wow. That's really bad, one of the ground rules of traffic participation is
that traffic that goes straight on the same road as you when you turn _always_
has priority.

> The Uber engineers there poo-pooed the concern and never did anything to fix
> it.

This is yet another nail in the coffin for Uber and self driving. I think as a
company Uber is categorically and institutionally unable to participate in
this space in a responsible manner. Their whole corporate culture is totally
opposed to what is required to make this a reality.

------
mattacular
Here's a simple rule that could be enacted for companies seeking to do work in
the autonomous driving space: if your car kills a person, your company loses
its license to work on autonomous vehicles. Forever. That will dictate the
adequate pace to achieve these goals safely. Then we'll see what's really
possible with this technology.

Having cars that can drive themselves just doesn't seem like a particularly
high priority for society at large in the face of other looming issues. Why
allow it to proceed in such a dangerous fashion at all?

~~~
drcode
What if the rate of deaths for a given company is not zero, but below the rate
for human drivers? Is it ok to have additional, unnecessary pedestrian deaths
by NOT allowing that company to deploy their technology?

(That said, the negligence in the Uber case makes it pretty clear they are
likely far from reaching that level of competency)

~~~
puranjay
So you mean being okay with a multibillion-dollar corporation harming people
so it can make even more profit, all the while telling you that it's "good
for society"?

Gee, where have I heard that script before?

~~~
nulld3v
So is reducing the accident rate on roads "harming people"?

~~~
puranjay
Good for the people once the system is perfected. Not so much for the ones
who get run over while the system is being perfected.

The "greater good" utilitarian argument has been the basis of some of the
worst policies and politics in the world.

I'm not saying that self driving cars fall into the same category, but how
many deaths are you okay with until Uber/Waymo perfect their algorithms (and
later, charge you for it)? 1? 10? 100?

------
dvdhnt
Personally, I think using automobiles as a blueprint for autonomous
transportation is horrifying and preposterous. The first auto I know of went
10mph in 1886. Engineers have had over 100 years to iterate on the concept yet
people are still dying. Somehow, Ubers and Googles think they can distill this
kind of refinement into a decade or two of software "engineering". Insanity.

We should focus on transporting goods and services autonomously at a fraction
of the speeds. It should be about efficiency and impact, not getting one human
from A to B.

------
hysan
An animal crossing the street is rare, but it's a situation I'd expect
self-driving cars to be trained on. I doubt they classified all known large
animals, so shouldn't there be a classification that anticipates unknown
animals crossing the street?

As someone who spent a number of years living in the countryside, this seems
obvious. However, for those growing up in cities, this might never cross your
mind. It makes me wonder if there might be a lack of diversity on the teams
building and testing these systems.

------
munmaek
What’s that quote about the last 20% of the work taking 80% of the effort?

This crash seems to be the result of several cases not being handled
correctly, or at all.

1. Pedestrian crossing -not- at a crosswalk, aka jaywalking

2. Objects tagged as “Unknown” don’t have their path tracked.

By the time the car saw an “unknown object” (the woman) directly in its
trajectory it was too late (1.2s before impact).

Why was there no system for a person jaywalking? That’s extremely common.

Why don’t “Unknown” objects get stored and individually have their
trajectories tracked? Hello?

Programmers need to start going to jail.

~~~
pluma
Not necessarily the programmers. There will always be some people easy enough
to manipulate into writing this kind of code. But those who made the decision
to ship it and sign off on it should definitely face criminal negligence
charges.

~~~
munmaek
One can only claim ignorance for so long. They know what they’re programming.

But yes, managers and the like involved should also be going to jail. This
cavalier, negligent attitude needs to be dealt with yesterday.

------
dghughes
And now today (Nov 6) there are news articles about the Tesla "Smart Summon"
which was quietly activated. There are videos of dangerous near collisions of
"summoned" driver-less Tesla vehicles very slowly driving among other
vehicles.

[https://www.ctvnews.ca/video?playlistId=1.4658613](https://www.ctvnews.ca/video?playlistId=1.4658613)

~~~
tdy_err
That's blatantly sensationalist journalism, which conveniently does not
mention that there is a person _holding down a button_ the entire time that
the car is being summoned.

If that’s “dangerous”, well, so is piloting it manually.

~~~
rayiner
That’s a red herring though. It shouldn’t matter if the person is paying
attention if the self driving tech is working. (As Musk likes to say, the
person is just there to tick off a legal checkbox.)

~~~
wiggles_md
At what point do Musk’s statements represent a moderate legal risk?

For the sake of argument, could Musk’s statement, along with his very public
insistence of being actively engaged in the design process, demonstrate a
cavalier disregard for proper safety engineering at the management level
(outside acceptable industry practice) and so a defective process?

------
patagurbon
It’s mind-boggling that tracking history wasn’t maintained between object
classifications. Surely the best thing to do would be to detect an "object" ->
track it -> classify -> predict path -> back to classify
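The pipeline above — keep one persistent track per object so that identity and motion history survive however many times the classifier changes its mind — can be sketched roughly like this (a toy illustration with hypothetical names, not anyone's actual stack):

```python
from dataclasses import dataclass, field

@dataclass
class Track:
    """A persistent track: identity and motion history outlive re-classification."""
    track_id: int
    positions: list = field(default_factory=list)  # (t, x, y) observations
    label: str = "unknown"                          # latest classifier output

    def observe(self, t, x, y, label):
        self.positions.append((t, x, y))
        self.label = label  # the label may flip; the history is kept either way

    def velocity(self):
        """Estimate velocity from the two most recent observations."""
        if len(self.positions) < 2:
            return (0.0, 0.0)  # no history yet -> no motion estimate
        (t0, x0, y0), (t1, x1, y1) = self.positions[-2:]
        dt = t1 - t0
        return ((x1 - x0) / dt, (y1 - y0) / dt)

    def predict(self, dt):
        """Extrapolate position dt seconds ahead, whatever the label says."""
        _, x, y = self.positions[-1]
        vx, vy = self.velocity()
        return (x + vx * dt, y + vy * dt)

# Even though the label flips vehicle -> other -> bicycle,
# the predicted path stays consistent because the track persists:
tr = Track(track_id=1)
tr.observe(0.0, 10.0, 0.0, "vehicle")
tr.observe(0.5, 9.0, 0.0, "other")
tr.observe(1.0, 8.0, 0.0, "bicycle")
print(tr.predict(1.0))  # -> (6.0, 0.0): still closing, regardless of label
```

The point of the sketch: if you reset the track every time the classification changes (which is what the NTSB report describes), `velocity()` never has two observations to work with and the predicted path is lost.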

------
ForHackernews
Between this and deliberately disabling Volvo's own emergency braking
features, I think some decision-makers at Uber should be charged with
negligent homicide.

Maybe a few executives going to jail will put an end to this cowboy attitude
where lives are on the line.

~~~
microtherion
IANAL, but given that the disabling of the security system was an act of
commission, not omission, it could be argued that this was more than just
"negligent". On the other hand, car manufacturers have no legal duty to
include collision avoidance systems, so disabling one is probably not illegal
in itself.

------
mangecoeur
So: Uber put a car on the road that was not designed to detect pedestrians
except in a few limited conditions. Someone died because of their gung-ho
approach to safety. How is that even remotely acceptable? Really hope we see
people being held responsible for this...

------
taylodl
When I learned to drive they stressed unanticipated objects coming into the
street - especially balls: don't look at the trajectory of the ball and judge
whether it will be in your way, instead look for the kid who will be running
after the ball while not paying attention to traffic. Also in my state if you
hit a kid (a minor actually) on a bicycle you are guilty until proven
innocent. That's the law. You have to _prove_ there was no reasonable way for
you to avoid the accident.

------
Simulacra
If the safety driver had been paying attention to the road instead of their
phone, they might have been able to brake in time so that this woman did not
get hit.

~~~
KeithBrink
The interview with the driver by the NTSB is interesting if you want more
details from the driver's side of the story:

[https://dms.ntsb.gov/pubdms/search/document.cfm?docID=477745...](https://dms.ntsb.gov/pubdms/search/document.cfm?docID=477745&docketID=62978&mkey=96894)

You can also see the detailed report from the NTSB which will go through about
everything you would want to know about the actual driver and her reaction:

[https://dms.ntsb.gov/pubdms/search/document.cfm?docID=477743...](https://dms.ntsb.gov/pubdms/search/document.cfm?docID=477743&docketID=62978&mkey=96894)

Particularly of note is that Hulu was streaming video on her phone until about
1 minute after the crash (page 11 of the report).

~~~
jcranmer
One of the most terrifying lines in those documents to me is this:

> According to Uber ATG, the SDS did not have the capability to classify an
> object as a pedestrian unless that object was near a crosswalk.

Jaywalking is extremely common. I've seen pedestrians jaywalk across a 45mph
six-lane road with no median to speak of. Anyone with driving experience knows
that pedestrians can appear even where there is no crosswalk, so it boggles my
mind that any engineer would sign off on such a decision.

------
yumraj
We had a thread a few days back with optimistic people predicting self driving
cars in 2020.

I would urge those people to read this and consider the numerous other cases,
edge or not, that these systems, which seem to work great on the surface, may
not be handling.

------
Animats
Actual NTSB report: [1]

[1]
[https://dms.ntsb.gov/public/62500-62999/62978/629713.pdf](https://dms.ntsb.gov/public/62500-62999/62978/629713.pdf)

------
jon889
What does it matter what the "object" was? If anything is moving to intercept,
the car should stop.

~~~
pbhjpbhj
Well, surely not: if it's a bird and an emergency stop would cause an
accident, then you should continue. Dogmatic responses are exactly the problem
with self-driving vehicles.

Sometimes it's preferable to hit something in your own vehicle. Cue the
classic moral question of which person would you kill if you can only avoid
one - with your vehicle - by hitting another.

~~~
Symmetry
Generally we should probably have vehicles follow proscriptions of the law
when they can't avoid hitting anything and change the law if we want them to
act differently. I've never seen a hypothetical self-driving car trolley
problem where there wasn't a single option that was clearly what the law
required.

~~~
ghaff
By and large, the trolley problem concept is overblown. Most of the time the
right/best answer is going to be to stand on the brakes and hope for the best.

But I'm honestly not sure what the law "requires" if you've got a scenario
where there are going to be bad outcomes no matter what you do.

~~~
Symmetry
For instance if a car has the choice of hitting someone in the road or
swerving onto the sidewalk and hitting someone there then clearly the legal
thing to do is for the car to stay in its right of way and hit the person in
the road.

~~~
ghaff
That's a pretty clear case of taking a deliberate action to leave the road
surface. But you can at least imagine scenarios where everyone is within the
bounds of the road--say 5 people directly ahead and 1 off to the side.

As I say though, if you can't swerve to avoid people, the most reasonable
action that most people would take--to the degree they had time to make a
conscious decision at all--would be to brake as hard as they could and let
things play out as they will.

------
KKKKkkkk1
Uber's researchers give very confident presentations about the very advanced
ideas that are ostensibly implemented in their cars. How does this square with
the apparently very primitive system described in the NTSB report?

Example:

Jeff Schneider: Self Driving Cars and AI
[https://youtu.be/jTio_MPQRYc](https://youtu.be/jTio_MPQRYc)

Or:

[https://eng.uber.com/research/predicting-motion-of-
vulnerabl...](https://eng.uber.com/research/predicting-motion-of-vulnerable-
road-users-using-high-definition-maps-and-efficient-convnets/)

Predicting Motion of Vulnerable Road Users Using High-Definition Maps and
Efficient ConvNets

 _Following detection and tracking of traffic actors, prediction of their
future motion is the next critical component of a self-driving vehicle (SDV),
allowing the SDV to move safely and efficiently in its environment. This is
particularly important when it comes to vulnerable road users (VRUs), such as
pedestrians and bicyclists. We present a deep learning method for predicting
VRU movement where we rasterize high-definition maps and actor’s surroundings
into bird’s-eye view image used as input to convolutional networks. In
addition, we propose a fast architecture suitable for real-time inference, and
present an ablation study of rasterization choices._

------
manbearpiggy
This article is just terrible, it goes into the code/technical problems with
the car. They killed a woman and no one went to jail for manslaughter.

------
gourou
People jaywalk often; I feel like this issue should have appeared before.
Could it be that Uber was using some of Otto's technology (self-driving
trucks), then abruptly replaced it (because of the lawsuit with Google), and
that this caused the seemingly avoidable crash?

Context: Uber acquired Otto, a company founded by Anthony Levandowski, who
previously led Google's self-driving car project. It quickly got involved in a
lawsuit in which Google alleged that Levandowski stole Waymo's self-driving
intellectual property. Uber later agreed not to use any Waymo IP and to give
0.34% of its equity to Google.

[https://www.buzzfeednews.com/article/priya/waymo-asks-
judge-...](https://www.buzzfeednews.com/article/priya/waymo-asks-judge-to-
halt-ubers-self-driving)

[https://jalopnik.com/googles-waymo-and-uber-reach-
settlement...](https://jalopnik.com/googles-waymo-and-uber-reach-settlement-
in-high-profile-1822868808)

------
mark-r
My biggest fear is that accidents of this type will result in the engineers
running back to their ML to add another case to the training data. We'll end
up playing a game of whack-a-mole (literally?) as new special cases come up.
Will training the system to recognize a pedestrian pushing a bicycle enable it
to recognize someone riding a mountain bike?

Consider why the auto-braking system wasn't enabled. It's because the system
can't identify which elements in the environment are harmless, causing too
many false positive braking events. The opposite problem is no easier to
solve.

------
csommers
The PMs, engineers, and person sitting at the wheel should all be charged with
manslaughter. Maybe then we'd get actual improvements.

But who am I kidding? We can't even lock up bankers for simple shit.

------
petercooper
I've long wondered how autonomous vehicles could ever work in the UK, and
mostly because there are _lots_ of driving situations I encounter where you
_have_ to force your way out or twist the rules to make any sort of progress.

Now we can add pedestrians to that which in the UK is a tricky topic since
pedestrians have _right of way_ if they're already crossing a side street
you're turning into. Even an astute driver can run into trouble here and has
to be extremely aware of pedestrian movement.

~~~
umanwizard
Even elsewhere in the US, driving is more difficult than it is in Arizona
suburbs. The fact that self-driving cars can't even handle super-easy mode
makes me very pessimistic that they will ever be used in more than a tiny
minority of places.

------
Symmetry
Brad Templeton has also written his analysis of what went wrong in the crash.

[https://www.forbes.com/sites/bradtempleton/2019/11/06/new-
nt...](https://www.forbes.com/sites/bradtempleton/2019/11/06/new-ntsb-reports-
on-uber-fatality-reveal-major-errors-by-uber/#4d1bff581781)

------
sonthonax
Another thing that seems wrong is that the backup drivers are on their own.
Anyone who's done night-time guard duty knows how hard it can be to stay
awake, let alone maintain attention.

It would make so much sense to have a backup driver and a co-backup driver
(and maybe allow neither a smartphone).

------
tekknik
I still fail to see what benefit self-driving cars give us. There are always
going to be bugs, and now we're strapping them to multi-ton vehicles. This
reeks of a bad idea, yet we continue on as if it were the cure for some huge
problem.

------
neilobremski
Holy shit this is scary. I must have been living under a rock but I didn't
realize that FULLY-AUTONOMOUS self-driving cars were even LEGAL yet.

> The self-driving car was fully autonomous at the time of the accident,
> though it had a human driver at the wheel. An internal camera caught the
> Uber worker looking down and away from the road moments before the accident,
> unaware of Herzberg’s presence before it was too late.

Uber worker ... employee? Were they doing a live test?

> ... the team at Uber Advanced Technologies Group has adopted critical
> program improvements to further prioritize safety ...

I certainly hope this means having their terminator vehicles on a closed track
instead of out and about. What the hell does "prioritize" have to do with
"common sense"? If you throw an unhandled exception ... STOP THE CAR

------
spease
Who knows, maybe the next set of victims will be people jaywalking to get to
an Uber, that then get mowed down by another Uber driving at 39mph which
classified them as bicycle / other.

------
RickJWagner
Wow. Somebody _really_ dropped the ball on that one.

It must be stuff so complicated that it's difficult to see the overview. How
else could someone not figure on people outside of crosswalks?

------
lightedman
Relevant: [https://youtu.be/dnioHfg1xbQ](https://youtu.be/dnioHfg1xbQ)

Tesla cars can't even figure out where they are half of the time.

------
nix0n
This is murder.

Uber wants to increase ridership any way they can, and if you're walking then
you're not a customer.

Since they got away with it in beta, the release version will continue to
murder people.

------
basicplus2
it can only be as good as who ever programmed it

Consumers and even regulators seem to think somehow software magically adapt
to anything.

These sort of software environments need standards developed that deal with
identified situations that it must satisfy in standardised testing as a
starting point.

------
skc
Self driving cars probably (and impossibly) need their own roadways like cable
cars or trains.

~~~
javagram
So, self-driving BRT? Probably will only happen when self-driving cars do in
general.

------
darepublic
Pedestrian detection outside of crosswalk is an edge case, it's a nice to
have!

------
Symbiote
Why does Uber use an SUV?

These vehicles are widely known to be much worse for pedestrian injuries and
fatalities. They should use an ordinary car.

(Frankly, I'd like to see all SUV drivers considered legally negligent in any
pedestrian accident due to their choice of vehicle.)

------
vanon
If you think Uber’s car is bad can you imagine how bad Lyft’s is or most any
company that started years behind?

I worked in the SDC industry. It’s mostly a science project.

Between its economics and its car I don’t understand why Lyft isn’t heavily
shorted.

~~~
dmix
> If you think Uber’s car is bad can you imagine how bad Lyft’s is or most any
> company that started years behind?

The amount of time someone spent on a software project isn't the best
indicator of its quality... especially for science work where there's a large
amount of knowledge transfer within the field.

Lyft hired a top Google engineer to run the project, along with plenty of
other experienced people. They weren't starting from scratch the way Waymo had
to, and neither was Uber.

The car simply shouldn't have been tested without a driver (or two) constantly
paying attention. Clearly Uber didn't trust it to be running by itself yet
(especially with the emergency brakes being disabled) but it basically was if
the test driver wasn't paying attention.

------
cellular
I can see jaywalking becoming a thing of the past!

------
sitkack
Did they tell the safety drivers this fact?

------
hexagone
I almost jaywalked while reading this. Thankfully the vehicles are all
manually driven.

------
desc
tl;dr: Uber have not the slightest idea what they're doing, and should not be
permitted to develop anything safety-critical until they can properly explain
'object permanence'.

Also, would probably be a good idea to hold the creators, trainers and
marketers of anything called 'AI' legally responsible (jointly and severally)
for all of such a system's decisions for the next few decades. Might encourage
some much-needed caution.

------
paggle
These things seem almost certain to me:

* Self-driving cars are not going to be here in less than 5 years

* The tech is improving fast enough that it will end up working, and not being vaporware. Within 20 years, self-driving cars will be more than 50% of the new car sales, and eventually it will become the norm.

------
fortran77
Some blame, too, belongs to the human driver. Her job was to look for
exceptional conditions that the car might not pick up, and she failed.

It's quite possible there wasn't time for a good human driver to stop
completely or avoid a pedestrian where one shouldn't be, and this accident was
all but unavoidable, but we'll never know because no attempt was made. Given
the speed of the car, the woman crossing the street either miscalculated, was
betting that traffic would see her and slow down, or didn't see the oncoming
car.

It was also odd that they disabled the car's stock emergency collision
avoidance system.

------
chagai95
this definitely counts as a human error, so stupid.

------
rom1v
For reference, the video recorded by the car:
[https://twitter.com/TempePolice/status/976585098542833664/vi...](https://twitter.com/TempePolice/status/976585098542833664/video/1)

~~~
stefan_
No, this is a video from a shitty dash cam released by Uber (and then happily
compressed to shit and forwarded by police, who don't miss a chance to blame
whoever died in a crash) to deliberately misrepresent the situation. This
garbage pinhole sensor has not one hundredth of the dynamic range of the
actual cameras they use for the autonomous driving and is not at all
representative of how human eyes see, which would have had no problem spotting
her from many many seconds away.

No, this is part of the crime here.

------
reedwolf
To make an omelette, you have to break a few eggs.

~~~
cryptozeus
Would you say that if that was your family in front of that car ? Very
ignorant comment.

~~~
cryptozeus
Yeah, I agree with both of your comments, but is this the way to justify an
error that cost someone their life? It almost makes it sound like "oh well, it
happens". Comments like these show why software like this was rushed into
testing against live humans. What about showing some empathy?

