
Self-driving Uber car kills Arizona woman crossing street - kgwgk
https://www.reuters.com/article/us-autos-selfdriving-uber/self-driving-uber-car-kills-arizona-woman-crossing-street-idUSKBN1GV296
======
icc97
One aspect that comes from this is that now car crashes can be treated more
like aircraft crashes. Each self-driving car now has a black box in it with a
ton of telemetry.

So it's not just "don't drink and drive", knowing that they'll probably
reoffend soon anyway. Every crash and especially fatality can be thoroughly
investigated and should be prevented from ever happening again.

Hopefully there's enough data in the investigation so that Tesla / Waymo and
all other car companies can include the circumstances of the failure in their
tests.

~~~
fudged71
Although it's comforting that this exact situation shouldn't happen again in
an Uber autonomous car... there is no mechanism to share that learning with
the other car companies. There seriously fucking needs to be a consortium for
exactly this purpose: sharing system failures.

Also my problem with this is that a human death is functionally treated as
finding edge cases that are missing a unit test, and progressing the testing
rate of the code... and that really bothers me somehow. We need to avoid
treating deaths as progress in the pursuit of better things

~~~
mindslight
> _We need to avoid treating deaths as progress in the pursuit of better
> things_

Au contraire. Go read building codes some time. There's a saying that they're
"written in blood" - every bit, no matter how obvious or arbitrary seeming,
was earned through some real-world failure.

The death itself isn't progress, of course. But we owe it to the person who
died to learn from what happened.

~~~
bigiain
You seriously think Uber wouldn't try to claim "commercial in confidence" or
"trade secrets" rights to all the data from every single death?

~~~
mindslight
Clearly I think Uber is a benevolent entity that has all of our best interests
at heart. Also, I eat babies.

Or, you know, don't jump on comments for their not explicitly addressing the
hobby horse you're riding. Frankly I just wanted to express a better
engineering context around the loss of life without getting into the political
bullshit for once.

------
w_t_payne
To ensure that all automotive software incorporates lessons learned from such
fatalities, it would be beneficial to develop a common data set of (mostly
synthetic) data replicating accident and 'near miss' scenarios.

As we understand more about the risks associated with autonomous driving, we
should expand and enrich this data set, and to ensure public safety, testing
against it should be part of NHTSA / Euro NCAP testing.

I.e. NHTSA and Euro NCAP should start getting into the business of software
testing.
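To make the idea concrete, here is a hypothetical sketch of what one record in
such a shared scenario suite might look like - the field names and scenario
values are invented for illustration and don't come from any real NHTSA /
Euro NCAP schema:

```python
# Hypothetical accident-scenario record and regression harness.
# All field names and scenario values are invented for illustration.
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    ego_speed_mph: float         # vehicle speed when the scenario starts
    pedestrian_offset_ft: float  # lateral distance from the vehicle's path
    lighting: str                # "day", "dusk", or "night"
    expected: str                # required outcome, e.g. "full_stop"

REGRESSION_SUITE = [
    Scenario("ped_crossing_midblock_night", 38.0, 12.0, "night", "full_stop"),
    Scenario("cyclist_swerve_day", 25.0, 4.0, "day", "full_stop"),
]

def run_suite(drive_fn):
    """Run each scenario through a candidate driving policy and return
    the names of the scenarios it failed."""
    return [s.name for s in REGRESSION_SUITE if drive_fn(s) != s.expected]
```

A regulator could then require an empty failure list from every certified
software release, and grow the suite as new failure modes (like this crash)
are understood.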

~~~
rjvir
Why would Uber agree to such regulations?

They were unwilling to legally obtain a self-driving license in California
because they did not want to report "disengagements" (situations in which a
human driver has to intervene).

Uber would just set their self-driving cars free and eat whatever
fine/punishment comes with it.

~~~
LifeLiverTransp
Very cynical, but if your self-driving tech is way behind your competitors',
wouldn't it help to have your lousy car in an accident, so that your
competitors get hit with over-regulation and you thus kill a market on which
you can't compete?

~~~
manmal
I'm quite sure this would backfire A LOT in terms of brand damage. Uber in a
sense made history today and now has actual blood on its hands. And if such a
strategy should EVER leak (Dieselgate, anyone?), people are going to prison.

~~~
smartician
GM killed 124 people with faulty ignition switches[1], yet the brand still
survived. It's a cost calculation: will the brand damage outweigh the benefit
to the company? Sadly, human lives don't factor into that equation.

[1] [http://money.cnn.com/2015/12/10/news/companies/gm-recall-
ign...](http://money.cnn.com/2015/12/10/news/companies/gm-recall-ignition-
switch-death-toll/index.html)

~~~
manmal
Sadly, that’s a common occurrence with big automakers.

I can't say anything about GM's rep in the US, but here in Europe they are not
doing so well. Chevrolet was killed off in 2015, and Vauxhall/Opel are doing
only ok-ish. Chevy had SO many recalls in the years before they killed it.

~~~
erk__
Opel was bought back into European hands by PSA, the owner of Citroën and
Peugeot, in 2017, so they have a chance to turn it around.

------
IgorPartola
This is what's going to happen. If you've ever seen a machine learning
algorithm in action, this isn't surprising at all. Basically, they'll behave
as expected some well-known percentage of the time. But when they don't, the
result will not be just a slight deviation from normal behavior, but a very
unexpected one.

So we will have overall a much smaller number of deaths caused by self driving
cars, but ones that do happen will be completely unexpected and scary and
shitty. You can't really get away from this without putting these cars on
rails.

Moreover, the human brain won't like processing these freak accidents. People
die in car crashes every damn day. But we have become really accustomed to
rationalizing that: "they were struck by a drunk driver", "they were texting",
"they didn't see the red light", etc. These are "normal" reasons for bad
accidents and we can not only rationalize them, but also rationalize how it
wouldn't happen to us: "I don't drive near colleges where young kids are
likely to drive drunk", "I don't text (much) while I drive", "I pay
attention".

But these algorithms will not fail like that. Each accident will be unique and
weird and scary. I won't be surprised if someone at some point wears a stripy
outfit, and the car thinks they are a part of the road, and tries to
explicitly chase them down until they are under the wheels. Or if the car
suddenly decides that the road continues at a 90 degree angle off a bridge. Or
that the splashes from a puddle in front are actually an oncoming car and it
must swerve into the school kids crossing the perpendicular road. It'll always
be tragic, unpredictable and one-off.

~~~
orbitur
> I won't be surprised if someone at some point wears a stripy outfit, and the
> car thinks they are a part of the road, and tries to explicitly chase them
> down until they are under the wheels. Or if the car suddenly decides that
> the road continues at a 90 degree angle off a bridge. Or that the splashes
> from a puddle in front is actually an oncoming car and it must swerve into
> the school kids crossing the perpendicular road.

Are you working on the next season of Black Mirror?

In all seriousness, my fear (and maybe not fear, maybe it's happy expectation
in light of the nightmare scenarios) is that if a couple of the "weird and
terrifying" accidents happen, the gov't would shut down self-driving car usage
immediately.

~~~
IgorPartola
I am definitely not. Their version of the future is too damn bleak for me.

Your fear is very much grounded in reality. US lawmakers tend to be very
reactionary, except in rare cases like gun laws. So it won't take much to have
restrictions imposed like this. Granted, I believe some regulation is good;
after all the reason today's cars are safer than those built 20 years ago
isn't because the free market decided so, but because of regulation. But self
driving cars are so new and our lawmakers are by and large so ignorant, that I
wouldn't trust them to create good regulation from the get go.

~~~
TallGuyShort
> except in rare cases like gun laws

They're still very reactionary in that, which is precisely why it isn't very
effective when a subset of them do react: there are plenty of smart things
that could get proposed, but the overlap between people who know what they're
talking about and people that want the laws is exceptionally small, so
consequently dumb, ineffective stuff that has no chance of passing anyway gets
proposed. What does get proposed is a knee-jerk reaction to what just
happened, and rarely actually looks systemically at the current laws and gun
violence as a whole. Example: the Las Vegas shooting prompted a lot of talk of
bump stock bans. Bump stocks are so rarely used at all, nevermind in violence,
and they will generally ruin guns that weren't originally made to be fully-
automatic very quickly if they're actually used for sustained automatic fire.
Silly point to focus on suddenly. After the Florida shooting last month so
much focused on why rifles are easier to obtain than handguns. And it's
because overwhelmingly most gun violence involves handguns. Easily concealable rifles
are already heavily regulated at the federal level for that very reason.

~~~
alacombe
> Example: the Las Vegas shooting prompted a lot of talk of bump stock bans.
> Bump stocks are so rarely used at all, nevermind in violence, and they will
> generally ruin guns that weren't originally made to be fully-automatic very
> quickly if they're actually used for sustained automatic fire.

<off-topic> This is nonsense. Typical semi-autos are way overbuilt. Barring
mechanical wear or explicit tampering with the disconnector, there is no risk
whatsoever in firing thousands of rounds with a bump stock. Actually,
plastic/wood furniture is more likely to burn/melt before the mechanical
parts actually fail. At worst, you might bend a gas piston, but the rifle
will otherwise be fine.

The underlying reasoning behind the push against the bump stock ban is that it
was basically a semi-auto ban, as with a bit of training you can trivially
bump fire any semi-auto without a bump stock, from either the shoulder or the
hip, with a mere finger. </off-topic>

~~~
TallGuyShort
>> you might bend a gas piston

Gas tubes on low- to mid-range civilian DI guns can burn out very quickly, and
are in fact designed to do so long before you get damage to the more expensive
parts of the gun - I've seen it happen in most of the cases (which are
admittedly quite few in number despite how often I'm there) where I've seen
someone using a bump stock at a range. In the most recent case I think the guy
was on his 3rd mag and it ruptured. It was a M&P 15 Sport II, if I recall. Not
a cheap no-name brand, but about as low-cost as you can get and missing all
the upgrades in the version they market to cops. High-end ARs would fare
better, I'd expect, but high-end ARs are again so rarely used for actual
violence because they're usually only purchased by people shooting for a
serious hobby in stable life situations. And honestly I feel the same people
buying those probably feel bump stocks are tacky and gaudy like I do.

Even in the most liberal interpretation of the proposed law, I don't think any
bump stock ban would become a semi-auto ban. I could see the vague language
getting applied to after-market triggers, especially ones like Franklin
Armory, but you've gotta have some added device for any of the proposals I've
seen to even remotely apply.

------
raphaelj
To put this accident in perspective, Uber's self-driving cars have totaled
about 2 to 3 million miles, while the fatality rate on US roads is
approximately 1.18 deaths per 100 million miles [1].

[1] [https://www.nhtsa.gov/press-releases/usdot-
releases-2016-fat...](https://www.nhtsa.gov/press-releases/usdot-
releases-2016-fatal-traffic-crash-data)
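As a back-of-envelope check of that comparison (using the rough figures cited
above, so illustrative only):

```python
# Expected fatalities if Uber's fleet matched the human-driver rate.
# 1.18 deaths per 100 million miles is the NHTSA 2016 figure cited;
# 3 million miles is the upper end of the reported Uber total.
human_rate = 1.18 / 100_000_000   # deaths per vehicle mile
uber_miles = 3_000_000

expected = human_rate * uber_miles
print(f"{expected:.3f}")  # prints 0.035: well under one death expected
```

One event is of course far too small a sample to estimate a rate from, in
either direction.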

~~~
ejstronge
The appropriate comparison would be to ask how many pedestrians were struck by
a car and killed.

Considering that a human would likely slow down upon seeing a pedestrian -
even one who appeared suddenly - this is even more disconcerting.

~~~
lev99
It's more specific, but I care about all fatalities, not just car-pedestrian
fatalities.

~~~
LyndsySimon
At some point, autonomous vehicles will be carrying cargo instead of
passengers. When that happens, fatalities per mile driven will no longer be a
valid comparison between manned and unmanned vehicles, as catastrophic
accidents will have fewer people involved.

~~~
lev99
I do think it'll always be fair to consider total lives lost per usage metric,
regardless of whether people were in the vehicle or not. The lives of drivers
& passengers have equal value to the lives of pedestrians.

------
pcthrowaway
The people leading the development should demonstrate that they can stop for
pedestrians by personally jumping out in front of them on a closed test road.
If they're not able to demonstrate this, they shouldn't be putting them on
public roads.

~~~
Johnny555
Self driving cars are still subject to the laws of physics... unless you're
going to dictate that self-driving cars never go above 15mph, I wouldn't
advocate jumping in front of even a "perfect" self-driving car.

Braking distance (without including any decision time) for a 15mph car is 11
ft; for 30mph it's 45 ft. Self-driving cars won't change these limits. (well,
they may be a little better than humans at maximizing braking power through
threshold braking on all 4 wheels, but it won't be dramatically different)

So even with perfect reaction times, it will still be possible for a self-
driving car to hit a human who enters its path unexpectedly.
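The numbers above follow from basic kinematics: an ideal stop converts kinetic
energy into friction work, so d = v²/(2µg). A minimal sketch, assuming a
tire-road friction coefficient of about 0.7 (dry asphalt) and zero reaction
time:

```python
# Ideal braking distance from kinematics: d = v^2 / (2 * mu * g).
# mu = 0.7 is an assumed dry-asphalt friction coefficient.
MU = 0.7
G = 9.81              # gravitational acceleration, m/s^2
MPH_TO_MS = 0.44704   # miles per hour -> meters per second
FT_PER_M = 3.28084

def braking_distance_ft(speed_mph: float) -> float:
    v = speed_mph * MPH_TO_MS
    return v ** 2 / (2 * MU * G) * FT_PER_M

print(round(braking_distance_ft(15)))  # ~11 ft
print(round(braking_distance_ft(30)))  # ~43 ft: quadruple, since d scales with v^2
```

A slightly lower assumed friction coefficient lands the 30mph figure nearer
the 45 ft quoted above; the key point is the quadratic scaling with speed,
which no control software can repeal.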

~~~
bobthepanda
Indeed. This is why many cities are reducing speed limits.

In fact, self-driving cars may actually improve the situation if cars actually
start complying with speed limits en masse.

~~~
dsfyu404ed
>Indeed. This is why many cities are reducing speed limits. In fact, self-
driving cars may actually improve the situation if cars actually start
complying with speed limits en masse.

The vast majority of people just go however fast they feel comfortable
(considering conditions, etc) regardless of the speed limit.

Mixed traffic speeds decrease safety.

Raising speed limits so that you don't have the % of people who comply with
the letter of the law traveling slower than the people who go however fast
they're comfortable usually improves safety.

Unless your goal is to increase ticket revenue or appease the "think of the
children" crowd, there's no point to lowering speed limits. It doesn't do much
to affect traffic speed. To do that you have to modify the road or do
something to change the traffic flow.

Self driving cars will improve safety because they'll result in political
pressure to raise speed limits to match reality and they'll make dynamic speed
limits more practical.

~~~
breischl
>Mixed traffic speeds decrease safety.

Sure, but city streets are already mixed traffic. There are pedestrians,
bikes, vehicles parking or turning, etc. It's not reasonable to raise the
limit to what people want to drive and just ignore all the other users of the
street.

Also, the optical narrowing mentioned in a sibling comment is quite effective.
They've done that on a few streets near me via things like sidewalk bulb-outs
at intersections, and swapping the parking lane & bike lane (so it goes curb-
bike-parking-drive, rather than curb-parking-bike-drive). Everyone drives more
slowly on those streets now - myself included.

------
untog
This opens up an interesting question going forward. We can't rely on Uber
themselves to analyse the telemetry data and come to a conclusion, they're
biased. So really, we need self driving car companies to turn over accident
telemetry data to the police. But the police are ill equipped to process that
data.

We need law enforcement to be able to keep pace with advances in technology
and let's face it, they're not going to have the money to employ data analysts
with the required skills. Do we need a national body for this? Is there any
hope of a Republican government spending the required money to do so? (no)

~~~
dmoy
On the one hand I'd say give it to the NTSB, because historically they are
really good at this sort of thing.

On the other hand, I'd wonder if increasing NTSB scope this much would
drastically decrease the average quality of NTSB's work. Scaling ain't easy.

~~~
jeiting
Yeah, pilot here. NTSB is the right organization to handle this. The
investigators over there do an amazing job of determining root cause from
forensic evidence. I assume that will be the process here.

~~~
kortex
NTSB is there to handle civilian transportation accident investigations. Most
automotive accidents are not very "surprising", which leaves them mostly
dealing with non-personal transportation accidents. We are about to have a
rapid increase in surprising, non-personal transportation accidents, so I
seriously hope they are afforded the resources to deal with the influx as AVs
come online.

------
jeremynixon
Important and missing:

"[Uber] said it had suspended testing of its self-driving cars in Tempe,
Pittsburgh, San Francisco and Toronto"[1]

1: [https://www.nytimes.com/2018/03/19/technology/uber-
driverles...](https://www.nytimes.com/2018/03/19/technology/uber-driverless-
fatality.html)

~~~
basura045478
That's SOP for every AV program: have an incident? Ground the fleet. Doesn't
matter why, or who's at fault.

------
leggomylibro
FTA:

"The Uber vehicle was reportedly driving early Monday when a woman walking
outside of the crosswalk was struck.

...

Tempe Police says the vehicle was in autonomous mode at the time of the crash
and a vehicle operator was also behind the wheel."

That's a very bad look; the whole point of self-driving cars is that they can
react to unexpected circumstances much more quickly than a human operator,
such as when someone walks out into the road. Sounds like Uber's platform may
not be up to that standard yet, which makes me wonder why they're on public
roads.

On the other hand, it sounds like it happened very recently; I guess we'll
have to wait and see what happened.

~~~
jdietrich
_> That's a very bad look; the whole point of self-driving cars is that they
can react to unexpected circumstances much more quickly than a human operator,
such as when someone walks out into the road._

Some of these accidents are unpreventable by the (autonomous) driver. If a
pedestrian suddenly rushes out into the street or a cyclist swerves into your
path, the deciding factor is often simply the coefficient of friction between
your tyres and the road.

The autonomous vehicle and the human attendant might have made a glaring
error, or they might have done everything correctly and still failed to
prevent a fatality. It's far too early to say. It's undoubtedly a dent to the
public image of autonomous vehicles, but hopefully the car's telemetry data
will reveal whether this was a case of error, negligence or unavoidable
tragedy.

~~~
exelius
I’m kind of in the Elon Musk camp here where you gotta break some eggs to make
an omelette? Human-driven cars kill a lot of pedestrians today, but we can
actually do something to improve the human-recognition algorithms in a self-
driving car.

As long as self-driving cars represent an improvement over human drivers, I’m
ok with them having a non-zero accident rate while we work out the kinks.

~~~
snuxoll
The problem is the blame game. A human behind the wheel hits a pedestrian; if
they're found at fault, then that "horrible inattentive/drunk/whatever driver"
goes to jail for the death of another human being. It never makes anything
"right", and I won't even start on how a jail sentence, regardless of length,
can ruin your life in the US, but the public as a whole gets the feel-good
that "justice has been served".

How do we handle this for autonomous vehicles? Do we just fine/sue the company
that made the vehicle/developed the software? Do we send imperfect human
developers to jail because they made a mistake, even if in the grand scheme of
things they have saved lives compared to humans being behind every action made
by a vehicle?

A big part of the public image of autonomous cars is increased safety; any
death at their hands raises the question of where and how to place the blame -
a subject I think very few are prepared for right now, which is likely part of
why Tesla explicitly states autopilot needs a human driver present right now,
and why Google has been extremely cautious with operator-supervised tests up
until recently.

~~~
Dylan16807
People get found not at fault for hitting pedestrians _all the time_.

------
maxyme
If you're familiar with that part of ASU campus you honestly could have seen
this coming. There are a few different self driving cars in Tempe (Waymo,
Uber, GM...) and Uber drives by far the most aggressive. They drive on a
pedestrian heavy road and drive faster than most of traffic. They accelerate
rapidly when a light changes and can brake hard. There are always tons of
pedestrians in the area and it isn't uncommon to almost get run over even
when crossing during the day.

~~~
vamin
I wonder if it's because Uber algorithms are trained using Uber driver data
(taxi drivers tend to be very aggressive drivers).

~~~
ClassyJacket
I don't find that at all, I feel like they're happy to take their time and
make more money.

~~~
spuz
Uber drivers don't get paid by the minute. It's a fixed fare.

~~~
pfarnsworth
No, drivers are paid by time and distance. The rider pays Uber a fixed fare.

------
tabeth
Obviously more information is needed, but I thought the _entire_ point of
having a driver behind the wheel is to manually intervene to prevent this very
situation?

I'm very curious to see how they'll investigate this and who will be
determined to be at fault (person behind the wheel or Uber). It will likely
set a precedent.

~~~
annabellish
Human beings simply cannot switch between "not focussed" and "in charge of a
car, taken out of autonomous mode, and actively avoiding collision" fast
enough to avoid most accidents, unfortunately. Neither can humans maintain the
focus required to be ready to do that when 99.9% of the time they're not
required to do anything.

Semi-autonomous cars have drawbacks.

~~~
sandworm101
Which is why the machine should be backup for the always-driving human, only
leaping in to correct failings.

~~~
nradov
Exactly like what has been done successfully in aviation for decades now. For
example Auto-GCAS. I don't understand why car companies are trying to go
against a proven model.

~~~
joshuamorton
Modern aviation includes fully autonomous modes though. An airplane's autopilot
is simpler than a car's autonomy, but they accomplish essentially the same
goal: get you from point A to point B with no human interaction. AFAIK, even
takeoff and landing can be done mostly autonomously on modern aircraft.

If anything, airplanes prove that machines doing most of the work and humans
stepping in only when necessary is a proven model.

~~~
danso
> _If anything, airplanes prove that machines doing most of the work and
> humans stepping in only when necessary is a proven model._

Airplanes have far less traffic to deal with. And the points at which they
deal with traffic (e.g. takeoffs and landings) are completely controlled by
humans, including many humans outside of the plane.

~~~
joshuamorton
This is incorrect. There are at least automatic landing systems that are
sometimes used.

~~~
danso
So what are the conditions in which auto landing is _not_ used?

~~~
joshuamorton
Apparently pilots generally prefer not to use them. Not because they don't
work, but because pilots still need to be on alert, and so it's easier to just
land manually.

They're used in low visibility conditions with relatively calm weather, but
don't work well in bad weather.

------
curveship
Here we go. We'll now have the first traffic fatality trial where it's not
drivers-trying-drivers but people-trying-a-megacorp.

(Disclaimer: I'm a bike advocate, so I may have a different perspective on
some of this than most.)

Our car-based transportation system is far and away the most dangerous thing
any of us accept doing on a daily basis. 40,000 die a year.

But when cases come to court, everyone on the jury has in the back of their
mind "that could have been me if I lost concentration at the wrong moment, or
made one bad judgement, etc etc."

So penalties are comparatively light for traffic fatalities. Big punishments
are only meted out if the case is so egregious -- repeated drug use,
flagrantly reckless behavior -- that the jury can be convinced that the driver
is different from them.

In other words, drivers don't get punished for doing something dangerous,
because everybody on the road is doing something dangerous. They get punished
for doing something more dangerous than the norm.

In this case, there's no question that the "driver" is different than the jury
-- it's a computer. Now the symmetry that made jurors compare themselves to
the accused is broken.

The result, and what self-driving car advocates don't get, is that self-
driving cars don't just have to be safer than human drivers to be free of
liability; they need to be _safe, period_. In a trial, they don't benefit from
the default "could have been me" defense.

That's a HUGE requirement. In fact, it's probably impossible with our current
road system. It won't just take better self-driving cars, but better roads and
a major cultural change in our attitudes about driving.

As a bike advocate, I welcome this shift, but I also see how deluded many of
the current self-driving projects are. Software moves fast, but asphalt and
mentalities move slow. We're not years away from a self-driving transportation
system, we're decades.

And this trial is just the beginning of that long story.

~~~
Maybestring
>Here we go. We'll now have the first traffic fatality trial where it's not
drivers-trying-drivers but people-trying-a-megacorp.

On even the slimmest cause, plaintiff's counsel always tries to include the
manufacturer in the case. Anyone with deep pockets that they can pull in.

~~~
bad_hairpiece
Moreover, "Wrongful death settlements are often paid out by insurance
providers who provide liability coverage for the person or entity for whom the
death is being blamed. Insurance policies typically have a policy limit
amount, above which the insurance company will not pay and the person is
individually liable" from
[https://www.askadamskutner.com/wrongful-death/calculating-
wrongful-death-settlements/](https://www.askadamskutner.com/wrongful-
death/calculating-wrongful-death-settlements/)

Although, as far as I know, were the jury to become aware of this, a mistrial
should be declared.

------
avoutthere
It's a sad reality, but regardless of how many times per day a pedestrian is
hit by a human-driven car, such incidents involving self-driving cars will be
headline news for years to come.

~~~
ambivalents
Having a hard time sympathizing here. It's one thing to fear air travel based
on the very few but very publicized plane incidents (considering all the data
we have on the safety of air travel). It's another thing to hold these self-
driving car companies accountable, considering a) the lack of data and history
of such programs, and b) their touted benefit as a safer alternative to human-
driven cars.

~~~
FeepingCreature
It's not about sympathy. It's about one question: do they kill fewer or more
people than humans per mile?

If they kill fewer people, they can be run by a joint venture of Satan and the
Mafia for all I care.

~~~
bonzini
If the miles are not enough, the measured rate stays lower right up until the
first fatality, at which point it jumps far above the baseline. Events per
_something_ is not always a statistically valid way to measure things.
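The parent's point can be made concrete with the rough mileage cited elsewhere
in the thread (assumed numbers, purely for illustration):

```python
# With ~3 million AV miles, the observed fatality rate flips from zero
# to well above the human baseline the moment the first death occurs;
# small samples make a per-mile rate estimate very unstable.
human_rate = 1.18 / 100_000_000   # deaths per mile (NHTSA 2016 figure)
av_miles = 3_000_000

rate_before = 0 / av_miles        # zero deaths observed: looks perfectly safe
rate_after = 1 / av_miles         # a single fatality later
print(round(rate_after / human_rate))  # ~28x the human rate
```

Whether the jump reads as "28x" or "orders of magnitude" depends entirely on
the mileage denominator, which is exactly why the raw ratio is not yet
meaningful.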

~~~
Dayshine
Well, the miles are enough...

~~~
shkkmo
Only if you aggregate them across companies and ignore the fact that the
distribution of those miles is not comparable to the distribution of human-
driven miles.

------
austinhutch
There may be no company I trust less to be open and honest about this than
Uber. The arms race between companies will likely keep the most important
lessons from this event proprietary. I'm very much not looking forward to
hearing execs answer for this as I'm sure it will have the same amount of
humanity as the rest of their comms.

~~~
WhompingWindows
This is my thought as well. Compared with Waymo, Cruise/GM, Tesla, Lyft, and
the many other start-ups in this space, I think Uber is the least ethically
scrupulous. I also doubt their self driving tech is as advanced as many of the
other players, and I wonder if those other companies would have been able to
handle this situation.

------
jarjoura
It's fascinating that so many brilliant people in this thread can talk so
casually about a loss of life caused by the very tech that people like them
build. I would be devastated if I learned my code had ended even one person's
life.

You're in a bubble if you think a world hyper-connected to social media, and
at the same time scared that autonomy is about to kill millions of jobs,
would NOT over-react to a single accident.

From the article it's impossible to know if the cyclist was at fault, so I
won't jump to conclusions. But we can't talk so dissociatively about human
life, especially as the creators of the future of autonomous vehicles. Either
we go out onto the public roads with very high certainty that OUR software and
OUR hardware won't kill someone, or we wait until we get there.

Public opinion is doing just fine with human drivers right now, so we can and
should take our time to get to the future we want, or risk a backlash and a
guilty conscience.

~~~
larkeith
> Public opinion on human driving is doing just fine with human drivers right
> now

There were over 40,000 motor vehicle deaths in 2016 in the US [1] - more than
a hundred daily. While loss of human life is always a tragedy, individual
fatalities are absolutely worth it if we can begin to reduce that count sooner
than by proceeding with excessive caution.

It would be morally unacceptable to delay development merely to avoid a guilty
conscience.

[1] [http://www.nsc.org/NewsDocuments/2017/Fatality-estimates-
Jun...](http://www.nsc.org/NewsDocuments/2017/Fatality-estimates-June17.pdf)

~~~
jonathanyc
And it’d be morally unacceptable to allow public deployment of software that
has not been sufficiently tested. As other comments have pointed out, self-
driving cars have undergone ridiculously little testing. In fact, based on
only the objective statistics it is very unlikely that they are anywhere near
as good at driving as humans.

There is no reason self-driving cars can’t be tested in private. Companies can
hire pedestrians to interact with the cars, and the software can go through
the same certification process that buildings and vehicles currently go
through.

It’s a false dichotomy to say that you can either have self driving cars or
minimally safe and accountable development, but not both.

EDIT: Here is a link to a thread replying to a parent which is now auto-
collapsed.
[https://news.ycombinator.com/item?id=16620968](https://news.ycombinator.com/item?id=16620968)

~~~
stormbrew
I'm just going to point out that nearly every time someone gets their learners
permit, or graduated from a learners permit to a full license, an
insufficiently tested driver is allowed on the road in order to develop more
skills and become a better driver. Often they kill people in the process of
learning. Should we require years of private training for every human driver
as well?

They can't even share what they learn with each other effectively.

~~~
DougN7
“Often they kill people”??? That’s not even intellectually honest. This is the
flippant attitude the top poster is responding to.

~~~
nightski
Our city (which is pretty small tbh, 150k) had 424 DUI arrests just this past
weekend (Fri-Sun) due to St. Patrick's day. This is despite availability of
uber, lyft, taxis, public transit, and bar services which will give you a ride
home and allow you to park your car until morning.

Ignoring these facts is just as intellectually dishonest.

~~~
romwell
These cases are not what the parent comment was referring to.

The PP was insinuating that people with a learner's permit (and, by law, an
experienced driver in the passenger seat) or people who just got a license
(and hence were, literally, tested) are "insufficiently tested".

The PP was responding tho the claim that letting "insufficiently tested"
systems on the road with the goal of letting them improve is irresponsible.

For the response to have any merit, you need to cite accident statistics for
people with learner's permits, or new drivers.

~~~
stormbrew
To be clear, the testing for getting a learner's permit (where I live at
least) is 7 out of 10 questions on a multiple choice test, and the test for a
driver's license is a 20 minute drive-about where the driver gets to more or
less choose the area they'll drive and the weather when the test is done.

I don't think it's even a very tough argument that these are, at best, limited
filters on actual driver skill. I've known people who literally went
to another city for favourable conditions for their driver test. I've known of
people who passed having driven not much more than a few hours in their lives.

Nowadays if you want to be the driver in the passenger seat for a learner you
need to do a somewhat more difficult test and be older.

Also, I'm really not talking about this specific case but I think it's
particularly relevant that in this case there was a qualified driver able to
take over for the autonomous vehicle, which is actually _more_ supervision
than a 14 year old with a learner's permit has.

------
tim333
>The self-driving Volvo SUV was outfitted with at least two video cameras, one
facing forward toward the street, the other focused inside the car on the
driver, Moir said in an interview.

>From viewing the videos, “it’s very clear it would have been difficult to
avoid this collision in any kind of mode (autonomous or human-driven) based on
how she came from the shadows right into the roadway,” Moir said. (
[https://www.sfchronicle.com/business/article/Exclusive-
Tempe...](https://www.sfchronicle.com/business/article/Exclusive-Tempe-police-
chief-says-early-probe-12765481.php) )

Not saying the software wasn't partly at fault but it doesn't look that clear
cut. It sounds like better sensors could have helped.

~~~
scottmcdot
Shouldn't '[Coming] from the shadows' be irrelevant when the sensors are
LIDAR?

~~~
ACow_Adonis
One also needs to consider the relative dynamic range of the camera/video in
terms of both sensors and playback.

If the hardware is not capable of capturing sufficient dynamic range to be
able to see into the shadows while driving in daytime/nighttime in this
location, one must ask why not? That seems obviously negligent.

If the car can see into the shadows (say either with lidar or sufficiently
sensitive hardware) then a self driving car either just failed to identify and
predict the object and subsequently killed a pedestrian, or it detected them
and killed them anyway.

If it killed them anyway, it may be that it was physically impossible for them
to avoid the detected object, or forensics (if properly performed) would show
that the collision was avoidable.

All of these scenarios seem pretty inconsistent with the hypothesis of the
police chief knowing what the duck he's talking about, and I question what the
hell he's doing releasing such a statement before the evidence is in...

------
keypusher
> Elaine Herzberg, 49, was walking her bicycle outside the crosswalk on a
> four-lane road in the Phoenix suburb of Tempe about 10 p.m.

> Tempe Police Chief Sylvia Moir said that from viewing videos taken from the
> vehicle “it’s very clear it would have been difficult to avoid this
> collision in any kind of mode (autonomous or human-driven) based on how she
> came from the shadows right into the roadway."

This seems like an unfortunate accident, but it's not at all clear that the
car was at fault or that it could have done anything to prevent it. There was
even a human behind the wheel and they didn't react in time either.

~~~
gkya
I can't describe how shocking it is watching so many HN commenters blame a
person killed by a machine and defend the thing instead. There are even those
who are saying that such deaths are necessary and useful for the advancement
of technology.

------
kaibones
I see the Waymo cars while I walk to lunch (AZ) quite often. One time while
walking through the crosswalk a Waymo turning right cut me off as I started
into the crosswalk despite my having the walk indicator. This seems to be a
very difficult scenario for a self-driving car as there are often people at
the corner that are either crossing the other direction or just waiting on the
corner. I sent them a note via their feedback link, but never heard back.

~~~
stevenwoo
This is a non trivial situation for human drivers, too, when I am on a bicycle
or driving a car I have to wait and look at the pedestrian until I am sure of
their intention. That seems like a very difficult problem in that a lot of
times people are on their phone or watching a child or dog, I often have to
slow to a standstill and speak to people to ask them what they are planning on
doing.

~~~
mikelward
Indeed. Right on red is unsafe for pedestrians. I guess at least the
requirement to stop before turning helps ensure the speed of impact has
minimal consequences.

As a driver, there are certain heuristics, e.g. did my traffic light just go
red, or is the pedestrian signal/traffic light in the same direction about to
go green.

~~~
linkregister
I think the situation here was right on green, which I argue is more dangerous
for pedestrians.

The answer is to do what denser European cities do, which is to have
pedestrian crossing signals have their own turn while all vehicle lights are
red.

------
cbhl
The last time I visited Phoenix, I recall being shocked at how far apart the
traffic lights were in the city. Between the Wal-marts and the giant four-to-six
bedroom homes, it might be normal to see a major street have a mile between
traffic lights (and presumably, crosswalks). It was very different from, say,
driving in New York, San Francisco or Seattle.

I'd be curious whether "walking outside of the crosswalk" turns out to be
material in the analysis by NHTSA/NTSB. Were the crosswalks big enough and
close enough that a pedestrian would have normally walked inside one? Was the
interchange a regular 90-degree crossing of two roads, or was it something
more irregularly shaped? What is the annual rate at which pedestrians get hit
at this intersection by human drivers? Do self-driving cars need more examples
of "pedestrians walking outside the crosswalk" in their training sets?

~~~
robterrell
I hope they aren't expecting people to only appear at intersections:

"The California Vehicle Code says you can actually cross any street as long as
you aren't a hazard to vehicles. It is also legal to cross mid-block when
you're not between two intersections with signals." [1]

The article points out that LA has a different rule. Uber's going to need to
geofence locations and use different rulesets for different cities.

[1] [https://www.scpr.org/news/2015/04/14/50992/how-do-you-
cross-...](https://www.scpr.org/news/2015/04/14/50992/how-do-you-cross-a-
street-in-socal-without-getting/)

~~~
manmal
Sure, but it’s illegal in most countries (AFAIK also in the US) to not use a
crosswalk when you are in the vicinity of one. Eg in Austria you will be fined
(or worse, drivers license suspended) if you don’t use a crosswalk that is
within 50m of you.

~~~
gleenn
So you should be killed by an autonomous vehicle because you jaywalked, under
any circumstances? I hope you realize you are comparing being fined for a tiny
crime with getting killed.

------
jonthepirate
Former Lyft eng here. From my vantage point, as an industry, we're nowhere
near where we should be on the safety side. The tech companies are developing
driving tech privately instead of openly. Why can a private for profit company
"test" their systems on the public roads? The public is at serious risk of
getting run over by hacked and buggy guidance / decision systems. Even when a
human operator has his hands hovering 1 inch off the steering wheel and his
foot on the brake, if the car decides to gas it and swerve into a person, it
is probably too late for the human crash test driver to take over. This is
going to keep happening. The counterargument FOR this is that it is overall a
good idea for the transportation system if the number of crashes & deaths is
statistically less than human operated cars. I see this as the collision of
what's possible with what's feasible and that we are years away from any of
this being close to a good idea. :( Very sad for the family and friends.

~~~
amelius
Perhaps companies need to test their safety devices first. I.e., first prove
that their LiDAR correctly identifies pedestrians, cyclists, etc. From there,
build test-vehicles with redundancy, e.g. with multiple LiDAR devices. Then
prove that the vehicles actually stop in case of emergency. And _only then_
actually hit the road.

Of course, the US department of transportation should have set up proper
certification for all of this. They could have easily done so because they can
arbitrarily choose the certification costs.

~~~
scdc
if we wait for the gov to set up a certification for this, we'll delay the
whole industry 10 years.

~~~
chimeracoder
> if we wait for the gov to set up a certification for this, we'll delay the
> whole industry 10 years.

That's not a particularly convincing argument, given that (so far) Uber's
self-driving cars have a fatality rate roughly 40 times the baseline, per mile
driven[0].

Having to wait an extra ten years to make sure that everything is done
properly doesn't sound like the worst price to pay.

[0] Nationwide, we have 1.25 deaths per 100 million miles driven. Uber's only
driven about 2 million miles so far:
[https://www.forbes.com/sites/bizcarson/2017/12/22/ubers-
self...](https://www.forbes.com/sites/bizcarson/2017/12/22/ubers-self-driving-
cars-2-million-miles)

~~~
treis
In those 10 years ~350,000 people will die in car accidents in the US alone.

Let's say that halving the death rate is what we can reasonably expect from
the first generation of self driving cars. Every year we delay that is 15,000
people dead. This woman dying is a personal tragedy for her and those that
knew her. However, as society we should be willing to accept thousands of
deaths like hers if it gets us closer to safer self driving cars.

~~~
grumdan
> Let's say that halving the death rate is what we can reasonably expect from
> the first generation of self driving cars.

What's your evidence for why this is a reasonable expectation? The fatalities
compared to the amount of miles driven by autonomous vehicles so far shows
that this is not possible at the moment. What evidence is there that this will
radically improve soon?

------
danso
The Washington Post has an updated story with the victim's name:

[https://www.washingtonpost.com/news/dr-
gridlock/wp/2018/03/1...](https://www.washingtonpost.com/news/dr-
gridlock/wp/2018/03/19/uber-halts-autonomous-vehicle-testing-after-a-
pedestrian-is-struck/)

> _Police said the vehicle was northbound on Curry Road when a woman,
> identified as 49-year-old Elaine Herzberg, crossing from the west side of
> street, was struck. She died at a hospital, the department said._

No mention of her being a bicyclist. My gut instinct is to accept that the
victim was indeed a pedestrian -- with the assumption that they'd have that
detail cleared up by the time they have her identity ready to release to the
public.

~~~
abalone
I'd think so too, but there is literally a crumpled bicycle in the photo of
the scene. Also the Uber has a dent on the right side, which faces east.

It's possible she was walking her bicycle across the street, but that still
looks pretty bad for Uber. That street is a straight shot with clear
visibility.

~~~
phyzome
And if she was walking a bicycle, it was _extremely_ unlikely that she "darted
out". It takes some skill to do that.

(Still need more info, though.)

------
rboyd
FTA "Elaine Herzberg, 49, was walking her bicycle outside the crosswalk".

The article says "as soon as she walked into the lane of traffic she was
struck".

The police sergeant also "said he believed Herzberg may have been homeless".

These three things may together indicate that even a human driver might have
struck and killed the lady if she entered into the roadway unsafely. I'm not
sure how everyone (especially on Twitter) can be jumping all over Uber yet
until the investigation is complete.

Sad of course, either way, but not yet enough evidence to assume that the
algorithm+driver combo was any less safe than a human driver alone.

~~~
JimboOmega
Finding any tidbits on what actually did happen has been very difficult. I get
that the investigation is of course ongoing, but that's what I want to know -
did the tech actually make a mistake here?

And a mistake in a broader sense - was it an avoidable accident? If it wasn't
this barely feels like news, other than as a milestone.

Also... I don't see how her being homeless has anything to do with it.

~~~
DoreenMichele
_Also... I don 't see how her being homeless has anything to do with it._

I can think of a number of ways it may be relevant. In addition to what I said
in this discussion, I left a comment about that here:

[https://news.ycombinator.com/item?id=16625242](https://news.ycombinator.com/item?id=16625242)

------
iooi
The second ever [1] fatal car crash involving a self-driving car. This is the
first one where a third party is killed.

[1] [https://www.theguardian.com/technology/2016/jul/01/tesla-
dri...](https://www.theguardian.com/technology/2016/jul/01/tesla-driver-
killed-autopilot-self-driving-car-harry-potter)

~~~
felippee
The Tesla was not a self-driving car; please stop conflating the two.

This case is in fact the first fatality caused by an autonomous test vehicle.

There have been at least several Tesla autopilot-related fatalities and
injuries, but I would seriously not put those into the self-driving bag.

~~~
iooi
Can you provide a source? From all the articles I can find, the language is
all pretty much the same:

"Tesla driver killed while using autopilot"

"The Tesla driver killed in the first known fatal crash involving a self-
driving car"

"It's been nearly a year and a half since Joshua Brown became the first person
to die in a car driving itself."

"Tesla bears some blame for self-driving crash death, feds say"

~~~
dahdum
Tesla doesn’t have self driving capability, their autopilot is just driver
assistance (like adaptive cruise control).

~~~
falcolas
Driver assistance that takes care of everything (i.e. steering, braking,
navigating, parking), that at the time the accident occurred didn't even
require hands on the steering wheel?

Seems like an attempt to shift the blame off the autopilot system and onto the
driver.

------
redm
It's terrible when anyone is killed in an automobile accident. The irony is in
the promise of autonomous vehicles to prevent this very thing.

I've heard the general report, and I find it interesting that a supervisor was
at the wheel, and they, as well as the car, failed to prevent the pedestrian
from getting hit. I read they were not in a crosswalk and I'm sure the
specific details of the accident matter greatly. It makes me wonder if a human
driver with no autonomous system would have fared any better, since someone
paid to be alert and a car designed to be alert both failed to prevent the
accident.

I'll reserve final judgment until more details are out.

~~~
mr_toad
It doesn’t matter how good the driver is, AI or human, if you unexpectedly
walk right out in front of a car moving at 40mph it’s not going to be able to
stop.

~~~
princeb
now the question is why is a car (autonomous or not) driving at 40mph on a
road where people can be expected to walk right out in front of a car?

would you drive 40 mph down a city street with trucks and buses parked head to
tail on both sides of the road?

------
Tiktaalik
In the early 20th century the automobile industry mobilized to downplay the
deadly threat automobiles posed to pedestrians, and lobbied to create new laws
to frame pedestrians as being at fault instead of car drivers (eg.
jaywalking).

I expect we'll soon see tech companies do the same in order to favour their
autonomous automobile businesses at the expense of pedestrians and cyclists.

~~~
burlesona
This. There are many posts on this thread suggesting we could fix all this by
lowering speed limits, etc. That's been known for a hundred years at this
point.

If you want to get up on the history you can grab a copy of "Fighting
Traffic," by Peter Norton. He goes through all the efforts cities made in the
1920's to limit cars, and how the motor lobby formed in opposition to this and
little by little turned the tables to create the situation we find ourselves
in today.

[https://mitpress.mit.edu/books/fighting-
traffic](https://mitpress.mit.edu/books/fighting-traffic)

~~~
mlinksva
Or shorter version
[https://www.researchgate.net/publication/236825193_Street_Ri...](https://www.researchgate.net/publication/236825193_Street_Rivals_Jaywalking_and_the_Invention_of_the_Motor_Age_Street)
or podcast version [https://99percentinvisible.org/episode/episode-76-the-
modern...](https://99percentinvisible.org/episode/episode-76-the-modern-
moloch/)

Indeed it has been known for a long time. This is an opportunity to reverse
bad decisions made a century ago, regardless of how quickly self-driving cars
actually are adopted.

[https://www.strongtowns.org/slowthecars/](https://www.strongtowns.org/slowthecars/)

------
Jagat
Can we please not overreact and send the whole self-driving research down the
drain?

What does the statistics say? How many miles have the self-driving cars
driven, and how many deaths were they responsible for? How does it compare to
a human driver?

When a million humans drive a car for a mile, and 1 of them results in a
death, it's easy to pinpoint the blame on a "random drunk/distracted driver".
How 'bout thinking of the self-driving software as a combination of all human
drivers, just with much smaller odds of being drunk, sun-blinded, or
distracted than the average human driver.

~~~
objclxt
> What does the statistics say? How many miles have the self-driving cars
> driven, and how many deaths were they responsible for? How does it compare
> to a human driver?

As of December 2017 Uber had driven 2 million autonomous miles[1]. Let's be
generous and double that, so 4 million.

The NHTSA reports a fatality rate (including pedestrians, cyclists, and
drivers) of 1.25 deaths per _100 million_ miles[2], twenty five times the
distance Uber has driven.

You probably shouldn't extrapolate or infer anything between those two
statistics, they're pretty meaningless because we don't have nearly enough
data on self driving cars. But since you asked the question, that's the
benchmark: 1.25 deaths per 100 million miles.

[1]: [https://www.forbes.com/sites/bizcarson/2017/12/22/ubers-
self...](https://www.forbes.com/sites/bizcarson/2017/12/22/ubers-self-driving-
cars-2-million-miles/#69d4e968a4fe) [2]:
[https://en.wikipedia.org/wiki/Transportation_safety_in_the_U...](https://en.wikipedia.org/wiki/Transportation_safety_in_the_United_States)
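For concreteness, the normalization described above can be sketched in a few
lines. The inputs (one death over roughly 2-4 million autonomous miles) are
this thread's rough estimates, not official figures, and carry all the same
small-sample caveats:

```python
# Back-of-the-envelope normalization of the fatality figures quoted above.
# Mileage estimates (2M-4M autonomous miles, 1 death) come from this thread,
# not from any official dataset, so treat the output as illustrative only.

def deaths_per_100m_miles(deaths: int, miles_driven: float) -> float:
    """Normalize a raw death count to the NHTSA-style per-100M-mile rate."""
    return deaths / miles_driven * 100_000_000

BASELINE = 1.25  # national average: deaths per 100M vehicle miles

# One fatality over Uber's estimated autonomous mileage.
generous = deaths_per_100m_miles(1, 4_000_000)      # 25.0
conservative = deaths_per_100m_miles(1, 2_000_000)  # 50.0

print(f"Implied rate: {generous:.1f}-{conservative:.1f} deaths per 100M miles")
print(f"Relative to baseline: {generous / BASELINE:.0f}x-{conservative / BASELINE:.0f}x")
```

Running this prints an implied rate of 25-50 deaths per 100M miles, i.e. 20x-40x
the baseline, which is exactly the kind of number the comment warns should not
be extrapolated from so few miles.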

~~~
rjvir
Scaling those numbers paints a poor picture for Uber. Assuming 3 million total
miles autonomously driven thus far from Uber's program:

\- Uber autonomous: 33.3 deaths per 100 million miles

\- Waymo: 0 deaths per 100 million miles

\- National average: 1.25 deaths per 100 million miles

Of course, the Uber and Waymo numbers are from a small sample size.

But there's also the bayesian prior that Uber has been grossly negligent and
reckless in other aspects of their business, in addition to reports that their
self-driving cars have had tons of other blatant issues like running red
lights.

It seems reasonably possible that an Uber self-driving car is about as safe as
a drunk driver. DUIs send people to jail - what's the punishment for Uber?

~~~
prepend
Scaling those numbers is not useful and in fact reduces usefulness.

Comically, that’s why OP said not to do that.

Comparing dissimilar things is actually worse than not comparing at all since
it will increase the likelihood of some decision resulting from the false
comparison.

~~~
rjvir
The goal is to use the best set of information available to us. I merely cited
the normalized numbers because it's been asked various times in this thread -
questions along the lines of "how does this rate compare with human drivers?"

The purpose of the extrapolation was to get a (flawed) approximation to that
answer. By itself, it doesn't say much, but all we can do is parse the data
points available to us:

\- Uber's death rate after approximately 3 million self-driven miles is
significantly higher than the national average, and probably comparable to
drunk drivers.

\- Public reporting around the Uber's self-driving program suggests a myriad
of egregious issues - such as running red lights.

\- The company has not obeyed self-driving regulations in the past, in part
because they were unwilling to report "disengagements" to the public record.

\- The company has a history of an outlier level of negligence and
recklessness in other areas - for example, sexual harassment.

~~~
prepend
But this is precisely why you shouldn't simply extrapolate. Of course people
ask, and of course an answer would be useful. But extrapolating one figure
from 3M miles to a typical measure (per 100M) is not useful because it
provides no actionable information.

Providing this likely wrong number anchors a value in people’s minds.

It’s actually worse than saying “we don’t know the rate compared to human
drivers because there’s not enough miles driven.”

Your other points are valid but don’t excuse poor data methods hygiene.

Even now you are making a claim that is baseless on its face, because you
don't know the human fatality rate per 3M miles well enough to say Uber's is
"significantly higher." Although I think it's possible to find enough data in
the human driver record to match similar samples to Uber, simply dividing by
33 is not sufficient to support your statement.

I haven’t seen data on the public reporting. That seems interesting and would
appreciate it if you can link to it.

~~~
rjvir
> the self-driving car was, in fact, driving itself when it barreled through
> the red light, according to two Uber employees, who spoke on the condition
> of anonymity because they signed nondisclosure agreements with the company,
> and internal Uber documents viewed by The New York Times. All told, the
> mapping programs used by Uber’s cars failed to recognize six traffic lights
> in the San Francisco area. “In this case, the car went through a red light,”
> the documents said.

[https://www.nytimes.com/2017/02/24/technology/anthony-
levand...](https://www.nytimes.com/2017/02/24/technology/anthony-levandowski-
waymo-uber-google-lawsuit.html)

------
dilanj
According to the latest info, it doesn't look like the system was at fault at
all. Seems like the likely-homeless woman abruptly stepped into traffic. Every
accident is of course still terrible, but some maybe are unavoidable.
[https://www.sfchronicle.com/business/article/Exclusive-
Tempe...](https://www.sfchronicle.com/business/article/Exclusive-Tempe-police-
chief-says-early-probe-12765481.php)

------
DangerousPie
I know this is a slightly insensitive question to ask right now, but assuming
that what happened was that the car tried to turn right and the cyclist came
from behind and crossed into its path[0]: Who would actually have been in the
wrong?

I cycle along paths like this from time to time myself, and I always assume
that I should let a turning car pass in front of me. As soon as I see the turn
signal I will either fall back or even pass into the car lane to overtake on
the other side.

Of course that's in part because I know I would lose the fight anyway, but
also because I think I am not actually supposed to be "overtaking" them on the
right side, so they have the right of way before me.

What do the rules say in this case?

[0] This is based on this link posted elsewhere:
[https://www.google.com/maps/@33.4370667,-111.9430321,3a,75y,...](https://www.google.com/maps/@33.4370667,-111.9430321,3a,75y,132.91h,64.25t/data=!3m6!1e1!3m4!1ss_YmQlzA3ace0P2LVk4wwA!2e0!7i13312!8i6656)

~~~
phyzome
Depending on the _state law_ , the right-turning car probably is supposed to
yield to a cyclist in the bike lane going straight. But of course, every
experienced cyclist knows that what is right is not what is safe; you should
aim to travel just behind the car's rear bumper to give yourself time to stop
if they swerve right.

This is the insanity of (most) bike lanes, that you have a lane to the right
of a right-turn lane that can go straight. And that's why I don't use most
bike lanes. They're trouble.

------
smallgovt
Not to dismiss the tragedy of this incident, but it should be expected that
self-driving cars kill pedestrians -- just at a rate lower than what's
expected from human drivers.

Perhaps there's a better metric to look at, but I'd like to see number of
deaths caused per miles driven.

If Uber's self-driving cars are killing more pedestrians than a human driver
would, we have a huge problem, but I'd be willing to bet they're at least an
order of magnitude safer in this respect.

~~~
ApolloFortyNine
It'll likely never happen, but we truly need an "acceptable death" metric to
be passed into law to protect the companies (and apparently people too, since
some in these comments want the developers held responsible) from deaths. In
my opinion, a law should be passed that allows, per mile driven, 5-10% of the
current deaths per mile, and that threshold should begin decreasing after 10
or so years to something low (maybe 1%, maybe .01%, we'd have to wait and
see).

People will always make mistakes, the benefit with self driving cars however
is that mistake should only be made once. Whatever bug caused it to be
considered at fault can be patched and will never occur in that exact
circumstance again. Meanwhile with humans you could have that same case occur
over and over again.

With how much money people can win in lawsuits nowadays, all it would take is
a handful of cases to totally destroy a manufacturer of self driving cars.

~~~
Erlangolem
That sounds like the market correcting itself, while you want to protect the
likes of Uber from market forces, with new laws. Putting that aside though,
good luck trying to get re-elected after passing a law like that. Few people
are technocratic and emotionally detached enough to agree with you, from the
looks of it, not even on this site.

~~~
ApolloFortyNine
How on earth is a lawsuit a market force?

Did you even read the first 5 words?

>It'll likely never get passed

Can you stop trolling my posts? You aren't adding much of anything.

------
RcouF1uZ4gsC
Self driving cars are one of the things where we absolutely cannot afford to
do "move fast and break things".

If there was any whiff of shortcutting, people need to go to jail and Uber
should be bankrupted by penalties and lawsuits.

~~~
cdolan
Devil's Advocate argument here, but have you read books on our space race in
the 60s? Should NASA have been bankrupted by penalties and lawsuits when
Apollo 1 had its accident?

~~~
azdle
If they were dropping rockets on people standing on street corners. Yes, 100%.
Your example is different because it was deaths of people involved in the
projects who knew there were risks and agreed to them.

------
xingi538
My understanding, from talking to someone who works for the NTSB, is that the
software controlling autonomous vehicles is currently completely unregulated,
and that companies like Uber, Tesla, etc. have refused to provide access to
their software for external review.

Recently I've seen a lot of comments defending autonomous vehicles from a
statistical standpoint, but even if these cars have the potential to be safer
on average, it scares me to think that the software driving a car is less
regulated than its airbags or seatbelts. Especially considering the auto
industry's tendency to ignore safety issues without external pressure [0].

Is my information outdated? Have there been efforts to review or regulate
autonomous driving software in the last year?

[0]
[https://en.wikipedia.org/wiki/Unsafe_at_Any_Speed](https://en.wikipedia.org/wiki/Unsafe_at_Any_Speed)

------
fabian2k
Unless this was a situation that was impossible to anticipate and avoid, this
looks bad. It would mean that both the autonomous driving system and the
safety driver failed.

Not hitting pedestrians should be close to the top in priority, so while
failures of the software are certainly expected during this kind of test, this
is one of the worst kinds of failures that should have received a lot of
attention before ever putting the vehicles on a public road.

The safety driver failing makes me wonder how well qualified they are, or if
there is an issue with staying alert for long times without actually driving
the car.

------
itamarst
The video says bicyclist and shows a bent bicycle. So pedestrian or bicyclist?

~~~
bmm6o
I've heard that police reports often refer to all non-drivers as
"pedestrians", even if they were riding a bike (which I learned after a
neighbor was killed while riding a bike). I don't know if this is legalese or
police jargon, but either way you would hope a journalist who covers these
things could translate.

~~~
xapata
Unfortunately, many police forces don't train the police on how to handle
bicycles. For example, in many places riding on the sidewalk is illegal, yet
police will suggest a cyclist get off the road and ride on the sidewalk.

From studying many bicycle-related police reports, I was dismayed at how often
I read the phrase "a pedestrian riding a bicycle."

~~~
steaknsteak
The article also mentions the pedestrian as "a woman walking", so it's not
clear whether they are referring to a cyclist as a pedestrian, or whether they
are getting different information and actually think it was a pedestrian. Very
confusing

~~~
xapata
The video was released. Turns out it was a woman walking her bike.

Sad event, but it's likely a human driver would have killed her as well. She
was on the road at night without lights, jaywalking without attention to the
on-coming traffic.

One could argue the car was driving faster than its headlight distance safely
allowed. If so, humans are going to be quite frustrated with how slowly
autonomous cars drive at night.

------
WHoWHo
What happens when, in the future, a self-driving car from a ride service is
driving empty back to some parking lot and hits someone? Does it call an
ambulance or the police?

~~~
WHoWHo
... this would make for a creepy film scene - someone shouting for help and
the car only watching.

------
hackpert
While incredibly sad, this incident is not terribly surprising because the
technology is nowhere near safe enough.

Unfortunately, this will inevitably set the field back by at least 10 years
with stricter regulations. We've seen the same with pharma and aviation, where
innovation was slowed drastically and is only realistic for big companies.
This is especially true here because if self-driving vehicles get banned from
public roads, companies like Waymo will be forced to develop bigger private
"test tracks" that approximate cities to have any hope of building deployable
technology, and they will face the same fundamental limitation – the data will
come from an approximation/model and not the actual location with real-life
scenarios.

Although there's no doubt that regulations are crucial to prevent such
mishaps, I do hope lawmakers can find the right equilibrium on the tradeoff
between safety and innovation.

------
xbeta
Statistically speaking, I am curious to know the data on these areas:

\- number of miles driven per fatal accident for self-driving cars

\- number of miles driven per fatal accident for a human driver

It is important to know whether our machine is now doing worse/better than its
human counterpart, and this means a lot for our self-driving initiatives.

~~~
fnbr
The numbers that I have seen indicate that humans have a fatal accident
roughly once per hundred million miles, while we have one fatal accident for a
self-driving car, with somewhere around ten million miles driven across all
self-driving cars.

I've heard Uber is around 3 million self-driving miles.

So Uber would be 30x worse than humans.
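For what it's worth, that back-of-envelope comparison is easy to reproduce;
the figures below are the rough estimates quoted above, not official
statistics:

```python
# Rough fatality-rate comparison using the estimates quoted above.
HUMAN_MILES_PER_FATALITY = 100e6  # ~1 fatal accident per 100M human-driven miles
UBER_SELF_DRIVEN_MILES = 3e6      # reported Uber autonomous miles
UBER_FATALITIES = 1               # the first fatality

uber_miles_per_fatality = UBER_SELF_DRIVEN_MILES / UBER_FATALITIES
ratio = HUMAN_MILES_PER_FATALITY / uber_miles_per_fatality
print(f"Uber fatality rate vs. human baseline: ~{ratio:.0f}x")  # ~33x
```

With these inputs the ratio comes out around 33x, consistent with the "30x
worse" figure, though a single event makes for an extremely noisy estimate.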

~~~
xbeta
Good to know! But I am more curious about data covering all self-driving
vehicles, not just Uber. It could be that Uber is doing worse; my question is
whether we (as an industry) are still headed in the right direction with this
"bold bet".

------
InclinedPlane
This was inevitable. Some developers of self-driving vehicles are immensely
careful, some are not. And regulatory oversight is currently not sufficient to
let the former operate while keeping the latter off the road.

------
ocdtrekkie
The reason self-driving car companies have flocked to Arizona is that Arizona
chose to pretty much completely deregulate them: They don't even have to
report statistics to the state like they do in California.

It's unfortunate, but unsurprising, the first pedestrian killed by an
experimental self-driving car allowed on public roads was in Arizona.

~~~
jonathanyc
What will the consequences of this person’s killing be? Will someone lose out
on a promotion, or miss their performance bonus?

We need to discuss how the developers of self-driving cars will be held
accountable for the crimes they commit. There is no reason the person who
programs or the person who makes money from a self-driving car should be held
less accountable for a crime that if committed directly by a person would
almost certainly result in jail time. You can’t replace yourself with a
computer program and then choose to take only the benefits and not the
responsibilities.

~~~
cliffordkeeney
This is crazy. The developers likely have no say in where and when the cars go
out on public roads. That's obviously a decision for someone higher up in the
company.

The executives should be held accountable, not the developers.

~~~
MadBohr
I disagree; you're just passing the buck. Accountability needs to exist at
all levels. If an engineer writes a bug into code like this (deliberate or
not) and such a bug results in somebody's death, the engineer should be held
accountable just as much as the person who approved its release. The executive
could just as easily say "my engineers promised me it was fully tested", etc.
Engineers could say "yep it was, but that was an edge case we missed" or
something like that. In any case, there needs to be shared accountability.
Maybe execs take the brunt, but engineers should not be allowed to write code
that kills people (inadvertently or otherwise) and face zero consequences.

~~~
ApolloFortyNine
What software developer would ever sign on to a project where they could be
held criminally liable for a single bug?

Do you want software development to turn into healthcare, where every
developer needs millions of dollars of malpractice insurance? Because shit
like this will turn it into a healthcare like system real quick.

~~~
prepend
Criminal liability is a different situation as there are very few industries
with specific criminal liabilities (finance maybe).

But there are many industries where civil liabilities are required. In fact,
any software independent consultant is civilly liable for their work, but it’s
not specific to software.

IEEE has a section in their member toolkit that goes into why professional
liability insurance is needed,
[https://m.ieee.org/membership_services/membership/discounts/...](https://m.ieee.org/membership_services/membership/discounts/risk_assessment_review_toolkit.pdf)

The costs aren’t that high or at least they weren’t 15 years ago when I
purchased it for less than $1k/year for $1M in coverage. Most people need this
even if they think they are safe. If you’re the one who wrote the deployment
script that erased $1M in data, the fact that the script made it through QA
won't entirely shield you.

Also interesting: the engineers who wrote the Uber software are exposed to
criminal-negligence liability, like pretty much everyone else, but you would
have to prove culpability. I can’t find any examples of software engineers
being convicted, so it’s hard to tell who would go to jail: the developer, QA,
or an executive.

More info on criminal/civil negligence-
[https://www.theblanchlawfirm.com/?practice-areas=criminal-
ne...](https://www.theblanchlawfirm.com/?practice-areas=criminal-negligence)

~~~
ascorbic
Nobody in their right mind would work with such liability without insurance,
which is all well and good for civil liability, but insurance won't help if
you're going to jail.

~~~
prepend
I think we may be arguing different things.

Almost all employees have the possibility of criminal negligence based on
their work. For programmers, this could mean that if you fuck up the code for
a pacemaker and someone dies, you could go to jail. That’s a big risk and I
can’t find any programmer who has been found culpable for someone’s death.
This is the current law in the US.

If Uber was negligent in its code, then the programmers could go to jail. They
have programmers and they work and assume this extremely low risk.

Now maybe you’re arguing that some special law should or should not exist for
Uber drivers.

------
kappi
1500 comments before knowing the real reasons. It could have been suicide: the
woman who died had a history of substance abuse and had been arrested 6 times.
[http://www.dailymail.co.uk/news/article-5519433/Self-
driving...](http://www.dailymail.co.uk/news/article-5519433/Self-driving-Uber-
car-runs-kills-pedestrian-Arizona.html) [http://fortune.com/2018/03/19/uber-
self-driving-car-crash/](http://fortune.com/2018/03/19/uber-self-driving-car-
crash/)

~~~
ISL
It needn't be suicide; from the second link, the victim may have stepped from
a shadowy spot/out of view. A car cannot avoid what it can't see.

------
hardwaresofton
It seems like a bunch of what-ifs that normally come up with self-driving cars
are about to get answered and precedents are about to be set.

I assume this case will also be one of the most thoroughly recorded fatal car
accidents in history, given the amount of sensors and equipment on board a
self-driving car, along with eyewitness testimony from the operator.

Can't tell if Uber has just been incredibly unlucky of late, or if enough of
their employee base is incompetent to keep them from having a quiet year with
no large failures.

------
joeevans1000
Hype cycles are beginning to cause loss of life. Self driving cars are being
allowed to operate at a far faster rate than other transportation innovations
would have. This is because of the enthusiasm around technology. We need to
realize the difference between social media apps and steel.

------
pbhjpbhj
Had the car passed the driving test?

There should probably be a specific, complex and comprehensive test for
autonomous vehicles. I'd also want to see shared liability: the company is
driving the car (by proxy, through its software), so it should be liable to
some extent.

~~~
drstewart
>There should probably be a specific, complex and comprehensive test for
autonomous vehicles

Maybe we should start with a specific, complex and comprehensive test for
human drivers first.

And to build on the sentiment of this thread, every driver should finance
their own test course to practice on before "subsidizing" their learning on
public roads.

~~~
pbhjpbhj
In the UK, for motorbike tests one takes a computer-based test, then practises
with a company on a private off-road area. One must then pass a test before
being able to practise under instruction on public roads. Once sufficiently
practised, one can take the full test.

Basically what you propose?

As for comprehensive testing - I think the UK car and motorbike tests are
quite good, not comprehensive, but they demonstrate a general ability across a
range of skills. Humans can be expected to act relatively consistently (in
such things); the test must be more comprehensive when examining a system that
you can't expect to have innate consistency.

------
igammarays
My personal hunch is that fully autonomous self-driving tech is not
theoretically possible under currently known computational models, because it
involves many well-known NP-hard problems. Self-driving companies are betting
on the ability to find a heuristic/approximation that works "sufficiently
well". But I strongly feel that the chasm that needs to be crossed to be
"sufficiently good" is not one of magnitude (i.e. we just need more testing!),
but of theoretical boundaries, due to the existence of at least two
sub-problems which are not computationally tractable: 1. predicting what
pedestrians/cyclists will do next, and 2. accounting for sensor input
distortion under bad weather conditions.

Humans can solve these problems due to _life_ experience, not just _driving_
experience. In other words, I think we're gonna need fully-conscious AI to
solve self-driving.

The only way self-driving tech will reach production is if the input space is
restricted, which is a significant-but-not-groundbreaking iteration on what
we've been doing for decades with airplane autopilots and self-driving
monorails. Sure, we can have self-driving cars on specifically designed
freeways, but nothing more.

~~~
koala_man
Do you believe human-driven tech is theoretically possible?

Current experiments and production data from human controlled vehicles have
not been encouraging.

~~~
igammarays
I added an edit to my comment: We don't have a computational/theoretical model
for human consciousness. This is why it's called The Hard Problem of
Consciousness.

~~~
th1049
That isn't the hard problem. That is simply having a theory of mind. The hard
problem is understanding how the physical world gives rise to subjective
experience (what causes conscious beings not to be philosophical zombies).

~~~
igammarays
Oops, you're right! I've edited my original comment, thanks.

------
muxator
Sorry, but what's the point of this?

> Elcock said he believed Herzberg may have been homeless.

I do not think that Herzberg possibly being homeless adds any meaningful
information. On the contrary, I feel this may subtly support unconscious
prejudices.

~~~
cloakandswagger
Spend some time in any major city and you'll see homeless people doing all
sorts of reckless things around cars. Just last week I saw a homeless man
jaywalk into oncoming 40+ MPH traffic without so much as looking or stopping.

In this sense, it is a possibly relevant detail of the story.

------
odammit
Curious how the numbers work out for miles driven [1] versus deaths,
autonomous vs. standard vehicles.

I know the scale is different, but since this is the first death I’m curious
whether the percentages fall in line.

[1] is distance the right metric?

~~~
jws
This is an interesting metric. Although self driving test cars are rare, the
whole point of their existence is to drive so they probably clock more hours
than a normal car.

About 5,400 pedestrians are killed each year in the US. US drivers cover about
3.1 trillion miles a year, so they kill a pedestrian roughly every 570 million
miles. Last November, Waymo said they had 4 million self-driven miles, well
short of the point where you'd statistically expect to hit a pedestrian. In
September of some unspecified year, Axios claimed Uber had self-driven over 1
million miles.

My estimate needs help: the mix of pedestrian-bearing and pedestrian-free
roads behind my total-miles-per-year number likely does not match what carbot
testers cover. Also, this may have been a cyclist death, which adds another
800 or so deaths per year.

But, in any event, in rough numbers, Uber appears to have beaten its expected
time to a pedestrian fatality by two or three orders of magnitude.
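The estimate above can be reproduced in a few lines; the inputs are the
approximate figures quoted in this thread, not official data:

```python
import math

# Miles per pedestrian fatality for human drivers, from the figures above.
US_PEDESTRIAN_DEATHS_PER_YEAR = 5400
US_VEHICLE_MILES_PER_YEAR = 3.1e12  # ~3.1 trillion miles

miles_per_death = US_VEHICLE_MILES_PER_YEAR / US_PEDESTRIAN_DEATHS_PER_YEAR
print(f"~{miles_per_death / 1e6:.0f} million miles per pedestrian death")  # ~574 million

# Compare Uber's reported ~1 million self-driven miles with that expectation:
orders_early = math.log10(miles_per_death / 1e6)
print(f"~{orders_early:.1f} orders of magnitude earlier than expected")  # ~2.8
```

With these inputs the result lands squarely in the "two or three orders of
magnitude" range, subject to all the caveats about road mix noted above.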

~~~
haneefmubarak
It occurs to me that because the deaths caused by autonomous vehicles may not
follow the same distribution across types of deaths, it might make more sense
to compare total deaths per million miles between human and autonomous
drivers.

~~~
codefined
It appears that the current figure sits around 10 vehicle deaths per billion
vehicle miles travelled, which seems unbelievably low. I'm getting these
figures from this Wikipedia graph [1].

[0]
[https://en.wikipedia.org/wiki/Transportation_safety_in_the_U...](https://en.wikipedia.org/wiki/Transportation_safety_in_the_United_States)
[1]
[https://en.wikipedia.org/wiki/File:US_traffic_deaths_per_VMT...](https://en.wikipedia.org/wiki/File:US_traffic_deaths_per_VMT,_VMT,_per_capita,_and_total_annual_deaths.png)

------
wruwew8uu9
Why is Uber allowed to test its software for free on public roads? Why doesn't
it use a purpose-built test track of its own, where it must prove that its
software won't kill people (using robotic crash-test dummies) before it is
allowed onto public roads?

------
wlll
I just don't trust Uber and its engineers enough to trust them as a self-
driving car company. Even before this incident.

Their organisation has been shown to be deceitful, their CEO to be devious. I
can completely believe they would cut corners, or claim safety where they know
there is risk, to get to market ahead of the competition, knowing the fate of
road users lay in a dice roll.

On the technical front, they resorted to stealing Waymo's IP to get ahead, and
their self-driving cars have previously been shown to behave recklessly.

I do, tentatively, trust Waymo. I've seen enough about the resources they've
put into this, the extensive testing they've done, and their safety record so
far to at least give them a shot.

------
JepZ
Why am I not surprised that this happened with an Uber self driving car?

In fact, their culture of ignoring rules and common sense might be okay for
business development, but when human safety and self-driving cars are at
stake, it is simply irresponsible and inhumane.

------
11thEarlOfMar
It's not hard to believe that self-driving autonomous cars will make the roads
safer overall.

The issue is that while fewer lives will be lost, it will be a different set
of lives than would have been lost without self-driving technology.

Those lives will have their day in court.

------
amelius
> Car was autonomous with driver behind wheel

Let me make a prediction. Uber will claim that the driver was in control of
the vehicle at the moment of the accident. And the "driver" will receive a
bonus payment in the coming months.

------
zepn
This paradigm-shift is a chance to address one of the leading causes of
preventable death in the world.

It doesn't surprise me for a moment that this was Uber (though it might have
been a Tesla). From all I've seen and read, these companies are racing
unscrupulously, and some have inferior technology compared with others.

Waymo seem to be approaching the problem from a safety direction rather than a
pure race for profit.

We need to hold SDs to a higher standard than we hold human drivers. Not 1.5x
the standard; 1,000x. 10,000x. 100,000x.

And every death must be treated as a serious failure of engineering. These are
preventable deaths.

------
foobaw
I know quite a few people working on this at Uber ATG. They are in panic mode.

------
tracker1
Really? This is completely new tech, and a new process. How is any reaction "a
standard move"? I guess it could be standard for any new tech to result in a
death from autonomous machinery. In any case, things like this are why I don't
think fully autonomous driving is nearly ready for city environments. I'm not
saying it isn't close, or shouldn't be worked on. I just don't think the
intelligence systems are nearly as worked out as they should be, not to
mention the level of sensors needed.

~~~
Tobba_
I'm sure that pile of data will gain intelligence if we stir it hard enough.
Any moment now.

------
outside2344
Given how incompetent Uber is at regulatory compliance in everything else it
does, is this really a surprise?

------
UncleEntity
A while back I was riding my bicycle home from work and saw one of the Uber
robocars coming the other direction while I was making a left turn, so I
decided to test their reaction algorithm by starting my turn while they were
still approaching; the car didn't even flinch, while a human driver would at
the very least have honked at me.

As an aside, I've also been hit in a bike lane in Tempe by a human driver,
unfortunately; if it had been a robocar I'd be living on the beach somewhere
off the settlement check.

------
aanet
I would hate to have been the (backup driver) in this sad situation.
Regardless, the regulations can't come soon enough.

From SF Chronicle [1]:

<QUOTE> The self-driving Volvo SUV was outfitted with at least two video
cameras, one facing forward toward the street, the other focused inside the
car on the driver, Moir said in an interview.

From viewing the videos, “it’s very clear it would have been difficult to
avoid this collision in any kind of mode (autonomous or human-driven) based on
how she came from the shadows right into the roadway,” Moir said. The police
have not released the videos.

The incident happened within perhaps 100 yards of a crosswalk, Moir said. “It
is dangerous to cross roadways in the evening hour when well-illuminated,
managed crosswalks are available,” she said.

...

“I suspect preliminarily it appears that the Uber would likely not be at fault
in this accident, either,” Moir said.

However, if Uber is found responsible, that could open a legal quagmire.

“I won’t rule out the potential to file charges against the (backup driver) in
the Uber vehicle,” Moir said.

But if the robot car itself were found at fault? “This is really new ground
we’re venturing into,” she said. </QUOTE>

[1] [https://www.sfchronicle.com/business/article/Exclusive-
Tempe...](https://www.sfchronicle.com/business/article/Exclusive-Tempe-police-
chief-says-early-probe-12765481.php)

------
marcell
Forgive my ignorance, but isn't this exactly what isn't supposed to happen
with LIDAR-equipped cars?

My understanding is that LIDAR provides a completely accurate 3D map of the
area around a self-driving car. This is in contrast to Tesla's
image-recognition approach, which builds a 3D map from 2D images. So it seems
like a pretty giant bug if a LIDAR-equipped self-driving car ever crashes into
another object, since it should always know the location of external objects.

Please correct me if I am wrong!

~~~
grmarcil
You're not wrong, but it is more complicated than that. Lidar gives you an
array of 3D points and intensity, corresponding to where the lasers bounced
back, and how strong the reflection was. Roughly speaking, from there you have
to decide 1) which sets of points belong to the same object, 2) what those
objects are, and 3) what those objects intend to do in the future.

So yes, a lidar-equipped AV completely not sensing a pedestrian would be
surprising, but you can see how it might have incorrectly classified the
pedestrian, or misunderstood the pedestrian's intent.
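To make step 1 of that pipeline (deciding which points belong together)
concrete, here is a deliberately naive distance-threshold clustering sketch.
Real AV stacks use far more sophisticated methods; the point format, threshold,
and function name here are made up purely for illustration:

```python
import math

def cluster_points(points, max_dist=0.5):
    """Greedy Euclidean clustering: a point joins a cluster if it lies
    within max_dist (meters) of any point already in that cluster."""
    clusters = []
    for p in points:
        placed = None
        for c in clusters:
            if any(math.dist(p, q) <= max_dist for q in c):
                if placed is None:
                    c.append(p)
                    placed = c
                else:            # p bridges two clusters: merge them
                    placed.extend(c)
                    c.clear()
        clusters = [c for c in clusters if c]  # drop emptied clusters
        if placed is None:
            clusters.append([p])
    return clusters

# Two well-separated groups of 3D points -> two "objects"
scene = [(0, 0, 0), (0.3, 0, 0), (0.6, 0.1, 0),   # object A
         (5, 5, 0), (5.2, 5.1, 0)]                # object B
print(len(cluster_points(scene)))  # 2
```

Even this toy version hints at the failure modes: a bad threshold can split
one pedestrian into two objects or merge a pedestrian into a parked car, and
classification and intent prediction still remain after clustering.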

------
ChuckMcM
Something of a self-driving car Rorschach test. On the one hand there is the
tragedy (car hits person, person dies), on the other there is the technology
(computer doing the driving), and then there is the fog of reporting where
actual data is hard to come by because people are people and report on the
things they saw/heard that were important to them.

Last night there was a story on CBS about the first 'self driving bus' going
live in San Ramon California. [1] Where the reporter steps out in front of it
to see if it will stop (spoiler alert, it does).

And while it was a tragedy, it's unfortunate that because it was a 'self
driving' car, this fatality gets more coverage than the thousands who are
killed by 'human driven' vehicles. Bicycle advocacy groups have been arguing
for years that better, separated, bike lanes would save lives. Perhaps the
companies behind self driving can get behind that effort to protect bike
riders from humans and make it easier on their cars.

[1]
[https://www.cbsnews.com/live/video/20180319105443-california...](https://www.cbsnews.com/live/video/20180319105443-californias-
first-self-driving-bus-is-running-near-san-francisco/)

------
Ajedi32
So this is it then? First fatal accident involving a fully self-driving car
which might have actually been the car's fault?

Well, it was obviously going to happen sooner or later. It'll be interesting
to see what the fallout is. Up until now regulators have been, surprisingly,
taking a pretty relaxed approach to regulation of self-driving cars. Hopefully
this one accident doesn't impede development on other autonomous car programs
too much.

------
otalp
Very relevant video on what the discussions about 'the ethics of self driving
cars' miss:
[https://www.youtube.com/watch?v=ozcaLnTuidU](https://www.youtube.com/watch?v=ozcaLnTuidU)

Summary: We spend too much time talking about what decision the car should
make in these situations, not enough on how allowing a corporation to make
those life and death decisions changes our society.

------
peterwwillis
Situations no self-driving car can avoid:

      - philosophical dilemmas
      - physics-constrained reaction times
      - actions that violate the rules of the system

Solutions:

      - protected car lanes
      - protected bike lanes
      - protected pedestrian lanes

Reasons why these solutions are not put in place:

      - cost

Until humans decide that the cost of a human life is higher than the cost of
upgrading infrastructure, we should accept human death as a regular part of
autonomous driving, just as we do for non-autonomous driving: 37k people die
every year in the US due to human drivers.

Top reasons for auto accidents today include inclement weather, reckless
driving, speeding, driving under the influence, and distracted driving. In
theory, most of those could be solved by autonomous driving. But then the list
of reasons for accidents would change to whatever new reasons cause autonomous
car accidents, such as damaged sensors, programming errors, equipment failure,
road hazards, etc.

Even with autonomous cars, we will still need protected lanes, and we will
still never implement them, because we don't really care when people we don't
know die.

~~~
lainga
If by "protected pedestrian lane" you mean putting a Jersey barrier between
the sidewalk and the road, no thanks. Maybe I'm a spoiled suburbanite, but I
can't help recalling that, scarcely a century ago, pedestrians could walk
along or on the streets wherever they pleased.

~~~
peterwwillis
Not specifically that one way.

In order to allow transportation to co-exist with pedestrians without
collisions, you need _some kind_ of separation between the two. With subways,
the protected lane is literally underground, but it does definitely have a
protected lane. If you don't go under ground, you can go above ground, like
several subways and metros do around the world.

If you don't do either of these, you have to make concessions on the ground
level. My personal preference would be tall fences around the roadway, and
pedestrian bridges that go over or under the roadway (but both have problems).
Another would be to still have the fences, but automate some sliding
barricades that would activate when traffic halted, which is somewhat like how
train crossings work. We could also implement hybrid methods, like that at
Shibuya crossing, for very congested intersections.

------
dmode
I know that when Uber was stopped by the CA DMV from operating its self-
driving cars without a permit, the Arizona governor went all out to promote
the state as a beacon of business friendliness, deregulation, etc. Perhaps,
when something as critical as self-driving cars is sharing the road with
pedestrians, bikes, and other vehicles, a more cautious and deliberate
approach is warranted?

------
swyx
I'm saddened by this incident but have thought a lot about this eventuality.
There are a few levels of societal acceptance of self-driving car deaths that
I can think of:

A - Human equivalent: the self-driving car obeys reasonable rules that a human
also would (e.g. minimum speed on a highway) and kills someone

B - Trolley problem: (very artificial scenario) self driving car has to choose
between killing N or M people where N > M and wrongly (in hindsight) chooses N

C - Car fault: self driving car kills someone in a situation where no human
would have

As a society we would probably accept A easily but B starts to get shaky. C
currently looks completely unacceptable, BUT I would argue that society has to
get to a point where even C is ok conditional on the probable result that
overall car deaths decline dramatically.

In other words we will have to get to a point where individual deaths are
extremely regrettable but the overall death reduction of adopting self driving
cars are so undeniable that the individual deaths can be discussed without
also talking about banning SDCs in the same breath.

~~~
rimliu
How does that work out? I mean, you've got a situation where an SDC kills
someone no human driver would have, but somehow it gets safer?

~~~
swyx
Yes, because in aggregate the deaths are still reduced. We need society to get
past individual death stories (which are very sad, and which we should do
everything we can to prevent) and not let them distract from the overall need
for SDCs.

------
hugh4life
The self driving car craze baffles me. I would only trust it if it were on a
road made for it and the cars communicated to one another.

~~~
jpao79
Or at least have self-driving cars operate only along specific, well-marked,
well-maintained autonomous driving routes, similar to light rail or bus lanes,
so pedestrians know to be extra careful.

And for long haul trucking applications, freeway lanes at specific times of
late night/early morning could be dedicated for autonomous driving.

------
lifeisstillgood
There are only two paths from here - an escalation of the scramble to real
world driving hidden under self-serving rationalisations, ("the bigger
picture") or a step back to some industry self-regulation, where self-driving
just does not happen till the sensors, the maps and the algorithms are much
much much improved.

We know very little about the circumstances here, apart from the obvious
tragedy. But given that we learnt from Tesla that you cannot rely on cameras
alone, whatever is learnt from this must be shared among the whole industry
and serve as the baseline for the future.

I would prefer an outright ban on driverless cars in real world settings
(until perhaps each competing car can safely navigate the worst sensor data
other companies can offer) but as a minimum we need a neutral clearing house
similar to the airline industry.

There are on this forum enough industry insiders, plus a sprinkling of
regulators that a global self-regulated path could easily be forged.

------
lettergram
Given that I've seen (just last year) an Uber self-driving car run a red light
in SF, nearly hitting several people, this is not at all surprising.

I do think Uber should be charged if the car is found at fault. Not a civil
case, criminal. We know machine learning algorithms fail all the time (like
this). It shouldn't be alright to just ignore this.

~~~
darkstar999
> Not a civil case, criminal.

Has there ever been a case where a car manufacturer is charged with a crime
for a defect that results in a death?

~~~
linkregister
I don’t know about cars, but there is precedent for company officers being
held criminally liable for negligent homicide. Even in non-homicide cases,
individuals sometimes face criminal charges, as the Volkswagen engineer behind
the emissions-cheating scandal found out.

------
ggg9990
Of course it was Uber. I'm sure they don't even give a fuck any farther than
the cost to their brand either.

------
blauditore
Wow, this is sad.

I always imagine I would feel uncomfortable developing software in certain
areas such as this, or e.g. for crewed space rockets. Errors have a much
higher impact than when developing web applications, where the worst-case
scenario is a temporarily unreachable website, or maybe some data loss.

------
nopriorarrests
I wonder how this will be treated legally.

Police can't just write "killed by python script" as a cause of death, that
would be crazy, and there is no legal framework for cases like that. My guess
is that A.I. could be equated to cruise-control, meaning person behind the
wheel is responsible.

Maybe someone knows better?

~~~
drcode
People are injured by the malfeasance of corporations involving many employees
all the time, such as when a bridge collapses. There is nothing that says a
police officer needs to immediately find a single guilty person 10 minutes
after an accident like this.

~~~
nopriorarrests
I agree, for large engineering projects, such as bridges and buildings,
obviously many employees are involved and in many cases you can't even single
out someone who is responsible.

But, to the best of my knowledge, there are no "corporations" on the road. No
matter what company a driver works for and under what type of contract, that
person is responsible for all kinds of incidents.

------
chriskanan
>> _The pedestrian was outside of the crosswalk. As soon as she walked into
the lane of traffic she was struck._

>> _Tempe Police Chief Sylvia Moir said that from viewing videos taken from
the vehicle “it’s very clear it would have been difficult to avoid this
collision in any kind of mode (autonomous or human-driven) based on how she
came from the shadows right into the roadway._

A lot of comments on here are faulting machine learning for this or
algorithmic flaws, but unless we see the video it is really hard to blame the
algorithms. Based on the quote from the story, it sounds like someone stepped
out into the street, not in the crosswalk, at night, directly in front of a
car going 40 miles per hour. How could a human or any algorithm cope with that
scenario?

------
ptr_void
My non-expert opinion is that we will not see self driving cars become reality
in our lifetime. I don't understand why anyone believes training on some
videos/pictures to calculate some probabilities would ever be enough to handle
all the complexities of the real world.

------
smsm42
I think it would be insane to expect 100% safety from any technology, let
alone a new one. People do crazy things on the road; I witness it almost
daily. And if you look up dashcam videos on the internet, you'll be scared to
ever drive again. Expecting some magic technology to deal with all this
craziness with 100% safety is unrealistic. And then add the bugs and
malfunctions that absolutely every technology has, especially a new one.

So, brace for the incoming wave of similar news. These things will happen, and
they will be heavily publicized. Hopefully this won't cause a stupid
overreaction, like banning the technology altogether or surrounding it with a
bubble of regulations so thick as to make it infeasible.

------
sdfget434
That car has to have a dashcam -- what's the current caselaw with regards to
the 5th amendment?

~~~
CobrastanJorji
I'm no lawyer, but I imagine it wouldn't be hard to get a warrant for the
dashcam footage. The Fifth Amendment is about compelled testimony, not
possessions; otherwise, all warrants would violate it.

------
nwellinghoff
I love how everyone is throwing fuel on the fire here with no actual details
as to what happened in the accident. It could simply be that she ran out into
oncoming traffic and got hit; it's hard to say a human driver could have
performed any better. The devil is in the details.

~~~
CardenB
Agreed. We need a lot of time to digest this. At least anything before the
NTSB investigation is jumping to conclusions.

------
johnstew
Why is Uber allowed on public roads?

~~~
jakeogh
This has little to nothing to do with the specific company (although is a
convenient deflection ATM). The very idea that pre-programmed computers can
deal with driving is the fundamental problem. In time, people will realize
that, but at the moment the "solution" is to "patch" and make more laws to
hide the truth: AI has nothing to do with intelligence.

Ultimately they will blame humans; humans are the bugs in their code.

~~~
joefourier
As far as I know all self-driving car programs use some form of machine
learning, so they aren't exactly "pre-programmed" in the sense of good old-
fashioned AI based on strict, pre-determined rules.

What exactly do you mean by "deal with driving"? Drive without a single
accident, ever? That's obviously impossible in practice. Drive better than
humans, who in 2016 killed 37,461 people in the US alone? I don't see how that
would be impossible - human drivers have a limited field of view, slow
response times (average time to brake is ~2.3 seconds), and are frequently
distracted, sleepy, drunk, etc.
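To put that ~2.3-second figure in perspective, here's a back-of-the-envelope calculation (assuming the ~40 mph speed mentioned elsewhere in the thread):

```python
# Distance a car covers during the average human brake-reaction time.
mph = 40                        # approximate speed reported in the article
mps = mph * 1609.344 / 3600     # convert mph to meters per second (~17.9 m/s)
reaction_s = 2.3                # average time-to-brake cited above
distance_m = mps * reaction_s   # ground covered before braking even starts
print(round(distance_m, 1))     # ~41.1 m
```

That's roughly 40 meters of travel before a human even touches the pedal.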

~~~
jakeogh
Let me know when self driving cars can get sleepy, distracted or drunk. Until
then, I'm much more concerned with "stop before hitting things 101". This
"pre-programmed cars must be better than X" argument is getting old fast.

This is one of those self correcting problems, and it's going to be fixed way
faster than the startups pushing this "smart machines" propaganda are going to
like.

------
osrec
I really don't understand the need to put a self driving car around
pedestrians or other human drivers. As with any machine learning/control
system, wild instabilities can and will crop up. Why don't we simplify the
problem domain, keep well-suited, designated infrastructure just for
self-driving vehicles, and solve the last-mile problem with a slow 15 mph
creep to the ultimate destination? I mean, it's impressive that cars can drive
themselves through a busy intersection alongside other human drivers, but it
is by no means necessary, especially when adding humans into the mix adds a
degree of uncertainty that is very difficult to account for...

------
nothrows
> “The pedestrian was outside of the crosswalk. As soon as she walked into the
> lane of traffic she was struck”

Uber has some fucking shitty programmers guaranteed. They murdered this lady.
Think about it:

Smart cars basically have 360 degrees of sensors. There’s no way it didn’t
detect that lady coming up on the car, and the fact that it killed her says to
me there was no countermeasure. It probably didn’t even slow down until the
moment of impact. That's some fucking elementary school shit.

> Chance of Collision? 100%. Maintain Speed

In structural engineering if you designed a bridge poorly and it collapsed
killing people you'd lose your ability to practice engineering. The same
should be applied.

------
lostlogin
I wonder how companies handle these situations? Most places have clear
requirements eg stop, check if anyone is injured, contact police, remove
debris from road if safe to do so. These things aren’t so easy without a
driver.

------
t0mbstone
Based on landmarks from the photo from the news article, and the fact that the
accident supposedly happened near Mill and Curry, I'm assuming that this was
the location of the accident:

[https://www.google.com/maps/place/N+Mill+Ave+%26+E+Curry+Rd,...](https://www.google.com/maps/place/N+Mill+Ave+%26+E+Curry+Rd,+Tempe,+AZ+85281/@33.4369934,-111.9429875,3a,75y,84.18h,60.6t/data=!3m6!1e1!3m4!1scUyILaxFs5z63AL2SupCJw!2e0!7i13312!8i6656!4m5!3m4!1s0x872b0931b4d9dd05:0x5d55a20356caaf8!8m2!3d33.4377022!4d-111.9433046)

------
aurizon
Uber must have a full video of this - just in case. I can hardly imagine they
do not have a full record of this, as well as of all miles traveled. It may be
in these 1366+ comments, but I have not read them all to see.

------
justaaron
I never understood the recent push towards self-driving cars, as it already
seems perverse that the entire surface pavement has been dedicated primarily
to motor vehicles and only secondarily to our actual selves, as pedestrians.

It puts further pressure on what should really be two discrete planes of
activity (perhaps one in tunnels below ground or in the air, and I don't mean
the pedestrians!).

Perhaps now the arrogance of such bad ideas becomes a little bit more clear to
the tech set of folks, that in my view were excessively pushing such strange
goals upon a public that wasn't clamoring for it.

------
skarap
To repeat my comment from a previous discussion which brought a lot of
downvotes: what happens when (not if) a self-driving car runs over and kills
someone (e.g. because of a software bug)? Do such cases cause criminal
penalties? Who is penalized? Or will all cases of autonomous car accidents
with deaths become civil cases? If so - do human drivers get the same new
rules or if they kill someone by accident (because they got distracted) they
still go to jail? Is that fair?

In this particular case I assume the operator will be thrown under the bus,
which is also unfair.

------
matt4077
Why am I not surprised that, among all the teams competing in this space, many
of which have clocked in far more miles than them, Uber is the first one to
kill someone?

This may well just be bad luck. But I cannot shake the feeling that if Uber
started an ice cream venture, they would store their molasses on a hill in
Boston. The only way to get “humanity” associated with Uber would involve an
Uber zeppelin.

Or it’s a conspiracy, because nothing is as threatening to Uber as autonomous
cars. This is certain to invite more regulatory scrutiny. Just
kidding... I think..

~~~
Axsuul
You mentioned bad luck and that could very well be it. Unless Uber starts
having more incidents, it isn't fair to say that Uber's technology is
inferior.

------
kwhitefoot
Why is everyone concentrating on the software? There was a human in charge and
the car was exceeding the speed limit. That sounds like either driving without
due care and attention or reckless driving. After all, the driver's reaction
time is almost certain to be much longer when supervising an autonomous car
than when driving directly, making it much more important to adhere to the
rules.

And if it can be shown that it was company policy to allow the cars to exceed
the speed limit then heads should roll all the way to the top of the company.

~~~
kawfey
It was exceeding the speed limit by 3 mph; I don't believe that far exceeds
safe margins.

------
Steeeve
Why not test these things on a private track? You have a multi-billion dollar
company attacking this problem and it feels like all they tested before they
hit the public roads was lefty and righty.

------
cf498
Looking further than the devastating tragedy at hand, it might be necessary to
get the question of responsibility publicly answered. I don't think I am the
only one here who has deliberately chosen not to work in fields where human
lives are at risk.

Despite the likely outcome, that the engineers behind this took every safety
precaution, I think it's extremely important to get the message across that
people with a CS background working on such use cases are going to be held to
the same standard as, for example, mechanical and electrical engineers.

------
dchichkov
And yet (disclaimer - reflects only my personal view, might not correspond to
the current state of affairs at Uber, it is possible that Uber cars are still
safer than human drivers, but still - those who forget history tend to repeat
it):

[https://www.forbes.com/sites/samabuelsamid/2016/12/15/the-
tr...](https://www.forbes.com/sites/samabuelsamid/2016/12/15/the-trouble-with-
uber-fail-fast-and-iterate-but-not-at-the-cost-of-human-lives)

------
FlyingSideKick
Let’s take the extreme example where a bug in a self driving semi truck leads
to it mowing down 50 people on a busy sidewalk. How are the victims to get
justice? In almost all incidents, barring a human truck driver having a major
health incident, that driver would go to jail, but it seems there will be few
repercussions beyond fines for autonomous driving companies. I think
once a mass fatality occurs people will scream for autonomous driving execs
and programmers to be held liable.

~~~
lopmotr
The way I see it, by using the road, you're sort of giving up the right to
revenge justice. It's a game of Russian roulette. People make mistakes and
kill others or themselves no matter how careful they try to be. It's like
asking where's the justice if you die from a heart attack or cancer? Those
aren't crimes, they're bad luck.

------
newnewpdro
Having these massive robots on our public streets better at least result in
unfettered, transparent public access to all the data informing the thing in
the time surrounding the crash.

This shift towards autonomous vehicles is utterly unnecessary. They could
simply be improving the safety systems of vehicles to intervene when the
driver is about to cause an accident.

Instead they're doing the opposite - expecting the human to intervene when the
robot is about to cause an accident.

------
basura045478
I see a lot of detailed analysis in this thread, which is impressive given
that literally nobody has read anything more than the half dozen factoids that
have been released. At best, some of you are looking at maps of the incident
site to figure out potential failure cases.

Knowing that you don't know something is as important as having the facts. To
harken back to Cheney, these aren't even unknown unknowns, you're basing
opinions on known unknowns.

~~~
danso
Cheney? I think you mean Don Rumsfeld:
[https://www.youtube.com/watch?v=GiPe1OiKQuk](https://www.youtube.com/watch?v=GiPe1OiKQuk)

~~~
basura045478
you got me

------
Zigurd
The article says almost nothing about it other than that the pedestrian was
"outside the crosswalk." But the details are what matters. Without the details
it is not possible to say if striking a pedestrian was avoidable. There is
also a large grey area: could the collision have been mitigated? What other
risks were involved?

The bad outcome would be that it turns out that pedestrian safety is too
underdeveloped in this system to really be safe.

------
arnaetra
I have not been following the development of self-driving car tech very
closely, but I have some familiarity with the difficulty of the challenges
involved and I have a feeling that we are at least two decades away from
having fully autonomous tech authorized for use on public roads. Am I
underestimating the progress of the tech? I have the impression that there is
a tremendous amount of unjustified hype in this field.

~~~
lacerta
It is both an over and underhyped field depending on the area you are looking
at.

HUGE difference between consumer self driving car [Everywhere, at all times]
and the Machines as a Service [Geo, time, weather-fenced operations][1].

Waymo appears to be at the head of the pack; Sacha Arnoud, Director of
Engineering for Waymo gave a talk a few weeks ago at MIT[2] and gives a good
idea. We are about 5 years out until these start rolling out as MaaS (machines
as a service). Probably 10+ years for level 4 highway operations for consumer
models according to Frazzoli.

[1]Emilio Frazzoli, CTO of nuTonomy
[https://www.youtube.com/watch?v=dWSbItd0HEA](https://www.youtube.com/watch?v=dWSbItd0HEA)
[2][https://www.youtube.com/watch?v=LSX3qdy0dFg](https://www.youtube.com/watch?v=LSX3qdy0dFg)

------
apeace
I'm really shocked at the trend in comments here.

Yes, we get it. Fatalities will happen sometimes, they are unavoidable
sometimes, and what matters in the long run is if we can achieve a significant
overall reduction in fatalities.

But my god, a person was just killed by a computer. Can't we have some
compassion and humility?

Let's, as a community, set the standard for how we will react to these events.
Let's make sure Uber releases detailed data on what happened, whether they
were at fault or not. Let's hold the media accountable for their reporting.
Let's mourn the loss of life and think about how we can solve these problems.

But for christ's sake, please stop posting the same thing everybody already
knows, which gets posted on HN whenever a self-driving article comes out.

Can't we do better?

\---

EDIT: I clarified what I meant here [1]

Wasn't trying to say this thread should be all about mourning. This is HN so
we should talk about technology, even when a person died. I'm pointing out one
specific argument that gets repeated over and over in place of a substantive
discussion, and I think we can do better.

[1]
[https://news.ycombinator.com/item?id=16621589](https://news.ycombinator.com/item?id=16621589)

~~~
alyx
I find what might help those that quote statistics is to self-reflect and ask
the question, how would you react if your
father/mother/sister/son/daughter/etc was the victim?

Would you care that, statistically, self-driving cars have killed fewer people
than human drivers?

~~~
arcticfox
I think that's a strange thought experiment, since it's basically requesting
that you think emotionally and as irrationally as possible to draw
conclusions.

~~~
ssully
It's requesting that you remember that a human life was just lost, and to not
reduce this woman's life to a rounding error.

The only strange thing in this thread is the emotional detachment many are
showing and one could probably argue that's the exact kind of thinking that
will lead to more cases like this.

------
trhway
[https://www.abc15.com/news/region-southeast-
valley/tempe/tem...](https://www.abc15.com/news/region-southeast-
valley/tempe/tempe-police-investigating-self-driving-uber-car-involved-in-
crash-overnight)

"The fleet of self-driving Volvos arrived in Arizona after they were banned
from California roads over safety concerns."

------
mdev
I wonder how software updates are validated and tested before going to
production in autonomous cars. This is scary - imagine being a developer
responsible for a similar malfunction. I'm not suggesting that was the case
here - just think this might need as much validation as medical equipment
(hoping that it won't stifle innovation).

------
lr4444lr
Meanwhile...

[https://www.azcentral.com/story/news/local/arizona/2018/03/0...](https://www.azcentral.com/story/news/local/arizona/2018/03/01/arizona-
has-highest-rate-pedestrian-deaths-united-states-report-says/383640002/)

------
dang
We changed the URL from [https://www.wcpo.com/news/arizona-police-
investigating-self-...](https://www.wcpo.com/news/arizona-police-
investigating-self-driving-uber-car-involved-in-crash-overnight) to one that
is longer and doesn't autoplay a video.

------
jostmey
Both the computer and the driver failed to avoid the pedestrian. It's possible
that the backup driver wasn't paying attention and the technology for
autonomous vehicles is not ready. It's also possible that the situation was
challenging for both the computer and human and that both failed.

------
4rgento
I wonder how many different "cultures" drivers have.

People from different localities drive differently. I hope they use drivers
from, say, NY, LA, and various small towns to teach the AI.

People who live in towns of ~10k drive differently than a New Yorker in the
same town.

EDIT: They might be training with a biased sample

------
HGGLLJNLJHU
[http://www.autonocast.com/blog/2018/1/18/46-part-two-of-a-
co...](http://www.autonocast.com/blog/2018/1/18/46-part-two-of-a-conversation-
with-velodyne-about-the-past-present-and-future-of-lidar)

------
schintan
I think the test cars should be covered in layers upon layers of foam or
another kind of shock absorbing material. This should reduce the collision
impact at least a little and can be the difference between someone getting
killed and escaping with injuries in some cases.

------
woliveirajr
This comes just a few days after this: "When an AI finally kills someone, who
will be responsible?" url:
[https://news.ycombinator.com/item?id=16584862](https://news.ycombinator.com/item?id=16584862)

------
thecombjelly
This seems like a good time to question if we really should have cars at all.
Autonomous cars might be better in some ways but they are still tons of metal
flying at high rates of speed. Accidents will happen and people and other
animals will die on a regular basis. Cars also require tons of infrastructure
that walking and biking don't. You also have the environmental destruction of
GHG emissions and from the mining of rare metals needed for batteries, you
have roads and parking lots destroying habitat and using up valuable space,
air pollution and smog from tailpipes or energy generation, noise pollution,
and more harmful effects. Autonomous cars might mitigate some of those things
but most of it will still be there (or we'll just use cars even more and so
we'll have just as many deaths total along with the rest of it?). What about
doing research and testing a world without any cars? It will probably be a
better one.

~~~
Nomentatus
Paris has greatly reduced traffic, thinking along these lines.

------
kumarski
"Uber is now the target of at least three potential class action lawsuits, at
least five state attorney general investigations, and an inquiry by the FTC
because of Sullivan’s decision to pay off hackers and the cover ups."

via twitter.com/adamscrabble

------
booleandilemma
I feel like I should share this here.

The first person killed by a car:

[http://www.guinnessworldrecords.com/world-records/first-
pers...](http://www.guinnessworldrecords.com/world-records/first-person-
killed-by-a-car)

------
wiz21c
There are roughly 3e8 cars in the US, and they kill 3e4 people a year => each
car kills 1e-4 person a year. If we have say 100 AVs right now in the US, each
AV kills 1e-2 person a year. So that's 100 times more dangerous than humans...
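Spelling the estimate out (the 100-AV fleet size and the single fatality are rough assumptions):

```python
# Naive per-vehicle fatality rates, using the round numbers above.
us_cars = 3e8        # ~300 million cars in the US
us_deaths = 3e4      # ~30k road deaths per year
av_fleet = 100       # rough guess at AVs currently on the road
av_deaths = 1        # fatalities so far

per_car = us_deaths / us_cars    # deaths per car per year
per_av = av_deaths / av_fleet    # deaths per AV per year

print(per_car)            # 0.0001
print(per_av)             # 0.01
print(per_av / per_car)   # ~100x
```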

~~~
Aunche
That's not a good comparison because self driving cars are on the road a lot
more often than regular cars.

~~~
wiz21c
yep, sure. I did that computation to have a rough estimate. Also, with one or
two fatalities, I guess the confidence in the stats is not very high.

------
bambax
If the car is deemed non-responsible it makes it the perfect crime...

How long before someone tries to program a car to "hunt" a victim and hit her
when she makes a mistake, so that it appears to be a pure accident with no one
at fault?

------
wiz21c
I once read somewhere that privacy issues (which are very important to us, IT
people) were not treated enough because they didn't have enough blood on their
hands.

At least with automated cars, there will be blood.

------
AsyncAwait
As someone who cannot drive because of a disability, I am putting real hope
into self-driving cars becoming a thing during my lifetime. Uber may've just
set that back a decade.

------
nkrisc
And now the costs of developing autonomous cars have been socialized to those
unfortunate enough to be walking in front of them while they learn.

Perhaps her estate should receive a perpetual royalty.

------
jehlakj
Just curious: Do tech companies “pay” the state so they can try out their
latest and greatest? Or do they look for states with the least amount of
restrictions?

~~~
Zarath
Isn't this just lobbying?

------
ada1981
Could this help Uber?

Americans seem to enjoy the "X doesn't kill people. If only more people had
X, fewer people would have to die." argument.

<braces for downvotes>

------
itsuart
So, who is going to jail (or at least - trial) over this?

------
melicerte
If you skip the emotional part of the news, the interesting question is: who
is responsible when a self driving car made by a company kills someone?

------
vpribish
I suppose speculators will speculate - but c'mon, just wait and we'll have
video and telemetry and this whole conversation will be moot.

------
jordanpg
Those who say it's not safe enough: given that no engineered system is 100%
safe, when/how can you say that a system is safe enough?

------
danso
Given that so many of the autonomous car accidents logged by the California
DMV seem to be the fault of the human driver trying to take control, I’m
surprised that the accident here reportedly happened in autonomous mode,
which for all its flaws seems to have been good at not hitting things in
front of it. In fact, it seems most of the CA accidents involved the
autonomous car being rear-ended or side-hit by other cars, usually for going
too slow or stopping unexpectedly.

------
msie
Every day I see crazy, drunk or angry people just walk across the road
recklessly. Some seem to walk in defiance of the cars.

------
gldev
So, what's the statistic on self-driving cars killing people vs. people
killing others or themselves?

------
paul7986
When will this company just go away ... after everything it's done ... it's
now killing innocent people!

------
sg0
I'm paranoid when I'm on a bike. Any road with a dedicated bike lane may be
classified as bike-friendly, but I can't consider it so if cars are on the
same road. IMO, the only bike-friendly routes are where there are no cars. I
wish pavements could be made wide enough for bikes everywhere, so that bikers
could be on the road only for a limited amount of time while crossing
intersections.

~~~
amclennon
> Elaine Herzberg, 49, was walking outside the crosswalk on a four-lane road
> in the Phoenix suburb of Tempe

Although I agree with your statement in general, it sounds like she was
already walking her bicycle while crossing the intersection, and the fact that
she had a bicycle was a moot point.

------
justonepost
There were an estimated 40,100 motor vehicle deaths last year, a drop of 1
percent from the prior year.

------
qwerty456127
Just out of curiosity, how many people have been killed by human-driven cars
on the same day in Arizona?

~~~
WhompingWindows
My guess is 2 or 3 deaths that day. Okay, according to the 2016 Arizona Crash
Facts Summary [0], there were 952 fatalities on the road that year, including
193 pedestrian fatalities and 31 pedalcyclist fatalities.

So, on any given day in 2016, there were 2.6 fatalities, with less than 1 in
the pedestrian/pedalcyclist groups.

[0]: [https://www.azdot.gov/docs/default-source/mvd-
services/2016-...](https://www.azdot.gov/docs/default-source/mvd-
services/2016-crash-facts.pdf?sfvrsn=6)
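The daily rates follow directly from those yearly totals:

```python
# Implied daily fatality rates from Arizona's 2016 totals.
total_2016 = 952                 # all road fatalities
ped_2016 = 193 + 31              # pedestrian + pedalcyclist fatalities

per_day = total_2016 / 365       # ~2.6 per day
ped_per_day = ped_2016 / 365     # ~0.6 per day

print(round(per_day, 1))         # 2.6
print(round(ped_per_day, 2))     # 0.61
```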

------
ashleyn
Uber's self-driving program seems to be one of the most unethical, most
poorly-run in the entire industry. There were plenty of reports before warning
us of the questionable quality of their research: cars changing lanes
suddenly, blowing red lights, and now one finally killed someone. Every prior
incident should have been treated as a serious matter.

Waymo et al. have had none of these issues. Time to revoke their licence to
test?

~~~
Mononokay
> Uber's self-driving program seems to be one of the most unethical, most
> poorly-run in the entire industry.

I mean, it's Uber. I'm not sure exactly what people were expecting. Their
entire schtick is being unethical and poorly-run.

~~~
reddit_clone
Yeah. Their motto seems to be 'easier to ask forgiveness/pay fines instead of
asking for permission'.

~~~
TAForObvReasons
Irrespective of the ethics, the plan "worked" in that they managed to pressure
local governments to change rules or otherwise let Uber operate

~~~
dangerlibrary
They did manage to get some local governments to do that, yes. I’m not sure
how to interpret your comment other than as an apology/excuse for their
illegal and unethical behavior.

If there’s more to your thought than “the ends justify the means,” I’m curious
what it is.

~~~
TAForObvReasons
Uber has been rewarded for employing schticks that GP called "unethical and
poorly-run." Is it unethical? Yes, and I strongly believe that "Move fast and
break things" is wholly inappropriate when people's lives are involved. Is it
poorly-run? I think Uber is actually well-run if you accept that "ends justify
the means" is in their DNA.

~~~
dangerlibrary
What would compel me to accept that an ethical standard that can be used to
justify any amount of immoral and unethical behavior is “in a company’s DNA?”
Why would I grade them on that curve?

This is like saying Duterte is a very effective anti-drug crusader if you
accept that murdering innocents and drug dealers is in his DNA.

------
bryanlarsen
Arizona had 10 pedestrian deaths in a single week:
[https://www.azcentral.com/story/news/local/phoenix-
breaking/...](https://www.azcentral.com/story/news/local/phoenix-
breaking/2018/03/13/arizona-official-10-pedestrian-deaths-week-show-major-
crisis/422808002/)

~~~
alacombe
As somebody else pointed out, what is the ratio of total miles driven in metro
Phoenix to fatalities?

------
zengid
Did the person riding along attempt to take control? I wonder if they will be
held responsible...

------
Tharkun
If only there was this much outrage and discussion every time a human driver
killed a pedestrian.

------
Mtntk
There is a standard in the motor industry called MISRA; because the target
user is human, you have to meet a certain safety level. Otherwise bad things
happen, like this.

Maybe the self-driving industry needs some safety rules too, to handle the
case when the unexpected happens.

------
ensignavenger
The various reports on this aren't clear- did the car hit a pedestrian, or a
cyclist?

~~~
nimbs
Tempe Police say woman killed by self-driving Uber car was pushing bicycle
across street when struck.

------
hndamien
I don't see why this lesson could not be learned on a test track or simulator.

------
undoware
Headline should read 'woman crossing street kills electric cars'

------
Axsuul
Sad to hear it's happened this soon but this was inevitable.

------
zawerf
Is this the first death ever?

(Do we count Tesla's autopilot as self-driving?)

~~~
CocaKoala
Hm, there have been other crashes where the 'driver' in the car died (I recall
one where a car in autonomous mode rear-ended a truck, something about how the
sensor was blinded by the setting sun? And it had alerted the driver to take
control several times before the crash) but this is the first incident I can
recall where a pedestrian was struck and killed by an autonomous vehicle.

~~~
jandrese
IIRC in that case the driver T-boned a truck crossing the road, because the
side of the truck was so shiny that it read as the horizon to the car's
sensors.

------
arca_vorago
Uber is responsible for this, and corporations are people, right? That reminds
me I have an episode of opening arguments to listen to on the subject.

------
wiz21c
There are roughly 3e8 cars in the US, they kill 3e4 people a year => each car
kills 1e-4 person a year

------
richjdsmith
This will be the first major test of liability and insurance with a self-
driving vehicle.

------
xbmcuser
Why am I not surprised the first death of a pedestrian came from a Uber driven
car.

------
waraey
Ubye

------
VikingCoder
"Uber has paused self-driving operations in Phoenix, Pittsburgh, San Francisco
and Toronto, which is a standard move, the company says."

...there's a "standard move" when one of their cars kills someone?

I don't think that really came across the way they wanted it to.

------
abalone
This may not have been an unexpected circumstance. The "pedestrian outside of
the crosswalk" report is likely wrong. There is visual evidence strongly
suggesting this was a case of the Uber hitting a bicyclist while merging
into the right turn lane.

Here's a picture of the crumpled bicycle (note dent on the front right of the
car):
[https://twitter.com/daiwaka/status/975767445859287042](https://twitter.com/daiwaka/status/975767445859287042)

The sign it is lying next to lets us locate the exact point of the collision
on Google street view:
[https://www.google.com/maps/@33.4370667,-111.9430321,3a,75y,...](https://www.google.com/maps/@33.4370667,-111.9430321,3a,75y,132.91h,64.25t/data=!3m6!1e1!3m4!1ss_YmQlzA3ace0P2LVk4wwA!2e0!7i13312!8i6656)

It's right where the vehicle lane crosses the bike lane approaching the
intersection. Very unlikely a pedestrian was walking a bike there. Much more
likely the Uber hit the bicycle while merging.

EDIT: I take back "very unlikely" a pedestrian was crossing there. There is in
fact a _very weird_ "X" pathway in the median that pedestrians are not
supposed to be in. I didn't see that until I looked at the satellite view.[1]
So possibly she emerged from there and was walking her bike across the street.
That still looks pretty bad for Uber though, since that street is a straight
shot with clear visibility and the dent's on the right side of the car.
Meaning it would have swerved left to avoid a late-detected obstacle rather
than swerve right away from her entering the roadway. She also would have
literally had to walk straight into oncoming traffic. I still think bicyclist
is the most likely scenario and it's odd the PD doesn't mention the crumpled
bicycle.

[1]
[https://www.google.com/maps/@33.4365931,-111.9425027,198m/da...](https://www.google.com/maps/@33.4365931,-111.9425027,198m/data=!3m1!1e3)
(street view shows a "no pedestrians" sign)

~~~
omarforgotpwd
Wow, that's really bad! Easy to see how the code didn't factor in the
possibility of someone using the bike lane, which is a very bad look for Uber.

~~~
phkahler
It's not even about the bike lane. It's about not hitting a person. Beyond
that, it's about understanding unforeseen circumstances.

------
djsumdog
> You can't really get away from this without putting these cars on rails.

We should just build rails. Self-driving cars are dumping billions of
dollars onto a problem better solved by rails. Several cities already have
self-driving trains, including London, Singapore and Kuala Lumpur, and they
transport millions of people every day without incident:

[http://penguindreams.org/blog/self-driving-cars-will-not-
sol...](http://penguindreams.org/blog/self-driving-cars-will-not-solve-the-
transportation-problem/)

Self-driving car research is asinine. It won't even begin to solve the
transport problem in America and it's round-peg/square-hole tech. It has the
cool factor, but there is existing tech that is insanely more useful for a
fraction of the cost. Americans hate public transportation (thanks, GM) so
much that we'll never see real/good transit implemented in America.

~~~
rocqua
Rails don't work for the last mile. The great thing about cars is that they
don't suffer from the last mile problem.

It seems like a much better plan is to separate drivers from other people more
rigorously, and maybe mandate people drive the last-mile themselves (with AI
help, obviously).

~~~
CalRobert
For what it's worth, it doesn't really need to. Feet work well for the last
half mile or so, and a well-designed city can get stops this close.

~~~
Balgair
> ... and a well-designed city can...

As someone who lived in West LA for years and saw the new line come in, I'll
stop you right there. Cities like LA, Mexico City, San Jose, etc. are not
well designed, and trying to re-invent them is going to take one _hell_ of an
earthquake.

~~~
CalRobert
I used to live at 16th and Santa Monica and was really excited about the Expo
Line. I ended up moving before it showed up because it was many years late
(I'm glad it finally got built though!)

I used to ride a recumbent to UCLA now and then because I had a death wish,
and it shocked me that it took 25 minutes and yet people spent 60 minutes in
traffic in their cars every day just to get across the 405.

You're right, though, those cities are mostly lost causes, more for the
attitudes present in them than the infrastructure. Doesn't mean we can't try
to make any new ones better though.

~~~
Balgair
Oh man, the Expo line isn't synced with the lights, it's horrible. That and
the bums like to go crazy and sit on the rail lines, causing jams at the
street crossings. It's not that frequent, but it happens. It used to back up
to Bundy on a _bad_ day, now it's every day.

Riding a recumbent to UCLA from 16th is _hardcore_. Everyone I knew that rode
in west LA got hit at least once, if not more.

~~~
CalRobert
I did indeed get hit. I still have aching ribs some mornings. It's one of the
biggest reasons I moved away.

------
jakeogh
Calling these pre-programmed multi-MJ wheeled computers "self driving" is
foolish and deceptive.

------
abalone
It's likely this was actually a bicyclist, not a pedestrian.

The police are saying pedestrian walking outside the crosswalk but here's a
picture of the crumpled bicycle:
[https://twitter.com/daiwaka/status/975767445859287042](https://twitter.com/daiwaka/status/975767445859287042)

You can see it's next to a sign. That sign lets us determine exactly where it
happened on Google street view:
[https://www.google.com/maps/@33.4370667,-111.9430321,3a,75y,...](https://www.google.com/maps/@33.4370667,-111.9430321,3a,75y,132.91h,64.25t/data=!3m6!1e1!3m4!1ss_YmQlzA3ace0P2LVk4wwA!2e0!7i13312!8i6656)

It's exactly where the vehicle lane crosses the bike lane approaching the
intersection. Most likely the Uber and the bike collided when the Uber was
crossing into the right turn lane.

~~~
davesque
Isn't it reaching to see a picture of a bicycle near a sign and then conclude
that the accident must have happened exactly next to or near the sign? Maybe
the police/witnesses dragged the bicycle to that location?

~~~
abalone
Nah. The Uber likely stopped in its tracks and it's in the picture too. You
can see an official collecting evidence at the scene. The bicycle may have
been pulled onto the sidewalk but it would be the sidewalk near the accident,
not far away.

------
lopmotr
There's no indication that this is the fault of the car or its human driver.
Unfortunately, the media won't notice that. She stepped out in front of the
car and since she was struck immediately, it sounds like the car had no time
to stop or swerve, let alone time for the human driver to react in any way.

EDITed to remove offensive word.

~~~
dang
> rabble

Please follow the site guidelines when commenting here. Adding flamebait into
a flammable topic sparks explosions, making this place worse for everyone.

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

------
make3
Why am I not surprised Uber is the first company to kill someone? With their
top-notch ethics, ah. This actually makes me really angry, more than I had
anticipated.

------
justboxing
Have to quote Prof. Teddy here.

Context: How new technology is introduced on public streets and we have no
choice or say on whether we want it or not, even if it endangers us and gets
us killed.

> When motor vehicles were introduced they appeared to increase man’s freedom.
> They took no freedom away from the walking man, no one had to have an
> automobile if he didn’t want one, and anyone who did choose to buy an
> automobile could travel much faster and farther than a walking man. But the
> introduction of motorized transport soon changed society in such a way as to
> restrict greatly man’s freedom of locomotion. When automobiles became
> numerous, it became necessary to regulate their use extensively. In a car,
> especially in densely populated areas, one cannot just go where one likes at
> one’s own pace; one’s movement is governed by the flow of traffic and by
> various traffic laws.

and

> Even the walker’s freedom is now greatly restricted. In the city he
> continually has to stop to wait for traffic lights that are designed mainly
> to serve auto traffic. In the country, motor traffic makes it dangerous and
> unpleasant to walk along the highway. (Note this important point that we
> have just illustrated with the case of motorized transport: When a new item
> of technology is introduced as an option that an individual can accept or
> not as he chooses, it does not necessarily REMAIN optional. In many cases
> the new technology changes society in such a way that people eventually find
> themselves FORCED to use it.)

Source: The Manifesto: INDUSTRIAL SOCIETY AND ITS FUTURE
[http://www.washingtonpost.com/wp-
srv/national/longterm/unabo...](http://www.washingtonpost.com/wp-
srv/national/longterm/unabomber/manifesto.text.htm)

~~~
dwaltrip
This is a great point. Note, though, this is not new.

It's been happening since the invention of the first tool. Stone
weaponry, agriculture, and so forth, all became largely non-optional after
their introduction.

~~~
bobthepanda
The major difference, though, is that most major technologies are not also
accompanied by legislation that effectively penalizes the old methods. There's
nothing legally penalizing me from using a bow and arrow, or buying hand-woven
clothes, etc., whereas walking is effectively criminal in some parts of the
US: [https://www.cnn.com/2014/07/31/living/florida-mom-
arrested-s...](https://www.cnn.com/2014/07/31/living/florida-mom-arrested-son-
park/index.html)

~~~
ggggtez
>bow and arrow

Check again. Bows and arrows are often considered weapons at the same level as
firearms, and so many laws apply.

~~~
bobthepanda
Sure, but they're considered weapons and subject to weapons laws. Firearms are
not inherently privileged over a bow and arrow, the way that cars are legally
privileged over pedestrians in many cases.

------
rjromero
I’m pretty sure if Arizona’s lawmakers understood the underlying nature of
machine learning and autonomous driving, they wouldn’t allow self driving cars
to operate.

I don’t care how good your models and data are, until you’re able to write an
algorithm that can fully handle and learn from a situation it was completely
unprepared for, it will always encounter edge cases like this.

This is what happens when we have a society of people saying AI is already
here, and people like Elon Musk saying AI is a “significant threat”, despite
"AI" being essentially a black-box statistical model in its current form.
From what I’ve seen, the general public thinks AI is 20+ years further along
than it really is.

You need general AI before AI can drive a car, full stop. Otherwise, you need
to isolate roads from other human drivers, pedestrians, and cyclists. It just
won’t be reasonably safe until then.

~~~
ggggtez
>You need general AI before AI can drive a car

About 1.3 million people die worldwide in car crashes each year (roughly
37,000 of them in the US), and an additional 20-50 million are injured. You
don't need to be perfect, you just need to be better than the baseline.

That said, and to be fair to your feelings, Uber certainly is not better. This
is a fact I'm sure lawmakers in Arizona are discussing right now.

------
timmaah
The video mentions she was on a bike and shows a mangled bicycle. Also the
line markings on the street indicate there is a bike lane to the left of a
right turn lane. Definitely very bad.

[https://www.google.com/maps/@33.4369934,-111.9429875,3a,75y,...](https://www.google.com/maps/@33.4369934,-111.9429875,3a,75y,83.99h,72.74t/data=!3m6!1e1!3m4!1scUyILaxFs5z63AL2SupCJw!2e0!7i13312!8i6656)

~~~
conductr
Not to be insensitive to the death. But, that bike lane design is just
horrible. Are human drivers really known for respecting the "do not cross
solid white line" rules? 9/10 drivers that realize at the last moment they
need to turn right are going to quickly cross over this bike lane probably
without much time to check their blind spot. If I were cycling on this path,
I'd want to use the sidewalk and be extra cautious of peds; I don't trust
drivers, human or not.

~~~
mason55
There aren't a lot of options when you need to have cars turning on roads with
bike lanes. At some point the car lane and the bike lane have to cross paths
unless you're going to build a fully grade-separated bike lane.

I suppose you could have a wall that separates the bike lane from the road
except for a short area where cars can cross over, so that at least there's
only a small space where the bikes & cars can interact, but that introduces a
bunch of new problems (cars that can't see bikes behind the wall when they go
to cross over, cars that run into the wall, probably even more swerving to get
into the lane because your choices are do it RIGHT NOW or you're stuck/hit a
wall).

~~~
conductr
I think it'd be safer to keep the bikes in the far right even if they go
straight. Yes, it means they have to pay attention to cars turning right. They
might even need to stop until it's safe. This is a place where "bikes/peds
having right of way" doesn't really make sense.

~~~
InitialLastName
Users to the right have the right of way over users further towards the center
of the street. Thus, pedestrians crossing in the crosswalk have priority over
bicycles and cars turning right, and bicycles going straight have priority
over cars turning right.

~~~
conductr
I guess in a perfect world having right of way means you can cross the street
without looking left or right. I’m more willing to yield as a bike/ped because
my risk is higher and I know people are distracted idiots.

~~~
InitialLastName
Absolutely worth accepting the reality of drivers on phones as a threat to
safety. I'm just trying to explain the rules of the road from which
intersection design can be derived.

There are ways to design an intersection so that those rules are followed
naturally, and there are ways to miss it. C.f. right turn lanes.

------
propman
This is purely an emotional initial reaction to hearing this without knowing
most facts, but I just have to vent and for once be part of the outrage
culture and make emotionally charged statements before more info comes out.

Uber is ruining self-driving cars that could save 20k lives each year,
improve our economy, get millions out of poverty, and improve everyone's
quality of life, by being short-sighted, financially driven, and taking the
worst aspects of Silicon Valley to the extreme.

I am so sorry to everyone at Google and Waymo who have tried so hard to not
let anything like this happen by being extremely cautious and safe. Ughhhh I
feel so frustrated, this is going to cause so much backlash.

Travis, Anthony, and whoever else pushed for this to go ahead without caring
about safety are evil. This action isn't as directly evil as murdering
someone, but these highly intelligent individuals ignored their conscience
and the easy deduction that compromising safety, laws, and decency for quick
profit is immoral. Whether they admit it or not, they KNEW that this was a
high possibility and refused to take precautions, creating this awful
culture. 20k lives a year, and all the benefits that will be delayed for
years, are on their heads.

They won't face any consequences, just convince themselves of absolution of
guilt and do nothing to remedy the situation, but I have no respect for them.

~~~
Obi_Juan_Kenobi
> 20k lives each year

Road deaths are ~40k each year in the US alone.

I agree about Uber; smells like shit, looks like shit, tastes like shit, you
better believe it's shit to the core.

------
skookumchuck
If human driven cars kill 1000 people a year, and autonomous cars kill 100,
will it be politically unacceptable to switch to autonomous? I suspect it will
be.

~~~
awakeasleep
A portion of that depends on the funding on either side of the issue.

If the autonomous cars kill 1100 per year, but autonomous car companies fund a
bunch of studies saying the benefits outweigh the drawbacks, it'll be
politically acceptable.

~~~
skookumchuck
People think emotionally, not rationally. People have all sorts of greatly
distorted views on the risks of various activities.

------
remind_me_again
What I don't get is that if this is a test car, how come the driver did not
take over the helm immediately? The driver in that car should be on full
alert, as if driving the vehicle himself, and jump in as needed. "Test"
should not mean letting an autonomous car loose.

~~~
phyzome
Do you think you could maintain the required state of alertness, focus, and
engagement necessary to hit the brakes and turn the wheel after 30 minutes (or
whatever) of just sitting passively as the car drove you around?

I don't think the "safety drivers" do much other than provide liability-
shifting for the company.

~~~
gowld
the drivers are employed by the company, doing company work. They may
(misleadingly) shift the optics away from the automation, but they don't shift
the liability.

~~~
phyzome
Sure they do, when Uber (metaphorically) throws the driver under the bus for
failing in their duties as "safety driver". >:-)

------
dang
Could you please not use allcaps for emphasis in HN comments?

This is in the site guidelines:
[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html).

~~~
__abc
Updated. Thank you for pointing me towards that as I was currently unaware.

~~~
dang
Ok great. I've detached this subthread from
[https://news.ycombinator.com/item?id=16620865](https://news.ycombinator.com/item?id=16620865)
and marked it off-topic.

------
the_cat_kittles
could there be some way of emitting a signal on your person so cars never get
confused? seems like a pain in the ass, but maybe it would be worth it? cars
can know for sure there are people wherever the signal is emitted from? dunno
if its feasible, curious

------
df5t0rw
[unconstructive but still..] So the day I could be imprisoned because of a
coding error... like a segfault on a highway... has finally come :D Does the
disclaimer "no warranty of any kind" shield us from that kind of thing? ;) (I
code on some mainstream C libraries)

~~~
Jack000
you already can? Our lives depend on all kinds of software every day. In
Canada at least you'd need a P.Eng to work on stuff that directly endangers
lives.

[https://en.wikipedia.org/wiki/Therac-25](https://en.wikipedia.org/wiki/Therac-25)

~~~
alacombe
If I'm not mistaken, P.Eng are merely needed the rubber stamping, not to do
the grunt work. I doubt they'll go through all the specs, re-run all the
simulations, and verify all the parameters.

Also, where is your P.Eng be any useful to justify the learning coefficients
of your AI, or to protect about the cosmic ray bit-flip in the hardware your
proven correct software is running on ?

------
beedogs
Funny how this story has been pushed off the front page of HN prematurely.

------
vgf
I bet people at Volvo are feeling a bit anxious at the moment for that deal:

[https://www.media.volvocars.com/global/en-
us/media/pressrele...](https://www.media.volvocars.com/global/en-
us/media/pressreleases/194795/volvo-cars-and-uber-join-forces-to-develop-
autonomous-driving-cars)

------
gexla
Uber: 1 Pedestrians: 0 Game on.

~~~
dang
Please don't do this here.

------
bitxbitxbitcoin
On purpose to protect something "more valuable," or by accident?

~~~
xexers
Forgive me, I don't understand what you mean. Please elaborate.

~~~
mattlondon
I suspect they mean the trolley car problem: "you are either going to crash
into a group of school kids, or into a group of nuns. Who do you kill?"

I.e. did the Uber car kill this 1 pedestrian because it avoided killing 2
other people? Did the Uber car opt for the "least killing" in this situation?

I think this whole trolley car question - while a nice philosophical question
- is silly though. I don't think humans would do any better in an emergency
situation. In that split second when you suddenly find yourself in the "oh
sh-<IMPACT>" situation, do you have time to a) make a deliberate, reasoned &
fully-informed decision, and then b) control the vehicle effectively to follow
through on that decision? I doubt it. If you had a second or two to think it
through and then deliberately make a choice and steer towards that crowd of
nuns, you'd probably be able to entirely avoid the accident anyway.

More likely is you'd probably do what I am sure most people do which is gasp,
stamp on the brakes, shut their eyes and hope for the best ... assuming you
even had time to realise there was about to be a crash before it happened.
Some people might swerve, but I suspect they do that instinctively to avoid
something in their way, not as a _decision_ to hit something _else_.

I just don't think self-driving cars will get themselves into situations where
they have to choose who to kill in the first place. And even if they did get
into those situations, I really, really doubt it would be due to their actions,
and I'd certainly trust it to avoid the crash in the first place a whole lot
more than the average human driver in the same scenario.

Some could argue "Ah but yes computers are so fast that they CAN make that
informed decision about who to kill in milliseconds! So they have to make a
choice! The question stands!". I'd argue back that in those milliseconds
_before_ the crash was inevitable, they'd act to prevent the accident before a
human would even know what the hell was going on anyway. After that it just
starts getting into a game of who can conjure up the most ludicrous
hypothetical situation that rarely - if ever - happens in real life.

~~~
wjoe
The most obvious example I've seen is when the car (or indeed driver) has the
choice to endanger the passenger or a pedestrian.

For example, if a kid runs out into the road in front of the car, it can drive
into the kid, likely with no injuries to the passengers. Or it can swerve and
drive off of the road, or into another lane, likely saving the kid but putting
the passengers at much greater risk.

As you say, in this or any similar situation the human reaction is probably
going to be to instinctively brake or swerve without a chance to consider
options or consequences. I feel like a computer will have a few cycles to
spare to make decisions like that.

------
orliesaurus
Not to be the guy who says "I TOLD YOU SO" but it had to happen at some point,
sadly & unfortunately - if this is the first (known) case of A.I. controlled
cars killing humans, the number of casualties can only grow from now onwards

~~~
JumpCrisscross
> _if this is the first (known) case of A.I. controlled cars killing humans,
> the number of casualties can only grow from now onwards_

This is how counting works.

------
Avamander
My sympathies to the victim, but this post with the XKCD substitution plugin
made my sides leave orbit.

~~~
dang
Please don't do this here.

------
Peter-W
She stepped out in front of the car in the middle of the road, it's not like a
human driver would have done any better.

------
codecamper
Interesting nobody used the words jail or murder yet. Not even manslaughter.

Just because there was no driver does not keep it from being a crime. Do the
engineers go to jail or do the managers or the investors?

~~~
stordoff
In a typical crash, you don't jump straight to "this was
murder/manslaughter"/"who is going to jail". At a bare minimum, you have to
establish fault. I don't see why this case would be any different.

------
grecy
To play Devil's advocate, there were over 100 people killed in the US TODAY
from non-autonomous vehicles [1].

Obviously we'd need to know the number of miles driven in each category to get
a meaningful comparison, but let's all keep in mind a USA where _only_ one
person per day is killed on the roads would be a two order of magnitude
improvement over today.

[1]
[https://en.wikipedia.org/wiki/Motor_vehicle_fatality_rate_in...](https://en.wikipedia.org/wiki/Motor_vehicle_fatality_rate_in_U.S._by_year)
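As a rough sketch of that per-mile comparison (the death counts and mileage
below are illustrative assumptions, not official statistics):

```python
def fatalities_per_100m_miles(deaths, miles):
    """Deaths per 100 million vehicle miles traveled, the usual safety metric."""
    return deaths / miles * 100_000_000

# Human drivers, US: assume ~37,000 deaths over ~3.2 trillion miles in a year.
human_rate = fatalities_per_100m_miles(37_000, 3_200_000_000_000)

# Autonomous test fleets: 1 death over an assumed ~10 million total test miles.
av_rate = fatalities_per_100m_miles(1, 10_000_000)

print(f"human: {human_rate:.2f} per 100M miles")  # ≈ 1.16
print(f"AV:    {av_rate:.2f} per 100M miles")     # ≈ 10.00
```

Under these assumed numbers the autonomous rate comes out far higher simply
because the denominator is so small; the comparison only becomes meaningful
with vastly more autonomous miles.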

~~~
phyzome
If you look elsewhere in the comments, you'll see some numbers. Autonomous
vehicles are new enough that they come out as more dangerous even with one
death.

------
SuperNinKenDo
Given the number of fatal pedestrian hits by human drivers, I'd be interested
to see how Uber's self driving cars stack up against the apes.

Having said that, one concern one has with self-driving cars is that one can't
"reach an understanding" with them the same way pedestrians and drivers
regularly negotiate how they're going to behave with each other.

If one doesn't know one is dealing with a self-driving car, one might consider
certain situations safe, because they usually would be, but the car negotiates
and notices things differently than a human.

------
3pt14159
Look, I'm all for taking self-driving cars seriously. I've written about how
they're effectively WMDs if the software update computer gets owned. I've also
frequently commented about the unethical practices that Uber has taken in the
past.

That being said, let's not over-react. This is a single death. These are going
to happen. Right now self-driving cars are essentially teenagers learning how
to drive. En masse there is going to be a death here or there, but, also en
masse, we're going to have a permanently safer road once they've learned, if we
can secure them from cyber attack.

Try to avoid all-or-nothing thinking here. Definitely advocate for
regulations, and maybe Uber wasn't safe enough, but people are either going to
die from self-driving cars or they're going to die from self-driving cars
taking too long because of public outrage.

~~~
gamblor956
This didn't need to happen. It's indefensible that it did.

And there's still no proof that self-driving cars are safer than humans when
confronted with novel situations, i.e., the situations most likely to result
in injuries and fatalities. If anything, the evidence thus far is that self-
driving cars are more dangerous than ones driven by humans.

~~~
Axsuul
How are we to judge that self-driving cars are more dangerous from simply one
death?

The proper comparison to be made should be between miles driven from
autonomous vehicles vs. those of humans and then looking at the incident
rates.

~~~
3pt14159
The research I've seen is that for areas with good conditions year round
they're currently safer than a subset of the population (those in at least one
automotive accident in the past 5 years, or something like that).

Still not better than your average driver, but even if we start by replacing
alcoholics and routinely poor drivers the net impact on safety goes up.

------
downandout
The title here should be amended to “...hits, kills _jaywalking_
pedestrian...”. Let’s get the facts straight. This person was walking outside
the crosswalk. The fact that a computer was behind the wheel is irrelevant. A
human broke a law designed to prevent this exact scenario, and the result was
both predictable and tragic. It’s terrible, but ultimately the person’s fault.

~~~
danso
It's "predictable" that jaywalking will lead to death? That a killed/injured
pedestrian was jaywalking can be used in defending the motorist in court. But
the vehicle code and road infrastructure does not assume that `jaywalking ==
death sentence`. Drivers deal with jaywalking every single day -- if death was
to be expected, driving and jaywalking laws and regulations would be much,
much different than they are now.

The fact that a computer was behind the wheel is the most important factor in
this discussion, because the argument for self-driving vehicles is that they,
unlike humans, can react to abnormal situations such as jaywalking.

~~~
downandout
It’s predictable that when you (illegally) put yourself in harm’s way, you
might be harmed or even killed. This was a terrible outcome, but it’s no more
the fault of the computer that was driving the car than it would have been if
it were a human driver. If I go walking into traffic, as a logical person, I
fully expect to be hit and injured or killed. The most important laws where
jaywalking is concerned are not _traffic_ laws, but rather the laws of
_physics_.

Thankfully this is not the outcome of each and every jaywalking incident, but
it is a _possibility_ with all of them. That’s the risk you consciously accept
when you walk in front of moving vehicles that weigh thousands of pounds.

------
femto
> “They are going to attempt to try to find who was possibly at fault and how
> we can better be safe, whether it’s pedestrians or whether it’s the vehicle
> itself.”

Autonomous vehicles should automatically be 100% liable in every accident in
which a person is injured, and it should be paid for by a levy on every
vehicle.

A levy is fair because the per vehicle risk is much more uniform than with a
driver behind the wheel.

100% liability is fair because the software should be written so that the
vehicle predicts the path of all objects within its range and adjusts its
speed/trajectory so that there is negligible risk of collision given the worst
case behaviour of the object. If an object doesn't show up on the car's
sensors, then it's still the car's fault, as the sensors should be built so
that they do pick up all objects. There is a complete power imbalance
otherwise: software cannot
be injured in a collision but people can, so all people should be 100% covered
to give the software incentive to be the best it can.

Summary: a car is governed by predictable physics, hence there is no such
thing as an accident involving an autonomous vehicle, only mistakes.
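That "negligible risk given worst-case behaviour" rule can be sketched as a
simple kinematic check; the braking deceleration and reaction delay below are
illustrative assumptions, not figures from any real vehicle:

```python
import math

def max_safe_speed(clearance_m, brake_decel=7.0, reaction_s=0.2):
    """
    Highest speed (m/s) at which the vehicle can still stop within
    `clearance_m` metres, given a braking deceleration (m/s^2) and a small
    sensing/actuation delay. Solves v*t + v^2/(2a) = d for v.
    """
    a, t, d = brake_decel, reaction_s, clearance_m
    # Quadratic in v: v^2/(2a) + v*t - d = 0  ->  v = a*(-t + sqrt(t^2 + 2d/a))
    return a * (-t + math.sqrt(t * t + 2 * d / a))

# A pedestrian 25 m ahead who could step into the lane at any moment:
v = max_safe_speed(25.0)
print(f"{v:.1f} m/s ({v * 3.6:.0f} km/h)")  # ≈ 17.4 m/s (62 km/h)
```

Under these assumed parameters, 25 m of clearance to the nearest person is
roughly consistent with the 60 km/h figure argued below; less clearance forces
proportionally lower speeds.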

~~~
JimboOmega
Given current road and vehicle design, that's an incredibly impractical goal.
There simply isn't enough separation between pedestrians and roadways; to keep
injury risk negligible you'd also have to keep speed incredibly low (way below
speed limits) on all but the best limited access highways.

If the vehicle can only crawl at 5MPH because you never know when a pedestrian
might just dive in front of you... that's just impractical for real vehicle
operation.

It's one thing to hit the brakes because somebody decided to jaywalk without
looking; another to expect there is always enough space to stop in those
situations.

~~~
femto
I disagree. Pedestrian density is generally low enough that the car could be
doing 60 km/h and not hit a "worst case" pedestrian.

For example, how many people do you see walking along freeways? If a person is
on a freeway then cars should be slowing down for them. It's a rare enough
event that cars could be programmed to avoid all collisions with minimal
impact on average speeds.

If a car is around people then it should be going at a low speed. If there are
enough people around that it can't get speed up, then that car is effectively
in a "shared" or "local traffic" area and should be going slowly.

I'd argue that the effect on average speeds will be similar to having low
speed limits on residential streets: a slower period at journey beginning/end
but the majority of the journey would be at speed on sparsely populated
arterial roads.

Here's another argument: intrinsically safe programming might even _increase_
average speeds? Why do we even need traffic lights and pedestrian crossings
with autonomous vehicles? If vehicles were 100% safe, we could do away with
dedicated crossings, and there would be no need for vehicles to waste time at
red lights, merely slowing down on the relatively rare times when it is
necessary.

------
cwbrixey
The problem is likely a failed ABS brake, and on all cars. When the brake is
applied and then fails, there is less than 5 seconds to a crash and a likely
fatality. No time to react. Having experienced the brake failure on two cars,
I have tried to get the attention of NHTSA for a decade. NHTSA has a lock on
faulty research that blames an innocent driver for all accidents instead of
finding the correct root cause of the crash. My estimate is that 50% to 70%
of all fatal crashes are caused by a defective ABS brake. Finally, with no
driver to blame, NHTSA will have to solve the correct fail mode for these
crashes. The Congressional sudden-acceleration investigation found nothing to
make the roads safer, Toyotas have SA crashes today, and the certified
accident investigator only logs items to blame the innocent driver. Unless
the braking technology is totally changed in self-driving cars, I expect
daily crashes with no one to blame. I have sent Elon Musk letters that the
Tesla has ABS brake issues causing crashes, but their engineers join in
blaming the driver and do not fix the problem. Accident investigators are
clueless. Lawyers and DAs are clueless. News reporters are clueless. NHTSA is
clueless. NTSB is clueless. Carnage on the roads goes on unabated and we
convict an innocent driver.

------
Jeff_Brown
On the bright side, with an autonomous vehicle, an accident is a learning
experience that will benefit other autonomous vehicles.

------
nodesocket
Striking a pedestrian in a crosswalk is horrible and terrible news. The
reality, though, is how many pedestrians are hit by manual drivers vs.
autonomous drivers? On a percentage basis I gotta believe autonomous cars are
orders of magnitude safer. Self-driving cars aren't going to be perfect. How
many people lost their lives to machines in early factories during the
industrial revolution? Imagine if they had pulled the plugs back then.

~~~
bkohlmann
Doesn't mitigate the tragedy or implications for autonomy, but per the
article, the pedestrian was _outside_ the crosswalk.

~~~
SllX
Does it really need to be said that an autonomous vehicle still needs to
manage to not kill people regardless of their position relative to any nearby
crosswalks?

Today you can cross in the middle of the street and take a calculated risk
that the people approaching down the street will stop or slow down enough to
not kill you. If they go ahead and kill you anyway, they'll still be
prosecuted because their attention should be on the road, and the most basic
assumption behind their driver's license is that this person is competent
enough and fit enough to drive what amounts to a deadly weapon without killing
the people around them. When you violate that assumption, the fault is
probably on you. Obviously there are exceptions, such as when somebody
intentionally jumps in front of your vehicle, but crossing outside of the
crosswalk to get across the street is not even close to the same level as
trying to commit suicide and fault will be found accordingly.

~~~
smsm42
> Does it really need to be said that an autonomous vehicle still needs to
> manage to not kill people regardless of their position relative to any
> nearby crosswalks?

If you require 100% impossibility of killing anybody regardless of what the
vehicle and the person are doing - it is achievable only by making those
vehicles nearly useless - such as lowering their max speed to something like
10 mph (maybe even lower, since it's still possible to push a person who will
slip, hit their head on the pavement, and die). If the vehicle is moving fast
and somebody jumps onto the street, there are physical limitations to what
can be done. So, if this technology is to exist, there will always be a space
where accidents can - and eventually will - happen.

