
Driverless shuttle in Las Vegas gets in fender bender within an hour - watchdogtimer
https://techcrunch.com/2017/11/08/driverless-shuttle-in-las-vegas-gets-in-fender-bender-within-an-hour/
======
rjmunro
"Truck reverses into stationary object" is not really news.

I think there are too many possible risks in programming the vehicle to do
anything except stop dead in an emergency situation - you rapidly get into
very complex programming with all kinds of failure modes - e.g. what if the
problem was a faulty sensor? The vehicle might try to avoid a non-existent
threat by crashing into something else. Stopping is the safe thing to do. If
you want to react, you rapidly have to make lots of moral decisions, like the
Trolley Problem.

Briefly sounding a horn in an unexpected emergency stop situation is probably
a good idea, particularly if it's a white noise type thing, rather than a
siren, so that other road users can localise its source quickly.

~~~
pbhjpbhj
A horn needs to be a horn. Add a white noise generator too, but the recognised
noise - the one people are "programmed" to react to - is a car horn.

Aside: emergency vehicles in the UK have stopped using the broad-band noise
sirens AFAICT. I don't know why, but the reason may be pertinent to any
attempts to use it here. Actually, they often ride without the siren at all,
which is completely ridiculous to me; they put it on at the junction, meaning
there's no time to get out of the way, whereas if they used it continually you
could hear them coming.

~~~
mathw
They're trained to look ahead, anticipate and adapt to the road conditions,
judging when they need the siren and when they don't. This is important in
built-up areas, because there's no gain in waking up people at 2am when you
didn't actually need a siren to proceed safely.

When they're putting it on at a junction and you're already in it, they're not
expecting you to magically disappear out of their way. They're expecting you
to look at where they're going and stay out of their path. This might involve
proceeding as you were going to in order to clear it for them. What they
really want is to stop anybody else from entering it until they've got
through.

~~~
cicero
The argument for self-driving cars is that they will be better than a human
driver. A human driver would back up, sound the horn, or both. If a self-
driving car doesn't attempt these things, something is deficient.

~~~
fishcolorbrick
Alternatively - these are just growing pains until the human driving the truck
is replaced by an algorithm.

~~~
AlexandrB
So basically, self driving cars will be worse than human-driven cars until ALL
cars are self driving cars? This is the first fully autonomous version of "No
True Scotsman" that I've seen!

~~~
Voyage_wanderer
This is the reason I believe the self-driving thing is doomed. A significant
number of humans will refuse to stop driving. A fair number of them will make
life unbearable for self-driving things. A sizable part on purpose, just to
check how good the computer is.

Seriously speaking, from the perspective of a truck driver:

The reasons the truck driver backed into the thing are: he couldn't see it,
and it wasn't there when he started.

A horn and a flash of lights would stop a human driver. When I back up a
truck I use all my senses, since I can't see much. The easiest fix for the
robot thingy: add a horn and lights. (And wait until all the residents start
complaining.)

~~~
jannyfer
I read a sentiment that I found myself nodding to: people will stop driving
when it’s convenient for them.

Some people love driving, and some people love riding horses, but you don’t
see many horses on city streets today.

------
gvb
It has no horn?!

"What Makes for a Street Legal Vehicle?"

 _Horn – It may not seem the most important piece of safety equipment, and
many big cities even limit how it can be used, but to be street legal every
vehicle must have a horn that is audible for at least 200 feet. The horn can
generally be any note or sound (even ones that play musical tunes are usually
permitted), so long as the minimum volume requirements are met._

Ref:
[https://www.hg.org/article.asp?id=31563](https://www.hg.org/article.asp?id=31563)

~~~
pbhjpbhj
So, are they going to take the driver to court?

Sarcasm aside, who is deemed to be in control? Who do you sue for the damage
if this vehicle crashes into you - isn't it the manufacturer that caused the
crash? (Not necessarily in this case; I'm talking generally.)

~~~
ams6110
> who do you sue for the damage if this vehicle crashes into you

Generally, the owner of the at-fault vehicle is responsible.

~~~
pbhjpbhj
I own my vehicle; if you drive it and cause an accident, aren't you primarily
at fault?

Assuming normal circumstances (so not considering things like me letting a
child drive, or letting you drive drunk, etc.), the person or persons in
control of the vehicle are surely at fault - being the owner doesn't give you
the power to alter the programming, so you are powerless to avoid a crash.

The company directors, who instruct the system's programmers, are the ones
able to avoid the crash by making the vehicle operate effectively.

There may be circumstances where a crash could not be avoided, but there are
going to be plenty where alternative actions - for which there is a strong
rationale - would have avoided the crash.

~~~
icebraining
I agree, it should be considered a part with a manufacturing defect, not the
owner's fault.

------
odammit
Ugh. This really worries me - not because I’m afraid of driverless cars but
because this is the kind of “news” headline that will get anti-driverless car
jerks all up in a righteous tizzy.

There shouldn't be a title at all, because "doofus driving truck backs into
something" isn't news.

~~~
eloff
Except the driver-less shuttle didn't take any evasive action, the way a human
would have. Did it even honk the horn?

Clearly this company's tech isn't up to par.

~~~
dragonwriter
> Except the driver-less shuttle didn't take any evasive action, the way a
> human would have.

Stopping _was_ an evasive action, and since the result was a graze, it may
have been moderately successful. While the people on the bus seem (from the
article) to think there was time for more action, the perception of time under
the adrenaline rush of an imminent collision may be misleading.

~~~
phil21
> the perception of time under the adrenaline rush of an imminent collision
> may be misleading.

This may be one of the more ridiculous excuses I've seen for self driving
vehicle "screwups".

Is the argument actually that computers can't take action as instantly as a
human? If so, then self-driving cars may as well not exist.

It's well within a computer's capability to take evasive action in an
accident like this _far_ sooner than a human driver - assuming sufficient
sensor input. This is exactly the reason I want to see self-driving tech
rapidly accelerate - a car that has its situational awareness at 100% at all
times, and thus can react instantly and safely to any situation.

I understand the software is not up to that point yet - and your response is
exactly what I fear the most. I see endless DoS attacks against self driving
cars once they get even a remotely substantial percentage of the market.

If all programming reverts to "failsafe stop when things may be remotely
potentially unsafe" we are going to see traffic problems of a proportion we
have never experienced.

~~~
thethirdone
>> the perception of time under the adrenaline rush of an imminent collision
may be misleading.

>This may be one of the more ridiculous excuses I've seen for self driving
vehicle "screwups".

>Is the argument actually computers can't take action instantly compared to a
human? If so, then self driving cars may as well not exist.

No, I don't think that's what the parent comment meant. It means that the
people who said that it didn't take proper action (that were there) may have
been mistaken in thinking there was more time to act than there was.

------
crazygringo
This feels like a _really_ tricky question, actually -- what to do when a
moving object is headed towards a stopped self-driving car?

It's easy to say the car should be smart enough to move -- but what if, as it
moves in one direction, the object (like a truck trying to avoid the car)
suddenly swerves in that direction too? Then does the car become responsible
for the collision?

And of course, it feels like there could be a real-world version of the
trolley problem [1] -- what if there are 5 occupants in the vehicle who will
be killed by an oncoming truck, but in the only direction where it can move
out of the way, it will have to run over a single pedestrian?

Glad I'm not the one having to make these kinds of programming decisions.

[1]
[https://en.wikipedia.org/wiki/Trolley_problem](https://en.wikipedia.org/wiki/Trolley_problem)

~~~
pjc50
There was someone on twitter who very effectively demolished all this "trolley
problem" speculation by pointing out two things:

- the knowledge of the world held by cars is partial and probabilistic. As it
is with real humans, only with very different, obscure failure modes (see for
example the "adversarial images").

- if you get into this situation, something has already failed. Either in
your safety practice or someone else's. That reduces the predictability of the
situation and also the predictability of what your actions will do in that
case.

Put those two together and you realise that any kind of "deliberate" (in the
sense that one can impart "intent" to a computer system?) running over of a
pedestrian is indefensible, and that real situations don't admit of neat
counter-factuals like this.

~~~
corobo
Yeah honestly I'd rather cars _not_ have anything to do with trolley problems
built in. Build in "Don't hit objects" with a fallback of "hit the brakes,
horn, and do the robot equivalent of pray" if there's no swerve options
available.

My main concern with putting in code to handle trolley problems is what
happens if (when) this code is tripped as a false positive? Some person
walking along minding their own business when an autocar swerves into them
because something on its radar looked like a crowd of people teleported in
front of it
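That "don't hit objects, else brake/horn/pray" hierarchy could be sketched
roughly like this (purely illustrative; every name and field here is invented,
not from any real autopilot API):

```python
# Illustrative sketch of a simple fallback hierarchy: continue if clear,
# swerve onto a clear path if one exists, otherwise brake hard and honk.

def choose_action(swerve_paths, obstacle_ahead):
    """Return the simplest safe response given candidate swerve paths."""
    if not obstacle_ahead:
        return "continue"
    clear = [p for p in swerve_paths if p["clear"]]
    if clear:
        # Swerve onto the clear path with the most room to spare.
        best = max(clear, key=lambda p: p["clearance_m"])
        return "swerve:" + best["name"]
    # No swerve option: brake hard, sound the horn, and hope.
    return "brake_and_horn"

print(choose_action(
    [{"name": "left", "clear": False, "clearance_m": 0.0},
     {"name": "right", "clear": True, "clearance_m": 1.5}],
    obstacle_ahead=True,
))  # swerve:right
```

Note there is no "choose whom to hit" branch anywhere - the false-positive
worry above is exactly about what would happen if one were added.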

~~~
Piskvorrr
Sorry, that's just repackaging the problem and pretending it's fixed.

The trolley problem _is_ "there's something everywhere inside my stopping
distance; we will inevitably hit _something_"; yours is only one possible
solution to it: "Don't hit objects [evaluated as negative]" with a fallback of
"hit the brakes, horn, and do the robot equivalent of pray [OK, going to hit
whatever is in front instead of what's front-left or front-right]".

TL;DR: Saying "let's have this one hardcoded solution to the trolley problem"
does not make the problem disappear.

~~~
bsder
> The trolley problem is "there's something everywhere inside my stopping
> distance, we will inevitably hit something"

You assume humans make correct decisions when faced with that situation. Most
of the time humans make incorrect decisions when faced with far less
problematic issues.

"Swerve" is normally more reflex than thought. The number of times that
drivers "swerve" to avoid something like an animal and then crash headfirst
into oncoming traffic, thereby injuring humans, is non-trivial.

~~~
Piskvorrr
Do I? Nowhere did I bring that up. The TP is a model to _compare_ the possible
courses of action and their outcomes - and yes, human reflexes are not
optimized for driving, thus the response is often "crash into other
traffic/wayside obstruction", or even "oversteer, flip the car and become
briefly airborne". I do agree that "just hit whatever is straight ahead"
_could_ be the optimal strategy in many cases; I'm not sure it makes sense to
hardcode it.

------
michaelbuckbee
Saved a click: it stopped. A human controlled truck did not and grazed the
SDV.

~~~
radiorental
Could be a bit more involved than that.

The machine acted in a way that was not expected by the human driver.

This gets at a fundamental issue we will have in the transition to the utopian
driverless future: learning to identify and react to this new class of driver.

~~~
spyspy
Really moving the goalposts here. Apparently _driving safely_ is an
"unexpected" behavior.

~~~
jacobush
Was it driving safely, then?

~~~
michaelbuckbee
The shuttle was literally not moving and was struck by the human controlled
truck.

~~~
mtgx
Isn't it illegal to do stuff like "blocking an intersection" for instance?

What will a self-driving car do if for instance someone is driving in the
opposite direction, towards it? Will it just continue driving, freeze, or get
out of the way? Humans would likely get out of the way, even if that means
"breaking the law" by going off-road or whatever. A self-driving car probably
wouldn't do that, thus it would be the kind of "unexpected driving" that could
really throw humans off, and may even cause _more_ accidents than humans, in
certain specific situations.

~~~
jacobush
Also, there are even deeper levels here. Going off-road in such a situation
may not _actually_ be against the law. It may look like it on the books, but
when such a case is interpreted by judges and peers, they will certainly in
some cases decide that swerving or going off-road was perfectly legal in that
situation.

~~~
mr_toad
Judges might forgive a human who drove onto a sidewalk to avoid an accident,
but they're not likely to be as forgiving of a manufacturer that programmed
their cars to do the same thing.

------
Digory
I'd say the shuttle's insurer should be partly responsible here. The Engadget
article seems to show the shuttle stopping in the truck's blindspot.[0]

A reasonable truck driver would assume the driver of the other car would back
up a little bit. But in this case, the car stopped where the truck couldn't
see whether it was backing up or not, and wasn't programmed to understand the
truck's movements at this angle.

So, the software was responsible for:

1. Stopping too close.

2. Stopping in the blind spot of a truck.

3. Having no horn.

4. Not backing up or understanding fairly common motion by a truck.

[0] [https://www.engadget.com/2017/11/09/las-vegas-self-driving-s...](https://www.engadget.com/2017/11/09/las-vegas-self-driving-shuttle-bus-crash/)

~~~
wavefunction
The truck driver was cited for 'illegal backing' so they were not exactly
operating in a reasonable manner themselves.

~~~
Digory
True. Nevada appears to be a contributory negligence state, so the truck
driver might be out of luck if that makes him mostly responsible. I guess we
don't know if the Shuttle was, or could be, ticketed for something.

In most 'no-fault' states, though, I think the shuttle would shoulder some of
the responsibility.

------
justboxing
> Now, it must be said that technically the robo-car was not at fault. It was
> struck by a semi that was backing up, and really just grazed — none of the
> passengers was hurt.

I get that it wasn't the Driverless car's fault, but this brings up an
important use case that the driverless cars currently don't seem to be able to
handle.

In an all-human situation, what would've happened is that the parked car (if
it had someone inside) would honk at the car trying to back into it, or the
occupant would open the door and yell at the person backing up, and the
accident would be avoided.

The driverless car doesn't (or didn't) honk even if it does detect something
backing into it. Hence the accident.

------
throwaway929292
I fully expect every death caused by a self-driving car to make national
headlines in the next few years. Hysteria, whether justified or not, will set
in, and legislation will pass banning autonomous vehicles in the US.

Self-driving cars may have a faster reaction time, but they will never reach
the level of human awareness of their surroundings while driving.

Let's see a self-driving car navigate through a construction zone, watch for
instructions from a police officer who is directing traffic, or stop when kids
are playing baseball in a yard and the ball rolls across the street.
Answering, "well they'll have that capability someday" isn't a very compelling
answer. Truly self-driving cars are dependent on technology that simply hasn't
been invented yet.

How nice it will be sitting behind a fleet of self-driving cars dragging their
asses down the highway at exactly the speed limit, or slamming on the brakes
when a leaf flies in front of the sensors.

In addition to that, you think masses of people will be silent while losing
their jobs because these robot overlords are taking the wheel?

Source: I work for a self-driving car startup.

~~~
tghw
> Self-driving cars may have a faster reaction time, but they will never reach
> the level of human awareness of their surroundings while driving.

Self driving cars can simultaneously see in 360 degrees with multiple types of
sensors. They already see better than humans. Now they just need to understand
better, which doesn't seem far off.

Maybe you just work for a bad self-driving car startup?

~~~
Piskvorrr
Doesn't _seem_ far off. That is a fitting description - anything looks easy
from the sidelines; computers understanding context, specifically, has seemed
to be just around the corner for half a century now.

------
throw2016
Surely something as basic as typical traffic patterns in a town has been
simulated? A vehicle backing toward you is about as basic and mundane an
everyday traffic event as you will get.

It's in self-driving proponents' own best interest to have stringent
standards, because if the public loses faith, it's going to be an uphill
battle.

Simply demonizing human drivers and hand-waving away errors is too self-
serving to work.

------
forapurpose
> A City of Las Vegas representative issued a statement that the shuttle “did
> what it was supposed to do, in that its sensors registered the truck and the
> shuttle stopped to avoid the accident.” It also claims, lamely, that “Had
> the truck had the same sensing equipment that the shuttle has the accident
> would have been avoided.”

Note a subtle shift, with government now shilling for the driverless vehicles.
That's the first time I've seen it; I suspect it won't be the last.

------
noonespecial
These crashes seem to be caused by the "weird" way self-driving machines
behave as compared to humans.

Perhaps we need some sort of placard that is legally mandated and easily
visible (like the "student driver" placard) to let people around these
vehicles be aware of them and expect different behaviors than from "ordinary"
drivers.

~~~
Harvey-Specter
What's weird about the actions of the self-driving vehicle in this case?

It detected that a truck was backing into its path, and stopped in an attempt
to prevent a collision. The truck driver didn't properly check his mirrors and
continued backing up, which resulted in the truck 'grazing' the front bumper
of the car.

What exactly is a human supposed to do differently in that situation? Attempt
to continue driving straight and potentially end up being t-boned by the truck
instead?

Also, this vehicle looks super weird already. If the truck driver didn't see
it, a 'placard' isn't going to make it more noticeable.

~~~
friedButter
>What exactly is a human supposed to do differently in that situation? Attempt
to continue driving straight and potentially end up being t-boned by the truck
instead?

Honk

------
unityByFreedom
So the shuttle stayed still while a truck backed into it.

Apparently, simply having the ability to stop does not make a self driving car
better than a human. It should evade, or at least honk.

~~~
gonyea
On the flip side, attempting to evade situations can cause greater
catastrophe. I’d rather they wire up the horn and call it done.

~~~
rmah
I'd rather the software sense its surroundings and make an assessment of the
risk of various possible actions. In this case, backing up a bit, since there
was no one behind it, would have been the safest option.
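A toy sketch of that kind of risk-weighted action selection (the candidate
actions and risk numbers below are entirely made up for illustration; a real
system would derive them from rich sensor data):

```python
# Toy illustration of scoring candidate actions by estimated collision risk
# and choosing the least risky one. Nothing here comes from a real system.

def least_risky(candidates):
    """Pick the candidate action with the lowest estimated collision risk."""
    return min(candidates, key=lambda c: c["risk"])

candidates = [
    {"action": "stay_stopped", "risk": 0.6},  # truck is still closing in
    {"action": "back_up",      "risk": 0.1},  # nothing detected behind
    {"action": "pull_forward", "risk": 0.9},  # would cross the truck's path
]
print(least_risky(candidates)["action"])  # back_up
```

With risk estimates like these, backing up wins; the hard part in practice is
producing trustworthy risk numbers, not the selection step.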

~~~
radix07
And what if the truck keeps backing into it? Does the car just keep avoiding
it forever? When is the truck driver expected to take responsibility for doing
something technically illegal?

------
d0100
> It also claims, lamely, that

What kind of writing is that?

~~~
smsm42
The modern journalism kind, where you do not trust the reader to read the
information and draw their own conclusion, but instead must shove your
viewpoint down their throat, for their own good of course.

------
Overtonwindow
Not only does it not seem to have a horn, but it can't back up... I think
those might be features the engineers should explore.

~~~
radix07
To what end should a self driving vehicle back up or avoid a collision? You
could essentially hijack a self driving car by boxing it in if it avoided all
obstacles and impacts. Stopping seems to be the normal and best reaction to
these sorts of illegal actions. At some point the driver in the wrong will
need to take responsibility for doing something illegal.

~~~
filesystem
Have you ever seen a car try to go through a yellow light when it turns red
just a hair too soon, and the driver decides to hit the brakes? Often the car
ends up well into the intersection and has to back up, forcing the cars behind
it to do so as well. I'm not defending the actions of the driver who ends up
in the intersection - they are obviously in the wrong. However, a car stopped
in the middle of an intersection is a dangerous situation; the other human
drivers recognize that and reluctantly back up.

Driverless software must recognize situations like this and adapt in order to
win the future. We will not go from manually operated cars to 100% driverless
cars overnight. They will have to prove their worth and/or superiority
alongside human drivers for a long time.

~~~
radix07
I have been on both ends of this situation: I've hit a car behind me while
trying to evade a car backing into me, and I've been backed into while
stationary at a light without time to respond. In the end, it's best not to
back into traffic regardless. You just can't depend on the traffic behind you
to back up, especially if it is human-driven, so why would we expect an
automated car to?

It's not the responsibility of vehicles behind that car in the intersection to
fix things, human or otherwise.

You could also argue that a self driving car wouldn't be going as fast behind
a car chasing a yellow light and could allow more room to handle this
situation before it gets to that point.

------
blhack
A semi truck backed into it.

If anything this just highlights to me how much commercial vehicles need self
driving tech (I guess I, lamely, agree with the mayor).

My office used to be in an industrial district. The trucks there scare me.
They absolutely do not follow the rules of the road, and it's dangerous for
everybody around them.

Stuff like this: just backing up and expecting everybody to move out of their
way, or taking a turn too tight and expecting everybody at the light to back
up, were almost daily occurrences.

Yeah, self drivers need to account for this, but the bigger problem IMHO is
getting those trucks and their drivers either off of the road or in compliance
with driving laws.

------
smsm42
Better title: "Human driver hits self-driving shuttle within an hour of it
appearing on the streets".

------
exabrial
I have another question, how does the vehicle detect that it was in a
collision? Does it have impact sensors?

------
upofadown
Hard to judge this without seeing it. A human driver might have been able to
anticipate sooner what the truck was eventually going to do, so as to find a
better place to stop. A human driver understands the difference between a
truck backing up blind and other vehicles.

------
BanzaiTokyo
it will be impossible to park another car close to a self driving vehicle - it
will always move away.

~~~
dominotw
can't wait to bully a self driving car for a parking spot :).

------
CodeWriter23
What I want to know is if the shuttle automatically called AAA for assistance.

