
Google says self-driving car hits municipal bus in minor crash - sajal83
http://www.reuters.com/article/google-selfdrivingcar-idUSL2N1681DP
======
Gratsby
I live in the area. The stinking buses around here ... They crowd the lanes
regularly. We human drivers, I suppose, are used to it, but it is a serious pain
in the butt.

The buses will drive right next to the lane marker, with their mirrors hanging
over into your lane. This makes it so everybody has to creep over a little bit
into the driver's-side lane and hope everybody in traffic has a small
enough car to deal with it. Otherwise, you have to hang back behind the bus as
if it's in your lane and wait for it to get to a stop.

I've had my issues with Google autonomous cars (they drive slowly, and they used
to be exceptionally slow at making right-hand turns, causing traffic
problems), but in this instance I'm happy to throw VTA under the bus, if you
will, and lay blame 100% at their feet.

~~~
nicholas73
I especially loathe the double-length buses. Seriously, why? Not like these
buses get all that full. Just run them more often at peak hours.

~~~
konspence
That costs far more (in fuel, maintenance, and drivers), and typically places
that have articulated buses _warrant_ having articulated buses.

~~~
freshyill
And some places even run smaller buses. Here in the DC area, we have
articulated buses, standard ones, and even smaller ones—about 60 percent of
the size of a standard bus. It's rare to get the ideal size at the right time
in an area of such bad traffic, but they do make an attempt.

~~~
aidenn0
In the small city I live in, we have small buses, which are sized right for the
ridership. Unfortunately their top speed is about 26 MPH on level ground, and my
commute includes a steep freeway overpass, where that drops to about 18 MPH.
The speed limit on that stretch of road is 25 MPH, and most traffic goes about
35 MPH on the overpass, as there is nowhere for a speed trap to be set. I can
tell from 4 blocks away when one of those buses has been there recently.

------
cpprototypes
There is a common assumption that humans = bad drivers, and that these
self-driving cars will be much better than people at driving. I think this belief
greatly underestimates the difficulty of what Google and others are trying to
do.

Humans are actually great drivers when the situation requires thinking. Lots of
snow and you can't see the lane markers? Millions of people adapt to this every
day during winter. Lots of pedestrians, bicycles, motorcycles, etc. doing
somewhat unpredictable things? Again, look at any Asian megacity: people
adapt very fast. Basically, if the situation requires being alert, people are
very good at driving.

When are people bad at driving? Whenever it's monotonous. Bumper-to-bumper
traffic or a free-flowing freeway. Constant repetition of red/green cycles
while going down a suburban street. These boring situations make a lot of
people basically turn off their alert thinking. Then they do other things, like
texting, talking on the phone, etc.

And these boring situations are exactly where AI self driving is better at
driving than people. Computers never get bored. The self driving car will be
at 100% attention even during the most boring traffic. But when boring
suddenly turns to not boring? The current state of AI is very, very bad at
this.

And unfortunately there's no easy way to use the best of both sides. If the AI
is fully driving, then it will do great while the driving is boring. But by
the time the AI decides "this is too much, I can't handle this", it's too late
to alert the human driver to take over. But if the human driver is required to
always pay attention, what's the point of the self-driving car?

Self driving cars will get there someday, but I think it's much farther away
than many assume it will be.

~~~
roymurdock
I have a $100 bet going with a coworker at the office.

We defined self-driving cars as having arrived when I can pull out my phone in
any of at least 5 major metropolitan areas around the world, order a driverless
car, have it pick me up and deliver me to a specified location within the city
in a timely fashion, with as little risk as getting in a taxi.

He says sub-20 years. I'm thinking more like the 40-50 mark.

My thesis? We can solve the technical challenges, but so much physical, legal,
and regulatory infrastructure needs to change to make this viable that it will
be more than 2 decades until we see self-driving cars as a reality across
multiple cities. Of course, I would be more than happy to lose the $100 as it
would make everyone's lives better that much sooner. But I'm skeptical.

~~~
smileysteve
> He says sub-20 years. I'm thinking more like the 40-50 mark.

Perspective:

* 60 years ago, we didn't have the Interstate Highway System.

* 45 years ago, seat belts weren't required.

* 40 years ago, we expected engines to last tens of thousands of miles. We didn't really understand crumple zones. Open containers, and even drinking while driving, were still legal in many states.

* 30 years ago, we added fuel injection, computers, and airbags.

* 20 years ago, we got the hybrid car, better fuel efficiency, and side airbags.

Imagine what we'll know tomorrow.

~~~
notahacker
Perspective: nearly 50 years ago we _did_ have Category IIIa instrument
landing systems in commercial jet aircraft capable of landing without pilot
input. We had reliable gyroscopic autopilots capable of keeping aircraft on a
set course long before that.

Vast sums of money have been sunk into developing exceptionally reliable
avionics systems since then.

Passenger-carrying aircraft still all have a pilot or two.

~~~
ubernostrum
And to complete the thought: the main reason human pilots are still on all
those aircraft is that the automated/computerized systems still can't
really make decisions that match what the humans can do. So the computer is
really great at maintaining heading, altitude, and speed for a set period of
time, but really bad at deciding which heading, which altitude, and which
speed, and for how long. (The best you can do, as far as I'm aware, is have a
human lay out the desired route and punch it into the FMC; then all bets are
off if conditions change along the way, and the humans have to decide
what to tell the plane to do again.)

~~~
msellout
And those decisions themselves could be easily automated, but people _choose
not to_. There's a checklist for making those decisions. If you can write a
checklist, you can write code. And when there's no checklist, you can pick
randomly, because that's what the human is essentially doing.
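
A toy sketch of what I mean, in Python (the checklist items and thresholds are
invented, not from any real aviation manual):

    # Hypothetical "divert or continue" checklist encoded as code.
    # Items and thresholds are made up for illustration.
    def should_divert(fuel_remaining_min, destination_weather_ok, runway_open):
        """Walk the checklist top to bottom, like a pilot would."""
        if fuel_remaining_min < 45:        # item 1: minimum fuel reserve
            return True
        if not destination_weather_ok:     # item 2: destination weather
            return True
        if not runway_open:                # item 3: runway availability
            return True
        return False                       # every item passed: continue

    print(should_divert(30, True, True))   # True: divert on fuel alone

Each checklist item becomes one branch; if the manual can enumerate the items,
the code can enumerate them too.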

~~~
acbabis
A human can notice when new evidence becomes relevant. If something isn't
already on a computer's checklist when it becomes relevant, the human has a
better chance of deciding what to do.

EDIT: I understand that there is research going into making computers
recognize "novel" situations. My comment only applies to algorithms that
contain "checklists".

~~~
msellout
No, then you need to get meta. There's a checklist for identifying "novel"
information and reacting. By checklist I mean algorithm.

Sure, it can be tough to tease out the algorithm from the minds of current
pilots. Luckily in aviation we already have training manuals. In other domains
it'd be tougher to know when we're done extracting knowledge from humans.
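
As a toy example of that meta-checklist, in Python (the rule and ranges are
invented; real novelty detection is much harder than this):

    # Flag a situation as "novel" when a sensor reading falls outside
    # every range seen in past operation. Entirely made up for illustration.
    def is_novel(reading, seen_ranges):
        for key, value in reading.items():
            lo, hi = seen_ranges.get(key, (float("-inf"), float("inf")))
            if not lo <= value <= hi:
                return True        # outside everything seen before
        return False

    seen = {"crosswind_kts": (0, 35), "visibility_m": (50, 10000)}
    print(is_novel({"crosswind_kts": 48, "visibility_m": 800}, seen))  # True

Once something is flagged as novel, "reacting" can itself be another checklist
(slow down, hand off, fall back to the most conservative action).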

------
dsp1234
And here is the actual accident report [0]

 _" A Google Lexus-model autonomous vehicle ("Google AV") was traveling in
autonomous mode eastbound on El Camino Real in Mountain View in the far right-
hand lane approaching the Castro St. intersection. As the Google AV approached
the intersection, it signaled its intent to make a right turn on red onto
Castro St. The Google AV then moved to the right-hand side of the lane to pass
traffic in the same lane that was stopped at the intersection and proceeding
straight. However, the Google AV had to come to a stop to go around sandbags
positioned around a storm drain that were blocking its path. When the light
turned green, traffic in the lane continued past the Google AV. After a few
cars had passed, the Google AV began to proceed back into the center of the
lane to pass the sand bags. A public transit bus was approaching from behind.
The Google AV test driver saw the bus approaching in the left side mirror but
believed the bus would stop or slow to allow the Google AV to continue.
Approximately three seconds later, as the Google AV was reentering the center
of the lane, it made contact with the side of the bus. The Google AV was
operating in autonomous mode and travelling less than 2 mph, and the bus was
travelling at about 15 mph at the time of contact.

The Google AV sustained body damage to the left front fender, the left front
wheel and one of its driver's-side sensors. There were no injuries reported at
the scene."_

[0] -
[https://www.dmv.ca.gov/portal/wcm/connect/3946fbb8-e04e-4d52...](https://www.dmv.ca.gov/portal/wcm/connect/3946fbb8-e04e-4d52-8f80-b33948df34b2/Google+Auto+LLC+02.14.16.pdf?MOD=AJPERES)

~~~
DannyBee
I'm actually curious who had the right of way in this one. It sounds like the
bus just plowed through them while accelerating the whole way.

~~~
dsp1234
I'm pretty sure it's the bus. Generally, an already-moving vehicle with a green
light on a straight road has the right of way. The correct action for the
autonomous vehicle (or a human driver) would be to wait for all traffic in the
lane to clear (for example, after the previous light turns red and there is a
lull in the traffic), then make its way around the sandbags.

Here's my artist's rendering. The vertical lane is just the right-hand lane,
and is "wide", but still one lane. The bus is traveling "up" and the Google AV
tries to maneuver around the sandbags blocking its way. Thinking that it
can get around in time, the AV starts moving to the left and crashes into the
bus.

I guess the "one lane" aspect is the confounding variable, but since the
autonomous vehicle was stopped, it takes on the aspect of any other
stopped vehicle on the side of the road (i.e., giving up its right of way). Just
like on a road where cars are allowed to park on the side, those parked cars
cannot enter traffic unless it is safe to do so.

Edit: Actually, I have no idea. Did a quick read of California's right of way
laws, and couldn't find anything that jumped out that would cover this
situation.

    
    
      |      |
             |
      |      +----
      
      
      |    SS+----
        <-[G]|
      | ^    |
        |    |
      |[B]   |
       [B]   |
      |      |

~~~
djrogers
If it was a single lane, then technically the bus was attempting to pass a
moving vehicle in the same lane as itself, which puts it at fault.

~~~
craigyk
I wonder about this too. If you are new to CA, at first it seems crazy how
aggressive people are about splitting the right hand lane to turn right... who
would have the right of way if the bus was already stopped at a light, but the
AV split the lane to turn right and passed the stopped bus? I like the idea of
splitting the right lane to make right turns, but IMO, if you've done so, you
no longer have the right of way for traffic in that lane.

~~~
DrScump
<who would have the right of way if the bus was already stopped at a light,
but the AV split the lane to turn right and passed the stopped bus?>

The AV can proceed "when safe" to approach the intersection and then, "when
safe", make its turn, but the bus owns the lane otherwise.

------
ChuckMcM
Ok, I laughed out loud at that one. This quote in particular, _" The vehicle
and the test driver 'believed the bus would slow or allow the Google
(autonomous vehicle) to continue.'"_

Bus drivers in the Bay Area are notorious for ignoring traffic (and
pedestrians). Apparently there are indemnities or statutes that make suing
either the transportation agency or the driver nearly impossible, so pretty
quickly people learn that the bus drivers drive with impunity. Plenty of
stories get posted to the local traffic column in the paper, and shared amongst
neighbors and in the department of safety's "blotter" feature.

Google needs to go back and program their cars to always assume that buses are
out to get them and avoid them at all costs. They are an active traffic hazard
often operated by a disinterested and distracted driver. The only way to "win"
is to not be wherever the bus is.

~~~
technofiend
Up next: Google self-driving buses. Finally, American buses run with the same
attention to timetables that Swiss ones do. An added benefit: they obey
traffic safety laws.

~~~
kuschku
The issue is not the drivers.

I'll report what I observed over two years of riding the bus 3 to 4 times a
day in Germany.

Every time a bus was late, or tried to break the sound barrier™ in the hope of
making up some of the delay, it was due to one of the following issues:

(a) A tourist with a Texan accent trying to get onto the bus, debating with
the bus driver whether he can pay with credit card or in dollars (no), then
asking the bus driver to wait while he goes to get money from the nearest ATM
(happens at about 5% of stops in the downtown areas where the tourists are)

(b) Rush hour traffic, 200 people squeezing into a single bus, and another few
hundred waiting at the bus stop – buses coming every one or two minutes, and
it takes quite some time until people stop trying to get into the bus and
leave enough space for the doors to close

(c) some kids with invalid tickets trying to cheat and getting caught

These issues can't be fixed by automated buses or trains.

Only by less tight schedules, and more buses and trains.

~~~
Symbiote
(a) is fixed because there's probably no-one for the Texan to talk to on the
bus. For trains, the person to talk to is on the platform, and can simply let
the train depart (example: Copenhagen Metro, although good luck spotting any
staff).

It could be more easily fixed by accepting foreign credit cards (Gothenburg
manages this, London if the card is contactless) or telling drivers not to
wait.

(b) is partly solved if the cost of running buses/trains is significantly
reduced, as that leaves money for extra vehicles.

(c)... well, who will notice?

~~~
kuschku
Also, regarding (b):

Many cities today already operate buses by taking unemployed people and
forcing them to take a job as a bus driver for no pay (they'd have to continue
living off of welfare); otherwise they'd lose 50% of their welfare money.

That's how you get bus drivers for free — automated vehicles can't really beat
that.

------
krschultz
It was inevitable, so I'm sure they're quite pleased it was a minor issue and
not something catastrophic. Someday in the future a self driving car is going
to hurt or kill a person and then the real legal tests will begin, but this is
the first step on the pathway to normalcy.

My personal fear is that Google and maybe one or two others will get
self-driving cars right, but then the imitations from other manufacturers will
fall short. The liability needs to end up on the manufacturer of the
self-driving car system; this is not something to be taken lightly at all.

~~~
toomuchtodo
I expect self driving car liability to end up similar to vaccine liability. A
fund and adjudication process is created to compensate those who have an
adverse outcome.

[http://www.hrsa.gov/vaccinecompensation/](http://www.hrsa.gov/vaccinecompensation/)

~~~
BinaryIdiot
The only difference is that the vaccine fund doesn't require causation to be
proven, and a large number of the cases it receives are highly unlikely to have
been caused by vaccine issues.

But with a car crash it will be very easy to figure out who's at fault. Why
would such a fund exist for something that can be pinpointed directly at the
party at fault?

~~~
toomuchtodo
> Why would such a fund exist for something that can be pinpointed directly
> at the party at fault?

People should still be compensated when accidents happen, without the drag
caused by trial attorneys attempting to extract as much as possible from self-
driving vehicle manufacturers (which is going to slow down progress).

The benefits of self-driving vehicles, from reduced fatalities and accidents
alone, are so great that a process and funding need to be in place to allow
continued innovation (if done with safety put first).

~~~
BinaryIdiot
> People should still be compensated when accidents happen, without the drag
> caused by trial attorneys attempting to extract as much as possible from
> self-driving vehicle manufacturers.

Of course, but the current laws can already drag this out when people are
involved; why would cars be any different?

> The benefits of self-driving vehicles, from reduced fatalities and accidents
> alone, are so great that a process and funding need to be in place to allow
> continued innovation (if done with safety put first).

We don't do this in any other industry as far as I know. It's a weird
mechanism to let the manufacturers of self-driving cars off the hook for
accidents.

I understand the intent, but I don't know how that works within our current
legal system, and wouldn't that encourage cheap, shoddily-built cars since
companies won't be liable?

~~~
toomuchtodo
> I understand the intent, but I don't know how that works within our current
> legal system, and wouldn't that encourage cheap, shoddily-built cars since
> companies won't be liable?

It works if you allow self-driving vehicle algorithms to be patented. You
could then open them for public examination by a government agency.

If the algorithm performed to regulatory agency expectations, accident victims
would still be compensated for losses without punitive damages being exacted.

~~~
BinaryIdiot
Regulation isn't a magic bullet though. I've seen countless companies check
the boxes of regulation for their software in the government space only to
have them fail spectacularly because it was done as cheaply as possible.

Regulation will never cover all the ways a company can act shitty; if
companies find ways of doing things cheaper while still being able to check
that box, leaving them with no liability, then they will do it.

I don't think these types of get-out-of-jail-free cards, even though they're
very well intentioned, are ultimately a good thing.

But time will tell either way :)

------
Animats
Here's the Autonomous Vehicle Accident Report filed with the CA DMV.[1] "The
Google AV was operating in autonomous mode and traveling at less than 2 mph
and the bus was traveling around 15 mph." The other vehicle was a 2002
New Flyer low-floor articulated bus, which is 61 feet long including the
"trailer" part.

Here's where it happened.[2] You can see traffic cones around the storm drain.

This is a subtle error. Arguably, part of the problem was that the AV was
moving too slowly. It was trying to break into a gap in traffic, but because
it was maneuvering around an unusual road hazard (sandbags), it was moving very
slowly. This situation was misread by the bus driver, who failed to stop or
change course, perhaps expecting the AV to accelerate. The AV is probably at
fault, because it was doing a lane change while the bus was not.

Fixing this requires that the AV be either less aggressive or more aggressive.
Less aggressive would mean sitting there waiting for a big break in traffic.
That could take a while at that location. More aggressive would mean
accelerating faster into a gap. Google's AVs will accelerate into gaps in
ordinary situations such as freeway merges, but when dealing with an unusual
road hazard, they may be held down to very slow speeds.
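
Presumably the tunable here is something like a gap-acceptance threshold. A
minimal sketch in Python, assuming a simple time-to-arrival rule (all numbers
invented, not Google's):

    # Accept a gap only if the approaching vehicle would arrive later than
    # the time we need to finish the maneuver, plus a safety margin.
    def accept_gap(gap_m, closing_speed_mps, maneuver_time_s, margin_s=2.0):
        if closing_speed_mps <= 0:          # other vehicle not closing
            return True
        time_to_arrival = gap_m / closing_speed_mps
        return time_to_arrival > maneuver_time_s + margin_s

    # A bus 20 m back closing at ~15 mph (6.7 m/s) arrives in ~3 s; a slow
    # crawl around sandbags takes longer than that, so the answer is: wait.
    print(accept_gap(gap_m=20, closing_speed_mps=6.7, maneuver_time_s=4.0))  # False

"Less aggressive" amounts to a bigger margin_s; "more aggressive" means
shrinking maneuver_time_s by accelerating harder into the gap.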

I wonder if Google will publish the playback from their sensor data.

[1]
[https://www.dmv.ca.gov/portal/wcm/connect/3946fbb8-e04e-4d52...](https://www.dmv.ca.gov/portal/wcm/connect/3946fbb8-e04e-4d52-8f80-b33948df34b2/Google+Auto+LLC+02.14.16.pdf?MOD=AJPERES)
[2] [https://goo.gl/maps/QzvVXQGxhX72](https://goo.gl/maps/QzvVXQGxhX72)

~~~
jessriedel
> The AV is probably at fault, because it was doing a lane change while the
> bus was not.

Yes, probably true, but not crystal clear. The Google car never left the lane,
so it comes down to subtle questions about appearing to be parked or impromptu
division of lanes near right-hand turns.

------
Animats
This story is getting a lot of press coverage. Reuters, CNBC and Wired are
already covering it.

When Cruise (YC 14)'s car hit a parked car at 20 mph in SF last month, there
was no press attention.[1] Even though it was across the street from the main
police station.

That Cruise crash is an example of the "deadly valley" between manual driving
and fully automatic driving. The vehicle made a bad move which prompted the
driver to take over, but too late. This is exactly why AVs can't rely on the
driver as backup.

[1]
[https://www.dmv.ca.gov/portal/wcm/connect/bc21ef62-6e7c-4049...](https://www.dmv.ca.gov/portal/wcm/connect/bc21ef62-6e7c-4049-a552-0a7c50d92e86/Cruise_Automation_01.08.16.pdf?MOD=AJPERES)

------
mattzito
> The vehicle and the test driver "believed the bus would slow or allow the
> Google (autonomous vehicle) to continue."

Clearly the algorithm does not take into account the classic attitudes of bus
drivers.

~~~
junto
Even luckier it wasn't a BMW.

Here in Europe the attitudes of BMW drivers are not highly regarded.

~~~
johansch
Not even in Germany?

~~~
aluhut
This is their lair. They are the worst here.

~~~
johansch
What is the German ranking in terms of obnoxiousness like?

E.g. based on BMW, Porsche, Audi, Mercedes-Benz, Volkswagen, Opel, <foreign
invaders>?

~~~
junto
Pretty much that exact order.

~~~
johansch
I would guess Mercedes owners would be more obnoxious than Audi owners though?
:)

~~~
aluhut
Mercedes are a special problem. They're often driven by older male drivers who
sometimes remember that their car can drive fast but don't care most of the
time, even while in the left lane.

------
atonse
Technically, even if self-driving cars are merely safer than human drivers
(not perfect), that should be good enough. But my lizard brain tells me
that I'm putting my life in the hands of a machine that potentially has bugs,
and that's a little scary.

Most of us are going to expect nothing short of perfection from these machines
to really trust them.

~~~
BinaryIdiot
The same thing happened with elevators. When they had operators who would
physically make the elevator move, and then these newfangled "magic" ones came
in, people freaked out, got used to it, and now no one cares.

I get the feeling it's the same way with cars, just an order of magnitude
bigger a change, since they're such a part of our lives.

~~~
ghaff
But that also provides evidence of the parent's point that "Most of us are
going to expect nothing short of perfection from these machines to really
trust them."

We absolutely wouldn't be OK with elevators that now and then fail in a
dangerous way. Doesn't mean it never happens but it's considered to be
someone's fault when it does.

~~~
skykooler
Elevators do fail occasionally. There are several videos of elevators moving
before the door has finished closing, or stopping misaligned with a floor.
It's just a very unusual occurrence.

~~~
nommm-nommm
Yeah, when I was a teenager I was hit in the head by a closing elevator door.
That _hurt_ like all hell.

------
stefap2
In a related article: [http://www.reuters.com/article/us-google-selfdrivingcar-
idUS...](http://www.reuters.com/article/us-google-selfdrivingcar-
idUSKCN0W22DG)

"Google said in the filing the autonomous vehicle was traveling at less than 2
miles per hour, while the bus was moving at about 15 miles per hour."

Google said in a statement on Monday that "we clearly bear some
responsibility, because if our car hadn’t moved, there wouldn’t have been a
collision. That said, our test driver believed the bus was going to slow or
stop to allow us to merge into the traffic, and that there would be sufficient
space to do that."

~~~
trhway
>That said, our test driver believed the bus was going to slow or stop to
allow us to merge into the traffic, and that there would be sufficient space
to do that."

it was a [double] bus. It isn't the kind of vehicle one stops out of courtesy
to let another car merge (unless such courtesy is the only way for the other
car to merge, obviously).

Anyway, it is an obvious software failure, as the Google car hit the side of
the bus at slow speed. There is no reason to blame it on somebody else.

------
trjordan
> The vehicle and the test driver "believed the bus would slow or allow the
> Google (autonomous vehicle) to continue."

Love this. I'm shamelessly rooting for self-driving cars, and crashes are
inevitable. Having the human in the car agree with the computer brings a lot
of credibility to the report and follow-up.

------
ars
I'm actually more impressed that they are trying to code "believed the bus
would slow".

Understanding the expected behavior of other drivers is critical to making
self driving cars work. And that seems like a pretty hard thing for a computer
to figure out.
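
One can imagine the shape of that code as a crude yield-probability estimate
feeding the planner. A toy sketch in Python (the features and weights are pure
guesswork, not anything Google has published):

    # Guess whether another vehicle will yield to us. Invented heuristic.
    def probability_of_yielding(vehicle_type, decelerating, gap_growing):
        p = 0.5                      # neutral prior
        if vehicle_type == "bus":
            p -= 0.3                 # buses rarely yield, per this thread
        if decelerating:
            p += 0.3                 # braking suggests courtesy
        if gap_growing:
            p += 0.2                 # an opening gap suggests being let in
        return min(1.0, max(0.0, p))

    # A bus holding speed with no gap opening: don't count on courtesy.
    print(probability_of_yielding("bus", False, False))  # 0.2

The hard part, of course, is extracting reliable values for those inputs from
sensor data, not the arithmetic.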

~~~
privong
> Understanding the expected behavior of other drivers is critical to making
> self driving cars work. And that seems like a pretty hard thing for a
> computer to figure out.

You're definitely right.

With regard to this particular instance, based on my experiences in many
cities in different countries, buses pulling out into traffic when they don't
technically have the right of way is an expected behavior. :)

~~~
steveax
Unless I'm reading the report wrong, it appears that the bus did have the
right of way. Indicating intent and hoping will get you in trouble with more
than just buses.

~~~
privong
> Unless I'm reading the report wrong, it appears that the bus did have the
> right of way.

Yeah, I was more trying to make an observation about general behavior of
buses. When I read the article there was no mention of fault or reference to a
report, so I didn't know. But I do know that buses disregarding right of way
is fairly common. "With regard to this instance" was probably too
strong/literal a phrase to use to tie in my observation.

------
evunveot
If you're having trouble visualizing what happened (like me), here's the
intersection:
[https://www.google.com/maps/@37.3859058,-122.0843836,3a,75y,...](https://www.google.com/maps/@37.3859058,-122.0843836,3a,75y,180.9h,70.85t/data=!3m6!1e1!3m4!1sT2jrKcM8pLU8MPq1hj60xQ!2e0!7i13312!8i6656!5m1!1e4!6m1!1e1)

Looks like the far right lane has double the normal lane width to accommodate
on-street parking. The Google car, looking to turn right on red, attempted to
use the right-most part of the lane to pass cars that were waiting to go
straight, but there were sandbags covering the storm drain near the corner, so
it had to stop. When the light changed, the cars it had passed continued on
while the Google car waited for a gap, which ended up being in front of a bus,
likely caused by the bus accelerating more slowly than the cars in front of
it. The Google car tried to use the gap to get around the sandbags, assuming
that the bus wouldn't just plow into it, but the bus plowed into it. Perhaps
the bus driver assumed the Google car was parked and wasn't paying much
attention to it.

~~~
duderific
The question is, how much room did the AV have in front of the bus to merge
out in front of it?

I'm wondering if the bus was cruising along and saw the AV inching out into
its lane and just thought "Oh, once they see this big-ass bus coming along,
they'll stop trying to merge." And the AV thought "Once that bus sees me
inching out, it'll slow to let me merge." And neither one was right.

------
deckar01
It is going to be difficult to predict the actions of irrational actors like
bus drivers. You can usually assume that a driver has a vested interest in not
damaging their vehicle, but my experience navigating California's city streets
has consistently suggested otherwise when buses are involved.

I would group them into a category with police cars and ambulances, due to the
prioritization of speed over safety. (Although, in my experience, ambulances
are usually very good at prioritizing safety.)

------
bch
The linked report doesn't say Google is accepting responsibility, so it'll be
interesting to see what tack they take here. In _Toronto_, from talking with
acquaintances who are drivers, I understand they work in near-constant fear of
accidents. They are expected to practically always be able to avoid accidents,
and the feeling is that if something happens, they're presumed guilty (within
management), and proceed from there.

I don't know if all professional bus drivers' conditions are the same as
Toronto, but I can imagine this could be extremely distressing for the bus
driver involved. At least nobody was injured.

~~~
obsurveyor
> The linked report doesn't say Google is accepting responsibility

That's for insurance reasons. You're never supposed to admit fault at the
scene of a collision in the US. You let the police decide and that's that. It's
like the first thing listed in the guidelines for collisions on your insurance
card.

~~~
bch
Agreed re: insurance reasons. That said, the phrasing of the event is
interesting: "...self-driving car hits municipal bus..." and (in the article)
"...self-driving car struck a municipal bus...", and the co-driver (and
vehicle) is quoted as "believed the bus would slow or allow the Google
(autonomous vehicle) to continue." It all sounds a little "soft", more like
"yeah, I think that's my fault". I don't have a favourite to win, regardless.
There's another article[0] that I didn't see earlier that does in fact accept
at least partial fault, and additionally mentions that the software was updated
as a result of this event. Thankfully it seems to be a low-cost (in terms of
injury and damage) event, but fascinating given the young state of autonomous
driving.

[0] [http://www.reuters.com/article/us-google-selfdrivingcar-
idUS...](http://www.reuters.com/article/us-google-selfdrivingcar-
idUSKCN0W22DG)

------
donpdonp
The description of events is slightly suspicious. For all the predictive
smarts of the car and its impressive LIDAR, keeping from bumping into things
seems like it would be the highest priority possible, second only to avoiding
loss of life. The corners of the car have some kind of distance sensor, so the
car would have to have known it was about to collide with something. In CPU
time there was plenty of time to react, and it's always looking at all four
corners.

The article makes it sound like the AI blindly changed lanes into the bus. It
seems most likely that the AI knew about the impending collision and decided
colliding was safer than any other option. It'd be great to know what the
other options were, but I imagine we'll probably never get more detail.
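
If that's right, the last-second choice would look like minimum-cost action
selection. A sketch in Python under that assumption (the options and cost
weights are invented, not from any real planner):

    # Pick the least-bad maneuver once some contact looks unavoidable.
    # Every number here is made up for illustration.
    OPTIONS = {
        "swerve_right":   {"impact_speed_mph": 10.0, "risk_to_people": 0.3},
        "brake_hard":     {"impact_speed_mph": 5.0,  "risk_to_people": 0.2},
        "continue_merge": {"impact_speed_mph": 2.0,  "risk_to_people": 0.1},
    }

    def cost(option):
        # Weight harm to people far above body damage.
        return option["risk_to_people"] * 1000 + option["impact_speed_mph"]

    best = min(OPTIONS, key=lambda name: cost(OPTIONS[name]))
    print(best)  # continue_merge: the 2 mph scrape wins

Whatever the real cost function looks like, a 2 mph scrape against a bus
fender may well have beaten the alternatives it had.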

~~~
djrogers
It didn't change lanes, which is why the description didn't say it did. The
entire event took place in a single extra-wide right lane, which is pretty
common. The extra width is to accommodate on-street parking as well as
right-hand turns.

------
stretchwithme
Maybe Google should train their cars at demolition derbies instead of on
public roads. Have one group of vehicles trying to crash into another. And let
competing teams of students program the crashers. Winning team gets $25K.

And play with touch football rules to keep the costs down. Cover vehicles with
touch sensors and cameras that record each crash.

------
rmcpherson
Is there any information available about how these cars take evasive action or
attempt to reduce damage in the event of an imminent impact? The article's
wording makes it sound like the car was at fault for the crash by re-entering
the lane:

> "But three seconds later, as the Google car reentered the center of the lane
> it struck the side of the bus"

Presumably the car's software predicted the crash before it occurred but was
unable to completely avoid it. I'd love to know what the programmed behavior
is in these 'unavoidable' crash scenarios. edit: formatting

------
ikeboy
Can the URL be changed to a source that at least links to the report, like
[http://www.theverge.com/2016/2/29/11134344/google-self-
drivi...](http://www.theverge.com/2016/2/29/11134344/google-self-driving-car-
crash-report) or [http://www.wired.com/2016/02/googles-self-driving-car-may-
ca...](http://www.wired.com/2016/02/googles-self-driving-car-may-caused-first-
crash/)

------
mannykannot
This incident hints at what I believe is going to be a big issue in the
transition to autonomous vehicles in busy urban areas. If AVs are too passive,
human drivers will take advantage of them. If they are too aggressive, accident
rates will be high, and the risk of an AV causing a major accident increases.
Threading that needle is more difficult than solving the physics of driving.

------
kisstheblade
In the country where I live, buses have right of way when leaving bus stops.
They seem to actually try to hit cars in the next lane (even when the car has
almost passed and it of course isn't a "right of way" situation anymore)... So
if you program an autonomous car, it would be smart to program it to never get
next to a bus :)

------
smoyer
The vehicle and the test driver "believed the bus would slow or allow the
Google (autonomous vehicle) to continue."

If the test driver stated that s/he thought the car should yield to the bus,
would the test driver still have a job? I would have been shocked if the test
driver said the software was at fault.

------
Swizec
"The vehicle and the test driver "believed the bus would slow or allow the
Google (autonomous vehicle) to continue.""

This sounds like a very strong assumption. When was the last time you saw a
bus slow down for a car or anything much? There's a reason "Fuck you I'm a
bus" is a meme.

~~~
wtallis
Since the light had just turned green, the assumption was probably that the
bus would cease accelerating long enough for the Google car to get around the
obstacle, not that the bus driver would hit the brakes.

------
randomacct44
My question is - why are bus drivers such assholes (as depicted by the
comments here)?

Serious question I'm curious about. Is this societal, is it a technology
problem (do the way buses work somehow encourage this behavior towards other
motorists), is it likely a combination of both?

------
tn13
I hate El Camino Real. The rightmost lane is always far too wide, which means
that if a bus has stopped, everyone expects you to still squeeze through the
limited space. If you don't, they will honk. If you do, there is always a
possibility that the giant bus might not see you and hit you.

However, an important takeaway for all Bay Area drivers from the article is:

You are supposed to HUG the right shoulder when making a right turn. I have yet
to see a single driver who embraces that principle. It is not just for your
safety and the convenience of the traffic behind you, but also for the safety
of bike riders: if you make a sudden right turn, they might hit you and get
injured.

------
jfoster
I think this minor accident is actually a good thing for self-driving cars. It
sets the expectation that they may not be perfect, and isn't a PR disaster in
the way that a fatal accident would have been.

From here, more accidents are able to happen without being huge news stories.
Undoubtedly, the first time someone dies in an accident involving a self-
driving car, there will still be lots of questioning of the technology, but it
won't come as a complete surprise, now that smaller accidents have occurred.

------
romuloab42
I don't know about the US, but at least where I live, when somebody hits you
from behind, it's their fault. The idea is that you are supposed to drive
cautiously and therefore be prepared if the vehicle in front of you behaves
erratically -- that is, even if that vehicle's driver is in the wrong (say,
it's not his right of way), you should still be at a safe distance.

------
jaza
Google needs to program their AVs to be aware of the universal "law of
anything bigger"!

[http://vignette2.wikia.nocookie.net/headhuntersholosuite/ima...](http://vignette2.wikia.nocookie.net/headhuntersholosuite/images/6/6c/Spaceball_One_001.jpg)

------
bfrog
Seems like this was inevitable. It's astounding to me that it took this long
and that no one was hurt!

It is even more amazing that there have apparently been 0 injuries or
fatalities overall. I wonder how these autonomous cars compare in terms of
hours on the road to # of incidents (not accidents, mind you; I hate that
term).

------
AdmiralAsshat
So what happened when the bus driver walked over to the self-driving car in
order to exchange insurance information?

~~~
jonknee
The human in the car exchanged it, and thanks to the multitude of sensors and
cameras, this will be the single best-documented fender bender in the history
of the planet.

~~~
Animats
Yes, especially since the New Flyer buses also have cameras and data
recorders.

------
bryanrasmussen
It would be great if it turned out the Google car swerved into the bus to
avoid hitting a fat man.

~~~
ant6n
'Sand bag' turns out to be a euphemism.

------
aidenn0
It's interesting to observe the highly varying amount of room other cars will
allow for merging. Merging onto the freeway on a busy day (maybe 35 MPH
traffic), I tried to merge, and the car I was planning on going in front of
moved up until their bumper was about a meter from the vehicle in front of
them and leaned on their horn. "I technically have the right of way in this
situation and by God, I won't yield it to you!"

This was a situation where there were about 40 cars merging onto the freeway,
and they mostly just did a fairly standard zipper merge.

------
awqrre
That was not a small target...

------
lyle_nel
This just in ... one vehicle on the road hit another vehicle on the road.

~~~
chairleader
I'm finding the description of the accident a little... passive. Has anyone
found other reporting or even video of the incident itself?

~~~
theklub
I mean they aren't trying to advertise this.

------
copremesis
skynet?

------
known
Uber + Self-driving = Mayhem

------
tobbyb
Some of these discussions seem to severely underestimate human intelligence
and the ability to reason and make decisions rapidly.

There are hundreds of millions of cars operating in all sorts of conditions
with little incident. Literally hundreds of thousands of preemptive actions,
moments of foresight, experience, and instinct are at play for every
possibility on the road, and millions of people handle them easily.

To say that's not good enough, one must articulate a system that is clearly
better, or has the potential to be, without painting human drivers with the
broad brush of the 0.1% of incidents or diminishing them. That's an argument of
convenience and could reflect a lack of understanding of the scope and scale of
the problem. This is just sandbags; there are literally millions of obstacles
and scenarios negotiated without incident every day.

Crashes in snow are not always about bad decisions. They can also be caused by
extremely poor conditions that cars should not be in, or by an inadequate
vehicle or tyres; computing will not help if the hardware is deficient.
Presuming the worst of others, or jumping to conclusions about their
intelligence, is unsavory and dangerous if used to push something.

An AI vehicle in any crowded Asian city is going to be literally stranded by
indecision. And on relatively empty and organized roads, the computing power
needed will barely scratch the surface of what a proper self-driving AI system
requires, unless you redesign the roads and place constraints, which then
becomes a different discussion, to be approached carefully.

