
License to not drive: Google’s autonomous car testing center - davidcgl
https://medium.com/backchannel/license-to-not-drive-6dbea84b9c45#.os2lwoe8b
======
timothya
> _For some time, Google has been convinced that the semiautonomous systems
> that others champion (which include various features like collision
> prevention, self-parking, and lane control on highways) are actually more
> dangerous than the so-called Level Four degree of control, where the car
> needs no human intervention. The company is convinced that with cars that
> almost but don’t drive themselves, humans will be lulled into devoting
> attention elsewhere and unable to take quick control in an emergency._

I think this is a really good perspective. Considering how often drivers
already use smartphones behind the wheel of conventional cars, that sort of
distraction would only be magnified by partial autonomy, which is very
dangerous. Humans get distracted or bored easily, especially when performing
routine tasks. I'm glad that Google is choosing to build a car that never
needs human intervention rather than rushing to market with a partial
solution.

Here's a video where you can see what distracted teen drivers look like.
Terrifying. [http://youtu.be/SDWmwxQ_NnY](http://youtu.be/SDWmwxQ_NnY)

~~~
oska
I've seen this issue discussed in the context of commercial airline pilots.
The suggestion was that because so much of flying is now done by autopilot,
pilots' ability to react quickly and appropriately in a real emergency, when
control is handed back to them, has significantly declined, and that we may
eventually move to completely pilotless airliners that are taken over by
ground control in an emergency. (This would also have the side benefit of
significantly reducing the risk of hijacking.)

~~~
cbhl
How does this reduce the risk of hijacking? An attacker would just hijack
ground control instead.

~~~
jessriedel
Ground control is much easier to secure. Instead of having to find a needle
(a hijacker) in a haystack (the millions of random Americans flying each day)
with a 90-second search, you can do proper background checks on the small
number of people allowed to be there.

~~~
CM30
Until one of those people turns out to be malicious. There is no way to tell
what anyone is thinking, or whether they've been spending their time outside
of work being slowly corrupted by certain influences.

~~~
jessriedel
And yet the rate of terrorist hijackings, although tremendously small, is much
larger than the rate at which Secret Service agents betray the president.

------
ksenzee
Google is betting on a system that depends really heavily on detailed mapping.
I'd love to know their plan for determining when the map is out of date,
because of road construction or whatever. That seems like the hardest part of
the whole thing to maintain long-term.

~~~
ghaff
AFAIK, pretty much everyone is. That's really been the big shift over the past
few years: it has allowed for fairly impressive autopilot levels without AIs
having to "understand" and parse the world to nearly the degree people do.
Presumably vehicles connected to the system will be able to contribute
updates, but maintaining current, high-resolution maps across a wide area
will certainly be a challenge. It's not unreasonable to think the government
could play some role here, as it does for marine navigation.
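One plausible way to detect that a map tile has gone stale is to compare live sensor detections against the features stored in the high-resolution map and flag the tile when too many mapped features go missing (as happens during road construction). This is only a rough sketch, not Google's actual method; every name and threshold below is invented for illustration:

```python
# Hypothetical sketch: flag a map tile as stale by checking how many of its
# mapped features (e.g. lane-marking positions, in metres) still show up in
# live observations. All names and thresholds are invented for illustration.

def tile_is_stale(map_features, observed_features, match_radius=0.5,
                  mismatch_threshold=0.2):
    """Return True if too large a fraction of mapped features is missing
    from the live observations (e.g. lane markings moved by construction)."""
    if not map_features:
        return False
    missing = 0
    for mx, my in map_features:
        # A mapped feature "matches" if any live observation lies within
        # match_radius metres of its mapped position.
        if not any((mx - ox) ** 2 + (my - oy) ** 2 <= match_radius ** 2
                   for ox, oy in observed_features):
            missing += 1
    return missing / len(map_features) > mismatch_threshold
```

A tile whose observations line up with the map (within noise) passes; one where a chunk of the mapped features has vanished gets flagged for re-mapping. A real system would of course fuse many passes over the tile before deciding, which is exactly where fleet-contributed updates would come in.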

------
imh
In light of all the IoT bugs we've been hearing about, it's really nice to
hear that they are being super cautious about development here. I hope they
keep the same level of caution (or increase it) as they get close to market.
My biggest worry here is that as the different car companies get close to
market on SDCs, there will be more pressure on each to hurry.

~~~
thrownaway2424
My biggest worry is that Tesla's half-baked almost-self-driving-but-not-really
will hurt someone and cause reactionary anti-self-driving-car laws.

~~~
Animats
Tesla is worrisome, but they've backed off some on the automation. Everybody
else who has Tesla-level autodrive (NHTSA Level 2: lane keeping plus radar
cruise control) has sensors to make sure the driver has hands on the wheel, or
is at least in the seat and looking forward. Tesla didn't put that in. Hence
those scary YouTube videos.[1]

Cruise (YC 14) is just scary. They still have that advertising video online
[2] that totally oversells what they can do. All they have is lane-keeping and
smart cruise control, like the other entry level systems. It's automatic
driving from the "move fast and break things" crowd; they're from web and app
startups.

Google is being cautious and testing heavily. But they're spending enough
money to test fast, with many cars on the test track. That's the auto industry
way of doing things. It takes money, but not decades, to get it right.

The CEO of Volvo has the liability issue right: when in autodrive, the
manufacturer is responsible. If you can't accept that, you shouldn't be doing
this.

[1] [http://www.cnet.com/roadshow/news/tesla-autopilot-fail-videos-nobody-likes-to-listen/](http://www.cnet.com/roadshow/news/tesla-autopilot-fail-videos-nobody-likes-to-listen/)

[2] [http://www.getcruise.com/](http://www.getcruise.com/)

~~~
ghaff
It's hard to come up with examples of consumer tech/products where serious
injury or death is considered just part of how things are, even when a
properly maintained product was used as directed and there was no clear
external factor (e.g. brakes failing on ice). Some failures due to age, I
suppose. Drug side effects to a degree (but see the vaccine compensation
fund). In general, though, such things routinely result in lawsuits anyway.

------
samstave
I asked this before: if they need to log many hours of driving under various
conditions, why can't they hook the brain of the car up to video games like
GTA V or Euro Truck Simulator and have it play thousands and thousands of
hours without killing any pedestrians or getting any tickets?

~~~
TulliusCicero
Something tells me you didn't read the article:

> At the end of the shift, the entire log is sent off to an independent triage
> team, which runs simulations to see what would have happened had the car
> continued autonomously. In fact, even though Google’s cars have autonomously
> driven more than 1.3 million miles—routinely logging 10,000 to 15,000 more
> every week — they have been tested many times more in software, where it’s
> possible to model 3 million miles of driving in a single day.
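The counterfactual replay the article describes, re-running the planner against logged sensor data to see what the car would have done after a takeover, could be sketched roughly like this. The frame format, planner interface, and safety rule below are all invented for illustration, not Google's actual pipeline:

```python
# Hypothetical sketch of counterfactual log replay: feed each logged sensor
# frame back to the planner and record any frames where the resulting plan
# would have violated a safety rule. All names here are invented.

def replay_counterfactual(logged_frames, planner, safety_check):
    """Re-run the planner on each logged frame; return (time, plan) pairs
    for frames where the plan fails the safety check."""
    violations = []
    for frame in logged_frames:
        plan = planner(frame)
        if not safety_check(frame, plan):
            violations.append((frame["t"], plan))
    return violations
```

With a toy planner that brakes whenever the gap to the lead car drops below 10 m, a log where the gap falls to 5 m replays cleanly; a planner that never brakes would be reported as violating at that frame. Running this offline is what makes "3 million miles of driving in a single day" possible.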

~~~
samstave
Thanks - I hadn't completely finished the article. :)

