
Google: Self-driving cars in 3-5 years. Feds: Not so fast - lispython
http://www.extremetech.com/extreme/147940-google-self-driving-cars-in-3-5-years-feds-not-so-fast
======
jmaygarden
If anyone is interested in some of the algorithms behind this technology, then
I highly recommend Sebastian Thrun's "Artificial Intelligence for Robotics"
course on Udacity [1]. He makes unintuitive probabilistic methods easily
understandable. I'm not aware of a better source for getting a basic
understanding of complex methods like Kalman filters and particle filters
[2][3]. I had never even heard of a "histogram filter" until recently
reviewing this material, and it's a perfect solution for a problem that I
currently have.

[1] <https://www.udacity.com/course/cs373>

[2] <http://en.wikipedia.org/wiki/Kalman_filter>

[3] <http://en.wikipedia.org/wiki/Particle_filter>
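The histogram filter mentioned above is the simplest of the three to see in code. Here is a minimal 1-D version in the spirit of the course exercises — the world map, sensor weights, and exact (noise-free) motion model below are illustrative assumptions, not actual course code:

```python
# Minimal 1-D histogram (Bayes) filter: a robot localizes itself on a
# cyclic corridor of colored cells by alternating sense and move updates.
# The map and sensor weights are made-up values for illustration.

world = ['green', 'red', 'red', 'green', 'green']
p_hit, p_miss = 0.6, 0.2   # likelihood weights for matching / non-matching cells

def sense(belief, measurement):
    """Bayes update: reweight each cell by the sensor likelihood, then normalize."""
    q = [b * (p_hit if cell == measurement else p_miss)
         for b, cell in zip(belief, world)]
    total = sum(q)
    return [x / total for x in q]

def move(belief, step):
    """Total-probability update: here an exact cyclic shift of the belief by `step`."""
    n = len(belief)
    return [belief[(i - step) % n] for i in range(n)]

# Start with a uniform prior, sense 'red', then move one cell to the right.
belief = [0.2] * 5
belief = sense(belief, 'red')   # mass concentrates on the two red cells
belief = move(belief, 1)        # belief shifts right with the robot
print(belief)
```

The same sense/move loop, with the discrete grid replaced by a Gaussian, is essentially the Kalman filter; replace it with random samples and you get the particle filter.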

~~~
mej10
I was surprised by how understandable the concepts behind this stuff are. I
second this recommendation. They are really cool algorithms.

For real life, there is obviously a huge engineering component, but you can
definitely see how the ideas could come together to build something like the
automated cars.

~~~
jmaygarden
Exactly! I linked to Wikipedia for contrast with how unintelligibly these
topics are often presented for us mere laymen.

------
epmatsw
I think the best way to roll out self-driving cars would be to build up a base
product (like Google has done), and then open source at least a large portion
of the software. Run a campaign that anyone reporting a critical bug gets
their name on the underside of the hood in every first generation car that
Lexus/Honda/Toyota and Google collaborate on. Then, run an ad campaign based
on X thousands of programmers reviewing code totaling X thousand days of
effort. Easily the safest strategy, and it gives an easy response to any
competitor: how many programmers have reviewed THEIR software?

~~~
micampe
Sure, and give away prototypes to test the software.

------
moistgorilla
I can't wait till the day I can have my self-driving car drive me to work
while I read a book. When it drops me off, it can go recharge its battery,
pick up my family from work or school, and maybe, when it's not in use, I can
rent it out as a cab.

~~~
JulianMorrison
The economics won't support it being _your_ car for long, at that point.

When you can just ask for a car to pick you up and one will arrive, corporate
robo-taxi fleets will have economies of scale over family cars sent off to
earn their keep. With ordinary people not needing to buy a car, facilities for
maintaining and supplying them will dry up too, as the manufacturing effort
shifts to taxi-ready cars for fleets. And the price will shift up out of your
range to reflect the earning potential.

The robo-car will end the private car as anything but a hobby comparable to
the classic car.

~~~
steverb
Except no company is going to want to buy enough cars to handle the demand
during rush hour.

~~~
JulianMorrison
Given those cars make two reliable paying journeys per day, minimum, why
wouldn't they?

~~~
saosebastiao
Because two is not enough.

------
ck2
Another good article today on self-driving cars.

<http://www.wired.com/autopia/2013/02/continental-autonomous-vehicle/>

 _“I drove to West Virginia in this car,” he recalls. “It’s about a nine-hour
drive from Detroit. Out of that entire nine hours, I drove for 45 minutes. And
the only reason I drove that much was because there was this nice mountain
pass, with sharp S-turns.”_

------
jeffreyrusso
I can appreciate that there are a lot of benefits to self-driving cars -
safety and the possibility of reduced traffic and congestion, amongst other
things. That being said, I'm always stunned at the complete lack of discussion
of their environmental impact (whether the impact of this technology will
ultimately be positive or negative).

If we are going to undertake such a major shift in how we all get from point A
to point B, shouldn't that be a significant point of consideration? It is
rarely even mentioned in the context of this discussion, which I find to be
disappointing.

~~~
JPKab
I actually have read discussions of this, but unfortunately don't remember
where... :(

However, that being said, here are some of the points I've seen:

Pros: Car sharing becomes easier. Since you can pull out a smartphone and
"order" a car to come pick you up at any point, there is less need to actually
own a car. This should lessen the total number of vehicles
manufactured/maintained/etc.; the huge amounts of energy needed to make steel
come to mind here. Also, with a CaaS (car as a service), many would probably
opt for a cheaper option where the car stops and picks up other passengers on
the way, offsetting the cost and therefore reducing the price. A person
doesn't have to arrange a carpool this way. If you make carpooling easier and
cheaper than driving alone (currently it is cheaper, but a hassle), then
people will do it more.

Reduced traffic congestion: This sets the stage for the other benefits. Cars
stopping and accelerating less means less fuel used. Many, many more
autonomous vehicles than human drivers can share a roadway without reducing
speed.

Cons:

More suburbanization/lower population density: This is a big possibility.
Living in a distant exurb would mean you probably have to own an autonomous
vehicle, or schedule one to pick you up well in advance, so car sharing
suddenly works less well.... but a 90-minute commute today would be much
faster on autonomous-vehicle roads, and people could read/chill/watch
news/surf the net etc. while cruising to work. Commuting is made more
relaxing, so more people do it....

The implications are very complex to untangle, but I really think it would be
an environmental net gain: if everyone is using car as a service, then the
CaaS providers will have cost minimization as a priority and will work to
constantly increase efficiency. People only care about fast acceleration and
performance when they are driving themselves. Computers don't have small
penises to compensate for.

------
efsavage
The key thing is that the government is not going to (and should not) get very
deep into regulating/legalizing this tech until it actually works, which is
fine. 2020 does seem more realistic for a mass market roll-out.

So long as it's ready by the time I'm too old to safely drive, I'll be happy.

~~~
randomdata
As long as the car can pass the tests set out for humans, there really
shouldn't be any need for government involvement. We already have defined
standards that we find acceptable. It's not as though humans are perfect and
the machine has to match that, though I do hope that the machines can exceed
those existing expectations.

~~~
efsavage
We've defined standards _for humans_ that we find acceptable, in part because
there is much that is implicit in the operator being a human. For example, a
cop at an intersection, holding his hand up, then pointing left to redirect
traffic is not something we test in a driver's test, but is something we
expect human drivers to understand and obey. Perhaps the cars can handle this
already, but there are likely hundreds, if not thousands, of these scenarios
that will need to be thought through now.

There is also a lot of legal work that will begin as these roll out,
especially dealing with liability, and these things take time that simply
selling millions of these cars in 3-4 years will not allow for.

------
digitalmerc
One thing that always comes to mind, and this may just be the pessimist in me,
but remember the "flash crash"? Where a bunch of trading algorithms triggered
short sells as a reaction to other trading algorithms triggering short sells?

Well think of that, but with cars, with people in them...

Edit: I think it's being misconstrued that I am somehow against Google Cars,
when in fact I'm very much for them. Regardless of the likelihood of the
aforementioned scenario, I agree that it's still far better than the wildly
unpredictable human factor. And ultimately I think that self-driven cars will
be a boon both for road safety as well as fuel economy and overall emissions.
(Not to mention traffic, I can't wait for a world when traffic is basically
non-existent)

One thing I realized after making this comment, too, is that road situations
are far easier to predict than the randomness of the market, and the
consequences are much higher than zeros in a bank account, so I'm sure there
will be fail-safes.

~~~
ceejayoz
As the flash crash showed, if you screw up enough in the market, they'll just
hit the reset button and undo the trades.

The same isn't true for cars, and that'll likely affect the code testing
process.

Plus, if you're going to reference the flash crash, I'll point out that
people, not computers, caused the Great Depression.

~~~
freehunter
That's a good point. For every Flash Crash we've seen, how many human-error
market crashes have there been? For every algorithmic auto crash that could
ever happen, how many human-error auto crashes have there already been? Just
24 hours ago I had to go pick up my girlfriend from the side of the road after
someone swerved into her car after not looking in his mirrors. The complaint
that algorithms might cause an unintended car crash, so we should continue to
have only people behind the wheel, rings a bit hollow when people-driven cars
are one of the biggest killers in Western society.

------
Osiris
This technology has the potential to save tens of thousands of lives each
year. How can anyone possibly argue that the technology shouldn't be on the
roads until it's 100% safe (which will probably never be possible)?

So, it's okay for tens of thousands of human drivers to be killed while
driving, but inconceivable for tens of people to die at the hands of a
computer?

~~~
martinced
Imagine that some crazy schizophrenic-paranoid-whatnot-wacko manages to create
a botnet made of 100,000 autonomous cars and decides to make them accelerate
simultaneously and crash into each other...

Note that we're living in a world where hardly a day passes without some
major security exploit being discovered, and hardly a day passes without
hearing about some foreign government taking control of domestic computers.
What if Iran manages to gain control of all the autonomous vehicles? Isn't
that worth a thousand atomic bombs?

And, yes, there's a psychological problem when there's no human to blame, just
a computer or software (and, actually, ultimately the entire chain up to the
politicians who allowed such cars to be used on the road and the programmers
who wrote the software). It's alienating to society as a whole, and it's a
real issue. People are trying to find solutions to deal with this problem.

~~~
Osiris
That assumes that the cars have some mechanism to wirelessly communicate over
very large distances. Are you sure these cars are being designed with mesh-
style communications?

------
rapind
Release early, release often? Wonder if that'll work for driving software.
Chrome Version 24.0.1312.57 here.

------
eksith
I don't see how it could be any worse than some people's driving already.
Compare the failure rate of embedded devices, software, and servo hardware
with that of a tired human being coming home after a long day's work, and I'd
take the "smart car" over the exhausted human any day.

------
supersaiyan
"We expect to release the technology in the next five years." It's extremely
unlikely that it will be the entire car; more likely backup/safety features,
at most the self-parking like we saw at CES, at least until there is enough
data to support their expansion.

------
martinced
Ah, HN's double standards...

I mean: it's amazing to see all these people saying, on one side, "It's normal
that there are hundreds of millions of zombies in gigantic botnets; we'll
never have 100% secure software" and, on the other side, these very same
people saying: "Can't wait to have an autonomous car in 3 years / 15 years".

So which is it? Insecure software or safe cars?

So are theorem provers going to be used to prove that the OSes in these cars
are bulletproof? And the compilers, of course, shall have been proved secure
too?

If not, it means there are software security issues, right? Someone commanding
one million zombie PCs is not cool. Someone commanding one million zombie
autonomous cars is downright scary.

~~~
tokenadult
I don't think that there is any double standard here. I think most of us who
have driven in most countries have noticed that there are some truly
incompetent human drivers on the road who raise risk for everyone. It's enough
for self-driving cars to reduce the risk of traveling from place to place. If
I must travel, why not let the machine do the driving, while I think about my
work with full attention or engage in undistracted conversation with fellow
passengers? You are of course entitled to travel only by means like commercial
airlines or railroad lines that will not be self-driving as soon as some cars
will. I think the regulators of highway traffic in each country will make
reasonable trade-offs about the timing and conditions under which self-driving
cars will be allowed on the roads I travel. I look forward to that day, having
seen how local human drivers drive.

------
martinced
On second thought, I'm wondering about something: everybody here seems so
convinced that AI is so advanced and computer software so secure that the
benefits here far outweigh the drawbacks (including the loss of more and more
individual liberty to the benefit of giga-corporations)...

So what if there are actually zero deaths? The cars so good, with a feedback
loop so fast, that you simply can't have accidents anymore...

Will the speed limits then be raised to 100 mph on the street and 200 mph or
more on the highway?

~~~
philsnow
> the loss of more and more individual liberty

wait, what?

> giga-corporations

... wat. Google is at most a kilo-corp.

------
martinced
There are already software failures in cars today that have created serious
problems, like drive-by-wire cars that wouldn't stop accelerating.

As of now the problem is mitigated by having someone behind the wheel. Even
when the motor shuts down and you lose power assistance, you can still turn
the steering wheel, brake and/or use the handbrake. This has saved many lives.

But how often do OSes and software have bugs? Security issues? I can't imagine
for a second that these cars are going to be fully autonomous and not
connected to a network. How can we possibly make that network safe?

Lately on HN the consensus seemed to be: "Every software has bugs, every
software can be exploited if the attacker spends enough time".

So is the software in these cars going to be safe? Safe from ill-intentioned
exploits, safe from bugs potentially threatening lives?

Now, if there are fewer deaths than what we typically have in a year, okay.
You'll have to check the numbers.

But there are going to be deaths. That is just a fact.

And the public reaction when there's a death due to a "computer driving" is
going to be terrible.

I mean: it's amazing to see all these people saying, on one side, "It's normal
that there are hundreds of millions of zombies in gigantic botnets; we'll
never have 100% secure software" and, on the other side, these very same
people saying: "Can't wait to have an autonomous car in 3 years / 15 years".

So which is it? Insecure software or safe cars?

~~~
untog
_Insecure software or safe cars?_

Your implication being that there is such a thing as a safe car. I think that
people are excited for self-driving cars on the basis that they will kill
fewer people on average than human drivers do.

It's a slightly depressing way to think about things, but it's realistic.
However, the media hysteria that will surround the first self-driving car
accident/killing will be critical.

~~~
thedrbrian
I'm curious where, or with whom, the buck stops. If my driverless car causes
an accident, who is to blame? It couldn't have been me, as I gave it full
control and was asleep in the back. Is it the developer of the software or the
builder of the car? Maybe a sensor failed, and thus the finger is pointed at
me, but I point it at the dealer who serviced the car. Given that the people
who would most benefit from this tech are the same people who are least likely
to give a shit about the upkeep of their car, we are going to have a few years
of interesting news headlines.

------
OGinparadise
_“We expect to release the technology in the next five years. In what form it
gets released is still to be determined.”_

You're talking, what, another 2-5 years of testing and adaptation from car
manufacturers, _if_ they choose to go with Google? Almost all of them have
their own projects and, IMO, are unlikely to be tied to Google and their
likely demands.

