

Google Street View: A wolf in sheep’s clothing - martey
http://www.holovaty.com/writing/streetview/

======
melvinram
Decent thought experiment, but the whole "wolf in sheep's clothing" part is
just a link-bait title or a really bad analogy. A more appropriate analogy
would be "a head fake", as the late Randy Pausch so aptly put it.

~~~
pauljburke
Probably just a really bad analogy. They also used "vicious cycle of driving
and learning" where "virtuous circle" (which I take to mean a system with
positive feedback) would be more appropriate.

~~~
natep
Well, both vicious and virtuous cycles involve positive feedback (i.e.
feedback in the direction of current change). The difference is that virtuous
cycles tend towards positive outcomes.

The whole piece uses terms with negative connotations to convey what I think
is meant to be a positive message of being impressed by ingenuity. So I'm not
sure which is the more charitable interpretation: that he was extremely
careless in writing the post, or that he was deliberately misleading to get
views.

------
tomelders
If I squint my eyes and ignore all the facts that are already out there about
Google's driverless cars, then I suppose this could be true.

I sometimes enjoy these sorts of articles. They're the fan-fiction of the tech
world and it's no different to the sort of conjecture people like to come up
with for Apple, which can be shameless fun to read.

But when there's already a ton of information out there about how Google's
driverless cars work, it just seems cheap and hollow.

Also, I don't get the title.

~~~
jessriedel
> all the facts that are already out there about Google's driverless cars

Are you referring here to the fact that the driverless cars follow
pre-determined paths rather than reading traffic signals? Or do you just mean
that the proposed motivation doesn't really hold water? Because it seems
pretty plausible to me that Street View data could be very useful to the
driverless car, even if other data is more important and even if Street View
is better motivated on its own.

~~~
tomelders
Google's driverless cars have had a lot of PR recently, with some pretty
decent coverage of how they actually work.

The cars don't follow predetermined routes. At present they do learn routes,
but not using Google Street View data. Actually, the opposite is true: the
intention is for the driverless cars to generate 3D data for Street View.

~~~
jessriedel
According to this blog post, the map data is pretty important.

> Two things seem particularly interesting about Google's approach. First,
> [Google's driverless car] relies on very detailed maps of the roads and
> terrain, something that Urmson said is essential to determine accurately
> where the car is. Using GPS-based techniques alone, he said, the location
> could be off by several meters.

[http://jalopnik.com/5851324/how-googles-self+driving-car-wor...](http://jalopnik.com/5851324/how-googles-self+driving-car-works)

Do you know for sure that they don't use Google Street View data? If so, I
think you're unusual in knowing that.

------
koide
Everyone who took Sebastian Thrun's Udacity "Building a self-driving car"
course (or anybody who has had training in robotics) will see that building a
self-driving car takes a whole lot more than mere training data.

It could be helpful in certain scenarios, but it's surely not enough for the
cars' data needs to be the main reason behind Street View. As the author
himself notes, the map business is huge, and it's even bigger when coupled
with always-connected, location-aware devices.

An unlikely, if fun, story.

~~~
DennisP
I took most of Thrun's course. There was a lot about robot localization based
on pre-existing map data...exactly what StreetView is collecting.

If the StreetView cars are using LIDAR, they have a lot of high-quality 3D
maps, perfect for robot localization. Even if they're just taking photos,
various groups have demonstrated building 3D models from collections of
photos.
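For anyone who hasn't taken the course, the localization technique in question can be sketched in a few lines. This is a toy 1-D Monte Carlo localizer in the spirit of that course material; the landmark "map" stands in for the kind of prior data a StreetView-style survey could supply, and every name and number below is illustrative, not Google's actual pipeline:

```python
import random
from math import exp, pi, sqrt

LANDMARKS = [20.0, 70.0, 150.0]   # known map features along a 1-D road
WORLD_SIZE = 200.0

def sense(x, noise=0.0):
    """Measure the distance to each known landmark (optionally noisy)."""
    return [abs(x - lm) + random.gauss(0, noise) for lm in LANDMARKS]

def gaussian(mu, sigma, x):
    return exp(-((mu - x) ** 2) / (2 * sigma ** 2)) / sqrt(2 * pi * sigma ** 2)

def localize(measurements, moves, n=1000, sense_sigma=2.0, move_sigma=1.0):
    particles = [random.uniform(0, WORLD_SIZE) for _ in range(n)]
    for z, u in zip(measurements, moves):
        # weight each particle by how well its predicted ranges match z...
        weights = []
        for p in particles:
            w = 1.0
            for pred, obs in zip(sense(p), z):
                w *= gaussian(pred, sense_sigma, obs)
            weights.append(w)
        # ...resample in proportion to weight, then apply the noisy motion
        particles = random.choices(particles, weights=weights, k=n)
        particles = [(p + u + random.gauss(0, move_sigma)) % WORLD_SIZE
                     for p in particles]
    return sum(particles) / n   # posterior mean position

random.seed(0)
true_x, zs, us = 50.0, [], []
for u in [10.0] * 5:            # drive forward in 10 m steps
    zs.append(sense(true_x, 0.5))
    us.append(u)
    true_x += u
estimate = localize(zs, us)     # should land near true_x = 100
```

The point is that the particles converge on the true position only because the landmark map already exists, which is exactly the role a prior survey plays.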

~~~
koide
Yes, that's right. What I find hard to believe is that the main reason behind
the project is the self driving car. I can easily believe it's one of a set of
objectives.

------
hinting
Is there any evidence for this? About 15 seconds of looking around suggests
it's probably not true.

- 2007: Street View released

- 2007: DARPA Urban Challenge

- After 2007: Google hires several of the teams that won. This is where they
got most of their expertise in self-driving cars.

- Also note that the Urban Challenge winners didn't rely on "lots of data"
but instead "lots of sensors"

Not really a fan of this type of random speculation.

<http://en.wikipedia.org/wiki/DARPA_Grand_Challenge>

<http://en.wikipedia.org/wiki/Google_Street_View>

~~~
tung92
Prof. Thrun was part of the team that used laser range-finders to assist the
image stitching in StreetView. Since both sensors (rangefinders and cameras)
were mounted on a mobile vehicle, wheel odometry and GPS "ground truth" must
have been available at each waypoint as well. The author of the post suggested
that Google might have used the GPS data (and possibly rangefinders) to create
a simulated world to teach the self-driving car (useful for testing new
tweaks?). This is a good idea and probably did not take them a lot of effort
given the available log of data from StreetView.

------
uuilly
I worked for a company that made the first two Street View systems. They also
contributed a lot to the robocar project and eventually got bought by Google
to work on Chauffeur.

The sensors and math that provide the perception component of chauffeur are,
for conversational purposes, identical to those of street view. But the two
teams are not working together. The demands of each project are too different.

~~~
riffraff
It also feels like in some cases the data would simply be useless, e.g. data
collected in Rome and other historical cities by pedaling bikes in pedestrian
areas.

~~~
idspispopd
I agree. The author assumes that any kind of driving data is all that's
needed to build the driverless-car system, when so far it's been revealed
that the current level of technology requires the _same_ route to be driven
many times. I.e., it's not generalised yet, and not the Trojan horse it's
purported to be. (That, and the driverless car relies on sensor technology
with a resolution that wasn't available when the Street View program
originally launched.)

------
lucb1e
I thought Google's Street View cars had to drive slowly to be able to take
those pictures. Data captured below full driving speed isn't really a good
learning set.

Also, the link says they get to "know every road by definition", but you
could read that almost entirely from TomTom's maps as well. Roads change,
weather conditions change, etc. I'm not so sure Google really uses their
Street View car data for their self-driving cars. Though it's almost
certainly a part of it, I don't think it's the crux of solving the
self-driving-car problem.

~~~
ordinary
_I thought Google's Street View cars couldn't drive very fast to be able to
take those pictures. When not driving at full speed, it's not really a good
learning set._

I'd argue that the vast majority of the roads where one drives faster than 50
km/h are very simple. This is especially true for highways; they have to be,
because humans are bad at thinking at 120 km/h and even worse at surviving
collisions at that speed. There's an overabundance of signs and road
markings, and an incredible effort has been put into making it relatively
hard for drivers to behave irrationally. Compare that with, say, a multi-lane
roundabout in the inner city.

Once you've got enough data to reliably survive that kind of situation, you're
90% of the way there.

------
greendestiny
The author is going to be even more blown away when he finds out Google has
been using reCAPTCHA to train image recognition on Street View data. Of
course, seeing Street View as simply a long bet purely for self-driving cars
seems like a narrative fallacy.

------
pilgrim689
Google's self-driving car's main sensors are a 360-degree rotating laser and
a radar, NOT a camera... unless Sebastian Thrun has been lying in his Udacity
CS 373 course.

~~~
jc4p
Seeing how Sebastian is the head of the team working on it, I think it's safe
to believe him on this one.

------
idan
One big hole in the logic:

Street view cars all work during the daytime. There is no nighttime street
imagery on streetview, and anecdotally, I've never seen a streetview car at
night.

Having self-driving cars during daylight hours only is still an ambitious
goal, though.

~~~
tomelders
I don't think that would matter. Google's driverless cars use 3D laser
scanners, which don't require any ambient light. Plus, computers can
theoretically see a lot more of the spectrum, and therefore a lot more than
we mere mortals. They could look at the road ahead and know exactly what
temperature it is, and possibly even see black ice. They could see a ninja on
a black horse, at night, in the rain and fog, from miles away using thermal
imaging.

Daylight driving, to a machine, would probably be more difficult than night
time driving simply because there's more stuff happening on the roads during
the day.

------
cypherpnks
This is kind of a "no duh, Sherlock" non-story.

Fact 1: Sebastian Thrun co-developed Google StreetView

Fact 2: Sebastian Thrun is developing the driverless car.

A lot more goes into a self-driving car than data about how drivers drive;
that's maybe 3% of the problem. Nevertheless, assuming Sebastian is too dumb
to make the connection would imply a fairly high level of stupidity on his
part. That's a pretty bad assumption.

~~~
joejohnson
Fact 1: Elon Musk helped found PayPal

Fact 2: Elon Musk is developing rockets, with hopes of one day sending people
to Mars.

It is clear that Elon Musk is only trying to corner the mobile payments market
on Mars. Don't assume Elon is too dumb to realize the market potential of an
entirely new planet!

------
zyb09
It sure fits nicely together, but I doubt this was the plan all along.
Google's general strategy seems to be to collect as much data as possible,
just in case it might become useful one day. It was a good decision to load
up the Street View cars not just with cameras for pictures but with all kinds
of sensors to collect a bunch of potentially useful metrics. That data is
coming in handy for quite a few things now.

------
Arelius
OK, let's be serious. Street View data may or may not be useful for the
driverless cars. But the original, primary motivation for Street View was the
ability to build their own map data and remove their dependency on existing
map providers like Tele Atlas. Driverless cars were frankly too far out, and
not close enough to their primary business model, to be worth it at the time.
And being able to drop their map-licensing requirements gave them infinitely
more freedom at a time when their mapping product was especially important to
them.

------
chrsstrm
If you have ever used street view to catch a glimpse of your destination at an
unknown address or to look for key landmarks at important intersections,
you'll notice right away that the Street View data is the last thing you would
want powering an autonomous car. Many of the mapped roads only show the view
from one lane, and if the road is more than one lane wide, you're missing data
on the positions of the other lane(s). I can't count the number of times the
view I wanted was on the other side of a median or divider, with no data for
that side. Following a pre-mapped course on a multi-lane road using only
Street View data would be a nightmare. Reminds me of the stories of people
pulling U-turns in the middle of a freeway because their GPS told them to.

~~~
bretthoerner
I think you misunderstood the point of machine learning. The cars wouldn't
drive by memorizing each street, but rather by knowledge of what has been
done in similar situations thousands of times before.
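The "similar situations" idea can be illustrated with a toy nearest-neighbour lookup. Everything here is invented for illustration (the features, the logged data, the scale of the numbers); it just shows reusing the action taken in the closest previously-seen situation rather than recalling a specific street:

```python
from math import dist  # Euclidean distance, Python 3.8+

# Logged (situation, action) pairs:
# (road curvature, current speed in km/h) -> steering-wheel angle in degrees
LOGGED = [
    ((0.00, 100.0),  0.0),   # straight highway: hold the wheel steady
    ((0.10,  50.0),  8.0),   # gentle right bend at moderate speed
    ((0.30,  30.0), 25.0),   # sharp right turn, slowed right down
    ((-0.10, 50.0), -8.0),   # gentle left bend
]

def predict_steering(situation):
    """1-nearest-neighbour: reuse the action from the closest logged case."""
    _, action = min(LOGGED, key=lambda ex: dist(ex[0], situation))
    return action

# A new, never-seen gentle right bend gets handled like the logged one.
angle = predict_steering((0.11, 49.0))
```

A real system would of course use far richer features and a proper learned model, but the generalisation principle is the same.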

~~~
chrsstrm
Yes, and my point is that even after five years of mapping, they still don't
have those thousands of similar situations to use as a reliable data set.

------
pkulak
So that cars can drive with the 80-90% success rate that voice recognition has
achieved with similar methods? If you can model your entire surroundings
several times a second to cm precision, you don't have to settle for machine
learning and "just about perfect".

------
josephcooney
linkbait in sheep's clothing.

------
alainbryden
It only just occurred to me: are these self-driving cars expected to pump
their own gas? Surely Google's not prepared to build a complete network of
fuelling stations that are compatible with unmanned cars?

~~~
splatzone
What's stopping them from making deals with petrol stations to have the
attendants pump the petrol for them?

------
Jun8
"You put a camera on the front of your car, and you set it to capture frequent
images of the road ahead of you while you drive. At the same time, you capture
all the data about your driving -- steering-wheel movement, acceleration and
braking, speed"

I'm curious: how does one capture the car data? In the self-driving car, the
ML and camera parts seem easier than the interface with the car's mechanics,
yet there's generally little mention of that part.

~~~
shriphani
I know someone who used an Arduino to pull in sensor data from their vehicle.
I don't think the project is online, but they got a very solid amount of data
over a period of three months, including braking, acceleration, and steering.
All of this without any distinct modifications to the body.
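A setup like that almost certainly reads the car's OBD-II diagnostic port, standard on cars since the mid-90s. A minimal sketch of decoding a few standard SAE J1979 mode-01 PIDs; the raw frames below are made-up example bytes, not logged data from that project:

```python
def decode_pid(pid, data):
    """Decode a handful of standard mode-01 PIDs (data = raw payload bytes)."""
    if pid == 0x0C:                        # engine RPM: ((A*256)+B)/4
        return (data[0] * 256 + data[1]) / 4.0
    if pid == 0x0D:                        # vehicle speed: A km/h
        return float(data[0])
    if pid == 0x11:                        # throttle position: A*100/255 %
        return data[0] * 100.0 / 255.0
    raise ValueError(f"unsupported PID {pid:#04x}")

# Example response frame: 0x41 marks a reply to mode 0x01, then PID, then data.
frame = bytes([0x41, 0x0C, 0x1A, 0xF8])    # an engine-RPM reply
assert frame[0] == 0x41
rpm = decode_pid(frame[1], frame[2:])      # 0x1AF8 / 4 = 1726 RPM
speed = decode_pid(0x0D, bytes([0x3C]))    # 0x3C = 60 km/h
```

On an Arduino the same decoding would run over a CAN or ELM327 interface, which is presumably how that project got its data without touching the car's body.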

------
Schwolop
This article is flat out wrong. Google's autonomous vehicles do the data
collection for Google's autonomous vehicles.

They drive ahead of time around every environment in which they want to
operate, with a bunch more sensors and more accurate localisation than the
streetview vehicles. The lidar on the streetview vehicles is intended to
provide a 3D surface model of the buildings lining each street. I find it very
doubtful that they'd attempt to do supervised learning of human driving
behaviour from the streetview vehicles, rather than the actual automated ones.

------
antninja
Lots of data won't do any good with imperfect algorithms. Just look at all the
nonsense generated by Google Translate. It doesn't even know grammar.

~~~
smanek
Actually, Google Translate was an enormous leap forward in machine-translation
quality. It won a number of awards for its astonishingly good performance. And
its design premise is that you can use really simple algorithms if you have
crazy amounts of training data (a then-controversial approach called
'statistical machine translation', as opposed to rule/grammar-based).

Look at the paper by Norvig (Google's head of research, and AI demigod) et
al., "The Unreasonable Effectiveness of Data."

~~~
waterlesscloud
It might have been an enormous leap, and it might have won a lot of awards.

But it still produces a great deal of nonsense.

My only semi-informed opinion (hunch) is that the Google/Norvig brand of
statistical approach to AI is an 80% solution to a lot of things, but that
last 20% is going to be killer to get.

Right now this approach to AI is a great boost for humans, who can finish off
that last 20% themselves, but I have doubts about the autonomous versions...

I am curious to see what happens with the self-driving car in real world use.
And if translate ever gets much better than it is today. Or if, for that
matter, Google search gets much better than it is today.

------
craig552uk
"the value of the data about the collection of the data" A powerful idea.

------
nextstep
>>"UPDATE: I changed the title of this post from “wolf in sheep’s clothing,”
as that was a lame metaphor that didn’t actually make sense."

I think the original title is perfect: this is a lame post which doesn't make
much sense.

------
wavephorm
The driverless car seems like just a side project of Google's primary
business goal: stealing social-network market share from Facebook by shoving
Google Plus down our throats. I mean, what's more important: innovating and
giving people a technology that will improve their lives, or forcing them to
use a website that shows them cat pictures and context-aware advertisements?

~~~
Tycho
Interestingly, the age of the driverless car would suit Google's core business
for two reasons:

1. People would have more time to surf the web (instead of driving)

2. People will have more disposable income (because presumably driverless
cars will save us all a lot of money)

~~~
vezycash
You forgot to add the Project Glass thing. Ads would literally be in your
face! Sometime in the future, they'll also be able to control people's
psyches through the Glass thing to mask their evil doings.

~~~
damian2000
The plan is to project the ads onto the windscreen of your car; since it's
driverless, you don't need to see out of it anymore. ;-)

~~~
vezycash
So the windscreen would be a touch screen or something, lol. How would the
ads be tracked or interacted with if they're on the windshield?

I'm assuming the car owner would be sitting in the back.

------
slavak
Is the author seriously implying that a car loaded down with lots of cameras
and averaging 30 MPH is Google's ultimate vehicle for training driverless
cars?

If he's right, God help us all.

