
Google Maps Mania Blog is Dropping Google Maps - chippy
https://plus.google.com/app/basic/stream/z13qgzd5gza1ed4gw04ccn1ggua0dbohmd0
======
mbrubeck
What gets me is that Google employees like JohnMu (in the following thread)
are actually telling authors to _hide large portions of their sites_ from
Google's indexer, in order to prevent _all_ their pages from receiving
penalties for "low-quality" content:

[https://productforums.google.com/forum/m/?hl=en#!mydiscussio...](https://productforums.google.com/forum/m/?hl=en#!mydiscussions/webmasters/T-4i0yB7CrQ)

That feels like a step back from Google's stated mission "to organize the
world’s information and make it universally accessible and useful":

[http://www.google.com/about/company/](http://www.google.com/about/company/)

Apparently, only the information that maximizes search advertising revenue is
worth organizing and making accessible. I understand that fighting SEO spam is
an important part of keeping search usable, but there must be a way to do it
that doesn't lead to completely cutting out large parts of the "indie" web
(e.g. posts in discussion forums, and other sites that host user-generated
content).

~~~
haberman
> Apparently, only the information that maximizes search advertising revenue
> is worth organizing and making accessible.

I don't even understand the theory of how there is a profit motive here.

How does Google make more money by asking webmasters to hide parts of their
site from the indexer?

Doesn't Occam's Razor say that a more likely explanation is that innocent
sites are being caught in the never-ending arms race between spammers and
Google?

If Google didn't care about the "indie" web, why would Google employees take
the time to coach indie website operators on how to improve their sites'
rankings?

I know this is a frustrating situation for everyone involved. But the cynicism
that says there is some ulterior motive cuts deep, especially coming from
someone I respect like you, Matt.

~~~
mbrubeck
The cynical side of my concern is that search engines under pressure from
spammers will let niche/indie content bear the brunt of the cost, because that
content is not "high quality" from the search engine's perspective of "we have
valuable ad inventory that lets us monetize these searches." Not all searches
are profitable. Similarly, "sophisticated" users are less lucrative because we
click fewer ads. A bias toward readily available metrics can easily lead to
algorithm changes that benefit the mainstream majority while punishing various
niche use cases as regretted but ultimately acceptable collateral damage.

The conflict is between "the world's information" (mission) and "the world's
information, minus some troublesome long-tail bits that few people care about
and aren't profitable anyway" (business motive).

> why would Google employees take the time to coach indie website operators
> on how to improve their sites' rankings

This "coaching" is actually Google pushing the spam problem onto content
authors, promoting "solutions" that benefit Google while still hurting legit
publishers. _" Spam has gotten so bad that our algorithms can't classify it
without misclassifying you. You can either hide some of your content as a
proof that it's not link spam, or have us hide even more on presumption that
it is."_ Yes, this gives Google a powerful tool for weeding out fake/malicious
content, but it also hurts the user experience and the mission in the long
run, compared to solutions that allow for more manual review, more use of
inbound link disavowal, or finer-grained penalties that target only the
possibly "low-quality" pages without spilling over to entire sites. Those
solutions would probably shift more cost to Google, however.

What's really bad is that some of Google's recent recommendations (e.g. "use
the noindex robots meta tag") will remove real knowledge not just from Google
but from other present and future search engines (even if those engines
wouldn't have the same problems Google did with that content). And even if the
algorithm's limitations/errors are eventually resolved, the recommendations
will live on much longer in legacy content (and in SEO folklore), meaning that
some content will be erased from search _permanently_ for no reason. These are
long-term costs that are borne not by Google but by users, publishers, and
competing search engines.
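
For reference, the recommendation in question is the generic robots meta tag,
which withdraws a page from _every_ compliant crawler, not just Google's,
even though a Googlebot-specific variant also exists:

    <!-- removes the page from all compliant search indexes -->
    <meta name="robots" content="noindex">

    <!-- removes it from Google's index only -->
    <meta name="googlebot" content="noindex">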

Personally, I want to be able to search ALL THE THINGS even if it means that I
sometimes have to wade through more crap, and that it's harder work for the
search engine to weed out people who are gaming the system. Maybe Google can't
offer me that without degrading its more mainstream users. Even if that's the
case, it still feels like a departure from principles (or maybe a declaration
of defeat) for Google to be making that choice. Even more so when it hurts the
ability of other search engines to step in and index the stuff that Google
doesn't want.

The reason I want to see Google hew strictly to "the mission" is that it
offers a way to let these big-picture issues override things that are more
immediately measurable. I know I became an extensive user of Google services
mainly because of the way its mission-oriented thinking was visible in its
product design choices. That intangible feeling was more important than any
particular feature. Keeping it alive requires the ability to say, "Screw it,
we're not going to go this route no matter its obvious benefits, simply
because it's not in line with the mission." But I also know that it's not
possible to do that in every decision; even here in the non-profit, mission-
driven Mozilla project, principles alone can't win every fight. (See
[http://robert.ocallahan.org/2014/05/unnecessary-dichotomy.ht...](http://robert.ocallahan.org/2014/05/unnecessary-dichotomy.html)
for an interesting take on this.) And these things tend to
look different from the inside; maybe Panda was a case where de-indexing legit
content was the least-bad path available. But from the outside, every time
"pragmatism" wins over "principle" is a little chink in the armor. And for
Google, the outside view is the only one I have.

~~~
Apofis
Precisely why we should be supporting companies like DuckDuckGo.

~~~
admax88q
DDG is supported by advertising. I don't see why they'd be immune to the same
conflicts of interest as Google.

~~~
pjc50
What search engine isn't?

~~~
ZoFreX
One that you pay for.

~~~
gone35
Or a non-profit search engine as a public good service --like Wikipedia, or
the Internet Archive.

Obviously storage and networking costs would make it impossible today, but it
might be feasible in the not-so-distant future. Either way, sooner or later
people will realize the absurdity, from a public-interest point of view, of
having "all the world's information" in the hands of a single corporation
whose sole legal purpose is to maximize its owners' profit --as happened
with other media in the past, such as broadcast news.

------
jacquesm
Any kind of algorithmic classifier will have false positives at a given level
of confidence. The lower the threshold at which you start banning sites, the
more false positives you will have; there is no silver bullet here.
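
A toy sketch of that tradeoff (my own illustration, nothing to do with
Google's actual classifier): score every site for "spamminess", penalize
everything above a threshold, and watch what happens to legitimate sites as
the threshold drops:

    interface Site { name: string; isSpam: boolean; spamScore: number; }

    // Penalize every site whose score meets the threshold, then count how
    // many innocent sites get swept up along with the spam.
    function penalize(sites: Site[], threshold: number) {
      const flagged = sites.filter(s => s.spamScore >= threshold);
      const falsePositives = flagged.filter(s => !s.isSpam).length;
      return { flagged: flagged.length, falsePositives };
    }

    const sites: Site[] = [
      { name: "link-farm.example", isSpam: true, spamScore: 0.95 },
      // a legit site that happens to look "spammy" to the model:
      { name: "maps-blog.example", isSpam: false, spamScore: 0.7 },
      { name: "personal-blog.example", isSpam: false, spamScore: 0.2 },
    ];
    console.log(penalize(sites, 0.9)); // { flagged: 1, falsePositives: 0 }
    console.log(penalize(sites, 0.6)); // { flagged: 2, falsePositives: 1 }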

So when Google decided to crack down on a class of sites, sites that shared
enough traits with the ones being axed came into the danger zone, and a
number of those ended up being penalized enough that they are no longer
viable / interesting enough to maintain.

This appears to be one of those cases.

Being dependent on a single entity for your hosting, for the traffic you are
receiving, and for the service you are directly promoting (in this case maps)
makes you an add-on to their ecosystem.

I wonder how many small but viable websites are still in this position and how
many of them will give up after finding their traffic decimated because they
ended up being a false positive in some filter.

It might be more productive to try to figure out what exactly caused the
penalty, but I fully understand if the owner of the site just wants to move on
and become as independent of this as he can.

Let's hope that 'pointing to Google Maps' versus 'pointing to OpenStreetMap'
won't trigger another penalty, because then he might lose what traffic is
still there, or a good-sized fraction of it.

------
Pxtl
I'm starting to wonder if killing the "content farms" like About and Answers
and the like isn't a case where the cure is worse than the disease.

Often, when you searched for a "how do I" question, the content farms were
the only things that produced a result that was even _vaguely_ topical, at
least before Panda.

I mean, it usually went content farms -> Pure SEO spam -> actual relevant
results. Content farms weren't good, but they still beat the SEO spammers. At
least the content farms had some minimal connection to the topic, unlike say
eBay.

~~~
thetrb
I actually liked sites like eHow, but I'm probably in the minority here. If I
wanted a quick answer to a question, these sites provided one. They don't go
into a lot of depth, of course, but if I want that I just use slightly more
specific search terms.

~~~
wpietri
They had good answers?

Everything I ever saw on a content farm looked like it was written by somebody
who knew nothing about the topic, but did a quick Google search and extracted
some semi-random bullet points from the first few things they saw. Basically,
I could get the same value by clicking on the first non-content-farm link and
then skimming.

~~~
TazeTSchnitzel
The quality varies. Sometimes the content is good.

------
dfc
_Turkey voting for Christmas_

That was a new phrase for me. It seems to be chiefly British, but the
Wikipedia page suggests that the American alternative is "turkey voting for
Thanksgiving." I searched Google Ngrams for "turkey[s] voting for
christmas|thanksgiving" and it seems that the British variant is a relatively
recent phenomenon. Ngrams has no results for the American variants. Is this a
common phrase in the UK?

Google NGrams result:
[https://books.google.com/ngrams/graph?content=turkey+voting+...](https://books.google.com/ngrams/graph?content=turkey+voting+for+christmas%2Cturkeys+voting+for+christmas%2Cturkey+voting+for+thanksgiving%2Cturkeys+voting+for+thanksgiving&case_insensitive=on&year_start=1900&year_end=2008&corpus=15&smoothing=4&share=&direct_url=t1%3B%2Cturkey%20voting%20for%20Christmas%3B%2Cc0%3B.t1%3B%2Cturkeys%20voting%20for%20Christmas%3B%2Cc0)

~~~
songshu
From my life in Britain the phrase always brings to mind its usage in
Parliament in 1979
([http://en.wikipedia.org/wiki/1979_vote_of_no_confidence_in_t...](http://en.wikipedia.org/wiki/1979_vote_of_no_confidence_in_the_government_of_James_Callaghan))
which seems to have been the first time many of us encountered it.

~~~
TazeTSchnitzel
It comes from Callaghan's speech (as quoted in the article):

"We can truly say that once the Leader of the Opposition discovered what the
Liberals and the SNP would do, she found the courage of their convictions. So,
tonight, the Conservative Party, which wants the Act repealed and opposes even
devolution, will march through the Lobby with the SNP, which wants
independence for Scotland, and with the Liberals, who want to keep the Act.
What a massive display of unsullied principle! The minority parties have
walked into a trap. If they win, there will be a general election. I am told
that the current joke going around the House is that it is the first time in
recorded history that turkeys have been known to vote for an early Christmas."

------
chippy
From the post:

"Tomorrow I'm going to feature the very last Google Maps on Google Maps Mania,

The blog now gets 10% of the Google search traffic it did just 18 months ago.
With Google attempting to kill off Google Maps Mania it would be like a turkey
voting for Christmas for me to continue to promote Google Maps and the Google
Maps API.

Last week I came very close to giving up completely. But despite Google I
still think there is an audience for the blog. So from Wednesday Google Maps
Mania will be featuring maps created with Open Street Map, Map Box, Leaflet
and other map providers.

If you have any Google Maps you want promoting you have about 24 hours left to
submit them to Google Maps Mania."

----

The blog is at:
[http://googlemapsmania.blogspot.co.uk/](http://googlemapsmania.blogspot.co.uk/)

~~~
dfc
I am curious why you pasted the entire content of the linked page as a
comment. Is plus.google.com frequently blocked and/or unreachable?

~~~
silverbax88
Yes, Google+ is usually blocked on corporate networks. Which is really no big
loss.

~~~
thrownaway2424
I think you used the word "usually" to mean "in my narrow and by no means
representative experience."

~~~
silverbax88
I mean blocked on most major networks by Barracuda and WebSense by default.
So, I'll stick with my original statement, thanks.

~~~
Karunamon
I think they were referring more to the snarky "no big loss" comment.

------
fidotron
I seriously thought from the title "That must be because they've come to hate
new Google Maps so much" and was surprised that wasn't the underlying reason.

To be honest, I've always been dubious of any business that is familiar
enough with Google's search algorithm updates to refer to them by name. It's
indicative of being way too dependent on a single entity.

~~~
ssharp
It's not necessarily indicative of being "dependent" on Google organic
traffic. Even if organic search drives just 10% of your revenue, that 10% may
still be big enough to warrant plenty of attention on the organic search
channel, enough to justify tracking algorithm changes and versions.

I'll never disagree with the idea that you should diversify your marketing and
not become dependent on a single channel, especially a channel you cannot
control, but to ignore or neglect that channel when it actually matters is not
a better or noble solution.

So far, in my experience, the new Panda update is actually correcting for
some sites it probably shouldn't have penalized in previous Panda iterations.

------
uptown
OT, but does anyone else have wild zooming issues with the "new" Google Maps
on OS X with a scroll-wheel mouse? It borders on unusable, as the zoom seems
to have a mind of its own.

Edit: This seems to be the issue I'm experiencing:
[https://productforums.google.com/forum/#!topic/maps/6D072fsK...](https://productforums.google.com/forum/#!topic/maps/6D072fsK93E%5B176-200-false%5D)

~~~
LeoPanthera
I have that issue too. It seems to depend on the mouse. It doesn't happen with
my Logitech mouse as long as I set the wheel to "clicky" mode.

------
narrator
This is a preview of our AI dominated future. The AI will penalize you and you
will have no idea why and have no recourse or appeal. It's also great for
plausible deniability if it was indeed a "manual action".

~~~
chillingeffect
It's a hint at how future search engines will diversify. Eventually Google
will become more like the yellow pages, where you kind of already know where
everything is; you just need its database for the last mile.

Google will supply the commercial results, but not all of the world's
information - only its most profitable information. If you want to know how
to do something, or how something works, you'll use a different, non-Google
indexer for searching, say, all of the "how to" sites: e.g. Wikipedia, Stack
Exchange, about.com, etc.

~~~
vwinsyee
_If you want to know how to do something, or how something works, you'll use
a different, non-Google indexer for searching, say, all of the "how to"
sites: e.g. Wikipedia, Stack Exchange, about.com, etc._

I've gradually started to do this using DuckDuckGo's bangs [1]. It actually
works pretty well if I know exactly which site I want to search. I do miss
Google's ability to filter by time, though.

[1] [https://duckduckgo.com/bang.html](https://duckduckgo.com/bang.html)
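
For anyone curious, a couple of real examples of the syntax:

    !w panda algorithm       (searches Wikipedia directly)
    !so javascript closures  (searches Stack Overflow directly)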

~~~
chillingeffect
Hail there, fellow out-of-liner in sub-zero downvote land! I guess by not
toeing the party line about Google, we're getting downvoted.

Nevertheless, I wanted to take the time to thank you for the information
about DDG's bangs. I've been using DDG since Snowden, but didn't know about
bangs! They look very useful, and shorter than typing "site:" in Google. It
kind of reminds me of multireddits.

------
userbinator
As a (very) long-time user of the Internet, I've become increasingly
disgusted at what it has turned into, in particular the rise of the whole
"SEO phenomenon": basically a cat-and-mouse game between site owners and
search engines. Large corporations and SEO spammers have the resources to
play this game, while the smaller independent sites (often managed by a
single person) don't, and when they do get penalised, it's much harder for
them to solve the "problem". The often interesting, unique, and more personal
bits of the Internet are being made less accessible.

I think the reason it became this way is entirely due to search engines
encouraging this ranking game. Those who want their site to be at the top of
the SERPs optimise for that, instead of focusing on what makes their sites'
content valuable to actual users. The incentives are all wrong, from the
perspective of what the Internet should be. To fix this, I think search
engines should use a (periodically changing) _random_ ranking, instead of
trying to develop more and more complex algorithms that the SEOs will just
figure out how to beat. This would probably kill off SEO completely, as there
would no longer be anything about the search engine to optimise for -
everyone who wants visitors would instead focus on their content, so that
when their site appears near the top of the rankings, they can attract and
retain users who will ideally bookmark them and visit again.

~~~
zevyoura
A random ranking? Of all results? Are you kidding? The vast, vast majority of
results for a typical query are tangentially relevant at best.

~~~
userbinator
I don't mean all results that may or may not even have the search terms (which
is what Google "helpfully" tries to do), but all results that actually have
the given search terms. Hopefully this will mean users learn to use more
specific search queries to get what they want, and thus ask more specific
questions --- not a bad thing at all for the Internet and society in general
either.

The biggest advantage of a random ranking IMHO is that it completely
eliminates any incentive to game the ranking system, and gives the "indie"
portion of the Web a fairer chance at getting visitors. Let the users decide
themselves individually what's relevant to them, and not Uncle Google.
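
A rough sketch of what I'm proposing (illustrative only): keep every result
that actually contains all the terms, then order them with a seed that
rotates periodically, so there is never a stable ranking to optimise against:

    // Keep only results that contain every search term, then shuffle them
    // deterministically with a per-period seed (e.g. the week number), so
    // the order is stable within a period but rotates between periods.
    function randomRank(results: string[], terms: string[],
                        periodSeed: number): string[] {
      const matching = results.filter(r =>
        terms.every(t => r.toLowerCase().includes(t.toLowerCase())));
      // mulberry32: a tiny seeded PRNG, since Math.random() can't be seeded.
      let a = periodSeed | 0;
      const rand = (): number => {
        a = (a + 0x6d2b79f5) | 0;
        let t = Math.imul(a ^ (a >>> 15), 1 | a);
        t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
        return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
      };
      return matching
        .map(r => ({ r, key: rand() }))
        .sort((x, y) => x.key - y.key)
        .map(x => x.r);
    }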

------
dalek2point3
This site was due for a revamp anyway. Nowadays some of the most exciting
maps are created with OpenStreetMap -- and the blog was already featuring
many of them. The "Google Maps" name was not giving OpenStreetMap its due
credit.

------
rpedela
How is Google trying to kill off Google Maps Mania?

~~~
ChuckMcM
The scourge of low-quality content has Google putting more and more pest
control into its algorithm. That has started killing off the traffic to
actual quality sites.

~~~
wcfields
Most notably, MetaFilter[1] has taken a really hard hit; the site had been
funded by a $5 fee to join (post/comment) and Google AdWords on
AskMetaFilter[2]. Now they've had to lay off several moderators who
contributed to the site's high quality.

[1] [http://metafilter.com](http://metafilter.com) [2]
[http://ask.metafilter.com](http://ask.metafilter.com)

------
brokentone
While this makes a certain amount of sense (not promoting a tool from a
company that is not promoting your content as well in another arena), I'm
actually glad that Google works in a blanket manner rather than an individual
manner. Assessing "manual" promotions of domains could get really sticky
really quickly.

~~~
jrochkind1
What makes you confident that Google doesn't manually adjust weighting and
ranking for individual sites? I've always suspected they might -- both
increasing and decreasing rankings -- although they wouldn't want this to be
well-known, for legal and other reasons.

The legal reasons _might_ be enough to keep them from doing it at all, it's
true, but I've always suspected they do it anyway.

I mean, we know they do it to penalize certain sites they believe violated
their ad rules, right? So we know their systems at least possess the technical
ability to manually penalize sites.

But I'm not an expert in this stuff, which is why I start with a question (it
wasn't rhetorical!).

~~~
morgante
Google does engage in manual actions, but such actions are always reported in
Webmaster Tools.

------
EGreg
And this is why you shouldn't rely on Google for your traffic, but diversify
away from it asap. In fact, don't rely on any one company. Use the open web.

~~~
todd3834
Could you give some examples of alternative ways to get Google quality traffic
that fall under "the open web"?

~~~
TheMagicHorsey
Exactly. I want to be edified as well. I have not found another way to get
organic traffic.

~~~
EGreg
Develop content that people would share with one another, and find on other
sites.

Videos - develop videos that drive "traffic" to your website ... Vimeo,
YouTube, 5min, or your own site.

Blogs - have people in your company start writing blogs and sharing their
expertise on the web, and others will naturally want to check out your
products ... LinkedIn, Blogger, or your own site.

PR - have people cultivate relationships in forums and with publications, and
post the stuff that your bloggers write.

There are many ways to get traffic besides relying on one search engine.

And by the way, even ON that search engine, you should be creating all kinds
of things to "take over" the first page, if you are really trying to vie for
traffic.

e.g. on GOOGLE:

YouTube videos will show up if you make them.

Google+ profiles will show up if you make them.

Local listings, images, maps, bla bla bla.

Which means Google is telling you to upload a lot of content to GOOGLE-owned
properties in order to promote your site.

So if you really want Google traffic that badly, you should do it!

~~~
Vik1ng
[https://en.wikipedia.org/wiki/Organic_search](https://en.wikipedia.org/wiki/Organic_search)

So what you say might answer the first question, but not the follow-up one.
Well, your "e.g. on GOOGLE:" answers it, but the fact that you use Google for
that example already shows that there really isn't any way around it. He
wasn't asking how to improve ranking; he was asking for alternatives to
Google search traffic. (Unless I got that wrong.)

~~~
EGreg
But I just told you. Social. LinkedIn. Vimeo. YouTube. Other things. Sure,
you don't get the "instant win" of having a big player link to you. But you
are also not so dependent on ONLY that big player.

It's kind of like the difference between being discovered by a record label's
A&R department and going the indie route like Ben Haggerty.

~~~
TheMagicHorsey
But people won't discover my blog posts and videos without Google.

Or do you suppose spamming my Facebook feed promoting my work website is the
way to go?

~~~
EGreg
How do you go from "people won't discover my own site under Google's new
algorithm" to "no one on LinkedIn and other places will ever discover any
articles linking to my site without Google"?

~~~
TheMagicHorsey
We do post things on LinkedIn and Facebook; however, we get only a tiny
fraction of the traffic that we get from Google. In addition, the traffic
from Google seems to convert to paying customers at a higher rate--not sure
why yet.

------
lnanek2
I think it is healthy to cover all map providers anyway. I had a rude
awakening a while back when I wanted to do a native Android hack on Google
Glass using maps, but none of the usual MapActivity, etc. was available. So
OpenStreetMap was the way to go. Google Maps sometimes just isn't the
solution, so it is good to know many.

------
narrator
I monitor some keywords that relate to obscure topics of interest to me. If I
look at search hits from the last 24 hours, I get page after page of Markov
chain generated spam that looks like forums or blogs or community sites. It's
freakin' ridiculous. The spammers are using machine learning to fool Google's
machine learning.
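
For the curious, the technique is trivially simple. A minimal sketch of a
word-level Markov chain generator (my illustration, obviously not any actual
spammer's code):

    // Learn which words follow which in some scraped source text...
    function buildChain(text: string): Map<string, string[]> {
      const words = text.split(/\s+/);
      const chain = new Map<string, string[]>();
      for (let i = 0; i < words.length - 1; i++) {
        const followers = chain.get(words[i]) ?? [];
        followers.push(words[i + 1]);
        chain.set(words[i], followers);
      }
      return chain;
    }

    // ...then walk the chain to emit plausible-looking gibberish.
    function generate(chain: Map<string, string[]>,
                      start: string, maxWords: number): string {
      const out = [start];
      let current = start;
      for (let i = 0; i < maxWords; i++) {
        const followers = chain.get(current);
        if (!followers) break;
        current = followers[Math.floor(Math.random() * followers.length)];
        out.push(current);
      }
      return out.join(" ");
    }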

------
scottcanoni
The only serious traffic I got to my website RoadPetition.com came from when
it was featured on Google Maps Mania.

Does anyone here have any feedback for me on this Google Maps Mashup?

[http://www.roadpetition.com/](http://www.roadpetition.com/)

~~~
bamnet
You should probably migrate to the V3 API.
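
Something like this is all it takes to get started (a minimal sketch; it
assumes the page already loads the v3 script and has a div with id="map"):

    // Minimal Google Maps JavaScript API v3 initialization.
    function initMap(): void {
      new google.maps.Map(document.getElementById("map") as HTMLElement, {
        center: { lat: 51.507, lng: -0.128 }, // arbitrary example location
        zoom: 12,
      });
    }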

------
garrickvanburen
To top it all off, Google Maps Mania runs on Blogspot (a Google property).

~~~
Shog9
As does an awful lot of spam.

------
icantthinkofone
I've owned my own web development company for 10 years. In all this time,
I've never had a problem with any of my clients' sites scoring well in Google
search. They have never lost their rank, never been de-listed, never
"disappeared", never felt Google was out to get them or that Google was
trying to kill them off.

Google has always been helpful, given us free tools, told us what to do and
how to do it, what is good and what is bad and how to do good by them.

Then you run into people like this guy.

I must be doing something wrong.

------
themodelplumber
I'm sure the Maps team will be excited to hear about this. On the other hand I
wonder what kind of sites this website is competing with. I don't think the
site is really best suited to the blog format; maybe it'd be a good idea to
rethink the presentation and throw some of the latest SEO strategy in there as
well. You can't just make a blog these days and do ordinary blog things and
expect to collect the same SERP pension every year.

