That feels like a step back from Google's stated mission "to organize the world's information and make it universally accessible and useful":
Apparently, only the information that maximizes search advertising revenue is worth organizing and making accessible. I understand that fighting SEO spam is an important part of keeping search usable, but there must be a way to do it that doesn't lead to completely cutting out large parts of the "indie" web (e.g. posts in discussion forums, and other sites that host user-generated content).
I don't even understand the theory of how there is a profit motive here.
How does Google make more money by asking webmasters to hide parts of their site from the indexer?
Doesn't Occam's Razor say that a more likely explanation is that innocent sites are being caught in the never-ending arms race between spammers and Google?
If Google didn't care about the "indie" web, why would Google employees be taking their time to coach indie website operators on how to improve their site's rankings?
I know this is a frustrating situation for everyone involved. But the cynicism that says there is some ulterior motive cuts deep, especially coming from someone I respect like you, Matt.
I would imagine you actually do see the profit motive for that activity, and yet you rejected the statement above by saying, "I don't even understand the theory of how there is a profit motive here. How does Google make more money by asking webmasters to hide parts of their site from the indexer?"
Could you explain how these things connect in your view?
There are sometimes quite a lot of ads on the search results page these days, but they are still clearly distinguished, and I haven't heard anything about that policy changing.
The conflict is between "the world's information" (mission) and "the world's information, minus some troublesome long-tail bits that few people care about and aren't profitable anyway" (business motive).
> why would Google employees be taking their time to coach indie website operators on how to improve their site's rankings
This "coaching" is actually Google pushing the spam problem onto content authors, promoting "solutions" that benefit Google while still hurting legit publishers. "Spam has gotten so bad that our algorithms can't classify it without misclassifying you. You can either hide some of your content as proof that it's not link spam, or have us hide even more on the presumption that it is." Yes, this gives Google a powerful tool for weeding out fake/malicious content, but it also hurts the user experience and the mission in the long run, compared to solutions that allow for more manual review, more use of inbound link disavowal, or finer-grained penalties that target only the possibly "low-quality" pages without spilling over to entire sites. Those solutions would probably shift more cost to Google, however.
What's really bad is that some of Google's recent recommendations (e.g. "use the noindex robots meta tag") will remove real knowledge not just from Google but from other present and future search engines (even if those engines wouldn't have the same problems Google did with that content). And even if the algorithm's limitations/errors are eventually resolved, the recommendations will live on much longer in legacy content (and in SEO folklore), meaning that some content will be erased from search permanently for no reason. These are long-term costs that are borne not by Google but by users, publishers, and competing search engines.
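For context, the "noindex" recommendation mentioned above is an HTML meta directive placed in a page's head (or sent as the equivalent X-Robots-Tag HTTP header). A minimal illustration, to show why the complaint holds: the common form applies to every compliant crawler, not just Google's, although a Googlebot-specific variant does exist.

```html
<!-- In the page's <head>: asks ALL compliant crawlers to drop this page -->
<meta name="robots" content="noindex">

<!-- Scoped variant: excludes only Google's crawler, leaving other
     search engines free to index the page -->
<meta name="googlebot" content="noindex">
```

Most advice (and most SEO folklore) recommends the blanket form, which is what erases the content from present and future search engines alike.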
Personally, I want to be able to search ALL THE THINGS even if it means that I sometimes have to wade through more crap, and that it's harder work for the search engine to weed out people who are gaming the system. Maybe Google can't offer me that without degrading its more mainstream users. Even if that's the case, it still feels like a departure from principles (or maybe a declaration of defeat) for Google to be making that choice. Even more so when it hurts the ability of other search engines to step in and index the stuff that Google doesn't want.
The reason I want to see Google hew strictly to "the mission" is that it offers a way to let these big-picture issues override things that are more immediately measurable. I know I became an extensive user of Google services mainly because of the way its mission-oriented thinking was visible in its product design choices. That intangible feeling was more important than any particular feature. Keeping it alive requires the ability to say, "Screw it, we're not going to go this route no matter its obvious benefits, simply because it's not in line with the mission." But I also know that it's not possible to do that in every decision; even here in the non-profit, mission-driven Mozilla project, principles alone can't win every fight. (See http://robert.ocallahan.org/2014/05/unnecessary-dichotomy.ht... for an interesting take on this.) And these things tend to look different from the inside; maybe Panda was a case where de-indexing legit content was the least-bad path available. But from the outside, every time "pragmatism" wins over "principle" is a little chink in the armor. And for Google, the outside view is the only one I have.
Agreed 100%. I often start looking at the search results beyond page 10, since that skips past a lot of the "sites that Google's algorithm thinks are relevant, but are actually useless". And thanks to their ranking algorithms, the linkfarms and spammy sites tend to cluster together so it's easy to skip past them too (or just add -site:spammydomain.com to the query and try again).
Yes, yes, yes, yes, this please! Google used to still have this in the first half of the 2000s or so and it was glorious.
Truly having the world's information at your fingertips. Not just Wikipedia and some high ranking sites, but as your queries got more and more specific, you'd still get results from increasingly strange and dusty corners of the web. Yes that meant you had to evaluate the value and trustworthiness of results yourself (but really, you still do), and it was so very much worth it.
Obviously storage and networking costs would make it impossible today, but it might be feasible in the not-so-distant future. Either way, sooner or later people will realize the absurdity, from a public-interest point of view, of having "all the world's information" in the hands of a single corporation whose sole legal purpose is to maximize its owners' profit, as happened with other media in the past, like broadcast news.
Ha! HN loves GOOG. Try frequenting this site as an Apple employee...
Maybe they get paid by the competition. After all, removing competition with clever TOS tricks is right up Google's alley:
Looks like Google saw what bookface was doing and copied it. Remove from existence and charge for exposure.
"make sure that all of your indexed pages are of the highest quality possible and that they are fantastic representatives of your website"
JohnMu then gave some examples of very thin pages. It is about a site with 99% of such pages and problems with spammy UGC. Not about maximizing search advertising revenue. It is about quality like described in: http://googlewebmastercentral.blogspot.com/2011/05/more-guid...
I'd like it if there were a traffic graph (is an algorithmic penalty visible, are there trends, or are they just down to 10% of 1.5 years ago?). We have to go by his word and our own research for now. Googling "Google Maps Mania" shows a botched domain transfer as a likely culprit for the low rankings/visitors. The redirect is not a 301. Old pages do not redirect to content on the new domain ( http://www.mapsmaniac.com/2013/02/beneath-thunderdome.html ). So any links to mapsmaniac from before the domain transfer do not contribute to the new blog at googlemapsmania.blogspot.com . Domain authority is zeroed out. Many well-ranking pages now show a "Page not Found" to Google. Then another site pops up on another domain with the same name. How can Google know for sure the two are from the same owner if the redirects are not set up right?
Fix the redirect. Restore the old links and redirect them to the new site. But maybe much of that is already too late for them. For anyone else in the process of moving a site, see: http://www.mattcutts.com/blog/moving-to-a-new-web-host/
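The fix described above amounts to per-URL permanent redirects from the old domain. As a rough sketch, assuming the old domain is served by Apache with mod_rewrite available (the exact path mapping depends on how the post URLs changed between the two blogs):

```apache
# .htaccess on the old domain (mapsmaniac.com): send every old URL to
# the same path on the new domain with a 301 (permanent) status, so
# crawlers transfer link equity instead of seeing "Page not Found".
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?mapsmaniac\.com$ [NC]
RewriteRule ^(.*)$ http://googlemapsmania.blogspot.com/$1 [R=301,L]
```

The key detail is the R=301: a temporary (302) redirect, or no redirect at all, is what leads search engines to treat the new domain as an unrelated site.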
We need to stop pretending Google is a rich Mozilla; they're a poor Microsoft.
That's why Twitter, which is a tiny company with no profits and probably not much more growth potential, has a higher market cap than some insurance companies and retail chains with good growth, real money and solid long-term market potential. "Feelings".
"What about Facebook? What about Twitter? I have no idea what Twitter is good for. But if it flips out every tyrant in the Middle East, I'm interested."
I guess it's just pretty hard to allocate resources in the traditional sense that most people are used to, because of the leverage that companies like Google can provide, but people try anyway.
All in all, I think Google is just trying to increase confidence from advertisers in its content network.
Which is hardly to say that you can unload the entire company at that price, or that it'll last, but it's a step above mere 'feelings'. :b
Meanwhile Google has done terrible things, including ripping off bandwidth providers, not to mention that they hold a monopoly on search. Conspiracy theorists seem unconcerned about one organization being responsible for answering all their web queries. I could go on about outsourcing of jobs, H-1B visas, causing a recession, etc.
Not that any of these companies are 'evil' or even 'wrong' in my eyes, but why do others treat tech companies with such endearment and throw so much hate at a handful of other companies?
We're thinking geography has a lot to do with it. Many influencers on the east and west coasts know a lot of people who work in these companies, and write very positive things about them. It's not completely altruistic, as they are also heavily invested in them.
So when Google decided to crack down on a class of sites, sites that share enough traits with the ones being axed came into the danger zone, and a number of those ended up being penalized enough that they are no longer viable or interesting enough to maintain.
This appears to be one of those cases.
Being dependent on a single entity for your hosting, for the traffic you are receiving, and for the service you are directly promoting (in this case maps) makes you an add-on to their ecosystem.
I wonder how many small but viable websites are still in this position and how many of them will give up after finding their traffic decimated because they ended up being a false positive in some filter.
It might be more productive to try to figure out what exactly caused the penalty, but I fully understand if the owner of the site just wants to move on and become as independent of this as he can.
Let's hope that 'pointing to Google Maps' versus 'pointing to Open Street Map' won't trigger another penalty, because then he might lose what is still there, or a good-sized fraction of it.
Often when you search for a "how do I" question, the content farms are the only things that produce a result that's even vaguely topical, at least before Panda.
I mean, it usually went content farms -> Pure SEO spam -> actual relevant results. Content farms weren't good, but they still beat the SEO spammers. At least the content farms had some minimal connection to the topic, unlike say eBay.
Everything I ever saw on a content farm looked like it was written by somebody who knew nothing about the topic, but did a quick Google search and extracted some semi-random bullet points from the first few things they saw. Basically, I could get the same value by clicking on the first non-content-farm link and then skimming.
If only there were a way to search multiple content farm sites at once, and rank those results according to how useful they were...
If you don't like it, you can search those sites directly. For example search for "ehow:how to ask a girl out"
was that so hard?
The point was that they've since been penalized much more aggressively, to the point where many of these sites no longer show up at all.
> If you don't like it, you can search those sites directly. For example search for "ehow:how to ask a girl out"
> was that so hard?
That defeats the primary purpose of using a general search engine - namely to get ranked results across a large number of sites.
I wish there was a way in Google to indicate you'd like to see results from a site ranked higher FOR YOU. I imagine they already learn that when you click on that domain in two separate searches.
That was a new phrase for me. It seems to be chiefly British, but the Wikipedia page suggests that the American alternative is "turkey voting for thanksgiving." I searched Google Ngrams for "turkey[s] voting for christmas|thanksgiving" and it seems that the British variant is a relatively recent phenomenon. Ngrams has no results for the American variants. Is this a common phrase in the UK?
Google NGrams result: https://books.google.com/ngrams/graph?content=turkey+voting+...
"We can truly say that once the Leader of the Opposition discovered what the Liberals and the SNP would do, she found the courage of their convictions. So, tonight, the Conservative Party, which wants the Act repealed and opposes even devolution, will march through the Lobby with the SNP, which wants independence for Scotland, and with the Liberals, who want to keep the Act. What a massive display of unsullied principle! The minority parties have walked into a trap. If they win, there will be a general election. I am told that the current joke going around the House is that it is the first time in recorded history that turkeys have been known to vote for an early Christmas."
"Tomorrow I'm going to feature the very last Google Maps on Google Maps Mania,
The blog now gets 10% of the Google search traffic it did just 18 months ago. With Google attempting to kill off Google Maps Mania it would be like a turkey voting for Christmas for me to continue to promote Google Maps and the Google Maps API.
Last week I came very close to giving up completely. But despite Google I still think there is an audience for the blog. So from Wednesday Google Maps Mania will be featuring maps created with Open Street Map, Map Box, Leaflet and other map providers.
If you have any Google Maps you want promoting you have about 24 hours left to submit them to Google Maps Mania."
The blog is at: http://googlemapsmania.blogspot.co.uk/
To be honest, I've always been dubious of any business that is familiar enough with Google's search updates to call them by name. It's indicative of being way too dependent on a single entity.
I'll never disagree with the idea that you should diversify your marketing and not become dependent on a single channel, especially a channel you cannot control, but to ignore or neglect that channel when it actually matters is not a better or noble solution.
So far, my experience with the new Panda update is that it's actually correcting some sites it probably shouldn't have penalized in previous Panda iterations.
Edit: This seems to be the issue I'm experiencing: https://productforums.google.com/forum/#!topic/maps/6D072fsK...
Google will supply the commercial results, but not all of the world's information - only its most profitable information. If you want to know how to do something, or how something works, you'll use a different, non-Google indexer for searching, say, all of the "how to" sites: e.g. Wikipedia, Stack Exchange, About.com, etc.
I've gradually started to do this using DuckDuckGo's bangs. It actually works pretty well if I know exactly which site I want to search. I do miss Google's ability to filter by time, though.
Nevertheless, I wanted to take the time to thank you for the information about DDG's bangs. I've been using DDG since Snowden, but didn't know about bangs! Looks very useful, and shorter than typing "site:" in Google. It kind of reminds me of multireddits.
I think the reason it became this way is entirely due to search engines encouraging this ranking game. Those who want their site at the top of the SERPs optimise for that, instead of focusing on what makes their sites' content valuable to actual users. The incentives are all wrong, from the perspective of what the Internet should be. To fix this, I think search engines should stop trying to develop ever more complex algorithms that the SEOs will just figure out how to beat, and instead use a (periodically changing) random ranking. This would probably kill off SEO completely, as there would no longer be anything about the search engine to optimise for. Everyone who wants visitors would instead focus on their content, so that on those occasions when their site appears near the top of the rankings, they can attract and retain users who will bookmark them and visit again.
It's essentially a replay of what killed Usenet. And the "SEO experts" are every bit the vile blight on the web as spammers were on Usenet.
And if you knew how to search, it was a nuisance, but could be dealt with.
The (relatively) "new" part of it is Google throwing out the babies with the bathwater.
The biggest advantage of a random ranking IMHO is that it completely eliminates any incentive to game the ranking system, and gives the "indie" portion of the Web a fairer chance at getting visitors. Let the users decide themselves individually what's relevant to them, and not Uncle Google.
The legal reasons _might_ be enough to keep them from doing it at all, it's true, but I've always suspected they do it anyway.
I mean, we know they do it to penalize certain sites they believe violated their ad rules, right? So we know their systems at least possess the technical ability to manually penalize sites.
But I'm not an expert in this stuff, which is why I start with a question (it wasn't rhetorical!).
Videos - develop videos that drive "traffic" to your website ... Vimeo YouTube 5min or your own site
Blogs - have people in your company start writing blogs and sharing their expertise on the web, and others will naturally want to check out your products ... LinkedIn Blogger or your own site
PR - have people cultivate relationships in forums and with publications and post stuff that your bloggers write
there are many ways to get traffic besides relying on one search engine
and by the way, even ON that search engine, you should be creating all kinds of things to "take over" the first page, if you are really trying to vie for traffic
e.g. on GOOGLE:
youtube videos will show up if you make them
google+ profiles will show up if you make them
local listings, images, maps, bla bla bla
which means Google is telling you to upload a lot of content to GOOGLE owned properties in order to promote your site
so if you really want google traffic that badly you should do it!
So what you say might answer the first question, but not the follow-up one. Well, your "e.g. on GOOGLE:" answers it, but the fact that you use Google for that example already shows that there really isn't any way around it. He wasn't asking how to improve ranking; he was asking for alternatives to Google search traffic. (Unless I got that wrong.)
It's kind of like the difference between being discovered by an A&R department of a record label or going the indie route like Ben Haggerty.
Or do you suppose spamming my Facebook feed promoting my work website is the way to go?
Does anyone here have any feedback for me on this Google Maps Mashup?
Google has always been helpful, given us free tools, told us what to do and how to do it, what is good and what is bad and how to do good by them.
Then you run into people like this guy.
I must be doing something wrong.