
On MetaFilter Being Penalized By Google: An Explainer - _pius
http://searchengineland.com/metafilter-penalized-google-192168
======
pessimizer
This is the problem with using associative reasoning to detect bad actors. Bad
actors are attempting to look like good actors, but good actors aren't trying
to avoid looking like bad actors. Bad actors will target a stereotyped set of
good actor attributes, and your filters will mostly catch unique or
idiosyncratic good actors, who aren't targeting anything.

Unless your good actors are willing to conform to a published set of (nerfed)
behaviors that don't have the possibility of being bad, or are willing to
register and be vetted by you individually, you can't help but be overwhelmed
by false positives. It's the same reason why the most intricate, pervasive,
and technologically flexible surveillance system in history can't find a
terrorist.
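
The base-rate arithmetic behind this point is worth making concrete. A rough sketch (all numbers invented for illustration, not drawn from anything Google has published): even a filter that is right 99% of the time drowns in false positives when bad actors are rare.

```python
# Toy illustration of the base-rate problem: when bad actors are rare,
# even an accurate filter flags mostly good actors.

def flagged_breakdown(population, bad_rate, true_positive_rate, false_positive_rate):
    """Return (bad actors flagged, good actors flagged) for a simple filter."""
    bad = population * bad_rate
    good = population - bad
    return bad * true_positive_rate, good * false_positive_rate

bad_flagged, good_flagged = flagged_breakdown(
    population=1_000_000,     # sites scored by the filter
    bad_rate=0.001,           # assume 0.1% are actually bad actors
    true_positive_rate=0.99,  # filter catches 99% of bad actors
    false_positive_rate=0.01, # and wrongly flags 1% of good actors
)

print(f"bad actors caught:   {bad_flagged:,.0f}")    # 990
print(f"good actors flagged: {good_flagged:,.0f}")   # 9,990
```

Under these assumptions, over 90% of flagged sites are innocent, which is the "overwhelmed by false positives" outcome described above.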

edit: I think the entire endeavor is doomed. The bubble associating your
current searches with your past searches, and attempts to eliminate spam and
eHow through algorithms have just resulted in eliminating most sites from the
searchable internet. You don't realize how bad it's gotten across all of the
search engines until you spend an hour on something like Million Short (which
looks like it's down now).

~~~
lmm
Reminds me of (forum) mafia: there's a set of heuristics for how "town"
players behave. But real town players will break those heuristics
occasionally, because they're trying to catch the mafia - so the players who
conform exactly to the "town" heuristics all the time are actually more likely
to be mafia players.

~~~
hagbardgroup
What it's resulting in is a 'news-stand' effect in which articles from the
mainstream press will rank higher than anyone else on a given query if the
match is close enough. Although I'm not a huge fan of the press as it is
today, it could be worse.

You can actually identify opportunities by finding valuable queries that are
being squatted on by the content mills. Lots of nice things in stuff like home
improvement, insurance, and finance.

------
pradocchia
I'm glad I experienced the old internet back in its heyday, when high quality
sites linked to other high quality sites, and Google exposed this natural
topology for all to explore. Google's success eventually led to the end of the
old internet: AdSense captured its value and then SEO perverted it with noise.

I expect within 5 years some crotchety old programmer will have built a search
engine that penalizes sites for anything more than basic markup. No JS, no
CSS, no hints of a CMS. Maybe it's already been done. "Old Skool Search"

And it will still be a poor imitation, because if there is one thing you can't
recreate and experience for yourself, it is an internet gone by.

~~~
ForHackernews
This is kind of how I feel as well.

Google figured out how to monetize links. Good for them. Now they get to deal
with the consequences of monetized links.

Live by the sword, die by the sword.

~~~
dwd
Ahh, the Eternal September...

I sincerely believe we need two Internets, or we need two Googles.

We need what started with ARPANET, but this time never let it commercialise.
PageRank works for that network.

We also need the commercial network that works like a better Yellow Pages but
actually gives valuable local focused information on businesses and services
that can't be gamed and has different rules.

Trying to apply the same ranking rules to both types of searches pollutes the
academic network and destroys every local small business unlucky enough not to
make it to the magic first page - because nothing else counts and the winner
takes all.

Google has taken on a lot of responsibility dictating what they think we are
looking for and they are failing us (and especially small business) big time.
Maybe they are thinking about this? Maybe Google Local will factor more than
PageRank for commercial searches. Maybe they will change the layout so more than 10
lucky links get to page one and you never get to have multiple listings for
the same company. But right now it is bad and only getting worse.

------
stopcodon
MetaFilter at least has the advantage of being old and respected enough in
tech circles to get the attention of Hacker News and Matt Cutts himself when
something like this happens. For the vast majority of legitimate websites who
see their traffic drop or become non-existent when google pushes an update to
their ranking algorithms, there is virtually no recourse. I understand it's a
necessary evil and there will always be collateral damage when trying to
combat spammy websites, but at the same time it's scary to know that there are
people whose sole source of income is web traffic and overnight it can
disappear without explanation.

------
morbius
It's amazing how crappy Google results are nowadays. It's all Buzzfeed,
Crackle, Mashable, Y!A, WikiHow, eHow, and other garbage results akin to those
garbage sites. True information like you'd find on MeFi or some of the more
interesting subreddits is now completely buried. There are no other
alternatives, either. DuckDuckGo pulls info from less popular sites, and you
can find real gems with the right search parameters and hashbangs, but the
relevance is not nearly as good as Google's.

I hate to sound like a conspiracy theorist, but I think since we started
getting blog/aggregator/quizlet-type sites on the web back in 2009-2011, the
quality of search results has gone far, far down the river.

~~~
dragonwriter
> It's amazing how crappy Google results are nowadays. It's all Buzzfeed,
> Crackle, Y!A, WikiHow, eHow, and other garbage results akin to those garbage
> sites.

I think this is just a sign of the fact that the web has come of age and
everyone is on it -- the things that people are _actually_ most likely to be
looking for no longer correspond very well to the things people who were
actively and heavily using the web in, say, 1998, were looking for.

~~~
maxerickson
That's no doubt some of it, but there are sites that are pretty much worthless
that still rank high in the search results.

Not worthless in the sense that I am making a statement about the type of
content on them, worthless because there isn't any useful content.

------
jere
>At last, it’s time to dig into what happened with MetaFilter. The short story
is that I don’t know.

~~~
frozen_tomato
The real kicker is at the start

>If you just can’t wait to get to the specifics of what happened to
MetaFilter, scroll further down in this story now.

I'm pretty glad I didn't read all that between those two points just to be
told that he doesn't know.

~~~
arikrak
I didn't, wish there had been a warning or tl;dr.

------
jljljl
Man, the use of Vine in this article is awful. It made the point he was trying
to make so much more confusing vs. just posting the diagrams.

~~~
thathonkey
The whole article is way too long. He makes the same points over and over
again... and then he doesn't even know what happened to Metafilter.

~~~
prawn
Sounds like almost every SEO blog I've ever read.

------
dminor
This is what's so frustrating about Google. I don't want to spend time poking
your black box to try to divine why my company's site has dropped in rankings.
Especially when that means major architectural changes with no guarantee of
any sort of payoff.

The small online retailer I work for experienced a 1/3 drop in Google traffic
a year and a half ago. No discernible reason. Thin content? Maybe - we do have
a lot of filters to make it easier to browse products.

Of course, these are things we built for our customers, not Google. I'm not
going to spend a month or two rearchitecting them just to see if Google likes
it.

------
prawn
From Haughey's Medium post
([https://medium.com/technology-musings/941d15ec96f0](https://medium.com/technology-musings/941d15ec96f0))
about it all:

"Advice online was to scale back the "top heavy" ads... I removed most ads to
the absolute minimum of just 1 or 2 per page..."

Also mentions first running ads that blended with the content to then running
ads that stood out from the content.

Funny thing is that if you are an AdSense publisher, you get emails from
Google all the time telling you what you should do. If you aren't using your
full allocation of 3 ads and 2 LUs per page, they will often email to remind
you to run _more_ ads. And they'll encourage you to run ads in prominent
positions towards the top of the page. And to run ever-larger ads. And then
change from text-only ads to image ads.

I have had AdSense-monetised sites for around 10 years and have had seemingly
automated emails as well as personal emails from account managers about these
things.

~~~
prawn
Funnily enough, I got an email from Google this morning:

"You could run 27 more ads on 15 pages of your website."

Apparently I should try to make more money by using the full ad complement on
all pages... The 15 pages it's probably talking about are things like the
contact form, about page, etc.

------
lauradhamilton
Perhaps it's a "thin content" penalty.

Looking at the site, the posts have low wordcount. No images. Lots of links.

That's kind of consistent with the other trends we're seeing from Panda 4.0.

Thoughts?

~~~
kenko
That's hilarious. Metafilter's content is great, especially AskMe.

The idea that pictures would improve things _for metafilter_ is bizarre; it's
a discussion site and the discussion is really deep on both MeFi proper and
AskMe, which is where a lot of the search traffic went (very deep archives).

It's possible that people wouldn't "bounce" as much if there were lots of
images, but it's not _obvious_ that getting people to stay that way is a good
thing. Google's algorithms shape the web, they don't just measure it, and I'd
rather see more link-heavy, image-sparse pages.

~~~
adventured
lauradhamilton said Metafilter's content was thin, not that it was bad or
wasn't great.

If you answer everything in two sentences, Google is still likely to punish
you for it even if those are ideal answers. They punish thin content even if
it's supposedly great.

~~~
kenko
... which is absurd. Google is basically admitting, as far as I can tell, that
their algorithms have been gamed to uselessness.

~~~
adventured
I actually view the thin / thick content issue as strong proof that Google's
search engine is incredibly dumb.

They have to constantly write edge-case algorithms because their search system
as a whole is of very low 'intelligence.' They're whittling their way into a
corner in the process, because they don't know any other way to deal with
problems than to further narrow what's defined as good with another specialty
algorithm that someone inside Google hacks together to buy another day against
spam winning.

The situation is: Google can't tell what's a good answer and what's not based
on the content. They have to try to figure out what's good or bad based on
every other measurement except the actual content. So if the ideal answer is a
mere 47 words long, Google is too stupid to know the difference and understand
there are many instances where short & concise is better than seven pages of
verbosity. In the not so distant future, this is going to be laughable.

My opinion might be scoffed at today, but I think Google's search platform is
little more than state of the art junk that scales well. It's the best junk we
have right now, and that's not saying much.

Killing Google search should be on PG's list of hard things that someone
should be tackling. They're a dinosaur.

------
jrochkind1
Let's just assume for the sake of argument that everyone agrees on who the
good guys are and who the bad guys are.

There's still no way for an algorithm to correctly exclude all the bad guys
while excluding none of the good guys, and trying to improve in one area often
makes the other worse. (That's before even getting to the fact that it's a
dynamic system where the bad guys are constantly adapting to avoid exclusion.)

> Often, there is an inverse relationship between precision and recall, where
> it is possible to increase one at the cost of reducing the other. Brain
> surgery provides an obvious example of the tradeoff. Consider a brain
> surgeon tasked with removing a cancerous tumor from a patient’s brain. The
> surgeon needs to remove all of the tumor cells since any remaining cancer
> cells will regenerate the tumor. Conversely, the surgeon must not remove
> healthy brain cells since that would leave the patient with impaired brain
> function. The surgeon may be more liberal in the area of the brain she
> removes to ensure she has extracted all the cancer cells. This decision
> increases recall but reduces precision. On the other hand, the surgeon may
> be more conservative in the brain she removes to ensure she extracts only
> cancer cells. This decision increases precision but reduces recall. That is
> to say, greater recall increases the chances of removing healthy cells
> (negative outcome) and increases the chances of removing all cancer cells
> (positive outcome). Greater precision decreases the chances of removing
> healthy cells (positive outcome) but also decreases the chances of removing
> all cancer cells (negative outcome).

[http://en.wikipedia.org/wiki/Precision_and_recall](http://en.wikipedia.org/wiki/Precision_and_recall)
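
The trade-off in the quoted passage can be shown numerically. A minimal sketch (the counts are invented: imagine a spam filter scoring sites, where `true_pos` is spam correctly flagged, `false_pos` is good sites wrongly flagged, and `false_neg` is spam missed):

```python
# Standard precision/recall definitions applied to two hypothetical filters.

def precision_recall(true_pos, false_pos, false_neg):
    precision = true_pos / (true_pos + false_pos)  # of flagged, how many were spam
    recall = true_pos / (true_pos + false_neg)     # of spam, how much was flagged
    return precision, recall

# Aggressive filter: catches nearly all spam, but flags many good sites.
p, r = precision_recall(true_pos=95, false_pos=200, false_neg=5)
print(f"aggressive:   precision={p:.2f} recall={r:.2f}")  # precision=0.32 recall=0.95

# Conservative filter: rarely flags good sites, but misses much spam.
p, r = precision_recall(true_pos=40, false_pos=2, false_neg=60)
print(f"conservative: precision={p:.2f} recall={r:.2f}")  # precision=0.95 recall=0.40
```

Same underlying classifier problem, two tuning choices: neither gets both numbers high, which is the surgeon's dilemma in the quote.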

------
harrystone
Obviously, what happened to metafilter isn't right.

But it should also be obvious why Google doesn't say what needs to be done to
remove a penalty. The sites that _should_ receive a penalty will just use that
information to further game the system.

~~~
adventured
How is it obvious that what happened to Metafilter isn't right? I'm inclined
to be sympathetic when any non-spammy site is hit by a Google penalty, but
what's the basis used for determining that Metafilter is deserving of a
particularly high ranking above where it's at right now?

The only thing I've seen are a few opinions stating that Metafilter is great.
That doesn't make it so to the majority of web users.

~~~
mbrubeck
It's not just that MetaFilter's own ranking has dropped. If that were the
case, then you might argue that it was just the algorithm doing its job.

But Google has clearly miscategorized MetaFilter as a content farm or linkspam
site, and is actively _telling other sites that it is penalizing them_ for
outbound links from MetaFilter that it has categorized as spam, even in cases
where those links are clearly not spam.

In this case, the algorithm is _not_ doing its job.

~~~
adventured
Excellent point. I didn't put together the link designation with the
likelihood that Google had put Metafilter into a spam penalty box. I simply
hadn't considered that a possibility given Metafilter's reputation.

Every time I read about this Metafilter situation, I cringe in thinking about
how horrible Answers.com is with the abusive tactics they employ now when it
comes to displaying content / answers. And yet they remain non-penalized;
historically they're one of AdSense's biggest publishers, always found that
interesting.

------
sandycheeks
We had a site get hit hard by Panda 4.0 but we are not surprised. About a 35%
drop overnight May 20th.

An observation that we have made over the years is that significant changes to
Google's algorithms always seem to soften in the subsequent months for us and
we return to a high level in the Google SERPs.

The site in question is a site where we curate free crafts projects and
patterns which my spouse and I began in the 90's with our family and friends.
The idea was to gather together excellent crafts projects on little known
mostly small sites with the criteria that they are free, complete, usually
require no email/login and are within two clicks of us. We still update it
every week.

Over the years we have used user feedback to make design decisions. For
example, when Pinterest became popular we got a lot of feedback to use masonry
instead of tables for our images/links and lately we've been moving to make it
fully responsive because we get a lot of feedback from tablet users.

The only time we were manually penalized by Google involved an issue our users
had complained about, so they were right and we were wrong. We fixed it.

Our site looks thin to an algorithm and we almost always get hit by large
algorithm changes but over the subsequent months, the site always moves back
up and ranks very well. We can only imagine that this means the algorithm is
somehow tempered by our visitors' behavior (we use adsense and analytics so
they see all) and is not simply a switch that is thrown and left on.

I have to wonder how much MetaFilter has done to gather user feedback. The
answer to their problem may be there.

Disclosure: The site described is AllCrafts

------
dchuk
Anyone with any experience in the SEO world knows that Danny Sullivan doesn't
have a fucking clue anymore. He's no different than a talking head reporting
on the daily ups and downs of the stock market on your local news.

~~~
arthur_debert
Anyone with any experience in the SEO world knows that dchuck doesn't have a
fucking clue anymore. He's no different than a talking head reporting on the
daily ups and downs of the stock market on your local news.

Ad hominem. If instead of saying "X is an idiot, anyone can tell," you said
what's wrong with his argument, you wouldn't look like a talking head.

~~~
dchuk
If you were active in the SEO world, you'd know what I said wasn't an Ad
Hominem but instead an accurate description of his abilities.

Danny Sullivan does not actively "do SEO" anymore, a field that changes
monthly. He reports on news that other people discover about happenings of the
search engine industry.

Hence why he concluded that he didn't know what happened to MetaFilter.
Because he has no fucking clue how modern SEO works anymore.

And it's dchuk.

~~~
arthur_debert
Thanks, that's a start. And of course, sorry for the misspelling.

In all seriousness, it would be more helpful if you'd mention what's wrong
with the original article.

Since you seem to be knowledgeable and active on the SEO front, would you mind
explaining what happened to MetaFilter?

~~~
dwd
The problem is SEO in general.

We have omnipresent and omniscient Google (or so they think) with high priest
Matt Cutts bringing the tablets down from the mountaintop.

The SEO crew are on the ground busy reading chicken entrails and tea leaves
and coming up with the best guesses they can but they really don't know.

------
onewaystreet
MetaFilter's real problem is that on a web that is currently built around
sharing, its audience shuns social media. I've never seen a MetaFilter link in
any of my feeds. I've never seen a story on one of the major blogs use
MetaFilter as a source (except for this Google story).

~~~
adventured
They appear to be making numerous basic SEO mistakes.

For example their links don't have nofollow on them, when sites such as Stack
Overflow (Rap Genius, Quora, etc) do.

That one change alone would likely boost their rankings significantly.

~~~
pygy_
That's completely backwards, though.

The PageRank algorithm depends on the existence of legitimate links, and
MetaFilter has a high signal-to-noise ratio.

By inciting people to "nofollow" valid links to improve their own ranking,
Google is damaging its index.

~~~
adventured
I agree it's backwards / ridiculous, but that is apparently how Google prefers
links to be treated now. It clearly breaks the notion of goodwill value
sharing between sites. It seems these days all the best ranking sites are
careful to nofollow across the board.

I used to see claims that Google gave sites a modest benefit for sharing
pagerank via links to other high quality sites or similar sites. That doesn't
seem to hold up under scrutiny when you look at the sites that have done well
by optimizing SEO.

------
blauwbilgorgel
MetaFilter should definitely nofollow their external links in the comments.
That makes sure that spammy/unnatural links do not count as a vote (which
turns MetaFilter into a bad neighborhood as far as search engines are
concerned, and attracts the spammers).

... <a href="#">Sample</a>, an Indian online pharmacy. I have not ordered from
them myself. ...
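
The fix being suggested here is mechanical. A minimal sketch of it (standard library only; the host check, regex, and sample comment are my own invented stand-ins, and a real site would do this in its comment-rendering templates rather than post-processing HTML):

```python
# Add rel="nofollow" to off-site links in comment HTML so search engines
# don't count them as endorsements.
import re

SITE_HOST = "metafilter.com"  # links back to our own site stay followed

def nofollow_external(html: str) -> str:
    """Add rel="nofollow" to <a> tags pointing off-site."""
    def fix(match):
        tag = match.group(0)
        href = match.group(1)
        if SITE_HOST in href or 'rel=' in tag:
            return tag  # internal link, or rel attribute already present
        return tag.replace('<a ', '<a rel="nofollow" ', 1)
    return re.sub(r'<a\s[^>]*href="([^"]*)"[^>]*>', fix, html)

comment = '<a href="http://example-pharmacy.example/">cheap pills</a>'
print(nofollow_external(comment))
# <a rel="nofollow" href="http://example-pharmacy.example/">cheap pills</a>
```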

They should look real hard at their site structure and which pages they allow
to be indexed. Too many vague subdomains, and 'posts tagged with...' in the
index.

I wonder what their user metrics show to Google. I don't think Google is
penalizing sites based on a mere hunch, or won't notice that a site like
MetaFilter got hit by an algorithmic update.

This site is someone's child, so it's hard to be too critical, but the view
that MetaFilter hosts the quality content of the internet is far too rosy.

Pages like:
[http://www.metafilter.com/67307/Gampoumlmbampoumlc](http://www.metafilter.com/67307/Gampoumlmbampoumlc)
Which are veiled spam pages. Pages full of inane babble about medical issues
or personal drama, with the credibility and authority of a Yahoo Answers page.
Just because people paid MetaFilter to post, and a site is heavily moderated,
does not mean it adds anything to the search results for most generic search
terms.

Then most links or interesting stories are from elsewhere on the web.
MetaFilter is kinda like trying to rank with the old Digg comments: not the
best place on the internet to read a discussion surrounding a topic. Why
should it deserve to rank for linking to a funny or controversial news
article and riling up 15 comments or so?

In the end there may be far more natural reasons for this. The user signals
showed something was wrong with the site. People bounced a lot (be it the
design, be it because you can't comment without paying, be it because 15
random internet comments have little value). In the meantime Reddit and other
sites grew to large communities. You don't see Reddit comment threads ranking
all too high either, unless they are really special, popular or significant,
like the president doing an AMA.

~~~
pessimizer
There's absolutely nothing spammy about
[http://www.metafilter.com/67307/Gampoumlmbampoumlc](http://www.metafilter.com/67307/Gampoumlmbampoumlc),
and if it has to be excluded from the internet because it contains two links
to the website of a product along with discussion of that product between two
dozen people - your search is broken, not the page.

I'm not sure when it became alright to exclude "inane babble" from the
internet, but I'm not comfortable with an algorithmic measure of lack of
interestingness or quality being judged as an attack. I also think that it's
moving the goalposts from detecting destructive or deceptive content to
defining what all content on the internet should look like.

~~~
blauwbilgorgel
That page is commercial spam (or "veiled spam"). It serves no other purpose
than to get a link to his own product site, no discussion, no nothing. Just:
For the special price of 999 Euro you can get your own. How is that quality
content?

It is also spammy in that it doesn't nofollow external links on comments.
Those links should have been nofollowed like 6 years ago. That they still
aren't, is telling about MetaFilter's knowledge on SEO. You know that Google
sees your site linking to bad sites, and what do you do? Complain about it?
Fire people due to an update 2 years ago? Or do you simply take action to fix
it?

MetaFilter doesn't have to be (and isn't) excluded from the internet. It just
doesn't have to rank well for terms like "Gomboc".

It became alright to derank "inane babble" the moment it increased the
satisfaction of search engine users. Which is pretty much from the beginning
of search engines. It is user metrics that count. Statistics don't lie.

It does not matter that you are uncomfortable with an algorithm. I think I
don't get your "judged as an attack"-part. The alternative is not using a
search engine that relies on algorithms.

I did a casual investigation of MetaFilter. It has not changed much over the
years. The pages I did look at were all of the level of "inane babble". One
line comments. Puns. Not adding much to an already thin topic. Perfectly
alright to have a place on the internet to debate how to get your friend to
take Viagra. But not much need to rank that for men's health topics,
relationship advice or even Viagra.

[http://ask.metafilter.com/57406/Bright-light-flashes-in-my-right-eye](http://ask.metafilter.com/57406/Bright-light-flashes-in-my-right-eye)

Now would you want to find that page when you are searching for those
symptoms? Or would you want credible medical advice?

If it is moving the goalposts, it is moving the goalposts to better quality
content. Those visitors not going to MetaFilter are not gone, they are catered
to by other sites. Every tag on MetaFilter has a specialist site now. Ranking
MetaFilter so high before 2012 was a present. MetaFilter doesn't deserve to be
deranked any further, but it doesn't deserve huge pre-2012 rankings either. In
my experience, today's SERPs are much better than ever before, and the
complaints are biased, reminiscing about the good old times or influenced by the fact
that some paid to post on that site (You'll defend it, as you don't want to
back a losing horse, and your experiences with the site are probably good).

~~~
pja
_That page is commercial spam (or "veiled spam"). It serves no other purpose
than to get a link to his own product site, no discussion, no nothing. Just:
For the special price of 999 Euro you can get your own. How is that quality
content?_

Just for reference: metafilter bans self-links (ie links to content that the
poster is involved with in any way) outright in posts to the main page. It's
extremely unlikely that the poster has anything to do with the product in this
post whatsoever. The fact that they're a member of fifteen years standing with
only 15 posts to their name makes it even more unlikely.

I wouldn't argue that that this post deserves to rank highly in any search
engine, but to downgrade the whole site because of it? Madness: It's a link to
an interesting mathematical shape, with a bit of perfectly on-topic discussion
underneath which contains a few other relevant links. Best of the web?
Probably not. Spammy? Hardly.

 _It is also spammy in that it doesn't nofollow external links on comments.
Those links should have been nofollowed like 6 years ago. That they still
aren't, is telling about MetaFilter's knowledge on SEO. You know that Google
sees your site linking to bad sites, and what do you do? Complain about it?
Fire people due to an update 2 years ago? Or do you simply take action to fix
it?_

Links to spam sites will be flagged & then squashed by the moderators in short
order. Well moderated comments by an active user-base shouldn't be nofollow -
they contain actual useful information that a search engine (like Google!)
ought to find useful, if page-rank matters at all.

It's possible that metafilter might have a problem with links in old posts
ending up pointing to domain squatter sites I suppose, but if this is a major
problem then it ought to show up in metafilter's google webmaster tools.

~~~
blauwbilgorgel
>metafilter bans self-links

I shouldn't have said that the poster owned that site, that was a mistake, and
it detracts from my point. I should have left it as an example of a
thin-content page that doesn't really add anything to the web. Of which there are
many at MetaFilter. The view that MetaFilter pages are all quality and deserve
to rank well is not realistic. And it isn't a mistake to derank a site when
user metrics and A/B tests show that doing so increases user satisfaction.

>but to downgrade the whole site because of it?

I do not think this happened. I do not think I found the single page on
MetaFilter that deserved a downgrade of the whole site. It is but a symptom of
a deeper underlying problem with the site's quality. Best of the web? Nope.
Clear-cut spam? Nope. Somewhere in between? Yes. The rankings reflect that.

>Links to spam sites will be flagged & then squashed by the moderators in
short order.

Except they aren't. Google reports followed links to spammy sites. I found a
link to an Indian pharmacy in a few seconds of searching. Moderators ought to
work. Nofollow will work.

The problem is not the old links, though they are accompanied by old stale
content, which IS a problem. Also, these are today's links:

\- "Everyone On Wall Street Is A Dick."

\- Is Using Lotion a Black Thing?

\- #basketball #trickshots

These link to other websites like vine.co or b3ta.com. They are followed by
some comments that new visitors cannot join in on unless they pay. This is
fluffy content that ranked well around 2006; it doesn't anymore. Why are these
comments so special that they deserve a top 10? When 100s of other sites also
put up a link and have a small discussion? Where is the unique quality content
of MetaFilter? In the comments?

~~~
pja
Ah, I think you're operating under the misapprehension that it was
metafilter.com that was attracting the Google clicks & so you're reasonably
looking at posts to metafilter.com & wondering how they could ever have
ranked in Google searches in the first place.

As I understand things, it's the ask.metafilter.com Q+A sub-site that was the
main source of Google search clickthroughs & thus ad revenue (Yup, 90%
according to Matt Haughey's essay on medium.com:
[https://medium.com/technology-musings/941d15ec96f0](https://medium.com/technology-musings/941d15ec96f0)).
The main metafilter.com site is pretty much irrelevant in revenue terms,
although it may be affecting the overall ranking of ask.metafilter.com pages
in Google search results for internal Google voodoo reasons of course.

NB. Let me know where that link to the Indian pharmacy is & I'll flag it.

------
Istof
If Google ends up fixing this, I hope they don't add MetaFilter to a
whitelist and call it a day. The results nowadays almost seem like a
hand-curated list that would only help the person who made that list.

------
gdeglin
As someone who's not familiar with MetaFilter, my initial impression is that
the design feels outdated and I can't quickly identify how the site works or
the value it provides.

I'm sure that over time I could grow to love MetaFilter, and possibly even
find value in the qualities that give me this impression. But perhaps Google
has an algorithm that lowers the rankings of sites that scare away casual
users who stumble upon them from a Google search result (high and/or quick
bounce rate, maybe?).

~~~
phillmv
It is one of the better link blogs on the web. I've been reading it since
2005? and I gave them my $5 back in 2008. Prior to the rise of the fractal
advice subreddits, askmetafilter was (and probably still remains) one of the
best general purpose advice forums on the web.

The point is, it has an incredibly high signal-to-noise ratio and it's a
damned shame it's getting penalized for it.

~~~
joaren
Really? I've had several links posted to MetaFilter, and in every instance the
comment thread was full of petty, small-minded nagging. Seems like just
another 'intellectual' echo chamber to me.

~~~
phillmv
>petty, small minded nagging

I think you'll find this to be a facet of every forum or comment box ever made
available on the internet :).

>Seems like just another 'intellectual' echo chamber to me.

Are big words bad? What's 'intellectual' about it?

------
minusSeven
Fact: I didn't know anything about MetaFilter until I read the blog post about
it yesterday.

------
whoismua
Useless article by Danny Sullivan, his columns have offered zero advice for
years now. He knows nothing special these days.

So MetaFilter has been getting some "help" from Google's public relations
squad, as is usually the case with sites that hit a nerve with HN. What about
the thousands or millions of other sites that lost their traffic?

~~~
rhizome
Were any of those sites any good?

~~~
whoismua
Apparently not...Google--the fair and balanced search engine--decided that ads
and Google's own properties are better.

Have you noticed Google's earnings reports that mention a 20+% increase in ad
clicks every quarter? Where do you think that's coming from? From shifting
traffic away from other sites to Google, so instead of going to MetaFilter,
Google decides that a G+ post or YouTube video is better.

------
wnevets
tbh I hate it when MetaFilter shows up on my search results.

------
recursion1133
Sounds like illegitimate backlinking. If you don't know why your page isn't
showing up in searches, then look up the blacklists...

