I had a DMCA takedown notice sent to me on behalf of a website owner who didn't want me linking to their site. My hosting provider gave me 24 hours to remove the link or else they'd cancel my account.
The site's owner claimed that their Google rankings were dropping because my site, iHackernews, linked to them. On that basis, they were able to force me to remove the link via a DMCA takedown notice.
DMCA takedown notices cannot be ignored.
My host is Softsys Hosting.
How does that even work? How does the DMCA apply such that I can invoke it to tell you to stop linking to my site?
I'm guessing it doesn't, but if you're at the mercy of a hosting service that doesn't actually understand the DMCA you're in a jam.
The logic is that DMCA notices are sent under penalty of perjury, and people aren't going to risk perjuring themselves just to improve their SEO or take down a blog post they don't like. Unfortunately, the courts seem willing to accept excuses like "the automated process I use to send out DMCA takedown notices got a little overzealous" as legitimate, so DMCA takedowns are now pretty much a free-for-all for anybody who wants anything taken down.
AFAICT from actually reading the act, they only have to comply with notices meeting the facial requirements, and failing to do so only exposes them to liability for the content covered by the particular notice for which they failed, and only where they would have had liability without the DMCA safe harbor provisions. I'd like to see any analysis supporting the characterization you make here.
Using the law to the letter, and thereby exposing the fundamental flaw in it, simply makes that hurt visible.
Again, and in hopes of a more substantive reply: when has hurting innocents been an effective way to get what you want in a society with political representation for its citizens?
It's bad enough that they have automated the process to the point where companies delete millions of links from Google every year, but downranking sites goes a bit overboard.
I definitely agree though - they sound like a very poor host. One of my websites received an invalid DMCA via my host, and my host handled it well (i.e. listening to my side of the story, before deciding not to pursue the 'complaint').
Any host that handles DMCAs so poorly (i.e. threatening their customers without knowing the facts) doesn't deserve to be in business IMO.
In your case obviously it was unjust though.
Note that there are two different things to keep in mind when someone writes in and says "Hey, can you remove this link from your site?"
Situation #1 is by far the most common. If a site gets dinged for linkspam and works to clean up its links, a lot of them send out a bunch of link removal requests on their own initiative.
Situation #2 is when Google actually sends a notice to a site for spamming links and gives a concrete link that we believe is part of the problem. For example, we might say "we believe site-a.com has a problem with spam or inorganic links. An example link is site-b.com/spammy-link.html."
The vast majority of the link removal requests that a typical site gets are for the first type, where a site got tagged for spamming links and now it's trying hard to clean up any links that could be considered spammy.
If you read the original post closely, it's clear that this is a site asking for a link to be removed--the quoted email isn't from Google.
This is a case where Google wants people to mold the internet according to what the current incarnation of its algorithm says it should be, rather than Google molding its algorithm to understand the internet: natural links versus spam, and their respective authority and value.
Google here is saying that all links from this site are suspect, even when the link in question is 100% valid, quality and relevant.
Instead of going back to the drawing board and trying to fix the algorithm, Google has discovered that, being so big now, it is easier to play the "benevolent" dictator and dictate to the whole internet how and what it should be doing.
And to mete out knuckle raps when they get out of line.
Do you see the point now?
The model (google's algo) should fit reality. Rather than forcing reality to fit the model.
I believe this is what the author is trying to communicate by saying "google is breaking the internet" - what he means is that the internet is becoming a 'google' version of itself, rather than what it would naturally be.
I read the article and didn't see any mention of what site the link was on or even what site it was linking to. It's impossible to say if it was legitimately spammy or not, but either way Google never asked him to do anything. Someone who didn't know about the Link Disavow Tool decided on their own to email him about taking the link down.
You would have a good point prior to the Link Disavow Tool, but it has solved this (very tricky) problem.
"In short, the email was a request to remove links from our site to their site. We linked to this company on our own accord, with no prior solicitation, because we felt it would be useful to our site visitors, which is generally why people link to things on the Internet."
"Apparently Google convinced them, via their Webmaster Tools portal, that the link looked 'unnatural', and that they should use the Link Disavow Tool to discredit the link. Furthermore, they thought it necessary to contact us to manually remove the link, which is something I’m not going to do (out of principle)."
It used to be that you would get sketchy emails asking you to link to XYZ, and now, a few years on, the sketchy emails are asking you to unlink XYZ.
It cannot possibly be your responsibility to remove links that other, unrelated entities have pointing at you! (Unless it's an obvious link scheme.)
The site that emailed to get a link taken down employed an SEO agency to improve their rankings and they created a bunch of spammy links to try to manipulate Google into ranking that site higher than others that probably deserve to rank higher due to their content. If Google did nothing about it we would get some crappy search results.
Also, if you read what Matt Cutts is saying, it implies that if they only had a few unnatural links then they wouldn't have received a webmaster tools warning telling them they had spammy links. They got the warning because their link profile suggested that it was obvious they had been trying to manipulate rankings, not because they had one or two iffy ones.
It's likely that the site then blanket-emailed anything and everything they weren't sure about to get the links taken down. Is it Google's fault that this is how the site decided to deal with the warning? Should Google just make it really easy for them by saying, "OK, you disavowed those crappy links, you're off the hook now"? If the consequences are so weak and short-lived, where is the deterrent effect? No, Google doesn't own the internet, but they do own Google, and they are just trying to provide users with the most relevant results for their searches.
Re: the nofollow aspect. Google didn't "force" sites into using the nofollow attribute by scaring them into thinking that their rankings would drop. Nofollow was in place before there was any notion that who you linked to could harm you. Sites like Wikipedia and others were only too happy to add nofollow to links because it put spammers off from crapifying their sites.
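For anyone unfamiliar with the mechanics being argued about here: nofollow is just a token in an anchor tag's `rel` attribute. A minimal sketch using Python's stdlib `html.parser`, on an invented HTML fragment, that separates followed from nofollowed outbound links:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect outbound links, splitting nofollowed ones out."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel is a space-separated token list per the HTML spec
        rel = (attrs.get("rel") or "").split()
        (self.nofollowed if "nofollow" in rel else self.followed).append(href)

# Invented page fragment for illustration
page = '''
<a href="http://example.com/editorial">editorial link</a>
<a href="http://example.com/user-content" rel="nofollow">comment link</a>
'''
audit = LinkAudit()
audit.feed(page)
print(audit.followed)    # ['http://example.com/editorial']
print(audit.nofollowed)  # ['http://example.com/user-content']
```

This is roughly what sites like Wikipedia did at scale: tag every user-contributed link with `rel="nofollow"` so spammers gain nothing from dropping links there.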
As Google penalizes the best writers and blogs, one of two things happens:
1) Entire sites are being nofollowed - removing links to deserving sites.
2) The site refused to nofollow, so they stay penalized with PR0 no matter how good the site may be or how high the PR once was.
The end result will be that even more black hat spammy MFA sites will rank. Sites that deserved to be written about and linked to will drop and be invisible.
It is high time people stopped defending Google when they are destroying small businesses and costing much-needed jobs. Now Cutts has declared war on bloggers, who often make a living writing.
Google wants to destroy any site that enables advertisers and businesses to connect with sites where their target audience already is by forcing publishers to decide whether they want to risk losing organic traffic and being branded with a scarlet PR0 or to nofollow everything they ever write.
Nofollow = NOT TRUSTED. We do not invest time writing about anything we do not trust. (On the VERY rare occasion we do we do not link to them.) We do not publish content that does not fit our audience.
Every link is a potential risk. Nofollowing links may damage the sites we link to so what is a writer who already cares about what they write about supposed to do?
Google has a monopoly on search because the wealthy elite who own the media handed it to them. People use it because in many cases they do not even know there is an alternative. Many use whatever is installed on their computers without even having any idea what it even is.
People who are trying to clean up unnatural-link penalties are not SEO experts. They rely on tools to tell them which links to remove, and these tools are not very accurate: I've had them flag DMOZ, Best of the Web, and Business2Community as "toxic". Finally, many remove and/or disavow every link in desperation.
How does all that disavowing affect search results and individual sites? How many verify that requests to remove organically given links are from the actual site and not an unethical competitor? (Hint: almost none.)
Many businesses fail waiting for recovery. Others get penalties lifted, but having removed links they no longer get any traffic.
"Google didn't "force" sites into using the nofollow attribute by scaring them into thinking that their rankings would drop." YES, they most certainly DID!
>Google provided a search engine that worked better than any other so people started using it.<
While at the beginning this may have been true, over time that search engine has grown more parasitic in terms of things like scrape-n-displace knowledge graph results, AdWords ads on branded keyword terms where ads with junk bundleware rank above official sites, etc.
>People are still using it because it's still better than the others.<
Care to explain why Google is spending over a billion dollars a year buying default search placement in other browsers like iOS Safari & Mozilla Firefox? Any thoughts on the Flash security updates which hit other browsers and bundle Chrome in with them? Or how about the Android contracts with default search placement (and other forms of bundling) baked into them?
Google is spending well over a billion dollars a year on the thesis that your thesis is wrong.
>It's not a monopoly.<
The hell it's not. At least if we use any of the standard definitions.
>You can use another search engine if you want to.<
And while an informed individual may choose to, the majority of people are driven by default settings which are purchased, as per the above.
>The site that emailed to get a link taken down employed an SEO agency to improve their rankings and they created a bunch of spammy links to try to manipulate Google into ranking that site higher than others that probably deserve to rank higher due to their content.<
Are you suggesting there are no false positives, or that competitors do not buy links to torch their competitors? Either assertion is simply untrue.
>They got the warning because their link profile suggested that it was obvious they had been trying to manipulate rankings, not because they had one or two iffy ones.<
The second "they" there presumes a competitor didn't do it to them. Only a person ignorant of the field of SEO would presume this to be true in all cases.
>Re, the nofollow aspect. Google didn't "force" sites into using the nofollow attribute by scaring them into thinking that their rankings would drop. Nofollow was in place before there was any notion that who you linked to could harm you. Sites like Wikipedia and others were only too happy to add nofollow to links because it put spammers off from crapifying their sites.<
This is a complete misunderstanding of history, on numerous levels.
Nofollow was introduced as an (ineffective) solution to blog comment spam. To help drive wider adoption, at some point some Googlers even suggested things like PageRank sculpting could be useful, up until some large sites started doing it excessively. They were looking for reasons to justify its widespread use, because Google intended from day one that the attribute would then spread onto paid links & other links they didn't want to count.
And the other level of absolute misunderstanding was that (before Google went on their fearmongering campaign about links) they in the past suggested that you couldn't control who links to you, but you could control who you link to & sites which linked to bad neighborhoods could indeed be penalized for it.
Very rarely is there a comment which is that long & that wrong. Impressive!
In this case the author says he created the link independently, because he thought it would be useful to his users. That's, by definition, a genuine, organic link.
I wonder if a link flagged by the disavow tool and confirmed by the user weakens the Google juice of the source site.
@MattCutts (or another Googler), is it the case?
We haven't been using disavowed links as a reason not to trust a source site. So it wouldn't weaken the Google juice of the author's/source site if the company that sent the email had just disavowed the link.
Genuine organic links can still be spammy. The OP seems to be a genuine affiliate marketer/spammer, it's not inconceivable that his links aren't treated with a ton of respect.
Some entities might try to say it is six of one & half a dozen of the other, but those same entities do not view their own affiliate marketing efforts as spam, even when they are pure spam.
Look at the cloaked affiliate YouTube links to iTunes on music videos. And Google has invested in Viglink, RetailMeNot, etc.
In addition to those types of investments, it was Google's ad programs which funded entities like Mahalo & eHow.
There are zero affiliates which have made as much off affiliate marketing as Google has.
You do realize, of course, that Google is part and parcel of the internet, and that much of the way the internet works "in reality" is in response to external perceptions of how googles algorithms work?
Google's early dominance in search was a result of their algorithm's ability to surface relevance. If they had never staked out that Site A linking to Site B means site B has more relevance, then the internet would be an entirely different beast, and SEO would take entirely other forms that do not amount to linkstuffing.
The internet hasn't existed 'naturally' since search engines first came around.
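The "Site A linking to Site B confers relevance" idea described above is, at its core, textbook PageRank. A toy power-iteration sketch over a hypothetical three-page link graph (the graph, damping factor, and iteration count are all illustrative; this is the published textbook algorithm, not Google's production system):

```python
# Toy PageRank by power iteration. The link graph and damping factor
# are invented for illustration; this is the textbook algorithm.
def pagerank(links, damping=0.85, iters=50):
    pages = sorted(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Hypothetical graph: "a" and "c" both link to "b", so "b" ranks highest
graph = {"a": ["b"], "b": ["c"], "c": ["b"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # 'b'
```

The point the thread keeps circling: once rankings depend on a published model like this, people optimize the inputs (links) rather than the thing the model was meant to measure (relevance).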
When SEOs break the quality guidelines of all major search engines (not just Google), then they might need to do some work to clean things up if they get caught.
One of the reasons that Google is such a popular search engine is that we do take spam seriously and we take action to counter spam.
This still seems to be missing the point.
As I understand it, sites that are not doing anything in the realm of SEO can get penalized because of the actions of people outside of their control.
If the wrong sorts of sites (however defined), sites outside my control, are linking to my site, why am I penalized, and why should the burden be placed on me to go fix it?
Check the people who try to drop profile links & comment spam links on your blog and such. Are there any websites which have as many spammy inbound links as YouTube does? How many people does Google allocate to cleaning up YouTube's spammy link profile?
There's another factor in that Google may shift what is considered reasonable over time. A buddy of mine got an unnatural link warning where the link cited was a link that has been in place LONGER THAN GOOGLE HAS EXISTED.
Now consider that some websites are bought & sold, change ownership, etc. ... does it make sense to penalize today for something which happened 10 or 15 years ago?
Another factor is that just by ranking you will pick up some scraper site inbound links you do not want. Many of those sites in various forms or fashion are monetized via Google ads. Google continues to run their ads on many of those aggregation sites, yet you may get an unnatural link penalty for having links from the same sites.
Good enough for Google, NOT good enough for you TM
Very well put.
Oh, he gets it; it's his job to act like he doesn't and to change the subject: "...Ummm... Bing does it too... look, I made a video."
Well, if a website was link spamming then it seems appropriate that the site should attempt to clean up the spam they made before they can rank well in Google again. Otherwise it's not fair to the other websites that have been trying to rank fairly and would like a level playing ground.
In general, if you weren't trying to create spam links, then it's very unlikely that any of this is an issue for you and the burden isn't on you to fix it. I made a video about the "what if the wrong sorts of sites link to my site" question here: http://www.youtube.com/watch?v=HWJUU-g5U_I
So if Bing wants to rank Bing.com higher than Google.com on the keyword "Search Engine", all they need to do is to pay link farms to link to Google?
Like everything this could go both ways. If you penalize sites for having links from link farms, what is stopping your competitors from paying link farms to link to your website? or worse yet, some link farm owner comes up with an ingenious idea to hold other websites hostage until they pay up?
It would be more understandable if Google didn't have any other resources on hand to make smart decisions, but heck, you guys have hands-on, realtime experience with most traffic via Google Analytics, Google AdSense, and even DNS. All these resources and you can't even distinguish a link farm from a legitimate site? Doesn't sound believable.
The Google webspam team seems to prefer psychology over technology to solve the problem, especially recently. Nearly everything that's come out of Matt Cutts's mouth in the last 18 months or so has been a scare tactic.
IMO all this does is further encourage the development of "churn and burn" websites from blackhats who factor being penalized into their business plan. So why should I risk all the time and effort it takes to generate quality web content when it could all come crashing down because an imperfect and overzealous algorithm thinks it's spam? Or worse, some intern or non-Google employee doing a manual review wrongly decides the site violates webmaster guidelines?
>Nearly everything that's come out of Matt Cutts's mouth in the last 18 months or so has been a scare tactic.<
the bizarre thing is the gap in perceptions internal to Google versus external.
>IMO all this does is further encourage the development of "churn and burn" websites from blackhats who factor being penalized into their business plan.<
Absolutely. churn & burn sites, and then some mixed in parasitic hosting on sites which are already highly trusted.
During this "crackdown campaign" Google has been on, I have sat in on meetings at large clients where they've pulled sites that are supposedly link farms (according to google) and put together an ad buy on behalf of competitors meant to cause these penalties to be levied and WMT messages to be sent.
They needed to provide a useful service to begin with to get the traction (good search algorithms for the web-that-is), but now that they are the incumbent, it is more in their interests to penalise website owners and force them to buy ads.
It's sad, but it's normal monopolist behaviour.
The cost of throwing scummy links at a website is under $100.
And if their first $5, $10, or $20 test doesn't work? Then so what for them. They put it on cron job and throw another $20 at it again and again.
The key point here is that there's a 100X or 1000X gap between 4 figures & 6 to 7 figures. The cost of burning things to a crisp is much lower than building them up. Whenever incentive structures have a 1000-fold difference in outcomes the path of least resistance becomes a popular path.
After a few cases like this, a long list of multiple disavowed domains/networks/fingerprints, where are these negative links going to come from? This anti-SEO tactic you outline can only be used a handful of times before it becomes useless.
Matt Cutts' job at Google involves him making tons of videos explaining how the Google algorithm works. It isn't omniscient, and the guidelines show how to work within its limitations. If you don't care about optimizing your Google presence (as we don't) then you shouldn't care about Google's guidelines either. But if you do comply with some guidelines, then this will probably help your rank in other search engines which also take similar measures to TRY and detect linkspam.
Google's customers are its users. It tries hard to have the most relevant results (or provide value some other way, or just maintain a brand image so people keep coming back). The advertisers come because the users find them and spend real money. And the users want relevant search results, not spam. That's why Google spends a ton of resources on combating the spam.
How do you propose they magically "know" when a link is natural?
No, Google's customers are its advertisers. They're the ones who pay money to Google.
I agree with your general point though, the algorithm isn't perfect and sometimes it goofs. What I think people are having a negative reaction to is Matt Cutts' seemingly obtuse responses, as though he doesn't even understand the issue at hand (Google's actions distorting natural behavior), so he just keeps explaining the anti-spam policies in general terms. It's not a conversation, it's people talking past one another.
Think of some of the stereotypes of engineers. Many can code but struggle to communicate. Matt can code, has his name on many patents, leads a team, makes hundreds or thousands of videos, regularly keynotes at conferences & interacts with thousands of people, was GoogleGuy for years, participates in the comments here, was an expert in some of the past lawsuits against Google ... he has basically had a near infinite number of opportunities to put his foot in his mouth & yet how many times has he ever done it? Almost never, if ever.
About the only times I think he has potentially missed at all was the "breaking their spirits" bit on TWIG, and then two minor bits in a few YouTube videos
http://www.youtube.com/watch?v=muZuX9OaMLo&feature=youtu.be&... "We got less spam and so it looks like people don't like the new algorithms as much."
"the gap between what you can get away with and what google says you can get away with is getting smaller all the time"
Who do you know who has as much media exposure as he has had who hasn't repeatedly put their foot in their mouth? For me, that answer is nobody.
What makes Matt's performance even more impressive is how scummy some of Google's policies have been, even as he came off smelling like roses. There are endless debates on how reasonable it is for Google to infer intent on links & so on, yet at the same time some of the past Google executive emails have quotes in them like:
"As with all of our policies, we do not verify what these sites actually do, only what they claim to do." (for AdWords advertisers)
"I would prefer that Omid do it verbally since I don't want to create a paper trail over which we can be sued later? Not sure about this. thanks Eric"
No, Google's customers are primarily its advertisers. Optimizing advertising revenue has some overlap with pleasing users, as you note, but if a decision needed to be made between pleasing users and increasing revenue, there's no doubt about which way it would go.
I suppose any company that works out how to increase revenue while upsetting and alienating its users could possibly implement said ideas. But if you're smart enough to do that, I would guess you would also be pretty successful at the traditional approach of increasing revenue by pleasing your users?
Rather than these fear-mongering tactics, why can't Google just ignore spammy links instead of penalizing whole websites and all the websites that link to them?
But! You may have hit the nail on the head. If I had to guess, I would say that storing exceptions means the algorithm suddenly has to check part of the corpus for those exceptions; at the scale at which Google operates, that could have rather big size and speed implications.
But maybe I'm just fantasizing.
Ask Google engineers how they feel about centralization of power in telecom or finance or politics & they will tell you it is the worst thing ever
Yet that same sort of centralization is fine when it is Google.
Those same engineers will admit that any algorithm has some level of false positives & false negatives. They may try to minimize these, but they can't make them zero.
The other thing which gets very little coverage, but is crucially important is that in spite of already having a monopoly market position Google keeps buying search marketshare with: their secret Android partner contracts, Firefox default search placement, iOS default search placement, Flash embedded into Chrome (so security updates happen in the background without sending you to a download site where trash can be bundled in with the update) & Flash security updates that hit all other web browsers coming bundled with Chrome bundleware which sets Chrome as your default browser, etc.
If Google decides they don't like you (for any reason), then you need searchers to use a different search engine in a browser which isn't paid off by Google for default placement. And you need to hope that users are savvy enough to repeatedly say "no thank you" to the automatically bundled Chrome install with their frequent Flash updates.
Google mentions how anything is only a click away or similar, but most people tend to use defaults.
When someone changes those defaults on Chrome users, that's a horrible user experience & the "hijacked" settings must be reset.
But when those defaults are paid for by Google or changed through Google bundleware it is "a great user experience."
If Google believes their marketshare is a reflection of their superior search offering, they are welcome to stop buying default search placement in other web browsers & stop bundling Chrome installs with Flash security updates. But currently they are spending north of a billion dollars a year on these activities, which indicates they clearly feel there is significant value in them.
You are suggesting an approach where there are no downsides to link spam. Obviously, players of this game would then ramp up their link spam bots, and run them non-stop. Because it causes no damage to the site they want to rank.
Link spam isn't just a search engine indexing problem, it is also a detriment to the general Web user too. It makes the Web worse, not just search.
Don't like YouTube's copyright claim issues on video game playthroughs? Use Twitch. Oh wait, what's that? Google is trying to buy Twitch.
So then what's the difference between disavows and having the links removed? Getting the links removed costs far more time & money. And it gives Google a data mining stream of feedback they can leverage to dish out further penalties.
Google requires some of the links to be removed not because it improves the web, but rather their goals & interests are aligned with punishment. They want to add cost & uncertainty to SEO in order to discourage investment in SEO from entities not formally connected to Google.
That's a good thing, right? Pushing sources of bad links down through the basement, effectively nuking the source/network from the link graph.
You know Google's ranking system is based on information, and adding more information leads to better results for the Web user.
That really depends on your perspective.
If it were impossible for others to throw trashy links at your site in bulk for cheap it could perhaps be a good thing.
If Google didn't change policies overnight & then retroactively penalize you for things which were fine in the past, then it could perhaps be a good thing.
Unfortunately, in reality, both of those ifs are hypothetical & untrue. Which means it is absolutely not a good thing.
>You know Google's ranking system is based on information, and adding more information leads to better results for the Web user.<
Quality of information matters as much as the volume of information. Look at Demand Media's current stock price for an example of this.
But in terms of links, you have to think through the impacts here...
Yes if Google freezes some activities they dislike then perhaps that is a net positive for relevancy, however the more they fear-monger about links the less natural linking goes on. Most of the major social sites put nofollow on almost all external links. And with the sea of fear approach to relevancy, in some cases magazines or newspapers will profile a person & not link to the source in spite of the entire article being about that source. I've spent many hours being cited by journalists where in spite of being good enough to be the primary source for their content, there was no link citation.
Paraphrasing David Naylor's excellent recent video:
"More and more high profile websites are having a no linking policy. Which really is kind of weird isn't it, because if that is a journalistic website and they are writing about your website surely those are the kinds of links Google would want to see. ... It seems kind of weird that the links Google actually wants you to get are the first links to dry up on the web." - David Naylor
The other important factor is the quality of the information created by those who are trying to do disavows and link removals. When people are irrationally driven by fear while in a harmed state, they are NOT acting rationally. Business owners who are selling off assets, firing employees, stalling with creditors, aligning predatory lending to try to keep things afloat, etc. are stressed out & are likely to make many poor decisions in that rushed & panicked state.
Thus some of that more information which is created is junk misinformation.
Consider this email:
They asked the website to remove the link claiming it was unnatural, and then they emailed again to ask the website to ignore the removal request.
They don't know which links are "unnatural" and so they are using automation to try to sort through it all. Are their automated quick guesses (which are often driven by tools) more useful than all of Google's internal reviews & ratings data which has been built up for over a decade? Color me skeptical.
There's another factor with the disavow data as well. Look at this removal request
Our site has about 20,000 unique linking domains referencing it. However over the years we have had well over a million registered user profiles. If only 2% of the registered user profiles were ignorant spammers who spammed our profile pages and then later added our site to a disavow file, we would have more people voting against our site than we have voting for it. And those profile pages were already not indexed & the links were nofollowed anyhow. Those pages are effectively outside the search game, yet those ignorant spammers can still create negative votes against our site based on the fear-mongering.
And as bad as that sounds, there's no reward for actually removing the bad links either. At some SEO conferences, numerous SEO experts have given the advice to "just disavow them anyhow" even if the links are removed. Thus this deluge of email spam offers publishers no value whatsoever, just sunk cost & wasted time, time which could have been spent creating useful information.
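For concreteness about what a disavow actually is: per Google's documentation, a disavow file is plain text with one URL or `domain:` entry per line and `#` comment lines. A small sketch that assembles one from a hypothetical list of flagged links (all site names here are invented):

```python
from urllib.parse import urlparse

def build_disavow(urls, whole_domains=()):
    """Emit text in Google's documented disavow-file format:
    one URL or 'domain:example.com' entry per line, '#' for comments."""
    lines = ["# generated disavow file"]
    for dom in sorted(whole_domains):
        lines.append(f"domain:{dom}")
    for url in urls:
        # Skip URLs already covered by a whole-domain entry
        if urlparse(url).netloc not in whole_domains:
            lines.append(url)
    return "\n".join(lines) + "\n"

# Hypothetical flagged links for illustration
flagged = ["http://spammy-directory.example/page1",
           "http://scraper.example/copy-of-article"]
text = build_disavow(flagged, whole_domains={"scraper.example"})
print(text)
```

The commenters' concern is about exactly this data: every line someone uploads, accurate or not, is a signal Google could choose to mine.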
In such a scenario, how does Google determine whether I created the links? Does Google have precognitive abilities, or can it spy on my communication channels (Gmail, Google Voice, Hangouts, etc.) to figure out if I am innocent?
A model is only as good as the information in it. In this case providing details of these spammy links helps you, and every other site targeted through the same sites/networks/fingerprints.
Consider it a reverse no-follow mechanism.
It appears that, after years and millions spent, a perfect search algorithm is impossible.
That is an absolutely ridiculous rule of thumb to follow. What exactly is wrong about doing something like:
1) Having a good article on your site about a certain topic
2) Reaching out to other site owners who have similar content and asking them to link to you as an additional resource
3) Getting that link.
Can you honestly, with a straight face, tell me that's BAD!?
The situation you mentioned can start out as innocuous but quickly become downright shady. Hmm, you're asking them to link to you, but what do they get in return? Oh, here's a thought, you tell them you'll link to them as well if they link to you, that way you'll both get a ranking bump! Totally fair and win-win for everyone except for the people who are trying to get quality information.
That might be the ideal result, but it's hard to see how anyone could achieve it with a purely automated system. To make the kinds of distinction you're describing, in general you probably need some form of manual curation by genuine experts with uncontroversial opinions, assuming such people even exist in the field of interest.
How do you feel about Google sponsoring educational seminars for market regulators where they do not disclose their sponsorship, actively conceal it, and request the entity putting on the sponsorship do the same thing?
It is also worth mentioning that when Google was lobbying against regulation they indeed DID control & manipulate the placement of favorable media masquerading as regular content:
"the staff and professors at GMU’s law center were in regular contact with Google executives, who supplied them with the company’s arguments against antitrust action and helped them get favorable op-ed pieces published, according to the documents obtained by The Post."
Another thing with the line of thinking based on pureness...should companies which have repeatedly been caught rigging actual physical markets (e.g. municipal bonds, California energy, LIBOR, interest rate swaps, forex, etc.) in the real world get a penalty by Google for it?
Or is it reasonable that they take their outsized gains from their market rigging behavior & invest some small portion of it into buying out smaller competitors, lobbying & writing regulations to harm smaller competitors, buy some feel good brand ads & do various charity donations to paint a picture of themselves?
I also want to quote the following past statement from a Google executive about the AdWords ads Google puts directly in the search results:
"As with all of our policies, we do not verify what these sites actually do, only what they claim to do."
and, some icing on the cake, ... ;)
check out the keyword rich deep links to some of Google's affiliate offers on BeatThatQuote.com (for credit cards, car insurance, etc.)
Google bought BTQ & then put a bunch of cross-site keyword rich deep links on it. is that natural? how is that any different than buying links?
"While important as a vote of confidence for the content they point to, there is simply so much link spam these days that it’s tough to know where to turn. Obviously buying links is a dead end, and it doesn’t matter how you split this hair: sharing, encouraging, incentivizing, buying – it’s all the same. You want links to surprise you. You should never know in advance a link is coming, or where it’s coming from. If you do, that’s the wrong path. Links are part of the bigger picture. You want them, but you want them to be natural. If an engine sees you growing them naturally, you’re rewarded with rankings. If they see you growing them unnaturally, you’re rewarded with penalties."
If you disagree with that, you'd need to take it up with Bing's webmaster outreach team.
Your statement implies both Google and Bing share the same sentiment.
So now, my question still stands: What about that scenario that I laid out is actually bad/spammy/evil? Why should anyone ever be punished for doing that?
This practice existed long before the Internet, and the Internet has made it easier than ever. Google is essentially telling us not to ask our customers to evangelize for us.
Why such hypocrisy? How about the invite system Google uses for many products, including Gmail, where users are encouraged and incentivized to invite friends?
According to Google's terms if I link to my Dropbox referral link on my blog, Dropbox should be penalized for it because they incentivize users to do that.
The internet should not work according to Google's algorithm and policies, it should be the other way around.
Every marketing seminar I have ever been to has emphasised that as much as you can turn your clients into fans, in the end you still need to ask them for a testimonial and that you need to tell them that you love referrals.
Under that same logic, Google Adwords should never know in advance where their ad clients are coming from, and if you have someone out there trying to on-board big clients (or increase their spend) that's the wrong path.
You say "if they get caught" when sometimes, there is nothing to catch. You're dinging people who have grinded to forge relationships, in the name of trying to keep spammers out of the index.
It makes it too easy for me to hire a spam company to blast an overwhelming amount of crappy links at my competitors who may or may not be well equipped to disavow those links.
Which, if my company has the budget to hire someone to monitor & disavow, and my competition does not, well... somebody's still gaming the system...just not in the same way...
That's what's wrong with this picture. The link was perfectly valid and yet Google recommended that site's admin to disavow the natural link, simply because Google's algorithm isn't good enough to see what's natural and what isn't. So his suggestion is: if it's not good enough, then don't recommend the wrong stuff to people.
This is pretty typical of 2014 Google.
To be honest, these days I pretty much regard Google as one of the banes of my existence. You guys hold so much power and influence so much of the internet that the rest of us have to work around you. I'm currently pulling all our customers off Google Apps for Enterprise for a multitude of reasons (buggy code, broken IMAP implementations, impossible to get an actual human answer), and moving myself away from all Google services (including Android) as well.
I remember when you guys were the good guys.
People have been complaining about this since Penguin came out (around two years now), and before that they were complaining about the opposite: that Google's arbitrary rules were creating spammy links all over the internet. It's completely ignoring the past to pretend this is some new thing to fit a preconceived narrative.
Most of the SEO folks that seem to come out of the woodwork for these threads would agree with me that "This is pretty typical of 2014 Google" ignores years of them complaining about the exact same thing, except they're just happy to have more people on their side.
Then we started seeing the first indications that real "negative" SEO was possible in certain circumstances, and we complained even further. Stop counting links you don't like. Now we are at a point where we see old websites being told to remove or disavow links that may be as many as 10 years old.
Can Matt or Google really stand there and say that it took them 10 YEARS to figure out that not only was that a link that they didn't like, but it was so evil that now it needed to be removed from the internet? The solution has been, and always will be, stop letting things you don't like count as a plus in your algorithm. When in doubt, don't count it.
Instead, we have Google adding thousands of webmasters and site owners as a living part of the algorithm and getting them to remove all those links, plus many they can't see or understand the value of, by the use of fear and scare tactics. Remove these links or we will ruin your website rankings. It doesn't matter if you created or asked for them or not - remove them.
Yes, it most certainly did!
From the article:
> Apparently Google convinced them, via their [read: Google's] Webmaster Tools portal, that the link looked “unnatural”, and that they should use the [Google] Link Disavow Tool to discredit the link.
> It's quite likely they noticed a penalty...
Yes, and where would they have noticed that penalty?
In a drop in incoming traffic, so with whatever they're using for analytics.
In all honesty it's probably just a new SEO consultant trying to clean up past sins and move on with whatever the latest grey hat strategy is.
IMO it's a gross overreaction to a form letter that a high percentage of webmasters have received.
The whole point is, Google spreads FUD to paper over its inadequate algorithm. People believe you and look up to you as a source of truth. However, you take that trust, and you ask people to do your dirty work. That means more work for them, and not every small business has all sorts of extra time and money to spend on stuff like that. This is not to mention that there are probably a bunch of other more important things they could be doing. (And I know you know this; your friendly face is the head of FUD over at Google.)
With all of your power and authority, you have a responsibility to be more honest and straightforward. Not everyone knows better, and when people end up listening to your Google/self-serving advice, they are ruining the open and free nature of the internet.
This wouldn't make me so salty if it weren't for the complete BS and favoritism exhibited by you and Google. So, sometimes, it might be better if you just shut up.
sorry if there are any typos etc, my (virtual) keyboard is spazzing out.
I suspect neither would be a major issue if Google modernized their algorithm to take into account (mostly/only) which links people actually clicked on rather than what is linked from where. They'd only have to be able to distinguish people from click-bots, which might be easier to do with some confidence than recognizing the intent behind a particular link (spam/SEO or real information for visitors) - I hope they are already able to do this, for the sake of AdWords customers ...
Google's algorithms should reward behaviour that's best for Web users, and penalise behaviour that is worse for Web users.
That way, for people who want to game search engine rankings, they are pushed more towards techniques that benefit web users in the course of trying to benefit themselves.
The problem with models is that they are not accurate simulations of the real world. They have flaws and simplifications, and those give rise to people taking advantages of those flaws. All models and representations have flaws.
It is impossible for Google to exist at its current scale and not influence the Internet.
The tiger is born out of adaptation to the jungle, but in time, the jungle too will adapt to the tiger.
Matt, this is beyond disingenuous. If Bing or Yahoo responded in the same way, I doubt we would hear a ringing endorsement from Google on the subject.
You've indicated in other comments that basically your algorithm is everyone else's problem to deal with. You can use whatever words you like, but the actions are what matter.
At this point, it sounds more as if Google is concerned with "being right" more so than "doing right".
This blog does not indicate some fundamental breaking of the web. It indicates a guy got a random email from a random website, that he is free to ignore if he wants (and in fact should ignore, because the webmasters in question don't even need to contact him, they can use the disavow tool).
No work to do? No problem!
It's reasonably clear to me this email was anything but random. Take Google out of the picture and this communication never happens. Both the sender and receiver are taking actions with consideration of one single point -- how does this affect my Google search ranking?
If Google weren't the literal default entryway into the Internet, your argument would be valid. There is precedence in this scenario as well. Back in the 90s, many people said the same thing regarding Microsoft and attaining placement on the Windows desktop for new shipping PCs. This was a key element of the DOJ trial.
The argument that a powerful intermediary has no effect on the conduct of others is simply choosing to ignore certain facts.
Did you read the article? The reason that site is asking for the links to be taken down is that they believe (apparently based on feedback from the Webmaster Tools) that it will please the Google Gods.
Spammy links and irrational fear of spammy links are both unnatural linking behaviors that result when sites are more concerned with pleasing search engine algorithms than with pleasing human users. That's what the article is talking about.
ForHackernews, we say in our quality guidelines to "Make pages primarily for users, not for search engines." If the SEO whose email was quoted in the article had followed that advice, they wouldn't be in the position of trying to clean up spammy links after getting caught spamming.
The problem with that is that it's not true (in my admittedly very limited experience). I used to believe that until I created a small website and found it wasn't getting listed in Google and I had to try and guess why and carefully read the rules.
Making pages primarily for users means you unthinkingly do things like having duplicate content on multiple pages/subdomains/sites or linking to a site that reviewed your product after you gave them a free sample (a paid link).
I'm sure to any even vaguely experienced webdev they know not to do these kinds of things (use canonical links, nofollow and all that jazz). Somebody just throwing a few pages together who has no experience in this stuff easily falls foul of the rules. The rules may seem obvious to people who understand how search engines work. If you have never heard of pagerank and don't have an understanding of how search engines work and how people try to game and spam search engines, the rules are not obvious at all and certainly the rules are not the same as the natural behaviour of somebody creating a legitimate site with the users in mind.
* Product with a category name in url, or without, or under multiple categories
Both are very useful for humans, but have to be massaged with canonical urls to please Google machines
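For reference, the "massaging" in question is a single element in the page head: every duplicate URL for the product declares one preferred URL as the original. The shop URL below is a made-up example:

```html
<!-- Served on /gifts/deluxe-widget, /watches/deluxe-widget, etc. -->
<!-- All variants point at the one URL you want search engines to index: -->
<link rel="canonical" href="https://shop.example.com/products/deluxe-widget">
```

Simple enough once you know it exists - which is exactly the problem for someone building a site purely with human visitors in mind.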
I don't see how the same product under a variety of URLs, one for each category it's in, is at all useful to me as a human. It actually rather annoys me when the same page has a whole bunch of URLs: I can't tell which is the "real" one, my address bar frecency gets messed up, visited links don't work correctly, etc.
As a consumer, if I came to the website looking for instructions on how to use my [Deluxe] Widget, I would reasonably expect to find the same operating guide on both product pages.
Frankly, I think "spammy link penalties" is an approach that cannot work long term, any negative weight put on links is too easy to game by competitors.
The vast majority of link removal requests I've received do not come from an email address that matches the destination of the link for which the removal is requested.
They still don't really understand the web, and they hired someone who appeared professional to them (and probably cold called them). Only after a while do they realise they're paying a lot of money for something unquantifiable and with no guarantee of hitting the front-page of Google, and so they end that relationship (in fear, as they're told that whatever placement they now have may be in jeopardy).
So starts the merry-go-round, where the smaller merchants constantly jump from one bad SEO company to another - each one creating a new mess whilst claiming they'll clean up the mess left by the one before.
I do feel for the smaller merchants. Their business is being eaten by the internet, and they're missing out and have ambition but are being fleeced by sharks every step of the way.
You're a Google employee, right? You lot have gone quite cheeky nowadays. You're exploiting the monopoly you hold in the search market to make people adhere to your standards, instead of adhering to the standards the internet community naturally generates. I'd suggest Google's engineers go develop better algorithms instead of telling the internet off.
That's taken straight from Google's webmaster support forums... https://support.google.com/webmasters/answer/2648487?hl=en
Even in situation #1, Google clearly encourages webmasters to "do as much work as you can to remove spammy and low quality links..."
I happen to disagree with the article, but it's kind of disingenuous to claim that Google had nothing to do with that link removal request.
I don't think Google needs to spend resources to turn around and tell this guy, "got'ya"...
Personally, I don't see the harm in having organic links being asked to be removed. I would never remove a link that sends referral traffic, and I see no reason to keep a link that doesn't send me visitors. I have no problem having someone no follow a link that sends me traffic, because it will send me the same traffic...
so I think the issue here is less about actually breaking the internet and more about how it makes people feel when they thought they were helping you by linking, and then they get accused of destroying your rankings by asking for the link to be removed....
A video from Google would be useful to help explain what is happening here for all parties involved.
I thought I'd made at least one video for sites that have been getting link removal requests. I'll check, and if we haven't made one then I'd agree that would be useful. Thanks for the suggestion.
Yes, Google isn't sending these so you could argue that Google are not directly to blame.
But, the merchants in question who are sending them are doing so because Google has made them believe they are (or will be) subject to some penalty if these links are not cleaned up.
I run forums and all of our links are natural links. For the first year I would look at every link that was asking to be removed and found that, without a single exception, every link was created by a trusted person in the community helping to answer a question by someone else. Every link I checked was natural and high quality, none... literally none... were suspicious or spam.
It may be that these merchants did involve SEO crews who did create a lot of spam links, but those are not on our site.
What's more worrying is that merchants feel so overwhelmed (numerically there are a lot of links to deal with) and in such a state of panic about this (they fear a bad placement could have significant effect on their business) that now they're hiring people to clean up SEO mess, and those people create 4 problems:
1) They have no clue which links were the result of SEO, so they just request every link they can to be removed.
2) They send these removal requests from emails like firstname.lastname@example.org when the links point at www.watches2u.com so it is impossible to verify as a publisher that the person requesting link removal represents the merchant to be de-linked (and #1 means that they don't enter into dialog as they can't, they're sending tens of thousands of requests).
3) The removal notices seldom have enough info to be acted on. One forum I run has several million posts, and the removal request may simply be "You've linked to blah.com, please remove it". OK, more verbose and almost identical to the OP's example, but not a useful pointer to the page or comment that contained the link.
4) If, as a publisher, one fails to remove a link, then their automated removal request scripts will now spam the publisher routinely with the same request.
If you can finally get a conversation that says "Do you represent the merchant, have you got a full URL to where the link exists on our site, and are you sure this isn't a natural and good link?", then every merchant has so far agreed that the link should stay.
But... for all of that, the effect is very real for the publisher: We get spammed evermore, from unaccountable sources that we don't know has a link to the merchant, asking for links to be removed that are actually very good quality and with good context and content.
It's a mess. A mess for which the merchants may be to blame (for originally doing SEO, and now for handling removals so badly), but this mess is definitely due to Google creating the situation.
My policy now: I put link removal requests in my spam folder.
That is how I regard it, this is just a new type of spam that Google has created. We never engaged in SEO or supported anyone doing so, we took an aggressive stance against spam, but as we have content that links elsewhere we are being spammed constantly with these requests and we carry the burden of the problem.
PS: The example cited above was just the top one in my spam folder, the link can be found on this post in the form of an img tag: http://www.lfgss.com/thread17604-52.html#post2119173 , it's the photo of a watch posted on a horology thread within a cycling forum by a long-standing member of the community with high karma. The request for link removal did come from an @web-marketing-group.org.uk address. The request will be ignored. We don't do "nofollow", but it's irrelevant as that isn't valid on img tags.
The main thing I'd recommend for a site owner who gets a fairly large number of link removal requests is to ask "Do these requests indicate a larger issue with my site?" For example, if you run a forum and it's trivially easy for blackhat SEOs to register for your forum and drop a link on the user profile page, then that's a loophole that you probably want to close.
But if the links actually look organic to you or you're confident that your site is high-quality or doesn't have those sorts of loopholes, you can safely ignore these requests unless you're feeling helpful.
It isn't, and never has been.
We operate a nursery period ( https://www.lfgss.com/thread9350.html ) in which the actions of new members are both severely restricted and observed by existing members, and all initial content must be 100% public and observable by members (no private messages, no user profile customisations).
We very aggressively remove content and ban anyone who creates any content that appears spammy and continue to encourage reports of spam even long after they have been registered.
We report those individuals to http://www.stopforumspam.com/ at the same time preventing signups from anyone that SFS reports as coming from an IP recently used to spam.
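For anyone wanting to wire up the same kind of check: here's a rough Python sketch of a StopForumSpam signup lookup. The endpoint URL and JSON field names (`appears`, `confidence`) follow SFS's public API docs but should be confirmed against the current documentation, and the 50-point confidence threshold is an arbitrary example of mine, not an SFS recommendation.

```python
from urllib.parse import urlencode

# Endpoint per stopforumspam.com's API docs; verify before relying on it.
SFS_API = "https://api.stopforumspam.com/api"

def sfs_query_url(ip, email=None):
    """Build a StopForumSpam lookup URL for an IP (and optionally an email)."""
    params = {"ip": ip, "json": ""}  # "json" asks for a JSON response
    if email:
        params["email"] = email
    return SFS_API + "?" + urlencode(params)

def looks_spammy(sfs_response, min_confidence=50.0):
    """Decide whether to reject a signup from a decoded SFS JSON response.

    The 50.0 threshold is an illustrative assumption; tune it for your
    own tolerance of false positives.
    """
    for key in ("ip", "email", "username"):
        entry = sfs_response.get(key) or {}
        if entry.get("appears") and entry.get("confidence", 0) >= min_confidence:
            return True
    return False

# A response shaped like the API's documented JSON output (fabricated data):
sample = {"success": 1,
          "ip": {"value": "203.0.113.7", "appears": 1,
                 "frequency": 255, "confidence": 90.2}}
print(looks_spammy(sample))
```

The point of keeping the decision logic separate from the HTTP fetch is that you can cache responses and avoid hammering SFS on every signup.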
Comments are run through the Akismet API and moderated if they are flagged as spam.
When a spammer is caught (a few times a year someone does get past all of this - for a very brief moment), then every piece of content of theirs, and any other content from nearby IPs is reviewed and usually removed.
We made profile pages non-public, we even removed the customised profile fields from display (though we do leave them in the form as it helps us spot any scripts that are trying to spam us and we autoban them).
To put it mildly, we are aggressive.
We've made a really poor experience for new members of our community to help ensure the quality of the content on the site and that we're free from spam. We felt that was worth it from day one, we still feel that it's worth it.
Which is why it's hard to feel good about Google when our reward for fighting spam on our corner of the internet is to be spammed by merchants whipped into a state of panic by Google.
Unfortunately, if you look at wider area of forums as a whole, a very large fraction of them are unmaintained, unpatched, or otherwise not kept safe from drive-by spamming. There are entire forum spamming software packages that will register spam users for dozens of different forums in order to drop spammy links. You can even lease the spamming software by the month. :(
I figure you'll eventually consider that a valid signal, but I'm sure that if you checked my site you'd already find some disavowed links against it - yet you personally admit this is a well-maintained forum.
There is a fear that we are being caught in the crossfire, and it's hard not to believe that quite a few of us are going to get hit by that crossfire.
I've settled on adding rel="nofollow" as a good-enough response when trying to help out people casting too wide a net.
"Do these requests indicate a larger issue with my site?"
When the truth of the matter is that it's Google's issue. Why does the burden of the cost have to be on the publisher?
This is the 3rd time someone is pointing this out, and it pretty much showcases "Google"'s attitude on this.
At this point the cost of being a good source of information has increased to the point that only a few well-resourced publishers can actively manage all the rules and "guidelines" Google decides to throw at the content creators who provide the content that makes the search engine valuable.
I see this everyday in our business as smaller but richer sources of content just disappear on many topics. Blogs, forums and independent enthusiast content sources.
Google has created a one-sided ecosystem that consolidates traffic to so-called "good" sites.
As a long-time publisher for a large media company, I think it's time to take the internet back from Google. Stop caring what Google thinks and does, and start building for a better internet.
Google is what it is: trying to sift through all that information algorithmically, using some forms of AI, so it can extract what humans think is good information. I agree that a good forum with good policies is getting hurt by some of Google's actions, but it's the larger system, not Google. To make an analogy: almost everybody drives and everybody uses roads, but nobody should say "I'm a good driver" or "I don't drive, I've never been in an accident, therefore I cannot be in an accident." With all the information and intents behind such information on the Internet, and with Google as the self-appointed traffic law, there is still bound to be some collateral damage, sometimes because of how Google tries to deal with the dreck.
As an aside, one can view Google as a just one way of monetizing information on the Internet, competing and co-existing with many other ways. You can also view it as a giant AI algorithm that gets rewarded (advertising revenue) when it finds the good information that humans are looking for and defends itself against others who have other interests.
If you don't care about what Google's view of the internet looks like, then ignore them, and ignore any emails from people asking you to change things on your site for them. Done. Simple.
This issue crops up because people usually do care what Google's view of the internet is and would like to help make it better. Marking links as nofollow and so on is entirely optional: only people who want useful search engines do it, and others don't care.
Thanks to Google for creating these parasites. The link-spam problem has been caused by Google in the first place.
I don't remember getting such spam in a pre-Google Internet world.
Small website owners like us may not have the technology and human resources to combat such spam. Therefore, we are caught in the crossfire and get penalized by Google.
Wouldn't that be like every vBulletin site?
When webmasters are writing to ask for a link to be removed it's because they believe that that link could possibly be one that was made with the intention of increasing their PageRank and manipulating Google. BUT...sometimes it's hard to tell whether a link was naturally placed or whether it was one that a former SEO made on their behalf. As such, it's certainly possible that an email requesting link removal could be sent out in error. If that's the case, then ignoring the request is fine...or even better, writing back to inform them that the link was not purchased, solicited or self made would be really helpful.
I've spent the past two years doing everything from contacting all the webmasters via WhoIs information to creating unique content on a weekly basis and building a good social media following, and still, for the life of us, we cannot get ranked well organically on Google.
Could you personally look into this or give me any advice on resolving this issue because I feel like I've done everything possible and we put a lot of work into the website over the years. I would really appreciate it.
How am I as an outsider of Google able to discern any information about what's going on in this situation? I assume this isn't a unique case but there is literally 0 information. I'd like to clear this up - not just for me but for the community. I know this may be the wrong place to ask (I'm not sure where the "right" place would be), but would you be willing to contact me? caudill m at gmail
If you are able to identify problem links, you can just ignore them. Or am I missing something?
You're penalising spammers, and it's being effective. GOOD.
All Google roads and products lead to ad clicks and Adwords. Enjoy it while it lasts.
As you can imagine the process is opaque. I have reported some sub-optimal websites and seen a difference a few weeks later. I have no idea if my comment was read by a human let alone whether any action was taken as a result.
So what? Why should I have to care that Jeremy Palmer (or any other person) is linking to my site?
The point of the article is that webmasters are going around trying to get links to their sites removed because it might get them penalised by Google. That is not natural or desirable behaviour. Site owners shouldn't have to care about that. Many have resorted to adding 'nofollow' to all of their own outbound links for fear of being punished by Google for linking to 'spammy' sites and appearing to be part of a paid/circular links scheme.
It all sounds very broken to me.
Note: I have no interest in SEO whatsoever, and I only have a single low traffic web site.
Even if Google isn't penalising you for that today, they may well tomorrow. So using nofollow seems like quite a rational thing to do for any individual site as there isn't any downside as far as I know, only potential benefits.
Additionally, manually disavowing links hardly seems like a scalable solution. As somebody with a small website that I spend very little time maintaining, I don't want to have to regularly be checking some Google console for 'bad links' that I have to manually disavow to prevent getting delisted.
Breaking internet connectivity is a downside for society as a whole.
That's because previously webmasters were going around trying to get links to their sites added because it might get them rewarded by Google.
Google is trying to prevent the aberrant behaviour that the success of Pagerank caused in the structure of the Web, unfortunately there can be some collateral damage.
It's hard to cry when justice is served.
It is ridiculous that I have to police my site for low quality inbound links or risk getting banned (even more so for mom and pop webmasters who are more susceptible to link penalties). Lastly, if you are penalized, good luck recovering unless you are a big brand like Overstock or JCPenney with millions per year in AdWords spend.
No it's not, because his argument is making the assumption that his content is high-quality.
Who said it was? Google didn't ask him to do anything. A random person on the internet did. I frequently get emails from random people on the internet who ask all sorts of insanely idiotic things that have little basis in reality.
> Apparently Google convinced them, via their Webmaster Tools portal, that the link looked “unnatural”, and that they should use the Link Disavow Tool to discredit the link.
The OP has come up with a theory about the backstory and then used that theory to blame Google for ruining the internet (ignoring the fact that the entire "credit" he wants is Google's invention in the first place).
Not really "breaking the internet".
No hyperlinks = no internet.
The tail named Google is wagging the dog named Internet by sending thousands of web masters on a wild goose chase of policing sites outside their direct control, essentially the entire internet. It's a giant waste of human capital, and it only exists due to Google's dominant market position and their willingness to solve problems of their search algorithm at the expense of the third parties.
Please do research before you judge :)
This may seem to be an impossible task, but in days not too long ago people switched search providers often. I went from the curated links of Yahoo, to Lycos, to AltaVista, to Webcrawler, to Google, and now DDG.
Google got a lot of mileage in my mind with their motto "Don't be Evil", because I had some trust for them. I don't trust them anymore, and I regularly explain to others why they should no longer trust them either.
But their own moral authority and index reach are limited.
That said, I still encourage using DDG on both privacy and increasing search engine competition grounds.
The former only proxies queries from Google, whereas the latter works on multiple engines.
The first two provide mostly spammy results or subtitles. The DDG results are mostly real sites with torrent or streaming copies of the show available. Which search engine do you think is more useful?
All I care about are the quality of the results, and for that DDG does a pretty good job.
Any ranking system that gave any value to links would suffer this problem.
nofollow is a necessary evil if you allow untrusted sources to publish content on your site that contains links. It's a way of saying "I don't vouch for these links in the same way I vouch for other links on my site".
Sounds pretty reasonable to me.
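To make the "I don't vouch for these links" semantics concrete: nofollow only affects how crawlers count links, not how browsers render them. Here's a toy Python sketch (all URLs invented) of how a link-counting crawler might partition a page's links, assuming a simple `rel="nofollow"` check; this is illustrative, not any search engine's actual crawler.

```python
from html.parser import HTMLParser

class LinkAuditParser(HTMLParser):
    """Collects links, separating those marked rel="nofollow".

    A crawler computing link-based rankings would count only the
    "followed" links as endorsements; human visitors can click both
    kinds and never notice the difference.
    """
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel = (attrs.get("rel") or "").split()
        if "nofollow" in rel:
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

html = '''
<p>Editorial link: <a href="https://example.org/article">source</a></p>
<p>User comment: <a href="https://example.com/spam" rel="nofollow">cheap pills</a></p>
'''
parser = LinkAuditParser()
parser.feed(html)
print(parser.followed)     # links the site vouches for
print(parser.nofollowed)   # links the site declines to vouch for
```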
A link is supposed to be a hyperlink.
Google creates a false incentive to link/not link, and then complains when people link in a way that they don't approve of. This is a problem of Google's making.
are webmasters breaking the net because they follow SEO worst practices? definitely.
i wrote this article for techcrunch in 2010 http://techcrunch.com/2010/07/07/startups-linking-to-your-co... the tl;dr: pagerank is thoughtcancer. if you start thinking of links as some flow of mystical pagerank-juice you will make bad decisions, decisions that will hurt your business, your users and, in the end, the internet.
the stupid "remove link emails" are just the newest iteration of this thoughtcancer.
my recommendation stays the same: link to whatever you like and link to whatever your users like, want or need. oh, and also link to your competition. but for god's, your sanity's and the internet's sake: don't do it for any kind of page/trust/magic-rank or any kind of link/penguin/panda-juices....
my name is franz enzenhofer /
i'm the most successful SEO in europe /
i do not care about links
Golden. It's not Google who's breaking the Internet; it's the people who try to game the search engines.
I believe you mean "implore."
One has to wonder to what extent the entire phenomenon of link spamming sites might be a Hegelian Dialectic tactic by Google - ie a manufactured problem, intended to prompt a 'solution' that is in fact more beneficial to Google's unstated intent. Are we really to believe that there's no algorithmic solution to mitigating the link-spamming nuisance?
Another dimension of this same ambition is the movement to obfuscate URLs in browser address bars. Traditionally, Web users could copy, save and manipulate links directly as another means of navigating the Internet and maintaining their own records of Web places. As browsers progress further in the direction of eliminating direct user visibility of true URLs, this navigational method becomes less available. The Google-promoted alternative? Just search via Google!
As for why Google would want to do this, the obvious answer would be the usual 'money and power'. If Google succeeds in virtually eliminating all navigational alternatives to Google-searching, they then own the Net. For instance, if they wanted to make any given web site disappear, they could simply de-index it. That's a politically very dangerous power. Even if Google has no political agenda now (a debatable point), given such power they'd be guaranteed to become political. Power corrupts, etc.
We've been through this before, with a gold ring and a volcano. It's generally a bad idea to create 'one thing to rule them all', and Google is no exception.
I'm a little confused why he cares about nofollow links--if Google doesn't "own" the internet, what does nofollow matter? It's certainly better than not having a link at all.
Furthermore, the "credit" he so desperately wants publishers to receive is only a thing because of Google Page Rank in the first place.
That makes no sense. Citing an original source, such as an article that inspired your own work or with relevant background for ideas you're building on, is common courtesy in any field, and it's directly useful to your readers in the Web case.
Amazing as some people apparently find it, it is still quite possible to "surf" the web, following links from site to site without relying on a search engine like Google to find anything interesting or useful. The fact that Google appear to be promoting policies that could threaten this alternative is rather the point here.
The OP is against "nofollow" links which only affect search engines. The links work just fine in a browser and visitors won't know the difference. Publishers aren't scared to link, they're just wary of vouching for a site by not using nofollow.
Some people have taken upon themselves the task of discovering the rules that please the cloud Lord. In the process these priests have found a large number of arbitrary rules that must be followed strictly. Breaking any of these rules is reason for the cloud Lord to banish you to the dark corners of his empire, or so the priests say. For only a small fee the priests will give you a glimpse of the rule list that might win you good standing with the cloud Lord. It may work, or it may not, because the cloud Lord works in mysterious ways.
PageRank is a neat idea, but making everyone listen to Google so that PageRank remains meaningful is stupid. What do you do if suddenly Google decides Twitter or Facebook are to be considered spammy link farms (which would be true)? Do you ask everyone to delete their tweets linking to you?
A look at Ask.com's history on a tool like SEMrush makes this line of thinking rather compelling.
> Please be wary if someone approaches you and wants to pay you for links or "advertorial" pages on your site that pass PageRank. Selling links (or entire advertorial pages with embedded links) that pass PageRank violates our quality guidelines, and Google does take action on such violations.
I do not care that you do this, but you should be aware of what you are doing.
Real companies selling real products to real people get in trouble for scuzzy SEO all the time. JC Penney comes to mind. Or RapGenius (though they aren't even selling anything!). It's the techniques used that are the issue, not what is being promoted.
Let's say a reader of mine is searching for a job and my site has a new great job at Apple. Apple advertises on my site because it attracts a certain kind of person. Shouldn't that person be shown my ad because it's both timely and highly relevant?
That the job is on my site is a valuable signal. I don't use an ad service, I place every single ad and I have done so since the start. I check every single link. I check every single ad. I personally talk with every single advertiser.
It's not hard for me to imagine an algorithm that does similar sorts of calculations. Oh, this link goes to apple. I've heard of them before. It's for a job. Scalability Engineer seems to be a valid occupation. Looks OK to me.
You are saying all such signals have no value and should be filtered by nofollow?
In actuality I've not given nofollow much thought. I guess I may now. I've lived by the idea of creating quality content, being honest, and doing a good job, thinking Google will figure it out. That unfortunately doesn't seem to be the case.
The irony is that in this age of targeting and personalization, valuable signals are being lost.
Quality content is awesome and I'm sure your site is great, but Google's thought process here is that your site's reputation should only be handed on to sites that earn it organically (e.g. not through their buying it).
It's really for your own benefit, with the way PageRank works your outgoing links are votes that are as powerful as your page's reputation is good. If you link to bad sites your page's reputation takes a hit and thus your PageRank goes down. If I pay you to vouch for me (e.g. link to me) that only means that I have money. If you vouch for me without any money that's a much better signal that I have quality content.
Google has established these rules to prevent the commoditization of PageRank. They want links to represent an editorial choice and indicator of quality, not marketing budget.
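The "links are votes as powerful as your page's reputation" idea above is just PageRank's power iteration. A toy sketch in Python (the graph, page names and damping factor are invented for illustration; this is the textbook algorithm, not Google's production system):

```python
# Minimal PageRank power iteration over a tiny made-up link graph.
# A page's outbound "vote" is worth more when its own score is higher,
# which is why a paid link from a reputable page is valuable.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its mass evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical graph: "blog" is linked by two pages; "paidlink" is
# linked only from "blog", so its score rides on blog's reputation.
graph = {
    "blog": ["paidlink"],
    "a": ["blog"],
    "b": ["blog"],
    "paidlink": [],
}
ranks = pagerank(graph)
print(sorted(ranks, key=ranks.get, reverse=True))
```

Note how the total score is conserved across iterations; a link doesn't create reputation, it redistributes it, which is exactly why Google cares where your links point.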
There is nothing wrong with selling ads and visibility on your site to companies that wish to purchase this, but you should nofollow the links to prevent running afoul of Google's TOU.
Again, I don't care if you sell links on Fiverr and make a million dollars. I'm just explaining the rules to you.
> Selling links (or entire advertorial pages with embedded links) that pass PageRank violates our quality guidelines, and Google does take action on such violations.
It's an easy test: if the advertiser isn't OK with you not transferring PageRank you are selling a link and not advertising. It doesn't matter that it's good content or that it's not spammy (that just makes you the publisher feel OK about it), you're selling links.
You can't buy links, but you can if you do it from us. What a scam.
If pagerank can't differentiate "blackhat" seo links from organic links it probably needs fixing.
As for this article, it's a bit strange. Reading the email, it's pretty obvious how it went down:
1. The company hires a dodgy SEO firm, which sets up a bunch of links.
2. Google detects it and sends out a warning listing all links to their site.
3. The company mass emails every site owner with links pointing back at them.
I don't think they were really asking the links to be removed from the authors site at all, it was aimed at someone completely different. All they had to go by was the referrer and the link.
You seem to be confused about how Google's advertising works. You can't buy a PageRank-passing link from Google. AdWords and AdSense ads do not pass PageRank. No legitimate advertising does (for starters because they all want to track visits and send you through redirects, but also because it's usually injected remotely and not part of the page that can send PageRank).
Now they are telling you how you should advertise, as well as which ad networks are allowed to sell you links. Want to try some viral marketing where you don't tell the user it was advertising, just brand awareness and a link back? Oops, no, that's trying to mess with PageRank; we're going to flag your site.
Look, I know they're not out to hit the good guys, and the spammy tactics of blackhat SEO peeps are gross as hell, but I don't think you can really draw a definite line here without inflicting collateral damage, which is what I think this article was trying to point out.
Such things should be handled automatically by the ranking algorithm. High quality sites shouldn't need to research what are the ranking algorithm internals. Today most sites prefer to stay on the safe side and put 'nofollow' on everything, which is detrimental for original content creators.
Basing ranking on characteristics of the page content is also going to pose its own problems, since instead of linkfarming, the SEOs will just focus on generating useless content (they are quite good at that already.) Without very strong AI, it's difficult to tell whether the content was there just to spamdex or if it's something that may be equally low-entropy (for example) like tables of useful information. In my mind, even a totally random ranking (not one that changes every search, but maybe ~monthly) would be better than one based on links or page content. At the very least, it would expose many users to more parts of the Internet that they might not otherwise experience if they stayed within the first 1-2 pages of search results (if I'm looking for something that happens to be relatively obscure, I routinely go into the 100th or more page of results, since there is often good content there too!)
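The "monthly random ranking" thought experiment above could be sketched as a deterministic shuffle that is reseeded once per month, so the same query gets a stable ordering within a month but a fresh one afterwards. A toy Python sketch (site names are hypothetical):

```python
import datetime
import random

def monthly_shuffled(results, now=None):
    """Deterministically shuffle search results, reseeded each month.

    The same query in the same month yields the same ordering; next
    month the ordering changes, periodically surfacing sites that
    would otherwise be buried past page one.
    """
    now = now or datetime.date.today()
    rng = random.Random(f"{now.year}-{now.month}")  # month-scoped seed
    shuffled = list(results)
    rng.shuffle(shuffled)
    return shuffled

results = ["site-a", "site-b", "site-c", "site-d"]
jan = monthly_shuffled(results, datetime.date(2024, 1, 15))
jan_again = monthly_shuffled(results, datetime.date(2024, 1, 28))
feb = monthly_shuffled(results, datetime.date(2024, 2, 3))
print(jan == jan_again)  # prints True: stable within a month
```

Whether this would actually produce usable search results is doubtful, but it illustrates the commenter's point that link-blind ranking would remove the incentive for link spam entirely.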
I haven't received any such link removal requests (the sites I have are relatively small), but I do not care about SEO that much, and if I did receive any my response would basically be "go complain to the search engines, not me."
At the same time, they have to deal with an entire industry that exists to exploit the very metrics they rely on to rank results so I don't blame them for just saying screw it, let's just float up our own content and well-known brands for every search and call it a day.
"We have discovered that a company we hired to help promote our website have used a variety of questionable techniques to secure links to our website. These links were placed purely for SEO purposes, with the intention of manipulating search rankings."
So basically the company hired people who spam links in order to game search results. Google is totally right in fighting that. The "natural" links on the author's site are not made by the SEO company, so those links are fine.
The problem is not linking to content you want to link to; the problem is hiring people to artificially boost your search ranking by spamming links in places they don't belong.
Secondly, nofollow applies to crawlers, not human beings. It doesn't even "break the Web", it has zero impact on an end-user's experience of navigating links.
What you could say is that excessive "nofollow" breaks PageRank and other search engines. Less melodramatic and link-baity, but more accurate.
What you could claim is that search algorithms are opinionated and 'mold' content and link structure across the Web. That would be true, but unavoidable. It's impossible to have a search engine that would not editorialize in some respect, and any engine that gained prominence for sending lots of traffic would quickly be descended upon by people like Jeremy trying to figure out how to mold content to game the algorithm.
Even if we had some sort of incredible AI based search engine that could understand meaning and nuance like spam, it might still have an editorial opinion that people would optimize around.
However—and I think this is the point of the article—use of nofollow doesn't just disincentivise spam links, it also means that many valid and useful links no longer contribute to PageRank.
It would be great if we could have a way to allow all links to contribute to PageRank, but still protect ourselves from spam.
Which got me thinking…
At the moment the only options for site owners are to say "yes, I endorse this link" or "no, I don't endorse this link" (by adding nofollow). Instead I could imagine something more fine grained, a system where site owners could tag specific content within their site as coming from specific users.
Content tagged like this is neither endorsed nor disavowed, instead responsibility is pushed to the user who wrote the content. In other words site owners would be able to say something like this to search engines: "this content is created by user X, don't blame me if it's spammy!".
Smart search engines could use this more fine-grained content ownership information in their search algorithms. That means they wouldn't need to throw out all user contributed information on the internet just to protect the internet from spam.
There are a few challenges, of course. :)
* How to identify users? (Anonymously?)
* How to have users endorse content on different sites?
* How to work out which users are trustworthy?
* Building a PageRank algorithm that incorporates fine-grained trust information.
But it's fun to think about technical solutions.
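One minimal sketch of the idea: instead of a binary follow/nofollow, each user-contributed link carries an author identifier, and the ranker discounts each link's "vote" by a per-user trust score. Every name, score and structure below is invented purely to illustrate the proposal; nothing like this exists in any real search engine.

```python
# Hypothetical per-user trust scores (0.0 - 1.0), which a search
# engine might derive from a user's history across sites.
user_trust = {
    "alice": 0.9,       # long history of good contributions
    "spambot42": 0.05,  # flagged repeatedly
}

# Links tagged with the user who contributed them; author=None
# marks the site's own editorial links.
links = [
    {"target": "example.org/guide", "author": "alice"},
    {"target": "example.com/pills", "author": "spambot42"},
    {"target": "example.net/editorial", "author": None},
]

def link_weight(link, site_weight=1.0, default_trust=0.1):
    """Weight a link's ranking 'vote' by its author's trust.

    The site's own links count in full; user-contributed links count
    in proportion to that user's trust, so the site is neither
    vouching for them nor throwing them away entirely.
    """
    if link["author"] is None:
        return site_weight
    return site_weight * user_trust.get(link["author"], default_trust)

for link in links:
    print(link["target"], round(link_weight(link), 2))
```

The appeal is that good user-contributed links still pass some credit instead of being blanket-nofollowed, while spam from low-trust users is nearly worthless; the hard parts are exactly the challenges listed above (identifying users and bootstrapping the trust scores).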
While I'm not sure I understand your objection to nofollows as a compromise, have you any alternative methods in mind to improve the web without "breaking" the internet?
They're only breaking their own ranking algo.
I think some inside Google get this already, but there's not much they can do short of AI. I'm sure they're working on it.
Disclaimer: I most likely have the most traffic of everyone here (give or take) - Have lived off Google publisher cheques for 10+ years. I realise what they are doing is damn near impossible, and that they are not 'evil' but probably a bit overwhelmed.
Apart from the principles, there is also a practical problem with complying with such a request. How can one verify that the request is genuine? It could be a hacked email account from that domain. Or a rogue employee. Or someone who doesn't have sufficient authority within that company. How is the recipient of the request going to verify it?
It's incredible how Google shaped my expectations of what a search results page looks like. I was like: WTF is that page? Oh... I've changed my search engine. Go figure.
I agree with 90% of this piece. But this particular comment isn't new. Webmasters have been looking to control flow of PageRank since they learned what it was.
- a facebooker, working at a google competitor :P
Has this been seen? Can anyone show a case?
So although I acknowledge the general message, the specifics really don't help the poster's case.
No. No, you don't. You implore. If you compel, I rebel.
Also, if someone doesn't want you linking to their site, shouldn't it be more a case of "Whatever. Your loss."?
It's one thing if you're pointing out fraud, abuse, illegal acts or unethical behavior (in which case, you'd probably be posting the evidence on your own site), but if it's a friendly link and they don't want it, don't give it to them.
You can also get a site dinged by someone directing UGC spam against you. Again, I have seen that happen and helped a major publisher get the penalty lifted.
We have to acknowledge that Google has been operating close to the right balance (without getting it perfectly right), presumably because they are on the last, hardest part of the equation needed to make it all work.
The only thing is that they can't solve it by relying on technology alone, because search engines deal with people. People are emotional beings; search engine robots are, not to be funny, just robots.
Consider this as equation Neo (of the Matrix).
But you could be friendly and agree to remove the links in question, since they seem to believe it will harm their business. Not doing so "out of principle" may affect others if Google believes your links are low quality (although it sounds like their SEO company is mostly to blame for their problems).
In the bad old days of the Internet, web sites would complain if you linked to them, because they thought they had some sort of right to determine who linked to them. After court cases, etc. the issue was resolved. And now this.
If Google really thinks that a link is illegitimate, then it should just ignore it in its algorithm.
Please don't reply to this if you don't fully agree with my points.
And "Why" one should not behave as an ass is self evident in my mind. But we may not agree on this as evidenced by your last sentence.
In those instances, even if you think you are being a good guy, there's a good chance you are wasting the one resource you can never recover (time) in an attempt to help someone who is likely going to attempt harming you AND waste your time.
That was the point of my last sentence: even though I asked that replies be in agreement, you still chose to reply. It's the same as someone asking "please don't link to me", but everyone is still free to link. That's the way discourse works, and that's how linking should work too.
Ok, I take your very well made point.
I agreed up until this, but I think it's meant as irony?