
“We try very hard to make sure we communicate with Webmasters,” Mr. Cutts says. For instance, he says, he reassured Gawker Media that it would remain in search results after Hurricane Sandy caused it to crash.

This pretty much sums up the genius of Google's approach to customer service: it is very keen to put a personal face on high-profile cases while squeezing the thousands of faceless small businesses in similar situations into AdWords.




How is Google squeezing small businesses?

-----


Basically, the volatility and opacity of Google's algorithm mean that any small business getting >80% of its traffic from Google search is in for a tough time if it's marked as bad. And mistakes happen. Say:

* A canonical tag that shows up due to a bug (see the sketch below). Dead.

* Industry terms that appear duplicative to the NLP parser. Dead.

* Too-diverse content. Lower rankings, but functionally dead.

There's tons more, but I'm not a practitioner of SEO.
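
To make the canonical-tag failure mode concrete, here's a minimal sketch (my own illustration, assuming Python with the requests and beautifulsoup4 libraries; the URLs are hypothetical) of how a buggy template that emits the wrong rel=canonical quietly tells crawlers to index a different URL:

    # Illustration: flag pages whose rel=canonical points somewhere else.
    # A buggy template that emits the wrong href tells crawlers to index
    # *that* URL instead, so the page itself can quietly drop out of results.
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    def canonical_mismatch(url):
        resp = requests.get(url, timeout=10)
        soup = BeautifulSoup(resp.text, "html.parser")
        for tag in soup.find_all("link"):
            if "canonical" in (tag.get("rel") or []):
                canonical = urljoin(url, tag.get("href", "")).rstrip("/")
                if canonical and canonical != url.rstrip("/"):
                    return canonical
        return None  # no canonical tag, or it points at the page itself

    # Hypothetical URLs, just to show the intended usage.
    for page in ["https://example.com/widgets", "https://example.com/widgets?page=2"]:
        other = canonical_mismatch(page)
        if other:
            print(page, "tells crawlers the real page is", other)

If a deploy starts emitting the wrong href on every page, the whole site can canonicalize itself away without any visible error.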

-----


This smells like bullshit to me. Google's search results are generally extremely good from the "customer's" point of view.

People whining about how they're hurt by losing search position inevitably turn out to be spammers or other lowlife that have been trying to use "SEO" or some other shady practice to artificially inflate their search position.

So yeah, Google is bad for con-men. Darn.

-----


I've been spending time in black hat internet marketing forums (out of curiosity). They hate Google. Google is consistently making its search engine better for users and worse for spammers.

-----


> People whining about how they're hurt by losing search position inevitably turn out to be spammers or other lowlife that have been trying to use "SEO" or some other shady practice to artificially inflate their search position.

I have a slang dictionary site that's currently being penalized by Google for showing citations of slang use gathered (by hand) from media.

With the penalty, Google sends 1/3 less search traffic. And I whine about it. But I'm not a con man.

-----


> There's tons more, but I'm not a practitioner of SEO.

Sounds like you know an awful lot about SEO and its consequences for someone who isn't a "practitioner."

-----


I also call bullshit on this. Google is very open with webmasters, and has a giant, ~40-page PDF available to them strictly about what gets better search results and what causes negative results or outright Panda-ing. The article says right up front that the voting website had duplicated content on its pages and nobody linking to them, to boot. Both have been known negatives to anyone working in SEO or on the web for years.
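
For what "duplicated content" looks like to a machine, here is a toy sketch (plain Python, my own illustration; the pages are made up, and this is not Google's actual method, which isn't public) that scores two pages by their overlapping five-word shingles. Voting pages that differ only in the candidate's name come out looking like near-copies:

    # Toy illustration only; Google's actual duplicate detection isn't public.
    # Score how much two pages overlap using five-word shingles.
    def shingles(text, k=5):
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

    def jaccard(a, b):
        sa, sb = shingles(a), shingles(b)
        return len(sa & sb) / len(sa | sb) if sa and sb else 0.0

    page_a = ("Vote for Alice Smith. Find your polling place, check registration "
              "deadlines, and compare candidates on the issues before election day.")
    page_b = ("Vote for Bob Jones. Find your polling place, check registration "
              "deadlines, and compare candidates on the issues before election day.")

    print(jaccard(page_a, page_b))  # roughly 0.6: the pages read as near-copies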

The entire article seemed to be one giant raving contradiction of itself. Companies don't want to hire good people (or for some reason have ignored what those people have said), choose to keep doing things the way that keeps getting them negative results, then moan about it. What happened to the voting website is the very thing Bing would do to them (and probably has done to them).

Now admittedly, Google tends to promote its own services above those of other companies when they compete in the same market. It's unclear whether this is by accident (Google engineers probably know the best Google SEO methods, after all) or by design. At worst, if it is by design, then Google is no worse than any other large company. Remember when all the big supermarkets started producing their own products at cut-rate prices? Before, when you walked into a Publix, you had 10 ft of shelf space devoted to nationally branded ranch dressing. Now you have 5 ft devoted to Publix-brand ranch dressing and 5 ft devoted to everything else. So if you are Kraft, you're seeing your [eye-ball] search traffic reduced by half, and your competitor now has a much lower price to boot.

Why doesn't Kraft care about that? Because they have spent the last half-decade building their brand and their customer pool. They know that if their loyal customers go into a supermarket, they will see the cheaper supermarket-branded ranch dressing first, then the higher-priced Kraft, and they'll buy Kraft. I can easily extend this analogy to many of the situations presented in the article. When Google enters a market, they are the underdog, just like when Publix decides to copy another nationally branded product and sell it in its stores. Even if Publix devoted 9 ft of shelf space to Publix-brand ranch and 1 ft to national brands, Publix still wouldn't capture 90% of the market.

-----


Also, Kraft probably wholesales half the store brand stuff to Publix.

-----


Since most reactions here imply shady SEO was the problem:

Cutts clearly states that they gave preferential treatment to a high-visibility site such as Gawker, while normal websites (i.e. internet peasants) would have been de-indexed in the same situation. Cutts' full-time job is communicating ranking-related issues, so I am quite certain this would have sounded a lot different if it had been implemented as a general rule for disaster-ridden IP ranges etc.

-----


I don't think he "clearly" states that. To me, it sounds like he is saying: our algorithms are robust to temporary service failures; websites aren't penalised heavily.

As a genuine question, do you have any evidence for the de-indexing time of normal websites? (And evidence that they aren't restored as soon as the website is restored?)

-----



