Hacker News

This is part of why I think I've developed a reflex, after searching Google, to skip over the first few results after the sponsored links and start looking near the middle of the page. W3Schools, Wikipedia, and a few others. And it's a great example of the central failure of the PageRank idea: if the strongest signal is popularity measured through linkage, the highest-quality results will rarely be at or even near the top. Right now Stack Overflow is good and deservedly ranks highly, but I fully expect it to be supplanted in Google's search rankings by an inferior copycat within a few years: one that just happens to generate more revenue for Google by carrying more advertising.
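For reference, the core PageRank iteration fits in a few lines. The graph and site names below are made up purely for illustration; the point is that the score is driven entirely by who links to whom, with no notion of content quality:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page gets a baseline share, plus a damped share of the
        # rank of each page that links to it.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, targets in links.items():
            if not targets:
                continue
            share = damping * rank[page] / len(targets)
            for t in targets:
                new_rank[t] += share
        rank = new_rank
    return rank

# Hypothetical link graph: three blogs all link to the tutorial site,
# only one links to the authoritative reference.
graph = {
    "blog1": ["w3schools"],
    "blog2": ["w3schools"],
    "blog3": ["w3schools", "mdn"],
    "w3schools": [],
    "mdn": [],
}
ranks = pagerank(graph)
# The heavily linked site outranks the rarely linked one, regardless
# of which has better content.
```

This is a simplified sketch (real ranking uses many more signals), but it shows the mechanism being criticized: inbound links are the only input.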



> W3Schools, Wikipedia, and a few others. And it's a great example of the central failure of the pagerank idea: if the strongest signal is popularity measured through linkage, the highest quality results will rarely be at or even near the top.

Actually, I find that the top-ranked results usually have what I was searching for -- often that's W3Schools or Wikipedia. I'm not a big fan of W3Schools' site design, and I wish they had output examples, but the design isn't bad enough to make the site unusable as a reference, and when it shows up at the top of a search results page, I'm usually looking for a quick reference anyway.


>but I fully expect it to be supplanted in Google's search rankings by an inferior copycat within a few years: one that just happens to generate more revenue for Google by carrying more advertising.

Google does not take revenue into account in its ranking, or the decisions on how it changes its ranking. It is neither a signal, nor does it influence which signals are used.


I know that is what they (you?) claim, and I don't have any real evidence to the contrary. The contrary hypothesis, however, explains some otherwise puzzling behavior in Google's search results. So I would like to know if you can offer any evidence for your claim (statements by Google employees are not evidence).


What is this puzzling behavior?


For example, the frequently remarked-upon (here on HN and elsewhere) persistent outranking of high-quality, ad-free sites by spam sites or other lower-quality URLs that invariably carry AdSense units.


Do you have an example query?


If you are paying attention to the subject of the article and most of the comments on this page, you should have no trouble generating one yourself. Just search for anything involving basic HTML or CSS, like, say, "html title tag", and see that the first result, and often the first three results, point to an inferior site (w3schools) sporting AdSense ads. Then there are some more like that. The high-quality results (w3.org, MDN) start halfway down. None of these carry any advertising, and so they generate no revenue for Google.


Half of the people in this thread say that they like w3schools, or even if they don't like it they use it all the time. That is sufficient to explain why it ranks well. Now imagine that the average person issuing these queries is less sophisticated than the people commenting in this thread ...


Among people here who have commented on the comparative quality of these sites, it is almost unanimous that w3schools is far worse than the others I mentioned, to the extent of its actually being dangerous to use. Your reply amounts to a claim that w3schools outranks the higher-quality sites in Google's results because it has higher PageRank. Frankly, this is the expected reply. If that is the whole story, it confirms my earlier claim that there is a fundamental flaw in the PageRank algorithm: it conflates popularity with quality. The hypothesis that potential revenue is in fact a signal influencing search rankings is still very much a contender. Not to belabor this, but the protestations of Google employees should not be considered as evidence one way or the other in evaluating whether this hypothesis explains the data.


There's a balance between popularity and quality that we try to be very careful with. Ranking isn't entirely one or the other. It doesn't help to give people a better page if they aren't going to click on it anyways.
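One way to read "it doesn't help to give people a better page if they aren't going to click on it" is as ranking by expected usefulness rather than quality alone. This toy sketch is my own illustration, not Google's actual formula, and every number in it is made up:

```python
# Hypothetical scores: "quality" is editorial quality, "click_prob" is
# the estimated chance a typical searcher clicks and gets an answer.
candidates = {
    "mdn":       {"quality": 0.95, "click_prob": 0.30},
    "w3schools": {"quality": 0.60, "click_prob": 0.80},
}

# Rank by expected usefulness = quality * probability of being used.
ranked = sorted(
    candidates,
    key=lambda p: candidates[p]["quality"] * candidates[p]["click_prob"],
    reverse=True,
)
# w3schools (0.6 * 0.8 = 0.48) edges out mdn (0.95 * 0.3 = 0.285)
# despite its lower quality score.
```

Under this framing, a lower-quality page that more people actually use can legitimately outrank a higher-quality page that most searchers would bounce off.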

You might be interested in this: http://searchengineland.com/too-many-ads-above-the-fold-now-... Google demotes pages that have too many ads, including Google's ads.


You seem to be saying that there is something besides "quality" that influences the ranking of search results, which I find surprising. I thought the idea was to return the best-quality results, but there may simply be an issue of semantics at play. When you say "popularity" here, I suspect you have in mind something different from what I had in mind when I used the word in previous comments. Can you explain what kind of popularity you mean here (how is it measured)?

"It doesn't help to give people a better page if they aren't going to click on it"

I think you've lost me. Don't people tend to click on the top result? And isn't the idea to put the highest quality results at and near the top so people go there? How could it possibly be helpful to the user to not offer them the best quality results?

I looked at the link you provided, but I'm as confused as the author. Isn't there a limit of three AdSense units on a page anyway? As the author points out, Google suggests to publishers that they use the maximum number of ads, and specifically (see the heat map) that they put them above the fold. Then it appears they've decided to penalize publishers who follow this advice. The only way for this to be consistent is if they're only penalizing pages displaying competing advertising products.

The author also shows a screengrab of Google's own results page, where we can see that everything visible on the monitor is sponsored content, with no organic search results at all.

Frankly, I'm perplexed about what point you were trying to make by suggesting this link, but it was interesting.


Suppose you search for something like [pinched nerve ibuprofen]. The top two results currently are http://www.mayoclinic.com/health/pinched-nerve/DS00879/DSECT... and http://answers.yahoo.com/question/index?qid=20071010035254AA...

Almost anyone would agree that the Mayo Clinic result is higher quality. It's written by professional physicians at a world-renowned institution. However, getting the answer to your question requires reading a lot of text. You have to be comfortable with terms like "nonsteroidal anti-inflammatory drugs," which a lot of people aren't. Half of people aren't literate enough to read their prescription drug labels: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1831578/
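The readability gap being described is measurable. A sketch using the standard Flesch Reading Ease formula (higher score = easier text); the syllable counter is a crude vowel-group heuristic of my own, and the two sample sentences are invented for illustration:

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of consecutive vowels. Good enough
    # to separate clinical prose from casual prose, not dictionary-exact.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

clinical = ("Nonsteroidal anti-inflammatory drugs may alleviate "
            "radiculopathy symptoms.")
casual = "Yes, ibuprofen can help with the pain. Take it with food."
# The casual answer scores far higher (easier) than the clinical one.
```

The forum answer written in plain words is dramatically more readable by this measure, even if the clinical page is more authoritative.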

The answer on yahoo answers is provided by "auntcookie84." I have no idea who she is, whether she's qualified to provide this information, or whether the information is correct. However, I have no trouble whatsoever reading what she wrote, regardless of how literate I am.

That's the balance we have to strike. You could imagine that the most accurate and up-to-date information would be in the midst of a recent academic paper, but ranking it at #1 wouldn't actually help many people. This is likely what's going on between w3schools and MDN. MDN might have higher-quality, better information, but that doesn't necessarily mean it's more useful to everyone.


I don't have the slightest idea why you're getting downvoted because I think that's a valid observation.


First downvote was < 60 seconds after I posted. There are at least a handful of Google employees here. EDIT: and yes, I deserve the downvotes I'm getting for this useless meta-comment.



