
What really needs to change is Google's algorithms. The number of links to a site may be correlated with content quality and relevance, but it shouldn't be taken as an indicator of them, since it basically promotes large sites with lots of inbound links - but not too many, or it looks like spam - while penalising the "less developed" (in terms of linkage) parts of the Internet, the parts that in my experience also tend to have the most interesting and valuable content.
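To make the objection concrete, here is a toy sketch of the naive link-count signal being criticised - not Google's actual algorithm (which uses PageRank and many other signals), just the simplest version of "more inbound links means a higher rank":

```python
def rank_by_inlinks(link_graph):
    """Rank pages by raw inbound-link count, highest first.

    link_graph maps each page to the list of pages it links to.
    This is the naive signal the comment criticises: heavily linked
    large sites float to the top, sparsely linked ones sink.
    """
    counts = {page: 0 for page in link_graph}
    for page, outlinks in link_graph.items():
        for target in outlinks:
            counts[target] = counts.get(target, 0) + 1
    return sorted(counts, key=counts.get, reverse=True)
```

Given a graph where two pages link to "bigsite" and nothing links to "obscure", `rank_by_inlinks` puts "bigsite" first and "obscure" last - regardless of which one actually has the better content.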

Basing ranking on characteristics of the page content will pose its own problems: instead of link farming, the SEOs will just focus on generating useless content (they are quite good at that already). Without very strong AI, it's difficult to tell whether content exists purely to spamdex or whether it's legitimately low-entropy, like a table of useful information. In my mind, even a totally random ranking (not one that changes on every search, but one re-randomised maybe monthly) would be better than one based on links or page content. At the very least, it would expose many users to parts of the Internet they might never see if they stayed within the first 1-2 pages of search results. (If I'm looking for something relatively obscure, I routinely go to the 100th page of results or beyond, since there is often good content there too!)
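The "random but stable for ~a month" idea above can be sketched in a few lines - this is just an illustration of the proposal, with the month-string seeding being my own assumption about how you'd keep the ordering stable between re-randomisations:

```python
import hashlib
import random
from datetime import date

def monthly_random_ranking(results, today=None):
    """Return results in a random order that stays fixed for a
    calendar month, then reshuffles when the month rolls over.

    Seeding a private PRNG from the "YYYY-MM" string makes the
    shuffle deterministic within a month but different across months.
    """
    today = today or date.today()
    seed = int.from_bytes(
        hashlib.sha256(today.strftime("%Y-%m").encode()).digest(), "big"
    )
    rng = random.Random(seed)
    shuffled = list(results)
    rng.shuffle(shuffled)
    return shuffled
```

Two calls in the same month return the identical ordering, so users can still find a result again; a new month yields a fresh permutation and a fresh slice of the Internet on page one.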

I haven't received any such link removal requests (the sites I have are relatively small), but I don't care much about SEO, and if I did receive any, my response would basically be "go complain to the search engines, not me."

