Hacker News

Google results were already 90% SEO crap long before ChatGPT.

Just use Kagi and block all SEO sites...





How do we (or Kagi) know which ones are "SEO sites"? Is there some filter list or other method to determine that?

If you took Google of 2006 and used that iteration of the PageRank algorithm, you'd probably not get most of the SEO spam that's so prevalent in Google results today.

It seems like a mixture of heuristics, explicit filtering, and user reports.

https://help.kagi.com/kagi/features/slopstop.html

That's specifically for AI-generated content, but there are other indicators, such as how many affiliate links are on the page and how many other users have downvoted the site in their results. The other aspect is the network effect: everyone tunes their sites to rank highly on Google, and that tuning is presumably less effective against other indices.
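To make the idea concrete, here's a toy sketch of how such signals might be combined into a single spam score. The signal names, weights, and thresholds are entirely made up for illustration; Kagi's actual ranking logic is not public.

```python
# Toy SEO-spam scoring sketch. All weights and signals below are
# hypothetical illustrations, not Kagi's real implementation.

def spam_score(affiliate_links: int, total_links: int,
               user_downvotes: int, flagged_ai_content: bool) -> float:
    """Return a score in [0, 1]; higher means more likely SEO spam."""
    score = 0.0
    if total_links > 0:
        # High affiliate-link density is a classic SEO-spam signal.
        score += 0.4 * min(affiliate_links / total_links, 1.0)
    # Community downvotes act as a capped report signal.
    score += 0.4 * min(user_downvotes / 10, 1.0)
    if flagged_ai_content:
        # Explicit AI-slop flag (cf. Kagi's SlopStop reports).
        score += 0.2
    return score

# A page where 18 of 20 links are affiliate links, with 5 downvotes
# and an AI-content flag, scores near the top of the range:
print(spam_score(affiliate_links=18, total_links=20,
                 user_downvotes=5, flagged_ai_content=True))
```

A search engine could then downrank or hide results whose score crosses some cutoff, which is one way user downvotes and heuristics can interact.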



