
I think there's a great opportunity now for a new search engine that evaluates the general spamminess of a site, and then punishes sites that link to spammy sites.



This has already been in place for a long time. So much so, in fact, that you used to be able to attack a site's rankings by creating, or stealing, many low-quality sites and publishing links on them to the victim's site. The victim, if they were lucky enough to know about these things, would then have to create a Search Console account (for Google) and declare that the sites have nothing to do with them and that Google should ignore them for ranking purposes.

These days there are more attributes to add subtlety to your outbound links if you want the search engines to take you seriously.[1] I'm sure other search engines make similar judgements based on them.

[1] https://support.google.com/webmasters/answer/96569?hl=en
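For concreteness, the link qualifiers documented at that Google support page are `rel` attributes on outbound anchors; the URLs here are placeholders:

```html
<!-- Paid or advertising link -->
<a rel="sponsored" href="https://example.com/ad">advertiser</a>

<!-- User-generated content, e.g. comments or forum posts -->
<a rel="ugc" href="https://example.com/comment-link">comment link</a>

<!-- General "don't endorse or associate my site with this" hint -->
<a rel="nofollow" href="https://example.com/untrusted">untrusted site</a>
```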


I was aware of that, and it's exactly backwards. Google is punishing sites for having low-quality sites point to them (even if they have no control over them). What I'm suggesting is the opposite: that a site be punished for its outbound links, if they point to low-quality sites.

(One could do this separately for various kinds of undesirable content and let the users choose whether they want to avoid spam, hate speech, plagiarized content, nsfw content, misinformation, Rick Astley videos, and so on.)
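A hypothetical sketch of that idea: propagate spamminess backwards along outbound links, so linking to spammy sites raises your own score. The graph, seed labels, damping factor, and function names here are all illustrative assumptions, not any search engine's actual algorithm:

```python
# Toy link graph: site -> set of sites it links out to.
links = {
    "blog.example": {"spamfarm.example", "news.example"},
    "news.example": {"blog.example"},
    "spamfarm.example": {"spamfarm.example"},
}

# Seed scores from a (hypothetical) labeled spam list: 1.0 = known spam.
seed = {"spamfarm.example": 1.0}

def outbound_spam_scores(links, seed, damping=0.5, iterations=20):
    """Iteratively raise a site's spam score by a damped share of the
    average spamminess of the sites it links out to."""
    scores = {site: seed.get(site, 0.0) for site in links}
    for _ in range(iterations):
        new = {}
        for site, targets in links.items():
            avg = sum(scores[t] for t in targets) / len(targets) if targets else 0.0
            # Keep the seed label, plus a damped share of outbound spamminess.
            new[site] = max(seed.get(site, 0.0), damping * avg)
        scores = new
    return scores

scores = outbound_spam_scores(links, seed)
# The known spam site keeps its seed score; the blog is penalized for
# linking to it directly; the news site is penalized less, one hop away.
```

Running one such pass per undesirable category (spam, NSFW, misinformation, ...) with separate seed lists would give the per-category scores users could opt in or out of.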


Punishing slow, complicated sites and sites with annoying newsletter or cookie notifications would be nice too. Promote simple, fast, readable sites with no dark patterns.

There's plenty of room for improvement in search. DDG is already edging ahead because it searches for what I want, not what it thinks I want.


I'd be interested if I could alternate which sort of weights to apply, like you and the sibling comment suggest.


Google already does this. It's an NP-hard problem for sure.



