Hacker News new | past | comments | ask | show | jobs | submit login

It's biased by your watch history, but it's never just that. In my experience (browsing without an account, in private browsing with no cookies, on rotating IPs), there seems to be a distinct slot in the algorithm for inflammatory engagement bait that is independent of watch history and, by the looks of it, keyed to your geographic location.

Regardless of what I watch, in the middle of otherwise on-topic recommendations, there will always be one or two videos that are attempts at getting me to engage with completely off-topic inflammatory political bullshit. Of course, once you click on one of those, the "regular" recommendation system takes over and feeds you more of it (which is somewhat fine), but the fact that it tries to suck the user into this in the first place, despite no indication that they want to be exposed to such content, is disgusting.

There is a strong incentive for YouTube creators to produce this kind of "clickbait" content (and especially clickbait titles and thumbnails), which perpetuates the situation regardless of whether the algorithm explicitly rewards it. As long as engagement is a ranking factor and creators are rewarded for it, what you observed seems kind of unavoidable.

I don't mean the usual on-topic clickbait consistent with the watch history. I mean that, in the middle of said on-topic clickbait, one or two recommendation slots are always explicitly allocated to a broader, region-level pool of inflammatory political clickbait completely unrelated to watch history.

So for example, I could be watching some niche technical videos, and my recommendations would mostly be more of that. Except that on an IP address from an English-speaking country, I'd also get some inflammatory Trump-related video among the usual recommendations. On a French IP, I get the French equivalent, and so on.

So either consumers of various niche content (in unrelated fields, from retrocomputing to farming to vehicle repair) all happen to be so into political trash in various languages that it outcompetes other on-topic videos in the recommendations, or the recommendation engine has an explicit feature that pushes inflammatory crap in addition to "organic" recommendations. I strongly suspect the latter.
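To make the hypothesis concrete: what I'm describing would behave like a feed builder that reserves a fixed number of slots for a region-wide pool, bypassing personalization entirely. This is purely an illustrative sketch of the suspected behavior, not YouTube's actual code; all names and numbers here are made up.

```python
def build_feed(personalized, regional_bait_pool, total_slots=12, bait_slots=2):
    """Hypothetical feed builder: fill most slots from history-based
    recommendations, but reserve a few for a region-level pool that is
    the same for every user on that IP range, regardless of history."""
    feed = personalized[: total_slots - bait_slots]
    feed += regional_bait_pool[:bait_slots]
    return feed

feed = build_feed(
    personalized=[f"niche_video_{i}" for i in range(20)],
    regional_bait_pool=["inflammatory_politics_1", "inflammatory_politics_2"],
)
# Every user behind the same regional IP gets the same two bait slots,
# no matter how niche their personalized list is.
```

Under a scheme like this, the bait slots would look exactly as described: always present, always off-topic, and varying only with location.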


My completely unsubstantiated pet hypothesis is that it's cheaper and easier to serve everyone the same cached clickbait than to compute different well-tailored recommendations.

Agreed.

The most insidious thing is seeing kids hooked on it. Not only are they fed the same garbage content and ads, but some of it is actually harmful, like Elsagate. Some of those videos are still available on the site, and more get added all the time.

We can argue about whether parents should let their kids use YouTube, and whether the YouTube Kids app works well enough to protect them from this, but at the end of the day we're data mines, not customers, so nothing besides public outrage and regulation will improve this. It's also an incredibly difficult problem given the number of videos uploaded every day, but I'm sure Google could solve it if it had a good reason to.



