YouTube’s Rabbit Hole of Radicalization [pdf] (arxiv.org)
115 points by dfabulich on Dec 28, 2019 | 111 comments


For the convenience of those who can't conveniently read a PDF on their current device:

> The role that YouTube and its behind-the-scenes recommendation algorithm plays in encouraging online radicalization has been suggested by both journalists and academics alike. This study directly quantifies these claims by examining the role that YouTube's algorithm plays in suggesting radicalized content. After categorizing nearly 800 political channels, we were able to differentiate between political schemas in order to analyze the algorithm traffic flows out of and between each group.

> After conducting a detailed analysis of recommendations received by each channel type, we refute the popular radicalization claims. To the contrary, these data suggest that YouTube's recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content. Instead, the algorithm is shown to favor mainstream media and cable news content over independent YouTube channels with slant towards left-leaning or politically neutral channels. Our study thus suggests that YouTube's recommendation algorithm fails to promote inflammatory or radicalized content, as previously claimed by several outlets.

Tweet from the author:

https://twitter.com/mark_ledwich/status/1210743168246771716

> It turns out the late 2019 algorithm DESTROYS conspiracy theorists, provocateurs and white identitarians

> Helps partisans

> Hurts almost everyone else.


> Our study thus suggests that YouTube's recommendation algorithm fails to promote inflammatory or radicalized content

This asserts that "mainstream media and cable news content" is neither inflammatory nor radicalized. I'm not sure how accurate that assumption is today.


I believe it is generally accepted definitionally that "mainstream" isn't radicalized (i.e. mainstream is defined as "not the radical fringe"). Can you think of an example of why that definitional assertion should be discarded?


I don't think it matches modern usage. I regularly see complaints about how such-and-such politician with mainstream support has a "radical agenda" or is "radicalizing" their followers.

If we insist on the definitional assertion, it becomes a lot less obvious why we should care about the original claim. Is it even a problem for Youtube to promote "radical" content if "radical" just means "different than what mainstream news channels say"?


What spicylemon pointed out, plus the fact that I find many 'mainstream' news shows to be radical without being fringe.

Sometimes it's little slants added by hosts around facts; sometimes it's hosts adding little details to what interviewees have said. Once you start to see it, it becomes sadly obvious that it's radical bias, and yet looking at the polls, or seeing other people's Facebook feeds and such, it's not fringe.

Sometimes it's facts being left out; sometimes only what can be shown as bad (or good) gets reported, while other details are avoided. I must say 'mainstream' news, both right and left, has been radical in its support of or resistance to certain narratives.


True or not, this isn't what they were referring to.


Whether it works one way or the other, I find what it recommends to me very disturbing.

Example: I'm a former rock climber and am very interested in knots and knot-tying. I watched some vids on hemp ropes, including "braiding" strands of rope. That pointed me to a video on hair braiding, which I watched as I do find it mildly interesting: a knot system that won't bind into a mess. Now YouTube is recommending me very odd videos of little girls doing makeup and gymnastics. And all the static ads are suddenly only dating apps, "find flirty women in your area" junk.

What do I need to watch to get out of this inappropriate category?


I think your Google account has a page where it tells you what it thinks you're interested in and lets you opt out of mistakes. I haven't used it in a while, so it may have changed.


> What do I need to watch to get out of this inappropriate category

Nothing, just tap the three dots on wrongly recommended videos and then "not interested" or "don't recommend channel". The recommendation algorithm might need more than one data point but it will catch on eventually.


The recommendation algorithm can be really bad at taking negative feedback. It'll sometimes stop recommending videos from a specific source when you give negative feedback, but once it's decided you might be interested in a topic, getting it to stop recommending videos related to that topic feels impossible.

The only reliable way to get it to stop recommending inappropriate content is to delete the videos that triggered those recommendations from your viewing history.


Perhaps removing it from your watch history? There should be some UI somewhere on YouTube to do this, or at the very least there used to be.

Personally, Youtube seems to only recommend me mainstream media now (CNN, MSNBC, even The Daily Show). It's better than pushing me towards right wing independent media like Sargon and Ben Shapiro, which is what it used to do every chance it got, but I very much don't want to watch either option. As a Bernie Sanders supporter, I'm not exactly wanting what either of them are pushing. It's not that I want an echo chamber, but it'd be nice to have political commentary which did not try to either obviously lie, or selectively tell the truth in order to paint a false narrative.


Anyone who regularly browses their Youtube recommendations knows that the algorithm changed dramatically a few months ago.


They pretty much nerfed it entirely. At least to my eyes, it seems to only "recommend" more videos of exactly the same topic from the same channels more often than not. It used to show videos that were along the same lines as the ones I watched but not necessarily exactly the same topic. It seemed to be better about following somewhat tenuous links between videos that often turned up new and interesting channels and related information.

It seems they're trying to play it safe by strictly showing the same thing you just watched. The bias seems to be heavily towards your recent watch history. You can actually see this in action if you go through your subscribed channels and watch a few videos from channels that you haven't visited recently. The change to your feed is immediately apparent.

I've also noticed that they seem to have blacklisted or at least severely downranked some channels even when you are subscribed so they rarely show up in recommendations. These aren't even controversial channels but they don't squarely fit into a category either. Overall it feels like they've shackled a previously good recommendation engine in order to appease their loud critics. I'm really hoping someone can create a viable competitor soon, I'll abandon YouTube in a heartbeat once there's another option.


I honestly think it's designed for infants and toddlers who watch the same videos hundreds of times.

Almost every child I've seen in the past 5 years has had YouTube in their hands.

I would expect all their data points to the idea they should suggest the same video over and over again.

I've also noticed the infants of my friends don't know how and have no interest in skipping the ads.

So long as Google can continue to plausibly claim it's women in their 30s-40s watching the nursery rhyme video (because that's who's logged in), they can continue to rake in the more expensive advertising associated with that demographic.


> So long as Google can continue to plausibly claim it's women in their 30s-40s watching the nursery rhyme video (because that's who's logged in), they can continue to rake in the more expensive advertising associated with that demographic.

That stops working when the advertisers realize they aren’t making any sales from those ads.


True, but ads don't just sell products; they also sell ways of thinking by framing.


You also have to think about the people who use YouTube to listen to music. I assume music tastes don't differ that much.

But honestly, the moment they decide to make arbitrary decisions about what videos should be played next, they are going to make some miscalculations. Sometimes those calls are just worse than average.


> I assume music tastes don't differ that much.

I wouldn't make that assumption. Personally, I have a wide range of musical genres that I listen to and those tastes change over time, even month to month depending on how I'm feeling. I've met enough people that say similar things to think this isn't uncommon.


It's not like they announced this. I suppose they're always tuning it against spammers.


I hadn't noticed :/

Do you have some examples?

Now that I think about it, what I distinctly remember is that at some point YouTube started throwing lots of "It's Always Sunny in Philadelphia" scenes at my face.

I am not even angry, I discovered my favorite show this way.


I used to see a heavy dose of my subscribed channels in my recommendations, now only a fraction of them are represented. We're talking channels where I watch nearly every video they post but they won't show up in my recommendations. There's clearly stuff Youtube "likes" and even a naive recommendation engine would perform better.


Same here. That's why I changed a couple of subscribed channels to send me notifications; it puts updates from these channels in the notifications area and sends me an email for every update. The June 2019 change to the algorithm was so big and hostile that I had to turn on notifications.


Interesting, there must be some variation in the algorithm.

I have just loaded YouTube, and of the 8 recommendations above the fold:

- 4 of them are from my subscriptions

- 1 is the last published in a series of videos I have been watching, published 2 hours ago

- 1 is from a channel I have never watched AFAIK, but contains a sketch from a comic duo I watched often on Youtube

- leaving 2 lectures that don't come directly from what I watched recently.

If anything, I would prefer if Youtube was pushing way more original topics.

One thing I have noticed is that their algorithm is very "grippy". Once it identifies a topic you might be interested in, it makes sure to find other videos on the same topic.

As an example, I stumbled on a beautiful video about what life is like for a man with autism. The next day YouTube recommended me a video where women with autism talk about their lives (actually I remember having already watched that video several years ago, so a meh recommendation). Today YouTube wants me to watch a video about autistic children in China.

Also, with the magic of A/B groups, we might be running on 2 very different algorithms for all we know.

> even a naive recommendation engine would perform better.

I think you are underestimating the complexity of the problem :)
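To make the comparison concrete, here is what a deliberately naive engine in the grandparent's sense might look like: surface unwatched uploads from subscribed channels, newest first. This is purely a sketch with made-up data structures, nothing from YouTube's actual stack:

```python
# A deliberately naive recommender: unwatched uploads from subscribed
# channels, newest first. Every name and data structure here is a
# hypothetical stand-in.

def naive_recommendations(subscriptions, uploads_by_channel, watched, k=8):
    """Return up to k video ids the user hasn't seen, newest first."""
    candidates = [
        (ts, vid)
        for channel in subscriptions
        for ts, vid in uploads_by_channel.get(channel, [])
        if vid not in watched
    ]
    candidates.sort(reverse=True)  # highest timestamp first
    return [vid for _, vid in candidates[:k]]

subs = {"knots101", "climbing_daily"}
uploads = {
    "knots101": [(3, "bowline"), (1, "figure8")],
    "climbing_daily": [(2, "crack-technique")],
    "unrelated": [(9, "makeup-tutorial")],
}
print(naive_recommendations(subs, uploads, watched={"figure8"}))
# ['bowline', 'crack-technique']
```

Even this toy version already raises the hard questions: how to rank across channels, what to do for users with few subscriptions, and how to surface anything new at all.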


> over independent YouTube channels with slant towards left-leaning or politically neutral channels.

Why were they explicitly listing these channel types without including right-leaning ones? From what I've heard, there is indeed no problem with left radicalization, but there is one with right radicalization and mind-fuckery through conspiracy channels and the like.

I guess I have to read that article


I think you've mis-parsed.

The algorithm itself (as opposed to the independent channels) is "with [a] slant towards left-leaning or politically neutral channels." i.e, it recommends channels which are slightly left-of-center.


I don't entirely understand the linked chart here.

But as best I can make of it, it has 1/4 the number of impressions of MSM and almost half the impressions of either partisan left or right.

This is a minor category showing impression numbers equivalent to entire media organizations. How is this evidence that YouTube's algorithm does not promote radicalized content?


This should be the top comment of the thread.


This quote raises questions: "The scraped data, as well as the YouTube API, provides us a view of the recommendations presented to an anonymous account. In other words, the account has not "watched" any videos, retaining the neutral baseline recommendations, described in further detail by YouTube in their recent paper that explains the inner workings of the recommendation algorithm [38]. One should note that the recommendations list provided to a user who has an account and who is logged into YouTube might differ from the list presented to this anonymous account."

Many discussions of radicalization talk about recommendations coming after someone has watched a number of videos.

This situation is that after N videos, the algorithm starts to see a pattern, a niche that the user fits into, and then recommends more niche content. Once the user watches the niche content, either just more niche content is recommended, or niche and mainstream content are recommended together and the mainstream content is boring by comparison.

In a sense, I don't see how a blind recommendation system could escape this situation. It seems more an inherent problem of blind recommendation systems.
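The trap can be sketched with a toy model (purely illustrative, not YouTube's real system): if the recommender always picks the unwatched video most similar to the one just watched, each step is locally reasonable, but the chain drifts from mainstream to fringe. All video names and topic weights below are invented:

```python
# Toy "rabbit hole": a greedy session recommender that always picks the
# unwatched video most similar to the last one watched. Catalog entries
# are hypothetical, with (mainstream, fringe) topic weights.

CATALOG = {
    "evening-news":   (1.0, 0.0),
    "talk-show":      (0.9, 0.2),
    "hot-takes":      (0.5, 0.5),
    "conspiracy-101": (0.2, 0.9),
    "deep-fringe":    (0.0, 1.0),
}

def next_video(watched):
    """Unwatched video with the highest dot-product similarity to the last watch."""
    last = CATALOG[watched[-1]]
    unwatched = [v for v in CATALOG if v not in watched]
    return max(unwatched,
               key=lambda v: sum(a * b for a, b in zip(CATALOG[v], last)))

# Start from pure mainstream and always follow the recommendation:
session = ["evening-news"]
while len(session) < len(CATALOG):
    session.append(next_video(session))
print(session)
# ['evening-news', 'talk-show', 'hot-takes', 'conspiracy-101', 'deep-fringe']
```

Each consecutive pair is similar, yet the endpoints share nothing; that local-similarity drift is the "inherent problem" of any blind similarity-based recommender.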

What could be a better approach?

* Better education in the US and the world so people have BS detectors?

* Instead of unlabeled recommendations, the algorithm categorizes what you've seen, categorizes what it's recommending, and lets the user say what category they want, removing or adding categories for recommendation.

* Have a more ordinary search tool and a better categorization system (YouTube is a horrible black box now: search is terrible, recommendations are terrible, I get videos from other sites mostly).


Have the recommendation system mix it up: put niches that are somewhat similar into the recommendations somewhat less often, put things a bit further away even less often, etc. This would have the algorithm recommend counterpoints which are clearly visible to the user.

This assumes that things which are partisan one way count as "similar" both to things that are partisan the other way and to counter-partisan content.
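That mixing idea could be sketched like this (my own toy construction, not a known YouTube mechanism): rank candidates by similarity to the user's interests, but reserve a fixed number of slots for deliberately distant picks so counterpoints stay visible. All names and vectors are hypothetical:

```python
import random

# Diversity-injecting recommender sketch: mostly-similar picks plus a
# reserved slot sampled from the dissimilar half of the candidate pool.

def mixed_recommendations(interest, candidates, k=4, far_slots=1, seed=0):
    """candidates: {video: topic_vector}; interest: the user's topic vector.
    Returns k videos: (k - far_slots) most similar, plus far_slots sampled
    from the least-similar half as visible counterpoints."""
    rng = random.Random(seed)
    ranked = sorted(
        candidates,
        key=lambda v: -sum(a * b for a, b in zip(candidates[v], interest)))
    near = ranked[:k - far_slots]
    far_pool = [v for v in ranked[len(ranked) // 2:] if v not in near]
    return near + rng.sample(far_pool, far_slots)

cands = {
    "niche-a1": (0.0, 1.0), "niche-a2": (0.1, 0.9),
    "neutral":  (0.5, 0.5),
    "counter1": (0.9, 0.1), "counter2": (1.0, 0.0),
}
print(mixed_recommendations(interest=(0.0, 1.0), candidates=cands))
```

The seed keeps the sketch reproducible; a real system would sample fresh counterpoints each session, and the share of far slots is exactly the knob the parent is proposing to tune.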

Avoid censorship, censorship is too easily abused.


We should not classify "refusal to ever recommend X because you know X to be untrue or defamatory" as censorship.


It's not censorship, but this position makes you not simply a neutral platform, but an opinionated editor.


The platform is not and cannot be neutral; they necessarily remove porn and copyright infringing material, plus whatever they are legally obliged to do in various countries. An endless series of judgement calls.

Moreover, we're talking about the recommendation algorithm. The only neutral recommendation algorithm is a random number generator. Everything else is making a decision that video X is "better" than video Y for user Z. What it means to be "better" is difficult to analyse, but Google themselves must be constantly reviewing that question.


I didn't mention anything specific, so I am not sure what you are replying to. Your example can be dealt with better in other ways, such as putting a prominent statement with a brief description of why something is untrue and a link to further information. For YouTube something like a popup that appears on the video which has to be manually closed, and also the information appearing under the video.

Censorship in general is terrible, and abused very regularly. Even on the really small scale I see little mafias pop up all the time, such as on forums, where anyone a bit critical of the select group gets banned and their posts deleted. On a larger scale is Putin announcing that criticism of the state is to be censored, citing that other European laws do just that and punish people for what they say. If someone is censoring someone, it is just extremely suspicious, and I don't blame anyone for going 'it is censored, therefore it is likely true', because the censor has removed the information that could be used as evidence for whatever it is.

"When you tear out a man's tongue, you are not proving him a liar, you're only telling the world that you fear what he might say." - George R.R. Martin, A Clash of Kings


Difficult to take this analysis seriously when CNN is categorized with The Young Turks as “Partisan Left.” Regardless of what you think of either channel the programming of these two channels is dramatically different.

The fact that the tags were manually created by the experimenters and were manually assigned by the experimenters throws the results even more into question.


I don't follow your logic. Yes, CNN and TYT are very different. This only means they can be tagged differently; it doesn't mean they cannot be tagged under the same category.

For example, The Selfish Gene and Advanced Algebra are two very different books, but they can be both categorized as nonfiction.

Maybe you are right with the conclusion, i.e. CNN or TYT should not be both categorized as “Partisan Left,” but you need a better reason.


I think the more distinct categorization is relevant in this particular paper.

I think TYT is much further left than CNN (to a European, CNN is actually center-right). So if you want to study whether there is a "funnel" from the center to the (extremist?) left, then you have to distinguish them; otherwise your results will be biased (at the very least, biased towards right-wing extremism).

Second, we don't know if YT recommender treats them differently or not. There is some evidence that it treats "mainstream" channels (i.e. TV stations) differently than "alternative" channels. Whether TYT falls into the latter is not clear.


In my view it’s a good example of why the manual labeling is flawed as CNN regularly criticizes progressives and airs republican views that TYT then responds to and criticizes CNN for.

To categorize them both as “Partisan Left Focused on politics and exclusively critical of Republicans” is a stark example of why manual labeling of these channels, especially by the two authors and some unnamed person the authors trusted (pg. 4), cannot be unbiased and a survey methodology or topic model would produce more persuasive results.


Isn't that kind of like saying InfoWars isn't a conspiracy/entertainment website because it regularly posts about actual news?


Note I did not downvote you. These links provide some info on bias of the 2 channels:

CNN: https://mediabiasfactcheck.com/cnn/

TYT: https://mediabiasfactcheck.com/the-young-turks/


IIRC, stations like Fox News and CNN have (had?) segments of traditional breaking/daily news, and segments that are commentary.

Maybe the lines are now blurred, but it would be interesting to see the amount of bias in the traditional daily news segments. Obviously the commentary segments are much more biased.


It's interesting seeing how arbitrary all points are along the 'Overton window'; I'm not from the US, so CNN strikes me as center-right wing: low regulation, anti-union, and moderately pro-war, etc.

This illustrates the danger of ranking everything relatively... if enough people watch InfoWars-like content, then every university will be marked as radical left-wing, etc. There needs to be some way of anchoring the spectrums in actual facts, or we're firmly in the 'post-truth' era.


CNN, along with most media companies, seems to be more pro-conflict than anything else.


CNN International is quite different from CNN (US).


CNN and TYT are both partisan left. They both published content showing extreme dismay at the election of Trump. That's fine, they're allowed to do that. This just shows their political bias. The same way Fox was airing pro-Trump things at the same time. This isn't a dispute over whether they are right, just over what type of content they publish. I see no problem putting CNN and TYT on the same side of the political spectrum.

EDIT: Downvotes? Give me a counter example that shows how TYT or CNN is not left-leaning.


> Downvotes? Give me a counter example that shows how TYT or CNN is not left-leaning.

Maybe downvotes are because you claim your parent claims CNN isn't left-leaning, whereas what they said was that CNN isn't "partisan left" in the same way that TYT is. You also don't address their main evidence for their claim, that they frequently attack each other.

I have no idea if you're right, because I haven't looked into the issues in enough depth. Your attack of a strawman is, however, completely unconvincing to me.


CNN is very pro-democrat, as clearly shown in all their election coverage. This makes them partisan left.

From what I've seen, CNN tends to support the favoured democrat candidate/front-runner, and TYT lean further left and often show support for somebody like Sanders, but show general support across the board.

> You also don't address their main evidence

What main evidence? Their main programming? The main programming of both is left leaning, and when it comes to discussing elections and US politics, is incredibly partisan.


> > You also don't address their main evidence

> What main evidence? Their main programming? The main programming of both is left leaning, and when it comes to discussing elections and US politics, is incredibly partisan.

The second half of the sentence you quoted tells you exactly what main evidence. If you're going to abridge quotes to take away their key meaning and then attack them, I'm not going to bother doing the research necessary to argue with you.


You're really grasping at straws here. There was no link or example provided in the original comment. They simply stated what they perceived the main programming to be. In one of my comments I address that by saying that the main programming on both is very pro-democrat, but with TYT leaning further left than CNN in their support of candidates that are further left than the front-runner. There is nothing else to address. If you dispute my counter-claims, please do so. Don't just say "you didn't do x" and sit back as if that's enough intelligent input to win an argument.


To most Europeans the US Democrats are center / right wing. There are no major left wing political parties in the US and no major left wing media outlets.


The left-right paradigm is obviously limited in its scope for explaining the nuanced differences between parties and political movements and especially across international and cultural boundaries.


Still, there's a marked difference between CNN and Jacobin.


I'm not partisan left and I shared content showing extreme dismay at the imbecile running my country, too.


I've found the downvotes tend to come from those who believe "reality leans left."

I.e. there is no "left" content, because that content is just the truth. And any "right" content should in their view be properly labeled as "wrong" content.


Individual HNers should lack visibility into who is downvoting them. How did you de-anonymize the downvote signal?


> They both published content showing extreme dismay at the election of Trump.

So did the National Review (https://www.nationalreview.com/2016/01/donald-trump-conserva...), but they're hardly left.

Same for George Will (https://www.washingtonpost.com/opinions/the-spiraling-presid...). Again, hardly left.


George Will and the National Review crew are neo-cons. AKA not-conservatives. Straussians. They are the fake "right" of the modern pro-global-government party. Watching people argue over what's "left" and "right" is surreal. The real differences (if one wants to find them) are between the sovereignty of nation states and a permanent ruling class that sets quasi-global policy.

To anyone who thinks they are "left": Consider watching the current president's UN speeches.


> Consider watching the current president's UN speeches.

I can't. They're a nonsensical standard of English full of insultingly untrue things.

> sovereignty of nation states vs a permanent ruling class

These two things are not in conflict; historically most nation states have a fairly fixed ruling class. There was a paper a while ago showing how many of the ruling families of 14th century Venice were still in significant positions politically or economically.


"untrue things"

What second hand info were you given?

https://www.youtube.com/watch?v=jp71VWgqURQ

The commoners are aware of the various power centers. The conflict you pretend does not exist happens when that status is challenged. From the perspective of power, it should not be possible; hence their PR firms (FOX/CNN/xNBC/NYT etc.) losing control of the "pick A or B" directive.

You stepped over the mention of the quasi-global policy. The narrative is that it does not exist, or you are a "insert negative words" if you talk about it. The concentration of power is in conflict with the idea of a sovereign nation state. Modern language engineering, like the constant attempts to associate liking one's country with ethnic supremacy (kinda hard with the US:), is one rather transparent technique used to attempt to shape how people think about it.

73rd UN Session: https://www.youtube.com/watch?v=q6XXNWC5Koc

74th UN Session: https://www.youtube.com/watch?v=TzufjsnCa7Y


> George Will and the National Review crew are neo-cons.

Irrelevant.

The parent comment claimed being dismayed by Trump's election makes one automatically part of the "partisan left". Neither George Will nor National Review are, by any stretch of the imagination, "partisan left", despite their being anti-Trump.

They may not be the form of right-wing you like, but they're very much on the right hand side of the spectrum.


That's fair. I agree. Some of Trump's most powerful political enemies are what some people would consider "right". Ryan, Mitt, (formerly) McCain... the list is long. I find the conventional l/r distinction worse than useless, to the point of deliberately obfuscating intent.

Studying YT through the conventional l/r window is not going to yield useful results.


The dismay of a Trump presidency is just reality.


> CNN and TYT are both partisan left. They both published content showing extreme dismay at the election of Trump. That's fine, they're allowed to do that. This just shows their political bias.

‘Extreme’ disapproval of Trump is an exclusively left wing position? Clearly that is not the case.


I think this study is testing the wrong hypothesis.

Their idea is that people get "radicalized" by being recommended extremist content after watching mainstream news videos, i.e., Fox News -> 9/11 conspiracy videos. They demonstrate that this doesn't happen often.

But the real problem with YouTube is intra-category recommendations. If someone watches one conspiracy video, then their recommendations become all conspiracy videos, which inevitably leads to them consuming more and more far-right content.

This seems like good science and well conducted research. I just don't know if they had the correct view of the problem


The 9/11 event is by definition the outcome of a conspiracy. The parties to the conspiracy and the methods employed are where some disagree with the official analysis. Regardless of your political stance, do your fellow human beings a credit and acknowledge that it wouldn't be the first time a government lied to its people, and it won't be the last.

Also it's kind of a weird take that extremism == far right. It seems like far left would be included, again as a matter of definition.


If someone watches one conspiracy video, then their recommendations become all conspiracy videos, which inevitably leads to them consuming more and more far-right content.

Just in case you didn't know, conspiracy theories aren't an exclusively right wing thing.


>Just in case you didn't know, conspiracy theories aren't an exclusively right wing thing.

Exclusively, no. Primarily, yes. At the moment, right-wing extremism is enjoying a global surge in popularity, so that ideology currently dominates conspiracy narratives online, and contributes to their virality.


Alternative media is seeing a surge of right-wing because mainstream media is exclusively left-wing, often downright conspiratorially so.

I could turn on the TV at prime time and expose my family to the supposed benefits of transgenderism, homosexuality, racial diversity, etc. I am infinitely more likely to see a positive portrayal of a mixed-race lesbian than a nationalist family man.

If you share that perspective, then that's probably wonderful. You might even suppose I must have a serious flaw to see things differently. You need to realise that half of everyone else sees it the other way.


> Alternative media is seeing a surge of right-wing because mainstream media is exclusively left-wing, often downright conspiratorially so.

Ironically, the narrative that all mainstream media is left-wing propaganda is, itself, a right-wing conspiracy theory.

A successful one, since it's the raison d'etre for Fox News, but it's still just a step removed from fears of "globalist elites" running a shadow government, cultural marxism or Hillary Clinton having a kill count.

Even if there were a credible point to be made about left-wing bias in the media (which could easily be countered by pointing out the amount of pro-business, pro-war content in the very same media) none of that is going to make Reddit, Breitbart or Gab more credible as alternatives.

>I am infinitely more likely to see a positive portrayal of a mixed-race lesbian than a nationalist family man.

No, you aren't. Although I don't know (but I do suspect) what "nationalist family man" is supposed to mean as a qualifier, or why this is presented as the obverse to "transgenderism, homosexuality and racial diversity," the majority of relationships portrayed in media are heteronormative, and plenty are between Caucasians, and few portray those traits as inherently negative.

Indeed, it's still common in mainstream media to fall back on old tropes like "killing your gays" (having homosexual or nonbinary characters die in a narrative, which goes back to the old Hays Code and its requirements that "sexual perversion" be punished or else never portrayed in a positive light), camp gay characters, depraved bisexuals and "traps." While not as common now as in the past, you're still more likely to find negative, or at least stereotypical, portrayals of non-white races and non-heterosexual identities than the opposite.

>You need to realise that half of everyone else sees it the other way.

Now you're trying to move the Overton window towards normalizing what is an extremist point of view.

I am aware that plenty of people do hold such views, but it's a simple falsehood that they comprise at least half the population.


[flagged]


> This is one area where the left-right paradigm breaks down

No, it isn't.

> I've see far more leftist war-mongering in the last decade than anything coming from the right

That's probably because you have confused as “left-wing” the center-right neoliberal faction that's been completely dominant in the Democratic Party since the early 1990s (though in the last couple years that dominance is weakening and there’s a real chance that they won't have someone from their faction secure the Democratic Presidential nomination next year, which would be the first time since at least 1992 that that has occurred; 2016 was the first time since at least then that there was even a significant contender that wasn't from that faction.)

> If you imagine the quintessential American man, a hard-working White lower-middle-class family man, it's impossible to assign to him modern leftist politics

Odd, then, that that's exactly the group in which the actual left-wing challenger far out-performed the center-right establishment candidate in last year's Democratic primary (and, also, during the primary when the two were placed in general election polls, the same effect was seen.)

Modern left-wing politics do just fine with the working class, including the white male segment of the working class. Center-right neoliberal capitalist politics and the bourgeois minority identity politics that faction engages in to divide the working class and obstruct the formation of a working-class identity, don't work for the white working class (and aren't intended to) just as the white nationalist identity politics embraced by the other party’s capitalists for the same purpose don't work, and aren't intended to, for the minority groups to whom the Democratic center-right’s identity politics are targeted.


We can keep doing study after study, but the kernel of all of this seems to be an inability of certain academics and tech industry titans to understand that not everyone thinks like them or holds the same values. There is an undercurrent in all of this confusion that if we can just control what others see and hear, we can make them believe the exact same things we believe. After all, we are right, they are wrong, so the error must be due to some contamination of their minds by dangerous content. Eliminate access to that content and all will be well. Banning wrong think from YouTube might be good for advertisers and Google's bottom line, but it's not going to make thoughts that have been around since the dawn of humanity magically disappear.


> There is an undercurrent in all of this confusion that if we can just control what others see and hear, we can make them believe the exact same things we believe.

I mean, that’s how advertising works, and most of these businesses are ad-funded.


> we are right, they are wrong

Vaccines work; the safety risks are minimal; there is no reliable evidence that they cause autism and the primary proponent of the theory was eventually struck off for malpractice. Getting this wrong will cause the unnecessary suffering and death of small children.

The earth is spherical, and this has been known since ancient times. Some things are not a matter of opinion.


> The earth is spherical

Well, close enough for most people.

But I think the problem is that just trying to suppress dangerous untruths doesn’t seem to work in our society, and the greater the capacity to do so in other societies, the more that capacity is itself incredibly easy to abuse.

I don’t have solutions for this. I wish I did, because I have no reason to think I might be immune to untruths.


[flagged]


Do what with it? What's the question to which you might want a true/false value assigned?


Some things are a matter of opinion. Even worse, some things seem objectively clear yet are censored by YouTube for wrong think.

I suppose my point was that you've picked some very low-hanging fruit there (and even then I think the vaccines one is far more nuanced and complex than that, and the flat Earthers are ultimately harmless).

Do racial diversity. Who is right and who is wrong? Why should those preaching the virtues of third-world immigration to the West be promoted on YouTube while those preaching the virtues of nationalism in the West are censored, demonetised, or banned?


I'm curious why all submissions about YouTube manipulating their recommendation algorithm get flagged. I've submitted lots of links to HN, most of which never get any points of course, but only those about YouTube have ever been flagged:

* https://news.ycombinator.com/item?id=21793498

* https://news.ycombinator.com/item?id=20475792


I can speculate that this particular submission might be flagged because it goes against anti-YouTube propaganda that some HN-popular companies are participating in, like Mozilla. And, you know, people love to suppress views that oppose their propaganda. But of course there might be a more specific organized effort too.


Does Youtube radicalize? Edward Snowden made an observation during his Joe Rogan interview that compartmentalization within the CIA is necessary so that no person can see all the bad stuff at once and freak out at its total enormity. As a sysadmin, he was in effect radicalized by seeing the bigger picture at NSA that wasn't visible to other analysts.

I think Youtube shows you an analogous bigger picture that makes the dominant media narratives seem fabricated and dishonest when viewed together, even if individually most of it is sincere (if dumb) reporting.

The discourse around radicalization has also been abused by editors and academics with agendas to de-normalize formerly moderate views, and who treat the interests of regular people as beneath discourse. So it's hard to take any discussions of radicalization seriously, even ones questioning whether it's even a thing, as you still have to acknowledge the nonsense it has been freighted with first.


Which formerly moderate views do you believe are being de-normalized by "editors and academics with agendas"?


- Authoritarian ideologies, such as fascism and communism, have no place in our society and their proponents must be shunned

- People should be judged (i.e. admitted to college or hired) by the content of their character and not by the color of their skin

- Mass import of foreign workforce by corporations is not being done in pursuit of any high-minded ideals but to drive the wages down, and it is not something we must embrace or celebrate

If you agree with any of the above, congratulations -- you're a fascist.


Please don't take HN threads into ideological flamewar. We should resist going there, not jump in first thing.

https://news.ycombinator.com/newsguidelines.html


As a “libertarian”, I have to hold my tongue and smile in most conversations.

The Overton Window has become so shockingly small, that most “normals” educated in the last 20 or so years no longer even realize that they have been mentally neutered.


I mean, of all the different libertarian philosophies, most of them are radical, insofar as they all seek to distance the individual from systems of government and social institutions. Since those things are what most people use day to day, and in the US only one of the two political parties leans that way, it does put you in a minority camp whose views can be felt as a sort of existential threat. (And honestly, politics and religion should both be kept out of most conversations)


Everyone is a libertarian — when it comes to their own liberties.

Other ideologies vary, insofar as they devalue and restrict differing subsets of other peoples’ liberty.

So it is always strange to me that valuing others’ liberty as highly as my own is now considered “radical”.

And, I’m always ready and willing to lay down my life to defend my liberty — and yours.

I find it also strange that the political ideologies deemed non-“radical” are willing to demand that I defend their liberty with my life — but would throw me and my loved ones under the bus without blinking.

But, I’m the “radical” one...


It sounds a bit like you've got an axe to grind more than you want to actually discuss political philosophy, and that might be more the reason why other people don't want to discuss it. (And for what it's worth, 'radical' isn't pejorative... Pretty much anything that isn't social progressivism or social conservativism is radical in this country)


FWIW, I'd note that Snowden was a CIA contractor, not operative or spy, so his understanding of why compartmentalization is done should maybe be taken with a grain of salt.

(For example, another---and if I understand correctly, the main---reason compartmentalization is done is so a single compromised agent can't give "the whole farm" to an enemy intelligence organization. It's not about protecting agents from 'the real story,' but from making sure the enemy doesn't have a picture of the agency's understanding of 'the real story').


Both aspects are far from mutually exclusive.

This does not change the fact that splitting people and processes in little groups is an excellent way to get otherwise unspeakable deeds done.

Ben Griffin, an ex UK SF trooper turned pacifist, talks about compartmentalisation in detail [0] (00:30:10).

Yes, Godwin's law. And yes, Ben has an obvious agenda. In our context here, he does bring forth a few valid and concrete examples that underline Snowden's observations.

[0] https://m.youtube.com/watch?v=6tHvtFibhic


Great comment, but media narratives are a thing. The senior executives at mainstream networks set and reinforce narrative on a daily basis.


This applies every bit as much to Google's web search as it does YouTube search.

I know of many marginalized people targeted by online hate groups that have been smeared with proven-false allegations and doxxed online, and these stalking websites are consistently ranked as the top results in Google search and Google Images search results for their names.

Google's AI is clearly trained to recognize and promote controversy, because it is human nature for controversy to drive engagement.


I've been convinced for a while now that Facebook tries to start fights between people of opposite ideology. I'm sure it's just that their algorithms have picked up that pro-skub people are highly likely to reply to anti-skub comments in pro-skub threads (since that was the pattern I was seeing) but it sure comes across as trying to cause drama.


Kind of reminds me of CGP Grey's “This Video Will Make You Angry”.


What applies? The submission concludes the opposite of what you're describing.

> To the contrary, these data suggest that YouTubes recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content.


...but only for logged out users.

They didn't test what happens for people with an account.



This is a pretty clever way to identify natural recommendation levels. I wish the authors had given a bit more justification of why net impression flows are a good metric, but I don't want to quibble too much, because it seems pretty reasonable and it's certainly a vast improvement over what I was afraid I'd find going in.
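For what it's worth, the "net impression flow" idea can be sketched in a few lines. The numbers and category names below are made up for illustration (not from the paper): given a matrix of recommendation impressions between channel categories, the net flow from category A to B is impressions pointing A→B minus those pointing B→A; a positive value means the algorithm moves viewers from A toward B on balance.

```python
# Sketch (not the authors' code) of a net impression flow metric
# between channel categories. flows[src][dst] = number of
# recommendation impressions shown on src-category videos that
# point at dst-category videos. All counts here are hypothetical.

from itertools import combinations

flows = {
    "mainstream": {"mainstream": 900, "partisan": 150, "fringe": 10},
    "partisan":   {"mainstream": 400, "partisan": 500, "fringe": 40},
    "fringe":     {"mainstream": 300, "partisan": 120, "fringe": 60},
}

def net_flow(flows, a, b):
    """Net impressions moving viewers from category a to category b."""
    return flows[a].get(b, 0) - flows[b].get(a, 0)

for a, b in combinations(flows, 2):
    print(f"{a} -> {b}: {net_flow(flows, a, b):+d}")
# mainstream -> partisan: -250
# mainstream -> fringe: -290
# partisan -> fringe: -80
```

With these toy numbers, every net flow points back toward mainstream content, which is the shape of result the paper reports.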

> However, it seems that the company will have to decide first if the platform is meant for independent YouTubers or if it is just another outlet for mainstream media.

Come on, authors, save it for the blog post.


> with slant towards left-leaning or politically neutral channels

So ... YouTube is actually politically biased? This would be an interesting study to continue. You would need to normalise for view counts etc. since I suspect there is a bias just in terms of numbers. The biggest channels are more progressive, left leaning, so YouTube could appear left-biased simply by recommending most-viewed videos.


Some other research came to a compatible conclusion:

https://www.wired.com/story/not-youtubes-algorithm-radicaliz...


I devised a simple test for this. After the last US election there was some analysis and discussion of Google's youtube recommendations engine which appeared to show a strong bias towards conspiracy videos, which in turn led to videos that generally promoted Trump over Clinton. Now, I'm not American, and don't care who your president is, but it was around this time that I realised youtube's search engine had become pretty useless for me. Since then I only use youtube for watching tutorials or technical talks (and old TV shows that are hard to find elsewhere). I try to avoid the search engine and recommendations altogether.

Ok so that's the why, now what's this simple test? Just search YouTube for CERN and note the mix of results. I found that most of the results could be categorised as either pro-science or anti-science/conspiracy. In an ideal world I would expect to see the official CERN channel at the top, followed by sciency videos about CERN, and finally the fringe conspiracy videos about wormholes to other dimensions and such. It's fun to use Tor to see how the results vary by country. The last time I did this was 2017, and I don't have the data to hand, but roughly 70-80% of results in the first few pages were conspiracy-related.

Here's a screenshot of my result in 2017 when I'm logged in. https://user-images.githubusercontent.com/28928495/71557922-...

I just had a quick look now and my first impression is that results have certainly changed and improved, but it's a mixed bag. I will find my old data and update it soon.


I see the term radicalization is being brought up. Do we serve ourselves by focusing on superficial presentations that won't be the same in 10 years, rather than on how similar everyone is: feeling enlightened and competitive, and lacking consideration that their perceived "ideological adversary" is every bit as vulnerable, needy, and scared as themselves?

How is this different from couple's therapy, just at long distances w/ groups?

None of them want to have a picnic and cooperate with each other. And who could blame them? They both fail to recognize each other's hardships, yet each thinks they "get" the human condition so much better than the other. They begin by insinuating there's another group that has a character flaw and is so angry. Yet all the while, they're the ones angry and accusing (https://en.wikipedia.org/wiki/Projective_identification)

I have a hypothesis: It's all acting out, cathartic, to blow off steam. They're not at the soup kitchen or volunteering. They have a dysfunctional coping mechanism stemming from earlier traumas, and it's more profitable to captivate lonely, bored people by stirring up their existential anxiety than to help them find common ground.

Because if people realized the common ground they shared and cooperated, people would start to pass laws and regulations to make healthcare, employment, housing, and education more fair for legal persons. The whole concept of political sides is a sham: They are legal persons and https://en.wikipedia.org/wiki/Maslow%27s_hierarchy_of_needs.

If common sense stuff isn't being fixed and people are bikeshedding: the fix is in, people in suits are giving each other high fives and laughing at you. You're being suckered into squandering your political rights (you voted, or can!) to take worthless, symbolic digs at people rather than get what you need, better laws for the practical issues everyone shares. Hint: they tend to be boring.


Youtube recommended this to my son: https://youtube.com/watch?v=4LfJnj66HVQ

Please watch it and tell me what you think of it.


What do you mean? Gucci Gang is a pretty well known hip-hop track liked by the "youth". I am not surprised it was recommended to your son since most of his peers probably have watched it.


It should be noted that this research was likely published with a specific agenda in mind: https://twitter.com/mark_ledwich/status/1210743217982803970?...

> My new article explains in detail. It takes aim at the NYT (in particular, @kevinroose) who have been on myth-filled crusade vs social media. We should start questioning the authoritative status of outlets that have soiled themselves with agendas.

From the linked Medium article:

> These events, along with the promotion of the now-debunked YouTube “rabbit hole” theory, reveal what many suspect — that old media titans, presenting themselves as non-partisan and authoritative, are in fact trapped in echo chambers of their own creation, and are no more incentivized to report the truth than YouTube grifters.

The paper itself makes a fundamental flaw of using logged out recommendations and attempts to disprove 2018 algos even after substantial changes since, which invalidates the research entirely.


Less of an agenda than the aforementioned NYT article.

Sure, this research doesn't use 2018 algos, because it's 2019. I guess the NYT and others should've done the research in the first place then. It certainly doesn't invalidate the research - it shows that the claims made by others are probably not accurate in 2019 on YT.


I believe the 2019 algo was dramatically changed recently, meaning that the core of the 2018 algo was present throughout the majority of 2019.


It makes me wonder how one would research Youtube algorithms from the outside.

Perhaps freshly imaged computers in random geographical locations periodically running selenium or other browser automation software, recording which recommendations are made for various viewing preferences?

The problem is A) the study could be invalidated by basic bot prevention (from identifying Selenium, identifying fresh accounts, etc) and B) Youtube actively preventing such research with "new" accounts and C) lack of sufficient scale could skew the results.

Ideally Youtube would partner with a transparent auditor and have internal teams work with the auditors, but there's no way Google agrees to that because it could potentially unearth some very bad practices by the Youtube team (if they do exist...).
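The crawl loop such a study needs is simple in outline; the hard parts are the ones listed above (bot detection, personalization, scale). A minimal sketch, with the actual page fetch stubbed out (all names and the tiny recommendation graph are hypothetical — a real study would replace the stub with a fresh, logged-out browser session per walk, e.g. via Selenium):

```python
# Sketch of a recommendation random walk: start from a seed video,
# repeatedly follow the top "Up Next" recommendation, and record
# the path. The fetch step is a stub over a fake graph.

def fetch_recommendations(video_id):
    """Stub: return the ranked recommendation list for video_id.
    Replace with a real scrape of the watch page in a clean,
    logged-out browser profile."""
    fake_graph = {
        "seed": ["a", "b"],
        "a": ["b", "c"],
        "b": ["a"],
        "c": ["seed"],
    }
    return fake_graph.get(video_id, [])

def top_recommendation_walk(seed, hops):
    """Follow the top-ranked recommendation for `hops` steps,
    recording every video visited."""
    path = [seed]
    for _ in range(hops):
        recs = fetch_recommendations(path[-1])
        if not recs:
            break
        path.append(recs[0])  # always take the "Up Next" slot
    return path

print(top_recommendation_walk("seed", 3))  # ['seed', 'a', 'b', 'a']
```

Repeating many such walks from many seeds, across locations and times of day, and classifying the visited channels is roughly the methodology the BuzzFeed piece below describes.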


I helped contribute to YouTube algorithm research for BuzzFeed News in January: https://www.buzzfeednews.com/article/carolineodonovan/down-y...


> To better understand how Up Next discovery works, BuzzFeed News ran a series of searches on YouTube for news and politics terms popular during the first week of January 2019 (per Google Trends). We played the first result and then clicked the top video recommended by the platform’s Up Next algorithm. We made each query in a fresh search session with no personal account or watch history data informing the algorithm, except for geographical location and time of day, effectively demonstrating how YouTube’s recommendation operates in the absence of personalization.

From a technical perspective this does not seem very rigorous, specifically:

> We made each query in a fresh search session with no personal account or watch history

I don't see how you can confidently say

> effectively demonstrating how YouTube’s recommendation operates in the absence of personalization

---

1) What is a "fresh search session"

2) Were all these requests made from the same IP address?

3) What does no personal account or watch history mean? Did you use incognito mode, did you logout from chrome, had you recently been logged into an account in that browser (such that cookies may still be present)?

---

The point of my comment is that this research is *hard*, and it's difficult to definitively prove that Google's algorithms are immoral unless you can research with a high degree of rigor (the study is repeatable, it's been peer-reviewed by experts, etc).

FWIW I think this research is immensely valuable and needed, but I feel like until it is done well the "news" about it is little more than op-eds and blogspam.


Why did you edit out the line that says you used techniques like the ones the gp mentions? That was what made your comment relevant.


> It should be noted that this research was likely published with a specific agenda in mind:

Just like the previous "study" which claimed everybody not part of the corporate media were alt-right.

These types of "research" are funded and run solely for an agenda. Primarily to push social media platforms to direct traffic to corporate media.

Also, radicalized is such an interesting word. If the saudis ran the exact same study with their definition of "radicals", I wonder how that would turn out? What if the chinese or the russians? Are their radicals the same as ours? Also, are the liberal elites' definition of radical the same as conservatives or the general population at large?

At the end of the day, it's all about "authoritative" sources — an Orwellian and creepy word if there ever was one.


> Also, radicalized is such an interesting word. If the saudis ran the exact same study with their definition of "radicals", I wonder how that would turn out? What if the chinese or the russians? Are their radicals the same as ours? Also, are the liberal elites' definition of radical the same as conservatives or the general population at large?

If we want to be consistent, the founding fathers of the United States were radicalized, but that's usually not how people think of them.

