
YouTube’s Rabbit Hole of Radicalization [pdf] - dfabulich
https://arxiv.org/abs/1912.11211
======
dfabulich
For the convenience of those who can't conveniently read a PDF on their
current device:

> _The role that YouTube and its behind-the-scenes recommendation algorithm
> plays in encouraging online radicalization has been suggested by both
> journalists and academics alike. This study directly quantifies these claims
> by examining the role that YouTube's algorithm plays in suggesting
> radicalized content. After categorizing nearly 800 political channels, we
> were able to differentiate between political schemas in order to analyze the
> algorithm's traffic flows out of and between each group._

> _After conducting a detailed analysis of recommendations received by each
> channel type, we refute the popular radicalization claims. To the contrary,
> these data suggest that YouTube's recommendation algorithm actively
> discourages viewers from visiting radicalizing or extremist content.
> Instead, the algorithm is shown to favor mainstream media and cable news
> content over independent YouTube channels, with slant towards left-leaning
> or politically neutral channels. Our study thus suggests that YouTube's
> recommendation algorithm fails to promote inflammatory or radicalized
> content, as previously claimed by several outlets._

Tweet from the author:

[https://twitter.com/mark_ledwich/status/1210743168246771716](https://twitter.com/mark_ledwich/status/1210743168246771716)

> It turns out the late 2019 algorithm _DESTROYS_ conspiracy theorists,
> provocateurs and white identitarians

> _Helps_ partisans

> _Hurts_ almost everyone else.

~~~
sandworm101
Whether it works one way or the other, I find what it recommends to me very
disturbing.

Example: I'm a former rock climber and am very interested in knots and knot-
tying. I watched some vids on hemp ropes, including "braiding" strands of
rope. That pointed me to a video on hair braiding, which I watched, as I do
find it mildly interesting: a knot system that won't bind into a mess. Now
youtube is recommending me very odd videos of little girls doing makeup and
gymnastics. And the static ads are suddenly all dating apps, "find flirty
women in your area" junk.

What do I need to watch to get out of this inappropriate category?

~~~
wcoenen
> What do I need to watch to get out of this inappropriate category

Nothing, just tap the three dots on wrongly recommended videos and then "not
interested" or "don't recommend channel". The recommendation algorithm might
need more than one data point but it will catch on eventually.

~~~
duskwuff
The recommendation algorithm can be really bad at taking negative feedback.
It'll _sometimes_ stop recommending videos from a specific source when you
give negative feedback, but once it's decided you might be interested in a
topic, getting it to stop recommending videos related to that topic feels
impossible.

The only reliable way to get it to stop recommending inappropriate content is
to delete the videos that triggered those recommendations from your viewing
history.

------
joe_the_user
This quote raises questions: _"The scraped data, as well as the YouTube API,
provides us a view of the recommendations presented to an anonymous account.
In other words, the account has not “watched” any videos, retaining the
neutral baseline recommendations, described in further detail by YouTube in
their recent paper that explains the inner workings of the recommendation
algorithm[38]. One should note that the recommendations list provided to a
user who has an account and who is logged into YouTube might differ from the
list presented to this anonymous account."_

Many discussions of radicalization talk about recommendations coming after
someone has watched a number of videos.

The claimed situation is that after N videos, the algorithm starts to see a
pattern, a niche that the user fits into, and then recommends more niche
content. Once the user watches the niche content, either still more niche
content is recommended, or a mix of niche and mainstream content is
recommended and the mainstream content is boring by comparison.

In a sense, I don't see how a blind recommendation system could escape this
situation. It seems more an inherent problem of blind recommendation systems.

What could be a better approach?

* Better education in the US and the world so people have BS detectors?

* Instead of unlabeled recommendations, the algorithm categorizes what you've seen, categorizes what it's recommending, and lets the user say which categories they want, adding or removing categories from recommendation.

* Have a more ordinary search tool and a better categorization system (Youtube is a horrible black box now: search is terrible, recommendations are terrible, and I mostly get videos from other sites).
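The niche feedback loop described above can be sketched as a toy "rich get richer" simulation. This is purely an illustration of the dynamic, not YouTube's actual algorithm: the category names and the superlinear engagement weight are invented for the example.

```python
import random

# Sketch of a "blind" recommendation feedback loop: the recommender
# weights categories by past watches, and the user watches whatever is
# recommended. The superlinear weight (count ** 2) stands in for the
# extra engagement niche content earns once a user starts clicking it.
random.seed(42)
categories = ["mainstream", "niche_a", "niche_b"]
views = {c: 1 for c in categories}  # neutral starting history

for _ in range(300):
    weights = [views[c] ** 2 for c in categories]
    recommended = random.choices(categories, weights=weights)[0]
    views[recommended] += 1  # the user watches the recommendation

total = sum(views.values())
shares = {c: round(views[c] / total, 2) for c in categories}
print(shares)  # one category ends up with nearly all the views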

~~~
pitay
Have the recommendation system mix it up: put niches that are somewhat
similar into the recommendations slightly less often, and things a bit
further away less often still. This should have the algorithm recommend
counterpoints which are clearly visible to the user.

This assumes that content that is partisan one way is similar to content
that is partisan the other way and to counter-partisan content.
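The "mix it up" idea can be sketched as distance-weighted sampling. This is purely illustrative: the 1-D viewpoint axis and the bandwidth parameter are assumptions made up for the example, not anything YouTube is known to use.

```python
import math
import random

# Sketch: instead of always recommending the closest content, sample
# with a probability that decays with distance from the user's niche,
# so adjacent (and opposing) material still surfaces, just less often.
random.seed(7)

def diversified_recs(user_pos, items, k=10, bandwidth=0.3):
    # Higher bandwidth -> flatter weights -> more diverse recommendations.
    weights = [math.exp(-abs(item - user_pos) / bandwidth) for item in items]
    return random.choices(items, weights=weights, k=k)

# Items on a 1-D "viewpoint" axis: -1 = one niche, +1 = the opposite.
items = [i / 50 - 1 for i in range(101)]
recs = diversified_recs(user_pos=-0.8, items=items)
print(sorted(round(r, 2) for r in recs))  # mostly near -0.8, a few further out
```

Tuning the bandwidth trades relevance against diversity, which is exactly the knob the comment is arguing should be turned up.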

Avoid censorship, censorship is too easily abused.

~~~
pjc50
We should not classify "refusal to ever recommend X because you know X to be
untrue or defamatory" as censorship.

~~~
golergka
It's not censorship, but this position makes you not simply a neutral
platform, but an opinionated editor.

~~~
pjc50
The platform is not and cannot be neutral; they necessarily remove porn and
copyright infringing material, plus whatever they are legally obliged to do in
various countries. An endless series of judgement calls.

Moreover, we're talking about the recommendation algorithm. The only neutral
recommendation algorithm is a random number generator. Everything else is
making a decision that video X is "better" than video Y for user Z. What it
means to be "better" is difficult to analyse, but Google themselves must be
constantly reviewing that question.

------
faizshah
Difficult to take this analysis seriously when CNN is categorized with The
Young Turks as “Partisan Left.” Regardless of what you think of either channel
the programming of these two channels is dramatically different.

The fact that the tags were both manually created and manually assigned by
the experimenters throws the results even further into question.

~~~
Thorentis
CNN and TYT are both partisan left. They both published content showing
extreme dismay at the election of Trump. That's fine, they're allowed to do
that. This just shows their political bias. The same way Fox was airing pro-
Trump things at the same time. This isn't a dispute over whether they are
right, just over what type of content they publish. I see no problem putting
CNN and TYT on the same side of the political spectrum.

EDIT: Downvotes? Give me a counter example that shows how TYT or CNN is not
left-leaning.

~~~
war1025
I've found the downvotes tend to come from those who believe "reality leans
left."

I.e. there is no "left" content, because that content is just the truth. And
any "right" content should in their view be properly labeled as "wrong"
content.

~~~
shadowgovt
Individual HNers should lack visibility into who is downvoting them. How did
you de-anonymize the downvote signal?

------
hooande
I think this study is testing the wrong hypothesis.

Their idea is that people get "radicalized" by being recommended extremist
content after watching mainstream news videos, i.e. Fox News -> 9/11
conspiracy videos. They demonstrate that this doesn't happen often.

But the real problem with youtube is _intra_ category recommendations. If
someone watches one conspiracy video, then their recommendations become all
conspiracy videos which inevitably leads to them consuming more and more far
right content.

This seems like good science and well-conducted research. I just don't know
if they had the correct view of the problem.

~~~
jansan
_If someone watches one conspiracy video, then their recommendations become
all conspiracy videos which inevitably leads to them consuming more and more
far right content._

Just in case you didn't know, conspiracy theories aren't an exclusively right
wing thing.

~~~
krapp
>Just in case you didn't know, conspiracy theories aren't an exclusively right
wing thing.

Exclusively, no. _Primarily,_ yes. At the moment, right-wing extremism is
enjoying a global surge in popularity, so that ideology currently dominates
conspiracy narratives online, and contributes to their virality.

~~~
Torches66
Alternative media is seeing a surge of right-wing content because mainstream
media is exclusively left-wing, often downright conspiratorially so.

I could turn on the TV at prime time and expose my family to the supposed
benefits of transgenderism, homosexuality, racial diversity, etc. I am
infinitely more likely to see a positive portrayal of a mixed-race lesbian
than a nationalist family man.

If you share that perspective, then that's probably wonderful. You might even
suppose I must have a serious flaw to see things differently. You need to
realise that half of everyone else sees it the other way.

~~~
krapp
> Alternative media is seeing a surge of right-wing because mainstream media
> is exclusively left-wing, often downright conspiratorially so.

Ironically, the narrative that all mainstream media is left-wing propaganda
is, itself, a right-wing conspiracy theory.

A successful one, since it's the raison d'etre for Fox News, but it's still
just a step removed from fears of "globalist elites" running a shadow
government, cultural marxism or Hillary Clinton having a kill count.

Even if there were a credible point to be made about left-wing bias in the
media (which could easily be countered by pointing out the amount of pro-
business, pro-war content in the very same media) none of that is going to
make Reddit, Breitbart or Gab more credible as alternatives.

>I am infinitely more likely to see a positive portrayal of a mixed-race
lesbian than a nationalist family man.

No, you aren't. Although I don't know (but I do suspect) what "nationalist
family man" is supposed to mean as a qualifier, or why this is presented as
the obverse to "transgenderism, homosexuality and racial diversity," the
majority of relationships portrayed in media are heteronormative, and plenty
are between Caucasians, and few portray those traits as inherently negative.

Indeed, it's still common in mainstream media to fall back on old tropes like
"killing your gays" (having homosexual or nonbinary characters die in a
narrative, which goes back to the old Hays Code and its requirements that
"sexual perversion" be punished or else never portrayed in a positive light)
camp gay characters, depraved bisexuals and "traps." While not as common now
as in the past, you're still more likely to find negative, or at least
stereotypical, portrayals of non-white races and non-heterosexual identities
than the opposite.

>You need to realise that half of everyone else sees it the other way.

Now you're trying to move the Overton window towards normalizing what is an
extremist point of view.

I am aware that plenty of people do hold such views, but it is simply false
that they comprise at least half the population.

------
Mountain_Skies
We can keep doing study after study, but the kernel of all of this seems to
be an inability of certain academics and tech industry titans to understand
that not everyone thinks like them or holds the same values. There is an
undercurrent in all of this confusion that if we can just control what others
see and hear, we can make them believe the exact same things we believe.
After all, we are right, they are wrong, so the error must be due to some
contamination of their minds by dangerous content. Eliminate access to that
content and all will be well. Banning wrongthink from YouTube might be good
for advertisers and Google's bottom line, but it's not going to make thoughts
that have been around since the dawn of humanity magically disappear.

~~~
pjc50
> we are right, they are wrong

Vaccines work; the safety risks are minimal; there is no reliable evidence
that they cause autism and the primary proponent of the theory was eventually
struck off for malpractice. Getting this wrong will cause the unnecessary
suffering and death of small children.

The earth is spherical, and this has been known since ancient times. Some
things are not a matter of opinion.

~~~
ben_w
> The earth is spherical

Well, close enough for most people.

But I think the problem is that just trying to suppress dangerous untruths
doesn’t seem to work in our society, and the greater the capacity to do so in
other societies, the more easily that capacity is abused.

I don’t have solutions for this. I wish I did, because I have no reason to
think I might be immune to untruths.

------
bjourne
I'm curious why all submissions about YouTube manipulating their
recommendation algorithm get flagged. I've submitted lots of links to HN,
most of which never get any points of course, but only those about YouTube
have ever been flagged:

* [https://news.ycombinator.com/item?id=21793498](https://news.ycombinator.com/item?id=21793498)

* [https://news.ycombinator.com/item?id=20475792](https://news.ycombinator.com/item?id=20475792)

~~~
zzzcpan
I can speculate that this particular submission might be flagged because it
goes against the anti-Youtube propaganda that some HN-popular companies, like
Mozilla, are participating in. And, you know, people love to suppress views
opposing their propaganda. But of course there might be a more specific
organized effort too.

------
motohagiography
Does Youtube radicalize? Edward Snowden made an observation during his Joe
Rogan interview that compartmentalization within the CIA is necessary so that
no person can see all the bad stuff at once and freak out at its total
enormity. As a sysadmin, he was in effect radicalized by seeing the bigger
picture at NSA that wasn't visible to other analysts.

I think Youtube shows you an analogous bigger picture that makes the dominant
media narratives seem fabricated and dishonest when viewed together, even if
individually most of it is sincere (if dumb) reporting.

The discourse around radicalization has also been abused by editors and
academics with agendas to de-normalize formerly moderate views, and who treat
the interests of regular people as beneath discourse. So it's hard to take any
discussions of radicalization seriously, even ones questioning whether it's
even a thing, as you still have to acknowledge the nonsense it has been
freighted with first.

~~~
TTPrograms
Which formerly moderate views do you believe are being de-normalized by
"editors and academics with agendas"?

~~~
dextralt
\- Authoritarian ideologies, such as fascism and communism, have no place in
our society and their proponents must be shunned

\- People should be judged (i.e. admitted to college or hired) by the content
of their character and not by the color of their skin

\- Mass import of foreign workforce by corporations is not being done in
pursuit of any high-minded ideals but to drive the wages down, and it is not
something we must embrace or celebrate

If you agree with any of the above, congratulations -- you're a fascist.

~~~
dang
Please don't take HN threads into ideological flamewar. We should resist going
there, not jump in first thing.

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

------
cirno
This applies every bit as much to Google's web search as it does YouTube
search.

I know of many marginalized people targeted by online hate groups that have
been smeared with proven-false allegations and doxxed online, and these
stalking websites are consistently ranked as the top results in Google search
and Google Images search results for their names.

Google's AI is clearly trained to recognize and promote controversy, because
it is human nature for controversy to drive engagement.

~~~
taneq
I've been convinced for a while now that Facebook tries to start fights
between people of opposite ideology. I'm sure it's just that their algorithms
have picked up that pro-skub people are highly likely to reply to anti-skub
comments in pro-skub threads (since that was the pattern I was seeing) but it
sure comes across as trying to cause drama.

~~~
pixl97
Kind of reminds me of CGP Grey's "This Video Will Make You Angry".

------
r721
Thread:
[https://twitter.com/random_walker/status/1211262124724510721](https://twitter.com/random_walker/status/1211262124724510721)

------
SpicyLemonZest
This is a pretty clever way to identify natural recommendation levels. I wish
the authors had given a bit more justification of why net impression flows are
a good metric, but I don't want to quibble too much, because it seems pretty
reasonable and it's certainly a vast improvement over what I was afraid I'd
find going in.

> However, it seems that the company will have to decide first if the platform
> is meant for independent YouTubers or if it is just another outlet for
> mainstream media.

Come on, authors, save it for the blog post.

------
Thorentis
> with slant towards left-leaning or politically neutral channels

So ... YouTube is actually politically biased? This would be an interesting
study to continue. You would need to normalise for view counts etc. since I
suspect there is a bias just in terms of numbers. The biggest channels are
more progressive, left leaning, so YouTube could appear left-biased simply by
recommending most-viewed videos.
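The normalisation suggested here is straightforward: compare channels by recommendations per view rather than raw recommendation counts, so a large channel is not counted as "favoured" merely for being popular. A minimal sketch, with figures made up for illustration:

```python
# Hypothetical counts: a big mainstream channel receives more raw
# recommendations, but per view it is recommended at a lower rate
# than the small independent channel.
channels = {
    "big_mainstream": {"recommendations": 9_000, "views": 3_000_000},
    "small_indie": {"recommendations": 1_200, "views": 200_000},
}

for name, c in channels.items():
    rate = c["recommendations"] / c["views"]
    print(f"{name}: {rate * 1000:.1f} recommendations per 1,000 views")
```

On these invented numbers the raw counts and the normalised rates rank the channels in opposite orders, which is exactly the bias the comment warns about.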

------
twic
Some other research came to a compatible conclusion:

[https://www.wired.com/story/not-youtubes-algorithm-radicalizes-people/](https://www.wired.com/story/not-youtubes-algorithm-radicalizes-people/)

------
irthomasthomas
I devised a simple test for this. After the last US election there was some
analysis and discussion of Google's youtube recommendations engine which
appeared to show a strong bias towards conspiracy videos, which in turn led to
videos that generally promoted Trump over Clinton. Now, I'm not American, and
don't care who your president is, but it was around this time that I realised
youtube's search engine had become pretty useless for me. Since then I only
use youtube for watching tutorials or technical talks (and old TV shows that
are hard to find elsewhere). I try to avoid the search engine and
recommendations altogether.

Ok so that's the why, now what's this simple test? Just search youtube for
CERN and note the mix of results. I found that most of the results could be
categorised as either pro-science or anti-science/conspiracy. In an ideal
world I would expect to see the official CERN channel at the top, followed by
sciency videos about CERN, and finally the fringe conspiracy videos about
wormholes to other dimensions and such. It's fun to use tor to see how the
results vary by country. The last time I did this was 2017, and I don't have
the data to hand, but roughly 70-80% of the results in the first few pages
were conspiracy-related.

Here's a screenshot of my results in 2017 while logged in: [https://user-images.githubusercontent.com/28928495/71557922-17e1b780-2a45-11ea-84de-93a9725dc6af.jpg](https://user-images.githubusercontent.com/28928495/71557922-17e1b780-2a45-11ea-84de-93a9725dc6af.jpg)

I just had a quick look now and my first impression is that results have
certainly changed and improved, but it's a mixed bag. I will find my old data
and update it soon.

------
XPKBandMaidCzun
I see the term radicalization being brought up. Do we serve ourselves by
focusing on superficial presentations that won't be the same in 10 years,
rather than on how similar everyone is: feeling enlightened and competitive,
and lacking consideration that their perceived "ideological adversary" is
every bit as vulnerable, needy, and scared as themselves?

How is this different from couple's therapy, just at long distances w/ groups?

None of them want to have a picnic and cooperate with each other. And who
could blame them? They both fail to recognize each other's hardships, yet
each believes it "gets" the human condition so much better than the other.
They begin by insinuating that the other group has a character flaw and is so
angry. Yet, all the while, they're the ones angry and accusing
([https://en.wikipedia.org/wiki/Projective_identification](https://en.wikipedia.org/wiki/Projective_identification))

I have a hypothesis: it's all acting out, cathartic, to blow off steam.
They're not at the soup kitchen or volunteering. They have a dysfunctional
coping mechanism stemming from earlier traumas, and it's more profitable to
captivate lonely, bored people by stirring up their existential anxieties
than to help them find common ground.

Because if people realized the common ground they shared and cooperated, they
would start to pass laws and regulations to make healthcare, employment,
housing, and education fairer for legal persons. The whole concept of
political sides is a sham: they are all legal persons with
[https://en.wikipedia.org/wiki/Maslow%27s_hierarchy_of_needs](https://en.wikipedia.org/wiki/Maslow%27s_hierarchy_of_needs).

If common sense stuff isn't being fixed and people are bikeshedding: the fix
is in, people in suits are giving each other high fives and laughing at you.
You're being suckered into squandering your political rights (you voted, or
can!) to take worthless, symbolic digs at people rather than get what you
need, better laws for the practical issues everyone shares. Hint: they tend to
be boring.

~~~
nothingtos33
Youtube recommended this to my son:
[https://youtube.com/watch?v=4LfJnj66HVQ](https://youtube.com/watch?v=4LfJnj66HVQ)

Please watch it and tell me what you think of it.

~~~
fruffy
What do you mean? Gucci Gang is a pretty well known hip-hop track liked by the
"youth". I am not surprised it was recommended to your son since most of his
peers probably have watched it.

------
minimaxir
It should be noted that this research was likely published with a specific
agenda in mind:
[https://twitter.com/mark_ledwich/status/1210743217982803970?...](https://twitter.com/mark_ledwich/status/1210743217982803970?s=21)

> My new article explains in detail. It takes aim at the NYT (in particular,
> @kevinroose) who have been on myth-filled crusade vs social media. We should
> start questioning the authoritative status of outlets that have soiled
> themselves with agendas.

From the linked Medium article:

> These events, along with the promotion of the now-debunked YouTube “rabbit
> hole” theory, reveal what many suspect — that old media titans, presenting
> themselves as non-partisan and authoritative, are in fact trapped in echo
> chambers of their own creation, and are no more incentivized to report the
> truth than YouTube grifters.

The paper itself makes the fundamental error of using _logged-out
recommendations_ and attempts to disprove claims about the 2018 algorithm
even after the substantial changes since, which invalidates the research
entirely.

~~~
fastball
Less of an agenda than the aforementioned NYT article.

Sure, this research doesn't use 2018 algos, because it's 2019. I guess the NYT
and others should've done the research in the first place then. It certainly
doesn't invalidate the research - it shows that the claims made by others are
probably not accurate in 2019 on YT.

~~~
metamet
I believe the 2019 algo was dramatically changed recently, meaning that the
core of the 2018 algo was present throughout the majority of 2019.

------
throwaway294072
The far-right radicalization effect is well known.

You can test easily with a dedicated account.

[https://www.thedailybeast.com/how-youtube-pulled-these-men-down-a-vortex-of-far-right-hate](https://www.thedailybeast.com/how-youtube-pulled-these-men-down-a-vortex-of-far-right-hate)

[https://www.theguardian.com/media/2018/sep/18/report-youtubes-alternative-influence-network-breeds-rightwing-radicalisation](https://www.theguardian.com/media/2018/sep/18/report-youtubes-alternative-influence-network-breeds-rightwing-radicalisation)

[https://www.cjr.org/the_media_today/youtube-conspiracy-radicalization.php](https://www.cjr.org/the_media_today/youtube-conspiracy-radicalization.php)

[https://www.youtube.com/watch?v=2Nrz4-FZx6k](https://www.youtube.com/watch?v=2Nrz4-FZx6k)

[https://www.youtube.com/watch?v=P55t6eryY3g](https://www.youtube.com/watch?v=P55t6eryY3g)

