
YouTube's tendency to push everyday users toward politically extreme content - anigbrowl
https://www.nytimes.com/2018/09/07/world/europe/youtube-far-right-extremism.html
======
dang
This is a substantive article. We've changed the title to a representative
sentence from the main text, in the hope of not plummeting straight into
national-race-ideology war. If you post in this thread, please don't post that
kind of thing. It just leads to predictable flamewars.

Instead, follow the site guidelines. They include "_Comments should get more
civil and substantive, not less, as a topic gets more divisive_," "_Eschew
flamebait_," and "_Please don't use Hacker News primarily for political or
ideological battle._"

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

------
b0rsuk
This was addressed more broadly in "We're building a dystopia just to make
people click on ads" by Zeynep Tufekci on TED.

[https://news.ycombinator.com/item?id=15891014](https://news.ycombinator.com/item?id=15891014)

She remarked that Youtube's recommendation system invariably pushes people
towards more extreme views on _anything_ , not just politics. Interested in
vegetarianism? Here's something about veganism. She has a hypothesis it works
this way because the goal is to make people spend as much time on a site as
possible, and feeding people progressively more extreme content works for
that.

My comment: what does it say about us humans that we fall for progressively
more extreme stuff? Perhaps that few people actually hold balanced views, and
that nowadays compromise is looked down upon as a mark of weak character?

~~~
dragonwriter
> She has a hypothesis it works this way because the goal is to make people
> spend as much time on a site as possible, and feeding people progressively
> more extreme content works for that.

While I understand the attractiveness of conspiratorial hypotheses, I think
there is a simpler explanation that applies to recommendation systems
generally: such systems are designed, overtly, to identify content the user is
likely to react positively to, based on the information they have about
population-wide preferences.

When a system knows nothing (or nearly nothing) about your personal
preferences, the only thing it can do is recommend the most widely accepted
content. As it learns more about your preferences, it has more basis to
predict things that meet your preferences but are less widely supported in the
population at large—so, over time, it naturally recommends more content that
is extreme.
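The narrowing dynamic described above can be sketched in a few lines of Python. All data and weights here are invented for illustration: a global-popularity prior that fades as the system accumulates history about the user.

```python
from collections import Counter

# Each video: (name, global_popularity, topic) -- all values made up.
VIDEOS = [
    ("mainstream_hit", 0.90, "general"),
    ("niche_deep_dive", 0.10, "metal"),
    ("fringe_video", 0.02, "metal"),
]

def recommend(watch_history):
    """Blend a global-popularity prior with affinity to the user's history."""
    topic_counts = Counter(topic for _, _, topic in watch_history)
    n = len(watch_history)

    def score(video):
        _, popularity, topic = video
        # Personal affinity: fraction of the user's history in this topic.
        affinity = topic_counts[topic] / n if n else 0.0
        prior_weight = 1.0 / (1.0 + n)  # the prior fades as history grows
        return prior_weight * popularity + (1 - prior_weight) * affinity

    return max(VIDEOS, key=score)[0]

print(recommend([]))               # cold start: the most popular video wins
print(recommend([VIDEOS[1]] * 3))  # niche history: niche content wins
```

With no history the system can only rank by global popularity; after just three niche views, the niche item's score (0.775) overtakes the mainstream hit's (0.225).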

~~~
finnthehuman
So, rather than being a deliberate choice of Google, it's simply an emergent
phenomenon of a system Google expressly designed to manipulate people?

How is it not a hundred times more terrifying than the conspiracy theory, that
google chooses to keep such a system online?

~~~
dragonwriter
Oh, it's far more terrifying that the natural result of being good at feeding
people things that you can determine that they will respond well to is that
you will progressively feed them more extreme content, because tragedy of the
commons is much scarier than mustache twirling villains.

~~~
2bitmachine
At the point when you recognize this and then choose to proceed apace, you
become a “mustache twirling villain.” YouTube crossed that threshold a long,
long time ago.

------
joe_the_user
Just anecdotally, Youtube's recommendation system seems effectively self-
referential. On virtually any subject I look at, recommended videos seem to
cycle between just a few themes or even just a few specific videos. I think
there's a logical reason for this.

I imagine for a given topic you wind-up with:

    
    
        [broad video] related to 
        [other broad but unrelated video] plus [extremist video].
        And
        [extremist video] related to  
        [other extremist video only]
    

Since youtube's recommendation system is entirely naive, once you choose
[extremist video], that is the video that gives the system the more specific
clue and thus [other extremist videos] will be what's recommended.

It's a function of naive recommendations as such. Suppose a system knows that
X likes two videos: one that the entire population likes, and one that only
"metal heads" like. What can it recommend? Metal is the only sensible thing.
And if
the system then shunts many people to metal and they seem to like it, metal
will count even more as a logical recommendation.
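Under assumptions like those above (the graph below is invented), a random walk over "related videos" gets absorbed by the cluster that has no outbound links:

```python
import random

# Invented relatedness graph with the asymmetry described above:
# broad videos link to both clusters, extreme videos only to each other.
RELATED = {
    "broad_a":   ["broad_b", "extreme_a"],
    "broad_b":   ["broad_a", "extreme_b"],
    "extreme_a": ["extreme_b"],
    "extreme_b": ["extreme_a"],
}

def random_walk(start, steps, rng):
    """Follow uniformly random 'related video' links for a fixed click count."""
    current = start
    for _ in range(steps):
        current = rng.choice(RELATED[current])
    return current

rng = random.Random(0)
endings = [random_walk("broad_a", 20, rng) for _ in range(1000)]
share_extreme = sum(e.startswith("extreme") for e in endings) / len(endings)
print(share_extreme)  # ~1.0: the extreme cluster is absorbing
</ ```

Each step from a broad video has a 1/2 chance of crossing over, and there is no path back, so after 20 clicks essentially every walk ends in the extreme cluster.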

~~~
jdminhbg
I see the same thing on a completely uncontroversial subject: music videos. If
I watch e.g. a Beatles video, the recommendations aren't [Rolling Stones,
Kinks, Badfinger, ...], they're always [Beatles, Beatles, Beatles, ...]. I
don't know if it's naive and bad or if it's actually good for engagement and
I'm just an outlier in what I expect to get.

~~~
gboudrias
I only wish I would get songs by the same band... It sort of seems like
Youtube tries its darndest to steer me away from the current band. I'd be
happy if it just cycled between hits from the same artists, but I've never
seen it do this recently.

I think what's most frustrating about it is I don't know what game they're
playing. Recommendations used to be logical, now they're almost a waste of
screen real estate. What even happened?

~~~
blihp
I've noticed the same in my usage across almost all google services at this
point. My speculation is that this is due to them now being so heavily
dependent on neural networks these days vs the previous mix of algorithms. The
old approaches seemed to be better at not presenting false positives /
creating a ridiculous mish-mash of previous trains of thought in an effort to
recommend something, anything... no matter how wrong.

For example: older versions of the google keyboard were pretty good about not
recommending/autocorrecting non-words. So I'd type things like
function/variable names (which often use camel case) and it would either
suggest valid English words or perhaps a non-word used earlier in the message.
These days, my use of camel case names in technical emails spills over so that
emails to family now pop up with bizarre camel case recommendations. I'm seeing
similar things happening to YouTube recommendations and elsewhere.

------
abhiminator
> Users searching for news on Chemnitz would be sent down a rabbit hole of
> misinformation and hate. And as interest in Chemnitz grew, it appears,
> YouTube funneled many Germans to extremist pages, whose view counts
> skyrocketed.

As a long time YouTube user, I've noticed this myself.

The starting video doesn't even have to be extreme. For example, watching a
5-minute daily news video on Fox News channel primes YouTube's super sensitive
[$] algorithm to nudge me toward channels with considerably more ideology-
driven content (like 'PragerU,' for example). Watching the next auto-
recommended video primes the algorithm even further, nudging me toward
channels with even more extreme positions ('1791L,' 'Computing Forever,' for
example). Rinse and repeat a few more times and you end up on channels that
serve as a gateway to radicalization -- channels whose names I'm not
comfortable mentioning.

YouTube's algorithm needs some urgent tweaking, especially to its topic/video
recommendation sensitivity. It gives more weight to recently watched content,
which is fine, but should not be at the cost of previously watched content by
the user.

[$] When I talk about YouTube's 'super sensitive' algorithm, I mean to say how
watching even a single video on a particular subject primes it to keep showing
me similar videos everywhere -- from homepage to recommendations to the auto-
play ones.

P.S. The aforementioned example applies equally to channels on the opposite
end of the political spectrum.

~~~
benbenolson
1791L? How is that an "extreme position?"

~~~
claydavisss
Right... very few people seem able to differentiate "radical" views from views
they merely oppose.

What does "radical" even mean? Opinions that are not part of the present
consensus? How is society supposed to evolve?

A society that cannot express "radical" views with society's support may
eventually express these ideas violently. The downvote button only works for
so long.

~~~
Retra
And what facts do you base these assertions on?

------
ericdykstra
This article seems a bit sensationalized. The examples they give are a single
video with 500,000 views, and a couple of channels whose views average 20-30k.
If YouTube is recommending "extreme" channels so often, why aren't they
getting more views?

> “Lies, propaganda and manipulation are harmful for society, but on their own
> are not illegal — and so our hands are often tied,” said Mr. Ipsen, of the
> government-linked internet monitor.

Good. I think it's insane that the German government is even trying to
establish a Ministry of Truth.

It seems now more than ever, we are getting biased reporting from mainstream
media outlets, and I guarantee that any government crackdown would not include
these mainstream news sources.

Here's an example from just a couple of days ago: CNN's Chris Cuomo as well as
a number of major news outlets reported on Judge Brett Kavanaugh "snubbing"
Fred Guttenberg, father of a victim of the Parkland shooting, by not shaking
his hand at the event. None of these outlets bothered to mention that the day
before on Twitter, Guttenberg said:

"I will be at Kavanaugh hearings and I hope to play a role in ensuring that
this man does not become the next Supreme Court Justice."
([https://twitter.com/fred_guttenberg/status/10366004607261614...](https://twitter.com/fred_guttenberg/status/1036600460726161408))

They also didn't mention that Guttenberg was the only one trying to approach
Kavanaugh, or that Kavanaugh's kids were removed from the room for safety
concerns shortly prior. That seems like manipulation to me.

I don't say this because I want CNN banned, or anything like that. I'm just
completely against any censorship, especially government censorship, of ideas.
I believe in the liberal values of free exchange of ideas (or "Liberal
Science" as Jonathan Rauch calls it, in his excellent book, _Kindly
Inquisitors_ ).

~~~
ehsankia
Absolutely. Those numbers are peanuts at YouTube scale. Popular YouTube videos
regularly get 10m+ views. It's similar to back when WSJ went digging for days
and managed to find 2-3 racist videos with 10k views, then refreshed them
until a Coca-Cola ad showed, to screenshot it. Most of these are manipulating
reality to drive their narrative. I've yet to see a proper scientific and
thorough example of this actually happening.

~~~
ericdykstra
Yeah, just plugging the "far-right figure on YouTube" that they highlight
under the heading "From fringe to mainstream" into Social Blade, you can see
just how "mainstream" this guy is:
[https://socialblade.com/youtube/channel/UCYo2MjFS5C7Wynyty9w...](https://socialblade.com/youtube/channel/UCYo2MjFS5C7Wynyty9wTM7g)

Subscriber rank: 347,946th

Video view rank: 680,665th

If YouTube is promoting this content so much, at the very least you would
expect the video view rank to be higher than the subscriber rank, right?

------
Alex3917
See also Danah Boyd's talk on how media literacy classes push some students
toward extremism. The gist is that schools have been promoting programs where
students are taught not to take what they see in the media at face value, and
to go Google/YouTube it for themselves. Which, because of the sorts of things
discussed in this article, results in a certain percentage of students
becoming radicalized and basically (or literally) becoming nazis.

[https://www.youtube.com/watch?v=0I7FVyQCjNg](https://www.youtube.com/watch?v=0I7FVyQCjNg)

~~~
chriswarbo
I suppose there's a need for "Internet literacy" (or "Google literacy"). One
rule I instinctively follow is, if I find some claim surprising, e.g. "there
was more X under president Y, than president Z", I will search for something
like "X USA graph", look for a source I recognise (and hence know the
credibility and biases of), and see if the data supports either or neither of
those alternatives.

I will specifically _not_ search for "X president Y" or "X president Z"
because that will be priming the algorithm to confirm my biases.

------
ilamont
Google News works the same way. I've been using Google News since it was a
list of links to news sites. The algorithm is very good at surfacing breaking
news, but can't help showing the most extreme clickbaity BS among the legit
stories.

Interested in stories about Ocean Science? We'll show you "Holy S***, These
Triassic Ocean Reptiles Were As Big As Hell"

[https://twitter.com/ilamont/status/983714720153612289](https://twitter.com/ilamont/status/983714720153612289)

We see you read about NASA and space flight. Then you'll love this story about
a Russian boy who says 7-foot aliens reside on the red planet!

[https://twitter.com/ilamont/status/929392451193835520](https://twitter.com/ilamont/status/929392451193835520)

Or fossilized alien footprints on Mars, covered up by NASA!

[https://twitter.com/ilamont/status/970814193631981568](https://twitter.com/ilamont/status/970814193631981568)

Trying to get an update on news about China and foreign relations? Let's show
you a "news" headline that suggests WWIII is about to break out:

[https://twitter.com/ilamont/status/1037765753162919937](https://twitter.com/ilamont/status/1037765753162919937)

~~~
ajross
Two criticisms: one old tweet ( _edit: parent added extra tweets after I
posted this -- and... I can't find them when I do equivalent searches
anywhere near the top of the page. ilamont appears to have some kind of axe to
grind here_) doesn't carry much weight. A current search for "ocean science"
on news.google.com in fact looks pretty much like what you'd expect.

Secondly: sorry, what exactly is wrong with a sensationalized headline about
interesting content? Some of those animals were, in fact, "big as hell", and
it's worth covering in popular media. This appears to be the article in that
screenshot:

[https://gizmodo.com/holy-shit-these-triassic-ocean-reptiles-...](https://gizmodo.com/holy-shit-these-triassic-ocean-reptiles-were-big-as-he-1825106733)

And it's a good article! It's a writeup of this paper, and it even links to
it:

[https://journals.plos.org/plosone/article?id=10.1371/journal...](https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0194742)

So... what's your criticism here, exactly? That Google put a good story up on
their front page even though it had "shit" in the title?

------
mrow84
Assuming the youtube recommendation algorithm is mostly just "people who
watched this also watched", then couldn't the asymmetry in the recommendation
graph structure simply be a relatively accurate reflection of asymmetry in the
underlying data?

It seems plausible to me that the probability that people who hold "extreme"
views will watch "moderate" content is lower than the probability that people
who hold "moderate" views will watch "extreme" content. I haven't run any
experiments, but it seems to me that a difference in probabilities like this
would induce the kind of "closed group" structure being discussed on the
resulting recommendation graph. The low number of steps to the closed groups
shouldn't be that surprising, given previous famous results about degrees of
separation in these kinds of graphs.
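A two-state Markov chain makes this hypothesis concrete (the crossover probabilities below are made up): if moderate viewers sample extreme content more often than extreme viewers sample moderate content, the long-run share of views in the "extreme" state dominates even when both crossover rates are small.

```python
# Made-up crossover probabilities for a two-state (moderate/extreme) chain.
P_MOD_TO_EXT = 0.10  # per step, a "moderate" viewer clicks an extreme rec
P_EXT_TO_MOD = 0.02  # per step, an "extreme" viewer clicks a moderate rec

def stationary_extreme_share(p_me, p_em):
    """Stationary probability of the 'extreme' state in the two-state chain."""
    return p_me / (p_me + p_em)

print(stationary_extreme_share(P_MOD_TO_EXT, P_EXT_TO_MOD))  # ~0.83
```

A 5:1 asymmetry in crossover rates yields roughly 83% of long-run time in the extreme state, without anyone designing for that outcome.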

------
CynicalDio
It's quite bizarre, the way YouTube works. I will watch a video that covers
something like climate change, and immediately I am bombarded in my
recommendations with left-leaning news clips and stories. The same goes for
when I watch a video that could be considered conservative: right away I am
suggested videos like "Professor puts liberal sjw student in her place -
compilation."

~~~
ahelwer
Watching any video involving comedian Bill Burr will immediately get you anti-
feminist and even white nationalist video recommendations. It's just weird how
close that cliff is to fairly mainstream videos.

------
dwaltrip
I use incognito mode for probably close to half of the YouTube videos I watch,
as a defense against the algorithm. That technique, plus freely using the "not
interested" button, has actually resulted in a pretty good experience.

~~~
pasquinelli
I actually _want_ youtube to recommend me videos, because it recommends me
stuff i'm interested in a good deal of the time. I can keep it fairly clear of
toxic content in the same way i can keep my garden fairly clear of weeds:
constant vigilance.

~~~
dwaltrip
Yeah, perhaps I wasn't completely fair. For some of the content I watch, I'm
quite happy with what they recommend. But as you said I have to be constantly
vigilant with the rest.

------
bborud
In 1999 I was working for a web search company and one of the things we really
wanted to do was to make search personalized. Eventually we started talking
about what years later became known as "filter bubbles". It would be bad if
people were constantly fed stuff from the same information universe, right?

In retrospect, it didn't get all that much of our attention in terms of work
done. Sure, we got very obvious (and measurable) feedback mechanisms and we
did spend considerable amounts of time countering the aspects that more
directly affected perceived quality of the product. But in terms of what it
means, we figured "we have time to deal with it later".

Just figuring out how to do this inside a few milliseconds per query was more
than enough of a challenge in the near term. Since then (and probably before
then) I'm sure thousands of teams have had roughly the same experience. You
know that there are going to be problems ahead, but you build it anyway
because if you don't someone else will. Your paycheck depends on it.

When you implement software that anticipates the preferred path a human likes
to take through content, you end up amplifying whatever tendencies are present
in the person. This is "harmless" in the sense that you are not imposing some
editorial direction. You can't be criticized for imposing your opinion or
notions of good/bad, right/wrong etc. However the results appear to be
polarizing in that they allow people to navigate to certain maxima.

The really hard problem here is how you counter this effectively without
imposing your view. I think there's interesting work to be done in that area.

------
fareesh
The international corporate media has not been honest in its reporting of the
refugee situation in Europe. This is becoming clearer year by year, because
some of the facts that were earlier dismissed as racist rhetoric are now
becoming very difficult to ignore. For example earlier this year, Chancellor
Merkel finally admitted that there are "no-go" zones in the country where even
the police dare not go. This is an extremely serious problem.

People flock to this content because they are not getting the whole truth from
the corporate media, and that is the root of this problem. YouTube's content
would not be compelling if these things were being reported honestly, and
given the same degree of volume as other important stories.

In my country, India, there are plenty of places in which the value systems
are completely at odds with ideas of gender equality, religious tolerance,
racial equality, and LGBT equality. If a western country were to create
pockets of populations where people held these values, it would be extremely
incompatible with the norm in the societies that they have achieved today. As
a result, these places would be radically transformed into incredibly
dangerous areas, particularly for women and LGBT folks. It is not bigoted to
point this out, it is true because this reality exists in our country, and it
is one that we are aware of and grapple with. Of course it will exist in
another country if a large enough number of people who hold these ideas are
imported en masse. If these values existed in Syria or Sudan or other parts of
the world where refugees are coming from, of course this will become a problem
for the country that is resettling them. This was a valid criticism that did
not get a fair enough deal from the media, and the consequences have been life
altering for many Europeans.

The dream of multicultural societies where everyone lives together in harmony
requires everyone to buy into it. If there are bad actors, and others are
aware of this, it does not make them bad actors too. The media seems to have
been hesitant, perhaps because of fears of stoking xenophobia by providing it
with legitimate information that could be compelling for their cause. There is
never a good reason not to tell the truth about these things; this is the
result of what happens when you go down that path. At this point, many are
convinced that the news media had sinister, ill-intentioned motives for
handling it this way, and as a result trust in the news media too will decline
rapidly, in favor of rag-tag YouTube operations.

~~~
erikpukinskis
It can be true and racist at the same time.

If you never care about rapes except when it’s a dark skinned perpetrator in a
white dominated country, then you are both racist and correct that rape
amongst refugees is an important issue.

~~~
fareesh
No, it's just true. You can take a fact and use it to support a larger racist
worldview composed of other facts, many of which may be false, contributing to
a poorly reasoned out illogical perspective, or one that is lacking key
details.

Or you can simply take that fact and look at it in isolation, or in
combination with true statements, and all the necessary key details, and form
a conclusion that does not involve bigotry.

What you are talking about is what people take away from the story. If the
media is trying to control what people take away from the story, by not
publishing it, because it could potentially help racists further ratify an
already flawed worldview, then that is actually having the opposite effect
right now, and it is a poor, unethical strategy. The media should report the
facts, and should report on important issues with a fair amount of coverage,
the volume of which is unaffected by agenda.

~~~
erikpukinskis
How about this: it’s both true, and with the addition of other information
about rapes the individual doesn’t notice, is evidence of their racism?

------
benjaminjackman
This is my largest gripe with these recommendation systems. They are
patrimonious and regressive and not aspirational in any sense. What do I mean
by that? I mean that they are built with a sacrosanct attitude that the
_algorithm_ knows best (so maybe "algorimonious").

They allow no ability for the user to control and guide the process, to mold
what they consume to be the person they aspire to be.

More concretely I _never_ _ever_ want a recommendation of “late night talk
show interviews celebrity person” even though I am somewhat likely to click on
that, and waste my time on hollow vapid content, and tell the algorithm to
regressively drag me down towards that content.

I have gone so far as to go to each of those channels and ban them, but that
doesn't stop the video recommendations from randomly popping up in the
sidebar, so I think it must just block those channels' ability to comment on
things I post.

What I would prefer overall:

An ability to guide and control the recommendations, to the point of writing
my own code that runs in a Google cloud function, because everyone will be a
bit different in what they'd like to see. Personally, I'd then write something
that flat out blocks certain keywords and attempts to show sidebar
recommendations the way they used to: closely mirroring the current video with
very little, if any, skew towards the current user's history.
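A user-side filter of the kind described here could start as simple as the sketch below. The blocklist, candidate titles, and similarity scores are all invented for illustration; real candidates would come from whatever recommendation data the platform exposed.

```python
# Hypothetical user-written filter: drop recommendations whose titles match
# a personal blocklist, then rank the rest by similarity to the current video.
BLOCKED_KEYWORDS = {"celebrity", "late night", "compilation"}

def filter_recommendations(candidates):
    """candidates: list of (title, similarity_to_current_video) pairs."""
    allowed = [
        (title, sim) for title, sim in candidates
        if not any(kw in title.lower() for kw in BLOCKED_KEYWORDS)
    ]
    # Highest similarity to the *current* video first, ignoring watch history.
    return [title for title, _ in sorted(allowed, key=lambda p: -p[1])]

recs = [
    ("Late Night host interviews celebrity", 0.9),
    ("Lecture on graph theory, part 2", 0.8),
    ("Graph theory problem set walkthrough", 0.7),
]
print(filter_recommendations(recs))
```

The late-night clip is dropped despite being the highest-engagement candidate, which is exactly the control the sidebar currently denies users.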

So I think the issue we face here is one of limited user control, users are
not being given a voice or meaningful input to the algorithm that determines
what content they see. Instead the only input they are given is their own
watch history.

Right now YouTube is a buffet. It's the only buffet in town. It loads the bar
with all the junky food people are statistically most likely to eat, whether
they want that food sitting there tempting them or not, and whether it's even
good for them or anyone or society as a whole or not. It doesn't give them any
ability to say "no French fries" at the buffet. If you put French fries on
your plate before, and they observed you eat the French fries, by golly,
you're getting French fries at that buffet from there on out, mental waistline
be damned.

------
saagarjha
It's repeatedly emphasized in the article that the recommendation algorithm
gives viewers "what they want to see" and what has high engagement.
Unfortunately, these criteria seem to push towards fringe and extreme content,
because that's what people are interested in watching and what provokes them
to react. YouTube, on its part, probably does not intend for it to be this
way, but seeing as its goals are the same (albeit to maximize ad revenue using
these guides), I'm not optimistic about this being fixed unless there's a
massive rethink of how to surface content to users.

~~~
goes
The fix is not complicated.

They have to play with the control variables producing the feedback loops.

People and content creators are influenced by the view/like/dislike count.
Instant feedback through these counts produces all kinds of unintended
consequences.

Instant feedback is overrated and unnecessary. It's like constantly getting a
live upvote/downvote from your spouse or boss or prof on everything you do.
What kind of behaviour does anyone think this produces? In such a real-time
conditioning environment, people turn into rats in a B. F. Skinner experiment.

The Russians or any bad actor don't have to create content anymore. Just find
a lunatic and encourage/prop them up with views and upvotes. YouTube will then
take care of distribution triggered by the view and like counts.

Hide or delay the counts to both the creator and viewer, and the world changes
overnight.

The experiment can be run on HN. Hide the upvotes on certain controversial
threads and see what kind of conversations happen.

------
lsmarigo
As a content creator, it very quickly becomes apparent that the best route to
driving views and subs is to push extreme/edgy content. Balanced content gets no
engagement. It's really that simple, with the ADHD short attention span of the
current gen of heavy media consumers (read: kids) you have to shock
them/surprise them/make them laugh ~once every 5-6 seconds to keep them on the
video. YouTube offers very detailed metrics on engagement and you can study
the exact time where people lose interest/close the video.

Only massive established channels have the privilege of pushing non-edgy
content as their audience is built in already, for anyone new it's shock value
and hard-line extremism or your voice will be lost among the noise.

~~~
pasquinelli
Yeah, but if you are shocking and edgy, is your _voice_ preserved? Sure,
people may watch but it's in one ear and out the other, a swell in the sea of
noise.

I'm sure most youtubers don't care, but viewer engagement is not the same as
being a notable voice.

------
paradite
For people who are new to this topic and find it to be surprising: This is not
new. There is a term for this called "echo chamber":

[https://en.wikipedia.org/wiki/Echo_chamber_(media)](https://en.wikipedia.org/wiki/Echo_chamber_\(media\))

I have observed this trend happening on the Internet since 2014 and have
written about it in my blog:

[https://paradite.com/2014/11/11/internet-polarized-society/](https://paradite.com/2014/11/11/internet-polarized-society/)

Previously it was more confined to closed communities like reddit and Facebook
groups, but it seems now to be substantial on open platforms as well.

~~~
0x4f3759df
I've only ever heard the term 'echo chamber' as a pejorative against people
who hold anti-establishment views, i.e. 'Hillary lost because middle America
is stuck in right wing / alt-right echo chambers.' It appears to be a
derivation of the 'conspiracy theorists are mentally ill' argument, or 'Yes,
people hold these views, but it's not their fault; choose one (mental
illness/echo chamber).'

------
stareatgoats
The west has long lived by the rule of laissez-faire: wait until something bad
happens, and only then try to fix things up. It has been effective, albeit
somewhat risky. We are still in the early, laissez-faire days of the internet,
meaning we have a lot to learn about funneling the vast sea of information
responsibly. Maybe we'll learn it eventually, without sacrificing core
democratic values like freedom of expression in the process.

This problem in particular seems to be caused by the need for ads (since the
recommendation engine seems to be geared towards clicking on ever new videos).
Maybe a solution to this and similar issues that currently turn the internet
into a sewer is to promote a 'responsible ads policy' for the internet that
puts the initiative in the hands of the user: 1. no recommendations alongside
the main content (only under a separate filterable view), 2. no ads alongside
the main content (only under a separate filterable view). 3. No ads in the
middle of main content (i.e. in the middle of a video, only in a separate
filterable view).

It may seem idealistic, but with small increments of public momentum it
_could_ eventually make a dent. I don't see any other way, really.

------
garysieling
I got frustrated with the Youtube recommendation system, and built a search
engine of videos I scraped:
[https://www.findlectures.com](https://www.findlectures.com)

I'm relying primarily on trusted recommendations - if someone I trust likes a
speaker, I'll index channels where they speak. This doesn't scale well, but it
has made for a tool that works really well for doing research.

~~~
le-mark
Have you written about the challenges involved in this? Like, does YT allow
it? What do you index exactly? The implementation? It would be very
interesting reading. There used to be a few YT aggregators around. I think the
most successful was whatever became YouTube Kids (someone correct that please).

------
Reason077
It's not just politically extreme content it does this for. You watch _one_
video about how NASA supposedly faked the moon landings and suddenly it's
trying to get you to watch all kinds of wacko conspiracy stuff.

~~~
bobwaycott
Right. The point of the article—and many of the comments here—is that YT is
always pushing people toward an _extreme version of X_, whatever _X_ may be.

------
pendenthistory
It just struck me that this explains a lot of changes I've seen in people near
me and myself. My mother started getting really into stuff like essential
oils. My dad (non-american) spends a crazy amount of time watching videos and
reading news about how crazy Trump is (yeah, he IS, but we don't even live in
America!). My sister got really into nutrition and diets, with a bent towards
complete suspicion of western medicine. My wife, while not a full on anti-
vaxxer yet, is now suspicious of vaccinations and wants to wait to vaccinate
our child, and skip some of them altogether. This after having watched some
"documentary" series on Youtube.

I also seem to have become more extreme in what I'm willing to try to improve
my health, personal finance, mental abilities, etc. I'm not sure if
all of this is due to Youtube specifically, but definitely due to the internet
in general. However, I do believe that for many topics, the center/average
view is usually wrong. For example, the average views on things like diet,
exercise, personal finance, and materialism are just bad. The average Joe eats
like crap, rarely exercises and has less than a few thousand dollars in
savings/investments. Therefore, I'm starting to wonder what else out there I
might be doing wrong that I haven't even considered yet.

~~~
dnomad
> My wife, while not a full on anti-vaxxer yet, is now suspicious of
> vaccinations and wants to wait to vaccinate our child, and skip some of them
> all together. This after having watched some "documentary" series on
> Youtube.

Out of curiosity is your wife college educated?

It's the radicalization of "average" people that is most interesting. I
suspect what your wife discovered wasn't just a series of videos but a whole
_community_ of people deeply committed to this nonsense. This is the secret of
radicalization: it appears as just a (very rigid) community of like-minded
souls. This shows up again and again in the literature and the news.

The internet is doing its job of bringing people together. What's not clear is
whether any of these communities are actually beneficial.

------
ardy42
> Perhaps most striking is what was absent. The algorithm rarely led back to
> mainstream news coverage, or to liberal or centrist videos on Chemnitz or
> any other topic. Once on the fringes, the algorithm tended to stay there, as
> if that had been the destination all along.

I wonder if part of the problem is that the "news coverage" on YouTube is
mostly amateur. People in the mainstream don't feel the need to produce
amateur news coverage, since that's done professionally. However, people on
the fringes produce and consume vast quantities of amateur news content,
because there's so little fringe content that's professionally made. That
leads to a large, focused signal that YouTube's algorithms have trouble
filtering out, because they're too coarse and they confuse activity for
quality.

------
slothtrop
With a clean slate (no cookies stored), the front page can get as ridiculous
as the recommended videos. For half the music out there, YT will recommend
anything from chemtrail videos to weird Christian conspiracy videos to aliens
to all manner of tinfoil-hat shit. It's vile.

------
ajuc
I regularly view videos on Polish politics and geopolitics. I am constantly
suggested videos by fringe extreme right geopoliticians and conspiracy
theorists ("it's all a Jewish conspiracy to destroy white race and Poles", "EU
is USSR", "muslims will rape us", that kind of videos - Michałkiewicz, Braun,
Kowalski and similar). I regularly click "don't like this suggestion, stop
showing me it", and I'm still suggested it after a few days/weeks. It seems
there's endless supply of these videos and they have hundreds of thousands of
views.

It's like youtube wants me to become a right-wing nut. That despite me mostly
watching mainstream news and liberal news-as-comedy shows.

My coworker is a right-wing extremist and he gets suggested the exact same
videos; he never sees liberal or mainstream news in his YouTube suggestions.

------
dprescott
I was a casual youtube watcher. I am talking 2012 style fail videos, music
videos, and broad areas of interest.

Now I can not stand YT. 1st, they have clearly decided that they want to
compete with IG, as their 'Trending' section is nothing more than influencers.
2nd, the 'Trending' section is always loaded with liberal, nonsensical,
clickbait videos and artists of no interest to me.

I am clearly no longer their target demo.

~~~
slig
Their trending section is much like TV: content built for the masses. Log in
and like videos that you enjoy, and you'll have a personalized recommendation
page that's excellent, in my opinion.

------
DennisP
This seems like it could be inherent to recommendation systems. How would one
go about making a recommendation system that doesn't have this effect?
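One commonly discussed mitigation (not anything YouTube is known to use) is to
re-rank candidates so that each pick is penalized for similarity to what the
user has already watched, i.e. maximal marginal relevance. A minimal sketch,
assuming items are represented as plain embedding vectors; all names and data
here are hypothetical:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def mmr_rerank(candidates, relevance, watched, k, lam=0.7):
    """Greedy maximal-marginal-relevance re-ranking.

    candidates: {item_id: embedding vector}
    relevance:  {item_id: engagement-based score}
    watched:    embeddings the user has already seen
    lam:        trade-off knob; lam=1.0 reproduces pure engagement ranking
    """
    chosen = []
    seen = list(watched)
    pool = dict(candidates)
    while pool and len(chosen) < k:
        def score(item):
            emb = pool[item]
            # Redundancy = similarity to the closest already-seen item.
            redundancy = max((cosine(emb, s) for s in seen), default=0.0)
            return lam * relevance[item] - (1 - lam) * redundancy
        best = max(pool, key=score)
        chosen.append(best)
        seen.append(pool.pop(best))
    return chosen
```

With lam below 1.0, an item nearly identical to one already chosen gets
demoted in favor of something different, which is exactly the "feeding you
more of the same, only stronger" loop this thread is describing, inverted.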

------
mattbierner
Say a perfect recommendation system existed. Should it give me what it knows
I'll like or challenge me? Should it give me what I ask for even if it's
wrong? Should it aggregate information for me to judge or analyze the content
and judge it for me? Is it optimized for me? For the company? For what's best
for humanity? And who decides what's best?

------
qubax
What is politically extreme? Would a company that hires someone like Sarah
Jeong be viewed as politically extreme?

The problem isn't youtube. The problem is the news. The news has become
politically extreme and that has seeped into youtube.

And it's a bit hypocritical of the NYTimes to whine about YouTube when they
pressured Alphabet (YouTube and Google) to give them priority status on both
platforms.

Anyone else getting sick of the news media attacking social media for all the
ills they themselves created and are creating even further? YouTube, and
social media in general, has become politically extreme because the news
industry (in particular CNN, the Washington Post, and the NYTimes) has been
pressuring it relentlessly to push their content on social media. And since
they are so politically extreme, it creates an extreme backlash against it.

Let's not forget that their political extremeness is why Fox News was created
and why Fox News is so popular. If news media companies were a tiny bit
honest, objective, and fair, we wouldn't be having this problem. But the
people running these companies surround themselves with sycophants and create
such an impregnable bubble that they can't see how extreme they have become.

Get off your high horse already. You are the problem, not youtube.

------
briantakita
I like the recommendation system. It provides content that I'm interested in
and I'm surprised over how "fair" it seems given what I'm interested in
watching.

Maybe Youtube can provide a choice of algorithms. For example, if somebody
wants a more curated "socially acceptable" experience, they could opt-in.

------
DyslexicAtheist
Dang, and I thought the whole point of YouTube was to end up in its weird
corners. So many memories of me whistling to the cheerful tunes of the Arctic
Monkeys; 4 hours later I'm googling _"anxiety"_ to the background music of
Boards of Canada.

And whenever I fall asleep to the stimulating and thought-provoking ideas of
Yuval Noah Harari, I wake up several films later, likely halfway through a
4-hour documentary on the 4 horsemen & NWO.

But I guess the remedy is to use Google even more, deactivate my adblock, and
embrace the milking experience as intended, so that they can build a more
accurate picture and serve me "better" content. :)

------
jumperabg
Yep YouTube is recommending me Alex Jones videos and memes. They are just
overflowing.

------
beeforpork
It also happened to me. I was watching some completely unrelated (technical)
stuff when I noticed that a polemical and obviously stupid right-wing AfD
propaganda video showed up high in the recommendation list. I was watching
stuff in English, yet BS in German was shown (to be clear: that it happened to
be German was not the problem, but in addition to it being puzzling that it
was such BS, it was also strange to be shown something in a different
language). I don't remember what was in the news at the time, but when I then
searched for something related to it, that extreme-right BS pub-talk stuff was
all over the place. The recommendations were obviously skewed. I was puzzled.
I closed the browser (and cleared cookies), but it remained the same. One
video kept being recommended regardless of how non-political the videos I
tried to watch were. And once I searched for that topic, again, every second
video or so was extreme-right BS. It took a few days before that video
stopped being shown.

Scary. Definitely a problem.

------
severine
FWIW, original title: As Germans Seek News, YouTube Delivers Far-Right Tirades

------
oblib
In essence they've perfected a way to profit off of our innate gullibilities
with no regard to either the individual's or societal consequences.

"Soulless" comes to mind when I ponder this sort of thing.

------
im3w1l
If I understand the methodology correctly, he built a spider that recursively
followed all outgoing recommendations. He was able to find fringe content when
starting with mainstream content.
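A crawl like the one described can be sketched as a bounded breadth-first
traversal of the recommendation graph. This is a toy reconstruction, not the
author's actual code; `get_recommendations` is a stand-in for whatever fetches
a video's sidebar (e.g. scraping or an API call):

```python
from collections import deque

def crawl(start, get_recommendations, max_depth):
    """Breadth-first crawl of outgoing recommendations.

    get_recommendations(video_id) -> list of recommended video ids.
    Returns {video_id: depth at which it was first reached from start},
    which shows how few hops separate a seed video from anything else.
    """
    depths = {start: 0}
    queue = deque([start])
    while queue:
        vid = queue.popleft()
        if depths[vid] >= max_depth:
            continue  # don't expand beyond the depth budget
        for rec in get_recommendations(vid):
            if rec not in depths:
                depths[rec] = depths[vid] + 1
                queue.append(rec)
    return depths
```

Because the branching factor of a sidebar is large, the reachable set explodes
quickly, which is exactly why "you can get anywhere in a handful of clicks"
(the Wikipedia-game effect) is a real confound for this kind of study.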

But is that so strange? I remember playing the "Hitler wikipedia game" where
you start on a random page and try to find the Hitler page with the minimum
number of clicks and it's typically possible to get there in a handful of
clicks.

I don't think he has done enough to establish that there is a problem.

EDIT: On the other hand whenever I see someone respond to "source?" with a
youtube video, I immediately tune out and disregard their argument as in my
experience it will be either cranky, long-winded or both.

~~~
losvedir
This is a good point. It seems plausible to me that a recommendation algorithm
should "feel out" a user by putting a variety of options in the sidebar.

What I don't totally understand is how it determines the directed nature of
the recommendation graph, and how the destinations are labeled. What does it
mean to "end up" at "alt right" videos? Do the recommendations have to
exclusively lean right or just mostly? What about higher vs lower
recommendations? If there are recommendation patterns that go left but don't
end up in a tight loop are they considered somehow? There could also be some
dependence from the chosen starting video / topic. I'd be curious to see the
results from a variety of starting points.

Basically, it's an interesting exploration, but I'd like to see it with the
rigor of a peer-reviewed paper, more than just the NYT.
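One way to make "ending up" at a category concrete is to follow only the top
recommendation for a fixed number of hops and tally hand-assigned labels over
the tail of the chain. A toy sketch of that measurement, not the article's
actual method; the `top_recommendation` oracle and the labels are
hypothetical:

```python
def tail_label_mix(start, top_recommendation, labels, steps, tail=5):
    """Follow only the #1 recommendation for `steps` hops and report how
    the final `tail` videos in the chain are labeled.

    top_recommendation(video_id) -> the single top recommended video id
    labels: {video_id: hand-assigned label, e.g. "mainstream" / "fringe"}
    """
    chain = [start]
    for _ in range(steps):
        chain.append(top_recommendation(chain[-1]))
    # Count labels only over the end of the chain, i.e. the "destination".
    mix = {}
    for vid in chain[-tail:]:
        label = labels.get(vid, "unlabeled")
        mix[label] = mix.get(label, 0) + 1
    return mix
```

Running this from many different seed videos and comparing the tail mixes
would address the "dependence on the chosen starting video" question: if the
tails converge on one label regardless of seed, the graph really is directed
toward it.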

------
jonthepirate
I once clicked on a video of Ben Shapiro speaking at a college campus. Now I
see Ben Shapiro videos nearly every time I go on YouTube.

------
partycoder
Most useless recommendations: "Can you cast obsidian?", "Important videos" aka
Yee

~~~
meowface
I'd argue Yee is quite important and not useless, though. It's brought joy to
many.

------
MichaelMoser123
But people know how to form their own opinions. YouTube likes to show me talks
by Noam Chomsky, but I don't have to agree with him. I think they're putting
too much blame on these evil algorithms.

~~~
trukterious
Yes. 'Extreme' refers really to the emotional temperature rather than the
content. 'Moderate' views, compromises, etc, don't make much sense and are
therefore ultimately boring. This doesn't imply moderation or the battle for
centre ground in politics is bad. After all we know more than we can tell, so
we can't always explain our decisions and policies. It's merely that there's
less to understand and/or criticise.

------
claydavisss
What is "radical"?

At one point in history, the suggestion that you would vote for your leader
was "radical". Giving the vote to women was once considered "radical", etc.,
etc.

~~~
megous
How about encouraging others to demonize, target, and kill emergency
workers/first responders? Seems pretty radical. Yet it's becoming more and
more normal to amplify the voices of freaks who do this on social media.

I'd rather this relativism of "what's radical?" not spread too much, because
you can justify pretty much anything with it.

------
awt
Who decides what is extreme? Abolitionists were extreme yo!

------
k__
At least for me it does it on both sides.

I get ContraPoints and Jordan Peterson.

------
cliffordthedog
Seemingly mostly right wing content.

------
msie
Silicon Valley is ruining the world. Between programming nerds and business
bros, there is no one looking out for the consequences of social media.

~~~
anticensor
I misread as "Silicon valley is _running_ the world".

------
puppymaster
Anecdotal but +1. Watched a couple Jordan Peterson videos and YT started
recommending far right wing channels everywhere. Identity politics is exactly
the kind of thing he preached against.

~~~
bachbach
That is not meaningful.

In Germany a new edition of Mein Kampf was published, and it sold a lot of
copies. Nearly all of them went to people who were already anti-Nazi.

I read all sorts of stuff - doesn't mean I buy it.

------
DavidWoof
The focus of this article is on right-wing extreme content, but I think the
tendency is true on both sides. The fact is that partisans are more likely to
seek out news content, and partisans are happier with content that matches
their prior viewpoint. And so the extreme content rises in popularity and is
presented as the recommended option to the less partisan.

It's a worthwhile question as to whether this is a _flaw_ in the algorithm or
if the algorithm is simply accurately mirroring democratic preferences. Maybe
everyone's tendency is toward more politically extreme content, and youtube is
simply giving the people what they want. Which is, after all, what YouTube
probably sees as the correct goal.

~~~
DanBC
The extreme left aren't involved in mass murder at anything like the extent
that the extreme right are.

When looking at HVE groups the far left appears rarely, those lists are full
of either militant Islamist or Far Right groups.

The focus on the far right happens because it's the far right that are
murdering people.

~~~
meowface
What is HVE? What far-right groups are you referring to specifically?

~~~
jazzyjackson
Homegrown Violent Extremists

There has been a good deal of reporting on the surge in popularity of
anti-government militia groups, though it looks like most of it traces back to
a report by the Southern Poverty Law Center, which you may or may not take as
a reliable source.

[1] [https://www.splcenter.org/20090731/second-wave-return-militias](https://www.splcenter.org/20090731/second-wave-return-militias)

------
hguhghuff
YouTube needs to ban accounts that republish TV news.

------
zimablue
You have to remember that these articles are, to some extent, manual stitchers
writing critiques of the sewing machine. The newspapers are what is getting
killed by new media, and honestly they deserve it. They've had an oligopoly on
pushing their agenda for a long time, and it has hurt people (e.g. beating the
war drums for Iraq, leaving the American people objectively misinformed).
There are criticisms to make, but these people are not the ones to make them.

------
factsaresacred
A week ago the 'mainstream media' spoke of mobs and pogroms. Yet here's
Saxony's Minister President:

> _He also criticized the role of the media in the reporting of the events,
> saying that "there was no mob, no hunt and no pogroms."_

You see, it's all about narrative. Yes, fringe YouTube accounts may spread
misinformation, but so may mainstream news. In fact, people flock to YouTube
because the media tend to commit lies of omission or editorialize their
reporting.

[https://www.thelocal.de/20180905/saxonys-minister-president-there-was-no-mob-in-chemnitz](https://www.thelocal.de/20180905/saxonys-minister-president-there-was-no-mob-in-chemnitz)

