
YouTube’s Product Chief on Online Radicalization and Algorithmic Rabbit Holes
https://www.nytimes.com/2019/03/29/technology/youtube-online-extremism.html
======
cameldrv
* Let me configure the recommender.

* Let me turn it off entirely.

* Let me adjust the relative weight that the current video gets vs my past views (i.e. do I want to see related videos to this one, or ones that YouTube generally thinks I'll click on)

* Let me adjust the weighting of thumbs up vs. watch time.

* Let me configure the homepage

* Let me make the entire recommender panel be just subscriptions or items from a particular list

* Let me replace the recommender entirely with something of my own devising, called back through a webhook

These things would make YouTube much more useful for me. I'm not going to
YouTube just to kill time, and I pay them $13/mo. They're not getting any ad
revenue from me, so why am I stuck with a recommender that only cares about
what it thinks will cause me to spend the most time on YouTube? I am not
interested in spending the most time on YouTube. I'm interested in getting the
most out of it in the minimum time.
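The webhook idea above could look something like this. To be clear, this is
purely hypothetical: YouTube exposes no such hook, and the endpoint shape,
field names, and channel IDs here are all invented for illustration. The
imagined contract is that the platform POSTs its candidate videos and renders
whatever ordering the user's endpoint sends back.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def rerank(candidates, subscriptions):
    """User-supplied policy: subscribed channels first, newest first,
    ignoring whatever watch-time signal the platform prefers."""
    subs = [c for c in candidates if c["channel"] in subscriptions]
    rest = [c for c in candidates if c["channel"] not in subscriptions]
    newest = lambda c: c["published"]
    return sorted(subs, key=newest, reverse=True) + \
           sorted(rest, key=newest, reverse=True)

class RecommenderHook(BaseHTTPRequestHandler):
    # Stand-in for the user's subscription list.
    SUBSCRIPTIONS = {"channel-i-subscribe-to"}

    def do_POST(self):
        # The (imagined) platform POSTs its candidate videos as JSON...
        length = int(self.headers["Content-Length"])
        body = json.loads(self.rfile.read(length))
        ranked = rerank(body["candidates"], self.SUBSCRIPTIONS)
        # ...and renders whatever ordering we send back.
        payload = json.dumps(ranked).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

# To run locally:
# HTTPServer(("localhost", 8080), RecommenderHook).serve_forever()
```

The point isn't this particular policy; it's that the ranking decision moves
out of the platform's objective function and into code the user controls.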

~~~
scotchio
I always call this Modern Day Tobacco.

At a certain level of volume, more transparency should be LEGALLY required of
these companies (auto-personalization / recommendation engines / etc.). At
least the ability to opt out. It's just hard to define. Could you legally
define "bubbling" or super duper smart recommendation algorithms?

Big tech knows this is the direction IMO.

It’s my guess why they randomly started pressing so hard for screen time usage
reminders.

They’re selling addiction to kids (and adults).

YouTube, FB, Instagram, Twitter Moments all have liability when it starts to
click with everyone.

It’s a serious issue that is totally unregulated and not understood by most.

~~~
dmitryminkovsky
I built Pony [0], an email platform that delivers once a day, to see if it is
possible for a platform that doesn't manipulate user psychology to succeed.

In the UI I eschewed every traditional user-manipulation technique I could
think of: there are no notifications, there are no unread message counts;
there isn't even read/unread message state. It's launching now, so finally I
get to find out: can a platform that doesn't vie for attention, that lets
users create their own unguided, unprompted experience, succeed?

[0]: [https://www.pony.gg](https://www.pony.gg)

~~~
dundercoder
Non instant email. Very interesting.

~~~
dmitryminkovsky
Thank you, thank you :) I have been enjoying it.

------
rikkus
Whenever I went to YouTube I would get suggested videos that were either
interesting and related to stuff I had watched, or not interesting to me, but
no problem. The algorithm just didn’t quite judge what I might be interested
in as well as it could.

Occasionally I would try dismissing one of the suggestions in the hope I
wouldn't see similar again, but then I'd still see its kind repeated and was
just disappointed my request wasn't being taken into account.

Suggestions are hard though. No problem.

One day, though, I made the mistake of following a link to YouTube which
turned out to be to a video of an American political figure (not even sure if
he is a politician) whose views were of the sort that court controversy - on
purpose, I guessed.

What I didn’t expect was that YouTube would then offer me videos featuring the
same person - or people with similar ultra-polarising messages - every time I
returned to YouTube.

I hadn’t realised that not only does YT take into account what you have
watched a _lot_ of, it also seems to massively favour what you have watched
once - if that is something especially controversial or perhaps otherwise
‘special’.

While I don’t mind pointless suggestions I will ignore, it’s very concerning
that without ‘liking’ or repeatedly watching such material, people are being
suggested it time after time, even when they ask to have it taken away.

~~~
harrumph
> it also seems to massively favour what you have watched once - if that is
> something especially controversial or perhaps otherwise ‘special’.

This is my experience as well. YT thinks I love Jordan Peterson. Four or so
months ago, I didn't know who he was, and clicked one clip, found it
ridiculous and never clicked another one. But my recommendations are lousy
with the guy.

~~~
kristiandupont
You are the second comment here stating this specifically about Jordan
Peterson. Weirdly enough, I had that same experience. I saw one video, didn't
like it and yet YT keeps recommending this and that video of him or more
right-wing people _DESTROYING_ someone. It's frustrating to say the least, not
only because I don't want to see it but because I showed a video once when
giving a talk, and after it had finished, all those garbage recommendations
showed up for everyone to see, making it look like I am secretly some lunatic.

~~~
harrumph
> after it had finished, all those garbage recommendations showed up for
> everyone to see, making it look like I am secretly some lunatic.

This is kind of my nightmare! Condolences.

YT, wtf.

------
minimaxir
> Sorry, can I just interrupt you there for a second? Just let me be clear:
> You’re saying that there is no rabbit hole effect on YouTube?

> I’m trying to describe to you the nature of the problem. So what I’m saying
> is that when a video is watched, you will see a number of videos that are
> then recommended. Some of those videos might have the perception of skewing
> in one direction or, you know, call it more extreme. There are other videos
> that skew in the opposite direction.

I assisted with data research on YouTube recommended videos
([https://www.buzzfeednews.com/article/carolineodonovan/down-youtubes-recommendation-rabbithole](https://www.buzzfeednews.com/article/carolineodonovan/down-youtubes-recommendation-rabbithole)),
and this claim is misleading at best.
There may be _some_ videos that aren't extreme, but that doesn't matter if
they are a) in the extreme minority and/or b) very far down a queue of 20+
recommended videos, when users typically either follow "Up Next" or act on
only the top few recommendations.

This isn't a "both sides" issue.

~~~
manfredo
I don't think this is a fair summary of what the Product Chief is saying. When
people talk about the YouTube rabbit hole, they're talking about watching
video A, which links to B, to C, etc. until they reach content considered
extreme.

This is simplistic, but say all videos sit on a 1-10 integer scale in terms of
politics. YouTube always recommends 80% videos at the same value as the
currently viewed video, 10% one value higher, and 10% one value lower. It
doesn't push anyone in either direction. Randomly clicking on recommended
videos will randomly move you up or down (and usually not move at all). But,
starting from the center, you can reach either end in four or five clicks -
cue people talking about the YouTube rabbit hole.

But hey, maybe it's not enough to merely prefer content that isn't extreme:
what if we wanted to actively push people toward the center? We could
recommend 80% videos that are one value closer to the center (or at the same
value, if the viewed video is already at 5 or 6), 10% videos at the same
value, and 10% videos one value farther from the center. Randomly clicking on
videos _will_ bring most viewers to the center in this system.
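The two schemes described above can be simulated in a few lines. This is a toy
model of the commenter's hypothetical, not YouTube's actual algorithm; the
probabilities and the 1-10 scale are taken from the scenario as stated.

```python
import random

def step(v, pull_to_center):
    """One click on a recommended video, on the 1-10 scale."""
    r = random.random()
    if not pull_to_center:
        # Neutral scheme: 80% same value, 10% one up, 10% one down.
        move = 0 if r < 0.8 else (1 if r < 0.9 else -1)
    else:
        # Center-pulling scheme: 80% one step toward 5/6,
        # 10% same value, 10% one step away from the center.
        toward = 0 if v in (5, 6) else (1 if v < 5 else -1)
        if r < 0.8:
            move = toward
        elif r < 0.9:
            move = 0
        else:
            move = -toward if toward else random.choice([-1, 1])
    return min(10, max(1, v + move))

def walk(start, clicks, pull_to_center):
    v = start
    for _ in range(clicks):
        v = step(v, pull_to_center)
    return v

random.seed(0)
N = 10_000
# Neutral scheme, starting in the middle: how many end at an extreme?
neutral = [walk(5, 20, False) for _ in range(N)]
# Center-pulling scheme, starting at an extreme: how many end at the center?
pulled = [walk(1, 20, True) for _ in range(N)]
print("neutral: share ending at 1 or 10:",
      sum(v in (1, 10) for v in neutral) / N)
print("pulled:  share ending at 5 or 6:",
      sum(v in (5, 6) for v in pulled) / N)
```

Running it shows the tension in the argument: under the neutral scheme very
few random clickers end at an extreme, yet determined clicking still gets
there; under the center-pulling scheme most walkers collapse to 5-6.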

But a viewer starting at the center can still reach the extremes in a few
clicks. This is why complaining that users can go down a rabbit hole feels
disingenuous to me. They will always be _able_ to do so, unless YouTube
recommends only videos that are more centrist than the one being viewed.

I suppose YouTube could just ban all videos at 1 and 10 on our spectrum, but
that's basically just calling for censorship. Not to mention that 2 and 9
would simply become the new definition of extreme.

Edit: many of the comments lead me to think that I haven't made clear what is
and isn't in scope of my comment. Allow me to point out two things in
particular.

1\. YouTube already does prohibit hate speech. How well it enforces its rules
is a valid point of discussion, but not what I'm talking about here. I'm
pointing out that the ability to get to the extremes of what _is_ allowed on
the platform through recommendations should not in and of itself be a point of
concern.

2\. I'm also aware that sometimes the recommendations are very messed up in
the context of the current video - e.g. shock videos getting recommended to
kids. But this is more about trolls deliberately gaming the recommendation
algorithm, which I see as a distinct issue from the alleged rabbit-hole
pattern of recommendations.

~~~
strangeloops85
The alternative is that YouTube could choose NOT to recommend videos in large
categories, particularly those related to kids or other sensitive areas (for
example, Rohingya-related videos in Myanmar in the current time period). You
need not censor the videos; you can simply make them less easy to discover.

There's no law that says they _have_ to recommend videos. There is, however,
sufficient evidence at this point that the rabbit-hole effect causes real-
world harm. This isn't 2005. It's about time YouTube and other social media
properties adopted the medical principle: "first, do no harm". Perhaps they
will do this willingly, but somehow I doubt it.

To wit: [https://www.bellingcat.com/news/americas/2018/10/11/memes-infowars-75-fascist-activists-red-pilled/](https://www.bellingcat.com/news/americas/2018/10/11/memes-infowars-75-fascist-activists-red-pilled/)

~~~
manfredo
I'm not sure how kids factor into this; I've always heard about the YouTube
rabbit hole in the context of pushing people to either end of the political
spectrum.

As far as "do no harm" goes, implementing safeguards against the alleged
rabbit hole has a very strong potential to do harm by making YouTube a
partisan platform. Bear in mind that concern over the alleged rabbit hole is
not equal between liberals, centrists, and conservatives. Many of the
recommended videos I've seen people cite as evidence of the alleged rabbit
hole are videos I'd consider mainstream. What it would take to placate the
critics has a very strong potential to violate the principle of "first, do no
harm".

~~~
strangeloops85
I would suggest looking through that previous link regarding examples of where
one can quickly end up in white nationalist content through recommendations.
Again, recommendations are for Youtube's own growth-related ends. However, it
is emphatically a choice to have them.

Regarding kids, this has been very well documented:
[https://www.ted.com/talks/james_bridle_the_nightmare_videos_...](https://www.ted.com/talks/james_bridle_the_nightmare_videos_of_childrens_youtube_and_what_s_wrong_with_the_internet_today)

[https://medium.com/@jamesbridle/something-is-wrong-on-the-internet-c39c471271d2](https://medium.com/@jamesbridle/something-is-wrong-on-the-internet-c39c471271d2)

~~~
manfredo
Your Medium post is not directly related to what I was referring to in my
first comment. Yes, bad actors game recommendation algorithms to troll people.
That's what Elsagate was about: people deliberately making channels and videos
that would appear as, and presumably get recommended as, kid-friendly content
when it was actually shock content. I'm well aware that recommendation
algorithms can be abused. Perhaps I should have more strongly emphasized that
my examples describe a very simplified scenario, limited to the alleged
political rabbit hole and not things like Elsagate.

The point I am making is that the fact that it's possible to get to the
extremes of YouTube by clicking through recommended videos shouldn't be
surprising. In fact, if you couldn't, that would be evidence of a deliberate
effort to keep certain content from being popular.

The question of how well YouTube polices its terms of service (which already
prohibit hate speech) is related to the question of the alleged rabbit hole,
because such content presumably sits at the edges of my hypothetical political
spectrum. But it doesn't change my point: that clicking through recommended
links can bring people to 1 and 10 even if they start at 5 should not be
surprising.

------
mberning
Why is it an acceptable expectation for the user to self-moderate certain
information but not others? For example, if I watch a video about modding a
car, that links me to another video of more extreme modification, and so on,
up to the point where I am no longer interested and stop watching. That is OK
with the people at YouTube. But when it comes to politics, they feel that
people cannot decide what they do and don't agree with and self-moderate. I
get that you could argue politics is different, but is it that much different
from other topics that could lead to undesirable outcomes?

~~~
InitialLastName
Car modding has a) very high up-front costs, relative to developing political
ideas and b) arguably very little chance of doing large-scale damage to a
society.

If car modding videos led you down a rabbit hole towards "watch how fast I can
go if I cut the brake lines with these $3 wire cutters! Those braking
requirements are just a conspiracy between big auto and the shadow government
to keep you from reaching car enlightenment, so make sure to drive through the
front window of your nearest car dealership"... and then people started doing
that, we would have a different attitude about them.

~~~
FakeComments
Are they going to take the same stance on videos about gaming, given that
gaming addiction is a real problem and a progressive diet of "Let's Plays"
normalizes unhealthy amounts of video games by showing people whose entire
life is video games?

Why is that different than politics?

~~~
InitialLastName
I can't believe I have to say this, but it's different because political
beliefs, whether they inspire one to shoot a bunch of people or just to vote
for the "burn it all down" party, can have a damaging influence over society
as a whole. I.e., it's not just about personal responsibility to oneself so
much as the risk of being inspired to terrorism.

~~~
FakeComments
Gaming addiction is a systemic problem that’s impacting the social stability
of several Asian nations, and possibly Western ones — your argument isn’t
principled, merely special pleading.

------
annadane
Radicalization aside, is anyone just annoyed with how much worse the
recommendations are compared to how they were? Damn near impossible to
discover anything given how biased it is towards "popular" content as opposed
to, y'know, related to what you're watching

~~~
InclinedPlane
Yup. It's just... shitty. I subscribe to a lot of channels and well more than
half of the videos on the recommendations list are just recent videos from my
subscription feed. I recognize that they are trying to create some sort of
dumbed down landing page experience, but it just sucks. I feel as though there
was a period of time when it was much better, but that time has sadly long
passed.

------
css
> The first is that using a combination of those tools of authoritative
> content and promoting authoritative content is something that can apply to
> other information verticals, not just breaking news.

How is this not a clear admission they are a publisher and not just a
platform?

------
yuliyp
So it feels like YouTube's suggested-videos algorithm is heavily driven by
"what other videos did the people who watched this video watch?". Now imagine
a video that expresses some conspiracy theory. That video is watched by a
bunch of people who are into conspiracy theories. Then you come along, and
YouTube recommends you the videos that those same viewers watched. And bam,
you get sucked toward those others who were interested in that video.

Now this does work in the other direction, where someone with more normal
viewing habits watching a video will steer the conspiracy theorists toward the
types of videos they watch, but the size of the pools means it's the
polarizing direction that would have a greater effect.
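The "people who watched this also watched" mechanism described above is
item-to-item collaborative filtering, and a toy version shows the pull. The
users, video IDs, and histories below are invented, and real systems weight by
engagement rather than raw co-watch counts; this is only a sketch of the
shape of the effect.

```python
from collections import Counter, defaultdict

# Hypothetical watch histories. One niche community (conspiracy watchers)
# plus a couple of casual viewers.
histories = {
    "u1": ["cats", "conspiracy1", "conspiracy2"],
    "u2": ["conspiracy1", "conspiracy2", "conspiracy3"],
    "u3": ["conspiracy1", "conspiracy3"],
    "u4": ["cats", "cooking"],
}

# Count co-watches: for each video, what else did its viewers watch?
co_watch = defaultdict(Counter)
for vids in histories.values():
    for v in vids:
        for other in vids:
            if other != v:
                co_watch[v][other] += 1

def recommend(video, k=3):
    """Top-k most co-watched videos: the 'viewers also watched' signal."""
    return [v for v, _ in co_watch[video].most_common(k)]

print(recommend("conspiracy1"))
```

Because the conspiracy video's viewers are mostly the dedicated pool, the top
recommendations after one casual click are more conspiracy content, which is
exactly the asymmetry the comment describes.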

~~~
InclinedPlane
It's worse than that. YouTube tries to maximize engagement, so it will
preferentially recommend videos watched by highly engaged viewers. Guess which
category of viewers is highly engaged? Indeed: conspiracists, polemicists,
etc. Imagine walking onto a college campus and asking the registrar for
recommendations on classes to take, and being told to go listen to the crazy
religious nut shouting from atop a milk crate, because he was so much more
engaged with his material than anyone else on campus.

------
api
The thing is, this is how algorithmic timelines, lists, etc. work. They're
programmed to drive engagement, and fear, hate, outrage, sensationalism, and
controversy are what drive engagement, as has been known since the time of
P. T. Barnum.

------
canthonytucci
At one point, if you watched an Asmongold video, you'd be bombarded with
suggestions for more of them.

I've never seen it happen as strongly with any other topic.

I don't know about increasing levels of extreme, but I can definitely see it
possible that suggestions could get people to a place where they are feeling
like an idea or a thing is more popular than it really is.

He talks about it briefly here (nsfw)

[https://youtu.be/Rstb4IRXZLQ?t=625](https://youtu.be/Rstb4IRXZLQ?t=625)

------
ve55
Seems to be a complaint that the Internet's Overton window is difficult to
easily control. Good.

~~~
ddingus
Seconded.

My beef with all this is our current Overton window has been compressed to
right of center economically.

American opinion and general support for solid, center left economic policy is
high and growing.

Medicare For All, and friends.

Our mainstream media simply does not report on labor, nor offer much opinion
favorable to it or framed from a labor point of view.

It used to. When I was a kid, we would compare the various points of view,
learn to identify them, and discuss what having them means to society.

This lack of control is a good thing. I think we've got problems we struggle
with, but I don't think they're as bad as what's being represented, nor do I
think a nicely centered window makes any sense. I think the people should
decide that.

~~~
drak0n1c
Ideally, the best way to balance online rhetoric is for left of center
creators to generate more frequent and more compelling content (without admins
needing to finesse algorithms or censor/shadow ban content in pursuit of a
digital Fairness Doctrine).

By sheer quantity mainstream media leans center left and online indie media
leans center right. You could say that taking the whole landscape into account
that constitutes some kind of tenuous balance overall - but it would be nice
if both mediums were more balanced.

Interesting aside: conservative talk radio has always been more popular than
liberal talk radio - is this just the online expression of the same
phenomenon? People on one side of the aisle tending to enjoy listening to and
watching political content at greater length than their counterparts?

~~~
danbolt
I think left-wing creators are getting better at making compelling content on
YouTube and having it as part of a discussion. Hbomberguy, Shaun,
PhilosophyTube, and ContraPoints come to mind.

~~~
ddingus
They are. Well spotted.

The important thing for lefties who are interested in resolving this problem
is to realize they need to support those voices. And we need to do it rather
directly, because the big systems and the large money really won't.

------
charlesism
"There’s more work to be done" is the understatement of the year.

------
sfashset
Just out of curiosity - we have people on this thread and on other parts of
the internet, claiming they'll watch one "extremist", typically right wing
video, and then have their recommendations blitzed with even more
extreme/right wing content. This is the "rabbit hole" effect that's being
referred to in the article.

Can anyone say that they have distinctly _not_ encountered this effect? I can
recall having watched maybe a couple of Joe Rogan clips, a Jordan Peterson
interview, and even some stuff I would label alt-right content. I'm personally
fairly left-ish/liberal, so I find these videos boring/offensive and
eventually will go back to my normal youtube consumption - music, sports, some
tech videos.

My recommendations are all of the latter categories, not of extreme/political
right wing category. I guess due to selection bias, most people who don't have
a problem with their recommendations won't report anything, while most people
who do will leave comments that they too experienced the rabbit hole effect.
I'm wondering if that leads to the problem described being overstated. Or am I
really the only person who's managed to watch some fairly extreme right wing
content, and had my recommended videos stay intact?

------
dmitryminkovsky
The big question in my life is whether an app can succeed in 2019 without
manipulating user psychology like this.

I built Pony [0], an email platform that delivers once a day, to see if this
is possible. In the UI I eschewed every traditional user-manipulation
technique I could think of: there are no notifications, there are no unread
message counts; there isn't even read/unread message state.

I truly wonder whether people can adapt to a totally unstructured online
platform, an unguided, unprompted experience that they create for themselves.
My bet is they can.

[0]: [https://www.pony.gg](https://www.pony.gg)

~~~
BucketSort
> Pony is the world’s first non-instant messaging platform

The postal service would like a word with you.

~~~
dmitryminkovsky
Neither snow nor rain nor heat nor gloom of night stays these couriers from
the swift completion of their appointed rounds.

------
IronWolve
If only someone took the rss feeds from youtube channels and made your own
front end. Ditch their portal.

Turn YouTube into a type of podcast DB; it's just a media storage site. Then
you could create your own portal, parse the views/ratings, and provide real
statistics.

I don't watch YouTube for YouTube; I watch YouTube for user-created content.
And some of that content has already moved off onto third-party sites.
Floatplane, anyone? Patreon? The list goes on.

I just wish finding video content was as easy as searching for podcasts.
Youtube provides RSS links, I use them to add to my podcast player.
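The RSS angle above is workable because YouTube serves a per-channel Atom feed
at `https://www.youtube.com/feeds/videos.xml?channel_id=<CHANNEL_ID>`. A
minimal front end only needs to fetch and parse that. The sample feed below is
a trimmed, invented stand-in (real feeds carry extra namespaces such as view
counts under `media:`), so the parsing is the only part sketched here.

```python
import xml.etree.ElementTree as ET

# Trimmed, hypothetical sample of a per-channel Atom feed; in practice
# you would fetch it from
# https://www.youtube.com/feeds/videos.xml?channel_id=<CHANNEL_ID>
SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Channel</title>
  <entry>
    <title>First upload</title>
    <link rel="alternate" href="https://www.youtube.com/watch?v=aaaaaaaaaaa"/>
  </entry>
  <entry>
    <title>Second upload</title>
    <link rel="alternate" href="https://www.youtube.com/watch?v=bbbbbbbbbbb"/>
  </entry>
</feed>"""

NS = {"atom": "http://www.w3.org/2005/Atom"}

def latest_videos(feed_xml):
    """Return (title, url) pairs for each entry in a channel feed."""
    root = ET.fromstring(feed_xml)
    out = []
    for entry in root.findall("atom:entry", NS):
        title = entry.find("atom:title", NS).text
        url = entry.find("atom:link", NS).get("href")
        out.append((title, url))
    return out

for title, url in latest_videos(SAMPLE):
    print(title, url)
```

Aggregate a few of these feeds and you have a subscription front end with no
recommender in the loop at all.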

------
coltonv
I've noticed an interesting thing in terms of "youtube rabbit holes" towards
extreme content. I'm what American society would generally consider "liberal",
and I watch a lot of videos about fixing climate change, medicare for all,
etc. Interestingly, I don't usually get "liberal" recommendations on my home
page.

However, if I watch a video from a conservative angle, even just one or two
videos, I almost immediately get extreme right wing content in my
recommendations. Stuff like PragerU, ReasonTV, NRATV, etc. Even watching
videos that I wouldn't consider right wing, just critical of certain left wing
sects, like h3h3 for example, tend to almost immediately lead me into videos
like "DUMB FEMINISTS GET OWNED - COMPILATION".

It's strange the rabbit holes almost always take me deep into right wing
territory, but never really into left wing territory.

~~~
throwawaysea
If you think PragerU or ReasonTV are "extreme right wing" content,
respectfully I think you might be lacking perspective. I would describe
PragerU as mainstream within the conservative sphere. Their presentation is
almost always high-quality in terms of production, respectful in its delivery,
and even-keeled. You might disagree with the content (or their values), or
find factual errors, but it is easy to find flaws with a wide range of content
producers on the left as well. ReasonTV is even closer to center, and is near-
right libertarian, and definitely not extreme right.

As for your experience with recommendations - I am not sure why you would not
see recommendations similar to the videos you like. I see videos from multiple
ideologies regularly and get content recommended from all of them. I haven't
had my feed dominated by one side. What I think is happening is that people
_notice_ videos that incense them, and incorrectly perceive them as dominant
when they truly aren't.

~~~
PavlovsCat
> high-quality in terms of production, respectful in its delivery, and even-
> keeled

[https://www.youtube.com/watch?v=EM7BgrddY18](https://www.youtube.com/watch?v=EM7BgrddY18)

"Why you should love fossil fuels", "It is easy for feminists to forget that
men gave women the right to vote, gave up their monopoly on power, and
invented birth control." \-- even-keeled?

> it is easy to find flaws with a wide range of content producers on the left
> as well.

More importantly, it's irrelevant, since the criticism isn't that imperfect
right-wing videos are presented instead of perfect left-wing ones.

> I haven't had my feed dominated by one side.

So? It's perfectly possible for you to be in a different testing bucket. Don't
assume what you see is what everybody sees, and that they are just
interpreting it differently than you do. I might as well say "nah, you only
_think_ you see videos from multiple ideologies".

------
username223
Thank $deity for [https://youtube-dl.org](https://youtube-dl.org). I can't be
bothered to deal with Google's "up next" algorithm.

------
swamy_g
If you call Jordan Peterson videos or Joe Rogan clips extremist, then I think
this is clearly a step in the wrong direction. Their videos thrive on YouTube
not because they hold extreme views, but because their videos are very
engaging and fun, and you learn something from them.

If this is a ploy to push more mainstream narratives through YouTube that is
akin to watching CNN/CBS or ABC, then I'll be looking for other platforms.

~~~
minikites
[https://twitter.com/VicBergerIV/status/1110301264657551360](https://twitter.com/VicBergerIV/status/1110301264657551360)

>Joe Rogan went on Infowars today, the same day a man who lost his child in
the Sandy Hook shooting committed suicide. The grieving father was among the
plaintiffs suing Alex Jones for defamation after smearing the murdered
children and their parents as part of a hoax.

>In the same Infowars episode that Joe Rogan appeared on, Alex Jones pushed a
conspiracy theory that the father of a child killed at Sandy Hook didn't kill
himself, but may have been murdered to silence Jones and end the first
amendment.

~~~
fmihaila
> [https://twitter.com/VicBergerIV/status/1110301264657551360](https://twitter.com/VicBergerIV/status/1110301264657551360)

That Twitter thread is interesting, because it contains a link to a 'This
American Life' program that gives a platform to Alex Jones to explain himself.
I doubt anyone would call 'This American Life' a platform for spreading
conspiracy theories or hate.

In this light, is Joe Rogan, who also allows people like Jones to explain
themselves, a moderate or an extremist?

~~~
minikites
You have to look at what both Joe Rogan and This American Life do in a larger
context, which should clarify the differences. I do think that TAL should not
have given airtime to Alex Jones because it really is that dangerous to give
him (and people like him) a platform.

------
BEEdwards
>Yeah, so I’ve heard this before, and I think that there are some myths that
go into that description that I think it would be useful for me to debunk.

This is no myth, bro: go to YouTube, pick a political video, leave it on
autoplay, and watch it go to shit.

------
mindgam3
It’s almost like the YouTube guy is taking his talking points directly from
Zuckerberg’s script.

\-----------

1\. “of course we take this seriously”

\- Mohan: "Having said that, we do take this notion of dissemination of
harmful misinformation, hate-filled content, content that in some cases is
inciting violence, extremely seriously."

\- Zuckerberg 2016: "we take misinformation seriously. We’ve been working on
this problem for a long time and we take this responsibility seriously."

2\. “we’re simply trying to help people get accurate information”

\- Mohan: "when users are looking for information, YouTube is putting its best
foot forward in terms of serving that information to them."

\- Zuckerberg 2016: “Our goal is to connect people with the stories they find
most meaningful, and we know people want accurate information.”

3\. “fear not, it’s our users who are in power/in control”

\- Mohan: "But YouTube is also still keeping users in power, in terms of their
intent and the information that they’re looking for."

\- Zuckerberg 2004: “People have very good control over who can see their
information.”

\- Zuckerberg 2017: “Our full mission statement is: give people the power to
build community and bring the world closer together. That reflects that we
can’t do this ourselves, but only by empowering people to build communities
and bring people together.”

And last but definitely not least,

4\. “we’re proud of our progress, but there’s more work to do”

\- Mohan: "It’s an ongoing effort. I think we’ve made great strides here. But
clearly there’s more work to be done."

\- Zuckerberg 2016: “We’ve made significant progress, but there is more work
to be done.”

\- Zuckerberg 2018: “I’ve learned a lot from focusing on these issues and we
still have a lot of work ahead…I’m proud of the progress we’ve made in 2018…
I’m committed to continuing to make progress on these important issues as we
enter the new year.”

Why anyone would copy Zuckerberg’s script at this point is beyond me.

\-----------

Sources:

\- Zuck 2004: [https://www.thecrimson.com/article/2004/2/9/hundreds-register-for-new-facebook-website/?page=single](https://www.thecrimson.com/article/2004/2/9/hundreds-register-for-new-facebook-website/?page=single)

\- Zuck 2016:
[https://www.facebook.com/zuck/posts/10103269806149061](https://www.facebook.com/zuck/posts/10103269806149061)

\- Zuck 2017: [https://www.facebook.com/notes/mark-zuckerberg/bringing-the-world-closer-together/10154944663901634/](https://www.facebook.com/notes/mark-zuckerberg/bringing-the-world-closer-together/10154944663901634/)

\- Zuck 2018:
[https://www.facebook.com/4/posts/10105865715850211/](https://www.facebook.com/4/posts/10105865715850211/)

~~~
ams6110
It's because they are both spouting standard PR cliches.

------
habosa
I work at Google but not on anything related to YouTube. This opinion is my
own: that was very disappointing to read.

I _use_ YouTube, I don't browse it. I search for what I want, and then I watch
it. Yet YouTube seems absolutely determined to send me down rabbit holes for
some reason. For instance my friend sent me a video by some cable news neo-
philosopher (a Jordan Peterson type) and since then I can't get rid of low-
quality vids in my recommendations that are trying to make me angry. "{Person}
totally EVISCERATES {other person}" being the calmest of them.

The recommendation algorithm is also just not even good at its job. I am
Jewish and the number of videos I have seen recommended to me that are thinly-
veiled anti-semitism is pathetic. Do they really think I am going to watch
those?

The one thing from the article I do believe is that these extremist videos
(mostly) don't monetize. So why do they dominate? Honestly I think we just
suck at building recommendation systems.

I am not a big fan of Facebook, but I applaud their recent ban of white
supremacist content. Freedom of speech is one thing, freedom to use someone
else's site to speak to millions is another thing. YouTube should just take a
harder stance on all of this dangerous crap. The Sandy Hook deniers and the
Antivaxxers will have a hard time finding a new home.

</rant>

~~~
fossuser
From the outside it seems like they don't actually design YouTube with
features their users would like.

I pay for YouTube to avoid ads, and I wish I didn't have to use their home
view, which shows me a lot of content I don't care about. I want my home
screen to be my subscriptions (instead, I have to switch to that manually
every time).

I would turn off recommended content entirely if I could, in favor of channels
suggesting other channels or something more human-curated (like Spotify
discovery, which in part uses other users' playlists).

People are vulnerable to bad information delivered in a manipulative way -
especially when this is reinforced by a bunch of recommended videos.

------
lvs
This is nothing more than a very long-winded PR denial.

------
whotheffknows
This is what google okay is for. Trust me.

------
wbronitsky
I'm so tired of people calling the regulation of hate speech and violent
speech on private platforms "censorship." These people are more than free to
express themselves in any public platform or their own platform. They are not
censored in any way. They are being removed from a private platform for not
complying with the rules of the platform.

You wouldn't say that a drunk who walked into a nice restaurant yelling
hateful things was being censored when he is asked to leave. Nor would you say
that a man trying to convince kids to get into his van outside a McDonald's
was being censored when he is kicked out of the McDonald's. Both are private
businesses regulating the behavior of their customers, and that is not in any
way censorship.

> Randomly clicking on videos

The entire point is that people don't click on random videos; they click on
videos they think are interesting. Call it "rage clicks," curiosity, or some
people just being bad people who want to see bad things; just don't call it
random, because that's obviously wrong.

~~~
Pharmakon
I’m at the point where dishonest discussions about “free speech” online are a
non-starter (having had an illuminating exchange on this very site two days
ago). It’s not as though these anonymous chats matter in the slightest;
they’re had in bad faith and, most of all, are intellectually stultifying. The
only winning move is not to play. You won’t convince anyone of your position
who has made the conscious choice to adopt a dynamic series of positions to
support their unstated agenda. It’s like arguing with creationists: the best
you can hope for is changing their declared position of the day. Meanwhile
they win just by having an audience and the semblance of legitimacy, not to
mention dominating and railroading every comment section they can to quash
reasonable debate.

This is setting aside issues with bots, trolls, and brigades. In real life
with people you know and can talk to there _can_ be value in these
discussions, but never anonymously and never online. It’s pseudointellectual
wank dressed up as reasonable talk. There is nothing wrong with simply saying,
“I’m not interested in having this conversation with you, sorry.” and moving
on.

~~~
malvosenior
Do you really not accept that some people (many, even) may actually believe
in free speech and not be arguing in bad faith, or be a bot or a brigade?

That was a pretty normal position to hold in the US until about five years
ago, and I’d say it still is.

~~~
thatoneuser
The two posters before this are exactly why people are so riled up about
censorship (calling it what it is. Just cuz someone dislikes the speech
doesn't warrant removal of the right to speak.)

Basically you have people that demand others be censored. That they don't
deserve to speak freely. And they'll come up with all these mental gymnastics
as to why other people's speech isn't acceptable. And for most of us we
understand that the first amendment is the most important and that's not
acceptable.

Just look thru the last 2 years of left leaning websites. Anyone who claimed
trump was innocent (clearly...) was a "bot, troll or brigader" and ought to be
censored because it was "hate speech".

A world where we censor is a dark world. There's a reason free speech was
right number one in this country.

~~~
russdpale
Infringement of speech is only prohibited for the federal government; the
internet doesn't count. The internet is not beholden to physical barriers, and
there is more than enough room to start your own website. You have a right to
speech; you don't have a right to a privately owned platform that isn't yours.

Furthermore, deplatforming has always been a thing. It basically happened to
the Dixie Chicks after they spoke out against the Iraq war and GWB in England
back in the early 2000s. I don't think they have ever been on Fox or a
Sinclair broadcasting station again.

No one is saying that proclaiming Trump's innocence is hate speech. That is an
absurd claim; where are you getting that nonsense?

~~~
intuitionist
Nobody is talking about free speech as a matter of constitutional law in these
instances (although even that applies to more than the federal government,
i.e., state and local governments). People are making a normative claim that
it’s better for institutions like YouTube to have liberal speech policies than
restrictive ones. If you want an example with opposite political valence think
of NFL players kneeling during the national anthem. The NFL can legally punish
players for exercising their constitutionally guaranteed rights, but it
nevertheless has a chilling effect.

~~~
amanaplanacanal
I would submit that there is a difference between hosting speech, and
amplifying it with your recommendation engine. If your recommendation engine
does not promote something, I don't think that is the same as censorship.

------
jkha0
Giving people more of what they want is the optimal user experience.

People are ultimately choosing what to watch.

It's a clear conflict of interest for traditional media to blame their younger
competitors for all the world's problems.

~~~
pixl97
I want more free crack please, it's the ultimate user experience!

------
ousta
It's a stupid idea that the algorithm is what drives people to extreme
content. It is not; people are driven to it on their own. Our whole societies
are driven by violence, fear, and disruptive stuff.

