
Study finds Reddit’s ban of its most toxic subreddits worked (2017) - rbanffy
https://techcrunch.com/2017/09/11/study-finds-reddits-controversial-ban-of-its-most-toxic-subreddits-actually-worked/
======
DeusExMachina
It "worked" only if the definition is "it removed it from Reddit".

As the article mentions, these people did not just disappear or change their
minds. They just moved to more obscure places, where they can continue and be
even more motivated, because now they have been censored.

And you might still consider this a satisfactory definition of "it worked".
Out of sight, out of mind.

Except that these people are now relegated to an echo chamber far from public
discourse, where they will only reinforce their views without challenge. An
echo chamber that is now suddenly more appealing, because it's forbidden.

Recently, in an episode of the Joe Rogan Podcast with Tim Pool, Jack Dorsey
and Vijaya Gadde, Jack admitted that a woman who was part of the Westboro
Baptist Church changed her mind only because she was allowed to be on Twitter
and be exposed to other points of view.

So, from society's point of view, I don't think it ever works to push
extremists underground. Sure, Reddit does not have any obligation to give them
a place to gather. But still, maybe it should anyway?

EDIT: Reading other comments here, I was also reminded of the fact that,
before Hitler came to power, Germany had hate speech laws that explicitly
forbade Nazi propaganda. Many were sent to jail for that, which only
strengthened their determination and helped their ideas spread.

So, if you are concerned that being allowed in the public allows them to
spread their views, do not be so sure that pushing them away is going to stop
them. You might just do them a favor.

~~~
sagichmal
> It "worked" only if the definition is "it removed it from Reddit". As the
> article mentions, these people did not just disappear or change their minds.
> They just moved to more obscure places, where they can continue and be even
> more motivated, because now they have been censored.

What about all of the new people who would have easily stumbled into those
cesspools, but who now have to put in quite a bit more effort, and go out of
their way, to reach those other sites?

Sunlight isn't always the best disinfectant. Sometimes the disease requires
treatment at a deeper level, attacking the roots of the trauma and not only
the symptoms. De-platforming is an important and necessary component of
fighting hate online.

~~~
pmarreck
I would like to see evidence that “deplatforming” actually reduces hate
_globally_ instead of simply moving it elsewhere, because I haven’t seen any,
and until then this is simply an unfounded belief (i.e., “bullshit”)

~~~
clucas
The thing is, Reddit isn't responsible for, and couldn't possibly begin to be
responsible for, hate in the whole world. They banned it on their site, it
cleaned up their corner of the world, and they're happy that the place they
have control over has less hate. That's not bullshit.

Bullshit is saying "well, we can't possibly get rid of it, so we may as well
let it stay wherever it shows up."

~~~
lordlimecat
Bullshit is saying "We stand for free speech" and then later saying "the best
way to deal with toxic and hateful viewpoints is to censor and hide them."

Belief in free speech as a principle assumes that good and right arguments
will win out every day in an open public forum.

~~~
marmada
That assumption is false. Things like racism aren't motivated by rational
arguments, so they cannot be cured through a rational tool, like free speech.

Racism stems from fear, hate, and deeply embedded cultural tools. You can't
"cure" racism through argumentation. Maybe, maybe, you can make people less
racist in person -- but on an anonymous forum like Reddit, not a chance.

~~~
aetherson
The world is less racist now than it was in the past. What mechanism do you
posit for this?

~~~
tonyarkles
I can't reply to the sibling comment that suggests education, but a corollary
to education is that many of the people who had deeply ingrained racist
beliefs are dying or dead from old age. I agree that the world is less racist
now than it was in, say, the 1950s, but my personal experiences with the
elderly suggest that the people who were racist in the 50s and are still alive
are just as racist now as ever.

~~~
aetherson
That just begs the question. Young people today are not less racist because
each generation magically improves. If they're less racist, then there had to
be a mechanism for it.

~~~
tonyarkles
This is totally an unsupported hypothesis. I have no data.

The mechanism I'm proposing is that the current generation has not been
exposed to many of the toxic things that reinforced racist beliefs. Slavery,
segregated schools, etc. Kids in school are taught why those things were bad
and why racism is bad. While not all of society has abandoned racism, there's
been enough of a push that the public institutions educating the next
generation are making efforts to reduce it.

~~~
aetherson
But... slavery and segregated schools aren't _weather_. They didn't just
happen. You seem like you're working really hard to avoid the idea that maybe
some human convinced some other humans at some point to go from bad behavior
to better behavior.

~~~
tonyarkles
Absolutely some humans convinced some other humans! Enough, in fact, to make
change! I'm not at all trying to avoid that idea. But I'm wholly unconvinced
that when those things went away, a majority abandoned their racist notions.
Rather, the next generation, when raised without the same degree of racism
surrounding them, didn't take up those ideas to the same extent. It's not so
much that people were convinced "I need to stop being racist" but rather that
a subset was convinced "we can't teach this in our public schools".

That's all just observational though. Around here it's not so much racism
against black folks but rather against aboriginal folk (and to some extent,
middle eastern folks). I hear almost none of that crap from my generation or
the younger generations, and still hear horrible amounts of it from older
generations. Yes, some people in those generations changed, but it sure seems
like the "racism rate" is significantly lower than the mean in the younger
generations and significantly higher than the mean in the older generations.

Edit: it's also quite possible that it hasn't gone away in the younger
generations either, it's just not considered something that is acceptable to
say in polite company.

Edit 2: and thank you very much for making me think more about this!

------
pygy_
Collective response to those who claim that it pushed those users to other
platforms:

Only marginally.

If you read the paper ([http://comp.social.gatech.edu/papers/cscw18-chand-
hate.pdf](http://comp.social.gatech.edu/papers/cscw18-chand-hate.pdf)), you
can see that ~20% of the users of the banned subreddits became inactive vs
10-15% in control communities. The difference for users that deleted their
account is of the same magnitude.

The users that stayed had a dramatic drop in hate speech.

Also, people end up believing disinformation that is often repeated. Silencing
those opinions probably has a net positive effect on society.

~~~
sdinsn
You seem to be making some assumptions: (1) Users that went inactive on Reddit
went somewhere else, and (2) Users that didn't go inactive didn't go anywhere
else. I don't think either of these assumptions is true.

The fact is, we _don't_ have a way to track users between platforms.

~~~
pygy_
You can't track them, but you can make educated guesses.

There's a finite amount of time in a day.

So for 1), they must have been doing something during the time they were off
reddit. They could have picked up cooking or moved to a non-hating online
platform. If so, that's a win anyway.

For 2), that's possible, but they would have had to do it by reducing another
activity. Given that reddit is designed to eat all your free time, I find it
unlikely.

------
Hermel
For a long time, I underestimated the power of the mods. Then, I saw how a few
of them could turn the sentiment of a technical forum about a seemingly
technical question upside down, simply by banning the right users and removing
the right posts. Censorship works, regardless of whether it is used for good
or for bad.

~~~
wswope
I agree; for me, the big surprise was the impact of removing the "vice" subs,
e.g. the darknet market and some piracy-related ones. You'd think, with the
internet being the internet, that those communities would just move elsewhere,
and no doubt some members did, but for the most part the communities just died
and the world moved on.

------
joeyrideout
An emergent ethical concern keeps bugging me in these situations. Of course,
private companies reserve the right to refuse business and services to
whomever they want. In a healthy society this is no problem.

But if we slide towards a dystopian future where Big Tech is basically an
oligarchy governing important chunks of society without due process and the
democratic checks and balances we would expect from a governing body, things
look pretty grim.

I have no solution to offer, unfortunately.

EDIT: Despite all the cynicism, I do think we are still on the healthy side of
things :)

~~~
tyingq
I'm not so worried about the free speech aspects of that...there's always
somewhere to spew whatever opinion you have.

I do agree, though, that Google and others wield a lot of power by being able
to permaban a whole account/person for an alleged incident on just one of
their platforms. Where _"banned on YouTube"_ means, for example, that your
Android phone no longer works.

~~~
joeyrideout
Yes, but "there's always somewhere" may stop being true if we enter that
dystopia.

Competition for fringe discussion exists today, so our society is healthy
(thank goodness). If a true monopoly is allowed to form in these platforms, we
will have a big problem. Thankfully we have antitrust laws. Unfortunately, it
feels like we are flirting with monopolies when vertically integrated
companies like Apple and Google have a very large market share (from Android
to YouTube, as you mention).

My favourite Sci Fi that deals with that dystopian future is a little-known
show called Continuum. The oligarchy begins when governments go bankrupt and
get bailed out by corporations in exchange for political power in the form of
a "corporate congress". Fantastic show. Still sticks with me.

~~~
luckylion
> Yes, but "there's always somewhere" may stop being true if we enter that
> dystopia.

That's true. That "somewhere" might become TOR because everything else is well
regulated. In the end, you can technically also always meet like minded people
in your basement to do whatever the government doesn't want you to do in
public, so there's always "somewhere" ;)

------
intertextuality
Reddit back in the day: bans /r/fatpeoplehate, /r/coontown, /r/jailbait, and
other awful subreddits.

Reddit today: effectively bans (quarantines) /r/waterniggas simply because of
the subreddit's title. It was quite literally a joke subreddit about being
hydrated. Doesn't ban subreddits like /r/hapas.

Edit: The problem with reddit is that it's wildly inconsistent about banning
subreddits. Insane subreddits can exist for years, but will only get banned
once they become popular enough to get noticed.

A fair amount of people went from reddit to voat because at the time voat was
new, had signups open, and advertised itself as a free-speech platform. Note
that I DO NOT support the likes of voat- I am describing the event that
happened.

edit; changed substantial to fair amount

~~~
geofft
Did anything happen with Voat, though? I hear frequently (just yesterday, in
fact!) about terrorists radicalized on 8chan. I see screenshots of Gab quite
often. I only hear about Voat in the context of "A few years ago, some people
left Reddit."

If the net effect is that certain people left Reddit for a platform inviting
the worst of Reddit, and then didn't stay there either, that seems like a win.

~~~
threeseed
Voat is already on its way to being shut down.

The owners posted a message last week telling some of their users to stop
inciting violence, as the Feds were in contact. Given how ultra-toxic Voat is
now, it's not going to be possible to moderate every piece of content ever
posted.

[https://voat.co/v/Voat/3178819](https://voat.co/v/Voat/3178819)

~~~
chomp
Wow, I remember joining Voat way back when it first started and it just seemed
like an angsty Reddit. I even started a group for my home town.

Just read that thread and it's crazy how much it has festered since then.
Everyone there just seems so angry all the time, there are zero positive posts
on the front page. How do people live their lives so angry all the time?

------
anthony_doan
Yep, mods work.

I've seen that first hand from Jeff Hoogland, who streams on Twitch for a
living; his niche is Magic: The Gathering. His channel is very civil, and he
moderates the heck out of it. His motto is that he's responsible for the
discourse, so he has to moderate it. He disabled YouTube comments for a while
because he didn't have the time to moderate them. When he later re-enabled
comments, they were all very civil; I believe his Twitch rules just
transferred over, because all his fans know how to behave.

------
r3bl
> Migration was common, both to similar subreddits (i.e. overtly racist ones)
> and tangentially related ones (r/The_Donald).

Tangentially-related? I'm pretty sure this crosses the line of being
tangentially-related to racism.
[https://i.redd.it/ua1ukmoi8ls21.jpg](https://i.redd.it/ua1ukmoi8ls21.jpg)

~~~
cannedslime
You realize anyone can post anything on reddit? I could go to
/r/SandersForPresident and LARP as a violent communist revolutionary, right
this moment if I wanted to.

~~~
r3bl
Yes, you could. You'd probably receive a ban instantly, not have people
engaging with you and upvoting you. You'd certainly not be able to do that for
weeks (since some of these I've opened at random still post in t_d).

I don't know who you're trying to fool by claiming this shit isn't common on
t_d, but you're not fooling anyone.

------
glenstein
There's a predictable cycle each time you ban toxic users. There is a core of
incredibly toxic users, surrounded by a halo of sympathizers. When the toxic
users get banned, the sympathizers will argue in bad faith, moving heaven and
Earth to insist that the ban was unfair, inconsistent, ineffectual, or
extreme, or a violation of their interpretation of free speech.

To whatever extent there was an actual debate with neutral parties, I think it
was likely inflamed by these kinds of actors, and the fact that there's no
there there at the end of the day isn't entirely surprising, even in light of
significant debate on the question.

~~~
whenchamenia
So a free speech advocate is in bad faith in your mind?

~~~
glenstein
Not necessarily, but it can be, for sure. I think that in the context of
reddit threads debating the need for moderation, a large proportion, if not
most, of the free speech folks were working backward from wanting to defend
trolls toward an interpretation of free speech principles that effectively
worked to shelter the trolling behavior.

------
Ancalagon
This is slightly off-topic and will probably get buried, but has anyone else
noticed a huge spike in incel-type commentary all over reddit as of late? I
feel like I can't go into any serious post's comment section without getting
that vibe anymore. I was wondering if this is related to some of those more
infamous subreddits getting banned and "releasing" those types of people to
the rest of the site?

------
ben_w
Interesting. I have wondered if the “iron law of prohibition” [1] applied to
censorship as well as substances (my original context was the UK’s “extreme
porn” prohibition, but the same question arises for non-sexual censorship). It
looks like drugs and censorship are different, given the diminished behaviour
by the same accounts on other subreddits.

[1]
[https://en.m.wikipedia.org/wiki/Iron_law_of_prohibition](https://en.m.wikipedia.org/wiki/Iron_law_of_prohibition)

------
satokema
One should ask why the "bad and wrong" arguments are so dangerous that they
need suppressing.

If it's so bad and wrong, it shouldn't have a leg to stand on, surely?

There are kernels of truth unaddressed by the champions of so-called-good
because they undermine the groupthink. That's the real catalyst for conversion
in the rabbit hole funnel. Having a subreddit is just a convenience.

------
luord
It's no skin off my back whatever reddit allows or doesn't. I despise
discussing politics, especially online (I'm making an exception in this one
comment, which I'm sure I'll regret), so I only follow technical and funny
subreddits.

That said, I wanted to point out something that the people saying this is a
win are ignoring with their "fewer recruits" idea: the more you censor
someone, the more likely they are to actively proselytize, I think.

I also saw someone saying that censorship could be a tool for the betterment
of society and I'm kind of horrified now at this thread so I didn't read any
further.

------
nyxxie
Ok, let's suppose that banning discussion of "toxic viewpoints" worked as this
title implies (the overall spread of the viewpoints declined). Now for the
tricky question: who defines what "toxic" means?

Is something a "toxic" viewpoint when its holders take action that is harmful?
Certainly that's why racism is bad, but what about more controversial
opinions? People who spread the belief that abortion is OK are encouraging
more people to get abortions, which the pro-life movement equates to murder
(and therefore harm). What about meat consumption? Vegans and PETA-types would
label this harmful as it involves the death of animals. What about weed? LGBT
normalization?

This labeling of speech as being "harmful" is necessarily subjective. What
happens when operators of major discussion platforms decide that things
besides racism are too "harmful" to permit discussion of? Seeing that limiting
the discussion of ideas shapes how a population propagates them (taking this
study's claim at face value), do we really want to normalize this sort of
censorship? Is it even moral for site operators to manipulate people's
perception of reality like this by censoring what they see to shape a
population to be politically homogeneous to the operator's views?

~~~
apasserby
Ideologies that, should they come to fruition, would limit the very free
speech they are currently abusing; in much the same way, tolerant societies
should not tolerate intolerance.

------
0xfffafaCrash
So, it "worked" in the sense that the users likely moved to more toxic echo
chambers where they can be further radicalized in their toxic viewpoints? Yay?

~~~
Waterluvian
That's an interesting perspective.

I mean, it's kind of like kicking the KKK out of your local YMCA. Sure they go
have Klan rallies elsewhere but at least my kids aren't bumping shoulders with
them anymore when they're using the water fountain between basketball games.

~~~
1337biz
Yeah, because kids are not attracted to things that are forbidden.

~~~
paulgb
I reckon there is actually more excitement to being part of an infamous fringe
of a large mainstream community like Reddit, than to be part of an isolated
fringe community that the mainstream is unlikely to ever see. Part of the
appeal is surely in offending the mainstream users who share the platform.

~~~
ThrowawayR2
> _...than to be part of an isolated fringe community that the mainstream is
> unlikely to ever see..._

Like being a computer hacker was (and still to some extent is)? Sorry, no.
People like to gather into in-groups and exclusivity makes it more attractive.

Sweeping dirt under the rug doesn't destroy the dirt; you just wind up with a
bigger and bigger lump of dirt under the rug.

------
tfolbrecht
It was never a question of whether top-down rules on speech work; it's whether
they were desired. Reddit is the beige wall of community. My experience is
choosing between echo chambers, marketing, or ghost towns.

"Thousands of communities, endless conversation, authentic human connection."
Would that mission statement be more true of Reddit, Tumblr, image boards, IRC
channels, or random internet forums? And the conversation is boring; it feels
just like tech Twitter, where everyone has to make sure they thoroughly vet
their posts for the sake of their future and avoid getting slam-dunked on. At
least it's usually up and down arrows and ban hammers instead of verbal
assault.

Please convince your communities to abandon ship to self hosted, more intimate
communication platforms.

~~~
fullshark
> It was never a question of whether top-down rules on speech work; it's
> whether they were desired.

I don't agree, a lot of people seemed to think the Streisand effect proved
that you can't control speech / content on the internet once upon a time.

------
pengod
I've come across some old time r/fatpeoplehate people and they pine for the
olden days when they could freely say what they wanted about fat people. They
aren't signing up for any reddit alternatives any more than anyone else is.

The hardcore haters, trolls and extremists will find other platforms, but the
majority of toxic users are passive readers, or occasional commenters who
randomly stumble upon subreddits, and aren't going to dedicate themselves to
another platform.

------
skilled
I would be careful with getting the pitchforks. In reality, very few people
live through such anger and hate on a daily basis.

Still, I do agree with banning most of those subreddits in defense of people
who might get distressed and deeply offended.

~~~
eropple
One of my side hobbies is tracking and studying online reactionaries--in an
effort to understand how their swamps bubble over and they impact society at
large I've got eyes and ears on something like 60K accounts and probably 20K
real identities behind them across Twitter, Discords, Riot groups, etcetera.
(I should say "fascists", though; in my estimation, there aren't many
reactionaries around, these days, who wouldn't fit the term.)

I can _assure_ you plenty of people "live through such anger and hate on a
daily basis". The Diet Coke version of this is Fox News, where on their hard-
news shows they literally hard-cut to commercial when, after a shooting at a
synagogue, guests so much as _mention_ that Trump et al. encourage racism and
anti-semitism. It is systemic and systematic, and the way this stuff poisons
the brains of the people who are soaked in it is very, very real. We
underestimate it at our peril--because the full-on, undiluted thing is a hell
of a lot more powerful a drug.

------
Smithalicious
A good opportunity to point out that "toxic" means exactly "people I disagree
with" and absolutely nothing more.

~~~
simion314
That is not true. We disagree here on HN on many topics, like software
freedom, whether Apple is a monopoly, whether Tesla's autopilot is good or
bad, etc., but persons who disagree with my viewpoints can do so without
becoming toxic. For me, toxic comments are comments that are in the first
place illogical, use false facts or speculation as truth, use personal attacks
or aggressive language, or generalize one or a few data points to the whole;
mainly, they come from people who cannot have a coherent, reasonable, logical
discussion.

My point: toxic != someone I disagree with. You are generalizing.

~~~
Smithalicious
Fair enough, the implication doesn't go both ways. My point was that all
things considered "toxic" are just things that are not being agreed with. Of
course, things can also not be agreed with without being considered "toxic".

~~~
simion314
> Fair enough, the implication doesn't go both ways. My point was that all
> things considered "toxic" are just things that are not being agreed with.

Can you give an example? Either you make no sense or you are stating something
obvious: yes, communities do not agree with toxic things, because they are
bad.

The meaning of toxic is clear in the context of each community. If you don't
agree, then don't join; if you want to join, follow the rules; if you just
want to join without following the rules, and you want to create harm, then
you are toxic.

~~~
Smithalicious
Reddit is notably _not_ a community. I can accept a community deciding that it
doesn't want certain opinions to be expressed because they don't fit the
communal culture, but this doesn't apply to Reddit, which is explicitly a
platform for people to start their own communities.

I think there's some room for common sense as to which communities should be
given a platform and which shouldn't; for example, I don't think /r/coontown
should be given a platform in western society, which is overwhelmingly anti-
racist. But the moment you start banning communities based on criteria that go
beyond the basic common sense of your society, especially if you do so based
on vague and ill-defined rules, you're no longer being a neutral platform. If
Reddit wants to be a community as such, that's fine, but from my perspective
that's explicitly _not_ their branding, and it doesn't seem to be the opinion
of Reddit users as a whole either.

~~~
simion314
I was referring to sub-reddits and the moderation that happens there, where
"toxic", trolling, and off-topic are clear terms.

Not the full reddit platform; it is a private thing and a business. If keeping
toxic sub-reddits would make them money and good PR, then they would keep
them. It was a business decision, and if it is legal, there is nothing you can
do but move your community elsewhere.

I also don't like it when businesses ban perfectly legal things like adult
content, but what can you do? Make a law forcing businesses and private people
to host things that they don't want? If you had a forum and someone spammed it
with disgusting images, wouldn't you want the freedom to remove that? Reddit
is not a monopoly; we still have forums, and they even work better for some
content types.

------
xfitm3
It didn't work. It's more toxic than ever, just in a different way.

------
hkai
Yes.

Also, Chinese efforts to fight freedom of speech and punish religious people
worked very effectively to reduce criticism of the government and prevent
people from attending churches/mosques.

------
pmarreck
Until machines can distinguish arguing against hate speech from hate speech
itself (which requires understanding context), automated deplatforming is
anti-democratic.

------
PorterDuff
So when do they get rid of r/politics?

------
matz1
This is why we need a decentralized platform that is hard to take down, but
still relatively easy to discover.

~~~
Nursie
It's also why most people won't use one - it'll turn into a cesspit of
prejudice and hate.

------
sureaboutthis
About a year ago, we had a discussion about reddit at work and tried this
experiment. Three of us went to a city subreddit, /r/mycity, and looked for
controversial posts about things like local politics, things involving police,
and so on. One of us would make posts agreeing with others in the thread.
Another would ask questions, feigning lack of knowledge but generally
disagreeing, while a third intentionally took an opposing view. In all cases,
we were careful not to attack anyone or use curse words or other disparaging
remarks.

Interestingly, in the first city we tried, people didn't upvote or downvote
very often, but the users who disagreed with the view of the thread quickly,
within a day or two, found their posts banned or shadow banned--that is, their
posts appeared to us as going through but, in reality, did not appear to
anyone else.

In some of the city subs, the person who slightly disagreed with the thread
but mostly said they didn't understand the points always found their posts
either listed at the bottom or downvoted, or got banned themselves, but it
took longer--perhaps a week or two.

No person agreeing with the thread was downvoted or ever banned, which is to
be expected. However, the whole point of our doing this was to prove our
expectation that reddit is a biased culture that only lets in those who agree
with the crowd and won't let opposing views in. At least in the /r/city
threads we chose, but we noticed the same behavior in many technical subs,
too.

I wish we had kept the data, but none of us visits or uses reddit in any way,
for these reasons and a few others, and we only scribbled notes on paper
somewhere.

~~~
amanaplanacanal
OK, something is wrong with your anecdote.

Bans from a specific subreddit are instituted by the mods of the subreddit. I
can easily picture the mods wanting a sub to be run in a certain way, and if
you run afoul of them, you are out.

Account suspensions from reddit and shadowbans can only be done by reddit
admins. Why would they even care about whatever politics are happening in some
tiny sub?

~~~
sureaboutthis
> I can easily picture the mods wanting a sub to be run in a certain way, and
> if you run afoul of them, you are out.

Thus my point. Except my comment is not about how they are run, but that
opinions that don't toe the line are cut from the tree.

------
skookumchuck
I doubt censorship and speech codes have ever changed anyone's mind. They just
produce a facade.

------
tus87
Wow TWO posts on front page of HN calling for censorship. What's going on?!?!

------
wutbrodo
I commented on this the last time it made the rounds, and the study's
methodology is flawed in a pretty central way.

From the paper:

> Hate speech and harassment are contentious topics, lacking clear
> definitions. As discussed above, the European Court of Human Rights notes
> that “no universally accepted definition of the term ‘hate speech’ exists”
> [48]. They adopt a definition including “comments which are necessarily
> directed against a person or particular group of people”, focusing on race,
> religion, “aggressive nationalism and ethnocentrism”, and homophobic speech
> [48].

> This definition provides a useful starting point, but it is difficult to
> operationalize at scale. We therefore take a usage-based approach: given
> that Reddit has banned the r/fatpeoplehate and r/CoonTown forums, we focus
> on textual content that is distinctively characteristic of these forums.
> Using an automated keyword identification technique, we build lexicons of
> keywords for r/fatpeoplehate and r/CoonTown, which makes it possible to
> track whether the words in these lexicons become more common in other forums
> after the ban. Next, we manually inspect the automatically generated
> lexicons, and identify a subset of terms that are especially oriented
> towards hate speech. These manually refined lexicons are sparser, but offer
> higher precision.

The TL;DR here is that they defined hate speech as "the subset of terms
heavily used in these subreddits, manually filtered by the authors' subjective
(and opaque) assessment of hatefulness". My beef actually isn't with the
subjectivity of the latter part: I think the reliance on the ill-defined
concept of hate speech does a lot more harm than good, but I acknowledge that
plenty of people think it's valuable, and that's a non-central hill to die on
here.

The real issue with the study is that starting your definition of hate speech
with "the lexicon of banned subs" is a fatal confounder:

Many subreddits, particularly free-flowing ideological echo chambers, end up
with distinctive lexicons: specific phrases, terms, and in-jokes that people
bring up to solidify feelings of community. This isn't a novel insight, since
it's how most communities (and indeed relationships) work. The phenomenon is
exacerbated for ideological echo chambers, and not just the ones on the right,
which are more likely to align with people's definition of hate speech: you
can find the exact same phenomenon in echo chambers like LateStageCapitalism
or ChoosingBeggars, and a milder version on most subs that are less general-
interest than e.g. r/movies.

The study's conclusion essentially boils down to: "if you ban a sub, the
distinctive lexicon of that sub becomes less common"... duh? It provides very
little signal about how the amount of hate speech evolves, as someone from
fatpeoplehate could easily be spewing the same vile content elsewhere, and
just use the term "hambeast" less in favor of a more broadly-used slur like
"pig". This isn't a possibility that the paper even attempts to address, and
its conclusions in light of this fatal flaw are downright dishonest.

It's depressing how often people cite this study uncritically. As always,
_read the papers behind articles before you cite, share, or believe them_,
particularly when they rely on undefined, impossible-to-measure concepts like
"hate speech". If authors at "papers of record" like the NYT are regularly too
dumb or dishonest to accurately describe papers' conclusions (or at least
describe their limitations), a rag like TechCrunch is DEFINITELY not something
you should take at face value.

------
senectus1
Only if you search for results with blinkers on.

They shoved these toxic groups elsewhere; they didn't go away.

------
chobeat
They are only moving to other platforms that are harder to track, like Discord
or Telegram. Just because they don't appear in the data doesn't mean that they
don't exist. So yes, you sanitize some platforms, but the problem as a whole
just becomes less evident while still growing.

~~~
DavidHm
I would argue that's still a win. Discord and Telegram are more difficult to
track because they are less public and visible.

The move has therefore managed to reduce the exposure that non-active
participants have to toxic behaviour, and the normalization of said behaviour
as something acceptable/tolerated.

~~~
kuzehanka
The size and conviction of 'toxic' groups is growing, not diminishing.
Marginalising them and pushing them to less-visible communication channels is
harmful from every angle except staving off the short-term moral outrage of
social justice types.

When various less-desirable groups existed on reddit, it was easy to monitor
and gauge their size and their beliefs. It was possible to engage with them on
some levels. Now they are scattered across various difficult to observe
channels and are fully enclosed in an echo chamber that amplifies their
beliefs that much more.

The early days of large social platforms like Reddit were the only time in the
last several hundred years when it was possible to get a reasonable estimate
of what segment of the population held what beliefs, including the less
popular ones. Being able to get an objective view of that distribution was the
first step in qualifying and addressing the underlying issues.

Instead we are sweeping those demographics under the rug again, and destroying
the ability to study and interact with them.

~~~
p49k
What evidence has shown that engaging with them is more effective? It seems to
me like it's nearly impossible to convince someone that their worldview is
wrong. However, people do feel the need and desire to be part of communities,
and knowing they might be excluded for having hateful views might be more of a
motivator for them to rethink those views on their own terms, which is
possibly a more effective method of changing someone's views.

~~~
kuzehanka
> What evidence has shown that engaging with them is more effective?

We have a lot of evidence that marginalising sufficiently large groups leads
to tremendous backlash when they reach critical mass, e.g. almost every
revolution in history. I think we can agree that doing exactly that which
failed every time before is not a productive avenue.

> having hateful views

Let's not conflate hateful views with 'toxic' views. The term toxic refers to
any unpopular view. In addition to hate speech, this includes various sexual
preferences, political views, anti-science movements, men's rights groups, etc.

~~~
DFHippie
> Let's not conflate hateful views with 'toxic' views.

Is this a terminological distinction you just made up? In the usage I'm
familiar with, neither "hateful" nor "toxic" is a term of art.

~~~
kuzehanka
As per TFA, reddit banned a list of 'toxic' demographics. A quick scan over
those demographics makes it quite apparent that about half of them are not
related to hate speech but rather some of the other categories I listed above.

Not sure how the distinction between 'hateful' and 'toxic' could possibly be
contentious.

------
aiyodev
Reddit got the racists and sexists to leave by dialing the bigotry up to 11.

Reddit has never been as hateful as it is today. Take a look at /r/politics
sometime. Do you see love?

People jump down your throat if you dare disagree with the hive mind. Those
expressing the most popular opinions get to bully the minority. Fighting back
will get you banned for incivility.

The censorship has also had a chilling effect on creativity. People create
garbage content, afraid to publish anything controversial. It's becoming as
boring to read as Facebook.

------
deertick1
This is so frustrating to me. First of all who is to say what constitutes
toxic? For instance, a drug related subreddit I know of was banned. There,
people discussed harm reduction and chemistry of novel compounds. It was
amazing and very useful for people. But that got banned.

Second: no, of course this isn't effective. It's effective at removing people
from the platform, but it is not decreasing "hate speech" (which is nonsense
newspeak anyway); it's just pushing it out of the mainstream. And for those of
you making the argument that it is reducing people's exposure to hate speech,
which stops its proliferation... Nonsense. In a free society everyone gets to
decide what they do and don't believe. You cannot control people's thoughts by
reducing their exposure to the bad thoughts.

Good people will not be convinced by bad arguments. Bad people will be bad
either way.

I hate this so much, and Reddit is not NEARLY the place it used to be. Now it
is a liberal echo chamber which is all too happy to collectively shout down
any divergent thinking. It was always that way to an extent, but it's so bad
now that I finally had to delete my 6 year old account.

~~~
pygy_
> You cannot control peoples thoughts by reducing their exposure to the bad
> thoughts.

Actually, yes, you can, quite literally. Falsehoods repeated ad nauseam are
convincing in the long run (do you remember the second Iraq war and the WMD
hoax propagated by the government? Half the US still believes that one).

In a meeting, a single person making the same argument three times is almost
as convincing as three people making it once.

Etc...

We're not as free as you imagine.

~~~
deertick1
Let me rephrase. Yes, you can. But you absolutely shouldn't, because once you
start controlling people's exposure to ideas, things go off the rails.

------
Noumenon72
One of my favorite things about Reddit was the ability to find people actually
expressing opinions that are anathema in the mainstream media, like "women can
also be bad sometimes". I feel heavily oppressed by the people in this thread
saying I'm not allowed to read or talk about those things except in the
equivalent of a "free speech zone".

I think the comparison to a law forcing the homeless out of your neighborhood
is apt. It works, but you must recognize that you are using power to harm
people you don't like, not just improving the world for everyone. Like police
power, it sometimes does improve the world for everyone and sometimes it grows
to do more harm than good. In this case the result is good, but the fact that
everyone just thinks "I like the result so censorship powers are good" is a
step down a slippery slope.

