
The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech [pdf] - lainon
http://comp.social.gatech.edu/papers/cscw18-chand-hate.pdf
======
sillysaurus3
This is an A+ study which seems to confirm that banning subreddits can be an
effective way to silence their inhabitants. "Fat hatred" as an epidemic has
largely disappeared since 2015, for example.

It's nice to get some hard data to counter the theory that if you ban a
subreddit consisting of undesirables, they'll simply invade other parts of
reddit and continue. In reality, the other parts of reddit aren't nearly so
tolerant.

So what's the implication? Well, ban judiciously. Getting rid of places where
hatred festers is like putting out a fire. But it's obviously very tricky to do
this.

See /r/physicalremoval for an example of a sub that was just banned for
inciting violence.

~~~
istorical
The danger of censorship lies in the idea that while the initial generation of
censors may have benevolent intentions, a future actor who inherits those
powers to stifle communication or censor ideas may use them to subtly shift
discourse or the overton window, push an agenda, or control people.

Why do free speech proponents fear the precedent being set by Reddit or
Twitter to ban those they consider hate speech? They fear it because of the
idea that over the course of years or decades that same power of censorship
(or even just cultural idea that censorship is OK) may slowly move away from
being used on obvious bad actors and hate speech and into the censorship of
groups that don't deserve it. Today's well-intentioned stifling of hate speech
is the same set of tools and culture that could enable totalitarianism
tomorrow.

Edit:

I was asked who I'm referring to as a free speech proponent.

I would cite Zeynep Tufekci as a really interesting author in this space, as
her work specifically examines how technology and social media are changing
the politics of dissent and revolution. Here's an interesting article
discussing her book:

[https://latest.13d.com/will-direct-democracy-give-way-to-aut...](https://latest.13d.com/will-direct-democracy-give-way-to-authoritarianism-direct-democracy-movements-powerful-vulnerable-febe6e75a953)

"With the rise of social media in 2005, the networked public square shifted
from blogs and websites to “massive, centralized platforms where visibility
[is] often determined by an algorithm controlled by a corporation, often with
the business model seeking to increase pageviews.” Traditional mass media,
once the gatekeepers of social movements, were replaced by a few, very
powerful “chokepoints” that monopolize ad dollars and users while encouraging
surveillance. Facebook’s “real name” policy, for instance, can snuff out a
movement before it even begins.

...

To be effective, censorship in the digital era requires a reframing of the
goals of censorship not as a total denial of access...but as a denial of
attention, focus, credibility...Censorship by disinformation focuses on
attention as the key resource to be destroyed...Rand Corporation researchers
refer to this method of propaganda as the “firehose of falsehood.” The primary
goal is simple: to confuse and overwhelm the “audience”...The result is a
frayed, incoherent, and polarized public sphere that can be hostile to
dissent...” This goal can unfurl in a number of ways, but platforms always
play a central role. We have witnessed it with Russia’s army of trolls. With
the stream of fake news that clouded Trump’s election. And with the
governments of Turkey and Russia, which have maintained their grip on power by
demonizing “alternative sources” of media—the kinds of sources where movements
are likely to flourish—while ramping up control of traditional media. In other
instances, platforms have made behind-the-scenes-deals with repressive regimes
in order to allow access to their users. In places where Facebook is the
internet, social movements have little recourse but to submit to the terms of
the deal.

...

Because “these platforms own the most valuable troves of user data and control
the user experience, they wield power to decide winners and losers for
people’s attention."

~~~
komali2
I wish I could find the exact book, but there was a philosopher writing about
anti-semitism after WWII.

I really want to give you direct quotes but I'm about to run to a meeting,
I'll search harder afterwards.

The gist, as written in the ~1950s IIRC:

1. Anti-semites are immune to argument or criticism, because they are "Just
Joking." They will spew hate speech and throw every argument they can at you,
logical or otherwise, and outright lie, because they are "just forcing a
discussion." If you actually pin them down and try to challenge them, they'll
laugh you off. "I'm just starting the discussion here, are you really taking
me seriously? Hahahahaha loser, triggered!"

2. It is acceptable to argue the possible benefits of levying an import tax on
wheat from Algeria, because there are genuine positives and negatives to the
transaction. However, it is not acceptable to argue over whether "All Jews
should be killed." A standpoint that suggests the outright destruction of an
entire people, or their enslavement or removal of freedoms, is so heinous as
to not even be worth discussing the _possibility_ of merit. In other words,
anti-semitism is a garbage philosophy that our zeitgeist should not permit. It
does not fall under the protection of "free speech," it is simply rejected
wholesale.

My point: Restricting hate speech is not a slippery slope for freedom of
speech. I personally believe there is no universal morality, but if I had to
pick, I'd argue that the best outcome for the human race would be a culture
that purges all racist and other arbitrarily prejudiced mindsets that judge
entire populations on untenable grounds (race, gender, etc).

It is not a slippery slope if you set clear boundaries.

~~~
grzm
> _I wish I could find the exact book, but there was a philosopher writing
> about anti-semitism after WWII._

You may be thinking of Sartre's _Anti-Semite and Jew_

[https://en.wikipedia.org/wiki/Anti-Semite_and_Jew](https://en.wikipedia.org/wiki/Anti-Semite_and_Jew)

Here's a quote:

 _Never believe that anti-Semites are completely unaware of the absurdity of
their replies. They know that their remarks are frivolous, open to challenge.
But they are amusing themselves, for it is their adversary who is obliged to
use words responsibly, since he believes in words. The anti-Semites have the
right to play. They even like to play with discourse for, by giving ridiculous
reasons, they discredit the seriousness of their interlocutors. They delight
in acting in bad faith, since they seek not to persuade by sound argument but
to intimidate and disconcert. If you press them too closely, they will
abruptly fall silent, loftily indicating by some phrase that the time for
argument is past. It is not that they are afraid of being convinced. They fear
only to appear ridiculous or to prejudice by their embarrassment their hope of
winning over some third person to their side._

(Thanks to 'geofft and 'tptacek for this:
[https://news.ycombinator.com/item?id=13089118](https://news.ycombinator.com/item?id=13089118))

~~~
mlloyd
I'm pretty sure I've heard this argument made about the campaign for our
recent President as well. HRC had to use words well because she, and her
supporters, believe in them.

While DJT and his supporters believe in "LOL - just kidding - can't you take a
joke? Don't be so uptight." So he was free to play with the truth. His
supporters aren't trying to convince themselves - they know their arguments to
be BS - instead they're trying to find the secret code to convince enough
bystanders.

~~~
dropit_sphere
You have to take into account, though: they _won_.

What's interesting to me is that reading through this thread, particularly
Sartre's description of anti-Semite thinking, is that the _exact_ same
thoughts are voiced in the rightist spheres I inhabit, but referring to the
left. Particularly regarding discourse and respect (or lack thereof) for it.

In fact, it gets interesting when I think about speakers being no-platformed
of late. When, IDK, Richard Spencer or someone gets protested away from some
university, is this an example of:

- blatant disrespect for words, as shown through Spencer's poisoning the well
as Sartre describes, or

- blatant disrespect for words, as shown through him not being allowed to
speak?

It is a strange world indeed where _both_ sides, referring to the _same_
incident, take completely different positions, both in the name of free
speech.

~~~
mlloyd
Yeah, totally agree on that. The worst part is that you can't offer a solid
argument without being called biased by the other side.

I think this is one where you just have to call them wrong and tell them to do
one if they disagree. Hate speech just isn't something that we have to
accommodate, nor do we have to give credence to the arguments for it.

------
nostromo
This is a bit like studying crime in one neighborhood in isolation.

Putting cops in neighborhood A may reduce crime there, but it may just
increase crime in neighborhood B.

I actually liked having offensive subreddits on Reddit because there were so
many people that devoted themselves to arguing with the people that posted
there.

Now those people are gone and are off to internet spaces where nobody
disagrees with them. This will reinforce their opinions, not change them for
the better.

~~~
rxhernandez
You genuinely think the type of people who subscribe to subreddits like
coontown and fatpeoplehate are going to have their mind changed by people on
the internet?

I've had no success changing the minds of people who think that being trans is
a mental illness despite citing numerous peer reviewed articles. Maybe I'm
going about it wrong, but I'm not sure it gets more clear than pointing to a
bunch of scientists that directly contradict their understanding.

~~~
anon2936
I post on /pol/ without agreeing with most of their views. You're unlikely to
immediately convince anyone of a diametrically opposed point of view at any
given time. But you can bring nuance to a very one-sided, memed and
stereotyped "discussion".

For example there is a large overlap between pro-fossil fuel and nationalist
attitudes. It is fairly easy to get some concessions about renewables from
them once you point out that renewables mean energy-independence and not
giving money to the middle east.

If you have no scruples about adopting their arguments for the sake of arguing,
you can also make your argument explicitly anti-Jewish by pointing out that
Israel would be far less important strategically if the US did not depend on
oil as much. It seems like people have trouble applying logic of this kind
because they fear being seen as endorsing the arguments that they use, but in
the end it's just a form of playing devil's advocate.

If you have a thread where the general tone is that all blacks should be
killed then maybe an admission that this would not be a smart move and better
solutions can be found is already progress. Small progress, but still
important, nudging things in a better direction bit by bit. And yes, board
culture does change over time, imperceptibly to most, but it does change.

~~~
fatbird
So patient, persistent effort might yield a small shift in the attitudes of
some in the audience.

At what cost in those who see the basic terms of debate validated by
engagement? When you're arguing that only some black people should be killed
instead of all of them, what harm are you doing by legitimizing the basic
question of whether blacks should be killed?

~~~
anon2936
You are assuming that there is a question to legitimize. If you have someone
convinced of the issue then there is no question. By engaging in discussion
you are simply providing an alternative viewpoint. You don't have to
present yourself as an opponent; you can present yourself as a member of the
ingroup with a slightly more moderate position that might also be more
palatable to the public. There are many strategies. It is also important to
realize that there are many passive readers of such discussions. I recall moot
(rip) saying something along the lines that there is a 1:10 ratio of posters
to lurkers. So by presenting a more moderate position you're also showing
people who might not like the left that there are alternatives to the most
radical voices.

In the past, even in many brutal wars, belligerents generally recognized the
need for parley. This is not even a war; it is a disagreement about social
norms and laws. If you villainize the other side to the extent that it would be
immoral to even speak to them, you're only fueling the polarization instead of
creating a continuum. Look at Europe. The extreme right exists there too, but
between the left and the right there are many different currents represented in
parties, which lowers the activation energy for people to gradually switch
camps.

------
mobileexpert
This uses the pushshift.io dataset. The dataset is collected by an independent
(non-Reddit Inc.) user, @jasonbaumgartne on Twitter. Huge props to him for
collecting it. Anti-props to the authors for not giving him a better
acknowledgement in the paper.

~~~
coherentpony
Maybe reach out to the authors and tell them. I like to think it was ignorance
as opposed to malice.

~~~
rspeer
I wouldn't assume it's malice either, but I'm glad it's being mentioned.

There's a trend right now where researchers and reviewers have no respect for
data, even as research uses more and more data. They don't think of citing the
data they use any more than citing the air they breathe. This leads to lots of
research built on bad data, as producing good data is not rewarded.

I've seen my own data set become a footnote, appear as the only uncited source
on a slide, and be attributed to other people who worked on a related project
but had nothing to do with producing the data.

------
LMYahooTFY
I find the effects on "invaded" communities post-ban to be most interesting.

There seems to be no indication of any uptick in hate speech within
communities that were "invaded" by former subscribers of the banned
subreddits.

Similarly, there was a dramatic decrease in hate speech overall among the
subscribers of the banned subreddits after the ban.

To draw an unscientific and unsubstantiated inference, perhaps the power of
crowd psychology and echo chambers is responsible for more hatefulness than we
can yet prove.

In some regards I would like to hope this is the case.

~~~
anon2936
That doesn't mean much. People wear masks, self-censor or adopt more obtuse
language. For example despite using a throwaway right now I am still phrasing
things not quite as I would like out of concern that I would not be taken
seriously.

"Hate speech communities" still consist of people. Those people can engage in
other hobbies without injecting their political views into everything, similar
to how vegans and meat-eaters can dine in the same establishment and atheists
and Christians can be buried in the same cemeteries.

I would even argue that dedicated places for hate speech lead to exaggerated
expression by mostly reasonable people. Think of Trump promising the wall.
People cheered him on for that. But that doesn't really mean they want a wall.
They might be perfectly happy with stricter immigration policy or enforcement
of existing immigration laws too. But shouting "build the wall" is a group
ritual for them. Virtue signalling works for both sides.

Edit: After skimming the paper I noticed that the authors were also
identifying subreddit-specific lingo. I wouldn't expect this to carry over
unmodified to other subreddits. You can still denigrate particular groups
without using those particular keywords.
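
The measurement approach being skimmed here (counting hits against a
subreddit-specific lexicon) can be sketched roughly as follows. This is only
an illustration of the general keyword-counting technique, not the paper's
actual pipeline; the lexicon and comments below are made-up placeholders:

```python
import re

def lexicon_rate(comments, lexicon):
    """Average number of lexicon-term matches per comment.

    This is the weakness described above: a user who denigrates the same
    group in new words scores zero against the old lexicon.
    """
    if not comments:
        return 0.0
    # Build one case-insensitive pattern matching any whole-word lexicon term.
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, lexicon)) + r")\b",
        re.IGNORECASE,
    )
    hits = sum(len(pattern.findall(c)) for c in comments)
    return hits / len(comments)

# Placeholder data: three comments, three total matches -> rate 1.0
comments = ["this is fine", "BadTerm again", "badterm and badterm"]
print(lexicon_rate(comments, {"badterm"}))  # 1.0
```

The word-boundary match means even trivial respellings of a lexicon term go
uncounted, which is the carry-over problem the comment above points out.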

~~~
maxk42
Not to mention this completely ignores the fact that most subreddits have
active moderators who remove hate speech.

If a user is forced to discontinue use of a subreddit where it's allowed and
continues to use subreddits where it's not allowed, then _of course_ you're
not going to see an uptick in hate speech. You've done nothing to change their
mind, however.

~~~
theWatcher37
Reddit also has some absolutely fascist mods. For example, it's well-known
that if you post anything that is against illegal immigration (and it gets too
many upvotes!) /r/worldnews mods will go through and ban you after the thread
dies down.

I was once banned there 48h after a polite discussion on the economic costs
(citing various sources) for "hate speech". I got too many upvotes so it was
noticed. Appealing to a mod resulted in him "examining my posts on other subs,
and given who you voted for the ban will be permanent".

Actual hate speech is one thing, but far-left mods of major subs are using the
banner of hate speech to silence well-informed opposing views.

~~~
ubernostrum
Aside from a small number of general site-wide rules, reddit is based around
the idea of "your subreddit (meaning, you're a mod of it), your rules". And
nothing about it requires that every single rule be exhaustively defined up-
front; moderator discretion is ultimately the sole definition of a subreddit's
rules.

If you don't like a subreddit's rules, you're free to create a competing
subreddit on the same topic but with different rules, and try to attract
people to it.

I moderate a medium-sized (200k-ish subscribers) subreddit, though, so I know
how unpopular this idea is with some people.

------
dahdum
I'm not surprised to see a study that confirms censorship works as intended,
just more concerned about the future societal structure we're building.

I grew up with free speech as the ideal. Free information, free speech, and
freedom in general. Reddit itself was about that at the beginning. That
zeitgeist is gone now - I just hope praising free speech doesn't become "hate
speech" before I pass.

~~~
jmull
You call it censorship. But to what extent is Reddit -- or any person or
organization -- _required to provide a platform_ for speech or expression
they find abhorrent or contrary to their values?

Would it be OK with you if I put a bumper sticker on your car? "Silence hate
speech!" or "Jews will not replace me!" or whatever it is that you might find
uncomfortable?

~~~
jmull
So...

if you downvote this to make it grey out or disappear, doesn't that mean you
think it's OK to suppress ideas you don't agree with on a public forum?

Heh.

~~~
mulmen
My understanding of HN votes is that they are supposed to reflect the quality
of the contribution to the conversation, not agreement or disagreement.

~~~
jmull
Sure. Then the question simply becomes whether it's OK to suppress ideas on a
public forum because they are deemed by some to be a low-quality contribution.

It seems to me this is dancing around semantics to avoid the question,
especially considering the problematic nature of the distinction between
something that one considers low-quality vs. something one disagrees with.

~~~
mulmen
They're fundamentally different questions. I have conversations with a lot of
people on a lot of topics, some of those conversations are more substantive
than others. Even if I disagree with someone it's easy to tell when a new idea
has been presented and how well it has been articulated. My understanding of
this voting system is that we are meant to use it to encourage meaningful
conversation, not to motivate our own agenda.

Is it possible to abuse this system? Of course it is. That's why I am engaging
you in this conversation to perpetuate the idea that the voting system is
actually a meta-vote about the quality of the _conversation_.

------
indubitable
This study seems to have failed to control for the most basic confounding
variables. For instance, their control group was specifically chosen not to be
a control in the nominal sense, but rather other groups that were likely to be
banned or otherwise reprimanded. Why? They showed that the so-called control
group and the treatment groups had a slight decline in overall posting
activity, but failed to consider the trends for the site at large. What was the
overall trend, per user, at the time? Many of their control groups were also
banned, some within the window of time they sampled!

The authors also seem to fail to account for the fact that the entire site
began automatically censoring posts with undesirable keywords or from users
who had posted in undesirable subs. Many of these posts cannot be recovered
even from the API, which directly taints their data. This also resulted in two
further confounding issues: a very non-zero number of users who stayed on
decided to post under new names and to use language less likely to be tagged
by automated censors.

I think the important question is what effect this censorship has had on
trends in such beliefs, rather than whether the specifically censored words
continue to appear. The answer to the former is interesting and is the entire
question of censorship. The answer to the latter is rather self-evident and
has little to do with whether censorship is effective, which is itself
separate from the question of whether it's desirable.

------
whataretensors
Reddit takes users' content on the assumption that there is value. In the case
where there is liability and no value, they are not interested in the content
(which makes sense).

I think the future is in a decentralized reddit-type system where content
creators and curators are rewarded for their contributions fairly.

------
toomim
Censorship works. Propaganda works. That's why people use it.

~~~
dionian
It works for people being edgy online, but can we censor glorification of a
more serious problem than “fat shaming” - say, gang violence - and have it
reduce _acts_ of “hate”? I’m not convinced similar censorship/propaganda can
stop more serious problems which entail actual violence... or do we really
consider people making fun of fat people a serious problem which having been
solved improves society in a significant way?

~~~
grzm
> _I’m not convinced similar censorship/propaganda can stop more serious
> problems which entail actual violence_

I think a more useful framing of this is whether and how it changes the
situation. I don't think anyone is arguing that this stops it, but it's not
hard to imagine that given these two alternatives ((a) provide forums where a
given behavior is allowed and (b) censor forums where a given behavior is
allowed) that it might have some impact.

And let's be clear: it's not like this is a single-argument function or that
there's only a single value that's being optimized. If it were only so simple
:)

------
Asdfbla
Very interesting result, somewhat encouraging to me, because it seems to
cautiously indicate that you can actually do something about those echo
chambers that may start as jokes but then actively normalize hate and
discrimination, possibly pulling formerly neutral users in too. (And keep in
mind that those hate echo chambers are not places of honest discussion, since
they regularly ban any opposing viewpoints and only strive to keep the hate
train going.)

------
olleromam91
> "In a sense, Reddit has made these users (from banned subreddits) someone
> else’s problem. To be clear, from a macro perspective, Reddit’s actions
> likely did not make the internet safer or less hateful. One possible
> interpretation, given the evidence at hand, is that the ban drove the users
> from these banned subreddits to darker corners of the internet."

Could these groups now become _seriously_ dangerous when they convene and
discuss in the shadows?

~~~
memeCrasher
When you ban speech, you don't convince people they were wrong. If anything,
you convince them they're right.

The only thing censorship achieves is sweeping problems under the rug and
pretending they've been solved.

------
lochlainn
I think an interesting extension of this study would be looking at how other
sites were affected by these bans as well. There could well be a mass exodus
to a different site, like 4chan or Twitter, where the community reforms.

~~~
bglazer
This is explicitly mentioned in section 6.6 "Implications for Other Online
Communities"

------
alan-crowe
This seems like a case of using an easily available proxy endpoint and
cheerfully ignoring broader social issues. We are in the middle of an obesity
crisis and should be looking at the issue of "Fat People Hate" through the
lens of morbidity and mortality.

The typical post on Fat People Hate consists of a picture of a morbidly obese
person overflowing the seat of their mobility scooter. The comments echo each
other as the commenters say how disgusting the sight is.

One topic for empirical investigation is whether the person in the photograph
responds to the shaming by losing weight, or perhaps even gaining more. But
typically the person involved is unaware of the existence of Reddit. There is
a break in the causal chain and nothing to investigate.

Another topic for empirical investigation is whether the (presumably) slim
participants in the Fat People Hate sub-reddit maintain a healthy weight. One
conjecture is that participants are exposed to extreme images of morbid
obesity and become complacent about maintaining a healthy weight. They go on
to become overweight and a proportion get fatter still and suffer serious
health consequences.

An alternative conjecture is that some of the participants feel social
pressures in their daily lives to clean their plate, eat up, and not be a
broccoli-munching spoilsport. Fat People Hate gives them an opportunity to
construct and maintain an identity as someone who doesn't get sucked in by
their local social pressure to overeat. They unexpectedly maintain a healthy
weight, contrary to predictions based on their place in society.

We can articulate an ideal for research in this area. It involves tracking
down the participants in FPH and discovering records of their body weights.
The important questions are: While they participated in FPH, did they buck
national trends towards obesity? After FPH was banned, did they gain weight?

We are actually interested in counting diagnoses of type 2 diabetes and toes
amputated as a consequence of diabetic neuropathy. When we ask about the
effectiveness of the ban on FPH we are actually asking about weight gained and
lives lost. But when research question 1a asks, "How were their activity
levels affected?", they are not talking about walking or jogging. They are
talking about speaking.

------
nanistheonlyist
Perhaps I missed it, but this paper seems to ignore usage of alt accounts.
Many users have more than one reddit persona and may have simply stopped using
their FPH persona rather than leave the site as the authors assumed.

~~~
eighthnate
> Perhaps I missed it, but this paper seems to ignore usage of alt accounts.

It's also mistakenly attributing the lack of "hate speech" by these accounts
to the banning of subreddits, when it's actually that most of reddit now
censors "hate speech" and bans accounts that engage in it.

------
beedogs
Great. Now do /r/The_Donald. PLEASE.

~~~
0xcde4c3db
They actually did tweak the /r/all "heat" algorithm partly to prevent
/r/The_Donald from dominating it (because T_D users were gaming the old
algorithm by mass-upvoting a firehose of shitposts).
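
For context, the pre-tweak baseline was Reddit's long-since-open-sourced "hot"
formula, which trades score (logarithmically) against age. The post-T_D
tweaks aren't public, so this sketch shows only the classic algorithm and why
a firehose of fresh, mass-upvoted posts dominated it:

```python
from math import log10

EPOCH = 1134028003  # reference timestamp (seconds) used in Reddit's source

def hot(ups, downs, timestamp):
    """Classic Reddit 'hot' rank: log-scaled score plus a recency bonus."""
    score = ups - downs
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else -1 if score < 0 else 0
    # Every 45000 s (~12.5 h) of recency is worth one order of magnitude of
    # score, so ten fresh 10-point posts outrank one day-old 1000-point post.
    return round(sign * order + (timestamp - EPOCH) / 45000, 7)

print(hot(1000, 0, EPOCH))                 # 3.0
print(hot(10, 0, EPOCH + 3 * 45000))       # 4.0 -- newer post wins
```

Because the recency term is linear in time while the score term is
logarithmic, coordinated mass-upvoting of a constant stream of new posts can
keep a single community on top of /r/all indefinitely.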

