
When does moderation become censorship? - tomkwok
https://blog.disqus.com/when-does-moderation-become-censorship
======
trowawee
Basically never, at least on the internet? No one owes anyone else a soapbox.
My space, my rules. There's nothing stopping anyone from rolling up a site and
saying whatever they want to say, but there's absolutely no reason I have to
give space on my site, no matter what my site is, for anyone else's viewpoint.

~~~
hackney
You are contradicting yourself, and did not even answer the question. What you
should have said was "as the site owner and moderator, whenever I damn well
please". But instead you said it never happens.

There's a reason I have almost zero sites I log in to. I would think that as
soon as a website becomes embroiled in taking a political viewpoint to the
extreme of not allowing the opposite side to speak, censorship comes into
play. If we are talking about individual attacks, obviously that would and
should fall into the moderation category.

~~~
exolymph
I think what trowawee meant is that they never equate deleting comments (or
whatever) with censorship. It sounds like you do. This is a semantic conflict
re: what "censorship" means, precisely.

Personally, I think that deleting someone's comments is censorship, but that
doesn't mean it's a bad thing. Censorship gets a bad rap, but purging certain
viewpoints from a certain venue is not inherently evil or negative. For
example, Hacker News does it with the shadowbanning system. I don't mind
because dang is a thoughtful moderator.

~~~
trowawee
Also censorship has a specific denotation, i.e. that it is performed by an
agent of the state, and specific connotations, i.e. that the censored party is
unable to express themselves in other ways (among other connotations). A
private citizen deleting your comment on a webpage they own and administrate
matches neither the denotation nor the connotations of "censorship", except in
the broadest theoretical reading. And that reading would appear to prohibit
any maintenance of comment sections whatsoever.

~~~
thaumasiotes
> Also censorship has a specific denotation, i.e. that it is performed by an
> agent of the state

This is not correct; TV networks have censors that operate internally. The
government doesn't operate television censors; it is limited to punishing the
networks after the fact.

> A private citizen deleting your comment on a webpage they own and
> administrate matches neither the denotation nor the connotations of
> "censorship", except in the broadest theoretical reading. And that reading
> would appear to prohibit any maintenance of comment sections whatsoever.

This is a confused argument; you seem to be proceeding from the premise that
censorship is bad, and therefore a phenomenon which is not bad can't be
censorship. And then that if it was censorship it somehow wouldn't be allowed.
(Observe that the definition of "censorship" cannot in itself prohibit or
mandate anything.)

It's much more productive to agree on a definition of a term, and then argue
over whether any particular example is good, bad, or a mixture of both, than
to agree that a term must be good, and then argue over whether certain things
are metaphysically able to fit under that term.

~~~
trowawee
1\. When a television employee is censoring according to the rules of the
state, they are operating as an agent of the state.

2\. No, I'm not. I'm proceeding from a definition of censorship as "the
practice of officially examining books, movies, etc., and suppressing
unacceptable parts." Getting into "the dictionary says..." arguments is mostly
boring, but most of the definitions do include some element of official
imprimatur. Someone removing comments from their personal site fails to meet
that official element.

~~~
thaumasiotes
Media censors take the rules of the state into account, but they are primarily
concerned with pressure groups, not the state. They are most definitely not
state agents; that would violate the constitution.

The publisher of a website removing comments from it meets the same soft
officialness standard as a newspaper declining to run articles on a particular
topic, or a television network refusing to air episodes in which someone
disagrees with the group but doesn't end up suffering for it (not an invented
example, by the way -- that was a real element of a code governing children's
cartoons).

~~~
trowawee
The majority of censors are working to prevent state censure; they are
primarily concerned with the state. The primary work they do is removing
profanities and indecent images at the behest of the state.

~~~
meric
"The primary work they do is removing profanities and indecent images at the
behest of the state."

I'm sure that's the primary function of the Chinese government's _Central
Propaganda Department_.

Censorship is when someone or some organisation that has power over you
controls your ability to access information.

That runs the gamut: from a government regulating the content internet users
can access and deleting content contrary to its goals, to administrators of
personal websites deleting comments by people espousing views the owner of
said website considers disagreeable.

~~~
trowawee
Except the government's power to censor you goes way farther than somebody on
a private site deleting your comment. You're saying popguns and artillery
exist in a continuous spectrum, and I'm saying they're completely different
categories and should thus be treated differently.

~~~
meric
The diameter of the star R136a1 is vastly larger than that of a typical red
dwarf. They might be different categories of stars, but they're both stars
nonetheless.[1]

I'd say a personal website eliminating disagreeable views from comments is
actual censorship, even if its ability to silence views is a few orders of
magnitude smaller than state censorship's.

If state censorship is artillery, personal censorship is at least a zip gun,
and it can injure someone.[2]

[1]
[https://en.wikipedia.org/wiki/R136a1#/media/File:The_sizes_o...](https://en.wikipedia.org/wiki/R136a1#/media/File:The_sizes_of_stars.jpg)

[2]
[https://en.wikipedia.org/wiki/Improvised_firearm#/media/File...](https://en.wikipedia.org/wiki/Improvised_firearm#/media/File:A_Crude_Indian_Homemade_%22Gun%22.jpg)

~~~
trowawee
Who can it injure? The state can actually legally prevent you from saying
something ("That is slander/libel") at the threat of punishment. I can just
prevent you from saying that on my website. I can delete your comment on
trowaweeritesgud.com, but I can't stop you from saying whatever ("Kumquats are
a better fruit than mangos!") in the comments here, or on NPR, or from rolling
up your own site and saying it there. These aren't just (massive) differences
in scale, they're also massive differences in effectiveness of the injunction.
That's what moves it into its own category.

~~~
meric
You can't stop comments on NPR, and China can't stop people accessing
information in a foreign country, bringing it back to China, and providing it
in person in a secret room either. They are massively different in scale, but
that is not a reason to put them in different categories (same as with stars:
there are massive differences between them in scale, but they're stars
nonetheless).

------
Pengwin
My idea of moderation is:

1\. Outline rules for a community, make them clear and understandable

2\. Enforce those, and only those, rules.

3\. If something objectionable happens, and the rules don't cover it but you
think they should, then change the rules as you see fit.

4\. Handle the fallout of the rules changing.

Steps 3 and 4 are where things mostly fall apart in communities. Personally,
any time something happens that leads me to change a rule, the change is
reactive and reflects the voice of the majority of people (not the loudest,
the majority). Most people are happy that way.

Banning disagreement is never a rule I'd have. I'd only make sure that people
either have the tools to self-censor what they don't want to see, or tell them
they might be happier going elsewhere.

~~~
esbranson
> _Outline rules ... clear and understandable_

There are many discussions about rules being either clear or understandable,
but usually not both.

> _Enforce..._

Oh. We "just" enforce it? You sure this mechanism will be both clear and
understandable, _and_ be enforced as intended?

> _change the rules_

Change the rules, in reaction to specific instances, while keeping them clear
and understandable? This is why rules usually end up neither clear and
understandable nor enforced as intended.

> _Steps 3 and 4 are where things mostly fall apart in communities._

There is no community where this idea of moderation could hold true.

~~~
Pengwin
Really? That seems to be exactly the logic most online communities try to
follow.

I fully get that 'clear' and 'understandable' are very subjective, though.

------
CookieMon
I've seen so many moral-handwringers try to justify silencing people by
trotting out that XKCD comic, as though they themselves were "the community",
that Cueball has weirdly stopped feeling like a positive hacker character to
me - half the time I encounter him now it's as some apologist for "I know
best" assholery. The "Moral Majority" all over again.

In my experience it's hardly ever been the community showing someone the door;
it's almost always a corporation getting cold feet over potential controversy,
or some narrow-minded zealot abusing their janitorial moderator powers.

Last time this happened it was the community that overwhelmingly wanted to
show the zealot the door (a vote happened while the mod was away), but they
couldn't - they were nice normal people and not the kind of weasels who
dedicate hours of janitorial work for the chance of having veto power over
others in an internet forum.

Moderation and censorship can be separated - readers can be given the option
to circumvent the moderation when/if they choose, or they can have the option
to "unsubscribe" from the actions of moderators they disagree with. To create
genuine public spaces we are going to have to find a way that doesn't assume
incorruptible integrity of moderators (the role attracts weasels). Private
websites can certainly behave however they want, but they are currently
standing in for our public spaces - and presenting themselves as such, which
means we have none.

~~~
andrewflnr

      ...or they can have the option to "unsubscribe"
      from the actions of moderators they disagree with
    

This is a really interesting idea. Another would be to make moderation opt-in,
e.g. you select the curated filters you want to apply to forum X based on
their reputation for fairness and pleasantness. These might in fact be the
personal filter settings of upstanding members, or a vote between a few such
members.
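A minimal sketch of this opt-in scheme (all names and data here are invented
for illustration): a reader subscribes to whichever moderators' filter lists
they trust, and a post is hidden for that reader only if one of their chosen
filters flags it.

```python
# Opt-in moderation sketch: filters are just predicates a reader chooses
# to subscribe to; nothing is removed globally.

def visible_posts(posts, subscribed_filters):
    """Return the posts that none of the reader's chosen filters flag."""
    return [p for p in posts if not any(f(p) for f in subscribed_filters)]

posts = [
    {"author": "alice", "text": "Kumquats are a better fruit than mangos!"},
    {"author": "spambot", "text": "BUY NOW!!!"},
]

def no_shouting(p):
    # One upstanding member's personal filter, offered for others to opt into.
    return p["text"].isupper() or "BUY NOW" in p["text"]

# A reader subscribed to no_shouting sees only the first post; a reader
# subscribed to no filters sees everything.
```

A community could publish several such filters and let readers compare them by reputation, exactly as the comment suggests.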

Anyway, thanks for the yummy food for thought.

------
daurnimator
Moderation vs censorship reminds me of [https://en.wikipedia.org/wiki/If-by-
whiskey#Canonical_exampl...](https://en.wikipedia.org/wiki/If-by-
whiskey#Canonical_example)

------
esbranson
A moderator --is-- a censor.

~~~
exolymph
Yes. Agreed. They're different terms for the same thing, with different
cultural connotations.

~~~
thaumasiotes
Basically what I came to say. daurnimator said it better, though.

------
JakeAl
If Disqus would simply design their software so it allows users to
filter/block whichever users they don't want to see, no one would need to be
censored or moderated. All the trolls, paid or not, would vanish from a user's
view, and it would be up to each user which messages they see. Problem solved.
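A minimal sketch of that client-side blocking (the names are invented, not
Disqus's actual API): each user keeps a personal block list, and comments from
blocked users are simply never rendered for that user.

```python
# Per-user block list: moderation stays in each reader's hands.
# Nothing is deleted globally; it just disappears from this user's view.

def visible_comments(comments, blocked):
    """Filter a comment stream against one user's personal block list."""
    return [c for c in comments if c["user"] not in blocked]

comments = [
    {"user": "alice", "text": "Interesting article."},
    {"user": "troll42", "text": "You're all idiots."},
]
my_blocked = {"troll42"}
```

Each reader gets a different view of the same unmoderated stream, which is the "problem solved" the comment is pointing at.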

------
paulddraper
Moderation: the action of making something less extreme, intense, or violent

Censorship: the practice of officially examining books, movies, etc., and
suppressing unacceptable parts

Moderation is censorship.

To say otherwise is to make a distinction without a difference.

Rephrase your question to have meaning. Perhaps "When is censorship
justifiable?"

------
dragonwriter
Moderation is always censorship. But censorship isn't always problematic.
Government censorship, or censorship by an entity that has effectively the
same kind of broad control over the media of communication, is problematic.

------
dredmorbius
As others have noted, moderation _is_ censorship, at least by action. Neither,
however, falls under the _legal_ protections of free speech.

The first question is: what is a given community for, and how should it
accomplish those goals? Individual moderation, collaborative filtering,
individual killfiles, expert ratings systems, etc., are all tools toward these
ends. They're non-trivial.

[https://www.reddit.com/r/dredmorbius/comments/28jfk4/content...](https://www.reddit.com/r/dredmorbius/comments/28jfk4/content_rating_moderation_and_ranking_systems/)

 _Community behavior is very strongly dependent on BOTH scale AND founder
cohorts._

A group with one person (blog, winking in the dark) is different from one with
two, or a small set of people engaged in discussion (say 3-30), or a larger
group discussing common topics (say 30-100), etc. Part of this can be thought
of as a _cost function_ in which the _positive_ contribution of members falls
with scale, while the _cost_ of each additional participant is rather more
constant. Eventually, adding more participants makes the experience worse for
all.
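As a hypothetical illustration of that cost function (the numbers and the 1/k
falloff are invented, not from the comment): per-member benefit diminishes
with group size while per-member cost stays roughly constant, so total value
rises, peaks, then falls.

```python
# Toy cost function for community scale: diminishing marginal benefit
# per additional participant, constant marginal cost per participant.

def net_value(n, benefit_scale=10.0, cost_per_member=0.5):
    """Total value of a group of n participants."""
    # The k-th member adds benefit_scale / k of value (diminishing),
    # but imposes a constant cost_per_member on the group.
    benefit = sum(benefit_scale / k for k in range(1, n + 1))
    cost = cost_per_member * n
    return benefit - cost

# Value peaks at some finite group size; beyond it, each extra
# participant makes the experience worse for everyone.
best = max(range(1, 200), key=net_value)
```

With these made-up parameters the peak lands at a smallish group, which matches the comment's intuition that adding participants eventually degrades the whole discussion.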

It's _really_ difficult for any one conversation to have more than a few key
participants. Two and a moderator, or perhaps 5-6 participants _who know each
other well and get along_.

If you're trying to arrive at some truth or understanding, it's really
difficult not to have a truth-based moderation criterion.

Individual killfiles are somewhat useful, except that the killfile's owner
tends to see a great many one-sided discussions (others interacting with those
they've filtered). Unless the system blots out _both_ sides, this accomplishes
little.
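The difference can be sketched like this (the data structure is invented): a
naive killfile hides only the filtered user's comments, while a "both sides"
version also hides replies to them, so no one-sided threads remain.

```python
# Comments are listed parent-before-child; "parent" is None for top level.

def naive_killfile(thread, killed):
    """Hide only the killed users' own comments."""
    return [c for c in thread if c["user"] not in killed]

def both_sides_killfile(thread, killed):
    """Also hide replies to hidden comments, recursively."""
    hidden = set()
    out = []
    for c in thread:
        if c["user"] in killed or c["parent"] in hidden:
            hidden.add(c["id"])  # blot out both sides of the exchange
        else:
            out.append(c)
    return out

thread = [
    {"id": 1, "parent": None, "user": "alice", "text": "Nice post."},
    {"id": 2, "parent": None, "user": "troll", "text": "Garbage."},
    {"id": 3, "parent": 2, "user": "bob", "text": "No it isn't!"},
]
# naive_killfile leaves bob's reply dangling with no visible context;
# both_sides_killfile removes the whole exchange.
```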

I've started looking more at discussion tools which foster both smaller and
larger groups. "Warrens" and "plazas". The idea of creating persistent
communities well below Dunbar's number (say 50-300 people) and promoting up
material from those has some appeal (you'd also want to allow for lateral
movement of individuals). A small set of tiers would easily accommodate the
entire global population. (And yes, there's a social network set up on this
basis, though the name escapes me.)

A huge problem with allowing noisy participants is that they draw the oxygen
out of the room, and tend to _very_ strongly discourage high-quality
participants. There's a curiously persistent asymmetry between an individual's
proclivity to participate in a group discussion and the interest of others in
their doing so. Good designs balance this mismatch.

------
green_lunch
It becomes censorship when a moderator removes a comment or discussion based
purely on personal beliefs and disagreements.

Moderation should be about removing the trolls, not what it has become, which
is censorship.

It feels like many people personally enjoy moderating down disagreements
because they don't get any actual power in their own lives.

It's one of the reasons I stay away from most online discussions these days:
because nobody can be open and honest. It really makes me wonder if this is
the reason secret groups like the Masons were created. Back in those days,
instead of getting moderated down, you were killed or attacked.

~~~
droithomme
"Moderation should be about removing the trolls"

True, but some argue that those who support Trump, or who are against gender
identity equality, are trolls.

~~~
gozur88
That's the problem in a nutshell. About 95% of the people out there think the
definition of "troll" is "someone who disagrees with me", and I don't see how
you ever really get away from a subjective determination.

~~~
trowawee
Because it's a fundamentally subjective evaluation. There is no way to
objectively define "trolling".

~~~
markdown
> There is no way to objectively define "trolling".

Oh it's been very objectively defined. _Knowing_ when someone is trolling, or
when they legitimately believe what they're saying is the problem.

~~~
trowawee
So...it's defined, but it's impossible to know whether it's happening unless
the person specifically tells you they're trolling? That sorta seems like a
flawed definition.

~~~
markdown
Can you define lying? Has lying been defined? Check your dictionary.

Knowing what a lie is is not the same as knowing when someone is lying.

------
CookieMon
And Hacker News deletes it, <sigh>

~~~
esbranson
Epic

Edit: I don't think this item has been deleted, it apparently was flagged and
has been "moderated" automatically without human intervention?

~~~
CookieMon
Sounds plausible - abusing the flag/report system seems to be common with
groups who like to act as self-appointed censors.

How did you tell the difference? (or suspect the difference)

------
lotsoflumens
... immediately.

~~~
scrollaway
A user-driven upvote/downvote system is a form of decentralized moderation. Do
you also believe that if a post is being downvoted, it's censorship? Do you
believe that upvotes are a tool of censorship?

They can be, of course. Censorship is not just deletions/removal. You can
censor something by hiding it, drowning it out, discrediting the author, etc.

Now keep all that in mind before giving a one-word response to a very complex
question.

~~~
esbranson
The article uses moderation in a sense apart from an upvote/downvote system.
Your argument is premised on an upvote/downvote system being moderation. _How_
exactly does it moderate? If it merely reorders and doesn't remove, then this
is not moderation in the context at hand.

Therefore your argument is invalid. (Your argument is really just orthogonal,
premised on different assumptions.)

~~~
scrollaway
The point I was trying to make was that you can't just say "immediately" and
get away with it. And upvotes/downvotes don't necessarily merely reorder. On
many websites, a large count of downvotes automatically hides, and sometimes
deletes, the post in question.
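That threshold behaviour could be sketched like this (the specific thresholds
are invented, not taken from any real site): below one score a post is merely
collapsed behind a click; below a lower score it is removed outright.

```python
# Score-driven moderation states: downvotes alone can escalate a post
# from visible, to hidden, to deleted.

HIDE_BELOW = -5    # hypothetical: collapse the post at this score
DELETE_BELOW = -20  # hypothetical: remove the post entirely

def post_state(score):
    """Map a post's net vote score to its display state."""
    if score <= DELETE_BELOW:
        return "deleted"
    if score <= HIDE_BELOW:
        return "hidden"
    return "visible"
```

Under such a scheme, ordinary voting shades directly into removal, which is why the parent argues votes can be a censorship mechanism and not just reordering.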

~~~
lotsoflumens
Sorry about the delay in responding ...

I don't consider upvotes/downvotes by themselves to be moderation or
censorship. That's just non-verbal communication.

However, when an upvote or downvote system hides a post or promotes a post,
then it's censorship, because promoting a post necessarily hides some other
post.

