
Social Media Moderation without Banning - username3
Let users choose their moderators.

Allow user-created moderation groups.

Let users follow multiple moderation groups.

Hide posts or users moderated by a group.

Show conflicts if two groups disagree on what or who to ban.
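The scheme above is concrete enough to sketch. A minimal model, with all names hypothetical (not from any real implementation): each group keeps a banlist, a user follows any number of groups, a post is hidden if any followed group bans its author, and a conflict is surfaced when followed groups disagree:

```python
from dataclasses import dataclass, field

@dataclass
class ModerationGroup:
    name: str
    banned_users: set = field(default_factory=set)

@dataclass
class User:
    name: str
    followed_groups: list = field(default_factory=list)

    def can_see(self, author: str) -> bool:
        # A post is hidden if any followed group has banned its author.
        return not any(author in g.banned_users for g in self.followed_groups)

    def conflicts(self, author: str):
        # Followed groups that disagree about this author: some ban, some don't.
        bans = [g.name for g in self.followed_groups if author in g.banned_users]
        allows = [g.name for g in self.followed_groups if author not in g.banned_users]
        return (bans, allows) if bans and allows else None
```

A user following no groups sees everything; following only strict groups hides the most. This is one reading of the proposal, not a spec.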
======
DoreenMichele
The fundamental flaw in this question is the assumption that all users are
trying to participate in good faith. This will not be true.

If you get any traction, a high percentage of banned users will be malicious
actors who are trying to spam the forum for purposes of making money or
similar. It won't actually be members trying to talk with people and failing
to be understood or rubbing people the wrong way, which is basically what this
question aims at addressing.

"Behave acceptably or be excluded in some manner" is the crux of all social
control. Every utopian scheme to include "everyone" has only worked with a
small, self-selected group of like-minded people. When they open it up to the
broader public, freeloaders who refuse to meet expectations consistently ruin
it and usually in short order.

------
CharlesColeman
I've heard similar proposals made a million times, but never seen one
implemented. I see a few major problems with it:

1. Moderation is a thankless task, and it's unclear why anyone would put in
the effort to bootstrap a "moderation group" -- it could be a huge amount of
work for zero members

2. Legal liability - the site administrators still need to moderate away
illegal content that could get the site closed down

3. Fragmentation - users segregate into separate communities polarized by
some conflict or other, which would probably lead to weird community dynamics

~~~
gus_massa
> _3. Fragmentation_

It's essentially the idea of having a few subreddits about the same topic,
like r/btc and r/bitcoin. (Or something like that; I'm not following either.) Both
are about the same topic, and the difference is how they filter the posts and
interact with the community.

The alternative is to have both groups in a single big supersubreddit, where
some users can choose to filter the users banned by the r/btc moderators, some
users can choose to filter the users banned by the r/bitcoin moderators, some
users can choose to filter both, and some users can choose to filter neither.
(And there are a few more alternatives; the mods can add whitelists too, and you
can combine them.) I don't know if it would work, but the brigade wars would be
amazing.
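The combinations described here reduce to set arithmetic. A sketch, assuming (my guess, not stated above) that a followed whitelist overrides a followed banlist:

```python
def hidden_authors(followed_banlists, followed_whitelists):
    # Hidden = union of the banlists a user follows, minus anyone on a
    # whitelist they also follow (whitelist-wins is an assumption).
    banned = set().union(*followed_banlists)
    whitelisted = set().union(*followed_whitelists)
    return banned - whitelisted
```

Filtering "both" is passing both banlists, "neither" is passing none; every combination in the comment is just a different argument list.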

------
krapp
What are moderators supposed to do if they can't ban? Give users a stern
talking to?

4chan, 8chan, Voat... _every_ community, no matter how free and anarchic, has a
line somewhere that will get you banned if you cross it. Even "anything, as
long as it's legal" means "anything illegal gets banned."

~~~
username3
Moderators can flag bad users.

Users decide if they agree with the moderators.

If the user agrees, the bad users are hidden from the user.

The user decides where the line is.

~~~
krapp
So users will simply never agree with moderators and will decide there should
be no line. Why bother with moderators at all?

~~~
username3
User A will follow moderators that block users who cross the line.

User B will not follow moderators and will see all users.

------
ddingus
Ok, I am gonna put this here, despite the fact that it is both gonna be
controversial and some work:

 _Rules._

You want as few as possible. Where there are rules, people will both press
them to see where the boundary issues are, and they will game them to act
against others from a position of relative impunity.

Rather than define how shitty people can be to one another, make doing that
risky, and set the expectation that treating others right, with mutual respect,
is the ONLY safe thing.

 _Agency in conversation._

The hard truth is most people are either lazy, or unaware of the full set of
options in dialog.

Example: getting called an ass by a clown

Weighting comes first. They are a clown. The weight here is modest on a good
day.

Then come the options. The number one response is righteous indignation. "How
dare you..." But they are clowns, trolls, or just toxic. Of course they dare, and
that response is exactly what they expect.

And you can bet your ass their weighting on whatever indignation you express
is near zero too. They will not care and consider it all entertainment.

Other choices:

Rate the garbage: "D, not feeling it," etc...

Ignore and redirect. "Dude, at least try, now back on topic..."

Humor. SPIN it and have fun.

You get the idea here.

Denying them the basic, standard response disincentivizes that behavior,
leaves you in control of the overall dialog, puts others on notice, and leaves
you holding the higher ground.

Conversations go bad when you let them go bad. Don't let them.

 _Super important:_ when a moderation event happens, and the user could have
employed agency to recover and maintain a good dialog, tell them that and ask
them to try next time.

Do this with a no-blame, strength-building approach. Everyone can improve
in how they employ agency, and when they do, the outcome is amazing. Model it
for them, if need be.

The troll or abuser is totally out of bounds. But their impact has more to do
with how effective people are at managing dialog than with whatever shit they
said.

Not everyone will play that ball. Fine. Perhaps they can go somewhere else
too. No joke. (Without this, moderators end up babysitters, and fuck that.
Babies can be as much or more trouble than toxic people can be.)

No blame, no fault. Just improve, seek better, and it will happen for them, and
for everyone, next time.

(I know, work and high controversy, but oh so effective.)

 _Norms vs rules_

Most of what I just put here are norms. They get distributed among members of
the community by modeling them successfully. The more they are seen and used,
the more potent they are. And unlike rules, norms are much harder to
game.

I have done this in communities to a degree where trolling all but goes away.

Why?

Risk vs. reward. Someone trolling can do so at very modest cost. The rewards can
be huge! Bans and other blunt instruments are expensive and inhibit real, even
heated, but otherwise high-value interactions. Chilling effects and ripple
effects abound!

(A ban impacts people and the others connected to them.)

It is hard to raise the cost, but the reward can be impacted significantly.
Low risk reward = low nefarious actor potential.

 _Cost to post._

Rather than ban, the cost to contribute can be increased, and this can be done
while leaving the discussion and the potential for better all on the table. It
also works well for everyone to see that things are being done.

No four-letter words, for example. If a post contains them, it is rejected.
They try again. This is hard on them, expensive, but not crippling. When the
norms are well placed, it also is fun, and the person who needs help
often gets it. When they improve, remove the no-four-letter-words restriction,
all forgiven.
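That rejection gate is simple to sketch; the word list here is hypothetical, standing in for whatever a real site would restrict:

```python
# Hypothetical restricted-word list; a real site would maintain its own,
# possibly per-user as part of the temporary "cost to post."
BANNED_WORDS = {"damn", "heck"}

def accept_post(text: str, banned_words=frozenset(BANNED_WORDS)) -> bool:
    # Reject the post outright if it contains any restricted word;
    # the author can edit it and try again.
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    return words.isdisjoint(banned_words)
```

The point of the design is that rejection is recoverable: nothing is deleted, the author just pays a small editing cost before the post lands.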

Disemvoweling. No vowels for x posts or x days. Do not use just one or the
other; both work well together. 3 days and 10 posts works great.

"Sry ppl"

There are other tools along these lines that are not bans, that do not shut
people out, and that really do cultivate a strong culture that is inoculated
against trolling and abuses of various kinds.

When they complain, tell them this community does not exist for their nearly
free entertainment. Pay up, or play nice.

The beauty is others see the nefarious person "paying" and will actually often
demonstrate better agency in dialog! A troll trying to troll in plain view is
far less impactful. Funny, most of the time.

Some will leave, some will reform, some will rage. Ban the ragers, let the
rest play out.

 _Moderators._

The best way to strengthen a community and be valued is to participate and
help.

Make it clear when an expression is moderation vs just being a user.

You have all the power. You should never, ever let a troll get under your
skin. Weigh things lightly, model good discussion options, offer help.

If needed, have another moderator use the tools. Moderating a discussion you
are in sucks. Get help.

Cultivate two things:

No victims. There will be a few, but empower them, help them, and do that
which makes the community strong and makes them feel secure and more able.

No fear. Being among the people there helps build family and reinforce norms, and
people will have real conversations about problems. The moderators hidden in
the citadel are vulnerable to a whole range of subversion attacks that well-
loved, servant-type moderators are not.

Help includes finding resources, running searches, publishing good group
findings, and having fun, even encouraging it at regular times.

Done right, the community will be inoculated, well equipped to deal,
welcoming and helpful to others, and will laugh more often than it cries or is
indignant.

All of the things you, moderator, will have modeled, cultivated and empowered
the community to do.

It scales. All it takes is a few great users to hold the norms, and the actual
moderation duties are light.

Automate spam removal, and make moderation of abuse and criminal speech simple,
effective, and well understood.

 _Bottom line is everyone in the community_ is responsible for their part in
it. Step up and it gets exponentially easier as people come to realize they
just do not need to be impacted.

(HN actually uses norms in an exemplary way; well done, moderators. I am a fan.)

The moment people step up?

Game changer. Trolls and others will smell it right away and often just leave
long before any action gets taken.

~~~
ddingus
For anything criminal, or where liability is high, simply remove it, act
quickly, and make it very public and very clear. Everything I wrote above is
for non-criminal but toxic speech.

