Ask HN: Suggest steps that can help to stop the trolls in an online community? - startupflix
======
at-fates-hands
I used to manage a few online communities, and you're always going to have
trolls. Here are some ways you can control them.

1 - Moderate comments and remove any comments you don't like. Make a rule that
all comments need to be approved before they are posted. This will cut down on
about 60% of the stuff you'll see.

2 - Shadow ban users:
[https://en.wikipedia.org/wiki/Shadow_banning](https://en.wikipedia.org/wiki/Shadow_banning)

3 - If Shadow Banning doesn't work, then target and block their IP addresses.
If you need to, warn them you will contact their ISP and let them know.

These are the big three that always seemed to limit the trolling behavior I
saw. Only in extreme cases do you have to do #3.
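
Shadow banning (point 2) boils down to per-viewer visibility filtering: the banned user's posts are stored normally but rendered only back to their author. A minimal sketch in Python; all names and data shapes here are illustrative, not any real forum's API:

```python
# Minimal shadow-ban sketch: a shadow-banned user's comments are stored
# normally but shown only to that user, so the troll never learns
# they were banned.

shadow_banned = {"troll42"}  # usernames currently under a shadow ban

comments = [
    {"author": "alice", "text": "Great post!"},
    {"author": "troll42", "text": "Everyone here is wrong."},
]

def visible_comments(viewer, all_comments):
    """Return only the comments a given viewer should see."""
    return [
        c for c in all_comments
        if c["author"] not in shadow_banned or c["author"] == viewer
    ]

# Ordinary readers never see the troll's comments...
regular_view = visible_comments("alice", comments)
# ...while the troll still sees their own and keeps posting into the void.
troll_view = visible_comments("troll42", comments)
```

The key design point is that the filter is applied at read time per viewer, so no "you are banned" signal ever reaches the troll.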

~~~
Nomentatus
Note that the worst and most sophisticated trolls will just keep a second,
well-behaved account to monitor the actual forum as others see it, so shadow
banning won't be any more powerful than a plain ban against them. Forcing you
to #3, as you say.

Someday we'll have shadow banning deluxe: not only will the troll see a
different forum, but they alone will see fake A.I.-generated responses that
cause them to waste endless amounts of time continuing to troll-debate. Time
they can't spend buggering around with someone else's forum.

Sadly, by that time, we'll also see A.I. auto-troll-programs appearing in
large numbers.

------
cntrlaltdlt
It's probably best to start by defining what factors contribute to the
trolling of an online community.

What particular communities are more susceptible and why?

Let's first tackle the non-sybil form of trolling and think about individuals.

Consider the differences and similarities between a community like LinkedIn
vs., say, 4chan. These are as far apart on the spectrum of communities as I
can currently think of. Is one of these communities more easily trolled or
duped than the other? Why might that be?

I hypothesize that a community like LinkedIn would be more difficult to troll,
as you must offer personal data (or spoof it) in order to be let into the
community. Certainly this isn't impossible to circumvent, but it requires a
greater degree of commitment. That commitment is essentially a subconscious
time-value calculation, as well as a calculation of social cost. The logical
conclusion is that being a troll on LinkedIn is costly, because there is a
very real and immediately perceivable lost-opportunity cost.

Let's take another example of online discourse and hold a single variable
constant: identity. Consider a discussion forum for a college course, where
students are asked to discuss class material learned in a physical lecture
hall, vs. a topical online discussion in a Facebook group. Both groups are
forced to offer up some form of personal identification. There is one
difference between the groups, and that is face-to-face interaction. My
hypothesis is that the first group attaches a larger social cost to trolling
the discussions, with perceivable punishment for acting in bad faith, whereas
on Facebook the social costs are diminished and mostly amount to being
unfriended. Even removing the possibility of lost tuition dollars as
punishment (i.e., you attend a free college), I would argue that the
per-capita rate of trolling would be lower than on Facebook.

To summarize my hypothesis within the original scope: if an individual does
not perceive the social costs of their actions within a community, their
propensity for becoming a bad actor should increase.

~~~
Nomentatus
Not quite a troll, perhaps, but an interesting sock puppet:
[https://theweek.com/speedreads/764714/dizzying-story-david-jewberg-fake-pentagon-official-stirring-antikremlin-sentiment-abroad](https://theweek.com/speedreads/764714/dizzying-story-david-jewberg-fake-pentagon-official-stirring-antikremlin-sentiment-abroad)

------
Nomentatus
Another bad rule I don't advise: insist on a peer-reviewed journal reference
for literally everything said. There are a lot of problems with this, too many
to list, really.

One is that inevitably it is only fully applied to statements that contradict
the current prejudice, whatever that is; blocking the very sort of up-to-date
knowledge you hope to find in a forum.

Another problem is that accomplished trolls will badly distort or reverse the
takeaway from an opaque study and then drop that as a reference. The moderator
can't possibly devote the hours to police that.

But yes, I've actually seen this system tried in a forum.

------
Nomentatus
Humans need intermediate punishments. It's how we work. (Homeostasis and all
that.)

Therefore Facebook (etc.) ought to provide a fully automatic "time-out" system
for moderators, allowing them to create a temporary ban (a week, a month) in
one click that reinstates the user automatically after that period _without_
further action by the moderator. Automatic tracking of past penalties, and
auto-escalation (if desired) to harsher punishments up to a full ban, would be
a good idea too.

A quick display of relevant stats regarding that user would be helpful right
then too, giving you some rough idea of how valuable their past contributions
have been. (Sometimes what looks like trolling is just a fact that few people,
even in the field, know.)

Manually managing such a system works, but Lord, is it a ton of fiddly (and
unnecessary) work for the moderator.

One of the advantages of this is that mild penalties can usually make the
point, with little risk that a bad penalty decision will permanently piss off
a valuable contributor.
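
As a rough illustration of the auto-expiring time-out with escalation described above, here's a toy in-memory sketch in Python. The escalation schedule and all names are assumptions for the example, not anything Facebook or any platform actually provides:

```python
from datetime import datetime, timedelta

# Hypothetical escalation schedule: first strike a week, second a month,
# third permanent.
ESCALATION = [timedelta(days=7), timedelta(days=30), None]  # None = permanent

class TimeoutTracker:
    def __init__(self):
        self.strikes = {}       # user -> number of past penalties
        self.banned_until = {}  # user -> expiry datetime, or None if permanent

    def penalize(self, user, now):
        """One-click penalty: duration escalates with the user's strike count."""
        strike = min(self.strikes.get(user, 0), len(ESCALATION) - 1)
        duration = ESCALATION[strike]
        self.strikes[user] = strike + 1
        self.banned_until[user] = None if duration is None else now + duration

    def can_post(self, user, now):
        """Users are reinstated automatically once the ban period elapses."""
        if user not in self.banned_until:
            return True
        expiry = self.banned_until[user]
        return expiry is not None and now >= expiry
```

The moderator acts once (`penalize`); reinstatement requires no further action, since `can_post` simply compares the stored expiry against the current time.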

------
Nomentatus
A bad "solution" I've seen more than once in health user groups: ban
disagreement (but allow contradictory threads). "If you don't like what you
see, just move along to the next thread." This removes any chance of limiting
fake news and takes trolling to a whole new level, since the first big lie
wins in any thread. So, not that.

------
slater
Make people pay for a user account

~~~
startupflix
Will people really pay? I'm not sure about this one.

~~~
sincerely
worked for SomethingAwful

~~~
cntrlaltdlt
Not to be snarky, but this worked in 2002(?) to combat 12-year-old trolls.

The point it really speaks to is this:

What is an appropriate barrier to entry for an online community where two
conditions are satisfied?

1. The online community can scale.

2. The online community is susceptible to what is essentially a Sybil attack.

It may be the case these two conditions are inversely related and can't be
simultaneously satisfied.

In SA's case, the admins correctly identified that 12-year-old trolls aren't
willing to pay 5 dollars to troll. That was an economic barrier to entry that
just doesn't apply in the current landscape.

~~~
AnimalMuppet
The current landscape _also_ includes 12-year-old trolls who aren't willing to
pay 5 dollars to troll. If a barrier stops some but not all trolls, it's still
useful.

The problem with charging is, I probably wouldn't pay for the privilege of
posting here. I suspect I'm not alone. If you charge to deter trolls, you
deter (some) trolls, but you also deter some fraction of real users. It would
result in a different community.

~~~
cntrlaltdlt
Let me put a few caveats on why 5 dollars in 2002 does not equal 5 dollars in
2018 for a 12-year-old.

Ease of payment: when SA instituted this barrier, a young troll intent on
disruption had less of a problem coughing up 5 dollars for an account, and
more of a problem finding a means to pay those 5 dollars. This is not the case
for today's youth. See Xbox Live and Twitch donations if you don't think I'm
being fair here.

Scalability, which you directly touch on: enforcing an upfront cost is a very
real value calculation one must make prior to having extracted any value from
the service, and something most people, myself included, are currently averse
to. This directly impedes a community's ability to scale.

I can completely relate to your point about not paying to post on the forums.
For years I had friends telling me to sign up, and I always just shrugged and
responded, "Meh, why would I pay for that sort of thing?" I didn't understand
(at the time) the point of doing it. My (and many people's) value structure
was incongruent with that of SA's owner.

Moving forward, people have to value the idea of an online community free of
trolling in order to even put a price tag on that idea. I would find it very
hard to believe that

a.) enough people (I'm talking an order of magnitude equal to the number of
Facebook users) value a troll-free environment, and

b.) that said people would have a unified enough idea of what an act of
trolling is that focused discussion doesn't boil down to pointing fingers.
Maybe this point would merely resolve itself over time.

------
ggggtez
Ban users early and often. Make signing up for accounts difficult (pay real
money, require phone-number verification, etc.). It needs to cost the user
money (or time) directly: they need to keep finding phone numbers that aren't
banned, and so on.

~~~
ILikeConemowk
>Ban users early and often.

OP is referring to an online community, not an echo chamber. Therefore, your
advice does not apply, IMHO.

~~~
Nomentatus
I don't understand. There may (or may not) be an argument to be made about the
difficulty of sorting trolls from people with unusual but accurate knowledge,
but you haven't made that argument. I'd be interested to hear it, but you've
got to actually make it.

I've never seen a never-ban policy, and I can't believe one would work; the
conduct of genuine psychopaths online can be truly extreme, not just uncivil
but designed to prevent the forum from functioning. Persistence-bullying is
one of the nastier forms this takes.

------
DanBC
Depends on the type of troll.

Meatball Wiki has a bunch of information about this. Here are a few pages:

[http://meatballwiki.org/wiki/UsAndThem](http://meatballwiki.org/wiki/UsAndThem)

[http://meatballwiki.org/wiki/VestedContributor](http://meatballwiki.org/wiki/VestedContributor)

[http://meatballwiki.org/wiki/GoodBye](http://meatballwiki.org/wiki/GoodBye)

[http://meatballwiki.org/wiki/DissuadeReputation](http://meatballwiki.org/wiki/DissuadeReputation)

------
gesman
vBulletin forums had a way for an admin to quietly disable a user, making
their posts invisible to everyone but themselves.

So the proud troll keeps moving full steam ahead, with no harm done to anyone.

They called this feature "Tachy Goes to Coventry."

~~~
matt_the_bass
That’s fantastic! Thanks. I didn’t know that.

------
Nomentatus
A.I. to detect character disorders: people with character disorders tend to be
very repetitive and characteristic in their behavior (hence the name). They
can't be remotely cured and won't temper their behavior in any way that truly
matters.

~~~
Nomentatus
It's not well understood that many trolls don't see themselves as trolls (even
though they thoroughly are). For example, narcissists who just can't allow
themselves to be contradicted, and who really take the gloves off when they
are; or who simply want to take over a forum and hog all the oxygen and glory.
Sadly, such people are not a smaller problem than conscious trolls. In fact,
they may be far harder to discourage, since they lack insight into their
behavior and won't necessarily act with any prudence. You can't even count on
them to be self-serving.

