
Ellen Pao: The Trolls Are Winning the Battle for the Internet - gwintrob
http://www.washingtonpost.com/opinions/we-cannot-let-the-internet-trolls-win/2015/07/16/91b1a2d2-2b17-11e5-bd33-395c05608059_story.html?postshare=7391437061068212
======
partiallypro
I'm going to disagree. I've been in IRC, forums, and newsgroups, and lurked
on 4chan, Reddit, and Slashdot for years. Online culture has actually been
evolving to a point, especially among the younger crowd, where people get
bent out of shape over essentially nothing. Maybe that is because more of
our lives are online than before, but if you were to browse around Twitter
and Tumblr, you'd find a huge mess of people constantly being offended and
wanting an apology. I'm not defending the Redditors who legitimately
harassed Ellen Pao.

However, modern media (like Gawker, Vox, Salon, Slate, etc) can somehow turn 3
tweets from anonymous people with no followers into a national shit storm.
These media outlets feed the trolls, because it gets them clicks. So, it is
my opinion that the "trolls" aren't "winning" the battle; they just haven't
left, and never will. The news outlets are click junkies and know they can
foment faux rage by enticing people whom we would have just ignored 10+
years ago. Of course the trolls look large in number if you're putting them
under the microscope. The thing people who have little online experience miss
is that they are making things worse by giving people attention and trying to
"silence" them.

~~~
fixermark
Does "Just don't feed the trolls" continue to work in a world where trolls
know that SWATting someone is as easy as picking up a phone, and doxxing
someone can have significant long-term consequences?

~~~
msandford
Should we repeal the first amendment because people sometimes used racial or
ethnic slurs? Or they deny the moon landing? Or say things that your preferred
political party happens to disagree with? Or maybe only things that YOU
disagree with?

~~~
fixermark
Nobody is recommending a repeal of the First Amendment. At best, the
conversation is about whether orgs like Reddit have any obligation (moral,
business, or otherwise) to allow people screaming racial or ethnic slurs to
stand on (metaphorically speaking) their property and use their megaphones.

Usenet still exists, and nobody is stopping people from using it. It's just
not where trolls go to be heard anymore, because nobody is there to hear them.
;)

~~~
pdeuchler
"Nobody wants to take away your right to free speech, we're just trying to
create social norms that discourage, or hinder, the freedom to express views
we don't like. Ignore the fact that social norms inevitably become codified in
law."

~~~
babygoat
Reddit does not owe you free speech. You can say whatever you want, and they
can choose not to broadcast it to millions of people.

~~~
Lawtonfogle
Or does it? No company owes you freedom of speech just for existing, but in
its earlier promises Reddit did pledge to be a bastion of free speech. It
wasn't an explicit contract that they could be taken to court over, but do
we really want to say that, unless there is an explicit contract, there can
be no agreements?

~~~
DanBC
Can't they change their mind?

~~~
Lawtonfogle
Can you change your mind on a verbal agreement? That a court is unlikely to
find sufficient evidence to enforce it doesn't make changing your mind
right or ethical once you've gained a benefit from the agreement.

If I get the community to help me build an orphanage, it is wrong for me to
decide I'd rather do something else with the building once it's built.

~~~
ectoplasm
I believe you're conflating laws, which are objective and shared, with
morals/ethics, which are subjective and personal. Something legal for both
of us can be unethical to you while being ethical to me, and the same goes
for something illegal for both of us.

~~~
boomshucka
I believe you are confused about the law. Laws are not objective. There are
many, many cases brought before the courts where the facts are not in
dispute, yet the legal outcome is still uncertain.

~~~
ectoplasm
Can you clarify?

~~~
boomshucka
Which part isn't clear?

~~~
ectoplasm
All of the parts are unclear. I'm confused because the objectivity of a law
depends on the facts in court cases being in dispute?

When I use the word subjective, it means that an individual personally gets to
decide the truth of something. You can't make something illegal just by
changing your mind, but you can make it immoral. We probably have different
understandings of subjectivity and objectivity.

~~~
fixermark
Inconveniently, juries (and judges in non-criminal cases) do, in fact, make
something legal or illegal by changing their minds. It is the function of a
jury to decide questions of fact, because determining absolute objective truth
in a courtroom scenario (from a scientific sense of the word) is often
functionally impossible. So you're left with the situation where the plaintiff
and defendant provide evidence that their respective versions of the truth are
the objective reality, and the jury's subjective opinion of their case
determines what the law agrees upon as objective.

This interface between objective and subjective is the "magic" of the legal
system. If you can argue in a court of law, successfully enough to convince a
jury (or judge, depending on the criminal / civil nature of the case), that,
say for instance, a corporation is a person in the same sense as an individual
human being, then as far as the law is concerned it is now an objective truth
that corporations are people.

~~~
boomshucka
I'm not even talking about deciding questions of fact. If you take a set of
facts as a given, it's still often not clear if something is legal or not.
Legislation is often written in broad strokes and it is up to the courts to
fill in the details. There are always new cases coming up that don't fit the
same fact pattern as existing cases and hence no one is quite sure what the
law says on such matter.

------
lhnz
Does anybody remember when trolling was a term for an activity in which
users with enough understanding of others' perspectives would attempt to
induce cognitive dissonance in them?

It seems that one of the most powerful online political classes has lumped
hatred and abuse in with disagreement, mockery and satire.

I don't think that's good for the conversation between the users who
dislike where Reddit and other big social media sites are headed and the
people who are taking them there.

~~~
ewzimm
I miss those days when I was proud to be a troll, when it meant tricking
people into thinking in ways that they normally wouldn't. People used to love
trolls! Of course, it didn't just mean that. There were also troll raids when
friends would get together and make silly comments on every thread they could
find, crashing the party but having fun with it. The incendiary ones used to
be called flamers, but I think that term is too trollish to use against trolls
these days.

~~~
imron
> People used to love trolls!

No they didn't.

> crashing the party but having fun with it.

I'm sure you and your friends were having fun with it. The people whose
parties you crashed? Probably not so much.

~~~
digi_owl
This is starting to look like a debate similar to "where does tagging stop and
graffiti begin?".

------
itg
From what I've seen on Reddit, Twitter, and sometimes even on HN, some are
quick to label people whose opinions differ from theirs as trolls. It
doesn't matter if it is a polite or well-thought-out comment; they are
quick to shout "harassment" or whatever -ist lets them hold some moral high
ground.

~~~
DanBC
Pao had thousands of people calling her a ching-chong cunt. Why do you think
your comment is relevant to this thread?

~~~
josefresco
"Pao had thousands of people calling her a ching-chong cunt."

Is this a known/established fact? I see in the article a paragraph about how
she received thousands of positive notes/emails but nothing about the # or
volume of hate. It may not be documented here, but I'm curious if it was
actually as widespread as implied.

~~~
kyoji
The front page of Reddit was littered with derogatory Pao posts for days on
end following the shutdown of /r/fph and the firing of Victoria. I'm sure
she also received a large number of private hate messages.

~~~
josefresco
I saw the front page, and many (most?) were not of the "ching-chong cunt"
variety. I witnessed a lot of misplaced hate (which we now know was
"misplaced" only after the fact), but there's a difference between
"flaming" someone and using racist/bigoted language.

------
hellbanTHIS
[http://www.cs.cornell.edu/~cristian/Antisocial_Behavior_file...](http://www.cs.cornell.edu/~cristian/Antisocial_Behavior_files/antisocial_behavior_trolls.pdf)

tl;dr: heavy-handed moderation can actually exacerbate trolling on sites
like Reddit.

>taking extreme action against small infractions can exacerbate antisocial
behavior (e.g., unfairness can cause users to write worse). Though average
classifier precision is relatively high (0.80), one in five users identified
as antisocial are nonetheless misclassified. Whereas trading off overall
performance for higher precision and having a human moderator approve any
bans is one way to avoid incorrectly blocking innocent users, a better
response may instead involve giving antisocial users a chance to redeem
themselves.

The whole thing with Pao got out of control because somebody was routinely
deleting comments and shadowbanning people who mentioned the trouble her
husband was in. Whether it was her, another admin or a moderator doing the
banning, _that_ was what fueled the fire.

Moderation on Reddit is insane. A single basement dweller with sociopathic
tendencies (99% of forum moderators everywhere) can control the
conversation totally; they constantly ban discussion they don't agree with,
and there is absolutely no transparency. It happens to be a wonderful way
of incubating hate groups by eliminating cross-pollination of ideas, but it
also tends to piss people off.

------
msandford
The trolls won the war for usenet too. It didn't destroy the internet. Just
because trolls are making reddit unpleasant doesn't mean that they will
destroy the internet. reddit isn't the whole internet, not even close!

Furthermore, supposing that Reddit users revolting against Reddit
management's clampdown constitutes the "largest trolling attack in history"
is a little disingenuous. What about all the people who got internet famous for
dumb stuff like the mustard guy or folks that got pranked by Sacha Baron Cohen
or the jackass guys? Those were arguably trolling, and they got made into
major motion pictures that got commercials and put into theaters!

Reddit trolls just post nasty comments on reddit, a place that probably half
of America barely knows about and definitely doesn't care about.

I'm not saying that the nasty stuff that happened to her didn't happen, or
that it's somehow not a big deal. It obviously is a big deal to her, and a
great many other people as well.

But that doesn't make it the "largest trolling attack in history."

~~~
smacktoward
_> The trolls won the war for usenet too. It didn't destroy the internet._

It sure as hell destroyed Usenet, though. I remember, because I was there when
it happened.

At least when we lost Usenet there was somewhere else for all those
discussions to migrate to: the Web. If they destroy the Web, where are the
discussions supposed to go then?

~~~
msandford
Please remember that just because someone says a thing, that does not make it
true.

Reddit is not the WHOLE internet. Just because trolls are causing havoc at
reddit doesn't mean that they're winning everywhere.

Further, any one website isn't the whole internet. Usenet was a unified
system; the myriad websites that people troll on right now are not.
Facebook is different from Reddit, which is different from Slashdot, which
is different from HN, and so on.

------
joelrunyon
> The Trolls Are Winning the Battle for the Internet

Does anyone honestly believe this? Even Ellen in the op-ed says

> As the trolls on Reddit grew louder and more harassing in recent weeks,
> another group of users became more vocal. First a few sent positive
> messages. Then a few more. Soon, I was receiving hundreds of messages a day,
> and at one point thousands. These messages were thoughtful, well-written and
> heartfelt, in stark contrast to the trolling messages, which were usually
> made up of little more than four-letter words. Many shared their own stories
> of harassment and thanked us for our stance.

I think most people see trolls as just trolls. People are going to be jerks
sometimes - whether under their real name (facebook / youtube) or anonymously.

~~~
fixermark
The major issue is the notion that "jerks sometimes" is evolving into "jerks
all the time."

I'm not yet sure how much of this stems from increased focus on the problem
and how much is an actual increase, but it's a problem nonetheless. Why do we
tolerate as much trolling as we do? Why do we continue to see trolls as just
trolls when we have evidence of people being bullied to suicide via the
signal-magnification effect of online interaction?

~~~
joelrunyon
> The major issue is the notion that "jerks sometimes" is evolving into "jerks
> all the time."

I would doubt that. I think it's an outgrowth of more content being created
overall (so if you go looking for trolls, you'll find more trolling as
well).

It's just like how the news is claiming things are always getting worse.
They're not. Things are better than ever, we're just getting much better at
hearing about the outlier cases of when bad things do happen.

------
wxs
I just posted this in the earlier submission[1], but I'll put it here too:

I really worry that if we don't find a way to fix the abuse problem,
pseudonymity online will be looked back on by future generations as our time's
Klan hood.

Abuse stories like these make me embarrassed to support any form of privacy
online. Platforms need to take a stand and find a way to reduce hate and
abuse, or public opinion (and my own) will turn against support of privacy. If it
turns out pseudonymity is fundamentally incompatible with safety of the abused
online then I'll take the latter, but I'd rather search for a way to have
both.

Platforms banning abusive subcommunities seems like a good start.

[1]
[https://news.ycombinator.com/item?id=9898366](https://news.ycombinator.com/item?id=9898366)

~~~
diamonis
Why are you posting anonymously? [EDIT: no, I don't go around clicking on
profile links so some SJWs can accuse me of stalking.]

~~~
dsp1234
Clicking on the profile link:

"Co-founder and CTO at Whirlscape, makers of the Minuum keyboard
[http://minuum.com](http://minuum.com), xavier@whirlscape.com"

I'm not sure that's really "posting anonymously"

------
songshu
Misquoting Bart Simpson "If you never experience the abuse, you'll never get
desensitized to it", and with that in mind, and talking here only about
written abuse, these are my favorite three outcomes for, let's say, my
daughter:

1) No one abuses her online.

2) People abuse her; she dismisses it as noise.

3) People abuse her, and she reacts and attempts to control it.

I think we should be aiming for number 2.

~~~
jon-wood
Why do you believe it's better for your daughter to receive abuse and dismiss
it than not to receive it in the first place? If we're aiming for ideals,
let's go for the ideal of a world in which people aren't subjected to so much
abuse that they just filter it out as background noise, because that sounds
like a shit future.

~~~
boomshucka
It can't be a great idea to aim for the "ideal" situation if it's
impossible to reach. So much wasted time. Surely it's better to aim for
something one can control and attain, even if it isn't easy.

------
DanBlake
The situation at reddit is appalling and the death threats and egregious
harassment against Pao were wrong, and likely illegal in some cases.

That said, in my opinion she is far from the champion this battle needs. I
believe the most common view of Pao is not too savory; between her
husband's Ponzi scheme, the details that came out of the KPCB trial, and
the plethora of other poor decisions attributed to her, it is easy to find
fault with her character.

~~~
x0x0
shorter: Pao is subject to appalling abuse but she brought it on herself
because of her supposed character faults and the things her husband did.

I think that nicely proves her point. Even if you don't care for her
decisions, she _still_ shouldn't be treated the way she is.

------
dimino
Troll used to mean (and I think still does) a person who voices an opinion
they don't hold, with the intent of provoking a negative reaction from
someone else. Most of the folks spewing bile at Ellen Pao probably don't
actually hate her; they just can't resist watching their actions have an
effect.

Trolls don't win if you ignore them -- the shitshow that's been Reddit this
past month is almost entirely because Ellen and co. can't/won't stop reacting
to short-sighted outcries from Reddit's userbase.

Yishan, Ellen, Alexis, Steve, even Sam, and everyone else involved needs to
shut the fuck up (publicly) about Reddit for a few weeks. This whole "airing
of grievances" is actually what's killing Reddit, not the trolls.

What I want to see is legal action taken against Yishan for fueling this well
beyond its natural life. He _had_ to violate an NDA of some kind with all the
extra fuel he's been throwing onto this...

~~~
danielweber
[http://www.flamewarriorsguide.com/warriorshtm/troller.htm](http://www.flamewarriorsguide.com/warriorshtm/troller.htm)

It referred to trolling the way fishermen troll for fish.

It didn't really have anything to do with the "ogre" definition of troll, but
that's what it means in most people's minds now.

~~~
dragonwriter
That's kind of what happens when you noun a verb that is already a noun with a
very different meaning.

------
eonw
I just think it's dumb how people want to "change" the internet, or
anything for that matter. It's not a tamable creature; it's anarchy and
chaos and humanity in all of its glory, good or bad.

It bothers me when people say we need to take something off the internet...
no really, we don't. It was put there for a reason, no matter how misguided
that reason may be. Hate speech, no matter how distasteful, is still free
speech. Don't want to see it on Reddit? Don't look for it! I mean, quite
frankly, if you are fat and self-conscious, going to a "fat haters" forum
is probably going to end poorly for you.

As for her statistic about 40% of internet users saying they have been
bullied... welcome to the real world, lady! Run that same survey in a
random high school and I'd bet it's double that percentage... people are
mean. Why should the internet not reflect real life? Art imitates life.

It also bothers me when people say we need to change hiring practices and
other such nonsense. Many things self-regulate; leave them alone.

------
omarforgotpwd
"I have just endured one of the largest trolling attacks in history" I don't
know enough about what Ellen went through to comment on it, but I find this
comment very funny, and it made it hard to take the rest of the article
seriously.

------
dade_
Rule #1: Never feed the trolls. She has just given them a feast. I miss
FidoNet, it used to be so easy to shut down abuse and the conversations were
so much more thoughtful. I wonder if we can ever get that back.

------
wf
Reading all of this has given me the beginnings of an idea. Although I'm
not a machine learning/NLP expert, the idea is obviously non-trivial in the
extreme (maybe impossible, currently): what if there were a forum similar
in style to Reddit with a form of auto-moderation that identified logical
fallacies and other irrelevant information, then highlighted them within
arguments?

For example, given an ad hominem attack like "Fuck you Ellen Pao," the
auto-moderator would allow the post but highlight the text in red with a
small bubble identifying it as ad hominem, and users could react
appropriately.

Or maybe you have a long, well-thought-out argument that contains a straw
man. The auto-mod could highlight the straw man to point out that it
exists, and maybe even format the post to show which parts of the position
rest on it.

Maybe this wouldn't have to be actively presented, but could just be a tool
that comments were run through so users could be warned prior to
submission. Obviously the complications would be ridiculous, and you could
never stop training the algorithm that processes the text. But the
implications of having such a thing seem pretty incredible.
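The idea above can be sketched as a toy: flag suspected fallacies and label
them rather than removing the post. This is my own illustration, not wf's
design; the phrase lists, labels, and function name are invented, and a
real system would need a trained NLP classifier rather than regexes.

```python
import re

# Hypothetical, hand-written phrase lists standing in for a real model.
FALLACY_PATTERNS = {
    "ad hominem": [r"\bfuck you\b", r"\byou idiot\b"],
    "straw man": [r"\bso what you're saying is\b"],
}

def annotate(comment):
    """Return (matched_text, fallacy_label) pairs found in the comment.

    The post itself is never blocked; the forum UI would wrap each hit
    in a red highlight with a bubble naming the suspected fallacy,
    leaving the judgment to readers.
    """
    hits = []
    for label, patterns in FALLACY_PATTERNS.items():
        for pat in patterns:
            for m in re.finditer(pat, comment, re.IGNORECASE):
                hits.append((m.group(0), label))
    return hits
```

For instance, `annotate("Fuck you Ellen Pao.")` would flag the opening
phrase as ad hominem while leaving the rest of the post untouched.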

------
georgemonck
There are a few underlying issues here.

There is a lot of denial about the nature of the intellectual bell curve. The
majority of people are not that smart and cannot make thoughtful contributions
to most discussions. The majority of people will at least some of the time
lose decorum and make vicious personal attacks when they perceive themselves
as being threatened or as being wronged. Upvoting and downvoting is an action,
and action requires motivation. Thus, unless an audience is especially smart
and valuing of rationality, comments are most likely to get upvotes when they
provoke an emotional reaction.

To have a high quality discussion board you must have a critical mass of
reasonable, smart people. You must avoid letting in the hoi polloi. And you
must quickly ban users who make dumb comments or violate decorum.

However, most internet business models are based on advertising. Advertising
revenues are based on eyeballs. And a normal, stupid person's eyeballs are
worth just as much as a smart person's (perhaps more; the smart person
probably has adblock installed).

Thus the central conflict. Business models are based on making it as easy as
possible for normal people to get engaged in the platform. But the result of
normal people getting engaged will be vicious and unsavory content. Also,
if it is easy to create an account, it is easy to switch accounts when one
account is banned. So it becomes impossible to silence the unsavory users.

I think that ultimately Reddit will end up as sort of the Wikipedia version of
a newspaper. All pretense of being free speech or democratic will be dropped
in the major subreddits. Rather it will be a media site on a variety of issues
that is mostly top-down, and controlled by the actions of entrenched,
established users and moderators.

------
MrDosu
People like to gossip and trashtalk. We can act like it is not human nature
and philosophize about ideal conversation, but one look at the afternoon TV
programme will quickly convince you otherwise.

Imo the way forward is not demonizing either social group, but to give each
their own forum for expression. The word 'troll' is an extremely slippery
slope...

------
ForHackernews
She's still a millionaire executive. Obviously nobody deserves the kind of
abuse she's gotten, but I think I could put up with a whole lot of nobodies
saying mean things about me on the internet in return for being in her
position.

~~~
dragonwriter
> She's still a millionaire executive.

I would presume she's still a millionaire, though I don't have access to her
current finances. Of what is she still an executive?

~~~
ForHackernews
Fine, she's technically between positions. You know as well as I do that she
will shortly be made an executive or a board member of some other firm(s). She
is part of the executive class.

------
klunger
I am afraid that she is right. The few times I have tried dipping my toes into
Reddit, I have been attacked by users so filled with bile they seemingly could
not communicate without spewing it up.

It left me flabbergasted by how little it took to provoke them, and by how
they somehow thought their behavior was appropriate or justified.

These scattered but blessedly brief experiences have been so horrible that
I have completely given up on it. I wish it were not the case -- Aaron
Swartz is one of my heroes -- but I am afraid this particular legacy of his
has simply spoiled.

------
krstck
I'm not sure that the solution is to completely prevent trolling rather than
to inoculate ourselves against it.

Part of 'trolling' (in the sense being talked about here) is knowing what to
use against the victim for maximum shock value or impact. I don't think Pao
was hated _because_ she was a woman or asian. I think that she was hated, and
therefore they found an easy way to disparage her by throwing slurs about her
gender and race. It's shocking to see such words and that's precisely why
people use them.

I don't think the world would be a better place if we cloned it as-is and
scrubbed every negative reference to race, gender, or any other protected
characteristic. I think people would be just as nasty; they'd simply use
different words to strike at different vulnerabilities. The problem isn't
"racism" or "sexism" -- it's people being plain, old-fashioned _mean_.

------
ElComradio
Unfortunately we can't trust the intentions of this piece. She might not
actually care about trolls at all, and might just brush them off like I do.
It's an image piece, designed to fulfill specific purposes, and I doubt the
purpose is merely to spread the message of the positive potential of
humanity on the Internet.

------
angersock
Ugh.

 _But that balancing act is getting harder. The trolls are winning._

The success of the original balancing act was that there wasn't one--it was an
open platform, it was a loose federation, and it was decentralized. By
creating these massive walled gardens and trying to monetize them, companies
like Reddit have suddenly had to take a stand.

Small surprise that they're running into problems that didn't exist before
they decided to create these massive platforms. If you democratize access
to mass point-to-point communications, guess what? By definition, half of
your users are of below-average empathy and responsibility.

 _But to attract more mainstream audiences and bring in the big-budget
advertisers, you must hide or remove the ugly._

Gentrification of the Internet.

 _As the threats became really violent, people ended their messages with “stay
safe.”_

Threats can be graphic, but not violent--if they're violent, they're no
longer threats.

------
jmount
Definitely an interesting read. Mostly I am with Pao. But I do feel Reddit has
been in the business of supplying a troll friendly environment that is just
interesting/safe enough to monetize for quite some time. So, in my opinion,
nobody truly "anti-troll" would have stepped into the situation.

------
upofadown
Trolling is not a new thing. As long as hateful and/or sociopathic and/or
sadistic people exist there will be trolls and trolling. You can only manage
trolling, you can't eliminate it. That's the mistake Reddit made, they thought
they could eliminate it, or at least hide it, and that isn't something that
anyone knows how to do. In their efforts they made some basic mistakes.

They removed areas where the trolls had self-segregated (e.g., fat-shaming),
forcing them back into the rest of the community. They failed to properly
ignore the trolls and reacted to them, even to the extent of claiming that
the resignation of a member of their executive team had something to do
with the trolls' actions.

I predict that the Reddit troll problem is going to get much worse before it
gets better.

------
anthony_romeo
Many of the responses I read here are infuriating because they deliberately
ignore the issue of harassment by claiming it's a free speech issue. In the
US you do not have 100% freedom of speech. If you deliberately incite violence
against another person through speech, you can be charged with a crime. You
can face a lawsuit for assault without touching the victim. You can be charged
for, say, harassing others in the workplace. You certainly have the right to
speak your mind, but if you think you have the right to deliberately cause
severe mental distress, not through mere discussion of your points, but
through repetitive and abusive language and getting your cohorts to join you
in the assault, you are absolutely mistaken. Even if we were to assume some
enshrined right to harass individuals, perhaps it's not the best way to go
about a disagreement.

Further, people are choosing to get angry over the false binary of freedom vs
harassment, and choosing to ignore the third option of choosing empathy, tact,
or clear and fair discussion over slinging barbs at others.

I'm reading comments along the lines of 'well, it couldn't have been _that_
bad' or 'I don't know enough about the extent of the harassment, but I'm going
to comment about semantics or the 100 percent accuracy [as opposed to, say, 90
percent] of her claims anyway!' Your comments are pointless and dance around
the actual issue of abuse. You fail to feel any sort of empathy because you
yourself are not the victim.

I hear people making fun of the concept of 'microaggression' because they fail
to adequately understand its meaning. The point is that it's really easy to
throw a one-line insult to someone, but in aggregate it can cause great
distress, because the throwaway comments of a thousand people can feel like
one very persistent stalker. People dismiss this by saying 'oh it's normal'
and recommend they just ignore it. Frankly, the point is that it should not be
normal, and some people have difficulty ignoring it.

And I know some will deliberately choose to misconstrue what I'm saying here.
No, I'm not saying that we should censor the Internet and ban all
controversial discussion. I would just hope that more people become aware that
it is a real issue that people face, and that commenters should be aware of
their own actions and reevaluate how they choose to respond to conflicts.

------
rabbyte
> Reddit is the Internet.

That's an insane thing to say. I know she's the former CEO but anybody saying
this is losing perspective. There are countless communities not on Reddit and
they can't be represented by Reddit. I know it's probably a conscious
exaggeration but it's getting hard to tell, people take this forum shit so
seriously.

------
balls187
Given her past, Ellen Pao is not very credible. Her opinion seems to be
part of her self-serving rhetoric, now focused on presenting herself as a
martyr.

Incivility has always been a problem in human society--it's magnified by the
internet, because of the lack of consequences.

------
rbanffy
The Internet?

They are gaining ground everywhere. Winning the argument is more important
than being right.

------
serve_yay
What I really care about: Who gets to decide who is a troll and who isn't?

~~~
pekk
People have been trying to automate moderation and make it perfectly fair for
a long time now. It's basically impossible. If you do it at all, you have to
use people, which makes it suck.

If you don't do any moderation, you get a sewer, and then you get shut down.
You can try to exclude only illegal stuff and stand on free speech, but in
terms of content you still have a sewer. To be fair, most of the stuff that
moves through sewers is not deadly poison, just crap and water. Some people
like sewers, so sewer cities continue to be built for people to live in, but
most people find them unattractive, so they never take over the world.

To make something more attractive out of all the crap people emit, you need a
modicum of moderation, which always comes from other people. Which means it is
essentially unfair, capricious, and creates incentives to endless mediocrity
and pandering. But even if we accept that rather than going back to the
sewers, one man's perfect moderator is another man's little Hitler. So maybe
we should just crowdsource moderation! Brilliant! But "democratic" crowd
moderation is essentially a way of diffusing responsibility for censorship.
The censorship of _whoever_ happens to brigade your posts (maybe with
sockpuppets, or external links) is not any more free, or pleasant, than any
other censorship. Meanwhile, upvoting and downvoting takes on a massive life
of its own and creates arenas for endless meta-gaming like subcommunities
trying to push each other out. This takes over the site. If you manage the
site, have fun managing all that effectively, and try not to get sucked in.

If what you want is fairness, all you can do is publish explicit policy (NOT
legal ToS) so people know and agree to what they're getting into - then
enforce that policy universally to the letter. That will be fair only in the
limited sense that everyone agrees to it the same, like the rules of chess.
But human-based moderation will never be fair.

~~~
serve_yay
That's not really what I mean. I have no problem with, say, reddit deciding
that a "fat people hate" forum is not fit for its site. It's their site, they
get to decide that.

Rather, what I am concerned about is the broader conversation which takes
place in articles like TFA, by prominent people such as Pao, and by
journalists. This broader conversation uses the word "troll" - but who gets
to decide which opinions are in that bucket, and therefore not acceptable?

It is easy to say racists are trolls, it's uncontroversial for the most part.
But what about, say, gender-critical radical feminists? These are people who
challenge certain received beliefs around what gender means, or should mean.
They are not hateful, and they are 100% serious about making the world better
for women, yet are often accused of transphobia (and worse) for what they
believe. They get death threats (from other people who also call themselves
feminists) and the rest of it. And it would be _so easy_ to place them in the
"troll" category.

------
facepalm
Where can you actually still have an uncensored discussion on the internet?

~~~
fixermark
USENET, via email, via your own server, via a blog you host on your own
server.

But those solutions (a) take effort and (b) implicitly decrease your potential
audience because you aren't piggy-backing on some other organization's social
networking. So it's understandable why people who get their jollies from
upsetting other people would prefer an easier solution, like Reddit
letting them say whatever they choose to unhindered because "Freedom of
speech."

I'm 100% in favor of people who want to talk about topics Reddit has rejected
setting up their own competitor. Maybe it will even become as big as Reddit.
It's happened before.

------
antrover
The trolls are a small, small percentage of users but some of the loudest.

------
dominotw
Reminded me of this
[https://www.youtube.com/watch?v=Co5jTvGlt6Y](https://www.youtube.com/watch?v=Co5jTvGlt6Y)

------
slang800
> The trolls are winning the battle for the Internet

The trolls aren't "winning" because "the trolls" aren't an organization. They
don't have an ideology that they're pushing, or some end-goal that they're
hoping to achieve. They're a byproduct of free speech that we all have to deal
with.

But most importantly, they're not even able to be categorized uniformly. What
is trolling to one person may be comedy, satire, or even just _disagreement_
to another.

------
Claudus
I've been following the Ellen Pao debacle on Reddit, and her misuse of the
term "troll" in the article illustrates her ignorance of Internet and Reddit
culture that led to her stepping down as CEO.

I'm sure there were many offensive, harassing, and vitriolic comments
directed at her; however, they were a result of her incompetent actions and
failure to understand the culture that caused the problem, not due to trolls
or trolling.

------
trhway
Maybe, but it still doesn't help them win in court ...

Without any reason, just as a fishing expedition for millions of dollars, she
basically publicly labeled a bunch of men as sexist pigs. Talking about
trolling, ehh..

------
tosseraccount
Don't like trolls.

Don't visit reddit.

------
ajkjk
I wish that there was some sort of Internet-wide 'reputation system' that
would accumulate one's reputation across all websites they use - while
preserving their pseudonymity on each site.

As an analogy, Steam and Battle.net and other gaming platforms have something
going for them, maybe by accident: by using one account across lots of games,
the risks incurred by cheating are amplified. If you cheat (or harass) in one
game, your account can be banned across all games. This means that if you care
about your account on one game still existing, you're disincentivized from
cheating in other games, even if you don't care about them.

I play League of Legends regularly, and Riot Games has put in huge amounts of
work in figuring out a way to discipline abusive players that will actually
induce them to reform. They have shown that most people, when _branded_ as
abusive, will often reform their behavior immediately - as if they did not
realize (or did not internalize) what kind of person they were being. See, for
example, [http://www.polygon.com/2015/1/6/7500045/riot-games-league-
of...](http://www.polygon.com/2015/1/6/7500045/riot-games-league-of-legends-
bans-rewards-behavior) . But it's saddening that all of Riot's work in
reforming players or banning them if they can't get along with others cannot
be used to weed out these toxic players in _other_ games. I love gaming, but,
for me, the absolute worst thing about this hobby is how awful everyone is to
each other, and I would love to see a systematic solution.

I wish something like Steam's centralized identity existed across online
forums. I wish we used one account everywhere, and the use of a widely-
connected account was required to sign up to things - but the details of who
you are, and how to associate your identity on one site with your identity on
another site, is totally masked to everyone. Site owners and users cannot tell
that userA on reddit is also userA on, say, some Disqus blog. But your
_reputation_ would still be preserved across sites: if you were reported as a
spammer or a bully or a cheater on one site, it would be visible to the other
sites you were on. And if those sites wanted to ban you based on your common
reputation, they could.

This puts a large burden on the identity management platform. They have to
handle everyone's privacy and security, deal with accounts being hacked and
used for spam or abuse, provide mechanisms for people to prove their behavior
has improved and remove black marks from their records, deal with websites
that submit false abuse claims, etc. And (hopefully) they have to be resilient
to law enforcement trying to use these records to completely expose users on
every website. But at least they could do it in _one place_, instead of every
website having to solve all of these problems independently.

It seems to me that good behavior in 'tribal' human society was enforced by
having to, well, remain in one's tribe, and having to convince everyone to put
up with you if you want to survive. It also seems that being labeled, with
numbers, as a 'bully' or a 'cheater' can have a positive effect in reforming
people's behavior. We would need a centralized reputation system like what
I've described to do this.

This is on the list of "things I would like to build".

[edited for awkward phrasing]
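To make the idea concrete, here's a minimal sketch (every name here is hypothetical, not an existing service or API): a central service derives a deterministic, unlinkable per-site pseudonym from a master account using an HMAC keyed with a secret only the service knows, while abuse reports accumulate against the master account and surface to sites only as an aggregate score.

```python
import hashlib
import hmac


class ReputationService:
    """Hypothetical central identity service. Each user has one master
    account, but every site sees only a site-specific pseudonym derived
    from it, so sites cannot link a user's identities to each other."""

    def __init__(self, secret: bytes):
        self._secret = secret  # keyed-hash secret, known only to the service
        self._flags = {}       # master_id -> list of (site, reason) reports

    def pseudonym(self, master_id: str, site: str) -> str:
        # Deterministic per-site alias: the same user always gets the same
        # name on a given site, but without the secret, two aliases of the
        # same user on different sites are computationally unlinkable.
        mac = hmac.new(self._secret, f"{master_id}:{site}".encode(),
                       hashlib.sha256)
        return mac.hexdigest()[:16]

    def report(self, master_id: str, site: str, reason: str) -> None:
        # A site files an abuse report against the underlying account.
        self._flags.setdefault(master_id, []).append((site, reason))

    def reputation(self, master_id: str) -> int:
        # Sites query only an aggregate score; they never learn which
        # other sites (or which pseudonyms) the reports came from.
        return -len(self._flags.get(master_id, []))


svc = ReputationService(b"demo-secret")
reddit_alias = svc.pseudonym("alice", "reddit")
disqus_alias = svc.pseudonym("alice", "disqus")
# The two aliases differ, but a spam report on one site still lowers
# the reputation that the other site sees.
svc.report("alice", "reddit", "spam")
print(reddit_alias != disqus_alias, svc.reputation("alice"))
```

Cheating here would mean making the central service trustworthy enough to hold that linkage secret, which is exactly the burden described above.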

~~~
uptown
"A lot of times I wish that there was some sort of Internet-wide 'reputation
system' that would accumulate one's reputation across all websites they use -
while preserving their pseudonymity on each site."

There is. Unfortunately, it's the NSA that operates it.

~~~
ajkjk
Well, that doesn't accomplish any of the things I want, so I don't think it
counts for my requirements.

------
dgfv1
Maybe they shouldn't have started the war in the first place. Stop trying to
gentrify the internet.

Also, people who disagree with you == trolls? Nice.

~~~
pekk
I'm seeing a lot of this meme that the internet is gentrifying, and I think
it's worth discussing. I'm certain the internet isn't getting more gentrified
over time.

The internet was extremely gentrified when it was populated only by a
relatively tiny elite which tended toward prestigious academic affiliations
and higher status. Even the old BBS scene, though a little more populist and
accessible to people at home with phone lines and home computers, looks pretty
genteel and exclusive by modern standards.

The internet sure felt a lot more democratic when issues like DDOS, doxing,
spam, child porn distribution (just to name a few) were not that hard to
avoid, so measures against them did not feel that urgent. But that atmosphere
is _directly_ related to the selective nature of the audience. Things as small
as the introduction of AOL users to USENET caused massive upheavals back in
the day. The "barbarians" were at the gates.

It's not coincidental that Facebook built traction against Friendster, etc.
largely by restricting to people with academic affiliations, Reddit got its
start from a userbase that was more "selective," etc. But then the floodgates
are gradually opened to the "barbarians" who were very unlikely ever to have
patronized USENET, a BBS, or even an early-2000s phpBB.

In reality, that ratchet only moves forward, and the internet population gets
more and more like the general population. Pervasive commercialization is
actually part of that trend away from gentrification. It's not gated
communities, it's Wal-Mart, payday loan joints, low-end casinos and hawkers in
the streets. It's not Singapore, it's a developing country. And for better or
worse, nobody can do anything about that.

------
escobar
> I have just endured one of the largest trolling attacks in history.

I feel bad for Ellen and have nothing against her; I don't really go on reddit
much. I am aware that she was more or less a scapegoat. But still, for some
reason, AND even though the second sentence of the article almost saves face,
this statement still bothers me. Sure, she got picked on by reddit quite a bit
for lack of transparency, and got thrown under the bus by other high-ranking
officials at Reddit.

But to come out and make yourself seem like a champion still illustrates her
inflated sense of importance - she's a millionaire exec in an interim position
who made mistakes, and got called out in an online forum.

