
Reddit and the Struggle to Detoxify the Internet - smacktoward
https://www.newyorker.com/magazine/2018/03/19/reddit-and-the-quest-to-detoxify-the-internet?currentPage=all
======
gdubs
We tend to think we’re more in control of our behavior than we actually are.
That is, our brains are operating from habit a large amount of the time, and
the idea of the CEO brain consciously deciding our every action is largely an
illusion.

This is why emergency exit doors must open outwards. People have died in fires
in theaters trying to push on doors that said “Pull”. The crash bar on an
emergency exit door is what’s known in design as an affordance.

When you present people with a door handle that affords “pulling”, they will
pull even if the sign says “push”.

Now consider the fact that the primary affordance of social media is the
“reaction”. Is it a surprise that content that garners a reaction will trend
towards the outrageous?

If proactivity is the road to a more fulfilled, more civilly minded life and
society, maybe we need to rethink our affordances. Because we’ve made it
awfully easy to be reactive, and awfully cumbersome to be proactive.

Edit: The comments below correctly point out an error; what I should have said
is that there’s a danger in putting a “pull” handle on an emergency door that
pushes outward, because of the confusing affordance.

~~~
InvisibleCities
>This is why emergency exit doors must open outwards. People have died in
fires in theaters trying to push on doors that said “Pull”. The crash bar on
an emergency exit door is what’s known in design as an affordance.

Is this true? I always assumed it was because it's pretty much impossible to
pull open a door when you are being crushed by people pressing against one
another trying to escape.

~~~
KirinDave
It's the same sort of myth as "why are manhole covers round?" and there are
lots of explanations for it that ignore history in favor of a just-so story.

~~~
mulmen
I don't understand the reference. Why are manhole covers round? I always
thought it was to prevent the cover from falling into the hole. Is that not
the case?

~~~
KirinDave
That is the argument, but actually any lipped shape can give this guarantee.
Even historically, manhole and utility covers had to be somewhat flush with
the ground, and so they've always had to be lipped for this.

You can see lots of utility covers that are hinged, square, or other options,
but all are lipped for safety. While a circular profile makes the lipping
easier, it doesn't seem to influence many utility covers.

~~~
anigbrowl
You can easily put a square or a rectangular utility cover into the hole,
because the length of the side is less than the diagonal. This isn't possible
with a circular cover because the diameter is uniform.

~~~
derefr
The parent’s point is that a lip prevents this for every shape, by making the
hole smaller than the side-length.

~~~
mulmen
Only with a sufficiently large lip. The diagonal of the hole has to be less
than the shortest side of the lid. As I picture it in my mind this is an
excessively large lip.

~~~
KirinDave
If you have a very rectangular shape then the lip would have to be large.

Stop for a moment next time you walk in an urban environment. Look for the
circular utility covers as opposed to square ones. Look at the features of the
circular ones vs the non-circular ones.

One of the reasons this myth irritates me so much is that everyone is so
certain that they know exactly the answer but their actual daily experience
doesn't line up with the results at all.

I've heard several people offer explanations from a geometric safety option
(which is attractive for free-standing covers) to the simplicity of
manufacturing (e.g., that it's very easy to make circular molds and get even
density compared to square molds) to simply what the contractor suggested.
I've also heard people suggest that metal cylindrical templates were something
very common to manufacture for a variety of industrial uses.

~~~
mulmen
I'm not certain, that's why I asked the question in the first place.

I'm not sure what features I would look for because the lip on a rectangular
cover would be under the lid. How do I know that the rectangular covers have
sufficiently large lips to prevent the lid from falling in? I'm willing to
accept that in some cases the lids are rectangular and there is a risk of them
falling in.

------
floren
I can't wait for the word "toxic" to drop out of favor. It implies that the
person is not just giving a bad opinion, but is in fact fundamentally flawed
and dangerous. Arsenic isn't toxic because it had a bad day at work, arsenic
is by its nature a deadly poison and you can't change it, only avoid it. The
vast majority of people labeled "toxic", though, are just humans with a
variety of opinions and beliefs, some you may agree with and some you surely
disagree with. They may change these beliefs, but not by being vilified and
told they are intrinsically bad.

~~~
guelo
Poisoning a conversation is a thing. And if you've participated in online
discussions I don't see how you can honestly say it doesn't exist.

You're implying that in every thread where a troll shows up to derail the
conversation we all have to stop what we're doing and give thoughtful
responses to the troll to show them the error of their ways. But then the only
possible conversation is debates with trolls. But that gives too much power to
the trolls. Sometimes you just have to shut the trolls up so you can talk
about what you want to talk about.

~~~
kerkeslager
> Poisoning a conversation is a thing. And if you've participated in online
> discussions I don't see how you can honestly say it doesn't exist.

"Poisoning a conversation" is a bad metaphor that conflates two different
things which do exist, but need to be dealt with in two different ways:

1\. Baiting: trying to say something horrible to anger people for their own
entertainment. The proper response to this is simply to ignore it: if you
aren't entertaining the baiter gets bored.

2\. People saying things they actually believe, even when those things are
genuinely terrible. Responding to these people prevents their beliefs from
going unchallenged, and is the only way we can possibly hope to change those
beliefs.

If it were just group 1, you _could_ just ban those people and that would be
fine. But the problem with that is that sometimes people are actually in group
2, and engaging those people and correcting them is part of arriving at shared
values in a functioning society. The tendency to accuse people who are
genuinely expressing their (awful) opinions of simply baiting so that you can
ban them is problematic for open discussion.

> Sometimes you just have to shut the trolls up so you can talk about what you
> want to talk about.

Contrary to what you're saying, I think it's very possible to have
conversations about what you want to talk about while letting these people say
what they want: isn't this what comment trees exist for? On Reddit and HN,
person A and person B want to have a conversation, person C can say whatever
they want, and it doesn't affect the continuity of A and B's conversation as
long as A and B respond directly to each other's posts, and not to C's posts.
Every platform I know of supports private messages.

Underlying what you're saying is an assumption I'd like you to reconsider: why
is it that you think that a public conversation in a forum where anyone can
respond should only be about what you want it to be about?

~~~
projektir
It's not a bad metaphor. If your community is ignoring trolls instead of
banning them, it's effectively sending the message that the trolls are very
welcome. I've never observed that a no-moderation community is free of trolls,
and ignoring trolls often results in them just taking over because they just
start talking to each other and everyone else leaves.

These things are called poison because even a small amount can cause serious
problems if left unchecked, and the effect creeps out across an area (how many
people are hit) like poison spreading.

I think what you're missing is that a very large number of people simply do
not enjoy a certain style of discourse to the point that they'll opt out of a
community that doesn't ban that discourse. You may not like it, you may think
those people are weak or something, but the rest of us want to talk to them,
and we want them to feel welcome.

~~~
kerkeslager
> It's not a bad metaphor. If your community is ignoring trolls instead of
> banning them, it's effectively sending the message that the trolls are very
> welcome. I've never observed that a no-moderation community is free of
> trolls, and ignoring trolls often results in them just taking over because
> they just start talking to each other and everyone else leaves.

Which form of trolls from my post are you talking about? I insist that we not
pretend these are the same group of people.

I think that good moderation filters out the baiters and lets the people who
believe what they're saying stay. And contrary to what you're saying, I don't
think that such communities end up with just trolls. There were plenty of
reasonable conversations on Reddit before Conde Nast took over and dropped the
banhammer.

> I think what you're missing is that a very large number of people simply do
> not enjoy a certain style of discourse to the point that they'll opt out of
> a community that doesn't ban that discourse. You may not like it, you may
> think those people are weak or something, but the rest of us want to talk to
> them, and we want them to feel welcome.

I'm not missing that--in fact, _I_ don't enjoy talking to people with hateful
beliefs either.

But the alternative you're proposing is an echo chamber where you don't have
to hear those people, but they still believe what they believe, and those
beliefs become our leaders and laws. If we ignore bigots on the internet we
get bigots in office.

~~~
dnomad
I think you're very confused. The idea that rational people have to engage
thoughtfully with irrational bigots is pure nonsense. It is not the duty or
obligation of anybody to engage with those who hate them.

The reason we get bigots in office by the way is not because trolls are
banned. It's because powerful interests want bigots in office. Bigotry sells.
It's very easy to screw people over if you can distract them by having them
hate on some out group. It is naive to think that talking with bigots will
change this.

And here is the point: social change doesn't proceed through rational
discussion. Never has, never will. Real change requires organization and
solidarity and protesting and marching and uncompromising demands.

If you want to waste time engaging with trolls have at it. You will find that
these people have nothing but contempt for discussion and no interest in being
swayed by logic. For the rest of us we have far better things to do and
banning trolls and bigots is the obvious choice.

~~~
kerkeslager
> I think you're very confused. The idea that rational people have to engage
> thoughtfully with irrational bigots is pure nonsense. It is not the duty or
> obligation of anybody to engage with those who hate them.

"Have to" and "obligation" in a general sense are things I try to avoid
saying. They don't exist in my belief system, and I apologize if I mistakenly
said otherwise.

What I'm saying is that if we want bigots to change, we can't just expect it
to happen.

> The reason we get bigots in office by the way is not because trolls are
> banned. It's because powerful interests want bigots in office. Bigotry
> sells. It's very easy to screw people over if you can distract them by
> having them hate on some out group. It is naive to think that talking with
> bigots will change this.

I think you've confused cause and effect here. Some powerful interests
certainly see bigotry as an end goal, but I think most powerful interests who
support bigotry see it as a means to an end. As you said, bigotry is a
distraction to achieve other goals. Bigots are easily manipulated if you don't
care about bigotry: you just pretend to be a bigot and that gets you power,
and then you can do what you actually want to do. If there were not bigots to
be manipulated, powerful interests wouldn't push bigots into power.

> And here is the point: social change doesn't proceed through rational
> discussion. Never has, never will. Real change requires organization and
> solidarity and protesting and marching and uncompromising demands.

Organization and solidarity and protesting and marching aren't incompatible
with rational discussion, and in fact none of these things work if they
_aren't_ a means of putting forward a rational discussion.

Modern protest movements need to read Martin Luther King's writings and
understand what he really did. Every single protest he led was carefully
designed to make a point in the rational discussion of the time. The bigoted
viewpoints of the time: that people of color were violent, dangerous, less
intelligent, etc., were struck down one by one on public television by MLK's
protests. Bigotry is based on lies, and MLK made it impossible for people not
to see the truth. When bigots feared people of color would be violent, he
showed them people of color peacefully being beaten. When bigots feared
takeovers by blacks, he showed people of color only wanted normal things like
sitting where they wanted on the bus and drinking from the same water
fountains. He didn't simply try to talk over the people he disagreed with, he
listened to their concerns and showed their concerns to be invalid.

Harvey Milk, as far as I know, didn't write about his tactics, but they are
clear in what he did and said. When bigots saw homosexuality as a foreign,
unusual, threatening thing, he encouraged people to come out so that bigots
could see that gays were normal people all around them. When bigots saw
homosexual culture as an invasion of their neighborhood, he showed it also
brought economic benefits ("You don't mind us shopping at your liquor store."
"We both pay taxes for your child's school").

Can you explain to me how you think protests work to change policy? If all
they are is simply trying to yell your opinion louder than your opponent, why
should people in power care? If protests don't persuade anyone, what's to stop
everyone voting for the same people and getting the same bigots in power? If
our only tool is escalation, they'll just escalate back, and they can escalate
further because they have guns. :)

> If you want to waste time engaging with trolls have at it. You will find
> that these people have nothing but contempt for discussion and no interest
> in being swayed by logic. For the rest of us we have far better things to do
> and banning trolls and bigots is the obvious choice.

If by trolls you mean people who are saying inflammatory stuff to enrage
people for their own entertainment, sure, engaging with them only entertains
them.

But if you're talking about people who are just trying to live their lives and
think that bigotry is the way to do that, I very much doubt you have tried
talking to these people, because this has not been my experience at all. If
you approach talking with someone about their bigotry as if they were a human,
with compassion, and address the actual fears and hang-ups that cause them to
be bigots in the first place, people do change. It doesn't always happen
quickly or at all, but sometimes it does. And more importantly, I've _never_
seen it happen any other way.

------
VLM
For, uh, purely scientific purposes I've noticed the posting volume on pr0n
subreddits like /r/gonewild and uh, a few dozen similar subreddits, is perhaps
100 to 1000 times the sheer posting volume of a controversial subreddit like
/r/the_donald. The sheer volume of relatively R rated pr0n is perhaps 1000
times the volume of everything else.

There seems to be tap dancing around the issue that reddit is a 1960s Playboy
magazine fifty years in the future. There's just enough excellent articles to
keep the advertisers amused, but the fact has to be faced that 99% of the
traffic is young men looking at scantily dressed young women. You need a
little submarine PR stirring controversy about subreddits that statistically
nobody reads, to keep things looking legit, whereas all the traffic and money
is over there at /r/randomsexiness.

Just like the old saying about Playboy, I only read it for r/ama not the
nekkid ladies.

I'm not complaining; I'm just pointing out that reddit is THE most successful
pr0n site out there with the most brilliant strategy I've ever seen. Please
don't confuse it, and its achievements within the pr0n industry, with legacy
news media or anything like that.

~~~
icanhackit
> _reddit is a 1960s Playboy magazine fifty years in the future [...] Just
> like the old saying about Playboy, I only read it for r/ama not the nekkid
> ladies._

That's a thought-provoking way of putting it, though I'd add that the
/r/gonewild subreddit and its variants make it considerably more meta. A lot
of people, particularly females, are posting explicit imagery of themselves,
revealing a hidden culture of exhibitionism, as the vast majority of them do
it for little
profit beyond comments and upvotes. There are people on Reddit who do actually
make money from posting explicit content, but the vast majority of users who
do it are in it for a sense of self-worth from the "updoots" and the thrill of
exposing themselves relatively anonymously.

Taking your observations about advertising into account, it's a complex
ecosystem. The closest thing I've seen to Reddit is Usenet, but in an age where
digital cameras are ubiquitous.

~~~
Trundle
Have any data on it mostly being girls doing it for the thrill? It seems like
every time I like a photo and look at the poster's profile, they've got something
else going on where the free photo was just marketing. Tons of underwear
selling.

~~~
icanhackit
> _It seems like every time I like a photo and look at the poster's profile_

No offence intended but could it be that well-proportioned, model-like
subjects are more likely to have their profile viewed, so it's a case of
sample bias?

While no, I don't have any hard data, so my points are anecdotal, I tend to be
uninterested in subjects that are more model-like, as the point for me is to
look at everyday people getting their kit off. And from what I can tell those
everyday people make up the lion's share of self-posted content.

------
tabeth
So we've tried:

\- Real identities (Facebook comments)

\- Voting and self moderation (Reddit, HN, etc.)

\- Strong moderation (Reddit, HN)

They all result in toxic comments, trolling, an echo chamber, or worse, a
complete lack of participation. There's no real solution to this problem.
However, if you create consequences or a _cost_ to commenting you'd eliminate
toxic comments and trolling at the cost of less participation and an echo
chamber (though, you could argue that not all participation is equal and
you're mainly removing poor participants).

There's no perfect way to do this because even if you made a subscription
necessary, for instance, you may just create an echo chamber. As part of the
solution you'd need to prevent the creation of new accounts to circumvent any
punishment received.

I'd say the most straightforward solution is that you have a forum and you get
an account. Physical mail is sent to your house in order to get a _single_
account. Then, regular moderation practices would be taken seriously as
there's no way to create another. The community would be left with those who
care enough to not be banned. The problem is that the moderators themselves
may be corrupt or wrong.

Thoughts?

~~~
keepper
We haven't really tried "real identities". We've come close.

Real identities would require verification, which sites like FB only do after
the fact.

I fear that a huge repercussion of the election issues is that we will get
there. A real ID may be required for you to post comments on all websites. And
I'm not sure how I feel about that. The reality is that it would drastically
reduce comment trolling if your real identity was viewable and searchable by
all you know. But at what cost?

I really do wonder what the root of trolling is. What makes a "normal" person
take on an alter-ego/view/counter-view, solely based on a lack of face-to-face
interaction...

~~~
gknoy
> A real ID may be required for you to post comments on all websites. [I]t
> would drastically reduce comment trolling, if your real identity was
> viewable and searchable by all you know. But at what cost?

The cost is that it would prevent people from anonymously reporting abuses,
which means that fear of retaliation will have a chilling effect. We've
already seen this where people get death threats, houses burned down, etc,
when they do things like report sexual assault.

~~~
chillwaves
People should be allowed to be anonymous but only explicitly so. The problem
now is we have people lying about who they are and using multiple accounts and
other forum moderation abuse to push viewpoints, often paid to do so.

------
siegecraft
This problem is easy to solve: give users the tools to do their own filtering.
But services like to pretend that it's a hard problem because they don't want
to actually cede that control to their users. The whole point of controlling
the communication networks is to control what and how people say and hear.

~~~
prepend
It’s surprising how much social media bullshits on this issue. RSS had the
standard and solution a decade ago.

Being able to filter even by a simple regex would be so much better than
Facebook/Google’s “algorithm.” But it would also allow us to filter out ads.

It would also allow for community-driven white/black lists.

But again, wouldn’t allow Coke and HBO to make you look at stuff.
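A client-side version of this takes very little machinery. Here's a minimal sketch, where the rules, names, and class are all hypothetical, showing a regex blacklist combined with an author whitelist:

```python
import re

# Hypothetical client-side comment filter: hide anything matching a blacklist
# regex unless the author is on a whitelist. Rules and names are illustrative.
class CommentFilter:
    def __init__(self, blacklist_patterns, author_whitelist=()):
        self.rules = [re.compile(p, re.IGNORECASE) for p in blacklist_patterns]
        self.author_whitelist = set(author_whitelist)

    def visible(self, author, text):
        if author in self.author_whitelist:
            return True  # whitelisted authors bypass all rules
        return not any(rule.search(text) for rule in self.rules)

f = CommentFilter([r"\bbuy now\b", r"one weird trick"],
                  author_whitelist={"alice"})
assert f.visible("bob", "Interesting point about moderation")
assert not f.visible("bob", "One weird trick to get karma")
assert f.visible("alice", "Buy now!")
```

Community-driven white/black lists would then just be shared pattern files that users opt into.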

~~~
gowld
And Usenet and email had the solution a decade or two before RSS. The march of
progress is in dumbing things down so the average person doesn't have to
think, and the intelligent person isn't permitted to think.

~~~
ep103
Harrison Bergeron

------
switch007
Just the past week:

\- New Yorker publishes this article

\- Mayor of London at SXSW called for tighter tech regulation.

\- UK Culture Secretary declares social media "broken", states he wants to
time-limit children's access to social media and "impose stricter checks on
the age limits when children can first set up a social media profile"

\- Tim Berners-Lee said "we must regulate tech firms to prevent 'weaponised'
web"

\- The Economist prints an article "On Twitter, falsehood spreads faster than
truth" (reporting on a recent study by MIT’s Laboratory for Social Machines)

\- A report is delivered to European Commission: "Final report of the High
Level Expert Group on Fake News and Online Disinformation"

\- The House of Lords International Relations Committee concluded tech firms
were negatively affecting our society

Blimey...

~~~
mattdeboard
Yep, I agree. Politicians & their PR firms are getting the beehive stirred
up, swaying public opinion. It's the governmental version of
[http://www.paulgraham.com/submarine.html](http://www.paulgraham.com/submarine.html)

------
minimaxir
It's worth noting that instead of community management, Reddit has been soft-
pivoting to become a new Facebook with features like group chat, new profile
designs, and a News Feed-esque view on the official mobile apps.

Incidentally, Facebook itself hasn't solved the problem of its community/fake
news, hence the recent algorithmic changes to the News Feed to surface more
content from friends, and a public push toward Facebook Groups. To be like
Reddit.

It's a never-ending cycle.

~~~
ben_jones
They couldn't make enough money with their current monetization model so
they're copying the arguably most successful player in their market (FB).
It'll be interesting to see how it plays out given reddit's "anonymity", its
lack of real social connections (friends, family, that cute person who sits
next to you in econ), and its lack of any kind of networked apps (whatsapp,
instagram, messenger, ...).

------
austincheney
Bah, I recently deleted my 5 year account with nearly 4000 comment karma on
Reddit.

Reddit is toxic. Saying something benign but mildly disagreeable caused some
people to get emotional, and I do mean exceedingly emotional like you just
crapped in their car and had sex with their girlfriend. I am not talking about
politics, but subreddits like _javascript_ and _programming_.

Worse than being toxic, Reddit is an echo chamber. Many users treat vote
counts not merely as validation but as righteousness or correctness, which
conversely means opposing votes are not a disagreement but a vilification.

I just found myself becoming progressively more depressed contributing to
conversations there. In the end it seemed to almost never be about the
conversation but about being RIGHT + 1!!!

Since deleting my account a month ago I am a much happier person.

Hacker News has voting as well, but with an incredibly important distinction.
The votes are hidden from all but the respective contributors, preventing
votes from ever being used as any kind of vindication by ignorant people.

~~~
zaarn
There are definitely reddit communities where this isn't true, though these are
largely small community subreddits that focus on some particular topic.

~~~
ohtwenty
This is often mentioned but the examples for it keep changing: any of the
small, nice communities that grows too large will become filled with noise and
turn unpleasant. It's the reason I've been moving to lower-volume,
higher-intimacy websites recently, hoping to filter out noise. Unfortunately
it's not really easy.

------
digitalantfarm
Reddit used to be a place where you could post links to anything on the
internet (kind of like this place) and comment on it (without signing up with
your email). Now it's a place more like Facebook with algorithms based on your
personal taste, geolocation, forced email, shadow banning, banning of
politically incorrect subreddits, endless political posts and "Futurology" and
"Uplifting" news articles. Free speech on reddit died a long time ago.

~~~
gowld
Reddit has a common frontpage (two, actually: /popular and /all) and user-
customizable list of subreddits.

> algorithms based on your personal taste,

only for recommending subs to try

> geolocation

only for recommending subs to try

> forced email

no it does not

> shadow banning

extremely rare, and per-sub, not reddit-wide

> banning of politically incorrect subreddits

extremely rare; most of why Reddit is publicly despised is because it _allows_
politically incorrect subreddits.

~~~
fernandotakai
> extremely rare, and per-sub, not reddit-wide

from your list, it's the only thing that's not correct:

there's a site-wide shadow-ban that only admins can apply. That means if you
post something, nobody will see it, but _for you_ everything will show up as
normal. Subreddit moderators can approve your comment and it will show up.
This started as a measure against spammers, but it seems like normal users get
shadow banned too.

then there's the normal ban, where you can be banned from a subreddit (you
can't post but you can browse), and there's a reddit-wide ban (your account
gets basically deleted).

and there's also the AutoModerator "shadow-ban", which uses reddit's
AutoModerator to delete a user's comments as soon as they get posted.
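A toy model of the visibility rules described above; the function and flag names here are mine, not Reddit's actual implementation:

```python
# Toy model of shadow-banning: a shadow-banned user's comment is invisible to
# everyone except its author, unless a subreddit moderator approves it.
# Names and flags are illustrative, not Reddit's API.
def comment_visible(viewer, author, shadow_banned, mod_approved):
    if not shadow_banned or mod_approved:
        return True
    return viewer == author  # the author still sees everything as normal

assert comment_visible("reader", "user", shadow_banned=False, mod_approved=False)
assert not comment_visible("reader", "user", shadow_banned=True, mod_approved=False)
assert comment_visible("user", "user", shadow_banned=True, mod_approved=False)
assert comment_visible("reader", "user", shadow_banned=True, mod_approved=True)
```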

------
deviationblue
Why do we think we'll succeed at "detoxifying" the internet? To me it seems
like wanting everyone to say things that don't differ from what everyone else
says. That doesn't happen off the internet.

Off the internet, most people interact with those who are more or less like
themselves, so there's less chance of disagreement. For anyone who differs
significantly from you, you hold back your real thoughts and/or stew silently.
We just have to accept that echo chambers are an inevitability on the internet
with censorship and moderation, and discontent is an inevitability otherwise.

If people knew how to talk diplomatically or to get along, this wouldn't be a
problem in the first place. Maybe that's something that can be taught, but
it's not a problem that gets fixed on the internet, or a problem caused by the
internet: fix it at home; this may also take care of the trolling problem.

~~~
austincheney
> Why do we think we'll succeed at "detoxifying" the internet?

We won't. Honestly, most people don't want success in any form. They want
validation, comfort, and instant gratification. If you have to direct success
toward a broad enough group it will never be achieved unless you temper your
expectations. This applies to all things. It even applies to choice of career,
income expectations, and hiring decisions as well as expecting more mature
behavior online.

------
rdl
Rather than censorship, just give readers better filtering tools. This was
largely solved on USENET years ago and has been solved in email to a great
degree -- it's just Twitter, FB, etc. which seem to have a hard time solving
it.

If you made certain sets of filters the default, or available in groups, and
did filtering based on your best beliefs about the desires of readers (based
on keywords, senders, and maybe other characteristics of the conversation), it
would go a long way.

~~~
s73v3r_
How would you do that in such a way as to prevent a Twitter army from piling
on someone? Or at least make it so they don't feel they're being bombarded?

~~~
rdl
At some point go whitelist-only or raise thresholds very high (like, on
twitter, only people you follow, or blue checks, show up in your timeline if
abuse is detected, as a very rough pass; you could easily do something far
better).
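As a sketch, with all names hypothetical, that rough pass is just:

```python
# Sketch of the rough pass described above: once a pile-on is detected for a
# recipient, only accounts they follow or verified accounts get through.
def reply_shown(author, recipient_follows, verified_accounts, abuse_detected):
    if not abuse_detected:
        return True
    return author in recipient_follows or author in verified_accounts

follows = {"friend1", "friend2"}
verified = {"journalist"}
assert reply_shown("stranger", follows, verified, abuse_detected=False)
assert not reply_shown("stranger", follows, verified, abuse_detected=True)
assert reply_shown("friend1", follows, verified, abuse_detected=True)
assert reply_shown("journalist", follows, verified, abuse_detected=True)
```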

------
seany
I can't be the only person who is just not bothered by this kind of thing at
all, am I? I'm way more worried about political indoctrination in upper
education than trolls on the internet.

~~~
1337biz
Then you are running in the wrong circles where you can't just reject any
opposing viewpoint as 'Russian bots'.

~~~
verylittlemeat
In the past month I've had 2 comments accuse me of being a Russian bot or a
shill on hacker news. People really do just see what they want to see.

------
throwaway84742
Aside from the degenerate cases like “kill all white men” or “liberalism is a
mental disorder”, one man’s “toxic comment” is another man’s “common sense”.
And as frustrating as it may be to SV digital overlords they don’t get to
decide which is which, at least not without unpleasant repercussions down the
line.

So there’s really no way to solve this. There is a way to contain it somewhat:
create homogeneous bubbles. Which is what we’ve been observing over the past
few years.

~~~
abvdasker
Or, rather than giving up or "creating homogeneous bubbles" instead you could
look at things on a case-by-case basis as Reddit is doing. At the end of the
day the privately run company _does_ get to decide whether content warrants
removal or not since they are running a business and advertisers don't like
advertising in a space associated with extreme or harmful content. They also
have an incentive not to take the moderation too far since to do so would put
the company's most valuable asset — its user base — at risk.

~~~
throwaway84742
Trouble begins when you try to define “harmful” content outside the most
egregious cases. Politics inevitably creep in. Worse, these days some people
think that “words are violence”, that being offended gives them any rights
(and thus creates the vicious cycle of being offended by more and more inane
things), and that everyone who voted for Trump is a Nazi. How far are you
willing to go, as a private company, to please your advertisers? And how far
will your user base allow you to go before deserting en masse?

------
castis
As long as there are no consequences to being a dick to people over the
internet, "toxic internet" is here to stay.

~~~
devmunchies
Sticks and stones.

More concerning is the large scale propaganda and marketing.

~~~
castis
This has nothing to do with "getting your feelings hurt".

If someone is a dick to you IRL, you just go about life without them and it's
of zero consequence to you. Imagine if they stayed in your life outside of
your own accord and just yelled dumb shit at you from the sidelines.

Not everyone is built as strong as you, it bothers some people.

~~~
intended
That's untrue - in that what you describe is the kind of trolling we would
wish for, because it's benign.

Trolls actively target forums for support groups. Today they can attack groups
and classes of people.

At this point, expecting people to "grow a thick skin" is not an option,
because a major aspect of their life is fighting for their rights against
people who are attacking them, or people who have hurt them.

At this juncture the troll loses all rights over the definition and
interpretation of how their action _should_ be perceived.

Instead most normal people just see it as vandalism and stop caring about
whether it was for the lulz or not.

\---

And as stated in the article - NOT responding to a troll who is attacking
someone makes you complicit.

~~~
castis
Your definition of 'complicit' appears to be flawed.

Choosing not to be involved is clearly not the same as actually starting the
behavior to begin with.

------
wnevets
Reddit has zero problem leaving posts that threaten the lives of an entire
religious group on their frontpage. If they were serious, certain subs would
be nuked.

~~~
ralusek
You realize that it is a perfectly reasonable objective for a site to be a
_mostly_ impartial, user-moderated forum, right? I don't consider _any_ view I
encounter on reddit to be endorsed by the platform. If people want content
moderated with a specific slant, they can just subscribe to the subreddits
that suit their tolerance; there are plenty that are heavily moderated.

Personally, I love seeing things that I disagree with on reddit...it's
honestly one of the last places left that gives me the feeling of exposure to
different bubbles.

~~~
fatbird
The problem with Reddit is that, while it's not endorsing the content, it is
facilitating it. You can let the white supremacists in your neighbourhood use
your garage for their weekly meetings and plausibly claim that you're just
being neighborly while disagreeing with their purpose. But I'm still going to
judge you for helping them out, for making their organizing easier and
cheaper, and for smoothing the road for them.

When Reddit banned r/fatpeoplehate and r/coontown in 2015, among others, it
actually succeeded in reducing the amount of hate speech on Reddit:

 _Working from over 100M Reddit posts and comments, we generate hate speech
lexicons to examine variations in hate speech usage via causal inference
methods. We find that the ban worked for Reddit. More accounts than expected
discontinued using the site; those that stayed drastically decreased their
hate speech usage—by at least 80%. Though many subreddits saw an influx of r
/fatpeoplehate and r/CoonTown “migrants,” those subreddits saw no significant
changes in hate speech usage_ [0]

By making it easy for participants of r/coontown to participate, Reddit
actually contributed to the output of r/coontown.

[0] [http://comp.social.gatech.edu/papers/cscw18-chand-hate.pdf](http://comp.social.gatech.edu/papers/cscw18-chand-hate.pdf)

~~~
ralusek
Letting white supremacists use my garage for weekly meetings is not the same
as me building an establishment for the explicit purpose of facilitating
conversations of all kinds, and then not kicking out a white supremacist
minority for using the facility.

My concern with censorship is always 2 fold:

1.) You have to always assume the person who is in control of censorship is
the person who doesn't agree with you. Donald Trump has on multiple occasions
indicated that he would like to restrict the press's capability to criticize
the president, since he has been in power. Kim Jong Un shares a similar
sentiment. Would we like that? There are many Trump supporters who would see
absolutely nothing wrong with it. There was a moral majority in the US for a
long time that would have preferred to restrict others' ability to talk about
what they considered to be sinful things, such as homosexuality or communism.

2.) You have to realize how quickly things become conflated. Ben Shapiro, a
conservative pundit, went to Berkeley to speak, where student groups
attempting to ban him from speaking had strung up a huge sign reading "We say
no to your white-supremacist bullshit." It's important to note that Ben
Shapiro is an Orthodox Jew who is harassed daily by white supremacists
online...and has never said a single thing advocating for racial supremacy. In
fact, he is hugely opposed to racial, gendered, or national identity politics
of any kind. Students wanted to ban him for hate speech simply because they
associated him with what they perceived to be hate speech.

~~~
fatbird
Building an establishment to facilitate conversations of all kinds generally
does not obligate you to allow every specific conversation. Were it my
establishment, I wouldn't permit members of NAMBLA to organize their sex-
trafficking trips to Southeast Asia, or white supremacists to organize the
next Unite The Right rally. I wouldn't feel at all like I'd betrayed my
purpose in facilitating more, and more different, conversations by
specifically banning some that are obviously problematic in and of themselves.

I also wouldn't feel like I was censoring anyone. My not facilitating their
conversation is not the same as my preventing them from having it. I would
also be mindful that the presence of nazis and pedophiles in my establishment
was itself a barrier to other communities taking advantage of my facilities;
if members of those other communities started showing up after I banned them,
I'd probably consider it a net win for freedom of expression and
conversational facilitation.

Also, you're soft-pedaling Ben Shapiro by calling him a "conservative pundit",
as if he's in the same inoffensive league as Ross Douthat or David Brooks. He
was an editor of Breitbart News, which is now quite openly the voice of the
"alt-right"; he departed and got on their enemies list because he was smart
enough to realize that Trump was going to damage the cause for a generation--
which is why they're going after him, because they were quite happy to have an
Orthodox Jew on staff while he was agreeing with them. He's said a variety of
things about Arabs and black people that are flat out racist. He's not quite
the bomb-thrower that Milo is, but he still drops nuggets such as "Arabs like
to bomb crap and live in sewage". Calling him a white supremacist might be
hyperbole, but it's not simply conflating conservative views with nazis. He's
a racist asshole who advocates for ethnic cleansing. Maybe white supremacy
isn't his conscious goal, but when he facilitates that end, it becomes a bit
irrelevant if he's actually written that down on his "A better 2018" list.

------
overcast
The only way to detoxify the internet, is to get rid of the "social" aspect.
Starting with _gasp_ comments. My online experience has been that much better,
since adding comment blockers to my browser.

~~~
ASalazarMX
Reddit has a large user base and a myriad of subreddits where different
solutions have been tried. So far, the best I've seen is strong moderation:
clear rules and swift enforcement.

AskScience is a shining example of a high quality subreddit, although the
comment section usually looks like a graveyard with 90% of the comments
removed.

~~~
jstanley
I'd like to see moderation decoupled from the forum namespace.

So you'd subscribe to a forum namespace in order to see posts about a topic
(say, AskScience), but you would then also subscribe to whichever moderators
you want. The moderation wouldn't be inextricably tied to the namespace.

Anybody can post anything in any namespace (and anybody can declare themselves
a moderator of any namespace), but people will only see posts if their chosen
moderators allow it. You could have all of the same moderation powers that
exist at the moment, except any given moderator's actions are optional to any
given user's view of the forum.

We wouldn't need a situation like with r/bitcoin and r/btc where disagreements
about moderation resulted in a splinter group creating a separate namespace:
those who disagreed with moderator X would simply unsubscribe from X's
moderation.

I don't know if this would result in a _better_ forum, but I'd be interested
to see how it's different, and I don't know of anywhere it's been tried.

(And in case someone out there is granting wishes... can I also have it
decentralised e.g. using IPFS's pub sub?)

~~~
crankylinuxuser
Yep. That was usenet. In those days we didn't have 3rd-party recommended
filters, but there's no reason why they wouldn't work now.

Maybe it's time to remake Usenet, minus binaries. Binaries, piracy, and their
data load per server are what killed Usenet.

(Yes, I know it's still living on in paid-service world. But gone are the days
your ISP runs a machine.)

~~~
jstarfish
> Binaries, piracy, and their data load per server are what killed Usenet.

Binaries and piracy were two of the biggest reasons to _use_ Usenet.

 _Spam_ is what killed it.

~~~
crankylinuxuser
I respectfully agree and disagree.

Spam was and still is a massive headache. However, Bayesian filters were
really starting to take off, and spam was annoying at best.

What caused ISPs to quit running Usenet servers was that they saw them as a
pirate haven and a lawsuit magnet. There was every impetus to stop supporting
piracy and to shed the bandwidth costs of running a Usenet server.

Sure, piracy was a great draw to use it... but it is also why it fell. These
days, it's time to move to IPFS. That place is ripe for piracy, and super
simple to share.

------
Clanan
> “Does free speech mean literally anyone can say anything at any time?”
> Tidwell continued. “Or is it actually more conducive to the free exchange of
> ideas if we create a platform where women and people of color can say what
> they want without thousands of people screaming, ‘Fuck you, light yourself
> on fire, I know where you live’?

This comment, by Reddit's General Counsel, says a lot about their approach.
They believe that minorities/women will be subjected to nonstop violent
comments unless Reddit pursues proactive and forceful moderation, free speech
be damned. I think (hope) that's a very cynical view. The author of the piece
does a great job juxtaposing this quote with the results of r/Place, however,
in pointing out that it did not devolve into swastikas and "toxic" content
when left to its own devices.

But reddit moderation is not just about quality content. I think many
commenters here miss this. Moderation of a hugely successful website is not
just about the technical issues or the logistics. Huffman, the CEO, is quoted
as saying he believes Reddit can sway elections. So now moderation is about
power. Imagine the temptation: you're one click away from destroying the
popular, impactful subreddit of a political candidate you detest. One click to
sway an election, maybe.

Edit: spelling

~~~
intended
Minorities and women _are_ subject to nonstop violent comments.

God help you if you are a moderator and a member of a popular community and
known as member of one of those groups.

And it's even worse if you are a mod, a minority member, AND a mod of a
minority community.

\----

The best way to think of social media/reddit is as a continual cleave-and-sort
process.

People form a community, and then they have to deal with attackers or
outsiders. Eventually a drama event occurs and people get expelled.

These people who are expelled form their own community, and this process
continues ad infinitum.

On top of it, some people will independently create subs just to target you,
or create subs that inadvertently target you (from the old forum days - nazi
forums would actively harass Jewish forums.)

------
zitterbewegung
I don’t know if you can do this. The problem with social news sites is not
that the trolls are winning, but that the moderators aren’t properly
compensated, because no website has the capability of moderating large
websites. Also, to get advertisers on your platform you need a moderated forum
that aligns with your advertisers. The internet always had free speech, and
with that come trolls by definition. The difference now is that there are
competing ideologies associated with websites that want to control vast parts
of the internet. We have websites that want your pictures, attention and
comments for free in exchange for promoting other people’s products. On the
site operators' side, they need to do deals with advertisers who don’t want
controversial topics. So the advertisers force discussions on the platform to
align only with them, or the site operators do this. The troll problem is not
going to be solved by Reddit alone.

------
AllegedAlec
> To its devotees, Reddit feels proudly untamed, one of the last Internet
> giants to resist homogeneity.

Waitwhat? Reddit is a huge hivemind, with each subreddit having more or less
the same views, since people who do not agree to the narrative of that
subreddit are quickly downvoted so much that they can't post more than once a
day or something.

~~~
treerock
That's not my experience. There seem to be subreddits on every conceivable
topic, and you can subscribe to whatever variety you want and you'll get a
wildly varied range of views/politics/whatever.

There are even 'change my view' type subreddits for people looking to
understand and debate the 'other side' of their arguments.

~~~
austincheney
The comment you are replying to becomes more valid the larger the subreddit
regardless of its subject. You wouldn't think subreddits like _javascript_ and
_programming_ would be hotbeds of rampant emotional insecurity fueled by deep
echo chambers, but they certainly are.

------
harlanji
All this centralization has created all these problems. We need to federate
our systems, perhaps even at the app level. I worked at OpenTable for a short
time and their architecture was interesting. Each PoP in the old version is
its own system that pushes data centrally, though they’re replacing it with a
new generation of centralization, presumably due to engineering maintenance
concerns. Similarly, their regional frontends seem to be being consolidated.

I am coming to think nested intranets and special-purpose VPNs might be a good
approach to global commerce and expression. I love building massive-scale
no-touch systems, but they’re never that perfect once they start being used.
E.g., you usually end up with more severe and harder-to-solve issues with one
massive DB vs. smaller systems with derived data and links. And on the FE a
lot of hard problems go away; beyond i18n etc., cyber bullying etc. also
become tractable.

------
randyrand
Would it be sufficient for every user to just have client side configurable
filters? Or is this movement more about denying "toxic" people a platform to
begin with?

If it's the latter case, I can't get behind it. Free speech means everyone
should have a platform to speak. As society transitions away from using our
mouths to using our keyboards, the first amendment implies that speech should
be protected on the common medium - whether verbal or text.

~~~
s73v3r_
Filters can work after the fact, but they don't prevent a lot of it in the
first place.

As for "toxic", the article is referring to things like /r/CoonTown, or
brigades like the one that went after Leslie Jones on Twitter a number of
years ago. The people behind those already have a platform to speak: The
Internet. They're not entitled to space on another platform like Reddit or
Twitter, and they're not entitled to force their speech in front of someone
else's face.

~~~
prepend
Coon town is obviously offensive, but easily avoided (ie, not a problem).

I casually followed the Leslie Jones incident and don’t see the problem. The
brigading seemed easily avoided by Jones and was actually amplified by media
covering how bad it was.

My twitter feed doesn’t show mentions by randos, so Jones should have been
able to easily filter out at the client. She may have for all I know.

So both your examples seem easily solved by client filters to not force speech
in faces.

~~~
s73v3r_
"Coon town is obviously offensive, but easily avoided (ie, not a problem)."

It is until it starts to leak.

"I casually followed the Leslie Jones incident and don’t see the problem."

I honestly cannot see how one can see the incident, and cannot see a problem
with thousands of posters telling an African American woman that she's an ape.

"The brigading seemed easily avoided by Jones"

That is blaming the victim.

"So both your examples seem easily solved by client filters to not force
speech in faces."

Except they're not.

~~~
gowld
> It is until it starts to leak.

It doesn't really leak, not until the honeypot gets shut down and the denizens
are forced to hang out elsewhere.

~~~
ufo
Someone made a quantitative study about the /r/fatpeoplehate ban recently and
their conclusion was that the honeypot theory isn't valid and that banning
hateful subs causes a reduction in overall hateful behavior instead of just
shuffling it around.

~~~
intended
This was the study - [http://comp.social.gatech.edu/papers/cscw18-chand-hate.pdf](http://comp.social.gatech.edu/papers/cscw18-chand-hate.pdf)

------
wufufufu
You are making the problem seem worse than it actually is. Account age and
inability to delete comments make an anonymous reputation pretty realistic on
HN at least. Just in this thread we see examples of toxic comments being
downvoted into invisibility (oblivion).

Or is the problem that /r/the_donald exists? Just don't go there if you don't
like it. And don't go to /r/trees if you don't like marijuana.

~~~
InclinedPlane
The problem is that reddit is one site. You can live in other "neighborhoods"
on reddit that are not /r/the_donald, but you are still forced to live in the
same city. That means people from there will show up in all of the other sub-
reddits, voting and commenting away and twisting every single part of the site
to be what they want. It's exhausting to put up with, and if you don't notice
it then count yourself lucky. There are many regional city/state/province sub-
reddits that have come under concerted "attacks" by /r/the_donald posters.
Folks from there will re-post links (or share via chat) to "controversial"
posts in other sub-reddits to coordinate brigading, for example.

~~~
thomastjeffery
> It's exhausting to put up with

It would be _more_ exhausting to put up with a system of censorship.

There will always be people around you that say or do things that you do not
appreciate. Getting offended at them is something _you_ do to _them_ , not the
other way around.

The more we act like people have power over us, the more power we give them.

~~~
intended
This theory has long since met with reality and lost.

I mod - and it turns out that trolls target groups of people.

So, for example, seeing a bunch of posts targeting women or minorities is
going to affect people who have been abused for being part of those groups.

Indiscriminate massed assaults are extremely hard for people to deal with -
and even if I am not the target I would expect myself to step in, or some
authority to step in.

So yes, in a very real way they have power over a large number of people, just
the same way you have power over someone weaker or more vulnerable than you.

The old chan philosophy worked on a smaller net, if it even worked then.

------
skookumchuck
> Reddit was also an important part of Trump’s strategy. Parscale wrote—on
> Reddit, naturally—that “members here provided considerable growth and reach
> to our campaign.”

So were Hillary's and Bernie's campaigns, each of which had a major presence
on Reddit, both much larger than Trump's.

------
cup-of-tea
I quit reddit for good on Saturday. There are two major problems with it:

\- Downvoting creates echo chambers,

\- No accountability for moderators.

I think the downvoting should be removed completely. Downvoting is basically
giving people the power of a mini-censor and the people _love_ it.

Subreddits can be completely censored by moderators and there's no way to know
this is happening. Maybe reddit can develop a way to judge the behaviour of
the moderators. Like showing what posts they removed, for example.

I've been part of plenty of internet communities that don't become like this.
Even completely public forums and IRC channels. I guess HN isn't as bad as
reddit because we have an above-average intelligence crowd, but there are some
opinions I am careful not to mention here as well.

~~~
prepend
How do you filter out user sentiment without downvotes?

I think there are flaws in downvoting, but they are less than systems with
only upvote (eg Facebook).

Upvote-only systems encourage extreme views that attract attention, because a
post that 5% like and 95% dislike (theoretically) is ranked higher than one
that 4% like and 0% dislike.

What we really need is a score computed as a percentage of views; that would
be more meaningful.

Mixing in some sort of cost to post/vote would help as well, because people
would be more judicious with likes. Currency is hard to work out, as Steem is
a big fluff-fest. Maybe something where you get credits every day and can
spend or bank them.
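The ranking difference described above can be made concrete with a toy comparison. This is a hypothetical illustration with made-up numbers, not how Reddit or Facebook actually score posts: one scorer counts only upvotes, the other uses net approval as a share of views.

```python
# Hypothetical illustration: ranking by raw upvotes vs. by net approval
# as a fraction of views. All numbers are invented.

items = [
    # (name, views, upvotes, downvotes)
    ("flamebait", 10_000, 500, 9_500),   # 5% like it, 95% dislike it
    ("solid post", 10_000, 400, 0),      # 4% like it, nobody dislikes it
]

def upvote_only_score(views, up, down):
    # Upvote-only model: dislikes are invisible, only likes count.
    return up

def net_per_view_score(views, up, down):
    # Net approval as a share of everyone who saw the post.
    return (up - down) / views

for scorer in (upvote_only_score, net_per_view_score):
    ranked = sorted(items, key=lambda i: scorer(*i[1:]), reverse=True)
    print(scorer.__name__, "->", [name for name, *_ in ranked])
```

Under upvote-only scoring the flamebait post wins (500 > 400 likes); under net-per-view scoring it sinks to the bottom (-0.90 vs. +0.04), which is the outcome the comment is arguing for.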

~~~
cup-of-tea
I've never actually used Facebook so I haven't seen the effects of only
upvoting. But up/downvoting is not meant to signal agreement or disagreement
anyway. It's meant to signal whether a post is useful or not. But people
treat it as the former, which is why it creates echo chambers. I think the
Facebook "upvote" is meant to signal agreement. Does it also make the post go
to the top?

~~~
prepend
Something like that.

Ultimately it doesn’t matter what up/down vote is supposed to mean. It matters
what the outcome is.

Up means prioritize, down means deprioritize.

------
allthenews
I detest this smug mentality that so-called trolling is trivial to identify
and differentiate from legitimate but unpopular opinion.

I feel strongly about it because it creates opportunity for censors, both in
the form of moderators and communities themselves, to stifle debate under the
guise of "detoxification." This kind of communal policing is ground zero for
the echo chamber phenomenon.

We live in a reality where there will always be those who abuse platforms of
communication for nefarious purposes. We also live in a reality where we are
able to temper our own outrage, and we must if we wish to maintain healthy,
open dialogue.

Offense is taken, not given.

~~~
s73v3r_
And I detest this smug mentality that this subject is concerning things that
are mere differences of opinion, like we're discussing tax policy. That's not
the case in the least.

~~~
allthenews
You don't seem to understand that terms like "toxic" are as arbitrary and
subject to abuse as the word "obscene".

In fact, I'm not even sure as to what you're getting at. Are you seriously
suggesting that what is toxic is somehow universal?

It does not bother you that dozens of serious contemporary, controversial
topics, with severe consequences for our society, are regularly censored on
websites like reddit under the guise of so called toxicity?

You aren't bothered by such vague conceptions of offensiveness being used by
corporate giants to push their narrative at the expense of others? Of the
danger of self-censoring group think on community moderated boards?

~~~
s73v3r_
"It does not bother you that dozens of serious contemporary, controversial
topics, with severe consequences for our society, are regularly censored on
websites like reddit under the guise of so called toxicity?"

I think you're gonna have to provide an example for this. An example of
something that is being labeled as toxic, but wouldn't appear to be.

~~~
allthenews
The problem is not that the topics themselves are toxic, but that only one
point of view is identified as such. I only mention these "flamebait" topics
as examples, not to discuss them or choose a side.

Gamergate

Gender Equality/Equity

Affirmative Action

White Privilege

The various subjects in James Damore's recent memo [note, he was fired for
honest, sourced feedback that was explicitly requested of him]

Politics in general (especially anything remotely pro-Trump or pro-Republican)

If you pay attention to dynamics on reddit, for example, certain sides tend to
be stifled, regardless of any potential merits. I am not choosing sides, just
offering examples where certain opinions are immediately identified by
communities as toxic, and no discussion is even attempted.

Hell, there is a growing movement among feminists, at least a vocal minority
online, which claims that our modern concept of masculinity is "toxic."

The very word is turning into a catch all for "opinions I don't like," which
seems to happen any time one attempts to censor information.

Finally, consider this: if the topics I've brought up are so clearly, as you
claimed, not merely "differences in opinion," then why do they attract such
heated discussions, with contributors on both sides, even on HN?

~~~
milesrout
I suspect I might be feeding the troll, but I'll give you the benefit of the
doubt.

Gamergate might have _very initially_ been about 'ethics in games journalism',
but it didn't stay that way very long. It was very quickly infested with
racist, sexist trolls. All that stuff with the lady that supposedly slept with
someone to get good reviews, or whatever? Turned out that was total bunk. That
was proved within a couple of weeks.

Guess what, KotakuInAction still exists. It doesn't really matter whether the
initial idea was right, though. If they'd been right initially, their later
actions would still be wrong. And they were wrong initially in reality, and it
hasn't stopped them from continuing their shit.

For quite a long time, I dismissed the criticisms of Gamergate because I
really had nothing to do with it. I read the initial couple of news articles
and watched a couple of videos with some outrage and kind of went 'yeah no
shit, IGN gets paid to give good reviews, not news'. I then basically ignored
them for years. I would hear that they were toxic this or problematic that and
I dismissed it, because I thought their initial points seemed solid enough.

Then when I actually went back and looked at it, it had basically turned into
TheRedPill. KotakuInAction and GamerGate are really not worth defending.

\---

The rest of the topics are pretty similar: all differing levels of
controversial, and with echo chamber subreddits heartily happy to advocate for
them and heartily happy to advocate against them.

The word 'toxic' is only used by one of those sides, but the other side
generally uses far worse language to describe their 'opponents' if you want to
use that term. I think being described as 'toxic' is a little better than
being described as 'worthy of being shot' or whatever other horrible things
people in TRP, The_Donald and such tend to say.

Plus sometimes echo chambers are actually good. When you want to discuss the
finer points of something, being able to just say 'this is not a subreddit for
debating the merits of ideology X just because it's called /r/X, go to
/r/DebateX please' is fine. One of the biggest problems with reddit is that in
comparison to old-style forums it's so ephemeral. Other than the (maximum 2)
stickies and sidebar links there's little in the way of permanent material in
any subreddit, so you end up having the same introductory surface-level
discussions again and again and again.

Reddit is a social _news_ website first and foremost. It's not good at
discussions at all.

------
pbalau
This is stupid. Facebook, reddit, and twitter are trying to make the internet
this beautiful, family-friendly place. This is bullshit; we as humans are
toxic. This is the way we function. This is the way we react to things.
Censoring all of this might affect things we should not censor. Fuck
family-friendliness if it has a chance to affect free speech.

------
skookumchuck
> which brand themselves as strongholds of free speech and in practice are
> often used for hate speech.

Um, hate speech _is_ free speech.

~~~
DanBC
Hate speech is used to silence people. Hate speech is not free speech.

~~~
skookumchuck
You're referring to threats, which are not free speech. Hate speech is free
speech.

------
at-fates-hands
Since the dawn of social media, as an anthropologist, I find the way it has
evolved to be incredibly fascinating.

The original point of social media was to bring people together and build
communities. Now, more than a decade removed from its inception, it is doing
the exact opposite. It's tearing people's communities apart, making people
_more_ polarized, and driving huge wedges into the very foundation of this
country.

And yet, with all the privacy concerns, people continue to use it more and
more, seemingly unaware of the consequences of doing so. It's like a train
wreck you see coming a mile away. Yet you continue to stare, thinking or
hoping it's going to get better.

~~~
nostalgeek
> The original point of social media was to bring people together and build
> communities. Now, more than a decade removed from its inception, it is
> doing the exact opposite. It's tearing people's communities apart, making
> people more polarized, and driving huge wedges into the very foundation of
> this country.

"Corporate" social media are businesses; they are not here to bring people
together but to sell influence. Businesses like Google, Twitter, or Facebook
have been very good at that. Social media are no different from TV or any
ad-sponsored business: the customer is the advertiser. How do you think these
businesses are making billions without their users paying anything?

Venture capitalists decided to support these business models with massive
amounts of money, allowing them to reach critical mass. The whole "bringing
people together" thing was just a marketing ploy during the growth period, in
order to acquire an audience. But that's not how VCs got their return on
investment. Remember when Facebook was a website acting as a private network
for families and friends? They quickly pivoted away from that because it
doesn't make that much money.

------
dalbasal
With the benefit of hindsight, I think “free-form platforms” like reddit or
youtube may be the road to mediocrity.

Youtube is still the major player even though twitter, Facebook & netflix grew
around them. It is less opinionated on how video should be consumed. Less
opinionated on what should be consumed. Anyone can post. Anyone can see it.

Problem is, youtube doesn’t really have a purpose that it is trying to serve.
At least not an obvious one. It has a lot of content. It’s searchable (I hate
that the web is no longer searchable..). It’s the obvious place to put stuff.
It’s everywhere. But…

The platform does not distinguish between the lazy mashup with a clickbait
headline and the youtuber making original content for people who would really
miss it and could not get it anywhere else. No preference. No values. No
taste. No goal. Just views. One piece of content is as good as the next.
Monetisation reinforced this, as did the partial rollback.

Go to the youtube homepage now. Is the content good? Read the comments. Is
the conversation interesting?

In the middle is Facebook. It started off highly opinionated. Use real names
(big deal at the time). Connect to your real friends. Watch their videos and
read their posts.

Over time though, Facebook has lost opinions. Stuff is stuff. Views, likes,
pokes, plays and comments. Features that get users doing more stuff, they win.
This caused their political pickle. Political baiting gets lots of views,
likes, pokes, plays and comments. So, as they “optimized” for more stuff,
that’s the stuff they got.

On the other end of the spectrum are wikipedia and stack overflow (for
example). Highly opinionated. Wikipedia is for XYZ. Stackoverflow is for ABC.
If you want something else, go someplace else.

HN is opinionated too. It takes effort to avoid comment threads full of
bullbaiting, and even more to avoid a flood of one-liners.

So Reddit… Reddit did avoid uniformity, that’s a nice way of putting it. It
has values. It has things (e.g. censorship) that it does _not_ want to do.
But, I feel it still lacks purpose. What does Reddit actually want? What's the
ideal comment or post or subreddit?

Personally, I would love more HN-like places, more wikipedia like places,
places with a job to do.

gdubs has an excellent comment about what Reddit is doing wrong (promoting
impulse gut reactions), and I suspect it's related. When you don't have an
opinion about what you are trying to do or be, you tend to end up with
whatever is cheap. It happened to youtube and it happened to Facebook.

~~~
jeffreyrogers
I don't know about youtube. There's a lot of really interesting, useful stuff
from people who just want to share a bit of what they know about some niche
topic. If you know what you're looking for, youtube is a great resource. But
if you're just trying to browse to find fun stuff in your free time, then I
guess it's not as useful.

I agree that a lot of the bigger channels get worse as they make videos
primarily to get views. Although I think this is true of most content
producers: You only have so much good material, but it's hard to quit when
there's still money to be made at the margins.

~~~
dalbasal
I don't disagree, but I think we might be talking past each other.

Youtube definitely does have a lot of really interesting, useful stuff.
There's some stuff that's fantastic. That's the stuff I want them (youtube) to
value more highly. Some (most) of it is interesting and useful to people who
aren't me. I'm not even commenting on youtubers disingenuously leaning into
controversy or whatnot to get views.

What I'm saying is that there is a lot of stuff that isn't. Youtube does not
seem to have an opinion about what is good or bad, at least not beyond view
count. Have a look at their homepage. Is it good? Does it give a special place
to the things youtube does uniquely well? To me, it just looks like a random
collection of videos and a lowest common denominator theme.

Structuring youtube as a video db, with search... that's great. I like that.
It means you can narrow down to the point where the lowest common denominator
is relatively high, but I feel like youtube would prefer me to just watch some
scandal outrage news clip or a show clip or something.

------
confounded
Algorithmic/centralized vs Human/community moderation is the largest divide
between Facebook, Twitter, and Google vs. Reddit, HN, Twitch, Mastodon, etc.

I’m thoroughly in favor of the latter. Algorithms can be used as tools to aid
human moderators, but context, discretion, and diversity _are_ important. In
the case of bad _communities_ , you have a much more tractable problem in
terms of tracking/banning/avoiding what goes on -- as both a platform and as a
user.

------
bskinny129
I just started reading a book that got me thinking about this - Conspiracy:
Peter Thiel, Hulk Hogan, Gawker, and the Anatomy of Intrigue [0]

If you can remove just a couple of the worst actors from the internet, does it
have an outsized benefit? Are those people defining "acceptable behavior" and
by example giving more reasonable people permission to behave that way?
Interesting questions regardless of the specifics of the Thiel/Gawker case.

[0 affiliate]: [https://www.amazon.com/Conspiracy-Ryan-Holiday/dp/0735217645...](https://www.amazon.com/Conspiracy-Ryan-Holiday/dp/0735217645?tag=breakoutmentors-20)
[0 non-affiliate]: [https://www.amazon.com/Conspiracy-Ryan-Holiday/dp/0735217645](https://www.amazon.com/Conspiracy-Ryan-Holiday/dp/0735217645)

------
apeace
A lot of commenters here are mentioning that better education could help solve
this problem. I agree.

What if there were a new type of "teacher" who replaced or augmented regular
mods on sites like Reddit?

Universities strive to hire professors who will teach students to be open-
minded, consider the sources of information, and just _think better_ in
general. The prestige of each university is largely based on this factor. We
tend to measure this by looking at the research accomplishments of the
professors, or by looking at how many great thinkers/leaders come out of those
universities.

What if sites like Reddit did the same thing, and tried to build their
prestige based on excellent moderation? Could they be measured by how many
great ideas/movements come out of the platform?

How do universities prevent the "echo chamber" effect where a few mods or a
few users can take the whole conversation in one direction?

~~~
VLM
"How do universities prevent the "echo chamber" effect where a few mods or a
few users can take the whole conversation in one direction?"

They turbocharge groupthink. Universities are probably not the best example of
an open-minded community.

A distant cousin of my wife went to a seminary; within a relatively narrow
range of beliefs, he claimed there was a lot more intellectual freedom there
than at universities, along the lines of teaching how to talk to devil's
advocates and heretics and how to convert non-believers. More of a carrot and
stick than a university, which is solely stick-style: who can most emphatically
say "I can't even" and so on.

------
verylittlemeat
One more opinion for the heap?

People are too sensitive about seeing their snowflake opinions validated.
Upvote and downvote do not map to right and wrong. What makes an opinion
popular has as much to do with rhetoric and appealing to a million fallacies
as it does to facts or logic. This is human nature, not an inherent problem
with the internet.

------
mozumder
Social media sites are learning the hard way that they're media properties
like any other media property, and that they have to have strong editorial
control over their media property to be a proper media business.

Reddit is a hive of working-class populism, which is incompatible with any
advertising-oriented business. Advertisers don't want their ads next to shitty
toxic content. They want their ads next to elite, well produced content. You
think Calvin Klein wants their beautiful fashion ads next to a photo of a dead
body?

Sure, they can get 10-cent ads for your neighbor's garage sale to place next
to the photo of a dead body, but to get the $5 million ad buy from Procter &
Gamble, they'll need editors to raise the quality of their content.

Detoxification is central to any ad-based business.

How quickly they learn this is the question for Reddit.

~~~
xenihn
Reddit initially rose to prominence because of bad decisions on Digg's part.
Reddit is nothing without its users. Any decision that potentially drives
users off its platform can't be made lightly.

~~~
mozumder
As a media business, they can be a lot more profitable with fewer high-quality
users than hordes of low-quality toxic users.

A fashion magazine only has 1 million subscribers, but can make $400
million/year in revenue, basically printing money.

~~~
Slansitartop
> As a media business, they can be a lot more profitable with fewer high-
> quality users than hordes of low-quality toxic users.

I see no evidence of that.

> A fashion magazine only has 1 million subscribers, but can make $400
> million/year in revenue, basically printing money.

A fashion magazine doesn't outsource its content production to its
subscribers.

~~~
mozumder
> I see no evidence of that.

I would encourage you to look at the evidence.

> A fashion magazine doesn't outsource its content production to its
> subscribers.

And?

~~~
Slansitartop
>>> As a media business, they can be a lot more profitable with fewer high-
quality users than hordes of low-quality toxic users.

>> I see no evidence of that.

> I would encourage you to look at the evidence.

I would encourage _you_ to provide it.

------
ScipioAfricanus
It's not that the "trolls are winning", it's that people are allowing the
trolls to bother them. Trolls have always existed; it's our heightened
sensitivity and inability to just shrug them off or laugh in the face of their
obscenity that's letting them "win".

~~~
ssully
Yeah this is bullshit. Trolls have always existed, but shrugging them off was
never a solution.

In message boards I used to frequent trolls were suspended without question
and banned for repeat offenses. Now when trolls get banned there is an outcry
from the troll and those in line with them about censorship and violation of
their free speech.

The problem is trolls are given too much room to play and speak.

~~~
mindslight
The conflict is that reddit originally touted itself as a meta-community,
where such moderation was applied _per-subreddit_. If you didn't like the
topics/policies of one community, then start another right alongside.

But the desire of investors for widespread palatability and the media's latest
push for censorship have perverted the site into creating unified "community
standards", across what should be considered independent communities.

Reddit itself gained much of its popularity due to the mass exodus from Digg
over their censorship of one simple number! Users inherently do not want to be
censored in what they can communicate about, and so the cycle will be with us
until we finally scrap this hack of using centralized websites in lieu of end-
user software - centralized structures can never remain free of top-down
control.

~~~
s73v3r_
The other problem is that several subreddits were known to "leak," where the
subscribers to some subreddits would go out and spread that kind of thing
across the rest of the site. If the racist content stays in their subreddits,
it's still terrible that it's there, but at least it can be firewalled off.
But when the users of those subreddits start spreading that content through
the rest of the site, it's much more difficult to avoid it.

------
lopmotr
Back 15-20 years ago, forums didn't like personal abuse or being off-topic.
Now, the problem seems to be "dangerous ideas" or "might influence people to
believe something I don't". It's changed from staying reasonable to trying to
influence real world politics by silencing dissent. Why not have freedom of
political ideas on a political forum? Stop worrying about influencing voters
in the "wrong" way and meddling in elections and all that nonsense.

Even on HN, there are "wrong" ideas that you can't even hint at, even when
they're on topic and you're being non-abusive. It's dominated by aggressively
enforced political opinion.

------
mancerayder
I'm an old BBSer. I was on message forums seeing the entire array of so-called
problems of today, which people summarize as "toxicity." I saw the full gamut
of what can happen, including court-ordered restraining orders, personal
harassment in real life, fights. But I also saw communities of what were (in
those days) teenagers and young adults (sometimes older adults) finding self-
moderation through trial and error and experience.

We were anonymous in those days and no, it wasn't more toxic, it was less. I
credit those days with teaching me to learn to write correctly, to learn a
style of approach to the world which is a mix of rational and skeptical, and
with being exposed to numerous philosophical and ethical viewpoints that I
would NEVER have been exposed to in the _actual_ toxic environment that I
experienced as "high school in America."

Now, this was in the mid 90's. Fast forward to the Internet revolution, we
start to see the beginnings of message forums online. Fast forward a little
bit, we're shocked to find an extremely low level of intelligent engagement,
and a high level of emotive hissy-fitting, anger and knee-jerking. Worse,
there's a set of interests and focus areas that belong to the masses. My BBS
friends and I quickly abandoned our years-long dedication to messaging. It was
finito.

Because suddenly, a computer nerd niche was exposed to the masses, and the
interests of the masses overwhelmed this niche, to the point that intelligent
message forum discussion boards became extremely hard to find. RIP
intelligent message forums (that _I_ could uncover, anyway) circa early
2000's.

Now, to me Reddit is the pinnacle of that. I've never written a single item on
Reddit but I've read hundreds of posts. And to me they are:

\-- Very short, one- or two-sentence one-liners: me-toos, "this!", etc. Blanket
dismissals. There's nothing inherent about anonymity or any other element of
the medium itself that leads to that.

\-- Controlled by autocratic "mods" who delete or otherwise punish views that
aren't held by them. The political ones are classic examples. Newsflash, guys,
there's nothing inherent about politics that should lead to that, and it's not
the medium that leads to that degree of censorship. Echo-chamber my butt -
that's not an echo chamber, that's a top-down dictatorial type of
dictatorship. And evil at that. An echo chamber is when people self-select (in
my view) and bounce and reinforce each other's ideas.

\-- Emotiveness. This is number 1. In any discussion, whether discussing a
muscle pain in one subreddit, Trump in another subreddit, or altcoins in
another, what you find is that people view the message medium as a chance to
express their feelings. And so when one person disagrees, it's almost an
attack or a failure to acknowledge the other's feelings, if you look at it
psychologically. But it's perceived as "toxic." What you call toxic is a
manifestation of people who are not USED to rational back-and-forths -- they
didn't qualify for the debate team, if you know what I mean -- and they think
that it's about expressing personal feelings. Almost like they are VOTING on
what is true and false.

Folks, I assert this: it's not the anonymous aspect. It's not "toxicity."
There's no "cure." There's only people. And levels of education and
intelligence (yes, I'm sorry, intelligence plays a role here.).

I was going to talk about Facebook but I deleted the paragraph as it's a whole
other set of variables in addition to these.

If there's ANY cure, it's education. Teach people what argumentation is; ad
hominems should be the first thing they learn to avoid. Next, a code of
conduct for MODERATORS. If you have moderators who behave like the Stasi,
intercepting private correspondence and throwing out the bad ones, that should
be number one on the hit list.

It's about (a) education on argument logic and (b) ethics on what moderation
is (and isn't). Again, my BBS peers learned this when we were 13-18, gradually.

------
valeg
The article reminded me about another proposal of how to fix reddit:
[http://chuqui.com/2015/07/fixing-or-replacing-reddit-some-quick-thoughts/](http://chuqui.com/2015/07/fixing-or-replacing-reddit-some-quick-thoughts/)
I see a problem with this but there are some valid points. Maybe
mastodon.social is heading there:

 _So I wouldn’t host the stuff. But I could build an easy to install
environment that would be a standardized system that could be installed on
effectively any hosting site. Start with WordPress, WordPress’s P2 theme, a
forum plug-in, the Disqus commenting system and a couple of weeks of hacking
some custom work, and you’d have something that could be easily installed and
run by a non-geek on any hosting service that supports WordPress.

There are big advantages to this: If someone really wants a topic to exist,
they can get it going for well under $100 (including domain name) and keep it
running for $10-20/mo. Most of these sites will be very low traffic and a lot
of them will in fact be pop-up and collapse as people figure out running sites
is work and audiences don’t appear by magic — but the good ones will thrive
and grow, and for most of these, it’ll be cheap enough to operate that most
people can run them out of pocket. By building it as an independent site,
though, that person would have the option of doing advertising, or running a
Patreon or GoFundMe, or find other ways to pay for hosting the content of the
site.

It also shifts the liability for the existence of the content to the owner and
host of the site and away from the central authority. If that person wants to
find an offshore hosting service that doesn’t care what the content is, that’s
up to them. So you’ve removed the need for a central authority to have to
censor to protect its own interests.

You still need a way for people to find these topic-specific sites. Enter the
central authority. It hosts a directory, much like the original Yahoo!
directory was. Building something like this is dirt cheap and easy to host, so
someone (like, ahem, Reddit) could do so at low cost so that it doesn’t have
to host those subreddits any more but could still support them by hosting a
directory of them where they could be found._

------
TAForObvReasons
Given that reddit is a Y Combinator alumnus, did anyone at YC contemplate the
current mess of the internet back when they applied? I'm curious about the
questions asked back then and how they would be answered now.

------
ziikutv
Reddit doesn’t owe anyone anything.

How about asking kids' parents to detoxify their kids, rather than solving the
problem way down the chain of command?

People are shitty online because they have anonymity. Now instead of going
ahead and removing that, it might be worth going backward even more and going
to the person.

People need to be nice on the internet. The internet does not need to be nice
to people. Detoxifying the internet is no different from censorship. I can say
this with certainty that as soon as we ask companies and governments to
detoxify, it will get misused [un]intentionally.

~~~
Retra
So it's going overboard to police what people say in a public forum, but it's
not going overboard to police what parents teach their children in their own
home? You're more afraid of censorship in the public space than state-mandated
parenting practices in private?

~~~
ziikutv
> state-mandated parenting practices.

I hope I can become a good parent and can teach my kids to be good people by
the time I am ready. If you need a state to tell you that, you probably should
not be having kids. Having kids is a privilege and unfortunately also a right.

Edit: In all seriousness, what I was saying is that we should be teaching kids
to be good people rather than acting good on the internet.

Also, we should be smart enough to know that all this article is doing is
spewing shit on Reddit to generate pageviews. That is all journalism is
nowadays.

------
kazanz
I'd argue that a cross-internet reputation service would fix this problem.

1\. The reputation system is affected by voting on all sites in which you
participate.

2\. The history of your participation across sites is viewable in your history
(ala reddit, HN)

3\. Your reputation is displayed with your participation wherever you
participate.

Sites could then put reputation limits in place, only allowing users above a
certain reputation to participate, making the cost of toxicity increase.

The biggest problem would be getting the walled gardens to adopt the system.
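The three points above amount to a shared, cross-site ledger of votes. A minimal sketch in Python (every name here is hypothetical, and a real service would also need authentication and Sybil resistance, which this ignores):

```python
from collections import defaultdict


class ReputationService:
    """Hypothetical cross-site reputation ledger (a sketch, not a real API).

    Votes are recorded with the site they came from, so a user's history
    stays viewable per site while the score aggregates across all sites.
    """

    def __init__(self):
        # user -> site -> list of vote deltas (+1 / -1)
        self.votes = defaultdict(lambda: defaultdict(list))

    def record_vote(self, user, site, delta):
        """Point 1: voting on any participating site affects reputation."""
        self.votes[user][site].append(delta)

    def history(self, user):
        """Point 2: per-site breakdown, a la a reddit/HN profile page."""
        return {site: sum(v) for site, v in self.votes[user].items()}

    def reputation(self, user):
        """Point 3: one score, displayed wherever the user participates."""
        return sum(sum(v) for v in self.votes[user].values())

    def may_participate(self, user, threshold):
        """Sites could gate participation on a minimum score."""
        return self.reputation(user) >= threshold


rep = ReputationService()
rep.record_vote("alice", "forum-a", +1)
rep.record_vote("alice", "forum-b", +1)
rep.record_vote("alice", "forum-b", -1)
print(rep.reputation("alice"))          # aggregate score across both sites
print(rep.may_participate("alice", 5))  # a site with a high bar rejects her
```

Even at this toy scale, the walled-garden problem is visible: the ledger only works if every site agrees to write its votes into it.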

~~~
monksy
So you're suggesting the Black Mirror "Nosedive" approach?

~~~
crankylinuxuser
When I watched that, I thought of "Facebook Hell Universe". Because I think
that's what Zuck would want - having a reputation system under his control and
control of everyone under it.

------
cwkoss
I think, somewhat counterintuitively, the ability to "anti-like" or "Report as
bad take" could actually make the internet a much more positive place.

The insulation from negative perceptions of those outside your in-group causes
polarization and rewards unpopular but extreme opinions.

------
zeristor
This, to me, sounds like the zombie apocalypse we were promised.

------
stcredzero
[https://www.reddit.com/r/NeutralPolitics/](https://www.reddit.com/r/NeutralPolitics/)

------
el_cid
I'd take toxicity over censorship any day. The media machine & politicians
have been going at it recently.

I wonder when being toxic will become illegal.

------
yummybear
What are your thoughts on requiring or at least encouraging verified accounts
- somehow putting a verified name behind the poster?

------
jasonmaydie
I would argue we don't need to detoxify the internet; stopping trolling is
missing the point. However, we should stop the spread of misinformation. In
fact, sites like reddit and facebook rarely create information; they just
consume it blindly without questioning the source.

~~~
Infernal
I'm less sure regarding Facebook (as I'm rarely on it) but there is a large
amount of information created (or at least initially disseminated) on reddit.
Places like /r/DIY, /r/woodworking, and countless others are chock full of
original content and helpful users willing to answer questions and get into a
dialogue down in the weeds.

------
panarky
_> Struggle to Detoxify the Internet_

The internet isn't toxic.

It's the people who are toxic.

These people are just as noxious in real life even if some of them hide it
when their identities are known.

Can anything be done to detoxify the people, or do we just treat them like
spam and filter them out?

And what happens next when millions of rabid voices are suppressed?

The toxic people don't cease to exist, we just won't be able to see them as
well.

Perhaps we'll find the social and political environment of 2022 to be much
darker and more dangerous than 2018.

~~~
majos
> _It's the people who are toxic._

I half-agree with this. The internet doesn't make a genuinely kind and
empathetic person into a troll. But I think it does amplify certain bad little
impulses that are latent in pretty much everybody -- the temptation of quick
and cutting putdowns, mob and tribal mentality, lobbing rhetorical bombs then
ignoring the consequences...

There is a qualitative difference between a back-and-forth on Twitter (or even
HN) and a back-and-forth in real life. There's way more trust in good faith in
the latter.

------
sureaboutthis
Decent, moral people avoid people and places like reddit altogether in their
everyday lives while walking down the street.

A news analyst, on NPR a few years ago, called reddit a "Frankenstein's
monster they can't control".

------
randyrand
Does detoxify mean getting rid of mean speech?

~~~
s73v3r_
There was a post on /r/The_Donald not long ago calling for shooting all
refugees in the US on sight. Do you feel that is just "mean speech"?

~~~
douglaswlance
People would downvote them or argue it out.

If that person were never able to share that opinion, they would never get
feedback that it is an unacceptable opinion; and might therefore act on it.

~~~
s73v3r_
Moderators of T_D prevent that. Unless you subscribe to the subreddit, you're
unable to vote, and despite constantly railing against "safe spaces", they
instaban anyone who goes contrary to the group.

That, and I find it impossible to believe that someone who is able to post
that online would know that it is an unacceptable position.

------
kerkeslager
> Some of the conspiracy theorists left Reddit and reunited on Voat, a site
> made by and for the users that Reddit sloughs off. (Many social networks
> have such Bizarro networks, which brand themselves as strongholds of free
> speech and in practice are often used for hate speech. People banned from
> Twitter end up on Gab; people banned from Patreon end up on Hatreon.) Other
> Pizzagaters stayed and regrouped on r/The_Donald, a popular pro-Trump
> subreddit. Throughout the Presidential campaign, The_Donald was a hive of
> Trump boosterism. By this time, it had become a hermetic subculture, full of
> inside jokes and ugly rhetoric. The community’s most frequent commenters,
> like the man they’d helped propel to the Presidency, were experts at testing
> boundaries. Within minutes, they started to express their outrage that
> Pizzagate had been deleted.

This is a critical thing that we as a society need to recognize about
censorship and political correctness. When we censor people or ostracize them
for saying things we don't like, _even when what they say is legitimately
awful_ , these people don't disappear, change their minds, or stop voting.
They go elsewhere, become more entrenched in their awful beliefs, and because
we've pushed them all together, they become more united. _They become
stronger._ And because they become stronger away from us, we don't notice, and
we're blindsided when they flex their political power.

Underlying this mistake are some important truths:

1\. People who say hateful things are human beings worth engaging with. No, I
don't like reading a lot of what people say. It's easy for me to forget that
they are human beings with their own struggles and traumas that cause them to
believe the awful things they believe. When we dismiss them as trolls, we're
dehumanizing them. Such a society doesn't leave room to be wrong and learn--if
you're wrong, you're dismissed--and it dismisses the people who are the most
dangerously wrong. It's not our responsibility to educate people, but that's
irrelevant to the fact that if we don't educate people no one will. We need to
engage people who believe awful things, try to understand what needs cause
them to believe those things, and try to address those needs with compassion
and courage. Truth is the antidote to hate.

2\. Free speech doesn't just matter in a legal context. Free speech is
protected in the US constitution _because it's important in a free society_.
If we're going to let the discourse of our society move into privately-owned
platforms like Reddit/Facebook/Twitter instead of publicly-owned platforms
like street corners where newspapers are sold (or, the rest of the internet)
then we have to value free speech on those platforms as well.

Too many people are stuck in this idea that the Trump election was an anomaly
--that in November Congress will change and in 2020 we'll have a new
president. I see no reason to believe this will happen. We have changed
nothing about our behavior, and we're hoping the ones who elected Trump will
change.

~~~
milesrout
>2\. Free speech doesn't just matter in a legal context. Free speech is
protected in the US constitution because it's important in a free society. If
we're going to let the discourse of our society move into privately-owned
platforms like Reddit/Facebook/Twitter instead of publicly-owned platforms
like street corners where newspapers are sold (or, the rest of the internet)
then we have to value free speech on those platforms as well.

I think this comparison makes little sense. Either you're comparing newspapers
and social media platforms or you're comparing the street corner and the
internet. The powers that be should not be censoring the street or the
internet. But the owners of newspapers have always exercised some level of
control over their content, just as social media platforms exercise varying
levels of control over what is posted to them.

I don't like it, but I didn't like newspaper control over newspaper content
much either. Expressly ideological newspapers that are more like newsletters
are one thing, but newspapers that purport to be objective and are not are
quite another.

I don't see echo chambers on reddit as being much different from the echo
chamber that is the University [Insert Here] Society's newsletter. Echo
chambers have always existed. People sometimes just want to surround
themselves with opinions they agree with.

------
JorgeGT
> Huffman can no longer edit the site indiscriminately

There's precisely zero proof of this.

~~~
losteric
Tampering can be detected with internet archives and data dumps of Reddit
submissions/comments.

~~~
gruez
and how many people are running bots/scrapers to do this? also, how do you
know whether the OP edited it, or it was an admin?

~~~
losteric
There are multiple sources of internet archives and reddit-specific archives.

Re: edits, check whether the post's sentiment changes; I'm not worried about
admins fixing typos/grammar.

Also, investors understand trust is a prerequisite to profit... editing
shenanigans are intolerable.
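A crude version of that check (ignore trivial typo fixes, flag substantive rewrites) can be sketched with a plain text-similarity ratio. The 0.9 threshold below is an illustrative guess, not a calibrated value, and a real pipeline would also need to match posts by ID and timestamp across archives:

```python
import difflib


def detect_edit(archived: str, live: str, threshold: float = 0.9):
    """Compare an archived snapshot of a comment with the live version.

    Returns (suspicious, similarity). Typo/grammar fixes score close
    to 1.0; rewrites that change what the comment says score much lower.
    """
    similarity = difflib.SequenceMatcher(None, archived, live).ratio()
    return similarity < threshold, similarity


# A trivial punctuation fix stays above the threshold...
edited, score = detect_edit("I support candidate A",
                            "I support candidate A.")

# ...while a sentiment flip drops well below it.
flipped, score2 = detect_edit("I support candidate A",
                              "I oppose candidate A, always have")
```

This only flags candidates for human review; a character-level ratio can't tell a benign rewrite from admin tampering on its own.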

------
UrukParthian
Toxic statements = statements from a different political orientation

Why are people so willing to embrace Orwellian doublethink? My prediction is
that such a sword will decapitate undeserving people on all sides of the
political aisle.

~~~
sincerely
The article is largely about removing flagrantly racist content (the
subreddits they listed as examples were like r/KKK and r/CoonTown) and various
illegal content like bestiality videos. It seems disingenuous to classify
these as simply "different political orientation".

~~~
prepend
As much as I disagree with racists, I believe that a healthy society must
allow them to speak.

I tried to find a quote from Dershowitz’s book on defending Nazis but gave up
due to a paywall.

Toxic is not an objective measure for speech. But like in real life, toxicity
depends on the dose. Life doesn’t eliminate toxins, it reduces them to
manageable levels (or dies).

There’s also the practical effect that banning is irreversible. Toxicity is
not terminal, but banning doesn’t allow for redemption.

------
InclinedPlane
I think you replied to the wrong comment.

~~~
dang
Yup. I moved that one to its intended parent, and detached this one from
[https://news.ycombinator.com/item?id=16572878](https://news.ycombinator.com/item?id=16572878)
and marked it off-topic.

------
aje403
Most people are idiots. Consider that half of the population (hypothetically)
has an IQ score below 100. Idiots did different things together 100 years
ago and, 1000 years before that, groups of idiots would get together to raid
and rape and pillage. They still do that in other places in the world. Now
they come onto the internet to send hateful things and porn to one another.
Most of the biggest idiots do not go on reddit. When the idiots on reddit act
up, management cleans it up a little bit. Is there an issue that needs to be
fixed here? I think anyone even claiming to have a solution may belong to the
former group

~~~
wufufufu
"Since the early 20th century, raw scores on IQ tests have increased in most
parts of the world. When a new version of an IQ test is normed, the standard
scoring is set so performance at the population median results in a score of
IQ 100. The phenomenon of rising raw score performance means if test-takers
are scored by a constant standard scoring rule, IQ test scores have been
rising at an average rate of around three IQ points per decade."

[https://en.wikipedia.org/wiki/Intelligence_quotient](https://en.wikipedia.org/wiki/Intelligence_quotient)

You assume too much about half of the human population.

~~~
aje403
I suspected that I shouldn't have put that into my post or should have
qualified it with a statement regarding how completely useless of a metric it
is. My general point was that probably around half of the planet is humping a
tree as we type at our computer and pontificate about the human potential and
dignity, regardless of a useless score that slightly correlates with things.

~~~
lopmotr
IQ isn't useless. It does correlate with violence (rape and pillage?) and of
course it's a powerful predictor of academic performance, future income, and
even life expectancy.

~~~
aje403
As soon as it gets mentioned to make a point, it incites a completely
orthogonal discussion about what it means, its validity, socioeconomic bias,
racial bias, etc

------
calebgilbert
Oh, mai. Where is the tl;dr for this one?

------
mirimir
OK, so how did r/Trees end up focusing on marijuana, and
r/MarijuanaEnthusiasts on trees? Was r/Trees first, and then tree lovers did
r/MarijuanaEnthusiasts as a joke? Was there a war?

Edit: tone

------
ecommerceguy
Not one word about the HUGE Antifa presence or doxing by left wing
conspirators and agitators. This is exactly why I don't read the New Yorker.

I would consider reading it if they gave light to both sides of this issue.
But they don't. This is not journalism.

Edit: u/spez was caught editing user comments on certain subs he publicly
disagrees with. How he's still CEO is beyond me. At least he admits he's a
troll.

Edit 2 - Downvote all you want, all I'm saying is FACTS. Here's more FACTS:
[https://motherboard.vice.com/en_us/article/z4444w/how-reddit-got-huge-tons-of-fake-accounts--2](https://motherboard.vice.com/en_us/article/z4444w/how-reddit-got-huge-tons-of-fake-accounts--2)

Reddit culture starts at the top, and the top is Steve Huffman. He should
resign. He's completely in over his head.

