
Ask HN: How do we stop the polarization/toxicity filling the web? - dabockster
Hey HN, I've noticed a huge uptick in the toxicity online in the last 5-7
years. Before, around 2010-2012, people who disagreed would usually leave it
at that and walk away respectfully. Now, it seems like everyone treats
everything as an argument or debate to be won at all costs. Even niche sites
like HN are not immune.

So how do we fix this? I've heard some talk that upvote systems and
algorithms might be at fault. Do we ditch them and go back to a literal
timeline? Or is this more of a social problem that code can't solve? Let's
hear some input on this, because I can't shake the feeling that tech isn't
totally innocent in this mayhem.
======
rayuela
Encourage people to participate in physical social gatherings that center
around non-political issues, like hobbies.

We need to start emphasizing what we have in common, help people see that
ultimately we're really not all that different from each other. The internet
lets people hide behind this isolating veneer that makes it too easy to just
shout at each other and behave in extreme ways that would never be socially
acceptable in the real world. Interacting in real life significantly
moderates people's behaviour, and when social media and the internet
facilitate segregation and the amplification of extremes, spending more time
in person might really do us a lot of good.

We need to put people in a context where politics is irrelevant and a shared
interest has the opportunity to bond people who would otherwise never meet.
It's the only way I see of countering Identity Politics that tries to be
all-consuming.

It would also be helpful to encourage people to focus on more positive events.
Outrage culture is a thing, only exacerbated by social media, where rage =
clicks/views = money. We need to change the economics of this equation. We
would really benefit from a culture that gave more attention to people doing
good rather than just focusing on the villains in society.

~~~
reaperducer
_Encourage people to participate in physical social gatherings that center
around non-political issues, like hobbies._

I think there is a lot more to this than many people think.

If more people participated in hobbies, then they would spend less time on the
internet. The time they do spend would be more targeted and focused instead of
mindlessly scrolling and scrolling and getting baited and agitated.

They would also form communities, as you suggest, with common interests and
get to know people better. It's basic socialization.

Part of the problem is that for some reason there is a notion that everything
has to be a business today. There's a Fidelity Investments TV commercial
running right now where a daughter says to her retired mother, "Did you paint
this [painting]? You know you can sell these!" And a business is born. Why?
Why can't she just enjoy painting for painting's sake, for her own enjoyment?
Why is money what decides if something is valid or not?

I encourage people to start hobbies. And be bad at them. Be terrible at
reading, stamp collecting, woodworking, gardening, model railroading, brewing,
or whatever. Do it because you like it. If you happen to be good at it, great!
If all of your carrots die, that's OK, too. You got to spend quality time with
yourself, which is more important than all the carrot sales you could possibly
make.

~~~
andy_ppp
Damn, this is a great idea... I’m going to try woodworking and make the
shoddiest set of chairs one can imagine, but they’ll be my chairs. It could
take maybe a year of practice, but who cares!

~~~
chadmeister
I hope this leads you to make friends with the other local woodworking nerds
:)

~~~
nelblu
As a matter of fact, I picked up woodworking as a hobby last summer and made
some really good friends. You don't have to be good at it; just the whole
process of going through it is rewarding. Look up the "Mr. Chickadee" channel
on YouTube; watching his videos is like a meditative experience.

------
eqdw
Discussions like this are frustrating to me because they always seem to skip
over what feels like a crucial element:

Do you know _why_ this uptick has happened? Does anyone?

I get the feeling, in general, when people talk about wanting to "Do
Something" about this sort of thing, they are always focused on managing the
symptom and never focused on addressing the cause. Frequently, when I try to
bring this up, the cause is waved away with "oh they're just
stupid/ignorant/angry" or some other such explanation. The current-year
favourite is "they just believe fake news", and I'm sure next year there will
be a political explanation, and then the year after that we'll loop back to
generic ones. (The irony of denouncing other people as causing polarization
"because they're just angry and stupid" is lost on most people)

I don't know why this is happening. But I do know that until the why is
addressed, and addressed with care and respect, this problem will be papered
over at best but never solved.

Also

> Do we ditch them and go back to a literal timeline?

If I got one wish to change the world, but I had to give up a bunch of great
things, I would sell out all technological wonder in order to get literal
timelines back. As soon as other people decided they knew what I wanted to
see better than I did, the problem started. And as soon as they got tired of
doing that and made computers do it for them, it got worse.

~~~
Sorry_Rum_Ham
I believe the reason for the uptick is two-fold.

1.) In general, this is a reactionary movement from the elite class of
society (straight, white, cis people) to keep a death grip on the social and
systemic power they see being taken from them in the form of diversity,
equality, feminism, etc. This has been happening for a long time now, but it
got real, real bad after...

2.) ...Trump was elected. Him winning the election was a shot of weapons-grade
steroids into the ass of these regressive movements. He validated and
emboldened them. He made them feel not only okay about being bigoted, but
morally good for it.

Those are my thoughts, at least.

~~~
rayiner
> Trump was elected. Him winning the election was a shot of weapons-grade
> steroids into the ass of these regressive movements.

It’s actually exactly the opposite. We lurched left in a very short time. As
recently as 2007, a plurality even of Hispanics said there were “too many
immigrants” in the US. [https://www.pewresearch.org/fact-
tank/2019/02/19/latinos-hav...](https://www.pewresearch.org/fact-
tank/2019/02/19/latinos-have-become-less-likely-to-say-there-are-too-many-
immigrants-in-u-s). Little more than a decade later, every mainstream
Democratic candidate is running on providing universal healthcare to people
who immigrate illegally.

------
scott_paul
It's worse than the Eternal September. It's definitely more toxic than it's
ever been, and people are definitely more polarized. The 2016 election news
cycle got a lot of otherwise normal people really radicalized and hostile.
Sure, most of the old-timers among us remember, often fondly, the
constructive insanity of Usenet. And the younger generation remembers the
torrent of pure id that was 4chan in the early days, and 8chan later.

Here's the thing. It's leaking. This stuff used to be contained, kept in a
little secret segmented-off piece of your world that was 'the internet'. Now
it's invading your real life, your real-world friendships, your workplace,
even your family.

Think of the Greater Internet F-ckwad Theory: "Normal Person + Anonymity + an
Audience => Total F-ckwad". Now run that function over the entire populace,
map/reduce style. Now take that F-ckwad, and remove the Anonymity by requiring
them to use their real name. And bingo, now you live in Interesting Times.
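
For fun, that "function" can actually be written out map/reduce style. A
tongue-in-cheek sketch; the population, field names, and the assumption that
any audience at all suffices are all invented for illustration:

```python
# Tongue-in-cheek sketch of the Greater Internet F-ckwad Theory (GIFT),
# run "map/reduce style" over a hypothetical population.

def gift(person: dict) -> dict:
    """Normal Person + Anonymity + Audience => Total F-ckwad."""
    fwad = person.get("anonymous", False) and person.get("audience", 0) > 0
    return {**person, "fwad": fwad}

population = [
    {"name": "alice", "anonymous": True,  "audience": 5000},
    {"name": "bob",   "anonymous": False, "audience": 5000},
    {"name": "carol", "anonymous": True,  "audience": 0},
]

# map: apply the theory to everyone; reduce: tally the fallout
fwad_count = sum(p["fwad"] for p in map(gift, population))
print(fwad_count)  # 1
```

The comment's point, of course, is that dropping the `anonymous` term from
the predicate no longer changes the output in practice.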

~~~
vitaflo
>Now it's invading your real life, your real-world friendships, your
workplace, even your family.

I fully expect in the future for there to be a segment of the population that
chooses to "drop out" of the always-connected digital social life for these
reasons. And I don't just mean giving up Facebook, but going back to analog
social circles only, as well as ignoring those who are still connected (and
hooked) digitally.

I like to compare it to smoking. Imagine everyone got addicted to smoking,
but you eventually wanted to quit because you realized how bad and addictive
it is. Even if you quit, though, you'd still be hanging out mostly with other
smokers, breathing in that second-hand smoke. The only true way out would be
to both quit and only hang out with non-smokers.

This won't be everyone of course. Like smoking, toxicity and polarization are
too addictive, but I can see there becoming a counterculture that simply
rejects all of it altogether.

~~~
ryandrake
Don't wait for the future. You can have this life today. Whenever I read about
how this Twitter stream is toxic, and that group is using Facebook to harass
some person, about Gamergate and 4chan and The_Donald--how glad I am that I
don't participate in any of these sites, and therefore have opted out of all
of this drama!

This stuff is almost 100% online. It's not invading your real life unless you
invite it in. Switch the computer off and disengage, and suddenly you're
immune. Someone right now might be badmouthing me on Facebook or Twitter, and
I don't care. _It doesn't matter_ because I'm not on these sites.

~~~
anigbrowl
Counterpoint: people who shoot up churches and the like.

~~~
jessaustin
Those people are vanishingly rare. You're vastly more likely to die in a
traffic collision than to be shot because somebody doesn't like your church.

~~~
anigbrowl
They're a lot less rare than they used to be. Likewise, few people die in wars
until they're suddenly declared.

~~~
jessaustin
We haven't "declared" wars in my parents' lifetimes, yet somehow millions have
died at our hands. "Apocalyptic" violence in USA is partly an echo of our
crimes overseas, but mostly an obvious result of the constant fear-mongering
and dehumanization our media performs in order to justify those crimes.

------
throw_m239339
I don't think anything really changed. Only the scale of things, with
platforms like Twitter that can be extremely viral.

Everybody remembers the collective insanity of the Covington Catholic school
fiasco, where journalists and celebrities were publicly wishing for the death
of a bunch of kids wearing MAGA hats, or at the very least for them to get
doxxed or hurt, all because of a one-minute video taken out of context. And
those who approached the story in a cautious manner were called out by the
rest for being too soft or accused of being part of "the bigots". Ironically,
the account that initially tweeted the video snippet was banned after a few
days.

I don't think the population is more deranged than before, though; I mean,
the '70s were politically more violent than today.

The solution is to quit social media or join smaller communities free of wedge
issues and identity politics. Twitter, Facebook or Reddit aren't going to fix
themselves, they make money off outrage and petty divisions.

And finally, you don't owe anyone activism on a specific cause or an opinion.
It's OK not to have an opinion or not to want to get involved in a political
debate. Anybody who attacks you for refusing to take a stance on an issue
should probably be muted/blocked or removed from your life; that person isn't
your friend and will try to make you look bad at the first opportunity just
to get internet brownie points for themselves.

------
scottlocklin
>I've noticed a huge uptick in the toxicity online in the last 5-7 years.

You're crazy my dude(tte). I've been on the interbutts since 1991 and Usenet
days, and it's exactly as it always was. The big difference is we have
reporter ding dongs on Twitter thinking Twitter, aka the comments section they
removed from their web presence, is the real world. Also old people on
Facebook who never learned the lessons of being on Usenet in the early 1990s,
aka arguing on the internet is a lame and addictive hobby.

There are minor accelerants for this; YouTube really did have some kind of
pathological radicalization rabbit hole in its recommendation engine for a
while (now it's just boring and useless and shows you "more of the same" on
"blue checked" accounts). And of course, the other media encourage
polarization and demonization of the other for dumb short-term gain. That's a
purely American phenomenon, and nothing's going to change it until the people
pushing this swill on MSNBC, Fox and CNN decide to change it.

~~~
kody
When reporters on Twitter freak out about the trolls on 4chan like it's some
disease infecting our precious wholesome web, I can't help but remember
(through my biased lenses of course) when most of the web (that I visited)
looked like 4chan. What the folks who think the web is 'degrading' don't seem
to realize is how much of a wild west it used to be, and how much corporations
have tried to (and somewhat succeeded in) sanitizing it.

~~~
JohnFen
> I can't help but remember (through my biased lenses of course) when most of
> the web (that I visited) looked like 4chan.

Interesting. I don't remember the web (or the pre-web internet) generally
looking like 4chan in the earlier days. Pockets of it always did, of course.

But we are both engaging in a statistical error here -- we're using sample
sizes of one. Your experience and mine may differ quite a lot simply because
we hung out in different parts of the internet.

~~~
kody
It really is interesting that our anecdotes can be so different. I'm assuming
you're older than me (25) since you mention the pre-web internet, so I'd also
guess that by the time you were on the web/pre-web internet you would've had a
more developed filter than I.

My introduction to the web progressed roughly with watching my dad use BBS ->
playing Neopets and Runescape -> becoming very involved with Runescape forums
-> running a Runescape forum (first introduction to moderating user-submitted
content at the age of 10. Yeah...) -> getting involved with video game modding
communities, which meant I was spending all my free time browsing forums,
following every link, completely absorbed by everything the early web's
computer game communities churned out.

My experience probably would've been very different if I hadn't been able to
get around the parental controls my parents used, or if I had been interested
in different hobbies with a more approachable web presence.

I've been nostalgic for the "old web" lately, but I'm also very, very grateful
that I haven't clicked on an inconspicuous URL that turned out to be a
jumpscare, virus, weird porn, or gore in quite a while.

EDIT: I don't want to imply that the Disturbing Web == Videogame Web; that
just happened to be my experience.

~~~
tayo42
What video game forums compared at all to 4chan's /b/?

Everywhere banned you for gore, child porn, harassment, encouraging suicide.

I really don't think you should be trying to normalize 4chan as if that's
just the way the internet is.

~~~
noirbot
There is/was also a profound difference between /b/ and most of the rest of
4chan though. Not all of the site is /b/, that's just the most famous. It
definitely set some of the culture of the other boards, but not to that same
degree.

The video game/tabletop game boards on 4chan were fairly normal for boards at
that time, and it's not like everyone who was active on 4chan was there to
post or read /b/. You might dip into it for a laugh or dare now and then, but
at least when I was younger it was more like the internet equivalent of
sneaking into the abandoned house down the street: a "dangerous" thing that
felt cool to do.

~~~
kody
Good point. I was framing the conversation with 4chan as a whole, not just
/b/. Big difference.

------
missosoup
Polarisation and 'toxicity' are two completely different things.

The fact that people have increasingly divergent views (or rather, are
voicing them; they probably always held those views) is just a fact of life.
It's not unique to any online community or even any country. It may simply be
that too much diversity is inherently destined to collapse.

'Toxicity' is a made-up new meaning for the word, which means anything
undesirable from the point of view of the speaker.

Labelling the former as the latter is one of the root causes of the feedback
loop you're describing.

The first step is for both sides of polarising issues to acknowledge each
other, that both think they're doing the right thing, and that neither is
'toxic'. Without that common ground, dialogue can never begin.

~~~
mthoms
>'Toxicity' [...] means anything undesirable from the point of view of the
speaker.

I think it's more often the _delivery_ than the message that gets something
labeled as "toxic". Merriam-Webster defines toxic (when used in this context)
as _"extremely harsh, malicious, or harmful"_.

People are generally open to new ideas as long as they're presented in a
friendly and factual manner.

~~~
missosoup
"Extremely harsh" is entirely different from "malicious or harmful". Again,
the conflation of these is one of the root causes of what we're seeing today.

People who use the term 'toxic' like this are people who subscribe to the
mindset of "I'm being persecuted because someone is saying something I don't
like".

People seem to have forgotten that having free speech necessarily means
tolerating speech you disagree with, speech you find objectionable etc.
Blanket labelling anything outside one's bubble as 'toxic' and refusing to
engage in any dialogue is exactly how polarisation begins.

~~~
mthoms
I literally just explained to you, in a courteous and civilized manner, that
many (in my opinion - most) people don't use the term that way. That includes
me.

The irony of you ignoring my comment/experience, doubling down on your claim,
and then going on to preach that _"refusing to engage in any dialogue is
exactly how polarisation begins"_ would be hilarious if it weren't so
depressing.

People seem to have lost the ability to listen. That makes me sad.

Edit: Grammar.

~~~
missosoup
It sounds a bit like you might be feeling persecuted because I said something
you disagree with. I'm not even sure what part of my comment upset you, since
none of it is harmful, malicious, or personally targeted.

Emotive language like 'preach' has no place in constructive debate.

I'm not at all upset about your viewpoint, I'm happy to debate it on any level
you'd like. Keep your comment as a solid example of what I meant for the other
readers.

~~~
mthoms
Again, ignoring the substance of my message in order to continue your attack.

>I'm not at all upset about your viewpoint, I'm happy to debate it on any
level you'd like.

I told you exactly how I see something, you've yet to acknowledge it, and have
twice doubled down and claimed I think the opposite. This is the problem. You
aren't interested in listening, you're more interested in being "right".

I'm done here.

 _"...refusing to engage in any dialogue is exactly how polarisation begins"_

-missosoup (while un-ironically refusing to engage in dialogue)

------
krick
There is nothing to fix, except the attitude of those who think there is
something to be fixed, because essentially it's they who demand something
unreasonable: censorship of free speech. It is unreasonable because it
assumes somebody (the person or group who "fixes" the problem) can and must
decide what is good and bad with regard to what you are saying. And unless
this person is you, you will always be dissatisfied with the results, to say
the least.

Instead, you should try to accept the truth: free speech is ugly; deal with
it. There are always _some_ people who are nice, but _all_ people are never
nice. A lot of different people freely voicing different opinions in the
public space will always be ugly.

OK, now let's assume you don't care, and get to actually answering your
question. In order to remove the "ugly", you need to remove some of the
elements:

1. Different opinions. People with different opinions are removed from
society; everybody in the community must be as similar as possible. There are
multiple ways to achieve this to varying degrees, but the key is closed
communities, since banning people faster than they appear (or find a way to
return, which is relevant for the web) is hard work.

2. Freely voicing them. Again, multiple ways: good old moderation of the
content, making everybody have a stake in what they are saying (reputation
systems, goods exchange, game mechanics enforcing cooperation), etc.

3. Public space. Move all discussions to PM, so people still argue, but you
don't see it and feel good about yourself and your platform.

4. Speech. Just don't let people communicate. Easiest to achieve on a given
platform, and the most effective.

~~~
freehunter
There definitely is something to fix, and it has nothing to do with free
speech. Something about text communications makes people much more aggressive
than they would be face to face. People wouldn’t say half the things they say
online if they had to say them out loud.

PM, public forum, real names on Facebook, it doesn’t matter. Something
inherent to text communications makes people more toxic than they would be in
person. It’s not a free speech issue.

~~~
anigbrowl
It's simple: there are few-to-no costs for doing so (at worst, getting banned
and making a new account), but the emotional benefit of upsetting someone
else, while attenuated, still delivers significant satisfaction to those who
seek it.

For simple trolls (as opposed to political actors), the dynamic is simple to
model: you spend time and some effort creating accounts and saying
antagonistic things while people respond with abuse (to which the troll feels
immune), and then harvest (via screenshots) examples of the saltiest tears to
share with your troll peers for lulz in other forums.

People do the exact same thing in real life, but it's more time-consuming and
expensive to establish and maintain physical groups, both in economic terms
and direct costs (legal or physical sanctions).

------
muzani
I've been active online since 2000.

There was a theory in 2004, known as GIFT: Normal Person + Anonymity +
Audience = Total Fuckwad

Guess what? People without anonymity are total fuckwads too; this happens on
Facebook, Instagram and Twitch, in an era where you could probably find
someone's personal details if you looked hard enough.

I think the key is _audience_. Bullying feels good for a lot of people.
Bullies will go for the low-hanging fruit, where they won't be struck back.

You see people acting this way even on once-helpful sites like Stack
Overflow; downvotes pull bullies in like a magnet. You see people picking on
anti-vaxxers, flat-earthers, and Justin Bieber, not so much because they do
harm, but because they're easy targets to hold down.

Viral algorithms amplify this effect, highlighting bad news that everyone can
join in and rage on. It's not a new thing; news channels have done this for
decades.

We can't really fix it. I'm more of a community nomad these days. It's easy
to move on to other new and old communities, away from this effect. I've been
happy with HN, Discord, and IRC lately. IRC has picked back up to 500 people
per channel, and it's probably no coincidence.

~~~
thdrdt
I also think the lack of consequences makes it very easy for people to post
all kinds of shit.

On HN this is greatly reduced by the point system and the fading of bad
posts. A bad post fades away, so it doesn't look as important as other posts.
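
That fading mechanic is easy to sketch. HN's actual algorithm isn't public,
so the linear mapping and cutoffs below are purely illustrative:

```python
# Hypothetical sketch of HN-style comment fading. The real site's
# thresholds are not public; these numbers are invented.

def comment_opacity(score: int) -> float:
    """Map a comment score to a display opacity in [0.2, 1.0]."""
    if score >= 0:
        return 1.0
    # fade linearly as the score drops, but keep the text barely legible
    return max(0.2, 1.0 + score * 0.2)

print(comment_opacity(3))    # 1.0  (normal post, fully visible)
print(comment_opacity(-10))  # 0.2  (heavily downvoted, nearly invisible)
```

The point is that the penalty is visual, not a ban: the post stays, but it
no longer competes for attention.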

In real life you can get a punch in the face if you say nasty things to
someone.

But on Twitter, Facebook and YouTube, most trolls won't ever learn because
they can just say whatever they want.

~~~
wool_gather
Worth considering that this isn't simply mechanical: to make those posts fade
away requires humans clicking the downvote button. Which requires some kind of
culture that includes clicking the button on "bad" posts, for whatever the
definition of "bad" is. If there _were_ downvoting on say, YouTube, I'm not
sure that it would produce the same results that it does on HN.

~~~
blululu
Hacker News is indeed a special place on the internet, and it would probably
be difficult to reproduce its style of discussion. That being said, I think
it is worth considering why HN has succeeded, and to what extent this is a
function of the presentation algorithm (upvotes/downvotes).

It is worth remembering that the visible part of an internet community is a
small part of the total possible community. Following the classic 90-9-1
rule, there are far more people who could participate than people who do.
This means that the visible face of an online community has a lot of room to
change.
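
As a toy illustration of the 90-9-1 rule (the community size here is made
up):

```python
# 90-9-1 rule applied to a hypothetical community of 100,000 accounts:
# the overwhelming majority only read.
total = 100_000
lurkers      = round(total * 0.90)  # read, never post
contributors = round(total * 0.09)  # comment or edit occasionally
creators     = round(total * 0.01)  # produce most of the content
print(lurkers, contributors, creators)  # 90000 9000 1000
```

So even a "toxic-looking" front page reflects roughly 10% of the people who
could be shaping it.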

------
christiansakai
Read "Amusing Ourselves to Death" by Neil Postman.

You will realize that social media is intentionally designed in a way that
draws toxicity out of society.

No technology is neutral. Technology gives and technology takes away. The
medium is the message. Communication evolved because of the medium, and for
every evolution, it gives something and it takes away something: from oral
communication to the printing press to the telegraph to TV to the internet to
social media. Every evolution gives birth to something and destroys
something.

For example, TV makes everyone reachable (what it gives), but turns
everything into entertainment (what it takes). Even important topics such as
politics, religion, war, poverty, pestilence, and science become pure
entertainment, juxtaposed between endless drama, reality shows, and ads,
coupled with background music and personas that manipulate the mind. The more
of an "entertainer" you are, the better, regardless of whether you are a dumb
scientist or a dumb lawmaker whose policy will affect many people's lives.

Take social media such as Twitter, for example: everyone now has a voice
(what it gives), but its character limit (what it takes) doesn't give
critical thinking and rational debate a highlight, so it spirals down into
madness. The more outrageous you are, the better, because it will go viral
and people will react in a predictable way.

A good example of a person who knows exactly how TV audience and social media
audience will behave in a predictable way, and took advantage of that, is
President Trump.

You want to design a medium/platform in such a way that the pros outweigh the
cons. But I think the hard part is knowing how people will use the platform.
Those social media giants started out with good intentions, and it's only
much later down the road, here and now, that we discover their true effects.

~~~
cr0sh
> Social media such as twitter, for example, everyone now has a voice (what it
> gives), but with its char limit (what it takes) doesn't give critical
> thinking and rational debate a highlight, therefore it spirals down into
> madness.

Twitter has a hard, small limit. Most other platforms either have no limit, or
the limit is fairly large.

The problem is, people don't use it.

Worse, those that do use it are ridiculed, or their words are ignored (TL;DR
anyone?)...

Thinking about this, I wonder if any of it has to do with people's "inner
monologue" that was discussed yesterday here on HN? If you didn't see it, the
gist was that there are some people who don't have such a monologue, and it
came as a surprise to one person. Similarly, those without such a monologue
are often surprised that others have it; one person commented that they often
wished that the voiceover of characters in a movie, expressing their inner
thoughts, was a real thing - and were shocked to find out that for most people
- it is!

Anyhow - does this play into how people write online? Do they tend to write
fewer or shorter messages because their inner monologue is too loud or
constant? Do those without such a monologue write longer, more thoughtful
posts? Then I think of myself; I have an inner monologue, but I tend to write
long things (case in point: this post?) - but I don't find my inner monologue
a burden.

But some do - I know I have read of people who either must always have some
noise around them to drown out their "inner monologue", or who, if left in
silence even for small moments, will declare themselves "bored", perhaps
because their inner monologue isn't perceived as interesting (whereas I and
others have no problem thinking and pondering things in silence, with no
boredom)...?

Does this affect how people compose and type their messages? Does it help or
hinder understanding? Does it facilitate or block meaningful conversations?

Twitter may have tapped into something that was always there to begin with,
and in essence has helped foster that communication style, making it widely
acceptable: conversation as "sound bites", which has perhaps led to our
present situation.

~~~
christiansakai
Twitter is just an example. If you read the book, the gist is basically that
social media makes it easy to brain-dump and leave that brainfeces all over
without ever having to look back at it (paraphrase mine). Anyone in the world
can post anything, without any accountability, without the need to carefully
revalidate and be validated/invalidated.

I saw the post about inner monologue but didn't read it. But I believe it
could be related.

I highly suggest reading the book, because I'm doing it a disservice trying
to explain it. It basically explains how oratory, the printing press, the
telegraph/telephone and television really changed society a lot, but in
subconscious ways that most of us don't think about.

------
CM30
Truth be told, the most practical solution is to stop centralising everything
and using these giant social networking sites that force people with nothing
in common together. The bigger your audience gets, and the less focused on
any one topic it is, the harder it is to moderate/keep under control, and the
more drama you'll inevitably have when groups clash there.

Smaller internet forums, subreddits, Discord/Slack groups, etc tend to be a
lot more civil than the likes of Twitter or YouTube are.

So a revival of those types of sites and communities will help a lot.

As will returning to the days of multiple pseudonyms for different websites.
Because people are not one-sided. They don't always act the same way in every
setting.

No, their behaviour depends on the company they're with. They might act one
way with family, another way with friends, another way at work, etc.

That's how society stays together to some degree. People don't know how others
act in other settings, and they don't care. Your coworkers likely have a whole
mix of political opinions, but since it likely doesn't come up during work, it
doesn't really matter.

Social networks seem to be trying to demolish this sense of separation between
sides of people's personalities, and that's making society more and more
fragile, as one wrong move means someone's entire life gets destroyed by the
internet mob.

Oh, and decent moderation too. Unfortunately for Facebook and co., you can't
automate moderation and expect it to work well, and you can't outsource it to
a bunch of full-time employees in a distant office somewhere. It has to be
done by people with a real investment in the community, which is again where
a well-run small community shines.

~~~
el_cujo
Unfortunately, I don't think it's as simple as "just go back to smaller
groups". For companies, more users = more revenue; there isn't really much
incentive not to try to pull in as many people as possible. And for most
normal users, if they hear about some big site, they're inclined to join it
so they don't miss out on the funny TikTok memes / all of their friends being
on Instagram / etc. For them, what benefit is there to a smaller site where
you can't get as many followers, or where your friends/favorite celebrity
isn't a member?

I agree with you that size is one of the main contributing factors to the
problem, I just don't think smaller sites is a practical solution for the
public at large. That being said, if you don't care about "fixing" the public
problem, then you're right on the money. If you personally don't want to
experience toxicity, get off the big sites; it's that easy. I just don't see
that being a fix for the average teenager/college student/boomer.

~~~
gyulai
...it doesn't need to be a huge company (or company of any kind) to run these
things. Back in the early 2000s, a volunteer who knew about computers would
set up phpBB on a church's website, or a radio station would allow one IT
person to spend half their time maintaining something like this, and that
would be it. No need to earn millions in advertising dollars to have an online
community.

I do agree with the parent comment, that we may well see a resurgence of small
and medium-sized online communities, for the simple reason that more
fragmentation could be a good thing. When a community has its Eternal
September, people can move elsewhere.

~~~
el_cujo
I meant more that even a new, small company would have no incentive to not
want to become the next big thing/the next facebook. Good point though that we
don't need companies/start-ups to feed us online communities, particularly
since the users are almost always the ones bringing value to platforms like
message boards/forum sites.

------
bjt2n3904
Stop responding to it.

No seriously, just stop. This isn't a "complex problem that needs nuanced
technical and legislative solutions".

Back in the old days, there was this saying: "Do not feed the trolls." Sadly,
we've forgotten that.

Our current approach is, "create rigorous 30 minute point by point take down
videos to defeat their point of view". Our urge to debate and correct people
who are wrong just fuels them making more content. A troll needs reactions to
survive. Just downvote and move along.

~~~
iso1631
We found a lot of lies during the UK election last month. Not differences in
opinion, not beliefs about what may or will happen, but easily proven lies,
posted across community forums, copied and pasted to other ones, and repeated
in an increasing crescendo.

People then believe those lies, they repeat them, and even if they don't, those
lies sink into their subconscious and change their behavior, not necessarily
today or tomorrow, but for the next 30 years.

Ignoring them doesn't fix the problem.

~~~
tengbretson
You might find that the people you see posting and repeating these lies see
the things that you post and repeat as lies.

~~~
organsnyder
And one of the sides has to be right. Sometimes the facts are difficult to
ascertain, but so many of the lies spread via social media are easily
disproved by consulting primary sources.

~~~
ideonexus
> but so many of the lies spread via social media are easily disproved by
> consulting primary sources.

I wish this were true. I used to post snopes links and primary sources to Baby
Boomer posts on Facebook, but it's hopeless. They either don't trust the fact-
check, can rationalize it away, or just don't care. One of the most shocking
realizations of my adult life has been learning that a very large portion of
my otherwise high-functioning friends will believe anything, no matter how
crazy or self-contradictory, if it reinforces their sense of self-
righteousness.

~~~
meheleventyone
And a whole bunch of people that see the minority or unpopular opinion as more
valid because of it.

------
kerkeslager
I don't know if I know the solution, but I know one thing that _isn 't_ the
solution: silencing everyone who disagrees with you.

A lot of the people complaining about "toxicity" on the internet seem to be
under the impression that if we "deplatform" the so-called toxic people, that
will fix things. But on the contrary, that makes things worse.

If someone says something awful on the internet, and everyone either ignores
them or politely presents a counterargument, they either move on because they
feel they've been heard, or they engage in a polite discussion. Maybe they
change their mind, maybe they don't. If they really can't engage in polite
discussion, then they come across as making their ideas look worse, so they
aren't really doing much harm.

If someone says something awful on the internet, and everyone rails about how
awful it is and gets them banned, then that person is angry, and that anger
motivates them to keep posting about it everywhere and spreading their idea.
Meanwhile, they will integrate that idea into their identity, which makes it
far harder to change their mind. And if you actually manage to get them to go
away, they will go to cesspools like Voat, where they are even less likely to
be exposed to ideas that change their mind, and where in fact they are likely
to be exposed to even worse ideas.

Let's get some perspective: what you're complaining about is people saying
things you don't like on the internet. Yes, what they are saying spreads
ignorance, but the solution to ignorance isn't silencing the ignorant, it's
education.

MLK and Harvey Milk both recognized that the source of the bigotry they fought
against was fear borne of ignorance. But the average left-leaning person today
doesn't see bigots even as people any more. All it takes nowadays is for
someone to say one of a list of banned phrases and they're completely written
off as even human. If we're going to bridge the gap here, we on the left have
got to consider that we might be the toxic ones.

I've found that when I actually talk to so-called "toxic" people politely,
they are willing to listen. It's not them that are causing the polarization.

~~~
bonaldi
The data suggests that deplatforming works. Milo; Katie Hopkins; InfoWars et
al have all lost their former agenda-setting influence following
deplatforming. And studies have shown that banning hate sub-Reddits does _not_
cause that content to “pop up elsewhere”, it causes it to decline overall.

It’s important to remember too there are incentives for people to argue and
behave otherwise: FB, Twitter, hate-speech mongers who want easy access to
large audiences — All have commercial cause to act in favour of more and more
extreme speech.

People like that benefit from more and more extreme speech. They permit or
encourage it on their platforms. This in turn causes the white blood cell
count of the body politic to spike as it tries to counteract the bile and
hate. This angry counter-speech then gets presented as “polarisation”.

The solution to ignorance _can often be_ silencing the ignorant, yes, in order
that the educators can be heard.

Would we still have an anti-vax problem if FB banned it across its properties?
Really?

Banning disruptive speakers works. Every pub landlord knows it.

~~~
djsumdog
You realize Voat exists right? And Gab? I wrote this about how Voat grew:

[https://battlepenguin.com/tech/voat-what-went-wrong/](https://battlepenguin.com/tech/voat-what-went-wrong/)

Deplatforming was a big part of that. Deplatforming didn't stop Milo. His base
dropped him because he started defending hebephiles & pedophiles / men being
attracted to teens/pre-teens/boys (there's actually something sadder here,
with Milo not realizing he was himself abused ... there's an entire tragic
story lost there that people don't seem to understand or pick up on because
they're too busy hating him).

They might leave the platforms you like, but they move over to Voat, Gab, or
start up their own Pleroma/Mastodon instances (that get banned from
everywhere). Deplatforming doesn't really work in the way you think it does. It
literally gives people more drive to stand up for and behind whatever Capital-T
"truth" they think got them banned.

~~~
potatoz2
This is akin to an argument that you shouldn't fire your nazi-sympathizing
coworker when they discuss their support for an ethnonationalist state because
they may join neo-nazi groups and "radicalize" as a result.

Yes, they may. But while they do, the workplace where most people interact is a
livable place for the rest of us. We shouldn't make it easier to be heard if
you have despicable views just because of the implied threat that things might
get worse.

~~~
eanzenberg
Yes, and for truly horrendous viewpoints it's ok to ban them outright. But
society isn't black and white, and when you start lumping non-extremists into
the same bucket and banning them all, you end up where we are today: distrust
of the MSM by a big chunk of society.

~~~
ceres
You keep talking about "non-extremists" but you are not telling us what you
mean by that.

~~~
eanzenberg
I don't want to get political, but do you think Joe Rogan is an extremist?
Jeanine Cummins? What about Ben Shapiro?

These are people that the MSM is trying to cancel quite literally right now.

~~~
sagichmal
Nobody is trying to cancel Joe Rogan. I don't know who Jeanine Cummins is. Ben
Shapiro is definitely an extremist.

~~~
CapricornNoble
>>>Nobody is trying to cancel Joe Rogan.

Oh I bet the Woke Twitterati (tm) would if they could, but Joe Rogan is in the
same stratosphere as Ricky Gervais and Dave Chappelle where they are
effectively impervious to cancellation barring some Harvey Weinstein/Bill
Cosby-level misconduct.

[https://www.hollywoodintoto.com/joe-rogan-cancel-culture-smears/](https://www.hollywoodintoto.com/joe-rogan-cancel-culture-smears/)

[https://www.youtube.com/watch?v=zvAc7002eRM](https://www.youtube.com/watch?v=zvAc7002eRM)

------
mwfunk
IMO that probably has more to do with changes in your own reading habits or
maturity level vs. 5-7 years ago. As toxic and ridiculous as any sort of
electronic forum can get, I don't think it's any different now than it was on
Usenet circa 1995.

For example, I started reading Slashdot not long after it started ('96?). I
really enjoyed it, read it every day, occasionally posted. Then sometime
around 2000 I just got disgusted with the quality of the comments- it seemed
like there was a long slow decline, and no longer did the average commenter
seem like they were my age or older, my level of experience or greater, my
level of knowledge or greater; instead it just seemed like a cesspool of
pointless, toxic flamewars about stupid things that seemed to exist for their
own sake. Reading Slashdot comments started making me feel stupider and
angrier, not more informed. So I eventually stopped reading Slashdot.

But then, over the years I would talk to various people younger than me with
the same experience- the only thing that changed was which year they thought
Slashdot was good, and which year they thought it went so downhill that they
just couldn't read it anymore. Like, "Slashdot was awesome circa 2007 but by
2011 all the commenters sounded like toxic teenagers". Which makes me think
that what actually happened was Slashdot comments were always terrible, but
if you started reading it in your larval hacker years you might not know any
better. Then you grow up some more and get more knowledgeable and mature, and
the same level of commentary seems really inane.

------
rdiddly
It's too late - the web already "democratized" communication, meaning that not
just people with writing skills, credentials, connections or experiences
capable of impressing some gatekeeper get to have a voice. People with
absolutely nothing worthwhile to say get to talk to everybody now. Sound
elitist? Then the desire for "no toxicity" is elitist. (It's basically
expressing a bias against Shitty People.)

EDIT: Possibly against my better judgment I'll take the question a bit more at
face value. I think it's all about the climate or _culture_ of a place. As an
engineer I've tended to dismiss talk of "culture" esp. where a company or
workplace is concerned, but in recent years I've changed my mind. For example,
one message or pronouncement by the CEO can change people's entire experience
of working there. Same thing for a website, which is a virtual place. And what
kind of culture do you expect will develop in a place where the incentives,
motivations, values etc. of the people running the site are actively hostile
to yours? A place where the very premise of being there is already a hostile
act against other humans? It's a corrupt and fundamentally dishonest (not just
incidentally dishonest, I would argue) place where the social order has
already broken down before a word is even said.

Yeah okay, being a surveillance platform that optimizes for dark patterns and
"engagement" doesn't _guarantee_ that it'll be full of trolls. But it's not
exactly a surprise when it turns out to be that way.

Steve Jobs was obviously pretty shitty in some ways, but at the height of
Apple's revolutionizing of the UI (before it turned shitty) it made you feel
like there was something noble about it and you felt better and more of a
sense of _decorum_ just by being there. Meanwhile Twitter refuses sensible and
awesome ideas for UI/UX over and over for years because they get more
_engagement_ by keeping it shitty. And shittiness has a way of spreading. If
the place is shitty you will act shitty. But if you suddenly get invited to go
have lunch with the Queen of England or something, you probably behave better
despite yourself, even if you're Johnny Rotten.

We don't "behave better" because fewer and fewer of us are getting invited to
lunch with the Queen, and because we're not the (U)ser whose (X) the ones
running the place care about, and because we can tell we're being taken
advantage of, and the nearest person available to take out our grievances on
is each other.

------
ebg13
> _Before, around 2010-2012, people who disagreed would usually leave it at
> that and walk away respectfully._

I don't know what internet you were using, but this does not match my
experience. The internet has been a dickfighting playground since at least the
90s. The biggest difference is that comment threads are the majority of how
people create content on the internet now and they weren't before.

~~~
jandrese
The term "flamewar" predates all social networks except Usenet. It was not an
isolated occurrence in the early days of the net.

~~~
ebg13
Are you responding to something I said? If so I can't tell what.

~~~
jandrese
> The internet has been a dickfighting playground since at least the 90s

I was just reinforcing your point by pointing out one of the earliest phrases
coined on the Internet is about people discussing a topic without civility.

------
at_a_remove
Honestly, I think the large-scale nature of communities is a factor.

Moderators now are also more willing to ban accounts that are non-spam.
Certain views are "harmful" and apparently people cannot be asked to _not
react_ to seeing things, or even ignore people, whether via some tool or just
by saying "Oh, it's her again, I'll just skip over that." This places pressure
upward on the moderators to make certain people go away.

Folks love gaming the algorithms that decide what will and will not be read,
and algorithms come into play because of the aforementioned large scales. Face it,
the Internet runs on ad money and that can be tight, so everyone gets the
bright idea to either let the algorithms do the moderation or to let people
who have free time on their hands and the motivation do it. Usually the
motivation is expressed in some seemingly-altruistic manner but it always
boils down to pushing their own views. Eventually some views become
acceptable, others anathema, and you get that ghastly distillation where more
moderate voices are driven off until finally you get a kind of ghost-town.

This is a pretty tough nut to crack from this vantage point.

One of the ways I see out is the admittedly computationally-intensive tack of
making the algorithms only affect a user's _personal_ view rather than making
it site-wide. You can still run into the echo chamber issue, only in this case
it is a bunch of people in their own private speech bubbles in one _large_
chamber.

I've spent a lot of time thinking about this because I have watched this sort
of thing kill communities for the past thirty years now.

------
jp555
Sounds like "Naive Realism" \-
[https://en.wikipedia.org/wiki/Na%C3%AFve_realism_(psychology...](https://en.wikipedia.org/wiki/Na%C3%AFve_realism_\(psychology\))

"Have you ever noticed that anybody driving slower than you is an idiot, and
anyone going faster than you is a maniac?" \- George Carlin.

Funny... but what's even funnier is that EVERYBODY THINKS THIS.

People don't realize that it's not that there's a group of bad drivers out
there, but that everyone drives badly sometimes. It's always different people.

The toxicity online is similar. There are just a lot more people online a lot
more hours per day, and everybody is a jerk sometimes.

~~~
growlist
But this is possibly a kind of fallacy in itself: it's possible that there
is bad behaviour out there, and that it's associated with one particular group
as opposed to another; there are points to be scored by pushing a particular
position online if it influences reality. In my experience on some forums,
mention anything about Brexit and a flood of anti-Brexit comments appears;
either there were hundreds of lurkers that just happened to be online and spot
the comment, or there was organisation behind it.

~~~
jp555
There may be groups of bad actors coordinating, maybe. But you remind me of
this Alan Moore quote:

"The main thing that I learned about conspiracy theory is that conspiracy
theorists actually believe in a conspiracy because that is more comforting.
The truth of the world is that it is chaotic. The truth is, that it is not the
Jewish banking conspiracy or the grey aliens or the 12 foot reptiloids from
another dimension that are in control. The truth is more frightening, nobody
is in control. The world is rudderless."

The more I learn about probability and complexity, as well as the lack of
general knowledge of ergodicity leading us to make fallacious conclusions, the
more this rings true.

~~~
krainboltgreene
Right but like...There are organized movements by governments to do horrific
things. China is doing it, Nazi Germany did it, America has and is and will do
it.

Random quotes from counter-culture writers don't negate that bad actors have
coordinated, that they weren't bothered by sunlight, and that when they got
enough power they did truly horrific things.

~~~
jp555
I don't mean to suggest that I don't think people can organize in groups, or
that group efforts can never have negative effects.

I'm trying to say that there is an organic ecosystem of groups that severely
constrains all groups in unpredictable, complex ways, and much more than most
people would think.

------
dclusin
We're essentially feeling the negative effects produced by sorting a list of
items. Currently they are sorted based on friend interactions and other
criteria, mainly popularity. They sort this way because it increases
engagement. What we've also found out is that this is negatively
affecting people's emotional health.

My personal feeling is that reverse chronological order with maybe throttling
your chronic over-sharing friends would probably go a long way to helping
blunt some of the negative effects we see today. But it would also lower
engagement and lead to revenue losses so they probably haven't earnestly
considered it. Kinda like how the cigarette companies don't want you to know
that smoking is bad for you, because you might smoke less.
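The sorting rule described here (a plain reverse-chronological timeline, with a cap on how many posts any one over-sharing friend contributes) is simple enough to sketch. This is only an illustration of the idea, not any real platform's code; the `build_feed` function and the post dicts with `author`/`time` fields are hypothetical:

```python
from collections import defaultdict

def build_feed(posts, max_per_friend=3):
    """Reverse-chronological feed that throttles chronic over-sharers:
    each author contributes at most `max_per_friend` of their newest posts."""
    feed = []
    kept = defaultdict(int)  # posts kept per author so far
    # Newest first, i.e. a plain reverse-chronological timeline.
    for post in sorted(posts, key=lambda p: p["time"], reverse=True):
        if kept[post["author"]] < max_per_friend:
            kept[post["author"]] += 1
            feed.append(post)
    return feed
```

The point of the sketch is that ranking becomes a fixed, transparent rule rather than an engagement-optimized score, which is exactly the trade-off the comment describes.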

~~~
brlewis
That's exactly how FriendFeed worked, and it was great.

------
tenebrisalietum
The toxicity is seeping in from real life, and really popular social networks
and media treatment of these is creating a feedback loop.

The real problems are:

A) real life sucks unless you are part of the subset of the rich or have good
support structures,

B) toxic behaviors help you get ahead in certain situations in life, and
people with low or no resources often resort to them,

C) there's no cost to joining certain social networks so it attracts those
with no resources and/or the lowest common denominator, including people who
have been damaged by life and only know toxic behavior as a norm,

D) people are interesting and useful when doing something productive or
creative, but not otherwise.

It's hard to say if the category of people who fall in C is increasing in
absolute number or only due to popularity of certain social networks.

Social networks that aren't dedicated to doing something productive or
creative, or that are open to everyone for free, will be overtaken by toxic
behavior as a result. They should be avoided, or new ones developed in their
place.

Another factor is the downfall of respect for the news media in the US; this
makes discussions of news events attract comments from the type of people in
C, as they think their opinions are more important than they are.

------
rgrieselhuber
It helps to realize that much of it is intentional, especially on the part of
companies like Facebook. Once you realize this, it has a sort of spell-breaking
effect. Ask yourself if you would say the same sort of thing to someone if you
were face to face. If the answer is no, think twice about posting it online.

------
barryaustin
There's no straightforward fix because this behavior is something that emerges
from mass psychology, like war. In a sense this _is_ war, with new platforms
to amplify and direct weaponized ideas.

If war is politics by other means (Clausewitz), we can turn that around and
say that politics is war by other means. See: the Russian concept of hybrid
warfare.

What ends war is when enough people in a society are convinced it's not worth
it. And that happens when enough people are exposed to the horrors of war,
when they see what it does to their own lives and to the lives of people they
care about.

Now we're in a period when most people haven't been exposed at that level, so
this is unfortunately on the upswing.

The way through is for some number of us to keep our humanity toward others
(in the positive sense), to act accordingly, to carry that through until
conflict blows over, and to rebuild society afterwards.

------
dec0dedab0de
It has always been a problem, but I agree it has gotten worse. The problem is
that it has gotten worse in the real world too. I've had friends, family, and
strangers yell at me over minor political disagreements, and it's not primarily
coming from any particular side or group. You would think they were trying to
fight about something important, like emacs or vim.

But why is this happening recently? I think the rise of smart phones has
facilitated way more people engaging on the internet instead of just browsing.
This in turn has opened up people to the vast amount of viewpoints in the
world, and unfortunately people tend to get angry at people that are
different.

Another thing that has become a problem is people being paid to support or
attack a cause. These professional rabble-rousers are stirring up shit inside
of people that would rather just have a sandwich.

~~~
kangnkodos
There may be a small number of paid partisans, but I think a bigger problem is
people who are paid in internet points. There's some kind of psychological
flaw that drives people to get more points. Then the combination of the way
web sites are set up and the people inside of them leads to the extremists
"winning" by getting more points. Then the whole mess feeds on itself to get
more and more extreme.

------
jf22
> Before, around 2010-2012

Eh, people were being jerks to each other on Usenet and BBSes way before
this.

Godwin's law is 30 years old at this point.

------
decibe1
I don't think we should try to stop it. I like discussion groups where the
only thing moderated is spam. Yes, you'll get a lot of noise - but they can
also be the only source of discussion that lies outside the Overton window.

I've found it easy to ignore content I don't like, but it is getting harder to
find open discussion forums.

~~~
duxup
>outside the Overton window

I feel like what you describe here ... is the noise.

~~~
AnimalMuppet
Not everything outside the overton window is noise. It sure isn't all signal,
though. It almost certainly has a higher proportion of noise. But it's also
where the new directions for society are going to come from, for good or ill.

~~~
duxup
There's an "outside the Overton window" description (and similar descriptions)
I hear folks use when they talk about the value of un-moderated internet
discussions, and ... honestly I've found it mostly to be poorly thought out at
best, often painfully ignorant as far as the motivations go.

There seem to be some folks who talk about really valuing novel ideas and how
they can't show up in moderated discussions, but I just see a lot of noise that
might be outside the "Overton window" but is mostly pithy garbage.

I find that anything mildly thought out usually fits inside even the heaviest
of moderated type discussions.

~~~
AnimalMuppet
> I find that anything mildly thought out usually fits inside even the
> heaviest of moderated type discussions.

Depends on the moderation. Here, if it's well thought out (and expressed in a
non-antagonistic way), it _usually_ fits. That won't be true of every moderated
place, though. And even here, certain topics may not run afoul of the
moderators, but users may still downvote them to oblivion. (For example: There
is some suggestion that autism may be caused, in at least some cases, by gut
bacteria. But if I were to link that idea to the suggestion that the measles
vaccine might, in some cases, cause gut issues, I would expect to be destroyed
by downvotes, no matter how good of an argument - or even evidence - I had.)

------
deltron3030
>Or is this more of a social problem that code can't solve?

Absolutely, because it's wrong to regard the web as a thing that's separate
from the real world and its effects. We're talking about people's identities
in the sense of how they see themselves, and that's shaped by their
environment (what they see, hear and do), online and offline.

What happened is that people are able to find a tribe, whereas before the web
was a thing, finding a tribe was more difficult and often exclusive to cities.

People living in rural areas were likely to adapt to existing "big village
tribes"; branching off wasn't viable, as a matter of circumstance, because of
a lack of different subcultures within those rural areas.

The web changed that, everybody can socialize and find a tribe online, it's
like a big city full of subcultures, and subcultures provide identity.

Early people on the web were often already part of urban tribes, I think, and
took positions within those, so if they encountered a rival group online
(think of the poppers-and-rockers stereotypes from the 80's), toxicity
was almost ensured. So what happens if a mainstream culture that's fed
stereotypes through Hollywood etc. is let loose on the web?

The previously tribeless population, which was forced into a tribe by
circumstance, is basically discovering the world and going through its teenage
years.

That's my explanation of what's happening.

------
duanem
I'm not sure you were around for the great GNOME vs KDE wars of the 2000s. It
was pretty toxic and "win-at-all-costs" back then :-)

When I released my app 8 years ago, I thought support would be my most hated
part of releasing a product. To my surprise, it was one of the most rewarding
parts; I met some great people via email, some of whom I will visit one day.

But support has its ugly "toxic" side as well. This is particularly prevalent
in 1-star reviews for trivial issues. All the reviewer sees is a box to vent
their thoughts without considering that there is a real person on the other
side. But I'm a real person who cares and some comments do stir bad emotions.
This has brought some of my lowest and darkest days of app development.

Sometimes I will make contact with the reviewer and as soon as they get to
know me as a real person, they are friendlier and more respectful.

The faceless nature of the internet causes people to treat others poorly. In
the real world, where we meet people face-to-face, our initial and natural
position is to treat each other with respect.

So here are a few ideas:

1\. Imagine a person, someone's face, not a textbox, website or company - add a
persona to him or her.

2\. Ask this simple question: "Would I treat or talk to someone (my friend)
like this in real life?"

3\. Would you be proud of the way you're conversing if people you admire were
watching, e.g., respected colleagues, friends, parents?

This will change the way you write.

~~~
fastball
Like anything else, I think a big part of this is how you are raised.

There are plenty of people that treat others terribly, in real life, without
any anonymity whatsoever. Likewise, there are plenty of people that treat
other anonymous posters on the internet respectfully.

As with everything, parenting and social norms just need to catch up to
technology. We need to instill in our children the idea that being rude to
anonymous strangers is a bad thing, just as it is a bad thing to be rude to
strangers in real life, just as it is a bad thing to be horrible to people you
already know.

~~~
duanem
Agreed. If you have a bad upbringing and treat people poorly, when you enter a
new environment, e.g., a workplace, that environment will quickly teach you
the proper way to act.

Likewise, on the internet, we need to encourage people to behave nicely and
discourage bad behaviour. And we do, this is what this post is about. I guess
we need more of it.

------
ineptech
1\. Amazon releases MyOneServer, the first AWS product aimed at non-tech end
users. M1S is a VM that runs apps (like, from an app store) for server-side
use cases like running a blog or hosting a minecraft server.

2\. M1S gains traction from hackers who don't want to be part-time sysadmins,
but the first widely-used killer app for it is a social media front-end
combined with a CDN. Self-hosted files and "publish once, syndicate
everywhere" starts to flourish and Facebook begins to lose its position as
middle-man to the web's social activity.

3\. Bezos buys Keybase with spare change from his couch cushions and integrates
it into M1S; now all server-side apps, and the users thereof, natively have
cryptographic identities.

4\. Users begin to demand M1S integration from businesses that used to hold
(and resell) their data. Widespread adoption by webapps cyclically drives
further adoption by users, making an M1S identity (if not heavy usage) almost
ubiquitous.

5\. With most client/server use cases sharing a single cryptographic auth
network, the last vestiges of anonymity disappear from the web as people's
usernames, real names, cryptographic identities, and checking accounts melt
into a blob.

6\. With their real identity attached to online activity, online activity
becomes "activity", and, while toxicity is not eliminated, it is at least no
worse than the real world.

(Crazy? maybe. But I am honestly surprised every day that I wake up and find
out that Amazon hasn't made something like step 1 yet...)

~~~
sjf
I really thought this was going somewhere until I got to the punchline: the
expectation that having online identity linked to real identity will reduce
toxicity. Real-name policies have already been tried by several huge social
networks and don't seem to have had any impact.

------
netcan
There's a Bernard Cornwell book where an Arthurian warrior collaborates with
the druid Merlin. It takes place in Britain during the Saxon invasion period.

At one point Merlin's in a library. We learn that Druids are not allowed to
write. Once you write something down, it becomes fixed. Fixed on paper. Fixed
in your mind. Rigid and unalterable... this is not conducive to magic, which
cannot work under such rigidity.

It reminded me of Socrates & Phaedrus' conversation on reading & memory. Words
in a book don't work the way words in your head do. Reading about a
conversation isn't like discussing a topic in Socrates' garden. When writing
replaces memorisation as a method of learning, things are lost as well as
gained. The ideas themselves change.

It's particularly interesting as Socrates was famous for his conversational
philosophical method, where allegory and a flexible style make the case. His
student is famous for his writing, and for being more precise, detail-oriented,
reductive and, uhm... platonic in his method.

Anyway... the internet has made us all writers. We write whatever enters our
head. Then it becomes fixed. Then we defend it... like cranky old academics
defending pet theories they wrote in their youth.

~~~
ianai
I’d say the fix then is to remove the words. I.e. leave social media or make it
too decentralized to attack.

------
meekstro
I think you need to acknowledge human nature and profit from it.

Humans love observing an argument.

Twitter is too concise to form decent arguments.

Imagine a website called Duel, where people could register a username, post on
the forum where they were arguing to verify their Duel account, and then
continue the argument on a platform set up for settling written arguments and
integrating external references.

People would talk less anonymous smack, or would have to back up their smack
on Duel and write at a level that convinces the masses they know what they are
talking about. No swearing on Duel. The disagreements would actually educate
everyone, and the voting and following would tell you what to advertise.

If someone was talking smack on a forum and didn’t register on Duel, you could
assume they were a coward or a troll. If followers could supply duelers with
arguments, we would get to the truth in an entertaining way very quickly, and
any misunderstanding would be highlighted and documented.

People often do great thinking in the heat of an argument. We should harness
the anonymous arguments. Politicians should accept duels instead of televised
debates.

Limit the responses to 300 words and six references or something effective.

I wish I had the money and health to clone twitter and do it but someone
should do it. Both sides have to watch the arguments unfold so it brings
observers and duelers closer to the truth rather than polarising them.

The trick is in trusted auto verification of anonymous duelers and getting
enough duels and spectators to make internet dueling a thing.

~~~
anigbrowl
This is an intriguing concept. A number of websites have tried this sort of
thing with limited success but your idea of making it a forum specifically for
challenging people to duel is a superior hook.

------
forgottenpass
Remember back when we used to have concepts like "flamebait" and "flamewars"?
When we would openly talk about conversations not to get involved in? Right in
direct proximity to those conversations themselves? We'd acknowledge that no
one was above getting hot under the collar, and that while we'd try to avoid
it, we might have to go bicker at each other in the corner for a while?

What changed? My rough list is:

1\. News bloggers got online and turned flamebait into their business model.

2\. Normies got iPhones and started posting online. They never read a
netiquette guide. If they had heard of the concept at all, they dismissed it as
computer-nerd nonsense. They don't think about online communication
systematically and expect every interaction online to be like talking to
customer service: they get to say whatever they want, and only hear a very
constrained window of things back. This backfires in all the obvious ways.

3\. "Platforms" with KPIs based on engagement are incentivized to enable the
largest flamewars possible and put them in as many people's faces, right up to
the point they start causing bad PR.

4\. People with basically zero forethought thought they could fix society by
being paternalistic to strangers.

------
rlucas
If you're working on this in a for-profit context, I'm a Seattle VC looking to
invest in solutions for this particular problem, and my phone/email/etc. are
in my profile.

My most recent investment was on this thesis:
[http://blog.rlucas.net/vc/exchanging-thoughts-on-a-thoughtexchange-investment/](http://blog.rlucas.net/vc/exchanging-thoughts-on-a-thoughtexchange-investment/)

------
dcchambers
Several sites I used to enjoy have been ruined by this toxic polarization.

Everything is an echo chamber these days. No one wants to be wrong or look
foolish online. People don't want to even read opinions other than their own.
There is no more civil discourse.

It's not just online - I'm noticing the same thing in real life.

I don't know how to fix it...but I hope someone does. Whatever the case, we
need to treat the cause, not just the symptoms.

------
hickernews
AI is a Skinner box right now. It’s designed to capture attention. It does
that by measuring how intensely something makes you feel. Dopamine high =
content dependence and an attention reward. If you see people fighting, your
brain looks. If you see something normal, who cares? You’re not forming
relationships with people. You’re trying to get likes and shares. Why the hell
would you do something boring like find a middle ground?

Algorithms are the attention filter right now. They obviously suck and don’t
respect autonomy. Personally I liked the days of “content like this”
navigation. I used to spend hours just walking around the network of Spotify
songs to find new content. Now you can’t do that. The algo suggests stuff and
you’re not really free to explore the graph.

This article from another thread might help people think of ideas on how to
fix this:
[https://lithub.com/how-we-pay-attention-changes-the-very-shape-of-our-brains/](https://lithub.com/how-we-pay-attention-changes-the-very-shape-of-our-brains/)

------
rv-de
A few weeks ago I read Heinrich Böll's "The Lost Honour of Katharina Blum"
[1], a literary reflection on how society and the media treated RAF [2]
sympathizers.

What struck me was how much the described trolling and toxicity resembled what
we observe nowadays. The book is set in the '70s, so the anonymous bullying
and hysteria just materialized via letters, newspapers and the phone instead
of social media.

Also, it has been mentioned on HN several times that society didn't become
more toxic - possibly even less so. But more and more people have gained
access since the 90s, and some abuse the opportunity in such ways. There are
now troll farms, and a single person without a job can spread poison like
bullets from a Gatling gun.

1:
[https://en.wikipedia.org/wiki/The_Lost_Honour_of_Katharina_Blum](https://en.wikipedia.org/wiki/The_Lost_Honour_of_Katharina_Blum)
2:
[https://en.wikipedia.org/wiki/Red_Army_Faction](https://en.wikipedia.org/wiki/Red_Army_Faction)

------
jedberg
You really can't.

The only fix is aggressive moderation, and even that isn't great, because
you'll be shaped by the ideas of the moderators.

The internet is a wonderful tool. It magnifies reach of all thoughts and facts
and opinions. Sometimes that's a good thing, sometimes it's a bad thing, and
what each person considers good or bad is different.

We've always had "crazies" with strange ideas. Back in the day they just stood
on street corners shouting. Then they passed out leaflets. Then they got on
public access cable.

And now they have the internet, where they can reach the whole globe at once.

The one thing I can tell you that won't work is pure democracy. Something I
learned at reddit early on is that pure democracy just doesn't work, because
the trolls have far more time and patience than everyone else, and they _will_
manipulate the system.

Also, a lot of people just don't want to think for themselves. They will just
follow the loudest voice assuming it is correct. This also makes democracy not
work well.

But democracy is still the best system we've got.

------
ativzzz
The answer is heavy-handed moderation. You need active moderators who
discourage toxic comments and commenting patterns, and who punish repeat
offenders. I don't know how the guys at HN do it since there are so few of
them, but I like their method of leaving a comment explaining why someone's
comment is toxic (usually against guidelines).

------
cirno
Step one is getting people inside Google, Twitter and Facebook to stop ranking
toxic and polarizing content so strongly in search, images, videos and
recommendations.

They'll say it's the AI, and it's only giving people what they want, but if
you ask a kid what he wants to eat he'll become malnourished on an all-sugar
diet. A healthy diet means serving up some vegetables even if your kids don't
like you as much for it.

In other words, even if it hurts your ad revenue and engagement metrics a
little, it's the morally right thing to do. I'm not saying to bury misdeeds or
censor anything, but give people some positivity once in a while.

Dumping controversy and callout posts that have barely any backlinks at the
top of people's results gives them artificial weight and makes people see the
world as more and more polarized, which results in them acting accordingly,
and we're all the worse for it.

------
peckrob
It's a big problem without one simple solution. One thing we can do, though,
is not enable it when possible. One big thing I have zeroed in on recently is
comments.

When I redesigned my blog/website last year, I intentionally removed Disqus
and all commenting ability. In fact, what prompted the redesign was that I
wanted to add a box pleading with people to not use comments for tech support
for various open source projects I've created and to file Github issues
instead.

But the more I thought about it, the more I realized how bad comments have
really gotten. Trolling, bullying, racism, sexism, homophobia, transphobia,
abuse, and general mean behavior have become such a rampant problem that
"don’t read the comments" has become an Internet meme on par with "don’t feed
the trolls." Think about it: when was the last time you changed your opinion
because of a comment you read on a blog or news article? More than likely it
just made you mad or made you sad.

I have been low-key nursing this idea for a while now that the Internet would
be better off if a lot of sites killed off comments. When comments become a
breeding ground for the worst aspects of human behavior, why should we
continue to enable them? Why should we host them and give them a voice? Why
should we implicitly endorse them under our brand? Because that is what we do
when we host comments.

Commenting should probably still exist on social media sites (Facebook,
Reddit, HN, etc.) But most sites would be a lot better off if they just
dropped comments entirely and reallocated those resources to more productive
uses, and I think their users would be better off for it as well.

Yes, I am just one person with a small blog about programming. But if this is
the position that I want to take, I should be the change that I want to see in
the world. Thus, the removal of commenting. And I've been pushing this idea a
bit in other areas that, just maybe, users don't need to be able to directly
comment on things.

~~~
CM30
Hmm, as someone who's run communities for a while, I'd have to say that the
quality of a comments section is very heavily dependent on the topic, audience
and how well you moderate it.

Over on YouTube for instance, I've had very few negative comments on my
videos, and usually found the comments section under them to be a mostly civil
place where I can help out people who had trouble with a walkthrough or guide.

Same goes with the comments on blog articles I wrote in the past too. The
comments have generally stayed on topic, were always free of personal attacks,
etc.

Comment sections can certainly be hellholes, and for large media outlets they
usually are. We all know how bad they are on stuff like the BBC, Daily Mail,
Guardian, etc.

But they don't necessarily have to be, and they can be entirely valuable if
moderated well and related to a topic people don't treat like it's the end of
the world.

------
mam2
There are only 2 sane strategies:

\- Ignore them and despise people arguing on the web while you build yourself
a better life.

\- Laugh at them and possibly feed the troll for more fun, but it gets boring
and youll get back to strategy 1.

We live in a clown world where too many people spend 90 percent of their time
on their screens receiving information from strangers instead of living a
normal life. Plus, the more opinionated people are on the web, the less they
actually have going on in their lives... it's like the fact that all
psychologists are people with poor mental health who went into the field
hoping they would understand themselves. The louder someone is about a
subject......

Do you really want to invest time into policing this clown world?

If you are not convinced, just go into a flat-earth group on Facebook. You'll
understand that "don't argue with pigs: you'll get dirty and the pig loves it"

------
prirun
This polarization isn't mostly a technical problem, but my experience hosting
forums at a previous company did give some insights:

\- smaller communities are more civil. As forums get more popular and their
power to influence grows, assholes want to use that power to get their way

\- reasonable people might put in their 2 cents on a discussion, but will not
continually argue with the assholes. The assholes know this, so they pour lots
of energy into turning a discussion into a shitstorm.

I have seen this same thing here at HN: two people go back and forth on a
point until the discussion is two words per line on the far right of the
screen. Once I see that starting, I usually quit reading the whole topic.

I'm not that familiar with the reputation thing here, but in my previous life
I had some ideas about how to rescue our forums. Most of the ideas revolved
around limits, like:

\- limit the number of posts per userid on a topic. This would require that a
person get all his thoughts out on the table and think before posting, rather
than going back and forth forever.

\- limit the number of posts per userid per day. Then if someone went on a
rampage, they couldn't just pollute one topic and go to another to vent (or
create new topics to vent). My hope was that if you force a person to cool off
for a day, they might not want to get into the same argument the next day.
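
Those two limits can be sketched as a small gatekeeper. This is a toy
illustration, not anything we actually shipped; the thresholds are invented
and a real forum would tune them:

```python
from collections import defaultdict

# Hypothetical limits: cap posts per user per topic (say your piece once)
# and per user per day (forced cooling-off).
MAX_POSTS_PER_TOPIC = 3
MAX_POSTS_PER_DAY = 10

class PostLimiter:
    def __init__(self):
        # (user, topic) -> post count, and (user, day) -> post count
        self.topic_counts = defaultdict(int)
        self.day_counts = defaultdict(int)

    def try_post(self, user, topic, day):
        """Record the post and return True only if both limits allow it."""
        if self.topic_counts[(user, topic)] >= MAX_POSTS_PER_TOPIC:
            return False
        if self.day_counts[(user, day)] >= MAX_POSTS_PER_DAY:
            return False
        self.topic_counts[(user, topic)] += 1
        self.day_counts[(user, day)] += 1
        return True
```

So a user's fourth attempt on the same topic is refused, while a post on a
fresh topic still goes through (until the daily cap bites).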

Once our support forums got popular, they got ugly. We tried everything we
could think of, but eventually had to shut them down and went to a model where
we anonymously posted questions that users had and posted our responses. Users
couldn't post. The support forums were still informative, but all the heat
(and a lot of the interest) was gone.

I'd never have forums in another business. They're great in the beginning, but
in my experience, impossible to manage when they get popular. HN has done a
great job here, but it's also a specialized, highly educated community.

------
qwerty456127
Rational thinking, manipulation techniques (you have to know them to be able
to detect them), emotional awareness and the like should be taught at every
school. Everybody should be conscious of the fact that 99% of what you find
online (let alone see on TV) is bullshit, designed to troll and manipulate
you, and absolutely not worth taking seriously (and if something really seems
important - check the proofs and don't forget to question their legitimacy). I
would even call for governments to fund efforts to educate every person
(adults included) this way - this is clearly a matter of national security
nowadays.

I believe there is no sane way (an insane way is to deanonymize and watch
everyone - China and Russia implement this) to stop the
polarization/toxicity in the first place; all we can do is develop immunity.

~~~
TruthSHIFT
Also, it should be noted that foreign governments are actively trying to
polarize us.

------
dx87
I don't think you can fix it, it's just the nature of the Internet. I was
talking to some older coworkers who were adults before the Internet became
mainstream, and they said that before the Internet, your social circle was
neighbors and co-workers, so being an ass would have direct, and sometimes
physically painful, ramifications. They said that on the Internet, there are
no real ramifications for being a horrible person, and you'll always find a
group of people who agree with what you say, whereas before you'd get
ostracized, or worse, for treating your neighbors/co-workers poorly. I think
the best you
can do is encourage people to interact outside of the Internet, and hope that
the manners required to function IRL stick with them when they go online.

~~~
chrisco255
It's not just that, it's the fact that most interaction on the net is text
based so you miss out on tone and body language. I feel like some discussions
escalate too high on the net due to the text based format.

~~~
dx87
Definitely. I remember one time a manager from another group at work asked me
a question about when a project would be done, and I responded along the lines
of "I'm working on it, and I don't know when it'll be done." because it was
the truth, and there was nothing else to say about it. They interpreted my
single sentence reply as rude and dismissive, then complained to my manager
about it. When my manager talked to me about it, he said something like "Oh,
ok, your reply is fine, dont worry about it."

------
arman_ashrafian
This is not an answer but on a similar note, has anyone watched the latest Joe
Rogan Experience with Daryl Davis. He is a black man who convinced Klan
Members to leave the KKK. Amazing podcast. Maybe the only way to stop
toxicity/polarity is to show real conversations with real people.

~~~
brlewis
Successful persuasion stories are always interesting to me, but at over 2.5
hours I'm unlikely to watch this whole thing. Can you please list some key
takeaways?
[https://www.youtube.com/watch?v=oGTQ0Wj6yIg](https://www.youtube.com/watch?v=oGTQ0Wj6yIg)

~~~
arman_ashrafian
Ya this is a good clip.

[https://youtu.be/75fGNLFAoIc](https://youtu.be/75fGNLFAoIc)

------
BjoernKW
This is not an algorithm problem. It's a people problem. Start with yourself:
[http://www.paulgraham.com/identity.html](http://www.paulgraham.com/identity.html)

An algorithmic solution would be to rigorously hide everything that looks like
mere unproductive outrage from public discourse. However, not only would that
amount to severe censorship, but it'd probably also not be exactly easy to
implement.

Besides, platforms such as Twitter or good old-fashioned news thrive on
outrage. If you take that away from them, there's probably not much left,
which is why it's not in these platforms' interest to do anything about the
issue.

So, it's back to square one: Yourself. Keep your identity small and try not to
perpetuate outrage on the Internet.

~~~
banads
Let's not pretend algorithms optimized for "engagement" don't feed on hate. Is
there any other emotion which is so easily evoked, manipulated, and engaging
as anger/hate?

~~~
lukifer
CGP Grey: "This Video Will Make You Angry"
[https://www.youtube.com/watch?v=rE3j_RHkqJc](https://www.youtube.com/watch?v=rE3j_RHkqJc)

------
growlist
> people who disagreed would usually leave it at that and walk away
> respectfully. Now, it seems like everyone treats everything as an argument
> or debate to be won at all costs. Even niche sites like HN are not immune.

You seem to be assuming a. that these are real people and not bots and b. that
these people are not paid shills etc. - yet we have ample evidence to the
contrary on both points. If you ask me, we are in the midst of an information
war that is crossing over with a culture war. I wish it would blow over but we
seem to be stuck with it until the people behind it either win or become
demoralised. The only solution that springs to mind is the complete removal of
anonymity online, but that would bring its own problems.

~~~
codingmess
"yet we have ample evidence to the contrary on both points"

People on HN believe that? The web has changed...

~~~
growlist
? The original poster said 'how do we stop toxicity', and I am saying how can
we tell the toxicity isn't at least somewhat down to bots and shills, the
presence of either of which is hardly controversial, is it? I mean we've had
people trying to convince us for years that it was the Russian bots that put
Trump in the White House, unless I've seriously misunderstood.

~~~
codingmess
"I mean we've had people trying to convince us for years that it was the
Russian bots that put Trump in the White House"

Just because people have been trying to convince us of that theory, doesn't
mean we have to believe it. Ask yourself who is doing the convincing and what
might be their motive.

Anybody who has ever used Social Media knows that in general, bots don't
simply get access to your timeline. First they would have to get you to follow
them, a much harder problem than running thousands or millions of bots.

Likewise for shills - first you have to get people to follow you.

The notion that we would all be happily in love with each other, if it weren't
for bots and shills, isn't really worth investigating.

People are angry online because there are fights over real issues. It's not
just Mac vs PC anymore, it is about money, land ownership, power, control...
People tend to get angry if you try to take away their stuff.

As a simple example: many people like video games. Other people came around
claiming video games are sexist, and tried to get game makers to change the
games. That is some serious meddling with something many people hold dear. Of
course that makes them angry.

------
kgwxd
A bunch of personal rules I try to be conscious of, but usually break:
Respond/listen to ideas, not people. Try for a top-level quality comment
instead of a reply. If a reply makes better sense, don't direct it at the
person. Minimize the use of names, personal pronouns, partisan terms,
adjectives, and adverbs.

It's probably already been attempted, but a forum that had some understanding
of language structure and could programmatically provide feedback by adding a
"Review" step to posting would be interesting. The system would have full
access to the context a post is being made in, and checks could be built
around it. The fewer checks that are ignored, the better the rank.
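
As a toy illustration of such a Review step, a draft post could be run through
a list of pattern checks that report what the poster might reconsider. The
rules below are invented examples loosely matching the personal rules above,
not a real ruleset:

```python
import re

# Each check names a habit the poster might want to reconsider before
# submitting. Purely illustrative patterns.
CHECKS = [
    ("addresses a person directly", re.compile(r"\byou\b", re.IGNORECASE)),
    ("shouting (all-caps word)", re.compile(r"\b[A-Z]{4,}\b")),
    ("absolute claim", re.compile(r"\b(always|never|everyone|nobody)\b",
                                  re.IGNORECASE)),
]

def review(post):
    """Return the names of the checks a draft post trips."""
    return [name for name, pattern in CHECKS if pattern.search(post)]
```

A forum could show these flags at the Review step and rank posters by how few
flags their submitted posts ignore.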

------
gasull
I'm very late to this, but I wanted to say that part of the fault lies with a
naive assumption search engines adopted many years ago: that information found
online was mostly true if it had a lot of backlinks.
That's no longer true. If Google or Facebook don't want to censor content,
then they need to change the UI and warn that some of the results contain
wrong/misleading information, or even complete lies.

This fact-checking wasn't needed long ago because the pages with the most
backlinks had almost always factual content.

------
abootstrapper
Not all actors participate in online conversations in good faith. A big part
of the problem is trolls and propagandists antagonizing people and encouraging
division. Part of the solution has to be stopping them.

~~~
ianai
Lots of other comments here say it’s just the old trolls at scale, but I agree
with you. There is propaganda at such a scale that it’s hard to miss and some
is very hard to spot. I don’t remember people in the US being so likely to
repeat Russian propaganda in the past. I routinely have people repeat Russian-
twisted, false accounts of historical facts to me! This isn’t the old problems
at scale.

Edit: just this morning the news reported Republican senators being afraid for
their families' well-being if they counter Trump. Unthinkable just 4 years
ago.

------
kup0
_Disclaimer: I acknowledge that "toxic" is a loosely-defined term that can be
thrown at people who really aren't... I mostly mean people who go out of their
way to be negative/harmful in unnecessary ways, harmful forms of trolling, the
inability of people to have a discussion without it dissolving into
attacks/etc. extremely fast... that kind of thing_

I think we're looking back at history with rose-tinted glasses here. Toxicity
has always been around. Maybe in recent years it has grown, but I think that's
just the nature of the technological beast.

It doesn't mean that on a broad scale technology always actively contributes
to making things more toxic- but just that it amplifies and contributes to all
sorts of things, and that includes the negative. We're more connected than
we've ever been, and the unfortunate side-effect is that the degrees of
separation between us and those that are toxic have been significantly
reduced. Anonymity is maybe a small part, but there are plenty of openly toxic
people, both online and off.

Toxic people are just toxic people and I think it's a social issue that is
just always going to exist at some level. Maybe there are ways to use
technology to assist in fighting against it and maybe not. I think it's just a
potentially-unsolvable complex problem that will always arise in society.

I don't know the most effective ways to fight it. However, I'm trying my best
not to contribute to it, and maybe personally fighting it within myself will
have an outward effect on others. While I wouldn't say I'm mean/toxic/etc
online, I do try to stay self-aware of my actions/reactions/emotions and what
I post online (and have failed to do this sometimes) because it's very easy to
get caught up in negative news/misery and then it's easy to branch off from
there into an unhelpful level of anger/negativity.

I try to do my best not to assume the worst of others, to realize there are
beings with entire lives unknown to me behind every screen name (if it's not a
bot) and to realize it's sometimes difficult to properly infer the tone of
what someone is saying online. It's still an internal work-in-progress, but I
think I'm far more mindful of my behavior now than in years past.

~~~
criddell
I don't think OP is claiming toxicity is new, but the extreme polarization of
everything is.

Look into the lives of politicians in the 1950's and 1960's. They frequently
socialized together and worked together. They were able to find common ground.
Every victory for one party wasn't necessarily a loss for the other. How often
do you think Mitch McConnell goes out for a drink with Bernie Sanders?

Tribalism, especially in relation to things like sports, is super-old. But
it's spread to all parts of our lives. Often it's semi-friendly (are you a vi
or emacs person?), but it can get out of control.

So how do you fix it? That's tough, especially when every news story seems to
have two equal but opposite sides.

~~~
Ididntdothis
"Look into the lives of politicians in the 1950's and 1960's. They frequently
socialized together and worked together. They were able to find common ground.
Every victory for one party wasn't necessarily a loss for the other. How often
do you think Mitch McConnell goes out for a drink with Bernie Sanders?"

That's the problem with transparency and constant scrutiny. In the 50s and 60s
you could hide affairs like Kennedy did. Or you could have a drink with people
from the other party, discuss things openly and find a deal without anybody
knowing. If McConnell met with Sanders today there would be a huge uproar.
There is something to be said for the ability to make backroom deals.

~~~
criddell
It isn't necessarily about making backroom deals. The tribe mentality is so
deep that I'm not sure that Sanders and McConnell would recognize that the
other person is trying to do what they think is best for the country. I'm not
sure they could sit down and talk about their families or their favorite place
to travel.

~~~
Ididntdothis
I just think the tribe mentality is now reinforced by the constant news cycle.
A lot of sane people have probably left Congress, and now you have a lot of
hyperpartisans who like fighting for its own sake.

------
bsenftner
So, after a day of nothing but "this is the way it is" comments, I gotta say:
this is not acceptable.

I've been pondering this issue for over a decade. I think the general attitude
is partially right: we can't do anything about troll postings. But we can
create incentives for troll (meaning nonconstructive) comments in online
discussions to funnel together, while non-troll conversations seeking
literally any goal have an incentive to sustain themselves through something
similar to the "like" mechanism.

The basic idea is rather than a single comment and reply widget for online
articles and posts, the "comment reply" widget has an "attitude indicator"
that declares "this post is in this attitude". Others give the post a +/\-
rating for how well it fits the attitude.

It is a simple mechanism that alters how online discussions are handled. It
has to be simple, or it will not be adopted. The change could be significant,
enabling a diversity of on-topic conversations: just pick your attitude, your
sub-culture, and talk with birds of a feather. Today, no serious conversation
can take place online without random trolls stomping on everyone's china -
all because we have a single reply widget funneling all voices together into
a troll-happy mess.
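
A rough sketch of what that widget's data model might look like (all names
here are hypothetical, just to make the mechanism concrete):

```python
from dataclasses import dataclass, field

# Every comment declares an attitude, and readers rate how well the post
# fits its declared attitude - not whether they agree with it.
@dataclass
class Comment:
    author: str
    text: str
    attitude: str  # e.g. "constructive", "devil's-advocate", "snark"
    fit_votes: list = field(default_factory=list)  # +1 / -1 ratings

    def rate_fit(self, vote):
        assert vote in (+1, -1)
        self.fit_votes.append(vote)

    def fit_score(self):
        # Off-attitude (trollish) posts sink within their own lane.
        return sum(self.fit_votes)

def thread_by_attitude(comments):
    """Group comments into lanes by declared attitude, so readers can pick
    the sub-conversation they want."""
    lanes = {}
    for c in comments:
        lanes.setdefault(c.attitude, []).append(c)
    return lanes
```

The point of the design is that trolls funnel into their own lane: a troll
post either declares a troll attitude (and is easy to skip) or mislabels
itself and gets voted down for not fitting.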

Anyone else thinking of possible solutions?

------
aantix
Our voices are nuanced. Our facial expressions are nuanced.

Text is not, unless you're a top-1% writer.

Would love to see more experiments with video and voice to bring back the
subtleties of human communication.

------
atoav
I think one of the driving issues is filter bubbles. The intersecting areas
between different political directions are getting smaller and smaller;
almost nothing exists that is dedicated to healthy discussion between
different schools of thought. Platforms like YouTube are making this worse,
because their recommendation algorithms pull people towards more extreme
content instead of towards commonly agreed-upon material.

At some point we will have to admit that big online platforms aim to change
behaviour, create engagement, etc. All of this has side effects, and these
side effects have tangible political consequences. The tangible political
consequences in turn have real, hard consequences for individual persons and
groups of people.

We technicians need to think more about the social impact of the things we
build. If it is a nice business idea but is a bad idea in the sense of Kant's
categorical imperative, just scrap it and move on. Ideas are cheap.

Also: get away from the screen. Meet and talk, even with people with whom you
disagree. Try to keep your own bubble permeable and aim to do the same for
others - you will not win your culture war anyway, so don't try to.

Instead try to widen people's horizon by taking them and their ideas
seriously, even if it feels stupid to you.

------
yodsanklai
> So how do we fix this?

Do not participate in forums where this type of behavior happens. And
participate actively in moderation (flag/upvote/downvote) to keep forums civil
and interesting.

I'd say forums aren't well-suited to discuss political issues. But I agree
that tech could help. For instance, Stack Exchange Politics is readable but
last time I checked, it didn't really provide much value. Mostly, arguing
about politics is a waste of time, on a forum or anywhere else.

------
prirun
My theory is that politicians (and religions, and any other group in power)
want polarization so that the plebs stay fighting amongst themselves rather
than uniting against their overlords. Much better to have us yelling at each
other about sexual preferences, sexual identities, religious zealotry, and
_anything_ else people get highly charged about, rather than all of us arguing
with them. Can't have the plebs be united against the masters...

------
arexxbifs
I recently stumbled across an unlikely part of the web - I think it was the
Google reviews for a hotel in Nigeria - and people were using it as a random
web chat. It was very uplifting to see complete strangers just politely saying
hello to each other, all from a place where good connectivity still might not
be taken for granted.

It reminded me very much of how I first experienced the net in 1995: I was too
amazed at talking to some random person from across the world to bother with
petty arguments about politics or religion.

Now the novelty has worn off and, perhaps more importantly, for some people
_it's never been new_ - they weren't around to consciously, and with great
curiosity, try it out for the first time. Add to this the factor of
prevalence, the scale that brings, and how centralized things have become.

I used to hang out on different special-interest forums and IRC channels, and
I have to say there was as much drama, cliquishness, in-fighting and
groupthink there as there is now. It was just smaller and more self-contained.

With today's massive community tools like Instagram, Facebook and Twitter,
where there are no clear boundaries between factions or groups of interest,
it's just a free-for-all: there's always someone willing to argue about
something, and they're all on the same platform, which they have access to
_all the time_, as opposed to having to deal with the diplomacy of dial-up
and a shared family phone line.

If tech is to blame, it is only because it's too cheap and ubiquitous: The
Great Invention that was supposed to bring us together finally has, and it
turns out that we just fucking love to argue.

------
banner2018
A co-worker suggested I be "blindly optimistic"; I scoffed at the suggestion.
But it seems to be helping me. I hope it helps others too, so I'm sharing it
here.

~~~
tacocataco
It sounds to me like your coworker is pushing a faith-based belief system on you.

------
mLuby
Improve the economy.

Humans confronted with scarcity (real or imagined) react with rational self-
interest. They react more easily and strongly to perceived threats, become
selfish for themselves and their in-group, and over-demonstrate their
allegiance to that in-group.

Make people feel (economically) safe, that they have opportunities to
accumulate wealth, and can generally do what they want. They'll be less
fearful, and less afraid to see Others getting ahead as well.

------
eterps
I believe consensus building tools could help a lot to guide toxic
conversations in a more civilized direction.

Check these articles for ideas how that might be accomplished:

[https://news.ycombinator.com/item?id=22203937](https://news.ycombinator.com/item?id=22203937)

[https://news.ycombinator.com/item?id=21362182](https://news.ycombinator.com/item?id=21362182)

~~~
kangnkodos
These articles describe the vTaiwan system, developed in Taiwan to build
consensus.

"People compete to bring up the most nuanced statements that can win most
people across" different factions.

It includes an iterative process of proposing new statements on a topic and
then voting on the statements. Over time, statements are developed which more
and more of the voters agree with.
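
The propose-and-vote loop described above can be sketched as a toy ranking function. This is only an illustration of the idea of rewarding statements that win agreement across factions; the real vTaiwan/Pol.is system clusters voters by opinion and is far more sophisticated, and every name and number here is hypothetical:

```python
# Toy sketch of consensus-oriented ranking (NOT the real vTaiwan/Pol.is
# algorithm): score each statement by its *minimum* agreement rate across
# factions, so only statements that win people over in every group rise
# to the top, while divisive statements sink.

def agreement_rate(votes):
    """votes: list of +1 (agree) / -1 (disagree) from one faction."""
    if not votes:
        return 0.0
    return sum(1 for v in votes if v > 0) / len(votes)

def consensus_score(votes_by_faction):
    """A statement is only as strong as its weakest faction's support."""
    return min(agreement_rate(v) for v in votes_by_faction.values())

def rank_statements(statements):
    """statements: {text: {faction: [votes]}} -> texts, best first."""
    return sorted(statements,
                  key=lambda s: consensus_score(statements[s]),
                  reverse=True)

# Hypothetical ballots: one divisive statement, one bridging statement.
ballots = {
    "ban everything": {"A": [1, 1, 1, 1], "B": [-1, -1, -1, -1]},
    "label rules clearly": {"A": [1, 1, -1, 1], "B": [1, 1, 1, -1]},
}
```

A statement loved by one faction and hated by the other scores zero, while a modestly popular bridging statement ranks first.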

------
honkycat
I can't speak for this site, but for the internet overall: There will always
be teenagers and trolls.

Teenagers will always find it amusing to say naughty things and be
transgressive. Trolls will never bother finding something more interesting to do.

It is like a prank call. You used to call the local pizza joint, and ask for
orange chicken, annoying the manager until they flew into a rage. Now you get
on a comment section and annoy people.

------
buboard
You can't. The internet has grown in size irreversibly. This is more than
Eternal September: when you have billion-person "communities" you no longer
have a community, you have a large mob. In order for discussions to have
depth, they need a certain shared framework of understanding and a certain
level of agreement on the definitions of words; in other words, a baseline
common culture. The more people there are in a community, the thinner this
common culture gets, as only the few common denominators survive. At that
point the discussion can no longer have depth, because people will be arguing
constantly about definitions and nothing productive will ever occur.

The old story about the tower of Babel is true. It seems humanity has faced
these moments before. The solution is to separate the bubbles in separate,
more cohesive communities. Ideas need such sheltering in order to grow, even
if this means they won't blend easily with other ideas.

> Do we ditch them and go back to a literal timeline?

Probably doesn't help. Look at YouTube comments. You can set Twitter to a
chronological timeline; it doesn't make a lot of difference.

------
hinkley
Honestly, I don't think we're the people to do it. Software developers are
grossly susceptible to the fallacy of the excluded middle. We spend so, so
much time arguing about extremes (see also yesterday's post about monoliths
being the future), while those calling for moderation see only modest amounts
of support.

If we want to be those people, the answer is 'personal growth', and lots of
it.

~~~
tacocataco
Considering how much the Overton window has shifted since the '80s, I find
your "moderate" position extreme.

------
menacingly
Virtually everyone can agree that toxic voices are undesirable; it's the
equivalent of saying "evil people should be in jail."

Well, of course, but it's our inability to mutually agree on the terms "bad"
and "toxic" that makes this policy fundamentally impractical, and freedom of
expression the only workable solution.

These are not new problems, they're old problems in a new suit.

~~~
oneplane
The difference lies in whether the behaviour is abusive or not. Freedom is not
absolute or one-directional; as the saying goes, your freedom ends where
someone else's freedom begins.

------
ssivark
While there is definitely an uptick in polarization, I think the more
important and insidious trend is the increasing _ubiquity of bullshit_. We are
flooded with information online, most of which is irrelevant or
counterproductive to our interests. Needing to constantly wade through that in
triage mode (with a judgmental attitude) has the general effect of wearing
down people’s psyche and making them feel exhausted, anxious and on the edge.
In that mindset, of course small things are going to set them off — but that
is just the symptom, not the root cause. This is also a much harder problem to
grapple with because it is less specific. The causes of this problem are
deeply embedded in the incentives we have set up on the web over the last two
decades; to challenge those will require answering some hard questions. At
some level, people realize this (hence small efforts like the slow-tech
movement, digital detox, etc.), but they have yet to find the right balance of
convenience and sanity (for lack of a better word).

------
chestermacwerth
The current state of things is result of

(1) Normie-fication of the internet. Regular people who aren't necessarily
enthusiasts have a tendency to want or expect the internet to mirror their
daily life; they're not escaping into something better, they're just using the
internet as an extension of their normal life.

(2) An obsession with the self. Everything on social platforms today is self-
promotion. "Showing off," to put it simply. When the bulk of your internet
activity is focused on self-promotion rather than pursuing an intellectual
interest, you are naturally exposed to criticism. Some of that criticism will
be harsh or inconsiderate, but frankly, that is just to be expected, and it's
always been this way.

In terms of practical, "what can we do," there's only one thing I can think
of: getting rid of identities. The identity-obsessed, self-promotional social
platforms will always be doomed. There is no way to "make people be nice."
There is only a means to control the extent to which identity is exposed.

------
justizin
> Before, around 2010-2012, people who disagreed would usually leave it at
> that and walk away respectfully.

Bruh, you have clearly never been to Usenet lol

------
tboyd47
What we're calling polarization and toxicity is actually two problems:
disagreement and bad manners.

Disagreements are always solved the same way. Either by coming to an agreement
or agreeing to disagree.

Bad manners are a much bigger problem, with a variety of solutions that may or
may not work and are totally up to the individual. Bad manners also prevent
problem #1 (disagreements) from being resolved.

------
austincheney
There are two forms of toxicity. This is vitally important to understand
before suggesting any solution.

* single-user - evidenced by harassment, trolling, and attacks

* group - evidenced by an echo chamber, typically revealed by downvotes drastically outnumbering replies, or by replies that call for silencing or solicit recruitment

Both forms are equally toxic, but they are not equally recognizable or
distinguishable. A lone attacker, for example, is typically repulsive, stands
out as a bold violator to most users in a well-moderated environment, and so
rarely attracts supporting attention from other users.

Group attacks, by contrast, are typically benign at first, but misery loves
company. The negative attention grows on itself, drawing in insecure users who
tend to fear diversity. While group toxicity may start out benign, its toxic
nature becomes starkly apparent as it is allowed to fester, resulting in
comments that are direct and hostile attacks claiming justification from group
support or agreement with a premise.

Toxic groups do occur on HN, but they are rarely able to grow wildly out of
control in any measurable way, because downvotes to any contribution are
capped at -4. The only evidence of a group attack in an online environment
like this is the quantity of downvotes and the nature of the reply comments
present.
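
The signal described above (downvotes drastically outnumbering replies, under a capped comment score) can be sketched as a toy heuristic. The cap, ratio, and function names are my illustrative assumptions, not Hacker News's actual mechanics:

```python
# Toy heuristic for the "group toxicity" signal described above: a comment
# whose downvotes drastically outnumber its replies suggests a silent
# pile-on rather than engagement. The threshold values are assumptions.

SCORE_FLOOR = -4  # HN-style cap: a comment's visible score stops here

def capped_score(upvotes, downvotes, base=1):
    """Visible score with a floor, so pile-ons can't run away."""
    return max(base + upvotes - downvotes, SCORE_FLOOR)

def looks_like_pileon(downvotes, replies, ratio=3):
    """Flag when downvotes outnumber replies by `ratio` or more."""
    return downvotes >= ratio * max(replies, 1) and downvotes >= ratio
```

With the cap, fifty downvotes and five downvotes look identical to readers; the heuristic recovers the difference from the downvote-to-reply ratio instead.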

It is my opinion that toxic group behavior is a more serious concern than the
lone wolf, because the lone-wolf flamer is easier to identify. In many cases
users have no idea they are contributing to group toxicity, since conformance
without explanation may feel natural. It's also more serious because it is
substantially harder to correct.

Either way the nature of toxic behavior is about attention whether it's to
draw attention to a single user's contribution or to silence a disagreement.

------
cagenut
I largely agree with most of what's already been posted. Many of these things
have been true since dial-up BBSs and Usenet. We're mostly just seeing them
apply to more and more of the population.

But for a moment consider another factor, a broader scope. One of the ways I
like to describe the internet, and its effect on the first generations to
adopt it, is "we can share notes now".

What I mean by that is: previously a combination of physical barriers,
geography, institutions, power structures, and communication constraints made
it such that the vast vast majority of us really only got information from,
and discussed information with, people within a day's-walk radius of where we
lived. On the upside that meant lots of it was face-to-face and rooted in your
family and community. On the downside that meant we were _all_ incredibly
isolated and living in an almost completely false understanding of the world.

The internet has pierced (or broken) that. I can watch in near-realtime as
forest fires drive the people of Sydney from their homes. I can watch the
chart of the Wuhan flu grow exponentially every day. I have seen high-def
video of the slums in India, Nigeria, and Sacramento.

People of the past had far more first-hand experience with suffering than I
do, but they hadn't the slightest inkling of the _scale_ that we are all as
aware of as we choose to be now.

I think that combination of your brain reeling at the vastness of "the
problems", in conjunction with the powerlessness you feel about it, leads to a
sort of emotional shrieking in horror that plays out as "toxic" posting.

Which is in part to say, I don't think it's going to get better anytime soon.
As a matter of fact, as the challenges and constraints of climate change add
pressure every year, it seems very likely to get worse.

------
anonymousiam
606 comments... Some good, some not so good.

Human nature will prevail. Guerrilla tactics will continue to be used by
organized groups to shame or banish those who do not conform to their cultural
views. High-profile people will continue to be shamed for associating with
"bad" people. The era of civility and mutual respect on the Internet died over
20 years ago.

------
ccsnags
I think it is more of a social problem we can’t solve with code.

As methods of communication get cheaper and more accessible, it opens the
world up to see what humanity is, not what humanity wants itself to be.

People that maliciously attack those online are broken people. Broken people
exist in reality, thus broken people will exist online.

Google and Facebook can write code to stop harassment, but they can't do so
without marginalizing people who are already on the fringes of society. Plus,
their systems can be gamed. The result of such changes would be stopping some
harassment at the cost of accessibility for everyone, with a metagame playing
out in the background that costs the corporations engineering resources.

The best answer right now, in my opinion, is to create tools to fight
harassment, but give those tools to the users of the site rather than
automated algorithms. A good example of this is a mute feature. Other tools
like customized word filters, block lists and safe lists can also help empower
users.

~~~
ianai
I’ve got a solution. Have each domain reflect the nation of its authorship,
i.e. a U.K. vlog has to post to a .uk domain. This is really only meant for
the big media platforms like YouTube, Twitter, FB, etc. What users would see
at the .com site is all of their local content, maybe with a trending-globally
section.

------
maxk42
I think the rising sense of division is caused by social networking sites that
are increasingly interest-based or focused on small communities. When every
interaction is focused around people who already agree with you, it begins to
feel like the whole world agrees with you on everything. This is a pretty
pleasant feeling. But then, when someone wanders into your sphere with a
different point of view, the interactions are vitriolic. This is most notable
on social networking sites like Twitter where you mostly follow / interact
with people who agree with you, but randos can jump in out of nowhere and stir
the pot.

I believe that we, as a society, need to engage with one another better. Be
less quick to judge. Be less quick to mock. Learn to get along with people who
hold views antithetical to our own -- just like we have to in the real world.
And when someone comes into your community with the explicit intent of
disruption, simply ignore them.

------
notadoc
It's far beyond the web; polarization and toxicity are now all over politics,
media, outrage mobs, protests, etc. It's increasingly reminiscent of cultural
revolutions and other negative historical events, which is not a good thing.

To start:

- Stop using social media

- Don't use services that have public digital scorekeeping and other vanity
metrics centered around ego and narcissism

- Recall that the 1st Amendment is the first amendment for a very good reason

- Recall that it's perfectly OK to disagree with others, no matter the
subject

- Recall that the more difficult the subject, the more nuance is often
required

- Reject divisive political and social movements

- Reject identity politics

- End outrage mobs

- End PC policing nonsense

- End cancel culture

- Don't engage in 'call out culture' or 'woke' nonsense, which incentivizes
toxicity, division, and mob behavior both online and offline.

As for future generations, perhaps K-12 schools should have mandatory studies
of inquisitions, mob mentalities, lynchings, witch burnings, and cultural
revolutions, and of the tremendously negative outcomes associated with those
types of behaviors.

------
dragonwriter
> Before, around 2010-2012, people who disagreed would usually leave it at
> that and walk away respectfully

That's never (even before the internet) been true in unmoderated open online
fora, and it's still true in closed and heavily moderated fora that seek that
outcome.

Unmoderated open fora have probably become more dominant because effective
moderation doesn't scale.

------
cjfd
There should be boundaries to what is considered a respectable opinion,
boundaries that exclude both the extreme right and the extreme left. If people
do not express these off-limits opinions, they should be considered
respectable and treated as such in discussions. An example of an off-limits
extreme-right opinion is anything promoting the racial or ethnic inferiority
of one group or another. An example of an off-limits extreme-left opinion
would be attempts to blame all differences in society on discrimination.
Anything less extreme than that should be considered fair game in the
marketplace of ideas. The extreme left and right ideologies cost tens of
millions of people their lives in the twentieth century and should not be
considered respectable. Anything less extreme might, depending on the
circumstances, actually be a solution to some problem that is occurring right
now somewhere in the world, and should at least be considered.

------
blippage
Progressives think that the alt right is toxic, and vice versa.

So then the question is: whose view of toxicity is right? The answer we have
so far is: whoever protests the loudest or whoever controls the medium.

Although it's the answer we've got, it's far from obvious that it's the right
answer.

Progressives seem to have gained the upper hand at this point.

~~~
leoh
They're both right. Both positions are often held without kindness,
compassion, or a desire to maintain or increase everyone's wellbeing.

~~~
wonderment
It seems to me that each side is right about the other.

------
mirimir
I really doubt that people have overall become more contentious and
disrespectful. It's just that, with social media and the Internet generally,
it's so much more visible. People ranting about this and that were always
there, but you had to go out of your way to hear them.

Back in the day, many Usenet groups were OK during summer, when only
professionals were online, and suddenly became toxic in the fall, as students
went online. And then, with the first public ISPs, they became toxic all year.
That is, Eternal September.

So we can choose where to hang out. If we want interesting discussions, we can
frequent highly moderated sites, such as HN and professional subreddits. If we
want toxicity, we can frequent the chans or whatever.

Edit: I ought to have said that some "toxic" sites may be useful for finding
stuff that's suppressed elsewhere. If for no other reason, to know what's
happening.

~~~
arh68
I agree. There used to be many bars/restaurants/clubs, and you wouldn't
naturally bump into the clubs you disagree with. There even used to be a ton
of phpBBs/vBulletins out there. Even different subreddits/imageboards mostly
keep to themselves (sometimes a bit too much..), raids notwithstanding.

But there's only one Facebook. I guess the network effects give monopolies
such a strong advantage that we tend to end up with maybe one or two of each
kind of social network (Reddit is the exception, with N subreddits). It forces
all kinds together. If there were real competition, we wouldn't so often
confuse deplatforming with denying speech (IMO; the line gets blurry).

I think HN has mostly dodged this issue through persistent moderation, a niche
userbase, and people usually assuming good faith.

------
soheil
I think a "just let it happen" attitude, like Google AdSense's with the
fake-clicks problem, may be the solution. At one point, instead of fighting
bots and publishers' tactics of clicking on their own ads, Google decided to
allow the fake clicks, reasoning that if the actual person clicking the ad is
not going to buy a product, the ROI of ads on that particular publisher's
website would drop. Over time the system would correct itself and just show
fewer ads on that site, or lower its cost per click for ads shown on it, hence
solving the fake-click problem in a clever way.

I think if we employ the same tactic and let polarization happen online, there
will be communities where the ROI of discussion drops exponentially, and there
will be places where less of that stuff happens for whatever reason; those
will be the truly engaging places to hang out online.

------
tnel77
I won’t act like I have the answer, but I do have a complaint about the state
of our news sources (specifically focused on the USA).

A lot of people like to throw around the term “fake news” when they see an
article from a news source on the opposite side of the aisle. While fake news
does exist, CNN and Fox News are not fake news. They produce heavily biased
news. There is a major difference. While both outlets have been caught lying
before, both of them are usually telling the truth. It may be only half the
truth, and may purposely mislead the viewer, but that doesn’t mean the
information being shared is fake or false.

I’m relatively young so I don’t know if there has ever been anything remotely
close to “unbiased news,” but just presenting the public with facts that
aren’t tarnished by bias would go a long way to helping people become more
informed and, potentially, less worked up.

~~~
cr0sh
What you may or may not know is that at one time, there was a doctrine by the
FCC known as the "Fairness Doctrine":

[https://en.wikipedia.org/wiki/FCC_fairness_doctrine](https://en.wikipedia.org/wiki/FCC_fairness_doctrine)

"The fairness doctrine had two basic elements: It required broadcasters to
devote some of their airtime to discussing controversial matters of public
interest, and to air contrasting views regarding those matters."

This doctrine was eliminated in 1987, after a series of lawsuits about it.

Arguably, it's part of what started this whole mess we now find ourselves a
part of.

------
zhoujianfu
Probably fixing the root cause of political divisions would have the biggest
effect.

Stuff like ranked choice voting and open/no primaries that would result in
more moderate politicians winning and less need for political parties.

Also a UBI might help diminish an “us vs them” mentality. The more good
policies we have, the more we ALL benefit.

------
SubuSS
I think it is a result of the network effect. The same window also marks the
rise of social networks, so the toxicity is more in your face and you almost
have no way of escaping it. Even if you're not on Facebook, your friends are
talking about it, Hacker News has articles on political issues, and so on. It
permeates all your 'safe spaces'.

I also think non-CS folk are entering the internet en masse and thereby
changing the rules and norms, which generates mania over things that were just
ignored before.

How do we stop it? I don't think we stop it; we build resistance to it. Every
generation has its own affliction, and for us it is the bad effects of this
network. Over time, we will start ignoring the noise and using these platforms
for what they are actually useful for (or the next affliction takes over).

------
ionforce
More moderation. More value judgements (and actions) need to be taken against
content that is deemed unsavory. Bans, etc.

Higher barriers of entry. The right to comment and occupy mind space is given
away too easily. Penalize new accounts, start with limited capabilities,
disallow short comments, etc.

Highlight exemplary content.

~~~
teleclimber
> More moderation.

Agreed. However, moderation power is something that needs to be considered
carefully. I think moderation power should be distributed broadly to users who
have shown good stewardship of the community through their participation, and
not to a few chosen individuals.

> Higher barriers of entry.

Yes this is the key. It should take work to occupy mind-space. Your first
comments should be shown to very few people, probably experienced users in the
community, and their approval or disapproval will determine how much
visibility your next comments get.
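
The graduated-visibility idea above can be sketched as a simple audience-ramp function. Every number and name here is an illustrative assumption, not a description of any real site's mechanics:

```python
# Sketch of graduated visibility: a newcomer's first comments reach only a
# small pool of vetted users, and the approval ratio they earn widens (or
# narrows) the audience for their next comment. All thresholds are
# hypothetical.

SEED_AUDIENCE = 5        # first comments reach only this many vetted users
MAX_AUDIENCE = 10_000    # fully trusted accounts reach everyone

def next_audience(current, approvals, disapprovals):
    """Grow reach on net approval, shrink it on net disapproval."""
    if approvals + disapprovals == 0:
        return current          # no feedback yet: hold steady
    ratio = approvals / (approvals + disapprovals)
    if ratio >= 0.6:
        return min(current * 2, MAX_AUDIENCE)
    if ratio <= 0.3:
        return max(current // 2, SEED_AUDIENCE)
    return current
```

The design choice is that reach is earned multiplicatively: a consistently approved newcomer doubles their audience each round, while a disapproved one is quietly throttled back toward the seed pool rather than banned.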

------
kangnkodos
I saw a recent study that said frequent posters on twitter are likely to be on
the extreme left or the extreme right.

I don't have any data on how this has changed over time, but my guess is that
social media polarization has increased over time.

People who are in the middle, or have some beliefs on each side, or are
willing to seriously consider points from both the left and the right just
don't post much anymore. On the rare occasion that they do, they get downvoted
to the point that they are invisible.

So yes, when you look at social media you see the extreme left and the extreme
right lobbing insults at each other. Or you might catch a glimpse of the
extremists working to silence the moderates, if you dig deep enough.

I don't think it used to be as bad as it is now.

The polarization is continuing to get worse and worse.

How do we stop it? I wish I knew.

------
zzo38computer
Use user-defined kill-files (you can publish yours if you want, in case
someone wants to copy and modify it for their own use, or decide not to use it
at all if that is their preference). Allow anyone to comment and to reply to
anyone, but remember that someone who does not want to see your messages might
set up a kill-file to hide them (whether or not you are an administrator of
the service, or police, or whatever). We need more freedom of speech, not
less. But freedom of speech should not mean forcing everyone to listen to you
if they do not want to listen. (Kill-files could also be based on whatever
criteria you want, not necessarily only the author of the message, although
that would probably be the most common case.)
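
A kill-file of the kind described is essentially client-side filtering: everyone may post, but each reader decides whose messages, or which patterns, they never see. A minimal sketch, assuming a toy rule format (`author:` and `pattern:` lines) that is purely hypothetical:

```python
# Minimal client-side kill-file: filter by author name or by a regex over
# the message body. The rule syntax and field names are made up for this
# sketch; real kill-files (e.g. in Usenet newsreaders) use richer formats.

import re

def load_killfile(rules_text):
    """One rule per line: 'author:<name>' or 'pattern:<regex>'."""
    rules = {"authors": set(), "patterns": []}
    for line in rules_text.splitlines():
        line = line.strip()
        if line.startswith("author:"):
            rules["authors"].add(line[len("author:"):])
        elif line.startswith("pattern:"):
            rules["patterns"].append(re.compile(line[len("pattern:"):]))
    return rules

def visible(message, rules):
    """message: dict with 'author' and 'body' keys."""
    if message["author"] in rules["authors"]:
        return False
    return not any(p.search(message["body"]) for p in rules["patterns"])
```

Because the filter runs on the reader's side, publishing or sharing a kill-file costs the author of the filtered messages nothing; it only changes what the subscriber sees.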

------
MattyRad
It's worth reposting the related thought piece from last week:
[https://www.ribbonfarm.com/2020/01/16/the-internet-of-beefs/](https://www.ribbonfarm.com/2020/01/16/the-internet-of-beefs/)

------
jeffrom
Here's the idea that imo needs to reach critical mass in our industry in order
to make progress on this issue
[https://en.wikipedia.org/wiki/Paradox_of_tolerance](https://en.wikipedia.org/wiki/Paradox_of_tolerance)

------
ddebernardy
Tech isn't innocent, but you can't blame it all on tech either. Polarization
has been a thing for decades upon decades. In the US a major shift occurred
around the Civil Rights movement, after which Southern voters drifted away
from the Democratic Party and coalesced around the Republican Party. US
politics has been polarizing along party lines ever since.

IMHO you won't fix it with technology as it runs much deeper. But you can
remove comments to sweep a big part of the toxicity under the rug.

If you'd like a good overview of the longer story, Ezra Klein, who just
released a book on the subject, was on Chris Hayes' "Why is this Happening?"
podcast earlier this week ("Why we're polarized, with Ezra Klein").

------
hadiz
It's a problem IRL, too. I am astonished by how uninterested people are in
each other. I am very good at keeping a conversation going: I become genuinely
interested in the person and ask as many follow-up questions as possible.
Unless I know the topic falls within my own experience, I refrain from talking
about myself.

But what I see constantly is that people have gone from a "what about you"
mindset to "what about me" mindset. My conversations end as soon as the person
is done answering my questions about them. Like, zero reciprocation, which is
god awful.

If these people participate online, where they can mainly get away with being
a shitty personality, then this toxicity is not that surprising to see.

~~~
toomuchtodo
User experiences are crafted to make the user the center of attention, or to
give them the levers to do so (Instagram and Facebook, for example). The
result, while depressing (and what you observe), should be of no surprise to
anyone.

I have no solution, only a small piece of advice: seek out others with empathy
and active interest in others. Avoid narcissists like the plague. Be humble.

------
movedx
When I started my training brand (The Cloud Coach), I wanted to share my
knowledge with the world through video training. What I've actually come to
understand is that I have a deeper desire: I want a community of people with
whom I can have pleasant, on-point discourse.

I believe others want this too.

To satisfy this desire I'm developing two things. The first is a Discord
community open to all, with a private section for paying customers. I believe
that when people pay for access to a community that expects a certain
behaviour from them, they run with it.

Secondly, I'm developing a localised Meetup called Brisbane CloudOps. I want
to enable physical meetings and discussions too.

I think the answer, to be honest, is smaller, niche, private communities and
not vast open arenas.

------
gjsman-1000
A good first step would be to teach the old maxim "don't believe everything
you read" in schools more. Most kids have heard this phrase only a few times
in their lives and never really apply it in their everyday lives. It would do
a lot of good.

------
interlinkedcell
Everyone in the anglosphere is always online and can talk to each other now.
Meanwhile our culture is pretty polarized. Conversations that could never be
had in the time of top down privately owned broadcast & newspaper media are
now unavoidable.

I don't think it's a "problem" and I don't think there is or should be a
technocratic solution.

This is the new normal, where political struggle has evolved into a prolonged
people's posting war and civility is not respected. This is freedom of
expression, and it's a reflection of ourselves and society; you can break the
mirror if it upsets you, but it won't change the underlying truth.

It's probably going to be like this for a while, and that's okay.

------
egypturnash
Sharply curtail the ability of companies to pour money into politics. Overturn
the "Citizens United" decision that lets corporations be "people" whenever it
pleases them to be so, especially for purposes like pouring a lot of money
into a reactionary politician who stirs up a lot of hate and makes lots of
regulatory decisions in corporations' favor.

Limit how many media outlets any one entity can own. When all the news and all
the entertainment is owned by a giant inhuman corporation whose bottom line is
solely profit, it can and will act much more in its own interest than in the
interest of any of the individual humans that make up the corporation, never
mind the humans it provides news and entertainment to.

If you have a shitload of tech money, move the fuck out of Expensive Tech
City, stop giving your money to landlords, start giving it to organizations
and politicians who are working to put back the barriers to this sort of thing
that corporations have been relentlessly campaigning against for their entire
lifespan. If you _run_ a tech company that's making obscene profits, then
start sinking some of _that_ into that sort of stuff. Fuck "increasing
shareholder value"; have your accountants figure out creative ways to stiff
your VC, if you've got that to deal with, instead of finding creative ways to
pay less tax. Give money to unions: unions work to make shit better for every
worker, and right now part of why people are so easy to stir up is that they
are broke and afraid of the fact that they are one financial crisis away from
homelessness/death/etc.

Oh also I guess yeah stop trying to make another goddamn Zuckerfortune by
relentlessly promoting whatever creates "engagement" in your new social media
site and not giving two shits about the fact that nothing gets people to keep
coming back to spend time on your hellsite like an intense argument, because
all you care about is how long they're there looking at the ads you've
parasitized
all around their human-to-human communications. Social sites are not helping
but this shit has been building since before "social media" was even a thing
anyone said.

------
drewcoo
Calling this "toxicity" seems itself to be a polarizing exaggeration.

Dealing with the eternal September of new users was a Sisyphean task. I
assume that was an earlier stage of this supposed toxicity. Everyone
complained about it then, too, even though there was usually heavier human
moderation. No one could agree on how to handle the "problem."

Want to put controls on the damage software can do to society? Regulate. Maybe
even force us to be real engineers. I would like to think that software
killing people (e.g. 737 Max) would be a bigger factor in that legislation
than social networks encouraging meanness. The economics of the two seem to
favor the former more, too.

------
rossdavidh
Are you sure this is tech-related? Professor Peter Turchin, writing in 2012 (I
saw a rough draft of the book he eventually published) predicted a peak in
American polarization in around 2020, based on trends that go back many
decades. We certainly see it on the Internet, but there's good evidence it
would be happening regardless, because it can be (and was) predicted based on
trends which go back well before the Internet. [https://www.amazon.com/Ages-
Discord-Peter-Turchin/dp/0996139...](https://www.amazon.com/Ages-Discord-
Peter-Turchin/dp/0996139540)

------
m1n1
The U.S. FCC's fairness doctrine "required licensed radio and television
broadcasters to present fair and balanced coverage of controversial issues of
interest to their communities, including by devoting equal airtime to opposing
points of view." It was mostly repealed in 1987 under Reagan. A few provisions
lasted a little longer.

I think some radio shows etc have greater ability to be polarizing with this
out of the way, and our respective bubbles freer from outside disturbance.

[https://www.britannica.com/topic/Fairness-
Doctrine](https://www.britannica.com/topic/Fairness-Doctrine)

------
altitudinous
YES, UPVOTE SYSTEMS ARE POLARISING.

Reddit, Facebook, and this website are just some of the many sites with this
issue. It leads to echo chambers and an inability to see the opposing point
of view, as it is literally hidden.

As a starting point, the highest karma should be awarded for the highest
volume of votes where the plus ones and minus ones sum to zero. That way you
are awarded karma for presenting both sides of an argument, not for being
just some karma whore mindlessly agreeing with someone else.
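
One way to read that proposal as code (a toy sketch; the function name and
weighting here are my own invention, not any site's actual algorithm):

```python
def balanced_karma(upvotes: int, downvotes: int) -> float:
    """Hypothetical score rewarding posts that draw votes from both sides.

    Vote volume counts for more the closer the tally is to zero, so a
    contested comment outranks a one-sided pile-on of the same size.
    """
    total = upvotes + downvotes
    if total == 0:
        return 0.0
    balance = 1 - abs(upvotes - downvotes) / total  # 1.0 when perfectly split
    return total * balance

print(balanced_karma(50, 50))   # contested: 100.0
print(balanced_karma(100, 0))   # one-sided pile-on: 0.0
```

A scheme like this would obviously need safeguards against deliberate vote
manipulation, but it illustrates the "sum to zero" idea.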

Reddit and Hacker News REWARD groupthink, and REWARD brigading. They are the
problem.

I enjoy posting counter arguments, but no-one ever sees them because they get
downvoted to invisibility quickly.

------
themagician
Make it harder to use, like it was.

Doing things on the internet used to be harder. It used to be a network of
loosely interconnected websites. You wanted to have a voice you made a
website. You wanted to talk to others you joined a chat room or newsgroup or
message board.

Then we invented comments. Now we have apps with SSO. You don’t even need to
make a username anymore. The entirety of the internet congealed into 50 or so
web properties.

Slashdot is still around. It’s relatively hard to use. Their moderation system
brutally punishes trolls and rewards good behavior. It also allows for jokes
and sarcasm.

------
snthd
We're evolved to be in a social environment (the physical world) that
regulates our social interaction.

The internet abstracts that all away. If there are natural limits on what and
how you engage with online they're vastly different.

We need to make the internet closer to what we can cope with.

From my light understanding of "non violent communication" I think the NVC
creator would take the view that the internet just amplifies our existing
deficiencies. I think it would be worth trying forums that impose NVC
patterns; perhaps NLP could be applied to automate this?
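
A toy illustration of what such an automated NVC nudge might look like (the
word list and function are invented for illustration; real NVC tooling would
need far more linguistic nuance than a keyword match):

```python
# Hypothetical pre-post filter: flag "judgment" language that NVC
# discourages, and nudge the author toward observation/feeling phrasing.
JUDGMENT_WORDS = {"always", "never", "stupid", "idiot", "wrong", "should"}

def nvc_flags(comment: str) -> list[str]:
    # Strip basic punctuation and compare lowercased words to the list.
    words = {w.strip(".,!?").lower() for w in comment.split()}
    return sorted(words & JUDGMENT_WORDS)

flags = nvc_flags("You always post stupid takes.")
if flags:
    print("Consider rephrasing; judgmental terms: " + ", ".join(flags))
```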

------
aaron695
Step one - You need to understand the problem (i.e. polarization != toxicity)

Step two - You need to ask the correct question. There are lots of answers
here that answer you but are useless.

Step three - You need to get people to answer the actual question

You need the better answers to rise to the top.

You need the answers to be directly actionable.

You preferably need it to be quantifiable.

You need to understand how it will work in a marketplace.

As an example, we should try to understand what happened when Twitter doubled
its tweet length. Did it make things better? Did it hurt their market? How
important is message size?

------
tigerstripe
Allow lawsuits against content providers, making them liable for what their
users post.

Hear me out: There's another article here about how the EARN IT Act would
attack Big Tech and reduce user freedom by limiting Section 230. Maybe
Section 230 has helped the Internet grow into what it is today, but it might
also be the reason for a lot of the toxicity.

I'm personally in favor of privacy and digital freedom, but if repealing that
law would turn the Internet into a forum for civil intellectual discourse, it
might be worth it.

------
AlphaWeaver
Thanks for asking!

Just wanted to point out to the thread: this is a question that is a great fit
for VC3 ([https://vc3.club](https://vc3.club)). I started VC3 to try and bring
together people to answer questions like this (and others) that pertain to
building more meaningful community through the Internet. If you're interested
in discussing this, you'd be a great fit! Send us an application and we'll get
you in the discussion!

------
scrollbar
I just got my preordered copy of Ezra Klein's "Why We're Polarized" which
professes to study this topic in detail, in particular how it relates to
politics in the US [https://www.goodreads.com/book/show/49930783-why-we-re-
polar...](https://www.goodreads.com/book/show/49930783-why-we-re-polarized)

I have a feeling this issue is pretty complex. Looking forward to reading more
about it soon.

------
learnstats2
How is this fixed, truthfully? By censoring or oppressing one side of the
argument, so that divergence from the status quo becomes invisible or perhaps
illegal.

People disagree; people have always disagreed. The fact that we are now
repeatedly being told this is "toxic" is part of an ongoing campaign to
suppress dissenting voices.

It seems often to be the case that speech which oppresses people is "free";
while speech which identifies that oppression is "toxic".

------
beardedman
> Even niche sites like HN are not immune.

I've not noticed a difference over the years TBH. HackerNews is substantially
worse than most other sites IMO as well. Ironic that I'm disagreeing with you!
Ha!

I just mean that maybe you're at the point in your life where those extremes
are apparent, and a degree of humility, understanding, and conversation goes
a lot further than most people think.

> Or is this more of a social problem that code can't solve?

More of a social problem I think.

------
throwaway55554
It's far easier to be negative than to be positive. I think people who would
normally try to be more positive have just given up trying. Everything has
gone downhill; there's virtually no serious journalism anymore (it's all ad-
money seeking with link-bait headlines), people who claim to be "leaders" are
not setting good examples (they act like bad children!), etc. So people have
just accepted that this is the way it is.

------
mnemotronic
Responsibility. Because of the inherently remote and anonymous context of
online dialogs and exchanges, they don't have the same level of personal
contact as direct person-to-person exchanges. People say things online that
they would not, or wouldn't dare to, say in person. So some of the greatest
advantages of the internet, its anonymity and remote nature, can be turned
into its greatest threats.

------
annoyingnoob
Too many people cannot turn off social media. Also, our 'elected' officials
and the media continually drive the polarization that you see - they don't
want any middle ground, they want to force us into an us or them choice and
it's working. The solution is finding middle ground where we can agree; we
need
to come at the problem from a place of mutual respect rather than a desire to
obliterate the other side.

------
veeralpatel979
The Internet lets people say things you wouldn't say in real life because you
can be anonymous.

I myself have some opinions that are controversial and not politically correct
and for reference I'm a well-educated software engineer. There are people out
there with far more extreme opinions than me, and the Internet gives them the
opportunity to share them.

And the reward of angering and fighting with people who disagree, with no real
consequences.

------
isensible
The current polarization has been caused by political correctness and the
inevitable push-back against expansive redefinition of previously well
understood terms such as: racism, sexism, and xenophobia.

Take immigration, people who wish to discuss immigration often end up getting
called racist or xenophobic, and the response is then: "f*ck you, I'm not a
racist". Thus you have polarized the community.

------
gjsman-1000
Not to be pessimist but more realistic of an approach: I do not believe there
is any way to stop the polarization now; especially as any attempt to stop the
polarization appears to only increase the polarization.

And, well, polarization isn't exactly the end of the world. It has occurred
throughout human history and is simply occurring once again; it seems to be
an integral part of the human condition.

------
ljm
I think it's a matter of what you choose to engage with, too. You could
surround yourself with toxic 'friends' and start wondering why the world seems
so angry now. So it is with websites and communities that you consider toxic
to your wellbeing.

There are plenty of parts of the web that aren't toxic or polarising; you've
just got to find the ones you enjoy interacting with.

------
harrisreynolds
Empathy is the one word that I think sums up the solution to this problem. And
it is beyond just the web. It is a broad cultural problem in the US.

Speaking generally we've lost our ability to put ourselves in someone else's
shoes.

This is the result of rampant tribalism and surely several other contributing
factors including the gasoline of social media magnifying the issue.

We gotta just start empathizing more!

------
say_it_as_it_is
I was hoping that cannabis legalization was going to mellow people out but
that seems to be taking longer than anticipated. In the absence of widespread
cannabis use around the world, I'd like to see people getting to know about
people unlike themselves and discovering everything they have in common. The
easiest way to destroy an enemy is by becoming a friend.

------
aurizon
The ability to be an anonymous troll has enabled the microminds that inhabit
trollspace. The loss of accountability did it. Since the first days of the
web this disease has festered - look at how many people have been harmed and
hounded to suicide on FB and similar. Private chat circles where all are
known have the inherent ability to defeat this - I belong to a number of
them. I have no wish to troll, and I do not, but if I did the admins would
soon kick me out.

China deals with this with strong rules on ID that limit the true anonymity
that enables trolls - does it work over there? I have no direct knowledge, as
I am not a member of any. The ability to create throwaway accounts on FB will
make this hard to control. I suppose making each person approve new contacts
as well as deny offenders would work - it would add a threshing aspect and
multiply traffic as these trolls created new logins, begged their way into
acceptance by a person, then offended them and got the boot. I think FB
thrives on the unhindered traffic; protecting people would reduce their cash
flow, so they will not do it unless a large government stick is used - as in
China.

I handle it, as I learned, by totally not engaging with trolls. As they say:
do not roll in the mud with pigs; you will get dirty and the pig likes it! No
offense to pigs intended. ;)

------
hughpeters
You don't. Instead you teach to identify trolls / toxic commenters and ignore
them.

Just like bullies, they're a fact of life, and the key to dealing with them is
to not take them seriously. A bully's goal is to get a reaction out of you so
to win - don't react.

Sticks and stones may break my bones but words will never hurt me - this
applies online and offline.

~~~
anigbrowl
_A bully's goal is to get a reaction out of you so to win - don't react._

That doesn't work. Bullies target weak people who they perceive as being
socially isolated and less likely to retaliate. If they don't get the reaction
they want, it's not unusual for verbal bullying to escalate to physical
violence. There's abundant academic literature on this topic.

------
john_moscow
I think, the problem is actually deeper in the society. From what I could
observe, once our basic food/shelter needs are satisfied, we humans have
an inherent need for long-term goals. Like slowly progressing one's career, or
growing one's small business. Self-actualization, something to put your
passion into.

For instance, I myself run a bootstrapped software business, and I draw
immense positive energy from seeing satisfied customers and from feeling that
I'm the person who made it possible.

As far as I can see, due to commoditization of most labor (better
organization, moving of production offshore), it is becoming harder and harder
for people to self-actualize professionally. It's hard to draw personal
satisfaction from driving Ubers all day or being a cog in a corporate machine.
You know perfectly well that one day you could be replaced by a fresh
graduate with minimal training who would do just as well. You can no longer
have a self-identity as "the best baker in town", because most people buy
bread from a
supermarket. There are certainly artists, video bloggers, etc, but it's a tiny
percentage of the population and it pays a fraction of what the soul-crushing
corporate jobs do.

So, since people cannot self-actualize professionally, they seek it elsewhere.
Since the West has a culture of openness and acceptance to all kinds of
minorities and subcultures, many people's self-identity becomes their
belonging to a certain social group and their feeling of self-growth comes
from having others acknowledge their point of view. Except, this is a zero-sum
game: instead of creating value for those who need it, people begin competing
for others' attention and alignment.

For instance, if I am selling ice cream and the guy across the road is selling
chocolate cakes, we're at peace with each other because whoever wants ice
cream will come to me and whoever wants chocolate cakes will come across the
road. But if I go on a crusade trying to convince everyone that ice cream is
the only _correct_ dessert, while my neighbor does the same for chocolate
cakes, we quickly become political enemies intolerant of each other.

To sum it up, if you want to fix it for yourself, put your passion into
something that creates value rather than aims at redistributing it, and you
will feel much better. This would be extremely hard in the current economy
though, and yes, it feels sucky because previous generations sort of took it
for granted. I've no idea how to fix it globally.

------
tmaly
I think good moderators and a code of conduct that is out front and visible
helps.

Back when I first got online there was this idea of internet etiquette. I am
not sure where that disappeared to.

It helps if the online communities can meet in person. I remember we did this
with the first indiehackers meetup and it was great to see real people and
make a connection.

------
nkkollaw
The web mirrors people.

How do you stop people from behaving the way they do? And why would you want
to? Who should decide how people should behave (beyond established laws, of
course)?

If people aren't doing something illegal that should be reported, they're
just being people. You have nice people, assholes, politically correct,
politically incorrect, etc.

~~~
dredmorbius
And mirrors change behaviours. You don't find people preening before blank
walls or open spaces. The mirror informs and provides feedback.

~~~
nkkollaw
This has nothing to do with what I said.

~~~
dredmorbius
Incorrect.

Media influence society.

~~~
nkkollaw
I literally have no idea what you're talking about.

Whatever you say, though.

~~~
dredmorbius
Think of society as a system, and approach the social sciences as systems
sciences -- psychology, sociology, economics, political science. They have
state, observation, processing logic, interaction, and some new state -- a
basic control loop or OODA loop.

That's the basic premise of Norbert Wiener ( _Cybernetics_
[https://www.worldcat.org/title/cybernetics-or-control-and-
co...](https://www.worldcat.org/title/cybernetics-or-control-and-
communication-in-the-animal-and-the-machine/oclc/1120692518) and _The Human
Use of Human Beings_ [https://www.worldcat.org/title/human-use-of-human-
beings-cyb...](https://www.worldcat.org/title/human-use-of-human-beings-
cybernetics-and-society/oclc/748989559) -- horrible title, great book),
Alfred Kuhn, _The study of society: a multidisciplinary approach_
[https://www.worldcat.org/title/study-of-society-a-
multidisci...](https://www.worldcat.org/title/study-of-society-a-
multidisciplinary-approach/oclc/1079762282), Jay Forrester, and others.

Media are the informational element of that loop. Change it, and you change
behaviour.

So, no, media _don't_ merely reflect society, any more than a mirror merely
reflects a person's image. In the first place, mirrors change behaviour. In
the second, media are far more than a mirror -- they adapt, transform, store,
play back, transmit (in both space and time), amplify and attenuate
information. They reveal state; they express will, control, and narrative;
they create and propagate shared (and novel) models of understanding; and
they apply to both human and mechanical systems (as well as, arguably,
nonhuman biological systems).

The key point is that _if you change the informational component of a system,
you change its behaviour fundamentally._

Its range, bandwidth, latency, scale, topology (peer-to-peer, star/broadcast,
web/mesh), fidelity, recording and playback capabilities, modalities (symbols,
icons, text, pictures, sound, video, data, ...), search, association,
fungibility (modifiability / rewritability of content), and more.

That's the basic message of Marshall McLuhan (_The Medium is the Massage_ and
_The Gutenberg Galaxy_), of Elizabeth Eisenstein (_The Printing Press as an
Agent of Change_), and others.

------
JohnFen
It appears to me that the web is reflecting the insane level of
polarization and toxicity in real life. Unless we can find a way to dial that
down, I don't think there's anything effective that can be done about the same
thing on the web (without having to engage in actions that are even worse,
anyway).

------
jmhnilbog
Platforms must be responsible for what they publish. If NBC ran child
pornography, it'd be held responsible. Facebook runs it and says: oops, in
ten years we'll have AI to not show you that.

If you want to be a miserable person online, find an ISP willing to host your
garbage until they get held responsible as well.

------
vuldin
Another related (possibly identical) question: How do we stop
polarization/toxicity (or general negative attitude) IRL?

One possible answer: Disconnect, ignore, and avoid it while at the same time
gravitating towards positive, solution-oriented actions/people.

Stop banging your head against the wall, it's not going to work.

------
terrislinenbach
It's neither new nor fixable. It's why we have elections. Now that unlimited
amounts of untraceable cash can easily subdue any and all resistance, well.
We're F*cked. There is nothing any of us can do about it but hope we die
painlessly before global warming really kicks in.

------
pier25
It has always been like this as far as I can remember.

It's probably more annoying now since:

1) There are a lot more people online now than 10 years ago. We've gone from
being in the wild west to the industrial phase of the internet.

2) Since the web 2.0 thing there are now comments everywhere. It became even
worse with social media.

------
msms01
“Yesterday I was clever, so I wanted to change the world. Today I am wise, so
I am changing myself.” -Rumi

------
johndjos
I think it's simple: money. Take away the financial incentive to create
viral/outrage content.

------
dubcanada
People in 2010 didn't get into arguments online?

I don't think any of that is true, there were tons of arguments on forums and
IRC and stuff.

I believe this is purely a "now we see it all" effect, whereas before it was
hidden behind private, invite-only communities and hidden/private chat rooms.

------
danbolt
I came of age in the late 2000s, but “flame wars” were already a thing then,
correct?

I think the big thing that's changed is that internet access and computing's
audience has increased a lot over the past two decades, and it now includes
people that would normally be polarized in public.

------
jcims
I told my daughter about the brief respite we had after 9/11. It gave me the
impression that if shit really hits the fan, we've got each other's backs.
Now of course history is littered with extreme counterexamples, but those
don't help my cynicism go away, sooo.

------
iovrthoughtthis
It is so much easier to hold and propagate a false / misleading view.

Anger pushes us to action more than agreement.

Truth is hard to find and difficult to defend.

When the world is right we’re not driven to action, nothing needs fixing.

I don’t know the answers but the odds are stacked against civility and truth
right now.

------
rapnie
On the tech side of things, HN is doing quite well, with clever mechanics to
downrank incendiary stuff, kill troll comments ASAP, etc., combined with
excellent, consistent moderation (courtesy of dang et al).

With growing awareness + press coverage, those platforms who deliberately
implemented bad algorithms to harvest maximum attention (FB, YT, TW)
increasingly find it damages their brand image. There is an incentive to (at
least marginally) improve their features / biz models.

Then there is digital literacy. We used to have netiquette, and that can be
taught. If you read the (very entertaining) article The Internet of Beefs [0]
you'll see there is a solution by not getting involved in a beef, or
extracting yourself if you got baited. Similarly it helps to know how to avoid
trolls [1].

It will not be easy to change cultures large-scale. Many people just like to
beef (there is a similarity with road rage too; people forget themselves
online). Others enjoy starting beef wars for the lolz, or - more sinister -
with strategic objectives.

At least we should be able to create more safe harbours, where people thrive
from uplifting experiences online. Changing culture bit by bit.

[0] [https://www.ribbonfarm.com/2020/01/16/the-internet-of-
beefs/](https://www.ribbonfarm.com/2020/01/16/the-internet-of-beefs/)

[1]
[https://github.com/prettydiff/wisdom/blob/master/Avoiding_Tr...](https://github.com/prettydiff/wisdom/blob/master/Avoiding_Trolls.md)

~~~
AnimalMuppet
I think one of the things with HN compared to FB et al is that HN isn't a for-
profit company. It's essentially a public service of YC. They don't make a
dime off of this, as far as I can see. Because of that, they don't have to do
evil engagement-boosting tricks to try to make more dimes.

~~~
krapp
>They don't make a dime off of this, as far as I can see.

Maybe not directly, but every YC company advertises themselves and their
projects here, and HN is known as the hub for the SV tech and startup scene,
which gives YC an automatic amount of "street cred" and visibility in all of
the discussions had here. I'm sure Hacker News has made YCombinator more than
a few dimes indirectly from all of that.

~~~
AnimalMuppet
Well, perhaps. But if you're right, then HN makes them _more_ dimes by _not_
turning into a sewer.

The point is, HN has different incentives from FB. That matters in the culture
of the resulting community.

------
alexandercrohde
It's not your problem to solve. It's likely not a problem at all.

I think it's just a reflection of our nature -- we all claim to seek boring
stability, yet actually thrive on drama. People who don't want drama have no
trouble avoiding it online.

------
pid_0
I think it's a few things:

1) Anonymity when it's not useful. Many forums are relatively anonymous to
the average user, but most of the time it's not useful. There's no
life-or-death situation on Reddit. Sometimes it can be useful, but most of
the time it's not. People post a lot of stuff they would never say in the
real world because they have no responsibility to back it up or be
associated with it. Just look at what they say on T_D.

2) There are objectively a lot of instances of "censorship" of "wrongthink".
There just is. Not all opinions are good, important, etc, but the more you
shut them down, the more people will intentionally become toxic. This is
especially made worse by point 1.

3) Radical moderation of communities. User moderated areas like reddit are
shaped by the mods. T_D is largely the way it is due to rabid people shaping
what is and is not allowed. Same with twitter.

~~~
livueta
I think at this point in time the "anonymity makes people assholes" argument
is pretty dead - look no further than Facebook during the 2016 election for a
fantastic example of real names and faces being exactly zero inhibition to
every sort of nasty behavior imaginable. Attacks on anonymity as an inherently
flawed concept, only useful in specific niches like dissidents and reporters
and inimical to quality discourse, should be ignored like the Facebook "one
authentic self" gaslighting propaganda they are.

You also imply that anonymity is somehow "not useful" except in cases of
extreme threats to life or liberty at the same time you conflate
pseudoanonymity and anonymity on Reddit. If anything, Reddit is a great
example of how _insufficient_ anonymity can lead to evaporatively cooled echo
chambers in short order: karma and rep, coupled with persistent identities,
are powerful drivers of conformism.

Why, then, you ask, is HN not quite as bad when it is also pseudoanonymous?
I'd say it comes down to local culture. HN isn't as monolithic as it could be
because at least some users disagree with expressing disagreement via downvote
as long as a point is made in good faith. Similarly, some (almost entirely
anonymous) chan boards were actually really good in terms of quality
discourse, while others were shitholes. The differentiating factor is the
attitude of the community and its moderation philosophy, rather than some
simple "how anonymous is this site" bucketing.

That's why I'm convinced that a lot of the problems with modern social media
are the result of the creation of an environment where developing that type of
site/board culture that can foster discourse is impossible: if you take a
global platform where moderation is often outsourced and totally divided from
any of the communities it theoretically serves and populate it with people
who've not been exposed to the sort of behavioral norms common in earlier
Internet communities, you end up with no norms at all - just an infinite vista
of shitposting.

If anything, I'd disagree with your third point even more strongly than your
first: even though community-local moderation can exacerbate echo-chamber
effects, it can also foster productive cultures. In other words, community-
driven moderation is necessary but not sufficient for establishing worthwhile
norms. Even if the ratio is 10 T_D:1 HN, I'll still take that over the global,
no-holds-barred shitfight enabled by platforms like Twitter.

------
adultSwim
Meaningful political change could make people happier and generate a greater
sense of community.

Take climate change, for instance. Young people are politely asking for
needed solutions to be implemented. At some point they will stop asking so
nicely.

------
rapnie
OT: I notice that this Ask HN thread is suddenly revived and timestamps of
original comments are reset. My other comment is not 5 hrs old but from 2
days ago, when this item did not reach the front page. Just curious how this
works.

------
jccalhoun
I am not a believer in things being better back in the day. I think that is
nostalgia. Flame wars have existed since the beginning of the internet.

What we can do to make it better is invest in teaching critical thinking,
media literacy, and actual argumentation.

And it is important to know when to walk away. There are a lot of times when I
see someone respond to a comment in a way that is shitty or just plain wrong
and I would love to try to correct them. However, I try to resist. If it is
reddit or something I will look at their post history. If they mostly post on
certain subreddits I know that there is no point in my continuing the
conversation. Nothing I can write is going to change this person's mind so I
stop engaging with that person. I'm not always successful in resisting the
urge to tell them how wrong they are but I try.

------
pdonis
_> is this more of a social problem that code can't solve?_

Yes. Any tool for communication can be used both ways. The way to fight
negative uses of the tool is to use it positively, not to try to limit the
tool.

------
heavyset_go
> _So how do we fix this?_

I don't believe that polarization or toxicity are endemic to online spaces.

Change the material conditions that cause people to become polarized or toxic
and online spaces will reflect that change.

------
naringas
We will have to understand how culture (i.e. the original definition of
memes: cultural attitudes, bits of knowledge, generalized narratives) is a
collective action in which we all participate.

In essence it works by mimicry. We mimic our peers and our peers mimic us. We
just have to be careful about what we mimic.

The tricky part is that this process is largely (but not entirely)
subconscious, and it takes a lot of focus and well-applied effort to change
previously learned behaviors, especially when "everyone else (within your
social circles) does this".

------
sdinsn
We don't, and we shouldn't. This is natural human behavior.

> Before, around 2010-2012, people who disagreed would usually leave it at
> that and walk away respectfully

I can tell you've never played Call of Duty.

------
ca98am79
The most important thing to do is to be aware of yourself and not participate.
To not judge others. The best way I have found to practice this is with
mindfulness meditation.

------
ssss11
I feel like there are people who inherently do this in any situation (e.g.
in-person discussion), so to me this just seems like human behaviour
regardless of whether it's on the web. Sadly.

------
ltbarcly3
The problem with trying to 'fix the problem' is that most people seem to think
that the solution to polarization is to silence the 'crazies' that disagree
with them, and who are 'causing the polarization'.

In a diverse society it is normal to have a wide spectrum of opinions, and a
lot of them are going to seem wrong. Wrong ideas are not a threat, so long as
we accept that they are just opinions. What is dangerous is giving a small
group the power to decide which ideas are valid and which ones are not,
because that small group will be taken over by opportunists who will use the
power to control speech to further their own agenda.

------
tonfreed
Shitfights are what drive engagement and sell ad space. I think the only way
it's going to stop is if we start moving to a pay model instead of relying on
engagement.

------
logfromblammo
"We" don't fix it.

This hasn't been going on for 7 years; it has been happening for at least 40.

The rage stems from the wage, which has stagnated for the working class for as
long as I have been around to see it. People are trying to telegraph that
unpleasant things will be happening soon, if all the people who are constantly
dumping on their inferiors don't at least start handing out some umbrellas.

I think the tech has actually been a mitigating factor. Online mobs can't
throw real-life firebombs. When people go out to actually _do_ something,
there are fewer people in the same physical place to pump each other up.

------
brodouevencode
Based on the way I see some comments on HN voted on in an obviously
ideological way, I'm not so sure HN hasn't jumped the shark in the same way.

------
k__
I had the exact opposite experience.

People stopped being "all technical" assholes that argued about everything and
started to think more about social stuff.

------
Jamwinner
By not perpetuating it.

That's all we can do. But it's the most important step.

~~~
AnimalMuppet
If enough people don't perpetuate it, that turns down the gain in the echo
chambers. That cuts down on the feedback, and the noise becomes a lower
fraction of the total.

------
slumdev
I think most people would be less toxic if their posts and comments were both
(1) public and (2) linked to their real name.

Someone posting anonymously or under a pseudonym has nothing to lose. He might
not intend to offend, but he has no reason to guard his tone or consider how
his audience will receive him.

Someone who posts under his real name on Facebook but only shouts into an echo
chamber filled with like-minded "friends" also has nothing to lose.

Shame is a great motivator. Fear of loss of friends or career prospects is
also a great motivator.

~~~
jowday
The massive amount of toxicity and drama among Twitter users with accounts
linked to their real identity is strong counter-evidence to this. When I think
of "polarization and toxicity on the web", I think of Twitter arguments that
spill over into real life precisely because of this connection between online
identity and real identity.

------
zelly
BCI wire that lets you replay a stream of thoughts and emotions the poster had
while they were writing the post or recording the video.

------
thinkingemote
I'd posit that there is no uptick, but that it's human nature to always think
things are getting worse. I think Pinker did a book about it.

------
anon_d
We can't fix this.

The whole HN narrative around this presumes that this is an unnatural outcome
caused by the bad incentives of tech platforms. This is just wrong.

The polarization comes directly from people being able to communicate freely.

The polarization is real. It was there before, but it was controlled by
censorship, basically. The polarization isn't _caused_ by the web, it's
unleashed by it. We can't stop it without neutering the web, and we don't want
that.

------
ianai
This topic was so successful I almost wish we could revisit it either once a
month or maybe even once a week here at HN.

------
mcantelon
Corporate media has done a lot to promote polarization (and even violence).
Social media is making corporate media redundant to some degree, but those who
run social media are also influenced by the same forces influencing corporate
media. And, of course, "woke" ideology hasn't helped given it's similar in
some ways to Maoism.

Until the cold war between globalism and nationalism finds some resolution
this atmosphere will likely persist.

------
Tiktaalik
Tech companies are loath to hear it, but the only thing that works is
ruthless human moderation.

There's a reason that reddit and twitter developed Nazi problems while other
old fashioned human moderated forums didn't.

------
shaneprrlt
> Before, around 2010-2012, people who disagreed would usually leave it at
> that and walk away respectfully.

I literally lol'd.

------
tacocataco
Start with ending the class war in favor of the 99%.

The .000001% is heavily invested in keeping everyone divided and conquered.

------
jes5199
it almost doesn't matter whether the content is polarizing or toxic - people
form mobs whether their opinions are reasonable or not. Even reasonable
criticism gets converted into harassment campaigns, thanks to the dynamics on
twitter. (I mean, like 75% of it is awful content too so idk)

------
singularity2001
Am I the only one who did NOT notice a significant increase in toxicity?

The only incident I can remember was playing a game with chat mode where kids
went full-scale anti-Semitic, etc.

PS: I mostly avoid Facebook, Reddit, or wherever toxicity normally pops up.

Oh, I completely forgot the most toxic place on earth: Stack Overflow. But I
completely removed any activity there almost 10 years ago.

------
graphememes
Stop encouraging it by rewarding those who engage / initiate it.

Stop advertising it as "news"

------
pjkundert
Solution: K-means clustering of accounts, with _long_ _term_ effect.

Everyone starts out as a “noob”, seeing the unfiltered cesspool, and nobody
with significant grouping weight sees their comments (except other noobs, and
people who’ve categorized themselves as “noobs” by Liking inflammatory
nonsense and Disliking reasoned debate).

In the year or so it takes for your identity to begin to develop weight toward
your cluster, you see less and less “noob” content, and more “cluster”
content.

If you like one-sided (eg. dismissive statist left/right, or aggrieved
libertarian, ...) or perhaps reasoned principled respectful debate — that’s
what you’ll see.

Everyone wants to see debate they’re comfortable with. If you want garbage,
that’s what you’ll see. If you want understanding of opposition views in a
reasoned, respectful environment, that’s what you’ll see.

What you won’t see is — trolls. They’ll all be busy trolling each other, and
we won’t see their garbage.

The idea that we can use single-axis rating to create a universally acceptable
online environment is just, well, crazy. And yet, since the mid-80’s, that is
what literally every online “social” tool has been trying, vainly, to
accomplish! It’s stunning, really.

~~~
dredmorbius
Birthing into the sewer seems counterproductive.

That fared poorly for physical health, we learned.

------
CodiePetersen
You're not going to. There is real incentive to divide people into categories.
People in the grey are hard to deal with and hard to manage.

Massive tech companies like google have perfected how you are supposed to move
someone from unaware and not caring into the firmly aware caring and loyal
brand consumer. They extend that ability to anyone willing to pay for it.

All the Twitter ads, all the Facebook ads, all the Google ads. Doesn't matter
if it's for potatoes or Trump, they have the same ability to subtly move you
into their camp. Even a basic search engine on Reddit or Google will
constantly feed you the same BS day in and day out based on what you already
search for. Take Google News: they aren't going to show you much outside your
region; even in world news, as an American you are usually going to see
American policy and any topics the engine thinks are important to you based
on your previous views.

The whole of Internet technology is meant to divide and feed you the
confirmation-bias info you love, only stuff that already aligns with your
views. Couple that with the fact that AI algorithms are literally designed to
take non-linear data and plop it into definite categories, and you'll get
independent/grey thinkers slowly being pushed to one side or another, no
matter how complex the data.

Unless there is a way to forcefully break the feedback loop, you won't stop
it. If you sat day in and day out twirling a butterfly knife or painting or
whatever, you would be an expert eventually. That's what is happening now but
not with a useful skill, just information aligned with what you already
believe. Then, after years of programming, someone comes along from the other
category and tries to break your beliefs; of course you'll get toxicity,
because it's hard to unlearn how to paint. The brain is not meant to unlearn
constantly practiced behavior.

------
cmoscoso
Being zuck-free[1] most of your time should help.

[1] no Twitter, Instagram or Facebook.

------
notadev
I am basing this on nothing but my own opinion. This has been going on for a
while. Sometime around 2010-2012 many news sites started removing their
comment sections. While most people thought many of these comment sections
were "cancer", it was still a way to comment on, add context to, or dispute
something posted to a news site. The claim was that "trolls" were causing too
many problems. This was around the same time the definition of troll changed
to mean anyone who disagreed with the OP or the content of a news article.

Once that happened, more and more opinion pieces started to be passed around
as "news" articles. As opinions are biased, those biases got more and more
apparent over time. With a lack of a way to respond, people turned to Twitter
and other services.

Add to that, a continual redefining of what speech is "appropriate" and what
speech is "problematic". This was a further attempt to control the
discussion/debate by defining what tools could and could not be used when
discussing/debating. This is seen by many who do not consider themselves
"liberal" or deeply concerned with social justice issues as another attempt to
exert power and control the narrative by silencing any opposition.

That's where it's all coming from. "Toxic", "problematic", etc. are just
synonyms used by those who exert power, namely those few who are given a
voice, the self-described journalists, to label stuff they don't like. They
use other terms like racist, misogynist, Nazi, alt-right, all synonyms for
people who hold ideas contrary to their own.

As soon as we have a level playing field online, where all voices can be
heard, not just the blue checkmarks, then maybe we'll make progress. Until
then, things will only get worse, and when they have the power to do so,
these avenues of expression...FB, Twitter...will just be more places where
you can express yourself freely as long as you say what you are allowed to
say.

~~~
anigbrowl
_many news sites started removing their comment sections_

It's very rare to see a news website that doesn't have a comment section. That
claim needs some substantiation.

------
mrhappyunhappy
Join the YangGang. I had a 180 change of attitude towards people.

------
ttctciyf
Recreate USENET and we can put it all back where it came from..

------
cletus
The problem here is people. People are easily prone to fear, anger and
hatred. They want their views legitimized. It's virtue signaling to
like-minded people and alleviating anxieties about a changing world.

The problem with the internet is that it’s simply too easy to spread
nonsensical views that have no basis in fact, like the anti-vaxxers and White
supremacists.

People also want to blame others for what’s wrong with their lives (“[ethnic
group X] are stealing our jobs”). Worse, it's just as easy for opposing views
to do the same. This makes the first group feel like they’re under attack.

These people are easy to manipulate and they are manipulated through the
politics of fear. Look no further than Fox News and Donald Trump.

So this is only an Internet problem in the sense that the barrier to saying
stupid, unfounded shit is now so low that anyone can (and does).

------
bvinc
I agree that there's been an uptick in toxicity and polarization. I think it's
partially due to more people being on the internet, the decentralized
information sources on the internet, combined with human nature and
generational and cultural reasons why it's difficult for some people to
determine what is factual.

But the single biggest factor that's driving this, in my opinion, is
recommendation systems. Every source of information now follows the same
pattern to increase engagement: Fill up the screen with as many different
links and recommendations as possible, and determine the links shown by a
machine learning algorithm to increase engagement.

If you have an app to communicate with friends, don't show them their friend
list, show them a "news feed". A video player? Put recommendations on the
side, and at the end of the video, and pop up recommendations when they hit
pause, and autoplay the next recommendation. Sort all comments and replies
based on engagement.

This increases income, but it makes people go crazy. Because what kind of
content increases engagement? For a lot of people, it's crazy inflammatory
content. Women insulting all men, men insulting all women, Black Lives
Matter, Back the Blue, flat Earth, vaccines, conspiracy theories...
eventually the algorithm will find what triggers you to click or stay engaged.
And once it figures it out, you're hooked in a crazy polarized hatred cycle.

And it doesn't help that people and companies and Russian troll farms realize
this and push inflammatory content for clicks and followers.

How do you stop this? I have no idea. On a personal level, I use uBlock to
clean up all recommendations on sites that I visit, and I make an effort not
to respond to or engage in any of the toxicity. But that doesn't really solve
the
problem.

Companies don't want to solve it because they'll lose money. Legislation
doesn't seem like the solution. Individuals would complain if you took away
recommendations. No one really wants it to stop.

------
fortran77
This is nothing new. There were flamewars on USENET in 1989.

------
djanlermats2020
Remove practically all the tools built to combat "toxicity" and force people
to talk to each other and use their words again.

Also: Fire every moderator, everywhere, and never let anyone be a moderator
ever again.

------
johnchristopher
I think the only winning move is not to play :/.

------
cannedslime
I rarely see threats. I do see that dialogue is rarely actually happening;
it's usually just shit-flinging with buzzwords like cuck, bigot, xenophobe,
leftist, commie, nazi, etc. Rarely do I see people make arguments, and even
more rarely do I see people actually arguing, as in exchanging ideas and
answering each other's questions.

The more mainstream a site gets, the worse it seems to be. Especially where
voting is involved (reddit, facebook etc).

Add to this "grassroots" efforts to "take over" public internet forums, which
have been going on for decades by both far-left and far-right wingers:
applying for moderator roles not because they want to moderate discussion,
but to have complete control of it.

------
geocrasher
By turning it off.

------
mErienda
Polarization is the natural response to disorder, so thought goes into
categorical, absolute mode and reactions are extreme (reactance). The solution
is Relativistic Thought.

------
melito
Be cool in real life. The web is toxic and life can suck; mass media just
amplifies it and can make it more annoying.

------
lukaszkups
I was thinking about a solution that could be implemented by big players like
Disqus. I've taken some ideas from StackOverflow's reputation system; you can
read about it here: [https://lukaszkups.net/notes/what-disqus-can-learn-from-stackoverflow-and-make-internet-a-better-place/](https://lukaszkups.net/notes/what-disqus-can-learn-from-stackoverflow-and-make-internet-a-better-place/)

tl;dr: limit comments per day, require a minimum amount of positive
reputation to leave a comment, and make it customizable per website by its
owners.
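The gating rules in the tl;dr could look something like this in code. A
minimal sketch: the `SitePolicy` numbers are invented, since the proposal is
that each website's owner would configure them.

```python
# Minimal sketch of per-site comment gating: a reputation floor plus
# a daily comment cap. Defaults are made-up illustration values.
from dataclasses import dataclass

@dataclass
class SitePolicy:
    min_reputation: int = 10       # reputation needed before commenting at all
    max_comments_per_day: int = 5  # hard daily cap per user

def may_comment(reputation, comments_today, policy):
    """Return (allowed, reason) for a comment attempt."""
    if reputation < policy.min_reputation:
        return False, "reputation below site minimum"
    if comments_today >= policy.max_comments_per_day:
        return False, "daily comment limit reached"
    return True, "ok"

policy = SitePolicy()
print(may_comment(50, 2, policy))  # established user, under the daily cap
print(may_comment(3, 0, policy))   # new user without enough reputation
```

Per-site customization then just means constructing a different `SitePolicy`
for each website.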

~~~
KajMagnus
> require minimum amount of positive reputation to leave a comment

How can one get positive reputation, without first having published a few
comments that got upvoted?

Anyway here's a commenting system with a built in trust level system — more
similar to the one you'd find in Discourse, than at StackOverflow though:
[https://www.talkyard.io/blog-comments](https://www.talkyard.io/blog-comments)
(I'm developing it). There are Like votes = upvotes, neutral Disagree votes,
and (for community core members & mods only) the _Unwanted_ vote, which works
like downvotes and destroys "karma" (not completely implemented).

From your blog:

> Users should be able to gain reputation through their comments — we are
> currently able to upvote or downvote each user comment — that should somehow
> influence user profile reputation

And maybe an upvote or downvote by someone with high rep could carry more
weight than one from a new member?

------
bsanr2
A thought that's been sitting in my Evernotes for a while:

"I wish I could code because I have an idea for something I think would be
valuable. Basically I think debate is a good thing, and especially online
debate, where you can meet & talk with so many different people and learn so
much you might have otherwise missed (and teach, too).

But the problem with the platforms people debate and converse on (Facebook,
reddit, YouTube) is that they're essentially popularity contests. They reward
being visible and playing to people who already agree with you, which means
many good ideas get steamrolled by enthusiastic trolls.

So I was thinking that maybe instead of relying on the "wisdom" of the mob, I
mean crowd, to determine what's noteworthy and important and productive -
which inevitably leads to a constant churn of middle school student council-
like dynamics - I thought maybe you could flip it.

Don't ask people who are like you if you're making a good point. Ask the
people who AREN'T like you. If a bunch of gun owners look at what a gun
control advocate has to say and think, "You know, they're onto something,"
maybe that's more valuable than getting support from other advocates.

So I imagine a debate space where people can talk with each other in a
structured fashion. Other people can watch. Somehow we have an idea of the
"way" people lean politically, identity-wise, etc., or at least how they
compare to others. Isn't this something that NN-driven text analysis would be
good at? And at the end, you can rate someone's ideas and performance, and how
much your opinion counts is weighted by how you compare to them, and then the
system's understanding of who you are is adjusted accordingly. It need not
even assign human-readable tags to a given personal value; all we'd know is
that this person is very or not very similar to a given other. Maybe this
helps to disarm trolls and to push thoughtful, respectful voices to the top.
Maybe it allows voices that would usually get drowned out on any given issue
to have a say. Maybe it helps us get to the heart of what debate truly gives
us, which is understanding and compromise and solidarity."

tl;dr Keep upvotes/downvotes but weight them based on how similar the person
voting is to the person being voted on.
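That weighting rule could be sketched as follows. The "leaning" vectors here
are invented placeholders for whatever the NN-driven text analysis would
actually infer from someone's posts; only the weighting itself is the point.

```python
# Toy sketch of the tl;dr: a vote counts for more when the voter's
# inferred leaning differs from the author's.

def cosine_similarity(a, b):
    """Cosine similarity of two equal-length vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def weighted_vote(voter_leaning, author_leaning, direction=1):
    """An up (+1) or down (-1) vote, scaled by dissimilarity in [0, 1]."""
    dissimilarity = (1 - cosine_similarity(voter_leaning, author_leaning)) / 2
    return direction * dissimilarity

author = [1.0, 0.2]      # hypothetical inferred leaning of the author
ally = [0.9, 0.3]        # a voter much like the author
opponent = [-1.0, -0.1]  # a voter from "across the aisle"

# An upvote from the opponent moves the score far more than one from the ally.
score = weighted_vote(ally, author) + weighted_vote(opponent, author)
```

Note this matches the gun-control example above: approval from the "other
side" is worth nearly a full point, approval from the choir almost nothing.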

------
smarri
A shadow internet for decent people

------
overcast
Do not respond, block, then report.

------
whatsmyusername
Take all the people out of it.

------
throwawaybbb
Stop the always-on web. There is no reason why latency for Twitter or
Facebook should be any faster than 1h.

------
reportgunner
Stop using social networks.

------
throwaway29303
That's a highly complex problem, IMHO, since it involves lots of variables
and variability. And, quite honestly, I doubt there's ever going to be a
proper way to solve it.

But here are some of my observations:

\- Everyone's online. Everyone. That includes people a) who had a less than
stellar upbringing (whatever that means to each of us); b) with mental
illness (some untreated and/or self-diagnosed); c) from other cultural
backgrounds (which don't necessarily align with western points of view); d)
who are underage; e) who are (very) bored (and some find entertainment in
provoking chaos); f) who lack proper education; g) who simply do not care
about educating themselves and others; and h) who are naturally antagonistic
for whatever reason.

\- Bots. There are also a good number of botnets out there (the "like" economy
comes to mind) or subverted systems (smartphones, routers, PCs, etc) which act
without the owner's knowledge;

I'm convinced there are also botnets out there that purposefully amplify
certain controversies online (websites' comment sections come to mind). Now,
understand, this doesn't apply to _every_ controversy. But I've seen some
weird stuff on YouTube, for example. And, quite frankly, it's not that hard
to extrapolate that since there are "like" bots, there are "hate" bots as
well. The end game is, in most cases, visibility. In others, it's probably
social engineering for whatever reason. Mostly political.

Unfortunately, media outlets have fallen into this scheme as well. I mean,
it's not like the concept of clickbait is new. Attractive headlines are as old
as newspapers/journalism.

\- Highly progressive opinions will always clash with the status quo;

\- There's a fundamental and unavoidable loss (or lack) of signal whenever
people communicate in short sentences, or when they do not have enough time to
fully understand what's being said, or when they communicate with people who
they don't know;

If I have a friend who's a prankster/jester I might be used to their
shenanigans and tolerate them--a stranger might not find them funny. Maybe the
stranger will find it funny after they know my friend, but that would depend
on a lot of other variables as well.

\- There's also the problem of the person who transmits information not
expressing themselves properly (for whatever reason) which will unavoidably
lead to miscommunication;

\- Social networks are fundamentally designed for engagement--now that
everyone is online there will be a clash of ideas. Tribes will organically
form (just like offline). And - with few exceptions - what was meant as a
benign message ends up in a declaration of war from the other tribe.

So, in sum, I've just scratched the surface of the problem. It's very hard to
make sense of all the noise and come up with a proper solution.

Is it an education problem? Is democracy the root cause? Would a totalitarian
system (regardless of whether it's Right-Wing or Left-Wing) work? Is it
something
fundamentally ingrained into our human condition that makes it impossible to
solve this problem? (Look at bees, for instance, they're ruthless and yet
highly efficient at what they do.) Is it a fundamental purpose (or lack
thereof) that each and everyone of us has defined - through whatever heuristic
and for whatever reason - that's to blame for all this chaos? Is it its
visibility?

Yeah. It's a hard problem to solve. Maybe lots of compromise would help.

I honestly don't know.

The quote

      "All models are wrong, but some are useful." --George Box

comes to mind.

------
yters
Let's argue in a polarized, toxic manner about the polarization and toxicity
of the web!

------
anonsivalley652
There are many pervasive, interconnected issues and trends to it, so there's
no simple or complete answer:

0\. Behavior online doesn't arise out of a vacuum. If people are miserable
because they're working harder, making less, and their society is in retreat,
they're probably going to take their frustrations out on the easiest targets.

1\. Unplug from social and mainstream media because it doesn't have much
value.

2\. Stop demonizing and hating on groups of people, and stop falling for the
unthinking mob tribalism, even rhetorically. Hate bad ideas, not people.

3\. Realize that we're divided-and-conquered if we're going to let a few rich
people and their corporate media keep us set against each other. Solidarity
is the only way.

4\. Anonymity is good in small doses, but it's too easy for people to act
unreasonably hiding behind it (cyberdisinhibitionism). DHH's company wrote a
blog article about improving the quality of discussions with profile pictures
and real names.

5\. Stop and use every instance of it as a teachable moment, where feasible.
It only works though if people have shame and can be brought around to the
Golden Rule/empathy... it seems to me most parents these days aren't as
involved in active parenting, so their kids run feral, and so more people
grow up to act more brutally and sociopathically. Furthermore, the current
prevalence
of parasitic vulture capitalism valuing myopic greed and selfishness above all
else reinforces a disinterest in the concerns and well-being of others...
which is antithetical to community and civilization.

6\. There's nothing so far to replace the community function filled by
religion, and so many people aren't interested in behaving themselves or doing
right by their neighbors or strangers if they can get away with it.

7\. More people have lost most of their hope about the future. For example, no
healthy society has mass shootings/suicides nearly every day that no longer
make the news.

~~~
dane-pgp
I really appreciate the thoughts above, but I have to question point 4:

> improving the quality of discussions with profile pictures and real names

As a counter-point, imagine a discussion forum in an authoritarian country
where your picture and real name is placed next to everything you say online.
The discussions may end up being very polite, and full of agreement, but that
doesn't necessarily mean that the community is well served by that forum.

To some, that concern may seem irrelevant, since most governments wouldn't
attempt to control people that way, but I don't think we have to look very far
to find examples where people expressing unpopular opinions face social and
professional punishments. Perhaps you would agree that anonymity can be
helpful in such situations, (checks notes) "anonsivalley652".

------
cubano
I remember being part of a very early forum called "Plastic" in the mid-2000s
and, believe me, there was PLENTY of polarization/toxicity there, so I'm not
really buying that this is a new thing.

I remember being VERY active in that forum, but eventually I quit and sent an
email to the owner (I know, I know), his name was Carl I think, saying that I
could no longer deal with the absolute political polarization
(anti-Libertarian) I found on the site.

So yes...you "fix" it by being the change you want to see. There is really
nothing else one can do.

------
otakucode
Tech may not be entirely innocent, but at the end of the day it facilitates
both the good and the bad behavior equally well, so users do bear most of
the blame. Helping the users such that they would not behave in such a manner
would, of course, be a complex matter mostly outside the realm of technology-
based tools, but I do not believe that is any reason to abandon the cause.

One thing we could do, which would not solve the problem but perhaps
illuminate a better way forward, would be to work on communications
technology. Not simply technology for data transfer, but for actual
information and understanding transfer. Technologies and systems which
facilitate the good behavior without facilitating the bad nearly as well. By
that I mean things such as systems which enable presenting large volumes of
nuanced information in ways accessible to more people, and systems which
enable both constructing, sharing, and refining complex arguments.

As an increasingly large portion of humanity conducts themselves online, we
must keep in mind that there is much of human behavior which is not admirable.
The solutions to that behavior are not technological except in the most
dystopian and (philosophically to me at least) disgusting scenarios. There has
never been, nor ever will be, a happy, prosperous police state. Autonomy and
free expression are not luxuries, they are necessary for human health. We must
also always be vigilant that the systems we implement are flexible and permit
society to change both within and through them. Take a thought experiment I
came up with for example. Imagine that tomorrow morning 90% of the population
of planet earth awoke to a realization that agitation over nudity was
ludicrous. Would it be possible for our existing systems to accommodate this
change, or would they actively thwart every single attempt by any individual
to
live their life according to this newly adopted principle? Would it result in
a global relaxation of pointless anxieties, or would it result in increased
anxiety as people felt themselves isolated in their realization, 'judged' by
their technology which would filter them, block them, and reject them at every
stage?

At no point in history has any society, so far as we know, hit upon "The
Correct Ideas" which represent unvarnished truth, eternal and unchanging. And
we should be careful that our current social ideas are not unwittingly
treated as such, ossifying human culture.

In Eric Schmidt's book "The New Digital Age" he speaks about wishing to play a
very active role in exactly this kind of cultural ossification, expressing an
extremely elitist view that due to the fact Google is rich, they are Better
and should therefore take steps to actively guide and mold society in the ways
which Eric Schmidt believes are best. Those just so happen to be the social
values of the late 1990s when Google was introduced and which facilitated
their wealth-building. That is, to my mind, a dangerous game. Past history
would suggest that attempts at "social engineering" which do not rely
completely upon broad social consensus and upon society reaching its own
conclusions and doing its own enforcement of its own ideals tends to backfire
in spectacularly catastrophic and inevitably violent ways.

------
anigbrowl
While driving the marginal cost of communication to zero would seem to bring
people together, what it really does is bring _similar_ people together but
also exposes significant differences between people.

The essential reason for the toxicity is the contestation of online spaces,
which are virtual territory over which proxy wars can be fought as a low-cost
substitute for physical violence (although there is a direct nexus to
physical violence, and the threat thereof, by both individual and state
actors, is a factor in the aforementioned contestation).

There are 3 basic approaches to encroaching toxicity:

    
    
      a. Ignore it, aka "don't feed the trolls".
      b. Implement technical solutions to manage it.
      c. Fight it, aka flame wars.
    

Ignoring it doesn't work. It just tells the most vulnerable members of a
community that they don't matter and that if they are repeatedly harassed
other community members will sympathize but not really do anything to help.

Technical solutions originate with the California preference for systems
thinking, and are reflective of the legislative and administrative technology
in which they've been incubated. They are somewhat effective, but any system
can be gamed. Most sites opt for a mix of technical means and hands-on
moderation by a benevolent* dictatorship**, which works moderately well but
is not responsive or effective against determined attack.

* benevolent in terms of close alignment with the ethos of the forum, whatever that happens to be

** dictatorship in terms of being arbitrary rather than mechanistic, semi-transparent, and unilateral

Flame wars are upsetting to everyone, and people in the first two camps view
them as the worst-case outcome because they take over the
thread/forum/platform where they occur and are destructive of comity, much
like their real-world analogs.
However, they can be effective in repelling invasive toxicity - if a
sufficient majority of the forum regulars participate cooperatively. If too
few participate or forum norms inhibit or punish participation, then toxicity
will prevail or advance.

Here's some empirical evidence supporting this based on data collected from
raiding behavior on Reddit, which is similar enough to HN to serve as a useful
comparison (includes links to papers, slides):
[https://snap.stanford.edu/conflict/](https://snap.stanford.edu/conflict/)

Cross-platform raiding behavior has existed as long as bulletin boards, and
has been systematized and refined in line with the systematization and
refinement of game and software development strategies. Here's a (somewhat
offensive) overview from some years ago of trolling strategies, summarized
near the end in a convenient flowchart:
[https://digitalvomit.wordpress.com/the-ultimate-guide-to-the-internets/](https://digitalvomit.wordpress.com/the-ultimate-guide-to-the-internets/)

The sophistication of raiding tactics, documentation, and so on has increased
significantly since that was published. Ultimately toxicity online is neither
a product of technology or the exposure of an inherent flaw of human nature,
but the visible manifestation of multilateral information warfare, which is
itself preliminary maneuvering and battlespace preparation for more overt
forms of conflict like cyberwarfare, open economic warfare, and kinetic
warfare.

------
DoreenMichele
To my mind, this is a little like people being depressed about The News. The
News tends to be negative stuff and it used to be available for an hour or two
on the TV. Now, you can access The News 24/7 via TV, internet, radio, etc.

Some folks find that really depressing because they get so much more
negativity in their headspace than they used to. A primary approach to not
being depressed about The News is to stop letting it take up so much of your
time and attention. Actively seek to tune it out and focus on other things.

Similarly, the internet is bigger than it used to be, so some of this is
perceptual and/or a numbers game. It's easy to find fightiness and feel like
"It's everywhere." It's relatively easy to take good things for granted and
underappreciate them.

Some things I find helpful:

1. Actively seek constructive engagement. For me, this involves declining to
indulge the knee-jerk reaction to rebut anything and everything that directly
disagrees with some comment I made (i.e. replies to me that tell me "You are
wrong!" or similar). This is a bad habit of mine that just makes things worse
and no amount of trying to justify to myself why I tend to do this makes it
not a bad habit.

2. If I do choose to reply to people who disagree with me or who are being
negative, think about what my goal is and what I'm trying to accomplish. Is
there particular information I would like to put out as a result of their
comment? Can I do it without just going down that path of "No, you!"?

3. Grow a thicker skin. I don't absolutely have to have every single person
who replies to me on the internet like me, be nice to me, be my friend, blah
blah blah. It's okay for other people to disagree with me and to talk about
what they think. I can decline to take it so freaking personally that the
entire world doesn't always agree with me.

4. Keep in mind that people are much more likely to reply to you online if
they disagree. There isn't a whole lot to say if you agree, and HN in
particular actively discourages low value replies. So you aren't going to see
a lot of vacuous "Me toos" here. That doesn't mean people here hate me.

5. View some of it through the lens of "HN/The Internet is bigger than it
used to be. It's a numbers game. Multiple replies disagreeing with me say more
about that fact than about me, this opinion, etc."

6. Work on my communication skills. This is an ongoing effort. Some phrases
or framings tend to get knee-jerk negative engagement. Learning to say it
better helps reduce the nonsense.

At the same time, I try to make my peace with the fact that no amount of
effort on my end will ever completely put a stop to other people choosing to
do whatever the heck they choose to do. "You can't please all of the people
all the time" and that sort of thing.

(This comment is not intended to be comprehensive. It's just an off-the-cuff
forum comment, not a PhD thesis.)

------
forgottenpass
Stolen from reddit
[https://www.reddit.com/r/thelastpsychiatrist/comments/70rbuu...](https://www.reddit.com/r/thelastpsychiatrist/comments/70rbuu/the_gaslighting_of_the_quantum_superstates/dng1pkz/):

> ten years ago Steve 'Asshole' Jobs played a hilarious prank on all the
> digitally-illiterate 20th century luddites by pick-pocketing their ol'
> trusty, reliable telephones and replacing them with computers instead. Tee-
> hee! Let's see if they notice the difference! And then he promptly died.

> The result, we see today, is a stratified understanding of "the internet".

> One level consists of everyone who really knows what the internet is. The
> internet's early adopters were a bunch of nerdy white males out of touch
> with society, guilty as charged. Annoying atheists and such. The thing is
> that this group (which I count myself among) knows how communities online
> function because they've been part of them for decades. They've lived the
> lifecycle of growth and collapse of forums (platforms) again and again. They
> see how technology brings people together, but not the people in their
> immediate life. Rather, safely brings anonymous groups of people who hide
> behind pseudonyms and avatars together through common interest and lively
> discussion and debate. [...] They see the new digital medium for what it is:
> connections between individual users spread across many different forums and
> platforms which are ever-changing, rising and falling.
> friends/strangers/communities first, platforms/sites second.

> Another level, the social media level, is what Fruit Juice Jobs foisted onto
> the unsuspecting public who now think they understand current technology and
> the state of the art of digital communications. Of course, they don't. They
> are sold a bill of goods and put all their identities into profiles which
> are used to sell them shit. And of course now they are vulnerable to a)
> people who spend hours arguing on the internet and tearing stranger's ideas
> apart (me right now) and b) actual trolls who love exploiting psychological
> vulnerabilities to make their victims squirm and squee and cry and throw
> tantrums for the sake of drama (4chan, etc.). This decade-young group has no
> experience in what the internet actually is and takes no personal responsibility
> for their own online safety. For instance, they assume or act like a)
> Twitter will be around forever and b) Twitter can just block the bad guys and
> create a peaceful, harmonious online community. They cede all personal
> responsibility to a corporate hierarchy, like they were customers in a fast-
> food joint demanding to speak to a manager or some shit. They see and use
> the old medium in the new one, as McLuhan would say. Like how every town
> they visit has the same half-dozen franchises, their internet consists of
> the same top-5 "apps" and Google. Platforms and friends first,
> strangers/communities second.

------
hartator
Welcome to the Internet. :)

------
GhostKnight
we don't, we just don't

------
kerkeslager
(a) Bigotry is already normalized in the United States. The president is a
bigot and almost half the country voted for him. This is another negative
effect of deplatforming: if you live in a liberal city and only frequent
Reddit/Facebook, you live in an echo chamber where you don't have any
visibility into what's normal.

(b) The legitimacy of a platform is based on what is said there, not the other
way around. You view Voat as less legitimate than Reddit because you disagree
with what is said on Voat more than you disagree with what is said on Reddit.
If someone said something bigoted on Reddit, would you view it as more
legitimate? No? Then why do you think anyone else sees it that way?

~~~
gotoeleven
What did he do to call him a bigot? Enforcing immigration laws? Banning travel
from countries where we can't vet people properly? Declining to have the
military pay for people's sexual reassignment surgeries? Not letting people on
welfare earn citizenship? Restricting a refugee system that was being abused?

One of the sources of polarization is that the definition of hate and racism
and bigotry seems to be broad and expanding on the left, while on the right it
is narrow and static.

~~~
retromario
Maybe start in this extremely well-sourced article:
[https://www.theatlantic.com/magazine/archive/2019/06/trump-racism-comments/588067/](https://www.theatlantic.com/magazine/archive/2019/06/trump-racism-comments/588067/)

~~~
gotoeleven
Thanks for the link. The story about the 1975 housing discrimination case does
indeed look like Trump Management was racist. I think the question is was this
driven by personal racial animus or by competitive business considerations?
Also to what degree was Donald Trump setting these policies versus his dad
Fred? (Donald was 29 or 30 at the time).

The rest of the article is just stuff that really isn't that clear cut as to
whether Trump is actually racist. I think the last quote from the article is a
good summary of the divide:

>>>KWAME JACKSON: America’s always trying to find this gotcha moment that
shows Donald Trump is racist—you know, let’s find this one big thing. Let’s
look for that one time when he burned a cross in someone’s yard so we can now
finally say it. People refuse to see the bread crumbs that are already in
front of you, leading you to grandma’s house.

If you don't like a guy, yeah, all you need is bread crumbs to crucify him. If
you do, especially given the choice was Trump or Hillary Clinton, then you're
going to need more than bread crumbs to convince people he's racist. So the
original comment, that Trump's election means 50% of the country is fine with
bigotry, is just not true. There are people who support Trump who honestly do
not believe they are tolerating bigotry.

~~~
retromario
I didn’t respond to the original comment, just to this question:

>> What did he do to call him a bigot?

His track record is long, clear and consistent.

------
cluse
Social media platforms are designed to show you content that is more likely to
get you to engage with the site by sharing the content, liking it, or
commenting on it.

This creates an incentive for people to write provocative and controversial
content, because it's one of the easiest ways to get attention: the platform
assumes that if you're getting reactions from people, your content deserves to
be seen by more eyeballs.

I've thought a lot about this and I think there are many changes, mostly
design changes, that would decrease toxicity on social media platforms. I
think most platforms wouldn't implement these ideas because they would
decrease engagement and ad revenue on their sites.

Here are some of my crazy ideas (in paragraphs because I had trouble
formatting bullets):

Allow negative reactions from normal users to make content less visible to
others. The advantage of a downvote system is that it takes the burden of
content moderation off of moderators and puts it more on the people who
consume the content most. The disadvantage of a downvote system is that often,
when people are downvoted, they don't learn anything because they often don't
know why they were downvoted and they have to guess. Maybe when you downvote
someone, you should have to pick a reason and then a breakdown of the downvote
reasons should be available transparently for everyone to see.
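The "downvote with a required reason, plus a public breakdown" idea above could be sketched roughly like this. This is a minimal illustration, not any real platform's implementation; the reason categories and class names are all invented for the example:

```python
from collections import Counter

# Hypothetical reason categories a voter must choose from.
DOWNVOTE_REASONS = {"off-topic", "incivility", "misinformation", "low-effort"}

class Post:
    def __init__(self, text):
        self.text = text
        self.upvotes = 0
        self.downvote_reasons = Counter()

    def downvote(self, reason):
        # A downvote without a recognized reason is rejected, so every
        # downvote carries feedback the author can learn from.
        if reason not in DOWNVOTE_REASONS:
            raise ValueError(f"unknown reason: {reason}")
        self.downvote_reasons[reason] += 1

    def reason_breakdown(self):
        # Publicly visible tally: what fraction of downvotes cited each reason.
        total = sum(self.downvote_reasons.values())
        return {r: n / total for r, n in self.downvote_reasons.items()} if total else {}

    def score(self):
        # Net score still drives visibility, as in a normal voting system.
        return self.upvotes - sum(self.downvote_reasons.values())
```

For example, a post downvoted twice as "off-topic" and once as "incivility" would show its author a breakdown of roughly two-thirds off-topic, one-third incivility, instead of leaving them to guess.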

Content recommendation improvements. YouTube's algorithm is infamous for
converting people from moderates to Neo-Nazis. Part of this is that the
algorithm that shows you what video you should watch next just shows you what
videos keep people watching YouTube. It makes no effort to separate news and
facts from opinion and infotainment, so the lines get extremely blurry. If
there were separate communities on YouTube for news on different topics, or
for opinion and infotainment on various topics, it could shield people who
really just want the news from being exposed to three-hour-long alt-right
rants, if that is not what they were looking for in the first place.

We need more awareness of emotional manipulation on social media and news
content. As an example, if a headline has the word "disturbing" in it, I would
call that emotional manipulation because the headline is trying to do your
thinking for you, reach conclusions for you, and tell you what to think and
feel before you had a chance to read the article and digest it. Too often, we
vaguely point to "education" as a way to protect people from sharing toxic
content or disinformation online. The fact is that educated people are human
too, just as emotional as anyone else, and are not going to be in a fact-
checking mindset when they see a social media post with a headline or image
that makes them feel sad, angry, or disgusted. We need a platform design that
lets ordinary people flag, downvote, or otherwise participate in the fight
against emotionally manipulative content.

We need to improve the moderation process so that people trust moderators
more, can understand their decisions more, and can contest their decisions if
necessary. I think that when a moderator removes a post, it shouldn't
disappear. It should be replaced with a list of the rules that were violated
for everyone to see. And moderators should be able to hide a post temporarily
to give the author a chance to edit it and resubmit it. Also, the first time
someone posts in a forum, when they write a comment, they should see sitewide
rules and community rules right before they submit, giving them a chance to go
back and revise their post. (This mainly applies to a place like Reddit, but I
think a platform like Facebook would also be better off with community rules.)
These changes would make it so that people understand moderation more, can see
a more transparent process, will stop being surprised by moderation, and will
overall have a more positive experience with moderators.
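The moderation flow described above (removed posts replaced by the rules they violated, temporary hiding for edit-and-resubmit) can be sketched as a small state machine. All names here are invented for illustration, not taken from any existing platform:

```python
class ModeratedPost:
    """Sketch of a post whose moderation state is transparent to readers."""

    def __init__(self, body):
        self.body = body
        self.state = "visible"  # visible | hidden_for_edit | removed
        self.violations = []

    def remove(self, rules_violated):
        # The post doesn't silently disappear; readers see which rules it broke.
        self.state = "removed"
        self.violations = list(rules_violated)

    def hide_for_edit(self, rules_violated):
        # Temporary hide: the author gets a chance to fix the post.
        self.state = "hidden_for_edit"
        self.violations = list(rules_violated)

    def resubmit(self, new_body):
        if self.state != "hidden_for_edit":
            raise ValueError("only hidden posts can be resubmitted")
        self.body = new_body
        self.state = "visible"
        self.violations = []

    def render(self):
        # What a reader sees in each state.
        if self.state == "removed":
            return "[Removed for violating: " + ", ".join(self.violations) + "]"
        if self.state == "hidden_for_edit":
            return "[Hidden pending author edit]"
        return self.body
```

The point of the sketch is that "removed" and "hidden" are distinct, visible states rather than a silent deletion, which is what makes the process legible and contestable.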

On Facebook, Twitter, and YouTube, I believe some of the problem is that the
platform encourages you to follow people. When you follow personalities
instead of topics, it introduces a lot of potential toxicity and content
moderation problems. There's no benchmark for civility, nothing you're
encouraged to talk about, just opportunities and rewards for creating drama.
If these platforms made it harder to make everything oriented around personal
brands, and easier to engage in communities that at least have a stated
purpose, that would probably help to discourage toxicity, because in a
community with a stated purpose, it's easier for a community member to say,
"That type of content doesn't belong here, this community is supposed to be
about x." Where x is a fairly uncontroversial topic.

------
0xff00ffee
We don't.

It is a feedback loop. The American ideal of rugged individualism has mutated
into anti-neighbor and is exactly the model for echo chambers (not neighbors
with diverse opinions but cultural clones!), which are facilitated by
technology. The ease with which our culture Balkanizes is backed up by trillion
dollar profits from a handful of companies, which then pay politicians to look
the other way.

This online toxicity is a reincarnation of the populist hate that has driven
wars for millennia. It is a cancer that has been given super-steroids because
of the intersection of huge amounts of cash and limitless political control.
Not to mention the propaganda that idolizes billionaires and strongmen.

I think the crash and burn of America would be a solution: the collapse of
trillion dollar companies and monopolies that own the government.
Unfortunately this is infecting other countries where the top 1% wants to be
as rich as America's, hence their urgency to imitate what is happening here
for their own profit.

There's no cure: the tech monopoly and insatiable greed of the top 0.1% will
be the main drivers of global poverty and oppression, and to get there they
need to keep us hating each other and divided. They even picked wedge issues
like race and religion, which are almost 100% irreconcilable.

I would give the top-tier credit, but I think stimulating base beliefs for
profit isn't really that hard, you just need a country-sized platform.

------
jejei992o
Convince 7 billion people harm reduction should be seen as the ultimate
motivation of society.

The web has nothing to do with it.

For a while we thought more people were getting cancer. Turned out cancer
detection rates were low and lots of mysterious deaths in history were
probably just cancer.

The web has helped illuminate in the US how big the political ideology gap
always was. It was papered over by information manipulation of the corporate
press for decades, coddling the sensibilities of luddites, the innumerate, and
nesters who preferred to stay home rather than see for themselves. They also
made up the biggest voting bloc for years.

I grew up in Trump country and left two decades ago. Was shocked to find
coasties really were convinced it was Leave it to Beaver land while rural
folks were convinced urban areas are universally slums. Those are REAL
narratives I get from people today. Shocked about how ass backwards the other
cohort feels about life. Ridiculously sheltered attitudes on both sides.
Complete disinterest in negotiating. As we see in Congress.

Consider that perfect rural life and urban police dramas are common fiction
tropes, and it’s not hard to see why those emotional descriptions are the
knee-jerk go-to for the masses.

Free speech doesn’t oblige anyone else to abide the embedded semantics of the
speech in question. Emit whatever syntax you want, no one has to put their
agency into the behaviors the speaker thinks achieve the outcome they seek
with their speech.

Good luck.

------
claytongulick
As an intellectual exercise, since you seem so passionate about the subject of
anti-vaxxers...

Are all vaccines good? If not, who chooses? The government? What about when
Gov. Rick Perry tried to force the HPV vaccine on all teenage girls in TX,
even though it was suspected he had financial incentive to do so?

Is it ok for the state to force you or your kids to receive a medical
treatment? Even when there are multiple demonstrated cases of drug companies
breaking the law and corruption?

My point isn't to fall on either side of the debate.

My point is that it's wrong to silence the debate and to label people who are
thinking about these things so simply as "anti-vaxxers" and dismiss them.

Reality is complex. As soon as you're 100% sure you're right, it'll bite you.

~~~
dang
"_Eschew flamebait. Don't introduce flamewar topics unless you have
something genuinely new to say. Avoid unrelated controversies and generic
tangents._"

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

We detached this subthread from
[https://news.ycombinator.com/item?id=22203689](https://news.ycombinator.com/item?id=22203689)
and marked it off-topic.

------
LemPop
This is a systemic issue at Facebook, and it starts with decisions made at the
top. The best quote I've heard recently is that all massacres start with 'a
word'. Because Zuckerberg refused to accept responsibility for the power of
words, he allowed the genocide of 7,000 Rohingya, including children. The UN
reported that Facebook was 'instrumental' in disseminating a message by those
who spread hate. The solution is moderators, in every language, with impeccable
credentials, and if Facebook is still refusing to invest even this much, then
decentralize the whole damn social media space: Discord self-moderates, Slack
self-moderates, Stack Overflow self-moderates. No one wants to be stuck in a
room with hateful bigots, but Zuck has forced it on you.

