
The Social Media Problem - janvdberg
https://jacquesmattheij.com/the-social-media-problem/
======
dexen
Two observations:

1\. The crazy ideas _jacquesm_ decries have a limited _influence_ on one's
life. Those are (predominantly) singular topics rather than all-encompassing
ideologies. They are more conversation starters than life guides. Aside from
some rare dangerous ones (COVID-19 theories, perhaps?), their impact falls
somewhere between stamp collecting and junk food consumption. All the flat
earthers, the 5G crowd, the moon landing hoaxers, etc. - let people have
weird hobbies. We can handle them as a society.

2\. Given how prevalent conspiracy theories are - across times, cultures,
geographical locations, social niches - I posit our propensity towards them is
not a random fluke of human psyche. No, I posit the conspiracy theories play a
role in some other societal processes. I posit that trying to eradicate or
suppress _the underlying mechanism_, without understanding that mechanism in
the wider context of all societal processes, would cause unforeseen negative
consequences.

~~~
iamben
Having been around quite a lot of conspiracy-based stuff, I have to disagree
with the statement you make about the impact.

Conspiracy theories are a slippery slope. You might not believe something at
first, but they are peppered with enough truth that you begin to accept them
all because one part was right. "Question everything," as they often tell you.
The problem is, facts and science usually come from people in authority -
the people you're told to question. So you begin to blanket-distrust all
"authority" - which ends up being anyone telling you something _with_
authority.
Government, science, medicine, police, Greta...

I'll happily question - but the problem comes when your bias won't let you
trust anyone or anything. In fact you deliberately take the opposite stance
because "they lie" and they're probably "out to get" you. Then it really is
dangerous - and you feel clever, because it's like knowing some secret
knowledge the rest of the world doesn't - because the "sheeple" believe the
spoon fed lies.

~~~
chepaslaaa
I think conspiracy theories should be seen as a symptom of low quality
education, and that makes it a big deal.

~~~
thephyber
I repeatedly see very intelligent and successful people outright reject the
possibility that conspiranoids can be intelligent but misguided. I think
conspiracy thinking ("conspiranoia") is orthogonal to educational
attainment/quality. I think it's much more likely that we are all susceptible
to conspiranoia thinking (we all have the same cognitive biases and we all use
good-enough but imperfect heuristics to decide who/what to trust), but that
some of us just haven't been introduced to one we are amenable to by a person
of trust+authority. Given enough {theories, people we trust, exposure} we will
all start falling for some or many of them.

Of course the dumbest of us will fall for more theories, including the more
incredible ones, but to pretend that any one person is immune is to ignore
step 1 of a 12-step program.

------
daenz
>The fig leaf of Free Speech will no doubt be brandished as the greatest good
and too sacred to be messed with but the facts are that every form of free
speech has limitations. And that actively trying to harm others by abusing
that right should come with some kind of limit or at a minimum a way to reduce
its reach. Most of the reasoning around free speech rights and such are from a
time when getting a letter across the country took three weeks.

This is what the seed of authoritarianism looks like: moral justifications for
forcing people to believe, communicate, and behave the way you think is
correct.

~~~
white-flame
The specific limit of free speech is where it causes legal damages to others.

The particular form of authoritarian scaremongering on display tries to
undefine that court-testable line to include any useful moral panic, but never
really addresses the notion that loosening that line means their own speech
could also easily be suppressed, as the "other side" could use the new
suppression tools as freely and subjectively as themselves.

~~~
conception
Two things-

What about legal damages without a direct cause and effect? Something like
stochastic terrorism?
([https://en.m.wikipedia.org/wiki/Lone_wolf_(terrorism)#Stocha...](https://en.m.wikipedia.org/wiki/Lone_wolf_\(terrorism\)#Stochastic_terrorism))

If someone says “Someone ought to kill X...” either directly or via dog
whistles, and whoops, someone “crazy” kills them, should we tolerate that as
a society? This is being played out, for instance, by the massive increase in
violence and crime against Asian Americans because of the rhetoric of
right-wing media in regards to China - you can’t charge anyone with a direct
crime, and yet damages are happening. And if the test is specifically legally
proven damages, then well, society makes the laws, and the question returns
to: should we as a society allow people to incite violence, and how do we
draw that line? It’s not an easy question, but I don’t think it’s as simple
as “legal damages”.

The second point, which is perhaps more to the grandparent, is that all
voices are certainly not equal - especially on the internet. Should a
corporation or state actor with the finances to blanket their voice across
the spectrum of media not have any limits on it? As speech is weaponized -
which it was in 2016 and continues to be today - should it be regulated in
some way like other weapons are? What will society look like as the voices
with the power to spread their message continue to shrink in number?
Everything needs moderation and regulation, especially when it can do such
harm to society. Again, there’s a debate on where to draw the line, but I
don’t think we want to live in a society of might/money makes right in the
sphere of media communications any more than we already do.

~~~
scohesc
The points you're making have been known issues of freedom of speech since the
drafting of the constitution. I think American society as a whole understands
this, and is willing to make these sacrifices because otherwise you open
yourself up to authoritarianism by more powerful entities.

------
smitty1e
Ain't the software; it's the peopleware.

> But there are a lot of them and they all have the vote, waiting to be pushed
> as useful idiots to some higher goal. Politicians have started to realize
> that they don’t actually need to get a real mandate, and that they won’t be
> held accountable for lying. At that level, every time someone lies more
> people become susceptible to being pulled in to this web of deception.

One viable thing to do is redistribute political power instead of wealth.

The act of trying to centralize everything into a legacy, monolithic,
waterfall, big iron system results in a Tower of Babel.

Its proponents swear it's Progress.

Return to the enumerated powers in the Constitution, and refactor all this
Progressive cruft to the State level, where voters can either be wise or
spendthrift at their leisure.

Doubtless we'll prefer a more catastrophic collapse, but dreaming is still
occasionally legal.

~~~
joe_the_user
_Ain't the software; it's the peopleware._

What does that mean?

 _One viable thing to do is redistribute political power instead of wealth_

No, that's clearly not viable. The US is still "really" a democracy in the
sense that those who get the most legal votes really do win - real, fair
elections, right now, are a big factor in the distribution of power in the
US, and for democratic-ness on a world scale, that's pretty good. Yet the
extremely unequal distribution of wealth makes a mockery of the realness of
voting's impact. Most people are effectively powerless to make an impact on
much outside their immediate lives.

 _Return to the enumerated powers in the Constitution, and refactor all this
Progressive cruft to the State level_

This is an extreme reactionary viewpoint, taken from the 3-percenters or a
similar outfit.

~~~
shaftoe
"Most people are effectively powerless to make an impact on much outside their
immediate lives."

Isn't this a good thing? Doesn't it mean that they can't harm others?

~~~
lovelyviking
It wasn't a good thing in Auschwitz, Birkenau and other death camps for
those who were about to die, while others around did nothing to stop it.

It isn't a good thing for people in Belarus right now, with a dictator
applying brutal force after he didn't manage to win even unfair elections.
Police simply shoot people in the streets, even aiming at home windows if
people scream in protest.

[https://www.youtube.com/watch?v=4-ynUNGNHdk](https://www.youtube.com/watch?v=4-ynUNGNHdk)

------
intended
Since people are discussing conspiracies, I present to you the Batman argument
to inoculate your near and dear ones.

1) All conspiracy theories require the evil organization to achieve several
superhuman/super-organizational feats, such as:

* Perfect coordination - across timezones, personalities, objectives

* Great wealth

* Excellent control of public figures

* Advanced technology

* Great secrecy

* Etc.

These are all impossible asks, which any normal person knows if they have
worked in any team of people.

Many conspiracies die because of how hard it is to keep everyone aligned, and
if people were that efficient in secrecy, imagine how efficient they would be
without that burden.

Therefore,

Implication 1) If you believe in the conspiracy, then we must also live in a
world where such levels of efficiency are _possible_,

which means that

2) It is also possible for an organization or individuals to exist, who take
advantage of these efficiencies and advantages to do "good".

Now if it was possible to stay in the dark while having advanced tech and
great wealth - well, you've just described Batman.

So the question you can train all your loved ones to ask, anytime they
encounter a great claim, is whether this claim means that we live in a world
where Batman can exist.

\------------------------

While this is a childish way to encode a defense, it's meant to be easy to
consume, and to protect people who regularly won't have time to spend the
effort to verify information or toxic content when they encounter it.

So while it is outlandish, that is partly the point - to diminish the
earnestness and terror that conspiracy theories exude, without your active
intervention being required.

edit: This can apply to any superhero or superfriends. I just chose Batman.

------
babesh
This thesis is crap and is blind to much bigger conspiracy theories that have
been going on for thousands of years and that still have much stronger holds
on societies than anti-mask conspiracy theories. Modern social media did not
cause them to grow to what they are, since it did not exist when they were
created and grew.

Christianity: Jesus died and came back to life. He died for your sins. You are
going to hell if you don’t repent your sins. Please give the church 10%.

Buddhism: You are going to endlessly suffer. Please contribute to the temple
to support the monks. Please pay monks to bless funerals.

Hinduism: You are born into a caste and stuck in it. You are shit and
untouchable. If you touch me or try to marry my daughter, I will kill you.

The reality is that systems and people have preyed on human weaknesses since
time immemorial. It is much more interesting to examine how those
technologies/systems operate.

~~~
babesh
Free speech was protection against conspiracy theories and now the author
wants to ban free speech to protect us from conspiracy theories. Sheer
nonsense.

------
mmaunder
Who gets to decide who doesn’t get to speak? If an individual, we suppress
popular opinion unpopular with leadership. If the collective, we suppress
unpopular data and ideas that may be accurate or innovative.

You can’t have selective freedom of speech.

~~~
razzimatazz
I agree selective freedom of speech is hard. Probably too hard.

I tend to support selective augmentation of other people's speech - "fact
checkers have shown this to be inaccurate" next to a conspiracy fan's latest
comment. As long as it is more precise than just downvoting to oblivion.

~~~
jdashg
And who's to object to that! Speech is free and visible, and no information
is withheld; it's just not passively amplified without reasonable vetting.

~~~
krapp
>And who's to object to that!

I take it you didn't see the outrage around here when Twitter "fact-checked"
Trump's statements - even that was denounced as propaganda and censorship.

------
quicklime
> Filtering out the bad from the good is going to be very important if we
> don’t want to accidentally lose such minor marbles as our democracies, our
> health and our safety.

I really think that a better solution to this problem is to fix the
shortcomings of our democratic systems. The majority of people don't believe
that 5G towers cause COVID-19, or that 9/11 was an inside job. A stable
democracy should be resilient enough to withstand a small number of wrong
people.

But that's not the case for the US. The election result for the entire country
(of 330 million people!) can be flipped by a few tens of thousands of voters
in the right electorates.

I recognize that this is going to be a hard problem to fix. But I do believe
it would be easier than trying to reverse the laws of economics that drive
social media companies to do the things they do.

~~~
burlesona
I have wondered how much less (if any) this is a problem in Democracies with
Parliamentary systems, particularly any that don't use "first past the post"
voting. Anyone from outside the US have insight on that to share?

------
uniqueid

> quick way to report a Tweet or FB post that spreads disinformation would
> already help a lot.

That point is what always goes through my head when a Dorsey or a Zuckerberg
or a Wojcicki gives another "more work to be done" apology.

If they've been doing so much work, and it's just the "moderation is hard"
problem that has slowed them, why do all of their sites make it so convoluted
to report content?

~~~
themacguffinman
How is it convoluted? The "Report" button is quite easy to find in tweets and
posts.

~~~
dgellow
I used it a few times to report clear old-school racist stuff. My experience
is that it takes 7 steps to report that content on Twitter in Germany, and it
isn’t as easy as expected. I documented the process here:

[https://twitter.com/dgellow/status/1274398162711691266?s=20](https://twitter.com/dgellow/status/1274398162711691266?s=20)

Also, every single time, Twitter responded that they won’t do anything about
the reported content (again, I’m talking about clear old-school racist
content).

~~~
themacguffinman
To be fair, that doesn't seem unduly convoluted to me. They are doing their
due diligence: it looks like you specifically picked the Twitter-is-liable-in-
Germany-if-they-get-it-wrong legal option, so they very reasonably give you a
questionnaire because they want you to be specific about what you're demanding
of their vaguely defined Netzwerkdurchsetzungsgesetz obligations.

Then they got back to you, saying they disagree with your judgement.

That's a lot of "moderation is hard" work and they're doing their job. However
much you might disagree with their verdict, they're not hiding reporting
functionality or making reports unduly difficult or ignoring reports.

~~~
dgellow
> it looks like you specifically picked the Twitter-is-liable-in-Germany-if-
> they-get-it-wrong legal option

I don't know what you mean by that. I live in Germany, I used the normal
report button, I haven't picked anything special.

In any case, given my experience, I will definitely not use that feature
anymore, it's too much effort to just report something that should not be on
the platform. The fact that the report button is available and easy to find
isn't enough for the platform to self regulate if the rest of the process is
broken.

~~~
themacguffinman
> I used the normal report button, I haven't picked anything special.

When you reported the tweet, you picked the "covered by
Netzwerkdurchsetzungsgesetz" option, which means you're demanding that Twitter
comply with Netzwerkdurchsetzungsgesetz to act on a violation covered under
the German law.

It's clear you live in Germany because Netzwerkdurchsetzungsgesetz reports are
only available in Germany, and given that Netzwerkdurchsetzungsgesetz reports
have a legally defined scope that only applies in Germany, I'd guess that it's
treated differently from reports for regular violations (like "It's suspicious
or spam" or "It's abusive or harmful").

Other regular options, like the "abusive" one, are much shorter and easier
than picking out sections of a law.

~~~
dgellow
Sure, but these other choices are also not correct. It was neither spam nor
abusive or harmful.

------
intended
These conversations become hard because it's not well discussed, and the
finesse to divide these into "new" and "old" problems is still being
developed.

For example, an "old" problem is that this is how humans interact with
information and sensationalism. The news cycle effect - the sale of content
along with the packaging of professionalism - was a problem before the net.

But there are also novel situations where this new rate, scale, and quality
of information creation/dissemination results in new structural issues.

An example of this is the perception that a belief is more widely held than
it is in the general public, due to social media. If you got into a
conspiracy group, and it were populated by 1000s of real conspiracists, that
would still be a small fraction of the population once filtered by (people
online) × (people who speak your language) × (other filters).

BUT - the human brain sees "enough" activity in the forum/web page to
trigger an internal "critical mass" threshold; this creates "social proof"
and the belief gets solidified.

A clear field where tech creates new problems is where tech outstrips old
societal and human thresholds of functioning.

Social media feeds which reinforce information, constant and from your pocket?
New problem

Automated creation of content to populate feeds? New _New_ problem.

Creation of propaganda? Old problem, new medium, new tools - question on
whether scope and scale make it a new problem.

\-----

Perhaps the "new" problems can, to some extent, be solved by tech. "Old"
problems will almost certainly need tech+political will and manpower to solve.

~~~
mistermann
> BUT - the human brain sees "enough" activity in the forum/web page to
> trigger an internal "critical mass" threshold, this creates "social proof"
> and the belief gets solidified.

I think you're pointing out some important but extremely easy to overlook
things here. How many newspaper articles, blog posts, and twitter/forum
discussions have we had in the last two years about fake news and conspiracy
theories, typically written in confident but grave tones? But of the people
writing these things, how many of them _really_ know what they're talking
about? What is the _source_ of their knowledge? What is the underlying source
of the claims Jacques is making in the article we're discussing? How many of
the people opining on such subjects actually spend any serious time within the
communities they're reporting on?

If the topic/culture was something other than the behaviour of conspiracy
theorists (say, African Americans), do you think it would have any effect on
how the information would be received by readers? In that case, I suspect a
lot of people might fairly quickly wonder if the author actually happens to
know as much as they claim, and I think it would be fairly easy to find plenty
of anecdotal conversations on the internet to substantiate that prediction.

The human brain "sees" all sorts of things. It can produce an instantaneous
answer to most any question you ask it, whether or not it actually has much
actual concrete data to work with. There are a variety of ways to control
this default functionality; the major progress we've made with decreasing
racial stereotyping in the last several decades is a good example. But that
took _a
lot_ of coordinated messaging, education, and peer pressure, for a very long
time. Despite that, we're still rather far from declaring mission
accomplished. _And that is just one topic_ \- how many others do we have in
this same category?

I doubt we'll make much progress on these issues, especially now with the
brutal efficiency of mass communication we've released into society (with
hardly a second thought), until we can finally realize what the root of the
problem is: the human mind. And it's not just "those other people" whose minds
are flawed, the problem is with the base hardware and software itself - it's
just easier to notice in some people than others, especially when it's
constantly in the news.

~~~
intended
Funnily, the questioning of authority is a common tactic for overturning
common-sense arguments in conspiracy circles, kinda showing how easy it is to
make the argument, and how tough it can be to defend against.

And even more ironically, since I don't have any sources off the top of my
head which can link you to good content with research, where people are
verifiable sources, I am left with personal experience and memory :).

~~~
mistermann
> good content with research

On which topic?

------
motohagiography
Maybe the only ethical thing to do at this point is for the founder/owners of
these platforms to commit acts of sabotage so that people might become free of
them. Investors are all hedged, founders have cashed out, some of them aren't
profitable anyway, and employees can find other work if it's done over time.
Nothing of value would be lost.

The point about connecting gullible people is interesting and useful. It's
likely social media has just automated the underlying phenomenon behind mass
hysteria.
([https://en.wikipedia.org/wiki/List_of_mass_hysteria_cases](https://en.wikipedia.org/wiki/List_of_mass_hysteria_cases))

The real social media problem is we've put a slot machine in every pocket, and
now nobody can turn them off until the hysteria has run its course and we
evolve a resistance to them - or if the founders do this one unthinkable
thing.

------
baby
So we have great tools to facilitate communication between people, and yeah,
there is a downside to having people communicate - but does that mean
communication is bad? I don't think so.

------
core-questions
> It would really help if both Facebook and Twitter would be far more pro-
> active in shutting down these fountains of nonsense.

When I see this, especially from someone who bills themselves as

> "Professionally I get paid to separate fact from fiction"

it horrifies me. What they're saying is that the existing draconian control of
speech that is implemented on the social media networks is not enough, that we
need to hire even more people to do moderation, that free speech needs to die
even harder, because we can't trust poor stupid people to make their own
decisions, read what they want to read, and say what they want to say.

What a bunch of elitist bullshit. Small wonder that people are increasingly
finding people like Jacques Mattheij to be unpalatable, nanny-state toadies
that are to be routed around at best and attacked argumentatively when needed.

Yes, there's bullshit about COVID and chemtrails and aliens all over the
internet. Why do you care so much? You're so worried that people are going to
make their own choices, live their own lives, make mistakes that impact
others?

This shows me that people have not learned what free speech is, what it is
for, and why it was a hard-fought right that continues to need fervent
defense. Mattheij can't conceive of a world where his draconian moderation
system could possibly backfire. Mattheij hasn't looked at what's happening in
Belarus, in India/Pakistan, and what happened during the Arab Spring when
governments decided to use the power at their disposal to shut down the
people's ability to freely converse with one another, even about Unapproved
Topics, even with Officially Illegal Opinions. Mattheij either didn't read any
of the excellent 20th century dystopian science fiction, or thought it was a
user's manual for how to create an idealized totalitarian state.

Because of course, gigantic corporations and the government are going to
choose what's in everyone's best interests. Of course, it is inconceivable
that they would have any agenda of their own, or would want to make cultural
changes at scale without the full consent of society. It's never been tried,
anywhere, right? Never ends badly?

Ahh, but it's to save us from ourselves, right?

~~~
dvt
Man, it's sad that HN is going down the reddit rabbit hole, spam-downvoting
these kinds of well-thought-out posts that (admittedly) go against the
zeitgeist of SV "know-better-than-thou" elitist mentality -- with absolutely
_zero_ discussion or counter-arguments. The arguments here are sound:

\- Free speech is a virtue worth striving for

\- Disinformation -- to the detriment of the public -- is a price we're
willing to pay (barring some narrow cases)

\- The alternative is draconian authoritarianism

As Juvenal asked almost two millennia ago: _Quis custodiet ipsos custodes?_
I'm sick of programmers who never picked up a philosophy or political science
book in their lives thinking they just solved the world's problems by giving
Google or Facebook _carte blanche_ to censor as they see fit (because you
happen to agree with the outcome today).

~~~
PaulDavisThe1st
>As Juvenal asked almost two millenia ago: Quis custodiet ipsos custodes?

Part of the last 2000 years has been the attempt to develop systems of
government which inherently reduce the importance of this question. For
example, by enforcing churn on "custodes", by creating a culture in which
radical transparency is the norm, and by encouraging highly participatory
democracy.

Now, I will concede that these efforts have not borne much fruit. Certainly
not enough fruit to make Juvenal's question moot. But we are now in a situation
where we may be faced with "draconian authoritarianism" _because_ of the
social and psychological impact of social media and its manipulation, so I'm
not convinced that this "free speech vs. draconian authoritarianism" dialectic
is really useful in an analysis of this issue.

Asserting that there are only two ends of the spectrum - free speech or
draconian authoritarianism - is a very typically American thing to do; other
cultures have a much easier time with the idea that while the slope may be
slippery, we can choose to take some steps down it and then _stop_.

(I'm not saying that I know you're American, merely that this sort of
perspective is much more typical in the political and civic culture of America
than most other countries in the world).

And frankly, I don't think your arguments are sound, because they hinge on a
hand-waving claim:

>Disinformation -- to the detriment of the public -- is a price we're willing
to pay (barring some narrow cases)

You (and I, and the rest of us) don't _know_ what that price really means.

~~~
acephal
>Part of the last 2000 years has been the attempt to develop systems of
government which inherently reduce the importance of this question. For
example, by enforcing churn on "custodes", by creating a culture in which
radical transparency is the norm, by encouraging highly participatory
democracy.

The majority of the last two thousand years in Western Europe was inept
Roman dictatorship followed by Germanic warlords duking it out. More like the
last 500 years, no? Saying 2000 years lends an air of credibility that
shouldn't be there, though you may just be going for word play in response to
the grandparent.

------
fossuser
I think he's right, but there's a subtlety here that's often missed.
Zuckerberg talks about it directly [0] and I think his approach is probably
the most reasonable.

Social Media platforms do have some responsibility to moderate (and they have
the legal Good Samaritan protection to do so in the US thanks to section 230).
They have a particular responsibility since their 'engagement' algorithms have
often historically led to making things worse by elevating controversial
content. When they're using algorithms to elevate certain types of content
they are employing some level of editorial control and have some
responsibility as publishers in how they rank that content (what they choose
to show more widely, what they choose to allow to fall into obscurity).

FB for their part has put a lot of effort into determining this kind of
moderation standard and what they allow from users [1]. The recent political
stuff is interesting in part because Zuckerberg is arguing that democratically
elected politicians should be an exception to these rules, basically that
citizens have a right to see the speech of their elected leaders (and lies
should be corrected by a free press). While FB is legally able to moderate the
speech of an elected politician on their platform, Zuckerberg argues it's
wrong for FB to decide what political speech is okay to block because they
shouldn't be in the position to make that call. I think he's right to be
concerned about that precedent. This doesn't apply to the speech of regular
users or even to the political speech of non-democratically 'elected'
politicians.

[0]: [https://zalberico.com/essay/2020/06/16/mark-zuckerberg-
and-f...](https://zalberico.com/essay/2020/06/16/mark-zuckerberg-and-free-
speech.html)

[1]: [https://www.vanityfair.com/news/2019/02/men-are-scum-
inside-...](https://www.vanityfair.com/news/2019/02/men-are-scum-inside-
facebook-war-on-hate-speech)

[Edit]: Also this:

> "One thing that interests me and that I have so far not been able to find
> out is how it starts. How does a ‘normal’ person step into this cult world
> where up is down and not think to themselves: “Hm, this does not seem like
> it is believable”."

I think people _do_ think this at first, but I think we are all more
vulnerable to being infected by bullshit than we like to believe. People are
wildly inconsistent, even in their own views most of the time (yes even you,
and me). We think poorly, many people believe crazy things, we argue via
motivated reasoning, don't extend views globally, don't consider things the
same when they're out of sight etc.

When exposed to lots of wrong information repeatedly I think most people get
corrupted, even those that are pretty analytical. I think it takes continuous
vigilance to not believe crazy things (and I think people get worse as they
get older at being able to do this well).

I don't think these are outliers, I think this is the norm. Most people just
don't wonder about things at all so avoid being radicalized into harmful
action based on their own crazy beliefs. I also suspect in-person interaction
has a lot of built in de-escalation mechanisms that bias towards unity and
friendliness most of the time, so things get worse when you lose that natural
control.

~~~
not2b
What I notice about the Facebook way of doing things is that, especially in
their mobile app, they will highlight the most outrageous and flame-worthy
comment on any contentious topic. The algorithms, I'm sure, are doing this
deliberately, because they prioritize "engagement". They want people pissed
off so they'll respond, get into arguments, and see more ads in the process.

Remember the classic XKCD comic "Duty Calls":
[https://xkcd.com/386/](https://xkcd.com/386/) ? It seems the social media
companies have decided that promoting this kind of thing, keeping that guy up
all night arguing, makes them more money.

~~~
fossuser
They definitely did do this (though arguably not intentionally to spread
outrage) and scammers took advantage of it. Scammers spread misinformation
that generated controversy to drive revenue to ad farms (and of course Russia
leveraged it more for political rather than monetary purposes).

Steven Levy's new book (Facebook: The Inside Story) goes into a lot of detail.
One interesting thing was that in 2016 the false stories tended to target pro-
Trump conservative groups because the lies spread more easily there (people
were more willing to just share things that fed into existing cognitive
biases). They tried fabricated stories with the left too, but those tended to
be less effective because people would call them out as fake and they'd fizzle
out (though I'd suspect today it'd be easy to spread lies feeding into the
more extreme aspects of the left too).

Today though they're directly aware of this and FB in particular has a ton of
people working on 'integrity' to stop this kind of bad incentive.

------
nemo44x
Ok, I’ll bite. I’m sure this person means well. But I have to ask: why are
they the authority on what is OK for people to believe, talk about, and
propagate? Who owns the truth? Why are this person’s views the correct ones?
And what about things that are talked about on the internet, that have their
groups, that they don’t find problematic but others do? Why are those OK?
Because they believe this?

The foundation of Liberalism (I’m not saying left wing here) is that no one
owns knowledge. No one has the last say, and all opinions can be heard. And
mocked relentlessly. And through this we will get as close to the truth as we
can.

Nobody - no group, movement, person, or authority owns the truth. All ideas
are fallible. If people use the internet to say the most outrageous things
then so be it. The minute someone gets to say “it’s OK to say that, and this
other thing is not. End of story,” we are in trouble. It doesn’t mean someone
can’t be raked over the coals for their awful ideas - you need to own them.
But anyone or group that gets to determine what is ok and what is not is, in
essence, a fundamentalist regime.

We don’t need inquisitors. We don’t need thought police. We do need people
willing to say “that’s bullshit, and here is why.” If you can prove that,
generally speaking, 2+2=5, then the foundation of arithmetic is broken. But
meanwhile I can poke holes in your argument. Until you can prove it better
than 2+2=4, I can judge you a sophist at best, a dipshit less generously.

~~~
elisbce
> The foundation of Liberalism (I’m not saying left wing here) is that no one
> owns knowledge

This is complete BS to begin with. "No one owns knowledge" is a typical
ideological statement without any merit or proof. It is an idea that you want
to believe, based on nothing factual.

Yes, you can have opinions. Tastes. Ideologies. Beliefs. And they could be
true, or false, or changing, depending on where you are and how you look at
it. They are all subjective.

However, there are things in this world that are neutral to opinion and stand
the test of time. Facts that happened in the physical world and mathematical
knowledge are two examples. They are the truth. They are absolutely correct.
There is no room for "disagreement" because they are not subjective opinions.
They are facts that can't and won't be changed.

Math is always correct. 100% correct. From the day it is discovered /
invented until forever. Explicit assumptions and theorems established by
rigorous proof are absolutely correct. You can expand it later, you can
discover more, you can enrich math. But as long as it is correct, it stands
forever. Every single second you are using technologies based on mathematical
knowledge that was proved thousands of years ago. It has never failed, and
will never fail.

Facts in the physical world are objective as well. You can argue whether a
killer is a good person or not, or how good/bad he is, but you can't alter
the fact of whether he pulled the trigger and killed someone. That is a fact.

Physics, based on a wider set of physical assumptions, can be viewed as close
to the truth, because it has stood the test of time through physical
experiments and observed phenomena. There is a chance that current theories
are not completely true and will be subsumed by more advanced ones, just like
Newton's Laws; however, they are close approximations to reality. That's how
we get rockets into space. By science.

Now to my point. You have the right to express your opinions, even if they
are harsh criticisms. But you don't have the right to spread false
information that can be proved false by facts. You don't have the right to
spread lies. You don't have the right to say whatever you want, however you
want.

Why? Because statements against facts are false, are lies, and spreading lies
can hurt others and cost lives, sometimes millions or more. This is why there
are laws against spreading hate speech, against spreading extremist
ideologies targeting the human race, and against fraud and scams.

Free speech means being free to express your opinions within a framework.
Outside those boundaries, your speech becomes a weapon that can hurt others,
and that is not allowed by law or society.

The fundamental rule of society is simple. You have every right over
yourself, but you don't have the right to hurt others.

~~~
froasty
That these facts inhere in the physical world (and I don't disagree) does not
mean they correspond directly to the beliefs of the individuals living in
said world. Language exists and is the vehicle used to transmit beliefs about
facts.

This is why we speak of proof in the first place. The fact that person A died
at another's hands does not directly correspond to the belief held by an
individual that person X was the killer. This is why we have "beyond
_reasonable_ doubt" as a legal evidentiary standard.

I agree that factual axiomatic truths exist in the physical world and are
evident to individuals _as_ factual axiomatic truth. The problem is
communicating these as such without granting license to counterfeits.

The most successful form thus far has been to let the facts speak for
themselves to individuals as factual axiomatic truths.

The least successful has been to establish unassailable dogma concerning what
is and is not factual.

The certainty that you and others present concerning your set of beliefs-as-
facts-themselves is of the same species as other true believers throughout
history. Historically, this has led to the suppression of the scientific
method and an open society in favor of a prelatical and clerical class that
determines by edict what truth is, often solely at the behest of the hegemonic
power and not factual axiomatic truth.

This is why it is preferable to have minimal rather than maximal control over
the exchange of ideas concerning the nature of what constitutes a fact, let
alone what those facts actually are.

------
username3
Misinformation spreads because Twitter rate limits users replying to
misinformation.

------
4bpp
Anecdotes abound, when this is really not the sort of debate that should be
conducted using anecdotes (cf.
[https://slatestarcodex.com/2015/09/16/cardiologists-and-
chin...](https://slatestarcodex.com/2015/09/16/cardiologists-and-chinese-
robbers/)). Sure, some societies have uncensored social media and wind up with
a lot of people believing that vaccines are a Bill Gates conspiracy,
occasionally leading to real harm. Other societies, as a downstream effect of
uncensored social media, wind up with racial pogroms, as a commenter elsewhere
in here pointed out. On the other hand, some societies with censorship wind up
totalitarian hellholes, and some societies didn't even have social media at
all and still had racial pogroms (whether initiated by local word of mouth, or
the local counterpart of the same elites that presumably would be in charge of
censoring social media if there were social media). The existence of these
anecdotes tells us nothing about whether we would be better off with or
without uncensored social media, or whether some other more subtle solution
than censoring or not censoring would lead to better outcomes. So, which one
is more likely? Which one has worse consequences _in expectation_? This
argument can only be done quantitatively, and as far as I know nobody has
produced the numbers that we would need to do that.

We might as well try to determine whether private possession of bladed
objects should be permitted by comparing whether an emotional account of a
marital stabbing, a defense of the cultural value of home cooking, or an
appeal to the value of adults feeling like society trusts them to hurt
themselves feels most moving.

(If we stop trying to pretend that there is a meaningful utilitarian argument
to be had here, though, on an _idealistic_ level, where emotional appeals are
properly at home, I'm with the more critical commenters here. Quoting from low
culture like video games may be a little tacky, but: "Beware he[sic] who would
deny you access to information, for in his heart, he dreams himself your
master.")

------
mountainboy
This article sounds to me like a lot of hand-wringing and complaining that
people are thinking for themselves and talking amongst themselves rather than
worshipping at the ivory tower.

In other words, music to my ears.

~~~
mountainboy
downvoted... must've struck a nerve.

~~~
chillacy
> Be kind. Don't be snarky.

> Please don't comment about the voting on comments. It never does any good,
> and it makes boring reading.

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

------
throwaway234101
> But a person who does PCR analysis and knows the ins and outs of PCR tests
> that maintains that there is no such thing as COVID-19 is on another level
> for me. That’s the kind of purposeful refusal of the world as it exists all
> around us that I can not get my head wrapped around.

A person who specializes in analyzing viruses tells the author something
counterintuitive and instead of re-assessing his own assumptions, he labels
the specialist delusional. That last sentence, "That’s the kind of purposeful
refusal of the world as it exists all around us that I can not get my head
wrapped around", could easily apply to the author.

