
Platforms' current attempts to regulate misinformation amplify opinion power - walterbell
https://www.tandfonline.com/doi/full/10.1080/21670811.2020.1773888
======
bjt2n3904
Flash back, four years.

"But @bjt2n3904, misinformation is being spread, and that is causing 'real
harm'(TM)!" Some justification. Real harm is being done right now, and it
doesn't need social media to survive.

You know, I spoke out against online platforms taking responsibility for what
their users wrote back then. But I wish I had had the forethought to say this:

How do you think conspiracy theorists, who believe their messages are being
suppressed to hide the truth, will react when Facebook steps in and fulfills
their prophecy?

How do you think people who don't live in the tech bubble will react when they
share an opinion on politics, and it gets marked as "Actually this is untrue"
by Facebook, and buried?

How do you think people, who already believe everything they read on the
internet, will react when they see Facebook flag something as "untrue"?
Certainly, it won't reinforce their biases about how <political opponent's
supporters> are all misinformed people with the wool pulled over their eyes,
who can be dismissed out of hand.

Surprising no one, these attempts at correcting "misinformation" are having
the opposite effect. Who would have thought that using technology to solve an
obviously human problem wouldn't work?

Edit: "Ackshually, @bjt2n3904, Facebook isn't under any obligation to follow
the 1st Amendment, there's this great XKCD comic that explains this, hold
on..."

This is what we mean when we say "freedom of speech". It isn't just a
restraint on the government; it's good for society, too.

~~~
metalliqaz
Honestly I don't care what conspiracy theorists, science illiterates, and
racists think. Let them believe the world is against their points of view. It
should be! I want to make sure the ignoramuses don't infect others.

~~~
bjt2n3904
I hear what you're saying. It is incredibly difficult and frustrating to hold
a discussion with these types of people. But look at the life of Daryl Davis.
Instead of socially isolating "those people", he reached out to them. By some
counts, he's helped over 200 people leave the KKK. I believe his methods have
been far more effective than socially ostracizing people.

~~~
shuntress
That type of direct action on an individual level is definitely valuable, but
it is also difficult to scale.

The issue you originally brought up though concerns mass communication on a
different level than those individual actions.

For example: Is the top of my facebook news feed an article about Daryl Davis
(how his methods are helping misguided people and how I might apply or examine
those in my own life to gain a better understanding of and more useful role in
society) or is the top of my facebook feed an article about how there are many
fine people on both sides and that maybe the KKK has a point and I should hear
what they have to say?

~~~
bjt2n3904
I can't speak for what's at the top of your Facebook feed. (I can hardly speak
about what's at the top of my own!) This is something beyond my control. It's
easy to sit back and call for Facebook to change the channel for me. It's much
more difficult to be the channel of change. (Especially, I might add, when
algorithms control what people see. Even so, when I directly respond to my
friends, that's hard for them to ignore!)

The only thing I can be in control of, is what comes out of my own mouth.
These are the things I believe in and stand for, that I earnestly think have a
far better chance of rectifying things like racism, hatred, and our divided
nation.

Having Facebook and Twitter do the work for us is alluring. But judging by the
results so far, I don't think it's working. As Daryl said, he's a musician.
Not a trained psychologist. If he can do it, so can you! Where should we
invest our efforts? In what has been proven to reduce hatred, but is difficult
-- or what we falsely hope will reduce hatred, and is easy?

~~~
shuntress
That is an example. Not a literal description of _my_ facebook feed.

>It's easy to sit back and call for Facebook to change the channel for me.
It's much more difficult to be the channel of change.

The point is that these are two separate things that should _both_ happen and
_should_ enable each other in a positive feedback loop. I wouldn't phrase it
in as dismissive a way as calling for facebook to "change the channel for me".
Rather, I would say that I don't want facebook picking the channel for me in
the first place.

------
rsweeney21
Thomas Paine's "Common Sense" and "The American Crisis" would have been
considered "misinformation", "propaganda" or "harmful content". These are
considered "the two most influential pamphlets at the start of the American
Revolution, and helped inspire the patriots in 1776 to declare independence
from Great Britain" (Wikipedia).

So many of the great discoveries in science were controversial in their time.
"On the Origin of Species", expanding universe, microbiology. Do you really
want the government deciding what is acceptable? No, we don't. We fought a
revolution to have the ability to think for ourselves.

I'm not that worried though. Even if platforms are subjectively regulated by
companies or the government, life seems to find a way. There are plenty of
other mechanisms to publish scientific breakthroughs.

~~~
bjt2n3904
Neil Tyson, 2018: I’m okay with a US Space Force. But what we need most is a
Truth Force — one that defends against all enemies of accurate information,
both foreign & domestic.

Has a government ever made the wrong choice when it decided which scientific
theories are valid? I mean... I certainly can't think of any! Hooray, Truth
Force!

[1] https://twitter.com/neiltyson/status/1031556958153666561

~~~
john-shaffer
The problem with Truth Forces of the past, such as the venerated Inquisition,
is that they just couldn't be everywhere at once. They could never completely
stop people from saying things in private, or from running to safe havens. If
only the governments of the time had had platforms as pervasive and automated
as Facebook and Google! They could have fact-checked and shadow-banned content
about the earth being round or moving around the sun, the earth being older
than 6,000 years, species adapting over time, etc. The damage has already been
done, but at least we have proper censorship in place now to bury any such
misinformation that may occur in the future.

------
joshribakoff
I'm not buying into this article. Twitter, for example, isn't the one wielding
this "opinion power"; the bad actors (certain "politicians") are. Placing a
warning label on a politician's factually inaccurate statement isn't wielding
opinion power.

Additionally, the title is very clickbait. There are no data or citations that
would support the conclusion that attempts to regulate misinformation amplify
bad actors' efforts, as the title implies. In fact, the paper points to other
countries that have regulated things like child pornography and hate speech,
which is pretty unrelated to regulating misinformation.

~~~
mc32
The thing is, the warnings may not be applied equally. And even if they are
applied equally to all liars, some claims may in retrospect turn out to be
right, and then the warning turns out to have been wrong.

~~~
joshribakoff
To me this just sounds like an argument against fact checking in general. All
of this is also true of any third-party fact-checking service; either way,
someone can be politically slighted. But anyone is free to start their own
third-party fact-checking service (at least in a free society).

~~~
AnimalMuppet
It is true of any fact checking service. It's important, then, that there be
more than one. Any single oracle of truth becomes a single point of attack for
liars, propagandists, and suppressors of truth.

------
giantg2
How about the fix being proper education?

If we educated everyone in logic and reasoning, then each person could do
their own fact checking. I'm pretty sure that if the citizenry isn't capable
of doing its own fact checking, it will also be susceptible to falling victim
to false or compromised fact-checking systems. This lack of education is, in
my view, the root of many problems in the US today and undermines the basis
of a legitimate, well-founded democracy.

~~~
pubby
Two of the most gullible, conspiracy-minded people I know have PhDs in tough
fields. I don't think education is as related as you think.

~~~
yks
I once heard some gossip about the Triple Nine Society (99.9th IQ percentile).
Apparently the members are mostly doomsday preppers and conspiracy believers.
Obviously selection bias applies but I found that somewhat thought-provoking.

~~~
pessimizer
I joined Mensa for a while and found it to be full of right-wing weirdos who
are absolutely the audience for this stuff. The boardgaming was good, though.

~~~
giantg2
Interesting. Both of the people I know who are in Mensa seemed very
middle-of-the-road. One even seemed to lean left. Perhaps it's influenced
by location?

------
radisb
>Moreover, as the self-appointed facilitators of the new self-government of
the online global population, currently there is little that would prevent
the leading social media platforms from using and canalizing this civic power
for the goals they see fit – much like a government, with the difference that
governments are subject to democratic oversight.

Two things on the above. First, "self-appointed" does not matter; it is just a
declaration. What matters is that people/users themselves turned this
declaration into a fact. Second, platforms are subject to democratic
oversight: just as you control your vote, you control the keyboard or mouse
that clicks the sign-up button or accepts the TOS, the mouth and the fingers
that propagate the information, and the brain that consumes it.

I really dislike it when people, by their own actions, become dependent on a
service someone else offers to such a degree that they feel entitled to the
service and treat it as a public good. They strive to establish and cement
this dependence on their own terms, and then assign the responsibility of
maintaining it to the offerer (again on their own terms). That's the
relationship between a spoiled child and its parents, not between a person
and society.

Sorry for my bad English. Not my mother tongue.

------
IncRnd
> What if … Facebook was a government? It would govern a huge nation. With an
> expected rise in 2020 to 2.6 billion users, it would connect more people
> than are governed by any one nation on this planet.

If Facebook were a government, there would be a civil war almost immediately.
It provides no services. The only income or taxation for the hypothetical
Facebook government would come from selling out its citizens. Facebook is
only alive because it is protected by actual governments.

------
Threeve303
Once real ID and contact tracing lead to a social credit system, we could mass
produce polygraph machines and attach the feedback to your new centralized
login. Shrink it down enough and add a polygraph module to every phone. That
way, any time you post, people could see your polygraph results and know that
there's an 80% (?) chance you're telling the truth.

~~~
ceejayoz
Polygraph machines don't actually work.

~~~
Threeve303
I have been reading about them lately. What's interesting is that some studies
show about a coin flip's chance of being correct, while others show upwards
of 85%.

Obviously what I said would never work but I've been trying to think about a
technical solution to lies and misinformation. I'm not sure there is going to
be one.

~~~
ceejayoz
> What's interesting is that some studies show about a coin flip's chance of
> being correct, while others show upwards of 85%.

This may reflect their effectiveness as an _interrogation_ technique - folks
who _believe_ they work may be more likely to confess, as they think it's all
over.

~~~
Threeve303
That must be the difference. Reminds me of a scene from The Wire:
[https://www.youtube.com/watch?v=DgrO_rAaiq0](https://www.youtube.com/watch?v=DgrO_rAaiq0)

------
Barrin92
I don't buy the article's argumentation. It centres on "opinion power", the
ability to set a political or cultural agenda and steer democratic discourse,
and argues that holding platforms accountable makes them more legitimate
wielders of opinion power, which the author claims is a mistake.

I think there are some things wrong with this. First off, I don't think
Facebook or Google exercise a lot of opinion power, particularly given their
dominance. Yes, the article is right to point out that they support climate
change action, fight certain internet regulation, or back gay rights, but
this is very, very mild, often non-partisan civic stuff, and often just
financial self-interest.

I don't see Facebook or Google going around wielding the power of legacy media
over politics, having a clear political slant in how they operate, or
whatever. They mostly stay out of it, again probably for business reasons.

More importantly, though, I think the article romanticises smallness, citing
Barlow's declaration of cyberspace independence, advocating for less
concentration, more transparency, and so on. It's the usual
anglosphere-centric argument: just give people transparency and get rid of
all the big bad guys, and the enlightened citizens of cyberspace will find
their way out of it. Nothing new about that.

I don't think this is realistic. People will use large social networks and
platform giants because they provide the best service, they're not going to
dig through transparency reports or statistics about algorithm influence, and
this means corporate responsibility is necessary. The author may not like it,
but large firms are the principal actors in our society, and they need to be
held accountable, even if that gives them a seat at the table.

------
mc32
There is no easy answer that I can see to this dilemma.

Advertising-driven platforms push people to seek meme-style approval and
encourage them to share outlandish claims. On the other hand, allowing
government to set policy is obviously problematic (insert the CCP, United
Russia, or even the Dem or Repub parties in the US setting the rules). Other
than perhaps dispersed federation (to cap growth), I don't see a good way to
manage this.

~~~
uniqueid
Another option is for a government to involve itself _indirectly_. It could
regulate by requiring that a social media company meet a vague "community
standard." I'm not convinced that would work, but it's less authoritarian than
a government dictating a list of acceptable content to private companies.

Then you use some system (tally letters of complaint? votes? random sample of
citizens, akin to "jury duty"?) to see if the community itself is satisfied.

~~~
neuronic
Are you saying that governments should become admins in society-wide forums?

~~~
uniqueid
I'm saying mc32 listed two possibilities, but there is a third: for government
to delegate to the public. The people can decide what standards of moderation
they want.

That's an in-between approach, as it doesn't involve government officials
drawing up a list of acceptable speech.

The philosophy is similar to that of a referendum, jury duty, or an election.
It's the community who decide if a company is behaving acceptably.

------
jude-
I almost wonder if (1) capping the audience size a user can reach, and (2)
limiting the rate of a user's audience churn would dissuade users from
amplifying misinformation. Not only would it cap the dopamine hit they'd get
from sharing outlandish bullshit, but also limit the damage they could do by
doing so.
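
A minimal sketch of what such caps might look like (a hypothetical
`ReachLimiter` class with made-up thresholds, just to illustrate the
mechanism, not any platform's actual implementation):

```python
import time
from collections import deque

class ReachLimiter:
    """Caps (1) the total audience a user can reach and (2) how fast
    that audience can grow (churn), per the proposal above."""

    def __init__(self, max_audience=5000, max_new_followers_per_day=200):
        self.max_audience = max_audience
        self.max_new = max_new_followers_per_day
        self.audience = set()
        self.recent_adds = deque()  # timestamps of recent additions

    def can_add_follower(self, now=None):
        now = time.time() if now is None else now
        # Drop additions older than the sliding 24-hour window.
        day_ago = now - 86400
        while self.recent_adds and self.recent_adds[0] < day_ago:
            self.recent_adds.popleft()
        return (len(self.audience) < self.max_audience
                and len(self.recent_adds) < self.max_new)

    def add_follower(self, user_id, now=None):
        now = time.time() if now is None else now
        if not self.can_add_follower(now):
            return False  # reach cap or churn cap hit
        self.audience.add(user_id)
        self.recent_adds.append(now)
        return True
```

The churn limit here is a sliding 24-hour window; a real platform would need
persistent storage and would presumably also rate-limit re-shares, not just
follower additions.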

~~~
yellow_postit
Capping access to reach is akin to current editorial media — politicians and
celebrities get more reach than others. When those in a position of reach
choose to share outlandish misinformation, we still, as a society, don't seem
to have good ways of holding them to account. Once those large-reach accounts
share something out, smaller ones can disseminate it to all the untracked
nooks and crannies and amplify it further.

------
est
> If two people have the same priors, and their posteriors for an event A are
> common knowledge, then these posteriors are equal. - Robert Aumann

[http://mason.gmu.edu/~rhanson/deceive.pdf](http://mason.gmu.edu/~rhanson/deceive.pdf)
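
For context, that is Aumann's "agreeing to disagree" theorem; a standard
formal statement (my paraphrase of Aumann 1976, not taken from the linked
paper) is:

```latex
% Aumann's agreement theorem (Aumann, 1976)
Let $(\Omega, p)$ be a finite probability space with common prior $p$, and
let agents $1, 2$ have information partitions $\mathcal{P}_1, \mathcal{P}_2$
of $\Omega$. For an event $A$, agent $i$'s posterior at state $\omega$ is
\[
  q_i(\omega) = p\bigl(A \mid \mathcal{P}_i(\omega)\bigr).
\]
If at some state $\omega$ it is common knowledge that $q_1 = a$ and
$q_2 = b$, then $a = b$: rational agents with a common prior cannot
``agree to disagree''.
```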

------
Heyso
"Attempts": when it is a success, you won't hear about it.

