
Why Is Facebook So Afraid of Checking Facts? - agreenlake
https://www.wired.com/story/why-is-facebook-so-afraid-of-checking-facts/
======
dilippkumar
This article misses a crucial point: I do not want fact checked information. I
do not want a ministry of Truth telling me what’s an acceptable set of
statements to trust. I do not want a third party providing unsolicited
opinions on how much I conform with the acceptable statements a ministry of
Truth has blessed for public use.

I’ll provide three arguments. First, how will Facebook or any other entity
that takes on the responsibility to fact check stand up to coordinated
misinformation campaigns like “Iraq has WMDs”, where an omniscient fact
checker could literally have saved thousands of lives?

Second, how will Facebook or CNN or Snopes or any other entity fact check
disagreeing experts on a subject - for example, Sweden’s epidemiologists and
the rest of the world taking opposing views on quarantine measures? Is anybody
at Facebook or any other place qualified to fact check Sweden’s top government
advisers?

Third, how will Facebook or any single entity fact check experts who lack
formal qualifications? When Albert Einstein submitted his paper on the
photoelectric effect (which eventually won him a Nobel Prize), he didn’t have
a Ph.D., nor was he enrolled as a student - he was just an anonymous clerk at
a patent office. Today’s Reddit would brand him an “armchair physicist”
conducting thought experiments. How on earth will Facebook or any Ministry of
Truth attempt to fact check anything like this?

So, if you rule out Facebook’s ability to fact check coordinated government
propaganda, government policy advisors who are literally balancing lives
against the economy, and fundamental advances in science, what else is there
to fact check?

A fact checker can only end up becoming a reinforcement mechanism for
popularly held beliefs. Let’s ask Galileo how that worked out for him.

~~~
TomMckenny
So the enormously popular post telling parents to make their kids drink Lysol
should not have even the slightest warning next to it because the poster could
be like Galileo or Einstein?

~~~
new2628
You are not arguing in good faith. You are making a caricature of the opposing
argument, presenting in the worst light possible.

~~~
TomMckenny
I'm pointing out a problem with a real post that actually exists right at this
very moment.

~~~
new2628
It is easy to pick one clear-cut case that supports one's viewpoint. The
question is how to handle the more murky cases.

~~~
crooked-v
Sure, but by answering the clear-cut cases you've at least defined boundaries
for your problem space.

~~~
AnthonyMouse
But the clear-cut cases aren't actually the problem. You don't need to censor
things that are so obviously stupid. The bottle of Lysol says things like
"Hazard to Humans and Domestic Animals" and "call a poison control center"
etc.

So much so that it's incredible that the few people who actually do it are
even telling the truth about why, instead of being insane parents looking for
a cover story when they want to murder their kids, or cases of Munchausen by
proxy, or something like that.

Meanwhile the same principle gets you absolutely nothing in all of the non-
obvious cases, because when the answer is non-obvious (and therefore much more
problematic if wrong) then Facebook doesn't have it either.

------
okintheory
It seems to me that there's an obvious alternative to checking facts, but it's
one that scares FB much more than fact checking. Suppose they were
transparent! Suppose it was easy to see unobfuscated statistics about who is
paying for what on FB. Suppose journalists could easily look up how much white
nationalists are spending on targeted advertising in each US state. Suppose we
could see how much Exxon, Chevron, Koch Industries are spending, or health
insurance companies, or anyone else for that matter. Suppose we could also see
how much engagement there is with each post from Breitbart, NYTimes, Young
Turks and so on. If we had these facts, as Facebook does, we'd be off to a
good start.

~~~
glitchc
Why just stop at Facebook? All businesses operate on the model where they take
dirty money and try to do clean things with at least some of that money. Heck,
most charities operate this way. Let's just make everyone transparent. We
should know if the Red Cross is receiving money from Smith & Wesson. Bad
individuals and corporations make donations to good causes all the time. You
can think of it as moral outrage, but personally I think of it as reparations.

The consequences of that action will be as follows: Most of these businesses
AND charities will drown in the moral outrage and close their doors. And that
will have a detrimental impact on society, because the good that those
businesses/charities did will go with the bad.

Facebook has one powerful good use case. It makes it trivially easy to connect
with relatives and friends far away from us. There are other platforms out
there, but until Facebook, there wasn't much, and even now, Facebook is the
simplest and most convenient of them all. If Facebook dies, this good use case
goes with it.

~~~
catalogia
> _All businesses operate on the model where they take dirty money and try to
> do clean things with at least some of that money._

That's a bit too cynical even for me. There do still exist businesses that
simply produce a product or perform a service, selling it honestly without any
tricks or subterfuge.

~~~
glitchc
It's realistic and not cynical at all. Just the way life really is.

Try this thought experiment: The local mafia needs to eat, sleep, buy cars,
equipment. Should local restaurants, grocery stores, rental complexes,
dealerships, the local Home Depot, accept their patronage? Is that dirty money
or clean money? Is Home Depot morally corrupt for selling lumber to the mafia
boss for his new deck?

The answer to these and more is: no. Moral policing never works. We have laws
and law enforcement to pursue criminals. It's their job, not Home Depot's.

~~~
catalogia
Not all businesses pay tribute to or otherwise knowingly deal with local
mafias. That's not realistic.

------
pseingatl
Perhaps because they do not want to be considered "publishers."

------
CyanLite2
Simple: "Ignorant" people make up a large percentage of their user base. They
argue with "informed" people and it drives traffic and revenue.

Meanwhile the social fabric of the United States is being torn apart by
political polarization.

~~~
Nextgrid
Ignorant people are also more likely to click on whatever ads are shown
without any skepticism.

------
choward
As much as I dislike Facebook, I think the root problem is the lack of
critical thinking many people have. Our education system, optimized for
standardized tests, doesn't really help with that.

For example, when I state opinions, I don't expect everyone to take them as
fact without doing their own research. But that's pretty much what a lot of
people do.

Doing your own research should be the norm (you're already using the
internet!) with any opinions you read, like on Facebook. And there are sources
you should be able to take seriously, like the government. However, there is
one government official who straight-up lies on Twitter all day, and a good
majority of the country believes him. When you believe that, it's a lot easier
to make the jump to believing everything you see on Facebook.

One thing Facebook should do is make the news feed fully configurable and
deterministic, so you only see things from people you know, in an unbiased
way.
That will never happen since that's a big part of their business model.
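A deterministic feed of the kind proposed above could be as simple as a reverse-chronological sort restricted to people you know. This is a hypothetical sketch (the post records, the `friends` set, and the function name are all invented for illustration; Facebook's actual ranking is proprietary):

```python
from datetime import datetime

# Hypothetical post records; in a real feed these would come from a database.
posts = [
    {"author": "alice", "time": datetime(2020, 5, 1, 9, 0), "text": "Hello"},
    {"author": "brand_x", "time": datetime(2020, 5, 1, 10, 0), "text": "Ad"},
    {"author": "bob", "time": datetime(2020, 5, 1, 8, 0), "text": "Photo"},
]

friends = {"alice", "bob"}

def deterministic_feed(posts, friends):
    """Newest-first feed limited to people you know: no engagement
    ranking, so the same inputs always yield the same ordering."""
    return sorted(
        (p for p in posts if p["author"] in friends),
        key=lambda p: p["time"],
        reverse=True,
    )

feed = deterministic_feed(posts, friends)
print([p["author"] for p in feed])  # ['alice', 'bob']
```

The point of the sketch is that the ordering is a pure function of the inputs, unlike an engagement-optimized ranker whose output shifts with opaque signals.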

------
caseysoftware
As much as I'd like to hammer on Facebook for this, they're just one of many,
many questionable actors here.

Almost 10 years ago, I conducted an experiment. I watched an hour of CNN every
night, but it was never that night's coverage. It was the coverage from
exactly two weeks earlier.

It was amazing how much "breaking news!" was irrelevant or just outright
wrong, how many large trend predictions were wrong, and how many "[person]
will do X" were wrong. While the predictions could have been portrayed as
opinions, they were presented as facts and the obvious next steps or
conclusions.

I realized pretty quickly that avoiding CNN kept out the blatantly wrong
information so even if I didn't replace it with anything, I was net ahead.

A few years ago, I discovered this article and realized that some portion of
it was probably on purpose:

https://aeon.co/essays/how-the-internet-flips-elections-and-alters-our-thoughts

------
advisedwang
Paying people to review content costs money and that eats into their bottom
line. Efficacy and political fallout are factors for sure, but "company avoids
spending money" doesn't need much further explanation.

This is no different from a company using cheaper (and perhaps less safe)
parts, using the cheapest call center they can (and letting customer service
suffer) or any other way of cutting corners to save a few cents.

------
reaperducer
_Why is Facebook So Afraid of Checking Facts?_

IMO, for two reasons. First, because some facts, especially in news stories,
are easily black-and-white: The car crashed. The person died. Other facts are
greyscale. The trouble you incur from fact-checking the greys isn't worth the
time, money, or effort.

Which leads to reason number two: Facebook is less Algorithms Will Save The
World™ than Google, but in spite of its mantra of "connecting people," it's
not all that interested in interacting with actual people. Wetware is messy,
time-consuming, and expensive. And while there are plenty of companies that
find success working with people, that sort of work doesn't fit into its SV
Bubble mindset, or lead to hockey stick-style revenue.

------
munificent
The answer is obvious: Facebook loves misinformation.

Facebook's business model is fairly straightforward: they want you to spend as
much time on Facebook as possible so you can consume as many ads as possible.
Contentious, provocative content drives engagement. It's easier to provide an
unending source of engaging content if you don't restrict that content to the
set of things that are true. Truth has the unfortunate tendency to be boring
in most cases.

If Facebook nuked all the lies, Facebook would be less interesting and ad
revenue would go down.

------
typenil
Ah yes. Let's have Facebook be the arbiter of truth; after all, their behavior
so far is nothing short of the pinnacle of ethics, wisdom, and transparency.

Let's not worry about how that could backfire at all or the countless times
that the "correct" opinion has turned out to be wrong. The truth is too flimsy
to stand on its own. We must protect our inferiors from wrongthink.

~~~
neogodless
As long as Facebook is such a large part of how so many people interact with
each other and share information, the risk inherent in its power is great.

So perhaps the question is: where is the greater risk? That they merely
facilitate the spread of misinformation as a neutral party, or that they take
an opinionated stance to reduce the spread of misinformation, but hold the
power to bias that filter?

I don't like either option, but I also don't see the multitudes of individuals
largely choosing an alternative for their interactions any time soon either.
An ideal for me would be a much more decentralized internet, but the many
initiatives to make that happen are arguably much less successful than
Facebook's grasp on the power it holds.

------
ksk
That's the job of news organizations and journalists, who have access to
experts to verify the information. Also, it seems like Facebook is just a
scapegoat here. IMHO, the real problem is that people on both sides of the
political spectrum are re-sharing "news" that they have zero idea is true or
not.

~~~
neogodless
This is essentially an "individual responsibility" argument.

Do I think people "should" be individually responsible? Yes. Do I believe
that's going to happen? No.

Forty years ago we got our news from a handful of networks that stuck to "the
middle". It was no doubt imperfect, but if they had posted crazy conspiracies
on the nightly news, they would quickly have vanished as a network. On the
other hand, convince a few million individuals to share a far-left or
far-right story full of misinformation, and it works. The individuals are not
"shamed" into avoiding that mistake again.

~~~
ksk
It's also a "Facebook is not a news organization or a fact-checking
organization with access to an army of experts in every domain" argument.

>Forty years ago we got our news from a handful of networks that stuck to "the
middle". It was no doubt imperfect, but if they had posted crazy conspiracies
on the nightly news, they would quickly have vanished as a network.

I do not think that rumors and misinformation were less prevalent 40 years
ago. Is there any consensus (not sparse data) on that? My hunch is that we're
in a much better situation today than ever before. Anyone with an internet
connection and some free time has the ability to fact-check a piece of news.
This was not available to people earlier, so they had to use their judgement
and either choose to believe it or not.

------
lucidone
Only allowing "true" information on any platform (or in any form of discourse,
really) is effectively impossible given the subjective nature of truth.
Philosophers have spent their lives arguing this topic; a tech giant isn't
about to answer that question.

~~~
protonfish
Agreed there is no guaranteed way to only allow truth. However, there are many
easy ways to remove utter nonsense. Maybe we should start with that and see
how it goes.

~~~
lucidone
I agree in principle (e.g., Pizzagate and similar conspiracy theories), but
delegating the arbitration of facts to a private company seems dangerous
territory to me. The inconvenient answer as it appears to me is that folks
have to discern what is nonsense and what isn't, which leaves us exactly where
we are.

~~~
krapp
>I agree in principle (e.g., Pizzagate and similar conspiracy theories), but
delegating the arbitration of facts to a private company seems dangerous
territory to me.

No one has delegated "the arbitration of facts" at any universal or societal
scale to Facebook, nor is Facebook attempting to arbitrate all "facts" or
"truth" on their own platform. The principle with which you agree appears to
be the principle Facebook is actually using, and I see no "dangerous
territory" in them moderating their own platform. There are plenty of other
platforms where Pizzagate and other such conspiracies are taken as fact.

> The inconvenient answer as it appears to me is that folks have to discern
> what is nonsense and what isn't, which leaves us exactly where we are.

This is not an inconvenient answer so much as a thought-terminating cliché.

~~~
lucidone
I don't think Facebook has the public good in mind when they develop their
product, but their bottom line first and foremost. They're a private company;
they're beholden to their shareholders.

I also don't think that shifting responsibility from software to humans is a
"thought terminating cliche" but a necessary step in how we orient the
conversation about the mass dissemination of propaganda via tech platforms.
There's a reason folks are eager to believe conspiracy theories and
misinformation.

We'll have to agree to disagree.

------
justinzollars
The answer isn't always "a" or "b"; it's often shades of grey.

------
foobar_
I still don't understand why social media has to censor things when it could
just as well label them as inaccurate or spam according to experts or
something.

Now flat earthers think they are right, like Galileo.

------
dadarepublic
From what I gathered FTA, it sounds like the author is putting forth the
notion that Facebook is relying on old, mostly debunked theories around a
"backfire effect" - when fact-checking increases polarization and results in
the opposite of the intent of the fact-check.

The author then goes on to discuss how more recent studies (including their
own research), and even some of the original authors of that previous
research, have changed course and now believe that fact-checking can indeed
help course-correct those misled by misinformation.

Therein lies the rub: according to the author, Facebook's policies built on
the original research haven't been updated to match more recent findings.
This can be troubling given the stakes at hand.

~~~
thisisnico
Facebook would be so much better with fact-checking. It seriously would
improve the state of the world.

~~~
vorpalhex
What makes you think that? Their moderation team is not paid well, mostly
outsourced to contracting firms, and has no accountability.

What happens when their moderation team decides some anti-vax nuttery is "a
fact" and then bans anyone disagreeing?

As a reminder, this is a group that handed out punishments for pictures of
infants being breastfed.

~~~
thisisnico
I'm not insinuating that they use the same methods as the moderation team.

------
BenoitEssiambre
Adding a "(?) Unverified" to the Like options might help?

------
TomMckenny
Because they were hammered into submission by demagogues and their supporters
who rely on disinformation and conspiracy theories to retain power.

------
cheez
Because when people are involved, there is no such thing as an objective fact.

------
bertil
> Whatever Facebook says (or thinks) about the backfire effect, this
> phenomenon has not, in fact, been “shown” or demonstrated in any thorough
> way.

I’m positive that the company has conducted internal A/B tests to check for
this effect and has a far more nuanced view of it than Wired assumes.

