
Facebook admits it must do more to stop the spread of misinformation
https://techcrunch.com/2016/11/10/facebook-admits-it-must-do-more-to-stop-the-spread-of-misinformation-on-its-platform/
======
btilly
This article criticizes Facebook for firing the human editors who had been
keeping things sane. However, they were pushed to that by accusations of bias
in the right-wing media, accusations that looked likely to lead to a
Congressional investigation.

See, for example, [http://thehill.com/policy/technology/279361-top-
republican-d...](http://thehill.com/policy/technology/279361-top-republican-
demands-answers-over-alleged-facebook-political-bias).

Now they are in a situation where they are damned if they do, damned if they
don't. And people immersed in echo chambers will accuse them of bias no matter
what.

But the entire system is fundamentally broken. Pay per ad incentives lead to
rewarding viral content. And content that induces outrage is far more likely
to go viral than pretty much anything else. Plus it goes viral before people
do pesky things like fact checks. And the more of this that you have been
exposed to, the more reasonable you find outrageous claims. Even if you know
that the ones that you have seen were all wrong.

For an in-depth treatment of the underlying issues, I highly recommend _Trust
Me, I'm Lying_.

~~~
jerf
Yes, trying to solve this problem by creating a definition of "misinformation"
sufficiently precise to act on it for all possible articles, then trying to
remove it somehow is probably an AI-complete problem. If it's not AI-complete
it's probably just plain an ill-formed question. That's before we ask
questions about the biases that get embedded into the misinformation detector.

This can't be fixed at Facebook scales by building a platform that so highly
incentivizes low-quality content of all kinds, then trying to stop the content
so late in the cycle. The entire incentive structure has to be rethought.
Unfortunately, that's probably a problem that Facebook is literally incapable
of solving without going out of business, because Facebook the corporate
entity _is_ that incentive structure. To fix Facebook requires Facebook to
become not-Facebook.

~~~
Bud
There is no need to create a Grand Unified Theory of misinformation, here.

This problem isn't that hard. You filter this stuff and rank it by its SOURCE,
not by the content of individual articles.

It's not hard to figure out which online sources are pushing out most of the
abjectly bad propaganda, and de-rank them.
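A minimal sketch of what that source-level de-ranking could look like (the domains and weights below are invented for illustration; a real reputation list would be curated and contested):

```python
# Hypothetical sketch: rank shared links by the reputation of their
# source domain rather than by analyzing each article's content.
from urllib.parse import urlparse

# Illustrative-only reputation scores; not a real curated list.
SOURCE_SCORES = {
    "apnews.com": 1.0,
    "reuters.com": 1.0,
    "some-viral-clickbait.example": 0.1,
}

def rank_score(url, base_engagement):
    # Strip "www." so both forms of a domain map to the same score.
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    # Unknown sources get a middling score rather than a free pass.
    reputation = SOURCE_SCORES.get(domain, 0.5)
    return base_engagement * reputation
```

Of course, the entire fight then just moves to who maintains the list and what score each outlet gets.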

~~~
jerf
"This problem isn't that hard. You filter this stuff and rank it by its
SOURCE, not by the content of individual articles."

I don't think reifying _ad hominem_ into code is the solution to the problem.

Of course, if you don't mind a rare few false positives here and there on
articles, I've got your filter right here:

    
    
        def source_is_trustworthy(source):
            return False

Remember as you sit here thinking through your exceptions what we're talking
about; if your "trustworthy" source was confidently telling you about how
Clinton was going to win, _it's not one of the exceptions_. I won't say I was
_confident_ Trump would win, but I was certainly less surprised than most; the
fact that I generated this somewhat more predictive model by tossing out
pretty much every "mainstream" news source is not a good sign for them. And to
be honest, Trump news isn't the only thing that I find this useful for. Beyond
the bare facts, most media outlets really aren't good for much anymore, and do
you ever have to de-spin their news just to get those bare facts in the first
place. (I can't, however, pack up into a recipe how to do this yourself. I
think we're in a period of transition in the news industry, much greater than
just "the internet makes the old dinosaurs stumble" makes it sound, and I'm at
the moment not really all that confident in anything.)

~~~
sethrin
As someone who vehemently opposes Trump, I feel that allegations of anti-Trump
bias in the mainstream media were entirely correct and somewhat to be
expected. Although Trump is certainly a name that sells papers, Trump's
repeated threats to open up news media to broader libel laws were not well
received. The news organizations' endorsements were pretty one-sided[0]. I
don't want to take you too literally, but it seems to me pretty obvious that a
model ignoring the MSM entirely would not be preferable to one that took into
account the MSM opinions and then applied a correctional factor. I'm sure we
agree that truth is a function of our means for determining truth, but I
suspect we disagree strongly on the reliability of Internet news sources.

[0]
[https://en.wikipedia.org/wiki/Newspaper_endorsements_in_the_...](https://en.wikipedia.org/wiki/Newspaper_endorsements_in_the_United_States_presidential_election,_2016)

~~~
teekert
You must have missed the leaked emails in which Clinton exerts massive
influence on the media, planning every last detail in exchange for all kinds
of rewards [0].

[0] [http://observer.com/2016/08/wikileaks-reveals-mainstream-
med...](http://observer.com/2016/08/wikileaks-reveals-mainstream-medias-
coziness-with-clinton/)

~~~
sethrin
I did miss that. Was that article intended to support that view? The incidents
listed don't seem to have been terribly well planned or executed, or to have
garnered any positive results. That seems difficult to reconcile with the idea
of a powerful media conspiracy.

------
colllectorof
_" OMG, Trump has won through lies and deception! We failed to stop him. How
on Earth did that happen? We must out-manipulate our opponents next time."_

If you read between the lines, this is what the article condenses to.

The discussion here is mostly creepy groupthink shit.

Social networks fact-checking their content? What's next? Should AT&T stop the
spread of misinformation over its phone lines? Should USPS fact-check your
mail?

Facebook is not a real news source and is never going to be one. At best it's a
communication medium. At worst it's a giant propaganda machine. Any moves to
get it further away from the former and closer to the latter are just
machinations to change _who_ benefits from the propaganda and nothing else.

We don't need an "improved" Facebook. We need a working replacement for old-
school newspapers, TV stations and radio channels. The "new media" eroded all
of those, but failed (so far, at least) to provide anything of equal utility
and value. Hence all the issues involved in the coverage of these elections.

~~~
B-Con
Agreed.

The problem is the users, not the platform. If users propagate misinformation,
the _users_ propagate misinformation.

The easy solution people often jump to is fighting negatives with negatives,
assuming it yields a positive. But often a positive approach is more
effective. Offering incentives for good acts, not just disincentives for bad
acts, is a fairly popular recent trend backed by research.

In this case I can't help but think that the best solution is to focus on
education and the propagation of correct information, not censorship (or at
least, something that smells like censorship) of bad information. If "new
media" is a problem we should be fighting it closer to the source.

I don't know what Facebook's role in that would be, but ideally, as a
platform, it would be minimal.

But ultimately, it's worth remembering that it's hard to build a good system
with bad raw materials. If people are interested in falsehoods and echo
chambers, their social media will reflect that.

~~~
bo1024
Hmm, I mostly agree, but it's important to remember that a bad mechanism can
encourage bad behavior as well. Facebook must make choices about design of the
feed algorithm that have important effects on incentives and behavior. So even
if they take a minimal censorship role, we cannot ignore the effect of the
feed algorithm - in fact, we should focus on how to design it!

For example, the extent to which one "bubbles" users into their own echo
chamber is largely up to the algorithm designer. If you look at "content
aggregator" sites like HN, reddit, Facebook, they all have pros and cons. HN
and reddit give incentives in the form of karma. They all have different
levels of "bubbling" and opting into or out of bubbles.

Good design of such sites is an open problem -- even formalizing good _goals_
for such sites is an open problem -- but it is a design problem we should be
thinking about and addressing.
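As a toy illustration of how much the "bubbling" decision sits with the algorithm designer, here is a sketch where a single invented parameter trades engagement against viewpoint diversity (all names and numbers are made up):

```python
# Hypothetical feed ranker. Each item is (engagement, leaning), with
# leanings in [-1, 1]. `bubble_weight` is the designer's knob:
# 0 = rank purely by engagement, 1 = strongly discount agreeable items.
def feed_score(item, user_leaning, bubble_weight):
    engagement, leaning = item
    # agreement is 1 when the item matches the user's view, 0 when opposite.
    agreement = 1.0 - abs(leaning - user_leaning) / 2.0
    return engagement * (1.0 - bubble_weight * agreement)

def rank_feed(items, user_leaning, bubble_weight=0.0):
    return sorted(items,
                  key=lambda it: feed_score(it, user_leaning, bubble_weight),
                  reverse=True)
```

With `bubble_weight=0` a like-minded viral item tops the feed; turn the knob up and the opposing-view item can outrank it. The point isn't this particular formula, just that the bubble is a tunable design choice, not a law of nature.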

------
japhyr
It's also on us to resist the temptation to build social media bubbles around
ourselves, and to poke into each other's bubbles. Every time I've wanted to
block a friend or relative on facebook, I've put my phone down and come back
to it later. Looking away from people we disagree with isn't working.

My cousin shared a post this morning asking why people on the left aren't
celebrating the fact that a female campaign manager helped put someone in the
white house for the first time. I wasn't sure what to say as his friends piled
on to say things like "Yeah, I thought they were for women's rights?!" Here's
the response I finally came up with:

 _I'm not celebrating her "success" because I imagine my facebook and twitter
feeds look a lot different than yours. My feeds are filled with first-hand
stories from women around the country who are being more openly harassed than
they were last week. It's happening often enough that it's really dismissive
to say "Oh those are just a few assholes." People are openly harassing women
in the name of Trump._

I think when we act as consumers of social media we need to stop building our
own bubbles, and reach out into others' bubbles. And when we help build social
networks, we need to intentionally structure them in a way that maintains
connections, rather than isolating individuals and groups.

~~~
gdulli
When I knew people who supported Romney, I felt no need based on that to
exclude them from my life or my social media circles.

One's ability to support Trump tells me much more about a person. There are
too many things that are good and should be fundamental about a functional,
enlightened society that one must reject in order to support Trump. Prejudice,
fraud, bullying, and sexual harassment must all be accepted.

People who would accept these things are not welcome in my life. It's not
because they're on the other side. It wasn't like this in 2008 or 2012. This
time it goes deeper than that.

~~~
tomp
And to support Hillary, one must accept lies, corruption, warmongering,
corporatism, voter manipulation and unfettered globalization.

It's a choice between two evils, however you look at it.

~~~
wang_li
Don't forget tolerance of rape and child prostitution. (See Jeffrey Epstein
and William Jefferson Clinton.)

~~~
sushiwarrior
I mean, Trump is arguably FAR more involved with Epstein though - he literally
has a quote saying that he acknowledges Epstein surrounds himself with young
women:

“I’ve known Jeff for fifteen years. Terrific guy,” Trump told New York
Magazine for a 2002 profile of Epstein. “He’s a lot of fun to be with. It is
even said that he likes beautiful women as much as I do, and many of them are
on the younger side. No doubt about it — Jeffrey enjoys his social
life.”[1][2]

How can someone come to the conclusion that this means only Clintons support
this? Clearly the common denominator is rich people abusing their power to
rape children, not politicians raping children, and not Clintons only raping
children. Rich people are abusing their power and America voted to fix it by
electing a rich person who has abused that power.

[1]: [http://dailycaller.com/2016/10/09/the-friendship-between-
tru...](http://dailycaller.com/2016/10/09/the-friendship-between-trump-and-a-
billionaire-pedophile-that-nobody-wants-to-talk-about/) [2]:
[http://www.snopes.com/2016/06/23/donald-trump-rape-
lawsuit/](http://www.snopes.com/2016/06/23/donald-trump-rape-lawsuit/)

------
alexc05
I think it would be relatively easy to have an "auto-snopes" feature which
detects a URL being shared and immediately attaches a post that says "this has
been debunked by X"

For example - I've seen dozens of links to Michael Moore's trumpland speech
which strategically ends with the phrase "America will elect Trump and it will
feel great"

Sample: [http://www.zerohedge.com/news/2016-10-25/michael-moore-
trump...](http://www.zerohedge.com/news/2016-10-25/michael-moore-trumps-
election-will-be-biggest-fuck-you-ever-recorded-human-history)

    
    
        He concludes:
             
        Yes, on November 8, you Joe Blow, Steve Blow, 
        Bob Blow, Billy Blow, all the Blows get to go
        and blow up the whole goddamn system because 
        it's your right. Trump's election is going to
        be the biggest fuck ever recorded in human 
        history and it will feel good.
    

but the truth is that is not what he __CONCLUDED__

He
continues:[https://www.youtube.com/watch?v=sVLTQIUMq18&t=30](https://www.youtube.com/watch?v=sVLTQIUMq18&t=30)
[sic]

    
    
        ... and now you're fucked.
    
    

EDIT - I feel like I should clarify I mean "auto-snopes" figuratively. Auto
fair-balance might be a better descriptor. Something that links to an opposing
opinion automatically; in cases of absolute falsehood, the debunking, or even a
CSS class that wraps the post in a bright red "FALSE" banner.

Snopes doesn't have to be the automatic choice.

Breaking filter-bubbles & ensuring _truth_ would be my goal on an idea like
this, not strictly "promoting liberal media".

~~~
Spellman
I have personally asked an individual why they don't trust snopes.

Their claim is that it's a Democrat rag towel and shill.

I then ask, then what fact-checker would you trust?

Their response: None of them.

Congratulations. We've reached a point, for this individual, where "facts" as
decided in the common forum are suspect and the only thing they trust is
themselves (and whatever non-Mainstream Media they listen to). We have reached
a point where the only truth is what they decide is the truth.

That is a non-trivial problem to resolve. If we can't even agree what basic
facts are, there is no way to even have a discussion.

~~~
234dd57d2c8db
Have you done any research at all on Snopes? I just spent 5 minutes on Google
and found that one of Snopes's main "fact-checkers" is Kim Lacapria. She used
to write for a blog called "The Inquisitr", a site known for posting bad news
stories, similar to the right-wing Alex Jones programme. They have published
fake quotes and stories without doing research:
[http://www.inquisitr.com/670091/retraction-and-apology-to-
ro...](http://www.inquisitr.com/670091/retraction-and-apology-to-roger-moore/)
[http://www.rawstory.com/2015/07/story-about-costco-
pulling-d...](http://www.rawstory.com/2015/07/story-about-costco-pulling-
dinosaur-cake-was-hoax-by-insquisitr-contributor/)

As for Kim, she describes herself as "...an openly left-leaning individual
myself", if you ctrl+f for those words here:
[http://www.inquisitr.com/402558/scandal-envy-behind-
petraeus...](http://www.inquisitr.com/402558/scandal-envy-behind-petraeus-
drama-allegations-obama-ignored-benghazi/)

As you can see, journalistic integrity is lacking from both sides of the
political spectrum. Please do not get all high-and-mighty and think you're
better than everyone else, because you would be falling into the same trap you
mock others for. I hope in the future you will examine sources with a more
objective view and keep an open mind, and not just believe whatever someone
wants you to believe so they can make a quick buck off of your ignorance.

~~~
pimlottc
Come on. You've highlighted one staff member as having once mentioned having a
personal political leaning (like most people do) and that she once worked at a
blog with a tarnished reputation and spun that into implying Snopes is
untrustworthy. I'm sure you could meet this extremely low bar of "proof" for
practically any news organization.

If you really want to present a case of bias, find some real evidence.
Specific articles with specific instances of false or misleading information.
Otherwise, you're just sowing doubt by casting aspersions based on vague
associations.

~~~
234dd57d2c8db
This post wasn't meant to be a thorough debunking of Kim and snopes. I
actually think snopes does a mostly good job of producing accurate
information. However, bias does show through in some cases, such as the
Hillary Libya "We didn't lose one person" question from the debate.
[http://www.snopes.com/hillary-clinton-benghazi-
msnbc/](http://www.snopes.com/hillary-clinton-benghazi-msnbc/)

Snopes and Kim claim "Hillary Clinton overlooked Benghazi victims when she
said that "we didn't lose a single person in Libya" during a campaign event on
MSNBC." is false. However, reading the article for her evidence, you can
observe that Kim accepts Clinton lying by omission by saying the conversation
was not about Benghazi.

Looking at this from a high level, Clinton is implying that the Benghazi attack
is a completely unrelated event from the USA's war in Libya. However, these
two events are both part of the Libyan intervention, and it's disingenuous of
Clinton to make such a claim because, as a result of the Libyan intervention,
Americans did die. Clinton is obviously playing a political game to play down
the negative consequences of the Libyan intervention. I don't blame her for
doing this, by the way. I would too if I wanted to be president.

Just because they didn't die in the initial confrontation doesn't mean "No
Americans died." Snopes fails to point out the fact that Clinton does not
include Benghazi in the overall Libyan confrontation. Let's say I am in a car
accident with someone, and they are injured. If 6 months later, they have
complications and die from the injuries I caused with the initial car
accident, I am still responsible for the death of the individual.

Left wing bias: No Americans died in the initial confrontation with the Libyan
government.

Right wing bias: Clinton killed Americans in Libya.

Neutral bias: Clinton made a decision to support the invasion of Libya. As a
result of the invasion, Americans died in Benghazi.

~~~
knz
> Neutral bias: Clinton made a decision to support the invasion of Libya. As a
> result of the invasion, Americans died in Benghazi.

Even that statement could be twisted if you draw the conclusion that Clinton
supported the invasion, people died, therefore she is partly responsible. And
it's missing information about the "600 requests for additional security".

I do understand and agree with your point.

Perhaps the more important issue is that some voters have an overly simplistic
world view. "Casualties from an attack on a diplomatic enclave in a volatile
country" should be separate from "US Foreign policy had implications for the
conflict in Libya".

~~~
234dd57d2c8db
Agreed, the unbiased statement could definitely use some more words to show
it's an opinion that Clinton COULD be responsible for the result, but it's not
black and white. It's just frustrating that people don't recognize that yes,
snopes, in general, is a trustworthy source. However, snopes is a group of
people, each with their own individual biases.

These biases can occasionally influence the content. To treat these sources as
some kind of perfect "machine" that can just be unleashed onto facebook to
automatically say "You're wrong, this is how it really happened." is a scary
thought. Especially when the process and machine are blackbox and we don't
have any way to verify which "facts" are being pushed, by whom, and for what
reasons. And when you bring up the fact that there might be bias, you become
one of _those_ people, like you're some kind of conspiracy nut for asking
questions.

I think that happened this election. The polls all said "Clinton crushes
Trump." Anyone that asked about the validity of those polls was mocked and
laughed at and seen as some kind of rightwing conspiracy nut.

Well, turns out they were dead wrong and maybe if people recognized their
unconscious bias, we wouldn't have a president Trump right now and people
would have said "Holy $@&* these polls are wrong we need to change our
strategy so we can win." I kept repeating to people, don't underestimate
Trump, he's smarter than you give him credit for and he's a huge threat to
Clinton. Nothing but jokes and mockery from the majority of democrats I talked
to.

------
exabrial
If Facebook wants to be a "source of truth" they're going to need to hire
moderators outside of Silicon Valley that represent a wide array of values,
traditions, and backgrounds, in a ratio that represents the actual population,
and have them peer review each other without fear of repercussion from their
employer. I think it's hard for the Silicon Valley types to surrender this
type of control... For instance, would this article even be on HN if Hillary
had won the election? If we're talking about neutrality, that's an important
point to consider.

Actually, the lack of multiculturalism is true for quite a few Silicon Valley
companies. Try to find a station that has 'Today's Metal' on Google Music,
good luck ...there are a couple hundred different indie channels to listen to
that are very meticulously organized into genre, sub-genre, and sub-sub-
genres. I'm not complaining, I simply use different products, but I think it
points back to the source of where the lack of multiculturalism stems from.

------
trynumber9
You think the users are too stupid to fact check. But what makes you think the
users care about the truth anyway?

More often than not they double down with "well it's emblematic of the greater
problem" or something to that effect. They then look for new evidence to
support their views. Like it or not, some sites are designed to build echo
chambers and are not for general discourse.

~~~
faet
A lot of people don't care. They believe an image on facebook that reaffirms
their belief.

You try to counter with statistics from the FBI, DOJ, BLS, DOL, or any other
organization and the numbers are 'made up'. And studies have been done that
when presented with counter arguments it strengthens the original
misperceptions.

Corrections also don't get much traction. I've seen many conspiracy theories
pop up for a week, then die down, while the correction gets no traction at all.

>As a result, the corrections fail to reduce misperceptions for the most
committed participants. Even worse, they actually strengthen misperceptions
among ideological subgroups in several cases.

[http://www.dartmouth.edu/~nyhan/nyhan-
reifler.pdf](http://www.dartmouth.edu/~nyhan/nyhan-reifler.pdf)

People don't want news, they want to hear what they already 'know'.

~~~
tomjen3
The problem is that there are no organizations that can give out unbiased
facts. This includes the FBI, DOJ, etc. If you have grown up in a household
that was in opposition to either of these organizations (and that could be
either Democrat or Republican; I can't imagine a poor black person in the
ghetto likes the FBI more than a Montana rancher does), why should you trust
their statistics? Heck, even if you do trust them, don't forget the old
saying: lies, damned lies and statistics.

------
rwhitman
If you do a root cause analysis of this election's result, much of it points
back to Silicon Valley in one way or another.

Workforce automation created millions of unemployed people in places far away
from economic hubs, with no hope of employment in their hometowns or of re-
skilling for the new roles, leaving them desperate. The proliferation of
mobile apps led to an explosion in the use of social media. The wild
popularity of this democratic social media encouraged a culture that allowed
misinformation to flourish without a lot of consequences.

Bad actors took advantage of this, and reinforced distorted false narratives.
Desperate people latched onto the messages presented to them, that gave them
solutions repeatedly reinforced by social media, and many took an enormous
risk - they voted against their best interests in order to solve their
problems with the information they had available. And that is how we got here
today.

~~~
cjjuice
Yes, Federal jobs should have been moved from cities to the places that lost
them. Norway did something similar.

~~~
reubenmorais
That sounds like an interesting story, how can I find more about it?

~~~
cjjuice
Here is one article on the subject. I am sure there are more out there.
[http://www.lifeinnorway.net/2013/03/have-the-norwegians-
got-...](http://www.lifeinnorway.net/2013/03/have-the-norwegians-got-it-
right/)

------
overcast
This is seriously hilarious. Now that he's won, against seemingly all odds,
we have to find some other reason for why EVERYONE ELSE, was completely WRONG.
Every media outlet, everyone in politics, all the polls, completely shit on
Trump. Now look who is the most powerful person in the entire world.

The Democrats, and the rest, did it to themselves. Overconfidence, smear
campaigns, and all of the morons doing the polls.

We are all sick of politicians, and the people have spoken, let's just see
what happens. Get over it already.

~~~
lightbyte
>we have to find some other reason for why EVERYONE ELSE, was completely WRONG

>and the people have spoken

Trump lost the popular vote, so this isn't true. He got about 25% of the total
voting population overall.

~~~
rawdoc
I agree that just because Trump won the election doesn't mean the people gave
him a mandate, or have spoken, but looking at popular vote in a system that
doesn't even focus on popular vote is incorrect IMO.

If the rules were to get popular vote then campaigning would be done totally
differently. The candidates are doing everything they can to win states, not
win popular vote.

~~~
qb45
Regarding the popular vote, the "mandate" and "only 25% voted for him", it
seems that all the people who are now dissatisfied voted for Clinton. Which
suggests that the 50% who didn't vote maybe don't oppose Trump that much.

I observed that leftists have a tendency to systematically overestimate their
numbers, influence and the amount of agreement or even familiarity with their
ideas among the public.

------
gr_thrwy
Another place where Facebook's failure to detect fakeness has proved costly is
in their social graph.

A lot of sites moved away from comments platforms like Disqus to Facebook in
the hope that the quality of discourse would improve and trolling would
decrease. Instead, clicking on some of the most vehement commentators' names
would invariably lead to suspiciously bare accounts (and often suspiciously
fake names) with a handful of friends themselves, all with similar
characteristics. Unfortunately the people who are influenced by this sort of
thing are not tech savvy enough to do even this basic level of checking.

There is a sort of "uncanny valley" that a technically savvy and experienced
person can detect when looking at a fake profile, that I daresay Facebook's
algorithms just can't.

Then there is also the problem that Facebook really doesn't care about
blatantly fake accounts until they are reported. The sorts of people who are
trapped in some filter bubbles are unlikely to be savvy enough to know how to
report these profiles. (I've reported many dozens at least, ranging from
community noticeboards to cupcake businesses, all pretending to be people and
missed by FB's much-vaunted ML.)

Of course, there is a certain irony in using a throwaway account to discuss
fake accounts, but without a patina of "realness", it triggers a greater level
of skepticism, which is a good thing.

When the history of 2016 is written, a large part will be filter bubbles and
trolls expertly manipulating huge swathes of electorates enabled by the hubris
and greed of Social Media networks.

------
stevendhansen
Except it isn't just the spread of misinformation. Even more important (IMHO)
is the insulating effect of showing people only like-minded opinions,
effectively trapping everyone in a bubble. I can't think of any way Facebook
can solve this problem without drastically decreasing their reliance on ad
revenue. What are they going to do, force people to view opinions they
disagree with?

~~~
gdulli
Everyone should be responsible for the curation of their own timeline. I
barely use Facebook but hate the idea that I didn't get a straight
chronological feed.

I was a huge fan of Twitter but was driven away when they followed suit. My
reasons had more to do with my own preference of how I enjoy using the
service, but having algorithms manipulate the information I see so as to
maximize engagement metrics is even more troubling.

~~~
lotharbot
There is a "chronological" option on facebook.

~~~
gdulli
It's illusory. They may or may not, based on their algorithms, bump something
old to the top of your feed after it gets a like or a new comment. They call
it "most recent" and they can define "recent" the way they want to.

------
masmullin
> Last week Buzzfeed reported on an entire cottage industry of web users in
> Macedonia generating fake news stories related to Trump vs Clinton in order
> to inject them into Facebook’s Newsfeed as a way to drive viral views and
> generate ad revenue from lucrative US eyeballs.

Does anyone else find this incredibly ironic? Buzzfeed ratting out others
doing clickbait?

~~~
ktRolster
Buzzfeed lately has been trying to 'outgrow' their reputation as clickbait,
and actually do traditional investigative reporting. With moderate success.

I agree with you though, it's hard to take them seriously.

~~~
Nadya
"Actual journalism" Buzzfeed articles are of surprisingly great quality. They
fund that quality through clickbait, which is their primary monetization
strategy if I recall correctly.

It does make the right hand difficult to believe when the left hand is off
being silly - but I'm actually impressed by the level of work produced by
their journalism department.

~~~
madenine
Their monetization strategy is advertorial content posts.

People are so used to clickbait as a way to get impressions and clicks from
display ads that many don't realize there isn't a single 'traditional' display
ad on BuzzFeed's entire site.

They make their money by charging for references to 'advertisers' products and
services throughout BuzzFeed content - something they do really well.

Doesn't come cheap either - big cash outlays iirc.

------
reustle
Not long ago I saw a post about the oil pipeline protests. It was a photo of a
huge crowd, and stated that the media was not covering the protests properly
and that people are taking a stand. The photo was of tens of thousands of
people, and had over 230,000 shares. 230k....

A quick image lookup showed that it was in fact a photo from Woodstock 1969,
but since the comments on the photo were restricted, nobody had been able to
point it out.

I went to report it, but Facebook seems to have removed the "misinformation"
option when reporting content (I thought there was one before?).

------
grandalf
Trump's strategy worked brilliantly on social media. Here's the strategy in a
nutshell:

1) Say something outrageous that news orgs will grab for a quick, clickbait
story that will generate tons of ad revenue.

2) Let outraged people share via social media.

3) Benefit as some of the people seeing the content will not disagree and will
take the candidacy seriously.

4) Win.

It doesn't matter what Facebook does with its trending section, the real value
of Trump's strategy came with the way it exploits the basic sharing / newsfeed
mechanism's intended behavior. Even today, many Trump opponents think that they
were helping by posting their outrage at every rude comment Trump made.

~~~
fullshark
Except he closed the gap in the last few weeks of the race when he stopped
saying outrageous shit.

~~~
debacle
People still want to believe that Trump couldn't have won except by exploiting
stupid people.

~~~
grandalf
His message resonated with a lot of people. His tactics exploited social media
patterns and mass media patterns effectively. He also ran against a candidate
plagued by scandal and widely disliked.

------
Kenji
Let's be frank here: free speech and Facebook never go together. Facebook
wants to present itself as a respectable forum where prominent people like
politicians can have their platform (same with Twitter). That requires some
kind of filtering.
And, as we saw with pretty much the entire American press, as soon as you
start filtering and selecting, you start to induce gross biases and
distortions of the truth. The truth only comes out in a clutter of
contradicting opinions, engaging discussions, pieces of evidence and leaked
materials. Of course, that is an ugly mess not many people want to put up
with.

~~~
matt4077
> The truth only comes out in a clutter of contradicting opinions, engaging
> discussions, pieces of evidence and leaked materials

It feels like we're way beyond that. It's degenerated to the point of two
tribes who no longer have any "shared truth" from which to argue. If I see a
post on Facebook saying that Obama has instituted martial law in New York, how
can I convince the 4364 people liking it that it's a lie? If I point out that
the New York Times would probably write about it, I'll just get laughed at.

The truth seriously fared a lot better in times when newspapers and a few TV
stations acted as completely undemocratic arbiters.

~~~
gibrown
I think it may be worse than two tribes. It's a million tribes with limited
overlap in what they consider the truth, but all having a large megaphone that
cuts across tribal lines while still building bubbles.

My analogy is breaking down. I don't think we understand how the complete
removal of barriers to publishing is changing society. Nor do we understand
how to cope with it. A bunch of folks just elected an authoritarian to "guide"
the way.

------
tmptmp
I am not a Trump supporter, as far as his anti-Muslim stance is concerned. But
I do support some of his economic stances, which are labeled as "protectionist"
by mainstream media. I may be wrong, as I am not an economics expert. His take
on immigration is also certainly a thing of significance.

A related note regarding Facebook here: AFAIK, Facebook's curators were very
biased and were always removing anything that "mainstream vocal leftists" find
objectionable, e.g. the shameless suppression of news related to people like
Geert Wilders or Pamela Geller, who are not by any means right-wing fanatics.
Geert Wilders is a staunch supporter of homosexuals. Just because he criticizes
the barbaric ideology of Islam he is labeled a "right-wing nut" by leftists
with covert/overt Islam-apologetic stances.

It is well known that the Saudi kingdom has large investments in mainstream US
media; that's what Trump drew people's attention to. Who knows how much of
Facebook is controlled by the Saudis and the like. [1]

It's no surprise that mainstream media got the prediction about Trump wrong. I
guess their predictions and poll results were in fact propaganda against
Trump. I felt it that way, many people I know felt it, and I am sure many more
people must also have felt it.

[1]
[https://www.youtube.com/watch?v=Ex9ldUHSgjs](https://www.youtube.com/watch?v=Ex9ldUHSgjs)

------
brentm
This is a tough position for FB. No matter what they do, half of the world
will think the actions they take are wrong. Most humans are intelligent enough
to know that stories like Hillary Clinton running a child sex ring (or
whatever) are false. People choose to share these because they want to believe
it or think it's funny. As their mission statement says, "Facebook’s
mission is to give people the power to share and make the world more open and
connected." It is a commendable mission, but it's also a messy one. I think
inevitably they will have to do something. Maybe some kind of "truth
barometer" on stories that attain a certain volume of engagement. At least
then they can say "look! we're trying" without taking away the people's right
to troll. In this particular election, though, I doubt anything would have
helped. The winner has been openly trolling for years; his direct actions set
the standard for truthfulness much lower than ever before.

------
malchow
It seems abundantly clear that the problem isn't low-quality content in the
feed. That's unavoidable. It's the very small trickle of high-quality content
in the feed – content from actual people who want to contribute to Facebook.
There isn't much of it. And Facebook wants to show a fresh feed every refresh.

~~~
andoma
Having uninstalled Facebook from my phone and being too lazy to login on it
from my primary computer (2fa) I only look at the feed maybe once every other
day. Amazing how much better it is. Relevant and interesting stuff from real
friends I care about, no junk. I can recommend that to everyone.

------
thrden
The timing of this announcement does not seem to help assuage fears that
facebook was using its platform to push an agenda. It was this accusation that
initially forced them to switch from human editors to an algorithm for
trending news. I suspect that we as a country are going to have a conversation
regarding the nature of privately held nearly public spaces on the internet
and what their obligation to us is, if any.

------
StanislavPetrov
In corporate speak "misinformation" is defined as "any information that
doesn't support our narrative". Certainly it is the goal of Facebook, the
government, and every other large media corporation to control the flow of
information. You need look no further than Obama's recent statement that news
needs to be "curated" by government-appointed gatekeepers, to filter out all
that pesky information that doesn't jibe with government propaganda.

~~~
nickodell
Do you mind linking to where Obama said that? The closest I can find is this:
[https://www.yahoo.com/news/obama-decries-wild-west-media-
lan...](https://www.yahoo.com/news/obama-decries-wild-west-media-
landscape-214642552.html)

>"We are going to have to rebuild within this wild-wild-west-of-information
flow some sort of curating function that people agree to," Obama said at an
innovation conference in Pittsburgh.

Why do you think that the curation is going to be done by the government or by
government appointed gatekeepers?

~~~
marknutter
It's implied by the use of the word "we" and the fact that he is a government
employee.

~~~
rat87
> It's implied by the use of the word "we" and the fact that he is a
> government employee.

I think it's clear that he's asking for a rebuilt mainstream media so that
people can at least agree on facts.

------
guycook
"As the Americans learned so painfully in Earth's final century, free flow of
information is the only safeguard against tyranny. The once-chained people
whose leaders at last lose their grip on information flow will soon burst with
freedom and vitality, but the free nation gradually constricting its grip on
public discourse has begun its rapid slide into despotism. Beware of he who
would deny you access to information, for in his heart he dreams himself your
master."

~~~
ern
AI-curated feeds are, of course, not the free flow of information. Rather, they
are blind algorithms looking at what one's "friends" like and what you've
previously liked, and then feeding you more of the same, creating filter
bubbles and fracturing society, while allowing nefarious actors opportunities
to game their inherent naïveté.

Closer to the free flow of information was USENET. Even though it was
eventually overrun by trolls and spammers, filtering was largely human-driven
and unsophisticated, via one's news client, and one had little choice but to
look at contrary opinions, and even to debunk some of the totally false junk
that is now hidden from view or buried at the bottom of long threads, left
for the "true believers".

I've been reading parts of the "alt-right" (deliberately breaking my own
filter bubble) for many years, and although I despise what they stand for, I
do agree with them on one thing: human ability is not evenly distributed. Of
course they use this point to advance a racist agenda, but the fact remains
that many people of all backgrounds are just not smart enough to deal with the
"free" flow of information. They lack the intelligence to deeply introspect on
their own biases and to understand that what they are being fed is garbage
aimed at manipulating them. The sooner geeks (and those with a "small l
libertarian" streak) accept that reality, the safer[1] the world will be.

[1] in a literal sense, now.

------
ausjke
The elite mainstream media and the supposed-to-be-independent reporters are
the biggest losers in this election, as they took sides so strongly that the
other 50% will never bother to have anything to do with them. Facebook needs
improvement? Sure, but more importantly, those biased media outlets will be
irrelevant to most people, due not only to technology advancement but also to
what the news industry has been doing over the last few decades: they're all
so biased that you always know what they're going to say, so why bother
reading them?

Can anyone tell me one truly independent news source in the US today? There is
_none_.

~~~
dmode
What about the non-elite, supposedly non-mainstream media? Like Fox News, who
perpetrated a false story days before the election and came back to apologize
for it? They are the real winners, right?

~~~
ausjke
For the record, I don't read Fox News and don't know what you're talking about.
I read Reuters, but its US news is limited; from what I've studied, it is
probably the closest thing to a neutral source.

~~~
kanwisher
Oddly enough, the two financial news wires, Reuters and Bloomberg, were the
least biased. That's because they give sterile news about markets: their
customers want to know how markets will go, not hear what they want to hear.

------
mstade
I read somewhere that Facebook isn't liable for information posted there; that
the users posting the information are the liable party. Here in Sweden, that's
not how news organizations work. Here, whatever they publish they are liable
for, regardless of who originally wrote it.

I know Facebook isn't a news organization, but it's treading a fine line isn't
it? At some point one might argue that being an aggregator and driving a
large portion of its engagement through the sharing of stories is akin to
being a publisher, and that Facebook should thus be held liable for the things
it publishes –
regardless of who pulled the trigger. Of course, this would be a huge
liability for Facebook, and a growth inhibitor because all of a sudden, a
story going viral isn't exactly a good thing if it means they can easily be
sued for libel.

I'm not a lawyer (probably obvious given the above) but I think it's a bit
strange that given Facebook's position they can so easily get away with a
"wasn't me!" kind of attitude to misinformation and straight up lies being
spread around presented as though the stories were in fact news.

~~~
acbabis
I'm not a lawyer either, but I don't think what worked in Sweden will work for
the US. Not only is it hard to sue people for libel, but content sites like
YouTube fought pretty hard to make sure they couldn't be held responsible for
things other people post on their sites (as long as they promptly respond to
DMCA notices).

------
ihsw
The moderation of Facebook is outsourced to a select group of organizations,
not unlike Twitter with their Trust and Safety Council.

[https://about.twitter.com/safety/council](https://about.twitter.com/safety/council)

[https://blog.twitter.com/2016/announcing-the-twitter-
trust-s...](https://blog.twitter.com/2016/announcing-the-twitter-trust-safety-
council)

Unsurprisingly, Twitter's Safety Council members are predominantly left-
leaning with ties to the (now defunct) Clinton campaign. I would presume there
are plenty of seats to be filled now.

~~~
duaneb
Isn't twitter moderation openly mocked for being, well, hardly functional at
all? It's been a bastion of bigotry for the past year. It's Trump's main
platform. If they're trying to suppress the right, they're doing a TERRIBLE
job.

~~~
makomk
Twitter moderation is slow and ineffective, but still partisan – they're just
not very good at it.

~~~
bduerst
If it's ineffective, then what is the evidence that it's partisan?

------
roughly27
Determining what is, and what is not, misinformation seems:

1. Intractable. How exactly do you propose to accurately vet every bit of
information posted to Facebook?

2. Dangerous. Who does the vetting? If it's an algorithm, who writes the
algorithm? No matter which way you slice it, "doing more to stop the spread of
misinformation" imposes someone's 'one true view' of reality and stifles the
engine of western civilization: the open, competitive marketplace of ideas.

This is clearly an emotionally driven sentiment, caused by the moral outcry
over a person considered 'bigoted' being elected to office. Stifling political
expression will only make Trump/next trump/next next trump's populist appeal
stronger.

------
mucker
Or Facebook could take its thumb off the scale of what is supposed to be
friends sharing information. Do you realize how Orwellian it is for a big
company to say that friends Bobby and Sally have to be monitored in what they
share? How does this not bother people?

I don't care if Bobby and Sally believe that we never made it to the moon and
wear tin foil hats. It is _none of FB's damn business_. They are a _platform_
not the editors of all things true. They should not issue us all with truth
detectors.

------
piotrjurkiewicz
So what do they propose? To hire more human editors/moderators to
evaluate/censor links and stories?

FB already has too much power, and its moderators affect 'politically
undesirable' content too much.

One week ago, FB blocked an event page of Independence March, which
traditionally takes place on November 11th (Polish Independence Day) in
Warsaw. This is the biggest mass event of Independence Day, having more than
100k participants each year.

FB also blocked or removed the pages of NGOs and political parties which
organize or support the march (some of them had 80k or 170k followers), as
well as the personal accounts of people involved in these organizations.

Then FB went full rage and started to block personal accounts of everyone who
invited or even positively mentioned the Independence March, including for
example the personal account of editor-in-chief of the second largest daily
newspaper in Poland
([https://twitter.com/sjastrzebowski/status/793001362070052864](https://twitter.com/sjastrzebowski/status/793001362070052864)).

The most extreme case was the personal account of an MP, who wrote on his
timeline: "I will be [on the Independence March] along with my family, whether
FB likes that or not."
([https://twitter.com/jakubiak_marek/status/793497135954202625...](https://twitter.com/jakubiak_marek/status/793497135954202625...))
His profile was blocked for 24 hours after that.

Another case was the personal profile of a retired Intelligence Agency
officer, who revealed in an FB post that a local coordinator of an
anti-government liberal-left protest movement during the communist period had
been a colonel of a Soviet-dependent military intelligence agency.

All of this happened just within the last month. FB's actions generated a huge
pushback and hit the headlines. The Deputy Minister of Justice qualified FB's
actions as "censorship". The Minister of Digitization tweeted that she "asked
FB management for a talk". Many people started deleting their FB accounts in
protest.

FB got frightened and reactivated the event page of the Independence March,
but the profiles of many nationalist/conservative organizations remain
blocked.

------
OliverJones
Here's what to do about Facebook.

(1) permanently close your account.
[https://www.facebook.com/help/delete_account](https://www.facebook.com/help/delete_account)

(2) if you're like me and you can't do that, "unfriend" everybody except the
people you work with. Delete all your affiliations and locations.

If enough people did this, FB would have to struggle a bit to retain its
membership. That's the kind of incentive they need. Altruism demonstrably
doesn't work, especially Sand Hill Road altruism.

(I help a couple of little nonprofits so it happens that I need a fb account.)

~~~
antocv
Alternative, lightweight solution,

* Do not visit facebook often, do so only in incognito mode, delete other existing facebook cookies on all your browsers.

* Do not use facebook app, inform people that it is a battery hog and steals everything it can on your phone.

* Do not accept new friend requests, or send new ones.

* Do not post status updates or any other kind of content

* Do not "Like" things, do not comment. Do not produce anything for facebook actively.

* Keep it only as a place to check if other people upload pictures of you by scrolling the feed a little, and to tell them to stop if they post information about you. Use the web interface for messaging any friends who don't know any other means of digital communication.

* Sometimes "Like" things you don't, and spread some misinformation, to devalue what content/profile they already have about you.

* Of course, set all your posts to private and all other "settings" to maximum privacy, allow no one to write on your "wall", and so on.

~~~
amelius
Also a good tip: unfollow all your friends by default, except for a select
few.

------
kirykl
Wouldn't Facebook have the perverse incentive to define 'misinformation' as
any information which offends the most people, regardless of truth, in order
to keep users from leaving?

~~~
mucker
Of course. Worse, they actually have two different strategic factors that play
poorly together: 1.) retaining users, and 2.) a personal bias about what will
offend users.

These two items can magnify each other in a false direction as they attempt to
retain the users they perceive they have.

------
Hydraulix989
I don't understand how this is different from any other communication medium.

Telegraph operators shouldn't feel any onus to ensure that they stop the
spread of misinformation, right?

In fact, I think it is VERY detrimental for them to step in and start editing
posts, bringing back the human editors. Humans have biases; ML less so.

If an algorithm decides that something is trending, even if it is "incorrect"
to some interpretation of the word, then it should be treated the same. It's
just a matter of principles.

------
belorn
How are we to ever prevent misinformation when newspapers are happy to
publish "political candidate did crime X! anonymous john/jane doe said"?

If newspapers were held to ethical requirements that prevented mere
accusations from being news, I could see how Facebook should also be held
liable for not following the same rules. As it is now, someone could just
append "john doe said" to the end of any conspiracy theory and by news
standards, it's fact-checked and ethical.

------
SeanDav
It is a temporary aberration. In due course, people will come to realize that
they can't believe everything they read on the internet (duh), especially
where provenance is not clear, and Facebook will improve their algorithms.

Some kind of reasonable balance will be found.

People blaming Facebook for Trump is far more a reflection of themselves than
of FB news algorithms.

~~~
mtberatwork
I certainly do not blame Facebook for Trump, but I don't see any balance
happening anytime soon. It's not just an issue of the Internet, but also with
traditional forms of media like cable TV, talk radio and print. Distrust and
misinformation are shoveled out daily. Even the local pastor is in on the
action these days.

------
joggery
You can't prevent false ideas spreading by censorship; quite the contrary.

The idea that one can create an AI or censor or committee to separate truth
from falsehood and then put it in charge is plainly authoritarian.

~~~
ben010783
What about just keeping out stories that are patently false? If I did an image
search for cars, I wouldn't want a boat in there. If I did a search for news,
I wouldn't want a fabricated story.

~~~
joggery
Such stories would not go viral.

~~~
kmtrowbr
That's not true: false, viral stories were the very lifeblood of this
election.

~~~
joggery
I'm sure that is so. However, for a story to spread it has to seem like it
_might_ be true to a significant fraction of the readers. So it can't be
'patently false'.

------
TrumpVoter12345
Throwaway for obvious reasons. It's dangerous to support reform against the
reactionary left.

I'm a Trump voter. I consider myself pretty well educated (M.Sc. CS/EE,
honors), and I work in a research lab. I'm not your typical Trump voter. Full
disclosure: I'm also an immigrant (legal one, nearly 20 years), naturalized US
citizen, I make several hundred thousand dollars a year, and pay a ton in
taxes. I'm liberal on social issues. I support LGBT rights and gay marriage.
I'm an
atheist. I have a wife and a kid. I'm also pro-gun, pro marijuana, and pro
letting people do whatever they want with their lives. I'm also extremely
fiscally conservative, and against illegal immigration.

My FB feed was so skewed against Trump I had to close my account. Bombshell
stories on Clinton wouldn't trend, while the most inane BS about Trump would
immediately reach the top and stay there.

It is strange for me to hear FB being accused of being "too conservative". I
certainly saw nothing like that "red" feed on WSJ.

BTW, I could elaborate why I voted for Trump if anyone is interested. I don't
agree with him on everything, but on the balance he seemed like the lesser of
two evils to me.

~~~
spiritofenter
I would like to hear you elaborate on your reasons for voting Trump,
particularly what kinds of changes you are expecting/looking forward to.

~~~
TrumpVoter12345
Happy to oblige. Let's get the big one out of the way: I disagree with Trump
on climate change. That said, the current president did jack sh#t about it,
and so would Clinton (who is owned by her wealthy donors). Combined with the
fact that the President can't enact policy unilaterally, and that there's
broad consensus around climate change in the legislative branch, I don't think
too much damage will be done, if any at all.

I voted for Trump for several primary reasons:

1. I'm very much against war, and Clinton has shown herself to be a war hawk
again, and again, and again, and basically wrecked much of the Middle East as
Secretary of State. Enough is enough. This alone would have been a sufficient
reason for me to vote for Trump.

2. I do not believe in government handouts. I believe that for prosperity, and
especially to improve the lives of the working poor, it is necessary to create
jobs. To do this, you need to reduce the supply of unskilled,
below-minimum-wage labor flowing across the border, and make keeping jobs in
the country the low-energy state into which corporations will naturally
settle, if they want to have US markets at their disposal. Trump promised both
of those things.

3. I do not believe that a free flow of military-age males from the countries
with which we're de facto at war is a wise thing to have. Refugee needs would
be much better served by their neighbors, not by people whose military blew up
their relatives.

4. It is becoming blindingly obvious that Obamacare is a trillion-dollar
handout to Big Pharma and the medical insurance companies. Healthcare is
SIGNIFICANTLY more expensive now than it was before, and coverage is worse as
well. And the cost is growing by double-digit percentages every year. Donald
Trump promised to improve competition by allowing the import of drugs and
letting people shop for medical insurance across state lines. Not sufficient,
IMO, but we just can't afford to pay 10x for the same drugs anymore.

5. The vast majority of Donald Trump's contributions come from his small-scale
supporters, not from Wall St and megacorps. As such, he's not beholden to
their interests, unlike a certain other candidate. Thiel is the only notable
exception I can name off the top of my head.

6. I've found what the press did in this cycle completely disgusting.

So TL;DR: anti-war, pro-jobs, anti-sjw/pc, pro-LGBT, not owned by Wall St or
the establishment. That sounds pretty good to me, loose tongue and personal
antics notwithstanding. I don't agree with him on some things, but by and
large there's more overlap than with Clinton.

~~~
spiritofenter
Thank you for responding, this gave me food for thought. (Also, you can add
extra line breaks in your comment to get breaks between your numbered points.)

I'm also wondering how convinced you are of Trump's sincerity – whether he
truly cares about the promises he made or the people he made them to.

~~~
TrumpVoter12345
I'm more convinced of his sincerity than of Clinton's. Let's face it: he
doesn't _need_ the presidency. $400K a year is pocket change for him, he's not
a politician, and he did not spend 30 years of his life converging on this
goal. Moreover, he partially financed his own campaign with tens of millions
of dollars. Truth is, all politicians lie to get elected; he's no exception.

But I believe he did not lie about the core tenets of his campaign: wars,
immigration, globalism. He will do something about those at the very least.
That's good enough for me.

------
danblick
Neil Postman argued in "Amusing Ourselves to Death: Public Discourse in the
Age of Show Business" (1985) that _entertainment_ had become the "supra-
ideology of all discourse on television". _Good television_ is entertaining
while _bad television_ is not, so we learn to judge all content on television
based on its entertainment value.

"Americans no longer talk to each other, they entertain each other. They do
not exchange ideas; they exchange images."

I think Postman's comments are extremely relevant today (30 years after 1985)
except that now they would apply to the Facebook news feed. (The quote about
'exchanging images' is now literally true.) The role of "news as
entertainment" does a lot to explain Trump's emergence as a candidate in
the first place (he got a lot of coverage early on because of his
outlandishness).

------
jondubois
I don't think this is only Facebook's fault. I watched several extremely
biased YouTube videos on both sides.

I guess it's just how decentralized media works; it only tells people what
they want to hear.

Not so long ago, the media used to be relatively trustworthy, now everything
has become propaganda. It started online mostly but now even major TV channels
and newspapers which used to pride themselves on journalistic integrity have
become propaganda machines.

There is no truth in the media anymore; the only way to approximate the truth
is by watching opposing media channels and then 'averaging them out' in your
mind.

All the news I was reading online was so obviously biased in Hillary's favor
that I was compelled to watch Fox news (which I used to think was complete
garbage) just to try to get the other side of the argument.

~~~
miles7
This assumes Fox is as biased in one direction as, say, NPR is in the other.
But you can tell the quality of reporting from the content itself and the way
it is presented. For example, while cable news channels will go to the same
non-expert pundits for every story, NPR will actually call a specific expert
who is knowledgeable about each story to comment on it.

~~~
jondubois
Sure, that's something you have to account for as well (they're not equally
biased).

That said, you'd be surprised how well media organizations can bend hard facts
(and so-called 'experts') to match their opinions. People who've worked in
quantitative market research will tell you that you can find pretty much any
pattern you want in any data if you present it (and label it) carefully.

Just because they've put in extra work to get an 'expert' and did some
research doesn't mean that they're unbiased; it just gives you extra
confidence. You still have to fact-check things yourself.

------
alva
Can't wait for the Facebook Department of Truth™ to inform me!

------
youdounderstand
Just stop using aggregators. Pay for real news sources that hire actual
investigative journalists.

~~~
vetinari
Real news sources are dead and actual investigative journalists do not exist
anymore.

~~~
paublyrne
Well, that's patently not a true statement. The Guardian, the NY Times, and
Der Spiegel are the first three publications that come to mind that do
important investigations, but they are not alone.

There are a lot of PR plants even in the more reputable newspapers, which is
unfortunate, but there are still regular thorough investigations, which I
assume are getting harder to do as money and resources continue to shrink.

~~~
doomlaser
The blind NYTimes bias has been readily apparent in this election cycle. WaPo
as well. The entire media system missed this election.

A staffer in the Obama White House remarked on the quality of reporting now,
as of a few months ago:

"All these newspapers used to have foreign bureaus. Now they don’t. They call
us to explain to them what’s happening in Moscow and Cairo. Most of the
outlets are reporting on world events from Washington. The average reporter we
talk to is 27 years old, and their only reporting experience consists of being
around political campaigns. That’s a sea change. They literally know nothing."

[https://www.washingtonpost.com/blogs/erik-
wemple/wp/2016/05/...](https://www.washingtonpost.com/blogs/erik-
wemple/wp/2016/05/05/white-house-official-on-some-reporters-overseas-
expertise-they-literally-know-nothing/)

~~~
linkregister
That said, NYT performed some significant investigative reporting. Though
Gawker broke the Clinton private email server story, the New York Times vastly
expanded the investigation and brought the story into the mainstream.

The statement that the entire media system missed the election is an
exaggeration. One function of the NYT, the election prediction section,
failed. It shouldn't reflect on the various investigative reporters doing
actual digging.

~~~
doomlaser
I would recommend reading this commentary, from the political director of CBS
News, today:

"The mood in the Washington press corps is bleak, and deservedly so.

It shouldn’t come as a surprise to anyone that, with a few exceptions, we were
all tacitly or explicitly #WithHer, which has led to a certain anguish in the
face of Donald Trump’s victory. More than that and more importantly, we also
missed the story, after having spent months mocking the people who had a
better sense of what was going on.

This is all symptomatic of modern journalism’s great moral and intellectual
failing: its unbearable smugness. Had Hillary Clinton won, there'd be a
winking "we did it" feeling in the press, a sense that we were brave and
called Trump a liar and saved the republic."

[http://www.cbsnews.com/news/commentary-the-unbearable-
smugne...](http://www.cbsnews.com/news/commentary-the-unbearable-smugness-of-
the-press-presidential-election-2016/)

~~~
mmel
I'm frankly amazed that was published at CBS and not some pseudonymous post on
medium.

------
std_throwaway
Misinformation is any information that does not follow the official guidelines
laid out by the ministry of truth.

------
alexmingoia
The problem is that the Facebook feed shouldn't be about news in the first
place, or anything that isn't original content from the people I follow. I
miss the days when the news feed was photos of my friends, notices about their
new jobs, what they ate, etc., instead of a stream of tabloid junk and
recycled news articles.

Unfortunately, Facebook gives no tools to limit the feed to only original
content and profile updates. Curating your followed list is pointless, since
reposted crap and original content are mixed together.

~~~
kmtrowbr
I agree. I think there could be a limit on the number of links and of images
containing text that you can post in a day, as well as warning messages next
to content that comes from questionable sources.

------
rahrahrah
Admitting to fighting misinformation amounts to the same as admitting that
they will pursue a left-wing bias.

------
avivo
We are developing systems to identify and check the impact of misinformation
at scale (triage + human in the loop).

If you're interested in potentially joining or funding us, let us know
[https://docs.google.com/forms/d/e/1FAIpQLSclhq7zrUKI3nxJFiYw...](https://docs.google.com/forms/d/e/1FAIpQLSclhq7zrUKI3nxJFiYwJ-
NvyW4xAeLbSF7cWxst6GKBON9Y7Q/viewform)

-----------------------------

Also relevant — my comment 43 days ago to the new "Request for Startups":
[https://news.ycombinator.com/item?id=12594955](https://news.ycombinator.com/item?id=12594955)

"I'm really happy to see YC putting attention toward the (increasing) market
failure around media quality. It's a serious problem, which IMHO imperils
governance and stability worldwide.

I'm exploring a way to improve this dramatically, and fairly rapidly;
essentially creating a stronger market for trust & quality, with resulting
ranking rewards from FB, Google et al. If you are interested in learning more
and perhaps being involved, let me know (yc@aviv.me). Experience with news
organizations, partnerships, platform companies, pr, moderation systems, or
machine learning are all especially helpful."

------
jondubois
The biggest source of bias in the media is not related to feeding people false
information, it's about misdirection.

If you have two articles A and B, where A is more important than B, sometimes
the media outlet will deliberately choose to publish B instead of A if it
matches their philosophy better (even though it's less important). If the news
outlet shows each reader 5 articles of type B for every 1 article of type A,
then they are manipulating the reader's opinion to become biased towards type
B thinking (by focusing their attention on less important stuff). It doesn't
matter if all articles are 100% accurate, your perception is being affected
through the exposure imbalance; through priming.

I think that's what happened with the Clinton/Trump fiasco; "Hillary media"
was publishing more articles about Trump's "sexual deviance and unstable
character" while "Trump media" was publishing more articles about Hillary's
"deception and murky financial schemes".

------
Zenst
How do they define misinformation, given the many grey areas, for example
religion?

So in such grey areas one person's information is another's misinformation,
and that gets messy for anybody, let alone AI.

------
leurfete
So Facebook will be openly committed to censoring certain information for the
public good. What could go wrong?

~~~
ern
Their algorithms are already doing this. It's not like it's a pure
chronological feed.

------
sandworm101
Facebook isn't the problem. It's a function of its users. It is as much
responsible for misinformation as a city park is responsible for the crazy guy
shouting about the lizard people. The problem is all the users who actually
believe anything they read via facebook.

~~~
xs
Suppose you were the person to relay the message though.

Imagine one friend tells you "Can you please tell Adam X", but you know X is a
flat-out lie and is actually horrible and offensive. So do you agree to spread
the lie? Oh, and you're doing this for free, and it's not your friend, it's
two strangers.

~~~
sandworm101
"Can you please tell Adam X" is a hair away from spreading rumor, hearsay. I
would ask, at least in the context of facebook, why the speaker cannot talk to
Adam himself. If he cannot, if he doesn't have the connections required to
speak to Adam, then his message probably isn't worth forwarding.

~~~
xs
I was trying to paint the picture that Facebook is the one passing along
messages from one person to another. So yes, people could turn their computer
off and go directly to someone else's house and tell them the message, but I
think we've grown accustomed to sending messages to each other over social
media.

------
bobsgame
I would like to see fewer headlines framing every statement made by a company
as "admits." Facebook is not admitting to wrongdoing or confessing to a secret
crime they tried to cover up, they are _acknowledging_ there is an issue that
they have to improve on.

------
webwanderings
[http://www.niemanlab.org/2016/11/the-forces-that-drove-
this-...](http://www.niemanlab.org/2016/11/the-forces-that-drove-this-
elections-media-failure-are-likely-to-get-worse/)

------
arprocter
If it's such a problem why not just disable the Newsfeed?

No one needs to get their news from Facebook

------
triplesec
Very relevant to understanding today's media and politics is Adam Curtis' new
documentary film Hypernormalisation. It's long but very worth it

[http://www.bbc.co.uk/programmes/p04b183c](http://www.bbc.co.uk/programmes/p04b183c)

If anyone has links to the BBC iplayer for the US, if there is one, or another
legit source for outside the UK, please do reply with it.

I strongly encourage you to watch it, and if you can only find it from a dodgy
source, buy someone a classic BBC DVD of something for Festivus so support
their making more. (lots of things, Dr Who, Jeeves and Wooster, the Century of
the Self...)

~~~
justin_
Seconding this recommendation. American viewers can watch the film here:

[https://thoughtmaybe.com/hypernormalisation/](https://thoughtmaybe.com/hypernormalisation/)

------
Osiris30
"It’s totally forgotten now, but for the 100 years after the American
Revolution, the U.S. government made it free or almost free to send newspapers
anywhere by mail. It was available to papers of all political perspectives,
with no government censorship. The rationale was straightforward: This was
necessary for people to participate in governing themselves."

[https://theintercept.com/2016/11/09/donald-trump-will-be-
pre...](https://theintercept.com/2016/11/09/donald-trump-will-be-president-
this-is-what-we-do-next/)

------
marze
If the goal is not to bias, total transparency is the answer.

------
Animats
Here's a tool someone could write. Write a browser add-on which modifies
Facebook pages so that the news links are changed to Google, Bing, and
Wikipedia searches for the news headline. Connect to Google/Bing in a mode
with no signin or cookie info, so you don't get "search personalization". This
will get you out of Facebook's filter bubble quickly. Google and Bing have
problems, too, but they are somewhat better at blocking spam, having been at
it longer and being in the business of forcing spammers to buy their ads
instead.
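
A minimal sketch of the rewriting step such an add-on could use. The DOM selector in the comment is a guess at Facebook's markup, not the real thing, and `pws=0` is one way to ask Google for non-personalized results:

```javascript
// Map a news headline to a neutral search URL with no sign-in or cookies,
// so the results carry no "search personalization".
function toSearchUrl(headline, engine = "google") {
  const q = encodeURIComponent(headline.trim());
  const engines = {
    google: `https://www.google.com/search?q=${q}&pws=0`, // pws=0: no personalization
    bing: `https://www.bing.com/search?q=${q}`,
    wikipedia: `https://en.wikipedia.org/w/index.php?search=${q}`,
  };
  return engines[engine];
}

// In the add-on's content script one might rewrite each outbound news link:
// document.querySelectorAll("a.news-link").forEach(a => {
//   a.href = toSearchUrl(a.textContent);
// });
```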

~~~
majani
Better and easier to do a Google News search for the link itself. If the link
doesn't show up in news search, it can be considered a less credible source.
If the link shows up, it'll be accompanied by follow-up sources.

~~~
jwm4
Google as the arbitrator of what's good or bad? Please!

~~~
majani
well are you going to suggest a better solution?

------
dsugarman
This is what I would propose: attach a very visible rating to each story
shared from all news sources; the rating says how well the news source holds
to its journalistic integrity. There should be very clear criteria that
Facebook holds its news content providers to. If you click on the rating of a
low-rated news source, you should be shown exactly what extremely misleading
articles they have published in the past. These reviews should also be tagged
so you can view them directly if someone is spreading misinformation on the
news feed.

~~~
kmtrowbr
I agree with you. I would keep it simple and as transparent as possible.

------
gdubs
Look folks, I get that this is a libertarian crowd but hear me out:

If someone says the earth is flat, it's not dystopian for editors to minimize
those claims, or even point out how incorrect they are. Instead, in today's
media world, the response would be, "two sides disagree on the contour of the
planet", and would present two talking heads to chatter about their opinions.

Facts are facts, and we're losing the battle online to disinformation.

Beware of slippery slope arguments about how editorializing leads to
propaganda.

------
makecheck
There are two big problems shown by this. First, that any one service like
Facebook could become so large and unopposed that we _need_ to care how it
handles something like this. And second, that people seem fine with the idea
of living only inside Facebook, unfriending or muting anyone who disagrees
with them, until nothing else matters.

This is an utterly unhealthy way for people to be. If you want to combat
misinformation, you must also encourage (teach?) people to stop being spoon-
fed and start being more curious and more critical.

~~~
kmtrowbr
Facebook could deliberately inject a certain amount of contrarian content into
people's feeds.

------
psyc
"Stop the spread of misinformation" is a subset of "stop the spread of
information", which is yet one more way of saying that some people can't stand
the freedom of speech.

~~~
alexc05
That is fair - however it does not have to be implemented by stopping the
information from spreading. An alternative could be to refuse to distribute
the false information without a disclaimer reading "DEBUNKED / FALSE / see
here for fair balance"

Or include a DIV to one side with a bulleted list of false claims.

Consider the TRUMP quote image:
[https://pbs.twimg.com/media/CRxEt7pU8AAzekC.jpg](https://pbs.twimg.com/media/CRxEt7pU8AAzekC.jpg)

People Magazine 1998 - I'd run as a republican because they are the dumbest.
I'd lie my face off and they'd love me [paraphrase]

A Facebook fact-checker wouldn't have to suppress the sharing of the image. It
could create an overlay or add a subtitle or tag which reads "FALSE QUOTE -
this isn't real"

Rather than relying on the hundred million individual conversations being
_maybe_ fact checked - an algorithm / ai / editorial squad could apply fact
checking automatically.

With the image thumbprint / URL flagged in the Facebook database as proven
false, it gets the extra content on its own.
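
A rough sketch of that flag-then-overlay step. The registry contents and the fingerprint scheme are invented for illustration:

```javascript
// Content proven false is not suppressed, only annotated with an overlay.
// `registry` maps a content fingerprint (image hash or canonical URL) to a
// short verdict; the entry below is a made-up example.
function annotate(post, registry) {
  const verdict = registry.get(post.fingerprint);
  if (!verdict) return post;                        // unknown content: pass through
  return { ...post, overlay: `FALSE: ${verdict}` }; // attach label, keep the content
}

const registry = new Map([
  ["img:people-magazine-1998", "fabricated quote - this isn't real"],
]);
```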

------
mbostleman
It would be nice if this were approached more as a user experience enhancement
as opposed to a content editorial function. I would love to see the former as
my feed is so overwhelmed with garbage. The latter, however, would almost
inevitably turn sinister - the leftist, PC, Silicon Valley influence would
have no ability to check itself - even if it's not on purpose, the isolated
bubble-think could not be overcome. I can't imagine that it would be hard to
automate away teenagers purposely creating fake content.

------
white-flame
I really enjoy the dynamic of large forums, where varying ideas can pitch in,
and I appreciate that much more in comparison to family members who are
steeped in facebook.

Global forums can end up with a fairly civil discourse, because everybody
knows and acknowledges that there are differences in opinion. If you want your
thoughts to be respected, you need to make them level headed, even if it is an
emotional argument. The broader the forum, the better it tends to be. Of
course, it also depends on good moderation, but even the self-moderation of HN
and /. tend to end up being level headed and fairly moderate.

But there are no alternative opinions or moderators in facebook. People
subscribe to small echo chamber radicals, which laser focus on some hyper
exaggerated version of their outlook and there's no counterpoints in sight.
Disagreements with what other people post end up in unfriending, because it's
too personally dissonant.

This whole blurting out of one's impassioned political, religious, and social
ideologies is simply not appropriate public discussion!

You don't go out to social gatherings and start yelling stuff like this. Yet,
on facebook, people share such content on their walls for all their
acquaintances to see, which ends up being incendiary. These are subjects
reserved for those you have enough rapport with to warrant deep controversial
dives, not your personally public web presence. The very nature of the
mechanics of social media tends to stoke and reward this behavior.

------
GBond
Zuck seems to be crafting and curating his persona & reputation for a future
life of public office. Changing up facebook's bad content reputation will be
important for this path.

------
zxcvvcxz
Something I haven't seen mentioned yet is the voting system. I think the
voting system ruins online discourse.

Let me explain. Within my own social media and online forum bubbles, there are
a certain set of norms and beliefs that are easy to express.

But should I express the 'wrong' beliefs in the 'wrong' forum, I simply get
downvoted to oblivion. And then people don't see what I have to say. Look what
happens here on Hacker News, my comments will literally start faaaaading
awaaaaaay!

~~~
dredmorbius
Truth isn't a majority opinion.

It's correspondence to an external Universe.

Voting systems _may_ help get you there, but not if the voting-group tendency
rewards values other than truth.

Related: Celine's 2nd law.

The problem is complex:
[http://plato.stanford.edu/entries/epistemology/](http://plato.stanford.edu/entries/epistemology/)

------
DINKDINK
Ministry of Truth

------
thr0waway1239
The linked article from NiemanLab [1] is a really odd read. It ends like this:

"It’s been said that we get the media we deserve: that the journalism we see
is a reflection of business structures and audience decisions, not the result
of an elite’s decisions to shape public opinion. There’s a lot of truth to
that. But the information we produce and consume is generated by human beings,
not systems, and those human beings have just gotten the shock of their
professional lives."

So he starts by saying that the journalism we see is _not_ the result of an
elite's decisions to shape public opinion as if it is a good thing (which you
would probably agree with).

Then the next line says that the result of the opposite - the unwashed masses?
- is (some bad thing will occur).

At this point, you are probably expecting something neutral to follow. I mean,
after all, the author talks about the journalism profession, which is supposed
to provide both sides of the picture.

What actually follows is this:

"If we’re going to build a better environment for news, we need to think about
these issues in a much bigger context than one election night. And it’ll take
everyone — journalists, readers, tech companies, and more — to make it
happen."

Great, so the author has just defined himself, Facebook, the general media and
the people who agree with his views as the "elite" who wants to "fix" the
problem by shaping public opinion in a way which is more palatable to him.

Maybe this attitude is what caused this "systemic shock"?

[1] [http://www.niemanlab.org/2016/11/the-forces-that-drove-
this-...](http://www.niemanlab.org/2016/11/the-forces-that-drove-this-
elections-media-failure-are-likely-to-get-worse/)

------
cm3
I'm still waiting for the day that there's a responsible entity that only ever
publishes things when they're 100% certain and is willing to bet their freedom
(going to prison) on it. But given how information and influencing of opinions
is a market and means control of the population, I'm afraid this won't happen
with official support, and it will only be seen as the crazy-lunatic news
agency that publishes stuff a week or month later, after having vetted things.

I'd like to say leave speed reporting to the Twittersphere and mandate a clear
label on unvetted news reports on any network, but I doubt politics can have
such influence on the media. I would love it if the news reports had a
watermark that says fresh-and-unvetted just like "preliminary results" or
"consult your doctor before taking".

First we have to encourage and support critical thinking, but too much of it
may lead to some influencers misusing it to support causes which deny past and
current crimes of humanity on itself and the planet.

~~~
robbrown451
That's scary. 100% certain? Nothing is 100% certain.

So they can't even say "it's likely to rain tomorrow"? This doesn't seem
thought through.

~~~
cm3
Not all topics warrant certainty, but given the power of news outlets forming
opinion and thereby influencing the population's behavior, I do think certain
topics demand responsible reporting which doesn't report anything at all if
uncertain. It's the same as a good police detective not disclosing
speculations to the media because they pursue many leads but only conclude
one, and you don't want a mob to go lynch people. The same logic applies to news
outlets forming people's opinions. That's why I think there's a need for,
admittedly few, certain-report-only news agencies. That way, if you read posts
on TheSun or E!Online, you get accustomed to not taking it seriously, forming
doubt that this is most likely speculation. Once something gets confirmed
unquestionably, it can migrate to one of the few vetted-only news outlets, if
that's something they cover.

That said, our weather models are pretty good but not good enough to make
certain predictions that far into the future, but they can for the next
few hours.

It's like a software company's model of code branches. The
Apple/Google/whatever filesystem team works on something, it gets pushed into
their level of production branch, then it percolates up to the shared
production kernel branch, and after a couple more layers it hits the common
branch, which is what public production binaries are made from and consists of
kernel, userland, foobar modules all merged together. Not all software shops
operate this way, but it's what a project can demand after it hits a certain
size. The Linux kernel works this way too, to name a successful non-
commercial project. You can argue this doesn't prevent regressions, and that's
true, but it's hard to deny there would be more regressions (aka false
reporting) with unfiltered (aka unvetted) reporting.

~~~
robbrown451
I can understand having _some_ news outlets with a much higher threshold.

I still don't believe the term "100% certain" is meaningful. Maybe if they
were to put a label on certain facts: "This fact is considered by our
editorial board to be 93% certain." And maybe have a chart, so that figure can
change over time.

I think there are better ways, and that there should be accountability. I just
am not in favor of black and white terms for concepts that, to me, are purely
shades of gray.

------
mtgx
So who's going to be Head of Truth at Facebook?

~~~
dredmorbius
Someone very good at it, I hope.

------
omegaworks
I think it would be better to provide something like the opposite of "friends
you'd get along with"

If we could switch the facebook algorithm to different modes - "people you
agree with" "people who'd challenge you"

That would be better than straight up blocking false things. Let the
conversations happen.

------
flipcoder
Censorship

------
cakeface
This is sort of starting to remind me of the Reticulum in Neal Stephenson's
book _Anathem_. He talks about how they are constantly checking the reputation
of data and only valuing high reputation sources. It's certainly hand wavy but
interesting. Life imitating art imitating life.

------
Zikes
Good luck with that. Reddit's failed at it. Twitter's failed at it. It's
almost always bundled with some sort of hate speech or political rhetoric that
triggers swaths of people when they feel like they're being silenced.

There probably is no right way to do it.

~~~
duaneb
I tend to agree—I think that the more we get upset by these sorts of
behaviors, the more it encourages them. "Don't feed the trolls" is really the
only thing that works, and it's a pretty miserable experience to have nothing
you can do about harassment.

Which isn't to say I disapprove of reddit, twitter et al removing hate speech;
just that it's a very late and reactive solution to a problem that seems
inherent in social media.

Really, all we can do is improve filtering, and facebook seems almost
ideologically opposed to increasing user control over the content they see. So
I don't see facebook being a pleasant place to visit in the near future.

~~~
piotrjurkiewicz
> hate speech

On many subreddits, being in opposition to massive illegal immigration is
considered 'hate speech' and is a reason for a ban.

~~~
duaneb
Err, citation needed. Preferably by someone employed by Reddit. If it's just
the subreddit.... I don't know what to tell you beyond "find better ones".

~~~
piotrjurkiewicz
I was talking about /r/europe for example.

> find better ones

It's not that simple, bearing in mind that:

1. The current moderator clique holds the most adequate name for a pan-Europe
subreddit.

2. Everyone registering an account from a Europe-located IP address is
automatically subscribed to this subreddit.

------
eggy
My elders always taught me to believe half of what you read, and "believe" is
not such a good word here.

People need to use their wetware and common sense. Tabloids were printing
alien invasions with photos, or that certain celebrities were aliens, during
the 70s and 80s, and _most_ rational people did not believe them. The papers
were allowed to continue printing.

How about multiple sources, and not just being tuned into FB for your entire
world view? I'm glad I taught my children how to live without their smart
phones, and now they use them wisely. They can even coordinate meeting up when
out in the city if their phone's battery or phone fails.

Common sense is a waning resource.

------
will_pseudonym
We obviously need a commission that determines what things are facts and which
things are fiction. It will be composed of experts who know more than the
populace, so it will be much better. It will be called the Ministry of Truth.

~~~
kmtrowbr
The radio, print, and television media have laws concerning what they can
publish. I don't think it's so incredibly impossible to imagine that the new
technology which is replacing those old forms, might also have something in
the spirit of those older laws. Also consider: Facebook is already censoring
what you see, it's just censoring to show you things you will like.

------
spinlock
Call me old fashioned but I blame people for being stupid rather than
technology.

------
nogbit
LOL, what's it going to do, run as an NPO and take donations from its users
and no ads? C'mon Zuck, I dare ya, because you know that is the only way to
"fix" this... fix your bottom line.

------
MaysonL
"Rhetorical Fever" is entirely too contagious. Recovery from it can provide
immunity, but is fairly rare. (See _Shikasta_ by Nobel-Prize winner Doris
Lessing for more).

------
millettjon
I deleted my Facebook account and suggest others do the same.

~~~
dredmorbius
Are you going to suggest this solution to _every_ mass media channel which
turns out to have perverse or pathological dynamics?

Do you realise your personal actions in this case don't have any impact (at
the margin) on the whole?

Are you willing to entertain that perhaps there's something in the function of
Facebook (and other online sites) that leads to information flows with
strongly negative social welfare outcomes?

(I don't have, and have never had, a personal Facebook account.)

------
readhn
Facebook needs a censorship department! Get a list of prohibited topics from
the White House and block those posts on Facebook.

Better yet, ask Trump what topics to block :)

------
squozzer
A humble question --

Had Hillary won, would this post even be on HN?

~~~
return0
No. And Facebook does not pretend it's not pro-Hillary (its CEO endorsed
her). That doesn't mean the discussion is not warranted. In fact it's long
overdue. Facebook is now a medium like the rest; it needs to be regulated like
the press, imho.

~~~
emodendroket
"Regulated like the press?" What the hell does that mean?

~~~
return0
Freedom from censorship, enforcement of defamation laws.

~~~
emodendroket
Those things already apply to Facebook users. Facebook is actually proposing
active censorship here.

~~~
return0
Yes, that's what I'm saying: Facebook has no business censoring people.

~~~
emodendroket
Oh. Well then I agree.

------
ddingus
Facebook isn't the right venue for this.

Of course it's filled with garbage!

While it's noble to seek better, the reality is people are using poor tools
because the good ones are failing them.

Ask where did journalism go and that's the start on real solutions.

If we had that as we used to, the garbage on FB would be much less of a worry.
The laughable would get laughed at, as it should.

------
hedonistbot
I love it when engineers try to fix social problems. When the constructs of
the solutions are human-made then they are gameable and therefore not a
solution. And also why is this suddenly a problem? I would wager that if
Hillary won there would be much less talk of "bias" and "misinformation".

------
grandalf
This headline could be retitled: "Silicon Valley calls upon Facebook to use
more aggressive censorship"

------
surferbayarea
Also to all the people here who are ok with selling their privacy to facebook,
google - how do you now feel that a Trump-led government will have 100% access
to all your data! It is time that we fix the internet from the menace of the
advertising business model - that has not only destroyed the internet but
society itself...

------
ern
Zuckerberg seems to be denying that it's a problem:
[http://money.cnn.com/2016/11/10/technology/facebook-mark-
zuc...](http://money.cnn.com/2016/11/10/technology/facebook-mark-zuckerberg-
fake-news/)

------
randomgyatwork
What is the difference between 'fake' news and true news that's inconsistent
with prevailing narratives?

By letting 'the people' choose what trends, at least you are being honest.

Most real news is fake anyways.

------
hunvreus
The same way parents are responsible for the education of their kids, not
schools, individuals are responsible for feeding their brain and developing
opinions.

Facebook isn't promising more than cat gifs, Buzzfeed click bait and rants. It
is no more an echo chamber than your group of friends.

------
amelius
Can't proper moderation be crowdsourced somehow?

E.g.: a group of fact-checkers, with reputation based on previous performance
and a page-rank like system, and perhaps a meta-level above that (people
checking the fact-checkers), so that gaming the system becomes difficult.
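
One way the first layer of that could look, as a sketch. The reputation values are placeholders; in the scheme described, the meta-level (people checking the fact-checkers) would adjust them over time:

```javascript
// Reputation-weighted verdict: each checker's vote counts in proportion to
// how often their past verdicts were upheld by the layer above them.
function crowdVerdict(votes) {
  // votes: [{ saysTrue: boolean, reputation: positive number }]
  let weightTrue = 0, weightTotal = 0;
  for (const v of votes) {
    weightTotal += v.reputation;
    if (v.saysTrue) weightTrue += v.reputation;
  }
  // Weighted share of checkers saying "true"; null when nobody has voted.
  return weightTotal === 0 ? null : weightTrue / weightTotal;
}
```

Gaming this requires first accumulating reputation, which the meta-checkers can claw back, which is what makes the system harder to manipulate than raw vote counts.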

------
daveheq
Define misinformation, because certain groups will call certain facts
misinformation.

------
michaelbuckbee
An interesting contextual point is that Amazon (or perhaps more precisely Jeff
Bezos) has an outstanding and tremendously well respected journalism feature
whereby they publish original, factual, well-researched articles every day at
the Washington Post.

------
goshx
They could start by including "misinformation" as a reason when you report a
post.

------
known
Unlike traditional media, social media spreads free opinion
[https://en.wikipedia.org/wiki/Decline_of_the_West](https://en.wikipedia.org/wiki/Decline_of_the_West)

------
known
FB is supposed to promote
[https://en.wikipedia.org/wiki/Wisdom_of_the_crowd](https://en.wikipedia.org/wiki/Wisdom_of_the_crowd)

~~~
dredmorbius
Mackay had some thoughts on that topic.

[https://archive.org/details/memoirsextraord05mackgoog](https://archive.org/details/memoirsextraord05mackgoog)

[https://archive.org/details/memoirsextraord01mackgoog](https://archive.org/details/memoirsextraord01mackgoog)

------
ixtli
I don't mean to be all doom-and-gloom but it seems too little, too late.
Someone else in this thread put it well:

> people immersed in echo chambers will accuse them of bias no matter what.

------
jondubois
I think that news and advertising is too powerful to be in private hands. All
major news and advertising sources should be nationalized or turned into non-
profits.

------
tomohawk
They have it backwards. It's not about blocking misinformation. That just puts
you in the role of the Inquisition. It's about seeking the truth.

~~~
dredmorbius
Both matter. Blocking fuckwits (or misinformation) is a very useful heuristic
to promoting truths.

(Google "Block fuckwits" for my earlier discussion(s))

------
cpncrunch
What if they added a flag button labelled "This post contains misinformation"?
Then, if sufficient people click it, it is reviewed by a human editor.

~~~
linkregister
Great, I can expect every New York Times and Fox News article to be flagged by
people who didn't like it.

~~~
cpncrunch
You'll notice I said that such articles would be reviewed by a human editor,
which would prevent this type of abuse.

------
emodendroket
I mean it seems a little dystopian if Facebook is going to become the arbiter
of what information is accurate and start culling stuff.

------
munificent
Facebook relatively recently added reaction buttons. Would a "This is untrue"
reaction button help?

~~~
dredmorbius
That's an option. You've got the fundamental problem that many people treat
truth as a popularity contest or majority opinion. It's not.

You'd need at some level a source of trusted truths against which to compare
or rate statements. Though user-based flagging could work.

There's both a wisdom and a madness of crowds to contend with.

------
swiftisthebest
All they need to do is have a skeptical reaction along with their other
emotive reactions.

------
agumonkey
What if Facebook found a way to turn its billion users into tiny
Wikipedia-style fact checkers.

------
SixSigma
ITT: people who think the world is dominated by unequivocal ideas and
information.

------
dvh
I was thinking something similar about Gmail. Once a year or so someone sends
me a PowerPoint with "Mars would be as big as the Moon" and other obvious
nonsense. Gmail should notify them before sending that it is indeed bullshit,
if they really want to send it.

------
_audakel
why not just add a "spam flag" on all posts and let the users clean it up? Or
something like a downvote.

------
6d6b73
So they admit they want to censor FB.

~~~
Afforess
Censorship of libel and intentional mischief is no vice.

~~~
istorical
It's up to courts to decide whether something is libel or slander, not an
algorithm or an engineer at Facebook.

Private communication / media organizations like Facebook deciding what's
acceptable and what's not would only be censorship.

~~~
Afforess
No, it's not. Courts are a last resort, when private mediation has failed, not
the first line of action. If I publish news articles claiming "the sky is the
color red, and only evil people disagree", courts do not need to be involved
for Facebook to bury my articles. Most mischief is obviously classifiable
without outside mediation.

------
andrewclunn
Let people develop critical thinking skills and determine their trusted
sources. Keep censorship out of this!

~~~
dredmorbius
That's been tried. It scaled poorly.

------
mrle
what is misinformation? what is false today can be true tomorrow.

------
awesomerobot
[Citation needed]?

------
fiatjaf
It must stop its operations. That is too much.

------
kmtrowbr
I have had a lot of similar thoughts recently. If you think of this
historically -- soon after mass-communication technologies were first
invented, they were eventually abused, the radio propaganda of WWII being one
of the best examples.

Laws were introduced, and corporations grew up around these technologies, the
producers of the content became professionalized, and by doing it full time,
some of them even developed ethics and standards of behavior.

Now, that old media has been destroyed by new technology. The new technology
is fundamentally superior in terms of the volume of information that can be
transmitted, and it is both push and pull. However, the old media's "checks
and balances" were also destroyed, and that is having incredibly negative
consequences.

I think there are some concrete, simple, technical steps that Facebook &
browser vendors can do:

* Before allowing you to share an article from a source that is of questionable trustworthiness, it should show you a warning message: "This site is known to contain untrue content." They might also consider adding an alert next to links from these sites when they are being displayed in the news feed. I think technically, this problem is not that different from the way that email systems deal with spam.

* Similarly, just as browser vendors show you an alert before allowing you to navigate to sites that contain viruses, etc, the browser can show an alert before taking you to a site that is known to host untrue content. And just as you see a red error message when visiting a site that doesn't have a proper SSL setup, this could appear for sites that are untrustworthy.
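The spam-filter analogy above can be sketched in a few lines. This is a hypothetical illustration, not anything Facebook or browser vendors actually ship: the domain set and the `share_warning` function are invented for the example, standing in for what would in practice be a curated, regularly updated blocklist (much like the DNS blocklists email spam filters consult).

```python
from typing import Optional
from urllib.parse import urlparse

# Hypothetical blocklist of domains known to publish fabricated stories.
# A real system would load and refresh this from a curated dataset.
UNTRUSTWORTHY_DOMAINS = {"example-fake-news.com", "totally-real-stories.net"}

def share_warning(url: str) -> Optional[str]:
    """Return a warning message if the link's domain is on the blocklist,
    or None if no warning is needed."""
    host = urlparse(url).netloc.lower()
    # Strip a leading "www." so www.example.com matches example.com.
    if host.startswith("www."):
        host = host[4:]
    if host in UNTRUSTWORTHY_DOMAINS:
        return "This site is known to contain untrue content."
    return None
```

The hard part, of course, is not this lookup but who maintains the list and by what criteria, which is exactly the political question discussed below.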

Don't underestimate the psychological power of strategically placed small red
warning messages over time.

Obviously making the decisions of what goes on the list will be a highly
political affair, but what is nice to realize is that, unlike the government,
all these technologies are developed by private companies and given to their
users for free. If you keep the bar for what is considered "lying" pretty
high, most educated professional people (who are the employees of these
companies) would be able to agree on which things to warn about.

Additionally, the "social media" space has greatly matured and consolidated.
There is no longer really a direct competitor to Facebook. Facebook and
browser vendors now have the opportunity, and the responsibility, to innovate
in this area. I think they can do a lot without really impacting their bottom
line.

Don't allow yourself to get sucked into the problem of how computers might be
able to determine "truth" -- that is absolutely unsolvable. We will not be
able to censor CNN, the NYTimes, or even Fox News (in general). However,
there are many dark corners of the internet that are actually pretty
influential, and that could be toned down with this approach.

~~~
dredmorbius
Not just mass media.

Writing coincided with, and probably facilitated, the building of cities and
the rule of law (which previously couldn't be written down).

The arrival of printing in Europe (1436) saw the split of the Catholic church
into multiple denominations.

The continuing fall in printing and paper costs, and rising literacy,
eventually led to revolution and populist reform, first in the American
colonies, then in Europe, particularly the Chartist movement in England and
the Revolutions of 1848 throughout the Continent.

Mass communications, as you said, with Fascism in Italy and Germany.

[https://ello.co/dredmorbius/post/gqzszjwf4unuqfupzqff8g](https://ello.co/dredmorbius/post/gqzszjwf4unuqfupzqff8g)

~~~
kmtrowbr
Absolutely: technological changes in "information technology" bring about
social changes as well.

~~~
dredmorbius
Corollary: it's not a question of "how do we preserve the current system given
some change in media technology?" It's "how will our current system change
_given_ some change in media technology?"

------
tmat
ministry of truth'd

------
lujim
*some misinformation

------
nickjamespdx
Oh really? You think?

------
anonymous42380
Deleted.

~~~
rhizome
What should we do to you for posting it? Let's start there.

~~~
anonymous42380
Good point. I've deleted it.

~~~
rhizome
The question remains.

------
nurettin
How hard is it to write a bot that googles the facts and checks if they
contradict the story?

~~~
dredmorbius
See Google's Knowledge Graph.

The problem is challenging. Perfection is impossible (for deeply philosophical
reasons), but a reasonably tractable system strikes me as possible.

[https://www.google.com/intl/es419/insidesearch/features/sear...](https://www.google.com/intl/es419/insidesearch/features/search/knowledge.html)

