
Facebook to Let Users Rank Credibility of News - peterkshultz
https://www.nytimes.com/2018/01/19/technology/facebook-news-feed.html
======
themagician
This is the opposite of what they should have done. Shows just how much of an
echo-chamber Facebook has become. This is just adding another metric by which
garbage can be made legitimate and vice versa.

Facebook would be a much better platform if they removed the like and comment
count from every post. It’s those numbers that lead people to believe that
garbage is legitimate in the first place. By hiding these numbers it would
force people to think for themselves.

Same thing with YouTube. Fake news becomes real news when it has enough views,
likes, upvotes, comments, etc. These counts don’t really need to be shown to
the end user.

~~~
yAnonymous
If you really want to fight the root of the problem, it's not likes and
comments, but education.

Should we censor social media because people are too stupid to deal with it?
I don't think so, and if you do, you're playing into the hands of corporations
who want a cheap workforce.

Your "solution" isn't a solution, but a hotfix that will cause more damage in
the future. It's exactly the kind of shortsighted crap politicians are trying
to do right now to get immediate results.

~~~
y0ghur7_xxx
> If you really want to fight the root of the problem, it's not likes and
> comments, but education.

I agree, but it's impossible to educate everyone on any subject. I think in
this case, people who are experts in the matter should give a reputation
rating on the article. That would be really great. For example, I am no expert
in theoretical physics, and I may enjoy an article about it, but how do I know
it's not a bunch of bullshit I am reading? If I see that Stephen Hawking gave
it a 5-star reputable-source rating, I "know" I can believe it.

~~~
pizza234
> I agree, but it's impossible to educate everyone on any subject.

Critical thinking is universal, though. It isn't necessary to have a solid
judgment of every piece: "critical abstention" is a possibility, and a primarily
formal analysis is another (e.g. searching for common communication red flags).

I don't think the parent's idea of removing like and comment counts is realistic
(in any scenario I can imagine).

~~~
SmellyGeekBoy
> Critical thinking is universal

I'm assuming you don't spend a lot of time on Facebook?

~~~
yAnonymous
I'm assuming you do. Otherwise you would have understood the comment.

------
imgabe
This seems very lazy on Facebook's part. Rather than employing people who will
research and fact-check, they're punting the problem to the users. This whole
situation arose _because_ users are very bad at determining which sources are
credible. People will believe anything that confirms their biases, and
disbelieve anything that doesn't.

~~~
matt4077
I'm sure Facebook wouldn't care about a few million spent on a panel of the
world's greatest connoisseurs of real news.

This feature is in fact guaranteed to cost far more than a few experts ever
would.

But Facebook knows that any expert panel will without a sliver of doubt
quickly converge to a ranking that has the Economist and the New York Times in
top positions, and Breitbart somewhere behind a random word generator.

Any working statistical method will lead to the same result, obviously. But it
gives Facebook the option to invoke HN's favourite argument: "it can't be
political because it's the algorithm. See: here are numbers."[0].

[0]: Compare, for example, the libertarian love for Bitcoin, and how it's free
from the politics that undermine central banks.
~~~
peoplewindow
Any working statistical method will lead to the same result? What do you mean
by that?

I read articles in the New York Times, the Economist and also Breitbart (UK
edition) fairly regularly. Whilst they cover very different stories, as you'd
expect given their political biases, I have not noticed any major difference
in accuracy when dealing with objective facts. This is partly because
"mainstream" media is quite unreliable, rather than any awesome quality of
reliability inherent to Breitbart, but the idea that Breitbart is unreliable
seems to me to be coming from people who simply dislike conservative
worldviews ... and desperately want to stop people from reading them. It's the
same reason they try to smear anyone who goes looking as Nazis, bigots, etc.
They fear that if someone reads things from the "other side" they might find
it's not so unreasonable after all ...

------
goalieca
> “There’s too much sensationalism, misinformation and polarization in the
> world today,” Mark Zuckerberg, Facebook’s chief executive, wrote in a post
> on Friday. “We decided that having the community determine which sources are
> broadly trusted would be most objective.”

Is this truly ignorance or is it malice?

~~~
excalibur
It's arrogance. Zuck wants to be in control of which sources are seen as
trustworthy and which are not, but he doesn't want to actually be on the hook
for making these decisions. So he implements a faux-democratic facade, and
gets to make the actual calls behind the scenes by tweaking algorithms. He
wants to have his cake and eat it too.

~~~
threeseed
Let's be clear here. Facebook is a private website.

They legally, morally and ethically have the right to determine what content
their users see. Especially given that we have demonstrated evidence of fake
news being disseminated.

It was never claimed that this would be a democratic process, nor should one
have been expected.

~~~
te_chris
They are, for now, as most monopoly industries were before their negative
effects caught the attention of govts and they suddenly found themselves
regulated.

------
emodendroket
This is a frankly awful idea. I do not want Facebook acting as some sort of
arbiter of what information is truthful. The cure is worse than the disease.

~~~
0x7f800000
This is worse than that. Imagine your Trump-supporting extended family voting
down CNN news stories as "not credible."

Expect mass brigading by The_Donald.

~~~
ppbutt
You think CNN is 100% honest and credible? The same could be said of liberals
or any arbitrary group of people.

~~~
MBCook
I don’t think that was the point.

The point is that even credible sources are going to be voted down as
“ideologically biased” no matter how neutral the story is.

Headline says candidate Johnson lost by 15%? Better vote it down. After all,
you hate candidate Johnson. Doesn’t matter if he actually lost.

~~~
emodendroket
That's true, but that seems like the easiest issue to deal with, considering
how much information you have about users' political preferences if you're
Facebook.

------
oliwarner
Holy hell, Facebook! For a cash-rich company employing some of the cleverest
people on the planet, you're managing to be incredibly stupid here. People
who aren't qualified to rate sources should not be asked their opinion. If you
want peer review, do peer review... But ask-the-audience isn't that.

That all said, there is a redeeming possibility here. You could _also_ do an
in-house monitored, _qualified_ peer review of select articles from popular
sources and use those results to audit users. If somebody keeps hating on WaPo
and pushing Fox you could make assumptions about the quality of their
judgement.

~~~
thedoops
What if Facebook is weighting the votes based on content consumption and
interests of users? If that's the case, then it's possible to get some
interesting results. We need to be doing experiments like this.

~~~
oliwarner
Just remember that divisive interests —where "fake news" lives— cut two ways.
Inferring too much from consumption might confuse "can't stop watching this
trash-fire of a government" and "MAGA". Or "aren't these right-wingers a bunch
of nutbags" and "lock her up". There's significant overlap in consumption.

Worthy of experimentation, but not to decide what's shown on my wall.

You probably _can_ get there in the end, but it's probably easier to pay
somebody to objectively rank the political position and journalistic
integrity of major publications.

------
philipodonnell
> “There’s too much sensationalism, misinformation and polarization in the
> world today [ed: because we rely on users to decide what can be trusted],”
> Mark Zuckerberg, Facebook’s chief executive, wrote in a post on Friday. “We
> decided that having the community determine which sources are broadly
> trusted would be most objective.”

It's a bit tone-deaf, Mark. Remember the "we had no effect on the election"
line that everyone laughed at? This year it's "user curation is the answer to
fake news". Everyone is laughing at you again.

If you want to determine credibility, hire people to judge credibility. Hire a
mix of viewpoints. It's not that hard; it just costs money, which you have more
than enough of. Maybe give some of that money back to the users who turn over
their online lives to you, and actually improve their lives, instead of just
monetizing them by feeding them garbage that gets them to click and consume
while you go around complaining that users aren't discriminating enough with
those clicks and that consumption.

~~~
paulddraper
Zuckerberg said no such thing. He said _fake news articles_ on Facebook did
not affect the election.

He also said that people were using some sleazy disinformation on Facebook as a
convenient excuse.

> "I think there is a certain profound lack of empathy in asserting that the
> only reason someone could have voted the way they did is because they saw
> fake news,” Zuckerberg said. “If you believe that, then I don’t think you
> internalized the message that Trump voters are trying to send in this
> election."

------
rdtsc
> The same criticism has also engulfed other social media companies such as
> Twitter, which on Friday said it was emailing notifications to 677,775
> people in the United States that they had interacted with Russian propaganda
> accounts around the time of the 2016 election.

Wonder if they included the part where they (Twitter) went to RT, which is
pretty much Moscow's propaganda arm, and offered to sell them $200k worth of
ads geared specifically toward the US election crowd.

------
tantalor
You have been able to do this on Google News for a long time:

 _See more stories from preferred sources, and block news sources you don’t
like._

 _- Under "Preferred," list publications you want to see more news from._

 _- Under "Blocked," list publications you don’t want to see any news from._

[https://support.google.com/news/answer/1146405](https://support.google.com/news/answer/1146405)

------
marenkay
Why not remove everything except social interaction from FB instead of
doing this?

IMHO anything on FB beyond connecting with people is worthless anyway and
mostly driven by bots and not people these days.

~~~
darkstar999
Because they'd be giving up a big chunk of traffic and "engagement".

[http://fortune.com/2017/09/08/facebook-twitter-snap-news/](http://fortune.com/2017/09/08/facebook-twitter-snap-news/)

~~~
marenkay
^ Using social media as a trustworthy news source is troubling, IMHO. That
could only work if social media were under the same legal regulation as the
press.

But user ranking is basically FB saying "no, no, we don't do news, that's not
our business. Sign the waiver here."

------
1024core
This is a genius move by FB.

Now it becomes a battle of which side can marshal more FB users to flag
stories of the other side as fake. Which, in turn, results in more views and
clicks for FB.

~~~
FilterSweep
This also means malicious actors are more incentivized to keep flagging posts
while other users move on.

------
alexgandy
Bad or good, this is an admission of failure. Facebook just flat out said
"screw it, you all decide what's real...because we sure as shit can't."

~~~
ikeyany
They sure as shit can, but they sure as shit don't want to. They could have
tackled this 10 years ago but instead they optimized for likes, clicks,
traffic, and revenue.

------
elcapitan
This of course just shifts the problem to Facebook then ranking the
credibility of users, on which they will base the ranking of the credibility
of news.

~~~
banderman
Or they already have a good idea about the credibility of the news sources and
this will let them determine which users are credible.

------
elorant
This is so stupid. For one, people tend to have a herd mentality. An article
seen the wrong way could trigger an immense backlash in which a medium's
credibility tanks in a matter of days. And then it would take years to reverse
the damage.

Furthermore, it opens the way for anyone with a botnet of infected machines to
rig the credibility of any news outlet to their liking.

~~~
matt4077
It would seem that Facebook is rather well-positioned to identify such
problems in the data.

They could, for example, only count votes by long-term users. They also have
almost complete knowledge of their users' political leanings, making it
possible to counteract any attempts at manipulation.
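[ed: a minimal sketch of the tenure-weighted vote counting described above.
Everything in it is hypothetical illustration (the one-year cutoff, the linear
weighting, the data model), not anything Facebook has actually described.]

```python
from dataclasses import dataclass

@dataclass
class Vote:
    account_age_days: int  # how long the voter's account has existed
    credible: bool         # did the voter mark the source as credible?

def weighted_credibility(votes, min_age_days=365):
    """Ignore young accounts entirely; weight the rest linearly by
    account age so long-term users count for more.

    Returns a score in [0, 1], or None if no vote qualifies."""
    total = 0.0
    credible = 0.0
    for v in votes:
        if v.account_age_days < min_age_days:
            continue  # likely throwaway or bot account: discard
        weight = v.account_age_days
        total += weight
        if v.credible:
            credible += weight
    return credible / total if total else None

votes = [
    Vote(3000, True),   # decade-old account: credible
    Vote(2000, True),
    Vote(10, False),    # brand-new accounts brigading "not credible"
    Vote(5, False),
    Vote(7, False),
]
print(weighted_credibility(votes))  # young accounts discarded -> 1.0
```

In this toy example the brigade of brand-new accounts is simply discarded, so
the long-term users' votes carry the whole score.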

~~~
pixl97
Eh, if FB doesn't release those algorithms to an independent 3rd party, I'd
have to assume that the manipulation is on FB's part.

------
carwyn
They would be better off auto-blending articles at the other end of users'
opinion spectrum into their news feeds (or comment threads) to try and burst
the echo chambers.

"Here are some articles at the polar opposite to your norms."

~~~
asgioiobuio
I don't think that would work well at all. "The polar opposite to my norms" is
stuff like Alex Jones and Breitbart. I won't learn anything from exposing
myself to such idiocy. Dishonest clickbait is worthless, even if it comes from
a diversity of viewpoints.

No algorithm is (yet) smart enough to judge the quality of journalism.
Facebook shouldn't try.

~~~
darkstar999
This is exactly why we need to be broken from echo chambers. If you think the
polar opposite of whatever you believe is conspiracy and idiocy, you are in an
echo chamber. There is rational journalism for whatever is the opposite of
your beliefs.

------
dvt
Slightly off-topic: are there any social networks out there _not_ polluted by
news -- fake or otherwise? Is there room for a new social network that's for,
you know, people?

Snapchat's a tabloid, Facebook is basically a news RSS feed, and I'm even
starting to get news "stories" when browsing hashtags on Instagram. I just
want to discover _people_ and talk to _people_. Apart from dating apps
(Tinder/Bumble/etc.), it seems that all social networks have become news
networks.

~~~
emodendroket
That might be because "the news" isn't some abstract thing that has no effect
on people's lives, so _people_ want to talk to _people_ about the news and
politics.

~~~
dvt
Not saying you're wrong, but I don't know, I was perfectly happy chatting with
friends on AIM messenger a decade ago. Myspace was also super spammy, but
never became a (literal) news feed. I don't mind stupid pro/anti-Trump memes
or whatnot, but giving news distribution companies (from Breitbart to the
Economist) a platform on your social network seems like an anti-pattern.

~~~
emodendroket
I'd much rather see a news article someone thought was trenchant or
interesting than a meme image that serves no purpose but to try and rile
people up. Which one is a better basis for conversation?

~~~
dvt
Let me rephrase: I think it's perfectly fine to _share_ news articles and talk
about them, etc. But the distribution of news has become fundamental to FB and
Snapchat (and to a lesser extent Twitter and IG) -- see FB's trending stories,
or Snapchat's "Discover" tab. It's not at all like "oh, Bob thought this Fox
News article was interesting" -- although to be honest, I would actively
discourage that, too.

~~~
emodendroket
I get a lot of news in my feed, but I willingly subscribed to multiple news
sources and regularly comment on articles, so I can't comment on the idea of
news being foisted upon me.

------
severian1778
Ban certain users from voting on certain topics or even seeing them, and game
the system to always lean unfairly toward your political leanings. No
watchdogs, no transparency.

------
srcmap
Perfect for collecting more user attributes and using them for future
"targeted" advertising.

Future Republican candidates can put out FB ads targeting only Orange County
voters who dislike CNN's news article on Trump.

The Democratic Party can put out FB ads asking for campaign contributions only
from folks in SF who like CNN's news article on Trump.

~~~
threeseed
That's not how this works. It's more like an electoral poll. From the article:

"It declined to say how many people were polled or which news outlets they
were asked about."

------
willart4food
I can't wait till 4chan gets wind of this one!

------
alkonaut
What they should do isn’t show me news that _I’m_ likely to agree with, but
show me news that nearly no one disagrees with. It should be very easy to
notice what has large numbers of people both agreeing and disagreeing with it,
with people within the same social networks agreeing with each other. That
type of news could just be suppressed. If all that’s left is news with
kittens, then that’s a glorious success. I can get actual news elsewhere, and
Facebook can select a few publications of good quality and _replace_ anyone
that shares a Breitbart article with a corresponding one from the Guardian.

~~~
InternetUser
> news that nearly no one disagrees with

Well, in a country where 30 states voted for Trump, 20 states voted for
Hillary, and 100 million people chose not to vote (that's 42% of the
eligible electorate[0]), who shall be the arbiter of what news, at least U.S.
political news, "nearly no one disagrees with"? I mean, given that a majority
of politically active people in a large majority of states voted for Trump, I
expect that you'll agree with me that Alex Jones' endlessly pro-Trump daily
videos should be featured right at the top of everyone's News Feed. I know
_I'd_ love that.

[0] [http://archive.is/NDUkb](http://archive.is/NDUkb)

[https://www.youtube.com/user/TheAlexJonesChannel/videos](https://www.youtube.com/user/TheAlexJonesChannel/videos)

~~~
soundwave106
Not all news is political news.

I don't use my Facebook much except to occasionally share vacation photos
these days. Checking recently shows no news of any sort. Just some
silly memes, weather phenomena (supermoons!), pop-geek-culture stuff, and
daily status updates. Since it's the season, there's some American football
discussion, which people obviously have disagreements about, of course, but in
that "sport" sort of way. :) I'm sure major news (like major weather events)
will still make my feed, but there seems to be a noticeable decrease in
political stuff.

My cynical take on this announcement is that, for those who post and share a
lot of news, there is a high probability that their news bubble will be more
heavily reinforced. Not all Trump voters are Alex Jones heads, of course, but
if you are one, maybe Alex Jones will still be at the top of your news feed,
and you will get all of the Alex Jones-y shares you want. It just won't be on
the feed of the apolitical friend who merely shares silly cat videos, or of
the person who is sharing Rachel Maddow clips. Facebook, in other words, will
strive to bring you the best news feed that you will never disagree with, all
the better for dopamine-inducing clicks and likes.

(I don't really know, of course, if my cynicism is off the mark, but these
sort of opaque-algorithm news feed games Facebook plays are a large reason why
I rarely use my Facebook feed. :) )

~~~
alkonaut
I believe Facebook has a responsibility to not show Alex Jones videos even to
those that would agree with them.

~~~
soundwave106
Well, on the one hand, sure. I think Alex Jones is a conspiracy-theory
charlatan, a good representative of "fake news" as a matter of a fact (see:
Pizzagate). The fact that his site also pushes dubious supplements (like
"super advanced vitamin B-12" and "survival shield", which is a "proprietary
nascent iodine" formula) says it all to me.

On the other hand, no. I'm not comfortable with a social media network that
determines your news feed's "appropriateness" via an opaque, Brazil-esque (the
movie, not the country) bureaucratic process that says "yay" or "nay". And
while Alex Jones may be an easier business case here due to the "fake news"
elements, I'm not seeing much transparency about what their post blocking
process really is. Other than it seems real messy and haphazard, and a lot of
people complain about it for various reasons, whether you are a black person
posting police brutality videos, or are activists documenting the Rohingya
genocide, or have conservative opinions, or are Palestinian, or post breasts
no matter the purpose, or post iconic Vietnam war photos, or various other
reasons that come up in articles.

If Facebook made their guidelines clear, and the process was transparent, I'd
be more comfortable. When the Wikipedia community decides a source is no
longer reliable, you can at least read the debate on the reliable source
noticeboard. They have guidelines
([https://en.wikipedia.org/wiki/Wikipedia:Verifiability#Reliab...](https://en.wikipedia.org/wiki/Wikipedia:Verifiability#Reliable_sources))
and an ID guide
([https://en.wikipedia.org/wiki/Wikipedia:Identifying_reliable...](https://en.wikipedia.org/wiki/Wikipedia:Identifying_reliable_sources)).
It's possible to disagree with the decision still, but at least you have
better context.

~~~
alkonaut
> On the other hand, no. I'm not comfortable with a social media network that
> determines your news feed's "appropriateness" via an opaque, Brazil-esque
> (the movie, not the country) bureaucratic process that says "yay" or "nay"

Completely agree. But I'm actually less comfortable with the status quo of
state-controlled botnets spreading a goo of fake news, and people living in
the fake media bubbles it creates. Wikipedia is definitely a role model here.
I'm unsure whether Twitter and Facebook could achieve the same level of
control; if they could, it would be great.

------
romanovcode
This is a horrible idea in general, but very good for Facebook.

Facebook users will be in an even greater echo chamber reassuring their
worldviews, therefore they will be more engaged with the website, and that is
all FB cares about.

~~~
InternetUser
Here's my theory, and I think it's quite obvious: Zuckerberg, as someone with
a reported $74B in net worth[0], is actually a Republican, strongly supports
Trump, and is pushing for his reelection. All the apologies and damage control
are just virtue-signalling to appease the entertainment industry (in which I
include the news industry), which relies on and trusts Facebook for its very
existence.

[0] [https://www.forbes.com/profile/mark-zuckerberg/](https://www.forbes.com/profile/mark-zuckerberg/)

------
jsjohnst
I just wish they would implement a "Snopes.com filter" which marked things as
proven false. I would say something like 1 in 5 posts some of my relatives
make is easily disproved by Snopes.

~~~
booleandilemma
I liked snopes.com more when they were a site that debunked urban legends and
not a place people went to for political info.

------
HughG
Will Facebook allow users to rate the credulity of other users? That might
make this feature more useful ...

------
wyck
I enjoy seeing FB sabotage their brand into the ground, not because of its
service, but rather because of its cesspool leadership.

My feed is basically a bullshit mill of news instead of actual "friends"
experiences. Even setting aside the bullshit, a good % of the experience stuff
is basically just selfie show-offs. It's almost become a mirror of the self,
with very little socially valuable interaction.

oh you're on a beach ...like

oh you want to signal your virtue ..share

Facebook is starting to become the anti-social platform in the sense that it
has an overall negative social effect.

------
xster
And Rome lets the mob run the Comitia Tributa via popularity contests,
including dismissing opposition Tribunes.

I'd argue Rome died not in 476 but in 121 BC, when the Senate killed Gaius
Gracchus and didn't realize they needed to reform and create an actual
political process that doesn't lead to smooth talkers and thus to tyranny.

I believe Plato's and the Founding Fathers' preferred solution here would be
for each individual to find the smartest willing person they know and let them
offer their analysis for consideration.

------
davesque
So they're now going to rank news stories based on the reported (i.e.
_subjective_, not objective as they claim) familiarity and credibility of the
source. It's hard to know how this would help without more details. Does
anyone have any more information on precisely how this heuristic is defined? I
have my doubts that this could work at all. It seems the whole lesson of
recent events is that the average person is hardly qualified to judge the
credibility of anything.

------
ryandrake
When you let users contribute to any kind of measurement, it turns into a
measure of popularity. It's now a game of "get enough users to do X". By
relying on end-users to rate and rank things, we're letting popularity stand
in for all kinds of quality measurements. Lots of 5 stars on Amazon = must be
a good product. Lots of Yelp reviews = must be a good restaurant. Lots of
upvotes = must be a good comment. Talk about doing the same thing over and
over and expecting different results!

------
skate22
A fraction of revenue from popular pages should fund a manual fact checking
process that either temporarily grants or denies that page a trusted status.

~~~
matt4077
People don't care about "fact checking". The NYT's or the Wall Street
Journal's newsrooms simply don't make factual errors, except occasionally
spelling someone's name wrong. Nobody cares.

~~~
skate22
Your counter example uses a trusted source of info. The people who actively
read those sources likely DO trust them. Facebook lets anyone post anything
(to the extent that fake news is a huge problem), and reaches a much larger
audience.

------
transitionnel
I think Facebook's main benefit from this is that they will now have users
clicking a button essentially to say, "yes, I believe this." Now they have a
direct metric for what type of manipulation works for that user. Because
really, no user can actually verify the truth of a media outlet's claims
without heroic efforts.

------
nazz
What will be done in response to a bot storm?

------
ryan-allen
I would expect people will downvote things they don't agree with; I'm not sure
I see this working.

~~~
emodendroket
Well, easy, only "mainstream" sources end up being considered trustworthy, and
any alternative news source that challenges the world view of NPR, CNN, Fox
News, MSNBC, et al (sources that are frankly more alike than different) just
gets shut out. Look how quickly places like the Washington Post ate up that
"PropOrNot" stuff.

------
alexc05
So they're going to ask general accounts to rank things for reliability in the
era of Russian "bot armies" able to up-rank en masse?

Suddenly they're going to be able to buy credibility the way they buy likes?

I'm not sure I'm a fan of this. I hope they've gamed that out.

------
cocktailpeanuts
Before I go in, I must say this comment is not about politics, and I am not a
Donald Trump supporter (I can't believe I always have to prefix this before
saying anything sane nowadays, otherwise I get stoned to death online).

That said, here's a story. One day during the presidential campaign I saw a
person who normally really despised Donald Trump say, "Hey, I hate him, but I
think he kinda made sense in his speech today." Remember, she watched the
speech because she hated him and was looking for every chance to make fun of
him, but instead she came away rethinking things, for just that speech.

Then the next day she watched mainstream TV news anchors go through the speech
line by line, criticizing it using Trump's past history and so on. Then she
said, "I knew it, he's evil; he almost made me listen to him, so devious."

I normally never watch TV, so observing someone react in completely opposite
ways based on just a 30-minute TV show was when I realized how much mainstream
TV is shaping people's minds, and that propaganda in 2018 is stronger than
ever. Most people won't want to believe that they are under the heavy
influence of political propaganda, because their identity is connected to it,
but that is the truth.

Just to be clear, this is not even about politics. This phenomenon is
increasingly common across every aspect of society. For example, if you look
at what's happening in the cryptocurrency ecosystem, it's full of propaganda
by people with more knowledge manipulating people with less. The only way to
overcome this is to:

1\. Try to be as emotionally detached from the events as possible.

2\. Actually try to learn what is going on, instead of listening to what
everyone else is saying, because even your most trustworthy friend, family
member, or a very reputable Nobel Prize winner is under this influence unless
they followed this principle, which most people don't have time to do.

3\. If you ARE that reputable person, be careful what you tell your followers.
If you haven't gone through steps 1 and 2 and are just saying "This is who I
am, I'm just expressing myself, take it or leave it, just unfollow me if you
don't want to hear what I say", you are being extremely irresponsible. You are
basically taking your own gullibility and amplifying it to hundreds of
thousands or millions of other people, who are probably in a less fortunate
position than you are (which means they will suffer exponentially more from
this misinformation than you will).

Facebook doing this won't help against fake news; it will only accelerate what
they are already guilty of, because most people aren't even aware they are
misinformed, and furthermore don't want to believe they are wrong. So it will
only result in a larger and larger filter bubble which separates people even
more.

It's like religion: when was the last time you were able to convince a
religious person that God doesn't exist? Now imagine telling people on
Facebook to vote on whether God exists or not.

Personally I think there's a huge opportunity hiding in this madness somewhere
if you're an entrepreneur.

~~~
jackaroe78
Welcome to the machine!

------
WillReplyfFood
A good way to prevent false stories would be to reward users who take the time
to find evidence to the contrary with some gamified rank system:
"Mammoth Hunter of Lies".

------
FloNeu
This is really the best they can come up with... So much wow - no way someone
can manipulate that... Glad the world is safe again _tadatam_

------
FloNeu
Wow - that's such a high-tech solution... No way someone can manipulate
that...

------
WillReplyfFood
Automerging similar comments would at least reduce the illusion of a crowd.

------
ppbutt
"Facebook to let Users Rank Credibility of (pre-filtered) News"

------
drderidder
What could possibly go wrong. coughthatshowyougottrumpcough.

------
menckenjr
If enough people believe something it must be true, right?

------
diogenescynic
What a horrible idea.

------
ntrepid8
What could possibly go wrong?

------
dundercoder
This can only end badly.

------
menckenjr
If enough people believe it it must be true, right? Right?

------
excalibur
BuzzFeed. The most trusted name in news.

~~~
askvictor
Actually BuzzFeed has recently been doing some excellent investigative and
long-form journalism. Surprising turnaround; kind of like Vice.

~~~
mcintyre1994
The BBC article
([http://www.bbc.co.uk/news/technology-42755832](http://www.bbc.co.uk/news/technology-42755832))
made the point that this system would have stopped Buzzfeed from being able to
make that change (to the extent they'd still rely on FB traffic):

> For instance, Buzzfeed's initial beginnings as a viral site would have
> almost certainly hindered its growth into a serious news organisation had it
> been subject to the ideas about to be put in place by Mr Zuckerberg's team.

I'm not really sure what the answer to this is - but they are really good now,
and by many measures I can think of they would have been perma-black-holed
long before now.

