This is the opposite of what they should have done. Shows just how much of an echo-chamber Facebook has become. This is just adding another metric by which garbage can be made legitimate and vice versa.
Facebook would be a much better platform if they removed the like and comment count from every post. It’s those numbers that lead people to believe that garbage is legitimate in the first place. By hiding these numbers it would force people to think for themselves.
Same thing with YouTube. Fake news becomes real news once it has enough views, likes, upvotes, comments, etc. These counts don't really need to be shown to the end user.
If you really want to fight the root of the problem, it's not likes and comments, but education.
Should we censor social media because people are too stupid to deal with it? I don't think so, and if you do, you're playing into the hands of corporations who want a cheap workforce.
Your "solution" isn't a solution, but a hot fix that will cause more damage in the future. It's exactly the shortsighted crap politics are trying to do right now to get immediate results.
An experience at a conference a few months ago showed me that education won't fix echo chambers.
It was a conference organized by an organization that runs one of the major initiatives against false news, and one of the main tracks was about addressing the issue of false news. Last year they had a massive booth in the middle of the conference hall about false news. This year they brought in a keynote speaker who presented several political slides and drew misleading conclusions that were later proven false, but the whole thing resonated with the audience. It didn't seem to matter one bit that it happened to be false.
When facts get in the way of a political message, even the most educated seem to let the facts go if the message aligns with their own views. I wish education would work, but I don't see it as a proven method for solving echo chambers or political news.
The whole idea of "fake news" came out of US politics to begin with, so no surprise organisations set up specifically with that as a mantra are ideological in nature and uninterested in actual facts.
With your quotes, I believe you may be referring to the phrase "fake news", but fake news itself has been around as long as humans, and probably even longer. I'm glad people are discussing this, though.
No, "fake news" refers to a very specific phenomenon. It's not just any kind of misleading or incorrect reporting. Trump was the one to start saying "not fake news, YOU'RE fake news", but the phenomenon as initially encountered was about shoddy and small fraudulent websites, often trying to deceive their audience to believe that they were a different website, pushing the most anger-inducing news.
It's a far cry from the NYT or Fox News reporting incorrect things, whether deliberately or not, to someone called Foxx News or Non York Times reporting that Islam causes prostate cancer.
I don't think yellow journalism has the same properties as fake news. For one, yellow journalism doesn't try to deceive its readers into thinking it's the same source as another, more reputable one. With print media, that was harder to do.
Fake news takes a slightly different approach because it's much easier to reach an audience online and skirt trademarks.
Interesting that "Before it was news" is on the list of fake news sites, its a page aggregator like Reddit and hacker news, fake articles can be posted, but doesn't make the entire site fake.
Community news/comments can be wrong, but that's far, far different from a site intentionally promoting fake propaganda, which is the true meaning of a fake news site.
And even then, propaganda can blur into a political view, and even further into downright lies.
I think there's also a risk of conflating 'education' with our 'education system' that teaches citizens to be good productive workers but doesn't teach citizens how to analyze and debate about the nature of our society.
I believe the parent might be referring to 'education' as in logic, political history, and philosophy, so that the student can retrace the historical steps and recreate a societal scaffold that may or may not look like society today, in the same way civil engineers are taught trig, calculus, static mechanics, etc., so the engineer can retrace historical steps and engineer structures that may reinforce or criticize our engineering approaches today.
No fewer than 6 of the coveted "swing states" switched from blue to red[0], and many counties across the country flipped as well. What persuaded hundreds of thousands, if not millions, of 2012 Obama voters to be 2016 Trump voters? Trump himself educated them, in his speeches. I'm one such person; I spent the entire election season commenting and chatting with others on Reddit and YouTube, and there were countless Trump supporters who had been Obama supporters. Trump educated us more about the political system than any politician or journalist ever has. He taught people the phrase "lobbyists, donors, and special interests." That itself had a very powerful effect.
You're not the first person to worry about our mandatory education. It varies enormously depending on the resources of the surrounding area, and I am sure there are some schools that cover modern politics.
I have been in this country all of 10 years, and I knew about lobbyists, donors, and special interests even before coming over. The US lobbying system is famous across the world; even my 75-year-old parents know about it, and they live in a far-flung developing country. I am surprised that Trump had to educate people about this basic stuff.
> If you really want to fight the root of the problem, it's not likes and comments, but education.
But how do you educate yourself, though?
The same argument was already made about books and newspapers: depending on the subject, a lot of them are trash. Same for TV news. Wikipedia is also not a de facto trustworthy source.
Except for first-hand experience on a subject, you will never be able to find a source that is inherently trustworthy (and you might even interpret your experience the wrong way, so you should doubt yourself too...).
To educate yourself, there's a point where you have to try to trust some of the information you get. Having that info come from a trusted source, or with a high level of peer review, is a useful first level of filtering. From there you can (and surely should) invest more time to double-check or cross-check if you want, but IMO quick filtering is necessary and can't be brushed away by saying "people should know better".
>Except for first-hand experience on a subject, you will never be able to find a source that is inherently trustworthy
Even first-hand experience isn't trustworthy. Our memory is so deeply flawed that I can't even tell with certainty which first-hand experiences I had. Simply presenting "evidence" that you should remember something can cause your brain to make that memory up. And if the experience is genuine and I try to remember some detail I didn't pay close attention to, my brain will most likely just make it up. So the only reliable way to learn from personal experience is to learn during the experience, but confirmation bias will make me miss most opportunities to learn.
Honestly, it's amazing that we are capable of learning at all.
First of all, good education should come from schools.
Being able to further educate yourself and judge the trustworthiness of sources are direct results of a good primary education.
Many western states have criminally decreased education spending for decades, and the current state of civilization is in large part a result of that.
This shit is infectious, too. Many stupid people think their kids don't need a good education, even if there's a chance for them to get one.
I think this is the key point. Even if schools presented largely propaganda, spending an effective amount of time teaching useful critical thinking might give students enough experience/tools to think for themselves and not accept everything in the books.
I think the type of education they're talking about is not formal education like in school but how to educate yourself about today's events. Not something you could or should go to school for.
I'm not so sure, critical reading/thinking skills could be taught in school, just take newspaper articles and pick out the parts that are fake/misleading. Finding teachers able to do this may be easier said than done though, as would doing it in a non-partisan way.
Here are the "peers" that I have reviewing what I hear from politicians: the salary on my paycheck, the tax figures on that paycheck, and the prices at the gas station and supermarket. I don't need to pore over news editorials and blogs posts or watch commentators on any TV channel to know which political figure is making more sense and describing reality more accurately.
I agree that having "tangible" evidence of a fact makes it easier to disprove a claim. I would still be wary of this logic, as chains of events can be unintuitive as well.
For instance, some politicians make a bold move that has a clear and immediate positive effect. Yet if it has huge indirect and long-term consequences that only surface once they are out of office, from your perspective they will get all the credit and none of the blame.
Having different perspectives from "experts" is in my opinion important. The problem is of course to find the right experts, and to have insight into their limits and their biases, which is usually the beginning of becoming a kind of expert yourself.
I think history would be a good source. We think we're facing unique challenges, but the debates we're having today are eerily similar to ones Greek thinkers already discussed ad nauseam. And primary sources are so accessible now that it's difficult to land on untrustworthy extractions unless you read contemporary commentaries.
>If you really want to fight the root of the problem, it's not likes and comments, but education.
I believe this is a design issue (systems design, not UI design). Filter bubbles and confirmation bias can reinforce each other in a vicious circle. Financial incentives for "news outlets" to mislead people make it worse. That's the heart of Facebook's issue.
With a herculean effort we might be able to overcome that with education, but improving Facebook's design to lessen the impact of echo chambers and social confirmation would be much easier and faster. Removing all interaction is one way to do that, though I agree that it's a bit heavy-handed and goes against Facebook's incentives.
> If you really want to fight the root of the problem, it's not likes and comments, but education.
I agree, but it's impossible to educate everyone on any subject. I think in this case, people who are experts in the matter should give a reputation rating on the article. That would be really great. For example, I am no expert in theoretical physics, but I may enjoy an article about it; how do I know it's not a bunch of bullshit I am reading? If I see that Stephen Hawking gave it a 5 star reputable source rating, I "know" I can believe it.
But that still requires a baseline of knowledge about who is and isn't a "good" theoretical physicist. For a lot of people, for example, Dr. Oz is a "good" medical doctor and Gwyneth Paltrow is a trusted source on women's health, and with that as a baseline you haven't solved anything.
> I agree, but it's impossible to educate everyone on any subject.
Critical thinking is universal, though. You don't need to reach a solid judgment on every piece. "Critical abstention", withholding judgment entirely, is a possibility, and a primarily formal analysis is another (e.g. searching for common communication red flags).
I don't think that the parent idea of removing likes and counts is realistic (in any imaginary scenario I can think of).
>> If I see that Stephen Hawking gave it a 5 star reputable source rating, I "know" I can believe it.
People everywhere are going away from this model as "experts" are consistently bribed through various means.
There is no good short-term fix. People need to become more educated and adapt. If the argument is that they can't, well, maybe our brains have been defeated entirely by advertising and messaging. It wouldn't be the first time a complex problem was mostly solved.
One way to work against that is proper anti-bribery laws and their enforcement. Compared to other industrialized countries, the US allows many forms of corruption and disinformation that, once forbidden, would sustainably change the educational and political landscape (most notably bribes, officially "donations", to political parties).
See, I can make things up too. Furthermore the amount of education necessary to judge the trustworthiness of all potential subjects is not obtainable by an individual, meaning they have to hope the experts they choose to listen to aren't making it up.
"Should we censor social media, because people are too stupid to deal with it? I don't think so and if you do, you're playing in the hands of corporations who want a cheap work force."
Just as a sort of general, broad response to this sentiment: yes absolutely. I envy your optimism, but the belief amongst the Hacker News set that humans are universally capable of educating their way out from under deeply-rooted biological faults is misguided. You're not going to educate people out of conferring status on the most popular people/stories/posts/etc. You're competing against at least a few hundred thousand years of evolution (and many of these processes go back much, much further than that.) You won't win.
So, yes, absolutely we should be working on ways to acknowledge and route around human foibles, not pretending we can eliminate them.
It's sort of ironic that a proposal not to show "likes" comes in for strident criticism on HN. After all, this very site has vacillated a few times on whether or not to show the karma score for each post, and seems to have decided it's better for the community not to do so.
The censorship doesn't have to be confidential. It can be auditable and publicly recorded. But in that case in Ukraine, they did all the education stuff and let the fake news through, and that wasn't enough to stop the country from being ripped apart.
It won't work. I know a lot of educated folks who believe the fake news. Politics has for a long time lied by omission in the national press. Now they just blatantly lie. The stuff we see today is propaganda dressed up in identity politics. In addition, there are plenty of fake Facebook accounts and other avatars on comments sites that may also be Russian trolls pushing disinformation.
I think an interesting way Facebook could approach determining credibility would be by having users bet (with some made-up points instead of real money). Since this forces readers to speculate, it would encourage critically reading the articles. I think a betting system would push Facebook less in the direction of an echo chamber. The difficulty with applying a betting system is that it still doesn't tell you whether the information is true, so bets can't be evaluated.
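To make the mechanics concrete, here's a toy Python sketch of such a points pool. Everything in it is hypothetical, nothing like a real Facebook API, and resolve() assumes an external oracle that somehow learns the ground truth, which is exactly the missing piece:

    class User:
        def __init__(self, points=100):
            self.points = points

    class CredibilityPool:
        def __init__(self):
            self.bets = []  # (user, stake, believes_true)

        def place_bet(self, user, stake, believes_true):
            if not 0 < stake <= user.points:
                raise ValueError("stake must be positive and affordable")
            user.points -= stake
            self.bets.append((user, stake, believes_true))

        def resolve(self, article_was_true):
            # Winners get their stake back plus a share of the losers'
            # pool, proportional to how much they risked.
            winners = [(u, s) for u, s, b in self.bets if b == article_was_true]
            losers_pool = sum(s for _, s, b in self.bets if b != article_was_true)
            winner_stake = sum(s for _, s in winners)
            for user, stake in winners:
                user.points += stake
                if winner_stake:
                    user.points += losers_pool * stake / winner_stake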
I like the idea in principle, but betting, real money or not, relies on agreed-upon, objective results/outcomes. Therein lies the problem: who's going to be the arbiter of truth?
However, it seems possible to get further along than we are now by creating betting pools/markets where the participants do agree on a set of sources for information.
Actually, I've thought along these lines before, rather indirectly, by theorizing that the WSJ was probably the most non-partisan source of "news" because most of its clientele demanded it, due to their participation in the markets. But I realize that's highly speculative.
They will never do it because it’s those little numbers that give you your dopamine fix, and thus central to their revenue.
Social media is a game, and the score is whatever metric is shown. You want the most points for yourself, and you only want to participate in popular content in the hope it will earn you more points.
I am seriously considering a year without adblock, going out of my way to click as many ads as possible. Blocking is ineffective; it's time to fuck over the metrics.
Can you explain this? Maybe it's because of the filter bubble I live in, but I almost never see this on WhatsApp (then again, maybe I wouldn't know if I had).
Because Facebook walls are visible to everyone, any controversial topic usually gets reactions from both sides. WhatsApp groups are formed based on like-mindedness (I am excluding core family and friends). I call these groups echo-pipes, which silently form a larger echo chamber.
WhatsApp groups are not discoverable (or publicly listed) unless someone invites you or provides you with a specific invite link with which you can join the group.
They already stopped the first "fake news" label experiment because it simply caused people to click on it more. I think that's the core problem: people don't really want to know if something is legitimate or not, they want controversial stuff and if it's labeled as fake, they simply want to see it and decide for themselves anyway. And, I'm sure, when something goes against your beliefs, you will take that fake label as proof that people don't want to believe/the system is trying to brainwash you.
My Facebook feed is mostly personal updates about hundreds of people I have met, and it feels like the most democratic media I’ve ever seen. It improved a lot when I unfollowed the handful of people who post junk like sportsball.
Are you assuming the faceless masses are exposed to some unimaginable horror that you aren’t seeing? Or tell us more about the horrors you do see.
>feels like the most democratic media I’ve ever seen.
The Democrats thought the same thing too, before Trump won the election. Perhaps you are falling victim to a filter bubble and confirmation bias?
A democracy implies independent national sovereignty. When anyone (govt, corp, or rogue hackers) across the world can effectively manipulate your elections at scale, sovereign democracy is no longer a thing.
I'm having a hard time finding the link, but there was an article posted here on HN a while back (which the author deleted because of the underhanded nature of the endeavor) about a marketer whose ad campaign for getting a government initiative passed was failing even though they were using all the traditional methods.
Then they decided to spend ~$100 on buying fake reddit upvotes for their initiative and it received millions of views and ended up being highly successful.
It's interesting how a little money and technological finesse can have such far-reaching effects.
It's more like: letting users "legitimize" news is how you get bots screwing with everything. It may not be a problem for some, including myself, since my feed is similar to how you describe yours, but network-wide it will pose an issue.
Those counts are the rating by which people determine the legitimacy of content. Popular content must be correct by virtue of the fact that so many people can't be wrong. And when you only show people content that they already agree with, it further reinforces its legitimacy.
Remove the counts and people have no external validation for the legitimacy of a piece of content. They actually have to read it. What we have now is a system where someone sees a headline, "Proof Obama birth certificate fake!", with 100k likes and 10k comments, and they don't even read it, they just comment, "I knew it!"
When you pick up a newspaper you have to actually read it. You can’t just look at the headline and see how many people agree with it before you can form any kind of opinion.
What seems likely to me is that people share obviously fake news not because they see it has a lot of likes but because it confirms things they already believe.
What they see is often determined by those counts in the first place, and what people decide to read/comment/share is more often than not because a post has a lot of likes/comments. Then they share it to get likes for themselves.
If the only incentive to read/share/comment was a genuine understanding or willingness to share and participate the entire space would be different.
For almost all stories you can work out their popularity just by the number of comments. And you can estimate the comment count (give or take an order of magnitude) just by scrolling through the comments. The accurate number isn't important for most people.
None of this will prevent the dissemination of fake news. What will help is (a) not prioritising it and (b) detecting whether it's fake. Facebook just added solutions to do both.
People comment on stories with the most comments and usually reply to the first set of posts because they want their post to be seen, accelerating commenting in that thread.
If you want to count comments, fine. Most people won't. FB actually makes this hard to do.
Would be fascinating to even try it here on HN. If you hid all the integer counters, it would dramatically change the way people use the site.
I just went to a popular article by the New York Times on Facebook. Just by scrolling down through the pages of comments I could determine that it was likely to be a popular article.
Go ahead remove the number of comments. It won't solve the problem Facebook is trying to solve.
So nix free speech (and rely on unknown proprietary algorithms)? I don't think that's a good idea either. Much of the time there's more value in the comments than you could gain from the article, video, etc., as is the case here.
Nothing to do with limiting free speech. Quite the opposite.
Instead of showing people only content that they agree with, with counts that reinforce the legitimacy of that content, you would show people everything that is popular and let them determine for themselves what they think about it.
I’m not talking about removing the comments. I’m saying don’t show the count. If people want to read the comments they can.
Limiting free speech may not be your intention, but it's exactly what's going to happen when you remove transparency.
We have a similar situation in Germany right now. A private corporation designated by the government can delete Facebook posts as it sees fit under the guise of hate speech. Facebook had to agree, because they'll be hit with fines if they don't.
What should be controlled by law is now in the hands of a corporation, and they are quietly removing content that is clearly not illegal.
Removing transparency is a dream come true for corrupt institutions.
This seems very lazy on Facebook's part. Rather than employing people who will research and fact-check, they're punting the problem to the users. This whole situation arose because users are very bad at determining which sources are credible. People will believe anything that confirms their biases, and disbelieve anything that doesn't.
I think this is spot on, but it misses the core reasoning: this approach permits them to look like they're doing something, without actually changing the status quo. In other words, this lets them have their cake and eat it too, because users (and news orgs) can't blame Facebook for the rankings - they're "from the users"!
At least, that's how this news reads from my admittedly-cynical perspective.
How would that work with people who think the New York Times is fake news to begin with? They're not going to suddenly change their minds because fb says it's real. It'll simply reinforce their views, or they'll say fb is biased or paid off or whatever. Even if fb had a perfect system it wouldn't solve the problem. People didn't use logic to arrive at the idea that the NYT is fake and Breitbart is real, so you can't use logic to get them out of it.
It's a no win situation at this point. I'm not sure there's any solution short of removing news from the platform.
I'm sure Facebook wouldn't care about a few million spent on a panel of the world's greatest connoisseurs of real news.
This feature is in fact guaranteed to cost far more than a few experts ever would.
But Facebook knows that any expert panel will without a sliver of doubt quickly converge to a ranking that has the Economist and the New York Times in top positions, and Breitbart somewhere behind a random word generator.
Any working statistical method will lead to the same result, obviously. But it gives Facebook the option to invoke HN's favourite argument: "it can't be political because it's the algorithm. See: here are numbers."[0].
[0]: Compare, for example, the libertarian love for Bitcoin, and how it's supposedly free from the politics that undermine central banks.
Any working statistical method will lead to the same result? What do you mean by that?
I read articles in the New York Times, the Economist, and also Breitbart (UK edition) fairly regularly. Whilst they cover very different stories, as you'd expect given their political biases, I have not noticed any major difference in accuracy when dealing with objective facts. This is partly because "mainstream" media is quite unreliable, rather than any awesome quality of reliability inherent to Breitbart, but the idea that it's unreliable seems to me to be coming from people who simply dislike conservative worldviews ... and desperately want to stop people from reading them. Same reason they try and smear anyone who goes looking as nazis, bigots, etc. They fear that if someone reads things from the "other side" they might find it's not so unreasonable after all ...
It's also a huge amount of hubris. Either you punt this problem to the users, in which case you get massive, incurable vote brigading, or you pretend to open it up "democratically" but in reality use heavy-handed algorithms to curb all outliers in whatever direction the algorithm's bias deems fit.
> “There’s too much sensationalism, misinformation and polarization in the world today,” Mark Zuckerberg, Facebook’s chief executive, wrote in a post on Friday. “We decided that having the community determine which sources are broadly trusted would be most objective.”
It's arrogance. Zuck wants to be in control of which sources are seen as trustworthy and which are not, but he doesn't want to actually be on the hook for making these decisions. So he implements a faux-democratic facade, and gets to make the actual calls behind the scenes by tweaking algorithms. He wants to have his cake and eat it too.
Let's be clear here. Facebook is a private website.
They legally, morally and ethically have the right to determine what content their users see. Especially given that we have demonstrated evidence of fake news being disseminated.
The idea that this was supposed to be a democratic process was never claimed nor should it have been expected.
They are, for now, as most monopoly industries were before their negative effects caught the attention of govts and they suddenly found themselves regulated.
This is a frankly awful idea. I do not want Facebook acting as some sort of arbiter of what information is truthful. The cure is worse than the disease.
The problem isn't that simple. Most mainstream news outlets, including CNN, have traded objective reporting for ideological dissemination. You can very often legitimately call CNN "not credible". Just in the last couple of years they have:
1. Told their viewers that it was illegal to read the DNC emails leaked to Wikileaks themselves, and to just listen to what CNN has to say about them.
2. Got caught posing one of their cameramen as a protester/rioter for an interview about why he was "in the streets".
3. Deceptively edited a video of Trump dumping fish food into a koi pond to omit the fact that he was following the Japanese PM's lead, then spent a whole day talking about how disrespectful it was.
4. Spent a whole morning talking about how Trump drinks diet coke, while there was an active terrorist attack in NY.
That's just what I pulled off the top of my head, too.
Edit: to the user that downvoted my comment: which of the four bullet points was false? It's easy to protect the liars who benefit our goals. It takes intellectual honesty to criticize one's allies.
to the user that downvoted my comment: which of the four bullet points was false?
Didn't downvote (and I don't have CNN), but your complete lack of links to back up any of the things you mentioned makes it very hard to judge if they're actually true or false.
I suppose that's possible, but I expect that, like Google's attempts to address the same issue[0], the biggest loser in this change is going to be independent news sources.
From the article, it wasn't independent sources that were punished, but far-left and far-right sources.
> However, in addition to AlterNet and other left-wing news sites, Rosenfeld said right-wing, alternative media outlets have also seen a reduction in viewership due to the change in Google’s algorithm.
So? Independent news sources almost always have a non-mainstream political bent; that's why the writers aren't just writing in the New York Times. I don't think most people understand "weeding out fake news" to mean "only allowing a narrow spectrum of respectable opinion." And these sources do real reporting and break real news.
That's true but that seems like the easiest issue to deal with, considering how much information you have about user's political preferences if you're Facebook.
This matches similar research I saw in the last few days on some other political topic (can't remember which) where some of the most important and active tweeters were Russian, on both sides.
Russia wasn't trying to get Trump elected (though they HATED Clinton). They were trying to get America to fight with itself and sow division. They did a really good job.
Certainly people on both sides were accused of that but the idea that Black Lives Matter was actually just a Russian plot is frankly pretty repellent, and that paper in the end just defers to Twitter's ideas of who's a Russian troll, and how did they decide? We've seen a lot of really shoddy work in this department.
I find that incredibly unlikely. I used to work on bot fighting at Google. Claims of Russian bots on Twitter, and especially claims by academics, are invariably riddled with methodological errors that render the "research" useless. I wrote about this problem here:
These studies are seeing patterns in noise. That's why the narrative changed over time - it used to be "Russian twitter bots got Trump elected!" and now it's "they're supporting both sides .... to sow division!".
The former claim was extremely implausible but at least it had some sort of inner logic to it, in that Trump was more friendly to Russia than Clinton. The new spin doesn't even make logical sense. However it makes perfect sense if you're just picking random Twitter users and trying to explain their behaviour through the prism of some convoluted conspiracy.
Sure it does. The appearance of a wide division in our political landscape leads to a wide division in our political landscape. People tend to "pick sides" in political fights and when something gives the appearance of a serious political dispute, people are motivated to pick sides and entrench their opinions. This is in keeping with getting Trump elected, as a large part of his support was due to the fear created by the appearance of growing civil unrest. None of this is particularly hard to understand.
The logic makes no sense because it specifically fingers Russia, yet any adversary of the United States could conceivably benefit from "sowing division". Once you've left behind the specific support for Trump the underlying logic linking it to Russia collapses but the conclusion has been kept, which indicates motivated reasoning.
There's a deeper issue here too. It paints the posting of political talking points from both sides of a US election as generic "sowing division". That is a world view that is quite totalitarian. I could describe it equally as "invigorating the democratic process by increasing interest in the election" and be no less accurate.
a large part of his support was due to the fear created by the appearance of growing civil unrest
You haven't shown that, you haven't even laid the groundwork for that. It's actually the first time I've ever seen such a claim. Most analyses of why he won point to the weakness of Clinton, his policies in immigration and trade deals, his opposition to political correctness and so on. Not "the appearance of growing civil unrest".
This link goes into detail about the perspectives of some whites on the BLM movement and how Trump played on those fears and represented a solution to growing minority influence (presumably at the expense of whites) in the wider culture. This also needs to be understood in the context of disruptive BLM protests as well as the white nationalist protests in Charlottesville.
>yet any adversary of the United States could conceivably benefit from "sowing division"
The point is only reasonable when divorced from all context of Trump's Russian contacts, Russia's interest in getting sanctions overturned, Russia's known past disinformation campaigns, DNC hacking, wikileaks, etc. Within the proper context it is entirely unreasonable to even posit some other actor working to influence the election.
But the claims are no longer that Russia manipulated social media to get votes for Trump specifically. Therefore any Trump/Russia link is irrelevant to the current narrative, which is that "someone" is using "bots" to generically "sow division".
None of these claims have any actual evidence backing them, and when claims can be sourced the sources collapse under examination; see my article above.
But we shouldn't need to deeply analyse academic papers to see that this narrative is nonsense. It is just not internally consistent. Putin had clear reasons to want Trump to win, that I accept. So why would his staff post things supporting Clinton? There is no logic to this idea whatsoever, and it's not where the narrative started.
I mostly agree with your perspective, but Trump repeated "law and order" like a kind of mantra at the debates and I think a fear of lawlessness is a real motivator for his base.
Maybe it makes sense, but Vladimir Putin has superpowers if everything people blame on him is true. And Mr. Hearn's article nicely illustrates how shaky the foundation is for a lot of claims of Russian influence.
This is one of the most absurd talking points of this whole thing. It doesn't take "superpowers" to zero in on existing rifts in a society and magnify them. It has been well documented that Putin has a well-oiled disinformation machine that is adept at exploiting existing social issues.
Please read my article. The sources you believe are "well documenting" these things are junk - see the paper that generated stories about 100,000+ Russian bots and how its analysis simply doesn't support their conclusions at all. Their definition of "bot" was equivalent to any human who happened to use Twitter slightly more than average, so not surprisingly they identified huge numbers of "bots".
I've looked into these stories about Russia repeatedly. They never survive basic fact checking, and I'm not a journalist with special sources or access.
Who doesn't? The US has those too and they don't just operate internationally.
But more to the point, there is a moral panic going on where the bar for what information is considered credible is drastically lowered if it involves "Russian collusion", and it seems to mostly serve domestic political purposes. How else can you explain a raving crank like Louise Mensch appearing in mainstream news sources?
Aren't they being forced into this role? Facebook is routinely cited as a vector for supposedly democracy wrecking Russian propaganda. There has been a stream of threats and deadlines imposed by European governments demanding that Facebook et al "stop" "hate speech."
Forgive me but if the Powers That Be were breathing down my neck 24/7 I think I might start to believe I'm expected to "do something."
Do you think the Trump administration's regulatory apparatus is going to go after them for that? That seems unlikely. Certainly some people have called for them to get involved but I'd say those people are also making a serious misjudgment.
Well, sort of. The votes give them raw material but interpreting the results and then demoting content is Facebook's job (and almost certainly nobody will be able to see how they do it). And getting demoted by Facebook, Twitter, and Google (all of which are involved in similar initiatives that mostly serve to demote independent news sources) makes a huge difference in our increasingly monopolistic media landscape where those platforms are how many people get their news.
“I consider it completely unimportant who in the party will vote, or how; but what is extraordinarily important is this — who will count the votes, and how.” - Stalin (though the sentiment was not original to him. Variations on the same theme have been recorded as far back as 1880, at least).
Holy hell, Facebook! For a cash-rich company employing some of the cleverest people on the planet, you're managing to be incredibly stupid here. People who aren't qualified to rate sources should not be asked their opinion. If you want peer review, do peer review... but ask-the-audience isn't that.
That all said, there is a redeeming possibility here. You could also do an in-house monitored, qualified peer review of select articles from popular sources and use those results to audit users. If somebody keeps hating on WaPo and pushing Fox you could make assumptions about the quality of their judgement.
What if Facebook is weighting the votes based on content consumption and interests of users? If that's the case, then it's possible to get some interesting results. We need to be doing experiments like this.
Just remember that divisive interests —where "fake news" lives— cut two ways. Inferring too much from consumption might confuse "can't stop watching this trash-fire of a government" and "MAGA". Or "aren't these right-wingers a bunch of nutbags" and "lock her up". There's significant overlap in consumption.
Worthy of experimentation, but not to decide what's shown on my wall.
You probably can get there in the end, but it's probably easier to pay somebody to objectively rank the political position and journalistic integrity of major publications.
> “There’s too much sensationalism, misinformation and polarization in the world today [ed: because we rely on users to decide what can be trusted],” Mark Zuckerberg, Facebook’s chief executive, wrote in a post on Friday. “We decided that having the community determine which sources are broadly trusted would be most objective.”
It's a bit tone-deaf, Mark. Remember the "we had no effect on the election" line that everyone laughed at? This year it's "user curation is the answer to fake news". Everyone is laughing at you again.
If you want to determine credibility, hire people to judge credibility. Hire a mix of viewpoints. It's not that hard; it just costs money, of which you have more than enough. Maybe give some of that money back to the users who turn over their online lives to you and actually improve their lives, instead of monetizing them by feeding them garbage that gets them to click and consume, while at the same time complaining that users aren't discriminating enough with those clicks and consumption.
Zuckerberg said no such thing. He said fake news articles on Facebook did not affect the election.
Also, that people were using some sleazy disinformation on Facebook as a convenient excuse.
> "I think there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw fake news,” Zuckerberg said. “If you believe that, then I don’t think you internalized the message that Trump voters are trying to send in this election."
> The same criticism has also engulfed other social media companies such as Twitter, which on Friday said it was emailing notifications to 677,775 people in the United States that they had interacted with Russian propaganda accounts around the time of the 2016 election.
Wonder if they included the part where they (Twitter) went to RT, which is pretty much Moscow's propaganda arm, and offered to sell them $200k worth of ads geared specifically at the US elections crowd.
^ Using social media as a trustworthy news source is troubling IMHO. That could only work if social media were under the same legal regulation as the press.
But user ranking is basically FB saying "no, no. We don't do news, that's not our business. Sign waiver here"
I would very much say: any form of social media and news should not be mixed. Social media is basically attention whoring for everyone these days; truth or value does not matter much in that regard.
Now it becomes a battle of which side can marshal more FB users to flag the other side's stories as fake. Which, in turn, results in more views and clicks for FB.
They sure as shit can, but they sure as shit don't want to. They could have tackled this 10 years ago but instead they optimized for likes, clicks, traffic, and revenue.
This is so stupid. For one, people tend to have a herd mentality. An article seen the wrong way could trigger an immense backfire where a medium's credibility tanks in a matter of days, and then it takes years to reverse the damage.
Furthermore, it opens the way for anyone with a botnet of infected machines to rig the credibility of any news outlet to his liking.
It's only stupid if you believe Facebook wants to solve their echo chamber problem.
Facebook draws its revenue from personality profiling its users for targeted advertising. How much more profit do they make when they can actively push people towards binary thinking, and make them easy to categorize?
It would seem that Facebook is rather well-positioned to identify such problems in the data.
They could, for example, only count votes by long-term users. They also have almost complete knowledge of their users' political leanings, making it possible to counteract any attempts at manipulation.
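For illustration, a rough Python sketch of what that could look like. The vote fields (account age, inferred leaning) are my assumptions about data Facebook plausibly has, not anything they've published:

    from collections import defaultdict

    MIN_ACCOUNT_AGE_DAYS = 365  # young accounts are cheap to mass-create

    def weighted_trust_score(votes):
        # votes: iterable of (account_age_days, leaning, trusts_source)
        by_leaning = defaultdict(lambda: [0, 0])  # leaning -> [trusts, total]
        for age_days, leaning, trusts in votes:
            if age_days < MIN_ACCOUNT_AGE_DAYS:
                continue  # ignore votes from recently created accounts
            by_leaning[leaning][0] += int(trusts)
            by_leaning[leaning][1] += 1
        # Average the per-leaning trust ratios so each political camp
        # counts equally, blunting coordinated one-sided brigading.
        ratios = [t / n for t, n in by_leaning.values() if n]
        return sum(ratios) / len(ratios) if ratios else 0.0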
They would be better off auto-blending articles at the other end of users' opinion spectrum into their news feeds (or comment threads) to try and burst the echo chambers.
"Here are some articles at the polar opposite to your norms."
You could bust the echo chambers by not tailoring content and showing popular content to everyone without telling them why it’s popular. If you saw two popular pieces of content next to each other, one from Fox News and one from CNN but with no indicators (likes, comments, etc.) it would force you to actually read the content and come to your own conclusion.
Instead, we show people what they want to see with a number next to it that reinforces the popularity and legitimizes whatever it is. That's part of how birtherism became a thing. You only see articles from a source you trust, and you see that 100k people also approve of every post on the topic. "Hmmm. Well, those 100,000 people must be onto something!"
I don't think that would work well at all. "The polar opposite to my norms" is stuff like Alex Jones and Breitbart. I won't learn anything from exposing myself to such idiocy. Dishonest clickbait is worthless, even if it comes from a diversity of viewpoints.
No algorithm is (yet) smart enough to judge the quality of journalism. Facebook shouldn't try.
This is exactly why we need to be broken from echo chambers. If you think the polar opposite of whatever you believe is conspiracy and idiocy, you are in an echo chamber. There is rational journalism for whatever is the opposite of your beliefs.
Slightly off-topic: are there any social networks out there not polluted by news -- fake or otherwise? Is there room for a new social network that's for, you know, people?
Snapchat's a tabloid, Facebook is basically a news RSS feed, and I'm even starting to get news "stories" when browsing hashtags on Instagram. I just want to discover people and talk to people. Apart from dating apps (Tinder/Bumble/etc.), it seems that all social networks have become news networks.
Mastodon has a norm that news/current events are hidden behind a content warning/subject line. So if you want a news-free timeline, you can have it. You're still responsible for who you follow, but the timeline is straight reverse-chronological with no algorithmic fiddling, and numbers aren't shown for favs and boosts unless you zoom in on a post's details. There's less incentive for clickbait and outrage.
It could get worse in the future, as it's relatively small (1 million known active users) and has a positive founder effect, but for now it's very good.
That might be because "the news" isn't some abstract thing that has no effect on people's lives, so people want to talk to people about the news and politics.
Not saying you're wrong, but I don't know, I was perfectly happy chatting with friends on AIM messenger a decade ago. Myspace was also super spammy, but never became a (literal) news feed. I don't mind stupid pro/anti-Trump memes or whatnot, but giving news distribution companies (from Breitbart to the Economist) a platform on your social network seems like an anti-pattern.
I'd much rather see a news article someone thought was trenchant or interesting than a meme image that serves no purpose but to try and rile people up. Which one is a better basis for conversation?
Let me rephrase: I think it's perfectly fine to share news articles and talk about them, etc. But the distribution of news has become fundamental to FB and Snapchat (and to a lesser extent Twitter and IG) -- see FB's trending stories, or Snapchat's "Discover" tab. It's not at all like "oh Bob thought this Fox News article was interesting" -- although to be honest, I would actively discourage that, too.
I get a lot of news in my feed, but I willingly subscribed to multiple news sources and regularly comment on articles, so I can't comment on the idea of news being foisted upon me.
Well, my FB isn't polluted by news. I don't know why people talk as if they're forced to look at their news feeds, I hardly ever do. I use it for chatting with friends around the world, looking at their pics sometimes, making friends.
ban certain users from voting on certain topics or even seeing them, and game the system to always lean unfairly toward your political leanings. no watchdogs, no transparency.
What they should do isn't show me news that I'm likely to agree with, but news that nearly no one disagrees with. It should be very easy to notice what has both large numbers of people agreeing and disagreeing with it, with people within the same social networks agreeing with each other. That type of news could simply be suppressed. If all that's left is news with kittens, then that's a glorious success. I can get actual news elsewhere, and Facebook can select a few publications of good quality and replace anyone sharing a Breitbart article with a corresponding one from the Guardian.
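As a sketch of how "easy to notice" could work, here's an illustrative controversy score in Python, loosely in the spirit of Reddit's "controversial" sort (the threshold is made up):

    def controversy(agrees, disagrees):
        # High only when engagement is both heavy and evenly split.
        if agrees <= 0 or disagrees <= 0:
            return 0.0
        balance = min(agrees, disagrees) / max(agrees, disagrees)
        return (agrees + disagrees) ** balance

    def should_suppress(agrees, disagrees, threshold=1000):
        # 5000 vs 4000 reactions scores ~1450 and gets suppressed;
        # 5000 kitten fans vs 3 grumps scores ~1 and sails through.
        return controversy(agrees, disagrees) > threshold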
Well, in a country where 30 states voted for Trump, 20 states voted for Hillary, and 100 million people chose not to vote (and that's 42% of the eligible electorate[0]), who shall be the arbiter of what news, at least U.S. political news, "nearly no one disagrees with"? I mean, given that a majority of politically active people in a large majority of states voted for Trump, I expect that you'll agree with me that Alex Jones' endlessly pro-Trump daily videos should be featured right at the top of everyone's News Feed. I know I'd love that.
I don't use my Facebook much except to occasionally share vacation photos these days. Checking recently shows no news of any sort: just some silly memes, weather phenomena (supermoons!), pop-geek-culture stuff, and daily status updates. Since it's the season, there's some American football discussion, which people obviously have disagreements on, of course, but in that "sport" sort of way. :) I'm sure major news (like major weather events) will still make my feed, but there seems to be a noticeable decrease in political stuff.
My cynical take on this announcement is that, for those who post and share a lot of news, there is a high probability that your news bubble will be more heavily reinforced. Not all Trump voters are Alex Jones heads, of course, but if you are one, maybe Alex Jones will still be at the top of your news feed, and you will get all of the Alex Jones-y shares you want. It just won't be on the feed of the apolitical friend merely sharing silly cat videos, or the person sharing Rachel Maddow clips. Facebook, in other words, will strive to bring you the best news feed that you will never disagree with, all the better for dopamine-inducing clicks and likes.
(I don't really know, of course, if my cynicism is off the mark, but these sort of opaque-algorithm news feed games Facebook plays are a large reason why I rarely use my Facebook feed. :) )
Well, on the one hand, sure. I think Alex Jones is a conspiracy-theory charlatan, a good representative of "fake news" as a matter of a fact (see: Pizzagate). The fact that his site also pushes dubious supplements (like "super advanced vitamin B-12" and "survival shield", which is a "proprietary nascent iodine" formula) says it all to me.
On the other hand, no. I'm not comfortable with a social media network that determines your news feed's "appropriateness" via an opaque, Brazil-esque (the movie, not the country) bureaucratic process that says "yay" or "nay". And while Alex Jones may be an easier business case here due to the "fake news" elements, I'm not seeing much transparency about what their post-blocking process really is, other than that it seems messy and haphazard, and a lot of people complain about it for various reasons: whether you are a black person posting police brutality videos, or an activist documenting the Rohingya genocide, or have conservative opinions, or are Palestinian, or post breasts no matter the purpose, or post iconic Vietnam War photos, or various other reasons that come up in articles.
> On the other hand, no. I'm not comfortable with a social media network that determines your news feed's "appropriateness" via an opaque, Brazil-esque (the movie, not the country) bureaucratic process that says "yay" or "nay"
Completely agree. But I'm actually less comfortable with the status quo of state-controlled botnets spreading a goo of fake news, and people living in the fake media bubbles it creates. Wikipedia is definitely a role model here. I'm unsure whether Twitter and Facebook could achieve the same level of control, but if they could it would be great.
A lot of people would strongly disagree with Jones' videos, while some would share and like them. So they are clearly controversial and "bubbly" (as those liking them and those disliking them will belong to connected groups, socially and locally), so Facebook should simply try to limit their spread, as they are likely not objective and certainly divisive. That was exactly my point.
But I still believe that Facebook could whitelist news sources too; obviously, if they do, there will be a massive cry from those who thought Breitbart or Drudge was going to make the cut.
This is a horrible idea in general, but very good for Facebook.
Facebook users will be in an even greater echo chamber reassuring their world-views, and therefore more engaged with the website, which is all FB cares about.
Here's my theory, and I think it's quite obvious: Zuckerberg, as someone with a reported $74B in net worth[0], is actually a Republican, strongly supports Trump, and is pushing for his reelection. All the apologies and damage control are just virtue-signalling to appease the entertainment industry (in which I include the news industry), which relies on and trusts Facebook for its very existence.
I just wish they would implement a "Snopes.com filter" which marked things as proven false. I would say something like 1/5 posts some of my relatives make is easily disproved by Snopes.
I enjoy seeing FB sabotage their brand into the ground, not because of its service, but because of its cesspool leadership.
My feed is basically a bullshit mill of news instead of actual "friends'" experiences. And even setting aside the bullshit, a good % of the experience stuff is basically just selfie show-offs. It's almost become a mirror of the self, with very little socially valuable interaction.
oh you're on a beach ...like
oh you want to signal your virtue ..share
Facebook is starting to become the anti-social platform in the sense that it has an overall negative social effect.
And Rome lets the mob run the Comitia Tributa via popularity contests, including dismissing opposition Tribunes.
I'd argue Rome died not in 476 AD but in 121 BC, when the Senate killed Gaius Gracchus and didn't realize they needed to reform and create an actual political process that doesn't lead to smooth talkers and thus tyranny.
I believe Plato's and the Founding Fathers' preferred solution here would be for each individual to find the smartest willing person they know and let them offer their analysis for consideration.
So they're now going to rank news stories based on the reported (i.e. subjective, not objective as they claim) familiarity and credibility of the source. It's hard to know how this would help without more details. Does anyone have more information on precisely how this heuristic is defined? I have my doubts that it could work at all. It seems the whole lesson of recent events is that the average person is hardly qualified to judge the credibility of anything.
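From the press coverage, the heuristic appears to be survey-based: ask a sample of users whether they recognize a source and whether they trust it, then score each source by the share of people familiar with it who also trust it. A minimal Python sketch under that assumption (I'm guessing at the mechanism; Facebook hasn't published details):

    def source_trust_score(responses):
        # responses: iterable of (is_familiar, trusts) booleans per respondent
        trust_votes = [trusts for is_familiar, trusts in responses if is_familiar]
        if not trust_votes:
            return None  # nobody has heard of the source: no signal at all
        return sum(trust_votes) / len(trust_votes)

If that's really the formula, the failure mode is obvious: a fringe source known only to its own fans scores a perfect 1.0, while a widely known but polarizing outlet hovers around 0.5, which rewards exactly the bubbles this is supposed to pop.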
When you let users contribute to any kind of measurement, it turns into a measure of popularity. It's now a game of "get enough users to do X". By relying on end-users to rate and rank things, we're letting popularity stand in for all kinds of quality measurements. Lots of 5 stars on Amazon = must be a good product. Lots of Yelp reviews = must be a good restaurant. Lots of upvotes = must be a good comment. Talk about doing the same thing over and over and expecting different results!
A fraction of revenue from popular pages should fund a manual fact checking process that either temporarily grants or denies that page a trusted status.
People don't care about "fact checking". The NYT's or the Wall Street Journal's newsrooms simply don't make factual errors, except occasionally spelling someone's name wrong. Nobody cares.
Your counter example uses a trusted source of info. The people who actively read those sources likely DO trust them. Facebook lets anyone post anything (to the extent that fake news is a huge problem), and reaches a much larger audience.
I think Facebook's main benefit from this is that they will now have users clicking a button essentially to say, "yes, I believe this." Now they have a direct metric for what type of manipulation works for that user. Because really, no user can actually verify the truth of a media outlet's claims without heroic efforts.
Well, easy, only "mainstream" sources end up being considered trustworthy, and any alternative news source that challenges the world view of NPR, CNN, Fox News, MSNBC, et al (sources that are frankly more alike than different) just gets shut out. Look how quickly places like the Washington Post ate up that "PropOrNot" stuff.
Before I go on, I must say this comment is not about politics, and I am not a Donald Trump supporter (I can't believe I always have to prefix anything sane with this nowadays, otherwise I get stoned to death online).
That said, here's a story. One day during the presidential campaign, I saw a person who normally really despised Donald Trump say, "Hey, I hate him, but I think he kinda made sense in his speech today." Remember, she watched the speech because she hated him and was looking for every chance to make fun of him, but instead came away rethinking things for just that one speech.
Then the next day she watched mainstream TV news anchors go through the speech line by line, criticizing it using Trump's past history and so on. Then she said, "I knew it, he's evil; he almost made me listen to him. So devious."
I normally never watch TV, so observing someone react in completely opposite ways based on just a 30-minute TV show is when I realized how much mainstream TV is shaping people's minds, and that propaganda in 2018 is stronger than ever. Most people don't want to believe they are under the heavy influence of political propaganda, because their identity is connected to it, but that is the truth.
Just to be clear, this is not even about politics. This phenomenon is increasingly common across every aspect of society. For example, if you look at what's happening in the cryptocurrency ecosystem, it's full of propaganda by people with more knowledge manipulating people with less. The only way to overcome this is to:
1. Try to be as emotionally detached from the events as possible
2. Actually try to learn what is going on, instead of listening to what everyone else is saying, because even your most trustworthy friend, family member, or even a very reputable Nobel prize winner is under this influence unless they followed this principle, which most people don't have time to do.
3. If you ARE that reputable person, be careful what you tell your followers. If you haven't gone through steps 1 and 2 and are just saying "This is who I am, I'm just expressing myself, take it or leave it, just unfollow me if you don't want to hear what I say", you are being extremely irresponsible. You are basically taking your own gullibility and amplifying it to hundreds of thousands or millions of other people who are probably in a less fortunate position than you are (which means they will suffer exponentially more from the misinformation than you will).
Facebook doing this won't help with fake news; it will only accelerate what they are already guilty of, because most people aren't even aware they are misinformed, and furthermore don't want to believe they are wrong. So it will only result in a larger and larger filter bubble, separating people even more.
It's like religion: when was the last time you were able to convince a religious person that God doesn't exist? Imagine telling people on Facebook to vote on whether God exists.
Personally I think there's a huge opportunity hiding in this madness somewhere if you're an entrepreneur.
A good way to prevent false stories would be to reward users who take the time to find evidence to the contrary with some gamified rank system.
"Mammoth-hunter of Lies"
The BBC article (http://www.bbc.co.uk/news/technology-42755832) made the point that this system would have stopped Buzzfeed being able to make that change (to the extent they'd still rely on FB traffic):
> For instance, Buzzfeed's initial beginnings as a viral site would have almost certainly hindered its growth into a serious news organisation had it been subject to the ideas about to be put in place by Mr Zuckerberg's team.
I'm not really sure what the answer to this is, but they are really good now and would definitely have been perma-black-holed before now by many measures I can think of.