If we really want to solve the problem, rather than use it as an outlet to vent frustration like spoiled brats, then we should think deeper.
Why does my mum believe every piece of crap that shows up on Facebook and WhatsApp? And why do I call bullshit within a few seconds of reading it?
This, I believe, is the right angle. Why do people believe obvious lies? And why is it obvious to some and not to others?
I suspect the root of the problem is our system of education. The primary goal these days seems to be obedience and unquestioning belief in authority.
If this isn't solved (and I doubt it can be), then asking for fake news control will bring a bigger problem: loss of the free flow of information...
China already has an excellent way of controlling fake news. But is that what we really want? Those who would benefit from Chinese-style news control are the very people behind the sudden interest in fake news.
Wake up people!
You would think so. But my dad is a varsity Professor with a PhD, my mom has a Masters in English.
They believe literally every cock and bull story they see on Facebook.
You make an interesting point when you state that "The primary goal these days seems to be obedience and unquestioning belief in authority," because for me that is both the problem and, ironically, the very obvious solution.
If there were a trusted (read: authoritative) service that one could use to vet such stories, then the problem would go away.
The devil is in the details though. I've seen people dismiss Snopes as some sort of leftist agenda pandering website.
>You would think so. But my dad is a varsity Professor with a PhD, my mom has a Masters in English. They believe literally every cock and bull story they see on Facebook.
This is precisely what the parent to your comment is saying. Many (most?) degrees hold more symbolic value (read: fake) than real practical value, because that is what education has devolved into: favoring symbolism (test scores, degrees) over real-life applicability.
>If there was a trusted (read, authoritative) service that one could use to vet such stories then the problem would go away.
This is a logical fallacy.
If you already think you lack the ability to find truth, how would you know it was real when someone else supposedly gave it to you? If you can't trust yourself how can you trust your trust in others?
But this is exactly the issue. If you're saying "these people aren't competent to evaluate news, so they need a trusted source", then how can they possibly evaluate their trusted source?
It's not a surprise that people who can't pick out fake news stories also can't tell how effective Snopes is, it's the heart of the problem. (Well, that and "if Snopes became authoritative, how would you know if it started lying?")
I would guess that the difference between you and your parents is years of exposure to the tropes, tone, register and other pragmatics of internet content.
Here's the fun part: in China, spreading rumors is a criminal offence, but debunks (or at least one-sided debunks) are not. So people and PR firms are using fake news to counter fake news.
Step one is to realize your side is just as full of shit as the other and to accept that your favorite politicians are not right about even half of most public policy ideas.
After that, then you can start working on removing biases.
Edit: Also fake news is such a shitty term. Call it what it is: propaganda. Media that plays to the worst elements of our animal brains to bring the greatest emotional response and move our fingers to the share buttons. It's nothing new, we just have an unprecedented amount of it available to us and little desire to hear the complex and often unintuitive truth to many issues: nuance doesn't fit in a tweet.
This gets even easier if you are misleading and manipulating facts rather than outright making them up. "The Pope endorsed Trump" sounds immediately made up, and has nothing to back it up. "Hillary conspired to keep drug prices high", when "backed up" by emails, is easier to believe. If I already believe she's corrupt, then why would I doubt more evidence?
The Daily Mail is still running stories about straight cucumbers now.
The German government wants to implement insane penalties for not removing "fake news" fast enough, but what they're really trying to do is make sure Facebook only tells their side of the story, i.e. establish censorship on the internet.
Government-critical sites that had millions of followers have already been banned, while media supporting the government can happily keep spreading their equally bad or even worse fake news.
It's becoming a world-wide trend to call unfavorable things fascist and then employ fascist methods to fight them. Just like calling oligarchic structures a democracy, because it sells better.
When I read
> I highly doubt that Facebook will “move fast, break things.”
> Instead, Facebook is pushing half-hearted solutions and scapegoating the issue to other parties, prepping to weather this storm on an already-laid foundation of media dependence and habit.
I start to think Facebook is only waiting to be disrupted.
This may be one of the human sides of a technology developed by people who want to "fix everything from polarization to terrorist attacks to how we live together", but fail to provide its users with basic rights (privacy and freedom of information, among others).
Any opinions?
Take all of my upvotes.
I'll never get my head around "Antifa" doxxing and assaulting Trump voters because they're "fascists". When someone stands up against these kids, e.g. Based Stick Man, they're the ones that get arrested, not the aggressors.
“Beware that, when fighting monsters, you yourself do not become a monster... for when you gaze long into the abyss, the abyss gazes also into you.”
Given which, "penalties for fake news" means one of: "after reports, which is too late", "all of it, so shut down your company", or "anything which annoys us enough to prosecute, so be proactive about pushing our stance". Choice three seems to be the usual outcome.
1. The only viable solution must come from the top down. People will only reject fake news when leaders emphasize critical thinking and shoot down false stories, especially when the fake narrative is favorable to them or their position.
2. Teaching kids to spot "fake news" at school is laughably impotent and "feel good". People believe fake stories because they're emotionally-invested in the subject (see PG's "keep your identity small" post). Anyone that thinks we can solve fake news through "critical thinking skills" is... fake news. The problem is emotional involvement. If you're emotionally involved in a story, you won't want to use critical thinking.
3. You can get people to change their opinions, e.g. from climate-denial to pro-science (through using their cognitive dissonance against them). That's how you got Sanders converting Trump supporters at a recent town hall. You haven't made them more rational, you've just taken their emotions, which pointed one way, and made them point the other way. So, if you want to make people trust "the establishment" or the Democrats more, you don't need to fix points 1. and 2. (which require fixing people), you just need to learn about cognitive dissonance and persuasion. But that doesn't solve the core problem of not having an informed population.
The alternative is to not do anything at all, but that leaves our democracies open to attack from malicious actors.
These both lead to bad outcomes. So let's consider alternative approaches.
I believe 'fake news' is a symptom as well as a cause of political polarization. The less polarized and divisive our politics is, the less willing people will be to blindly accept fake news. Basically the potential receptive audience to fake news will decline. Given that approaches which try to attack fake news directly both have negative consequences, I think if we instead try to target other sources of polarization in society the fake news problem itself will become less serious.
Let's start with getting money out of politics, instituting term limits for congress, finding an alternative to gerrymandering for the creation of electoral districts and most importantly growing the economy.
These are challenging and nontrivial approaches but will ultimately attack the root cause of fake news spread. It is much better to treat a disease at its source than to just treat its symptoms.
Many people don't know that the Swiss system is actually based on the American one, with some small differences. These differences have led to a totally different result. Switzerland has about 12 parties in the lower house and 6 in the upper. The President and his ministers are replaced by a council of seven ministers elected by the parliament; currently four parties are represented.
The US would not need to copy the Swiss model, but some changes that lead to similar effects would help the US political system and the 'fake news' problem.
A good place to start: https://www.youtube.com/playlist?list=PL7679C7ACE93A5638
Otherwise we're no better than schoolyard gossip.
Too many outlets present opinion pieces as fact and stoke public outrage by selecting the most controversial, inflammatory, or biased stories and opinions to get viewership. Hell, you don't have to look far to find them claiming not to be news but "Entertainment News", and frankly, it's insulting. News used to have standards. Unfortunately, it now scrapes the bottom of the barrel.
That's one thing I've noticed drives a lot of the fake news sites - they're loaded with ads. Clearly that's their goal. If only we could get Google to cut off ad revenue to websites that are spreading made-up news.
Good recent example? This: http://archive.is/R5Z1J
Fake news about Trump spread like wildfire. Even worse, the fake news sites were set up by a media company to advertise their new film. I think that's absolutely reprehensible behavior on their part.
And it's all about the ad revenue.
Fake news, or lies as we should really be calling them, is very much a social problem. We've had "fake news" in the form of parody websites like the Onion for a long time, and whilst a large number of people fell for their stories, they often didn't make the same mistake twice. People are looking for new sources of news, and the combination of this with easily shareable and difficult to verify stories means that the lies spread quickly.
Part of this has been caused by the attack on mainstream media (MSM): people no longer trust the MSM because they're being told not to by politicians, and the fact that the MSM are often caught misreporting stories doesn't help. So, people turn to smaller, more focused sources, often ones they've never heard of before, and which they haven't been told not to trust.

How does Facebook solve this? I'm not sure. Flagging stories as "this hasn't been reported elsewhere", or filtering new sources of news, might be a start, but it wouldn't be a solution. I'm not sure people would trust Facebook to manually moderate content, and it would simply give fake news sites and politicians the ammunition they need to target Facebook as being against them. The very nature of news is that it's unexpected and mostly unpredictable, so training algorithms to detect fake news reliably, without hiding real news, is a very difficult task.
Facebook can't take responsibility for fake news, society needs to do that.
I'm not sure that's the biggest part of the problem, though maybe it's different in my bubble. In my experience, the MSM destroyed their own credibility by acting as stenographers for political and corporate power, and, in the process, spreading "fake news". And it wasn't the little independent partisan sources, it was literally the WaPo and the NYT that helped the Bush II regime lie us into the Iraq War (14 years ago today!).
It's hard to gain back credibility when you've so comprehensively destroyed it in order to be close to power. And the NYT and WaPo have not even pretended to be interested in doing so.
Chronologically implausible. Trust in the MSM has been collapsing for decades; politicians only recently began telling people not to trust them. Politicians telling people not to trust the media is an effect of the fact that trust had already collapsed, not the cause. If people had trust and regard for the media, then politicians, another group that people have little trust and regard for, telling people not to trust them would have had no effect.
Personally, I can't help but notice that almost nobody seems to propose that if the mainstream news organizations want to fight "fake news", maybe they should make the effort to become more trustworthy. Not make an effort to appear more trustworthy, but to become more trustworthy. I think it says something profound about our era that this hardly even comes up.
Jon Stewart touched on this recently on an appearance on The Late Show.
>"This breakup with Donald Trump has given you, the media, an amazing opportunity for self-reflection and improvement. Instead of worrying about whether Trump is un-American, or if he thinks you're the enemy, or if he's being mean to you, or if he's going to let you go back into the briefings, do something for yourself. Self-improvement! Take up a hobby. I recommend journalism."
But I think there could be some mechanisms put in place, similar to how Youtube handles copyright claims: upon a report, or a request for a source from any Facebook user, the publication would get its Reach capped, or the publication would be hidden, until further evidence is provided.
Now, there are a lot of problems with this approach. First, it's user-based (which has its pros and cons); second, it may be a target for malicious activity and anti-competitive practices. On the other hand, if the source was displayed, it would somehow protect the publication (at least briefly), without having its Reach capped, but still flagged if the source is dubious.
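The report/cap/restore flow described above can be sketched in a few lines. Everything here is invented for illustration (the `Post` class, `REPORT_THRESHOLD`, the multiplier values); this is not any real Facebook or YouTube API, just a minimal model of the proposed mechanism.

```python
# Minimal sketch of the proposed reporting/reach-capping mechanism.
# All names and thresholds are hypothetical.
from dataclasses import dataclass

REPORT_THRESHOLD = 10   # reports before reach is capped (assumed value)
CAPPED_REACH = 0.1      # fraction of normal distribution while under review

@dataclass
class Post:
    url: str
    reports: int = 0
    source_provided: bool = False

def handle_report(post: Post) -> None:
    """Register one user report (or request for a source) against the post."""
    post.reports += 1

def provide_source(post: Post) -> None:
    """Publisher supplies a citation; this restores reach but not the flag."""
    post.source_provided = True

def reach_multiplier(post: Post) -> float:
    """Fraction of its normal distribution the post currently receives."""
    if post.source_provided:
        return 1.0                      # a displayed source protects it, at least briefly
    if post.reports >= REPORT_THRESHOLD:
        return CAPPED_REACH             # capped until further evidence is provided
    return 1.0

def is_flagged(post: Post) -> bool:
    """The flag stays visible even after a source is provided, if reports piled up."""
    return post.reports >= REPORT_THRESHOLD
```

The key design choice in the comment is visible here: providing a source restores distribution (to protect legitimate publishers from malicious mass-reporting), while the flag persists so readers can still see the story is disputed.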
Come to think of it, that might not be the worst thing in the world.
Something at a publisher level, where the publisher themselves is given a fake news rating, rather than individual stories, might work, but even that is full of difficult to solve issues.
Publishers always had reputation (good or bad) based on their work, which maybe could be displayed with a rating - the problem is when even those with good rep are wrong...
This is a practical suggestion, but not one I find reassuring. The Youtube copyright system works fairly well from the perspective of big rights holders, but is relentlessly abused for both censorship and corporate overreach in taking down fair-use or even non-infringing content.
That's basically the outcome I expect from a fake-news-takedown system, too. It would almost inevitably end up working for whoever was willing to throw the most time and money at manipulating content.
Imho, at the least, Facebook should merge commenting from larger groups of users on a single political post, in such a way that a balanced view of pro and con arguments can be read. (This means that if one of my friends shares a post, I don't see just my friends' comments).
Similar to how some aren't looking for news, but affirmations of their political beliefs.
Take for example the recent CIA leaks. Reuters, when reporting on them, included a statement from an antivirus software manufacturer saying that "We can prevent attacks in real time if we were given the hooks into the mobile operating system". Given the long history of AV software manufacturers not following even basic security practices, I doubt that security experts would agree that this is how we should ensure the safety of mobile devices.
David Lieberman, "Fake News," TV Guide, 1992; quoted in "Toxic Sludge Is Good for You".
Eric Auchard, "Wikileaks' CIA hacking dump sends tech firms scrambling for fixes," http://www.reuters.com/article/wikileaks-products-idUSL5N1GL...
However, the term being loosely defined should not make us lose sight of the fact that there is a specific kind of "fake news" that is 100% a consequence of the priorities and incentives that Facebook has put in place: making outrageous headlines and stories out of thin air, for no other reason than making money from advertising.
Because we don't see outrageous headlines anywhere else! That's why!
The problem is that, to judge something as 'fake', an authority needs to deem what is 'real'. And abuse of authority never happens, right? Banning fake news is akin to state censorship, since ultimately the one making the penalties (the state) gets to decide what 'fake news' actually is.
Buy a subscription to the NY Times, or any other of the major newspapers or newsmagazines.
Did you know the journalism industry is in a tailspin, and many professional reporters can't find jobs because people feel they can find the same information for free from blogs and on Facebook?
Newspapers aren't perfect, and bias is unavoidable, _but_ I don't see too many blogs paying copyeditors and fact checkers to double-check their work.
The good newspapers also take responsibility for their work and protect their sources.
I find it sad that as a society we are abandoning the press, which is an important check and balance on the other institutions in society.
Instead we prefer our dopamine squirts from swiping our Facebook feeds on our phones, and being fed targeted stories that confirm our biases.
I find it a sad irony that a generation who often likes to sneer at older industries, and disrupt them, thinks they have replaced the free press with something 'better' just because it is delivered via a cellphone.
They are mistaking the medium for the message. The information quality is lower, but if it feels new and shiny, it must be better right?...
Yes, I know most of these sources offer their content for free online, but I'm suggesting you pay for it anyway. You know, support them before they disappear completely.
> This is just the latest in a long string of recent responses to the issue that shirk responsibility for the content that users consume and share on Facebook.
It really isn't. This is the core issue.