
This is the opposite of what they should have done. Shows just how much of an echo-chamber Facebook has become. This is just adding another metric by which garbage can be made legitimate and vice versa.

Facebook would be a much better platform if they removed the like and comment count from every post. It’s those numbers that lead people to believe that garbage is legitimate in the first place. By hiding these numbers it would force people to think for themselves.

Same thing with YouTube. Fake news becomes real news when it has enough views, likes, upvotes, comments, etc. These counts don't really need to be shown to the end user.



If you really want to fight the root of the problem, it's not likes and comments, but education.

Should we censor social media because people are too stupid to deal with it? I don't think so, and if you do, you're playing into the hands of corporations who want a cheap work force.

Your "solution" isn't a solution, but a hot fix that will cause more damage in the future. It's exactly the shortsighted crap politics are trying to do right now to get immediate results.


An experience at a conference a few months ago showed me that education won't fix echo chambers.

The conference was organized by an organization that runs one of the major initiatives against false news, and one of the main tracks was about addressing the issue of false news. Last year they had a massive booth in the middle of the conference hall about false news. This year they brought in a keynote speaker who showed several political slides and drew misleading conclusions that were later proven false, yet the whole thing resonated with the audience. It seemed not to matter a bit that it also just happened to be false.

When facts get in the way of a political message, even the most educated seem to let the facts go if the message aligns with their own views. I wish education would work, but I don't see it as a proven method to solve echo chambers or political news.


The whole idea of "fake news" came out of US politics to begin with, so no surprise organisations set up specifically with that as a mantra are ideological in nature and uninterested in actual facts.


I believe that with your quotes you may be referring to the phrase "fake news", but fake news itself has been around as long as humans, and probably even longer. I'm glad people are giving this a discussion though.


No, "fake news" refers to a very specific phenomenon. It's not just any kind of misleading or incorrect reporting. Trump was the one to start saying "not fake news, YOU'RE fake news", but the phenomenon as initially encountered was about shoddy and small fraudulent websites, often trying to deceive their audience to believe that they were a different website, pushing the most anger-inducing news.

It's a far cry from the NYT or Fox News reporting incorrect things, whether deliberately or not, to someone called Foxx News or Non York Times reporting that Islam causes prostate cancer.

https://en.wikipedia.org/wiki/List_of_fake_news_websites#Lis...


This phenomenon was previously called 'yellow journalism' and should be a core component of any halfway-decent, modern American History course.

Have we forgotten about the Spanish American war?


I don't think yellow journalism has the same properties as fake news. For one, yellow journalism doesn't try to deceive its readers into thinking it is the same source as another, more reputable source. With print media, that was harder to do.

Fake news has a slightly different approach because online it's much easier to reach an audience and to skirt trademark enforcement.


Interesting that "Before it was news" is on the list of fake news sites, its a page aggregator like Reddit and hacker news, fake articles can be posted, but doesn't make the entire site fake.

Community news/comments can be wrong, but that's far different from a site intentionally promoting fake propaganda, which is the true meaning of a fake news site.

And even then, propaganda can blur into a political view, and then even further into downright lies.


I think there's also a risk of conflating 'education' with our 'education system' that teaches citizens to be good productive workers but doesn't teach citizens how to analyze and debate about the nature of our society.

I believe the parent might be referring to 'education' as in logic, political history, and philosophy, so that the student can retrace the historical steps and recreate a societal scaffold that may or may not look like society today, in the same way civil engineers are taught trig, calculus, statics, etc., so that the engineer can retrace the historical steps and build structures that may or may not reinforce or criticize our engineering approaches today.


> education won't fix echo chambers.

No fewer than 6 of the coveted "swing states" switched from blue to red[0], and many counties across the country flipped as well. What persuaded hundreds of thousands, if not millions, of 2012 Obama voters to be 2016 Trump voters? Trump himself educated them, in his speeches. I'm one such person: I spent the entire election season commenting and chatting with others on Reddit and YouTube, and there were countless Trump supporters who had been Obama supporters. Trump educated us more about the political system than any politician or journalist ever has. He taught people the phrase "lobbyists, donors, and special interests." That itself had a very powerful effect.

[0] https://www.nytimes.com/elections/results/president


I'm seriously wondering about the education system in the US if you had to learn about lobbyists, donors and special interests from Trump.

This is primary school stuff in my country at least, along with the explanation for why political ads on TV are banned here.


You're not the first person to worry about our mandatory education. It varies enormously depending on the resources of the surrounding area, and I am sure there are some schools that cover modern politics.


I have been in this country all of 10 years and I knew about lobbyists, donors, and special interests even before coming over. The US lobbying system is famous across the world; even my 75-year-old parents know about it, and they live in some far-flung developing country. I am surprised that Trump had to educate people about this basic stuff.


> If you really want to fight the root of the problem, it's not likes and comments, but education.

But how do you educate yourself, though?

The same argument was already made about books and newspapers: depending on the subject, a lot of them are trash. Same for TV news. Wikipedia is also not a de facto trustworthy source.

Except for first hand experience on a subject, you will never be able to find a source that is inherently trustable (and you might even interpret your experience the wrong way, so you should doubt yourself too...).

To educate yourself, there's a point where you have to try to trust some of the information you get. Having that info come from a trusted source or with a high level of peer review is a useful first level of filtering. From there you can (and surely should) invest more time to double-check or cross-check if you want, but IMO quick filtering is necessary and can't be brushed away by saying "people should know better".


>Except for first hand experience on a subject, you will never be able to find a source that is inherently trustable

Even first-hand experience isn't trustworthy. Our memory is so deeply flawed that I can't even tell with certainty which first-hand experiences I had. Simply presenting "evidence" that you should remember something can cause your brain to make that memory up. And if the experience is genuine and I try to remember some detail I didn't pay close attention to, my brain will most likely just make it up. So the only reliable way to learn from personal experience is to learn during the experience, but confirmation bias will make me miss most opportunities to learn.

Honestly, it's amazing that we are capable of learning at all.


First of all, good education should come from schools.

Being able to further educate yourself and judge the trustworthiness of sources are direct results of a good primary education.

Many western states have criminally decreased education spending for decades, and the current state of civilization is in large part a result of that.

This shit is infectious, too. Many stupid people think their kids don't need a good education, even if there's a chance for them to get one.


And what if schoolbooks are filled with 'fake news'? All you've done at that point is create an appeal to authority.

http://www.textbookleague.org/103feyn.htm


A good education is far more than just textbooks. Critical thinking is what is missing here - and is what needs to be taught in schools.

Mindless regurgitation of facts doesn't really help anyone - and that's the majority of what I've seen in high schools in America.


I think this is the key point. Even if schools presented largely propaganda, as long as they spent an effective amount of time teaching useful critical thinking, the students might have enough experience/tools to think for themselves and not accept everything in the books.


I think the type of education they're talking about is not formal education like in school but how to educate yourself about today's events. Not something you could or should go to school for.


I'm not so sure; critical reading/thinking skills could be taught in school - just take newspaper articles and pick out the parts that are fake/misleading. Finding teachers able to do this may be easier said than done, though, as would doing it in a non-partisan way.


Here are the "peers" that I have reviewing what I hear from politicians: the salary on my paycheck, the tax figures on that paycheck, and the prices at the gas station and supermarket. I don't need to pore over news editorials and blogs posts or watch commentators on any TV channel to know which political figure is making more sense and describing reality more accurately.


I agree that if you have “tangible” evidence of a fact, it makes it easier to disprove a claim. I would still be wary of this logic, as chains of events can be unintuitive as well.

For instance, some politicians make a bold move that has a clear and immediate positive effect. Yet if it has huge indirect and long-term consequences that only surface once they are out of office, from your perspective they will get all the credit and none of the blame.

Having different perspectives from “experts” is in my opinion important. The problem is of course to find the right experts, and to have insight into their limits or their biases, which is usually the beginning of becoming a kind of expert yourself.


I think history would be a good source. We think we're facing unique challenges, but the debates we're having today were already discussed, eerily, ad nauseam by Greek thinkers. And primary sources are so accessible that it's difficult to land on untrustworthy extractions unless you read contemporary commentaries.


>If you really want to fight the root of the problem, it's not likes and comments, but education.

I believe this is a design issue (systems design, not UI design). Filter bubbles and confirmation bias can reinforce each other in a vicious circle. Financial incentives for "news outlets" to mislead people make it worse. That's the heart of Facebook's issue.

With a herculean effort we might be able to overcome that with education, but improving Facebook's design to lessen the impact of echo chambers and social confirmation would be much easier and faster. Removing all interaction is one way to do that, though I agree that that's a bit heavy-handed and goes against Facebook's incentives.


> If you really want to fight the root of the problem, it's not likes and comments, but education.

I agree, but it's impossible to educate everyone on any subject. I think in this case people who are experts in the matter should give a reputation rating to the article. That would be really great. For example, I am no expert in theoretical physics and I may enjoy an article about it, but how do I know it's not a bunch of bullshit I am reading? If I see that Stephen Hawking gave it a 5 star reputable source rating, I "know" I can believe it.


But that still requires that you have a knowledgeable baseline of who is and isn't a "good" theoretical physicist. For a lot of people, for example, Dr. Oz is a "good" medical doctor and Gwyneth Paltrow is a trusted source on women's health, and with that as a baseline you haven't solved anything.


> I agree, but it's impossible to educate everyone on any subject.

Critical thinking is universal though. It's not a necessity to have a solid judgment of a piece. "Critical abstention" is a possibility, and a primarily formal analysis is another (e.g. searching for common communication red flags).

I don't think that the parent idea of removing likes and counts is realistic (in any imaginary scenario I can think of).


> Critical thinking is universal

I'm assuming you don't spend a lot of time on Facebook?


I'm assuming you do. Otherwise you would have understood the comment.


>> If I see that Stephen Hawking gave it a 5 star reputable source rating, I "know" I can believe it.

People everywhere are going away from this model as "experts" are consistently bribed through various means.

There is no good short-term fix. People need to become more educated and adapt. If the argument is that they can't, well, maybe our brains have been defeated entirely by advertising and messaging. It wouldn't be the first time a complex problem was mostly solved.


One way to work against that is proper anti-bribery laws and their enforcement. Compared to other industrial countries, the US allows many forms of corruption and disinformation that, once forbidden, would sustainably change the educational and political landscape (most notably bribes, officially "donations", to political parties).


An educated person can judge the trustworthiness of a source well enough.


No, they can't.

See, I can make things up too. Furthermore the amount of education necessary to judge the trustworthiness of all potential subjects is not obtainable by an individual, meaning they have to hope the experts they choose to listen to aren't making it up.


Yes, they can.

If all you do to measure trustworthiness is check if the person has an expert tag, I can see your problem.


"Should we censor social media, because people are too stupid to deal with it? I don't think so and if you do, you're playing in the hands of corporations who want a cheap work force."

Just as a sort of general, broad response to this sentiment: yes absolutely. I envy your optimism, but the belief amongst the Hacker News set that humans are universally capable of educating their way out from under deeply-rooted biological faults is misguided. You're not going to educate people out of conferring status on the most popular people/stories/posts/etc. You're competing against at least a few hundred thousand years of evolution (and many of these processes go back much, much further than that.) You won't win.

So, yes, absolutely we should be working on ways to acknowledge and route around human foibles, not pretending we can eliminate them.


We really are better off if wiser and more capable organization -- like a bigger brother of sorts -- gets rid of the wrong facts and incorrect ideas.


It's sort of ironic that a proposal not to show "likes" comes in for strident criticism on HN. After all, this very site has vacillated a few times on whether or not to show the karma score for each post, and seems to have decided it's better for the community not to do so.


I'm a strong civil liberties anti-censorship person overall…

But https://www.npr.org/sections/money/2017/08/25/546127444/epis... convinced me that this is a case where education is NOT enough and there's a time where the only solution is plain censorship, unfortunately.

The censorship doesn't have to be confidential. It can be auditable and publicly recorded. But in that case in Ukraine, they did all the education stuff and let the fake news through, and that wasn't enough to stop the country from being ripped apart.


It won't work. I know a lot of educated folks who believe the fake news. Politics has for a long time lied by omission in the national press. Now they just blatantly lie. The stuff we see today is propaganda dressed up in identity politics. In addition, there are plenty of fake Facebook accounts and other avatars on comments sites that may also be Russian trolls pushing disinformation.


Maybe the problem is education. Education is the solution for most problems. It's also the hardest, most expensive solution.

You will never educate the masses and even if you did, some of what you taught them would be wrong.


I think an interesting way Facebook could approach determining credibility would be by having users bet (with some made-up points instead of real money). Since this forces readers to stake something on their judgment, it would encourage critically reading the articles. I think a betting system would push Facebook less in the direction of an echo chamber. The difficulty with applying a betting system is that it still doesn't tell you whether the information is true, so bets can't be evaluated.
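
A rough sketch of what I mean, purely to make the idea concrete - the names and the payout rule are made up, and it punts entirely on the hard part, which is deciding whether the article was accurate:

    # Hypothetical sketch: users stake imaginary points on whether an article is accurate.
    from dataclasses import dataclass, field

    @dataclass
    class Bet:
        user: str
        stake: int            # made-up points, not real money
        says_accurate: bool   # the side of the claim the user is betting on

    @dataclass
    class ArticlePool:
        article_id: str
        bets: list = field(default_factory=list)

        def place(self, user, stake, says_accurate):
            self.bets.append(Bet(user, stake, says_accurate))

        def settle(self, was_accurate):
            # Losing stakes form the pot; winners split it in proportion to their stake.
            winners = [b for b in self.bets if b.says_accurate == was_accurate]
            losers = [b for b in self.bets if b.says_accurate != was_accurate]
            pot = sum(b.stake for b in losers)
            winning_stake = sum(b.stake for b in winners) or 1
            payouts = {b.user: -b.stake for b in losers}
            for b in winners:
                payouts[b.user] = pot * b.stake // winning_stake
            return payouts

    pool = ArticlePool("some-viral-article")
    pool.place("alice", 10, says_accurate=False)
    pool.place("bob", 30, says_accurate=True)
    print(pool.settle(was_accurate=False))   # {'bob': -30, 'alice': 30}

Even as a toy it shows the dependence on a trusted resolution step, which is exactly the "who decides what's true" problem.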


I like the idea in principle, but betting, real money or not, relies on agreed-upon, objective results/outcomes. Therein lies the problem: who's going to be the arbiter of truth?

However, it seems possible to get further along than we are now by creating betting pools/markets where the participants do agree on a set of sources for information.

Actually, I've thought along these lines before, rather indirectly, by theorizing that the WSJ was probably the most non-partisan source of "news" because most of its clientele demanded such due to their participation in the markets. But I realize that's highly speculative.


I agree with you, but I'm sure they will never remove those metrics.

You might like the extension Facebook Demetricator: http://bengrosser.com/projects/facebook-demetricator/

Personally I block the entire news feed with uBlock Origin which has made my experience of Facebook tolerable.


They will never do it because it’s those little numbers that give you your dopamine fix, and thus central to their revenue.

Social media is a game and the score is whatever metric is shown. You want the most points for yourself, and you only want to participate in popular content in the hopes it will earn you more points.


How do you disable or block the "suggested" crap?

I am seriously considering a year without adblock and going out of my way to click as many ads as possible. Blocking is ineffective, it's time to fuck over the metrics


Most ad blockers will let you select an ad to block, and depending on how specific the selector is it might block other occurrences.

What you are thinking of exists: AdNauseam. It clicks on every single ad you see to disrupt the metrics.

https://adnauseam.io/


There are no like or view counts on WhatsApp, but the fake news problem is much worse on WhatsApp than on Facebook and YouTube.


Can you explain this? Maybe it's because of the filter bubble I live in, but I almost never see this on WhatsApp (then again, maybe I wouldn't know if I had).


Because Facebook walls are visible to everyone, any controversial topic usually gets reactions from both sides. WhatsApp groups are formed based on like-mindedness (I am excluding core family and friends). I call these groups echo pipes, which silently form a larger echo chamber.


Are these public groups? How are they discovered?


WhatsApp groups are not discoverable (or publicly listed) unless someone invites you or provides you with a specific invite link with which you can join the group.


As someone who hasn’t used WhatsApp, I am surprised and curious about this. Any other experiences or information would be much appreciated.


They already stopped the first "fake news" label experiment because it simply caused people to click on it more. I think that's the core problem: people don't really want to know if something is legitimate or not; they want controversial stuff, and if it's labeled as fake, they simply want to see it and decide for themselves anyway. And, I'm sure, when something goes against your beliefs, you will take that fake label as proof that people don't want you to believe it / that the system is trying to brainwash you.


> This is the opposite of what they should have done. Shows just how much of an echo-chamber Facebook has become.

News itself is an echo chamber. I don't see how that is Facebook's fault.

The reason news is toxic isn't facebook. It's foxnews, nytimes, wapo and the entire media industry.

Not sure why you think "like" buttons would change that.

The news echo chamber existed before facebook. It existed before Zuckerberg was born. It'll exist long after facebook is gone.

The only solution is for facebook to ban news entirely from facebook. But then the news industry would get very upset.


Are you talking about yourself or other people?

My Facebook feed is mostly personal updates about hundreds of people I have met, and it feels like the most democratic media I’ve ever seen. It improved a lot when I unfollowed the handful of people who post junk like sportsball.

Are you assuming the faceless masses are exposed to some unimaginable horror that you aren’t seeing? Or tell us more about the horrors you do see.


>feels like the most democratic media I’ve ever seen.

The Democrats thought the same thing too, before Trump won the election. Perhaps you are falling victim to a filter bubble confirmation bias?

A democracy implies independent national sovereignty. When anyone (govt, corp, or rogue hackers) across the world can effectively manipulate your elections at scale, sovereign democracy is no longer a thing.


A $100,000 ad campaign overpowered the dems $1.4B budget? The DNC should hire those guys! They're political geniuses!


I'm having a hard time finding the link, but there was an article posted here on HN a while back (which the author deleted because of the underhanded nature of the endeavor) about a marketer whose ad campaign for getting a government initiative passed was failing even though they were using all the traditional methods.

Then they decided to spend ~$100 on buying fake reddit upvotes for their initiative and it received millions of views and ended up being highly successful.

It's interesting how a little money and technological finesse can have such far-reaching effects.


It's more like: letting users "legitimize" news is how you get bots screwing with everything. It may not be a problem for some, including myself, since my feed is similar to how you describe yours, but network-wide it will pose an issue.


So, your feed seems fine, my feed seems fine, but somehow the electorate at large is subject to mass mind manipulations by bots?


As a technologically literate individual who takes pains to protect myself online, yes, my feed is inherently more "fine" than the general population's.


Surely the problem is the lack of a dislike button?

I wonder if Reddit has as much fake news as Facebook?


Probably less completely ridiculous stories, but it does tend towards groupthink. Popularity is not a great gauge for truth.


> I wonder if Reddit has as much fake news as Facebook?

I think that /r/politics, /r/shitredditsays & /r/The_Donald might count.


I can't see why that would help.


Those counts are the rating by which people determine the legitimacy of content. Popular content must be correct by virtue of the fact that so many people can't be wrong. And when you only show people content that they already agree with, it further reinforces its legitimacy.

Remove the counts and people have no external validation for the legitimacy of a piece of content. They actually have to read it. What we have now is a system where someone sees a headline, “Proof Obama birth certificate fake!”, with 100k likes and 10k comments, and they don't even read it and just comment, “I knew it!”

When you pick up a newspaper you have to actually read it. You can’t just look at the headline and see how many people agree with it before you can form any kind of opinion.


What seems likely to me is that people share obviously fake news not because they see it has a lot of likes but because it confirms things they already believe.


What they see is often determined by those counts in the first place, and more often than not people decide to read/comment/share a post because it has a lot of likes/comments. Then they share it to get likes for themselves.

If the only incentive to read/share/comment was a genuine understanding or willingness to share and participate the entire space would be different.


This makes absolutely no sense to me.

For almost any story you can work out its popularity just by the number of comments. And you can estimate the comment count (+/- an order of magnitude) just by scrolling through the comments. The accurate number isn't important for most people.

None of this will prevent the dissemination of fake news. What will is (a) not prioritising it and (b) detecting whether it's fake. Facebook just added solutions to do both.


Remove the number of comments.

People comment on stories with the most comments and usually reply to the first set of posts because they want their post to be seen, accelerating commenting in that thread.

If you want to count comments, fine. Most people won't. FB actually makes this hard to do.

Would be fascinating to even try it here on HN. If you hit all the integer counters it would dramatically change the way people use the site.


I just went to a popular article by the New York Times on Facebook. Just by scrolling down through the pages of comments I could determine that it was likely to be a popular article.

Go ahead remove the number of comments. It won't solve the problem Facebook is trying to solve.


Yeah, but you actually had to do that. You didn’t get to just look at the headline and see how many people agreed with it before you even read it.

You would, of course, have to combine this with not just showing it to people who already hold those opinions.


   you can work out its popularity just by the number of comments
Popularity, yes. Accuracy, no.


That's the kicker though. For the vast majority of people popularity == accuracy.


> work out its popularity

Or controversy.


So nix free speech (and rely on unknown proprietary algorithms)? I don't think that's a good idea either. Much of the time, there's more value in the comments than you could gain from an article, video, etc. - as is the case here.


Nothing to do with limiting free speech. Quite the opposite.

Instead of showing people only content that they agree with, with counts that reinforce the legitimacy of that content, you would show people everything that is popular and let them determine for themselves what they think about it.

I’m not talking about removing the comments. I’m saying don’t show the count. If people want to read the comments they can.


Limiting free speech may not be your intention, but it's exactly what's going to happen when you remove transparency.

We have a similar situation in Germany right now. A private corporation designated by the government can delete Facebook posts as they see fit under the guise of hate speech. Facebook had to agree, because they'll be hit with fines if they don't.

What should be controlled by law is now in the hands of a corporation and they are quietly removing content that is clearly not illegal.

Removing transparency is a dream come true for corrupt institutions.


> So nix free speech

1. It's Facebook, not the government. 2. They aren't banning anything.

How is this a free speech issue?


Not OP, but free speech is an ideal, unlike the First Amendment, which applies to the government.


It's still not a free speech issue.

The question is how Facebook should order news sources/articles. They're suggesting changing their strategy. This isn't anything like censorship.



