Hacker News
[flagged] Facebook must kill its news feed. Democracy depends on it (medium.com/tim.dick)
23 points by Johnny555 on Oct 5, 2017 | 19 comments



If democracy is threatened by a simple automated news feed then it's doomed to fail anyway.


The mob is fickle, and prone to learned helplessness, which invites corruption, bureaucracy and the fall of “Rome.”


Exactly. The next Facebook news feed will just fill the void, and people who blindly believe what they are exposed to will be in the same situation. Killing Facebook news feed is trying to treat the symptom while completely ignoring the disease.


Magnifying gossip is the issue. It’s almost as if we need a few thousand people, or an AI with really good domain knowledge and common sense, who can somehow handle moderating the onslaught of conspiracy theories, rumors, and general BS. That seems very difficult to scale on the tech side. We might be better off finding a path to building in-real-life communities who can educate each other and discuss such things civilly.


In theory I agree, but in practice people are much more emotional than we like to admit, and we come with loads of baggage in the form of preconceptions, presuppositions, and ideologies. We also have a tendency to be wrong about things sometimes (to err is human?), even in fields where we have expertise.

Could AI fix this for us? Maybe. AI isn't entirely free of the bias of its creators, though. As for solutions to magnified gossip and fake news, I don't really have an answer. Perhaps you are on the right track.


I think your other comment is spot-on: https://news.ycombinator.com/item?id=15413662


Based on the results of the last USA election, that appears to be the case.


Some of this technology is new and awareness among the general public is quite low. Give it some more time and people will learn to ignore most of this stuff (and probably leave Facebook, too). My guess is that within 2-5 years this fad of getting news from social media will fade.


Personally, I've started to view Facebook's feed (and others like it) as indistinguishable from tabloids, like the National Enquirer here in the States. Grandmother gives birth to triplets! Honeymooners abducted by aliens! What I read there is not to be trusted; it's not a reliable source. The same goes for Yahoo, MSN, etc. Local news is a little better, but still sensational and salacious. If I want news online, I go to Bloomberg, the BBC, etc.


You said it much better than I did :-) People will come to understand over time that this is tabloid news.


OK, so Facebook kills the home feed tomorrow. What replaces this?

Human-curated news? Either it'll be fast enough that people get sloppy with facts, or it'll be slow enough that people move to faster sources. Clickbait wins.

Another site? Most other sites (Google News, Quora, whatever) use clicks as an important measure of interestingness. Clickbait wins.

Broadcast TV? Have you seen what the networks do when the facts are scarce and the topic is urgent?

Eventually this all gets sorted. In the moment, fake news always wins, especially if there are organizations that are invested in producing it. That's fine. The trick is to help real, verifiable news and level-headed analysis propagate as it comes out as well.

Reputation solves this problem, but over the last 15 years we've collectively decided that reputation isn't that important. For ML / AI it's an open question with work in progress. There are probably other models, but we'd first need to change consumption norms to make them appealing.

Clickbait is always appealing, so it wins when people are trying to find new information before they start to sort through it.


Is this article meant to be serious? The news feed is Facebook's most profitable product, so the argument here amounts to: Facebook must kill itself for the sake of democracy. That isn't a solution. Beyond the fact that I don't think Zuck, FB employees, and their shareholders would like to see their multi-billion-dollar product upended, it's like trying to put the formula for a nuclear bomb back in Pandora's box. Are we to depend on an organization's moral code to protect society from the news feed's highly potent propaganda machine?


If the premise is accepted as true, then it shouldn't be Facebook's choice.


There’s no simple technical panacea for prevailing ignorance. Granted, people can have their sentiment adjusted by captological factors, but people need to dig into issues, do credible research, learn about history, and cultivate better BS detectors. Lazy, passive, clueless wallflowers do not a healthy society make.


You made me look up a new word :-)

> Captology or behavior design is the study of computers as persuasive technologies. This area of inquiry explores the overlapping space between persuasion in general (influence, motivation, behavior change, etc.) and computing technology. This includes the design, research, and program analysis of interactive computing products (such as the Web, desktop software, specialized devices, etc.) created for the purpose of changing people's attitudes or behaviors.

Source: https://en.wikipedia.org/wiki/Captology


The absurdity of the headline makes every following point impossible to take seriously. Saying dumb stuff has a cost.

It sounds like "facebook needs to start being about people you know, and stop being about everything else."

That's great advice IMO, but it's not what makes money, so... don't even try.


Isn't groupthink / spoon-feeding articles something that all feeds naturally do? E.g., Reddit suffers from similar problems, depending on which subreddits you follow. Does this mean we kill Reddit?


My mom and grandma (and the rest of my family) are not on reddit, so I don't care. But Facebook appeals to the masses with much greater reach, thus they can do much greater harm if they are serving biased news.


I agree with the premise, but there's no way it will ever happen.

Most troublingly, “trending news” accelerates news in what is known as a “feed forward” loop: i.e., the more we (and external bots and trolls) click, the more we will see an item. This reduces the visibility of other news, resulting in “groupthink” that shrinks our perspective. Trending news algorithms can be adjusted but will always create “observational bias,” in that we react most to what we see most.

This describes Reddit to a T. Definitely a feed forward loop if there ever was one.
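The feedback dynamic quoted above can be sketched with a toy rich-get-richer simulation (a hypothetical model for illustration, not Facebook's or Reddit's actual ranking code): each time an item is shown it tends to collect clicks, and clicks raise its future visibility, so a few early-lucky items come to dominate the feed.

```python
import random

def simulate_feed(n_items=10, n_rounds=1000, seed=42):
    """Toy rich-get-richer model of a trending feed.

    Each round, one item is shown with probability proportional
    to its accumulated clicks; being shown earns another click,
    which raises future visibility -- the "feed forward" loop.
    """
    rng = random.Random(seed)
    clicks = [1] * n_items  # seed every item with one click
    for _ in range(n_rounds):
        shown = rng.choices(range(n_items), weights=clicks)[0]
        clicks[shown] += 1  # the shown item gains visibility
    return clicks

counts = simulate_feed()
# The distribution is highly skewed: a handful of items soak up
# most of the exposure while the rest stay nearly invisible.
print(sorted(counts, reverse=True))
```

The skew comes purely from the feedback structure, not from any difference in item quality, which is the "observational bias" the quoted passage describes.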



