Facebook fires human editors, algorithm immediately posts fake news (arstechnica.com)
31 points by CodeMage on Aug 29, 2016 | 8 comments


I can see how the logic around this went: there has been some negative press from folks who worked on the trending team, which probably pissed Facebook off. Theoretically, clicking on a trending topic would bring up the latest news articles posted and provide additional information/context.

The reality is that I am seeing a lot more tabloid-esque topics that seem like they might be relevant but turn out not to be once I click through.

Yesterday, a trending topic for me was #mcchicken, which I thought meant maybe McDonald's had released a new chicken sandwich or there was some type of food/health news. Instead, it was trending because some guy basically stuck his penis in a McChicken sandwich and posted a video of it. Previously, I would have expected human editors to either deem this unnewsworthy or at least provide a better description than the one-word #mcchicken, maybe something like #guyhassexwithmcchicken.

Anyways, I'm sure it will get better over time but I am missing the old module. I thought the descriptions were succinct and well written.


Maybe they should emphasize that it's merely a trending topic, like hashtags on Twitter. In some (well, many) cases, cutting a topic because it appears to be fake only fans the flames of the claim that there's a coverup of purportedly real news (because why else would they bother to cover it up, as that logic goes).


    Facebook explained that the new, non-human Trending module is 
    personalized "based on a number of factors, including Pages you’ve 
    liked, your location (e.g., home state sports news), the previous 
    trending topics with which you’ve interacted, and what is trending 
    across Facebook overall." 
I don't understand how putting their users in an algorithmic echo chamber is a good idea (not to suggest the manually curated trending news was good either). There is already enough distortion of the world caused by whatever flavors of nonsense are popular among a user's set of friends. Why add more?
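
To make that concern concrete, here is a rough sketch of what a trending score personalized along the factors Facebook lists might look like. This is purely hypothetical Python; every function name, field, and weight here is mine for illustration, not Facebook's actual code:

    # Hypothetical sketch of a personalized trending score built from the
    # factors Facebook names: Pages liked, location, past trending clicks,
    # and what is trending across Facebook overall. Names/weights invented.
    def personalized_score(topic, user, weights=(0.4, 0.2, 0.2, 0.2)):
        w_global, w_pages, w_loc, w_hist = weights
        page_affinity = (len(topic["related_pages"] & user["liked_pages"])
                         / max(len(topic["related_pages"]), 1))
        location_match = 1.0 if topic["region"] == user["home_state"] else 0.0
        past_clicks = min(user["topic_clicks"].get(topic["name"], 0) / 10.0, 1.0)
        return (w_global * topic["global_trend"]
                + w_pages * page_affinity
                + w_loc * location_match
                + w_hist * past_clicks)

    # Personal signals (Pages liked, location, past clicks) can outweigh what
    # is trending overall, which is exactly the echo-chamber concern.
    user = {"liked_pages": {"LocalSportsPage"}, "home_state": "OH",
            "topic_clicks": {"#mcchicken": 3}}
    topics = [
        {"name": "#mcchicken", "global_trend": 0.9,
         "related_pages": set(), "region": "US"},
        {"name": "Ohio State football", "global_trend": 0.3,
         "related_pages": {"LocalSportsPage"}, "region": "OH"},
    ]
    ranked = sorted(topics, key=lambda t: personalized_score(t, user), reverse=True)

Even in this toy version, the locally relevant topic a user already engages with outranks the topic that is merely trending everywhere, so the module keeps feeding people more of what they already see.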


  Why add more?
Same reason fast food sells. Exploitation of innate sensory pleasure. "I'm perfectly aligned with my friends and it feels great"


This article buries the fact that there are still engineers looking at the algorithm's results and making sure they are "tied to a current news event in the real world". According to Facebook, the editors mainly wrote summaries of the stories.

Here's the original post: http://newsroom.fb.com/news/2016/08/search-fyi-an-update-to-...


That seems to be the way things are going today for automated news. Google News is posting a headline directly from snopes.com if you search "North Dakota". The headline is racist in the extreme.

I think some folks need to rethink what counts as an actual news site, but I expect continued tweets claiming this is Facebook exacting some payback against conservatives. This stuff is very touchy.


Well, humans are often fooled by fake or biased news too. Some of it can be caught by simple verification or cross-checking against other sources; some requires deep research or specialized knowledge.

It's not easy to train an algorithm to outperform humans when complex context is required.


If they change the algorithm to filter out fake or biased news, they might alienate their user base.



