Facebook removed the news feed algorithm in an experiment (bigtechnology.substack.com)
156 points by WookieRushing 85 days ago | 139 comments



When people argue against the 'news feed algorithm', they mean anything other than a reverse-chronological feed of: posts from the people you follow, status updates, and interactions between people/pages you both follow.

This article mentions that they turned the news feed algorithm off, but people were still hiding posts from pages they don't follow that friends had commented on. Those posts shouldn't appear in an uncurated feed, since the user isn't following that page, and they are exactly one of the things people complain about with the algorithm.

They didn't test the algorithm vs. no algorithm; they tested the current algorithm vs. another algorithm.


Also, if the user base is conditioned to seeing things and posting things to serve a particular type of timeline, it's not easy for everyone to transition to a completely different method and for that method to stabilize/work quickly.

When Facebook started changing the timeline and breaking the chronological order of posts, it had a really strange effect on reality. Old news stories and posts were showing up many months late, and the feed was reminding people about pets that had died years before. Many people have forgotten that.

The best option would be to abandon the idea that one single news feed is best for everyone and give control back to users, along with an option for a truly chronological timeline. They should also make multiple pages that rank posts by taxonomy, so users can browse the content most liked by everyone on the platform.

The only reason Facebook wants singular timelines is so that they can push targeted ads without it becoming obvious to their user base. But if the taxonomy pages were titled and organized properly, the ads would be somewhat more relevant by nature and would not require invading everyone's privacy the way they have been doing thus far.


100% agree. In addition, look at their metrics. When it comes to "meaningful social interaction", calling Uncle Joey a stupid ass because he posted another semi-racist Obama meme is the same as telling Cousin Jane you like her baby pics.

I should HOPE "meaningful social interactions" go down with a reverse-chron, friends-only feed.


Aren't those both equally meaningless? Maybe this is why I don't participate in social media anymore XD


They might do sentiment analysis, in which case you'd have to put thinly veiled sarcasm in your reply to Uncle Joey to be counted.


> "Without a News Feed algorithm, engagement on Facebook drops significantly, people hide 50% more posts, content from Facebook Groups rises to the top, and — surprisingly — Facebook makes even more money from users scrolling through the News Feed."

Yes, it is confusing when they say they "turned off the algorithm", because it sounds like they are still using an algorithm here, just a far worse version of one, maybe an earlier version.

But if posts are "rising to the top" and "they saw double the amount of posts from public pages they don’t follow, often because friends commented on those pages", that still sounds like an algorithm ranking posts, just in a "worse" way.

Removing the algorithm to me would mean seeing posts in a reverse-chronological order as they happened. Everything would appear equally. Maybe some controls are given to users to hide certain types of items, such as 2nd degree pages (pages your friends follow and comment on, but you do not follow), group posts, and so on.

But as soon as you re-order posts, you are using an algorithm. It is very disingenuous to claim you removed an algorithm when all you really did was replace it with a worse one, and then to justify your actions because the worse algorithm performed worse (wow, shocking, I know).
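To make the distinction concrete, here is a minimal sketch in Python (hypothetical field names, not Facebook's actual code): the "no algorithm" feed everyone asks for is still technically an algorithm, it just sorts on recency instead of on a predicted-engagement score.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Post:
        author: str
        created_at: datetime
        predicted_engagement: float  # whatever score a ranking model would assign

    def reverse_chronological_feed(posts):
        # "No algorithm" as most people mean it: newest first, nothing else.
        return sorted(posts, key=lambda p: p.created_at, reverse=True)

    def ranked_feed(posts):
        # Any re-ordering beyond recency is already an algorithm making choices.
        return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)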


I think it's still legit to think of what they did as turning off the algorithmic filtering, while not turning off the algorithmic event generation. To be fair though, group and friend posts are things people explicitly subscribed to; it's only the friend comments that they didn't explicitly subscribe to. Which makes me wonder: did they do a version of the experiment without "people commented on X" items to see what happened?

I think the results would be somewhat similar though, because group activity will still dominate. Knowing a place like FB, they probably tried all the combinations to see what happened. I'm curious what those results are too.


It seems like they're using "algorithm" to mean "a more complicated means of calculating what to show you," and they're saying there is no algorithm involved when they use a simple means of determining what to show you. Which is of course also an algorithm -- when a person the user is friends with interacts with something, show that in the user's feed. But "algorithm" somehow means magic nowadays.


It seems Facebook is so far up their own ass that they actually think this "no algorithm" test they did is what people are clamoring for.


Except that it is exactly what people have been asking for:

Simple reverse chronological order feed with no ranking.


Did you even read the comment I replied to? BS like "we're throwing this post in your feed because one of your friends liked/commented on it" is not part of what people have been asking for, but that was still part of their "no algorithm" test.

All they did was get rid of the ranking, but that's only part of the issue.


I am some non-zero amount interested that a Facebook friend of mine liked or commented on some random post. I'm not as interested as I am in something they posted originally, but clearly non-zero interested in many of those.


That's perfectly fine. Me personally, I am not at all interested in it. But Facebook could very easily allow us to tailor our individual Newsfeeds to our liking, so you can have what you want and I can have what I want.

But that's not what Facebook wants, so that's not what we get.


I can't tell if they want us to believe none of their brilliant minds realize this, or that they deliberately wanted a test that would give them an excuse to keep things as is.


Which one is more likely? There’s your answer.


FBPurity is a browser plugin that lets you decide what to see on FB. You can decide to only see what friends post and not what they like, share, or comment on, etc.


Wow. Comparing with this addon and without is pretty jarring. I might actually start using the newsfeed again with this addon.

Holy cow does FB shove a bunch of shit into the feed.


Doesn't seem to work, won't filter out Sponsored posts despite the option being selected...


Facebook changes the way Spοnsored content is formatted on a moment to moment basis, sometimes in a way that breaks accessibility software, to foil exactly this kind of software, and ad-blockers.


I don't see sponsored posts but perhaps that's due to Ublock Origin, not FBPurity.


And as Twitter shows, it's not hard to offer both options and let users choose.

Twitter also implemented a timer so that, if you choose simple reverse chron, it forces you back into the ranking algorithm after a certain period of time.

Hmm, I wonder why they'd go to the trouble to do that? Maybe these social networks have motivations that override user experience?


FB isn't going to do anything that doesn't affect at least 30% of users. As always this is all about revenue.


I'm sure FB has done many, many things that affect less than 30% of users


OK, name something they added and actually kept in the current version of FB that is used by less than 30% of users over the life of their account and isn't there for regulatory reasons. They use way too much A/B testing for anything like that to survive longer than two years.


I have no algorithm again, at least until FB decides to break it:

https://news.ycombinator.com/item?id=29004489


Sadly, that link is not available on mbasic, the only version this computer can handle.


Expand your mind to "a reverse chronological list of actions taken by my friends"? I don't want to--and don't need to--miss out on my friends replying to stuff just because I don't want to have an opaque content filtering and recommendation algorithm curating my life.


Data wins arguments. One of my least favorite Facebook quotes. Bad data wins arguments, too.

The misdirection term here is 'news feed ranking algorithm'... what that means in the experiment versus what you might think it means is a huge difference. E.g. I think most would assume the algorithm is responsible for showing you an unknown post that a friend merely liked, but it's right there in the article as still happening.

The results and how people used it don't at all say to me that they enjoyed it less; they show explicit care and intention to curate their own feed by hiding what they don't like, which is how it should be. If they find the group posts too overwhelming, they can mute/unfollow/leave, or other methods of grouping posts can be explored.

But look, they did this one bad science experiment, and now it's taken as fact and becomes folklore.


You can say that explicit curation and intention is the way things "should be" as much as you like, but all the evidence suggests that most people strongly prefer automated curation.


Not all automated curation is built equal and boiling it down to just one thing really confuses things. I absolutely adore YouTube's automated curation since the primary goal there is just to steer me to things I'll find interesting - the ads are present on all content and so YT just wants to keep me on the platform for as long as possible.

When it comes to Facebook it always feels like I'm being steered towards topics that yield monetizable verbiage. If a friend likes an upcoming concert I'll definitely hear about it loud and clear, whereas an upcoming picnic or a personal project being planned is less likely to float to the top.


YT's curation keeps trying to shunt me off into neoconservative conspiracy theory videos, which I never click on but which occasionally auto-play if I've left the window in the background. About all I can figure I've done is watch some videos about military-themed multiplayer video games. I don't actually play these games, but sometimes the commentary is funny.

Facebook kept trying to sell me an Oculus Quest weeks after I had already bought one ("look at the metrics! 95% of people who saw the ad also bought the headset", "you're reading the graph backwards").

To me, it lays bare the myth of advertising analytics. All this data, all this tracking, and none of it is actually all that useful for the stated purpose. Makes one wonder if it's really all for fleecing advertisers or if it's to keep totalitarian regimes happy.


You can go into your YouTube watch history and delete videos that you don't want influencing your recommendations. I do this all the time because if I watch something out of the ordinary, my recommendations will be bombarded with similar stuff that I don't want to see. I just wanted to see that one video with the dog doing funny things, damn it!


YT's recommendations are infamous for getting people radicalized, drawn into conspiracy theories, and filled with misinformation.


YT recommendations are always something I'm at least modestly interested in, even if I wouldn't have known to search for them independently. I think YT is following the model clearly demanded by current American ethics, which suggests that the only thing worse than talking about what a nice guy Hitler was is stopping someone else from talking about what a nice guy Hitler was.

I do think that at some point YT, FB and everyone else (google even!) will have to reckon with radicalization - but I still think that YT's recommendations are quite a bit more valuable than FB.


I'm highly skeptical of those claims. This seems mostly like wild claims from legacy media that doesn't want you to watch streaming video.

Yes, I'm familiar with the relevant studies.


It literally happens to me every week.

It's easy to tell that it's happening because you will see obscure, random news/opinions channels with high view counts. Something you can only get when you at some point have been promoted by Youtube.


I watched a Youtube video about how birds aren't real, that the US government killed them all and replaced them with surveillance drones.

There is no doubt I watched that. Nor is there any doubt that it impacted my recommendations.

The question is, what is the likelihood of it having radicalized me into believing this? Is Youtube successfully recruiting people to genuinely believe birds aren't real/aliens/etc. on a wide scale? I do not believe that is the case. I'm not claiming it has zero impact on this. I'm saying I'm skeptical that the scale is meaningful, especially compared to legacy media.

What I know to be certain is that traditional/legacy/corporate media is constantly and successfully recruiting people into believing conspiracy theories on a broad scale like Iraq had WMDs and Trump was a Russian asset. And yet there is not serious talk about how the corporate media is "infamous for getting people radicalized."


Which legacy media outlet is not currently making a streaming migration effort, with enough sway to manipulate the narrative? I'm not aware of any major players not moving to SVOD or AVOD in the US.


All you need to know is Youtube, one of the biggest and most popular streaming services with young people, is literally turning your children into Nazis.


I've had a Google account for... 15+ years? and have never really interacted on YouTube, in large part because I don't want to give Google my data. Just recently though I've been getting more into hobbyist boardgaming and wanted to help out some of the little guys in this niche by giving them some likes and subscribes.

So, now it's feeding me incel videos. Maybe this just is giving me more info than I want about people who play board games. But wow, Google.


Which studies are funded by legacy media? There are undoubtedly way more that are not.

I’ve been fed all kinds of UFO and other strange videos by YT myself.


I believe you. The question is not whether you were fed conspiracy theories, it's how effective were they and "compared to what?" Because that's what's implied by these articles/studies. They claim "people are radicalized," which is very different than "people watched some UFO videos."

How many people watched Youtube videos and were radicalized into believing in flat earth, aliens, etc?

Compare that to how many people watched legacy media and were radicalized into believing that Iraq had nukes and Trump was a Russian agent?


YT continues to push Jordan Peterson videos on me even though I've downvoted them many times. Other friends report the same thing.

I think the algorithm has figured out that if it can get people into JP, there's a chance they go down into more extreme rabbit holes and thus become super-engaged. So it's worth it for the algorithm to keep trying to push JP even on people who don't seem interested at first.


Don't downvote. It's probably seen as "engagement".

Click the "..." on the recommendation and choose "Not Interested" - optionally you can specify you don't like the video.


I used to get a lot of "Jordan Peterson" recommendations too. A bit off topic, but I really don't like his take on many subjects. I can see that he's truly articulate, and it seems he mostly wins arguments by carefully crafting his phrases and forcing his view on less skilled communicators rather than by following a reasonable line of thought. I'm glad I don't receive recommendations of his videos anymore.


> he mostly wins arguments by carefully crafting his phrases and forcing his view on less skilled communicators

I think it's the other way round. When people interview Jordan Peterson, they use pre-prepared phrases and ascribe to him opinions and views he doesn't hold, e.g. the infamous Cathy Newman interview: "So what you're saying is".

This doesn't always happen e.g. the debate between Russell Brand and Jordan Peterson.

I don't think there would have been nearly such a controversy around his work, if he had not touched on the pronoun issue, which was about compelled speech, not even particularly about pronouns.

However that issue is a bit of a hornet's nest, and Peterson is by no means alone for being targeted, as also seen in the UK with Professor Kathleen Stock.


Youtube's algorithm for me is a perfect example of what a poor ranking algorithm looks like.

I subscribe to a number of mainstream, local news feeds as well as our government's daily COVID updates. And occasionally I will do a Google search for random terms. And yet somehow, at least a few times a week, I will be recommended some obscure news source, often indeed with a conspiracy or ultra-right-wing edge.

I very much sympathise with the challenge that Facebook's data scientists have to deal with. Incredibly hard problem to solve.


I wonder how much wagging the dog FB has ended up doing to everyday activities? Do people give more weight to doing things that have a better opportunity for more FB engagement? (Thinking more on this I don't think FB is the sole culprit here)


I think the trouble is trying to have it both ways: excusing the existence of a curation algorithm because "people prefer it" [0] while also refusing to take all the responsibilities normally associated with a publisher who curates and delivers content.

[0] Never mind the more obvious problem that "people prefer it" is not a great excuse for intentionally making a product as addictive as possible.


Giving users the option to toggle the algorithm on/off would be nice.


It used to be that way. Then they started "forgetting" everyone's setting to turn it off (go back to the original reverse time based feed), then they took the option away. After purchasing Instagram they did the same exact thing, FB engineering specifically removed the feature on their platforms.


The fiction of rational choice usually makes at least a performative nod towards informed consumers, actual choices, clear results, and consent. Which of those prerequisites does Facebook fulfill?


> all the evidence suggests that most people strongly prefer automated curation.

What evidence would that be?

I don't think having to actively force a feature on your audience is a good sign that they prefer it.


Platforms based on automated curation thriving while platforms based on manual curation wither. If Facebook gave people the opportunity to turn off the algorithmic feed, and they took it... they'd just spend all their time on TikTok instead.


It depends what you mean by “prefer.” I would prefer straight chronological even if I spend less time on it and click like fewer times.


Smokers strongly prefer cigarettes.


I think the comparison is unfair to cigarettes, which at least provide some fleeting pleasure and look cool while you're smoking them.


True, also cigarettes provide an amusing bogeyman to compare things to, whereas Facebook is no laughing matter.


Given they made MORE money in this experiment, I suspect the decision to roll it back was more nuanced than a naive analysis and bad science experiment.


The goal wasn't more money, there are more variables involved. One way to look at it might be, short-term more money vs long-term total narrative control.


Given that the algorithm is never going to be as effective as an editorial department, I don't understand the "total narrative control" argument.

Facebook and Instagram are social phase-locked loops. Open a new Instagram account, search for content that you like, like it, then watch how it shoves more/similar stuff into your feed.

Now pivot to a different subject, only like that, and watch how your feed moves to that subject more.

It's not really rocket science, or indeed anything overly complex.


Putting users into echo chambers is not narrative control?

They could have made the algorithm so it showed users brand new different material, or opposing material, or more details, or related material from friends instead of corporations, or from people geographically nearby, or chronological, or let users find their own material and/or build their own feeds.

Instead of any of that they show them similar material, like you said, effectively putting everyone in echo chambers.


But people don't like new material.

The top comment mentions people hiding posts from pages they don't follow - alternatively, that's called showing brand new content you haven't seen before.

Opposing material suggests a binary, sure that makes some sense in a US centric political spectrum, but what's the opposing viewpoint to my friends photo from hiking last weekend?

Geographically nearby could mean my neighbor 5 doors down who I don't know anything about and am not friends with. Do you think, if she were a privacy-concerned individual like many people on HN are, she'd be happy to know I saw her post about her new flowers?

It's less narrative control and more human condition - if anything I'm glad they tried this experiment. Chronological feeds are how you end up with news teams spamming posts every three minutes, and you having to hit the hide button every time.


> The top comment mentions people hiding posts from pages they don't follow - alternatively, that's called showing brand new content you haven't seen before.

These were people who had already been sorted into echo chambers. Turning off the algorithm sent them data they had already learned to hide. I'd be interested to see how this same experiment fared with completely new users. Maybe we'll run across a new tribe in the Amazon or something so we can try this out.


> Chronological feeds are how you end up with news teams spamming posts every three minutes

They do this anyway.


> Chronological feeds are how you end up with news teams spamming posts every three minutes, and you having to hit the hide button every time

Or you just stop following the spammy ones


But then like the article says you are spending all your time curating the experience.

People don't want Facebook to be a chore.


> so it showed users brand new different material, or opposing material

Brand new, yes, Facebook is biased toward new stuff. It's just that there is a high incentive to repost old shit, because spammers know it's effective.

Opposing material? no, that requires comprehensive understanding of the context of the share, the content of the material and the target audience's overton window.

> let users find their own material and/or build their own feeds.

Virtually nobody does this, or indeed wants to do it. What's more, it's very rare that anyone is any good at it (hence why people don't subscribe to news wire services).

Also, the research emphatically underscores this: people hunt more and are less engaged. It's more effort.


I didn't comment on what was best or what kept people engaged or happy, I made the point that facebook does have narrative control, they decide on the algorithm. They don't give users that option.


Opaque ML algorithm can't tip off the press to misdeeds the same way a disgruntled editor can.


I don’t remember what Facebook had instead of a news feed in 2008, but I do remember thinking Facebook was so cool then. When messenger came out, it was amazing. It was totally normal then to message random people who you thought were cool and just have a conversation. I would even get random chats from Facebook employees. It just seemed so different then than now.

I’m just trying to piece together the evolution of Facebook, feeds, and when I stopped caring. Like, I don’t think the feed was always like this. At one point there was nothing, sure, but there was also at one point a reverse chronologically sorted log of what your friends were doing, I think? That was the best. By the time my parents were on, I think there were a few years of overlap before I just forgot about it.


They had the news feed back then too, although it looked very different. The two major differences between FB and MySpace were the forced layout and that you didn't have to browse the site to see if there were updates on friends' walls.

The major difference now is what people are posting, and how tangentially related to your network the posts in your newsfeed are.

Back then celebrities and news media weren't part of the platform, so you didn't really have these major intersections in the graph. I also believe that you had to re-share in order to push a post to a node that isn't directly connected to the post's author. Today a like is enough.

The reason Facebook is uncool now is a mix of who the active users are and how much room and focus Facebook gives to links to news sites and to posts by people who aren't your friends.


This is exactly how I feel as an end-user over the same time period to the point that I don't even login anymore except for work. It's exhausting and it's no longer about the projected vision of "connecting people", it's about connecting people to content and other monetizable assets as a first-order priority in everything they do and touch.

This is why Facebook/IG is unrecoverable to me as a destination for connecting with the people I care about. Instead, it's become iMessage and I'm quite happy about that. No ads and the conversations/photos are a lot more authentic compared with social media.

I still love social media as a form (I think), it's just become more media and less social.


According to Wikipedia, Facebook had a news feed in 2008; it was added in 2006, before which it was necessary to view a user's profile to see their posts. The switch to an algorithmically generated, rather than chronological feed started in 2011.

Facebook's decline, for me was not due to the feed being algorithmic. I think it got better around that time; showing original content from people I like to interact with first is a positive experience. What's not positive is showing me most things other than original content from my friends.

I'm not sure why the change happened, but at some point it did. Most of what I see posted on Facebook now is not original content from my friends. I mostly don't want to see third-party content. The share button was there long before I noticed this trend, but people are using it a lot more. I just went and cataloged 50 algorithmically-chosen posts. Here's the distribution:

Shared third-party post or link: 24

Original text: 9

Original image: 9

Directly-posted image of third-party content: 3

Promotion of a physical product by a page I follow: 2

Promotion of media by a page I follow: 2

Promoting own event: 1


I have the strong suspicion that Facebook often decides first that it wants to show me some post, then does some graph-walking and invents a "reason" after the fact why that post was somehow related to my friends list.

At least that would explain why a video that some friend of a friend watched 3 days ago is suddenly at the top of my newsfeed.


I repeated the count with a chronological feed, which is possible to get in a desktop browser with facebook.com/?sk=h_chr

Shared third-party content: 16

Original text: 5

Promoting own event: 4

Original image/video: 11

Directly-posted image of third-party content: 0

Page promoting product: 1

Page promoting media: 5

Group activity: 8

What would really make Facebook better for me is an algorithm that prefers original content. It might not be enough if that was something I could enable myself because what gets interaction from others affects what people post.


For the record, the turning point was 2016, when basically every single sleeper cell “friend” was activated by mass agitprop, leading to a frenzy of political activity, and in the process suffocating everything else valuable of human attention.


Yeah, and you could see it coming. I remember joking pre-2016 about the upcoming election, wondering how many of my friends would start unfriending each other. I had no idea how big a change was actually coming.

I wonder why it didn't happen in 2012. I remember ACA arguments on Facebook but while they were contentious, they were generally value-driven and not based off of batshit lies.


Politicians are generally an older crowd. Obama got a lot of credit in 2008 for twitter use, wherein that use was basically tweeting campaign statements and updates.

It was probably 2016 by the time that politicians realised they could use social media to whip up such strong feelings to maybe benefit their campaigns.

People certainly disagreed with each other online in 2012 (I remember reddit had subreddits dedicated to complaining about how much pro Ron Paul content was posted in the mainstream subreddits), but I think it's the active engagement of the politicians themselves that turbo charged this.


> I wonder why it didn't happen in 2012. I remember ACA arguments on Facebook but while they were contentious, they were generally value-driven and not based off of batshit lies

Both candidates in 2012 went out of their way to maintain civility to each other publicly. I think when a candidate treats their opponent with respect, the followers tend to follow mostly in suit, but if they don't, it kind of just opens the floodgates, and once that happens, there's no going back.


> Both candidates in 2012 went out of their way to maintain civility to each other publicly.

When George W. Bush was running for governor of Texas against Ann Richards in the late 90s (before anyone but us was on the internet), Karl Rove distributed printed pamphlets, particularly in churches in East Texas among evangelicals, talking about her secret lesbian lover. She's not a lesbian.

And of course when Hillary Clinton lost in 2016 she invented (or paid someone else to invent, more accurately) this ridiculous Russian conspiracy narrative that persists because press outlets affiliated with her party continue to amplify it.

In 2000, the aforementioned GW Bush's party had the chief justice of the SCOTUS (who incidentally got that job despite living in an AZ neighborhood back in the 60s deed restricted to whites only and organizing a sort of election day mob that would physically confront non-white voters standing in line to vote) stop counting ballots to ensure that Bush won.

Before Reagan appointed Rehnquist chief justice, when he was campaigning against Carter in 1979/80, he was giving speeches at notorious lynching sites around the former confederate states and talking about the "oppression of the IRS" (this was shortly after the IRS had stripped Bob Jones University of its non-profit status during Carter's tenure for refusing to admit black students).

Civility is the anomaly, not the trend.

I'm all for criticizing Facebook for what Facebook does wrong but it's just a mirror, it doesn't have any original content on it. The same can be said of political candidates. If their message lacks resonance with what would-be voters believe already, no one will repeat it.


Before I deleted Facebook, I remember specifically around 2014-15 was when things started to change significantly on newsfeeds. It was less about keeping up with friends and more about advertisements.


Maybe we were just younger and had friends who did cooler things?


Definitely part of it, also we were more naive and willing to share more of our lives online. It’s why youth social networks seem alive and others seem to be decaying.


The younger cooler people aren't on Facebook


I shudder to think about a system where random people could message me.

On Quora, I just turned off that feature. Almost every single message I got was "hi", from somebody who was trying to either sell me crypto or catfish me.

Maybe there's a period when a new open messaging system opens you up to just fun new people, but when it grows, spammers and scammers will follow. Glad you enjoyed Facebook before everybody got to enjoy Facebook, but most people never saw it like that.


> I shudder to think about a system where random people could message me.

Do you not have a phone number that anyone can call? Or an e-mail address that anyone can send to? Or have you used a platform like IRC that allows anyone to send you direct messages?

> Glad you enjoyed Facebook before everybody got to enjoy Facebook, but most people never saw it like that

Facebook messenger isn't overrun by spammers. I've only used the messenger a handful of times but IIRC it wasn't hard to tell the difference between messages from friends and requests from people I wasn't friends with.

Spam detection also isn't terribly difficult at scale. Spammers need to message thousands or more accounts to even have a chance at converting someone, which is so far away from the normal use patterns of a real user that it's easy to flag.
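As a rough illustration of that point (a toy heuristic with made-up thresholds, not any platform's actual spam detection), flagging accounts whose distinct-recipient volume is far outside normal use is a simple aggregation:

    from collections import defaultdict
    from datetime import datetime, timedelta

    # Made-up thresholds: a real user rarely messages hundreds of distinct
    # strangers in a single day, so a distinct-recipient count over a sliding
    # window separates spam-like behaviour from normal use.
    WINDOW = timedelta(days=1)
    MAX_DISTINCT_RECIPIENTS = 200

    def flag_spammers(messages, now=None):
        """messages: iterable of (sender, recipient, sent_at) tuples."""
        now = now or datetime.utcnow()
        recent = defaultdict(set)
        for sender, recipient, sent_at in messages:
            if now - sent_at <= WINDOW:
                recent[sender].add(recipient)
        return {sender for sender, recipients in recent.items()
                if len(recipients) > MAX_DISTINCT_RECIPIENTS}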


Yeah, I have a phone number, and I don't pick it up if I don't recognize it. I don't know you, I probably don't want random communication from you.

I am pleased with the state of spam detection for things that implement it. Google does a good job, both on my phone (Pixel) and my email. Quora does not, and I resent it; it makes the site worse, so I turned it off.

If Facebook were letting strangers talk to me, I'd probably stop using it.


You just need a high enough barrier to entry or real consequences to deter malicious behavior. Spam & scams work because there's no downside; if they had to pay a fee to open an account every time they get banned for spam they'll move on very quickly.


Spammers took over!


Have a chronological time-series newsfeed only from friends. Once you have caught up, show the recommended/algo posts.

+

Facebook should also allow you to filter and show posts using a calendar: all posts from X date to Y date, from a given location or set of friends (a rough sketch of this follows below). It would actually improve usage.

Miss the old photo sharing days.
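A minimal sketch of what such a calendar/location filter could look like (hypothetical field names; this is not an existing Facebook feature):

    from dataclasses import dataclass
    from datetime import date
    from typing import Iterable, List, Optional

    @dataclass
    class Post:
        author: str
        posted_on: date
        location: Optional[str] = None

    def filter_posts(posts: Iterable[Post], start: date, end: date,
                     friends: Optional[set] = None,
                     location: Optional[str] = None) -> List[Post]:
        # All posts from `start` to `end`, optionally narrowed by friends/location.
        matches = []
        for p in posts:
            if not (start <= p.posted_on <= end):
                continue
            if friends is not None and p.author not in friends:
                continue
            if location is not None and p.location != location:
                continue
            matches.append(p)
        return matches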


The last year or so I was on Facebook I had basically hidden or otherwise removed anything that wasn't posted by a friend directly. It would take me around five minutes or less to scroll through everything once a week.

The problem is that if Facebook implemented this site-wide, engagement would drop dramatically. No one really posts to Facebook anymore; people love it when you do, because it's actually kinda novel.


I've started doing that, but the problem is Facebook will prefer to hide things my friends are doing and show me more of the garbage others are sharing. Hiding everything someone shares makes things better, but I've been doing that for a month and still see tons of posts that people have shared. (Some people share 40 things per day; they need to get a life.)

Facebook doesn't actually hide things when you set them to show none from X. I see many of those things even after I hide all of them.


FBPurity is a browser plugin that lets you hide the junk you don't want to see on FB, set the newsfeed to chronological. I have mine set to only show what my friends post, but not the things they like or share or comment on.


This would make them hugely less profitable, and therefore would never happen.


but, as the post points out, you get lots of stuff people don't like (like group posts, or people abusing public figures, or arguments)

so whilst in theory it sounds good, in practice, not so much


The article seems to miss the point of the experiment. It wasn't an actual proposal to turn off the news feed for everybody. The purpose was to validate the news feed, which it did. So FB did not "give up". The experiment was a success.


While Facebook has created an addictive and destructive engagement machine, I'm really unimpressed with the contrarian response - as if manual follow/curation & reverse chrono is the only alternative.

While transparency and control of the "ranking algorithm" is important, I reject the implication that infiniscroll content feed is the best that social media can ever be.

Facebook (and others) have decided that discovery, chores, social upkeep, political activism, learning etc all belong in One Big Recommendation System. This is entirely new territory from a psychological perspective: Throughout history, humans have focused largely on one mental task at a time. You don't read Dostoevsky while having a drink with your friends, or answer emails while watching TV. Mental multitasking quickly converges on the low effort/instant gratification task. Add your favorite attention disorder to amplify this problem to the point of surrender.

Any meaningful alternative needs to account for these psychological biases. Although this is new territory, we've been through this before - with porn. Basically, if you mix porn with other content, then porn becomes predominant. Hence, any non-porn platform needs to ban porn. Reddit is an interesting counter-example, because their subreddit barriers work pretty damn well, so even though there is porn on the platform, it hasn't infected everything else.

Similar to the porn issue, we'll never extinguish celebrity clickbait, conspiracy theories and racist uncle posting memes. However, we might be able to partition it. How about designing tech which makes it POSSIBLE to avoid "crap", at least temporarily, without an outright boycott, careful curation or browser extensions?


I'm under the impression that people are getting irrationally afraid of "algorithms", as if some evil AI is controlling people's minds.

People want to read the stuff they are interested in. No algorithm is forcing people to watch Fox News, for instance.

That being said, FB should give the option to disable the news feed algorithm (or have several versions to choose from, maybe), if that makes people happy.


>People want to read the stuff they are interested in. No algorithm is forcing people to watch Fox News, for instance.

I think a mistake is thinking of people as static sets of tastes and interests. No one is going to be fed news that is a big deviation from their current worldview, but small deltas pile up over time.


Twitter finally returned the option to make the feed chronological again and I’m starting to like Twitter a lot more again. It was way more political and contentious with the smart feed on.

FB should explore these options more. It sounds like there has to be more done here than just turning off the smart feed entirely.


Algorithms are tuned by human beings that are motivated by profit. Algorithms are not computer magic that show people "stuff they are interested in", they're designed to put companies' partners' content in front of eyeballs for money for long periods of time.


Engagement loops do control people's minds. They explicitly aim to maximize the amount of time people spend on a platform. They are inherently manipulative by definition.

It's common when interacting with an engagement loop to experience a loss of agency (perceived, at least). People don't like that feeling, even when they like the content they are being shown.

Machines can be built to exploit odd elements of human psychology. I strongly recommend the book Addiction by Design.


I think "control" is too strong of a word. I'm sure they can influence behavior but other things do influence behavior (advertising, medias, TV, sugar) and we're not excessively worried about them.


I think it's exactly what they do. Control doesn't have to be total. People lying in bed doomscrolling for hours are being controlled, even if just partially, by an engagement loop.

Engagement loops, like the other things you listed, are not inherently "good" or "evil". What organizations use them for may be good or evil, but that's a separate discussion.


I dread that even if it were stripped from Facebook tomorrow, the damage is done and the cat is out of the bag; another platform would just copy it and rise to the top of the pile that is "social" media...


Not if the other platform is decentralized and doesn't rely on advertising money.


>Not if the other platform is decentralized...

Like all of the other decentralized social media platforms that most people just don't care about? I'd argue that social media has become so polarized that, no matter their views, most people will want some sort of centralized body setting rules that act in their (the user's) interests.

>... and doesn't rely on advertising money.

People have widely adopted social media because they're not charged anything (aside from their privacy being invaded) to use it. How is such a platform staying online without charging users or using advertiser money?


> centralized body setting rules that act in their (the user's) interests.

Not sure it's possible. Only if you (or your friend) are the admin, you will like the rules.

> How is such a platform staying online without charging users or using advertiser money?

It's much more secure and reliable (long-term) to have a sustainable small server for friends than to rely on a huge anti-user for-profit company. One could also create a paid service. For example, I am using a paid email-provider.


A new social media platform taking off through ad revenue is enough of a lottery, let alone through a paid service.

Even if it’s as low as $1 a month, you’re not gonna get average users to join a new social media platform in the name of politics.

People want to connect with their friends, that’s all they ever wanted out of these sites.


I suggest that only some servers are paid. Others can be run by yourself or your friends. And they can federate. It definitely works with email, even though a large part of the network is from Google. All we need is a federated protocol (which AFAIK is already even required by GDPR).


That is true, I’ve looked at Twitter and went “how is something so basic in functionality not an open standard?”.

Although it’s not always as intuitive as email. IRC could’ve extended into something like Discord had it not been so programmer centric.

Either way, these people have us by the balls over already established syndication capabilities like RSS, just weighted in their direction(s).


> I’ve looked at Twitter and went “how is something so basic in functionality not an open standard?”.

Mastodon (Activity Pub) is basically the open standard you are looking for here.


I know, and I've seen it. A lot of the major publishers like CNN are loosely mirrored and far less consistent.

With enough man power, you can recreate anything, but without a strong userbase, well...

And don't even get me started on a name like "Mastodon", lol.


> a federated protocol (which AFAIK is already even required by GDPR).

You might be thinking of the GDPR's Right to Data Portability[0] which includes "the right to transmit those data to another controller ... by automated means".

This should require Facebook to synch the posts you make on their website to another account you hold on a Fediverse node, but unfortunately it doesn't require Facebook to synch content that you can see (but didn't produce) from their website to that node.

[0] https://gdpr-info.eu/art-20-gdpr/


Thank you, I stand corrected. Still, a good step in the right direction.


>How is such a platform staying online without charging users or using advertiser money?

The resources required to provide the world with the valuable parts of these services are small enough to be funded through non-profits or benefit corporations. I'd gladly donate my time and money to a 501c3 facebook killer.


The damage is done by the psychological manipulation used to keep people engaged with the platform, not the centralization or advertising.

Platforms that eschew psychological manipulation don't grow as large. That is the proverbial cat.


But if it were centralized, convenient, and had shareholders and an advertising budget to get people's attention faster than the decentralized one, then it'll suck up the advertising money.

People are exchanging time for money. They will do this.


Imagine being able to choose your own algorithm. I’d want reverse chronological most of the time. And occasionally I might check out some of the other AI powered ones too. Imagine even having that choice at all.


What I really want is first-party posts only. If you didn't write the status or take the picture yourself, then I don't want to see it.


This is hilarious (what Facebook calls meaningful!). I know people where back and forth comments with them are a lot less than meaningful or desired.

"Meaningful Social Interactions — the back and forth comments between friends that Facebook optimizes for."


> Even though the researcher kept the “integrity pass” in place, or the first layer of the algorithm that sorts for integrity ahead of engagement, they said that “integrity bad metrics still shot through the roof.”

Not very interesting without understanding these algorithms. "Integrity pass" seems a good marketing name, but at least this article does not explain how it works. It would be like reviewing the Boeing 737 MAX and only knowing MCAS as "that thingy that makes the plane easy to fly" instead of http://www.b737.org.uk/mcas.htm


I haven't been following this story -- has Frances Haugen actually revealed any fraud or illegal activities? This disclosure, while interesting, seems to fall squarely under material that should be kept confidential. As an outsider looking in, I'm wondering if she joined Facebook solely to collect a bunch of material she could disclose to the public, which seems unethical.


"Their findings: Without a News Feed algorithm, [...] Facebook makes even more money from users scrolling through the News Feed."

The obvious conclusion one can draw is that the product is primarily engineered to control users' information intake, and only secondarily to make money.


That was the worst thing to happen to Facebook. Before the newsfeed, people barely talked about ridiculous political standpoints and it was more engaging. Sure, people sometimes posted a thing or two on their own, but now everyone is an expert about things they read from for-profit news conglomerates. It's terrible for people with anxiety and PTSD, because Facebook was one of the main places people met, so turning it off is like going into total isolation for some. But going on the site definitely increased anxiety and traumatic triggers in people who would otherwise avoid the news.


This article seems very misleading and honestly doesn't really analyse Facebook's actual experiment (which seems pretty flawed, as other HN comments have already pointed out). Pretty disappointing.


There used to be a secret url param to only show friends posts, no pages or groups. It was heavenly.


FBPurity is a browser plugin that lets you do those things.


I would want something that works on my phone without having to deal with the mobile browser. I would say the great majority of Facebook usage is via phone, through the official Facebook app.


I don't have the FB app on my phone. After blocking all the crap with FBPurity, it only takes a minute or two to check FB each day so there is no need to have it on my phone.


> Congress may strip Facebook’s legal protections for the content it amplifies.

How so?


Please give me the damn option to turn off the news feed.


The title seems like Facebook bashing, but the article is more even-keeled. I suggest that "Then it gave up" be removed from the post title.


That would be editorialising, since that's in the actual article.

> Otherwise please use the original title, unless it is misleading or linkbait; don't editorialize. https://news.ycombinator.com/newsguidelines.html

Edit: I don't mind being downvoted, but I would really like to know why that's the case so I can improve how I contribute to the forum.


I'm with you, seems like the title should've stayed. If the framing was a problem, it should be prefixed by who is making the claim.


For the most part it's correct; they ended the experiment. However, it would probably communicate better to say "Then had to turn back." That conveys what happened, in that the "bad" stuff we attribute to the algorithm still managed to grow without it, and normal engagement went down.


Just delete your Facebook account.


Facebook is getting ever more desperate for clicks and interactions. More and more of my notifications say they just happened or happened 20 minutes ago, then I click and find it was a post from the day before. How stupid, since most of the time it is a repeat post I have already purposely passed on.



