
Facebook executives shut down efforts to make the site less divisive - longdefeat
https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499
======
guy_ro
Hey, Facebook VP of Integrity here (I work on this stuff).

This WSJ story cites old research and falsely suggests we aren’t invested in
fighting polarization. The reality is we didn’t adopt some of the product
suggestions cited because we pursued alternatives we believed were more
effective. What’s undeniable is we’ve made significant changes to the way FB
works to improve the integrity of our products, such as fundamentally changing
News Feed ranking to favor content from friends and family over public content
(even if this meant people would use our products less). We reduce the
distribution of posts that use divisive and polarizing tactics like clickbait
and engagement bait, and we’ve become more restrictive about the types of
Groups we recommend to people.

We come to these decisions through rigorous debate where we look at all angles
of how our decisions will affect people in different parts of the world - from
those with millions of followers to regular people who might not otherwise
have a place to be heard. There’s a baseline expectation of the amount of
rigor and diligence we apply to new products and it should be expected that
we’d regularly evaluate to ensure that our products are as effective as they
can be.

We get criticism from all sides of any decision, and it motivates us to look
at research, our own and external, and to analyze and pressure-test our
principles about where we do and don't draw lines on speech. We continue to do
and fund
research on misinformation and polarization to better understand the impact of
our products; in February we announced an additional $2M in funding for
independent research on this topic (e.g.
[https://research.fb.com/blog/2020/02/facebook-misinformation-polarization-rfp-two-million-dollar-commitment/](https://research.fb.com/blog/2020/02/facebook-misinformation-polarization-rfp-two-million-dollar-commitment/)).

Criticism and scrutiny are always welcome, but using cherry-picked examples to
try and negatively portray our intentions is unfortunate.

~~~
abraae
Just to cherry-pick from your reply here, if $2M is the biggest ticket item
you have to show for independent research on this topic - then you're woefully
short given Facebook's revenues and size.

10 short years ago, nobody could have imagined that huge swathes of the
population could have been swayed to accept non-scientific statements as fact
because of social media. Now we're struggling to deal with existential threats
like climate change because a lot of people get their worldview from Facebook.
Algorithms have decided that they fall on one side of the polarization divide
and should receive a powerful dose of fake science and denialism ... all
because of clicks and engagement.

~~~
guy_ro
A counter-point to this is that studies show polarization has also fallen in
some countries over the past years - including ones where social media
(Facebook or otherwise) is popular. Studies also show some of the most
polarized segments in the US to be the older population, which uses social
media less. We definitely have work to do, but this suggests there are many
factors at play.

~~~
mola
You have an interest in the outcome of the research, why should we trust you
to conduct or fund it properly? Your track record is terrible. The fact you
are throwing around dubious research as facts is crazy.

I'm sorry, but really, there's no reason to believe a word you say.

You get paid by Facebook, and Facebook has lied again and again; how on earth
do you expect people to take you seriously?

Your job and title exist to give FB the optics of caring about integrity, and
the role was invented as part of FB PR in the aftermath of the Christchurch
massacre.

The amount of cynicism is just mind boggling.

~~~
rat9988
>why should we trust you to conduct or fund it properly?

You are welcome to conduct research or fund it. You can even help current
research by critiquing its substance, or engage yourself politically (at
least by voting) so we can have more and better research on the topic.

~~~
mola
I do, but that's irrelevant to the fact that corporate-funded research is used
to muddy the waters and make reality even less accessible than it is.

It's not the lack of research that's a problem, it's the weaponization of
research.

~~~
thu2111
Outside of actual weapons research, research can't be "weaponised" and
academia is rife with incredibly strong political biases. Political bias in
academia is so severe that there is actually an entire foundation devoted to
trying to combat it (the Heterodox Academy).

Your posts sound like you believe corporations shouldn't ever do research and
worse, that academics don't have any interest in the outcomes of their own
work. But that's nonsense, of course they do. They want to publish papers,
they want their research findings to be novel and widely cited, they want to
build a reputation. They have all kinds of self-interested incentives that act
against producing accurate research findings; hence the replication crisis!

------
fpgaminer
You know what the internet needs? User agents.

We've got this idea stuck in our heads that only the website itself is allowed
to curate content. Only Facebook gets to decide which Facebook posts to show
us.

What if, instead, you had a personal AI that read every Facebook post and then
decided what to show you. Trained on your own preferences, under your control,
with whatever settings you like.

Instead of being tuned to line the pockets of Facebook, the AI is an agent of
your own choosing. Maybe you want it to actually _reduce_ engagement after an
hour of mindless browsing.

And not just for Facebook, but every website. Twitter, Instagram, etc. Even
websites like Reddit, which are "user moderated", are still ultimately run by
Reddit's algorithm and could instead be curated by _your_ agent.

I don't know. Maybe that will just make the echo chambers worse. But can it
possibly make them worse than they already are? Are we really saying that an
agent built by us, for us, will be worse than an agent built by Facebook for
Facebook?

And isn't that how the internet used to be? Back when the scale of the
internet wasn't so vast, people just ... skimmed everything themselves and
decided what to engage with. So what I'm really driving at is some way to
scale that up to what the internet has since become. Some way to build a tiny
AI version of yourself that goes out and crawls the internet in ways that you
personally can't, and return to you the things you would have wanted to engage
with had it been possible for you to read all 1 trillion internet comments per
minute.
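The idea above can be sketched concretely. This is a minimal, hypothetical illustration (the `Post` and `FeedAgent` names and the keyword-weight scoring are invented for the example; a real agent would use a learned preference model and an interface into each site):

```python
# Minimal sketch of a personal "user agent" that re-ranks a feed on the
# user's behalf. Everything here is a toy stand-in for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    is_ad: bool = False

class FeedAgent:
    def __init__(self, keyword_weights, max_posts=10):
        # User-controlled settings: topic weights and a browsing budget.
        self.keyword_weights = keyword_weights
        self.max_posts = max_posts

    def score(self, post):
        if post.is_ad:
            return float("-inf")  # the user, not the platform, demotes ads
        words = post.text.lower().split()
        return sum(self.keyword_weights.get(w, 0) for w in words)

    def curate(self, feed):
        ranked = sorted(feed, key=self.score, reverse=True)
        # Enforce the user's own engagement limit, not the platform's.
        return [p for p in ranked if self.score(p) > 0][: self.max_posts]

agent = FeedAgent({"woodworking": 2, "outrage": -3}, max_posts=5)
feed = [
    Post("friend", "new woodworking project"),
    Post("page", "outrage outrage outrage"),
    Post("brand", "buy now", is_ad=True),
]
print([p.author for p in agent.curate(feed)])  # -> ['friend']
```

The point of the sketch is where the ranking logic lives: the weights, the ad demotion, and the `max_posts` budget are all set by the user rather than tuned by the platform for engagement.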

~~~
munificent
The fundamental flaw is this:

The primary content that no user wants to see, and that every user agent
would filter out, is _ads_. Since ads are the primary way sites stay in
business, sites are obligated to fight against user agents or other
intermediary systems.

The ultimate problem is that Facebook doesn't want to show you good, enriching
content from your friends and family. They want to show you ads. The good
content is just a necessary evil to make you tolerate looking at ads. Every
time you upload some adorable photo of your baby for your friends to ooh and
aah over, you're giving Facebook free bait that they then use to trap your
friends into looking at ads.

~~~
pwdisswordfish2
"The ultimate problem is that Facebook doesn't want to show you good,
enriching content from your friends and family."

Well, it is someone else's website. What do you expect? Zuckerberg has his own
interests in mind.

In 2020, it is still too difficult for everyone to set up their own website,
so they settle for a page on someone else's.

If exchanging content with friends and family (not the swaths of the public
who visit Facebook - hello, advertisers) is the ultimate goal, then there are
more efficient ways to do that without using Zuckerberg's website.

The challenge is to make those easier to set up.

For example, if each group of friends and family were on the same small
overlay network they set up themselves, connecting to each other peer-to-peer,
it would be much more difficult for advertisers to reach them. Every group of
friends and family would be on a different network, instead of every group all
using the same third-party, public website on the same network, the internet.

Naysayers will point to the difficulty of setting up such networks. No one
outside of salaried programmers paid to do it even wants to attempt to write
"user agents" today, because the "standard", a ridiculously large set of
"features" (most of which benefit advertisers, not users), is far too complex.
What happens when we simplify the "standard"? As an analogy, look at how much
easier it is to set up Wireguard, software written more or less by one person,
than it is to set up OpenVPN.

~~~
seph-reed
Why are you defending Zuckerberg for being a dick? If you have power, you have
responsibility: full fucking stop.

The idea that it's okay to be a selfish child with power is tantamount to
allowing drunk driving. Power is deadly: you can crush a person's life just as
easily as you could their legs with a car. Don't drive drunk, and don't be in
power if you can't be a responsible citizen about it.

~~~
matz1
The poster isn't defending Zuckerberg, merely trying to explain why
Zuckerberg did what he did.

>If you have power, you have responsibility: full fucking stop.

No, if you have power you get to decide the rules.

Same with drunk driving: the reason we don't allow driving while drunk is
that the people who are against drunk driving have more power than the people
who are for it. The side with more power gets to decide what the law is.

~~~
seph-reed
You're confusing legal responsibility with responsibility proper.

Dictators are responsible for the death of millions, even if they make laws
that say otherwise.

~~~
matz1
Responsibility proper is subjective.

>Dictators are responsible for the death of millions.

A dictator may well believe that his actions are the responsible ones
(according to him).

My point is you can't simply ask them to stop by telling them they have a
responsibility. Both sides may view responsibility differently.

------
supernova87a
I really didn't realize until perhaps the last two years that Facebook
fundamentally tapped some hidden human need/instinct to argue with people who
they believe are incorrect - specifically, and more importantly, combined with
the human inability to actively decide to _not_ pay attention when things are
inconsequential or not yet worth arguing about.

Sometimes, just shutting up about an issue and not discussing it is the best
thing for a group to do. _Not_ more advocacy or argument. Time heals many
things. No app is going to help you take that approach -- and that's not what
technology is going to help solve (or is incentivized to solve). Just like
telling a TV station that's on 24 hours to _not_ cover a small house fire when
there's no other news.

People are not good at disengaging from something when that's the right thing
to calm the situation. And Facebook somehow tapped into that human behavior
and (inadvertently or purposefully) fueled so many things that have caused our
country (and others) to get derailed from actual progress.

There is no vaccine yet for this.

And not to dump on the Facebook train, since others would have come to do it
instead. But they sure made a science and business of it.

~~~
derg
In general and not necessarily related to just facebook, but one of the best
things I've come to learn about myself and the world around me is that
sometimes the absolute _best_ thing you can do for yourself is to just shut up
and walk away, even if you know in your heart of hearts that you are correct.

~~~
brewdad
I've gotten better, over time, at typing my heated, emotional response to
someone online and then hitting Cancel or the back button and not posting.

Sometimes I do hit the Reply button though. I still have room for
self-improvement. :-)

~~~
dghughes
I've been doing that more recently too. I ask myself: do I really want to do
this? Why am I getting involved in this? Especially on Twitter, where you
can't remove yourself from a conversation.

My new mantra is the saying, "not my circus, not my monkeys".

------
ENOTTY
Here's the paragraph I found most damning. It would make me want to assign
liability to Facebook.

> The high number of extremist groups was concerning, the presentation says.
> Worse was Facebook’s realization that its algorithms were responsible for
> their growth. The 2016 presentation states that “64% of all extremist group
> joins are due to our recommendation tools” and that most of the activity
> came from the platform’s “Groups You Should Join” and “Discover” algorithms:
> “Our recommendation systems grow the problem.”

~~~
tantalor
You are surprised? Here's the mission statement:

> Facebook's mission is to give people the power to build community and bring
> the world closer together. People use Facebook to stay connected with
> friends and family, to discover what's going on in the world, and to share
> and express what matters to them.

Encouraging group communication is the primary goal, regardless of the
consequences.

~~~
ENOTTY
It’s one thing to enable people to seek out extremist communities on their
own. It’s quite another to build recommendation systems that push people
towards these communities. That’s putting a thumb on the scale and that’s
entirely Facebook’s doing.

This is one example, and quite possibly a poor one since it is partisan:
Reddit allows the The_Donald subreddit to remain open, but it has been
delisted from search, the front page, and Reddit's recommendation systems.

------
lmilcin
The problem really is platforms that feed people content to please them. An
algorithm selects content that you are likely to agree with or that you have
shown previous interest in. This only reinforces people's existing beliefs,
and that leads to polarization.

For example, when I browse videos on YouTube I will only get Democratic
content (even though I am from Poland). It seems that as soon as you click on
a couple of entries you get classified, and from then on you will only be
shown videos that are agreeable to you. That means lots of Stephen Colbert and
no Fox News.

My friend is deeply Republican, and she will not see any Democratic content
in her suggestions.

The problem runs so deep that it is difficult to find new things even if I
want to. I maintain another browser where I am logged out, to get a more
varied selection and not just the couple of topics I have been interested in
recently.

My point of view on this: it is a disaster of gigantic proportions. People
need to be exposed to conflicting views to be able to make their own
decisions.

~~~
dang
Sorry for the self-reference outside of a moderation context, but I wrote what
turned into an entire essay about this last night:
[https://news.ycombinator.com/item?id=23308098](https://news.ycombinator.com/item?id=23308098).
It's about how this plays out specifically on HN.

Short version: it's because this place _is_ less divisive that it _feels_ more
divisive. HN is probably the least divisive community of its size and scope on
the internet (if there are others, I'd like to know which they are), and
precisely because of this, many people feel that it's among the most divisive.
The solution to the paradox is that HN is the rare case of a large(ish)
community that keeps itself in one piece instead of breaking into shards or
silos. If that's true, then although we haven't yet realized it, the HN
community is on the leading edge of the opportunity to learn to be different
with one another, at least on the internet.

~~~
anigbrowl
The thing is that HN is essentially run like Singapore: a benign-seeming
authoritarian dictatorship that shuts down conflicts early and is also
relatively small and self-contained. One thing that doesn't get measured in
this analysis is the number of people who leave because they find that this
gives rise to a somewhat toxic environment, as malign actors can make hurtful
remarks while complaints about them are often suppressed. Of course, it tends
to average out over time, and people of opposite political persuasions may
both feel their views are somewhat suppressed, but this largely reactive
approach is easily gamed as long as it's done patiently.

------
casefields
Mirror: [http://archive.md/YQeJY](http://archive.md/YQeJY)

~~~
neonate
Updated: [https://archive.md/FyTDB](https://archive.md/FyTDB)

~~~
obi1kenobi
Wow, Cloudflare's 1.1.1.1 DNS server sets up a man-in-the-middle (broken cert
gives it away) and serves a 403 Forbidden page when clicking on this link.
Verified that 8.8.8.8 works fine.

~~~
Defenestresque
I don't want to derail the discussion too much either, but anyone curious
about the reasoning can see this comment from CloudFlare [0]

>We don’t block archive.is or any other domain via 1.1.1.1. Doing so, we
believe, would violate the integrity of DNS and the privacy and security
promises we made to our users when we launched the service.

>Archive.is’s authoritative DNS servers return bad results to 1.1.1.1 when we
query them. I’ve proposed we just fix it on our end but our team, quite
rightly, said that too would violate the integrity of DNS and the privacy and
security promises we made to our users when we launched the service.

>The archive.is owner has explained that he returns bad results to us because
we don’t pass along the EDNS subnet information. This information leaks
information about a requester’s IP and, in turn, sacrifices the privacy of
users. This is especially problematic as we work to encrypt more DNS traffic
since the request from Resolver to Authoritative DNS is typically unencrypted.
We’re aware of real world examples where nationstate actors have monitored
EDNS subnet information to track individuals, which was part of the motivation
for the privacy and security policies of 1.1.1.1.

> [snipped the rest]

[0]
[https://news.ycombinator.com/item?id=19828702](https://news.ycombinator.com/item?id=19828702)

~~~
eloff
I'm not sure if it's a separate issue, but I've noticed 1.1.1.1 sometimes
can't resolve my bank's site. Adding 8.8.8.8 as an alternate DNS server
resolves the issue for me. I don't know if my machine balances requests
between the two or only uses 8.8.8.8 when the primary fails; I'd like to know
the answer to that.

------
contemporary343
Every platform ultimately makes choices about how users engage with it,
whether the goal is to drive up engagement, ad revenue, or whatever metric is
relevant to it. My general read is that Facebook tries to message that it is
a "neutral" arbiter and passive observer of whatever happens on its platform.
But it isn't, certainly not in effect, and possibly not in intent either. To
preserve existing algorithms is not by definition fair and neutral!

And in this instance, choosing not to respond to what its internal researchers
found is, ultimately, a choice they've made. In theory, it's on us as users
and consumers to vote with our attention and time spent. But given the
society-wide effects of a platform that a large chunk of humanity uses, it's
not clear to me that these are merely private choices; these private choices
by FB executives affect the commonweal.

~~~
AlexandrB
It's pretty laughable for Facebook to claim they're neutral when they
performed and published[1] research about how tweaking their algorithm can
affect the mood of their users.

[1]
[https://www.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/](https://www.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/)

~~~
1propionyl
Even if they hadn't done that, it would still be a laughable claim prima
facie.

There's something of an analogue to the observer effect: that the mere
observation of a phenomenon changes the phenomenon.

Facebook can be viewed as an instrument for observing the world around us.
But it is one that, through being used by millions of people and through
personalizing/ranking/filtering/aggregating, effects change in the world.

Or to be a little more precise, it structures the way that its users affect
the world. Which is something of a distinction without much difference,
consequentially.

------
vtail
Disclaimer: I started working at FB recently.

Consider the following model scenario. You are a PM at a discussion-board
startup in Elbonia. There are too many discussions at any given time, so you
personalize the list for each user, showing only the discussions she is more
likely to interact with (interaction is a crude indicator of user interest,
but interest is tough to measure accurately).

One day, your brilliant data scientist trained a model that predicts which of
the two Elbonian parties a user most likely support, as well as whether a
comment/article discusses a political topic or not. Then a user researcher
made a striking discovery: supporters of party A interact more strongly with
posts about party B, and vice versa. A proposal is made to artificially reduce
the prevalence of opposing party posts in someone's feed.

Would you support this proposal as a PM? Why or why not?
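For concreteness, here is one way the proposal could be prototyped. Everything below is a made-up sketch of the scenario in this comment (the post data, field names, and penalty factor are invented for illustration), not anything any real platform runs:

```python
# Hypothetical Elbonian feed ranker: rank by predicted engagement alone,
# or artificially down-weight political posts from the party the user
# opposes. All data and weights here are invented placeholders.
posts = [
    {"id": 1, "topic": "political", "party": "B", "engagement": 0.9},
    {"id": 2, "topic": "sports",    "party": None, "engagement": 0.6},
    {"id": 3, "topic": "political", "party": "A", "engagement": 0.5},
]

def rank(posts, user_party, opposing_penalty=1.0):
    # opposing_penalty=1.0 is pure engagement ranking;
    # values < 1.0 implement the proposed intervention.
    def score(p):
        s = p["engagement"]
        if p["topic"] == "political" and p["party"] not in (None, user_party):
            s *= opposing_penalty
        return s
    return [p["id"] for p in sorted(posts, key=score, reverse=True)]

print(rank(posts, "A"))                        # -> [1, 2, 3]
print(rank(posts, "A", opposing_penalty=0.5))  # -> [2, 3, 1]
```

The one-line `opposing_penalty` knob is exactly what makes the PM's decision hard: the code change is trivial, while the judgment about whether to turn the knob is not.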

~~~
smhinsey
This is why the liberal arts are important: you need someone in the room with
enough knowledge of the world's history to look at this and suggest that
maybe, given the terrible history of pseudo-scientifically sorting people into
political categories, you should not pursue this tactic simply to make a buck
off of it.

~~~
Barrin92
You don't need liberal arts majors in the boardroom, you need a military
general in charge at the FTC and FCC.

Can we dispense with the idea that someone employed by Facebook, regardless
of their number of history degrees, has any damn influence on the structural
issue here, which is that Facebook is a private company whose purpose is to
mindlessly make as much money for its owners as it can?

The solution here isn't grabbing Mark and sitting him down in counselling;
it's to have the sovereign, which is the US government, exercise the
authority it has apparently forgotten how to use and rein these companies in.

~~~
dkn775
A lot of people wouldn’t know about the policy avenues that can be used to
regulate these companies (of which FTC is not the only one), or how even
advisory groups to the president could help.

------
knzhou
The degree to which “damned if you do, damned if you don’t” is in effect here
is remarkable. If Facebook literally removes anything, then HN is outraged
because it’s censorship, paternalism, all that. But if Facebook _does not
adopt an actively paternalistic attitude where it shows people content that
they deem is “good for them”_ , then that’s outrageous too. Both complaints
predictably rocket to the top of HN.

Which is it, guys? How can you simultaneously be outraged that Facebook is
imposing any restrictions on speech at all, and horrified that it _isn’t_
actively molding user behavior on a massive scale?

There’s an amusing comment from a Facebook employee downthread asking: if
division is caused by showing people opposing political opinions, should we
try to stop that to reduce division, or should we do nothing, to avoid forming
filter bubbles? Predictably, every single reply condemns him as evil for not
realizing one of the options is obviously right, but they’re split exactly
50/50 on what that right course of action is.

~~~
closeparen
If Facebook followed some deterministic algorithm like "show all content from
friends, in chronological order" then I don't think there would be such loud
voices calling for it to also solve $social_problem.

But Facebook _does_ exercise editorial control, in the service of engagement.
It's fair to ask that this curation consider other objectives as well, or at
least counterbalance the side-effects it's known to have (divisive content is
more engaging and so is amplified; at least correct it back down to neutral).
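One hedged sketch of the "correct it back down to neutral" idea: if divisiveness is assumed to inflate engagement by some factor, a ranker can divide that inflation back out. The scores and the `boost_per_unit` factor below are invented placeholders, not a real model:

```python
# Hypothetical sketch: if divisive posts get an engagement boost simply
# because anger is engaging, a counterweight can divide that boost back
# out, so divisive content isn't amplified by default.
def neutralized_score(engagement, divisiveness, boost_per_unit=0.5):
    # Estimate how much of the raw engagement came from divisiveness
    # and remove it before ranking.
    return engagement / (1 + boost_per_unit * divisiveness)

calm  = neutralized_score(engagement=0.6, divisiveness=0.0)
angry = neutralized_score(engagement=0.9, divisiveness=1.0)
print(round(calm, 2), round(angry, 2))  # -> 0.6 0.6
```

In this toy example the angry post's extra engagement is exactly the assumed divisiveness boost, so after correction both posts rank equally: the curation still happens, but the known side-effect no longer tips the scale.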

~~~
ccktlmazeltov
If Facebook were following a deterministic algorithm, people would complain
too.

~~~
deadwing0
Is that reason enough not to try? Because people will complain? You know very
well that people will complain no matter what happens, yet we must still seek
to work towards truth and a better way, even though "better" is an opinion
word, and sometimes so is "truth."

~~~
ccktlmazeltov
Good luck filtering human content via deterministic algorithms, and if people
can reverse engineer them then they'll be gamed... it's just a stupid idea.

------
yumraj
I've always wondered how such discussions go in company meetings when some
product/feature has a harmful effect on something/someone but is good for the
company's business.

I cannot believe that everyone is ethically challenged; perhaps only the
people in control are. So what goes through the minds of people who don't
agree with such decisions? Do they keep quiet, just worry about the paycheck,
convince themselves that what the management is selling is a good argument
_for_ such a product/service...

Luckily I've never had to face such a dilemma, but I don't envy those who
have faced one and come out of it by losing either their morals or their
jobs.

~~~
zelon88
I've typically found my employment via companies who deal with a variety of
contracts, some of them for weapons or defense contractors.

I could go down the rabbit hole of chasing down all those contracts and would
probably find that many of the products my company makes get sold to groups
and causes that I don't support. But in the end; I've gotta eat.

Do I want to throw away my career, which is 99% unrelated to the SJW cause I
support, just because 5% of our products _eventually_ get used against that
cause? What about the 95% of our products which go to worthy causes?

I'll say it again... I just gotta eat, man. What's good for the gander is
probably good for the goose too.

~~~
peruvian
If you're working on defense or weapons contracts, you're supporting the
industry that has kept us in the Middle East for almost two decades.

I agree that it's often a moral grey zone but in this case it's pretty clear.
If you're an engineer there's plenty of other companies to choose from.

~~~
zelon88
While that is true, I've worked in manufacturing environments with high-tech
equipment. This manufacturing equipment is so sensitive it gets covered with
a tarp during dog-and-pony shows. We are using equipment and techniques in the
USA that other nations could only dream of implementing. Why do you think most
airplane manufacturers are located in the USA? Don't you think an airline
would buy aircraft engines from China if they could?

Keeping America at the forefront of technology has its benefits. If we don't
invest in cornering these technologies, our adversaries will.

Unfortunately, it's the same technology that has kept us in the Middle East
that has also been a forceful deterrent safeguarding all Americans.

------
LordHumungous
> Facebook policy chief Joel Kaplan, who played a central role in vetting
> proposed changes, argued at the time that efforts to make conversations on
> the platform more civil were “paternalistic,” said people familiar with his
> comments.

I think Joel was right.

~~~
renewiltord
I, too, agree. Facebook is just an extension of the open society.

~~~
tarkin2
Disagree. It's different to normal society.

Normal society encourages civility by offering inclusion in a needed,
physically near social group. Digital society disincentivizes civility by
offering a multitude of alternative groups.

~~~
robertlagrant
It's localness by homogeneity rather than geography.

~~~
tarkin2
A community based on geographic locality is the prerequisite for a functioning
society and state. Anything that polarises that geographic community
ultimately damages the state by attacking its precondition.

------
daenz
Facebook discovers profitable strategy that news organizations have been using
for decades.

~~~
forgingahead
This is the most relevant comment to this discussion. News orgs have been
profiting off the same negative elements of human society for decades.

~~~
tzs
News organizations present a limited, curated view from fact-checked,
verified sources. The information flow is mostly one-way, from the news
organization to me.

A social media news feed might present the same underlying story to me, but
via some opinion blog that has not fact-checked it or verified sources. It
might also come with assorted speculation by the poster, ranging from
wild-ass guesses to outright insane conspiracy theories.

And social media is designed to get me to offer my opinion on it, to see
other people's opinions, and for all of us who read it to discuss it in a
semi-pseudonymous free-for-all.

The news organization approach is much more effective if the goal is to
actually inform people about the negative event.

------
donw
I think this is perhaps an example, on a grand scale, of why you (generally)
can't use technology to solve cultural problems.

Before the Battle of Trebia, Hannibal wanted to set up an ambush for the
Romans. He gathered 200 of his best troops together and told them (a) that
they were squad leaders; (b) that they should each pick 10 of their friends
to form their individual squads; and (c) the plan of attack.

A modern person might ask: "Well, why didn't he just gather ~2,000 soldiers
together and communicate the plan of attack directly? That must be better
than passing a battle plan via a game of telephone."

The simple answer is that, before electronics and without a purpose-built
theater, you really could only speak, directly, with about 200 people at once,
because that's as far as the unaided voice can carry.

Many inventions and societal changes over the course of the past 2,000 years
have accelerated the flow of information. Printing presses, movable type,
widespread literacy -- I doubt that many of Hannibal's troops could read! --
postal systems which eventually spread across the globe, telegraphs,
telephones, fax machines...

Each of these increased the speed, range, and coverage of communication to a
varying degree, and each had a profound impact related to the scope of that
change.

Humanity is only about a decade or so into a world where J. Random Person has
the potential, through viral spread, to communicate with the full extent of
their social graph, and to find like-minded actors.

Compared to before, this is a massive change, and if you weren't old enough to
understand The World Before The Internet, it's hard to grasp just how massive
of a change this has been.

Our cultural and legal mechanisms simply haven't caught up yet, nor have they
adapted to the new evolutionary tempo.

~~~
knzhou
This is also why I'm never particularly concerned about somebody popular being
kicked off a platform. The ability to quickly go from being unknown to having
millions of views, distributed using somebody else's servers, paid for by
somebody else's money, is unprecedented in human history. If Youtube and
Facebook started banning people en masse, the degree of free speech in our
society would at worst regress to about what it was in early 2000s, hardly a
dystopia.

------
renewiltord
People always blame Facebook when the existence of Internet forums has always
led to radicalization of individuals. Facebook's crime is making forums
accessible to all.

These are just your fellow people. This is how they are in the situation that
they're in. So be it. Let them speak to others like them.

The cost of that is many angry people. The benefit of that is that folks like
me can find my people. That benefit outweighs the cost.

This is just the price of the open society.

~~~
chongli
_People always blame Facebook when the existence of Internet forums has always
led to radicalization of individuals. Facebook's crime is making forums
accessible to all._

If it were only that, I would have a hard time assigning blame to Facebook.
However, it is not only that. Facebook exercises editorial control through its
recommendation engine. Users don't see all posts in chronological order. They
see posts ranked by Facebook based on invisible and inscrutable algorithms
that are optimized for engagement.

It just so happens that making people angry is an effective way to keep them
engaged in your platform. Thus it's not fair to call Facebook a neutral party
if they're actively foregrounding divisive content in order to increase
engagement.

~~~
renewiltord
I'm sympathetic to this position. I've heard people say the same about YouTube
and I don't have a concrete position on this.

On one hand, if someone were to tell me "The Mexicans are ruining America" and
I were to say "Damned right! Who else do you know who says these great and
grand truths about America?" I would expect that person to introduce me to
more people like them and my radicalization and engagement would increase out
of my own desire to have more of this thing. That aspect of Facebook's
recommendation engine just seems like a simulation of a request for more like
what I want in a very obedient manner. That is, the tool is actually
fulfilling what I am expressing I desire.

On the other hand, the inputs are inscrutable and not clearly editable. For
instance, suppose I look at myself and say "God damn it, some of these things
I'm saying are really bigoted. I don't want to be like this", I cannot
actually self-modify because there is no mechanism on Facebook to modify the
inputs. It'll select for me the content I have these auto-preferences for but
not the ones I have higher order preferences for.

Essentially it's a fridge that always has cake even though I want to lose
weight.

So, yeah, I'm sympathetic that I cannot alter the weights on my recommendation
and say "I want to clear your understanding of the person I want to be. Stop
reinforcing the one I am now."

Certainly the recommendation engine is a flaw. I do _like_ recommendations
though and that's my favourite way of browsing YouTube in the background. It's
pretty good at music discovery. So, perhaps it needs to be only opt-in.
Imposed by choice rather than by default. It still has to be possible to turn
it off.

Even then, I'm not sure. This is an ethical question I've been thinking about
for ages: Is it ethical to allow someone to make a choice that could be
detrimental and that they cannot recover from? What are the parameters around
when it is ethical? Opting in to recommendations could be a one way trap.

------
DanielBMarkham
Everybody who uses Facebook should spend about ten minutes on it: catch up
with the important things friends are doing and leave.

Unfortunately, this behavior is not in Facebook's best interest. For them,
it's Facebook now, Facebook later, Facebook as far as the eye can see.
Everything is Facebook.

There is a premise to this article that needs to be called out and expunged. I
have come to the sad conclusion that Facebook is a company that should not
exist. It's laying waste to huge sections of the economy that used to provide
valuable, informative content, it's in a battle to suck your entire day away
from you with streaming and other services, and its premise is in direct
contradiction to how we know societies evolve. You can't start with "how do we
fix it" and end up anywhere good.

They're not dummies. There might be a lot of happy-talk, echo chamber
discussions happening inside the company, but they know the score. That's why
they're picking political winners and losers. I imagine there's a ton of money
heading out to both parties to provide cover over the next few election
cycles.

I think looking back, if we manage to navigate our way through this period,
it's going to be viewed as a very sad and dark time, much like the dark ages.
I sincerely hope I am completely wrong about all of this.

------
Fiveplus
This entire thread, discussion and the article in focus make me so relieved.
I'm so proud of my decision to quit facebook, twitter and reddit altogether. There
is soooo much less noise in my life. I'm finally reading books, enjoying my
hobbies while still getting what 'I' like from the internet - RSS feeds to
give me the latest and most popular developments in news without any user
generated comments. 1-on-1 messaging services to help me stay connected with
my loved and dear ones and an occasional tour of websites like HN and my
favorite blogs from the bookmark folder. I do not want the reader to assume my
model is perfect, it's subjective. But that's the point - it is what I make
out to be the perfect browsing model and intended use-case of internet to me.
Another minor point: ever since I moved away from reading what 'people' have
to say in comments, my mind has felt decluttered.

The internet is what you make of it. I let it direct how I used it, and
getting myself away from that grip and 'sucked into' environment is a
blessing.

------
jbay808
I'm no fan of Facebook. But for what it's worth, back when I was still using
it in ~2010, it helped me learn a lot about the worldviews of people on the
opposite end of the political spectrum who I rarely if ever interacted with in
person. The mechanism for this was Facebook Groups - I'd hang out in climate
change denial groups talking to denialists and asking them questions. And
although it didn't change my mind and I didn't change theirs, I (and my, err,
opponents) both actually learned a lot and came to see the other side as more
honest and less irrational/evil than we once thought.

I don't know if Facebook still serves this purpose today.

~~~
crocodiletears
It's not really like that anymore. Group raids involving post reporting became a
huge issue a while back, so most political pages use membership application
questions requiring you to positively affirm or signal in-group association
before joining. Nothing prevents you from lying to get into a group, but it's
oddly effective as a mechanism for preventing partisan opponents from engaging
in any dialogue.

~~~
karmelapple
I think a big part of the shift of interactions over the last 5 - 10 years is
the communication platform (Facebook, in this case) bringing in new users who
had zero experience debating in a text-only format. It’s probably inevitable,
unless the platform tries to educate and heavily police new users on what
proper behavior is.

Facebook was incentivized to grow as fast as possible. Comments and discussion
were one of many vectors for growth; photos, news, and silly images were just
as important. The quality of all that wasn’t as important as the content
coming from people you know and trust.

Contrast that with a community like HN, where quality of comments and content
is much more important, since you have little to no trust for almost all
people submitting content.

------
tunesmith
Facebook and other similar systems reward engagement. Engagement happens when
people are surprised. Surprise happens when people come across new apparent
"information". New information is most easily propagated through the use of
lies.

It follows pretty clearly. If they don't want divisiveness, they have to
either step away from rewarding engagement, or they have to stop people from
lying. They're in a bind, except it's society that is bearing the cost.

------
mwfunk
It just feels like weaponized Usenet from the mid-'90s, or almost every
popular online forum since then. Multiplayer game communities even. They're
like tinderboxes for negativity. Very small numbers of bad faith actors
(griefers, trolls, scammers, spammers, or just plain assholes) can trivially
derail entire communities. Even without people trying to screw everything up,
plain old human nature, and the nature of electronic communications, can make
it happen as well. It just takes a little longer.

Put another way, each flame begets one or more flames, whereas each good
comment might get responses but maybe it stands on its own. Over time the
signal to noise ratio of any forum tends to degrade to nothing as the forum
becomes more popular because of this. Moderation, scoring systems, etc. can
ameliorate this but in general the less specialized the forum, the worse it
is. It's like entropy in that it only goes in one direction, it's just a
matter of time and how much you can push back on it. Bad comments beget more
bad comments, but good comments don't necessarily beget more good comments.
And at some point, the ratio of bad comments to good comments drives away any
potential good commenters and the event horizon is crossed and the forum dies.
Or it lives on as a cesspool for whatever.

The difference between Facebook and Twitter in 2020 vs comp.os.linux (or
whatever) in 1995 is that it's not specialists screaming at each other about
which distro or programming language or OSS license is best (or worst). It's a
much wider net of far less informed or rational people, encouraged to argue
about infinitely dumber and less knowable or debatable stuff. It's like scammy
clickbait, but for arguments rather than clicks. The other difference between
Facebook and Twitter in 2020 vs online communities of the past is that
Facebook and Twitter make money off of it. All this BS fuels "engagement" and
keeps larger volumes of people posting and therefore revealing themselves to
trackers and creating a stream of ad views for the platform owners. At some
point I do think the toxicity of the platforms will start costing them users,
but that doesn't seem to be happening anytime soon.

------
hadtodoit
Why does facebook need to do anything about this? People have been disagreeing
with each other violently or otherwise for as long as humans have existed. Do
they think they can do anything about this?

~~~
kgin
There has never been a mechanism whereby everyone can be against everyone else
about everything.

When my high school english teacher and my aunt are arguing about politics and
they've never met each other, it's clear this is a new development in human
conflict.

------
dredmorbius
Related, earlier this week on the New Books Network

Cailin O’Connor, "The Misinformation Age: How False Beliefs Spread" (Yale UP,
2018)

(New Books in Journalism) Duration: 40:00

Published: Wed, 20 May 2020 08:00:00 -0000

Media:
[https://traffic.megaphone.fm/LIT1956686397.mp3](https://traffic.megaphone.fm/LIT1956686397.mp3)
(audio)

Podcast:
[https://www.podcastrepublic.net/podcast/425693571](https://www.podcastrepublic.net/podcast/425693571)

Why should we care about having true beliefs? And why do demonstrably false
beliefs persist and spread despite bad, even fatal, consequences for the
people who hold them?

Author page: [http://cailinoconnor.com/the-misinformation-
age/](http://cailinoconnor.com/the-misinformation-age/)

Editor's book site:
[https://yalebooks.yale.edu/book/9780300234015/misinformation...](https://yalebooks.yale.edu/book/9780300234015/misinformation-
age)

Worldcat: [https://www.worldcat.org/title/misinformation-age-how-
false-...](https://www.worldcat.org/title/misinformation-age-how-false-
beliefs-spread/oclc/1112906678&referer=brief_results)

------
alzaeem
The outrage towards Facebook causing divisiveness is a red herring. You want
to see divisive content, go to foxnews vs cnn. Pretty much the entire media is
partisan and biased towards their constituents' points of view. For Facebook,
it would be nice if they stuck to showing whatever is posted by a user's
friends or organizations they like/follow without much curation, but my view
is that their impact on divisiveness overall is minuscule.

------
m12k
One man's division is another man's engagement

------
mudlus
Are we are going to have to wait for a generation to die and for millions of
lives to be lost (indirectly, say, through a demagogue's botched response to a
pandemic needlessly leading to the infection of millions) before the average
person is comfortable using a protocol (say, ActivityPub and RSS) instead of
these parasitic for-profit platforms?

As long as the search for truth is burdened with advertising on platforms
democracy and freedom are doomed.

If you don't see these things are linked, then you're part of the problem.

------
beepboopbeep
This is why twitter and facebook don't have dislike buttons. By removing a
quick and easy way of voicing dissent to a point, people take to the comments
to verbally punish others. For a site that is dependent on user engagement,
anger/outrage/frustration/negativity in general is a gold mine. I remember
when reddit tried removing the down vote button the comments got NASTY. They
backpedaled very quickly from that decision.

------
arbuge
"It is difficult to get a man to understand something when his salary depends
upon his not understanding it." \- Upton Sinclair

------
todd_henderson
Quick recap. Zuckerberg invited Roger McNamee to serve as an advisor. Stopped
listening when he expressed ethical concerns about the business model.

Hired Yael Eisenstat right after the Cambridge Analytica scandal to help with
ethics and oversight. Soon after, reduced role to primarily just optics.

Created new Oversight Board with many high profile individuals. Then they did
this.

------
specialist
From the article:

 _" Worse was Facebook’s realization that its algorithms were responsible for
their growth. The 2016 presentation states that “64% of all extremist group
joins are due to our recommendation tools” and that most of the activity came
from the platform’s “Groups You Should Join” and “Discover” algorithms: “Our
recommendation systems grow the problem.”"_

Then:

 _" In keeping with Facebook’s commitment to neutrality, the teams decided
Facebook shouldn’t police people’s opinions, stop conflict on the platform, or
prevent people from forming communities."_

Does not compute.

How can they claim to be neutral about the very problem they themselves
created?

There's _a lot_ of daylight between proactively accelerating extremism and
censorship. This is not a binary choice.

I'm right alongside Kara Swisher on this topic: Facebook's leadership team is
apparently incapable of nuance, self awareness, or acknowledging culpability.

------
cwperkins
I think NYT has set a decent example with how to deal with internet comments
sections. I like the idea of a US House of Representatives type approach to
comments where every person in the house is given an equal amount of time to
address the house so you can hear all perspectives.

The way NYT has done this is by introducing "Featured Comments". A team at
NYT, presumably ideologically diverse, picks insightful comments to highlight
out of all those submitted. You can still view comments sorted by number of
recommendations, but the view defaults to the Featured Comments.

The web forum that I think needs this more than any other is Reddit's
r/politics subreddit. Someone please let me know their experience, but I don't
think the comments on highly upvoted content are insightful at all. A lot seek
to exacerbate and misrepresent, which IMO adds fuel to the flames of the flame
wars.

------
throwwayr
It’s hard to tell why automated feeds are treated differently to manual feeds.
If a blogger repeated a libellous claim, they could be sued, even if their
primary activity was curating other people’s opinions. Why companies that run
algorithms that do the same thing at scale get a free pass is hard to fathom.
These aren’t dumb pipes, but carefully programmed algorithms, tweaked for
generating maximum engagement.

I would suggest that any service that provided a curated feed of content
pushed to users, should be treated as a publisher and held liable for the
content it promotes. Importantly, “curation” would include spam filtering.

Google News, Facebook and Twitter would all be deemed publishers under this
rule. Search engines wouldn’t.

It would probably kill their business models, but that could well be a net
gain for humanity.

------
prirun
Politicians, news companies, and yes, Facebook, all promote divisiveness IMO.
It seems to trigger a primal instinct in (some?) humans to belong to "this"
tribe or "that" tribe, and they will go to great lengths to preserve and
promote "their" tribe while at the same time trying to squash & demoralize the
"other" tribe. I've known people like this, and for them, it seems to be
almost like a sport: they _enjoy_ arguing about why they are right and you are
wrong.

A person like me who doesn't share this black/white tribal thinking will
eventually (usually quickly) walk away from someone super aggressive about
their opinions (which they usually believe are facts). It's boring and becomes
quickly obvious that there is no point in trying to have any kind of
intelligent discussion, because thinking is not part of the process.

Now if you can get two of these aggressive types going at it against each
other, you have a real spectator sport. It will _never_ end, because
neither side is thinking about what the other is saying; they are both just
defending their entrenched positions, and they both _enjoy_ the "battle".

The arguing becomes like entertainment for these folks, and the more they
argue, the more engaged they become in the argument. IMO, that's why
politicians, news, and companies like Facebook, _want_ divisiveness. They want
their audience to feel compelled to interact and engage.

The thing I don't understand is that, while I don't know it for a fact, it
seems that the "middle", those who are not fanatics, is a much larger
audience. They are turned off by all the aggressive black/white arguing among
politicians, news, and internet sites like Facebook. I've never been on
Facebook or looked at a Facebook page. I stopped watching TV news after it
turned into shouting matches over opinions instead of delivering facts. Same
for politicians.

It seems like courting the middle, moderate audience would lead to a larger
customer base. But that must be wrong, because surely these gigantic media
companies would have tried it by now.

------
cmrdporcupine
A rather persistent recruiter from FB contacted me recently, and given the new
WFH scenario there I was almost considering looking into it further, despite
it probably being a frying-pan-fire thing (coming from Google)

But after reading this... yeah, no.

------
blhack
IF facebook offered me the option of paying $5/mo to just get API access to
the things my friends posted, and I could display them however I want (LIKE
FOR INSTANCE IN CHRONOLOGICAL ORDER!) I would happily pay it.

------
geori
So Neal Stephenson wrote a book about this - at least the good half of the
book -
[https://en.wikipedia.org/wiki/Fall;_or,_Dodge_in_Hell](https://en.wikipedia.org/wiki/Fall;_or,_Dodge_in_Hell)

It's set 20 years in the future, where facebook and similar services are much, much
worse. Wealthy people pay for editors to remove misinformation from their
feeds. And the country gets bifurcated, with coastal elites having access to
editors while flyover country ("Ameristan") turns into a conspiracy-plagued
wasteland.

------
majky538
I remember content from friends, then unrelated content while Facebook was
testing feed changes, and now it's mostly meme pictures and group posts. I'd
almost forgotten there are any "friends".

------
JackFr
I simply cannot understand the motivation of people who seemingly want to be
made angry.

I’ve had friends tell me I’m just burying my head in the sand, but I don’t
think I am. I’m trying my best not to be manipulated into a worse emotional
state. I don’t go on Facebook anymore because I realize that objectively time
spent on Facebook made me less happy.

------
beliefchallenge
I'm building a social media site that will bring people together rather than
drive people apart. It's called Belief Challenge. It's social media for open-
minded people. Try it!
[http://beliefchallenge.com](http://beliefchallenge.com)

------
jimmaswell
They decided against paternalistic meddling and let discourse happen
naturally? That sounds best to me. I don't want Facebook to be a school
teacher hovering over a lunch table to make sure nobody swears. People posting
"divisive" content is far preferable to the alternative.

~~~
banads
If FB wanted to "let discourse happen naturally" and not be paternalistic,
they wouldn't use an opaque, non-chronological algorithm to control who gets
to see what in such a way that primarily benefits FB's bottom line.

What is this alternative you speak of?

~~~
quotemstr
Optimizing for engagement does not favor any particular viewpoint. The authors
of this article are incensed that Facebook doesn't engage in more _viewpoint-
based_ adjustment of the conversation. Favoring or disfavoring a post based on
the viewpoint it expresses is very different from optimizing an algorithm to
give a user more of what he wants, whatever that is.

~~~
skosch
That's a misunderstanding of the problem.

Optimizing for engagement tends to favour _extreme, simplistic,_ and _highly
emotional_ viewpoints. In other words, it caters to human nature. This
tendency is harmful to rational discourse, regardless of whether or not you
happen to agree with any given viewpoint.
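The point above can be made concrete with a toy model (all numbers invented): a ranker that is perfectly viewpoint-neutral, sorting purely by predicted engagement, still foregrounds the most emotionally extreme items whenever extremity correlates with engagement:

```python
# Toy feed items with an invented "emotional intensity" score in [0, 1].
posts = [
    {"text": "nuanced policy analysis", "intensity": 0.2},
    {"text": "cute dog picture", "intensity": 0.4},
    {"text": "OUTRAGEOUS scandal!!!", "intensity": 0.9},
]

def predicted_engagement(post):
    # Stand-in model: engagement rises with emotional intensity,
    # with no reference whatsoever to the viewpoint expressed.
    return 0.1 + 0.8 * post["intensity"]

# Rank viewpoint-neutrally, by predicted engagement alone.
ranked = sorted(posts, key=predicted_engagement, reverse=True)
print([p["text"] for p in ranked])
```

No viewpoint is favoured at any step, yet the outrage item lands at the top of the feed; the bias is toward extremity, not toward any side.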

------
kolbe
Funny. I'm sure the Wall Street Journal knows the same thing, but reaps
profits from it as well.

------
sneak
Do we ask or expect the same of the phone or cable companies?

Why the agenda to (further) censor Facebook and similar?

~~~
AlexandrB
Phone companies don't set up incentive structures that encourage a certain
kind of content. Facebook has an "algorithmic" feed, likes, and "engagement"
metrics that reward certain behaviours and punish others. They are rightly
being pilloried when these incentives encourage and promote constant outrage,
conspiracies, and completely fact-free fear mongering.

~~~
dleslie
This is a thought-provoking answer, and it shows how Facebook (and others) are
straddling the line between publisher and platform.

IMHO, because they do perform curation, both algorithmic and manual, they
should be considered publishers.

~~~
sneak
Seems to me that their argument against censorship should be even stronger,
then, as editorializing is protected expression.

~~~
dleslie
It would be, yes, and if Trump acts on his threats to investigate censorship on
social media then this may be a good position to take.

The problem is that being a publisher brings greater legal liability for the
content that they publish; whereas a carrier/platform can wash their hands of
the data that they transmit and claim that they have no part of it.

------
LogicRiver
Facebook thrives on being able to create divides and bias among its users,
good or bad.

------
im3w1l
What can you realistically do? The alternatives, as I see it, are:

    Riling people up (showing different opinions)
    Echo-chambers (showing same opinion)
    Sweeping issues under the rug (showing neither)

------
LordFast
Social media is an addictive substance and should be controlled. End of story.

~~~
dlivingston
It's clearly not "end of story". What does it mean to be controlled?
Regulations or nationalization? What kind of regulations? Do the regulations
vary across countries? What kind of social media - just Facebook, or all
social media platforms?

Saying "end of story" is the sort of needlessly dismissive and self-righteous
rhetoric that always makes me sad to see on HN.

------
adameast1978
A decent percentage of people get on facebook primarily to argue, and that
increases session time for those users. Curbing it probably wouldn't be
beneficial to FB, but it probably would be to people.

------
neycoda
The sad fact is that people choose flavorful news over verifiable facts.

------
aantix
Sheryl Sandberg wants males to "lean in", assume a more cooperative role.

But then she actively supports the most divisive platform in history. A
perfect dismount in her mental gymnastic routine.

------
MattGaiser
Reddit has the same issues of division and does not do anything as a company
to sort people. It all comes down to the individuals themselves.

Is division really all that new or can we just see it more now?

~~~
newacct583
It's a little different. Reddit doesn't choose the content presented to users,
they allow the community to self-sort into community-managed subreddits with
their own cultures and preferences and voting behavior. In fact reddit only
barely exerts any control over the selection of subreddit moderators (mostly
stepping in only to resolve things in extremis).

Facebook's algorithms decide on _everything_ in your feed. If you aren't
interested in politics on reddit you might never see it at all. If Facebook
thinks you might be a republican (and often that's just a demographic thing
coupled with a few past clicks on political stories), they will _literally
fill your screen_ with paid advertising designed to drive your political
preferences.

The point is that division is visible on Reddit (and everywhere), but _driven
and encouraged_ by Facebook. And that these are different phenomena. I'm not
completely sure I agree, but the point isn't as simple as "division exists".

------
sabujp
What I can't get my head around at this moment is why Facebook does this. When
it was still very young there were a lot of people who loved their product,
and they said so.

------
JoeAltmaier
Its a new societal urge: the addiction to feeling righteous indignance.
Endorphin rush, available to anybody with a keyboard. Gonna be hard to put
that genie back into the bottle.

------
mindfulhack
You can more strongly control - and capitalise on - people when they're
divided and isolated, triggered and engaged, in their own little world where
they _think_ they're engaging with the whole world, but in reality they have
no idea that it's just their tiny little slice of it.

This is control. Not uniting humanity. It's 'divide and conquer', through
business. The users are the conquered ones.

It is hurting civilisation greatly. Facebook is the archetype of capitalism
needing to be reined in by government due to its bad effects on society. It's
like pollution, but sociocultural pollution.

I don't care that a company made money while producing the pollution. I don't
care that people voluntarily chose to buy their products whose production
produced the pollution. That doesn't justify their business activities.

It's _pollution_.

------
artche
I think most social media discussions have degraded to outrage of the week.

I limit myself to instagram stories once a month to broadcast that I’m still
there to my close friends.

------
redorb
I've asked the question - what if FB went for bartender rules? No politics, no
religion... sometimes I feel like those are 65% of the content.

------
save_ferris
Zuckerberg’s invincibility as CEO is nothing short of one of the greatest
failures of modern capitalism. It’s simply astounding that such a terrible
leader has retained control of what is clearly a company out of control. And
the market accepts all of it while individuals constantly criticize his and
Facebook’s actions.

People always throw around “well stop using Facebook” but that clearly isn’t a
reasonable solution from a scalability standpoint. What percentage of those
people also hold Facebook stock, either directly or through a hedge fund, ETF,
etc.? It could be more than we think.

At the end of the day, profits don’t care about people, and this is the
consequence we all have to live with.

------
jdofaz
Maybe divisive content keeps other people engaged but I stopped getting
enjoyment out of facebook years ago and I avoid it now.

------
donohoe
Do you work at Facebook? This is reprehensible.

    
    
      In essence, Facebook is under fire for making 
      the world more divided. Many of its own experts 
      appeared to agree—and to believe Facebook could 
      mitigate many of the problems. 
      The company chose not to.
    

Unless you are actively pushing to change it from the inside, you should leave
now. Take a reasonable amount of time to find a new job and leave.

Otherwise you're complicit.

------
patchtopic
Do we have a non-Murdoch propaganda machine source for this article?

Yes FB is bad, but the Murdoch press is not a reliable source.

------
dafty4
If there is an effort to broker civil and constructive debate, isn't division
fine?

------
seemslegit
How dare they - that used to be the job of traditional media and
entertainment.

------
munificent
A few years back, there was a documentary called "The Brainwashing of My Dad"
about how Fox News and conservative radio turned a relatively non-political
Democrat into an angry, active Republican.

In the past couple of years, I witnessed the same thing happen to my mother,
except driven almost entirely by Facebook and its non-stop parade of right-
wing pro-Trump racist memes.

------
dghughes
I wonder what would happen if Facebook and Twitter were shut down for 30 days.

------
iamspoilt
Link without paywall: [http://archive.vn/YQeJY](http://archive.vn/YQeJY)

------
bookmarkable
Perhaps important journalism, but it is behind a paywall, so apparently WSJ is
satisfied that only their subscribers know this information about Facebook.

Meanwhile, Facebook is not behind a paywall, so they can monitor the
conversations of billions of people despite monthly stories that circulate
illustrating gross misconduct.

------
12xo
Attention is the currency of media. Sensationalism is the fuel.

------
astrophysician
As a total outsider following this from a distance, I sort of feel for
Facebook and other social media platforms facing this problem -- they've run
up against a fundamental issue for which there doesn't seem to be any
satisfying solution. Misinformation and propaganda are rampant on their
platforms definitely, and echo chambers that reinforce divisive worldviews
have probably deepened real societal divisions, but how do you actually
implement a policy to stop this? What is "propaganda"? What is
"misinformation"? The entire core of Facebook's existence is advertising,
which means user engagement and reach are the only things that drive the
bottom line; they _want_ to drive users to Facebook and keep them there, and
keep them engaged. They've just happened to discover a universal human truth
along the way, which is that people _like_ feeling validated, and people
_like_ being a member of a tribe. Facebook is the way it is because that's what
users _want_, whether they will admit it or not.

Anything that Facebook does will be perceived as making a political and/or
moral statement, which they obviously are trying very hard not to do, because
as soon as you take a position you alienate half of the population (at least
in the US). They've apparently decided to go the route of burying their heads
in the sand instead of _trying_ to make things less tribal and divisive, which
in all honesty is a pretty understandable position to take, and yet _even
while actively trying not to piss off conservatives_ they have still landed in
hot water over perceived favoritism towards the left. They are damned if they
do and damned if they don't.

So honestly, what is the proposed solution here? What would you do if you were
in Zuckerberg's shoes? Do you campaign for regulations that take this issue
off of your hands but that let the government call the shots somehow? Do you
look at your board members with a straight face and tell them you're going to
tank user engagement for some higher, squishy moral purpose for which there is
no clear payoff?

------
kisna72
I don't understand why this is a surprise to anyone lol

~~~
kgin
There's a difference between suspicion and confirmation

------
dredmorbius
The Verge has an unpaywalled story;
[https://www.theverge.com/2020/5/26/21270659/facebook-
divisio...](https://www.theverge.com/2020/5/26/21270659/facebook-division-
news-feed-algorithms)

------
kingkawn
The implications of being correct are in for a few tweaks

------
visarga
Very nice two phrases ... I can imagine the rest...

------
anigbrowl
I've posted a lot over the years about FB being leveraged by genocidal regimes
and bad actors. While I don't think they necessarily pursue such ends, the
fact is that social media is a battlespace from where real-world aggression
can be launched, and that renting out platform space to this end has been
extremely profitable.

Perhaps it has already been posted elsewhere in this very long thread, but if
not I heartily encourage more ethically minded FB employees to leak the
presentation in question and indeed anything else they consider relevant. At
some point it will be too late to feel bad about not having done so when it
could make a difference.

------
noizejoy
As Billie Eilish once said: “duh”!

More seriously: Arms dealers are not exactly benefitting from facilitating
peace making efforts either, so economically this makes all the sense in the
world to me.

------
ggggtez
>Another concern, they and others said, was that some proposed changes would
have disproportionately affected conservative users and publishers, at a time
when the company faced accusations from the right of political bias.

This is the same thing they were worried about in the lead-up to the 2016
election, when they fired their newsroom for not promoting pizzagate and other
conspiracies, since suppressing them would be deemed "biased" against conservatives. And they
clearly still haven't learned anything about why letting engagement algorithms
run wild is bad for society.

------
mindfulhack
Because I feel that the freely accessible HN should not be considered a
glorified comment section for another pay-only news site, here is the article
archived in full text:

[http://archive.is/cSUlh](http://archive.is/cSUlh)

(See here for general tips on paywall articles on HN:
[https://news.ycombinator.com/item?id=20313964](https://news.ycombinator.com/item?id=20313964))

(Edit: apologies, forgot to click on 'More' button to see previously posted
archive links.)

------
Seahawkshacker
#Deletefacebook

------
trekrich
they don't want the narrative changing from extreme left to center.

------
TheAdamAndChe
Anyone know a good paywall workaround for wsj?

~~~
kordlessagain
Full page screen capture plugin on Chrome, plus a community that posts to an
IPFS node and updates some decentralized search thing to be able to find it?

------
kasajian
Why post something on Hacker News that can't be read because it's behind a
paywall?

------
stickfigure
Can't read the article, but I've seen a lot of my friends unfriend other
people that have political opinions that differ from theirs. And the ever so
popular post "If you disagree with _thing xyz_ let me know so I can unfriend
you!"

This isn't Facebook's doing. People self-select monocultures.

------
OctopusSandwich
No matter what Facebook does, journalists (fake news) will criticize them.

------
ccktlmazeltov
flagged because it's behind a paywall

------
gentleman11
Paywall. Can anyone summarize?

------
garrytopl
Paywalled... See me next time when I can read the content

------
artemisyna
Can someone copy/paste the original text or post a non-paywall link?

------
retpirato
That's hardly surprising given Facebook's track record of censoring
conservative users. Liberals can deny that all they want. It doesn't make it
untrue.

------
codermobile
You

------
engineer_22
Nationalize Facebook.

~~~
wtfno009887466
ROFL, who's the worse privacy violator, Facebook or the NSA?

~~~
ouid
Facebook. With absolute certainty.

~~~
crocodiletears
NSA's a black box whose sole purpose is the aggregation and analysis of any
information with potential relevance to US national security. It has the
capacity to compel or infiltrate companies like Facebook to make them
cooperate with its goals, and data sharing agreements with multiple nations.
Privacy violation isn't a side-effect of its business model, it's its raison
d'être.

I wouldn't dismiss NSA so offhandedly along this metric, even if it's
ostensibly more constrained along legal boundaries.

------
alkibiades
the media really just wants fb to prevent division by only allowing a center-
left world view.

------
alpineidyll3
Regulators need to stop giving tech giants a pass on common carrier liability.
It would solve a lot of problems overnight.

------
adamnemecek
I hate FB as much as the next guy but I think that Facebook is an amplifier of
other trends.

I think that the underlying issue is the two party system. The echo chambers
get amplified.

~~~
ver_ture
The two party system does not affect this discussion. Facebook's algos will
show you more and more $x content if you've liked $x or subscribed to it, and
never show you $y content since you'd probably not like and engage with $y.
Doesn't matter how many parties/topics/underlyingIssues there are.

If FB were neutral they would show you every FB post, millions per second
whizzing past your screen, but they can't do this, they have to curate a wall
for you to slowly scroll through and for most revenue, like, share, or comment
on.

Therefore, to show you the most content that you will like, share, or comment
on, they repeat the type ($x) you've already liked, creating the echo.

So no, it is not mostly a problem of the underlying issue of the two parties,
this is entirely about how FB curates your wall and simply doesn't show you
"the other party"/$y or anything deviant/$y of your likes.

Edit: changed political parties to variables to illustrate point.
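The feedback loop described above can be sketched in a few lines of toy Python. This is a hypothetical illustration, not Facebook's actual ranking code: posts are ranked purely by how often the user has liked their topic, the user engages with whatever surfaces, and topic $y never gets a slot.

```python
from collections import Counter

def rank_feed(candidates, liked_topics, k=5):
    """Rank candidate posts by predicted engagement: a post whose topic
    the user has already liked scores higher, so the feed repeats $x."""
    scored = sorted(candidates,
                    key=lambda p: liked_topics[p["topic"]],
                    reverse=True)
    return scored[:k]

def update_likes(liked_topics, feed):
    """The user engages with what they're shown, reinforcing the loop."""
    for post in feed:
        liked_topics[post["topic"]] += 1

# The user has liked topic "x" a few times; candidates are an even mix.
likes = Counter({"x": 3})
candidates = [{"id": i, "topic": "x" if i % 2 == 0 else "y"}
              for i in range(20)]

for _ in range(3):            # a few feed refreshes
    feed = rank_feed(candidates, likes)
    update_likes(likes, feed)

# Topic "y" never surfaces, so the gap only widens with each refresh.
print(likes)                  # Counter({'x': 18})
```

A neutral feed in this sketch would sample candidates uniformly; the echo appears as soon as the ranking key is past engagement.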

------
shaan1
There are more than 1.5 billion users on Facebook. If they are not worried,
and are willing to be misused, why the hell are others so hell-bent on
bringing down Facebook lol.

If the users really cared, we wouldn't be having this talk.

Also this is the media wanting to bring down the enemy.

