
Exposure to Ideologically Diverse Information on Facebook - fgeorgy
https://research.facebook.com/publications/1629735407303857/exposure-to-ideologically-diverse-information-on-facebook/
======
nmrm2
The vast majority of political content on Facebook is the variety of bullshit
that shows up on conservative/liberal "news" sites. Most of those sites make
the editorial pages of places like Fox and HuffPost look positively civil and
well-informed.

Does it really matter whether people read bullshit that they disagree with?
Reading bullshit from either side of the political spectrum isn't really
teaching anyone anything (and, moreover, _isn't designed to_), so why the
premium on consuming all varieties of bullshit? The most likely reaction to
reading bullshit you disagree with is "This Is Bullshit". Which isn't so
surprising when the content really is bullshit.

For me, the more interesting question is whether people are exposed to high-
quality justifications for a range of opinions. And I can't remember the last
time a research paper showed up on my Facebook feed.

~~~
eridal
> _The most likely reaction to reading bullshit you disagree with is "This Is
> Bullshit"_

I wonder what people do when they read bullshit that they _agree_ with.

Is Facebook trying to find ways to maximize how much bullshit you will agree
with?

~~~
nmrm2
_> I wonder what people do when they read bullshit that they agree with._

They agree with it (maybe while realizing it's bullshit).

I think among people calling for content diversity, there's a mostly explicit
assumption that encountering more opinions you disagree with will cause you to
either treat those opinions with more respect, or else smell bullshit on your
own side of the fence more often.

IME it doesn't work that way, regardless of what happens in the lab. If
anything, reading bullshit you disagree with has the opposite effect --
_"wow, that's a really shitty reason to believe X. People who believe X must
be stupid to believe that reason."_

Seeing _high quality justifications_ for opinions you disagree with or meeting
people you _respect_ who disagree with you can both lead to attitude changes
and even changes of opinion.

But reading punditry on the Internet? Nope.

_> Is Facebook trying to find ways to maximize how much bullshit you will
agree with?_

Well definitely, because most marketing is bullshit. But I don't think that's
what this paper is about.

~~~
ikeboy
Yeah, there's
[https://en.wikipedia.org/wiki/Inoculation_theory](https://en.wikipedia.org/wiki/Inoculation_theory)
and
[https://en.wikipedia.org/wiki/Confirmation_bias](https://en.wikipedia.org/wiki/Confirmation_bias)

------
mklappstuhl
Here's a PDF version
[http://cn.cnstudiodev.com/uploads/document_attachment/attach...](http://cn.cnstudiodev.com/uploads/document_attachment/attachment/681/science_facebook_filter_bubble_may2015.pdf)

~~~
jaawn
Thank you! Facebook is blocked where I work, so this gives me a chance to read
this when I have some time.

------
lectrick
I am convinced (although I do not yet have enough data to prove it) that
continuous exposure to a lack of ideologically-diverse information is actually
very long-term-bad, leading to unnecessary deaths (i.e., war, people killing
each other over mere worldview differences, media taking advantage of
polarizing forces to whip people up and generate ad views, etc.).

I base this only on my Psych-major training and just in thinking about things
and observing.

If anyone had any actual evidence that might support this conclusion, I would
be grateful.

If Facebook actually understands that it is kind of their responsibility at
this point to ensure people's feeds are more diverse than they might otherwise
prefer (i.e., gently introducing discomfort), this might mark the first time
I'm actually in favor of them keeping their feed ranking algorithm secret (so
as to quiet the masses, most of whom seem content to never experience
ideological discomfort). But this, of course, depends on trusting Facebook...

~~~
inverba
_> I am convinced (although I do not yet have enough data to prove it) ... If
anyone had any actual evidence that might support this conclusion, I would be
grateful._

This is the opposite of how useful research and/or data science works. Data
should be taken as is and then learned from. It certainly should not be
gathered in an effort to directly prove a conclusion that you are already
"convinced" of.

It's very disheartening to see this as a comment on a data science article on
Hacker News... unless you're being sarcastic/ironic? This has to be
sarcastic/ironic, right? Right? :(

~~~
lectrick
> This is the opposite of how useful research and/or data science works. Data
> should be taken as is and then learned from. It certainly should not be
> gathered in an effort to directly prove a conclusion that you are already
> "convinced" of.

Get down off the pulpit. If the data said the opposite, I would instantly
change my worldview.

But you are correct, in that my wording or attitude was incorrect. I should
have said "I suspect" instead of "I am convinced" and instead of "might
support this conclusion" I should have said "might support or refute this
hypothesis".

Which is to say, don't scientists at least have a _hypothesis_ in mind before
they collect data related to it? Otherwise, why would you be testing at all,
and for what exactly? You can't just take millions of data points, put them
into a blender and get proven theories out of it!

Anyway, that's what I'd have at this point, a hypothesis. I should have used
that wording, my bad.

~~~
AnimalMuppet
And something that you suspect (or even of which you are convinced) but can't
prove is a decent starting place for trying to find the data to prove or
refute it.

------
bpodgursky
Most interesting quote to me:

"Individual choice has a larger role in limiting exposure to ideologically
cross cutting content: after adjusting for the effect of position (the click
rate on a link is negatively correlated with its position in the News Feed;
see fig. S5), we estimate the factor decrease in the likelihood that an
individual clicks on a cross-cutting article relative to the proportion
available in News Feed to be 17% for conservatives and 6% for liberals, a
pattern consistent with prior research (4, 17)."

~~~
mfoy_
(Disclaimer: I didn't read the paper. I'm not sure if someone _knows_ a link
is cross-cutting before clicking it.)

So if I'm reading that correctly, liberals are more likely to click on a
cross-cutting article. That alone doesn't really tell me much. So without
trying to inject too much personal bias...

Does that mean:

1) Liberals are more likely to click on something they disagree with.

   a) Because they are more open-minded, they click on articles from contrarian points of view?

   b) Because the article just looks interesting, but it turns out they disagree with it because they are more closed-minded?

2) Conservatives are less likely to click on something they disagree with.

   a) Because they are more open-minded, they agree with more of the things they read than liberals?

   b) Because they are more closed-minded, they don't even bother opening articles they think they will disagree with?

~~~
gph
This is pure conjecture but...

Liberals in general could be considered the progressives while conservatives
are traditionalists. So conservatives are trying to hold onto the old ways (or
at least a perversion of them), whereas liberals are looking for reasons why
the old ways are wrong and we need to progress.

So conservatives love reading articles that glorify old traditions/ideals,
while liberals read the same article in order to confirm all that's wrong with
the old ways therefore giving support to their belief that we need a new way
to progress forward.

Just my initial speculation, take with a large grain of salt.

~~~
mfoy_
I like this idea as another possible interpretation of that data point.
Essentially you're saying that it's because everyone's clicking the more
conservative links, which causes the skew, albeit for different reasons.

The only point I'm trying to make is that the simple sounding sentence "we
estimate the factor decrease in the likelihood that an individual clicks on a
cross-cutting article relative to the proportion available in News Feed to be
17% for conservatives and 6% for liberals" really lets you draw your own
conclusions.

~~~
V-2
Note that the paper observes:

_"liberals tend to be connected to fewer friends who share information from
the other side, compared to their conservative counterparts: 24% of the hard
content shared by liberals’ friends are cross-cutting, compared to 35% for
conservatives"_

The more articles I disagree with that show up in my feed, the less likely I
am to click on any one of them, compared to someone who sees them less often.

Even if we both read exactly 3 such articles a week.

------
return0
I would take this with a grain of salt; Facebook has an interest in presenting
itself as being unable to affect its users' opinions.

------
mklappstuhl
Is the actual paper hidden behind a login? o.O

~~~
rdwallis
Maybe you have to login so they can adapt the paper's conclusions to match
your social graph.

------
V-2
It's called a "cognitive bubble".

