
Facebook tinkered with users’ feeds for a massive psychology experiment - r0h1n
http://www.avclub.com/article/facebook-tinkered-users-feeds-massive-psychology-e-206324
======
gwern
Great! We should be congratulating Facebook for this, for several reasons:

1\. Now Facebook can start designing against this. What design features
promote negativity? Where is negativity being inadvertently fostered?

2\. Now everyone _else_ can benefit from the research, since Facebook chose
not to keep it private despite the potential PR ramifications like this.

3\. This is research that no more than a few entities in the world can or
will do.

Ethically, I don't see the problem. Users already rely on Facebook to tinker
with user feeds in a variety of ways, and this tinkering causes them no harm,
and may benefit them and all future users. (When the users signed up for
Facebook, did they sign an agreement saying 'you agree to be mentally poisoned
with rage, frustration, anger and disappointment, without recourse, by anyone
you are foolish enough to friend and let flood your feed'? I rather doubt it.)
Given that users are already buffeted by countless unpredictable random forces
beyond their control or comprehension, what is tweaking one factor going to
amount to? This is the same as clinical drug trials: the question should not
be whether it is moral to run a trial, the question should always be whether
it is moral to _not_ run a trial.

The OP shows no sign of understanding any of this.

~~~
feral
A private corporation is developing the capability to alter the moods of
massive populations in a controlled way; unlike mass media, society doesn't
(yet?) understand this sort of influence.

Perhaps soon Facebook will understand how to tweak the news feed algorithm to
e.g. whip up negative sentiment in response to an attack? Or stop a particular
candidate being elected? Etc.

Perhaps they will use this for good, as you mention. But it's still huge power
for a corporation to wield, effectively in secret - it's easy enough for an
observer to make a judgement about a bias in e.g. Fox News and call that out
- but it'd be very hard to observe subtle changes to individual news feeds.

I don't really buy your argument that they have a clear moral responsibility
to develop this capability. That depends on the potential good of such
capability if used for good, the potential bad if used for bad, and the chance
it will be used for each. So I don't see this as analogous to a corporation
conducting a clinical trial of a drug; this is more analogous to a clinical
trial of a drug that could also be used as a weapon, with the more complicated
ethical issues that would entail.

I think it's absolutely something there should be ethical discussion around.

~~~
gwern
> Perhaps they will use this for good, as you mention. But it's still huge
> power for a corporation to wield, effectively in secret - it's easy enough
> for an observer to make a judgement about a bias in e.g. Fox News and call
> that out - but it'd be very hard to observe subtle changes to individual
> news feeds.

This is power and capability given to all media organizations. When you permit
their existence, you permit both their deliberate use of their capabilities
for goals you dislike but also their _accidental unintentional random_ biases.
There is no difference in terms of the consequences.

I am reminded of one of the AI Koans
[http://catb.org/jargon/html/koans.html](http://catb.org/jargon/html/koans.html)
:

> In the days when Sussman was a novice, Minsky once came to him as he sat
> hacking at the PDP-6. “What are you doing?”, asked Minsky. “I am training a
> randomly wired neural net to play Tic-Tac-Toe” Sussman replied. “Why is the
> net wired randomly?”, asked Minsky. “I do not want it to have any
> preconceptions of how to play”, Sussman said. Minsky then shut his eyes.
> “Why do you close your eyes?”, Sussman asked his teacher. “So that the room
> will be empty.” At that moment, Sussman was enlightened.

~~~
3rd3
Isn't your reference to other media organizations an appeal to popularity? Why
shouldn't we strive for media that are more concerned about privacy and
dignity? Furthermore, the interactivity and private data make social networks
a whole different beast.

> _There is no difference in terms of the consequences._

Not in terms of consequences, but there is the difference that in a scientific
study you need the consent of the experimental subjects. I think it's
problematic that they assume this was given when the users clicked the TOS
checkbox, since they must know that a large percentage of users does not know
what it entails.

------
spicyj
Am I a terrible person for finding this neither surprising nor unethical? If I
had the opportunity to do this study I'd definitely do it – where else are you
going to have the opportunity to see insights like that?

~~~
meowface
I don't find it surprising or unethical, but I do question the validity of the
conclusions drawn and the idea that doing this would actually get them the
kind of information they wanted.

------
brudgers
(IANAL)

The standard for psychological experimentation would normally be informed
medical consent, not the terms and conditions of a commercial contract.

This is the sort of thing that a clinical psychologist could lose their
license for.

~~~
kghose
Yes. If your study is done with federal money and at a research institution
you need the Institutional Review Board to clear your experiment. One of the
requirements is that patients/subjects be given a consent form and the
experiment explained to them.

It is possible to run an experiment while hiding its real purpose, if the
results require that the subjects not understand the design/aim. But these are
usually badly designed experiments, because you can never know whether a
subject has been exposed to the experiment before and can therefore game it.

I guess FB is arguing that users gave informed consent when they signed up for
Facebook. I don't know if there is a review board for experiments at FB.

I think, with a good enough lawyer, you could successfully bring a class
action lawsuit if you find users whose mood worsened as a result of this (a
user under treatment for depression, for example, whose mood is clinically
assessed).

------
toddh
Reactions against this sort of research are similar to the reactions against
the nudge research. People don't like it because it seems manipulative. But
the thing is, any order of events a programmer throws against the wall will
impact people, just as any process design will nudge people in some way.
Isn't it better to know how people are affected by our designs? People are
being affected either way; we just don't know how. It's better to know.

------
yuubi
According to the paper[1], "[the research] was consistent with Facebook’s Data
Use Policy, to which all users agree prior to creating an account on Facebook,
constituting informed consent for this research."

The Onion AV Club bit quotes "internal operations, including troubleshooting,
data analysis, testing, research and service improvement" as part of the FB
policy.

If that's really the part they're relying on, informed consent doesn't mean
much.

[1]
[http://www.pnas.org/content/111/24/8788.full](http://www.pnas.org/content/111/24/8788.full)

------
wmeredith
This is gross, but unsurprising. Facebook hasn't cared about their users in a
long time. It's their information not yours, it's their network not yours.
It's a systematic way for a third party to manipulate you into doing what they
want or what whoever is paying them the most wants.

Ugh, what a scourge on the tech community. Shit like this reflects badly on
all of us. Users hate this manipulative crap. Their MySpace days can't come
fast enough.

~~~
grecy
While I agree with you, I think you'll be hard pressed to find a money-making
company that isn't trying their darnedest to manipulate you into doing what
they want (buying their stuff, mostly).

------
mschuster91
Honestly, all I want is:

a) have an option to AUTOMATICALLY have all those stupid games banned, without
having to ban a new Farmville version or clone every other week (aka: I just
want to view real, user generated content and not be woken out of my sleep due
to some retarded friend playing Farmville at night and asking me for crops)

b) make it REMEMBER the "show me the full feed" and not switch to that
abomination it calls "top posts"

------
dang
Mostly a dupe of
[https://news.ycombinator.com/item?id=7956470](https://news.ycombinator.com/item?id=7956470).

------
platz
Wonder if we'll hear from any of the data scientists, who surely exist in
good numbers here.

Also, I wonder how much cognitive dissonance programmers feel, or how often
they rationalize or reassign values, when things like this are done, which
are quite typical in the nature of their profession.

------
judk
Fascinating how hard scientists will work, hunkered down in front of their
calculators, to discover basic facts about human social experience that
regular people have known for thousands of years: emotions are shared.

~~~
harmegido
Why would you make this comment? Do you really think it's a waste of time to
try to better understand the world? Are you aware that there exist many, many,
many examples of experiments showing commonly-held beliefs to be incorrect?

------
hindsightbias
Nobody could have predicted, except Vernor Vinge.

------
spingsprong
So now when the citizens are angry, the government can have a talk with
Facebook, to see if they can settle the nation down a tad.

Not creepy at all, nope.

~~~
spingsprong
Or, thinking about it more, get ready to downvote this too: governments are
more likely to do the exact opposite.

Want to cause a revolution against a dictator? Talk to the social networking
sites, see if you can increase unrest. We already know the US government
talked to social networking sites to help the Iranian people overthrow their
government.

[http://www.cbsnews.com/news/state-dept-asked-twitter-to-delay-maintenance/](http://www.cbsnews.com/news/state-dept-asked-twitter-to-delay-maintenance/)

A few small steps from that, and you're manipulating the emotions of nations
in an attempt to start a revolution, instead of just helping one that formed
by itself.

------
wwwwwwwwww
I can't wait to see Facebook leverage this to condition users into favoring
advertisers/products who pay for premium advertising packages.

Seriously, what the fuck is Facebook's PR team thinking, letting their
research staff publish something like this?

~~~
scrollaway
> seriously what the fuck is facebook's PR team thinking letting their
> research staff publish something like this

Yes, it'd be so much better to let that research be done in secrecy, and sold
to the highest advertising bidder. /s

As "scary" as some of the things Facebook (replace with the big-data corp of
your choice) does may sound, I very much appreciate when extremely-large-scale
real-world scientific tests like this one can be carried out, and I appreciate
a thousand times more that the results are published. Call me crazy, but to me
this is what "advancing the understanding of the human race" looks like...
rather than "let's do designs on both round AND square watches".

~~~
A_COMPUTER
Do I have any reason to think they aren't running other experiments in secret
and selling the results? And considering that the US government already takes
bids for "persona management" to influence social media, why shouldn't I think
they're a customer?

Facebook, and all centralized social media platforms, are dangerous. Nobody
should have this influencing power.

~~~
scrollaway
> Facebook, and all centralized social media platforms, are dangerous. Nobody
> should have this influencing power.

I couldn't agree more. That also was not my point.

~~~
A_COMPUTER
That secret and not-secret are not mutually exclusive was all my point; the
rest was just an aside. The knowledge is even useful, but for me it's been
completely overshadowed by knowing that they are manipulating us to learn, in
effect, how to manipulate us better. It's not even theoretical.

I was watching Google I/O, and in the section about Android TV, they showed
how the device, if you left your TV on, would show you soothing images that
they selected. What kind of power is that! Imagine if suddenly, all the images
on every Android TV switched to the same image. How would what that image
presented affect the world at that moment?

