
Facebook experiments had few limits - anigbrowl
http://online.wsj.com/articles/facebook-experiments-had-few-limits-1404344378?mod=WSJ_hp_RightTopStories
======
macspoofing
Informed consent is a bedrock of ethical psych research. If a lowly undergrad
is expected to get consent before administering any sort of dinky survey, why
is Facebook exempt? They created a study which purposely manipulated the
emotional state of their customers in order to publish results in a scientific
journal. So Facebook doesn't have to give a shit about ethical research
because they're Facebook? I don't understand how people here can't understand
why these protocols are in place and why others see a problem when they are
ignored. It was completely unethical.

~~~
icambron
Eh, so someone made up a bunch of rules and stamped "ethics!" on them, and
now, completely independent of the original justification, it's somehow the
law of the land. But, at least as applied here, it's pretty bogus.

Think of it this way: there's some sort of algorithm that controls what goes on
your Facebook feed. There has to be, basically, or else they just have to show
you everything. The algorithm takes in sentiment analysis as part of making
those decisions, which seems perfectly reasonable. So far, just by building
something that interacts with users and affects their emotions, FB has a
completely uncontrolled experiment. If they show negative stories vs positive
stories, what happens? So far, they just don't know, which seems really bad.
Well, to find out the consequences of what they actually do, they have to dial
up and down some knobs and see the effects. If they can't _try things and find
out what happens_, how do you expect them to do anything? Just guess? Even
worse, if they just guess, they're still experimenting, but now with way less
information.

This is completely normal, and every business does it. They have to answer
questions about what effect the things they do have, or get stuck not doing
anything. A huge part of what businesses do is interact with people, so a huge
part of that effect is actually the impact on their customers' emotions. What
if we make the colors on this display brighter? What if we play different
music in our store? Psychology experiment! You can call it "manipulation" if
you want, but you're really just going for cheap connotation points. Almost
any editorial decision you could make on any subject with any audience is
manipulation. Being systematic about it doesn't make it more so. Do you think
you should be asked for consent before being subjected to an A/B test on an
ecommerce site? Because you're definitely being manipulated in the relevant
sense.

It sounds to me like the objection is not so much that FB _ran an experiment_
as that they _published a paper about it_. You could argue that the
publication itself is what's dispositive, but, well, nobody is arguing that.
And it seems that if they're going to do research anyway, we should actually
encourage them to share their results.

~~~
seanp2k2
Extreme example (IANAL): a person who uses Facebook heavily and is part of the
"more negative stories" group commits suicide. Is Facebook exposed to any
legal risk because of this?

IMO, it would have been pretty easy to ask some random users if they wanted to
participate; from asking around my group of friends, we all would have opted
in because we find it interesting, especially if the findings would be
published.

~~~
anigbrowl
Exactly. Most people would be delighted to be asked if they wanted to be part
of Facebook's 'Insider' test group or somesuch, and they don't have to know
what they're being tested on ahead of time. The FB researchers certainly knew
what standard practice in this area was: any psychology/social
science/economics paper that depends on tests or surveys spells out the
participation invite first thing in the methodology section.

------
zachrose
> The little-known group was thrust into the spotlight this week by reports
> about a 2012 experiment in which the news feeds of nearly 700,000 Facebook
> users were manipulated to show more positive or negative posts. The study
> found that users who saw more positive content were more likely to write
> positive posts, and vice versa.

Ok, this seems like it should be about as controversial as a supermarket that
plays downbeat music and tests to see if sales decrease.

That's not to say that the study wasn't manipulative, involuntary, and barely
consented to, but hey, that's Facebook in a nutshell.

The tone of this article is "these studies happened with no oversight!" but it
never suggests who would oversee Facebook, except Facebook, which doesn't
really seem like oversight at all.

~~~
betterunix
Quite a few conferences now require that experiments involving human subjects
be reviewed by an ethics committee of some kind _before_ being carried out.
Universities have such committees and they do oversee themselves pretty
effectively.

So no, I do not think that Facebook overseeing its own researchers is so far-
fetched. The ethical review board would probably consist of a combination of
lawyers, PR people, and people with research backgrounds. The lawyers and PR
folks would not have seen this experiment as a "great opportunity for research
that was previously impossible" so much as "lawsuit bait" and "PR nightmare."

~~~
zachrose
That's a good point. It is possible for a panel of professionals to qualify
and limit the kinds of work that are ethical. That's what ethics are!
Accountants, lawyers, and doctors are all good at that.

The critical difference, in my mind, is that academic research takes place
more or less in the public sphere. Even when it's not, an academic can always
be counted on to challenge and discredit a fellow academic.

An in-house council of research practice could exist, but it would take a
group of professors with reputations on the line. And then again, to my main
point, Facebook's central business practices revolve around manipulating users
in ways that show even _less_ concern for the user's well-being.

------
click170
I'm genuinely scared by the number of people who do not understand what is
wrong with Facebook doing this without consent. I guess it was just a classic
case of projection on my part, thinking that other people felt the same way I
do about this.

One of the questions this raises for me, though, is whether the people who
don't see this as an issue feel that way because they've already submitted to
the oppression that is Facebook. Has Facebook already "broken" them, as you
would a steed?

------
aniro
What a brilliant and effective tool for propaganda and other forms of social
control.

We should be sure to only show people patriotic words on July 4th so they can
think patriotic thoughts and be spared all that negative criticism of the
government they're normally subjected to. Maybe we should do that for all
national holidays. Or maybe we should just do it all the time.

Good thing it wasn't a military branch providing funding for this. That would
be scary.

------
ufmace
I've honestly been ignoring all of the articles on this subject up to now.
Finally decided to read one and see what all of the fuss is, and I still don't
see the point. Are we supposed to be upset that they manipulated which things
show up on the news feed in what order or something? At the end of the day,
it's still just Facebook.

~~~
anigbrowl
I was a bit surprised by the opener:

 _Thousands of [Facebook] users received an unsettling message two years ago:
They were being locked out of the social network because Facebook believed
they were robots or using fake names. To get back in, the users had to prove
they were real._

I vaguely remember the fuss about this at the time on HN, and was a bit
disturbed to discover that it was staged. I haven't used Facebook for more
than an hour or two since 2010, but for many other people it seems to be their
'home on the internet' and the notion that they would be arbitrarily locked
out of their accounts under false pretenses is a troubling one.

------
millioncents
I cannot recall in which book I read this (possibly an NLP book), but a
similar experiment was conducted before.

A university professor asked students to solve an anagram. After they were
done, they had to walk into the professor's office to tell him the word. In
his office he'd be pretending to have a conversation with a visitor.

The words the students had to solve were either positive, negative, or
neutral.

The result was that students who had a negative word interrupted the
professor's conversation with the visitor far more quickly than students who
had a positive or neutral word.

After the students told the professor the word, he asked them whether they
believed the word had an impact on their emotions. All of the students said
no, but the data showed the negative words had an effect.

------
adregan
I wish this had all come out in time to have been a part of this week's "This
American Life." The episode was called "The Human Spectacle"[0]. There are so
many interesting parallels with several of the pieces presented (most notably,
Acts 1 and 2)

It's staggering to think that, out of all of Facebook's users, 700,000 is
still only a small fraction of the total. I wonder what Facebook's ongoing
role will be as a source of anthropological data.

0: [http://www.thisamericanlife.org/radio-archives/episode/529/h...](http://www.thisamericanlife.org/radio-archives/episode/529/human-spectacle)

------
hagbardgroup
> "They're always trying to alter peoples' behavior."

Just like every customer of the Wall Street Journal who isn't a subscriber:
all of the Journal's sophisticated advertisers, online and in print, are
running similar tests.

------
zaidf
This will go down as a case study in how _not_ to publish a study for the
mainstream. The reality is that companies run experiments similar to FB's all
the time in order to optimize revenue. But most companies don't publish their
results like FB chose to do.

------
hadtocreateit
I generally refrain from posting comments, but the current debate is just
getting ridiculous. It's more about people hating Facebook than about any real
ethical issues. The worst part is how IRBs are invoked as if they were staffed
by ethical gods. A similar situation has happened before, as explained in this
NYTimes article by Atul Gawande:
[http://www.nytimes.com/2007/12/30/opinion/30gawande.html?_r=...](http://www.nytimes.com/2007/12/30/opinion/30gawande.html?_r=1&oref=slogin)

The key question is at what point a given method stops being a quality-control
decision and becomes a research experiment. What Facebook did was an internal
quality investigation. What are they supposed to do, show posts at random? For
this to qualify as an "experiment" we would need a baseline, and in reality
there is no such baseline.

~~~
ghaff
Yeah, I seem to be in the minority in that I find it hard to get upset here.
In fact, the lesson I take away is that if you, as a company (especially a
highly visible one), do any sort of A/B testing that might be remotely seen as
manipulating behavior by anyone (which covers broad ground), you probably
shouldn't make it public.

I'm not sure I see how this is all that different from running a bunch of ads
with different messages and/or emotional content and seeing how differently
they perform.

~~~
Alupis
No...

You should tell your users and include it in your TOS beforehand (not do it,
then add it to your TOS after the fact).

~~~
mkal_tsr
Yup, that's the big issue in my opinion. Their TOS never listed research as a
condition that you accept. When Reddit decided to use their data set for
research and make it available, they had a public blog post explicitly calling
it out and offered an option to exclude your data from research. Naturally I
opted out: I was informed of the change and I acted on it. The users on
Facebook never agreed to the research component of the ToS (because it didn't
exist yet), did not know they were involved, and were included in a PNAS study
that had Cornell researchers involved.

------
droidist2
What's next? Facebook studies showing that users who were shown more baby
pictures in the news feed became more likely to have babies within the next 2
years?

------
tomasz207
This reminds me of the short story titled "The Wave."

------
jimjohnson
making a link [https://gist.github.com/](https://gist.github.com/)

------
Dwip551
:D

------
PeterGriffin
I'm just thinking, let's say McDonald's was putting something in our food to
alter our emotional state and measuring the outcomes.

Would food-industry people come out in support of them the way half the people
here seem fine with what Facebook is doing? "Hey, it says Happy Meal on the
box, and they're delivering!"

~~~
beedogs
It's really incredible; the number of people with full-blown Stockholm
syndrome in this thread is staggering.

Why do so many people here think this kind of thing is okay when Facebook does
it? Is it simply because they feel kinship to a tech-related company?

~~~
PeterGriffin
> Why do so many people here think this kind of thing is okay when Facebook
> does it? Is it simply because they feel kinship to a tech-related company?

Yup. It's like asking in a wolf forum if having the sheep for dinner is
considered ethical.

