
XKCD: Research Ethics - codelucas
http://xkcd.com/1390/
======
panarky
Companies have been purposely making people feel bad about themselves for a
very long time.

Ads intentionally make people feel ugly, lazy, stupid, fat, dirty, smelly,
dandruffy, lonely and ashamed so they'll buy whatever product is on offer.

Commerce is all about manipulating people for profit. I'm not a Facebook fan,
but tell me how this is worse than standard operating procedure with every
other consumer product?

~~~
eridius
Because those aren't psychological experiments. Nobody's claiming it's ethical
for an ad company to make you feel bad about yourself, but that's not really
the point here. The point is that there are strict ethics rules for running
experiments, and one of those rules is that, barring a narrow exception (which
does not apply here), the subjects need to knowingly take part in the
experiment.

~~~
jessriedel
You're not explaining the inconsistency, you're just giving it a name. If
someone says "why are the sentences for crack cocaine 5 times longer than for
regular cocaine?", it's silly to answer "because these are the rules we have,
and they were passed by official people".

~~~
eridius
That's really quite a terrible and completely irrelevant analogy. It's not
that the rules are different, it's that we _have_ rules, period, for
experiments. There are no ethical rules for advertising. There are rules about
lying, but that's basically it. But experiments? Yeah, we have ethical rules
for those, and Facebook seems to have deliberately ignored them.

~~~
aianus
What about ad companies running different ads in different markets and
analyzing the resulting sales data? Isn't that an experiment?

~~~
semanticist
Not in the sense the OP means. They mean in the sense of a scientific
experiment being done nominally to advance our understanding of the world (or
in the case of psychology, specifically the mind and our perception of the
world), vetted by a review board, and published in a peer-reviewed journal as
part of the academic process.

This is very different from the colloquial use of 'experiment' to talk about
an A/B test or similar. It's a very specific context where we have very
specific rules in order to prevent people from committing horrendous acts in
the name of obtaining greater understanding.

------
plorg
It annoys me a good bit that when I choose the 'recent stories' feed rather
than the 'top stories'...whatever it is...the setting gets reverted every
time the app is restarted, or occasionally in the web version as well. Of
course, it's par for the course, so it's not like I should be surprised. Yet
the setting seems to be stored on the Facebook servers, since it persists
across browser sessions on the web app.

I guess you could chalk some of that up to incompetence, but most of it to
apathy.

~~~
mijoharas
I just read this the other day: if you have a link that you access Facebook
from, you can change it to
[http://www.facebook.com/?sk=h_chr](http://www.facebook.com/?sk=h_chr) and
that will fix it for you[0]. Be that as it may, I do agree that it seems to be
developer apathy that this hasn't been fixed already.

[0]
[http://www.pcworld.com/article/262206/make_your_facebook_new...](http://www.pcworld.com/article/262206/make_your_facebook_news_feed_default_to_most_recent.html)

------
tsycho
I found this article quite insightful on this topic:
[http://www.zephoria.org/thoughts/archives/2014/07/01/faceboo...](http://www.zephoria.org/thoughts/archives/2014/07/01/facebook-experiment.html)

Here's my TLDR... If this research enables Facebook to curate people's feeds
to make them happier, would we consider the cost acceptable? Does Facebook
even have the right to do so? Is it ethical to hide your friend's serious post
or cry for help and show you cat pics, even if that will make you happier in
the short run? Separately, Danah talks about how the media's hypocritical
outrage over this is likely going to lead to Facebook and other companies
still doing a bunch of unethical things to increase their revenues, but just
becoming more secretive about it.

Seriously, read the article. Don't stop with my inadequate tldr.

------
knodi123
The newsfeed doesn't show us every single thing that every friend does. So it
has to make some kind of decision about which things to show and which not to
show.

If the newsfeed accidentally did a poor job of this (and mine often does), we
could accuse facebook of being incompetent (or less than perfectly competent).

If, however, newsfeed _intentionally_ did a poor job of this, and _purposely_
showed us things specifically designed to make us sad, that's gone beyond
"less than competent". That's "actively malicious".

THAT is the reason people are getting upset. Or at least it would be if they
understood how facebook works.

~~~
nemothekid
And what standards body has defined the metrics so that we can measure what a
"poor job" is in this case? The comic still stands, Facebook could very well
be doing a poor job now, and is that unethical? What if FB's input had no
overall effect? Is the test still unethical even though no one was sad? How
will we define what is and isn't unethical? If advertisements make me sad,
and FB experiments with placing more ads on my news feed, leading me to become
depressed, is that unethical? Are they still intentionally doing a poor job
even though they are probably more or less unaware of my disposition to ads?

I think Randall has a fair point here. FB has essentially been doing this for
years and _something_ about their latest test got all our panties in a bunch.
It's clear now, though, that FB never had to adhere to any sort of standards
for their website.

~~~
DalekBaldwin
Going further: what if this was a broad exploratory study, not meant to test any
specific hypothesis? What if they rated posts based on thousands of features,
a parameter space large enough to guarantee they would find plenty of
correlations (both spurious and substantial), and randomly perturbed each
user's feed and monitored how the features of their contributions changed? I
would expect that they already do this, and from the very small amount of
detail I've seen in articles about the mood factor so far, I wouldn't be
surprised if this was just one of the more interesting correlations that fell
out of that sort of process.
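The effect described above can be simulated: score enough unrelated features and some will correlate with any outcome purely by chance. A minimal sketch assuming i.i.d. Gaussian noise for both the per-user outcome and the features (the sample sizes and the 0.14 threshold are illustrative choices, not figures from the thread):

```python
import random

random.seed(0)
n_users, n_features = 200, 1000

# Purely random "outcome" per user: no real relationship exists
# anywhere in this data by construction.
outcome = [random.gauss(0, 1) for _ in range(n_users)]

def corr(xs, ys):
    """Pearson correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Count pure-noise features whose chance correlation with the outcome
# looks "interesting" anyway.
spurious = sum(
    abs(corr([random.gauss(0, 1) for _ in range(n_users)], outcome)) > 0.14
    for _ in range(n_features)
)
print(spurious)  # typically dozens of features clear the bar by luck alone
```

With 200 users, |r| > 0.14 is roughly a two-standard-error bar, so on the order of 5% of the 1000 pure-noise features clear it anyway.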

------
qu4z-2
I wonder whether this study really shows what people are inferring from it.
Aside from the somewhat unreliable metrics ("not happy" contains "happy", so
it scores as positive!), an increase in negative posts when being shown negative posts
could subconsciously promote Facebook as a place to vent, etc. I'm sure the
study addresses these things, but as usual the summary is "Seeing unhappy
things on Facebook makes people unhappy!"
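The metric flaw in the parenthetical above is easy to reproduce: a word-count scorer with no notion of negation rates "not happy" as positive. A deliberately naive toy sketch (the word lists and function here are illustrative, not the study's actual tooling):

```python
# Toy word-list sentiment scorer: counts positive words minus negative
# words, ignoring context entirely.
POSITIVE = {"happy", "great", "love"}
NEGATIVE = {"sad", "awful", "hate"}

def naive_score(post: str) -> int:
    """Positive-word count minus negative-word count, context-blind."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(naive_score("i am not happy"))  # +1: "not happy" counts as positive
print(naive_score("i am sad"))        # -1
```

Handling negation properly would require at least phrase-level rules, which is exactly the context such word-count metrics discard.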

------
Houshalter
I don't get the controversy over this. How is this different than a news
website displaying more tragic stories or more upbeat stories on their
homepage, and seeing how it affects users? Someone in another thread mentioned
supermarkets playing sad music and happy music. Or even Facebook's normal
newsfeed algorithm tweaking, where they've very likely done tests similar to
this.

~~~
rtpg
Because in your example, it is an interaction between you and the news site.
You expect the news corporation to do this.

Facebook is not about communication between you and Facebook, but between you
and other users. Facebook is effectively manipulating communication and the
relationship you have with your friends on the website.

Whether manipulation of the feed is acceptable depends on the intent of the
manipulation. If the goal is to make sure that you see things you want to see
anyway (posts by friends you interact with often, for example), then that
seems reasonable. Here the manipulation is not to make the user's experience
better.

~~~
Houshalter
Their goal is to make you stay on the website the longest, or click on the
most ads, or whatever. It's not like facebook is normally benevolent and
tuning their filter to help you. In any case, the filter is presumably for
posts from distant friends and pages you've subscribed to, not manipulating
your close friends.

