

The brief history of Facebook apologizing - asicboy
http://boingboing.net/2014/07/02/the-brief-history-of-facebook.html

======
axxl
I'm always interested in the wording of the apologies themselves as well. In
the most recent one: "My co-authors and I are very sorry for the way the paper
described our research and any anxiety it caused." Specifically "the way the
paper described our research". Not for actually doing the experiment, just
that it came off wrong to the public...

~~~
gress
In this case it seems reasonable.

1\. A/B testing is done all the time and has undefined emotional effects on
users other than to maximize addiction & ad revenue. Google does this
constantly and is worshipped for its ingenuity.

2\. The whole point is that they didn't _know_ whether there would be
emotional contagion, which is why they conducted the experiment. And yet
similar experiments are happening all the time, so the experiment didn't add
to the emotional burden of users - it just uncovered it.

[edit: I'd go further and suggest that maybe it would be unethical for
Facebook _not_ to do experiments like this. They need to understand this kind
of effect in order to make responsible design decisions.]

~~~
PeterGriffin
Google is most definitely not "worshipped for their ingenuity" in data mining
and profiling their users. Have you been living under a rock for the past
decade?

People don't trust Google. They've ruined their brand by abusing their users'
trust, and Facebook is on the same path.

As for what's "reasonable": Apple operates a large messaging network as well -
iMessage. And they also published a paper recently.

However, their paper was about how their architecture prevents anyone, not
just third parties, but in most cases _Apple's own employees_, from reading
and manipulating user data. No one at Apple can read a message you send
between two iOS devices. Apple aren't saints, but I like their idea of
_reasonable_ way more than what you're trying to sell here.

~~~
gress
Woah - slow down. I'm not selling anything.

I think you're right about Google's brand, but I think they can be distrusted
and worshipped at the same time.

I also think that Apple is showing that compromising privacy is a consequence
of Google and Facebook's business models, and not some fundamental rule of the
web.

But, I don't think any of this changes my view that this particular study is
not unethical in the way people are claiming given that Facebook exists and
behaves as it does. If you think that Facebook operates an unethical business,
I'd still argue that this doesn't make it worse, and that becoming less
ignorant about how they affect people is a good thing.

------
jackgavigan
Apologies begin to sound pretty insincere when the company (or person) making
the apology _keeps on doing the same damned thing that they're apologising
for._

This particular apology is even worse because they're not apologising for
_doing_ the research; they're apologising "for the way the paper described our
research and any anxiety it caused."

They're not saying "We will never do this again."

They're not saying "We will never do this again without first obtaining
ethical approval from an appropriate review board."

They're not saying "We will never change your Facebook newsfeed to affect your
emotional state in order to make you more susceptible to adverts for
particular products."

They're not saying "We will never accept payment from an advertiser to change
your Facebook feed in order to affect your emotional state."

So, what conclusions should we draw here?

~~~
gress
How is this the same as something they've done before?

------
spike021
Not really sure why Facebook needs to apologize at all.

It's a free service that you agree to use. You should read the fine print if
you care about anything like this.

If it can be proven that the experiments, or anything else worth apologizing
for, are detrimental in some way, then something needs to be said. But
otherwise, I think they're free to do what they want with their data.

~~~
sp332
Free things still aren't allowed to cause harm to the people who use them. And
the fine print wasn't changed until after the experiment.

~~~
spike021
Right. Like I said, if it can be proven that their experiment caused harm of
some kind, then that's when something should be done.

------
higherpurpose
Well, you know how it is. They do it once, shame on them. They do it 10
times...shame on you for continuing to use Facebook.

