
Facebook Further Reduces Your Control Over Personal Information - jlhamilton
http://www.eff.org/deeplinks/2010/04/facebook-further-reduces-control-over-personal-information
======
fnid2
This weekend one of my friends (IRL) asked me why I wasn't on Facebook. She
was shocked, as many people who know me are when I tell them I'm not on
Facebook. They see me as an internet guru. How can an internet guru not be on
Facebook? What they don't know is that most internet gurus I know are also not
on Facebook.

The reason I am not on Facebook is quite simple: I do not want to give
Facebook my private information, because I do not trust Facebook. I do not
trust Mark Z. I do not believe that they have individual users' best interests
at heart.

What I do believe is that they are interested in getting more users, sharing
more information, and making more money. None of these are in my best
interest. In fact, as Facebook gets more users, the quality of Facebook will
continue to decline, as will their treatment of those users.

It takes a lot of money to operate 10,000 servers (or however many they have
at this point). It takes a lot of ad revenue. It takes a lot of people and
investors. All of that means continued focus on the bottom line to ensure
survival.

Facebook's only value is in the information people put into it and they will
do what they need to do with that information to keep powering those tens of
thousands of servers.

~~~
by
"What I do believe is that they are interested in getting more users, sharing
more information, and making more money."

I don't see why sharing more information will make more money in the long
term. Keeping the users happy and engaged would seem more profitable to me.
What is the logic?

"In fact, as Facebook gets more users, the quality of facebook will continue
to decline, as will their treatment of those users."

I don't understand why treating the users badly would increase the value of
the company.

~~~
fnid2
As wmeredith replied below, and I'll elaborate here:

Facebook has only one product: information. When companies want to grow, they
have to produce more product. Facebook can produce more product in only three
ways: getting more users, getting users to add more info, or sharing more of
the info they've already added. There is no other way.

When new signups plateau, and users are content with the amount of info
they've added or the value of additional data declines due to signal-to-noise,
the only method left is to share more of the information they already have, in
the form of greater public visibility.

Remember, Facebook's customers are not the users; they are the advertisers. If
you want to establish a relationship with a company designed to make a profit,
profit should be the foundation of the relationship. Facebook's users do not
profit. They provide labor to improve the value of Facebook with no
compensation in return, so the relationship is unbalanced. Even factory
workers are paid, albeit not much comparatively, and when people quit, someone
else somewhere making less will migrate to the factory to fill those jobs.

In Facebook's case, turnover can't be filled. The people won't come back, and
there is a diminishing pool of new users to draw from. Eventually, users will
stop joining because the knowledge we are sharing here will reach them before
they do, and they'll have lost their naive innocence -- a requirement of any
sharecropper.

Sharing more information makes more money because the money is made through
the information: advertisers can better target ads. By treating users badly, I
mean acting in an increasingly selfish manner by doing things their users
don't appreciate. Users don't want someone unilaterally deciding how much of
what they consider their private information is shown to third parties --
millions of other parties -- like the FBI, the police, the IRS, or potential
employers.

~~~
stanleydrew
_Facebook's users do not profit. They provide labor to improve the value of
facebook with no compensation in return, so the relationship is unbalanced._

I think this is demonstrably false. It's true that my brother doesn't get paid
by Facebook to spend time on the site. Google doesn't pay me to use GMail
either, and yet I use it all day every day. Money isn't the only way to provide
value.
And there's no way Facebook could have reached 400 million users without
providing value to a lot of those people.

------
_delirium
Wow, this is pretty lame. I became a "fan" of various things (apparently now
renamed "like") under the understanding, explicitly stated by Facebook, that
it wasn't part of my public information. It seems they changed that without
warning or opt-out? Is there at least an easy way to quickly remove all the
hundreds of things I've become a fan of before Google starts indexing them, if
I'd prefer them not to be indexed?

What I'm getting from this is that you should assume that _all_ your Facebook
information is publicly available, and act accordingly, because they could
make it so at any time. At this rate of trustworthiness, I wouldn't be that
surprised if in 2 years status updates were made retroactively public with no
opt-out (and maybe without even telling you).

Edit: Is this even legal? Consider the following scenario: Someone signed up
for a Facebook account 3 years ago, and entered some of this information, at a
time when their privacy policy explicitly promised that the information would
not be shared. They have not logged in since, so cannot be said to have even
implicitly agreed to a change in the privacy policy (and Facebook has not
mailed out any notice of the change). Now their information is made public, in
violation of the privacy policy. Not sure how easy it'd be to enforce, but at
the very least it seems _sleazy_.

~~~
ElbertF
> you should assume that all your Facebook information is publicly available

Indeed you should; I'd say this applies to any website.

~~~
_delirium
That would certainly make mint.com pretty interesting...

~~~
mjgoins
mint.com is a much more absurd idea than facebook. And the fact that it has
users is even more mind-boggling.

------
tcskeptic
Unless I am misunderstanding this <http://bit.ly/c4NFgO> it appears that this
article misrepresents the changes in a couple of important ways:

1) Facebook says this is opt-in: _Opt-in to new connections: When you next
visit your profile page on Facebook, you'll see a box appear that recommends
Pages based on the interests and affiliations you'd previously added to your
profile. You can then either connect to all these Pages—by clicking "Link All
to My Profile"—or choose specific Pages. You can opt to only connect to some
of those Pages by going to "Choose Pages Individually" and checking or
unchecking specific Pages. Once you make your choice, any text you'd
previously had for the current city, hometown, education and work, and likes
and interests sections of your profile will be replaced by links to these
Pages. If you would still like to express yourself with free-form text, you
can still use the "Bio" section of your profile. You can also use
features and applications like Notes, status updates or Photos to share more
about yourself._

They reiterate this further down the page: _"If you don't want to show up on
those Pages, simply disconnect from them by clicking the "Unlike" link in the
bottom left column of the Page. You always decide what connections to make."_

2) If you choose to opt-in you can control whether your friends see the
connection on your profile by using the privacy settings.

So, it is opt-in, there is a simple way to remove yourself if you change your
mind, you can control the visibility on your own profile, and if you don't
want to faff around with the connections, you can just list this stuff as text
in some of the available text only fields.

What is the issue?

------
char
"As Facebook's privacy policy promised, 'No personal information that you
submit to Facebook will be available to any user of the Web Site who does not
belong to at least one of the groups specified by you in your privacy
settings'."

This is STILL true, and it's publicly known how to maintain privacy. You can
keep all your information private to everyone who isn't a friend, and you get
to control exactly who your friends are.

If you become a "Fan" of something or post on someone's comment/Wall, you're
sharing some of your information with that page. No shit it's public. Why is a
misunderstanding/lack of education about Facebook's policies equivalent to that
site being sneaky and/or evil?

~~~
jfager
You're misunderstanding the issue. Today's change takes previously private or
limited-exposure profile information and uses it to create ad hoc groups that
people never explicitly acted to join. It's very similar to the Google Buzz
fiasco, except with interests being exposed instead of contacts.

~~~
indigoviolet
It's opt-in.

~~~
jfager
Barely. The explanation of what you're opting into is terse and vague enough
that most people won't really understand what's going on, and if you decide
not to opt in, FB wipes out almost your entire profile.

~~~
indigoviolet
How would you support the claim that most people won't understand what is
going on?

Edit: I guess the point I'm trying to make is that Facebook actually tests the
UI and copy in experiments and in the lab before deploying it; so when you
make a claim that it's not working, I'd like to see some supporting evidence.

~~~
jfager
Because I have friends and family who ask me all kinds of crazy questions
about how things work, and I'm guessing the experience I had on FB this
morning won't be enough to let me dodge those questions. It's a personal
opinion, but one I think is pretty well grounded in my own experiences.

What isn't opinion is that opting out results in wiping out almost everything
from your profile. That's a fairly punitive action to take against someone who
doesn't want to opt in to an "opt-in" feature.

And saying that "Facebook actually tests the UI" means pretty much nothing to
me. Do they publish the tests and the results? How confusing can a feature be
before it's scrapped? Does the importance of the business function the feature
supports affect that threshold? On top of that, given FB's history of privacy
screwups and subsequent rollbacks, why should I have any faith in the company
at all in this area? FB has always seemed more than comfortable with the "ask
forgiveness rather than permission" model.

