
Facebook's Zuckerberg Preaches Privacy, but Evidence Is Elusive - pseudolus
https://www.bloomberg.com/news/articles/2019-05-01/facebook-s-zuckerberg-preaches-privacy-but-evidence-is-elusive
======
gfodor
Two points:

\- Focusing on adding e2e encryption for messaging services serves as a nice
deflector for FB's privacy issues. Their advertising systems aren't going to
change in terms of efficacy and revenue potential if private messages between
users are inaccessible to Facebook. I'd be surprised if they're using this
data anyway, but it's Facebook, so who knows. In any case, expect the
messaging to focus on increasing privacy through message and video encryption,
which completely ignores the underlying issue of profile-building and behavior
modification that Facebook's non-messaging platforms allow, and which I heard
no plans to address.

\- IMHO, pretty much all posturing around privacy by Facebook should not be
taken seriously until they announce a change to their business model. Since
they haven't, it doesn't take much effort to tease out the rest: their
business model relies upon surveilling user behavior and selling behavior
modification products, so you can expect no announcements of product changes
that would significantly undermine those efforts in the name of privacy until
their business model changes. Everything until then is at best noise, at worst
dishonest framing to take the heat off of them in the eyes of those who are
ignorant of the underlying dynamics, like regulators or the general public.

~~~
rando444
Just a note on your first point. End-to-end encryption only encrypts messages
in transit.

This does not make messages inaccessible to Facebook, as they control both
endpoints.

~~~
menzoic
End to end means from client to client. Facebook wouldn't be able to see the
messages.

~~~
nokya
Not really. End-to-end encryption means messages will leave the app encrypted
and only the recipient app will be able to read them. The middleman remains.

A good analogy: it's like writing a letter and asking the mailman to put it
into an envelope, so he leaves the room and comes back with your sealed
envelope.

The mailman then looks at you and says, "I won't read it, I promise." Wink
wink.

That's end-to-end encryption for the commons.

~~~
gfodor
The main gap in trust is that Facebook does not disclose their source code or
provide a way for users to confirm their device is running the published code.
Fundamentally, if their implementation properly implements a published e2e
protocol, they should not be able to read the messages, since the only things
traveling in the clear over the wire through their servers are public keys.
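If the implementation is honest, the math backs this up: in a key-agreement
scheme like Diffie-Hellman, the server relays only public values. A toy sketch
(illustrative parameters only, not any real messenger's protocol):

```python
# Toy Diffie-Hellman key agreement, illustrating the point above: the only
# values that cross the wire (i.e. pass through the provider's servers) are
# public. Sketch only -- small parameters and no authentication; real e2e
# messengers use authenticated elliptic-curve protocols.
import secrets

P = 2**127 - 1  # a Mersenne prime; real deployments use far larger groups
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1  # secret: never leaves the device
    pub = pow(G, priv, P)                # public: relayed by the server
    return priv, pub

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# The server sees only alice_pub and bob_pub, yet both sides derive the
# same shared key, from which message-encryption keys would be derived.
alice_shared = pow(bob_pub, alice_priv, P)
bob_shared = pow(alice_pub, bob_priv, P)
assert alice_shared == bob_shared
```

The remaining trust gap is exactly the one above: without published source and
verifiable builds, you can't confirm the client actually keeps `priv` on the
device.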

~~~
panarky
_> The main gap in trust is that facebook does not disclose their source code_

Nah, nobody gives a damn about the source code, or reproducible builds to
ensure the binary they're executing was compiled with that source.

The main gap in trust is that Facebook has a long history of lying and
cheating to maximize its own gain, so there's no basis to trust that its new
moves are good for users.

But conspiracy theories about e2e being read by Facebook are probably bogus,
and certainly a distraction.

Even though the source is closed, I'd bet they're doing a credible job of
securing messages so that even Facebook can't read them.

That's not the issue, it's a distraction from what's really important.

What's really important is that Facebook has lost control of the monster it
created. This is a way to let the monster loose and avoid accountability.

Their platform amplifies harmful content like incitement to violence,
terrorist recruiting and coordination, and political propaganda.

By encrypting everything so even Facebook can't read it, Facebook escapes
accountability for the harm their platform inflicts on people.

Very similar to a chemical company dumping toxic waste in public water,
Facebook is dumping their pollution on the public by using strong encryption
to make it physically impossible for Facebook to control the monster they
created.

------
throwaway55554
> “It’s going to take time,” Zuckerberg said of Facebook’s privacy-focused
> future. “I’m sure we’re going to keep on unearthing old issues for a while,
> so it may feel like we’re not making progress at first. But I think that
> we’ve shown, time and again as a company, that we can do what it takes to
> evolve and build the products that people want.”

IOW, they'll talk about privacy for a while until everyone stops talking about
how evil FB is. Then they'll slowly stop talking about it and work their way
back down to where they are now. All the while not actually doing anything at
all to change.

~~~
pinewurst
It's about FB/Zuckerberg redefining "privacy" as being between FB users, and
talking to that. They don't have to work their way back down from that.
Privacy between users and FB (i.e., none, or even negative) isn't mentioned at
all.

~~~
rhizome
_It's about FB/Zuckerberg redefining "privacy" as being between FB users and
talking to that_

Even more so, it'll be about FB redefining "privacy" as "whatever FB is doing
with their users," then lobbying themselves into regulatory capture for the
nation (globe?) as a whole.

I forget where I heard it, but there's an old aphorism, "of course you're
free, because this is what freedom looks like."

------
duxup
Have there ever been any effective privacy limitations that Facebook placed on
itself that have done a thing?

We've seen internal memos raising questions effectively ignored. Facebook's
own actions seem to indicate there is no limit to what they'll do to users, or
to the companies they partner with (specifically, lying about what level of
access they're providing). They don't seem to care even when they operate on
another platform: when their traffic-monitoring VPN was removed from Apple's
App Store, they just renamed it and put it out there again.

Facebook as an entity seems antithetical to privacy at its core. It couldn't
have grown into what it is with limits... I can't imagine they are capable of
being anything else.

~~~
johannes1234321
That depends on the definition of "privacy". Over time they have limited
access to their APIs, so that a new "Cambridge Analytica" would have a harder
time extracting large amounts of data.

Of course that serves their business: in the beginning they had to be the hub
everybody connects to, to get as much attention as possible. Now their
business is to monopolize the data, so that ads are sold via their systems.

Framing that as privacy is a great strategy, from Facebook's perspective.

~~~
JohnFen
> Framing that as privacy is a great strategy, from Facebook's perspective.

Indeed, because it lets them pretend like they're defenders of privacy while
being able to completely ignore the privacy threat that Facebook itself
presents.

------
elliekelly
> “A lot of the focus is on changing the way that consumer-to-consumer
> interaction works,” said Greg Sparrow, senior vice president and general
> manager at CompliancePoint, a data privacy and security consultancy. “While
> that is laudable and it’s great that they’re doing that, but fundamentally
> it doesn’t address the problem on the back-end side, which is businesses
> gaining access to this information and how they’re using it from a data
> monetization perspective.”

Bingo. Most people on Facebook are well aware of how "public" their posts are
but aren't aware of how public their _personal_ information is to advertisers.
Facebook's new "privacy" focus isn't intended to solve the platform's real
privacy problem, it's intended to distract from it.

Edit: Spelling

~~~
omarchowdhury
> Most people on Facebook are well aware of how "public" their posts are but
> aren't aware of how public this personal information is to advertisers.

How public is it, then? As a Facebook advertiser I've never seen this elusive
personal information collected from the masses, available for indiscriminate
pickings. Facebook only sells access to eyeballs coupled with anonymized
targeting based on this personal information you refer to, not the information
itself.

~~~
elliekelly
The average Facebook user is largely unaware of how Facebook tracks their
activity far beyond what they say and do on facebook.com in order to harvest
data about their personal lives: their financial situation, their relationship
status, their medical history, etc. Just because you can't download a file of
someone's personal information "for indiscriminate pickings" doesn't mean they
aren't selling access to it.

For example, someone might be gay and not yet out to friends and family.
Facebook probably knows from their browsing history. How difficult would it be
for someone to run an ad on Facebook, cleverly disguised as an "article" to
encourage clicks, targeting gay people in a particular region? Five minutes,
tops? Well, every gay person who clicks on that link has just given away their
IP address and location information, and the purchaser of that advertisement
has a pretty accurate list of gay people and where they might be located.
Hopefully they're using that list for benign purposes, but who's to say?

Do you think the average Facebook user is aware of how their information is
leaked by simply clicking on a link in a Facebook advertisement?
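The mechanics of that leak are mundane. A hypothetical sketch (all names,
segments, and addresses invented) of what the advertiser's own landing-page
server sees, with no data handed over by Facebook beyond the targeting itself:

```python
# Hypothetical sketch of the attack described above: the advertiser's own
# landing page logs each visitor, and because the ad was only shown to one
# targeted segment, every logged IP is effectively labeled with that trait.
from collections import defaultdict

AD_SEGMENT = "sensitive-interest, region=Springfield"  # chosen at ad purchase
click_log = defaultdict(list)  # segment -> list of (ip, user_agent)

def record_click(ip, user_agent, segment=AD_SEGMENT):
    """What any landing-page server sees on a plain HTTP request."""
    click_log[segment].append((ip, user_agent))

# Simulated clicks arriving from the targeted audience
record_click("203.0.113.7", "Mobile Safari")
record_click("198.51.100.23", "Chrome/Android")

# The advertiser never received personal data from Facebook directly, yet
# now holds a list of IPs (and thus rough locations) tied to the targeted
# sensitive attribute.
print(click_log[AD_SEGMENT])
```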

~~~
omarchowdhury
You (or someone who believes the same) should actually try this experiment and
see if the reality matches the expectation, to any level of (potentially)
destructive accuracy.

~~~
sapski
Here's a paper that describes it, confirmed with real profiles, not just
speculation: [https://hal.archives-ouvertes.fr/hal-01955327/document](https://hal.archives-ouvertes.fr/hal-01955327/document)

~~~
omarchowdhury
Thank you, I appreciate the reference. I'll take a deep look at this.

------
kerng
Isn't this at least the 3rd time he's said these things?

Unless there is a complete engineering stop, development of privacy policies
and tools, and a review of all existing tools and processes, the necessary
cultural change will not occur.

Facebook has the money to do that, and could possibly drive positive change
throughout the industry if they did something like this, the way Microsoft
made the Security Development Lifecycle a thing in the early 2000s.

But I'm already reading news that all Facebook messaging platforms will be
able to send messages to each other. Surely that violates everything WhatsApp
stood for in the past. It also raises tons of privacy questions about such
merges of technology and the data they connect.

------
blablablerg
Cookie Monster Preaches Restraint, but Evidence is Elusive

~~~
airstrike
This is such a fantastic analogy I'm not sure if it's funnier or scarier

------
username223
"Privacy?" Hah! More like "plausible deniability for fueling ethnic violence."
If they can't read the messages, then they aren't responsible. Meanwhile, most
of the useful ad targeting data is in the metadata, which they still collect.
They're truly shameless.

This is also funny, in a grim sort of way:

> Late last year, Facebook admitted that Clear History is taking longer than
> expected -- it turns out that browsing data, which the company uses to help
> send more targeted advertising to users on its social platforms -- is more
> deeply ingrained into Facebook’s systems than anyone realized. Simply
> finding and deleting the correct data without disrupting Facebook’s
> advertising and analytics businesses has been a big enough challenge that
> the product hasn’t gotten off the ground...

If Facebook were serious about this, they could easily implement a "nuke all
of my data" option that would wipe all of your history. Not just the stuff
gathered from beacons, but everything, so people could start over from scratch
with their new understanding of how Facebook operates.

But they clearly aren't serious. I suspect that the only real "clear history"
option will be exporting your data, deleting your profile, and making a new
one, preferably using a new email address from a public computer.

------
doctorRetro
Facebook will not change until there is sufficient external pressure (be that
from an authority, competition, etc.) to force it to change. End of. We've
seen the same story for the last couple of years: Facebook is bad with
privacy, Facebook makes a token attempt to improve privacy, Facebook gets
busted for being bad with privacy, and repeat.

~~~
OrgNet
Facebook will never change... it will just become irrelevant one day.

~~~
doctorRetro
I'm fine with that. Sooner the better.

------
skilled
The unfortunate part is that Facebook is designed to replicate the addictive
qualities of sugar and even heavier drugs like cocaine.

As a result, it's not that you can't make your own social platform, it's that
your platform lacks critical components for instant gratification.

Maybe I am wrong, though. I'm open to discussion of my perspective.

~~~
athenot
I have been putting thought into a social platform that would truly be for
users and not just a gamification of attention.

\- First it would have a cost, either in money to pay someone to develop/run
the platform or in time/knowledge to tinker with some open-source solution.
That shouldn't be a problem as the target market is the group of people who
are making a conscious choice to dodge the ad platform. BUT it does severely
reduce the network effect, possibly down to the point of becoming an
ultra-specialized niche club.

\- Then, there's the point you bring up: you'll essentially be trying to sell
steamed broccoli inside of a candy store.

~~~
elliekelly
I've also been thinking about something like this, modeled on a mutual fund. I
actually don't think people have an issue with the ads as much as they have an
issue with the lack of transparency & control over the information that's
"leaked" by interacting with the ads. The approach I've been tinkering with
is:

\- User data is held in a trust with independent trustees (like a mutual fund
complex) who oversee and set limits on _how_ the platform company uses the
data for advertising.

\- Users can "withdraw" their data at any time.

\- Instead of micro-targeting individual users advertisements would be sold to
groups (like a mutual fund) based on anonymized data from the users who choose
to join that group.

\- Users could "invest" in a particular group by joining groups that reflect
their interests and creating content/posts in those groups. Daily upvotes
would represent the creator's "share" of the advertising proceeds for that
group (like dividends), so good content is rewarded.

\- Development/maintenance would be paid by a flat % of advertising revenue
taken off the top before the "profits" are distributed to "shareholders."
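The payout mechanics in the last two bullets could be sketched like this (the
fee percentage and all figures are hypothetical):

```python
# Sketch of the proposed payout: a flat platform fee off the top, then the
# remainder split by each creator's share of the group's daily upvotes.
# All numbers are invented for illustration.

PLATFORM_FEE = 0.15  # hypothetical flat % for development/maintenance

def distribute(ad_revenue, upvotes_by_creator, fee=PLATFORM_FEE):
    pool = ad_revenue * (1 - fee)              # "profits" after the fee
    total_votes = sum(upvotes_by_creator.values())
    # Each creator's "dividend" is proportional to their upvote share.
    return {creator: pool * votes / total_votes
            for creator, votes in upvotes_by_creator.items()}

payouts = distribute(1000.00, {"alice": 60, "bob": 30, "carol": 10})
print(payouts)  # alice holds 60% of the votes, so 60% of the 850.00 pool
```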

------
kensign
This is analogous to Google declaring one fine day that, by adding security
features, it's getting out of the advertising business. MZ describes steps to
improve security and to retool history features to have less immortality and
permanence, but people already share Facebook posts and profiles as images,
not only as stateful markup. People will continue to expose each other, and
there's essentially no way Facebook can guarantee privacy in that regard.
Limiting partner access to user data will ultimately be a trade-off of what
they can afford to lose, and that's not a commitment to privacy either.

His proposed strategy doesn't really make sense and seems like misdirection
and lip service. You can't change the way people use Facebook, but you can
forgo any pretense of privacy, which may be the only thing that can honestly
and realistically be done.

------
johnnycab
If you have made a pact with FIST (FB/Insta/Snap/Twitter) or the other members
of the gang, then it is akin to jumping into a pool of sharks after cutting
yourself open: not even the friendliest of sharks is going to resist the urge
to rip you apart and feed itself!

These breaches and preaches are now a daily occurrence and have reached a new
level of tedium, especially since overtly parasitical behaviour has long been
admitted and identified within certain ecosystems.

------
lamby
Like investments, the implicit question of whether one can trust Facebook
(etc.) going forward is actually orthogonal to their current or previous
behaviour. Whilst it might inform and be predictive to a great degree, there
is no way to bind "future Facebook" to any real guarantees. In Debian
licensing, we call this the "tentacles of evil" test.

------
dangerboysteve
I'm reminded of the scorpion and frog fable for some reason.

------
craftoman
It's like an atheist who preaches about religion or a Ponzi scheme fraudster
talking about fundamentals of economics.

------
powera
I think that the new User Interface that Facebook is rolling out will show
whether or not Facebook has a real commitment to privacy.

~~~
yourduskquibble
> I think that the new User Interface that Facebook is rolling out will show
> whether or not Facebook has a real commitment to privacy.

How would you determine if the new user interface provided (or not) any [new
or perceived] level of privacy?

------
sidcool
I am indifferent to FB. They started with a promise, then realized the money
potential and went ahead abusing it. They were caught and faced some flak.
They realized that people have started caring about privacy, hence this new
pitch. I'm not saying Mark Zuckerberg is evil; he himself is helpless. He
can't change the FB DNA overnight; it would risk the very existence of the
company. In summary, don't expect things to change anytime soon.

~~~
snarf21
No, he is evil. He called people who gave FB data "... dumb f#@$s" back when
he was 19. It was just that in the beginning they weren't pushing the
monetization yet. Now that is all it is about and people are starting to
understand how they are being exploited. The only thing that has changed is
that you are aware. _HE_ is the FB DNA.

~~~
SmellyGeekBoy
Not that I disagree with you, but when people bring this up I like to point
out that I said a lot of stupid stuff as a teenager, and the vast majority
would be lying if they claimed that they didn't, too.

~~~
snarf21
That's a fair point, and I'd be more inclined to give the benefit of the doubt
if his behavior had changed in the last 15 years. The recent app around VPNs
and teens shows that he doesn't care (and they renamed and republished that
app multiple times).

~~~
Shish2k
> they renamed and republished that app multiple times

Do you have a source for that? I'm only aware of Onavo code being shared with
another FB survey app which was running in parallel (i.e., not the same app
renamed and republished).

------
bg4
Fool me once, shame on you... Fool me 100 times, shame on me...

------
blunte
I think "elusive" is a particularly kind word...

------
fromthestart
Why are we still listening to anything that tech CEOs say?

There seem to be absolutely no consequences to their repeated misdirections
and outright lies.

------
Simon_says
LOL, evidence is in plain sight.

------
ionised
People really need to stop listening to what Zuckerberg says and judge him on
his actions.

That way, you'll see that the 'dumb fucks' comment he made in his youth was
not merely an example of immature cockiness tempered over time by maturity,
but rather a profound insight into who he is and what he wants.

------
markdog12
The Big Lie?

------
chewz
I am producing baking soda. Unfortunately, it also contains arsenic. But it is
very popular baking soda and I am making a lot of money. Some people complain,
so I make a pledge that one day I will change my production process to make my
baking soda arsenic-free.

~~~
JasonFruit
If I understand your analogy, you're saying that privacy violations are like
food-safety violations: they should be prevented by legislative action and
government force. I think the analogy is flawed. When you die from poisoning,
it's an irremediable situation; you can't just say, "I'll never do business
with them again!" and fix it. Also, you have to eat something; you can't opt
out of the whole market. Finally, it directly affects your physical safety, so
that intentional disregard of food safety is akin to violence.

Facebook's problems, and privacy violations in general, are different. It's
not necessary to engage in mediated social interaction; you can opt out with
no loss except convenience. There's no bodily harm, let alone death, so if
you're burned once, you can simply never do business with them again.

That means the market _could_ fix this problem. That it hasn't says that there
aren't enough people who agree that it is a problem, or that the cost is worth
the benefit to them. In a case like that, not legislation but education is the
solution.

~~~
svachalek
Once they've gathered information about you and sold it, you can't take it
back. For most people living in western democracies, living with no privacy
won't kill you, but where is the guarantee that your government and the
corporations it empowers will always be so benign?

~~~
JasonFruit
I thought about mentioning that, but even under less liberal governments, the
case where an invasion of privacy is fatal is pretty exceptional. I decided
not to muddy my exposition with it.

