
We Can't Just Assume that Facebook Will Do Its Best - Quanttek
https://www.zeit.de/digital/internet/2019-01/privacy-katarina-barley-data-protection-facebook-ad-targeting-mark-zuckerberg
======
deugtniet
"If an algorithm doesn't work, then responsibility lies with the person who
deployed the software. Real people cannot become laboratory rats for the
testing of an algorithm."

I'm wondering what the HN opinions are on this view. My guess is that the
argument will be that this will stifle innovation. But Facebook has such clout
that algorithms have very direct effect on people, up to even the security of
a state. I hear Facebook was a major contributor to the ethnic cleansing of
Rohingya. Is algorithm deployment subject to (government) audit? Or are there
alternatives to ensure the algorithm does nothing bad?

~~~
feanaro
I think letting governments have power to control which algorithms get
deployed is a bad thing. Instead, individuals need to cultivate a better
intuition on how and why granting a large amount of power to a single entity
is detrimental for them in the long term. They also need to develop tools and
heuristics that lessen the probability of this happening.

Power needs to be distributed as much as possible, so that no single entity
can do a large amount of damage and have a large amount of leverage. In other
words, Facebook only has as much power as the people using it let it have.
The people need to become acutely aware of this and act accordingly.

~~~
johnchristopher
No. We have elected representatives to study these questions in depth. You
can't ask every citizen to be an expert in ethics, genetics, medicine,
computer sciences, virology, etc. so they can make the
obvious/good/true/rational choice.

That's why societies have laws. So we can know what to do or not when faced
with such questions instead of passing the burden to the individual under the
guise that it's "freeing people".

~~~
Mirioron
> _No. We have elected representatives to study these questions in depth._

Except the elected representatives are dumb as rocks. I'm sure they studied
how the cookie law would really work in depth before voting on it. That's why
the cookie law worked out so perfectly, right? Oh wait, it didn't do ANYTHING
other than cost companies millions of euros and waste the time of the general
public. It had ZERO benefits, and a CS student could tell you that the way it
was designed it could never have any benefits. I guess the only benefit of the
law is to show that lawmakers continue to be incompetent at technology.

> _That's why societies have laws. So we can know what to do or not when
> faced with such questions instead of passing the burden to the individual
> under the guise that it's "freeing people"._

Societies have laws so that the regular person thinks that everyone plays
according to the same rules, but that's not true in practice. Laws are
intentionally written to be vague and are not consistently enforced. This has
led to a situation where there are so many laws that everyone breaks several
every year, and as a result, if somebody higher up in the chain doesn't like
what you're doing, they can use the state to harass you.

If laws were truly about knowing what to do, then it would be impossible to
graduate from mandatory education without having studied every single law that
will apply to you in most circumstances. Vague laws wouldn't be written and
would be removed from legal codes. Furthermore, laws that are only selectively
enforced would be removed, because laws can only be just if they apply to
everyone consistently.

------
newscracker
This headline just needs a two word response: "Of course!"

Facebook the company needs to be changed drastically by strong external forces
(maybe governments, though the effects could include some harm too). It will
not do anything on its own while Mark Zuckerberg and Sheryl Sandberg are still
at the company and/or have influence (which they will have, for a long
time...at least the CEO will).

~~~
billfruit
I wouldn't trust the US government any more than I would trust Facebook the
company (speaking as a non-US citizen). Many countries' governments are less
transparent than publicly traded companies. Governments often act with shady
'national security' and geopolitical interests, or sometimes even elections,
in mind, but at least public companies are more or less motivated towards
increasing shareholder value.

I do rue the dominant narrative in much of HN and the mainstream media. Who
is the more dangerous actor, FB or the NSA? My bet is on the NSA.

~~~
Mirioron
Speaking as a European, I would trust the US government over the German
government at least.

~~~
newscracker
Could you please elaborate on what aspects you’re referring to and provide
some links?

From one perspective, as far as pervasive mass surveillance and exchange of
surveillance information are concerned, the U.S. in the Five Eyes group has
more influence and impact than Germany in the Fourteen Eyes group.

------
expertentipp
What do German ministers have to say about Schufa Holding AG running
undisclosed rating algorithms on the entire resident population of their
country?

------
pragmacoders
I'm curious what a solution like "All Machine Learning algorithms, and the
data that taught them, must be open-source and easily accessible by the
public" would look like.

It'd allow for public scrutiny of the algorithms that run our lives. It'd
essentially mandate thorough vetting of how anonymous and/or biased the data
being fed into these algorithms is.

Which would be interesting.

~~~
moultano
You really don't want that.
[https://en.m.wikipedia.org/wiki/AOL_search_data_leak](https://en.m.wikipedia.org/wiki/AOL_search_data_leak)

~~~
pragmacoders
Well, the idea would be that companies would be forced to actually figure out
how to properly clean data, or be forced by public pressure not to use it.

~~~
Mirioron
It's probably not possible to clean data like that. Just a few slices of a
person's life are enough to uniquely identify them. So all you're asking for,
then, is effectively a ban on machine learning.
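That intuition is easy to demonstrate. Latanya Sweeney famously estimated
that roughly 87% of Americans are uniquely identified by ZIP code, birth
date, and sex alone. Here's a toy sketch of measuring that kind of
quasi-identifier uniqueness (the field names and records are invented for
illustration):

```python
from collections import Counter

def uniqueness_rate(records, keys):
    # Fraction of records whose combination of quasi-identifier
    # values occurs exactly once in the dataset.
    combos = Counter(tuple(r[k] for k in keys) for r in records)
    return sum(1 for r in records
               if combos[tuple(r[k] for k in keys)] == 1) / len(records)

people = [
    {"zip": "02139", "dob": "1980-01-01", "sex": "F"},
    {"zip": "02139", "dob": "1980-01-01", "sex": "F"},
    {"zip": "10001", "dob": "1975-06-15", "sex": "M"},
]
print(uniqueness_rate(people, ("zip", "dob", "sex")))  # 1/3 of records are unique
```

On real datasets the rate climbs fast as you add columns, which is why
"just strip the names" anonymization keeps failing.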

~~~
matt4077
Apple and others have poured a lot of resources into a concept called
“differential privacy” that can mathematically guarantee privacy for data in a
machine learning context.

It’s rather well known, so you should probably read up on it before espousing
a strongly worded opinion weakly connected to the current state of the art.
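The core idea fits in a few lines. This is a toy illustration of the Laplace
mechanism (the classic building block of differential privacy), not Apple's
actual implementation; the function names and the example query are my own:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one person
    # changes the true count by at most 1, so Laplace(1/epsilon) noise
    # gives epsilon-differential privacy for this single query.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

The analyst only ever sees the noisy count. Smaller epsilon means more noise
and stronger privacy, and answering many queries consumes a growing privacy
budget, which is exactly where the practical difficulties start.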

~~~
moultano
I don't think anyone has managed to train differentially private models with
acceptable performance, and releasing raw data in a differentially private
way seems impossible to me.

------
empath75
I don’t know that i’d trust a company that would intentionally steal millions
of dollars from children.

[https://www.revealnews.org/article/facebook-knowingly-duped-...](https://www.revealnews.org/article/facebook-knowingly-duped-game-playing-kids-and-their-parents-out-of-money/)

~~~
danra
That's harsh - they do not intentionally steal money from children.

They intentionally use children to steal their parents' money.

------
pasta
Trust is gone and it will be the end of the big Facebook (website).

I think Facebook (company) is lucky with Instagram because that's where the
people are going right now.

~~~
raverbashing
Yeah, if the Zucker doesn't destroy it

Though I have to say that at least in its current form, the ability of
Instagram to spread crap is limited

~~~
ardy42
> Though I have to say that at least in its current form, the ability of
> Instagram to spread crap is limited

That proved to be untrue:

[https://www.nytimes.com/2018/12/18/technology/russian-interf...](https://www.nytimes.com/2018/12/18/technology/russian-interference-instagram.html)

[https://www.theguardian.com/technology/2018/dec/18/instagram...](https://www.theguardian.com/technology/2018/dec/18/instagram-facebook-russian-propaganda-ira)

> But two new analyses of the Russian online propaganda campaign by the
> Internet Research Agency reveal that this view of Instagram was as rose-
> colored as, well, an artistically filtered Instagram post.

> “Instagram was perhaps the most effective platform for the Internet Research
> Agency,” states the report by New Knowledge, an American cybersecurity firm
> which analyzed data sets from Facebook, YouTube and Twitter.

> During the period studied by the report’s authors, IRA posts on Instagram
> garnered more than twice as many engagements (such as likes or comments) as
> IRA posts on Facebook – 187m on Instagram vs 77m on Facebook – despite the
> fact that Facebook offers many more ways for users to interact with content,
> and Instagram has no native “sharing” button to promote virality.

------
onetimemanytime
On the contrary, we should assume that Google, FB, MSFT etc etc will do the
worst (for users/society) if it's better for their bottom line. Their trust
level is in negative territory; it was zero a few years ago.

~~~
ardy42
> On the contrary, we should assume that Google, FB, MSFT etc etc will do the
> worst (for users/society) if it's better for their bottom line. Their trust
> level is in negative territory; it was zero a few years ago.

All the refrains that try to excuse/explain corporate misbehavior by claiming
that the only duty of a corporation is to create value for its shareholders
are only tempting a massive regulatory response in the future.

If capitalist ethics are used to justify socially unethical and immoral
behavior, don't be surprised when democracies start to turn against
capitalism...

~~~
Lio
It’s always worth remembering that the term “Banana Republic” was originally
used to describe countries where the local government had been effectively
taken over by the United Fruit Company [1], to the extreme detriment of the
local people.

The lesson here is that you can’t trust large corporations to act ethically,
and if you can’t trust them, you need to regulate them.

(That a clothing company rather insensitively chose Banana Republic as a
trade name sort of proves that you can’t legislate against bad taste.)

[1]
[https://en.m.wikipedia.org/wiki/Banana_republic](https://en.m.wikipedia.org/wiki/Banana_republic)

------
caiocaiocaio
Like almost all articles criticising Facebook, this has a prominent 'like on
Facebook' icon/link. It is first among the social media logos. Unlike most
articles criticising Facebook, their logo is not fixed and permanently
visible, so they must be getting really serious about their criticisms.

~~~
unicornporn
> Like almost all articles criticising Facebook, this has a prominent 'like on
> Facebook' icon/link.

The "voting with your wallet" logic doesn't work very well with operations at
the size and impact of Facebook. Zeit may be dependent on Facebook for profit.
That's why some people ask for regulations. Companies should not be expected
to act as moral agents.

You should also know that journalists writing for a paper have little to do
with the business strategies of its owners.

------
Quanttek
(I'm not sure why the mods changed the title as it's quite important to note
that this is an op-ed by Katarina Barley, the German minister of justice)

~~~
yorwba
_Please don't do things to make titles stand out, like using uppercase or
exclamation points, or adding a parenthetical remark saying how great an
article is. It's implicit in submitting something that you think it's
important._

...

 _If the original title includes the name of the site, please take it out,
because the site name will be displayed after the link._

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

I'd say that applies to author names, too. Besides, I think she wrote the
article in her capacity as minister of consumer protection; the fact that
she's also simultaneously minister of justice isn't all that relevant.

------
nannePOPI
This is a classic tactic by politicians: instead of reprimanding the judicial
system for refusing to do its job properly, they cry and whine publicly in
order to build the consensus to create more laws, which often guarantee more
power to the government, along with the creation of special regulatory bodies
where they can place their people with a juicy government salary.

All the problems described in this thread could be solved with laws from 25
years ago. Facebook stealing call and contact data? No, they can't do it. No,
GDPR was not needed. No, putting a phrase in a ToS that nobody reads is not
enough to save a company from being destroyed in court in case of serious
wrongdoing. Friendly fraud? Also a crime; just use the existing laws. Calling
the users dumb fucks? Not a crime. Manipulating the users' emotions or testing
algorithms on users? Also not a crime, and I don't see why it should be.

Of course, it's much easier for the judicial system not to bother applying
the existing laws to those big companies. Why risk getting attention from
such powerful people? And after all, if they were to apply the existing laws,
then the poor politicians would have far fewer fake crises to work with in
order to expand the role of government and make money.

This is how the whole government thing works. Proven since the beginning of
time. Trust them even less than Facebook.

~~~
koboll
>All the problems described in this thread could be solved with laws from 25
years ago. Facebook stealing call and contact data? No, they can't do it. No,
GDPR was not needed. No, putting a phrase in a ToS that nobody reads is not
enough to save a company from being destroyed in court in case of serious
wrongdoing. Friendly fraud? Also a crime; just use the existing laws.

Even if it's true that these are illegal, you're not going to be able to
charge someone with a "crime" ex post facto. At most, you might see a lawsuit
that results in a settlement for a small number of affected individuals,
which will already have been accounted for by Facebook's legal team in their
decision to implement these policies.

We _do_ need new laws here. The fact that Facebook feels so free to do these
things means that they are unafraid of current law, probably with good reason.

~~~
nannePOPI
If Facebook's actions are illegal now, then they were illegal when committed;
there is no ex post facto problem. I can't speak for the US, but, for
example, the "friendly fraud" could be considered fraud in many European
countries, since it brings profit to Facebook by inducing people into error.
The punishment includes jail time. You do need someone to start a lawsuit, of
course, and that's what the judicial system should do when the fraud is
repeated. But do they do it? Nope; instead we just have the politicians
whining that they need more power. What a surprise.

The reason why FB and big companies in general are unafraid of current laws
is that they know the laws won't be applied. It's a big hassle to punish the
rich and the corporations, because people working in the judicial system just
don't want to make big enemies. It's also a pain in the ass because there are
two thousand layers of limited liability they have to uncover before they can
put the responsible people in jail. It's much easier to focus all the energy
on some poor guy selling some weed or a small business not submitting the
right form at the right time. Punishing them needs zero effort, they can't
defend themselves properly, and their punishment justifies the work of the
judicial system.

Elected politicians should be the ones to keep the judicial system in check,
but they don't have an incentive to push it to punish rich people FIRST. In
the end, by showing support for "more laws", the people only get more laws,
which will cost the taxpayers more money and make life difficult for small
businesses, while big businesses won't care and will even be advantaged by
them. All this aside from the fact that it doesn't make sense to make another
law when there is already a law.

------
porpoisely
So are we back to having dozens of facebook spam stories every day? Can't
believe a nearly 2-day-old facebook spam story is still on the frontpage.

Facebook is a private company. It will do its best for its shareholders.

And as best as I can tell, what this german minister of "justice" and a
segment of the political and business elites are complaining about is that
they don't get to control facebook. These authoritarians want to control what
people read, what people say, and how they think. No different from the
saudis or the chinese or the russians complaining about facebook.

If you don't like facebook, stop using it. If germans don't like facebook,
create a german version of facebook. Frankly, the germans should be more
worried about the nazi/stasi-esque german intranational spying. It's strange
how the people complaining the most about facebook and its privacy
violations/spying are also the leaders in spying on their own citizens. I
doubt even china spies on its own citizens as much as the british and the
germans do.

~~~
alkibiades
people hate facebook because trump won and brexit passed and they hate that
the people didn’t listen to the elites.

~~~
colordrops
I assume this is sarcasm.

~~~
alkibiades
you just think it’s a coincidence that all the fb hate started after trump got
elected?

~~~
darkpuma
It didn't.

~~~
alkibiades
yes it did. at least the latest wave of intense fervor. it started when
people claimed russian bots on fb affected the election. of course there's
always been some criticism, but not to the same extent

~~~
darkpuma
> at least the latest wave of intense fervor.

That's moving the goalpost to a subjective location. But if you never saw
intense criticism of Zuckerberg and his company on HN before the 2016 election
outcome, then you just weren't paying attention. The "dumbfucks" comment was
what, a decade ago?

