
It’s time for a digital protection agency - adrian_mrd
https://www.bloomberg.com/news/articles/2018-03-21/paul-ford-facebook-is-why-we-need-a-digital-protection-agency
======
plorg
I have a loosely imagined regulatory solution to the particular problem of
secretive data brokerages and user targeting. It works as follows:

* Targeted ads are required to include both a declaration that they are targeted to a user and what criteria were used to select the user.

* Ad and data brokers are required to provide a chain of discovery detailing where the individual ad or data entity received information that contributed to the targeting of the user. (I think this was hit upon in TFA, but it was difficult to get through with the focus as it was mostly on credentials and PII).

* Firms which hold user targeting information and information on individual user preferences are required to allow users to have their records removed from the firm's database.
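The three requirements above could be sketched as a disclosure record attached to each ad. A minimal sketch in Python (all names and field choices are hypothetical, invented here for illustration, not part of any actual proposal):

```python
from dataclasses import dataclass

@dataclass
class TargetingDisclosure:
    """Hypothetical disclosure record shipped alongside a targeted ad."""
    is_targeted: bool       # requirement 1: declare that the ad is targeted
    criteria: list          # requirement 1: criteria used to select the user
    provenance: list        # requirement 2: chain of brokers the data passed through

ad_disclosure = TargetingDisclosure(
    is_targeted=True,
    criteria=["age 25-34", "recently searched: bicycles"],
    provenance=["ExampleDataBroker Inc.", "ExampleAdExchange LLC"],
)

# The chain-of-discovery requirement means provenance[0] is the original
# collector and each later entry is a downstream buyer of the data.
print(ad_disclosure.provenance[0])
```

Requirement 3 (record removal) would then amount to deleting every record whose provenance chain includes the requesting user.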

I'm sure this seems punitive to the companies it would affect, but I'm not
sure that's a problem overall. It allows people to control the information
that is collected about them and puts the onus on the companies benefiting
from that information. If user targeting really does help in product discovery
and user preference matching, then the ad and data companies will be rewarded,
and the price of regulation will be the cost of information transparency. If
it turns out that users really don't want to have dozens of companies tracking
their every move then the advertising business model will be starved and make
way for companies with more innovative, user-friendly business models.

Edit: autocorrect fucking up words that autocomplete originally predicted
correctly.

~~~
michaelbuckbee
If you haven't already, you might enjoy reading the EU's forthcoming GDPR
regulations [1]. They cover all of this and much, much more, including things
like:

\- data should be exportable in digital formats

\- you can request what information a company has about you regardless of
whether you're a full "user/customer" (think of how FB tracks anyone who
visits a webpage with a Like button).

\- you have the right to request a manual review of computer made decisions
(ex: a credit approval)

\- right to have your data corrected/fixed if it's listed incorrectly

\- right to escalate to each country's "supervisory authority" (this is the
agency within each country that handles things)

\- serious fines in case of non-compliance (% of revenue, not a flat amount)

1 - [https://blog.varonis.com/gdpr-requirements-list-in-plain-english](https://blog.varonis.com/gdpr-requirements-list-in-plain-english)

~~~
evrydayhustling
Agree about the overlap, but I want to call attention to a requirement of
OP's that I don't believe is part of GDPR:

> Targeted ads are required to include both a declaration that they are
> targeted to a user and what criteria were used to select the user.

This makes a massive difference! GDPR currently revolves around informed users
making a request. OP's requirement would force companies to educate users
about their methods. I see an analogy to the health warnings on tobacco ads.

Incidentally, as someone who thinks targeted ads can be beneficial, I think
this would help me interpret and engage with "sincere" targeting.

~~~
michaelbuckbee
This is a part of the GDPR. The language is so generic that I think much of
this will come out in future court cases, but check out Articles 21 and 22 of
the GDPR, which I think address this (but IANAL):

[https://blog.varonis.com/gdpr-requirements-list-in-plain-english/#article21](https://blog.varonis.com/gdpr-requirements-list-in-plain-english/#article21)

People can object to being profiled and can ask a human for help, as in: "Why
am I seeing this? Please stop."

~~~
evrydayhustling
Belated thanks for the really useful link! I hadn't looked into these opt-
out/review requirements before. But, FWIW, I still think there's an important
difference between a user-driven opt-out (which I also don't read as a
guarantee of explanation) and proactive publication of the personalization
criteria. It's still possible for an uninformed individual to never even
realize content is being personalized for them, much less pass the threshold
of asking how.

------
manigandham
As usual, a bunch of nonsense by people who have no idea about this.

The fundamental problem is that digital advertising is a 12-figure global
industry with practically 0 oversight and regulation. This is an industry that
sells influence at scale. Anyone with a credit card can start changing how
people think and act but there are absolutely no real consequences for bad
actors.

Even the most minimal laws around who can advertise and how would radically
change everything. Google has even more data than Facebook. Amazon has just as
much. Your ISP has just as much. These silly little projects to chase the
latest scandal will do nothing in the long run. The only way to fix anything
is to regulate the core, not try and fix every little symptom that occurs.

* Before the inevitable comments: yes, advertising works; yes, it works on you no matter how much you think otherwise; and no, ad blockers don't magically solve everything.

~~~
edanm
> This is an industry that sells influence at scale. Anyone with a credit card
> can start changing how people think and act but there are absolutely no real
> consequences for bad actors.

Isn't this true of almost any industry which lets you put out information?
E.g. news publishing? Book publishing? Blogs?

I mean, what exactly would you do, ban all communications? Maybe I missed
something, but it seems like everyone just assumes that advertising is the #1
biggest influence on most people, and was used to completely change the tide
of democracy, when in reality it seems to me that it's a _small_ part of the
problem, at most.

I'm serious about the question btw - what _would_ you do? You say regulate at
the core. I'm not saying necessarily don't regulate (though that is where I
lean) - I'm asking, what exactly do you propose?

~~~
JumpCrisscross
> _what would you do?_

1\. Break up Facebook under anti-trust law. Social network share is as
dangerous as, if not more dangerous than, market share.

2\. Pass an American GDPR. Consumers get an absolute right to audit and delete
their data. Explicit consent is required for each instance of third-party
sharing. Companies are liable to their users for breaches, with a minimum
amount claimable through an easily accessible regulator.

~~~
734786710934
What would breaking up Facebook look like? Most features of Facebook couldn't
operate as standalone businesses without creating their own massive ad
networks.

~~~
titanomachy
Instagram and WhatsApp worked fine before they were part of Facebook.

~~~
734786710934
Both were being run off of VC money when Facebook acquired them. They would be
in Snapchat's shoes right now if they were still independent.

------
ig1
Most developed countries have an agency dedicated to data protection.

The UK has the ICO, Japan has the Personal Information Protection Commission,
Canada has the Office of the Privacy Commissioner, Switzerland has the FDPIC,
etc.

Their exact role varies from country to country, but the US is one of the few
modern countries without a national body dedicated to the field.

~~~
walterbell
FTC in the US.

~~~
e12e
Indeed. I came across this, which should probably be of interest to readers in
the US:

[https://www.ftc.gov/public-statements/2016/01/two-way-street-us-eu-parallels-under-general-data-protection-regulation](https://www.ftc.gov/public-statements/2016/01/two-way-street-us-eu-parallels-under-general-data-protection-regulation)

However, I seem to recall that the last time I checked (several years ago),
there appeared to be very slim data protection in the US. Or rather: somewhat
strict rules for the federal government, and few or no rules for private
business, to the point of federal agencies contracting out databases to firms
so as to dodge the legislation...

------
tscs37
Tbh, I don't trust any corporation in the US to keep my data safe anymore. You
can't fix that kind of reputation damage.

------
Mononokay
What did people really expect would happen with Facebook? It's social media,
not a file storage service - of course your data isn't going to be private.
It's more or less what you should expect when signing up on any social media
platform.

~~~
e12e
I certainly expect my phone company to not record my calls, or my isp not to
mine my email. The problem isn't that fb is social media; or that Google/Gmail
is an email provider - it's that both of them are ad agencies and private
intelligence agencies (they gather, refine and sell information; ie
intelligence).

So it really is: Facebook is an intelligence agency and information broker:
why would you supply information on yourself and your friends to them. And the
answer is likely: I didn't know they were an intelligence company; I thought
they were a (social) personal media platform. Like email or a blog, only
slightly more modern.

And this isn't a _wild_ expectation. Eg the telephone providers are regulated,
and can't sell the content of your calls, even if it's "just" if you're
talking about a recent pregnancy, or how you're thinking about buying a car...

~~~
dredmorbius
Your phone company has, in all likelihood, maintained a log of all your calls,
dating to the 1980s.

[https://www.schneier.com/blog/archives/2006/03/atts_19trilli...](https://www.schneier.com/blog/archives/2006/03/atts_19trillion.html)

~~~
e12e
I'm sure Norwegian military intelligence has semi-legal access to the metadata
and contents of my calls, but at least it's clearly illegal for them. The
largest ISP/phone provider (Telenor) actually fought back against the EU data
retention directive because they had to store more data than they currently
did (3 months).

But yeah, data protection in the private sector in the US is dismal.

------
cowmix
As a side note: in the past 5 years of doing serious contracting work in every
industry you can think of (pharma, banking, manufacturing, etc.), the two
places where I have seen infosec taken most seriously are gaming studios (any)
and Bloomberg.

------
mlb_hn
The author's suggested fix is "Let’s make a digital Environmental Protection
Agency. Call it the Digital Protection Agency. Its job would be to clean up
toxic data spills, educate the public, and calibrate and levy fines."

A couple of upfront issues with this:

1) "cleaning up toxic data spills" \- this doesn't seem well worked out,
unless the author suggests going and deleting the stolen data off others'
computers.

2) "educate the public" \- the author suggests explaining how to deal with
identity theft. That's great, but it doesn't address the secondary issues of
advertising/propaganda/other clever unintended uses of data.

~~~
Mononokay
It also only really works for things that happen domestically - watch as every
social media company moves overseas.

Social media companies are in the enviable position of not having to take
money outright from users to operate, so they can claim not to operate in any
country they like, and can move to a foreign country with a lower tax rate
without really harming themselves at all. With no users exchanging currency
and no (retail) physical presence, they're incredibly free as far as mobility
goes. Unless I'm missing something, it's surprising that this isn't already
common.

~~~
JumpCrisscross
> _It also only really works for things that happen domestically - watch as
> every social media company moved overseas_

A limited solution is better than no solution.

~~~
Mononokay
In some cases, it isn't. This would be one of those cases - tax increase
because of the spending increase required for a new department, little benefit
in practice, anti-free market, etc.

It's a culture issue, not particularly a regulation issue. Fining companies
for doing with data exactly what they say will be done with it in their ToS
(granted, ToSs are a terrible concept; they are legal and enforceable
depending on how the government is feeling that day) would allow the
government to effectively shut down platforms it didn't like via fines.
Proposing a department to "calibrate and levy fines" upon media is bizarre.

The proper thing to do would be to focus solely on educating the populace, or
alternatively to fund the EFF to do it for them.

------
organsnyder
Regardless of how well you feel this article is written or whether you agree
with it, it's important not to ignore it and others like it. It reflects a
growing sentiment among much of the population: address their concerns, or
regulation will do it for you (likely in a sub-optimal, if not
counterproductive, way).

~~~
domevent
That ball is already rolling downhill, and I don’t think it can be stopped. In
addition to the perfectly reasonable concerns of a few billion people, there
are also some very wealthy and entrenched media interests who only stand to
gain by kicking it along. I don’t think the reputation of tech is salvageable
at this point, and the result will be having to live with regulation and
oversight like every other industry. Even if everyone shaped up overnight (and
they won’t) there is still so much yet to emerge into the public light about
what has _already_ happened. Uber and Facebook alone have almost certainly
only just begun to bleed scandal.

~~~
laen
> the result will be having to live with regulation and oversight like every
> other industry.

The only other industry I think could be comparable to this, from a privacy
standpoint, would be the banking industry. They are largely flying under the
radar during all of this; any hypotheses as to why? Is it because their
regulation and oversight are working, as you are alluding to? Or is it because
the banking industry has not been caught... yet?

------
dictum
I can't wait for the 2023 thinkpieces on how the Digital Homeland Protection
Agency or some such has been taken over by purveyors of ungood ethics and _we
must do something about it_.

------
riazrizvi
Google built an algorithm that promoted Alex Jones 15 billion times to
vulnerable people. I wonder: are they systematizing established marketing
influences, or are they creating a new era of conspiracy-theory-prone
populations because those types of articles have better click-through rates
for ads?

------
tbabb
I've been saying for years that software engineering needs to be held to the
same standards as other engineering fields, like civil, mechanical,
biomedical, and so on:

If an engineer or a firm is negligent and people die and/or millions of
dollars are lost, they are kicked out of the industry/lose their right to
operate as a business.

For example: To work on cryptography and security, you need a degree and to
have passed certification, perhaps at regular multi-year intervals. Then if
you build a login page and store the unsalted passwords in plain text and
someone pwns your site, you lose your license, could be fined, sued, or
possibly go to jail if your negligence is criminal.
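For contrast with the unsalted-plaintext anti-pattern just described, here is the kind of baseline a certified engineer would be expected to know, a salted password hash using only the Python standard library (the iteration count and parameters are illustrative, not a vetted security recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, derived_key) using PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16)  # a unique random salt per password
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, key

def verify_password(password: str, salt: bytes, key: bytes) -> bool:
    """Re-derive the key and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, key)

salt, key = hash_password("hunter2")
assert verify_password("hunter2", salt, key)
assert not verify_password("wrong", salt, key)
```

Only the salt and derived key are stored, never the password itself; the per-password salt means identical passwords produce different stored hashes, defeating rainbow-table lookups.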

Then, if you are a company and you need a login system, you either (a) hire
certified software engineers to write one, (b) subcontract a certified firm,
or (c) license a certified off-the-shelf solution. If pwnage happens, the
company is liable if it failed to do one of those things. It therefore becomes
in the interest of companies to do security correctly, and of engineers to
only attempt it if they are competent and qualified.

It's really simple: In other fields of engineering, there are _consequences_
if you fuck up. If you design a bridge that falls down and kills people, you
lose your career, are sued, and/or go to jail. Not so in CS. That needs to
change.

~~~
nitrogen
Consequences and credentialism don't have to go hand in hand. Software's lack
of credentialism is what allows so many to rise above their circumstances,
break through class barriers, etc.

~~~
tbabb
Somehow I doubt you would apply that argument to designing bridges, or medical
practice, or airline piloting, or bus driving, or law, or any of a zillion
other professions where credentials are normal and essential to safety and
liability. We don't let random schmoes do those other things-- aspiring and
starry-eyed or not-- without first proving that they know what they're doing.

We certainly don't wait until the bridge has fallen down or the plane has
crashed or the patient has died before we put a burden on the professional to
certify their competence.

I don't think it's unreasonable to mandate rigorous certification for life-
critical, security-critical, and financial software engineering.

You can write a cookie clicker with a high school degree and put it online if
you want. But the moment your cookie clicker takes credit card numbers, you
should be legally obligated to know what you are doing or hire someone who
does.

~~~
dictum
When did Facebook, Equifax et al start hiring _random schmoes_, or, more
charitably, people who don't know what they're doing?

~~~
tbabb
One could argue from inspection that, when it comes to security, Equifax's
talent is scarcely better than random schmoes, yes.

But it's not just about preventing the hiring of "random schmoes". It's about
legally formalizing responsibility and accountability, and the incentive
structure that arises from that. As before, this is well-tested in other
professions.

Facebook might have made different decisions if they had special legal
obligations regarding "the handling of sensitive personal information."
Perhaps their engineers would have thought twice about giving unfettered
access to third party APIs if they knew that a breach down the line could ruin
their careers.

------
SomeHacker44
I really don't think the proposal goes far enough.

Not only should the agency collect and monitor all "leaked data," it should
also set clear and detailed regulations on what can be collected, how it must
be revealed to the people to whom it is relevant ("relevancy TBD"), how it
must be removed at the request of those same people, and how it must be
amended when (claimed to be) incorrect.

Many of these things are already done by other organizations for subsets of
data (e.g., the regulations on credit reports). It just needs to be expanded
to all kinds of data.

Europe is way ahead of the USA on this one. As someone managing the
implementation of a lot of the GDPR regulations on data access (e.g., "Subject
Access Requests") for a small company, I absolutely wish I as a US citizen had
the rights to do this stuff to US companies. But, I don't. Sucks to be an
American again. Maybe this can fall under MAGA? LOL
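A GDPR subject access request handler ultimately boils down to collecting everything keyed to one user across every system and exporting it in a portable format. A toy sketch (the store names, fields, and user IDs are all made up for illustration):

```python
import json

# Hypothetical per-system record stores, keyed by user id.
PROFILE_DB = {"u42": {"name": "Alice", "email": "alice@example.com"}}
EVENT_LOG = {"u42": [{"event": "login", "ts": "2018-03-01T10:00:00Z"}]}

def subject_access_export(user_id: str) -> str:
    """Gather all data held about one user and return it as JSON.

    A real implementation would also have to cover backups, analytics
    systems, and data already shared with third parties.
    """
    export = {
        "profile": PROFILE_DB.get(user_id, {}),
        "events": EVENT_LOG.get(user_id, []),
    }
    return json.dumps(export, indent=2)

print(subject_access_export("u42"))
```

The hard part in practice is not the export itself but knowing every place the data lives, which is exactly why GDPR compliance work forces companies to map their data flows.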

------
Nomentatus
"YouTube. It has users who love conspiracy videos, and YouTube takes that love
as a sign that more and more people would love those videos, too."

Not exactly. YouTube sends everybody down rabbit holes because it adores
sticky topics and video sources (more views, more $), and so it rewards those
who create a bit of an information monopoly by simply lying; after which, one
of their videos leads you to another of their videos. Nobody else is making
videos on that, 'cause you made it up. You win. Novel "information" is more
likely to be viewed all the way through, and then followed up on with searches
for more on the topic. So make ____ up, and YouTube is all about you, thrilled
to facilitate the niche info-market you've created out of thin air or wild
exaggeration.

Merely having your own misleading phrases to refer to your bent views will be
heavily rewarded by search engines including Google's and YouTube's. For
example:

"I wrote about this in my new book, Algorithms of Oppression: How Search
Engines Reinforce Racism. In it, I discuss Dylann Roof, the Charleston mass
murderer, who said he Googled the phrase “black on white crime” after the
Trayvon Martin shooting. He has talked about how important that experience was
in forming his white supremacist views. He noted in his online diary that when
he Googled the phrase “black on white crime,” the search engine gave him
information that shocked him—and helped him come to a different understanding
about the so-called truth about race and the value of a multiracial society.
That’s because his search only returned the white supremacist websites that
use such a phrase—a phrase that is used by hate-based sites to radicalize
white Americans against African Americans and other people of color, including
Jewish people. Google didn’t provide any context on the white supremacist
movement. It didn’t provide any counterpoints of view."

[https://logicmag.io/03-engine-failure/](https://logicmag.io/03-engine-failure/)

------
ausjke
That's indeed scary, and I see no way out, but I will do the following:

    
    
        1. remove my rarely used Facebook account.
        2. remove my Twitter account.
        3. remove Gmail; use Outlook email instead, or probably host my own email.
        4. for private messaging, use the Signal app instead.
        5. use a VPN more.
    

There is more stuff I could not remove though, e.g. my Amazon, eBay, and
PayPal accounts, and also my account and posts at HN. Heck, I can't even
remove my posts, let alone my account, at HN. Will HN sell me someday, or is
it doing this already?

The only solution I see is paying for all those services: pay for Twitter,
Facebook, Gmail, etc., so they do not need your personal info to profit. Those
you pay would need supervision, meaning that if they still violate my privacy
after I paid, I can sue them to hell.

~~~
j605
The problem with using Signal is getting your contacts onto it as well. You
can always switch email providers, but not chat applications.

------
gerash
Perhaps one solution could be to keep your personal data encrypted, but then
if you need an online service you'll need to somehow let the service access
it.

That's the part where it gets tricky. Either the code working on your data is
sandboxed from the outside world, which sounds impractical (what if it needs
to talk to other backends?), or the operating system's ACL framework is so
granular that you can give the service the minimum amount of data it needs.

Or another solution could be a way to easily see which entities/apps have
access to your data and the time they accessed them. Like some sort of an
audit log with accessible UI.
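The audit-log idea in that last suggestion is easy to sketch; here's a minimal append-only access log in Python (the schema and accessor names are invented for illustration):

```python
import time

class AccessLog:
    """Append-only record of which entity read which data field, and when."""

    def __init__(self):
        self._entries = []

    def record(self, accessor: str, data_field: str) -> None:
        """Log one access; entries are only ever appended, never edited."""
        self._entries.append({
            "accessor": accessor,
            "field": data_field,
            "timestamp": time.time(),
        })

    def by_accessor(self, accessor: str) -> list:
        """What a 'who saw my data?' UI would query."""
        return [e for e in self._entries if e["accessor"] == accessor]

log = AccessLog()
log.record("example-ad-service", "location")
log.record("example-ad-service", "contacts")
log.record("example-analytics", "location")
print(len(log.by_accessor("example-ad-service")))  # accesses by one service
```

The tricky part, of course, is making the log tamper-evident and ensuring services can't bypass it, which is where OS-level enforcement or the ACL approach above would have to come in.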

A non-profit agency is nice and all, but it doesn't offer much of a guarantee.

------
Nomentatus
Note that a huge reason to target political ads is precisely that you can send
your targets highly offensive or obviously misleading ads that you know they
won't be offended by, without showing the rest of the electorate how vile the
shit you're slinging at voters really is.

Targeting covers the stench. Keeping the average voter from seeing your
nastiest ads is just as important as who does see the ads.

Democracy is about informing voters; targeting is all about keeping
information about your ads _from_ most voters.

------
jaequery
Regardless of regulations, I feel you can't really stop what employees do with
the data. At every company there is an employee who has the keys to your data
just by doing their job, from devs verifying data to sysadmins managing the
backups. Sure, you can have better auditing procedures and log analysis, but
if it's a group of people, especially at the top, it's hard to prevent. Social
hacking is happening without us knowing it, and that is what worries me more.

------
soared
The problem with Facebook is it allows people to gather data about a user and
their friends. Literally every single other platform (including Google) can
gather data about you but not your friends.

That is the current problem that needs to be immediately solved IMO. After the
obvious, easy win then we should move on to more difficult regulation like
others have suggested.

------
Mediumium
Silicon valley didn't fail at all.

The main business model of a lot of Silicon Valley businesses is based upon
selling, buying, trading, and using people's personal data, so considering
this, they didn't fail at all.

Edit: IMO, to solve privacy we need to create another business model, one that
is obviously not based on data/advertising.

------
AlphaWeaver
The explanation of HIBP got a tiny bit overzealous...

> For example, the website of Australian security expert Troy Hunt,
> haveibeenpwned.com (“pwned” is how elite, or “l33t,” hackers, or “hax0rs,”
> spell “owned”),

------
wemdyjreichert
Users turn over their data. The solution to people recklessly turning over
info they once considered private is not to implement yet more regulation.

------
shmerl
_> Silicon Valley Has Failed to Protect Our Data._

They are joking, right? FB was never about protecting data, it was always
about profiting from selling it.

------
arez
...They pay taxes... wtf :'D

------
johnnyOnTheSpot
This is all a very new problem only seen under the new POTUS.

~~~
johnnyOnTheSpot
Given the massive uptick in stories around Facebook, it would seem this
statement would be true.

------
zmix
I thought _you_ guys are "Silicon Valley"?!

------
duncan_bayne
Silicon Valley, huh? I wonder how many trackers appear on the Bloomberg page.
Let's see ...

14, according to Ghostery:

* Doubleclick

* Lotame

* Scorecard Research

* Taboola

* Sailthru Horizon

... and nine more.

In the unlikely event anyone from Bloomberg is reading this: fuck you, you
hypocrites.

------
sqdbps
The US must not introduce any laws detrimental to US companies, especially
when China is bolstering its tech firms (and stealing our IP) and the EU is
trying everything short of outlawing US tech companies from its market
altogether.

Just ride out this silly populist convulsion and we'll all be the better for
it.

~~~
llukas
Protect US companies but screw US people? Did you forget about the Equifax
leak? Why can everybody else have their data protected, but not US people?
~~~
sqdbps
Generally, what's good for US companies is good for the US and its people. I'm
fine with a legislative response to the Equifax breach, something along the
lines of mandated disclosure and penalties for lax security, but we are in the
midst of a global battle, and so our tech firms must be protected, not
assailed, by our gov't.

~~~
sparkie
The same lack of regulation over what Google, Facebook et al can collect about
you is also lack of regulation over what Huawei, Xiaomi et al can collect
about you. It's not giving US companies any competitive advantage. The Chinese
companies have the advantage that they're also collecting the data on Chinese
citizens, which is out of bounds of US companies unless they're conducting
shady deals with the Chinese government.

------
dqpb
It's been interesting to see how much press Facebook, and to a lesser degree
Google, silicon valley, and "tech" have been getting lately.

I look forward to when the press seriously talks about the truly evil
organizations that are physically drugging, poisoning, and killing people and
our environment.

------
marknadal
Oh my goodness, the solution is to spend billions on a government agency?

Seriously, you can just use encryption for free. People are building entire
apps around E2E P2P tools that exist today:

[https://hackernoon.com/so-you-want-to-build-a-p2p-twitter-with-e2e-encryption-f90505b2ff8](https://hackernoon.com/so-you-want-to-build-a-p2p-twitter-with-e2e-encryption-f90505b2ff8)

