
How the GDPR Will Disrupt Google and Facebook - cpeterso
https://pagefair.com/blog/2017/gdpr_risk_to_the_duopoly/
======
shadowtree
And if you think GDPR is a toothless joke, let's take a look at the defined
fine structure.

It is pretty simple, only 3 levels (strikes for the fellow Americans):

Strike 1 - Stern warning letter

Strike 2 - 2% of your TOTAL GLOBAL REVENUE

Strike 3 - 4% of your TOTAL GLOBAL REVENUE (or 20mil EUR, whichever is higher)

And now you know why GDPR is a board level topic. Keep in mind that the EU/US
Safe Harbor agreement got axed due to a lawsuit of a single student from
Vienna against Facebook. So all you need is a single pissed off German
customer you ignored when asking for their data report card and you're
_fucked_.

For startups - GDPR is like Y2K at the time, a GOLDMINE. So much opportunity
to sell solutions, from real to snake oil. GDPR compliance is already and will
continue to trigger a massive wave of investment.

Enjoy :)

~~~
throwaway_9123
Throwaway account.

I have national sales responsibilities for one of the majors. Think
IBM/Microsoft/Oracle/etc leading a sales team of 74 reps.

You'd be surprised at how FEW sales we've generated from GDPR. We've been
providing free GDPR assessments for the past 1.5 years across over 200
accounts as a lead-gen opportunity, and very few sales have resulted.

It all boils down to this: companies simply don't believe the fines will be
enforced, given just how expensive the fines are.

And since GDPR doesn't go into effect until May 2018, companies are just
waiting and seeing what happens.

It's really hard to sell GDPR because it's essentially an insurance policy.
Why spend $5m on software and another $5m in services ($10m combined) if your
total fine is only $20m? Do you as a company have a 50% chance of getting
fined? If not, then roll the dice and don't buy a solution.

~~~
simonh
The fine doesn't absolve you of responsibility for complying. If you're fined
you have to pay up AND you have to comply. Otherwise they'll just fine you
again, as they did to Google.

~~~
emn13
Nevertheless, a 4% fine is very low, given the low frequency of fining. Tech
firm margins are much larger than this; so while it's clearly unethical to do
so, it may be more profitable to simply accept the fines as a kind of tax for
as long as possible, and to continue to profit from all that data until things
get really dire. In actuality, a firm wouldn't need to choose quite so starkly
to flout the law; simply failing to invest and dragging your feet looking for
impossible have-it-all solutions might well be enough to get away with a few
fines before you really try to get your act together.

If you will, it's the difference between the VW approach and those of (as it
appears anyhow) all the other carmakers. They're all cheating; most were
simply wise enough to avoid doing so explicitly.

Data protection is also harder to enforce than emissions; just look at how
laughably incompetent emissions enforcement is to get an idea of how likely
you are to get caught if you happen to collect too much private information.

I expect the same here as in emissions: no real compliance for years (if not
decades), and when enforcement comes, it won't be the regulator that actually
catches even egregious wrongdoing. I mean, the high-profile players will pay
lip service of course, but that's it.

~~~
RivieraKid
I really doubt that these companies would lose more than 4% revenue by
complying. Worse ad-targeting in the EU is not worth that much.

~~~
emn13
I'm not so sure. And it's not just ad-targeting - all kinds of personalized
features and plain general-purpose data mining suffer too. And don't forget
that they wouldn't get the full 4% immediately, and would likely be fined much
less than once per year based on current trends anyhow. So that 4% is going to
be further diluted.

~~~
simonh
That's not what has happened so far. The search engine results fine of €2.4bn
was based on the length and severity of past infringement, and they were
threatened with a €10m-per-day fine, equivalent to 5% of global revenue, on an
ongoing basis if they didn't comply within 90 days. So they absolutely have
been hit with a heavy lump-sum fine from day one.

There's no need to theorise about how the EU might enforce such laws, we've
got actual examples of them enforcing laws like this already and they do not
mess around.

~~~
emn13
Google's revenue is _all about_ data collection. If they can't collect lots of
data, the whole business model is a lot more questionable. In the face of
that, 2.4bn _once_ is a trivial fine; consider that that's something like
what... 3% of their revenue in one year?

Of course they'll try to avoid that in the future, but the fine is mild enough
that it's not going to cause firms to err on the side of caution. They're
going to look for the absolute edge of the law.

Frankly, if google had not leveraged their search "monopoly" (not quite a
monopoly), I suspect their market cap would have been more than 2.4bn lower;
so this was a pure win - especially since conviction and detection aren't a
slam dunk.

~~~
simonh
Did you even read my post? It wasn't €2.4bn once, it was that PLUS €10m PER
DAY of persistent non-compliance. That would have come to €3.6bn per year.

The lump sum was just for backdated non-compliance.

~~~
emn13
I read that: the point is that €2.4bn just isn't all that much given what it
does to the value of the company. It's probably a risk worth taking as long as
you can get away with it. And yes, that means you'll eventually need to adapt
- not necessarily because €10m a day is enough to actually enforce that, but
also because this kind of stuff is gameable; complying with the ruling without
much risk from competition at this point is pretty easy. And you'd need to
make the calculation that even if €10m a day were acceptable for the gain,
simply ignoring high-profile judgments against you may have worse
ramifications down the line.

I'm not saying it's nothing: it's that it's a risk worth taking given the
gains. If you're building a trillion dollar company (i.e. google), then
eliminating competition or accepting some judicial friction as a way to
establish dominance in your (data-mining) field is perhaps acceptable or even
wise.

In that, these fines simply aren't punitive enough, especially since they come
so late. And again - it's not black and white. The existence of such rules
will alter behavior; it's just a question of whether the reaction will be
legal mitigation tactics, a company-wide change in approach, or something in
between.

Put it this way: if you can corner a market worth trillions, how much loss is
acceptable to risk in order to reduce or eliminate competition? I'd venture
that these fines are at _least_ one order of magnitude too small to be really
frightening (which isn't to say that the behavior google was convicted for
deserves that amount, simply that anything less means the law can't really be
enforced).

------
AriaMinaei
While I see some of the concerns about the _technicality_ of the law as
completely legitimate, it still bothers me that so many people reject the
whole spirit of this law, and cannot weigh the negative of a "tax on startups"
against the much greater good of personal privacy.

I've just started a business myself, and this regulation affects my company
too. It makes development costlier; it'll eat into the precious little time we
have, spent on compliance paperwork rather than on our core business. In the
short run, it does hurt our chances of success.

Yet, none of the trouble is even comparable to what's to be gained here. And
it bothers me (though doesn't surprise me) that some people don't see that.

It also bothers me that such vocal opposition barely comes up when the
discussion is just about bigger companies such as Google and Facebook. How can
we expect "un-evilness" from bigger companies when we're barely willing to do
anything in that regard ourselves?

~~~
fixermark
Possibly because personal privacy as a "much greater good" is open to debate.
People place wildly differing values on that property.

Google's current ecosystem of data-sharing means that Assistant can make
educated context guesses on what I mean when I talk to it based on my browser
history and map navigation history. If the new privacy constraints damage that
passive interconnection, that's not a net good for me.

~~~
AriaMinaei
> ... privacy as a "much greater good" is open to debate

I agree, but that's a different topic, really. The comments here aren't about
the (un)/importance of privacy. The main debate seems to be either about the
technicality of the law and its possible unintended consequences, which are
legitimate concerns, or they're about how "this is gonna make my job much
harder," which is not really a legitimate concern in this context, and those
comments were the ones I was talking about.

> Google's current ecosystem of data-sharing means that Assistant can make
> educated context guesses on what I mean when I talk to it based on my
> browser history and map navigation history. If the new privacy constraints
> damage that passive interconnection, that's not a net good for me.

I think we're overestimating a technical difficulty here, and downplaying a
moral principle.

Providing a personalised service without storing large amounts of personal
information in a central location is not impossible. It's just technically
harder to do.

And even if it was impossible, then still, we need to sort out the moral
consequences first. Not by banning technological progress of course, but
perhaps by bringing more oversight to corporations. Or by making sure that
people of lower socioeconomic background aren't hit harder than the wealthy.

------
antoncohen
I encourage a little more thought before cheering this on as a win. While GDPR
isn't as ridiculous as the Cookie Law, it still shows that the EU/EC don't
understand the technology they are trying to regulate, and it comes at a huge
cost to tech companies.

Take the _right to be forgotten_. First of all, it should be common sense that
no one has the right to force legitimate news articles to disappear because
they don't like the content, but that is what the EU has ruled should happen.

I get the desire to have a company forget about you, and remove all the
personal information they have. It makes sense from a personal standpoint. But
how do you do it technically?

If you follow GDPR strictly you would need to be able to purge the data from
your _backups_. Now most backups are considered immutable, so you aren't going
to do that, meaning you need a way to ensure that "forgotten" users never get
restored.

But how do you even delete the live data? Does the tech company you work for
have the ability to delete all traces of a user from their system, cleanly
severing all relationships with other objects in your system? Do you have the
ability to retrieve _everything_ you know about a specific user, and provide
it to them? You will need to write the code to do this.
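To make that concrete, here's a minimal sketch of what "export everything" and
"erase everything" routines might look like. The table names and schema are
assumptions for illustration; a real system would also have to cover logs,
caches, message queues, and analytics stores:

```python
import sqlite3

# Hypothetical sketch of GDPR-style "export everything" / "erase everything"
# passes over tables that reference the user via a user_id column.

USER_TABLES = ["profiles", "posts", "login_events"]  # assumed schema

def export_user(conn: sqlite3.Connection, user_id: str) -> dict:
    """Collect every row held about one user, table by table."""
    dump = {}
    for table in USER_TABLES:
        rows = conn.execute(
            f"SELECT * FROM {table} WHERE user_id = ?", (user_id,)
        ).fetchall()
        dump[table] = rows
    return dump

def erase_user(conn: sqlite3.Connection, user_id: str) -> None:
    """Delete every row held about one user."""
    for table in USER_TABLES:
        conn.execute(f"DELETE FROM {table} WHERE user_id = ?", (user_id,))
    conn.commit()
```

Even this toy version hints at the hard parts: the list of tables has to stay
complete as the schema grows, and foreign keys into other users' data (shared
threads, messages) need per-case decisions.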

There is a good chance your little startup that isn't cash flow positive will
have to spend $1 million of its VC money on becoming GDPR compliant.

Do you sell a SaaS service to businesses, and those businesses send you their
customer's data? Then you are the processor and they are the controller. Cool,
less for you to do, sort of. Except that the controller must agree to _every_
sub-processor you use. Want to switch from AWS to GCP? You can only do it if
all your customers agree. Want to try out a new metrics or logging service? If
it will handle any PII you can't do it without customer (controller)
permission.

You will basically need to hire full-time compliance officers to deal with
this. The big tech companies already have compliance officers, but GDPR is so
massively invasive to businesses that even small companies now need compliance
officers.

~~~
Slartie
> First of all, it should be common sense that no one has the right to force
> legitimate news articles to disappear because they don't like the content,
> but that is what the EU has ruled should happen.

No, it is about deleting personal data attached to your user account, not
"news articles". This thing intends to make the "delete my account" button
actually, you know, delete my account, instead of fake-deleting it by setting
a "deleted" flag and telling me that everything is gone now while still
keeping gigabytes of data associated with me in your database.

> [...] meaning you need a way to ensure that "forgotten" users never get
> restored.

If this is considered to be a hard problem, then I assume storing a list of
deleted users in a separate place and immediately purging those users from the
backup after a restore must be some kind of rocket science.
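That suggestion - a separate "tombstone" list of deleted users, re-applied
right after any restore - could be sketched like this (hypothetical names,
assuming a simple key-value store):

```python
# Hypothetical sketch: a "tombstone" set of deleted user IDs, persisted
# outside the immutable backups and re-applied after every restore.

deleted_users: set = set()  # stored separately, with its own retention policy

def delete_user(db: dict, user_id: str) -> None:
    """Honor a deletion request: remove live data and record a tombstone."""
    db.pop(user_id, None)
    deleted_users.add(user_id)

def restore_backup(backup: dict) -> dict:
    """Restore an immutable snapshot as-is, then immediately purge anyone
    who asked to be forgotten since that snapshot was taken."""
    db = dict(backup)
    for user_id in deleted_users:
        db.pop(user_id, None)
    return db
```

The backups themselves stay immutable; only the small tombstone list has to be
mutable and reliably persisted.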

> There is a good chance your little startup that isn't cash flow positive
> will have to spend $1 million of its VC money on becoming GDPR compliant.

I wouldn't call it "to become GDPR compliant", I would call it "to build a
sound database structure". Because if you are unable to purge all data
associated with one of your users' accounts from your system without
destroying the integrity of the rest of your data, then you obviously have a
half-baked system on your hands that lacks a core feature: actually deleting
accounts. And you surely should spend some of your money to refactor this crap
into a long-term viable solution while you are still small and agile enough to
do that. Because it's only going to be way more expensive later on...

~~~
spydum
You missed the point: if I take an image backup of a disk and store it, even
in encrypted format, and it contains John Doe's account, and he comes along
and asks for his account to be deleted, I would have to purge not just my
database record, but the backup images from the past as well. That invalidates
ALL of my backups. It's not often practical to back up INDIVIDUAL users.

~~~
solomatov
This is very easy to work around. Associate each user with an encryption key
and store it in a separate database which is backed up with some retention
time. That database shouldn't be huge, so it's not a big problem. Encrypt all
data related to the user with this key. When you make a backup, the data is
stored in the same encrypted form. When you delete a user, just delete their
record together with the key. After this the user's data is virtually
irretrievable and for all practical purposes is deleted.
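This technique is sometimes called "crypto-shredding". A toy, dependency-free
sketch of the idea (the hash-based keystream below is only for illustration -
a real system would use a vetted AEAD cipher such as AES-GCM):

```python
import os
import hashlib

# Toy illustration of crypto-shredding: encrypt each user's data with a
# per-user key kept in a small, separate key store; deleting the key makes
# every backup copy of the ciphertext unreadable.

def _keystream_xor(key: bytes, data: bytes) -> bytes:
    """Symmetric XOR cipher: the same call encrypts and decrypts."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ s for b, s in zip(data, stream))

class UserVault:
    def __init__(self):
        self.keys = {}      # small key DB, backed up with short retention
        self.records = {}   # encrypted blobs, backed up indefinitely

    def store(self, user_id: str, data: bytes) -> None:
        key = self.keys.setdefault(user_id, os.urandom(32))
        self.records[user_id] = _keystream_xor(key, data)

    def load(self, user_id: str) -> bytes:
        return _keystream_xor(self.keys[user_id], self.records[user_id])

    def forget(self, user_id: str) -> None:
        # Only the key is deleted: old backups of `records` stay intact,
        # but the remaining ciphertext is irrecoverable without the key.
        del self.keys[user_id]
```

Only the small key store needs a short backup retention; the bulk data backups
can stay immutable forever.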

~~~
will_hughes
Unless I'm misreading you, you've just created a more complicated version of
the same problem. How do you delete the immutably backed-up keys?

You've gone from: backup.bak to backup-encrypted.bak and backup-keys.bak

~~~
yorwba
Obviously, you can't keep insisting on immutable backups for the key store.
Instead, you'll have to be able to invalidate a single key inside the backup
file.

This does make your backups slightly less reliable, because it's one more
thing that touches them, but if you do a sane implementation and exhaustively
test it, the risk is manageable.

I'm also not sure you really need to keep that many backups of this file.
Replicate it and make sure you can roll back when your replication is borked,
but if you _really_ need to restore your database from months ago, using a
newer list of encryption keys shouldn't be a problem.

------
_pdp_
This has nothing to do with the size of your company/startup and it has
nothing to do with regulatory compliance. It is pretty simple at its core: if
your company/startup gets breached and PII leaks as a result, then you are
liable for the penalty according to the general rules. I don't think anybody
will argue this is a bad thing. If anything, it will help companies be a
little bit more careful with what sort of data they collect, because frankly,
at the moment almost every company is perhaps guilty of collecting far too
much personal data under the assumption that one day it may become useful. If
you collect PII then you are liable for damages if you happen to mishandle it.

So here is how to avoid the GDPR penalties:

1. Get compliant - it is pretty much ISO27001 and it will cost you money

2. Don't collect excessive PII data and if you do, store it securely - after
all it is a very basic ask

3. Avoid collecting PII data at all cost - think of it as another form of PCI

Frankly, there is no need to panic.

------
CobrastanJorji
> The critical question for both businesses is whether users will click “yes”,
> when asked to consent.

Yes, users will click yes on basically anything. Facebook could put up a
message that says "In order to proceed, click yes to give us half the money in
your checking account" and the majority of Facebook users will still click
through. Look at EU cookie warnings. Did any of those warnings noticeably
impact anybody's traffic after the first week?

~~~
mattmanser
Cookie warnings are a bad example, as it was widely acknowledged by anyone
vaguely techy that they'd got it completely wrong from the start.

~~~
ben_jones
Eh, couldn't you make the argument that the Cookie warnings showed that a
government body could meaningfully change the experience of the internet? As
small as the warnings are, had anything like it been done before?

~~~
foota
Sure, now we get annoying warnings on every website we visit....

~~~
kuschku
Because the UK intentionally built a loophole into their version of the law.

The actual original cookie law, decided at the EU level, requires that users
actually be able to opt out.

But it was a directive, and so countries could interpret it for local
implementations.

The GDPR is a regulation, which means its text is directly law, and it also
means it can be a lot stricter.

~~~
grabeh
Not so much a loophole but the UK Information Commissioner certainly adopted a
more relaxed approach to enforcement of the requirements of the cookie law
compared to other countries.

And the law required prior informed consent to cookies with opt-out not
generally being considered to be valid consent.

------
bflesch
I see this as yet another tax on (European) startups who have to invest even
more resources into regulatory compliance.

This prohibition on freely using all available data will create a great
arbitrage opportunity for the shadow economy, and will have a net negative
effect on innovation.

I think prohibition has very bad side effects, and that MORE transparency is
the way forward in politics, economy, and also society. This includes allowing
businesses to use all the data they can get their hands on. People can produce
infinitely more data than any google can realistically process.

I cannot understand why people who are otherwise for transparency and against
prohibition are celebrating this as a big win against FB/AMZ/GOOG, as those
players can easily shell out another $10M here and there to be compliant with
this regulatory monster.

~~~
the_mitsuhiko
> I see this as yet another tax on (European) startups who have to invest even
> more resources into regulatory compliance.

This also affects American companies, and the degree to which it affects you
primarily depends on how much of your business model depended on doing
nefarious things with customer data.

~~~
bflesch
Of course this affects American companies, as the whole thing is primarily
designed as a weapon for the EU against US dominance in the consumer space.

The big problem for the EU is that consumers actually choose the best product
in a free market (the internet of free services), and they overwhelmingly
decided to use the US-based options.

All the framing as "nefarious" is propaganda; consumers freely choose the
option they value the most. If someone else came along providing more value
than google or facebook, everyone would switch in an instant.

~~~
the_mitsuhiko
I suppose we fundamentally disagree on a few things here, so I won't go into a
discussion about the motivation of this law (which I, as an EU citizen,
support). I do however want to leave a note on this quote:

> All the framing as "nefarious" is propaganda, consumers choose freely the
> option they value the most.

Customers cannot choose freely. I'm a customer and I cannot choose certain
products because they do not exist. Companies I never engage with are tracking
my activities through tracking pixels and other things, and because I never
establish a business relationship with them, I cannot avoid that. This bill
now forces a company I might do business with not to do business with
companies that do not permit me to get rid of my data.

I think this is a good development because it finally makes certain backroom
deals visible.

~~~
bflesch
> Companies I never engage with are tracking my activities through tracking
> pixels and other things and because I never establish a business
> relationship with them, I cannot avoid that.

You can easily avoid being tracked by using an adblocker. Other websites only
track you because they are business partners of the tracking companies, which
provide a lot of value in terms of analytics for the business - free of
charge.

> I think this is a good development because it finally makes certain backroom
> deals visible.

As a German, I'd like to have more transparency into the backroom deals that
are done in Berlin and Brussels.

But this won't happen, unfortunately, and they'll try to regulate IT to death
to the benefit of local corporations who have failed again and again to
provide the consumer with products as valuable as their US counterparts'.

~~~
the_mitsuhiko
> You can easily avoid being tracked by using an adblocker.

Except not really. Plenty of tracking happens regardless, based on
fingerprinting. And even ignoring ads, there are plenty of free services that
after a while turn out to be so shoddy that they lose the data I left on their
services and provide no way for me to demand deletion.

I get a mail every other month that my email address and password were found
in a data leak.

This regulation is a good first step of forcing companies to think about the
consequences of having data.

> As a German, I'd like to have more transparency into the backroom deals that
> are done in Berlin and Brussels

Same. I want a lot of transparency including from my own government. I'm
however going to accept any positive development and won't demand them to be
in a certain order :P

There are a lot of people who are trying to shape the EU into a better
institution. It's not perfect but it's a pretty good start.

------
danarmak
> Nor can they deny access to their services to users who refuse to opt-in to
> tracking.[1]

Taken literally this means it's illegal to provide a service in exchange for
tracking. Can someone elaborate on whether this is true, what else it applies
to, and what other business models are made outright illegal?

~~~
dordoka
I guess that sentence is referencing recital 43. IANAL.

See
[http://data.consilium.europa.eu/doc/document/ST-5419-2016-IN...](http://data.consilium.europa.eu/doc/document/ST-5419-2016-INIT/en/pdf)

~~~
danarmak
Recital 43:

> In order to ensure that consent is freely given, consent should not provide
> a valid legal ground for the processing of personal data in a specific case
> where there is a clear imbalance between the data subject and the
> controller, in particular where the controller is a public authority and it
> is therefore unlikely that consent was freely given in all the circumstances
> of that specific situation. _Consent is presumed not to be freely given if_
> it does not allow separate consent to be given to different personal data
> processing operations despite it being appropriate in the individual case,
> or _if the performance of a contract, including the provision of a service,
> is dependent on the consent despite such consent not being necessary for
> such performance._

(My italics.) The second sentence seems clear: "consent is presumed not to be
freely given" if the service could be provided without the consent. Which
means consent cannot be traded in exchange for an unrelated service, like e.g.
webmail.

I'm not sure what the relation between the two sentences is. Does the second
one ("consent is presumed...") apply only to the cases addressed by the first,
i.e. "where there is a clear imbalance"? Or are they independent?

~~~
dordoka
Again, IANAL and not a native English speaker, but my understanding is that
they are not independent.

Consent is presumed not to be freely given IF it does not allow..., OR IF the
performance...

~~~
danarmak
I was asking about the relation to the first sentence, the one starting with
"In order to ensure that". That first sentence talks about "a specific case
where there is a clear imbalance between the data subject and the controller,
in particular where the controller is a public authority". Is the second
sentence ("Consent is presumed not to be freely given...") limited to cases
with such imbalance?

~~~
dordoka
Oh sorry. In that case, my understanding is that they are independent.

------
Sephr
The author believes that users have little incentive to allow Google to
provide personalized Google Search results.

I don't think any technical-oriented people in this thread would agree that
they have "little incentive" to allow Google Search personalization. When I
turn off Google Search personalization, I get inferior search results that are
less likely to be what I was searching for.

If you don't want your results personalized, there is an option in the search
results to turn personalization off.

The problem I have with this law is that Google will need to default to non-
personalized results and then prompt users if they want personalization.
Google probably doesn't want to increase UI friction, so they will most likely
just disable personalization and not prompt to enable. This will result in
less-engaged users and inferior search results for the average EU citizen.

~~~
balls187
> I don't think any technical-oriented people in this thread would agree that
> they have "little incentive" to allow Google Search personalization.

I disagree. I turned off Search personalization a long time ago, and haven't
looked back.

~~~
Jordrok
Same here. I do most of my browsing in an incognito mode-like environment (so
no persistent cookies or sign in to my Google account), and even when I am
signed in I turned off the search history a long time ago.

------
AJRF
I'm going to read thoroughly through the terms and conditions and get a case
going in the European courts when this comes into play, because you know for a
FACT Google and Facebook will put in some vague term letting them collect data
for "future" improvement of the service. Watch and see.

~~~
phatbyte
I believe that's what the EU is trying to achieve:

> “A purpose that is vague or general, such as for instance ‘Improving users’
> experience’, ‘marketing purposes’, or ‘future research’ will – without
> further detail – usually not meet the criteria of being ‘specific’”

------
raimue
In case you are puzzled the same way as me, GDPR stands for General Data
Protection Regulation.

------
hedora
Sweet. How do I inform Google that I moved to Europe (even though I didn't)?
VPN tunnel?

~~~
s17n
If you want to opt out of data collection, it's not that hard, just go to
google.com/privacy

~~~
confounded
What does this definition of "opt-out" mean?

~~~
microcolonel
The code in their tracker stub (always inlined as a <script> tag) is rendered
inoperable.

------
davidgerard
We [technology dept at a non-computer business in the UK] got the lecture
about this at work. Turns out geeks are fans of this approach!

I've been using "GDPR hazard" as a useful way to kill bad ideas at work. "Sure
you can do that! We just need you to confirm that your business unit accepts
responsibility for this user-identifying data and ... oh, we can delete it?
I'll do that now then."

We have _lots_ of user-identified data, going back years. I can't see it as a
bad thing for us to behave properly with regard to it, and to be required to
do so.

------
0x27081990
Google and Facebook will find a way. The problem is small/young startups.

~~~
kbart
What problems? So you can't collect all the data on your customers and do with
it as you see fit? And when it gets leaked, just say "Oops, sorry"? If taking
private data seriously is a "problem" for some startups, I want to see them
burn in fire.

------
fixermark
So, how long until this one also gets neutered when European governments
realize they can't even bring their own websites into compliance with the new
law? ;)

~~~
kartan
> So, how long until this one also gets neutered when European governments
> realize they can't even bring their own websites into compliance with the
> new law? ;)

I have worked for 2 big tech companies in Europe. And in both, there is a big
effort to make sure that they are compliant with the legislation. I see
everyone taking it seriously. Why do you think that it is going to fail?

Even the stupid, really really stupid, cookie warning was implemented
everywhere. What makes this different? (Apart from this actually being a good
law that protects citizens from indiscriminate tracking.)

~~~
fixermark
Regarding the cookie warning, I was mostly thinking about this infographic
humorously describing the delta between what was originally intended and what
was eventually implemented [[https://silktide.com/app/uploads/2013/01/Cookie-
Law-infograp...](https://silktide.com/app/uploads/2013/01/Cookie-Law-
infographic-reduced-size.png)]. As well as the ICO back-pedaling from explicit
consent to implied consent ([https://www.out-
law.com/articles/2013/january/ico-to-change-...](https://www.out-
law.com/articles/2013/january/ico-to-change-cookie-policy-to-recognise-
implied-consent/)).

I take it the actual implementation went a bit more smoothly than the
infographic suggests?

------
mzzter
What does the first footnote mean?

> "Nor can they deny access to their services to users who refuse to opt-in to
> tracking.[1]"

> "[1] Regulation (EU) 2016/679 of the European Parliament and of the Council
> of 27 April 2016 on the protection of natural persons with regard to the
> processing of personal data and on the free movement of such data, and
> repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ
> L119/1\. See Recital 42’s reference to “without detriment”, Recital 43’s
> discussion of “freely given” consent, and Article 7(2) prohibition of
> conditionality. See also the UK Information Commissioner’s Office’s draft
> guidance on consent, 31 March 2017, p. 21, which clearly prohibits so-called
> “tracking walls”."

What, in this regulation, prevents a company from denying users who opt out
access to a service provided free of charge, or from giving them a downgraded
experience? And how would a court measure the level of service?

~~~
taysic
I'm guessing they would force the company to make tracking not on by default
but rather opt-in (lol)? But what's confusing is: if you clearly state how the
data will be used upfront, do you need to offer an opt-out? What if your
service simply doesn't work unless the data is used in a particular way?

~~~
grabeh
Under the GDPR, one way to use data for purposes unrelated to the underlying
service provision is to obtain consent. That is not opt-out consent but clear,
informed consent through opt-in.

If your service won't work functionally without certain data, then consent is
not the right ground of processing to rely on. There is a specific ground
relating to processing necessary to provide a service.

If your service isn't financially viable because you can't use data to obtain
revenue to support the underlying service provision then consent may not be a
viable ground because of the above reasons.

The debate continues however on how to support free service provision outside
of the confines of consent.

~~~
taysic
Interesting but complex. I could imagine there are a lot of issues on how to
express your intent with the data.

Some uses of data might be ancillary to the direct goal of the user but still
unexpectedly useful.

Interesting from the article: "purpose that is vague or general, such as for
instance ‘Improving users’ experience’, ‘marketing purposes’, or ‘future
research’ will – without further detail – usually not meet the criteria of
being ‘specific’.”[3]

------
richrichardsson
Would one way "around" this be, "Welcome to ABC Service, it costs X€/month to
use, or if you allow us to use your data to sell to our advertisers we will
waive this fee."?

------
Chaebixi
How might one make sure their accounts get classified as falling under the
scope of the GDPR? Would it be sufficient to set your location
to an EU country?

------
yuhong
I have been thinking that the ad bubble is a problem for a while now. Ads are
basically to encourage consumption, and the US economy has been debt based for
decades now.

------
throwaway91234
Throwaway since I don't want to involve my employer.

I actually work for a platform that is squarely in the GDPR crosshairs
(digital marketing). There are a lot of things where our lawyers' perspective
is different from what most people say here (I didn't talk directly to
lawyers, but I presume product managers did).

- You don't have to comply in 2018; you have to show that you started
seriously working on a solution, even if you're not fully prepared.

- You don't have to have automated processes for everything (e.g. deleting
from backups); it's perfectly reasonable to say "we'll process your request"
and do it manually (ref: startups spending inordinate amounts of effort on
GDPR compliance).

- Opt-in is not as much of a "game changer" as suggested here; my
understanding is that you can do implicit consent (notify the user about what
you do, give them a link to take action; crucially, that link might even be
the link to your privacy policy, which contains the link to the opt-out
interface... if I got this right - and I think that I did - this may not
amount to much more than a slightly modified "this site uses cookies" thingy).

- Delete requests may be handled by "de-identification" (don't delete the
data, delete the association with you).

- Related to that, while I don't have a definitive answer, I strongly suspect
that the GDPR only applies to information that can be positively associated
with you (e.g. authenticated activity). I'm not obliged to show you anonymous
browser activity or information that I've probabilistically associated with
you, for the simple reason that I might be wrong and might disclose sensitive
information (think of a girlfriend looking up "what does Amazon know about me"
and finding that "she is interested in an engagement ring" because you
anonymously browsed from her computer, thus spoiling your surprise even though
you were careful to delete your browser history / browse anonymously. Yes,
incognito mode doesn't necessarily help: we make an effort to identify
incognito sessions server-side and de-link them from the probabilistic
marketing profiles, because we don't want to negatively surprise customers;
but I suspect not all players are that careful).
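The "de-identification" approach described above — keep the data, sever the link to the person — might look something like this minimal sketch (the in-memory event table and its field names are invented for illustration):

```python
import uuid

# Toy event store: each row ties browsing activity to a stable user id.
events = [
    {"user_id": "u42", "page": "/rings", "ts": 1},
    {"user_id": "u99", "page": "/home", "ts": 2},
]

def deidentify(user_id: str) -> None:
    """Service an erasure request by de-identification: the rows survive
    for aggregate analytics, but the stable identifier is replaced with
    a one-off random token stored nowhere else, so the activity can no
    longer be positively associated with the data subject."""
    for row in events:
        if row["user_id"] == user_id:
            row["user_id"] = "anon-" + uuid.uuid4().hex

deidentify("u42")
```

Whether regulators accept this as equivalent to deletion in every case is exactly the kind of open question the commenter flags — the sketch only shows the mechanism, not its legal sufficiency.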

Overall... despite what many people think, I think big players are actually
fairly careful/sensitive about your privacy (well, if we exclude Facebook here
:D ). It's the startups that would concern me more... they have very little
incentive to guard your data well, because there are so many OTHER reasons why
they might fail, that "privacy disaster" is very low on their list of
concerns.

------
mankash666
Are all companies beholden to this, or only those with legal entities in
Europe?
For instance, can a Chinese company with ZERO legal presence in the EU
completely ignore these requirements? The internet has no real borders, after-
all.

~~~
M2Ys4U
Yes.

If you process the personal data of people in the EU then you have to comply:

Article 3

Territorial scope

1. This Regulation applies to the processing of personal data in the context
of the activities of an establishment of a controller or a processor in the
Union, regardless of whether the processing takes place in the Union or not.

2. This Regulation applies to the processing of personal data of data
subjects who are in the Union by a controller or processor not established in
the Union, where the processing activities are related to:

(a) the offering of goods or services, irrespective of whether a payment of
the data subject is required, to such data subjects in the Union; or

(b) the monitoring of their behaviour as far as their behaviour takes place
within the Union.

3. This Regulation applies to the processing of personal data by a controller
not established in the Union, but in a place where Member State law applies by
virtue of public international law.
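Condensed into a predicate — a rough paraphrase of Article 3 for illustration only, not legal analysis, with parameter names of my own choosing:

```python
def gdpr_applies(established_in_eu: bool,
                 subjects_in_eu: bool,
                 offers_goods_or_services: bool,
                 monitors_behaviour_in_eu: bool) -> bool:
    """Rough paraphrase of GDPR Article 3's territorial scope."""
    # Art. 3(1): processing in the context of an EU establishment,
    # regardless of where the processing itself takes place.
    if established_in_eu:
        return True
    # Art. 3(2): no EU establishment, but offering goods/services to,
    # or monitoring the behaviour of, people who are in the Union.
    if subjects_in_eu and (offers_goods_or_services or monitors_behaviour_in_eu):
        return True
    return False
```

So the hypothetical Chinese company with zero EU legal presence is still in scope the moment it offers services to, or tracks, people in the Union — whether it can practically be made to pay is a separate question.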

------
andrei_says_
Just wondering, would a European residing in the US be able to request EU
compliance for their data?

------
1024core
So, are ad clicks "personal data" ? That would basically destroy all adtech
startups.

~~~
gonmf
I'd imagine the host website needs to ask for permission to share specific
data with a third party named XYZ Startup, which will use that data for
language/country segmentation or whatever it is they do, and if the user
denies permission then random ads are shown instead.
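That consent-gated flow could be sketched as follows (the per-partner consent record and the field names are my own invention, not anything prescribed by the regulation):

```python
def ad_payload(consents: dict, profile: dict, partner: str):
    """Return the data to share with a named third-party ad partner,
    or None, meaning: fall back to a random, untargeted ad."""
    if consents.get(partner, False):
        # Consent recorded for this specific partner: share only the
        # fields it covers (here: language/country segmentation).
        return {"lang": profile["lang"], "country": profile["country"]}
    return None  # consent denied or never given -> random ads
```

The key design choice the comment implies is that consent is scoped per named recipient and per purpose, so a blanket "we share data with partners" flag would not be enough.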

------
mdip
My first instinct was to be pretty happy about these laws, which surprised me
quite a bit. I generally identify on the libertarian/pro-capitalist side of
things and "as a general rule" subscribe to the belief that government
regulations are often more problematic than problem solving[0]. So I had to
take a moment to analyse why I felt this way.

Here's the problem as I see it: I _know_ all of the things that are collected,
how they're collected, what shady practices are used[1] and I'm completely
aware that there _is_ no anonymity left on the internet. The old "when the
product is free, you're the product" isn't lost on me. In reading this,
though, I was still finding myself a little outraged[2]. I look at it this
way: if _yesterday_, we had a web with plain old "dumb advertising"
techniques limited in sophistication in the manner of television advertising
in the 90s, and today we ended up with _this_ , there would be rioting (in the
USA, anyway[3]). This didn't necessarily happen _slowly_ but it happened
gradually and _quietly_. I remember when Facebook announced that it was adding
the ability to track you on other sites that you visited while you were logged
out -- that was announced and it was met with criticism (briefly, though I
quit the platform about a month later in a quiet, personal, revolt).

Here's the thing - if you ask an average non-technical individual if they
understand that they're being tracked on the internet, they'll shrug and say
"yes". If you dig a little deeper, you'll discover that they haven't the
faintest idea how deeply they're being tracked and that they don't even have
an analogy in their own lives to equate that tracking to. I couldn't come up
with anything to describe the extent of tracking short of extremely lengthy
explanations of what's being done and used[4].

And then there's me - I understand I'm being tracked and have basically chosen
the head-in-the-sand approach. I use adblock, and a few extensions that
supposedly "limit tracking" (doubtful) but I know they're worthless. Here's
the thing, though, what choice do we have? And that's where I concluded how I
was able to land in favor of some form of regulation on this behavior[5]. It
is becoming increasingly impossible to avoid interacting with companies like
Google and Facebook[6]. I look at it this way -- a company that becomes a
monopoly in such an important industry can exert as much, if not more, control
over the citizenry than their own government[7] but without the limitations
imposed by democracy.

What should be done? I'm not sure. Self-regulation isn't working. I have zero
faith in government crafting any kind of law related to technology that won't
be some combination of horribly ineffective, worse than what we have today,
utterly _breaks_ something _really important_ , or is used as a means to
insert something _horrible_ (watch them try to pop in a line-item around key-
escrow). I'm kind of surprised to find myself thinking that the approach that
looks best, out of the options, is probably forced competition through
breaking up the companies involved, and I _hate_ that idea in principle and in
practice -- it's worked just about as well in the past.

[0] I don't want this to devolve into a flame-war of whether regulating is
"good or not", though I fear I may have just stoked that flame, I'm simply
providing background for contextual purposes.

[1] I half- _admire_ the creative uses of WebRTC with STUN on what are
otherwise regarded as _highly reputable_ major news sites. It's difficult for
me to _not_ see that practice as poking a hole in my firewall and I feel no
less outrage when I see that happening than I do when a piece of malware does
the same thing.

[2] Part of me had forgotten the idea that when GMail was "scanning e-mails
for advertising purposes", they were scanning e-mails that were coming
_inbound_ from non-GMail users who couldn't have possibly consented to that.
I'm sure there's a really good counter argument, but I'd have a hard time not
feeling a little violated by that practice if I weren't a GMail user, already.

[3] Probably elsewhere, but my experience is that some European countries'
citizens (particularly the UK, where I have the most experience outside of the
US) are more tolerant of this sort of thing, whereas when I was a child, you'd
have seen people gathering in militias the moment the government tried to
propose something like Real ID.

[4] I can only speak anecdotally since I had this conversation with family
members who are non-technical and after about _two hours_ , had them quite
disgusted -- asking _how is that legal_ ... and these are some of the most
government-skeptical conservative people you'd ever meet.

[5] And I have _zero_ faith in the US government being able to craft a law
that works. Minimally the "they must still offer the service if the user opts
out" will be removed, entirely, turning the "agree to be tracked" button into
the moral equivalent of the "Cookie Warning" -- something you click because
you _have to_. And philosophically, if we weren't talking about monopolies or
near-monopolies here, I'd agree with that approach.

[6] Yes, DuckDuckGo is my default search engine, everywhere. And I've now
trained myself to use the shortcut to get to google for the 60-70% of searches
that DDG returns unworkable results. I think it's my search patterns, which
tend to be very narrow in results, causing Bing/DDG to "broaden" and ignore
terms (or when used with parameters, simply yield nothing). My parents (both
retired) use DDG and _rarely_ anything else since I switched all of their
browsers around (they didn't even realize I had changed it -- they don't think
of Google as a company, they think of search as something "the internet just
_has_ ..."). They are perfectly happy with it.

[7] Or can work in concert with it. Requirements to hand over Facebook
credentials at the border are becoming common. I'm waiting for the day when I
say "yeah, I don't use that" and end up back in a little room with an angry
looking man asking me a bunch of (the same; slightly rephrased) questions and
responding to them with the assumption that I'm lying (personal experience on
that one; not fun). I mean, after all, I'm a programmer/live on the
internet, etc., surely I _must_ use Facebook and I'm trying to _hide_
something! /s

~~~
dingo_bat
Just want to say that any emails I receive are my property. And I can grant
consent to Google allowing them to mine these emails. Once you send an email,
you are ceding control to the recipient.

~~~
mdip
While it's nice to think that's actually the case, the legal situation is
almost certainly more nuanced. The thing that comes to mind for me is the
implicit copyright protection.

I'm not sure if that would apply to a message I sent to you, but it may, and
if that's the case, it's "my property", not yours. I'd be interested to know
if any case law exists on this. Considering how often the DMCA is abused, I
wouldn't be surprised if someone tried a DMCA takedown claiming copyright
ownership on an embarrassing e-mail sent to someone and then subsequently
posted online.

------
sageikosa
Cool. 88 more pages of functional specs on every project.

------
whyagaindavid
Will GDPR allow me to delete my Apple-ID? Thanks

~~~
virgilp
yes

------
fnord77
I need to move to europe.

------
Animats
Site will not render properly with Google Analytics blocked. Web site renders
with text on top of text. Grey on black text. Is this some fake site by a
Google front intended to give this idea a bad reputation?

~~~
prophesi
I use uMatrix on Chrome and have every non-pagefair.com script blocked,
including google analytics. Not seeing any site rendering issues here. Perhaps
it's your browser?

~~~
hedora
My adblocker blocks the base url of the page. Maybe there is some common
adblock regexp that matches stuff on this site.

~~~
eppsilon
That would make sense, as PageFair is an ad network:
[https://pagefair.com/about-us/](https://pagefair.com/about-us/)

------
cinquemb
Might be easier for small niche companies with no offices in the EU but
willing EU customers to ask for payment via a cryptocurrency.

It just seems to me in the long run, more and more laws like this will pop up,
and using cryptocurrencies will get easier/ more familiar.

~~~
rspeer
Nobody is going to look for SaaS on the black market, unless the service
they're looking for was illegal to start with.

~~~
cinquemb
> _unless the service they 're looking for was illegal to start with._

Luckily, we will always know of everything that was/is legal now will be
illegal in the future, and that people/companies throughout history will
always submit to the costs of regulatory compliance of all governments in the
world, no matter how burdensome they may be in specific instances.

