GDPR After One Year (truthonthemarket.com)
285 points by goerz 32 days ago | 408 comments



> individual “data rights” have led to unintended consequences; “privacy protection” seems to have undermined market competition;

That opening paragraph already speaks to the over-elevation of the market over any other concern. So it fits perfectly onto "news.ycombinator.com". Human rights, including privacy and data rights, are more important than the profits of some companies.

Most examples in the text are, for instance, related to companies failing to properly implement the GDPR (Amazon sending data to the wrong person, Spotify not asking for 2FA/email confirmation for the bulk download, companies deleting articles even when there would be sufficient public interest, ad vendors failing to ensure compliance and therefore seeing drops in demand, ...), that is, market failures - something this site would probably not call out but rather attribute to the legislation.


Market competition is usually the opposite of profits. Market competition usually works against the competing companies and in favour of their customers.


That is true in theory, but it is not (solely) what is at stake here. Here, we are talking about the cost of regulations, and these do eat into the profits of companies. To some extent, these costs could also hurt competition (assuming the competitor had the same data-vacuuming business model). While we cannot directly say whether the (possible) decrease in competition compensates for the compliance costs, looking at the overwhelming opposition by businesses small and large (incl. FANG), it seems like the costs are higher.


Compliance costs money; what's supposed to be new or particularly bad about that?

Many in this thread seem to argue that launching an internet-based business or entering a market as large as the EU was previously free, and I honestly don't see where this illusion is coming from. Yes, GDPR compliance costs money, just like a whole bunch of other things.

> To some extent, these costs could also hurt competition (assuming the competitor had the same data-vacuuming business model).

Same question: if that data-vacuuming assumption is true, why should the general public care about preserving that? GDPR regulation and implementations are by no means perfect, but it's not like transparency, forcing companies to think about their impact on their users' privacy, and other aspects don't present benefits as well. I realize that the question of regulation is a matter of philosophy just as much as it is political, but painting the GDPR as some kind of killer of good businesses seems, at least, very weird.


I'm in total agreement with you (see my original comment)


Oh, apologies, didn't realize the thread structure there on mobile. :)


Shhh, that kind of talk goes against the “market bad regulation good” thinking in vogue right now.


> that is, market failures - something this site would probably not call out but rather attribute it to the legislation.

Socially shaming companies does very little compared to legislation, so that's completely understandable.


In general this might be agreeable. But in these cases these are not direct unintended consequences of the legislation but rather consequences of companies failing to do their due diligence when implementing the GDPR. It would be like companies putting people in bomb disposal suits after something like OSHA is implemented and then complaining about the cost and lost productivity. Or to take the EU: recently the CJEU ruled that all work time has to be tracked (not only overtime), and a German company (in Germany this wasn't mandated before) responded by forcing employees to install invasive phone apps that track location etc. and using those to determine work time. Employees rightfully complaining should direct their anger at the company, not at a law that would work perfectly well with a standard punch card system.


> But in these cases these are not direct unintended consequences of the legislation but rather consequences of companies failing to do their due diligence when implementing GDPR.

The complexity of additional law is part of its unintended consequences.


How is sending the recordings of one user to another a result of law complexity?

In any case, there's no increased complexity here; the Data Protection Directive (enacted in 1995) already mandated access to one's data (Article 12 - Right of access). The GDPR mostly gave some real teeth to that existing right.


> That opening paragraph already speaks to the over-elevation of the market over any other concerns. So it perfectly fits onto "news.ycombinator.com"

I have not found the cultural climate on HN to be opposed to the GDPR at all. Both its goals and its implementation seem to be popular here overall, and most comments I see that are critical of it get enough downvotes to have their text significantly desaturated even if they're making a good argument.

It's my impression, though I have not worked in the field, that people operating ad networks believe some kind of tracking is necessary to prevent click fraud, and do not want to sell ads free of any tracking even when they have customers asking for the option. I don't know how true this claim is or whether alternatives have been adequately explored.

What I do know is that some of the industries that use ads, such as newspapers, are struggling to make enough money to continue operating. Nobody will miss a scummy adtech firm, but people might miss local news outlets. It's valid to talk about the impact on business not just in terms of profits but also considering the potential positive externalities of a given business's continued operation.


I recall a discussion on the matter. Everyone that wasn't selling ads felt that if it was impossible to sell ads without tracking due to "click fraud", there was no obligation from society to prop up their failing business model by letting them ignore the privacy rights of the public just because it was expedient to do so.


> their failing business model

The business model is fine; the EU and you are choosing to kill it. Not only are people arguing for no tracking, they are arguing for forcing other people to live without tracking. So if someone in Europe wants a free service in exchange for their personal information, they don't even have the choice. Additionally, the GDPR requires that businesses not restrict their products to only those that are funding them.


Well yeah, that's the meaning of laws/regulation, they apply to everyone. If someone in Europe wants a free service in exchange for [something illegal] well sorry but not allowed. That's not the hard part of this whole debate...


Except for the fact that in this case the "something illegal" is the user's own data, which they should get to decide what to do with. Isn't that the whole point of this discussion? That users have the right to their own data? If they decide they want a free service in exchange for it, they're only 'hurting' themselves after all, and if they really have a right to their data (instead of a right to have their rulers tell them what to do with their data) then they should be able to make that choice! You can't act like everything that's "illegal" is equally bad by grouping "personal data" with other "[something illegal]" items. That's a pretty severe case of equivocation.


Is it really the case that they are only hurting themselves?

The effect of targeted advertising on platforms such as Facebook seems to have had a very negative effect on democracy, for example.


I'm not sure I see that, but I'd be willing to consider that possibility if there was any data to back that up instead of hysterical op-eds and articles. I see a lot of those, but not a lot in the way of convincing evidence.

Although FB in general doesn't seem to help the political discourse, I don't think it's anyone's job to decide what ways of communication and discussion are right for a democracy, because if you control where people can speak and what they can say, it's a short step until you control how they vote.


> Except for the fact that in this case the "something illegal" is the user's own data, which they should get to decide what to do with.

An individual has negligible negotiation power. That's why we use regulations as a form of collective bargaining.


Wait what? I cannot say illegal things are illegal? The only association I made is that things that break the law are illegal, which is a tautology and obviously true. I never said everything illegal is the same.

Personal data has many meanings and for some, this data might bring their literal death so of course it's a serious topic. Not common in the last ~30 years of western history, but that's just a tiny slice of time/location so of course it's sensible many of us want to keep personal data, well, personal.


I apologise, I might have misunderstood your point then. I assumed you were saying something more interesting than a simple tautology. Why state a tautology?

I thought what you were saying is that we shouldn't want it to be legal because it's illegal and therefore bad.


> Except for the fact that in this case the "something illegal" is the user's own data, which they should get to decide what to do with.

And what websites have been open with their data collection before the GDPR forced them to? When I opened the data collection dialogs introduced by the GDPR for the first time, I expected maybe two or three entries, and it was almost consistently closer to 40! WTF. Calling it "user choice" when the site owner deliberately omits that kind of information is dishonest at best.


Forcing companies to tell users what they're doing with their data so the users can have informed choice isn't really my issue here.

The issue I have is the rules in what can be done with that data. Because those rules make it so that the users don't get to decide what they're OK with being done with their data.

If I'm being really honest here, the GDPR seemed pretty well intentioned, IMHO it just went a little far.


Maybe I read a different text than you, but I have yet to encounter a scenario in which a user would consent to processing that the law still would not allow, assuming the consent was gathered according to the basic principles of the GDPR (fair, transparent, specific, etc.).

Maybe I am missing something. Would you provide examples? I would be very glad to learn if there are edge cases I may be overlooking.


The GDPR does not restrict much what you can do with personal data in principle; it does require much more effort in explicitly informing the user (also for updates), and it does grant inalienable rights to users over their own data.


Those "inalienable" rights are kind of the issue here: it's telling the businesses and users how they can sell that data and what they can use it for, so it's not a choice by the user on what they do with their data, and more a collective decision with the government.

As for explicitly informing the user about what they do with their data, that part I have no quarrel with.


My point would be that there are a lot of illegal business practices. You as a customer cannot buy expired food.

> it's telling the businesses and users how they can sell that data and what they can use it for, so it's not a choice by the user on what they do with their data, and more a collective decision with the government.

That is true, and that should happen in cases where market incentives do not align with social or public interests.


I see where you're coming from, and it makes sense. Your point of view, if I understand correctly, is that the government should protect consumers from accidentally making bad choices. And I get that. It seems like it would be nice. But I don't think that's a road we want to go down, because if the government gets to make choices for people in one area, why not others? And why are we assuming that the government always knows better than the people who are actually in those situations?

I mean, if I'm being honest my views on government aren't very common, so it'll probably be expedient to agree to disagree. (:



I don't think tracking adtech is a fine business model, but I'm inclined to favor technical solutions over legislative ones. There are several reasons, including: technical solutions are available to everyone, not just specific regions; technical solutions evolve and respond quickly to a changing environment; technical solutions offer users more direct control over what they will accept; technical solutions are not coercive or backed by the threat of violence.


I love how everyone is downvoting you for pointing out something super basic: the GDPR is essentially the government deciding that the data of everyone in Europe belongs to the government, to decide what it can be used for and what it can't. That's not personal data rights. That's other people deciding what's best for you.


To make a little jump, this is like saying that child labor regulations mean that your children are actually property of the government. (not to show it is wrong, but that there is some background context needed)

Also, I can still give Facebook all my personal data. Facebook simply needs to get my consent to distribute and sell it, and I will forever have some basic control over what data Facebook has on me. The government has little to do in this.

Also (beware the strawman): as far as I know, people cannot sell their own organs in the EU. Is this a sign that your body belongs to the state, or that business models built on harvesting poor people's organs are unjust?


So for your examples, I would say yes, yes through its rules and actions the government has clearly shown that it thinks it owns those things. Including our bodies (drug war anyone?). And a business model that pays for organs is not "unjust" but maybe a bad idea for those that would participate, obviously. I mean, the way you phrase that makes it sound like they're going to be kidnapping poor people in the streets to steal their organs if there wasn't a law against selling organs, which doesn't make sense.

> Also I can still give facebook all my personal data. Simply facebook need to get my consent to distribute and sell it and I will forever have some basic control on what data fecebook has on me. The government has little to do in this.

So long as the government doesn't force companies to provide the basic control, that's how it seems like it should work! (:


> I mean, the way you phrase that makes it sound like they're going to be kidnapping poor people in the streets to steal their organs if there wasn't a law against selling organs, which doesn't make sense.

My understanding is that figuratively speaking that is almost what happened with subprime loans.

Corporations and the market can have a lot of power in performing predatory tactics. If drugs were simply legal, quite a few businesses would sustain themselves on other people's addictions.

One of the main reasons we need regulations is that any sensible and obvious law (like not kidnapping people to harvest their organs) has loopholes (like keeping people poor, ignorant and devoid of mobility (lack of education, criminal convictions, etc.)) so that they will agree to sell their organs.

Organ harvesting is a deeply extreme subject and obviously will not happen with or without regulations, but modern free societies are built on the free enterprise (possibly in the public sphere) of individuals, and consequently they need to handle cases where individuals gather too much power and can destabilize societies.

Every free society has this problem (including Bitcoin with a 51% attack) and needs to find a solution that both promises rewards for personal enterprise and incentives not to abuse the system (for Bitcoin (IIRC) they are, respectively, money and loss of hardware investment).

> So long as the government doesn't force companies to provide the basic control, that's how it seems like it should work! (:

(interpreting as government should not force companies)

My problem with that is that principles do not help us distinguish fair competition from predatory, unethical behavior. In the context of personal data and privacy that is relevant, as we live in a completely different universe from just a few years ago.

Gossip is not illegal, but if you were magically able to listen to every conversation in a 10 km radius that would be a problem. Legal and illegal are often linked to how hard it is to do something and the scale at which you can do it.


Actions of a democratic government are actions of the people.


Which is a fine argument, but people often don't consider that this kills off all the businesses that rely on those adtech companies. I think part of the problem is that it's not immediately obvious that sites like Google and YouTube only run because of that adtech.


They only rely on ads because that's the path of least resistance. If the GDPR eventually means that there is no Google or no Youtube anymore – which is not very likely – that just means the cost of Google/Youtube doesn't justify its benefits, assuming markets work at all.


Those services would still exist in other countries. They just wouldn't exist in the EU.


> I have not found the cultural climate on HN to be opposed to the GDPR at all.

That strongly depends on the time of day, read: Whether Europe or America is awake.


This is an astute observation about this site. I too have noticed, on many topics, that the comments here tend to acquire a different character when it's the middle of the night in California.


Many in adtech do support privacy and data protection, after all they're just people too and want a good experience online.

The problem is that GDPR is well intentioned but poorly implemented, a common occurrence in politics with examples in lots of sectors. These kinds of unintended consequences are what happens when politicians don't quite understand the nuances of an industry and focus more on regulation-in-principle and showboating rather than actually effective rules.


No, unintended consequences happen in every change one implements. Show me a law of any consequence, and I'll show you some unintended consequences it brought on.

I think you are confusing unintended with unforeseen, which is a different matter. But I don't think these were unforeseen; some are problems with the implementation, which will be corrected by the companies responsible, and others are just inevitable (if you give people access to something, by definition it makes it easier for a third-party to abuse that access).


These problems were not unforeseen. The politicians were warned over and over again and you can find articles about it going back years.

The implementation problems are with the law, not the companies. There are ways to enforce data protection and privacy without such complex and nebulous laws that aren't even effective against the worst offenders.


> There are ways

I've seen many critics of the GDPR say something to this effect; I've yet to see any make it concrete. Without wishing to put you on the spot, what leads you to that conclusion?


Purely anecdotal, but I've had many debates over the last year on HN with people who claim that GDPR is just protectionism - rather than being a sincere effort to improve human rights online, they argue that it's just a sour-grapes effort to cripple American tech companies.


Pretty funny argument given that the big US companies are benefiting the most from it according to the article.

“The consequence was that just hours after the law’s enforcement, numerous independent ad exchanges and other vendors watched their ad demand volumes drop between 20 and 40 percent. But with agencies free to still buy demand on Google’s marketplace, demand on AdX spiked. The fact that Google’s compliance strategy has ended up hurting its competitors and redirecting higher demand back to its own marketplace, where it can guarantee it has user consent, has unsettled publishers and ad tech vendors.” (Digiday)


Having the intention to harm US companies and, in the actual implementation, benefiting them are not mutually exclusive.

Political motivations will be complex for anything with as wide impact and as complex as GDPR.

The perception that the US tech companies would be affected the most was certainly factored in during the political process involving thousands of people. It's debatable how small or great an impact this had, not whether there was any.


> Pretty funny argument given that the big US companies are benefiting the most from it according to the article.

That would be the "unintended" part.

Only the ginormous companies can spend thousands of human hours on compliance while their smaller competitors either leave the market or get steamrolled due to the compliance costs. All this has happened before, and all this will happen again...


Undoubtedly it passed due to support from many people with each motive.


> Most examples in the text are, for instance, related to companies failing to properly implement the GDPR (... companies deleting articles even when there would sufficient public interest,...)

Some of those examples were deceptively reported. For example, the doctor who asked The Guardian to take down articles about her suspension: she had successfully appealed that case, and a judge overturned her suspension and ordered the record expunged: her name was dragged through the mud on bad information. This is exactly what right to be forgotten is meant for.


>> That opening paragraph already speaks to the over-elevation of the market over any other concerns. So it perfectly fits onto "news.ycombinator.com". Human rights, including privacy and data rights, are more important than the profits of some companies.

What do you mean? I use news.ycombinator.com every day and I consider human rights, and consumer rights such as privacy, to be extremely important. In fact, I use news.ycombinator.com because there is a very strong current in support of such principles by the users here.

There is also a strong streak of free-market capitalism and technology-first, you-can't-stop-progress techno-optimism, but that is the point. This site offers opportunities for debate.

You're assuming too much if you're extrapolating from a few comments you disagreed with to the entire userbase of this site. HackerNews is not an echo chamber. Not yet, anyway.


I'd say the consequences were intended.

> companies deleting articles even when there would sufficient public interest

Is that really part of the GDPR? That just seems authoritarian.


Not really. The GDPR stipulates a right to be forgotten [1] with the following exceptions:

> 3. Paragraphs 1 and 2 shall not apply to the extent that processing is necessary:

> (a) for exercising the right of freedom of expression and information;

> (b) for compliance with a legal obligation which requires processing by Union or Member State law to which the controller is subject or for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;

> (c) for reasons of public interest in the area of public health in accordance with points (h) and (i) of Article 9(2) as well as Article 9(3);

> (d) for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1) in so far as the right referred to in paragraph 1 is likely to render impossible or seriously impair the achievement of the objectives of that processing; or

> (e) for the establishment, exercise or defence of legal claims.

[1] https://gdpr-info.eu/art-17-gdpr/


If the fundamental thesis is that "the market fails" or that "corporations are irresponsible", then at the very least it should be predictable that some of these regulations will have the real negative consequences (which I think we can agree some of these are) expressed in this article. The goal of this realization is not necessarily to disparage the GDPR, but hopefully to learn and perhaps put together a more precise or better iteration in place that avoids these pitfalls. For example:

> Spotify not asking for 2FA/email confirmation for the bulk download

I'm not extremely familiar with the letter of the law, but if it doesn't specify that you need 2FA/email, and there are clear fines/downsides to not complying, I do not see how this is not a predictable issue. The incentive is to comply, since you've already put in the work to make it possible, and there are onerous punishments for not doing so. In other words, a false negative (disallowing the download) can be perceived as much worse than a false positive (allowing the download). This seems built-in: the goal of the law was to give teeth to the user's right to get this data.

If we just default to "it's the company's fault for not applying an additional layer of thought to all this", then whether it's true or not (and I agree it is!), it does not realistically solve the problem: establishing blame doesn't necessarily provide a path to making this less likely in the future as long as the equation is still heavily weighted towards disincentivizing false negatives. This is another way of saying: if we want to characterize corporations as lazy/malicious/what-have-you, then we can't be Pikachu-surprised-face when they act that way under the letter of the law, like some monkey's paw scenario; we should instead say "aw shucks, fool me once" and try to come up with something better.

> companies deleting articles even when there would sufficient public interest

Similarly, we should predict that it is entirely plausible that these rights will be abused, or leveraged, by those they provide an obvious benefit to, even maliciously. Here again it is interesting that we begin with the thesis that "we need these laws because companies have proven not sufficiently responsible on their own" and yet immediately make a law that is vague and thus defers major parts of the decision to these same companies. Many times this ultimately comes down to litigation where the boundaries of laws are worked out, and this is a very reasonable response: it's only been a year, and the courts will hopefully work out when these rules are mis-applied. However, it is on us to make sure we litigate "too much" right to be forgotten (as in the cases here) and not just cases where companies refuse to forget. If not, the courts will send a clear message that it is perfectly fine to blindly abide by every request as the path of least resistance. Again: the premise is that they don't care, and we still haven't figured out how to legislate caring.


> I'm not extremely familiar with the letter of the law, but if it doesn't specify that you need 2FA/email

The Regulation doesn't mandate specific technical implementations anywhere; it leaves that to the industry, which is the expert in that regard. But on the subject of the Right of Access it does explicitly say:

"The controller should use all reasonable measures to verify the identity of a data subject who requests access, in particular in the context of online services and online identifiers"

It seems to me that not using 2FA/email means they haven't used all reasonable measures to verify the identity.
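For what it's worth, "reasonable measures" here can be cheap to implement. A minimal sketch of gating a bulk export behind an emailed one-time token (all names and the flow are hypothetical illustrations, not anything mandated by the Regulation or taken from a real framework):

```python
import hashlib
import hmac
import secrets

class AccessRequestVerifier:
    """Hypothetical sketch: confirm a data-access request out of band
    before releasing any export."""

    def __init__(self):
        # request_id -> (account_email, sha256 hash of the one-time token)
        self._pending = {}

    def start_request(self, request_id: str, account_email: str) -> str:
        """Create a one-time token; in practice it would be emailed to the
        address on file, never shown in the web session."""
        token = secrets.token_urlsafe(32)
        digest = hashlib.sha256(token.encode()).hexdigest()
        self._pending[request_id] = (account_email, digest)
        return token

    def confirm(self, request_id: str, presented_token: str) -> bool:
        """Release the export only if the emailed token matches.
        Tokens are single-use: the pending entry is consumed either way."""
        entry = self._pending.pop(request_id, None)
        if entry is None:
            return False
        _, expected = entry
        presented = hashlib.sha256(presented_token.encode()).hexdigest()
        # Constant-time comparison to avoid leaking the digest via timing.
        return hmac.compare_digest(expected, presented)
```

Storing only a hash of the token and comparing in constant time are standard hygiene; the point is just that "verify the identity of a data subject" doesn't require exotic infrastructure.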


This ideological purity doesn't work well in the real world, and it's not about profits either.

People don't care about privacy as much as you imagine them to, especially if they have to give up everything they get for ads today. One look at what people willingly share to the world on social media shows that.

But powerful monopolies are a problem and market competition is the correct answer to that power. Regulation isn't a magic cure and should be used to place guardrails on the market, but in this case could've been written far better to provide data protection without entrenching the major players even further.


> in this case could've been written far better to provide data protection without entrenching the major players even further.

Could it? How?


"GDPR has been the death knell for small and medium-sized businesses [...] Here is a partial list: [...] Unroll.me, inbox management app"

Don't know for the other companies, but for this one, good riddance, they had a notoriously scummy business model[1].

[1]:https://www.nytimes.com/2017/04/24/technology/personal-data-...


Every single company on that list deserved to die.


> Every single company on that list deserved to die.

Hi, I'm Brent Ozar, the cofounder of the first company in the list. (Ah, the joys of alphabetical sorting.)

I've written a big long post[1] about why we stopped selling to the EU, but here's the short story: the EU only represented 5% of our revenue, and for that small of revenue, I wasn't prepared to risk the GDPR's fines if any one of the third party tools we use had a problem.

During our GDPR prep with our attorneys, it was completely clear that the third party app ecosystem was in no way ready for GDPR enforcement actions. For example, we use WordPress and WooCommerce to sell online training classes. I'm a database administrator, and I know dang well that WP and WC aren't encrypting student data at rest, nor do they encrypt the other fields where people put student data - let alone how some of the plugins handle student data by storing it in the posts table, which was never designed to handle that kind of thing. If I had to face EU officials, I could never say with a straight face, "Oh yes, I was completely confident in WordPress's abilities to keep customer data secure."

I have confidence that someday, apps like WP and WC will have a better GDPR compliance story that doesn't just meet the bare letter of the law, but also the spirit. When they do, I'll be all about selling to the EU.

I'm doing the preparations that I can - for example, we've got a Privacy Policy that lays out our interactions with other partners, and lets EU folks request their data & delete it.

However, this is just the life of a small bootstrapped business: sometimes, you gotta make choices to focus on your best customers. 5% of my customers were threatening me with regulatory action that might result in huge fines if I let a ball drop. Unfortunately, I only have so many hours in the day. If I have the choice between doing regulatory paperwork for 5% of my customers, versus adding more value for 95% of my customers, I gotta make the obvious choice.

[1] https://www.brentozar.com/archive/2017/12/gdpr-stopped-selli...


That is a straight up dumb thing to say.

Care to explain why "Brent Ozar, IT consulting services" (first company on the list) deserves to die?


Phrasing is harsh, but from the sibling thread:

> If I had to face EU officials, I could never say with a straight face, "Oh yes, I was completely confident in WordPress's abilities to keep customer data secure."

Should this business really continue handling user data if it can't guarantee it will be secure down the line?


> it can’t guarantee it will be secure down the line ?

Then no business can handle user data?

There's no such thing as a guarantee. You can do what you define as the best effort to secure it, but you can't guarantee it will be secure.


Well, he himself said as an explanation that he couldn't audit the user data he received to ensure deletion or to ensure that it didn't fall into the wrong hands.


Uh, no, I did not say that.


You did say you could not ensure that your users data was secure, due to limitations of Wordpress and various poorly coded plugins.

Which is tantamount to saying you cannot ensure it doesn't fall into the wrong hands?


That's not what he said.

He stated that people contact the company via many different methods: email, twitter, Instagram, etc... GDPR mandates that when the user demands it the company must delete all associated records. This small company doesn't have the resources (or doesn't want to waste time) to go through all emails, all twitter exchanges, etc... and expunge them all every time someone demands it.

As far as WordPress plugins go - I get that too. The place where I work has 100s of 3rd party packages. To go through them all would require Y2K level of effort to make sure they comply and/or upgrade ones that don't.

So I am not at all surprised that Brent Ozar didn't think EU was worth the effort.


This looks like an overzealous interpretation of the law more than anything else. Looking at both the founder's comment and yours, I can only +1 hannasanarion's comment.

And because someone will probably ask "why", here is why: 1) The GDPR was not designed to drown small businesses in expensive processes, forcing them into bankruptcy or into cancelling their expansion in Europe. It was designed to force business owners to think twice when they plan on getting rich by exploiting and reselling customer data to third parties, or by performing "smart" operations on this data (i.e. any company that sticks the words "AI" or "smart" or "neural" or "deep" close to "customer data" in its business model).

2) If a user agreement (or privacy policy) specifies that data requests should only be carried out through medium X (e.g. an email address), then that shuts down all discussions surrounding the "people contact the company via many different methods: email, twitter, Instagram, etc..." argument.

3) "I'm a database administrator, and I know dang well that WP and WC aren't encrypting student data at rest, nor do they encrypt the other fields where people put student data." So what? GDPR nowhere says "student data should be encrypted at rest". It says it should be protected from unauthorized access, but not that it should be encrypted. Encryption is one way to respond to this requirement, and 9 times out of 10, it will be implemented with security flaws much worse than simply enforcing access control to the data. By trying to address a problem that does not exist with a solution that is inadequate, this business owner basically failed at his primary mission: managing risk.

Two arguments raised, two arguments completely wrong. Hence the justified conclusion: I am glad that these businesses shut down or stopped playing with EU citizens' data following the enactment of the GDPR.


> GDPR nowhere says "student data should be encrypted at rest". It says it should be protected from unauthorized access,

The reality is that you can't protect unencrypted data from unauthorized access. You can try, but you can't guarantee it, not when you have hosting partners, for example. Encryption is just one completely reasonable defense mechanism that needs to be part of a larger strategy. I wasn't comfortable defending the company without personally identifiable data being encrypted. You might be. I'm not.

> this business owner basically failed at his primary mission: managing risk.

To the contrary, I succeeded. I eliminated the risk at the cost of 5% of my revenue. I sleep great at night not worrying about the GDPR.


Each item on the list deserves a little blurb on why GDPR "forced" its removal from the EU market.

Examples: Unroll.me's entire business model was made illegal in the EU.

Hitman: Absolution faced problems in taking ownership of its EU servers.

The two games Loadout and Super Monday Night Combat both claimed not having the resources necessary to comply with GDPR. For perspective Loadout had a peak of 208 concurrent players in 2018[0] while SMNC had a peak of 40 players in 2018[1].

[0] https://steamcharts.com/app/208090 [1] https://steamcharts.com/app/104700


What's wrong with Pottery Barn?


"GDPR has made it so that only large multinationals can steal your personal data :["


Exactly. Unroll.me continues to operate outside the EU and continues to sell access to people's emails. "We don't charge you for this amazing service" they claim, at no point making it clear what their business model is. They prey on the ignorance of consumers, exactly what regulations (like GDPR) should be protecting consumers against.

I suppose a similar article could be published about the pharmaceutical industry, crying about the cost and consequences of FDA regulation. Doesn't mean we should scrap FDA regulation and "let consumers decide" whether drugs are safe or not.


Drawbridge: cross device matching (ie fingerprinting) service

unroll.me: selling your inbox contents, but "anonymized"

FamilyTreeDNA: proudly letting law enforcement and probably tons of others search your (and your relatives!) DNA, without -- for your convenience -- asking your, or your relatives', consent

Klout: kinda scummy, and also not phased out because of GDPR

>Lithium CEO Pete Hess reported that Klout is a “a standalone consumer-facing service” that no longer fits the focus of delivering customer service solutions. In addition, “recent discussions on data privacy and GDPR are further expediting our plans to phase out the Klout service, giving us a chance to lead on some of the issues that are of critical importance to our customers: data privacy, consumer choice and compliance.” [1]

We're supposed to mourn these companies? We shouldn't trust an author or site whose best choices of companies to mourn are (1) at least in some cases, not gone because of GDPR, and (2) mostly companies we're better off without.

[1] https://www.relevance.com/lithium-ends-klout-gdpr/


Also Klout was shit. The negatives of social media magnified to parodic heights.

https://xkcd.com/1057/


Color me surprised: scammy businesses are losing millions and exiting the EU. I'm totally happy about the outcome.

Though there is still a lot of abuse and dark patterns going on; I believe most sites should make it as easy to "opt all out" as to "opt all in" for cookies, for instance.


The problem with any regulation is that it increases the startup costs for smaller businesses.

So as more regulation comes in, it will just end up cementing the large players in place, as they can absorb the costs of any regulation, while smaller businesses will face higher startup costs (which, let's face it, were previously next to nothing).

So while you may be rejoicing now that shitty companies are gone, regulation will just make it harder for these massive companies to be toppled, as it makes it harder for smaller companies to comply.

The EU is trying to push through Article 13, and any site that has user-generated content will have to have some sort of upload filter to check for copyrighted content. That is going to cost money to implement, and since YouTube hasn't really been able to achieve it, the only people supplying the software will be the likes of Google, Microsoft, etc. So again it will just make things harder for the small business and help the large businesses.

Also, a lot of these regulations are making the web a shittier place. Every time I go onto a site now, I have a stupid cookie and GDPR notice plastered in front of what I want to look at. I already protect myself and don't care about their attempts to track me. It is just an irritation that nobody pays attention to, and it achieves the opposite of what it was intended to achieve.


We need regulations because people will go as far as they can to make more money. Businesses were upset when their country banned child labor while their competitors' countries didn't; same when weekends, vacations, and reasonable work weeks were introduced. What about safety requirements, food quality inspections, &c.

Self-regulating markets are a myth; just look at the US insurance and health industries if you want proof.

That's also why in healthy countries you get a lot of free passes when you start a business: lower tax rate for a few years, 0% loans, advisors paid by the state, &c.

> regulation will just make it harder for these massive companies to be toppled as it makes it harder for smaller companies to comply.

Why did no one topple apple, amazon or google in the last 25 years? If anything the lack of regulations when they started allowed them to become the de facto monopolies we all know today.


Some of those companies aren't even 25 years old. They didn't get toppled because they were the young upstarts growing into incumbents.

The problem isn't supporting privacy and data rights, it's doing so in a way that creates unintended consequences which actually worsen the market and UX for consumers. There are better ways this regulation could've been written, but it wasn't. That's the issue.


> The problem is any regulation is that it increases the startup costs for smaller businesses.

Why should this be the one thing we optimise for?


No-one seems to have suggested that it’s the one thing we should optimise for, but it is important. Small businesses are the foundation of economies, and every extra overhead ultimately damages those economies and so needs some justification that is of greater value, financial or otherwise. One year on, it’s still not clear to me that GDPR has achieved that greater good, and I write that as someone who is a very strong believer in stronger privacy laws in principle.


Where do you think jobs come from?


Businesses that don’t have security issues when handling private data, obviously.

I agree with the GP, in that ease of starting companies should not be the primary goal, with security and privacy relegated to the back seat. It shouldn't be harder than it needs to be, but not easier at any cost.


> It shouldn’t be harder than it needs to be

This is what's happening though.


Because if you do poorly on the small business front, then they can't grow into bigger businesses. How many EU tech companies do you know of compared to American ones?


Many, but I'm european so it probably doesn't count (;

If you make it harder for companies in order to protect people, it's still a win. I recently visited SF, "the center of innovation" for the startup world. I saw 2 people defecating on the street in 2 weeks, countless peeing, and at some points had to step over homeless people to walk down the street. If that's the cost of startups and innovation, please don't bring it to the EU.


I'm European too and I really wish people from Europe didn't have an attitude like yours. Some parts of Europe are incredibly poor, but of course we have a much smaller homeless problem, because if you're truly without shelter then you simply die in winter.


I'm from Spain, and there are homeless people in Spain, just a lot fewer than in the US. But it's not because they die in winter; it's because you don't automatically become poor if you lose your job, or if you need an operation, or if you study at university. It's safety nets that prevent people from losing everything and becoming homeless.


Legislation is not meant for you (at least for now).

It's meant for those who cannot/do not know how to protect themselves.


I don't see how this answers the point.

I'm of the opinion that privacy regulation is a good idea, but it's trivially true that it's an additional burden for start-ups. The "Is it worth it?" question is a legitimate one.


Honestly, the biggest problem with GDPR is its current implementation, i.e. the on-demand wipeouts.


And now those who cannot/do not know how to protect themselves will be unable to start a business on the internet in the EU. Do you think these two groups have to be mutually exclusive?


Protecting the masses is more important than edge cases. Most people will never start an internet business.


Without the people who start business type X, we won't have competition in business type X. Therefore a law that makes it hard to start businesses of type X will affect you whether or not you ever intend to start such businesses.

This applies for any X that you care to name, including "internet".

If you believe that you can both pass regulations that make businesses of type X harder to form, and enjoy the benefits of having new businesses of type X around, then there is probably a big flaw in your thinking.


In case of GDPR, X is "businesses abusing people's data", which essentially boils down to "adtech". We don't need more competition in adtech. We need adtech to die.


No, X is "businesses that handle people's data". For whatever reason.

The goal is to regulate adtech. But the effect is to impose regulatory costs on every company that wants to have a discussion forum on their website. (And the upcoming copyright bill is even worse.)


In the case where X is what you describe, then fine. If they can't start their company and simultaneously treat my private data with respect and care, then I don't care for them to exist.

The cost of business going up isn't necessarily a bad thing, if we're getting something valuable in return (IMO we are). The question is whether or not the increased cost is prohibitive, and you have not provided any evidence to suggest that's the case.


> The question is whether or not the increased cost is prohibitive, and you have not provided any evidence to suggest that's the case.

The thousands of companies that just block EU citizens rather than comply seems to suggest that they feel the cost is prohibitive.

As for more direct hard evidence I believe this would fall into the "unseen" category in Bastiat's That Which is Seen, and That Which is Not Seen and is, in effect, calling on someone to prove a negative.


> The thousands of companies that just block EU citizens rather than comply seems to suggest that they feel the cost is prohibitive.

They block EU because they deem compliance not worth the effort (now), usually because they get more than enough from their US markets. This doesn't mean the costs are prohibitive. Thousands more companies didn't block EU citizens. Some companies (notably news sites) even started to offer a superior product to EU citizens (e.g. plaintext news).

Also, even with those blocking EU or shutting down, nothing of importance is lost. These companies have competitors that are less abusive, who do fine.


Handling other people's personal data is a serious responsibility. GDPR imposes regulatory costs, in the same way that health and safety or environmental protection legislation imposes regulatory costs. It's not creating any new costs, it's just properly pricing an externality.

https://www.schneier.com/blog/archives/2016/03/data_is_a_tox...


Including all the businesses that run on adtech then? I guess so much for Google, YouTube, Facebook, Android etc.

I would much rather have adtech and those businesses. I think most people feel the same way, because they continue to use those businesses.


Adtech ought to die. Ideally, I would want to pay for Google and Facebook the same way I pay for Netflix and Spotify. In exchange, I would want them to treat the data about what I do online with the same respect with which my doctor treats my medical history.

The model where Google provides a service and users pay for it is more efficient and more societally healthy than the model where Google provides a free service, a million companies pay to place ads on it, and pass the cost of their AdWords budget onto users who get a 'free' service.

It is a model where consumers get better products, and where millions of creative minds aren't wasted making web pages uglier (or ruining cities with billboards, for that matter). It is a model where competition is also a little easier, because an alternative search engine can undercut Google's prices and carve itself a starting market niche, even if their service is not quite as good as the established competitor; instead of the current model where first you need to be better than Google in every way, and then you have to fight the network effect.

I have no clue how to get the world to switch to this model. It will require that elusive white whale, an online payment mechanism that is truly as frictionless as cash. And it will almost certainly require legislation rather than mere market pressure, because people can see their monthly Google bill but cannot see the vast costs of the marketing industry which they pay for every day.


>The model where Google provides a service and users pay for it is more efficient and more societally healthy than the model where Google provides a free service, a million companies pay to place ads on it, and pass the cost of their AdWords budget onto users who get a 'free' service.

That's cool and all, but people can't pay for it. These fees would add up quickly, and you'd basically never go beyond the few webpages you're paying for, because everything else costs money.

I probably would never have cared about the internet or anything related to computers, if websites had required people to pay. That would not have been an option for me or most people I knew growing up.


Everyone who cares strongly about this issue (not nearly as big a cohort as hn thinks) is against targeted ads. If they ever get their way and laws really end Google/Facebook's business model as GDPR intends, the much larger cohort of people who care more about not paying for services will start caring.


Adtech will die once people start paying for things on the internet. When that will be who knows.


I've started 2 startups in the UK since GDPR (well, 1 that happens to sell 2 different products), not really affected me one little bit.

But then again, they're not scummy companies.

Soooooo, bullshit.

I had to put in like a few hours thought into what data I was collecting and how long it was appropriate to keep it.

I happen to know quite a lot about GDPR because I dealt with it at a client I was previously working with, if you want to make it extremely complicated, you can. But you don't have to.

In one we actually track user's behaviour to make better recommendations, but we're open about it and they can disable it if they want. We also delete that data if they delete their account.

It's just a different mindset, it's their data, not yours. You're open about what you're doing and if they want you to delete it, you delete it.


> but we're open about it and they can disable it if they want

It's not legal, consent is opt-in not opt-out.


That doesn't say whether there is an informative pop-up.


There are no costs because no one is enforcing it.

> In one we actually track user's behaviour to make better recommendations, but we're open about it and they can disable it if they want.

If I understand correctly, this is opt-out instead of opt-in... If you were fined some percent of your revenue for this, you would feel the costs. Not only the cost of the fine, but also of reading and implementing GDPR more carefully. But data protection authorities don't have enough resources to audit even 1 in 100,000 of the companies that ignore GDPR at this level of detail. So you can live in happy ignorance, believing you are implementing GDPR.

That's not to say that GDPR doesn't help in general. The issue is that it will be a dead law, or a law that randomly hits some very, very small percentage of the companies breaking it.

Having a law that no one implements properly is just a recipe for abuse of power by authorities. "Show me the man and I’ll show you the crime" is well known to people who lived under Soviet rule. (And, no! The EU is not the Soviet Union. But some DPAs are in post-Soviet republics with people who were raised in this mentality.)


"I happen to know quite a lot about GDPR because I dealt with it at a client I was previously working with,"

There we go. You already made the time investment at someone else's expense. So thanks for proving my point.

My comments weren't about GDPR but about regulation in general. Any regulation requires more work which makes it difficult for smaller players. You had to do the extra work.


Should we ban food inspections too, since that means smaller players have to do more work? How about automobile safety testing, it's such a hassle for auto makers. Why not get rid of building codes and prohibitions on lead in children's toys while we're at it.


I imagine the anti-GDPR-folks might argue that overly onerous restrictions have been harmful to smaller players. Temperature requirements effectively made Peking duck illegal in California, until a lawmaker representing the Chinatown area proposed a law specifically exempting it: https://www.sgvtribune.com/2015/08/22/peking-duck-is-so-impo...


Should we also abandon the regulation on not stealing things? It makes my startup idea much more difficult too.

Individuals' rights over their data should just be another human right like property rights and not being harmed by others.


> Any regulation requires more work

Thinking about what you do and how you do it is probably not a bad thing.


Two days' worth of research. Horrible, absolutely horrible.


[flagged]


Because that knowledge is worth thousands to tens of thousands of euros in lawyer time. And you're still not guaranteed to get it right or be covered.

Your example is like saying that everyone that wants any kind of job should know multi variable calculus. When people protest that that's putting too much of a burden on people, you bring up that you got a job just fine, because you learned multi variable calculus in school.


Wouldn't it be more similar to saying anyone wanting a job should know how to calculate and file taxes? Or is that too inconvenient as well?


Their example is like saying if you want to open a restaurant you better take the two day course on food safety. Equating GDPR compliance with multivariate calculus is just a gross exaggeration. Yes there are risks, you get those with every venture you start. You're pretty well covered with the technical due diligence we as a sector should have put upon ourselves in the first place and you can externalise the rest easily, just like people do with many other regulations like taxes/finances.

We should really separate the protection of scummy business models and down to earth stuff like data takeout / account deletion and transparency as to what companies do with user data. The latter is neither rocket science, nor should it be particularly hard for any startup that's over the "my company is a fancy slide deck" stage.


But it's not just that. Read the rest of the thread how much time and effort people had to spend at various companies for compliance. It's not just about data takeout and account deletion.


If your business model is scum I'd wager that it should die a slow and painful death.

I really don't see, why a scummy business should get a pass, just because it's a startup.


It hurts the non-scummy businesses as well.

So the regulation causes problems for people that haven't done anything wrong.

And let's be clear here: people aren't dying, it's mostly ads and shitty data collection. I think it might be better to actually educate the public (which governments are doing) as to some of the pitfalls of the internet, rather than regulating the crap out of it.


it's mostly ads and shitty data collection

While this is true, it's exactly that which turned the world (and by extension the world wide web) into a fucking dystopia. Brexit would not have been possible without the whole concept of targeted ads and the data collection that goes with it.

Yep, I think adtech is utterly and totally evil. And all that to make a buck, or a billion.

I, for one, think that's a disastrously high price to pay for a few successful tech companies.

People aren't dying,

Actually I disagree here. When you look at the consequences of the technology in countries like Myanmar, The Philippines, Brazil, Cambodia and others and the likes of Mr. Zuckerberg and his ilk giving exactly zero fucks (unless it becomes bad PR) I'm afraid you're definitely wrong on that one.


> Brexit, without the whole concept of targeted ads and the data collection that goes with it would have not been possible.

However nobody mentioned all the people that didn't bother voting because they were at Glastonbury which was on at the same time.

I very much doubt that is true. The UK has been a bad fit in the EU, and there has been a sentiment for years that we don't want any EU interference. For example, many don't want "the EU monopoly money" (not my words, mind you), and generally the public is Eurosceptic.

The papers and politicians were trying to find a scapegoat because, quite frankly, it didn't go the way they wanted. Much like Trump's victory and the claims that Russia hacked the election (there were only a few thousand ads placed on Facebook, which paled in comparison to the Democrats' budget).

Many of the people who voted out were of older generations that don't pay attention to tech. So I find it dubious how much influence the likes of Cambridge Analytica really had.

> Actually I disagree here. When you look at the consequences of the technology in countries like Myanmar, The Philippines, Brazil, Cambodia and others and the likes of Mr. Zuckerberg and his ilk giving exactly zero fucks (unless it becomes bad PR) I'm afraid you're definitely wrong on that one.

Like exactly what? You haven't qualified anything here. You just claimed I am wrong because of what? What adverts, what is happening? This is a very vague claim.

I suspect much like the vote to leave the UK it will be very spurious evidence.


Like exactly what? You haven't qualified anything here. You just claimed I am wrong because of what? What adverts, what is happening? This is a very vague claim.

Vague claim? Not at all.

I was asking myself if I should actually bother to even answer, but then decided to invest a couple of minutes into some very basic DDG searches. You can find some results below.

Let me assure you that there's a ton more, if you just bother to open your eyes.

I close my argument here, since anything else would be either counter productive or violate site guidelines.

But please don't accuse me of spouting vague claims or not qualifying my arguments just because you seem more interested in a timely Uber or a cheap stay, and fuck all the consequences.

https://www.thedailybeast.com/exclusive-rohingya-activists-s...

https://www.theguardian.com/world/2018/apr/03/revealed-faceb...

https://www.nytimes.com/2017/10/27/world/asia/myanmar-govern...

http://nymag.com/intelligencer/2018/09/how-facebooks-free-in...

https://www.bloomberg.com/news/features/2017-12-07/how-rodri...

https://www.irishtimes.com/news/world/asia-pacific/facebook-...

https://en.wikipedia.org/wiki/Indian_Whatsapp_lynchings

https://www.theguardian.com/world/2018/jul/15/india-police-a...


> It hurts the non-scummy businesses as well.

If your business case depends on either abusing or being careless with other people’s personal data, how are you not a scummy business? That’s basically all the GDPR requires of you: don’t abuse people’s personal data and be careful with it. Both seem like common decency to me.


If that were really all that the GDPR required, it wouldn’t cost businesses that already did show that common decency anything, would it?

In reality, all regulations have costs for compliance and those costs typically apply to some extent even if you weren’t doing anything shady at all.


> all regulations have costs for compliance

if you were _already_ complying before GDPR existed (because your business model isn't scummy), then GDPR compliance _should_ cost very little, if at all.

If you weren't complying at all, then adding compliance is very costly after the fact. If you cannot make your business work without complying, then the business must die, as there's no natural right for a business to exist.


if you were _already_ complying before GDPR existed (because your business model isn't scummy), then GDPR compliance _should_ cost very little, if at all.

But unfortunately, that isn't really how it works. Under GDPR you could still find your privacy policy now isn't written in the correct terms, or your previous consents or notices weren't worded properly and might not stand up any more, or your methods of storing data don't make per-person permanent deletion straightforward. And all of this remains true even if you were compliant with all previous data protection legislation (at least here in the UK) and even if you weren't doing anything sketchy with the data and have no plans to do so in future either.

If nothing else, you probably need non-trivial amounts of management time to understand the new rules, some extra legal advice that you're going to have to pay for, and an update of your key documents to make sure everything uses appropriate structures and wording to comply. That alone could already be a significant cost for a small, bootstrapped business, and that's without changing anything about the actual data you're collecting or how you use it.


This is all a reflection of the old debates about the costs of doing technical things right. One way of looking at them is that if something works for the business, we should not pursue better software architecture or improved security or usability. Another way is to analyze and estimate the technical debt and eventually start paying it.

This is exactly what is happening with privacy now: business may cry about "removed incentives", "prohibitive costs", "eliminated opportunities" and other BS, but in the end it's just a compliance debt that they are not willing to pay. GDPR identified that debt and the mechanisms for claiming it, that's it. After the dust settles, there will be plenty of best practices and educated people, which will make compliance easy and certain business models impractical, and business will go on as usual.

Yes, compliance isn't a piece of cake, but there's nothing written in that law which a sane engineer or manager would not implement. Even the right to be forgotten makes sense: information about past crimes distributed via search is a kind of extrajudicial punishment which makes it much harder for people who have already served their sentence to find a job and return to normal life. It's the job of a government to prevent them from committing another crime; it's not the job of a search engine or a news website.


I don't think it's the goal of privacy or the tech itself that caused the costs to be so high. It's probably more related to the ambiguity of the law.


I don't think the law is ambiguous. It's usually the situation where GDPR has already been violated, or is going to be violated, and the data processor wants to find the least expensive solution to reduce the risks. In other words, it's not "How should we do it?", but rather "How difficult will it be to challenge our solution X in court? What are our chances to win?" THIS is ambiguous, but it's the same with any regulation.


The regulation is very ambiguous.

Try to understand what is even personal data from this:

https://ico.org.uk/for-organisations/guide-to-data-protectio...

It is all about risk, ambiguity and individual circumstances. I don't think that is bad, but there is no clear record of what it even is we are meant to protect.


It is and it isn't.

If you're in the business of "doing free services so you can skim GBs of data from users" or you "sell wholesale data collected without notice", the EU doesn't want you.

If you're doing a good job of keeping user data private except at the direct request of a user in a plain-language direct permission, then you're doing a good job to the GDPR. Slipups happen, and as long as you do your best to stop the bad thing, limit the breach, notify users, and be a good steward for their data, then it's all good.

As a US citizen, I try to make a point of only working with companies that adhere to the GDPR. I know they don't have to do so with me. But it tells me their internal processes are set up to respect users' rights. And well, running dual systems for different compliance regimes is a tough sell; it's easier to build one big system.


> as long as you do your best to stop the bad thing, limit the breach, notify users, and be a good steward for their data, then it's all good

If that regulator happens to like you. There is no schedule of offenses and penalties and due process, only an absurdly high maximum for selective enforcement.


And there are a lot of regulators. Some of them a lot more combative than others. That is my main reason for dislike for the regulations.

Overall I support the regulations, but I really wish the penalties had a more documented structure than "We will fine you anywhere from 0 to an 8-digit number (in our case), depending on what we think is right".


The negative outcome of more specific fines is that they get progressively easier to circumvent.


There is due process. If you think a regulator's decision was illegal, you can escalate to the courts. Some member states may not have the best justice system, but that's what the ECJ is for.

There is no explicit schedule – that could be gamed – but that doesn't mean regulators can act arbitrarily. Punishments have to be proportional to the infraction, similar cases have to be treated similarly... The GDPR just does not spell out how public authorities work.

It actually does say that punishments have to be proportional IIRC. I'm not sure if that actually makes a legal difference or if it was included to make the GDPR easier to understand.


And you pay for the lawsuit out of your own pocket. Now you need to run a business and fight a very expensive legal battle against the government. That same government that regulates your business.


>And you pay for the lawsuit out of your own pocket.

Only if you lose.

> very expensive legal battle

EU ≠ USA

>That same government that regulates your business.

So what? If you have a grievance with an entity, that's the entity you have to fight a lawsuit against.


Are you sure you only pay if you lose?

>EU ≠ USA

I don't see why this changes anything. Lawyers still cost a lot of money. They might not seem like they cost a lot of money to Americans, but that's because Americans earn a lot more money.

>So what? If you have a grievance with an entity, that's the entity you have to fight a lawsuit against.

One of the grievances people have against GDPR is that they don't like how GDPR's enforcement depends so much on the individual person at DPAs. You'll still have to deal with the person afterwards that you sued.


> Are you sure you only pay if you lose?

Yes. Each party paying their own fees is a uniquely American thing.

> I don't see why this changes anything. Lawyers still cost a lot of money.

Prohibitively high lawyer fees are a uniquely American thing. The ECHR guarantees practical and effective access to the courts.

> One of the grievances people have against GDPR is that they don't like how GDPR's enforcement depends so much on the individual person at DPAs. You'll still have to deal with the person afterwards that you sued.

That's a grievance Americans have against the GDPR. Given that the people who actually have experience with European authorities and law don't see these issues, it's very likely they don't exist.

You don't necessarily have to deal with the same person. And even if a DPA always assigned the same person to you, there were no oversight, and that person were petty and cared more about harming you than about their job: we have rule of law and a functioning court system. And I can't help but find these continuing insinuations that we don't pretty insulting.


Precisely this. The cost and complexity of complying with GDPR is directly proportional to the scale and complexity of your data processing operations. If you comply with the principles of the legislation - collect the minimum possible amount of data, store it for the minimum possible time and process it only in ways that are essential - then compliance is very straightforward. Things only become ambiguous when you're trying to do something that the GDPR doesn't want you to do.


What's written on that web page is clear enough for me, and it's the same as my own understanding of personal data. It is rather abstract, and I can admit that it may not be easy to understand for others without some good examples. But it's a complicated topic in general, one that has to be studied beyond reading a single article or the text of the EU law.


What is your opinion based on? Have you both read the law and attempted to bring an organization into compliance with it?

I have, and it is definitely ambiguous. To take a simple example, consider all of the cookie warnings that you now see. Intelligent and informed people disagree on whether they are required, enforceable, or sufficient.


I have to deal with compliance on a daily basis. The cookie warning is a commonly misunderstood implementation of the idea of having user consent for storing and retrieving information from their device. The law applies to local storage and other similar mechanisms too, and it is the intention behind using this data that has to be explained, unless it's one of the legitimate purposes for which consent is not required (e.g. session ID cookies and auth tokens). Since it is mandatory, it becomes a UX topic, not a legal one - how exactly to integrate the collection of consent into all possible landing pages of your website so that the user is informed about it prior to any data processing.


This law has been in effect in the form of various national laws for over a decade. The GDPR is only slightly different from the Swedish national data protection law, for example. So yes, this is entirely the tech industry's fault and debt. We as an industry have ignored these laws for too long, and are now crying because the debt is being collected. Boo hoo. Cry me a river.


You have to admire the EU's gumption: forcing the payment of technical debt with GDPR, and forcing us to face the hard reality of copyright law with Article 13 (hopefully leading us to abolish it after realizing how ridiculous it is when seriously upheld).

There's something so naive or earnestly human about them. If the U.S. kept being the only relevant legal force on the Western Internet, we'd mull around in gray areas forever.


On the question of where all the internet platforms come from, it is the US that's relevant. Maybe it's not a coincidence that the EU doesn't produce many internet platforms that are good?


And you attribute this to regulations rather than the ground truth that the EU is a hodge-podge of very different cultures, countries, languages and laws that only recently implemented a shared currency, and is totally unlike the huge, wealthy and comparably homogenous market that is the USA?


The scope of what is "personal data" under GDPR is much broader than you are assuming, you are only considering the obvious, simple cases.

It also covers an astonishing amount of industrial sensor data used solely for industrial purposes. Unfortunately, for many high-scale industrial sensor data models the technical infrastructure required for compliance literally does not exist. In some cases we don't even have the computer science required to build the compliance infrastructure. But the vast majority of people would be very upset if the business model of some of these companies became "impractical" and had to go away because GDPR compliance is effectively impossible. No amount of trying to do the "right thing" will make these industrial companies compliant.

There is a gross misconception that the GDPR only affects ad tech companies, or retail, or companies with business models involving people. This is far from the case.


Can you point to something which supports this claim?

In all of my reading it's been personal data, and it definitely wouldn't apply to the things people would usually associate with "industrial sensors", e.g. carbon monoxide levels in a space, or even occupancy data (e.g. for lighting/HVAC control) so long as it simply reflects whether an area within a building is occupied.

What's the specific requirement, and what makes it unattainable?


The position taken by every legal team I've worked with is fairly simple: if a sensor platform allows you to incidentally detect the existence of an unidentified individual at a point in space and time, then that sensor generates "personal data". The reason for this is that it is well-known that it is possible to analytically reconstruct the identity of individuals detectable this way with sufficient data. This is consistent with e.g. how ad tech data is treated under GDPR, so it is typically used as the standard for determining if industrial sensing platform data is "personal".

What people don't immediately grok is (1) just how many industrial sensor systems there are these days operated by diverse organizations -- almost every sensor type on an autonomous car, for example, is also widely used in many other industrial contexts, (2) the scale of sensor coverage in most places people occupy indoors and outdoors, which is far beyond what they typically imagine, and (3) how many of these sensors can be used to incidentally identify the presence of a person at a place and time, sometimes in very non-obvious ways. A single sample from a single sensor may not be identifiable but multiple samples from multiple sensor modalities often is. And the sensor modalities used for industrial sensor systems are increasing in diversity and resolution very quickly, which makes it even easier.

Humans perturb the environment they move through, and we have enough environmental sensors now that we can often track those perturbations across the sensor modalities to create a fingerprint. People have a difficult time imagining how easy this can be in practice until they've seen it done.


Thank you for this very interesting example! However applying this regulation to industrial sensors then is still the only right thing to do. Technical progress must be constrained by the speed with which society can adapt to it and by all the related concerns: if there’s lack of understanding on how to make the technology compliant or there are complications, it’s just that the cost of the technology appeared to be higher than anticipated. Business has to deal with it, just like in all similar situations - see hardware vulnerabilities in Intel chips for instance.


What you're saying makes sense, and I still agree with the GDPR. For example:

Power is used by a house. The meter runs. You pay the bill. The house has an address and a point of contact.

Power is used by the house. Machine learning is applied to map each individual and how they live in said house. The data is then sold to target things the ML algo picked up. You pay the bill. The house has an address and a point of contact, along with a detailed profile of each human in said domicile.

Same sensors exist, yet one violates the GDPR and the other one does not. Can you guess which one?


This is a very interesting list, especially the part about the GDPR increasing the attack surface and where the data gravity center is.

The part about "compliance cost" should be taken with a grain of salt. If you were compliant before, because you respected the users‘ privacy, the effort was relatively low.

The study about VC having dropped by 50% in the EU because of the GDPR sounds pretty weird to me. Unless of course there’s selection bias and we’re talking AdTech companies mostly.

An interesting number would be: how many people closed down forums and moved their discussion boards to Facebook?


> If you were compliant before, because you respected the users‘ privacy, the effort was relatively low

This is not true. Even if you are perfectly compliant, you need a complaint-response mechanism and lawyers in the EU ready to react to invalid accusations.

Given the GDPR took a complain-investigate model, one also needs to be ready for power-tripping regulators. (Recall the Romanian data protection authority using the GDPR to try to seize sources from a newspaper investigating corruption allegations [1].) Protecting against that requires, if not active lobbying, keeping lobbying connections warm. That costs money.

Ironically (and predictably), I've seen more data being funnelled to Google than before. They have the scale to deal with this crap in each of the EU's (currently) twenty-eight member states.

[1] https://www.techdirt.com/articles/20181114/01491541047/yet-a...


> Given GDPR took a complain-investigate model

Ironically, when the GDPR came into effect, many on HN were spreading fake news that companies would be litigated to death by users. Of course, to remove that possibility and ensure only legitimate claims are pursued, the data protection authorities act as middlemen. Such abuses could also just as easily have happened if people could sue directly. For example, nowhere does the GDPR imply that you need to hand over a source - that goes for journalists as well as non-journalists. Companies sanctioned have the right to appeal, and if the GDPR hadn't existed, the Romanian authorities would probably just have used e.g. tax law to stifle the RISE project.


> nowhere does the GDPR imply that you need to hand-over a source

Complain-investigate compliance regimes tend to result in deference due to the cost of investigations and other informal expenses regulators can rain upon the regulated. (It works in finance because financial firms have the margins to support it. Also, the industry regulators are checked by both the courts and a public regulator, the SEC.)

Complain-investigate is thus a terrible structure for a general business law. Strict liability for data loss or misuse (including the rights to data transcripts and deletion) would have been simpler. (Albeit less profitable for European law firms.)

Long story short, GDPR’s aims and technical costs (e.g. deleting user data from backups) are fine. The problem is the compliance structure. It’s fundamentally incumbent-biased, commercially and politically.


GDPR is just the 1995 Data Protection Directive with teeth. If you were compliant with the DPD, you were almost certainly compliant with the GDPR by default. The principles are the same and many parts of the legislation were carried over verbatim. GDPR came as a shock only because many businesses had been flagrantly disregarding the (weakly enforced) DPD for many years.

https://en.wikipedia.org/wiki/Data_Protection_Directive

>Even if you are perfectly compliant, you need a complaint-response mechanism and lawyers in the EU ready to react to invalid accusations.

Did American businesses really think that they were immune from prosecution under EU law prior to GDPR? No European business was under any illusions about the extraterritorial reach of American courts.


> Did American businesses really think that they were immune from prosecution under EU law prior to GDPR?

Prosecutors need to build a case before causing costs for the suspected noncompliant. Complaints, and regulators in complain-investigate regimes, can incur costs with zero evidence. This is why most systems reserve such structures for high-margin, high-risk applications, like banking regulation. Deploying it as a general business law is aggressive.


I don't know about it being low effort to be compliant. We spent most of a year with a significant portion of our software engineering teams devoting time to GDPR even though we are not any kind of data collection company. It's the legal requirements -- we had to audit every last piece of software, make little tweaks if necessary, etc, just to ensure we were demonstrably compliant with the law.

I wouldn't be surprised if 150B is actually a low estimate.


> even though we are not any kind of data collection company

Are you collecting data on your customers? If you are, then one of the things your company does is data collection, even if that's not what's in your business plan.


We store enough identifying data to do business with our customers, we do not collect data for data's sake. Not for metrics, nor for ads, not to sell, etc.


Does having a table called "users" with usernames, emails, password hashes, and last logged in timestamps count as customer data?


Yes. Emails are 100% PII. Even user names can be argued to be.


The term "personally identifying information" does not occur anywhere in the text of GDPR; the regulations use the term "personal data", which is defined differently.

I raise this issue in almost every thread about GDPR, because although it might seem pedantic, the error strongly implies that people have not read or understood the legislation. The difference between personal data and personal identifiers is integral to GDPR and the legislation cannot be understood without fully understanding that distinction and the implications that follow from it.

https://gdpr-info.eu/art-4-gdpr/


It counts as personal data, and as such, under the GDPR you have a duty to handle that data sensibly and responsibly.


Every single company in the world collects data on their customers.


Every company receives data about their customer, usually leaked by the customers themselves. How they handle it and what they choose to store / delete differs wildly.


> If you were compliant before, because you respected the users‘ privacy, the effort was relatively low.

Why? Don't you still have to build systems to be able to comply with user requests in a timely matter?


First, if you cared about user privacy, you would store as few data points as possible.

Second, it's very likely that you already have APIs in place that can retrieve all data for a user anyway. If you don't know what data you have about your users, you don't give a shit about their privacy, no?

Third, user requests are usually: a) what data do you store about me? B) Export all data. C) delete all my data (for real).
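For a service with a small data footprint, those three request types can be sketched in a few lines. The Python below is a toy with a hypothetical single-store schema (`DB`, `access_request`, etc. are all made up for illustration); a real service would have to fan each request out across every database, cache, and log that holds the user's data:

```python
import json

# Hypothetical single datastore keyed by user id (a toy stand-in for the
# many systems a real service would have to cover).
DB = {
    "alice": {"email": "alice@example.com", "orders": [{"id": 1, "total": 9.99}]},
}

def access_request(user_id: str) -> list:
    """a) What data do you store about me? (Art. 15)"""
    return sorted(DB.get(user_id, {}).keys())

def export_request(user_id: str) -> str:
    """b) Export all my data, in a machine-readable format. (Art. 20)"""
    return json.dumps(DB.get(user_id, {}), indent=2)

def erasure_request(user_id: str) -> None:
    """c) Delete all my data (for real). (Art. 17)"""
    DB.pop(user_id, None)
```

The hard part at scale isn't these handlers but orchestrating them over every system that holds a copy of the data.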


At any kind of scale that is 100% not the case.

The orchestration of a data extract from even a midsized corporation is a significant endeavour.

Someone in the company knowing what data we have on an entity is a significant step away from the entire company being able to access that, because, you know, we take data privacy seriously, so we don’t make it easy to access all data on a single entity.

If your approach to privacy is putting all the eggs in a basket, allowing easy extraction of everything from that basket, and hoping the basket can be kept secure I’d argue your model is weird to begin with.


This is so simplistic. There are many storage solutions for many different use cases.

Some of these are write once and immutable afterwards.

There are relational structures for transaction history that may also link to customers.

These all have to be re-designed in such a way that information can be removed from the system and exported from the system, while keeping essential information (such as past sales records).

This is not an easy problem to solve.


> Some of these are write once and immutable afterwards.

Got an example of something like that that'd make it impossible to soft delete a person? I'm struggling to think of any datastore in regular use that's write-once.



Yeah, as I thought, it's a blockchain/distributed ledger related technology. Hence why I said "regular use". I doubt large numbers of EU businesses are suddenly having to move data from their core ledger to another datastore because of this.


>Third, user requests are usually: a) what data do you store about me? B) Export all data. C) delete all my data (for real).

Are you implying that those 3 requests are simple to fulfill in a business running a modern software architecture?


Backups usually aren't designed for deletion of individual records. Until recently this was considered to be ok.


If your backups age out and are deleted over time it is ok (as long as that timeframe is reasonable).


_if_ you can ensure that you can re-execute deletion requests if the data needs to be restored.


With GDPR no backup can be kept more than 60 days.


It still is.


They aren't, but this is actually quite easy: PD gets encrypted on a per-user basis; to forget a record, throw away the key.


>The study about VC having dropped by 50% in the EU because of the GDPR sounds pretty weird to me. Unless of course there’s selection bias and we’re talking AdTech companies mostly.

Is it really that surprising to you that when the EU effectively bans one of the most profitable business models, venture capital investment in the EU drops by 50%?

To be fair, it's probably not just the GDPR but all of these regulations combined. Venture capital can move across borders, so why would you invest in a startup in the EU when you could just do it in the US?


> The study about VC having dropped by 50% in the EU because of the GDPR sounds pretty weird to me. Unless of course there’s selection bias and we’re talking AdTech companies mostly.

Could just be noise in the data?

Could be VCs determining that fewer products are actually worth pursuing if the main monetization model for everything is ads?

I dunno.


One thing my company did (we are US-based, but have an international operation as well) to try and mitigate the volume of compliance work to be done was section off software that would be used in the EU from everything else. Previously we had been working on making all of our software 100% internationally universal, but GDPR made that difficult, going forward we're kinda cutting loose the guys in the UK (there's some irony, I guess) to keep up the code that has to be GDPR compliant while the rest of the company focuses elsewhere.

So... anecdotally, I'm not at all surprised if the increased compliance cost made some people reconsider investments in EU businesses, even if they don't rely on ad revenue as a business model.


Personally, I'm totally fine with that lost investment. Especially since it's likely to be temporary.


Funny to read this in a US VC investment forum.


This isn't a US VC investment forum. This is a US forum subsidized by a startup accelerator, but otherwise quite generally about tech and geeky stuff, and frequented by lots of people from outside US.


Indeed. I'd even wager that a considerable portion of the userbase here has no idea what Ycombinator in the address stands for.


Only around 50% of the people in this forum are from the US.

https://news.ycombinator.com/item?id=16633521


The problem I think is happening with the compliance cost/VC stuff is that it's also tainted by the other internet junk (Article 13) that the EU passed soon after when they started doing this whole internet regulatory push.


Anecdotally, I thought that many companies just decided to exit the European market because the costs of GDPR were so high.


We did. We shut down our SaaS product in the EU and have no plans to reinstate it.


My question is, if you're a US startup, and you simply ignore GDPR requests, what happens?

Does Europe have some way to require its ISP's to firewall you off or blackhole your DNS? Can they force Amazon to shut off your AWS account? Do your executives risk being taken away in handcuffs to a European jail when they go to Europe on vacation?

If there are no consequences, why don't US tech companies just completely ignore it? (Of course, big players like Google probably have EU-based datacenters and other assets that could be seized to pay their fines. I'm thinking of small, cloud-hosted startups whose employees, bank accounts and physical assets are all on US soil.)


> If there are no consequences, why don't US tech companies just completely ignore it?

Once you grow big enough the EU will inevitably have leverage over you: Servers rented in the EU to lower latency, payment streams from EU customers, offices in the EU to get talent, subsidiaries created for tax reasons, executives on vacation, employees on conferences, money spent on advertising, etc.

If you are a startup in SV the EU might not have much direct pressure it can apply, but how would an investor react when given the choice of "we could spend some more money now, or we could do nothing and be significantly limited once we grow to a certain size, basically unable to do anything significant in one of the largest economies of the world"?


> how would an investor react when given the choice of "we could spend some more money now, or we could do nothing and be significantly limited once we grow to a certain size, basically unable to do anything significant in one of the largest economies of the world"

The simplest solution would be ignore GDPR, dominate the American market (which is easier to scale across than the EU), and then use that momentum to launch a simplified version in Europe. (Or buy a competitor.) The scale advantage will almost always outweigh being prepared for multi-market growth from the beginning.


Which gives ample room for a European competitor that does adhere to the GDPR to clean up the EU market. We live in a very globalised world and the EU knows the leverage it has -- just as the US knows its soft power extends well beyond her borders.


> Which gives ample room for a European competitior that does adhere to GDPR to clean up the EU market

Agreed. My point was with respect to an American start-up—compliance with GDPR is of lower priority than scaling. The priority, for both, should be scaling.

Advantage goes to the American start-up, however, in launching from a single market. But one might counter-argue that consumers in e.g. China will prefer to do business with European start-ups over American ones due to GDPR. (No evidence for that. But it’s a valid hypothesis.)


Not to mention that avoiding the GDPR - a law that shouldn't have needed to be written in the first place - is like walking around with a big sign saying 'we are evil and not to be trusted'. Because if you are to be trusted, a simple cursory check would confirm you are already within the GDPR.

We work in the b2b in the financial sector and part of our contracts in Europe is that all of the data is hosted in infrastructure that complies with the GDPR. That could be Google or Amazon, but not Slack or any SV startup.


Or they'll block payment processors from transferring money from EU customers to you

It's hard to imagine what a startup would be doing that makes them interesting enough for the EU to notice and want to levy fines, yet be completely out of reach.


You could probably get tangled if you accept money from EU citizens. If you don't take money from them (or use cryptocurrency), the EU can't really do anything.


If you're only moving packets, you generally have nothing to fear until the EU develops into an empire, at which point there's a good chance that they will have a mandatory firewall mechanism (some members already do impose firewall rules on ISPs through the courts, AFAIK).

If you have no business in the EU, generally the worst they'll do is censor your website.


If you are a US-based startup and don’t sell to EU customers, then I guess it doesn’t really matter if you attempt to comply.

However, most US SaaS-type startups very much want access to EU markets. Ignoring the GDPR won't matter until it does, and then it will matter very much. For example: you grow and want to establish a presence in the EU, investors with EU ties may be hesitant to get involved, or a potential acquisition is ruined because the buyer has an EU presence and isn't willing to take on the historical liability.

Yes, there’s a lot in GDPR. If you’re a startup that is making money by selling user data, the cost of compliance will be quite high. But if you are selling an actual product or service that generates revenue by collecting fees from your users, compliance is probably not as hard as you think. And building your startup with user data protection in mind, you’ll find it can be something you use as a selling point.

With more than a year of history, it’s not hard to find easy-to-digest articles that put GDPR in terms that an average person can understand. Integrate those principles and processes into your business, document what you’re doing, and then stick to it. Even without a huge compliance budget - if you do that and nothing else - you’ll be in a much better position than to just ignore it, even if you don’t fear punishment.


No B2C tech company can avoid doing business involving EU member state or UK citizens. You have to assume you’re in-scope unless you have zero contacts with Europe.


I thought it was EU/UK residents, rather than citizens. Even if all of your users are US citizens, some US citizens reside in Europe.


Neither residence nor citizenship are tests for GDPR’s territorial application.

It applies to any data collected in the Union: https://gdpr-info.eu/art-3-gdpr/


> Does Europe have some way to require its ISP's to firewall you off or blackhole your DNS?

Not in a systematic EU-wide way. Courts sometimes force individual ISPs to blackhole websites used for copyright infringement.

I guess if your company ignores the GDPR, it's treated as an illegal organization. So you may still be able to provide your service in the EU, but people cannot legally pay you, including paying you for ads.


GDPR enforcement is through the corresponding data protection offices of the different countries and fines issued by them (now if you don't pay the fines and completely ignore the offices that might be an issue that's escalated)

Some people are making it sound like the EU Cyber police is going to hack your services or parachute and kick their way into your office in SV because a user in Slovenia didn't get their data portability request on time, which is not what is going to happen.


GDPR set the terms of the debate as model regulation, and is inspiring similar legislation elsewhere. California's CCPA is largely similar to GDPR. Tech companies are lobbying in committee to neuter it, but it's not a foregone conclusion. Thus as a startup you should incorporate it in your architecture and roadmap, even if you do not execute on it right away.

Also your US clients may be subject to GDPR and pass it on to you transitively as they are required to do for subcontractors or IT services vendors.


If a US company wants to do business with another company (partnership, approved vendor, platform marketplace, or even having them as a B2B customer), that other company may require them to be GDPR compliant.


I never understood how anyone thinks being forgotten is a right. Wrong, false, or libelous privacy-related information should be correctable and removable, within limits. But facts about you and what you've done? No. I'm sorry you made an embarrassing mistake, but it's not worth losing so much public information and enabling bad actors just to save yourself from your own actions.


>I never understood how anyone thinks being forgotten is a right. Wrong, false, liable, limited set of privacy related information should be correctable, removable

It becomes a right when enough people in a democratic society want it to be one, it's that simple. People in Europe believe that the right of individuals to control information about themselves and to not be stigmatized for actions in the past is to be valued higher than public access to it.

I perceive the US attitude simply as a sort of voyeurism. We already know it well from celebrity culture where people's entire lives are picked apart and put on a platter for the public to drool over, I have no interest of seeing it expanded to everyone, so I'm thankful for legislation to give me at least some control over information about me.

The biggest beneficiaries of this might very well be children who have had their entire lives put on the net by their parents without even having the slightest say in it.


>It becomes a right when enough people in a democratic society want it to be one, it's that simple.

That's not what a right is; a right is something that some logical/philosophical moral argument has determined people should have, regardless of what other people think. That is the whole point of rights in the US constitution: to protect people against the government and the "tyranny of the majority". Otherwise you could call something like "the right of the German people to have no Jews within one kilometre nearby" a right if the majority voted on it, and the term loses all meaning.


This Lockean discourse about natural rights, common law and republicanism isn't really a thing in Europe. It's very US-specific; Europeans in general don't believe that their constitutions (if they even have one) are quasi-sacred texts.

At the end of the day, rights and laws are expressions of the preferences of the public. Europeans have different privacy rights because they want to have them. That doesn't mean those rights can't be good or bad, but they don't need to be derived from some higher realm of reason.


I'd disagree with that. Check out how many European countries have an eternity clause: https://en.wikipedia.org/wiki/Eternity_clause

"The German eternity clause (German: Ewigkeitsklausel) is Article 79 paragraph (3) of the Basic Law for the Federal Republic of Germany (German: Grundgesetz). The eternity clause establishes that certain fundamental principles of Germany's democracy can never be removed, even by parliament.[6]"

"The Parlamentarischer Rat (Parliamentary Council) included the eternity clause in its Basic Law specifically to prevent a new "legal" pathway to a dictatorship as was the case in the Weimar Republic with the Enabling Act of 1933.[7]"


That’s a pretty specific implementation detail of a few specific democracies that doesn’t really refute anything that the parent comment said.

Like, for example, in the UK (which entirely lacks a formal constitution) the right to govern is literally derived from the most divine source: god himself, through his agent, the Queen.

In practice, despite the quasi-sacred and divine foundations of our government, it doesn’t mean jack. If the Queen were to exercise any of her divine powers against the will of the current Parliament, it would cause a constitutional crisis and she would have those powers immediately stripped.


Which only proves that Germany is indeed very sensitive to how governments can degenerate under demagogic movements.

Recognizing that democracies can be infected does not mean that they must hold to some sacred text. It just means that some paths are considered too dangerous to even be entertained.


Only under post-modern, fully secular humanist conceptions of rights.


There are human rights, civil rights and simple mere rights by law.

Also, in the criminal-justice systems of most modern countries, criminals who have completed their sentences are by default considered to have a clean record (unless relevant).


The European correctional system doesn't believe in turning each small crime into a life sentence of marginalization and punishment, which is why we don't have to lock up a huge chunk of the population in cages for life. Letting people resume their lives after serving their punishment significantly reduces the number of returning convicts and makes society a safer place.


> Wrong, false, liable, limited set of privacy related information should be correctable, removable

Which is exactly what the law is for.

The examples in the article were deceptively reported. For example, the doctor who asked The Guardian to take down articles about her suspension: she had successfully appealed that case, and a judge overturned her suspension and ordered the record expunged: her name was dragged through the mud on bad information.

Compare that to the US where any kind of accusation, even if it turns out to be false, can easily permanently ruin someone's job, career prospects, or life.


You can sue for libel in the US as well as in Europe.


That is not going to give you your career back; a few news sites still have their wrong reporting on Nick Sandmann up.


Well, I think it is a bit more complicated. Humans typically do not remember as well as machines, so the intention is probably to bring the internet experience closer to the real-life experience. And the GDPR doesn't seem to distinguish between embarrassing YouTube videos and simple facts about what someone did (it would probably take decades in court to make such a distinction).

Nevertheless, it seems the real problem isn't the impossibility of achieving such a 'delete-from-the-internet' mechanism, but rather its low value for average people and its high value for deceitful individuals.

I mean, I like having a legal right to make Google/Facebook/E Corp delete all the data they have about me and my usage, but there will be few occasions when I will make use of that right and even fewer for people who don't even care what they share online. Imposters, on the other hand, will find that right most valuable.


Sometimes the only "mistake" you made is being born into the wrong minority.

Also, in a lot of countries punishment is meant as a corrective action, not revenge. This means even convicted criminals have the right to have their conviction hidden after some time.


Agree, and it doesn't even exist in most domains. I can't expunge my school transcripts or information from credit bureaus.

Do we really think a society where there is no public information about people is better?


You can: bankruptcy information expires after 7(?) years, and school records aren't public.


Which country in Europe are you from that school transcripts are public?


That's a fair point. It isn't public information.

But, GDPR applies to any information, even if it isn't available publicly.


But it does not apply to governments


Credit bureau information tends to expire off the record over time doesn't it?


It seems like a biased list, but it's good that someone is collecting links about these incidents.


It seems like there’s things from both sides of the camp on first glance, can you point out the bias you see? (Legitimate question, and I only quickly glanced.)


I'm judging by "Compliance costs are astronomical" when the supporting evidence is largely estimates from before it went into effect.

So you can't take everything too seriously, but still, it's good to collect more links. Also, the author is being clear about the weakness of some of the supporting evidence.


At least from what I read, it is the slant of the entire piece more than anything in particular. If we removed environmental protections we would also see a boom of growth in industry, but we decided that those benefits don't outweigh the environmental cost. Same here with the GDPR: after Cambridge Analytica used digital micro-targeting to heavily manipulate populations and elections, it is simply too high a cost to pay for what is otherwise a pretty shifty industry.

I'd heavily recommend that everyone here read the book Surveillance Capitalism; it is an incredible explanation of what goes on behind the curtain.

In short, the GDPR has some relatively minor problems and externalities, and the blogger wants to scrap the entire GDPR because of them...


[flagged]


If you start out with an opinion and look for evidence to support it, yeah, that is biased. But it doesn't mean it's not a useful contribution to the conversation. Certainly it's better than not looking for evidence!


If you look for evidence to prove yourself right, then you will find it. Even if you're wrong.


The article vaguely links Cambridge Analytica to the GDPR. Is there really a connection, or is the article merely trying to frame the GDPR negatively by comparison?


I don't think so. The APIs that Cambridge Analytica were taking advantage of were available long before GDPR became enforceable and are most likely illegal under GDPR because they allowed third parties access to your personal information without your consent - where it was a friend of yours who consented to revealing their information, Facebook would also reveal some information about you.

The Wikipedia article actually details all this quite well in the fourth paragraph (obviously without reference to GDPR): https://en.wikipedia.org/wiki/Cambridge_Analytica


Cambridge Analytica was a data-portability exploit: it leveraged your friends' ability to send your Facebook data to third-party apps. The GDPR mandates more data portability, which in some sense creates a larger attack surface for such exploits. The article mentions one example of hackers extracting all your personal data after a takeover of your account.


Framing a law that forces companies to let you export your own personal data in a readable format (to reduce vendor lock-in) as "this will definitely cause the next Cambridge Analytica!" on the basis of "but user authentication might be bad!" is absolutely laughable, and the fact that this article got so popular here is pretty discouraging.
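For context, the "readable format" required for data portability generally means a structured, machine-readable export. A minimal sketch of what such an export might look like (all field names here are hypothetical, not from the article or the law):

```python
import json

# Hypothetical sketch of a GDPR Art. 20-style data export: collect a user's
# stored records and serialize them into a portable, machine-readable format.
def export_user_data(user_record):
    export = {
        "profile": user_record.get("profile", {}),
        "posts": user_record.get("posts", []),
        # List which fields were present, so the user can verify completeness.
        "exported_fields": sorted(user_record.keys()),
    }
    return json.dumps(export, indent=2)
```

The point of the format requirement is exactly that the output round-trips: another service (or the user themselves) can parse it back without scraping HTML.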


Meanwhile, people started to believe that Google reads their mind, and society changed how people behave as the Internet transformed the consequences of acts such as chatting or posting things in public (or holding certain opinions in public)... I think the GDPR, like other laws meant to protect citizens on the Internet, failed to protect the user, even if it was a good first try. I hope some standardization will come in the future to prevent all the cookie prompts: some HTML tag standard or whatever.


We have the Do-Not-Track header, which should be interpreted as a denial of consent to all cookie notices and GDPR consent requests, but they didn't write that into the law.
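And even without a legal mandate, a server could honor the header itself: browsers send it as `DNT: 1`, which a WSGI app sees as `HTTP_DNT`. A minimal sketch (hypothetical app, for illustration only):

```python
# Minimal sketch: honoring the Do-Not-Track header server-side by skipping
# tracking cookies when the client sends "DNT: 1".
def app(environ, start_response):
    headers = [("Content-Type", "text/plain")]
    if environ.get("HTTP_DNT") != "1":
        # Only set a tracking cookie when the user has not opted out.
        headers.append(("Set-Cookie", "tracking_id=abc123; Path=/"))
    start_response("200 OK", headers)
    return [b"hello\n"]
```

The problem, of course, is that this relies on every site voluntarily doing the check, which is exactly why people wanted it written into the law.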


Exactly, that's what I mean: it cannot be part of a law... but such a standard must exist and be enforced by the browser, the same way window.alert is a browser popup and not a JavaScript one. Cookie rejection should work the same way.

Laws like the GDPR run into problems being enforced without being too intrusive on the technology and the freedom to create products/standards.


The ePrivacy Regulation is supposed to introduce something like that. Passing it would require the consent of Austria or, of course, a sensible political system for the EU.


I have an interesting question about GDPR and all legal compliance efforts. When GDPR was first announced, I studied it in-depth because I'm the CTO of a company involved in first-party content analytics, and I wanted to ensure we complied.

In addition to making changes internally and technically to ensure compliance, I also prepared a long Google Slide presentation that basically summarized my technical understanding of GDPR, after receiving the advice of several privacy attorneys. The information in this slidedeck was presented to my whole company, as a way to further ensure compliance -- to make sure my employees understood the policy at least as well as I did, since I had spent countless hours discussing the implications of the law -- as well as reading the raw text, which is excellently published/annotated by Algolia here: https://gdpr.algolia.com/gdpr-article-1

My inclination was to publish this deck I had painstakingly prepared publicly, because certainly it would be valuable to others. I publish a lot of stuff publicly on our blog, for example: https://blog.parse.ly/post/author/andrew-montalenti/ -- with the only goal being to share information with the community.

But then, one of my attorneys advised me against this. Basically, the concern was that if I publish something publicly about my understanding of GDPR, and it contains an error of understanding (after all, IANAL), then I could be held accountable for that. That felt really crappy to me -- after all, I'm just doing the best I can, and it seems like there's a lot of misinformation about GDPR out there on the web. Does anyone know anything much about this? To what degree can a company executive get him or herself in trouble for publishing a document that summarizes his or her own understanding of the effect of regulation, if the executive's company is potentially affected by said regulation?


I am not an EU attorney, but typically the risk with publishing something like that is not that you make a mistake, but rather you get it right. The problem arises down the road when your company does something that violates the law. Now your wonderful presentation is used to prove that your company knew it was violating the law, even though the actual circumstances may be a bit more complicated.


Also, getting it wrong might indicate they are unintentionally not GDPR-compliant and make others aware of that fact. But would that actually be worse than regulators finding out later, especially when you want to comply?


Publish it anonymously. :-)


"But then, one of my attorneys advised me against this. Basically, the concern was that if I publish something publicly about my understanding of GDPR, and it contains an error of understanding (after all, IANAL), then I could be held accountable for that. That felt really crappy to me -- after all, I'm just doing the best I can, and it seems like there's a lot of misinformation about GDPR out there on the web. Does anyone know anything much about this? To what degree can a company executive get him or herself in trouble for publishing a document that summarizes his or her own understanding of the effect of regulation, if the executive's company is potentially affected by said regulation?"

To me this sounds like typical lawyer paranoia. In what way could you be held accountable for publishing your interpretation? You are not giving legal advice.


It may or may not be overly paranoid, but I think the risk would essentially be that a publicly stated incorrect interpretation could be successfully used in court as evidence of failure to comply. I doubt the executive themselves would be held directly liable or be personally punished, it's just that it's a risk for the company that doesn't have any tangible benefit from a legal standpoint - so from a lawyer's perspective, why do it?

That being said, it seems unlikely that his understanding would be inaccurate given the amount of time and research he claims to have done, so the actual risk could be negligible. It might even be conceivable that such a public statement could be used as legal evidence in the company's favor, showing that the CEO took every practical step possible to comply to the best of a reasonable and well-informed person's understanding of the law. The public relations boost of giving out good knowledge/guidance (attracting talent, customers/clients) might be sufficiently beneficial to justify the risk.


> then I could be held accountable for that

Held accountable by whom?

> To what degree can a company executive get him or herself in trouble for publishing a document

Not from the EU. They are interested in compliance, which you either are or aren't, and it will be explained to you why you aren't.

Possibly from your own company, but I assume your understanding and presentation of the GDPR does not hinge on gross negligence and it's a pleasant normal working environment where making a simple mistake will not lead to retribution.

Other than that there are other companies that may follow your guidelines and will be found lacking. I'm not sure about this one, and it might depend on the legal environment of your country.

Depending on your field of endeavour and location, I'd say it might be worth publishing. If customers can see online that you take the GDPR seriously, it might increase customer confidence, and, should there be a mistake in your understanding, it might be pointed out to you before it becomes problematic.


“...there’s a lot of misinformation about GDPR out there on the web”

It sounds like your lawyer is telling you not to add to that misinformation. Also, you already said this deck is based on advice from counsel, so you’re maybe dragging them into an endorsement, and there are strict ethical rules about what lawyers can opine about to non-clients.

Anyway, GDPR is not that hard to understand. Just read the source materials. It’s one of the least-difficult legal texts you can take on.

Also, why bother? Like the EU directive before it, there won’t be any meaningful enforcement of these rules. A few examples will be made, but you’ll need to be woefully unlucky to be one of those.

I wish I weren’t so cynical but I’ve been following this area since 1997. It’s just an excuse for lawyers and consultants to rack up fees through careful manipulation of FUD. The intentions of the lawmakers are good, I’m sure, but laws without truly vigilant enforcement are eventually flouted.

(IAAL but not your lawyer.)


I’m being downvoted, but the article itself confirms that a huge, wasteful amount of time and money has been thrown down the drain by people who could otherwise just have read the original text: http://data.consilium.europa.eu/doc/document/ST-5419-2016-IN...

