That opening paragraph already speaks to the over-elevation of the market above any other concern, so it fits perfectly on "news.ycombinator.com". Human rights, including privacy and data rights, are more important than the profits of some companies.
Most examples in the text are related to companies failing to properly implement the GDPR (Amazon sending data to the wrong person, Spotify not asking for 2FA/email confirmation before a bulk download, companies deleting articles even when there would be sufficient public interest, ad vendors failing to ensure compliance and therefore seeing drops in demand, ...). That is, they are market failures - something this site would probably not call out, but rather attribute to the legislation.
Many in this thread seem to argue that launching an internet-based business or entering a market as large as the EU was previously free, and I honestly don't see where this illusion comes from. Yes, GDPR compliance costs money, just like a whole bunch of other things.
> To some extent, these costs could also hurt competition (assuming the competitor had the same data-vacuuming business model).
Same question: if that data-vacuuming assumption is true, why should the general public care about preserving it? GDPR regulation and its implementations are by no means perfect, but it's not like transparency, forcing companies to think about their impact on their users' privacy, and other aspects don't present benefits as well. I realize that the question of regulation is a matter of philosophy just as much as it is political, but painting the GDPR as some kind of killer of good businesses seems, at the least, very weird.
Socially shaming companies does very little compared to legislation, so that's completely understandable.
The complexity of additional law is part of its unintended consequences.
In any case, there's no increased complexity here; the Data Protection Directive (enacted in 1995) already mandated access to one's data (Article 12 - Right of access). The GDPR mostly gave some real teeth to that existing right.
I have not found the cultural climate on HN to be opposed to the GDPR at all. Both its goals and its implementation seem to be popular here overall, and most comments I see that are critical of it get enough downvotes to have their text significantly desaturated even if they're making a good argument.
It's my impression, though I have not worked in the field, that people operating ad networks believe some kind of tracking is necessary to prevent click fraud, and do not want to sell ads free of any tracking even when they have customers asking for the option. I don't know how true this claim is or whether alternatives have been adequately explored.
What I do know is that some of the industries that use ads, such as newspapers, are struggling to make enough money to continue operating. Nobody will miss a scummy adtech firm, but people might miss local news outlets. It's valid to talk about the impact on business not just in terms of profits, but also in terms of the positive externalities of a given business continuing to operate.
The business model is fine; the EU and you are choosing to kill it. Not only are people arguing for no tracking, they are arguing for forcing other people to live without tracking. So if someone in Europe wants a free service in exchange for their personal information, they don't even have that choice. Additionally, the GDPR requires that businesses not restrict their products to only those who are funding them.
The effect of targeted advertising on platforms such as Facebook seems to have had a very negative effect on democracy, for example.
Although FB in general doesn't seem to help the political discourse, I don't think it's anyone's job to decide what ways of communication and discussion are right for a democracy, because if you control where people can speak and what they can say, it's a short step until you control how they vote.
An individual has negligible negotiation power. That's why we use regulations as a form of collective bargaining.
Personal data has many meanings, and for some people this data might bring their literal death, so of course it's a serious topic. That has not been common in the last ~30 years of western history, but that's just a tiny slice of time and place, so of course it's sensible that many of us want to keep personal data, well, personal.
I thought what you were saying is that we shouldn't want it to be legal because it's illegal and therefore bad.
And what websites were open about their data collection before the GDPR forced them to be? When I opened the data-collection dialogs introduced by the GDPR for the first time, I expected maybe two or three entries, and it was almost consistently closer to 40! WTF. Calling it "user choice" when the site owner deliberately omits that kind of information is dishonest at best.
The issue I have is the rules in what can be done with that data. Because those rules make it so that the users don't get to decide what they're OK with being done with their data.
If I'm being really honest here, the GDPR seemed pretty well intentioned, IMHO it just went a little far.
Maybe I am missing something. Would you provide examples? I would be very glad to learn if there are edge cases I may be overlooking.
As for explicitly informing the user about what they do with their data, that part I have no quarrel with.
> it's telling the businesses and users how they can sell that data and what they can use it for, so it's not a choice by the user on what they do with their data, and more a collective decision with the government.
That is true, and that should happen in cases where market incentives do not align with social or public interests.
I mean, if I'm being honest my views on government aren't very common, so it'll probably be expedient to agree to disagree. (:
Also, I can still give Facebook all my personal data. Facebook simply needs to get my consent to distribute and sell it, and I will forever have some basic control over what data Facebook has on me. The government has little to do in this.
Also (beware the strawman): as far as I know, people cannot sell their own organs in the EU. Is this a sign that your body belongs to the state, or that business models built on harvesting poor people's organs are unjust?
> Also, I can still give Facebook all my personal data. Facebook simply needs to get my consent to distribute and sell it, and I will forever have some basic control over what data Facebook has on me. The government has little to do in this.
So long as the government doesn't force companies to provide the basic control, that's how it seems like it should work! (:
My understanding is that figuratively speaking that is almost what happened with subprime loans.
Corporations and markets can have a lot of power to perform predatory tactics. If drugs were simply legal, quite a few businesses would sustain themselves on other people's addictions.
One of the main reasons we need regulations is that any sensible and obvious law (like not kidnapping people to harvest their organs) has loopholes (like keeping people poor, ignorant, and devoid of mobility through lack of education, criminal convictions, etc.) so that they will "agree" to sell their organs.
Organ harvesting is a deeply extreme example and obviously will not happen with or without regulations, but modern free societies are built on the free enterprise (eventually in the public sphere) of individuals, and consequently they need to handle the case where individuals gather too much power and can destabilize society.
Every free society has this problem (including Bitcoin, with a 51% attack) and needs to find a solution that both promises rewards for personal enterprise and gives incentives not to abuse the system (for Bitcoin, IIRC, these are respectively money and the loss of hardware investment).
> So long as the government doesn't force companies to provide the basic control, that's how it seems like it should work! (:
(interpreting as government should not force companies)
My problem with that is that principles alone do not help us distinguish fair competition from predatory, unethical behavior. In the context of personal data and privacy that is relevant, as we live in a completely different universe from just a few years ago.
Gossip is not illegal, but if you were magically able to listen to every conversation in a 10 km radius that would be a problem. Legal and illegal are often linked to how hard it is to do something and the scale at which you can do it.
That strongly depends on the time of day, read: Whether Europe or America is awake.
The problem is that GDPR is well intentioned but poorly implemented, a common occurrence in politics with examples in lots of sectors. These kinds of unintended consequences are what happens when politicians don't quite understand the nuances of an industry and focus more on regulation-in-principle and showboating rather than actually effective rules.
I think you are confusing unintended with unforeseen, which is a different matter. But I don't think these were unforeseen; some are problems with the implementation, which will be corrected by the companies responsible, and others are just inevitable (if you give people access to something, by definition it makes it easier for a third-party to abuse that access).
The implementation problems are with the law, not the companies. There are ways to enforce data protection and privacy without such complex and nebulous laws that aren't even effective against the worst offenders.
I've seen many critics of the GDPR say something to this effect; I've yet to see any make it concrete. Without wishing to put you on the spot, what leads you to that conclusion?
“The consequence was that just hours after the law’s enforcement, numerous independent ad exchanges and other vendors watched their ad demand volumes drop between 20 and 40 percent. But with agencies free to still buy demand on Google’s marketplace, demand on AdX spiked. The fact that Google’s compliance strategy has ended up hurting its competitors and redirecting higher demand back to its own marketplace, where it can guarantee it has user consent, has unsettled publishers and ad tech vendors.” (Digiday)
Political motivations will be complex for anything with as wide an impact, and as much complexity, as the GDPR.
The perception that the US tech companies would be affected the most was certainly factored in during the political process involving thousands of people. It's debatable how small or great an impact this had, not whether there was any.
That would be the "unintended" part.
Only the ginormous companies can spend thousands of human hours on compliance while their smaller competitors either leave the market or get steamrolled due to the compliance costs. All this has happened before, and all this will happen again...
Some of those examples were deceptively reported. For example, the doctor who asked The Guardian to take down articles about her suspension: she had successfully appealed that case, and a judge overturned her suspension and ordered the record expunged: her name was dragged through the mud on bad information. This is exactly what right to be forgotten is meant for.
What do you mean? I use news.ycombinator.com every day and I consider human rights, and consumer rights such as privacy, to be extremely important. In fact, I use news.ycombinator.com because there is a very strong current in support of such principles by the users here.
There is also a strong streak of free-market capitalism and technology-first, you-can't-stop-progress techno-optimism, but that is the point. This site offers opportunities for debate.
You're assuming too much if you're extrapolating from a few comments you disagreed with to the entire userbase of this site. HackerNews is not an echo chamber. Not yet, anyway.
> companies deleting articles even when there would sufficient public interest
Is that really part of the GDPR? That just seems authoritarian.
> 3. Paragraphs 1 and 2 shall not apply to the extent that processing is necessary:
> (a) for exercising the right of freedom of expression and information;
> (b) for compliance with a legal obligation which requires processing by Union or Member State law to which the controller is subject or for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;
> (c) for reasons of public interest in the area of public health in accordance with points (h) and (i) of Article 9(2) as well as Article 9(3);
> (d) for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1) in so far as the right referred to in paragraph 1 is likely to render impossible or seriously impair the achievement of the objectives of that processing; or
> (e) for the establishment, exercise or defence of legal claims.
> Spotify not asking for 2FA/email confirmation for the bulk download
I'm not extremely familiar with the letter of the law, but if it doesn't specify that you need 2FA/email confirmation, and there are clear fines/downsides to not complying, I do not see how this is not a predictable issue. The incentive is to comply, since you've already put in the work to make it possible, and there are onerous punishments for not doing so. In other words, a false negative (disallowing the download) can be perceived as much worse than a false positive (allowing the download). This seems built in: the goal of the law was to give it teeth so the user could get this data.

If we just default to "it's the company's fault for not applying an additional layer of thought to all this", then whether that's true or not (and I agree it is!), it does not realistically solve the problem: establishing blame doesn't necessarily provide a path to making this less likely in the future as long as the equation is still heavily weighted towards disincentivizing false negatives. This is another way of saying: if we want to characterize corporations as lazy/malicious/what-have-you, then we can't be Pikachu-surprised-face when they act that way under the letter of the law, like some monkey's-paw scenario. We should instead say "aw shucks, fool me once" and try to come up with something better.
Similarly, we should predict that these rights will plausibly be abused, or leveraged, by those they provide an obvious benefit to, even maliciously. Here again it is interesting that we begin with the thesis that "we need these laws because companies have proven not sufficiently responsible on their own", and yet immediately make a law that is vague and thus defers major parts of the decision to those same companies. Many times this ultimately comes down to litigation where the boundaries of laws are worked out, and that is a very reasonable response: it's only been a year, and the courts will hopefully work out when these rules are mis-applied. However, it is on us to make sure we litigate "too much" right to be forgotten (as in the cases here) and not just cases where companies refuse to forget. If not, the courts will send a clear message that it is perfectly fine to blindly abide by every request as the path of least resistance. Again: the premise is that they don't care, and we still haven't figured out how to legislate caring.
The Regulation doesn't mandate specific technical implementations anywhere; it leaves that to the industry, which is the expert in that regard. But on the subject of the Right of Access it does explicitly say:
"The controller should use all reasonable measures to verify the identity of a data subject who requests access, in particular in the context of online services and online identifiers"
It seems to me that not using 2FA/email means they haven't used all reasonable measures to verify the identity.
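As an illustrative sketch only (the Regulation prescribes no specific mechanism), one "reasonable measure" could be an emailed, HMAC-signed confirmation token that the requester must echo back before a data export is released. Every name and parameter below is hypothetical:

```python
import hashlib
import hmac
import secrets
import time

# Hypothetical server-side secret; in practice this would be stored in
# configuration, not generated per process.
SECRET = secrets.token_bytes(32)

def make_token(user_email: str, issued_at: int) -> str:
    """Create a token to email to the user who filed an access request."""
    msg = f"{user_email}:{issued_at}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{issued_at}:{sig}"

def verify_token(user_email: str, token: str, max_age_s: int = 3600) -> bool:
    """Release the export only if the emailed token matches and is fresh."""
    try:
        issued_at_str, sig = token.split(":")
        issued_at = int(issued_at_str)
    except ValueError:
        return False
    expected = hmac.new(SECRET, f"{user_email}:{issued_at}".encode(),
                        hashlib.sha256).hexdigest()
    # Constant-time comparison, plus an expiry window.
    return hmac.compare_digest(sig, expected) and time.time() - issued_at <= max_age_s

token = make_token("user@example.com", int(time.time()))
print(verify_token("user@example.com", token))      # True
print(verify_token("attacker@example.com", token))  # False: wrong identity
```

The point is not this particular scheme, but that tying the export to a channel the account already controls (the registered email) is cheap relative to the risk of handing a bulk download to the wrong person.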
People don't care about privacy as much as you imagine them to, especially if they have to give up everything they get for ads today. One look at what people willingly share to the world on social media shows that.
But powerful monopolies are a problem and market competition is the correct answer to that power. Regulation isn't a magic cure and should be used to place guardrails on the market, but in this case could've been written far better to provide data protection without entrenching the major players even further.
Could it? How?
Don't know for the other companies, but for this one, good riddance, they had a notoriously scummy business model.
Hi, I'm Brent Ozar, the cofounder of the first company in the list. (Ah, the joys of alphabetical sorting.)
I've written a big long post about why we stopped selling to the EU, but here's the short story: the EU only represented 5% of our revenue, and for that small of revenue, I wasn't prepared to risk the GDPR's fines if any one of the third party tools we use had a problem.
During our GDPR prep with our attorneys, it was completely clear that the third party app ecosystem was in no way ready for GDPR enforcement actions. For example, we use WordPress and WooCommerce to sell online training classes. I'm a database administrator, and I know dang well that WP and WC aren't encrypting student data at rest, nor do they encrypt the other fields where people put student data - let alone how some of the plugins handle student data by storing it in the posts table, which was never designed to handle that kind of thing. If I had to face EU officials, I could never say with a straight face, "Oh yes, I was completely confident in WordPress's abilities to keep customer data secure."
I have confidence that someday, apps like WP and WC will have a better GDPR compliance story that doesn't just meet the bare letter of the law, but also the spirit. When they do, I'll be all about selling to the EU.
However, this is just the life of a small bootstrapped business: sometimes, you gotta make choices to focus on your best customers. 5% of my customers were threatening me with regulatory action that might result in huge fines if I let a ball drop. Unfortunately, I only have so many hours in the day. If I have the choice between doing regulatory paperwork for 5% of my customers, versus adding more value for 95% of my customers, I gotta make the obvious choice.
Care to explain why "Brent Ozar, IT consulting services" (first company on the list) deserves to die?
> If I had to face EU officials, I could never say with a straight face, "Oh yes, I was completely confident in WordPress's abilities to keep customer data secure."
Should this business really continue handling user data if it can't guarantee it will be secure down the line?
Then no business can handle user data?
There's no such thing as a guarantee. You can do what you define as the best effort to secure it, but you can't guarantee it will be secure.
Which is tantamount to saying you cannot ensure it doesn't fall into the wrong hands?
He stated that people contact the company via many different methods: email, twitter, Instagram, etc... GDPR mandates that when the user demands it the company must delete all associated records. This small company doesn't have the resources (or doesn't want to waste time) to go through all emails, all twitter exchanges, etc... and expunge them all every time someone demands it.
As far as WordPress plugins go - I get that too. The place where I work has 100s of 3rd party packages. To go through them all would require Y2K level of effort to make sure they comply and/or upgrade ones that don't.
So I am not at all surprised that Brent Ozar didn't think EU was worth the effort.
And because someone will probably ask "why", here is why:
1) The GDPR was not designed to drown small businesses in expensive processes, forcing them into bankruptcy or into cancelling their expansion in Europe. It was designed to force business owners to think twice when they plan on getting rich by exploiting and reselling customer data to third parties, or by performing "smart" operations on this data (i.e. any company that sticks the words "AI" or "smart" or "neural" or "deep" close to "customer data" in its business model).
2) "I'm a database administrator, and I know dang well that WP and WC aren't encrypting student data at rest, nor do they encrypt the other fields where people put student data." So what? The GDPR nowhere says "student data should be encrypted at rest". It says it should be protected from unauthorized access, but not that it should be encrypted. Encryption is one way to respond to this requirement, and 9 times out of 10, it will be implemented with security flaws much worse than simply enforcing access control to the data. By trying to address a problem that does not exist with a solution that is inadequate, this business owner basically failed at his primary mission: managing risk.
Two arguments raised, two arguments completely wrong. Hence the justified conclusion: I am glad that these businesses shut down or stopped playing with EU citizens' data following the enactment of the GDPR.
The reality is that you can't protect unencrypted data from unauthorized access. You can try, but you can't guarantee it, not when you have hosting partners, for example. Encryption is just one completely reasonable defense mechanism that needs to be part of a larger strategy. I wasn't comfortable defending the company without personally identifiable data being encrypted. You might be. I'm not.
> this business owner basically failed at his primary mission: managing risk.
To the contrary, I succeeded. I eliminated the risk at the cost of 5% of my revenue. I sleep great at night not worrying about the GDPR.
Unroll.me's entire business model was made illegal in the EU.
Hitman: Absolution faced problems in taking ownership of its EU servers.
The two games Loadout and Super Monday Night Combat both claimed not having the resources necessary to comply with GDPR. For perspective Loadout had a peak of 208 concurrent players in 2018 while SMNC had a peak of 40 players in 2018.
I suppose a similar article could be published about the pharmaceutical industry, crying about the cost and consequences of FDA regulation. Doesn't mean we should scrap FDA regulation and "let consumers decide" whether drugs are safe or not.
unroll.me: selling your inbox contents, but "anonymized"
FamilyTreeDNA: proudly letting law enforcement and probably tons of others search your (and your relatives!) DNA, without -- for your convenience -- asking your, or your relatives', consent
Klout: kinda scummy, and also not phased out because of GDPR
>Lithium CEO Pete Hess reported that Klout is a “a standalone consumer-facing service” that no longer fits the focus of delivering customer service solutions. In addition, “recent discussions on data privacy and GDPR are further expediting our plans to phase out the Klout service, giving us a chance to lead on some of the issues that are of critical importance to our customers: data privacy, consumer choice and compliance.” 
We're supposed to mourn these companies? We shouldn't trust an author or site whose best choices of companies to mourn are (1) at least in some cases, not gone because of GDPR, and (2) mostly companies we're better off without.
Though there are still a lot of abuses and dark patterns going on, I believe most sites should make it as easy to "opt all out" as to "opt all in" for cookies, for instance.
So as more regulation comes in, it will just end up cementing the large players in place, as they can absorb the costs of any regulation, while smaller businesses will have higher startup costs (which, let's face it, were next to nothing before).
So while you may be rejoicing now that shitty companies are gone, regulation will just make it harder for these massive companies to be toppled, as it makes it harder for smaller companies to comply.
The EU is trying to push through Article 13, and any site that has user-generated content will have to have some sort of upload filter to check for copyrighted content. That is going to cost money to implement, and since YouTube hasn't really been able to achieve it, the only people supplying the software will be the likes of Google, Microsoft, etc. So again it will just make things harder for small businesses and help the large ones.
Also, a lot of these regulations are making the web a shittier place. Every time I go onto a site now, I have the stupid cookie and GDPR notice plastered in front of what I want to look at. I already protect myself and don't care about their attempts to track me. It is just an irritation that nobody pays attention to, and it achieves the opposite of what it was intended to achieve.
Self-regulating markets are a myth; just look at the US insurance and health industries if you want proof.
That's also why in healthy countries you get a lot of free passes when you start a business: lower tax rate for a few years, 0% loans, advisors paid by the state, &c.
> regulation will just make it harder for these massive companies to be toppled as it makes it harder for smaller companies to comply.
Why did no one topple apple, amazon or google in the last 25 years? If anything the lack of regulations when they started allowed them to become the de facto monopolies we all know today.
The problem isn't supporting privacy and data rights, it's doing so in a way that creates unintended consequences which actually worsen the market and UX for consumers. There are better ways this regulation could've been written, but it wasn't. That's the issue.
Why should this be the one thing we optimise for?
I agree with the GP that ease of starting companies should not be the primary goal, with security and privacy relegated to the back seat. It shouldn't be harder than it needs to be, but not easier at any cost either.
This is what's happening though.
If making it harder for companies protects people, it's still a win. I recently visited SF, "the center of innovation" for the startup world. I saw two people defecating on the street in two weeks, countless people peeing, and at some points had to jump over homeless people just to walk down the street. If that's the cost of startups and innovation, please don't bring it to the EU.
It's meant for those who cannot/do not know how to protect themselves.
I'm of the opinion that privacy regulation is a good idea, but it's trivially true that it's an additional burden for start-ups. The "Is it worth it?" question is a legitimate one.
This applies for any X that you care to name, including "internet".
If you believe that you can both pass regulations that make businesses of type X harder to form, and enjoy the benefits of having new businesses of type X around, then there is probably a big flaw in your thinking.
The goal is to regulate adtech. But the effect is to impose regulatory costs on every company that wants to have a discussion forum on their website. (And the upcoming copyright bill is even worse.)
The cost of business going up isn't necessarily a bad thing, if we're getting something valuable in return (IMO we are). The question is whether or not the increased cost is prohibitive, and you have not provided any evidence to suggest that's the case.
The thousands of companies that just block EU citizens rather than comply seems to suggest that they feel the cost is prohibitive.
As for more direct hard evidence I believe this would fall into the "unseen" category in Bastiat's That Which is Seen, and That Which is Not Seen and is, in effect, calling on someone to prove a negative.
They block EU because they deem compliance not worth the effort (now), usually because they get more than enough from their US markets. This doesn't mean the costs are prohibitive. Thousands more companies didn't block EU citizens. Some companies (notably news sites) even started to offer a superior product to EU citizens (e.g. plaintext news).
Also, even with those blocking EU or shutting down, nothing of importance is lost. These companies have competitors that are less abusive, who do fine.
I would much rather have adtech and those businesses. I think most people feel the same way, because they continue to use those businesses.
The model where Google provides a service and users pay for it is more efficient and more societally healthy than the model where Google provides a free service, a million companies pay to place ads on it, and pass the cost of their AdWords budget onto users who get a 'free' service.
It is a model where consumers get better products, and where millions of creative minds aren't wasted making web pages uglier (or ruining cities with billboards, for that matter). It is a model where competition is also a little easier, because an alternative search engine can undercut Google's prices and carve itself a starting market niche, even if their service is not quite as good as the established competitor; instead of the current model where first you need to be better than Google in every way, and then you have to fight the network effect.
I have no clue how to get the world to switch to this model. It will require that elusive white whale, an online payment mechanism that is truly as frictionless as cash. And it will almost certainly require legislation rather than mere market pressure, because people can see their monthly Google bill but cannot see the vast costs of the marketing industry, which they pay for every day.
That's cool and all, but people can't pay for it. These fees would add up quickly and you'd basically never go beyond your few webpages that you're paying for, because everything else costs money.
I probably would never have cared about the internet or anything related to computers, if websites had required people to pay. That would not have been an option for me or most people I knew growing up.
But then again, they're not scummy companies.
I had to put in like a few hours thought into what data I was collecting and how long it was appropriate to keep it.
I happen to know quite a lot about the GDPR because I dealt with it at a client I was previously working with. If you want to make it extremely complicated, you can. But you don't have to.
In one we actually track users' behaviour to make better recommendations, but we're open about it and they can disable it if they want. We also delete that data if they delete their account.
It's just a different mindset, it's their data, not yours. You're open about what you're doing and if they want you to delete it, you delete it.
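That mindset can be sketched in a few lines. This is a hypothetical account model, not anyone's actual implementation: tracking happens only while a consent flag is set (off by default), and the data is erased with the account:

```python
from dataclasses import dataclass, field

@dataclass
class Account:
    """Hypothetical sketch of consent-gated tracking."""
    email: str
    tracking_consent: bool = False          # off by default; user must grant it
    behaviour_log: list = field(default_factory=list)

    def record_event(self, event: str) -> None:
        # Only track while the user has consented.
        if self.tracking_consent:
            self.behaviour_log.append(event)

    def delete(self) -> None:
        # "If they want you to delete it, you delete it."
        self.behaviour_log.clear()

acct = Account("user@example.com")
acct.record_event("clicked")   # ignored: no consent yet
acct.tracking_consent = True
acct.record_event("played")    # recorded
print(acct.behaviour_log)      # ['played']
acct.delete()
print(acct.behaviour_log)      # []
```

The design choice is simply that consent is a gate checked at write time, so "disable tracking" and "delete my data" are one-line operations rather than afterthoughts.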
It's not legal, consent is opt-in not opt-out.
> In one we actually track user's behaviour to make better recommendations, but we're open about it and they can disable it if they want.
If I understand correctly, this is opt-out instead of opt-in... If you were fined some percentage of your revenue for this, you would feel the costs: not only the cost of the fine, but also of reading and implementing the GDPR more carefully. But data protection authorities don't have enough resources to audit even 1 in 100,000 of the companies that ignore the GDPR at this level of detail. So you can live in happy ignorance that you are implementing the GDPR.
That's not to say that the GDPR doesn't help in general. The issue is that it will be a dead law, or a law that randomly hits some very, very small percentage of the companies breaking it.
Having a law that no one implements properly is just a recipe for abuse of power by authorities. "Show me the man and I’ll show you the crime" is well known to people living under the Soviet rule. (And, No! EU is not the Soviet Union. But some DPA are in post-soviet republics with people that were raised in this mentality.)
There we go. You already made the time investment at someone else's expense.
So thanks for proving my point.
My comments weren't about GDPR but about regulation in general. Any regulation requires more work which makes it difficult for smaller players. You had to do the extra work.
Individuals' rights over their data should just be another human right like property rights and not being harmed by others.
Thinking about what you do and how you do it is probably not a bad thing.
Your example is like saying that everyone who wants any kind of job should know multivariable calculus. When people protest that that's putting too much of a burden on them, you bring up that you got a job just fine, because you learned multivariable calculus in school.
We should really separate the protection of scummy business models and down to earth stuff like data takeout / account deletion and transparency as to what companies do with user data. The latter is neither rocket science, nor should it be particularly hard for any startup that's over the "my company is a fancy slide deck" stage.
I really don't see why a scummy business should get a pass just because it's a startup.
So the regulation causes problems for people that haven't done anything wrong.
And let's be clear here: people aren't dying; it's mostly ads and shitty data collection. I think it might be better to actually educate the public about some of the pitfalls of the internet (which governments are doing) rather than regulating the crap out of it.
While this is true, it's exactly that which turned the world (and by extension the world wide web) into a fucking dystopia. Brexit would not have been possible without the whole concept of targeted ads and the data collection that goes with it.
Yep, I think ad tech is utterly and totally evil. And all that to make a buck, or a billion.
I, for one, think that's a disastrously high price to pay for a few successful tech companies.
People aren't dying,
Actually I disagree here. When you look at the consequences of the technology in countries like Myanmar, The Philippines, Brazil, Cambodia and others and the likes of Mr. Zuckerberg and his ilk giving exactly zero fucks (unless it becomes bad PR) I'm afraid you're definitely wrong on that one.
However nobody mentioned all the people that didn't bother voting because they were at Glastonbury which was on at the same time.
I very much doubt that is true. The UK has been a bad fit in the EU, and for years there has been a sentiment that we don't want any EU interference. For example, many don't want "the EU's monopoly money" (not my words, mind you), and the public is generally Eurosceptic.
The papers and politicians were trying to find a scapegoat because, quite frankly, it didn't go the way they wanted. Much like the claims after Trump's victory that Russia hacked the election (there were only a few thousand ads placed on Facebook, which paled in comparison to the Democrats' budget).
Many of the people who voted Leave were of older generations that don't pay attention to tech. So I find it dubious how much influence the likes of Cambridge Analytica really had.
> Actually I disagree here. When you look at the consequences of the technology in countries like Myanmar, The Philippines, Brazil, Cambodia and others and the likes of Mr. Zuckerberg and his ilk giving exactly zero fucks (unless it becomes bad PR) I'm afraid you're definitely wrong on that one.
Like what, exactly? You haven't qualified anything here. You just claimed I am wrong, but because of what? What adverts? What is happening? This is a very vague claim.
I suspect much like the vote to leave the UK it will be very spurious evidence.
Vague claim? Not at all.
I was asking myself whether I should even bother to answer, but then decided to invest a couple of minutes in some very basic DDG searches. You can find some results below.
Let me assure you that there's a ton more, if you just bother to open your eyes.
I close my argument here, since anything else would be either counterproductive or violate site guidelines.
But please don't accuse me of spouting vague claims or failing to qualify my arguments just because you seem more interested in a timely Uber or a cheap stay, and to hell with the consequences.
If your business case depends on either abusing or being careless with other people's personal data, how are you not a scummy business? That's basically all the GDPR requires of you: don't abuse people's personal data and be careful with it. Both seem like common decency to me.
In reality, all regulations have costs for compliance and those costs typically apply to some extent even if you weren’t doing anything shady at all.
If you were _already_ complying before GDPR existed (because your business model isn't scummy), then GDPR compliance _should_ cost very little, if anything at all.
If you weren't complying at all, then adding compliance is very costly after the fact. If you cannot make your business work without complying, then the business must die, as there's no natural right for a business to exist.
If nothing else, you probably need non-trivial amounts of management time to understand the new rules, some extra legal advice that you're going to have to pay for, and an update of your key documents to make sure everything uses appropriate structures and wording to comply. That alone could already be a significant cost for a small, bootstrapped business, and that's without changing anything about the actual data you're collecting or how you use it.
Try working out what even counts as personal data from this:
It is all about risk, ambiguity and individual circumstances. I don't think that is bad, but there is no clear record of what it is we are even meant to protect.
If you're in the business of "doing free services so you can skim GB's of data from users" or you "sell wholesale data collected without notice", the EU doesn't want you.
If you're doing a good job of keeping user data private except at the direct request of a user in a plain-language direct permission, then you're doing a good job to the GDPR. Slipups happen, and as long as you do your best to stop the bad thing, limit the breach, notify users, and be a good steward for their data, then it's all good.
As a US citizen, I try to make a point of only working with companies that adhere to the GDPR. I know they don't have to do so with me. But it tells me their internal processes are set up to respect the user's rights. And well, running dual systems for different compliance regimes is a tough sell; it's easier to build one big system.
If that regulator happens to like you. There is no schedule of offenses and penalties and due process, only an absurdly high maximum for selective enforcement.
Overall I support the regulations, but I really wish the penalties had more documented structure than "We will fine you anywhere from zero to an eight-digit number (in our case) depending on what we think is right".
There is no explicit schedule – that could be gamed – but that doesn't mean regulators can act arbitrarily. Punishments have to be proportional to the infraction, similar cases have to be treated similarly... The GDPR just does not spell out how public authorities work.
It actually does say that punishments have to be proportional IIRC. I'm not sure if that actually makes a legal difference or if it was included to make the GDPR easier to understand.
Only if you lose.
> very expensive legal battle
EU ≠ USA
>That same government that regulates your business.
So what? If you have a grievance with an entity, that's the entity you have to fight a lawsuit against.
>EU ≠ USA
I don't see why this changes anything. Lawyers still cost a lot of money. They might not seem like they cost a lot of money to Americans, but that's because Americans earn a lot more money.
>So what? If you have a grievance with an entity, that's the entity you have to fight a lawsuit against.
One of the grievances people have against GDPR is that they don't like how GDPR's enforcement depends so much on the individual person at DPAs. You'll still have to deal with the person afterwards that you sued.
Yes. Each party paying their own fees is a uniquely American thing.
> I don't see why this changes anything. Lawyers still cost a lot of money.
Prohibitively high lawyer fees are a uniquely American thing. The ECHR guarantees practical and effective access to the courts.
> One of the grievances people have against GDPR is that they don't like how GDPR's enforcement depends so much on the individual person at DPAs. You'll still have to deal with the person afterwards that you sued.
That Americans have against the GDPR. Given that the people who actually have experience with European authorities and law don't see these issues, it's very likely they don't exist.
You don't necessarily have to deal with the same person. And even if a DPA always assigned the same person to you, there were no oversight, and that person were petty and cared more about harming you than about their job: we have the rule of law and a functioning court system. And I can't help but find these continuing insinuations that we don't pretty insulting.
I have, and it is definitely ambiguous. To take a simple example, consider all of the cookie warnings that you now see. Intelligent and informed people disagree on whether they are required, enforceable, or sufficient.
There's something so naive or earnestly human about them. If the U.S. had remained the only relevant legal force on the Western internet, we'd muddle around in gray areas forever.
It also covers an astonishing amount of industrial sensor data used solely for industrial purposes. Unfortunately, for many high-scale industrial sensor data models, the technical infrastructure required for compliance literally does not exist. In some cases we don't even have the computer science required to build the compliance infrastructure. But the vast majority of people would be very upset if the business model of some of these companies became "impractical" and had to go away because GDPR compliance is effectively impossible. No amount of trying to do the "right thing" will make these industrial companies compliant.
There is a gross misconception that GDPR only affects ad tech companies, retail, or companies with business models involving people. This is far from the case.
In all of my reading it's been personal data, and it definitely wouldn't apply to the things people usually associate with "industrial sensors", e.g. carbon monoxide levels in a space, or even occupancy data (e.g. for lighting/HVAC control) so long as it simply reflects whether an area within a building is occupied.
What's the specific requirement, and what makes it unattainable?
What people don't immediately grok is (1) just how many industrial sensor systems there are these days, operated by diverse organizations -- almost every sensor type on an autonomous car, for example, is also widely used in many other industrial contexts, (2) the scale of sensor coverage in most places people occupy, indoors and outdoors, which is far beyond what they typically imagine, and (3) how many of these sensors can be used to incidentally identify the presence of a person at a place and time, sometimes in very non-obvious ways. A single sample from a single sensor may not be identifiable, but multiple samples from multiple sensor modalities often are. And the sensor modalities used for industrial sensor systems are increasing in diversity and resolution very quickly, which makes it even easier.
Humans perturb the environment they move through, and we have enough environmental sensors now that we can often track those perturbations across the sensor modalities to create a fingerprint. People have a difficult time imagining how easy this can be in practice until they've seen it done.
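A toy sketch of that fingerprinting arithmetic (all signal names and counts here are hypothetical, invented for illustration): if several individually anonymous observations are roughly independent, their identifying power, measured in bits of entropy, simply adds up.

```python
import math

# Hypothetical "anonymous" sensor observations about one person, each with a
# rough count of the distinct values it can take across a building's occupants.
signals = {
    "hvac_zone_occupied": 40,      # which of 40 zones registered presence
    "arrival_minute_bucket": 120,  # coarse arrival time from door sensors
    "elevator_floor_pair": 90,     # origin/destination floor combination
    "gait_period_bucket": 25,      # stride timing from floor-vibration sensors
}

# Assuming rough independence, identifying bits add across modalities.
total_bits = sum(math.log2(n) for n in signals.values())
print(f"combined entropy: {total_bits:.1f} bits")  # roughly 23 bits here

# ~23 bits distinguishes on the order of ten million individuals, even though
# no single signal above identifies anyone on its own.
```

The only point is the additivity: four coarse, harmless-looking measurements combine into a near-unique fingerprint for anyone in a large building or city district.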
Power is used by a house. The meter runs. You pay the bill. The house has an address and a point of contact.
Power is used by the house. Machine learning is applied to map each individual and how they live in said house. The data is then sold to target things the ML algo picked up. You pay the bill. The house has an address and a point of contact, along with a detailed profile of each human in said domicile.
The same sensors exist, yet one violates the GDPR and the other does not. Can you guess which one?
The part about "compliance cost" should be taken with a grain of salt. If you were compliant before, because you respected your users' privacy, the effort was relatively low.
The study about VC having dropped by 50% in the EU because of the GDPR sounds pretty weird to me. Unless of course there’s selection bias and we’re talking AdTech companies mostly.
An interesting number would be: how many people closed down forums and moved their discussion boards to Facebook?
This is not true. Even if you are perfectly compliant, you need a complaint-response mechanism and lawyers in the EU ready to react to invalid accusations.
Given that GDPR took a complain-investigate model, one also needs to be ready for power-tripping regulators. (Recall the Romanian data protection authority using GDPR to demand sources from a newspaper investigating corruption allegations.) Protecting against that requires, if not active lobbying, keeping lobbying connections warm. That costs money.
Ironically (and predictably), I've seen more data being funnelled to Google than before. They have the scale to deal with this crap in each of the EU's (currently) twenty-eight member states.
Ironically, when GDPR came into effect, so many on HN were spreading fake news that companies would be litigated to death by users. Of course, to remove that possibility and ensure only legitimate claims are pursued, the data regulation authorities act as a middleman. Such cases of abuse could just as easily have happened if people could sue directly. For example, nowhere does the GDPR imply that you need to hand over a source - that goes for journalists as well as non-journalists. Companies sued have the right to appeal and, if GDPR hadn't existed, the Romanian authorities would probably just have used e.g. tax law to stifle the RISE project.
Complain-investigate compliance regimes tend to result in deference due to the cost of investigations and other informal expenses regulators can rain upon the regulated. (It works in finance because financial firms have the margins to support it. Also, the industry regulators are checked by both the courts and a public regulator, the SEC.)
Complain-investigate is thus a terrible structure for a general business law. Strict liability for data loss or misuse (including the rights to data transcripts and deletion) would have been simpler. (Albeit less profitable for European law firms.)
Long story short, GDPR’s aims and technical costs (e.g. deleting user data from backups) are fine. The problem is the compliance structure. It’s fundamentally incumbent-biased, commercially and politically.
>Even if you are perfectly compliant, you need a complaint-response mechanism and lawyers in the EU ready to react to invalid accusations.
Did American businesses really think that they were immune from prosecution under EU law prior to GDPR? No European business was under any illusions about the extraterritorial reach of American courts.
Prosecutors need to build a case before causing costs for the suspected noncompliant. Complaints, and regulators in complain-investigate regimes, can incur costs with zero evidence. This is why most systems reserve such structures for high-margin, high-risk applications, like banking regulation. Deploying it as a general business law is aggressive.
I wouldn't be surprised if 150B is actually a low estimate.
Are you collecting data on your customers? If you are, then one of the things your company does is data collection, even if that's not what's in your business plan.
I raise this issue in almost every thread about GDPR, because although it might seem pedantic, the error strongly implies that people have not read or understood the legislation. The difference between personal data and personal identifiers is integral to GDPR and the legislation cannot be understood without fully understanding that distinction and the implications that follow from it.
Why? Don't you still have to build systems to be able to comply with user requests in a timely manner?
Second, it’s very likely that you already have APIs in place that can request all data for a user anyway. If you don’t know what data you have on your users, you don’t give a shit about their privacy, no?
Third, user requests are usually: a) what data do you store about me? b) export all my data. c) delete all my data (for real).
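Those three request types can be sketched against a toy in-memory store (the field names are hypothetical; in a real system the data is spread across many services, which is exactly the orchestration problem others in the thread raise).

```python
import json

# Toy user store; real data would span many services and tables.
users = {
    "u42": {"email": "ada@example.com", "orders": [{"id": 1, "total": 9.99}]},
}

def access(user_id):
    """a) What data do you store about me? List the categories held."""
    return sorted(users[user_id].keys())

def export(user_id):
    """b) Export all my data, in a machine-readable format."""
    return json.dumps(users[user_id], indent=2)

def erase(user_id):
    """c) Delete all my data (for real)."""
    del users[user_id]

print(access("u42"))   # categories held for this user
print(export("u42"))   # portable JSON copy
erase("u42")
assert "u42" not in users
```

The sketch is trivially easy precisely because everything lives in one dictionary; the hard part in practice is that `users` is really dozens of databases, logs, and third-party processors.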
The orchestration of a data extract from even a midsized corporation is a significant endeavour.
Someone in the company knowing what data we have on an entity is a significant step away from the entire company being able to access that, because, you know, we take data privacy seriously, so we don’t make it easy to access all data on a single entity.
If your approach to privacy is putting all the eggs in a basket, allowing easy extraction of everything from that basket, and hoping the basket can be kept secure I’d argue your model is weird to begin with.
Some of these are write once and immutable afterwards.
There are relational structures for transaction history that may also link to customers.
These all have to be re-designed in such a way that information can be removed from the system and exported from the system, while keeping essential information (such as past sales records).
This is not an easy problem to solve.
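One standard pattern for exactly this problem (not mentioned in the thread; the schema below is hypothetical) is pseudonymization: the immutable ledger stores only an opaque token, and the token-to-identity mapping lives in a small mutable side store. Erasure then means deleting the mapping, while the essential sales records survive anonymously. Crypto-shredding, where each person's records are encrypted under a per-person key that is destroyed on erasure, is the same idea applied to encrypted data.

```python
import secrets

identity_vault = {}   # mutable side store: token -> personal data (deletable)
sales_ledger = []     # append-only ledger: rows are never rewritten

def record_sale(name, email, amount):
    # One token per sale for simplicity; a real system would reuse a
    # stable per-customer token.
    token = secrets.token_hex(8)
    identity_vault[token] = {"name": name, "email": email}
    sales_ledger.append({"customer": token, "amount": amount})  # write once
    return token

def erase_person(token):
    # GDPR erasure: drop the mapping. Ledger rows keep their totals but
    # can no longer be linked to a person.
    del identity_vault[token]

t = record_sale("Ada", "ada@example.com", 99.0)
erase_person(t)
assert t not in identity_vault
assert sales_ledger[0]["amount"] == 99.0   # essential record survives
```

This doesn't make the redesign free, but it confines the mutable, erasable state to one small store instead of forcing rewrites of every immutable structure.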
Got an example of something like that that'd make it impossible to soft-delete a person? I'm struggling to think of any datastore in regular use that's write-only.
Are you implying that those 3 requests are simple to fulfill in a business running a modern software architecture?
It's really that surprising to you that when the EU effectively bans one of the most profitable models of business that venture capital investment will drop by 50% in the EU?
To be fair, it's probably just not GDPR but all of these regulations combined. Venture capital can move across borders, why would you invest in a startup in the EU when you could just do it in the US?
Could just be noise in the data?
Could be VCs determining that fewer products are actually worth pursuing if the main monetization model for everything is ads?
So... anecdotally, I'm not at all surprised if the increased compliance cost made some people reconsider investments in EU businesses, even if they don't rely on ad revenue as a business model.
Does Europe have some way to require its ISPs to firewall you off or blackhole your DNS? Can they force Amazon to shut off your AWS account? Do your executives risk being taken away in handcuffs to a European jail when they go to Europe on vacation?
If there are no consequences, why don't US tech companies just completely ignore it? (Of course, big players like Google probably have EU-based datacenters and other assets that could be seized to pay their fines. I'm thinking of small, cloud-hosted startups whose employees, bank accounts and physical assets are all on US soil.)
Once you grow big enough the EU will inevitably have leverage over you: Servers rented in the EU to lower latency, payment streams from EU customers, offices in the EU to get talent, subsidiaries created for tax reasons, executives on vacation, employees on conferences, money spent on advertising, etc.
If you are a startup in SV, the EU might not have much direct pressure it can apply, but how would an investor react when given the choice of "we could spend some more money now, or we could do nothing and be significantly limited once we grow to a certain size, basically unable to do anything significant in one of the largest economies of the world"?
The simplest solution would be ignore GDPR, dominate the American market (which is easier to scale across than the EU), and then use that momentum to launch a simplified version in Europe. (Or buy a competitor.) The scale advantage will almost always outweigh being prepared for multi-market growth from the beginning.
Agreed. My point was with respect to an American start-up: compliance with GDPR is of lower priority than scaling. The priority, for both, should be scaling.
Advantage goes to the American start-up, however, in launching from a single market. But one might counter-argue that consumers in e.g. China will prefer to do business with European start-ups over American ones due to GDPR. (No evidence for that. But it’s a valid hypothesis.)
We work in the b2b in the financial sector and part of our contracts in Europe is that all of the data is hosted in infrastructure that complies with the GDPR. That could be Google or Amazon, but not Slack or any SV startup.
It's hard to imagine what a startup would be doing that makes them interesting enough for the EU to notice and want to levy fines, yet be completely out of reach.
If you have no business in the EU, generally the worst they'll do is censor your website.
However, most US SaaS-type startups very much want access to EU markets. Ignoring GDPR won't matter until it does, and then it will matter very much. For example: you grow and want to establish a presence in the EU, investors with EU ties may be hesitant to get involved, or a potential acquisition is ruined because the buyer has an EU presence and isn't willing to take on the historical liability.
Yes, there’s a lot in GDPR. If you’re a startup that is making money by selling user data, the cost of compliance will be quite high. But if you are selling an actual product or service that generates revenue by collecting fees from your users, compliance is probably not as hard as you think. And building your startup with user data protection in mind, you’ll find it can be something you use as a selling point.
With more than a year of history, it’s not hard to find easy-to-digest articles that put GDPR in terms that an average person can understand. Integrate those principles and processes into your business, document what you’re doing, and then stick to it. Even without a huge compliance budget - if you do that and nothing else - you’ll be in a much better position than to just ignore it, even if you don’t fear punishment.
It applies to any data collected in the Union: https://gdpr-info.eu/art-3-gdpr/
Not in a systematic EU-wide way. Courts sometimes force individual ISPs to blackhole websites used for copyright infringement.
I guess if your company ignores the GDPR, it's treated as an illegal organization. So you may still be able to provide your service in the EU, but people cannot legally pay you, including paying you for ads.
Some people are making it sound like the EU Cyber police is going to hack your services or parachute and kick their way into your office in SV because a user in Slovenia didn't get their data portability request on time, which is not what is going to happen.
Also your US clients may be subject to GDPR and pass it on to you transitively as they are required to do for subcontractors or IT services vendors.
It becomes a right when enough people in a democratic society want it to be one, it's that simple. People in Europe believe that the right of individuals to control information about themselves and to not be stigmatized for actions in the past is to be valued higher than public access to it.
I perceive the US attitude simply as a sort of voyeurism. We already know it well from celebrity culture where people's entire lives are picked apart and put on a platter for the public to drool over, I have no interest of seeing it expanded to everyone, so I'm thankful for legislation to give me at least some control over information about me.
The biggest beneficiaries of this might very well be children who have had their entire lives put on the net by their parents without even having the slightest say in it.
That's not what a right is; a right is something that some logical/philosophical moral argument has determined people should have, regardless of what other people think. That is the whole point of rights in the US constitution: to protect people against the government and the "tyranny of the majority". Otherwise you could call something like "the right of the German people to have no Jews within one kilometre nearby" a right if the majority voted on it, and the term loses all meaning.
At the end of the day, rights and laws are expressions of the preferences of the public. Europeans have different privacy rights because they want to have them. That doesn't mean they can't be good or bad, but they don't need to be derived from some higher realm of reason.
"The German eternity clause (German: Ewigkeitsklausel) is Article 79 paragraph (3) of the Basic Law for the Federal Republic of Germany (German: Grundgesetz). The eternity clause establishes that certain fundamental principles of Germany's democracy can never be removed, even by parliament."
"The Parlamentarischer Rat (Parliamentary Council) included the eternity clause in its Basic Law specifically to prevent a new "legal" pathway to a dictatorship as was the case in the Weimar Republic with the Enabling Act of 1933."
Like, for example, in the UK (which lacks a codified constitution) the right to govern is literally derived from the most divine source: God himself, through his agent, the Queen.
In practice, despite the quasi-sacred and divine foundations of our government, it doesn't mean jack. If the Queen were to exercise any of her divine powers over the will of the current Parliament it would cause a constitutional crisis and she would have those powers immediately stripped.
Recognizing that democracies can be infected does not mean that they must hold to some sacred text. It just means that some paths are considered too dangerous even to attempt.
Also, in the criminal justice systems of most modern countries, criminals who have completed their sentences are by default considered to have a clean record (unless it is relevant).
Which is exactly what the law is for.
The examples in the article were deceptively reported. For example, the doctor who asked The Guardian to take down articles about her suspension: she had successfully appealed that case, and a judge overturned her suspension and ordered the record expunged: her name was dragged through the mud on bad information.
Compare that to the US where any kind of accusation, even if it turns out to be false, can easily permanently ruin someone's job, career prospects, or life.
Nevertheless, it seems the real problem isn't the impossibility of achieving such a 'delete-from-the-internet' mechanism, but rather its low value for average people and its high value for deceitful individuals.
I mean, I like having a legal right to make Google/Facebook/E Corp delete all the data they have about me and my usage, but there will be few occasions when I will make use of that right and even fewer for people who don't even care what they share online. Imposters, on the other hand, will find that right most valuable.
Also, in a lot of countries punishment is more of a corrective action than revenge. This means even convicted criminals have the right to have their conviction hidden after some time.
Do we really think a society where there is no public information about people is better?
But, GDPR applies to any information, even if it isn't available publicly.
So you can't take everything too seriously, but still, it's good to collect more links. Also, the author is being clear about the weakness of some of the supporting evidence.
I'd heavily recommend everyone here read the book The Age of Surveillance Capitalism; it is an incredible explanation of what goes on behind the curtain.
In short: GDPR has some relatively minor problems and externalities, and the blogger wants to scrap the entire GDPR because of them...
The Wikipedia article actually details all this quite well in the fourth paragraph (obviously without reference to GDPR):
Laws like GDPR are running into problems being enforced without being too intrusive on the technology and on the freedom to create products/standards.
In addition to making changes internally and technically to ensure compliance, I also prepared a long Google Slide presentation that basically summarized my technical understanding of GDPR, after receiving the advice of several privacy attorneys. The information in this slidedeck was presented to my whole company, as a way to further ensure compliance -- to make sure my employees understood the policy at least as well as I did, since I had spent countless hours discussing the implications of the law -- as well as reading the raw text, which is excellently published/annotated by Algolia here: https://gdpr.algolia.com/gdpr-article-1
My inclination was to publish this deck I had painstakingly prepared publicly, because certainly it would be valuable to others. I publish a lot of stuff publicly on our blog, for example: https://blog.parse.ly/post/author/andrew-montalenti/ -- with the only goal being to share information with the community.
But then, one of my attorneys advised me against this. Basically, the concern was that if I publish something publicly about my understanding of GDPR, and it contains an error of understanding (after all, IANAL), then I could be held accountable for that. That felt really crappy to me -- after all, I'm just doing the best I can, and it seems like there's a lot of misinformation about GDPR out there on the web. Does anyone know anything much about this? To what degree can a company executive get him or herself in trouble for publishing a document that summarizes his or her own understanding of the effect of regulation, if the executive's company is potentially affected by said regulation?
To me this sounds like typical lawyer paranoia. In what way could you be held accountable for publishing your interpretation? You are not giving legal advice.
That being said, it seems unlikely that his understanding would be inaccurate given the amount of time and research he claims to have done, so the actual risk could be negligible. It might even be conceivable that such a public statement could be used as legal evidence in the company's favor, showing that the CEO took every practical step possible to comply to the best of a reasonable and well-informed person's understanding of the law. The public relations boost of giving out good knowledge/guidance (attracting talent, customers/clients) might be sufficiently beneficial to justify the risk.
Held accountable by whom?
> To what degree can a company executive get him or herself in trouble for publishing a document
Not from the EU. They are interested in compliance, which you either are or aren't, and it will be explained to you why you aren't.
Possibly from your own company, but I assume your understanding and presentation of the GDPR does not hinge on gross negligence, and that it's a pleasant, normal working environment where making a simple mistake will not lead to retribution.
Other than that there are other companies that may follow your guidelines and will be found lacking. I'm not sure about this one, and it might depend on the legal environment of your country.
Depending on your field of endeavour and location, I'd say it might be worth publishing. If customers can see online that you take the GDPR seriously, it might increase customer confidence, and, should there be a mistake in your understanding, it might be pointed out to you before it becomes problematic.
It sounds like your lawyer is telling you not to add to that misinformation. Also, you already said this deck is based on advice from counsel, so you’re maybe dragging them into an endorsement, and there are strict ethical rules about what lawyers can opine about to non-clients.
Anyway, GDPR is not that hard to understand. Just read the source materials. It’s one of the least-difficult legal texts you can take on.
Also, why bother? Like the EU directive before it, there won’t be any meaningful enforcement of these rules. A few examples will be made, but you’ll need to be woefully unlucky to be one of those.
I wish I weren’t so cynical, but I’ve been following this area since 1997. It’s just an excuse for lawyers and consultants to rack up fees through careful manipulation of FUD. The intentions of the lawmakers are good, I’m sure, but laws without truly vigilant enforcement are eventually flouted.
(IAAL but not your lawyer.)