Class A - no data collection
Class B - anonymized statistics for diagnostics that cannot be used for marketing
Class C - anonymized usage statistics that can be used in aggregate
Class D - targeted collection that can be used for targeted marketing
Consumers should have the explicit right to opt out of any and all data collection without risk of impairing the primary function of a device. For example, there is no reason a TV should need to be anything beyond class A (maybe B). A smart speaker, on the other hand, needs to be a B, maybe a C. Nothing should need to be a class D.
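As a toy sketch of how such a scheme might be expressed in code (the class ordering and the device assignments below are illustrative assumptions from this comment, not any existing standard):

```python
from enum import IntEnum

class CollectionClass(IntEnum):
    A = 0  # no data collection
    B = 1  # anonymized diagnostics, no marketing use
    C = 2  # anonymized usage statistics, aggregate use allowed
    D = 3  # targeted collection, targeted marketing allowed

# Maximum class a device type should reasonably need (per the argument above);
# unknown devices default to class A, the strictest level.
MAX_CLASS = {
    "tv": CollectionClass.B,
    "smart_speaker": CollectionClass.C,
}

def is_acceptable(device: str, declared: CollectionClass) -> bool:
    """A device is acceptable if it declares no more collection than it needs."""
    return declared <= MAX_CLASS.get(device, CollectionClass.A)
```

Using `IntEnum` makes the classes directly comparable, so "no more than needed" is a single `<=` check.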
It even goes further than your request for an opt-out. All data collection must be opt-in, and opt-in requests must be written in such a way that your users can understand them.
Opt-out in combination with not "providing consumers recourse after their data has been collected" is especially dangerous. Extreme example: application X installs a keylogger and sends all your keystrokes to evilEmpire.com. Would you consider that to be OK, as long as there is an opt-out button? Would you be OK if evilEmpire.com kept the keystrokes it collected before you opted out?
Under a GDPR-like system where the purposes of data collection were specified as only fulfilling that core offering, you'd get a pretty good guarantee that it wouldn't be used for personalised marketing etc. Yet on my reading of this system, it'd be a class D since this data could potentially be used for targeted marketing.
But I stand by the general sentiment of empowering users to decide and telling them in plain language the reasons why their data is being collected.
That's exactly what the GDPR does. You can use the data for other purposes (e.g. marketing) as long as you request for that specific permission in "clear and plain language".
Citing the text:
"The principle of transparency requires that any information and communication relating to the processing of those personal data be easily accessible and easy to understand, and that clear and plain language be used."
I'm fine with a smart speaker securely storing my voice data in the cloud to provide me with personalised speech recognition. I'm not OK with that data being sold to a fraudster who wants to phone my bank and impersonate me. If you classify based on data collection alone, then both these scenarios would result in a "class D" rating.
Student clubs (yes, the GDPR applies to them, too, even if they are 100% volunteer run) that want to keep address info of past members for communicating with alumni, for example, have to make that opt-in.
“necessary for running the business” has some gray areas. For example, an athletics club will want to keep a list of club records, photos of past victories, etc. That’s something that is allowed as “necessary for being a sports club”. (I don’t know whether there’s any case history on this.)
Class A = 100% of retail cost
Class B = 75% of retail cost
Class C = 50% of retail cost
Class D = free
I'm convinced that the vast majority of those outraged by today's data collection practices will conveniently direct their ire to some other cause when faced with the possibility of actually having to pay for the services they've gotten used to having for free.
I have written more about this here https://news.ycombinator.com/item?id=16351892
Maybe this is how the whole web-based fee structure should have been set up in the first place, but it's not the current deal we have in place. How difficult will it be to shift an entire industry (and consumer mindset) to a vastly different fee structure?
Since they do such a terrible job of it anyway I can't imagine it being less profitable.
And then again, maybe actually paying for something like facebook is the right way to go.
Because that would provide an easy way out of the whole situation. Just make the price ridiculously high, and all users will choose the free, your-data-belongs-to-us version.
It could actually be an eye-opener for many people. Most middle-class tech people would probably happily pay $10+ a month for the service Facebook provides, but if it's something like $70, then Facebook is either scarily good at monetizing their information or unreasonably pricing the service.
On the flip side, you shouldn't be allowed to undercharge for the cost of a private membership. Transparency is good; let users be part of the free market for their data and give informed consent.
- big corporations which do collect the data will lawyer up and keep collecting the data they already do
- small companies will do whatever they can do and they will bet that enforcement is going to be sparse
- the enforcement will be non-existent
So nothing will change. The GDPR will not improve privacy, will not improve users' knowledge of who is collecting what, will give a false sense of privacy to some users, etc.
And "enforcement will be nonexistent", from an EU that has a history of taking on big companies in, say, cartel cases (http://ec.europa.eu/competition/cartels/cases/cases.html) or antitrust cases (http://ec.europa.eu/competition/antitrust/cases/)? Would you take that bet, again at a percentage of your global revenue?
The problem with tax law is the following:
- big corporations will lawyer up and still dodge taxes
I don't know if you meant to do so, but you seem to be arguing governments are incapable of enforcing law against companies.
Microsoft fined by European Commission over web browser
Apple ordered to pay €13bn after EU rules Ireland broke state aid laws
EU fines Facebook 110 million euros over WhatsApp deal
Europe probably has a better-functioning government than the US.
> Hell, where are the politicians who even have multiple words to say about privacy?
In the vein of Manufacturing Consent - the elites in the US are not split on the topic so there is no discussion. Larger companies, especially in SV, run on private data.
Of course, at the expense of privacy, but I'm still wearing my devil's hat here for a moment. And I would go as far as saying people by and large don't care as much about privacy as they do about cost, or getting free stuff (e.g. Facebook).
People won’t be immersed in something that doesn’t bring them value. Some products would make money from users by showing them ads; others would ask those users to pay. Both would want to create an immersive experience, in my opinion. I would even argue that a less immersive experience (getting bored or distracted) leads to more ad clicks, whereas a service that charges wants to keep users truly engaged.
But let’s replace “immersed” with “enjoy using”, or with being clear and intuitive. Some product designers and UI experts don’t need usage data, but I think a lot of them benefit from it to create a better, more enjoyable, and clearer user experience.
The rest of the apps: the less screen time they get, the better they are. I don't want to be "immersed" in Spotify. I want to find the music I want and be done with it.
There's little in the data to help UI design. The best way is watching your users (or at least using an eye-tracker) and then talking to them. Looking at cold data, you never know why somebody took that long at a certain step or kept clicking around. Why did they click the wrong button? Is it the wrong color or positioning, an unclear label, or did they just change their mind? It's also hard to know when users are annoyed or stressed.
Meanwhile, data is great if you want to do shoot-in-the-dark A/B testing, which is good for optimising for "more links clicked", but not so much for a relaxed user happily completing their task in the quickest way possible.
All buildings display said warning, because that way they have zero liability.
The trouble with the right to be forgotten is censorship. The concept is nice, but in the end the right to be forgotten can mean that corrupt, powerful people can censor their misdeeds.
I think something similar in the US would be problematic given our freedom of speech. Even if you get a criminal record expunged, anyone who scooped up that data while it was public has the right to hold onto it and sell it.
Not to say that's a good thing. It does encourage labeling theory, preventing people with criminal records from being able to find legitimate work. (A counter-example: the sex offender registry in Australia is confidential. It can only be accessed for very specific purposes, like employment at a school.)
That would actually most likely not go through. As a powerful person, you're likely considered a person of public interest, at which point your right to be forgotten is forfeited, because it conflicts with free speech. (A judge will decide whether that's actually the case or not.)
The former would clearly run into 1st Amendment concerns, but I'm hopeful the latter can be allowed without the same concerns. Does the EFF oppose the latter type?
There are quite a few startups in the lending space. But most I've encountered rely pretty heavily on existing, underlying infrastructure, i.e. legacy finance.
But I’m imagining a distributed reputation rating scheme. There’d have to be some PageRank-analogous feature, so that a high rating from an entity with a high rating is worth more (I think that’s how PageRank works).
Still plenty of issues to remedy... spam, sock puppets, etc. But I’d bet that a distributed credit rating system could be built.
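As a rough sketch of that PageRank-analogous idea, here is a toy power-iteration reputation score in Python. The graph shape, damping factor, and function name are illustrative assumptions, not a real credit-rating design; it only shows the "ratings from highly rated raters count more" mechanic:

```python
def reputation(ratings, iterations=50, damping=0.85):
    """ratings: dict mapping rater -> set of entities that rater endorses.

    Returns a dict of scores that sum to 1, computed by power iteration,
    so an endorsement from a highly rated entity carries more weight.
    """
    entities = set(ratings) | {e for rated in ratings.values() for e in rated}
    score = {e: 1.0 / len(entities) for e in entities}
    for _ in range(iterations):
        # Everyone gets a small baseline; the rest flows along endorsements.
        new = {e: (1 - damping) / len(entities) for e in entities}
        for rater, rated in ratings.items():
            if rated:
                share = damping * score[rater] / len(rated)
                for e in rated:
                    new[e] += share
        # Entities that endorse no one spread their score evenly (dangling nodes).
        dangling = sum(score[e] for e in entities if not ratings.get(e))
        for e in entities:
            new[e] += damping * dangling / len(entities)
        score = new
    return score
```

An entity endorsed by two others ends up ranked above one with only the baseline score, which is the property the comment is after. Spam and sock puppets would still need separate defenses, as noted above.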
If you make a search engine vulnerable to a libel lawsuit because, for example, their search results make it look like Joe So-and-So was arrested for DUI (when it was actually Joe So-and-Sew or some such thing), they'll just stop indexing that stuff entirely.
Best to avoid creating new laws where old legal concepts will work fine.
If it's basically just non-newsworthy information about private citizens, then I do not think they should be shielded from liability.
As the ability to collect and process data becomes cheaper and easier to deploy, it seems to me that trying to preserve an assumption of universal privacy and anonymity is like trying to swim up a waterfall. Cameras are becoming so cheap they're practically disposable. Facial recognition software and the big data tools to manage all this data are also becoming more widely available. Are we going to legislate against all that? It's one thing to monitor high-profile corporations like Google and Facebook, but if surveillance is cheap enough, how do you make sure that no one is amassing reams of private information?
The worst case scenario is that while corporations and criminal organizations continue to discreetly gather private data, the rich and powerful will be able to afford the cost of privacy but the rest of us won't have a grasp on who knows what about us.
The alternative to working against the tools that technology affords us is to work with them. In some cases this means embracing radical transparency. We define a narrow range of places that really are private, and assume that anything that happens outside of those spaces is public. For example, what happens inside of one's bedroom is private, but what happens outside of one's front door is public. This information wouldn't be available only to the powerful or well-connected, it should be available to everyone. In particular, society should keep a close eye on the richest and most powerful people. Not necessarily on their private lives, but certainly on their finances.
I'm not arguing that we should give up all privacy. Encryption works and is difficult to defeat, so we should default to encrypting all interpersonal communication. We don't need to give up privacy, but we do need to prioritize what aspects of our lives should remain the most private. I do think that if we're going to expect twentieth-century notions of privacy and anonymity with twenty-first-century technology, we're going to have a very hard time of it.
I don't necessarily have any expectation of privacy. A lot of it is public data (my email for example). But maybe I just don't want my email to be in a particular database and be used every which way, because I never consented to that particular corporation using my personal data as part of their business strategy.
I'm not just talking about "I signed up to Facebook". I'm talking about "I explicitly didn't sign up to facebook, and yet their Like button is tracking me across the web".
It's also things like the right to rectification that I appreciate a lot. Less so for startups, which nowadays have a clue how to do simple user profile forms, and more so for, once again, big corporations that have decided that real names never change, even if the CSR who signed you up over the phone completely misspelled yours.
I don't think GDPR is as much about privacy as it is about control.
Knowing what data you have and being able to manage it is neither difficult nor expensive. It is a short-term cost for long-term benefits. The fact that most companies don't have this in place shows how poor data architecture and governance generally are.
> but if surveillance is cheap enough, how do you make sure that no one is amassing reams of private information?
Laws don't suffice on their own, but there are mechanisms to enforce them: whistleblower protection, on-premises inspections, public claims, etc.
The fact that it is hard to assure food safety doesn't mean you give up. From time to time you find really bad transgressions, but it was worse before these laws.
> This information wouldn't be available only to the powerful or well-connected, it should be available to everyone.
The problem is the asymmetry of usefulness. For me, knowing that you went to the supermarket two days ago and that you have a post about privacy is useless. For a marketing company it may provide high value.
There is the risk that the powerful will become more powerful, having data they can use to increase their riches, while the poor will stay poor, as they can't use the data for anything useful.
> but we do need to prioritize what aspects of our lives should remain the most private.
This is true, though. Having public salaries, for example, may help employees have as much information as companies already have. Giving everyone access to some data can help improve society. The difference is that I would prefer it the other way around: we protect our data and decide what to show, instead of making everything public and fixing what breaks. So I'm not so far from your view, but I see it from another perspective.
> We define a narrow range of places that really are private, and assume that anything that happens outside of those spaces is public.
I think it is hard for the average person to even understand what public and private mean in various contexts. If the government can see everything, is it private (even if it is called a "private" chat)? If the company providing the channel can see the conversation, is the conversation really private?
It is a bit like the FDA and food safety. It would be nice if we didn't need the FDA and everyone could carry chemical and biological assay kits with them to inspect and test the products and drugs they consume. But most people don't know how, or can't afford to do it. So, however imperfect, it is nice to have at least some entity, even if it is corrupt and has a revolving door with the industries it is trying to regulate, to set some standards. The GDPR is a bit the same: it is not perfect, but I still see it as a positive step forward.
You don't have an exa-scale storage array, and google does. Which one of you is at an advantage if everyone has to share data with everyone else?
Radical transparency is nothing short of digital feudalism, it puts all power in the hands of those that own the storage and processing. Let me now address your needlessly dystopian post one point at a time:
1. how do you make sure that no one is amassing reams of private information?
You license and audit large storage arrays. Peta-scale and above will do as a start. You can detect those remotely from their power draw alone, so they shouldn't be hard to find if you're not phoning in the job. They'll show up on power grid stats more or less the same way large weed grow ops do, and we already hunt those down in most western countries.
2. The worst case scenario is that while corporations and criminal organizations continue to discreetly gather private data, the rich and powerful will be able to afford the cost of privacy but the rest of us won't have a grasp on who knows what about us.
Indeed, so why would we deliberately make that a reality? Taking action on data requires storage and processing capacity sufficient to process that data, which no one other than the rich and powerful has. Additionally, transparency laws are only going to reach the edge of your borders, so anything confidential that can be offshored will be offshored to bypass your laws, but only by those that can afford it.
3. For example, what happens inside of one's bedroom is private, but what happens outside of one's front door is public. This information wouldn't be available only to the powerful or well-connected, it should be available to everyone. In particular, society should keep a close eye on the richest and most powerful people.
But in reality, no one except the rich and powerful has space to store footage of everyone's front doors, so boots-on-the-ground journalism against the richest and most powerful people will remain exactly what it is: detect/predict first, and then selectively record. Meanwhile, you've just created a law that allows Facebook drones to prowl our neighbourhoods, recording as they see fit. Are you even on our team?
4. Not necessarily on their private lives, but certainly on their finances.
That's not going to work any better than it does today. Companies and individuals alike already funnel their wealth through shell companies in tax havens around the world to hide their activity. Those tax havens will not adopt your "transparency for everyone" laws, because their national income is based on hiding people's financial activity, and your laws have just made that service even more valuable. They also won't sell you the privacy protections they're selling to the elite, because you're probably not rich enough. So all you've done is ensured that ordinary citizens can never access the financial privacy that the rich can buy off the shelf.
5. We don't need to give up privacy, but we do need to prioritize what aspects of our lives should remain the most private.
Sure, but we should prioritize it with a plan of eventually restoring privacy for all aspects of our lives, not with a plan of doing a shit half-job and then going for an eternal smoko.
The way to achieve radical transparency is simply a law that says that if you hold (some kinds of) personal data, you must make it publicly available for free.
There's of course the issue that some things must be kept private (e.g. authentication data, but maybe also things like web searches that are personal but essential to use a service) and drawing the line can be hard.
The issue that this tries to solve is not really "privacy" per se, but rather the existence of entities monopolizing data.
That'll work, but the theory behind it is nonsense. We're trying to prevent people from building up secret stores of data on other people, but where I would force an individual caught doing so to delete that data, you would force them to share it.
The enforcement cost is the same either way because it's mostly in the discovery and the prosecution, so where's the savings in sacrificing all privacy in the process? There is none, so we might as well keep privacy. What a silly proposal.
And if they do, then the data being public prevents those services from gaining a competitive advantage from the data, thus making it easier to compete with them, and resulting in a more competitive market and thus better services at lower cost for users.
That novel follows a police detective trying to solve a crime. A major source of tension is that all of the quasi-public data (public cameras, citizen movements, credit card use) is in the hands of a separate institution called Citizen Oversight. If I remember rightly, it was a separate, quasi-governmental (or non-governmental) body, broken down by region and with separately elected commissioners.
In the novel, the main focus is the relationship with the police, which was very tense; Citizen Oversight was very stingy with data. But you could easily imagine it having jurisdiction over corporate behavior around individual data. And having an active regulator whose job it is to enforce broad principles would have advantages over detailed rule-making fixed in laws. Especially so if they were part of a legally independent body.
It was definitely interesting to think about. And given that it came out in 1990, surprisingly prescient on the topic of data and privacy.
> The GDPR’s premise, that consumers should be in charge of their own personal data, is the right one
That's not just the GDPR's premise; that's the very foundation of privacy as a civil right in Europe, and has been for a very long time.
The GDPR is just yet another attempt to force companies who have wilfully ignored the rights of millions of Europeans to start complying with laws we already had in place. It's not something new, just an iteration in enforcement.
America should make laws that suit America's values and principles, but as it stands, America has no deep concept of privacy. The GDPR is alien to American values.
(BTW, that quote is subtly wrong but illustrates the huge gap in perception: it should be "citizens", not "consumers"...)
The intention behind the GDPR is good, but it hasn't gone into effect yet, and it remains to be seen what its long-term effects are. It's really premature to draw any conclusions about its effectiveness, and history provides us with countless examples of far-reaching regulation that either failed to have the desired outcome or in fact ended up exacerbating the very problems it aimed to solve.
With a law as massive as the GDPR, it's going to take several years to really get a sense of what steady state will look like, and there are all kinds of ways it can backfire. I hope it won't, but there definitely is a strong, unfounded bias in discourse towards assuming that the GDPR will succeed in the goals that have been projected onto it.
In discussions about the GDPR I see things that are part of Dutch law for years, in some cases dating back to the 1970s.
In practice nobody cared. In extreme cases the data protection authority would say something. But they were mostly understaffed.
I think you are dead right. GDPR is an incremental modernisation of the 1995 EU regulation. There have been a number of cases recently that have shown that Facebook, for instance, have been breaking the current EU law, but the national governments (Germany, Belgium recently) have had a hard time enforcing it in any meaningful way. GDPR will allow national governments to enforce their existing laws. If you are a US company who was breaking, for instance, the UK's Data Protection Act 1998 then I have very little sympathy if GDPR now breaks your business model. Breaking the law, but exploiting jurisdiction is not the kind of competitive advantage I will stand up for.
BTW you can't opt out of the law in a EULA.
In some ways, it makes it easier to comply, because you just have one set of rules rather than multiple national implementations of the Directive.
Theoretically, “protecting” people is good, but protecting them from what, specifically? Health records are already covered by HIPAA, so other than health, what needs more protection? For example, collecting MixPanel or Google Analytics data from a blog — what’s the actual risk of that data? Very interested in real examples, not just hypothetical fears.
What problem is the EU law solving? Have people in Europe been suffering harms until now?
This is not covered by the GDPR; it is fair use and anonymous.
HIPAA is not a widespread standard outside of the US
People have lost out due to credit card details being stolen. PCI compliance is a contract between merchant and bank, not statute law, and therefore we have seen colossal breaches (like the TalkTalk ISP) that are hard to punish. The cost of these breaches currently falls on other merchants, who have to absorb losses from fraudulent card use.
Next big co that loses thousands of cards, I really hope they get the top fines, as it is other companies that have to pick up the bill for their actions.
Those same citizens that voluntarily agreed to the EULA? Do you also support the 'War On Drugs' on the same premise?
Much of what is in GDPR was already illegal under the 1995 regulation, just hard to enforce on US companies
I'd support that War on Drugs.
Actually, it’s been in effect since April 2016, the information commissioners across the EU will be enforcing the regulations from 25 May of this year.
I'm not sure what, if any, of interview notes, performance reviews, or discussions about who to let go in a redundancy are "personal data" of the employee. I'm also not convinced anyone else does either.
Already I've started to see contracts with credit card gateways include PrivacyShield clauses.
Personally, all products I build going forward will be GDPR and Privacy Shield compliant even though I am in the US. I recommend other entrepreneurs do the same because it is probably easier to consider it now than to do it later.
For example (to give context, we have PCI requirements too), when someone makes a change to the code, we have an impact assessment that needs to be filled out. Among its questions are:
1. How will this change impact security?
2. How will this change impact customer privacy?
We fill it out for every single change request (even if the answer to both is "It doesn't") just to document that we are thinking about it and to ingrain thinking about it into the company culture.
The danger with these kinds of controls is that you're trained to say "no impact" many times (because most of the time there is none).
Also, best practice would be to have "No impact" require an explanation, not just a two-word brushoff.
Edit: Also at some point you have to trust your team, hire the right type of people, and embed it in the company culture that the analysis is something to be taken seriously. If leadership takes it seriously the people filling out the forms aren't going to brush it off.
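A minimal sketch of how such a template could be checked mechanically before a change request is accepted. The field names and the "No impact needs an explanation" rule are illustrative assumptions drawn from this discussion, not part of any PCI or GDPR requirement:

```python
# Questions every change request must answer (per the process described above).
REQUIRED_QUESTIONS = ("security_impact", "privacy_impact")

def validate_assessment(change_request: dict) -> list:
    """Return a list of problems; an empty list means the form passes."""
    problems = []
    for field in REQUIRED_QUESTIONS:
        answer = change_request.get(field, "").strip()
        if not answer:
            problems.append(f"{field}: missing answer")
        elif answer.lower() == "no impact":
            # Require a short justification, not just a two-word brushoff.
            if not change_request.get(f"{field}_explanation", "").strip():
                problems.append(f"{field}: 'No impact' needs an explanation")
    return problems
```

Hooking a check like this into code review tooling turns "engrain it into the culture" into something the process enforces, while leaving the judgment itself to the team.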
Not here, but I have seen many comments on other sites implying this will be a burden on small companies, which will have to implement it while worrying about whether they are compliant with rules that can be interpreted in different ways. There is also answering requests for information, which range from the benign and automatable to the letter that caused a stir on LinkedIn, which can be viewed as complex and costly for a small business to answer.
The reason I talk about small companies: in a lot of cases, another already-overworked person will need to wear another hat and may or may not do a good enough job. Versus the larger ones, which can assign a small task force and get this out of the way.
I know some commenters on HN would disagree with this and say that smaller businesses which don't adopt the GDPR should go out of business. But I largely disagree. Businesses closing due to regulation results in larger market shares for those left standing, meaning that competition, which largely benefits the consumer, dwindles. Another knock-on effect is that prices go up, due to those same regulations.
However, one thing I haven't seen discussed, and which I wonder might make the GDPR moot: Trump is currently engaged in a trade war, and I wonder whether any lobbying attempts are being made for him to exempt US companies from it.
My company is squarely in the SMB camp at 21 employees and single-digit millions in revenue across three business lines.
GDPR compliance has already cost us hundreds of thousands of dollars and will cost us more as we go on. There will be some very minor benefits to our customers, perhaps, but for the same amount of money we could 100%-definitely-for-sure-absolutely give all of our customers things they would, if given the choice, trade those benefits for.
That's the thing; it's like when you buy an appliance. You can buy the thing that meets the needs for $X, or you can buy something that's better for $2X. Of course, the better appliance would be better. Should we make a law that requires companies to only make the better one? That law would provide a benefit, because consumers would get the better appliance, right? Okay, sure -- but at what cost? And what value-producing companies (because companies do provide value for customers!) are going to be marginally less efficient and therefore marginally less effective and therefore, on the margin, go out of business because of it?
There are valuable ideas in the GDPR. The execution is pretty crappy, and in the end I think it likely reduces net consumer autonomy because it gives them less choice in how they relate to companies.
Wow, that's incredible! Can I ask where you are based and roughly how the cost is broken down?
I just can't grok how an SMB would need to spend so much on something that seemed relatively straightforward for my own business.
I go into some more specific categories in another subthread in this article.
Well, it depends. If by "better" we mean it has an extra secondary feature, probably not. But if by "better" we mean it doesn't catch fire during use, then probably yes.
> The execution is pretty crappy, and in the end I think it likely reduces net consumer autonomy because it gives them less choice in how they relate to companies.
Can you expand on this? As far as I know, consumers can still give you express consent to use the data in other ways.
In my role as a consumer, I have several services I use and many companies I've given my contact info to in order to get something from them. I love being able to use my information as currency. The outcome of the GDPR's treatment will not be that these companies still create the same whitepapers, services, etc. and just give them away for free. It will be that they move their energy to something else, because the whole point in creating those resources was to get the information, and this will probably cut that ROI for that in half or more. In my business roles, I had active projects to create valuable stuff for clients around that type of thing, and they now don't make sense.
Maybe? Many such laws already exist: regulations on fire safety of buildings, fire safety of pillows and furniture, energy efficiency and safety of cars and home appliances, anything related to food processing, and so on.
I don't think there are any consumer products at all where governments haven't made laws about only selling "the better one".
If you ask me, being ready for GDPR is much harder for big companies with legacy systems, bureaucracy etc.
And I find that letter easy to answer. If companies don't, that just makes GDPR even more necessary.
- Hundreds to thousands of hours (over a thousand for sure by May 25th, and it doesn't stop there) of very expensive employees' time to understand what compliance means and to work toward achieving it (legal/privacy professionals, developers, training time for all employees)
- New software tools for specific compliance requirements (documentation, etc.)
- Consulting and new services needed (an EU representative, etc.)
- Cost of EU servers (this is NOT required for compliance, but so many of our customers have a bad understanding of what the GDPR requires of them that we found we would lose tens of thousands in ARR if we didn't bring up an EU server stack)
I've spoken with friends at big megacorps, and the challenges are massively different. At our company, we need almost everyone to understand quite a bit of the GDPR. We also can't build giant custom solutions -- like many SMBs, we are essentially a framework built around a chain of dozens of third party services that all have their own GDPR needs, from G Suite to Mailchimp to github to our bookkeeping and accounting contractors. The job of GDPR compliance just doesn't scale proportionately from 1 employee to 1000 -- there's a base level before the scaling starts, and the scaling from there isn't even 1:1.
Once the GDPR has been around for a few years, everybody in the workforce will (or at least should) know what it is and how it works. The cost of training will only be felt when somebody enters the workforce.
The cost of not doing it is widespread violation of our fundamental human right to privacy.
It's definitely worth it.
- As someone else mentioned, the GDPR is actually easier to implement from a technical perspective in a project built from the ground up than in a legacy project that already has tons of data.
- If you know from the beginning that something is not a viable business model, then it is easier to shift your business model early on in the company than post-revenue, after you have already built your business on personal data.
On the other hand...
- The rules are very complex and sometimes ambiguous, leading companies to sometimes legitimately be unsure whether they are doing the right thing. And paying a lawyer to tell them is outside the budget of a small business.
- A mid-sized product has the worst burden because they need to convert legacy data AND don't have the resources.
- Proper implementation often requires at least intermediate knowledge of devops, encryption, etc., which might mean no more MBA bootstrapping v1 after spending two weeks learning MySQL, or no more coding bootcamp grads shipping without hiring an experienced software engineer.
Personally, as a small company, I don't mind it. But I'm also a coder. If I had to outsource my code to someone who was being paid hourly and you told me GDPR would add hundreds of hours (not unrealistic), it would be a big portion of my budget with no business benefit (though personally I think a lot of benefit for the customer).
If so, this law could create a data aggregation giant, where data from many (non-web) sources is combined, many more than today, potentially aggravating the problem.
If the specialists can handle this well, this is probably not a bad thing. But it's another example of the increase in production values that gives an advantage to larger companies.
I do think GDPR is a personal privacy win, but I'm also interested to see what happens in terms of tech startups and new products pulling out of the EU.
I don't really buy this concept that you have a reasonable expectation of privacy on other people's websites, or that site owners don't own data collected on their services unless the EULA specifically says something to the contrary.
As a practical matter, if we make it even harder to target advertisements, then we'll end up with even more of these "you've run out of articles" type sites. I don't want to have to pay the ISP and then also pay every individual website. Collect all the data on me you want if that's what makes it possible.
If you voluntarily walk into someones private shop, can you demand that the shop owner doesn't catalog that event? Is there an expectation of privacy while walking on the public street? If you voluntarily agree to receive access to a service in exchange for data collection, can that legal contract be invalidated by decree?
Don't take this as some sort of support for Facebook, I personally have never bought into the idea of social media. Luckily for me, I was a full adult long before social media appeared, so I was able to rationally see that the mass privacy invasion vs "free stuff" calculation wasn't worth it.
Having said that, you can't stop people from voluntarily submitting their data in exchange for services - there is simply no legal theory in support of banning that.
Unless you know the shop owner, you would not be personally identified, and yes, in fact, it would be illegal to use technology that personally identifies you when you walk into a shop. The event that _somebody_ walked into a shop can be recorded.
> Is there an expectation of privacy while walking on the public street?
Insofar as no records are made of your movement, yes. It is illegal to record somebody else's presence in a public space, although fair use examples exist (in the background of a personal vacation photo, for example). There are zones with video surveillance, but those are generally clearly marked. The general expectation is that nobody who does not happen to be in the same place as you at the same time knows that you have been there.
That is, in very broad strokes, the current legal situation in Germany pre-GDPR.
German law has often seemed silly to me, and this isn't an exception.
I guess software that simply displays your name on a screen, but does not (identifiably) record that fact would be fine, though that would pose the question how the software would connect your face with your name - you would probably have to volunteer a photo for that to work.
In the US and the UK, almost every business of any value is recording you from the moment you walk in. At the very least, they likely have a camera on the cash register to deter theft. The UK is widely known to record public spaces, and there are videos of camera networks following people across London for miles.
Outside of your own home, privacy regarding your physical person is basically nonexistent except in a bathroom stall. In the US, it’s 100% legal to take photos of other people in public without their permission.
I think the barrier to providing the maximum amount of privacy for citizens in every aspect of their lives is too high in most of the modern world. There is simply no precedent for limiting the amount of data that is collected in public that will sway legislators across the world.
They can, but you have to be informed of that fact. The business may only use the recordings to investigate a crime, it may not use it for anything else, and they have to be erased after a certain amount of time.
So, I can't take a snapshot in a restaurant or on the street if anybody is visible in the background?
> It is illegal to record somebody else's presence in a public space, although fair use examples exist (in the background of a personal vacation photo, for example)
If you were to publish that photo, however, you have to get all identifiable persons' permission or make them unrecognizable. That extends to other information usable to identify somebody such as a readable license plate.
No. Rather, the GDPR allows the shop owner to ignore such a request (that "someone" walked into the shop).
If you purchase something, and they keep records for invoicing/tax purposes, your request for erasure of that information can also be ignored.
> If you voluntarily agree to receive access to a service in exchange for data collection, can that legal contract be invalidated by decree?
No. However, you do have a right to ask the service to stop processing your personal data even though this may prevent you from accessing that service in the future.
> you can't stop people from voluntarily submitting their data in exchange for services - there is simply no legal theory in support of banning that.
And the GDPR doesn't do this. Indeed, I can offer a £5 amazon gift-card in exchange for some personal data that I will share with my client, and provided I'm overt at the time of the collection, and I hand over the data immediately (i.e. do not keep a copy for other uses) there's nothing the data subject can do to screw me out of what's a perfectly reasonable deal. Their right to erasure is irrelevant since I'm not keeping the data; they have no right to stop processing because I'm already done processing. And so on.
On the other end of the spectrum: if you voluntary back up your data in iCloud, can you demand Apple to not look inside it? If you voluntarily visit a sauna, can you demand that the shop owner doesn't videotape your entire visit?
The EU very much agrees with you: allowing data to be collected must be a voluntary act, and they don’t attempt to stop people from voluntary submitting data in exchange for services. The only difference is that the EU thinks “voluntary” cannot be implied or buried inside a lengthy EULA, but must be enforced through opt-in.
The Data Protection Act has required that since 1998 (in the UK).
Can you provide some more color here? Without concrete examples this statement is pointless.
If you say 'we use data about the previous likelihood of people in your profession and age group having an accident in order to analyze the risk... in order to price your insurance', then you are on safe ground.
2) Enquiries would be from people who are a) concerned that the decision you made was unfair and would like it reviewed by a human, or b) would like an explanation of the decision.
Personally I welcome this. Sitting in a bank and being refused a mortgage because 'computer says no' with no recourse or reason is tough.
If you are profiling for marketing reasons I wonder what kind of enquiries you are expecting?
That's what I assumed before I looked into it, but after I did it actually seemed quite reasonable.
Can you say what parts you find "overburdensome"?
-The "Right to Erasure" for one - any user can force you to delete some of your data at any time.
-Being forced to appoint a "Data Protection Officer" can definitely be burdensome to small businesses or startups that are already on the margin. More reasons the US startup scene will probably remain stronger.
-Heightened standard for "consent" to use of user data
You don't have to. Someone needs this responsibility, but it doesn't have to be a specific person. For us the DPO only exists as a mailing address for subject access requests; the role is shared.
> -The "Right to Erasure" for one - any user can force you to delete some of your data at any time.
But only for data you have no real use for, or it would be exempt. The flexibility of the wording works both ways.
Sorry, don't see how this is really a burden. Any small business is going to get such requests so rarely that they can be handled manually. In any case, it's also unlikely that implementing a feature to allow users to do this themselves would have any real cost.
As a consumer I absolutely want this right. As a business owner, I absolutely want to do right by end users. I just don't see any issue with this.
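For a small site handling such requests manually or with a minimal feature, the mechanics really are modest. Here's a minimal sketch of an erasure handler, using a hypothetical in-memory data model (the `users`/`invoices` structures and field names are illustrative, not from any real system); it also reflects the point made elsewhere in the thread that records kept for invoicing/tax purposes are exempt from erasure, so those get pseudonymized rather than deleted:

```python
# Sketch only: assumes a toy data model with a users dict and an invoices list.
def handle_erasure_request(user_id, users, invoices):
    """Erase a user's personal data on request.

    Invoice records retained for tax purposes are exempt from erasure,
    so they are pseudonymized (stripped of identifying fields) instead
    of deleted.
    """
    if users.pop(user_id, None) is None:
        return False  # no such user; nothing to erase
    for inv in invoices:
        if inv["user_id"] == user_id:
            inv["customer_name"] = "[erased]"  # keep the financial record itself
    return True

users = {42: {"name": "Alice", "email": "alice@example.com"}}
invoices = [{"user_id": 42, "customer_name": "Alice", "amount": 99}]
handle_erasure_request(42, users, invoices)  # removes the profile, keeps the invoice
```

In a real system the same shape applies: delete or anonymize the profile and marketing data, keep the legally mandated records, and log that the request was honored.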
Then you have cases like Equifax, where they have a bunch of data about me that I never gave them and they are doing a poor job protecting it. You could argue that I did consent to it as it was probably buried in the terms of credit cards or other credit documents but it's a reasonable expectation in my view that such data would be protected particularly since it includes critical identifying information such as SSN, account numbers, etc.
LinkedIn, with its dark pattern of getting access to people's emails, and Facebook, with its upload of contact lists on older Android devices.
It's also a really big deal that one can't protect (him|her)self from such data being shared with Facebook. That's why I've compared them with Equifax from a moral standpoint, since the parent comment specifically mentioned Facebook and LinkedIn as an opposition to what Equifax was doing.
Whether or not they do provide some benefit to actual Facebook users is completely irrelevant.
There is ample possibility that the unintended consequences of GDPR play out in ways the regulators do not expect. Assuming otherwise is foolish.
As far as I can see, the companies targeted will do their utmost to avoid any impact on their bottom line, so it's quite likely they will discover plenty of holes in the legislation.
What if we stopped using "identifying" information as authenticating information? PII is only useful because the authentications systems we have in place are such sh*t. Changing this is a much more achievable scope, and would actually address the core value of stolen PII.
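To make the distinction concrete: an SSN identifies you but is known to dozens of parties, so knowing it proves nothing; an authenticator should be a secret that only one party holds and that can be rotated if it leaks. A minimal sketch of the alternative (function names here are illustrative):

```python
import hashlib
import hmac
import secrets

def issue_token():
    """Create a random, revocable credential instead of relying on static PII."""
    token = secrets.token_urlsafe(32)                    # handed to the user once
    digest = hashlib.sha256(token.encode()).hexdigest()  # only the hash is stored
    return token, digest

def verify(token, stored_digest):
    """Check a presented token against the stored hash in constant time."""
    candidate = hashlib.sha256(token.encode()).hexdigest()
    return hmac.compare_digest(candidate, stored_digest)
```

Under this model a stolen database of "identifying" attributes (names, SSNs, birthdates) is useless for impersonation, because none of those attributes is accepted as proof of identity.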
If we take computers out of the argument, it would look like this: the government telling people that they cannot take notes or make records of information that they hear. Case law has found, for instance, that photography in public (which is making records) cannot be banned.
But the IP collection of how-to-live-with-epilepsy.com might be worse, again, since it implicitly carries the information that you do probably have epilepsy.
Companies outside of the EU only have to comply with it when they are processing the data of people in the EU.
Also, the GDPR doesn't necessarily apply to every non-EU site that has EU visitors, only to those who in some way target EU customers (the rules are a bit ambiguous: https://gdpr-info.eu/recitals/no-23/)
So if someone outside the EU wants to benefit from the GDPR, the best way is to use services by EU companies, as those are required to apply it to everyone.