If you are a software architect, you need to ask yourself: can I easily delete a single user's data across all my infra? If not, you might be in trouble a couple of years later when you are hit with thousands of deletion requests and technically can't honor them.
Another side of tech debt I guess!
It’s probably healthy to design systems that are built to let people manage their own data, though. I think people will slowly start to expect that of you as more and more solutions start to offer it.
In the public sector of Denmark we give people access to all their health data. When I have blood drawn for analysis, I can log in and see the results exactly as the doctor sees them. When I’ve been to a doctor’s consultation and she’s written notes in my journal, I can log in and read them. People are going to want that stuff once they get used to it.
Chances are, that's not the case.
The only notable exception is the ad business, because its "users" are not customers and do not enter into any transaction.
The legal basis is actually quite broad. For example, most companies need to keep records for accounting purposes for several years which would count as a lawful basis. Another example would be keeping client data in case you need to protect yourself against potential future litigation e.g. a gym may need to protect themselves against personal injury claims (otherwise you could injure yourself, ask them to delete the records and then sue them).
A legal requirement is "you shall retain this data for N years or face prison time", not "this might be helpful in a lawsuit".
This is covered here
* The right to object does not apply if your processing is based on contract, legal obligation, or protection of vital interests
* The right to erasure does not apply in the case of a legal obligation or the public interest
Yes, you can ask that your avatar image be removed. But they can easily claim that your name, address, bank account number are needed for tax auditing and Know Your Customer purposes.
Also, remember that data subjects have a right to limit the purposes for which their data is used – systems need to be able to cope with that.
This is where a well thought-out and documented approach to personal information makes everything easier, for internal users of that data too. For legacy systems it can be a nightmare because nobody seemed to care, but with a clean sheet, _why wouldn’t you_ address data protection and privacy from the outset?
Easiest way to do this is probably to duplicate data into another table for tax audit purposes, that your "normal" applications have no permission to read. Then you can delete everything your application can access and still keep legal records.
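A minimal sketch of that separation, using SQLite so it runs standalone; the table and column names are invented, and in a real deployment the archive table would live in a separate store or schema that the application's database role cannot read:

```python
import sqlite3

# Illustrative only: in production, "tax_archive" would be in a store the
# application has no permission to read, not the same SQLite file.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
conn.execute("CREATE TABLE tax_archive (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'Alice', 'alice@example.com')")

def handle_erasure_request(conn, user_id):
    # 1. Copy only the legally required fields into the restricted archive.
    conn.execute(
        "INSERT INTO tax_archive SELECT id, name, email FROM users WHERE id = ?",
        (user_id,),
    )
    # 2. Delete everything the application itself can access.
    conn.execute("DELETE FROM users WHERE id = ?", (user_id,))
    conn.commit()

handle_erasure_request(conn, 1)
assert conn.execute("SELECT COUNT(*) FROM users").fetchone()[0] == 0
assert conn.execute("SELECT name FROM tax_archive").fetchone()[0] == "Alice"
```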
There are several possible legal bases for processing personal data and legitimate interests is one; if legitimate interests wasn't your reason for processing personal data in the first place, then you couldn't rely on it later.
If legitimate interests were the basis for your processing and a person requested their data be deleted, you have to be able to demonstrate that your legitimate interest overrides their rights. You should probably also demonstrate there aren't other steps you could reasonably take, e.g. partially deleting or anonymising the data.
I didn't save the link but at least one country's privacy regulator has said that when processing a deletion request you do not have to go through all your backups and remove the data from them too.
If you ever have to do a restore from those backups you will need to re-apply past deletion requests to the restored data before you start using it.
But honestly this is a very minor point; if that's what is keeping you up at night, you are amongst the most compliant ones!
GDPR is mostly about processing user data, not merely storing it. A user can exercise the "right to be forgotten", after which the data must not be accessible in any way and you are not allowed to process it. However, you have every right to keep that data if you believe the user might show up five years later and try to sue you over, say, an undelivered order.
In that particular case you need to be able to prove that the user asked for the data to be deleted. Otherwise the user might sue you for deleting the data without their agreement and causing business losses that way.
A third party might also show up and accuse you of deleting data as part of a conspiracy with the user (think of angry ex-wives and ex-husbands).
Storage is considered processing.
But yes, you're right that the Right to Erasure is not absolute.
If you use it to block the user from using your service, or to punish them in some other way, that could be very problematic in a legal sense.
When you receive a deletion request, you must delete the data and stop using it everywhere you don't have a legitimate reason to continue processing it under one of the lawful bases for processing personal data without user consent.
Which is easier said than done, because it runs against the general development practice of keeping a single copy of any piece of data and using it everywhere without much checking. Instead, you have to either check a flag before using the data (do I have consent? has it been revoked?), or keep several copies of the same data for distinct uses (for example, one copy for your app and one copy for legal reasons). The first approach helps you prove consent; the second helps when you need to archive data past its retention date.
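The first approach (checking a flag at the moment of use) can be sketched like this; the field and function names are invented for illustration:

```python
from dataclasses import dataclass

# Illustrative only: the record carries its own consent state, and every
# consumer checks it at the moment of use instead of trusting a copy
# taken earlier.
@dataclass
class UserRecord:
    email: str
    marketing_consent: bool  # flipped to False when consent is revoked

def send_newsletter(record: UserRecord) -> bool:
    if not record.marketing_consent:
        return False  # consent absent or revoked: do not process
    # ... actually enqueue the email here ...
    return True

alice = UserRecord("alice@example.com", marketing_consent=True)
assert send_newsletter(alice) is True
alice.marketing_consent = False  # user revokes consent
assert send_newsletter(alice) is False
```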
PII is a US legal term and means precisely nothing in the context of the GDPR.
There are also limits placed on this by other laws. For example, tax agencies often require companies to keep sales records for multiple years. A request for deletion doesn't remove someone's name from all the invoices in those sales records. I imagine there are more examples, but I think it gives a good indication that GDPR is not where the buck always stops.
I would point out that, if not for Covid, the UK DPA had planned to kill multiple adtech companies during the summer. They had announced it.
There is enforcement. It is just slow, because the DPAs prefer working with companies on the business side so they can become compliant, instead of hitting them at random.
They are there to enhance everyone's rights, not to collect fees to fill state coffers.
Facebook and Google are still around and I haven't heard about them being investigated in the UK so something doesn't add up.
AFAIK many companies specifically choose their country of hosting by how understaffed its regulator is (one host country popular with global tech giants has a regulatory staff in the very low single digits).
This is part of a broader strategy by corporations to create an economic environment which smothers startups before they can even get started. Saves them acquisition costs later.
And when politicians get out of office, they have a nice, highly paid corporate job waiting for them.
As governments keep introducing more regulations, eventually everyone in the world will be breaking some kind of law whenever they do anything at all.
That will allow governments to selectively imprison anyone who is working against the personal interests of the political and financial elite.
I think politicians really care about these things. But these big sweeping laws and changes are because of hubris (but then again, the job selects for that).
Society is damn complicated, everything affects everything. Making a change in one small part propagates quickly and unpredictably. But politicians don't internalise that (of course, if they did they would never get elected on the platform of "stuff is complicated, I don't know if this will help!").
So they see a problem. Our information is being used maliciously! We need to stop that! Protect the people! So some massive law gets passed with tons of unintended side-effects and it's captured by big corporations at the speed of light.
Besides, I don't see how stupidity and evil are mutually exclusive. Historically, they seem to always go hand in hand.
This isn't the first time the EU has done something that screws small businesses. VAT on digital goods was another case (there was no minimum threshold). At some point it'll start to seem intentional.
So... define "tech". If you mean "adtech", say that.
I do mean tech. An industry tends to breed more of the same industry. Adtech is part of tech, and a lot of online businesses rely on ads. If you remove that, you also remove a large chunk of the people who would work on this type of tech. Some of them instead end up working for some US company. Europe has a much larger population than the US. Europe is largely as educated as the US. Where's our Microsoft, Apple, Google, Amazon, Samsung, Sony, etc.? We have SAP, and that's it.
Edit: I like the idea of GDPR, but I cannot stand how people think it has no cost. A large portion of the internet relies on the ad industry.
There's a cost to not having the GDPR - that of our individual privacy.
Due to the minimal privacy implications, logs which (a) only store the minimal personal information feasible, (b) are deleted after a short period of time, and (c) are accessed in order to fix bugs or provide requested support are covered under the legitimate interests basis, according to my country's regulator.
> How do you track all of that data on developer machines?
> How does your system delete data from all backups?
I don't, but my policy will state how long backups are stored for as recommended by my regulator, and my process for restoring from them will involve deleting data which has been deleted for legal reasons since the backup - basically, "re-run the DELETE FROM query for everyone who's asked for their data to be deleted".
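That restore procedure can be sketched as follows, with invented names: keep a durable log of fulfilled erasure requests, and replay it against any restored backup before the data goes back into service.

```python
# IDs erased (for legal reasons) since the backup was taken.
deletion_log = {"user-17", "user-42"}

restored_backup = {
    "user-17": {"name": "Alice"},
    "user-23": {"name": "Bob"},
    "user-42": {"name": "Carol"},
}

def sanitize_restore(backup, erased_ids):
    # The dict-comprehension equivalent of re-running the DELETE FROM
    # query for every erasure request processed after the backup was taken.
    return {uid: row for uid, row in backup.items() if uid not in erased_ids}

clean = sanitize_restore(restored_backup, deletion_log)
assert set(clean) == {"user-23"}
```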
> Do you have an automated system a user can request all their data from?
Nope, but that isn't legally required by the GDPR - it's an operational efficiency if you're the sort of company that gets a lot of GDPR requests. Manual processing is fine and likely operationally efficient, as long as you know where personal data is stored in your system.
> How do you validate that they are who they say they are?
My intention is to respond with "you can verify your identity by logging in at https://X/login with your username and password and sending us your 'support code' from the settings page", to handle password resets via email according to industry standard practice, and to respond to anything else with "we cannot verify your identity using the information you have provided", because, well, I don't hold any other information I could verify people with.
> How sure are you that all your processes are legally enough?
Given that I am not an adtech company and have documentation showing that I am attempting to comply, I expect that my regulator will follow the approach they've taken so far, which is to notify me of a potential breach of the regulations and allow me to fix it before attempting to fine me. By the time I am in court I will have known about a potential breach of regulations for several months, including communication with the user and regulator, and will have had an opportunity to either fix the alleged issue or talk to a lawyer about it.
Europe has a fair chunk of tech, btw. SUSE's here, Spotify's here, Adyen's here, BlaBlaCar is here, there's a variety of food delivery companies which are generally being far more sustainably successful than their US-based counterparts, Skyscanner are here. TransferWise is here. The difference is that our companies largely have a business model from an early stage, and US companies don't, so they take up huge amounts of the market for a short period of time and then go bust.
Because that's basically all you need for compliance here.
## To benefit from the exemption from consent
**Subject to a number of conditions**, cookies used for audience measurement are exempt from consent.
**These conditions, as specified in the [guidelines on cookies and other trackers](https://www.cnil.fr/en/cookies-and-other-tracking-devices-cnil-publishes-new-guidelines), are**:
* To inform users of their use;
* To give them the ability to object to their use;
* To limit use to the following purposes only:
  * audience measurement;
  * A/B testing;
* Not to cross-check the data processed with other processing (customer files, statistics on visits to other sites, etc.);
* To limit the scope of the tracker to a single site or application publisher;
* To truncate the last byte of the IP address;
* To limit the lifetime of the trackers to 13 months.
Provided that the conditions are met, **we therefore switch from an opt-in to an opt-out regime**.
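The IP-truncation condition above can be sketched with the standard library; the IPv4 case zeroes the last octet as the guidelines require, while the IPv6 `/48` choice is an illustrative assumption, not something specified here:

```python
import ipaddress

def truncate_ip(addr: str) -> str:
    ip = ipaddress.ip_address(addr)
    if ip.version == 4:
        # Keep the /24 network part, zero the final (host) byte.
        return str(ipaddress.ip_address(int(ip) & ~0xFF))
    # For IPv6, keeping only the /48 prefix is one common analogous choice.
    return str(ipaddress.ip_address(int(ip) & ~((1 << 80) - 1)))

assert truncate_ip("203.0.113.77") == "203.0.113.0"
assert truncate_ip("2001:db8:1234:5678::1") == "2001:db8:1234::"
```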
It is also possible for the same third party (subcontractor) to provide a comparative audience measurement service to multiple publishers, provided that **the data is collected, processed and stored independently for each publisher and that the trackers are independent of each other**.
## In practice
**Most large audience measurement offerings do not fall within the scope of the exemption, regardless of their configuration**.
What they are hoping is that:
a) Nobody is going to hold them to real compliance
b) User fatigue and dark patterns will make you just click "OK, fine" to everything and then they can claim to have permission.
The problem is that 'b' there pretty much rules out the possibility of freely given, informed consent, and makes the whole exercise pointless.
I have seen some large fines but all seem to be of "backend" violations. It would be nice if there could be a handful of large high profile sites given a huge fine for having one of those annoying popups with everything opted in.
There seem to be companies selling blatantly noncompliant GDPR popup tech too. That has got to be the most snake-oil thing ever.
I don’t think anything big. Which is a shame, because as long as that continues, the fake-compliance popups will continue.
> There seems to be companies selling blatantly noncompliant GDPR popup tech too
We are using one of those (Sourcepoint; we don’t pay for it, though). It’s very configurable: you can be as compliant or noncompliant as you want with their settings. They support all variations.
200 million for British Airways according to https://www.enforcementtracker.com/
> The ICO’s investigation has found that a variety of information was compromised by poor security arrangements at the company, including log in, payment card, and travel booking details as well as name and address information.
> (...) for not giving users the ability to refuse their cookies and force them to use them if they want to browse its website. In other words, it was not possible to browse the Vueling page without accepting their cookies.
"Assess the value of adding each dependency. Some commonly used software bricks are only a few lines long. However, each added element is an increase in your system’s attack surface. In the case where a single library offers several functionalities, integrate only the functionalities you actually need. By activating the minimum number of functionalities, you reduce the number of potential bugs that could occur."
The context is that an external library can mishandle personal info, but this is true even if you didn't care at all about security and privacy.
So a competitor can charge customers less. That in turn will mean your business will disappear, and the conforming implementation will still be around.
This is part of the point, isn't it?
You shouldn't have to read anything; that's the point. The GDPR is the law that recognizes that people will never read, so the only protection possible is to both a) make acceptance explicit and b) make opt-out the default.
That is, if I just "accept" the policy or click accept to enter the website, I must be sure that I only accepted the minimum you need to perform the business task, and not e.g. third-party advertising cookies, regardless of whether those cookies are what keeps the lights on.
This cost seems to apply mainly to pre-GDPR codebases, which I'm not sure this guide has taken into account. It is probably a lot less costly to architect an application for compliance from scratch, and therefore easier to justify to a customer.
How much is it costing you to have all the ad cookies and trackers compared to their returns?
Do you know how to edit an individual out of the Postgres WAL in last week's full disk backup?
You either maintain a deletion list for when you recover backups (which is tricky) or you keep data encrypted with a different key for each user, then delete the key to forget the corresponding information
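The second option is often called "crypto-shredding". A toy illustration follows; the XOR keystream is a stand-in for a real cipher such as AES-GCM, and must NOT be used as actual encryption:

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random byte stream from the key (toy construction).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse, so the same function encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key_store = {"user-1": secrets.token_bytes(32)}  # keys live outside backups
ciphertext = xor_crypt(key_store["user-1"], b"alice@example.com")

# While the key exists, the data is recoverable:
assert xor_crypt(key_store["user-1"], ciphertext) == b"alice@example.com"

# Honouring an erasure request means deleting the key, not hunting down
# every copy of the ciphertext sitting in old backups or WAL segments:
del key_store["user-1"]
assert "user-1" not in key_store
```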
Hence the importance of data minimization and governance
Laws are generally complex, GDPR isn't really special in that sense; it's just newer than the rest. You can probably draft a similarly complex 16-page document for your country's/state's employment laws too, but that doesn't mean you need to work through all of that mess when hiring your startup's first five employees.
In reality, if you honor user data deletion requests, don't track people without asking (with easy "no"), and follow proper modern security practices, you're already so far ahead of the majority of tech businesses wrt the GDPR that you're good. Or at least, that's my impression.
Its name in French means 'National Commission on Informatics and Liberty'. It is an independent French administrative regulatory body whose mission is to ensure that data privacy law is applied to the collection, storage, and use of personal data.
Created in 1978. National data protection authority for France.
As such, it's one of the best guides you can find, because everything in it is almost guaranteed to be accurate (contrary to some companies selling compliance tools that are not compliant, or a random blog on the internet). And if there is any mistake in it, well, you can refer the CNIL to its own guide when they investigate you.
Obviously they want the guide to get you to 100% compliance, but as you said in your last sentence, they don't have the manpower to investigate everyone. If you are generally respectful of your users' privacy, don't handle sensitive data at large scale, and don't have any complaints against you, they have no reason to ask you for documentation (which is generally the first step of an investigation).
The goal is specifically to make the GDPR less daunting and more accessible! I guess it missed the mark.
But yeah, the GDPR has to formalize a lot of common-sense practices; if you don't treat your users as a "data trove" to abuse, you will be mostly OK!
But you won't. Mostly OK does not protect you from ruinous lawsuits or investigations. With the way the justice system works on the EU level I would never trust "mostly OK". We had an EU court find that posting factual information about a religion was blasphemy and not protected as freedom of speech as a human right.
Just because they aren't an official EU institution doesn't mean much. It's still largely the same culture and legal traditions running it and every EU country is under it.
That's because those sites want to track you, for profit, and they usually want to confuse you into agreeing in the weakest way possible.
These popups are entirely unnecessary, if sites would just stop tracking you and allowing third parties to track you.
Yeah, who needs money to operate a service anyway. Can't they just be happy that I'm even willing to consume their content?! These websites do that so that they can provide the website in the first place. If they didn't then there would be no website for you to complain about.
Oh no, my business model of selling everything I can glean about you in exchange for some recycled 'content' is under threat! Oh noes!
If you can't operate without tracking and you can't legitimately and consensually persuade people to participate in it without resorting to dark patterns and hiding controls, your business is no longer viable, sorry. Find a new model or make way for someone else that can.
Given I have zero interest in seeing ads in the first place, this sounds like a 'you' problem.
I'm not trying to be snarky here - this idea that relevant ads are important to users seems to be some sort of industry delusion. Why should I care if the people trying to sell me stuff while I read a news article are less able to target me? The ads are already an attention-demanding annoyance, it hardly matters what they're for.
GDPR is the best "needle mover" in terms of "we don't care about xyz" in the last 15 years.
In a meeting, even as an external contractor, you can now much more easily highlight privacy/security issues without being told that you are "too caring", and it will actually have an impact (because there is a widely known law & control body).
If there is an annoying popup on a website it's not compliant. E.g. if you need to switch OFF multiple accept switches, then it's blatantly in violation (default must be off). I have no idea why sites would even bother putting up a noncompliant GDPR popup instead of just ignoring it.
Most GDPR implementations that are so annoying are actually wrong.
To me it's a stretch to say legislation ought to work like this.
Many of the "outlier good ideas" in legislation are those that integrate the facts on the ground and consider e.g. implementation complexity.
On top of that, there is a wide range given to the DPA which they are using. It is just that the EU agencies prefer to try to educate first before hitting companies hard, which makes sense.
In practice, if it had not been for Covid, the UK DPA would have killed multiple adtech companies during the summer. They announced having planned it earlier this year.
The vast majority of those aren't GDPR-compliant, they're just attempts at paying lip-service while maintaining the old ways, and they're usually illegal.