I've seen a number of articles trying to frame the GDPR as some kind of shambles. The real shambles is the way too many companies have abused and mis-processed data for too many years, as if the EU lawmakers were the bureaucratic imbeciles here. Meanwhile, everyone I know is fully in favour of this as a consumer.
And, for context, I am the person who will have to deal with these at our company. Our customers are absolutely entitled to expect us to process their personal information in a responsible manner, and I hope a number of these letters are sent to every company; it's about time there was a power shift in this area.
Here's another: bake privacy into your company from the start. Create a culture that takes it seriously and threads it through everything it does. Once you have this culture, you'll find it costs less than trying to retrofit it after three years.
In terms of the benefits, I can only assume you're American to ask this. In Europe we view our privacy as a human right and expect our lawmakers to protect that right; it's that simple.
Replace "privacy" with "security" above, and you'll get the widely accepted best practice approach: "you cannot bolt on security later", etc. Likely it will work for privacy equally well.
Not really. It will mostly be a problem for companies which use a lot of SaaS services with no on-premise alternative, and for companies in the business of selling their users' data. Not gonna shed a lot of tears for those.
How do you guarantee that the SaaS you chose is enforcing the privacy of the data you're paying them to process?
If supporting the GDPR is a requirement for having European B2B customers, SaaS providers are going to start certifying against it and architecting around it.
Aldi has become extremely successful without knowing its customers. Ikea is probably the same.
Doing the right thing is a burden, that’s why it’s called the right thing and not the convenient thing.
Small startups also often get the benefit of reduced regulatory burden, which is fitting because they have less overall impact on society. Once they become large, it is fitting that they play by rules that benefit the majority.
It seems you have an axe to grind and have derailed the conversation to compare the US with ancient Rome. Please provide a citation showing a correlation between increased privacy regulations and reduced Gini coefficients.
That's the real issue here. You can either decide that privacy is a basic right that everyone has and cannot be negotiated away, in which case this law makes sense.
Or you can decide that it's a decision each person makes, and let the market take care of providing options that are more vs. less privacy-respecting.
> Or you can decide that it's a decision each person makes, and let the market take care of providing options that are more vs. less privacy-respecting.
Or, collectively, a group of people can agree on a government that supports privacy protection for goods sold under its remit. Which is what's happened here. I get that it's not popular in the US, but privacy controls are quite popular in Europe, and the EU is in this case following the mood of its people.
Not sure why you phrase it "or"? I think we agree that that's one sensible approach to take (treating it as a basic right that can't be negotiated away, much like other things).
This is a false equivalence.
And those who end up in hospital, in the US, possibly without insurance? I guess that, and the ensuing bills, are "just" an _extremely_ bad day then?
And if you wanted to examine whether it could be a matter of life or death, it wouldn't be that big of a stretch. Consider what might be the result of someone trying to escape from an abusive relationship having their personal data exposed. Or a whistleblower / political dissident. For example, imagine a Chinese dissident with a free-Tibet facebook record whose data gets back to the Chinese government.
I'd focus on China particularly because they're developing a system for working out how much people align with the state and it appears it may be partly based on the information they can find out about people's internet postings:
This might have an outsize impact on startups that deal primarily in data about people. I'm actually pretty OK with that. Some kinds of activities really should have high barriers to entry.
There are industries, like construction, that have a lot more regulations than this one. And small companies survive.
> What are the costs compared to the benefits?
Benefit: Citizens have the right to protect their privacy, to not be tracked without reason, to be notified when a data breach puts their safety at risk, etc.
Cost: Companies need to have reasonable data governance, which will increase short-term costs but probably has a long-term positive impact on cost, as bad data governance is just technical debt.
Yes, it makes it harder to get into some areas. If that means a net positive for society, I'm OK with that. There's no intrinsic reason why we should care how many companies survive.
However there are two forms of complexity at war here; complexity for business (regulation) and complexity for ordinary people (having data about you everywhere, about everything, forever). So we have to decide which kind of complexity is worse, or how to strike the right balance.
The real cost falls on "old" companies. Those will complain, but they are also responsible for the need for the GDPR in the first place.
Maybe there are no "costs". Just benefits and benefits?
Can anyone explain how it's possible to be positive about GDPR and HN at the same time? I'm not surprised that some people like it. I'm stunned to see them commenting here.
It collects user comments and posts. What you post to this website is entirely under your own control, and there is plenty of opportunity to meaningfully participate here while offering not much more than technical opinions.
Furthermore, none of the information you post here needs to be personally identifiable, under the definition of the GDPR. It is identified by a username, which can be completely arbitrary and unique. You could even use a new one for every post you make.
However, it does appear that GDPR will require that HN delete a user's posts upon request. It might even require that HN delete posts that mention other people, including nonusers.
Edit: Yes, also via Tor. It did ask for an email address, for password resets.
That would be different if they marketed/promoted/sold in the EU, offered European language or currency support, or somehow otherwise took action to position themselves for the EU.
As a thought experiment, if HN was regulated by GDPR:
1. Yes, all kinds of user generated content can contain GDPR Art. 9's special categories of personal data. HN would probably rely on the exemption in Art. 9(2)(e), which permits processing "personal data which are manifestly made public by the data subject." The purpose of HN is to let you share your own data on the Internet, that's the entire point. That's fine under GDPR.
2. HN would still need a lawful basis for processing under Art. 6. For a paid service, a Terms of Service would normally be fine. I don't think HN has or wants one of those, and they don't track users at all before registration, so they could collect an explicit consent from users on registration. If they did track prior, a cookie popup could collect the consent. Also, under Art. 8, the default minimum age of consent is 16, so we'd want to consider age confirmation too.
3. Archiving posts on the Internet forever is not a problem, if that's the intended use of the site, which it is. My guess is that deleting a user and their posts is feasible at the application/database layer. The problem would be deleting personal data from backups of the site if the user withdraws their consent and requests Art. 17 erasure. In that case, only retaining the backups as long as necessary and documenting that justification internally is probably sufficient.
4. Article 22 restricts "automated processing, including profiling, which produces legal effects concerning [the data subject] or similarly significantly affects" the data subject. Ranking, voting, and anti-spam probably don't qualify as weighty enough subjects to be restricted. Recital 71 ("Profiling" https://gdpr-info.eu/recitals/no-71/) sheds some light on what the EU is trying to prevent.
5. They'd have to get a data protection agreement or other Art. 46 agreement with hosting vendors. Cloudflare is on top of this: https://www.cloudflare.com/gdpr/introduction/ Not sure what other subprocessors are involved.
6. Being able to see most of your own data on HN means you have Art. 15 access, which is nice. I think they'd have to also give you any hidden metadata as well. Not sure what that might be (vote weight score?).
HN does not require you to disclose personal information, such as who you are.
However, I do see an issue: quotes by other users. That's one of the leaks that took down DPR. He deleted his old posts about Silk Road. But another user had quoted part of a post, which didn't get deleted.
I'm a lawyer but not your lawyer and I have no idea about specific YC or HN details, so take this with a grain of salt, but I think the best argument for why HN is exempt or at very low risk of enforcement is that it does not hold itself out into the EU market for business and is not otherwise subject to EU law (as far as I know, and I have no special knowledge). Users may be from the EU, but HN has no particular nexus to EU law that I'm aware of.
This is important because Article 2 of GDPR ("Material scope") expressly says "This Regulation does not apply to the processing of personal data ... in the course of an activity which falls outside the scope of Union law"
Small e-commerce sites (someone selling socks, handmade goods, used pianos, etc.) are a different story here. Usually such sites were put together on some ready-made PHP + MySQL solution on $100-a-year hosting, by some small IT shop specialised in this kind of business.
Owners of such small firms are going to have a really hard time with the GDPR. I suspect there will be a lot of scummy law firms that go after them and blackmail them into either using their "service to become GDPR compliant" or being sued under the GDPR.
Such people are an easy target; they don't even realise that software installed for them by some third party that no longer exists may be writing customers' first and last names to its logs, or that there is a backup somewhere with customer e-mails.
This law will have zero impact on, say, Facebook: people will give them their data freely as they do now. The average FB user will not risk "an imperfect Facebook experience" (or whatever similar clause clever FB lawyers come up with) by blocking permission to be tracked and to have their data sold to advertisers.
And you have to be able to revoke consent, at any time, and it has to be as easy to revoke consent as to give consent.
The right can only be enforced against a "controller," which is the entity that "determines the purposes and means of the processing of personal data."
It's worth noting that GDPR does not give the data subject the right to request everything in the letter. Only a more limited set of things.
The practical effect for SaaS companies is that they should keep track of data and the systems and services where data is processed. With good preparation and a system of record for security/privacy management data, you can prepare for this kind of request very well. My company does just that - helps others prepare.
We help you create that at ecomply.io, and once you're done, we will help you respond to data subject access requests as well.
Except, of course, if your company is one of those favored with an exemption from the GDPR. Because we can't have everyone playing by the same rules in the EU.
Who has received that? Can't find anything by searching.
Given that the "requests are complex or numerous", I will be responding within three months as recommended by the ICO. Have a nice day.
You now have plenty of time to deal with it properly.
If you do not have a lot of data on someone, then three months should certainly be enough time to properly respond to this.
Most businesses do not have any personal data on anyone beyond what they need for an invoice. If you have a dedicated CRM that contains leads of potential customers, or you use an online service like Salesforce, you can probably get their support in complying.
I imagine most companies might only deal with this once or perhaps twice ever, and if they do not keep very much personal data then automating it would not be very efficient use of their time. That's why in general I'd wait until you get such a request, ask your lawyer to explain it (will probably only cost a few hundred pounds), then decide what to do next.
Only very large companies (or companies that deal with a lot of personal data) will benefit from up-front automation.
I certainly have several businesses in mind that I plan to send requests to once GDPR is in place.
EDIT: Although I suppose small, non-EU businesses that mostly do not deal with personal data are unlikely to receive any requests. So you are probably right that "most companies" are unlikely to receive GDPR requests.
https://selbstauskunft.net/ exists to allow you to send a BDSG §34 request (like GDPR request, but under the older German law for it) to basically any company. You select the company, sign it, and they automate all other steps.
I’ve sent dozens just this week.
Also, I'd recommend sending another request some time after quitting your contract with the company, just to make sure they give you a written guarantee that they deleted all data about you that they don't _seriously have to_ maintain.
I also consider it activism to keep companies aware of their responsibilities.
Does anyone actually understand that this law will make things cost more?
Do you also argue against mandatory seatbelts? All they do is drive the cost of cars up. The justice system that enforces the law drives taxes up; should we get rid of that too?
Anyway, the main target of this legislation are the hundreds of businesses you’ve never heard of brokering your personal information. I doubt they have salesforce leads for each person they track, and I think most people want to see that entire industry collapse.
Similarly (in terms of regulatory burden, not consumer sentiment), most legit consumer businesses rely on razor-thin per-transaction costs. Spending any additional per-customer human time could be the difference between profit and loss.
To see the problem, consider what would happen if you walked into any store with affinity cards, and handed this letter to the manager.
There are only four actual requests in this letter; it was written by a PwC consultant to appear as intimidating and confusing as possible, so a small business that does not have easy access to legal advice would not find it difficult to convince a regulator of as much.
That said, a lawyer can help you identify them and ignore the rest. For the cost-conscious, spending time on the ICO's website will also help you discover them so that when you talk to a lawyer you can be efficient with their time (and therefore your spending).
> most legit consumer businesses...
Most consumer businesses do not keep very much personal data; if the cost of understanding this letter within three months would cause a company to go into administration, then it was going to fail anyway.
> the main target of this legislation are the hundreds of businesses you’ve never heard of brokering your personal information. I doubt they have salesforce leads for each person they track
I don't agree with this at all. Who do you think the "main target" of this legislation is?
Why are you using lawyers to respond to DPA / GDPR requests?
I really don’t see what the fuss is about if you run a semi-professional operation.
Though this letter doesn't mention it, you not only have to provide all data in your systems -- every single db inside your company -- but also data from every 3rd party system. Your transactional emailer, your marketing emailer, your billing system, your logging system, your retargeting system, etc.
The data protection part of GDPR of course applies to all PII regardless of how it’s stored. But that part is not new in GDPR, the EU has had strong data protection laws for a while. (Even if people didn’t talk about it)
The last letter of GDPR stands for Regulation. A regulation is very different from a directive (the pre-GDPR law is based on a directive). There is very little wiggle room with a regulation, even between countries. The ICO also currently works with the other DPAs as part of the Article 29 Working Party, which ensures the DPAs are working in sync.
So the ICO advice is worthy of close study, especially if your local DPA (assuming you have one) has not commented or given guidance on a certain matter.
To exercise the Union's competences, the institutions shall adopt regulations, directives, decisions, recommendations and opinions.
A regulation shall have general application. It shall be binding in its entirety and directly applicable in all Member States.
A directive shall be binding, as to the result to be achieved, upon each Member State to which it is addressed, but shall leave to the national authorities the choice of form and methods.
A directive is something member states have to implement themselves, probably also by passing a law using their own national process for doing so. As such there can be (greater) differences in the different national implementations of the directives.
The member states have agreed to abide by the GDPR, but their own specific data protection laws are allowed to have slight variations, e.g. the specific age limit defining minor vs. adult.
They're widely respected, but you're right it remains to be seen whether UK and EU enforcement will diverge.
With a good system of record, you can track and manage all of the rest of the information and issues raised in the letter.
That said, in a large company with a lot of legacy systems, it may be tough to extract the actual data itself (or even know if your system of record is complete).
Look at the Americans with Disabilities Act, an act that has done enormous good in many ways, but that has also led to an entire industry of lawyers hassling tiny businesses over insignificant infractions. (e.g. https://www.mercurynews.com/2016/04/10/serial-ada-lawsuit-fi...)
Startups in the US won't have this hassle. You don't have to serve EU customers to reach mid size/product market fit, you can concentrate on iterating on your core product. When it's time to scale, then you can look at GDPR. So limited resources stretch further.
But if the lawyers in Europe start becoming a nuisance to startups there, it's just going to force more and more services to be located overseas, and more and more government complaining about the dominance of overseas tech, a problem they're probably going to make worse.
Startups in the US are what got us into this privacy nightmare in the first place. Of course, they are no longer startups, but they still didn't fix shit once they got bigger, so I don't see how this argument holds.
I like to think of privacy like internationalisation or security. When I started programming, Unicode/UTF-8 was niche and not well supported at all. Now, for new languages, it's a given. The same with decent crypto libraries. Databases now offer pretty great unicode support (except for the old ones where it had to be bolted on, coughMySQLcough). It isn't inconceivable that privacy tools become standard in databases and data processing frameworks.
Personally, I see this as a brilliant opportunity for people/companies who want to do the right thing for their customers (whether that's consumers directly, or a company using them).
My prediction is you'll see this with cloud providers strongest. Some are putting a lot of effort into GDPR, and a properly compliant provider will become a huge value-add, and not a liability.
C'mon. The internet and the web, when they started, were a wild wild west that operated on the honor system. Most people were just starting to feel their way around what kinds of businesses could even exist on it. The Morris Worm was the canary in the coal mine about how the honor system wouldn't scale.
EU startups are no different than US startups, we just have more of them, there is a greater concentration of investment in that area here.
Blaming US tech is naïve. European companies have been engaged in non-digital forms of privacy invasion long before Google even existed.
Having said that, shunning one car dealership is way easier than trying to stop Facebook or Google slurping your data, even with ad blockers et al.
If I have a choice between an US startup that has no pressure to handle my data responsibly, and an EU startup that has a legal requirement to do so, I would choose the EU startup. The US startup may claim it takes care of my data and ask me to trust their word, but I know that the EU startup is forced to by law.
Perhaps it could end up as a competitive advantage for EU businesses.
And all bigger companies already have a data protection officer, so they just get a new job title.
How is this much more of a hassle than being required to send people a receipt as proof of purchase...
Ensure customers can see what you record about them when logged in (probably in their user profile), then minimize what you record to what you need.
If the DPO complies with all of it, they will breach the GDPR (e.g. Request 9b). Of course, a data subject also has no right to know what security controls (request 8) you have in place, other than that they are 'commercially reasonable'.
A regulator can require this information, but not a consumer (data subject). This could be the basis of a great interview test for selecting your DPO.
Request 9b is a bit tricky, since the regulator has to be informed but not necessarily the data subject. The data subject only has to be informed if there is a risk to them.
The letter is carefully worded itself. The parts the data subject does not have a direct right to know are phrased as friendly requests (e.g. 4 vs. 8b).
You can answer 8b with just one word: Yes. (Well, or No.)
The takeaway here:
If you give this letter to your technical personnel, you will get a detailed overview of the infrastructure they use.
If you give the same letter to your lawyer, you will get a very polite letter with the bare minimum of information.
An example for 8b would be: "We have technology in place which allows us to know with reasonable certainty whether or not your personal data has been disclosed."
Do you know if there's really a requirement to provide requestors with your beliefs about the law, or with legal advice you've received?
Arguably, such technology doesn’t exist (at least when plugged into a computer network). What penalties are in place if you lie in the response?
There are technologies you can use (with varying degrees of effectiveness) to reduce the risk of data leaking by monitoring or intercepting specific mechanisms through which leaks can occur, but you can never have reasonable certainty in this respect.
At some point you will probably need to work with the real data to do anything useful with it. There are situations where you really can operate on obfuscated/encrypted data, such as comparing password hashes, but these tend to be the exception rather than the rule.
And so, if you're compromised at a point with access to the raw data, or anywhere else from which access to such a point can be gained, you've still lost control of the data.
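The password-hash case mentioned above is worth making concrete. A minimal sketch using only Python's standard library (the iteration count and salt size are illustrative, not a security recommendation) shows how you can verify a secret without the plaintext ever being stored:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash; only the salt and digest are ever stored."""
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Recompute the hash from the candidate and compare in constant time."""
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, stored_digest)

salt, stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, stored)
assert not verify_password("wrong guess", salt, stored)
```

The point is exactly the one made above: this works because the only operation you ever need is an equality check, which survives the transformation. Most processing isn't like that.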
And with lawyers and words I like to think of this quote:
"It depends on what the meaning of the word 'is' is. If the--if he--if 'is' means is and never has been, that is not--that is one thing. If it means there is none, that was a completely true statement....Now, if someone had asked me on that day, are you having any kind of sexual relations with Ms. Lewinsky, that is, asked me a question in the present tense, I would have said no. And it would have been completely true."
Businesses will do enough to pass the sniff test of proper compliance with GDPR, and no more. I've worked with enough to know most mid sized orgs are far too reactive, too technically incompetent, and far too busy making money to do a proper job on adhering. Most flout existing laws already, I don't think they'll be scared of disregarding elements of this too.
I know that there is a HUGE concern about the fines that can be used to back up the GDPR.
I know of US companies that legally have an EU presence (but with little income from the EU) that are considering just blocking EU traffic as the way to stay safe with the smallest overhead.
Many countries in the EU have a great standard of living by focussing on individual's rights vs companies. Well, I say focussing. From our perspective, it's just normal and a good balance. But if you live in a country where companies can screw you over in a million ways ("at will" employment, arbitration, NDAs, etc.), maybe such rights might seem a bit alien.
And if you've tried to comply with the law, but unintentionally fail to handle some edge-case with low impact, the sanctions are pretty light (e.g. a warning letter). It's not draconian, as long as you don't cut corners.
Also, the fines here can be real money, which also isn't often the case. That plus the lack of clarity are why people are concerned about it.
Basically they're worried that you can do everything right and still be wrong because everything isn't well defined and is very difficult to define.
I also disagree with you that the EU regulations are a good balance - it’s skewed way too far towards over-regulation.
After the Equifax thing it's not looking like a very solid bet.
I think this will change the world, just as the EU's push for lead-free soldering did.
While some outfits may blithely whistle past the graveyard - do you want to become the precedent that starts paying the % of revenue fine for non-compliance?
If so, the GDPR is similar to a broken protocol.
Maybe the people who designed it assume that it will never be misused. Anyone with experience designing protocols could tell them how dangerously naive that is.
We keep being told that "data is the new oil". It is. Not for money making opportunities, but because you have to handle it responsibly and if it leaks it will cost millions to clean up.
It's very different. Here you're requested to answer fairly detailed questions designed to trip you up, with potential legal implications for your business. This has little to do with how you secure things technically. It's all about jumping through some bureaucratic hoops, and wasting your time doing it. Answering those questions won't in any way, shape or form improve the security of your business. It's pure distraction.
The difference is that you know offhand how to do one of these things but not the other.
Answering the questions is not intended to improve the security of your business, it's a form of serving your customers.
My personal experience with the ICO has shown they're quite lenient about mistakes, if you can show that you're doing your honest best, and getting better.
There's no point crushing companies that are trying; better off going after the ones that just don't care.
But it's not even just about getting to a point of getting fined or under some kind of investigation or audit. It can be all those clever customers who would use some automated service or a template, just to waste your time ... At least that's what the original post is about, but I hope it won't be too common.
Around here, regulators are prone to scoring easy points by going after the small, naive fish. All it takes is the wrong incentives: the department needs to show results, so it gives bonuses, or establishes quotas for successfully handled cases. Bam, your small business is now investigated because a government employee needs to meet a quota and correctly guesses you can’t afford competent legal defense.
People keep making this kind of argument, but it makes no sense.
Personal data isn't protected from leaks and privacy intrusions by documents or emails. It's protected by encryption, or only being processed by software with a clear purpose, or simply not being stored in the first place.
I suggest that it is not only possible but quite likely that a reasonably diligent startup will be taking reasonable practical steps to secure personal data, but will not have in place formal documentation or automated processes of the kind that would deal with a SAR like this.
We expect programmers to write working code out of general competence (and we even make sure they know how to write working code in the interview process), but we still write tests and insist that they pass. We expect finance folks to handle money correctly out of general competence, but we still have written policies about how money should be handled. The reason we do these is that good, well-intended people occasionally make mistakes, and in both of these cases, the mistakes have real consequences.
A written policy about how you handle data isn't going to save you if you're messing up in general. But it should be easy to write, and it will save you from "Wait, why did one of our interns add a library that sends stack traces and local variables to a third party? How did this code review even get approved?"
The documents don't protect your users' data. Your general technical practices protect your users' data. The documents protect your general technical practices.
So I think I would still argue that the security benefits of this law in terms of any documentation and processes it requires are at best unproven, and that a startup could be doing the practically useful things needed to protect personal data regardless of how compliant or otherwise they might be with any documentation requirements.
I wanted to tell you how impressed I am with how patiently and clearly you've responded throughout this comment section.
I likewise think the intent of the law is admirable: prevent future Equifax-es, give people control over their data, and centralize the requirements so that companies need to comply with a single EU standard, instead of 28 country-specific ones. But the amount of discretion left to regulators and the lack of any sort of proportionality built into the law make this all very scary. We are expecting a fifteen person small business to have a totally impractical degree of _documentation_ and _formal_ processes, which are 1) very expensive to produce, 2) totally unnecessary for an otherwise reasonable and well-intentioned group of people, and 3) crucially, basically orthogonal to actual data privacy and security best practices.
And even if you comply with the letter of the law, just reading and understanding an email like the one in this post will require hundreds of dollars of company time – beyond reading it, it will need to be escalated, someone will need to loop in a few other people to help with any new technical details, and so forth. If the fully-loaded cost of a white collar employee is $75/hr, this all gets expensive very quickly, and that cost can be levied on a company by an email that can be sent in one minute. Nobody is going to bring down Google with GDPR-spam but it would not be hard to do serious damage to a company of ten people.
There are a lot of well-meaning thoughts in this thread from people who are frustrated at the status quo but unfortunately don't understand how little this law will do to change it and how huge its costs will be.
When you try to deliver a novel product and build a business around it, you are forced to develop a strong sense of practicality and an understanding of the machinery of a business. Most people have never done this. Despite being very intelligent, a lot of these people haven't experienced the realities of creating a business, and as a consequence they don't really understand just how harmful this kind of law can be.
I admire how patient and articulate you are. (And I think your thoughts are clear and your point of view is correct and badly needed.) Would love to buy you a beer sometime.
Since Silhouette (and gdpr_throwaway) want to keep their anonymity, I opted for virtual beers by upvoting :) But happy to convert those karma points to real food or drink -- and hopefully an insightful conversation -- if you feel like getting in touch (my details aren't so private).
For the few small companies I've worked for, this would have been a bit of work once (document the dataflows), and then a fairly easy set of queries to be repeated each time.
To add to a sibling comment, Google can afford a legal department, at an estimated 0.00000x% of their turnover, that deals exclusively with these.
For smaller organizations, this becomes more like 0.x% of turnover...
Not to mention the distraction and plain overhead when you're juggling so many other things.
By that logic don't you need a lawyer to handle all customer support interaction?
Couldn't you get sued for fraud if you fail to document purchases in a legally safe way?
The part that's not clear about the GDPR is whether you're obligated to manually answer any data-related question a user has, or if you can just post a comprehensive FAQ + data export / account deletion tool, and auto-respond to GDPR requests with links to those.
Here is a listing of everything you have a right to know about our company and processes under GDPR:
<huge info dump>
Here is all of the personal data we have about you:
<very long CSV file>
Ideally, the most time-consuming part of responding, after the first such letter, will be verifying the user's identity.
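The FAQ-plus-export approach sketched above could be as little as this. Everything here is hypothetical: the URL, the shape of the user record, and the idea that identity has already been verified before this runs.

```python
# Sketch of an auto-response to a GDPR subject access request:
# a link to a generic FAQ plus a per-user CSV export.
import csv
import io

GDPR_FAQ_URL = "https://example.com/privacy/gdpr-faq"  # hypothetical URL

def export_personal_data(user):
    """Dump everything held about one user as CSV (identity already verified)."""
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerow(["field", "value"])
    for field, value in user.items():
        writer.writerow([field, value])
    return buffer.getvalue()

def respond_to_gdpr_request(user):
    export = export_personal_data(user)
    return (
        f"Everything you have a right to know about our processes: {GDPR_FAQ_URL}\n\n"
        f"All personal data we hold about you:\n{export}"
    )
```

The generic half of the answer (the FAQ) is written once; only the export varies per user, which is what keeps the marginal cost of each letter low.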
Then limit what you record. What do you need to store that isn't visible from people's user profiles when they're logged in?
It's unlikely that the number of requests of the type referenced in this article would be large enough at that stage to "eliminate the value proposition."
Basically, you can request any non-sensitive information from any government agency and they have to provide it within a reasonable term or pay a fine to the requester.
This caused people to request all calibration reports of a speed camera if they got a ticket, because for quite some time the government would waive the ticket if you stopped the request.
When it got abused too widely they automated the process and now it's not a problem. This is also how large corporations should handle this problem.
Can't feel too sorry for them
If you phrase it as "large companies," then it sounds bad - but it forbids incompetent large companies too. It enforces that only companies that are competent enough to answer questions about data protection can be in the personal data space. If a small company is inherently incapable of answering those questions or handling the data properly, it shouldn't be allowed in that space.
It's like saying that there's a "government enforced monopoly" keeping newcomers out of the food business by not letting them just make things in their apartment and hand them to Uber Eats. It is a technically accurate description, but most people who believe that government has any legitimate functions at all see health inspections as a good thing.
There will always be a few people out to cause trouble with excessive requests, but I don't think we should let that block access to non-sensitive information for things we as the tax payer have paid for.
You can extend the one-month response deadline by a further two months if the request is complex.
You can charge a reasonable fee (or refuse to act) if a request is manifestly unfounded or excessive; otherwise the response must be free.
A $1000 fee would seem a little bit more fair.
The law needs to be applicable to everyone, and imposing high costs is generally considered to do the opposite: http://www.bbc.co.uk/news/uk-40727400
It is utterly unfair to compare subsidized access to an employment tribunal (potential harm: months of undeserved unemployment, loss of home and possessions; cost of investigation: spread across the entire nation's taxpayers) to almost-free access to your GDPR privacy report (potential harm: a little bit of mental discomfort; cost of investigation: borne by one organization, potentially ruinous for a small business or solo project).
Companies storing and losing PII have a huge negative impact on the affected users, like e.g. credit card fraud or tax refund scams. This bears a huge actual cost to the victims, either because they never get back the stolen money, or because they need to invest significant time and expenses to fight for it.
A company trying to make money off my PII had better be prepared to handle it securely and to delete it upon request. Handling of GDPR requests must be calculated by them as part of the data handling expenses.
In both of these cases if the information is misused it has consequences for the individual, and (relatively) higher cost for the individual on minimum wage.
In the former, this can affect your future ability to find housing. This potentially leads to extraordinary stress.
In the latter the consequences again affect both wealthy and poor, but the person living hand to mouth faces much more serious consequences if their wages are administratively docked to pay for costs fraudulently registered in their name. Further, they're unlikely to be able to pay an expert to resolve this or take time out of work to do this themselves.
Which items do you feel are an impossible burden?
From what I see most of the items pertain to one of two possibilities:
1 - General procedures or information about the company (keep this updated and it's the same for all requests)
2 - Information about the subject (export their data in an automated fashion)
The thing about 'decisions based on their data' might be tricky, but I guess you can share what you concluded from it and the overall rationale (for example, Facebook's "Why am I seeing this" over an ad)
There are law firms whose sole business model is targeting small companies for not complying with certain regulations like legal notice requirements or disclaimers on websites.
Only time will tell if this will be the case with GDPR but there definitely is a risk that this new regulation will be abused by dubious players.
Impossible? Why is it not possible?
The first thing I'd do if I was a black hat type attacker would be to submit GDPR information requests to all internet companies I could think of on behalf of all my targets.
 https://gdpr-info.eu/art-12-gdpr/ (point 2)
It raises the barrier to entry for small one person businesses even more, forcing out anyone who can't justify the costs of compliance.
1) Allow people to login and view their personal information: name, email.
2) Allow people to delete the profile.
And don't retain any data other than (1) or (2). If you want to track users to see if they clicked links and what countries they are browsing from then: (A) anonymize it or (B) make it visible in the profile information (1).
If all you record is name and email, you won't need a lot of infrastructure. Your policy might say you transfer email addresses to AWS when sending emails.
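Option (A) might look like the sketch below. Strictly speaking a salted hash is pseudonymisation rather than true anonymisation, so a regulator may still treat the output as personal data; the salt and field names here are purely illustrative assumptions.

```python
# Sketch: pseudonymize tracking events before storing them, so stored
# analytics can no longer be trivially tied back to a person.
import hashlib

ANALYTICS_SALT = "rotate-me-daily"  # hypothetical; rotating it limits long-term linkage

def anonymize_event(email, country, clicked_link):
    # One-way salted hash replaces the identifier; only aggregate
    # fields (country, click flag) are kept in the clear.
    pseudonym = hashlib.sha256((ANALYTICS_SALT + email).encode()).hexdigest()[:16]
    return {"visitor": pseudonym, "country": country, "clicked": clicked_link}
```

With a rotating salt you can still count clicks and countries per period without keeping anything that has to be exported or deleted on request.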
The right to control such information is established as a right of the individual; and if you have possession of some information about me, then yes, I have more rights than you to control what you are allowed to do with this information in your hands, and that information can never in any way fully become "your property".
As if possessing something makes it your property - property is a legal notion and (in democratic countries) means just what people want it to be.
Similarly, companies often include EULA and shrinkwrap contracts governing what users are allowed to do with information accessed on their webpages. So why can't users collectively write a similar contract pointing the other way?
"We put lots of engineering effort into mining your personal data and selling bits to other people, but we can't be bothered to put any engineering effort into disclosing on your profile or account-settings page what we're doing with your data."
A lot of the questions are answerable generically (no differences between users). You can't tell me that writing a data privacy FAQ with those answers in clear, simple language, once, with a link on every page and on users' profiles, is an excessive burden. These companies just don't want to have even that minimal burden and process to ensure that changes in usage of personal data get documented and updated on such a faq.
A letter like this would be a hugely disproportionate burden to a small business like that. It would take many hours, if not days, to reply properly to all of those points, even for a business that is doing nothing shady or unusual.
You can't just write "automate it" as if that has no cost.
If a start-up is doing things with personal data so that answering those questions takes more than a few paragraphs, isn't the start-up pretty much a personal-information-processing business, and doesn't it deserve to have the burden? Doubly so because start-ups often leave security considerations for later; any personal information they collect or share may not even meet the minimal industry standards and expectations of larger companies (not that such informal standards are adequate—those larger companies are often incompetent themselves).
It doesn't have to be doing any of that. Just the time and money to have a lawyer review this letter and identify the actual obligations is already a significant burden. For example, notice that just replying with everything requested here would in itself potentially breach data protection law.
A normal person who really was worried about how their data was being used would probably write a polite letter asking what data was being stored, how it was being used, and maybe a couple of supplementary points if they had particular concerns or perhaps had heard a warning about some specific practice that could be dangerous.
A lot of GDPR is not new. It's just clarification of existing law.
1) an updated Asset Inventory
2) a Data Classification Scheme
3) Data Labeling Policy & Procedure
Those are basic components of an InfoSec 101 course taught by Community Colleges and the top basic items GDPR is wanting.
Don't do those things when you start a business.
But, then, don't have your business collect and process data on individuals.
> But, then, don't have your business collect and process data on individuals.
Aren't those two statements together effectively equivalent to "don't ever start certain kinds of businesses"?
There are lots of other profitable businesses you're not allowed to start, like "an agile, disruptive restaurant that cuts costs by never cleaning" or "an investment advisor that front-runs their own customers" or "a healthcare startup that runs on unpatched Windows XP" or "a company that helps you get work visas for nonexistent jobs" or whatever.
In other words, some businesses have requirements. If you don’t want to follow those requirements, don’t go into that business.
If you want to be entrusted with people's private data, then the table stakes are much higher than simply starting a business, and you have to be prepared to invest the time and resources to do it properly, or you're not allowed to do it at all.
Don't start certain kinds of businesses without being willing to deal with the reasonable requirements of starting businesses of that kind.
If I start a biotech startup, then I need to make sure I'm keeping all health data I encounter well protected. This _does_ mean it's harder to start a business in this space, but not impossible.
If you're not willing to make that tradeoff, then don't start that kind of business.
My data is my data, not the fundemental requirement of some businesses.
And beside that, regulations that effectively result in prohibiting certain kinds of businesses even though they don't explicitly do so are bad regulations IMO.
There are companies tracking the MAC address of my phone with wifi beacons to find out which stores I was physically visiting. How do I opt out of that?
Sorry to bring the tired "you're not the customer, you're the product" line, but the way the industry is set up today, I'm starting to doubt there is so much difference between the two options.
Tracking and data collection is baked into so many services nowadays that you'd have to be extremely attentive as a consumer to avoid any tracking - also be prepared to face a lot of inconveniences and restrictions. If possible at all.
To your point, if a business is not explicitly banned, but banned because of regulation about what that business can do, that’s exactly the sort of regulation we want. We don’t dictate your business specifically, just what you can and can’t do with the data. If you can operate within those regulations, congrats!
* Data Classifications
* Privacy Impact Assessments
* Log Reviews
* Incident Response
Also, are you seriously suggesting that in response to a formal legal communication it's a good idea to reply without having input from a lawyer?
For routine enquiries, maybe not. For a letter like this, from someone who is clearly intending to trip you up and cause trouble, our lawyer is the first call I'm making, every time.
And that initial conversation is already going to cost me hundreds of pounds and a half-day of work, even if I already have reasonable answers to anything we are actually required to respond with under the GDPR here.
/shrug It's your money. You could do that, or you could even light it on fire if you wish. It's no skin off my back. If your company is profitable enough to eat this self-imposed overhead, then its owners will just make less money. If it's not, then leaner competitors will replace it. I'm fine with either outcome.
The GDPR itself is very heavy and has little in the way of moderation for small-scale data controllers/processors, so in practice it's going to come down to interpretation by regulators (and potentially anyone who has rights under the GDPR and wants to make trouble, as in the example we're discussing). If you don't do enough, you potentially face even greater overheads due to formal audits, financial penalties, etc. If you do too much, then as you rightly point out, you leave yourself at a disadvantage compared to competition who don't do as much (and this remains the case even if that competition is knowingly breaking the law as a result, and that in turn doesn't matter if they face no meaningful penalties for it).
Life is risk. I contend that if you make a good faith effort to comply with this law (i.e. consult with a lawyer, once, to develop those eight documents you mentioned in another part of this thread) and generally practice good private information hygiene (wipe out old data, don't log private info, don't retain logs or emails too long, etc.), you're probably going to be fine. This is probably not going to be in the "inner loop" of risks your small business faces.
In every regulation, there are winners and losers. Some of the losers didn't do anything wrong, but are just losing because that's the nature of designing laws that factor in disparate interests. At this point, it's the law, and your only choice is how you're going to handle it. And my contention is that, if your small business is receiving letters like this with any regularity, calling a lawyer and spending half a day on it each time is not among the reasonable spectrum of risk-mitigating responses.
This transition period is ending this summer. Why is this discussion taking place now?
GDPR is very broad and open to interpretation, and those interpretations will only be settled when someone gets caught, i.e. during the first legal battles.
So the transition period does not really help, be that 2 years or 4. We need to see how this law is going to be enforced by regulators, and which common IT practices constitute breaking the law and which do not.
Because no-one thought to inform most of the businesses affected by it before, and awareness has only grown in recent weeks (and even then probably only among business people who frequent forums like HN where the subject has come up).
Every business I've worked with over the last couple of years of consulting has had sessions on GDPR, entirely without any technically minded people having to bring it up.
I'm sure there will be people caught by surprise, but what I've seen has been very promising.
OK, but if you're going into a business and consulting, that already suggests both a certain scale and a degree of awareness within those businesses, so this isn't likely to be a representative sample.
Additionally, most companies without much technical infrastructure are less likely to be affected much in the first place.
This is just untrue. There are fucking reams of advice for small businesses.
Also, having "fucking reams of advice" is not a good thing. To be practically useful for the kind of organisation we're talking about, advice needs to be clear and concise. A starting point that will take days just to read through and understand isn't very helpful.
You don't need a lawyer to reply to GDPR letters. You do need to comply with the law when you collect personal data. What you're saying is "I should be free to ignore the law until someone writes to ask about my compliance, and when they do it's burdensome for me to get legal advice to respond to that letter".
You keep asserting that it's not necessary to have a lawyer review a letter, despite the letter being legal in nature and in this case clearly coming from someone who is looking to cause trouble. Clearly you and I have very different attitudes to risk in this respect.
In any case, an obligation to comply with the law is self-evident. My objection is that the law itself is poorly implemented and that what is necessary to comply is ambiguous.
Your repeated scare mongering around GDPR is fucking tedious, especially since almost everything you've said about it is false.
But most interactions with my customers do not begin with a multi-page letter that literally opens with a direct threat and then proceeds to demand a response on 40 different points.
Your repeated scare mongering around GDPR is fucking tedious
I run small businesses, and we have been dealing with GDPR issues. The ambiguity and overheads I have been talking about in this discussion are costing us time and money right now. Dealing with a letter like the one we're discussing would cost us more time and money. Apparently we aren't alone in these respects.
Some of the GDPR's supporters have argued that the lack of proportionality in the actual regulations is not a problem because the regulators will enforce it pragmatically. I have personally heard such arguments made about onerous EU rules before, and through my own businesses I have been on the receiving end of government mistakes and their rather unpleasant consequences. And again, that wasn't some freak unlucky event: thousands of other businesses are known to have been subject to similar problems, in more than one incident, involving more than one government authority.
A few people have suggested that involving lawyers in response to a letter like this is unnecessary. Clearly it's going to be a matter of risk assessment, but I don't think it's unreasonable. Once again, I have personally seen (at a former employer in this case) how much time can be wasted if a company gets caught up in formal legal proceedings even having done nothing wrong.
In short, there are people out there dealing with the issues you call "scare mongering" every day. These are not just hypothetical problems. Maybe you've never been caught up in them yourself, but sadly not everyone is that lucky.
especially since almost everything you've said about it is false.
If you're going to call me a liar, please at least tell me what I've written anywhere in this discussion that was false so I can set the record straight.
(you get fun stuff back, I got all the logs from my public transit card that way)
It would be better to get the lawyer involved when you start your business so you know you're complying with the law.
And almost everything in the GDPR comes from existing laws (in the UK, the Data Protection Act and PECR), so if you're breaking the law under GDPR you're probably breaking the laws that exist now too.
>a. In particular, please tell me what you know about me in your information systems, whether or not contained in databases, and including e-mail, documents on your networks, or voice or other media that you may store.
>b. Additionally, please advise me in which countries my personal data is stored, or accessible from. In case you make use of cloud services to store or process my data, please include the countries in which the servers are located where my data are or were (in the past 12 months) stored.
>2. Please provide me with a detailed accounting of the specific uses that you have made, are making, or will be making of my personal data.
Privacy Impact Assessment
>3. Please provide a list of all third parties with whom you have (or may have) shared my personal data.
>a. If you cannot identify with certainty the specific third parties to whom you have disclosed my personal data, please provide a list of third parties to whom you may have disclosed my personal data.
>b. Please also identify which jurisdictions that you have identified in 1(b) above that these third parties with whom you have or may have shared my personal data, from which these third parties have stored or can access my personal data. Please also provide insight in the legal grounds for transferring my personal data to these jurisdictions. Where you have done so, or are doing so, on the basis of appropriate safeguards, please provide a copy.
>c. Additionally, I would like to know what safeguards have been put in place in relation to these third parties that you have identified in relation to the transfer of my personal data.
>4. Please advise how long you store my personal data, and if retention is based upon the category of personal data, please identify how long each category is retained.
>5. If you are additionally collecting personal data about me from any source other than me, please provide me with all information about their source, as referred to in Article 14 of the GDPR.
>6. If you are making automated decisions about me, including profiling, whether or not on the basis of Article 22 of the GDPR, please provide me with information concerning the basis for the logic in making such automated decisions, and the significance and consequences of such processing.
>7. I would like to know whether or not my personal data has been disclosed inadvertently by your company in the past, or as a result of a security or privacy breach.
>a. Please inform me whether you have backed up my personal data to tape, disk or other media, and where it is stored and how it is secured, including what steps you have taken to protect my personal data from loss or theft, and whether this includes encryption.
>a. What technologies or business procedures do you have to ensure that individuals within your organization will be monitored to ensure that they do not deliberately or inadvertently disclose personal data outside your company, through e-mail, web-mail or instant messaging, or otherwise.
>c. Please advise as to what training and awareness measures you have taken in order to ensure that employees and contractors are accessing and processing my personal data in conformity with the General Data Protection Regulation.
Security Awareness Training
>8. I would like to know your information policies and standards that you follow in relation to the safeguarding of my personal data, such as whether you adhere to ISO27001 for information security.
Get an ISO audit.
You should be able to provide this from a SQL query.
Please tell us all what that query should be, then, and how it's going to cover the relevant data stored in log files, emails, remote services used for payment processing, off-site backups, etc.
That's just a very minimal set of other places that almost any new online business is likely to be working with on day one.
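To illustrate the point: a complete export is rarely a single query, because personal data leaks into stores a query can't reach. Here's a rough sketch in which every table name, log format, and the `processors` list are assumptions, not a real schema:

```python
# Sketch: a per-user export has to touch the database, log files, and a
# list of third parties (payment processor, mailer, backups) that can
# only be enumerated, not queried directly.
def export_all_personal_data(email, db, log_dir, processors):
    report = {}
    # The "SQL query" part: structured profile data.
    report["profile"] = db.execute(
        "SELECT name, email, created_at FROM users WHERE email = ?", (email,)
    ).fetchone()
    # Log files: no query reaches these; they need a separate scan.
    report["log_mentions"] = [
        line
        for path in sorted(log_dir.glob("*.log"))
        for line in path.read_text().splitlines()
        if email in line
    ]
    # Third parties hold their own copies; all you can do is list them.
    report["third_parties"] = [p.name for p in processors]
    return report
```

Even this toy version shows why "just run a query" understates the work: the log scan and the third-party inventory are separate processes with separate failure modes.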
Data Classification Plan
Asset Inventory Plan
Privacy Impact Analysis
Access Control Plan
Data Retention Plan
Data Collection Plan
Breach Escalation Plan
You're suggesting that in order to handle this kind of request -- which none of my businesses has ever received from anyone in many years of trading -- we should write up 8 different formal policies? These businesses probably don't have 8 different formal written policies in total at the moment. This is just totally detached from the realities of running small businesses, though it does reinforce my point about disproportionate burdens.
[The parent comment appears to have been edited after I wrote this. The terms above were in the original.]
I wasn’t finished writing.
>we should write up 8 different formal policies?
Yes. That’s obvious.
More overheads are generally bad for business. In the run up to Brexit, and given figures from the Chancellor's statement just this week showing relatively low productivity and growth in the UK economy, it's remarkable how many people don't seem to have a problem with increasing those overheads and thus negatively affecting the creation and growth of businesses.
There is a balance to be struck here. Protecting privacy is important, but not regulating in a way that introduces excessive burdens is also important.
A) You are using personal data in good faith as part of your business and don't need a lawyer. Just reply. I work for an organisation at the larger end of the SME scale and won't be using a lawyer, just as I don't use a lawyer for routine contractual disputes like debt collection until the debtor refuses to pay.
B) You are walking a fine line and relying on the exact wording rather than the spirit of the law. You are not acting in good faith and are trying to make money out of customer data. You need a consultancy firm and lawyers, and you won't get any sympathy from me.
I'm not sure whether you are serious or this continues your repeated anti-EU comments on HN, Silhouette. I find it OT and I hope the moderators do too.
And of course, if people find out out you're under investigation, a lot of people are going to just assume you did something wrong. You won't be able to fix that no matter what the regulators conclude.
To the extent that I am anti-EU in some respects, particularly around the areas of small businesses and excessive regulation, that is born of experience. As I have mentioned in previous comments, which apparently you might have seen, I have been on the wrong side of EU rules being over-zealously applied before, and I have been on the wrong side of a government regulator that is for most practical purposes above the law making a mistake before. Some things that some commenters tend to dismiss as hypothetical, I know from direct personal experience to be real threats, and I will challenge bad laws that allow scope for such threats to exist.
I find it OT and I hope the moderators do to.
I'm sorry that you feel censorship is a useful response to someone with different experience and views to your own. I like to think that HN is a forum where people can discuss such differences of opinion openly and intelligently.
Access Controls, Data Classifications, and Privacy Impact Assessments requested by GDPR are not a threat.
That’s just security 101 basics.
And as I said elsewhere, if you think that threat is imaginary, please look at how many different national tax authorities have started large numbers of incorrect claims procedures against small businesses who had done nothing wrong just because the officials made mistakes with the new VAT rules and got their own records in a mess.
The company is the controller of the data, and Amazon is the processor.
Here's Amazon's declaration and stance, stating they are GDPR-compliant both as a company (when they are the controller of their direct customers' data) and as a processor (providing infrastructure for others who control private data): https://aws.amazon.com/compliance/gdpr-center/
There's generally no need for a controller who relays data to a processor to understand the intricacies of the implementation on the processor's side (is deleted data really deleted?) - what's more important is the processor's self-declaration of GDPR compliance.
The above is my personal $0.02 as I've been spending quite some time getting into GDPR recently. IANAL
To be clear, many businesses may not have good answers right now. Their response should not be "this is too much of a burden" but instead "wow, we really need to find this out ASAP".
Where it gets complicated, i.e. where they buy your data from 3rd parties, I don't have a lot of sympathy for any of the complications involved. Most of the rest can be automated, not for a non-zero cost, but for a relatively low one if a startup goes in with these questions in mind, prepared to answer them when they come up.
I have businesses that don't do anything shady at all with personal data, and I'd like to think we're conscientious about handling what we do have. We follow general good practice in terms of encryption, hashing passwords, and so on. We've never had any sort of request for information under existing data protection rules, nor complaints under any other regulatory regime for that matter.
So, how much time and money should we spend putting together that boilerplate, just to tick a legal box? How much of the documentation formally required under the GDPR should we actually write, given that on the evidence of several years of trading so far it has literally no value to anyone? How much should we spend on things like getting lawyers to review the contracts we have with the small number of outside services we do use, which might have access to some personal data in connection with the services they provide for us, and how often?
If you actually follow the letter of the law here, the costs of compliance would be astronomical by small business standards. There is little proportionality built into the GDPR itself, so we are reliant on regulators to introduce it, and that's not a good position to be in either legally or practically.
Here's how the potential fines are defined:
Under GDPR organizations in breach of GDPR can be fined up to 4% of annual global turnover or €20 Million (whichever is greater).
whichever is greater... So since my company's turnover is an order of magnitude less than €20 Million, I guess this means we can get totally buried??
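The cap quoted above is simple arithmetic, the greater of the two figures, which is exactly why it looks so alarming for a small turnover:

```python
# Maximum GDPR fine: the greater of 4% of annual global turnover or €20M.
def max_gdpr_fine(annual_turnover_eur):
    return max(0.04 * annual_turnover_eur, 20_000_000)
```

For any turnover below €500M the €20M figure dominates, so for a small company the cap is the same as for a mid-sized one. That said, it's a ceiling, not a tariff; regulators are expected to scale actual penalties to the infringement.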
If anyone from the EU visits your website, and you're collecting server logs or analytics with IP addresses in them, you're now processing personal data of EU citizens and subject to the GDPR. They've written this regulation such that pretty much everything on the internet is subject to it.
Note that even if I don't have an email server, relying on my ISP to handle that, desktop email clients download the headers from the server.
A lot of small businesses have no idea that they are storing that information.
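One common mitigation (used by some analytics tools) is to truncate IP addresses before they are stored. A minimal sketch, assuming IPv4 addresses in a plain-text access log; the regex and log format are assumptions, so adapt it to whatever your server actually writes:

```python
# Mask the last octet of IPv4 addresses in a log line so the stored
# address no longer identifies a single host.
import re

IPV4 = re.compile(r"\b(\d{1,3}\.\d{1,3}\.\d{1,3})\.\d{1,3}\b")

def mask_log_line(line):
    # "192.168.1.42" becomes "192.168.1.0".
    return IPV4.sub(r"\1.0", line)
```

Running this as a filter before log lines are written means the personal data is never stored in the first place, which is a far easier position than having to export and delete it later.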
It may be a bit of an unlikely scenario, but people should remember their opinions on region-specific content blocking even if they think their region has enough leverage to make everyone bend to their will.
Generally, your device is instructed via a publisher's site/app to reach out to ad tech servers either directly (firstparty), or indirectly (firstparty->thirdparty, firstparty->RTB exchange->thirdparty).
Due to the "chaining", GDPR is particularly onerous on the adtech industry. Granted, all the data is keyed by semi-anonymous IDs (cookies, IDFAs, IPs), but the concerns around consent, retrieval, and deletion, cascading through the chain, are an industry-wide problem requiring collective action. The IAB proposed something for the RTB side, the publishers don't like it, and it'll be tense until and through May 25th :)
Having said that, nobody wants to shut-the-whole-thing-down. While all these servers may refuse service based on fuzzing the request as originating from the EU, they may also decide to serve as-best-as-possible and minimize logging of the sensitive fields - it may be better, for example, to lose some functionality for European devices (behavioural targeting, for example, the idea of showing you an ad for the Widget you just looked at over and over), than to serve nothing at all.
For example, if you decide to ignore tax laws in X, X might put pressure on your credit card processors to stop aiding your tax evasion. If the credit card processors respond by cutting off your ability to process cards, they might not bother cutting you off only from payments originating in country X. They might cut you off completely. That would be pretty annoying.
Too "bad" about the US dropping the TPP, I assume that was the backdoor planned for "compliance".
Demand all you want, this is the point of national sovereignty.
I'm sure you can rely on your site being too small for EU regulators to bother with, and I'm sure it would be hard for them to enforce if you have no operations in the EU, but the fact you ignore the laws doesn't mean they lack jurisdiction.
A website hosted in the US, owned by a US citizen, residing in the US, is not subject to laws written in other countries.
As a US citizen, I am strongly against our interference in other countries, but even if/when we fix that, it wont matter if the root problem is not fixed, since another outside power could do the same thing.
It's a sign that the people here have the most fundamental control over their legal system. It's not my problem if country B can't do that, but I would REALLY like country B to have the same power over their legal system.
I could go into the real tests and what it means to have a legal system where the individual has so much power, and how to achieve that, but you are ignoring the distinction between enforcing foreign laws on a US citizen and on a citizen of country B.
You are implicitly admitting the asymmetry, but instead of fixing country B, do you want country A to weaken its system so that it has the same foreign-influence bug as country B?
- Have their ISPs block access to your network
- Have their banks not process payments to you
And if you really want to generalize it to "laws", they can issue an arrest warrant: good luck ever travelling to another country that has an extradition treaty with any EU country.
They can't prevent a business in another jurisdiction from operating but they sure can prevent your business from being conducted with any EEA entities.
The key term there, of course, is "up to". You don't get fined the maximum amount for the smallest violation. It's a range, depending on the severity of the violation, and probably whether there was gross negligence and/or maliciousness.
I don't understand why this is constantly handwaved away with statements that claim to predict the future. If you are correct that the fines won't be as large in some cases, then the regulation could codify that a bit better than "trust us".
If this is carrot and stick, the stick is fucking tiny and hardly ever used.
If the amount is anything substantial (more than contact information and whatever data customers might choose to host with you), then you are exactly the right target for the GDPR, and you should be spending whatever amount you deem necessary to avoid the fines.
It's harsh, but it is true that software and service companies in general (maybe not you, maybe not your company) are far too lax with personal info, and so now legislative bodies like the EU are choosing to address that issue. The easiest way to be in compliance is to not hold any more customer data than you actually need, so that when you do get hit with a letter like the one linked here, you have a much easier time responding.
Will this strangle some businesses? Even prevent some from even getting started? Undoubtedly, but that is a trade-off I'm willing to accept in this world where every incentive is stacked against the integrity of my privacy.
I suppose this is why I'm so frustrated by this whole issue. I have a lot of sympathy for your argument that some businesses exploit personal data in ways we might well agree are abusive, and that something needed to be done to curb that. But as someone who does try to do the right thing both ethically and legally, this is just another set of regulations that is going to cause compliance overheads for my own businesses while offering little if any real benefit to anyone in our case.
Meanwhile, if the risk of significant enforcement action against smaller businesses really is low, the door is open for competitors to take their chances and gain an advantage over us, particularly if they're not in the EU themselves. So it also seems to be a case of no good deed going unpunished.
That includes you, the individual as well, and I hope it works out for you the corporation.
There's been a round of companies "reconfirming" email lists "because GDPR" - but if those companies can't show clear opt-in before sending email, they're already in breach of PECR.
Obviously this is a silly simile but the point remains: certain types of business have certain regulations, in this case if a business relies on keeping your private data then they have to follow the appropriate regulations, like most other fields.
- Our database (containing user data like login, e-mail etc.)
- Our third-party SaaS providers such as Mailchimp (e-mail address and name), Mailjet (no personal data stored directly there) and Stripe (transaction history).
Automatically pulling together the necessary information from these sources and sending it to the user seems totally doable and not overly complex.
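As a rough illustration of that aggregation step, here is a sketch with stub fetchers standing in for the real database queries and the Mailchimp/Stripe API calls (all names and return shapes here are hypothetical):

```python
# Hypothetical subject-access-request export: pull everything we hold
# on one user from each store listed above into a single report.

def fetch_from_database(user_id: str) -> dict:
    """Stand-in for a query against our own user table."""
    return {"login": "alice", "email": "alice@example.com"}

def fetch_from_mailchimp(email: str) -> dict:
    """Stand-in for a Mailchimp list-member lookup by email."""
    return {"list_member": True, "name": "Alice"}

def fetch_from_stripe(user_id: str) -> dict:
    """Stand-in for listing a customer's transaction history."""
    return {"transactions": [{"amount_eur": 12.00, "date": "2018-04-01"}]}

def build_access_report(user_id: str) -> dict:
    """Aggregate per-service data into one report for the user."""
    account = fetch_from_database(user_id)
    return {
        "account": account,
        "mailchimp": fetch_from_mailchimp(account["email"]),
        "stripe": fetch_from_stripe(user_id),
    }

report = build_access_report("user-42")
```

The useful side effect of writing this once is that the same per-service fetchers give you an inventory of everywhere personal data lives, which is exactly what a deletion request needs too.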
In general, I think the whole idea behind these rights is to incentivize companies to implement well-documented and automated processes for dealing with user data, and to keep the data in as few places as possible.
BTW I'd be very interested to hear from people running startups how they process user data and how many different data stores / services they use to manage that data!
That in itself is reasonable, but it lacks the proportionality aspect that is so important. My own objections to the GDPR aren't about the spirit in which it's intended; while you might not guess it from my comments on HN today, I'm generally a very strong advocate of privacy safeguards. Instead, my concern is the amount of additional red tape and ambiguous obligations that the GDPR appears to be introducing for what ought to come down to simple questions like whether you are using personal data only for legitimate purposes and you are storing it safely, which plenty of us already were anyway.
There should be a distinct "If you adopt these reasonable policies, you are legally in compliance with GDPR".
He was asking in general however, without a mechanism to control that corporations are doing what yours is already doing, how would we verify compliance?
Which aspects do you think would be interesting to automate or are particularly painful from your perspective?
* Privacy Impact Assessments
* Breach Escalations
* Access Controls
That’s more like Security 101 Basics to me.
We should ask that of the tens of millions of Americans whose private data is for sale even as I type this, after the Equifax breach. Bonus: we can literally buy that data and use it to contact them directly and ask :)