> It's not ridiculous, but it does impose a cost on them, and by extension everyone else dealing with them, because you changed your mind. Whether you should automatically be entitled to impose that cost on everyone else and under what circumstances is not an easy question.
Sure, and that's why we have initiatives like the GDPR that try to answer that question. It's not going to be perfect, but throwing up our hands and giving up isn't an answer either.
> Maybe they need that data for their own financial records. Maybe those records are things they are required by law to keep.
Needing that information to be personally-identifiable is pretty rare, and cases where that information needs to be retained for a long time are even rarer. But in cases where it's necessary, sure, of course, go for it. The point is that the data needs to be collected and kept for a legit business purpose.
> That's a false dichotomy.
No, it's not.
> Practical data protection is almost always going to be about restricting not just the initial collection of data but also how that data may be used and by whom once it has been collected.
That's been shown not to work all that well. Companies lose control of data all the time, whether through a data breach or through unscrupulous internal practices that take existing data and use it in new ways, even when not specifically authorized.
> An isolationist approach where everything can be kept totally secret is impractical, but it's usually not what we really want anyway, since then you couldn't do anything useful and intentional with the data either.
Sure, and nowhere did I suggest that's what I wanted. Please stop putting words in my mouth. I'm totally fine giving out "secret" data if there's a benefit to my doing so. But if there is no benefit to me, then companies should not be entitled to my data.
> Sure, and that's why we have initiatives like the GDPR that try to answer that question. It's not going to be perfect, but throwing up our hands and giving up isn't an answer either.
I would argue that the approach taken by the GDPR is pretty close to just throwing up our hands and giving up, just coming down on the other extreme.
> Needing that information to be personally-identifiable is pretty rare, and cases where that information needs to be retained for a long time are even rarer.
Not at all. Just look at the records you are required to keep under EU VAT rules. One of the big criticisms since the changes in 2015 has been that they demand an already high standard of evidence for the location of every single customer you sell to (if you're selling something within the scope of the rule change, obviously), that you're required to keep that evidence for years, and that your records are subject to audit by any of 28 different national tax authorities.
> But in cases where it's necessary, sure, of course, go for it. The point is that the data needs to be collected and kept for a legit business purpose.
And what happens when that data includes, say, an IP address that was subject to geolocation when a customer was charged, thus linking that IP address and everything else in every log or database entry that you ever collected that mentions it with that specific customer? Since you may effectively be forced to keep the IP address associated with the customer to meet mandatory standards for tax record-keeping, must you now purge every related record or log line even from backups, on demand and entirely at your own expense?
What if those backups are stored, as many are, in an encrypted, deduplicating format? Are you now required to go through every backup you've ever taken in the history of your business and systematically obfuscate or delete every mention of that IP address? Do you have to take steps to erase any trace of it from the storage media involved, in case the media are lost and subject to recovery measures after an ordinary deletion? Do you realise how much time and money would be involved in doing that, every time a customer decided they didn't want you storing any personal data about them any more? It's totally impractical. There has to be some measure of being reasonable and proportionate in what is required.
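For what it's worth, one common mitigation for exactly this backup problem is so-called "crypto-shredding": encrypt each customer's personal data under a per-customer key that is stored outside the backups, so that destroying that one key renders every backed-up copy unreadable without ever rewriting old media. Here's a minimal sketch of the idea; the XOR keystream is a toy stand-in for a real cipher (e.g. AES-GCM), and the names (`key_vault`, `backup_archive`, etc.) are made up for illustration, not taken from any real system.

```python
# Sketch of crypto-shredding: per-customer keys live OUTSIDE the backups.
# Deleting a key makes all backed-up ciphertext for that customer
# unrecoverable, so an erasure request never requires rewriting backups.
# NOTE: the XOR keystream below is illustrative only; use a real
# authenticated cipher in practice.
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

key_vault = {}       # live key store; deliberately NOT part of the backups
backup_archive = {}  # "immutable backups" holding only ciphertext

def store_record(customer_id: str, personal_data: bytes) -> None:
    key = key_vault.setdefault(customer_id, secrets.token_bytes(32))
    backup_archive[customer_id] = keystream_xor(key, personal_data)

def read_record(customer_id: str):
    key = key_vault.get(customer_id)
    if key is None:
        return None  # key shredded: ciphertext still exists but is unreadable
    return keystream_xor(key, backup_archive[customer_id])

def erase_customer(customer_id: str) -> None:
    # One small key deletion stands in for purging every backup copy.
    key_vault.pop(customer_id, None)

store_record("cust-42", b"203.0.113.7,invoice-2015-001")
assert read_record("cust-42") == b"203.0.113.7,invoice-2015-001"
erase_customer("cust-42")
assert read_record("cust-42") is None
assert "cust-42" in backup_archive  # ciphertext remains in the backups
```

Of course this only shifts the problem to key management, and it doesn't help with data that was written to backups unencrypted before such a scheme existed, which is part of why a pure "delete everything on demand" rule is so costly in practice.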
> That's been shown not to work all that well.
It doesn't work well when the regulations aren't enforced and there are limited meaningful penalties even for the sort of gross negligence that we've seen in cases like the Equifax leak. The idea that what Equifax did was compliant with the current rules is laughable, yet they've barely taken a slap on the wrist for it, despite both the degree of negligence that led to the breach and the scale and nature of the potential damage.
There's nothing inherently wrong with the principle, though. After all, there are many other things that we could do, but which are illegal and most of us don't, and we penalise those who break those laws. Why should this be any different?
> I'm totally fine giving out "secret" data if there's a benefit to my doing so. But if there is no benefit to me, then companies should not be entitled to my data.
But that wasn't the scenario we were talking about. We were talking about a situation where someone legitimately had personal data about you, and you subsequently changed your mind and wanted them to delete that data. From the point of view of someone controlling and processing personal data in legitimate ways, giving you an absolute right to revoke that permission regardless of the practical consequences to anyone else involved is a totally different situation to giving you a right not to be involved in the first place.