Until some lawsuits happen, that is. And the EU has been more than happy to collect fines.
"The fine was imposed on a soccer coach who had secretly filmed female players while they were naked in the shower cubicle for years."
I think the previous poster's point stands: GDPR enforcement is doing almost nothing about improper cookies and internet tracking.
Why do you say this? At least ours is just swamped, so it'll take a while. Plus big cases take up a lot of resources. But there have been GDPR fines issued, and many many more warnings that might lead to fines if the company doesn't comply.
Plus, the swamping problem will stop because the first high-profile case will scare everyone else into compliance.
Not really, because Google and Facebook will also fight any accusation tooth and nail in the courts and through lobbying. The case must be very well documented and defended.
I'm not saying you're wrong, but it's too early to tell.
Once this is enforced properly, the web will be a better place. Right now it's a mess of cookie banners with no real function, that people just click OK to.
Which simply can't happen. There's a reason websites aren't manually indexed, and that same reason means you can never fully enforce these laws. What you can do, though, is score political points by selectively applying them to large unpopular players. Everywhere else it's always going to be a mess.
Other actors realize that noncompliance is more expensive (when multiplied by the risk of being caught) than compliance. Done.
No need to process thousands of violations. All that's needed is a few dozen with lots of publicity.
The GDPR opened the floodgates; the wheels are in motion, and I know of multiple SMBs who got warnings to cut out their practices.
The part-time manager of a three-story building posted a "list of debtors" in the lobby (which contained false information, though that wasn't relevant for the commission, which focuses on the privacy aspect).
- Use the ill-fated Do Not Track header, or something similar to it, for anonymous users. The user decides his stance on privacy, his machine makes it known to the server, and the server acts according to the user's wishes. No further user interaction required.
- Decide what to do if the user authenticates. Use the method above, or offer a more granular way to control privacy settings, configured by the user in his account settings.
Though that would basically kill analytics. The reasonable default for such a header is no tracking. The controversy around the existing DNT header, and at least Microsoft's attempt to set it to "no track" by default, is "enlightening".
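As a minimal sketch of the header-based approach described above: a server could gate all analytics on the DNT request header, defaulting to no tracking. The helper name and the default-deny choice are assumptions for illustration; header keys are lowercase, as Node.js presents them in `req.headers`.

```javascript
// Decide whether tracking is allowed based on the DNT request header.
// "DNT: 0" is an explicit opt-in to tracking; "DNT: 1" or a missing or
// malformed header is treated as "no consent", so the default is
// privacy-preserving.
function trackingAllowed(headers) {
  return headers['dnt'] === '0';
}

// In a request handler this could gate any analytics call, e.g.:
//   if (trackingAllowed(req.headers)) { recordAnalytics(req); }
// where recordAnalytics is a hypothetical analytics hook.
```

No further user interaction is needed: the browser sends the header on every request, and the server simply respects it.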
In the meantime I installed the Firefox add-on "I don't care about cookies", which does a reasonably good job of removing these annoyances.
- just don’t collect anything, don’t even require cookies
From there, every significant actor in the industry, short of Firefox and Apple (perhaps Microsoft?), just went looking for other ways, workarounds, anything to keep their business as close as possible to what it is now.
That goes as far as completely cutting off whole swaths of users just to avoid bending to the rules.
Blame the players cheating and conspiring to bend the game and ignore or weaken any attempt to limit their reach.
And instead of the cookie popups it would have been much better to solve it the same way localization and notifications work: the browser asks users with an integrated dialog and the user can set it to not even show that popup in the settings. If the browser doesn't support that feature the website has to assume the user doesn't consent.
And regarding the option to set your preferences at the browser level: of course this is the best possible solution, but if you are following the ePrivacy regulation discussion, Article 10 (permitting browsers to obtain consent for websites with a standardized interface) is pretty close to being killed. Money talks.
Perhaps this will lead some advertisers to attempt sketchy things like server-side application integration so that their cookies 'appear' to be first-party. Either way, the policy has teeth and can apply fines the same way the GDPR can, so any advertisers (or services themselves) found to be storing cookies without consent that are not strictly for site functionality may find themselves in hot water.
I'd be willing to bet that less scrupulous marketers who make a decent chunk of their revenue from users they mislead into clicking / purchasing goods (i.e. targeting less skeptical users) will also attempt to get their audiences to lower their cookie settings. Think banners with content such as 'to get access to this special deal, we need you to update your settings'.
Note that it's completely possible to build rich web applications that don't use any cookies at all, especially nowadays with localStorage and all the infrastructure for progressive web applications.
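As a hedged sketch of that point: a "session" can live in Web Storage and be sent explicitly with each request, with no cookies involved. The key name and the Map-backed stand-in below are assumptions for illustration; in a browser you would pass `window.localStorage`.

```javascript
// Cookie-free session sketch: keep an auth token in Web Storage and attach
// it explicitly to requests instead of relying on a session cookie.
// `storage` is injectable so the same code works with window.localStorage
// in a browser or with an in-memory stand-in elsewhere.
function makeSession(storage) {
  const KEY = 'authToken'; // arbitrary key name, an assumption here
  return {
    save(token) { storage.setItem(KEY, token); },
    load() { return storage.getItem(KEY); },
    clear() { storage.removeItem(KEY); },
    // Headers to attach to fetch() calls in place of a session cookie.
    authHeaders() {
      const t = storage.getItem(KEY);
      return t ? { Authorization: 'Bearer ' + t } : {};
    },
  };
}

// Minimal in-memory object with the same interface as localStorage,
// handy for tests or server-side code where Web Storage doesn't exist.
function memoryStorage() {
  const m = new Map();
  return {
    setItem: (k, v) => m.set(k, String(v)),
    getItem: (k) => (m.has(k) ? m.get(k) : null),
    removeItem: (k) => m.delete(k),
  };
}
```

Since nothing is sent automatically by the browser, there is no third-party tracking surface here: the token only travels where the application code explicitly attaches it.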
It's also worth noting that cookies were controversial when they were originally introduced - it's not like they're some fundamental infrastructure that we've always relied upon. Here's some privacy and cookie advice from 1998, for example: https://web.archive.org/web/19980210083135/http://internet.j...
His talk brought nothing new if you follow him; maybe it brought the issues to the attention of a wider public (it was all over the news in Portugal).
On the other hand I feel his presence could serve to whitewash the whole thing. How many companies represented there would be out of business if they embraced Snowden's beliefs?
If you allow conditional tracking, and invite workarounds, we are in a race to the bottom. Ethical players are caught in a position of play dirty or die. When growth/profits sag a bit, they will have no riposte to the board member who questions their unwillingness to engage in cutting-edge gamesmanship to get around the law.
On the other hand, if you make it strictly illegal, and make the penalty an existential threat to the company, then everybody can play a fair game. The board member who pressures the company to do such things will be putting the company at risk, and the ethical folks have their response.
Advertisers don't need to track people. They don't need our personal information.
Case in point: Alphabet just bought Fitbit. Fitbit knows your most intimate details. They know when you sleep, when you are awake, where you are. They know when you go up and down stairs. I'm guessing they have a pretty good idea of when you make love, where you make love, and with whom (if you're both wearing one). And they just sold their data collection to a company that exists to exploit your data.
This needs to be stopped. Now.
The collected data is open to any imaginable abuse in the future, just as if the motivations, in addition to the actions themselves, had been extremely malicious. Since they (the big tech companies and the savvier small ones) default to collecting everything they can and never deleting it if they can avoid it, stopping this is also pressing for very real, very serious safety reasons.
He's totally right imho, but to defend the GDPR at least a bit: It does have the concept of "consent" so that data should only be collected if people agree to it being collected. However, I still have my doubts that it works like that in reality for several reasons:
- Too many websites place tracking cookies first and then let you disable them
- Too many apps use some kind of analytics without any consent
- The GDPR has different mechanisms to give companies a "legitimate interest in collecting data". How this is enforced is kind of unclear.
- The GDPR issued fines in the past, but what's gone is gone. It maybe helps that companies stop collecting more data in the future after they were caught, but you, as an individual, are still screwed.
So the only way, indeed, is to stop SOME data collection in the first place, and to do it yourself. And you can certainly forget cookie banners and all that junk. The only thing that works is:
- don't sign up to abusive services
- use tracking protection on the web (uBlock Origin)
- possibly use something like pi-hole to prevent tracking for all your devices and apps
But of course, this doesn't stop data collection where you really have no choice but to agree to something.
But yes, until the situation has stabilized and someone has gotten very badly hurt from a financial perspective, the only sensible thing is to block non-HTML content by default and only whitelist the stuff you actually want to run.
Under GDPR, unless the website owner can prove a legitimate interest (which is pretty hard to do and doesn't work for advertising), this is illegal.
> - Too many apps use some kind of analytics without any consent
Unfortunately, usage analytics can qualify as a legitimate interest, but the actual data used and the purpose matter. You might be able to drop a tracking cookie to improve your service, but AFAIK collecting health data, GPS location, or other sensitive information won't fly for analytics without explicit consent.
And we are talking about first party analytics only. Having analytics cookies dropped as part of an advertising network without consent will not fly either.
> - The GDPR has different mechanisms to give companies a "legitimate interest in collecting data". How this is enforced is kind of unclear.
Not sure what you mean by "enforced", but there are rules for establishing a legitimate interest and just because companies claim they have it, doesn't mean that they actually do.
It's basically up to the data protection authorities to do their jobs. Give them some time, there are a lot of lawsuits already.
How does this work in practice? Sites just say "we collect data, click OK to accept". Where's the option to say no?
This was even reinforced with a court decision recently: https://www.technologylawdispatch.com/2019/10/cookies-tracki...
That's not compliant with the GDPR, plain and simple. For example, ads on a website are not required for that site to work (even if they are its only revenue), so the site cannot store data only to track users and show ads. A compliant site can say:
"if you want to allow tracking cookies to get more relevant ads you can do so in settings".
I.e., "no" must be the default in case of non-acceptance, and the site must still function.
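That default can be made explicit in code. A minimal sketch, assuming a hypothetical consent record stored by the site's own banner:

```javascript
// Default-deny consent check: tracking cookies are only allowed after an
// explicit, affirmative opt-in. A missing record, a dismissed banner, or
// anything short of `trackingAds: true` means "no", and the site is
// expected to keep working either way.
function mayTrack(consent) {
  return Boolean(consent && consent.trackingAds === true);
}
```

The point of the strict `=== true` comparison is that any ambiguous state (no record yet, a truthy string, a dismissed dialog) falls through to "no tracking", never the other way around.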
Hopefully they will be fined huge amounts of money.
That's an absurd argument; any data can be leaked. Your bank data can be leaked (mine was), so should banks be banned from holding any personal data for that reason?
Obviously, if fewer services collect data, the overall chance of lots of data being leaked goes down.
I'm not arguing about whether data can be collected with wrong intent.
I'm arguing that the fact that it can be leaked isn't an argument in itself because everything can be leaked.
Now you’re saying banks have to collect data, among other things.
So you do agree data does have to be collected at some level?
Please also note that I basically repeated what Snowden said in the video.
- It protects everyone equally.
- It shifts the incentives towards not collecting, since the data becomes a liability.
- It empowers citizens to know what was collected, through the use of GDPR requests. It further allows the citizen to know what that data was used for.
- It allows citizens to ask the collecting entity to remove any and all data they have about them.
- Finally, it allows legal recourse when data that was never supposed to get collected inevitably leaks.
So there are multiple things the GDPR does that a purely technical solution does not provide. As usual, the best protection is to have both. Used uBlock, didn’t give consent, but still suspect foul play? Send the company a GDPR request and, if data was collected, ask for it to be deleted (and additionally report it to your GDPR regulator).
> - Too many websites place tracking cookies first and then let you disable them.
I’m pretty sure both of those are straight up illegal, thanks to the cookie law and the GDPR.
Love those cookie warnings. We need more popups like that. Imagine all the quality time spent clicking, knowing, for sure, that you are getting a cookie.
Article 5 of GDPR specifically deals with the reduction of what's collected and the destruction of data after it has served its purpose. The legislators were well aware that what isn't there cannot be lost, stolen or misused.
Not collecting personal data in the first place is always the preferred scenario.
Not even state actors with unlimited resources (that's you, NSA) can prevent stored secrets from leaking. It's ALWAYS something, whether teenaged hackers or far-flung contract system administrators with too much access (that's you, Mr. Snowden).
Rule 1. Don't collect data you don't need.
Rule 2. Don't store data you don't need.
Rule 3. Assume all data you store will leak, according to Murphy's law (at the worst possible time).
Rule 4. Make sure your stored data has limited utility. EternalBlue (hi again, NSA) was not such a secret.
Rule 5. Make sure your stored data has limited useful lifetime. US Social Security numbers do not have limited useful lifetime. Strangely enough, credit card numbers do have limited lifetime.
Rule 6. Do your best to set up leak detection. For example, seed your financial secret caches with fake social security numbers that raise flags when used.
Rule 7. See rule 3.
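Rule 6 can be sketched as a honeytoken check. The fake-identifier format and the alert hook below are assumptions for illustration, not any particular product's API:

```javascript
// Leak detection via honeytokens: seed the data store with a few fake
// identifiers ("canaries") that no real user has. Any lookup or use of
// one of them means the stored data has leaked, so the alert hook fires.
function makeHoneytokens(fakeIds, onTripped) {
  const canaries = new Set(fakeIds);
  return {
    // Call this wherever stored identifiers are read or submitted.
    check(id) {
      if (canaries.has(id)) {
        onTripped(id); // e.g. page security, rotate credentials
        return true;   // this identifier was a planted canary
      }
      return false;
    },
  };
}
```

Because the canaries are indistinguishable from real records to an attacker, any use of one is a high-confidence signal that the cache was exfiltrated, per Rule 3's assumption that it eventually will be.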
Secrets should be stored under the legal concept of strict liability. They're just like bulls in a farmer's field. If the bull escapes and causes damage, the farmer pays for it. No excuses. No need to prove negligence.
We have, up and running at scale, workers' compensation and the vaccine injury fund. Both of those assume strict liability. A dangerous factory sees its premiums go high enough to put it out of business. Same for a sloppy vaccine manufacturer.
Why can't NSA and Equifax be held to the same standard? (I'd hate to be POTUS announcing a tax increase to cover the damage caused by Eternal Blue.)
PHB: "We might need it someday for analytics, just keep storing it."
* Engineer, PHB leave company *
New Engineer: "Hey does anyone know what this data is being used for?"
New PHB: "No idea, just keep it running. Don't want to break anything."
* New Engineer, new PHB leave company *
New new Engineer: "What's all this data for?"
New new PHB: "No idea, just keep it running. Don't want to break anything."
But how about this: Security auditor: "What's all this data for?"
PHB: "I don't know, it's the way it's always been done here."
Security auditor: "Your data retention policy doesn't pass ISO 27001 (or PCI or whatever). No certification for you."
Cyberinsurance company: "We're tripling your rates because you aren't certified."
CEO: "PHB, deal with this problem."
At least in theory, the GDPR is meant to restrict collection, not just how the data is used and stored. It has a big loophole in allowing vague "business interests" to be taken into consideration when deciding whether collection is legal or not.
More than that, the GDPR clearly establishes ownership of PII and asserts that owners have the right to request information about how their data is handled as well as demand that data be destroyed, exported or corrected.
And while we're at it, let's nuke the data brokers. Preferably literally.
It helps frame the response a bit more in the context of the question the host asked. (~"is GDPR the panacea?")