GDPR enforcement so far is one of those rare examples where the bigger fish get fined and prosecuted first. (Yes, fines against smaller players happen, but they are less likely on a per-company basis.)
The (very small) company I work at got a warning from our DPA (Germany, SH, so the relevant one is the ULD). I had a nice discussion with our caseworker there, where he also said that for small companies, they are extremely lenient and very happy to work with them to improve things.
I was at a presentation about GDPR in Norway and the implied message was similar. It can be tricky to implement GDPR correctly, but if a small company makes a mistake, the authorities will simply point it out and give a reasonable timeline to fix it before there is any talk of fines.
Bigger companies are supposed to have the resources to implement GDPR properly the first time.
It would be nice if everything worked that way. When it works the opposite way, it's a vicious cycle of letting the biggest fish grow and making them even harder to go after.
GDPR is not the EU regulation on cookies. GDPR only has a single direct mention of cookies. The ePrivacy Directive is what's known as the "cookie law" (https://gdpr.eu/cookies/)
The ePD dates to 2002 but stupid cookie banners only came into being around the same time as the GDPR in 2018. I think you can say the cookie banners are in some technical legal sense ePD's fault, but to laypeople it certainly seems that the GDPR is why every website is less usable and more annoying than it was just a few years ago.
I don't think that's right; I remember joking about cookie banners in the late 00s. Since GDPR they now have buttons and categories, though, whereas previously we never had a choice, just a notice that cookies were in use.
There were banner shaped popovers in the 00s, but to my memory they weren't for cookies. Any chance you're a European? We may have been visiting different slices of the internet in that period.
Ah yeah, that could be the difference between us. The only difference GDPR really made here is that companies went from a single "Confirm" button to dozens of complicated options plus an opt-out.
Imo the law should be structured such that people who care can opt out of cookies programmatically, but everyday people who don't care don't need to see a pop-up on every site. That causes psychic load and makes us hate the internet.
Also, people should get together and vote on what constitutes "strictly necessary" cookies that don't require consent and "non-essential" cookies that do. It's currently up to some nerds in the EU.
Because behavioral psychology tests reveal that the public is both super-paranoid about being spied on and will sell their PII to anyone who asks in exchange for a $5 discount at a store.
It's hard to craft good law when the mismatch between public stated desire and public behavior is so wide.
> that the public is both super-paranoid about being spied on and will sell their PII to anyone who asks in exchange for a $5 discount
You raise an interesting point about this apparent disconnect.
> Because behavioural psychology tests reveal
What do you know about these "psychology tests"? Isn't the reality that both things you assert are really inferences based on anecdotal tales on forums like this one?
Do you know of any large-scale, cross-cultural, long-term studies that show who will part with sensitive private data for small compensation? How are these studies controlled for situation, perceived reward, time pressure, trust levels?
What do we really know about widespread beliefs regarding the misuse of data? What are people's threat models? How do they assess harm?
I'm not disputing the truth of what you say - both ring true.
I think we instinctively know all rational people are strongly anti-surveillance. For complex social reasons we humans dislike creeps and snoops and treat them as social outcasts. But we also know that many people will sell their grandmother for a slight convenience or advantage. Do these two facets really clash, and if so, why?
> It's hard to craft good law when the mismatch between public stated desire and public behavior is so wide.
Is it though?
Aren't both positions reconcilable if people make no causal link between the two? What if people just don't have the capacity to make that connection? Could good laws still protect them in the way that laws about asbestos protect people from harms related to chemistry and disease they cannot possibly understand?
> Could good laws still protect them in the way that laws about asbestos protect people from harms related to chemistry and disease they cannot possibly understand?
They may not understand the chemical mechanisms, but the end effect is super easy to understand: “hey if you breathe this in you get cancer”; “hey if you eat this, you die”
Compared to the threat of something like cookies… “hey if you let these companies spy on you then…?”
> hey if you let these companies spy on you then....
Then you'll suffer multiple digital, and eventually physical, harms:
- You'll lose money on bandwidth that's robbed from you
- You'll lose access to goods and services on the basis of calculated prejudice
- You risk identity theft, impersonation, having loans taken out in your name...
- You'll be exposed to manipulation, targeted disinformation
- You'll experience increased targeted attacks like phishing
- Friends, relatives, spouse and others close to you will also be targeted
- Your time will be wasted dealing with more spam and nuisance
- You will lose CPU, memory and battery life as the concrete resources of your devices are given over to the advertising burden
- You will eventually suffer tangible harms: loss of employment, impacted credit score, loss of medical access...
Sure, it needs more words to explain. Not quite so punchy and succinct as the real deal. But digital cancer caused by these companies' products can well be explained to the average "consumer" if they just stuck an ugly picture on the carton of a destitute homeless guy and the message "Surveillance capitalism can ruin your life".
What does it mean to have bandwidth "robbed" from me?
What do I care if my gigabytes-of-RAM multi-core CPU is burning some cycles? Have those cycles also been "stolen" by, like, articles I don't want to read or useless comments?
Is there more identity theft risk from targeted ads than, say, the LastPass breach? Google, in particular, keeps that data deeply vaulted and protected.
Am I really exposed to meaningfully targeted disinfo relative to self-sorting by watching Fox News? What about meaningfully-targeted info, like Facebook making it easy to find my elected representatives because they know my address?
Do we have evidence of correlation between targeted ads and phishing? Most phishing I'm aware of just uses big popular names and the law of averages... You don't need targeted info on a consumer to guess they use Amazon.
I worry that bans based on this shaky ground would look more like Prohibition than asbestos bans.
If there's no harm, there's no reason to curtail someone's freedom.
It's hard to support an argument that we should change the law if the harm isn't proven, and "someone directed my computer to use more CPU cycles than it otherwise would (after I voluntarily accessed their server)" isn't a story of self-evident harm.
(And that's leaving out the unsupported strong assertions that "you will eventually suffer tangible harms, loss of employment, impacted credit score, loss of medical access," which... What? Loss of employment because companies spy on me? That screams "lack of causality."
As an employed tech worker in the United States, I operate under the assumption that my employer has access to my corporate inbox. Is your assertion that I could be fired for something I say there [in which case the harm is I said something damaging to my company, not that my company knows about it] or some other mechanism?).
> It's hard to support an argument that we should change the law if the harm isn't proven
But it is, shadowgvt my friend, please dig deeper. It absolutely is! Over and again in multiple quality peer-reviewed studies and analyses by prominent security engineers and psychologists...
This "big tech = bad" thing isn't just some matter of sloppy marginal opinion. It's tediously mainstream thinking now. The only real question is what to do about it. We both know that's not easy. But at this point attacking the premise just seems like a waste of time.
A fair place to start is here, with the "ledger of harms" [1]
Additionally, there's a mismatch between what is good for the public and what the public will end up choosing (due to ignorance or carelessness). This is illustrated in your first example.
Good laws will protect the public from themselves, even at the cost of supposed freedoms (think schedule A drugs, production, revenge porn, tobacco).
> It's hard to craft good law when the mismatch between public stated desire and public behavior is so wide.
Some people would also sell their right to vote for $5 if that was legal. It's not hard to make such a law. You could simply make it illegal, e.g., to offer money for PII during a purchase that doesn't strictly require it. Or to not respect the automatically transmitted cookie choice. Those who oppose such laws are the lobbyists of the big companies, not "the people".
The last time a state tried to outlaw "selling your right to vote," the consequence was that they banned people from providing food and water to folks standing in too-long lines at the polls.
This was not good law. Badly-thought-out law has unintended consequences (or even maliciously-crafted intentional consequences at the cost to the public).
My carrier literally bugs me about giving away more and more personal info for 5 bucks every 2 months. They must have heard of the same research. They also ask for consent to receive "advice" and term it "our gift to you". I wonder how many people must fall for it.
It shouldn't be by law, it should just be part of web standards that you can set your cookies preferences once and for all and all websites should respect that setting. I guess browser vendors and whoever else sits in these W3 consortiums like their cookies too much.
This was attempted with the "Do Not Track" header, but that initiative collapsed (lack of vendor support, lack of clarity in the spec regarding what it meant to not "track" someone, and ultimately it got yanked from Safari because it was yet another bit that malicious site operators could use for browser fingerprinting purposes).
The "Global Privacy Control" header is apparently the second attempt to cook this goose.
The fundamental flaw of DNT was the lack of any sort of penalty for non-compliance. There was no penalty for individual advertisers from either the government or industry bodies. There was also no penalty to the advertising industry as a whole in the form of more stringent privacy laws.
Agreed, though the issue is that the very same technology can be used for perfectly acceptable and desired use cases (online shopping cart) or distasteful ones (behavioral advertising). So it's hard to set a strict standard.
The Do-Not-Track header is a perfect solution to this, but unfortunately the largest browser vendor is also the largest advertising network, so the implementation of DNT devolved into a farce.
I disagree that the "Do-Not-Track header is a perfect solution to this", because it implies that the default (i.e. the header being absent) should allow sites to track you. I (and apparently EU legislators) think opting in to privacy is wrong. You should have privacy by default. If anything, there should be a "Creepily-Track-My-Activity-Over-The-Entire-Internet" header, that, when present, allows tracking.
There also used to be another standard called "P3P" that tried to integrate privacy into browser UIs with a more or less standardized interface and that, too, failed. Among other reasons because companies wanting to track people subverted it.
It has nothing to do with the browser vendor being in the ads industry. It's up to the website operator to respect it. If a website operator doesn't like a browser vendor's DNT defaults, it won't respect it, and that is why DNT failed. Regulatory backing could fix it.
The ePD/GDPR are not about technical barriers for bad actors, but about compliance for good actors. Technical mechanisms like DNT would be a reasonable way to get rid of cookie banners while complying with ePD/GDPR.
> Imo the law should be structured such that people who care can opt out of cookies programmatically, but everyday people who don't care don't need to see a pop-up on every site. That causes psychic load and makes us hate the internet.
Make it opt-in programmatically rather than opt-out and we may agree.
Most sites don't need to know who I am to know which parts of their site are most used and/or broken. You can track app usage without tracking personal data.
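To make that concrete, here's a sketch of the kind of aggregate-only measurement I mean (TypeScript/Node, purely illustrative): count hits per path, set no cookies, store no IPs or per-user records at all.

```typescript
import { createServer, IncomingMessage, ServerResponse } from "node:http";

// Aggregate page-view counts only; nothing here identifies a visitor.
const pageViews = new Map<string, number>();

createServer((req: IncomingMessage, res: ServerResponse) => {
  const path = new URL(req.url ?? "/", "http://example.invalid").pathname;
  pageViews.set(path, (pageViews.get(path) ?? 0) + 1);

  res.setHeader("Content-Type", "text/plain");
  res.end(`This path has been requested ${pageViews.get(path)} times.`);
}).listen(3000);
```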
GDPR doesn't require popups, it also allows for programmatic solutions. For some reason (lookin' at you DNT) online advertisement companies, publishers and browser vendors never came together to specify one after GDPR was passed.
Some sites like geizhals.de respond to DNT and only use the bare minimum, but that's the exception, not the norm (for some reason hinthint).
Usually laws don't prescribe some specific technical implementation unless absolutely necessary because they're often not optimal and go out of date quickly.
This may be a stupid question, but I have not seen it discussed, so I will ask it.
Cookies are already under the control of the end-user, so why does the site operator need to ask or worry at all? If the user does not like the cookies, can't they just delete them?
Because the 'cookie law' isn't just about cookies. This is an intentional confusion on behalf of the advertisers. It's really about using cookies for the specific purpose of tracking. There are a lot of uses for cookies and it's not obvious to users which cookies are for which purpose.
DDoS prevention does not need cookies, though they can certainly be useful to attempt to track honest users. Dishonest ones will prevent cookies from being set in the first place, or, if you only use cookies to mark 'good' sessions, will learn to spoof your cookie.
Never rely, for your security's efficacy, on your adversary not doing something to break it. It is the same thing with kernel-mode anti-cheat (which is best handled by the OS, and even then virtualization breaks it) - unlike an antivirus, the target user is adversarial to the application.
For shopping carts, you can have a session token as a query parameter, like jsessionid used to be for ages. It does not help across browser sessions, but a session cookie would not help much there either.
Not sure why the chat bot would need the cookie. To track across sessions? If so, I would posit that this is no longer anonymous access. The identity of the user is defined as the cookie you use to track her.
Technically, a `session_id` in query parameters is just like a cookie (can be used to track you), except that it doesn't persist.
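If it helps, here's roughly what that old pattern looks like in browser TypeScript - a per-tab token picked up from the URL (or freshly minted) and copied onto same-origin links so it follows the user from page to page without any cookie. The parameter name and the rest of the details are illustrative:

```typescript
const SESSION_PARAM = "session_id";

// Reuse the token already in the URL, or mint a fresh one for this tab.
function currentSessionId(): string {
  const fromUrl = new URL(window.location.href).searchParams.get(SESSION_PARAM);
  return fromUrl ?? crypto.randomUUID();
}

// Rewrite same-origin links so the token travels from page to page.
function decorateLinks(sessionId: string): void {
  document.querySelectorAll<HTMLAnchorElement>("a[href]").forEach((link) => {
    const target = new URL(link.href, window.location.origin);
    if (target.origin === window.location.origin) {
      target.searchParams.set(SESSION_PARAM, sessionId);
      link.href = target.toString();
    }
  });
}

decorateLinks(currentSessionId());
```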
I'd consider it a basic feature that if I reload a website, or open it in a new tab, I'm still logged in, the shopping cart is still full, and I see past messages in the chat window.
While technically all of the above can be implemented without cookies (in a very user-unfriendly way), it's so below the expected user experience that I'd argue that cookies are strictly necessary for these purposes.
IndexedDB/LocalStorage can do all that without relying on the server.
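For example, a cart that survives reloads and new tabs can live entirely client-side. A minimal sketch in TypeScript, where the storage key and item shape are made up for illustration:

```typescript
interface CartItem {
  sku: string;
  quantity: number;
}

const CART_KEY = "cart"; // illustrative key name

// Read the cart back from localStorage (empty if nothing stored yet).
function loadCart(): CartItem[] {
  const raw = localStorage.getItem(CART_KEY);
  return raw ? (JSON.parse(raw) as CartItem[]) : [];
}

// Add an item (or bump its quantity) and persist the whole cart again.
function addToCart(sku: string, quantity = 1): void {
  const cart = loadCart();
  const existing = cart.find((item) => item.sku === sku);
  if (existing) {
    existing.quantity += quantity;
  } else {
    cart.push({ sku, quantity });
  }
  localStorage.setItem(CART_KEY, JSON.stringify(cart));
}
```

The server never sees the cart until checkout, so nothing needs to be stored against a visitor identity at all.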
The real shame is that Apple has decided IndexedDB is too powerful to allow outside of their walled garden and will auto-delete it after a week or so. Worse still, Service Worker caches are cleared so offline-available websites break after a week.
The EU cookie law (ePrivacy Directive) isn't specific to cookies. Using local storage does not exempt you from its requirements, and you still need to get consent from users before any use of client side storage that is not strictly necessary to provide a service explicitly requested by the user.
Adding an item to a cart and having it persist is quite directly a service explicitly requested by the user.
And if some EU kangaroo court forces me to click through a “clicking add to cart will save this item in your cart, do you agree to performing the action you explicitly just requested” on every website… I cannot.
> I'd consider it a basic feature that if I reload a website, or open it in a new tab, I'm still logged in, the shopping cart is still full, and I see past messages in the chat window.
That is as if, to the site owner, you were not anonymous.
It's ok for you to want that; the law is there to protect the ones that don't want it. Because, as it is, websites tend to side with you already.
Private Chromium-based browser windows share state between tabs, at least on Windows*. You need to close out all private instances of that "install" of the browser to drop the contents: They are just not saved into a history, beyond that tab's history. Cookies, however, are stored (in memory, I am guessing) until the session closes.
------
* - others might too, but I only have access to Chromium/Win right now.
I'm not quite as technical as you but what do you think of Cloudflare's explanations for why they use cookies for security and load balancing purposes here?
The law says cookies that aren't "necessary" are illegal without explicit consent. Perhaps there's room for "necessary" to include cookies that are helpful technically even if not absolutely the only way things could be done.
Every security boundary is about increasing the difficulty of bad actors succeeding in their desired action. The problem with the cookie one is that it cannot be singularly authoritative, in that your decision of "allow"/"disallow" cannot solely depend on the value of the cookie.
So, to try to construct a way one could do this, there are some constraints on the possibilities:
* Deleting the cookie should not improve chances of "allow"
* There should not be a magic value of the cookie that is known to improve chances of allow
Thus, this should be used as a transient identifier with the TTL set to less than the expected time to discover "good values" to spoof.
The fact that the system needs to be able to work in the case of an adversary deleting these cookies (potentially at the cost of hitting more CAPTCHAs than otherwise) means that I cannot honestly call the cookies "essential" in the colloquial meaning of the word. Another poster explained that there are specific carveouts in the law for this situation.
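To make the "transient identifier" idea concrete, here is a rough sketch (TypeScript/Node; not Cloudflare's actual scheme, just an illustration of the constraints above): the token is only an issue timestamp plus an HMAC, it is honored for a short TTL, and its absence never blocks a request outright - it only sends the visitor back through the more expensive check.

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Illustrative only: the secret and TTL are placeholders, not real values.
const SECRET = process.env.CLEARANCE_SECRET ?? "dev-only-secret";
const TTL_MS = 10 * 60 * 1000; // honor a token for at most ten minutes

function sign(issuedAt: number): string {
  return createHmac("sha256", SECRET).update(String(issuedAt)).digest("hex");
}

// Issue a value like "1718000000000.ab12cd..." to set as the cookie.
export function issueClearance(now = Date.now()): string {
  return `${now}.${sign(now)}`;
}

// Missing or invalid tokens just mean the visitor faces the CAPTCHA again.
export function hasValidClearance(token: string | undefined, now = Date.now()): boolean {
  if (!token) return false;
  const [issuedAtRaw, mac] = token.split(".");
  const issuedAt = Number(issuedAtRaw);
  if (!Number.isFinite(issuedAt) || now - issuedAt > TTL_MS) return false;
  const expected = Buffer.from(sign(issuedAt), "hex");
  const actual = Buffer.from(mac ?? "", "hex");
  return expected.length === actual.length && timingSafeEqual(expected, actual);
}
```

There is no magic value to discover, deleting the cookie only costs the deleter another CAPTCHA, and an old capture expires on its own - which is also why, as noted above, it is hard to call such a cookie "essential" in the colloquial sense.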
I'd be very nervous if I was Cloudflare. They pretty explicitly say that the cookies are not strictly necessary for using the websites.
> cookies are strictly necessary to provide the services requested by our customers
Cloudflare's customers are websites, not users, whereas the latter are protected by Cookie Law / GDPR.
That's exactly why Microsoft was fined (according to [1]):
> Microsoft argued that detecting ad fraud was "strictly necessary" for running bing.com, but the court disagreed, saying that advertising is not a service requested by the user. (point 53 in the full decision).
Nah, Cloudflare is on pretty solid ground. The data protection authorities' guidance is explicit that load balancing and security-related services are safely considered strictly necessary for websites.
What is a "cookie" if not a generalized abstraction for "logged in"?
You seem to be calling for some kind of regulation that defines how identities are stored and managed on the internet. And that sounds really interesting to discuss.
But it's absolutely a much bigger problem than one "should" clause on HN
I think despite the goals of GDPR/cookie law being laudable, it really has turned into a complete trainwreck for the European tech sector IMO.
I work with tech product companies in both the EU and US. EU companies are completely stymied by GDPR/privacy implications. Features are often delayed for weeks or months as they go through endless cycles of GDPR etc. compliance discussion, which often doesn't reach a concrete answer in the end!
Smaller tech companies in the US are on the whole not doing this. They are moving far, far faster because of it. It really is night and day.
Sidenote: EU tech companies tend to be much more risk-averse than US ones in general. This means GDPR really scares leadership there and they often lose the bigger picture.
The worst thing about these laws are that the regional/national courts are all interpreting them differently, often with major implications.
I think we will actually get to the point where many tech products just block the EU from using them. It is impossible to comply with the law IMO with so many conflicting interpretations from regional courts, and if the fines keep ratcheting up it will become uneconomic to serve the EU market.
>The website also lacked a button for users to reject cookies as simply as accepting them, CNIL said, where two clicks were required to refuse all cookies while only one was needed to accept them.
No need for FUD and fear mongering, "oh no it's so complicated, every town in Europe has different laws, what can we doooooo we are doomed someone please help us"
And it's not even difficult. We do it every day. When I go to the shop I take products in my basket, and then I extend cash (or a card) to the cashier. I don't ask them if they are sure they want to accept payment, or offer them to take the money only in the form of 10,000 coins of 2 cents, or throw the money in the air and ask them to catch it, etc.
If American corporations were not run by psychopaths this would not even be a debate. Dark patterns are evil and I hope corporations using them cynically will continue to get punished to the tune of millions of dollars.
I work for an EU tech startup and our experience is entirely the opposite. The trick here is to think about how you use the data before you collect it, and not just collect whatever you get by default, because that's how you speedrun ending up in the courts.
It seems to be more a case of companies not actually wanting to follow the law and do the straightforward thing - not collect personally identifiable information unless absolutely necessary for the functionality of the product/service. Instead they want to continue business as usual, trying to work around the law with as many dark patterns as they can possibly fit into a cookie popup.
I wasn't really meaning this exact case. But, so much time and effort being spent on _cookie popup windows_! I imagine 99%+ of users click accept regardless (even if reject is just as easy to click).
> I think despite the goals of GDPR/cookie law being laudable, it really has turned into a complete trainwreck for the European tech sector IMO.
It's a giant pain for my work for sure. Anything which could potentially have some intersection at some point with European customers has to fully comply with all of these EU privacy regulations.
Usually that means taking an already clunky internal system and adding a bunch of restrictions to it to make it even more clunky. Also banning some useful practices and tooling that me and people like me use for our day-to-day.
That was not what I was saying actually. I was suggesting US employees on my team of all types (not just coders) are restricted by EU laws in what tools and processes they can leverage.
My reason for not wanting to give details is because this could reveal things about my employer which would possibly put me at compliance risk.
Or, you know, maybe companies could just respect the laws in the countries in which they do business. And if they disagree with those laws, they can either ignore them and pay the fine, or simply choose to not do business in those countries.
This regulator approach is pretty common in western Europe, so it's hard to choose which country to exit between France, Germany, or Italy. And worse yet, each has its own pet peeve that it enforces - France with cookies, Italy with contract language, Germany with employee monitoring.
The law's intention was totally fine, and if browsers and sites had implemented it with, e.g., a more DNT-like approach, or just a default opt-in rather than an annoying opt-out approach, everyone could have benefited. But of course ad companies and everybody else snapped in and needed to annoy everybody and show how bad and unreasonable this whole bullshit implementation now is (which it really is; goals achieved on all sides - except for customers, who are still mostly tracked and now also annoyed..)
> Can you tell me how GDPR has improved European lives over say a New Yorker?
My spam disappeared overnight. There was a flood of "click here to give consent" emails a little before GDPR came into force, and then radio silence. I now only get the low-effort botnet viagra emails that are trivial to identify and catch, and even their amount has dwindled to only 5 this month. I guess most of them fail SPF/DKIM/etc tests and don't get delivered at all.
Same with snail mail. I used to get a pretty large pile of all kinds of ads every week, now I get only invoices, official notices and postcards from friends.
In online services like Youtube, I don't have to click through menus to opt-out of tracking anymore. There's now an equally visible opt-out button on the consent dialog. When I visit sites like USA Today, I get a much nicer version without tracking, ads and auto-playing videos: https://i.imgur.com/YQXaavw.jpg
You can request job interview data from companies, or data about them blacklisting you, etc. GDPR isn't just internet data, it is all data.
Do you think it is good that your healthcare data is secret in the USA? If so, why just healthcare data? If a woman buys pregnancy-related items, and that company sells that data to everyone else, now everyone knows she is likely pregnant. Why should that be completely public and free for all? GDPR prevents that.
I want to be able to opt out of tracking. GDPR has improved my life.
Websites that want to make it easy for me give me a “reject all cookies” button. It is not the fault of GDPR that most websites want to trick you into just clicking “accept all.”
The EU has every right to regulate companies that want to operate on European soil. Just as American companies have every right to stay out of the European market if they don't want to comply.
As we see, despite the fines, they choose to stay, which means they still profit.
>Release open source software with license specifically prohibiting European companies from using them.
Even if this happened (which it won't, see sibling comments), it would have to be enforced within the European courts. The EU could easily just change its laws or choose not to enforce it.
And do the same for copyright of American media. Congrats! Now you have started a trade war and Russia and China are laughing in the corner.
Maybe those american companies simply shouldn't track the users and follow the local laws?
The majority of GDPR fines are avoidable if websites just use GDPR-compatible tracking banners, but they decided to break the law => consequences.
Well, then someone else will be happy to replace them and earn the money instead :-) There's already a French search engine company that could take the Bing users (Qwant).