The article lists numerous alternatives to replace functionality that third-party cookies currently provide. But is there anything that prevents third-party analytics and similar services from creating libraries or services that are deployed under their clients' domains, and then transmitting the collected data from the client's site to the third-party service's own database?
Is that possibility going away too, or is it still a loophole? I ask because if it remains, it may be easier to implement than some of the alternatives, and it may become more difficult for browser plugins like uMatrix, NoScript, etc. to identify and block these services, especially if they allow the client to customize or obfuscate the embed link.
The point of killing third party cookies is to prevent a tracking identifier cookie that uniquely identifies your browser from being reused across different sites.
So you can of course host your own scripts and run them on your own origin, let's call it site1.com. But when your site1.com page includes a third-party iframe from e.g. googleanalytics.com, and that frame sets a cookie on itself, the cookie is now silently dropped. Then when site2.com later includes the googleanalytics.com frame, the frame cannot immediately tell it's the same browser. There are other ways to "link" a browser across origins, like browser fingerprinting or in many cases just the IP address, but they are usually not 100% reliable.
Blocking third-party cookies is standard, obvious privacy functionality, but Google has held out because it affects their bottom line. So, if I understand it correctly, they had to wait until they implemented something that protects their bottom line (the Chrome-only "privacy sandbox").
If I understand correctly this requires a shift from avoiding certain domains to avoiding certain URLs.
At present, newspaper sites like myhometownnewspaper.com generally have a robots.txt that points to a "sitemap.xml". The sitemap should list all the articles but will not list /allurdatabelongtous.
Ads and analytics URLs do not appear in "sitemap.xml". Maybe some of the ad domains are in ads.txt, if it exists. In any event, it's easy to avoid the garbage URLs. IMHO, a web browser that auto-loads resources is not the best software for retrieving the URLs in a sitemap.
> But is there anything that prevents third-party analytics and similar services from creating libraries or services that are deployed under their clients' domain, and then transmit the collected data from client site to third-party service's own database?
No, because preventing this unambiguously is simply impossible. It is fundamentally always technically possible for your own site to track the behavior of people on your site (who's to say what's "tracking" versus simply serving site page data), and there is no technical way to prevent a site owner from simply sharing their own tracking data with data aggregators. There are of course some heuristics that tracking blockers could use (e.g. "even though this script is served from somesite.com, it looks like the Facebook tracking code"), but that just becomes a cat-and-mouse game of obfuscating the script.
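As a rough sketch of that heuristic, a blocker could match on the script's contents rather than its host. The signature below is illustrative only (loosely modeled on the Facebook pixel's `fbq('init', ...)` bootstrap), not a real blocklist:

```javascript
// Toy content-based tracker detection: flag a script by what it contains,
// not where it is served from. The signature list is illustrative only.
const KNOWN_SNIPPETS = [
  /fbq\(\s*['"]init['"]/, // loosely resembles the Facebook pixel bootstrap
];

function looksLikeTracker(scriptText) {
  return KNOWN_SNIPPETS.some((re) => re.test(scriptText));
}

// Same verdict whether the script is served from facebook.com or somesite.com:
console.log(looksLikeTracker(`fbq('init', '1234567890');`)); // true
console.log(looksLikeTracker(`document.title = 'hello';`)); // false
```

Renaming `fbq` or minifying differently defeats this, which is exactly the cat-and-mouse game described above.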
That's a big reason I feel this "third-party cookie blocking" was oversold. Third-party cookies caught on because they were technically easy (just add a simple JS snippet to a page), but serving that snippet from one's own domain is also pretty trivial, and is indeed what most of the big ad brokers are moving toward.
> serving that snippet from one's own domain is still pretty trivial, and indeed what most of the big ad brokers are moving toward
But in this case it's not as trivial to link usage across two domains. Sure, each domain can serve the snippet from its own domain, but the analytics snippet can't cross reference data from other sites which use the same script (unless of course the domain builds some data sharing mechanism - not trivial for non tech companies).
In which case, why even serve the snippet from your domain when you can use the container mechanism?
Yeah all adtech players are going this way but it's only useful when the publisher has primarily logged-in users. Otherwise there's still no way to link you across sites.
We'll probably either be forced to agree to be tracked by Google's systems or forced to sign in to every page.
There's no chance that anything positive will ever happen with the web again, it has gotten worse at every step for more than a decade. Especially when Google has any involvement with it.
All the things that led to getting rid of flash player have been pretty great. CSS is pretty incredible compared to what it used to be. JavaScript is still a train wreck but web assembly holds some promise there. WebGL is also pretty great.
I’m confused at where we jump to the conclusion that we’ll be forced to sign in to Google’s systems or sign in to every page. How specifically does that come about?
I can see this if you go from site X to site Y where you're logged in on both X and Y, and you can correlate (say) the email. But if you're logged into Y (the store) but not X (some random referrer), how do you attribute the conversion?
No. Cookie banners cover any use of client-side storage. That includes things like shopping carts and personalization (purely first party), analytics ("first party"), and ads (usually "third party").
Part of what's confusing here is that "first party" and "third party" are being used in a technical sense to mean which domain the cookies are set on. If on an example.com page JS from example.net causes a cookie to be set on example.com that's "first party", while if the cookie is set on any other domain that's "third party".
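In other words, the classification depends only on which domain the cookie lands on, not on which script set it. A toy version of that rule (real browsers compare registrable domains, i.e. eTLD+1, not exact hostnames as done here):

```javascript
// Toy classifier for the technical sense of "first party" vs "third party":
// what matters is the domain the cookie is set on, relative to the site the
// user is visiting. (Real browsers compare registrable domains, eTLD+1.)
function cookieParty(topLevelSite, cookieDomain) {
  return cookieDomain === topLevelSite ? "first-party" : "third-party";
}

// JS from example.net setting a cookie on the example.com page: first party.
console.log(cookieParty("example.com", "example.com")); // "first-party"
// A cookie set on any other domain: third party.
console.log(cookieParty("example.com", "example.net")); // "third-party"
```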
Shopping carts require cookies if they go beyond what is "strictly necessary to provide a service explicitly requested by the user". This is quite a high bar, and most shopping cart implementations do not meet it. For example, if you put something in your cart on Monday but then don't purchase, it is not strictly necessary that it still be in your cart should you return on Tuesday to complete your purchase, so holding it overnight requires consent. I wrote more about this in https://www.jefftk.com/p/why-so-many-cookie-banners and the relevant guidelines are section 2.3 of https://ec.europa.eu/justice/article-29/documentation/opinio...
Why would it matter if you take 1 minute or 1 day? Either way, the baseline expectation of functionality is for state to persist from one page request to the next page request, no matter how short or long that interval happens to be, and some kind of local storage (typically a cookie) facilitates that for both cases equally. Is it just to replicate the experience of that for which an online cart is a metaphor: with a physical shopping cart, a store employee will empty it out if it's abandoned overnight unless you explicitly ask them to hold it?
Interesting. I feel like if you decide to share your OS session (and therefore your browser data which goes well beyond that cart: things like the passwords it offered to save!) all hope is already lost when it comes to isolation between users.
I think you're taking the interpretation quite far in your post. I've seen legal departments give approval for a much less strict interpretation, basically on the level of "is this cookie realistically useful for this specific purpose". It would be useful to get some real cases that answer those questions, because on one extreme, "strictly" you don't need any local browser storage at all: the product page can contain a form to print out for a mail order and you can track your cart on paper.
Thanks for the explanation - you learn something new every day.
You should still be able to avoid a banner by having a footnote below the “add to cart” such as “we’ll set a cookie to remember this according to our cookie policy [link]”?
>> You should still be able to avoid a banner by having a footnote below the “add to cart” such as “we’ll set a cookie to remember this according to our cookie policy [link]”?
I don't think that would count as consent under EU standards; consent has to be indicated unambiguously. Clicking an "add to cart" button only unambiguously indicates that the user wants to add the item to their cart, not that they are consenting to (or have even noticed) some footnote below the button.
Most cookies in those popups are third-party cookies, at least for websites with giant lists of third-party cookies that require Consent-O-Matic to properly disable. Websites relying on first-party cookies are usually rather unobtrusive.
No. The cookie banners have nothing to do with cookies per se, they are a consent / information popup required if companies want to store and use personal data beyond what's required to serve you with the website content (GDPR) or they want to store non-essential identifiers on your device (ePrivacy directive).
In other words, you are seeing these because marketing departments need BS metrics that measure nothing and are based on some personal data. The internet can happily exist without them, as proven by GitHub[1].
Some federated identity protocols, including SAML, depend on third party cookies. I've been watching the Federated Identity W3C community group work through some of the issues for a couple of years.
So is google singlehandedly choosing which technologies get implemented and deprecated? What if my website depends on third party cookies to function? Aren't they an IETF standard and so I can expect them to be supported?
I get that third party cookies are just used for tracking nowadays, but shouldn't there be a less monopolistic way of doing so?
In this instance, Google is actually the last browser to deprecate third-party cookies. Apple and Mozilla disabled them in their browsers over two years ago.
I think it’s pretty well established that third-party cookies are a bad idea. They’ve been turned off by default on Firefox and probably most other browsers for a while now. I can’t see when I would recommend turning them on for any user.
Just because something is standard doesn’t mean it never gets deprecated.
“We Are Preparing To Think About Contemplating Preliminary Work On Plans To Develop A Schedule For Replacing third party cookies with chocolate chip cookies”
This will break so many things, that's not even funny. I can already hear customers complain their bank doesn't work on chrome. Maybe this will drive Firefox adoption? (Let me dream please)
It only covers third-party cookies, not cookies in general. They aren't used that much except for ad tracking. For the few other use cases that exist, the offered workarounds are pretty good and more secure than a broadly scoped third-party cookie.
Safari blocks third-party cookies by default and introduced the Storage Access API so third-party sites that do need cookies can request explicit user permission. This is all part of Safari’s Intelligent Tracking Prevention.
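In an embedded cross-site iframe, that permission request looks roughly like this. A sketch of the standard Storage Access API flow (`document.hasStorageAccess()` / `document.requestStorageAccess()`); the request must be made from a user gesture, e.g. a click handler. The `doc` parameter is passed in just so the logic can run outside a browser:

```javascript
// Sketch of the Storage Access API flow from inside a third-party iframe.
// The document is passed in as a parameter so the logic is testable.
async function ensureCookieAccess(doc) {
  // No Storage Access API: fall back to assuming normal cookie behavior.
  if (typeof doc.hasStorageAccess !== "function") return true;
  if (await doc.hasStorageAccess()) return true; // already granted
  try {
    await doc.requestStorageAccess(); // may prompt; needs a user gesture
    return true;
  } catch {
    return false; // the user or the browser denied access
  }
}

// In a real frame, call it from a gesture:
// button.addEventListener("click", () => ensureCookieAccess(document));
```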
Firefox doesn’t block third-party cookies by default because that can break sites that haven’t been updated to use the Storage Access API. Instead, Firefox’s Enhanced Tracking Protection blocks cookies from a blocklist of known trackers, and all other third-party cookies (e.g. new trackers or legitimate use cases) are isolated using Total Cookie Protection (which has the inconvenient acronym TCP).
Total Cookie Protection uses double-keyed cookie jars, so cookies from analytics.example included in an example.com page are placed in a separate cookie jar from analytics.example cookies included in an example.org page. This allows both sites to use the same third-party analytics service, but the analytics service sees different cookies for each site and can’t link the cookies to one user’s browsing behavior.
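A toy model of that double-keying, using the domains from this example (real browsers key partitions on the top-level site plus the embedded origin and do much more; this only models the jar lookup):

```javascript
// Toy model of double-keyed ("partitioned") cookie jars, as in Firefox's
// Total Cookie Protection: the jar is chosen by BOTH the top-level site
// and the embedded origin, so the same third party gets a separate jar
// under each site that embeds it.
class PartitionedCookieStore {
  constructor() { this.jars = new Map(); }
  key(topLevelSite, origin) { return `${topLevelSite}|${origin}`; }
  set(topLevelSite, origin, name, value) {
    const k = this.key(topLevelSite, origin);
    if (!this.jars.has(k)) this.jars.set(k, new Map());
    this.jars.get(k).set(name, value);
  }
  get(topLevelSite, origin, name) {
    return this.jars.get(this.key(topLevelSite, origin))?.get(name);
  }
}

const store = new PartitionedCookieStore();
// analytics.example sets an ID while embedded in example.com...
store.set("example.com", "analytics.example", "uid", "abc123");
// ...but embedded in example.org it sees an empty jar, so it cannot
// link the two visits to one browser.
console.log(store.get("example.com", "analytics.example", "uid")); // "abc123"
console.log(store.get("example.org", "analytics.example", "uid")); // undefined
```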
For example, instead of an embed link like:
https://allurdataarebelongtous.xyz/collectitall
collecting visitor data on:
https://myhometownnewspaper.com/
It's replaced by:
https://myhometownnewspaper.com/allurdataarebelongtous/colle...
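For what it's worth, that kind of first-party path is often just a reverse-proxy rule on the publisher's server. A hypothetical nginx sketch, with the path and upstream carried over from the example above (all names illustrative):

```nginx
# Hypothetical: requests to the publisher's own domain under this path are
# forwarded to the third-party collector, so the browser only ever sees a
# first-party URL.
location /allurdataarebelongtous/ {
    proxy_pass https://allurdataarebelongtous.xyz/;
    proxy_set_header Host allurdataarebelongtous.xyz;
}
```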