> Total Cookie Protection makes a limited exception for cross-site cookies when they are needed for non-tracking purposes, such as those used by popular third-party login providers.
It would be great to have some more details about it: in particular, how do I turn this exception off if I prefer to add any exceptions manually?
Edit 1: Mozilla Hacks blog [1] has a bit more but still doesn't answer the question:
> In order to resolve these compatibility issues of State Partitioning, we allow the state to be unpartitioned in certain cases. When unpartitioning is taking effect, we will stop using double-keying and revert the ordinary (first-party) key.
What are these "certain cases"?
Edit 2: Reading on, there's this bit about storage access grants heuristics [2] linked from the blog. But is that really it, or is there a hardcoded whitelist as well? If so, it'd be great to see it.
This bit in particular is ambiguous about how exactly it's supposed to work (who is "we" here?):
> If we discover that an origin is abusing this heuristic to gain tracking access, that origin will have the additional requirement that it must have received user interaction as a first party within the past 30 days.
(I’m one of the developers of this feature and co-author of the blog posts)
This is a great question, and I'm glad you found the answer. As you probably understand, for many blog posts we avoid going into too much technical detail.
To answer your final question, there is no hardcoded allow-list for State Partitioning. The heuristics as described on MDN are accurate.
Have you considered using something like Expounder (https://skorokithakis.github.io/expounder/) in your posts? (Disclosure: I made it, but it's a small open-source lib.)
I don't see why we can have full-blown web apps but our text needs to be very specifically just text these days.
Thank you! I used to use footnotes too, but I didn't like how they took you out of the flow of the text. Expounder aims to specifically let users stay in the flow of reading, which is why one of the core instructions is that the text should work in context, as if it were never hidden.
It's good to see experiments along these lines. I really like Wikipedia's recent-ish rich tooltips on link mouseover, and the HTML <summary>/<details> elements deserve to be more widely known.
From the demo it looks as if Expounder is one-way: once you've expanded something, you can't collapse it again. Is that correct?
I miss footnotes on the printed page because, in addition to references (where they're probably better as endnotes, to be honest), I find they're great for parentheticals that bulletproof a point, add some background that's not essential to the point being made, etc. But these latter uses work significantly less well in a blog post or ebook.
What I dislike about footnotes like that is that they pollute the browser history. If you want to leave the page but clicked on a few footnotes and their backlinks, you have to go “back” through all of them.
Thank you so much for posting gwern’s sidenote article! I want to use sidenotes on my site and this was a very valuable resource!
The back button usually comes with an unfoldable list of jump points.
I am more annoyed by how jump points are rendered useless by so much JavaScript out there that loads new content without affecting the browsing history.
I love this, but I'm a bit surprised that you do not include the ability to "unexpound" an "expounded" term. Is that intentional?
If I were reading a technical text, I would definitely end up reading most paragraphs at least twice. It would make no sense to keep the expounded terms in the second time; I'd be tempted to hide them back as soon as I was finished with them the first time.
It's because, once clicked, the new text should become part of the old, and that's it. Presumably you've already read it, and I don't want to make the viewer have to re-collapse the links every time.
Your use case makes sense, though, which is why the feature was included. Maybe I should mention it in the README.
I think collapsing would also be useful when all you need is a quick reminder, not a full explanation. Like "What's that again? [click to expand] Oh that's right [click to collapse]". That's easier than finding the place to skip to.
Hi, can you consider adding some accessibility to the library? Currently, I don't have a way to know that a term can be expanded, because the signal seems to be visual-only and not detectable via a screen reader. Adding aria-pressed might be the solution, but I'm not an expert, just a user.
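For what it's worth, disclosure widgets conventionally use aria-expanded rather than aria-pressed. A minimal sketch of the idea, with made-up attribute names rather than Expounder's actual markup:

```js
// "data-expounds" is a hypothetical attribute standing in for whatever
// markup Expounder really uses; the ARIA pattern itself is standard.
document.querySelectorAll("[data-expounds]").forEach((trigger) => {
  trigger.setAttribute("role", "button");        // announce as interactive
  trigger.setAttribute("tabindex", "0");         // make keyboard-focusable
  trigger.setAttribute("aria-expanded", "false");
  trigger.addEventListener("click", () => {
    trigger.setAttribute("aria-expanded", "true");
  });
});
```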
I feel like the inserted text should be highlighted with a light yellow background or some indicator. Just appearing like that inline seems a bit funky or unexpected.
But I see there is a css class which is nice.
Just a simple rgba(x, x, x, 0.5) where the x's are the usual yellow values.
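Something along these lines would do it; the class name below is a placeholder for whatever class the library actually applies, and the fade is optional:

```css
/* Placeholder selector: use the class Expounder really adds. */
.expounded {
  animation: expound-highlight 2s ease-out;
}
@keyframes expound-highlight {
  from { background-color: rgba(255, 255, 0, 0.5); } /* soft yellow */
  to   { background-color: transparent; }            /* fade back out */
}
```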
Why use this instead of footnotes? For example, in the Feynman lectures below, the footnotes and references to formulas and images activate when you hover over them. These footnotes can even include graphics and formulas.
To me, footnotes serve a different purpose, e.g. linking to papers, like the Feynman lectures site does. Expounder is more about indicating that you don't know something, so the text itself can change to accommodate you.
It should animate the text while unfolding, but, other than that, there's no need to know what was unfolded. You just click what you don't know and eventually read the relevant info!
Not the author, but presumably you're overlooking the fact that the expounded term doesn't necessarily have to be "inside" or even "neighbouring" the details element.
The author's intent here is to have terms explained in the text explicitly in such a way that it would 'augment' the text with an explanation somewhere further down the line, but not necessarily "in-place".
It is also intended for text specifically, rather than replacing one element with another.
I agree that details/summary are similar in spirit, though; I had not come across those before.
Awesome. Just a heads up, I've already finished it and just submitted it. HOWEVER, the plugin has to be licensed as GPLv2, but it shouldn't affect your license (since it's just using your code as a library). I'd feel better about it (and it will probably be smoother sailing during the review process) if I could submit your names as authors on the plugin.
Is there support for an expound-all button on a page? I definitely have days where I just want to also read the details and don’t want to click a dozen times while I’m reading.
Not currently, but it shouldn't be hard to add a button with one line of JS to add the required CSS class to all the elements. This might defeat the purpose, though, as it's kind of intended to save you from reading things you already know.
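A sketch of that one-liner, wired to a button; both the selector and the class name are guesses rather than Expounder's documented API:

```js
// Hypothetical attribute and class names: adjust to the library's real markup.
document.getElementById("expound-all").addEventListener("click", () => {
  document.querySelectorAll("[data-expound]").forEach((el) =>
    el.classList.add("expounded")
  );
});
```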
This should have always been the only way it worked.
Plus, it should be easier to create whitelists of allowed websites and delete all other cookies on every browser restart. I know it is possible with Firefox, but you need to add websites to the whitelist manually in deep settings. At least there are some extensions that make it easier, like CookieAutoDelete https://addons.mozilla.org/en-US/firefox/addon/cookie-autode...
I would like something like this: each site by default gets a bucket by name.
If cookies from one bucket would be shared with other sites, or might be seen when requested by a cross-site load from another site, ask the user a four-choice question.
"Allow (site) to see cookies from (site)?"
Always Allow, Just this time, Ask later, Always Deny
What I wonder about/am concerned by is how one can decide what counts as legitimate use.
This also sounds like a possibility for discriminating against small players with legitimate uses (similar to Microsoft's SmartScreen).
It would be great to know how those concerns are handled.
I agree I wish they had more detail about the exceptions.
I've been an FPI user for years as a best effort to rein in tracking, but there are a common few sites that just break with FPI (50% of the time PayPal checkout doesn't work). Even if "Total Cookie Protection" is only 98% as effective as FPI, I'm making the switch.
Yes, it’s essentially that, FPI with workarounds for common breakage. You should switch from FPI, this is essentially another take on FPI by some of its original developers, so it should have fewer issues overall, not just site breakage.
It will be interesting to see how many sites break with "Total Cookie Protection". Currently I use what I consider the bare minimum of anti-tracking, that is, what I can make Firefox provide on its own, plus the DuckDuckGo browser extension. Those two things alone break an alarming number of sites. The DDG extension is pretty regularly mistaken for an ad-blocker.
Given Firefox's low adoption, I fear that website owners will just ignore that their excessive tracking breaks their site in Firefox... "Works in Chrome... good enough."
I have strict tracking enabled in Firefox as well as uBlock Origin and I've yet to see a site broken. The only "broken" ones I've seen are badly coded ones that also fail to work in Chrome. Reputable sites tend to be just fine. YMMV.
> Total Cookie Protection makes a limited exception for cross-site cookies when they are needed for non-tracking purposes, such as those used by popular third-party login providers.
Facebook and Google will be excepted? This makes it a joke, sadly.
This is basically Google (Chrome) paying Mozilla (Firefox) to kill 3rd party cookies because Google has a better way to fingerprint users without 3rd party cookies, because they have SO MUCH data about us.
This move is aimed at killing other AdTech companies which rely on 3rd party cookies to track users.
They're painting this as a 'PRIVACY' move, after they have already found other ways of tracking users across websites and devices.
I wish there were something better than cookies for these use cases. But then, designing something that can't be abused for tracking yet empowers all the legitimate use cases is also really hard, maybe even impossible.
The question is how to use "Total Cookie Protection" without any hardcoded or heuristics-based exceptions.
Your answer seems to be about how to turn off "Enhanced Tracking Protection"/"Total Cookie Protection" or parts of it (resulting in weaker protection). I want to keep it enabled and disable the exceptions (for stronger protection), i.e. the opposite.
I haven't installed the new version yet, so can't say for sure, but as far as I know there is no setting for this in that menu. [1]
If I misunderstood what you meant, please elaborate.
There's a lot of comments in here about how it's bad that cookies haven't always worked this way, but a significant amount of web content to this day still requires third-party cookies to work. And I'm not talking about cookies that are designed for analytics purposes; the discussions here where concern is raised revolve around simple things like logins breaking.
For greenhorn web developers, you could say the same thing about TLS certificates. Why weren't they always free?
Well, another reason is because TLS (and formerly SSL) wasn't (weren't) just about encryption, but about a "web of trust." Encryption alone isn't trust.
Many things about web technologies have changed over time; and it's easy to say that any individual piece of functionality should have worked this or that way all along, but the original intent of many web features and how those features are used today can be very different.
One day industry standards may dictate that we don't even process HTTPS requests in a way where the client's IP address is fully exposed to the server. Someone along the way might decide that a trusted agent should serve pages back on behalf of a client, for all clients.
After all, why should a third-party pixel.png request expose my browsing on another website?! How absurd, don't you think? And yet we do it every day.
> Well, another reason is because TLS (and formerly SSL) wasn't (weren't) just about encryption, but about a "web of trust." Encryption alone isn't trust.
Which is a nice principle, but given corporate and government incentives, the trust provided was lackluster at best. The PKI is pretty much broken because of it.
In the end, all it did was incur an unaffordable cost for hobbyist bloggers and other netizens.
You used to be able to simply install a Firefox extension[1] or Android app[2] and automatically steal the accounts of everyone on your wifi network on every website. https stopped that.
Widespread HTTPS did that. Firesheep motivated the big players to stop cheaping out and go fully HTTPS, unlike earlier approaches that used HTTPS for login pages only, but it also took Let's Encrypt for HTTPS to become truly widespread.
Yeah, in the end it's silly that we ended up with "trust" meaning only "you're connected to someone that controls the domain", which doesn't actually need PKI to accomplish if we just supported an SRV record with the public key(s) and verifiably authoritative DNS queries.
Which, fair, is trading one PKI for another, but web servers vastly outnumber authoritative DNS servers. And DKIM gets along fine without it, so we probably could too.
Well, there is nothing that makes it impossible to build logins and the like without 3rd-party cookies. Yes, there are certain patterns out there that use them, but slowly turning 3rd-party cookies off and giving major sites time to adapt might help to dump 3rd-party cookies completely one day.
I think the whole idea of sharing cookies across origins was a conceptual mistake right from the beginning, because it is also responsible for quite a lot of security vectors which had to be fixed by other mechanics like the SOP (Same Origin Policy) which in turn required mechanics like CORS (Cross Origin Resource Sharing).
And with all those mechanics in place, modern browsers are pretty tied up and significantly reduced in their abilities compared to other HTTP/S clients. So when you want to build a PWA (Progressive Web App) that can use a configurable backend (as in federated), you will run into all kinds of problems that can all be traced back to the decision to share cookies across origins.
My point is you can make these arguments all day. Why do we allow iframes? You could argue web servers should simply communicate among themselves and serve subdocuments to clients.
Why is HTTP/2 Server Push being rescinded?
Why do user agents not provide additional types for <script> based on runtime installations?
Why isn't there a renderer API that allows me to use painted trees in <canvas>, but there is a bluetooth API that no one uses?
I am not sure I get your point then. I think it is important to see the patterns of bad decisions in the past to improve decision making in the future.
That those mistakes were made not deliberately but with good intentions is a completely different story, and that in hindsight everything looks so clear is also well known ;-)
> a significant amount of web content to this day still requires third-party cookies to work.
Not in the corners of the web I frequent. I've been blocking 3rd party cookies for years and the only site that's broken was some Pearson online homework site.
A lot of IDPs break. For example, any website that presents "Login with Google" will not work, or will require a reload after completing the auth flow before the login is accepted.
This isn't simply "blocking third party cookies", it's "even an iframe has no access to the other state partition". The third-party cookie is allowed to exist, but it cannot leak to other sites. However, this leak prevention breaks plenty of other things if one is not careful (Mozilla was careful; there is a heuristic).
Because it breaks a lot of things like SSO providers (although I completely agree with you, screw that, make it the default and add exceptions as necessary like Mozilla is doing now).
I've had third party cookies completely disabled for years, and first party cookies only allowed by exception. It works fine on everything I use except for whatever it was Atlassian were (are?) doing with their very odd collection of about two dozen domains they round tripped through on authentication.
To be honest though, browser fingerprinting makes this mostly irrelevant unless you carefully use a script blocker with a whitelist too. Any domain that includes trackers that drop third party cookies almost certainly includes scripts that can fingerprint you and send results to a server without using a third party cookie.
(A bit OT)... which is why I consider SPAs to be complicit in 'evilness'. All these webpages that require JS for no real reason are generally making the web insecure, implicitly hostile, and difficult to navigate. Very few have the mental overhead to evaluate each site, so most just let any page do whatever it wants. Tracking and miners be damned.
This is just my hunch, as I work in analytics and deal with cookies a lot, but both Salesforce and Atlassian appear to intentionally accept the third-party inconvenience because their products are enterprise (you have to log in for work) and they rely on upsell/cross-sell across products which they host on different top-level domains. So forcing the third-party cookie helps immensely with their sales and retention, and doesn't hurt usage, because it's often required for work, and if you need to work around it, you usually can find a way if you are so inclined.
If they had used the same domain for their products historically and just separate subdomains they wouldn't have to make this trade off, but it probably also helps with third-party ad networks/segmentation to get folks to turn it on anyways.
Weirdly, for me Atlassian doesn't work when I have spoof referrer enabled in about:config. Why does the referrer, a mere header, determine whether my login is valid or not?
I've worked on (non-Atlassian) SSO projects where the provider used the referrer to send the client to the page-after-logout (and occasionally page-after-login) if they weren't set as parameters in some circumstances.
Here's a reference to an F5 device providing SAML SSO services and having a similar issue:
I actually had a member of Atlassian's "security dev team" tell me, in a support ticket I opened about being unable to log in with referer headers disabled, that:
> since we cannot discount the possibility of malicious users programatically generating tokens and forcing them upon users, we check the referer header to ensure that the request chain was initiated in the one place that we're comfortable with: id.atlassian.com
I had the same problem and tracked it down to uMatrix's quite reasonable spoof-referrer default, which breaks nothing else. Just Atlassian's sign-in, which seems to bounce you around to several domains before it lets you in.
Not only does redirect based login work, it's an inherently better model than sharing cookies.
With shared cookies nothing stops site A from taking a copy of your cookie and using it to impersonate you on site B. With redirect based login the identity provider has to authorize each application that is being accessed and each site has its own session cookies.
The main problem is dealing with globally revoking access but that's usually solved with shorter termed session cookies that periodically need to be refreshed from the identity provider.
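For reference, the redirect step of a standard OAuth2/OIDC authorization code flow looks roughly like this (endpoints and IDs are placeholders). The IdP redirects back with a one-time code, the app exchanges it server-side, and the app then sets its own first-party session cookie, which is the property being praised above:

```js
// Build the authorization redirect (all names illustrative).
const authUrl = new URL("https://idp.example/authorize");
authUrl.search = new URLSearchParams({
  response_type: "code",
  client_id: "my-client-id",
  redirect_uri: "https://app.example/callback",
  scope: "openid profile",
  state: crypto.randomUUID(), // anti-CSRF value, verified on the callback
}).toString();
window.location.assign(authUrl.toString());
// On /callback the server exchanges ?code=... for tokens and sets a
// session cookie scoped to app.example alone.
```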
Site A can’t access 3rd party cookies. Cookies only can be accessed by the domain they are created on. Otherwise any site could toss a 1x1 image pointing to any website and steal the cookies.
Could a site fix this by delegating a subdomain or CNAME to the SSO provider like sso-company.example.com so that the cookie is still using the same domain, but pointing the IP to the SSO provider? Assuming the SSO provider supports this, that is. I believe OKTA supports this method.
I mean, effectively, today hardware you or your boss owns is doing most of the work of tracking you.
This is making them allocate resources to achieve the same effect. Like taking LoJack off your car and phone, and making 'Them' tail you and scour security footage like in the old days. It's more expensive. Expensive things do not scale, so you have to prioritize who is worth the cost: people who are under legitimate suspicion of causing harm. Less 'by-catch', to use a commercial fishing concept.
When it's cheap to harass everyone, nobody is 'safe'. But when terrorists can't be tracked at all, nobody is 'safe' either. So we have checks and balances.
I regularly use nginx to reverse proxy third-party API calls. I use it to protect API keys.
In my case, I strip all cookies and sensitive headers. One must keep in mind that the browser will treat it as a first-party request and the security implications that has. You may have to filter or modify cookies/headers.
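As a rough illustration of that setup (hostnames and header names are placeholders, not a drop-in config):

```nginx
location /api/ {
    proxy_pass https://api.thirdparty.example/;
    proxy_set_header Cookie "";             # never forward browser cookies upstream
    proxy_set_header X-Api-Key "REDACTED";  # the secret stays on the server
    proxy_hide_header Set-Cookie;           # drop upstream cookies from the response
}
```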
well, SSO providers would still work if implemented correctly?
SSO works without cookies. if I implement Google SSO, I would not log in via the Google supercookie.
there is a state parameter? so if I want to have a cookie that passes stuff, I can just store my stuff inside a cookie and pass it inside the state param. there are so many possibilities via OpenID (which is super easy); I do not know how SAML2 works, which might be different though.
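A sketch of that idea: pack app data into the state value alongside an unguessable nonce. Since state doubles as CSRF protection in OAuth2/OIDC, the nonce part is not optional:

```js
// Outbound: encode app data into `state` before redirecting to the IdP.
const state = btoa(JSON.stringify({
  nonce: crypto.randomUUID(), // must be checked when the user comes back
  returnTo: "/cart",          // illustrative app-specific payload
}));
// Inbound (on the callback page): recover it without any third-party cookie.
const returned = JSON.parse(atob(new URLSearchParams(location.search).get("state")));
```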
I know of a token system that some questionable engineers started pushing session state into and since it shipped before anyone noticed, walking that back turned out to be quite a chore. What was supposed to be a couple hundred byte cookie started hitting max cookie length warnings in other parts of the system.
When people need to keep a door open, if they don't see a doorstop in the immediate vicinity after two seconds of looking, some will just use whatever heavy object is closest and consider the problem 'solved' instead of managed.
I needed data, I didn't know where to put it, this thing can give me data, boom, solved.
Not a huge loss, if you depend on federated logins its just a matter of time until Google or Facebook's algorithms decide to ban your account without explanation or recourse and then how do your users access your site? All you'll be able to do is try to shame the companies on social media and hope enough people are outraged that the company takes notice.
It's going to break all 3rd-party social layer providers. Most news sites don't have native comments and rely on a 3rd party like Disqus. Login state is stored as a cookie. It's also going to break all the OpenID stuff that is heavily used in organizations like Walmart. OpenID is all based around cookies. I remember having to rebuild our provider when Safari released an update that prevented setting 3rd-party cookies without user interaction.
That type of attitude toward the millions of users who use Disqus just shows why Firefox is a dying browser with an ever-decreasing install base. Funding will keep decreasing, as it is tied to search engine deals, which are based on active users.
Anything that shields me to some extent from the "grab money fast, before anyone notices we're fucking them over" companies out there is a champion, as far as I'm concerned.
there are valid criticisms of firefox but breaking disqus is a bizarre one. when is the last time you used it? my impression is that these days the literal majority of content produced on it is spam, and it's been this way for the better part of the last decade
I guess we use different sites then. I should specify I mean it doesn't keep me logged in. I consider this breaking because if I click a link to that site, it loses the original context once logged in.
What would be the point of localstorage if JS couldn't access it? Cookies can be set and get via http headers, but is localstorage available by other means than JS?
I have trouble with Google login (the URL must be copied into a Google tab), and Oracle Cloud loses my tenancy home region every few minutes (https://i.imgur.com/ZCsepq3.png). Several other examples, like LMSs that use O365 to log in, must be logged in to manually every time.
I use both Google and O365 at the educational institutions I work at, and both platforms work fine across a wide variety of applications. Strange that you are experiencing these issues.
Nice, sounds like I can get rid of the extension I use to toggle `privacy.firstparty.isolate`.
> In addition, Total Cookie Protection makes a limited exception for cross-site cookies when they are needed for non-tracking purposes, such as those used by popular third-party login providers. Only when Total Cookie Protection detects that you intend to use a provider, will it give that provider permission to use a cross-site cookie specifically for the site you’re currently visiting. Such momentary exceptions allow for strong privacy protection without affecting your browsing experience.
That's exactly why I have to toggle it. Anyone that uses Auth0, and many publication sites (follow a link to a PDF, get redirected to `/cookie-absent` instead), fall foul.
Moreover, I've heard loud voices before claiming that controlling 3rd-party cookies would break login providers. Guess what: it turned out that where there is a will, there is a way.
I find this very annoying. An OpenID Connect provider is perfectly capable of working without using third-party cookies. The only reason they need them is to allow OIDC authentication without actually redirecting to the provider (by using a hidden iframe to do the OIDC flow on the same site). But if 3rd-party cookies are disabled it should just fall back to the normal OIDC redirect.
The OIDC front channel signout functionality relies on third party cookies to work properly. This feature has the IDP basically loading your app's end session page in a hidden iframe.
Similarly the OpenID Connect Session Management feature (check_session_iframe) also depends on the ability to use third party cookies.
This functionality is needed to be able to detect from front-end code that the user logged out, without relying on any back-end code that could receive a front-channel or back-channel signout notification and relay it.
In the absence of that, a pure SPA with no backend could only detect the logout if access tokens are stateful and it gets an error message back saying the token refers to an ended session.
Some people get really cranky if a single sign out feature does not actually sign you out of everything.
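For reference, the check_session_iframe mechanism mentioned above boils down to a postMessage polling loop like this (origins and IDs are placeholders). The OP frame answers by reading its own session cookie, which in this context is a third-party cookie, which is exactly why partitioning breaks it:

```js
// The RP embeds the OP's check_session_iframe and polls it per the spec.
const OP_ORIGIN = "https://idp.example";
const opFrame = document.getElementById("op-frame").contentWindow;
let sessionState = "..."; // opaque value the OP returned at login

window.addEventListener("message", (e) => {
  if (e.origin !== OP_ORIGIN) return;
  if (e.data === "changed") handleLogout(); // hypothetical app handler
});

setInterval(() => {
  // Message format per the spec: "<client_id> <session_state>"
  opFrame.postMessage("my-client-id " + sessionState, OP_ORIGIN);
}, 5000);
```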
Sorry you're right. I was just thinking about sign in. But at the same time it seems like the cat is already out of the bag on this one. Safari already blocks all third party cookies by default and it seems like other browsers are moving in the same direction.
"Nice, sounds like I can get rid of the extension I use to toggle `privacy.firstparty.isolate` ..."
Forgive me ... do I understand that there is a true/false setting in Firefox named "privacy.firstparty.isolate" that you like to toggle from time to time ... and you use an extension to do that ?
I don't do much browser customization and use only one extension (uBlock Origin) but ... couldn't I toggle a single Firefox setting with a simple command line ?
Toggling it manually requires going to about:config, and searching for it.
On startup it's enabled (i.e. do isolate) via a config file, so I could change it there with a shell script. I think though that I'd have to restart Firefox for it to take effect.
The extension gives me a handy button in the toolbar that's red (danger) when it's off (i.e. not isolating) that I can just click to toggle.
Yes it's a tiny job for an extension, but do one thing well right? Also, to be honest, it's easier that it's there than switching to or pulling up a new shell.
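If anyone does want the shell route for the startup default: Firefox reads user.js from the profile directory at launch, so a line like the one below (using the real pref name from this thread) sets it, though a running instance still needs a restart to pick it up, which is why the toolbar button is handier for toggling:

```js
// user.js in the Firefox profile directory
user_pref("privacy.firstparty.isolate", true);
```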
Maybe I don't know enough about cookies, but it's kind of shocking that this wasn't the behavior from day one. I suppose it's one of many things designed for a simpler time, but so many of those have been fixed by now.
Kind of an important point: this appears to be an attempt to make third party cookies useless, without actually disabling them since many sites depend on them. This is achieved in two ways:
1. By allowing third-party cookies, but compartmentalizing them by the first-party site that sent the request (a much better name for this feature would be "per-site cookie containers"; "total cookie protection" is completely uninformative).
2. By using a heuristic to selectively allow cookies to be accessed across the container boundary if they are actually needed, e.g. for logins.
To answer your question, this doesn't make sense as "day one behavior" because it's basically a patch to work around a historical problem with as little breakage as possible. If you were setting up cookie permissions on day one, knowing what we know now, you wouldn't kneecap third party cookies, you'd disable them entirely. Mozilla is trying to make third party cookies useless for 99% of what they're used for: if that's how you feel about third party cookies, you'd just not implement them.
Incidentally, I do block all third party cookies by default and have for years. That's a much stronger approach than the compartmentalization that Mozilla is attempting. I can count on one hand the number of sites I've seen break because of this, most of them are happy to let these cookies fail silently.
There is so much legacy tech out there that is still working on the trust level from back when DNS was a hosts file you manually copied to your system once in a while.
Is this really effective for the users' privacy? Won't AdTech networks simply migrate to browser fingerprinting, perhaps with a bit of server-side tracking?
I'm not arguing we should give up. Rather, I'm more convinced we should invest in privacy NGOs like noyb.eu and make it expensive to toy with my privacy.
> Won't AdTech networks simply migrate to browser fingerprinting, perhaps with a bit of server-side tracking?
they don't even have to. Just store two (or N) sets of cookie trails, as they already do. This will waste a few MB of storage on the client side and do nothing to ads/privacy.
Sites never shared the ID anyway, especially since GDPR et al.
Ad tech works like this: you send a hash of one ID, and on the backend you attach all the profile info (nobody will ever share that with partners, because that is gold); the other side just assigns their own hash of their ID and also keeps all their targeting info on their backend. The only thing that matters is that party A's ID123 is known to match party B's IDabc. Note that those IDs are transient and set at random, because party A and party B don't want to give up their secret info by matching IDs across multiple sites. That is called cookie matching. It does NOT depend on a single cookie jar. It doesn't even depend on cookies! Why do you think most ads (and Google search result links, ha!) have those weird hashes appended? Zero cookies needed.
Another thing that helps even more than 3rd-party cookies is the multi-site referrer, but Google killed that on both Chromium and Firefox a long time ago (Firefox still has the about:config way to disable it, set it to single-site, or set it to multi-site-domain-only, but good luck finding a single human who changes that setting by selecting magic numbers).
This is wrong: third party cookies are still widely used in the ad industry. Among other things, the cookie matching that you describe is dramatically more effective with third-party cookies than first-party only.
(Disclosure: I work on ads at Google, speaking only for myself)
I never said it is not widely used or not effective.
Just saying that it won't matter much if removed from the equation.
I mean, if something makes your life easier, you would be a fool not to use it. But that is like saying not having a Ferrari prevents you from driving to the store.
Third-party cookies are not simply a matter of making adtech developers' lives easier. Imagine you visit shoes.example and are now on news.example. Both of these sites work with ads.example, and the shoe site would like to show you a shoe ad.
With third party cookies this looks like (simplified MVP form):
1. When you visited shoes.example, it loaded a pixel from ads.example. That pixel automatically sent your ads.example cookie, and put you on a remarketing list.
2. When you visit news.example, it sent an ad request to ads.example, which also automatically sent your ads.example cookie. Now the ad tech vendor knows to include the ad from the shoe site because it recognizes the third-party cookie.
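A server-side sketch of steps 1 and 2 in Express (everything here is illustrative, not any vendor's actual code). The whole join hinges on the browser attaching the same ads.example cookie regardless of which site embeds the request:

```js
const crypto = require("crypto");
const express = require("express");
const cookieParser = require("cookie-parser");

const app = express();
app.use(cookieParser());

const lists = new Map(); // uid -> set of sites this browser was seen on

app.get("/pixel", (req, res) => {
  // The browser sends the ads.example cookie no matter who embeds the pixel.
  const uid = req.cookies.uid ?? crypto.randomUUID();
  res.cookie("uid", uid, { sameSite: "none", secure: true, httpOnly: true });
  const sites = lists.get(uid) ?? new Set();
  sites.add(req.query.site); // e.g. "shoes.example"
  lists.set(uid, sites);
  res.sendStatus(204);
});

app.get("/ad", (req, res) => {
  const sites = lists.get(req.cookies.uid);
  // On news.example, serve the shoe ad if this uid was seen on shoes.example.
  res.send(sites && sites.has("shoes.example") ? "shoe ad" : "generic ad");
});
```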
On the other hand, without third-party cookies or any replacement browser APIs, how do these identities get joined? Very occasionally someone will follow a link between a pair of sites, and then you can join first party identities, but you probably don't have a chain of identities that connects a news.example first-party identity to a shoes.example identity.
>On the other hand, without third-party cookies or any replacement browser APIs, how do these identities get joined?
1. When you visit shoes.example, it has an iframe to show an ad from ads.example. This iframe runs some JS to compute a browser fingerprint and then nests an iframe to hxxps://ads.example/?target=shoes.example&client=$fingerprint . The ads.example server records that this fingerprint has visited shoes.example
2. When you visit news.example, it has an iframe to show an ad from ads.example. This iframe runs some JS to compute a browser fingerprint and then nests an iframe to hxxps://ads.example/?target=news.example&client=$fingerprint . The ads.example server recognizes the fingerprint, knows that the client visited shoes.example earlier, and returns a shoes ad.
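To make that concrete, a toy version of the fingerprint computation might look like the following; real fingerprinting scripts use far more signals (canvas, fonts, audio stack, plugin lists):

```js
// Naive fingerprint: hash a few high-entropy browser properties.
async function fingerprint() {
  const raw = [
    navigator.userAgent,
    navigator.language,
    `${screen.width}x${screen.height}x${screen.colorDepth}`,
    new Date().getTimezoneOffset(),
    navigator.hardwareConcurrency,
  ].join("|");
  const digest = await crypto.subtle.digest(
    "SHA-256",
    new TextEncoder().encode(raw)
  );
  return [...new Uint8Array(digest)]
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}
// The iframe would then request hxxps://ads.example/?client=<fingerprint>
```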
My parent claimed this was possible to do with link decoration and first party cookie matching, and I'm saying it isn't.
I do agree this is possible to do with fingerprints, though (a) all the browsers are trying to prevent fingerprinting and (b) a reputable ad company would not use fingerprints for targeting. This is my understanding of why Google is putting so much effort into https://github.com/WICG/turtledove
right, if you know how cookies and URLs work, all that can happen with zero cookies and some query parameters, like the ones Google search surreptitiously adds to every search result.
Cookie sync is a freaking industry standard. And you want us to believe Google's money cow will dry up as soon as the effort they are leading goes live?
No, it is not possible to remarket at any meaningful scale with "zero cookies and some query parameters" (though Arnavion's sibling comment is correct that it can be done with fingerprinting). Would you be up for describing how you'd do it in the shoes.example/news.example/ads.example case?
> you want us to believe google money cow will dry as soon as the effort they are leading goes live?
"we are confident that with continued iteration and feedback, privacy-preserving and open-standard mechanisms like the Privacy Sandbox can sustain a healthy, ad-supported web in a way that will render third-party cookies obsolete. Once these approaches have addressed the needs of users, publishers, and advertisers, and we have developed the tools to mitigate workarounds, we plan to phase out support for third-party cookies in Chrome. Our intention is to do this within two years." -- https://blog.chromium.org/2020/01/building-more-private-web-...
hint: the same way attribution happened in the early days.
Google sends id abc to shoes.com and id xyz to news.com. Both send those ids back to Google's own ad server. Presto, Google knows you are seeing those two ads.
Yeah, the cookie law was a false start. Laypeople don't care about the exact technical implementation (e.g., session cookies vs. persistent cookies vs. local storage vs. browser fingerprinting).
What I care about as an EU citizen: are you collecting and storing information that can directly or indirectly identify me? Yes, tracking and profiling are included in this.
You want to store some session cookies, so you remember my shopping cart? Go ahead!
You want to store some cookies, so you remember I was logged in? Sure!
You want to use every available technological loophole to follow my every path on the Internet? Errrr, no thanks!
I think the cookie law is somewhat meh, but I feel GDPR is pretty future-proof. I don't expect GDPR to change a lot; rather, our application of it (so-called ECJ recitals) will evolve.
Except if you're setting up SSO for your company's employees. Using a 3rd party login provider is a necessity. You shouldn't trust employees to create unique / strong passwords for every individual service they login to.
Or if you're setting up a SaaS application where some of your customers will want integration with their own SSO. We don't have developer time to spare implementing that sort of thing but Auth0 lets us do it as one of its built-in integrations.
It lets us offer SSO with whatever Auth0 supports as a freebie add-on, instead of "well, we could work with your platform but it's gonna cost you."
I don't see how it's a trap, except that we have to pay auth0 a monthly fee to handle our authentications instead of having some number of hours a month spent maintaining and securing our customers' logins and integrations.
Not only that. As a user, it's incredibly frustrating entering a password 5 or more times each morning. This results in users using extremely weak passwords.
The same is true for forcing users to reset their password every 50 days or so, by the way. This outdated password guideline doesn't seem to die. I know way too many cases where people use a weak base password with a number attached to it because they got sick of trying to remember a new password every month.
> The same is true for forcing users to reset their password every 50 days or so, by the way. This outdated password guideline doesn't seem to die. I know way too many cases where people use a weak base password with a number attached to it because they got sick of trying to remember a new password every month.
there are people who actually invent a new password every time instead of cycling numbers?
also, changing your password a few times until the history is flushed and then switching back to the same password you started with is a thing.
SSO is more than password management. It is instant provisioning and deprovisioning of users. Role management and auditing. Enforcement of security standards like 2FA in a central place.
Not really relevant for the specific topic, but to be more precise, SSO is only the sign on part. Usually the provisioning/de-provisioning is handled by SCIM, which is related but distinct. You have some SaaS products that offer SSO but not SCIM, for example.
Sorry, I should have been more clear. When I typed SaaS products I meant more about a non-IDP product. They might support SSO but not SCIM-based account provisioning, especially if it's in-house auth (not using something like Auth0). I worked on a product that supported SSO but not SCIM for a long time and not all SCIM features were supported.
I've used Okta to provide gateway access to physical devices and AWS roles in the same deployment. Very impressive when every endpoint and SaaS product is behind a single 2FA login.
If you can enforce that they use the password manager, it solves that one problem.
But SSO centralizes access management. For instance, with one switch I can set password requirements, require 2FA, and grant/revoke access to all of an employee's services when they join the company or leave.
I'm sure there are ways to use 2FA or OTP without externalising access management to Facebook, Google, or another SSO provider, unless you want to pick convenience over privacy and security.
How do you enforce it across a bunch of 3rd-party software that either doesn't support 2FA or doesn't support enforcing it? If they support SSO, which they usually do, it's a non-issue.
How do you centralize your authn and your 2FA provisioning? How do you ensure that your cloud-native apps have access to the auth backend without risking exposing the wrong ports on the wrong VPC?
Just adding a library to application code is not sufficient. What I mean is that organizations should not roll their own SSO provider. At the very least, work with one of the many companies that offer it as a product or service. If your threat model requires it, you can host the product on premises.
> Don't use [third-party login providers], it's a trap.
Pretty hard to avoid in many cases. Logging in to your Microsoft account for Office (Teams, Outlook, et al.) uses a login service, as does Google, and practically all services that span across multiple domains. Which includes all of the major ones, at this point.
Good that Firefox gives us this option, given how the web has evolved!
i don't think anyone would deny that third party logins are convenient -- either from the user perspective or from the developer perspective. but they are also a huge vector for privacy-invasive ad-profiling, if that's the login provider's business model.
that is true, but that is virtually always because of password re-use. if you use a password manager and randomly-generated passwords unique to each service, this is almost entirely mitigated.
with a single third party login for all services, though, if that third party account gets compromised the results are catastrophic.
> with a single third party login for all services, though, if that third party account gets compromised the results are catastrophic.
The same can be said of the password manager account. It's turtles all the way down.
The fact that we rely on users to not reuse passwords, the fact that using a password manager is all but required to get reasonable security despite being far from convenient, these indicate a major failure to serve the actual needs of users, in my view.
Users have head space for 1-3 strong passwords. They can tolerate carrying maybe 1 security token with them. They can tolerate a little bit of security setup when using a new device for the first time, and they can tolerate a touch or fingerprint scan at authentication time. All authentication systems can and should operate within these parameters.
No web site or app outside of an authentication provider should ever present a user a screen asking them to pick a strong password that they have never used before. That is asking a user to do something that the human brain cannot reasonably do for 99% of the population. At best, a browser or password manager will intervene at that point and pick the password for them. At worst, the user ignores the warning and picks the same password they use for everything else.
> The same can be said of the password manager account. It's turtles all the way down.
What password manager account? What are you talking about? There is never any password manager account. Yes, I have heard that some weird people synchronize their passwords to some strange 3rd-party services, but those don't matter. You have one password: the encryption password for the login database, and that one is local and never transmitted over the internet. If you know a password manager that sends this decryption password to its servers, please open a topic here and they will be bashed to hell for it.
I am a tad more strange: my password manager is synchronized with my sftp server using a private key, and I am not only randomizing the passwords for each site but also the email address (imagine sha(user+salt) + delimiter + sha(domain + master password)@mydomain.com). And I will never in my life use any SSO, as they are mostly spyware designed for tracking users across sites and certainly not for what they are advertised as. They will break with Firefox's latest addition? FINE! At least people will stop using them.
Company self-hosted SSOs are one thing; sure, I can trust those for company services. For anything else, like "login with Google" or "login with Facebook"? Yeah, right, my heart is jumping with joy and can barely wait to use it. It actually works in reverse: if you don't allow me to register using a non-SSO account (email, password), I won't use your service/webpage/whatever.
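For concreteness, the per-site address scheme described above could be sketched like this in Node (hash truncation, the delimiter, and the domain are arbitrary illustrative choices):

```js
const crypto = require("crypto");

// Truncated SHA-256 stands in for the commenter's "sha(...)"; adjust to taste.
const sha = (s) => crypto.createHash("sha256").update(s).digest("hex").slice(0, 12);

function addressFor(user, salt, domain, masterPassword) {
  return `${sha(user + salt)}.${sha(domain + masterPassword)}@mydomain.com`;
}

// Each site gets a unique but reproducible address:
// addressFor("alice", "pepper", "news.example", "correct horse battery staple")
```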
It doesn't for corporate usage... having to create accounts for every new employee on every service you use, and then remove those accounts when someone leaves, is not scalable. Having SSO is needed.
I use 1Password (and the browser extension) for all my passwords, but I still choose "Sign-in with Google" when that's an option.
The "Sign-in with Google" button is makes it much quicker to create an account and slightly quicker to log in.
Also, I can rely on my Google 2FA rather than setting up and filling in a different TOTP for each site. Something like U2F or WebAuthn would make the filling-in part more convenient, but even sites that offer 2FA usually don't offer those. (And many sites don't even offer 2FA.)
Using 1Password's 2FA feature would make TOTP more convenient, but I'm a little nervous about putting 2FA in 1Password. This might be overly-conservative thinking, though.
I agree it can be super convenient, though 'Sign in with Google' is totally broken for me, because I've accumulated a handful of google accounts.
Every time I log in to a service, I have to guess which account it's associated with (bearing in mind I may have signed up years ago). And if I'm wrong, half the time it immediately attempts to create a new account, and then I'm stuck with a bunch of empty dummy accounts on various services.
Mozilla is really fighting the good fight for users' privacy. I've been using Firefox for as long as I can remember, even when there were faster and fancier alternatives available. Their ideology and service to the user are what make me loyal to them.
I have tried regular as well as the developer version of Firefox, but no matter what I use, YouTube videos always skip frames after every 10-15 seconds or so. So I use Brave for YouTube and other WebGL heavy stuff and Firefox developer version for daily browsing.
That sounds very strange. Certainly don't see that in Firefox on Mac (work laptop) and both Linux and Windows (personal laptop). Try adding the h.264 extension. That forces YouTube to provide h.264 videos which is hardware accelerated on pretty much any hardware.
> even when there were faster and more fancy alternatives available
This seems to indicate there are no faster alternatives around anymore, but the last time I tried FF (4-6 months ago) I couldn't make the transition because the lag was pretty obvious coming from Chrome-based browsers. Is this not the case anymore?
I use Firefox and Chrome at the same time and I don't really notice any difference. Maybe a bit for Google apps (Hangouts, Docs, Meet, etc) but I just see that as a symptom of Google's attempts at using their market dominance to harm competitors, which makes me want to use Firefox even more.
It's unlikely they put any effort into intentionally making them run slower; it's just that the apps are written to work optimally on Chrome, and there are minor differences in the behavior of things like V8 vs. SpiderMonkey and Blink vs. Gecko. Given that each one is written with different tradeoffs, it's not surprising things perform differently.
Whether or not the Google programmers use specific proprietary knowledge about the behavior of Chrome to optimize performance is different. If they do, that would be similar to the things that got Microsoft in trouble.
Google knows that every time they release a Firefox bug, FF's user percentage goes down a tiny bit. Repeat over dozens of bugs, for years, and you have a strategy.
There's one blog post from another Mozillian that I can't find anywhere that came out within the last year with other examples, I think it was on HN.
Brave and uBO share filter tech and we aim to make uBO unnecessary (this may require setting shields to aggressive). We do much more than any extension can do, and Google has made it clear they will further restrict extension APIs.
I think this might be more about perception than anything else.
I've used Firefox since 2006, and Chrome always seemed heavier, laggier and uglier. Maybe it's the snappy iOS-like animation when you scroll to the bottom of the page that makes it seem snappier?
It's not imaginary: for years Firefox drained the battery on MacBooks really fast. Then there is this pesky issue of it randomly freezing the whole laptop for a minute or so, usually associated with file uploads or locking the screen [1], [2], [3], ... Fixed in one version, then it appears again in the next version.
I still used Firefox a lot for various reasons (and still do), but I'm not blind to how it performed.
Firefox is fine and quick as long as you don't need to use any heavy Google apps. Some people might even consider this a plus. For me, between work and personal use I'm effectively married to Gmail, Google Calendar, Google Docs, and Google Hangouts. Unfortunately that makes Firefox a non-starter for me. Not to mention Firefox's privacy settings trigger countless reCAPTCHA gates across most of GSuite. I get that this is not Firefox's "fault" and it's done intentionally by Google, but as a user it becomes my problem.
I really want Firefox to work for me and I'd love to drop Chrome, but last time FF made big noise about performance improvements I tried it out and Gmail was still unusably slow.
I use Google Calendar and Google Docs without any issues in Firefox. I agree Gmail is coded terribly and do not use the web site! I stick to using Thunderbird on the computer, and checking email on my phone. Have not been using Hangouts for a couple years, though.
For me, the way Google is keeping Gmail terrible for other browsers is exactly the reason to not use Chrome. No way I'm OK with that.
FWIW I use all of those apps on a daily basis with Firefox and have not noticed any performance issues. It may be worth giving it another try if you haven't in a while.
Indeed. Hangouts is one I find works better in Firefox, even! But I observe it seems to vary. Perhaps Intel Macs have some quirks that make it more performant and reliable in Firefox.
I switched to FF when Quantum came out. I use it exclusively. Not because I hate Chrome, but because I don't see any need for chrome. Once in a while I see a website that forces me to use something other than FF. But it happens rarely, and it is mostly some webgl-based under-development demo website.
I even use it on my phone. The mobile version is definitely worse than Chrome, but it has plugins (or it used to! nowadays it only supports a few popular ones, which is a shame), and I can also send tabs from my phone to my computer (which is a better place to read articles anyway).
I switched back to Firefox last week and I had the same experience -- Google apps and Slack were dog slow. But after a day or so they were working fine, I imagine it's a matter of populating the cache. YMMV.
It also depends on the operating system, among several other variables.
I didn't find a noticeable difference between FF and Chrome-based browsers (Vivaldi, Edge) on macOS (although Safari runs circles around them) after using them extensively. I used each of them for a separate project with several common websites loaded in them; there were different quirks for each browser (especially regarding tab hibernation), but latency was not one of them.
On Linux, FF seems definitely faster than Chromium, although there are occasional DNS errors which stop web pages from loading altogether (likely a result of my own doing). I've stopped having different browsers for different projects and just use FF for all.
On Android with Chrome, not just Chrome but even WebViews using it are astonishingly fast (e.g. the DDG browser); I presume it's because of the data saver feature. On de-Googled Android like LineageOS, FF/Fennec seems to be on the same level as Chromium, and DDG is faster here as well.
On iOS, everything is Safari.
I don't use Windows much, but I've seen others mentioning Edge seems to be faster than Chrome recently.
I find them to be close enough to imperceptible for just normal html and css etc.
The stumbling block for me as a Firefox user is that I am increasingly bumping into web apps that perform poorly in FF but are fine in Chrome for one reason or another. One instance I bump into a lot is Elasticsearch's Kibana, which runs like trash in FF for some reason.
I am guessing performance differences might be masked by good hardware? Sometimes performance differences don't show up until you use an underpowered machine.
I don't think it's just that. I have a half-dead Chromebook with linux, and I use Firefox on it. Some years back I ran Chrome on it because it worked better, but at some point I started seeing issues with Chrome and tried Firefox again. I've been using Firefox since.
I switched from Chrome to Firefox about a year and a half ago. Chrome definitely felt more snappy, but the difference wasn't that much.
Except on Facebook. My Facebook tab is incredibly laggy, and gets more and more laggy the longer I leave it open. I'm one of those users that tends to keep 50+ tabs open, and I have to close and reopen the Facebook tab at least once a day to keep it from becoming a nearly frozen mess. Even then, if a video is playing and I click it to make it fill the window, it takes several seconds for it to happen. And with an i9-9900K, 32 GB of RAM, RTX 3080, and a 1 TB NVMe drive, my computer is definitely no slouch.
My CPU immediately pumps to 100% usage after opening google docs. Granted, it's on my old laptop, but I can use electron apps and they run far better than gdocs.
Keep in mind that Firefox opens their website on first run and on every update and that includes Google Analytics.
I find the majority of their privacy claims dubious and dangerously misleading for those that don't know any better. If they were serious about privacy they'd offer uBlock Origin (or equivalent functionality) preinstalled by default.
Their current countermeasures, such as containers, tracking protection, and this cookie thing, are trivial to bypass with browser fingerprinting and IP address tracking if you have a global view of the Internet (which Facebook and Google do have).
I modified the settings long ago to come up with a blank tab on startup. I use NoScript and do not allow google analytics through. No facebook domains make it through NoScript as far as javascript is concerned, very few google ones do.
I get you about the updates. It's a risk-reward ratio I accept because firefox + noscript + always starting in a private session is way more helpful than the update problem is harmful. Using a VPN a lot of the time helps, too. There is no solution I know of that is perfect. My threat model is pretty relaxed, though, so what I do is mostly for my peace of mind. You have reminded me that I should start spoofing my user agent again.
I don't disagree that it's possible to configure Firefox to respect your privacy. I myself use it sometimes and have a similar configuration.
But it is extremely misleading for them to be shouting "privacy" at every opportunity while the truth is that their browser leaks personal data like a sieve in the default configuration. This would give a false sense of security to non-technical people who don't have the skills to see through these lies.
And here come the FUD-spreaders yet again: rather than accept tiny "bad" things like some form of harmless analytics (it is not even that), they would run towards the goddamn gates of Hell itself. Like, what do you imagine Chrome does? Or do you think Brave has everything removed? It's the exact same browser with a different name and logo and a preinstalled adblocker.
Sorry for the somewhat angry comment, but I honestly can’t understand this mentality.
Google Analytics isn't harmless, though. It gives a single party a wide view of the entire Internet (and thus the ability to circumvent cookie-based tracking by just using IP addresses and heuristics), and said party makes its money by tracking people online.
I'm not saying Chrome is any better, but at least Chrome doesn't toot the "privacy" horn at every opportunity.
Brave does have some kind of blocker built-in which might actually help even if it's not perfect.
"neither Mozilla nor Pocket ever receives a copy of your browser history. When personalization does occur, recommendations rely on a process of story sorting and filtering that happens locally in your personal copy of Firefox."
I preferred chrome cookie control over Firefox after switching. (I have had to compromise with umatrix to fill this feature gap.) Very granular control for each cookie where a cookie can be allowed, temporary, or blocked.
I went through my entire list of cookies once, 400 at least and started perma blocking all those I didn’t recognize. It was beautiful. I can’t do the same in Firefox.
I’m not feeling very good about this move where third-party cookies are isolated by website. There are lots of services spread across multiple, sometimes seemingly unrelated, domain names (SharePoint, Office 365); they will have difficulty.
And then there are special login websites and others, like Dish Network telling CNN you have a subscription with them.
This breaks those, and creates a predetermined list of who can do what.
> I went through my entire list of cookies once, 400 at least and started perma blocking all those I didn’t recognize. It was beautiful. I can’t do the same in Firefox.
If I understand your description correctly, you can definitely do this in Firefox also. Preferences/Privacy & Security/Cookies and Site Data.
> I went through my entire list of cookies once, 400 at least and started perma blocking all those I didn’t recognize. It was beautiful. I can’t do the same in Firefox.
I did this in Firefox before Chrome was even a thing. This has been supported natively, without add-ons, since at least Firefox 3.5, if not earlier.
That would be under "Cookies and Site Data". There are two options: Manage Cookies (which only gives the option to remove cookies) and Manage Exceptions (which requires you to manually add domain names). This is not usable for a massive cookie block list.
If anyone wants to see these protections in action, www.clerk.dev leverages the Storage Access API in development mode - where we need to share session data across localhost and a clerk-owned domain.
With this launch, developers are now prompted to explicitly allow third-party cookie access in Firefox.
(In production mode, the prompt isn't thrown because our cookies are set in a first party context.)
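For anyone curious what triggers that prompt from the embedded side, here is a minimal sketch (the element id and logging are hypothetical) of an iframe using the Storage Access API; browsers generally require the request to come from a user gesture:

    // Runs inside a cross-site iframe that needs its unpartitioned cookies.
    async function ensureCookieAccess(): Promise<boolean> {
      if (await document.hasStorageAccess()) {
        return true; // this frame already has unpartitioned cookie access
      }
      try {
        // Usually only resolves when called from a user-gesture handler;
        // in Firefox this is where the permission prompt can appear.
        await document.requestStorageAccess();
        return true;
      } catch {
        return false; // the user or the browser denied access
      }
    }

    // Hypothetical wiring: ask for access when the user clicks "Sign in".
    document.querySelector("#sign-in")?.addEventListener("click", async () => {
      const granted = await ensureCookieAccess();
      console.log(granted ? "cookies available" : "still partitioned");
    });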
Is there a reason why uBlock Origin is still not included in the browser? In this day and age, you can't have privacy online without it, and claiming otherwise is misleading at best and maliciously deceptive at worst.
Not affiliated with Mozilla, nor do I know, but my thoughts:
A quick check reveals that while uBlock Origin seems to be the most popular, it's far from the only popular add-on that blocks ads https://addons.mozilla.org/en-US/firefox/search/?q=adblock ; so why include uBlock Origin specifically? Especially since it has become much more than a simple ad blocker (script-blocking capabilities, for example), why not something else? Why not integrate an ad blocker developed completely by Mozilla?
Why not include NoScript + Containers by default? And some UserAgent Switch capability? And more fine grained cookie storage options (currently available via add-ons), et cetera?
When you start integrating capabilities currently being offered by add-ons, the questions are:
- where to stop
- how to discriminate what to include, what not
- how will users and developers feel (for example the user who wants to use his favorite add-on, which now is not developed anymore because almost no one bothers to install it since functionality X has become part of the browser)
- how to deal with edge cases (the one site which breaks because of ad-block is the reason a non-technical person might simply install chrome and move on with their life)
- is the increasing complexity worth it? to what degree is it?
A lot of the other ad blocking extensions are malicious and collude with the advertising industry through some kind of whitelist program. Their license might also not be permissive enough to allow this.
> Why not include NoScript + Containers by default?
NoScript requires lots of manual intervention, uBlock Origin with the default lists is still seamless and rarely causes breakage thus very little need for manual intervention.
I am not convinced that Containers does anything at all. Browser fingerprinting & IP address tracking defeats it very easily.
> And some UserAgent Switch capability
This is absolutely needed and I'm baffled this isn't offered natively, though this would be less for privacy and more as a developer tool.
> And more fine grained cookie storage options (currently available via add-ons), et cetera?
I find the whole craze around cookies overblown. Your IP address is a relatively persistent cookie you can't clear. The only way is to prevent requests made to the malicious actors to begin with, with some kind of blacklist like what uBlock Origin provides.
> how to discriminate what to include, what not
I'd argue that if your mission is to make the web better and protect people's privacy then including a proper ad blocker is a no brainer.
> does it do any good
That is up for discussion with the add-on author (the author of uBlock Origin has repeatedly declined donations and seems to do this work out of passion and/or hatred for ads, so he should be on board), but otherwise the secret sauce isn't really the blocker per se but the blocklists such as EasyList/Fanboy's lists, and Mozilla has enough resources to reimplement a compatible client from scratch if needed.
> how to deal with edge cases
Contribute back to the lists to fix any edge-cases by adjusting an over-reaching blocking rule, and offer an easy way for users to temporarily disable the blocking on a per-site basis.
> I find the whole craze around cookies overblown. Your IP address is a relatively persistent cookie you can't clear. The only way is to prevent requests made to the malicious actors to begin with, with some kind of blacklist like what uBlock Origin provides.
In my personal opinion, no one should be connecting to the internet in this day and age without using a VPN service wherever possible.
My preference would be to include the functionality of ad blockers but not include any of the actual lists. You would then be able to pull down the same lists that ublock origin provides by default and add any additional lists you want.
I'd argue that this is justified when it comes to misleading non-technical users about their privacy.
Mozilla plasters the word "privacy" everywhere and yet opens their own website on first run and after every update which includes Google Analytics, from the same company that's known to violate people's privacy on a large scale and profit from it.
Browser fingerprinting and IP-based tracking are reliable enough that blocking cookies is absolutely useless in this day and age against an omnipresent adversary such as Google or Facebook. Blocking their requests uBlock Origin-style is the only way to go, and claiming to protect your privacy otherwise is very misleading.
So we have a suite of B2B products, hosted on p1.com, p2.com, p3.com, with an OAuth2 provider on a1.com. a1.com isn't very "well known", and it won't be, because we run it privately for auth and user management for our own products only. There are no subdomains anywhere, only individual domains.
Does this break our setup? And how do we tell users to un-break it? And is there a way to tell Mozilla via directives that we have a private list of sites we'd like to share a1.com cookies in?
User clicks log in at p1.com, they get forwarded to a1.com which checks their (now first-party) cookies, then once they're logged in they get forwarded back to p1.com with a token in an URL parameter.
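A rough sketch of that redirect leg, using the hypothetical domains from the comment above (a real provider would tie the token to its session and validate the return address against an allow-list):

    // Sketch of the login provider at a1.com issuing the redirect back.
    import http from "node:http";
    import crypto from "node:crypto";

    http.createServer((req, res) => {
      const url = new URL(req.url ?? "/", "https://a1.com");
      if (url.pathname === "/authorize") {
        // Where to send the user afterwards; must be validated in practice.
        const returnTo = url.searchParams.get("return_to") ?? "https://p1.com/";
        // Placeholder token; a real provider derives this from the
        // (first-party) a1.com session cookie it just checked.
        const token = crypto.randomBytes(16).toString("hex");
        res.writeHead(302, { Location: `${returnTo}?token=${token}` });
        res.end();
        return;
      }
      res.writeHead(404).end();
    }).listen(8080);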
Ah, right, thanks. So this is a problem only if we have in-page widgets from a1.com that load on p1.com and hope to find the currently logged-in user there. Makes sense, that's basically what an ad is.
This weakens security. Now auth tokens can be logged or actively intercepted on corporate networks with TLS MITM and these URLs will eventually find their way into emails and other unencrypted locations. Not exactly progress.
The behaviour with third-party cookies blocked is how OAuth2 works by design.
Even without third party cookie blocking, if you're at p1.com and you click to log in with a1.com but you're not logged into a1.com yet, you get forwarded to a1.com to sign in.
So with third party cookies blocked, it's no less secure than it was before.
Safari solves this by sending third-party cookies only if the user visited the originating domain within 24 hours.
Not sure how Firefox handles this, but I guess it would be easy to detect a redirect from a1.com to p1.com and recognize this as a use case where third-party cookies from a1.com should be sent for requests originating from p1.com.
That said it's probably more privacy-friendly to append an access token as a hash parameter to the URL when redirecting and extract it via JS, which will not be affected by cookie limitations.
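A minimal sketch of that extraction on the receiving page (the parameter name is hypothetical); the fragment is never sent over the network, which sidesteps the logging/interception concern raised above:

    // Runs on p1.com after the provider redirects to .../#access_token=...
    function readTokenFromFragment(): string | null {
      const params = new URLSearchParams(window.location.hash.slice(1));
      const token = params.get("access_token");
      if (token) {
        // Drop the fragment so the token doesn't linger in the address
        // bar, bookmarks, or history entries.
        history.replaceState(null, "",
          window.location.pathname + window.location.search);
      }
      return token;
    }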
It breaks non-tracking functionality for embedded content on the web as currently implemented in major browsers, which is one of the largest use cases.
Signing into a website through an iframe inexplicably redirects you back to a sign-in page if the post-signin page requires a cookie.
Another example is you're signed into website A, and while on website B, iframes to website A behave in such a way that you're not signed in, and you cannot sign in.
If you disable third-party cookies, you can't download files or view videos in Google Drive without a workaround.
This is because the download is from googleusercontent.com while your browser remains at drive.google.com the whole time - and to download private files, googleusercontent.com expects you to have a login cookie. If you block third-party cookies the download gets stuck in a redirect loop, sending you to get a cookie over and over again.
I mean, why are all these lengthy intermediate steps necessary? It's only a matter of changing the default value of one damn setting. I've had third-party cookies disabled for more than a year and the only websites I've had problems with were ridiculously poorly-made ones — like AliExpress, that for some reason has a zillion subdomains and relies on third-party cookies for authentication.
Leads to a prisoner's dilemma situation. A move like that has to be done by everyone in concert (example: killing Flash), or it's harmful to the one browser that blinks first.
This thread contains plenty of examples of legitimate uses for third-party cookies. If FF instantly broke all of those, users would be cursing, not praising, Firefox, and switching to a browser that doesn't break what they use.
It's funny you note that the only website that had issues was a top 50 website (https://www.alexa.com/siteinfo/aliexpress.com#section_traffi...) that no doubt has a lot of ordinary non-technical folk on it. Breaking sites like these would likely kill an already relatively niche browser.
Because you're fighting the ad industry. The ad industry which also has its own browser and tells grandma, whenever she searches about problems with cookies, that there's a "better" browser out there.
Precisely. Google is an ad behemoth AND has the majority of the browser market. If Firefox (or Safari or Opera, etc.) changes to something that breaks Google but Chrome doesn't, they'll just get more of the market. For non-Chromium browsers to survive, they have to play a long game and show people why these changes are important. People are happy to sacrifice privacy for convenience, unfortunately.
> relies on third-party cookies for authentication
A lot of websites depend on this via Auth0, Cloud Identity, Cognito... and the experience becomes subtly broken in a way that you need to be extremely technically savvy (a developer with a whole lot of auth experience) to understand.
> Total Cookie Protection makes a limited exception for cross-site cookies when they are needed for non-tracking purposes, such as those used by popular third-party login providers
How does this work out? Say I want to launch a new popular login provider - how do I get past the Firefox gatekeeper?
In the Firefox storage access policy, we have defined several heuristics to address Web compatibility issues. The heuristics are designed to catch the most common scenarios of using third-party storage on the web (outside of tracking) and allow storage access in order to make websites continue normally. For example, in Single-Sign-On flows it is common to open a popup that allows the user to sign in, and transmit that sign-in information back to the website that opened the popup. Firefox will detect this case and automatically grant storage access.
Note that these heuristics are not designed for the long term. Using the Storage Access API is the recommended solution for websites that need unpartitioned access. We will continually evaluate the necessity of the restrictions and remove them as appropriate. Therefore, developers should not rely on them now or in the future.
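For illustration, the popup pattern the heuristic detects commonly looks something like this (domains hypothetical; this is a sketch of the general technique, not Firefox's detection code):

    // On the embedding site (e.g. p1.com): open the provider's login popup
    // and wait for it to report back.
    function signInWithPopup(): void {
      const popup = window.open("https://a1.com/login", "sso",
        "width=480,height=640");
      window.addEventListener("message", (event: MessageEvent) => {
        if (event.origin !== "https://a1.com") return; // trust only a1.com
        console.log("signed in, token:", event.data.token);
        popup?.close();
      });
    }

    // In the popup, once login on a1.com succeeds:
    function reportLogin(token: string): void {
      window.opener?.postMessage({ token }, "https://p1.com");
    }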
This site appears to provide a reasonable analysis of all the common browsers. It was mentioned on HN a year ago to zero comments. Chrome is completely indifferent to prevailing privacy compromises. Brave is locked down pretty hard. This one is amusing: "Brave: Add noise to Canvas, WebGL and AudioContext APIs to make fingerprinting more difficult"
Great! Can we now remove all these cookie banners that have been plaguing the web since a pencil pusher in the EU thought it would be a great idea to force every single website to display an annoying popup?
The better way to do this would be if you could configure your preferences once and for all in the client which then transparently communicates it to the website providers.
But there is a difference between a voluntary action by some browser developers and the law. I think the bigger problem is that different jurisdictions have different policies in place, so it would be very challenging to implement something that satisfies all of them.
These banners are there to fool you into accepting all cookies. They are basically a dark pattern at this point. The GDPR and the so-called cookie law state that strictly functional cookies have the visitor's implicit consent. Even self-hosted tracking via cookies is considered functional. The GDPR/cookie law also does not mandate those banners; it only states that the user has to consent to every form of tracking.
So every time you see one of these huge banners it is the deliberate effort by the website owner to trick you into accepting the tracking.
Nobody wants to argue with GDPR regulators which cookies are "strictly necessary" and they certainly don't want to pay lawyers to review the purpose and use of every cookie.
It's not a trick, it's just that the easiest path for all sites to comply is to obtain blanket consent for everything.
Spoiler alert: we have that law. The GDPR as it stands outlaws annoying/misleading consent banners.
Next step: fire the incompetent people staffing the various data protection agencies and replace them with someone that would actually enforce said law.
It depends on your use case. Containers for me has nothing to do with this.
I use containers for sites like AWS, which doesn't understand that I might want to switch regions or accounts in only some tabs so that I can work on multiple parts of the network.
I use Containers to make sites have no stored memories of me. Most sites I open get a new, temporary container (extension required) for that visit, which is swiftly deleted afterwards.
All my YouTube views are firmly disassociated from my account, so recommendations will only be impacted based on geographic data. News sites can't remember if I've been there before, other than using IP addresses.
There are other use cases for containers besides third party cookie isolation. If you want to have two separate sessions for a site, you'd still need containers.
I have multiple sites like Github, Dropbox etc where I have multiple accounts I'd like to access separately. Typically private account vs work account, but also other scenarios.
Containers makes this a breeze.
In addition, at least Firefox only has a single private session. So if I open a site in one private window and another in a different one, they're in the same session, sharing cookies etc. Not so with different containers.
This doesn't work for me: I keep my work-provided password manager separate from my personal one. Containers don't solve that, so I use profiles, which I have MultiFirefox to thank for making workable. But only on macOS.
I don't understand why fixing profiles isn't a priority; their use case is completely different from containers, which are awesome in a completely different way.
The most aggravating trend Firefox jumped on was turning the option to allow-list cookies into a byzantine and infuriating process compared to what it used to be.
If you want to reject all cookies and allow-list only a handful of sites, you'll need to go into privacy settings and choose a "custom" option to reject all cookies. Presumably you're knowledgeable if you're here, but if not, there's a scary warning that tells you doing this will "cause websites to break." Once that's done, reload your tabs and realize that if you choose "allow all cookies" at a later date, switching back to the "custom" setting doesn't return you to your former "block all cookies," just the watery default of blocking some cookies.
Now if you want to allow-list a site, good luck. You can't use add-ons to do it and there's no menu option to quickly accomplish this anymore. Open your settings again, under privacy, and custom settings again, and you're faced with a form to enter your new site. Once you add the site to the list, you must hit save. Yes, the site is in the list now, but unless you hit save, you didn't add it.
Now arguably Firefox cracked down on cookie block/allow capability at the behest of Google and advertisers some years ago, but to see them doubling back on the cookie issue --not to fix the blocklist feature but to nanny-state your cookie preferences even further-- is a real slap in the face.
Stop tiptoeing around the issue to appease advertisers. Let us block what we want to, quickly and easily.
Setting aside that this is how cookies should have worked from the get-go, I have a question/scenario:
1. User visits site-a.com, which sets a cookie containing 'ThisIsUser9'
2. site-a.com also rewrites every external URL on the page, with a new param '&adtrack=ThisIsUser9'
3. User clicks on external link on site-a.com and goes to site-b.com
4. site-b.com's server sees the adtrack param on the end of the URL and sets a cookie 'ThisIsUser9' and also adds the adtrack param to all external URLs on the returned page.
5. Advertising company works with site-a and site-b (and many many other sites) to build up a persistent profile of your browsing habits.
We can't stop this, even with this new FF cookie isolation. Those of us who care will install an extension to strip known trackers from all URLs (a sketch of that stripping follows below), and 90% of all other web users will still be tracked as usual.
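A minimal sketch of what such an extension's content script might do, using the hypothetical "adtrack" parameter from the scenario above plus a few well-known tracking parameters:

    // Parameters to strip; "adtrack" is the hypothetical one from above.
    const TRACKING_PARAMS = [
      "adtrack", "fbclid", "gclid",
      "utm_source", "utm_medium", "utm_campaign",
    ];

    function stripTrackingParams(href: string): string {
      const url = new URL(href);
      for (const param of TRACKING_PARAMS) {
        url.searchParams.delete(param);
      }
      return url.toString();
    }

    // Rewrite every link on the page before the user can click one.
    for (const link of document.querySelectorAll<HTMLAnchorElement>("a[href]")) {
      link.href = stripTrackingParams(link.href);
    }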
I can at least SEE that siteA passes my information to siteB. Or at least that it passes something (e.g. a huge base64 chunk in the URL). That's a big step forward. I can also block the Referer header so the siteA URL itself isn't visible. If I want to navigate from siteA to siteB and the URL doesn't look "expected", I can choose not to click it. Tracking that only takes place in URLs, and only when I click things, isn't nearly as scary or problematic as cookies.
There is a solution, but it's somewhat clumsy compared to just right-click, "Open in new tab / window": Right-click, "Copy URL", paste into a text editor, remove any suspicious bits, copy what's left, go back to the browser, open a new tab / window, paste in your sanitized URL.
No, I don't think anyone does that with every damn link. I sure don't.
With all of the cookie protections and in-app privacy settings, is highly targeted advertising becoming less effective? If so, will the advertising giants need to provide a disclaimer when you try doing it? Will it lower ad prices?
Or will it take regulations to remove targeted ads?
"That’s because the prevailing behavior of web browsers allows cookies to be shared between websites, thereby enabling those who would spy on you to “tag” your browser and track you as you browse."
> Is that true though? I thought it was well known that you can only access cookies from your own domain:
That's where ad networks come in. A cookie set by <adtracker> when you're browsing, say, nytimes.com will be sent to that <adtracker> when you're browsing, say, reddit.com, and that's how the adtracker knows it's the same person on both sites.
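To make the mechanism concrete, here is a toy tracker endpoint (the domain and cookie name are hypothetical) that both sites might embed as a pixel; without partitioning, the browser presents the same cookie on both sites:

    import http from "node:http";
    import crypto from "node:crypto";

    // Toy "adtracker" pixel server. Both nytimes.com and reddit.com would
    // embed something like <img src="https://adtracker.example/pixel">.
    http.createServer((req, res) => {
      const cookies = req.headers.cookie ?? "";
      let id = /uid=([a-f0-9]+)/.exec(cookies)?.[1];
      if (!id) {
        id = crypto.randomBytes(8).toString("hex");
        // SameSite=None; Secure is required for cross-site cookies today.
        res.setHeader("Set-Cookie",
          `uid=${id}; SameSite=None; Secure; Max-Age=31536000`);
      }
      // The Referer header tells the tracker which site embedded the pixel.
      console.log(`saw user ${id} on ${req.headers.referer ?? "unknown"}`);
      // A real tracker would return a 1x1 GIF body here.
      res.writeHead(200, { "Content-Type": "image/gif" }).end();
    }).listen(8080);

With Total Cookie Protection, the uid cookie set while visiting one site lives in a different jar from the one set while visiting the other, so the IDs no longer match across sites.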
Now, if Mozilla would allow Firefox to be configured such that it doesn’t call home or update itself in any way, that would be nice also, as I don’t see why Mozilla needs to know about me either.
Not sure, how does the tor browser score in these fingerprinting tests?
Looks like you lose quite a bit of functionality. Would be nice to have Tor Browser-like safety and a permission for "use advanced browser stuff that might enable fingerprinting" so you can trust certain sites where you need it.
Does anybody know whether this would complicate existing implementations of session sharing via a shared cookie?
For example, a site a.example.org may save a cookie for domain .example.org, and b.example.org would be able to read it. Site A would then be able to provide some information for Site B to consume, such as logged in state or ID.
From the sounds of it, this total cookie protection feature will essentially not allow this implementation to work.
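For reference, the pattern in question is a cookie scoped to the registrable domain, something like the sketch below (names and value hypothetical). Note that a.example.org and b.example.org are the same site (example.org), and per the per-site comments elsewhere in this thread the partitioning key is the top-level site, so as far as I can tell this setup should only be affected when example.org content is embedded under some other top-level site:

    import http from "node:http";

    // Handler on a.example.org setting a cookie every subdomain can read.
    http.createServer((req, res) => {
      res.setHeader("Set-Cookie",
        // Domain=.example.org scopes the cookie to the whole registrable
        // domain, so b.example.org receives it too.
        "session=abc123; Domain=.example.org; Path=/; Secure; HttpOnly");
      res.writeHead(200, { "Content-Type": "text/plain" }).end("cookie set");
    }).listen(8080);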
It sounds like you can design a login provider around that: direct to login site with a return address, confirm with user they want to log in, post back to return address with token that allows site to query login provider.
Do you have any good examples of sites that don’t work on Firefox? I hear this a lot, but I don’t seem to experience it. I exclusively use Firefox on the desktop, while I use Safari on mobile.
Most of the time the sites "work". Issues are usually in one of two categories: (1) bad/ugly layout, (2) failure to log in properly. Occasionally, web apps for smaller organizations will just stop me at the door due to my User-Agent string.
Unfortunately, every time I try, the usability and flows are - for me - lacking. Like, not being able to easily add and edit search engines (adding search for Amazon, YouTube, etc.), history and bookmarks not opening in a full tab by default, closed tabs and windows being separated in history...
The main thing I don't like about FF is that the UI is kind of blocky and clunky looking compared to Safari or Chrome. (This is on macOS.)
A trivial example of missing UI polish - when you open "About Firefox" after restarting the browser, the window always appears in the top left for a split second, then moves to the center.
This is great! I recently tried to fudge this behavior into Firefox myself by using the container tabs and temporary containers extensions. I wonder if these extensions would add any additional protection above strict mode now.
I like this idea a lot. One thing I'm confused about, though. Does this also apply to CORS requests? If A.com sends a withCredentials CORS request to tracker.com, won't the tracker.com cookies still be sent?
No, it still has an effect. CORS operates on a per-origin basis, while privacy mitigations operate on a per-site basis. You might want withCredentials if www.site.example wanted to share cookies with forums.site.example.
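A minimal sketch of that case, using the example hostnames above (the path is hypothetical): a credentialed fetch that is cross-origin (so CORS applies) but same-site (so partitioning keeps a single cookie jar). The server would still need to reply with Access-Control-Allow-Credentials: true and an explicit Access-Control-Allow-Origin:

    // On a page at https://www.site.example:
    async function loadForumProfile(): Promise<unknown> {
      const res = await fetch("https://forums.site.example/api/profile", {
        credentials: "include", // send cookies despite the different origin
      });
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      return res.json();
    }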
> In addition, Total Cookie Protection makes a limited exception for cross-site cookies when they are needed for non-tracking purposes, such as those used by popular third-party login providers.
There is no allowlist. The tracking supercookies from FB and Google should be blocked; only those detected to be for SSO using a common heuristic are allowed.
Basically, everything is isolated to the first party domain (the domain of the URL in the address bar), including content caches, HTTP/2 connections, local storage, preferences, etc.
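Conceptually, that double-keying looks something like this sketch (a toy model, not browser internals): state is looked up by the pair (top-level site, resource origin) instead of by origin alone.

    // Toy model of a double-keyed cookie store.
    const cookieJars = new Map<string, Map<string, string>>();

    function jarFor(topLevelSite: string, origin: string): Map<string, string> {
      const key = `${topLevelSite}^${origin}`; // the two-part partition key
      let jar = cookieJars.get(key);
      if (!jar) {
        jar = new Map();
        cookieJars.set(key, jar);
      }
      return jar;
    }

    // tracker.example gets a different jar under each top-level site:
    jarFor("https://site-a.com", "https://tracker.example").set("uid", "123");
    jarFor("https://site-b.com", "https://tracker.example").get("uid"); // undefined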
This seems like a nicer solution than Safari, which is blocking even session cookies in third party iframes. Makes it hard to have a multi-page browser game embedded in gaming sites.
Did this update also re-enable sponsored links on new tabs? They just popped up on all of my computers. Mostly I think Firefox is great but things like this annoy me.
Wondering if we can get our sane olde Web back by piecemeal subtraction of all the stuff of the 2010's, and starting over. Makes browsers much simpler, too.
There's an opportunity for this to happen by taking some time to just read through CSS 2.1 and implement the renderer. So much of the web is driven by that portion of spec alone. Then, you could tack on whatever other programming language you wanted to play around with. It doesn't even necessarily have to be JavaScript.
Most people don't even succeed in implementing CSS 2.1, though. It takes a non-negligible amount of time.
> We also want to acknowledge past and ongoing work by colleagues in the Brave, Chrome, and Safari teams to develop state partitioning in their own browsers.
This should have always been the only way it worked.
Plus it should be easier to create whitelists of allowed websites and have all other cookies deleted on every browser restart. I know it is possible with Firefox, but you need to add websites to the whitelist manually in deep settings. At least there are some extensions that make it easier, like CookieAutoDelete https://addons.mozilla.org/en-US/firefox/addon/cookie-autode...
Awesome work - in retrospect it seems insane that it took the world until 2021 to implement this obvious solution for responsible data segregation.
If you care about using an open, secure, and not surveillance-driven Internet and you are using Chrome rather than Firefox (or Safari or even Edge), you are part of the problem rather than the solution. That said, I run on Mac and on Linux. In both places, Firefox is roughly the same speed, but with dramatically better privacy. The internet is an awful place without containers for isolating Google and Facebook.
Users can already get this behaviour by setting 2 values in about:config, so why is this presented as a new feature? Did Mozilla lay off devs to start making marketing stunts?
Total Cookie Protection? Great, I wish it would solve my year-long problem of Firefox eating my cookies and sessions when it silently updates itself. /rant
Another feature that no one asked for and that breaks stuff. Every site that Mozilla breaks is one more nail in its coffin.
Speed is your second requirement, then security, then privacy: the first requirement is always that the bloody websites work.
When Mozilla lost track of this and prioritized security, then privacy, then performance, and finally/maybe letting you get your job done, their market share started to fall.
The world needs an alternative to google's vertical.
One that actually works.
I control cookies outside the browser, in a forward proxy. I can allow/deny any cookie based on rules I set. I value privacy protection against a browser vendor just as much as privacy protection against advertisers (who keep browser vendors in business). I do not trust the browser. I trust the proxy. That's how I get "Total Cookie Protection".
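For the curious, the core of such a proxy can be quite small. A minimal sketch (the allow-list and port are hypothetical, plain HTTP only; doing this for HTTPS requires a MITM proxy with its own CA, as mentioned elsewhere in the thread):

    import http from "node:http";

    const ALLOW_COOKIES = new Set(["example.org"]); // hosts that may use cookies

    http.createServer((clientReq, clientRes) => {
      // For a forward proxy the request line carries an absolute URL.
      const url = new URL(clientReq.url ?? "");
      const headers = { ...clientReq.headers };
      if (!ALLOW_COOKIES.has(url.hostname)) delete headers.cookie;

      const upstream = http.request(
        { host: url.hostname, path: url.pathname + url.search,
          method: clientReq.method, headers },
        (upstreamRes) => {
          const resHeaders = { ...upstreamRes.headers };
          if (!ALLOW_COOKIES.has(url.hostname)) delete resHeaders["set-cookie"];
          clientRes.writeHead(upstreamRes.statusCode ?? 502, resHeaders);
          upstreamRes.pipe(clientRes);
        }
      );
      clientReq.pipe(upstream);
    }).listen(3128);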
I really, really like Firefox, but this is basically what happens when I try to get people to use Firefox (and yes, I do actually try to get people to use Firefox):
E: Hey use Firefox!
O: OK, I'll give it a try!
O: Hey, why doesn't X site work properly with Firefox?
Firefox: Introduces something making it more likely that another site doesn't work
O: Hey, now Y site doesn't work either!
E: Hey, just wait a second you can-
O: Sorry, I don't have time for this, I'm switching back to Chrome.
IMHO - Firefox's #1 priority should be making sure every site in the first 10,000 of Alexa works as well with Firefox as it does with Chrome, period.
What good is amazing privacy stuff if your userbase is rapidly dwindling?
list of sites that don't work (many, if not most of these work on Chrome without issue):
I've been using Firefox as my main browser for a long time, and over the past couple of years I noticed an uptick in websites that wouldn't work unless I used Chromium. For instance, last week I had to use a crappy HSBC website that wouldn't let me log in with Firefox (it would just hang) while it worked in Chromium.
It's still very minor and I can't even come up with a 2nd example off the top of my head but it does definitely happen from time to time.
If anything, these few cases only make me value Firefox even more; I don't want to enable the Chrome monopoly.
Even GSuite works better for me in Firefox. Slides stays smooth even when scrolling through large presentations and it never locks up (like Chrome does).
Cisco Webex is a repeat offender. The experience is much better in Chromium. If I am using Firefox I have to dial in to a meeting using my phone instead of being able to use my USB headset.
Why is this a complaint at Firefox, and not at Google for abusing their monopoly to create new features on a whim regardless of what it does to other browsers?
I suppose because some of them are in the standard and not implemented in other browsers. Or there are some 20-year-old (reported) bugs that are not fixed while Pocket and the robot are featured.
settings that are known to break websites are disabled in the default configuration, and labeled clearly in the settings pane.
firefox doesn't exist to "win" the browser wars. it doesn't even exist to give users the best possible browsing experience, although that's certainly a primary goal and in my experience they're doing well.
the #1 reason that firefox exists is so that mozilla can have a seat at the WHATWG table -- because very important decisions about the fabric of the world wide web happen there, and the other seats all belong to apple, google, and microsoft.
mozilla is the closest thing we (the users -- not just firefox users, but all web users) have to a "representative" in the WHATWG, because mozilla doesn't answer to shareholders.
> What good is amazing privacy stuff if your userbase is rapidly dwindling?
aside from a noticeable dip when the new chromium-edge started shipping with windows, firefox browser usage on desktop has been pretty steady for the past 5 years.
the value in adding privacy features is that it solidifies a certain use of the protocols, making it harder for WHATWG to make spec changes that undermine the provided security.