Would be great to have some more details about it: in particular, how do I turn it off if I prefer to add any exceptions manually.
Edit 1: Mozilla Hacks blog  has a bit more but still doesn't answer the question:
> In order to resolve these compatibility issues of State Partitioning, we allow the state to be unpartitioned in certain cases. When unpartitioning is taking effect, we will stop using double-keying and revert the ordinary (first-party) key.
What are these "certain cases?"
Edit 2: Reading on, there's this bit about storage access grants heuristics  linked from the blog. But is that really it, or is there a hardcoded whitelist as well? If so, it'd be great to see it.
This bit in particular is ambiguous in how it's supposed to work exactly (who's "we" here):
> If we discover that an origin is abusing this heuristic to gain tracking access, that origin will have the additional requirement that it must have received user interaction as a first party within the past 30 days.
This is a great question, and I'm glad you found the answer. As you probably understand, for many blog posts we avoid going into too much technical detail.
To answer your final question, there is no hardcoded allow-list for State Partitioning. The heuristics as described on MDN are accurate.
I don't see why we can have full-blown web apps but our text needs to be very specifically just text these days.
I've only recently discovered that Markdown has footnotes, and I've gone to town adding footnotes everywhere.
I use Jekyll + markdown on my website, and I now have lots of fun adding footnotes to my writing.
I added a "footnote tutorial" for readers on https://josh.works/turing-backend-prep-01-intro#why-this-rub..., to help them learn how to navigate the footnotes.
I _love_ your library, and I love the problem that you're solving with it.
Along the way, I've looked at Gwern's sidenotes and Nate Berkapec's "footnotes"/sidenotes .
I eventually want to do something more "in-line", like what you've done with Expounder, but I've been satiated with markdown footnotes for now.
From the demo it looks as if Expounder is one-way: once you've expanded something, you can't collapse it again. Is that correct?
Thank you so much for posting gwern’s sidenote article! I want to use sidenotes on my site and this was a very valuable resource!
If I were reading a technical text, I would definitely end up reading most paragraphs at least twice. It would make no sense to keep the expounded terms in the second time; I'd be tempted to hide them back as soon as I was finished with them the first time.
It's because, once clicked, the new text should become part of the old, and that's it. Presumably you've already read it, and I don't want to make the viewer have to re-collapse the links every time.
Your use case makes sense, though, which is why the feature was included. Maybe I should mention it in the README.
But I see there is a CSS class, which is nice.
Just a simple rgba(x,x,x,0.5) where the x's are the usual yellow values.
Be it typographic emphasis or coloring, there should be a hint. And clicking the text thus emphasized should collapse it.
That's my opinion; otherwise, nicely done.
The author's intent here is to have terms explained in the text explicitly in such a way that it would 'augment' the text with an explanation somewhere further down the line, but not necessarily "in-place".
It is also intended for text specifically, rather than replacing one element with another.
I agree that details/summary is similar in spirit, though; I had not come across those before.
If you want to be listed as an author, just drop over to https://github.com/withinboredom/expounder-wordpress/tree/ma... and let me know your wordpress.org user names in an issue.
Thanks again for your help!
Mozilla who? That’s where we are now.
If cookies from another bucket should be shared with other sites, or might be seen when requested by a cross-site load from another site, ask the user a four-choice question.
"Allow (site) to see cookies from (site)?"
Always Allow, Just this time, Ask later, Always Deny
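Purely to illustrate that proposal (the names and data structure here are hypothetical, not an existing browser API), the remembered answer could be keyed per site pair, much like partitioned cookie jars:

```ts
// Hypothetical sketch of the proposed prompt; not an existing browser API.
type Choice = "always-allow" | "just-this-time" | "ask-later" | "always-deny";

// Remember the user's answer per (embedding site, cookie site) pair.
const decisions = new Map<string, Choice>();

function shouldShareCookies(topSite: string, cookieSite: string): boolean | "prompt" {
  const prior = decisions.get(`${topSite}->${cookieSite}`);
  if (prior === "always-allow") return true;
  if (prior === "always-deny") return false;
  return "prompt"; // "just this time" and "ask later" mean asking again next time
}
```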
Would be great to know how those concerns are handled.
I guess that clears it up.
Not really... for a highly technical issue like this, at a minimum you should link to the technical details.
There really is no excuse for making every reader of your blog who wants to know the details dig for them independently.
imo, at least.
I've been an FPI user for years as a best effort to rein in tracking, but there are a few common sites that just break with FPI (50% of the time PayPal checkout doesn't work). Even if "Total Cookie Protection" is only 98% as effective as FPI, I'm making the switch.
EDIT: FPI = first-party isolation
Given Firefox's low adoption, I fear that website owners will just ignore that their excessive tracking breaks their site in Firefox... "Works in Chrome... good enough."
Facebook and Google will be excepted? This makes it a joke, sadly.
This move is aimed at killing other AdTech companies which rely on 3rd party cookies to track users.
They're painting this as a 'PRIVACY' move, after they have already found other ways of tracking users across websites and devices.
(on mac) Firefox > Preferences > Privacy & Security > Custom
Your answer seems to be about how to turn off "Enhanced Tracking Protection"/"Total Cookie Protection" or parts of it (resulting in weaker protection). I want to keep it enabled and disable the exceptions (for stronger protection), i.e. the opposite.
I haven't installed the new version yet, so can't say for sure, but as far as I know there is no setting for this in that menu. 
If I misunderstood what you meant, please elaborate.
For greenhorn web developers, you could say the same thing about TLS certificates. Why weren't they always free?
Well, another reason is that TLS (and formerly SSL) wasn't just about encryption, but about a "web of trust." Encryption alone isn't trust.
Many things about web technologies have changed over time; and it's easy to say that any individual piece of functionality should have worked this or that way all along, but the original intent of many web features and how those features are used today can be very different.
One day industry standards may dictate that we don't even process HTTPS requests in a way where the client's IP address is fully exposed to the server. Someone along the way might decide that a trusted agent should serve pages back on behalf of a client, for all clients.
After all, why should a third-party pixel.png request expose me browsing another website?! How absurd. Don't you think? And yet, we do it every day.
Which is a nice principle, but given corporate and government incentives, the trust provided was lackluster at best. The PKI is pretty much broken because of it.
In the end, all it did was incur an unaffordable cost for hobbyist bloggers and other netizens.
Which, fair, is trading one PKI for another, but web servers vastly outnumber authoritative DNS servers. But DKIM gets along fine without it, so we probably could too.
I think the whole idea of sharing cookies across origins was a conceptual mistake right from the beginning, because it is also responsible for quite a lot of security vectors which had to be fixed by other mechanics like the SOP (Same Origin Policy) which in turn required mechanics like CORS (Cross Origin Resource Sharing).
And with all those mechanics in place, modern browsers are pretty tied up and significantly reduced in their abilities compared to other HTTP/S clients. So when you want to build a PWA (Progressive Web App) that can use a configurable backend (as in federated), you will run into all kinds of problems that can all be traced back to the decision to share cookies across origins.
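As a concrete illustration of the kind of wall meant here (the endpoint and backend names are made up), a PWA with a user-configurable backend runs into CORS as soon as it sends credentials cross-origin:

```ts
// Sketch only: "backend" is a hypothetical federated instance on another origin.
// Because it is a different origin, this fetch is subject to CORS: the server
// must answer the preflight with Access-Control-Allow-Origin set to the exact
// page origin and Access-Control-Allow-Credentials: true, otherwise the browser
// blocks the response, even though any non-browser HTTP client could read it.
async function loadInbox(backend: string, token: string) {
  const res = await fetch(`${backend}/inbox`, {
    credentials: "include",                         // send that origin's cookies, if any
    headers: { Authorization: `Bearer ${token}` },  // non-simple header, triggers a preflight
  });
  if (!res.ok) throw new Error(`backend error: ${res.status}`);
  return res.json();
}
```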
Why is HTTP/2 Server Push being rescinded?
Why do user agents not provide additional types for <script> based on runtime installations?
Why isn't there a renderer API that allows me to use painted trees in <canvas>, but there is a bluetooth API that no one uses?
That those mistakes were not made deliberately, and were made with good intentions, is a completely different story; and that in hindsight everything looks so clear is also well known ;-)
"Web of trust" is a pretty specific term that doesn't apply to TLS/SSL:
Did you mean to say "public key infrastructure" (PKI)?
I may be confusing the terms "chain of trust" and "web of trust," but to my best knowledge, I don't recall EVs being sold on the former term.
My apologies. I hope there are folks out there who have a better recollection who can piece this together.
Not in the corners of the web I frequent. I've been blocking 3rd party cookies for years and the only site that's broken was some Pearson online homework site.
This isn't simply "blocking third party cookies", it's "even an iframe has no access to the other state partition". The third party cookie is allowed to exist but it cannot leak to other sites. However, this leak prevention breaks plenty of other things if one is not careful (Mozilla was, there is a heuristic).
Why is this not the default behavior already?
To be honest though, browser fingerprinting makes this mostly irrelevant unless you carefully use a script blocker with a whitelist too. Any domain that includes trackers that drop third party cookies almost certainly includes scripts that can fingerprint you and send results to a server without using a third party cookie.
If they had used the same domain for their products historically and just separate subdomains they wouldn't have to make this trade off, but it probably also helps with third-party ad networks/segmentation to get folks to turn it on anyways.
Solving a problem isn't irrelevant just because there are other problems; there's definitely more to do, but this still has value.
Here's a reference to a F5 device providing SAML SSO services and having a similar issue:
> since we cannot discount the possibility of malicious users programatically generating tokens and forcing them upon users, we check the referer header to ensure that the request chain was initiated in the one place that we're comfortable with: id.atlassian.com
Make of that what you will.
SSO via OAuth still works fine, because OAuth uses redirects instead of cookies.
With shared cookies nothing stops site A from taking a copy of your cookie and using it to impersonate you on site B. With redirect based login the identity provider has to authorize each application that is being accessed and each site has its own session cookies.
The main problem is dealing with globally revoking access but that's usually solved with shorter termed session cookies that periodically need to be refreshed from the identity provider.
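A minimal sketch of what "redirects instead of shared cookies" looks like in practice (endpoint, client ID, and parameter values are illustrative, following the standard OAuth 2.0 authorization-code flow):

```ts
// Step 1: the app sends the user to the identity provider; no cookie from the
// app's origin is ever shared with the IdP or with other applications.
function buildAuthorizeUrl(): string {
  const params = new URLSearchParams({
    response_type: "code",
    client_id: "my-app",                          // registered per application
    redirect_uri: "https://app.example/callback", // must be pre-registered with the IdP
    scope: "openid profile",
    state: crypto.randomUUID(),                   // CSRF protection, checked on return
  });
  return `https://idp.example/authorize?${params}`;
}

// Step 2: the IdP redirects back with ?code=...; the app exchanges the code
// for tokens (server-side or via PKCE) and then sets its *own* session cookie,
// which is what allows each site to keep an independent, short-lived session.
```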
This is making them have to allocate resources to achieve the same effect. Like taking lojack off of your car and phone, and making 'Them' have to tail you and scour security footage like in the old days. It's more expensive. Expensive things do not scale, so you have to prioritize who is worth the cost. People who are under legitimate suspicion of causing harm. Less 'by-catch' to use a commercial fishing concept.
When it's cheap to harass everyone, nobody is 'safe'. But when terrorists can't be tracked at all, nobody is 'safe' either. So we have checks and balances.
In my case, I strip all cookies and sensitive headers. One must keep in mind that the browser will treat it as a first-party request and the security implications that has. You may have to filter or modify cookies/headers.
When people need to keep a door open, if they don't see a doorstop in the immediate vicinity after two seconds of looking, some will just use whatever heavy object that is closest and consider the problem 'solved' instead of managed.
I needed data, I didn't know where to put it, this thing can give me data, boom, solved.
Good. Disqus had it too easy.
>> It also going to break [..]
Good. They had it too easy.
I'm absolutely loving the fact that my switch to Firefox is paying off. Finally!
Good. They had it too easy. I'd pay $20 for a clean version of FFX on the Mac/iOS App Store.
Thank Microsoft, Google and Apple for that.
By that logic, we should turn off our computers to improve security.
For one thing it means they're locked to my session.
> In addition, Total Cookie Protection makes a limited exception for cross-site cookies when they are needed for non-tracking purposes, such as those used by popular third-party login providers. Only when Total Cookie Protection detects that you intend to use a provider, will it give that provider permission to use a cross-site cookie specifically for the site you’re currently visiting. Such momentary exceptions allow for strong privacy protection without affecting your browsing experience.
That's exactly why I have to toggle it. Any site that uses Auth0, and many publication sites (follow a link to a PDF, get redirected to `/cookie-absent` instead), fall foul.
Similarly the OpenID Connect Session Management feature (check_session_iframe) also depends on the ability to use third party cookies.
This functionality is needed to detect, from front-end code, whether the user has logged out, without relying on any back-end code that could receive a front-channel or back-channel sign-out notification and relay it.
In the absence of that, a pure SPA with no backend could only detect the logout if access tokens are stateful and it gets an error message back saying that the token refers to an ended session.
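For readers who haven't seen it, the mechanism is roughly this (simplified from the OpenID Connect Session Management draft; URLs, IDs, and the polling interval are placeholders):

```ts
// The RP embeds the OP's check_session_iframe (a third-party iframe) and polls
// it with postMessage. Without third-party cookie access, the OP frame cannot
// see its own session cookie and can no longer answer correctly.
const opFrame = document.getElementById("op-frame") as HTMLIFrameElement;
const clientId = "my-spa";
const sessionState = "..."; // value returned in the authentication response

function checkSession() {
  opFrame.contentWindow?.postMessage(`${clientId} ${sessionState}`, "https://idp.example");
}

window.addEventListener("message", (e) => {
  if (e.origin !== "https://idp.example") return;
  if (e.data === "changed") {
    // user logged out (or switched accounts) at the OP; re-authenticate or sign out locally
  }
});

setInterval(checkSession, 5000);
```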
Some people get really cranky if a single sign out feature does not actually sign you out of everything.
Forgive me ... do I understand that there is a true/false setting in Firefox named "privacy.firstparty.isolate" that you like to toggle from time to time ... and you use an extension to do that ?
I don't do much browser customization and use only one extension (uBlock Origin) but ... couldn't I toggle a single Firefox setting with a simple command line ?
Why would you need an extension to do that ?
Genuinely curious ...
On startup it's enabled (i.e. do isolate) via a config file, so I could change it there with a shell script. I think though that I'd have to restart Firefox for it to take effect.
The extension gives me a handy button in the toolbar that's red (danger) when it's off (i.e. not isolating) that I can just click to toggle.
Yes it's a tiny job for an extension, but do one thing well right? Also, to be honest, it's easier that it's there than switching to or pulling up a new shell.
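For anyone curious, the config-file route described above is just a line of user.js in the Firefox profile directory (read once at startup, which is why a restart would be needed; the pref name is the one mentioned above):

```js
// user.js in the Firefox profile directory; applied only at startup.
user_pref("privacy.firstparty.isolate", true);
```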
Afk to confirm, but pretty sure this is the one I use:
It's not particularly fun to implement. It's not hard, but the heuristics are enough of a nudge that it can create weird experiences for users.
"I thought I already signed in, but after I navigate, I have to click sign in again, and a window pops up and then I'm automatically signed in? Why?"
Edit: Yeah, seems so.
See also: https://webkit.org/blog/8124/introducing-storage-access-api/
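For reference, the API behind that prompt-and-heuristics flow is the Storage Access API described in the WebKit post above; from inside the cross-site iframe it looks roughly like this:

```ts
// Must be called from a user gesture (e.g. the "sign in" click mentioned above).
async function ensureStorageAccess(): Promise<boolean> {
  if (await document.hasStorageAccess()) {
    return true;                            // this frame already has unpartitioned cookie access
  }
  try {
    await document.requestStorageAccess();  // may prompt, or be granted via heuristics
    return true;
  } catch {
    return false;                           // the user or the browser declined
  }
}
```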
And what's the migration path for users who have been using that setting previously?
Can I now disable it? Do I have to disable it?
1. By allowing third-party cookies, but compartmentalizing them by the first-party site that sent the request; a much better name for this feature would be "per-site cookie containers", as "total cookie protection" is completely uninformative. (A rough sketch of this double-keying follows point 2.)
2. By using a heuristic to selectively allow cookies to be accessed across the container boundary if they are actually needed, e.g. for logins.
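A rough sketch of what that double-keying means in practice (the data structures are illustrative, not Firefox internals):

```ts
// Cookie jars are looked up by (top-level site, cookie origin) instead of by
// origin alone, so the same tracker gets a separate jar on every site.
type CookieJar = Map<string, string>;
const partitions = new Map<string, CookieJar>();

function jarFor(topLevelSite: string, cookieOrigin: string): CookieJar {
  const key = `${topLevelSite}^${cookieOrigin}`; // partition key
  let jar = partitions.get(key);
  if (!jar) {
    jar = new Map();
    partitions.set(key, jar);
  }
  return jar;
}

jarFor("news.example", "tracker.example").set("uid", "abc123");
jarFor("shoes.example", "tracker.example").get("uid"); // undefined: the IDs cannot be joined
```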
To answer your question, this doesn't make sense as "day one behavior" because it's basically a patch to work around a historical problem with as little breakage as possible. If you were setting up cookie permissions on day one, knowing what we know now, you wouldn't kneecap third party cookies, you'd disable them entirely. Mozilla is trying to make third party cookies useless for 99% of what they're used for: if that's how you feel about third party cookies, you'd just not implement them.
Incidentally, I do block all third party cookies by default and have for years. That's a much stronger approach than the compartmentalization that Mozilla is attempting. I can count on one hand the number of sites I've seen break because of this, most of them are happy to let these cookies fail silently.
BGP and SS7 are other famous examples.
I'm not arguing to give up. Rather, I'm more convinced in investing in privacy NGOs like noyb.eu and make it expensive to toy with my privacy.
They don't even have to. Just store two (or N) sets of cookie trails, as they already do. This will waste a few MB of storage on the client side and do nothing to the ads/privacy situation.
Sites never shared the ID anyway, especially since GDPR et al.
Ad tech works like this: you send a hash of one ID and, on the backend, attach all the profile info (nobody will ever share that with partners, because that is gold); the other side just assigns their own hash of their ID and also keeps all their targeting info on their backend. The only thing that matters is that party A's ID123 is known to match party B's IDabc. Note that those IDs are transient and set at random, because party A and party B don't want to give up their secret info by matching IDs across multiple sites. That is called a cookie match. It does NOT depend on a single cookie jar. It doesn't even depend on cookies! Why do you think most ads (and Google search result links, ha!) have those weird hashes appended? Zero cookies needed.
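To make the described cookie match concrete, a deliberately simplified sketch (real systems are far more elaborate; the names here are placeholders):

```ts
// Party A and party B each keep their profile data on their own backend; the
// only thing exchanged is the mapping between their transient, random IDs.
const matchTable = new Map<string, string>(); // party A id -> party B id

// Typically populated via a match pixel/redirect that carries A's ID in the
// URL while B reads its own ID from its own context.
function recordMatch(partyAId: string, partyBId: string) {
  matchTable.set(partyAId, partyBId);
}

// Later, when A and B transact over an impression, only the ID needs to be
// translated; the profiles themselves never leave either side.
function translate(partyAId: string): string | undefined {
  return matchTable.get(partyAId);
}
```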
Another thing that helps even more than 3rd-party cookies is the multi-site referrer, but Google killed that in both Chromium and Firefox a long time ago (Firefox still has the about:config way to disable it / set it to single-site / set it to multi-site-domain-only, but good luck finding a single human who changes that setting by selecting magic numbers).
(Disclosure: I work on ads at Google, speaking only for myself)
Just saying that it won't matter much if removed from the equation.
I mean, if something makes your life easier, you would be a fool not to use it. But that is like saying not having a Ferrari prevents you from driving to the store.
With third party cookies this looks like (simplified MVP form; a toy sketch follows the two steps):
1. When you visited shoes.example, it loaded a pixel from ads.example. That pixel automatically sent your ads.example cookie, and put you on a remarketing list.
2. When you visit news.example, it sent an ad request to ads.example, which also automatically sent your ads.example cookie. Now the ad tech vendor knows to include the ad from the shoe site because it recognizes the third-party cookie.
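A toy sketch of the ad-tech side of those two steps (everything here is illustrative; ads.example is the placeholder from the steps above):

```ts
// The browser attaches the ads.example cookie automatically on both the pixel
// request and the ad request; that shared cookie is what joins the two visits.
const remarketingLists = new Map<string, Set<string>>(); // cookie id -> sites visited

// 1. Pixel request from shoes.example, carrying the ads.example cookie.
function handlePixel(cookieId: string, site: string) {
  if (!remarketingLists.has(cookieId)) remarketingLists.set(cookieId, new Set());
  remarketingLists.get(cookieId)!.add(site);
}

// 2. Ad request from news.example, carrying the same ads.example cookie.
function chooseAd(cookieId: string): string {
  return remarketingLists.get(cookieId)?.has("shoes.example") ? "shoe-ad" : "generic-ad";
}
```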
On the other hand, without third-party cookies or any replacement browser APIs, how do these identities get joined? Very occasionally someone will follow a link between a pair of sites, and then you can join first party identities, but you probably don't have a chain of identities that connects a news.example first-party identity to a shoes.example identity.
1. When you visit shoes.example, it has an iframe to show an ad from ads.example. This iframe runs some JS to compute a browser fingerprint and then nests an iframe to hxxps://ads.example/?target=shoes.example&client=$fingerprint . The ads.example server records that this fingerprint has visited shoes.example
2. When you visit news.example, it has an iframe to show an ad from ads.example. This iframe runs some JS to compute a browser fingerprint and then nests an iframe to hxxps://ads.example/?target=news.example&client=$fingerprint . The ads.example server recognizes the fingerprint, knows that the client visited shoes.example earlier, and returns a shoes ad. (A naive sketch of the fingerprint step follows.)
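A naive sketch of the fingerprint computation in those two points (real fingerprinting scripts use many more signals, such as canvas and WebGL; the point is only that no cookie is involved):

```ts
async function naiveFingerprint(): Promise<string> {
  // A handful of passively readable signals, concatenated and hashed.
  const signals = [
    navigator.userAgent,
    navigator.language,
    `${screen.width}x${screen.height}x${screen.colorDepth}`,
    Intl.DateTimeFormat().resolvedOptions().timeZone,
    String(navigator.hardwareConcurrency),
  ].join("|");
  const digest = await crypto.subtle.digest("SHA-256", new TextEncoder().encode(signals));
  return Array.from(new Uint8Array(digest), b => b.toString(16).padStart(2, "0")).join("");
}

// The nested iframe URL then becomes
// hxxps://ads.example/?target=shoes.example&client=<result of naiveFingerprint()>
```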
I do agree this is possible to do with fingerprints, though (a) all the browsers are trying to prevent fingerprinting and (b) a reputable ad company would not use fingerprints for targeting. This is my understanding of why Google is putting so much effort into https://github.com/WICG/turtledove
(Still speaking only for myself)
Cookie sync. It's a freaking industry standard. And you want us to believe Google's money cow will dry up as soon as the effort they are leading goes live?
> you want us to believe Google's money cow will dry up as soon as the effort they are leading goes live?
"we are confident that with continued iteration and feedback, privacy-preserving and open-standard mechanisms like the Privacy Sandbox can sustain a healthy, ad-supported web in a way that will render third-party cookies obsolete. Once these approaches have addressed the needs of users, publishers, and advertisers, and we have developed the tools to mitigate workarounds, we plan to phase out support for third-party cookies in Chrome. Our intention is to do this within two years." -- https://blog.chromium.org/2020/01/building-more-private-web-...
They are describing adding new capabilities to the browser that would make this possible, in a privacy preserving way. Ex, https://github.com/WICG/turtledove/blob/master/FLEDGE.md
Google sends ID abc to shoes.com and ID xyz to news.com; both send those IDs back to Google's own ad server. Presto, Google knows you are seeing those two ads.
Just show first-class, useful controls for cookies in the browser UI and the problem solves itself. That's what the EU cookie law should have been.
Every user understands "site A wants to store a save file" and "site A wants to access a save file". Nobody understands cookies and same-origin and CORS.
What I care as a EU citizen: Are you collecting and storing information that can directly or indirectly identify me? Yes, tracking and profiling are included in this.
You want to store some session cookies, so you remember my shopping cart? Go ahead!
You want to store some cookies, so you remember I was logged in? Sure!
You want to use every available technological loophole to follow my every path on the Internet? Errrr, no thanks!
I'll keep my fingers crossed for a GDPR 1.1 that patches some of the things they got wrong.
Wouldn't you agree?
This should have always been the only way it worked. Every website should run as if it were opened in a separate browser.
> third-party login providers
Don't use these, it's a trap.
Except if you're setting up SSO for your company's employees. Using a 3rd-party login provider is a necessity. You shouldn't trust employees to create unique, strong passwords for every individual service they log in to.
It lets us offer SSO with whatever Auth0 supports as a freebie add-on, instead of "well, we could work with your platform but it's gonna cost you."
I don't see how it's a trap, except that we have to pay auth0 a monthly fee to handle our authentications instead of having some number of hours a month spent maintaining and securing our customers' logins and integrations.
SSO is a must in any big organisation; there are tens or hundreds of applications.
People are incredibly and consistently bad with security.
You really need a way to be able to cancel all accesses in one swoop for any individual.
The same is true for forcing users to reset their password every 50 days or so, by the way. This outdated password guideline doesn't seem to die. I know way too many cases where people are using a weak base password with a number attached to it because they got sick of trying to remember a new password every month.
There are people who actually invent a new password every time instead of cycling numbers?
Also, changing your password a few times until the history is flushed and then switching back to the same password you started with is a thing.
Where can I learn about best SSO practice/implementation?
But SSO centralizes access management. For instance, with one switch I can set password requirements, require 2FA, and grant/revoke access to all of an employee's services when they join the company or leave.
Pretty hard to avoid in many cases. Logging in to your Microsoft account for Office (Teams, Outlook, et al.) uses a login service, as does Google, and practically all services that span across multiple domains. Which includes all of the major ones, at this point.
Good that Firefox gives us this option, given how the web has evolved!
With a single third-party login for all services, though, if that third-party account gets compromised, the results are catastrophic.
The same can be said of the password manager account. It's turtles all the way down.
The fact that we rely on users to not reuse passwords, the fact that using a password manager is all but required to get reasonable security despite being far from convenient, these indicate a major failure to serve the actual needs of users, in my view.
Users have head space for 1-3 strong passwords. They can tolerate carrying maybe 1 security token with them. They can tolerate a little bit of security setup when using a new device for the first time, and they can tolerate a touch or fingerprint scan at authentication time. All authentication systems can and should operate within these parameters.
No web site or app outside of an authentication provider should ever present a user a screen asking them to pick a strong password that they have never used before. That is asking a user to do something that the human brain cannot reasonably do for 99% of the population. At best, a browser or password manager will intervene at that point and pick the password for them. At worst, the user ignores the warning and picks the same password they use for everything else.
What password manager account? What are you talking about? There is never any password manager account. Yes, I have heard that some people synchronize their passwords to some strange 3rd-party services, but those don't matter. You have one password: the encryption password for the login database, and that one is local and never transmitted over the internet. If you know of a password manager that provides this decryption password to their servers, please open the topic here and they will be bashed to hell for it.
I am a tad more strange: my password manager is synchronized with my SFTP server using a private key, and I not only randomize the passwords for each site but also the email address (imagine sha(user+salt) + delimiter + sha(domain + master password)@mydomain.com). And I will never in my life use any SSO, as they are mostly spyware designed for tracking users across sites and certainly not for what they are advertised for. They will break with Firefox's latest addition? FINE! At least people will stop using them.
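A small sketch of the per-site address scheme being described (the hashing details, truncation, and helper name are my own guesses for illustration, not the commenter's exact recipe):

```ts
import { createHash } from "node:crypto";

// Derive a stable, site-specific address that reveals nothing about the
// addresses used on other sites.
function aliasFor(user: string, salt: string, domain: string, masterPassword: string): string {
  const left = createHash("sha256").update(user + salt).digest("hex").slice(0, 12);
  const right = createHash("sha256").update(domain + masterPassword).digest("hex").slice(0, 12);
  return `${left}.${right}@mydomain.com`;
}

// aliasFor("alice", "pepper", "shoes.example", "correct horse")
// -> a stable address of the form "<hash>.<hash>@mydomain.com", unique per site.
```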
Company self-hosted SSOs are one thing; sure, I can trust those for company services. For anything else, like "login with Google" or "login with Facebook"? Yeah right, my heart is jumping for joy and can barely wait to use it. It actually works in reverse: if you don't allow me to register with a non-SSO account (email, password), I won't use your service/webpage/whatever.
Do they actually do this? Also, don't most of the big ones allow you to opt out of personalized ads?
I like this because it's easier to have strong 2FA with backup codes on a few well protected accounts, than to do it for every tiny site.
The "Sign-in with Google" button is makes it much quicker to create an account and slightly quicker to log in.
Also, I can rely on my Google 2FA rather than setting up and filling in a different TOTP for each site. Something like U2F or WebAuthn would make the filling-in part more convenient, but even sites that offer 2FA usually don't offer those. (And many sites don't even offer 2FA.)
Using 1Password's 2FA feature would make TOTP more convenient, but I'm a little nervous about putting 2FA in 1Password. This might be overly-conservative thinking, though.
Every time I log in to a service, I have to guess which account it's associated with (bearing in mind I may have signed up years ago). And if I'm wrong, half the time it immediately attempts to create a new account, and then I'm stuck with a bunch of empty dummy accounts on various services.
> This should have always been the only way it worked. Every website should run as if it were opened in a separate browser.
FYI: Extension "Temporary Containers" does this: https://addons.mozilla.org/en-US/firefox/addon/temporary-con...
One big advantage is that I now have way more addons installed on Firefox that would otherwise make Chrome utterly slow and unusable.
This seems to indicate there are no faster alternatives around anymore, but the last time I tried FF (4-6 months ago) I couldn't make the transition because the lag was pretty obvious coming from Chrome-based browsers. Is that no longer the case?
Whether or not the Google programmers use specific proprietary knowledge about the behavior of Chrome to optimize performance is a different question. If they do, that would be similar to the things that got Microsoft in trouble.
Google knows that every time they release a Firefox bug, FF's user percentage goes down a tiny bit. Repeat over dozens of bugs, for years, and you have a strategy.
There's one blog post from another Mozillian that I can't find anywhere that came out within the last year with other examples, I think it was on HN.
You are looking for https://web.archive.org/web/20180728122724if_/https://twitte...
I use Brave + Ublock exclusively.
Brave and uBO share filter tech and we aim to make uBO unnecessary (this may require setting shields to aggressive). We do much more than any extension can do, and Google has made it clear they will further restrict extension APIs.
https://brave.com/privacy-updates-7/ (latest in series)
I've used Firefox since 2006, and Chrome always seemed heavier, laggier and uglier. Maybe it's the snappy iOS-like animation when you scroll to the bottom of the page that makes it seem snappier?
I still used Firefox a lot for various reasons (and still do), but I'm not blind to how it performed.
I really want Firefox to work for me and I'd love to drop Chrome, but last time FF made big noise about performance improvements I tried it out and Gmail was still unusably slow.
For me, the way Google is keeping Gmail terrible for other browsers is exactly the reason to not use Chrome. No way I'm OK with that.
I even use it on my phone. The mobile version is definitely worse than Chrome, but it has plugins (or it used to! nowadays it only supports a few popular ones, which is a shame) and also I can send tabs from my phone to my computer (which is a better place to read articles anyway).
I didn't find a noticeable difference between FF and Chrome-based browsers (Vivaldi, Edge) on macOS (although Safari runs circles around them) after using them extensively. I used each of them for a separate project with several common websites loaded in them; there were different quirks for each browser (especially regarding tab hibernation), but latency was not one of them.
On Linux, FF seems definitely faster than Chromium, although there are occasional DNS errors which stop web pages loading altogether (likely a result of my own doing). I've stopped having different browsers for different projects and just use FF for all.
On Android, not just Chrome but even WebViews that use it are astonishingly fast (e.g. the DDG browser); I presume it's because of the data saver feature. On de-Googled Android like LineageOS, FF/Fennec seems to be on the same level as Chromium, and DDG is faster here as well.
On iOS, everything is Safari.
I don't use Windows much, but I've seen others mentioning Edge seems to be faster than Chrome recently.
The stumbling block for me as a Firefox user is that I am increasingly bumping into web apps that perform poorly in FF but are fine in Chrome for one reason or another. One instance I bump into a lot is Elasticsearch's Kibana, which runs like trash in FF for some reason.
Except on Facebook. My Facebook tab is incredibly laggy, and gets more and more laggy the longer I leave it open. I'm one of those users that tends to keep 50+ tabs open, and I have to close and reopen the Facebook tab at least once a day to keep it from becoming a nearly frozen mess. Even then, if a video is playing and I click it to make it fill the window, it takes several seconds for it to happen. And with an i9-9900K, 32 GB of RAM, RTX 3080, and a 1 TB NVMe drive, my computer is definitely no slouch.
In a way I see it as a win, I really really hate opening it on desktop.
I feel that most people complaining about slow browsers have no blocker installed.
I find the majority of their privacy claims dubious and dangerously misleading for those that don't know any better. If they were serious about privacy they'd offer uBlock Origin (or equivalent functionality) preinstalled by default.
Their current countermeasures, such as containers, tracking protection, and this cookie thing, are trivial to bypass with browser fingerprinting and IP-address tracking if you have a global view of the Internet (which Facebook and Google do have).
I get you about the updates. It's a risk-reward ratio I accept because firefox + noscript + always starting in a private session is way more helpful than the update problem is harmful. Using a VPN a lot of the time helps, too. There is no solution I know of that is perfect. My threat model is pretty relaxed, though, so what I do is mostly for my peace of mind. You have reminded me that I should start spoofing my user agent again.
But it is extremely misleading for them to be shouting "privacy" at every opportunity while the truth is that their browser leaks personal data like a sieve in the default configuration. This would give a false sense of security to non-technical people who don't have the skills to see through these lies.
Sorry for the somewhat angry comment, but I honestly can’t understand this mentality.
I'm not saying Chrome is any better, but at least Chrome doesn't toot the "privacy" horn at every opportunity.
Brave does have some kind of blocker built-in which might actually help even if it's not perfect.
I haven't experienced this since the rapid release schedule started. They're pretty silent now.
Are there any other config changes you would recommend to Firefox to harden it?
"neither Mozilla nor Pocket ever receives a copy of your browser history. When personalization does occur, recommendations rely on a process of story sorting and filtering that happens locally in your personal copy of Firefox."