It doesn't stop sites pulling all cookies, but it reduces the amount of information being sent: only the information in the same container is accessible.
If you browse the Internet in default Tabs or in a specific Container, you still collect Cookies, Storage and Cache in one place, which is something advertisers and other data-collecting services really appreciate: it makes tracking you easy.

Fortunately there’s an easy way to automatically create a new Container every time you open a new Tab and delete the Container when it’s no longer needed: the Temporary Containers Add-on. By default you can open new Tabs in Temporary Containers with the Toolbar Icon or the keyboard shortcut Alt+C. If you enable “Automatic Mode” in the options, however, it will override your standard ways of opening websites in new Tabs and of external programs opening links: instead of opening the website in No Container, it will open it in a freshly created Temporary Container. You’ll notice how the names of the Containers keep counting up every time you open a new tab and visit a website: tmp1, tmp2, tmp3. As soon as you close the last Tab in such a Container, it is automatically removed, and with it all the data that makes you easy to track.
I had some tabs (mostly YouTube) opened thrice when clicking on a YT link.
They don't automatically switch back to the default container. That's a big problem. You open your FB container (and Firefox can do this automatically when entering a FB URL or following a link there), you follow a link elsewhere or enter another URL, and you keep inadvertently surfing in the FB container for the next hour.
Some of the multiple tabs issues have been fixed in version 6.0.0. And the upcoming version will fix them completely.
> They don't automatically switch back to the default container. That's a big problem
I've added an "Isolation" feature to the just published version 0.67 of the Temporary Containers Add-on that gives you several ways to avoid accidentally staying in the same container - including an option to only allow "Always open in" assigned domains to load in their container.
First party isolation was made for Tor and privacy and keeps all cookies in containers in some fashion.
That way, you could click on it when you're using JIRA and have first party isolation when you're not using JIRA.
It just works! But, the one problem that I've encountered is clicking links in gmail cause them to be opened within the Google container.
There's also a discussion about implementing such an Isolation feature directly in Multi-Account Containers.
As an aside, I used this guide having discovered it a couple of years ago. I can't remember how, but I ended up on a site that had been hacked, might have been following a link from a forum, and the page I was looking for wasn't there. Instead there was a link on the page saying it had been moved. I stupidly clicked the link and off I went to a random site that I can only assume was meant to drop some form of malware or take control of my browser. Anyway, all I was left with was a message congratulating me on how secure my browser was; I didn't stay around.
The new requested permission, different from the ones previously accepted by me is, if I remember:
"Monitor extension usage and manage themes"
What's that if not a new telemetry? In a privacy-oriented extension!
It's explained here:
""Monitor extension usage and manage themes": Required to provide interoperability with other container Add-ons by checking if they have the required permissions."
How about not being required? Older versions really didn't require it. Knowing the management (see "Looking Glass"), even if the extension is not using telemetry at the moment, later it's just "hey, the user already agreed!" Especially troublesome as I haven't agreed anywhere else.
Unfortunately there's no other way for us to check whether an Add-on that tries to access the "API" has the needed "contextualIdentities" permission. I can assure you that it's in no way about telemetry and never will be. If the Multi-Account feature were a Firefox platform feature, then such an API would have the same requirements.
Also, you didn't effectively "agree to telemetry". The same way you didn't "agree to send all the data of websites you visit to the Add-on developers" when you accepted the "Access your data for all websites" permission that the Add-on also requires. If you don't trust the Add-on or its developers, then that permission should scare you more than the "Monitoring extensions" permission. But as you probably know, it's just wording you have to take with a grain of salt: a lot of Add-ons/Extensions require permissions which would in theory allow them to inject arbitrary content into websites you visit, or read data from them for that matter. It's how permissions work.
Now, why "automatically" for Add-ons that have the "contextualIdentities" permission? It's simple: if you grant an Add-on that permission, you already gave your consent that the Add-on can access the Container API; and the Multi-Account Containers Add-on itself, being from Mozilla, is just additional Container functionality shipped as an Add-on. So if you grant the Container permission, you also get API access to some additional information from the Add-on, and with that increased interoperability between Container Add-ons.
Net effect is that firefox starts exactly as I like, but forgets everything that happened in the session ('groundhog-day mode').
Edit: added 'su -l' step.
Edit: As an addendum, note that this technique can be extended to complete 'guest' accounts as well, e.g. 'cd /home; rm -fr guest; cp -a guest.base guest; su -l guest'; the entire 'guest' account is then 'groundhog-dayed'.
# Set up clean copy (starting from the home directory, so that `cd -` returns)
cd ~
rm -fr .mozilla
cp -a .mozilla_base .mozilla
cd - > /dev/null
# Clean out junk (so we start clean next time)
cd ~
rm -fr .mozilla .cache/mozilla*
rm -fr .adobe
rm -fr .macromedia
cd - > /dev/null
firejail --jail /tmp/firefox /usr/local/bin/firefox
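The excerpted commands can be tied together into one wrapper; a minimal sketch, assuming a pristine profile was saved once with `cp -a ~/.mozilla ~/.mozilla_base` (the plain `firejail firefox` invocation is illustrative; the exact flags above are the commenter's own setup):

```shell
#!/bin/sh
# "Groundhog-day" wrapper sketch: restore a pristine profile, run the
# sandboxed browser, then wipe everything the session accumulated.

reset_profile() {
    # Start every session from the pristine copy.
    rm -rf "$HOME/.mozilla"
    cp -a "$HOME/.mozilla_base" "$HOME/.mozilla"
}

wipe_session() {
    # Forget everything: profile, caches, and Flash leftovers.
    rm -rf "$HOME/.mozilla" "$HOME"/.cache/mozilla* \
           "$HOME/.adobe" "$HOME/.macromedia"
}

if [ "${1:-}" = "run" ]; then
    reset_profile
    firejail firefox    # sandbox the browser process itself
    wipe_session
fi
```

Invoked as `groundhog-firefox.sh run`; keeping the reset and the wipe in one script means a crashed session still starts clean next time.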
Generally, I agree with you that the lighter the implementation is, the better, but when it comes to sandboxing and other security measures, I would prefer not to roll my own.
What things it isolates are listed here: https://wiki.mozilla.org/Security/Contextual_Identity_Projec...
FWIW they work great.
Still, the main reason I run the browser as a different user is to isolate it for privacy; there are some security benefits too, but I agree it's not something that would defeat a targeted attack.
The biggest downside is that you lose access to the X clipboard - which is also good, so its data doesn't get compromised.
FWIW, I use different Firefox "profile" and that site is able to link the two profiles.
Restarting (exactly the same version) of Firefox a second time and revisiting the site gave:
"Are you anonymous?
Do you think that switching to your browser's private browsing mode or incognito mode will make you anonymous?
Sorry to disappoint you, but you are wrong! Everyone can track you. You can check it out for yourself. Just type your name below."
Which seems to suggest that whatever the site does failed in my (admittedly unusual but still simple) case.
However, I have little doubt that my (rather atypical) setup could be fingerprinted accurately - assuming, of course, I was part of a big enough minority to be worth advertising to.
N.B. other local factors could affect the results here; the more obvious ones are local DNS and a firewall between the ADSL router and the LAN.
If one is serious about regaining control over the sending of user data to these corporations and websites ("privacy"), then IMO one needs a browser that either lacks or can disable the features above and any others that allow media to be "pushed" to the user without any user input. Such a browser would only execute GET or POST upon user input, not upon input from other sources, such as websites.
Perhaps users could have two browsers: one for commercial activity and running "web apps" and another for non-commercial activity, which may not need to be default compatible with "web apps" that push media to the user. This is an alternative to having to become an expert in browser settings.
Instead of disabling features or installing add-ons, the later browser is incapable of pushing unsolicited media or leaking user data because it lacks the necessary features to do so. (I have been using such a browser for many years now. While this is probably not for everybody, I like it.)
I currently use uMatrix for this and it implements this almost perfectly. However, its scope is too narrow: it only controls requests within the webpage, so it doesn't have access to the many requests the browser makes outside of that scope.
If you start by broadening the scope from webpage to browser, you eventually get to the operating system level, at which point we're really just talking about a firewall/proxy tool with granular control. I've used things like privoxy and proxomitron for this in the past; little snitch is the best I've used in terms of UX and control, but it's still nowhere near as good as the uMatrix interface.
There are a number of challenges with making such a tool, the primary two being: (1) MITMing secure connections, and (2) contextual control: differentiating iframe, js, css, image, etc. requests becomes more difficult once you're working at a global level.
Given these limitations, uMatrix in combination with a good, strict about:config that allows granular control over everything may be the best we can ask for in the short term.
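One way to get that strict baseline is a user.js file dropped into the profile directory, which Firefox applies on every start. A minimal sketch of prefs that cut down background requests uMatrix never sees; the pref names below are from memory for a Firefox of this era, so verify each in about:config before relying on it:

```js
// user.js — placed in the Firefox profile directory, read at every start.
// Disable speculative/background traffic that bypasses page-level blockers.
user_pref("network.prefetch-next", false);                // <link rel=prefetch>
user_pref("network.dns.disablePrefetch", true);           // DNS prefetching
user_pref("network.predictor.enabled", false);            // speculative connections
user_pref("network.http.speculative-parallel-limit", 0);  // preconnect on hover
user_pref("beacon.enabled", false);                       // navigator.sendBeacon
user_pref("geo.enabled", false);                          // geolocation lookups
```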
You also have the "behind the scene" settings:
That's for uBlock Origin but I seem to recall it works similarly in uMatrix (can't check now).
My only concern here is: does the extensions API used by this feature definitely cover all requests made by the browser? E.g. I don't see requests to geolocation services from the Navigator.geolocation API, Google Safe Browsing, or CT auditing included in the list of example request types there.
You do have access if you choose to modify/recompile the .cpp/.rs files that deal with sending the requests those higher-level functions use; this is what I do.
Some particular places of interest on /mozilla-central/:
- /servo/components/style/gecko/urls.rs pertains to calls made from CSS image functions
- /netwerk/protocol/http/nsHttpHandler.cpp has some classes that deal with sending all HTTP requests
* No WebGL or WebRTC
* Aggressive TLS settings (will break many websites)
* Mixed-content upgrading (Nightly ran an experiment on this recently and it also broke a lot of websites)
* No history
The text warns about this, but it should at least be clear why Mozilla doesn't ship this as default.
- uBlock Origin in default configuration
- No 3rd party cookies (breaks some things, but not too many)
- Clear history and cookies on exit
Combined with an /etc/hosts file, and rather frequent browser restarts (generally daily).
Why the history? That's not readable by anyone except you, right?
> Clear [..] cookies on exit
So do you have to keep logging in to websites daily? Isn't that very annoying?
It does get annoying because I have to type my credentials all the time, but it just takes a few seconds so it's not a big deal. I'm sure this is not preventing me from doing anything better with my time.
Which of course trades off against privacy.
Check out pi-hole and manage that for your whole family
* Install extension "uBlock Origin"
* Install extension "Cookie AutoDelete"
* Go into Preferences -> Privacy & Security, set "Accept 3rd party cookies" to "Never"
First-party isolation will isolate third-party cookies by first-party domain. So Facebook Like buttons on cnn.com will see different cookies than Facebook Like buttons on nytimes.com. Both of these features can break some websites.
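Both settings can also be pinned in user.js rather than clicked through the UI; a sketch, with pref names as I recall them (verify in about:config):

```js
user_pref("network.cookie.cookieBehavior", 1);  // 1 = block all third-party cookies
user_pref("privacy.firstparty.isolate", true);  // key storage by first-party domain
```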
Typically, it's small sites that outsource part of their site to a third party (but don't want to open a separate tab for it) that will be affected by this. If you only browse major sites doing everything in-house, you're not going to run into problems.
The majority of these settings aren't the default because they cause significant site breakage (uBlock is maybe the exception). If they could have been enabled by default, they would have. Mozilla has been taking gradual steps in that direction, and backed off/reverted a few times when too much stuff broke.
Use Tor Browser. If a site breaks, at least you'll know why, and you have the choice of whether it warrants lowering your privacy.
I disagree with this. I do not see Mozilla taking any steps toward privacy at all; in fact, in many ways they are taking steps AWAY from privacy with many of their recent actions and blunders: their use of opt-out rather than opt-in for various privacy-invading features, their pushing adware to all users via what was supposed to be a QA/feedback feature, their purchases of and investments in certain companies...
No, the Mozilla Foundation from the '90s is long dead. It has been replaced by the Mozilla Corporation, which is really no different from the Google Corporation or the Microsoft Corporation.
https://bugzilla.mozilla.org/showdependencytree.cgi?id=12609... are some concrete steps being taken.
Or the containers work. Or the tracking protection work. If you're not seeing those, it's because you're not looking.
I consider myself somewhat tech-fluent (not in the industry) and privacy-oriented, but there's a happy medium of installing FF on my dayjob computer and loading it up with ublock Origin/privacy badger/etc versus some of the other suggestions. I'm OK with the occasional day-to-day leak if most of the footprint can be obscured.
I'll note that disabling custom fonts breaks certain sites. I don't consider it a deal-breaker, but it's worth being aware. Many sites abuse fonts for icons. Developers, please consider using SVG icons instead.
Another comment mentioned how user.js disables WebGL and WebRTC. IMO, that and many other browser features should be disabled by default. If a site requires their functionality, I should be able to whitelist it. Safari used to let you conditionally enable WebGL access for only certain sites, showing a prompt when the functionality was accessed. It's a damn shame they removed the feature. I don't think most sites should have full access to all these browser APIs. Heck, all the storage APIs should probably be limited to the current session by default, with the option of requesting longer-term persistence for trusted services.
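For anyone wanting the whitelist-by-default posture described above, the coarse version today is flipping the features off globally in user.js and re-enabling them per need; a sketch (pref names from memory, check about:config):

```js
user_pref("webgl.disabled", true);                 // no WebGL for any site
user_pref("media.peerconnection.enabled", false);  // no WebRTC (breaks video calls)
```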
I'd really love it if we had an easy way to create fully isolated containers for each web service or group of web services, with varying tweaks in their security preferences.
Since we're already on the topic of configuring Firefox, I have a tangential question. Does anyone know how to configure Firefox to automatically save rar files? You usually get the option to always save a given file type, but the choice isn't available for rar files, so you always get a download popup. It's quite annoying, and I have no idea why it's happening. A cursory search didn't reveal any useful information on the matter. It's perplexing, because tar and zip files can be set to automatically save without any problem.
I hadn't seen uMatrix before, but it looks promising. Does anyone know of any user-friendly OS tools that let you monitor and inspect requests? On macOS I used Little Snitch for a long time, but I'm trying to shift away from closed-source tools (no problem with paying, but I want to be able to compile it myself), especially for something so critical. Also, it doesn't let you inspect requests.
Maybe I'm missing something but for me, going to the Options tab, selecting General tab, then going to the Applications section and modifying the entry for RAR file in the list from 'Always Ask' to 'Save file' does the job.
Does it work for you?
For your second question, how about Wireshark? It's open source and does let you inspect the traffic.
I just noticed that in the popup RAR files are identified as binary, while ZIP files are properly identified. Gonna have to dig into it a bit.
I've used Wireshark before, but most requests nowadays are using HTTPS. I vaguely recall at some point having tried to snoop on local HTTPS requests with Wireshark and ending up frustrated.
This is a mistake. uMatrix will block requests which would pull source code, but it does not stop script execution, i.e. those embedded in the page itself. NoScript stops script execution completely.
NoScript also activates `<noscript>` tags which will allow content to render on Medium. And which break Twitter by redirecting away just after it loads...
Caveat emptor: NoScript is somewhat buggy in its WebExt form. Sometimes it needs a click on the global revoke button to render `<noscript>` again. Sometimes it pops up blank windows. I'm not aware of any alternative though.
> This is a mistake.
Thanks for bringing this up - I couldn't quickly find info regarding how uM handles inline scripts, but I see that uBO does have that specific option. This has caused me to look deeper into uBO, and I'm now considering revising the guide and dropping uM completely, since I personally don't require all the granularity of uM (I always allow images/css globally, for example).
So although there may be some caveats with not using NS, it seems that uBO can basically eliminate the need for NS, at least for those of us who just want stuff to work for the most part.
You are mistaken. You could have taken a few seconds to try for yourself before making this erroneous claim.
> NoScript also activates `<noscript>` tags which will allow content to render on Medium.
uMatrix can also "activate" the `noscript` tags, and this can be disabled/enabled on a per-site basis.
I did base my comment on an observation. I wouldn't be using NoScript myself if I wasn't fairly sure it's the only way.
Seems I was wrong indeed, because there are no scripts listed in the Debug tab of the Inspector if uMatrix blocks them.
Thanks for pointing it out! *removes NoScript*
It's one of the per-scope switches, see:
Unfortunately, Twitter's forced redirect still takes place even after it's off :(
Ah, that explains why I occasionally see <noscript> on links. I use NoScript too, but only allow scripts from a select list.
Also, using both uBlock Origin and uMatrix is somewhat redundant. Gorhill himself has advocated using per-domain permissions in uMatrix and not having different settings for each element type (if I remember correctly; I can't be bothered to look up the source right now), which is easily done in uBlock Origin using Advanced Mode. One can also replace Neat URL and Skip Redirect with Request Control, which is a more flexible solution, imo, though it requires one to make their own rules.
I'm surprised that this post is linked to a non-https URL while the website supports https.
I wish HN had some policy/recommendation to prefer
https URLs to non-https ones (if the site supports both).
The least they could do is ask for it...
Edit: I shouldn't write early in the morning, tons of grammar mistakes.
That's some genuinely nasty stuff that no one would normally want on their machines AND visible only from an obscure about:support page AND with no clear way of disabling it, save for deleting .xpi files:
 https://wiki.mozilla.org/Firefox/Shield - generalized engine for running "study" recipes.
"Follow On Search" is AFAIK controlled by the Telemetry settings in the same dialog in "Privacy & Security".
> ignore one's updating preferences
Are you sure? They're add-ons, so they should follow the add-on update preferences. Given that aushelper apparently does nothing aside from modify the update URL to include info whether the system is affected by some bug, it's hard to see how it wouldn't respect the settings.
It's not about denying Mozilla an option of pushing zero-day patches. It's about the fact that it's a built-in always-on _concealed_ feature.
I really don't see how it's particularly concealed. If they actually tried to conceal it, you would not know about it at all.
Nice article otherwise and I have to congratulate the web designer - what a beautiful, readable site!
Also check out https://www.privacy-handbuch.de/handbuch_21.htm for more interesting Firefox modifications (German site).
Ruining the analytics is a better tactic, in my opinion. Flood the tracking with useless data.
A non-Firefox-related privacy addition I will suggest is a strong hosts file.
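A hosts-file blocklist works by resolving known tracking hosts to an unroutable address, so those requests never leave the machine. A two-line sketch with placeholder hostnames (real curated lists run to tens of thousands of entries):

```
# /etc/hosts — hostnames here are placeholders, not a real blocklist
0.0.0.0 tracker.example.com
0.0.0.0 ads.example.net
```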
Preferences > General > Performance > Uncheck the box next to Use recommended performance settings.
You will then be able to change the following settings:
- Use hardware acceleration when available
- Content process limit
> To disable e10s/multiprocess go to about:config by typing it in your URL bar. Search for browser.tabs.remote.autostart using the search box on about:config. There may be multiple results. Set them all to false and restart the browser (if there are no entries, create it as a boolean and set it to false).
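The quoted about:config steps can equally be captured in user.js so they survive profile resets; a sketch (the `.2` variant of the pref exists on some versions of this era, hence both lines):

```js
user_pref("browser.tabs.remote.autostart", false);
user_pref("browser.tabs.remote.autostart.2", false);  // present on some versions
```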
It will still have another process that controls all the UI and all those browser-tab processes, a process for sandboxing extensions/plugins, and then even more processes for miscellaneous things. For example, I think asynchronous scrolling has another process, and when doing performance improvements they often just stick long operations into a separate process.
However, the days when we could install some plugins and tweak a few settings to restore our privacy are, unfortunately, pretty much over. There’s only so much a plugin can do when it doesn’t have access to the core APIs of the rendering engine or the network stack.
As long as Google and Firefox are incentivized to make money by ads, user tracking and all of the rest, they won’t stop.
Long story short: the business model of the web has to change, from one where the default is to monetize the invasion of our privacy to one where we control who gets to advertise to us and our attention is recognized as valuable; we should be paid for it.
In short, that’s what the Brave browser is all about: https://brave.com/com465. By default, it blocks ads, tracking scripts, fingerprinting and 3rd party cookies in such a way that most pages don’t break. It even blocks those cryptocurrency mining scripts that some sites like Salon are using: https://www.cnbc.com/2018/02/14/salon-disable-ad-blocker-or-....
Brave allows you to pay content creators with a cryptocurrency called Basic Attention Token (BAT) based on the amount of time spent on their sites or as a percentage of a monthly contribution. BAT is based on the Ethereum token standard.
Later this year, Brave users will be able to opt in to getting paid to watch high-quality, relevant ads if they wish. How? By using zero-knowledge proofs, Brave can show you ads matched to your browsing history without leaking your personal information; that history never leaves your machine.
Be aware: Brave is in beta; it’s not done yet. It’s based on Chromium but the rest of the tech is under heavy development. It has come a long way in the 3-4 months I’ve been using it regularly. And there are lots of good things in store, including Tor on a per-tab basis, which I’m looking forward to: https://github.com/brave/browser-laptop/wiki/Brave-Tor-Suppo...
But in all seriousness, I stand by what I said—just installing plugins isn't going to do it any longer. Blocking ads, tracking scripts and the like needs to be built into the browser, and that's what Brave has done.
Brendan Eich's explanation is on point: https://vimeo.com/209336437