Facebook Container for Firefox (mozilla.org)
1274 points by aaossa on April 17, 2018 | 394 comments



Third-party cookies, and any means of fingerprinting a specific user (from high-entropy user agents to screen resolution, font fingerprinting, or canvas data), should be considered a breach of the browser security model.

All sites should run in containers and no advertiser should be able to track you across sessions. When I want 3rd party interaction, I should need to opt in and connect the current site with Facebook or some other 3rd party.


Ever encountered Google reCAPTCHA with third-party cookies turned off, or in incognito/private mode? It's a nightmare, even if you're logged into a Google account. You can be shown up to 7-8 challenges, with painfully slow-loading images, and when you try the audio version Google insists it has encountered malicious traffic from your IP. What's even worse is that they track your mouse movements, and fast solvers like me are penalized because they think I'm not human anymore.

As much as I hate third-party cookies, turning them on drastically simplifies the captcha-solving process. So much so that I now use a separate browser profile with third-party cookies allowed, just for captcha-heavy sites.

Given that Google benefits both from tracking my activities and from the free labor of my captcha solving, it will always punish privacy-conscious users via such dark patterns.


The optimist in me thinks that if browsers started to disallow third party cookies, ReCaptcha might have to adapt to the change and make it easier to solve if third party cookies are blocked. After all, if they didn't, they'd risk website owners moving away from ReCaptcha because visitors could no longer interact with the site.

The pessimist in me thinks that if a browser were to disallow third party cookies, users might simply switch to a different browser that does allow them.


I don't think it has so much to do with reCAPTCHA as much as it has to do with Cloudflare and website owners themselves. Most reCAPTCHA challenges come in the form of Cloudflare challenge passage pages. Website owners have 5 options to choose from regarding challenges:[1]

— Essentially off: Challenges only the most grievous offenders

— Low: Challenges only the most threatening visitors

— Medium: Challenges both moderate threat visitors and the most threatening visitors

— High: Challenges all visitors that have exhibited threatening behavior within the last 14 days

— I’m Under Attack!: Should only be used if your website is under a DDoS attack (Visitors will receive an interstitial page while we analyze their traffic and behavior to make sure they are a legitimate human visitor trying to access your website)

I think a lot of companies set it to "High" and forget about it, not realizing that it's ruining the experience for a lot of users.

[1]: https://www.cloudflare.com/a/firewall/ebelinski.com#security...


It doesn't ruin the experience of anyone except a couple of overly technical users and customers in the wrong locations.

It's very good at blocking malicious traffic though, and it's totally worth losing a couple of users for that.


Yes, it's the users that are wrong!


> The optimist in me thinks that if browsers started to disallow third party cookies, ReCaptcha might have to adapt to the change and make it easier to solve if third party cookies are blocked.

That would require bringing everybody to the same table; Firefox will not manage it alone, and Google Chrome isn't there for nothing.

Basically, Google would have to accept disabling third-party cookies by default, which would affect its income negatively, so it may simply refuse to implement the feature in Chrome and may even prevent it from becoming a standard.

Sad.


Chrome already has a "Block third-party cookies" feature in Privacy and security > Content settings > Cookies.

Kinda hard to find though.


Users tend to blame websites rather than browsers when the websites don't work. So with bad CAPTCHAs, I'd guess users would more likely blame the sites using them.


This is what finally made me switch my default search engine away from Google. I will put up with a lot of crap, but I'm not going to spend 60s to solve a CAPTCHA every time I do a search.


Bear in mind that Google might be correctly identifying your IP address as a botnet source. If you start seeing lots of CAPTCHAs, it's worth it to take a look around your network for open ports or weak ssh passwords etc., or just look at a traffic monitor to see if there's a lot of egress from your net.


Oh, I'm sure there are a million reasons for them to block me. I normally use a VPN, incognito searches, limited script blocking, and cookie blocking. Pretty sure they don't want me as a customer, not that I can completely blame them. Anyway, I've discovered that there are other reasonably effective search engines with a different business model, and I'm fine using those and giving them a modest amount of revenue. (I don't block ads that don't track me.)


I've never seen a CAPTCHA when simply doing a search - is that because I refuse to log in to Google?


Perhaps Firefox could fake third-party cookies instead of disallowing them.


Then you'd probably never pass captcha.


Firefox is big enough that captcha providers would be forced to change, so long as Firefox stuck to it.

I make no claim that the above is a good idea. It is possible, but it strikes me there are unintended consequences that I won't think of.



This has been a huge problem of mine when using Tor, after disabling 3rd-party cookies and also using Firefox containers. It took me two days to come to love DuckDuckGo, and now it's the default on my laptop, phone, and all tablets, where Firefox with uBlock Origin is how I browse the web.


ReCaptcha means to me that I need to blacklist your site and never come back. F@ck that noise


I got very sick of this crap from Google, which is why I switched my default search engine to DuckDuckGo.


Yeah we need to have a talk about captchas and privacy. They're pretty plainly used by providers (Google) to force people into being tracked.


I can't even get reCAPTCHA to work when I have the uMatrix extension turned on, even when I turn off uMatrix's functionality by whitelisting everything. I need to go into my browser extensions and turn off the whole extension just to get past a reCAPTCHA.


Using the logger is key to solving that sort of issue. As stated in the documentation, the per-scope switches keep having an effect even if you disable matrix filtering. The logger will show you what uMatrix is still modifying, if anything.


With umatrix, try unchecking the 'Spoof HTTP header' option. I was able to get it working this way. There's no need to disable the extension itself.


I sympathize. It's terrible when a computer thinks you must not be human because you're not slow and don't make enough mistakes.

I've been accused of being a bot before also.


I tried making a throwaway Protonmail account while on the Tor network. I swear I probably had to go through 25 captchas before it worked.


Sometimes Google's reCAPTCHA doesn't work even when you are logged in.


I've always had to solve 3-4 captchas. They are insufferable and I have simply stopped using most of the sites that use recaptcha unless they are vital to me.

It feels like Google is using me as a mechanical turk to solve their autonomous car rubbish, which will make them millions, and I have no choice but to do it, or I'm barred from the sites I need to access. It's profoundly despicable, as if I didn't hate Google enough already.


>It feels like Google is using me as a mechanical turk to solve their autonomous car rubbish

That explains why the majority of ReCaptchas I get are along the lines of "Click the squares that contain a street sign" or "Click the squares that contain a bicycle."


The security model of the internet is whack. JavaScript should be opt-in, especially from 3rd-party sources: "This website wants/requires JavaScript for an enhanced viewing experience [allow/deny]". Instead, not only will any stock browser gladly run anything thrown at it, it will also accept any cookies, run WebGL code, trigger DRM engines and various other things. All because the engineers who wrote the standards had enough hubris to think that a Turing-complete language with access to such a large API surface (from the DOM to the GPU to threads to a lot of the browser state) could be successfully sandboxed.

The web is a lost cause as far as I'm concerned; it was under-engineered at the start, and as a result people kept piling mounds of crap on top to make up for it. What started as a way to display basic markup and images has been arm-wrestled into handling complex web applications and videogames. The thing is so complex that it would probably take me less time to write a basic operating system and run Firefox on it than to implement my own browser from scratch on Linux. And despite all of this you end up having to use a billion 3rd-party JavaScript libraries to do anything remotely complex in the browser, things that have been standard in Qt or GTK since forever with a fraction of the code base and memory footprint.

And now people seem to enjoy reimplementing all the internet protocols on top of HTTP. The world has gone mad. I'm going to get a "gopher should've won" tattoo and live as a hermit on some nearby mountain.

I'm glad that there appears to be a lot of discussion surrounding internet privacy lately, including in the mainstream. However, I think that focusing all the discussion on Facebook is becoming counter-productive. Facebook exploited the weaknesses of the web very successfully, but it only exploited tools that existed long before it was created. I have absolutely no illusion that Google & friends are doing pretty much the same thing, and if you have an Android phone the amount of info Google can harvest is nothing short of terrifying when you think about it from an Orwellian perspective. Don't miss the forest for the Facetree.

Assuming that internet giants will adhere to some "code of conduct" is naive at best. "Do no evil", yeah right. We let them have the tools to do these things; we need to pry some of them away from their hands through technology and regulation (probably in that order).


> a Turing-complete language with access to such a large API surface (from the DOM to the GPU to threads to a lot of the browser state) could be successfully sandboxed.

I wonder why you say that. I concede that the web has evolved from no security in the 90s, when random applets and ActiveX controls were one click away from rooting your machine, through faulty security in the 2000s, when stack-smashing IE was a hobby of mine, up to the current era of sandboxes. Which do work: sandbox-escape zero-days have become very rare, and remote execution has largely ceased to be an infection vector against the masses.

The problem with 3rd-party tracking is not technological; the web was deliberately engineered to work in this manner. The browsers work exactly as specified, and that specification is the problem. It's then compounded as a political problem, where major browser investments are controlled by advertising companies that have a massive conflict of interest regarding users' privacy, and are likely to promote bandaids like "Do Not Track" instead of the fundamental privacy re-engineering the web urgently requires.

> The world has gone mad. I'm going to get a "gopher should've won" tattoo and live as a hermit on some nearby mountain.

When I catch myself thinking like this, it's usually a sign of age. Hang in there buddy, we'll be fine. There is a promised land just around the corner, with rich, secure web applications and strong privacy. If only everybody agreed we want it.


>Which do work: sandbox-escape zero-days have become very rare, and remote execution has largely ceased to be an infection vector against the masses.

I consider the widespread fingerprinting and user tracking a form of sandbox evasion, though I agree it goes way beyond JS. The only identifying info that ought to be sent to a website by default is your IP address, since it's necessary to actually send you back the data. The size of my viewport, the version of my browser and OS, the type of video codecs I support, and other shenanigans shouldn't leave my browser without my consent. The problem is that I can easily spoof most of my browser's info, but JS makes it almost impossible to sanitize everything.
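
To make that concrete, here's a rough sketch of just a handful of the things any page's script can read today without any prompt (far from exhaustive):

  // Each of these is readable by any script with no permission prompt,
  // and each one adds entropy to a fingerprint.
  const surface = {
    ua:       navigator.userAgent,
    lang:     navigator.language,
    screen:   screen.width + "x" + screen.height + "x" + screen.colorDepth,
    viewport: window.innerWidth + "x" + window.innerHeight,
    tzOffset: new Date().getTimezoneOffset(),
    cores:    navigator.hardwareConcurrency,
  };
  console.log(surface);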

>The problem with 3rd-party tracking is not technological; the web was deliberately engineered to work in this manner. The browsers work exactly as specified, and that specification is the problem

How is that not a complete contradiction? I'm not saying it's a bug; I know very well it's just a huge dump of features that keeps pouring in year after year. I'm just saying that web standards are the embodiment of "we were so preoccupied with whether or not we could that we didn't stop to think if we should".

IMO there are broadly two different use cases for the web currently: web applications like Google Docs on one hand, and a glorified PDF reader for mostly static content like HN, internet forums, news websites, Wikipedia etc. on the other. Web apps are the part that requires this ridiculous complexity to expose rich content; that's the stuff you'd have used Java applets, XUL, Flash or ActiveX for in the past. Those apps could be whitelisted on a site-by-site basis, the same way you install an app on your smartphone. You know that you expose yourself to bugs and privacy leakage, but you know what you're in for.

99% of the websites I browse everyday don't expose any functionality that ought to require any form of interactive scripting or advanced features beyond displaying text and images (and maybe video). Yet my browser will gladly let them access all these advanced APIs by default, load custom fonts, let them run code on my GPU, make 3rd party requests, mine cryptocurrencies... That's just ridiculous.

The safest code is code that doesn't run.

>When I catch myself thinking like this, it's usually a sign of age

Come on, we're not old, we're wise! At least I hope so...


Spot on. And most websites shouldn't even need JavaScript, just some simple extension to XHTML allowing interactive websites in a declarative fashion rather than via a Turing-complete language.



Those comment threads dispute that claim.


So?


So using CSS, one can fingerprint a browser and communicate that information back to home base by programmatic inclusion of font A or font B. This inclusion triggers the download from the respective URL, thus giving that URL information about the user.
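
A minimal sketch of the trick, with tracker.example standing in for the tracker's server: the url() source is only fetched when the local() font is missing, so the request itself leaks whether the font is installed.

  /* If "Calibri" is installed, local() wins and no request is made;
     otherwise the browser fetches the url() -- one bit leaked. */
  @font-face {
    font-family: "probe-calibri";
    src: local("Calibri"), url("https://tracker.example/probe/calibri.woff");
  }
  /* Applying the family to rendered text triggers the lookup. */
  .probe { font-family: "probe-calibri", sans-serif; }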


Yeah, so is PowerPoint.


JavaScript running automatically is a decision made at the user-agent level; it isn't really part of the "internet" security model. If browsers didn't run scripts automatically, I think the #1 support question would be "How do I turn off those prompts!?".


Prompts could include a checkbox "Allow scripts for all sites".

Also: some users would ask the site's support about the prompts, not the browser's, which would give sites an incentive to get rid of JavaScript.


Hear hear! Is a mountain next to you occupied yet?


Exactly. Why aren't all websites run in containers by default (personally I'm envisioning per-domain containers)? What benefit do we get from full-coverage containerization not being the default?


In short: site breakage.

We're so deep in this that first-party isolation would break almost every single website. In cooperation with Tor, Mozilla actually ported the first-party isolation feature into mainstream Firefox (available in Nightly; I don't know about stable), but since it would break almost every single website, there are no plans to turn it on by default. You can, of course, enable it yourself by turning on "privacy.firstparty.isolate" in about:config.
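
For example, via a user.js in your profile directory (a sketch; it's the same as flipping the pref in about:config):

  // user.js: give every first-party domain its own cookie/cache jar
  user_pref("privacy.firstparty.isolate", true);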

Disclaimer: I'm a Mozilla Foundation affiliate who has nothing to do with Mozilla Corporation or Firefox.


It would break some sites (certainly not every site; I'm browsing this site right now with it enabled) because 3rd-party cookies are an entrenched, standardized feature of the web, and you can't simply turn them off tomorrow. Developers use them to create benign functionality, for example to spread an application across multiple domains owned by the same publisher.

What I proposed above is that we introduce an opt-in feature: before the browser is allowed to connect to 3rd parties (in the tracking and cookie sense), the user needs to opt in, for example by clicking a notice displayed in the corner of the browser window. Instead of nagging every user of every site that "This site is using cookies", developers should nag only when connecting to other applications, using a standardized browser API. Some time after standardization, you could roll this functionality out to all users and nothing legitimate would break.
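
From the embedded page's side, such an API could look roughly like the sketch below. The shape follows the Storage Access API (document.requestStorageAccess), which implements a similar opt-in idea; the widget functions are hypothetical.

  // Inside a third-party iframe, called from a user gesture (a click).
  // Storage stays partitioned until the user explicitly grants access.
  async function connectWidget() {
    try {
      await document.requestStorageAccess(); // rejects if the user declines
      loadLoggedInWidget();                  // hypothetical app code
    } catch (e) {
      loadAnonymousWidget();                 // hypothetical app code
    }
  }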

There should be no presumed user consent, because there really is none; the outcry against Facebook and advertiser tracking shows people don't expect the web to work the way it does.


We ran a breakage study near the end of last year.

First-Party Isolation (FPI) did have the highest breakage scores: ~18-19% of users reported problems with it, and 9-10% of FPI users disabled the study.

Those are low relative numbers, but at entire-market scale, they are big absolute numbers. :/

https://blog.mozilla.org/data/2018/01/26/improving-privacy-w...


It would be great if we could have "FPI on by default with whitelist".

I think the stats are iffy because a lot of the breakages are things I would want broken.


I've been using first party isolation for the last year or so, and I can't think of anything it's broken.


Good to know, I'm gonna try this starting now.

Using the extension: https://addons.mozilla.org/en-US/firefox/addon/first-party-i...


2 week report: I haven't seen anything break.


I tried it out and yes, it breaks AuthN with third party sites, but besides that I had a mostly good time with it. I did turn it off.

Is there any plan to create an exceptions mechanism? "Allow Facebook access to your activities on this webpage?" or something like that?


In my experience with Brave browser I had to enable third-party cookies only for very few sites. In most cases that was to support some external login mechanism that the site used, not to access the site after a login. So I doubt the claim about breakage on most sites.


Micro-sites and CDNs could still be brought under the same SLD. The biggest blocker as things stand is OAuth. I'd want to make a UX call on whether the browser could elegantly prompt the user for a 3rd-party interaction.

While we are at it, I keep wondering (in a strictly SSL world) whether it would be a good idea to restrict CORS calls to sites using the same certificate as the webpage. It would make life easier for folks like facebook.com making CORS calls to fb-blablabla.fbcdn.com.


Wouldn't Facebook just start promoting their super-Facebook-compliant certificate to their partners, then?


Some sites rely on multiple domains to work: e.g. OAuth; CDNs are usually on a separate domain; and a lot of big corporates create new domains for "microsites" when they want to do something new but are tied down by slow-moving corporate practices (although the world would be a better place without microsites).

Basically, you couldn’t do it without breaking a large part of the internet.


They could build a whitelist and distribute it with Firefox (?)


I agree FPI should be on by default and have a whitelist.


This can be done with Firefox Multi-Account Containers[1] and Temporary Containers[2].

[1]: https://addons.mozilla.org/en-US/firefox/addon/multi-account...

[2]: https://addons.mozilla.org/en-US/firefox/addon/temporary-con...


Indeed. Safari blocks 3rd party cookies by default, I see no reason why Firefox can't do the same. The fact that they don't is a direct contradiction to their claims that they care about user privacy. The only explanation I have is that they do not want to anger their long-time money sources Google and other advertisers.


Safari's third-party cookie blocking is not the same as Firefox's, last I checked.

Firefox just blocks them. Safari blocks them unless you visited that site in a first-party context. Shipping the thing Firefox implements breaks a lot more than what Safari ships.

So for example, if you visited facebook normally, third-party Facebook cookies won't get blocked in Safari.

Firefox _could_ ship the same thing as Safari. The upshot is entrenching existing monopolies, because only big enough players are then able to track you via third-party cookies.... This is the main reason Firefox hasn't done this, as far as I know.

Disclaimer: I work on Firefox, but not on the cookie bits.


I agree, but blocking third-party cookies does break a whole lot of sites.

Apple can do this, because they have a limited yet consistent market share that mostly consists of their own customers.

But Firefox? If they block third-party cookies and Chrome does not, they might just end up losing even more users.


Sure you can argue that way. But it's not as big of a problem as you think it is. The fact that Apple does it is enough proof for that.

Websites can be changed to work without third-party cookies. I've been blocking cookies for years and have come across fewer than 10 sites that require them. When that happens, most of the time I just go elsewhere, or I'll send them an email.


I’ve been blocking third party cookies for as long as Safari has had that option, and I’ve never noticed an issue. I only get Google’s annoying recaptchas when my search string is too specific for their tastes.


What sites are broken by Safari because of its default cookie options? I haven't encountered any.


Any site which iframes in another site where you're expected to be logged in. For example, Disqus comments, Facebook like buttons, Youtube embeds (which work, but show ads even if you've paid for no ads).

There are also lots of one-off cases where developers split functionality across multiple domains in ways that were fine at the time but now get blocked. These are all fixable, unlike the iframing case, but it's still a lot of sites.


We need to redefine 'break' into:

Breaks because it was abusing lack of FPI.

Breaks innocently.


I think it's more nuanced than just blocking ALL third-party cookies. If you have visited the site before (e.g. facebook.com), my understanding is that those cookies are allowed. It's only blocking third-party cookies from sites you have never explicitly visited before.


Atlassian's HipChat web app breaks (in Gnome Web, which also disables third-party cookies).

<snark>So, nothing you'd want to use.</snark>


A large percentage of things that break SHOULD BREAK. That's the whole point isn't it?


Perhaps they could make "better default-settings" an option :)


There are valid uses for some of these. Imagine where online User eXperience and User Interfaces would be without services that help tell someone what is working and what is not.

Fortunately, many US companies are starting to consider a GUID and your IP address to be PII, and no longer allow that information to be stored.


There are lots of "shoulds" in the world. People should be nice, not steal, etc.

It seems to me, if users act differently, there "should" be a way to fingerprint them.


Anybody aware of any browsers making this a trivial workflow? Any comprehensive list of recommended plugins? Anything targeting mobile platforms?


Firefox's first-party isolation (brought across from Tor Browser) will foil cross-site tracking. Combined with Privacy Badger and Cookie Autodelete, your browser won't make the requests and even if they did, they wouldn't have any cookies attached.

https://github.com/mozfreddyb/webext-firstpartyisolation https://www.eff.org/privacybadger https://github.com/Cookie-AutoDelete/Cookie-AutoDelete


Wow. Obviously this idea gets thrown around HN a lot, but you deserve props for phrasing it so perfectly.


We're going from cold war to all-in.

Not just a privacy question. Reddit nags me every time I hit the front page, even returning from a story, asking me to log in. For most sites I need to create uBlock filters to prevent pop-ups with useless "we use cookies" notices, subscription requests, ads, or social media bars that fill half the screen, etc. etc. etc.

Every newspaper I read has decided that autostarting video and streaming is a good idea.

Yes, HN is OK, but sites that it points to are not.

I'm giving up on the web.


When you mentioned Reddit I thought you were talking about their "install the app" pop-up. They have by far the worst mobile experience of any site I've used more than once in the last 2 years.


Fortunately it takes one tap to update your preference to have it stop nagging you to install the app.

https://www.reddit.com/r/redditmobile/comments/6hk9ot/androi...


It should not be opt-out though


How would you ever know it exists, then? No, they should give you the option to be reminded again, or be done with it. But since we are talking about deleting all cookies, where do you think this preference would be stored?


That assumes you want to log in and be tracked.

It's annoying to follow a link from a search engine and not notice it's Reddit only to be hit with the "INSTALL OUR APP. EVERYTHING WILL BE BETTER!" shtick every time.


No, it does not.

I browse Reddit on mobile in Incognito mode in Chrome, frequently, for various reasons.

This means that there is no cookie history, and (when Chrome shits its pants and dumps the Incognito session), the settings are lost. This seems to occur every few days, occasionally every few hours. Reminds me a lot of Netscape 4 days.

Every. Fucking. Goddamned. Time. I go to Reddit, I get:

1. The "Use our Mobile fuckwitted app" fucking nag screen.

2. I have to disable that fucking nag.

3. I have to select "use Desktop".

(Chrome's own "use Desktop site" option isn't sticky across sessions, or even tabs, AFAICT, when selected on a site-by-site basis.)

And I'm finally where the fuck I wanted to be in the first fucking place.

I maintain a small but fairly well-received sub, and moderate a few others. For numerous reasons, these haven't taken off or succeeded in generating much by way of active discussion (Reddit has long been an exemplar of how to design site mechanics to fully kill and destroy any active conversation). Mind, that's a hotly-contested field, but for all it does vaguely well (and yes, Reddit does have some nice features and a large and not entirely useless community), there are a few small things that could be changed to improve this ... which manifestly haven't happened.

Another site I criticise heavily, Google+, actually does this fairly well, given a number of preconditions.

1. The discussion has to be actively and effectively moderated.

2. There's got to be a good, and not overly-large, discussion cohort. I find ~5-10 people is a minimum (with the right 5-10), and in rare cases, up to a few thousand probably an upper bound. The most lively conversation I saw was in a private community of about 50 people, tightly monitored for behaviour but not (with a few bounds) content.

3. The fact that a discussion stays live for an extended period of time and the Notifications loop back earlier participants is key.

4. Individual discussions can only run to 500 comments. This keeps things from running on too long.

5. Discussion is flat, not threaded. This isn't my initial choice, but it actually works ... fairly well. I'd ultimately prefer client-based determination of order, much as with Usenet or decent email (that is: Mutt) clients, including threads.[1]

I'm not saying G+ is great. It has many, many, many flaws. But it is the best general-use system I've found on today's Internet, despite my many reasons for wishing that weren't the case.

I've found and met some great people, and had really good conversations, many lasting days and weeks, more than a few months and years.

And no, not all discussions go well. One of the best hosts on the site is its former chief architect, Yonatan Zunger. And even he has increasingly had to shut down discussions that ended up as shitshows. See: https://plus.google.com/+YonatanZunger/posts/cnqAekPSFgB

The fact that Google doesn't have to crank up impressions on the service for advertising dosh probably helps. The other games the company's played to generate activity stats have quite negatively impacted it in many people's eyes, my own included. The G+/YouTube fiasco being the worst.[2]

________________________________

Notes:

1. Mutt offers: threaded, date, sender, and subject sorts, reversible ordering, and extensive filtering, including the ability to arbitrarily tag items and view only those. It remains hands down the best messaging tool I've ever used, though I find email almost entirely intractable for other reasons these days.

2. This ended up with the three top stories on HN on that date being either items I'd posted elsewhere being submitted here, or related fairly directly to those. Interesting experience. http://imgur.com/YgEjUuI https://news.ycombinator.com/front?day=2013-11-16


I don't know if you've come across i.reddit, e.g. [0]. I only heard about it a few weeks ago, but it's an old mobile site from reddit with nice simple styling and all the shite taken out. No popups for the stupid app.

I don't understand the drive to force people to use the app. Medium tries similar but less annoying tactics. Either way you get eyeballs on your site. Perhaps there's better tracking they can get hold of from the app.

[0]: https://i.reddit.com/r/Bitcoin/comments/89o16y/im_mark_karpe...


https://i.reddit.com and https://old.reddit.com both avoid this ... until you hit an internal reddit link to a different host (www, np, etc.,) in which case you're back to base and fucked. The inconsistency of intra-reddit links (that is, links within the Reddit app) is ... one of the more annoying elements of the site, and long has been.


You could try a browser extension to rewrite [www|np].reddit.com to i.reddit.com [0]
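
If you'd rather roll your own than trust a redirector add-on, the core of such a WebExtension is tiny. A sketch of the background script, assuming a manifest that grants the webRequest, webRequestBlocking and *://*.reddit.com/* permissions:

  // Rewrite www/np reddit page loads to the old mobile interface.
  browser.webRequest.onBeforeRequest.addListener(
    (details) => {
      const url = new URL(details.url);
      url.hostname = "i.reddit.com";
      return { redirectUrl: url.toString() };
    },
    { urls: ["*://www.reddit.com/*", "*://np.reddit.com/*"],
      types: ["main_frame"] },
    ["blocking"]
  );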

But yeah, whenever you don't follow the path the major sites want you to go down, you have to keep jumping around.

Similarly with Gmail: all links in an email go via Google tracking, so you have to right-click to get the link, then open a separate tab and paste it (obviously a signal that I shouldn't be using Gmail in the first place).

[0]: https://stackoverflow.com/questions/1891738/how-to-modify-cu...


I'm increasingly of the feeling one shouldn't fight systems, though there's always at least some of that. I just can't seem to find the right platform / solution that's within my resources, capabilities, and/or constraints (inclusive of privacy and financial). Reddit or a blog are probably the best that exists presently.

More: https://www.reddit.com/r/dredmorbius/comments/8avwul/open_th...


Presumably most app users aren't blocking adverts, or the app can more easily refuse to show content if ads aren't present.

Apps also cut down multi-site use, stopping users from "changing the dial".


Thank you! Helpful tip!


Twitter mobile. Scroll down for a screen, full page "Log in/Sign up" popup. Every single page.


Tragically, this is most probably by design, to encourage sign-ups. Whoever implemented that should not feel any better than a telemarketer or a spammer.


And then when you decide to create a sacrificial account just to keep them happy, they won't let you unless you give them your mobile number...


I hate the fact that sometimes it says "continue to the mobile site" and sometimes it says "login to access the mobile site". I prefer to not be logged in. Sometimes when I visit Reddit it refuses to let me use the site logged out. It's super inconsistent. When I can't use the site logged out, I just give up.

Also, sometimes the location of the "n comments" button becomes a "share" button, with the comments moved over to the side. But again, only sometimes. I keep clicking share by accident, because it happens infrequently enough that I haven't learned to check.

I've started switching to their .compact site, which has neither problem. There's only a small "switch to the new mobile site" button at the top of the page. The thumbnails aren't as large, but that's fine with me.


There are at least two mobile reddit interfaces, and one of them does not do this popup junk. The other one is obtained by adding the /.compact suffix, like https://www.reddit.com/r/programming/.compact


This same interface is accessible at i.reddit.com, for example, https://i.reddit.com/r/programming/


That's great when you want to browse Reddit, but often I find Reddit is actually a good source of information, so I'll click the link from the search engine, which then takes me to the "real mobile version" with the popup.


https://addons.mozilla.org/en-US/firefox/addon/redirector/

According to the comments, it works for this specific purpose. I haven't used it myself.

I was hoping uBlock did it, but it doesn't appear to.

If i.reddit.com is a different IP then a hosts file entry might do it.

Does pi-hole do this?


Instagram's mobile site gives Reddit a good run for its money. It recently just stopped working for me... Chrome throws a "Too many redirects".


Yeah, their official mobile site design is just awful. This is much better though: www.reddit.com/.compact


LinkedIn are ever so slightly worse. But it's tight.


LinkedIn absolutely “wins” when it comes to a sucky mobile web experience.


It is super rude, and "giving up" is the right answer - to annoying sites. Screw them.

The single nicest thing you can do for yourself after an ad blocker is to get Javascript under control. I personally am using JS Blocker, which takes too much effort, but allows you to selectively load JS. There are others - I like the control JS Blocker offers, but it is a bit annoying.

This is not without problems; there are sites that don't work/don't work well. I frankly don't care about most of them, so that's fine with me, and I leak much less data to the various panty-sniffers[1].

Life is much nicer when you take back control of what runs on your machine.

[1] I block FB, Google and several others' IP space, but the smaller adtech/government/who-knows surveillance shops are impossible to keep up with, and, well, defense-in-depth is the way to go.


How about just whitelist the good guys?


That's what I do on Android, except I use Brave instead of Chrome/Firefox.

I change the default settings to block all JavaScript by default, and then use the Brave button to whitelist sites I trust and actually want to use.

Browsing the web (especially news sites) on my Android has gone from "a complete nightmare" to "tolerable" because of this.


Same approach, but I prefer firefox beta + noscript, which allows more fine-grained control (so I can permanently or temporarily enable js from e.g. reddit.com and redditstatic.com, but not googletagservices, amazon-adsystem, etc. .. on the same page.)


Me too. Brave gave me back the ability to browse the web on mobile.

Now if I could just force pages to load in Brave instead of web view, that'd be amazing.

Related tip: if you open the Google app, menu -> settings -> Accounts & privacy -> disable Open web pages in the Google app, that will fix it for google now links.


You should try Naked Browser (or NB Pro, their paid app with some extra features). I have been using it for close to 6 years and it is awesome :-) In my opinion it has one of the simplest and nicest UIs, with features like script blocking. [Disclaimer: I am not associated with the developer in any way.] Ad blocking is sorely missing though.


> autostarting video and streaming

Super annoying. In Firefox in about:config, you can set: media.autoplay.enabled to false. That works well for me.


A similar setting exists in Chrome, where you can mute all tabs by default and whitelist only a limited number.


Some sites are using js tricks to override this setting.


Until you mentioned this, I'd forgotten Reddit started doing this. At one point, I wasn't even given the option to dismiss the login modal; I had to either log in or not use Reddit.

These uBlock Origin rules ended that:

  www.reddit.com##.splash-design
  www.reddit.com##.modal-open:style(overflow: visible !important)


You do know that this is not a "web problem", right? Just spend 10 minutes with your average 13-year-old and see how many times they get bombed in the face with one of those interstitial ads.

Also, it is easier to root your phone with a malicious app than with a web page. Heck, if you are an app developer, why even bother rooting phones when you can just do whatever you want? After all, if the user doesn't give you all the permissions, you won't even let the app run.


AFAIK this is against the guidelines and the app shouldn't be approved in the App Store. Not sure about Google Play.


> I'm giving up on the web

They said, while posting on a web forum


> a web forum

which, to be fair, is accessible via numerous 3rd party apps: https://news.ycombinator.com/item?id=14684105


> Every newspaper I read has decided that autostarting video and streaming is a good idea.

If you're using Firefox, go to about:config and set media.autoplay.enabled to false.


Also: after years of reddit not caring about email addresses, it will now nag me every time I log in to verify my email address.


The extension "I don't care about cookies" prevents 99% of the useless "we use cookies" pop-ups.


I sometimes use reddit with Firefox for Android (beta, equipped with noscript and ublock origin) and it's really not too bad like that.

But overall I get a much better experience using the free, open, 3rd party app Slide ( https://www.reddit.com/r/slideforreddit/ ) so, maybe give that a try? (Assuming you're on Android - ios, I've no idea.)


No, don't give up. Just be in the vocal minority that gives a shit and expects better.


"We use cookies on this site! Read our Cookie-policy mambo jumbo here! <dismiss>"


Blame the stupid EU cookie law for that though, not the sites. Cookies can actually be very user/privacy friendly, since they provide a way to retain user preferences without requiring an account to tie them to.


> Cookies can actually be very user/privacy friendly, since they provide a way to retain user preferences without requiring an account to tie them to.

Enable cookies at that point then, don't turn them on globally for everyone.

> Blame the stupid EU cookie law for that though, not the sites.

If every site didn't turn on cookies by default, as if they were somehow needed to read a text page, and didn't contain a billion trackers, there would have been no law in the first place.

The need for this law lies squarely at the feet of the sites themselves; the industry has shown no desire to regulate itself.


The intention was to highlight just how many sites use cookies to trace your movements in ways you might not realise. Clearly it didn't work well, and it fed into a lot of the reasoning behind the GDPR.


They are also entirely voluntary - which the EU would've known if decision makers had visited their browser settings once.


The notice is only for cookies used for non-essential tasks (like analytics) and doesn't have to be shown for things like login sessions. Your browser can't distinguish between those.


The cookie-notice policy led to the worst imaginable actual outcome: a state in which you either accept cookies (incl. third-party) by default, or you are haunted by numbing "reminders" that cookies exist, usually taking up large parts of your small screen estate.

Prior to that you could just disable them if you liked.

One can say that policy makers had good intentions, certainly. However, I cannot imagine a valid evaluation that wouldn't conclude this policy was a failure.


I don't disagree that the policy was a failure, I just disagree with your specific point made in the previous post.


Hmm. I'm not sure I can follow.

The browser not being able to distinguish between essential and non-essential cookies does not change the fact that cookies are and were entirely voluntary.


Right, but there's no reason to think the legislators didn't know that, because their goal was not to prevent the use of all cookies.


They thought that sites would stop using tracking cookies. But most sites use Google Analytics.


There are several blocklists specifically curated to address the EU cookie nag:

http://prebake.eu/ https://www.i-dont-care-about-cookies.eu/ https://github.com/r4vi/block-the-eu-cookie-shit-list

AFAIK some prominent cookie notices are also part of the Fanboy Annoyances list: https://easylist.to/

The only prominent nags that are not very well targeted with those lists are in webapps like reddit mobile. Primarily because of their dynamic, non-descriptive ids.


Yep, I've used "I don't care about cookies" for a couple years now I think, it's been great. It's very rare that I see these pop-ups.


Giving up on Reddit is just the same response to the stimulus, but the web is vast and varied. You're giving up on a lot of good to spite a lot of bad, and you're just going to hurt yourself. Use uBlock and uMatrix, and after a few days of regular surfing and a little tweaking you won't even realize that shite is still there.


Giving it up for work is not possible, but for entertainment it's different: the GP not only mentions Reddit but also other news sites, specifically the ones linked from HN. It's a general trend that almost everybody follows, probably because "everybody is doing it".

There are good alternatives, like reading books, watching movies or going out. Websites are shooting themselves in the foot. Blockers end up being too much work in this arms race. Sites think they're "winning" when they're actually just selecting for the people who can't choose.


I regularly browse HN and click through to articles. I run firefox / noscript on both mobile and desktop. I occasionally have to enable or temp-enable some stuff in noscript to be able to read articles, but it's a small minority of cases, and probably smaller if I wasn't so prone to temp-enabling rather than enabling.

For sessions where I know I want js to "just work", like online shopping from a set of presumed-trustworthy sites, I use a different browser profile without privacy extensions, but I use it just for those purposes and avoid general browsing with it.

To make multi-profile browsing simple, I theme the profiles differently, so it's obvious which one I'm in. On desktop, my launcher for firefox does --ProfileManager --new-instance and on mobile I just use different firefoxes - ff-beta and ff itself.

Is this "too much work?" - I can see how it might look like that, but I've been using this system for a few years, and though it takes a few minutes to set up on a new system or device the maintenance overhead is low, so I'd say it's not "too much." Also, the time gained not waiting for js-encrusted sites to load probably outweighs the setup/maintenance by a considerable factor.


More or less the same here. Still, the situation has worsened in the last months. And we are the techies; for "regular people", no doubt it's too much.


I used Firefox's containers for about a day, and then I discovered the privacy.firstparty.isolate option (in about:config), which effectively gives every site its own container with no user effort. That, combined with Cookie AutoDelete, seems to work well.


Yeah, but then you miss out on the parallel sessions. Having several GitHub and Twitter accounts, it's very handy.

I never log out of anything; I just open a non-container tab. I never log in to anything outside a container on my laptop; I open the site in the desired account container.

This website is in a state I don't want? New container, and it sees me as a new customer.


This doesn't work for me.

I have a "personal" container that I use for my Google/PayPal login, and sites that I use Google oauth with, same for a "work" container.

If each site used its own container, I couldn't sign in with Google without signing in multiple times (let alone the mess of signing out of old sessions, which I do once in a while).


Google logins still work, this is a different kind of container from the Firefox container tabs.

I use this feature with container tabs and it's fine. There are so far one or two sites I've come across that this breaks, neither of them anything I care about.



It doesn't work all the time. I tried it with Google login on Atlassian; it seems to rely on third-party cookies, and it fails. There are a few bugs open for websites that don't work with this feature:

https://wiki.mozilla.org/Security/FirstPartyIsolation#First_...


I assumed it didn't work because it was an Atlassian site I was logging in with. Thanks for pointing this out.

Container tabs works for me very well, so I don't see much point in switching over and dealing with little bugs like this.


Atlassian actually relies on referrer headers as well! Extensions such as Smart Referer break their SSO system.


Sorry, are you using `privacy.firstparty.isolate` or container tabs? I think SparkyMcUnicorn was saying the former would not work for them.


Both. I'm saying they work together fine, and isolate works fine with Google logins. It's not that strict.


My use case is multiple Google accounts. I can log in to multiple Google accounts at the same time in different containers and answer emails very easily.


You can be logged into multiple Google Accounts at the same time without the extension. Just click the little profile icon in the top right and click "Add Account."

I'm logged into four, including a Gapps account.


Have they fixed the issues though where some of their apps only work with the primary account you're logged in with? (e.g. the first one)


Google Play Console only works with your primary account.


Over the years they have definitely fixed those one by one, but I’m not sure if they’ve fixed all of them. I believe the last problematic one I encountered was some ads-related developer console a year or two ago.


They fixed the big ones but there are still a bunch of edge cases where you have to log out, like one account is from US and one is from Canada.


No. I use this feature daily, and have to log in to my "primary" Google ID for my Drive/Calendar, then "Add Account" to enable my secondary/tertiary IDs. It has to be in that order.


Good question, though the account isolation to the extent I've tested it has been surprisingly good.

I still don't fully trust it, but that's a different matter.


+1, I deal with this all the time on spreadsheets.


This would let Google know these accounts are affiliated with the same person on your network, whereas containers won't necessarily provide that assumption.

Right?


If they wanted to they could associate them other ways by just looking at the IP address and/or browser fingerprinting.


My thought is that they can certainly find good indicators that they're related, but that signing into them at the same time directly in their website confirms the relationship. Bear in mind, everything coming from say, a corporate network, won't be distinguishable by IP, and browser fingerprinting is likely less helpful in an environment with a hundred PCs running identical system images.

Your home device may not live in such a family, but it's hard for Google to determine that.


Even in a corporate network there's some plugin drift over time depending on job role, especially for developers. And the GP has specifically installed a container add-on, which probably makes him/her fairly unique among all the corporate traffic coming from that IP/group of IPs.


Most good corporate environments should be controlling what plugins they allow users to install in their browsers. (For the few users we permit to use Chrome, for example, all extensions are disabled.)

But again, the issue is not your specific configuration (or GP's specific configuration), the point is that Google cannot assume same IP or browser fingerprint is a definitive association of identity, whereas signing in additional users in Google's app definitely does.


> Most good corporate environments should be controlling what plugins they allow users to install in their browsers.

For a very odd definition of the term "good", I suppose that may be the case. It's certainly not been true of any place I've worked in the past 20 years.


I can't vouch for the security of any place you've worked in the past 20 years.

But suffice to say, it's very common for Chrome extensions to be able to both modify any content on websites you view and read data you enter into them. Both adware and spyware are prolific on the Chrome Web Store, and it's the number one infection vector I see.

Controlling extensions is downright basic competency for network security. With regards to Chrome, I currently operate an outright block, though obviously we can whitelist extensions as necessary. (One thing Chrome does particularly well is their ADMX templates: It's easy to blacklist and whitelist extensions, and install them compulsorily for users as well.)


Considering that I've spent 15 of the last of those 20 years working at cybersecurity companies, I can say with some degree of assuredness that your level of control over browser extensions goes far beyond the typical work environment, and that goes doubly so for companies based in silicon valley. Do you really think that all those engineers at Google/FB/Apple/etc can't install browser extensions?


I assume there is no hiding from Google unless you black-hole their IPs. Also, don't let an Android device with links to Google be near you.

It's hard to ban Google from your life even if you want to. I think governments should help citizens achieve that when they choose to.


If you browser-fingerprint and uniquely identify someone in this scenario, what is the GDPR rationale going to be?


I'm not sure if this is the same thing you are suggesting, but what I like to do is use the "Manage people" option from the menu at the top right. My work and home google accounts are never logged in at the same time in the same chrome window. It seems to keep everything separated pretty well, even extensions seem to maintain their own settings/"activated-ness" within each person/account.


On Chrome you have the SessionBox extension - https://sessionbox.io/discover


Why don't you use Thunderbird for multiple mail accounts? I cannot imagine using a web interface for multiple email accounts; it's too much effort and takes up my browser tab space. Thunderbird is the right tool for it...


"... and then I discovered the privacy.firstparty.isolate option (in about:config), which effectively gives every site its own container ..."

But with its own IP, possibly a non-routable, private IP?

I think not ... and that's too bad because that is what we really need. We need the ability to chroot jail a GUI program just like we jail named or (whatever).

A different root, a different IP, and no access (or knowledge) of the rest of the system. If I used facebook, that's the browser I would run it in.


This is exactly what Qubes offers. Have as many separate browsers as you like, each originating from a different IP using VPN proxyVMs. Or just run everything through Tor using a Whonix gateway.

https://www.qubes-os.org/intro/


Any problems? For example, CDNs are in different domains. Some ecommerce sites use payment services from other domains.


Things will break: logins with Google/Facebook/other OAuth providers that use popup-window redirects, and iframe services like Disqus or Facebook comments; Google's captcha will become more annoying; and less local caching will happen, which means slower browsing in general.


When you say login would break, what do you mean? Just that it would be less convenient, i.e. you have to login to your identity provider once for each first party? Or are you suggesting that the redirect flow for OpenID Connect or oauth wouldn't work at all?


Yet another reason why you should avoid using one service to log into another as much as possible.


I'm really of two minds about this. On the one hand you can use Facebook or Google to log in and then have them know what sites you log into. Or you can use a site's native login and be pretty sure that at some point you're going to get a "we take security very seriously" notice after they screw up and expose all their user data.


If you use a password manager with different passwords for each site it doesn't matter (much). Either they steal your password (usually hashed), or your OAuth information.

The minor difference is that with OAuth they cannot log in to the site pretending to be you, but if the site doesn't hash the passwords, or uses a weak method to do it, then that's possible.

Regarding other user data stored in the DB (which is usually more valuable than just the passwords) there is no difference between the login method.


I don't really see how logging in with a different system prevents your user data from being exposed?


I've definitely run into problems with ecommerce sites (really just Tarsnap) and embedded iframes from 3rd party payment services, while using the Chrome equivalent of this Firefox setting.


In case it's useful: This was due to a bug in Chrome which has since been fixed. (Chrome was losing the "POST" flag when sending a redirect message between processes.)


Any idea what version of Chrome fixed it? I refill my Tarsnap balance only occasionally.


The bug was fixed on January 26th and was being merged to older supported branches expeditiously -- I don't know the exact version numbers but I'm sure that any releases they rolled after mid-February would have the fix.


I guess 65.0.3325.146 from March 5th probably works, then. Thanks!


How do you do this in Chrome?


chrome://flags/#enable-site-per-process

Strict site isolation > Enable.


Using profiles I guess


Doesn't Privacy Badger do about the same, but more granular and breaking fewer things?


By not breaking things, Privacy Badger will essentially let some cookies and trackers pass through which aren't doing cross-site tracking, but these can still leave a trail behind. Some will stay dormant until activated.

CAD or any similar cookie-deleting add-on will essentially kill any cookie that isn't required by your active tabs. This way nothing is left behind to track or trail you.


What about browser fingerprinting (https://panopticlick.eff.org/)? Unless you are using the Tor browser, this appears "uncontainable" especially for folks not using Chrome over Windows 10 or another super generic combination.


Blocking your canvas to avoid being fingerprinted/tracked is like walking under CCTV with a hoodie and cap on. Unless everyone under the CCTV does the same, you might not be identified, but you will be tracked; it may be hard to identify you, but you will stand out.

So instead of standing out from the crowd, why not disguise yourself so well that the tracker doesn't know the real you?

The idea is to use something that gives a fake readout of your fingerprint, making you look normal, but keeps changing it occasionally so it leads nowhere for long. Try Canvas Blocker[1] and/or Canvas Defender for fake readouts.
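
The underlying idea is simple; a minimal sketch (not CanvasBlocker's actual code) that perturbs a few pixels before any readout:

  // Wrap one readout API and flip a few low bits, so the canvas hash
  // looks plausible but isn't stable enough to follow across sessions.
  const realToDataURL = HTMLCanvasElement.prototype.toDataURL;
  HTMLCanvasElement.prototype.toDataURL = function (...args) {
    const ctx = this.getContext("2d");
    if (ctx && this.width && this.height) {
      const img = ctx.getImageData(0, 0, this.width, this.height);
      for (let i = 0; i < img.data.length; i += 4096) {
        img.data[i] ^= 1; // tweak an occasional red-channel bit
      }
      ctx.putImageData(img, 0, 0);
    }
    return realToDataURL.apply(this, args);
  };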

[1]https://github.com/kkapsner/CanvasBlocker


> The idea is to use something that gives a fake readout of your fingerprint, making you look normal, but keeps changing it occasionally so it leads nowhere for long.

What is "normal" in this case? One correct way to counter fingerprinting is to standardize the fake readouts to a specific value and not keep changing it. If everyone's browser reports the same value for a feature, that feature becomes meaningless in the context of fingerprinting, since its entropy is very low.

That's what Tor Browser has been doing with font enumeration, reported window size, screen resolution, etc.: they all report the same value. On the other hand, you can correlate these values and identify Tor Browser users.


I use containers to keep certain activities isolated, since my primary browsing session is open all the time. For each of these activities I use a separate container: shopping, work, research, banking, payments and social.

I use uBO with dynamic filtering[1], and I've replaced 'CAD' with 'Forget Me Not'[2]. Works like a charm.
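
For reference, my baseline dynamic rules are close to uBO's documented medium mode (a sketch; they go in the "My rules" pane, and example.com/cdn.example.net below are placeholders for per-site exceptions):

  * * 3p-script block
  * * 3p-frame block
  example.com cdn.example.net * noop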

[1]https://github.com/gorhill/uBlock/wiki/Dynamic-filtering:-qu... [2]https://github.com/Lusito/forget-me-not


Wouldn't setting your cookie settings to strict (block all third party cookies) do the same thing?


It's more thorough than just blocking cookies: containers also isolate various caches and data stores.


Does the OP's method allow third party cookies while isolating them from other sites?

That might be more usable than blocking third party cookies for sites that break because of SSO.

Personally I use Containers and block third party cookies and I don't have any troubles.


Nice, I wish there were a dedicated checkbox for this in the settings page. I didn't know about this until now.


Yup, this is how I thought containers would work at first.

9/10 times I navigate off a link from one container and boom, I've pulled everything into that new container by Ctrl+L reflex.


If you use Firefox Containers, including the Facebook Container, please also use Cookie AutoDelete [1] to get rid of cookies from closed tabs across containers. Otherwise, in my observation, sites will still be able to track you if you reuse a container (even after closing all tabs of that container) for a specific site.

[1]: https://addons.mozilla.org/en-US/firefox/addon/cookie-autode...


This is the purpose of containers. They allow the same set of websites to interact with 'you' in different contexts.


I know the purpose of containers. But it's still not as easy to use as I would like it to be (I understand that doing usability design for this is complex), and I haven't created a specific workflow for it yet.

When I'm on some site and want to quickly search the web using Google (when DDG isn't enough) on the same tab and then return back to the site, or if I use a container temporarily to check a Gmail account, I don't want to mix those up by mistake later (I never search on Google while logged in). This problem may not exist if I spent time to pre-decide the use/assignment of websites/accounts for each container.

Since I also use sites like HN to jump to other sites (several of which may be trying to track), auto-deleting cookies gives me peace of mind once I close a tab (I set the timeout value in the extension configuration quite low). I was using Self-destruct Cookies [1] before (when legacy addons were still supported on Firefox).

For additional protection, I use some more extensions.

[1]: https://addons.mozilla.org/en-US/firefox/addon/self-destruct...


It sounds like you're after Temporary Containers: https://github.com/stoically/temporary-containers

That gets you a clean container every time you click the button. Very useful for development testing, too :).


What's the use-case difference between that and porn-mode (private) browsing? I can't really tell when I should use one or the other.


The best feature (at least for me) is that you can whitelist some specific websites to always open in a new temporary container, without any extra effort, while leaving the rest of your websites like they are today.

For example, I have YouTube (and any other Google website) to always open on a temporary container but other sites like HN and Reddit are on a normal and permanent container.


I believe Incognito disables (some?) addons, and it has a collection of settings which may mean it acts differently from regular browsing (Tracking Protection is probably the biggest, although I turned that on for normal browsing too).

It's also unclear to me where one Incognito session stops and the next starts, although I'd assume they live and die with their window.

More importantly, Incognito opens a new window while Temporary Containers open in the same window with coloured tabs. So you can have _lots_ in the same window if you want.


Does it also clean Firefox's built-in DNS cache?


My understanding is that First Party Isolation maintains _separate_ caches per first party, but I don't know whether temporary (or multi-account- for that matter) containers also do that. It seems like it would be reasonable for them to do so but I can't find a reference.


Honestly, I think one should use the container for specific things and just destroy it.


For some reason Firefox Containers with Cookie AutoDelete messes up Twitter and I cannot log in. After entering user/pass I expect the 2FA screen, but I'm redirected to the `https://twitter.com/login/error?username_or_email=...` URL where I have to enter my credentials again. I'm not sure if I'm the only one.


I don't use 2FA with Twitter, but I have no issues logging in with Containers and Cookie AutoDelete. Just a data point that something else may be amiss on your end. Disabling other extensions and trying it could possibly help narrow down on the cause.


But doesn't that mean you need to log in again each time you go to a website?

Sure, with passwords saved in Firefox it isn't a huge hassle, but a cookie is still more practical...


Explicitly white-list sites you don't think are tracking you across the web.


Thanks for this. I was wondering when Firefox would offer such an option in the browser, but never bothered to search for add-ons.


Shouldn't Cookie AutoDelete be part of the Firefox containers?


Cool advice, thanks.


If anyone in your household uses FB from the iOS or Android mobile app over WiFi, it will have both your IP address and GPS coordinates. Correlating that multiple people share one residence or workplace is easy for FB. You can keep playing a shell game, like running all your desktop PC traffic through a VPN somewhere so that the FB container doesn't show up at the same geolocation, but you or an ignorant non-technical user you live or work with will slip up.

Edit: FB also buys geolocation data from organizations that do the modern equivalent of wardriving, correlating GPS location with the RSSI of specific WiFi SSIDs and AP MAC addresses. If anyone near you uses the app, even if their phone has all location services turned off, you're still geoprofiled to within a city block.


What's your point? Facebook still won't know what websites you visit on your PC. It also doesn't have geolocation access there unless you explicitly grant it.


That the targeted advertising and profiling will still come to you. Live with a person who joins fb mommy groups and buys diapers online? Prepare for a barrage of relevant ads. The main point is that the combination of relationship-inference and geolocation can undo probably 70% of your own privacy protection measures, through ignorant ordinary use of the fb app by friends, family and coworkers. Then combine that with fb facial recognition profiling from photos third parties may upload...


Setting my year of birth to 1911 (but keeping the same month/day, so friends don't send me wishes on the wrong day) used to mostly destroy their targeted advertising. It was all adverts for arthritis medication. They're steadily figuring it out, but it's taken them about 10 years.


I'm also enjoying completely meaningless ads as I lie to them about my gender.


Sure, they will still track you as best as they can (though as I said Facebook doesn't have geolocation access on desktop). This doesn't change the fact that without your browsing profile, Facebook will have less knowledge about you than it would normally have, period. Facebook can't "undo" that.


I think that was the OP's point. FB would have geolocation information on you because, thanks to someone else's use of your WiFi network on their cellphone, FB can now associate your IP address with a geographical location.

So even though you're on desktop and haven't provided them with location permissions, they can still identify your location based on your IP address.


Ah, okay. I still don't see how this would undo preventing Facebook from knowing what websites you visit. The location of your PC seems like an entirely separate data point.


Facebook can use GeoIP on their servers to position you within a few miles. That's close enough for demographic profiling. And as others mentioned, if anyone uses the Facebook app on your home network, Facebook can link the device's precise position with your IP address.


Perhaps we need a dedicated Tor-shell Facebook app. Then it's just a wrapper around the desktop site.

Just force your kids to use that on mobile and use Tor on desktop with Facebook as the homepage.

You can tell your kids that 'Tor' means 'Facebook' in Icelandic.


Lots of people have a dynamic IP. If you don't want FB (or any other web service) to correlate an IP with you, you should get an ISP that gives you a dynamic IP.


I've been trying to use Firefox's containers for a while. They are pretty clunky: you need to open a blank container tab using a menu and then enter the URL. I forget to do that all the time, so after some time you are just logged into everything in the "global" container, or logged in to Google and Twitter in the same container, etc.

If you check the "Always open this page in this container" option, it prompts you with an "are you sure you want to open this page in this container" every time you go to that page, which is very annoying. Edit: this is not true; there's a checkbox to "remember your choice". I don't know why I didn't see it.

They are error-prone enough that they aren't good tracking protection. Just use uBlock Origin or a similarly good privacy plugin to globally block as much tracking as you can.

The facebook container eliminates some of those annoyances, but only for facebook.


> it prompts you with an "are you sure you want to open this page in this container" every time you go to that page, which is very annoying.

That isn't true: you say yes/no the first time and then it's automatic. Works great for me.

I create a container for every service I want to stay logged into and whitelist those sites with Cookie AutoDelete (which is container-aware), then auto-delete all other cookies on tab close.


Yeah, I'm not getting it after the first "Are you sure?" prompt, but that first one is annoying. I already checked the box asking to always open foo.com in a specific container. Why is it asking again?

Also, I'd love to do this in my bookmarks, instead of having to visit each site and set the container one-by-one.


> I already checked the box asking to always open foo.com in a specific container. Why is it asking again?

I have some sites that I almost always want to open in a specific container. Except when I don't.

I suspect this double-confirmation is for people like me: I can choose between "always open this site in this container" and "always suggest opening this site in this container".


I actually have a few cases like this as well, but they are very common cases I encounter 20-30 times a day.

I thought containers were clunky too until I went looking for bug reports for the functionality I wanted (always open domain in container) and found out I just wasn't using them right.

I'm actually really impressed with how polished it is, given the complexities it entails.


Is there an official article/tutorial detailing how to use them and use cases with examples?


From searching I just found a Mozilla help page[1] that seems to have much of the useful info, along with some links to other articles.

Not sure if it or something similar was referenced at some point during the extension install process.

1: https://support.mozilla.org/en-US/kb/containers


There's a "remember this" checkbox and that worked last time I tried to use it.


There is an add-on called "New Container Tab" that uses Alt+C to open a new tab in the same container. So I'll open my banking tab and hit Alt+C several times to do all my monthly tasks.

And as others have said, the "always open in this container" option shouldn't act the way you describe.

Finally, there is an option in about:config called "privacy.firstparty.isolate" that does most of what containers do, by default. It will break SSO with Google/Facebook/etc. though.
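
For anyone wanting to try it: it's a boolean pref, so you can toggle it in about:config or set it in your profile's user.js:

    user_pref("privacy.firstparty.isolate", true);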


You can middle-click on the refresh button and it'll clone the tab you're currently in. Very useful for duplicating containers.

And yes, the discoverability of that is terrible UX.


It's always been obvious to me that the nav buttons are treated the same as links. Left goes there, middle goes there in a new tab, right has options.


Agreed, in Chrome I used profiles to segregate, and it worked well.

Profiles in Firefox are somewhat clunky to use at the same time (I end up just using two installations).

I tried containers as well, but gave up, due to reasons you describe.


It's easy to launch new Firefox profiles if you create an alias, something like:

    $ alias ff='firefox -ProfileManager -no-remote &'
You can also make desktop icons that open different profiles quickly.
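
You can also skip the profile chooser and launch a named profile directly; assuming you've already created profiles called, say, "work" and "personal":

    $ firefox -P work -no-remote &
    $ firefox -P personal -no-remote &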


I recommend the "Switch Container" extension for this.

For developers, containers and this extension also make it super easy to test what a page looks like when logged in as a different user.


IMHO, this is how containers should really work:

- Every website (domain) should get its own container by default. I don't want to configure stuff when visiting a new domain.

- If I want domains to share a container, then I don't mind having to configure that.

- When clicking a link inside a container that points to a different domain, then the link should open in the container for the domain pointed to.

- When clicking a link, I should have the opportunity to edit the link before opening it, to avoid information leakage from one container to another.

- Cookies may be saved per container (default). But I should be able to turn cookies off for a specific container.

- Containers should work against fingerprinting, e.g. by perturbing browser characteristics slightly. This should work by default per container and per session. It should be configurable.

- If some well-known websites only work with multiple domains, then that is ok. These domains can be grouped into one container. Firefox can distribute a "whitelist" for such configurations. Please don't bother me with the specifics, but enable me to figure out what the settings are for a container, and to change those settings.

- Container settings should be synced over my devices. Needless to say, containers should work on all platforms.


Btw, the ground rule I used here is that the amount of "security-related mental-gymnastics" I have to perform to stay on the safe side during normal browsing should be zero.


Don't blame them for taking advantage of the current media attention Facebook is getting to promote Firefox containers a bit! It's pretty neat technology.


/sigh. Yet another "privacy" solution that solves the wrong problem.

You prevent the big bad company from spying on your browsing activity while, at the same time, explicitly posting messages to the big bad company's first-party site containing all the juicy bits of private information that you went to all that effort to prevent them from inferring from your other activities.

And we pretend this is helping, rather than just adding noise to an already confusing technical landscape.


Sorry, you're not seriously entertaining the idea that browsers should just turn off Facebook for everyone, are you? There are plenty of parts of this planet where Facebook is the main page in people's lives, like it or not, and the problem to solve is people's attitude towards that being okay, not the availability of Facebook.

You don't change people's minds from "this is useful to me on an hourly basis" to "I won't use this" without going through (many) intermediate step(s), and this is a GREAT intermediate step for those who can't (for whatever reason) give up Facebook but do understand that maybe it's time to restrict what Facebook sees a little. And then we keep the pressure on, as a society. This is a long battle; we're nowhere near done.


Big companies like this never go away. They just reshape their public image and keep doing what they are doing. The Facebook thing will blow over. He has friends in high places. Right now this is a show for the public. Later it's forgotten and Zuckerberg is back doing whatever he wants to do.


Big companies like this have never existed before. It's hard to remember, but 10 years ago there was no "massive Facebook" or anything of the same scale. Amazon, Apple, you name it: NONE of them have been around long enough for a claim that "they never go away" to make any kind of sense.


The same shape, no.

But big invasive entities have always existed. In France, coal mines almost owned the lives of their workers: where they shopped, where they slept, etc. News came only from a few newspapers or TV channels linked to the same top people, one source of info to tell you what to think, do and buy.

The government has always collected a lot of info too.

Before that, the church did. They perfected global personal data collection but called it confession. They mastered the ad; it was there for all the people to see on Sunday. Very useful for population control and blackmail.

So unless people learn to see the red flags and react, the next wolf will eat the sheep, like always.


I think you're making a bigger deal of the negative effects of "noise" than it warrants. If anything, it at least helps spur the discussion about tracking and grow awareness.

Comments like "sigh" are not at all constructive, and I would encourage you to consider not being immediately dismissive to people who at least try to do something about problems you care about.


Yes, why not just use Private/incognito mode every time you want to use Facebook directly or indirectly?

That way any Facebook-owned service, like logins and the website comments plugin, will work as expected but will not follow you around in your regular browser window!

________________________

From the extension page:

> "Clicking Facebook Share buttons on other browser tabs will load them within the Facebook Container. You should know that using these buttons passes information to Facebook about the website that you shared from."

> "Because you will be logged into Facebook only in the Container, embedded Facebook comments and Like buttons in tabs outside the Facebook Container will not work. This prevents Facebook from associating information about your activity on websites outside of Facebook to your Facebook identity.

> In addition, websites that allow you to create an account or log in using your Facebook credentials will generally not work properly. Because this extension is designed to separate Facebook use from use of other websites, this behavior is expected."


Made the jump to Firefox as my primary browser on macOS this month. My setup involves heavy use of the new containers feature and a tree-style vertical tab panel, which AFAIK is unique to Firefox and offers a neat visual browsing history.

Extension rundown:

— kesselborn’s Conex, a Spotlight-like quick container switcher/tab finder. It’s set to auto-hide tabs not in the current container, and I use it to routinely create one-off containers for specific tasks

— piro’s Tree Style Tab, with a workaround setting to make it play well with Conex and a couple of custom styling rules

— MarsCat’s Switch Container, which allows re-opening a tab in another container (used with caution)

What unblocked the switch for me:

— There’s now a working tree-style tabs extension in Quantum

— The new Web Authentication API removes the need for Keychain integration in the long run

I’m wary of getting too used to a heavily customized setup, and am still figuring out the best way to back up my Firefox profile. Previously I relied mostly on stock Safari and Chrome; the latter is still great for its developer tools.


> tree-style vertical tab panel, which AFAIK is unique to Firefox

For the major browsers, yes. Some Chromium off-shoots have them though.


I stand corrected, also I should’ve added “on Macs”. I’ve been keeping an eye on Doogie[0], but its maintainer doesn’t have the resources to maintain another build.

Curious if anyone can weigh in with similar alternatives available on macOS.

[0] https://cretz.github.io/doogie/


You're talking to the maintainer :-) Yeah, I don't use Apple products, so I'm looking for someone to do the work. In the meantime, I am not too familiar with Mac alternatives, but these days CEF, QtWebEngine, etc. aren't that difficult to use if you want to roll your own (mine is just Qt + CEF).


Hey, that’s unexpected and awesome! I hope to be able to help with the build after I upgrade to latest OS version.

The concepts Doogie is built on sound very reasonable, and I'm pretty sure they inspired my current setup. A native approach to bubbles vs. workspaces is better than shoehorning containers and tree-style tabs together with a bunch of extensions in Firefox.


I guess you are already using uBlock Origin in advanced mode to block all 3rd-party scripts and frames by default?


Does using containers improve performance with many tabs open?


Can’t say for sure—haven’t used modern Firefox without containers. Last time was years ago, pre-Quantum, so it wouldn’t be a fair comparison.

Compared to other browsers, if anyone’s interested:

There was no perceived performance hit after migrating 60+ tabs from Safari and Chrome to Firefox. Browser startup time is definitely faster. For now it seems I no longer have to think about the number of tabs I have open.

I usually have fewer than a dozen tabs in any given container. With Conex, tabs in other containers don’t seem to get loaded unless I switch to them (tabs not in current container are hidden).

