Basically, various privacy protections cause various kinds of website breakage.
Make it a long deprecation if you have to. Give even longer exemptions to the really big players / the big breakage / the legitimate use cases while we find better ways. But it is up to the browser vendors to remove the weapons here.
It's possible for your IdP to track the SPs you authenticate to regardless of protocol or cookie use, of course.
This extension gives Firefox selective amnesia: if you're in a Facebook container tab, it'll remember and send those cookies. If you're not, it won't!
An alternative solution is to never make those third party requests in the first place, but you might need some of them for content you're actually interested in viewing. Using both a blocking extension and this container extension should improve your privacy towards Facebook.
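As a mental model (not the actual implementation, which lives inside the browser's network stack), each container is just a separate cookie jar keyed by a container ID; a request made from one container never sees another container's cookies:

```python
# Toy model of container-isolated cookie jars. Illustration only; real
# containers are enforced by the browser, not at this level.
class Browser:
    def __init__(self):
        self.jars = {}  # container name -> {domain: cookie string}

    def set_cookie(self, container, domain, cookie):
        self.jars.setdefault(container, {})[domain] = cookie

    def cookies_for(self, container, domain):
        # Lookups are double-keyed by (container, domain), so a request
        # from one container never carries another container's cookies.
        return self.jars.get(container, {}).get(domain)

b = Browser()
b.set_cookie("facebook", "facebook.com", "c_user=123")
# A facebook.com request from the default container carries no FB login:
assert b.cookies_for("default", "facebook.com") is None
assert b.cookies_for("facebook", "facebook.com") == "c_user=123"
```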
Presumably the like button wouldn’t work - but that’s what I want. So the Q is: what will break that I didn’t want to break?
It can. Blocking third-party cookies is available in the browser settings of at least Firefox, Chrome, and Safari. I think it’s even on by default in the latter.
I’ve been using it for years and never seen a broken page as a result.
Sites that put a checkout flow hosted on a different hostname in a subframe break.
Some forms of "sign in with X" break.
But there's a thing for Firefox which does it for all sites. Called First Party Isolation.
Obviously this would need a whitelist (and a pair&lt;from, to&gt; whitelist, not just a "this domain is OK" list) to allow SSO scenarios.
This will treat every first-party domain as its own container for cookies and other stuff.
Default privacy settings are tough to manage.
Some people want privacy, and will accept broken websites if it keeps their data and online movement private.
Other people just want their usual websites to work, don't understand or care to think about privacy, and if some random content farm looks busted in Firefox, will just switch to another browser.
Aside from picking a sensible default, Firefox also offers to educate users where it makes sense. For example, when you open a new private browsing window in Firefox, the tracking protection section includes a "See how it works" button that takes you to a tour-style walkthrough of how tracking protection works.
There's an add-on that does something close to that:
This add-on's options include opening each (sub)domain in its own container. These containers are temporary: they're deleted a short time after you close their last tab, so you have to log back into each site on each visit. (This may be something you do anyway.)
I don't (yet?) know of an add-on that automatically assigns each domain you visit to its own permanent container, and automatically creates new containers for each new domain.
You can set Firefox to behave that way, though. Look for First Party Isolation.
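For reference, First Party Isolation can also be flipped via a preference rather than the UI; a minimal user.js sketch (pref name current as of Firefox ~58, and subject to change between releases):

```js
// user.js -- enable First Party Isolation: cookies and other site data
// are double-keyed by the first-party domain. May break some SSO flows.
user_pref("privacy.firstparty.isolate", true);
```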
What is really nice is you can tell it to ALWAYS open your bank's website in a particular container, and it will. If you go to that URL from a tab in your work profile, it will switch to the banking profile for you.
Of course, I can appreciate that such behaviour is technically far more challenging. It's just that, as a user, it's disappointing, and it makes Temporary Containers practically unusable.
We close the previous tab and cancel the webRequest before it's sent to the site so none of the default cookies are sent.
"Converting" a tab from one container to another is actually a bit complicated, and there are open issues for it. :/
A good workaround for this could be to let users add arbitrary domains to their container whitelists (so you could manually add www.foo.com), but I haven't found a way to do that yet.
Are they planning to fix it somehow?
Please design an iOS identity isolation UX for Mobile Safari, Apple Mail and third-party apps.
Currently we have to use several different iOS web browsers to achieve isolation of browser logins/cookies, but Apple Mail opens all links in Safari.
Ideally, there would be an OS enforced, per-app setting which defines which web browser (or Mobile Safari identity context) will be used to open a URL.
A URL clicked in Apple Mail will then open in a private tab.
Everything down to the fonts you have installed can be used to track you
Facebook would only need to rewrite a field in the URLs of users' posts that link to certain domains, so that each user follows a unique URL tying them back to a specific Facebook context.
Once outside the container, a user is traceable back to a Facebook profile.
I don't know if this is possible, but people don't really check URLs, let alone human-unreadable ones.
I imagine that this could break navigation, but directing wrong queries to a default page would solve this. Anyway, they could still try their luck.
Cliqz has done some interesting research in this area of detecting (and stripping) unsafe data elements.
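Stripping per-user decoration from URLs is straightforward in principle; a minimal sketch (the parameter list here is an assumption; real tools like Cliqz use much more sophisticated detection of which values are user-identifying):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Known or suspected per-user tracking parameters; extend to taste.
TRACKING_PARAMS = {"fbclid", "utm_source", "utm_medium", "utm_campaign",
                   "utm_term", "utm_content"}

def strip_tracking(url: str) -> str:
    """Drop tracking parameters from a URL's query string."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_tracking("https://example.com/article?id=42&fbclid=AbC123"))
# -> https://example.com/article?id=42
```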
If most users don't pay attention to URLs (which is true), users cannot give informed consent, a core tenet of GDPR.
1. Models don't need everyone's data, they just need enough data. Facebook, Google, et al. have more than enough data for a lot of applications even if they only had 5-10% of the population. So at best, this ship-jumping will limit Facebook's ability to "micro-target" certain populations. In this case, P(whatever | privacy-concerned individual) will be a bit noisier, but Facebook will still have a really damn good idea about P(whatever | still a Facebook user)
2. Facebook can still use/sell the models it develops against its user base to target you even if you're not on the platform, unless you really think P(depressed), P(bad employee), P(insurance risk), P(easily influenced by a specific type of marketing) has anything to do with the fact that you're not on Facebook. The minute someone asks a few questions about you in any setting, they'll be able to infer a ton more from the models alone. Lack of information about you will only add noise, and to make things worse, Facebook has enough data on privacy-conscious individuals anyway that they can plausibly fill in the privacy-conscious holes in their data with a model.
3. P(privacy concerned) may be correlated with P(not manipulable), so you jumping ship isn't going to change the systemic issues everyone is concerned with, namely Facebook and third-party customers' ability to morph society in the ways they see fit.
4. You can replace Facebook with Google/Amazon/Spotify/Chase/Bank of America/Hospital System/Government and all of the above is true within the domain of data they control.
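Point 1 is just sampling arithmetic: population-level estimates from a 5% sample are nearly indistinguishable from estimates over the full data. A toy illustration with made-up numbers:

```python
import random

random.seed(0)
# Synthetic population: 30% have some trait the model wants to predict.
population = [1 if random.random() < 0.30 else 0 for _ in range(1_000_000)]

full = sum(population) / len(population)
sample = random.sample(population, 50_000)   # the 5% who stayed
partial = sum(sample) / len(sample)

print(f"full data: {full:.3f}, 5% sample: {partial:.3f}")
# The two estimates differ by well under a percentage point, so losing
# 95% of users barely degrades this kind of aggregate model.
assert abs(full - partial) < 0.01
```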
I appreciate this. I was aware Firefox already had a container system, but didn't want to employ the effort to set it up. A one-off for Facebook though, feels easy to work with.
I'm happy to recommend/install it to friends I install uBlock Origin for.
Great job Mozilla :) . More of this.
The great thing is when you block all that, your internet is much faster.
It can't compete with the total control you get with crafting specific firewall rules but it has most of the same effect while being a lot simpler to manage.
An excellent guide: https://killtacknine.com/building-an-ubuntu-16-04-router-par...
Up for months, incredible speed, no maintenance, except to add new domains to block when they prove to be spyware. Twitter may be next on the list since it is of declining use to me.
I'll add that I'm by no means a linux whiz, but learned a lot by doing this project.
Biggest issue is explaining to guests why they can't access facebook.
Another benefit is you no longer have the planned obsolescence of consumer-grade equipment. I fully expect this thing to last a decade, and "firmware" is automatically updated via linux.
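One common way to implement the domain blocking described above on a Linux router is a dnsmasq snippet like this (an assumption for illustration; the linked guide may use a different resolver, and the domain list is incomplete):

```conf
# /etc/dnsmasq.d/blocklist.conf
# address=/domain/ip answers that IP for the domain and every subdomain,
# so tracker requests die at the router before leaving the LAN.
address=/facebook.com/0.0.0.0
address=/facebook.net/0.0.0.0
address=/fbcdn.net/0.0.0.0
```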
No surprises, the Omnia updates itself and reboots automatically (you get an email warning if you want to intervene) and everything 'just works'.
Why is this different from Privacy Badger? This allows you to segregate all Facebook toxicity to a single container. It lets you fully use Facebook, including sites that offer login via Facebook, without exposing everything else to Facebook in the first place.
That said, this really doesn't address either the Cambridge situation, or the fact that Facebook themselves allowed the Obama campaign to pull demographic information in violation of their own policies, which arguably impacted far more people (https://www.investors.com/politics/editorials/facebook-data-... && http://www.dailymail.co.uk/news/article-5520303/Obama-campai...). The only solution to that is to #DeleteFacebook. Facebook is a surveillance-as-a-service provider. The only way to keep them from monetizing you for commercial, social or political reasons is to firewall them off.
You can also associate this with a VPN, if you want to deny them the IP address your home machine is using.
Neither of those citations say that what the Obama campaign did violated Facebook's policies.
The "But the Obama campaign did something similar!" is an argument based on false equivalence, used to muddy the conversation.
(Further, both of those citations - a right-leaning source and a tabloid, respectively - are articles written about some tweets. Which is not to say that they're wrong, but without further confirmation, one might want to take them with a grain of salt.)
 The "You might like" links at the bottom are to the articles "Russia Scandal: Did Obama Tutor Hillary Clinton In Electoral Conspiracy 101?", "Will Mueller Ever Admit That There Was No Trump-Russia Collusion?", and "Hillary Clinton Still Can't Believe She Lost ... Or Why".
In fact, reading what the OFA admitted, and what is under investigation here, the only difference seems to be that Facebook didn't unofficially bless the efforts of Cambridge, while they did for OFA.
Here is the money quote - directly from Twitter:
“They came to office in the days following election recruiting & were very candid that they allowed us to do things they wouldn’t have allowed someone else to do because they were on our side"
That's not a conspiracy theory, or some evil double standard. That is a director for OFA explicitly mentioning that they were being given special data "because they were on our side".
You should be very, very scared about social networks deciding to be on people's side. See Trump and Twitter.
It's as if the last two years haven't convinced you to be deeply skeptical of social networks and surveillance-as-a-service operations. This universal confirmation bias is the exact same bias that Trumpkins use to ignore any news they don't like. Facebook will keep selling you to the highest bidder. They don't give a fsck about your political points of view.
(And this is probably tame for what they do for China to suppress their people).
No, it doesn't. Between the two sources, they say that Facebook was "surprised" that they were able to get so much data, and that Facebook didn't stop them. The sources do not say that the Obama campaign violated Facebook's policies.
This is the false equivalence: "What the Obama campaign and Cambridge Analytica did to acquire data on tens of millions of Americans is the same."
In reality, Cambridge Analytica obtained the data from a 3rd party not authorized to provide it, and who collected the data under false pretenses.
The Obama campaign did neither.
EDIT: Here's The Washington Post. "Facebook’s rules for accessing user data lured more than just Cambridge Analytica" - https://www.washingtonpost.com/business/economy/facebooks-ru... :
Cambridge Analytica — unlike other firms that access Facebook’s user data — broke Facebook’s rules by obtaining the data under the pretense of academic use. But experts familiar with Facebook’s systems and policies say that the greater problem was that the rules for accessing the social network’s information trove were so loose in the first place.
The willingness to excuse Facebook selling your privacy because you agree with a political point of view is part of the problem.
If we trust The Daily Mail article:
Davidsen said that she felt the project was 'creepy' - 'even though we played by the rules, and didn't do anything I felt was ugly, with the data'
If we trust the IBD article:
The only difference, as far as we can discern, between the two campaigns' use of Facebook, is that in the case of Obama the users themselves agreed to share their data with the Obama campaign, as well as that of their friends.
The users that downloaded the Cambridge app, meanwhile, were only told that the information would be used for academic purposes. Nor was the data to be used for anything other than academic purposes.
It's an important distinction, to be sure, and Facebook is right to be attacked for its inability to control how its user data were being gathered and shopped around.
(Though it should be pointed out that it wasn't a Cambridge Analytica app, and Cambridge Analytica obtained the data from a 3rd party who wasn't authorized to provide it.)
After all, both campaigns (and most major US political campaigns) surely had embedded Facebook employee account managers helping them spend as much money as effectively as possible, while scrutinizing their account usage.
(Who, me, cynical?)
It's becoming clear exactly what the cost of social surveillance is in our lives. This is just one of many significant problems with it.
There have always been jackasses willing to sacrifice morals (or rightness) in favor of power. Social networks just make it easier to ignore signals that should humanize power.
Seriously, being on the public side I can't tell the difference. Both cases involve (ab)use of user data from FB without the users' knowledge. Both are bad -- or neutral, if you're the sort of user that understands the implications of using Facebook at all. Indeed, I'm not even surprised by these incidents. If anything I'm surprised that anyone is. I'm not even surprised -though annoyed- by partisan attempts to make one side or the other look worse.
Please, let's stop pretending that Facebook's blessing makes one (ab)use of user data OK and the other not. If you object to a campaign's use of your data, why wouldn't you object to a campaign's use of someone else's data?
And yes, your data, when shared with FB, is FB's data. But do people even understand that? You and I do, but does your mom? Mine certainly does not.
> That said, this really doesn't address either the Cambridge situation, or the fact that Facebook themselves allowed the Obama campaign to pull demographic information in violation of their own policies...
The sentence is essentially "this doesn't address A, nor does it address B", which requires that A and B be different.
Somehow, I think after this it is (I hope) unthinkable.
Zuck, Kamala Harris and Oprah are whetting their appetites.
These are the type of people we feel are well qualified to run the most powerful, largest, and most expensive institution in human history?
Not exactly inexperienced.
So, she’ll have a couple fewer years in the Senate and a lot more in elected executive positions than Kennedy when he was elected President.
> These are the type of people we feel are well qualified to run the most powerful, largest, and most expensive institution in human history?
Harris isn't necessarily my first choice, but she's certainly qualified; Oprah and Zuck are unqualified, though less so than the current incumbent (which, admittedly, is a low bar to clear.)
GHWB was founder/CEO of an oil company, Chairman of the Republican National Committee, Director of the CIA, and Vice President.
He might be a great executive, but then, Donald Trump might be a great real estate developer, investor, and business man. Just a few factors aligning correctly and you can be successful, even if you aren't particularly skilled.
Relevant quotes from Betsy Hoover the Obama 2012 online organizing director:
"So the app that everyone's referring to in this moment was an app called Targeted Sharing. It was an app that we created on Facebook that fully followed Facebook's terms of service. And any individual could decide to use the app. When they clicked on the app, a screen would pop up that would say what data they're authorizing the app was giving us access to and exactly how we were going to use that data. And so at that time, it was totally legitimate on Facebook to say you're giving us access to your social network. You're giving us access to your friends on Facebook."
"...So, you know, we got your list of friends. And then we matched it to our model, our list of voters that we didn't build with Facebook data. We built with voter history and, you know, all of the other data points that Democratic campaigns use to build models. But we matched the data of your friends to that model and then reflected it back to the person who had authorized the app..."
Sounds very similar to Cambridge Analytica imho. In addition, I don't really think that exploring how deep the rabbit hole goes is deflection; rather, it shows how much more pervasive this practice is than people think.
>Facebook friends lists, tags and photos allowed Obama operatives to identify a person’s close friends, which they then matched with offline public records. (Was this person likely to vote for Obama, but unlikely to get out to vote?) They then told the app users which of their friends they should send campaign messages to.
They seem pretty different. Not just in how the data was collected, but how it was then used.
The research into Facebook likes and personality and the manipulation of those psychological profiles, which CA based their entire operation on, didn’t even exist in 2008.
Comparing the two is absolutely a false equivalency.
And the CA scandal goes way beyond the Facebook stuff. The company is allegedly involved in various illegal and anti-democratic activities around the world. Are people really downplaying this simply because they support Trump and CA has a Trump connection?
The other difference is that based on what we know, the Trump campaign didn't actually have any interest in using the CA data in question. The head of their campaign has quite consistently said he thinks CA's psychographics was worthless nonsense, and I'm not aware of anyone finding evidence contradicting this. It's quite possible that, in fact, the 2012 Obama campaign was the only US presidential campaign that systematically gathered information on people's Facebook friends to feed their campaign machine.
> Neither of those citations say that what the Obama campaign did violated Facebook's policies.
The text you quoted does not claim that the Obama campaign violated FB's policies. It says that FB allowed the Obama campaign to do things that would -without FB's approval- have violated FB's policies.
The tweets were from Obama's own Campaign Director, saying that OFA were allowed to violate FB's TOS in a way that others weren't.
The Daily Mail is a tabloid, and is so bad that it has been banned from being used as source in Wikipedia.
I can see why Facebook would care about that, because owning/selling/managing the data is their model.
But that doesn't matter to me, a Facebook user, when the end result is the same (my FB info provided to a political campaign.)
This is one of the features that we've brought upstream from the Tor browser. Further reading at https://www.torproject.org/projects/torbrowser/design/#ident...
I've quit FB, but the above might be enough to contain FB for those who can't/won't. I still do the above for any other privacy-hating aspirants.
You're gaining security the same way bike locks provide security. Anyone motivated could break it but you're hoping that enough people are easier targets that you're ignored.
This solution is good for you personally but it's not a fix.
You can do the same thing with Chrome profiles, Chrome Canary and Choosy. This multi-container thing (per OP, don't know if you are using the same one) appears to be FB only. So while it is effortless, it's reactionary and limited. The general solution is better and doesn't force you to a specific browser.
Disclaimer: When I use google (search, gmail, etc) and facebook I use specific profiles for those activities. I use a default profile for everything else. So, I don't actually use the above solution myself.
I used to alternate between browsers for different uses but after the Pocket debacle I abandoned Firefox "for good". Since Quantum I haven't liked it anyway.
This Facebook extension is a reactionary Facebook-specific adaptation of that extension. But it is still useful for many users since it doesn't require any user interaction.
I think multi-account-containers is more convenient than profiles since it provides a single interface to manage the containers and browser settings are shared, but I am actually using profiles on Firefox because the extension doesn't seem to support the "Never remember history" browser option.
 https://github.com/mozilla/multi-account-containers and https://addons.mozilla.org/en-US/firefox/addon/multi-account...
How? Specifically the part where clicking a link inside the FB container that leads outside of FB opens outside the container?
AFAIK it's impossible, and that's why I stopped using the multi container stuff in Firefox.
It's such an obvious oversight. When they implemented "get into the container when clicking xyz.com" how come they never thought of "get out of the container when clicking something that's not xyz.com"?
But Facebook is a private company, so they're allowed to do what they want regardless of whatever "policies" they set, so long as they remain within the law. If you don't like it, start your own Facebook!
(Due to Poe's law, I should note that I'm being sarcastic. Funny that this sentiment gets brought up all the time in discussions about YouTube, but not here...)
Just deactivated Facebook, and it's a wonderfully freeing sensation. I can always reactivate it if I really need to reach somebody who's only available on there, but that's unlikely at best.
I have thought of spamming Facebook with a ton of likes and shares of things I am not actually interested in.
What are people's opinions of that? It has the obvious problem that my friends might associate me with things I actually dislike.
Spamming likes and what not to disrupt your profile might be great for your own privacy but isn’t stopping them from monetizing you at all.
Full of obfuscation tactics like that.
Or am I missing something?
This extension isolates your Facebook login cookies while you're browsing on non-Facebook websites, hence making it harder for those tracking techniques to identify you.
Privacy Badger prevents the cookie being sent back to Facebook from these third party websites.
The business world is not technical.
Why not bundle uBlock Origin into every Firefox install and make it impossible for Facebook to track everyone's activity on other web sites?
This has a real limitation, which is that if you click an external link on your Facebook newsfeed, the new website will also be in the FB container. On the one hand, that keeps your FB-related browsing from "contaminating" your other web cookies. On the other hand, all the websites in that container get to link you to your FB account just like normal.
> If you click on a non-Facebook link or navigate to a non-Facebook website in the URL bar, these pages will load outside of the container.
Or am I misunderstanding?
A Google one would be so much easier to set up and forget. The Facebook one is really nice because once installed it automatically logs you out, then puts facebook.com into its own container every time I type the URL or visit it (even if the active tab is in another container).
With the multi-account extension I have to actively manage containers which becomes annoying quite fast.
Edit: After playing around, this also breaks the back button. If anyone from Mozilla is here, I think it would be better if an FB page opened in a new tab while keeping the original open.
We tried a few different UXs and none of them felt ideal. We have an open issue to add Messenger to the list of FB domains that are contained. https://github.com/mozilla/contain-facebook/issues/45
Why don't browsers (e.g., Firefox) do this by default for every origin? Third-party cookies would break, I assume, but overall it would lead to less tracking across origins.
Liking and logging in with Google should show a warning similar to that for an untrusted or malicious site, and only then send the cookies.
I don’t see a reason to allow any cross domain tracking from any website at all.
Facebook is probably just the most visible offender rather than the worst offender in this regard.
1. Go to Options->Privacy
2. Go to History and change it to use custom settings
3a. You can either disable third-party cookies
3b. Or you could go a bit further and check "Always use private browsing mode"
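The same steps can be applied via a user.js file instead of the UI; a sketch (pref names current as of Firefox ~59, and they can change between releases):

```js
// user.js equivalents of the steps above
user_pref("network.cookie.cookieBehavior", 1);        // 3a: block third-party cookies
user_pref("browser.privatebrowsing.autostart", true); // 3b: always use private browsing
```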
But the Facebook container retains your Facebook cookies (and therefore your login) after restarts.
This is the tech world's equivalent of a "how to take a punch" class for victims of domestic violence.
People know that (facebook|their partner) is abusing them (privacy-wise|physically) and they choose to continue the relationship.
That's it. No one chooses a relationship with a drone, and to date Facebook haven't found a way to force people to create accounts and continue to use them at gunpoint.
You don't need a facebook account for them to know more about you than your parents do. They get data from many other sources.
Also, poor analogy.
Really bad comparison there.
I have experienced domestic violence first-hand and I cringed when I read the word "choose" when describing the situation.
It's also no coincidence that their SMS harvesting shenanigans are only possible on Android, the poster child for "fuck privacy I want free shit from a giant creepy company".
Sure, on some level there should be laws about privacy. But this is fucking America we're talking about - it's practically the third world when it comes to the state of basic protections for citizens.
So yes, Facebook (among other agents, like Google) is forcing me to follow a course of action that protects my family and myself from intensive data collection, and that many times breaks my browsing and interferes with my work. Forcing HTTPS, blocking cookies, revoking certificates, etc. and then reverting some special cases or creating exceptions on the fly for specific sites I cannot go without is kind of maddening.
Stalking isn't a relationship, but it does force victims to take action.
Does that suddenly make it okay to completely ignore ethics?
I'm also interested in this claim. If you're referring to Intelligent Tracking Prevention, I don't think it does this.
So ITP does nothing to protect a user who visits facebook.com every day. Which is most of Facebook's user-base.
2. The study is optional.
3. DNS over HTTPS is a net benefit to privacy.