Hacker News
Proposal: Treat FLoC as a security concern (make.wordpress.org)
561 points by meattle 30 days ago | hide | past | favorite | 258 comments

> Why is this bad? As the Electronic Frontier Foundation explains in their post “Google’s FLoC is a terrible idea“, placing people in groups based on their browsing habits is likely to facilitate employment, housing and other types of discrimination, as well as predatory targeting of unsophisticated consumers.

All of this has been happening with tracking cookies, fingerprint tracking, pixel tracking and so on. And will continue to happen.

I find it bizarre that it took Google talking about phasing out third-party cookies and replacing them with a much lesser technology, FLoC, for people to suddenly be up in arms about it.

Third party cookies, love them or hate them, have been with us for a long time, and simply dropping them would not be viable without the long phase out. And a long phase out is not something around which you can form a singular rallying cry.

FLoC is a new thing which is just being rolled out, so it's a lot easier for people to resist adding a new thing that makes the internet more crappy and less private.

I think it's unnecessarily fatalist to say that all of this will continue to happen so what's the point of resisting it. Public awareness and negative opinion of the pervasiveness and creepiness of internet tracking continues to grow, and advocacy against tracking mechanisms helps create the type of groundswell which could actually shift public policy to forbid such tracking.

Google specifically is catching some heat for potential antitrust problems, so raising a ruckus about Google abusing its dominant browser position to cram FLoC into the internet is more likely to have positive effect than ever before.

> Third party cookies, love them or hate them, have been with us for a long time, and simply dropping them would not be viable without the long phase out.

Not true: Firefox and Safari have had them off by default for over a year now. Additionally, Chrome had planned to turn them off last year but then cried "covid", which for some reason = delay... because... think of the adverts! I mean covid!

Anyway, I'm pretty sure any large websites relying on 3rd-party cookies for functionality will already have run into users blocking them. We have, and have already been forced to change. We don't do ads, but our use case is a bit esoteric in that it needed 3rd-party cookies for sessions.

> Not true, FireFox and Safari have had them off by default for over a year now.

Not quite. They will block some 3rd-party tracking cookies that are on their tracking blacklists. If you want to block all 3rd-party cookies you have to explicitly disable them.

Safari does actually block all third party cookies by default, not just a subset. And has been doing so for about a year now.

If I understand correctly, they allow 3rd-party cookies but delete them at the end of the browsing session rather than respecting the expiry.

Care to reach out? This username at Microsoft.

We're working to understand which legitimate use cases break without 3p cookies so we can work with Google to backfill them. FLoC helps ad trackers track but doesn't help with any of the legitimate uses of 3p cookies, like auth.

Since you seem to work for Microsoft, here's the broken use case I know of: without 3p cookies, Microsoft Teams is broken (it shows a not-very-useful error page asking you to refresh the page; when you do, it shows the same error again). Many of us need it for remote work, so I guess this is one of the things Google had in mind when they delayed disabling 3p cookies by default.

Yes, Teams relies on silent auth from within an iframe (all those chat windows etc). This is impossible without 3p cookies, so we're working with Google to find a solution. Storage access API in Safari works, but sounds like it won't make it to Chrome.

Thanks, nice to know you are working on it.

Auth should be doable with just redirects, though, right? Isn't that how OAuth and OpenID Connect work?

Same thought. I’m curious to know what other kinds of authentication protocols require third-party cookies to operate. Within OIDC, even the more obscure/advanced features such as session management and global logout require only first-party (the IdP's) cookies in order to function.
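To make the point concrete: the redirect-based authorization code flow needs no third-party cookies, because the browser does a full-page navigation to the IdP (where the IdP's session cookie is first-party) and back. A toy sketch of building that redirect; the endpoint, client id, and URLs are made up for illustration:

```python
from urllib.parse import urlencode

# Hypothetical IdP authorization endpoint.
AUTHORIZE_ENDPOINT = "https://idp.example.com/authorize"

def build_authorize_url(client_id: str, redirect_uri: str, state: str) -> str:
    # Full-page navigation to the IdP: its session cookie is first-party
    # there, so no third-party cookie is ever involved.
    params = {
        "response_type": "code",   # authorization code flow, not implicit
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": "openid profile",
        "state": state,            # CSRF protection, echoed back on redirect
    }
    return f"{AUTHORIZE_ENDPOINT}?{urlencode(params)}"
```

The iframe/popup variants discussed below break without third-party cookies precisely because they skip this top-level navigation.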

I guess the tradeoff being made here is just leaning into our reliance on the certificate authority system. Whereas before, with third party cookies, you might have had more flexibility with how you structure your domains.

The implicit grant, while deprecated, is still used across the ecosystem. Further, embedded apps (eg you have a portal that iframes Salesforce and Salesforce needs to be authenticated) can't redirect to the login page or open that iframe for cookies.

Front-channel logout, part of the OIDC spec, is also broken. It opens iframes to sites, but the sites don't get their cookies and can't write them.

OAuth sometimes is done using an iframe and a pop-up window instead of by navigating to the federated auth provider and back. That version often needs third party cookies.

Sure, I suppose all that (waves hand at third-party cookies situation) is what I meant by the long phase out of them. As far as I can recall those defaults were signaled a fair amount of time in advance.

I have sympathy for anybody with a legitimate use for third-party cookies whose life has been made more difficult by bad actors abusing features to maximise their own profits.

Same tragedy that finally caused origin-based cache partitioning at the expense of some performance.

This is the same excuse Facebook's been using, although they're trying to push the narrative that limited tracking in Apple's iOS 14+ "hurts small business!"

https://twitter.com/jason_kint/status/1383391849902075911 (video in thread)

Another thing is that third-party cookies can actually offer real functionality that has nothing to do with advertising (for example, platforms like Salesforce let you integrate your own offering into a Salesforce user's system, and third-party cookies can be a pretty straightforward way of accomplishing that).

I don't really see any non-tracking reason for FLoC.

I have 3rd-party cookies off. The only thing I have seen requiring 3rd-party cookies in the last year is Microsoft, when I had to use Teams.

> FLoC is a new thing which is just being rolled out, so it's a lot easier for people to resist adding a new thing that makes the internet more crappy and less private.

What exactly about FLoC is making the internet less private though? Every time I see the technology it seems that it's deliberately built to keep private data in your Chrome browser and leak significantly less than anything else Ad related right now.
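For context on the "keeps data in the browser" claim: FLoC's published design computes the cohort locally, using a locality-sensitive hash (SimHash) over the browsing history, and only the resulting cohort id is exposed to sites. A toy Python sketch of the SimHash idea (not Chrome's actual implementation; the 8-bit cohort size is arbitrary):

```python
import hashlib

def simhash_cohort(visited_domains, bits=8):
    # Toy SimHash: each visited domain "votes" on each output bit via its
    # hash; similar histories tend to produce similar cohort ids.
    counts = [0] * bits
    for domain in visited_domains:
        h = int(hashlib.sha256(domain.encode()).hexdigest(), 16)
        for i in range(bits):
            counts[i] += 1 if (h >> i) & 1 else -1
    # A bit is set where the positive votes win; the result is the cohort id.
    return sum(1 << i for i in range(bits) if counts[i] > 0)
```

The privacy argument is that sites see only this small id, shared with thousands of other users; the counter-argument in this thread is that it is still a cross-site behavioral signal.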

Also, why is WordPress allowing so many ad and analytics tracking plugins and not considering those as security issues?

If you have figured out a way to eliminate tracking, be my guest; Mozilla would like to know, Apple would like to know. Until then, FLoC attracts attention because it's new. Yes, that explains our reaction. It's still an irrational reaction.

Also, what's this "predatory targeting of unsophisticated consumers" about? You don't need targeting for this. Heck, you don't need anything for this. The way it's usually carried out is you hack some sites and redirect them to your landing page about "this one magic trick to riches, banks hate her".

Mozilla at least has done plenty of things to reduce the effectiveness and scope of tracking on the web.

Just because we can't create an ironclad solution doesn't mean that there isn't real value in reducing the usable surface area for profiling and tracking users across the web.

FLoC is an additional means of tracking users that is presently being pushed by a giant corporation who controls a significant web browser, of course anybody who is pro-user-privacy would rail against its rollout. It's ridiculous to call that an irrational reaction.

Apple is doing just fine on greatly reducing tracking, and we don’t think FLoC or anything much like it is necessary. Safari already blocks 3P cookies by default and has done so for a year.

So you're the one who keeps doing that, or is it a group thing?

Just curious. It undoubtedly works, but I've always wondered why it's so pervasive.

Congratulations, you win a free iPhone.

The issue really is: Google isn't phasing out 3rd-party cookies out of charity. They are clearly looking for a way to keep doing all the things third-party cookies enable after they go away.

Here we find people saying (through legislative and regulatory action) that they want to end the use of 3rd-party cookies because of the bad behaviors they enable, and they are rightly outraged at efforts to comply with the letter of the law while running roughshod over its intent.

Why is Firefox phasing out 3rd-party cookies? Is it out of charity?

Firefox is not a product of the 800 lb gorilla of targeted-ad revenue. The Mozilla Foundation could live without the ad revenue tied to 3rd-party cookies. They feel pressure from their users (people like us) who want not to be tracked everywhere, and ending the use of 3rd-party cookies is one technical step.

At some point in the past year the privacy concerns seem to have reached critical mass. The Mozilla Foundation is responding in ways that keep its browser share relevant.

Yes. Mozilla is a non-profit, pretty much everything they do is out of charity.

Sort of true. To know what a non-profit's motivations are, look at the biggest donors and donations. Some of the Mozilla Foundation's money comes from the Mozilla Corporation, which does get a lot of money from the big tech companies.

As far as I'm concerned, it is just great that something made it for people to be "all up in arms about it". And "why" is not very important, and neither is it surprising, really. I mean, how could people have been all up in arms about a well-established technology all the web is built upon, when 10 years ago Google was, allegedly, "the good corporation", all people were like "I have nothing to hide", avoiding FB because of privacy concerns was just "being paranoid", and people telling stories confirmed by Snowden shortly after were "schizophrenic"? It is literally like 3-5 years (right before the Cambridge Analytica scandal, I guess) since being "concerned" about your web presence became somewhat mainstream.

And FLoC is... well, it's new.

Third-party cookies can be blocked by technological means. Heck, cookies can be blocked. There's even a toggle in Chrome to disable them, no extension needed. And yet, here we are, attempting to preserve the bucketing of humanity despite what anybody using the browser actually wants.

People don't want more things like 3rd party cookies.

The idea is that this is the bare minimum the industry needs in order to stop needing 3rd party cookies (and other tracking strategies).

It's not pretty, but on the other hand... ads do serve a function. And not all ads are bad. The focus should be on getting rid of the scammy ones, and not tracking won't help much with that.

Fuck the industry. The entire point is to get you to spend more money on stuff. I will do nothing to assist with the brainwashing of humanity, and will continually fight against it.

I think a sufficiently large number of people is saying they effectively do not care about the surveillance-based ad industry and want it dismantled.

Tracking is used pervasively by the ad ecosystem today, but that doesn’t mean tracking is needed for ads to exist. Contextual ads are a thing. FLoC isn’t even replacing the most important use of tracking, which is measurement, not targeting.

Ads serve capital, they do not serve me.

I think many people would disagree with the conclusion to your second paragraph. If you don’t take it as given that something must come and replace cookies then FLoC appears to be an attempt to rebuild that which was just cast into the bin of history.

One difference between third party cookies and FLoC is that with FLoC it is being explained to the public how browsing history is being used. The third parties who send Set-Cookie headers do not write up documentation attempting to explain to the public what they are doing and why it is not a threat to privacy.

As other comments point out, it would be more difficult for people to "be all up in arms" about third-party cookies because they are not an issue easily attributable to one entity or one set of documentation.

“attempting to explain to the public what they are doing”

The idea that you can explain FLoC, third party cookies or any other digital advertising technology to the public is crazy talk.

Ad tech is extremely complex and constantly evolving - “the public” includes children, the intellectually disabled, the mentally ill, the elderly and the illiterate.

No amount of documentation is going to help these people reach a point where they could actually be considered as giving “informed consent” to their “use” of an ad-tech stack that often involves dozens of different legal entities and software components.


“why it is not a threat to privacy”

The entire purpose of FLoC is to maximise profit for Google by minimising my privacy.

Agreed 100%.

Maybe it's only me, but I actually find the docs insulting, for the reasons you stated.

I think the cause and effect is a bit mixed up in your reasoning. What's actually happening is threefold:

1. People are starting to get fed up with tracking of all kinds (including third-party cookies). This is happening gradually, but is increasing. A consequence of this can also be seen from the legal side in the form of GDPR.

2. Google sees the tides on tracking are turning and tries to preempt by proposing a superficially less problematic alternative (FLoC) that will be least detrimental to their tracking business.

3. People dislike FLoC because it is not sufficiently lessened in this new normal and is therefore also unacceptable.

I am hopeful that this will help get rid of FLoC, but I worry about two things. One, this will end up being treated like the "Do Not Track" header. That was just totally ignored after IE (was it IE?) enabled it by default; that gave all the trackers a reason to ignore it and track everyone. I don't know if the exact same thing can happen here, but something similar maybe? The other thing I worry about is that FLoC 2.0, or whatever might replace it, will be worse.

"Kill it before it lays eggs." but do we worry about what evolves from this if it dies?

> "Kill it before it lays eggs." but do we worry about what evolves from this if it dies?

Nothing really evolves here - the status quo is what stays. You continue to be tracked head to arse on everyone's servers, the media keeps adding 150 trackers to every webpage, and the internet moves on.

Thinking that one of the biggest profit-making industries in the US will just go away if you scream loudly enough on HN is utterly naive; it will require a better push. This approach is inherently negative and just STOPS a process - it doesn't IMPROVE on the current state, and that will require more work.

I'm not quite sure what that work would be, though - it seems the current approach is "this gigantic multibillion industry must be banned and completely destroyed", which is great on a personal level, but I don't feel it's realistic on a purely political level.

We should keep in mind why Google invests in FLoC, though.

Either they realize third party cookies are on a (regulated) dead end. Or they realize there is a bigger moat. Or something else that helps them.

But in any case, seeing the current Google, this is not something benefiting their users (products?) primarily. Unless some benefits accidentally aligned.

So, pushing back towards the broken status quo may be the right thing, if you know, or believe, how Google is going to benefit from the new FLoC.

I cannot evaluate that. But Google's track record does not give me confidence that their new tech is going to help me overcome the issues I have with the status quo.

> We should keep in mind why Google invests in FLoC, though.

Is that really so hard to figure out? Google wants to continue running their Ad network and they've been under attack for collecting data.

FLoC seems to significantly reduce the need for data collection while still enabling their core business to continue.

I might be weird, but this still seems like a major win to me - since I care about my data not being stored somewhere off my devices, and FLoC keeps it on my terminal equipment. And it allows my Firefox to send customized or faked data.

Don't users not like being tracked by 3P cookies? Isn't that why 3P cookies are being phased out?

The cynic in me says that Google is phasing out 3P cookies only because it's got something better (via Analytics and now FLoC), and so it can simultaneously get some good PR while hamstringing the competition.

> "this gigantic multibillion industry must be banned and completely destroyed"

Which industry? Online advertising? Or the whole sector with Google at the front? I think it's a mistake to assume that tracking and the massive trading in personal information that takes place now is somehow foundational to either industry. Advertising worked before that was a thing and it will continue to work after. The amount of money flowing into advertising won't be dramatically changed because advertising is necessary.

It might be that if online advertising was significantly dumber, money would be shifted from online to print/tv/whatever, but that doesn't mean it's somehow "gone".

Also, if dumber ads are the only ads you can buy, then dumber ads will cost more. Right now, clever ads (ads with fraud-prevention mechanisms, conversion tracking, fantastic targeting) cost a lot of money. A dumb ad shown to every visitor to a website, without any targeting or follow-up, wouldn't bring much money per visitor. But if that dumb ad was all you could do and your other option is a bus stop ad, then you might have to pay a premium for it too. The loss of the ability to track people wouldn't change the laws of supply and demand for advertising space.

Only government action will work - and concerted action by several national governments at that.

Developing a browser (or forking the existing one) with comprehensive anti-tracking features would also work.

There are a half-dozen plugins one can add to Ungoogled Chromium to browse the web in (relative) safety. It's not a nation-state level undertaking: six or seven figures.

The problem really comes from apps, which are loaded to the gills with spyware.

How can it do that when the server needs an IP address to send traffic to? Cookies just make it more convenient but there is fundamentally that "analog hole". You can Tor things up and obfuscate but if you can interact with them they can track you.

> How can it do that when the server needs an IP address to send traffic to?


The Tor Browser is a pretty comprehensive solution to that, especially in high security mode.

>Only govt action will work. That too concerted action by several national govts.

Any government action will be a compromise by necessity. Which is why I think EFF doesn't really push for it - even GDPR doesn't ban targeted advertising in full.

Do you remember the Snowden story?

Government action is the shitty way out. This is all a classic "there is not enough to go around" situation. Government regulation will make it more entrenched and "manageable".

The best outcome is to come up with a fundamentally better business model. Something that satisfies sellers' desire to promote their products and customers' desire to feel respected and important. Preferably cutting out the middleman and reducing the costs of doing business at the same time.

> The best outcome is to come up with a fundamentally better business model

A fundamentally better business model already exists: make users into customers. Google should charge users directly for the services they use. Then they wouldn't need to resort to all these underhanded tactics to try to monetize their valuable services. They could just monetize them directly.

Of course this is highly unlikely to happen now that everyone is conditioned to expect valuable services like Google's to be available for "free". But they're not free and never have been: the only question is how we pay the costs. Right now we pay those costs with our personal data and our attention, plus the time and effort we have to spend to try to push back against our personal data being monetized and our attention being incessantly competed for by advertisers. I would gladly pay in money to make those non-monetary costs go away. Perhaps I am an outlier and not many people would. But that just means we pay the costs in other ways that end up being even more costly than the direct money costs would be.

Fundamentally better for whom? Do you think Google has never considered that business model? I think it’s much more likely that they’ve put a considerable amount of effort and research into it, and concluded that their current business model will let them extract the most money from their products.

That’s why we need regulation. Under these market conditions, Google’s business model does appear to be the best for them.

> Fundamentally better for whom?

In the long run it's better for everybody. But it is true that "the long run" can be pretty long.

> Do you think Google has never considered that business model?

I think Google probably considered it early on but found it easier to go the way they actually went. But "easier" is not the same as "best in the long run".

> their current business model will let them extract the most money from their products

Google doesn't have products, they have services. And of course, since their services are free to users and users are now addicted to that, they can obviously extract more money with their current business model since they have made a concerted effort to make the "users as customers" business model impossible.

However, their current business model was being built during the same time period when "Don't be evil" was still the company's motto and still apparently taken seriously by company leaders. Which means those leaders were either very disingenuous or delusional. Because addicting people to a free service and then exploiting them and their personal data in order to make the money they can't make from the users directly, as customers, is evil. And trying to keep their current business model propped up in the face of users becoming increasingly aware of the ways in which they are being exploited, is only going to force Google to be more and more evil. Sooner or later, if it doesn't change, it will kill Google as a company.

> That’s why we need regulation.

Regulation won't fix this problem. Corporations can always either buy their way around regulations (oh, another million dollar fine because we broke regulation XYZ about exploiting user data? just rounding error in our accounting) or buy enough influence to get the regulations written so they don't actually impose a burden on them (but do impose a huge burden on potential competitors, the new startups that would otherwise be finding ways to disrupt Google's current business model, since users are clearly becoming dissatisfied with it).

The only thing that will fix this problem in the long run is for users to realize that there is no such thing as a service that is (a) free and (b) valuable. We are going to pay the costs somehow. The simplest way to pay them--with money--is also, in the long run, the best.

Can you explain how exactly charging their users money is a better long term business model for Google than selling their data? If you don’t count the threat of regulation (since you say it won’t solve the issue) I can’t think of a single piece of evidence supporting that.

It sounds like your reasoning is “users will eventually wake up!” which I would bet a lot of money will never happen.

> Can you explain how exactly charging their users money is a better long term business model for Google than selling their data?

Because long term, users will realize that letting their data be sold is bad for them and will stop considering it acceptable. Indeed, that is already happening. And so, as I said, Google will have to continually become more and more evil to try to prop up their business model by further obfuscating what they are doing, until it becomes unsustainable and they crash.

> It sounds like your reasoning is “users will eventually wake up!”

More like: when enough users have suffered serious harm from having their data sold (which is only a matter of time--plenty of users already have suffered harm due to Google's incessant seeking after data--see for example all the furor over the "real names" policy, which was not just Google but they took plenty of flak for it), it will stop being considered acceptable. (Users who already correctly foresee such harms, like me, are already taking whatever precautions we can to avoid providing the data in the first place. I don't use Facebook, I don't use Twitter, I don't use any other social media "platforms", the only Google services I use are search and maps, and I never click on ads. And I would be glad to pay Google directly for search and maps, if only they would let me do so in order to avoid having what data I do provide them sold to third parties. In fact, given that "freemium" is now a recognized business model, I don't see why they aren't trying it.)

> which I would bet a lot of money will never happen.

Then I assume you are long Google?

> Because long term, users will realize that letting their data be sold is bad for them and will stop considering it acceptable. Indeed, that is already happening.

It's happening at the fringes, in communities that are already more privacy-conscious. I haven't seen any signs of a groundswell of support for this position.

> Then I assume you are long Google?

No, I was speaking figuratively; I don't have any desire to profit off surveillance capitalism. I simply see regulation as a far more likely solution than users demanding an ethical business model with their wallets.

> I simply see regulation as a far more likely solution than users demanding an ethical business model with their wallets.

In the short term, I agree we're far more likely to get regulation than a significant change in user demand. I do not, however, think any regulation we get will be a "solution".

Do you think it would be better if we were still billed by the kilobyte used? Pay-per-use or a monthly quota would outright discourage curiosity and add the mental fatigue of tracking costs. Keeping information access gated behind wealth would not be an improvement. Those very real costs outweigh the vague, theoretical, and frankly somewhat psychologically self-inflicted ones. Your data being monetized by others is not an intrinsic harm. Be wary of abuses, but use isn't always harm. Somebody doing a traffic survey may make money from your data, but that does not make you poorer.

> Do you think it would be better if we were still billed by the kilobyte used?

What does this have to do with Google? Yes, my ISP charges me a flat monthly fee for Internet access, and I prefer that pricing structure to being billed by the kilobyte (not that any ISP I'm aware of tries that any longer). But I'm still paying directly for the service. My ISP doesn't give me my Internet connection for free and then try to monetize it by showing me ads.

> A pay per use or monthly quota would outright discourage curiosity and add in another mental fatigue of tracking costs

Google could use the same pricing structure my ISP does: a flat monthly fee, with no limit on usage, billed to my credit card. No more mental fatigue than "do I have a working Internet connection?", which is exactly how much mental fatigue it takes for me to use Google now.

Would this be a challenge to achieve at scale? Sure. But a company that really took the motto "don't be evil" seriously would be taking on exactly this kind of challenge, precisely because it's a problem that someone is going to have to solve sooner or later, and is worth a lot to whoever solves it because it makes things better for everybody. Who better to do it than Google? But instead of hiring smart engineers to solve this problem, they're hiring smart engineers to figure out better ways to capture users' eyeballs. It's insane.

> Keeping information access gated behind wealth

In a sane society, resources like Google search would be made more widely available by the standard method taught in economics classes: price discrimination. The price they charge for their services would vary according to what the particular customer can easily afford. People in first world countries, like me, might pay $10 or $20 a month. People in the poorest countries, where Internet access itself is not guaranteed, might pay nothing, as they do now. Google has about four billion users and about $180 billion in annual revenue; that works out to an average of $45 a year per user. That seems feasible, if they are allowed to price discriminate. But of course price discrimination is considered "evil", even though it's not--it delivers more value to more people when it is allowed to happen.

> Your data being monetized by others is not an intrinsic harm.

While there are of course ways in which my data can be monetized that don't harm me, I think the actual evidence clearly shows that the ways in which our data is being monetized do carry a high risk of causing harm. "Traffic surveys" is not a good proxy for what most data harvesters are actually doing.

As soon as you invent that, you will be bought out or strong-armed out; it is very rare for a new niche to be established wholesale.

Also, nearly $125B was spent on internet advertising in the US in 2020, per the first estimate I found on the internet [1]. While Google and Facebook keep huge chunks of that, my guess is at least 40% flows through to publishers. So that's a $50B revenue stream to publishers (all sorts of websites, including news sites; apps; musicians, via Spotify and so forth) that we're talking about breaking. I really don't believe people have thought through all the effects of that. Not least of which is seeing almost all (reliable) news behind a paywall.

[1] https://www.statista.com/statistics/183523/online-advertisem...

So if suddenly all tracking stopped, advertisers would just stop spending money on advertising? That doesn't seem right... advertisers published ads before tracking was a thing, they would still do it if tracking becomes impossible.

Yes, it's called contextual advertising, and it's fine. Visit a camping website and companies pay to advertise camping gear, travel, etc. Visit a video game review site and companies pay to advertise new games, systems, etc.

The very idea that a user needs to be tracked from site to site and a profile built around his/her web activity is dystopian and depressing.

The key, of course, is that the big successes of advertising cannot use contextual advertising (much). On Facebook/Instagram/... it just doesn't work, as there isn't much context to the posts.

Huge amounts of it, yes -- particularly since the anti-FLoC people (who, to be blunt, I'm not in love with, particularly Google just deciding to do this on their own) tend to also be in the break-3rd-party-cookies camp.

With respect to eg brand advertising: even if you get past an inability to measure impact, once you break most of the ad infra, ad buyers simply aren't going to negotiate / buy with small sites. It's not worth their time or money. Small here is probably less than millions of uniques per day.

With respect to direct response advertising, you've mostly lost the ability to track a conversion. So it becomes pointless.

The advertising before extensive tracking was a different time: way way less money, way fewer ads, way less ad blindness amongst viewers, way way way fewer publishers, etc.

Will some advertising persist? Absolutely. eg the branded / source trackable referral codes that podcast advertising uses. But there will be an enormous falloff in dollars pointed at publishers.

And to be clear, I'm not a fan of 3rd party tracking. But we should be deliberate before we end the ad-supported internet.

Of course you can track ad impact and conversion - you just direct the ad to a certain url and see how many hits you get.

And banning extensive user tracking doesn’t mean “ending the ad supported internet”, that’s sensationalist to the max!

To suggest that ending tracking would mean advertisers have to individually negotiate ads with individual websites isn't true either - ad networks have been and will always be a thing, regardless of the ability to track.
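The URL-based conversion tracking mentioned above is easy to sketch. A toy illustration (the campaign name and parameter are invented for the example):

```javascript
// Toy conversion tracking without cross-site profiles: tag each ad's
// landing URL with a campaign ID, then read it back from incoming hits.
function tagAdUrl(landingPage, campaignId) {
  const url = new URL(landingPage);
  url.searchParams.set("campaign", campaignId);
  return url.toString();
}

// On the server (or analytics endpoint), recover the campaign from a hit.
function campaignFromHit(hitUrl) {
  return new URL(hitUrl).searchParams.get("campaign");
}
```

Counting hits per campaign ID gives conversions per ad placement, with no profile of the individual visitor involved.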

First, you could still track a direct response conversion by including information in the url for if they click on it. You can still even track impressions by measuring requests.

Second, if this will truly cause a drop in advertising spend.... then that money will be spent somewhere else, which might boost a different industry.

I don't think this would really change advertising spend, though... it would just change the type of advertising and how it is tracked/paid for.

Advertisers still want to get their ads in front of people, and the amount of content to advertising demand wouldn't change.

In fact, I think a change to content based advertising will help with content quality. With user based advertising, an advertiser doesn't care if the valuable person is viewing good content or not. Content creators just need to attract the valuable eyeballs, and can use as much click bait and useless content as possible to get them.

With content based advertising, the advertiser will spend on quality content, because that is the only metric they have to try to reach quality users.

The ad infrastructure can still exist -- it would just have a restricted set of data (IP, device fingerprint, the surrounding content, and whatever info the first-party publisher voluntarily submits about you) to decide what ad to serve. Small, niche websites may do better than big news sites under this regime since you can infer more about their visitors by the fact that they chose to visit.

I could see bigger sites expending a lot of energy trying to bring the tracking and inference in-house, and even federating these efforts, creating a kind of soft-paywall that requires you to "pay" by validating an email address or some other stable identity marker in exchange for temporary access to content, so they can watch what you browse and build a shared model of you that they can feed back into the ad networks. I could see the NYT continuing to manipulate and fine-tune its headlines and graphics, trying to sort its visitors into cohorts based on what appeals to them to squeeze every last cent out of a pageview.

At the same time, so much content discovery and consumption happens in the belly of the beast (Facebook, Google, Youtube) that most ads will continue to be targeted based on the considerable information those websites have about you, regardless of what browsers do or what happens to third-party tracker networks.

I feel news was of better quality 15 years ago than today so I wouldn't mind going back to that state of the world. And 15 years ago pervasive tracking wasn't a thing. So yes, please, let's kill it with fire.

> I feel news was of better quality 15 years ago than today

Having been around for the last 30 years (yes, even before the 90s!) I can say that the quality of news and the NUMBER of news sources are far greater. The amount of data online has only increased and has increasingly been sourced. Imagine the world before, when the only way to get news was to interview someone in person or read the copy that particular source was using in the narrative.

I've been around for the same amount of time so that's not the cause of the difference in our opinions. What might be is that I'm not talking about general availability of information, but news specifically. To me, news quality seems to have plummeted roughly at the same time as the rise of targeted ad industry, which is also the time news sources became "free" (as in the reader not paying any money for it).

We're already successfully killing third party cookies and most browser fingerprinting strategies. This is an attempt by a browser to build an intentionally user hostile mechanic to compensate, but we can kill this too.

We just need to continue to make it increasingly impractical and expensive to track users until it stops being considered a viable business strategy.

What do you mean? They are widely used, which seems far from dead. Aren’t you declaring victory too early?

Safari and Firefox already block them by default, and Chrome is set to block them before 2022: https://www.wired.co.uk/article/google-chrome-cookies-third-...

The FLoC proposal (and others) are happening now because of the coming cookiepocalypse.

The causality is more complex: Chrome's approach from the beginning was that they would remove third-party cookies and replace them with more private alternatives like FLoC: https://blog.chromium.org/2020/01/building-more-private-web-...

(Disclosure: I work on ads at Google, speaking only for myself)

If Chrome wants to be the only browser with third-party cookies, they're welcome to, I suppose. Breaking down Chrome's dominance has to start somewhere, and having a straightforward, easily verifiable reputation as the single least private browser on the market is a decent start. I already know what the headlines from most sites will look like if Chrome decides to reverse course.

If only Firefox was removing cookies, that would be a problem, because Chrome could just ignore them. But with Safari on board as well, and with the entire iOS market at stake for sites that try to ignore the policy...

If Chrome doesn't remove third-party cookies, they will be the only browser anywhere not to do so. Chrome's original stance might have been conditional on finding a replacement, but I'm not sure they still have a choice at this point. I don't think Google is going to hand that selling point to Apple, and you're seeing yourself in these comments that a lot of the people following this issue didn't accept Chrome's original promise as conditional.

And maybe Chrome is confident enough in their market position that they're willing to take that hit and they think it won't matter. Maybe they're even right. From my perspective, breaking Chrome's dominance on the web is a necessary thing that needs to happen eventually for the health of the web, so every time that Chrome makes their browser worse in a highly public way, that's a win.

Remember that Firefox and Safari are already blocking the majority of third-party cookies online, and those browsers still work today, the web hasn't broken for them. So every year that Chrome spends delaying that deprecation is another year where people like me can point out that they're lagging behind literally the entire market on privacy.

> If only Firefox was removing cookies, that would be a problem, because Chrome could just ignore them. But with Safari on board as well, and with the entire iOS market at stake for sites that try to ignore the policy.

nit: Safari was ahead of Firefox here, with ITP 1.0 blocking most third-party cookies by default in 2017.

Indeed, Apple's been at the forefront here. It's why I'm low key okay with the WebKit monopoly requirement on iOS, everyone has to deal with it.

And the other minority browsers are also on board now. Edge and Brave and such are also preferring privacy-friendly default configurations.

If we kill FLoC, my hope would be that Google still finds it untenable to backpedal on removing third party cookies... or that public awareness about Google's antiprivacy stance kills Chrome if they do backpedal.

It's simple: We force Google to stop tracking us, or we stop using Google products.

Chrome is not the only browser working on more advertising-specific APIs as more-private replacements for third-party cookies. For example, Edge is proposing PARAKEET [1] for remarketing, and Safari has implemented an initial conversion tracking API [2].

[1] https://github.com/WICG/privacy-preserving-ads/blob/main/Par...

[2] https://webkit.org/blog/8943/privacy-preserving-ad-click-att...

Yeah, I've heard of PARAKEET, and imagine concerns are quite similar to FLoC. Thankfully, Microsoft doesn't have the capability to push web standards, so as long as Google doesn't adopt it, we are good there. =)

Apple's solution doesn't look like it provides user interests or demographics, does it?

PARAKEET is much more like Chrome's TURTLEDOVE/FLEDGE than it is like FLoC ;)

There's a lot of cooperation here, and similar goals; I'm not sure why you think Microsoft and Google can't find an API they both like?

These are strategies that are being aggressively restricted. Chrome has not started preventing third party cookies yet, but they're the last holdout and have already stated they will kill them shortly.

If you're using a non-user-hostile browser, these strategies are already heavily limited by default and are already not a concern. Every Firefox release is making significant improvements on reducing the fingerprinting footprint of the browser, and several user-hostile API features proposed by Google have been rejected by them and Safari to prevent expanded fingerprinting.

> Chrome has not started preventing third party cookies yet, but they're the last holdout and have already stated they will kill them shortly.

Chrome's original announcement about phasing out third-party cookies is explicit about new technologies like Privacy Sandbox (which includes FLoc) being how third-party cookies will no longer be needed:

"After initial dialogue with the web community, we are confident that with continued iteration and feedback, privacy-preserving and open-standard mechanisms like the Privacy Sandbox can sustain a healthy, ad-supported web in a way that will render third-party cookies obsolete. Once these approaches have addressed the needs of users, publishers, and advertisers, and we have developed the tools to mitigate workarounds, we plan to phase out support for third-party cookies in Chrome. Our intention is to do this within two years." -- https://blog.chromium.org/2020/01/building-more-private-web-...

(Disclosure: I work on ads at Google, speaking only for myself)

Rhetorical thought question: how long could Chrome survive as the only browser which refuses to stop tracking users? Chrome is no longer clearly the fastest or best browser, falling behind those which block tracking scripts and ad content, and two alternatives to Google straight up pay users to use them. Where's the carrot for using Chrome?

So probably the phasing out third-party cookies will be postponed due to these reaction?

It is not clear to me at all what the overall view is on FLoC. Brave and Vivaldi don't like it, sure, but they already ship with built-in ad blockers so of course they don't. People here who don't like it also seem to be against advertising in any form beyond direct deals between publishers and advertisers for <img src="https://advertiser.example/ad">.

If there are people who are (a) ok with personalized ads, providing they can be done sufficiently privately and (b) do not like FLoC, then I'd love to read what they have to say!

(Still speaking only for myself.)

People here who don't like it also seem to be against advertising in any form beyond direct deals between publishers and advertisers

I doubt many people object to ad networks and real time bidding; it's just that the user's personal information shouldn't be exposed in the process. Yes, that means the only signals you'd get are the current page, and maybe high-level OS/browser/device info.

(a) ok with personalized ads, providing they can be done sufficiently privately

My opinion, which I think is fairly common around here, is that what you're describing is fundamentally impossible. Much like the incessant government demands for encryption backdoors that don't compromise security.

>> providing they can be done sufficiently privately

> what you're describing is fundamentally impossible

I guess the question is what you would consider to be sufficiently private? For example, would it be sufficient for the advertiser to be completely unable to distinguish you from a sufficiently large group of people with similar behavior?

I mean, the issue is personalized ads. It shouldn't exist, and advertisers would make just as much money without it if it were illegal. Content-based targeting has worked for decades and does work today. Sites have target markets, ads have target markets; connect these and you are serving ads to the right people, without compromising their privacy.

> the issue is personalized ads. It shouldn't exist

Why shouldn't personalized ads exist?

> advertisers would make just as much money without it if it were illegal

It depends very much on the advertiser. Advertisers with broad interest or close matches to specific publication types, sure, but that's not everyone. One way to think of this would be to imagine a world in which advertisers couldn't even choose where their ad appeared -- they would make less money then, without the ability to target contextually, right? There are many valuable transactions that only happen because the right information is given to the right person, and the less well-targeted ads are the more of those you lose.

The story is even worse for publishers. There are major kinds of publishing with negligible commercial tie-in. Historically, the expense of producing a newspaper or magazine meant that you were never holding a single article, and it could be treated essentially as one unit for advertising purposes. But now it is very common for articles to be shared in isolation, which means this cross-subsidy disappears.

Okay, I still think it’s too soon to declare victory until Chrome actually does it. It could be delayed.

By "we" here you mean... Google with Chrome (as the most popular browser), Apple with Safari and Mozilla with Firefow. Google being the one against whom the fight against FLoC is being fought?

That sounds... optimistic since you needed Google to form that "we".

> this will end up being treated like the "no track" headers

This seems disanalogous. FLoC requires browser cooperation. The user can simply use a browser other than Chrome.

What's to say Chrome will actually respect the opt-out headers in the first place? It could easily go like the DNT-headers, which was just interpreted as a signal to please-track-harder.

Are you sure it won't evolve into that anyway? Google isn't looking at FLoC as a compromise, it's just an intermediary while they continue their ever-lasting search to optimize their ad services. The next Big Thing will arrive whether or not FLoC is allowed to exist.

A comment in the WP post brings up the malicious nature of FLoC opt-out - it requires base layer changes to your site. Google knows from Samesite that it requires "your app is going to break" levels of urgency to get old sites to update, and can likely connect the dots to how an opt-out is much less likely to be used than an opt-in.

This feels like something that should get more attention/discussion. It flew for Samesite because "better security defaults" is a good argument. Not sure it works that way for FLOC.
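For reference, the opt-out being discussed is a single HTTP response header (per the FLoC explainer and the WP proposal):

```http
Permissions-Policy: interest-cohort=()
```

Simple to emit, but it still requires a server-side change on every site that wants out, which is exactly the "base layer change" concern above.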

Despite being involved in the Samesite rollout I hadn't quite made the same connection as that commenter, as I am not as connected to the FLOC work.

FLoC cohort computation only triggers on websites which call the document.interestCohort API or load ads. So your average website does not have to opt-out. It's not opted-in in the first place.
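A hedged sketch of how a page script would read the cohort; the API shape (`interestCohort()` resolving to `{ id, version }`) is from the FLoC explainer, and the guard handles browsers that don't implement it or pages where it's blocked:

```javascript
// Read the visitor's FLoC cohort, if the browser exposes one.
// Per the explainer, interestCohort() resolves to { id, version }
// and rejects when FLoC is disabled or blocked by Permissions-Policy.
async function getCohort(doc) {
  if (!doc || typeof doc.interestCohort !== "function") {
    return null; // browser doesn't implement FLoC
  }
  try {
    return await doc.interestCohort();
  } catch (err) {
    return null; // user, browser settings, or site opted out
  }
}
```

In a real page you'd call `getCohort(document)`. Any third-party script a page loads can make this call, which is why auditing those scripts matters.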

How likely is it that any third party script would call the API? Most sites I know try to pull in at least one Google hosted script.

You may have to audit third-party scripts before using them (which is not a bad idea regardless of FLoC).

If you load Google Analytics on your site, I feel like you have no right to complain that it may track users...

WordPress is 41% of the web. If this goes through and FLoC is disabled by default by WordPress, will FLoC be dead on arrival?

Between large web publishing platforms and all alternate browsers blocking FLoC, I think we could kill it, yes. WordPress is used by a lot of marketing focused folks though, so we'll see if WP is able to land this.

It's staggering how much leverage WordPress has. They were going to stop using React because of the patents clause, and only a week later Facebook caved and relicensed it as MIT.

This is very interesting. My web development role right now is at a marketing company that works pretty exclusively with Wordpress.

I've always been so interested in learning about the next best thing that I hadn't given Wordpress much thought.

Now, using it all the time, its popularity is very understandable as an interface for people who are not technically savvy to maintain their own website.

I feel like the Wordpress community isn't the loudest, but it is certainly a force. I think, as a brand, this move definitely has me more excited about working with their software.

Exactly. A big part of the WordPress community are publishers, bloggers, affiliate marketers, etc who rely on ads to generate revenue. I'm not sure they'd be too thrilled with this proposal.

Sure, but this doesn’t mean no advertising, it means no default supporting FLoC. I know advertisers aren’t going to like it, but I doubt it means they’ll give up advertising altogether.

I wonder if AdWords will require use of floc headers

> I wonder if AdWords will require use of floc headers

I don't know, but I guess they won't. Instead, you'll just get worse targeting on your site if your users don't send the headers. Which I think may also not be very popular with WordPress users, but I guess the proof will be in the pudding.

The ones in marketing will rather immediately request that it is turned on instead.

Google has such a monopoly that it will take a lot to overcome their plans.

Glad to see WP taking a stand - I never knew that FLOC would be so bad. The WP proposal made it clear that it’s a discriminatory technology.

Well, FLoC is implemented on Chrome, you don't disable it, you opt out with a Header.

So if Google finds that too many people use the header, they can just decide to ignore it from then on. Who is going to prevent them from doing that?

Possibly GDPR? As an explicit no-consent to tracking? Not rhetorical questions, I know too little about the details.

When you use Chrome for the first time, it makes you accept its ToS which tells you they are going to track you.

If the ToS are contrary to the law, then they are null and void. Laws tend to trump private agreements. Then, if it goes to trial in Europe, they’d have a hard time proving that the ToS are fair and that the user agrees freely and understanding what is being agreed, which is also another condition for any form of contract to be valid.

You're saying there is some law which prevents me from inputting my own data into a program, and it categorizes me into one of a thousand types of people?

My comment was specifically that ToS are not a licence to behave illegally. There is no law preventing you from doing that in general (though there are specific limitations), but there are laws on how PII needs to be processed and stored.

yes, laws can prevent you from doing things that, when explained technically and without context, sound trivial and not important.

Does the GDPR or any other law prevent end users from processing their own data in their user agents? Or require that headers they send be respected?

The GDPR requires tracking to be opt in. The fact that you have to use special headers to opt out is already problematic. Ignoring the headers to track anyway is of course worse.

IANAL, but my understanding is that this is not in line with GDPR. You are not allowed to force the customer into tracking, which effectively happens in the scenario you describe since the user can't use the browser without accepting the ToS. Also, you have to be quite explicit: simply burying tracking in 52 pages of unrelated legalese is not compliant with GDPR.

Someone please chime in if I'm wrong here. I'm no lawyer but do take these things seriously (I'm trying my best to provide a tracking-free website.)

They will lose that case under GDPR, you can't hide the details in ToS and hope the user doesn't see it. You must get informed and freely given consent. Google is violating both, because I can't click "No" and the information is so hidden you can't expect a normal consumer to find it.

It will take a few years but they're going to get hit very very hard by EU privacy regulators.

Of course, but the goal is not to win; the goal is to make it so it takes years before they get fined. In the meantime, they will have made enough money and it will be factored into the cost of business, then they will come up with a new tracking scheme. Rinse and repeat.

This assumes the majority of these WordPress websites will update to the latest version in a timely manner.

A key point of this is that if they consider it a security flaw, they will backport it into point releases for WordPress blogs that haven't done major upgrades in years.

If added as a security patch, lot of websites will auto update.

I'm not sure whether that would be wise for WP to do. It would show that WP can and is willing to push basically any update to sites running WP just to further a cause of the company.

Mweh if it doesn’t break anything. But terrible if it breaks something.

It's the WordPress Foundation and the code is driven by a community, not really a company with a chain of command...

"WordPress is 41% of the web"

This blows my mind every time. Even though I know it.

I don’t know it. Where did you learn it?

Okay, thanks!

It looks like it’s based on the top ten million websites by traffic, but weighted equally. Maybe there are lots of low-traffic WordPress sites?

> Maybe there are lots of low-traffic WordPress sites?

And many, many more high traffic websites. There are even some Facebook landing pages running WordPress, along with many other high profile sites[1].

1: https://wpvip.com/

Most likely Google will just turn off that silly opt-out functionality. It's not like anyone's going to stop using their spyware browser.

Surely that depends on what their experience using it is, just like every other "winning" browser before that is no longer winning? If FLoC generates so much hostility within the web dev community that a few major sites/platforms start actively blocking it, and if Google responds by ignoring the opt-outs in Chrome, and if the community responds with a SOPA-like "no access using Chrome for the next 48 hours then, here are some other fine browsers you can use instead that don't invade your privacy in this way", Google will simply be outgunned. However, you probably need platforms on the scale of WP and/or some sites with huge audiences like Facebook/Wikipedia/Netflix/Reddit to be on board for the effect to be fast and powerful enough to make a difference.

>and if the community responds with a SOPA-like "no access using Chrome for the next 48 hours then, here are some other fine browsers you can use instead that don't invade your privacy in this way"

that seems unlikely.

Is it, though?

It appears that Google is trying to rewrite the rules of how browsers and the Web work, with the appearance of being on the side of privacy, but actually introducing an alternative method of surveillance that is going to be less favourable to almost everyone except Google. How many of the huge-audience sites are potentially going to lose out from that, not least because they rely on advertising themselves for the lion's share of their revenues?

This whole discussion started with a proposal from a platform that is supporting nearly half of the sites people are visiting. That puts WP in a unique and potentially very powerful position here as well, and evidently they're interested in trying to force the issue.

And finally, the SOPA experience has shown that it is not entirely implausible for large numbers of sites to collaborate in this way if they feel the threat is serious enough. So if FLoC is as bad as the critics are suggesting, it doesn't seem entirely out of the question. There seem to be quite a few powerful organisations that would have a variety of motivations for wanting to give Google a bloody nose over this one.

I wonder, if WP takes the stance that FLoC is a security risk, whether they'd also consider a version of Chrome that doesn't allow opting out of it a security risk as well. And, if not, why not?

I'd like to see them try that and see how that flies.

Chrome is entrenched, but not like IE was. You have to install the browser in the first place, which means the moment it starts to be too crappy, people move elsewhere.

Why do you think Google hasn't prevented adblockers from running on it? If they did so, it would sink the browser so quickly.

One of the ways Chrome got as popular as it did was to bundle installation of it with various other programs, the way spyware and adware did. You install a random program, you don't open "advanced install" and uncheck "Chrome", and you end up with Chrome installed.

> the moment it starts to be too crappy people move elsewhere

You seriously underestimate the power of inertia.

>WordPress is 41% of the web

By domains or by visits?

As far as I'm aware, it's flawed in the same way as the PHP popularity stat: domains that report it in an HTTP header. I don't know about you, but I don't put a header advertising that I built a site with Python and Flask or whatever.

I guess those go in the "None" bucket, so I think they are counted.


FLoC is designed to be opt-in, so uh...... no?

Currently, for A/B testing, FLoC is automatically opting in 0.5% of sites that serve ads, but that's only for a small testing population; the idea is that FLoC history contribution will be opt-in exclusively. (There's a proposal that you have to contribute to FLoC history calculations to get access to a user's FLoC identifier.)

My fear is that it will end up exactly like the do not track headers and that at some point Google won't listen to the disable Floc header.

Why is that a problem? If I visit ognarb.com, what right do you have to tell me "You aren't allowed to use that fact for developing a profile about yourself"?

You send me a bunch of data, including headers, and I'm more or less free to do with that what I want within the privacy of my own browser. I don't have to listen to any of your headers if I don't want to.

I think that stat is more like 41% of servers, not 41% of traffic.

From my surface level reading of FLoC - would it be possible for Edge or Mozilla to implement FLoC - but to send noise / random / incorrect data up in a way that essentially wrecks the algorithm?

Then advertisers will fingerprint the browser as well, to see whether the FLoC data can be trusted.

Just have everyone spoof Chrome then

A substantial amount of modern Internet infrastructure relies on the fact that major actors are behaving in good faith. This isn't a chain of escalation anyone would benefit from going down.

The surveillance companies have started us down the path of bad faith by nonconsensually tracking us via protocol and implementation bugs that leak identifying information. IMO Firefox et al. need to keep working towards a better-specified JS runtime without these security vulns, so that when the layperson complains about big tech surveillance an easy answer is "Stop using Chrome".

For Firefox this is nearly impossible because of the different quirks it has in its JavaScript/layout engine. It might be easier to do with all the Chromium forks, but it's unknown how the proprietary bits in Chrome affect browser behavior. At worst they can use something like obfuscated code (e.g. Widevine L3) for attestation.

This should be done at extension level not browser level.

I don't see why not, but that doesn't help the ~95% of people not using Firefox (let's be real, Microsoft is not going to pass up the chance to violate someone's privacy).

The Verge interpreted MS’s stance on FLoC as a soft no. In any event, it is not an obvious yes.


This interpretation is missing the important context that the PARAKEET proposal (https://github.com/WICG/privacy-preserving-ads/blob/main/Par...) is another strategy for opt-out personalized ad targeting. So they may have technical quibbles or business concerns, but they're not opposed to the core concept.

"At Microsoft, we are committed to fostering a healthy web ecosystem where everyone can thrive – consumers, publishers, advertisers, and platforms alike. Protecting user privacy is foundational to that commitment and is built into Microsoft Edge with features like Tracking Prevention, Microsoft Defender SmartScreen, and InPrivate browsing. We also support an ad-funded web because we don't want to see a day where all quality content has moved behind paywalls, accessible to only those with the financial means.

Through this proposal, we believe we can substantially improve end-user privacy while retaining the ability for sites to sustain their businesses through ad funding. We propose a new set of APIs that leverage a browser-trusted service to develop a sufficient understanding of user interests and therefore enable effective ad targeting without allowing ad networks to passively track users across the web. This approach removes the need for cross-site identity tracking, enables new privacy enabling methods to support monetization, and maintains ad auction functionality within existing ad tech ecosystems."

> Microsoft is not going to pass up the chance to violate someone's privacy

If they are not benefiting and Google is benefiting they may pass on that.

Well, if those 95% of the people (who exactly is counting, and how?) want Mozilla to help them, they should consider switching from Chrome (and stop enabling Google in the meantime).

I just love Google's way of thinking.

Users: We hate cookies, because they are abused to hurt our privacy by allowing advertisers to build a profile about us

Google: We have a great idea! We can get rid of 3rd party cookies and instead make your browser build profile about you and share it with everyone.

IIUC, while FLoC does indeed build a profile browser-side, it isn't something that advertisers can track with the same precision as they can with 3p cookies.

So while it’s not the holy grail it does appear to be a small step in the right direction from the status quo.

Do I understand the situation correctly? Genuinely curious.

That's what I've been wondering. If FLoC is better for privacy than current tracking methods and Google intends to switch to using FLoC instead of current tracking methods, wouldn't it be better for FLoC to succeed?

The problem is that it isn't much better (if at all) in its current implementation. The current implementation allows tracking and identifying individual users (this may be or may not be fixed by Google, it has already been reported to their Github issue tracker). Since it is entirely based on your browsing patterns you can also be tracked cross device (people's browsing patterns rarely change).

For the moment 3p cookies are better, they can easily be erased, blocked or even isolated, and are restricted to a single browser.

I'll bet my browsing habits change substantially enough across devices to lead to different floc cohorts. I use my laptop differently than my phone and there are lots of cohorts.

The privacy of current tracking methods (3rd party cookies) is bad, which is why a lot of browser vendors are starting to block 3rd party cookies by default. FLoC may be marginally better than 3rd party cookies, but still browser vendors are mostly choosing to block it by default. There isn't a good reason to let either succeed.

True, that makes sense

Even if we assume that FLoC is entirely good, it's a false choice - why do we need _any_ tracking at all?

The web survives by serving ads. Targeted ads earn content creators more. They earn more because of higher conversion rates, so presumably users like them more (ultimately if an ad makes me buy something, I probably appreciate having seen it).

Yes, but it is unlikely to succeed.

The problem is that it is an unstable strategy. FLoC is strictly worse for privacy-conscious users if trackers don't change their strategy, and it is also strictly worse for trackers to stop using their current tracking techniques.

It is a bit like the prisoner dilemma.

True, but the FLoC implementation comes with its own can of worms; see the EFF's excellent post on it.

Paraphrasing what I saw somewhere

> If I visit things W, X, Y, and Z (where those are distinct elements with distinct fans), people within those cohorts will be indistinguishable, but I will likely be the only person who has been to all 4. Therefore, you can easily identify individuals. FLoC is a crock of shit. At least you could block 3rd party cookies.
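The intersection attack paraphrased above is easy to sketch. Here's a toy JavaScript illustration (the membership lists are made up; real cohorts would hold thousands of users, but the principle is the same):

```javascript
// Hypothetical membership lists: each cohort is large and anonymous on its own.
const fansOfW = new Set([1, 2, 3, 42]);
const fansOfX = new Set([2, 7, 42, 99]);
const fansOfY = new Set([3, 42, 55]);
const fansOfZ = new Set([42, 60, 61]);

// An observer who can link the same visitor across all four sites
// intersects the cohorts and is often left with a single candidate.
const intersect = (a, b) => new Set([...a].filter((x) => b.has(x)));
const suspects = [fansOfX, fansOfY, fansOfZ].reduce(intersect, fansOfW);

console.log([...suspects]); // → [42]
```

Each cohort on its own reveals little; the combination of several is what pinpoints an individual.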

You can block FLoC: switch to Firefox, or at least to any Chromium derivative that isn't Chrome.

I think you can enable/disable FLoC via chrome://settings/privacy or chrome://flags.

See: https://github.com/WICG/floc/issues/103#issuecomment-8218146...

Then again, FLoC is only enabled "in a limited set of circumstances" anyways.

For details on said circumstances, see: https://github.com/WICG/floc#qualifying-users-for-whom-a-coh...

I've not been following this proposal closely, but I did find:


>Tracking people via their cohort

>A cohort could be used as a user identifier. It may not have enough bits of information to individually identify someone, but in combination with other information (such as an IP address), it might.

Whose purpose is:

>A FLoC cohort is a short name that is shared by a large number (thousands) of people, derived by the browser from its user’s browsing history.

I wonder if it's possible to define a large enough number X that people are OK with the idea. (A cookie effectively puts you in a group of 1; no tracking at all puts you in a group of ~3,010,000,000, i.e. everyone on the internet.)

Could the cohort minimum size be configurable?

Given that the IP address can already be known today: what's the existing accidental "FLoC proxy"? How unique are you online already, via your existing fingerprint? (Something I'd not thought of before: my timezone alone can significantly narrow down who I am.) You can test yourself on: https://amiunique.org/fp
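The timezone point generalizes: every passively observable signal contributes some bits toward uniqueness. A rough back-of-the-envelope sketch in JavaScript (needs Node 18+ or a modern browser for `Intl.supportedValuesOf`):

```javascript
// How much does the timezone alone narrow you down?
const tz = Intl.DateTimeFormat().resolvedOptions().timeZone;
const zoneCount = Intl.supportedValuesOf('timeZone').length; // ~400+ IANA zones
const tzBits = Math.log2(zoneCount);

console.log(`timezone "${tz}" is one of ${zoneCount} values ≈ ${tzBits.toFixed(1)} bits`);
// In practice the yield is lower (population clusters in a handful of zones),
// but combined with UA string, screen size, installed fonts, etc., a few such
// signals approach the ~32 bits needed to single out one person among billions.
```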

"Could the cohort minimum size be configurable?"

This is a good idea, but unfortunately it would just lead to MORE ways to track users, since "size of cohort" would become yet another signal (and a highly identifying one for anyone who changes it, given how many users never configure anything).

The benefit of FLoC is that you will need to go out of your way to track and de-anonymize your users. Yes, it can probably be done with enough data. But it is literally the simplest operation in the world with 3P cookies, so simple that tracking users across the internet may just happen by accident. That is unlikely to happen with FLoC.

Can't a privacy-conscious browser defeat FLoC simply by sending random cohort IDs on each request?

That would require admitting that moving the tracking process to client-side actually improves on status quo (by not collecting data on the server).

Meanwhile, the whole framing from the EFF et al. is put in a way that does not allow even a small doubt that the proposal is the worst thing ever, with no redeeming qualities. That framing disallows working within this feature to modify browsers to send the required headers.

Not at all, if you're not taking part in the data collection and are just sending noise on the channel. I guess Chrome could counter by DRM-signing the cohort ID or something.

I would believe that random noise is easy to filter when you are Google.

Depends on the noise. But even if they filter it out, it still achieves the desired result of not having a cohort ID. And in the case of opting out, you're just in the default "doesn't like privacy-invasive tracking" cohort.

FLoC cohort computation only triggers on websites which call the document.interestCohort API or load ads.

This is not quite an opt-in. But a blanket opt-out isn't necessary either.
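For reference, the opt-in call described in the WICG explainer looks roughly like this. The API only resolves in FLoC-enabled Chrome builds, so a defensive sketch should feature-detect and handle rejection (the example `id`/`version` values are illustrative):

```javascript
// Sketch: read the FLoC cohort if the browser exposes it, otherwise null.
async function getCohort() {
  // Feature-detect: interestCohort exists only in FLoC-enabled browsers.
  if (typeof document === 'undefined' || !('interestCohort' in document)) {
    return null;
  }
  try {
    // Per the explainer, resolves to e.g. { id: "14159", version: "chrome.2.1" }
    const { id, version } = await document.interestCohort();
    return { id, version };
  } catch {
    return null; // rejects when the user or page doesn't qualify for a cohort
  }
}

getCohort().then((cohort) => console.log(cohort)); // null outside the trial
```

Note that making this call is itself what opts the page into cohort computation.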

This is an important point if true, do you have a citation for that? I can't find anywhere in the documentation indication that sites are only included in the FLoC model if they call the FLoC API.

Yes. Take a look at this issue: https://github.com/WICG/floc/issues/103

Then again: "final design is still subject to change based on [Origin Trial] feedback".

Final design is subject to the numbers in Google's ad department. Chrome is an adware browser, and security takes a backseat to any broken API or security disaster that Google's ad customers decide to mercilessly abuse for fingerprinting (AudioContext, etc).

The real solution is to make everyone stop using Chrome.

I am a bit uneducated on this, but does the Brave browser, which is based on Chromium, also have the same problem?

They've said they're going to disable FLoC. Still, this is one of the benefits of Firefox, it's not based on Chromium at all so it's out of the question.

No, Brave removes everything that has to do with Google from the browser.

It does ping static1.brave.com every now and then, which seems to use Google gstatic, which in turn uses Cloudflare:


The FLoC debate is pretty binary: you're either for it or against it. I think it's better to frame it as "how much tracking entropy should browsers provide?" Tracking entropy is the number of identifying bits a signal carries; if a service can place you in one of 1024 possible groups, that's log2(1024) = 10 bits.

The group you're in is currently determined by 1) third-party cookies and 2) fingerprinting techniques. Removing third-party cookies and introducing FLoC will probably reduce the entropy the browser provides. Recall that the FLoC proposal aims to put each user in a cohort shared with several thousand other users, which leaves about log2(4096) = 12 bits of anonymity within the cohort. A third-party cookie provides more: it is effectively a unique identifier, and you only need log2(3 billion internet users) ≈ 32 bits to identify every internet user precisely.
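To make the arithmetic concrete (figures are illustrative, assuming ~3 billion users and cohorts of ~4096 people):

```javascript
const USERS = 3e9;        // rough internet population
const COHORT_SIZE = 4096; // "several thousand" people per cohort

// Bits needed to single out any one user:
const fullId = Math.log2(USERS);                  // ≈ 31.5
// Bits a cohort label reveals about its holder:
const revealed = Math.log2(USERS / COHORT_SIZE);  // ≈ 19.5
// Bits of anonymity remaining inside the cohort:
const remaining = Math.log2(COHORT_SIZE);         // exactly 12

console.log({ fullId, revealed, remaining });
// A unique third-party cookie provides all ~31.5 bits; a FLoC cohort
// leaves ~12 bits of ambiguity that fingerprinting can try to close.
```

The revealed and remaining bits always sum to the full identification cost, which is why the size of the cohort is the whole privacy argument.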

So, moving to FLoC probably reduces the tracking entropy the browser provides. But it still leaves fingerprinting as a viable way to identify users; even if both third-party cookies and FLoC were eliminated, fingerprinting would remain.

So, I think the Google approach is "provide a minimum tracking entropy via FLoC, and try to bound maximum entropy by limiting fingerprinting." Privacy advocates want a world where browsers try aggressively to limit tracking entropy, perhaps ideally eliminating it altogether.

See the "privacy budget" mentioned here for a similar idea: https://blog.chromium.org/2019/08/potential-uses-for-privacy...

Disclaimer: I work at Google.

I mean, if we are going to be subject to mandatory profiling, why not take Brave's approach of paying users directly for the ads they see, cutting out the middlemen?

It would appear that there are already at least two plugins that take care of this for those who'd like to do so before it's rolled into the WordPress core:


You don't need a plugin for this (every plugin is a security risk). You only need to send one single http header.

True, but modifying core files to send the header isn't good either because you'll have to redo the change at every update. Also, most security plugins such as Wordfence will choke on a modified core file, and rightly so.

You can chuck the same hook (as seen in the original link) into your theme's functions.php file. Or make your own plugin to hold miscellany.

Or you can set the header in your web server configuration.
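For example (a sketch; exact directive placement depends on your setup), the opt-out is a single response header in nginx or Apache:

```
# nginx — inside the relevant server { } or location { } block:
add_header Permissions-Policy "interest-cohort=()";

# Apache (requires mod_headers), e.g. in the vhost or .htaccess:
# Header always set Permissions-Policy "interest-cohort=()"
```

Setting it at the server level covers every response, including assets that never pass through WordPress.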

The submitted title was "WordPress Proposal to Treat Google's FLoC as a Security Concern". That makes it sound like Wordpress itself is officially making this proposal. Is it? The page doesn't look like that to me.

We've reverted the title in keeping with the site rule: "Please use the original title, unless it is misleading or linkbait; don't editorialize." (https://news.ycombinator.com/newsguidelines.html).

I’m a WordPress committer and (somewhat former) owner of some large parts of WordPress. This is correct; the Make blogs can be posted to by many members of the project, and this does not indicate a decision or “official” word by any means. (I could create a Make post right now with a counter-proposal if I wanted.) It’s not a proposal by the WordPress Foundation, nor by any of the project’s leads.

However, this does have more gravitas than a random blog post elsewhere, as those with the ability to publish are contributors to the project who have made significant contributions.

Take this post as if it’s an emailed proposal to a project’s mailing list.

Thanks! In that case, the article's original title, appearing next to the domain make.wordpress.org, seems right.

The page does seem to be the official wordpress development blog, linked from wordpress.org's "get involved" page.

"The WordPress core development team builds WordPress! Follow this site for general updates, status reports, and the occasional code debate."

This is make.wordpress.org, kinda like an issue tracker for WordPress core.

It’s closer to a mailing list than an issue tracker; Trac (https://core.trac.wordpress.org) is the issue tracker.

> That makes it sound like Wordpress itself is officially making this proposal. Is it?

Seems like it is to me.

It looks like it to some and not to others, which is already confusing if it's an official proposal.

Clicking on the author's user profile [1] says they're a "Core Contributor". So maybe not the Wordpress org itself making the proposal but a core team member.

[1] https://profiles.wordpress.org/carike/

“Core Contributor” indicates they have contributed patches to WordPress previously and received acknowledgement (props) in the commit message, or have otherwise contributed to the Core component (i.e. the codebase, as opposed to Support/etc). It doesn’t indicate commit access or project leadership necessarily.

That said, only significant contributors get access to post to Make.

I wish someone at Google said "We have this idea that would significantly improve user privacy, and that's through means that would fundamentally hurt our possibility to deliver ads".

Or facebook saying "we have this idea that would improve the experience on our platforms, and we think it's a great idea despite hurting our ability to grow, show ads and our short term bottom line. It actively discourages 'engagement'".

If I had any stock in either company I'd still be delighted about these. I think it's the best long term growth strategy they can have. Focusing not on growth but on users and goodwill.

Ah come on. The FLoC proposal has built in ways to turn it off. If you don't wanna be put in a cohort you can just configure your browser (even chrome) to say you don't have one.

If it's not opt in, it's malware and should be treated as such. Don't let Google gaslight you.

It's sort of opt-in.

For websites, FLoC cohort computation only triggers if you call the document.interestCohort API or load ads - these actions are considered an opt-in. (https://github.com/WICG/floc/issues/103)

For users, it's sort of opt-in, too: You must be logged into a Google account, must have enabled Chrome history data sync, must not block third-party cookies, must have enabled Google web activity tracking and must have enabled ad personalization. (https://github.com/WICG/floc#qualifying-users-for-whom-a-coh...)

Also, you can disable FLoC via chrome://settings/privacy or chrome://flags. (https://github.com/WICG/floc/issues/103#issuecomment-8218146...)

It's not a perfect opt-in, but it's also not malware.

I believe what you are reading is ways that sync of floc classifier is disabled. That is, Chrome won't save the floc classifier in the cloud if sync is disabled, but, like history, will save it locally even if sync is disabled. (Assuming the setting isn't turned off.)

I'm not going to let anyone gaslight me. Features of a browser that aren't opt-in aren't definitionally malware.

User agents, for example, or even cookies, are not malware by any reasonable definition of the term. They present risks to the user and must be managed, but this is bounded.

Honestly, I'm starting to think treating Google itself as a security concern is the answer here. Lately their moves have been actively hostile to the open web; see AMP, etc.

Proposal: use Firefox

Although Firefox keeps getting worse, it is still a good alternative to Chrome... at least on PC. Firefox mobile stripped too many features in its latest version.

I think this is starting to get to the level of a moral panic. I respect that these developers think FLoC is bad, but what does it have to do with the WordPress project?

It's just HN. It's just like the reaction to AMP on this board. Most clients like the feature if it speeds up the site and brings more visitors to the site. Here, you'd think it represents the end of the internet or something.

To be fair, Google didn't build a browser, an email service, a free DNS service, and a free hosting/optimization service (AMP) just because, y'know, whatever.

I tend to roll my eyes at the blind hatred of corporations, but we also have to keep both feet firmly on the ground: these products and services are strictly tied to long-term plans for ROI. What kind of ROI would the biggest advertising network want? Tracking, profiling and serving profiled ads.

Look at gmail: I pay $60/year-ish for Fastmail. Gmail is at least that good. So is the purpose of gmail to have a cross device stable identifier? Absolutely. Are people realizing tons of value from it for free? Also yes.

I expect to get a gmail type service paying only the "price" of having some unobtrusive ads when I'm on the gmail site, nothing else.

I absolutely do not agree to google using anything from gmail to generate a stable user id for advertising, or e.g. show me ads in google search results or youtube videos, based on analysis of email content.

If Google can't provide what I expect (a free mail service paid only by ads on gmail dot com) they should tell me that they want $X per year and I'd happily pay it. It's not that I don't want to fund the operations of services, it's that I'm always assumed to rather pay with my information than with dollars.

Then don't use gmail. It turns out they have competitors, including one listed in my comment.

Too late, I used the address in hundreds of places. I’d like to pay Google for having them respect my privacy now.

In the age of social media, the loudest voices are frequently intolerant minorities who've virtue-spiraled themselves into extreme positions. The current opposition to FLoC is a great example of this phenomenon in action.

Anyone know if there has been any research into the relative value to advertisers of placing advertisements in content-targeted vs. person-targeted ways?

Lately the loss of security, increased tracking, etc. are very pressing issues of which the "general public" is not aware. Would it be feasible, or actually doable, to create an awareness month, a la Movember? This would help shine some light on what is being done by major corporations, which affects everyone.


Can someone explain how FLoC works like I'm five?

The intro lost me:

> WordPress powers approximately 41% of the web – and this community can help combat racism, sexism, anti-LGBTQ+ discrimination and discrimination against those with mental illness with four lines of code:

    function disable_floc($headers) {
        $headers['Permissions-Policy'] = 'interest-cohort=()';
        return $headers;
    }
    add_filter('wp_headers', 'disable_floc');
If you seriously think this is going to make a difference in racism, of all things... I mean... do people seriously think that? Do you know what racism is anymore?

Please don't take HN threads into extraneous flamewar. This is in the site guidelines: "Eschew flamebait. Avoid unrelated controversies and generic tangents."


Cherry-picking a detail you find most triggering in an article and importing it here to express how provoked you feel is a way of setting the thread on fire—no doubt unintentionally [1], besides which the greater part of the problem is caused by the upvotes such things attract—but still, we don't want threads-on-fire. We're trying for something different than that.

Readers should leave tangential provocations where they find them, and commenters should comment on what gratifies their intellectual curiosity, as the guidelines ask.

Edit: also, please don't use HN primarily for political or ideological battle. It's not what this site is for, and it destroys what it is for, so we ban accounts that cross that line [2], and your account's recent history seems to have crossed it. Fortunately that seems to be a recent development so it should be easy to fix.

[1] https://hn.algolia.com/?dateRange=all&page=0&prefix=true&sor...

[2] https://hn.algolia.com/?sort=byDate&dateRange=all&type=comme...

Ok, these are good points. I would love to have less politics involved in my tech discussions and will adjust my own comments as you suggest. Thanks!

I mean I'd be willing to listen to an argument that FLoC will contribute to systemic racism. I accept that it's plausible.

But it really makes me distrustful of the whole proposal when people make wild claims like that and don't feel like they need to make even the briefest attempt to back it up. It seems a lot more like they're just taking the currently trending social cause and co-opting it to support their own unrelated agenda.

FLoC exists to group users down into behavioral targeting categories, it should be obvious that some of those will end up corresponding to gender or race or other traits that are protected statuses. We've repeatedly had incidents where big companies were caught accidentally letting (for example) landlords filter advertisements by race or recruiters filter listings by age, both of which are illegal.

Yup, from the linked EFF article:

> Observers may learn that in general, members of a specific cohort are substantially likely to be a specific type of person. For example, a particular cohort may over-represent users who are young, female, and Black; another cohort, middle-aged Republican voters; a third, LGBTQ+ youth. This means every site you visit will have a good idea about what kind of person you are on first contact, without having to do the work of tracking you across the web.

To further back up the post - we have previously seen targeted advertisement used specifically to disenfranchise black voters, so there is definitely precedent.

FLoC is replacing cookies, that were already used in pretty much the exact same manner. I can't say I think FLoC is a win for consumers, but how it will promote racism any more than cookies is beyond me.

I could be wrong of course, if so, please explain how.

Because cohorts are stronger than cookies for sites that aren’t tracking you across the web and correlating that data.

I think you're the one that's operating on a purely old-school definition of systematic discrimination. You're giving people a signal that by its very nature groups similar people together and will naturally correlate with their age, gender, race, wealth, ability, blah blah. And then you're told that you're supposed to use this information to make decisions about them as individuals. How does this not lead to racism?

This is the digital equivalent of trying to be "race blind." You can't just remove the race column in your db and assume that it's fine to torture your data for patterns, confident that your results won't correlate with race.

It seriously smells like virtue signaling extraordinaire, worse still when corporations do it, because we know it's often for marketing purposes. Did H&M and Nike really care about Xinjiang cotton before people in social media started making a stink about it?

A lot of this is reminiscent of the hyperbole over AMP.

With the death of third-party cookies Google is trying to force browsers to add enough bits of entropy so that the same level of user tracking can be achieved through fingerprinting instead. Simple as that. The fact that Google is rolling this out right now but their plans to reduce fingerprinting move much more slowly, if at all, is telling. This absolutely needs to be treated as the massive privacy leak that it is.

> Simple as that.

Not quite? Maybe this will add more bits that will be useful for fingerprinting, but this seems like an absurd way for google to go about making it easier to fingerprint browsers, considering that most browsing happens over Chrome where Google can see what pages everyone visits anyway. And Google is currently proposing adding anti-fingerprinting measures [0] that observe how many bits of information a website has gathered and block API access after it reaches a certain threshold.

A straightforward analysis of Google's motivations makes sense here: they want to keep their ad business profitable while improving their reputation on privacy. FLoC allows targeted ads, keeping their business profitable, and doesn't rely on 3rd parties observing your browser history, improving privacy.

From https://web.dev/floc/ :

> With FLoC, the browser does not share its browsing history with the FLoC service or anyone else. The browser, on the user's device, works out which cohort it belongs to. The user's browsing history never leaves the device.

> There will be thousands of browsers in each cohort.

A further privacy improvement is that they're designing it to avoid leaking whether you're a member of a "sensitive category":

> The clustering algorithm used to construct the FLoC cohort model is designed to evaluate whether a cohort may be correlated with sensitive categories, without learning why a category is sensitive. Cohorts that might reveal sensitive categories such as race, sexuality, or medical history will be blocked. In other words, when working out its cohort, a browser will only be choosing between cohorts that won't reveal sensitive categories.

[0]: https://techcrunch.com/2019/08/22/google-proposes-new-privac...

Can’t you just switch to a Chromium fork without the FLoC? If they were closed-source, I think I would agree.

Sure, "you" - as a reader of hacker news - can use Firefox (or a chromium fork). The problem is that most normal users have no idea about any of this stuff, and no understanding of why they might want to switch.

Realistically most people just don't care the same amount that the subset of privacy-obsessed+techies do.

How many people who use Chrome can or will even know to do so? Tech oriented people already have alternatives.
