
All this focus on cookies and FLoC feels like smoke and mirrors from Google.

Modern adtech can track users via browser fingerprinting regardless of whether cookies are enabled, and regardless of whether they enable this new Chrome feature. They've been doing this for years.

So this new "privacy sandbox" is a diversion aimed at the public, and particularly at lawmakers, that signals "see, we care about user privacy", when in fact it ultimately has no impact on their revenue.

The public and lawmakers are only just starting to understand cookies, and there's growing concern about them, so this is Google getting ahead of the blowback. Fingerprinting is much harder to understand, and concern about it is so far under the radar that it will take many more years for the focus to catch up to these nefarious practices.

The frog is being boiled[1], make no mistake about that.

[1]: https://gazoche.xyz/posts/boiling-frog/




This is a take from someone who’s clearly not a domain expert. The purpose of fingerprinting is to identify individual users - which is pointless if you’re able to use third-party cookies, unless you want to do cross-device tracking or get around ad blockers. If third-party cookies are not a targetable ID in the bid stream (in a post-cookie world), there is nothing to match a fingerprint to, so fingerprinting is useless. You can talk about ID5 and IDLs in this same discussion, but they are explicitly opt-in.

Additionally, fingerprinting is not a tactic that advertisers want to use - anyone spending real money vets their vendors and wants to stay away from sketchy vendors who do that. Google doesn’t want it, TTD doesn’t want it, Xandr doesn’t, Coca-Cola doesn’t, Nike doesn’t, etc. We all want a technology that is truly privacy focused for users, but still enables functionality that is critical to advertising like brand safety, frequency caps, and some semblance of targeting (even via context). That doesn’t even get into retargeting/dynamic retargeting.


Way to ad hominem, but you're right, I'm not a domain expert. Just a web user who refuses to be tracked and manipulated by advertising, and highly skeptical that any of these changes are done to benefit the user.

> there is nothing to match a fingerprint to, so fingerprinting is useless

Huh? A fingerprint doesn't need to match _to_ anything. It just needs to be consistent across browsing sessions for a profile of visited sites and interests to be built.
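
To illustrate (a hand-rolled sketch, not any real tracker's code - the attribute list and hashing choice are assumptions for the example): hash a few stable browser attributes into an identifier that survives cookie clearing.

    async function naiveFingerprint(): Promise<string> {
      // Combine attributes that rarely change between sessions.
      const signals = [
        navigator.userAgent,
        navigator.language,
        String(navigator.hardwareConcurrency),
        `${screen.width}x${screen.height}x${screen.colorDepth}`,
        Intl.DateTimeFormat().resolvedOptions().timeZone,
      ].join("|");
      // Hash into a compact, stable identifier.
      const digest = await crypto.subtle.digest(
        "SHA-256", new TextEncoder().encode(signals));
      return Array.from(new Uint8Array(digest))
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");
    }

The same value comes back on every visit with nothing stored on the client at all, which is exactly why cookie controls don't touch it.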

> Additionally, fingerprinting is not a tactic that advertisers want to use

Really? Citation needed. All advertisers want their ads to be highly targeted to the consumers most likely to make a purchase. The reason web advertising is much more appealing than advertising in traditional media is precisely that it allows microtargeting on a level not possible via traditional means. Advertisers are always chasing a higher conversion rate, and microtargeting is proven to yield better results than showing ads to a large and generic cohort of consumers. Advertisers aren't happy about the Topics API, and many will choose whatever technology allows them to continue targeting ads more accurately. Fingerprinting is so far the most foolproof method of doing this, since it avoids pesky cookie blockers and is difficult to detect.

> We all want a technology that is truly privacy focused for users, but still enables functionality that is critical to advertising like brand safety, frequency caps, and some semblance of targeting

I call BS on the first part. Ad targeting goes directly against user privacy. There's no reconciling the two. Advertisers can go back to buying ad space in context-relevant places (e.g. show fishing ads on fishing-related sites), but none of them want to lose a _substantial_ part of their revenue by giving up user tracking.

How you can be so defensive about this is beyond me, and leads me to believe you work in the ad industry.


I work at a DSP and directly manage a few million a month in ad spend. I talk weekly to digital marketing managers, VPs of marketing, heads of analytics, etc. at household e-commerce and CPG brands. All of them have extremely strict vetting practices to ensure their vendors are not fingerprinting or using any mildly questionable tactics. Literally all of them want privacy focused advertising; some of them are even requesting audits of the environmental impact of our server usage, etc.

There is a world that is privacy focused and gives advertisers what they need - that’s what the privacy sandbox is trying to achieve. My employer works with Google directly on Topics and other solutions to achieve what we want and create privacy.


Do their vendors, or your DSP, even know where user profile data originally came from? If it passed through multiple data brokers as 2nd- or 3rd-party data, is the original source even known? And if so, does your platform give advertisers filters to exclude profile data based on the method of acquisition?

And you're saying that all of those big brands refuse to use profile data acquired by fingerprinting, even if it would allow them to microtarget their campaigns? So they're essentially valuing people's privacy over their own profits? Somehow I highly doubt that.

I'm not saying that what you're saying is false. I just think that in an industry with highly shady and consumer-hostile practices, built on the core ideas of psychological manipulation - an industry that needs to be regulated to stop violating people's right to privacy, and even then finds ways to tiptoe around regulations - it's quite unbelievable that all of a sudden they have a change of heart and actually claim to care about consumers. I mean, you're a part of it, so forgive me if I instinctively distrust your claims, and the claims of your clients.

> Literally all of them want privacy focused advertising

So they're running ad campaigns without targeting user profiles at all, in the same way advertising is done on traditional media? Because that's the only "privacy focused advertising". I highly doubt this as well.

Again, privacy and advertising are part of a zero-sum game. Advertising profits increase at the expense of user privacy, and the more targeted campaigns are, the more profitable they are. It would be foolish to think advertisers and adtech companies would be willing to sacrifice their profits out of the goodness of their hearts. This is why we need regulation in the first place, because they're not capable of self-regulation, and will pursue profits at all costs. But this is nothing new, and big business has been exploiting people since the dawn of industry. So please don't try to frame advertisers and adtech as some kind of benevolent actor.


> Modern adtech can track users via browser fingerprinting regardless of whether cookies are enabled, and regardless of whether they enable this new Chrome feature. They've been doing this for years.

One of the explicit goals of the "privacy sandbox" is preventing browser fingerprinting by limiting the informational entropy exposed by the user's environment. https://github.com/mikewest/privacy-budget
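
As a rough sketch of what that budget idea means (the explainer leaves the mechanism open, so the API, costs, and threshold below are made-up illustrations): the browser charges each origin for the entropy bits its reads reveal, and once the budget is spent, further reads get generic answers.

    // Hypothetical privacy-budget accounting; numbers and API are invented.
    const BUDGET_BITS = 10; // illustrative per-origin entropy allowance
    const spentByOrigin = new Map<string, number>();

    function readSurface(origin: string, costBits: number,
                         realValue: string, genericValue: string): string {
      const spent = spentByOrigin.get(origin) ?? 0;
      if (spent + costBits > BUDGET_BITS) {
        // Budget exhausted: return a generic value so no new
        // identifying bits leak to this origin.
        return genericValue;
      }
      spentByOrigin.set(origin, spent + costBits);
      return realValue;
    }

So e.g. readSurface("ads.example", 4, navigator.userAgent, "generic-UA") would yield real data only until the 10 bits run out, then start returning the generic value.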


But the implicit goal is that _Google_ now owns fingerprinting in Chrome, versus the various other actors and tech in the space. Same as Apple owning fingerprints in iOS (and thus disrupting Facebook).

For anything “privacy tech” you must divorce the adversarial case (an actor maximizing an attack vector) from the average case (a monopolistic company using the tech to control overall opportunity costs). The latter is under-funded in public study because Google et al. will both throw gobs of money against it and throw shiny privacy-tech problems out there to distract researchers.


That makes it even worse IMO.

Google's justification for this was, after all, that it's a nerfed alternative to persistent user identifiers (like 3rd-party cookies), because you have to give the poor, starving advertisers something in exchange if you take away their ability to identify users.

So far, so bad, but if advertisers can in fact still identify users, then FLoC will just be another, relatively high-quality signal that they can add to the profile. (In fact, fingerprinting isn't even needed yet, as Google apparently feels it's fine to activate FLoC long before it disables 3rd-party cookies. How that squares with its presentation as a privacy feature is a lesson in corpospeak, I guess.)

So especially in that situation, you should turn off FLoC.


They have to set up the replacement (the Topics API) before they get rid of the previous solution (third-party cookies). Sites need time to adjust and implement the new systems.

They also need to not make sweeping changes to the ad industry that could be described as anti-competitive or monopolistic. I doubt they'd get away with just turning off third-party cookies in their browser.
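
For what it's worth, the caller side of that new system is small - a sketch using Chrome's documented document.browsingTopics() entry point (the typings cast is mine, and only the topic field is shown):

    async function fetchTopics(): Promise<number[]> {
      // Structural cast: browsingTopics isn't in standard DOM typings.
      const doc = document as Document & {
        browsingTopics?: () => Promise<Array<{ topic: number }>>;
      };
      if (!doc.browsingTopics) {
        return []; // browser doesn't support the Topics API
      }
      // Calling this returns recent topics and also registers the caller
      // as an observer of the current page for future epochs.
      const topics = await doc.browsingTopics();
      // Each entry is a coarse interest-category ID from a shared
      // taxonomy, not a per-user identifier.
      return topics.map((t) => t.topic);
    }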


In the EU, as far as the ePrivacy Directive (the "cookie law") is concerned, fingerprinting is treated like using a tracking cookie, even if no cookies are actually involved. And as far as the GDPR is concerned, since fingerprinting can identify a visitor, it counts as personal data, so you need a legal basis for processing it.

Not sure what the point you're trying to make is.

Also, Google under the Privacy Sandbox has been exploring ways to introduce fingerprinting limitations and a budget. Which may as well be smoke and mirrors, but if you watch their marketing materials, they talk about fingerprinting in general.


Sites that use tracking cookies rarely comply with the law as it is, and even then skirt around it via "legitimate interests" and other dark patterns. What makes you think they would disclose a behavior that is even more difficult to detect?

We can't assume good will and good behavior from an industry that is built on deceiving and manipulating the user. The GDPR is a good first step at regulating these practices, but it's too vague, and it's applied far too leniently. It also obviously only applies to people in the EU, and not to the global industry.

I wasn't familiar with the privacy "budget", but it sounds like Google is trying to define privacy as a scale, where some amount of fingerprinting is OK. Users can be identified with just a few data points, and some are more valuable, depending on the context. Some might even be required for the site to function, so will there be "legitimate" exceptions to the budget in those cases? It sounds like a backwards approach that will be difficult to manage, so I'm not sure it will be a win for protecting privacy.

More importantly, I don't trust that an adtech company will go out of its way to implement solutions that go against its bottom line. These companies have a track record of abusing user data, and the only reason they take these initiatives is for good PR, which is again protecting their bottom line. The entire industry needs much broader and stronger regulation for any of this to actually improve.


The parent complains that lawmakers don't understand fingerprinting, and that companies like Google are trying to avoid regulation of fingerprinting by focusing on cookies. Such statements are false.

You're moving the discussion towards law enforcement.

Well, DPAs in the EU are overwhelmed, but lawsuits and rulings are progressing. For instance, Facebook found out that they can't force behavioral advertising via their ToS or via legitimate interests:

https://thisisunpacked.substack.com/p/the-eu-war-on-behavior...

I'd also add that small companies may fly under the radar, but big companies like Google and Meta are big targets.


Yeah, and the legal basis will be something like “we need to track users to improve our services”.


That's not a lawful basis under GDPR. There are only 6.[1]

(a) Consent

(b) Contract

(c) Legal obligation

(d) Vital interests

(e) Public task

(f) Legitimate interests

What a lot of companies are trying to do right now is weasel through under "legitimate interests" (e.g. a lot of scumbag SEO-monkey websites have cookie consent dialogs stuffed with "legitimate interest" switches, even though that doesn't work the way they think), but it's not clear that "improving my services at the expense of people's privacy" would pass the "legitimate interest" test if that ever goes to court. Legitimate interest requires them to pass "purpose", "necessity" and "balancing" tests. The "balancing" test in particular weighs the company's interests against the user's interest in maintaining privacy. Here's more about "legitimate interest" under GDPR.[2] It's not the get-out clause that people seem to think.

[1] https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-re...

[2] https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-re...


It doesn't matter what the law says if it's not being enforced. Much more blatant GDPR breaches are still going unpunished, so do you really think they are going to audit every single website to make sure they comply?


Enforcement is a huge problem, that is true.

My hope is that with recent rulings against Google, Meta etc. we might see an improvement across the board. Like there's some improvement with reject buttons: https://noyb.eu/en/where-did-all-reject-buttons-come


Absolutely, and I have to say one of the problems with the (vast) ambition of the GDPR in tackling what is a huge problem is that enforcement is a massive undertaking, especially when the (alleged) transgressors are massive multinational corporations with practically infinite resources to put into evasion.


How on earth do these HR companies that scrape LinkedIn and sell the data fall under GDPR? They claim to.


They are trying to present that as a legal basis, and it's not.


Google can also use people's IP address since so many use Gmail, Android, YouTube or Google Sync.



