Et Tu, Procter and Gamble? (daringfireball.net)
143 points by FabHK on April 10, 2021 | hide | past | favorite | 73 comments



> Basically, IDFA was Apple’s attempt to work with companies to provide a way to offer a sanctioned identifier for advertising tracking that respected user privacy and user control over tracking. It didn’t work — these companies have no respect for user privacy or user control, even with IDFA.

I'd disagree with this part - I think IDFA worked as designed; what has changed is what Apple is marketing and how much power they have to enforce their vision.

Apple created IDFA in 2012 to give users a way to control and reset their advertising ID. It worked as designed. Advertisers could track users, but users had control to reset their advertising ID to disassociate their data. Later Apple introduced the option to disable tracking, but it was opt-in, not opt-out, so most users still allowed tracking and didn't realize it was happening.

In iOS 14, they created an opt-in system (which everyone knows most users won't opt into) and a ban hammer for apps that violate the spirit of that opt-in.

Ten years ago Apple wasn't running ads about privacy and they were working on building out the App Store. Hell, 2012 is also when they built Facebook and Twitter sign-in into the operating system. Smacking down ad networks and decreasing revenue for app developers would have been a lot harder. They're in a much better position now.


>Ten years ago Apple wasn't running ads about privacy and they were working on building out the App Store.

Steve did

https://www.youtube.com/watch?v=39iKLwlUqBo

>Privacy means people know what they're signing up for, in plain English and repeatedly

>I believe people are smart and some people want to share more data than other people do. Ask them. Ask them every time. Make them tell you to stop asking them if they get tired of your asking them. Let them know precisely what you're going to do with their data

I think the issue was that Apple bungled IDFA by not making it opt-in initially and is now trying to correct that mistake.

Apple was vilified 10 years ago for this stance, but at the end of the day they were right with their opt-in view on tracking and about how pervasive Silicon Valley's data collection was going to become.


> think IDFA worked as designed, Apple has changed in terms of what they're marketing and how much power they have to enforce their vision.

Companies improve their products and sometimes that changes the ecosystem. This is way better than, say, the car industry, which fought safety enhancements for decades. This is improving safety for their customers; the people who believe they will suffer for this are like the high-interest payday lenders of the internet.

Apple has taken a pro-privacy position for longer than the iphone has existed. Execution has not always been great but I do believe their position is genuine. However I’ll let them defend themselves on this one: they have a communications group.

I’m not trying to claim they are saints, merely that I believe their pattern of actions has shown a longstanding interest in their customers’ privacy. Perfect? No, but better than any of the others, AFAICT.

PS: if you want another car analogy: certain other actions by Apple remind me of the car companies’ proprietary extensions to the open OBD-II. So again, not saints.


Unfortunately their protections do not really apply to their own apps.

While 3rd-party apps need to show individual dialogs to get access to location data, all of Apple's own services (e.g. data collection for Maps, or the "Find My" network) automatically get access to location data if you enable location services on your device.


Some of them do (especially, but not only, on the Mac), so I wonder if calling them out would cause them to change. They responded when they were called out for letting their own Mac apps bypass connection filtering.

Also some of these questions are asked in the setup dialogue.

For this particular case (the user ID) they explicitly assert that they are also subject to it.

I suspect what happens is that their own apps don’t go through the App Store review process. The lack of that kind of dogfooding is, IMHO, the real problem.


> For this particular case (the user ID) they explicitly assert that they are also subject to it.

It's kind of a ridiculous claim though, because Apple is the only company that has access to your device UUID (on iOS). 3rd party services aren't allowed to access the device UUID, so they get the IDFA. Apple doesn't need to use the IDFA, they just use the UUID which the user can't change, and link it to your Apple ID.

Also, Apple really really wants you to have an Apple ID. iPhones are almost useless without an Apple ID (no way to install software). So it's not like you can opt out of Apple tracking you by not signing in with your Apple ID.


A slight issue for privacy is that any persistent source with sufficient entropy is going to be a unique identifier, and even low-entropy sources, when combined, multiply into an enormous space of distinct identifiers. The same goes for browser fingerprinting; it's the same set of techniques.

> Through apps, CAID collects user device data, such as the device start-up time, model, time zone, country, language and IP address.
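To make the entropy point concrete, here's a rough Python sketch of how a handful of coarse attributes combine into a stable pseudo-identifier. All the field values below are invented, and the hashing scheme is purely illustrative — this is not how CAID itself works:

```python
import hashlib

# Hypothetical device attributes of the kind the quoted report lists.
# Each field alone only identifies a coarse bucket of users, but the
# sizes of those buckets multiply when the fields are combined.
device = {
    "boot_time": "2021-04-09T07:13:22",  # second-level precision: high entropy
    "model": "iPhone12,1",
    "timezone": "Asia/Shanghai",
    "country": "CN",
    "language": "zh-Hans",
    "ip": "203.0.113.7",
}

def fingerprint(attrs: dict) -> str:
    """Hash the concatenated attributes into a stable pseudo-identifier."""
    blob = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(blob.encode()).hexdigest()[:16]

print(fingerprint(device))
```

The same device always hashes to the same value, while changing any single field (a new IP, a reset timezone) yields a completely different one — which is exactly why trackers collect several fields at once.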

Any business whose gross margin per unit is as low as P&G's is effectively an advertising company with a smelly water supplier in the back. We can probably expect the same surveillance techniques from the other CPG manufacturers as well.

The browser itself has become parasitic, and the only viable way to mitigate this predatory advertiser and surveillance problem is to recognize and delineate the difference between what can only be described as the hegemonic internet (hegemonet?) and private channels like matrix/element, or previously irc and other ways of connecting.


Maybe VPN providers need to start providing hosted browsers.


You can win at "whack a mole" by hitting harder. Pass some real legislation and fine P&G a billion dollars. Works better than any amount of technology-level detection.


Legislation, sure, but what do you do about other countries whose laws and activities are the opposite of your own?

What China wants may be completely different from what we want, and they have a seat at the table too.


>And what is Apple going to do if they do identify apps in China using CAID in flagrant violation of the App Store rules, if those apps have the backing (implicit or explicit) of the Chinese government?

The CPC is on Apple's side when it comes to privacy.

https://www.reuters.com/article/us-china-cac-personaldata-id...


> Doing this is clearly against Apple’s rules. The questions are: Can Apple detect these techniques? And what is Apple going to do if they do identify apps in China using CAID in flagrant violation of the App Store rules, if those apps have the backing (implicit or explicit) of the Chinese government?

As much as I love Apple’s move toward more robust privacy policies, I doubt that they have a chance to prevent this from happening. From a tech perspective, that's because detecting these techniques requires a lot of effort.

But also from a business perspective. It's one thing to ban a game, but banning WeChat?


I don't get the connection to WeChat here. You need to log in anyways, so why should WeChat run any fingerprinting? The user ID is right there, given by the user.


WeChat may have a convenient way to track users within the app, but a "device ID" equivalent is still useful for ad agencies to transparently target users on other apps based on the things they do on WeChat. (Otherwise, they'd need to add more user friction - for example, forcing users to link accounts.)

For context, some have argued [0] that modern Chinese life, in many ways, revolves around WeChat - chat, payments, location sharing, games, and more - and I don't find any reason to doubt those claims.

Sure, Apple could say no if WeChat decided to integrate fingerprint-driven tracking on iOS, but because of WeChat being a "super app" in China, Tencent could just say "sure, we'll stop supporting iOS" and that would cut iOS devices off from one of the most used apps in China, and make everyday life harder for everyone in China with an iPhone.

With iOS being the minority player in China [1][2], Tencent can likely get away with it too - I don't foresee a competitor rising up out of nowhere especially when WeChat still has a deep network effect for the remaining 78-82% of the population that uses Android.

[0]: obtained from Wikipedia: https://web.archive.org/web/20170103135948/https://www.fastc...

[1]: https://www.kantarworldpanel.com/global/smartphone-os-market...

[2]: https://gs.statcounter.com/os-market-share/mobile/china


> With iOS being the minority player in China, Tencent can likely get away with it too

I would somewhat disagree. Yes, iOS is a minority in China, but for wealthier people it's a higher percentage as it's a status symbol (as it is in many places). I don't know if this would change the calculus much but it might, and apparently the party is keen on cutting down on apps requiring more than essential data collection without consent[1] so it's possible that the government won't directly force Apple's hand in this case (as in, by forcing WeChat to remain in the App Store even using CAID).

1: https://www.reuters.com/article/us-china-cac-personaldata-id...


I would guess what they want is the ability to link behavior outside of WeChat (e.g. looking at computer monitors on Amazon, and then reading tech news on the Verge) with your profile on WeChat. Apple’s changes are in part designed to limit the abilities for entities to track you via third party cookies as you go around the web; device fingerprinting would be a substitute for those third party cookies.


Attaching a username to a legal name is one high-level fusion. Attaching these names to a device ID is another. Attaching that to a different app and username are a couple more. This account information can be used in finding out when that user logs into another device.

If any of those devices are sharing browsing history, that can be fused with online accounts. If any of those devices are active during daily commutes, that can fuse in brick-and-mortar shopping destinations, place of employment, what daycare watches their kids, where Mom and Dad live.

Maybe, some of that data is embarrassing. Maybe, some of that data could be used to tell a story about you. Maybe, that story is false... but, would be very difficult to refute if made public. Maybe, you should just work with us (this one time, I promise) rather than have your whole life turned upside down...

No drop of water takes responsibility for the flood.


I have to agree with you here...


Ironically -- and I don't use that phrase lightly but this is Apple apologist Gruber -- the context of et tu is weaker persons banding together to stop an out of control tyrant.

Brutus was by at least a whisker on the right side of history at et tu time.

And his self-interest was raw mortal threat. Proscription lists were a thing.

As another halo company, P&G is an Apple rival and Apple doesn't cooperate with it in developing its policies. Apple develops its policies solely in its self interest. And changes them without input from stakeholders.

Thus I argue that "ironically" is justified here. YMMV.


Maybe it's me being bad at reading diplomatic language, but Apple directly states:

> Apps that are found to disregard the user’s choice will be rejected

So, if I'm reading this correctly, if the apps will be found violating the policy they will be pulled/rejected? This statement seems pretty clear to me, to be honest.


That's what they claim, but apps have already been found to be lying on their "privacy nutrition labels" with seemingly zero consequences.


Who here bets Apple would ever pull WeChat?


They probably won’t pull WeChat but they could easily refuse updates while disabling tracking points. After all, there is no reason an app can’t be told an iPhone booted up on January 1st, 1970.


Apple has said it won't hold updates for policy reasons anymore. They seem to have forgotten they promised that, but they did.


This is, in my opinion, a symptom that cannot realistically be treated. The "disease" that needs treatment is the power big markets have in the world, and that is nothing new. The only new thing here is that the US (state and businesses) isn't setting the agenda 100% of the time anymore. While this system of tracking is clearly A Bad Thing, I'm not so sure the dilution of US control here is bad. So far I'm actually convinced this is an extremely good thing for the world on average. Maybe not for America, but its time as top dog alone at the peak of power is almost at an end anyway.

In other words, this is just a new player that wants what Apple (and others) have granted the US tech industry for decades. If Apple wants to shut the door now, they will likely be shut out of China, which I fully understand from their perspective. What is needed is a way to secure users from tracking at a much lower level, and I doubt Apple will ever trade money/income for user privacy. If they did, I would buy my first iPhone.


> I doubt Apple will ever trade money/income for user privacy.

They do this continuously.

> If they did I would buy my first iPhone.

That, I doubt.


>They do this continuously.

They do not. The privacy work Apple does is for PR and to shut out competitors. Benefit for users is only a side effect.

>That, I doubt.

Understandably. It will likely never happen that Apple truly does privacy.


Shouldn't it be: Et Tu, Procter et Gamble?


Well, the site wrote it with an ampersand (&), which is originally a creative combination of "e" and "t". My nitpick is: shouldn't it be "Et vos, Procter & Gamble?"


The ampersand symbol is a ligature.

https://en.wikipedia.org/wiki/Orthographic_ligature


> The ampersand symbol is a ligature.

Nice one. Never knew this either. Small clarification/update.

The ampersand symbol is in the process of becoming something other than a ligature, specifically:

> The ampersand comes in many different forms. Because of its ubiquity, it is generally no longer considered a ligature, but a logogram.


How did I never learn this?


You never learned anything, until you did; like this =D


No way! Who refers to a company as a plural?

Unless the Romans did it, but I was under the impression that "organizations" like the Senate (senatus) were singular.


> No way! Who refers to a company as a plural?

British English tends to refer to organizations as plurals instead of singular.


Perhaps Et vous (pluralised you)?


It's Latin, not French. "Et tu" could be from either language, but the origin of the saying is what Julius Caesar said to Brutus while being fatally stabbed.


The words individually could indeed be French but we'd use the oblique case here, "et toi (aussi), Procter & Gamble?".

Also, I'm not sure what the usage would be in classical Latin, but using the singular to talk about a company in French seems perfectly appropriate. IIRC it's the Brits who like to use the plural for companies and institutions.


There's also Tu quoque meaning the same thing in Latin.


Did Caesar actually say that, or was it just a line written into Shakespeare's play?

I've never been clear on that.


Some ancient historians quoted him as saying something similar in Greek, though who knows if it actually happened — ancient historians were often “creative”. I think the source of the Latin phrase is Shakespeare though.


It’s Latin, rather than French.


"vous" is the pluralized you in French, not in Latin.


That’s French, not Latin.


It's an allusion to Shakespeare's "Julius Caesar".


Possibly if you were writing in Latin. I think you would also have to Latinize the names. But et tu is a foreign phrase that has been adopted into English from Shakespeare, so you can use it in an English sentence without translating the rest of the sentence.


If they had used the ampersand it would have been perfect!


Submitter here: I'm pretty sure that I've just copied the story title from the webpage (with the ampersand) and pasted it into the HN box, so I assume that HN software then replaced the ampersand by "and".


It may show up eventually here: https://hackernewstitles.netlify.app/


Edge cases are hard — both you and HN are forgiven :)


Procter and Gamble have been on my "don't buy from these bastards" list ever since they offered a refund of about $100 on the purchase of a shaver, and then reneged on the promise.

If you fuck up your customers, you don't deserve to have them.


The reality is that Apple has little ability to block apps that are tracking people by indirect data such as “device boottime.” Sure maybe in that specific case they can block that piece of information but there are likely at least 100 other seemingly innocuous pieces of information available to apps that can be used the same way.
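As a concrete illustration of how innocuous such a leak can look, here's a small Python sketch (desktop Python, not iOS — purely illustrative) showing that boot time can be reconstructed from nothing but the wall clock and a monotonic clock, with no privileged API at all:

```python
import time

def approximate_boot_time() -> int:
    """Derive a boot timestamp (epoch seconds) as wall clock minus uptime.
    On platforms where the monotonic clock starts at boot, this is the
    real boot time; either way it is a stable per-boot constant."""
    return round(time.time() - time.monotonic())

# Two independent callers (think: two unrelated apps) recover the same
# value, so it behaves like a shared device-wide identifier that only
# changes on reboot.
first = approximate_boot_time()
time.sleep(0.2)
second = approximate_boot_time()
print(first, second)
```

Nothing about either clock looks like a tracking API, which is exactly why review-time detection of this kind of leak is so hard.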

The idea of a “trustless” App Store is nice and roughly works in practice but is ultimately a fiction. Using a malicious app will always expose you to danger. New vulnerabilities are announced regularly. The situation is worse on the web: WebKit vulns are released nearly monthly and there is no app review there. Browsing the web with JavaScript enabled is a security nightmare. For example see the recent security notes for iOS 14.4.2 [1], every release you’ll likely see a fix for a new vulnerability.

I’m hesitant to say there will ever be a scalable and permanent solution to this problem. The best advice is simple: do not run apps you do not trust, do not visit websites you do not trust.

[1] https://support.apple.com/en-us/HT212256


Apps are heavily sandboxed. iOS controls all local information they can access, so Apple is in position to fight this.

It doesn't need to be perfect. Only hard enough that apps have to resort to more desperate techniques, so that they're detectable and Apple can ban them for trying.

Overall I don't like Apple's sandbox and total control over iOS, but it exists, and in this case at least they can use their power for something good.


The point of the article is exactly that it isn’t 100% perfect and this slight imperfection has allowed companies like P&G to take advantage of that for large-scale privacy violation. From the article:

   The whack-a-mole aspect of Apple’s new privacy rules is that while Apple can restrict access to the API that provides access to the IDFA identifier, clever developers can find (perhaps infinite) other ways to combine things they do have access to into a unique, or even just “close enough to unique to be useful for tracking”, identifier. IP addresses, to name just one example, are a big factor that Apple can’t block would-be-trackers from using.


This is why tracking is a problem that can only be solved at the legal level. Ultimately there's no reliable way for a machine to detect tracking given that any information collected for tracking can also be used for legitimate purposes.


Just like App Store review, it can only be mitigated at the legal level. Companies will skirt the law if there is little risk of them being caught and, like you said, there will never be a robust way of automatically detecting violations.


All apps rely on information provided to them by the OS. The OS can simply introduce (pseudo)random variability in that data, especially for apps that collect a lot of it so the probability for aggregating and correlating is a lot higher.

The more access your app wants, the more randomness should be introduced.
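A minimal Python sketch of that idea (a hypothetical OS policy, not anything iOS actually does): bin the reading, then add per-app deterministic jitter, so readings stay consistent within one app but don't line up across apps:

```python
import hashlib
import random

def blurred_value(true_value: float, app_id: str, bin_width: float) -> float:
    """Bin a sensor reading, then add per-app deterministic jitter
    (hypothetical OS policy sketch, not a real iOS mechanism)."""
    binned = round(true_value / bin_width) * bin_width
    # Seed the jitter from the app's identity so the same app always
    # sees the same value, while two different apps see different ones.
    seed = int(hashlib.sha256(app_id.encode()).hexdigest(), 16)
    jitter = random.Random(seed).uniform(-bin_width / 2, bin_width / 2)
    return binned + jitter

# The same true reading (73.4%) looks different to different apps,
# so the values can't be joined into a cross-app fingerprint.
print(blurred_value(73.4, "com.example.appA", 5.0))
print(blurred_value(73.4, "com.example.appB", 5.0))
```

Each app still gets a usable approximate value, but correlating readings across apps no longer works, which is the whole point of the randomization.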


What about the IP address used to communicate with the app’s server? Can that be obscured? What about the screen resolution? What about time zone? Battery level?


Nobody’s blocking IP addresses but apparently these are not enough or these companies wouldn’t be clutching their other tracking pearls.

IP addresses and battery levels are dynamic and screen resolution and time zones are shared by millions.

The real fingerprinting is in things like sensor calibration and these can be stopped by binning or randomizing values.

Apple has announced they will be blocking these prints one by one and they are.

Developers wouldn’t be screaming if they weren’t in trouble.


The point of the article is that there are enough data that are low information in isolation but when combined together can effectively identify ad targets. This is already being done, it isn’t a theory. That’s the point of the article:

    clever developers can find (perhaps infinite) other ways to combine things they do have access to into a unique, or even just “close enough to unique to be useful for tracking”, identifier


Hence why I said that the more data an app is trying to get access to, the more random variability should be introduced. Some apps might legitimately need access to some data points with "minimal" random noise. But if they start asking for enough to uniquely identify a user then the OS should just "blur" all the data. Battery level can randomly vary on any app that can't justify knowing the exact value. Imagine the app having a "budget" for getting exact data: it can spend it on 2 precise data points or 10 very blurry ones.

Unless Apple decides to implement some form of VPN to obscure the IP there's not much to do about that, the phone can hide it only from the app, not the server. On a mobile network this seems less of a concern, and even in the home it wouldn't allow unique identification. So assuming the IP was the only thing that leaks and every other data point is "poisoned" by the OS, I'm sure it would make those companies tracking you deeply unhappy and unsatisfied.

Maybe some loopholes can't be closed, others may provide precise data but it's shared with millions of users making it as generic as it gets. Just raising the bar for successfully tracking the user to a very hard to reach level, and lowering the overall accuracy to the point it's no longer a practical concern could be good enough for all intents and purposes.
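The "budget" idea above could be sketched like this in Python. The per-field entropy costs are invented numbers and nothing like this exists in iOS; the point is only that the OS could scale down precision so the total identifying information an app can observe stays under a cap:

```python
# Hypothetical per-field entropy costs (bits) at full precision.
# These numbers are made up for illustration.
FIELD_BITS = {"battery": 6.6, "timezone": 5.0, "model": 4.5,
              "language": 5.5, "boot_time": 20.0}

def degrade_to_budget(requested: list[str], budget_bits: float = 10.0) -> dict[str, float]:
    """Scale down each requested field's precision so the combined
    entropy the app can observe stays under a fixed budget (a sketch
    of the idea, not a real API)."""
    total = sum(FIELD_BITS[f] for f in requested)
    scale = min(1.0, budget_bits / total)
    return {f: FIELD_BITS[f] * scale for f in requested}

# A couple of fields fit the budget at full precision; asking for
# everything gets every field uniformly blurred.
print(degrade_to_budget(["model", "timezone"]))
print(degrade_to_budget(list(FIELD_BITS)))
```

An app that asks for two data points gets them nearly exact; an app that asks for everything gets values too coarse to fingerprint with, which matches the "2 precise data points or 10 very blurry ones" trade-off described above.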


Nobody’s disputing that. But if Apple wasn’t anywhere near blocking these clever tricks, they wouldn’t be screaming bloody murder.

Again, Apple knows these fingerprinting parts exist, has acknowledged they exist, has stated they are going to block them and has blocked them. And no, they aren’t done, they might never be done. But also they have stated no intention of stopping.


They cannot block IP address, screen res, battery level, time zone, just to name a few.


Apple can bin battery level so it doesn’t contribute to fingerprinting, IP addresses are dynamic and shared and screen resolution and time zone are shared by millions of others.

Why would tracking companies be grasping at straws like boot-up time if these other obvious things are so effective and impossible to block?


Those are just a few examples; the surface area is obviously massive (100s if not 1000s). We both know that. The reality is that iOS was not designed with this attack vector in mind. Here is yet another one I just randomly thought up, "install date of app": https://stackoverflow.com/questions/39255403/get-date-of-whe... Now imagine what a dedicated developer could design.


Sorry, it’s not difficult or forbidden to remember user identity in your own app. It’s about sharing that identity between apps.

And by the way, there is absolutely nothing stopping Apple from reporting your app was installed on January 1st, 1970.


> And by the way, there is absolutely nothing stopping Apple from reporting your app was installed on January 1st, 1970.

I doubt Apple would do that but I think you are very much missing the point. These one-off patches are not effective at scale, there are literally 100s if not 1000s of information leaks of this type that indeed are observable across multiple apps. App install date was just one that I pulled from thin air that required no thought. A dedicated engineer could easily solve this problem.


> What about the IP address used to communicate with the app’s server?

Yes. I've been wanting Apple to launch a VPN service for a long time.

A commercial VPN service can aid privacy by aggregating thousands of users behind a single IP address. The problem is that you have to trust the VPN provider. Commercial VPN providers are inherently shady -- after all, their entire business model is aiding and abetting copyright infringement or perhaps even worse activities. (No logging, wink wink!) I see no reason to trust them.

By contrast, Apple has a valuable brand (i.e., a reputation) and has made privacy a core part of their sales pitch. Unlike the inherently fly-by-night commercial VPN industry, Apple would have billions of reasons not to betray their customers by selling VPN usage data.

I think there's a market for an existing, known, reputable business to come in and offer a VPN service that explicitly does keep traffic flow logs for a short time, in the same way that ISPs retain dynamic IP assignment logs for a short time. By retaining logs, you avoid the shady elements that are otherwise attracted to no-logging VPN services.

If not Apple, I think one of the few remaining reputable independent ISPs would be a great fit. Sonic.net comes to mind. Sonic makes clear, explicit claims (https://www.sonic.com/privacy-policy) that they do not sell usage data, but they do retain IP assignment logs for up to 14 days and will provide that data to law enforcement with an appropriate court order. That's exactly what I want: my adversary is P&G and their ilk, not law enforcement with a court order.


Screen resolution? You can predict that perfectly based on the model number, and many models have the same screen resolution.

https://developer.apple.com/library/archive/documentation/De...


Why does an app need to know the device's IP address? Why can't the OS just provide a method to communicate with the network, handling all of the network communication itself, with the app providing only the message and recipient?


Because you can hit a server which can see your IP.


What does a server seeing the address have to do with the app seeing the address directly from the OS? The app knows the address it needs to talk to, so it preps the message and then calls an OS provided method to make that call. The OS method then returns the response. The app never needs to know the IP address of the device on which it is running.


It does not, but it will try to grab it anyways for the purposes of tracking. Given that this is trivial to get there is no reason to hide it from apps.



