Hacker News
Google, Meta 'break' Apple's device fingerprinting rules (theregister.com)
104 points by gulced on May 7, 2024 | 69 comments



Original source, buried within the article: https://www.mysk.blog/2024/05/03/apple-required-reason-api/


Yeah, this is much clearer than The Register's attempt to summarize/explain/sensationalize it.

And the meat seems... pretty bland. They caught a bunch of "plausibly fingerprinty"[1] tokens (which a reasonable interpretation of Apple policy would require to stay on the device) being sent unobfuscated in later requests.

That looks a lot more like a genuine mistake by developers than a real attempt to evade tracking controls to me.

[1] But... not very? Seems like they're mostly concerned with system uptime. And, sure, boot time can be a fingerprint, but by definition an ephemeral one. In combination with other techniques, sure, maybe there's a way to construct a user profile. But alone? Meh. And it seems like that's all the investigators found.


Uptime might be somewhat ephemeral, but it can be a pretty clear signal to indicate that two different user profiles are actually the same person. For example, it would be a plausible explanation for why even when I took several steps to hide my identity when creating a new Meta account (new Safari profile, tracking blockers, VPN, anonymous email account, etc), Facebook immediately suggested all of the contacts from my other account as friends. Or perhaps they picked up on some other signal. My point is, I absolutely don't consent to my devices sending their uptime to third parties without explicitly asking me every time.


Did you use a different device and phone number, friend a mutual friend or yourself, or allow contact access? That hasn't been my experience having two Meta accounts.


Uptime can be pretty long on iOS devices, and the developers have no legitimate use for that information.


The only monotonically increasing clock available on iOS is based on uptime. If you need to measure accurate timings or guarantee that events are in the right order, you have to use it. As Apple points out in their guidelines, there are many legitimate uses for this information.

Developers can add an offset to the values before sending them off the device, and that might be what Apple wants them to do, but these companies have over a decade of pre-existing code that didn't do this. It's not surprising to me that there are things that haven't been updated yet given that Apple only made these requirements public fairly recently.
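For illustration, a minimal Swift sketch of the pattern described above: using the uptime-based monotonic clock for on-device interval timing, then applying a per-launch random offset before any value is sent off the device. The names and the offset scheme here are hypothetical, not anything Apple prescribes or these apps actually do:

    import Foundation

    // Uptime-based monotonic clock: fine for measuring intervals on-device.
    let start = ProcessInfo.processInfo.systemUptime   // seconds since boot
    // ... do some work ...
    let elapsed = ProcessInfo.processInfo.systemUptime - start
    print("elapsed:", elapsed)

    // Hypothetical mitigation: pick one random offset per launch, so deltas
    // and ordering are preserved but the actual boot time never leaves the device.
    let sessionOffset = Double.random(in: 0..<1_000_000)

    func obfuscatedTimestamp() -> TimeInterval {
        ProcessInfo.processInfo.systemUptime + sessionOffset
    }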


Surely the fix here is to lie to apps about uptime, adding the offset at the OS level.


"boot time plus random offset" is going to be just as much of a fingerprint as "boot time" is, though (the offset can't change once picked for a boot because the whole point is to be able to calculate the time between events in the app).

Something that reduces the granularity a lot, like "time since UTC 00:00:00 on the day of the last device boot", might work.
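A rough sketch of that coarser reference point, assuming it were computed from wall-clock time minus uptime (purely illustrative; no such API exists today):

    import Foundation

    // Approximate the boot time from the wall clock minus uptime.
    let now = Date()
    let bootTime = now.addingTimeInterval(-ProcessInfo.processInfo.systemUptime)

    // Truncate to UTC midnight of the boot day and measure from there: still
    // monotonic within a boot, but the reference point only reveals the boot
    // date, not the exact boot instant.
    var utcCalendar = Calendar(identifier: .gregorian)
    utcCalendar.timeZone = TimeZone(identifier: "UTC")!
    let bootDayMidnight = utcCalendar.startOfDay(for: bootTime)
    let coarseUptime = now.timeIntervalSince(bootDayMidnight)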


Make the offset app-specific. Then pretend the phone ran out of memory and ask the app to terminate every now and then, switching to a new random offset for every run.


Adding an offset would still make this "derived data" which is "not allowed" to be sent off device.


It could be useful if you want to prompt a user to reboot their phone because something isn't working and a reboot often helps. But telling them to reboot when they've just rebooted is going to annoy them for no reason.

Things like: the user gave permission for the camera or microphone and is trying to use them, but there's no data coming through. Or really most anything with Bluetooth. Or sometimes their mobile network is being weird.

That probably doesn't justify sending it off the device as a raw value, but Meta really likes driving UI from the server instead of letting the phone be smart. And they have a habit of pushing Apple's limits.

(Disclaimer: I worked for a subsidiary until the end of 2019)


Not sure why you're being downvoted for this: iOS devices aren't servers, apps have very little control over their lifecycle, and the existing time APIs for an app's internal sense of time are plenty. There is no reason at all an app would need to know this random piece of global state. I'm not even sure why Apple offers the real uptime instead of an offset starting from when the app first called the API.

It's the first thing they talk about in the docs: https://developer.apple.com/documentation/foundation/nsproce...


Apps don't need it, and it probably shouldn't be offered. But as things stand, many of the OS's fundamental time APIs return time since boot, and for some reason, most of those aren't even listed as required-reason APIs. The APIs listed are mach_absolute_time() and ProcessInfo.systemUptime, but there is also clock_gettime() with CLOCK_MONOTONIC/CLOCK_MONOTONIC_RAW/CLOCK_UPTIME_RAW, as well as more obscure ones like mach_continuous_time() (a few of these are sketched below). Therefore:

- I find it easy to believe that some of the apps may not realize they're using the restricted APIs in an unapproved way.

- Alternately, some of the apps may not be using restricted APIs at all, and instead getting the same information with non-restricted APIs. Not clear if that's even against the rules. It certainly would be extremely sketchy if done as an intentional workaround, but it would be even easier to do accidentally. If you grepped the codebase for uses of the restricted APIs, you wouldn't necessarily find uses of these other APIs.
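For reference, a few of the calls mentioned above, all of which boil down to time since boot (variable names are just for illustration):

    import Foundation
    import Darwin

    // Listed as "required reason" APIs:
    let uptimeSeconds = ProcessInfo.processInfo.systemUptime     // seconds since boot
    let machTicks = mach_absolute_time()                         // ticks since boot

    // Not on the list, but yield essentially the same quantity:
    let uptimeNanos = clock_gettime_nsec_np(CLOCK_UPTIME_RAW)    // nanoseconds since boot
    let continuousTicks = mach_continuous_time()                 // ticks since boot, incl. sleep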


Cupertino, if you're listening: I'm not comfortable with the offset either; that would also do fine as an identifier.


You expect apps to have no access to a simple monotonic timer?


I expect privacy theater; I err towards having opinions that enforce privacy. I.e., a time that's guaranteed to be x seconds since the app first started after system boot can be used to diminish privacy.


Why would you expect privacy theater, though? Apple genuinely tries to frustrate this kind of snooping from (third-party) apps, so if they choose to obfuscate the timer in the first place, I don't see what the point would be for them to make it security theater. The obvious and trivial implementation for this is adding a random number, and that's what any minimally competent dev will do unless specifically told otherwise. So, why would they be told otherwise?


I expect Apple to figure out how to provide apps with such a timer in a way that also prevents them from having access to the system uptime.


If there's a random offset applied, how would they have access to the actual system uptime?


I don’t know, like I said, I expect Apple to figure that out. Their engineers get paid to figure out such problems, not me!


My original response in this thread was to a comment that said, "I'm not comfortable with the offset either".


The first-time-the-app-called-the-api-since-device-boot is just as unique as device-boot-time.


It isn't though, at least not in the same way. It's not as if apps can't generate unique identifiers like a UUID. What's important for fingerprinting isn't that it's unique but that it's stable across different apps, which this wouldn't be. You could maybe try to eke out a bit of information with "value is greater than", but it would be hard to orchestrate because you can't guarantee that all apps will be first used around the same time.


At Google/Meta's scale, there is / ought to be less room for such "genuine mistakes" as they have the resources and clout to preempt them.


> That looks a lot more like a genuine mistake by developers

Can we please stop handing "assume good faith" to megacorps? That's something you grant to individuals, not to companies whose entire existence revolves around doing nefarious things like this. There are no genuine mistakes; there are exclusively bad-faith actions that should be coupled with company-funded third-party investigation and public disclosure to determine impact, along with varying levels of prosecution wherever possible to act as both punishment and deterrent, even if it turns out to be a genuine mistake.


The software stack on a modern cellphone, or of a browser, is about as likely to stop fingerprinting as a sticky-fingered burglar already in the police database, loose in an oil and sugar factory, is to prevent his identification after touching every knob and lever in a warehouse of machines, just by wiping off the handle on the front door. Rules are important because we cannot prevent the crime from occurring at all. A truly isolated, truly privacy-preserving technology may seem to be on the horizon (whether through statelessness, homomorphic encryption, distributed governance, or some other odd and as-yet unproven technology), and I can only hope we can one day look back on the sorry state of affairs today with nostalgia, as the rash and haphazard, painful afflictions of a passion for the stars and the unlimited potential we set ourselves out for today.


Librefox does a decent job at resisting fingerprinting, at least against the fairly standard service that we used at my previous company.

I'm sure with a little effort it could easily be defeated, but for some purposes it's enough.


Could this be the reason why on iPhones, nearly everything we search for in Google Chrome is advertised to us on Instagram?


More likely just tracking pixels. Unless Google is sharing data with Meta, which seems unlikely; they operate competing ad networks.


Look up "retargeting ads"


Right. This is the most basic level of marketing. Is this not as widely known as I assumed it was?


How does it work exactly? Google sells lists of search terms and associated device ids to Meta?


Google data never leaves their platform; it's too valuable to them. Retargeting works the other way: a site or app attempts to get a cookie or fingerprint and gives it to Google or Meta to target that user with ads.


IIRC, not to Meta themselves but to Meta's customers (I mean the advertisers, who are the ones who pay. You the user aren't Meta's customer; you are their product), but it's been a long time since I heard Criteo's technical pitch, so I don't remember how it works.


Yep, just noticed exactly this.


Why, on any device… or group of devices you own and/or operate.


Happens on Android too. Google and Facebook have a secret agreement to share data.

https://www.wired.com/story/texas-accuses-google-facebook-il...


I would believe this - after all how many billions of dollars did Google pay Apple to be the primary search engine on iOS? This further erodes my confidence in Apple doing the right thing and I think the EU is doing the world a service by setting in motion alternative app stores.


>I would believe this - after all how many billions of dollars did Google pay Apple to be the primary search engine on iOS?

The OP also mentions that Facebook and Spotify are violating the rules, and to my knowledge they're not paying Apple "billions", so this explanation doesn't pass the sniff test. The actual reason is probably far more banal: these apps command a huge user base, which would be very upset if Apple banned them. As a result, Apple is letting such infractions slide because they don't want to upset users.


Could also be that they are afraid of more antitrust attention if they enforce the rules too strictly.

The EU already handed Google the browser market on a silver platter, under the guise of “competition”. How much farther would they go if Apple prevented a perceived competitor from abusing users?


They’ve done this for years before the EU started stepping up.

I think a more likely explanation is they know what would happen to their sales numbers if they kicked off Facebook and Spotify.


> I think the EU is doing the world a service by setting in motion alternative app stores.

Why, so that developers can have absolutely no rules against abusing their users?


So people can actually have some control over the device they OWN.


There’s a second set of rules for mega popular apps. Always has been. FB should have been permanently banned 15 years ago.

If you’re important enough Apple will let you get away with murder compared to normal devs. Users don’t buy phones without FB, Instagram, Spotify, WhatsApp, Google apps, etc.

There is only one time I remember anything happening to a big app. Fortnite probably could have tracked people for years. But then they took away Apple's cut and publicly stuck a thumb in Apple's eye at the same time. So they got booted.


15 years ago Apple was proudly announcing they were integrating Facebook accounts into iPhone OS and making their APIs available to apps, to thumb their nose at Google. (iPhone OS 4, IIRC)

That's part of why Facebook felt rightfully indignant when Apple moved to kneecap attribution and acted like it had never heard of it.


This shouldn’t be a cat and mouse game. Google and Meta should be subject to a class action lawsuit for gross privacy violations and sued into smithereens. All the developers whose livelihood depends on surveillance capitalism can go do something more productive with their lives.


The claims of privacy by Apple are just a facade. I’m done with this walled in ecosystem (jail).

They have good looking products but the company itself is as rotten as their competition.


And you got all of that from an article about Google and Facebook, maybe, trying to get around Apple’s attempts to provide more privacy?


It’s a history of this behavior that has convinced me that Apple’s claims of being a privacy advocate are nothing but lip service.


Or it's more complicated than a simple all-or-nothing rule.


Well I never!

Side note: I wonder how much harm Apple is doing to its brand with its Pious Attitude and Arrogant Behaviour?

Even as a long-time, but relatively small, shareholder, I'm on the fence about Apple right now.

I can't see the iPhone failing to lay golden eggs anytime soon.

But the Apple Vision Pro feels like they took those golden eggs and gold-plated a toilet seat.

Tim Cook runs a tight shop, but he's been riding the wave of iPhone growth and the engineering excellence that gave us Apple Silicon.

The overall Apple brand hasn't been doing so well IMHO, and it's the Brand that saved Apple back in the 90s!

You can't fake passion, and every time we see Tim Cook on stage or presenting, I get the vibe he would rather be somewhere else!


Interesting to see this downvoted. I've been on Apple for over 15 years and feel exactly the same. The ice is getting a lot thinner and I've been looking at options how to move out of their ecosystem.


It's not that Linux on the desktop is closing the gap to Windows and MacOS, but that the latter two are aggressively shitting in the wine and gaslighting long-term users about the incredible improvements delivered every year. See: multi-monitor support, display management in general, window management, the accursed Spaces conceits, relocation of menubar toggles for sound and Bluetooth every OS release, pointless shuffling of the Settings app every release, comical degradation of iTunes over time... the list goes on.

My kids are coming up on Linux, and are never going to have the hallowed halls of 10.6 to look back on fondly.

It's sad, but every year I feel in a new way that we're merely strolling through a once-glorious cathedral built and abandoned by great engineers from a different era.

I felt this way at Google ~5 years ago, and left as soon as I could slip the golden handcuffs.


Yes, I thought so LOL

Lots of comments, but yet the downvotes! :-)

I guess I'm long past being on the internet for upvote points LOL


[flagged]


Devil's advocate: People using Android accept (for whatever reason) a less fluid, subpar UI and UX, while iPhone users are less likely to tolerate it (by nature of not using Android). Thus, using an iPhone could actually suggest a more sophisticated sense of taste i.e. being cooler.


Devil's advocate or just trolling?

I can't bloody stand using Apple's unintuitive garbage. I have great admiration for the hardware, but trying to drive their O/S I've always found to be a horribly frustrating experience.


While their mobile OS doesn't differ much in usability from Android phones, the UX/UI of their desktop operating system can be quite frustrating, confusing, and, especially if you come from a Windows world, outright weird.


I think it's poorly made, inconsistent and unintuitive no matter what you're used to.


Material Design is a better UI and UX experience, and more thought out, than iOS's flat design. iOS is more limiting overall, which might be better for some users by providing fewer options. You can't install a virus, or any apps you want, if Cook doesn't approve it first.

But complaining about design is a lazy complaint from 10 years ago when Samsung was fighting against a clearly better UI to differentiate its offering after Google spent the money to actually design a UI.

Lag is an issue if you are comparing 100 dollar budget phones to the latest $1000+ iPhone. Not if you actually compare it to a comparable model.


Could you give an example for a subpar UX?


Devil’s brother advocate: People prefer using iOS over Android because they cannot handle having complete control over their UI and UX. They prefer having strict rules on what they can and cannot do and what apps they can and cannot use.


It’s a division of markets similar to the auto industry: you see tons of Camrys customized with after market parts, but very rarely a BMW/Lexus similarly tricked out.


Similarly BMW and Mercedes posers are notorious for sticking M and AMG badges on their cars. The badge doesn’t make the car go faster, but the perception that they belong to this elite club is worth it.


Not entirely untrue. The last time I had an Android, I rooted it, replaced the boot animation with a Tardis, installed some anti-virus / anti-theft app that silently forwarded all received sms to my other number (and realised after selling it I hadn’t properly removed it).

By the end of that first weekend I had had to reinstall Android / AOSP a dozen times, only to realise Monday morning I had somehow screwed up the cellular stack so I couldn’t use it.

That’s when I realised I couldn’t trust myself with that kind of power.


Honestly I think the iPhone UI is vastly inferior to Android's. I mean, jesus, they didn't even have folders or widgets until a couple of years ago.


No, it's because SMS is awful. I'm looking forward to see how well Apple implements RCS and how many of these problems go away.


Well, my comment wasn't aimed at the technical details (with which I even agree!), but more at the social background, with people considering other people who "just send green text" as "lesser". And no, it's not just the "young people crowd", but a certain part of society.


One could argue it’s a good thing: these people filter themselves out of our lives. It could almost make me wish for an Android.


> Just look at that ridiculous discussion about green text vs. blue text while messaging. People still use that as a marker for being cool vs. non cool in certain circles.

Only in the US. People are much more susceptible to corporate messaging here.



