This is also interesting, and something VICE missed: look at the first image, which was from the FOIA request!
It's Firefox, and in the bookmarks bar you have the GrayKey login, GrayKey/GrayShift Support, and a certain page from Apple support explaining how to enter recovery mode...
https://support.apple.com/en-us/HT201263
To me, that suggests GrayShift is using some sort of Recovery Mode exploit, similar to how checkm8 works through checkra1n on the A11 and earlier, and the cops have the page bookmarked so they remember how to enter that mode.
Of notable curiosity, and not mentioned by VICE: Apple actually modified its A12, A13, and A14 (I believe) CPUs while they were still in production to embed a new second-generation Secure Storage Component sometime around Fall 2020. We don't know the full extent of the changes, but it sounds like the 2nd-generation component comes with much stronger hardware-based rate limiting on PIN code guessing through a "counter lockbox" system.
For me, that means I have my eyes on whether an iPhone 12, iPhone 12 Pro, or (new) iPhone SE 2nd Gen gets hacked. Right now the highest confirmed GrayKey hack is an iPhone 11, which wouldn't have the change. We'll see whether Apple got them this time.
Quote from Apple Platform Security > Hardware Security > Secure Enclave:
"Devices first released in Fall 2020 or later are equipped with a 2nd-generation Secure Storage Component. The 2nd-generation Secure Storage Component adds counter lockboxes. Each counter lockbox stores a 128-bit salt, a 128-bit passcode verifier, an 8-bit counter, and an 8-bit maximum attempt value. Access to the counter lockboxes is through an encrypted and authenticated protocol.
Counter lockboxes hold the entropy needed to unlock passcode-protected user data. To access the user data, the paired Secure Enclave must derive the correct passcode entropy value from the userʼs passcode and the Secure Enclaveʼs UID. The user’s passcode can’t be learned using unlock attempts sent from a source other than the paired Secure Enclave. If the passcode attempt limit is exceeded (for example, 10 attempts on iPhone), the passcode-protected data is erased completely by the Secure Storage Component."
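A toy model of how such a counter lockbox could behave, going only by the four fields named in the quote (the KDF choice, iteration count, counting order, and erase behavior here are my assumptions for illustration, not Apple's implementation):

    import hashlib
    import hmac
    import secrets

    class CounterLockbox:
        # Field sizes follow Apple's description; everything else is guesswork.
        def __init__(self, passcode_entropy: bytes, max_attempts: int = 10):
            self.salt = secrets.token_bytes(16)      # 128-bit salt
            self.counter = 0                         # 8-bit counter
            self.max_attempts = max_attempts         # 8-bit maximum attempt value
            self._entropy = passcode_entropy         # secret that unlocks user data
            # 128-bit passcode verifier derived from entropy + salt
            self._verifier = hashlib.pbkdf2_hmac(
                "sha256", passcode_entropy, self.salt, 100_000)[:16]

        def attempt(self, candidate: bytes) -> bytes | None:
            if self._entropy is None:
                raise RuntimeError("lockbox erased")
            self.counter += 1                        # count before verifying
            probe = hashlib.pbkdf2_hmac(
                "sha256", candidate, self.salt, 100_000)[:16]
            if hmac.compare_digest(probe, self._verifier):
                self.counter = 0
                return self._entropy                 # release the unlock entropy
            if self.counter >= self.max_attempts:
                self._entropy = None                 # 10 misses: data unrecoverable
            return None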
I once had auto-reset after 10 attempts enabled on my phone, and then somehow, while it was in my gym bag, enough buttons got pressed accidentally that it reset itself. Better to set that number to 1000 instead of 10.
How's that even possible? AFAIK after the first few attempts there's an exponentially increasing lockout time between passcode attempts, so the random button pressings would have to persist for a long time for it to reach 10 attempts.
Nowadays the time between guesses increases exponentially. So by the time you are close to a reset, you need to wait a few hours or days before the next and final guess.
I set mine to four attempts, figuring that if I screw my passcode up four times, it takes me like an hour to restore the device state from backup/MDM, but if someone thought they could guess my passcode, they'd lose that much quicker. Everything I can't "live without" that I use my phone for day to day is in GSuite/Spotify/iCloud/1Password anyway.
(I now really pay attention when typing my passcode, which doesn't happen often because of Touch ID)
Those, too, can be backed up, though for a while, at least on Android, Google Authenticator could not be (unless rooted, and even then, a nontrivial undertaking). This, AFAIK, has changed.
Surprised this wasn't implemented as whole filesystem encryption and then just overwriting the secret key in the SE instead of dealing with the time/battery cost of an incomplete wiping of "passcode-protected data."
When you design protocols like this, the big question is whether there is a bias in the salt and verifier keys installed in the SE at the time of manufacture. As described, it's a good design in principle, assuming there is sufficient entropy in those 128-bit values. We tend to assume the field size is what provides the security, but really it's having that many sufficiently random bytes drawn from the field that provides the assurance. If you have seen how crypto gets backdoored in practice, key length doesn't matter if the RNG is hobbled in a deniable way.
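To make that concrete (a contrived illustration, not a claim about any real device): if a "128-bit" salt is secretly derived from a 32-bit internal seed, it still looks perfectly random from the outside, but whoever knows the trick only has to search 2^32 seeds instead of 2^128 salts.

    import hashlib

    def hobbled_salt(seed32: int) -> bytes:
        # Passes statistical randomness tests, yet carries only 32 bits
        # of entropy -- deniable by construction.
        return hashlib.sha256(seed32.to_bytes(4, "big")).digest()[:16]

    # An outsider sees 16 "random" bytes; an insider can enumerate all
    # 2**32 candidate seeds on a laptop in hours.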
Personally, armchairing it: if I were solving this problem, I would work more on brute-forcing collisions in the sensor/image sampling of human fingerprints than on crypto and side channels. But in terms of the real threat model, it makes more sense to approach it from the perspective of what the whole product truly is and what it provides.
From a product perspective, designing iPhone security is such an interesting problem because the whole brand image is predicated on being the best available, so you can't do an "oops, sorry" like an inferior product. You can only do what you know you can do better than anyone else. The real threat model is the brand damage from a company like Cellebrite or NSO getting caught or leaked about.
Oddly, the north star for a product like that is probably the U.S. 4th and 5th Amendments, as from a security feature perspective, you can always evaluate whether a design is aligned to those as the sufficient and necessary criteria.
>Surprised this wasn't implemented as whole filesystem encryption and then just overwriting the secret key in the SE instead of dealing with the time/battery cost of an incomplete wiping of "passcode-protected data."
It is; it zeroizes the KEK. Nothing is slowly wiped from end to end.
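In other words, a crypto-erase: since everything on flash is encrypted under a key hierarchy, destroying one small key at the top is equivalent to wiping the whole volume. A minimal sketch of the idea (my own illustration using the pyca/cryptography package, not Apple's actual key hierarchy):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    kek = AESGCM.generate_key(bit_length=256)   # key-encryption key held in the SE

    def wrap_file_key(file_key: bytes) -> bytes:
        # Every per-file key is stored on disk only in wrapped (encrypted) form.
        nonce = os.urandom(12)
        return nonce + AESGCM(kek).encrypt(nonce, file_key, None)

    def erase_all_data():
        # One tiny overwrite: with the KEK gone, every wrapped file key
        # (and therefore all file data) is permanently undecryptable.
        global kek
        kek = None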
> key length doesn't matter if the RNG is hobbled in a deniable way
An interesting thought. Consider how valuable a target Apple's RNG is: the design, the code, and the devices themselves. Reduce its entropy in a way that only you know, and you have an enormous advantage.
> the whole brand image is predicated on being the best available, so you can't do an "oops, sorry" like an inferior product
It's also possibly an effective way to manage engineers: Set a strong, public standard, and they know there is no avoiding it.
> GrayKey also allows agencies to install the agent which surreptitiously records the user's passcode if authorities hand their phone back to them
Yikes. This is something that probably would trip up a lot of people. After arresting you and taking your phone, they say you can make a phone call and pass your phone back to you. You unlock and it’s game over.
(Sorry for the offtopicness but could you please email me at hn@ycombinator.com? I want to send you some repost invites and this is the only way to get in touch!)
Good to know, though: if you are ever arrested, reboot your iPhone before entering the PIN. For at least the last 5 years, since (I believe) the A9, no malware or jailbreak that we know of has managed to persist on an iPhone across a reboot.
This is likely because all of the exploits that jailbreaks or malware use are flaws in already-running processes; when the phone is rebooted, the hardware root of trust verifies that everything that loads is signed by Apple, clearing out the virus/jailbreak unless it is manually invoked (or plugged into the GrayKey) again to re-exploit those processes.
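A toy model of why that works, simplified here to a hash chain rooted in ROM (the real chain uses Apple-signed images: Boot ROM verifies iBoot, iBoot verifies the kernel, and so on):

    import hashlib

    def verified_boot(stages, rom_trusted_hash):
        # 'stages' is a list of (image_bytes, hash_of_next_stage) pairs,
        # e.g. iBoot then the kernel. The first expected hash lives in
        # immutable Boot ROM, so malware can't splice itself into the chain.
        expected = rom_trusted_hash
        for image, next_hash in stages:
            if hashlib.sha256(image).hexdigest() != expected:
                raise SystemExit("unverified code: boot halted")
            expected = next_hash
            # execute(image)  # only verified code ever runs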
Of course, in the unlikely event that you happened to have an iPhone without the new Secure Storage Component mentioned in this thread, got arrested, and the cops were trying to HideUI you, it would be really awkward for them if you rebooted your phone first, ruining the trap. "Wait, no, you're supposed to enter your PIN right now!"
There is always that risk. However, if you are this paranoid, you need to remember that you can do a Force Restart. This is handled by hardware, not by software, so if you know what to look for it cannot be faked.
You can't even trust that it's your phone; it wouldn't be that hard in many cases to provide an identical-looking device that just records the entered PIN. It won't hold up to inspection once unlocked, but it doesn't have to; once you've provided the PIN, you've lost.
If the device has left your possession, don't accept it back or interact with it. Get a new device, and restore it from backup.
Also, if you fill out the emergency contacts in your phone, you can call those contacts without unlocking your phone.
There's an easy defense against this: when you're handed the phone, intentionally enter a passcode that isn't yours. If it unlocks, then you'll immediately know it isn't your phone, and won't have given away your passcode.
Do iPhones have a hard reset like Android's, where you hold power for a while, ignoring the on-screen prompts, until it's forcefully reset?
That might be an option. If it's not, I guess it's much easier on iOS to fake a reboot, as the boot process is (visually) known; on Android it could be anything, because it's manufacturer-customisable.
Incorrect. When the person presses and holds the power button, just show a fake "slide to power off" graphic onscreen. When the person does that, show a black screen for 3 seconds, show the Apple logo for 8 seconds, do an animation, and the phone hasn't actually rebooted while the person's been tricked.
They have to load it before first unlock by definition, and at this point I don't believe that it would be possible to establish persistence. Restart your phone when it's handed back to you by people you don't trust.
> The instructions open with asking readers to make sure they do have legal authorization to search the device; this can be in the form of a search warrant.
This is a joke. As a rule, law enforcement does not respect Fourth Amendment rights and sees the Constitution only as red tape to be ignored or circumvented whenever convenient.
No. I personally know law enforcement officers who go above and beyond, often getting warrants they think they don't need, to err on the side of caution.
I don't know what kind of proof or sources would satisfy you. I concur with the earlier poster that this kind of claim should be considered a 'water is wet' statement, but if I thought that police conducted their affairs 'by the book', I'd have some questions and misgivings about the disproportionately large sums of money that come out of city coffers to settle police misconduct cases[0].
Or I'd check out Chief Justice Roberts's thoughts on Fourth Amendment violations with regard to violent police actions[1].
Apparently getting shot can be considered a seizure; plenty of people are shot every day in the midst of unwarranted and illegal police actions, by police officers who often fail to convince the courts that their actions could ever be considered reasonable.
I'm not going to bother linking data about 'Police Brutality United States 2020' or 'Police Misconduct United States 2020'; the datasets are numerous, so feel free to google those phrases. I feel as if I were to link one, you'd just judge whichever I chose.
For every law enforcement officer that does things 'The Right Way', whom you may anecdotally know personally, there is an untold number of LEOs who fail to share the same altruistic and noble priorities that your friends might.
Something I hold as a truism about humans is that most individuals are good and honourable, but collate these good people into a large group and the result is generally terrible and untrustworthy. I don't see any reason why that wouldn't also apply to people in law enforcement.
Meaningless without some point of reference. I'm not arguing that law enforcement is always ethical but the parent comment claimed that law enforcement ignores warrants, "as a rule".
I mean, generally the penalty for violating your privacy is nothing, actually. If they discover evidence by violating your privacy, you may be able to have it excluded from consideration, since it was obtained illegally. But there isn't really a penalty for a police officer reading your diary or whatever, as you are not able to show damages.
Because there isn't really a penalty, I'd assume some officers might not mind peeking into your device just to have a look.
Easiest way to do this is probably what is mentioned at the end of the article:
> As part of a feature called HideUI, GrayKey also allows agencies to install the agent which surreptitiously records the user's passcode if authorities hand their phone back to them, NBC News reported.
A keylogger, it seems. So, if you hand your unlocked phone to authorities, e.g. during a border check, they can install the agent, log your passcode the next time you type it, and you are now screwed.
I think it's time for phone manufacturers, or at least the few unknown ones that care about user privacy, to implement plausible deniability in the form of two or more bootable systems, where one is the main system and the others' boot is triggered by a special password, so that they would not contain certain phone numbers, messages, etc., and anything installed there wouldn't have access to the other partitions/systems.
The system should be clever enough to let some innocuous information slip over from the other systems. For example, say a border guard asks me for a password and I give the one that boots the system I'd use when travelling, with nothing on it but a few family photos and cat videos. If he checked the phone calls, he would immediately spot that I hadn't made a call (with that system) in a month or more, which would immediately raise alarms. So there should be a way to tell the other systems to slip in some information, like phone calls to relatives or the restaurant, chats with a girlfriend, etc., just so one could pretend the system was used daily.
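The key-derivation half of that idea is well understood (VeraCrypt's hidden volumes work this way). A minimal sketch, with all names hypothetical: each password deterministically derives the key for a different system volume, and the ciphertext alone doesn't reveal how many volumes exist.

    import hashlib

    def volume_key(password: str, device_salt: bytes) -> bytes:
        # Same KDF for every volume, so unlock attempts with either
        # password are indistinguishable from the outside.
        return hashlib.pbkdf2_hmac("sha256", password.encode(), device_salt, 600_000)

    # decoy_key = volume_key("travel password", salt)  -> boots the decoy system
    # main_key  = volume_key("real password", salt)    -> boots the main system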
They have probably found an exploit to non-persistently install a 3rd-party keyboard (what a mistake that feature was!) that writes all content to unprotected disk. All apps have this ability (that is, writing to disk that is "always available").
According to the EFF, in most cases you can invoke your 5th Amendment rights to prevent CBP from making you give up a password, particularly if you claim the content of the password (as opposed to the data the password is protecting) is personal and protected under the right to avoid self-incrimination. Of course, CBP has plenty of perfectly legal ways to get you to "voluntarily" comply: from making you miss your connecting flight, to putting you through enhanced security in the future, to denying entry if you're a non-resident.
On iPhone, at least, when you restart the phone it forces you to use the passcode to unlock the first time. I believe in the US passcodes are protected by the 5th Amendment, whereas Face ID or fingerprints are not. So while you can be compelled to unlock your phone with biometrics, you cannot be compelled to give up the passcode. Your mileage may vary in other jurisdictions.
And what is the implication of refusing in each case, passcode vs. biometrics? Is one likely to land you in hotter water than the other, due to the 5th Amendment protection?
I also presume none of these protections would apply to a foreigner anyway?
If I understand it correctly, refusing either one could land you in detention (or jail): law enforcement officers have a pretty broad ability to detain people for lawful investigations (or what they believe are lawful investigations). The difference is that you'd be released after refusing a passcode (though possibly having to appear in front of a judge with a lawyer), whereas the court could hold you in contempt (and jail you) for refusing biometrics.
I think the strategy would be to not use biometrics at all.
As a foreigner, you still have rights; it is still US soil. You haven't done anything illegal by asking to enter with your stuff, but you'll probably be refused entry and maybe have your phone seized anyway, whereas a US citizen would be allowed in, but might still be without their phone.
I think you're in the clear long-term if you ask to "withdraw your entry request into the USA". At least that's the advice I've heard for when they start asking about any drug use history, if you have any, so that you're not lying but also don't get yourself banned from entering later.
In some places it's a gray area: customs is allowed to search anything, but that's it. You're not compelled to open anything or do anything for them.
Yes, they could deny you entry if you’re a visitor, or they could seize it from you and give it back to you in pieces in 6 months, but if your encryption is good enough…
Also reduced (but not eliminated) risk of surreptitious malware installation.
Nothing that we know of, in the last five years since (I believe) the A9, has managed to keep anything non-Apple persistently loaded across a reboot, whether malware or a jailbreak.
My understanding was that this requires some kind of exploit, but this exploit does not actually give you the passcode. I'm guessing from context that it also doesn't allow you to unlock even if the phone is on? That seems somewhat surprising to me.
If you have access to a Mac with Apple Configurator 2, you can disable "Allow device to pair with other computers" during supervision setup (you don't have to enable any other settings in Configurator besides this one). It'll wipe the device (you can restore from backup), but from then on, unless the device is wiped again, it will not connect to any computer except one with the keypair Apple Configurator creates (so just uninstall Configurator and the keypair is deleted).
That's an interesting idea, although I suspect for it to be effective you have to make it self-destruct the data if a wrong cable is inserted. Otherwise they'll plug the cable in, see it's not working, and send it off to a forensics lab to "fix" the phone at which point they'll discover your modification and revert it.
Based on the teardown photos[1][2], it doesn't look like there's much space to work with here, so getting any sort of modification in there would be non-trivial.
Destroying the port would also be a minor roadblock, because they can just solder on a new port (see prior comment about shipping off the phone to a forensics lab to fix it).
Still prevents an evil maid attack malware upload.
Security isn't about being 100%; it's about making your stuff unworthwhile as a target, or delaying things until long after you're dead (which is where Apple's password-cracking prevention obviously falls short).
With enough epoxy when you reconstruct, resoldering isn’t so straightforward.
Yes, but in the context of this thread (Cops Use GrayKey to Brute Force iPhones), the attacker isn't really constrained by time, nor are they opportunistic.
183 days to try 63 million passwords. Roughly 4 per second by my math... that seems pretty low to me. Any decent password would be uncrackable in that case, but of course if people mostly use numerical 4-digit passcodes, that's plenty fast.
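The arithmetic checks out:

    attempts = 63_000_000                 # passwords claimed
    seconds = 183 * 24 * 60 * 60          # 183 days
    print(round(attempts / seconds, 2))   # ~3.98 guesses per second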
I believe the secure enclave, which is responsible for mapping the passcode to an encryption key, has hardware level rate limiting so I suspect GrayKey is limited by that.
That's correct. I haven't used GrayKey for a year or so, so I can't speak to any updates that have happened in that time. This article wasn't very interesting, and all this information can be found by calling GrayShift sales support staff.
At my shop we didn't actually have a ton of success with GrayKey, because most of the devices were BFU. Once the agent is loaded, you can plug the phone into a regular charger and let it brute-force itself into the next millennium. But after it tries the first few hundred passcodes, the rate drops significantly.
Haven't there been stories of parents losing their phones because their kids randomly entering passcodes pushed the exponential timeouts into years-long (and longer) time frames?
The lockout period progresses something like 60 seconds, 5 minutes, 30 minutes, 1 hour, 3 hours, 6 hours, 1 day and so on. This should only be possible if the child had sole possession of the phone for days. Not saying it’s impossible, but this appears to be an extreme edge case.
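Summing that schedule shows why: just reaching the tenth attempt means sitting through roughly 34 hours of forced waiting, assuming the delays listed above (the exact schedule varies by iOS version):

    # Delays after successive failed attempts, in minutes, per the
    # schedule described above (the first few attempts have no delay).
    delays_min = [1, 5, 30, 60, 3 * 60, 6 * 60, 24 * 60]
    print(sum(delays_min) / 60, "hours")   # 34.6 hours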
I suspect most of these reports come from either bugs in the software (some quick Googling suggests this has been the case), or perhaps someone (heck, even a savvy child) trying to use some sort of brute-force exploit to unlock the phone.
I can see it happening. My 1.5 year old daughter routinely locks me out of my phone by touching the in-screen fingerprint reader when she takes my phone from the desk or wherever it's lying around at home.
I have a password of the maximum length allowed, so it's not trivial to unlock the phone when she does that.
Isn't this common? I think Intel TPMs use that as well; they have their own clocks. I wonder if you could tamper with it and inject your own faster clock signal, though.
Everyone's discussing technical measures to lock their phone or prevent these measures from working under particular circumstances, but nobody is commenting on the dystopian nightmare we find ourselves in.
Phones contain so much private information that it's incredible that they are not given special treatment in law.
Data on a phone never killed anyone.
Cops are conducting warrantless searches of people's phones with impunity and it has become normalized. The potential for misuse or damage from leaks of that data is huge.
And even worse, no political party is even talking about changing that anachronistic, disgusting status quo. It's just more and more draconian laws to hack and spy on innocent citizens, "for our safety", and even worse, mass spying that is actually illegal goes unpunished.
Even worse, the whistleblowers who brought these illegalities to light are being disappeared in various ways.
I don't know what we can do about it but it is just so disappointing and sad. Where are the good guys to protect us?
You can encrypt them locally using Finder; then they can also make it onto your Time Machine backup. You just need an offsite copy after that. Though, looking at what iCloud backup includes, it mainly seems to be settings and app data, and app data is often centralised anyway or uses iCloud Drive.
I read that as just being a list of circumstances in which you can still use the attack when you might otherwise think you couldn't. For example, it seems clear that having a broken screen couldn't provide any benefit, yet that's also in that same list.
I remember seeing a similar attack on iPhones before. What I understood was that it cut off power to the device before the TPM could log a passcode attempt. This [1] seems similar to what I remember.
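That class of bug comes from the attempt counter being written after verification, so cutting power in between yields a free guess. The standard mitigation (sketched here as the general technique, not any specific patch; verify() and the nvram dict are hypothetical stand-ins) is to commit the counter before checking:

    STORED_PASSCODE = "123456"   # hypothetical stand-in for the real check

    def verify(guess: str) -> bool:
        return guess == STORED_PASSCODE

    # Vulnerable ordering: power cut after verify() but before the
    # counter write means the failed attempt is never logged.
    def try_vulnerable(nvram: dict, guess: str) -> bool:
        if verify(guess):
            return True                  # unlocked
        nvram["attempts"] += 1           # <- attacker cuts power before this
        return False

    # Safe ordering: pre-charge the counter in non-volatile storage,
    # roll it back only after a confirmed success.
    def try_safe(nvram: dict, guess: str) -> bool:
        nvram["attempts"] += 1           # committed before any verification
        if verify(guess):
            nvram["attempts"] -= 1
            return True
        return False

    # usage: nvram = {"attempts": 0}; try_safe(nvram, "000000")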