It's Firefox, and in the bookmarks bar, you have GrayKey login, GrayKey/GrayShift Support, and a certain page from Apple support explaining how to enter recovery mode...
To me, that suggests GrayShift is using some sort of Recovery Mode exploit, similar to how checkm8 is used through checkra1n on the A10 and below, and the cops have the page bookmarked so they remember how to enter it.
For me, that means my eyes are on whether an iPhone 12, iPhone 12 Pro, or (new) iPhone SE 2nd Gen gets hacked. Right now the highest confirmed GrayKey hack is an iPhone 11, which wouldn't have the change. We'll see whether Apple got them this time.
"Devices first released in Fall 2020 or later are equipped with a 2nd-generation Secure Storage Component. The 2nd-generation Secure Storage Component adds counter lockboxes. Each counter lockbox stores a 128-bit salt, a 128-bit passcode verifier, an 8-bit counter, and an 8-bit maximum attempt value. Access to the counter lockboxes is through an encrypted and authenticated protocol.
Counter lockboxes hold the entropy needed to unlock passcode-protected user data. To access the user data, the paired Secure Enclave must derive the correct passcode entropy value from the userʼs passcode and the Secure Enclaveʼs UID. The user’s passcode can’t be learned using unlock attempts sent from a source other than the paired Secure Enclave. If the passcode attempt limit is exceeded (for example, 10 attempts on iPhone), the passcode-protected data is erased completely by the Secure Storage Component."
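The quoted design can be sketched in a few lines. This is a toy model of the described data structure, not Apple's implementation; the class name, field names, and the choice of KDF are my own assumptions:

```python
import hashlib
import hmac
import os

class CounterLockbox:
    """Toy model of the 2nd-generation Secure Storage Component's
    counter lockbox, as described in Apple's documentation: a 128-bit
    salt, a 128-bit passcode verifier, an attempt counter, and a
    maximum attempt value. Purely illustrative."""

    def __init__(self, passcode: bytes, uid: bytes, max_attempts: int = 10):
        self.salt = os.urandom(16)        # 128-bit salt
        self.counter = 0                  # attempt counter
        self.max_attempts = max_attempts  # maximum attempt value
        # Entropy derived from the user's passcode and the Secure
        # Enclave's UID; the real derivation function is not public.
        self.verifier = self._derive(passcode, uid)
        self.wiped = False

    def _derive(self, passcode: bytes, uid: bytes) -> bytes:
        # Stand-in KDF (assumption, not Apple's actual construction).
        return hashlib.pbkdf2_hmac("sha256", passcode + uid,
                                   self.salt, 10_000, dklen=16)

    def try_unlock(self, passcode: bytes, uid: bytes) -> bool:
        if self.wiped:
            return False
        if hmac.compare_digest(self._derive(passcode, uid), self.verifier):
            self.counter = 0
            return True
        self.counter += 1
        if self.counter >= self.max_attempts:
            # "the passcode-protected data is erased completely"
            self.verifier = b"\x00" * 16
            self.wiped = True
        return False
```

The key property the description implies is that the counter lives inside the lockbox itself, so guesses from anywhere but the paired Secure Enclave can't even be attempted, and exceeding the limit destroys the entropy rather than merely refusing further tries.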
(I now really pay attention when typing my passcode, which doesn't happen often because of Touch ID)
Worked on the passwordless 2FA part of it
When you design protocols like this, the big question is whether there is bias in the salt and verifier keys installed in the Secure Enclave at manufacture. As described, it's a good design in principle, assuming there is sufficient entropy in those 128-bit values. We tend to assume the field size is what provides the security, but really it's having that many sufficiently random bytes from the field that provides the assurance. If you've seen how crypto gets backdoored in practice, key length doesn't matter if the RNG is hobbled in a deniable way.
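To put rough numbers on that point: the guess rate and entropy figures below are illustrative assumptions, not measurements, but they show how a deniably hobbled RNG collapses a nominally 128-bit field:

```python
def brute_force_years(entropy_bits: float, guesses_per_second: float) -> float:
    """Expected years to search half of a keyspace with the given
    *actual* entropy, regardless of the nominal field size."""
    keyspace = 2.0 ** entropy_bits
    seconds = (keyspace / 2) / guesses_per_second
    return seconds / (365.25 * 24 * 3600)

# A genuinely random 128-bit salt is untouchable at any plausible rate...
full = brute_force_years(128, 1e12)
# ...but if the RNG only ever seeds ~40 bits of real entropy, the same
# "128-bit" value falls in under a second at the same guess rate.
hobbled = brute_force_years(40, 1e12)
```

Every bit of entropy lost halves the attacker's work, which is why an RNG backdoor can be both devastating and invisible from the outside: the outputs still look like 128-bit values.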
Personally, and armchairing it, if I were solving this problem I would spend more effort brute-forcing collisions in the sensor's image sampling of human fingerprints than bothering with crypto and side channels. But in terms of the real threat model, it makes more sense to approach it from the perspective of what the whole product truly is and what it provides.
From a product perspective, designing iPhone security is such an interesting problem because the whole brand image is predicated on being the best available, so you can't do an "oops, sorry" like an inferior product. You can only do what you know you can do better than anyone else. The real threat model is the brand damage from a company like Cellebrite or NSO getting caught or leaked about.
Oddly, the north star for a product like that is probably the U.S. Fourth and Fifth Amendments: from a security-feature perspective, you can always evaluate whether a design is aligned with them as the necessary and sufficient criteria.
It is; it zeroizes the KEK. Nothing is slowly wiped from end to end.
An interesting thought: consider what a valuable target Apple's RNG is, the design, the code, and the devices themselves. Reduce its entropy in a way that only you know about, and you have an enormous advantage.
> the whole brand image is predicated on being the best available, so you can't do an "oops, sorry" like an inferior product
It's also possibly an effective way to manage engineers: Set a strong, public standard, and they know there is no avoiding it.
Yikes. This is something that probably would trip up a lot of people. After arresting you and taking your phone, they say you can make a phone call and pass your phone back to you. You unlock and it’s game over.
This is likely because all of the exploits that jailbreaks or malware use are flaws in already-running processes. When the phone is rebooted, the hardware root of trust verifies that everything that loads is signed by Apple, clearing out the virus/jailbreak unless it is manually invoked (or plugged into the GrayKey) again to re-exploit those processes.
Of course, in the unlikely event that you happened to have an iPhone without the new Secure Storage Component mentioned in this thread, got arrested, and the cops were trying to "Hide UI" you, it would be really awkward for them if you rebooted your phone first, ruining the trap. "Wait, no, you're supposed to enter your PIN right now!"
How do you trust that the phone was actually rebooted? I.e., the UI could fake a reboot screen.
If the device has left your possession, don't accept it back or interact with it. Get a new device, and restore it from backup.
Also, if you fill out the emergency contacts in your phone, you can call those contacts without unlocking your phone.
If it doesn't unlock, it's probably your phone.
That might be an option. If it's not, I'd guess it's much easier to fake a reboot on iOS, since the boot process is (visually) known; on Android it could be anything, because it's manufacturer-customisable.
EDIT: Fixed some grammar.
Oh wait, we can't do that anymore.
This is a joke. As a rule, law enforcement does not respect Fourth Amendment rights and sees the Constitution only as red tape to be ignored or circumvented whenever convenient.
So. Do you have a source for that?
Or I'd check out Justice Roberts's thoughts on Fourth Amendment violations with regard to violent police actions.
Apparently getting shot can be considered a seizure; plenty of people are shot every day in the midst of unwarranted and illegal police actions, by police officers who often fail to convince the courts that their actions could ever be considered reasonable.
I'm not going to bother linking data about "Police Brutality United States 2020" or "Police Misconduct United States 2020"; the datasets are numerous, so feel free to Google those phrases. I feel as if, were I to link one, you'd just judge whichever I chose.
For every law enforcement officer you may anecdotally know who does things "The Right Way", there is an untold number of LEOs who fail to share the same altruistic and noble priorities that your friends might.
Because there isn't really a penalty, I'd assume some officers might not mind peeking into your device just to have a look.
This crap happens all the time.
> As part of a feature called HideUI, GrayKey also allows agencies to install the agent which surreptitiously records the user's passcode if authorities hand their phone back to them, NBC News reported.
The system should be clever enough to let some innocuous information slip in from the other systems. For example, say a border guard asks me for a password and I give the one that boots the system I'd use when travelling, with nothing on it but a few family photos and cat videos. If he checked the phone calls, he would immediately spot that I hadn't made a call (on that system) in a month or more, which would immediately raise alarms. So there should be a way to tell the other systems to slip in some information (phone calls to relatives or the restaurant, chats with a girlfriend, etc.) just so one could pretend the system was used daily.
That’s how I’d do it anyways.
Maybe I can set this up with automation and geofencing.
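A minimal sketch of the seeding idea. Everything here is hypothetical: it only generates plausible-looking call-log entries, and does not attempt to write into any real OS call-history store:

```python
import random
from datetime import datetime, timedelta

def seed_decoy_call_log(contacts, days=30, seed=None):
    """Generate a plausible-looking recent call log for a decoy
    profile, so a freshly booted travel system doesn't look
    conspicuously unused. Purely illustrative."""
    rng = random.Random(seed)
    now = datetime.now()
    log = []
    for day in range(days):
        for _ in range(rng.randint(0, 3)):  # 0-3 calls per day
            when = now - timedelta(days=day,
                                   hours=rng.randint(8, 21),
                                   minutes=rng.randint(0, 59))
            log.append({
                "contact": rng.choice(contacts),
                "timestamp": when.isoformat(timespec="minutes"),
                "duration_s": rng.randint(30, 900),  # 30s to 15min
                "direction": rng.choice(["incoming", "outgoing"]),
            })
    # Most recent calls first, like a real call-history view.
    return sorted(log, key=lambda c: c["timestamp"], reverse=True)
```

The hard part, of course, is not generating the entries but convincingly injecting them into the decoy system's actual databases, which is exactly the plumbing the comment above is asking for.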
I always assumed a border agent in the US would simply insist you turn it on for them, and you have no choice.
I also presume as a foreigner none of these protections would apply anyway?
As a foreigner, you still have rights; it is still US soil. You haven't done anything illegal by asking to enter with your stuff, but you'll probably be refused entry and maybe have your phone seized anyway, whereas a US citizen would be allowed in, though they might still be without their phone.
I think you're in the clear long-term if you ask to "withdraw your entry request into the USA". At least, that's the advice I've heard for when they start asking about any drug use history, if you have any, so that you're not lying but also don't get yourself banned from entering later.
In some places it's a gray area: customs is allowed to search anything, but that's it. You're not compelled to open or do anything for them.
Yes, they could deny you entry if you’re a visitor, or they could seize it from you and give it back to you in pieces in 6 months, but if your encryption is good enough…
Also reduced (but not eliminated) risk of surreptitious malware installation.
How can an application be installed if the phone is locked?
Has anyone rewired their lightning connector so it only works with their own cables and everything else fails?
Based on the teardown photos, it doesn't look like there's much space to work with here, so getting any sort of modification in there would be non-trivial.
Security isn't about being 100%; it's about making yourself not worthwhile as a target, or delaying things until long after you're dead (which is where Apple's password-cracking prevention is obviously lacking).
With enough epoxy when you reconstruct, resoldering isn’t so straightforward.
Maybe a brownout-supervisory SOT23 (tiny) that resets the phone the second a wrong cable is plugged in.
There might even be one on there to begin with to prevent excessive lithium battery discharge.
At my shop we didn't actually have a ton of success with GrayKey, because most of the devices were BFU. Once the agent is loaded, you can plug the phone into a regular charger and let it brute-force itself into the next millennium. But after it tries the first few hundred passcodes, the rate drops significantly.
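For a sense of scale, here's a rough model of that rate drop. The rates and the cutoff point are illustrative guesses, not GrayKey's actual throughput:

```python
def exhaust_time_days(space, fast_attempts=300,
                      fast_secs_per_try=1.0, slow_secs_per_try=600.0):
    """Worst-case days to try every passcode in `space` if the first
    `fast_attempts` guesses run quickly and everything afterwards is
    throttled. All rates are assumptions for illustration only."""
    fast = min(space, fast_attempts) * fast_secs_per_try
    slow = max(0, space - fast_attempts) * slow_secs_per_try
    return (fast + slow) / 86_400  # seconds per day

four_digit = exhaust_time_days(10_000)     # 4-digit PIN space
six_digit = exhaust_time_days(1_000_000)   # 6-digit PIN space
```

Even with made-up numbers, the shape of the result matches the comment: once the throttling kicks in, a four-digit space takes months and a six-digit space takes years, hence "brute force itself into the next millennium".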
I suspect most of these reports come from either bugs in the software (some quick Googling suggests this has been the case), or perhaps someone (heck, even a savvy child) trying to use some sort of brute-force exploit to unlock the phone.
I have a password of the maximum allowed length, so it's not trivial to unlock when she does that.
Most people don't use "decent passwords" on their phones, because they have to enter them multiple times a day.
Phones contain so much private information that it's incredible that they are not given special treatment in law.
Data on a phone never killed anyone.
Cops are conducting warrantless searches of people's phones with impunity and it has become normalized. The potential for misuse or damage from leaks of that data is huge.
And even worse, no political party is even talking about changing that anachronistic, disgusting status quo. It's just more and more draconian laws to hack and spy on innocent citizens "for our safety", while mass spying that is actually illegal goes unpunished.
Even worse, the whistleblowers who brought these illegalities to light are being disappeared in various ways.
I don't know what we can do about it but it is just so disappointing and sad. Where are the good guys to protect us?