Apparently, they can't enable the phone's "iCloud backup" because someone changed the iCloud password. Doesn't Apple have old passwords, i.e., can't they restore the old password from backup? And presuming they can't (why?)... can't they simply modify the server side to skip the password check for a given account, and just accept any password for backing up?
Unless they are trying to pre-empt something else (like the recently-touted shift to "devices even we can't access" from Tim Cook, which may or may not be simple advertising), they just picked the wrong time to stir this particular pot.
Waving around the proverbial pipe wrench for a case this ambiguous is just stupid. They're just as likely to have it taken away from them altogether.
That's being pretty liberal with words. It's more difficult to crack more recent versions, but saying it's impossible is putting too much faith in the creators of the device and too little in determined people who would want in.
That is, if the secrets in question are on that NAND chip.
Might not be bad exposure for a security startup, either.
So the FBI would most likely still require Apple's assistance in this.
The exponential backoff of attempts is not really an issue in that case.
Congressman Issa was previously the CEO of DEI, a car security and audio equipment company. He is possibly one of the most tech-savvy members of U.S. Congress and happens to be one of the wealthiest as well.
The Congressional hearing video footage is here; the suggestion was proposed at 1h23m13s in.
If it is, how do they do that? I can't imagine it's somehow embedded in circuitry (too complicated to mass produce) so it must be on some kind of storage medium, right? What makes that unreadable?
> Every iOS device has a dedicated AES 256 crypto engine built into the DMA path between the flash storage and main system memory, making file encryption highly efficient.
> The device’s unique ID (UID) and a device group ID (GID) are AES 256-bit keys fused (UID) or compiled (GID) into the application processor and Secure Enclave during manufacturing. No software or firmware can read them directly; they can see only the results of encryption or decryption operations performed by dedicated AES engines implemented in silicon using the UID or GID as a key.
> Additionally, the Secure Enclave’s UID and GID can only be used by the AES engine dedicated to the Secure Enclave. The UIDs are unique to each device and are not recorded by Apple or any of its suppliers.
In newer phones it is in the Secure Enclave instead of the CPU (the SE handles all encryption/decryption for the CPU).
Maybe another HW key is required to do so.
Not if there's a write-permit fuse as well =)
If you're not familiar with PBKDF2, it is similar in function to bcrypt or scrypt - it turns a password into a key and is designed to take a long time to prevent brute force attacks. Tying in the UID key prevents the attacker from brute forcing on a faster machine (or machines).
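For a concrete feel, here's a minimal sketch using CommonCrypto's PBKDF2 (not Apple's exact construction, and without the UID entanglement):

    import Foundation
    import CommonCrypto

    // Stretch a passcode into a key with many HMAC-SHA256 rounds so every
    // guess is deliberately slow. (On the device the result is additionally
    // entangled with the per-device UID key, which is what stops you from
    // brute forcing on a faster machine.)
    func pbkdf2Key(passcode: String, salt: Data, rounds: UInt32, keyLength: Int = 32) -> Data? {
        var derived = Data(count: keyLength)
        let status = derived.withUnsafeMutableBytes { derivedBuf -> Int32 in
            salt.withUnsafeBytes { saltBuf -> Int32 in
                CCKeyDerivationPBKDF(
                    CCPBKDFAlgorithm(kCCPBKDF2),
                    passcode, passcode.utf8.count,
                    saltBuf.bindMemory(to: UInt8.self).baseAddress, salt.count,
                    CCPseudoRandomAlgorithm(kCCPRFHmacAlgSHA256),
                    rounds,
                    derivedBuf.bindMemory(to: UInt8.self).baseAddress, keyLength
                )
            }
        }
        return status == Int32(kCCSuccess) ? derived : nil
    }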
The wrapped keys I mentioned are stored in what Apple calls "effaceable storage", specially designated non-volatile memory that actually erases rather than just being marked as free. I have no idea whether it's stored on the NAND chip on the iPhone 5c or not. (Apparently there was a previous attack that involved making the chip read-only, so Apple may have moved the effaceable storage to mitigate it.)
If you're interested in details, this is a good read, lots of interesting ideas in there:
Part of this relies on the specific iPhone 5c from the shooter, because of the per-device hardware key. They ultimately need to unlock that specific phone, with the NAND data intact, in order to read the contents.
But, if the passcode is stored in NAND and validated only against user input they could duplicate the NAND and parallelize the process. If any part of the user code check involves the hardware key, then it wouldn't work.
If it were just a password hash and Apple still had that hash, then probably so. Is that actually how this information is stored? ISTM it could be something more complex. The system was not designed to support the FBI's intended use, so the fact that Apple could have logged and stored all sorts of things doesn't indicate that they did.
After I changed my iCloud email/password my phone tried to use the old one for months, and I had to escalate the issue high up Apple Support to get it fixed (you can't erase your iPhone without proper iCloud credentials if findmyiphone is enabled). That also involved my email, so it was slightly different.
But not awkward after a valid reply from the server saying "password incorrect".
But my stronger point was that there is no need to speculate, since someone with an iPhone can verify what actually happens. Set up backups, change the password, verify that it fails, change the password back, and report what happens.
Obviously making some assumptions (like HTTP, or a 5xx on network fail between internal services), but telling the difference between "user supplied bad data" and "the server messed up" really isn't that awkward at all.
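Something like this, roughly (assuming HTTP and that the service signals rejected credentials with 401/403; the function is purely illustrative):

    import Foundation

    // Hypothetical sketch: a backup client deciding whether the failure was
    // "credentials rejected" or "the server messed up" from the status code.
    func classify(_ response: HTTPURLResponse) -> String {
        switch response.statusCode {
        case 401, 403:  return "client error: password/credentials rejected"
        case 400...499: return "client error: bad request data"
        case 500...599: return "server error: something broke on their side"
        default:        return "unexpected status \(response.statusCode)"
        }
    }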
For a real test, the server side hash must be saved and restored. Good luck.
If you want the American public to believe the FBI is making fraudulent claims, show demonstrable proof that it can actually be done instead of all the talk and theories.
First of all, for most of them, it's black magic. The few who know a bit more will be convinced by arguments from analogy (after all, nowadays many hard drives are in fact flash devices, and the details of the interface protocol are beyond what jurors will care about).
That's what they're asking for. Part of the FBI's argument is that they need the information for safety of our country, not specifically for the trial.
However... in that case the FBI will have remote access to the phone in question to run whatever software tools against it they require. (This requirement is in the order. :) )
Given that "prevent iOS from reading the ROM used to boot the iDevice" probably isn't a threat that Apple considered to be a serious one, it's entirely possible that the FBI (or an agent of another TLA embedded within the FBI) could use this remote access to also gain access to Apple's (signed!) PIN entry delay and self-destruct removal modifications.
If this happens, and there's a way to bypass whatever mechanism Apple used in the modified image to make it run only on that single iPhone, then Apple has just unwittingly (and unwillingly) handed a backdoor to any iPhone of that model to FedGov (along with any other governments that have clandestine access to the systems of the TLAs in question).
Don't be confused; the stakes are really high.
The owner is dead, so the phone data is not needed for a trial against the owner.
It might be evidence against "co-conspirators" - sadly, we have a vindictive, lawless government now that likes to go after former roommates and acquaintances.
Also, you hire an expert to testify as to the technical details. It isn't perfect, but it is what it is; and anyway, the first thing a defense attorney will say/do in the case that the FBI prefers is to cast doubt upon the integrity of anything the FBI finds after all of the screwing around with it that's been done.
There's nothing interesting there.
If the NSA did this for espionage it's one thing, but I'm curious as to whether substantially modifying the iPhone in this way would stand up in court.... How would the police assert that they preserved evidence after doing this?
I was involved in a drawn-out case that challenged, at great length, the validity of data recovered from backup. That was easy to assert with normal IT people, and yet it took weeks to litigate. I can't imagine how this would go.
Edit: Not sure why I got downvoted. I can currently circumvent my lock-screen passcode with a number of steps, and I'm on iOS 9. Steps to try for yourself:
Edit: Ok I've been tricked. The steps below are unnecessary as the first step actually unlocks your iPhone in the background. ¯\_(ツ)_/¯ The fact remains though that these bugs have existed in the past and may exist on the device the FBI wants to unlock.
1. Invoke Siri, "what time is it?"
2. Press the time/clock that is shown
3. Tap the + icon.
4. Type some arbitrarily long string into the search box. Highlight that text and copy it.
5. Tap on the search box. There should be a share option if your device is capable. Tap the share option.
6. Share to messages.
7. Press the home button.
Congrats, you're more effective than the FBI.
It's a hoax.
TouchID is so fast on the 6s, it's easy to unlock it without realizing you're doing it.
The default protection class is "Protected Until First User Authentication". This means that unless an app says something more specific, the key required to read a file is not available between reboot and the first time the phone is unlocked.
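Apps opt into a class per file. A minimal sketch using Foundation's real writing options (the file names are made up):

    import Foundation

    // Default class: key is unavailable between reboot and first unlock,
    // then stays available even while the device is locked.
    let cacheURL = FileManager.default.temporaryDirectory.appendingPathComponent("cache.json")
    try? Data("{}".utf8).write(
        to: cacheURL,
        options: [.completeFileProtectionUntilFirstUserAuthentication]
    )

    // Stricter class: key is only available while the device is unlocked.
    let secretURL = FileManager.default.temporaryDirectory.appendingPathComponent("secret.bin")
    try? Data("secret".utf8).write(to: secretURL, options: [.completeFileProtection])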
On step 7, are you sure you're not also unlocking your phone with Touch ID when you do that?
Moreover, what's more fascinating is that some people may say it's privacy vs. security and the fight against terror. But what has emerged from the last few weeks is multiple reasons why the FBI should not win in court, regardless of your perspective on terrorism. It's been very clear from day one that the intentions of the FBI are vicious and disingenuous, and with every passing day, more people are finding out.
What would make you secure here is having a password sufficiently long and complex to make brute forcing infeasible. Touch ID facilitates that by making it practical to use the device even with a strong password set. But if you have a four or six-digit passcode set on such a device, then this technique should still work fine.
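Back-of-the-envelope, assuming the ~80 ms per guess imposed by the key-derivation hardware and no escalating delays or wipe:

    import Foundation

    // Worst-case brute-force time at ~80 ms per guess, ignoring the
    // escalating delays and the 10-try wipe. Figures are illustrative only.
    let secondsPerGuess = 0.08
    let spaces: [(String, Double)] = [
        ("4-digit PIN", 1e4),
        ("6-digit PIN", 1e6),
        ("8-char alphanumeric", pow(62.0, 8.0)),
    ]
    for (label, space) in spaces {
        let days = space * secondsPerGuess / 86_400
        print(String(format: "%@: up to %.2f days", label, days))
    }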
First, the iOS security guide makes no mention of any storage besides the flash. When it discusses key erasure, it's always about Effaceable Storage, which according to the guide is a dedicated area of NAND flash.
Second, Apple says that the attack the FBI wants them to carry out could be made against all iPhones. If the Secure Enclave had its own storage, it would be trivial to have it wipe the device if the SE software was updated without a passcode, which would defeat the requested attack. It's possible that the SE has its own storage and Apple just didn't use it this way, of course. Or whoever said this on behalf of Apple could just be wrong that the more recent phones are vulnerable.
The SE does enforce the escalating timeouts, and probably the wipe after ten tries, but that wipe could just be the SE telling the Effaceable Storage to erase the key.
That said, you can bet that Apple is working on a fix for this for future devices. Some way for the Secure Enclave to know that the update was authorized by the user even if the device was subsequently bricked by the update and then recovered. With the ability to maintain proof of authorization even after the device is bricked, they can then make the SE wipe itself if it's updated without authorization.
And fixing a stuck device like that could be done by just not upgrading the Secure Enclave's software during the recovery process. Restore the main OS and you're back where you were. Then you can update the SE firmware.
I'm pretty sure that if the SE doesn't wipe when updated without a passcode, it's because there's no way for it to do so. It just loads the firmware it's presented at boot, no questions asked beyond verifying the signature on that firmware.
No doubt Apple is working on improving this. You'd only need 256 bits of nonvolatile storage within the SE, and corresponding improvements to the SE's bootloader.
As for sending passcodes over the wire, Apple really doesn't want to have this because it makes it a lot harder to reject requests like the one the FBI is making. Disabling the wipe-after-10-passcodes is just turning off a bit of code, that's not an "undue burden", but implementing the ability to send passcodes over the wire is a non-trivial engineering effort and becomes an "undue burden" when the FBI requests that it be added.
Submitting passcodes electronically doesn't really make that much of a difference. It takes at least 80ms to try a passcode that way. Require touchscreen input and you've bumped that up to, what, a second or two? Without the escalating delays and potential wipe, a (very bored) person could crack a four-digit passcode by hand in a day.
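Rough check on that: 10,000 four-digit codes at, say, 5-8 seconds of manual entry each is roughly 14-22 hours worst case, so "in a day" holds up; electronically at ~80 ms per try the same space takes under 15 minutes.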
Also, did they remove this functionality? The "IP Box" brute forcer submits passcodes over USB. Apple may have removed that when they patched the vulnerability that allowed bypassing the escalating delays, of course, but it did exist.
I don't know why the FBI requested this ability, but it's presumably just something like, while we're here we might as well make things a little easier.
The SE uses a different key for each app, so even if you can decrypt one key, you only get data from one app.
I'm starting to think no one is driving the clown car in their technical division.
Sounds like a better hack would be to interpose the flash memory interface with a RAM cache that simulates writes without modifying the original flash data. Then they can hammer away at brute forcing it without the delay of reburning the flash.
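In software terms the interposer would behave something like this (a toy model, not real hardware; all names hypothetical):

    // Toy model of the interposer idea: reads come from a copy-on-write
    // overlay, so simulated "writes" never touch the original flash image
    // and resetting to a pristine copy is instant.
    final class ShadowedFlash {
        private let original: [UInt8]            // pristine NAND dump, never modified
        private var overlay: [Int: UInt8] = [:]  // writes absorbed in RAM

        init(image: [UInt8]) { self.original = image }

        func read(at address: Int) -> UInt8 {
            overlay[address] ?? original[address]
        }

        func write(_ byte: UInt8, at address: Int) {
            overlay[address] = byte              // flash contents stay intact
        }

        func reset() { overlay.removeAll() }     // "swap in a fresh copy" for free
    }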
But I very much doubt you would practically manage to remove that NAND chip and replace it very often on that umpteen-layer, ultra-thin board. Instead, remove it once and stick it in a test fixture, then try brute forcing it.
The bigger question here is: do you really want law enforcement to hack into things as standard procedure? They certainly don't. It's difficult, expensive, slow, and worst, unbound by law. It's a world where your privacy is exposed based on the federal hacking budget rather than a judge's opinion about your potential criminality.
It's much better for law enforcement to be constrained by law than technical ability.
>Even if an iPhone is locked, all of that encrypted data can technically be read easily so long as the phone had at least been unlocked once since the time it was booted up.
Obviously I think it's nonsense, but I have no way of disproving it (even though the burden of proof is on the claimant, naturally).
Edit: OK I found this http://www.darthnull.org/2014/10/06/ios-encryption so never mind, I guess...
Is there a reason we should have confidence in software engineers rather than electrical engineers?
Once you've done that you need to reverse the process.
I've done a few of these and they still scare me.
It's not a trivial process.
To be clear, I don't think Apple should compromise the phone, just that this is not a long con by the FBI to compromise all phones.
am i crazy to want the president to step up and say: "our position as a government is: x"? there's no way this has escaped his notice. isn't that part of the job description of "leader of the free world"?
I agree that the government should have a unified view on all this. However, it should come from CONGRESS, not the President. This is clearly an area where our democracy needs to make a decision about how we govern ourselves, then enshrine that decision in law, then follow that law.
I'm surprised someone at Uni hasn't made demonstrating this exact attack a class project.
If it's so easy, then the ACLU should have no problem demonstrating it with an actual iPhone 5c.
> If the FBI doesn't have the equipment or expertise to do this, they can hire any one of dozens of data recovery firms that specialize in information extraction from digital devices.
Nice to be vindicated though.
The first is at the hardware level, performed by the SSD itself. It encrypts all data on it with a key stored in rewriteable memory. By changing the key, you can "erase" the drive. The FBI could backup that key if they wanted to undo the "secure erase" feature.
However, this is not the only layer of encryption. The second is done in software, performed by iOS. It encrypts all user data with a key derived from the user's passcode and a unique key burned into the chip. The FBI cannot get past this layer, and this is what they wanted Apple's help for.
You were right in that the FBI could trivially get past the first layer of encryption, but it's not the one we care about. There's more than one layer.
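A toy illustration of that first layer, if it helps (CryptoKit here is just for the sketch; the controller's actual scheme isn't public):

    import Foundation
    import CryptoKit

    // Toy illustration of crypto-erase: destroy the stored key and the
    // ciphertext left on flash becomes unrecoverable without overwriting it.
    // (The second, per-file layer works similarly, except its key is derived
    // from the passcode plus the device UID.)
    var mediaKey: SymmetricKey? = SymmetricKey(size: .bits256)
    let userData = Data("contacts, photos, messages".utf8)
    let sealedOnFlash = try! AES.GCM.seal(userData, using: mediaKey!)

    // "Secure erase": forget the key; sealedOnFlash is now just noise.
    mediaKey = nil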
Or rather, by the flash controller. At least, I assume it is. It might actually be an iOS software feature too, but the OP's description reminds me very much of how some SSDs implement secure erase. Whether it's actually done in hardware or software is immaterial, anyway.
...so that they could get past that layer of encryption by making lots of password attempts.
Not that it's particularly useful for hackers, I'm just wondering if this can be done "perfectly" at all.
iDevices can have longer PINs and PINs with letters and symbols.
Using a default 4 digit code, you end up with a numeric pad and four boxes. If you use a longer than 4 digit code but stick with numbers, it gives a single box to enter the pass code in but presents a numeric pad. If you use letters at all, it switches to a qwerty-ish keyboard and a single box to enter the pass code.
Yes, it can, and I explained exactly how in the original post. It's true I got some of the technical details wrong, but the substance of the post was and is correct.
To be clear, I'm not one of the flaggers, but I can understand some potential motivation there. There is already a lot of noise on this topic.
So should you be flagged to death because you got this wrong?
" they could use a copy of the chip to try five different PIN codes, and then replace the chip with a fresh copy of the original and try five more. Lather, rinse, repeat. At worst this would take about a week or so."
No, you were wrong. You claimed that if the FBI was competent they could retrieve the key from the flash. There were no technical details discussed at all. (You could trivially Google this issue and find out what you suggested doesn't work.)
The ACLU is describing an attack on the PIN, not the key. You acknowledge this in another comment on this thread, so I have no idea why you are claiming here that the FBI can get past the encryption in any way, which, as others have said, doesn't work.
Yes. And so did I.
Hasn't it always been known, since the beginning of this issue, that this was a penetrable 5c phone and thus a red herring for the FBI's request?
The FBI isn't going to rip open a phone, unsolder a chip and risk destroying the device, when it can do what it's done successfully, many times in the past, and ask Apple to unlock the phone for them:
This thread is a real-world demonstration of the XKCD comic about pipe-wrench security:
Nerds think that proving that there's some theoretical, high-tech attack against this specific phone means that the FBI should therefore lose. But that's irrelevant. This case is about the pipe wrench.
To use the classic XKCD comic, the crux of the case is that the FBI is the one arguing the first panel (i.e., some bogus magical encryption we can never break), and because of that claim, they need to be able to compel Apple to compromise the security features using the old wrench trick.
The reality of there being practical alternatives for the FBI to pursue should give pause as to whether they can compel Apple to compromise the security features, and arguably the method described/discussed is indeed very practical.
All in all, it's less about the FBI's ability to do any of this and more about "should they be allowed to force a company to do something like this?". By demonstrating that the claim that it's impossible to proceed without Apple's help is not true, I would think it should give pause to any court as to how to rule, since the implication of the ruling is pretty big.
It doesn't matter that you can come up with some theoretically plausible attack that works in this one case. If it's harder or riskier or slower or less effective than Apple complying with the warrant, then the question stands.
Forcing the FBI to admit that their real objective is the general power to hit people with pipe wrenches seems like an important step.
You did receive harsh RTFM comments (a stark contrast with the tone of the comments on this thread).
Glad that you can see your work corroborated.
> I flagged the article, as the entire argument is predicated on a factually false premise.
Wow those comments are harsh
Granted, you updated your post to suggest what the ACLU is now suggesting, but that was after the commenters correctly criticized your post for being wrong.
"they could use a copy of the chip to try five different PIN codes, and then replace the chip with a fresh copy of the original and try five more"
I don't know how I could have made it any clearer that I was proposing an attack on the PIN, not the key.
It's encrypted, but here's the thing: the encryption key is also (almost certainly) stored in the same chip. So all the FBI needs to do is de-solder the chip, mount it in its own hardware, and read out the data.
This is not correct and was the main suggestion you made.
I don't know how I could have made it any clearer that I was proposing an attack on the PIN, not the key.
Because you just did. You are now claiming that an offhand comment you made that resembles what the ACLU suggests is the main point of your post and that is not the case.
Keeping control of an on-line curated forum as it grows is still an unsolved problem.
Your real issue was going up against the cult of Apple fanboys that hang out here. You will find this response with anything that remotely suggests that Apple isn't perfection.
You can see them still getting huffy in the responses to this comment.
And a Chrome plugin that I use to force the browser to the smile site: https://chrome.google.com/webstore/detail/smile-always/jgpmh...
I've never used this firefox one but here's a link to a similar one for FF: https://addons.mozilla.org/en-US/firefox/addon/amazonsmilere...
Is the suggested donation a buck oh five?
The OS should set the security level initially. The TPM would enforce it. You can't modify the OS to make an attempt without it counting against the initially configured limit.
This would also slow down their attack considerably.
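Roughly the idea, as a toy model (all names hypothetical; a real TPM or secure element would keep this state in tamper-resistant hardware):

    // Toy model: the secure element, not the OS, owns the attempt counter,
    // so swapping or modifying the OS can't reset it.
    final class SecureElementModel {
        private let maxAttempts = 10      // configured once, up front
        private var failedAttempts = 0    // lives inside the tamper-resistant part
        private let storedPasscode: String

        init(passcode: String) { self.storedPasscode = passcode }

        func tryUnlock(_ guess: String) -> Bool {
            guard failedAttempts < maxAttempts else { return false }  // locked out / wiped
            if guess == storedPasscode { failedAttempts = 0; return true }
            failedAttempts += 1           // counts no matter which OS asked
            return false
        }
    }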
I disagree that the claim is fraudulent.
Of course it would take much less time for a 4-digit numeric code -- but AFAIK at this point the length of the password is unknown, so the ACLU claiming fraud based on an assumption about the length of the password is not correct.
Let's see how many more DVs I can get.
In other words, if you are holding the phone in your hand, you can figure out how many digits the passcode is, and whether it's alphanumeric or just numeric, without entering a single character.
But where I was disagreeing was that it was ever revealed that the phone in question really does have a 4 digit numeric code, because as far as I know, the size and complexity of the pass code has never been revealed by the FBI.
Right. They could do this, and risk destroying the device, or they could ask Apple to do the easy, reliable thing, and just install a build on this phone that allows brute-force attacks.
Given that Apple has a long history of complying with these kinds of requests for valid search warrants, and that this situation is about as clear as it gets when it comes to justifiable uses of government investigatory powers, it's obvious why they're taking the latter approach, and not the former.
There's a legitimate privacy debate in this case, but this isn't it.
Edit: I'm just stating facts here, folks. Downvoting me won't change those facts, or make the government change its tactic.
Order #2 is the relevant order. But to answer your question, they've complied with similar warrants in at least 70 cases in the past:
The only difference in this case is whether or not the FBI can compel Apple to make and install a custom build of iOS to crack the phone.
In any case, the legal question has nothing to do with encryption. It's an incidental detail.
Moreover, Apple won't comply with valid warrants for phones running iOS7, so it doesn't really have anything to do with the security of the OS. This started only because a federal judge made an issue of the legal justification for the first time ever:
But OK, if you insist...here's "evidence" straight from the EFF:
"For older phones with no encryption, Apple already had a software version to bypass the unlock screen (used, for example, in Apple stores to unlock phones when customers had forgotten their passcode)."
And before you go there: whether or not you call this "brute forcing" is, again, a distinction without a difference. The FBI wants access to a single, password-protected phone, under warrant, and Apple has historically maintained custom software that helped them comply with these exact requests. Nobody knowledgeable about this case cares that the software has to iterate through 10,000 numbers, or uses some other method to gain entry. They just want the outcome.
Whether Apple has previously signed a piece of PIN unlock software or not completely misses the point: they decided to do that. They were not compelled. They expressed trust in the software because they trusted it. Not because they were forced to. Compelled speech is constitutionally prohibited.
Maybe give CVE-2014-4451 a try?