What the FBI is attempting is to use the All Writs Act of 1789, which authorizes Federal courts to issue "all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law." Are there limits to what a judge can order a person, or a company, to provide?
A warrant describes "the place to be searched, and the persons or things to be seized." I would not expect that a judge can draft a warrant for something which doesn't actually exist, and then force someone to create it.
This is not about providing physical access, or about producing documents which are in your possession. This is about whether the government can usurp your workforce to make you create something that only you are capable of creating, against your will, not because there's actually a law which says you have to provide that capability, but simply because some investigator has probable cause that, given such a tool, they could use it to find evidence of a crime!
If 'All Writs' somehow does give the government the ability to enslave software developers into creating this particular backdoor, what is there to legally differentiate this request from, for example, one that would function over WiFi or LTE remotely?
There's been a lot of discussion about the 'secure enclave' and how this particular attack isn't possible on the iPhone 6. I think that's missing the point.... If 'All Writs' can force Apple to open a black-hat lab responsible for developing backdoor firmware for the 5C, then it can do the same for the 6. For example, why not force Apple to provide remote access to a suspect's device over LTE while the device is unlocked / in use? While we're at it, the iPhone has perfectly good cameras and microphones, let's force Apple to provide real-time feeds.
Think about the sheer quantity of networked devices which exist (or will exist) in an average home which could be used in the course of an investigation. If they can force Apple to create a 5C backdoor, I can't see any reason they can't apply the same logic to WiFi cameras, Xbox Kinects, or even your car's OnStar. Heck, even TV remotes come with microphones and Bluetooth now... And don't get me started on Amazon Echo!
Fundamentally, the question is: can you force a device manufacturer to implement backdoors into their products to be used against their own customers? Notably, service providers have already lost that battle; they are required to architect their systems to be able to spy on their users and provide that data to law enforcement, often through specially designed real-time dashboards. At least in that case it is based on duly enacted legislation with that specific intent.
But this is something really quite shocking -- can investigators, simply through obtaining a warrant, force companies to re-design the personal devices that we own and keep with us almost every moment of the day to spy on us? I truly hope not.
Well stated. That's the crux. The technical difficulty of any given hack is going to vary and is ultimately irrelevant. The idea that the government can commandeer a company's resources towards its ends, especially when those ends compromise the security of a larger community, is a dangerous one.
Once in possession, they'd be free to contract an independent entity to modify the firmware and re-sign it with Apple's own key. It can't be all that hard to NOP out the code that increments the unlock attempt counter and the associated delay mechanisms.
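For anyone who hasn't thought about what such a patch amounts to, here's a conceptual sketch (Python pseudocode; all names and the delay schedule are illustrative inventions, not Apple's actual code):

```python
import time

failed_attempts = 0

def delay_for(attempts: int) -> float:
    # Illustrative escalating delay; real iOS uses a different schedule.
    return min(2.0 ** attempts, 3600.0)

def wipe_device() -> None:
    raise RuntimeError("device wiped after 10 failed attempts")

def try_passcode_shipped(guess: str, correct: str) -> bool:
    """What the shipped firmware enforces."""
    global failed_attempts
    if guess == correct:
        return True
    failed_attempts += 1                    # the counter increment to patch out
    if failed_attempts >= 10:
        wipe_device()                       # the auto-erase to patch out
    time.sleep(delay_for(failed_attempts))  # the escalating delay to patch out
    return False

def try_passcode_patched(guess: str, correct: str) -> bool:
    """What a 'NOPed out' image would do: no counter, no wipe, no delay."""
    return guess == correct
```

Note that the encryption itself is untouched; the patch only removes the obstacles to guessing.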
"As many jailbreakers are familiar, firmware can be loaded via Device Firmware Upgrade (DFU) Mode. Once an iPhone enters DFU mode, it will accept a new firmware image over a USB cable. Before any firmware image is loaded by an iPhone, the device first checks whether the firmware has a valid signature from Apple. This signature check is why the FBI cannot load new software onto an iPhone on their own — the FBI does not have the secret keys that Apple uses to sign firmware."
In these older devices there are still caveats, and a customized version of iOS will not immediately yield access to the phone passcode. Devices with A6 processors, such as the iPhone 5C, also contain a hardware key that can never be read, and they "tangle" this hardware key with the phone passcode. However, there is nothing stopping iOS from querying this hardware key as fast as it can. Without the Secure Enclave to play gatekeeper, this means iOS can guess one passcode every 80ms.
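To put that 80ms figure in perspective, the arithmetic is straightforward (the 80ms rate comes from the article; the rest is simple math):

```python
# One guess every 80 ms; worst case tries the whole keyspace.
GUESS_TIME_S = 0.080

for digits in (4, 6):
    keyspace = 10 ** digits
    worst_case_s = keyspace * GUESS_TIME_S
    print(f"{digits}-digit PIN: {keyspace:,} codes, "
          f"worst case {worst_case_s / 3600:.1f} h")
# 4-digit PIN: 10,000 codes, worst case 0.2 h (about 13 minutes)
# 6-digit PIN: 1,000,000 codes, worst case 22.2 h
```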
And even if it did have a Secure Enclave, from page 5 of your link:
> When an iOS device is turned on, its application processor immediately executes code from read-only memory known as the Boot ROM. This immutable code, known as the hardware root of trust, is laid down during chip fabrication, and is implicitly trusted. The Boot ROM code contains the Apple Root CA public key, which is used to verify that the Low-Level Bootloader (LLB) is signed by Apple before allowing it to load. This is the first step in the chain of trust where each step ensures that the next is signed by Apple. When the LLB finishes its tasks, it verifies and runs the next-stage bootloader, iBoot, which in turn verifies and runs the iOS kernel.
So the LLB runs before the OS does, and a modified, Apple-signed LLB could be used to contact the Secure Enclave and guess passcodes.
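The chain described above is easy to sketch. This toy version uses an HMAC as a stand-in for Apple's actual asymmetric signatures (the real Boot ROM verifies signatures against the Apple Root CA public key, and the private half never leaves Apple), so treat it as illustrative only:

```python
import hmac, hashlib

# Stand-in for Apple's signing key; invented for illustration.
SIGNING_KEY = b"illustrative-signing-key"

def sign(image: bytes) -> bytes:
    return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

def verify_and_run(stage: str, image: bytes, signature: bytes) -> None:
    if not hmac.compare_digest(sign(image), signature):
        raise RuntimeError(f"{stage}: bad signature, refusing to boot")
    print(f"{stage}: signature OK, executing")

# Boot ROM verifies the LLB, the LLB verifies iBoot, iBoot verifies the kernel.
for stage, image in [("LLB", b"llb-image"),
                     ("iBoot", b"iboot-image"),
                     ("kernel", b"kernel-image")]:
    verify_and_run(stage, image, sign(image))
```

The key observation: every stage in the chain runs whatever Apple signs, which is exactly why an Apple-signed brute-forcing bootloader would be accepted.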
And from the article:
At this point you might ask, "Why not simply update the firmware of the Secure Enclave too? That way, Apple could disable the protections in iOS and the Secure Enclave at the same time." ~~Although it is not described in Apple's iOS Security Guide, it is believed that updates to the Secure Enclave wipe all existing keys stored within it. An update to the Secure Enclave is therefore equivalent to erasing the device.~~ I initially speculated that the private data stored within the SE was erased on update, but I now believe this is not true. After all, Apple has updated the SE with increased delays between passcode attempts and no phones were wiped. In all honesty, only Apple knows the exact details.
So, it may be that Apple does have a backdoor to upgrade the SE. Only Apple (and maybe other state actors) really knows. So it certainly isn't as cut and dried as you imply, even if the device did have a Secure Enclave.
But, in this case, the device does not have a SE, so it is clear that with Apple's keys the device could, with a practical amount of effort, be hacked.
People on the Internet tend to believe that technology poses confounding problems for the law, but the law has been dealing with technical challenges for centuries. See, for instance, any case involving a complicated medical issue.
Can you name some of them?
IANAL, but Federal Rule of Evidence 706 explicitly states, "But the court may only appoint someone who consents to act."
The problem is different now than it was centuries ago when technology was oil, gas, and steam powered. Now it is electric, meaning information moves at the speed of light all around the world simultaneously.
Such technology is getting better, faster, and cheaper at exponential rates leading to ephemeralization. Meanwhile paper-based political processes have stalled and remain slow as ever.
Say a paper-based political system begins the process of banning a new technology. By the time they finally get around to completing their process, 5 better, faster, and cheaper technologies have already been invented making the old one irrelevant. This is an increasingly important problem to deal with considering the existential nature of paper-based political processes and the rate of technological change.
Paper-based information systems and processes are simply too slow to keep up with the speed of the electric medium. It's like trying to race a lightning bolt.
In a way, this would resemble war-time confiscation of production capability. You have a car factory, for instance, and the government tells you that from now on, you'll be producing tanks, thank you very much.
I think this was not, strictly speaking, what happened in the US during the Second World War, because companies were willing enough to produce war material for the US government in exchange for considerable sums of money. But, for instance, the Skoda factory in Czechoslovakia was simply confiscated and directed to making military vehicles.
This almost certainly ends up at the Supreme Court; given the national security implications of the case, it's plausible a certiorari petition could be made by either Apple or the government. And yet we have a 4-4 court right now.
Or really any small tweak. I can remember at least a couple of times being asked to provide someone with a special build of software I worked on that, e.g., logged something it didn't ordinarily log, to help debug an issue a customer was having that we couldn't reproduce in-house. I can't say I ever thought of that as creating new software, even if I added some fprintf statements that weren't there before.
Can a court order someone to delete phrases from a letter or book?
Code is protected under copyright law. Code is a form of speech. How is either a court, or a law enacted by Congress, telling someone what to write, or to delete what's already written, not abridging speech? Why is code different?
A bigger question might be whether the government has the ability to require Apple to do this without compensation, and if it does, who gets to set the rates or define the project -- Apple or the government? Possibly a better tack for Apple (discounting the setting of precedent, etc.) would be to bill the government $700-$1000/hour per developer, and assign a team of 200-300 developers to this, with an expected delivery date of 3-5 years. As the government starts paying out $100 million/month for, in effect, nothing, the public would very quickly become outraged and force the question of compensation into court. At the end of the day, the people/government would have to decide if it's worth either 1) forcing Apple to do this uncompensated or 2) paying that cost to decrypt one phone -- neither, I believe, would be widely popular/supported.
Regardless of what happens, unless the FBI withdraws this request, which is highly unlikely -- they have clearly chosen this case due to the terrorism connection to be its legal Alamo -- this is going to end up in front of the Supreme Court.
Almost certainly this expertise is billable to the government, and it won't take $100 million; I'd be surprised if it took $1 million. But what do I know?
The more concerning thing is that the FBI almost can't lose. Even if they lose the case in court, they've put pressure, and will continue to put pressure, on the public and Congress to change this in law, which is why I think it's important to establish constitutional reasoning for why this is a bad idea.
At the end of the day, there really is no way the government will succeed here; you really can't force Apple's developers to do this, as they could just tell the government to go pound sand. The only way it could get done is through threat of violence/incarceration or by gigantic sums of money. And the government can't really argue that the time of developers at Apple, a company with billions in profits, isn't worth a few billion.
You don't get to play games like this with the judiciary. They can require you to allow an outside auditor to review the code.
Abridge means to curtail. Telling someone to create something doesn't curtail anything.
The phone in question is an iPhone 5C which does not have a Secure Enclave, only a burned-in hardware ID.
It's also worth pointing out that there's no clear evidence that the Secure Enclave must clear all keys as part of a SE firmware update (just speculation that this would be reasonable).
It is being asked to help exploit an existing backdoor in a device they produced; a device that was used by a person who slaughtered many people.
If the "10 false codes and the device is bricked" can be circumvented, there is a backdoor.
The 10-try deletion is a completely separate system from the encryption. If you stop it from deleting, the encryption is still intact. No one will be able to stop this from happening short of Apple taking action or divine intervention, so there is no backdoor here.
What you seem to be making is a philosophical point: if Apple could in the future design software to backdoor your phone, then there is the potential for a backdoor right now, and therefore there is a backdoor right now. This is false. There is no backdoor. Apple, being god in this case, could make a backdoor. Apple, or god (whichever you prefer), didn't make a backdoor, so there is no backdoor.
This is not a "back door". It's just not.
The bottom line is that Apple produced a device whose security features could be circumvented.
I really believe that there should be a way for law-enforcement to get access to specific devices in response to a court order as long as the solution doesn't involve weakening the encryption for everybody else.
I'm absolutely against backdoors, secret* keys, or similar crap. But physically accessing a single device in order to make brute-forcing it possible seems acceptable to me, as that won't affect any other device.
That would be similar to a court order allowing law enforcement to enter your premises and take out the safe in order to pry it open at some other location where specialised equipment is available.
If this is all law enforcement wants, then maybe it's time to hand this over before law enforcement wants even more, which would doubtless pave the way for mass surveillance of devices.
* until they leak. Then everybody has access.
The FBI is already paving that way with this case. They don't overly care about access to this particular iPhone. They're taking this case through the courts so that they can establish a precedent that allows them to force manufacturer cooperation to unlock any phone.
Edit: If they really cared about access to this individual phone, they wouldn't be going through the courts to get it; they'd be talking to the NSA TAO or other LEO with advanced forensic capability. As several people have pointed out, this iPhone 5C does not have a Secure Enclave and probably does not present a significant challenge to forensically analyze, to people that know what they're doing. They're going through the courts on this so they can get carte blanche to access iPhones 5S and above, which no LEO currently has capabilities to inspect.
Further edit: This is Farook's work phone. His main, personal phone was found destroyed in a dumpster near the site of the attacks. I find it incredibly unlikely the FBI really cares much about the contents of this individual phone; they just want a high-profile test case to expand their surveillance capabilities.
This is an analysis, not an objective and demonstrable fact.
I could just as well argue that yes, the FBI really does care a lot about this particular iPhone, and that's why the asked-for update is to be keyed to this iPhone and only this iPhone.
At the same time, even assuming that is true, we're talking about the FBI going through a legal process, reviewed by a judge, to get the data off one phone at a time. If that's how it works every time, I don't see a problem; that is how the system is supposed to work. I am kind of baffled as to why we're cheerleading the fact that Apple is refusing to perform what appears to be a perfectly reasonable request that is being made in accordance with the law. If you are operating under the presumption that the government is always a bad-faith actor, then we have much, much bigger problems.
Also, apparently this 'precedent' has already been set; according to a link in the article, Apple had previously offered custom firmware images to law enforcement after a court order that bypassed the lock screen on earlier iPhones.
Wasn't it just yesterday that a story was published about an upcoming documentary on the Stuxnet virus, which claims that the US and Israel developed it in secret together and made very successful but very limited use of it? Only when Israel allegedly went off on their own to modify and deploy it did it spread wide and far, popping up on the radar of anti-malware companies and getting researched and publicized.
As you said, after the exploit/backdoor/software is designed, it can never be un-designed. It will exist as a tool that can only be mitigated, not destroyed.
Then why don't we have Apple's private keys yet?
Plenty of companies keep a lot of things very secret, including things like powerful debug modes, for a long time. At least long enough that everybody forgets the details and the software has long since rotted away.
Nobody in the FBI would give a damn about leaking the patched OS image: it's Apple's reputation at stake, not the FBI's.
There is nothing of value for the FBI to leak.
This is the huge difference between this order (which I can live with) and blanket encryption backdoors using key escrow or other crap (which I'm absolutely vehemently against and willing to fight to the teeth)
That is completely not true. There is no way to make such a thing work on only one particular phone. There will be some point at which the compromised firmware image checks whether it's running on that device, and at that point it would be possible to change the check to match whatever device you want.
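Concretely, any "keyed to one device" restriction reduces to a comparison against an identifier somewhere inside the signed image. A hypothetical sketch (all names invented for illustration):

```python
# Hypothetical: the per-device restriction is one comparison in the image.
TARGET_ECID = "0x1A2B3C"  # invented identifier for the subject phone

def firmware_allows_bruteforce(device_ecid: str) -> bool:
    # The entire "only this phone" guarantee lives in this line. Anyone
    # holding the signing key can change the constant and re-sign.
    return device_ecid == TARGET_ECID

print(firmware_allows_bruteforce("0x1A2B3C"))  # True: the target phone
print(firmware_allows_bruteforce("0x9F8E7D"))  # False: every other phone
```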
"This is the huge difference between this order (which I can live with) and blanket encryption backdoors using key escrow or other crap (which I'm absolutely vehemently against and willing to fight to the teeth)"
No, there is absolutely no difference between those two.
The technique that makes this possible is described in Apple's iOS Security White paper, page 6 ("System Software Authorization"): https://www.apple.com/business/docs/iOS_Security_Guide.pdf
This mechanism explains why you can't take an old release of iOS off a different phone and copy it to yours.
No. You don't have to assume that law enforcement or intelligence agencies are bad faith actors to see they are constantly seeking to expand their powers.
Given what we know about government surveillance programs, why would one assume the government is a good faith actor when it comes to encryption?
It needn't be "always", just (perceived as) too often. It seems fair to say the broad sentiment is that the national security arms of the U.S. government have breached that barrier.
And how will we ensure that's the case? Once they have the firmware they need they can install it on other phones.
For all we know so far, Apple could still provide a signed firmware image bypassing the brute-forcing delay implemented by the Secure Enclave.
Probably not easy, and not at mass scale. It may even require looking inside the silicon itself, which is expensive and hard. But as long as the keys are on the device or in Apple's possession, they can be extracted.
Apple themselves chose to have total control over the device, signing, and ecosystem - now it backfires.
I would think it's more similar to the following: the government has gone to the safe manufacturer for help opening a safe that it has a warrant for, has access to, and can move, but can't open.
The government is asking the manufacturer to develop a method to modify the safe so that it can be opened. The safe manufacturer says that if it did so, the same method could be used on all of their safes, thereby making all of their products less secure.
I would imagine that a reasonable safe manufacturer would bring up the same objection.
If the safe manufacturer doesn't want to be put in this position, it should make it so there is no such modification possible. Which as far as I understand it is what Apple did with their safes^H^H^H^H^Hphones starting with the A7 CPU, but this phone is older.
I'm guessing that the secure enclave not only requires a private key from Apple, but that it wipes the crypto keys it contains (effectively wiping the device) if it's updated without first being unlocked with the user's passcode. That would prevent even Apple from cracking it, barring an exploit of the secure enclave's software, or some sort of highly advanced attack on the physical hardware.
This requirement is self-contradictory. The device has no way to determine whether the attacker trying to gain access is a good or bad guy, nor can it.
Which law enforcement? The FBI? Really? How about the DEA? The TSA? How about the federal police in China? Venezuela? Saudi Arabia? Syria?
Were I in Apple's position, I would probably do what Apple is doing here... but it's a harder question than just "should we cooperate?" They have to ask, "what if we don't?"
Sure, they can try brute-forcing it, or even break open the chips and try to extract keys with an electron microscope. That's all within their domain. But why should anyone be forced to assist them?
A safe is a container filled with physical objects: property. Property is subject to search and seizure with appropriate warrants, levies, writs, orders, wants, etc.
A phone is a container filled with information. The only physical property relevant to evidence consists of the electromagnetic state of the memory on the device. This would be no different from the bioelectric state of the neurons in the human brain, which, coincidentally, is also a container filled with information. In both cases, there seems to be easy precedent to state that the information in those containers represents protected information, as it pertains to the possibly incriminating testimony of that information.
A safe can be physically removed and brought to a place where there are more specific tools available to access its physical contents. I stipulate to that.
A phone can also be physically removed and brought to a place where ... What? What tools exist to interrogate the electromagnetic state of the phone that aren't already accessible? Asking Apple to create some software allowing them to unlock and read the information is tantamount to asking a neuroscientist to create software allowing them to unlock and read your mind.
Not trolling, these are my sincere beliefs. Are they wrong?
Nobody can bypass the security on an iPhone but Apple; requiring them to do so isn't requiring a company to assist in a search, it's requiring a company to use its unique position as the manufacturer to damage their product.
Unfortunately, I think this sort of example paints you into a corner. The court is not requiring the company to permanently damage the product, any more than a locksmith who opens a locked door to assist in the execution of a warrant is damaging the door. Once the court has what it requests, it is trivial to reinstall the original firmware.
There is an old saying that hard cases make bad law. This is a hard case. The defendants are both heinous and deceased; they lack standing to resist and few will support such an effort. Apple is a third party here and is on such shaky legal ground that they are left defending themselves in the court of public opinion because they know they are going to lose in an actual court. Bad things tend to come out of situations like this.
Can you explain how that duty comes to be?
And since you asked, I would bet that most of the people sitting on a jury deciding whether you go to jail for impeding the execution of the warrant consider a duty to act in such a situation part of the price of admission to civil society.
Well, because an actual judge would be required to make such a decision, and in each case, the merits would be assessed separately. So Apple could say no, and a judge might agree with them "right, the government has no grounds here".
(IANAL, and IANEAA - I am not even an American).
Unfortunately that's exactly what they're going to end up doing with this faux resistance. It seems like in this case there is a master key that only Apple has. If the device's security is broken in this manner, then this is a terrible place to make a stand, as Apple will have no choice but to eventually comply.
Next time, with an actually secure implementation, the stance will be "you protested last time and gave in, do that again". And when USG realizes Apple isn't bluffing that time, their bolstered entitlement will result in the inevitable law for Apple to go back to the backdoored nearly-just-as-secure scheme.
To the first order, USG doesn't care about the argument that foreign governments could also compel Apple, since that simply reduces to traditional physical jurisdiction. And governments seem to be more worried about protecting themselves from their own subjects than from other governments.
We can only hope that the resulting legal fallout is implemented in terms of the standard USG commercial proscriptions based on the power of default choices, leaving Free software to continue to be Free.
Apple could be forced to write software that removes the rate limiter, and the FBI could still be stuck without access, because it's possible the user used a password with too much entropy.
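For a sense of scale, assuming the roughly 80ms-per-guess rate quoted elsewhere in the thread (simple arithmetic, not a claim about actual iOS internals):

```python
GUESS_TIME_S = 0.080  # per-guess rate quoted elsewhere in the thread

def worst_case_years(alphabet_size: int, length: int) -> float:
    return alphabet_size ** length * GUESS_TIME_S / (3600 * 24 * 365)

print(f"6-digit PIN:          {worst_case_years(10, 6):.4f} years")   # ~1 day
print(f"8-char lowercase:     {worst_case_years(26, 8):.0f} years")   # ~530 years
print(f"10-char alphanumeric: {worst_case_years(62, 10):.2e} years")  # ~2 billion years
```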
Include in the explanation how removing code makes the prior state insecure.
If we're not talking about trusted hardware, then naive code which calls sleep() is defective for the same reason: the security of the system cannot depend on running "friendly" code. See Linux's LUKS, which has a parameter for the number of hash iterations used when unlocking; that sets the work factor for brute forcing.
If this still isn't apparent, you need to try thinking adversarially - what would you require to defeat various security properties of specific systems?
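To make the LUKS point concrete: the work factor lives in the key derivation itself, so every guess costs real CPU time no matter whose code does the guessing. A minimal sketch using Python's built-in PBKDF2 (LUKS itself uses PBKDF2 or Argon2 with a benchmarked iteration count):

```python
import hashlib, os, time

salt = os.urandom(16)

# The iteration count IS the rate limit: each guess costs this much CPU
# regardless of whose code is doing the guessing.
for iterations in (1_000, 100_000, 1_000_000):
    start = time.perf_counter()
    hashlib.pbkdf2_hmac("sha256", b"candidate-passphrase", salt, iterations)
    elapsed = time.perf_counter() - start
    print(f"{iterations:>9,} iterations: {elapsed * 1000:.1f} ms per guess")
```

No firmware patch can remove that cost, which is exactly the adversarial thinking being asked for above.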
> Further the code is signed by author's private key
This is the crux - if Apple is in a privileged position to defeat security measures and you're analyzing security in terms of Apple/USG, this counts as a backdoor. It doesn't provide full access, but it does undermine purported security properties of the system.
It's quite possible to implement a system with similar properties that doesn't give Apple such a privilege. It sounds like they didn't.
This is not correct. Reverse engineering is a thing. Proprietary software just makes it harder. People modify proprietary code all the time.
> Further the code is signed by author's private key, so even if an attacker could modify compiled code (via a decompiler for example), they still can't inject that modified code into the hardware without signing.
This is the actual point.
An OS signing key is never a replacement for a bona-fide user-initiated upgrade intent.
In designs with trusted hardware to prevent evil maid attacks, the boot trust chain should use a hash rather than a signature. This hash is updated only when the trusted chip is already unlocked.
To avoid creating useless bricks, said trusted hardware should allow the option to wipe everything simultaneously. But nothing more granular.
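A sketch of the design being proposed, to make it concrete (hypothetical hardware, not how current iPhones work): the trusted chip pins a hash of the boot image, re-pins it only while unlocked, and wipes its keys on any unauthorized update:

```python
import hashlib

class TrustedChip:
    """Hypothetical chip that pins a boot-image hash instead of trusting
    anything that carries a vendor signature."""

    def __init__(self, boot_image: bytes):
        self.pinned_hash = hashlib.sha256(boot_image).digest()
        self.keys = b"device-encryption-keys"
        self.unlocked = False  # set True only after the user's passcode

    def boot(self, image: bytes) -> None:
        if hashlib.sha256(image).digest() != self.pinned_hash:
            raise RuntimeError("unrecognized boot image, refusing to boot")
        print("booting pinned image")

    def update_boot_image(self, new_image: bytes) -> None:
        if not self.unlocked:
            self.keys = None  # unauthorized update path: wipe everything
        self.pinned_hash = hashlib.sha256(new_image).digest()
```

Under this design, even the manufacturer's signing key is useless against a locked device: a forced update yields a bootable but keyless brick.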
Do people think this is a game? Apple doesn't run things, the federal government does, and will, in the end, use it's full power to get what it desires.
The vast majority of people in the US agree that, when it comes to the illusion of keeping them safe, Apple should bend over and give up the info.
Now, I personally do not agree with this stance, but it's obvious to me which way the wind is blowing.
They want the illusion of safety that the 3-letter agencies provide.
What's more, they don't even have to actually do it, they just need to make the FBI believe that they actually would do it if the FBI presses the issue.
Way too many powerful and wealthy people are interested and invested.
EFF to Support Apple in Encryption Battle
China's behaviour made HSBC, the world's 5th-largest bank, move to London. It's not an unheard-of move.
A masterful troll statement!
Apple may have to comply with this order (after appeals), but this also helps muster the troops for the battle against universal backdoors.
Otherwise, they could start locking people up.
They do that all the time for not complying with edicts.
For those of you who have never experienced it, even 48 hours in jail is a truly miserable and ugly experience, one that no corporate titan is interested in.
If they actually try to hit that hard they'll find themselves in a very bad PR situation. They might not care for that but the consequences of such an action will bite them really hard on the ass.
Also, "its", without an apostrophe.
If true, precedent has already been set.
It probably wouldn't apply as precedent as it previously had nothing to do with encryption.
Specifically (emphasis mine):
> ...the Self-Incrimination Clause ... may be asserted only to resist compelled explicit or implicit disclosures of incriminating information. Historically, the privilege was intended to prevent the use of legal compulsion to extract from the accused a sworn communication of facts which would incriminate him.
> ...the act of producing documents in response to a subpoena may have a compelled testimonial aspect. We have held that “the act of production” itself may implicitly communicate “statements of fact.” By “producing documents in compliance with a subpoena, the witness would admit that the papers existed, were in his possession or control, and were authentic.”
> Compelled testimony that communicates information that may “lead to incriminating evidence” is privileged even if the information itself is not inculpatory.
EDIT: Wikipedia summary of the case here: https://en.wikipedia.org/wiki/United_States_v._Hubbell
You would have had to know the target and push a vulnerability beforehand, which wouldn't have helped in this case.
As a side note, the author mentions that Apple has updated the Secure Enclave with increased delays in the past without wiping data, though they state that only Apple knows how it really works. I just want to put forth the theory that maybe the Secure Enclave allows its firmware to be updated if and only if the user's passcode is provided at the time the OS tells the Secure Enclave to prepare for a firmware update. That would be a reasonable way to ensure the Secure Enclave can't be subverted.
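That theory is simple to express as a sketch (purely hypothetical; as noted above, only Apple knows the real behavior):

```python
class SecureEnclaveTheory:
    """The theory above, made explicit: no passcode, no firmware update."""

    def __init__(self, passcode: str):
        self._passcode = passcode

    def update_firmware(self, new_firmware: bytes, passcode: str) -> bool:
        if passcode != self._passcode:
            return False  # refuse: no user authorization, no update
        print("user authorized: applying firmware, keys preserved")
        return True
```

This would explain how Apple shipped delay increases to unlocked phones without wiping anything, while still being unable to subvert a locked one.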
There are Android full disk encryption schemes, and of course phones with signed bootloaders.
What Apple did that was so valuable was provide a very clear, almost abstract implementation, from scratch, hitting every point along the way (randomized device private keys, a read- and execute-only Secure Enclave, signed loaders, proper AES(-XTS?) full disk encryption, probably also requiring a strong password, full lock after ~48 hours - sure, it'd be good if this could be customized to something lower).
I still remain opposed to any kind of circumvention that reduces security, which this definitely does. Just questioning whether it's something to be so shocked about since it's not exactly far removed from the kind of requests that they have conformed to in the past.
Presumably this could then be used for offline attacks against the image dumped from the phone's flash memory.
Update: In the meantime I was talking to my security engineer peers, and it is not feasible to carry out an attack this way. The user partition remains unmounted until the PIN is provided after boot.
It's encrypted data using the PIN (and a key embedded in the phone). There's not another way in.
All an attacker would have to do is clone the contents of the device's SSD and somehow read the secret key that is embedded somewhere else. I'm not sure how feasible the latter part is, but surely this shouldn't be beyond the capabilities of US three-letter agencies?
> The UID key is used to create a key called “key0x89b.” Key0x89b is used in encrypting the device’s flash disk. Because this key is unique to the device, and cannot be extracted from the device, it is impossible to remove the flash memory from one iPhone and transfer it to another, or to read it offline. (And when I say “Impossible,” what I really mean is “Really damned hard because you’d have to brute force a 256-bit AES key.”)
Newer phones also include a secure enclave that introduces another key and hardware restrictions on timing. The FBI's request wouldn't make sense for a modern iPhone.
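A sketch of what "tangling" the passcode with an unextractable hardware key buys you (illustrative; the real derivation runs inside the hardware AES engine, not PBKDF2):

```python
import hashlib

# Stand-in for the UID key fused into the silicon at fabrication. It can
# be used by the hardware but never read out, so this derivation can only
# run on the device itself.
DEVICE_UID_KEY = b"illustrative-fused-uid-key"

def derive_disk_key(passcode: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                               DEVICE_UID_KEY, 50_000)

# Without DEVICE_UID_KEY, a cloned flash image is just AES ciphertext:
# there is nothing to guess passcodes against offline.
print(derive_disk_key("123456").hex())
```

This is why the FBI needs on-device guessing at all, and why Apple's signed-firmware monopoly is the only practical way in on a 5C.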
So Pandora's box has been opened?
After too many incorrect PINs, there's a time delay before another attempt can be made.
The FBI is also thinking about the next time. They want to be able to take a phone, plug it in and brute force the PIN to gain access.
I agree with the first post. We need to be creative and find a way to resist government surveillance, and the part of the engineering that seems impossible is allowing an occasional breach of security for extreme circumstances.
What's extreme? Well, first, physical possession of the device should be required. Second, it should take resources only a nation-state would be able to afford. Want to decrypt an iPhone? It's going to cost > $5 million in processing power. Any criminal would move on.
It also seems pretty disingenuous/hypocritical for Apple to plead "customer privacy" when the ENTIRE BUSINESS MODEL of much of the smartphone and app industry (from which Apple directly benefits with a 30% commission) is predicated on abusing customer privacy.
Apple uses privacy as a major selling point. Apple has also proven very bad at abusing customer privacy for profit; they have even shuttered their own advertising service.
"...we never sell your data."
Or so they say.
They should already have this information from their 'metadata' collection programs. If the FBI had been doing their job at any point, this wouldn't even be a subject of debate.