What the FBI is attempting is to use the 'All Writs Act' of 1789, which authorizes Federal courts to issue "all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law." Are there limits to what a judge can order a person, or a company, to provide?
A warrant describes "the place to be searched, and the persons or things to be seized." I would not expect a judge can draft a warrant for something which doesn't actually exist, and then force someone to create it.
This is not about providing physical access, or about producing documents which are in your possession. This is whether the government can usurp your workforce to make you create something that only you are capable of creating, against your will, not because there's actually a law which says you have to provide that capability, but simply because some investigator has probable cause that given such a tool they could use it to find evidence of a crime!
If 'All Writs' somehow does give the government the ability to enslave software developers into creating this particular backdoor, what is there to legally differentiate this request from, for example, one for a backdoor that would function over WiFi or LTE remotely?
There's been a lot of discussion about the 'secure enclave' and how this particular attack isn't possible on the iPhone 6. I think that's missing the point.... If 'All Writs' can force Apple to open a black-hat lab responsible for developing backdoor firmware for the 5C, then it can do the same for the 6. For example, why not force Apple to provide remote access to a suspect's device over LTE while the device is unlocked / in use? While we're at it, the iPhone has perfectly good cameras and microphones, let's force Apple to provide real-time feeds.
Think about the sheer quantity of networked devices which exist (or will exist) in an average home which could be used in the course of an investigation. If they can force Apple to create a 5C backdoor, I can't see any reason they can't apply the same logic to WiFi cameras, Xbox Kinects, or even your car's OnStar. Heck, even TV remotes come with microphones and Bluetooth now... And don't get me started on Amazon Echo!
Fundamentally, the question is: can you force a device manufacturer to implement backdoors into their products to be used against their own customers? Notably, service providers have already lost that battle: they are required to architect their systems to be able to spy on their users and provide that data to law enforcement, often through specially designed real-time dashboards. At least in that case it is based on duly enacted legislation with that specific intent.
But this is something really quite shocking -- can investigators, simply through obtaining a warrant, force companies to re-design the personal devices that we own and keep with us almost every moment of the day to spy on us? I truly hope not.
Well stated. That's the crux. The technical difficulty of any given hack is going to vary and is ultimately irrelevant. The idea that the government can commandeer a company's resources towards its ends, especially when those ends compromise the security of a larger community, is a dangerous one.
Once in possession, they'd be free to contract an independent entity to modify the firmware and resign it with Apple's own key. It can't be all that hard to NOP out the code that increments the unlock attempt counter and associated delay mechanisms.
"As many jailbreakers are familiar, firmware can be loaded via Device Firmware Upgrade (DFU) Mode. Once an iPhone enters DFU mode, it will accept a new firmware image over a USB cable. Before any firmware image is loaded by an iPhone, the device first checks whether the firmware has a valid signature from Apple. This signature check is why the FBI cannot load new software onto an iPhone on their own — the FBI does not have the secret keys that Apple uses to sign firmware."
In these older devices there are still caveats, and a customized version of iOS will not immediately yield the phone passcode. Devices with A6 processors, such as the iPhone 5C, contain a hardware key that can never be read out, and this hardware key is “tangled” with the phone passcode. However, there is nothing stopping iOS from querying this hardware key as fast as it can. Without the Secure Enclave to play gatekeeper, this means iOS can guess one passcode every 80ms.
And even if it did have a Secure Enclave, from page 5 of your link:
"When an iOS device is turned on, its application processor immediately executes code from read-only memory known as the Boot ROM. This immutable code, known as the hardware root of trust, is laid down during chip fabrication, and is implicitly trusted. The Boot ROM code contains the Apple Root CA public key, which is used to verify that the Low-Level Bootloader (LLB) is signed by Apple before allowing it to load. This is the first step in the chain of trust where each step ensures that the next is signed by Apple. When the LLB finishes its tasks, it verifies and runs the next-stage bootloader, iBoot, which in turn verifies and runs the iOS kernel."
So the LLB is the first Apple-signed stage to run, and a modified, Apple-signed LLB could be used to contact the Secure Enclave and guess passcodes.
And from the article:
At this point you might ask, “Why not simply update the firmware of the Secure Enclave too? That way, Apple could disable the protections in iOS and the Secure Enclave at the same time.” [Struck out in the original: Although it is not described in Apple iOS Security Guide, it is believed that updates to the Secure Enclave wipe all existing keys stored within it. An update to the Secure Enclave is therefore equivalent to erasing the device.] I initially speculated that the private data stored within the SE was erased on update, but I now believe this is not true. After all, Apple has updated the SE with increased delays between passcode attempts and no phones were wiped. In all honesty, only Apple knows the exact details.
So, it may be that Apple does have a backdoor to upgrade the SE. Only Apple (and maybe other state actors) really know this. So, it certainly isn't as cut and dried as you imply, if the device did have a Secure Enclave.
But, in this case, the device does not have a SE, so it is clear that with Apple's keys the device could, with a practical amount of effort, be hacked.
People on the Internet tend to believe that technology poses confounding problems for the law, but the law has been dealing with technical challenges for centuries. See, for instance, any case involving a complicated medical issue.
Can you name some of them?
IANAL, but the Federal Rules of Evidence, Rule 706, explicitly state, "But the court may only appoint someone who consents to act."
The problem is different now than it was centuries ago when technology was oil, gas, and steam powered. Now it is electric, meaning information moves at the speed of light all around the world simultaneously.
Such technology is getting better, faster, and cheaper at exponential rates leading to ephemeralization. Meanwhile paper-based political processes have stalled and remain slow as ever.
Say a paper-based political system begins the process of banning a new technology. By the time they finally get around to completing their process, 5 better, faster, and cheaper technologies have already been invented making the old one irrelevant. This is an increasingly important problem to deal with considering the existential nature of paper-based political processes and the rate of technological change.
Paper-based information systems and processes are simply too slow to keep up with the speed of the electric medium. It's like trying to race a lightning bolt.
In a way, this would resemble war-time confiscation of production capability. You have a car factory, for instance, and the government tells you that from now on, you'll be producing tanks, thank you very much.
I think this was not strictly speaking what happened in the US during Second World War, because companies were willing enough to produce war material for the US government in exchange for considerable sums of money. But for instance, the Skoda factory in Czechoslovakia was just confiscated and directed to making military vehicles.
This almost certainly ends up at the Supreme Court; given the national security implications of the case, it's plausible a certiorari petition could be made by either Apple or the government. And yet we have a 4-4 court right now.
 Or really any small tweak. I can remember at least a couple of times being asked to provide someone with a special build of software I worked on that, e.g., logged something it didn't ordinarily log to help debug an issue a customer was having that we couldn't reproduce in-house. Can't say I ever thought of that as creating new software, even if I added some fprintf statements that weren't there before.
Can a court order someone to delete phrases from a letter or book?
Code is protected under copyright law. Code is a form of speech. How is either a court, or law enacted by Congress, telling someone what to write or delete what's already written, not abridging speech? Why is code different?
A bigger question might be whether the government has the ability to require Apple to do this without compensation, and if it does, who gets to set the rates or define the project -- Apple or the government? Possibly a better tack for Apple (discounting the setting of precedent, etc.) would be to bill the government $700-$1000/hour per developer, assign a team of 200-300 developers to this, and set an expected delivery date of 3-5 years. As the government starts paying tens of millions of dollars a month for, in effect, nothing, the public would quickly become outraged and force the question of compensation into court. At the end of the day, the people/government would have to decide whether it's worth either 1) forcing Apple to do this uncompensated or 2) paying that much to decrypt one phone -- neither of which, I believe, would be widely popular/supported.
Regardless of what happens, unless the FBI withdraws this request, which is highly unlikely -- they have clearly chosen this case for its terrorism connection to be their legal Alamo -- this is going to end up in front of the Supreme Court.
Almost certainly this expertise is billable to the government, and won't take $100 million, I'd be surprised if it took $1 million, but what do I know?
The more concerning thing is that the FBI almost can't lose. If they lose the case in court, they've put pressure on the public and Congress, and will continue to, to change this in law -- which is why I think it's important to establish constitutional reasoning for why this is a bad idea.
At the end of the day, there really is no way the government will succeed here; you really can't force Apple's developers to do this, as they could just tell the government to go pound sand. The only way to get it done is through threat of violence/incarceration or through gigantic sums of money. And the government can't really argue that the time of the developers at a company with billions in profits isn't worth a few billion.
You don't get to play games like this with the judiciary. They can require you to allow an outside auditor to review the code.
Abridge means to curtail. Telling someone to create something doesn't curtail anything.
The phone in question is an iPhone 5C which does not have a Secure Enclave, only a burned-in hardware ID.
It's also worth pointing out that there's no clear evidence that the Secure Enclave must clear all keys as part of a SE firmware update (just speculation that this would be reasonable).
It is being asked to help exploit an existing backdoor in a device they produced; a device that was used by a person who slaughtered many people.
If the "10 false codes and the device is bricked" can be circumvented, there is a backdoor.
The 10-try deletion is a completely separate system from the encryption. If you stop it from deleting, the encryption is still intact. No one will be able to stop this from happening short of Apple taking action or divine intervention, so there is no backdoor here.
What you seem to be making is a philosophical point: if in the future Apple could design software to backdoor your phone, then there is the potential for a backdoor right now, and therefore there is a backdoor right now. This is false. There is no backdoor. Apple, being god in this case, could make a backdoor. Apple, or god (whichever you prefer), didn't make a backdoor, so there is no backdoor.
This is not a "back door". It's just not.
The bottom line is that Apple produced a device whose security features could be circumvented.