
Crucially, this is software which doesn't currently exist in the world and which Apple has no intention of voluntarily writing. There is no specific law or regulation (like CALEA) which requires Apple to provide this functionality.

What the FBI is attempting is to use the All Writs Act of 1789, which authorizes federal courts to issue "all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law." Are there limits to what a judge can order a person, or a company, to provide?

A warrant describes "the place to be searched, and the persons or things to be seized." I would not expect a judge can draft a warrant for something which doesn't actually exist, and then force someone to create it.

This is not about providing physical access, or about producing documents which are in your possession. This is whether the government can usurp your workforce to make you create something that only you are capable of creating, against your will, not because there's actually a law which says you have to provide that capability, but simply because some investigator has probable cause that given such a tool they could use it to find evidence of a crime!

If 'All Writs' somehow does give the government the ability to conscript software developers into creating this particular backdoor, what is there to legally differentiate this request from, for example, one that would function over WiFi or LTE remotely?

There's been a lot of discussion about the 'secure enclave' and how this particular attack isn't possible on the iPhone 6. I think that's missing the point.... If 'All Writs' can force Apple to open a black-hat lab responsible for developing backdoor firmware for the 5C, then it can do the same for the 6. For example, why not force Apple to provide remote access to a suspect's device over LTE while the device is unlocked / in use? While we're at it, the iPhone has perfectly good cameras and microphones, let's force Apple to provide real-time feeds.

Think about the sheer quantity of networked devices which exist (or will exist) in an average home and could be used in the course of an investigation. If they can force Apple to create a 5C backdoor, I can't see any reason they can't apply the same logic to WiFi cameras, Xbox Kinects, or even your car's OnStar. Heck, even TV remotes come with microphones and Bluetooth now... And don't get me started on Amazon Echo!

Fundamentally, the question is: can you force a device manufacturer to implement backdoors into their products to be used against their own customers? Notably, service providers have already lost that battle; they are required to architect their systems to be able to spy on their users and provide that data to law enforcement, often through specially designed real-time dashboards. At least in that case it is based on duly enacted legislation with that specific intent.

But this is something really quite shocking -- can investigators, simply through obtaining a warrant, force companies to re-design the personal devices that we own and keep with us almost every moment of the day to spy on us? I truly hope not.

> There's been a lot of discussion about the 'secure enclave' and how this particular attack isn't possible on the iPhone 6. I think that's missing the point.... If 'All Writs' can force Apple to open a black-hat lab responsible for developing backdoor firmware for the 5C, then it can do the same for the 6.

Well stated. That's the crux. The technical difficulty of any given hack is going to vary and is ultimately irrelevant. The idea that the government can commandeer a company's resources towards its ends, especially when those ends compromise the security of a larger community, is a dangerous one.

Forcing Apple to develop said backdoor aside... I have to wonder if it wouldn't be much easier, legally speaking, for the government to force Apple to hand over its firmware signing keys? Those undoubtedly do already exist.

Once in possession, they'd be free to contract an independent entity to modify the firmware and re-sign it with Apple's own key. It can't be all that hard to NOP out the code that increments the unlock-attempt counter and the associated delay mechanisms.

It doesn't quite work that way. The files on the device are ultimately encrypted with a dependency on both the hardware key in the Secure Enclave and the user's passcode. It's impossible to update the firmware without both pieces of information or erasing the device, and on A7 or later processors the unlock attempt delay is a direct result of the method of encryption used and tied to the hardware so it must be performed on the device itself.
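The "tangling" described above can be illustrated with a toy key-derivation sketch. This is not Apple's actual algorithm (the real KDF is hardware-bound and only described at a high level in the iOS Security Guide); it just shows why the derived key can't be computed off-device:

```python
# Toy illustration of "tangling" a passcode with a device-unique key.
# NOT Apple's actual algorithm; it only shows why the derived key
# cannot be computed off-device: the UID key never leaves the chip.
import hashlib
import hmac
import os

DEVICE_UID_KEY = os.urandom(32)  # stand-in for the key burned into silicon

def derive_file_key(passcode: str, iterations: int = 100_000) -> bytes:
    """Stretch the passcode, then mix in the device key, so the result
    is useless without access to this specific device's hardware."""
    stretched = hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), b"per-device-salt", iterations
    )
    return hmac.new(DEVICE_UID_KEY, stretched, hashlib.sha256).digest()

assert derive_file_key("1234") == derive_file_key("1234")  # deterministic
assert derive_file_key("1234") != derive_file_key("1235")  # passcode matters
```

Because `DEVICE_UID_KEY` (the stand-in for the silicon UID) is unreadable by software, every guess has to go through the device itself, which is what makes off-device brute-forcing infeasible.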


Do we have any documentation that indicates it's impossible to do firmware updates without the user's passcode? That's what the government seems to believe that it could (at present) do, given Apple's cooperation. The iOS Security Guide doesn't seem to address that particular point.

It's possible to reflash the firmware in DFU mode, but it requires erasing the data on the phone. Or more technically, there's an encryption key in storage that needs to be regenerated to give access to the filesystem, but that encryption key (in conjunction with the hardware key in the Secure Enclave and the user's passcode, if set) is also necessary to read any user data on the phone.

Are you saying the article is wrong or did you not read it?

"As many jailbreakers are familiar, firmware can be loaded via Device Firmware Upgrade (DFU) Mode. Once an iPhone enters DFU mode, it will accept a new firmware image over a USB cable. Before any firmware image is loaded by an iPhone, the device first checks whether the firmware has a valid signature from Apple. This signature check is why the FBI cannot load new software onto an iPhone on their own — the FBI does not have the secret keys that Apple uses to sign firmware."

Yes, you can only load new firmware onto the device by erasing it, which would make it pointless.

<spoon feed> At this point it is very important to mention that the recovered iPhone is a 5C. The 5C model iPhone lacks TouchID and, therefore, lacks the single most important security feature produced by Apple: the Secure Enclave.

In these older devices, there are still caveats and a customized version of iOS will not immediately yield access to the phone passcode. Devices with A6 processors, such as the iPhone 5C, also contain a hardware key that cannot ever be read and also “tangle” this hardware key with the phone passcode. However, there is nothing stopping iOS from querying this hardware key as fast as it can. Without the Secure Enclave to play gatekeeper, this means iOS can guess one passcode every 80ms. </spoon feed>
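A quick back-of-the-envelope check of that 80ms figure, showing worst-case time to exhaust every numeric passcode once the throttling is gone (my own arithmetic, not from the article):

```python
# Worst-case time to exhaust all numeric passcodes at one guess
# every 80 ms, i.e. with no Secure Enclave throttling attempts.

GUESS_TIME_S = 0.080  # one guess every 80 ms, per the article

def worst_case_hours(digits: int) -> float:
    """Hours to try all 10**digits numeric passcodes at 80 ms each."""
    return (10 ** digits) * GUESS_TIME_S / 3600

print(f"4-digit passcode: {worst_case_hours(4) * 60:.1f} minutes")  # ~13.3
print(f"6-digit passcode: {worst_case_hours(6):.1f} hours")         # ~22.2
```

In other words, a 4-digit passcode falls in minutes and even a 6-digit one in about a day, which is why disabling the guess-rate protections is the whole game.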

And even if it did have a Secure Enclave, from page 5 of your link:

When an iOS device is turned on, its application processor immediately executes code from read-only memory known as the Boot ROM. This immutable code, known as the hardware root of trust, is laid down during chip fabrication, and is implicitly trusted. The Boot ROM code contains the Apple Root CA public key, which is used to verify that the Low-Level Bootloader (LLB) is signed by Apple before allowing it to load. This is the first step in the chain of trust where each step ensures that the next is signed by Apple. When the LLB finishes its tasks, it verifies and runs the next-stage bootloader, iBoot, which in turn verifies and runs the iOS kernel.

So the LLB is run first, and could be used to contact the Secure Enclave and guess passwords. And from the article:

At this point you might ask, “Why not simply update the firmware of the Secure Enclave too? That way, Apple could disable the protections in iOS and the Secure Enclave at the same time.” [Struck through in the article: Although it is not described in the Apple iOS Security Guide, it is believed that updates to the Secure Enclave wipe all existing keys stored within it. An update to the Secure Enclave is therefore equivalent to erasing the device.] I initially speculated that the private data stored within the SE was erased on update but I now believe this is not true. After all, Apple has updated the SE with increased delays between passcode attempts and no phones were wiped. In all honesty, only Apple knows the exact details.

So, it may be that Apple does have a backdoor to upgrade the SE. Only Apple (and maybe other state actors) really know this. So, it certainly isn't as cut and dried as you imply, if the device did have a Secure Enclave.

But, in this case, the device does not have a SE, so it is clear that with Apple's keys the device could, with a practical amount of effort, be hacked.
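The verified boot chain quoted above from the Security Guide can be sketched as a toy model. Real devices check RSA signatures against the Apple Root CA key baked into Boot ROM; HMAC stands in for a signature here so the sketch stays self-contained:

```python
# Toy model of the verified boot chain (Boot ROM -> LLB -> iBoot -> kernel).
# HMAC stands in for Apple's RSA signature scheme; names are illustrative.
import hashlib
import hmac

SIGNING_KEY = b"stand-in for Apple's signing key"

def sign(image: bytes) -> bytes:
    return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

def verify_and_boot(chain):
    """Each stage's signature is checked before it is allowed to run."""
    for name, image, signature in chain:
        if not hmac.compare_digest(sign(image), signature):
            raise RuntimeError(f"{name}: bad signature, refusing to boot")
        print(f"{name}: signature OK, executing")

llb, iboot, kernel = b"LLB image", b"iBoot image", b"kernel image"
verify_and_boot([
    ("LLB", llb, sign(llb)),
    ("iBoot", iboot, sign(iboot)),
    ("kernel", kernel, sign(kernel)),
])

# A patched image without a valid signature breaks the chain:
try:
    verify_and_boot([("kernel", b"patched kernel", sign(kernel))])
except RuntimeError as err:
    print(err)
```

This is exactly why the FBI needs Apple: anyone can build a modified image, but only the holder of the signing key can make the chain accept it.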

Under extraordinary circumstances, compelled expert witness testimony has lots of precedents, including situations where experts are required to expend resources to develop that testimony.

People on the Internet tend to believe that technology poses confounding problems for the law, but the law has been dealing with technical challenges for centuries. See, for instance, any case involving a complicated medical issue.

Yes, you can compel expert witness testimony, but this is different. Testimony is answering questions. Here, they are seeking to compel engineering/coding work. I've never seen that compelled by a subpoena.

If you read old-ish legal journal articles about expert witness compensation, you find that the requirement to do up-front work in order to generate the knowledge required to handle questions is a dividing line for whether (or, at least, whether in the 1960s) expert testimony must be compensated. From that, I gather that this kind of request isn't unprecedented.

> Under extraordinary circumstances, compelled expert witness testimony has lots of precedents,

Can you name some of them?

IANAL, but Rule 706 of the Federal Rules of Evidence explicitly states, "But the court may only appoint someone who consents to act."

Compelling expert testimony happens all the time. (You might have to pay for it, but the witness has no choice.) The Apple subpoena is different. It is an attempt to compel engineering and coding work.

I would argue that expending resources is different from creating something entirely new though.

>People on the Internet tend to believe that technology poses confounding problems for the law, but the law has been dealing with technical challenges for centuries.

The problem is different now than it was centuries ago when technology was oil, gas, and steam powered. Now it is electric, meaning information moves at the speed of light all around the world simultaneously.

Such technology is getting better, faster, and cheaper at exponential rates leading to ephemeralization. Meanwhile paper-based political processes have stalled and remain slow as ever.

Say a paper-based political system begins the process of banning a new technology. By the time they finally get around to completing their process, 5 better, faster, and cheaper technologies have already been invented making the old one irrelevant. This is an increasingly important problem to deal with considering the existential nature of paper-based political processes and the rate of technological change.

Paper-based information systems and processes are simply too slow to keep up with the speed of the electric medium. It's like trying to race a lightning bolt.

But DARPA sees 20 years ahead, and if they decide to ban a technology, I'm sure the legal process will be expeditious enough.

> This is whether the government can usurp your workforce to make you create something that only you are capable of creating

In a way, this would resemble war-time confiscation of production capability. You have a car factory, for instance, and the government tells you that from now on, you'll be producing tanks, thank you very much.

I think this was not, strictly speaking, what happened in the US during the Second World War, because companies were willing enough to produce war material for the US government in exchange for considerable sums of money. But the Skoda factory in Czechoslovakia, for instance, was simply confiscated and directed to make military vehicles.

If you have done nothing wrong, broken no law, can a court order you to write a letter or a book? If not, then on what basis does a court order someone (a company) to write code? This isn't just a matter of removing the anti-bruteforcing code, the FBI has also asked for a way to quickly iterate passwords without using the on-screen keypad. That's creating new functionality, not just removing existing functionality. This seems specious.

This almost certainly ends up at the Supreme Court, and due to the national security implications of the case it's plausible a certiorari petition could be filed by either Apple or the government. And yet we have a 4-4 court right now.

I think most technical people, under ordinary circumstances, if asked to make a build of some software with a security feature disabled,[1] would not consider it akin to being asked to create something new, like writing a letter or a book.

[1] Or really any small tweak. I can remember at least a couple of times being asked to provide someone with a special build of software I worked on that, e.g., logged something it didn't ordinarily log to help debug an issue a customer was having that we couldn't reproduce in-house. Can't say I ever thought of that as creating new software, even if I added some fprintf statements that weren't there before.

Adding a means for the FBI to input passcodes via the Lightning connector instead of the touch screen is certainly a nontrivial addition.

Can a court order someone to delete phrases from a letter or book?

Code is protected under copyright law. Code is a form of speech. How is either a court, or law enacted by Congress, telling someone what to write or delete what's already written, not abridging speech? Why is code different?

This would not be a "freedom of speech" issue, as it's not preventing speech (which is what the 1st amendment protects). It's compelling (arguably) a significant amount of labor/work that in the end results in compelled speech (code) -- in a way, this is more of a 5th amendment (due process) and 13th amendment (involuntary servitude) issue.

A bigger question might be whether the government has the ability to require Apple to do this without compensation, and if so, who gets to set the rates or define the project -- Apple or the government? Possibly, a better tack for Apple (discounting the setting of precedent, etc.) would be to bill the government $700-$1000/hour per developer, assign a team of 200-300 developers to this, and quote an expected delivery date of 3-5 years. As the government starts paying out $100 million/month for, in effect, nothing, the public would very quickly become outraged and force the question of compensation into court. At the end of the day, the people/government would have to decide whether it's worth either 1) forcing Apple to do this uncompensated or 2) paying that cost to decrypt one phone -- neither of which, I believe, would be widely popular/supported.

Regardless of what happens, unless the FBI withdraws this request, which is highly unlikely -- they have clearly chosen this case due to the terrorism connection to be its legal Alamo -- this is going to end up in front of the Supreme Court.

I agree the compelling of work is more convincing than the court compelling speech stated or withheld. But code is a language, it's not just building, it's something of a hybrid. And the court order requires both the creation of new code, as well as the inhibition (deletion) of previous code, so it's telling Apple to change their speech as well as their reputation using their own labor to do so.

Almost certainly this expertise is billable to the government, and won't take $100 million, I'd be surprised if it took $1 million, but what do I know?

The more concerning thing is FBI almost can't lose. If they lose the case in court, they've put pressure, will continue to put pressure, on the public and Congress to change this in law, which is why I think it's important to establish constitutional reasoning for why this is a bad idea.

Well, the question would be: who gets to dictate the size/scope of the project and the rate charged to the government? Is any person outside of Apple really qualified to make that claim? It's not open source code, so Apple could make any claim they wanted. Also, who says they would have to put their 'A' players on this? Why not back-burner it with specially hired 'E' and 'F' level developers? Charging $700-$1000/hour might seem high, but on a custom development project it may not be too far out of line.

At the end of the day, there really is no way the government will succeed here; you really can't force Apple's developers to do this, as they could just tell the government to go pound sand. The only way to get it done is through threat of violence/incarceration or gigantic sums of money. And the government can't really argue that, for Apple with its billions in profits, its developers' time isn't worth a few billion.

>It's not open source code, so Apple could make any claim they wanted.

You don't get to play games like this with the judiciary. They can require you to allow an outside auditor to review the code.

The First Amendment doesn't protect written works, it protects expression. Non-written works can be expression (black arm bands worn in protest), and written works can be functional and non-expressive (e.g. a legal contract). Code is only speech when it's being used as a form of expression. If it's just used to make a commercial product go, it's not speech.

> How is either a court, or law enacted by Congress, telling someone what to write or delete what's already written, not abridging speech?

Abridge means to curtail. Telling someone to create something doesn't curtail anything.

Adding an extra log line is a bit different than custom tailoring a hardware id check that can't trivially be reverse engineered by a highly motivated party.

Can the All Writs Act not then be used to acquire Apple's private keys? After which law enforcement could create the needed software?

Nope, because everything is ultimately dependent on the hardware key in the Secure Enclave and the user's passcode, neither of which Apple has knowledge of. The FBI would be able to create the OS, but they wouldn't be able to load it on the device without erasing it. It's really a brilliant system.

>Nope, because everything is ultimately dependent on the hardware key in the Secure Enclave

The phone in question is an iPhone 5C which does not have a Secure Enclave, only a burned-in hardware ID.

It's also worth pointing out that there's no clear evidence that the Secure Enclave must clear all keys as part of a SE firmware update (just speculation that this would be reasonable).

Compelling production of Apple's private keys would (for better or worse) be a far easier legal question than what this subpoena seeks to compel. Remember the Lavabit case? The private keys were ordered produced there. The difference here is that what is sought to be compelled doesn't exist now; it will have to be created through expert engineering and software design work.

Apple is not being asked to engineer a new backdoor.

It is being asked to help exploit an existing backdoor in a device they produced; a device that was used by a person who slaughtered many people.

There is no backdoor

Why would Cook not explicitly make that case?

If the "10 false codes and the device is bricked" can be circumvented, there is a backdoor.

The encryption only has one way through, with the passkey. So the encryption works and this is good. No backdoor there.

The 10-try deletion is a completely separate system from the encryption. If you stop it from deleting, the encryption is still intact. No one will be able to stop this from happening short of Apple taking action or divine intervention, so there is no backdoor here.

What you seem to be pointing out is a philosophical point: if in the future Apple designs software to backdoor your phone, then there is the potential for a backdoor right now, therefore there is a backdoor right now. This is false. There is no backdoor. Apple, being god in this case, could make a backdoor. Apple, or god (whichever you prefer), didn't make a backdoor, so there is no backdoor.
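The separation being argued here can be modeled in a few lines. This is a toy sketch of the idea, not Apple's code; in the real device the check involves key derivation rather than a string compare, and all names below are made up:

```python
# Toy model: the auto-erase guard is policy code wrapped AROUND the
# crypto, not part of the crypto itself. Removing the guard leaves the
# encryption untouched -- which is the commenter's point above.
class PasscodeGate:
    MAX_ATTEMPTS = 10

    def __init__(self, correct_passcode: str):
        self._correct = correct_passcode
        self.failed_attempts = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("device erased")
        if guess == self._correct:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= self.MAX_ATTEMPTS:
            self.wiped = True  # the guard the FBI wants disabled
        return False

gate = PasscodeGate("1234")
assert not gate.try_unlock("0000") and gate.failed_attempts == 1
assert gate.try_unlock("1234")  # the unlock path itself is unchanged
```

Disabling the `wiped` branch doesn't weaken the encryption; it just removes the penalty for guessing, turning a locked door into one you can knock on forever.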

That's not so much a back door as a potential vulnerability.

A distinction in phrase only.

Not true. To say something has a "back door" is to imply security was deliberately compromised by the manufacturer, whereas a vulnerability is a security weakness in the architecture or implementation.

This is not a "back door". It's just not.

I get what you're saying but it would have made more sense had Apple not added the Secure Enclave to newer iPhones. I.e. they tried to make it more secure, so I have to conclude that they had tried their best with the older versions, too.

A simple flow chart or state diagram would show the vulnerability. I'm not saying it was intentional, but I believe the backdoor label can be applied to accidental entrances as well.

The bottom line is that Apple produced a device whose security features could be circumvented.

All security features can be circumvented. Security is not absolute
