In cases like this it is usually the FBI and the government that have the PR upper hand. They wait for a crime so abhorrent that nobody sane would want to defend it (terrorism seems to work well today) and use that as the example: "Oh look, everyone, Apple doesn't want to fight terrorism. We are keeping people safe; whose side are they on?" It is almost too easy.
Fighting that is an uphill battle. One can present the technical details ("They could have cracked that particular model themselves") or appeal to more general ideals of freedom, privacy, and so on. Those are typically not as effective at convincing the average Joe when the other side uses the "T" word.
But here they are playing the same card as the FBI -- using a crime that most people genuinely fear: their phone getting stolen, their identity misused. Everyone has heard the stories, or at least has friends this happened to. So it works well. Terrorism is scarier, but this is more real. Great work.
They [Apple] sell phones, they don't sell civil liberties, they don't sell public safety, that's our business to worry about. 
Oh, they understand what they are asking; they just don't care if the average person's data is stolen every once in a while, as long as they can get all the information they want when they want it. This is much like how the NSA can buy zero-day exploits and not feel the need to inform the American people.
Apple is highly professional in protecting its own innovation, its own information 
Comey even articulates Apple's position clearly,
Apple's argument, I think will be, that's not reasonable because there are risks around that. Even though we're good at this, it could still get away from us. And the judge will have to figure that out 
So Comey claims Apple is very good at technical matters and therefore must have a way to keep this software secure. Yet at the same time he claims we should not believe Apple when it says this backdoor would weaken everyone's security. On one hand he is saying we should believe in Apple's technical expertise, and on the other hand he is saying we should not. Well, given that Comey is, by his own admission, not knowledgeable about technology, and given that many other tech companies and independent tech experts have stood up to support Apple, it's clear to me whom to trust on this question of security.
Apple has demonstrated their commitment to safety by improving the iPhone's security with successive iterations of software and hardware. The FBI has demonstrated their deep desire to gain guaranteed backdoor entry to devices for over a year, yet claim this case is only about one phone. Apple's track record and rhetoric are much more consistent and trustworthy than the FBI's.
This software takes that right away.
Apple's job here is to educate the public. If this ever becomes a political issue, we want the public to be informed. We don't want the public to be reactionary.
Criminals already know how to use tech. We need a security force who can figure out how to maintain public safety without relying on backdoors.
Policing is only easy in a police state.
EDIT: This paper seems to document such an attack, decapping the chip and reading the memory cells using lasers (although the details are over my head).
The idea is once they have the phone they can do whatever they want with it, but they can't go to a manufacturer and force them to weaken the security of _all_ phones because they don't want to do any work.
The same is true for China and other major powers. There are enough foreign-origin people working at these companies to pick from. So the NSA is basically just keeping up with everyone else. Even smaller groups or organizations could have a spying system in place.
There is no real safety when it comes to information, as long as we use operating systems and software created and compiled by other people. It's just trust that keeps it running.
Its mechanism of action would be along the lines of reading the PIN code out of memory, tricking the user into entering their PIN into a fake screen, reading the decryption key out of memory, or making unauthorised calls to decrypt data.
In the case that we're all talking about, this capability would be pointless.
On the other hand, could a 3rd party create a crack that manipulated the PIN reset counter, in a way similar to that being requested by the FBI? Potentially with the right vulnerability.
(Note, I've only skimmed the Apple white paper, so I may be wrong here).
We'll have to see if the reverse engineers are able to reproduce and publish, but between this and my overall security cynicism, I lean towards there being a way in; I wouldn't be surprised if the intelligence agencies have it and are simply not sharing.
Strange to catch yourself in these thoughts but the narrative seems to work.
An attorney for the ACLU says, "The absolute earliest the case could reach the Supreme Court would be at least two years." 
Perhaps someone more knowledgeable could clarify, but....
Isn't the entire point of good security that I don't have to "trust" Apple to do the right thing? Shouldn't it be impossible for them to do the wrong thing?
For example, if I encrypt my phone with my key and set it to securely wipe after so many failed attempts, how could Apple circumvent this in a truly secure system? Shouldn't they be unable to push an update without my permission? Permission that can't be given, because the phone can't be decrypted?
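A minimal sketch of what "impossible for them to do the wrong thing" could look like, assuming the design Apple describes in its white paper, where the key is derived from your passcode entangled with a device-unique hardware secret (PBKDF2 and all the parameters below are stand-ins, not Apple's actual construction):

    import hashlib
    import os

    # Stand-in for the hardware-fused UID; in the real design it never
    # leaves the chip, so the derivation can only happen on the device.
    DEVICE_UID = os.urandom(32)

    def derive_data_key(passcode: str) -> bytes:
        # Illustrative KDF; the real derivation is reportedly tuned so a
        # single attempt takes on the order of 80 ms.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                                   DEVICE_UID, 100_000)

    # Without the passcode, there is no key for Apple to hand over.
    key = derive_data_key("123456")

Under that scheme, nobody (Apple included) can decrypt the data without either the passcode or a brute-force search run on the device itself.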
So the fact that Apple can circumvent the encryption if they want is an indication of a vulnerable system?
Apple can't circumvent its own encryption (not that we know of, anyway, and that hasn't been claimed). What the FBI knows is that the government has tools capable of breaking the encryption, given time: they essentially brute-force their way to the correct key.
Apple has protections in addition to the encryption, though. In this case there are two: a timing system that limits the guessing rate, and a 'self-destruct' system that destroys the data after too many failed guesses. These are the systems the FBI wants compromised. They would then set their normal password-cracking toolchain to work.
It's disturbing, because the govt are definitely not the only ones with such a toolchain. And when you increase the number of devices that can be scanned (by producing and signing a version of iOS without these protections), you increase the ability to crack at least one. In general 'the bad guys' have it easier, because they are rarely interested in cracking a specific phone, they are interested in cracking some of a larger set of them.
The aim is to have these kinds of security features in the silicon, so we don't even have to trust Apple to keep the extra protections in place. But that's not the case here.
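Some back-of-the-envelope arithmetic (my own, using the roughly 80 ms per-attempt key-derivation cost Apple documents in its iOS Security guide) shows why removing just those two protections matters:

    # Worst-case brute-force time once the delay and wipe are disabled;
    # the 80 ms figure is the hardware key-derivation cost per guess.
    ATTEMPT_COST_S = 0.08

    for digits in (4, 6):
        worst_case_s = (10 ** digits) * ATTEMPT_COST_S
        print(f"{digits}-digit PIN: ~{worst_case_s / 3600:.1f} hours worst case")

    # 4 digits: ~0.2 hours. 6 digits: ~22 hours.
    # With the protections intact, ten wrong guesses can end the attack.

So the encryption itself never has to be broken; stripping the rate limit and the wipe turns the problem into hours of guessing.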
One could argue that the phone is secure because even Apple has no way of recovering your passcode, nor do they have a master key that might give access to your data. That's fairly secure.
On the other hand, the system is insecure because it can have updates applied (which is what the FBI wants Apple to do) without requiring user consent, only physical access.
Imagine that updates required a user to put in their passcode, or otherwise wiped all the encrypted data; it would be more secure, but is that enough? Theoretically, there are still ways of getting at the data - someone might find a bug in the software, or they might physically crack open the CPU to get at a piece of needed data.
Security is always a balance - we want to trust Apple as little as possible, but I know of no way to create an invulnerable system.
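To make the hypothetical above concrete, here is a toy sketch of an update policy that requires consent or sacrifices the data (the function names are invented, and this is not how iOS actually behaves today):

    def verify_passcode(passcode: str) -> bool:
        return passcode == "123456"  # stand-in for a secure-hardware check

    def apply_update(image: bytes, passcode=None) -> str:
        if passcode is not None and verify_passcode(passcode):
            return "firmware flashed, data keys preserved"
        # No consent: the update can still proceed, but only after the
        # data-encryption keys are destroyed, so physical access alone
        # never yields the plaintext.
        return "data keys wiped, then firmware flashed"

    print(apply_update(b"new-os"))            # keys wiped first
    print(apply_update(b"new-os", "123456"))  # data survives

Even then, as noted, a software bug or a physical attack on the chip could still get at the data; this just removes the "signed update plus physical access" path.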
> "The SIF will be loaded via Device Firmware Upgrade (“DFU”) mode, recovery mode, or other applicable mode available to the FBI."
I imagine they are working hard on preventing themselves from being able to bypass the failsafes.
Sometimes the NSA feeds information to the FBI, DEA, etc, etc. Recipients do parallel construction. And then lie to courts about it. Which is perjury. How often does that get revealed?
Have any prosecutions resulted from widely practiced perjury about obfuscated stingray use? That behavior is common knowledge, no?
I hear people say we can't have universal healthcare because a decently priced tri care is one of the main draws for joining the military and we'd have to pay more to recruit and retain soldiers if we have universal healthcare. What basic rights do we curtail so it will be easier for the NSA to recruit in the future?
Well, first you'd have to have evidence that such a thing happened.
That evidence is widely available. If you want more, just search for it. This training manual should get you started.
Yet no one was charged.
Had some facts way wrong. Here's the story I was thinking of. It was money laundering outside of their jurisdiction. No one was charged.
The FBI has a long history of colluding with criminal informants. FBI agents tend to become virtually indistinguishable from said informants. Look at the Silk Road cases. Or all those murders in Boston, back in the day.
Anything that requires reflashing the phone, or that writes log files at boot or alters as much as a single bit on the phone is a no-go from an evidentiary standpoint.
Edit: but anyway, iOS 7 has no understanding of the encrypted format introduced in iOS 8
The FBI doesn't just want an unlocked phone; they want an unlocked phone with the data intact. So Apple would still need to write and test a lot of code that performs reverse database migrations, possibly losing information in the process.
I don't know enough to know whether this works. I suspect apps that didn't exist in iOS 7 would have their data deleted, although perhaps this wouldn't happen if it wasn't unlocked (and it would be power-cut as soon as it was cracked).
The signed software with bad behavior is precisely what Apple is afraid to unleash upon the world. Some FBI agents have recently shown themselves to be extremely untrustworthy with valuable items.
(The existing vulnerabilities would be the "cut power after each attempt" one, I assume.)
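For what it's worth, here is how I understand the "cut power after each attempt" idea (this is my reading, not a documented procedure): if the failed-guess counter is committed to flash after the guess is checked, an attacker who kills power (or restores a mirrored flash image) in that window never lets the counter advance.

    flash_counter = 0  # persisted failure count
    MAX_FAILURES = 10

    def try_pin(guess: str, cut_power_before_commit: bool) -> bool:
        global flash_counter
        if flash_counter >= MAX_FAILURES:
            raise RuntimeError("device wiped")
        ok = (guess == "4821")  # stand-in for the real check
        if not ok and not cut_power_before_commit:
            flash_counter += 1  # the write the attacker races against
        return ok

    # Attacker loop: the counter never commits, so the wipe never fires.
    for pin in (f"{i:04d}" for i in range(10_000)):
        if try_pin(pin, cut_power_before_commit=True):
            print("found:", pin)
            break

It would be slow and fiddly in practice (each cycle involves a reboot), which is presumably why the FBI would rather have Apple's signed bypass.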
Basically, if the FBI succeeds in forcing Apple to make the iPhone knowingly vulnerable to law enforcement and criminals alike, its actions are tantamount to being an accomplice to every crime that vulnerability enables.
While that may be a good design for various practical reasons, the technical properties of Apple's security design are not very relevant. This is a political issue about how far Apple (and eventually, any of us) has to go in creating access for law enforcement.
Right now the FBI is being reasonably diplomatic and "asking" (formally, through the court) to create this backdoor. If the situation were as you suggest and no update capability existed, they would simply skip the current step and move on to the next step: forcing Apple to include explicit "lawful intercept" capabilities. It is very important for us to win this earlier stage of the fight, because it will be much harder to fight a law.
In case you've forgotten, the traditional telecom industry already lost that fight (CALEA).
But "we", as in you and I, cannot easily do this. Because we can't sign stuff as Apple. But maybe it's doable.
If this doesn't happen, I would highly suspect there is an NSL (or something similar) involved forbidding it.
I appreciate Apple's willingness to fight, but this article could have benefited from less bragging PR talk and a little more humility. If the issue really is larger than Apple (and they are claiming it is) then we don't need this kind of messaging.
The message is that being forced to create a backdoor is bad for everyone: they're doing it to Apple today, but they'll do it to everyone else tomorrow.
I would be much happier if more companies actually tried on security, competed on that vector, and then bragged to high heaven about their successes. Usually when you hear a company touting its security, it's snake oil. The companies that are actually putting forth a solid effort to compete on security -- and I think Apple has a reasonable claim to be one of them (even if there are nits to pick) -- have my blessing to brag as much as they can about it. I'm talking multimillion-dollar ad campaigns touting their (actual) security advantages versus competitors. Have at it. Donald Trump style. Go wild!
Any company that "bragged to high heaven about their successes" with security is setting itself up to fail. As the old saying goes, "pride comes before a fall".
Not saying that Apple aren't doing a good job. Just saying, bragging and actively trying to differentiate, followed by a major security flaw, is a good way to lose trust.
I guess this is part of the reason I have an aversion to companies speaking about their security. I know Apple security is very good (anyone curious can read the white paper), but most of the time corporations speak about security in a way completely detached from reality.
In Cook's initial letter he was so careful about stating his belief that the FBI's intentions were good that he said so twice. That was the warning shot. This piece, from Federighi, was a shot to the head.
I still got that vibe from the article, despite the one line you pointed out.
That message becomes more clear when you see the support coming from Google, Facebook etc. It's clear they stand with Apple on the side of keeping encryption legal, secure, and backdoor-free. Apple doesn't need to speak for them too much, and I think Apple can still argue backdoors are bad for the whole tech industry while speaking about their own perspective, experience, and products.
And if Apple doesn't find a way, you can sure bet the NSA will work harder on its efforts to break the iPhone in the general case.
The answer can be "No." Or the answer can be unknowable.
That's the problem with compromise positions: If you make a product that helps keep Chinese, Russian, Iranian, etc. dissidents out of prison, it might very well be good enough to keep the NSA out, too. It's too dangerous to try to fine-tune the level of security.
We're facing an election in the US about which people are half-joking it might be the last. That should be a wakeup call for people who think we can arrive at a technological compromise.
If they can, then Apple can be compelled to deliver the goods on these terrorists. If not, then no.
I really don't see this issue as all that complex.
In fact, our friends at the CIA have given this prospect plenty of consideration: https://www.cia.gov/library/center-for-the-study-of-intellig...
It's always the issues that don't seem all that complex that end up coming back and biting us in the ass.
No, not quite. The safe designer can more easily figure out how to bypass his autodestruct mechanisms than anyone else, but that doesn't mean that he "has a code" that bypasses them. In this analogy, it's still nontrivial for the safe designer to perform the work, but it's (probably) easier for him than for anyone else.
They don't have any bypass code. They have detailed plans for the autodestruct mechanism. That's it. They are probably the entity on the planet with the best understanding of the autodestruct system, but the autodestruct system does not have an "off" switch.
To reverse the analogy: The safe designer's bypass "code" is analogous to a back door already built into iOS that opens when presented with some information held in secret by Apple. (Think CALEA wiretapping infrastructure built into US-destined telecom infrastructure equipment.)
Because Apple has to write a special version of iOS to fulfil FedGov's request, we can reasonably assume that there is no pre-built back door in iOS that sits around, awaiting the secret information.
No, the "code"  in Apple's case is the yet-to-be-written version of iOS that can both be loaded at boot time to bypass the PIN retry count and retry delay functions and -to a sufficiently high degree of certainty- only be run on the intended iPhone.
Ergo, your analogy is -as I said- not quite right.
 And it's not a "code", but a bypass procedure.
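For the curious, the requirement that the build "only be run on the intended iPhone" suggests binding the signature to the device's hardware identifier. A rough sketch of that idea (HMAC here is just a stand-in for Apple's real signing scheme, which is asymmetric; all names are illustrative):

    import hashlib
    import hmac

    APPLE_SIGNING_KEY = b"stand-in secret"  # illustrative only

    def sign_for_device(image: bytes, device_id: str) -> bytes:
        # Bind the signature to one device's hardware identifier, so the
        # same signed image refuses to validate on any other phone.
        return hmac.new(APPLE_SIGNING_KEY, image + device_id.encode(),
                        hashlib.sha256).digest()

    def boot_accepts(image: bytes, sig: bytes, this_device: str) -> bool:
        return hmac.compare_digest(sig, sign_for_device(image, this_device))

    sig = sign_for_device(b"bypass-ios", "TARGET-PHONE")
    print(boot_accepts(b"bypass-ios", sig, "TARGET-PHONE"))  # True
    print(boot_accepts(b"bypass-ios", sig, "OTHER-PHONE"))   # False

The binding itself can be made strong; the catch is that nothing stops the holder of the signing key from producing the same image bound to a different phone.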
You then made further mistakes. The signed build does not need to be limited to a single device, so it has no place in the analogy. The FBI allows Apple to keep the build themselves and delete it after use.
Please try to keep up, or you'll continue to tilt at windmills, and you won't learn anything.
You misunderstand what business this safe manufacturer is in. He's in the business of high-security safes. Any bypass technique he comes up with must be as specific as possible, and defended in every way possible, lest he fail his customers and destroy his reputation (and -thus- his entire business) as a high-security safe manufacturer.
So, yes, the technique developed must be limited -as far as is possible- to a single safe (or phone). Anyone who reads HN knows that leaks happen. The more valuable a piece of information, the more likely someone with the ability to snatch it will be targeting it. And -as we should all know by now- it's tremendously difficult to prevent a state-level actor from acquiring data that's not very securely locked away.
Again, if you're in the business of providing high-security services to your clients, you are going to ensure that any bypass mechanism you're forced to make is as difficult to create and one-time-use as possible.
 And -in some jurisdictions- aid in the "legal" torture, imprisonment, or death of his customers.
The "if we make this software, criminals will get their hands on it" argument is absurd. Does Apple have no faith in their own security? The FBI has at no point suggested the modified firmware would ever leave Apple's possession.
Does Apple have a problem with theft of internal code that I am not aware of?
Further, I'm just waiting for someone to transpose the argument, for example: "Apple says guns shouldn't be manufactured for any reason because they could kill someone" or "Apple says cars are unsafe and should be banned because people are killed in accidents."
EDIT: Behaving like a bad actor even if you believe your cause is just still makes you a bad actor, albeit one with good intentions. If they want to win the argument they should stick to realistic positions and leave the manure shovelling to "those other people".
It only takes one disgruntled employee, or one security slip, and 100-500 million users are compromised. Not a risk I'm comfortable with.
You're requiring them to be perfect. But humans are involved.
If a government can compel you to break into your own device, what happens later when you close those backdoors? What's stopping governments from demanding you leave a backdoor in, pointing out: "hey, you could do it before"?