One thing I was wondering is how Apple is even able to create a backdoor. It is explained toward the end:
"The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer."
This is actually quite reassuring: it means that even Apple can't break into an iPhone with a secure passphrase (10+ characters) and Touch ID disabled. (Touch ID itself is hackable with a bit of effort to obtain your fingerprint.)
That is not exactly true. They wrote the OS, they designed the phone, they know where the JTAG connectors are. Cracking the phone apart and putting its logic board up on a debugger would likely enable them to bypass security.
From what I understand, what Tim is doing, and what I greatly admire, is trying to avoid a judicial requirement that they be able to do this on demand: the so-called "back door" requirement. He knows, as others do, that such a feature would be used by more than the intended audience, and for more than the intended uses, to the detriment of Apple's users.
What I really find amazing is that I was at a talk hosted by the East-West Institute where the Air Force General of the new cyber command (whose name escapes me) complained that "we" (Silicon Valley) had let the government down by not writing strong enough crypto to keep our adversaries out. I remarked that it was the ITAR regulations and the Commerce Department, at the behest of the NSA, which had tied our hands in that regard, and that with free rein we could have done, and still could do, much better. Whit Diffie was there and made the same point with them. And now, here we are 10 years later, we "fixed" it, and now it's our fault that they can't break into this stuff? Guess what? Our adversaries can't either!
The right to privacy, and the right of companies to secure that right with technology for their customers, is a very important topic and deserves the attention. I am really glad that one of the most valuable companies in the world is drawing a bright line in the sand. So I really support Tim's position on this one.
No, they can't. A quick update to recent hardware practices: modern SoCs like Apple's have something called "Secure Enclave Processor" that's on-die. This is the first thing to start when the chip is powered up, the thing that loads a cryptographically-signed bootloader, and the thing that gates a lot of IO with the outside world (like the NAND).
Hardware encryption on consumer hardware has existed for over a decade (look up the TPMs shipped on Intel platforms), and while it obviously hasn't taken hold in the more open Intel world, locked-down platforms like Apple's top-to-bottom design have had much more liberty in implementing secure computing.
Furthermore, all debug/probing features can be disabled by fuses at the factory. The manufacturer can test the chip with those features on, and once verified, blow those fuses. No-one's JTAG-debugging that chip, not even Apple.
That said, Apple's focus on security and privacy has ramped up in recent years. If you want more security, get more recent hardware. The downside, of course, is that if even Apple can't hack the software... neither can you.
However, without more information, this does not tell us whether it is possible in this case. The obvious implementation for a secure enclave resisting this sort of attack is to only allow key-preserving updates when already in unlocked state (which would be the case for any normal user update). All other cases should destroy the user keymat, even if the update is validly signed by Apple. This would be done by the hardware and/or previous firmware before it loaded the new firmware so you can't create an update that bypasses this step.
If this isn't how the secure enclave works now, I'll bet it will be in the next version (or update if possible).
I'm also confused by a lot of this since don't you need the password anyway to upgrade?
I bet if Apple is forced to comply with this order they will make sure that they will find a way to design the iPhone such that they physically can't comply with similar requests in the future.
Yes, they can. The particular phone in question is from before the Secure Enclave Processor was introduced.
Some people trot out the argument that it's OK for the government to compel Apple to deliver the backdoored firmware because the measures it would circumvent are not of a cryptographic/information-theoretic nature.
Then one could expand that argument by saying that compelling physical reverse-engineering is also OK because the devices are not built to be physically impossible (read: laws of nature) to pry open.
When a passcode is entered, the SoC queries the Secure Enclave with the passcode. If the passcode is correct, the Secure Enclave responds with the decryption key for the flash storage.
The best Apple could do is sign a malicious update to the Secure Enclave firmware that either removes the time delays or dumps the keys. However, some people suspect the SE erases its secrets on firmware update, although this behavior isn't documented in Apple's security reports.
Dumping the Secure Enclave would not result in the keys necessary to read the files on the filesystem. Each file has a unique key, which is wrapped by a class key, and for some classes, the class key is wrapped by a key derived from the passcode. If you don't have the passcode, you can't unwrap any of the keys (Page 12 of https://www.apple.com/business/docs/iOS_Security_Guide.pdf).
To my knowledge, all keys are still wrapped with the UID, and the UID is still a factory-burned SoC key (not accessible to any firmware). Possible to extract, but not easy to do at scale.
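To make the wrapping chain concrete, here is a minimal sketch using Python's `cryptography` package. Everything in it (names, the iteration count, modeling the UID entanglement as a KDF salt) is illustrative, not Apple's actual construction:

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
    from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap

    UID = os.urandom(32)  # stand-in for the factory-burned, non-extractable SoC key

    def passcode_key(passcode: str) -> bytes:
        # Derive a wrapping key from the passcode, "entangled" with the UID
        # (loosely modeled here by using the UID as the KDF salt).
        kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                         salt=UID, iterations=200_000)
        return kdf.derive(passcode.encode())

    # At rest: each file key is wrapped by a class key, and the class key
    # is wrapped by the passcode-derived key.
    file_key = os.urandom(32)
    class_key = os.urandom(32)
    wrapped_class_key = aes_key_wrap(passcode_key("123456"), class_key)
    wrapped_file_key = aes_key_wrap(class_key, file_key)

    # Without the passcode, there is nothing to unwrap with:
    assert aes_key_unwrap(passcode_key("123456"), wrapped_class_key) == class_key

The point of the chain is that dumping wrapped blobs off the flash gets you nothing; every path to a file key runs through the passcode and the UID.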
Tim's position today might not be Apple's position tomorrow. Apple is a large publicly traded company. They owe a duty only to shareholders. Fighting this fight will probably impact the bottom line. Tim's continued tenure may turn on the outcome.
Cooperation may see Apple hurt. The perception of cooperation was part of RIM's fall from grace. Non-cooperation may also cause issues. Through its various agencies, the US government is Apple's largest customer, as it is Microsoft's. Large contracts might be on the line should Apple not play ball. Either way, this order has probably wounded Apple.
I'm having trouble finding numbers, but I seriously doubt this. The reason that's true (or more likely true) for Microsoft is Windows. The US gov't has massive site licenses for Windows and most of MS's software portfolio. Where is Apple used in the US government? Some cell phones? A few public affairs offices that convinced their purchasing officer to buy a Mac Pro for video editing? Maybe some labs that wanted a unixy OS and, again, convinced their purchasing officer to buy a Mac Pro?
The bulk of Apple's revenue comes from outside the US. Perhaps the US government is their largest single customer (I still hold this is a dubious claim), but it is not essential to their continued existence. They would do just fine without those sales.
But remember, iPhones and MacBooks are quite popular everywhere, including US government procurements (e.g., https://37prime.wordpress.com/2012/08/05/nasa-mars-science-l...).
Your characterization ("Maybe some labs that wanted a unixy OS and, again, convinced their purchasing officer to buy a Mac Pro?") is a little off -- where I work, MBP's for laptop replenishments are treated exactly the same way as Windows systems, you just tick a different button on the order form.
As in, improve it.
Tim Cook is probably more popular than Obama (and surely is WRT this issue.) Apple is about a thousand times more popular than the NSA and blessed with almost infinitely deep pockets and a very, very good marketing team.
Not to mention the fact that most of the people who use computers and phones don't even live in the USA.
Tim Cook is responsible to his BoD and shareholders.
In comparison, Apple had a net income of ~$50B in 2015.
Let that sink in for a moment.
That almost certainly isn't the case. It is doubtful whether any other government organization cares about how they handle this case. Heck, the FBI likely doesn't care as long as Apple doesn't do anything illegal.
Because Freedom Markets(tm), booyah!
Society can bumble along just fine without corporations. Corporations serve society.
Take away society, with its culture, laws, rules, regulations, courts, people, economy, markets, capital, etc., and there can be no corporations.
The Shareholder Fallacy
Historically, corporations were understood to be responsible to a complex web of constituencies, including employees, communities, society at large, suppliers and shareholders. But in the era of deregulation, the interests of shareholders began to trump all the others. How can we get corporations to recognize their responsibilities beyond this narrow focus? It begins in remembering that the philosophy of putting shareholder profits over all else is a matter of ideology which is not grounded in American law or tradition. In fact, it is no more than a dangerous fad.
The Myth of Profit Maximizing
“It is literally – literally – malfeasance for a corporation not to do everything it legally can to maximize its profits. That’s a corporation’s duty to its shareholders.”
Since this sentiment is so familiar, it may come as a surprise that it is factually incorrect: In reality, there is nothing in any U.S. statute, federal or state, that requires corporations to maximize their profits. More surprising still is that, in this instance, the untruth was not uttered as propaganda by a corporate lobbyist but presented as a fact of life by one of the leading lights of the Democratic Party’s progressive wing, Sen. Al Franken. Considering its source, Franken’s statement says less about the nature of a U.S. business corporation’s legal obligations – about which it simply misses the boat – than it does about the point to which laissez-faire ideology has wormed its way into the American mind.
Laws and statutes don't enforce contracts. But courts do. You are trumpeting a theory I've heard many times before. Its creators lack a basic understanding of contract law and corporate organization. Look up "shareholder derivative actions".
I would not be surprised at all if Apple's internal 'backdoor' (if you can call it that) is just resetting the Secure Enclave, essentially erasing everything on the NAND. That'd be fine for refurb/manufacturing, desirable even, as it guarantees that a full system wipe happens before a refurb goes to a new customer.
Most ICs can be completely erased to remove the limitations on access, but this usually requires a 'mass erase', where the entire non-volatile memory is erased (taking any codes, passwords, and encryption keys with it).
source: I am an embedded software engineer who works with these settings in bootloaders and application software.
Is this true? That would have to mean that either the passphrase is stored on the device or the data is not encrypted at rest. Neither of these sounds likely, frankly.
The only way to secure the device against that would be to have the users manually memorize and key in a complete 128 bit encryption key, which they could then refuse to provide.
I think we tech folks were fooling ourselves with the idea that Apple had somehow delivered a "snoop-proof" device. They really didn't (no one can!) as long as they're subject to government or judicial control.
> possibly with the addition of a judicially compelled fingerprint scan or PIN brute force to get the encryption key out of whatever on-device escrow it's stored in
This is the whole problem. The keys are in the SE. You can't brute force the PIN because the SE rate-limits attempts (and that rate limiting cannot be overridden by an OS update because the SE is not run by the OS).
If you can get a fingerprint scan then all bets are obviously off, but then you don't need Apple at all.
> A large iteration count is used to make each attempt slower. The iteration count is calibrated so that one attempt takes approximately 80 milliseconds. This means it would take more than 5½ years to try all combinations of a six-character alphanumeric passcode with lowercase letters and numbers
(Page 12 of https://www.apple.com/business/docs/iOS_Security_Guide.pdf).
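For the curious, here is roughly how that calibration could be done in pure software. This is only a sketch: on the phone the derivation is also entangled with the hardware UID, which a software-only version can't model.

    import hashlib
    import time

    def calibrate_iterations(target_seconds=0.08, probe=100_000):
        # Time a probe run of PBKDF2, then scale the iteration count so
        # that one full derivation takes ~target_seconds on this CPU.
        start = time.perf_counter()
        hashlib.pbkdf2_hmac("sha256", b"passcode", b"salt", probe)
        elapsed = time.perf_counter() - start
        return int(probe * target_seconds / elapsed)

    print(calibrate_iterations())  # iteration count costing ~80 ms per guess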
Additionally, you don't have to use your thumb, so if you don't know which body part was used, you're out of luck.
Edit just to be clear: the requirement really is that the firmware be stored in a ROM somewhere, probably on the SoC. That's a lot of die space (code for a full crypto engine isn't small) to dedicate to this feature. Almost certainly what they did is put a bootstrap security engine in place that can validate external code loaded from storage. And if they did, that code can be swapped by the owner of the validation keys at will, breaking the security metaphor in question.
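A sketch of that bootstrap idea, assuming Python's `cryptography` package (and showing exactly why whoever holds the signing key can always swap in new code):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    def bootstrap(firmware: bytes, signature: bytes, vendor_pubkey_pem: bytes):
        # The tiny on-die engine only needs the vendor's public key and a
        # signature check; the bulk of the code lives in external storage.
        pubkey = serialization.load_pem_public_key(vendor_pubkey_pem)
        try:
            pubkey.verify(signature, firmware, padding.PKCS1v15(), hashes.SHA256())
        except InvalidSignature:
            raise SystemExit("refusing to run unsigned code")
        run(firmware)  # anything the vendor signs gets run, old or new

    def run(firmware: bytes):
        pass  # placeholder: jump into the verified image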
The key thing would be for it to lose all stored keys on update when the current passphrase has not been provided, and it sounds like that may not currently be the case.
Maybe in this case, Apple could comply, but a simple tweak would make it impossible in the future?
I would not actually be shocked if they originally did wipe out stored info on firmware update, but had some issues with people updating their phone and losing everything, so they ifdef'd that particular bit out in the name of usability.
Well, yeah. But then you'd have a system that couldn't be updated in the field without destroying the customer data (i.e. you'd have a secure boot implementation that couldn't receive bug fixes at all).
It's a chicken and egg problem. You're handling the problem of the "iPhone" not being a 100% snoop-and-tamper-proof device by positing the existence of an "interior" snoop-and-tamper-proof device. But that doesn't work, because it's turtles all the way down.
Ultimately you have to get to a situation where there is a piece of hardware (hardware on a single chip, even) making this determination in a way that has to be 100% right from the instant the devices go out the door. And that's not impossible, but it's asking too much, sorry. We're never going to get that.
The enclave would store (in secured storage) a hash of the last used firmware. Hardware would have a hash update capability, but this destroys all other stored information (i.e., keys) if used when the enclave is not currently in an unlocked state.
On boot, hardware verifies firmware signature as usual but also compares the firmware hash (already calculated for the signature check) to the stored value. If there is a mismatch, update the stored hash. Since the enclave is currently locked, the hardware clears the keys.
Since it's in hardware, you're correct that it would have to be 100% right, but that's quite feasible for a simple update mechanism (indeed, the most complicated bits are reused pieces from the signature check which already has this requirement).
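In pseudocode, the proposed check might look like this (all names hypothetical; this is the scheme proposed above, not documented Apple behavior):

    import hashlib

    class EnclaveBoot:
        def __init__(self):
            self.stored_hash = None  # secured storage: hash of last-run firmware
            self.user_keys = b"..."  # the key material worth protecting
            self.unlocked = False    # True only after a correct passcode

        def boot(self, firmware: bytes, signature_ok: bool):
            if not signature_ok:
                return  # refuse outright, same requirement as today
            fw_hash = hashlib.sha256(firmware).digest()  # reused from the sig check
            if fw_hash != self.stored_hash:
                self.stored_hash = fw_hash
                if not self.unlocked:
                    self.user_keys = None  # new firmware while locked: wipe keys
            self.run(firmware)

        def run(self, firmware: bytes):
            pass  # placeholder: hand off to the now-recorded firmware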
Die space is cheap nowadays, especially for logic that doesn't need to be powered on all the time, thanks to the death of Dennard scaling.
(FWIW: OTA firmware updates are routine in the industry. I've worked on such systems professionally, though not for Apple.)
To be fair - the only reason he's doing it is because it would cause a significant drop in sales for Apple devices. People overseas would immediately stop buying because "The American government is listening", and that's assuming countries like China and Russia wouldn't ban them outright.
This is the big thing American politicians are missing or glossing over in their campaigns to get re-elected: Forcing American companies to compromise their products will result in a significant loss of revenue overseas. Microsoft, Google, et al, have already reported it and foreign governments have already started banning goods/services (due to the Snowden revelations).
That's not being fair at all. To say the only reason he is doing it is to protect iPhone sales doesn't speak to Tim's character. Of course he cares about sales, but he also cares about privacy.
Compare to GE, which rolled over.
So I believe Mr Cook when he says his opposition to the FBI's request is rooted in a desire to do the right thing, and not the bottom line.
[Cook] didn't stop there, however, as he looked directly at the NCPPR representative and said, "If you want me to do things only for ROI reasons, you should get out of this stock."
That's a very blunt statement that the immediate stock valuation is not Cook's only consideration.
Some people seem to have a hard time taking Cook at his word, but he's been quite consistent. This massive skepticism feels more like nostalgie de la boue than anything based in facts.
We should be careful about plainly stating what someone else's motivations are when it contradicts their own story.
Edit: s/it's/his reasons include/ for clarification.
We still live in a world where all people are not treated equally. Too many people do not feel free to practice their religion or express their opinion or love who they choose. A world in which that information can make a difference between life and death. If those of us in positions of responsibility fail to do everything in our power to protect the right of privacy, we risk something far more valuable than money. We risk our way of life.
See pages such as http://qz.com/344661/apple-ceo-tim-cook-says-privacy-is-a-ma...
Cook's responsibility is first and foremost to the stockholders, and secondarily to the customers. Decrypting the iPhone would seriously compromise the security of Apple's products, gravely damage the company's credibility, hurt sales, and drive the stock price down.
No CEO is going to take such a drastic step unless they are a craven, cowardly type who meekly obeys ask-for-the-sky demands from overbearing federal law enforcement types, and Cook surely did not rise to his current position by being a pushover.
That's not to say there won't be some kind of secret deal made behind closed doors, but secrets tend to get out. Apple would not be so foolish, I think. Yahoo? Microsoft? They just handed over the keys to their email to anyone who demanded it -- the Chinese government, the NSA -- but Apple has no history of this type of behavior. Surely Snowden would have revealed it if they had.
People that really care about security don't use smartphones.
I'm not sure this is true once you factor in price range. The general understanding is that a lot of those Android devices are sub-$250.
Maybe by share of units shipped. By revenue share they dominate, and their margins are estimated to be very good.
I understand wanting to know people's motivations, from both the perspective of predicting future action and just because we're nosy monkeys. But frankly, what's in Cook's heart doesn't matter. Actions do. And to date, in my view, he's done pretty much exactly the right thing on this issue all along.
Maybe he's defending customer privacy because he believes the Lizard People have religious objections to invading until all humans have Freedom of Math. It simply doesn't matter.
That hasn't happened with other devices or earlier iPhones that aren't as secure.
If Apple could in fact write a software backdoor, doesn't it mean that the backdoor exists, at least potentially?
And how can one be sure that Apple is the only company able to build that door? At the very least, couldn't the right Apple engineer be either bribed or forced (by terrorists or the government) to build it?
"Impossible" should mean "impossible", not "not yet done, but possible".
Not to mention that this is for the iPhone 5c. As other comments have mentioned, newer iPhones have the hardware-based Secure Enclave which add to the difficulty of breaking into the phone. https://www.apple.com/business/docs/iOS_Security_Guide.pdf
So while the Secure Enclave enforces the delay between brute force attempts, Apple could still release an update that removes that delay.
The FBI can also unsolder the components in the phone, make a full image of the contents, find the encrypted section, and then brute-force it. This is what is done for SSDs: they do not power up the drive; they unsolder the memory modules, put them in a special reader, and copy the data before the SSD's controller can automatically wipe it during the optimization that follows a delete/TRIM.
Leaving a physical trace of my passwords is not only bad practice from security point of view, but quite useless since I know them.
Also, my online accounts are useless if I can't use them because I'm dead, so I don't really care if no one can access them anymore. What happens to important things such as banking is already dealt with.
But I think the right solution here would be for Facebook to have a way of handling deceased people, not giving your password to everyone in case of sudden death.
See, for example, the people who know they're going to die and who leave their iPads to their relatives in their wills. Apple doesn't take grants of probate as sufficient legal documents (everyone else, e.g. banks, does) and insists on a court order.
That's not true actually. For example, the industry standard for storing passwords on a server (bcrypt) is specifically designed to slow down password match attempts.
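For example, with the `bcrypt` Python package (a small demo; the tunable work factor is the whole point):

    import bcrypt

    # gensalt(rounds=12) sets the work factor: each +1 roughly doubles the
    # cost of every checkpw() call, the attacker's included.
    hashed = bcrypt.hashpw(b"hunter2", bcrypt.gensalt(rounds=12))

    print(bcrypt.checkpw(b"hunter2", hashed))  # True, but deliberately slow
    print(bcrypt.checkpw(b"hunter3", hashed))  # False, and just as slow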
> no way to stop a dedicated attacker from brute-forcing it
You can't swap the SE or CPU around, nor can you run the attempts on a different device.
Against a sufficiently capable adversary, tamper-resistance is never infallible, but good crypto can be.
> Against a sufficiently capable adversary, tamper-resistance is never infallible, but good crypto can be.
To a sufficiently capable adversary, _all_ crypto is just "mere security by obscurity".
"Oh, mwa-haha, they obscured their password amongst these millions of possible combinations, thinking it gave them security - how quaint. Thankfully I'm sufficiently capable.", she'll say.
Sure, if they wanted to they could implement a backdoor. But assuming they correctly created and shipped the secure enclave it shouldn't be possible to circumvent it even for Apple.
In theory it should be possible to make it fixed (which Apple doesn't seem to have done).
Apple reasons that there is no way to guarantee no one will take the same update and apply it to other iOS devices. Or the government could take this a step further by making Apple build it into a future update for the whole user base.
A lot of it is about PR.
Schroedinger's Backdoor? ;)
The way I understand it (and correct me if I'm wrong) is that the code flows from disk through the AES engine, where it is decrypted, and is then placed in a presumably interesting/hard-to-reverse place in RAM, at which point it is executed. I imagine even more interesting things are done to higher-value data in RAM, but that's not code, because, as you said, code has to be decrypted (at the latest) by the time it reaches the registers.
The right password can be obtained by either knowing it, or by guessing it.
As an additional security measure, the software shipped with the phone prevents brute force attacks by wiping the device after a given number of failed attempts.
Apple has been asked to modify the software so that it won't wipe the phone, thus allowing the authorities to try many passwords.
If anybody could circumvent this additional security measure, actual security would be lower. The authorities are not asking Apple to ship this change to all users: they only want to install it on the device in their possession.
However, Apple is concerned that once they provide the authorities such a modified software, it could be leaked and thus be used by third parties to breach the security of any Apple device.
It should be noted that all data encrypted, for example, on your laptop's hard drive is already subject to this kind of brute-force attack. It's a well-known fact that authorities or malicious users can already attempt brute-force attacks on encrypted data if they can access the data on a passive device such as a hard drive.
It's important to understand that Apple (at least in this case) is not being asked to implement a backdoor in the encryption software. It's also important to understand that even if Apple were forced to install a backdoor, it would affect only the ability to access future data and would not help the investigation of the San Bernardino case.
This very request by the Authorities suggests that Apple does not currently install any backdoor on stock phones.
However, there is a logical possibility that the authorities are either not aware of any such backdoor or, in the worst case, are publicly requesting this feature just to hide the real reason for a possible future success at decrypting the phone: they could claim that Apple didn't have any backdoor and that they just got lucky brute-forcing the device; after all, Apple wasn't even cooperating with them on relaxing the brute-force prevention limit, so they could claim they did it in-house.
(I'm personally not inclined to believe in such improbably well-coordinated smoke-and-mirrors strategies, but they are a logical possibility nevertheless.)
“Each Secure Enclave is provisioned during fabrication with its own UID (Unique ID) that is not accessible to other parts of the system and is not known to Apple. When the device starts up, an ephemeral key is created, entangled with its UID, and used to encrypt the Secure Enclave’s portion of the device’s memory space. Additionally, data that is saved to the file system by the Secure Enclave is encrypted with a key entangled with the UID and an anti-replay counter.”
The device in question is an iPhone 5C, which uses the older A6 design.
Everything is encrypted with a derivative of this UID, and extracting the UID is not a thing you can do without destroying the device.
"even Apple cannot decrypt without using the right password."
Could you please explain?
The iPhone prevents this by locking up (and potentially erasing the phone) after 10 failed attempts, but this is a restriction created in iOS. If they provision a new backdoored version of iOS to the phone, that restriction wouldn't apply any more, and they could brute-force away.
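Schematically, the restriction being removed is nothing more than this (illustrative Python; the real logic lives inside iOS):

    MAX_ATTEMPTS = 10
    failed_attempts = 0

    def try_passcode(guess: str) -> bool:
        global failed_attempts
        if failed_attempts >= MAX_ATTEMPTS:
            wipe_device()  # the line a backdoored build would simply delete
            return False
        if passcode_is_correct(guess):  # delegated to the crypto layer
            failed_attempts = 0
            return True
        failed_attempts += 1
        return False

    def passcode_is_correct(guess: str) -> bool:
        return False  # placeholder for the real key-derivation check

    def wipe_device():
        print("erasing key material")  # placeholder for the real wipe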
If you have things that you need to be private, don't put it on a smartphone.
Apple has implicitly for a long time, and lately much more vocally, cared about privacy. They don't have the same data-driven business model that Google and FB do.
They say that. But with closed source software we can't verify that it's true. I'm not saying they don't care about privacy, only that we don't really know if they do or not.
Plus, with open source you can verify intent, which you can't with Apple.
Which provides a device that collects your fingerprints, all your phone numbers, internet searches, bank details, some payments, network communications, voice communications, text communications, your location (via GPS + Wi-Fi + hotspots + phone towers), and soon, via health devices, your body metrics.
And they are profit oriented, not people oriented.
Sure but they were there for years before anyone noticed. Same with PHP's Mersenne Twister code. Same with multiple other long-standing bugs. It's disingenuous to toss out "Oh, if only it was open source!" because reality tells us that people just plain -don't- read and verify open source code even when it's critical stuff like OpenSSL.
I agree that failing to fix a problem like this in a timely fashion is bad, but sins of omission are generally judged differently than sins of commission, for better or worse. Apple failing to apply proper prioritization to security holes isn't the same as Apple collecting data to be sold to the highest bidder.
So, again, Apple should not be treated as equivalent to Google and Facebook. Feel free to judge them harshly, but don't paint them with the same brush.
As long as we're on the topic of encryption, phones, and law enforcement it's worth keeping in mind that in the US at least courts can compel you to unlock your phone with Touch ID, even though they can't compel you to give them a password. Communicating a password is considered speech, so self-incriminating speech is protected by the fifth amendment. Physically holding your finger to a device is not considered speech and so it's not protected.
I think this is an interesting, and perhaps underappreciated, aspect of a shift from passwords to biometrics for verifying identity. It would shift the power dynamic between civilians and government a bit - here in the US at least.
Of course, hopefully no one is in a situation where they need to protect themselves against over-reaching or unjust government officials any time soon.
And I'm not optimistic that the stockholders care about anything more than doing the opposite.
And any stockholder with an ounce of intelligence will understand that Apple's choices here are both morally right and effective marketing. As the information age matures, privacy is becoming a valuable asset, and Apple is starting to gain a positive reputation in this space.
AAPL is one of the most mainstream, widely-held stocks on the planet. Assumptions about the opinion of some homogeneous "stockholder" are useless.
I'm positive HN is filled to the brim with AAPL shareholders who care deeply about this issue.
Talk is all we need so far. Publicly saying no to the FBI is a step rarely taken.
I voted for, and to my surprise so did a majority of other stockholders, and that avenue of opportunities was removed.
I'm ambivalent regarding Apple's stance. In principle they are doing the right thing, but in practice it seems they may be kicking up a whole lot of fuss over a relatively minor issue (with the exception that providing the authorities an easy means to brute-force a phone sets a horrible precedent). As for creating a universal backdoor, it seems highly unlikely they couldn't produce a signed OS / coprocessor firmware image locked to one of the various serial numbers associated with this particular device.
edit: as mentioned below, this order entirely originates in Apple's use of DRM to prevent software modification. Had users actual control over the devices they own, the FBI wouldn't need to request a signed firmware in the first place. Please think twice about what Apple might really be defending here before downvoting.
This is the entire concern (in my opinion and in my reading of Tim Cook's opinion). If the government can force Apple to backdoor this one iPhone (because terrorist), then they can force Apple to backdoor any iPhone for any person given a valid warrant, subpoena or otherwise granted power. Once the flood gates open...
Imagine this scenario:
1.) Apple creates the custom iOS build for the FBI to use to decrypt this iPhone.
2.) China hacks into either Apple or the FBI and downloads this build. (We know they have the capability, because it's already happened.)
3.) A visiting U.S. diplomat, politician, or military officer has his iPhone pickpocketed while in China. (This also happens all the time.)
4.) The Chinese government uses this stolen software to brute-force the encryption on the device, finding access codes for classified U.S. military networks. (Because we know U.S. diplomats never use their personal email for state business, right?)
5.) Now a foreign power has access to all sorts of state military secrets.
The problem with backdoors is they let anyone in. Right now, there's a modicum of security for Apple devices because knowledge of how you would bypass the device encryption is locked up in the heads of several engineers there. The FBI is asking Apple to commit it to source code. Source code can be stolen, very easily. Tim Cook's open letter is making the point that once this software exists, there is no guarantee that it will stay only in the hands of the FBI.
WARNING -- THIS Apple Engineer IS CLASSIFIED AS A MUNITION

    #!/usr/local/bin/human -s-- -export-a-crypto-system-sig -RSA-in-3-lines-HUMAN
    ($k,$n)=@ARGV;$m=unpack(H.$w,$m."\0"x$w),$_=`echo "16do$w 2+4Oi0$d-^1[d2%
    Sa2/d0<X+dLa1=z\U$n%0]SX$k"[$m]\EszlXx++p|dc`,s/^.|\W//g,print pack('H'
    ,$_)while read(STDIN,$m,($w=2*$d-1+length($n||die"$0 [-d] k n\n")&~1)/2)
    -------------------------------------8<-------------------------------------
    TRY: echo squeamish ossifrage | rsa -e 3 7537d365 | rsa -d 4e243e33 7537d365

FEDERAL LAW PROHIBITS TRANSFER OF THIS APPLE ENGINEER TO FOREIGNERS
edit: self-answer: the following post seems to have an answer: https://news.ycombinator.com/item?id=11115579, although it seems to describe a newer device than the one in the case; but I was interested in how such protection is possible at all, so that seems to answer it for me.
I strongly disagree. They are taking a stance in the debate about government mandated backdoors in software.
The order says that Apple's exploit should only work on this specific, already existing device.
That's a large part of the fuss!
So they would still get 10 bites at the cherry, and sure, on the tenth, they could depower the phone and prevent the wipe, but if each attempt is persistently stored before the password-check is carried out, depowering the phone wouldn't give them any more chances.
The details of the gov't request are in another story on the HN front page
I would assume that Google, Facebook, and Amazon are already sending every single keystroke we type straight to all the three letter agencies pretty much in real time. (I also assume the same about Apple, so I'm not sure what to make of this open letter.)
Google has been (even more) proactive about security and encryption since then, since part of their business model relies on trust.
Exactly like Apple. Or do you think that the emails in iCloud are not given to the prosecutors?
Google and Facebook's core competency is using your personal data to sell ads.
Apple's core competency is selling you appliances. Yes, they wind up with some personal data because of the services they also provide, but it's far less valuable to them than Google or Facebook.
What does your post have to do with this claim?
The point I was trying to make is that Google and Facebook have direct access to all the data of their customers, and already provide access to government agencies. Unlike Apple, they don't store some of their customers' data safely on the device, which is what this case is about.
Your point is wrong regarding Google and smartphones if the smartphone is encrypted.
Beyond that, Google definitely has the keys to your encrypted backups on their servers, so access to the phone might not even be necessary.
"The situation is different for Android. Google’s version of Android, which runs on most Android smartphones and tablets in the western world, only implemented encryption by default with the latest version Android 6.0 Marshmallow released in October 2015."
That version of Android is only on a handful of devices, not even a full percentage point of global market share. Even on Lollipop and older devices that do support encryption, it has to explicitly be turned on by the user. And once again, Google is not expressly clear that they don't have your encryption keys on Lollipop and lower; they only claim not to have them for Marshmallow devices. They definitely have the keys to your encrypted data on their servers no matter what, which can include complete backups of your device.
> Google is not expressly clear that they don't have your encryption keys on Lollipop and lower;
They have explicitly said that if the device is encrypted they don't have the key.
> They definitely have the keys to your encrypted data on their servers no matter what, which can include complete backups of your device
Source for that?
In other words, if you can ever actually use something, it's probably not secure.
Apple accuses the FBI of playing language games with the term "backdoor", but I think Apple has done the same. The fact that they can push weak OS updates to a locked phone is the backdoor. This means that they can already comply with the court order, and they likely will. This letter covers them from PR damage.
What Tim Cook wrote is:
> "install it on an iPhone recovered during the investigation."
> "the potential to unlock any iPhone in someone’s physical possession."
So the FBI has the physical phone already. They can deliver it to Apple, who can disassemble it and either use a JTAG/flash programmer on an internal connector to manually write new software, or desolder the flash holding the old OS and replace it with a new one.
Both of these techniques are common enough in the embedded industry that I expect this is what Apple means. They probably can't push an OTA software update and force the install on a locked device.
The newer devices run a special L4 kernel on the secure enclave. It is not updateable without providing the existing passcode. It enforces the attempt rate limiting and key deletion on too many attempts (if enabled). Special limited communication channels allow the CPU to talk to the SE. In production devices the SE has JTAG disabled. Encryption and decryption of the master keys happen inside the SE with its own private AES engine so even oracle/timing attacks on the main CPU are useless.
Why doesn't Apple just help hack this phone but wash their hands of newer devices and tell customers to upgrade? Because if the FBI and this court get away with using the All Writs act to compel Apple to write new software they'll eventually be forced to add a backdoor to SE-equipped devices too. Courts won't understand or care about the differences.
If the government forced them, Apple could insert a backdoor into the next major version of iOS or the hardware; then everyone inputs their passcode during the upgrade and the backdoor is deployed. Their primary defense against that so far (and the only real one you can have as a corporation) is to never build the capability in the first place. This judge's order is telling them to go build the capability (in theory for this one phone). The fact that you can't retroactively build the backdoor for 5S and newer devices isn't the main issue.
Better to fight every step of the way and draft as many pro-privacy people as possible into the fight to apply political pressure.
The whole point is that it doesn't matter what the court thinks if Apple cannot comply due to the laws of nature. That was their whole argument to begin with. Their argument now is pretty mushy in comparison.
The iPhone contains a SIM card.
A SIM card is a complete, general-purpose computer with its own CPU and RAM, and the ability to run arbitrary Java programs that can be uploaded, without your knowledge, by your carrier.
You are owned. Deeply, profoundly, in ways that you have no way to manage/mitigate.
The real question, for me, is why authorities are dealing with Apple at all and not just working with the carriers who have proven to be their trusted allies.
The international legal framework of sovereignty basically says you are owned. (Not universally de jure, but pretty much de facto.) Whatever rights you have are effectively granted to you by your country. Unfortunately, this notion is seldom given any thought, and the current most visible proponents of such an idea are unpleasant angry underclass men using it as an excuse to behave badly. There are others who have given thought to this, however, and it is part of the motivation behind such things as The Universal Declaration of Human Rights.
In my experience, law enforcement does not make their own jobs harder on purpose. If there is an easy way to get that data, they would use that way to get it.
So you don't want any hardware to have access to main memory if it doesn't need to. For instance, you can use an IOMMU to ensure that devices can only access the specific areas the OS wants to allow them to DMA to/from, not all of memory.
The phone isn't always locked and encrypted; for example, whenever the user is using the phone it's unlocked and decrypted.
I'd guess "security by obscurity". Just because they have the device rooted via SIM card doesn't mean they have available a signed build of a multi-gigabyte OS with most security libraries expunged.
I want to disclaim that this is pure speculation. I have no insider knowledge or indeed any particular familiarity with the institutions in question.
The FBI may want this authority and this precedent and think that this is a good chance to get it. They may say, "Well, the San Bernardino case is a high-profile case that may sway people, including judges, who would otherwise be less inclined to back our request. Who knows when the next nationally publicized case will be in which the likely perpetrator carries an iPhone?" They may also believe that the current political climate is good for their case.
And they probably also believe that there's no harm in trying. If the courts rule against them, they haven't lost anything. If the courts rule for them, they get a brand new tool.
To clarify, I agree that nothing they ask of Apple is technically impossible or even that difficult for Apple to pull off, probably via simple DFU without touching the flash at all.
The backup is probably easier to attack if you have it, since it doesn't have hardware imposed timeouts on password guesses. It may not be current however.
Nothing's bulletproof but the iPhone is the most trustworthy IMHO.
"Locked" seems like an improper term for such a scenario.
I applaud Apple for appealing this case to the public; however, there is a HUGE, HUGE difference between "we can't unlock" and "we shouldn't unlock". Unfortunately, this distinction will likely be lost on the general public.
To be fair, they could have stated it explicitly.
If the data was truly encrypted, the concept of pushing an update or creating a master key would not be possible.
Apple's security PDF says that the iteration count is calibrated so that one attempt takes 80ms in hardware, so that's the hard limit on the brute forcing speed, regardless of any updates Apple releases.
This means that a long alphanumeric passphrase is secure, but a 6-digit passcode could be broken in half a day, and a 4-digit passcode would take just a dozen minutes.
    # characters    [0-9]           [0-9a-z]              [0-9a-zA-Z]
     1              0.8 seconds     2.9 seconds           5 seconds
     2              8 seconds       1.7 minutes           5.1 minutes
     3              1.3 minutes     1 hour                5.3 hours
     4              13 minutes      1.6 days              2 weeks
     5              2.2 hours       8 weeks               2.3 years
     6              22 hours        5.5 years             140 years
     7              1.3 weeks       200 years             9 thousand years
     8              13 weeks        7 thousand years      550 thousand years
     9              2.5 years       260 thousand years    34 million years
    10              25 years        9 million years       2 billion years
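Those numbers fall straight out of the 80 ms figure; a quick sanity check in Python:

    SECONDS_PER_ATTEMPT = 0.08  # from Apple's iOS Security Guide

    def worst_case_seconds(length: int, alphabet: int) -> float:
        # Exhaustive search of the whole keyspace at 80 ms per guess.
        return alphabet ** length * SECONDS_PER_ATTEMPT

    print(worst_case_seconds(4, 10) / 60)                  # ~13 minutes
    print(worst_case_seconds(6, 10) / 3600)                # ~22 hours
    print(worst_case_seconds(6, 36) / (3600 * 24 * 365))   # ~5.5 years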
If it is stated very clearly, can you quote me a sentence?
In the security guide linked here it seems possible for this iPhone model but not later ones.
Edit: According to the discussion below, Apple can ship updates to the Secure Enclave. I don't know whether that's possible on a locked phone.
I wouldn't be surprised (it isn't stated in their iOS security doc) if the key generation uses a hash of the system files as part of a seed for the entropy source used for keys, though that's pure speculation on my part.
Edited for clarity regarding "push" vs physical access.
As far as I'm aware there is no known technique to prevent someone with physical access, a bunch of engineers, and the code signing keys from replacing firmware.
Drawing the line in the sand at "the government can't force us to hack this guy's phone this time" thus ends up being "can't force us to provide features to hack anyone else's phone down the line".
Starting with the A7 CPUs, the iPhone CPU has a "secure enclave" which is basically a miniature SoC within the SoC. The secure enclave has its own CPU with its own secure boot chain and runs independently of the rest of the system. It runs a modified L4 microkernel and it does all of low-level key management.
The secure enclave contains a unique ID burned into the hardware. This ID can be loaded as a key into the hardware AES engine, but is otherwise designed to be completely inaccessible. Assuming AES is secure, that means the key can be used to encrypt data but can't be extracted, not even by the supposedly secure software running in the secure enclave. This key is then used to generate other keys, like the ones used to encrypt files. That means you can't extract the flash memory, connect it to a computer, and then try to brute force it from there. Or rather you can, but you'll be brute forcing a 256-bit AES key, not a 4-digit PIN, making it effectively impossible.
One of the secure enclave's tasks is taking a PIN (or fingerprint) and turning it into the encryption key needed to read the user's files. The main system just hands off the user's code to the secure enclave, and gets back either a key or a failure. The escalating delays with successive failures and wipe after too many failures are both done in the secure enclave. That means that updating the device's main OS won't affect it.
All of this is discussed in Apple's security guide here:
The one open question is software updates for the secure enclave. According to that guide, its software can be updated. Does that mean it can be updated with code that removes the restrictions and allows brute-forcing passcodes? The guide doesn't address how the updates work.
My guess, based on how meticulous Apple is about everything else, is that updates are designed to make this scenario impossible. The secure enclave must be unlocked to apply an update, or if updated without unlocking it wipes the master keys. This would be pretty simple to do, and it would fit in with the rest of their approach, so I think it's likely that this is how it works, or something with the same effect.
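Put as code, the handoff described above looks something like this. The delay schedule and all names are illustrative, not taken from Apple's guide:

    import time

    ESCALATING_DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600}  # failures -> seconds
    failures = 0

    def request_key(passcode: str):
        # The main OS calls this and gets back a key or a failure; the
        # counting and delays happen on the enclave side, out of its reach.
        global failures
        time.sleep(ESCALATING_DELAYS.get(failures, 0))
        if enclave_check(passcode):  # placeholder for the UID-entangled check
            failures = 0
            return b"filesystem-key"  # placeholder key material
        failures += 1
        return None

    def enclave_check(passcode: str) -> bool:
        return False  # placeholder: the real check never leaves the enclave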
In this case, the intended answer/conclusion/implication is "yes, this is."
Just one terrorist attack + PR letter to customers + forced update away from losing encryption on your phone.
More on topic is whether Apple or even Google could get out from under this if their on-disk encryption mechanism were open source. If everyone owns, e.g., LUKS (in the sense that no one company owns/controls it), then can any one company be burdened by a court to effectively break everyone else's software by being told to create a backdoor?
The real problem is that you don’t want to set any precedent at all. Once it’s possible to do something for the 5C, weasel words can be introduced to make claims like “well: now you must maintain the current level of access by law enforcement”. Next thing you know, that excuse can be used to interfere with all future hardware designs.
Arguably, the fact that the 5C accepts a firmware update without the passcode is a security vulnerability and ought to be patched.
Companies exist to make money, not to protect our rights. It even crossed my mind that the NSA et al. may have rooted these devices long ago, and that this whole "debate" is just a staged thing to make it appear as though we had any privacy, with feigned adherence to the democratic process.
But your point that, as a matter of policy, many organizations will simply not use the product if they know it has backdoors, relates it back to their capitalist motivation and makes any conspiracy less likely.
I agree that Tim Cook has spun it as if it's not a backdoor when clearly it is. Still quite a hard-to-use one though. It seems like they need the physical phone and maybe Apple's private key for signing updates.
The idea that the act of writing software makes it insecure is silly, though. The security doesn't come from no one having made the appropriate changes to iOS; if it did, that would be security through obscurity, and any motivated hacker or the FBI could modify iOS themselves. It must be about signing the update so that it can actually be installed.
(article) > But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices.
I am also missing a key part of the technical details involved in this situation.
EDIT: backups are encrypted, but apple have the keys. See below.
So basically, they might as well be in clear text; it's pretty much the same.
This times nine hundred and eleven thousand.
Even with as much data as each of them has, it's still better that the data isn't given to yet another party (the government, or each other).
All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission.
As I understand it, Apple is complaining about the introduction of this new threat model:
1. criminal steals someone's iPhone,
2. gets hold of a special iOS version that Apple keeps internally to assist the government,
3. pushes the OS update to the iPhone by themselves,
4. uses some tool to automate brute-forcing the user passcode
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation.
The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.
Now how serious is this threat? And how useful is it to have the FBI be able to look into a phone when they have a warrant? Does the balance between the right to privacy and the need to assist criminal investigation really tilt towards privacy in this specific case?
Meanwhile there are serious threats that do affect a lot of users in practice, where Apple does a good job but could do better still such as:
- Remote code execution on iOS (6 vulnerabilities in 2016 so far)
- Phishing, brute force and social engineering attacks (the improved two-factor authentication is not yet available to everyone)
Am I wrong in thinking that users are way more likely to be affected by these threats except when the government has a warrant?
If Apple is actually worried about the FBI getting access to the modified iOS version, they should focus their complaint on that and propose to do the whole data extraction in-house.
It's not like iOS is impossible to hack now and it would be terrible that it becomes possible. There are other aspects of the system that allow malicious exploits more easily than this theoretical threat. So it doesn't make sense to preserve at all costs (e.g. making warrants unenforceable) an "impossibilty" that never was.
No, not really.
> Or is the ability to comply with a warrant worth nothing?
What if I issued a warrant for you to give me a 3-headed dog? Is your inability to comply with a warrant worth nothing?
Please continue to think
They may be doing it for people's right to privacy, but don't forget they might also be doing this because their image would be tainted irrevocably if they complied with this. Trust in Apple devices would be shattered (across those who currently trust Apple).
Why against hope?
 - http://idioms.thefreedictionary.com/hope+against+hope
I see these efforts as Google going far out of its way to support privacy and security on the web.
So, I'm doubtful.
It seems the primary aim of this statement is to prevent rumors that would spook Apple users into thinking their data is fair game.
Note that this letter says: "When the FBI has requested data that’s in our possession, we have provided it."
Seems like that detail is getting very little attention in this announcement. Really? If the FBI requests any data, they hand it over...?
For example, I have friends who work in law (though not in the US), and the number one data request (which is reviewed by a judge, and only then handed over by companies) is call logs from telephones (just from/to, date, time, duration, nothing fancy). And these are extremely helpful and information-rich, if you know how to use them.
But I was mainly pointing out that quote because it wasn't clear what lesson the parent commenter wanted Google, Facebook, and Amazon to be learning from Apple. I would have guessed it'd have something to do with protection of user data, but the letter says they turn over any user data they have!
As for the comment of Google, Facebook, etc, learning, I agree with you.
This specific case, in fact, is pretty close to "murder of a loved one;" the phone's owner killed people, and the FBI wants to find out if they were part of a bigger plot.
> Really? If the FBI requests any data, they hand it over...?
They have to, there's no legal wiggle room here. Creating backdoors OTOH seems to be sufficiently legally questionable that Apple can risk noncompliance.
(Of course, Apple could not accumulate all that data in the first place, but that would be silly, obviously. Now please sync your wifi passwords to iCloud.)
This isn't true. You can stick to app repositories like F-Droid and use Raccoon to download Play Store apps via your desktop without using a Google account on your phone.
I don't use them personally, but I imagine Google Now, Gmail and Google Maps would need Play Services.
The apps I do use (non-google) tend to function well enough without Play Services though.
Anyone who does is a rounding error.
> Anyone who does is a rounding error.
I'm actually curious if there is literally anyone who uses no proprietary software, including the radios and the SoC, on their Android device.
My bet is that there's not even a single device out there for which this is possible. (If there is, I'd love to see it.)
According to them, there are unfortunately no baseband modems on the market that can legally have their firmware distributed as free software. Their workaround is to keep the modem as isolated from the CPU/RAM as possible.
What is the legal restriction here? (It sounds like you're referring to some restriction beyond simple copyright protection on some of their components - are there FCC regulations regarding the firmware?)
EDIT: Ah, of course, the FCC needs to certify devices before they can actually be used.
>We unfortunately cannot provide free baseband modem firmware, as there is no option available on the market which would be able to fulfil this requirement. Even if it existed, it would bring very little value to the users, as operating a radio device with modified firmware on public networks without recertification is prohibited in most jurisdictions of the world and privacy concerns in cellular networks are mostly related to what happens on the network side, not inside the device.
I don't have any more information than this. If someone can quote specific FCC regulations to back this up, I would find that very interesting :)
Not entirely true, publishing the code isn't the same as allowing its modification. Code signing can be used to limit which versions are allowed to run.
Reproducible builds of the source would allow one to ensure that the binary, certified version of the code their baseband processor is running is legit (i.e. not backdoored). It would also help audit the code and spot security holes.
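Verification then reduces to a byte-for-byte comparison; a sketch, with hypothetical file names:

    import hashlib

    def sha256_of(path: str) -> str:
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # firmware-vendor.bin: the signed blob the device actually runs
    # firmware-local.bin:  your own build from the published source
    if sha256_of("firmware-vendor.bin") == sha256_of("firmware-local.bin"):
        print("vendor blob matches the published source")
    else:
        print("mismatch: the build isn't reproducible, or the blob was modified")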
If I can't run my home-compiled versions of your code - whether because of code signing restrictions or because of federal law prohibiting firmware that hasn't been certified - it's not free. So without reproducible builds, providing the source code for the firmware provides very little benefit (since I have no way to prove that the code corresponds to what's actually running on the device, nor any legal way to install and run it on the device myself).
Reproducible builds could in theory work, but actually getting builds to be bit-for-bit reproducible is not an easy feat. I'd be very surprised if firmware were capable of this.
This is a great example of why a free software license doesn't necessarily mean that the software is free. It means that the author has waived his/her ability to restrict your freedom to use/modify/distribute the software, but that doesn't mean that third parties (i.e., the government, or a patent troll) have done the same.
Basebands aside, the rest of the device is somewhat feasible to see being open.
Despite the openness of Android/AOSP, there are still, unfortunately, things like binary blobs for certain graphics chips and closed-source firmware for things like Wi-Fi chipsets. Given what we've seen agencies like NSA are capable of (intercepting hardware in transit to apply backdoors, paying off RSA to make Dual EC the default pRNG in their crypto libraries, etc.), them compelling a manufacturer of a component to include a backdoor in their closed-source blobs is no stretch of the imagination.
Apple even has this problem: basebands in cellular modems are notorious for being the source of exploits in otherwise-secure phones.