Apple Has Not Unlocked 70 iPhones for Law Enforcement (techcrunch.com)
137 points by _isus on Feb 19, 2016 | 63 comments



Not sure if this has been discussed, but it seems that the only reason this is an issue is because Apple has the ability to install new software on a locked device.

Couldn’t Apple simply remove the ability for a software update to take place without the device being unlocked or wiped?

Obviously this wouldn’t apply retroactively to old iOS versions, but it would be consistent with their change in policy from iOS 7 to 8.


I think they have already done this, but the phone at the center of the current controversy had not been updated to the new version of iOS. (In the new version, in order to update the OS on a locked phone you need to enter your passcode).

Edit: No, apparently this is wrong. I got it from https://www.washingtonpost.com/news/volokh-conspiracy/wp/201... , but that article has since been updated.


From TFA:

The California case, in contrast, involves a device running iOS 9. The data that was previously accessible while a phone was locked ceased to be so as of the release of iOS 8, when Apple started securing it with encryption tied to the passcode, rather than the hardware ID of the device. FaceTime, for instance, has been encrypted since 2010, and iMessages since 2011.

So Apple is unable to extract any data, including iMessages, from the device, because all of that data is encrypted and cannot be read unless the passcode is entered correctly. That is the only reason the FBI now wants Apple to weaken its security so that it can brute-force the passcode.
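
To make that concrete, here is a minimal sketch of what "encryption tied to the passcode" means in principle. The salt and iteration count here are illustrative only; per Apple's security whitepaper, the actual derivation entangles the passcode with a device-unique UID inside the hardware AES engine, which is why guesses can't be offloaded to a datacenter.

    import hashlib

    DEVICE_UID = bytes.fromhex("ab" * 32)  # hypothetical per-device hardware secret
    ITERATIONS = 100_000                   # illustrative; tuned so each guess is slow

    def derive_fs_key(passcode: str) -> bytes:
        # The filesystem key depends on the passcode AND a secret fused
        # into the silicon, so guessing must happen on the device itself.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                                   DEVICE_UID, ITERATIONS, dklen=32)

    # With the wrong passcode the derived key is wrong, and the stored
    # data is indistinguishable from random bytes; there is nothing for
    # Apple to "extract".
    print(derive_fs_key("1234").hex())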


They want Apple to disable the limitations that prevent them from brute-forcing the passcode. That supposedly isn't even possible on newer devices (A7 and later) that have dedicated security hardware (i.e. the "secure enclave").
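
Some back-of-the-envelope arithmetic shows why those limitations matter. Assuming roughly 80ms per attempt (the on-device key-derivation cost Apple has cited; treat it as an assumption here), a short numeric passcode falls quickly once the escalating delays and the ten-attempt wipe are out of the way, while a long alphanumeric one does not:

    SECONDS_PER_GUESS = 0.08  # assumed ~80ms per on-device key derivation

    for name, keyspace in [
        ("4-digit PIN",         10**4),
        ("6-digit PIN",         10**6),
        ("8-char alphanumeric", 62**8),
    ]:
        worst_case_days = keyspace * SECONDS_PER_GUESS / 86_400
        print(f"{name:20s} {keyspace:>18,d} guesses, "
              f"worst case {worst_case_days:,.2f} days")

    # 4-digit PIN: ~0.01 days; 6-digit PIN: ~0.93 days;
    # 8-char alphanumeric: ~202 million days.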


I'd say it's certainly possible, just difficult. Considering what Chris Tarnovsky has done as a hobbyist (with a focused ion beam workstation) to hack several TPM and smart-card chips and extract their secrets, the FBI would have to be incompetent not to be able to do the same to Apple's hardware. Or this is just a fishing expedition, because doing that is difficult and doesn't give them any more power.


The comment I was replying to was talking about software, not hardware, and said (quote):

the phone at the center of the current controversy had not been updated to the new version of iOS

If the phone was on iOS 9, as the article claims, then... that's the latest version of iOS.


I don't know if they do, and if they do, they're going out of their way to keep it hidden. They released an update to fix phones bricked after their Touch ID sensors were repaired by a third party [1]. The update was only released through iTunes, because they cannot push an OTA update to a locked device.

[1]: http://techcrunch.com/2016/02/18/apple-apologizes-and-update...


> Apple has the ability to install new software on a locked device.

People keep saying this, but it probably isn't true. When law enforcement has physical access to the hardware, they can take apart the phone, flash it with new software, and put it back together.


I don't think that's what's meant either.

You can plug any iPhone into iTunes and flash new firmware to it, as long as it's signed by Apple. That's what the FBI wants to do, I believe.


This is correct.

To eliminate this attack surface, the mechanism for applying new firmware needs to wipe the access keys for the encrypted data, and those keys must not be accessible by any path that doesn't involve changing the firmware.
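
In other words, something like the following policy. This is a hypothetical sketch of the proposed rule, not Apple's actual update path, and none of these names are real Apple APIs: the updater either sees a verified passcode, or it destroys the wrapped keys before flashing, leaving nothing for the new firmware to attack.

    # Hypothetical sketch of a "no update without unlock or wipe" rule.

    class SecureElement:
        def __init__(self, passcode: str):
            self._passcode = passcode
            self.class_keys = b"wrapped-filesystem-keys"

        def verify_passcode(self, attempt: str) -> bool:
            # Real hardware would compare derived keys and rate-limit this.
            return attempt == self._passcode

        def wipe_keys(self) -> None:
            # Once the wrapped keys are gone, the user data is permanently
            # undecryptable, no matter what firmware runs afterwards.
            self.class_keys = None

    def verify_apple_signature(image: bytes) -> bool:
        return True  # stand-in for the Apple-signed chain-of-trust check

    def apply_firmware(se: SecureElement, image: bytes, attempt: str | None) -> None:
        if not verify_apple_signature(image):
            raise ValueError("unsigned firmware rejected")
        if attempt is None or not se.verify_passcode(attempt):
            se.wipe_keys()  # update still proceeds, but the data is gone
        print("flashing new firmware")

    se = SecureElement(passcode="1234")
    apply_firmware(se, b"new-ios-image", attempt=None)
    assert se.class_keys is None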


But I already have to enter my passcode to apply the update, and again when my device boots. Can you link to technical details, please? I have been following these developments and have seen no technical explanation of the vector.


>>> Apple also argues that since its reputation is based on security and privacy, complying with the court’s demands based on an expanded application of a 200-year-old law could put it at risk of tarnishing that reputation.

On one hand, a corporate reputation. On the other, people from the FBI saying they need access to prevent terrorist attacks. There are many arguments to be made; this is not the winner.


One of those sides is being fatuously disingenuous.

The San Bernardino shooters demonstrated rather competent op-sec, and the 5c the FBI wants Apple to backdoor was a work phone, not even a personal one. I'm spectacularly skeptical they used that phone in any way whatsoever relevant to the attack. More to the point, the attackers are already dead. There is nothing to prosecute here.

They picked this case as the one to push because they thought it had the best PR value for their purposes, not because of any exigence or concern over further impending attacks.

That is to say, the FBI is cynically leveraging the outrage at a mass-shooting, inflicted by "brown people", to further their anti-crypto, "We must have access to ALL OF THE THINGS!" agenda. It's pretty disgusting, frankly.

EDIT: snark.


Both sides are positioning themselves for maximum advantage vis-à-vis their publics. Neither one is acting without self-interest. The USG says it wants to prosecute the crime and any associates; Apple says it wants to protect the privacy of its customers, since not doing so tarnishes its image, which damages its business proposition.

The courts, Congress, and the people will ultimately determine who prevails.


You'll note that I didn't dispute Apple's self-interest in their position. But they're being up-front about it: "Our customers will lose trust in us if we do this thing, and that hurts us as a business, as well as them." That's not merely PR. That's a fact.

The FBI's stated position, "These people did a terrible thing and we have to investigate it" is pure PR; the phone the case is proximally about is not at all what it's ultimately about.


> The FBI's stated position, "These people did a terrible thing and we have to investigate it" is pure PR

How is this not a fact?


There's nothing to investigate. Everyone except the conspiracy theorists has stipulated that these two committed the attack, and they're dead, so prosecution is moot.

Referring again to my point up-thread about the shooters' op-sec: there wouldn't be any useful intel on their (once again!) work phone anyway. "Investigation" is a red herring to disguise their actual motive.


There are companies, agencies, and other entities that allow employees to use their work-issued phones for personal use (some require reimbursement). And even if they explicitly forbade it, it's not as if a perp will necessarily obey those stipulations, so it's not a foregone conclusion that there would be no useful data there.


As meticulous and careful as these two were about keeping what they were up to secret, encrypting the evidence they didn't destroy, and so-on, do you really think they're going to be stupid and sloppy enough to call Terrorist HQ on their fucking work phone?

Red. Herring.

EDIT: Aside, I just went down a diverting and entertaining little web-hole on the etymology of the expression "red herring". It is, indeed, a deliberate diversion: in the 19th century, fugitives would cover their tracks with smoked (red) herring, because it would put the bloodhounds off their scent.


More like dual purpose. The lead has legs, even if weak ones; the possibility that they did trip up is not infinitesimal. It could be a small lead, or it could turn up nothing. Given its potential dual purpose, if I were the FBI, I'd go for it. They'd be remiss if they did not pursue it.


...and that obligates Apple to help them — at the peril of their customers' trust — how, again?


By way of a lawful court order? Customers would lose trust if Apple were negligent about their users' security, but there is no negligence here. It's a court order, which Apple can contest (and has), and we'll see who prevails in the courts. This isn't a rogue agent knocking at the door and ransacking things.


Irony being that the real shooters were 3 tall white dudes from Craft Intl.


Is this just sarcasm, or is there really any evidence for this? Craft were also at the Boston Marathon.


Most people who use a work phone do not use a personal phone. This phone could have evidence to implicate their neighbor (http://abcnews.go.com/Blotter/san-bernardino-shooters-neighb...) in a crime.

I see nothing "disgusting" about the FBI's behavior here.


You're perfectly within your rights to trust law enforcement as much as you'd like, believe that any investigative power they're given won't be abused, and that this isn't a cynically calculated grab for exactly that purpose.

I don't.


This isn't a grab for power; it's a power they already have. They went through the courts to get the proper order, following the letter of the law. We give the government the power to open bank safes, seize property, and wiretap, but we make laws that allow it to do so only when it's in society's best interests. If you don't think the law serves society's best interests, it's up to you, as a citizen your government represents, to help change it. In this case, I think you'll find that a majority of people will say it is in society's best interests to open the San Bernardino shooters' bank safes, obtain their phone records, and decrypt their phone in order to look for accomplices.


FISA warrants and NSLs are "following the letter of the law", too. You don't think they're subject to egregious abuse?

As for what the "majority of people" find reasonable, all I have to do is look at the polling numbers in the current presidential race to find myself in abject disgust at what the "majority of people" appear to think. They seem not to give half a shit what happens to anyone else, as long as they can watch the next episode of American Idol without fear of "getting blowed up by terr'ists" — despite the fact that more people have won the lottery than have died from terrorism in the US since September 11th.

EDIT: Anyway, there's nothing new or informative to be found in this discussion. We disagree. Let's leave it at that.


This is a court case. What is under discussion is not what is reasonable, but what the law is and how it applies to this case. That seems to come down to whether or not it is considered an "undue burden" on Apple to break the security on this device, and that's it.

If you want the law to change, I would hope that the court sides with the FBI, as that will provide far more ammunition to anyone wanting to change the law than ruling in favor of Apple.


The only law I've seen proposed would force Apple to comply with the FBI request. Given that, perhaps the law doesn't currently mandate what you think it does?


This is a discussion. Most of the people here are not lawyers. In discussions we generally discuss right and wrong and what is reasonable, which have nothing to do with the law. The law is an ancillary technical concern about what the FBI or Apple will be forced/allowed to do, and has nothing to do with what they should do.


IIRC, this particular guy had a separate personal phone, and he thoroughly destroyed it before the attack. That does seem to imply that he had used it for all nefarious discussions, not the undestroyed work phone.


I think the two parties need to sit down and come to some agreement, because more and more personal information and activity will be stored on phones in the future. The number of conflicts between Apple and the government will keep increasing. People need to find the point of balance.


With respect: I completely disagree. I don't think the government has any right to force Apple to let it bypass anything encrypted. IMO this isn't a balance situation, but a binary one.


Thanks for your opinion. Do you think this is a problem that can be solved? Or will the two parties just keep fighting without end?


Fortunately, that argument is being made in the NY case, which is not the terrorism-related case in CA.


The interesting thing in the CA case that's not getting a ton of play is that the FBI has the support of the phone's legal owner (a government agency), but the owner doesn't know the passcode in this situation.

If the suspected terrorists owned the phone themselves, we'd also have the 5th amendment wrapped up in that case too.


If the suspected terrorists owned the phone themselves, we'd also have the 5th amendment wrapped up in that case too.

No we wouldn't. The "suspected terrorists" are dead.


Not all of them. Their neighbor is a suspected accomplice and is currently being investigated.


Irrelevant. He has no 5th Amendment standing vis à vis the deceased shooter's work phone.


The what-if scenario was if the suspected terrorists owned the phone (as opposed to the deceased's employer). The living neighbor is a suspected terrorist. If he owned the phone and let his now-deceased neighbors borrow the phone, the complications mentioned above could arise.


I wonder what Apple's PR machine is doing on this front to support their position. The more articles I see about what Apple can, cannot, will not, or could not do, the more I suspect Apple's PR team is behind some of them.


I get so sick of this. Whenever some sort of dictatorial measure is taken or proposed, the go-to cleanup is "this is the norm, this is status quo, nothing to see here."

We saw it all over the fucking Bush administration when the extraordinary-rendition reports and the waterboarding shit were coming out. We even saw Bush (probably Cheney) writing his stupid little memos, which he tried to mold into executive orders absolving the CIA of torture.

Does anyone actually buy this fucking defense?


Excuse my ignorance, but is Apple's encryption in iOS 9 devices significantly stronger than any of the newest Android devices? Also, are there any precedents of any government demanding the unlocking of any Android phones?


>Excuse my ignorance, but is Apple's encryption in iOS 9 devices significantly stronger than any of the newest Android devices?

AFAIK we don't know, and can't really compare. It's my understanding that Apple's setup is closed-source, so we can't inspect it, whereas Android's is open.

Someone please correct me if I am wrong.

>Also, are there any precedents of any government demanding the unlocking of any Android phones?

This isn't an issue because, by default, Android phones send everything to the cloud (with exceptions for custom ROMs, etc.), so the government can just ask Google for a copy from its servers instead.


This is interesting and new to me:

   > For iOS devices running iOS versions earlier than iOS 8.0 [..]
   > Please note the only categories of user generated active files
   > that can be provided to law enforcement [..] are: SMS, iMessage,
   > MMS, photos, videos, contact, audio recording, and call history.
   > Apple cannot provide: email, calendar entries, or any third-party
   > app data.
What is the difference between the two categories? Are email and calendar entries encrypted on iOS 7 and below?
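
One plausible explanation is Apple's per-file Data Protection classes. Files marked NSFileProtectionComplete (Mail is the well-documented example) are encrypted under a key derived from the passcode and are unreadable while the device is locked, while other built-in data sat under class keys available without the passcode. A rough sketch of the distinction; the class names are real iOS constants, but the per-file assignments below are assumptions for illustration, not a documented list:

    # Rough model of pre-iOS-8 Data Protection classes. The class names
    # are real iOS constants; which file uses which class is assumed here.

    PASSCODE_DERIVED = {"NSFileProtectionComplete"}

    FILE_CLASSES = {
        "mail":         "NSFileProtectionComplete",  # documented since iOS 4
        "calendar":     "NSFileProtectionComplete",  # assumption
        "sms":          "NSFileProtectionNone",      # assumption
        "photos":       "NSFileProtectionNone",      # assumption
        "call_history": "NSFileProtectionNone",      # assumption
    }

    def extractable_while_locked():
        # Classes not derived from the passcode could be unwrapped by
        # Apple's forensic tooling using only hardware-derived keys.
        return [f for f, c in FILE_CLASSES.items() if c not in PASSCODE_DERIVED]

    print(extractable_while_locked())  # ['sms', 'photos', 'call_history']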


If it's about Apple, some key detail will always get jacked up by a journalist, like Shane Harris of The Daily Beast, trying to translate it for the public.

The report is extremely misleading and an example of bad journalism.


TLDR for what follows: Mandated backdoors must be a red line, but this is not a request for a backdoor and actually seems pretty reasonable. Trying to argue that the tech industry shouldn't help, even in this case, is not only the wrong position in my book, but a sure way to lose the bigger debate.

My views on the general encryption controversy are:

1. Everyone must be free to make their technology as secure as they possibly can. There can be no mandated weakening of security, back-doors, or other requirements to make the information more easily accessible by law enforcement. On newer iPhones, Apple has patched up the flaw that the FBI wants their help with exploiting. They must continue to be allowed to do that.

2. The government must be able to demand, with a court order predicated on probable cause, that companies provide any and all information that they have that could be useful in circumventing their security features. This can be everything from technical specifications and threat-model analyses, to lists of unpatched vulnerabilities and code-signing keys.

3. It seems to me that American companies have a moral obligation that goes beyond the legal obligations in point #2. They should be actively assisting the government in recovering information, especially when concerning issues of national security. In extreme circumstances, like total war, this should definitely be legally mandated. I'm undecided as to what the policy should be generally. On a practical level, it's probably not feasible for the government to, e.g. start hacking around the iOS codebase themselves, so just information might not be enough.

I'm not too troubled by this court order, especially given the particular circumstances. The right to make products as secure as you can, even from yourself and the government, is what's really important to defend.

Trying to argue that the tech industry shouldn't help, even in this case, is not only the wrong position in my book, but a sure way to lose the bigger debate.

Disclaimer: These are obviously my own personal views and nothing else. They do not necessarily reflect the opinions, policies, or practices of anyone but myself.


I call b.s. There is no possible justification for a company to hand over any private keys, code signing or otherwise.

There are valid questions about whether a given data set should or needs to be encrypted. But giving up private keys means some entity can masquerade as that key's owner and alter what that owner has written. Not OK. The authentication, veracity, and provenance of a data set are vital to the trust model. Why would you advocate breaking this?

Second, assisting which government? Any? All? Some? What's the metric?


Our threat models and business models don't have the benefit of being law. They must rise and fall according to the realities of the world they contend with.

A digital signature does not inherently mean much. It just (pseudo-)guarantees that a holder of a copy of the private key signed the data. Taking that to mean anything else, such as that a signed update was freely designed by Apple with only pure intentions, or even that Apple is the only holder of the private key, is nothing but our own wishful thinking. This court order has hopefully shattered our naivete. The real-world now says that you can't trust that a third-party's private keys are not controlled by governments. It was extremely unrealistic for Apple to include the government as an adversary in their threat-model, claim to protect against it, but then leave this opening.

>Second, assisting which government? Any? All? Some? What's the metric?

When it comes to questions of legality, the simple answer is that the company has to abide by the laws of all of the countries that it operates in. If they don't want to, or if they can't because two countries have conflicting legal demands, they have no choice but to leave that country. This cannot be any other way. See my other responses in this thread for a more detailed discussion of this "but what if China did this" issue, if that's the argument that you're alluding to here.

When it comes to questions of only morality, it's up to the individual people in a company to advocate for the right course of action. Only their own conscience, and the free judgments of their peers, can guide them.


The court order does not require Apple to turn over their private key. What it does is compel Apple to sign something with their private key, under duress.

You're basically making a might makes right argument, far extended from a Hamiltonian position of implied powers.

If public-key cryptography can't be trusted, and governments are in fact not fully trusted (in the U.S., at least, we're taught explicitly that government can't blindly be trusted, which is why we have a written constitution), then the entire trust model fails, and we can just stop with all this nonsense and go back to pen and paper.


Do you still hold the same position if we replace the United States / FBI with China?

Apple sells tens of billions of dollars of iPhones to China every year. If Apple provides assistance to US law enforcement but not to Chinese law enforcement, that's going to be a disaster. But if Apple provides assistance to Chinese law enforcement, that's a different kind of disaster.


Search and seizure is a fundamental government power. I doubt any modern state would be viable (i.e. maintain its monopoly on violence) without it. In the US, we have procedures and standards of evidence that the government must meet in order to exercise that authority. That's not the case in China.

So yes, the Chinese government should absolutely have this authority; they wouldn't be much of a government otherwise. They should also absolutely have protections against unreasonable searches and seizure. The lack of these protections is the problem, not that their government has search and seizure powers. A tech company operating in China has to make a choice between subjecting itself to Chinese law, or not doing business in China.

The same goes for policing in general in authoritarian regimes. The Chinese police state is undoubtedly evil. But I'd bet that most of the time, they're going after your run-of-the-mill crooks that need to be policed in China just as they are in America.

Your argument is a good one though. I think there is a real danger of accidentally building the infrastructure for a future totalitarian regime. It's also a good political argument when it comes to international issues, like Microsoft's argument that the US government can't force them to hand over data that's stored in Ireland. I'd like to see Microsoft win the case in court, but I suspect that the government will win. US court orders are binding even when they require you to break foreign laws. But the government exercising that authority in this case could totally undermine US cloud data providers, so I suspect that Congress could be persuaded to restrict the government's authority here.


> Search and seizure is a fundamental government power.

What should also be fundamental is that it not be used unless you suspect a person of wrongdoing based on serious facts in the first place. It's never supposed to be an all-encompassing, absolute power.

> I doubt any modern state would be viable without it.

There's hardly any data on states that don't exercise such powers, so don't spread the fallacy that the opposite cannot be true. Just because A is currently linked to B doesn't mean you need A in absolute terms to have B; you simply don't know that. We live in an era of nation-states that themselves descend from a long history of monarchical power (inheriting more or less the same rights and powers, vested in governments instead of a single person), so it's not as though we have ever tried to design societies in a very different way.


> What's not fundamental is that it should not be used UNLESS you are suspecting a person of wrongdoing with serious facts in the first place. It's never supposed to be an all-encompassing absolute power.

I agree 100%. In this case, the government has a court order for a specific phone used by a specific person who committed a particular crime. What exactly are you arguing against?

> There's hardly any data on states that don't exercise such powers, so don't spread the fallacy that the opposite cannot be true.

I'm sure there's an interesting discussion to be had about alternatives to the nation-state and how they might control the exercise of violence, which is by definition the fundamental characteristic of any system of government. But we live in a world where this authority is centralized (or sometimes, in federal systems, split between different levels of central authority).


> What exactly are you arguing against?

I was arguing about your statement that seemed like a blanket one. I have no problem with the use of search in this particular case.

> But we live in a world where this authority is centralized

Yes, but that does not mean there is no alternative. Large-scale societies are still very recent in human history, and authoritarian systems have largely led to wars up until now, so assuming this is the only working system is just survivorship bias at work.


> Mandated backdoors must be a red line, but this is not a request for a backdoor and actually seems pretty reasonable.

This request is not so much a red line, it's more of a yellowy orange line. A yellowy orange line that takes us one step closer to crossing the red line without even realising we are doing so. See: http://imgur.com/xWpvw

There is a real danger in accidentally building the infrastructure for a future totalitarian regime.

Full credit to Apple for actually seeing this and being willing to make a stand.


You know, your combination of (1) and (2) has been the status quo throughout American history. Companies are free to build the safest safe they can. But when the government has a warrant, they are obligated to provide reasonable assistance in opening it. I'm anxious about a move away from that status quo in either direction (either mandated backdoors or companies refusing to do what they can when presented with a warrant).


Yep. But this safe-maker promised people a safe so strong that the maker himself can't break in. The government's request and Apple's response have proven that false, at least for older iSafe units with the default combination length (4 digits). This line from their press release makes me suspect that even newer iSafes are not safe from this safe-maker:

> In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.


> This line from their press release makes me suspect that even newer iSafes are not safe from this safe-maker:

Please see this comment for discussion about that line of reasoning: https://news.ycombinator.com/item?id=11131777

I'm not saying that your intuition here is wrong, but rather that Cook's statement has some subtle nuances to it that some seem to miss.


Agreed. Moreover, the TechCrunch author is being disingenuous. The data on the 70 other devices was also "secured" behind a PIN, even though it was not encrypted. Apple built tools to extract the unencrypted data from those phones without entering a PIN. Similarly, Apple could write a tool to flash this phone with a build that does not enforce the wipe rule. Just because the security Apple is being asked to circumvent in this case is somewhat harder for an outsider to circumvent than the security it circumvented previously doesn't mean Apple is absolved from being compelled to circumvent it when presented with a proper legal order.


I'm surprised that Apple has gone so political against the government here. Despite a massive amount of lobbying by the FBI, Congress has so far given the tech industry the benefit of the doubt, believing our argument that "we're just making our products as secure as we possibly can, even against ourselves." Now Apple is parroting that line even when it doesn't apply, and using it in a political PR war against the government. How do you think Congress will see that? Apple has stepped off the high ground and onto the battlefield.


> Congress has so far given the tech industry the benefit of the doubt

I think it's more that they listen to the NSA rather than the FBI, as opposed to giving the tech industry any sort of benefit of the doubt.

https://theintercept.com/2016/01/21/nsa-chief-stakes-out-pro...



