
Secret Memo Details U.S.’s Broader Strategy to Crack Phones - plhetp
http://www.bloomberg.com/news/articles/2016-02-19/secret-memo-details-u-s-s-broader-strategy-to-crack-phones
======
mtgx
> _What the court is ordering Apple to do, security experts say, does not
> require the company to crack its own encryption, which the company says it
> cannot do in any case. Instead, the order requires Apple to create a piece
> of software that takes advantage of a capability that Apple alone possesses
> to modify the permanently installed “firmware” on iPhones and iPads,
> changing it so that investigators can try unlimited guesses at the terror
> suspect’s PIN code with high-powered computers. Once investigators get the
> PIN, they get the data._

I don't think there's much difference between a backdoor and that. A backdoor
can be "just a vulnerability", and that's what the FBI is asking Apple to
create - a vulnerability in its security system.

It's kind of like saying "we don't want Apple to break its AES-GCM encryption,
we just want it to replace it with RC4." Or "we only want Apple to support
export crypto protocols as well, so we can downgrade to them when we do our
attacks".

Whether we call it a "backdoor" or "vulnerability" or "just don't make it that
secure" thing, the end result is the same. The FBI wants Apple to weaken its
security, and that weaker security can and will be exploited by malicious
actors, too (even if you're assuming it won't be abused by the FBI and the
police itself, which of course it will be).

~~~
randcraw
Presumably any "backdoor" is a security hole that can be accessed on any
phone. So far, this request asks that Apple remove only the part of one
phone's security that 1) destroys that phone's data after 10 failed tries,
and 2) slows the automated entry of passcodes into that phone. So far, this
case is specific to one phone and thus not about a general backdoor.

As the case stands, unless Apple can show how compliance weakens security on
_other_ phones, I don't see how they can refuse to comply, unless it's
technically impossible for them to do so. If it _is_ impossible, their refusal in
this case may lead to a ruling that requires them to change iOS to comply with
such requests in the future. But if they can comply now, only the security on
one phone will be diminished. It's strongly in the company's interest to
comply.

If Apple does not (cannot) comply, and subsequently they are ordered to change
iOS to comply with future requests, would this be a "backdoor"? Yes. I don't
see how Apple can add this "feature" to iOS and assure that only authorized
legal authorities could activate it in the future.

Is this case about changing iOS to comply with future requests of this kind?
Not yet.

~~~
sandiegodave
If Apple creates new software that removes a part of one iPhone's security,
that software can potentially be used on all iPhones. It's no different from
what you said: "I don't see how Apple can add this 'feature' to iOS [or in
this case, create a tool to use on the current version of iOS] and assure that
only authorized legal authorities could activate it in the future."

~~~
ikeboy
How is that different from "If I hand over this iCloud data, I'll need to
create software to download iCloud data from a specific user, and that
software can potentially be used on any iCloud account"? How is this not
applicable to any warrant that requires technical ability to comply?

~~~
sandiegodave
Apple doesn't make the same promises about encryption and privacy on their
iCloud service that they do their iPhones, that's the only difference. If you
store information in iCloud and the FBI comes asking for it, Apple will hand
it over. By the very nature of iCloud, Apple has access to your data. Not so
with your iPhone. Apple says your device is encrypted and they cannot access
it without your consent, end of story. Forcing them to create tools that
invalidate this promise on one iPhone means creating tools that can invalidate
the promise on any iPhone.

~~~
ikeboy
Saying you'll break the law before you do it does not make it legal. "We told
our customers we won't do something" is not a legal defense.

By the very nature of iPhone 5C, Apple can put whatever software they want on
it with physical access.

Also, note that the phone is owned by the government. They provided it to the
terrorist, who worked for the government. So there's no promise broken to
customers, because it's the customer itself that's asking for this.

~~~
sandiegodave
I'm not saying anything about the (il)legality of Apple refusing the FBI's
request, or whether their iPhone promises constitute a legal defense. At this
point it's neither legal nor illegal, and the courts will take up this case
soon (right now, Apple is arguing that precedent is on their side. From the
NYT: "In a 1977 case involving the New York Telephone Company, the Supreme
Court said the government could not compel a third party that is not involved
in a crime to assist law enforcement if doing so would place “unreasonable
burdens” on it.").

I've only been responding to the claim that this is only about "one iPhone",
as if there would be no impact on all iPhones as a result.

~~~
ikeboy
Apple lost in court. They may win on appeal, or not, but as of now they've
lost. I think that "it may or may not be legal" is not correct.

Re one iPhone: I responded that the same could be said about any warrant. The
actual order requires them only to modify one iPhone. Whether that proves they
can do it for others doesn't matter. If they need to do it in this case, they
need to do it whenever the government has a warrant. That's not a slippery
slope, and it doesn't affect any non-warranted devices.

~~~
sandiegodave
The legality is still in question, then. If they can still win on appeal, it's
not definitively illegal.

As for the rest, I only pointed out the difference between this request and an
iCloud request, which you asked about (rhetorically, I know). This iPhone
request plausibly places an undue burden on Apple that the iCloud requests do
not, so it's different (including from a legal standpoint). If they eventually
lose and the Supreme Court says "No it's not an undue burden, now go hack the
phone!" then fine, the highest court in the land will have declared your
analogy sound. Right now, that's very much in dispute.

~~~
ikeboy
1\. This is quickly getting into the philosophy of law. Every case can be
struck down by either appeal or a later case overturning precedent. I think
such cases should be thought of as "it was illegal, but now it has been
changed by the court". So once someone's lost, we call their actions illegal
until they appeal and win.

2\. You said that complying would open the door to doing it for other phones.
In that regard, complying with an iCloud request also opens the door to
complying with future iCloud requests. The fact that it creates the ability
(for Apple) to do it later has no effect on legalities.

There may be a difference on undue burdens, but that's a different point, and
your point would still be invalid. "Creating new software for FBiOS is an undue
burden, but creating new software to download a user's iCloud data isn't" is
a different argument than "creating new software means we can use it again
later." The latter is the argument you made, and it doesn't differentiate
between iCloud and FBiOS.

If we're concerned the FBI will take the software and use it without a
warrant, Apple was given the option to do everything on their own premises and
just give the FBI the data/unlocked phone when done.

~~~
BraveNewCurency
> You said that complying would open the door to doing it for other phones. In
> that regard, complying with an icloud request also opens the door to
> complying with future icloud requests.

There is one crucial difference. In the iCloud case, the government must ask
Apple any time they want to get iCloud user data. Each time, Apple can verify
that they have a court order before doing the work. So it's "stateless" in
that the first request doesn't "open the door" for later requests.

In the iPhone case, the government is asking for an OS (signed by Apple) that
can be flashed onto _any_ device. This new OS would have a giant backdoor that
disables all the protections of an iPhone. We all know there is no way to
prevent this OS from being used elsewhere, for other uses. This is not
stateless -- once Apple creates this OS, there is no going back, all phones
are now insecure.

~~~
ikeboy
I went into this elsewhere.

Every update to any iOS device (well, any since the iPhone 3GS) requires a
signature unique to that device. Since iOS 5, it also includes a nonce
generated on-device at the time of the upgrade, so you can't even replay the
signature; it needs to be signed at the time you install the new version.

This is why you can't downgrade to earlier versions of iOS after Apple stops
signing them.

Anyway, I hope it's clear now why that was wrong.
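A toy model of why this blocks replaying a signed OS image: the signing
server's ticket covers the firmware hash, the device's unique ID, and a nonce
the device generates at install time, so a ticket captured for one install
verifies for nothing else. This is only an illustrative sketch — HMAC stands in
for Apple's real asymmetric signature, and all names and values here are made
up:

```python
# Hypothetical model of per-device, per-install firmware signing.
# HMAC with a shared key stands in for the real public-key signature.
import hashlib, hmac, os

SIGNING_KEY = os.urandom(32)  # stands in for the vendor's private signing key

def sign_firmware(firmware_hash, device_id, nonce):
    """Signing server binds the ticket to one device and one install nonce."""
    return hmac.new(SIGNING_KEY, firmware_hash + device_id + nonce,
                    hashlib.sha256).digest()

def device_accepts(ticket, firmware_hash, device_id, nonce):
    """The device recomputes the expected ticket for its own ID and fresh nonce."""
    expected = hmac.new(SIGNING_KEY, firmware_hash + device_id + nonce,
                        hashlib.sha256).digest()
    return hmac.compare_digest(ticket, expected)

fw = hashlib.sha256(b"iOS 9.2.1 image").digest()
device_id = b"\x00\x01\x02\x03\x04\x05\x06\x07"
nonce1 = os.urandom(20)            # generated on-device at install time
ticket = sign_firmware(fw, device_id, nonce1)

assert device_accepts(ticket, fw, device_id, nonce1)       # fresh install: OK
nonce2 = os.urandom(20)            # a later install generates a new nonce
assert not device_accepts(ticket, fw, device_id, nonce2)   # replay: rejected
```

Because the nonce changes on every install attempt, a captured ticket is
useless later, which is the property that stops downgrades once signing stops.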

------
kbenson
> Knake said that the Justice Department’s narrowly crafted request shows both
> that FBI technical experts possess a deep understanding of the way Apple’s
> security systems work and that they have identified potential
> vulnerabilities that can provide access to data the company has previously
> said it can’t get.

I assume the actual request is more technical, then, because the overview they
gave here explains the things you would want to do if you knew nothing about
the encryption and wanted to brute-force it: reduce password-attempt timeouts,
allow automating the password attempts, and don't wipe the device after too
many failures.
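To make the scale concrete: a 4-digit PIN has only 10,000 possibilities, so
once the retry delays and the 10-try wipe are gone, exhaustive search is
trivial. A toy sketch, with PBKDF2 and a made-up salt standing in for the
device's hardware-entangled key derivation (real devices use far more
iterations and a key that never leaves the hardware):

```python
# Illustrative only: why removing rate limits makes a 4-digit PIN worthless.
import hashlib, itertools

SALT = b"device-unique-salt"   # hypothetical; real devices mix in a hardware UID
ITERATIONS = 1_000             # deliberately low so this sketch runs quickly

def derive_key(pin):
    """Stand-in for the passcode-to-key derivation on the device."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, ITERATIONS)

def brute_force(target):
    """Try all 10,000 four-digit PINs against the target key."""
    for digits in itertools.product("0123456789", repeat=4):
        pin = "".join(digits)
        if derive_key(pin) == target:
            return pin
    return None

target = derive_key("7391")            # the key investigators want to reproduce
assert brute_force(target) == "7391"   # full search completes in seconds
```

On a real device the per-guess cost is enforced in hardware (roughly 80 ms per
attempt on the iPhone), which still only stretches 10,000 guesses to on the
order of minutes — hence the escalating delays and the wipe-after-10 policy
that the order asks Apple to disable.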

~~~
AngrySkillzz
That's why a lot of people (myself included) are so cynical about the request.
We all know the intelligence community has experience with side channel
attacks, decapping processors, etc. But they've apparently decided not to use
them in this case, "in the interest of time."

If they really wanted that data, they could get it with their current
capabilities. They don't really want that data; they want the legal precedent
to compel companies to subvert their own security mechanisms, and they want to
intimidate one of their harshest critics (Cook). That's part of their broader
strategy: if they can compel you to break security you built, your response is
to build security you can't break. The next logical question is whether they
can compel you not to build unbreakable security mechanisms in the first
place. That's the legal question the FBI really wants to ask.

~~~
guelo
Another option the FBI has is to issue a search warrant for Apple's code
signing key. Then they could flash the device themselves. That's what they did
to Lavabit and it has a lot more legal standing than this All Writs Act thing.

~~~
ddinh
If the FBI were to do this, would it be possible for Apple to counter it in
the future by splitting the key across multiple entities, some in non-US
jurisdictions, in such a manner that all of them would have to agree to sign
anything?

~~~
newman314
You just described David Chaum's crypto plan. Also widely derided as having a
backdoor. So it's really just moving the goal posts.

------
blisterpeanuts
The Clipper chip initiative[1] from the Clinton era completely failed, for two
reasons: one, the technology was proven to be flawed, and two, privacy
advocates shot it down.

It seems as though all the debates and analysis on this topic have already
occurred. Yet, here we are again: a law enforcement agency demanding special
privileged access to privately owned consumer electronics because it _might_
contain useful crime fighting information.

It seems to me that the U.S. needs to have a broader discussion about what
levels of government surveillance and intrusiveness into private lives we are
comfortable with.

The outside threat of terrorism is now the club being wielded to force the
issue, but is there really any evidence that this type of increased access
helps? We had the Boston Marathon attack, carried out by two brothers who had
immigrated from Chechnya, a known breeding ground for some of the most brutal
terrorists in the world; the Russians actually phoned to warn us about them,
and nothing was done.

Similarly, there was chatter in 2000-2001 about an attack involving passenger
jets, reported by Israeli and German intelligence agencies. Yet, nothing was
done. One would have thought it common sense to scrutinize foreign nationals,
especially from Muslim countries with a lot of hostility toward the U.S. among
the populace, who were involved in aviation. Reportedly, the Israelis even
were monitoring a couple of the 9/11 hijackers in the U.S. at one point.

Should we not be streamlining our intelligence bureaucracies to avoid another
Marathon fiasco, before sacrificing what little remains of our privacy on the
altar of national security?

1\. [https://www.eff.org/deeplinks/2015/04/clipper-chips-
birthday...](https://www.eff.org/deeplinks/2015/04/clipper-chips-birthday-
looking-back-22-years-key-escrow-failures)

~~~
rayiner
> It seems as though all the debates and analysis on this topic have already
> occurred.

I don't know if I buy that framing of it. We've already had this debate:
hundreds of years ago. We decided that the government can't search your stuff
at random, but if it gets a warrant, it can, and is entitled to reasonable
cooperation in doing so (breaking locks, drilling into safes, breaking open
safe deposit boxes, etc).

I think you can just as easily say that privacy advocates are the ones trying
to reignite a settled debate. They want phones treated differently than how
other property is treated. This is privacy advocates trying to shift the
Overton window: where a search pursuant to a valid court order somehow becomes
a privacy violation.

~~~
gergles
The difference in all of your examples is that the government actually
_cannot_ compel assistance with the things you described. The government
itself can break locks, drill safes, break open deposit boxes, of course. But
to date, they haven't tried to call Brinks and force them to come explain how
to break into a safe.

If I were able to somehow develop a file cabinet that was impenetrable, the
government would not be able to come compel me to break it open for them. They
are allowed to try to break it open, but they cannot compel me to help them.

(CALEA requires that communications systems be built so that the government
can snoop, but cell phones aren't covered by CALEA; the network is.)

~~~
rayiner
The power of the courts to compel cooperation in civil and criminal
investigations predates the Constitution. Courts routinely order banks to
drill into safe deposit boxes, for example. They can also force your
accountant to hand over records about you and testify against you. The
limiting principle is whether the cooperation imposes an "unreasonable burden"
on the third party.

~~~
nitrogen
The first key difference now is that our devices' activities are more akin to
sending a letter through the legally protected US mail than to disclosing
secrets to untrusted third parties like cell carriers (no reasonable person,
upon being shown a text messaging app, would think they were texting their
carrier and not their friend). Thus, we expect a much higher standard of proof
and accountability for accessing that information than has been applied of
late.

Second, our devices are becoming effectively cybernetic extensions of our
minds, and this cognitive intimacy definitely warrants a new discussion of
boundaries that is not burdened by awkward analogies to historic traditions of
the courts.

Finally, by whatever basis you want to argue (multiple amendments in the bill
of rights, etc.), we should not be forced to deny the effectiveness of
mathematics. We should have the right to use whatever algorithmic and physical
security mechanisms we desire to guard ourselves against hackers, thieves, and
foreign spies. If that slows the local spies down too, they'll just have to
find some other way to do their jobs, and reconsider whether the scope and
scale of their activities was appropriate to begin with.

~~~
rayiner
> Second, our devices are becoming effectively cybernetic extensions of our
> minds, and this cognitive intimacy definitely warrants a new discussion of
> boundaries that is not burdened by awkward analogies to historic traditions
> of the courts.

I guess I just don't believe this.[1] Courts have always been able to compel,
say, your friend or girlfriend to reveal your darkest secrets--stuff that you'd
never even write down--but your text messages should be treated differently?
It's not like cognitive intimacy didn't exist before electronic devices.

[1] As far as sci-fi philosophy goes, my go-to is WWHIS (What Would Happen in
Star Trek). You think the Federation can't compel someone to assist in
decrypting a computer?

~~~
nitrogen
Don't mix the separate points I was making. Text messages are like mail, other
device use is like thought, imagination, or memory. I also notice you didn't
mention spouses, which are treated differently from friends, and did not
address the third point about encryption.

I'm also not making an argument from sci-fi, I'm talking about reality today.
Star Trek (especially TNG) has a lot of great morality plays and valuable
lessons can be learned from it, but the vast majority of the action we see
takes place on a military-like vessel and can't inform us about how to
structure a total population. But this is all beside the point that we are
experiencing the melding of our minds and our devices right now.

~~~
rayiner
> other device use is like thought, imagination, or memory

And I'm saying I don't buy that. My phone isn't an "extension of my brain"
(that's what I'm calling sci-fi philosophy). It's a replacement for my phone,
calendar/notebook, and camera, which are all things that have been subject to
search with a warrant.

> I also notice you didn't mention spouses, which are treated differently from
> friends

But not for reasons relevant to your argument. Spousal privilege exists not to
protect _privacy_ but to protect _marriage_. The rationale is that you
shouldn't turn spouses on each other. It dates to a time when spouses were
considered the same legal person.

I didn't address your point about encryption because I don't disagree with it.
But banning encryption isn't directly at issue here.

Star Trek is always relevant. It imagines a future society where people
achieve prosperity through a very powerful, sometimes fallible, but basically
benevolent government. It's an ode to the righteous power of Institutions.
It's a compelling alternative to the anarchic leanings of much of modern sci-
fi.

~~~
nitrogen
_And I'm saying I don't buy that. My phone isn't an "extension of my brain"
(that's what I'm calling sci-fi philosophy)._

It may not be true for you, but you don't have to buy it for it to be
compelling to others. I'm saying that when I and presumably many others use
technology, it's not perceived as an external device, but rather a direct
extension of our minds and senses. There's no skeuomorphic substitution taking
place in our minds.

It's very vaguely analogous to (though much stronger than) a car enthusiast
saying that they feel one with the car. A similar concept applies to many
other tools humans use as well; people may say "ow!" and grimace in pain when
they damage their tools, even though their own flesh wasn't physically harmed.

I use those analogies to suggest that a tool used not by the body but by the
mind can have an even stronger integration with the core identity of a person,
a connection that must be taken into account by any laws seeking to regulate
such tools.

Regarding spouses, I'd make the argument that in a society that doesn't treat
a marriage as a single person, privacy is one of the necessary substitutes
that society needs to maintain its sanity and achieve the same end of building
stable, productive households. So the historic rationale for spousal secrecy
should not prevent us from extending the concept to other interactions,
perhaps even with non-human parties (like our devices).

I'm glad we can agree about the banning of encryption. Maybe it's not the
direct issue in the Apple case we are discussing, but it is a part of the
rhetoric being used by (parts of) the government, and does relate to forcing a
company to make it easy to break a device's encryption.

\---

 _Star Trek is always relevant. It imagines a future society where people
achieve prosperity through a very powerful, sometimes fallible, but basically
benevolent government. It's an ode to the righteous power of Institutions.
It's a compelling alternative to the anarchic leanings of much of modern sci-
fi._

It also only works in a post-scarcity, free energy environment. There's often
a recurring theme of what happens when a government is corrupt or is
infiltrated by outside powers (represented for the sake of fiction as brain-
eating parasites). And I'd say Kirk at least has a pretty strong anarchic
leaning :-). But it's mostly entertainment, and doesn't provide a blueprint
for achieving such a society. I really like Star Trek, but I think in this
conversation it's more distracting than it is enlightening.

~~~
rayiner
> But it's mostly entertainment, and doesn't provide a blueprint for achieving
> such a society.

I think it does. How did the society become post-scarcity and free energy?
Teams of scientists working in Federation research labs, Federation programs
that make the fruits of that technology available to everyone.

~~~
krapp
Bear in mind the technological basis for its post-scarcity society (cheap and
efficient conversion of matter into energy and back) is impossible in our
universe. It may not even be post-scarcity. We're shown the lives of the
elite, but there are Federation colonies that practice traditional
agriculture, mining colonies, trade in goods, starvation and chaos due to
technological breakdown, and sometimes a massacre or two. Certainly the
Original Series didn't seem post-scarcity, merely advanced (from the point of
view of the 1960s).

And as far as privacy goes, it probably depends on the series, but in the most
Utopian ideal of the Federation, would anyone even use encryption? Surely the
desire for secrecy would be considered a form of atavism, something humans
would have evolved beyond?

------
guelo
I wonder if these "national security" people sit around longing for the next
non-white person terrorist attack in order to spring their plans into action.

EDIT for the downvoters, my point about non-white people is that terrorist
attacks by white people, such as all the mass shootings, don't seem to trigger
the grand plans that these national security types like to execute.

~~~
morganvachon
That's a hair's breadth away from the lunatic fringe. It's a simple logical
leap to the conspiracy theory that the shootings are false-flag operations put
in place by the powers that be to initiate legislation on encryption.

That's probably why you are being downvoted.

~~~
ionised
False flag operations aren't exactly an alien concept to US law enforcement
and intelligence services, or to those of other countries:
[https://en.wikipedia.org/wiki/Operation_Northwoods](https://en.wikipedia.org/wiki/Operation_Northwoods)

[http://www.washingtonsblog.com/2015/02/41-admitted-false-
fla...](http://www.washingtonsblog.com/2015/02/41-admitted-false-flag-
attacks.html)

------
coldcode
"My guess is you could spend a few million dollars and get a capability
against Android, spend a little more and get a capability against the iPhone.
For under $10 million, you might have capabilities that will work across the
board". Go ahead, good luck - Apple

~~~
skybrian
That's not an improvement. If Apple is necessarily in the loop, at least they
have a chance to fight it in court. If the FBI can do it themselves, that's
one less procedural speedbump.

~~~
Consultant32452
Further, if Apple is necessarily in the loop, Apple won't close whatever holes
the government's tools utilize. If Apple isn't involved, there's a chance
they'd close the holes, even if by accident.

------
Zpalmtree
This may be a stupid question and obviously Apple would never do it, but if
Apple decided they wanted to ignore these requests, and stopped selling Apple
devices in the US until the government backed down, do you think public
opinion would force the government to comply? Just wondering how much power
such a huge company has.

~~~
newmemory
Nope. The CEO will just be tried, convicted and jailed. And while that's
happening, the backdoor will be quietly installed. Remember Qwest[1]?

[1]: [http://www.denverpost.com/business/ci_25434854/former-
qwest-...](http://www.denverpost.com/business/ci_25434854/former-qwest-ceo-
nacchio-claims-tv-his-jail)

~~~
ProAm
This is way more in the public's view compared to when Qwest went through this,
though. I don't believe the government has the balls to jail Tim Cook, CEO of
one of the world's most valuable companies.

~~~
morganvachon
Indeed, I think any pressure on Cook will come in the form of sanctions
against the company itself, forcing the board's hand into firing Cook or
convincing him to back down on his stance. Perhaps a deeper investigation into
offshore tax havens by the IRS than other companies are currently under? The
IRS is a powerful weapon of intimidation in the government's arsenal, and has
been wielded many times in the past.

------
bmay
Are there any open source alternatives to the iPhone that might take off
because of these happenings?

~~~
venomsnake
No need. The only thing Apple must do is change the iOS key management.

1\. Once you buy the phone, you (via iTunes) create an RSA key pair. Put one of
those keys in the phone. That key is fixed, and the bootloader uses it to
verify loaded updates.

2\. iOS updates come to you signed by Apple; you must re-sign them with your
iTunes key, and only then can they be loaded.

So you obtain the ability to sign your own software on your own device.

In that case, no amount of Apple assistance can help the FBI unless they obtain
your private key.
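A minimal sketch of that proposal: the bootloader demands both signatures, so
Apple's signature alone (the thing a court could compel) is no longer enough
to load software. Everything here is illustrative — HMAC stands in for the RSA
signatures to keep the sketch stdlib-only, and the keys are made up:

```python
# Sketch of a dual-signature update scheme: vendor AND owner must both sign.
import hashlib, hmac, os

apple_key = os.urandom(32)   # vendor signing key (set at the factory)
owner_key = os.urandom(32)   # generated by the owner via iTunes at setup

def sign(key, blob):
    return hmac.new(key, blob, hashlib.sha256).digest()

def bootloader_accepts(blob, vendor_sig, owner_sig):
    """Both signatures must verify; the vendor alone cannot push an update."""
    return (hmac.compare_digest(vendor_sig, sign(apple_key, blob)) and
            hmac.compare_digest(owner_sig, sign(owner_key, blob)))

update = b"iOS 9.3 image"
apple_sig = sign(apple_key, update)

# The vendor's signature alone is not enough:
assert not bootloader_accepts(update, apple_sig, os.urandom(32))
# Re-signed by the owner, it loads:
assert bootloader_accepts(update, apple_sig, sign(owner_key, update))
```

The design choice being illustrated: compelled assistance becomes useless
because the trust anchor is split, with one half held only by the owner.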

~~~
ikeboy
You lose your key, now you can't upgrade the phone. Nobody can, so the phone
goes down in value.

Perhaps the ability to completely restore shouldn't require your own
signature.

~~~
mjevans
If you've lost that key, you can't get in anyway. That's what a factory-reset
update (which would not include the old decryption key) is for. That scenario,
updating while removing the previous configuration, is pretty much baked in,
both for the case you mentioned and for reselling or refurbishing the device.

~~~
ikeboy
It's very difficult to bootstrap something like that and also make it
updatable. If the key can be changed, then some software X can update it, and
that software itself must verify the key. You have X updating Y and Y being
ultimately responsible for updating X. Not so simple.

Given that nothing like that is out there as far as I know, you'd need a more
specified model to prove it's even possible. It's not obvious from the
proposal above.

~~~
codys
This really isn't difficult to do:

\- loader A stores a (PubKey,NextLoader) and has the ability to blank (via
actual blanking or deletion of an encryption key) the entire device.

\- loader A provides a new-key(PubKey,NextLoader) method which blanks the
remainder of the device.

\- Loader A also provides an update-loader(NextLoader) method that doesn't
blank the device, but also doesn't update the PubKey. Before accepting the new
NextLoader, it verifies it against the stored PubKey.

It could also allow things like updating the PubKey when the update is signed
by the previous PubKey.

The spec presumes the only access is via the loader A API; the PubKey would
really need to be stored somewhere safe (HSM, TPM, etc.) to discourage direct
hardware access.

It could probably also use the PubKey to encrypt the NextLoader.

New phones ship either:

\- without a pubkey or next-loader, and require initial provisioning to do
anything (fairly inconvenient)

\- ship with a flag set that prevents updating the loader, i.e., requires
updating the key (less secure; this would probably need further specification
to keep the security hazard from un-updated phones from being too great).
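The spec above can be sketched as a small state machine. This is purely
illustrative: the class and method names just mirror the comment's
terminology, and HMAC stands in for real public-key verification:

```python
# Illustrative state machine for the "loader A" spec sketched above.
import hashlib, hmac, os

class LoaderA:
    def __init__(self):
        self.pubkey = None       # stored verification key
        self.next_loader = None
        self.user_data = b""

    def new_key(self, pubkey, next_loader):
        """Install a new key and loader, blanking the rest of the device."""
        self.pubkey = pubkey
        self.next_loader = next_loader
        self.user_data = b""     # anyone can take over the phone, never the data

    def update_loader(self, next_loader, signature):
        """Replace NextLoader without wiping, only if signed by the stored key."""
        expected = hmac.new(self.pubkey, next_loader, hashlib.sha256).digest()
        if not hmac.compare_digest(signature, expected):
            raise ValueError("update not signed by the device owner")
        self.next_loader = next_loader

key = os.urandom(32)
dev = LoaderA()
dev.new_key(key, b"loader-v1")
dev.user_data = b"photos"

# A properly signed update preserves the data:
sig = hmac.new(key, b"loader-v2", hashlib.sha256).digest()
dev.update_loader(b"loader-v2", sig)
assert dev.user_data == b"photos"

# An attacker without the key can only wipe, never read:
dev.new_key(os.urandom(32), b"evil-loader")
assert dev.user_data == b""
```

This captures the trade-off discussed below: the device can always be made
working again (via new-key), at the cost that anyone with physical access can
wipe it.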

~~~
ikeboy
> loader A provides a new-key(PubKey,NextLoader) method which blanks the
> remainder of the device.

Assuming this updates the loader, that means anyone with physical access for
five minutes can permanently brick the phone, without opening it.

Are we sure we want that?

Also, how can loader A modify itself?

~~~
codys
> Assuming this updates the loader

It doesn't. The model I presented presumes that it is unchangeable.

> anyone with physical access for five minutes can permanently brick the
> phone, without opening it

Pick one:

\- you can always get your phone working, even if you forget your key & only
you can apply updates to any software on the phone. Anyone can replace the
loader (but this wipes all other data).

\- you can always get your phone working, even if you forget your key & only a
third party can provide new versions of the loader.

\- if you forget your key, your phone is permanently bricked.

> Also, how can loader A modify itself?

It can't.

In general though, it's fairly straightforward to have code copy itself into
RAM and run from there while overwriting its source. The problem is that this
opens the potential to brick the phone (just like any method that allows
updating the loader).

To avoid bricking in all cases, one _must_ assume that there is some
unreplaceable software (or a hardware mechanism to start software).

~~~
ikeboy
I'm coming around to believing it's possible.

Now I'm wondering whether you can place malicious RAM in the phone that
changes instructions on the fly. Is that feasible?

~~~
codys
Lots of things become possible once one is willing to decap ICs to get at the
internals.

I'd expect the security-conscious parts (i.e., all of the theoretical "loader
A") would need to run in SRAM (i.e., RAM that is in the same IC and thus
harder to get at than external DRAM chips) or use some other mechanism.

At that point, it becomes a question of physical hardening within the ICs.
Some manufacturers have done things like put metal layers over fuses (to
prevent them from being changed), and I'd imagine the same could be done (at
some cost) for a larger area of the chip. I'd imagine HSMs (hardware security
modules) and TPMs (though these aren't as good) probably implement some of
that. There also exist some chips targeted toward security purposes (though
I'm not aware of any processors offhand) that could be used.

~~~
ikeboy
Wouldn't the regular RAM also need hardening? If I can modify the RAM, I can
change the OS that's loaded into RAM. Does this hardening slow it down?

------
ipsin
Just in case I'm missing it, the story is that there's a National Security
Council "Decision Memo" defining a strategy, but that memo has not been
leaked?

------
rbanffy
Simple question: what prevents the FBI from removing the components from the
phone and using software they themselves wrote to drive the hardware crypto
and decode the data they want?

It can't be that difficult, if you have FBI-class resources and some help from
the NSA, to lift the components and make them work on a copy of the encrypted
data.

~~~
rbanffy
Never mind. Just read the Q&A on the Secure Enclave. I wonder if decapping the
chip would allow extracting the UID from the chip wiring.

~~~
noblethrasher
The iPhone in question is a 5c, which does not have the Secure Enclave.

~~~
rbanffy
If I understood it correctly, the UID is still not directly readable even
though the actual computation happens in the same CPU, not inside a secondary
secure environment.

------
pc2g4d
I think this issue is important and I hope Apple prevails over the FBI.
However, I'm also left feeling that they're subject to this request only
because of what amounts to a security flaw in their own devices.

How/when can I run a phone OS that simply isn't subject to such known flaws
and corporate manipulation? What are my options?

------
sarciszewski
Warning: Autoplay video.

~~~
blisterpeanuts
Sigh, I guess it's time to enable click-to-play again[1]. I wish there were a
way to just automatically pause on load, without needing to completely disable
Flash.

1\.
[https://news.ycombinator.com/item?id=8802986](https://news.ycombinator.com/item?id=8802986)

~~~
coldpie
Firefox supports loading plugins like flash on demand on the Addons settings
page. Additionally, Firefox has a setting "media.autoplay.enabled" to prevent
HTML5 media from playing automatically. However, some websites assume autoplay
succeeded and behave wrongly. For example, YouTube's paused/play button state
is backwards.

------
treebeard901
Two interesting possibilities to consider: 1) The government has already
gotten past the iPhone security and read the data.

2) Apple already has the software they were asked to create.

------
MCRed
My Facebook feed is full of people ragging on Trump for being on the wrong
side of this issue, but they are silent about Obama:

"In a secret meeting convened by the White House around Thanksgiving, senior
national security officials ordered agencies across the U.S. government to
find ways to counter encryption software and gain access to the most heavily
protected user data on the most secure consumer devices, including Apple
Inc.’s iPhone, the marquee product of one of America’s most valuable
companies, according to two people familiar with the decision."

~~~
randcraw
Obama is not running for President.

~~~
liquidise
You're right. Unlike Trump, Obama's misguided opinions actually matter.

~~~
CaptSpify
Obama already gave up this fight, right away. He came into office promising to
fight it, then did the exact opposite. I guess I'm saying: we already know
where he stands; therefore, he's not interesting anymore.

