
Why Apple Is Right to Challenge an Order to Help the F.B.I. - doe88
http://www.nytimes.com/2016/02/19/opinion/why-apple-is-right-to-challenge-an-order-to-help-the-fbi.html
======
msravi
From here (4th para from the end):
http://www.nytimes.com/2016/02/19/technology/how-tim-cook-became-a-bulwark-for-digital-privacy.html

"Apple had asked the F.B.I. to issue its application for the tool under seal.
But the government made it public, prompting Mr. Cook to go into bunker mode
to draft a response, according to people privy to the discussions, who spoke
on condition of anonymity."

If this is true, it sort of implies that Apple would have done it, but
secretly, and that they were forced to take their very public stance because
of the FBI's posturing.

~~~
sharkweek
I'm actually a little surprised that the FBI would go this public with the
request. Anyone have any insights here?

Did they think perhaps there would be public outcry in the other direction,
meaning Apple would look like it was harboring terrorist information if it
didn't comply with a public request?

~~~
duaneb
> Did they think perhaps there would be public outcry in the other direction,
> meaning Apple would look like it was harboring terrorist information if it
> didn't comply with a public request?

This is exactly what is happening. It's a political Hail Mary to get the
easiest-to-sell use case of forcing privacy violations on American corporations
and users legally protected in court precedent. Of course, under this
interpretation, there's the disturbing implication that they had to wait this
long (after the revelations about PRISM) to find a case that so well matched
their political agenda.

It's also a little worrying that they are going public, because normally
that's not necessary for court verdicts. Perhaps they're preparing for
legislative action if the case fails?

~~~
swyman
Any insight into which presidential candidates have the most privacy-friendly
views here?

~~~
ghayes
I don't know why you're being down-voted. The US is in the midst of an
election cycle that may very well pertain to the "legislative action" of the
parent comment (in so far as the executive has an effect on the legislature).
A recent article [0] summarizes the candidates' positions:

Pro-Privacy: Cruz (R), Paul (R), Sanders (D)

Pro-Surveillance: Trump (R), Rubio (R), Bush (R)

Moderate: Carson (R), Clinton (D)

[0] https://nakedsecurity.sophos.com/2016/02/02/where-do-us-presidential-candidates-stand-on-privacy-and-surveillance/

~~~
chippy
> I don't know why you're being down-voted.

Off topic, but I am going to make a Hacker News bot that looks for this
phrase, waits 1 hour and sees if the comment is still downvoted.

Or, I'm going to write a copy-and-paste paragraph explaining to people why it
appears that a comment has been downvoted. This would include all the
mechanics involved: how karma works on HN, voting algorithms, mod flagging,
time passed, etc., as well as the human psychology at play. So, the above
phrase is either honest or dishonest.

For example, in this case I would posit that yes, you actually do know why
they were being downvoted, but that by writing that you were actually saying
"I disagree with the others who downvoted you". There is a difference.

It could be similar to when people say "I don't know why anyone voted for
Bush". They are not expressing ignorance about the reasons; they are saying
that they disagree with people's choices. In this case, it would be dishonest.
Unless the person was actually ignorant of others (it is possible), in which
case it is honest.

Off-topic, as mentioned.

Edit: I do know why I am being downvoted.

~~~
newscracker
On topic reply to an off topic comment. :) Could it also be because people use
"don't know" and "don't understand" to mean the same thing in many contexts?
With "don't understand" being longer to type and speak and also being more
"formal" in communication than "don't know", perhaps "don't know" is used as a
substitute?

In the Bush example, someone saying "I don't know why anyone voted for Bush"
probably means "I don't really understand why (and with what thought process)
anyone voted for Bush after everything we know about the matter."

------
bsder
The best way to put this for the other side is:

"How about we start with FBI mandated remote control gun disablers given that
it's guns that killed these people? Oh, you're concerned that someone will
figure out how to bypass it, and it won't just be the FBI disabling your gun?

Congratulations. You now understand my position."

~~~
snowwrestler
Why don't we have remote take-over capability built into airplanes? People
have certainly crashed them intentionally.

Why don't we have remote kill switches in cars? Think of the high-speed chases
that could be prevented.

Why don't we have surveillance cameras in private homes? Think of all the
crimes that could be solved.

Even if these are all off by default, and the law requires a court order to
turn them on, Americans would not accept them. Encryption is no different,
except that most Americans don't understand it well enough to see the
parallels.

But these are just arguments against backdoors.

The even more fundamental argument is about the All Writs Act. Can the FBI
require companies to develop new capabilities just with a court order?
Shouldn't that require legislation? Why did we bother to pass CALEA, if the
government could have just said "All Writs Act" and gotten what they wanted
that way?

~~~
danjoc
It is easier to understand if the same task is rephrased.

"Apple ordered by Chinese government to disable lock on phone seized from
undercover CIA agent."

The act is the same, and since Apple is a multinational company, the result is
no different for it. A government is legally requiring Apple to make decryption
possible, but suddenly the people crying for Apple to unlock this phone would
take an extremely different stance.

~~~
lern_too_spel
Everybody I've seen here saying that Apple should assist the FBI in unlocking
the device takes the same stance in both cases. If Apple doesn't like it, they
should either secure the device in such a way that it isn't easier for
themselves to compromise than it is for a third party or stop doing business
in that country if they believe the requests are unreasonable.

------
cptskippy
What bothers me most about this article is the following statement which is
accepted without contest.

"Law enforcement agencies have a legitimate need for evidence, which is all
the more pressing in terrorism cases."

What makes a terrorism case more pressing? How many domestic terrorist attacks
have had related followup attacks? How many domestic terrorist cases have been
linked to other domestic terrorist attacks? How many domestic terrorist
attacks have been carried out by the same set of individuals or groups?

The reality is that domestic terrorist attacks are not common or frequent, and
there is no urgency in investigating them because they do not lead to follow-up
attacks. They're one-off coordinated events, not a series of related events, so
there's no pressing urgency.

~~~
TearsInTheRain
What? Sure, terrorist attacks are not common or frequent, but they generally
involve murder on a larger scale than the average crime. There is of course a
lot of urgency in investigating them, as they could easily lead to follow-up
attacks. Look at San Bernardino and the Boston bombings: both pairs of
terrorists clearly had plans for follow-up attacks.

~~~
JoeAltmaier
What terrorists plan, and what happens, are very different. Terrorists often
die in their attacks, so followup plans are often moot.

------
grecy
I've been explaining this case to others and I've come up with a good way to
make them understand.

"Instead of the FBI making this request, how would you feel if the Government
of China were asking? or Russia, or Syria? Do you want them to have the
ability to read your encrypted data off your iPhone?"

~~~
incepted
If there is a subpoena against that phone, I don't really have an issue with
the government gaining full access to it. But there needs to be a subpoena and
that backdoor can't be activated without one.

It's a similar, albeit stronger, idea to "probable cause": giving law
enforcement a temporary waiver of the rights that regular citizens are usually
afforded.

~~~
jpgvm
The thing is, as long as a backdoor exists and can be activated at will, the
fact that a subpoena is required to compel someone to use it doesn't mean that
it can't be hacked by someone else for more nefarious purposes.

Deliberate backdoors can't be allowed to exist. Apple needs to put its foot
down and make that clear.

~~~
vectorjohn
There is so much misinformation floating around.

The FBI wouldn't get a backdoor to all phones. It would literally only work on
this one phone.

Or put another way, the backdoor is already there and Apple is being compelled
to open it for this phone.

~~~
jpgvm
You completely misunderstand. It's not about this phone, or even this
singular backdoor.

All backdoors are poison. End of story. If the FBI wanted access to the data
on the phone, they would have it already. But they don't. Probably because
they already concluded the attackers were acting alone, not as part of a
terrorist cell or network, and no longer need the rest of the information, or
they really just don't care enough about it.

However, they do want to use this case to set a precedent for industry to
produce new backdoors for them into previously secure-ish systems in the name
of "terrorists". This very idea is a cancer, one that will spread to other
technological systems they feel aren't easy enough to tap into whenever they
want. A cancer that will spread across the world as it sets a precedent for
nation states to force private companies into deliberately weakening the
security of their products to please whatever government finds its citizens'
access to encryption reprehensible.

So no, it's not misinformation.

You and anyone else who thinks this is OK just don't grasp the gravity of the
situation.

~~~
vectorjohn
I don't misunderstand at all. I just have a different opinion than you do.
However, a lot of misinformation is being spread.

The FBI isn't asking Apple to create a backdoor. End of story. They're
compelling Apple to open a backdoor that Apple already created. Apple made an
insecure system. They can, and I think _should_, fix it, but on this phone at
least the backdoor is there.

So, wait to use a slippery slope argument until it actually becomes applicable.
You're so trigger-happy to jump on the privacy, precedent-setting bandwagon
that you didn't stop to think about what this case actually is.

Basically, as long as you make insecure systems, the government _can_ (by the
fourth amendment no less) demand you let them in if they have probable cause
and a warrant. If you make something impossible to get into, and the
government starts demanding you stop making devices like that, then we have a
problem. But this particular case doesn't get us any closer to that outcome.
Not even symbolically.

I think you and everyone else overreacting to this case grasps gravity that
isn't there. You're tilting at windmills.

------
buzzdenver
Doesn't the fact that Apple is capable of unlocking the phone mean that a
backdoor already exists? Then it's just a question of a lone wolf telling the
FBI how to do it, maybe along with using some secret keys that could be stolen
if Apple is hacked like Sony was. I would like a phone that is unhackable even
by its creator. Anything else is just a matter of time before it gets broken
into.

~~~
function_seven
It all depends on how you define a backdoor. The 5c doesn't have the Secure
Enclave†, so all of its encryption routines are done on the CPU, in
software/firmware.

That software doesn't contain a backdoor, but the fact that a new software
load could be created and installed could be viewed as the backdoor itself.
But if that software doesn't currently exist, is the mere potential of it a
backdoor? Somewhat of a philosophical question.

If Apple prevails and doesn't write the software the FBI is demanding, then
the backdoor is not there.

The later iPhones with the Secure Enclave may truly be unbreakable if they
protect their secrets with hardware in the chip.

But my understanding of it is that Apple is NOT capable of unlocking the phone
right now. The FBI is demanding that they develop that capability. So I guess
they're capable of _being capable_...

† At the risk of being an armchair SE expert...
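For a pre-Secure-Enclave device like the 5c, the policy in question is exactly this software-enforced guard. A rough sketch (the delay values, class, and method names here are illustrative inventions, not Apple's actual implementation) of the logic the demanded firmware would strip out:

```python
# Illustrative sketch (not Apple's code) of a firmware-enforced passcode
# guard: escalating delays after repeated failures, wipe after the 10th.
DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600}  # seconds imposed after the Nth failure

class PasscodeGuard:
    def __init__(self, wipe_after=10):
        self.failures = 0
        self.wipe_after = wipe_after
        self.wiped = False

    def attempt(self, guess, real):
        if self.wiped:
            return "wiped"
        if guess == real:
            self.failures = 0
            return "unlocked"
        self.failures += 1
        if self.failures >= self.wipe_after:
            self.wiped = True  # keys discarded; data unrecoverable
            return "wiped"
        return "retry after %ds" % DELAYS.get(self.failures, 0)

guard = PasscodeGuard()
for _ in range(9):
    guard.attempt("0000", "1234")   # nine wrong guesses
assert not guard.wiped
assert guard.attempt("0000", "1234") == "wiped"   # tenth failure wipes
```

A firmware image with the delay and wipe branches removed is, in effect, what would let the FBI brute-force a 4-digit passcode in minutes.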

~~~
clort
They are capable of unlocking the phone. There is no philosophical question
about this, as they have the knowledge, the ability, and the authority.

That it might take some expertise and effort is not really relevant; some is
required in any case even if the firmware already existed, so it's just a
matter of scale. For example, I know several people who could not update a
phone's firmware without supervision, and probably some who could, at a stretch
(given massive time and resources), reverse engineer it. Apple firmly occupies
the sweet spot in this range: they have the secret knowledge that would
otherwise have to be reproduced, and building on it would take some effort, but
not a huge amount. Changing a bit of code [to remove the delay] and rebuilding
a binary image is not a complicated task for their engineers; it is likely
something they do all the time.

Personally, I think they should just do it if they are legally compelled to do
so. They have done so in the past, this is an older device, and in 5 years the
issue will be moot, as any firmware they create now would be useless.

~~~
JustSomeNobody
>Personally, I think they should just do it if they are legally compelled to
do so. They have done in the past and this is an older device and in 5 years
the issue will be moot, as any firmware they create now would be useless.

If they do it now, the point will indeed be moot in 5 years, not because of
technology, but because it will be required of them to make sure they can do
this to ALL iPhones.

------
moonshinefe
The fact that Apple has made this issue very public (with
http://www.apple.com/customer-letter/) is almost unprecedented (Lavabit and
SOPA had some similarities). I applaud them for taking a stand and not just
giving in.

If this legal precedent gets set that the FBI can force US tech companies to
break into their own customers' encrypted data, you can bet the industry will
lose millions if not billions of dollars worldwide in tainted reputation.

Who's going to buy US companies' devices that claim privacy via encryption if
they're easily backdoored at the FBI's request?

~~~
rayiner
Is that true? Armed with a court order, the FBI can search a U.S. house,
compel banks to turn over financial information, compel accountants to turn
over records, etc. Yet people worldwide extensively use those American
services.

~~~
adventured
That's not a similar example to what's going on here.

Your example would have to include a situation in which the FBI makes copies
of the keys to the house, copies of the hard drives, etc., and then inevitably
makes it embarrassingly easy for other parties to get access to those as well,
permanently reducing your net security to almost zilch regardless of the
outcome of the case and warrant.

And worse, in the process of gaining access to that one house and hard drive,
they simultaneously gain the potential to access millions of homes and hard
drives that have nothing to do with the warrant, and those millions of houses
and hard drives see their net security reduced to essentially nothing.

~~~
rayiner
They're offering to let Apple do the unlock themselves and just hand over the
data.

~~~
criddell
And that establishes a precedent. So, in a future case that is less
sensational, a court could order Apple to unlock that phone as well, and
Apple would have to comply. Apple doesn't get to pick and choose which court
orders it complies with. That goes for court orders in the US, China, Canada,
etc.

~~~
vectorjohn
Slow down there, you're moving the goal posts way too fast.

If you have an argument, stick with it.

If Apple doesn't want to look bad complying with FBI requests, they shouldn't
have put a backdoor in their software.

------
ikeboy
> It is also theoretically possible that hackers could steal the software from
> the company’s servers.

It's also theoretically possible for hackers to steal Apple's private key from
their servers, in the exact same way. As far as I see, there's no change to
the threat model by Apple making a new software version, and signing it using
the same process they use to sign other versions of iOS. It's useless if not
signed, so the real worry is someone having the ability to sign it, and that
applies exactly the same if Apple signs iOS or FBiOS.

This suggests the authors don't understand the technology well enough to know
this.

edit: this is assuming the software is locked to a specific device. So the
signing doesn't matter to any other device. If the software worked on all
devices, then a leak of a signed version would be problematic. Although even
if they couldn't lock it to device, they could make it only work for a short
time, say a week, so if that signed version leaks later it would have no
effects (I'm not sure if you can change the clock on a phone without unlocking
it though).
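The device-lock and expiry ideas in that edit can be sketched as a toy model. Here an HMAC stands in for Apple's real signature scheme, and the ticket format, field names, and key are all invented for illustration:

```python
import hmac, hashlib, json

# Toy model: a vendor-signed firmware "ticket" valid for one device
# and for a limited time window. Not Apple's actual update format.
SIGNING_KEY = b"vendor-private-key-stand-in"

def sign_ticket(udid, valid_seconds, now):
    payload = json.dumps({"udid": udid, "expires": now + valid_seconds})
    sig = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def boot_accepts(ticket, device_udid, now):
    expected = hmac.new(SIGNING_KEY, ticket["payload"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, ticket["sig"]):
        return False  # not signed by the vendor (or tampered with)
    data = json.loads(ticket["payload"])
    # Refuse to run on any other device, or after the window closes.
    return data["udid"] == device_udid and now < data["expires"]

t0 = 1_000_000.0
ticket = sign_ticket("TARGET-UDID", 7 * 24 * 3600, t0)       # one-week window
assert boot_accepts(ticket, "TARGET-UDID", t0 + 60)
assert not boot_accepts(ticket, "OTHER-UDID", t0 + 60)        # device lock
assert not boot_accepts(ticket, "TARGET-UDID", t0 + 8 * 24 * 3600)  # expired
```

Under this model, even a leaked signed image is useless on any other device or after the window, which is the point the edit is making; the open question in the comment, whether the phone's clock can be trusted while locked, applies to the `now` parameter here.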

~~~
danielsju6
Once the software exists, given Apple's specialized knowledge, what's to stop
the government from issuing an NSL for Apple's signing keys plus access to the
remote update mechanism? Then the FISC rubber-stamps warrants en masse and
Apple devices are used to spy on tens of millions of Americans. It sets a
horrible precedent that the government can redirect the development resources
of a private company.

How many millions of dollars in engineering salaries will this take? Project
management? They're demanding that an entire blackhat division of Apple be
spun up, with the goal of circumventing other teams' security efforts.

~~~
ikeboy
None of that has to do with Apple creating the software, but rather with their
handing over the signing keys etc.

The FBI or NSA could probably create their own software if they had the keys
and source code.

I doubt the cost to Apple is over 100 man hours. But if it's too difficult,
they can argue that and simply hand over the source code.

~~~
danielsju6
It would likely take more than 100 man-hours just to provision the code
signing; Apple probably has all of that on controlled hardware that requires
multiple lead-engineer-level crypto tokens across multiple divisions.

This isn't some Node.js box that a junior developer will monkey patch in
prod—this is serious cryptography @ a large company w/processes.

"Simply hand over the source code"? Ohhh, so you're arguing a private company
should hand over all of its private property (its source code) to a public
institution because... terrorists?! Then Samsung starts hiring more FBI agents
for some reason... or the source to the iPhone magically shows up online.
Also, how many hundreds of thousands of hours of engineering time will it take
to sanitize that codebase to make sure it's suitable for public dissemination?

~~~
ikeboy
You think it takes 100 hours every time Apple releases a new iOS version just
to sign it?

They used to do something similar with unencrypted devices, according to
https://blog.trailofbits.com/2016/02/17/apple-can-comply-with-the-fbi-court-order/

>Ohhh, so you're arguing a private company should hand over all of its
private property (its source code) to a public institution because...
terrorists?!

I'm not arguing that. I'm arguing that if Apple claims it's too hard to
comply, there's a much easier method. It's to Apple's _benefit_ that they're
given the option to make the software and sign it themselves.

This isn't my original argument; I took it from
http://bloombergview.com/articles/2016-02-17/the-apple-fight-isn-t-about-encryption

~~~
danielsju6
Yep, that's what I'm arguing. The SHSH blob servers probably aren't trivial,
and I'm certain they have a lot of process in place to keep someone from
"accidentally" releasing a software update, or to keep the release of iOS 9.3
from taking them all down.

You'd be talking about creating both a special version of the OS and updating
the SHSH servers to accept that code signature.

What are they gonna do, just hardcode the device's UDID in a subroutine and
distribute it to the entire cluster? What about testing? If they do it wrong
and the SHSH blob/update gets into a bad state, they could end up accidentally
wiping the phone... so now QE needs to get involved and build test cases.

I'd just spitball a 20-person team with at least nine months, lots of meetings
with leadership/PM/VPs, hardware being purchased, and several executives
having to get involved.

Hell, it takes more than 100 engineering hours at my company just to update
some chef cookbooks; let alone schedule a release and get sign off.

I'm guessing you haven't worked at a large company before?

----

They probably did something very different from custom firmware before.

~~~
ikeboy
It's trivial to redirect a network request to your own server instead of
Apple's. That's what TinyUmbrella did while it still worked. So they wouldn't
need to change those clusters, just set up a single machine signing only that
device (which is easy enough that TinyUmbrella did it without Apple's help)
and give it access to Apple's signing key.

>If they do it wrong and the SHSH blob/update gets in a bad state, they could
end up accidentally wiping the phone

How many hours does it take to test on a spare phone? 5?

>I'm guessing you haven't worked at a large company before?

No. But this isn't something that's being rolled out to millions of users. The
team doesn't need to do anything that affects other users. They can have their
own SHSH server offline, and sign everything offline.

------
zaroth
The problem is _not_ the ability to search the device after it has been
cracked; the problem is specifically whether the government has the power to
force companies to develop backdoors for their own devices.

In this case the cracking capability is for a locked phone in FBI possession.
Let's assume the same technique isn't possible on newer phones. So what about
the next case where the FBI wants remote access over LTE while the phone is
unlocked / in use by the suspect?

If you can use All Writs Act to compel Apple to develop the first backdoor,
then surely the same is true for the second.

------
chrischen
This is also a great product move by Apple, as a win shows that even the FBI
can't access your locked phone.

~~~
onion2k
Perhaps, but that would be a strong incentive for government to mandate that
devices _have_ to have a backdoor in the future, or that they can't use strong
encryption, so devices can be read without a backdoor. I doubt the FBI sees
losing this particular battle as the end of the war.

There's also another, albeit less likely, possibility: the FBI knows they'll
lose but doesn't care, because they can already read the device and they want
any terrorists to believe they don't have the information on it. That's a
pretty full-on "tinfoil hat" theory though.

~~~
ubernostrum
Mandated backdoors are something that law enforcement wants, but something
that the national-security complex probably doesn't want (they seem to feel
they can get what they want without help, and don't want to make US
personnel/corps easier targets for their overseas counterparts). And the
national-security complex is likely to win that fight.

------
marincounty
Does anyone feel this whole incident is carefully staged by the FBI and
Apple?

I'm not into conspiracy theories, but I'm wondering about this one.

Why would the FBI, or Apple, make this so public? The Apple letter seemed
staged.

The federal government always seems to get what it wants in the end,
especially if one has a lot to lose.

I imagine the conversation starts off with an indignant, appalled CEO.

"Hell No--I'm not giving you access to my customers data!"

Federal government counters with, "Do you want us to scrutinize your past, and
present life?"

"Do you want us to look at every stock trade you ever made?"

"Do you want us to publicize the personal information we have on you already?"

"You know we can make your life miserable? You know we can make your
company's life miserable?"

No, this isn't Russia, but our law enforcement branch of the federal
government scares me, and I'm a nobody. There have been some deaths,
especially in tech, that seem suspicious. The drug overdoses--the guy in San
Francisco who was about to give a talk on ATM hacking comes to mind.

That tech guy who died in that fiery car crash in Los Angeles.

(I don't want to argue with anyone. I have no evidence. Just a weird feeling.
And yes, Tim Cook seems like a choir boy. He comes across as someone who
doesn't even jaywalk.)

------
rrggrr
Let iPhone users opt in and you'll find far fewer proponents of backdoored
encryption than will be reflected in Congress. I wouldn't be surprised if most
in the LE and intel communities didn't opt in. Why? Perhaps because the USGOV
has yet to prove it can keep its own data secure. The Office of Personnel
Management hack, Clinton official emails on a private server, and many more
instances have shaken faith in USGOV's ability to be an effective steward.
I'm in favor of master keys for the FBI after they prove to the public, in a
transparent and accountable manner, that those keys can be kept unassailably
secure from misappropriation or abuse.

------
cmurf
What if Apple's coerced firmware update bricks the phone? That's destruction
of evidence they'd need immunity from. What trust and conflicts arise when a
company has blanket immunity from such evidence being destroyed?

So many slippery slopes.

------
mrb
Off topic, but: _" Apple is doing the right thing in challenging the federal
court ruling requiring that it comply"_

I am not a native English speaker. Why "it comply" and not "it complies"?

~~~
chillingeffect
Subjunctive mood. Verbs like "require" take the subjunctive, so the verb
stays in its base form; writing "it complies" would be confusing. It's highly
proper grammar, not used in everyday speech and not used all the time even in
writing, but since this is formal written prose, it's used here. It's one
thing the NYT gets right.

~~~
mrb
Thank you! We live to learn.

[https://en.m.wikipedia.org/wiki/English_subjunctive](https://en.m.wikipedia.org/wiki/English_subjunctive)

------
geggam
When two nuts with rifles trigger a "national security" problem, I think the
problem is with the nation, not the people.

What happens when an entire nation threatens us?

------
stillsut
Couple of thoughts:

\- Doesn't an iPhone become completely secure if you prevent it from passively
polling for OS updates? If the iPhone would only poll and install updates
after the user unlocks and allows it, there would be no way to change the
necessary software configuration without breaking the encryption. And the
encryption can't be broken, so if you could choose an OS-level default of
"don't accept or even check for any updates without user permission" you
really would have an unbreakable device. But if they ever did implement this,
it would be terrible for Apple's forced upgrades, and their platform would
fragment into many incompatible versions. I think everyone knows which option
Apple will choose between: CompletelySecurePhoneOS or
AbilityToForceUpgradesAndPatches.

\- Correct me if I'm wrong, but didn't Tim Cook initially state that cracking
this phone was impossible according to Apple's experts? And now it seems it's
quite a reasonable matter of Apple signing an OS update specific to this
device's unique ID; so, quite feasible. Was that a lie?

------
TheAppGuy
I've got a feeling we're being manipulated by someone with an agenda at play.

~~~
Crito
Are we ever not? It's a political topic, so manipulation goes with the
territory.

------
EGreg
All they need to do is compel companies (by hook or by crook) to install
backdoors in their algorithms or hardware.

https://www.eff.org/deeplinks/2014/01/after-nsa-backdoors-security-experts-leave-rsa-conference-they-can-trust

https://en.wikipedia.org/wiki/Dual_EC_DRBG

http://www.cnet.com/news/spy-fears-lead-nuke-lab-to-dump-gear-from-hp-unit-not-huawei/

Who is to say that other state actors haven't done the same to chips produced
by their companies? The truth is, the genie is out of the bottle.

A year and a half ago, I wrote a serious article on this:
[http://magarshak.com/blog/?p=169](http://magarshak.com/blog/?p=169)

------
InTheArena
The guy who wrote the two most important 4th Amendment opinions of the last
thirty years (both in favor of privacy rights, both by 5-4 votes), and who had
the longest record of cracking down on laws being stretched to cover uses
never imagined (which is, by definition, how the All Writs Act is being
abused) just died.

It's not good timing...

~~~
bratsche
For those of us not well-versed in these sorts of things, what were the two
cases?

~~~
InTheArena
Jones and Kyllo. Kyllo held that the government could not use infrared
scanners to observe the inside of a house without a warrant, while Jones held
that the government could not use GPS trackers on cars, even in public,
because it had to trespass to install the tracker. He also dissented from the
forced DNA testing of anyone charged with a crime, and from warrantless,
suspicionless traffic stops.

http://reason.com/archives/2016/02/16/antonin-scalia-was-a-great-jurist-for-cr

~~~
lern_too_spel
There is no explicit right to privacy in the Constitution. Some of Scalia's
rulings happen to favor privacy while others do not. For example, he also said
the Miranda ruling was judicial overreach, so in his view, you have rights
(privacy or otherwise) only if you know about them ahead of time. _Roe v.
Wade_ established precedent for an implicit-right-to-privacy interpretation of
the 14th Amendment. Scalia said that interpretation is wrong.

~~~
InTheArena
No, but the 4th Amendment (which I referenced above) is the usual proxy. Even
the most ardent of abortion backers will admit that Roe v. Wade is flawed by
every measure of judicial practice (which is why the entire line of thought
advanced by Roe v. Wade has not gotten any traction in any other ruling). Only
the fact that it's tied to such a hot-button issue has kept courts from
overturning it.

The 4th Amendment and the All Writs Act are being broadly interpreted by this
judge to give the government the power to effectively enforce a warrant
against the entire population of the United States that uses iPhones. Scalia
would have been all over that like white on rice.

------
tosseraccount
Apple indeed may have a right to refuse the administration. Does the
administration have the right to remove Apple Computer from the GSA government
purchasing schedule?

~~~
valleyer
Yeah, I'm sure they'd shed a tear over that.

------
GnarfGnarf
Why doesn't the FBI copy the contents of the terrorist Farook's iPhone to a
second iPhone (after all, it's only a hard drive)? Make ten attempts on the
second iPhone, brick it, then copy the contents again, try the next ten codes,
and so on until they hit the combination?

~~~
ozcorp
As I understand it, half of the key that decrypts the phone is stored in the
chip itself and cannot be read. When you enter the correct password on the
lock screen, it is combined with this hardware key, and the combined key is
then able to decrypt the phone. If you move the contents of the hard drive to
another iPhone, the hardware key on that phone will be different and thus
can't decrypt the drive. It's like two-factor authentication on your Gmail
account where you're being sent the wrong codes.

Check out this article for more information. It also explains how the fact
that the phone is a 5C is significant to the encryption scheme. [0]
[http://blog.trailofbits.com/2016/02/17/apple-can-comply-
with...](http://blog.trailofbits.com/2016/02/17/apple-can-comply-with-the-fbi-
court-order/)
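A rough sketch of why this works, assuming (as the article describes) that the key is derived from both the passcode and a per-device hardware secret. The UID values and KDF parameters below are made up for illustration; the real derivation is Apple's and differs in detail:

```python
import hashlib

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    # Illustrative stand-in for Apple's key derivation: the passcode is
    # "tangled" with a secret fused into the phone's chip (the UID), so
    # the same passcode yields a different key on different hardware.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

uid_a = b"device-A-unique-id"  # hypothetical UID burned into phone A's chip
uid_b = b"device-B-unique-id"  # a different phone's UID

key_on_a = derive_key("1234", uid_a)
key_on_b = derive_key("1234", uid_b)

# Same passcode, different device => different key, so ciphertext copied
# off phone A's storage is useless when attached to phone B.
assert key_on_a != key_on_b
```

Because the UID never leaves the chip, cloning the storage (or the whole "hard drive") onto another phone copies only ciphertext that the second phone's key can't open.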

~~~
GnarfGnarf
OK, good explanation. What about removing the original hard drive from the
iPhone, and installing the copy? You could just keep plugging in copies until
you crack the code?

Could the chip also not be cloned?

------
LeicaLatte
Curious if Apple is breaking any user agreements if they end up doing this.
Sharing information on servers is one thing. Sharing a key which enables
access to all future communications is different. Does their EULA cover
such scenarios? Can it open them up to possible lawsuits from its users?

~~~
mc32
How? The phone's owner, the state agency, granted permission, and the actual
users are dead. Who'd have the basis to sue, and under what claims?

If you thought you had a claim I guess you could sue to block the order, but
that's up to the courts to decide, and that would take precedence over an EULA.

------
clumsysmurf
One thing I hope can be clarified: Is the FBI asking Apple to patch iOS on
this one device, one time only (in a way that can not be reused) ... or are
they asking Apple to provide a "reusable" patch / modification that allows
future devices to be accessed?

~~~
joe_the_user
"One thing I hope can be clarified: Is the FBI asking Apple to patch iOS on
this one device, one time only _(in a way that can not be reused)_ "

This must be weasel-word day. The FBI is asking for a patch, hypothetically
just for this phone. But only in this post have I seen anyone imagine "a way
that can not be reused", since the point raised by the parent article is that
such a patch could inherently be reused.

~~~
roywiggins
If all Apple produces is a signed update (with logic that activates it only if
it's the right IMEI) then it's not useful to anyone hacking any other phone.
The software can't be modified to work on them without Apple's cooperation,
since any modification breaks the signature and you don't have Apple's signing
keys.
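The argument above can be sketched with a toy signature check. HMAC here is just a symmetric stand-in for Apple's asymmetric code signature (in reality only Apple holds the private key, and the phone verifies with the public half); the key, firmware string, and IMEI are all invented:

```python
import hmac
import hashlib

SIGNING_KEY = b"apple-private-key-stand-in"  # hypothetical; the real key never leaves Apple

def sign(payload: bytes) -> bytes:
    # Stand-in for Apple signing a firmware image
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).digest()

def device_accepts(payload: bytes, signature: bytes) -> bool:
    # The phone only installs firmware whose signature verifies
    return hmac.compare_digest(sign(payload), signature)

firmware = b"unlock-tool;IMEI=358820052301234"  # update locked to one IMEI
sig = sign(firmware)

assert device_accepts(firmware, sig)  # the targeted phone installs it

# Retargeting the tool at another phone means editing the IMEI, which
# invalidates the signature -- and no one outside Apple can re-sign it.
tampered = firmware.replace(b"358820052301234", b"990000862471854")
assert not device_accepts(tampered, sig)
```

The design point is that the IMEI check lives inside the signed blob, so the check and the signature stand or fall together.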

------
mgleason_3
Given the power and size of the national security budget, and the NSA's in
particular, why do they need Apple? Shouldn't the NSA be able to crack this on
their own?

Kinda makes ya wonder what all that money's spent on...

~~~
moonshinefe
They can get the info they want with more sophisticated techniques from what
I've read. Also, Apple already gave them the iCloud backups. The legal
precedent of forcing companies to break their own security for the FBI's sake
is the prize at hand here it seems.

------
Shivetya
So here is a simple question. How will we know if the US becomes successful in
having manufacturers put in a backdoor? Can it be done to the current 6 models
through a software update?

------
j_m_b
Glad to see this from the editorial board of the NYTimes

------
jy2947
I may be wrong, but I have a feeling that technology companies like Apple and
Google are developing on-device software to make user data so protected that
they can say "we technically cannot crack our own software" even when ordered
by a judge (presumably for a legitimate reason), and thus the DOJ is using
this case to try to prevent that from happening. And if this is the case, then
personally I am on the DOJ's side, because I recognize this is a less than
ideal world (actually I think it is even worse), and this country is
technically at war.

------
Tepix
Has there been any precedent of (the|a) government forcing someone to sign a
piece of code against their will?

------
redindian75
Wonder why the FBI can't just hand the phone to Apple: Apple keeps the phone
and just hands back the data. We don't need the specifics.

~~~
rplst8
Do you really want that to be the model for American justice? Law enforcement
hands your digital electronic records over to a private company who has the
power to put anything on that phone? Do you want to entrust your personal
legal future to a private entity? I think I'd want a solid chain of custody on
anything that may prove my innocence.

~~~
tosseraccount
The phone was the property of San Bernardino County.

The county has longstanding email and Internet use policies that state, “NO
USER SHOULD HAVE AN EXPECTATION OF PRIVACY":
[http://ktla.com/2016/02/18/why-didnt-san-bernardino-
county-o...](http://ktla.com/2016/02/18/why-didnt-san-bernardino-county-
officials-have-access-to-the-terrorist-iphone/)

~~~
dclowd9901
I bet even I, not a lawyer, could reasonably defend that, no matter who "owns
it", a device as personal and as much an extension of the brain as a
smartphone could be considered private.

------
botw
I wonder why the head of the FBI didn't ask Tim Cook first in private. Or did
I miss something?

------
marcoperaza
TLDR for what follows: Mandated backdoors must be a red line, but this is not
a request for a backdoor and actually seems pretty reasonable. Trying to argue
that the tech industry shouldn't help, even in this case, is not only the
wrong position in my book, but a sure way to lose the bigger debate.

My views on the general encryption controversy are:

1\. Everyone must be free to make their technology as secure as they possibly
can. There can be no mandated weakening of security, back-doors, or other
requirements to make the information more easily accessible by law
enforcement. On newer iPhones, Apple has patched up the flaw that the FBI
wants their help with exploiting. They must continue to be allowed to do that.

2\. The government must be able to demand, with a court order predicated on
probable cause, that companies provide any and all information that they have
that could be useful in circumventing their security features. This can be
everything from technical specifications and threat-model analyses, to lists
of unpatched vulnerabilities and code-signing keys.

3\. It seems to me that American companies have a moral obligation that goes
beyond the legal obligations in point #2. They should be actively assisting
the government in recovering information, especially when concerning issues of
national security. In extreme circumstances, like total war, this should
definitely be legally mandated. I'm undecided as to what the policy should be
generally. On a practical level, it's probably not feasible for the government
to, e.g. start hacking around the iOS codebase themselves, so just information
might not be enough.

I'm not too troubled by this court order, especially given the particular
circumstances. The right to make products as secure as you can, even from
yourself and the government, is what's really important to defend. Trying to
argue that the tech industry shouldn't help, even in this case, is not only
the wrong position in my book, but a sure way to lose the bigger debate.

Apple's definition of "backdoor" is highly suspect. A backdoor is if I ship my
product with an intentional vulnerability, so that I can hack into it later.
Apple's not being forced to add a backdoor; it already exists, because the
security features break down against an adversary that has Apple's private
key, at least for the default 4-digit PIN configuration. Now the government is
asking them to use their own capabilities to help hack this phone. Of course,
Apple didn't create this backdoor for malicious reasons; they just didn't
include themselves in the threat model, which greatly simplifies updates and
other security features and is central to the walled garden that is iOS.
Curiously, this is in direct contradiction to their claim for some time now
that they were designing iPhones such that they themselves can't break into
them.
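The 4-digit PIN point deserves emphasis: once the retry limit, erase-after-ten-failures behavior, and inter-attempt delays are patched out (which is what the order asks for), the whole keyspace is only 10,000 candidates. A toy sketch, with an invented per-device secret and a deliberately cheap KDF standing in for Apple's real derivation:

```python
import hashlib

UID = b"per-device-secret"  # hypothetical stand-in for the fused hardware UID

def derive(pin: str) -> bytes:
    # Cheap illustrative KDF; the real one is slower, but 10,000 tries
    # is still trivial once software rate-limiting is removed.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), UID, 100)

target = derive("7391")  # the key the phone would accept

# Exhaust the entire 4-digit space: 0000 through 9999.
found = next(p for p in (f"{i:04d}" for i in range(10_000)) if derive(p) == target)
assert found == "7391"
```

This is why the software protections, not the PIN itself, carry the security load in the default configuration, and why a signed update that removes them is such a potent tool.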

Now put yourself in a Congressman's shoes. The FBI has been telling you for
years that tech companies are being purposefully antagonistic to their
legitimate search and seizure authority. That the tech companies are
purposefully designing features with the sole intention of shutting the
government out. Now here's a case where there was no mandated backdoor, the
government was able to devise an exploit method, and they got a court order
from a judge to make Apple use it on a dead terrorist's phone. "Mandatory
backdoors would hurt everyone's security", one of the arguments that _we've
been winning with_, now sounds like a bullshit cover for "we are against any
government surveillance". Can you smell the legislation coming yet?

Disclaimer: These are obviously my own personal views and nothing else. They
do not necessarily reflect the opinions, policies, or practices of anyone but
myself.

(Reposted from
[https://news.ycombinator.com/item?id=11131456](https://news.ycombinator.com/item?id=11131456)
with additions.)

~~~
marak830
I am only going to reply to 3.

Why would any company have a moral obligation? In fact, what if they are
saying this is their moral obligation? To not bypass the security they told
customers they had put in place?

I think that by using the term "moral obligation", people are trying to negate
the need for laws or rulings.

In my opinion: There are no moral obligations for companies, just laws they
have to follow.

After it is made law, it is the company's choice to do business there or not.

------
buzzdenver
I wouldn't be surprised at all if Apple already gave a version of the OS to
the FBI that enabled them to brute force the password for that one phone with
the condition that they publicly put up this show as if they were not
cooperating.

~~~
brians
That would involve lying to a court. Intelligence folks might do that. Law
enforcement? Not a chance; it would be career-ending even to discuss it.

~~~
logfromblammo
If you truly believe that, I think perhaps you place too much trust in the
wrong sort of people.

The reason why I consider the hypothesis unlikely is not because those
involved are unwilling to deceive, but because they are unlikely to have
sufficient skill to make the attempt and avoid detection. I believe that,
considering the number of people that would be necessary, at least one Apple
insider with ethical concerns would leak evidence that would expose the true
story.

~~~
buzzdenver
I believe that much more than "law enforcement never lies", but IMO it's hard
to leak evidence in this case. A whistle-blower could say that he worked on
this project, but there isn't the hard evidence of documents that for example
Snowden could leak. The FBI or Apple could just deny it or say no comment.

