
Justice Department Wants Data from About 12 Other iPhones - rosser
http://www.wsj.com/article_email/justice-department-seeks-to-force-apple-to-extract-data-from-about-12-other-iphones-1456202213-lMyQjAxMTI2MjIzMzMyMTMwWj
======
froo
_" the FBI said they are not seeking to set a precedent in the case, but to
get the company to help them open a single phone that may hold crucial
evidence to help explain the most deadly terrorist attack on U.S. soil since
Sept. 11, 2001."_

What qualifies this as a terrorist attack? Is it because the colour of the
perpetrator's skin wasn't white? Sandy Hook had double the number of resulting
deaths and so was technically more deadly.

Virginia Tech was carried out by a South Korean-born man, with even more
deaths than Sandy Hook.

Poor reporting, WSJ.

~~~
rayiner
> FBI investigators have said that Farook and Malik had become radicalized
> over several years prior to the attack, consuming "poison on the internet"
> and expressing a commitment to jihadism and martyrdom in private messages to
> each other. Farook and Malik had traveled to Saudi Arabia in the years
> before the attack. The couple had amassed a large stockpile of weapons,
> ammunition, and bomb-making equipment in their home.

It's pretty offensive to focus on the shooters' skin color, instead of what
they were: violent Islamic fundamentalists.

~~~
froo
We've seen violent Christian fundamentalists recently with the Colorado
Planned Parenthood shootings.

While many around the world consider that a terrorist act, it has not been
labeled as such by US officials (the guy is only being charged with murder,
not additional offenses).

I find the double standards to be what is truly offensive.

~~~
rayiner
The Colorado Planned Parenthood shooters didn't pledge support to a worldwide
movement of Christians trying to overthrow the western world order.

~~~
DanBC
The one still alive describes himself as a "warrior for the babies". Also,
from press coverage of the case:

> Mr. Dear described as 'heroes' members of the Army of God, a loosely
> organized group of anti-abortion extremists that has claimed responsibility
> for a number of killings and bombings.

He said the attacks were politically motivated.

[https://en.wikipedia.org/wiki/Colorado_Springs_Planned_Paren...](https://en.wikipedia.org/wiki/Colorado_Springs_Planned_Parenthood_shooting)

------
rurban
On a side note: the FBI didn't/couldn't even properly investigate 9/11, so why
do they still dare to use this wording? ( _" to help them open a single phone
that may hold crucial evidence to help explain the most deadly terrorist
attack on U.S. soil since Sept. 11, 2001"_) What came out of PENTTBOM? One
single court case against Moussaoui, who was not directly involved in 9/11 at
all, but that's it. Hundreds of arrests which led to nothing but illegitimate
long-term detentions under military law, with no due process.

If Apple can help them to extract data from those phones, fine. But Apple
apparently built secure phones without bypass, so they are out of luck and it
makes no sense to come up with fantasy warrants without any technical
solution.

------
joekrill
> the encryption of personal devices has become a serious problem for criminal
> investigators in a variety of cases and settings

And so has the second amendment -- but that doesn't mean we should get rid of
it. Yes, their job is hard. That's the nature of the job. Just because
something makes a job more difficult doesn't mean it's a bad thing. I don't
get why they insist on making this argument.

~~~
sieveoferos
I agree with the general sentiment of "don't get rid of things _only_ because
they make certain jobs more difficult" - but I don't think the FBI is arguing
for getting rid of encryption. I think they want to be able to break security
surrounding encryption sometimes and under certain circumstances.

~~~
tobylane
They amount to the same thing, depending on how it's done and who gets it. If
any law-enforcement agency that tries hard (the FBI, MI5, China) to fight what
it wants to fight gets hold of Apple's tool, then that encryption is not so
much outlawed as pointless to use.

It's because the fervent fight against something (communism, black civil
rights protesters, anti-capitalists, future civil rights upholders) leads to
questionable use of any tool that we can't give them too many tools. In the
UK, the mayor of London bought 3 water-cannon trucks but has been forbidden
to use them - where will they be in ten years' time?

------
nefitty
I think the fear is that those 12 will lead to the other 700 million iPhones.

~~~
marcoperaza
That's not a justified fear. Requiring Apple to backdoor all phones is not at
all similar to requiring Apple to help hack particular phones that it has the
capability to hack, in response to court orders.

~~~
nefitty
You're telling me you would trust that software to remain in the hands of
trusted actors? In 2015 alone, the IRS was breached, as were LastPass, the
director of the CIA, Hacking Team, even Kaspersky Labs! There can be no
absolute guarantee that this backdoor would remain safe indefinitely. That is
just the most blatant problem, not to mention the overt displays of cynicism
and misuse of authority by the NSA as revealed by the Snowden leaks. Consider
the political climate in the US at the moment. Now imagine a truly evil actor
came into power and was handed control of organizations with unheard-of
amounts of surveillance power. I don't mean to seem paranoid, but in this case
the feeling is completely warranted.

~~~
marcoperaza
So would it be okay for the FBI to have to bring the phone to Apple?

~~~
danieldk
Don't you see the slippery slope here? Next, the German or French police will
knock on Apple's door. When they get access, China, Russia, and others will
line up next.

Moreover, who says that the FBI or some other agency will stop after this
iPhone, or the next 12 iPhones. Why not push to get their own signing key
after they succeed in this case? They will try to get as far as possible.

We live in 2016; many of our devices holding our private data are directly
addressable from anywhere in the world. Intentionally weakening encryption and
security in any way is a dangerous proposition.

It's good that Apple fights this tooth and nail. Sure, it may align with
their PR. I don't care; it benefits every citizen of the net who wants privacy
and security.

~~~
amatix
Worse: if this becomes "normal", then rather than the CTO, CSO, or VP of iOS
Engineering having to unlock the code-signing keys to sign releases
(presumably via N of M), it'll _have to be_ automated, so that "oh, the FBI
needs another custom build for the 4th time this week, just do the build and
click here to sign it". At that point the security of the master code-signing
keys has evaporated to nothing and we're all sunk.

~~~
rhizome
I think the precedent that emerges could be that law enforcement can force
companies to sign specific functionality. This access-to-data power would be
in line with their access to telephone and telegraph copper in centuries past.

------
avn2109
Question for HN in general: Is it possible in principle (for Apple or someone
else) to construct a smartphone that can accept software/firmware updates, but
that Apple cannot push malware to at some later time?

E.g. can we implement all security functionality in hardware/burn it into the
silicon? Or accomplish the same ends by some other means?

Intuition says "no," because "security functionality" is sort of nebulous. But
it would be great if a device could be constructed in such a way that all such
future demands for collusion by hostile actors such as governments could be
rendered preemptively impossible.

~~~
Someone
_" can we implement all security functionality in hardware/burn it into the
silicon? Or accomplish the same ends by some other means?"_

Yes. The software could be burnt into PROM (which is unchangeable), or one
could even create a custom ROM chip, and if necessary include hardware or code
that checksums the ROM.

However, a company doing that must be willing to run the risk that there is a
bug in that unchangeable software/hardware, and then either tell their
customers that they are screwed or give them a free replacement phone. It also
may lengthen development cycles, as you can no longer, at the last minute,
order your factory to open a million boxes and update that part of the
firmware.
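
The checksum idea is simple to sketch. A minimal illustration in Python
(purely illustrative: in a real device this logic would live in a boot ROM,
and the expected digest would be fixed in silicon at manufacturing time):

```python
import hashlib

# Stand-in for the ROM image; in practice this is the firmware binary.
FIRMWARE = b"example firmware image"

# In a real device this digest would be burned in at manufacturing time.
EXPECTED_DIGEST = hashlib.sha256(FIRMWARE).hexdigest()

def firmware_is_intact(image: bytes, expected_digest: str) -> bool:
    """Recompute the image's SHA-256 and compare it to the burned-in value."""
    return hashlib.sha256(image).hexdigest() == expected_digest
```

A device would run this check at boot and refuse to start if the digest does
not match, so any tampering with the stored firmware is detected.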

Alternatively, a fully open phone would allow customers to inspect updates and
reject them or perhaps even to partially reject them (partial rejection would
prevent the case where users want a feature, but only can get it by accepting
weaker security). That requires a 100% open phone (hard- and software) and
enough knowledgeable people willing to invest time in looking at the code.

~~~
chadzawistowski
> That requires a 100% open phone (hard- and software)

I am eagerly awaiting the [http://neo900.org/](http://neo900.org/)

Unfortunately the baseband modem is still unfree, but at least it's isolated
behind the USB bus, versus having direct memory access as in many phones.
Sadly, no phone has a free, legal modem.

Anyways, I'm more excited about the prospect of the phone itself being
completely free. In their own words:

> Not a single line of closed code will have to run on the main CPU to be able
> to use the Neo900. Using free telephony stacks like FSO or one from QtMoko,
> FLOSS Linux drivers will be available for every single component. In order
> to get 3D acceleration working, which is not necessary to operate the
> device, closed drivers would be needed.

~~~
SturgeonsLaw
It warms my heart to see the N900 getting a revival. While not the sleekest
phone of its time, that device was tremendously underrated as a portable
computer.

------
headgasket
There are quite a few more phones than that involved in probably about 9
innocent deaths PER DAY from texting while driving.
[http://www.huffingtonpost.com/2015/06/08/dangers-of-texting-...](http://www.huffingtonpost.com/2015/06/08/dangers-of-texting-and-driving-statistics_n_7537710.html)

Does the justice dept want those unlocked too?

~~~
rhizome
Comey said the FBI wants the ability to do this even for car accidents, so a
qualified "yes."

------
werdum
I for one am happy for this issue to be at the forefront of discussion.

The worst thing that could happen is all of these requests going unspoken,
buried beneath less important topics.

------
abpavel
Sadly this is a tough sell, given that the general public perceives
"encryption" as "password", unaware of the underlying technology and its
implications. It's doubly sad that it was a government employee, in a
government-controlled environment, using a government-managed device, and even
when the government had access they went and changed the password, locking
themselves out.

Critical thinking would lead one to question the need for any of this data at
all, given the thoroughly demonstrated incompetence. Yet the public's filter
stops at the perception that Apple once cooperated but now chooses not to.

------
rebootthesystem
I had a really bad experience with an iPhone update and a password storage
app.

I had been running my iPhone on iOS 8.x. No need to update to 9.x.

The day finally came when I was forced to allow the update.

Now, without my knowledge the update also enabled automatic updates of apps.
All apps were thus updated to their latest versions without my explicit
consent.

I chose the app I am using to keep hundreds of account passwords specifically
because they DID NOT transmit anything over the internet at the time I got it.
I could do what they called "wifi sync" to synchronize and backup my database
to the desktop version of the same software running on my PC within the same
network.

Well, with the forced update "wifi sync" went away and now the only option is
"internet sync". I did not realize this when the app ran through and
synchronized to my PC.

So now the dilemma. This fucking company is doing this because they want to
sell cloud storage for your data and force you into an annual subscription in
order to be able to "internet sync". And, of course, there's the huge
violation of the security of my data which, up until the unauthorized
automatic update, had been kept private and had never left my network.

Not only do I have to find a new password-and-data vault that will not try to
take ownership of my data and pull a bait-and-switch afterwards, I also need
to change every single password I have, since my database is now in their
cloud.

Unbelievable.

~~~
Sephr
Instead of relying on proprietary software that is out of your control for
storing your passwords (and getting burned like you have described), why not
use something open source like KeePass? I don't have an iPhone, but on Android
there is KeePass2Android, which is open source and (if you choose to use it as
such) offline-only.

~~~
rebootthesystem
That's the migration plan when I get the time to move the data over.

------
coldnebo
I'm not sure I understand this. Apple seems to be limiting all discussion to
in-situ mechanisms of cracking, but no one is talking about external means.
For example, ICE-level debuggers require sophisticated hardware that is not
easily available to everyone. Likewise, the Xbox encryption was very difficult
to crack without GHz hardware.

Just put the critical path in the PROM and then bypass the PROM with your own
hardware-level circuit. The device itself can be keyed so that only an Apple
hardware bypass is allowed to connect in this way. Now you have a physical
bypass that is difficult if not impossible to get around, but enables
warranted access by agencies that own the limited hardware. This also has the
advantage of human cost: you can't easily apply this method to millions of
phones without a huge cost in time and effort. Even if the device is stolen,
it limits exposure to phones in the physical possession of the hardware
bypass, which is surely better than compromising millions of phones. And so
what if the critical-path patch exists out in the open? Knock yourself out
and make an emulator that will unlock hw-emulated phones (a difficult task;
not even the iOS emulator is a true hw emulator), but it won't work on the
actual hardware unless the PROM is swapped, which is hardly trivial.
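
The "keyed so that only Apple hardware can connect" idea is essentially
challenge-response authentication. A hypothetical sketch of that handshake
(the names and protocol are invented for illustration, not any real Apple
design):

```python
import hashlib
import hmac
import os

# Shared secret burned into both the phone and the authorized bypass rig.
DEVICE_KEY = os.urandom(32)

def phone_issue_challenge() -> bytes:
    """The phone sends a fresh random nonce to whatever is connecting."""
    return os.urandom(16)

def rig_answer(key: bytes, challenge: bytes) -> bytes:
    """The bypass rig proves possession of the key by MACing the challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def phone_accepts(key: bytes, challenge: bytes, response: bytes) -> bool:
    """The phone verifies the response in constant time before enabling the bypass."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

A rig holding the right key passes; one with any other key fails, so physical
possession of the device alone is not enough to engage the bypass.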

The key-signing argument carries little weight, by the way. DVD manufacturers
took the same stance, and the root key was leaked to the public. How can Apple
guarantee the same won't happen with their keys?

It seems both Apple and the FBI are withholding something, but on face value
the technical requirements should allow warranted access. The fact that they
don't is a flaw in the technology design.

Case law surely has precedents in this area? Can safe manufacturers be
required to make bypass mechanisms for bank vaults? What about non-criminal
property law? Say a family member dies and the legal estate needs access?

------
malandrew
This may be a totally absurd question, but is there a way to comply such that
the government explicitly acknowledges that the case cannot be used as
precedent?

i.e. comply with the order, but the case legally can't ever be referred to
again as precedent in any other case.

------
tomberek
What about a different approach? A society does want to allow access to
information for investigative purposes while protecting security. So what
about allowing court orders for companies to bypass security in their
products, with the caveat that the access method is made publicly
reproducible?

This forces the company to fix the vulnerability and forces the government to
carefully consider which cases are important enough.

Just brainstorming... not a solid proposal. Shoot some holes in it, please.

------
jbverschoor
What happened to "just one phone"?

------
timr
...and it's an entirely reasonable position to say that they should get it, if
they have a warrant. Especially if the case is as cut-and-dried as the San
Bernardino one.

The public debate on this has reached truly sad, nigh-Trumpian levels of
hysteria and uninformed commentary. There is no "back-door" here. Encryption
is not being compromised. This has very little to do with encryption at all,
really: if the criminals in question had used a strong password instead of a
four-digit PIN, Apple could just shrug, say "not possible in our lifetimes",
and that would be the end of it. But these criminals used easily
brute-forceable PIN codes, and the investigators want to brute-force them.
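
The gap between a four-digit PIN and a strong password is easy to quantify.
Back-of-the-envelope arithmetic, assuming an illustrative 80 ms of
key-derivation delay per guess (the per-attempt cost is an assumption here,
not a measured figure):

```python
ATTEMPT_SECONDS = 0.08  # assumed per-guess key-derivation delay

pin_space = 10 ** 4        # 4-digit PIN: 10,000 combinations
password_space = 62 ** 10  # 10-char alphanumeric password

pin_worst_seconds = pin_space * ATTEMPT_SECONDS
password_worst_years = password_space * ATTEMPT_SECONDS / (3600 * 24 * 365)

print(f"4-digit PIN, worst case: {pin_worst_seconds / 60:.1f} minutes")
print(f"10-char password, worst case: {password_worst_years:.1e} years")
```

The PIN falls in minutes; the password would take billions of years, which is
exactly why Apple could "just shrug" in the strong-password case.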

This situation is about a legal fight of very narrow parameters: should it be
possible for the government to compel a company to help extract its customers'
"secure" data, via this specific, very old law. Reasonable people can disagree
on this point.

Unfortunately, the public debate has gone _completely round the bend_ , with
famous people grandstanding on _totally irrelevant_ things (like "encryption
back doors"), which have no bearing on anything at all. Moreover, as it turns
out, Apple has been _doing this for years_ for police investigations, and the
empire has not yet fallen. If you're worried about the slippery slope,
well...we're already well downhill, and our bottoms are wet. Perspective.

I realize that it's not popular amongst the tinfoil-hat set that has set up
residence here, but I think that there are times when we _want_ our government
to be able to do things like break into a suspect's phone. There should be
safeguards (like warrants), of course, but it's a perfectly reasonable
position to say that privacy is not absolute.

~~~
threatofrain
I agree with you that people capitalize on events to fit their political
narratives, and I also agree that privacy is not an absolute value and that
tension between values calls for tradeoffs to be made.

But I do believe that calling this a backdoor is proper framing. Apple
allowed weak passwords, knowing that convenience often beats security, but
Apple also provided a mechanism by which one can have a weak password and
still have strong security, via a max-attempt mechanism. This is a
circumvention of security features, and "backdoors" are about security, not
encryption (which is merely a subset of security).
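
The max-attempt idea amounts to a counter guarding a weak secret. A
hypothetical sketch (the class name and limit are invented for illustration;
this is not Apple's actual implementation):

```python
class PinVault:
    """A weak 4-digit PIN made safe(r) by refusing guesses after a limit."""

    MAX_ATTEMPTS = 10

    def __init__(self, pin: str) -> None:
        self._pin = pin
        self._failures = 0
        self._wiped = False

    def unlock(self, guess: str) -> bool:
        if self._wiped:
            raise RuntimeError("device wiped: attempt limit exceeded")
        if guess == self._pin:
            self._failures = 0  # correct guess resets the counter
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._wiped = True  # or escalate delays instead of wiping
        return False
```

With only 10 guesses allowed, the 10,000-combination PIN space cannot be
brute-forced; the court order at issue asks Apple to disable exactly this
kind of limit.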

I also think the legal fight is not circumscribed around narrow parameters
with predictably narrow legal outcomes. The FBI cites a law from 1789 that
says that the court may issue "...all writs necessary or appropriate in aid of
their respective jurisdictions and agreeable to the usages and principles of
law". It does not sound easy to predict what case law shall determine to be
"necessary or appropriate" 5-10 years from now.

~~~
timr
_" Apple also provided a mechanism by which one can have weak passwords and
still have strong security via a max-attempt mechanism. It is a circumvention
of security features, and "backdoors" are about security, not encryption
(which is merely a subset of security)."_

I don't deny any of that, but there's still a bright-line distinction between
"circumventing security features" for a single, badly protected phone, given a
warrant, and weakening security across-the-board for everyone. This is a case
of the former, not the latter.

_" I also think the legal fight is not circumscribed around narrow parameters
with predictably narrow legal outcomes."_

The legal fight is, factually, centered on the question I stated. It doesn't
involve any of the other technical stuff that's being tossed around this
debate. That was my point. But like I said: I think it's a legitimate
question, so I'm not sure who you're arguing with right now?

