
The Feds Can Now Probably Unlock Every iPhone Model - dsr12
https://www.forbes.com/sites/thomasbrewster/2018/02/26/government-can-access-any-apple-iphone-cellebrite/#50ba3a5d667a
======
eridius
Bruce Schneier says¹:

> _There's also a credible rumor that Cellebrite's mechanisms only defeat the
> mechanism that limits the number of password attempts. It does not allow
> engineers to move the encrypted data off the phone and run an offline
> password cracker. If this is true, then strong passwords are still secure._

¹[https://www.schneier.com/blog/archives/2018/02/cellebrite_un...](https://www.schneier.com/blog/archives/2018/02/cellebrite_unlo.html)
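Back-of-the-envelope arithmetic illustrates Schneier's point: if the tool only removes the retry limit, every guess still has to run on-device through the Secure Enclave's key derivation (roughly 80 ms per attempt, per Apple's security documentation, an assumption here), so the keyspace dominates. A rough sketch:

```python
# Worst-case on-device brute-force times, assuming ~80 ms per attempt
# (the per-try key-derivation cost Apple's security guide cites).
SECONDS_PER_GUESS = 0.08

def worst_case_years(keyspace):
    """Years to exhaust a keyspace at the assumed guess rate."""
    return keyspace * SECONDS_PER_GUESS / (3600 * 24 * 365)

pin6 = 10 ** 6       # 6-digit numeric PIN
alnum10 = 62 ** 10   # 10-char mixed-case alphanumeric passphrase

print(f"6-digit PIN:        {worst_case_years(pin6) * 365:.1f} days")
print(f"10-char passphrase: {worst_case_years(alnum10):.2e} years")
```

A 6-digit PIN falls in under a day even at that throttled rate, while a 10-character alphanumeric passphrase stays out of reach for billions of years, which is why strong passwords would remain secure even with the attempt limit defeated.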

~~~
IBM
> The story I hear is that Cellebrite hires ex-Apple engineers and moves them
> to countries where Apple can't prosecute them under the DMCA or its
> equivalents.

Crazy if true.

Doesn't this also create a weird incentive problem where the FBI (or any other
law enforcement agency) who would normally be tasked with helping Apple with
this doesn't actually want to?

~~~
guelo
So Israel is like a high tech Guantanamo where our government goes when those
pesky laws get in the way.

~~~
rayiner
No? The law doesn’t prevent the government from searching your property in a
wide range of circumstances: e.g. with a warrant, pursuant to a valid arrest,
etc. That’s the whole idea of warrants: so there is a controlled way to search
private property. The government goes to Israel to defeat technological
roadblocks to doing what it’s allowed to do under the law. This technology
isn’t being used to break into phones at surprise checkpoints, it’s being used
to search phones of people who have been arrested.

~~~
pessimizer
You left out why that's different from Guantanamo, and instead just defended
it generally.

~~~
rayiner
Presumably guelo thinks Guantanamo permits the US to do things that would be
illegal here. But breaking an iPhone pursuant to a warrant wouldn’t be. Having
a warrant (or a suspect in custody) permits the government, by design, to do
lots of things that would otherwise be illegal. The government doesn’t need to
ship safes to Israel to avoid violating safe cracking laws when searching
pursuant to a warrant.

~~~
cryoshon
>implying the government asks for a warrant for many of their invasive
activities

they don't play by their own set of rules.

~~~
rayiner
Ah yes, the old “government does X, so I’m going to speculate it also does Y,
and it’s up to you to prove otherwise” trick. Arguing on the Internet is so
much fun when we get to just make things up.

~~~
veidr
Come on now, there are _literally thousands and thousands of thoroughly
documented cases of law enforcement and government agencies violating the
law_.

The Israel == Guantanamo thing doesn't exactly make sense to me either, but
now you're arguing nonsense. Certainly, _we all know_ that the government,
including law enforcement, doesn't always follow the law.

That's almost the entire argument for putting any limits on governmental power
at all.

(That's not to say we should restrict them from doing this, though; if they
can crack a phone, good for them, I suppose. But it's another thing to be
concerned about.)

~~~
rayiner
There are thousands of law enforcement agencies in the U.S., handling tens of
thousands if not hundreds of thousands of cases each year. If they break the
law with respect to a small percentage of those cases, you'll end up with
thousands of examples. But with respect to any given thing, statistically, the
government is probably _not_ breaking the law.

Here, the "Israel == Guantanamo" thing doesn't make sense if you assume that
the government is using the Israeli hacks to break iPhones it has in custody
because of a warrant or arrest. You can speculate that the government is
stealing peoples' iPhones and breaking into them without a warrant, but it's
an actual logical fallacy to point to different things the government is doing
to argue that the government is doing this thing too.

------
julianj
How is chain of custody maintained if the process is a secret? Couldn't a
person argue that the data obtained was planted?

~~~
wmf
They'll probably drop the case if the defense points this out (seriously, look
at Stingrays).

~~~
TylerE
Or they just do parallel construction.

They gather the evidence illegally, and then conjure up a legal means to
re-find the evidence they already know exists.

~~~
Chriky
What here is illegally gathered?

~~~
veidr
Nothing, if they have a warrant (or equivalent legal authorization).

But the reason most of us care about having our data encrypted is _not_
actually because we are committing heinous felonies, and want our phones to
hide the evidence from legitimate cops (though of course sometimes that’s the
case).

It’s because we don’t _trust_ the authorities to follow the law. If they can
crack your phone legally, they can also crack it _illegally_. (Say, after
seizing it within 100 miles of the border, which they can do any time they
please for whatever reason (including no real reason)).

So even though this ability isn’t necessarily illegal in and of itself, it’s
certainly of interest to those of us who are concerned about the threat
vectors that are presented by government forces that do engage in illegal
practices.

~~~
Chriky
It's my personal belief that this line of thinking, which is common among
"geeky" types but not among the general population, is a form of slight
delusion or power fantasy.

Is there anything you can provide to convince me it's remotely possible?

------
caymanjim
I'm too lazy to find a link for it, but last time I read about Cellebrite,
they were cloning the data and simply trying unlock codes in sequence until
one worked. They could restore the cloned data before each try, or possibly do
it on custom hardware or an emulator, and start with a fresh copy each time,
so they never triggered "erase after 10 failures". It's a pretty
straightforward approach, but it doesn't scale well. Works for targeted
cracking of high-value targets.
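The clone-and-retry approach described above can be sketched as a loop; `restore_image` and `try_passcode` are hypothetical stand-ins for whatever hardware or emulator interface such a tool would actually use:

```python
# Sketch of clone-and-retry passcode brute forcing: re-flash the cloned
# image before the "erase after 10 failures" counter can ever be hit.
# restore_image() and try_passcode() are hypothetical stand-ins for a
# real hardware/emulator interface.

def brute_force(image, passcodes, restore_image, try_passcode,
                max_failures=10):
    failures = 0
    for code in passcodes:
        if failures == max_failures - 1:
            restore_image(image)   # fresh copy resets the failure counter
            failures = 0
        if try_passcode(code):
            return code
        failures += 1
    return None                    # keyspace exhausted
```

As the comment notes, this is straightforward but slow, since every re-flash costs wall-clock time per batch of guesses.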

~~~
eugeniub
This is not true. You cannot just clone the data and run passcodes against it,
because the data is not encrypted by your passcode. Instead, each file on iOS
11 is encrypted with a different AES 256-bit key, and cracking even one
256-bit key through exhaustive search is thought to be out of reach of
humankind ([https://security.stackexchange.com/questions/6141/amount-
of-...](https://security.stackexchange.com/questions/6141/amount-of-simple-
operations-that-is-safely-out-of-reach-for-all-humanity/6149#6149)). The file
keys are wrapped by, among other things, the device's Unique ID, a 256-bit key
generated by the Secure Enclave, and accessible only to the Secure Enclave,
not any other hardware or software running on an iOS device.

In the end, the only options are: bruteforcing passcodes on the original
device while attempting to trick the device into allowing more than 10
failures, or prying open the Secure Enclave to obtain the Unique ID — both
options a lot more complicated than just cloning the data and trying passcodes
on it.
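A toy model of the key hierarchy described above (not Apple's actual construction; the XOR "wrap" and PBKDF2 parameters are illustrative stand-ins) shows why the cloned data alone is useless, as the wrapping key depends on a device-bound UID:

```python
import hashlib
import os
import secrets

# Toy model: each file key is wrapped under a key derived from BOTH the
# passcode and a device UID that never leaves the Secure Enclave.
DEVICE_UID = secrets.token_bytes(32)   # known only inside the enclave

def _kek(passcode, uid):
    # Illustrative KDF, not Apple's real key-derivation function.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, 100_000)

def wrap_key(file_key, passcode, uid):
    kek = _kek(passcode, uid)
    return bytes(a ^ b for a, b in zip(file_key, kek))  # toy "wrap"

def unwrap_key(wrapped, passcode, uid):
    kek = _kek(passcode, uid)
    return bytes(a ^ b for a, b in zip(wrapped, kek))

file_key = secrets.token_bytes(32)
wrapped = wrap_key(file_key, "1234", DEVICE_UID)

# On-device, the right passcode recovers the file key:
assert unwrap_key(wrapped, "1234", DEVICE_UID) == file_key
# Off-device, without the UID, a guess can't even be tested:
assert unwrap_key(wrapped, "1234", os.urandom(32)) != file_key
```

The second assertion is the crux: without the UID, an attacker with a perfect clone of the data cannot distinguish a correct passcode guess from a wrong one, so offline cracking never gets started.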

~~~
userbinator
_or prying open the Secure Enclave to obtain the Unique ID_

People have been cracking secure coprocessors of the type used in payment
cards, TPMs, and the like for a long time, dare I say even those which were
designed to a higher level of security than Apple's. The fact that there is an
entire phone attached to it doesn't make much of a difference, and the
technology behind this (FIB, microprobing, etc.) has been steadily dropping in
price and increasing in availability for a long time.

~~~
Godel_unicode
Fwiw, Apple has a $100k bounty on this type of exploit (pulling secrets from
the secure enclave).

~~~
OkGoDoIt
But Cellebrite apparently makes millions off of its service, so the economic
incentives are still on their side.

~~~
movedx
I understand what you're saying here: why share the fact they've broken the SE
for $100k when they can keep making millions.

But if they cracked the SE and kept that fact to themselves, they would be
making even more money, because every government on the planet would be coming
to them.

It would mean a significant spike in the number of phones being cracked and
people being arrested/charged/hung/etc. This would be a statistic that would
jump off the charts and trigger Apple to essentially develop a solution
straight away.

The only way this would work is if they had cracked the SE and are doing an
Enigma: keeping it top secret and only cracking very high profile targets with
the technology, which I guess is possible.

~~~
majewsky
> This would be a statistic that would jump off the charts and trigger Apple
> to essentially develop a solution straight away.

How, though? If the only information Apple has is that their SE scheme is
broken, how is that supposed to help them develop a solution?

~~~
paulie_a
The risky.biz podcast proposed a solution, half seriously and half in jest:
offer $50 million as the bug bounty. It would destroy the working
relationships and trust of the group of people required to come up with
multi-stage exploits, and Apple has the cash to do that once or twice.

------
viraptor
> relatively inexpensive, costing as little as $1,500 per unlock

That's not good. Since we're bound to have that cost-of-unlock war anyway as
new workarounds are found, it should at least be higher. I'd hope for $50k+,
so if it's really needed, it goes through several levels of approvals.

------
mcphage
Why didn't Forbes send them an iPhone X running iOS 11 to hack, to find out if
it's true or not?

~~~
guelo
I imagine Cellebrite is very picky about who gets to be their customer.
Wouldn't want someone sending a trojan device that reveals their secrets.

~~~
mcphage
I would think Forbes Magazine would be a great customer for them.

~~~
JumpCrisscross
> _I would think Forbes Magazine would be a great customer for them_

Forbes would unlock one phone and then alert Apple by way of their story.
Cellebrite’s ideal customer pays for lots of iPhones to be unlocked.

------
julianj
I wish there were a kind of "dead man's switch" app that would wipe a device
if it is not unlocked for x days or meets some other personalized criterion.

~~~
pdkl95
An app would need an environment to run in, but the main CPU/etc may not be
available if the recovery process involves removing chips or other hardware
modifications.

A better idea would be to put an RTC + watchdog timer[1] into the security
chip that holds the keys and power it continuously with a small amount of
_external_ power. The power must be available _and_ the watchdog timer must
have time remaining to disconnect a circuit that pulls the memory holding the
keys to ground.

More advanced types of tamper resistance and self-destructing chips are
possible, but they tend to have significant downsides.

[1]
[https://en.wikipedia.org/wiki/Watchdog_timer](https://en.wikipedia.org/wiki/Watchdog_timer)
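The RTC-plus-watchdog scheme above can be sketched in software (a toy model only; real hardware would implement this in the security chip's logic, with the timeout and method names here being illustrative assumptions):

```python
import time

# Toy model of the dead-man scheme: an externally powered watchdog that
# must be "kicked" by a successful unlock within TIMEOUT seconds, or the
# circuit holding the key memory is pulled to ground (modeled here by
# zeroing the key bytes).

class KeyWatchdog:
    TIMEOUT = 3 * 24 * 3600          # e.g. three days without an unlock

    def __init__(self, key: bytearray):
        self.key = key
        self.last_kick = time.monotonic()

    def kick(self):
        """Called on every successful unlock; restarts the countdown."""
        self.last_kick = time.monotonic()

    def tick(self, now=None):
        """Driven periodically by the RTC."""
        now = time.monotonic() if now is None else now
        if now - self.last_kick > self.TIMEOUT:
            for i in range(len(self.key)):
                self.key[i] = 0      # keys gone; data unrecoverable
```

Because the watchdog fails closed, removing the chip from its external power, or simply stalling the forensic process past the timeout, destroys the keys rather than preserving them.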

~~~
Godel_unicode
Apple is being sued for making phones slightly slower, can you imagine the
lawsuit if they were "deliberately lobotomizing" phones? You know that's how
it would be said...

------
cbrozefsky
So they get owned earlier this month, and then a fluff piece appears later
this month that doesn't mention any of the findings even tho it calls out the
dangers of hoarding vulns.

This is a planted marketing/PR piece.

------
peterjlee
This might be a good scenario for Apple. Apple doesn't have to build a
backdoor, which is good for PR, and the Feds got what they want, so they'll
stop bothering Apple. Which is the position Android/Google was in all along.

~~~
ddalex
So the tinfoil hat theory here is that Apple itself leaks the cracking tech to
Cellebrite to ease the fed pressure and keep its reputation intact? Sorry, I
don't buy it.

~~~
chapill
No one at Apple would share source code? Have you never heard of iBoot?

~~~
zbentley
That's not really the strongest interpretation of GP's statement. It wasn't
implying that nobody would ever share source code for any reason, but that the
_company_ would deliberately decide to share source code in order to obtain
fly-by-night compliance with government requests while maintaining its public
image.

~~~
chapill
That happens all the time too, and I'm sure Apple is no exception. Want to
score that big NSA storage contract? Pony up your HDD firmware... for
"security assurances." Suddenly, NSA has exploits for all 7 major HDD
manufacturers.

[http://www.cbc.ca/news/technology/nsa-hid-spying-software-in...](http://www.cbc.ca/news/technology/nsa-hid-spying-software-in-hard-drive-firmware-report-says-1.2959252)

Hmm, how did Apple get cleared for DOD use?

[http://www.zdnet.com/article/iphones-ipads-cleared-for-u-s-m...](http://www.zdnet.com/article/iphones-ipads-cleared-for-u-s-military-use-dod-fortifies-cloud/)

What process would be required for that? Hmmmm.

------
exabrial
[https://xkcd.com/538](https://xkcd.com/538)

All I want is my every day encryption to be a big enough pain in the butt to
crack that the feds can't break it without spending a medium amount of money.

===Edit HN: Apologies, this was lost in my subtlety, but consider the game
theory aspects. Your best bet is to be _just enough_ of a pain in the butt
that it's difficult to reach you, but you certainly don't want to be singled
out on a national stage either. Maybe I'm the only one who considers this
angle?

~~~
letsgetphysITal
I'm not even on the Feds' radar. I don't want a mugger to send my stolen
iDevice up the food chain to a Russian syndicate and have them get into my
LastPass, internet banking app, and live bitcoin wallet before I've had enough
time to change all the credentials.

------
geuis
Claims in one hand, shit in the other. Prove yourself.

------
notadoc
Not too surprising based on economics alone.

Presumably, for a security professional, selling a usable exploit to a company
like Cellebrite pays far better than the $0 that comes from releasing it as a
"jailbreak" to the general public.

~~~
saagarjha
I'd assume things like these are generally difficult to release to the public
in any meaningful way, since they often require hardware hacks like
desoldering components.

~~~
threeseed
What ? No they don't.

All of the jailbreaks have just involved tethering your phone to iTunes or
visiting a particular website or app. There's never been a need to do any
desoldering.

~~~
cheeze
All of the jailbreaks _that were released as easily accessible jailbreaks_.
It's definitely plausible that this exploit requires direct access to pinouts
on the motherboard.

The difference is just that those exploits which were difficult enough that
they required soldering generally weren't released or didn't get much
traction.

That being said, I remember soldering a modchip onto my original Xbox 16 years
ago.

