
Apple Just Killed The 'GrayKey' iPhone Passcode Hack - ceejayoz
https://www.forbes.com/sites/thomasbrewster/2018/10/24/apple-just-killed-the-graykey-iphone-passcode-hack/#7fa224f05318
======
dang
Url changed from [https://gizmodo.com/apple-reportedly-blocked-police-iphone-h...](https://gizmodo.com/apple-reportedly-blocked-police-iphone-hacking-tool-and-1829974710), which points to this.

~~~
gruez
can you also do something about the clickbait title?

>Apple Just Killed...

iOS 12 was released weeks ago, so the exploit wasn't _just_ killed as the
title says.

~~~
thanatos_dem
It’s using the title of the article directly. Go yell at Forbes for being
click-baity.

~~~
gruez
[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

>Otherwise please use the original title, _unless it is misleading or
linkbait_ ; don't editorialize.

------
Someone1234
"Nobody knows how?" They have a pretty plausible explanation in the back half
of this very article:

> With iOS 12, Apple implemented a highly-anticipated change called “USB
> Restricted Mode.” This shuts off lightning port access on the iPhone if it
> hasn’t been unlocked by a user in the last hour. This was widely believed to
> be Apple’s solution to foil companies like GrayShift and Cellebrite but we
> don’t know for certain if that did the trick. Apple did not return our
> request for comment.

That would definitely do it. If Lightning access goes down, they can't
differentially back up/restore the device while attempting to guess the user's
passcode.
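The timeout behavior the quoted passage describes can be sketched as a simple state check. This is a hypothetical model for illustration, not Apple's actual implementation:

```python
from datetime import datetime, timedelta

USB_RESTRICTED_WINDOW = timedelta(hours=1)  # iOS 12 default per the article

class Phone:
    def __init__(self):
        self.last_unlock = None  # timestamp of last successful unlock

    def unlock(self, now):
        self.last_unlock = now

    def usb_data_allowed(self, now):
        """Lightning data access is allowed only if the phone was
        unlocked within the last hour; otherwise the port is
        effectively charge-only."""
        if self.last_unlock is None:
            return False
        return now - self.last_unlock < USB_RESTRICTED_WINDOW

phone = Phone()
t0 = datetime(2018, 10, 24, 12, 0)
phone.unlock(t0)
print(phone.usb_data_allowed(t0 + timedelta(minutes=30)))  # True
print(phone.usb_data_allowed(t0 + timedelta(hours=2)))     # False
```

Once the window expires, a cracking box plugged in later has no data channel to attack, regardless of what exploit it carries.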

~~~
Scoundreller
How do you backup your phone after you smash your screen?

~~~
Johnny555
_How do you backup your phone after you smash your screen?_

Isn't that kind of like asking "How do you put your seat belt on after you've
been knocked unconscious in a car accident?"

~~~
umvi
No, more like: "How do I get my luggage out of the trunk after a brick goes
through the windshield and smashes the trunk lever?"

~~~
callmeal
You pull down the back seat and get into the trunk from inside the car, of
course. Snark aside, if your phone is dead, the only way to get data out of it
is to get at the "disk" on the device itself, if that's even possible.

------
skohan
> GrayKey, which counts an ex-Apple security engineer among its founders

How is it possible for someone to start a business built on breaking the
security systems they had a role in implementing? It seems like a huge ethical
breach, and I'm surprised there would not be contractual considerations to
prevent this in the security field.

~~~
eridius
Why would it be unethical to get a job trying to break the security you
previously helped implement? Security is not accomplished via obscurity so
it's not like there's a trade secret issue here.

Really the only problem I can think of is if you put backdoors or weaknesses
into the system with the expectation of being hired later to defeat the same
system. But the problem there is the fact that you deliberately crippled the
system you were building and not the fact that you were hired later to defeat
it.

~~~
chongli
It becomes unethical as soon as you discover the vulnerability and refuse to
disclose it.

It's like if you have a key to an apartment and you move out without the
landlord asking for it back. It's not unethical if you then discover that the
key still works. It becomes unethical when you do not disclose this fact.
Worse still if you start making copies of the key and selling it to people who
intend to break into that apartment.

~~~
snowwrestler
I think the ethics depend heavily on when a flaw was discovered.

If this person knew of a flaw when at Apple, and did not disclose it to Apple
(his employer), but instead left to go exploit it--that would be clearly
unethical and possibly even illegal.

But if they left, and then, as an ex-employee of Apple, discovered a new flaw
in Apple's security... how are they any different from a random person who
never worked at Apple?

Do security researchers have a general ethical obligation to disclose a
product's security flaws to the company who created it? I think most security
researchers would say no... the obligation is more rightly put on the company
itself to produce secure products.

Company managers don't generally have an ethical obligation to their ex-
employees after the employment period ends. It doesn't seem fair to say that
ex-employees should be obligated toward their former employer. The exception
would be if they acted unethically _while employed there_ and then reaped the
benefits later.

Or look at it this way--imagine an employee of GrayKey went to Apple, and then
learned how Apple is defeating GrayKey. Does that employee have an ethical
obligation to tell GrayKey about that? If ethics don't work the same in both
directions, they're probably not strong ethics.

That is, unless there is some higher moral at stake, like "breaking security
is always wrong." But even that is problematic because if you never try to
break security, it never gets better.

~~~
chongli
_I think the ethics depend heavily on when a flaw was discovered. If this
person knew of a flaw when at Apple, and did not disclose it to Apple (his
employer), but instead left to go exploit it--that would be clearly unethical
and possibly even illegal. But if they left, and then, as an ex-employee of
Apple, discovered a new flaw in Apple's security... how are they any
different from a random person who never worked at Apple?_

They are still different from the average person on the street. Security is
more than just the binary flaw/no flaw distinction. Perhaps while working at
Apple the researcher knew about some old libraries that were still in
production release but had not been worked on, let alone updated in years?
That sort of insider knowledge could help them find exploits the average
person wouldn't think of.

GrayShift is a completely unethical company. They are selling unauthorized
access to people's devices. The fact that their main clients are law
enforcement officers is irrelevant.

I can't believe you'd suggest that Apple is in the wrong for hindering
GrayKey's efforts. Am I in the wrong for changing the locks on my door to
hinder burglars?

Security flaws are ticking time bombs that threaten all of society.
Discovering them and exploiting them rather than helping to fix them is
ethically akin to discovering chemical or biological weapons and helping to
put them into use.

~~~
snowwrestler
> GrayKey is a completely unethical company. They are selling unauthorized
> access to people's devices. The fact that their main clients are law
> enforcement officers is irrelevant.

I'd say this is an example of a higher ethic or moral at stake. Not everyone
agrees with you that law enforcement access to a device is always unethical.
It certainly can be, if the police are acting unethically (which some do). But
if they are properly investigating a crime and get a warrant, that's going to
cross into the OK zone for a lot of people.

> I can't believe you'd suggest that Apple is in the wrong for hindering
> GrayKey's efforts.

I did not suggest that and I don't believe that. I think it's great that Apple
is fixing their device security.

~~~
chongli
_I'd say this is an example of a higher ethic or moral at stake. Not everyone
agrees with you that law enforcement access to a device is always unethical.
It certainly can be, if the police are acting unethically (which some do). But
if they are properly investigating a crime and get a warrant, that's going to
cross into the OK zone for a lot of people._

Not for me. There's nothing sacred about the police. They spend a heck of a
lot of time and effort acting as foot soldiers in a class war against the
impoverished. They engage in widespread legalized highway robbery in the form
of civil forfeiture. And if you want to include customs and border patrol (I
do) they also spend a ton of time breaking up desperate families and
conducting dragnet surveillance against innocent travellers. And their
effectiveness in all these endeavours? Abysmal, if you take their stated aims
at face value.

------
juskrey
Any sufficiently popular "hacking device" is doomed in the long run. In this
case, that's a good thing. We've seen much bigger battles over game console
modchips, all of which were ultimately lost by the modchip makers. They are
just training the defences.

~~~
chupasaurus
Inverse logic is valid too.

Any sufficiently popular "thing to crack" is doomed in the long run. ... They
are just training offences.

~~~
xoa
> _Inverse logic is valid too._

Actually it's not.

> _Any sufficiently popular "thing to crack" is doomed in the long run. ...
> They are just training offences._

No, because these are not symmetric operations. There is nothing
mathematically impossible about bug-free software. While the Halting Problem
means we can't just formally take an arbitrary program and solve all its
states, it is possible to formally prove small programs and build up
additively, approaching a limit of zero bugs with enough brute force over
time, particularly in simple programs. If they are not changing and not
actively gaining new features, then they will not automatically gain new
vulnerabilities either. For all these reasons it's standard practice to
separate out security-core code and keep it as simple and static as possible
beyond bug fixes.
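The claim that small, static programs can be checked essentially exhaustively can be illustrated with a toy example. This is illustrative only; real verification of security-core code uses proof assistants or model checkers rather than enumeration:

```python
def clamp(x, lo, hi):
    """Tiny, frozen function: clamp x into the range [lo, hi]."""
    return max(lo, min(hi, x))

# For a small enough input domain, we can check the specification on
# every input rather than relying on spot tests -- the "brute force
# over time" approach that only works because the function is simple
# and not gaining new features.
for lo in range(-5, 6):
    for hi in range(lo, 6):
        for x in range(-50, 51):
            y = clamp(x, lo, hi)
            assert lo <= y <= hi                # output always in range
            assert y == x or x < lo or x > hi   # inputs in range pass through
print("spec holds on the whole domain")
```

Once a function like this is verified and then left alone, it does not acquire new vulnerabilities, which is exactly the asymmetry the comment above describes.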

So unlike in the real world, in software the defense is actually in a
naturally stronger position if managed well. Software history is a spiral, not
a circle: security practices are improving and legacy is being reduced over
time. A lot of the easy mistakes of the past rarely happen anymore, and Apple
has definitely benefitted long term from the hunt for exploits in hardening
its ecosystem. Bugs still exist, but they're no longer low-hanging fruit:
they're harder to find, more restricted in scope, and more of them must be
chained together for wider access.

~~~
dev_dull
> _No, because these are not symmetric operations. There is nothing
> mathematically impossible about bug free software._

To me this is like saying there’s nothing mathematically impossible about a
perfect vacuum. There’s always something leaking energy/attack surface.

------
jumelles
> You can’t even view [GrayShift's] website without a login given to members
> of law enforcement

I would _love_ to see some leaked screenshots of even just the main page after
login.

~~~
StudentStuff
GrayShift is targeting governments and large companies as customers; it's
unlikely to be a polished website.

~~~
_audakel
Amen to that. My company does very large cybersecurity contracts for the DoD
and has a god-awful home page. It's almost like a badge of honor how bad it
is.

~~~
Slartie
Maybe there's a style guide for the US intelligence/cybersecurity apparatus
that demands ugly design straight out of the 90s? Maybe that style guide is
also top secret ;-)

I've been wondering about that since Snowden afforded us a glimpse into the
internal PowerPoint world of the NSA. That stuff - all of it - looked just
like I would have designed presentations around 1996. When I first discovered
PowerPoint. As a kid.

~~~
SmellyGeekBoy
I imagine a lot of government agencies are still using IE6 internally which
probably explains this.

------
vbezhenar
Is it even known how GrayKey broke into iPhones? Like, the details of the
attack: how did they bypass the passcode entry delay? I thought those delays
were implemented in hardware by the secure chip, but obviously that's not the
case.

------
zaroth
The screenshot in the article which is supposed to show how to make sure the
feature is enabled, seems to be showing the setting _disabled_?

~~~
dpkonofa
The screenshot is actually showing a section called "Allow Access When
Locked". In order to have your content secured, you'd want USB Accessory
access turned off when the phone is locked. The screenshot is correct. Their
description of the feature has it backwards.

~~~
zaroth
Ah, I guess in that context it makes sense...

Does the description change as you flip the toggle? Otherwise it certainly
seems like the description implies it should be ON in order to achieve that
behavior, not OFF.

~~~
dpkonofa
Yes, it does. When it's off it says something to the effect of "Turn off to
prevent devices from accessing your phone's contents without requiring your
passcode."

~~~
zaroth
Wow, when they word it that way, you wonder why it’s an option at all!

But in any case, from the screenshot it seemed confusing but with the full
description of the UI it sounds like they got it as right as a cryptic
security setting buried 3 menus deep could ever be.

------
jpalomaki
Previously I remember seeing speculation that having tools like GrayKey
available can, in a sense, be good for Apple. If the tools are hard enough to
get and limited to law enforcement, they shouldn't much affect the public
perception of iPhones as very safe and secure. And if these tools are
available in some form, that reduces pressure from the government on Apple to
provide access to data on the devices.

~~~
robbrulinski
If GrayKey exists as a real tool, it will eventually be stolen and leaked or
sold online.

------
ada1981
I’d really like a clone of this device that functions as a charger.

------
Justsignedup
The problem is that Apple can now start locking the USB port after a single
failed attempt. Eventually the entire technique of brute forcing will be moot.
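Some rough arithmetic shows why escalating delays alone already make on-device brute force hopeless. The schedule below is modeled on Apple's published behavior (no delay for the first few attempts, then 1, 5, 15, and 60 minutes); treat the exact numbers as an approximation:

```python
# Minutes of forced delay before each failed attempt, modeled on
# Apple's published escalation; attempts 1-4 have no delay.
ATTEMPT_DELAY_MIN = {5: 1, 6: 5, 7: 15, 8: 15, 9: 60}

def time_to_try(n_attempts):
    """Total minutes of delay to make n_attempts guesses, assuming
    every attempt past the 9th also costs a full hour."""
    total = 0
    for i in range(1, n_attempts + 1):
        total += ATTEMPT_DELAY_MIN.get(i, 60 if i > 9 else 0)
    return total

# A 6-digit passcode has 1,000,000 combinations. At roughly an hour
# per guess, exhausting the space takes on the order of a century.
minutes = time_to_try(1_000_000)
print(f"exhausting all 6-digit codes: ~{minutes / 60 / 24 / 365:.0f} years")
```

GrayKey's value was precisely that it sidestepped this schedule somehow; if the data port locks after a single failed attempt, even a delay bypass has nothing left to talk to.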

------
DINKDINK
I assumed it was the tweak they made to ARM pointers

~~~
eridius
The tweak made to ARM pointers is a protection against return-oriented
programming attacks. While it's conceivable that GrayKey's operation relies on
attacks of this nature, in all likelihood this is not the answer.

------
sunstone
You can be sure that more than a few people at Apple know how it was done.
Just as you can be sure they are not talking.

------
akulbe
This makes me happy. While I am not anti-police... I feel like there is far
far FAR too much abuse of power going on.

------
sbhn
Great news. Now can Apple fix the problem where nearby NFC HID readers
auto-open Apple Pay?

~~~
ceejayoz
Isn't that the entire point of Apple Pay?

