
Apple Security Research Device Program - dewey
https://developer.apple.com/programs/security-research-device/
======
saurik
Anyone who wants to should be able to buy such a device, as it isn't like any
of the machine code you are getting elevated access to is even secret (you can
download, from Apple, unencrypted copies of the entire operating system). (You
can try to make an argument that this is about keeping you from getting access
to third-party encrypted assets to prevent some aspect of piracy in the App
Store, but this doesn't accomplish that either as you need only have a single
supported jailbroken device for that to be easy, and the world already has
millions of those and you can't really prevent them as the act of fixing bugs
discloses the bug for the older firmware.)

The real problem here is that Apple is so ridiculously controlling with
respect to who is allowed to develop software (in Apple's perfect world, all
software development would require an Apple license and all software would
require Apple review)--in a legal area that isn't really conducive to that
(see Sega v. Accolade, which was important enough to later ensure permanent
exemptions on reverse engineering and even jailbreaking for software
interoperability purposes in the DMCA's anti-circumvention provisions)--that they
are even working right now on suing Corellium, a company which makes an iPhone
emulator (which again, has strong legal precedent), in order to prevent anyone
but a handful of highly controlled people from being able to debug their
platform.

Apple just has such a history of being anti-security-researcher--banning
people like Charlie Miller from the App Store for showing faults in their
review process, pulling the vulnerability detection app from Stefan Esser,
slandering Google Project Zero, denying the iPhone 11 location tracking until
proven wrong, requiring people in their bug bounty program to be willing to
irresponsibly hold bugs indefinitely so Apple can fix things only at their
leisure, and using the DMCA to try to squelch research via takedowns--that
this ends up feeling like yet another flat gesture: they should have done much
more than this device at least a decade ago. I'd say Apple is in store for a
pretty big fall if anyone ever manages to get a bankroll large enough to
actually fight them in court for any protracted length of time :/.

~~~
devwastaken
If only open source software licenses could have predicted the level of
vertical integration control their software would be used in. Apple
continually violates the good will of developers and puts forth their own bad
will. I'm tempted to make up an 'MIT minus non-free platforms' agreement. If
the OS can't be completely emulated and freely installed without restriction,
then you can't use the library.

I'd like to see Apple survive having to recreate half their software from
scratch.

~~~
saagarjha
Sounds like you might want GPL?

~~~
searchableguy
Probably AGPL

------
keyme
From experience, I'd advise all serious security researchers to never, ever,
sign any agreement with the company whose products they are researching.

This particular case is also outrageous for other reasons:

1) They are only doing this now because Corellium has been selling virtually
the same thing for a while already.

2) They are doing this to try and hurt Corellium financially, while they're
already suing them in parallel.

3) Agreeing to their terms here effectively makes you a glorified Apple QA
engineer. Only you don't get a salary, but rather a bounty for whenever you
find a bug. For most people that would be way, way less money than just being
employed wherever.

~~~
ghshephard
I read the terms of the SRD [1] to suggest if you get one, and use it, you
aren't eligible for bounties on any bugs you find while using it. So, you are
an _entirely_ unpaid Apple QA engineer. Knowledge is its own reward I guess.

[1] "If you use the SRD to find, test, validate, verify, or confirm a
vulnerability, you must promptly report it to Apple and, if the bug is in
third-party code, to the appropriate third party. If you didn’t use the SRD
for any aspect of your work with a vulnerability, Apple strongly encourages
(and rewards, through the Apple Security Bounty) that you report the
vulnerability, but you are not required to do so."

~~~
xondono
It doesn’t suggest that you aren’t eligible for bounties, but rather that you
are not allowed to _not disclose_ a vulnerability.

~~~
usmannk
It certainly suggests this.

Full bullet: If you use the SRD to find, test, validate, verify, or confirm a
vulnerability, you must promptly report it to Apple and, if the bug is in
third-party code, to the appropriate third party. If you didn’t use the SRD
for any aspect of your work with a vulnerability, Apple strongly encourages
(and rewards, through the Apple Security Bounty) that you report the
vulnerability, but you are not required to do so.

~~~
xondono
Maybe it’s me but what I read in that paragraph is:

If you use the SRD, you are required to report any vulnerability. If you
didn’t, you are not required but encouraged.

It doesn’t say if you used it you aren’t eligible for reward

------
shantara
>If you report a vulnerability affecting Apple products, Apple will provide
you with a publication date... Until the publication date, you cannot discuss
the vulnerability with others.

In addition to the mandatory bug reporting, Apple reserves the right to
dictate a mandatory publication date to researchers. No more 90/180-day
responsible disclosure deadline policy. I highly doubt any serious researcher
would agree to work under such conditions.

~~~
saagarjha
Would be interesting to see if Google Project Zero joins the program, given
their inflexibility in disclosure.

~~~
bobviolier
Looks like they won't
[https://twitter.com/benhawkes/status/1286021329246801921?s=1...](https://twitter.com/benhawkes/status/1286021329246801921?s=19)

------
guidovranken
> If you use the SRD to find, test, validate, verify, or confirm a
> vulnerability, you must promptly report it to Apple and, if the bug is in
> third-party code, to the appropriate third party. If you didn’t use the SRD
> for any aspect of your work with a vulnerability, Apple strongly encourages
> (and rewards, through the Apple Security Bounty) that you report the
> vulnerability, but you are not required to do so.

So vulnerabilities found through this program are not eligible for any reward.
Then what would be the incentive to enroll (and to accept liabilities like
losing the device, Apple suspecting you of breach of contract, etc.)? Just
bragging rights?

~~~
saagarjha
I think that is supposed to be read as "you must report any vulnerabilities,
which will be treated as any vulnerability you chose to voluntarily submit".

~~~
guidovranken
You are correct because they have now added a bullet point:

> Vulnerabilities found with an SRD are automatically considered for reward
> through the Apple Security Bounty.

------
natvert
:O

This is huge. Not as a security device, but if this were the normal permission
model on all iPhones (e.g. owners of devices get root on the devices they
own... like a normal general purpose computing device) I could ditch my
android and my mac and use an iPhone for everything.

I'm not saying this will ever happen, but in my mind this paints a bright
picture of what the iPhone could be.

It's also a bit sobering as I'm quite concerned Apple is actually pushing the
other direction in their shift from Intel to ARM.

~~~
yalogin
I don't get the allure of this. As someone working in security, I see the
phone as extremely leaky and very bad for privacy to begin with. On top of
that you want to remove all restrictions and make it a security nightmare too? I
get that you want to install what you like. Sure, but I don't think the
convenience is worth the security trade off.

Honestly the mac or desktop is where I enjoy the openness and do stuff I want
to do. I would want to leave the phone untouched and as secure as possible.

I would like to hear your and others' take on it though.

~~~
saagarjha
> As someone working in security, the phone is an extremely leaky thing and
> very bad for privacy to begin with. On top of that you want to remove all
> restrictions and make it a security nightmare too?

One major issue is that Apple's security model is "we don't trust you". And by
that I mean everything works from _their_ root of trust, not yours. This isn't
the usual "I think Apple is backdooring my iPhone" complaint; what I'm really
saying is that I want the ability to elevate some of _my_ software to the same
permissions that Apple gives theirs. There is no reason that I should not be
able to vet my own code and add it to the "trust cache". So this isn't "every
app should run without a sandbox"; it's "the GDB build that I personally
vetted should be able to attach to other apps, but nothing else".
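
The "trust cache" idea above can be sketched as a toy model (the names, paths, and data structures here are purely illustrative, not Apple's actual implementation): the system runs a binary with elevated trust only if a hash of it appears in a set the kernel already knows, and under the shipping model there is no supported way for the owner to add entries.

```python
import hashlib

# Toy model of a "trust cache": a set of code hashes the system will
# run with platform privileges. Illustrative only -- not Apple's real
# data structures, hash scheme, or policy.
def cdhash(binary: bytes) -> str:
    """Stand-in for a code-directory hash computed over a binary."""
    return hashlib.sha256(binary).hexdigest()

# Apple's root of trust: only hashes of binaries Apple ships are present.
apple_trust_cache = {cdhash(b"/usr/libexec/apple-daemon")}

def may_run_privileged(binary: bytes, trust_cache: set[str]) -> bool:
    """A binary is trusted iff its hash is in the cache."""
    return cdhash(binary) in trust_cache

my_debugger = b"/usr/local/bin/my-gdb"

# Under the shipping model, the owner's personally vetted tool is rejected:
assert not may_run_privileged(my_debugger, apple_trust_cache)

# What the commenter is asking for: an owner-controlled extension of the
# trust cache, so a self-vetted binary can be elevated.
user_trust_cache = apple_trust_cache | {cdhash(my_debugger)}
assert may_run_privileged(my_debugger, user_trust_cache)
```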

~~~
donor20
This is because Apple feels it's too easy to trick users into elevating
software permissions, which in turn may cause risks and harm to their user
base.

Let me ask you - do you have your elderly parents on an android? Then you will
know already how totally owned those phones can become.

~~~
jedieaston
If you require the user to hook into iTunes/Xcode, flip the device into
recovery mode, click a few buttons, and agree to a "You're hecked if you break
it now" policy, it'll be enough to scare off 99.9% of people from getting
owned. After that, just have it work like the current profiles/supervision
system where Settings makes it clear that non-verified code is running and has
a big "make it go away!" button (sideloaded IPAs show up in profiles with a
delete app button, and that works well enough except for the time limit).

~~~
stqism
I don't really agree with this; the end result is going to be a large number
of YouTube tutorials instructing people on how to do this with captions like:
watch free movies on iPhone, “popular mobile game” money hack, and Snapchat
take screenshots without notifying hack.

Half of these developer / root mode required secrets are going to be
occasionally working mods and tweaks except with tons of baked in spyware and
ads that can no longer easily be removed.

Perhaps some sort of per-device profile which requires a paid developer
account could work, but I've gotten a number of odd calls from family about
YouTube videos involving Kodi before, so I'm not sure about trusting the
give-users-freedom front.

~~~
GekkePrutser
This proves exactly the point made above of Apple not trusting the user.

However if someone wants to be an idiot, how far do you go to stop them?
Apple's approach stops too many great possibilities for knowledgeable users.
It should be in the same category as those "will it blend" types. Screw it up?
No warranty.

For me there are several things I need that are impossible because Apple
won't allow them, so I have to use Android. But that comes infected with
Google spyware out of the box :(

~~~
giovannibajo1
I think Apple's point is that the users who need to be protected from
themselves, without even realizing it, far outnumber those who might benefit
from root without getting burnt. Since the two things can't exist at the same
time, they're going for the road that makes the majority happy.

~~~
zapzupnz
And provides the least burden to their support service.

------
jedieaston
I wonder how much people are able to publish about the device. I'd expect not
much, but it'd be nice to be able to compare an iPhone that was completely
unlocked (at least, whatever that means for Apple) with whatever security
they put on the ARM Macs which are supposed to be "open for hobbyists". I'd
expect that the ARM Macs have much of the same security stack (by default)
that iOS devices have given what they said in the WWDC talks, but maybe that's
not the case.

Also, if you found an exploit on a research iPhone because you made use of
entitlements that were Apple-only, I wonder if that'd be worth anything
bounty-wise. Nobody can/should be able to write an app that'll get through
App Store checks if it asked for PLZ_NO_SANDBOX_ILL_BE_GOOD or something (at
least, that's what I thought before the whole Snapchat system call thing
happened).
But hypothetically the App Store review process is vulnerable to a bad actor
inside Apple pushing an update to a big app that included malware, so I'd
think that private entitlements shouldn't be available at all to binaries that
didn't ship with the device/in a system update (unless some kind of hobbyist
flag was flipped by the consumer). So I'd say that would be worth something,
even if smaller than a more interesting exploit.
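
The review-side check described above can be modeled as a simple allowlist over entitlement keys (the entitlement names below are made up for illustration, including the joke one from this thread; real private entitlements use Apple-internal identifiers):

```python
# Toy model: App Store review rejecting binaries that claim entitlements
# outside a public allowlist. Entitlement names are hypothetical, not
# Apple's actual identifiers.
PUBLIC_ENTITLEMENTS = {
    "app-sandbox",
    "network-client",
    "push-notifications",
}

def review_entitlements(requested: set[str]) -> bool:
    """Pass review only if every requested entitlement is public."""
    return requested <= PUBLIC_ENTITLEMENTS

# An ordinary app requesting only public entitlements passes:
assert review_entitlements({"app-sandbox", "network-client"})

# A binary claiming a private, Apple-only entitlement should fail review:
assert not review_entitlements({"app-sandbox", "PLZ_NO_SANDBOX_ILL_BE_GOOD"})
```

The point in the comment is that this check alone doesn't cover the insider-threat case: a malicious update pushed from inside Apple would bypass review, which is why the commenter argues private entitlements should also be unenforceable for binaries that didn't ship with the system.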

~~~
saagarjha
We’ll see how the shipping ARM Macs are “fused” when they come out, but my
guess is that they will be more locked down than these devices: their OS will
be more permissive but you will not have meaningful kernel debugging.

> Nobody can/should be able to write an app that'll get through App Store
> checks if it asked for PLZ_NO_SANDBOX_ILL_BE_GOOD or something (at least,
> that's what I thought before the whole Snapchat system call thing happened).

Snapchat (on iOS at least) is still subject to the app sandbox; no app on iOS
has been granted an exception there to my knowledge. On macOS there are
apps that are “grandfathered in” to not require the sandbox on the App Store,
but new apps are supposed to have it. Due to the way the dynamic linker works,
until recently it was possible to upload an app that could bypass the sandbox,
but Apple has said they have fixed this. Some apps do have an exception to
this as well, as the broad way they fixed one of the issues broke legitimate
functionality in library loading. You can find those hardcoded in AMFI.kext;
theoretically they could turn off the sandbox for themselves if they wanted.

~~~
GekkePrutser
> We’ll see how the shipping ARM Macs are “fused” when they come out, but my
> guess is that they will be more locked down than these devices: their OS
> will be more permissive but you will not have meaningful kernel debugging.

My big worry is them dropping terminal access altogether like on iOS. That
would really make the platform useless to me.

However I don't think they would do this at this point. There are many user
groups (like cloud developers) who specifically favour the Mac because of its
strong terminal access.

~~~
easton
Craig specifically said that this wasn't going to happen; in one of the
podcasts he said people came up to him internally and asked "Wait. There's
still Terminal, right?" and he said "Yeah, it's a Mac." The Platforms State
of the Union host also said that they had reached out to a bunch of open-
source projects with assistance (and in some cases, IIRC OpenJDK and
CPython, pull requests) on moving to ARM.

~~~
GekkePrutser
Thanks, I didn't know that. Good to hear!

~~~
john_alan
Yep. And Craig also said the Mac is staying open. And that he was sick of
trying to convince people that!

------
saagarjha
I want to apply (not that I am sure Apple would consider me a security
researcher) but am unsure how far they're going to take
> If you use the SRD to find, test, validate, verify, or confirm a
> vulnerability, you must promptly report it to Apple and, if the bug is in
> third-party code, to the appropriate third party.

I mean, if I find a bug I might report it, but I know people who work on
jailbreaks and stuff–if they tell me something will I have to promptly report
it? What if I find something on a non-SRD device? If I ever hypothetically
"write a jailbreak", will Apple come after me even if I say I didn't use that
device for it? I can get 90% of the benefit from using a device with a
bootrom exploit, with none of the restrictions here…

~~~
phnofive
I’m not a lawyer nor your lawyer, but I read that to mean any vulnerability
you discover as a result of your research using the SRD, not any vulnerability
you otherwise discover or of which you have knowledge.

~~~
saagarjha
Right, but is Apple going to believe me when I say that I didn't? They could
just revoke my access anyways. (I'm being honest here, this isn't a question
of "can I trick Apple into thinking I didn't do this on the SRD".)

~~~
phnofive
I’d think it’d be difficult to run in both circles for too long.

------
ebg13
This involves an interesting set of assumptions about the plausibility of
deep-cover hacking operations.

> _If you use the SRD to find, test, validate, verify, or confirm a
> vulnerability, you must promptly report it to Apple_

But let's say you pass their review, get a device, find a vulnerability, and
don't report it. Then what? You're breaching the contract, but they have no
way to know that, so there's no consequence?

~~~
saagarjha
I would expect that if you put up jailbreakmeios14.com and they find you have
one of these devices, they will remove you from the program.

~~~
ebg13
Yes, but most exploits are deployed in secret by malicious groups trying to
hack your shit and steal your money/identity/whatever, not publicized on
consumer-facing websites with your name attached.

~~~
mschuster91
It's like with Al Capone and the taxman: at least this way Apple has an angle
of legal attack against shady companies (Hacking Team, FinFisher, ...).

------
brutopia
As a long time iOS user this single aspect has made me look over the fence to
the android side the whole time. Not having full access to my own devices is
insane. The poor security on android side has kept me away, but they’ve just
recently been catching up enough that the scales are almost tilted.

~~~
rimliu

> The poor security on android side

You almost got it… almost.

------
ChrisMarshallNY
Looks fairly cool, but I'll bet it isn't that popular with security boffins. I
would be cautious about something that might not actually reflect a current
"in the wild" device.

For example, if the OS isn't quite at the same level as the release OS, it
could be an issue.

That said, this is not my field, and I am not qualified to offer much more
than the vague speculation, above.

~~~
saagarjha
I would expect it to be exactly the same except that you can debug it,
basically. iPhones have a special fuse in them that prevents that from being
done on production hardware, and these will presumably have it "unblown". If
you want to test on production hardware you always can; this just lets you do
research. (A metaphor might be that this is a "debug build with symbols",
while normal iPhones are a "release build".)

------
dpifke
Has anyone here ever been paid a bounty for a vulnerability reported to Apple?

~~~
GekkePrutser
Yes it does happen: [https://gizmodo.com/apple-pays-developer-100-000-for-
finding...](https://gizmodo.com/apple-pays-developer-100-000-for-finding-
serious-bug-i-1843805634)

~~~
saagarjha
Any for kernel bugs?

------
gowld
Why do experts make absurd comments like "Everybody thinks basically that the
method you learn in school is the best one"?

~~~
saagarjha
Wrong thread?

------
hendersoon
I agree this is theater; no serious whitehat researcher would sign a deal
forcing them to accept dates from the manufacturer. It won't be useful for its
intended purpose.

On the bright side, it will be very useful for jailbreak research and in a
way, those bugs _do_ get disclosed to Apple for them to subsequently fix. Not
necessarily the way Apple wants, but it does shine daylight on their code.

These guys keep working exploits close to their hearts and don't release them
specifically so they can get a look at new hardware. That will no longer be
necessary. You find an exploit, you can release it right away.

And on the gripping hand, it will also be used by malicious criminals and
state actors to develop zero days for various evil purposes.

~~~
saagarjha
> On the bright side, it will be very useful for jailbreak research and in a
> way, those bugs _do_ get disclosed to Apple for them to subsequently fix.

It’s useless for jailbreak research because Apple will force you to shut up
about it at least until they patch it, so now you can’t jailbreak.

~~~
GekkePrutser
That is if people obey the NDA of course. I'm sure not everyone will do so.

However, finding a bug, reporting it, and then 'suddenly' a jailbreak
appearing that uses it would be highly suspicious indeed. So they'd probably
have to give up the chance of getting the bug bounty.

PS: I'm certainly not signing that NDA myself :)

~~~
hendersoon
Yes. Once these devices exist they will be used by everybody interested in
that sort of access. Ironically, pretty much everybody _other_ than whitehats.

