
iPhone Bugs Are Too Valuable to Report to Apple - cinquemb
https://motherboard.vice.com/en_us/article/gybppx/iphone-bugs-are-too-valuable-to-report-to-apple
======
objclxt
But...this is true of every software vendor. Does anyone think Microsoft is
paying market value for a remote code execution exploit in Edge? They're not,
they'll give you $15k for it[1].

I find this particularly interesting:

> [the security researchers] asked Apple's security team for special iPhones
> that don't have certain restrictions so it's easier to hack them [...] these
> devices would have some security features, such as sandboxing, disabled in
> order to allow the researchers to continue doing their work.

If I go to Google or Facebook and ask them to, say, turn off some key security
features on their site so I can find more bugs, they're going to tell me to go
take a hike. It's unclear to me why a security researcher thinks Apple would
give them access to a device with the sandbox bypassed. Why would they
possibly trust them?

[1]: https://technet.microsoft.com/en-us/library/dn425036.aspx

~~~
saurik
To me, your analogy falls extremely flat, given that in this situation you own
the iPhone in question and are just asking for the right to put arbitrary
software... on your own phone.

(My comment is now at -1. I challenge anyone considering downvoting this
response to actually answer how asking for the right to modify the software on
the _one_ phone in your possession--the one whose security features, if you
really want to insist that this is about a security feature, only affect
_you_--is even remotely comparable in an honest setting to "go[ing] to
Google or Facebook and ask[ing] them to, say, turn off some key security
features on their site".)

~~~
jsjohnst
Here's the concern I have: if you're able to "unlock" a single device, how do
you do it in a way that's useful to the researcher, but can't be done to an
unsuspecting 3rd party's phone?

The only way I can think of that doesn't have an endless array of problematic
edge cases is for Apple to have a "researcher edition" of the phone, but even
that isn't problem free.

Thoughts?

~~~
saurik
Apple already solves this problem by way of their TSS update server: they can
whitelist specific ECIDs to be able to sign and install developer-customized
firmwares. The argument was: "if we sign on to your program, can you add us to
your whitelist?".
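The per-device decision being described can be illustrated with a toy model. This is a hypothetical sketch: the ECIDs and names are invented, and the real TSS protocol exchanges signed request/response blobs rather than making a bare boolean check.

```python
# Hypothetical sketch of ECID-whitelisted firmware signing, loosely modeled
# on the TSS idea described above. All ECIDs and names here are invented.

RESEARCHER_ECIDS = {0x12AB34CD56EF}  # devices registered to known researchers

def should_sign(ecid: int, firmware: str) -> bool:
    """Decide whether the signing server approves this firmware for this device."""
    if firmware == "stock":
        return True  # stock firmware is signed for every device
    # customized builds (e.g. sandbox disabled) only for whitelisted devices
    return ecid in RESEARCHER_ECIDS

print(should_sign(0x12AB34CD56EF, "research"))  # True
print(should_sign(0xFFFF, "research"))          # False
```

Because the whitelist is keyed on the ECID (a per-device hardware identifier), a customized firmware signed this way is useless on any other phone, which is what answers the "unsuspecting 3rd party" concern above.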

The only issue I can come up with would be "what if a researcher who was
registered with and known to Apple gave their phone to someone else as a fake
gift to spy on them". If you really want to go there we could argue it, but it
seems a little far-fetched in terms of "amount of damage that can be done via
this route that couldn't be done via other ones".

~~~
jsjohnst
> The only issue I can come up with would be ....

If it's locked to specific hardware, then no, I definitely wouldn't argue the
edge case scenario you mentioned.

------
saurik
If you noticed your friend and neighbor left their car unlocked with the keys
inside, would you steal their car, or would you tell them? I bet you would
tell them.

Let's say instead that there were some shady people who liked to hang out
outside of the local gas station, and you have heard that they will give you a
cut of any heist they pull off based on "tips". Do you call them over to come
steal the car? I bet you don't do that either.

Let's say that it is some random person's car. I still doubt you steal the
car, and I still doubt you tip off someone else to steal the car. However, I
also doubt you go far out of your way to find and tell the owner of the car
something is wrong.

What if, though, you knew that you could get a reward; something sizable
enough to be at least worth your time, but nowhere near the value of either
the car itself or what a band of thieves would be willing to give you if you
called them and tipped them off?

That's all this bug bounty program is: it is designed to provide a reason for
people who come across bugs to even bother coming to Apple at all rather than
just putting it in a "pile of fun bugs".

Only, instead of the moral issue being "someone's car might get stolen", it is
more like "you found a bug in the Tesla's computer locks, which makes it
trivial to walk up to any Tesla anywhere and just drive off (or even tell the
Tesla to steal itself!)".

The companies that offer large sums of cash for key bugs, such as Zerodium,
tend to be pretty "black hat"... their clients are doing stuff like corporate
and governmental espionage; they might even have mafia-like organizations as
clients for all you know.

https://www.wired.com/2016/09/top-shelf-iphone-hack-now-goes-1-5-million/

So that's the real ethical question involved here: do you go to Apple and get
your $50-200,000, knowing that Apple will give you credit for the bug, let you
talk about it at the next conference, and seems to care enough to try to fix
these things quickly...

...or do you sell your bug to a group that resells it to some government which
then uses it to try to spy on people like Ahmed Mansoor, "an internationally
recognized human rights defender, based in the United Arab Emirates (UAE), and
recipient of the Martin Ennals Award (sometimes referred to as a “Nobel Prize
for human rights”)".

https://citizenlab.org/2016/08/million-dollar-dissident-iphone-zero-day-nso-group-uae/

FWIW, I have severe moral issues with this bug bounty program: I am a strong
advocate of simultaneous disclosure, and while Apple does tend to fix bugs
quickly, they have made it clear that they are not prepared to commit to
timelines even while keeping users in the dark about what they need to do to
protect themselves.

However, this article makes it sound like the entire concept of the bug bounty
program is incompetent or something, as it is failing to pay as much money as
the black market... while I have met a few people in the field who are more
than happy to sell a bug to literally anyone with cash, the vast majority of
people (even the ones whom I have sometimes called "mercenaries" for being
willing to "switch sides"), have a pretty serious distaste for the idea of
selling a bug to the highest bidder.

The real reasons you don't hear much about people selling their bugs to Apple
are threefold. Some researchers are like Luca (who started doing this at the
age of 17--he's now either 19 or 20?--which is context that I think is really
important for this evaluation) and are sitting on bugs because the bugs are
_personally valuable to them_: without at least one bug, you don't even own
your own phone enough to look for others, so there's a really big incentive to
not disclose your last bug (this is the thing that Apple should really care to
fix). Others are intending to release a public weaponized exploit in the form
of a jailbreak, which--given the demand from legitimate users due to Apple's
insistence on locking down their devices for reasons that are more about
business models than security--can be a ticket to world-wide fame that money
just can't buy, and which will also net you at least some donations on the
side. And some simply have been selling bugs to Apple but haven't told anyone
(a situation that seems so likely that it seems weird that this article
discounts it).

~~~
averagewall
You're neglecting the effort needed to find the keys. Suppose you spend all
day every day checking every car for miles around searching for keys left in
them. That's your occupation.

Either:

A) You're already a criminal doing it for the money or

B) You're just trying to help people and not doing it for the money.

Person B probably doesn't exist. So you're probably already A, a criminal, and
you don't care if people think you're immoral.

~~~
Godel_unicode
> Person B probably doesn't exist

I don't? Weird. There are lots of motivations for hunting bugs, many more
powerful than money to people who have money already.

------
microcolonel
Well, it's invite only, and some fairly critical exploit categories seem to be
way below market value. This is especially silly because these are _maximum_
payouts.

Apple is not lacking cash on hand, and while I know that's not a reason to
spend it, I figure they could come further toward the going rate. Especially
infuriating is the lack of other moral incentives beside "doing the right
thing", like when you get a bug bounty for an open source component, and know
that the public has free access to it. Even "doing the right thing" when it
comes to Apple is morally unrewarding, since they typically treat developers
and partners like dirt, and are so isolated from the rest of society.

Because of the lack of true community around Apple and their products, owing
largely to how proprietary all of their products and programmes are, I don't
see why anyone would do white hat security research on their platforms unless
they were paid substantially and directly by Apple or an Apple customer.

From my perspective, it's enough of a smack in the face to use their products.
I doubt many want to be fed what amounts to (relatively speaking) table scraps
for elite security research on a platform that you don't own at the end of the
day even as a customer. To have to be _invited_ to do this just makes it
completely not worth starting if your goal is to participate in the white hat
market.

------
monksy
So, in short: Apple is being stingy on the rewards paid for finding these
issues while complaining that they're not competitive?

~~~
tristanj
It's more like the exploit buyers will always pay more than what Apple would,
so if Apple raises its bounties, the exploit buyers will raise their amounts
too. Zerodium could just offer 2x as much as Apple's bounty, and Apple can
never win on price. It's a losing game for Apple to play.

What Apple can compete on are the non-monetary incentives, such as prestige,
rewards, access, etc.

~~~
desdiv
> Zerodium could just offer 2x as much as Apple's bounty, and Apple can never
win on price.

I'm a die-hard Android fan, but even I'm forced to admit that iPhone has much
better security. And security is a big selling point.

Raising the bug bounty to $10 million would still be pocket change for Apple,
but it would result in:

1. Free advertisement

2. Good PR

3. Outbidding the black market for iOS exploits

4. Encouraging more white hats to look into iOS security

~~~
heartbreak
Why do you think that organized crime and governments can't afford $10 million
per zero day?

There are amounts that they can't afford, but those amounts would stretch
Apple's wallet as well.

~~~
PeterisP
Governments certainly can afford such amounts. If you're developing a
cyberweapon with an important purpose (e.g. Stuxnet) then putting in 3-5
zerodays at $10m each is within your budget, it's comparable to the cost of
physical military hardware that they'd gladly buy and use (and destroy) for
goals of similar importance.

------
agjacobson
1. Sell the bug on the open market for whatever price the market will bear.
Advantage: you get paid commensurate with your talents.

2. Wait n days. n could be zero; I don't have a theory for n.

3. Slip the bug to Apple anonymously.

~~~
rubatuga
Well, your payment could be part of a contract where they only give you the
money if Apple doesn't patch the bug in the next n days.

~~~
pluma
Contracts only matter if they are enforceable by law. If a contract involves
illegal activity, it is invalid.

~~~
jsjohnst
There are many kinds of contracts. The contract implied above could be "cross
us and you'll find that you and your family disappear", not "we sue you in
court".

------
drewmol
The article mostly proposes what Apple should do: increase bug bounty payouts.
When proposing what the discoverer should do, does it matter WWJD?

He's dead, so he cannot weigh in, but I think it's fair to suppose that early
in his life/career he might have sold it to the highest bidder and used the
money to invest in a Pixar type, or sold it to both parties if possible. Later
in life/career, he might have used it to negatively impact a competitor
(weighing the legal risk to his company, of course). How about
WW{Apple/Cook}D? I think we know what Uber would do ;)

------
DennisP
Apple has $58 billion, and a quarterly gross profit of about $20 billion.
Based on the article's numbers, it could outbid the black market by
multiplying its payouts by ten. I doubt there are so many bugs that Apple
couldn't do this with an insignificant impact to its bottom line, even before
counting any sales benefit from a bulletproof reputation for security.

~~~
runeks
I agree.

I don't think there's a need to multiply payouts by ten though. When you give
regular people -- as opposed to people familiar with selling exploits on the
dark web -- the ability to sell exploits legally to Apple, you motivate a
_much_ larger group of people. The honest hackers far outnumber exploit
vendors if someone is willing to pay them. That someone could be the company
that depends on the software for their business (Apple with iOS), or a
software insurance company.

The very reason companies like Google, Microsoft, and Apple are so profitable
is that they have many users, which in turn makes exploits against their
software very valuable. So, unless a company's profit per user is very small,
this should always scale (more users = both higher income from users and
higher prices for exploits). If they were willing to purchase exploits at
market price they would, in effect, be functioning as their own insurer -- in
my opinion, this practice is the only sensible thing an insurance company can
do to insure software of reasonable complexity.

Microsoft paying one million USD per exploit, for 100 high-value exploits per
year (0.1bn USD), would decrease their yearly profit (2016: 16.8bn USD) by
only 0.6%, but make a _huge_ difference in the security of their software (100
high-value exploits per year is a lot). Stockholders would be foolish to vote
no if this were proposed.
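Those figures check out; as a quick sanity check of the arithmetic:

```python
# Sanity-check the figures above: 100 exploits/year at $1M each,
# against Microsoft's 2016 profit of $16.8bn USD.
bounty_cost = 100 * 1_000_000      # $0.1bn per year
profit_2016 = 16.8e9               # USD
share = bounty_cost / profit_2016
print(f"{share:.1%}")              # 0.6%
```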

~~~
wepple
> would decrease their yearly profit (2016: 16.8bn USD) by only 0.6%

> Stockholders would be foolish to vote no if this were proposed.

I doubt that an improvement to security of the type you describe here would
significantly improve MS's sales or reputation. So shareholders would just see
profits go down; who would vote for that?

------
PhasmaFelis
This article is fucking _weird_. It's like saying, with a straight face, that
the military needs to be competitive with professional-assassin pay scales if
they want to hire the best killers.

~~~
qbrass
I'd assume the military would have to pay more than the going rate to hire the
professional assassin. It's definitely the wrong time to cheap out.

------
such_a_casual
> It isn't forced on you in any way at all.

Uhm, what universe do you live in, exactly? Because in the universe where I
live, I have Apple or Android. Saying that Apple's bullshit isn't forced on
millions of users is like saying the Republican party's bullshit isn't pushed
on millions of citizens. I mean, argument aside, let's not do Apple's job for
them and paint the world as a capitalist utopia where consumers have all the
power and make all the decisions. Consumers do not make 99% of the decisions
related to their phone, including the decision to have one.

~~~
princekolt
I don't see how being born into a family of a certain political opinion and
being indoctrinated for years can be comparable to choosing a phone brand.

I think you are analyzing this question from a very US-centric perspective and
failing to realize that in other markets (like Europe and South America, the
ones I'm familiar with), Apple has a much smaller footprint, and does not
possess the "reach" of a political party _at all_.

Edit: typo

~~~
saurik
I think the idea is that if you want a high quality phone--and you essentially
need a high quality phone to be competitive in the workforce and even to
engage in many social activities and functions--you are going to end up buying
a device from one of a small handful of companies (Apple, Samsung, Microsoft,
Sony, HTC), _all of which are closed down and locked experiences_. This is a
problem that can likely only be solved by legislation (which the EU is
thankfully looking into, as the EU actually cares: I <3 the EU).

~~~
strcat
> all of which are closed down and locked experiences

That's not true. Many Android devices have an unlockable bootloader with
explicit support for building the Android Open Source Project (AOSP) for the
device. Nexus and Pixel devices are directly supported by AOSP without
modification. It's the same codebase used to build the stock OS for those
devices. The stock OS on those devices only adds Google Play apps to the source
tree, some of which replace AOSP apps. It doesn't contain any secret sauce
changes to AOSP. Android engineers use the same Nexus / Pixel devices that are
shipped to consumers as their development devices. You enable OEM unlocking
within the OS from the owner account and can then unlock the bootloader via
physical access using fastboot over USB, allowing images to be flashed via
fastboot. Serial debugging can be toggled on and done via an open source cable
design through the headphone port.
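For reference, the unlock flow described above looks roughly like this on a supported device; this is a sketch, and the exact commands vary by device generation and vendor:

```shell
# On the device: enable Developer options, then "OEM unlocking" and
# "USB debugging". Then, from a connected host:
adb reboot bootloader          # reboot into the fastboot interface
fastboot flashing unlock       # newer devices; older ones use "fastboot oem unlock"
fastboot flash boot boot.img   # flash images from your own AOSP build
fastboot flashing lock         # optionally re-lock; on devices that support it,
                               # the bootloader can then verify a third-party OS
```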

Other companies like Sony have emulated this by releasing official sources for
building AOSP for their unlockable devices rather than only making the
bootloader unlockable and leaving it up to the community to hack together
support. However, I think it's only Nexus / Pixel devices where you get
support for full verified boot with a third party OS (i.e. you can lock the
bootloader again, and have it verify the OS using a third party key) along
with the ability to toggle on serial debugging.

It's why the Android security research community is so active. You get the
same sources / build system, development devices (Nexus / Pixel), debugging
tools, etc. as an Android engineer working at Google. The only major thing you
don't get is access to their internal bug tracker. Hopefully they'll move
towards the Chromium model where most of that is public once embargoes are
over.

~~~
saurik
Putting aside for the moment your comments about AOSP (the timeline on the
slow closing down of the source branches is a great one, particularly as you
now watch more of the code move into Google Play services and the AOSP core
applications be slowly obsoleted), as the issue here isn't really about the
source code (and I don't think that's the point you are making anyway), I will
concentrate on looking at the status of Android as an open hardware platform.

I cover the Nexus devices when I give talks. While I haven't looked into the
Pixel yet (and I know that I need to, as the arguments I am about to make for
quality likely will have begun to change), I can tell you that effectively no
one buys the Nexus devices (the market share for them is ~1% with a 1% margin
of error), and they are not seen as high quality devices.

The reality of the Android market is that Samsung makes 98% of the profit, and
the vast majority of flagship devices are being made by the handful of
companies that put the most effort into locking down their devices. If you
want the "high-quality phone"--the one with the good screen and the good
camera and the fast CPU that can run all of the apps that you increasingly
need in this day and age--you are not buying one of the random open devices.

Again, though: I admit that Google's attempt to retake the flagship market and
compete with their hardware manufacturer partners with the Pixel (a device
which specifically looked at having stuff like a super high quality camera and
screen and such) might change things, but this is an incredibly new
development in the grand scheme of these things.

~~~
strcat
> The reality of the Android market is that Samsung makes 98% of the profit,
> and the vast majority of flagship devices are being made by the handful of
> companies that put the most effort into locking down their devices.

You know you can buy an international Galaxy S8 and unlock the bootloader
without any exploits, right? That part works the same way as Nexus / Pixel
devices on the S8 variants that can be unlocked (i.e. not US carrier versions,
etc.). The difference is that Samsung doesn't give you 100% of their OS
sources (especially not on the day they release each update) and doesn't
support verified boot for a third-party operating system. They also revoke the
warranty if you do it, but they _permit_ it. They explicitly implemented a
standard unlocking procedure for their consumer devices and it's not in any
way forbidden by the terms of use other than voiding the warranty, which is
sad but not exactly unfair. Their attempt to void the warranty is not valid
everywhere anyway; they still often need to honor standard warranty
requirements unless it's demonstrated that the user is at fault for what went
wrong.

