
How a Low-Level Apple Employee Leaked Some of the iPhone's Most Sensitive Code - tomduncalf
https://motherboard.vice.com/amp/en_us/article/xw5yd7/how-iphone-iboot-source-code-leaked-on-github
======
jballanc
When I worked at Apple, the culture of secrecy was enforced primarily by fear
of MarCom (Marketing and Communications) and Legal. It might sound ominous,
but I always felt it was quite the opposite.

Instead of being treated like children, with hands tied and every interesting
piece of technology locked behind layer upon layer of approval and access
restrictions, probably 80% of all of Apple's source was available to anyone
within the company. This promoted a sense of ownership, a feeling that no
matter what your "product" was, we all worked on the entire OS, together. It
also allowed for the free exchange of ideas and enabled engineers to be self
motivated when faced with a challenge. More than once, when I was stuck, I
would dive into the code archives and look to find some bit of code that
someone else wrote that might do what I needed, and then I'd call them up and
see if we couldn't work on a common solution.

All of this was possible because Apple made it very clear on day 1 that we
were being entrusted with the secrecy that is such a core facet of who Apple
is, and that if we violated that trust, there _would_ be consequences.

I feel sorry for this "Low-Level" employee, because I suspect they are soon
going to find out just how much of a price tag Apple puts on their trust and
valued secrecy. On the bright side, they're probably too young to have any
real estate holdings Apple could go after, and a Chapter 7 bankruptcy falls
off your credit report after, what, 10 years?

~~~
ksec
>On the bright side, they're probably too young to have any real estate
holdings Apple could go after, and Chapter 7 gets wiped out after, what? 10
years?

It is hard to imagine one could get a penalty of bankruptcy instead of going
to jail for this.

I actually value this working model a lot. The comment below suggests this is
a culture of fear; instead, I'd call it a culture of utmost trust. (Well, he
did say he hates Apple, and haters are going to hate whether it's rational or
not.)

~~~
jballanc
> It is hard to imagine one could get penalty of bankruptcy instead of going
> into Jail for this.

IANAL, but I believe for a charge of grand larceny you'd have to show intent
to steal, and the "Low-Level" employee seems to have a pretty legitimate case
that this was negligence, not theft. On the other hand, I think it's a pretty
straightforward, open-and-shut case for Apple to sue for damages, and I'm not
sure that ruining a young software engineer's financial life for a decade is
less of a deterrent than jail.

~~~
mseebach
The negligence was only in losing control of the code. I guess successfully
maintaining tight control of the code would be a mitigating factor (or, more
likely, lead to not getting caught), but taking the code from Apple for the
purpose of sharing it with these researchers in the first place is clearly
premeditated (so, showing intent) theft.

~~~
yeukhon
Yes, it’s stealing after all. Taking screenshots with one’s phone is still a
way to steal the code, although the effort would be tedious for a large
codebase. But for an active adversary, if that’s the only method, he/she would
use it. I fear there is no way around that, even with logging enabled (there
will be thousands of accesses to the codebase per day). And if the proposal is
to lock down the repository and require approval, that will make development
very slow. Not even in banking would a manager want to ask “why” all the
time...

~~~
mseebach
I have no idea what they might do, but it's not a law of nature that they have
to crack down on access. They have tens of thousands of employees, a single
breach isn't evidence that a trusting, open environment doesn't work.

------
ancarda
Maybe this will sound silly and ridiculous, but I’m going to ask regardless.

Why doesn’t Apple just open-source iBoot? It could be under a very restrictive
license; one that doesn’t allow the code to be redistributed or used by anyone
in any project. The assets and trademarks would also be protected. However, it
would let people study the code. Yes, some people will use that knowledge to
make jailbreaks, but some will report what they find to Apple.

Apple has opened projects like ResearchKit before.

I guess my hope is the iOS community would be able to take a more direct
approach to contributing to platform security that we depend on. How much code
has been smuggled out? Probably way more than this if it’s so easy.

~~~
quantummkv
> It could be under a very restrictive license; one that doesn’t allow the
> code to be redistributed or used by anyone in any project.

Open source means allowing the usage of code. Putting code under a license
that does not allow any kind of usage is definitely not open source. And then
why should Apple allow people to view their proprietary code when it has no
intention to allow 3rd party usage? Chinese firms are already aping Apple's
design. Everyone knows what they will do if they can see all the code.

~~~
ancarda
You’re confusing open source with free software. You definitely can have open
source code that you can’t use (legally). Taking the code off GitHub and using
it without Apple’s permission would be theft.

Open Source only means the code is available to be seen by the general public.
Free Software means you have specific rights, like the right to view, modify,
and distribute the code.

A possible license could be APSL:
[https://en.wikipedia.org/wiki/Apple_Public_Source_License](https://en.wikipedia.org/wiki/Apple_Public_Source_License)

~~~
ghaff
Code that can be viewed but not used would not be in compliance with an OSI-
approved license. I suppose you can call anything open source, but OSI is the
arbiter in the minds of most.

~~~
atonse
Sure. If you use an OSI license maybe. But I seriously doubt they can stop me
from using the phrase open source if my code’s source is open.

~~~
rileymat2
Obviously they cannot stop you from misusing a term. I used to believe, as you
do, that the definition was broader. But the people who coined it had a
narrower, free-software-like definition in mind.

[https://en.m.wikipedia.org/wiki/Open-source_software](https://en.m.wikipedia.org/wiki/Open-source_software)

~~~
ryanlol
The fight over the meaning of "Open source" was lost by OSI almost two decades
ago.

Here's a 2002 version of their website essentially admitting as much:
[https://web.archive.org/web/20021001164015/http://www.openso...](https://web.archive.org/web/20021001164015/http://www.opensource.org/docs/history.php)

>The OSI Certified mark is OSI's way of certifying that the license under
which the software is distributed conforms to the OSD; _the generic term "Open
Source" cannot provide that assurance_, but we still encourage use of the term
"Open Source" to mean conformance to the OSD.

------
xstartup
The companies I have experience with had huge monorepos. It's not unimaginable
that everyone inside the company has access to some important code; what they
usually don't have access to is customers' data, which has a few more
checkpoints. They mostly considered the code a commodity: you could take it
and still couldn't compete with them. In fact, this is one of the core reasons
some companies are hesitant to go fully remote. No one wants their
questionable code to be leaked.

~~~
TomMarius
What difference does being remote make?

~~~
bufferoverflow
Jurisdiction. It's tough to prosecute some developer in India or Ukraine from
the US.

~~~
TomMarius
Ah, okay, good point. You have the option of not employing foreigners though.

~~~
walshemj
And I would assume that, as in the UK, collective punishment is legal in the
States.

Refusing to sponsor any H-1Bs from that country would be one penalty.

~~~
thecodeboy
Too harsh a penalty I'd say.

~~~
walshemj
Well, if you harbour cyber criminals and refuse to extradite, they made their
bed and they have to lie in it.

------
thecodeboy
I found a pony in Apple’s iBoot source code.

[https://medium.com/aishik/i-found-a-pony-in-apples-iboot-sou...](https://medium.com/aishik/i-found-a-pony-in-apples-iboot-source-code-def8cef10b24)

~~~
digi_owl
One "playful" way to tell if source has been copied or recreated via
clean-room means.

~~~
sterlind
The pony is bracketed by #ifdef DEBUG. I think the dev just liked ponies.

------
mankash666
No amount of investment in security software protects companies from social
engineering. Eventually, someone needs to have access to core IP and that
someone needs to be trusted to do the right thing.

If a lowly engineer can leak this to the public, an intelligence agency
hell-bent on obtaining source code can certainly gain full access to it. I
suspect the top dogs (NSA, CIA, Mossad, FSB, China) have full access to any
source code they deem necessary, with or without the company's consent.

------
eridius
This article doesn't explain why a "low-level" employee would have access to
this code to begin with. Apple certainly doesn't give everyone access to
everything. What role were they actually in?

~~~
IMcD23
From what I heard, this employee was an intern working on the team that owned
this code.

------
alborzmassah
Allowing people to send code to friends within jailbreak community and
expecting no leak? Did I read that correctly?

------
staunch
Spy agencies in Russia, China, and the US will have a copy of most of Apple's
source code anyway. It's trivial for them to get a number of their people
employed at any of these companies. They must all be fully infiltrated,
probably at high levels too, which is pretty neat.

~~~
scarface74
Sophisticated spy agencies don't need the source code. They can disassemble
it.

~~~
TomMarius
That won't get you all the FIXME and TODO comments.

------
kemonocode
And kids, this is why security by obscurity is such an awful, awful idea.

If it's true that their products' security "doesn’t depend on the secrecy of
our source code", then there shouldn't be any issue in open-sourcing it
already. Open source, mind you, not free software, which I suspect they'd
never do anyway. Even having some form of assurance that they're not doing
anything nefarious behind the scenes, plus a proper bug bounty program, would
be plenty.

~~~
tzs
> And kids, this is why security by obscurity is such an awful, awful idea.

This is probably the most misunderstood security concept on the net. What it
is supposed to mean is that security should not _depend_ on obscurity. For
some reason people seem to take it as meaning obscurity should not be part of
your defense at all.

In real life, obscurity is useful in many cases because it slows down
attackers. For example, let's say you want to get to my credit card database.
What you need to do is:

1. Break into one of my public facing servers.

2. From that server, find a way through the firewall that separates the
network the credit card handling system is on from the rest of my company's
networks.

3. Once through the firewall, find an exploitable flaw in the design or
implementation of the credit card storage system.

If you can get a copy of the source code and configuration of everything
except for passwords (i.e., there is no obscurity in my system), then you can
set up a clone of my system in your lab and work on finding all the holes you
need at your leisure and without drawing my attention.

Once you've got exploits for a public facing server, the firewall, and the
credit card storage system, you might be able to do a quick attack and get in
and get the credit cards before I can do anything.

If I've included obscurity, it is much less likely that you can do a quick
raid like that. You'll have to plod through the layers of my system step by
step, working ON my systems, giving me a much better chance of noticing your
attack and stopping it before you reach, and find a hole in, the final layer
where the credit cards are stored.

