When I worked at Apple, the culture of secrecy was enforced primarily by fear of MarCom (Marketing and Communications) and Legal. It might sound ominous, but I always felt it was quite the opposite.
Instead of being treated like children, with hands tied and every interesting piece of technology locked behind layer upon layer of approval and access restrictions, probably 80% of all of Apple's source was available to anyone within the company. This promoted a sense of ownership, a feeling that no matter what your "product" was, we all worked on the entire OS, together. It also allowed for the free exchange of ideas and enabled engineers to be self-motivated when faced with a challenge. More than once, when I was stuck, I would dive into the code archives looking for some bit of code that someone else wrote that might do what I needed, and then I'd call them up and see if we couldn't work on a common solution.
All of this was possible because Apple made it very clear on day 1 that we were being entrusted with the secrecy that is such a core facet of who Apple is, and that if we violated that trust, there would be consequences.
I feel sorry for this "Low-Level" employee, because I suspect they are soon going to find out just how much of a price tag Apple puts on their trust and valued secrecy. On the bright side, they're probably too young to have any real estate holdings Apple could go after, and Chapter 7 gets wiped out after, what? 10 years?
Ah yes, Purplefish <3. A very magical tool for us OS peeps.
Not sure of the breadth of code in there though, as it seemed every org had its own way of managing its own source code. At the beginning, iPhone OS was basically a mono-repo, but I don't think you could find the iBoot source code in there. Plus, there was this whole hyperbolic "security clearance" leveling where only those working closely with Forstall and Jobs were given full access to UI code. Everyone else had some janky non-UI variant that worked well enough to test the lower-level code. To be fair, factory workers were known sources of leaks, so it was a smart move to hide upcoming features.
At every software job I've ever worked at, you were entrusted to take care of the company's IP. It's a lot of responsibility when you think about it, because so much of the world's most powerful IP sits on laptops all over the place. How many of those get stolen or hacked and still thankfully never have things leak?! I've always been impressed. This is another reason we get paid so well.
Send some of your thanks to Apple itself and its commitment to device encryption and privacy; most importantly, send some thanks to all the Silicon Valley Macadmins/Macops/CPE/IT teams. No doubt their work has prevented more leaks than we'll ever know!
>On the bright side, they're probably too young to have any real estate holdings Apple could go after, and Chapter 7 gets wiped out after, what? 10 years?
It is hard to imagine that one would face the penalty of bankruptcy instead of going to jail for this.
I actually value this working model a lot. The comment below suggests this is a culture of fear; I'd call it a culture of utmost trust instead. (Well, he did say he hates Apple, and haters are going to hate whether it's rational or not.)
> It is hard to imagine one could get penalty of bankruptcy instead of going into Jail for this.
IANAL, but I believe for a charge of grand larceny you'd have to show intent to steal, and the "Low-Level" employee seems to have a pretty legitimate case that this was negligence, not theft. On the other hand, I think it's a pretty open-and-shut case for Apple to sue for damages, and I'm not sure that ruining a young software engineer's financial life for a decade is less of a deterrent than jail.
The negligence was only in losing control of the code. I guess successfully maintaining tight control of the code would be a mitigating factor (or, more likely, lead to not getting caught), but taking the code from Apple for the purpose of sharing it with these researchers in the first place is clearly premeditated (so, showing intent) theft.
Yes, it's stealing after all. Taking screenshots with one's phone is still a way to steal the code, although it would be tedious for a large codebase. But if that's the only method left, an active adversary will use it. I fear there is no way around that, even with logging enabled (there will be thousands of accesses to the codebase per day). If the proposal is to lock down the repository and require approval, that will make development very slow. Not even in banking would a manager want to ask "why" all the time...
I have no idea what they might do, but it's not a law of nature that they have to crack down on access. They have tens of thousands of employees, a single breach isn't evidence that a trusting, open environment doesn't work.
jail is pretty bad, but it's not infinitely bad. that is, there almost certainly exists a sum $X for which you would be willing to go to jail for Y duration of time.
if you're struggling to stay afloat and have no credit/assets, a short time in jail is way worse than bankruptcy. on the other hand, if you've worked as a SWE for a few years, losing all your money and having terrible credit for years is pretty awful. you're broke, so you can't buy anything outright, and your losses are compounded by the fact that you pay much more interest on things you buy on credit (if you can even get financing at all). it's not hard to see that this could be worse than a short time in jail. it all depends on the length of the sentence and your total losses from bankruptcy.
You do realize that there is life without a good credit report? You don't need to borrow money from somebody else or a bank to live a good life. Most of the world lives without an obsession with credit reports and borrowing.
yes but (at least in the US), your credit score affects stuff other than borrowing. often landlords will perform a credit check before approving your application; even employers will sometimes check an applicant's credit.
also keep in mind we are talking about bad credit and all your assets seized by creditors. i'm not saying a bad credit score is literal hell on earth, but if you have no money and no credit you are gonna have a really bad time for a while.
It would be a civil not criminal case if it ever came to that and they would have to prove actual damages. Pursuing it by way of setting an example to employees could very well backfire.
>probably 80% of all of Apple's source was available to anyone within the company.
This hasn't been true for quite a while.
I worked on developer tools a few years ago and had a lot of trouble finding internal code written in Swift to test my tools on. This was when Apple had a few dozen (that we knew of) reasonably mature Swift codebases.
Suppose you were an author and had recently completed a manuscript for a new book. You hire a company to help you move house and one of them copies the manuscript and publishes it on the internet. Or maybe some other personal, private and damaging records or material, such as images of you naked or making love, or compromising images of your son or daughter. If you went to the police, would you be fine if they refused to investigate because "this isn't some police state and we're not the Stasi"? Where would you draw the line on the responsibilities of people you pay to protect your privacy?
The only thing Apple has to worry about with the release of any source code is if that code does something harmful to their customers. Apple should open source all their software.
>>I feel sorry for this "Low-Level" employee, because I suspect they are soon going to find out just how much of a price tag Apple puts on their trust and valued secrecy
Maybe there's no better way, but this is a bad security policy. With thousands of employees having such access, odds are that someone will do it, whether because they can, for bragging rights, ideology, or $$ from a competitor, the NSA/FSB, etc.
Punishment is a possible deterrent, but "I will not get caught" thinking undercuts it.
Maybe this will sound silly and ridiculous, but I’m going to ask regardless.
Why doesn't Apple just open-source iBoot? It could be under a very restrictive license; one that doesn't allow the code to be redistributed or used by anyone in any project. The assets and trademarks would also be protected. However, it would let people study the code — yes, some people will use that knowledge to make jailbreaks, but you know some will tell Apple.
Apple has open-sourced projects like ResearchKit before.
I guess my hope is the iOS community would be able to take a more direct approach to contributing to platform security that we depend on. How much code has been smuggled out? Probably way more than this if it’s so easy.
> It could be under a very restrictive license; one that doesn’t allow the code to be redistributed or used by anyone in any project.
Open source means allowing the usage of code. Putting code under a license that does not allow any kind of usage is definitely not open source.
And then why should Apple allow people to view their proprietary code when it has no intention to allow 3rd party usage? Chinese firms are already aping Apple's design. Everyone knows what they will do if they can see all the code.
You’re confusing open source with free software. You definitely can have open source code that you can’t use (legally). Taking the code off GitHub and using it without Apple’s permission would be theft.
Open Source only means the code is available to be seen by the general public. Free Software means you have specific rights, like the right to view, modify, and distribute the code.
Code that can be viewed but not used would not be in compliance with an OSI-approved license. I suppose you can call anything open source, but OSI is the arbiter in the minds of most.
Obviously they cannot stop you from misusing a term. I used to believe, like you, that the definition was broader. But the people who coined it had a narrower, free-software-style definition in mind.
>The OSI Certified mark is OSI's way of certifying that the license under which the software is distributed conforms to the OSD; the generic term "Open Source" cannot provide that assurance, but we still encourage use of the term "Open Source" to mean conformance to the OSD.
As I wrote, no one would likely try to legally prevent you from calling any proprietary license open source if you want to. It just won't actually be an open source license in the minds of most if it doesn't comply with the OSI guidelines.
Not up to me to say what is open source. If something was compliant with OSI guidelines, it’s probably fair to call it open source. Also probably pretty ill-advised to have a one-off license no one else uses.
The companies I have experience with had huge monorepos. It's not unimaginable that everyone inside the company has access to some important code; what they usually don't have access to is customer data, which has a few more checkpoints. They mostly considered the code a commodity: you could take it and still couldn't compete with them. In fact, this is one of the core reasons some companies are hesitant about going all-remote. No one wants their questionable code to be leaked.
No amount of investment in security software protects companies from social engineering. Eventually, someone needs to have access to core IP and that someone needs to be trusted to do the right thing.
If a lowly engineer can leak this to the public, an intelligence agency hell-bent on obtaining source code can certainly have full access to it. I suspect the top dogs (NSA, CIA, Mossad, FSB, China) have full access to whatever source code they deem necessary, with or without the company's consent.
This article doesn't explain why a "low-level" employee would have access to this code to begin with. Apple certainly doesn't give everyone access to everything. What role were they actually in?
Spy agencies in Russia, China, and the US will have a copy of most of Apple's source code anyway. It's trivial to get a number of employees working for any of these companies. They must all be fully infiltrated, probably at high levels too, which is pretty neat.
And kids, this is why security by obscurity is such an awful, awful idea.
If it's true their products' security "doesn’t depend on the secrecy of our source code" then there shouldn't be any issues in open-sourcing it already. Open source, not making it free software, which I suspect they'd never do anyway. Even having some form of assurance they're not doing anything nefarious behind the scenes and a proper bug bounty program would be plenty.
> And kids, this is why security by obscurity is such an awful, awful idea.
This is probably the most misunderstood security concept on the net. What it is supposed to mean is that security should not depend on obscurity. For some reason people seem to take it as meaning obscurity should not be part of your defense at all.
In real life, obscurity is useful in many cases because it slows down attackers. For example, let's say you want to get to my credit card database. What you need to do is:
1. Break into one of my public facing servers.
2. From that server, find a way through the firewall that separates the network the credit card handling system is on from the rest of my company's networks.
3. Once through the firewall, find an exploitable flaw in the design or implementation of the credit card storage system.
If you can get a copy of the source code and configuration of everything except for passwords (i.e., there is no obscurity in my system), then you can set up a clone of my system in your lab and work on finding all the holes you need at your leisure and without drawing my attention.
Once you've got exploits for a public facing server, the firewall, and the credit card storage system, you might be able to do a quick attack and get in and get the credit cards before I can do anything.
If I've included obscurity it is much less likely that you can do a quick raid like that. You'll have to plod through the layers of my system step by step, working ON my systems, giving me a much better chance of noticing your attack and stopping it before you get to and find a hole in the final layer where the credit cards are stored.
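To put rough numbers on that last point (a back-of-the-envelope sketch of my own; the alert probability and probe counts are made-up assumptions, not measurements): if each probe against a live system independently trips an alert with some small probability, while rehearsal against an offline clone trips none, then forcing the attacker to work online changes the defender's odds dramatically.

    # Illustrative sketch only: how much forcing an attacker onto live systems
    # helps detection, assuming each online probe independently trips an alert
    # with probability p_alert (a made-up parameter for illustration).

    def detection_probability(online_probes: int, p_alert: float) -> float:
        """Chance that at least one of the online probes gets noticed."""
        return 1.0 - (1.0 - p_alert) ** online_probes

    # No obscurity: the attacker rehearses everything against a lab clone and
    # only needs a handful of live probes for the final raid.
    print(detection_probability(online_probes=5, p_alert=0.02))    # ~0.10

    # With obscurity: recon and exploit development all happen against the
    # live systems, so the number of observable probes is far higher.
    print(detection_probability(online_probes=500, p_alert=0.02))  # ~1.00

The exact numbers don't matter; the point is that the detection chance compounds with every probe the attacker can't do offline.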
I was with you up until the second “paragraph.” Just because something doesn’t depend on security through obscurity doesn’t mean it can’t be closed source