
Bigger tech companies not only do not host important private source code on GitHub but won't even host it on outsourced physical servers. Where I've worked (incl. well-known names), information that we wanted spread to the world could be hosted on trusted external systems (trust still mattered to keep it from being "edited"), but non-public info was always hosted inside a physical building that we owned, watched 24/7 by human guards we employed.



MS employee here:

Long before the acquisition, we'd been hosting important stuff in private GitHub repositories, including having strategic discussions in those private repositories.

We've done a lot of that stuff in public too. Some might say a bit too much, given that we've had things leaked and/or misinterpreted w.r.t. product direction in the past.

I still agree with your point, but I believe more of this sort of thing is happening. Lots of stuff that has no real reason to be private is just being open-sourced by default.


> Long before the acquisition, we'd been hosting important stuff in private GitHub repositories, including having strategic discussions in those private repositories.

Wow! I am very surprised by that. Is that an officially allowed policy? Or is it something that is "don't ask for permission, ask for forgiveness"?


Yes, it's absolutely an allowed policy. When we made .NET (Core) open source, we meant it. We still use email like any other org, but whenever we're working on our product we try to keep discussions on GitHub. It's also made collaboration with other teams far, far easier.


How so? What was going to be controlled, by whom?


I assumed that Microsoft has security policies to ensure that all confidential information (e.g. non-open-source code and strategic discussions) is stored on infrastructure controlled by Microsoft.

The company I work at is very careful about keeping our intellectual property on our infrastructure, and I am surprised that a larger company like Microsoft doesn't have similar policies.


Microsoft aims to make most of its money in the immediate future by convincing every major business in the world to let MS host that company's email, internal documents, spreadsheets and PowerPoints on Microsoft's Office 365 servers.

It would be highly contradictory for MS to take the position, as a matter of policy, that it is too risky for them to ever place confidential business data onto a third party cloud-hosted SaaS system, because that is precisely the risk they are asking every one of their customers to take.

Similarly, if you have concerns about putting your company's source code into GitHub now, you should be equally concerned about putting your company's prerelease annual report on Office 365 OneDrive.


My company is concerned about that as well. We don’t use any cloud storage from Microsoft or anyone else, and we self-host Exchange and SharePoint servers.

That is a good point though; it’s becoming more and more inconvenient for a company to self-host everything. Microsoft does stand to benefit from everyone becoming more accustomed to relying on 3rd-party services in the cloud.


Serious question: do you think your company has better security than the Azure cloud? Or is it a trust issue with the cloud vendors themselves?


.... and if you don't trust Microsoft: Why use Exchange and such? :-)


Better is relative - especially in one metric: many eggs in one basket make that basket exponentially more attractive to evil actors. Bigger attack surface and whatnot...


Flipside (pro-cloud pov): if the work to protect one egg applies to all eggs, then cloud providers will always hypothetically be able to spend more on security due to economies of scale

Essentially, choose your vulnerability: cloud provider single point of failure or in-house lack of resources


Yup. It all boils down to a business decision, the technical merits are not prevalent for either case.


Maybe infosec drove the decision to purchase GitHub because that was the easier way to rein in the data leak. =)


> I assumed that Microsoft has security policies to ensure that all confidential information (e.g. non-open-source code and strategic discussions) is stored on infrastructure controlled by Microsoft.

It depends on how important the code is.

I don't imagine MS will ever move Office or Windows to external servers, but a lot of other stuff is fair game.

There is always a security/convenience trade off.


I'm almost sure you mean private repos on github.com, but just wanted to confirm it. You don't mean corp GitHub, right?


Yep.


Not entirely true. Microsoft puts (almost? yet to find anything that isn't) all our code on VSTS, which is accessible remotely, without a VPN. I've checked in a (very, very minor docs) fix to the Windows code base from my Android phone over LTE.


That's amazing. Contrast that with my friend who works on code at Apple that's so guarded that he can't even access it from Apple HQ. He has to travel to his office in an unmarked Apple bldg several miles from HQ (in an unmarked van) and access the code from inside the bldg. Any attempt to work on his code outside that bldg, on the Apple employee shuttle for example, will result in immediate firing with possible criminal charges. Admittedly, that's not the usual Apple employee, but the contrast between that and Microsoft's code, which may as well be hosted on a set of Chinese night market DVDs, is LOL-worthy.


> LOL-worthy

And yet, which company released an OS update with an open root account with no password, patched it in a way that broke file sharing, then a couple of months later released an update with another password-bypass bug? Hobbling people with security theatre doesn't beget good or secure code.


OpSec and AppSec are usually handled by different teams :)


This sounds proportionate if a state might go after the code. For example phone encryption might be a big prize for the Chinese or even American government.

Microsoft actually hands over OS code to states regularly for certain contracts, so I figure they don't need to protect most of their code like that.


I disagree. Phone encryption should ideally be open-sourceable, and its security should rely as much as possible on a device-specific key.

I think this makes more sense for a secret project (e.g. the next iPhone), but honestly, as a security person, it seems overkill for anything outside national-security-sensitive code, like state-sponsored malware.

I also find it strange that the code is apparently somehow accessible outside that building (see the comment about firing). If this were anything beyond security theatre, it'd be on an airgapped network and that wouldn't even be a concern (as the employee wouldn't be able to access the code from their laptop). Seems excessive for very little gain.


I wouldn't take SiVal's comment as ground truth. I think it conflates rules for general employees with rules for his friend, and mixes it with a dash of unfounded hyperbole (criminal charges?).


The code isn't available outside the building unless someone takes it outside, which they make clear is not only a fireable offense but might qualify as criminal: if you're in crunch mode, don't be tempted to take a bit of work with you to get a bit more done on the long shuttle ride.


Fair enough. I obviously don't know your friend or his project, so I can't with certainty say anything about his situation. I viewed your post through a critical lens because the details given didn't match my experience or the experience of any of my old colleagues, and you are a second-hand witness.


I am going to agree with what doctorsher said in response to your comment. I can confirm that what SiVal said is not a typical experience in Apple.


For reading, I agree, but if you're making changes it is a different story.


This has to be something very mission critical like phone encryption. No way this is the norm even at Apple.


I thought it was widely known Apple was extremely secretive, compared to the broader tech industry at the very least.


> unmarked van

You may think it’s unmarked, but if you know how to spot them they’re very easy to pick out.


If I was an intelligence agency, I would do the trivially obvious thing and only use "unmarked cars" when I didn't care about being spotted, and an actual nondescript vehicle the rest of the time.


What's the difference between an "unmarked car" and a "nondescript vehicle" ?

You think the CIA would do their clandestine work in cars labeled "CIA"?


> What's the difference between an "unmarked car" and a "nondescript vehicle" ?

Unmarked police cars often have multiple radio antennae, flexible lights, and even government plates; they simply lack explicit police markings and light bars.


Surely an unmarked van owned by Apple would have none of that?


The point is that the "unmarked" vehicle sticks out as unusual even without having "Apple" or "Police" emblazoned on its side.


Yeah, which is hosted on Azure, in data centers that Microsoft owns and employs guards for, and secured behind our standard corporate authentication. :) (Source: I work at Microsoft, near the VSTS team.)


> Yeah, which is hosted on Azure, in data centers that Microsoft owns and employs guards for, and secured behind our standard corporate authentication. :)

Way back when, Microsoft used to host a bunch of auth servers for banks. A friend of mine mentioned an armed guard in front of the data center for that particular service.

I've worked on teams at MS where there was an (unarmed) guard checking everyone who got off the elevator, but before I joined MS I was once left alone in a room full of computers open to the Windows source tree, wearing my "do not leave guest unattended" badge.

Mileage might vary and all that.


Yeah, all I was really saying was that the grandparent's comment and the parent's comment weren't in opposition.

Microsoft owns the data center the code lives in and certainly takes care of physical security.


The only thing a VPN would do in this case is hide that you're even accessing VSTS and provide modest protection against MitM attacks. You still have to use 2FA to log in, and the code you access is still logged.


VPN puts you on corpnet. And yes, I'm well familiar with our various account protection techniques (I work on the token server) - I was calling out that some companies trust their systems enough to make them remotely accessible, not saying it's a bad thing that I could be productive on the bus ride home.


GitHub offers an enterprise version, and I know of at least one big company that hosts its code there.


Note that this doesn't preclude the possibility of an on-prem GitHub Enterprise setup.



