Retaining older employees does little to prevent that, because they typically don't (or can't) review much of what a new team, sometimes a whole department, is doing once it joins the project with an agenda of its own.
It's also worth noting that the downtime and issues caused by updates to self-hosted GitHub are significant. I've experienced them, and the main difference is really that you get to schedule the risk rather than accept it at arbitrary times, which has upsides but comes at significant cost too.
More information is displayed in a more concise way since the update.
Seriously, just consider self-hosting rather than 'centralizing everything'.
Try working within a deadline; no engineer worth their salt would ever take this advice.
Self-hosting is definitely a viable solution.
Same goes for the question of where to host it physically. It seems unlikely your physical server will have better uptime than a virtual server in the cloud.
If all I need is a git repo with some tools, why pay someone to mess it up when I can mess it up for free?
The moment you place it in someone else's work queue you are tied to them... and they might not care about your project's deadlines. Just like GitHub.
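For scale, the "server" side of a minimal self-hosted setup really is just a bare repository on a box you can reach. A sketch, using a local path for illustration (over SSH the remote URL would look like `git@myserver:/srv/git/demo.git`, hostname assumed):

```shell
# Minimal self-hosted git "server": just a bare repository.
git init --bare /tmp/demo.git

# Clients then use it like any other remote:
git clone /tmp/demo.git /tmp/demo-work
cd /tmp/demo-work
echo "hello" > README
git add README
git -c user.name=demo -c user.email=demo@example.com commit -m "initial commit"
git push origin HEAD
```

No web UI, no issue tracker, but for "a git repo with some tools" this is the whole footprint.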
Because it isn't free. Your time is a huge cost.
A senior dev who spends even 10 hrs on standing up a git server has blown through years' worth of GitHub costs, and that's assuming you're even actually using the paid service.
Factor in the extreme security requirements of a code server, including needing to update dependencies daily, and you're spending far more time self-hosting with riskier results.
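The rough math behind "years' worth" can be made concrete. The hourly rate and per-seat price below are assumed figures for illustration, not quoted prices:

```shell
# Back-of-envelope only: $100/hr senior rate and $4/user/month for a
# paid GitHub seat are assumptions, not real quotes.
HOURLY=100
HOURS=10
SEAT_PER_MONTH=4

SETUP_COST=$((HOURLY * HOURS))                 # dollars of dev time
USER_MONTHS=$((SETUP_COST / SEAT_PER_MONTH))   # equivalent paid seat-months
echo "Setup cost: \$$SETUP_COST ~= $USER_MONTHS user-months of GitHub"
```

Under those assumptions, ten hours of setup equals roughly 250 user-months of a paid seat, i.e. about 20 years for a single user, before counting any ongoing maintenance.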
Keeping it secure is no small thing, especially if you want to permit access from arbitrary IPs on the Internet (rather than using a VPN, say). GitHub does this, and presumably they have solutions in place for everything from intrusion-detection to DDoS protection.
GitHub employs people to take care of server failover and data backups. You could spend your own time building your own solutions here, but they're unlikely to be as good as GitHub's. Your solution is guaranteed to be less well tested.
And that's assuming you even have a server room in the first place. You could run your own Git in the cloud, of course, but you're not really 'running your own' if you do that. GitHub take care of the server question (apparently they use a physical-server provider called Carpathia), and because git always needs to be available but is only used intermittently, the amount they charge you is probably less than the cost of running a dedicated server for the purpose.
And all that is assuming that a self-hosted GitLab is just as good as GitHub from the developer's point of view. It may or may not really matter, but GitHub is probably the more polished and feature-rich service.
Building a competitor to GitHub is possible but not trivial; see SourceHut. (We've been talking about GitHub, but of course it's not the only Git provider.)
I can see only a few situations where it makes good sense to run your own Git/GitLab:
1. Your Internet connection is slow and/or unreliable
2. There are extraordinary safety/security concerns associated with your source-code (military avionics code, say) so you want to run Git in an isolated network (no Internet connectivity at all)
3. Related to point 2: You don't want your organisation's data to reside in the USA. (To my knowledge GitHub don't offer any choice about this, but I could be mistaken.)
For the average developer though, I don't see much upside. Having more control isn't a compelling advantage; it's another way of saying you have more obligations.
Not the biggest fan of GitLab's UI, but I got used to it quickly.
It gets more attractive now that GitHub is changing theirs. Might as well adapt to GitLab now.
It’s the code review tooling, the artifact storage & the deployment pipelines.
A distributed version of those would be awesome...
I'm not using GH Actions, only Cirrus, Travis and AppVeyor, which can also be triggered manually if the API service is down.
Can someone ELI5 the security implications of Ruby on Rails?
Interestingly -- the rails developers decided to put in a really horrendous hack to mitigate the common paths through which this design-flaw might lead to unexpected security outcomes ...
In a way, one could argue that the willingness to put in a horrendous hack to 'mitigate' a security flaw provides an example which demonstrates some amount of 'security reasonableness' in rails ...
In reality, though, I think this example serves more as evidence that Rails is deeply flawed and very unlikely to be secure in practice -- for reasons of design complexity alone.
Some of our private repos are Mercurial so it would be nice to have both git and hg repos on a single platform.
Edit: Corrected a mistake, but the time calculated is correct.
Yes I realize the irony of pushing a project hosted on github when it's down.
Having an electronics engineering background, my personal pet theory is that the convoluted layers upon layers of automagic container management, load balancing and scaling mechanisms act like nested control loops with respect to each other; a sudden load increase (e.g. the Monday morning spike) produces a step response, and the system starts overshooting/oscillating. Just a thought, though.
I hate it when sites use their local time, instead of just using UTC everywhere. I got accustomed to using UTC ~25 years ago.
At worst, it'll take you two seconds to open up a terminal and run
$ date -u
Sadly I don't have a real solution for you but at least I have an option that means it's all consistent! :-D
Is it worth backing up my GitHub repos somewhere else? What do other people use as an alternative source of truth for their code?
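One common approach is a `--mirror` clone refreshed on a schedule. A sketch, using a local bare repo as a stand-in for the real GitHub URL (which would be something like `https://github.com/you/repo.git`):

```shell
# Sketch of an off-GitHub backup. /tmp/source.git stands in for the
# upstream GitHub repository in this illustration.
SRC=/tmp/source.git
BACKUP=/tmp/backup.git

git init --bare "$SRC"            # stand-in for the upstream repo

# Initial backup: --mirror copies all refs (branches, tags, notes).
git clone --mirror "$SRC" "$BACKUP"

# Subsequent runs (e.g. from a daily cron job) refresh everything:
git --git-dir="$BACKUP" remote update --prune
```

Note this only covers the git data itself; issues, PRs and wiki content live outside the repository and need a separate export.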
At this point, I consider it the least reliable hosting service available. But, to be fair, it's also the one with the most traffic.