How to set up your own private Git server on Linux (intranation.com)
124 points by urbanmbeatz | 65 comments

The instructions for adding a public key to the authorized_keys file are a bit reckless, as they will wipe out any others that exist. Although not universal (it's not included with Mac OS X, for example), the ssh-copy-id command is much preferable. At its most basic:

  ssh-copy-id example.com
Or, to specify a particular key and remote user:

  ssh-copy-id -i ~/.ssh/id_rsa.pub bob@example.com
This will properly append the key, with the bonus of securing the permissions of the remote user's home, ~/.ssh, and ~/.ssh/authorized_keys.


  ssh bob@example.com 'cat >> .ssh/authorized_keys' < ~/.ssh/id_rsa.pub

This is not an idempotent operation, and I would not recommend this for anything besides a one-off event.

Interesting, I hadn't thought about it since, for my usage, setting up keys is pretty much always a one-off event (and pretty rare on top of that). What are the consequences of dropping your key into authorized_keys files multiple times? Tried it out on a Debian box, and didn't see any issues other than the obvious clutter.

"looks bad" :-)

But it's a bad habit to get into. Better yet, get used to thinking in idempotent operations when doing systems work.
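For what it's worth, a hedged sketch of an idempotent append (bob@example.com and the key path are placeholders, and this assumes ~/.ssh already exists on the remote end): only add the key if that exact line is absent.

```shell
# Read the local public key, then append it remotely only if missing.
KEY="$(cat ~/.ssh/id_rsa.pub)"
ssh bob@example.com "grep -qxF '$KEY' ~/.ssh/authorized_keys 2>/dev/null \
  || echo '$KEY' >> ~/.ssh/authorized_keys"
```

Running it twice leaves exactly one copy of the key in authorized_keys, which is the whole point.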

Fair point, I'll update my instructions with a warning, and a version of your commands that works for OS X as well (OS X lacks ssh-copy-id). The instructions are intended for new server builds (hence installing Git etc.).

I decided to move all my private repositories to my own server.

When you do this, make sure that the server has continuous backups. Also, make sure you still have an offsite backup.

Once you figure out what these things are worth, you may realize that you should probably just keep paying Github.

The backups aren't as important, because each git repo is a full clone with complete history. If your local repo is destroyed, you still have the server copy. If your server blows up, you still have the local copy.

There are many other good reasons for a service like Github, like the excellent collaboration features, the really good repository and history browser or the good bugtracker.

If you don't need those (small team, working alone) but are concerned about uploading your intellectual property to a third party server in a potentially foreign country (depending on your location), then quickly setting up gitosis / gitweb / redmine might be enough for you.

In my personal case, I would really love to use github even for my small team, but I'm too concerned about the legal issues to go ahead with that (and the local installation is plain too expensive).

What legal issues/other issues from uploading your code to GitHub are you worried about?

I can't imagine that GitHub would steal your code. They've never heard of you, they have no reason to believe your code is worth anything to them, and one "I have pretty damn good evidence that GitHub stole my code" could ruin their entire business.

You mentioned legal issues. Are you afraid someone's going to ... subpoena your code or something? Because if that happens, you'd have to turn it over anyway.

They've got some pretty intense-looking security[1], and people like Twitter trust them with their code[2]. If they aren't worried, why are you?

1: http://help.github.com/security/

2: I don't know that that's officially known, but I saw Twitter commenting on the "GitHub now has Organizations" post complaining about the lack of the cheaper plan that they added the next day. So they definitely have some private repos on GitHub.

I don't live in the US. Our company isn't based in the US. While I'm somewhat familiar with US legislation from reading HN, I certainly don't feel comfortable uploading my code to US-based servers of a US company, as I plainly don't know their laws well enough to trust them with my company's intellectual property.

Of course, I could always trust them for now and instantly remove my stuff when there are signs of trouble, but I asked them (a year ago) whether deletions are instant and irreversible and they told me the usual thing: repositories are not instantly deleted so they could restore them in case of accidental deletions. In addition they stay around in backups for an indefinite time.

Legislation not known well enough and no control over the removal of my code from their machines - call me paranoid, but these are good reasons not to upload my code to them.

> Are you afraid someone's going to ... subpoena your code or something? Because if that happens, you'd have to turn it over anyway.

Not if you don't live in the US.

In 18 months, when a client's project gets deleted and you find the server copy destroyed as well... well...

Might sound unlikely but it happens.

It could also happen that github loses data, and it's hard to evaluate the exact likelihood of you accidentally deleting two repos versus github losing a fileserver and its backup.

Also, if github is down or your repo with them is corrupted, you have to go through their support. If your own server has a problem you can fix it instantly.

I'm not convinced that reliability is the correct reason to go GitHub. Features: Yes. Reliability: Not necessarily.

Sure, Github can lose data. And you can lose data. But the advantage is that you and Github are much less correlated; the odds that both of you will lose the data at the same time are fairly low. [1]

Data safety is all about fighting correlation. You don't back up one partition to another on the same spindle, because when the drive dies the whole spindle is lost. Paranoid people back up to two different drives, two different disk controllers, two different machines, two different datacenters, two different continents...


[1] But nonzero. It is worth thinking about the scenarios.

Agreed. For that exact reason my main dev machine now has hourly local backups through time machine, local HDD clone backups every 4 hours and a separate offsite backup with Mozy.

In addition to the remote repos on Github and my normal local copies...

Agree 100%. Furthermore, it's important not to conflate version control with data backup. Although they share some traits, they have different goals. For example, if I lose my local working copy of a repository before any commits, I've lost valuable work. In absence of a good backup strategy, the existence of the remote repository is of little consolation.

As soon as bandwidth costs come down enough I have a great startup idea: backup copies stored on a different planet/planetoid. Mars. The Moon. Either one. Half-kidding!

I'm <s>guessing</s>sure Github does backups.


Seeing "I'm guessing" paired with "QED" is... strange to say the least.

Only guessing in the sense I haven't actually checked :)

I agree and disagree with you in equal measures. Paying github is no protection against github messing up, in the end you are still responsible for your data and any subsequent loss will be your problem, regardless of the cause of the loss.

So github can be a part of a backup strategy but it isn't a strategy by itself.

Likewise, there are plenty of parties that wouldn't dream of storing their data in a third party repository, it could be compromised, there are at least 'n' github employees that now have access to your data etc.

So there is a need for both options, one where you outsource your headache to github and keep a couple of local copies just in case, another where you do have your own repository that you control with the associated backup mechanisms and a number of off-site copies.

Fortunately github makes it easy to do the former and git itself can make it (relatively) easy to do the latter.

For plenty of people the first is enough. For me it wouldn't work, so I'm really happy this got posted.

I wrote this little shellscript to backup all my Git repositories on GitHub: http://github.com/avar/github-backup

It runs in cron and backs up all my data daily.
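The linked script isn't reproduced here, but a minimal version of the same idea (the URL and paths are placeholders) is a --mirror clone refreshed from cron:

```shell
# First run: take a full mirror of the remote (all refs, no working tree).
git clone --mirror git@github.com:user/repo.git /backups/repo.git

# Subsequent runs (e.g. nightly from cron): refresh every ref from the remote.
git --git-dir=/backups/repo.git remote update
```

A crontab entry like `0 3 * * * git --git-dir=/backups/repo.git remote update` would then keep the mirror current daily.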

Daily, weekly and snapshot backups with Linode are $5 on top of what I'm paying them anyway (and just a few clicks to set up). That's less than the cost of GitHub's micro plan and I can have as many private repos as I want.

I love GitHub and I'm sure they'll continue to do well, but running your own Git server is only going to get easier. If you don't need the social side of what they offer, hosting yourself makes sense.

My poor man's offsite backup script zips up the repository, encrypts it, and mails it to Gmail. The task is scheduled daily and works pretty well.
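The commenter's exact script isn't shown; a rough equivalent using tar and openssl (the repo path, passphrase file, and address are all placeholders) might look like:

```shell
# Archive the repository, then encrypt the archive symmetrically.
tar czf repo.tar.gz myrepo/
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in repo.tar.gz -out repo.tar.gz.enc -pass file:"$HOME/.backup-pass"

# Sending is left to whatever mail client is installed, e.g.:
#   mutt -s "repo backup" -a repo.tar.gz.enc -- me@gmail.com < /dev/null
```

Encrypting before mailing also sidesteps Gmail's attachment scanning, as the comments below note.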

I did this with my university project source code. The only difficulty is that Gmail does not allow executables even within a zip file. You have to work pretty hard to avoid getting them into your repo.

If you encrypt the zip, it works fine. Took me ages to discover that :)


I thought the point of git was that it was decentralised. So even if the server died, he wouldn't lose anything would he?

You can also stick a git repo on top of Dropbox... There are several articles about how to do this if you do a quick google.

Git makes it trivial to replicate your repos to more than one directory, no matter where those directories are located.

Whether or not this makes you "decentralized" depends on where your directories live. Two machines in the same location? Not decentralized.
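As a sketch of the replication idea (all paths are illustrative): a bare repo inside a synced folder like Dropbox works as an ordinary remote, and the sync client handles the off-machine copy.

```shell
# Create a bare repository inside the synced folder and use it as a remote.
git init --bare ~/Dropbox/project.git
cd ~/projects/project
git remote add dropbox ~/Dropbox/project.git
git push dropbox master
```

From then on, `git push dropbox` replicates the full history to the second directory.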

Obviously. You could also back up your files to a folder on the same disk, but you won't, will you?

Backups are important, but not that important if it's just a small private Git server. After all, the full history is not only stored on the server, but also on each of the clients.

As the author, I can respond to this:

Given the decentralised nature of Git, having continuous backups becomes less important unless all my computers (including the server) fail at once.

But yes, backups are important, and they are done.

If all your computers are in a single location, it's not decentralized in cases of fire, flood, earthquake, or other natural disasters...

Or even theft. I lost more than a week's work last year because a team member had his laptop, external hard drive, and desktop stolen from his house while he was visiting his parents for Thanksgiving. Whoops. Triple backups don't count if they're all in one apartment.

You have to have backups of the server anyway. Setting up offsite backups to Amazon S3 is maybe 30 minutes of work and perhaps $1 per month in S3 costs, and it will back up not only your git repositories but the whole server.

You should already have those things in place for your server. I don't think that the marginal cost of managing git is much if you're already managing a database, web server, application, mail server, and so on.

The server most likely needs to be backed up anyway. And backups are less critical for git than for svn, because every clone carries the full history.

I highly recommend using Gitosis; it makes managing teams, projects and users a breeze. It's really nifty: you get a git repo that contains the ssh keys and config file to manage the system. Add the user's key, add the user to the config file, push, and you're done!

Installing gitosis was probably the most error-prone and miserable experience I have ever had in Ubuntu system administration. Solid as a rock once I got it up, but man, at the time I had to cobble three different blog posts together with a bit of bubblegum. Obligatory disclaimer: my mental model for both git and sysadminning is not as good as the mean HN reader's.

I don't see any reason to run gitosis when there's gitolite, even for a single user. Either is definitely preferable to doing things manually, though.

I just went through gitolite's readme, and I don't see any reason why a lone dev might want to use it, since it seems to be built for managing teams with access control needs.

Doing it manually for me just means (on the server):

      mkdir project_name
      cd project_name
      git init --bare
Hardly difficult.

I already have ssh keys set up from long back, so that's all I have to do really. Then I just add the remote repository on my local machine.
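The client side of that manual setup (the host, user, and path are placeholders) is just as short:

```shell
# Point the local repository at the bare repo created on the server.
git remote add origin ssh://bob@example.com/home/bob/project_name
git push origin master
```

After the first push, it behaves like any other remote.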

I avoid sysadmin work when possible - it's not something I enjoy spending time on.

Setting up gitolite is dead simple. Setting up a git server manually might be 'simple' compared to other tasks, but it's still unnecessary work. Creating new repos is as simple as changing the config file and pushing it - no need to do anything on the server at all.

I see absolutely no reason TO do it manually, and that's the decider.

As mentioned by cdr, Gitolite is a spiritual successor to gitosis, and has a lot of improvements and new features in management, permissions, and setup.

Earlier this year, I moved from SVN to git when I found I really needed better branch management. One thing I found is that hosting a git repository is a lot less straightforward than hosting a subversion repository, particularly since fewer bug tracking systems have adequate support for DVCS at this point. Trac seems to be lagging behind Redmine in this regard.

If anyone is interested, I created a library for deploying git hosting (or SVN hosting if you really prefer) on a VPS with Redmine project management: http://github.com/ericpaulbishop/redcloud

Git hosting is provided via gitosis, not gitolite since there is no redmine plugin for gitolite support. It runs on Ubuntu VPS and uses Nginx with passenger compiled in as the web server. PHP via php-fpm can optionally be compiled in too, in case other sites on the same box need PHP (as is the case for what I'm doing).

For simple uses, there are several free git hosting services. Assembla offers unlimited repos and 2GB for free, while ProjectLocker offers unlimited repos and 500MB for free. There are some limitations on tools and sharing and users, but for personal use, these can act just like another repo in the cloud, Yet Another place your private code is copied. No private server maintenance required.

If you do want your own private server, VPSs are getting ridiculously cheap. lowendbox.com relays the latest offer of an OpenVZ VPS with 512MB RAM, 20GB of storage, and 1TB of bandwidth, all for ... $3.60/month. At those prices, your backup strategy could simply be to have two, or three, from different vendors. These cheap VPS hosts can just disappear one day, so never let one be your only copy of anything.

Another good solution is installing gitorious. Especially if you have a big team and a few projects.

I ran through this this week - Gitorious' install is a bit of a pain, and the software is just BROKEN in places (referencing an array when the variable isn't one).

I like the system, and it's very flexible, but it doesn't seem like they treat it very well for outside use.

I'm working on a VM/ec2 image for Gitorious to ease the installation pain. I'm working from the Ubuntu installation script that the Gitorious project provides, and I will try to get the improvements merged upstream. Things are mostly working, so I will be making the work public after a few more days of testing. Feel free to contact me if you'd like more info.

I don't know, the only problem I found so far is the lack of tag display. It is a bit harder to install than gitosis, but you also get much more (for example, people can create projects and repos without your help, all the visual stuff, etc..)

Or, use gitosis and be able to control your repositories and access to them much, much easier: http://scie.nti.st/2007/11/14/hosting-git-repositories-the-e...

Does git have the equivalent of "hg serve"?

That would be gitweb. It's not built in, but a set of Perl scripts invoked via CGI from Apache or similar. Setup is a snap, as it can be installed with apt-get/yum/zypper.

Honest question: What's the easiest way for two developers using git to push and pull from each other, if they're running Windows and on an isolated LAN?

In the last few weeks, I've started using this a lot as well, to host my multiple little "labs" projects without blowing my private github repositories limit.

I keep private github repos for collaborative projects.

Hi Guys,

This is a question I've been struggling with for a while now.

Would you host your company's super important bread and butter code on GitHub instead of your own server?

I get a funny feeling every time I think about this mostly because I've put tons of time (3 years) into our code base and I think of it as one of the most important aspects of our company.

Then again, GitHub hosts their code on GitHub which makes me feel a bit at ease in doing so.

I would really appreciate your insights.


I would, and I do. I don't use their wiki or issue tracker, though, so if they managed to lose the data, it'd take all of 30 seconds (well, plus pushing some rather large files) to set it up again.

Our codebase is GPLed, so there are no concerns about privacy.

You typically want to enable the post-update hook on the server:

  mv "$GITDIR/hooks/post-update.sample" "$GITDIR/hooks/post-update"

This makes sure that the repository works with dumb servers which is required for things like http/cgit.

.. and chmod +x on it, IIRC
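Put together (the repository path is illustrative), enabling the hook looks like this; the sample hook just runs `git update-server-info`, which writes the info/refs file that dumb HTTP clients need:

```shell
cd /srv/git/project.git
mv hooks/post-update.sample hooks/post-update
chmod +x hooks/post-update
# Run it once by hand so existing refs are advertised immediately.
git update-server-info
```

After this, every push regenerates info/refs automatically.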

warehouse makes a nice front-end to your personal git server http://github.com/drcapulet/warehouse "Re-Written from the ground-up for use with Git"

Why would one need a git server? I am somewhat unclear on this... any good reads on this topic?

I am using git because I hate svn (so I use git-svn), so I am curious how a pure git setup would work...

Easier collaboration with others when you need privacy. A backup of your repos (though you should have separate backups of this too). A canonical version of your repository with read access for your team but commit access for only a select few.

I am sure there are more scenarios, but this is just what I could think of in a minute.

Is there any point in putting obvious things that anyone can google when they need them on the HN front page?

The value is not only in the post but in the subsequent discussion it generates.

In fact, I will often "cheat" and read the discussion first.

You must be a recovering slashdotter, like me.
