

Maintaining Large Binary Resources in Git - jgorham
http://stackoverflow.com/questions/540535/managing-large-binary-files-with-git

======
yason
I've long held the opinion that the right way to go is to store all versions
of the binary files on a separate disk/share/website and use git to version
only the paths to those files. That way you keep just one local copy of the
binary archive, and you can reference the individual files from git a thousand
times over without bloating the git repo.

It seems that git-annex does exactly that, and apparently quite a bit more, in
a more sophisticated way.
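The pointer-file scheme yason describes can be sketched in a few lines. This is a hypothetical illustration, not how git-annex actually works (git-annex uses symlinks into `.git/annex` and a content-addressed key layout); the `.ptr` suffix and the `store/<sha1>/` layout here are made-up conventions for the example.

```python
import hashlib
import shutil
from pathlib import Path


def stash_binary(src: Path, store: Path) -> Path:
    """Copy a binary into a content-addressed store directory.

    The layout store/<sha1>/<filename> is an assumption made for this
    sketch; the returned path is what git would track via a pointer.
    """
    digest = hashlib.sha1(src.read_bytes()).hexdigest()
    dest_dir = store / digest
    dest_dir.mkdir(parents=True, exist_ok=True)
    stored = dest_dir / src.name
    shutil.copy2(src, stored)
    return stored


def write_pointer(src: Path, stored: Path) -> Path:
    """Write a tiny text pointer next to the original file.

    Git versions this small pointer file instead of the binary itself,
    so the repo only ever grows by a few bytes per binary revision.
    """
    pointer = src.with_suffix(src.suffix + ".ptr")
    pointer.write_text(str(stored) + "\n")
    return pointer
```

Each new version of a binary lands under a different hash in the store, so git history only records which path (i.e. which content hash) a given commit pointed at.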

------
obtu
Git is getting better with large files (now that changes to them are streamed
into packfiles), but there's still some way to go before it scales to large
numbers of tracked files;
[https://git.wiki.kernel.org/articles/g/i/t/GitTogether10_bup...](https://git.wiki.kernel.org/articles/g/i/t/GitTogether10_bup_bf08.html)
has some ideas for making the index more scalable.

------
emillon
git-annex is the way to go! I use it to manage my media library and it's 95%
awesome (the nitpick being that you can't easily edit files once they're
checked in; there ain't no such thing as a free lunch, I guess).

