
Ask HN: How do you handle large files in Git Repos under development? - johncole
Our development team is facing this issue right now: we need to include some larger binaries (pip and Debian install files) in our GitHub repo. They're not likely to change much over the coming years, but they're necessary because we run our updates offline and need the packages easily accessible for testing and development. To test and release, right now we simply click the green "Download ZIP" button on GitHub. So we're balancing two issues: putting the binaries directly in the repo will eventually bloat it, while keeping them separate and packaging them means an extra step for testing, and someone will eventually forget that step or screw it up.

Anyone have any experience with this? We've looked at the following:

1. Submodules - This seems almost perfect, but requires some change in developer behavior. We are used to downloading a test zip package for offline testing from the GitHub website.

2. Batch/Bash file packaging - Writing scripts that automatically bundle the larger files with the repo contents before we use them in testing or release.

3. Git LFS (Large File Storage) - We haven't really looked at the pros and cons here much. Any experience?
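A minimal sketch of option 2, assuming the large binaries live in a `vendor/` directory and the output is named `release.tar.gz` (both names are assumptions, not anything from the thread):

```shell
#!/bin/sh
# Option 2 sketch: bundle the repo contents plus the large binaries
# into a single archive for offline testing, so nobody has to remember
# the packaging step by hand.
set -eu

# Simulate a checkout with one source file and one large binary
mkdir -p demo/vendor
echo 'print("hello")' > demo/app.py
head -c 1024 /dev/zero > demo/vendor/example.deb

# Package everything into one archive for offline testing/release
tar -czf release.tar.gz -C demo .

# List what went into the archive
tar -tzf release.tar.gz
```

Wiring a script like this into CI (or a Makefile target) removes the "someone forgets the extra step" risk, since the archive is produced the same way every time.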
======
stephenr
... why do you want .deb files in git? Build them and put them in an apt repo.

~~~
johncole
Good question. We need to be able to install offline with a USB drive.

~~~
stephenr
Right, but wouldn't a better option be to mirror the apt repo to the USB key
and just use `deb file:/mnt/usbkey $RELEASE $COMPONENTS`? Then the setup,
dependency management, etc. are all the same.
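Concretely, the suggestion above amounts to a one-line sources entry pointing at the mounted key; the mount point, suite, and component names here are illustrative, not from the thread:

```
# /etc/apt/sources.list.d/usb.list
deb file:/mnt/usbkey stable main

# After mounting the key, the normal workflow works offline:
#   apt-get update
#   apt-get install <package>
```

If the mirrored repo on the key isn't signed, apt will refuse it by default; a properly signed local repo avoids needing any trust overrides.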

~~~
johncole
It might be, we could look at it.

~~~
stephenr
Even if you're not going to go the full apt repo route, using regular
file/blob storage is still going to give you a better result, I believe.

