That said--and this is without knowing the exact build and tooling environment, so I may well be giving advice inappropriate to the situation at hand!--the second part of that "keep some binaries along with the source files they're generated from" is kind of an antipattern.
If it takes too long to generate them from source every single time, they need to fix that issue--not least because slow builds mean slow testing, and slow testing means no testing.
That's why I spent two days earlier this year moving a 30-minute build down to a 2.5 minute build.
The output of the build is basically a tool in itself. So most people don't need the build process, just the resulting tool. The input changes on pretty much a monthly basis and is not easily versioned. I could set up all dev machines to support the build and everybody could build it themselves from sources, but that would require me to
* install all the required instruments for the ritual rain dance
* teach the whole team how to do the ritual rain dance
* support the people that break an arm or a leg doing the ritual rain dance
So I prefer to perform the dance on my machine, collect the tool, and point the main dev environment to the right location. It's all scripted, so I kick off the job and grab a coffee. However, I want to keep old instances around so I can track when bugs crept in, so I can't just overwrite the result, which means I need to adjust the pointer every time.

If I could just check the tool in with the regular dev setup that would be much easier, but - since we're using git - the repo would blow up quite quickly and overwhelm my disk space. (And folks would kill me for filling up their disks as well - rightfully so.) That's something SVN or another centralized VCS would handle much more gracefully.

In short, I have a use case for fairly stupid versioned file storage with a push/pull API. No complicated merging, no branching, nothing. git-annex could do it, but it's overkill.
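To make the pointer problem concrete: one way to keep every old build around *and* avoid repointing the dev environment each time is to publish into dated directories and flip a stable symlink. This is just a sketch under assumptions -- the store path, timestamp scheme, and placeholder build step are all made up, not the actual setup:

```shell
set -eu

# Hypothetical shared location the dev environments point at.
STORE="${STORE:-/tmp/tool-store}"
STAMP="$(date +%Y-%m-%d_%H%M%S)"
DEST="$STORE/$STAMP"

mkdir -p "$DEST"
# Stand-in for the actual rain dance; the real build would run here.
printf 'tool build placeholder\n' > "$DEST/tool"

# Every dated build stays around for tracking when bugs crept in,
# but a stable "current" symlink means nothing downstream needs editing.
ln -sfn "$DEST" "$STORE/current"
echo "published $DEST"
```

With dev setups referencing `$STORE/current`, the "adjust the pointer every time" step disappears, while old instances remain available for bisecting regressions.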
There are some better solutions to what we're currently doing, but there's so many yaks to shave and so few razors.
> That's something SVN or another centralized VCS would handle much more gracefully.
Subversion handles this gracefully because you don't download the repository's entire history to your machine. What you're describing is a trade-off: most people are OK with losing the ability to check in 500MB files in order to gain the decentralization of having a full copy of the repo (and not needing to query the server just to view history).
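For what it's worth, git can narrow that gap: a shallow clone skips the history download entirely, which is much closer to SVN's checkout model. A throwaway demo with a purely local repo (all paths here are temporary, nothing from an actual project):

```shell
set -eu
cd "$(mktemp -d)"

# Build a tiny "server-side" repo with two commits.
git init -q origin-repo
git -C origin-repo -c user.email=a@example.com -c user.name=a \
    commit -q --allow-empty -m "first"
git -C origin-repo -c user.email=a@example.com -c user.name=a \
    commit -q --allow-empty -m "second"

# A shallow clone fetches only the tip commit, not the full history.
git clone -q --depth 1 "file://$PWD/origin-repo" shallow

git -C shallow rev-list --count HEAD   # prints 1: the rest of history stayed behind
```

Newer git also has partial clone (`git clone --filter=blob:none`), which keeps the full commit history but fetches file contents lazily on checkout -- arguably the better fit for the "big binaries you rarely need old versions of" case.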
Thanks for such a good explanation. When I first saw artifacts of things generated by code in our repo, I had a big WTF moment, but it made a lot of sense once someone pointed out that it was Rather Handy for catching bugs in the code that does the generation.