The title is missing the [2005] tag - the article was written 9-Jul-2005 (it doesn't say whether it has been edited since).
IMHO the most important addition since is djb's "redo", which has since been implemented by apenwarr (https://github.com/apenwarr/redo/) - it replaces make's dependency tracking with something significantly simpler and yet more reliable, and uses your familiar shell as the language to do it.
redo does NOT try to provide project management (especially not of the "cross platform" variety offered by cmake/tmake/qmake and friends). It leaves that for other tools (and rightly so, I believe).
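For a taste, here is (roughly) the canonical .do script from apenwarr's docs for building any .o from its matching .c file - plain shell, where $1 is the target, $2 the target minus its extension, and $3 a temporary output file:

    # default.o.do: build any foo.o from foo.c
    redo-ifchange $2.c
    gcc -MD -MF $2.d -c -o $3 $2.c
    # feed the compiler-generated header list back in as dependencies
    read DEPS <$2.d
    redo-ifchange ${DEPS#*:}

redo-ifchange both builds the named files if needed and records them as dependencies of the current target, which is the whole trick.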
Another important tool not listed: premake (v3 and v4, which are very different). If you're building a C++-centered project that needs to work on Windows, Xbox, Mac, Linux, PS3 and others, it will keep whatever is left of your sanity given the situation, and help you get the job done.
After make-based builds caused too many problems for us, we first moved to WAF (http://code.google.com/p/waf/), but it was too complex for our needs, and extending builds with our own steps became a chore.
Then we moved to redo, whose simplicity was a breath of fresh air. We built a few convenience scripts on top of it to allow easier management of build targets (OS X, iOS, iOS Simulator, Linux) and have been happy since.
My problem with many modern build systems is that they try to cure the user of using shell commands for compiling and linking, in a misplaced effort to make things "more portable." They walk right by the real reason I'm there, which is to manage the complexity of the build and control compile times.
For projects simple enough to be manageable with those build systems, I'm already happily using make and a reasonably GNU environment (if not GNU's compiler). For the other projects, where recursive makes and complexity become a distraction, I've already exceeded the parameters where these build systems are comfortable.
The acid test for me is MOSREF. MOSREF has a fairly portable ANSI C virtual machine and linker, but the compiler is metacircular, and obviously nobody is going to target MOSREF in their packaged-in recipes. One build system after another drove me to distraction trying to simplify MOSREF's process -- despite the fact that each step was quite simple, the order of steps was very tricky, especially when hacking on the compiler.
Redo looks like a better evolution away from Make for me, since it means no new syntax and it doesn't try to protect me from bash. Thanks for the recommendation -- I don't work much on MOSREF anymore, but maybe this offers an escape route from "go build", which has been a bit of a pest for configuration management.
Shall I be the first to mention tup? It automatically tracks dependencies with file notifications, is written in C, and is very fast. You can run its monitor in a way that automatically recompiles whenever a source file changes.
Shake [1] is a Haskell library for writing build systems. I've been trying it out on a small side project lately. It isn't mature, so it doesn't do everything you want, but the approach of supplying a library of build-system construction tools, instead of forcing you into a more rigid framework and/or DSL, gives you a lot of flexibility. It seemed interesting and novel to me, but I don't know if it has been done before or not.
Not having a configuration system is a big drawback, enough to make me consider using autoconf in conjunction with Shake. Also you have to know Haskell, which will turn a lot of people away.
This may be the time to point to a paper[0] which, sadly, is rarely implemented, and which turns on its head one of the common misconceptions about make: that "make is slow".
I've been writing makefiles that way, and it significantly improved dependency analysis and sped up builds by an order of magnitude.
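For anyone who hasn't seen that style: assuming it's the single non-recursive Makefile approach with compiler-generated dependency files, a minimal sketch looks like this (layout and names are my own invention):

    SRCS := $(wildcard src/*.c)
    OBJS := $(SRCS:.c=.o)

    app: $(OBJS)
    	$(CC) -o $@ $(OBJS)

    # -MMD emits a .d file listing the headers each object really uses;
    # -MP adds stub rules so deleted headers don't break the build
    %.o: %.c
    	$(CC) -MMD -MP -c -o $@ $<

    -include $(OBJS:.o=.d)

One make process sees the whole graph, so a no-op build is a single stat pass instead of a cascade of recursive invocations.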
The fundamental problem with build systems is that they are built on top of command-line tool invocation, and command-line tool invocation is an extremely leaky abstraction; it breaks whenever the tools themselves are missing or broken. When you abstract away from this, you have to be very careful to maintain transparency all the way down, or else a wide variety of common problems become impossible to debug.
Most of the current crop of tools fails miserably at this. Whenever something goes wrong in ant, or autoconf, or jam, it's always a nightmare to debug; and the more they try to be helpful, the worse it gets.
> Now, autotools/automake are a different (bad) story
While autotools are a bit hairy for the developer, autotools-based builds usually work quite nicely. I regularly build the GNU toolchain (gcc + binutils + glibc) for cross compilation and it generally works like a charm. I grab the sources from git repos, so naturally they have their moments, but it still works better than any other build system.
Some build systems are a lot more convenient for simple tasks (I like CMake) but when it comes to cross compiling or truly configuring for varying build sites, autotools-based systems deliver.
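Concretely, cross compiling an autotools package is usually just the standard triplet flags (the triplet and prefix here are only examples):

    ./configure --build=x86_64-pc-linux-gnu \
                --host=arm-linux-gnueabi \
                --prefix=/opt/cross/arm
    make && make install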
A few years ago there was some hassle with autoconf/automake version numbers but that seems to have been solved now.
Nice roundup. I recently used CMake for a modestly-sized project. After getting over the initial hump, I've found it to be quite pleasant (the documentation could be a lot better, though).
I find the ability to generate real projects for the various IDEs I use on different platforms to be the key differentiating factor. SCons, by comparison, wants you to set up the IDE to replace its build step with a call to the scons script. It just feels wrong by comparison.
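For reference, it's CMake's generator switch that does this; the same source tree yields native projects per platform (generator names vary by CMake version):

    cmake -G "Visual Studio 10" ..    # .sln/.vcxproj for MSVC
    cmake -G Xcode ..                 # an Xcode project on the Mac
    cmake -G "Unix Makefiles" ..      # plain makefiles everywhere else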
I created Lake a couple of years ago, which allows one to write one's makefile in C++.
A header and footer are added to the makefile, and it's then compiled using one's C++ compiler. After that it's run, and that's the point where one's other sources are compiled.
It was inspired by Icmake (by my C++ prof). It never really caught on (I didn't really promote it), but I still think it wasn't a terrible idea (apart, maybe, from it being language specific).
I believe that things which behave like Make, wrap it, or otherwise generate Makefiles are missing an opportunity. If you're building C or C++, that code should already specify all of its deps. You just need to stick to certain design rules.
Really, if you say "build foo", it should just figure it out. The only interesting part is if you have extra flags needed for some libs and/or headers, and those can be specified without too much trouble. Using pkg-config as a starting point usually helps.
I'm speaking from experience here. I stopped using make for my own builds a couple of months ago. Life is great.
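To be concrete, the pkg-config starting point mentioned above is a one-liner per step (the library name is just an example):

    c++ -c foo.cc $(pkg-config --cflags mysqlclient)
    c++ -o foo foo.o $(pkg-config --libs mysqlclient)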
This requires Make to parse the files in some way. What I find nice about Make is that there's no magic. I can use Make to drive builds of C, D, fortran objects and link them together, throw in some Java, transform markdown into html, generate graphs with graphviz and build pdfs with latex.
In those cases I'm positively hopeful that Make won't try to do any magic except building its dependency graph, see which files are older than their deps, and do the actions as described.
Make is no silver bullet, but it does a hell of a job at building and binding together non-ideal stuff.
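Each of those is one pattern rule away; a sketch (the commands are illustrative, substitute your own tools):

    %.html: %.md
    	markdown $< > $@

    %.png: %.dot
    	dot -Tpng -o $@ $<

    report.pdf: report.tex diagram.png
    	pdflatex report.tex

Make neither knows nor cares what the commands do; it only orders them by the dependency graph and file timestamps.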
Not sure how mature it is, but Ekam is a project that thinks in this direction. It's not quite as easy as you believe, but it's definitely possible. http://code.google.com/p/ekam/
There's a big note about how it only works on Linux and how FreeBSD and Mac OS support has atrophied. I take this to be related to the syscall sniffing stuff which is at its heart.
Maybe I'm just conservative, but that kind of design is not the sort of thing I would ever want to rely on.
I suspect your code may already have all of this right in the source.
    #include <stdio.h>
    #include <mysql/mysql.h>
    #include "base/logging.h"
    #include "http/cookie.h"
    // ... and so on.
Right there, you can translate that into a system header you can ignore, a system header for which you should add cflags and ldflags where appropriate (compiling vs. linking), and a couple of local libraries which need to be further investigated.
http/cookie.h and base/logging.h are then analyzed, along with http/cookie.cc and base/logging.cc, assuming they exist. Any #includes there are chased down in the same manner. This continues until everything has been resolved.
If you keep track of all of these things, then you will have a list of every object file which needs to be rolled up into the final binary. You also pick up all of the flags required to compile and/or link with those things by extension.
Obviously you have to handle the whole "rebuild if something changes" thing, but that's not particularly difficult, either. I wrote my own tool to do exactly this. I'm using it for my own (C++) projects, and it's been quite pleasant. It won't work for everyone, though.
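For the curious, the chasing step is simple enough to toy-sketch in shell (this is not my actual tool - no cycle detection, and it assumes every foo/bar.h sits beside a foo/bar.cc):

    # print every local file reachable from a source file's #include lines
    chase() {
        sed -n 's/^#include "\(.*\)"/\1/p' "$1" | while read -r f; do
            echo "$f"
            [ -f "$f" ] && chase "$f"
            src="${f%.h}.cc"
            if [ -f "$src" ]; then
                echo "$src"
                chase "$src"
            fi
        done
    }
    chase main.cc | sort -u    # the compile/track list for target "main"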
As long as every .c file has a .h file with the exact same name, and every lib has precisely one .so file with the same name, I tend to agree. Although I know many projects for which this is not the case.
One thing that has really disappointed me with Make and friends is that we have Maven and SBT now. Suddenly Make, Autotools, CMake, SCons... they all just seem so archaic compared to the amazing simplicity that is SBT. Automatic dependency resolution, incremental compilation, simple project publishing, no writing of build scripts ever.
The only other build system I know of that comes close is Go's built-in system, but that appears to be more because of its insistence on no linking.
These friendly, high-abstraction build systems show their teeth when they meet configurations that deviate from their abstractions. Go's a great example -- if $GOROOT or parts of $GOPATH are not owned by the build user, "go build" will occasionally fail because a pacman or aptitude update touched the compiler and Go needs to recompile every package in sight.
When dealing with JNI, both Maven and SBT have to delegate the task to less pure build systems. For projects that cannot nest nicely above a very high and consistent layer of abstraction, these "archaic" systems are essential precisely because of their lack of interfering abstractions. They observe a different definition of "simple": instead of "simple to use in a specific problem domain", they aim for "simple to adapt to a different problem domain".
I used to be a big fan of SCons, and now I tend to use Rake for my build system. To me the most useful thing in a build system is a straightforward, easy-to-understand interface for setting it up, and I'm much more comfortable with Ruby and Python than I am with the odd (to me) lisp-derived syntax of make. (FWIW, I typically use these with C/C++ projects, not Ruby/Python/etc.)