
GNU Make Alternatives - pooriaazimi
http://freecode.com/articles/make-alternatives
======
beagle3
The title is missing a [2005] tag - the article was written 9-Jul-2005 (it doesn't say if it's been edited since).

IMHO the most important addition since is djb's "redo", which has since been
implemented by apenwarr <https://github.com/apenwarr/redo/> - it replaces
make's dependency tracking with something significantly simpler yet more
reliable, and uses your familiar shell as its language.

redo does NOT try to provide project management (especially not of the "cross
platform" variety offered by cmake/tmake/qmake and friends). It leaves that
for other tools (and rightly so, I believe).
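
For flavor: a redo rule is nothing but a shell script. A hypothetical
`default.o.do` for compiling C (following the conventions in apenwarr's
documentation; file names assumed) might look like:

```sh
# default.o.do - redo runs this to build any foo.o from foo.c
# $1 = target, $2 = target without extension, $3 = temp output file
redo-ifchange "$2.c"                  # declare the source as a dependency
gcc -MD -MF "$2.d" -c -o "$3" "$2.c"  # compile, emitting a header list
read DEPS <"$2.d"
redo-ifchange ${DEPS#*:}              # also depend on every header gcc saw
```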

Another important tool not listed: premake (v3 and v4, which are very
different). If you're building a C++ centered project that needs to work on
Windows, Xbox, Mac, Linux, PS3 and others - it will keep whatever is left of
your sanity given the situation, and help you get the job done.

~~~
dirtyaura
Agreed.

After make-based builds caused too many problems for us, we first moved to WAF
(<http://code.google.com/p/waf/>), but it was too complex for our needs and
extending builds with our own steps became a chore.

Then we moved to redo, which was a breath of fresh air with its simplicity.
We built a few convenience scripts on top of it to allow easier management of
build targets (OS X, iOS, iOS Simulator, Linux) and have been happy since.

------
numeromancer
Shall I be the 1st to mention tup? It automatically tracks dependencies with
file notifications, is written in C and is very fast. You can run its monitor
in a way that automatically recompiles whenever a source file is changed.

<http://gittup.org/tup/>
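
For reference, tup's rules live in a Tupfile; a hypothetical one for a small C
program (file and target names assumed) looks roughly like:

```
# Tupfile - ": inputs |> command |> outputs"; %f = input, %o = output,
# %B = basename without extension
: foreach *.c |> gcc -c %f -o %o |> %B.o
: *.o |> gcc %f -o %o |> hello
```

tup parses these rules into the same DAG it then updates from file
notifications.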

------
jwhite
Shake [1] is a Haskell library for writing build systems. I've been trying it
out on a small side project lately. It isn't mature, so it doesn't do
everything you might want, but the approach of supplying a library of build
system construction tools, instead of forcing you into a more rigid framework
and/or DSL, gives you a lot of flexibility. It seemed interesting and novel to
me, but I don't know whether it has been done before.

Not having a configuration system is a big drawback, enough to make me
consider using autoconf in conjunction with Shake. Also you have to know
Haskell, which will turn a lot of people away.

[1] <http://hackage.haskell.org/package/shake>
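
To give an idea of the library approach, here is a sketch of a Shake build
script (the rule operator has varied across Shake versions, and the file
names are hypothetical):

```haskell
import Development.Shake
import Development.Shake.FilePath

main :: IO ()
main = shakeArgs shakeOptions $ do
    want ["hello"]                -- default target

    "hello" %> \out -> do         -- link rule
        let objs = ["main.o", "util.o"]
        need objs
        cmd "gcc -o" [out] objs

    "*.o" %> \out -> do           -- compile rule
        let src = out -<.> "c"
        need [src]
        cmd "gcc -c" [src] "-o" [out]
```

Because it's ordinary Haskell, the "framework" parts (pattern rules,
dependency tracking) are just library functions you can swap out.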

------
eis
It should be noted that this article is from 2005. A bit has changed in the
landscape of Make alternatives.

For example, another alternative created later is waf:
<http://code.google.com/p/waf/>

------
lloeki
This may be the time to point to a paper[0] ("Recursive Make Considered
Harmful") whose advice, sadly, is rarely followed, and which turns one of the
common misconceptions about make - _"make is slow"_ - on its head.

I've been writing my makefiles that way, and it has significantly improved
dependency analysis and sped up builds by an order of magnitude.

[0] <http://miller.emu.id.au/pmiller/books/rmch/>
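
The paper's core recommendation is a single whole-project DAG built by
including per-directory fragments, rather than recursing into sub-makes. A
minimal sketch (module and file names hypothetical):

```make
# Top-level Makefile: include fragments instead of invoking sub-makes,
# so make sees one complete dependency graph.
MODULES := lib app
SRC :=
include $(patsubst %,%/module.mk,$(MODULES))

OBJ := $(SRC:.c=.o)
prog: $(OBJ)
	$(CC) -o $@ $(OBJ)

# Each lib/module.mk just appends its sources:
#   SRC += lib/foo.c lib/bar.c
```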

------
krasin
Hm... no mentions of Ninja? <http://martine.github.com/ninja/>

If you are working with a project that uses CMake, Ninja is a perfect
replacement for GNU Make, because it significantly speeds up rebuilds.

edit: ah, it's 2005.
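
For context, a build.ninja file is deliberately dumb - rules and edges only,
no wildcards or logic - and is usually emitted by a generator such as CMake. A
hypothetical hand-written one:

```
rule cc
  command = gcc -MMD -MF $out.d -c $in -o $out
  depfile = $out.d

rule link
  command = gcc -o $out $in

build main.o: cc main.c
build hello: link main.o
```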

~~~
emarcotte
This! I dropped Ninja into a project with literally 5 minutes' work (mostly
reading the how-to-install part). It's very straightforward and /fast/.

------
jimrandomh
The fundamental problem with build systems is that they are built on top of
command-line tool invocation, and command-line tool invocation is an extremely
leaky abstraction; it breaks whenever the tools themselves are missing or
broken. When you abstract away from this, you have to be very careful to
maintain transparency all the way down, or else a wide variety of common
problems become impossible to debug.

Most of the current crop of tools fails miserably at this. Whenever something
goes wrong in ant, or autoconf, or jam, it's always a nightmare to debug; and
the more they try to be helpful, the worse it gets.

------
raverbashing
Funny

I (usually) don't have a problem with Make, I think it's a good tool for what
it does (basically, a DAG solver)
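
That DAG view shows through in even the smallest hand-written makefile
(targets and files hypothetical):

```make
# Each rule declares an edge in the dependency graph; make topologically
# sorts the graph and rebuilds only targets older than their inputs.
hello: main.o util.o
	gcc -o hello main.o util.o

main.o: main.c util.h
	gcc -c main.c

util.o: util.c util.h
	gcc -c util.c
```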

By all means use make with hand-written makefiles.

Now, autotools/automake are a different (bad) story

The alternatives posted here are very interesting

~~~
exDM69
> Now, autotools/automake are a different (bad) story

While autotools are a bit hairy for the developer, the autotools-based builds
usually work quite nicely. I regularly build the GNU toolchain (gcc + binutils
+ glibc) for cross compilation and usually it works like a charm. I grab the
sources from git repos so naturally they have their moments, but it usually
works better than any other build system.

Some build systems are a lot more convenient for simple tasks (I like CMake)
but when it comes to cross compiling or truly configuring for varying build
sites, autotools-based systems deliver.

A few years ago there was some hassle with autoconf/automake version numbers
but that seems to have been solved now.

------
rdw
Nice roundup. I recently used CMake for a modestly-sized project. After
getting over the initial hump, I've found it to be quite pleasant (the
documentation could be a lot better, though).

I find the ability to generate real projects for the various IDEs I use on
different platforms to be the key differentiating factor. SCons, by
comparison, wants you to set up the IDE to replace its build step with a call
to the scons script. It just feels wrong by comparison.

------
enqk
a recent one, tundra, uses Lua as its configuration language. It's very fast,
taking great care to exploit concurrency.

project:

<https://github.com/deplinenoise/tundra>

and slides:

<http://deplinenoise.wordpress.com/2011/04/23/slides-tundra-from-revision-2011/>

------
wybo
I created Lake a couple of years ago, which allows one to write one's
makefile in C++.

A header and footer are added to the makefile, and it's then compiled with
one's C++ compiler. After that it's run, and that's the point at which one's
other sources are compiled.

It was inspired by Icmake (by my C++ prof). It never really caught on (I
didn't really promote it), but I still think it wasn't a terrible idea (apart,
maybe, from it being language-specific).

Lake: <http://www.logilogi.org/pub/lake/README.LAKE.txt>,
<http://sourceforge.net/projects/logilogi/files/lake/>
Icmake: <http://icmake.sourceforge.net/>

------
rachelbythebay
I believe that things which behave like Make, wrap it, or otherwise generate
Makefiles are missing an opportunity. If you're building C or C++, that code
should already specify all of its deps. You just need to stick to certain
design rules.

Really, if you say "build foo", it should just figure it out. The only
interesting part is if you have extra flags needed for some libs and/or
headers, and those can be specified without too much trouble. Using pkg-config
as a starting point usually helps.

I'm speaking from experience here. I stopped using make for my own builds a
couple of months ago. Life is great.

~~~
jamesaguilar
Not sure how mature it is, but Ekam is a project that thinks in this
direction. It's not quite as easy as you believe, but it's definitely
possible. <http://code.google.com/p/ekam/>

~~~
rachelbythebay
There's a big note about how it only works on Linux and how FreeBSD and Mac OS
support has atrophied. I take this to be related to the syscall sniffing stuff
which is at its heart.

Maybe I'm just conservative, but that kind of design is not the sort of thing
I would ever want to rely on.

~~~
jamesaguilar
Maybe. There's not really another way I can think of to get a perfect picture
of what the compiler is going to ask for.

~~~
rachelbythebay
I suspect your code may already have all of this right in the source.

    
    
        #include <stdio.h>
        #include <mysql/mysql.h>
    
        #include "base/logging.h"
        #include "http/cookie.h"
    
        // ... and so on.
    

Right there, you can translate that into a system header you can ignore, a
system header for which you should add cflags and ldflags where appropriate
(compiling vs. linking), and a couple of local libraries which need to be
further investigated.

http/cookie.h and base/logging.h are then analyzed, along with http/cookie.cc
and base/logging.cc, assuming they exist. Any #includes there are chased down
in the same manner. This continues until everything has been resolved.

If you keep track of all of these things, then you will have a list of every
object file which needs to be rolled up into the final binary. You also pick
up all of the flags required to compile and/or link with those things by
extension.

Obviously you have to handle the whole "rebuild if something changes" thing,
but that's not particularly difficult, either. I wrote my own tool to do
exactly this. I'm using it for my own (C++) projects, and it's been quite
pleasant. It won't work for everyone, though.

I wrote about it here: <http://rachelbythebay.com/w/2012/03/28/build/>
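
As a toy sketch of the scanning idea (this is my reconstruction, not
rachelbythebay's actual tool; the .h-to-.cc pairing convention is an
assumption):

```python
# Scan #include lines, split system headers (candidates for pkg-config
# flags) from local ones, and chase local headers - plus any matching
# .cc implementation files - transitively.
import os
import re

INCLUDE_RE = re.compile(r'^\s*#include\s+([<"])([^">]+)[">]')

def scan_deps(path, root=".", seen=None):
    """Return (system_headers, local_headers) reachable from path."""
    if seen is None:
        seen = {path}
    system, local = set(), set()
    with open(path) as f:
        for line in f:
            m = INCLUDE_RE.match(line)
            if not m:
                continue
            kind, name = m.groups()
            (system if kind == '<' else local).add(name)
    # Chase each local header and its implementation file, if present.
    for header in sorted(local):
        followups = [header]
        if header.endswith(".h"):
            followups.append(header[:-2] + ".cc")  # assumed pairing
        for rel in followups:
            full = os.path.join(root, rel)
            if full not in seen and os.path.exists(full):
                seen.add(full)
                s, l = scan_deps(full, root, seen)
                system |= s
                local |= l
    return system, local
```

Run over the main.cc above, it would report stdio.h and mysql/mysql.h as
system headers (the latter mapping to extra cflags/ldflags) and
base/logging.h, http/cookie.h as local files to chase further.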

~~~
jamesaguilar
As long as every .c file has an .h file with the exact same name, and every
lib has precisely one .so file with the same name, I tend to agree. Although I
know of many projects for which this is not the case.

------
dkhenry
One thing that has really disappointed me with Make and friends is that we
have Maven and SBT now. Suddenly Make, Autotools, CMake, SCons... they all
just seem so archaic compared to the amazing simplicity that is SBT. Automatic
dependency resolution, incremental compilation, simple project publishing, no
writing of build scripts ever.
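
For context, "no writing of build scripts" in practice means a short
declarative build.sbt; a hypothetical minimal one (name, versions, and
dependency are assumptions):

```scala
// build.sbt - SBT infers compile/test/package tasks from these settings
name := "example"

version := "0.1"

scalaVersion := "2.9.2"

// dependencies are resolved and fetched automatically
libraryDependencies += "joda-time" % "joda-time" % "2.1"
```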

The only other build system I know of that comes close is Go's built-in
system, but that appears to be more because of its insistence on no linking.

~~~
swdunlop
These friendly, high-abstraction build systems show their teeth when they meet
configurations that deviate from those abstractions. Go's a great example: if
$GOROOT or parts of $GOPATH are not owned by the build user, "go build" will
occasionally fail because a pacman or aptitude update touched the compiler and
Go needs to recompile every package in sight.

When dealing with JNI, both Maven and SBT have to delegate the task to less
pure build systems. For projects that cannot nest nicely above a very high and
consistent layer of abstraction, these "archaic" systems are essential because
of their lack of interfering abstractions. They observe a different definition
of "simple", instead of being "simple to use in a specific problem domain"
they go to "simple to adapt to a different problem domain."

------
pux0r3
I used to be a big fan of SCons, and now I tend to use Rake for my build
system. To me the most useful thing in a build system is a straightforward
and easy-to-understand interface for setting it up, and I'm much more
comfortable with Ruby and Python than I am with the odd (to me) lisp-derived
syntax of make. (FWIW: I typically use these with C/C++ projects, not
Ruby/Python/etc.)

------
fusiongyro
I rather like make in all its incarnations, but I'm a bit partial to Plan 9's
mk:

<http://plan9.bell-labs.com/magic/man2html/1/mk>

------
darkestkhan
There is also gprbuild, but it is used mostly by Ada programmers (especially
as its config syntax and semantics are very similar to Ada's).

------
egeozcan
I know this thread could fill up with "why no X" posts, but gyp is officially
used by node.js, so I'm a bit surprised it's not mentioned.

------
jasonwatkinspdx
Redo.

