

Using GNU Make's 'define' and '$(eval)' to automate rule creation - jgrahamc
http://blog.jgc.org/2012/01/using-gnu-makes-define-and-eval-to.html

======
Peaker
Make should really be superseded by a build system that is both faster and
more correct.

One of these options today is the tup[1] build system.

It seems to be a relatively modest/new effort, and is already far better than
Make.

[1]: <http://gittup.org/tup/>

~~~
CJefferson
The big advantage of Make of course is that you can assume it is basically
everywhere.

I have just looked at tup, and it looks interesting, except I don't really
understand why the auto-dependency only kind-of works. If building X.c uses
header.h, and header.h is created by script.sh, why do I have to tell tup
about that ordering? Why can't it just see it has to run script.sh to get
header.h, before building X.c?

~~~
evmar
The build system can't know beforehand which scripts generate which files
(unless you're implying it ought to run everything that looks like it might be
a script and observe what outputs it writes).

However, once you've compiled a .c file the first time (or run the script the
first time), the build system can remember which inputs and outputs were
involved (either because gcc tells it which files it used, or by going lower-
level and intercepting the file-opens of the program) and reuse that
information in subsequent builds.
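
Make itself can use this trick, since gcc will emit the input list for you: the
-MMD flag writes a .d file next to each object listing every header the compile
actually read, which make can include on the next run. A minimal sketch (file
names made up):

    SRCS := main.c util.c
    OBJS := $(SRCS:.c=.o)
    
    # -MMD writes main.d etc. alongside each object, listing the headers
    # actually read; -MP adds phony targets so a deleted header doesn't
    # break the build.
    %.o: %.c
    	$(CC) $(CFLAGS) -MMD -MP -c $< -o $@
    
    # The .d files don't exist on the first build; -include ignores that.
    -include $(OBJS:.o=.d)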

~~~
CJefferson
Looking at: <http://gittup.org/tup/ex_dependencies.html>

I see the following warning about halfway down the page. While I understand
the first time this happens tup has an issue, I don't understand why it can't
automatically remember it for future use (or does it?)

        *** tup errors ***
        tup error: Missing input dependency - a file was read from, and was not
        specified as an input link for the command. This is an issue because the file
        was created from another command, and without the input link the commands may
        execute out of order. You should add this file as an input, since it is
        possible this could randomly break in the future.
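
For reference, the fix that page suggests is to declare the generated header as
an order-only input in the Tupfile. A sketch based on my reading of the docs,
using the file names from my example above:

    : script.sh |> sh %f |> header.h
    : foreach *.c | header.h |> gcc -c %f -o %o |> %B.o

The part after the `|` tells tup that header.h must exist before any compile
runs, without putting it on the compiler command line.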

------
reedhedges
Oh god, this way lies madness. :) It really does. Once you get beyond a fairly
low level of complexity, there will be maintenance problems: it will take you
too long to get it right, it will be impossible to figure out when something
goes wrong, and you'll drive the next poor developer crazy if he doesn't have
a complete knowledge of make (which is most of us). Yes, sometimes it may be
necessary to do this if you want to reuse some rules and functions but modify
them under different conditions, AND you are restricted to doing it all in the
Makefile (rather than preprocessing, calling out to external scripts, etc.)

~~~
mturmon
Using "define" to set up templates for rules is pretty straightforward and
very powerful. It allows the set of rules to be augmented at runtime, which
can enable new uses of make.

I used this mechanism to automate a pipeline for scientific data. The template
(the "define...endef" block) held several rules needed to re-make results for
one granule of data. When you ran make, it looked for all source granules, and
a foreach() mapped the template across them, just as in the OP, setting up
rules for each one.

Then you could dump a new source granule in a directory, run make -j 8, and
get parallel "builds" of the results for free.

As long as it's documented, it can save a lot of repetition.
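
A minimal sketch of that pattern (the directory layout and the `process`
command are hypothetical):

    # One template instantiation per source granule found at run time.
    GRANULES := $(wildcard data/*.src)
    RESULTS  := $(patsubst data/%.src,results/%.out,$(GRANULES))
    
    all: $(RESULTS)
    
    # The template: rules for re-making the results of one granule.
    # $$< and $$@ are escaped so they survive until the rule runs.
    define granule-rules
    results/$(notdir $(1:.src=.out)): $(1)
    	process $$< > $$@
    endef
    
    $(foreach g,$(GRANULES),$(eval $(call granule-rules,$(g))))

Drop a new .src file into data/, run make -j 8, and its rules appear
automatically.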

Not coincidentally, jgc has a nice article on make debugging ;-)

http://drdobbs.com/article/print?articleId=197003338&siteSectionName=

------
mturmon
A similar version of this idea is also in the GNU Make manual, at:

http://www.gnu.org/software/make/manual/make.html#Eval-Function

