GNU Make Standard Library (jgc.org)
164 points by ingve 7 days ago | 78 comments

GNU make now has a load directive which lets you load up functions written in C:

    load my_module.so
Your makefile can contain rules to build my_module.so, and they will be triggered automatically before the object is loaded.
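A rough sketch of how that fits together (module and function names just follow the example below; assumes gcc on Linux, and note the recipe line starts with a tab):

  # Rule to build the plugin. If my_module.so is missing or out of date,
  # make builds it and then re-executes itself with the plugin loaded.
  my_module.so: my_module.c
  	$(CC) -shared -fPIC -o $@ $<

  load my_module.so

  # From here on, $(equals a,b) can be used like a built-in function.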

For example, you can create an equality test that works inside an expression (ifeq obviously can't be used as part of an expression):

  FILENAME=file.$(if $(equals $(compression_level),0),tar,tar.bz3)
The C for this function (without includes and export directives):

  char *
  func_equals (const char *func_name, unsigned int argc, char **argv)
  {
      char *result = NULL;

      if (strcmp(argv[0], argv[1]) == 0) {
          result = gmk_alloc(strlen(argv[0]) + 1); /* not handling failure for simplicity */
          strcpy(result, argv[0]);
      }
    
      return result; /* returning NULL expands to the empty string */
  }
This can be done with a macro but it's ugly and verbose. Macros also slow makefile parsing a lot, and for a large build (e.g. an operating system) that makes a big difference - it's a penalty you pay every time you run "make", even if you only changed one file.

There are plenty of things you cannot do with macros at all. $(shell) is a get-out card, but it drastically slows down large makefiles.
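For comparison, here's roughly what the macro version looks like - a sketch using the usual $(subst) trick (the same idea GMSL's seq is built on). Note that, unlike the C version, it has to be invoked through $(call):

  # Expands to $(1) if the two arguments are identical, to nothing otherwise.
  # The leading "x" guards against empty arguments.
  equals = $(if $(subst x$(1),,x$(2))$(subst x$(2),,x$(1)),,$(1))

  FILENAME = file.$(if $(call equals,$(compression_level),0),tar,tar.bz3)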

Your module has a setup function which gets called when it's loaded; this is where the function is registered with gmake:

  int
  equals_gmk_setup (const gmk_floc *flocp)
  {
      gmk_add_function ("equals", func_equals, 2, 2, GMK_FUNC_DEFAULT);
      return 1;
  }
Things that are hard or slow to do with macros - arithmetic, comparisons, and so on - are even better candidates. A hash function is great for generating intermediate target names that aren't too long for the filesystem.
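For instance, a hash function registered the same way as "equals" above might look like this - a sketch only, with djb2 chosen purely for illustration and includes/export directives again omitted:

  char *
  func_hash (const char *func_name, unsigned int argc, char **argv)
  {
      unsigned long h = 5381;    /* djb2 */
      const char *p;
      char *result;

      for (p = argv[0]; *p != '\0'; p++)
          h = h * 33 + (unsigned char) *p;

      result = gmk_alloc (2 * sizeof (unsigned long) + 1);  /* hex digits + NUL */
      sprintf (result, "%lx", h);
      return result;
  }

  /* in the setup function: */
  gmk_add_function ("hash", func_hash, 1, 1, GMK_FUNC_DEFAULT);
It could then be used like OBJDIR := obj/$(hash $(CFLAGS)) to keep generated names short.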

My favorite one that I've done is embedding a Python interpreter into make - this is very convenient, as it's MUCH faster than spawning a process from $(shell), and it keeps state between uses, which can be useful.


GNU Make also embeds GNU Guile, a criminally underused feature:

https://www.gnu.org/software/make/manual/html_node/Guile-Int...
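If your make has it (check for "guile" in $(.FEATURES)), Scheme can be embedded straight into an expansion - a trivial sketch, with the variable name made up:

  ifneq ($(filter guile,$(.FEATURES)),)
    # The result of the Scheme expression is converted to a make string ("8").
    EIGHT := $(guile (* 2 4))
  endif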


In practice, Guile is usually not compiled in, whereas I've never seen a version of make without `load` and its supporting infrastructure.

Debian gives you the option, with equivalent make and make-guile packages. IIRC Slackware simply compiles it in (guile already being there) and Fedora/RHEL leave it out entirely.

Yes. It really should be "make" and "make-noguile", but we have what we have. In practice, you actually want the "remake" package. This is a fork of "make" that's a drop-in replacement (since it is the same code), has guile enabled AND contains an interactive debugger.

I wish remake was in the standard codebase for most distros. It’s absurdly helpful.

It is criminally under-integrated!

The only interface you get into the make internals from Guile is a function to expand a make expression, and a function to eval a makefile fragment.

These interfaces only encourage the use of Guile for nothing more than make metaprogramming, an area where more power is not needed.
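Concretely, that surface is just gmk-expand and gmk-eval (plus gmk-var) - a sketch, assuming a guile-enabled make and made-up variable names:

  # Expand a make expression from Guile ($$ stops make expanding it first):
  COMPILER := $(guile (gmk-expand "$$(CC)"))

  # Evaluate a makefile fragment from Guile; the call itself expands to nothing:
  $(guile (gmk-eval "CFLAGS += -O2"))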

Imagine if Guile had access to the rules as data structures or something. And the graph of targets to be updated, and so on.

Imagine if Guile could be used, say, to override the logic by which a target is considered out of date with respect to its prerequisites.



100% agree. With the current API, there's no real advantage to using Guile over a C extension (other than the choice of language). If the Guile interface could hook into Make's dependency graph, it would be a huge game-changer. But as it is, the Guile interface is basically a fancy macro processor.

The Guile interface can expand Makefile expressions into values, and can process/evaluate Makefile statements. But there's no way (that I've found) to do something like "remove this dependency from target X", ask "what dependencies does X currently have?", or ask "do you think X is currently up-to-date?".


I would say there is no advantage of using built-in Guile to call make's eval API over:

  $(eval $(shell <arbitrary-command>))
where arbitrary-command could run ssh to a virtual machine, running a docker container loaded with Common Lisp ...

The Guile approach stays in one process, but that has no value in the context of make, which launches external programs as its principal paradigm.


There is some benefit to Makefile processing speed, if that's a metric that affects your build.

With one or two $(shell) calls, it won't matter at all. If you start to approach dozens or hundreds of calls, the extra overhead of all those shells can start to be noticeable. Especially if your Makefile is automatically triggered on file changes or something.


>Imagine if Guile could be used, say, to override the logic by which a target is considered out of date with respect to its prerequisites.

That is exactly what I was recently hoping for.

Make-as-library is such a compelling idea that I feel like it must have already been done, but I searched for something like this recently and the closest I found was Java's Ant, which gets the as-library part but sadly has no concept of "target is already up-to-date"...


TIL thank you!

I sometimes wonder if we would even have autotools or cmake if people just knew about this one simple trick

Autotools is designed to solve one very important problem: how do you build the GNU tools in the first place if all you have is some obscure Unix from the 1980s. If you already have gnu make, gnu bash, gnu binutils, gnu coreutils, etc. installed then autotools is pointless.

I have yet to find evidence of cmake solving a problem (or even having design), though I guess `ccmake` would be kind of cool if it weren't full of nonsense?


One of the other things that autotools does, and cmake does too (admittedly badly), is provide a "configure" step that gives a much more controllable interface for enabling or disabling features of a program.

The problem with autoconf in particular is that it spends a lot of time trying to paper over the compatibility issues of ancient Unixes, whereas modern portability tends to rely more on a concept of a portable abstract system layer. The latter means that most of the work a configure step needs to do isn't "does your system have $UNIX_SYSTEM_CALL" but instead "what OS is this."


I see it both ways.

On the one hand, sure, Windows vs macOS vs "Linux/BSD" are so very different that you mainly need OS detection.

On the other, I don't need a list of the exact featuresets of Fedora, Debian, FreeBSD, NetBSD, OpenBSD, DragonFly BSD, MorphOS, etc. so I can write narcissism-of-small-differences shit like:

   #if defined(SUNOS_V4) || defined(_MSC_VER) || defined(GNU) || defined(FREEBSD)
   # include <strings.h>
   #else
   # include <string.h>
   #endif
I will _gladly_ run a feature test and write:

   #if HAVE_STRINGS_H
   # include <strings.h>
   #endif
   #if HAVE_STRING_H
   # include <string.h>
   #endif
That is soooooooo much cleaner

Speaking of, I am using https://zolk3ri.name/cgit/m4conf/about/ so I do not have to use anything more bloated or complex.

This came up with musl, IIRC, and was problematic: the existence of a named header doesn't guarantee it contains what you think it does

You can configure things with cmake! All you need to do is

1. Figure out what variables you want to change

2. Add a -DVAR_NAME=value parameter to the command-line!

...which sucks to do.

Meson is a much better way of doing things, but even that falls into "Here are the supported flags and settings, and then here are variable names for everything else"


Even with all the GNU tools available there are still a lot of system-specific things that you may need to know: supported C version, supported C++ version, how to invoke the compiler, correct compiler flags for warnings / desired C or C++ version / etc, where to install things, how to install them and set the right owner and permissions and many many more. Autotools (and cmake) can figure all of that out for you. If you operate in a monoculture and, for example, only deal with a single Linux distribution on a single architecture most or all of this may not be relevant for you. But if you target a more diverse set of environments it can save you a lot of headaches.

I see Autotools as sort of cool if you're porting to a new platform - the tests find out what works and if that's not enough then you have to make some effort to get around it. If you're lucky, however, you put your autotooled code on some completely new OS/hardware and it just builds.

Nowadays the proportion of people who are porting the code is probably much smaller, but it's still a way of keeping code working on unix across various compiler and architecture variations.

IMO cmake just handles Windows more effectively - with autotools you'd probably be forced down the Cygwin route. I find it a bit easier to work with, but it's still a nightmare from hell sometimes.


Though there's also gnulib, which as part of the autotools process simply replaces system functions with its own stubs. It was a great idea, briefly, and then it became a fiasco.

Make's BIG problem (IMO of course) is that the commands are executed in the system shell.

If make supplied its own shell language, a simplified one, then everything would be fantastic.

For one thing, cross platform builds would be much easier to get working as there would be no issue about "is the default shell bash or ash or sh or dash or ksh or whatever" and on Windows there would be no need to use cygwin.

The other thing is that there would not be such a huge clash between the way expansion works in shells versus make, which is very confusing when you combine the two.


> If make supplied its own shell language, a simplified one, then everything would be fantastic.

We did exactly that in build2, a modern make re-thought. And we provide a bunch of standard utilities like sed, find, etc., that work the same everywhere, including Windows. Here is an example of a non-trivial recipe: https://github.com/build2/libbuild2-autoconf/blob/17f637c1ca...


There's nothing stopping you from specifying an explicit shell in your Makefile (see: https://www.gnu.org/software/make/manual/make.html#Choosing-...).

You could set it to Ash, Bash, Perl, Python, execline, Docker exec, or whatever you want really. You can also set that variable on a per-recipe basis, if you have one recipe that needs a custom shell.
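For instance (a sketch - paths and target name made up, recipe line indented with a tab):

  # Default shell for every recipe
  SHELL := /bin/bash

  # Target-specific override: this one recipe is run as `python3 -c '...'`
  report: SHELL := /usr/bin/python3
  report:
  	print("hello from a Python recipe")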

(note to any GNU Make maintainers who might see this: it would be really helpful to be able to set .ONESHELL on a per-recipe basis as well!)


I do in fact always do that. SHELL:=/usr/bin/bash or whatever

The problem is that bash isn't there on Windows, and not even on all Linuxes, so if I supply you with a makefile there's no way to be sure you can use it.

If make included a language then I wouldn't need to worry about your dependencies - not even which version of bash you have. That would make a makefile FAR more useful.


Or even Emacs, as Amy Grinn showed last week at FOSDEM.

https://fosdem.org/2025/schedule/event/fosdem-2025-5139-org-...


Yes, but where do you stop? In a make shell one would routinely call rm, sed, find... Should they be included too? So instead of make including a shell, it would be simpler if busybox included a make.

> In make shell one would routinely call rm, sed, find... should they be included too?

If you want to use it as a build tool, yes. The most successful build tools build hermetically, with the build definition specifying the versions of anything and everything that is needed to perform the build. (With Maven you even specify e.g. what version of the Java compiler to use, and it will download and use that version).

> So instead of make including a shell, it would be simpler if busybox included a make.

Can busybox be used portably from e.g. a user's homedir? I've only ever seen it used as the main system in /bin, etc.


Well, I believe make is already the most successful build tool, at least on every platform I care about (i.e. any variant of the unices, plus a few more).

What you are describing looks like packaging more than building. Pinning the versions of everything is not the build tool's job.

I understand the standpoint of software publishers who want to limit the number of environments they have to support, but proprietary software is not the use case that every tool should be optimizing for.

When a Nix user or a Gentoo user decides that she wants this version of library X with this version of library Y, it's not make's job to override her decision, is it? We need some flexibility.


> When a Nix user or a Gentoo user decides that she wants this version of library X with this version of library Y, it's not make's job to override her decision, is it? We need some flexibility.

The user should absolutely be able to override it, but library X's build system should have some defaults and changing them should be a deliberate choice. "Build against whatever happens to currently be installed, and hope you get lucky and wind up with something that works" is not a great build experience.


Not for Windows

That's great. Thanks for pointing it out.

Delighted! :-) Glad someone found it a help

I’ve written a fair amount of Makefiles and bounced off of cmake. Recently I’ve started using zig to build some C++ projects and will not be switching back.

Having your build tool just be a library in a good general purpose language is the right move. Why use unergonomic hacks like the OP when you can use a sane language? My build.zig files have LSP completions and similar syntax (or the same) as what I’m building.

I put make solidly in the pile of tools that are only still around because of network effects. We'd have switched to something better if it weren't de facto installed by every distro and we didn't have to justify installing alternatives to sysadmins.


Well, many of us certainly aren't happy living with the annoying syntax and arcane customs of CMake. But - it does get the job done, and incorporates large amounts of arcane knowledge, multiplied over numerous platforms - so that you don't have to. Can I get the same from a Zig-based build system, or build system generator? Can you perhaps link to some comparative review of using zig for building, with the use of CMake?

I don’t have a review, but here’s a significant project using it in a complicated cross platform build with different systems dependencies: https://github.com/raysan5/raylib/blob/master/build.zig

They also have a cmakelists.txt and pile of other cmake stuff to compare against: https://github.com/raysan5/raylib/tree/master/cmake

One of the nicer things is that if you’re working with less technical folks, they only need to download the zig binary because it bundles the entire toolchain. Real nice when on corporate windows systems.


> they only need to download the zig binary because it bundles the entire toolchain.

But that can't be possible... neither in principle nor in practice, because each project needs a different toolchain. Different languages, different tools, different target platforms etc.


Well it’s working in practice so I don’t really know what to tell you.

> because each projects needs a different toolchain

Not true.

> Different languages

I don’t think a caveat for C/C++/zig source files was needed…

> Different tools

If your project is unwilling or unable to verify builds on anything except an ancient version of GCC, that's your project's problem. In practice that's a minority.

> different target platforms

Zig has 1st class cross compilation to all major platforms.

If you’re using an embedded target you’ll need to vendor its linker script and boot code which is already the common practice.

I encourage you to try it before dismissing it.


Same reason I like Clojure tools.build. Doing it any other way feels a little ridiculous.

I have written some large build systems entirely in Make. More complex things tend to rely on templates, but you can build arbitrary things, with two main limitations:

The error messages are awful, particularly if using templates. "missing separator. Stop." is a classic, with no useful indication of where in your 2k lines of Make the problem might be.

You can't have file or folder names with spaces in them (usually including any parent folder of wherever your code is checked out). A "list" in Make is a series of strings separated by spaces. There are various workarounds that people suggest, but none of them work consistently across platforms. You just have to avoid spaces. At least this is less bad since Windows stopped using "Documents and Settings" for "home" folders.
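A quick sketch of why (filenames made up) - word-splitting has no quoting, so a space silently becomes a list boundary:

  SRCS := main.c my notes.c     # meant as two files, one containing a space

  count:
  	@echo $(words $(SRCS))    # prints 3: "main.c" "my" "notes.c"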


There's now a debugger for GNU Make (`apt install remake`, a drop-in fork) that eases your first pain point a lot

Even then the problem with doing complicated stuff in Make is that it's very hard to reproduce the environment that triggered the bug in the first place.

I came to the conclusion that you need to treat the whole build system as a linear process, which in Make would mean, for example, not using "=" at all except to define functions, and only using ":=". With this kind of discipline I never really needed a debugger, but really Make is not the right language to write complex logic in.
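A sketch of the difference that discipline avoids (variable names made up):

  BAR = one
  FOO  = $(BAR)     # recursively expanded: re-evaluated at every use
  BAZ := $(BAR)     # simply expanded: evaluated once, right here

  BAR = two
  # $(FOO) is now "two"; $(BAZ) is still "one"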

Personally I am a fan of (and contributor to) Meson. It's not perfect but it strikes a good balance between what is in the basic package and what you can do in your build script, and by keeping the build phases separate it really helps with keeping things understandable. The lack of functions can be annoying (and in general I wish it could use Starlark as the language), but it doesn't hurt if you follow the same principle and treat the build script as a data flow process, with each phase producing data structures for the next one. So I think it's generally a good principle to follow.


I guess this is here because it's been 20 years and I blogged about it on Monday: https://blog.jgc.org/2025/02/twenty-years-of-gnu-make-standa...
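For anyone who hasn't looked at it: usage is just an include plus $(call ...); seq (string equality) and plus (integer addition) are two of the functions it provides - for example:

  include gmsl

  # string equality usable inside an expression
  FILENAME := file.$(if $(call seq,$(compression_level),0),tar,tar.gz)

  # integer arithmetic without $(shell expr ...)
  NEXT_BUILD := $(call plus,$(BUILD_NUMBER),1)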

> The GNU Make Standard Library (GMSL) is […] released under the BSD License.

That’s mildly interesting.


Make is just the epitome of software development:

Started as a simple idea

Required more sophistication

Became something no one person really understands


Did you try reading the manual?

Can't find it on youtube, link?

sarcasm?

Indeed. Should delete but it's too late.

Certainly not! That would be a faux pas.

When software requires you to read the manual, that is a strong hint that it has poor UI/UX.

Like, you don't read a web browser manual to use it, even when using advanced features like the debugger or dev console (advanced compared to what a non-computer person does).


Make is a power tool and power tools require effort to fully understand and master (though the base case in Make is surprisingly simple). It also has great documentation (which is something newer generations either don't appreciate or don't care about).

A web browser is a 10 times more powerful tool than Make.

And if you want to use it as a build tool, you're going to have to crack open some books.

The manual has everything and is not long. It is actually very concise, but with the caveat that within each page you have to read everything. Every sentence of the manual has important information.

The GMSL is a godsend for those of us stuck on GNU Make 3.81 for... "reasons"

("reasons" being on macOS and limited to Xcode Command-Line Tools, which provides a number of GNU tools frozen to the last version that's GPLv2, before GPLv3.

Incidentally, that's also why macOS switched to Zsh by default, because Apple was tired of backporting security fixes to their Bash 3.2.57

/rant)


I can't even describe how much frustration this caused me when I worked at the RIOT project. macOS was by far the most problematic development platform to support. Even Windows was far friendlier since it has WSL which is officially supported.

In most situations (not mine), you can likely depend on MacPorts, Homebrew, or Nix to provide more up-to-date versions of the tools you need than what Xcode provides. In that way, it's similar to WSL in that you can augment your environment to get the proper Unix tools you expect.

OP FYI: US copyright law doesn't recognize or require a range of years, only the date of first publication. Many organizations have decided to omit (the) year(s) altogether. https://blog.ntpsec.org/2020/02/15/copyright-year.html

Check the first comment on the linked page for counterpoint.

Also note that copyright laws exist outside the US and may differ.


Good to know. I tend to use them as markers of "I started working on this in year XXXX and I last worked on it in year YYYY".

Correct me if I'm wrong, but I don't think the copyright statement actually has any legal effect anymore (Berne Convention of 1886 / 1989 for the US) so it doesn't really matter if there's a year or not. I think it's just informative.

One of my pet peeves with make is that it handles special characters in filenames (space, semicolon) very badly. Would this library help with that?

Some time ago, I tried to convert a music library, and ended up bulk renaming the files before and after running my makefile.


Okay who will be doing this year's Advent of Code in GNU Make?

What we really need is a C to GNU Make transpiler written in C just so it can translate itself.

But what system would you use to build that?

GNU Make and GCC, of course.

I wonder how thread-safe the memoisation and/or ALists are? Make's version of parallel processing is a very fun little thing, but it has a few quirks around recursion limits that can bite you when going off the beaten path.

[0] https://www.gnu.org/software/make/manual/html_node/Options_0...


Why doesn't someone just build good syntax into make itself?

just put some coded comment at the top of the file like "#makev2" and below have variables, arrays, lists, ignore tabs vs spaces, etc


Having dealt with building a single meson project I can safely say I am glad gnu does not seem to be continuing down that path.

If it's not distributed with the tool, it is just "a make library".

Just ran across this last week and pulled it into my Makefiles. Really nice.

       git clone git://bitreich.org/english_knight
You don't need more.

I wish everyone used mk instead of bloated software.

Unix grew up too much.


That is one seriously funky website; I will not recover from the experience soon.

For anyone who doesn't want to clone the above repo, english_knight turns out to just be a template for idiomatic makefiles. I pasted the relevant file from the repo here:

https://nopaste.net/english_knight


Well, you should access bitreich over gopher :)

gopher://bitreich.org

Much better.


Wut? No telnet? #headduck

You're right. That was less traumatic, but still fun, heh.


