Future Plans for Autotools (gnu.org)
160 points by edward 4 months ago | 214 comments



I've developed negative feelings toward autotools over time, but I'm not sure how much I should be blaming autotools versus the projects that use autotools.

One annoyance is that it often feels like "./configure ..." takes longer to run than the actual "make".

But my main annoyance derives from the fact that most projects I build from source which use autotools don't build cleanly on anything but Linux x86/amd64. I spend a lot of time playing with non-Linux and/or non-x86 Unix systems, or using vendor compilers instead of gcc -- primarily Solaris (both x86 and SPARC), Linux on POWER, and AIX on POWER. Current versions of IBM's compilers, Sun (err...Oracle's) compilers, etc. Run the configure script, let it slowly probe all the little corners of whatever it's probing, and sometimes I'm lucky enough that it completes successfully and claims to understand the environment it's in. Then run make (or, as is often required, GNU make), and hit build errors very quickly.

Which makes me think... why did I just sit through that long "configure" script run, and why did the developer bother messing with autotools, if this project isn't even going to bother targeting anything but Linux with the GNU toolchain to build it? If you're not concerned about portability, it seems a relatively simple Makefile would be an easier and faster approach. Why foist the [insert your favorite colorful descriptor here, depending on your personal feelings] autoconf system on builders of your project?


> Which makes me think... why did I just sit through that long "configure" script run, and why did the developer bother messing with autotools, if this project isn't even going to bother targeting anything but Linux with the GNU toolchain to build it?

Even if the target is just Linux with the GNU toolchain, there are still plenty of issues that need to be managed by the configure script: tests for mandatory and optional libraries / header files; some functions live in different libraries on different systems (e.g. tinfo vs ncurses, or librt); different versions of kernel headers; different versions of GCC (older ones may not support some options).


> tests for mandatory and optional libraries

I never understood the concept of "optional" library. Either your program requires a library or it doesn't. If it requires the library, the easiest thing to do is to assume that it is already installed and just link to it. In case it is not installed the compiler will fail with a very clear error message.

Say, your program is an editor of png files and it requires libpng. Thus in your code you will have

    #include <png.h>
and in your makefile you will link your executable with -lpng. And that is all! This works in linux, bsd and macOS. What the fucking else do you need?


>I never understood the concept of "optional" library.[...] In case it is not installed the compiler will fail with a very clear error message. [...] and in your makefile you will link your executable with -lpng. And that is all! This works in linux, bsd and macOS. What the fucking else do you need?

Interesting condescending tone there. I didn't downvote but if I can try to answer the core part of your misunderstanding: Many do _not_ want the build to fail when library png is missing.

Example projects using this type of build optionality are FFmpeg and Qt.

For example, ffmpeg lets you build with the optional Fraunhofer FDK AAC (libfdk_aac)[1] encoder but it is not GPL license compatible. If everyone followed your "assume libfdk_aac is already installed" advice, many people would be stuck at the "very clear error message" and not be able to have a working GPL ffmpeg.

Same idea with Qt optional libraries.

[1] https://trac.ffmpeg.org/wiki/Encode/AAC
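From memory, the difference is just a couple of configure switches (check ./configure --help for the exact spellings):

    # GPL build without the Fraunhofer encoder:
    ./configure --enable-gpl
    # non-redistributable build that adds it:
    ./configure --enable-gpl --enable-nonfree --enable-libfdk-aac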


Come on, then just edit a configuration file specifying what libraries you want to use and where they are! /s


Ok, humor me for a second: What's wrong with that?


I'd paste the output of ffmpeg's ./configure --help here to illustrate the point, but it would be a rather hostile HN post due to its sheer size. It's not really practical to expect someone to configure dozens or even hundreds of optional libraries in a configuration file. It would be a major hassle to have to get a config file correct rather than config detecting as much as possible. ffmpeg is an extreme case, but it's proportionally not that uncommon; many of the things people most want to compile by hand are naturally the things with a ton of optional libraries. The easy cases with 0 or a handful of options are handled by your package manager and you don't care, because there aren't any "options" for you to disagree with.


It's not "wrong". It's a choice.

You can provide some code and let the users do whatever is needed to get it to compile on their system.

Or you can try to automate the process with the best of intentions and create a monster.

In any case, the reason why autotools exists in the first place is that people were less than thrilled with the work required to make things work by hand.


Users should ask for something, and the build should build that, or fail if it can't.

Randomly linking against things if they happen to be available is a terrible idea.

Does any build ecosystem except C do this? I'm not sure even CMake does this.


Who is the user? Most "users" never build 99% of what they use from source. Sure, they'd like control over the few things they do build. But randomly linking against things is the default because it makes the package maintainer's job easier, and it is hard enough as it is to get people to maintain and update packages for distributions.


> Ok, humor me for a second: What's wrong with that?

There's absolutely nothing wrong with that, and in fact it's so simple that afterwards we can even write a script to automate setting those variables.

Who needs convoluted build systems anyway?


My sarcasm detector is on the fritz, what you are talking about is autotools, correct?


Say, your program is an editor of png files

Say you are writing a program to edit images that can potentially read lots of different formats. However most people don't need support for .sgi or .ecw files and won't have the libraries to read those formats installed. Your program is still perfectly usable without those libraries and so you should still be able to compile and run it, just without support for those formats.


Then you end up with a program where the users can never trust it to have any given feature. They use it all the time to edit .sgi files, and then install it on a new computer, and bam, suddenly, no support for .sgi files.

Or even worse, it is used as a dependency in a script, or it is a library linked by another program, but the author can never trust that any given installation has the feature he actually needs.

It is a mess. Just don't do it.


The disadvantage is very mild, and the advantage huge (modularity and ability to easily go embedded). If your dependency manager cannot express "program[+feature]", it is your dependency manager's problem.


I do not know of a single dependency manager out there that can express that.


MacPorts' variants:

* https://guide.macports.org/chunked/reference.variants.html

* https://guide.macports.org/chunked/using.variants.html

The default / most common variants have pre-built binaries that are downloaded via a port install, but if you specify something more 'obscure', things are built from source.

FreeBSD's Ports also allow for options:

* https://docs.freebsd.org/en/books/porters-handbook/#makefile...

Build-time and run-time dependencies are both handled.


> a single dependency manager out there that can express that.

If it existed, it would be a nightmare, because features are recursive, and you would end up with fragile monstrosities like

    foo[+bar[+baz[+qux],+quux]]
of course, each variable having version numbers and all


ebuilds¹ use essentially that syntax, as does setuptools². Others support it too (nix, bitbake, etc.), but I have bookmarks on the first two ;)

¹ https://devmanual.gentoo.org/general-concepts/dependencies/i...

² https://setuptools.readthedocs.io/en/latest/userguide/depend...
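For a concrete (made-up) illustration, a Gentoo dependency spec with USE conditionals looks roughly like this:

    # require ffmpeg built with the vorbis flag; pull in libpng
    # only when our own "png" flag is enabled
    DEPEND="media-video/ffmpeg[vorbis]
        png? ( media-libs/libpng )"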


Nix and guix.


Exactly.

By the way another thing I've come to hate is configure or make scripts which download stuff. This is really annoying if you want to build self-contained packages. Or if the downloaded material ends up being not available, or if you're not always connected to the internet.


> Then you end up with a program where the users can never trust it to have any given feature.

That's the "glass half empty" pov.

The "half glass full" pov is that anyone can still use their favourite piece of software even if an obscure and/or useless library is not available to them.


I like the idea, but the reality is that it fosters extreme complexity on all kinds of axes for marginal benefit.

Sure, for hobbyists it is fine but professional use.. I disagree.


> I never understood the concept of "optional" library.

Consider a scientific software package; let's call it X. X can solve many problems, and these problems cross-cut a lot of disciplines.

A researcher who uses X may need only a subset of these features, and configures & compiles X with just the features he/she needs. This has many advantages, from simpler setup to shorter builds to easier verification.

When you're compiling software on remote, restricted systems, these optional components make people's lives a lot easier. Also, especially in the scientific community, every library needs something lower level, so trying to compile everything in leads to an explosion of dependencies. Optional components solve this very neatly.


Another similar use case that is quite common is that your tool can use one of several different calculation kernels. For example there is one option that is fast but closed source, one that is open source but slower and one option that only works with Nvidia GPUs. Most people won't have all three options available, but as long as you have one of them the software works fine.


> I never understood the concept of "optional" library. Either your program requires a library or it doesn't.

The program may have an optional feature that requires a library. It does not make sense to force everyone to install it if such a feature is used by 1% of users. An example might be mplayer, where there are common decoding libraries used by everyone and also plenty of obscure decoding libraries.

Even better is to use dlopen() to open the optional library at runtime, to fit the distribution package management paradigm, but this requires much more work than a simple option to disable the dependency at compile time, and even with it, it is useful to be able to disable the optional dependency at compile time to avoid the requirement for header files.
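A rough sketch of that dlopen() pattern (library and symbol names made up; link with -ldl on glibc older than 2.34):

    #include <dlfcn.h>
    #include <stdio.h>

    /* stays NULL when the optional codec isn't installed */
    static int (*obscure_decode)(const char *path);

    static void load_optional_codec(void)
    {
        void *h = dlopen("libobscurecodec.so.1", RTLD_NOW | RTLD_LOCAL);
        if (!h) {
            fprintf(stderr, "optional codec disabled: %s\n", dlerror());
            return;
        }
        obscure_decode = (int (*)(const char *)) dlsym(h, "obscure_decode");
    }

The rest of the program just checks the pointer before offering the feature, so the distro package needs no hard dependency on the codec.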


> Say, your program is an editor of png files and it requires libpng. Thus in your code you will have #include <png.h>

In addition to what other commenters wrote (failures later during compilation instead of at the start), you also need the proper include path in CFLAGS.

For example, gdk-pixbuf is supposed to be included by '#include <gdk-pixbuf/gdk-pixbuf.h>'. That would assume the header file is in /usr/include/gdk-pixbuf/gdk-pixbuf.h . But the distribution I use installs it in /usr/include/gdk-pixbuf-2.0/gdk-pixbuf/gdk-pixbuf.h , likely to be able to have multiple versions of header files installed together.

You are supposed to use the 'pkg-config' tool in the configure script to add the proper header file locations to CFLAGS to handle this (in this case -I/usr/include/gdk-pixbuf-2.0).

> and in your makefile you will link your executable with -lpng.

The library you are using may itself depend on another library; you need to add -l flags for all recursively dependent libraries to the linker. As this set of recursively dependent libraries may change in the future, you are supposed to use pkg-config to find the proper link flags.

Note that all of this is not really dependent on configure; you can do it in a simple makefile (by just calling pkg-config directly from it). So it is more about using pkg-config instead of hardcoded paths than about using configure scripts.
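For instance, a plain GNU makefile fragment along these lines (reusing the gdk-pixbuf module name from above):

    # let pkg-config supply the include paths and the full set of -l flags
    CFLAGS += $(shell pkg-config --cflags gdk-pixbuf-2.0)
    LDLIBS += $(shell pkg-config --libs gdk-pixbuf-2.0)

    # GNU make's built-in rules then compile and link with those flags
    viewer: viewer.o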


> proper include path in CFLAGS

This makes no sense. Preprocessor options have no place in CFLAGS, which is for compiler options. I guess you meant CPPFLAGS? Putting a preprocessor option on CFLAGS is always wrong and expected to fail.

> But distribution that i use installs it in

If your distribution installs headers outside of the preprocessor path (or somewhere inside CPATH or C_INCLUDE_PATH), then I'd say that this distribution is broken: it has not installed the library, just copied the files somewhere. The makefile is then, correctly, expected to fail. It's just as if you had unzipped the library on /tmp, of course you would not expect the compiler to find it!


> This makes no sense. Preprocessor options have no place in CFLAGS, which is for compiler options.

Technically true, but pkg-config does not distinguish between CFLAGS and CPPFLAGS.

> If your distribution installs headers outside of the preprocessor path (or somewhere inside CPATH or C_INCLUDE_PATH), then I'd say that this distribution is broken

Both Debian and Red Hat use this path. These libraries are designed to be located with pkg-config, so you can install them anywhere, and everything works as expected.


This kind of opinion is why the dream of Desktop Linux will keep being a dream.


Is it though? How so?

Your parent poster IMO has a point: seems that part of the reason for autotools existence is to also cater to all distributions choosing weird places to put headers and libraries in. Shall we accommodate them forever?

As for Linux desktop, personally I'm skipping it due to the complete lack of security and sandboxing in X11: basically every program can be a keylogger is the typical egregious example. And Wayland is kind of funny; people are still trying hard to make it be 60FPS capable which is absurd in 2021.


Sure, it can be simple.

To give you a less clear example, think about writing a library that sends telemetry to some cloud service. The telemetry could be readings from a sensor attached to a microcontroller with 512K of RAM or it could be readings from a server sitting in a datacenter with 256GB of RAM.

You need some helper library to handle the protocol and there's a really nice full-featured library that comes with a bunch of handy debugging tools but it's too memory-hungry for that tiny microcontroller.

Another option is a minimalist library that uses very little memory but also has less flexibility in, say, TLS.

If you pick just one, one of your platforms ends up suffering needlessly.

Adding the options allows the user to decide what's best for them.


Sure, you can have different make targets if you really need to. Or Makefile.linux and Makefile.embedded; everybody understands what to do in that case.

I simply don't understand the explosion of complexity associated with autotools, cmake, and the like.


Because the cartesian product of options quickly explodes and you're suddenly managing 256 targets or makefiles (though even in Make there's a better way to handle this, albeit made painful by the syntax). It's very common for larger pieces of software to have 30-50 different configuration options for building.


> I simply don't understand the explosion of complexity associated to autotools, cmake, and the like.

If you refer to tools like cmake as "explosion of complexity" then I have to say that you are using them entirely wrong, and you should review your practice thoroughly.

I mean, cmake is a makefile generator. You specify a high-level description of your project, and then you run cmake for it to generate your Makefile that performs the whole build.

With cmake you don't even care which compiler you use. You can simply state you require, say, C++14 with constexpr support and you're done for all platforms and all compiler picks.

If you believe cmake is more complex than a makefile then either you only work on single-file projects or you have been cursed with hideously managed cmake projects.


It's not just about libraries.

Autoconf for example can abstract across different compilers from different vendors, with all their various incompatible command-line arguments. And it does that for a dozen or so different languages, not just C.

It can abstract across different function signatures in the libraries you're using. For example if you're calling a libc function where the arguments have changed, you can detect which version of the function you have and make your code work with either one.

It can do the same for structs and types as well.

It can test for system services, such as X Windows or the ability to run Perl scripts.

It can abstract over the differences between different implementations of common utility commands such as awk, grep, install, mkdir, and so on.

It does a lot of stuff!


> It does a lot of stuff!

Of course it does. My point was that I've never found it did anything that I was interested in.


Then it wasn't written for you. It was written for people who want to distribute software to users who do not all run the same Unix operating system (not to mention the radically incompatible hardware of the day), and whose Unix vendors are all mutually hostile to any form of cooperation with each other. If all you ever install your software on is Ubuntu running in a VM at your cloud provider, then of course this will all be irrelevant to you.


Large projects can take dozens of minutes up to hours to compile. It's extremely frustrating if the build system doesn't tell you in advance if it expects the compile to work. Having an error message once every 20 minutes, adding a package and restarting the build is terrible. (Also, for package systems like DEB or RPM, you can't just continue a compile that half-finished.) Now imagine you're a Linux distro and doing that for thousands of packages...


> It's extremely frustrating if the build system doesn't tell you in advance if it expects the compile to work.

I have only found this behavior in three different projects that used cmake. After several minutes of beautiful compilation lines in green, it turns out that the library (that cmake was supposed to check) was not installed after all. Of course it printed a beautiful error message in red. Beautiful but non-informative, that is.


And what if your program needs a version of libpng above a certain version? And maybe below another one? That can give compile errors that are quite a lot harder to grok.


One of the frameworks I use optionally supports a commercial library. If the library is present it enables support, if it isn't it just compiles without it.

This is a nice feature since it avoids a lot of manual configuration. The downside is that support fails silently if the library is missing - since that is the expected case for most users.


the `./configure` often tests edge-cases from the 90s and early 2000s, that are not relevant anymore, but everyone is afraid to touch any of that.

...because almost nobody can really write any code in M4sh (M4 with bash spliced together) and other odd languages.

Really, the e-mail from the actual link is surprisingly sober and calls out the weaknesses of autotools well.


> ... but I'm not sure how much I should be blaming autotools versus the projects that use autotools.

Definitely the latter. Few projects treat `configure.ac` and its macros as seriously as programming should be treated. They produce spaghetti code, duplicate code, patched code, slow code, unmaintainable code, and they leave it there for decades, only adding code to make it worse, without ever feeling the embarrassment to fix it. Of course, blame the coder before blaming the language.


I dunno, if a majority of projects using X come out bad, eventually it is a fault in X, not just bad users.


Not really. As an example, the configure script will often output a "config.h" header that will have a bunch of preprocessor macros that the developer can use to make decisions about what platform features are supported.

So you might have HAVE_FOO and HAVE_BAR in config.h. But if the developer doesn't do something like:

    #ifdef HAVE_FOO
    int ret = foo();
    #elif HAVE_BAR
    int ret = bar();
    #else
    int ret = do_some_fallback();
    #endif
... then that's on the developer for just working on a system that has foo() and doesn't consider other systems that only have bar(), or have neither.

Of course, that assumes they even put the tests for foo() and bar() in configure.ac in the first place!

I would still use autotools for a C project even if I was only targeting linux/x86_64, just so I could write a simple declarative Makefile.am file that lists sources and defines target outputs, and have autoconf/automake figure out everything else, and put it behind a standard interface (that is, one that respects prefix, destdir, etc.). Also being able to easily write tests for dependencies and get the appropriate cflags and ldflags makes life easier.
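For a small project, the whole declarative setup can be as short as this sketch (project and function names invented):

    # configure.ac
    AC_INIT([frob], [1.0])
    AM_INIT_AUTOMAKE([foreign])
    AC_PROG_CC
    AC_CHECK_FUNCS([foo bar])      # defines HAVE_FOO / HAVE_BAR in config.h
    AC_CONFIG_HEADERS([config.h])
    AC_CONFIG_FILES([Makefile])
    AC_OUTPUT

    # Makefile.am
    bin_PROGRAMS = frob
    frob_SOURCES = main.c util.c

and the generated Makefile gives you prefix, DESTDIR, out-of-tree builds, make dist and make distcheck for free.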


Yes really, if everyone does it wrong, it means something is lacking to put those people on the right track from the start.


Can you supply said statistic?


Are there any good examples of configure.ac macros that would be useful to study?


So you're saying it's the PHP of build tools?

...that kinda works, actually.


Yeah, the autotools model of portability didn't stand the test of time, but there's no shame in that. It's in good company. Isn't CMake the usual tool these days?


Autotools, during its lifetime, has provided tremendous value across innumerable scenarios and systems. What does it matter if it didn't stand the "test of time" — especially if it shone a light for its successors to follow?

The Autotools model of probing for system capabilities at build time isn't dead; it will never die, it will only be used in greater and lesser amounts and to complement other means of discovering system capabilities.

CMake seems to me like a reimplementation of the Autotools model of build-time probing. The true competitors are binary distribution after building on precursor systems with full specification and no probing, and cross-compilation with full specification and no probing.


Saying that something is obsolete isn't a value judgement -- it's an observation of the differences between the state of the world when it was necessary and now. This describes autotools to a 't'. It's not that there are that many fewer build targets these days, but rather that there are many many many fewer installations of most of those build targets these days.


Cmake is only useful in very limited scenarios. I do maintain lots of projects with both, and cmake is always the second-class citizen, for msvc users only.

Serious users use autotools (with mingw), but the occasional windows noob cannot be helped. So he gets his limited cmake experience. Cmake usually only does a tenth of the autotools probes, so edge cases will never be detected and worked around, such as broken compiler versions (frequent with gcc or icc). Cmake is a Visual Basic-like hack, but Visual Basic at least had a proper design; it just lacked the library infrastructure.

Everybody uses autotools, show me a cmake-based distro.

All this autotools bashing is hilarious. Probing is costly yes, but you only do it once, and you rarely do it by yourself. Distro packagers do that for you.


This is a complete mischaracterisation of CMake usage.

Today, it's better at portability than most Autotools builds I've used. In projects which maintain both systems side-by-side, I've had Autotools fail on Solaris, MinGW and Cygwin while with CMake it worked without trouble. And that's for projects which hadn't specifically tested and fixed defects for portability issues on these platforms. Empirically, CMake had better platform coverage.

As for the probing of features, take an actual look at what CMake provides. Over an order of magnitude more feature tests. The problem with the Autoconf feature testing is that the vast majority of the tests are 15 years obsolete at this point. The CMake tests are mostly for contemporary platforms. That's where the value lies.


> Everybody uses autotools, show me a cmake-based distro.

The LLVM project exclusively uses cmake.


> Yeah, the autotools model of portability didn't stand the test of time, but there's no shame in that.

xmkmf joined the chatroom


> Which makes me think... why did I just sit through that long "configure" script run, and why did the developer bother messing with autotools, if this project isn't even going to bother targeting anything but Linux with the GNU toolchain to build it?

There's more than just compiling, it adds a bunch of common stuff "out of the box" (with a line or two of configuration). Test runners, pkg-config, installation, cleaning, out of tree builds, etc. Nothing magic that you couldn't do yourself, but IME by the time you do it your makefile is much less simple, it's taken a non-trivial amount of time and possibly has a bunch of subtle errors. Even on my own little projects that will never run on another computer let alone another OS I find autotools worthwhile.

If I had one major complaint it's that it still puts build artifacts in the root directory.


> One annoyance is that it often feels like "./configure ..." takes longer to run than the actual "make".

Agreed. The configure script doesn’t take advantage of concurrency, but make can.

But the number one cause of slow configure is that people forget to run configure -C. The -C option makes it cache the outcome of all of the tests, so that it doesn’t have to do them all again. This is a huge time-saver!


Since ./configure just probes the compiler, it could be parallelized. Make has concurrency (-j4); why doesn't ./configure have the same feature?


It’s also a shell script. It’s a shell script written purely in POSIX-compatible sh, that is generated by the worst M4 script ever written. The information which will be printed to stdout, to the log files, to the cache files, to the status file, and so on all needs to be buffered until the end of any asynchronous task, then written atomically so that it can be understood and will work consistently. That’s synchronizing writes to six or seven different file descriptors, just so that everything comes out in a comprehensible order. I bet you're already thinking about how to implement it though.

And it does quite a lot more than just probe the compiler.

And finally, it doesn’t support concurrent execution because nobody who has tried to write it has returned. Usually it starts with confident estimates and preparations, followed by departure and sporadic reports. Gradually the reports dwindle in frequency and comprehensibility, until eventually nothing more is heard.


So it's impossible?


Because typically make produces a number of independent targets (for example, object files of different source files) while configure only produces a single output file containing all variable assignments.


Which linux x86/amd64?

Do you mean the linux that puts libraries in

  /usr/lib/x86_64-linux-gnu
or the one that puts them in

   /usr/lib64

?


who cares? wherever they are, the linker finds them when it sees the option -lpng


I would assume autoconf is just checking for existence of packages, but yes I know other tools like cmake will actually try compiling/linking something to see if it's available.


As someone who ran a fair bit of FreeBSD and Interix/SFU/SUA, autotools was much more reliable than any other build system (though CMake came close). 99% of autotools projects would build fine. Approximately 0% of "simple Makefile"s would build at all.


I really like CMake... except at some point you realize you changed one set of problems for another.

the "modern cmake" movement is a step in the right direction

It is nice that windows and linux folks can both work on a project (which I see as windows developers being able to contribute to linux software)

The reality is that it's kind of messy when you actually try to develop something with it. (but not try-to-figure-out-m4 messy)


Likewise, except at some point I worked on cross compiling and embedded systems. When I had to build something using CMake, I would tremble. Autotools may be pure hell on the author side, but it is quite useful on the user side.


Lucky you. I rarely get autotools projects to compile cleanly even under idyllic circumstances.


> [...] I'm not sure how much I should be blaming autotools versus the projects that use autotools.

I'd say that a large portion of the bad name that autotools currently have is due to their users. On one hand, developers who, as you say, do not actually use all the deductions of the configure scripts. But users (the ones who blindly type ./configure) are also a little to blame. If you complain that you have to wait while configure checks whether your machine is 32 or 64 bits, etc., please read up on how to specify "site defaults" once, and then all your configure runs will take a fraction of the time.
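A sketch of such a site defaults file, with example variable names (see the autoconf manual's section on site defaults for the ones your packages actually use):

    # installed as $prefix/share/config.site, or pointed to via CONFIG_SITE=
    CC=${CC:-gcc}
    CFLAGS=${CFLAGS:--O2 -g}
    # pre-seed an answer configure would otherwise probe for every time
    # (value assumes a little-endian box)
    ac_cv_c_bigendian=${ac_cv_c_bigendian:-no}

Combined with ./configure -C, repeat runs then spend very little time probing.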


Same for me but there's no better alternative to autotools that I can rely upon


From the point of view of an end user, I love compiling projects that use autotools. Download a tarball and then compile make install without needing to install a bunch of dependencies just for the build system. You can also trust that standard command line flags like --prefix and so on will be there, and do what you expect.

However, from the point of view of a developer, autotools are way too much for my brain. The m4 macros are inscrutable and I never felt like I had any hope of actually understanding how they work. It's one of those technologies where my only hope of getting work done is copy-pasting snippets of code I got from other people.

Anyway, does anyone know if there are alternative build systems that follow the same paradigm as autotools, but more pleasant to use as a developer?


My experience is that a lot of these projects are bluffing when they use autotools. Autotools will work very hard to make sure stdlib is there. But as soon as you compile on a more exotic system, none of the checks are of any use, and you fail in the make stage.

Maybe we should make an easy-to-use version of autotools which does nothing but accept the standard prefix, flags, etc. options. You wouldn't be able to do sophisticated configuration, but let's face it, most codebases only pretend to do that.


That's the kind of thing I was thinking about. Do they exist?

One example that someone mentioned in a sibling comment is Autosetup, which apparently uses Tcl instead of posix shell + m4. An interesting idea... Tcl is one of those languages that's small enough that it's feasible to include a copy of the interpreter together with the build scripts.


This has been my exact experience. I used autotools for a couple of projects that I wrote in C. Ultimately it wasn’t terrible but I have to say I still don’t understand most of what autotools was doing under the hood, just that it checked which libraries were installed and yelled loudly if the ones I needed weren’t available.

Since then I’ve started rolling very simple Makefiles that just call the compiler. cc will complain loudly when it can’t find the right headers, and for these simple projects I don’t need any configuration flags, so why worry about all the machinery of autotools?


IIRC from my C++ days, there's no alternative to Autotools that keeps the same pattern of configure, make, and make install.

The big alternatives I remember are: SCons, Maven, and CMake.

I liked CMake the most; Autotools was nearly unusable for me. It seemed like I needed to learn almost as much about Autotools to get productive as I did for C++.

If I'm wrong about anything, someone will come along to correct me.


If you squint, `cmake ..` looks like `./configure`. From there, you `make` and `make install`.


Part of the issue is also the lack of standardized flags. I will remember "./configure --prefix=/opt/foo" until the day I die, but "cmake -DCMAKE_INSTALL_PREFIX=/opt/foo" is something I have to look up every time (like I did just now to type it here).


That could be solved by a "configure" script that translates its arguments to ones that cmake will understand. And it seems it's such a good idea that someone has already made one: https://github.com/Richard-W/cmake-configure-wrapper


You can run ccmake or cmake-gui to inspect and change the configure options interactively (either in the terminal via ccmake, or in a "proper" UI application via cmake-gui).


You need CMake installed on your system to do "cmake ..". You don't need autotools installed on your system to do "./configure".


And then it fails because you don't have "make" or a compiler installed. Build dependencies aren't an issue as long as they are reasonably standard and easy to install.


Yes, you need autogen.sh.


You don't need to run autogen.sh. The maintainer can run that ahead of time when they make the tarball.


Depends on the system. If it was old, or the release date was a while back, you might need to do it anyway.

Many Linux distributions do it as a matter of course to ensure it's actually possible to regenerate and that it's up to date. At that point, you start to question the necessity of embedding it in the first place given that its primary consumers don't care.


To avoid the autotools dependencies, we used autosetup https://msteveb.github.io/autosetup

It's compact enough to be included along with the project. It handles the configure step, which tests compiler features and installed libraries, then generates the Makefile using your custom Makefile.in.

Basically, it's a compact set of Tcl scripts; it even includes a small Tcl interpreter, in case one is not installed on the platform.


Autosetup is used by the Fossil VCS project (written by Richard Hipp, SQLite author)

https://msteveb.github.io/autosetup/articles/fossil-adopts-a...


> ...without needing to install a bunch of dependencies just for the build system...

Ahem

    autogen.sh: command not found
Autotools should be put out to pasture. They serve literally no useful purpose at all in 2021.


You don't need autogen.sh for a release tarball - the whole point of autotools is to generate a configure script that in turn generates a Makefile from Makefile.am etc. for you.

However, if you're pulling a random commit from git, where autogen and co haven't been run on a dev system as they would be for a proper release tarball, then that workflow of course can't work.

Which is one of the points of the linked discussion - that folks clone from git rather than doing "proper" releases, with cloned repos increasingly bringing their dependencies with them. Another point being that modern "language ecosystems" a la Go and Rust have their own canonical package management and aren't really made for polyglot development and linking with locally installed libs.

I don't quite get the autotools hate; from a user PoV, it's the one build system that has worked extremely well over the decades with just POSIXly tools installed locally (make, sh, cc). The same can't be said for cmake. Not a particular fan of libtool, but arguably the invasive thing it does is a consequence of link-loaders such as ld.so still not getting lib resolution quite right in spite of ld.so's heavy-handedness (Mac OS's is saner IMO). Another reality is that Docker builds are used to shield against lib breakage.

IMO, what could be done to simplify builds is not to bring a new grand-unifying builder a la cmake, but to find common ground among GNU and BSD make, make generic "make" more powerful such that Makefile macro expansion works in more places than it does now, and rely solely on Makefiles and POSIXly/LSBly C/C++ header/macro def discovery in your source files rather than relying on automake, config.h, and -DHAVE_XYZ. Then slowly deprecate autotools and restrict yourself to target the much more uniform landscape of Linux, BSDs, and Mac OS we have today.


Autotools are _mostly_ fine from the user's perspective (run configure, make, done) and horrible from the dev's perspective.


Quite the opposite. Autotools are wonderful for devs. I would never start a new project with cmake or bazel or ninja.

Either autotools for the quality projects, or plain makefile projects for header-only-style projects. Cmake is faster, but extremely limited.


They may be of no purpose to you, but there are thousands, if not millions of people to which autotools is still incredibly useful, even with all its weaknesses and flaws.


autotools were originally designed to abstract away platform-specific UNIX differences, roughly 30 years ago.

Today UNIX and those platform-specificities don't exist anymore.

Instead we have Linux, FreeBSD and MacOS. Unfortunately, autotools haven't kept up and don't actually do anything useful to help you write portable code across Linux and MacOS.


I would love it if the autom4te requirement for Perl could be removed so I do not need to always keep a Perl installation around. (OpenSSL requires Perl, too.) If a scripting language is an absolute necessity, change it to something smaller like Lua. (Lua is part of NetBSD base.)


Perl is usually installed on most Unix or Unix-like systems; Lua not so much. And the size difference, though proportionally big (5x), is insignificant on modern systems (1M vs 0.2M). Also consider that Perl is a more featureful language, and switching to Lua could mean more work for the devs if some of those features have to be reimplemented.


Perl does not cross-compile without applying non-upstream patches, though.


And Perl comes with OpenBSD.


What would the better higher-level scripting language be though? I'd love to use something better than sh.


Use a 3rd party tool to generate... autotools "scripts", that's the best compromise IMHO... build tools are not a solved problem yet, you'd think it would be priority #1 for the dev world but no...

Ultimately, devs shouldn't have to suffer to build C or C++ programs like that. Why is building executables still so hard, even today? Developers need to solve that problem once and for all. Containers aren't the solution.


> Use a 3rd party tool to generate...

Isn't that a case of "now you have two problems"? :)


More than that, after all we already generate make files. So it is 3rd party -> auto tools -> Makefile. Every step in this chain probably adds another layer of obfuscation to any build error you encounter.


> > Use a 3rd party tool to generate...

> Isn't that a case of "now you have two problems"? :)

No. Now you have 3 problems: the 3rd party and the tool.


I think most of the engineering has moved towards things like Cargo for Rust or Go for Golang.


I wonder how much cruft could be thrown away if the project tossed out support for, say:

- Everything but Linux and a few of the more popular BSDs

- 32 bit CPUs

- CPUs with less than 1,000 installations

- Ancient toolsets that don't support features everyone else takes for granted in the last 20 years

For instance, it has to be a pain in the neck to sit on a code base with Solaris 2.6 support, or NetBSD on Alpha, or... Also, it seems like there'd have to be some legacy pain around things like "we can't use this flag on sed because Amix didn't have it".

But I don't know enough about the infrastructure to know if pruning those out would make a bit of engineering difference. How much is support for ancient or little-used stuff slowing down development?

Edit:

Alternatively, I wonder what it would look like to have a build farm that precomputed all the values. "Oh, you're on macOS 11.1 on Intel? Here's the list of 32 envvars you need to set." If a million people are compiling the same file on a million identical computers, is it a great idea for all of them to have run the same probes?

Edit 2:

I'd never advocate for support for such systems to be scrubbed from the Internet. I'm just saying it's unreasonable to expect a maintainer today to support ancient systems. The people still using those systems have the right to fork the tools and maintain their own version, but that's not the same as making upstream do it for them.


Per the discussion in the link, specifically "All the supported languages except C and C++ are second-class citizens.", I'd seriously consider writing off everything except C, C++, and any language where someone steps up to actively support their language going forward.

Slice autotools down to 32-bit & 64-bit UNIX systems in the last couple of years (bearing in mind that if you start that today, by the time it is finished it will constitute support for the last 3-5 years)... or even all the way to current systems for the parenthesized reason... for C and C++.

Would probably also call it "Autotools 3" and slice away anything else that doesn't seem useful, on the grounds that 2.7 isn't going anywhere. If in doubt, slice it away and see what people say when you release 2.99.01.

This probably takes it down to something not so enormous to carry around.

It seems like other languages are all going their own way. Maybe that's bad, maybe that's good, most likely it's a complicated combination of both, but most importantly, there's nothing Autotools can do to stop it at this point, so you might as well roll with it.


If you're going to cut off the long tail of compatibility, what's the point of using autotools? I thought the whole point of autotools was to deal with the (painful) diversity of operating systems.


Because at some point, who really cares about those ancient systems? Suppose I'm writing a program that uses autoconf, and it detects that someone is trying to compile my program on an Amiga using gcc 2.95. Well, awesome for them if they make it work, but I'm not willing to put in any effort to support that in my code.

What are the long tail of variants that people actually care about now? 32- vs 64-bit, OK, sure. 36-bit? No way. Big- vs little-endian, sure. PDP-endian? Miss me with that.

It just seems like the number of variations that people plausibly care to support is a lot shorter than it use to be. Why should the autoconf gang work themselves to the bone making sure their stuff works correctly on a platform that no one but a platform's maintainers actually cares to target?

I'm not unsympathetic to people using old or odd systems. I've got some bizarre stuff squirreled away in my attic. I don't reasonably expect anyone but me to put effort into keeping my SPARCstation 5 limping along, though.


I side with the parent here, if you don't care about portability why even bother with the autotools? And if you bother to use the autotools, what would you get by purposefully breaking compatibility?

I'm still to this day paid to write embedded code for various DSPs, I'm very happy that autotools are portable. I know that on HN if you're not writing NodeJS or Rust in x86-64 docker containers you're niche and don't count, but it's a bit short sighted.

Autotools doesn't need disrupting.

I even think that the mail in TFA is somewhat misguided. Autotools are dropping in popularity because the type of software development that requires using the autotools is slowly but surely declining, or maybe it's not even declining but simply growing at a fraction of the rate of other ecosystems. Actually many modern languages and environments (IMO rightfully) ship their own well-integrated build systems, so attempting to bring the autotools there is probably a waste of time.

For me asking about "future plans for Autotools" is like asking for "future plans for Make" or "future plans for ls". I don't expect any substantial changes in these projects, but I also don't think they're obsolete. They just fit a specific use case and they do it (mostly) well.


I too have been paid to write DSP code, and autotools would not have helped. You can't run m4, perl, sh, or make on the DSP. Cross-compiling with autotools is far more difficult than cross-compiling with a simple little Makefile.


> Because at some point, who really cares about those ancient systems?

A lot of people do. Retro-computing is way more popular than you would think.

In fact, m68k is the oldest port in the Linux kernel and is so well-maintained that it saw multiple other architectures come and go.


I ran that on my Amiga when it was still new. I have no animosity toward it! But should someone in 2021 still have to expend effort to support that in upstream tools?


Yes, this is open source, everyone is free to chose how they contribute.


Exactly my thought.

Please leave autotools alone and don’t “improve” it.

It’s not like there are tons of build systems already and if someone is so keen in a heavy-weight build system, they can use Bazel which is so bloated that it doesn’t build on any 32-bit target in Debian.


There is nothing wrong with improving, just make sure you have a frozen version that can handle the old cruft. Some software is so stable that you wouldn't dare to update it; I have some software I made myself that does what it was designed to do and never gives problems. However, that does not keep one from ever going forward and designing new ways of solving the same problems. So my advice would be: freeze and move on.


Bazel is obviously an insane idea, but autotools is as "heavy-weight" a build system as they come. I struggle to imagine something more convoluted and bloated than autotools.


It's convoluted because of its efforts to avoid bloat by e.g. reusing an existing macro processor (m4, familiar from sendmail configuration) rather than writing its own.


That diversity no longer exists. These machines only run in museums, if at all.


Plenty of industrial systems have been working fine, unchanged, for decades.

An example off the top of my head -- a CNC mill at my previous job, purchased new perhaps ten years ago, runs DOS 6.22 with Windows CE on top. I suspect that the vendor is still selling the same software stack on the same hardware today.

Before someone chimes in to say, "That's insane, why isn't it running an RTOS?": That mill had zero software faults in the years I worked with it.

Edit: Poking around Trak's website, it looks like the software on today's machines is unchanged. Here are some example screenshots: https://www.southwesternindustries.com/software/page/prototr...


> Before someone chimes in to say, "That's insane, why isn't it running an RTOS?"

I'm chiming in to say that Windows CE is an RTOS. [0]

[0] https://docs.microsoft.com/en-us/previous-versions/windows/e...


Whoa. I sit corrected!


And they will continue to run. No one in their right mind will ever upgrade the software on them. That is a lot of the reason behind removing old platforms from Linux: whatever the systems out there are running, it's not a HEAD kernel release.


HEAD kernel runs fine on my Amiga 4000 from 1992.

It also runs fine on my SH-7785LCR SuperH board where I regularly test the latest kernel releases and patches.


Even if that were true (and it's not completely true by the way) autotools is not used to build the kernel. You can very much use autotools to build modern software for older kernels. That's... kind of the point of using something like autotools.

Frankly reading this thread I think many people here bark at the wrong tree. If you don't care for portability don't bother with autotools, just write Makefiles. It's much simpler.


Plenty of manufacturers ship software updates to 32-bit devices. Plenty of 32-bit devices are being sold or even newly released right now.


Do those systems need new software to support them in 2021? If they used autotools, say, they could snapshot the last working version that worked with their system and not care what the new version does. It's not like there are a lot of fixes being merged in these days to handle 30 year old systems.


> An example off the top of my head -- a CNC mill at my previous job, purchased new perhaps ten years ago, runs DOS 6.22 with Windows CE on top

I thought Windows CE was a standalone operating system like current Windows versions are, not a layer on top of DOS like Windows 3.x and earlier were.

But, Googling, I see Windows CE, on x86, did use DOS as a bootloader, just like how Windows 9x did. DOS boots, then runs LOADCEPC.EXE to start Windows CE. (By contrast, CE on other CPU architectures didn't do this, it just booted Windows CE directly.)


How often does someone compile and run new software on those machines? And also, let's be honest, autotools and windows have never worked well together.


Come on. Next time you get rid of some electronic device I recommend that you open it and try looking for the main controller. You might be surprised to find that it's not in fact all running on Ryzen and Apple M1 CPUs.


Statements like these make me question one's nerd cred

My previous employer has several hundred x86-based Linux field installations. Other hotspots include IoT and industrial control systems, to say nothing of all the legacy servers still out there in current use


That’s not true. A lot of embedded stuff is still 32 bits.

The simpler the hardware, the higher the reliability and the smaller the power consumption.


Anybody with intention to work on older systems can work with older releases of Autotools, while newer releases can focus on much more needed features.


Aww, common, don’t put NetBSD in the same bag as Solaris. There are dozens of us!


NetBSD on x86-64: great!

NetBSD on mid-90s hardware: uhh, that's cool if you want to experiment with it or enjoy the nostalgia value, but if you're doing that, you know what you signed up for.


> NetBSD on mid-90s hardware: uhh, that's cool if you want to experiment with it or enjoy the nostalgia value, but if you're doing that, you know what you signed up for.

So why is old tech persona non grata? I'm getting so tired of this death march.


One reason is because its performance per watt is horrid. There’s an environmental cost at keeping hardware 17 Moore’s Law generations behind up and running.


... but no Intel Management Engine ...


You get that on an RPi4 with less power and more performance than some of the ancient architectures still being dragged along for the ride in some projects.


Bad example, on RPi you still have horrid proprietary blobs running.


Then pick one of the million clones that don't.


Yes, but that doesn’t mean anyone has the right to kill these architectures off.

Btw, we just saved the VAX backend in gcc.


I disagree vehemently. Upstream has every right to kill off support for them in the version they maintain. Interested parties are more than welcome to fork the software and maintain their own supported branch. That's a huge part of the appeal of Free Software!


When did it go from "Of course it runs NetBSD" to "Of course nobody runs NetBSD"?


> I'd never advocate for support for such systems to be scrubbed from the Internet. I'm just saying it's unreasonable to expect a maintainer today to support ancient systems.

The raspberry pi 3 is ancient?



autotools is used for embedded systems; if you cut them out then you lose a big part of their users...


Do you have any concrete examples? I see it stated as a use case, but I've never seen it in the wild.

Personally speaking, I cross-compile daily using CMake with toolchain files. Some modern RTOSes use CMake exclusively e.g. Zephyr or mBed. CMake supports quite a few open source and proprietary toolchains, and they all work perfectly well, and integrate nicely with whatever build tools and IDEs you care to use.

Interactive flashing and debugging via JTAG or SWD/SWO in CLion, Eclipse or vsCode is nice. You aren't getting that with the autotools...


Well I left the embedded industry a while ago. There was some movement towards Yocto but we used autotools for everything.

I worked in the VOIP industry ~14 years ago and in the electrical sector ~8 years ago.


That depends on how much modularization the project has. If each platform is, e.g., just another file, it isn't that much of a problem. They can just call that platform deprecated if they don't want to maintain it anymore, but there will be no issue keeping it.


> 32 bit CPUs

That’d wreak havoc for pretty much all of OpenWRT.

GNU make is used for lots of embedded devices.


It's been nearly 10 years since I last touched an autotools project. Recently I was re-exposed to autotools when trying to setup Underworld2[1] for what I thought would be a fun simulation project. Underworld2 requires PETSc, and OpenMPI. Finding the right environment variables so that ./configure would work with CUDA and each would reference the other was nightmarish. Who knows if it was autotools-related or something else that broke. The point is that even after 13 years of experience programming I am in no way equipped to deal with that stack to even find out.

[1] https://underworld2.readthedocs.io/en/latest/


Please just sunset it and recommend projects adopt better tools.

And if not, please please please please deprecate libtool, which is the worst abomination. Making shared libraries with gcc and clang is not hard, and libtool should never be adopted for new projects (yet the GNU docs recommend it).
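For reference, the entire non-libtool recipe on a modern ELF platform is roughly this (a sketch, with a made-up library name and soname policy):

    cc -fPIC -c foo.c bar.c
    cc -shared -Wl,-soname,libfrob.so.1 -o libfrob.so.1.0.0 foo.o bar.o
    ln -sf libfrob.so.1.0.0 libfrob.so.1   # what the runtime linker looks up
    ln -sf libfrob.so.1 libfrob.so         # what -lfrob resolves at link time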


I had the same opinion until I had to ship the app with shared libraries on different BSD versions and a few older Solarises, besides multiple Linux distributions. Even then, I wasn't sure it would work without problems.

libtool is ugly because it tries to solve an ugly problem. To support the above, my manual Makefile was uglier than libtool :)

These days, if I had to do that again, I'd pick libtool without overthinking it.

People need to learn why some things are as such and should see outside of Ubuntu/Windows/macOS comfort.


Would you still have to ship your app on a variety of BSDs and older Solaris versions in 2021, though?

On current BSDs, clang works. Solaris and HPUX and AIX and Ultrix and Irix are all, in a word, deprecated. The 2000+ line long shell script that’s used to figure out which args to pass to the compiler, determined on the fly for each source file, can still be used on those deprecated systems, but it should also probably be deprecated itself when all you need is “-fPIC” in your CFLAGS (an oversimplification but not by much).


Libtool attempted (badly) to solve portability problems between the diverse range of linkers on 90s Unix platforms. Its versioning scheme is utterly insane. It made a slightly difficult problem incredibly complex, and for no good reason.

Every current platform of note has a perfectly good linker, making libtool redundant. More often than not, it's been an "anti-portability" impediment when it thinks it knows how to drive modern linkers but because it's barely maintained, it gets it wrong by trying to be too clever.

Even proprietary embedded toolchains come with full-fat ELF linkers these days.

Of all of the pieces which make up the Autotools, libtool is the one which could be dropped today with zero impact. It's the least useful and the least necessary of them all.


I still vividly remember the time libtool decided to drop "-flto" from the linker command line, because it decided that dropping unknown options would be more likely to make the link work. Dropping "-flto" rather has the opposite effect, and there was no way to force libtool to actually accept it, since it would drop it even when the user specifically requests it in the linking flags environment variable.

From that day forward, I never want to work with libtool ever again.


> deprecate libtool..

I don't quite understand this. What if one uses something else than clang or gcc on the target system?


Like an ancient compiler that’s also deprecated...?


libtool isn't primarily about portability across compilers; libtool is about portability across linkers and library formats, from the days when there were many systems with deficient library support. It's not unreasonable to assume a modern linker, and modern library support.


People still use macOS, which is the main reason I see "please add libtool support" these days.


It's possible to handle libraries on macOS without libtool.
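For example, with the stock toolchain it's roughly this (flags from memory; check the clang and ld64 docs):

    clang -c foo.c
    clang -dynamiclib -install_name @rpath/libfrob.1.dylib \
          -current_version 1.0 -compatibility_version 1.0 \
          -o libfrob.1.dylib foo.o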


Really have to agree with the doubters.

I have used Autotools since the nineties and I have never been able to understand it or make use of it for a fresh project (well, once, using some gui tool that completely abstracted away the pain, was incompatible with anything else, and whose name escapes me now)

In its day it was a wonder: tar xf; cd ... ; ./configure --prefix=`pwd` ; make ; make test (if you are very lucky) ; make install

But times have changed.

Interfaces are much more standardised now, and from where I sit it is the "polyglot" problem that is interesting, not different architectures.

It is time to pick a winner from the multitude of tools out there, and GNU should go with that....


> It is time to pick a winner from the multitude of tools out there, and GNU should go with that....

Plain Makefiles without configuration that work on Linux and macOS out of the box. Those are much easier to write than setting up autotools or writing a CMakeLists.txt.
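
A sketch of the kind of Makefile meant here (program and file names are placeholders); the built-in .c to .o rule does the compiling under both GNU make and BSD make:

    # note: recipe lines must start with a tab character
    CC     ?= cc
    CFLAGS ?= -O2 -Wall
    OBJS    = main.o util.o

    frob: $(OBJS)
    	$(CC) $(LDFLAGS) -o $@ $(OBJS)

    clean:
    	rm -f frob $(OBJS)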


If that is your bubble.


My memory of GNU autotools and friends, from when I shipped one release of a project with it:

* the size of my project doubled

* naturally, autotools would check for FORTRAN despite my project being in C

* if users built the software as root, it would trash critical system files

* instead of installing ELF library files in /lib, it installed non-functional shell scripts in /usr/lib

* cross compiling broke

The solution was plain old make. You can do a lot with make. It performs really well. I even did a "make install" that would depend on the final locations for everything. It was all configurable. It worked great.
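
For reference, the "configurable make install" pattern described here usually looks something like this (the variable and program names are the conventional placeholders, not necessarily what that project used):

    PREFIX ?= /usr/local
    BINDIR ?= $(PREFIX)/bin

    install: myprog
    	install -d $(DESTDIR)$(BINDIR)
    	install -m 0755 myprog $(DESTDIR)$(BINDIR)/myprog

Overriding variables on the command line is the whole configuration step: "make install PREFIX=$HOME/.local", or "make install DESTDIR=/tmp/staging" when building a package.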


I've used autotools on a ton of my projects, and I like it a lot. I can't speak to any of the problems you had.

The one thing I'll say about Autotools is that the learning curve is pretty steep, and it took a fair amount of experimentation to get it right on my first Autotoolized project. Maybe you never made it all the way over the hump?


A telling response from "Gavin Smith" on why these systems don't evolve to be more user-friendly:

It could be made easier to get started with automake. For example, "AUTOMAKE_OPTIONS=foreign" should be the default; otherwise automake refuses to finish if required files aren't found in the project. There's a political problem here to persuade people of this. The documentation could also be improved. I remember somebody was complaining about this page: https://www.gnu.org/software/automake/manual/html_node/Progr... and asking what "maude" meant - it turned out it was the name of the dog or something of the person who first wrote the program, who didn't want it to change. So I think getting consensus is impossible for changes that are needed to make the program simpler, easier and better. There's a vacuum of responsibility where nobody wants to step on others' feet. Forking the project would go nowhere as nobody would use the fork.


The explanation on that page that "maude" is being used as a placeholder for your program name seems pretty clear to me.
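
For anyone who hasn't seen the page: the convention it describes looks roughly like this in a Makefile.am, with "maude" standing in for your program's name (the file names here are made up):

    bin_PROGRAMS = maude
    maude_SOURCES = main.c util.c maude.h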


I found it confusing in 1999/2000 when I first autotooled a project and read it for the first time. If others find it confusing, maybe it needs clarifying or changing to make the example clearer.


I don't often say this but they should probably throw it away and rewrite it from scratch with a smaller vision. There is no part of the current code that should actually be kept. There is no place for a pile of Bourne, m4, and perl in today's stack.


It's difficult because the API is m4-templated shell (configure.ac, macros) and makefiles, and so you can't easily handle any existing autoconf project without using what is effectively an implementation of m4, shell, and make.

Yes, projects themselves could move to something else, but there are many options for that (mentioned in the article). I'm not sure how much value an incompatible autotools-flavored tool would have to anyone. (If you're not templating shell, there are way better flavors of build tool, anyway.)

This is, incidentally, a software design lesson: if your input space is too big and you basically encourage people to see straight through your abstractions, you have no abstractions. (I'm facing this at work: we have an intricate set of homegrown makefiles that evolved over some 15 years, and make is hard to teach and debug, but it's pretty difficult to switch to anything else because we allowed people to write custom makefile rules too and there are lots of them. We can automatically convert the easy cases, which are the majority, but then we'd have two systems, which is worse.)


> There is no place for a pile of Bourne, m4, and perl in today's stack

What do you have against m4? It's a lovely little macro language.


In small doses, perhaps.

After having to maintain some larger M4 macro collections (for Autoconf and the Autoconf Archive), as well as using it in some work projects for code generation, I'd have to say that maintenance and testing are hard. Quoting can be painful. And it has a number of annoying limitations. In retrospect, I think it was a bad choice, and I wouldn't choose to use it again. Trying to understand what's going on with multiple levels of nested macro expansions is painful, particularly when you can't directly introspect all the intermediate expansions.
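
A toy example of the kind of quoting surprise meant here, using m4's default ` and ' quote characters (Autoconf changes them to [ and ]):

    define(`name', `World')
    define(`greet', `Hello, $1!')
    greet(name)        # -> Hello, World!
    greet(`name')      # -> Hello, World!  (the expansion is rescanned, so name expands anyway)
    greet(``name'')    # -> Hello, name!   (only double quoting survives the rescan)

Now nest a few of these inside each other, as Autoconf macros routinely do, and the appeal fades quickly.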

When it comes to M4 usage within Autoconf, I'd happily see it dropped entirely in favour of shell functions. It's a historical implementation detail which should have been removed over a decade ago.


Textual macro languages are evil. Use anything else if you have any choice in the matter.


In fact, I’m not sure that I’ve stated that strongly enough. I’ve recently written a non-trivial amount of M4 to generate Rust, and I think that M4 is worse than PHP (which I’ve also written a non-trivial amount of, sadly).

https://github.com/db48x/flex/blob/retargeted-to-rust/src/ru...

This kind of macro language is really the worst way to write any kind of software. It’s not maintainable, it’s not readable, there’s no way to collaborate with other developers on it. There’s no way to create abstractions, so you must keep all the details in mind at all times. You can’t put those details into a box and forget about them. It is Cthulhu-esque. It would be a chapter of The Book. Practicing M4, or any language in this family, has a permanent SAN cost. Just learning the language probably doesn’t have much cost, but even small sacrifices should not be made lightly.


> a non-trivial amount of M4

That's horrifying, and certainly not the purpose of m4. Yet, it does not mean that m4 is not beautiful, for doing small things that would be a bit cumbersome in awk, for example.

> There’s no way to create abstractions

Best feature in my view. Abstractions always suck. Would you complain that there's no way to create abstractions in sed, for example?


Yes. The only use for sed (and awk) is one-liners that you either use once and throw away, or immediately assign to a shell alias so that you can reuse them, because a shell alias is a means of abstraction.


Love the SWOT writeup https://www.owlfolio.org/development/autoconf-swot/

I honestly haven't seen the need for autotools for AT LEAST a decade now. Except for producing a steady wall of text to make it look like you were doing some cool hacking work.


Can't disagree with most of the points made here, except for a handful.

Firstly, that the "Autotools are not obsolete". Sorry, but they are. They have been for well over a decade. They don't serve today's needs effectively, and that is the reason for the decline in contributors and end users. They moved on. And that includes me too (I did the initial support for multiple language standards in Autoconf, starting with C99). Support for new language standards and features in Autoconf is woeful. Support for integration with various libraries, tools and frameworks is woeful. Other systems support them out of the box.

As for whether support for them should continue. That depends upon what individual projects need in terms of portability requirements. Clearly a lot of projects have multiple decades of investment into the Autotools. There is a sunk cost here, and replacement is costly.

Despite that, I've converted all of my own projects, plus many other projects I work with, over to CMake. In every case it was a net improvement, and well worth the migration hassle. Many projects overestimate the difficulty. In doing these migrations, I've seen many examples of buggy use of the Autotools. It only provides "theoretical portability". Unless you actively test it, it's most likely broken. In direct comparisons during project conversions, CMake has worked on various minor platforms (Cygwin, MinGW, Solaris) where the Autotools support was broken for years and no one noticed. That is to say, direct empirical evidence showed worse Unix portability for the Autotools than CMake.

If the Autotools want to retain relevance, the main hurdle they need to cross is portability to modern platforms. They can't stick to Unix-only portability problems of the '90s and remain relevant. So many projects have needs beyond that, and that is the main reason for the declining use. Fix that, and people might have a reason to stay.

Unfortunately, if you follow the mailing list and such, you'll see that most of the long-time developers and users have zero interest in portability outside the niche they have inhabited for the past 25 years. They don't understand how the rest of the world works or what their needs are. There is no comprehension about using tools other than shell and make. The developers who did care about this stuff moved on. I became a CMake contributor instead. Newer build systems moved on to ninja and other tools, support various IDEs, and integrate with all sorts of other systems and tools. The Autotools don't do any of that, and likely never will. Yet for most developers, these are non-negotiable. Want to use CLion, Visual Studio or Eclipse?


I respect the years of work that have gone into autotools, and the vast ecosystem of projects it has made possible, but nowadays I use cmake whenever possible, mainly because it can generate ninja build files. It would be great to see autotools on https://github.com/ninja-build/ninja/wiki/List-of-generators...


Very nice writeup, I hope autotools will continue the good work. It is still the best build system.

- If you ship a trimmed down version of ./configure, it will run quite fast. Many people copy&paste configure.ac from other projects and do not delete unnecessary tests.

- It is easy for users of other systems to add portability patches. With ./configure you can do anything, not so with meson or cmake.

- ./configure's command line syntax is still the most elegant compared to the enterprise syntax of cmake and meson. --prefix=/home/foo/bar vs. -DCMAKE_INSTALL_PREFIX=/home/foo/bar. Specifying CFLAGS in cmake is painful.

- Unlike cmake or meson, autotools do not dictate a rigid source tree structure.

- You can easily build both shared and static libraries, which the "modern" build systems do not support.

- Cross compiling on Linux with autotools/gcc is heaven compared to cmake/clang. You just need the host triplet and you are done.

- The build output is exactly what I want to see. cmake and meson/ninja hide the output by default. If you enable it using yet another arcane option, the output is more chaotic and unreadable than in most autotools projects.

In general I find autotools projects like gcc easier to compile. The flags are more intuitive, the output is organized and readable. Compare that to cmake projects like llvm, which always seem to expand the option set to more than 100 unintuitive -DENABLE_ARCANE_FOOBAR_QUUX="a;b;...;z" flags.
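
For comparison, the two command-line styles being contrasted look something like this (paths and flags are made up; the cmake --install form needs CMake 3.15 or later):

    # autotools
    ./configure --prefix=$HOME/opt/foo CFLAGS="-O2 -g"
    make -j8 && make install

    # cmake
    cmake -S . -B build -DCMAKE_INSTALL_PREFIX=$HOME/opt/foo -DCMAKE_C_FLAGS="-O2 -g"
    cmake --build build -j8 && cmake --install build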

Autotools is very much appreciated. Thank you for another release, I am looking forward to future ones!


> GNU project status discourages new contributors because of the paperwork requirements

Can somebody elaborate on this point? Does that mean I need to go through paperwork even for contributing a bug fix?


Yes. You have to assign copyright to the FSF in order to contribute anything.

This is one reason for its demise. I've done the copyright assignment for both Autoconf and Automake. It's one reason why there are external macro archives (e.g. Autoconf Archive), when much of this functionality could be in the core projects.

Example: Why, in 2021, should you have to rely on external support to do something as trivial as enabling multithreading [ACX_PTHREAD]? It's not even portable to non-pthreads platforms. So much for portability. Contrast with CMake [find_package(Threads); use Threads::Threads].
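
For reference, the CMake side mentioned in the brackets is just this (target name is made up):

    find_package(Threads REQUIRED)
    add_executable(myapp main.c)
    target_link_libraries(myapp PRIVATE Threads::Threads)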

Compare that with CMake, which doesn't require copyright assignment and which anyone can contribute to. It has hundreds of contributors, tons of built-in functionality, and more third-party extensions than Autotools could ever dream of. Copyright assignment is a big part of that, but so is the upstream attitude to third-party contributions. Kitware do this very well. The Autotools maintainers, not so much; partly down to their own historical bad choices, which prevented changes to the internals due to the potential for breaking compatibility.


If it hasn't changed recently, you have to send in paperwork assigning your contribution to GNU so as to maintain a single copyright owner.


Plenty of projects require a CLA, which seems pretty similar paperwork-wise?


Absolutely similar, but don't underestimate the friction this introduces, and the number of potential contributors you cut off by doing this.

Getting corporate sign-off for such things can take months while it wends its way through legal departments and upper management. Many people won't even bother, it's such a hassle.

Last one I did was for Apache. It took at least six months. Previously did several for GNU which were much quicker (in a different company).


Why do we have so many different configure/build systems, and why can none of them run their config checks in parallel?

Compiling Gentoo packages on a 32-core/64-thread Threadripper is getting ridiculous, with many packages, even some "larger" ones, taking longer to configure than to actually compile.


Recent HN discussion (of an LWN article about this work, not of this GNU mailing list thread):

https://news.ycombinator.com/item?id=24872032


How many of Autotools' configure-time checks could just be replaced with a giant static header file of platform #ifdefs?
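
Presumably something in this spirit (a sketch; the HAVE_* names follow the usual Autoconf convention, and the platform list is illustrative):

    /* portability.h: fixed platform #ifdefs instead of configure-time probes */
    #if defined(__linux__) || defined(__APPLE__) || defined(__FreeBSD__)
    #  define HAVE_UNISTD_H 1
    #  define HAVE_MMAP 1
    #endif
    #if defined(_WIN32)
    #  define HAVE_WINDOWS_H 1
    #endif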


I got some cynical thoughts reading this well-done summary:

> There are few active developers and no continuing funders.

Like most open-source projects.

> There is no continuous integration

Like Signal.

> Bugs, feature requests, and submitted patches are not tracked systematically.

Like PostgreSQL.

> there is code shared among Autoconf, Automake, and/or Gnulib by copying files between source repositories

Like libmtp / libgphoto2.


While I don’t hold much love for autoconf and autotools as full suites to work with, count me in as someone who appreciates GNU make and simple makefiles.

I use it for lots of things, even NodeJS, and it helps me keep my builds more portable.

I’d hate to see GNU make completely abandoned.


GNU Make is widely used, and actively maintained. New features are added regularly. It's the build tool targeted by Kconfig (used in the Linux kernel and elsewhere), and also it's probably true to call it CMake's most common backend.

So, don't worry - GNU Make isn't going anywhere, regardless of what eventually happens to Autoconf.


I'm sad that M4 is getting a bad reputation; it is probably due more to lack of knowledge than to the actual tool, because the language is really simple. As usual, many devs copy/paste m4 macros without understanding what is going on. Also, heavy use by Sendmail and SELinux didn't help its reputation either (again, not due to M4 itself).

But I find M4 a true gem "no one heard of" - a portable macro processor you can use with basically anything involving plain text. It is especially useful where you have tons of boilerplate code to type (Java, Ansible, Go, js/html/css to name a few).


Related: Rejuvenating Autoconf https://lwn.net/Articles/834682/


I've always found it emblematic of the basic problems of this community that they choose to communicate through mailing lists with an online interface that looks crap no matter how you try to view it. Like, if they had dumped the text into an html <body> tag it would have re-flowed gracefully, but somehow they broke that too, and have not managed to do anything about it in decades.


The OP, Zack Weinberg, helpfully published his analysis on his own web page¹ and the formatting (different typeface for headings, flowing paragraph text, hypertext links, bullet points) made the analysis much more readable.

I think email is fine as the communication medium for software projects – particularly for projects that have been around for as long as Autotools†. For me, the biggest downside is that not all mailing lists use GNU mailman or other listserv software that provide archiving of list posts. Having a searchable archive of discussions is invaluable.

† though I was surprised to see a number of contributors top-posting; that practice was universally discouraged on Internet mailing lists until the number of new members using MS Outlook overwhelmed members who followed the established “netiquette”

1. https://www.owlfolio.org/development/autoconf-swot/


So we agree that the online presentation is not good. I have no issue with email particularly; I have a problem with the email archives.

More specifically, the fact that no one cares about the usability of the archive or the online presentation is proof that they don't care about the impression they make on external people. They can hardly complain about the lack of new contributors if they make no effort to be accessible to them.


You can't just "cast" it as html and let it reflow. Email has significant whitespace and writers use it a lot.

Do you have examples of projects that have used a "better" method of communication that have achieved what GNU, Linux or any of the BSDs have achieved?


Self-replying because this seems to have rubbed a few people the wrong way: I'm specifically complaining about the online presentation of the mail/archive (or lack thereof), not the use of mail per se.

Simply, you can't complain about the lack of new contributors if you make no effort towards appeal and usability.


The letter mentions that "building an Autotools-based project directly from its VCS checkout is often significantly harder than building it from a tarball release, and may involve tracking down and installing any number of unusual tools."

Why would that be? They both just seem like a source distribution to me... why would there be a difference? Can anyone explain this?


The tarball distribution includes the output of the Automake and Autoconf tools (the Makefile.in files and the configure script, respectively), while a VCS checkout does not. Thus, building from a checkout requires extra steps and software that you may not already have ready to hand.


A source tarball already has the Makefile.in files and configure script pre-generated, along with a few other autotools-generated files. These expect that `make`, the compiler, the needed libraries, and some other generic Unix tools are installed, but they do not expect users to have any GNU-specific software installed (like the autotools themselves).

The source repositories typically do not have these already generated (since checking in generated files is frequently frowned upon), and generating them often requires getting the right version of autoconf/automake/etc. installed, as a different version may not be fully compatible.
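
Concretely, the extra step for a checkout is regenerating those files, which in turn needs autoconf/automake on the builder's machine (bootstrap script names vary by project):

    # from a release tarball: configure and Makefile.in are already there
    ./configure && make

    # from a VCS checkout: regenerate them first
    autoreconf -fi     # some projects wrap this in ./autogen.sh or ./bootstrap
    ./configure && make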


I used to not include autoconf generated artefacts in my repos, but with .gitattributes making them sort-of invisible and the reduction in setup aggravation for other devs, I've completely changed my mind.
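
Presumably something like the following, using GitHub's linguist-generated attribute so the generated files are collapsed in diffs (the file list is illustrative):

    # .gitattributes
    configure      linguist-generated=true
    Makefile.in    linguist-generated=true
    aclocal.m4     linguist-generated=true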


What the FSF needs is some marketing people who can come up with a better subscription model to support these requirements.

I've been a member for years, but the business model (gasp) languishes for lack of a practical vision.


This could be an interesting project: agree on a portable subset of a modern language that can be either interpreted or compiled in the background so quickly that you wouldn't mind. Something like portable-rust. It could replace bash/m4/whatever else autotools use (and they use a lot of stuff bolted on).

It would also drive contributors, as just mentioning a new language in your README floods your issue tracker these days :)


Autotools is slow as balls - but then CMake configuration is slow too - a minute, if I recall, for llvm-project on my M1. Much of the modern Linux build issues are RPM vs DEB packages not playing nice. Rust Cargo has its own issues with snowflake dependency hell instead of more generic libraries.

GNU Make is a different story. Easy to get core saturation for large workloads. Stanford GG allows you to massively distribute it to thousands of workers. Still very relevant.


> Much of the modern Linux build issues are RPM vs DEB

Which in turn are not the best formats for binary distributions.


I wish someone added a Makefile backend to meson. That would make it much more appealing as an alternative to the autotools mess.


Meson has the problem of being written in Python, and thus even an end user has to endure Python library distribution.


And autotools have the problem of requiring Perl, m4, and some other stuff, I think. And CMake requires C++. I'd take Python over all of these anytime. That said, it's still far from ideal.


M4 doesn't need to be installed on the software distribution's computer (that was my point).

Perl I don't even know if it needs to be installed on the packager's computer.


> cmake requires C++

One more reason for C++ projects to settle on CMake.


Why? You need Python to run Meson anyway, might as well install Ninja too.


You don't actually need python for ninja. Someone wrote a compatible tool in C called samurai.



