One annoyance is that it often feels like "./configure ..." takes longer to run than the actual "make".
But my main annoyance derives from the fact that most projects I build from source which use autotools don't build cleanly on anything but Linux x86/amd64. I spend a lot of time playing with non-Linux and/or non-x86 Unix systems, or using vendor compilers instead of gcc -- primarily Solaris (both x86 and SPARC), Linux on POWER, and AIX on POWER. Current versions of IBM's compilers, Sun (err...Oracle's) compilers, etc. Run the configure script, let it slowly probe all the little corners of whatever it's probing, and sometimes I'm lucky enough that it completes successfully and claims to understand the environment it's in. Then run make (or, as is often required, GNU make), and hit build errors very quickly.
Which makes me think... why did I just sit through that long "configure" script run, and why did the developer bother messing with autotools, if this project isn't even going to bother targeting anything but Linux with the GNU toolchain to build it? If you're not concerned about portability, it seems a relatively simple Makefile would be an easier and faster approach. Why foist the [insert your favorite colorful descriptor here, depending on your personal feelings] autoconf system on builders of your project?
Even if the target is just Linux with the GNU toolchain, there are still plenty of issues that need to be managed by a configure script - tests for mandatory and optional libraries / header files, functions that live in different libraries on different systems (e.g. tinfo vs ncurses, or librt), different versions of kernel headers, different versions of GCC (older ones may not support some options).
I never understood the concept of an "optional" library. Either your program requires a library or it doesn't. If it requires the library, the easiest thing to do is to assume that it is already installed and just link to it. In case it is not installed, the compiler will fail with a very clear error message.
Say your program is an editor of png files and it requires libpng. Thus in your code you will have '#include <png.h>', and in your makefile you will link your executable with -lpng.
Interesting condescending tone there. I didn't downvote but if I can try to answer the core part of your misunderstanding: Many do _not_ want the build to fail when library png is missing.
Example projects using this type of build optionality are FFmpeg and Qt.
For example, ffmpeg lets you build with the optional Fraunhofer FDK AAC (libfdk_aac) encoder but it is not GPL license compatible. If everyone followed your "assume libfdk_aac is already installed" advice, many people would be stuck at the "very clear error message" and not be able to have a working GPL ffmpeg.
Same idea with Qt optional libraries.
You can provide some code and let the users do whatever is needed to get it to compile on their system.
Or you can try to automate the process with the best of intentions and create a monster.
In any case, the reason why autotools exists in the first place is that people were less than thrilled with the work required to make things work by hand.
Randomly linking against things if they happen to be available is a terrible idea.
Does any build ecosystem except C do this? I'm not sure even CMake does this.
There's absolutely nothing wrong with that, and in fact it's so simple that afterwards we can even write a script to automate setting those variables.
Who needs convoluted build systems anyway?
Say you are writing a program to edit images that can potentially read lots of different formats. However most people don't need support for .sgi or .ecw files and won't have the libraries to read those formats installed. Your program is still perfectly usable without those libraries and so you should still be able to compile and run it, just without support for those formats.
Or even worse, it is used as a dependency in a script, or it is a library linked by another program, but the author can never trust that any given installation has the feature he actually needs.
It is a mess. Just don't do it.
FreeBSD's Ports also allow for options:
The default / most common variants have pre-built binaries that are downloaded via a port install, but if you specify something more 'obscure', things are built from source.
Build-time and run-time dependencies are both handled.
If it existed, it would be a nightmare, because features are recursive, and you would end up with fragile monstrosities like
By the way another thing I've come to hate is configure or make scripts which download stuff. This is really annoying if you want to build self-contained packages. Or if the downloaded material ends up being not available, or if you're not always connected to the internet.
That's the "glass half empty" pov.
The "half glass full" pov is that anyone can still use their favourite piece of software even if an obscure and/or useless library is not available to them.
Sure, for hobbyists it is fine, but for professional use... I disagree.
Consider a piece of scientific software, let's call it x. x can solve many problems, and these problems cut across a lot of disciplines.
A researcher who uses x needs only a subset of these features, and configures & compiles x with only the features he/she needs. This has many advantages, from simpler setup to shorter builds to easier verification.
When you're compiling software on remote, restricted systems, these optional components make people's lives a lot easier. Also, especially in the scientific community, every library needs something lower level, so trying to compile everything in results in an exponential explosion of dependencies. Optional components solve this very neatly.
The program may have an optional feature that requires a library. It does not make sense to force everyone to install it if such a feature is used by 1% of users. An example might be mplayer, where there are common decoding libraries used by everyone and also plenty of obscure decoding libraries.
Even better is to use dlopen() to open the optional library at runtime, to fit the distribution package-management paradigm, but this requires much more work than a simple option to disable the dependency at compile time. And even with dlopen(), it is useful to have the ability to disable an optional dependency at compile time, to avoid the requirement for its header files.
In addition to what other commenters wrote (failures later during compilation instead of at the start), you also need proper include path in CFLAGS.
For example, gdk-pixbuf is supposed to be included via '#include <gdk-pixbuf/gdk-pixbuf.h>'. That would assume the header file is in /usr/include/gdk-pixbuf/gdk-pixbuf.h . But the distribution that I use installs it in /usr/include/gdk-pixbuf-2.0/gdk-pixbuf/gdk-pixbuf.h , likely to be able to have multiple versions of the header files installed together.
You are supposed to use the 'pkg-config' tool in the configure script to add the proper header file locations to CFLAGS to handle this (in this case -I/usr/include/gdk-pixbuf-2.0).
> and in your makefile you will link your executable with -lpng.
The library you are using may itself depend on another library; you need to pass -l flags for all recursively dependent libraries to the linker. As this set of recursively dependent libraries may change in the future, you are supposed to use pkg-config to find the proper link flags.
Note that all of this is not really dependent on configure, you can do that in a simple makefile (by just calling pkg-config directly from it). So it is more about using pkg-config instead of hardcoded paths than about using configure scripts.
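A sketch of what that looks like in a plain GNU makefile (the target and source names are hypothetical; gdk-pixbuf-2.0 is the pkg-config package name from the example above):

```make
# Ask pkg-config for the right flags instead of hardcoding paths.
# $(shell ...) is GNU make specific.
CFLAGS += $(shell pkg-config --cflags gdk-pixbuf-2.0)
LDLIBS += $(shell pkg-config --libs gdk-pixbuf-2.0)

viewer: viewer.c
	$(CC) $(CFLAGS) -o $@ $< $(LDLIBS)
```

No configure script involved: the flags are resolved at build time, and the build fails with a clear pkg-config error if the library's .pc file is missing.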
This makes no sense. Preprocessor options have no place in CFLAGS, which is for compiler options. I guess you meant CPPFLAGS? Putting a preprocessor option on CFLAGS is always wrong and expected to fail.
> But distribution that i use installs it in
If your distribution installs headers outside of the preprocessor path (or somewhere inside CPATH or C_INCLUDE_PATH), then I'd say that this distribution is broken: it has not installed the library, just copied the files somewhere. The makefile is then, correctly, expected to fail. It's just as if you had unzipped the library on /tmp, of course you would not expect the compiler to find it!
Technically true, but pkg-config does not distinguish between CFLAGS and CPPFLAGS.
> If your distribution installs headers outside of the preprocessor path (or somewhere inside CPATH or C_INCLUDE_PATH), then I'd say that this distribution is broken
Both Debian and Red Hat use this path. These libraries are designed to be located with pkg-config, so you can install them anywhere, and everything works as expected.
Your parent poster IMO has a point: it seems that part of the reason for autotools' existence is also to cater to all the distributions choosing weird places to put headers and libraries in. Shall we accommodate them forever?
As for the Linux desktop, personally I'm skipping it due to the complete lack of security and sandboxing in X11: that basically every program can be a keylogger is the typical egregious example. And Wayland is kind of funny; people are still trying hard to make it 60FPS-capable, which is absurd in 2021.
To give you a less clear example, think about writing a library that sends telemetry to some cloud service. The telemetry could be readings from a sensor attached to a microcontroller with 512K of RAM or it could be readings from a server sitting in a datacenter with 256GB of RAM.
You need some helper library to handle the protocol and there's a really nice full-featured library that comes with a bunch of handy debugging tools but it's too memory-hungry for that tiny microcontroller.
Another option is a minimalist library that uses very little memory but also has less flexibility in say TLS.
If you pick just one, one of your platforms ends up suffering needlessly.
Adding the options allows the user to decide what's best for them.
I simply don't understand the explosion of complexity associated with autotools, cmake, and the like.
If you refer to tools like cmake as "explosion of complexity" then I have to say that you are using them entirely wrong, and you should review your practice thoroughly.
I mean, cmake is a makefile generator. You specify a high-level description of your project, and then you run cmake for it to generate your Makefile that performs the whole build.
With cmake you don't even care which compiler you use. You can simply state you require, say, c++ 14 with constexpr support and you're done for all platforms and all compiler picks.
If you believe cmake is more complex than a makefile then either you only work on single-file projects or you have been cursed with hideously managed cmake projects.
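For instance, the "require c++14 with constexpr" claim above maps to a couple of lines in a CMakeLists.txt (the target name is hypothetical; cxx_std_14 and cxx_constexpr are real CMake compile features):

```cmake
add_executable(mytool main.cpp)
# Declare what the code needs; CMake picks the right flag spelling
# (-std=c++14, /std:c++14, ...) for whatever compiler is in use.
target_compile_features(mytool PRIVATE cxx_std_14 cxx_constexpr)
```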
Autoconf for example can abstract across different compilers from different vendors, with all their various incompatible command-line arguments. And it does that for a dozen or so different languages, not just C.
It can abstract across different function signatures in the libraries you're using. For example if you're calling a libc function where the arguments have changed, you can detect which version of the function you have and make your code work with either one.
It can do the same for structs and types as well.
It can test for system services, such as X Windows or the ability to run Perl scripts.
It can abstract over the differences between different implementations of common utility commands such as awk, grep, install, mkdir, and so on.
It does a lot of stuff!
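A sketch of what a couple of those probes look like in a configure.ac (the project name and checked functions are illustrative; the macros themselves are real Autoconf):

```m4
AC_INIT([mytool], [1.0])
AC_PROG_CC
AC_CONFIG_HEADERS([config.h])
# Defines HAVE_GETLINE / HAVE_STRLCPY in config.h if the functions exist:
AC_CHECK_FUNCS([getline strlcpy])
# Checks that libpng is linkable and, if so, prepends -lpng to LIBS:
AC_CHECK_LIB([png], [png_create_read_struct])
AC_OUTPUT
```

The source code then keys off the HAVE_* macros rather than guessing what the platform provides.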
Of course it does. My point was that I've never found it did anything that I was interested in.
I have only found this behavior in three different projects that used cmake. After several minutes of beautiful compilation lines in green, it turns out that the library (that cmake was supposed to check) was not installed after all. Of course it printed a beautiful error message in red. Beautiful but non-informative, that is.
This is a nice feature since it avoids a lot of manual configuration. The downside is that support fails silently if the library is missing - since that is the expected case for most users.
...because almost nobody can really write any code in M4sh (M4 with bash spliced together) and other odd languages.
Really, the e-mail from the actual link is surprisingly sober and calls out the weaknesses of autotools well.
Definitely the latter. Few projects treat `configure.ac` and its macros as seriously as programming deserves. They produce spaghetti code, duplicated code, patched code, slow code, unmaintainable code, and they leave it there for decades, only adding code that makes it worse, without ever feeling the embarrassment to fix it. Of course, blame the coder before blaming the language.
So you might have HAVE_FOO and HAVE_BAR in config.h. But if the developer doesn't do something like:
#ifdef HAVE_FOO
int ret = foo();
#elif defined(HAVE_BAR)
int ret = bar();
#else
int ret = do_some_fallback();
#endif
Of course, that assumes they even put the tests for foo() and bar() in configure.ac in the first place!
I would still use autotools for a C project even if I was only targeting linux/x86_64, just so I could write a simple declarative Makefile.am file that lists sources and defines target outputs, and have autoconf/automake figure out everything else, and put it behind a standard interface (that is, one that respects prefix, destdir, etc.). Also being able to easily write tests for dependencies and get the appropriate cflags and ldflags makes life easier.
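A sketch of such a declarative Makefile.am (all names hypothetical; the DEPS_* variables would be filled in by a PKG_CHECK_MODULES test in configure.ac):

```make
bin_PROGRAMS = mytool
mytool_SOURCES = main.c util.c util.h
mytool_CFLAGS = $(DEPS_CFLAGS)
mytool_LDADD = $(DEPS_LIBS)
```

From those few lines, automake generates a Makefile with all the standard targets (install, dist, check, clean) that respects prefix and DESTDIR.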
...that kinda works, actually.
The Autotools model of probing for system capabilities at build time isn't dead; it will never die, it will only be used in greater and lesser amounts and to complement other means of discovering system capabilities.
CMake seems to me like a reimplementation of the Autotools model of build-time probing. The true competitors are binary distribution after building on precursor systems with full specification and no probing, and cross-compilation with full specification and no probing.
Serious users use autotools (with mingw), but the occasional windows noob cannot be helped. So he gets his limited cmake experience. Cmake usually only does a tenth of the autotools probes, so edge cases will never be detected and worked around, such as broken compiler versions, which happens frequently with gcc or icc.
Cmake is a Visual Basic-like hack, but Visual Basic at least had a proper design. It just lacked the library infrastructure.
Everybody uses autotools, show me a cmake-based distro.
All this autotools bashing is hilarious. Probing is costly yes, but you only do it once, and you rarely do it by yourself. Distro packagers do that for you.
Today, it's better at portability than most Autotools builds I've used. In projects which maintain both systems side-by-side, I've had Autotools fail on Solaris, MinGW and Cygwin while with CMake it worked without trouble. And that's for projects which hadn't specifically tested and fixed defects for portability issues on these platforms. Empirically, CMake had better platform coverage.
As for the probing of features, take an actual look at what CMake provides. Over an order of magnitude more feature tests. The problem with the Autoconf feature testing is that the vast majority of the tests are 15 years obsolete at this point. The CMake tests are mostly for contemporary platforms. That's where the value lies.
The LLVM project exclusively uses cmake.
xmkmf joined the chatroom
There's more than just compiling, it adds a bunch of common stuff "out of the box" (with a line or two of configuration). Test runners, pkg-config, installation, cleaning, out of tree builds, etc. Nothing magic that you couldn't do yourself, but IME by the time you do it your makefile is much less simple, it's taken a non-trivial amount of time and possibly has a bunch of subtle errors. Even on my own little projects that will never run on another computer let alone another OS I find autotools worthwhile.
If I had one major complaint it's that it still puts build artifacts in the root directory.
Agreed. The configure script doesn’t take advantage of concurrency, but make can.
But the number one cause of slow configure is that people forget to run configure -C. The -C option makes it cache the outcome of all of the tests, so that it doesn’t have to do them all again. This is a huge time-saver!
And it does quite a lot more than just probe the compiler.
And finally, it doesn’t support concurrent execution because nobody who has tried to write it has returned. Usually it starts with confident estimates and preparations, followed by departure and sporadic reports. Gradually the reports dwindle in frequency and comprehensibility, until eventually nothing more is heard.
Do you mean the linux that puts libraries in
the "modern cmake" movement is a step in the right direction
It is nice that windows and linux folks can both work on a project (which I see as windows developers being able to contribute to linux software)
The reality is that it's kind of messy when you actually try to develop something with it. (but not try-to-figure-out-m4 messy)
I'd say that a large portion of the bad name that autotools currently have is due to their users. On one hand developers, who as you say do not actually use all the deductions of the configure scripts.
But the users (the ones who blindly type ./configure) are also a little to blame. If you complain that you wait while configure checks whether your machine is 32 or 64 bits, etc., please read up on how to specify "site defaults" once, and then all your configure runs will take a fraction of the time.
However, from the point of view of a developer, autotools are way too much for my brain. The m4 macros are inscrutable and I never felt like I had any hope of actually understanding how they work. It's one of those technologies where my only hope of getting work done is copy-pasting snippets of code I got from other people.
Anyway, does anyone know if there are alternative build systems that follow the same paradigm as autotools, but more pleasant to use as a developer?
Maybe we should make an easy-to-use version of autotools which does nothing but accept the standard prefix, flags, etc. options. You wouldn’t be able to do sophisticated configuration, but let’s face it, most codebases only pretend to do that.
One example that someone mentioned in a sibling comment is Autosetup, which apparently uses Tcl instead of posix shell + m4. An interesting idea... Tcl is one of those languages that's small enough that it's feasible to include a copy of the interpreter together with the build scripts.
Since then I’ve started rolling very simple Makefiles that just call the compiler. cc will complain loudly when it can’t find the right headers, and for these simple projects I don’t need any configuration flags, so why worry about all the machinery of autotools?
The big alternatives I remember are: SCons, Maven, and CMake.
I liked CMake the most; Autotools was nearly unusable for me. It seemed like I needed to learn almost as much about Autotools to get productive as I did for C++.
If I'm wrong about anything, someone will come along to correct me.
Many Linux distributions do it as a matter of course to ensure it's actually possible to regenerate and that it's up to date. At that point, you start to question the necessity of embedding it in the first place given that its primary consumers don't care.
It's very compact, so it can be included along with the project. It does the configure step, which tests compiler features and installed libraries, then generates the Makefile using your custom Makefile.in.
Basically, it's a compact set of Tcl scripts; it even includes a small Tcl engine, in case Tcl is not installed on the platform.
autogen.sh: command not found
However, if you're pulling a random commit from git rather than ungzipping a proper release tarball (for which autogen and co have already been run on a dev system), then that workflow of course can't work.
Which is one of the points of the linked discussion - that folks clone from git rather than doing "proper" releases, with cloned repos increasingly bringing their dependencies with them. Another point being that modern "language ecosystems" a la Go and Rust have their own canonical package management and aren't really made for polyglot development and linking with locally installed libs.
I don't quite get the autotools hate; from a user PoV, it's the one build system that has worked extremely well over the decades with just POSIXly tools installed locally (make, sh, cc). The same can't be said for cmake. Not a particular fan of libtool, but arguably the invasive thing it does is a consequence of link-loaders such as ld.so still not getting lib resolution quite right in spite of ld.so's heavy-handedness (Mac OS's is saner IMO). Another reality is that Docker builds are used to shield against lib breakage.
IMO, what could be done to simplify builds is not to bring in a new grand-unifying builder a la cmake, but to find common ground among GNU and BSD make, make generic "make" more powerful such that Makefile macro expansion works in more places than it does now, and rely solely on Makefiles and POSIXly/LSBly C/C++ header/macro def discovery in your source files rather than relying on automake, config.h, and -DHAVE_XYZ. Then slowly deprecate autotools and restrict yourself to targeting the much more uniform landscape of Linux, BSDs, and Mac OS we have today.
Either autotools for the quality projects, or makefile projects for header-only like projects. Cmake is faster, but extremely limited.
Today UNIX and those platform-specificities don't exist anymore.
Instead we have Linux, FreeBSD and MacOS. Unfortunately, autotools haven't kept up and don't actually do anything useful to help you write portable code across Linux and MacOS.
Ultimately, devs shouldn't have to suffer like that to build C or C++ programs. Why is building executables still so hard, even today? Developers need to solve that problem once and for all. Containers aren't the solution.
Isn't that a case of "now you have two problems"? :)
> Isn't that a case of "now you have two problems"? :)
No. Now you have 3 problems: the 3rd party and the tool.
- Everything but Linux and a few of the more popular BSDs
- 32 bit CPUs
- CPUs with less than 1,000 installations
- Ancient toolsets that don't support features everyone else takes for granted in the last 20 years
For instance, it has to be a pain in the neck to sit on a code base with Solaris 2.6 support, or NetBSD on Alpha, or... Also, it seems like there'd have to be some legacy pain around things like "we can't use this flag on sed because Amix didn't have it".
But I don't know enough about the infrastructure to know if pruning those out would make a bit of engineering difference. How much is support for ancient or little-used stuff slowing down development?
Alternatively, I wonder what it would look like to have a build farm that precomputed all the values. "Oh, you're on macOS 11.1 on Intel? Here's the list of 32 envvars you need to set." If a million people are compiling the same file on a million identical computers, is it a great idea for all of them to have run the same probes?
I'd never advocate for support for such systems to be scrubbed from the Internet. I'm just saying it's unreasonable to expect a maintainer today to support ancient systems. The people still using those systems have the right to fork the tools and maintain their own version, but that's not the same as making upstream do it for them.
Slice autotools down to 32-bit & 64-bit UNIX systems in the last couple of years (bearing in mind that if you start that today, by the time it is finished it will constitute support for the last 3-5 years)... or even all the way to current systems for the parenthesized reason... for C and C++.
Would probably also call it "Autotools 3" and slice away anything else that doesn't seem useful, on the grounds that 2.7 isn't going anywhere. If in doubt, slice it away and see what people say when you release 2.99.01.
This probably takes it down to something not so enormous to carry around.
It seems like other languages are all going their own way. Maybe that's bad, maybe that's good, most likely it's a complicated combination of both, but most importantly, there's nothing Autotools can do to stop it at this point, so you might as well roll with it.
What are the long tail of variants that people actually care about now? 32- vs 64-bit, OK, sure. 36-bit? No way. Big- vs little-endian, sure. PDP-endian? Miss me with that.
It just seems like the number of variations that people plausibly care to support is a lot shorter than it use to be. Why should the autoconf gang work themselves to the bone making sure their stuff works correctly on a platform that no one but a platform's maintainers actually cares to target?
I'm not unsympathetic to people using old or odd systems. I've got some bizarre stuff squirreled away in my attic. I don't reasonably expect anyone but me to put effort into keeping my SPARCstation 5 limping along, though.
I'm still to this day paid to write embedded code for various DSPs, I'm very happy that autotools are portable. I know that on HN if you're not writing NodeJS or Rust in x86-64 docker containers you're niche and don't count, but it's a bit short sighted.
Autotools doesn't need disrupting.
I even think that the mail in TFA is somewhat misled. Autotools are dropping in popularity because the type of software development that requires them is slowly but surely declining, or maybe it's not even declining but simply growing at a fraction of the rate of other ecosystems. Actually, many modern languages and environments (IMO rightfully) ship their own well-integrated build systems, so attempting to bring the autotools there is probably a waste of time.
For me asking about "future plans for Autotools" is like asking for "future plans for Make" or "future plans for ls". I don't expect any substantial changes in these projects, but I also don't think they're obsolete. They just fit a specific use case and they do it (mostly) well.
A lot of people do. Retro-computing is way more popular than you would think.
In fact, m68k is the oldest port in the Linux kernel and is so well-maintained that it saw multiple other architectures come and go.
Please leave autotools alone and don’t “improve” it.
It’s not like there’s a shortage of build systems already, and if someone is so keen on a heavy-weight build system, they can use Bazel, which is so bloated that it doesn’t build on any 32-bit target in Debian.
An example off the top of my head -- a CNC mill at my previous job, purchased new perhaps ten years ago, runs DOS 6.22 with Windows CE on top. I suspect that the vendor is still selling the same software stack on the same hardware today.
Before someone chimes in to say, "That's insane, why isn't it running an RTOS?": That mill had zero software faults in the years I worked with it.
Edit: Poking around Trak's website, it looks like the software on today's machines is unchanged. Here are some example screenshots: https://www.southwesternindustries.com/software/page/prototr...
I'm chiming in to say that Windows CE is an RTOS. 
It also runs fine on my SH-7785LCR SuperH board where I regularly test the latest kernel releases and patches.
Frankly reading this thread I think many people here bark at the wrong tree. If you don't care for portability don't bother with autotools, just write Makefiles. It's much simpler.
I thought Windows CE was a standalone operating system like current Windows versions are, not a layer on top of DOS like Windows 3.x and earlier were.
But, Googling, I see Windows CE, on x86, did use DOS as a bootloader, just like how Windows 9x did. DOS boots, then runs LOADCEPC.EXE to start Windows CE. (By contrast, CE on other CPU architectures didn't do this, it just booted Windows CE directly.)
My previous employer has several hundred x86-based Linux field installations. Other hotspots include IoT and industrial control systems, to say nothing of all the legacy servers still out there in current use.
The simpler the hardware, the higher the reliability and the smaller the power consumption.
NetBSD on mid-90s hardware: uhh, that's cool if you want to experiment with it or enjoy the nostalgia value, but if you're doing that, you know what you signed up for.
So why is old tech persona non grata? I'm getting so tired of this death march.
Btw, we just saved the VAX backend in gcc.
The raspberry pi 3 is ancient?
Personally speaking, I cross-compile daily using CMake with toolchain files. Some modern RTOSes use CMake exclusively e.g. Zephyr or mBed. CMake supports quite a few open source and proprietary toolchains, and they all work perfectly well, and integrate nicely with whatever build tools and IDEs you care to use.
Interactive flashing and debugging via JTAG or SWD/SWO in CLion, Eclipse or vsCode is nice. You aren't getting that with the autotools...
I worked in the VOIP industry ~14 years ago and in the electrical sector ~8 years ago.
That’d wreak havoc for pretty much all of OpenWRT.
GNU make is used for lots of embedded devices.
And if not, please please please please deprecate libtool, which is the worst abomination. Making shared libraries with gcc and clang is not hard, and libtool should never be adopted for new projects (yet the GNU docs recommend it).
libtool is ugly because it tries to solve an ugly problem. To support the above, my manual Makefile was uglier than libtool :)
These days, if I'd have to do that again, I'll pick libtool without overthinking.
People need to learn why some things are as such and should see outside of Ubuntu/Windows/macOS comfort.
On current BSDs, clang works. Solaris and HPUX and AIX and Ultrix and Irix are all, in a word, deprecated. The 2000+ line long shell script that’s used to figure out which args to pass to the compiler, determined on the fly for each source file, can still be used on those deprecated systems, but it should also probably be deprecated itself when all you need is “-fPIC” in your CFLAGS (an oversimplification but not by much).
Every current platform of note has a perfectly good linker, making libtool redundant. More often than not, it's been an "anti-portability" impediment when it thinks it knows how to drive modern linkers but because it's barely maintained, it gets it wrong by trying to be too clever.
Even proprietary embedded toolchains come with full-fat ELF linkers these days.
Of all of the pieces which make up the Autotools, libtool is the one which could be dropped today with zero impact. It's the least useful and the least necessary of them all.
From that day forward, I never want to work with libtool ever again.
I don't quite understand this. What if one uses something else than clang or gcc on the target system?
I have used Autotools since the nineties and I have never been able to understand it or make use of it for a fresh project (well, once, using some gui tool that completely abstracted away the pain, was incompatible with anything else, and whose name escapes me now)
In its day it was a wonder: tar xf; cd ... ; ./configure --prefix=`pwd` ; make ; make test (if you are very lucky) ; make install
But times have changed.
Interfaces are much more standardised now, and from where I sit it is the "polyglot" problem that is interesting, not different architectures.
It is time to pick a winner from all the multitude of tools out there and GNU should go with that....
Plain Makefiles without configuration that work on linux and macos out of the box. Those are much easier to write than setting up autotools or writing a cmakelists.
* the size of my project doubled
* naturally, autotools would check for FORTRAN despite my project being in C
* if users built the software as root, it would trash critical system files
* instead of installing ELF library files in /lib, it installed non-functional shell scripts in /usr/lib
* cross compiling broke
The solution was plain old make. You can do a lot with make. It performs really well. I even did a "make install" that would depend on the final locations for everything. It was all configurable. It worked great.
The one thing I'll say about Autotools is that the learning curve is pretty steep, and it took a fair amount of experimentation to get it right on my first Autotoolized project. Maybe you never made it all the way over the hump?
It could be made easier to get started with automake. For example, "AUTOMAKE_OPTIONS=foreign" should be the default; otherwise automake refuses to finish if required files aren't found in the project.
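For context, the option in question goes at the top of Makefile.am; without it, automake's default "gnu" strictness aborts unless files such as NEWS, README, AUTHORS and ChangeLog exist. A minimal sketch (hypothetical program name):

```make
# Makefile.am (minimal sketch)
AUTOMAKE_OPTIONS = foreign   # don't require NEWS, README, AUTHORS, ChangeLog

bin_PROGRAMS = foo
foo_SOURCES  = foo.c
```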
There's a political problem here in persuading people of this. The documentation could also be improved. I remember somebody was complaining about this page:
and asking what "maude" meant - it turned out it was the name of the dog or something of the person who first wrote the program, who didn't want it to change. So I think getting consensus is impossible for changes that are needed to make the program simpler, easier and better. There's a vacuum of responsibility where nobody wants to step on others' feet. Forking the project would go nowhere, as nobody would use the fork.
Yes, projects themselves could move to something else, but there are many options for that (mentioned in the article). I'm not sure how much value an incompatible autotools-flavored tool would have to anyone. (If you're not templating shell, there are way better flavors of build tool, anyway.)
This is, incidentally, a software design lesson: if your input space is too big and you basically encourage people to see straight through your abstractions, you have no abstractions. (I'm facing this at work: we have an intricate set of homegrown makefiles that evolved over some 15 years, and make is hard to teach and debug, but it's pretty difficult to switch to anything else because we allowed people to write custom makefile rules too and there are lots of them. We can automatically convert the easy cases, which are the majority, but then we'd have two systems, which is worse.)
What do you have against m4? It's a lovely little macro language.
After having to maintain some larger M4 macro collections (for Autoconf and the Autoconf Archive), as well as using it in some work projects for code generation, I'd have to say that maintenance and testing are hard. Quoting can be painful. And it has a number of annoying limitations. In retrospect, I think it was a bad choice, and I wouldn't choose to use it again. Trying to understand what's going on with multiple levels of nested macro expansions is painful, particularly when you can't directly introspect all the intermediate expansions.
When it comes to M4 usage within Autoconf, I'd happily see it dropped entirely in favour of shell functions. It's a historical implementation detail which should have been removed over a decade ago.
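To make the quoting complaint concrete, here is a small hypothetical m4 fragment (illustrative only, with the expected expansions shown as comments). Whether a name is expanded before it reaches a macro depends entirely on quote placement:

```m4
define(`greet', `Hello, $1!')dnl
define(`name', `World')dnl
greet(name)      # -> Hello, World!  (argument expanded before the call)
greet(`name')    # -> Hello, name!   (quotes suppress the expansion)
```

Now nest a few of these inside generated shell code, as Autoconf does, and tracing which quote level a given token sits at becomes exactly the maintenance problem described above.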
This kind of macro language is really the worst way to write any kind of software. It’s not maintainable, it’s not readable, and there’s no way to collaborate with other developers on it. There’s no way to create abstractions, so you must keep all the details in mind at all times. You can’t put those details into a box and forget about them. It is Cthulhu-esque. It would be a chapter of The Book. Practicing M4, or any language in this family, has a permanent SAN cost. Just learning the language probably doesn’t have much cost, but even small sacrifices should not be made lightly.
That's horrifying, and certainly not the purpose of m4. Yet, it does not mean that m4 is not beautiful, for doing small things that would be a bit cumbersome in awk, for example.
> There’s no way to create abstractions
Best feature in my view. Abstractions always suck. Would you complain that there's no way to create abstractions in sed, for example?
I honestly don't see the need for autotools for AT LEAST a decade now. Except for a steady wall of text to make it look like you were doing some cool hacking work.
Firstly, that the "Autotools are not obsolete". Sorry, but they are. They have been for well over a decade. They don't serve today's needs effectively, and that is the reason for the decline in contributors and end users. They moved on. And that includes me too (I did the initial support for multiple language standards in Autoconf, starting with C99). Support for new language standards and features in Autoconf is woeful. Support for integration with various libraries, tools and frameworks is woeful. Other systems support them out of the box.
As for whether support for them should continue. That depends upon what individual projects need in terms of portability requirements. Clearly a lot of projects have multiple decades of investment into the Autotools. There is a sunk cost here, and replacement is costly.
Despite that, I've converted all of my own projects, plus many other projects I work with, over to CMake. In every case it was a net improvement, and well worth the migration hassle. Many projects overestimate the difficulty. In doing these migrations, I've seen many examples of buggy use of the Autotools. It only provides "theoretical portability". Unless you actively test it, it's most likely broken. In direct comparisons during project conversions, CMake has worked on various minor platforms (Cygwin, MinGW, Solaris) where the Autotools support was broken for years and no one noticed. That is to say, direct empirical evidence showed worse Unix portability for the Autotools than CMake.
If the Autotools want to retain relevance, the main hurdle they need to cross is portability to modern platforms. They can't stick to Unix-only portability problems of the '90s and remain relevant. So many projects have needs beyond that, and that is the main reason for the declining use. Fix that, and people might have a reason to stay.
Unfortunately, if you follow the mailing list and such, you'll see that most of the long-time developers and users have zero interest in portability outside the niche they have inhabited for the past 25 years. They don't understand how the rest of the world works or what their needs are. There is no comprehension about using tools other than shell and make. The developers who did care about this stuff moved on. I became a CMake contributor instead. Newer build systems moved on to ninja and other tools, support various IDEs, and integrate with all sorts of other systems and tools. The Autotools don't do any of that, and likely never will. Yet for most developers, these are non-negotiable. Want to use CLion, Visual Studio or Eclipse?
- If you ship a trimmed down version of ./configure, it will run quite fast. Many people copy&paste configure.ac from other projects and do not delete unnecessary tests.
- It is easy for users of other systems to add portability patches. With ./configure you can do anything, not so with meson or cmake.
- ./configure's command line syntax is still the most elegant compared to the enterprise syntax of cmake and meson. --prefix=/home/foo/bar vs. -DCMAKE_INSTALL_PREFIX=/home/foo/bar. Specifying CFLAGS in cmake is painful.
- Unlike cmake or meson, autotools do not dictate a rigid source tree structure.
- You can easily build both shared and static libraries, which the "modern" build systems do not support.
- Cross compiling on Linux with autotools/gcc is heaven compared to cmake/clang. You just need the host triplet and you are done.
- The build output is exactly what I want to see. cmake and meson/ninja hide the output by default. If you enable it using yet another arcane option, the output is more chaotic and unreadable than in most autotools projects.
In general I find autotools projects like gcc easier to compile. The flags are more intuitive, the output is organized and readable. Compare that to cmake projects like llvm, which always seem to expand the option set to more than 100 unintuitive -DENABLE_ARCANE_FOOBAR_QUUX="a;b;...;z" flags.
Autotools is very much appreciated. Thank you for another release, I am looking forward to future ones!
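The command-line contrast drawn a few comments up can be made concrete. These are hypothetical invocations against an imaginary project (the toolchain file name is made up; real cross-compile setups vary):

```shell
# autotools: prefix and cross-compilation in two short flags
./configure --prefix=/home/foo/bar --host=aarch64-linux-gnu
make

# cmake: the equivalent, spelled out as cache variables
# (cross-compilation details typically live in a separate toolchain file)
cmake -DCMAKE_INSTALL_PREFIX=/home/foo/bar \
      -DCMAKE_TOOLCHAIN_FILE=aarch64-toolchain.cmake \
      -DCMAKE_C_FLAGS="-O2 -g" ..
make
```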
Can somebody elaborate this point? Does that mean I need to go through paperwork even for contributing a bug fix?
This is one reason for its demise. I've done the copyright assignment for both Autoconf and Automake. It's one reason why there are external macro archives (e.g. Autoconf Archive), when much of this functionality could be in the core projects.
Example: Why, in 2021, should you have to rely on external support to do something as trivial as enabling multithreading [ACX_PTHREAD]. It's not even portable to non-pthreads platforms. So much for portability. Contrast with CMake [find_package(Threads); use Threads::Threads].
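The CMake mechanism the comment refers to is built in; a sketch with a hypothetical target name:

```cmake
# Portable thread support: resolves to pthreads, win32 threads, etc.
find_package(Threads REQUIRED)
add_executable(myapp main.c)
target_link_libraries(myapp PRIVATE Threads::Threads)
```

No third-party macro archive required; the `Threads::Threads` imported target carries the right compile and link flags per platform.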
Compared with CMake, which doesn't require copyright assignment, and anyone can contribute to. It has hundreds of contributors, tons of built-in functionality, and more third-party extensions than Autotools could ever dream of. Copyright assignment is a big part of that, but also so is upstream attitude to third-party contributions. Kitware do this very well. The Autotools maintainers not so much; partly down to their own historical bad choices which prevented change to the internals due to the potential for breaking compatibility.
Getting corporate sign-off for such things can take months while it wends its way through legal departments and upper management. Many people won't even bother, it's such a hassle.
Last one I did was for Apache. It took at least six months. Previously did several for GNU which were much quicker (in a different company).
Compiling Gentoo packages on a 32-core/64-thread Threadripper is getting ridiculous, with many packages, even some "larger" ones, taking longer to configure than to actually compile.
> There are few active developers and no continuing funders.
Like most open-source projects.
> There is no continuous integration
> Bugs, feature requests, and submitted patches are not tracked systematically.
> there is code shared among Autoconf, Automake, and/or Gnulib by copying files between source repositories
Like libmtp / libgphoto2.
I use it for lots of things, even NodeJS, and it helps me keep my builds more portable.
I’d hate to see GNU make completely abandoned.
So, don't worry - GNU Make isn't going anywhere, regardless of what eventually happens to Autoconf.
But I find M4 a true gem "no one has heard of" - a portable macro processor you can use with basically anything involving plain text. It is especially useful where you have tons of boilerplate code to type (Java, Ansible, Go, js/html/css, to name a few).
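A tiny hypothetical sketch of that boilerplate use case: generating repetitive Java getters from a one-line macro, with the source piped through m4 before compilation:

```m4
define(`GETTER', `public $1 get$2() { return this.$3; }')dnl
GETTER(`String', `Name', `name')
GETTER(`int',    `Age',  `age')
dnl each line expands to a complete Java getter method
```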
I think email is fine as the communication medium for software projects – particularly for projects that have been around for as long as Autotools†. For me, the biggest downside is that not all mailing lists use GNU mailman or other listserv software that provide archiving of list posts. Having a searchable archive of discussions is invaluable.
† though I was surprised to see a number of contributors top-posting; that practice was universally discouraged on Internet mailing lists until the number of new members using MS Outlook overwhelmed members who followed the established “netiquette”
More specifically, the fact that no one cares about the usability of the archive or online presentation is proof that they don't care about the impression they make external people. They can hardly complain about the lack of new contributors if they make no effort to be accessible to them.
Do you have examples of projects that have used a "better" method of communication that have achieved what GNU, Linux or any of the BSDs have achieved?
Simply, you can't complain about the lack of new contributors if you make no effort towards appeal and usability.
Why would that be? They both just seem like a source distribution to me... why would there be a difference? Can anyone explain this?
The source repositories typically do not have these already generated (since checking in generated files is frequently frowned upon), and building them often requires getting the right versions of autoconf/automake/etc. installed, as a different version may not be fully compatible.
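In practice this is the difference between the two workflows (hypothetical project names; `autoreconf -fi` is the standard way to regenerate configure from configure.ac):

```shell
# From a release tarball: configure is pre-generated, no autotools needed
tar xf foo-1.0.tar.gz && cd foo-1.0
./configure && make

# From a git checkout: regenerate the build scripts first, which requires
# compatible versions of autoconf/automake/libtool to be installed
git clone https://example.org/foo.git && cd foo
autoreconf -fi        # or ./autogen.sh in many projects
./configure && make
```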
I've been a member for years, but the business model (gasp) languishes for lack of a practical vision.
Would also drive contributors as just mentioning a new languages on your README floods your issue tracker these days :)
GNU Make is a different story. Easy to get core saturation for large workloads. Stanford GG allows you to massively distribute it to thousands of workers. Still very relevant.
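For reference, the usual way to get that local core saturation (distributed tools such as distcc or Stanford's gg can then farm the same build out to many workers):

```shell
# run independent recipes in parallel across all available cores
make -j"$(nproc)"
```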
Which in turn are not the best formats for binary distributions.
As for Perl, I don't even know whether it needs to be installed on the packager's computer.
One more reason for C++ projects to settle on CMake.