GNU Autotools: A Tutorial [pdf] (elinux.org)
91 points by jnxx on March 30, 2021 | 77 comments



If you prefer a video, you might check out my 3-part autotools tutorial, starting here: https://www.youtube.com/watch?v=4q_inV9M_us

The video shows an animation about how the autotools work (which I think helps understanding), and has a CLI demo of how to use it, all for the low price of free :-).


Today I had to use autotools for the first time, where I previously only used CMake.

I thought I would be scared, and m4 does look daunting, but what they at least get right is that it's trivial to generate both shared and static libraries, cross-compilation is a breeze, and in the end all you need is a shell to run configure and make to build and install.

CMake defaults to either shared or static libs, and sometimes you have to read the docs for a project to figure out what custom variables influence this; and cross-compilation is almost always guaranteed not to work because some find_x cmake script tries to compile and run a C source to figure out certain properties about the target (but it confuses that with the host system).
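For what it's worth, a sketch of what that looks like with a libtool-based autotools package (the target triple and prefix here are just examples, not from any specific project):

    # one configure run, shared and static libraries, cross-compiled
    ./configure --host=arm-linux-gnueabihf \
                --enable-shared --enable-static \
                --prefix=/opt/sysroot/usr
    make
    make install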


I think autotools' biggest limitation is its UX: it's very difficult to learn what the good practices are, what's deprecated, and what new resources are available to make your build scripts more robust. Discoverability is poor. Every autotools project that I come across is a mess of copy/paste from another autotools project, with unneeded (and sometimes wrong) checks and complexity.


Last time I looked at autoconf et al. from a developer's view -- which was, admittedly, probably 20 years ago -- the documentation was quite good. The discoverability of, for example, the C standard library is also quite poor if you do not read the manual.


My take on discoverability is exactly that: being able to figure things out without needing extensive documentation. The documentation should exist, of course, but you shouldn't need it for every corner.


This is exactly my biggest problem with some other tools: That they try to automate everything with lots of "magic". And this does not lead to clearly understandable, well-defined systems which are maintainable in the long run. If one is debugging hard problems, for example in multi-threaded code with deadlocks, what happens is that one needs to get a very clear understanding of what a certain piece of code does. The same is the case for debugging build systems. In the end, the build system needs to deliver an efficient, clearly understandable, unambiguous definition of how the software is constructed from the different pieces.

Some systems seem to try to absolutely minimize the amount of configuration code which is required for simple things, using "automatic configuration". If this is for clearly defined things like configuring a library to be static, this is fine. But when it becomes unclear and ambiguous what actually happens, or, worse, when you can't say what versions of dependencies are actually used, this is too much magic. And also when one needs twenty minutes or more to figure out how to switch on address sanitizer, or use a specific language standard. And this becomes worse when the configuration becomes more automagical with each iteration of the build software and the constructs and semantics start to depend on which version of the tool one is using.

Autotools has the advantage that these things are quite explicit and the configuration interface is very stable, which helps with understanding and maintaining a system.
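For example, switching on the address sanitizer or picking a language standard goes through the standard configure variables, which are documented by ./configure --help (a sketch, nothing project-specific):

    ./configure CC=gcc \
                CFLAGS="-g -O1 -fsanitize=address -fno-omit-frame-pointer" \
                CXXFLAGS="-std=c++17 -fsanitize=address" \
                LDFLAGS="-fsanitize=address"
    make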


I just learned it three days ago, based on the tutorials I linked, and I found it quite easy to learn. In particular, it has very good introductory as well as reference documentation.


Here is a reddit link which shows a minimal setup:

https://old.reddit.com/r/programming/comments/a02n5/trying_q...

This is all one needs to start a project. I actually think it is a good idea to use it quite regularly for hobby projects as well, just like "git init ." .
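In case the link rots, a comparably minimal setup is roughly the following two files (a sketch; "hello" and main.c are placeholder names):

    # configure.ac
    AC_INIT([hello], [0.1])
    AM_INIT_AUTOMAKE([foreign])
    AC_PROG_CC
    AC_CONFIG_FILES([Makefile])
    AC_OUTPUT

    # Makefile.am
    bin_PROGRAMS = hello
    hello_SOURCES = main.c

After that, "autoreconf --install && ./configure && make" builds the program.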


Sounds just like cmake


In my experience, it is much better than cmake: apart from having good and comprehensive introductory documentation, it has a nicely ordered reference, and it is stable, while cmake keeps changing, sometimes in backward-incompatible ways, and you find a lot of information that is disconnected and fragmentary, without knowing which versions it applies to. In my experience, the latter will also make it much, much harder to maintain and fix cmake projects in the long run. Think of a "modern" C++ code base which is 15 years old and in which every developer tried to include their favorite language constructs.


That is my experience as well. I have been using autotools-generated packages for well over 20 years, but thought it would be difficult to learn.

Recently, I got the job of supporting dozens of packages in an organizational environment. The first choice was cmake with Conan, but I found both to be quite ambiguously defined, and it was really hard to figure out what was going on and how to get things to build. Then I found the autotools tutorials I linked before, and I needed less than one day to get going with autotools - and I find the build scripts easy to debug.

I think this is a larger issue with CMake - it has a lot of magic going on, but it always leaves the user with fuzzy definitions and a really unclear concept of what is going on. And while I like the kind of magic which happens inside Emacs, for example, when it does just the right thing when I move the cursor around, I hate the kind of magic which leaves me wondering what is going on and unable to fix issues based on systematic knowledge. I find it really, really time-consuming to work with such systems.


A nice introduction to how the autotools work, what the basic concepts are, and what one needs to know to configure an autotools build in a standard project. Especially compared to some other build systems, it is not only solid and well thought-out but also surprisingly well documented.

Another really nice, more detailed introduction is this one:

https://www.lrde.epita.fr/~adl/dl/autotools.pdf


I also highly recommend John Calcote's book "Autotools" from no starch press: https://nostarch.com/autotools2e

Probably the best technical book I bought in the last year. After reading it, I have put aside all the prejudices whispered to me and am convinced that it will still be worth betting on GNU autotools in the 2020s.


Can't resist pointing to the "Die GNU Autotools" book, whose title, when read as English, expresses the feelings of many, but it's actually German. https://books.google.co.jp/books/about/Die_GNU_Autotools.htm...


What are some things that autotools can do that more modern tools can't?


First thing that comes to my mind is the best-in-class cross-UNIX compatibility. But shouldn't the question be the other way around: What can the modern tools do that autotools can't?


The obvious: CMake compiles into either Makefiles or Visual Studio solutions, allowing for cross Linux / Windows builds.

Support for Ninja files also seems to speed up builds in my experience. (Ninja is a Makefile "assembly language" with minimal features, built on the assumption that higher-level tools like CMake generate the Ninja files.) Ninja is fully parallel and focused on speed instead of features, and it really does show.

Since ./configure is single-threaded, a substantial amount of time needs to be spent whenever you make changes to your Autoconf / Automake stuff.

CMake has a few nice things: a built-in basic testing framework (just interpreting the exit codes of programs: 0 is success and anything else is a test failure). Said framework has integrations with various other test frameworks (like Googletest: https://cmake.org/cmake/help/latest/module/GoogleTest.html).


Autotools actually has some built-in test framework support too. A Makefile.am can list one or more programs in the TESTS variable, and they will run when `make check` is executed. Depending on how you set it up, Automake knows how to use the TAP protocol to collect the pass/fail from your tests.

You can also run `make distcheck` to generate a source tarball, extract/build the tarball in isolation, and run `make check` on the result. A very handy way to make sure that you've packaged everything correctly, and everything builds/tests OK when executed out-of-tree.
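A minimal sketch of the Makefile.am side of this (test_foo is a placeholder name):

    # built only for `make check`, not installed
    check_PROGRAMS = test_foo
    test_foo_SOURCES = test_foo.c
    # each listed program is run by `make check`; exit status 0 means pass
    TESTS = $(check_PROGRAMS)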


To those who downvoted my parent comment, what was your reasoning? Not trying to troll - I was just trying to explain a useful Autotools feature in a discussion about Autotools and CMake features.


> The obvious: CMake compiles into either Makefiles or Visual Studio solutions, allowing for cross Linux / Windows builds.

I get that this is especially attractive for some companies earning money with Windows. But how important are contributions from developers on Windows for Linux distributions and the free software community? I think that most companies which use open source libraries on Windows are unlikely to give anything back.

Also, when it comes to things like networking and server software, Windows has many important differences, and is not that relevant at all any more. Well, every project will need to figure out on its own whether supporting Windows builds is worth the time and hassle.


And good integration with conan. Dependency management is nice. Vcpkg is getting there too.


The problem with conan is that it is yet another language-specific package manager. This is especially unfortunate since the share of pure C and C++ projects is shrinking, and both are more and more used in cross-language projects. And for the latter, I think a Linux distribution or a system like GNU Guix is much better suited.


Distro package managers lag and are fragmented. Guix looks good. Neither supports Windows, though, if you have that need. Trying to use conan in a blended C++/Rust code base hasn't been that bad, but that is because we weren't using any crate outside the standard library.


Good point. Personally, I've avoided autotools for a long time since many projects use CMake and because of its reputation as a painful thing to learn. Lately, however, I've started learning more about autotools, so was just curious as to what it's really good at.


I am absolutely the opposite. I've avoided CMake for a long time since many projects use autotools and because of its reputation as a painful thing to learn. ;)

But actually, one of the points made in the Calcote book really stuck with me: Build tools are not just about developer convenience; they're also about the user's convenience. It's more work on my end, but my users (who, admittedly, are on UNIX-likes) know to type:

  ./configure
  make
From their standpoint, it just works as expected.


Yeah, I find it really irritating when a build system doesn’t respect this interface. Really, cargo and all these other tools should have options for generating a configure script and makefile: they could be relatively minimal and just invoke cargo with the appropriate options, but it would make it much easier to build arbitrary projects from source.
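Even without upstream support, a project can ship a thin hand-written wrapper; a rough sketch ("myprog" is a placeholder, the Makefile is not something cargo generates, and recipe lines must start with a tab):

    PREFIX = /usr/local

    all:
    	cargo build --release

    install: all
    	mkdir -p $(DESTDIR)$(PREFIX)/bin
    	cp target/release/myprog $(DESTDIR)$(PREFIX)/bin/

    .PHONY: all install

Override the prefix with "make PREFIX=$HOME/usr install" if needed.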


This was a near magical revelation for me. I should finish that book.

It is astounding how many tools I have to install to get most tools to work. :(


It's fascinating, isn't it? Right! Users . . . :)


> From their standpoint, it just works as expected.

This is also quite important when you happen to work on larger projects with many dozens or hundreds of modules and you need to build a specific part weeks after you last looked at it. Having to look up how to build something every time is more than a nuisance.


I never understood why there are typically two commands and not just “make”. Is this a historical accident?


I think it is simply that make is too crufty to extend. It has perma-broken behavior around space handling, which nobody dares touch [1]. But you can't replace it because it is entrenched, so Makefiles must be wrapped, by configure or CMake or whatever. And these in turn have their own cruft, so must be wrapped, by autoconf and...well someone please wrap CMake!

The C and C++ ecosystem is crying out for a new build system, but there is no mechanism to converge on one. It's bad.

1: http://savannah.gnu.org/bugs/?712


Make does what it does surprisingly well: It runs commands on files based on a description of dependencies. There is no shortage of would-be successors of make, but so far none of them has succeeded, which supports the hypothesis that "make" has hit a sweet spot.

There is one alternative to make which I think is worth mentioning, because of its simplicity, brilliance, and excellent support for correctness: redo, as in apenwarr's redo:

https://redo.readthedocs.io/en/latest/

And redo works with autotools!


This is by construction and is nicely explained by the slides I linked before.

Basically, the build is described by a makefile that works for your architecture, and that makefile is produced by a script called "configure". The configure script is independent of your hardware and platform, requires only a standard POSIX shell, and generates a makefile for the version of "make" that is present on your platform. This means that, unlike with other build systems, one does not have to install (and build!) the build system for one's own platform, and there are no compatibility problems, because the configure script always matches the source code distribution.

On top of that, the "configure" script is itself automatically generated, typically from two files, "configure.ac" and "Makefile.am", via a command which is nowadays called "autoreconf"; but this normally happens only on the developer's system.
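Put as commands, the division of labour looks roughly like this (a sketch of the usual flow):

    # developer, after editing configure.ac or Makefile.am:
    autoreconf --install     # regenerates configure, Makefile.in, helper scripts
    # anyone building on a target machine:
    ./configure
    make
    make install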


Oh yes, the reputation of the learning curve precedes it. But the book helped me get started and, as is often the case with complex things, once you've dealt with it a bit, the experience wasn't as bad as I feared.


- it has absolutely minimal requirements for running on a target system: it requires only a POSIX shell and make

- the macro language it uses is strongly standardized, so it produces identical semantics on all platforms. And because it uses macros, which are expanded into shell code, it does not require the macro interpreter to be present on the target systems either, only the generated script.

- the configure script is included in the source distribution and as such "frozen", so one has no problem with changes or incompatible upgrades which could break builds.

- the philosophy of autoconf is based on feature testing: for example, whether a given compiler version produces code with a certain property, or whether a library named "foo" contains a function named "bar", and so on (see the configure.ac sketch after this list). This is the right way. It is also good for handling upgrades between large libraries, like Qt.

- testing for features individually is the only way to escape a nuclear-scale combinatorial explosion of feature combinations.

- feature testing also supports evolving software: new functions that are buggy at first and need to be worked around, then gradually get fixed, perhaps turn into a separate library, then perhaps an element of Boost, until later you need a specific version of Boost, and so on. This is well suited to the bazaar-style development which is characteristic of Unix, with its open interchange and borrowing of ideas.

- as already mentioned, it is very well documented. People who just copy-paste existing code are doing it wrong.

- it is understandable, without too much magic built into it. cmake is affected by an excess of magic IMO - it is hard to understand what is going on and, as a consequence, hard to fix a failing script. In my experience, I needed a fraction of the time to come up to speed compared with learning even part of cmake.

- This also means that it is easier to maintain over the medium to long term. It is nuts to use a constantly changing language precisely for infrastructure - some poor people will have to read and maintain all that code years later! (And those people could even be you!)

- You only need to support the platforms and features you need, while at the same time you are able to support a myriad of exotic platforms if you want to. That means that if you want your software to work only on 64-bit Ubuntu Linux, you only need a very small configure.ac, but if you want to support older libraries on Debian you can do that easily as well, and if you need to support an exotic architecture with 36-bit words or a 16-bit embedded MCU, you can do that too.

- It is so widely used that almost every Unix developer knows how to build software with it without much thinking.

- It tries to do one thing and do it well: platform-specific build configuration. It does not try to do dependency or package management, leaving that to dedicated package managers.

- and because of the previous point, it also works well with new distributions and package managers, like Arch or NixOS or GNU Guix - it does not interfere with them, and does not fight for the position to be the top dog in a packaging system.

- Both autoconf and make, while supporting C and C++ well, are not language-specific, so you can easily use them to distribute things like LaTeX documentation, pictures, or even fonts.

- you can use and generate a mix of shared and static libraries, and also use both pkg-config and standard paths (or, for example, paths provided by GNU Guix), and it can do that for each package individually. In contrast, cmake's find_package commands have difficulties mixing different retrieval mechanisms.

- Supporting all kinds of hardware and compilers will continue to be important. While there are without question obsolete platforms, it will be able to support different hardware architectures in the future, like CUDA on GPUs, or ARM or RISC architectures.

- It works for any library without requiring specific upstream code. This is in stark contrast to cmake, which often needs supporting find_package commands or target definitions for specific libraries. The latter means ultimately that cmake would need to become a meta-distribution which requires specific support for many packages one wants to use. I think that as the complexity of software continues to grow, this approach will probably not scale.
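As promised above, a configure.ac fragment doing feature tests might look like this (a sketch; the library, function, and header names are placeholders):

    # each check defines a HAVE_* macro that the C code can test with #ifdef
    AC_CHECK_HEADERS([sys/epoll.h])
    AC_CHECK_LIB([foo], [foo_bar])
    AC_CHECK_FUNCS([strlcpy posix_fadvise])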


My main gripe with autotools these days is that there seems to be no standard mechanism for specifying non-standard locations for dependencies. (Or if there is, that no packages use it...) If I build something from source, it is nearly always because I want a newer version than what is available from the appropriate package repositories, which usually means that I will want newer versions of the dependencies too, or because I want a built-from-source version of one of those dependencies. If I'm really lucky, pkg-config will pick them up. Sometimes just setting LDFLAGS=-L/my/build CPPFLAGS=-I/my/build will work. Sometimes I have to say --with-gnomovision=/my/build, sometimes I will have to say --with-gnomovision-includes=/my/build/include, ...


If you want to build and install custom packages which in turn depend on libraries which are newer than system packages, this sounds like a perfect use case for GNU Guix as a package manager (e.g. on top of Debian or Arch):

http://guix.gnu.org/cookbook/en/guix-cookbook.html#Advanced-...

With this, you can create isolated environments in which you can work on specific software, and use and go back to arbitrary versions, without cluttering the base system with local installations. One of the best features is that defining an additional package oneself is merely an extension of the package definitions available with Guix:

https://guix.gnu.org/cookbook/en/html_node/Packaging-Tutoria...

And, of course, it works well with autotools....


Just to add, Guix configuration does not only look declarative - it uses a mostly functional language in a side-effect-free style to define the builds, in the same way as NixOS, just with a language which is established, well-supported, minimal, and elegant.

One of the advantages is that these functional package definitions state very clearly what is provided by a package. It also works well for other languages like Rust or Clojure.


I think this is something to be solved with pkg-config(1). Autotools already uses pkg-config to find dependencies and query for any extra flags needed, and with PKG_CONFIG_PATH you can have the program prioritize your own installation.
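For example (paths as in the comment above, otherwise a sketch):

    # prefer the locally built copies over the system ones
    export PKG_CONFIG_PATH=/my/build/lib/pkgconfig:$PKG_CONFIG_PATH
    ./configure CPPFLAGS=-I/my/build/include LDFLAGS=-L/my/build/lib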


Autotools is a pain for development. autoreconf is slow, configure is super slow, and the Makefile generated by automake is a monster that adds seconds just to do a no-op make, and it is impossible to hack. So for development, I simply hack up my own scripts to generate the config headers, directly parse the Makefile.am (which is surprisingly easy to parse), and generate my own Makefile. This works well because development only needs a specific environment and a simplified Makefile. But I value autotools and keep maintaining the configure.ac, because it really saves you a lot of trouble in supporting users.


Configure and autoreconf may not be the fastest, but they aren't run frequently enough to really bother me. I'd have to disagree with the generated makefile being slow, though; I still find it by far the fastest build system.

I've done the whole custom makefiles as well, but by the time you have tests, installation, etc it just becomes a burden to maintain them.


> I've done the whole custom makefiles as well, but by the time you have tests, installation, etc it just becomes a burden to maintain them.

It is generated from the same `Makefile.am`, so no extra burden to maintain. It is just a simplified version of the Makefile. With my project, it is ~1000 lines vs ~44000 lines, so it does make a huge difference in its latency.


I'd agree with that. If a build happens on a build server, it hardly matters if it takes an extra minute. What matters for developing and debugging stuff is fast incremental builds. And for that, one just needs to type "make"; there is no need to run configure again unless one moves to a different system.


Comments on value of autotools from the author of the most recent release, 2.70:

https://news.ycombinator.com/item?id=25910496


One thing that I am wondering: why do git repos with an autotools build chain often include the generated files, like ./configure?

Wouldn't it be better to provide just the source files (configure.ac and Makefile.am) and instruct developers to run "autoreconf" to generate the rest? That would also make it much clearer which files need to be changed if, e.g., source files are added.


It depends on the extent to which the git repo is a product for end users vs. for developers. Traditionally, the product for end users was the source tarball... but also, traditionally, we didn't have git. :)

The whole point of autotools is that the output (./configure) is highly portable and can be run on basically any mostly-POSIX-compliant /bin/sh. You don't need to install anything beyond sh and make just to perform the build - compare with a design like Bazel where you need a piece of software to do the build.

If the generated files change rarely, checking in and committing the generated files means people who are just making code changes (or no changes at all) don't need to install autotools.

Also, there are functionality changes in certain versions of autotools; you might want to make sure your users get a feature that was implemented in a recent version of autotools, but support users who are running on an older OS that doesn't have that version packaged.

(If you expect the git repo to be primarily used by developers / contributors instead of end users who are just building it, then you should definitely gitignore the generated files, but also make sure to include them in the release tarballs. "make dist" will do it. You should also guide end users to download that release tarball instead of an auto-generated tarball of your git sources - GitHub and GitLab both have support for file attachments in their release feature, these days.)

In our modern world, this might be less important because installing software (even specific versions of software) is a lot easier. But then our modern world probably deserves something other than autotools, because a lot of what makes autotools hard to work with is the fact that it's intended to translate down to sh/make. (Options I'm aware of that look fairly reasonable include Meson/Ninja, CMake, and I guess Bazel.)
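For the developer-focused case, the generated files to ignore are roughly the following (a sketch; the exact list varies per project and is whatever autoreconf and "make dist" can regenerate):

    # typical .gitignore entries for an autotools project
    configure
    aclocal.m4
    autom4te.cache/
    Makefile.in
    config.h.in
    build-aux/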


Linux distributions like SUSE or Red Hat actually bootstrap autotools during the build, to enforce consistency. At least they did 10 years ago.

You will sometimes run into subtle forward-porting issues doing this.

That's why not doing this, and using the generated Makefile and ./configure, is a good idea (tm) for most consumers of packages.

In other words: if you're in for adventure and learning, go for autoreconf. https://wiki.debian.org/Autoreconf


In my experience they often don't have a configure file in the repo, but only in the release tarball (some repos on github have an action to set up a tarball with a configure script). That way the user only needs a shell and make, nothing else.


I've seen quite a bit of both. GP isn't wrong.


You'll find that many projects include a "bootstrap" script that does just that.


It's extra work to install and run autoconf, with no benefit or necessity unless you're changing the autoconf script. Artificial dependencies are friction that keeps contributors away.


I see that point! On the other hand, it is generally strongly encouraged to keep generated files out of source control repos. Especially since one cannot merge them in a meaningful way.


Or use anything else.

CMake or Meson have both learned the lessons of Autotools, so you don't need to subject yourself to them anymore.


I have been surprised at how good my CMake experience has been. CMake may have its warts, but so far everything with a CMakeLists.txt that I have tried has worked out of the box.


Yeah but they're not always installed on the random SBC you might be configuring, or an old incompatible version might be installed. And then I have to dig through the build system to figure out the right way to set prefix etc. cmake and meson should really have a way to output a configure script and Makefile...


> cmake and meson should really have a way to output a configure script and Makefile

`cmake .. -G "Unix Makefiles"` not enough for you?


I mean, for distribution, so that cmake is not required for a user to compile the project.


How is that gonna help on windows where there is no /bin/sh ?


Install cygwin or msys or WSL? Windows is unusable otherwise :).

anyway, the particular case here is something like an academic computing cluster, where you don't have administrator access, and you want to compile some package that will install into some place like $HOME/project/usr. The system has no meson or bazel or scons and only some older version of cmake that may not meet the claimed minimum requirement of the random package you want to compile.

Or it's some SBC that you dare not upgrade the operating system on because you don't know if there is some breaking kernel change with some hardware interface.

Sure, locally installing a new version of cmake or some other build system is not too hard, but it's certainly additional friction.
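With a plain ./configure-based package, that whole scenario needs nothing beyond sh and make (a sketch, using the prefix from above):

    ./configure --prefix=$HOME/project/usr
    make
    make install
    # then make the result visible to your shell and linker
    export PATH=$HOME/project/usr/bin:$PATH
    export LD_LIBRARY_PATH=$HOME/project/usr/lib:$LD_LIBRARY_PATH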


> Sure, locally installing a new version of cmake or some other build system is not too hard, but it's certainly additional friction.

for cmake it's basically

    wget https://github.com/Kitware/CMake/releases/download/v3.20.0/cmake-3.20.0-linux-x86_64.tar.gz
    tar xaf cmake-3.20.0-linux-x86_64.tar.gz
    export PATH=$PWD/cmake-3.20.0/bin:$PATH
definitely simpler than installing cygwin, msys or wsl :-) (and some software requires cl.exe on windows anyways)


Meson is a tremendous pain to bootstrap in hardware development projects, because you have to get so much of Python working first. CMake used to be easier to cross-compile to bootstrap a project, but it's getting harder every year. Autotools just keeps working, even if you don't have a C++ runtime.


> CMake used to be easier to cross-compile to bootstrap a project, but it's getting harder every year.

Why exactly is it getting harder?


CMake still comes with ./bootstrap.sh and basically just needs GCC and a couple base libs for TLS


So, what would you say these tools have learned and taken from autotools, and what are the core ideas with which they improve on it? I was looking around for a description of what the core ideas of CMake are and could not find much, except automating builds for platforms which are not Unix-y or POSIX.


CMake is marketed heavily, but not all things which are said about autotools are accurate. Here is a video from somebody from Kitware who compares autotools to CMake:

https://youtu.be/8Ut9o4OdSC0?t=459

In this section, he kind of complains about the number of files which are needed for an autoconf-based install. But the thing is, almost all of these are generated files. So one does not have to create them - they are created by autoconf, once, by the developer, and shipped together with the rest of the source distribution for installation.

Then, a few seconds later, at

https://youtu.be/8Ut9o4OdSC0?t=582

he compares the speed of autoconf to CMake, and CMake comes out faster. But this time he also adds to the total the time needed to run autoconf for the first time, which generates the above files. In practice, a developer who changes source files, or a user who installs a package, does not need to regenerate the configure script, and a developer doing improvements and testing in the source code does not need to rebuild the configure script each time either. To sum up: on the one hand he counts the generated autoconf files into the number of files needed for a package, but then he also counts the generation time into the run time of using autoconf regularly. And this is a bit weird, because the configure script and the other files only have to be generated once, when a change to the build system is made, not for install builds and not for normal development builds.

I agree that autoconf could run a tad faster and that the generated files could be organized a bit more tidily (it is possible to configure it so that it puts them into a subfolder). But I think the comparison in the video is a little bit biased.

And of course, a company needs to make money and sell its products, but if a presentation of a core product starts with such a biased comparison, it is not very convincing to me. Well, we'll see where CMake is in five years' time - build systems are extremely complex and large, and to work robustly they need to account for a lot of stuff which is rarely used; that is when things like good architecture, good design, and clean code start to matter. What concerns me is not whether CMake manages highly automated builds for simple projects and happy paths on the most popular platforms, but how it copes with the hard stuff on which a lot of computing infrastructure depends. There is a good reason why most of this infrastructure stuff, be it the kernel, or things like glibc or OpenSSL, is not made by companies.


The burning question is: is Autotools worth it these days?

In my early days of programming, I thought Autotools was the coolest thing in the whole world. It was essentially magic.

These days, with the demand for running on non-Linux UNIX at an all time low, why put yourself through all this?


For most programming (i.e. web services), no. For anything that makes system calls within 1-2 layers of abstraction (graphics programming, shell utilities, etc.), it is a godsend.


I would be willing to spend an incredible effort to keep something POSIX instead of using autotools ( https://varnish-cache.org/docs/2.1/phk/autocrap.html ). They may have been an acceptable solution to a problem before the turn of the century, but things have changed enough that I’m not convinced they’re still delivering much bang for the buck.


And yet Varnish is still using not only autoconf, but also automake. I've never had the need for automake, but autoconf feature tests are still useful. Not because of POSIX compat headaches, but because of the steady addition of new interfaces and the reimplementation of old ones. musl libc hadn't even made its initial release when the above tirade was written, yet there's a good chance Varnish built against musl because of those autoconf feature tests.

This comes from someone who has invested a significant amount of time trying to implement pretty much exactly what PHK is advocating: https://github.com/wahern/autoguess I don't reach out for autoconf reflexively, but I fully appreciate why people do make use of it.


Apart from the fact that the embedded universe needs to support a lot of CPU architectures, the demand for building on different hardware and against slightly different library interfaces is unlikely to disappear. Just think of desktop apps using Qt4 / Qt5, or libraries which require CUDA support for different kinds of hardware.


Perl's build system (Metaconfig) is fairly similar to autoconf. I've always been impressed by how many disparate systems, even decidedly non-unixy ones, it handles. You can tell, though, it was a lot of work to get to that state.


After years of fighting with the slowness and unmaintainability of Autotools, using meson is just a breeze.

You can get it all, quickly and cleanly.


There are some speed comparisons, measurements, and discussion here:

https://lwn.net/Articles/706404/

https://thiblahute.github.io/jpakkane.github.io/Simple-compa...

The thing is, apparently ninja is indeed faster, but this matters only for very large projects with tens of thousands of files.

Speed of incremental builds matters, and in this respect, ninja is an advance, but is not a day-and-night difference either.


The speed isn't much related to the compilation itself... Make versus ninja matters mostly for build machines.

What matters for developers is development time, and doing things in automake takes 3-4 times as long as with Autotools. Not to mention the configuration time you lose waiting after a single-line change.


I meant "3-4 times as long as with meson"


What does "get it all" mean? All of what?


I mean, all the features



