
Autotools Mythbuster - nbaksalyar
https://autotools.io/index.html
======
zuzun
If you want to get started with autotools, try to build yourself a minimal
autotools project. Once you have a working stub, you have already cleared the
biggest hurdle.

You only need to focus on two files: configure.ac, where you check for
features, and Makefile.am, where you list your build targets and their
sources. You can also write normal Makefile rules there.
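
For example, a minimal pair of files might look like this (the project and
file names are invented for illustration):

    # configure.ac
    AC_INIT([hello], [0.1])
    AM_INIT_AUTOMAKE([-Wall foreign])
    AC_PROG_CC
    AC_CONFIG_FILES([Makefile])
    AC_OUTPUT

    # Makefile.am
    bin_PROGRAMS = hello
    hello_SOURCES = hello.c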

After editing these files, run `autoreconf -fi` to create/update the configure
script and the generated Makefiles. Then run

    
    
        ./configure
        make
    

to build your project. Have a look at Appendix A/Basic Autotoolization on the
submitted website or check out libabc by Poettering/Sievers:

1.
[https://autotools.io/whosafraid.html#idm117385102076160](https://autotools.io/whosafraid.html#idm117385102076160)

2.
[http://0pointer.de/blog/projects/libabc.html](http://0pointer.de/blog/projects/libabc.html)

------
DasIch
Learning about autotools is an amazing source of fremdscham (second-hand
embarrassment). The whole compilation/packaging stuff is hard to get right,
but autotools looks like someone completely gave up on producing a good,
decent or even bad solution and just tried whatever didn't completely and
utterly fail at the task. That everyone else just kind of seemed to go with
it... there are just no words to describe how embarrassing that looks.

~~~
jerf
A solution can be no simpler than the essential complexity of the problem it
is solving. It is not clear to me that the essential complexity of the problem
is significantly simpler than the autotools solution. Across the totality of
all C/C++-using systems, across all of the mostly-POSIX and vaguely-POSIX-ish
systems, across all the libraries, the toolkits, the OSes, across the decades,
across all of what it is trying to span, the essential complexity is quite
large. Contemplating all that probably makes autotools come out looking pretty
good in context.

I'd observe that every major modern language since then has taken an approach
that, in one way or another, obviates the need for a full autoconf. C's
contribution to large-scale code reuse was largely showing all the ways in
which it could go wrong, and encouraging future language developers to get it
more right to start with, which they mostly have.

(Note I'm not saying there aren't any problems with the way any modern
language works. But they're still all simpler, generally more powerful, and
even in their worst case, nothing like as bad as the autoconf-world's worst
case.)

~~~
nickpsecurity
Even as an autotools critic, I totally understand that argument about the
steps that were necessary for dealing with those portability problems. Well,
way back in the day, when they existed to the degree they did. Today, there
should be much less complexity, as the systems are more alike than different.
What complexity there is is mostly a relic from days gone by that never got
cleaned up because nobody wants to invest the effort.

~~~
kazinator
As someone who maintains a hand-written configure script for a project that I
build on just a handful of platforms, I have to disagree. There is plenty of
cruft there to detect and take care of.

For example, recently I discovered that -D_FILE_OFFSET_BITS=64 works on 32-bit
Solaris 10 for large file support. However, fseeko and ftello are not
declared. For that you need something special, namely -D_LARGEFILE_SOURCE;
this is not just assumed from _FILE_OFFSET_BITS=64.
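
Incidentally, this is exactly the kind of thing autoconf has canned checks
for; a sketch of the equivalent configure.ac lines (not kazinator's actual
hand-written script) would be:

    # Work out what is needed for a 64-bit off_t, and whether fseeko/ftello
    # are actually declared by <stdio.h> with the chosen flags.
    AC_SYS_LARGEFILE
    AC_FUNC_FSEEKO
    AC_CHECK_DECLS([fseeko, ftello], [], [], [[#include <stdio.h>]])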

New developments in POSIX have created a fuzzy landscape which resembles some
of the confusion in 1990s Unixes. This platform needs _POSIX_C_SOURCE=<at least
such>, another has the same thing under _XOPEN_SOURCE. Apple won't declare
anything useful to you without -D_DARWIN_C_SOURCE.

Some platform needed __EXTENSIONS__ to declare struct winsize. I don't
remember which one, but my script detects that.

Systems are not becoming more alike. Even when it comes to standard stuff,
they vary in their conformance to POSIX, and in how they declare identifiers
and the exact headers that have to be included and such.

~~~
nickpsecurity
Appreciate the insight. So, are you on the side of a tool that tests for every
feature that might exist going back decades, or of per-platform or per-version
modules, the way we handle most dependencies? I think autotools'
approach is overkill in terms of complexity and efficiency.

~~~
caf
I'm not the parent commenter, but experience has taught me that testing for
features, not versions, is definitely the right way to go. The alternative
ends up with you creating a hodge-podge database of which features are present
in which operating systems / versions, which is inevitably out-of-date,
incomplete and wrong.

Only test for the features you actually care about, though. And at this point,
it's perfectly fine for a new project to assume some baselines like POSIX-1996
and even C99 (unless you're targeting Windows...).

~~~
kazinator
Testing for features is the right way to go. If you add support for a new
platform, there is no guarantee that the tests will all magically work. But a
lot of them, in fact _will_. That reduces the amount of remaining work.

You could have a canned set of files, one per platform, containing some canned
settings of variables. (The Pine e-mail program was a good example of this;
its Alpine successor uses Autoconf.)

If you are porting to a new system, you have to guess which of the existing
canned configurations is the closest one that will work and use it as a
template.

Since there is no logic to detect whether any of the features related to the
configuration variables are actually present, and in what form, it's just
guesswork. Okay, it doesn't build. Now which of the relevant variables in the
canned config is related to that breakage?

------
falcolas
While it's quite popular to bash autotools, how about instead working towards
a successor? One that's as easy for the end user to use as autotools, but with
less technical debt.

It's a hard problem, as evidenced by the inherent complexity in autotools, but
there are a lot of smart folks who could apply their effort towards that goal.

~~~
rwmj
Of course there are many alternatives (of which I can only really
recommend cmake).

The problems broadly speaking are:

* Assumptions: autotools assumes that only a basic Bourne shell is available, and a basic make, and that make is a good way to build projects. All of those are questionable, and certainly if you get rid of the assumption that you're only allowed to use ancient tools, you could immediately make autotools better, if only by removing all the shell cruft. But then someone will complain because your project doesn't compile on SunOS 4 or whatever.

* Familiarity: When I'm packaging your project for Fedora, I will _not_ be pleased if you've used some obscure or half-assed build system. If you're using autotools, I may dislike it, but at least I can fix things.

The second one is a difficult one, because until your build system gets
popular, it will never be familiar, and we packagers will keep cursing you.
Once it's popular, it'll probably be quite crufty, because it will have had to
handle all the horrible corner cases that autotools handles now.

~~~
adekok
Even worse, autotools doesn't have the concept of modularization.

Look at the nginx configure scripts. They're hand-written, and modularized.
Check for "foo"? There's a shell script. Autotools does the same thing via
special-purpose autogenerated shell code.
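
Something in the spirit of those hand-written checks (a made-up example, not
an actual nginx auto/ script) might look like:

    # have_epoll.sh -- hypothetical standalone feature check
    cat > conftest.c <<'EOF'
    #include <sys/epoll.h>
    int main(void) { return epoll_create(1) == -1; }
    EOF
    if ${CC:-cc} -o conftest conftest.c 2>/dev/null; then
        have_epoll=yes
        CFLAGS="$CFLAGS -DHAVE_EPOLL"
    else
        have_epoll=no
    fi
    rm -f conftest conftest.c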

That's why the configure script is thousands to tens of thousands of lines of
shell script. It's all auto-generated crap.

And autotools doesn't have the concept of bootstrapping. Building a C program?
Great! Test _everything_ with shell scripts. It might be more efficient to
have C programs doing work, but no... it has to be the lowest common
denominator.

Because managing multiple pieces is too hard. Why? Because there's no
modularization.

I've used autotools for decades because everything else is worse. But I have a
long-term deep-seated loathing of it which is hard to put into words.

~~~
scrollaway
> _I've used autotools for decades because everything else is worse._

I take it you've used CMake - could you explain in which ways you think it's
_worse_ than autotools? I agree it has concerning design flaws, but they don't
come anywhere close to autotools' and I'd say CMake is overall dozens of times
saner.

~~~
kazinator
Does CMake generate a deliverable which builds on a machine that doesn't have
CMake installed? With full incremental build support and all, so reasonable
development can take place on that system?

~~~
icebraining
Yes to the first question, it generates platform-specific build files (e.g.
Makefiles for Unix/Linux), which don't need cmake itself to build the
software. Regarding the second question, I believe so, but I'm not sure.

~~~
atso
Of course once the makefile is generated, there is no need to use cmake (as
long as the project is not modified), but you cannot share a makefile
generated by cmake, or even redistribute it. You cannot even relocate it on
your hard drive.

The difference is that autotools generates source distributions that do not
need autotools to build, while a cmake project always requires cmake when
building.

~~~
icebraining
Oh, I wasn't aware it wasn't distributable, my bad.

------
dankohn1
Could anyone speak to why you would use Autotools today in a new project
rather than gyp [0] (created by the Chrome team because Autotools sucks) or
waf [1]? How many people really have a requirement to support more than just
Windows, OS X, and Linux? I can understand not wanting to rip out a working
Autotools implementation in a legacy project, but its greenfield use should
decline to zero.

[0] [https://gyp.gsrc.io/](https://gyp.gsrc.io/) [1] [https://github.com/waf-project/waf](https://github.com/waf-project/waf)

~~~
hp
Historically they have supported a lot of things the others didn't or didn't
easily support: cross-compilation, changing the prefix/libdir/datadir,
creating tarballs, "distcheck" to automatically check the tarball works,
libtool, etc. Also lots of Linux tools are built around autotools, for example
it's much much easier to rpm-ify or deb-ify an autotools tarball. Another
example, there's gettext integration for autotools, automated tools for
translators might rely on autotools, etc.
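
For a sense of what that buys you in practice, a typical sequence (the target
triplet and paths here are just examples) looks something like:

    ./configure --prefix=/usr --libdir=/usr/lib64 --host=arm-linux-gnueabihf
    make
    make DESTDIR=/tmp/staging install   # staged install, handy for rpm/deb packaging
    make distcheck                      # roll a tarball and verify it builds and installs cleanly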

I'm probably listing the tip of the iceberg as far as network effects. In the
Linux/Unix world, autotools is (or at least historically was) what everything
and everybody knows how to deal with, and non-autotools is annoyingly
nonstandard.

Your gyp link there says for example that its main goal is IDE support; a
total non-goal for autotools. Unix/Linux C developers largely do not use IDEs.
And the autotools goals probably aren't in gyp's list of priorities. So while
they both "build the project" they have pretty different goals.

The autotools goals don't always make sense anymore. For example the
historical idea that the shipped tarball would only depend on POSIX sh and not
on any other binary made a lot more sense when people were hand-compiling
stuff on their HP-UX than it does today where most binaries come from
distributions.

If you're building a package mostly for Linux and maybe OS-X-as-Unix-
compatible, especially one in C/C++ and open source, autotools probably still
has a lot to recommend it.

------
notacoward
This looks like a really useful guide through autotool hell, which is a
fantastic thing to have, but it's not clear what _myths_ are being busted so
the name seems like a bit of a misnomer. Perhaps "Autotools Life Preserver" or
something would be more accurate.

------
MichaelMoser123
GNU make has macros too, but these were added much later than 1991. It would
be interesting to know: if make macros had been available earlier, would
people have used them instead of autotools?

Thank you for the link on autotools, I will try to apply it if I am ever
forced to do something with autoconf/autotools.

~~~
david-given
Trying to do anything non-trivial in GNU make macros is the most bewilderingly
awful programming experience it is possible to have. They are horribly broken
in every respect, from little stuff like awful syntax (whitespace is
insignificant except where it's significant!) and poor error checking
(detecting undefined variables? Who cares?) to the really big stuff (can't do
arithmetic, despite having core functions which take numbers as arguments!).
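
One small taste of the whitespace problem, lifted more or less straight from
the warning in the GNU make manual itself:

    dir := /foo/bar    # directory to put the frobs in
    # The value of $(dir) is "/foo/bar    " (four trailing spaces included),
    # so $(dir)/file quietly expands to "/foo/bar    /file".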

A while back I wrote a compiler which targeted GNU Make and made it run
Conway's Game of Life (complete with hand-tooled arbitrary precision maths
library written in make). I learnt more about this stuff than is healthy to
know. The compiler output is... special.

[http://cowlark.com/2013-10-19-insane-make](http://cowlark.com/2013-10-19-insane-make)

~~~
MichaelMoser123
I don't know; Make is difficult because we are used to thinking in procedures
and not in rules, and the rule and macro syntax is a bit weird; it takes time
to master it.

I have a makefile/make system that uses GNU make macros (here:
[http://mosermichael.github.io/cstuff/all/projects/2011/06/17...](http://mosermichael.github.io/cstuff/all/projects/2011/06/17..)
). This saves you from repeating the same make constructs many times over; in
the following example you build a static library and an executable.

    
    
        TOPDIR=../..

        # - declare build targets (built with make)
        TARGETS:=shlib slibuser

        # - shlib target is a static library -
        shlib_TYPE=lib
        shlib_SRC=slib.c

        # - slibuser target is an executable using shlib -
        slibuser_TYPE=exe
        slibuser_SRC=slibuser.c slibuser2.c slibuser3.c
        slibuser_LIBS=shlib

        include $(TOPDIR)/rules.make
    

At a previous job they had an even more convoluted make system - it was
simulating macros: the makefile included another generic makefile; this
included makefile wrote hidden files that contained the make rules; then, as
the last step, these generated files were included.

~~~
J_Darnley
Your link is wrong and returns 404. Even
[https://mosermichael.github.io/](https://mosermichael.github.io/) returns
404. Can you provide the correct link?

~~~
MichaelMoser123
Thanks, here is the correct link (I can no longer edit the parent post).

[http://mosermichael.github.io/cstuff/](http://mosermichael.github.io/cstuff/)

[http://mosermichael.github.io/cstuff/all/projects/2011/06/17...](http://mosermichael.github.io/cstuff/all/projects/2011/06/17/make-system.html)

~~~
J_Darnley
tyvm

------
hendry
Please please please do not use ./configure aka autohell.

Study the Makefiles in [http://git.suckless.org/](http://git.suckless.org/)

Straightforward & fast.

~~~
coherentpony
Autotools solves the problem of having to write a makefile for every platform
and also provides all the expected targets without having to write them all
yourself.

~~~
hendry
There is no need to write a Makefile for every platform nowadays.

------
jokoon
I confess that I don't even know how to write a makefile, but reading the
comments here, should I understand that cmake is better than autotools? Or
should I learn autotools instead of cmake?

Since cmake is more widely adopted, isn't cmake better?

~~~
GFK_of_xmaspast
If you have a need for this kind of thing, learn cmake. If you don't have a
need for this kind of thing, consider yourself lucky.

------
raverbashing
One of the first lines

"...the language used to write the configure.ac is called M4sh, to make clear
that it's based off both sh and the macro language M4. "

Tip: don't use Autotools

~~~
davexunit
The Autotools are better than every other supposedly better solution. I do a
lot of distro packaging work, and I can say, without a doubt, that the most
problematic software to package is the software that doesn't use the
autotools.

~~~
mikepurvis
I don't know Autotools, but I find the CMake workflow to be tolerable. Can
someone who knows both provide a comparison?

~~~
jordigh
The golden idea in autotools is _test for features not for versions_. Allow me
to explain.

The problem autotools tries to solve is building your software on a wide array
of operating systems and configurations. A very attractive alternative
approach taken by most, including CMake[1], is to have a big giant database of
systems and what each system can do at which version. Thus, if you want to use
a library such as, say, Qt, CMake has a Qt module that looks for Qt libraries,
checks their versions, and reports that back to you. These databases of system
versions and what is available at each version are necessarily always
outdated.

Autotools instead gives you tools to write tests. Do you need this library?
Which functions do you need from this library? What should those functions do?
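
Concretely (a rough sketch; the header, library, and function names here are
only examples), such configure.ac tests look like:

    # Test for the things this program actually uses, not for an OS version.
    AC_CHECK_HEADERS([sys/epoll.h])
    AC_SEARCH_LIBS([clock_gettime], [rt])   # in libc, or do we need -lrt?
    AC_CHECK_FUNCS([fseeko strlcpy])
    AC_CHECK_DECLS([strlcpy], [], [], [[#include <string.h>]])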

When you're running a ./configure script, that's what autotools is doing. It's
checking to see what kind of C library you have, what kind of Unix tools are
available, and how those Unix tools behave. It is not going to trust version
strings or software names. Perhaps the OS reports that it's Debian but in fact
it's some derivative of Debian that changed some things around. If you were
relying on CMake's hypothetical database of Debian features, it would fail on
this not-quite-Debian system.

Over time, these tests get shared around and become part of the core autotools
libraries, which is the other problem: autotools is now checking for the
absence of certain features that have not really been missing in any system
since 1995.

---

[1] CMake can perform autoconf-like feature checks too, but this is not as
common as relying on its databases of systems and features.

~~~
manawy
I'm not sure what you mean by databases of systems and features...

I use CMake every day, and for me it's not a database but a set of modules.
Each of these modules will look for a particular thing. Some modules are
better written than others, but they all allow you to bypass them. Of course
on obscure systems it will fail. That's why they are called obscure. But on
systems that more or less follow good practice, there will be no problem at
all. On the others, you need to pass the specific path, but that will be true
with autoconf too... no magic here...

Also, the docs state that if you rely on a specific flag/option/function then
you need to test for it, and CMake gives you the tools to do it. That's a good
practice everyone should follow.

But I agree, it's not always done. As with Autoconf, I guess it really depends
on how well the developer understands the tool, and how much time they are
willing to spend on it.

There's no magic, it's a complicated problem, with only complicated solutions.
CMake is more attractive to me, because it allows me to do simple stuff
easily, but more complicated operations are also possible.

~~~
jordigh
> I'm not sure what you mean by databases of systems and features...

I didn't really mean a literal database. I meant stuff like this:

[https://github.com/Kitware/CMake/blob/master/Modules/FindQt4...](https://github.com/Kitware/CMake/blob/master/Modules/FindQt4.cmake#L1068)

This is pretty typical CMake. It's encoding that for a particular version of
Qt, add a few more compilation flags. It's a "knowledge base" of Qt versions
encoded as a CMake module.

The autoconf way would be to try to link a minimal program to Qt, and if it
fails, try to see if adding more libraries makes the linking succeed.
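
In configure.ac terms, that link-and-see approach might look roughly like this
(a sketch; it assumes QT_CFLAGS and QT_LIBS were already obtained somehow,
e.g. from pkg-config):

    AC_LANG_PUSH([C++])
    save_CPPFLAGS=$CPPFLAGS; save_LIBS=$LIBS
    CPPFLAGS="$QT_CFLAGS $CPPFLAGS"; LIBS="$QT_LIBS $LIBS"
    AC_LINK_IFELSE(
      [AC_LANG_PROGRAM([[#include <QString>]], [[QString s("hi"); (void)s;]])],
      [qt_links=yes], [qt_links=no])
    CPPFLAGS=$save_CPPFLAGS; LIBS=$save_LIBS
    AC_LANG_POP([C++])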

Or this, for example:

[https://github.com/Kitware/CMake/blob/master/Modules/FindOpe...](https://github.com/Kitware/CMake/blob/master/Modules/FindOpenSSL.cmake#L112)

It is trying to account for two popular compilers on Windows, but not with
Cygwin. It's this whole chain of "if Windows, if cygwin, if mingw, if msvc"
that I called a "database" above.

Doesn't clang work on Windows these days? What will this knowledge base do
now? Shouldn't clang work almost like mingw? But because it misses the "if
mingw" branch, code that probably would have worked for clang will be skipped
over.

~~~
manawy
I'd argue that for Qt this approach makes sense, because they provide a
well-defined API that they stick to. The developer can also ask for specific
components if they just want those (like Qt GUI, Qt SVG...).

I agree that this is more problematic with less common, more buggy libs where
the API keeps changing, but in that case it's quite easy to check for specific
files, specific functions, ...

All these assumptions have huge consequences in terms of velocity and ease of
use.

------
chris_wot
Oh brother... this would have been good when I was developing in LibreOffice!

------
jheriko
There is only one myth that needs busting: that these tools solve any real
problems that can't be solved better.

------
J_Darnley
Just what I want, a 900KiB configure file that doesn't even work. I'm looking
at you, LAME! And then a make process that doesn't STFU!

------
akerro
[http://freecode.com/articles/stop-the-autoconf-insanity-why-...](http://freecode.com/articles/stop-the-autoconf-insanity-why-we-need-a-new-build-system)

