
Stop the autoconf insanity – Why we need a new build system (2003) - ahomescu1
http://freecode.com/articles/stop-the-autoconf-insanity-why-we-need-a-new-build-system
======
nathell
The comment "Put the blame where it belongs: Automake" below the original
article sums it up pretty succinctly. Quoting:

All the complaints in the article are PRECISELY the kinds of complaints that
autoconf programmers worked very hard to avoid. For example, it's IMPOSSIBLE
to have version skew with autoconf because the configure script is completely
pre-generated before being shipped. As far as the user is concerned it's a
simple sh script.

ALL of these problems stem from the "automake" monstrosity that creates many
more problems than it ever solved. Unfortunately people now tar autoconf with
the same brush as automake and libtool whenever they run into problems with
these. Sigh.

~~~
MichaelMoser123
For my projects I wrote a make system that uses gmake metaprogramming to
easily define rules for most cases; you include a template make file that
expands the target definitions into make rules.

[http://mosermichael.github.io/cstuff/all/projects/2011/06/17...](http://mosermichael.github.io/cstuff/all/projects/2011/06/17/make-system.html)
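
A minimal sketch of the idea (hypothetical names, not the actual template
file from the link): target definitions are plain variables, and a shared
include expands each of them into real rules with $(call) and $(eval).

    # Hypothetical sketch of the template-expansion approach.
    PROGRAMS := hello
    hello_SOURCES := hello.c util.c

    # A shared template turns each definition into an ordinary rule;
    # the built-in %.o: %.c rule compiles the sources.
    define PROGRAM_template
    $(1): $$(patsubst %.c,%.o,$$($(1)_SOURCES))
    	$$(CC) -o $$@ $$^
    endef

    $(foreach prog,$(PROGRAMS),$(eval $(call PROGRAM_template,$(prog))))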

The catch is, of course, _most cases_; other things are still possible to
do, but they become hacky.

Another downside is cross compilation: my system does not handle it
effectively (though it works for Linux i686, x86_64 and Cygwin - many
surprises between these setups!). Most build systems I know don't support
cross compilation - except for automake, terrible as it is.

If you build Debian packages you are strongly encouraged to use automake,
precisely because of cross compilation.

------
sparkie
We don't need a new build system, we need a new mentality.

The problem with build systems is that they don't cover the entire scope
required to actually build software reliably. Ideally, you want to take just
your code as input and produce a target, but in reality you're taking your
code, plus an environment with a potentially infinite number of
configuration options, and asking the build system to produce a target. This
is effectively saying "Here's some code and some shit, please do your
wizardry and make me something that _looks like this_." Is it any wonder
that every proposed solution to this difficult (impossible?) problem gets it
wrong?

The _correct solution_ is to declare the target you want, then build up the
dependencies you need as input to meet that target, declare them explicitly,
and perform the build in an isolated environment so that no undeclared
configuration options can alter the result of the build. This is what Nix
([http://nixos.org/nix/](http://nixos.org/nix/)) does, and by derivation,
Guix
([https://www.gnu.org/software/guix/](https://www.gnu.org/software/guix/)),
and what Debian's ReproducibleBuilds effort is also attempting. The build
process effectively becomes a pure function, free of unwanted side effects.

Nix doesn't actually perform builds itself, except for the trivial
./configure && make case. It piggy-backs on existing build systems via bash
scripts if needed, but it completely manages the environment in which such
scripts run, so their local side effects are controlled - similar, perhaps,
to how you might use the ST monad in place of IO in Haskell.

I find Guix a bit more interesting because it can take on the whole problem,
including the build system. While it can piggy-back on existing build
systems the same way Nix can, you can also write your own in Guile (or in
some other language invoked from Guile). It doesn't really make sense to
depend on the old cruft under this new mentality, because things like
pkg-config are obsoleted by known, explicitly declared dependencies. The
build process can be simplified (and probably sped up).

~~~
_delirium
How portable is nix? I haven't looked into it much, but from a (possibly
mistaken) first impression it looks more like a replacement for yum or apt
than for autotools. Despite being crufty, what I find valuable about autotools
is that, given one source directory, I can successfully compile and link the
package on many things: various distributions of Linux, the BSDs, Solaris, HP-
UX, AIX, OS X, nearly whatever else you care to dream up. Does Nix actually do
that (or target it, if not yet)?

~~~
sparkie
Nix is indeed a replacement for package managers, but the point still
stands: package management and software builds are not completely disjoint -
the latter depends quite heavily on the former.

Nix itself should work on BSD (I think it's tested on FreeBSD), but Guix is
Linux-only so far, as it is still in early development. I'm pretty sure Guix
intends to be fully portable, even to Hurd in the long term. The ideas
behind Nix are platform-independent, though - there shouldn't be any reason
it can't be ported to other platforms.

As for building individual pieces of software on other platforms: they
require different targets. (In fact, even an alternative configuration of
the same piece of software on the same platform requires a different
target.) This is intentional, though - targets have an identity (a SHA1 of
their entire set of inputs), so any new platform will require a new package.

The package configuration format is quite useful in that regard, though, as
you can derive new package definitions from existing ones, so it should be
possible to make partial "generic" configurations, then derive the
platform-specific targets from them. There's quite a bit of configuration
for each platform, though, as the dependencies will all have different
identities, all the way down to the compiler and the kernel.

Another potential option would be to create a template system to automatically
generate the Nix configuration for multiple platforms, which would be somewhat
similar to what autotools et al are doing today. The difference being, once
one person has performed a build and tested it for a platform, anyone else
should be able to reproduce it exactly, or to explicitly specify where he
wants the target to differ.

In effect, Nix removes the "hidden knowledge" that goes into building software
by requiring that you specify even the commands you used to perform the build.

Of course, Nix isn't a panacea - it introduces a new set of problems,
including some social ones. It might take more effort to write portable
targets for each platform - but for that trade off you get reliability and
simplicity for the users of the software.

------
caf
_He again does the typical thing and runs "./configure --prefix=/opt". The
configure script runs for a while, then exits with an error which basically
translates to "You have an autoconf version which is three weeks old; please
upgrade", but this is displayed in the most cryptic manner possible._

This is simply incorrect. If you already have a configure script, then you
don't need autoconf installed at all, let alone a particular version of it.
autoconf is used for generating the configure script, not running it.

~~~
ArbitraryLimits
I'm positive that I've seen configure scripts that try to check whether they,
themselves, are out of date, and regenerate themselves if so. That's the only
time I've seen an error about autoconf version skew.

To be fair, I've only actually seen this on scripts _I_ was writing - never
seen it happen to something that passed make dist.

~~~
kelnos
I don't remember exactly (my autoconf/automake-fu has gotten weak these past
years), but there's a macro, AM_MAINTAINER_MODE, that adds
--enable/--disable-maintainer-mode as an option to your configure script.
That option determines whether rules get added to the final Makefile that
check whether Makefile.am is newer than Makefile.in, Makefile.in is newer
than Makefile, or configure.ac is newer than configure (etc.), and then
attempt to regenerate them.
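
Roughly, and much simplified (the rules automake actually emits are more
elaborate and also involve aclocal and the maintainer-mode conditional), the
regeneration machinery in the generated Makefile looks like this:

    # Simplified sketch of automake's rebuild rules.
    Makefile: Makefile.in config.status
    	./config.status Makefile

    Makefile.in: Makefile.am
    	automake Makefile

    configure: configure.ac
    	autoconf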

I don't quite remember what the default is if you _don't_ include
AM_MAINTAINER_MODE in your configure.ac. So yes, you could indeed have seen
that happen, but there are ways to make it not happen.

------
skywhopper
Interesting that of the alternative tools mentioned in the article (SCons,
Cons, and A-A-P), only SCons has been updated since 2003 (according to
Freshmeat), and its last update, in 2010, appears to have been mainly to
redo the version numbers and drop Python 2.4 support.

~~~
bjourne
[https://code.google.com/p/waf/](https://code.google.com/p/waf/) is the
spiritual successor to SCons. It started off as a fork of it (due to
performance problems in SCons) and then became a full project on its own.

It can do everything autotools+make can, and you only have to code your
build in Python rather than in Makefile + M4 + shell scripts. I've been
using it for many years and it is technically superior to any build system
I've seen. Shame the author doesn't market it more -- with some hype it
could easily have been the de facto standard build system by now.

~~~
mansr
Replacing make with Python is a huge step back. The makefile language is
simple yet extremely powerful for describing a dependency tree such as the
one for building an executable from sources. Perhaps its strongest point is
that it is completely agnostic to what the different steps actually do,
rather than consisting of a predefined set of canned rules/tasks with, if
you're lucky, a convoluted way of adding your own using e.g. Python (or
worse, Java). This means that as long as a shell command can turn a set of
inputs into an output, describing it usually takes just two lines of
makefile: one naming the target and prerequisites, one providing the command
to perform the transformation. If a particular rule is too complicated for a
few shell commands, invoking a separate script (or even a just-built
executable) is trivial. Every other system I've looked at needed a
comparatively enormous amount of code to add even the slightest tweak to the
built-in rules.
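
To make the two-line claim concrete, here is a made-up example (hypothetical
file names): any shell command that maps inputs to an output becomes a rule.

    # Hypothetical rule: one line names the target and its prerequisites,
    # one provides the command (the recipe line must start with a tab).
    report.txt: data.csv transform.sh
    	./transform.sh data.csv > $@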

GNU make is actually two different languages coexisting in the same file
(the makefile): a declarative language for describing the targets and their
dependencies, and a functional language for the variables/macros. Once you
realise this, a whole new world of possibilities opens up. If more people
took the time to actually understand proper use of makefiles, perhaps we'd
see fewer poorly reinvented wheels.
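
A small generic illustration of the two coexisting layers (a sketch, not
taken from any particular project):

    # Functional layer: variables and macros, evaluated by substitution.
    SRCS := $(wildcard src/*.c)
    OBJS := $(SRCS:src/%.c=obj/%.o)

    # Declarative layer: the dependency tree and how to realise it.
    prog: $(OBJS)
    	$(CC) -o $@ $^

    obj/%.o: src/%.c
    	@mkdir -p obj
    	$(CC) -c -o $@ $<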

~~~
rkangel
I've written a few complex Make-based portable build systems over the
years, and in my experience Make script is its major limitation.

Make is a great system for declaring dependencies - the basic syntax is
wonderfully concise (not that that's everything) and clear, and the system
is a great platform on which to build your build system (unlike SCons,
which tries to abstract away all sorts of things and just ends up hiding
them).

The problem is that when you are trying to do anything more complicated
than simple rules, you are lacking basic language features. You don't even
get function calls - the templates work under many circumstances, but not
all (and are a pain to debug). Also, don't get me started on invisible
syntax errors (tabs vs spaces), and the only error message Make has:
'Missing separator. Stop'.

Writing a build system in Python loses you that concise declaration syntax
(but again, characters typed isn't everything; as long as you maintain
clarity it doesn't matter), but gets you much more flexibility to produce
rules under complex circumstances, which is what you need for portability.

~~~
mansr
Your comment betrays your ignorance.

Firstly, you call it "make script," suggesting that you view makefiles as a
typical (procedural) script that is executed from top to bottom. I base this
on my experiences working closely with others who also used the term "make
script," all of whom suffered from this misconception.

Secondly, GNU make does have function calls. Look up the $(call ...)
construct. When the built-in functionality is insufficient (and of course such
cases come up), it is trivial to call an external script, written in your
language of choice, using either the $(shell ...) syntax or as part of a
target recipe.
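
For example (hypothetical function and file names):

    # A user-defined function, invoked with $(call ...);
    # $(call swap,world,hello) expands to "hello world".
    swap = $(2) $(1)
    greeting := $(call swap,world,hello)

    # Escaping to an external program with $(shell ...):
    rev := $(shell git rev-parse --short HEAD)

    version.h: version.h.in
    	sed 's/@REV@/$(rev)/' $< > $@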

As for "Missing separator. Stop.", it is not fair to dismiss an entire tool
because of one slightly obscure error message.

I also find it ironic that you complain about tabs vs spaces, then go on to
suggest using Python, which also has invisible whitespace as part of its
syntax.

------
soulcutter
Over 10 years later, and this is still relevant. It's scary how entrenched
'good enough' solutions become.

~~~
brigade
It's also kind of amazing how terrible most of the replacements people have
developed are. Some are significantly _worse_ than autotools, which is no
mean feat.

I blame make's awful syntax.

~~~
jjoonathan
Yes! But I'm not so sure we can entirely blame the make syntax. _Something_
made the CMake people believe that inventing a new macro language / m4-redux
was a good idea. I'm thinking that either m4 is a mind-virus or they're
serving wine in lead glasses at the annual build-system conferences.

~~~
fragmede
Or: a build system is harder than it seems.

Makefiles work fine in the _trivial_ case; the problem is that things
quickly become complex. Automating that complexity seems like it should be
easy but, as it turns out, it isn't.

~~~
jjoonathan
Is there something about the (admittedly very difficult and thankless) task of
automating builds that justifies re-inventing the wagon-wheel of languages?
Because that was my specific complaint, and there's a reason why it was
specific.

~~~
fragmede
Because it seems easy. And easy things done "wrong" _require_ re-invention.

There should be a way to $FOO. Well, there's a way to do it in a Makefile,
but figuring that out is harder than it should be, and it doesn't make sense
when you finally _do_ figure it out.

"Well that way is stupid. Here's how I'd do it:"

------
mschuster91
I don't have many problems with the auto* toolkit.

Yet, there is one problem which has cost me lots of money buying beer and
getting myself filled up: why can't auto* check for all the libraries and
then output an aggregated "libx, liby and libz missing, libfoo outdated"
instead of the configure - apt-get - configure - apt-get cycle?

~~~
andrewflnr
Because it's just a bunch of macros. To create an aggregated list, they would
all have to, instead of printing out what they want, add their info to some
global variable, whose name they all agree on in advance and probably with
some kind of standard format. Which isn't to say it couldn't be done, but it
would be much more complicated.

~~~
mansr
It wouldn't take much at all to have the AC_ERROR macro append the message to
a global variable, set a global error flag, and continue. At the end of the
script, if the error flag is set, the aggregated error messages would be
printed. The variable names would only need to be coordinated between AC_ERROR
and one other macro called at the end.

------
jamesjporter
It's interesting how some software problems have garnered many really good
solutions while others are still mired in stuff like this.

E.g. no matter what your preferred source control tool is, I think we can
all agree that there are some pretty good options out there these days.
Compare that to the insanity that prevails in packaging in every corner.

~~~
mturmon
Text editors are another classic example of this conundrum. Some problems seem
to attract problem-solvers, and some do not.

~~~
mansr
Text editors are a solved problem since Emacs was written.

~~~
fnordfnordfnord
Haha... Oh. You're not joking are you?

~~~
Shish2k
Emacs has an infinite number of features, but the end user may be required
to write a few thousand lines of configuration to enable some of them :)

~~~
mansr
The power of Emacs is that the configuration file is more than just that. It
is code that becomes part of the editor and can replace pretty much any part
thereof if the user so desires.

~~~
Dylan16807
You got the joke!

------
oakwhiz
I'm surprised there has not been any mention of Gitian, the build system used
by Bitcoin to perform exactly deterministic builds. The purpose of this is to
enable multiple developers to prove that they are all signing the same binary.

[http://gitian.org/](http://gitian.org/)

------
fsloth
As build systems go, this seems like a promising experiment:
[http://sourceforge.net/p/meson/wiki/Design%20rationale/](http://sourceforge.net/p/meson/wiki/Design%20rationale/)

tl;dr:

\- DSL for builds with well-defined semantics; leverages the existing
toolchain

\- design constraints for the system that make sense, i.e. speed,
portability, usability and common sense

\- does not try to reinvent the wheel but rather simplifies the usage of
existing tools

\- the build config it generates on my Ubuntu box seems to really deliver on
the speed promise - I have no benchmarks to quote, but a small C++ codebase
using all sorts of dependencies, including Boost, compiled nearly instantly

------
dventimi
_Running "./configure && make && make install" usually results in a working
installation of whatever package you are attempting to compile._

followed by

 _the auto tools are constantly a thorn in the side of users and developers
alike._

Which is it? Do the autotools "usually [result] in a working installation" or
are they "constantly a thorn in [our sides]?"

~~~
mateuszf
"Usually" is not enough in that case. A developer is potentially dealing
with a large number of users, so even a small percentage of people having
problems with the build is a very suboptimal situation.

------
caitp
[https://xkcd.com/927/](https://xkcd.com/927/)

But on a serious note, I really want to see Gyp become more popular, for the
simple reason that integrating multiple Gyp projects is essentially zero-
effort.

It's a beautiful way of working, even though Gyp certainly has room for
improvement.

~~~
ahomescu1
AFAIK, Gyp's biggest user is Google (Chromium uses it, and maybe others). Are
there any non-Google projects that use it?

~~~
caitp
In addition to the big Google projects using it (Chromium, V8, WebRTC/jingle,
etc.), there's also Joyent's Node.js (and its forks, plus numerous node
packages containing native code), as well as numerous private projects I've
been involved with. Unfortunately, it hasn't taken off much in the open
source community apart from the above-mentioned big-name projects. Better
documentation and more visibility could probably make that happen, though.

------
dror
That's an old school build approach (it _was_ 2003).

The modern approach is: apt-get install (or your OS equivalent).

It's 2014. I can't remember the last time I needed to install a package from
source. Worst case scenario I need to add a repo.

------
stefantalpalaru
> Joe GNU/Linux User has just downloaded and untared this package.

Regular users are not supposed to install software like this. The only sane
way to administer a distribution is to make sure everything passes through the
official package manager. This leaves the developers and package maintainers
to deal with autotools and I don't hear them complaining (very loud[1]).

Autotools is simply good enough for the job, and instead of being some sort
of hated legacy it's sometimes even preferred for new projects ( _cough_ [2]
_cough_ ).

[1]:
[https://blog.flameeyes.eu/tag/autotoolsmythbuster/](https://blog.flameeyes.eu/tag/autotoolsmythbuster/)

[2]: [https://github.com/stefantalpalaru/vala-skeleton-autotools](https://github.com/stefantalpalaru/vala-skeleton-autotools)

~~~
chomp
In a perfect world, every single piece of software ever written is in every
single distribution's package manager.

This is not always the case in reality, however.

~~~
csense
In a perfect world, the following also holds:

\- The versions in the repositories are always up-to-date with the latest
numbered version.

\- Users never want/need any features or bugfixes that haven't made it into
the latest numbered version.

\- Nobody ever needs to install an older version of anything, because new
versions never introduce compatibility-breaking changes. Which of course are
never necessary anyway, because software designers always have perfect
foresight, so the initial design of every package is always something that
will remain perfectly suited to it forever.

~~~
stefantalpalaru
All those problems are fixed by providing a way for users to maintain a
local package repository (like Gentoo does with its local overlay[1]).

The only downside is that it's more work to create new packages, and you end
up with part-user/part-maintainer hybrids that no longer fit the "regular
Joe User" model.

[1]:
[http://wiki.gentoo.org/wiki/Overlay/Local_overlay](http://wiki.gentoo.org/wiki/Overlay/Local_overlay)

------
J_Darnley
The last time I built anything that required one of these autotools
monstrosities - which was LAME, I might add - it gave me some obscure error
about not having a suitable type for a signed 64-bit integer. Which was
unbelievably stupid, since it had just "found" uint64_t for its unsigned
counterpart. So I opened its nearly megabyte-sized, thirty-two-thousand-line
configure file to track down the problem. I seem to recall that in the end I
fixed it by correcting the placement of some braces amid its many nested
ifs.

Now I avoid anything that needs anything like these tools or anything that has
a step before running configure from a source code checkout.

------
dschiptsov
There is nothing wrong with autoconf or libtool.

Take a look at how sane developers (nginx, cpython, etc.) are using it.

