Hacker News

From the point of view of an end user, I love compiling projects that use autotools. Download a tarball, then ./configure, make, and make install, without needing to install a bunch of dependencies just for the build system. You can also trust that standard command-line flags like --prefix and so on will be there, and do what you expect.
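
The whole paradigm fits in a few commands; for a hypothetical tarball foo-1.0.tar.gz (the package name and prefix here are illustrative):

```
tar xzf foo-1.0.tar.gz
cd foo-1.0
./configure --prefix=$HOME/.local   # standard flags, same on every autotools project
make
make install
```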

However, from the point of view of a developer, autotools are way too much for my brain. The m4 macros are inscrutable and I never felt like I had any hope of actually understanding how they work. It's one of those technologies where my only hope of getting work done is copy-pasting snippets of code I got from other people.

Anyway, does anyone know if there are alternative build systems that follow the same paradigm as autotools, but are more pleasant to use as a developer?




My experience is that a lot of these projects are bluffing when they use autotools. Autotools will work very hard to make sure stdlib is there, but as soon as you compile on a more exotic system, none of the checks are of any use and you fail at the make stage.

Maybe we should make an easy-to-use version of autotools which does nothing but accept the standard --prefix flags, etc. You wouldn't be able to do sophisticated configuration, but let's face it, most projects only pretend to do that anyway.
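
A sketch of such a "do-nothing" configure: it accepts the conventional flags, records them for a plain Makefile to include, and performs no feature checks at all. The names here (config.mk, the flag set) are illustrative; this is not an existing tool.

```shell
#!/bin/sh
# Hypothetical minimal ./configure: accept the standard options, write
# them to config.mk, and do nothing else -- the compiler will complain
# soon enough if something is actually missing.
prefix=/usr/local
CC=${CC:-cc}

for arg in "$@"; do
  case $arg in
    --prefix=*) prefix=${arg#--prefix=} ;;
    CC=*)       CC=${arg#CC=} ;;
    --help)     echo "usage: ./configure [--prefix=DIR] [CC=compiler]"; exit 0 ;;
    *)          echo "warning: ignoring unknown option: $arg" >&2 ;;
  esac
done

printf 'PREFIX = %s\nCC = %s\n' "$prefix" "$CC" > config.mk
echo "configured with PREFIX=$prefix CC=$CC"
```

A Makefile would then just `include config.mk` and use $(PREFIX) in its install target.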


That's the kind of thing I was thinking about. Do they exist?

One example that someone mentioned in a sibling comment is Autosetup, which apparently uses Tcl instead of posix shell + m4. An interesting idea... Tcl is one of those languages that's small enough that it's feasible to include a copy of the interpreter together with the build scripts.


This has been my exact experience. I used autotools for a couple of projects that I wrote in C. Ultimately it wasn’t terrible but I have to say I still don’t understand most of what autotools was doing under the hood, just that it checked which libraries were installed and yelled loudly if the ones I needed weren’t available.

Since then I’ve started rolling very simple Makefiles that just call the compiler. cc will complain loudly when it can’t find the right headers, and for these simple projects I don’t need any configuration flags, so why worry about all the machinery of autotools?
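
That style, sketched for a hypothetical single-file project (the name foo and the flags are placeholders):

```make
# No configure step at all: if a header or library is missing, cc will
# say so loudly at compile time.
PREFIX = /usr/local
CC     = cc
CFLAGS = -O2 -Wall

foo: foo.c
	$(CC) $(CFLAGS) -o foo foo.c

install: foo
	mkdir -p $(DESTDIR)$(PREFIX)/bin
	cp foo $(DESTDIR)$(PREFIX)/bin/

clean:
	rm -f foo
```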


IIRC from my C++ days, there's no alternative to Autotools that keeps the same pattern of configure, make, and make install.

The big alternatives I remember are: SCons, Maven, and CMake.

I liked CMake the most; Autotools was nearly unusable for me. It seemed like I needed to learn almost as much about Autotools to get productive as I did for C++.

If I'm wrong about anything, someone will come along to correct me.


If you squint, `cmake ..` looks like `./configure`. From there, you `make` and `make install`.


Part of the issue is also the lack of standardized flags. I will remember "./configure --prefix=/opt/foo" until the day I die, but "cmake -DCMAKE_INSTALL_PREFIX=/opt/foo" is something I have to look up every time (like I did just now to type it here).


That could be solved by a "configure" script that translates its arguments to ones that cmake will understand. And it seems it's such a good idea that someone has already made one: https://github.com/Richard-W/cmake-configure-wrapper
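
The idea of such a wrapper, sketched (this is not the linked project's actual code; the flag mapping shown is illustrative, though CMAKE_INSTALL_PREFIX and BUILD_SHARED_LIBS are real CMake cache variables):

```shell
#!/bin/sh
# Hypothetical translation of autotools-style flags into CMake cache
# variables. Prints the resulting command instead of running it, so the
# translation itself can be inspected.
translate() {
  out="cmake"
  for arg in "$@"; do
    case $arg in
      --prefix=*)       out="$out -DCMAKE_INSTALL_PREFIX=${arg#--prefix=}" ;;
      --enable-shared)  out="$out -DBUILD_SHARED_LIBS=ON" ;;
      --disable-shared) out="$out -DBUILD_SHARED_LIBS=OFF" ;;
      *)                out="$out $arg" ;;     # pass anything else through
    esac
  done
  printf '%s ..\n' "$out"
}

translate --prefix=/opt/foo --disable-shared
# prints: cmake -DCMAKE_INSTALL_PREFIX=/opt/foo -DBUILD_SHARED_LIBS=OFF ..
```

A real wrapper would exec the translated command rather than echo it.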


You can inspect and change the configure options interactively: in the terminal via ccmake, or in a "proper" UI application via cmake-gui.


You need CMake installed on your system to do "cmake ..". You don't need autotools installed on your system to do "./configure".


And then it fails because you don't have "make" or a compiler installed. Build dependencies aren't an issue as long as they are reasonably standard and easy to install.


Yes, you need autogen.sh.


You don't need to run autogen.sh. The maintainer can run that ahead of time when they make the tarball.


Depends on the system. If it was old, or the release date was a while back, you might need to do it anyway.

Many Linux distributions do it as a matter of course to ensure it's actually possible to regenerate and that it's up to date. At that point, you start to question the necessity of embedding it in the first place given that its primary consumers don't care.


To avoid the autotools dependencies, we used autosetup https://msteveb.github.io/autosetup

It's compact enough to be included along with the project. It handles the configure step, which tests compiler features and installed libraries, then generates the Makefile from your custom Makefile.in.

Basically, it's a compact set of Tcl scripts; it even includes a small Tcl engine, in case Tcl isn't installed on the platform.


Autosetup is used by the Fossil VCS project (written by Richard Hipp, SQLite author)

https://msteveb.github.io/autosetup/articles/fossil-adopts-a...


> ...without needing to install a bunch of dependencies just for the build system...

Ahem

    autogen.sh: command not found
Autotools should be put out to pasture. They serve literally no useful purpose at all in 2021.


You don't need autogen.sh for a release tarball - the whole point of autotools is to generate a configure script that in turn generates each Makefile from its Makefile.in (which automake produced from Makefile.am) for you.

However, if you're pulling a random commit from git rather than ungzipping a proper release tarball, where autogen and co have already been run on a dev system for the release, then that workflow of course can't work.

Which is one of the points of the linked discussion - that folks clone from git rather than doing "proper" releases, with cloned repos increasingly bringing their dependencies with them. Another point being that modern "language ecosystems" a la Go and Rust have their own canonical package management and aren't really made for polyglot development and linking with locally installed libs.

I don't quite get the autotools hate; from a user PoV, it's the one build system that has worked extremely well over the decades with just POSIXly tools installed locally (make, sh, cc). The same can't be said for cmake. Not a particular fan of libtool, but arguably the invasive thing it does is a consequence of link-loaders such as ld.so still not getting lib resolution quite right in spite of ld.so's heavy-handedness (Mac OS's is saner IMO). Another reality is that Docker builds are used to shield against lib breakage.

IMO, what could be done to simplify builds is not to bring in a new grand-unifying builder a la cmake, but to find common ground among GNU and BSD make, make generic "make" more powerful such that Makefile macro expansion works in more places than it does now, and rely solely on Makefiles and POSIXly/LSBly C/C++ header/macro-def discovery in your source files rather than relying on automake, config.h, and -DHAVE_XYZ. Then slowly deprecate autotools and restrict yourself to targeting the much more uniform landscape of Linux, the BSDs, and Mac OS we have today.
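
One way to read that "common ground" proposal, sketched: a Makefile restricted to the POSIX-make subset that GNU and BSD make already share (the file names and flags are illustrative).

```make
# POSIX make subset only: plain macros and suffix rules, no GNU
# $(wildcard)/ifeq and no BSD .if -- both implementations accept this.
CC     = cc
CFLAGS = -O2
OBJS   = main.o util.o

foo: $(OBJS)
	$(CC) $(CFLAGS) -o foo $(OBJS)

.c.o:
	$(CC) $(CFLAGS) -c $<

clean:
	rm -f foo $(OBJS)
```

On the source side, compile-time checks like `#if defined(__has_include) && __has_include(<threads.h>)` (a long-standing GCC/Clang extension, standardized in C23) can stand in for many of autoconf's -DHAVE_XYZ definitions.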


Autotools are _mostly_ fine from a user's perspective (run configure, make, done) and horrible from a dev's perspective.


Quite the opposite. Autotools are wonderful for devs. I would never start a new project with cmake or bazel or ninja.

Either autotools for the quality projects, or plain Makefiles for header-only-style projects. Cmake is faster, but extremely limited.


They may be of no purpose to you, but there are thousands, if not millions, of people to whom autotools is still incredibly useful, even with all its weaknesses and flaws.


autotools were originally designed to abstract away the platform-specific UNIX differences, 40 years ago.

Today UNIX and those platform-specificities don't exist anymore.

Instead we have Linux, FreeBSD and MacOS. Unfortunately, autotools haven't kept up and don't actually do anything useful to help you write portable code across Linux and MacOS.


I would love it if the autom4te requirement for Perl could be removed so I don't need to always keep a Perl installation around. (OpenSSL requires Perl, too.) If a scripting language is an absolute necessity, change it to something smaller like Lua. (Lua is part of NetBSD base.)


Perl is usually installed on most Unix or Unix-like systems; Lua, not so much. And the size difference, though proportionally big (5x), is insignificant on modern systems (1M vs 0.2M). Also consider that Perl is a more featureful language, and switching to Lua could mean more work for the devs if some of those features had to be reimplemented.


Perl does not cross-compile without applying non-upstream patches.


And Perl comes with OpenBSD.


What would the better higher-level scripting language be though? I'd love to use something better than sh.


Use a 3rd party tool to generate... autotools "scripts", that's the best compromise IMHO... build tools are not a solved problem yet, you'd think it would be priority #1 for the dev world but no...

Ultimately, devs shouldn't have to suffer to build C or C++ programs like that. Why is building executables still so hard, even today? Developers need to solve that problem once and for all. Containers aren't the solution.


> Use a 3rd party tool to generate...

Isn't that a case of "now you have two problems"? :)


More than that; after all, we already generate Makefiles. So it's 3rd-party tool -> autotools -> Makefile. Every step in this chain probably adds another layer of obfuscation to any build error you encounter.


> > Use a 3rd party tool to generate...

> Isn't that a case of "now you have two problems"? :)

No. Now you have three problems: the 3rd party, the tool, and the original one.


I think most of the engineering has moved towards things like Cargo for Rust or Go for Golang.



