Hahaha, there is nothing simple with autoconf :D
I supposed if you don't feel like you need to understand all the moving parts it's not horrific.
To be fair, autotools evolved in a different era, and to solve different problems than Bazel.
It supersedes http_archive and falls back to git_repository if needed. Just run `bazel sync`. See it working at .
Note: there's an open (and somewhat long-standing) issue WRT Bazel fetching from GitLab.
These "get off my lawn" comments really miss the point. If you don't understand why you might need a more complex build system than "a simple batch file", then this really isn't software you need. That doesn't mean it's insane.
At some point "recompile everything" is a bit slow, so I swap it out for a makefile.
At some point the makefile gets too complicated and I swap it out for Bazel.
1. Dependency handling
2. Single script for multiple platforms
3. Easy for a new user to just "run"
But I have yet to find a build system (for C/C++) that solves all these issues better than simply having a few scripts.
I usually just use a build.bat, linux_build.sh and osx_build.sh, and I have never had problems.
A new user can just download those scripts, and as long as they haven't tampered with file/library locations it just works.
If people like their build systems - fine by me - but I just think they are unnecessarily complex a lot of the time. Granted, I don't have a huge project with hundreds of dependencies, but even then, those dependencies don't come all at once.
As a developer I don't want to have to worry about how to make all of the above work. I learned CMake so I don't have to worry; now that I know it (the learning curve is real), almost all my issues go away, because it just works with all the weird stuff people do.
Replacing autoconf with cmake or a similar metabuilder will provide a lot more flexibility to build on different platforms, including, eg, windows, and cross-build.
If it's easier to build on any platform, you don't need to get into the custom packaging business and can just focus on your library.
What Bazel (or Buck) brings to the table is slightly different: better control of inter-module dependencies and external dependencies. In a way they are not only build tools but opinionated and effective 'linkers', bundling your applications and libraries together in a way that is hard to mess up and very reproducible.
The other option would be the nix/guix route, which I think might be better suited for the open-source model but requires you to adapt at least the package managers.
IDE integration would remove 90% of my hesitation to push it at my company.
Where I am we were using CMake 2 for our build system for a long time and it got into bad shape. When we made the decision to rewrite the build system (across a few dozen subprojects), we looked at a few - Meson, Bazel, CMake, etc. CMake was hard to argue against because of the recent work (last few years) by Microsoft to support CMake projects directly (File -> Open Folder). We wanted to try to get away from maintaining separate IDE projects for every platform for every project.
I think the Bazel team introduced Aspects and punted the rest for a later date. Now they hope each language/IDE community will take that infrastructure and build out the rest of the ecosystem. I think it's too important a piece of the puzzle to leave to the community.
CMake can generate Xcode projects.
Every CMake project I've worked with just works. As in, it only takes a couple of hours to get it packaged for the custom (stupid...) package system I have to use, and almost all of that is boilerplate work.
Only 1 in 50 autoconf projects I've worked with actually works - in theory autoconf supports this case well, but in practice there is always something broken, so I know I'll spend a couple of weeks (full time) getting it packaged.
I don't think autoconf has a future - even when it was "the only choice", everyone avoided it if they could. CMake took over and became the thing everyone knew.
Now if you (like the article) are proposing something else, you might have a chance. However, autoconf has lost.
Now that we live in a post-innovation world, there is no need to port software: it's either Linux, Mac OS, or Windows.
Except, of course, there is still a lot of innovation going on, there's a lot of porting and cross-building that happens outside the web domain, and a heck of a lot of things run on none of the big three development hosts.
CMake is starting to converge towards what the autotools have provided for some decades now, but it's not there yet. When it comes down to it, the only real difference is different domain-specific languages and that just results in tribalism.
Except for all the times we don't.
Trust me, I was there in the nineties when hard dependencies proved unscalable. The only thing that has really changed since then is the scale (it's gone up orders of magnitude) and the flood of people who have no idea what goes on under the hood.
My experience using CMake and Autotools and migrating between the two is... the only appreciable benefit to CMake is that it works well enough on Windows, and Autotools sucks on Windows. Whereas Bazel is in a different league, IMO. For people on Windows, the fact that CMake works well on Windows is enough to sing its praises, but I think Bazel will be eating the C and C++ world over the next five years.
In defense of Autotools, it's designed to work on systems which don't have Autotools installed, back in an era where you might download Apache and compile it from source to get a web server running on your system. It will work on a stock install of Linux, Solaris, AIX, or macOS. Almost nobody cares about this use case any more.
Maybe, but how relevant is this? Application development on Windows is historically not based on open source, and for developing anything other than Windows apps, supporting something different from a POSIX system is not really important.
> but I think Bazel will be eating the C and C++ world over the next five years.
Do you think Bazel is better than NixOS and GNU Guix? I think things like Bazel are pretty much geared to big companies which want to move ecosystems to the cloud and have a lot of manpower to support very complex dependency systems. I think that for distributed open source projects, a system like Guix, which decouples individual packages while supporting (or even requiring) a build from source, is better. There is also the issue that with a Bazel build, there is potentially a whole lot of stuff which runs uncontrolled on your local system. It is basically "curl | bash" on steroids, which means you hand over your machine to the cloud.
I always found that annoying, because if I just want to compile something (be it a program or a library) from source, I also need to install a bunch of build systems I'm never going to use myself. That is just pure, unnecessary bloat for me.
A specific example I ran into was compiling random projects with -flto. libtool, in its infinite wisdom, decided long ago that it was a bad idea to pass any -f flags to the linker invocation and so stripped them out, even if you, the user, manually told it to via LDFLAGS. This of course breaks -flto, so newer versions of libtool got the picture and whitelisted that flag. Except many projects haven't updated their bundled libtool in a while, so you have to patch their source to get it to work.
But TBH my comment was more about the other build tools than autotools... Autotools have many issues and are way more complex than they need to be, but I do not see a build tool that does exactly what autotools do, despite there being so many tools out there.
Windows + vcpkg + MSVC: it works perfectly.
Windows + vcpkg + clang: it is a nightmare of linking problems.
VS2019 otoh, in my experience, just worked out of the box (with the single package I tried ...).
The Windows installation (assuming VS is already present) was pretty easy. I tested by running `vcpkg install glfw3` and copy-pasting an example from the web into VS2019. The example ran with _zero_ configuration!
Is the experience as nice in Linux or Mac?
Footnote: If you're doing this on Windows and want static linking then the experience is less smooth. You still need to specify the vcpkg directory and triplet (x64-windows-static), but that doesn't automatically change your application's CRT to the static one (which all your libraries are now assuming). There are a couple of ways round this but the easiest is probably to use MSVC_RUNTIME_LIBRARY CMake property (requires at least CMake 3.15, which is pretty new).
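For reference, the property mentioned above looks roughly like this in a CMakeLists.txt (the project and target names are made up; the generator expression selects the debug CRT in Debug configurations):

```cmake
cmake_minimum_required(VERSION 3.15)  # MSVC_RUNTIME_LIBRARY requires 3.15+
project(myapp CXX)

add_executable(myapp main.cpp)

# Link the static CRT (/MT, /MTd) so the application matches the
# x64-windows-static vcpkg libraries instead of defaulting to /MD.
set_property(TARGET myapp PROPERTY
    MSVC_RUNTIME_LIBRARY "MultiThreaded$<$<CONFIG:Debug>:Debug>")
```

Without this, CMake keeps the default dynamic CRT and you get the classic LNK2038 runtime-mismatch errors against the static vcpkg libraries.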
- a previous version of Bazel
- Zip and Unzip
Not sure I'd call that "huge". Perhaps the JDK?
> - a previous version of Bazel
Why do all these systems seem to need to prove Goedel's theorem? What is wrong with providing a tool that builds portably with a simple C compiler?
I suppose you may be able to find a chain of older versions of Bazel starting with one that doesn't require Bazel itself to build, like the Guix devs did for Rust.
The learning curve is steep though, but the payoff is so worth it.
It works so great though, now that I've managed to set it up.
Bazel won't pull in transitive dependencies unless you ask it to in the WORKSPACE file, which means that you can use a private mirror for all your dependencies if you like, which is fairly easy in practice (I do it for personal projects).
The experience will vary depending on your preferences and the languages you use. With Go + Bazel, my experience is that the Bazel version will have fewer dependencies than the equivalent "go mod" version, because "go mod" will pull in dependencies more coarsely and Bazel has more fine-grained control. Go mod will pull in dependencies for the entire repo that you depend on, but with Bazel, you only need dependencies for the individual subpackages. As a specific example... suppose you use github.com/jackc/pgx. With "go mod" you end up with "github.com/sirupsen/logrus" in your transitive dependencies, but with Bazel you don't, unless you use the part of pgx that requires logrus.
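A sketch of what that fine-grained declaration looks like in a BUILD file (the repository label and target names follow common Gazelle naming conventions but are assumptions here, as is the import path):

```python
# BUILD.bazel: this library declares exactly the packages it imports.
# Bazel only fetches and builds what is reachable from these deps, so a
# logrus-using corner of pgx that you never import stays out of the graph.
load("@io_bazel_rules_go//go:def.bzl", "go_library")

go_library(
    name = "db",
    srcs = ["db.go"],
    importpath = "example.com/myapp/db",
    deps = [
        "@com_github_jackc_pgx_v4//:pgx",  # repo/target names are illustrative
    ],
)
```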
You can also end up with "half the internet" in your dependencies pretty easily, but I think these days that is just the price of including random libraries in your project.
So how is that suitable for distributed open source projects? I understand this is how Google works, but open source?
Actually it will nag you until you do.
> Just let the distributions maintainers do their job.
Is this sentiment supporting Bazel usage or recommending against it?
* there is a specific version of the library associated with the OS that the developer can develop against - if the OS updates, the developer also has to update their apps - so they don't leave old bugs and vulns open
* there is a specific version of the library associated with the OS that the developer has to develop against - if the OS updates, the developer also has to update their apps - which can sometimes be very annoying if the API breaks
There is more to it, but this is the thing that constantly has me changing my opinion on the matter.
A vocal part of the Linux community gets extremely angry when packages don't use all the .so's provided by their (often outdated) distro of choice and instead rely on the app's developer choosing the dependency versions that work best with their software.