
Show HN: A Modern C/C++ build tool (Simple, Fast, Cross-platform) - waruqi
https://xmake.io/#/
======
kstenerud
I so badly want to get off the C/C++ build system roller coaster. I've gone
from make to autoconf to various IDEs to CMake to Meson, and have looked at a
bunch of others but never made the jump.

Eventually fatigue sets in, you pick a tech, and stick with it even as the
tech fades into obscurity and nobody can figure out how to build your project
anymore, let alone integrate it :/

~~~
pjmlp
I have it easy, MSBuild, and when not on Windows, CMake.

Then again, I just need it to the extent of Java/.NET mixed builds with C++.

~~~
jstimpfle
> MSBuild

OMG don't say the name, it causes me physical pain. That XML crap which can't
decide whether to be a shitty static GUI-editable build format or a proper
description language that you can actually use. It ended up being neither (to
be fair the first can't really be achieved for non-trivial projects). It's one
of the worst designs I've had to deal with.

(I'm still in the process of cleaning up hundreds of .vcxproj files, each of
which has thousands of lines of auto-generated boilerplate in it, using the
crutch that is .props files.)

> and when not on Windows, CMake.

Even better, let's add another abomination, in the form of CMake, on top of
MSBuild. And give up the last bit of control over your build that you could
hope to have.

~~~
jstimpfle
But, let's not forget to mention, MSBuild the "execution engine" is absolutely
fantastic... Working in Visual Studio, it reliably detects the minimal amount
of things to rebuild.

~~~
lightgreen
> it reliably detects the minimal amount of things to rebuild

So they managed to implement graph traversal, which is a basic interview
question everywhere? Every dummy build system does it, but for Microsoft
that’s an achievement!

~~~
jstimpfle
It's not about walking a dependency graph, but about getting the dependencies
100% right. In C, that's indeed very difficult, or at least a lot of work.

I don't know if you have tried MSBuild, but I've spent countless hours
cleaning up .vcxprojs, shifting things around, factoring into external .props
files, etc. Even on major changes I was often surprised how little rebuild was
necessary. The granularity at which the system detects changes is not just
per-file. It has per-file caches for the set of preprocessor variables, the
set of include paths, all the little compilation and link settings, and so on.

So what it will do if you change the build file is, it will re-execute the
build file processor and generate all the necessary build information for each
C file. And then, it only rebuilds each file if a build dependency has
actually changed in a relevant way for that file. For example, the order in
which the preprocessor variables are defined does not matter (as long as they
don't overwrite each other), but the ordering of include paths does. Also, if
any of the files that were #include'd (transitively) in the last build changed
since then, a rebuild is needed.

Compare that to the average crappy Makefile system that you are going to set
up. The best you could achieve with Make is never going to rebuild as little,
or as reliably rebuild stale build products. I like Make for other reasons,
but MSBuild is better and more robust.
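Not MSBuild's actual implementation, but the change-detection policy described above can be sketched in a few lines: hash each source file's build settings, treating preprocessor defines as an unordered set and include paths as an ordered list, and rebuild only when that signature or a (transitively) included file has changed. All names here are made up for illustration.

```python
import hashlib

def build_signature(defines, include_paths, flags):
    """Per-file settings signature: defines are normalized as a set
    (their order is irrelevant), include paths are kept as an ordered
    sequence (their order changes header lookup results)."""
    h = hashlib.sha256()
    for d in sorted(set(defines)):   # order-insensitive
        h.update(d.encode() + b"\x00")
    for p in include_paths:          # order-sensitive
        h.update(p.encode() + b"\x01")
    for f in flags:
        h.update(f.encode() + b"\x02")
    return h.hexdigest()

def needs_rebuild(old_sig, new_sig, dep_mtimes, obj_mtime):
    """Rebuild when the relevant settings changed, or when any file
    that was #include'd (transitively) in the last build is newer
    than the object file."""
    return old_sig != new_sig or any(m > obj_mtime for m in dep_mtimes)

# Reordering defines does not invalidate the cache...
a = build_signature(["NDEBUG", "WIN32"], ["inc1", "inc2"], ["/O2"])
b = build_signature(["WIN32", "NDEBUG"], ["inc1", "inc2"], ["/O2"])
assert a == b
# ...but reordering include paths does.
c = build_signature(["NDEBUG", "WIN32"], ["inc2", "inc1"], ["/O2"])
assert a != c
assert needs_rebuild(a, a, dep_mtimes=[100.0], obj_mtime=50.0)
assert not needs_rebuild(a, a, dep_mtimes=[10.0], obj_mtime=50.0)
```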

~~~
lightgreen
I have spent a large amount of time working with msbuild while migrating an
msbuild build into a different build system (including working on the
implementation of a two-way converter, from and to msbuild).

What you described (caches based on preprocessor flags) is really cool, but
in practice it does not give a huge win compared to a proper modern build
system like Bazel; a decent user interface (including the build language) and
extensibility are much more important. Especially because it’s super hard to
properly configure an msbuild project with proper dependencies between
modules.

I have a friend who worked at MS. He told stories how in their VS projects
dependencies between modules (projects) were not configured; developers had
to manually invoke compilation of dependencies. In theory it is possible to
configure msbuild properly; in practice even MS employees failed to do so.

But I don’t consider msbuild a build system. For me it’s just a project
format for VS which some developers commit to VCS. It’s great that MSBuild
allows you to have a proper C++ project definition with a custom command to
actually build it (an external build system), so VS sees the proper project
structure, but developers don’t have to deal with vcxproj files.

~~~
jstimpfle
> He told stories how in their VS projects dependencies between modules
> (projects) were not configured

Yes that's still an issue at my workplace as well. When I joined, there were
no dependencies - all files were compiled redundantly for each project. So
there were no issues with regards to reliably rebuilding dependencies, but of
course it takes a lot longer to build. And there are serious maintainability
issues with de-facto internal libraries, because adding or removing files
to/from each "library" means that each dependent project file has to be
updated. I've started to split out a few "library projects", and now we're
starting to get the rebuilding issue. I think dependencies can be configured
in *.sln files, but that probably means that they have to be configured
separately for each project, i.e. it probably can't be done automatically when
simply importing the library's .props file into a .vcxproj.

A nice middle ground could be to make .props files that just include .c files,
meaning the library's files will be part of each project, so no rebuilding
issues and no maintenance issues since the list of files that belongs to a
library is maintained in one isolated place (the .props file). But that again
means the files are built redundantly in each project.

But the long-term solution will be to maintain the build descriptions in
better-suited in-house data structures (yeah like CMake, just not that scary)
and to generate the MSBuild cruft from that.
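The generation step described above is easy to prototype: keep the source list in plain data and emit the .props XML from it. A minimal, hypothetical generator (file names made up; not a complete MSBuild project):

```python
def props_for_sources(sources):
    """Emit a minimal MSBuild .props fragment that lists a library's
    source files in one place, so consuming .vcxproj files can simply
    <Import> it instead of repeating the file list."""
    items = "\n".join(f'    <ClCompile Include="{s}" />' for s in sources)
    return (
        '<?xml version="1.0" encoding="utf-8"?>\n'
        '<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">\n'
        '  <ItemGroup>\n'
        f'{items}\n'
        '  </ItemGroup>\n'
        '</Project>\n'
    )

print(props_for_sources(["src\\foo.c", "src\\bar.c"]))
```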

------
Crinus
I'm not sure how it differs from all the existing tools really, it looks like
it does more or less the same stuff, just slightly differently. Also it does
look like it requires that xmake is itself installed on the target system.

Personally i'd like something similar to autotools, only much saner, in that
you write some sort of script (or definition file or whatever) that describes
your project and its requirements and then the tool generates a configure
shell script for unix and windows (i mean two configure scripts, one for unix
and one for windows) that itself generates a makefile and/or project file.

So, like autotools, the recipient of the code will not need to have your magic
build tool installed, just the common tools available on their system (shell,
make, cc for unix/mingw, visual studio or whatever on windows). This can be
very useful especially for libraries (it is annoying when every library wants
its own special snowflake build tool).

~~~
waruqi
xmake can also generate other project files, e.g. makefile, vsproj,
cmakelist, compile_commands, etc.

    
    
      xmake project -k makefile
      xmake project -k vs2019
      xmake project -k cmakelist
      xmake project -k vsxmake 
      ...

~~~
Crinus
This still requires xmake to be installed. CMake does that too, as does
Premake and a bunch of other tools, but they all need to be installed - except
Autotools.

What i'm talking about is generating a script for something that is part of
the OS itself (like shell script, which is what Autotools does) that itself
generates the Makefile which uses a widely available standard tool (Make). So
a user (i include library users - as opposed to library developers - as
"users" here) won't need to install xmake/cmake/whatever just to build the
program/library, they only need to have shell and make which are available
everywhere. On Unix at least, on Windows it'd need to generate a batch file
for MSVC (this is where Autotools fall short since they're made for unix
only).

------
waruqi
Xmake has its own package management repository:
[https://github.com/xmake-io/xmake-repo](https://github.com/xmake-io/xmake-repo)

And it also supports self-built distributed repositories and third-party
repositories (e.g. vcpkg, conan, clib, homebrew).

    
    
      add_requires("libuv master", "ffmpeg", "zlib 1.20.*")
      add_requires("tbox >1.6.1", {optional = true, debug = true})
      target("test")
        set_kind("shared")
        add_files("src/*.c")
        add_packages("libuv", "ffmpeg", "tbox", "zlib")

~~~
derin
Just as a heads up, you should probably remove the

    
    
      yaourt xmake
    

section of the install section (the Arch Linux instructions).

Yaourt is no longer actively maintained and poses security risks. Just linking
to the AUR package is usually enough.

~~~
waruqi
Ok, I will look at it. Thanks!

------
ensiferum
Unfortunately no build tool can ever really _improve_ productivity in
absolute terms. They can only ever "improve" relatively, i.e. by sucking less
and reducing productivity less than some other competing build tools.

The time that is spent dicking around with these silly build systems/tools is
always time that is wasted and sunk into useless activities without _any_
value added in the end product.

~~~
imtringued
I completely disagree. Do you want to go back to typing g++ commands by hand
for each source file? That's not a build tool and yet it is objectively worse
than even a crappy build tool.

~~~
ensiferum
No, ofc nobody wants to do that and that's not really what I'm saying. I'm
saying that all current build tools suck and can only ever decrease your
software's value and decrease productivity.

Let me illustrate. Your software is a product X. It has some value V (as by
some measure of value, perhaps $ earned). The product X is the ultimate output
of some build tool T.

Now imagine you had this incredible build tool that produced X without any
programmer doing any build related work ever at all. Your build output is X
and value is V. 100% of your developer effort can go towards working on X and
increasing V!

Now in reality you spend some amount of time writing build files, working
with the build, fixing bugs in the build files, generally maintaining it.
This is typically non-zero effort, and the cost grows more than linearly with
respect to the number of build configurations/platforms supported, etc.

However the output of the build tool is still the same X and value is still V.
So your build tool added 0 value!

In fact any build tool can only ever decrease your product's value. Why? How?
Because the cost of messing with the build is non-zero, and you spend time on
it that could otherwise be spent working on actual things that produce more
value (such as adding new features or fixing bugs or whatever). I.e. you can
only put 100% - "build effort" amount of effort towards working on X and
increasing V.

~~~
jcelerier
> Now imagine you had this incredible build tool that produced X without any
> programmer doing any build related work ever at all. Your build output is X
> and value is V. 100% of your developer effort can go towards working on X
> and increasing V!

That is only the case for the simplest of software though. "Build systems" are
often much more than just build systems: they are project management tools.

e.g. quite often I have to perform some preprocessing or code generation.

I could :

\- A: write a $cross_platform_scripting_language script and integrate it to my
build system & CI, ensure that the script interpreter is correctly found, that
the build dependency graph is correct, etc etc. But just getting the same
version of python to work reliably on 5+ different platforms is already quite
a pain.

\- B: write the preprocessor directly in my build system language (in my case,
most of the time CMake).

Option B takes wayyyyy less time in my experience than option A, even though
it looks more "build-related" than option A.

~~~
ensiferum
Right, like I said, some build tool can improve your productivity relative to
some other build tool. But did it create any absolute value in the end
product?

Let's say you first build your product with autotools and you sell it for X$
per unit. Suppose you later port your application to CMake. What value does
CMake create in the end product? Can you sell it for more than X$ on the
basis of having been built with Cmake? Usually no. The build system created 0
absolute value gain. You can claim that, oh, CMake is simpler than autotools
and will save some time in the future, thus creating a gain relative to
autotools. Yet it's not capable of generating absolute value added. In
absolute terms every build tool can only retract effort away from the
activities that produce the value, i.e. software features.

~~~
jcelerier
> What value does CMake create in the end product? Can you sell it for more
> than X$ on the basis of having been built with Cmake? Usually no.

That is true for every technological choice.

> In absolute terms every build tool can only retract effort away from the
> activities that produce the value, i.e. software features.

Again you seem to think that build systems don't contain software features. I
don't think that this is true. My build system helps me to refactor things
that I would have to do by hand otherwise, and which are pretty project-
specific. For instance scanning subprojects for their license information and
generating a .cpp with that info, looking for all uses of a particular token
and listing the files using that token for them to be included somewhere, etc
etc.

------
jauer
Looks cool, but is there a tl;dr for why I’d use this instead of something
like Bazel?

~~~
waruqi
I haven't compared it to bazel, but if you like xmake's description style for
your project, or you like lua, you can try it.

~~~
troysand
You should definitely take a look at bazel. This is not to say bazel is
better than the one you wrote, but clearly bazel is becoming an industry
standard, making other build tools obsolete.

~~~
jandrewrogers
The _de facto_ industry standard is cmake, I have never even seen bazel in the
wild and I work across many companies on big C++ code bases. I primarily use
meson, and have even seen some cmake projects moved to meson, but I am under
no illusions about it replacing cmake.

~~~
new_realist
cmake is mediocre. Bazel is the future.

------
ensiferum
This tool is (again) just rehashing the same old build methods only with
different syntax.

I'd really like to see a build tool that would take things to the next level.

\- As a developer, if I include "foobar.h" in my code, why do I have to work
out the include paths myself? Why can't the build system search for the file
and resolve the include paths? If there are any unresolvable ambiguities,
then ask me.

\- As a developer, if I use for example std::thread, why do I need to
manually add -pthread to my build / linker flags? Why can't the build system
do this for me?

\- As a developer if I use let's say Win32 API in my code why do I have to
manually add the linker libs and flags. Why can't the build system do this for
me?

\- As a developer if I use c++14 features in my code, why do I need to
manually add the -std=c++14 whatever flag in my build files? Why can't the
build system do this for me?

etc.

Everything else is just re-hashing the same old tiring build methodologies
with different syntax/problems/bugs/shortcomings/portability bugs.
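The first item on this wishlist, at least, can be prototyped naively (a toy sketch, not something any of these tools actually does; all names are made up): search the source tree for the included header, derive the include directory from the hit, and punt to the user on ambiguity.

```python
import os

def resolve_include(root, header_name):
    """Find the directory under `root` that would let
    #include "header_name" resolve; a unique hit becomes the -I path,
    while multiple hits are an ambiguity to ask the user about."""
    hits = [dirpath
            for dirpath, _subdirs, files in os.walk(root)
            if header_name in files]
    if len(hits) > 1:
        raise ValueError(f"ambiguous include {header_name!r}: {hits}")
    return hits[0] if hits else None
```

A real implementation would also have to handle the objection raised below: a single hit can still be the wrong header, so some confirmation step would be needed.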

~~~
jcelerier
> If there are any unresolvable ambiguities, then ask me.

what if there is no ambiguity, only a single wrong answer, and now it "works"
but you include the wrong thing ?

> \- As a developer, if I use for example std::thread, why do I need to
> manually add -pthread to my build / linker flags? Why can't the build
> system do this for me?

because -pthread and -lpthread have different semantics (-pthread defines
_REENTRANT, which may change things - grep your /usr/include for it ;p) and
you may want one or the other regardless of whether you use std::thread.

> \- As a developer if I use let's say Win32 API in my code why do I have to
> manually add the linker libs and flags. Why can't the build system do this
> for me?

which linker libs? On my computer I must have 5 versions of those right now,
not counting UWP and various MinGW versions. Also sometimes I want the debug
standard library and sometimes not.

> \- As a developer if I use c++14 features in my code, why do I need to
> manually add the -std=c++14 whatever flag in my build files? Why can't the
> build system do this for me?

determining if you are using C++14 features sounds like a corollary of the
halting problem

------
pornel
There are so many nice build tools for C, but unfortunately very few users
will be happy to install and learn yet another tool.

And no matter which build system you choose, you'll find users who think your
choice is awful and you should have picked their one true build system.

------
tbfly
Great tools, good job!

------
loose_pants
Why not Python? Syntax is much cleaner and there’s nothing you can’t achieve.
Even performance is on par.

~~~
jacobush
Python as in Scons? Otherwise I don't understand your comment.

~~~
bsder
Scons is effectively dead, and I wish people would start removing it from the
Python build systems web pages.

Meson is written in Python and _isn't_ dead. However, Meson requires a
partner to actually build things--something like cmake or ninja.

However, I think the original poster meant "Why Lua instead of Python?" And
the only real answer is "Because that's what the author wanted to use."

~~~
jhasse
btw: There's also Waf written in Python, which also isn't dead.

~~~
mandor
we are also happy with waf, which gives us a lot of flexibility (e.g., using
python libs that are not build-related) and uses a language that the team
already knows. ([https://waf.io](https://waf.io))

