
Show HN: Xmake, a modern C/C++ build utility - waruqi
https://github.com/xmake-io/xmake
======
soapdog
A developer posts a well-rounded tool with documentation that was built with
a lot of care. The comment thread is mostly about other tools, or people
dismissing the work done because another available option is more popular or
common.

When did "Show HN" threads become Shark Tank? Can people at least check out
and post about the tool itself instead of discussing CMake vs Ninja vs Meson?

~~~
fnord123
>When did "Show HN" threads become Shark Tank?

From the 2007 announcement of Dropbox:

[https://news.ycombinator.com/item?id=8863](https://news.ycombinator.com/item?id=8863)

""" 1. For a Linux user, you can already build such a system yourself quite
trivially by getting an FTP account, mounting it locally with curlftpfs, and
then using SVN or CVS on the mounted filesystem. From Windows or Mac, this FTP
account could be accessed through built-in software. """

~~~
DataWorker
Does anybody actually use Dropbox anymore? Feel like it’s a dead product.

~~~
Crinus
I used to, until they blocked the public folder. The sync between my computers
was nice, but for me the big feature was copy/pasting (or directly creating)
an image or static html page(s) with images (or flash files, at some point) in
the public folder, right-clicking in Explorer/Finder (depending on the OS I
used at the time), copying the URL and pasting it on IRC/IM/Reddit/whatever
when discussing stuff.

They tried to sell their 'get public URL' wrappers as the replacement for that
functionality, but those only covered a tiny aspect - sharing photos - which
OneDrive, Google Drive, etc. already offered. Even that was slower, because
they wrapped the files in some sort of viewer instead of serving the raw file
data like the public folder did, and used some sort of hash ID to identify
the file instead of the actual filename (another convenience lost, since with
the public folder I could guess the URL to share without copy/pasting).

I never found anything as convenient as Dropbox's public folder, while the
rest of their offering wasn't anything I cared much about - and even then, by
that time, both Google's and Microsoft's offerings were better (with
Microsoft's available right out of the box in Windows).

~~~
haditab
I agree about the public folder issue. In my experience they still sync much
better than OneDrive and Google Drive, though. For me the syncing is a lot
faster and has fewer issues.

They are also the only ones I know that have a Linux application.

That being said, I don't pay for their premium service because it seems too
expensive compared to OneDrive and Google Drive.

------
2bluesc
Meson[0] has been gaining in popularity and has migration tools for CMake
projects.

Large projects such as systemd and GNOME[1] have migrated, or have been
migrating for years.

[0] [https://mesonbuild.com/](https://mesonbuild.com/)

[1]
[https://wiki.gnome.org/Initiatives/GnomeGoals/MesonPorting](https://wiki.gnome.org/Initiatives/GnomeGoals/MesonPorting)

~~~
ridiculous_fish
Why migrate from CMake to meson?

~~~
kstenerud
CMake feels a lot like C++89: Lots of things you can do, but there are
problems:

* No standardization or opinionated design, so you can't share your work easily.

* No sane defaults, so your build system is always fragile, difficult to maintain, and done wrong.

* No best practices, so people keep making the same mistakes over and over.

* Misguided attempt to remain compatible with the steaming pile of legacy they've accumulated over the years.

* Bad documentation, so there's no way to learn how to do things better.

* Steep learning curve with limited payoff, so most people don't bother.

Meson does some of these things better. It's still not pretty, but it's nicer
to use than CMake.

~~~
ahartmetz
1 and 3 are not true anymore: the new canonical CMake way is targets with
attached dependencies, header search paths, compiler flags and possibly
other things. Bad documentation - well, yeah: it's specifically missing
first-party best-practices and how-to documentation, and the third-party
documentation, as well as the really old first-party documentation in the
CMake wiki, often recommends bad old practices.

You forgot to mention that the language is awful (but that usually doesn't get
in the way IME).
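For readers unfamiliar with that style, here's a minimal sketch of
target-centric CMake (the target names and paths are hypothetical, not from
any real project):

```cmake
# A library target carries its own usage requirements; consumers inherit them.
add_library(mylib STATIC src/mylib.cpp)
target_include_directories(mylib PUBLIC include)      # propagated to users
target_compile_features(mylib PUBLIC cxx_std_17)      # propagated to users
target_compile_definitions(mylib PRIVATE MYLIB_IMPL)  # not propagated

# Linking against the target pulls in its PUBLIC include dirs and flags,
# so the executable needs no repeated include_directories() calls.
add_executable(app src/main.cpp)
target_link_libraries(app PRIVATE mylib)
```

The point is that dependency information attaches to targets rather than
being set through directory-wide variables.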

------
waruqi
Here is an example of an xmake.lua:

[https://github.com/tboox/tbox/blob/master/src/tbox/xmake.lua](https://github.com/tboox/tbox/blob/master/src/tbox/xmake.lua)

------
claudius
What I miss about these tools is some "relatively" straightforward dependency
detection and generation.

That is, I have a bunch of .cpp files which need to be compiled into
individual executables in a folder bin/. I also have a folder inc/ which
contains some headers (.h) and those headers possibly also have some
associated TU (.cpp).

Now g++ can already generate a dependency graph of headers for an executable.
It is then (with a standard Makefile and some supporting bash scripts) quite
straightforward to mangle that graph into a list of translation units (namely
those files whose name matches a depended-on header) which must be compiled
and linked into the executable.

That is, I can simply create a new "executable file" .cpp file in bin/,
include some headers therein and when I say make, the Makefile automagically
figures out which (internal) dependencies it needs to link in when linking the
executable.

Now that I have these "relatively straightforward" scripts and the
corresponding Makefile, the incentive to move to another (nicer) build system
which would require me to rebuild this infrastructure to fit into this other
build system's view of the world is quite low – unless there is some way to do
this directly?

Xmake as shown here (and also Meson linked in a sister comment) appear to
still require manual selection of dependencies.
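For the curious, a rough GNU make sketch of that scheme (the bin/ and inc/
layout follows the comment above; the function names and flags are
illustrative and untested):

```make
# g++ -MM reports the headers a TU includes; any header with a sibling
# .cpp in inc/ becomes an extra translation unit to compile and link.
BINS := $(patsubst bin/%.cpp,bin/%,$(wildcard bin/*.cpp))

all: $(BINS)

# Headers that $(1) depends on, as reported by the compiler:
hdrs = $(filter %.h,$(shell g++ -MM -Iinc $(1)))
# Of those, the ones that have an associated translation unit:
tus  = $(wildcard $(patsubst %.h,%.cpp,$(call hdrs,$(1))))

bin/%: bin/%.cpp
	g++ -Iinc -o $@ $< $(call tus,$<)
```

A real version would compile to object files and cache the dependency
output, but the header-to-TU name matching is the core trick.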

~~~
flohofwoe
How would a tool know, from a header dependency, in which source file the
implementation for the header lives? C and C++ don't require any relationship
between a declaration file and an implementation file. The implementation
could be in an entirely differently named source file, or spread over various
files, mixed with implementation code from other headers, or included right
in the header.

~~~
adrian_b
Generating the dependencies automatically is trivial with gcc and GNU make,
if you just take care to group things adequately in directories and
subdirectories.

I.e. you just have to put all the source files from which you generate the
object files that will go into the same library in a set of directories that
does not contain files which will not go there.

Similarly, all the source files for the object files required for an
executable, except those that are in libraries, should be in a set of
directories.

The sets of directories need not be disjoint; a given set just must not
contain files that have to be excluded when linking a certain target, as that
would make the building process more complex.

Given these constraints, it is possible to write a universal GNU makefile,
usable for any project, which will generate all dependencies automatically.

For any executable or library you want to build, it is enough to write an
extremely small makefile containing 4 lists (of defined symbols, of source
files, of directories with header files, and of libraries) plus the name of
the generated file and its type (executable, shared library, static library).

At the end you include a makefile that is good for any project targeting a
certain CPU + operating system combination.

The per-CPU/OS makefiles must define a few things, e.g. the compiler and
other utilities used, option flags, the locations of the tools and so on, and
then include a single makefile shared by all architectures and operating
systems.

I started using this method more than twenty years ago and I have never
needed to write any dependency information manually.

Whenever I see the huge, intricate and impossible-to-maintain makefiles that
are too frequently encountered in many software projects, I wonder why anyone
is willing to waste so much time on a non-essential part of the project.

From my point of view, building any large software project easily is a
problem solved a long time ago by gcc & GNU make, but for reasons that I
cannot understand, most people choose not to do it the right way.

Of course, having to use, in 2019, a programming language that implements
modules by no better method than including header files is even more
difficult to understand, but I still must use C/C++ in my work, as there is
no alternative for most embedded computers.

~~~
adrian_b
A correction to my message above, where a word was missing: of the 4 lists
that must be written in the makefile, the most important one, as the other
lists can be omitted, is the list of directories with source files (not a
list of source files).

For simple projects the list is reduced to a single source directory.
Whenever you add, delete or rename source files, there is no need to edit the
project's makefile.

All changes are taken into account automatically by GNU make, which can be
instructed to scan the source directories for source files in all the
programming languages that you use.
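A sketch of what the two pieces might look like (every name here is
hypothetical; the real makefiles would be more elaborate):

```make
# --- per-project makefile ---------------------------------------------
TARGET      := myprog          # name of the generated file
TARGET_TYPE := executable      # or shared/static library
SRC_DIRS    := src src/util    # directories scanned for sources
INC_DIRS    := include         # directories with header files
DEFINES     := MYPROG_DEBUG
LIBS        := m pthread
include ../common/rules.mk     # the shared, per-CPU/OS makefile

# --- core of the shared rules.mk --------------------------------------
SRCS := $(foreach d,$(SRC_DIRS),$(wildcard $(d)/*.c $(d)/*.cpp))
OBJS := $(addsuffix .o,$(basename $(SRCS)))
CPPFLAGS += $(addprefix -I,$(INC_DIRS)) $(addprefix -D,$(DEFINES)) -MMD -MP

$(TARGET): $(OBJS)
	$(CC) -o $@ $(OBJS) $(addprefix -l,$(LIBS))

-include $(OBJS:.o=.d)         # dependencies auto-generated by -MMD
```

The -MMD -MP flags make gcc write a .d fragment for every object, so the
dependency graph maintains itself as headers change.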

------
jhauris
This is really great work, and great documentation. It looks like CMake, but
with a full-featured scripting language.

~~~
waruqi
thanks!

------
Game_Ender
Does it have any distributed build or caching support? That is my minimum bar
for a C++ build system. ccache and distcc/icecc are too limited; you want
something integrated with your build system directly.

~~~
waruqi
Distributed builds are planned, but not yet implemented. See
[https://github.com/xmake-io/xmake/issues/274](https://github.com/xmake-io/xmake/issues/274)

------
RcouF1uZ4gsC
For better or worse, though, CMake has won. Many IDEs, including Visual
Studio, can work directly with CMake files. In addition, even Google, which
is famous for doing things its own way, has now added official, first-class
CMake support to its open-source C++ library Abseil:
[https://abseil.io/blog/20190402-cmake-support](https://abseil.io/blog/20190402-cmake-support)

If you are writing an open-source C++ library, even if you support some other
C++ build system, chances are you will have CMake support as well.

While I have no doubt that xmake is easier to use than CMake (just having Lua
instead of CMake's abomination of a language is a great improvement), the
fact that so many libraries and tools already support CMake is going to make
adoption an uphill battle.

~~~
geezerjay
CMake won against the incumbent, which was autotools. Still, it's far from
being an enjoyable tool, and the experience is made even worse by its
god-awful docs.

~~~
shstalwart
Personally, I vastly prefer autotools, both as a user and a developer. When I
got to the point where I needed some kind of build system, I found autotools
much easier to learn than cmake.

As a user, I find the experience with autotools to be much nicer as well. For
whatever reason, the interface just seems more intuitive. I mean, ./configure
--help will tell you basically all you need to know. An underappreciated bonus
is that you don't have to install more stuff just to __build__ some program
you might not even want. I've run into more than one project that required a
specific version of cmake, which as luck would have it, was not the version
packaged with my distro. This leaves you either building another cmake first
or finding a tool/library that isn't so persnickety.

Given the choice between trying a project that uses CMake or autotools, I'll
choose the autotools-based project every time.

~~~
jcelerier
> An underappreciated bonus is that you don't have to install more stuff just
> to __build__ some program you might not even want.

Sorry, what? I remember spending hours in my younger years searching for
which Debian package provided autowhateverflavoroftheday.sh so that I could
build $random internet project.

~~~
jlokier
> > An underappreciated bonus is that you don't have to install more stuff
> just to __build__ some program you might not even want.

> sorry what ? I remember hours in my younger years searching which Debian
> package provided autowhateverflavoroftheday.sh so that I could build $random
> internet project

The whole point of Autotools is that distributed source packages can be built
by themselves, without requiring any part of Autotools to be installed. They
build even on obscure systems that don't have any working version of
Autotools.

If you have to install autoanything to build a random project that uses
Autotools, either you are doing something wrong, or the project is using
Autotools wrong, or maybe the Debian package is using Autotools wrong.

That said, I know what you mean. I've had to seek out a number of different
versions of Autotools just to get some things to build. But that is because a
lot of projects and/or distro packaging blatantly use Autotools differently
than it was designed to be used. I don't think Autotools should be blamed
for this.

~~~
jcelerier
> That said, I know what you mean. I've had to seek out a number of different
> versions of Autotools just to get some things to build. But that is because
> a lot of projects and/or distro packaging blatantly use Autotools
> differently than it was designed to be used. I don't think Autotools
> should be blamed for this.

yes, it absolutely should be. If a tool is misused, it's generally because
it's hard to use correctly. In contrast, if I see a repo with a CMakeLists.txt
I know that it's going to be a simple cmake && make.

~~~
shstalwart
> If a tool is misused, it's generally because it's hard to use correctly.

Citation needed.

Tools get misused all the time. If I use a flat-bladed screwdriver as a pry
bar/chisel/whatever, that doesn't mean the flat-bladed screwdriver is hard to
use.

------
waruqi
Here is the official dependency package repository for xmake:
[https://github.com/xmake-io/xmake-repo](https://github.com/xmake-io/xmake-repo)

------
waruqi
One of the purposes of xmake is to solve the problem of C/C++ dependency
packages.

~~~
geezerjay
So is Conan's[¹], and Conan is already supported by cmake.

[¹] [https://conan.io/](https://conan.io/)

~~~
waruqi
xmake also supports conan, as well as vcpkg/brew.

------
Iwan-Zotow
cmake is NOT a build utility. It is a dependency-tracking and configuration
utility.

~~~
sigzero
"CMake is an open-source, cross-platform family of tools designed to build,
test and package software. CMake is used to control the software compilation
process using simple platform and compiler independent configuration files,
and generate native makefiles and workspaces that can be used in the compiler
environment of your choice."

They would disagree with you.

~~~
geezerjay
CMake is a makefile generator. The output of cmake is a series of makefiles.
That's it. If you need to build a project and you don't have make, nmake,
jmake or whatevermake on your system, then cmake does nothing to get your
project built.

~~~
kstenerud
CMake can generate:

* Borland Makefiles

* MSYS Makefiles

* MinGW Makefiles

* NMake Makefiles

* Unix Makefiles

* Watcom WMake

* Ninja

* Visual Studio projects

* Green Hills MULTI

* Xcode projects

And more: [https://cmake.org/cmake/help/latest/manual/cmake-generators.7.html](https://cmake.org/cmake/help/latest/manual/cmake-generators.7.html)

~~~
geezerjay
> CMake can generate:

Although you're conflating project transcoders with makefiles, nevertheless
that's the whole point of cmake: generate makefiles that are used by some
third-party program to actually build the software.

------
euyyn
Does it produce hermetic builds?

~~~
waruqi
What are hermetic builds?

~~~
tele_ski
Hermetic Builds. ... Our builds are hermetic, meaning that they are
insensitive to the libraries and other software installed on the build
machine. Instead, builds depend on known versions of build tools, such as
compilers, and dependencies, such as libraries.

Kind of like a container for building? I had to look it up myself.

~~~
shaklee3
How would it do that besides building in a container?

~~~
joshuamorton
Look into bazel, pants, and similar hermetic build systems.

You pin all dependencies, and manage flags and toolchains through the build
system itself.
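As a rough sketch of the pinning idea in Bazel (the dependency name, URL and
checksum below are placeholders, not a real package):

```python
# Bazel WORKSPACE sketch: every external dependency is pinned by URL plus
# sha256, so the build does not depend on whatever happens to be installed
# on the build machine.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "com_example_lib",  # hypothetical dependency
    urls = ["https://example.com/lib-1.2.3.tar.gz"],
    sha256 = "0000000000000000000000000000000000000000000000000000000000000000",
    strip_prefix = "lib-1.2.3",
)
```

If the fetched archive's hash doesn't match, the build fails rather than
silently using a different version.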

------
je42
Hmm. cflags and cxxflags are command-line options? I would expect them to be
defined as part of the build file.

~~~
waruqi
You can also define them in the build file (xmake.lua):

    
    
    target("test")
        set_kind("binary")
        add_files("src/*.c", "src/*.cpp")
        add_cflags("-fPIC", "-Dxxx")
        add_cxxflags("-fPIC", "-Dxxx")

The command line arguments just give you a quick and easy way to modify
cflags.

------
phaedrus
I wish CLion supported this as an alternative to CMake for project
definitions!

~~~
waruqi
You can try the CLion/IDEA plugin for xmake:

[https://github.com/xmake-io/xmake-idea](https://github.com/xmake-io/xmake-idea)

------
intea
I'm not a fan of Lua. The syntax of xmake.lua reads somewhat like CMake, but
easier to understand. What I'd really like to see is a build system in Python
(3!) utilizing objects and dictionaries; tasks like this should be a breeze.

~~~
robmccoll
Do you consider waf to meet those criteria?
[https://gitlab.com/ita1024/waf](https://gitlab.com/ita1024/waf)
[https://waf.io](https://waf.io)

Or scons? [https://scons.org](https://scons.org)

~~~
jcranberry
Does waf still force you to be c++11/14?

------
unrealhoang
Now we have N+1 incompatible build systems.

~~~
flukus
I think half of those N's are cmake itself, considering how often I see
"requires cmake x.x or above" messages.

I wouldn't find having many build systems such an issue if they'd just add a
makefile (and maybe ./configure) to call that build system, giving devs a
consistent interface and not having to look up how to do a simple build.

~~~
geezerjay
> if they'd just add a makefile (and maybe ./configure)

That's the whole point of cmake. Instead of running autotools' ./configure
(which in fact is a whole dance involving autoconf, autoreconf, automake, and
whatnot), just run cmake . to get yourself a fancy makefile.

~~~
flukus
> just run cmake . to get yourself a fancy makefile.

Well, most instructions I've seen start off with "mkdir build && cd build &&
cmake ../src", which is a bit more complicated than just "make" with a default
build dir. I'm not sure why they're all like this; I would have thought
supplying a default build directory is something cmake could handle.

On my last (and only) big project with cmake, we ended up with a makefile
anyway to drive all the things cmake couldn't do, or that we couldn't do with
cmake due to inexperience, or some combination of the two. So we ended up with
make calling cmake calling make, all because it apparently made it easier for
IDE users (it didn't, but that was definitely our fault).

~~~
tom_
I use Make to automate the initial CMake setup step - though by "Make" I
probably mean something more like "glorified shell script", as the Makefile in
question consists entirely of phony targets. It detects the OS with the usual
bunch of ifs, and by default does one of 3 things:

- Windows - generates a VC++ project

- OS X - generates an Xcode project

- Unix (other) - generates Ninja files for Debug/RelWithDebInfo

(Typically there's also the option to generate Ninja files on OS X if you like
- good for automated builds, and/or if you'd just rather use some other editor
(which is not unreasonable).)

Once it finishes, on Windows you load "build/vs2017/whatever.sln" into VC++;
on OS X you load "build/xcode/whatever.xcodeproj" into Xcode; on Unix (other)
you change to "build/Debug" or "build/Release" and run "ninja". And off you
go. After that, it all just kind of runs itself.

The Makefile consists of basically stuff like this:

    
    
    .PHONY:unix
    unix:
            rm -Rf build/Debug
            mkdir -p build/Debug
            cd build/Debug && cmake -G "Ninja" -DCMAKE_BUILD_TYPE=Debug ../..
            rm -Rf build/Release
            mkdir -p build/Release
            cd build/Release && cmake -G "Ninja" -DCMAKE_BUILD_TYPE=RelWithDebInfo ../..
    

plus some ifs to ensure the right target(s) are available depending on host
OS.

(I've found it beneficial to regenerate everything entirely from scratch each
time in the Makefile - ensures you're always working from a clean slate, with
no cached variables sticking around from old runs. The odd package does have
an unusually time-consuming configuration process, but I've always ended up
managing to bypass these somehow - it's possible a future revision of my
"process" will have to actually address this properly.)

This process does confuse people who don't read the instructions: they type
"make", some stuff happens, and then nothing. But I've found it to work well
enough.

------
peterownen2
I really love it! When did you start the project?

------
_pmf_
What rubs me the wrong way is that a lot of build systems have a fatal
combination of unfamiliar syntax and complete lack of debuggability.

Conan and Meson seem so much better in that regard.

~~~
geezerjay
Conan is orthogonal to the choice of build system, as in fact Conan's main
choice of build system is cmake.

------
caikelun
great!!

------
fxfan
This looks thoughtfully created (and so well documented!). I haven't gone
through the entire doc and am not entirely clear, but can you cross-build
too? How would you run the MSVC linker on Linux?

~~~
waruqi
Cross-builds are supported, but you need to install mingw on Linux if you
want to build a win32 program there.

See [https://xmake.io/#/home?id=cross-compilation](https://xmake.io/#/home?id=cross-compilation)
and [https://xmake.io/#/home?id=mingw](https://xmake.io/#/home?id=mingw)

------
honey1988
Great!

------
kayamon
Don’t ever install software by piping arbitrary remote scripts into bash.

~~~
bluejekyll
Assuming the script is hosted on an https site and is a popular tool (like
brew or rustup) from a trusted source, why would this be any more dangerous
than downloading and installing from a package manager?

What would it take, beyond https and a well-known site, to make _you_
comfortable doing this?

~~~
adev_
> Assuming the script is hosted on an https site and is a popular tool (like
> brew or rustup) from a trusted source, why would this be any more dangerous
> than downloading and installing from a package manager?

Most package managers use an offline signature mechanism applied at build
time (rpm, dpkg, nix) and do not rely on HTTPS for anything beyond protection
against eavesdropping.

Relying purely on HTTPS is insecure. Nothing guarantees that your source /
script / package did not get hacked or modified between the time you uploaded
it and the time your user downloaded it.

This is not a hypothetical scenario; it has already happened in the past with
sites like SourceForge.
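A small illustration of the difference: fetch first, check a checksum
published out of band (or a gpg signature), and only then execute - the step
that piping curl into bash skips entirely. The files below are local
stand-ins for a real download and its published checksum:

```shell
# Stand-ins for the downloaded installer and its published checksum; in
# reality both would come from the project's release page, with the checksum
# or signature ideally obtained over a separate channel.
printf 'echo installed\n' > install.sh
sha256sum install.sh > install.sh.sha256

# Verify the file against the checksum before executing it.
sha256sum -c install.sh.sha256 && sh install.sh
```

A signed package goes one step further: the signature is made offline at
build time, so even a compromised download server can't alter the payload
undetected.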

------
godman_8
It has an emoji in the title...I'm sold.

