
Build Tools – Make, no more - _superposition_
http://hadihariri.com/2014/04/21/build-make-no-more/
======
vinkelhake
I find these posts somewhat amusing. We've got people who (rightfully)
question the tools they use and look for alternatives. They then discover Make
and have some kind of zen Unix moment that they want to share with the world.

If what you are doing in your flavor-of-the-month build tool translates to a
roughly equivalent number of lines in Make, then yes, you should probably look
at using Make. But the thing is, Make is stupid; it doesn't know a lot.
Sometimes that is a good thing, sometimes it is not.

I've written about this before on HN: I mostly program in C++ and when I build
my stuff I want a build tool that understands things like header dependencies,
optimization levels, shared libraries etc. It's a bonus if my build files are
portable.

My point is that these alternative tools often strive to raise the abstraction
level and the reason people use them isn't necessarily because they haven't
discovered Make.

~~~
badman_ting
I find them irritating for the same reason.

It reminds me of the jQuery cycle: use jQuery for everything -> decide that
depending on frameworks is lame -> use "vanilla JS" for everything -> realize
this requires polyfills and various other inconvenient, inelegant things ->
either go back to using jQuery, or gain a much deeper understanding as to why
everyone uses it.

~~~
shadowmint
I doubt the analogy is apt.

Make is not an amazing (albeit slightly bloated) meta tool that solves all your
problems on all platforms (albeit slowly).

Make _is_ vanilla javascript, with all the bumps and hassles of not working
correctly across platforms, obscure syntactic oddities, and operations only
partially supported in newer versions (which may or may not be available on
various platforms).

The newer build tools are trying to do _exactly what jQuery does_: abstract
away those rough edges for consistent build behavior with better syntax.

Going back to make is the 'use "vanilla JS" for everything' step in your list
above, not the final step.

~~~
ben336
That was exactly the analogy the gp was making...

Make -> VanillaJS

Grunt/Rake/whatever -> jQuery

~~~
jmstriegel
Taking the analogy a bit further, a vanilla proponent might put Make in the
second category as well. Funny that modern web development requires a build
process to change a line of CSS.

------
Too
This might go a bit off topic, but I have to bring it up, since 9 out of 10
make tutorials on the internet make the same horrific mistake you just did (11
out of 10 code bases out in the wild as well).

In your Makefile example the .o files depend only on the .cpp files, not on
the header files they include, nor on the headers those headers include, and
so on. This means nothing will be recompiled or relinked if, for example, a
constant in a header file changes! Changed function signatures will give you
cryptic linker errors with the standard solution "just try make clean first".

To solve this you can either manually update the Makefile every time _any_
file changes the files it includes, which almost defeats the purpose of having
an automatic build system, or use automatic dependency generation by invoking
your compiler with a special flag (-MMD for GCC), and suddenly make isn't as
simple anymore as you laid it out to be. In conclusion, your build tool must
be aware of all the same inclusion rules as your compiler's preprocessor, or
be given that information somehow. Maybe it's better to just use something
designed for your particular toolchain that comes bundled with this
knowledge?
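For reference, the -MMD approach can be sketched in a few lines of Makefile
(file names are illustrative; -MMD/-MP are GCC/Clang flags):

```make
# Ask the compiler to emit a .d fragment per object, listing its real includes.
CXX      = g++
CXXFLAGS = -MMD -MP
OBJS     = main.o factorial.o hello.o

hello: $(OBJS)
	$(CXX) $^ -o $@

# Pull in the generated dependency files; silently ignored on the first build.
-include $(OBJS:.o=.d)

clean:
	rm -f *.o *.d hello
```

After the first build, editing any header listed in a .d file correctly
triggers a rebuild of the objects that include it.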

~~~
humanrebar
Right. Make is mostly a kludge around the nonexistent module system in C and
C++.

It's so bad (specifically due to the way file preprocessing works), that you
need to have large parts of a compiler to accurately determine what the
dependencies of a source file are.

This is why a decent module system should be the top priority for C++17,
though it doesn't look likely so far.

~~~
conradev
Have you seen what Clang is doing[1]?

[1]
[http://clang.llvm.org/docs/Modules.html](http://clang.llvm.org/docs/Modules.html)

------
gyepi
1. Since make has builtin suffix rules, the Makefile could be simplified to:

    
    
        CXX=g++
        CC=$(CXX)  # make's built-in link rule uses $(CC), not $(CXX)
    
        hello: main.o factorial.o hello.o
    
        clean:
            rm -rf *o hello
    

2. Shameless plug: he didn't mention redo [1], which is simpler than make and
more reliable. The redo scripts comparable to the Makefile would be:

    
    
        cat <<EOF > @all.do
        redo hello
        EOF
    
        cat <<EOF > hello.do
        o='main.o factorial.o hello.o'
        redo-ifchange $o
        g++ $o -o $3
        EOF
    
        cat <<EOF > default.o.do
        redo-ifchange $2.cpp
        g++ -c $1 -o $3
        EOF
    
        cat <<EOF > @clean.do
        rm -rf *o hello
        EOF
    

[Edit: Note that these are heredoc examples showing how to create the do
scripts.]

These are just shell scripts and can be extended as much as necessary. For
instance, one can create a dependency on the compiler flags with these
changes:

    
    
        cat <<EOF | install -m 0755 /dev/stdin cc
        #!/bin/sh
        g++ -c "\$@"
        EOF
    
        # sed -i 's/^\(redo-ifchange.\+\)/\1 cc/' *.do
        # sed -i 's}g++ -c}./cc}' *.do
    

The sed calls could be combined; they are separated here for readability.

[1] [https://github.com/gyepisam/redux](https://github.com/gyepisam/redux)

~~~
pekk
redo might be simpler and more reliable, but shell isn't, and redo encourages
even more work to be done in shell. Additionally, the redo version is more
verbose and harder to read. While fancier tasks will make Make's version look
horrible relatively quickly, they won't make redo's version look any better.

~~~
malkia
To this day I still don't understand redo (I'm just staring at it, and don't
get anything) - haven't really read the internals.

With make it was easier for me to grasp the idea (or maybe I was simply 20
years younger then).

~~~
pmahoney
I think the big difference between redo and make is that make requires
knowledge of dependencies up front, and this is sometimes tricky to get right.

"as you can see in default.o.do, you can declare a dependency _after_ building
the program. In C, you get your best dependency information by trying to
actually build, since that's how you find out which headers you need. redo is
based on the following simple insight: you don't actually care what the
dependencies are before you build the target; if the target doesn't exist, you
obviously need to build it. Then, the build script itself can provide the
dependency information however it wants; unlike in make, you don't need a
special dependency syntax at all. You can even declare some of your
dependencies after building, which makes C-style autodependencies much
simpler."

[https://github.com/apenwarr/redo](https://github.com/apenwarr/redo)
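As a concrete sketch of that idea (adapted from the pattern in apenwarr's redo
documentation; assumes the redo tools and g++ are installed, and uses the same
file names as the examples upthread), a default.o.do can discover its own
header dependencies while building:

```sh
# default.o.do: build $2.cpp into $3, then declare the headers the compile
# turned out to depend on by parsing the compiler's -MD output.
redo-ifchange $2.cpp
g++ -MD -MF $2.d -c -o $3 $2.cpp
read DEPS <$2.d
redo-ifchange ${DEPS#*:}
```

The dependency declaration happens on the last line, after the build, using
information only the compiler could know.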

------
tikhonj
A build DSL solves the problem of making your build rules and systems _first-
class citizens_. It's not just learning a new syntax—in fact, since you're
embedding into a known language, it isn't even that new—it's about getting
more control. You can pass rules around, modify them and do whatever
arbitrarily complex tasks you need in a natural, straightforward way using
your favorite programming language. You don't have to contort yourself and
bend over backwards to fit the logic you want into Make's limited and peculiar
language.

Your build system is an integral part of your whole program and you want to
treat it just like any other code. This means refactoring, this means
modularity, this means libraries, this means no copying and pasting... All
this is far easier with a system embedded in your main language than in Make.
You can use your existing tooling, debuggers and frameworks to support your
build system. If you're using a typed language, you can use the types to both
constrain and guide your build files, making everything safer.

Using an embedded DSL integrates far better with the rest of your ecosystem
than relying on Make.

Apart from making the logic _of_ your build system easier to describe and
maintain, an embedded DSL also makes arbitrary meta-tasks easier. You might
want to monitor random parts of your build process, report to different
services, connect to different front-ends (an IRC bot, a CI system...) and
maybe even intelligently plug into the features of your main language.
Wouldn't it be great to have a make system that's deeply aware of how your
server is configured, how your type system works, what your compile-time
metaprogramming is doing, and so on?

You could just glue together a bunch of disparate scripts with a Make file. Or
you could use a DSL and call these services through well-defined, maybe even
_typed_ interfaces! No need for serializing and deserializing: you can keep
everything inside your system.

Sure, if you're just going to use your DSL as a different syntax for Make,
you're not gaining much. But it allows you to do far more in a far better way,
while fitting in more naturally with the rest of your code. I'm definitely all
for it!
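As a toy sketch of that point (a hypothetical API, not any particular tool):
in an embedded DSL, rules are ordinary values you can wrap, compose, and pass
around like any other code.

```python
from dataclasses import dataclass, field


@dataclass
class Rule:
    """A build rule as a first-class value: a target, its deps, an action."""
    target: str
    deps: list = field(default_factory=list)
    action: object = None

    def run(self, done=None):
        # Resolve dependencies depth-first, running each rule at most once.
        done = set() if done is None else done
        if self.target in done:
            return
        for dep in self.deps:
            dep.run(done)
        if self.action:
            self.action(self)
        done.add(self.target)


log = []


def with_logging(rule):
    # Because rules are plain values, "modify them" is just a function call:
    # wrap the original action with a logging step and return a new rule.
    inner = rule.action

    def action(r):
        log.append(r.target)
        if inner:
            inner(r)

    return Rule(rule.target, rule.deps, action)


compile_o = with_logging(Rule("main.o"))
link = with_logging(Rule("hello", deps=[compile_o]))
link.run()
print(log)  # ['main.o', 'hello'] -- dependency order
```

Refactoring, deduplication, and typed interfaces then come for free from the
host language, which is the point being made above.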

~~~
feca
Can you show an example of what you are describing? It doesn't sound
interesting for the tasks I have in mind, so it must be the case that you are
dealing with very complex tasks.

~~~
ICWiener
See ASDF, for example
([http://common-lisp.net/project/asdf/](http://common-lisp.net/project/asdf/)).

------
joeld42
I think everyone goes through a phase where they try to find the perfect build
tool, and then at least entertain the idea of writing one themselves.

Eventually, you grow out of it. There's a lot of build tools, each are better
at some things than others. It's not that much grunt work to convert things
from one to another (even very large projects). If your build tool is working
for you, leave it alone. If it's getting in your way or slowing things down,
try another one. Move on.

------
BoppreH
I think one reason is that Make is built on shell, which is always one step
(and one letter) away from hell.

For example:

    
    
        clean:
             rm -rf *o hello
    

Did you really mean to erase all files and directories that end in "o"? Let's
say it's just a typo and fix it: "*.o".

Now, are you sure it'll handle files with spaces in the name? What about
dashes, brackets and asterisks? Accents? What if a directory ends in .o?
Hidden files?

This specific case may support all of the above. But if it doesn't, what will
happen? How long until you notice, and how long still to diagnose the problem?
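The difference is easy to demonstrate in a scratch directory (a sketch; the
file names are made up):

```sh
# Show what "rm -rf *o" would actually match: far more than object files.
tmp=$(mktemp -d) && cd "$tmp"
mkdir video                   # a directory whose name ends in "o"
touch main.o hello demo.txt   # an object file, a binary, an unrelated file
echo *o       # hello main.o video  -- all three would have been deleted
echo *.o      # main.o              -- only the object file
cd / && rm -rf "$tmp"
```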

Just like I prefer static strong typing when developing non-trivial projects,
the build system should be more structured. I agree endless reinventing is
tiring, but it may have some credit in this case.

~~~
idlewan
I'd expect all developers using make to know about this and never have this
problem, thanks to one simple thing: sticking with sensible names (no spaces,
no brackets, no stars or other special characters in the name; hello
underscores!).

It's an easy rule.

> Just like I prefer static strong typing...

You probably don't use any special chars or spaces for identifiers in whatever
language you're programming in. This is just applying a similar rule to the
files of your project.

~~~
BoppreH
For source code files I agree completely; but the build system will encounter
other types of files that aren't so strict.

Maybe you downloaded something and it came with a bracket because that was in
the page title. Or you copied a duplicate file and your system helpfully
appended " (2)" at the end. Or there was an Excel file updated by someone less
technical who didn't know they had to strip accents from words in their native
language (possibly losing meaning). Or someone saved their "Untitled
Document.txt". Or you needed to include a date in the directory name. Or you
are just human and didn't mean to break the build by pressing the biggest
button on your keyboard when saving a file.

And remember "break the build" here is not "a red light flashes and you get an
email". It means you get unknown behavior throughout the process, including
security features and file removal.

Strict rules for source code file names are good because names usually bleed
into the language itself. Python file names _become_ identifiers when you
import them. Identifiers in turn are strict because parsing is strict, and
there are many good reasons for strict parsing in general purpose languages.

Lacking accent support in file names, as some very popular software does, is
terrible. Lacking support for spaces is just atrocious.

I love shell, I use it daily for one-off tasks, but I don't think it's a good
fit to manage the build system of a project.

------
ThePhysicist
For me, the only justification for using a language-specific build tool (e.g.
grunt, rake, paver, ...) is when you actually want to exchange data with a
library / program written in that language. On the other hand, you could
probably accomplish the same effect using environment variables, with the
upside of having a cleaner interface.

For those that are curious which build tools exist for Python, here's an
(incomplete) list:

* pyinvoke ([https://github.com/pyinvoke](https://github.com/pyinvoke)) - claims to be the successor of fabric, pretty solid and well-maintained

* fabric ([http://www.fabfile.org/](http://www.fabfile.org/)) - not actually a build tool but often used as one

* paver ([http://paver.github.io/paver/](http://paver.github.io/paver/)) - no longer actively maintained

* doit ([http://pydoit.org/](http://pydoit.org/)) - one of the few tools that actually support monitoring file states (like the original make)

* distutils ([https://docs.python.org/2/distutils/](https://docs.python.org/2/distutils/)) - not actually a "universal" build tool but intended for distributing Python packages

~~~
SoftwareMaven
You forgot buildout[1], which is probably more than a build system, perhaps
putting a toe into the configuration management world.

1. [http://www.buildout.org/](http://www.buildout.org/)

Documentation can be challenging to find, and it isn't the most actively
developed project in the world, but what it does, it does pretty well
(including supporting more than python dependencies).

~~~
ThePhysicist
Thanks for posting the link! I hesitated to include configuration management
tools since the use case is not the same. There's a lot of interesting stuff
going on there though: with Saltstack and Ansible we have two serious Chef
contenders for Python now.

------
rafekett
The last 10 years in build tools have felt like one step forward, two steps
back. I like being able to write tasks in a language other than Makefile.
However, it seems like many of the new popular options (cake, grunt, etc.)
don't do what, to me, is Make's real purpose: resolve dependencies and only
rebuild what's necessary. New task runners have either eliminated or
pigeonholed the (typically one-to-one in make-land) correspondence between
tasks and files, meaning the build system can't be intelligent about which
tasks to run and which not to.

Computers are fast enough that this doesn't often bother me anymore, but I've
run across some huge Rakefiles that could benefit from a rewrite in Make.

~~~
chrismonsanto
> however, it seems like many of the new popular options (cake, grunt, etc.)
> don't do what, to me, is Make's real purpose: resolve dependencies and only
> rebuild what's necessary.

You might like tup[1]. Its killer feature is that it automatically determines
file-based dependencies by tracking reads and writes (using a FUSE
filesystem). It has an extreme emphasis on correct, repeatable builds, and is
very fast. Other stuff:

- does work in parallel, and will let you know if your build isn't parallel
safe. (Note it is NOT relying on your specification of dependencies: even if
you manually specify dependencies, it will tell you if something's wrong based
on what it actually observes your dependencies to be.)

- tracks changes to the build script and reruns if the commands change.

- cleans up old output files automatically if build rules are removed.

- lets you maintain multiple build variants (say for different architectures,
configurations, etc.)

- autogenerates .gitignore files for your build output

- very easy to get started, and "Just Works".

- for advanced usage, it is scriptable in Lua.

I've tried every build system out there. For Unix-y file-based build tasks,
tup is, by far, the best. I don't know why it isn't more well known.

[1]: [http://gittup.org/tup/index.html](http://gittup.org/tup/index.html)

~~~
rkrzr
I was already sold on tup after reading the first paragraph comparing it to
make[1]:

"This page compares make to tup. This page is a little biased because tup is
so fast. How fast? This one time a beam of light was flying through the vacuum
of space at the speed of light and then tup went by and was like "Yo beam of
light, you need a lift?" cuz tup was going so fast it thought the beam of
light had a flat tire and was stuck. True story. Anyway, feel free to run your
own comparisons if you don't believe me and my (true) story."

[1]:
[http://gittup.org/tup/make_vs_tup.html](http://gittup.org/tup/make_vs_tup.html)

~~~
andrewflnr
My favorite is the "Tup vs Mordor" benchmark.

------
flohofwoe
Another vote for higher-level meta-build-systems like cmake, premake or scons
(I'm using cmake because it has very simple cross-compilation support). My
personal road to build-system nirvana looked like this; I'm sure it's fairly
common:

- Started using hand-written Makefiles and autoconf. Then someone wants to
build on Windows, in Visual Studio no less. Add manually created VStudio
project files to the project. Then someone wants to use Xcode, so add manually
created Xcode project files. Now you add files, or even just need to change a
compiler option: fix the options in the Makefile, open the VisualStudio
project, fix the options there, open the project in Xcode, fix the options
there. Depending on the project complexity, this can take hours. The next guy
needs to build the project in an older VisualStudio version, but the project
files are not backward compatible...

- The next step was to create my own "meta-build-system" in TCL (this was
around 1999), which took a simple description of the project (what files to
compile into what targets, and the dependencies between targets) and created
Makefiles, VStudio files and Xcode files. This worked fine until the target
project file formats changed (happens with every new VisualStudio version).

- Someone then pointed me to cmake, which does exactly that but much better
(creates Makefiles, VStudio and Xcode projects, etc. from a generic
description of the sources, targets and their dependencies), and I've been a
fairly happy cmake user since then.

- Recently I started to wrap different cmake configurations (combinations of
target platform, build tool/IDE to use, and compile config (Release, Debug,
etc.)) under a single python frontend script, since there can be dozens of
those cmake configs for one project (target platforms: iOS, Android, OSX,
Linux, Windows, emscripten, Google Native Client; build tools: make, ninja,
Xcode, VStudio, Eclipse; compile configs: Debug, Release). But the frontend
python script only calls cmake with the right options, nothing complicated.

Of course now I'm sorta locked-in to cmake, and setting up a cmake-based
build-system can be complex and challenging as well, but the result is an easy
to maintain cross-platform build system which also supports IDEs.

In general I have far fewer problems compiling cmake-based projects on my
OSX and Windows machines than autoconf+Makefile-based projects.

[edit: formatting]
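For anyone curious what the entry cost looks like, a minimal CMakeLists.txt
for the article's hello example might be (a sketch; the source file names are
assumed from upthread):

```cmake
cmake_minimum_required(VERSION 2.8)
project(hello CXX)

add_executable(hello main.cpp factorial.cpp hello.cpp)

# The same file then drives any generator, e.g.:
#   cmake -G "Unix Makefiles" .
#   cmake -G Xcode .
#   cmake -G "Visual Studio 12" .
```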

~~~
greggman
I agree with this. As a cross-platform (i.e. Windows + OSX + Linux) person who
enjoys Visual Studio (Xcode less so), I need more than make.

My own experience is with gyp and ninja
([http://martine.github.io/ninja/](http://martine.github.io/ninja/)), which
the Chromium team uses to build for Windows, OSX, Linux and Android (and maybe
iOS?)

Of course for personal projects I'll probably never notice the speed
difference but for bigger ones Ninja is FAST.

~~~
mpyne
CMake outputs to Ninja as well. It's the only way I know how to use Ninja, in
fact.

------
asb
I've recently been playing with ninja, which does a good job of not being
'just another make'
[http://martine.github.io/ninja/](http://martine.github.io/ninja/). To quote
their website, "Where other build systems are high-level languages Ninja aims
to be an assembler.". It's used as a backend for GYP (Google Chromium) and is
supported by CMake as well. I've had good success generating the files
manually using something like ninja_syntax.py:
[https://github.com/martine/ninja/blob/master/misc/ninja_synt...](https://github.com/martine/ninja/blob/master/misc/ninja_syntax.py).
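For a flavor of how low-level Ninja's syntax is, a hand-written build.ninja
for the hello example might look like this (a sketch; rule names and the
dependency-file setup are illustrative):

```ninja
cxx = g++

rule compile
  command = $cxx -MMD -MF $out.d -c $in -o $out
  depfile = $out.d
  description = CXX $out

rule link
  command = $cxx $in -o $out
  description = LINK $out

build main.o: compile main.cpp
build factorial.o: compile factorial.cpp
build hello.o: compile hello.cpp
build hello: link main.o factorial.o hello.o
```

Every edge is spelled out explicitly, which is exactly why generators like
GYP and CMake target it rather than humans writing it directly.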

I also note that Google is working on a successor to GYP, GN, which targets
Ninja:
[http://code.google.com/p/chromium/wiki/gn](http://code.google.com/p/chromium/wiki/gn).

~~~
evmar
Thanks for the plug! In line with the original post, I'll add that the Ninja
manual has a section where we try to convince you to _not_ use Ninja and
instead use a more common build system:
[http://martine.github.io/ninja/manual.html#_using_ninja_for_...](http://martine.github.io/ninja/manual.html#_using_ninja_for_your_project)

------
luckydude
I'm 52 years old. I've had this discussion with dmr, srk, maybe with wnj.

All I know is for years, decades, I carried around the source to some
simplistic make. I hate GNU make, I hate some of the unix makes. I loved the
simple make.

The beauty of make is it just spelled out what you needed to do. Every darn
time make tried to get clever it just made it worse. It seemed like it would
be better and then it was not.

Make is the ultimate less is more. Just use it and be happy.

------
geuis
Dunno if the owner of the site will read this, but here's a tip. Don't show a
full screen overlay telling me how my visit would be better with cookies
enabled.

1) I have cookies enabled. 2) The European law is daft, but since you feel you
must comply, do it in a more user-friendly way.

~~~
hhariri
I know. I don't agree with it much either, and I found this the least
intrusive option (I wasn't aware of the issue on mobile). If I can find a
better solution, I will change it.

Thanks.

~~~
username223
In what way would my reading that blog post have been a better experience for
me with your tracking cookies?

------
msluyter
Nothing against make, but I've found that it feels really nice when the
majority of your toolset uses the same language. This is what I liked about
Rails. Rails is ruby. Bundler is ruby. Rake is ruby. It's all ruby, which
allows for a certain synergy, streamlined feel, and less cognitive overhead. I
don't blame the js folks for attempting something similar.

~~~
Cyranix
Agreed. Mixing languages is fine when necessary, but a single language is
usually preferable to me. My dev team has been using Grunt in projects thus
far and has been pleased with Gulp in small experiments.

------
apples2apples
Misses the fundamental point that Make is broken for so many things. To begin
with you have to have a single target for each file produced. Generating all
the targets to get around this is a nightmare that results in unreadable debug
messages and horribly unpredictable call paths.

Nix tried to solve much of this, but I agree it can't compete with the
bazillion other options.

~~~
webjprgm
It does not miss it, just ignores it. The author states that there are lots of
things we could improve, but the point is that we have too many variations on
the theme without converging on a solution that has few (or no) dependencies,
comes with built-in build knowledge, and can discover what you want rather
than making you declare it.

Such a tool should be:

- Zero (or few) dependencies. Likely written in plain C (or C++, D, Rust) and
compiled to distribute in binary form.

- Cross-platform.

- Able to support any mix of project languages and build tasks.

- Able to recognize standard folder hierarchies for popular projects.

- Easy enough to learn. Not overly verbose (looking at you, XML). Similar to
Make if possible.

Examples of the auto-discovery: It can find "src", "inc", and "lib"
directories then look inside and see .h files then make some educated guesses
to build the dependency tree of header and source files (even with mix of C
and C++). Or it could see a Rails app and figure out to invoke the right Rake
commands, perhaps checking for the presence of an asset pipeline etc. Or a
Node.js project. It could check for GIT or SVN and make sure any sub-modules
have been checked out.

~~~
danielweber
The dependencies thing is a killer. I remember a Windows developer co-worker
insisting that _everyone_ had the .NET runtime installed, and after shipping
it turned out that most of our customers _didn't_ have it installed, to which
he finally said, "well, _I_ always have it installed." (To be fair, I should
have pressed him harder, and I did ask the question twice, but because I'd
never built against the runtime I was unprepared for any challenge.)

Almost every new project I download starts with a sad, manual, and
demoralizing installation of a bunch of third-party stuff that you have to
google to find out what's missing. And it's not educational at all, because in
a few years all these tools will be obsolete.

(The best project I ever encountered was the Stripe CTF, which almost always
used just one command to install a complete working copy of _everything_ you
needed and didn't have. I'm still impressed with that.)

------
daemin
I think the main problem with these articles is that the examples given are
exceedingly simplistic, and hence in no way represent real-world build
systems. It's very easy to make a build system look nice and clean for trivial
examples; where it breaks down is when the software it builds gets more
complicated and the hacks and extra code that get added turn the build system
into a big mess.

------
shoo
I've been thinking a lot about build systems lately. I enjoy the discussion
that this post has provoked. The post itself is weaker than it could have
been, in that it does not stick to a single example when comparing build
tools, and does not pin down any criteria for distinguishing between build
tools.

If you are interested in a comparison of a few interesting build tools, please
check out Neil Mitchell's "build system shootout":
[https://github.com/ndmitchell/build-shootout](https://github.com/ndmitchell/build-shootout).
Neil is the author of the `Shake` build system. The shootout compares `Make`,
`Ninja`, `Shake`, `tup` and `fabricate`.

Another possibly interesting build tool is `buck`, although it is primarily
aimed at Java / Android development. See
[http://facebook.github.io/buck/](http://facebook.github.io/buck/). There's a
little discussion about `gerrit`'s move to `buck` here:
[http://www.infoq.com/news/2013/10/gerrit-buck](http://www.infoq.com/news/2013/10/gerrit-buck).

Here are some questions I'd ask of a build system:

- is it mature?

- which platforms does it support?

- which language ecosystems does it support? (language-agnostic? C/C++? ruby?
python? java?)

- does it support parallel builds?

- does it support incremental builds?

- are incremental builds accurate?

- is it primarily file-based?

- how does it decide when build targets are up-to-date, if at all? (e.g.
timestamps, md5 hash of content, notification from the operating system)

- does it allow build scripts for different components to be defined across
multiple files and handled during the same build?

- does it enforce a particular structure upon your build scripts that makes
them more maintainable?

- how does it automatically discover dependencies, if at all? (e.g. parsing
source files, asking the compiler, builds instrumented via FUSE/strace)

- how easy is it to debug?

- is it possible to extend in a full-featured programming language?

- does it let you augment the build dependency graph mid-way through
execution of a build?

- how simply can it be used with other tools such as your chosen continuous
integration server, test framework(s), build artifact caches, etc.?

Many of these criteria are completely overkill for trivial build tasks, where
you don't really need anything fancy.

------
sheetjs
One big advantage of vanilla Make is the community. There are some very nice
tools that work well with make (such as
[https://github.com/mbostock/smash](https://github.com/mbostock/smash)).

~~~
ztratar
I love documentation that has humor, as long as it doesn't get in the way.

What's special about the Make community as opposed to the Grunt or Gulp
communities?

~~~
pekk
For that matter, what's special about the Grunt or Gulp communities?

------
drawkbox
At least we are moving in the direction of Grunt/Gulp rather than a maven sort
of direction. Many lives were lost to maven, somewhat of a Vietnam of build
tools. You might think you are a Java developer with it, but truly you are a
maven servant.

------
Xorlev
This post rather misses that while Make is simple, making Make do all the
things we're used to (e.g. Java dependency management) is not as simple.

I'd like to think people have decided that it's easier to replicate the task
part of Makefiles onto their environment as the simpler alternative to making
dependency management and various other language-specific tasks available to
make.

------
Zelphyr
Never underestimate a young developer's need to reinvent reinventing the
wheel.

------
malkia
Make is the "assembly" language for build systems. Qt's qmake and cmake target
it, and the output produced is horrible, but then you use "make -j8" (or Qt's
own "jom.exe -j8" as a replacement for the underperforming nmake.exe) and you
are all set.

------
arrowgunz
Make is still the best build tool for me.

[https://algorithms.rdio.com/post/make/](https://algorithms.rdio.com/post/make/)

------
josephschmoe
I've never felt hindered by Gradle.

Hindered by the fact I can't add an arbitrary github repo through Gradle? Yes.
That seems like it should be solvable though...

~~~
twic
Add an arbitrary GitHub repo to what? In what capacity? You mean as a source
dependency, an artefact repository, what?

------
demallien
Heh. The one and only time that I ever wrote a parser in my professional
career was for a build tool. In my defence, at the time I didn't know much
about command line tools, and had only really programmed in IDEs. So when the
new project was to be compiled on the command line, I quickly discovered that
maintaining dependencies, changing targets and doing all the other things that
a build system generally does by hand quickly gets old. Not knowing that
autotools, cmake, ant, and about a bajillion other tools already existed to do
just this, I wrote my own language, with a parser in ruby, no less :D

I have since repented. I find that autotools (with occasionally a script in
[ruby|python|perl] to handle something that would otherwise be tricky to do in
make or m4, which is then called by the makefile) works a treat. Just don't
try to do anything tricky in the autotools files; as I said, boot anything
exotic out to a separate tool.

Also, any discussion of build tools without also discussing package management
is but half a discussion.

------
eponeponepon
I'm unreasonably fond of Ant - there's plenty of scope for pointless clever-
dickery, and there are days where that's all that keeps me going!

Nice to see it mentioned in a context other than "oh god what a mess"... even
though, in fairness, many aspects of it are a complete dog's dinner.

~~~
revscat
Ant is a Turing-complete language in XML.

That is horrifying.

It is bloated, difficult to read, tends towards duplication. It also doesn't
do dependency management all that well, doesn't cache build results (so it
does a complete rebuild every time), and is difficult to extend.

Not an Ant fan. I have used both Rake and Gradle successfully, and have been
much happier with each. Their scripts tend to be (much) more compact, easier
to read, and less prone to duplication.

~~~
ndrake
I agree with a lot of your points, but can you explain the part about it doing
a complete rebuild every time? It doesn't do that for me (unless I
specifically tell it to).

~~~
revscat
My apologies: I just wrote a basic HelloWorld.java and a build.xml to go with
it, and it looks like it doesn't recompile the class unless there is a change
to the source .java file. So I was mistaken about that.

Wonder what Gradle's caching, then?

~~~
vorg
> Wonder what Gradle's caching, then?

Let's hope it's catching another scripting language or two in its upcoming
version 2 because having Groovy as the only option does it no favors.

------
krick
In fact I agree. I would be really glad if somebody persuasively explained to
me which build tool is the best one ever, so I could use it always and for
everything, even when a shell script would be enough. Yeah, it would be nice.
But then, don't we have the same thing with about every class of software?
Tens of text editors and no perfect one. Many OSes and nothing sane. So many
programming languages with overlapping functionality! And we won't even talk
about such things as Linux distributions (and their packaging tools),
Pepsi & Coca-Cola… oh, that's not even software.

So, yeah, there are too many build-tools. Whatever.

------
anonymous85
I hate how spread out the JS community is, but I'm excited for the day it all
starts to come together and we don't have to worry about JavaScript stacks
becoming outdated within 6 months.

------
pwenzel
I appreciate Grunt and Gulp, but still fall back to Make for many of my web
projects, even those that require a CSS or Javascript build system.

Here's a Makefile example that utilizes fswatch:
[http://blogs.mpr.org/developer/2014/02/makefile/](http://blogs.mpr.org/developer/2014/02/makefile/)
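
A hedged sketch of what such a setup can look like (the target names, paths, and tools are made up for illustration; the linked post has the real thing): a normal compile rule, plus a `watch` target that uses fswatch to re-run make whenever the sources change:

```make
# Sketch only: compile LESS to CSS, rebuild on change via fswatch.
css/site.css: less/site.less
	lessc $< $@

watch:
	fswatch -o less | xargs -n1 -I{} make css/site.css
```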

------
nightcracker
I'm currently working on a build tool that doesn't work using the traditional
"makefile" approach. Instead it's designed as a Python library, and you have
the full power of Python at your disposal.

Sadly it's not ready for prime-time yet (early designing stage), so I won't
link my highly unfinished project.

~~~
cscheid
I suspect your downvotes are coming because you haven't mentioned
[http://www.scons.org/](http://www.scons.org/). Letting you know in case
you're not aware of it. (I would have emailed you out-of-band, but there's no
contact info in your profile. Sorry for the noise, everyone else)

~~~
checker659
SCons uses Python more for configuration than for control flow. I'm sure most
people think that's OK, but I find it limiting and a bit too derpy.

~~~
fidotron
In fairness, that's the core of the argument: a significant proportion of
people who have had to maintain build tools over long periods believe build
files should only be configuration and not contain any control flow.

The problem with adding flow control is that establishing the dependencies
without actually executing the whole process becomes next to impossible.

------
mwcampbell
Rich Felker, the lead developer of musl libc, uses make to generate his
ewontfix.com blog. See
[http://ewontfix.com/Makefile](http://ewontfix.com/Makefile)
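
As an illustration of the idea (this is not Felker's actual Makefile; `pandoc`, the template, and the directory layout are assumptions), a blog-style makefile can be little more than a pattern rule from markdown to HTML, so only the posts whose sources changed get rebuilt:

```make
# Hypothetical sketch: one HTML page per markdown post.
POSTS := $(patsubst %.md,%.html,$(wildcard posts/*.md))

all: $(POSTS)

posts/%.html: posts/%.md template.html
	pandoc --template template.html $< -o $@
```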

~~~
voltagex_
Pelican [1] uses make and python to generate a blog. Simple makefiles are
great.

1: [http://blog.getpelican.com/](http://blog.getpelican.com/)

------
ASneakyFox
I get redirected with: "Your experience on this site will be enhanced by
allowing cookies"

I need cookies to read a blog post? Don't think so. Probably not worth the
read.

~~~
hhariri
No. You don't. But EU law requires that if you use cookies for things such as
Google Analytics or even just stats, you must inform the user. And you can
switch it off.

------
runarberg
What about having your text editor do the build for you? (Use `.dir-locals.el`
in Emacs to compile your LESS on save, for example.)

------
Kiro
Why would you need Grunt just to compile LESS?

~~~
benaiah
It is a non-ad-hoc specification of your build, and it allows you to integrate
things like LiveReload more easily.

------
mamcx
I started using [https://github.com/rags/pynt](https://github.com/rags/pynt),
and what a change. It's the simplest thing you can imagine. Combine it with
requests and
[http://plumbum.readthedocs.org/en/latest/](http://plumbum.readthedocs.org/en/latest/)
(or similar) and voilà, you are done.

------
mutert2
What about the contrib packages for Grunt/Gulp?

------
SchizoDuckie
So he's saying use 'make' instead of gulp/grunt, and then he submits an
example where it's as easy as piping through GCC.

He's making the wrong assumption that you don't need to set up a build
environment when building with make, but to have gcc you will still also need
to install g++ and build-tools. Also, he refers to building on Windows using
yet another specialist tool, but have you recently tried building anything C++
on Windows when you don't have Visual Studio, or even worse, Cygwin installed?

When you make such a statement, then please show me a makefile that's not
10,000 lines long that will do the same as this, but without NPM and
'downloading half of the internet'.

    
    
    gulp.task('scripts', function() {
      // Minify and copy all JavaScript (except vendor scripts)
      return gulp.src(paths.scripts)
        .pipe(coffee())
        .pipe(uglify())
        .pipe(concat('all.min.js'))
        .pipe(gulp.dest('build/js'));
    });
    

I submit that's impossible, simply because this stuff took 2 years to evolve
(for the Javascript toolchain, that is) and a lot of people went through hours
of frustration trying out alternative methods.

~~~
MarkPNeyer
> this stuff took 2 years to evolve and a lot of people went through hours of
> frustration

Make was first released in 1976:
[http://en.wikipedia.org/wiki/Make_(software)](http://en.wikipedia.org/wiki/Make_\(software\))

They've worked on this thing for decades

There are 42,000 issues filed for make. If you could resolve each of these
issues in 10 minutes, you'd spend 291 days of frustration.
[http://savannah.gnu.org/bugs/?group=make](http://savannah.gnu.org/bugs/?group=make)

kids these days.

~~~
SchizoDuckie
My point was referring to the JS toolchain here mostly. I get that for C, make
is the tool for the job.

~~~
mturmon
make is a general-purpose tool for describing dependencies for regenerating
files. It would be worth your while to learn make, and try it on your example.
Understand that it's a declarative language ("A depends on B"; when "B"
changes, here's how to update "A"), and not a scripting tool. This is a good
thing.
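
For instance, the gulp pipeline above could be sketched declaratively along these lines (the tool invocations like `coffee` and `uglifyjs`, and the paths, are assumptions, and details such as concatenation order are glossed over):

```make
# Sketch: coffee -> uglify -> concat, expressed as file dependencies,
# so only changed sources are recompiled.
SRCS := $(wildcard src/*.coffee)
MINS := $(patsubst src/%.coffee,build/%.min.js,$(SRCS))

build/%.js: src/%.coffee
	coffee -c -p $< > $@

build/%.min.js: build/%.js
	uglifyjs $< -o $@

build/js/all.min.js: $(MINS)
	cat $^ > $@
```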

In my experience, make is coupled to Unix. Make is not coupled to C.

~~~
marshray
On one hand, _make_ typically comes with built-in rules for .c targets.

On the other hand, _make_ can't cleanly handle #include dependency detection.
I doubt that there is any major C project where "make extraclean" (or its
equivalent) isn't occasionally necessary.

So yeah it's really not very well suited for C.

~~~
nn3
What is wrong with just using the following?

Granted, it would be nice if make had a builtin macro to do this, but it is
not too bad to type out.

depend: .depend

.depend: ${SRC}
	${CC} -MM -I. ${SRC} > .depend.X && mv .depend.X .depend

include .depend

Makefile: .depend

~~~
marshray
Because now _make_ can't build a clean source tree.

I believe _make_ parses the entire Makefile before running it.

~~~
nn3
GNU make restarts itself when a makefile it has read (such as the included
.depend) gets rebuilt. So it works perfectly fine. Try it ...

~~~
marshray
What happens if you list _depend_ as a dependency of the first target?

