
The Makefile I use with JavaScript projects - theothershoe
http://www.olioapps.com/blog/the-lost-art-of-the-makefile/
======
neonscribe
I used make heavily in the 80s and 90s, but haven't used it much since then. Recently
I started a project that had source files getting processed into PDF files,
for use by humans. Since this is the 21st century, those files have spaces in
their names. At a certain point, I realized that I should be managing this
processing somehow, so I thought of using a simple Makefile. A little
searching reveals that the consensus on using make with files with spaces in
their names is simply "don't even try." In the 21st century, this is not an
acceptable answer.

~~~
hzhou321
Demanding support for spaces in filenames significantly complicates code, as
simple space-delimited parsing no longer works and other delimiting schemes are
much more error prone -- forgetting to balance quotes, anyone? While you are
allowing spaces, you are probably allowing all possible code points, or maybe
even a null byte? Thinking about it gives me headaches.

I hate hearing people use "21st century" or "modern" as reasons for inflating
complexity. Without reining in complexity (whether it is hidden or not), the
future is doomed, whatever you are building. While I am not saying we should
avoid complexity at all costs, I am insisting that all complexity should be
balanced against its merits.

The merit of filenames with spaces is that they read better in a GUI explorer.
Whether that merit balances out all the complexity it brings depends on the
individual. For me, that merit ranks very low and I avoid spaces in my
filenames at every opportunity. For some, they need those filenames to be
readable. And there are solutions. One solution, from those who don't code,
seems to be to demand (of "developers", paid or not) that every tool that deals
with files should handle this additional complexity, regardless of its context
and what they think. Another solution would be to add an additional pre-step
of copying/renaming/linking/aliasing. With the latter solution, the complexity
is confined.

I guess for some, it only matters whether "I do work" or "they do work", rather
than the big picture. That is fine. However, given that you are working with
Makefiles, you are a developer at some level, and you are supposed to do some
work.

~~~
com2kid
> And there are solutions. One solution, from those who don't code, seems to
> be to demand (of "developers", paid or not) that every tool that deals with
> files should handle this additional complexity, regardless of its context and
> what they think.

Users expect computers to work in non-surprising ways.

It isn't natural to use dashes or underscores in file names. Training users to
be afraid of spaces is just teaching them one more way that computers are
scary and unpredictable.

Meanwhile over in Windows land, all tools have been expected to deal with
spaces in them for what is approaching 20 years.

~~~
hzhou321
> Meanwhile over in Windows land, all tools have been expected to deal with
> spaces in them for what is approaching 20 years.

That is an excellent example that deserves a second look from a different
aspect.

... it has trained a crop of computer users who are afraid of command lines
and who have an attitude that anything beneath the GUI interface is owned by
someone else and is someone else's problem. They are scared of computers more
than ever. They are very scared of it and having it heavily disguised as an
appliance is mandatory.

~~~
com2kid
> it has trained a crop of computer users who are afraid of command lines and
> who have an attitude that anything beneath the GUI interface is owned by
> someone else and is someone else's problem.

There is nothing beneath the Windows UI interface. The GUI is the primary
subsystem of Windows, the command line is a separate subsystem. Windows has
traditionally been built up around the Win32 API, which is GUI first.

It is of course possible to do things from the command line, and some advanced
stuff may need the command line, but users should never need to drop out of
the UI unless something has gone horribly wrong.

The Windows UI is incredibly powerful, with advanced diagnostic and logging
capabilities, incredible performance monitoring tools, and system hooks
everywhere that allow the entire UI to be modified and changed in almost any
way imaginable.

The way Services work on Windows is not through some (hotly debated) config files. It
is through a UI. Management of hardware takes place through a UI. Pretty much
everything short of running Ping has a UI around it.

I love the command line; if I am developing, it is how I primarily work. And
while doing so, younger devs look at me like I am crazy because they have
customized the living daylights out of their UI tools (VS Code, Atom), to do
amazing things that exceed what a command line can do, and of course those
editors have a command line built in for when that is the best paradigm!

> and having it heavily disguised as an appliance is mandatory.

Something working so well that it becomes an appliance isn't a bad thing. It
means the engineers who made it have put in so many fail-safes, and made the
code of high enough quality, that it doesn't fall apart on itself all the
time.

Heaven forbid I can upgrade my installed software and not have some random
package break my entire system.

~~~
hzhou321
> There is nothing beneath the Windows UI interface.

To clarify, beneath the GUI interface is the actual code that implements that
interface.

> Something working so well that it becomes an appliance isn't a bad thing.

Not at all. I don't attempt to call, or think of, my phone as a computer.
Windows users, on the other hand, still call their PCs computers. I guess that
is ok if computers are appliances. It is just that there is still a big camp of
people who use a computer as a computer. That causes some confusion in the
communication between the two.

~~~
com2kid
> Windows users, on the other hand, still call their PCs computers. I guess
> that is ok if computers are appliances.

It seems that 90% of computer use has moved into the web browser. Heck, outside
of writing code, almost everything I do is in a browser, and my code editor of
choice happens to be built as a fancy-skinned web browser...

> To clarify, beneath the GUI interface is the actual code that implements
> that interface.

I'd say that everything moving onto the web has once again made the code
underneath it all accessible to the end user, if they so choose.

(Ignoring server side)

~~~
nineteen999
> It seems that 90% of computer use has moved into the web browser.

This is an extremely (web) developer-centric viewpoint IMHO.

Try telling 3D modellers/sculptors, games programmers, and audio engineers that
90% of their computer use has moved into a browser. They will look at you with
a blank face, since they all require traditional "fat" desktop apps to get
their work done at a professional level.

And those are just the examples I can find off the top of my head, I'm sure I
could think of more.

~~~
jwdunne
Do all of those computer users constitute more than 10% of computer users? I
don't think there's even 5% of computer users there.

Talking about 3D modelling, game programming and producers like they're the
majority is a technologist-centric view.

Most computer use is by people who think the web is the internet and use
'inbox' as a verb.

~~~
nineteen999
> Do all of those computer users constitute more than 10% of computer users? I
> don't think there's even 5% of computer users there.

So, as stated, that was the list off the top of my head. I just pulled from my
personal list of hobbies, things that I use a computer for other than
programming or automation.

Within 50 feet of me at work there are a whole bunch of electrical engineers
who spend 90% of their time in some fat app for designing, I dunno, telco
electrical stuff.

In the fishbowl next to that, are 50 network operations staff who spend 90% of
their day in "fat" network monitoring applications.

I'm just pointing out if you look far enough there are plenty of people using
apps outside a web browser for their daily work and hobbies.

In my nearly 25-year career in IT I have never heard people use 'inbox' as a
verb (as in 'inbox me?'). Sure, some people must say it sometimes, but I think
this is overstated and another example of programmer cliche or hyperbole.

------
rschloming
I think the reason make is both so controversial and also long-lived is that
despite how everyone thinks of it, it isn't really a build tool. It actually
doesn't know anything at all about how to build C, C++, or any other kind of
code. (I know this is obvious to those of us that know make, but I often get
the impression that a lot of people think of make as gradle or maven for C,
which it really isn't.) It's really a workflow automation tool, and the UX for
that is actually pretty close to what you would want. You can pretty trivially
just copy tiresome sequences of shell commands that you started out typing
manually into a Makefile and automate your workflow really easily without
thinking too much. Of course that's what shell scripts are for too, but make
has an understanding of file based dependencies that lets you much more
naturally express the automated steps in a way that's a lot more efficient to
run. A lot of more modern build tools mix up the workflow element with the
build element (and in some cases with packaging and distribution as well), and
so they are "better than make", but only for a specific language and a
specific workflow.
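
For illustration, a rough sketch of that "copy your shell commands into a
Makefile" idea (the file names and commands here are invented, not from the
article): each rule is just the command you were typing by hand, and make
skips a step whenever its output is already newer than its input.

    
    
        report.pdf: report.md
            pandoc -o $@ $<
        
        publish: report.pdf
            scp report.pdf user@host:/srv/reports/
        
        .PHONY: publish
    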

~~~
lisper
> It's really a workflow automation tool,

That's true.

> and the UX for that is actually pretty close to what you would want.

That is so not true. Make has deeply woven into it the assumption that the
products of workflows are files, and that the way you can tell the state of a
file is by its last modification date. That's often true for builds (which is
why make works reasonably well for builds), but often not true for other kinds
of workflows.

But regardless of that, a tool that makes a semantic distinction between tabs
and spaces is NEVER the UX you want unless you're a masochist.

~~~
ajross
> but often not true for other kinds of workflows.

Examples? I mean, there are some broken tools (EDA toolchains are famous for
this) that generate multiple files with a single program run, which make can
handle only with subtlety and care.

But actual tasks that make manages are things that are "expensive" and require
checkpointing of state in some sense (if the build was cheap, no one would
bother with build tooling). And the filesystem, with its monotonic date
stamping of modifications, is the way we checkpoint state in almost all cases.

That's an argument that only makes sense when you state it in the abstract, as
you did. When it comes down to naming a real-world tool or problem that has
requirements that can't be solved with files, it's a much harder sell (and one
_not_ treated by most "make replacements", FWIW).

~~~
lisper
> Examples?

Anything where the relevant state lives in a database, or is part of a config
file, or is an event that doesn't leave a file behind (like sending a
notification).

~~~
ajross
Like, for example?

To be serious, those are sort of contrived. "Sending a notification" isn't
something you want to be managing as state at all. What you probably mean is
that you want to send that notification once, on an "official" build. And that
requires storing the fact that the notification was sent and a timestamp
somewhere (like, heh, a file).

And as for building into a database... that just seems weird to me. I'd be
very curious to hear about systems that have successfully done this. As just a
general design point, storing clearly derived data (it's build output from
"source" files!) in a database is generally considered bad form. It also
introduces the idea of an outside dependency on a build, which is also bad
form (the "source" code isn't enough anymore, you need a deployed system out
there somewhere also).

~~~
Jtsummers
I need to send an email every time a log file updates, just the tail. Simple
makefile:

    
    
      # Makefile
      send: foo.log
              tail foo.log | email
    
      # then, from the shell:
      $ watch make send
    

Crap, it keeps sending it. Ok, so you work out some scheme involving temporary
files which act as guards against duplicate processing. Or you write a script
which conditionally sends the email by storing the hash of the previous
transmission and comparing it against the hash of the new one.

That last option actually makes sense, can work well, and solves a lot of
problems, but you've left Make's features behind to pull this off. For a full
workflow system you'll end up needing something more than files and timestamps
to control actions, though Make can work very well to prototype it, or if you
only care about those timestamps.
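
A sketch of that stamp-file guard scheme (names are hypothetical): the tail is
only re-sent when foo.log is newer than the stamp, and touching the stamp
afterwards records that this version already went out.

    
    
        send: .sent
        
        .sent: foo.log
            tail foo.log | mail -s "log updated" ops@example.com
            touch .sent
        
        .PHONY: send
    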

================

Another issue with Make is that it's not smart enough to know that
intermediate files may change without those changes being important. Consider
that I change the comments in foo.c or reformat for some reason. This
generates a new foo.o because the foo.c timestamp is updated. Now it wants to
rebuild everything that uses foo.o because foo.o is newer than those targets.
Problem: foo.o's contents didn't actually change, and a check of its hash would
reveal that. Make doesn't know about this. So you end up making a trivial
change to a source file and could spend the afternoon rebuilding the whole
system, because your build system doesn't understand that nothing in the
binaries is actually changing.

~~~
ajross
How would you fix that with your preferred make replacement? None of that has
anything to do with make, you're trying to solve a stateful problem ("did I
send this or not?") without using any state. That just doesn't work. It's not
a make thing at all.

~~~
Jtsummers
Lisper was replying to the OP who suggested using Make for general workflows.
Make falls apart when your workflow doesn't naturally involve file
modification tasks.

With regard to my last comment (the problem with small changes in a file
resulting in full-system recompilation), see Tup. It maintains a database of
what's happened. So when foo.c is altered it will regenerate foo.o. But if
foo.o is not changed, you can set it up to not do anything else. The database
is updated to reflect that the current foo.c maps to the current foo.o, and no
tasks depending on foo.o will be executed. Tup also handles the case of
multiple outputs from a task. There are probably others that do this, it's the
one I found that worked well for my (filesystem-based) workflows.

With regard to general workflows (that involve non-filesystem activities), you
have to have a workflow system that registers when events happened and other
traits to determine whether or not to reexecute all or part of the workflow.

~~~
Spivak
I mean you're just describing make but with hashes instead of file
modification times. It's probably the most common criticism of make that its
database is the filesystem. If file modification times aren't meaningful to
your workflow then of course make won't meet your needs. But saying the
solution is 'make with a different back-end' seems a little silly, not because
it's not useful, but because they're not really that different.

GNU make handles multiple outputs all right, but I will admit that if you want
something portable it's pretty hairy.

------
dblotsky
Thank you. It's nice to know that I'm not alone in this dark, dark world.

It's so depressing when people use arguments like "it's old", "it uses tabs",
and "it's hard to learn". As described by one Peter Miller, "Make is an expert
system". If a tool is the most powerful, standard, and expressive among its
peers, its age or the fact that it uses tabs should be inconsequential.

If anything, the fact that it's decades old and used in every major software
project is a testament to its effectiveness, not a drawback.

And if "learning Make" is a barrier, that to me is a sign that someone cares
more about complaining than about their project. The same way people learn Git
when it's clear that it's the best tool, people learn Make. It really isn't
that hard. Even the basics are enough to reap huge benefits immediately.

~~~
hungerstrike
The tools are not the same on every platform. That’s reason enough for me to
not use it with my JavaScript projects.

The bigger reason though is that it’s not very idiomatic for JavaScript
projects to use Make. It sounds like the only reason that some people go out
of their way to use it is because they actually don’t want to learn something.

~~~
dblotsky
> "The tools are not the same on every platform."

Any common examples?

> "It’s not very idiomatic for JavaScript projects to use Make."

While I agree that popularity is a factor in picking a tool, it shouldn't be a
deciding factor. Going by popularity is precisely how we end up with a new
Build System of the Year(TM) every few years. The fact that we've gone through
4 fairly prominent tools (Gulp, Grunt, Broccoli, Webpack), each contending to
"fix" the previous one, and none of which has proper DAG or incremental builds
(which Make has had for decades), is damning evidence.

In other words, I think Make could be (and I wish it was) idiomatic for JS.

~~~
hungerstrike
This comment points out the kind of problems that can occur just on Unix
systems -
[https://news.ycombinator.com/item?id=16485637](https://news.ycombinator.com/item?id=16485637)

And then there’s Windows...

Anyway, the fact that things change quickly in JS-land is more of a testament
to how popular it is than anything else IMO. If C were used in the same
environments as JS, I’m sure that you'd see just as much churn.

~~~
gonvaled
C is used on a lot of platforms.

JS runs only in the browser (thankfully mostly converged) and in Node.

~~~
hungerstrike
JavaScript is in the server, browser, mobile apps, desktop apps and embedded
devices.

Basically, it’s used everywhere that C is used and then some.

~~~
dblotsky
> "Basically, it’s used everywhere that C is used and then some."

It's the other way around. Every JavaScript program is run by a C/C++ program.

~~~
quietbritishjim
But if you consider browsers to be a "platform", which I think they are in
fairness, then the GP comment could still apply. Whether the browser is written
in C is pretty irrelevant to whether it can host JavaScript.

~~~
dblotsky
I conjecture that an upper-level "platform" that's on top of a lower-level
"platform", for some definition of "platform", will be less varied and
fragmented than the lower-level one.

------
rectang
Make's interface is horrible. Significant tabs. Syntax which relies on bizarre
punctuation... If only whoever authored Make 40 years ago had had the design
acumen of a Ken Thompson or a Dennis Ritchie!

We're stuck with Make because of network effects. I wish that it could just
become "lost" forever and a different dependency-based-programming build tool
could replace it... but that's just wishful thinking. The pace of our progress
is doomed to be held back by the legacy of those poor design decisions for a
long time to come.

~~~
AnIdiotOnTheNet
Make is such a horrifically awful thing to work with that I just end up using
a regular scripting language for building. Why learn another language with all
its eccentricities and footguns when I already know several others?

~~~
aidenn0
This is why I like redo; it handles all of the dependency stuff for you, but
your build scripts are written in pure sh.
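
To give a flavour, a minimal sketch in the apenwarr-redo style (file names
assumed, not from this thread): a target like foo.o is built by a plain sh
script named default.o.do, where $2 is the target's basename, $3 is the
temporary output file, and redo-ifchange declares the dependencies.

    
    
        # default.o.do
        redo-ifchange "$2.c"
        cc -c -o "$3" "$2.c"
    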

~~~
dilap
always been curious about redo. which variant do you use, and what do you use
it for?

~~~
aidenn0
I use apenwarr's because I installed it a long time ago and haven't needed
anything more; it does look like it may be abandoned, though.

I use it for anything where I need job scheduling around creating output
files; even my DVD ripping is managed with redo calling into ffmpeg.

------
athenot
After going through a few build systems for JavaScript, I realized they were
all reinventing the wheel in one way or another, and pulled venerable Make out
of the closet. It turned out to be way more expressive and easier to read.

One target to build (prod), another to run with fsevents doing auto-rebuild
when a file is saved (instant gratification during dev), then a few targets
for cleanup & housekeeping. All said, the file is 1/4 the size of any of the
JS-based build setups.

~~~
bobbyi_settv
The reason I don't like doing this is portability. Since the steps within the
makefile are going to be run through a shell, it is going to behave
differently on different systems.

If your makefile fixes up a file using sed and your system has gnu sed, your
makefile may fail on a system with BSD sed (e.g., a mac). If you rely on bash-
isms, your makefile may not work on a debian system where it will be run with
dash instead of bash. And so on.
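
(One partial mitigation, at least with GNU Make, is to pin the recipe shell
explicitly instead of inheriting /bin/sh, e.g.:

    
    
        SHELL := bash
        .SHELLFLAGS := -eu -o pipefail -c
    

That only addresses the bash/dash half, though; the GNU vs. BSD sed
differences remain.)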

~~~
deckard1
I say, you JS guys doth protest a bit too much.

If you look in your package.json, you'll surely see a dozen or so "scripts"
lines that run through the same shell that make does and have all the problems
you just mentioned.

I'd also like to point out that Linux and almost certainly your production
environment (because it's most likely *ix) will be case sensitive. Your macOS
or Windows file system? Not so much. Point is, you're already up to your neck
in portability issues. My macOS coworkers often forget this detail.

~~~
bobbyi_settv
If an external package has scripts in it, those scripts have very likely been
run by somebody on a mac and somebody on linux and worked in both cases. That
is totally different than writing a line of script that only you have ever run
and assuming that because it works on your laptop, it will run everywhere.

~~~
deckard1
> That is totally different than writing a line of script that only you have
> ever run and assuming that because it works on your laptop, it will run
> everywhere.

huh?? No it's not. That's the whole discussion we are having now. I even
specifically mentioned my macOS coworkers who develop on their Mac and are
oblivious to case-sensitivity issues. Because "it worked for me."

> If an external package has scripts in it, those scripts have very likely
> been run by somebody on a mac and somebody on linux and worked in both
> cases.

Oh, I'd love for that to be true. But also not what I was referring to. Most
large projects have a "scripts" section and those are no different than
Makefile commands. If you're paranoid about your own Makefiles then you're
going to need to be paranoid about your own package.json.

------
jlg23
The author forgot some great features of make:

* Parallel execution of build rules comes for free in a lot of implementations. This is really noticeable when you do heavy asset pre-processing.

* Cleanly written build rules are re-usable across projects as long as those projects have the same structure (directory layout).

* Cleanly written build rules provide incremental compilation/assembly for free: you express intermediate steps as targets and those are "cached". I put "cached" in quotes here because you essentially define a target file which is regenerated when its dependencies are updated (see the sketch below). Additional benefit: inspection of intermediate results is easy - they are sitting there as files right in your build's output tree.
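
A minimal sketch of that "cached" intermediate idea (asset names invented
here): thumbnails are regenerated only for the images that changed, they sit
in the output tree for inspection, and `make -j8` processes them in parallel.

    
    
        THUMBS := $(patsubst img/%.png,build/thumbs/%.png,$(wildcard img/*.png))
        
        all: $(THUMBS)
        
        build/thumbs/%.png: img/%.png
            mkdir -p $(dir $@)
            convert $< -resize 200x200 $@
    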

~~~
theothershoe
Thank you for these points! I think that parallel execution is especially
appealing. I edited the article to mention that.

------
habosa
Yes! I recently used Make on some JS projects and my coworkers looked at me
like I was insane. Even if you don't know advanced Make-fu it's a really good
way to run all of your build steps in the right order without some crazy JSON
config format.

------
platz
The only build systems that I'm aware of that are monadic are redo, SCons and
Shake-inspired build systems (including Shake itself, Jenga in OCaml, and
several Haskell alternatives).

One realistic example (from the original Shake paper) is building a .tar file
from the list of files contained in a file. Using Shake we can write the
Action:

    
    
        contents <- readFileLines "list.txt"
        need contents
        cmd "tar -cf" [out] contents
    

There are at least two aspects I'm aware of that increase the power of Make:

- Using `$(shell cat list.txt)` I can splice the contents of list.txt into
the Makefile, reading the contents of list.txt before the dependencies are
parsed.

- Using `-include file.d` I can include additional rules that are themselves
produced by the build system; a sketch of this idiom follows.
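
(A from-memory sketch of the classic gcc -MMD flavour of this, not taken from
the linked post: the compiler emits a .d file of header dependencies alongside
each object, and `-include` pulls those generated rules into the next run
without erroring when they don't exist yet.)

    
    
        %.o: %.c
            $(CC) -MMD -MP -c -o $@ $<
        
        -include $(wildcard *.d)
    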

It seems every "applicative" build system contains some mechanism for
extending its power. I believe some are strictly less powerful than monadic
systems, while others may turn out to be an encoding of monadic rules.
However, I think that an explicitly monadic definition provides a clearer
foundation.

[http://neilmitchell.blogspot.com/2014/07/applicative-vs-
mona...](http://neilmitchell.blogspot.com/2014/07/applicative-vs-monadic-
build-systems.html)

------
mikegerwitz
I use Automake and Autoconf (which generates the Makefile) for GNU ease.js:

[https://git.savannah.gnu.org/cgit/easejs.git/tree/Makefile.a...](https://git.savannah.gnu.org/cgit/easejs.git/tree/Makefile.am)
[https://git.savannah.gnu.org/cgit/easejs.git/tree/configure....](https://git.savannah.gnu.org/cgit/easejs.git/tree/configure.ac)

The nice thing with using Automake is that it gives all the standard build
targets with little additional effort (for example, `make dist` for producing
the distribution tarball, and `make distcheck` for verifying that it's good).

I use a much simpler one for a project at work:

[https://gitlab.com/lovullo/liza/blob/master/Makefile.am](https://gitlab.com/lovullo/liza/blob/master/Makefile.am)
[https://gitlab.com/lovullo/liza/blob/master/configure.ac](https://gitlab.com/lovullo/liza/blob/master/configure.ac)

------
jayshua
The concepts behind make are quite good, but the interface it provides is
decidedly not. It reminds me of Git in that respect. I'd think replacing the
opaque symbols used everywhere with more descriptive words would be helpful. I
don't suppose anyone knows if modern Make versions support alternatives to $@,
$%, $?, etc. that can be read rather than memorized?

~~~
xaedes
You _could_ do something like this:

    
    
       sed 's/MAKE_TARGET/$@/g' Makefile.descriptive | make -f -
    

But for the sake of compatibility I wouldn't.

~~~
JetSpiegel
No need for ugly, error-prone sed trickery.

Use on the Makefile:

    
    
        MAKE_TARGET := /some/path
    

And invoke

    
    
        $ make MAKE_TARGET=/other/path

~~~
lolikoisuru
His point was that all of $@, $%, $? have more descriptive names as well; you
don't need to use these single-character macros.

Also you probably meant ?= instead of := as the command you gave would still
use /some/path instead of /other/path if you use :=.

------
afranchuk
I have used make for years and am very familiar with it. My two major
complaints are:

1. The assumptions that it makes. Everything in and out is a file (phony
targets notwithstanding). It is hard and painful if you want outputs to depend
on, and rebuild from, configuration in the makefile itself. It's not impossible
to implement, but it's difficult to implement it _precisely_: often I've seen
systems that just rebuild everything after configuration changes (a rough
workaround is sketched after this list).

2. The mix of declarative and imperative styles, while useful for quickly
throwing a build together, gets difficult to deal with as things scale up. The
make language itself is pretty restricted, too (without using $(eval)).
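
(For the first complaint, the blunt workaround is the familiar `foo.o: foo.c
Makefile`, which rebuilds everything whenever anything in the makefile
changes. A slightly more precise sketch, with invented names, keeps a given
setting in its own stamp file so that only targets that actually read it are
rebuilt:

    
    
        # re-touch the stamp only when the value really changes
        cflags.stamp: FORCE
            echo '$(CFLAGS)' | cmp -s - $@ || echo '$(CFLAGS)' > $@
        FORCE:
        
        %.o: %.c cflags.stamp
            $(CC) $(CFLAGS) -c -o $@ $<
    

It works, but it is exactly the kind of boilerplate the complaint is about.)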

I know that recent versions support Guile extensions and even C (C++?)
extensions, but at that point it's not giving you all that much. I have often
wished the make functionality were exposed in some "libmake" for me to extend.

For this reason (and others), I've recently refactored a huge build system to
use shake[1] instead. Now builds are precise and correct, and properly depend
on configuration and build environment.

[1]: [https://shakebuild.com/](https://shakebuild.com/)

~~~
pjmlp
That is not portable make, rather GNU Make.

~~~
afranchuk
Very true, thanks for clarifying. FWIW most of my stuff is portable make...

------
titanomachy
> The target name `all` is special: When you run make with no target specified
> it will evaluate the `all` target by default.

This doesn't appear to be true, at least on GNU Make 3.81 (MacOS). Rather, the
_first_ target listed is the one that gets built on `make` with no arguments.
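
(For what it's worth, GNU Make also lets you pin the default explicitly,
regardless of rule order:

    
    
        .DEFAULT_GOAL := all
    

so the "all comes first" behaviour in the article is convention, not magic.)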

------
blt
I'm baffled at the amount of debate in this thread. Make is good for tasks
that can be expressed as a directed acyclic graph of steps, where the steps'
inputs and outputs are files, and steps can be expressed in a few lines of
shell script. It works pretty well for such tasks in my opinion. Yes, it will
look contorted for tasks that don't fit this model.

------
rcarmo
I use Makefiles extensively for Docker, all sorts of individual projects (it's
much easier to "make serve" than remember the specific invocation for getting
a dev server up when you use multiple programming languages) and, of late, for
Azure infrastructure deployments:

- [https://github.com/rcarmo/azure-docker-swarm-
cluster/blob/ma...](https://github.com/rcarmo/azure-docker-swarm-
cluster/blob/master/Makefile)

- [https://github.com/rcarmo/azure-acme-
foundation/blob/master/...](https://github.com/rcarmo/azure-acme-
foundation/blob/master/Makefile)

~~~
fernandotakai
i (and people at work) do the same thing! i find it really easy to work with
docker/k8s/helm/compose with Makefiles.

one tip -- remember to use .PHONY on your targets --
[https://www.gnu.org/software/make/manual/html_node/Phony-
Tar...](https://www.gnu.org/software/make/manual/html_node/Phony-Targets.html)

------
isaachier
I started my career with C++ development and never used a direct makefile in
any of my projects. When I write C/C++, I use CMake. My current job has me
programming in Go, where people seem to love makefiles, but I consistently find
bugs in the implementations (usually too many phony targets, etc.). Why don't
people use makefile generators outside of the C/C++ community?

~~~
jschwartzi
I don't use CMake for embedded systems because the syntax and options are even
more obtuse for what I need to do with it. When I get down to it, every make-
based build system I've ever used boils down to using the macro language to
generate all the targets, and then building your output board image by
dependencies. But it's the details of how to do this that vary widely
depending on your application for the board.

CMake does a great job of compiling executable code and linking it using your
compiler of choice, but where it falls flat is in giving me a convenient
mechanism for fitting that executable into the system image.

~~~
pknopf
It sounds like you are looking for something like Yocto/OE, which you can bake
pretty much any build script into, cross compile and deploy.

What exactly are you looking for with regards to deployment, and what tool do
you use, instead of CMake, to give you that ability?

~~~
jschwartzi
Can Yocto or CMake build QNX systems or bare-metal binaries with TI's
compiler? The tool I'm looking for is Make because Make gives me a ton of
flexibility to build whatever freaky combination of binaries might go into
whatever product I'm working on, and then combine those binaries into a system
image.

The point I'm trying to make is that every build system except Make solves a
really specific class of build problem (building applications, or building
Linux systems using GCC) and then pretentiously claims to support everything
that matters. What you're actually getting is a small sliver of what you might
need in my world.

This article is another example of how Make can do something really unexpected
by providing really simple features and letting you decide what matters to
you. I've yet to see another build system that is as generally useful.

~~~
isaachier
It seems like CMake supports both of these out of the box:

[https://github.com/Kitware/CMake/blob/master/Modules/Platfor...](https://github.com/Kitware/CMake/blob/master/Modules/Platform/QNX.cmake)
[https://github.com/Kitware/CMake/blob/master/Modules/Compile...](https://github.com/Kitware/CMake/blob/master/Modules/Compiler/TI-C.cmake)

------
fuball63
We use make for standardizing the way docker containers are built, pushed,
tested, and run (debug and production modes). I even prefer it to docker-
compose at this point, because it is more programmable.

~~~
ethagnawl
If you're able, I'd be very interested in seeing some examples.

~~~
mauvehaus
Not the OP, and doing a bit less than it sounds like the OP is doing, but we
built out a relatively handy make-based build system for building a set of
images in correct dependency order.

Another member of the team subsequently taught make about the reverse
dependencies so that you can split jobs across Travis nodes by the top-ish
level image they depend on.

My favorite addition was the ability to generate a dependency graph using
graphviz straight from the Dockerfiles.

N.B. Project is now moribund, the team was disbanded. May not build at all.
Don't know if any of the forks are active.

~~~
fuball63
That's pretty interesting about Travis. Make is great because you can do as
much or as little with it as you want, but it generally always improves
organization.

------
nzoschke
I’ve had a lot of fun recently writing a Makefile for a Go and Lambda app:

[https://github.com/nzoschke/gofaas/blob/master/Makefile](https://github.com/nzoschke/gofaas/blob/master/Makefile)

It started very verbose, one target for every Go program, until I figured out
target patterns.

I’m also enjoying the -j flag to do things in parallel.

Now it’s 3 lines of Make to build 10 go programs in parallel in seconds.

Parallel is also enough job control to run the development server and to
watchexec rebuilding all go programs on code change.
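
The gist is something like this (a simplified sketch, not the actual file; the
cmd/ layout and names are assumed): one pattern rule covers every program, and
`-j` fans the builds out.

    
    
        BINS := $(addprefix bin/,$(notdir $(wildcard cmd/*)))
        
        all: $(BINS)
        
        bin/%: cmd/%/main.go
            go build -o $@ ./cmd/$*
    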

~~~
wbkang
Looks good, I think you need to add PHONY targets

------
juliend2
One thing I've found useful about gulp and webpack is the fact that they're
multiplatform.

What is the best approach to make sure your Makefile will work as expected on
every platform, meaning Windows included?

For example: Can we write file paths using forward-slashes, or is it still an
issue? I imagine running it via git bash (bash provided by the git installer
on windows).

~~~
jcadam
CMake! No, I'm kidding, don't really use CMake...

~~~
jcadam
Downvote?! Looks like we have a CMake fan...

------
piano
Why is it that each time a JS dev discovers something most other fields have
been using for decades, it has to be "The Lost Art", "Superpower"
([https://medium.com/@wesharehoodies/typescript-javascript-
wit...](https://medium.com/@wesharehoodies/typescript-javascript-with-super-
powers-a333b0fcabc9)) or something similarly tacky?

There's no Lost Art, no superpowers. It's just the JS scene _finally_ slowly
getting up to speed.

~~~
Klathmon
Why is it that some commenters can't understand that all of humanity isn't at
the exact same level of knowledge on everything?

People learn things every day; acting like it's "slow", or like you are better
than them, is a very toxic, elitist point of view that is better off not said.
If you think your way of doing things is better, then you should encourage
others to do it that way; don't put them down when they just start using it! I'm glad
the ALGOL programmers of the 60's and 70's didn't spend their time laughing
and putting down that new-fangled C language when it inevitably made some of
the same mistakes.

And besides, very rarely is something in life a complete upgrade. Make is
nice, but it has some very real pain-points (as evidenced by the handful of
utilities that are makefile-generators, because getting make to do certain
things or work on all platforms is so difficult). Gulp is great too, but it
also has some very big issues in some areas. There is no universal "right"
answer to any of this, and assuming that everyone that's not doing it your way
just doesn't know any better isn't just naive, it's also wrong.

~~~
piano
> People learn things every day; acting like it's "slow", or like you are
> better than them, is a very toxic, elitist point of view that is better off
> not said.

You must've misunderstood me. I don't have any problem at all with people
discovering things late. I am often a slow learner myself. What irritates me
is when people make spectacular "discoveries" with tacky headlines. If the
article were written in a bit more casual and technical tone, I'd be happy,
much more so than with this "let me re-introduce you this amazing but
forgotten lore you don't know about" nonsense...

> And besides, very rarely is something in life a complete upgrade. Make is
> nice, but it has some very real pain-points

I agree, but that's all the more reason to not make such "discoveries" ...

------
bsenftner
A long time ago, in a developer's paradise called the 1970's there was an
automated build tool called "make".

It had this and only this syntax:

If a line begins at character 0, that is a list of files whose timestamps
should be checked; if any are 'old', then execute the series of command lines
beneath, identified by starting with a tab character.

That was it, the entire syntax of "make", and it was complete. Then some smart
people fucked it up and we have that piece of shit we call "make" now.

~~~
Sir_Cmpwn
I did not use Make at this time (I wasn't alive!), but it seems that you'd
have to declare dependencies too. How did you determine if it was "old"?

~~~
brilee
"If a line begins at character 0, that is a list of files whose timestamps
should be checked"

The list of files would be the declared dependencies.

~~~
Sir_Cmpwn
Ah I see.

------
toast0
Working somewhere that really embraced Makefiles is actually very nice. If you
keep your dependencies simple, the Makefiles to build stuff are pretty simple
too, and it works for all the languages you might write software in. You can
use it to deploy to servers too.

------
gsaga
Does anyone here use makefiles for anything other than compilation and builds?
One can use them to add some level of concurrency to shell scripts.

~~~
hooya
I use it for almost everything. Take ETL type tasks. It's almost impossible to
remember where I downloaded certain data files six months after initially
creating a project. I put that into the Makefile. Everything I do on the
command line, I put into a Makefile. That way, 6 months later when I need to
download new data, do some transformation on that data and load it into the
database, 'make' re-traces my steps from 6 months ago. Broadly, my ETL
makefiles look like this:

    
    
        all: data.loaded data2.loaded
    
        data.csv:
            wget 'some url'
    
        %.sql: %.csv
            sed/awk/custom-python-script $< > $@
    
        %.loaded: %.sql
            psql < $< && touch $@
    

------
wruza
For me, the problem with all make replacements is that they don't just solve
makefile's issues in a knowledge-compatible way, but reinvent their own cryptic
syntax and structure from scratch. Minor issues that you know about (and know
how to avoid) are never worth fixing in this way, since most people don't have
time to learn infinite variations of superior tool #1 that has its own flaws.
They are fine with #2 that works for them.

When someone says "there are numerous alternatives", I just look out the
window and smile.

Almost all issues mentioned in this thread can be solved in a barely-compatible
but easily transitionable way. If such a tool exists, its name is welcome.

------
drablyechoes
I have used Makefiles for a lot of little things over the years. One of the
latest things I have found it useful for is automating deployment of websites
via Jenkins jobs.

It is a lot easier and more manageable if your Jenkins job is set up to just
run a series of generic Make commands for a project, where any specific steps
for a particular project are defined in the Makefile.

This way I do not need to know or care about how any particular project or
site is built when configuring the job to deploy it. The Makefile takes care
of all that.

------
solomatov
There's actually a much better general purpose build tool:
[https://bazel.build/](https://bazel.build/) I consider it Make 2.0.

------
benkbit
The thing I like about Makefiles is the declarative structure you get by
defining the targets. It really can serve as a form of documentation about
your different stages of development.

------
Wintamute
However expressive or standard on other platforms, Make is not the correct
tool for building web projects. Putting aside fundamental technical
differences between Make and, say, Webpack (of which there are many), only one
argument is needed: Make is not idiomatic on web projects. Most web
developers are not going to be productive authoring or maintaining a Make
build process. If this guy wrote a Make build process while working for me,
I'd ask him to rewrite it using Webpack.

------
jgh
CMake is so nice compared to vanilla makefiles. I wonder if it can be used
this way with Javascript.

~~~
armitron
Make may seem crude these days, but it has manageable complexity; I've never
run across a build issue I could not debug. Its model is so simple, I can have it
fully in my mind and be confident this is how it works. Moreover, this model
simplicity allows me to build reliable systems on top of it.

CMake is a nightmare of implied complexity. I've run into multiple situations
(usually but not always involving cross-compilation) where it was simply
incomprehensible. No amount of time invested would let me figure out why CMake
blew up. There is no way I can build something a little out of the ordinary on
top of CMake and be confident that it's solid. It's just too complicated.

This sort of balancing (upfront ease of use vs. hidden complexity) comes up
often, and I have been burned enough in the past that experience dictates
pretty much always going for model simplicity and paying the upfront costs. I
don't see this often, however. A lot of the time people will go for what feels or
looks right, superficially, without bothering to look beneath the surface or
think about model complexity costs. It's an attitude that has led to many
disasters in this space.

~~~
jgh
Fair enough but it could be that CMake isn't the right tool across the board.
It works well for the projects I've used it for. Maybe one day I'll come
across a case that makes me hate CMake though who knows ;)

------
nickwanninger
I actually usually have a general-use Makefile that I work from as a starting
point [1]. It takes all C files from a src/ folder, builds their objects into
build/, and then links them all in one go. No need for me to specify the files.
It also works recursively.

[1]: [https://pastebin.com/b1tr9th3](https://pastebin.com/b1tr9th3)

------
s_chaudhary
I searched and didn't find a single soul using Makefiles to orchestrate a set
of build steps. I have been doing that for a long time: `make test`, `make
build_docker`, `make push_docker`, `make deploy`. Now suddenly everyone shows
up in one place here. Long live HN.

~~~
dblotsky
Apache Cordova's site and docs, although currently using Gulp, have a Makefile
that mirrors the Gulpfile, and which nobody uses. :(

------
lykahb
Make has dated syntax and conventions. In an era when developers use emojis on
the command line, its lack of support for spaces and its other quirks look
worse than they did a decade ago. Despite all of its expressive power and
speed, Make is not going to attract many frontend developers.

If the concepts behind Make were repackaged in a more hipster way, the
resulting tool might get far more appeal. Shake
([https://shakebuild.com](https://shakebuild.com)) is a build system library
that can naturally express even the dependencies that require Makefiles to go
meta and generate new Makefiles. It can be a robust backend for any build
system DSL.

~~~
monsieurbanana
I like functional programming languages and Haskell is something that
interests me.

But you can't be serious when you say that make is out-of-touch with the
average frontend developer, and then link to this:

[https://shakebuild.com/manual](https://shakebuild.com/manual)

~~~
lykahb
I suggested making a new build system that has the power of Make or Shake and
is still oriented toward frontend development.

You are right that Shake and most general-purpose build tools are out of touch
with frontend. Even for people who are comfortable with these tools, they are
not necessarily better than Webpack for the typical frontend tasks. Shake's
place in this hypothetical tool would be at its backend, like LLVM is the
backend of many compilers.

------
grimgrin
I learned a lot from this, and whether or not one should be doing this, I
simplified:

[https://github.com/shmup/react-
makefile/blob/master/Makefile](https://github.com/shmup/react-
makefile/blob/master/Makefile)

I opted out of checking for modification dates on node_modules and yarn.lock,
because that seems like exactly what Yarn itself is for. I let it manage
itself.

I also let Webpack do the heavy lifting.

So in short: I don't really need the Makefile at all and could just add the
clean and dev-server commands to a script block in package.json

I still like it though

------
pankajdoharey
I find the npm "scripts" field to be the most reasonable build tool. You can
write any kind of shell command into it, or invoke shell or JavaScript scripts,
without learning the strange syntax of Make.

------
fiatjaf
make is scary because people use autogenerated Makefiles.

After you've tried to naïvely read a Makefile for a medium-sized C project
you'll never want to write a Makefile yourself.

That said, there's no tool that works so simply and so beautifully as make
when you have a lot of build steps that generate lots of intermediate files
with complex dependency links between them. That situation may happen
everywhere, even if you're generating a bunch of PDFs or rendering HTML
templates or making images from .dot files or whatever. Use make.
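
The .dot case, for instance, is about four lines (a sketch; names invented):

    
    
        PNGS := $(patsubst %.dot,%.png,$(wildcard *.dot))
        
        all: $(PNGS)
        
        %.png: %.dot
            dot -Tpng -o $@ $<
    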

Or is there an alternative tool for these situations also?

------
benwaffle
I believe $< is only the first dependency and $^ is all of them
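
For example (illustrative rules, not from the article):

    
    
        prog: main.o util.o
            cc -o $@ $^     # $^ = main.o util.o
        
        main.o: main.c util.h
            cc -c -o $@ $<  # $< = main.c only
    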

~~~
IshKebab
With such an easy-to-remember intuitive syntax how could you forget?

~~~
dblotsky
We remember it the same way we remember System.out.println(), undefined ==
null, and other trivia in other languages.

------
jaxtellerSoA
As a non-programmer, the only experience I have with makefiles is the
unpleasant one of running ./configure and then make, only to go down an endless
rabbit hole of missing dependencies. I am very grateful that yum/apt-get, etc.
have come such a long way: they grab all your dependencies, and you don't have
to wait long periods of time for the code to compile.

I am sure Make still has its uses, but I am sure glad package managers have
made makefiles irrelevant to me.

~~~
yoz-y
This is not really a fair comparison, though. Makefiles and package managers
are orthogonal, and the latter has not really replaced the former. Makefiles
are like a recipe; package managers are food delivery. Somebody still has to
cook the food.

~~~
MaulingMonkey
> This is not really a fair comparison though.

I'd argue it's entirely fair with other examples (e.g. compare and contrast
to, say, Rust's cargo build.) You can write a Makefile which fetches and
builds your dependencies. Makefiles might be like a recipe - but these days
we're building recipes for building entire OS images, including grabbing the
dependencies in the first place. A single software project by comparison is
trivial.

The problem is Makefiles are a jack of all trades, and master of none. You
have to write or copy a distressing number of rules yourself, per project, per
_subproject_ , and know the underlying build chain in relative depth to do
even some rather basic things. I'd argue makefile authors not automating their
dependency fetching is a symptom of this.

As a programmer, I avoid them, because I'll end up stuck maintaining them and
end up lowering the build system's bus factor. With perhaps the exception of a
couple of simple rules that forward to a "proper" build system, because I've
written enough of them in the past that "make" still feels like the right
default action that should generally work. But if at all possible, never for
the meat of the build system.

~~~
yoz-y
Of course. But I would not compare cargo build or CMake to package managers
because again, they are developer tools and unless you are a developer you
should not need them.

I definitely think that we can do better than Makefiles, but again they have
the benefit to be extremely versatile so they can be bent to many uses.
(personally I do not use them, I use CMake and generate ninja files)

------
deadcoder0904
Make is awesome. I often did things with it earlier; however, now if I want to
do something with NodeJS I generally use the npm "scripts" field. And when that
gets tedious, I use nps [0]. If I need something big then I go with Webpack or
Parcel.

[0]: [https://github.com/kentcdodds/nps](https://github.com/kentcdodds/nps)

------
DanHulton
A tiny improvement that could be added - instead of locating Babel by adding
your project's `node_modules/.bin` to your path or directly linking to it, you
can always write:

`npx babel`

This will use any installed version of babel in your node_modules, or, if not
installed, will temporarily install it for the duration of the command.

~~~
cesarb
> or, if not installed, will temporarily install it for the duration of the
> command.

Wait... doesn't that make typosquatting even more of a danger? People get used
to typing "npx <command>", they mistype the command once in a while, an
attacker uploads packages named after the most likely typos, and done - the
attacker wins.

I question the wisdom of that shortcut.

------
kraig911
A lot of good arguments about using and not using make. Count me in the not
using make camp. IMO it's just simply overkill. JS work nowadays is so modular;
if we were talking about configuring a monolith service at build time... sure,
yeah. Meanwhile I just wanna make this div purple.

~~~
codazoda
I presume you're running simple, small JS files. This post is about larger
projects that need to transpile newer JS code down to browser-compatible code,
combine multiple files into 2 or 3 HTTP requests, and squeeze every possible
byte out of the JS resources we end up sending to clients.

~~~
kraig911
Yup, I'm a JS dev and I use webpack (3). I get the entire thing. However, I
think this is still all so complicated for only a small benefit. I'm currently
looking for a tool to find out how much is seen/used vs. how much is delivered.

------
jiaweihli
I went through a Makefile phase but ultimately settled on Jakefile [1] for a
more approachable syntax and type-checkability via TypeScript.

[1] [https://github.com/jakejs/jake](https://github.com/jakejs/jake)

------
bryanlarsen
Make works great for JavaScript, but it seems very few people use it that way.

[https://github.com/webpack/webpack-
cli/issues/152](https://github.com/webpack/webpack-cli/issues/152)

------
ziikutv
I wish this article were about just make, rather than JavaScript and using
them in tandem.

~~~
theothershoe
Noted! I wanted to provide concrete examples, and I did not want the post to
become too long. So I focused on one kind of project.

------
commandlinefan
I like this a lot. We preach the mantra of reuse all the time, but we practice
it so rarely - writing everything from the ground up every few years rather
than trying to fit the existing tools to the new(er) processes.

------
bitwize
If you need to write in C or C++, use CMake. Otherwise, use whatever _modern_
build tool your language provides. It's 2018, and Make should have died in the
70s.

~~~
peterwwillis
A lot of "modern" build tools lack necessary features.

------
lowbloodsugar
The problem with make is that people build makefiles such that it is later
referred to as a Lost Art.

~~~
dblotsky
People do this with all build systems. The difference is that
Grunt/Gulp/Broccoli files are a Lost Art in 3 years, and Makefiles have been
around for decades.

------
metalliqaz
I'm not a web dev, but this article has uncovered yet another thing in the js
ecosystem that just seems crazy to me.

This makefile snippet:

    
    
        lib/index.js: src/index.js
            mkdir -p $(dir $@)
            babel $< --out-file $@ --source-maps
    

The source and the output are both called index.js? Why, God, WHY???

~~~
eropple
Because `require` and `import` look at your package's `main`, which (if it's a
directory) implies `dirname/index.js`. Building to an `index.js` entry file
is, then, the most standard name that allows `require 'modulename'` to work.

The capitalized rending-of-garments is silly. This is transpiling; file names
should remain the same from input to output.

------
tutuca
I find the editorialized title a little bit harsh on the author's intentions...

------
balls187
Used make for building javascript bundles back in 2013. Worked incredibly
well.

------
crescentfresh
Tangential, but I cringe every time I see the word "transpiler". A compiler is
a compiler is a compiler.

Previous discussion on this:
[https://news.ycombinator.com/item?id=15154994](https://news.ycombinator.com/item?id=15154994)

~~~
mmjaa
Meh:

Compiler: collects sources from various places, assembles the result into
machine code.

Transpiler: collects sources from various places, converts the source to
another language.

Personally I cringe at the newspeak you seem to imply we should all be
applying to our lives. These words mean things - quite different things, it
turns out. There's a difference between human-readable source code language,
and machine-executable binary code. These tools function in different ways
entirely; your optimization to the language is not only unwarranted, but
leads to a desultory effect: programmers get stupider when they don't know
what their tools are actually doing.

~~~
cat199
> newspeak you seem to imply we should all be applying to our lives.

[http://foldoc.org/transpiler](http://foldoc.org/transpiler)

^ no match.

 _transpiler_ is the newspeak.

~~~
cat199
The original C++ is probably the most famous 'transpiler'

[https://en.wikipedia.org/wiki/Cfront](https://en.wikipedia.org/wiki/Cfront)

[http://www.softwarepreservation.org/projects/c_plus_plus/cfr...](http://www.softwarepreservation.org/projects/c_plus_plus/cfront/release_e/src/master.pdf)

oh hey, no one calls it a 'transpiler'.

I'm not so angry about the word transpiler; perhaps it might improve things..
but to imply that using compiler as a common term for all things here is a
regression/neologism is patently wrong - transpiler is the newcomer.

~~~
lowbloodsugar
But we called it lots of other things.

------
dmitriid

        lib/%: src/%
            mkdir -p $(dir $@)
            babel $< --out-file $@ --source-maps
    

Just look at all those magic things. The percent signs! $<! $@!

Well, I know they are not magic ;). But why would I want them when I can
actually use normal names like "deps"/"entries" and "target"?

It gets substantially worse as we go down the rabbit hole. Where webpack can
easily walk the entire dependency tree by itself, we have to invoke

    
    
       src_files := $(shell find src/ -name '*.js')
    

Where we can use the same webpack to seamlessly output the resulting file into
an output directory, we need to do the (very unintuitive) pattern substitution:

    
    
       transpiled_files := $(patsubst src/%,lib/%,$(src_files))
    

or even

    
    
       flow_files := $(patsubst %.js,%.js.flow,$(transpiled_files))
    

And when we want to watch for changes? Well, we need an external program
anyway.

    
    
       $ yarn global add watch
       $ watch make src/
    

The art of Makefiles is often lost for good reasons: because Makefiles don't
really cut it anymore.

~~~
dblotsky
"Doesn't cut it anymore" isn't the same as "has a different way of doing it".
Make has many other (necessary) features that Webpack doesn't.

