Time for Makefiles to Make a Comeback (medium.com)
148 points by mpweiher 134 days ago | 113 comments



I've been using makefiles constantly since '76, when I first started programming in (US) 5th grade. It has been one of my career amusements watching build systems come and go. I gave up talking about Make years ago. There is a huge population of us who smile and get things done, while others screw around with new, complicated, never-learned-the-past tools.

I use one of the earliest Make versions that barely does anything beyond recognize rules, rule actions, and simplistic macros. I've seen this version of Make used to build a feature animated film. Witnessing the versatility of that feature film's multitude of software and renders and composites all built through Make taught me Make is the only tool needed for any building of anything.

I remember when MS introduced a Make with extensions, and many developers ran over and fucked up their build environments, starting them on the path to the complex, manual-intervention-required build mess most people have today.

I write my Make films by hand, still. It is that easy. And you really should have a build environment that is that easy. When you need it easy, as in some major deadline and crap is broken, you will thank yourself.


You make a few statements regarding films: First, "I've seen this version of Make used to build a feature animated film" and later you state "I write my Make films by hand".

I suspect the second statement might be a typo -- accidentally using "films" in place of "files" -- but in the case that it's not, I wonder if you could detail the practice of "Make films" -- seems like it might be interesting.


I meant "make feature animated films". During the late 80's, before more 'sophisticated'; build tools polluted developer's young brains, early visual effects were "nothing more" than complex Makefiles triggering renders and composites. All one needed was a facility that looked at the date stamp of two files, and if one was older that meant these series of command lines is executed. Beyond very basic #define style macro substitution, that was all that was used to create many feature film effects, as well as nearly ever single early 3D animated renderer produced animation before 1992.


> I've seen this version of Make used to build a feature animated film. Witnessing the versatility of that feature film's multitude of software and renders and composites all built through Make taught me Make is the only tool needed for any building of anything.

Awesome, tell us more! Which film was this? Or if you can't say, what year was it made? I have to imagine The Makefile Architecture is still pretty popular in VFX pipelines.

I run a SaaS that's made end credits for hundreds of feature films (including "Moonlight"). Our render pipeline uses The Makefile Architecture.


The first time I saw Make used in feature production was reviewing the digital archives of "The Last Starfighter" (1984). This was after years of working in scientific visualization, and skirting VFX. Previously, while working in visualization, Make was king for creating NASA, research publication, and geologic animations. Much later, I saw Make in heavy use when entering the visual effects industry, but that was after another 15 years in the console video game industry. It was an incredibly welcome realization, after the, frankly, incredible cluster fuck and demoralizing experience of 15 years creating video games, to learn that the rational UNIX-paradigm (small simple tools) community I'd grown up in was alive in the VFX industry. Returning to the 'old school developers' I'd left in the late 80's was reassuring, and it was good to learn the tech world had not been completely brainwashed.


> There is a huge population of us that smile and get things done, while others screw around with new, complicated, never-learned-the-past tools.

They get the things done they've been doing all along. But build metadata is very important and very hard to extract from Makefiles, especially as builds increase in complexity.

You need things like search paths to be accessible in order to get IDEs, static analyzers, software packaging systems, and other third-party tools working with your code base.

So make works great if the only things you're interested in doing are the top-level make commands you define, but it doesn't do you any favors when it comes to other things.


I don't understand your reasoning here. Make can call any program, and contain as many build targets as you would ever need. If some task can't be accomplished in Make itself or existing CLI programs, you write something that does that one task well, then ensure Make runs it whenever needed, even if that's every build.
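
For instance, a rough sketch (the checker script name here is made up): declare the task .PHONY so Make treats it as always out of date and runs it on every build.

    # Rough sketch: a custom single-purpose check wired in so it runs on every build.
    # 'run-my-checker.sh' is a hypothetical do-one-task-well script.
    .PHONY: all check

    all: check program

    check:
    	./run-my-checker.sh src/*.c

    program: main.o util.o
    	$(CC) -o $@ $^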


You could write make rules to do about anything, sure. But you don't want to have to rewrite those rules for every project. You want make modules that you can reuse from project to project. Those are notoriously hard to get right. And when they are done correctly, they are hard to share across organizations, or even within large organizations.

In particular, I write a lot of C++. I want targets for gcov, g++, gtest, dpkg, rpm, clang-format, clang-tidy, cppcheck, clang, CLion, Eclipse, ctags, and include-what-you-use. Am I supposed to maintain each of those targets on each of my repos?

It's simpler to have a standard place with all the needed source files, binaries, search paths, definitions, etc., and just wire that data up to each extra target.


Make's ability to include other makefiles is insufficient? Surely there's a relatively simple structure that would let you modularise tasks then include only the relevant ones per project.
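
Something along these lines seems workable -- the layout and variable names are only illustrative: keep one fragment per tool in a shared mk/ directory and include just what a project needs.

    # mk/clang-tidy.mk -- a reusable fragment, one per tool (illustrative)
    .PHONY: tidy
    tidy:
    	clang-tidy $(SOURCES) -- $(CXXFLAGS)

    # A project's Makefile: define its variables, then pull in the fragments it wants.
    SOURCES  := $(wildcard src/*.cpp)
    CXXFLAGS := -std=c++14 -Isrc
    include mk/clang-tidy.mk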


I also write C++ and am not aware of how to comfortably automate all that with a single build tool. Is it some sort of cmake extension? Can you please share your setup?


Some of those are built into cmake; you can google "cmake clang-tidy" to find some examples, for instance. The rest can be added as custom targets and/or custom configuration options.

The point is that you can write reusable cmake modules you can ship with your package manager or even git and a tweaked cmake search path. You can get there with make, but it's more work and a bit of a pain to replicate and maintain. In fact I often see teams with those setups put too many things in their repos to avoid build tool pain. Or, perhaps more often, they just give up on ever getting clang builds to work (for instance).


Making search paths accessible to other tools would have been just as well served by standardizing search paths. Or, sadly, by following already-standardized search paths.

Instead, the new tools came and each one would buck any preceding trend and introduce a new way of doing things. Often poorly. All the while introducing yet another new way of executing the actual build.

Let me preempt this by saying that Make is not perfect. I genuinely feel that many things could have been improved on. However, I also question how many lessons learned in Make were carried forward, versus how many mistakes were just reintroduced and hit again and again.


Which version of make do you use? The one packaged in the 'heirloom' setup, or something different that's still available?


I have a theory that the people who love make are the same people who love macros and the people who hate make don't love them.

I wrote a book on make and I fully recognize that there are people who look at what I've written there and think I'm completely mad: https://www.nostarch.com/gnumake

Make's greatest irony is that it's a system for building files based on dependencies but has no way of actually discovering dependencies. I wrote about this for Dr Dobbs a while back: http://www.drdobbs.com/architecture-and-design/dependency-ma...
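
The usual workaround, at least for C, is to make the compiler do the discovery and feed the results back in; a common GNU Make + GCC sketch:

    # Let the compiler discover header dependencies: -MMD writes a .d file per
    # object, listing the headers that were actually included.
    SRCS := $(wildcard *.c)
    OBJS := $(SRCS:.c=.o)
    DEPS := $(SRCS:.c=.d)

    prog: $(OBJS)
    	$(CC) -o $@ $^

    %.o: %.c
    	$(CC) $(CFLAGS) -MMD -MP -c $< -o $@

    # Pull the generated fragments in; the leading '-' ignores them on a first build.
    -include $(DEPS)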

And make has deficiencies that are a nightmare to work around and get right (e.g. spaces in filenames, recursion, ...)

If anyone else out there is a make hacker like myself they might enjoy the GNU Make Standard Library project: http://gmsl.sourceforge.net/


(disclaimer: I like make but not macros)

> And make has deficiencies that are a nightmare to work around and get right (e.g. spaces in filenames, recursion, ...)

Yes! we seem to be stuck in a local maximum with make. I wrote my own make replacement, as have many before me, and learned the folly of my own ways.

Nowadays I use (and love) make despite its terrible handicaps, but I really wish there were something fundamentally similar but a) less C-focused and b) containing many fewer footguns.


Shake is similar to Make with some improvements (it is a Haskell DSL), I have not used it, but it looks good.

http://shakebuild.com/


If you compare Makefiles to the JavaScript ecosystem, they will compare favorably, but many things will.

Make is a good build system but it is a shitty deployment system. Yes, you can do everything you want if you put enough effort in it as it is a complete scripting system, something many alternatives are not.

It does not mean that it is a good idea to do so.

Recently a client made me begrudgingly try Maven. 'Yet another make system,' I thought. Yup, pretty standard one: 'mvn install' will compile everything and will also do something make has never done for me: install dependencies.

Sure, it is doable to do this with a makefile. With a lot of effort, multiplied by the number of platforms you target. In Maven you just add repos and dependencies to your pom file.

So yes, there are recent build systems that are worse than Make, but it does not mean that make is the perfect system with all the features we need.

It is minimalistic. It works. But it should not be your only option.


That last bit is really key. Nearly every modern build system is also a dependency management system, something that really isn't possible using Make. There have been tools introduced to address that particular need for C/C++, but they integrate poorly if at all with the build system, so it turns all your builds into (at least initially) three-step processes: run the dependency installer, run the config tool to figure out exactly which flavor of C compiler/linker/etc. to invoke, then run make to actually build the thing.

It's also possible to do some truly awful things with make. I once worked on an embedded device whose OEM-provided SDK was written using a truly horrific makefile system. There were no fewer than 80 makefiles all tied together to build an entire system image, including the building of the Linux kernel, all the bintools, and a bunch of other custom stuff the OEM provided. Trying to understand, much less modify, any part of that build system was an absolute nightmare, and it's one of the few times I've actually wished they had used shell scripts instead of make for some of that stuff.

As for Maven, it was quite a pioneer in its day, but it hasn't aged well, in particular being written in XML, which just adds needless verbosity. For a more modern take, look at Gradle, which provides everything that Maven does (and uses Maven's dependency system), but is a lot more flexible because it's a DSL for project building instead of a declarative system like Maven.


I think the declarative model is why I prefer Maven over any other build system I've worked with. Yeah, XML is a little noisy, but if it's really that big of a deal there's the polyglot plugin. Maven POMs are pretty easy to figure out once you've worked with them for a while; there's a learning curve when you start adding plugins and have to get at least a basic understanding of build phases, but a large chunk of projects need nothing more than the default plugin bindings: they just add their dependencies and are off to the races.


redo is mentioned elsewhere in this very discussion.

One of the things that Daniel J. Bernstein did in his packages is merge steps two and three. There would be makefile/redo rules for detecting the presence of operating-system-specific stuff, and linking the appropriate source/header files into place.

Here's an example from ucspi-tcp that auto-configures whether the platform supports waitpid():

* https://github.com/comotion/ucspi-tcp/blob/master/Makefile#L...

* https://github.com/comotion/ucspi-tcp/blob/master/choose.sh

* https://github.com/comotion/ucspi-tcp/blob/master/trywaitp.c

* https://github.com/comotion/ucspi-tcp/blob/master/haswaitp.h...

* https://github.com/comotion/ucspi-tcp/blob/master/haswaitp.h...

When I extended redo to the whole of djbwares, this system was fairly simple to convert. Here is haswaitp.h.do, which is what redo ends up running whenever something reveals a dependency on haswaitp.h:

    #!/bin/sh -e
    redo-ifchange trywaitp.c compile link
    if ( ./compile trywaitp.o trywaitp.c trywaitp.d && ./link trywaitp trywaitp.o ) >/dev/null 2>&1
    then
            echo \#define HASWAITPID 1 > "$3"
    else
            echo '/* sysdep: -waitpid */' > "$3"
    fi
* http://jdebp.eu./Softwares/djbwares/


> Nearly every modern build system is also a dependency management system, something that really isn't possible using Make

The BSD Ports tree provides exactly this functionality. All package dependencies are provided via standard make variables and rules, enabling recursive resolution of all package dependencies.


Make isn't intended to be used for dependency management. With traditional `./configure && make && make install`, dependencies are detected by the configure script, or manually provided by the user if they have custom library paths. Dependencies are handled by your package manager or something else (be it submodules in git, custom scripts, etc). There's a separation of concerns.

For generating those configure files, Autoconf is extremely powerful (and really not that complicated once you have a basic understanding). If you use Automake, you get all the standard targets (including `make dist`, which will build your distribution archive, as is being done manually in this article) generated for you.


That's a rather dated view of what a build system should be. Most developers these days don't want to have to manually install all of a project's dependencies in order to build it. More importantly, there's also been a push towards, wherever possible, installing dependencies into sandboxes to try to prevent things like DLL hell, which isn't possible if you install dependencies at the OS level. I'd much rather just check out a project and run its build system to fetch all the dependencies instead of what we used to have to do with C projects, which is go read the docs to find the list of mandatory and optional dependencies, then go look through the distribution package manager to try to match up package names with library names (including a few head-scratchers where something is included inside of a differently named package), and finally install the whole mess and hope that the project's config script properly finds them all.


> That's a rather dated view of what a build system should be

The author mentions JavaScript, which has npm as the package manager, which isn't a build system. It can _invoke_ a build system, but the build output produced is separately published to npm, or the build is triggered by e.g. a post-install hook.

I use GNU Make with npm. Some use e.g. Grunt. Etc.

> More importantly, there's also been a push towards wherever possible installing dependencies into sandboxes to try to prevent things like DLL hell, which isn't possible if you install dependencies at the OS level.

Make has been used this way for quite some time. Debian has its reproducible builds project. GNU Guix builds everything in an isolated environment, and even handles multiple versions of multiple packages---if you have five different packages that each depend on five different versions of the same library, then it'll install five different versions of the library and work just fine.

> I much rather just check out a project and run its build system to fetch all the dependencies instead of what we used to have to do with C projects

Make is just _part_ of a build system---it isn't necessarily one in itself. Some projects might use it exclusively in their build system, but others use a suite of tools. I use Autoconf (which generates configure) and Automake, which in turn generates a Makefile with all standard targets. If you check out a git repository for one of my JS projects, you run `npm install` to get the dependencies before building. Usually they're runtime dependencies, though, so they're not needed for the build. Other dependencies are detected via the configure script (e.g. I use graphviz for a certain project, and the script makes sure it's installed and supports the feature I need).


For the most part, JavaScript doesn't need a build system, so from a practical standpoint npm is its build system. Most JS projects shouldn't require anything more complicated than npm install to put them in the proper location. Now you can argue for something like a minifier, and things like Typescript add a wrinkle (as it actually does need a build system), but all that should still be able to be handled by npm.

I'm curious how you would use Make to install dependencies. I'm aware of Guix, but that's probably an extreme example, most people probably aren't willing to sandbox literally their entire OS install, plus it's entirely unreasonable to require a particular OS in order to handle dependencies of your project. It shouldn't matter what OS I'm using (within reason), the project should have an automated way to install any dependencies it requires.

Here's the thing with Make: even if used only as part of a build system, it's both too complicated and not complicated enough at the same time. The makefile format, when used very sparingly, isn't too bad, but once you pass needing more than 2 or 3 rules it gets unwieldy, and any significantly sized project will need more than 2 or 3 rules. Using autoconf and automake just proves the point: now you're using a build system to build your build system. Autoconf in particular is way too complicated, and an absolute nightmare to try to extend or customize.

Instead of using a tiny little sliver of make as essentially build system glue to connect the actual build tools together, why not just use slightly more powerful build tools and get rid of make all together?


You wrote:

> I'm aware of Guix, but that's probably an extreme example, most people probably aren't willing to sandbox literally their entire OS install, plus it's entirely unreasonable to require a particular OS in order to handle dependencies of your project. It shouldn't matter what OS I'm using (within reason), the project should have an automated way to install any dependencies it requires.

Guix is just a package manager, not an operating system. The Guix System Distribution, on the other hand, is a GNU system. You can use Guix on any GNU+Linux system; you do not need GuixSD for that.


JavaScript needs a build system: https://tomdale.net/2017/04/making-the-jump/


Author here. Great feedback! Thanks for taking the time to share it :).

If I were creating a JavaScript library package that I was just going to push up to npm, I don't think I'd use Make either. Yes, TypeScript adds a wrinkle, but like you mention, that can all be handled by npm scripts as well. If it was just a JavaScript library, I'd definitely just use npm scripts and call out to eslint and other tools just fine, and not be any less happy.

However, I think big front-end applications are starting to become more and more complex and can benefit from this. Part of the reason is perhaps the lack of functionality around incremental builds in the compilers themselves (thinking less/sass/minify/uglify/etc.; this is where I've seen people usually fall back to using Gulp). As the application becomes larger, I think the lack of incremental builds becomes more and more of a tax on the developer because the builds become longer and longer (I've worked on several projects like that).

I think perhaps the biggest impact on this project where Make is used had more to do with the modular project architecture than anything else. If it weren't for that, I'm not sure I'd ever even think of Make. And frankly, I think a modular project structure in a single github repository like this project uses is most likely not the right solution (or even a good idea) for a vast number of JavaScript/TypeScript projects out there.

When the article talks about dependencies, it is more talking about dependencies within the project itself (due to the modular architecture). Before I switched it over to Make, it was using a combination of preinstall, postinstall, and custom "prestart" scripts to do everything in the right order, because it wasn't as simple as using `npm link` or doing a single npm install in the root of the project. These various scripts became a total mishmash of different approaches, and it became increasingly difficult to understand and visualize what was being built when and in what order.

It wasn't uncommon to run into issues where one project would be `npm install`'d before its dependencies were actually ready to be used (since they hadn't been compiled and packaged yet).

With that said, Make in this project doesn't install dependencies in the sense that I think you are discussing. This project is most definitely a Node.js project and uses npm for all that stuff. It's just within the project itself, there are sub-projects that are NPM packages themselves that can be built and deployed separately. That's where it began to fall down. But the project still most definitely uses "npm install", "npm start", and such. It's just the build component of issuing "npm run build" just shells out to "make -r" to build everything in the right order, in an incremental fashion, and to get it where it needs to be.

There are many reasons behind this modular architecture that I never went into in the post, for good reasons. And like I said, I would be sad if lots of projects thought all of a sudden it was a good idea to do it :P.

All that said, over the last 10+ years, I've grown increasingly tired of new frameworks coming out every 18 months (or less) that simply re-invent the wheel, but do it in a different way. I think there are plenty of situations where Make would perfectly suffice yet people immediately pull in a code-based build tool with tons of dependencies because it's what "everybody is using now."


Well, ./configure does not install dependencies either. Even worse: it fails at the first missing dependency instead of listing them. Getting an automake project to compile is actually a long and manual process.

Just today I had to compile an older version of GIMP, just a 2-year-old one. Configure. Fails. Apt-get this. Configure. Fails. Apt-get that. Oh no, configure doesn't want obscure-package4, it wants obscure-package2, which has been removed from your distribution's repos. Arrrrrr! Truth be told, I just rebooted into Windows and downloaded a binary from an archive.

With Maven, I can just tell it which release I want and it just fetches it and all its dependencies, compiles them and installs them. Full auto. If I add a lib to my project, it adds it automatically, and the intellisense works, the source navigation works.

Make is years behind.


> Even worse: it fails at the first missing dependency instead of listing them.

Yes, this is a fair criticism. Fortunately, many projects state their dependencies. But not all do. If I'm already on a Debian system that has another version of the package, `apt-get build-dep' is very helpful for getting most of the way there. Of course this is completely outside Autoconf.

Whether it fails immediately or not depends on how the author writes `configure.ac'. Usually `AC_MSG_ERROR' is used right away, because it's convenient. Instead, some authors choose to set a flag and fail at the end.


Why wasn't

    apt-get build-dep gimp
or

    aptitude build-depends gimp
the easy answer?


Yes, Make does one thing and does it well.

Having said that, if you really want to manage dependencies with Make, it's still trivial to do a curl/yum/rpm install when some component is missing. I've done that in the form of a `make deps` pseudo-target, to prevent doing anything that the user doesn't expect.
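
Something roughly like this -- the tools and package names are only illustrative:

    # Illustrative 'deps' pseudo-target: install missing components on demand,
    # and only when the user explicitly asks for it.
    .PHONY: deps
    deps:
    	command -v protoc >/dev/null 2>&1 || sudo yum install -y protobuf-compiler
    	command -v jq >/dev/null 2>&1 || sudo yum install -y jq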


> it's still trivial to do a curl/yum/rpm install when some component is missing.

Oh yeah? Which one? curl, yum, rpm, apt-get? All of them? Which revisions are you targeting? Which package names? And that only covers Linux. A lot of apps need to be deployable under Windows and OSX too.

If you start being even slightly cross-platform, you soon have your own build system.


Yes, having a separate target that isn't automatically invoked unexpectedly is important---traditionally the user doesn't expect network requests during a build. Environments that build in isolation with no network access would fail as well.


> Make is a good build system but it is a shitty deployment system.

Because you really should not deploy things with your build system. It's today's tooling that got this backwards. For deployment you have much saner tools, like package systems.


But only 'git pull' is web scale!!!!

(sigh)


Eh, different strokes. Every time I have to use Makefiles I yearn for the comparative flexibility and simplicity of the JS ecosystem.

Makefiles just seem good to their users due to experience. They work, but let's not pretend they're easy to work with.


> simplicity of the JS ecosystem

[citation needed]


Citation: I work with it every day. You can make it complex but that’s true of everything.

It’s not hard or complex to get working initially, or maintain, which is not true of makefiles


> It’s not hard or complex to get working initially, or maintain, which is not true of makefiles

Again, [citation needed].

I use Makefiles in a variety of projects:

- building Vagrant base boxes

- building environment-specific configuration for LAMP-ish CRUD apps

- testing and building a range of shell-script based packages

Does Make have some warts? Sure, everything does. I'm not claiming it's perfect.

But it's a far sight clearer - to me at least - what's going on with a Makefile.

Makefiles also don't need to be replaced by something new in 6 months because the entire community has decided that it's time for a new tool to do basically the same thing, solve 1 problem with the old tool, and introduce 5 new ones.


Make provides awesome UX: in most cases you just type "make" and stuff just works. But there is a tradeoff: it's nearly impossible to write a cross-platform makefile. Why, you ask? Try copying a file (or directory) from one location to another. Sure, "cp", but its -R is different on Linux and macOS! And then comes Windows -- there is no good built-in alternative to "cp", as it's hard to make "xcopy" ignore failures if the target file/directory already exists (it tends to ask whether you want to overwrite it or not). WSL might fix that in the future, but currently it's quite slow on the IO side of things. So, sure, given enough time and kung-fu we can write cross-platform makefiles ... but if even copying one silly file is such a PITA, it's hard to advocate for make in 2017. Probably the wisest solution is just using make to run something more comfortable; for C/C++ it can be cmake, premake, tup, etc. But then again: if I need to add something, like another build step, do I add it to the tool of choice or to make? It's a very hard balance to sustain.

PS. Copying files specifically is such a huge PITA that cmake implements cross-platform copy for you; just run "cmake -E copy $in $out" (can be added as add_custom_command in CMakeLists.txt).


If you want to write a portable makefile, you use a CP variable which is conditionally set depending on what uname says, e.g. cp -r on GNU/Linux and whatever it is on Mac OS. But also, copying files around is a bit out of make's scope, as it is a tool for transforming files. Most standard tools used in makefiles (sed, awk, m4, the C toolchain) have POSIX interfaces, and other compilers usually don't have incompatible competing implementations. Even if you want to use a certain implementation's features, say GNU features of awk, gawk + GNU coreutils are available for all popular systems, as are the BSD utils. Requiring them is better than every program requiring its own build system, because they are portable and reusable packages.
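
For example, a sketch of that idiom using GNU Make conditionals:

    # Pick a copy command based on the host OS (sketch of the idiom above).
    UNAME := $(shell uname -s)
    ifeq ($(UNAME),Darwin)
        CP := cp -R
    else
        CP := cp -r
    endif

    build/assets: assets
    	mkdir -p build
    	$(CP) assets build/assets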


Maybe projects should have their own ./bin folder for make to use, and have those commands refer to the right native command.

mixing make with a modern scripting language would make this possible.


GNU Make has built-in support for Scheme (via Guile).


Having put together a build system for building a modular OS using make: no, it's really not time for a comeback.

Make is simultaneously powerful enough to seem pretty good for use, but not powerful enough to do anything outside its very narrow problem domain without really ugly hacks.


Agree. God forbid you have to maintain cross-platform compatible makefiles either.


For a better "make" try "do" (also known as "DJB redo").

It's designed by Dan Bernstein of crypto fame and implemented by Avery Pennarun now at Google.

http://apenwarr.ca/log/?m=201012#14

I will contribute $100 to any Rust leader who wants to start coding "do" in Rust.


This seems like a prime moment to ask for a review of https://github.com/sagebind/rote


Rote is a great accomplishment. IMHO it's superior to Make, Rake, Cake, etc. Thank you for writing this!

If you're curious, take a look at Do. It's like all of the above, yet flipped over on their heads. The value of Do is in the design, which is essentially the opposite of Make. Do leverages the composability of items, files, scripts, artifacts, and typical Unix pipes.

I would be very interested in your opinions about Do, and if you want the $100 to jumpstart working on Do in Rust, let me know how to donate to you.


+1 for do/redo. Although it can be a bugger to get right for Go (because there are no intermediate files) (although I think I've solved that now).


I looked at your profile, but didn't see any clues. Please share?

How do you use redo for go projects?


[looks at pending blog posts guiltily]

https://bitbucket.org/rjp/can-x-win/src is a good example - basically, apart from `all.do` containing the targets, I have a `default.do` which looks for a matching `*.od` which has the DEPS and then passes those to `go build`.



I wish there was something as standard, ubiquitous, fast & light, dependency-less, and adopted by the whole Unix ecosystem and beyond as make is now, yet better (= less archaic, more "write once, works everywhere", a uniform single way of depending on other projects, focused on the actual code/bins/libs and their dependencies instead of specific packaging for specific distros, ...).

Not some "hype of the day" build system of course as that by definition makes it a short lived maintenance burden rather than universal.

While make is universal, the situation with makefiles now is so bad that even if you write pure C90 code with zero dependencies (which should work everywhere), users will still find reasons why you should add all kinds of platform-dependent stuff, or things for a particular distro, to your makefile...


Nonononono. There's a reason makefiles have been reinvented 400 times: they're a nightmare. I don't love the alternatives, but they are pretty much all universally better.


Dependency hell is one of the main reasons why developers are flocking to bloated solutions such as Electron: a single monolithic dependency with guaranteed cross-platform behavior.

Most build tools are stuck in the 70s-80s, when a single library weighing hundreds of kilobytes was a big thing. Saving a few kilobytes versus spending hours of a developer's time hunting down old versions of libraries is a no-brainer, especially when you take security into account. I've seen .NET developers just give up in frustration and download DLL files from dodgy download sites and put them into production distributions.

I personally loathe make. I've never had a good experience when using it.


I just got done watching a video called "make for reproducibility in science", which made a lot of sense. The impression I get is people do weird things to make and then hit edge cases and complain.


> The impression I get is people do weird things to make and then hit edge cases and complain.

Like not handling file paths that have spaces? [0] That's one hell of an edge case.

[0] http://savannah.gnu.org/bugs/?712


I'm speechless at that link. I don't understand why something so basic isn't fixed yet in such an important package. I guess I can see why qmake/cmake have been eating the market share now.

For what it's worth, I am pretty sure one of my first BSD systems forced me into using snake_case, so that would explain why I never hit that limitation. That said, with make being designed originally for C, C also being known for snake_case, and early Linux also discouraging spaces in names, I can see how this happened in the first place. What I don't understand is the lack of an implemented fix.


Discover make, realize what a bunch of dumb hipsters everyone is who created build systems after it, try to use it, see how deficient it is for the problems you're trying to solve, realize why those dumb hipsters made all the stuff they made, repeat. If Chesterton's Fence were a build system.


I'm confused

So it's agreeable that the build pipeline of npm calling some packer does replace any-other-language calling any-other-packer, but what is the point in replacing the role of `npm build` with a makefile that just calls what npm would have called anyway?

replacing make with npm with make-calling-npm?


I wrote a bit about an approach I'm using, A Touch Of Make (ATOM), which helps provide better developer UX across teams - particularly ones working on microservices.

https://www.alexhudson.com/2017/04/26/articulating-atom-appr...

You're right, replacing "npm build" with "make build" doesn't win you anything. But that's only true on the small scale. In a service world, there are lots of projects, each with different requirements. Some will have front-end, some won't. Others will require a totally different build process. There will probably be different languages involved.

Using make, you can standardise a lot of this. If you set up a coding standard that after you clone a repo, "make dep" should grab anything the project requires, then developers don't initially need to know whether that's calling out to npm or composer or pip or whatever - it's just "working".
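
So two services in different ecosystems can expose the same entry points while delegating to their native tools underneath -- a sketch, with the target names following the convention above:

    # Node service's Makefile (sketch)
    .PHONY: dep build test
    dep:
    	npm install
    build:
    	npm run build
    test:
    	npm test

    # Python service's Makefile (sketch) -- same targets, different tooling
    .PHONY: dep test
    dep:
    	pip install -r requirements.txt
    test:
    	pytest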

This is a much bigger win when you have a number of projects. The process is standardised, so developers know it's only a two or three step build (or whatever you've setup). They know how to do it, and when they need to look under the covers, they can see how it works, and this knowledge is transferable from one project to another because Make is universal.

I don't advocate doing big Makefiles - that's why I call this approach ATOM; only use a touch of Make. But used judiciously, it smooths out projects very nicely. Not everyone needs that, though.


I tried this approach. It never works in the long run. The Python developers wanted Make for the few commands they needed in their projects. The JVM folks had Gradle. The Python folks refused to learn Gradle (because of some argument like: rah rah rah JVM tools hard; Make easy! Make all you need. If Make isn't enough your language is garbage).

So the JVM folks created a crappy Makefile that accomplished some of the basic stuff to appease the Pythonistas. Meanwhile the JVM folks continued to use Gradle directly and the Python folks who had to work on the JVM projects got a shittier experience through the Makefile. As new tasks were added to build process the JVM folks would just write custom Gradle tasks to accomplish the job which would drive the Python folks nuts because they wanted shell scripts or make targets invoking shell scripts or whatever.

My only takeaway from this experience was that developer tooling sucks and there is no one-size-fits-all approach to build systems. Try to stick to a convention within an ecosystem (e.g. all JVM projects should use one of Gradle/Maven/Sbt), but don't try and get cute with making stuff more common than it has to be.


You say the Pythonistas refused to use Gradle, but it also seems the JVM peeps refused to let go of it?

Isn't that the problem? Isn't Make more general than Gradle? If the project was primarily JVM, why couldn't the Pythonistas be forced to use whatever build system was mandated?


You'd have to reinvent a lot of wheels to replicate what Gradle (or Maven) does. For example dependency management, env setup and packaging. It would be similar to asking a Python developer to give up pip, setup/disttools and maybe virtualenv. In the end you want productive devs. Stripping away the tools they are productive with is counterproductive.


But surely these are targets for make?

e.g a pip target that builds a python dep folder. You can do the same for Java with Apache Ivy.

Is Gradle not a make like system with a lot of JVM-specific functionality built in?


I agree. I increasingly use makefiles with a small set of common targets. I know I can do "make build" in all my repos, whether they're C or Ruby and whether they're packaged as a docker container or directly. I know I can do "make run". I know if they're in a container I can do "make cli" to get a shell inside that container, and so on.
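
A sketch of what one of those thin Makefiles can look like for a containerised repo (the image name is hypothetical):

    IMAGE := myorg/myservice    # hypothetical image name

    .PHONY: build run cli
    build:
    	docker build -t $(IMAGE) .

    run: build
    	docker run --rm $(IMAGE)

    # 'make cli' drops you into a shell inside the container
    cli: build
    	docker run --rm -it $(IMAGE) /bin/sh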

The Makefile then acts as documentation of exactly what to do. All of the setup stuff that people often put in a README is right there.

All of the complex stuff gets put in other tools or separate scripts. But there's probably going to be a make target that tells people which tool and where to look if they need to change it, or that'll let them run it without having to know the details.


>Using make, you can standardise a lot of this. If you set up a coding standard that after you clone a repo, "make dep" should grab anything the project requires, then developers don't initially need to know whether that's calling out to npm or composer or pip or whatever - it's just "working".

In my experience you will end up with a dozen coding "standards" and project structures. Especially in C and C++, where header dependencies have to be generated by the compiler, even the most basic makefile will take you at least an hour to create, and that's if you already have experience writing makefiles.


Comes down to how you enforce it. To be clear, I'm not talking about mediating the build in Make unless that's the best tool for the job. On a C project I would use cmake, but within this system I'd run it out of a Makefile at a top level, so the person pulling it can just run 'make build'.

The point is to have a known starting place, so that when you pull down a new project you don't have to spend ages reading about how it works.

Even if you can't grab the deps automatically, doing 'make build' and having it say "Hey, you don't have cmake and a bunch of other things you need, but go look at http://whatever.. to set yourself up" is a much better experience.


I haven't done something similar with npm, but I did write makefiles to call specific commands. For example, a particularly long activate function very specific to internal tooling. Doing this makes sense for such commands and not for primitives.


I agree; why add make (which version?) as a dependency when you can make sure that the project can build with just node installed?


What do people think of 'tup'?


It's my favorite build system. A big advantage is automatic dependency tracking. It picks up unspecified inputs and automatically rebuilds anything further down the dependency tree. It also knows how to clean up files that it no longer generates since it understands you removing and adding build steps.

I can do huge refactors and it cleans up files without any additional work from me. The filesystem watch automatically builds things so I never have to run it directly. I use it on my production server when deploying since it's 100% accurate at incrementally moving between different project states.

There are definitely some disadvantages, you have to work around its syntax and limitations. For complicated build functions I've written the config in Lua and it works fine.


I tried out Tup, and thought it was really impressive and probably great for people who need to write C, but not suited for what I was trying to do.

The project I tried to use it on was a before/after comparison of the results of some development work to an energy model. So, a couple of my make targets cloned old and new versions of the repo.

Tup doesn't like you to create directories, and wants to track every file that comes into existence and check that it is either a) a declared output or b) deleted by the end of the job. It really didn't enjoy trying to process a whole Git clone operation.


I really like it, but I don't know if it's worth the hassle. For example compared to ninja it isn't that much faster and it's missing a good configuration system anyway. Definitely better than Make though.


Friends don't let friends write recursive makefiles.


It seems like each of the folders with Makefiles also has a package.json, suggesting they are self-contained packages under the same repository. Still, it seems like a questionable move.


Friends don't let friends write Makefiles at all. The author does not appear to be familiar with them, except for the trivial examples in this post.


The types of trivial examples in the post are exactly what you should use makefiles for. make excels at managing simple file transformations: you get into trouble when you try to do complex cross-platform scripting with it.


Every folder gets its own little makefile. Calling those recursively shouldn't result in trouble?


You might like "Recursive Make Considered Harmful", Miller.

http://aegis.sourceforge.net/auug97.pdf

Although I suspect the main complaint (slowness, due to checking directories where nothing has changed) has mostly been swept under the rug by increased CPU speeds, some of the other items (managing dependencies, ordering, ...) are still worth considering today.
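
The alternative the paper argues for is a single make invocation that includes per-directory fragments, so the whole dependency graph is visible at once; a rough sketch (the layout is illustrative):

    # Top-level Makefile -- one make invocation for the whole tree (sketch)
    MODULES := lib app
    all: app/prog
    include $(patsubst %,%/module.mk,$(MODULES))

    # lib/module.mk (included, not recursed into)
    LIB_OBJS := lib/foo.o lib/bar.o

    # app/module.mk
    app/prog: app/main.o $(LIB_OBJS)
    	$(CC) -o $@ $^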


Tup solves this: http://gittup.org/tup/


Author here. Thanks for the great comments, everybody. I'm honored this post even got attention here (_long_ time reader and lurker). I'm glad to see a bunch of constructive feedback that helps me clear up my ignorance around Make, since I'm lacking in experience compared to other folks (okay, really, I'm a Make noob).


Yeah, I'm writing a C build system using just shell + make, and it works very nicely for learning C programming and trying out new ideas.

https://github.com/junjiemars/nore


After recent experience working with various task runners such as Phing and Robo (plus others), my previous employer eventually returned to simple Makefiles, and I in turn have done the same. For my needs of generating small website builds, Make is perfect.


TIL Make is not used widely anymore.

I may live in a bubble, but my last use of it was… maybe 2 hours ago (to build someone else's project) and yesterday (to build one of my own projects).


It’s used for basically most C/C++ projects and somewhat outside of that.

It’s not been adopted (with good reason) in many newer language communities, so depending on your tooling and platform you may see it less.


Yup, if Make were used extensively for JavaScript, I wouldn't have had to file this feature request:

https://github.com/webpack/webpack-cli/issues/152


If you could see our Makefile, which can do literally _everything_ with the project, you would not have written this article. :)

Makefiles don't scale, and they're hard to learn because of the awkward syntax.


That's the thing. The concept of Makefiles is great. The syntax is pretty terrible.

GNUstep's Makefiles are actually really nice.


We have been using make for build and test scripts for docker+Node apps. I have no complaints after a couple years.


> Makefiles are simply a declarative way to transform one file (or series of files) into another.

    	echo it certainly is not > side_effect.txt


I used to be a big fan of using Makefiles for web development, but I have since changed my mind, because make is not a very good fit for watch-mode incremental building.


Interestingly enough, I tried using Makefiles for a web project yesterday, and it was a success.

    SOURCES = index.pug layout.pug style.styl main.coffee privacy.md
    OBJECTS = index.html style.css main.js privacy.html

    all: $(OBJECTS)

    watch:
    	http-server &
    	while true; do \
    		$(MAKE); \
    		inotifywait -qq $(SOURCES); \
    	done

    %.html: %.pug
    	pug -P $^

    %.css: %.styl
    	stylus $^

    %.js: %.coffee
    	coffee -c $^

    %.html: %.md
    	markdown $^ > $@

    clean:
    	rm -rfv $(OBJECTS)
This could be extended to produce an out-of-source build by using `-o "$(PUBLIC)"` on the command line generators, but I didn't need it for this project.


Do you know about the ; style for rules that have only one line of commands? It helps reduce the number of tabs.

    %.html: %.pug ; pug -P $^
I find it super useful for compact Makefiles:

    %.html: %.pug    ; pug -P $^
    %.css:  %.styl   ; stylus $^
    %.js:   %.coffee ; coffee -c $^
    %.html: %.md     ; markdown $^ > $@
    clean: ; rm -rfv $(OBJECTS)
Also, worth making clean .PHONY since it's not a file.


Sure, that's the traditional way of doing things, and I like its simplicity and the fact that the 40 years old core concept is still so useful.

But most of us want to require() modules in the browser too, so you'll need browserify or another bundler anyway, which already has a cross-platform watch mode & plugin ecosystem and is expected to run continuously, and suddenly the Makefile feels rather redundant, especially with npm scripting.


I'd recommend adding:

    .PHONY: watch
    .PHONY: clean

In the appropriate places.


The data model of makefiles is perfect for watching and rebuilding. All you need to add is something that monitors the source files for any changes and runs 'make'.


You lose incremental builds of modern bundlers/transpilers, which makes your build time too slow for bigger projects. People nowadays expect to see changes almost in realtime using hot module replacement etc.


That's because those tools aren't designed to play well with others. It's not due to the design of make.


I would say it's partly because of the design of make.

A Makefile consists of separate commands¹ and is heavily file-based¹, which makes it rather slow and doesn't allow keeping state (in memory) between re-runs. Traditionally this hasn't been a bottleneck, because compiling C code was a relatively slow operation, but modern web development tools prefer to work with in-memory streams instead of invoking executables for tiny files on disk.

I'm not saying you cannot do things with make, but in the NodeJS/web ecosystem it just doesn't feel as natural/flexible as the "native" NodeJS-based toolchain.

[1] Great for interoperability, but sometimes a more tailored solution is worth it.


Launching a process on Linux only takes about 2ms, and make only needs to do it for parts that change. There is no reason you couldn't use a make-based system to get incremental rebuilds with under 100ms latency, which is about as good as most people get with native JS systems.


For js code you often want hot module reloading which is probably very hard to get with make.


That sounds like a great idea. Does anything like that exist already?


Easy enough (assuming $(SRC) is your list of source files and 'all' is the top-level target), 'make watch' could do this:

   watch:
        while inotifywait -e modify $(SRC); do time -p make all; done


If you care for it, fswatch is cross-platform:

https://github.com/emcrisostomo/fswatch


something along the lines of inotify:

http://man7.org/linux/man-pages/man1/inotifywait.1.html

or just use tup

http://gittup.org/tup/



There's "when-changed" in pip.

    pip install when-changed
    when-changed *.tex -c "make all && echo 'done'"
Then you can make a rule in your makefile called "watch" or "build-server" which calls that command, so you just need to write "make watch".


It's trivial to roll your own: https://github.com/gall0ws/watch


>>> Time for Makefiles to Make a Comeback

Until your build chain is f----- by the tab VS space issue of makefiles. Then abandon makefiles again, for good.


This is really just an editor issue. Editors can parse a makefile and do the right thing these days.
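
And if the tab requirement itself is the objection, GNU Make (3.82 and later) lets you pick a different recipe prefix:

    # GNU Make 3.82+: recipes can start with '>' instead of a tab.
    .RECIPEPREFIX = >
    hello:
    > echo "no tab required"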


I really like the make approach but I want a modern updated implementation of the idea. With the domain of web development in mind, my wishlist of requirements looks something like this:

* Written in JS and installable via npm (i.e. runs on Windows, Linux and the other brand)

* 'makefiles' are just JS code.

* Supports using Shelljs for the command scripting parts and being cross platform.

* Supports calling tools in "node_modules/.bin/"

* Supports parallel builds for big modules projects.

* No plugins, promises, streams, async or any other nonsense like most other JS/web related build tools or "task runners".




