The problem I find with Makefiles in Node projects is that Node scripts and tools are slow to start up, and pretty much all of them are built around some run-on-all-files type of setup.
So, for example, instead of running the TypeScript compiler on a single file:
tsc src/index.ts
you are better off running the compiler over the whole repository:
tsc --build ./tsconfig.json
This implements incremental building and dependency tracking automatically.
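For reference, a minimal tsconfig.json sketch that switches this machinery on (the option names are real tsc settings; the `src`/`dist` layout is an assumption):

```jsonc
{
  "compilerOptions": {
    // Persist .tsbuildinfo so unchanged files are skipped on the next run
    "incremental": true,
    // Required for `tsc --build` project references
    "composite": true,
    "rootDir": "src",
    "outDir": "dist"
  },
  "include": ["src"]
}
```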
In this case, having make run tsc on each file individually hurts performance pretty badly. Something like this will not scale well:
src/%.js: src/%.ts
	tsc $<
There's definitely an overlap where each JS tool implements its own incremental building, cache system, and dependency resolver. ESLint, Prettier, npm, and webpack could all benefit from being more Unixy and delegating task management to one tool to rule them all.
Things are changing though, and tools like SWC and esbuild are getting pretty fast. And there are a few attempts at a universal build system in Nx and Turborepo. My dream scenario is a single command interface, similar to Gradle or Bazel, where you just run `tool build` and everything gets set up for you without fuss.
You can rather easily write rules with GNU make that sit at the comfortable middle ground between these two.
E.g. if you have 1000 JS files to generate from 1000 source TS files, you'd "shard" them dynamically by batching them N at a time, e.g. 10 or 100 per batch.
You need to dynamically evaluate Makefile rules into existence for that to work.
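A sketch of what that dynamic evaluation could look like in GNU make (a hedged illustration, not a tested Makefile: it assumes GNU make 4.3+ for grouped targets `&:`, `seq` and `expr` on the PATH, and a `src/*.ts` layout):

```make
SRCS := $(wildcard src/*.ts)
N    := 10
# Start index of each batch: 1, 11, 21, ...
STARTS := $(shell seq 1 $(N) $(words $(SRCS)))

# One rule per batch: all of a batch's .js outputs come from a single
# tsc invocation, amortizing the compiler's start-up cost over N files.
define BATCH_template
batch_$(1) := $(wordlist $(1),$(shell expr $(1) + $(N) - 1),$(SRCS))
$$(batch_$(1):.ts=.js) &: $$(batch_$(1))
	tsc $$^
endef

$(foreach i,$(STARTS),$(eval $(call BATCH_template,$(i))))

all: $(SRCS:.ts=.js)
```

With `make -j`, batches build in parallel, and rules that depend on individual .js files can start as soon as their batch finishes.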
I haven't used tsc, but for most compilers with start-up overhead it matters whether you invoke them once per file. If you can amortize that overhead over N files, any resulting slowdown is generally lost in the noise.
Why do that? Because if you have other rules that rely on all that JS being built, it can be much faster overall.
If some of the files are finished you can use free CPU cores for those follow-up rules, as opposed to waiting for "tsc" to build every last one.
It also makes your build system's UX much better.
I know some think "good UX" and "GNU make" are diametrically opposed concepts, but being able to ask it "what do I need to incrementally generate to make X" for any file is really useful.
That works much better if your build system doesn't have one big "do all the things" step that's essentially reimplementing most of GNU make.
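GNU make can answer that question out of the box. A self-contained demo (file names made up) using `--dry-run` and `--question`:

```shell
# Throwaway project with one rule, to show make's query flags.
dir=$(mktemp -d); cd "$dir"
printf 'out.txt: in.txt\n\tcp in.txt out.txt\n' > Makefile
echo hello > in.txt

make --dry-run out.txt                            # print the recipe, run nothing
make --question out.txt || echo "needs rebuild"   # exit status says what's stale

make out.txt >/dev/null
make --question out.txt && echo "up to date"
```

`--question` is handy in scripts and CI: it tells you whether anything is out of date without building a thing.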
You could, but then you lose most of make's benefits; that way you're only using make as a script bundle. Looking back now, it does align with what the article is proposing. But I'm still a bit annoyed that we don't get proper use of Makefiles, and that we have to reinvent incremental building for every tool.
In my previous job, I used Make for my asset pipeline and the compilation of JavaScript.
At some point, I discovered esbuild and realised that I had to split the file in two: tasks that had to run sequentially and tasks that could run in parallel.
Esbuild still had dependencies, but needed to be run only once for those dependencies.
Still got great results ultimately, going from something like a minute to four seconds.
An interesting alternative is the JavaScript+TypeScript approach that Svelte uses: TypeScript is only used for type checking, and there is no transpilation step for running the app, since all the run-time code is written in plain JavaScript.
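A sketch of that style outside Svelte (file and function names made up): plain JavaScript with JSDoc annotations, which `tsc --allowJs --checkJs --noEmit` can type-check while the file itself runs unmodified under Node:

```javascript
// math.js: runs directly under Node; the types live only in comments.

/**
 * @param {number} a
 * @param {number} b
 * @returns {number}
 */
function add(a, b) {
  return a + b;
}

// tsc would flag add("1", 2) as a type error, yet no compile step is
// needed to execute this file.
console.log(add(2, 3)); // prints 5
```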
I love make because it has simple syntax, is language-agnostic, and provides consistent builds regardless of the tooling used. This is especially visible if you combine it with multi-language projects (e.g., C/Java/JS all in one).
Instead of running multiple commands, I can type "make prepare ship" and the magic starts. Make has its challenges with language-specific tasks, so for those, use language-specific build tools, as true Unix philosophy demands :D. When I need language-agnostic dependency management, make's simplicity is still unmatched compared to modern alternatives. And no, bash scripts can't match this with ease.
I find that https://github.com/casey/just is a great answer to a lot of problems in this thread.
I have a justfile in all my projects now and I'm very happy.
I found just and was about to use it in place of a Python script for preset targets in one of my greenfield projects. I decided not to, since it makes the odd decision of using sh in Windows environments: it tries to use the Git for Windows bash or the MSYS2 one, which is very unwieldy.
I am still in search of a simple build scripting language/system that doesn't rely on any OS shell. Something that isn't as barebones as ninja, isn't as wacky and Unix specific as Make nor is as general purpose as Python. I just want a statically linked executable that I can install in any of the Linux, macOS or Windows versions as a single binary. It should handle the path differences well and should execute things directly without a shell. I basically need a very basic script interpreter.
CMake kind of is that system, but it is too stringly typed and sometimes wacky too. Python kind of works as well, but it has quite a bit of disregard for backwards compatibility and isolation from the surrounding OS, so I'm forced to use solutions like Conda to ensure that a specific version with specific dependencies and their versions is pulled and can be reproduced across different OSes without interfering with the locally installed Python.
No. Not really. Is Make a shell? It isn't. Neither is CMake. Python can be used as one but it really lacks niceties that an actual shell would provide. A cross platform shell with limited capabilities would do the job for me but it is overkill.
I am not really looking for features you find in shells like REPL, job control or very complex structure support. Having targets and being able to execute commands to build them independent from any OS shell is enough.
Same. I think a lot of it comes down to the language you're using. If I'm doing something in Python or Rust, poetry or cargo handle all the magic I would have encoded in a Makefile for a C project years ago. Today I have a justfile with a target like:
build:
    cargo build
and a bunch of other targets for testing, running it, etc.
The anti-make, reinvent-everything mentality has bugged me for decades. I never understood why every language seems compelled to reinvent the build system, especially when the output is file-based.
I’m going to use Cargo as a random example here. With Cargo, I can:
* Download a bunch of dependencies
* Build them with their own specific instructions
* Build all the code in my own arbitrarily complex source tree
* Link it all into a production-quality executable
…all without having to know or care how to do all those things myself.
I know how to do all those things. It no longer sparks joy in me to repeat the boilerplate of doing all them again and again in every project I start. It’s hugely appealing to me for a language to handle all that piffle just as it handles memory management, type enforcement, encoding loops as gotos, and all the rest.
I can write assembler, or machine code if needed. I can also find outdated .o files and rebuild them. But why? Let someone else handle the details so I can write code.
I use a Makefile for my static blog, which is powered by a static site generator. Something I love about this is that I have one place to describe all the different build tools: PlantUML to generate PNG diagrams, Vim to generate syntax-highlighted code snippets, openssl to generate certs, etc. It was one of the best choices I made for that project.
Make is great when working with the languages it was originally designed and intended for (C, C++, yacc, bison, etc). Basically the GNU ecosystem.
It's far less great when working with anything else, like Go, Python, JS/TS, etc.
People want great tools that work well with the way their software and teams work. Seems logical to me.
After all, someone reinvented the build system when they created make to solve the problems of their time. What if they just stuck with csh or plain POSIX compatible build scripts?
I think it's a good idea to have a pragmatic and general-purpose tool for filesystem-based dataflow programming. It makes an excellent glue. I am not convinced that tool should be make. It's got a lot of warts.
It makes sense to me to use a specialized build system for each language. They can be faster and richer than something generic. People can live inside one programming language for decades straight, so optimizing their experience is worth it.
The problem with Make (or any build tool) is that few developers care about how their project is built. They want to focus on writing their code, push a button, and get into testing it. That's a fair enough attitude.
But the problems with build tools are insidious. It's basically trivial to build a small project, so to start with, everyone assumes it's a non-problem and spends no effort on it. Then as the project grows, over months and years, the problems slowly accumulate. Each time some feature or corner case is added, the build gets slower, more complex, and cruftier. Eventually someone decides to take a look inside, and recoils in horror...
But it's not Make that's the problem (or any build tool). It's just the fact that building big projects is a tricky problem, and requires just as much effort to solve elegantly as any other tricky problem.
I worked at a big bank that had a problem with their Make-based build system taking multiple hours (sometimes days) to build some of their projects. They spent millions on trying to replace it with some Java monstrosity. I spent a few months rewriting their Makefiles, and the build times shrank to less than 1% of what they had been.
Make was the best tool for the job, because the job was vast, and had many, many corner cases. Make is a general purpose language, and is fairly unopinionated about how you use it to solve problems. All the "Make replacements" I looked at were far too opinionated to be elegantly bent to the task at hand. They were designed to look simple. That's great when your problem is simple, but big builds are not simple.
I like this; package.json starts to look really ugly when you build up lots of scripts that need to run interdependently: for example some stubbed back ends, the front end, an auth service, etc. You end up with ugly long npm scripts with &&s everywhere. For this sort of thing, though, I'd personally rather use docker compose, where dependencies are explicit and declarative, much like here. You can define
health checks for services with docker compose too.
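For instance, a hypothetical compose file (service names, paths, and the health endpoint are all made up) that replaces a chain of `&&`-ed npm scripts with declarative dependencies:

```yaml
services:
  backend-stub:
    build: ./stubs/backend
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:3000/health"]
      interval: 5s
      retries: 5
  auth:
    build: ./auth
  frontend:
    build: ./frontend
    depends_on:
      auth:
        condition: service_started
      backend-stub:
        condition: service_healthy  # wait until the stub passes its check
```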
There is an even simpler way: just use a bash file with each function being a task. It saves you from the .PHONY hack and the "bash but not really bash" quirks of Makefiles.
Your Makefile is a mix of two slightly different syntaxes, which leads to confusion: when you write an interpolation, is it bash interpolation or make interpolation, and so on.
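A minimal sketch of that pattern (task names invented): each function is a task, and the last line dispatches on the first argument, so `./tasks.sh deploy` runs `deploy`, with plain function calls standing in for dependencies:

```shell
#!/usr/bin/env bash
set -euo pipefail

build()    { echo "building..."; }
test_all() { echo "running tests..."; }
deploy()   { build; test_all; echo "deploying..."; }

# Dispatch: run the function named by the first argument (default: build).
"${1:-build}"
```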
Make was made for systems languages with slow compilation times, where avoiding unnecessary rebuilds and parallelization become crucial features. If you don't need or use that, make doesn't bring anything to the table. If you DO have such a need because your project is big now, you probably also need monorepo workspace management etc., at which point you'd just use a modern tool like Bazel.
I use this all the time, though not called a Taskfile. I recommend changing the shebang to:
#!/usr/bin/env bash
[ "${DEBUG:-0}" = "1" ] && set -x
if [ "${FORCE:-0}" = "1" ]; then set +eu ; else set -eu ; fi
export PATH="$(cd -P "$(dirname "${BASH_SOURCE[0]}")" && pwd)/node_modules/.bin:$PATH"
This will do the following:
1. Use whatever Bash executable is in your path, which is necessary for portability (fixes many bugs)
2. If env var DEBUG is "1", turn on bash tracing
3. If env var FORCE is not "1", die on non-zero return status or unset variables
4. Prepend to the PATH the "node_modules/.bin" path, but find that directory from where this script lives, not the current working directory of wherever you executed this script from
I use Bash scripts as a customizable user interface to other tools. Sometimes that's better than Make (like when I want a customizable deployment frontend for both my laptop and ci/cd, I write a deploy.sh script). But sometimes Make is much better for what I want to do.
Make's weirdness exists for a useful purpose. Using Bash to avoid learning Make's useful features not only abandons useful functionality, but then runs into the weirdness of Bash (which also exists for a useful purpose).
Kids today are so impatient they never take the time to skill themselves up, and end up wasting more time in the long run. The most efficient use of anybody's time is to learn valuable skills once that save them time and frustration later. Read the manuals for tools like Make, learn how to use them, and you will reap the rewards for your entire career. Avoid learning Make and you will have a long career being frustrated by how everything is hard, complicated and time-consuming.
Yeah, just spend the two hours to learn the basics of make because it's already on pretty much every computer out there. Now for the rest of your life all your projects will have a baseline task runner, deployment strategy, infrastructure as code solution, and all around place to store random commands you need to remember, which will be in the repo from the first commit. Of course many projects will require something fancier which is fine. But make provides an 'engine' for all those things which is roughly as ubiquitous as your actual language runtime. You never have to suffer a project without them again.
Make is designed to take a bunch of little shell scripts, give each one an arbitrary name (which can be the output files if you want, but doesn't have to be), and run them. Dependencies are run first, if your script has any of them. Files can satisfy dependencies unless you tell Make that they don't.
It's really not different from a shell script with a bunch of functions that you can call by name, except that Make has already provided the scaffolding for you (including dependency-awareness, tree walking, parallel execution, etc)
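A sketch of make used exactly that way (the tools and targets are hypothetical): each phony target is a named script, and prerequisites give you the dependency-aware tree walking for free:

```make
.PHONY: lint test build deploy

lint:
	eslint src/

test: lint
	node --test

build: test
	node build.js

deploy: build
	./deploy.sh
```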
This is what I use when I want the "common entry points" to a project, but don't need all the "make weirdness". It makes it very easy to essentially create a project-specific CLI, and has a bunch of features to make THAT use case much easier.
I like make, but I don't understand the logic here. If you're going to install node anyway, why be so against using node for scripts? Perhaps it's less of an issue these days, but make also isn't included on Windows.
The article makes this pretty clear. Some of us work simultaneously in different stacks: js, python, php, go, whatever... each and every one having their own dependency/building tools. Having a consistent DX (make this, make that) across all projects has value.
One reason the article points out is to make the developer experience "portable" between projects regardless of underlying technology platform/framework.
That is not necessarily something everyone agrees with ("when in Rome" etc.), but it is a reason.