Non-Recursive Make Considered Harmful [pdf] (microsoft.com)
29 points by luu on April 9, 2016 | 38 comments



The paper is about yet another build system (this one written in Haskell) to replace Make. While Make is awful, most other build systems I've ever used have ended up being awful in their own way, and they have the massive disadvantage that my users probably don't have the CMake 2.8.9.2 that I wrote my build script for.

Also, I find it amusing that they claim Make's language is horrible when, by their own admission, their replacement isn't shorter, I have to learn Haskell to use it, and it looks like this (random snippet). No better than Make, in my opinion, and do I further have to write unicode arrows?

    "∗.o" %> \out → do
    let src = out -<.> "c"


I suggest reading more about Shake before dismissing it. The paper is short and easy to follow. Shake solves real-world problems that go ignored in other systems. If needed, one could add a simple mode via an EDSL, but so far nobody using Shake has requested such a thing, from what I can tell; Neil would be better equipped to answer that. I haven't written a single Shakefile with unicode, and no, you don't have to, but apparently you can if you prefer.
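
To sketch what such a simple mode might look like - simpleRule below is hypothetical, but it's built only from real Shake primitives:

    -- A made-up "simple mode" helper over Shake: one function declares a
    -- whole pattern rule, so users never touch lambdas directly.
    import Development.Shake
    import Development.Shake.FilePath

    simpleRule :: FilePattern -> String -> (FilePath -> FilePath -> String) -> Rules ()
    simpleRule pat ext mkCmd = pat %> \out -> do
        let src = out -<.> ext   -- swap the extension to find the input file
        need [src]               -- declare the dependency
        cmd_ (mkCmd src out)     -- run the user-supplied command line

    -- usage: simpleRule "*.o" "c" (\src out -> "gcc -c " ++ src ++ " -o " ++ out)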

As a start, I'd suggest reading "Shake before building".


I did read all of that. To me, the system seems inferior to tup, which they discuss (tup doesn't require me to specify dependencies, which I consider a killer feature). Also, tup works on Windows, while the paper suggests Shake doesn't.


I haven't tried tup, but the way it's said to detect dependencies by monitoring the filesystem struck me as a bad idea, though that's probably personal preference. I can't tell how many developers dislike that design aspect.

Shake should work on Windows, given that Neil wrote a Windows progress bar tool for Shake.


If you don't like auto-tracking dependencies, then tup certainly isn't for you.

For me, it is a killer feature -- any system where dependencies are maintained manually inevitably gets out of sync with the code. Some systems have special compiler hooks (like gcc's -MM flag), but you still have to find the magic option for each tool you use and wire it into your build. That's also very hard to do in dynamic languages, where you don't know what libraries you have loaded until you've run the program.
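
To make that wiring concrete: the Shake manual handles it by running the compiler with its dependency-emitting flags and feeding the output back into the build graph. A minimal sketch, with illustrative file names:

    import Development.Shake
    import Development.Shake.FilePath
    import Development.Shake.Util

    main :: IO ()
    main = shakeArgs shakeOptions $ do
        want ["main.o"]
        "*.o" %> \out -> do
            let src = out -<.> "c"
            let dep = out -<.> "m"
            -- gcc emits a Makefile-style dependency list as a side effect;
            -- needMakefileDependencies then registers every file it names,
            -- headers included, so the deps never have to be kept by hand.
            cmd_ "gcc -c" [src] "-o" [out] "-MMD -MF" [dep]
            needMakefileDependencies dep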


> If you don't like auto-tracking dependencies, then tup certainly isn't for you.

I do use auto-generated/auto-detected deps for build-time dependency graphs, but I didn't like how tup implements it.


It's not just written in Haskell, it is Haskell. Your "Shakefile" is a Haskell source file. You understand the syntax if you understand Haskell syntax, and vice versa.

And no, you don't have to write Unicode arrows. You write those things as ->. For some reason academic Haskell papers use Unicode symbols where real Haskell uses ASCII.
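
Here's the snippet from above in plain ASCII, padded out into a complete, runnable build file (file names are just placeholders):

    -- Build.hs: run with e.g. "runghc Build.hs" once shake is installed
    import Development.Shake
    import Development.Shake.FilePath

    main :: IO ()
    main = shakeArgs shakeOptions $ do
        want ["main.o"]
        "*.o" %> \out -> do          -- a plain ASCII ->, no unicode needed
            let src = out -<.> "c"
            need [src]
            cmd_ "gcc -c" [src] "-o" [out]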


Build systems are a good case for a domain-specific language that just knows about how to build things. Make may not always be the right DSL.

"Here's a Haskell library, go write your own build system in Haskell" is technically a solution to every build problem, but not a very practical one. Especially because Haskell libraries are themselves so hard to build on a fresh system!


Haskell libraries as well-maintained as Shake should not be hard to build on a fresh system (especially nowadays with Stack). Getting GHC onto a fresh system might be difficult if your package manager doesn't provide it.


Build systems and functional programming languages are a good match, because of the way dependencies work. Therefore, I give this tool a good chance of being successful (though I'm not sure if/how they will win over the non-Haskell crowds). At first glance, this paper seems very interesting. This is certainly not "yet another" build system.


> my users probably don't have the CMake 2.8.9.2 that I wrote my build script for

cmake is "pretty good" about exposing policy flags to control compatibility.


From considerable experience of building unixy packages, I wouldn't use "cmake" and "pretty good" together :-(.

Fedora EPEL6 has had to include two new cmake versions so far to be able to build more packages.


The unicode arrow is just a pretty-printed ->

The bizarre operators seem to be an attempt to maintain similarity to make's cryptic operators.


I knew Make was terrible when I was first introduced to it.

This is why I think everyone should learn programming first and linux second - since many linux tools are terrible.

As expressed in the SICP book, a programming language should allow you to build abstractions. Make doesn't allow that at all.


I think you were misinformed about the purpose of Make. It should not be used as a programming language. You can write a short makefile by hand, reasonably, which compiles a few source files, but anything larger should probably be automatically generated by CMake or your own scripts or whatever.

Abstractions are perfectly possible, you just do it in your own script (Python or whatever) and this way you don't have to learn a new language. Ninja is a better alternative to Make, and it is even less expressive (which is one of the reasons why it is better, IMO).

Or in other words, we build abstractions in Make by composing it with other tools, rather than creating a huge monolithic Makefile syntax which can do anything you want.


After loathing make for decades and doing my best never to waste any time thinking about it, I spent a week digging in hard and learning it in detail, as though it were a programming language I actually wanted to use. It turned out to be just as terrible as I always thought it was, and I really don't want to use it - but I learned how to make it do the straightforward thing I have always wanted it to do, which I have often fumed and sputtered and groused at it for not doing, and I wrapped that knowledge up in a set of recipes which I now copy into every project I work on. This lets me take advantage of make's ubiquity without having to remember and worry about all of its odd awful corners.

Here are the recipes, most recent version: https://github.com/marssaxman/ozette/blob/master/srctree.mk

And this is an example of a makefile using those recipes: https://github.com/marssaxman/ozette/blob/master/Makefile


Nice summary! It is an often-missed but rather important point that make syntax is generator-friendly. It makes it easy to script makefiles, or even to glue them together from several generators.

This ability to write and compose is a rare feature today. JSON is pretty much the opposite, and even YAML is rather bad at it, since to compose pieces one has to parse them first.


The problem with generated Makefiles is that if you want to tweak things, you end up debugging the generation step (if that's even possible), whereas a Makefile written by a human is usually easy to edit. Alternatively, the generated Makefile could source overrides from an optional .mk file, but that's not what I've seen with automake or cmake.


In which case why even use Make at all? CMake could generate anything.


CMake can't generate anything, it only generates a few different types of files. Make is the most well-tested output, and there is only one alternative generator for CMake for the Linux command line anyway (Ninja, which is even less expressive than Make).

It's like saying that "Why use assembly at all? GCC could generate anything." Well, yes. But you're on an x86 machine and there's an assembler right there for GCC to use, and GCC has been using that assembler for decades.


All of these things sound complicated. What is CMake now?

Manipulating files is trivial in any programming language, except maybe in something like C, C++, or Java.

Python/JS/Go make it really easy to deal with complex build systems, or even just one-liners.

So I have no idea what advantage make has over those - the only use case is if you are writing C, where file handling is non-trivial.

I moved to npm scripts once I asked the question "how do I make a multi-process build script in make?" and all the neck-beards in uni had no answer - so I just write small scripts which grow and shrink based on my needs.


Make is really fast, and it's good at doing parallel builds (you just have to specify -j). You could try making something better in Python, but it would take a long time to write, and Make is already here. You apparently had the opposite experience from me: I hate npm build systems like Grunt and Gulp because they're hard to debug. Make is easy - keep the files around and tweak the command line until it works.

No idea how you missed the -j option.


"I moved to npm scripts once I asked the question 'how do i make a multi-process build script in make ?' and all the neck-beards in uni had no answer"

  make --jobs=2


> To validate our claims, we have completely re-implemented GHC’s build system, for the fifth and final time.

Until they re-implement it again :)

> Unfortunately no cross-platform APIs are available to detect used dependencies, so such tools are all limited in which platforms they support.

How about using libfuse? One could run the build inside a "virtual" folder served by libfuse, and thus detect all dependencies (they will show up as "read" operations in the libfuse API).


> Until they re-implement it again :)

As the paper says, every previous implementation started nicely and got horrible towards the end. This time, that hasn't happened, so _hopefully_ it's the final one. But, of course, never say never.

> How about using libfuse?

What about Windows? Certainly you could use something like libfuse, and we're trying other solutions - if we manage to build a cross-platform API Shake will be able to use it easily.


If your build reaches the point where you feel you need a wide variety of strange "make" constructs, you don't necessarily need a new build system; you need to be smarter about the setup. For example: do everything in two phases, where the more "magical" version generates a less magical, verbose, static makefile with more rules that are relatively easy to understand and debug.
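
A minimal sketch of that two-phase idea in Haskell (file names made up): phase one is an ordinary program that writes a dumb, verbose, static Makefile; phase two is plain "make".

    import System.FilePath ((-<.>))

    sources :: [FilePath]
    sources = ["main.c", "util.c"]

    -- One explicit, easy-to-debug rule per object file; no variables, no magic.
    rule :: FilePath -> String
    rule src = unlines
        [ obj ++ ": " ++ src
        , "\tgcc -c " ++ src ++ " -o " ++ obj ]
      where obj = src -<.> "o"

    main :: IO ()
    main = writeFile "Makefile" (concatMap rule sources)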

And, besides: a huge impediment to improving the state of build systems is that new build mechanisms won't be installed on most platforms by default. If you must explore new ways to build, be sure to implement that in terms of what is already there (Perl/Python/whatever) and make sure to ship your new build system WITH your code.

When I download some "neat-project-1.0.tar.gz", the last thing I want is to have to futz with setting up your special build system that Probably Nothing Else uses or will ever use. If it's not in the tarball, I will already lose a lot of interest; I know most things can "configure, make, make install" in minutes, but I don't know how much time will be wasted installing whizbang-build-0.1 first, and I probably won't even try.


The old GHC build system did do everything in 3 phases. And they were pretty magic phases. It's not clear there's any upper bound on the number of phases, and maintaining each phase separation is very tricky.


I was working on a CentOS 6 box not long back and it's shocking just how many big projects you pretty much can't build. Not because they didn't write portable C, but because they have requirements on a minimum version of autoconf, which is later than shipping under CentOS 6.


autoconf _shouldn't_ be relevant to building a release, only to maintaining the project. (I know you can't always get a proper release in this day and age.)

In some cases you do need to re-autoconfiscate, which is why EPEL6 has autoconf268 (from RHEL7). Otherwise there's a software collection with up-to-date autotools, though you might expect them in the devtoolsets.


Reading the article, I wonder if Shake could be used to generate Ninja files. It looks like it might be possible. The main "backend" improvements in Shake are things like concurrency reduction (pools in Ninja), hash-based rebuilding (available but undocumented in Ninja), and generated dependency rules. In such a setup, you'd use Shake for all your abstractions, and Ninja to execute the rules.


Shake can interpret Ninja files, actually.

But I can see the appeal of Ninja files as something that lets one avoid the GHC requirement in a build environment.

Though I've found Ninja to be more like something that needs to be generated from something else, and less like something you'd write by hand. So maybe Ninja can be exactly that, kind of like build-script assembly.


Make is convenient for tiny programs, and annoying for larger ones. The system in GHC seems like a nightmare to maintain - I remember having trouble building GHC, mainly because it was hard to debug their system.

I use Docker for a similar purpose nowadays. It's inconvenient for development, since rebuilding the containers is slow. But it usually works without much trouble. I think there's a GHC Dockerfile somewhere that hvr maintains.

I haven't used Shake itself much. The only thing that bugs me is that Haskell isn't nearly as portable as plain Make. If Shake didn't build the program itself but instead emitted a build file that a small reference C program could interpret, I would feel much more comfortable about using it for just about everything. Because then I'd know that even on some ancient CentOS 4 system, I could still build my software.


Titles containing "considered harmful" considered harmful


I assume they are playing off the 1997 article "Recursive Make Considered Harmful".


It's from the 1968 letter "Go To Statement Considered Harmful" by Dijkstra.


Comments containing "considered harmful considered harmful" considered beside the point.


Everybody wants to be Dijkstra.


It's like taking The Lord's name in vain.

Used frivolously, it feels like blasphemy.



