The best-architected use of make is in the FreeBSD build system [1,2]. If you want to experience "a system", please give FreeBSD a try.
The fact is, building software that requires 500 dependencies and 500 sub-steps and 500 configuration options is going to be complicated. It's complicated in the same way that implementing an operating system is complicated. There's no way around it. The complexity is there because it's inherent in the problem.
But it doesn't have to be. Instead of spending 300 hours implementing Shake, or Rake, or Bake, or Cake, or Jake, or Take, why not spend those hours cutting down the complexity at the source? Trim your dependencies. Stop putting so many sub-steps and configurations into your build systems. Because it's the build systems with the 500 dependencies and the 500 sub-steps and the 500 configuration options that are harmful; not the tools.
At any level, there’s probably a library that solves a large part of your problem. From your perspective, you only need “a few” of those. But from the library author’s perspective, they also only need “a few” of those. And on and on, down the chain. Each step seems reasonable, but the counts multiply: five direct dependencies, each pulling in five of their own, three levels deep, is already 5 + 25 + 125 = 155 nodes, and suddenly the tree has 500.
Some library authors try to make sure their library doesn’t have dependencies. But that’s how you get libraries like Qt, which take two hours to build and produce 35 MB .so files.
I can't count the number of times I've looked at a potential dependency only to find that it has massive bugs in some auxiliary part of the library that the authors decided to write themselves in an attempt to "avoid dependencies".
We have these tools designed to handle a complicated network of dependencies; use them! Don't implement your own cute way of parsing XML just because you want to avoid dependencies! Let the library that is focused on just parsing XML -- well tested, widely used, and actively maintained -- handle XML parsing!
Much better to reach out to other projects and figure out how to make Make better, no?
Sometimes I think great coders (10x coders) can work against themselves. Because they are such great coders, they are able to re-implement in situ rather than improve the stuff they get from over the fence. If Make is really that bad, make Make better in a language-neutral way.
Considering the authors did so and found it to be an improvement in terms of maintainability and usability... I'm going to say "yes". Do you think you know more about the project than they do?
I used to work on GHC. The build system is complex. Hadrian is quite an improvement in power and expressiveness (and is now capable of doing things we wouldn't have been able to implement easily with Make, since extending the prior system was too hard).
> The fact is, building software that requires 500 dependencies and 500 sub-steps and 500 configuration options is going to be complicated. It's complicated in the same way that implementing an operating system is complicated. There's no way around it. The complexity is there because it's inherent in the problem.
I get the feeling you're going to use this random truism as a springboard to make suggestions despite the fact you've never been involved in the project?
> But it doesn't have to be. Instead of spending 300 hours implementing Shake, or Rake, or Bake, or Cake, or Jake, or Take, why not spend those hours cutting down the complexity at the source? Trim your dependencies. Stop putting so many sub-steps and configurations into your build systems. Is that the sane way to do things?
That would be nice if everyone had endless time and everything was always done exactly perfectly up front. It would also be nice if you could work completely on your own and never have to interact with any other software in the world.
Binary tarballs, source distributions, upstream library dependencies, cross compilation, thousands of tests, tracking all dependencies correctly (this one alone is ridiculously hard), autogeneration tools (to save errors on tricky parts), feature detection at compile time and runtime (because your users work on some old CentOS machine and no, `pthread_setname` is not available), profiling builds, running documentation generators, handling out-of-source builds, handling relocatable builds. I can just keep listing things, honestly. All of these -- more or less -- come back to your build system.
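To make just the feature-detection case concrete, here is a minimal sketch using Shake; the probe file, macro name, and compiler flags are illustrative, not what GHC actually uses:

    import Development.Shake
    import System.Exit (ExitCode(..))

    main :: IO ()
    main = shakeArgs shakeOptions $ do
      want ["config.h"]
      "config.h" %> \out -> do
        -- Test-compile a probe; on platforms without pthread_setname_np
        -- (e.g. an old glibc), the compile fails and the macro is omitted.
        writeFile' "probe.c" $ unlines
          [ "#define _GNU_SOURCE"
          , "#include <pthread.h>"
          , "int main(void) { pthread_setname_np(pthread_self(), \"x\"); return 0; }"
          ]
        Exit code <- cmd "cc -Werror=implicit-function-declaration -c probe.c -o probe.o"
        writeFile' out $ case code of
          ExitSuccess -> "#define HAVE_PTHREAD_SETNAME_NP 1\n"
          _           -> "/* pthread_setname_np unavailable */\n"

And every item on that list needs something of this shape, each interacting with caching and dependency tracking in its own way.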
In fact, GHC goes quite out of its way to use as few non-Haskell dependencies as possible. Why? Because the ones it already has are often burdensome and complex, and we have to pick up the slack for them for every user. Nobody using your project cares whether Sphinx or their Rube Goldberg Python installation (spread over 20 places in /usr) was the reason doc building failed; your build failed, and that's all that matters. You've still got to figure out what's wrong, though, for your user. And not wanting new dependencies has been a common reason to reject things -- I myself have rejected proposals and "features" for GHC on this basis alone, more or less. ("Just use libuv!" was a common one that sounded good on paper and never addressed any actual issue we had that it claimed to 'solve'.)
As a side note, it really amazes me how many people see any amount of non-trivial work in some project and immediately ask "well, why don't you just do <random thing that is completely out of context and has no basis in the project's reality>". Seriously, any time you think of this stuff, please -- just give it, like, 10 more seconds of thought? You'd be surprised at what you might think up, what you might realize is possible. It's not the worst thing in the job, but being an OSS maintainer and having to deal with analyses that are, more or less, quite divorced from the reality of the project is... irritating.
Every single self-described make replacement project makes the exact same claim, verbatim. Yet, when these projects start to see some use in the real world... cue all the design shortcomings and maintainability and usability problems.
We're about 4 decades into this game. Perhaps this time everything is different. Who knows. Odds aren't good, though.
In other words, you're asking a question that criticizes the mechanism that would answer that question. It's a legitimate question, but a poor reason to disregard what they did.
For those who don't know, aseipp is/was a major contributor to GHC and will be intimately familiar with the build system of GHC. His observations are on point.
Relatedly, I'm currently also fighting about 3-4 different build systems which are "classics" of the genre and yet are broken in subtly different and interesting ways.
Software is always complex; it’s just that the complexity is gradually hidden from the developer by the use of libraries.
Until those libraries get baked into the standard library of whatever you’re using, you’re going to have to implement that complexity yourself, or use dependencies.
Unless you’re scripting, doing something entirely within your language’s or OS’s framework, or implementing everything yourself (hello, complexity), you’re going to hit complexity and dependencies very, very early.
The only time I’ve seen this avoided is in the embedded space where you physically don’t have enough bits to get complex.
We got to the moon with a computer less powerful than my microwave. My old smart phone worked just fine without 4 gigs of RAM and 32 gigs storage, and now this monstrosity in my hand is running out of resources? It doesn't have to be this way.
Can that computer show a GUI with multiple videos playing simultaneously, surrounded by UI elements, where multiple peripherals (mouse, touchscreen) can control their display area, all the while running two compilers (C++, Scala) and incidentally also running a virtual machine, etc., etc.?
"Get to the moon" is an absurdly simplistic way to view complexity and it does your argument no favours.
(That's not to underplay getting to the moon. It's an amazing achievement, but if you look at the resources/humans poured into the project, it's actually not that amazing that it was possible.)
The other problem is integration. You can't expect all third-party projects to suddenly adopt your build system, so eventually you have to invoke `configure`, `make`, `cmake`, `pkg-config`, `xcode`, and so on. While you can satisfactorily capture most of these inputs and outputs, it's non-trivial to do it completely; at some point you have something that works, and it's good enough.
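As a sketch of what "capturing" a third-party tool typically looks like (Shake assumed; the package name is arbitrary):

    import Development.Shake

    -- Capture pkg-config's stdout; the command itself is tracked by the
    -- build system, but the headers and libraries behind these flags are
    -- dependencies it cannot see on its own.
    xmlCFlags :: Action [String]
    xmlCFlags = do
      Stdout out <- cmd "pkg-config --cflags libxml-2.0"
      return (words out)

The flags come back as an opaque string; whether the files they point at are also tracked is exactly the "non-trivial to do completely" part.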
Now try adding Ragel into the mix, which doesn't have `-M` or any other kind of dependency output. There are plenty of other tools like this. Practical applications eventually run into limitations like this as they grow.
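For what it's worth, the usual workaround is an explicit, hand-maintained dependency list; a sketch in Shake (the file names are made up):

    import Development.Shake

    main :: IO ()
    main = shakeArgs shakeOptions $ do
      want ["parser.c"]
      "parser.c" %> \out -> do
        -- Ragel has no -M-style dependency output, so the inputs are listed
        -- by hand; forget to update this list and the build silently goes stale.
        need ["parser.rl", "common.rl"]
        cmd_ "ragel" ["-o", out, "parser.rl"]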
No... they do capture every dependency. They don't try. They do. Try getting something to build that has dependencies outside the nix store. You can't, because you can't access the local file system, the internet, or anything else while putting things into the nix store.
Nix doesn't care about `-M` output. It's a package manager that captures dependencies. If your package's build system doesn't have `-M`, then you must declare your dependencies yourself. This is how people used to do it before `-M`.
Simpler languages like Go don't really have this problem, but they also have sane build systems so I don't know why you'd need CMake or whatever.
Hermetic builds are the way to go for a ton of reasons.
Yes. Nix is a package manager that drives multiple build systems and captures all dependencies: https://nixos.org/~eelco/pubs/phd-thesis.pdf
The problem comes when some part of your toolchain wants to access something outside of the build environment.
GNU's not Unix and there isn't a /usr/include in the build chroot; see the manual about that and profiles.
$ cat x.c
#include <stdlib.h>
$ guix environment gcc
$ gcc -Wp,-H x.c |& grep stdlib.h
. /gnu/store/fc363b1hsid7pfdxh18m4a1i1r04i5fl-profile/include/stdlib.h
While we have demonstrated that our approach works, we have not yet implemented all features of the build system, and hope to do so over the next few months.
We implemented a new build system for GHC from scratch using Shake and our build abstractions from §5. The new build system does not yet implement the full functionality of the old build system, but we are currently addressing remaining limitations; nothing presents any new challenges or requires changes to the build infrastructure.
It's mostly complete, and as someone who has used GHC's make-based build system and likes make, this build system is miles better and isn't a Cthulhian horror.
That line caught my attention pretty hard. It is mentioned in passing, and I have been part of many failed projects that were able to say the same thing. :) That is to say, I'm making my comment from experience, not from a desire for them to fail or give up.
Famous last words.
For those who are sure we already have a definitive solution to one or more of these, the problem is in persuading everyone else, especially those who think something else is it.
Check out all the links under this heading: https://reproducible-builds.org/docs/#achieve-deterministic-...
Many legacy systems capture all sorts of nondeterministic values -- from the build date (it might be a desire to be "helpful", but it breaks reproducibility) to accidentally depending on the order of inodes on your file system!
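The inode-ordering one usually comes down to an unsorted directory listing; a minimal sketch of the fix in Haskell:

    import Data.List (sort)
    import System.Directory (listDirectory)

    -- listDirectory returns entries in whatever order the OS hands back,
    -- which can follow on-disk inode layout; sorting makes the build's
    -- input order reproducible across machines.
    sourceFiles :: FilePath -> IO [FilePath]
    sourceFiles dir = sort <$> listDirectory dir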
All of these problems are solvable. It should be dirt simple. It's just a "small matter of programming" :)
Linux and BSD build systems deal with most of these issues, usually with wide support from a variety of recursive makes. RPM and DEB, though, honestly suck and never really tried to solve these issues automatically. It still drives me nuts that packages are tainted by the 'gold' systems they are built on. The complexity of build systems means very few minds are up for it, and most solutions are naive and end up with tons of patchy exceptions and workarounds.
ROCK Linux supported cross-compilation, auto-detection of build parameters, and dependency library tracking. (I was working on automated dependency ordering and QEMU-based full cross builds before I got a real job.) It was very robust, and aside from package developers breaking their own builds, it worked solidly. No idea what cool things T2 Linux got up to after ROCK, but keeping a build system fresh is hard. Build systems are always going to be fragile systems with complexity. The paper seems to be a survey of what they learned rather than a claim to a definitive solution.
Both things are archaisms that can easily be avoided, and in the particular case of this article, the second part of the title works just as well as the whole title.
Embracing an archaic grandstanding pose and an academic look when much more easily grokkable formats are available just doesn't help get your message across. And when there's one of these roughly every month, I'd personally love to at least see fewer of them.
So, they've never heard of templating?
Sometimes it's okay to make something with redundant functionality.
This looks like you're writing source code in another language that follows a kind of template, which you then compile and run, and which then does stuff and is extremely complicated.
Seems like a failure to me? Shouldn't something "better" be equally simple or simpler?
$1/$2/build/%.$$($3_osuf) : \
    $1/$4/%.hs $$(LAX_DEPS_FOLLOW) \
    $$$$($1_$2_HC_DEP) $$($1_$2_PKGDATA_DEP)
        $$(call cmd,$1_$2_HC) $$($1_$2_$3_ALL_HC_OPTS) \
            -c $$< -o $$@ \
            $$(if $$(findstring YES,$$($1_$2_DYNAMIC_TOO)), \
                -dyno $$(addsuffix .$$(dyn_osuf),$$(basename $$@)))
        $$(call ohi-sanity-check,$1,$2,$3,$1/$2/build/$$*)
I suspect it's easier to master Haskell than this.
Substitute "Clever" for "Poorly", if you'd prefer.
Second, sometimes the best trick you can do to make a build system cleaner, is to change the system it is building.
Though, as I stated elsewhere in this thread, I don't necessarily mean to dismiss this effort. It's just that showing me the gymnastics required to do something most people don't care to do doesn't convince me the context is ill suited. (Fun metaphor, actually: the fact that my house isn't designed to allow easy gymnastics practice in the living room is not a criticism of the house or of gymnastics. Showing me that a cartwheel will get you hurt there is not really showing me anything relevant.)
In fact, all of their examples of how bad make is involve the use of non-standard makefile generators, or non-standard extensions to make itself.
Alternatively, I could argue that no one should use C because C++ template metaprogramming is too opaque, and, as further proof, my use of a non-standard preprocessor I implemented leads to tens of thousands of lines of deeply nested macro invocations.
(There are all sorts of problems with make, but I’m not convinced the authors actually understand them in enough detail to improve on it.)
It's not like new build systems have to follow the paradigms of the old ones. Why do you think they write a new build system in the first place?
And even then, make is mostly declarative.
And even then, Haskell is totally implemented on top of imperative tools (formerly C, then C--, and now LLVM bitcode). It's still a functional language.
All variables in make which are not target-specific are global. That's 100% non-functional.
> And even then, make is mostly declarative.
The Docker API is declarative, but the reality is not, so everybody somehow ends up using shell commands in Docker to do dirty stuff.