
Make vs. Tup (2016) - edjroot
http://gittup.org/tup/make_vs_tup.html
======
dig1
This reminds me of the 'Make vs. X' trend some years ago, where Make would be
bashed for being slow, not extensible, having weird syntax and being
incompatible between implementations. So we got alternatives like Aegis[1],
Scons[2], A-A-P[3] or Jam[4] instead, which were much faster and flashier on
paper.

But guess what, Make is still kicking, GNU Make got a bunch of new goodies
(Guile scripting or loadable modules to name a few) and all those alternatives
are pretty much dead now. What I see now is that we are repeating the same
mistakes only with flashier tools (node, python3, rust).

Although I always preferred Jam[4], I'm pretty happy with GNU Make now. It's
not perfect, but it does the job well, and if I ever hit some weird platform,
I can always 'extend' myself to Autotools[5]. Funny thing, I'm even using Make
to run Ansible scripts or compile Java/Clojure code and it works like a charm.
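For what it's worth, the 'Make as a task runner' setup can be as small as
this; target and file names here are made up for illustration, assuming
ansible-playbook and lein are on the PATH:

```make
# Hypothetical sketch: Make driving Ansible and a Clojure build.
.PHONY: build test deploy

build:
	lein uberjar

test:
	lein test

deploy: build
	ansible-playbook -i inventory playbook.yml
```

The only thing Make needs to know is which commands to run and in which
order; the tools themselves do the real work.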

[1] [http://aegis.sourceforge.net/](http://aegis.sourceforge.net/)

[2] [https://scons.org/](https://scons.org/)

[3] [http://www.a-a-p.org/](http://www.a-a-p.org/)

[4]
[https://en.wikipedia.org/wiki/Perforce_Jam](https://en.wikipedia.org/wiki/Perforce_Jam)

[5]
[https://en.wikipedia.org/wiki/GNU_Build_System](https://en.wikipedia.org/wiki/GNU_Build_System)

~~~
heavenlyhash
I suspect a large amount of the issue is sheer availability.

There's an odd proclivity of developers of build tools to assume that their
favorite runtime (be it python for scons, etc) is already available or trivial
to nicely install. And that's just... not true.

The build tool itself is the one place in the ecosystem of building software
that we have the least room for manual dependency handling and manual setup,
because there's nowhere left to punt the hard part.

Make isn't _better_ at this either, but it's been so widely packaged and is so
nearly-default available (it's remarkably hard to get a working desktop system
without some transitive dependency having gotten make installed for you) that
it de facto gets a free pass on this.

My hypothesis is that these newer build tools would have a much better shot at
reaching adoption if they were well-packaged enough that a single tarball
(with zero external deps) or a single-line install script (think: "gradlew",
though even that was cheating by assuming an existing jvm) could bootstrap
them. Most don't seem to have invested that effort. And it shows.

~~~
geezerjay
Tarballs are a poor packaging method because they require users to install
applications manually, without ensuring dependencies or running pre/post
installation configuration.

However, taking the time to put together a working installation package for
specific platforms does wonders for an app's adoption. It's all about
providing a standard distribution method and standard installation on a
platform, so that it becomes a fixed target for end users and developers.

~~~
heavenlyhash
> require users to install applications manually without ensuring dependencies
> or pre/post installation configuration.

Yet my point is the converse of this: by not supporting those things, it means
that for something to be well packaged as a tarball, it may not have any
external dependencies, nor may it have pre/post installation hooks. And that's
a Good Thing.

Because why would you need any of these things?

And how can you survive that level of complexity when the thing in question is
itself the build tool which is traditionally the thing that we use to handle
these complexities?

This may just be "my opinion", but I think my build tool should have packages
available which have no further recursive dependencies -- the buck has to stop
somewhere -- and it _certainly_ shouldn't need post-install hooks for
absolutely any purpose -- what would it use them for? Doing something
stateful? That's antithetical to the core of what a good build tool should be
doing.

------
JNRowe
I found the Build Systems a la Carte¹ paper, and NDM’s companion write-up², to
be a good comparison of build systems in a more general manner.

Caveat: It may be skewed, having been written by heavy Haskell hitters and an
author of Shake. I’ll note that I didn’t notice any bias, but maybe only
because it aligned with mine ;)

1\. [https://www.microsoft.com/en-us/research/publication/build-systems-la-carte/](https://www.microsoft.com/en-us/research/publication/build-systems-la-carte/)

2\. [http://neilmitchell.blogspot.com/2018/07/inside-paper-build-systems-la-carte.html](http://neilmitchell.blogspot.com/2018/07/inside-paper-build-systems-la-carte.html)

------
sjmulder
What I've seen of Tup is good, but it doesn't add enough to convince me to
switch away from make, which is already everywhere.

CMake and similar tools don't attract me because they're not language
independent. The best thing about make (and Tup too) is that it lets you
express dependencies and ways to satisfy them in terms of other tools.

Now what make is not good at is being ./configure. I'd like to see a more
elegant autoconf-like tool to detect things about the environment and generate
configure.{h,mk}.
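Something like that can start as a plain shell script: probe the environment,
write the results to config.mk/config.h, and let make include them. A
minimal, hypothetical sketch (the probes and macro names are illustrative
only, not from any real project):

```shell
#!/bin/sh
# Probe for a tool on the PATH.
if command -v pkg-config >/dev/null 2>&1; then
    have_pkgconfig=1
else
    have_pkgconfig=0
fi

# Probe for a header by asking the compiler to preprocess it.
if printf '#include <stdio.h>\n' | ${CC:-cc} -E - >/dev/null 2>&1; then
    have_stdio=1
else
    have_stdio=0
fi

# Emit results for make and for C code respectively.
printf 'HAVE_PKGCONFIG=%s\n' "$have_pkgconfig" > config.mk
printf '#define HAVE_STDIO_H %s\n' "$have_stdio" > config.h
```

A Makefile then just does `include config.mk`, and the generated files stay
regenerable and out of version control.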

~~~
AstralStorm
CMake actually is language independent, but it so happens that only C, C++,
Assembly and Fortran are first-class citizens. Java and Ada are second class.
For anything else, nobody has written the support; you might find a package to
locate the compiler and dependencies, perhaps.

It is very much possible to add other languages.
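For example, the first-class languages are just declared on the project; a
minimal sketch with hypothetical project and file names:

```cmake
# Sketch: a CMake project mixing three first-class languages.
cmake_minimum_required(VERSION 3.10)
project(mixed LANGUAGES C CXX Fortran)

add_executable(app main.c glue.cc solver.f90)
```

CMake picks the right compiler and link rules per source file from the
declared languages.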

------
aidenn0
My two favorite make alternatives made HN on two consecutive days.

I like redo because with make you need to know two languages, while with redo
you need just one; the interface is simple and you get automatic dependencies
on build rules for free. Also the implementation is so much simpler than
make's, and yet I've not run into a feature I miss from make.
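For reference, a redo rule is a plain shell script. This sketch of a
default.o.do follows the pattern shown in redo's own documentation (compiler
and file names assumed):

```shell
# default.o.do -- builds any .o from the matching .c.
# $1 = target, $2 = target without extension, $3 = temp output file.
redo-ifchange "$2.c"
gcc -MD -MF "$2.d" -c -o "$3" "$2.c"
read DEPS <"$2.d"
redo-ifchange ${DEPS#*:}
```

The last two lines feed gcc's generated dependency list back to redo, which
is the "automatic dependency" part; everything else is just sh.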

I like tup because it is very opinionated and forces you to write your build
scripts in a reasonable manner, while also being very fast and very correct.
It might be possible to write a tup file that incorrectly handles
dependencies, but you'd have to work hard to do so.
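For contrast, a minimal Tupfile sketch (file names hypothetical): every rule
must declare its inputs and outputs explicitly, which is what lets tup check
your dependency graph against what the commands actually touch:

```
: foreach *.c |> gcc -c %f -o %o |> %B.o
: *.o |> gcc %f -o %o |> app
```

If a command reads or writes a file not declared in its rule, tup flags it as
an error rather than silently producing a stale build.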

~~~
majewsky
Do you have a link to the redo story? I can't seem to find it on hckrnews.com

~~~
ksherlock
[https://news.ycombinator.com/item?id=18480433](https://news.ycombinator.com/item?id=18480433)
perhaps

~~~
aidenn0
That's the one

~~~
JdeBP
Don't forget
[https://news.ycombinator.com/item?id=18473744](https://news.ycombinator.com/item?id=18473744)
and
[https://news.ycombinator.com/item?id=18405227](https://news.ycombinator.com/item?id=18405227)
.

------
breatheoften
Is anyone aware of an npm library that wraps tup to provide automatic
dependency graph construction from objects changing on the filesystem?

I could use a JavaScript API that enabled automatically detecting the
filesystem-change dependency graph associated with each of a given set of
JavaScript function calls ... perhaps with the simplifying assumption that all
function parameters are serializable, or maybe even that the functions being
tracked are only allowed to operate on strings that represent filesystem
paths ...

I'd love for the functionality implemented within tup (running a command and
automatically doing the low-level kernel hacking necessary to track all the
filesystem objects read and written) to be abstracted into an
application-level library ...

------
jp57
tl;dr: for the vast majority of projects, tup does not outperform make.

~~~
geezerjay
That was my take as well, and I honestly don't get the downvotes.

More importantly, in large projects the bulk of the computational budget is
spent actually compiling source files. It's hard to believe that picking which
file needs to be compiled next takes more time than actually compiling it.

------
bhengaij
There was a good post yesterday about redo.

I'm all for good build systems and I have used make quite a bit but the
problem in moving to a better build system is that makefiles are so convoluted
and hard to reason about that nobody wants to be blamed for breaking the build
by migrating.

I wish there were a testing system for builds: specify which updates should
change which outputs, and check the timestamps (to begin with) of the created
files.
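A timestamp-based check like that can be sketched in a few lines; everything
here (file names, the staleness rule) is illustrative, not taken from any
existing tool:

```python
import os
import tempfile

def needs_rebuild(target, sources):
    """A target is stale if it is missing or older than any of its sources."""
    if not os.path.exists(target):
        return True
    t = os.path.getmtime(target)
    return any(os.path.getmtime(s) > t for s in sources)

# Tiny self-check with explicit mtimes, so the test is deterministic.
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "main.c")
    obj = os.path.join(d, "main.o")
    open(src, "w").close()
    open(obj, "w").close()
    os.utime(src, (100, 100))
    os.utime(obj, (200, 200))        # target built after the source
    assert not needs_rebuild(obj, [src])
    os.utime(src, (300, 300))        # simulate an edit
    assert needs_rebuild(obj, [src])
```

A real harness would run the build between those two checks and assert that
only the expected outputs got newer timestamps.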

~~~
geezerjay
> but the problem in moving to a better build system is that makefiles are so
> convoluted and hard to reason about

Nowadays Makefiles are largely autogenerated by the build system. How many
people are actually editing makefiles by hand in non-pet projects?

~~~
cat199
> How many people are actually editing makefiles by hand in non-pet projects?

people that don't want makefiles so convoluted and hard to reason about that
they can't edit them by hand..

i suggest taking a look at any of the BSD build systems to see what sane use
of make can look like (PMake, not GNU Make; imho PMake's language makes this
doable, while GNU Make's language makes autotools and other mess-generating
'helpers' required)

    
    
        http://cvsweb.openbsd.org/cgi-bin/cvsweb/src/share/mk/bsd.prog.mk
        https://svnweb.freebsd.org/base/stable/12/share/mk/
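For instance, under the BSD mk framework a whole program Makefile can be this
short (program name hypothetical; requires bsd.prog.mk from a BSD system):

```make
PROG=	hello
SRCS=	hello.c util.c

.include <bsd.prog.mk>
```

All the build, install and clean rules come from the included framework, so
the per-project file stays trivially readable.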

~~~
geezerjay
> people that don't want makefiles so convoluted and hard to reason about that
> they can't edit them by hand..

You've missed the point. The point was that nowadays editing or even looking
at a makefile is far from standard practice, because makefiles are
autogenerated by the build system and are closer to temp files than to project
files.

Who wastes their time reasoning about a temp file that just works?

