

Qake: GNU Make-based build system with a different approach - nkurz
https://github.com/mkpankov/qake/blob/master/README.md

======
maxs
I've recently been using Tup [0] with great success in a project. It is
interestingly different from other build systems in that it defines
dependencies in the opposite direction (from source file to product, not the
other way around). It is incredibly fast and simple.

[0]: [http://gittup.org/tup/](http://gittup.org/tup/)
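For anyone who hasn't seen it, a Tupfile states each step forward, with
inputs on the left of the command and outputs on the right. A sketch (file
names are invented; %f and %o expand to the inputs and outputs):

    : foo.c |> gcc -c %f -o %o |> foo.o
    : bar.c |> gcc -c %f -o %o |> bar.o
    : foo.o bar.o |> gcc %f -o %o |> prog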

~~~
kitd
Tup came to my mind too when reading this. It must be doing a similar kind of
dependency tracking to only rebuild what is necessary.

------
chuckcode
I'm finding that for compiling code it's usually the configuration that makes
life hard, rather than the build system part. I've been using cmake [1],
which makes the rules for building object files, libraries and executables
pretty simple, although determining and adapting to the eccentricities of any
arbitrary system's environment is painful.
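For reference, the simple part really is short. A minimal CMakeLists.txt for
a static library plus an executable might look like this (project and file
names are placeholders):

    cmake_minimum_required(VERSION 3.10)
    project(demo C)

    # A static library and a program that links against it.
    add_library(util STATIC util.c)
    add_executable(demo main.c)
    target_link_libraries(demo util)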

I am also liking snakemake[2] these days for running arbitrary chains of jobs
that have dependencies (in addition to using it as a build system). It has a
nice syntax, makes it easy to build and run dependency graphs of different
jobs, and has built-in multithreading support and cluster integration if you
want to scale up to bigger data sets. It's a very nice middle ground: much
easier to maintain and rerun than a pile of shell scripts, but much lighter
weight than a whole hadoop "big data" setup.

[1] [http://www.cmake.org](http://www.cmake.org) [2]
[https://bitbucket.org/johanneskoester/snakemake/wiki/Home](https://bitbucket.org/johanneskoester/snakemake/wiki/Home)
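A sketch of what that job-chaining style looks like in a Snakefile (rule
names, files and commands are invented):

    # Final target: ask for the merged result and Snakemake works
    # out the dependency graph backwards from it.
    rule all:
        input: "results/merged.csv"

    rule process:
        input: "data/{sample}.raw"
        output: "results/{sample}.csv"
        threads: 4
        shell: "process --threads {threads} {input} > {output}"

    rule merge:
        input: expand("results/{sample}.csv", sample=["a", "b", "c"])
        output: "results/merged.csv"
        shell: "cat {input} > {output}"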

~~~
mkpankov
Yeah, previously we had a no-configure setup (just a configuration file). I
have some thoughts on how this could be implemented properly in Qake, but I
haven't gotten to it yet.

------
codygman
I've always wondered why Shake[0] wasn't more popular. Anyone not like Shake
or know why others don't, or is this just a case of not being known?

0: [https://github.com/ndmitchell/shake](https://github.com/ndmitchell/shake)

~~~
Tobani
The Manual[0] contains an intro that isn't particularly compelling. It
contains things like:

    phony "clean" $ do
        putNormal "Cleaning files in _build"
        removeFilesAfter "_build" ["//*"]
The make equivalent would be:

    clean:
    	rm _build/*

Now that's obviously the trivial example. But it's not trivial to drop make
entirely for something entirely different. A make solution can slowly get more
complicated over time. It looks like Shake might be useful for something
really big, but I can't quite tell where the investment of learning a new tool
and re-developing something (that works well enough) would pay off.

0:
[https://github.com/ndmitchell/shake/blob/master/docs/Manual....](https://github.com/ndmitchell/shake/blob/master/docs/Manual.md#readme)

~~~
mitchty
As someone that's recently been getting into Haskell, I can say this: that
example is, sure, rather more verbose for the simple case, but the more
complex stuff is where Shake starts to shine.

That, and without knowing a little Haskell a lot of it will look like
needless chatter. But like rake, once you realize the DSL is just Haskell, it
starts to fall into place.
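For comparison, here is roughly what a less trivial Shake build looks like, a
sketch following the patterns in the Shake manual (paths and compiler flags
are placeholders):

    import Development.Shake
    import Development.Shake.FilePath
    import Development.Shake.Util

    main :: IO ()
    main = shakeArgs shakeOptions{shakeFiles="_build"} $ do
        want ["_build/run" <.> exe]

        -- Link: the object list is computed from the sources on disk.
        "_build/run" <.> exe %> \out -> do
            cs <- getDirectoryFiles "" ["src//*.c"]
            let os = ["_build" </> c -<.> "o" | c <- cs]
            need os
            cmd_ "gcc -o" [out] os

        -- Compile: header dependencies are tracked via gcc -MMD,
        -- so editing a header rebuilds exactly the right objects.
        "_build//*.o" %> \out -> do
            let c = dropDirectory1 $ out -<.> "c"
            let m = out -<.> "m"
            cmd_ "gcc -c" [c] "-o" [out] "-MMD -MF" [m]
            neededMakefileDependencies m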

~~~
Tobani
Which is fine if you're using Haskell. I'm not sure I see the case for
something like this if you're not using Haskell. Adding another package
management system (cabal) into the mix seems like a lot of work.

Of course I don't think language-agnostic build systems in general make a
whole lot of sense unless you have a huge enough project to have a team
dedicated to the build system.

------
Tobani

    Goal

    The user is supposed to be never needing to call clean goal.

Does this track build tools also? If you upgrade a compiler and some
intermediate binary format changes, will it automatically clean up those stale
files?

~~~
mkpankov
Currently it doesn't. I can imagine how this could be done, but it's quite a
lot of work.

------
dlundqvist
This is similar to what I wrote at work, on and off over the last couple of
months, to replace a build system based on recursive Makefiles. Due to the
way our product is composed, I ended up adding support for building static
libraries, programs, RPMs and documentation, with full support for
dependencies: if a source file of a library is changed, the library will be
rebuilt, programs that link against it will be re-linked, and if a program is
part of an RPM, the RPM will be rebuilt, etc. Documentation will also be
rebuilt if a source file embeds documentation. Another great bonus is of
course that with one GNU Make instance and proper modeling of dependencies,
"make -j" works great, every time. I guess we have a couple of hundred source
files, and "make -j" will happily start compiling them all. Read Peter
Miller's paper on recursive Makefiles for why this is preferable.

Makefiles are of course a bit limited compared to shell scripts, but you can
do a lot with implicit rules, static pattern rules, second expansion, call and
eval, etc.
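As a taste of the features mentioned, here is a sketch (target and variable
names are invented) combining a static pattern rule with second expansion:

    OBJS := foo.o bar.o

    # Static pattern rule: each listed object depends on its own source.
    $(OBJS): %.o: %.c
    	$(CC) -c $< -o $@

    # Second expansion: prerequisites are expanded a second time, so
    # $$* (the stem) can select a per-target list of objects.
    .SECONDEXPANSION:
    prog_OBJS := $(OBJS)
    %.bin: $$($$*_OBJS)
    	$(CC) -o $@ $^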

~~~
mkpankov
I ended up using $(eval ...) a lot, and it turns out Make's support for it
is... suboptimal.

As for the paper: I believe it's "Recursive Make Considered Harmful". I've
read it, and it's great; it was one of the main motivators during the build
system rewrite.

~~~
dlundqvist
Three or four levels down in $(eval ...) and $(call ...), I still have to
stop and think about how many $ I should have.
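The usual trap, for anyone who hasn't hit it: $(call ...) expands the
template once and $(eval ...) expands it again, so everything that should
survive into the generated rule needs doubled dollars. A minimal sketch
(names are invented):

    define PROGRAM_template
    $(1): $$($(1)_OBJS)
    	$$(CC) -o $$@ $$^
    endef

    prog_OBJS := a.o b.o
    $(eval $(call PROGRAM_template,prog))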

Yes, that's the paper; I was also heavily motivated by it. And when I read
that the JDK had also switched[1] to something similar, it just cemented my
belief that it was the right way to go, for the same reasons as theirs.

[1] - [http://openjdk.java.net/jeps/138](http://openjdk.java.net/jeps/138)

------
q3k

    wget --no-check-certificate https://... - | sh

Might as well not have that https in there at all... sigh.

~~~
nmcfarl
It's a self-signed cert, and just as encrypted as it would be with a
traditionally signed cert.

This is the half of SSL that I care about: I really don't care whether you
handed your money over to some organization that verified you have a working
phone number.

--

Actually, it doesn't appear to be a self-signed cert in this case, or even a
necessary flag: that cert works fine with both Safari and GNU Wget 1.14.

~~~
colechristensen
When you're installing software from https, you're not trying to make sure
nobody can see the contents of the message (it's publicly available), you're
trying to ensure that there's no man in the middle tampering with your
software en route.

A self-signed cert which you can't independently verify is entirely worthless
in this context. A man in the middle could simply substitute his own self-
signed cert and you'd be none the wiser.

You use signed certificates so that a vendor can prove their identity
reliably. I care that the software I'm downloading actually comes from the
owner of the domain I'm downloading it from. I can't do that with a
self-signed cert.

~~~
nmcfarl
So the argument here is that we are trusting GitHub, not the author of the
software, and that this way the code we audit on GitHub can be trusted to be
the same as the downloaded software; we don't have to use our own tools to do
that audit, we can just read the code on GitHub.

I can see that being a valid argument for GitHub. For self-hosted, non-famous
authors, however, the fact that they are who they say they are means nothing
to me°, and as such I'm going to have to audit the software on my box
regardless. (Or just forget about auditing and trust that the world is a safe
place, which is what most people do anyhow; and if you're doing that, you
don't believe in MITMs anyhow.)

°also, I would argue that the signed certificate doesn't prove that anyhow.
And state actors can forge them, so we are now talking about an attacker who
controls your pipes, is not a government, and hasn't hacked the end point.

~~~
colechristensen
The point is, if you don't verify SSL certificates, you might as well use
HTTP. HTTPS with self-signed certs provides you no security in any
circumstance when downloading public software.

Self-signed certificates and HTTP connections are trivially intercepted and
forged (ever used wifi in a public place?).

Signed certificates provide only limited proof of identity, true, but they
can't be forged by jokers hijacking the wifi in a coffee shop.

------
zimpenfish
Surprised no-one has mentioned redo[0]. I much prefer it to `make` for
personal projects these days.

[0] [https://github.com/apenwarr/redo](https://github.com/apenwarr/redo)
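For those who haven't seen it, each redo target is built by a small shell
script named after the target. A default.o.do along the lines of the examples
in the redo documentation builds any .o from the matching .c and records
header dependencies:

    # default.o.do: $1 = target, $2 = target without extension,
    # $3 = temporary output file that redo renames on success.
    redo-ifchange "$2.c"
    gcc -MD -MF "$2.d" -c -o "$3" "$2.c"
    read DEPS <"$2.d"
    redo-ifchange ${DEPS#*:}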

------
RVuRnvbM2e
I've always wondered why rake isn't more popular.

It's very make-like and simple, but with the advantage of a fully-featured
scripting language.

[https://github.com/ruby/rake](https://github.com/ruby/rake)
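A Rakefile sketch for a small C program, to show the make-like feel with
ordinary Ruby in the task bodies (file names are invented):

    # Derive the object list from the sources on disk.
    OBJS = FileList["*.c"].ext(".o")

    # Pattern rule: build any .o from the matching .c.
    rule ".o" => ".c" do |t|
      sh "gcc -c #{t.source} -o #{t.name}"
    end

    # Link step, re-run when any object changes.
    file "prog" => OBJS do |t|
      sh "gcc -o #{t.name} #{t.prerequisites.join(' ')}"
    end

    task default: "prog"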

~~~
mkpankov
In some environments, even upgrading Make to a newer version is a hassle. Let
alone installing some different tool.

------
vezzy-fnord
Another alternative to make is mk [1], originally written for Version 10 Unix
before becoming the standard build tool in Plan 9 and Inferno, and later
ported to Linux/BSD/OS X as part of plan9port [2].

[1]
[http://doc.cat-v.org/plan_9/4th_edition/papers/mk](http://doc.cat-v.org/plan_9/4th_edition/papers/mk)

[2]
[http://swtch.com/plan9port/man/man1/mk.html](http://swtch.com/plan9port/man/man1/mk.html)
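For make users an mkfile reads much the same; the notable differences are
recipe variables like $target, $prereq and $stem, and metarules with %. A
sketch (plan9port-style, names invented):

    CC=gcc

    prog: a.o b.o
    	$CC -o $target $prereq

    %.o: %.c
    	$CC -c $stem.c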

