Make for hipsters (mattandre.ws)
188 points by kiyanwang on June 20, 2016 | 194 comments



This introduction neglects to mention that Makefile rules are primarily rules to derive files (usually from other files). Make will not re-run a rule when the target (file) generated by the rule is already up to date.

From this perspective, a well-structured Makefile can be thought of as a pipeline/graph of functions that derive values (files) from previously known values (files).

Expressing a build process in this kind of way makes it possible to support incremental / distributed build in a fairly natural / obvious way.

It is less obvious how to support incremental / distributed builds for the more general kind of imperative build script that does a bunch of side effects (e.g. the style of build script you'd get using something like ant).
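The file-graph idea is easy to see in a toy example. A minimal sketch (POSIX shell with make on PATH; the file names are made up): make runs both rules the first time, and a second invocation does nothing because every target is already newer than its inputs.

```shell
# Toy two-stage pipeline: raw.txt -> sorted.txt -> report.txt.
dir=$(mktemp -d); cd "$dir"
printf 'b\na\n' > raw.txt
{
  printf 'report.txt: sorted.txt\n'
  printf '\twc -l < sorted.txt > report.txt\n'
  printf 'sorted.txt: raw.txt\n'
  printf '\tsort raw.txt > sorted.txt\n'
} > Makefile
make    # builds sorted.txt, then report.txt, by walking the graph
make    # reports that report.txt is up to date; nothing re-runs
```

Only targets whose inputs changed since the last run are re-derived, which is exactly the incremental-build property described above.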


For anyone who's curious: you can use a ".PHONY" target to force targets to be rebuilt (i.e. to declare that they don't correspond to real files). All dependencies of the .PHONY pseudo-target will be rebuilt unconditionally when encountered. Typically, "all", "clean", "install", etc. will all be phony targets, as the goal isn't to literally make a file called "all".

The syntax is simple:

    .PHONY: all clean install
marks "all", "clean" and "install" as phony. Even if the file "all" happens to exist, "make all" will still trigger the build.
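A quick way to see what .PHONY buys you (a throwaway sketch, POSIX shell with make on PATH): without the .PHONY line, the stray file named "all" would satisfy the target and the recipe would never run.

```shell
dir=$(mktemp -d); cd "$dir"
{
  printf '.PHONY: all\n'
  printf 'all:\n'
  printf '\t@echo building\n'
} > Makefile
touch all    # a file literally named "all" exists
make         # still prints "building": the phony target ignores the file
```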


I prefer to put the phony rule right before the actual rules, it's better for understanding:

    .PHONY: all
    all: ...

    .PHONY: clean
    clean: ...


> All dependencies of the .PHONY pseudo-target will be rebuilt unconditionally when encountered.

To be clear, they will be rebuilt only if they are not already up to date relative to their own dependencies, right?


yes.

Phony only ensures that 'all' is treated as a target, not as a file or folder.


This is only an issue if you do indeed have files or folders with the same names as targets, right? For example, if a file named "all" exists and you want make to ignore it when considering the 'all' target, you'd apply .PHONY to the 'all' target.


right.


Yes. Not mentioning .PHONY encourages people to do things incorrectly and not in line with how make actually works.


This sounds nice, except that the only kinds of dependencies are files (e.g. variables are excluded), and the dependencies should be mentioned explicitly, or things get ugly, especially when using non-standard compilers/tools.


No need to explicitly declare dependencies, you have wildcards and macros at hand to manage redundancy (1). If the compiler/program accepts a file and produces another, it can be used just as easily as the C compiler. My website uses awk, m4 and html-tidy on a pipeline to produce HTML, and with 45 lines of make (with about 10-15% whitespace) I build everything, with html-tidy being optional.
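A pipeline like that fits in a single pattern rule. A hypothetical sketch (GNU make; `tr` stands in for the awk/m4/html-tidy chain, and the `.in` file names are made up):

```shell
dir=$(mktemp -d); cd "$dir"
printf 'hello\n' > index.in
{
  # Every *.in page maps to a *.html output via one pattern rule.
  printf 'PAGES := $(patsubst %%.in,%%.html,$(wildcard *.in))\n'
  printf 'all: $(PAGES)\n'
  printf '%%.html: %%.in\n'
  printf '\ttr a-z A-Z < $< > $@\n'
} > Makefile
make
cat index.html    # HELLO
```

Adding a page is then just adding a file: the wildcard picks it up, and only new or changed pages get reprocessed.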


But some languages allow an "#include / import" construct. You'll have to somehow walk all those dependencies and convert them into something Make understands.


A rule can depend on imported files and exclude them with a modification to the variable for rule dependencies:

  a.o: a.c a.h
          cc -o ${.TARGET} ${.ALLSRC:M*.c}
This is bmake though; I don't know the POSIX way because I always confuse $*, $>, $+ etc. There is also mkdep(1), though I never used it.


Why would that be the responsibility of Make?

Of course you need to convert those externally -- with e.g. the C preprocessor for C files...


Well, Make could figure them out, for example by using the Linux ptrace syscall, or by using libfuse.

However, my biggest problem with Make is still that it does not allow for dependencies in the form of variables.


What do you mean by "dependencies in the form of variables"?


Let's say I need to build the library again with a different set of options. Make clean and rebuild is the only solution, even when it isn't needed.


What I do is split the makefile in two.

There's a top level makefile with targets like "make debug" and "make release". Those invoke a secondary makefile that actually builds things and takes options on the command line. That way I can build different configurations to different directories without having to copy/paste all of the rules for each configuration of each file.

For example, the main makefile for Wren[1] has rules like:

    debug:
      @ $(MAKE) -f util/wren.mk MODE=debug
That util/wren.mk [2] then uses "MODE" and a few other options, like:

    # Mode configuration.
    ifeq ($(MODE),debug)
      WREN := wrend
      C_OPTIONS += -O0 -DDEBUG -g
      BUILD_DIR := $(BUILD_DIR)/debug
    else
      WREN += wren
      C_OPTIONS += -O3
      BUILD_DIR := $(BUILD_DIR)/release
    endif
It has the real rules for compiling files:

    # VM object files.
    $(BUILD_DIR)/vm/%.o: src/vm/%.c $(VM_HEADERS)
      @ printf "%10s %-30s %s\n" $(CC) $< "$(C_OPTIONS)"
      @ mkdir -p $(BUILD_DIR)/vm
      @ $(CC) -c $(CFLAGS) -Isrc/include -Isrc/optional -Isrc/vm -o $@ $(FILE_FLAG) $<
Took me a while to come up with this and quite some time to get it working, but I've been pretty happy with it so far.

[1]: https://github.com/munificent/wren/blob/master/Makefile

[2]: https://github.com/munificent/wren/blob/master/util/wren.mk


How I handle that:

    .var.%: FORCE
    	@printf '%s' '$(subst ','\'',$($*))' | sed 's|^|#|' | $(WRITE_IFCHANGED) $@
    -include $(wildcard .var*)
    .PHONY: FORCE
Where WRITE_IFCHANGED is a program that essentially does 'cat > $1', but only if doing so would create or change the file. https://lukeshu.com/git/autothing/tree/build-aux/write-ifcha...

Then, if a file depends on the value of a variable, I have it depend on .var.VARNAME. Of course, that means you may have to do something like $(filter-out .var.%,$^) in the command to generate a file.


You could also output to different directories, depending on the values of variables, like for separate debug and release builds.

Of course, this is a work-around for variable dependencies. You'd have to change the filename somehow for every variable that you want to track.
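Stripped of the write-ifchanged helper, the stamp idea fits in a few lines. A GNU make sketch (the names flags.stamp and FLAGS are made up): the stamp file is rewritten only when the variable's value actually differs from what was recorded, so the stamp's timestamp is what drives rebuilds.

```shell
dir=$(mktemp -d); cd "$dir"
{
  printf 'out.txt: flags.stamp\n'
  printf '\techo building with $(FLAGS) > out.txt\n'
  # flags.stamp's recipe runs every time (force is phony) but only
  # touches the file when the recorded value differs from $(FLAGS).
  printf 'flags.stamp: force\n'
  printf '\t@echo $(FLAGS) | cmp -s - flags.stamp || echo $(FLAGS) > flags.stamp\n'
  printf '.PHONY: force\n'
  printf 'force:\n'
} > Makefile
make FLAGS=-O2    # first build
make FLAGS=-O2    # stamp untouched, out.txt not rebuilt
make FLAGS=-O0    # value changed: stamp rewritten, out.txt rebuilt
```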


At this point in the chain of thinking, you should read http://cr.yp.to/redo/honest-script.html


I prefer a help rule in the Makefile, like so:

   help:
        @grep -P '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-20s\033[0m %s\n", $$1, $$2}'
Assuming that the trimmed Makefile read:

   build: ## Build the binary

   clean: ## Clean the binary and intermediate files

   distclean: clean ## Clean dependencies, intermediate files and binary

   deps: ## Install dependencies

   all: deps build ## Get dependencies and build binary

   release: distclean all ## distclean + all

Now issuing make help would display:

   all                  Get dependencies and build binary
   build                Build the binary
   clean                Clean the binary and intermediate files
   deps                 Install dependencies
   distclean            Clean dependencies, intermediate files and binary
   release              distclean + all


First off I wanted to say this is amazing and I'm going to roll it into my current and future Makefiles.

Second, if you're on OS X the `-P` flag is going to have problems [1]. Works fine if you just replace `grep -P` with something like `ack`!

[1] http://stackoverflow.com/q/16658333/169153


This is great. I think `help` commands are fantastic things to add to projects right from the start. It really helps the next person to get up to speed.


Simplified: (no color, no grep)

    help:
        @awk -F ':.*##' '$$0 ~ FS \
                {printf "%15s%s\n", $$1 ":", $$2}' \
                        $(MAKEFILE_LIST) | sort


Lovely! I'm going to start using this in all my Makefiles.

Seriously, thanks. You solved a problem I had.


If your grep is like mine you may need to add "-h" to the "-P" option to get the desired output.


Do so many people who go on this site not know how to make makefiles? Also why is it for hipsters? Or is the article for hipsters? Why does the author call his own published notes factually inaccurate? Why is it hacky to use a makefile? Is this all a big ruse?


Unfortunately, many developers have a mortal fear of anything from before the year 2000. They've grown up with this tribal wisdom of unix and its various tools being a beardy wilderness of byzantine hacks and toxic waste dumps.

Meanwhile they'll merrily pile up a thousand node modules into a towering pillar of bad, just to run the equivalent of a shell one-liner, and call it a job well done.


I think it also has to do with the stackoverflow effect.

Tools that gain traction today are those that can be learned incrementally by doing a google search and copy pasting a solution, one problem at a time.

Out are all those systems that have an underlying theory to them that has to be mastered first. As pointed out elsewhere, there is a perfectly good manual for make[1] that you can read.

My theory is that it has to do with shortened attention spans, and impatience perhaps bred by constant use of technologies with short stimulus response cycles [2].

[1] https://www.gnu.org/software/make/manual/make.html#Introduct... [2] Cellphones.


This is an interesting thought. The Linux Documentation Project was full of full-length books on every aspect of administering a Linux system. I printed out many of them, put them in a spiral binder, and they lived on my bookshelf next to my O'Reilly collection.

Now I've only got one book on my desk, and it's more of a coffee-table thing about HTML and CSS.


> Unfortunately, many developers have a mortal fear of anything from before the year 2000. They've grown up with this tribal wisdom of unix and its various tools being a beardy wilderness of byzantine hacks and toxic waste dumps.

Some stuff is genuinely awful.

- Sendmail was ubiquitous, but it was, according to anyone who'd had either fleeting or deep experience with it, a beardy wilderness of byzantine hacks and toxic waste dumps.

- All the WU apps had major exploits every few months.

- SysVinit sucked too, regardless of which replacement you prefer.

- RPM's macro-based .spec file format? Awful.

> pile up a thousand node modules to run the equivalent of a shell one-liner

If you don't like putting small modules together to achieve a task, I have some news for you about shell and Unix...


>If you don't like putting small modules together to achieve a task, I have some news for you about shell and Unix...

No, really. The "UNIX way" is not just "Use tons of small things to do a simple task, complexity of managing them be damned".

It is indeed "do one thing and do it well", but that "thing" is usually high level enough and ready to go -- not something you need a whole stack of dependencies, configuration, and lots of extra glue to use.

And of course those unix tools should be standard and evergreen -- not like the npm module du jour that gets abandoned 2-3 years later or completely rewritten.


Totally agreed, that's why most npm modules are focused on specific tasks, and we use github and npm to track how active projects are before installing them.

It really seems like you don't do a lot of node - which is fine obviously, but you seem to have very strong opinions about it.


>It really seems like you don't do a lot of node - which is fine obviously, but you seem to have very strong opinions about it.

The first part of this is a guess -- and it's wrong.

I can't speak for "most npm modules" (there are 200,000 or so -- I've only used around 1000, transitive dependencies included) but even those focused on specific high level tasks tend to depend on tons of lower level modules of varying utility and quality -- leaving them open to things like the "leftpad" fiasco.

And using "github and npm to track how active projects are before installing them" is not really an answer to the UNIX way described.

First, because with UNIX you don't even need to check. You know all the basic tools, from grep and uniq to watch and make, are there for the long term (and have been for 3, 4 decades).

Second, because with npm, even checking for popularity doesn't tell you much. Even the most popular modules can be dropped by their authors, rewritten in just 1-2 years to behave differently, fall out of favor and languish etc. Heck that's true even for npm itself (e.g. the changes for version 3, like the flat sub-packages), gulp (see the way the new gulp handles dependent tasks and the new syntax required) and everything else.

Heck, 6 years ago Backbone had only just been released. And from then on we already went from it being the most used and "hot" thing, to Angular, React, React+Redux and who knows what's next.


> and it's wrong.

Fair enough. Just saying that's the impression I get.

> even those focused on specific high level tasks tend to depend on tons of lower level modules

Yes. Good libraries do that. Bigger bits are made from smaller bits.

> leaving them open to things like the "leftpad" fiasco

I don't think you understand what the "leftpad fiasco" was. Small modules don't have anything to do with it - people who unpublish their modules on a whim do. And that's not been prevented.

> And using "github and npm to track how active projects are before installing them" is not really an answer to the UNIX way described.

Indeed. It's an answer to your concern about "npm module du jour that gets abandoned 2-3 years later or completely rewritten".

I'm not sure that npm or gulp changing significantly is a bad thing. Good software evolves over time. As long as the changes are clearly communicated I'm 100% down with that. Look how much better express got!


>Yes. Good libraries do that. Bigger bits are made from smaller bits.

I think there's a marginal-returns (or worse, negative-returns) border with this approach, where the overhead, over-generalization and, finally, the bloat, maintenance and management issues stemming from having too many small parts from disparate sources justify writing a specialized solution.

Leftpad would be an extreme case in point -- I'd put the bar even higher than that.

>Small modules don't have anything to do with it - people who unpublish their modules on a whim do. And that's not been prevented.

The "unpublish" thing just made the issue instantly apparent, but the core issue would still be there even if nobody could ever unpublish anything.

E.g. such small modules could get abandoned and neglected with bugs or security issues; could carry extra bloat the parent library doesn't need (this might not hold for leftpad, but it would for anything a little larger that covers 10 things when you only need 2, which you could code yourself and be done with it); each is now a dependency that has to be tracked and updated; and it's not optimized for the particular use case it is used in, etc.


> E.g. such small modules could get abandoned and neglected with bugs or security issues

How well do you think a module with 1000 users is likely to be maintained, vs a create-your-own-wheel equivalent?

Do you think writing the latter is an adequate use of your time?

Do you think you'll discover all the edge cases?

How does writing your own wheel module rank when compared to working on features your customers want?


Respectfully:

> - SysVinit sucked too, regardless of which replacement you prefer.

> - RPM's macro-based .spec file format? Awful.

These are opinions, which is fair enough.

If you want to have some fun: install CentOS 7 in a VM, run yum update systemd, and reboot. Your VM is now bricked.

RPM spec files can be annoying, but what's more annoying to me is the spec behavior changing between RHEL/CentOS 5, 6 and 7.


> If you want to have some fun: install CentOS 7 in a VM, run yum update systemd, and reboot. Your VM is now bricked.

Yikes.


I like SysV-style init systems. I also don't mind RPM .spec files. They've always let me get the job done.


I hated SysV init, because writing init scripts is a chore - especially since you have little to no choice but to implement the detach/fork strategy to get a daemon to work properly without resorting to things like supervisord, etc. As much flak as systemd gets for trying to do too many tasks, I really find it a much better solution to `init`.

With that said, I agree about rpmspecs. They're simple to read and easy to maintain; it's part of the reason I chose CentOS and Fedora as our standard at work instead of Debian or Ubuntu.


I certainly can't disagree about the SysV boilerplate, but I also haven't written more than 5 or 6 of those init scripts in the last decade, so it's never seemed bad enough to me to ditch it for something completely different.

In my current day-to-day, between work machines and my personal machines at home, I've only got to deal with init stuff occasionally, so picking up the Systemd way of doing things will take me a while.


Please show me the makefile for your Javascript project that will run perfectly on Linux, Windows and Mac with little to no setup.

I doubt it exists. Until then, I'll stick with my Node.js build tools, thank you very much.

Honestly, if anyone has an example of a makefile that can bundle my scripts, insert bundle links into my HTML which is compiled from jade and nunjucks templates, minify the JS, CSS and HTML and optimize my images...I'd love to see that.


I can't even get `npm install` to run perfectly on my mac. It _always_ fails and needs to be re-run a few times to get everything installed.

Anyway, the point is, everyone likes the smell of their own farts. But, they still stink.


"Please show me the makefile for your Javascript project that will run perfectly on Linux, Windows and Mac with little to no setup."

One can argue that it would be better to solve that problem, rather than requiring everybody to sort-of solve it well enough to run one's own build.

Having said that, minifying .js, .css, .html or .png files is conceptually the same as turning a .c into a .o, and bundling is conceptually the same as tarring or zipping a set of files, so I don't see why it wouldn't be possible to write a makefile that does that.
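A sketch of that claim (GNU make; `grep -v` stands in for a real minifier such as uglifyjs, and the src/build layout is made up): one pattern rule maps each src/*.js to build/*.min.js exactly the way .c maps to .o, and bundling would just be another rule concatenating $(OUT).

```shell
dir=$(mktemp -d); cd "$dir"
mkdir src
printf '// a comment\nvar x = 1;\n' > src/app.js
{
  printf 'OUT := $(patsubst src/%%.js,build/%%.min.js,$(wildcard src/*.js))\n'
  printf 'all: $(OUT)\n'
  printf 'build/%%.min.js: src/%%.js\n'
  printf '\t@mkdir -p build\n'
  printf '\tgrep -v "^//" $< > $@\n'
} > Makefile
make
cat build/app.min.js    # var x = 1;
```

Swap the grep recipe for the real tool of choice and the incremental behaviour comes for free: only changed sources are re-minified.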


I tend not to do much with Javascript, but if you have a project that I can take a shot at converting, I'd be curious to see what trouble you're running into.

Windows is the only real challenge there, but with mingw, I can't imagine that it's a serious one.


Sure github.com/nwmcsween/asm.css/Makefile


"that will run perfectly on Linux, Windows and Mac with little to no setup"? No.

I see no reason at all to use a Makefile in that project other than it being your personal preference. It's completely non-standard for web projects and it will only serve to confuse and annoy the next developer who wants to work on your code.

Node.js, NPM, Gulp and Grunt are all orders of magnitude easier to get running on a wide variety of operating systems than your typical C/C++ build tools. Furthermore, you don't have to learn some cryptic syntax to use them. It's all Javascript and JSON.


Because gulp and grunt are not real build systems; they are glorified scripts that do whatever, with no tracking of dependencies, nothing. Make isn't a cryptic syntax, it's quite a simple language that took me about a day to understand fully.


Incorrect. Gulp and grunt can track dependencies the same way you do anything else with them... just add the right plugin.

Saying that gulp/grunt aren't real build systems because of one feature that you actually missed anyway is like saying that Javascript isn't a real programming language because it doesn't compile to assembly.

Practically nobody uses Make for JS outside of a few misguided folks. There are good reasons for that. One of them is that Make's craptic language is useless outside of that one single task. Another one is that Make sucks at cross-platform. It forces you to use OS tools that don't exist on every OS. Meanwhile, Node.js tools work everywhere.


There are far more developers who will jump on the latest new shiny with no good reason.


Which is why I don't pity them when they scream "Javascript fatigue!". Put down the shiny new tool and back away slowly: you might hurt yourself and others.


I think they don't know better - any relatively new web-ish dev has grown up in a server-side javascript ecosystem that's been maturing rapidly over the last 6-7 years, which has led to a lot of churn (frontend frameworks, js build systems).

Separately, on the information side, there just aren't as many articles on old, but stable technology - and so if you open up HN, all they're seeing is the latest; if your team doesn't have good experienced devs on it and if you haven't learnt it in college/bootcamp then you'll pick what is publicized.


In counterpoint, remember that there was a time when some of the things mentioned on this page, such as autotools, were themselves the "latest new shiny".


I think Make is also broadly associated with autotools, which surely can't help its public image.


Autotools is smoking shit, but the make man page is a few pages and very comprehensive. Also, the includable make libs that come with bmake let many makefiles be just one or a few lines. The best makefile I've ever seen is that of Linux; anyone considering make should look at it to convince themselves that, yes, it is possible to write a comprehensible, cross-platform, editable makefile for a big, deep project with configurables, without the atrocious madness that is GNU autotools.


The concept of autotools is fine for the most part; it's just M4 that has a really bad rap (and rightfully so, it's a huge pain in the butt!). I have a couple of toy Vala and C++ projects I've written where autotools "just works" without much effort, but when I've needed to do something outside of the standard functionality, the need to write M4 macros makes me quickly switch it out for CMake instead.


The problem with autotools is that it is an abstraction over a thing that's already simple and easy at its heart, resulting in something ultimately more complex than it should have (and would have) been.

M4 is nice for processing text, and I do use it and like it for that; but for defining dependencies among files and the processes to go from source to compiled files, make itself is second to none. Autotools is basically one of the biggest exaggerations in computing history.

Furthermore, bmake exists and its include files are simply better.


By the same token, old shit isn't good by dint of its age.


Every time I encounter the hodgepodge mess of bower, npm, grunt and gulp scripts that our front-end developers have erected to compile their javascript, I really wish they had taken the half-hour to learn make, instead of all of these other tools that don't work quite the same and do about half of what make will do.


And some of which will be rapidly deprecated.


>Do so many people who go on this site not know how to make makefiles?

You'd be surprised. Especially if they work outside C/C++ and do e.g. mostly web development, they neither care much nor know much about makefiles. At best they know how to configure, make and make install something on their Linux box, but most haven't even tried that.

>Also why is it for hipsters? Or is the article for hipsters?

The article is (supposedly) for hipsters. You seem a little challenged keeping up with the latest "culture" (ain't we all), so let me explain.

What he means is that the article is not for seasoned veterans, or unfashionable people who delve into the old-fashioned world of GNU tools and Make, but for "hipsters", i.e. the new-ish, young-ish developers, usually working with hip languages (and especially web, scripting, etc.), who are not rooted and well versed in the UNIX traditions.

>Why does the author call his own published notes factually inaccurate? Why is it hacky to use a makefile?

It's just a funny wording for the author to convey:

"I'm not a complete authority on Make and Make best practices; I'll just tell you what I know about it from experience. And since I expect some Make gurus to yell at me about my suggestions being inaccurate/hacky etc., I try to pre-empt them with this warning."


"Hipster" carries a more detailed connotation than that, at least in the US.

(Lots of fond memories arguing about the term in europe, where the word seems to lack its snobby, negative feel)


>"Hipster" carries a more detailed connotation than that, at least in the US.

Yes, but this isn't the generic term "hipster", it's applied to a specific context, that is hipster programmers.

>Lots of fond memories arguing about the term in europe, where the word seems to lack its snobby, negative feel

I guess it depends on where in Europe. In the places I'm aware of (e.g. not sure about Germany or the nordic countries), it is indeed negative. E.g. this is about hipsters from a UK perspective:

https://www.youtube.com/watch?v=lVmmYMwFj1I&feature=youtu.be


> Yes, but this isn't the generic term "hipster", it's applied to a specific context, that is hipster programmers.

Then he should say hipster programmers. I think of a less-parodied version of these guys: https://www.youtube.com/watch?v=IGkuxk-LPww

Also, the people I argued with were German. Perhaps a broad brush...


(1) I imagine quite a lot of this site's visitors use different platforms/toolchains and haven't encountered make. (2) It's a fun/joke title; the article is for "hipsters", the kind who know the latest fancy build tools but not make, I suppose. I wouldn't take it too seriously. (3) It's a disclaimer against people attacking him for errors, and also a warning that he's not an expert. Again, this is written in a lighthearted way. (4) See 3. (5) Maybe, or maybe you are just feeling cranky today.


If not a ruse, this is a dumb post that discredits itself and calls makefiles hacky, not noticing that they are used by many people for most native programming on most (unix type) platforms.

I'm feeling cranky every day, but just today this post is at the top of hacker news, and that's frustrating.


The refusal of so many people in this industry to learn a pretty simple and incredibly fundamental tool frustrates me no end.

I guess this article could be worse: the number of times I see people reimplement poorly thought out clones of make and present them as the new thing that everyone should use is just absurd.


You are frustrated about people not learning this tool, and yet when someone publishes an article encouraging people to learn that very tool, you damn it?


For simple use cases, Make is simple (and lovely, in my opinion). For moderate to complex cases, Make's syntax, and the contortions you have to put yourself through to bridge the gap between what Make expects and how certain things are done today, can cause a bit of pain.

That said, you can prize Make out of my cold dead hands. Still my favourite build tool, and damned powerful once you learn to tame it.


>If not a ruse, this is a dumb post that discredits itself and calls makefiles hacky, not noticing that they are used by many people for most native programming on most (unix type) platforms.

Bah, you just take things too literally, and can't process/appreciate some obvious and light-hearted cultural signs.

A post that makes a joke comment != a post that discredits itself.

A post that warns that its own makefiles might be hacky != a post that calls makefiles inherently hacky (in fact this is obvious, since the post's goal is the inverse: to promote Make to people).

A post that tries to introduce Make to a whole bunch of devs who don't know about it != a post that doesn't notice that Make is used by many people for native programming on unices


The gist of the post is that make is preferable to modern build systems. The author has used self-deprecating humour to defend himself against attacks, but clearly it was too dry.

If we stopped attacking the authors of articles, then it would be easier for them to write openly and less defensively, but unfortunately if you write anything on the internet, people will call you an idiot.


Why would it frustrate you that this is something people want to learn?

There are plenty of valid reasons why an established developer would never use Make, namely integration with the rest of their 'ecosystem'. Also, there's plenty of reasons for everyone to get in and use Make where appropriate.


I've been cranking out code since the 80s and I can count the number of times I've had to create a makefile on my fingers. Not everyone is working in the same domain with the same tools as you are.


I loved that old joke: Only one person has ever created a Makefile from scratch, and everyone else just copies and modifies it.


I thought that was an autotools joke. Make is simple.


I dunno. Maybe it's a reaction to the myriad build tools developed for Javascript recently. Could they have been done with makefiles instead?


That was Java all over again.

- you've got dependencies on some basic Unix commands (cp, rm etc.) in your tasks, so better rewrite that all to use your cross-platform language's facilities.

- XML, erm, no, JSON, shall be your only file format.


Well, in fairness, Java has the excuse that the cloud and virtual images didn't exist and it was basically a Windows world. Linux/Unix was barely known to most developers.

Java also pretty much brought the idea of dependency management (Maven) to mainstream programming languages (although I guess you could argue some of the early Linux distributions did as well (RPM)). It is sort of interesting how Make acts like local dependency management.


> Java also pretty much brought the idea of dependency management (Maven) to mainstream programming languages

And Maven is still the only "package manager" I don't end up wanting to beat my head after using. For all the flak that pom.xml gets for being wordy, I know that after installing the JDK and maven all it takes is running `mvn package` and all my build plugins plus dependencies will be downloaded, installed, project compiled, tests run with whatever test runner I decided to use that year, and, assuming they passed, the project packaged up in a .jar or .war ready to be deployed.

Some more recent tools like cargo also get this right, but I don't have much of a use case for rust since the majority of my time is spent in web and desktop apps. But EVERYTHING being in one file and one tool makes getting down to business easy; there's no "make sure you run npm install before you call grunt/gulp/brocc/whatever" because maven handles it all.

Oh, as an added benefit, I don't scream about dealing with proprietary libraries like I do with .Net. The .Net SDK for our document management system has to be installed on all systems that use it via an MSI, yet the Java one is just a .jar with a couple dependencies, I whipped up a pom.xml for it in 30 seconds, pushed it to an internal maven repo and boom, done. Hell, at that point getting it packaged into an RPM (because I don't do "sprawl shit over the filesystem" deployments) took me another 3 minutes.


I pretty much agree entirely but have been too ashamed to say it :)

The build tools in Rust are incredible (given the maturity of the language). OCaml is also rapidly evolving (after a brief period of stagnation). I think those two languages would -- and probably even now do -- make for excellent enterprise/business programming languages.

As a tech decision maker I would love to build future projects with Rust or OCaml, but the cognitive load and talent needed for both those languages is sadly higher. The OCaml build tools were a disparate mess back when I used the language in college and for pet projects, but I can honestly say there have been unbelievable strides forward... and OCaml compiles so damn fast. I still can't believe Go users brag about compilation speed when I swear OCaml has been and still is much faster (IMO/anecdotally of course; I don't have numbers).


> I pretty much agree entirely but have been too ashamed to say it :)

I can understand that entirely, it's way too easy to get ostracized for having a positive view on Maven or Java these days.

Java as a language has warts, but thankfully there's a thriving community around the JVM and a wealth of alternate languages to pick up. Still, you can pry Maven from my cold, dead hands.

> As a tech decision maker I would love to build future projects with Rust or OCaml

I would love to use OCaml but the lack of interfaces to a lot of tools we use really bites, so I've settled for F# (can't wait for .Net Core support). Until there's DB2 for i drivers plus Laserfiche by some miracle supporting any programming language not deemed "enterprise-grade" I'm stuck with .Net or JVM languages for a lot of things I do daily :(


I too think maven (from 2.0 onwards anyways) is awesome.

It's also declarative, BTW, which is a big improvement over Ant.

Maybe somebody should re-skin maven with a yaml based front end to attract the hipsters?


> It's also declarative, BTW, which is a big improvement over Ant.

It's an improvement over pretty much anything else except for maybe cargo and whatever that Haskell build tool is (name escapes me right now). Builds are not iterative/procedural tasks in my mental model; I simply have a list of inputs and desired outputs, so I agree that the declarative model of Maven is awesome.

I absolutely hate dealing with MSBuild files, Rakefiles, grunt/gulp/whatever because they all focus on a chain of tasks in a specific order. Why should I have to tell my build system how to do its job, I told you what I want done, just do it!

> Maybe somebody should re-skin maven with a yaml based front end to attract the hipsters?

A YAML to pom.xml converter would probably be a rather simple solution to this. Honestly, I wouldn't mind it either, there's a lot of noise in pom.xml that I could do without even as someone who likes Maven.


> whatever that haskell build tool is (name escapes me right now).

I guess you mean Shake.


Cabal.


NuGet is the current way of doing packages for .NET and get the same experience as mvn package.


Not even close. Just because Visual Studio will automatically download packages from NuGet when you click 'build' before it executes MSBuild doesn't mean your build server does, or when executing a build from the command line. They are two completely independent tools with almost no integration with each other, beyond maybe stuffing an MSBuild task in to handle nuget package restore before it loads any plugins.

With Maven everything is in ONE tool, and the ENTIRE configuration for the build is contained within your pom.xml. Custom Maven repositories, what build plugins need to be installed, how the build plugins are configured, project dependencies, relationships between multi-module projects, what mojos run at what phases: it's all in a single easy-to-read, declarative XML file. MSBuild is a giant clusterfuck compared to Maven, and its integration with NuGet is shoddy at best.

NuGet doesn't even allow me to specify repositories on a per-project basis to this day, everything has to be specified globally. There's a LOT more configuration required to get my build server set up to build .Net apps than anything compiled with Maven.


You're right. I hope they get there eventually - NuGet has a lot of attention at the moment due to it being a critical component of .NET Core.

Solution level repositories might go some way to fixing your last issue, see https://github.com/voltagex/junkcode/blob/master/CSharp/zzCo...


> Linux/Unix was barely known to most developers.

As a personal machine maybe not, but even in the early heady days of Java, Unix deployment was the norm. I mean, what are you going to pick: one of those new-fangled NT servers, or a proper IBM/Sun/HP-UX machine that has almost enough memory to run Oracle properly?

But given the "write once, run anywhere" mantra, Java had something to prove, which led to a lot of NIH.


> dependencies on coreutils

Or just install that Ubuntu on Windows thingy and be done with it.


As soon as that becomes available outside the fast ring I will be spending a lot more time on Windows!


Are there even serious developers on the Windows platform? I can't imagine many people lack the `rm` utility.

I heard a legend that even the Windows developers run some flavour of unix.


I suppose so. But there are always many tools for everything. And everyone can select his favorite.


The problem is if you want support (even commercial help) you'll need to follow some kind of prevailing trend. If I've got a system with grunt or gulp and I need help, I can probably find a local web developer. I'm not so sure if I can find someone who's an expert in Makefiles at the moment.


You already found at least one. Also try anyone with a degree in computer science. I'd wager it's in most first year courses; it was in 3 of mine. Also you could consult anyone who spends a lot of time working on/programming for Linux and/or in C/C++.

And it's really easy, and there's tons of literature on it on the internet. And it has a manpage.


I never encountered make in any assignment during my studies. It certainly wasn't mentioned in the lectures. At my university, you didn't learn any tooling at all, if you thought you needed it you had to teach yourself.


My professor encouraged us to learn Make but ultimately provided the file for us in every project.


Never heard of it until my 3rd company, much less using it in my first undergrad year.


Seriously? The most used build tool in existence... and you don't think you can find someone who knows it?

...That may be the source of your problems if your devs are that inexperienced!


The main issue is that the prevailing trend in web development changes every six months. I wonder how many developers that know grunt you will find in 3 years.


I see this rhetoric being thrown around all the time, but I don't actually think its true in practice.

Grunt came out in 2011 and Gulp came out 2 years later with a superior (IMHO) code-over-configuration approach. They've kinda been doing their thing since then.

Statements like this seem to be nothing but FUD.


It's not about when things come out, it's when people start talking about them. I've been on HN for 4 years (a relative newcomer, I know) and this is the first time I've seen either Grunt or Gulp mentioned.


So how can you be upset with the zeitgeist changing so frequently if it's taking you 4 years to hear about these widely used (in their field) tools?


if your devs can't handle learning basic make, all is lost.


I'd wager there are a fair few Windows devs who don't know make. Granted there are few who truly understand MSBuild to make the comparison fair.


yeah, makes you wonder what kind of bubbles some of these developers are living in if they see Make as some magical relic from the past.


where do you get "some magical relic from the past" from? Not knowing about something != thinking it is useless.

The "bubble" is "most people that never used C/C++" (because outside of there, make is pretty rare despite its usefulness), and even if you've used C/C++ you are not necessarily familiar enough with it to know how to properly use it or to realize you can use it as a general tool.

Why would you expect people to magically know about (and prefer to others) a tool that's almost never used in the ecosystems they touch, or only behind the scenes?


On Stackoverflow, I scan down the list of questions on the first page and all I see is a myriad of questions about tools I have no idea about and, therefore, cannot help with. Most of these questions have no, or one, answer; most of which aren't helpful.

So the problem with choice is the number of people who also know that tool who can help you.


Most people are probably writing web based apps in higher level languages that don't require make files. ("Hipsters" sounds like a dig at Ruby / Node devs to me).


Quite possibly, though I think most of the hipsters have left Ruby by this point.

For my part, as a Ruby developer, I detest rake and tend to write makefiles for my Ruby projects instead.


really? how come?

i use rake (very superficially) from time to time, and it seems fine. Any insights you have regarding its design after long-term or more indepth use?


Mostly that it adds complexity for very little benefit. 99% of the time, I can do the same thing just as easily with Makefiles and have it nicely interoperate with non-Ruby tools.


I mostly do web development, and I went for quite a few years without having any more of a build tool than a shell script or two to minify files and the like. It's only been since I started using React that I went to a Gulp setup.

And really, my experience with Gulp had been the opposite of the snide "hipster" dismissals. I wish I could find clean examples via googling that weren't useless due to APIs completely changing, or due to using a library that's been deprecated in favor of another library that's been deprecated...because, man, who still uses a library after a year? :P

Somehow, though, I don't see make being a great replacement for my purposes. At the very least, it's not going to be any faster running Babel over a pile of files.


The single biggest piece of advice I can give for Make is to make sure your text editor uses real tabs in Makefiles. My personal preference for everything is spaces (mostly a habit learned from Python's PEP 8), but Make doesn't honour spaces for indentation - only tabs.

Despite being an obvious usability hole, this has never been fixed. Exacerbating the issue is the poor error message given upon encountering a space-indented line in Make:

    Makefile:2: *** missing separator.  Stop.
Make is ubiquitous and fairly easy to use, but it's warts like this that remind you that it is fundamentally a build system from the far past.


That's an error message from GNU Make. I'm not sure what version of GNU Make you're using, because GNU Make has emitted the following since 1998:

  Makefile:2: *** missing separator (did you mean TAB instead of 8 spaces?).  Stop.
See 2c64fb221a265f9e7fc93374906b1e7540377561:

  1998-09-04  Paul D. Smith  <psmith@gnu.org>
  
  * read.c (read_makefile): If we hit the "missing separator" error,
  check for the common case of 8 spaces instead of a TAB and give an
  extra comment to help people out.
I guess you may have indented it in the modern way with 4 spaces of "tab". Either way, there's also help to be had from your other tools, e.g., vim highlights space-indented make recipes in glaring red error bars.


You can also add a line like this to your .vimrc:

  au FileType make setlocal ts=4 sts=4 sw=4 noet
and have vim automatically use "hard" tabs when it detects that you're editing a Makefile. Of course, you can s/4/2/g or whatever your preferred tab width is.


I have inquired if they would like a patch for this: https://savannah.gnu.org/bugs/index.php?48276

Thanks for mentioning, I had no idea this special case was here!


> Despite being an obvious usability hole, this has never been fixed.

Well, which implementation of make should be fixed? The thing about these truly ubiquitous *nix tools is that eventually people discover that there are unixes other than Linux.

…that said, a quick google found that this *is* fixed in GNU make; check the docs for .RECIPEPREFIX here: https://www.gnu.org/software/make/manual/html_node/Special-V...


> Despite being an obvious usability hole, this has never been fixed.

You can now change the prefix for your recipes by setting the .RECIPEPREFIX special variable (https://www.gnu.org/software/make/manual/html_node/Special-V...).

That being said, with a properly configured editor I've actually found that TAB is an ideal character for the recipe prefix.
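For example, a minimal sketch (GNU Make 3.82 or newer):

```make
# With this set, recipe lines start with '>' instead of a TAB.
.RECIPEPREFIX = >

hello.txt:
> echo hello > $@
```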


I mostly work in Python as well, but I always include an .editorconfig file in each project to automatically set things like

    [Makefile]
    indent_style = tab
http://editorconfig.org/

https://github.com/aaronbassett/EditorConfig/blob/master/.ed...


Not to sound harsh, but your problem lies here

    My personal preference for everything is spaces 
You want a tool that has been around longer than you have to do things your way. I think it is better if you learn to work with the tools and accept that there is a reason some things work in certain ways.


I've already configured my editors and tools to work with Make, and I'm not so much complaining about Make here as I am pointing out one of its most annoying warts. I am a pretty heavy Make user and I think it's great (not least of which because it's so darn ubiquitous).

And, not to sound harsh, but if you always "accept that there is a reason some things work in certain ways" without questioning if there may be a better way, well, you will always be stuck with the status quo!


Given the downvotes I assume this is a very touchy subject to some people here...

Anyway, are you saying that accepting a tool's established input format is to go with the status quo? I am all for breaking new ground, but is this one really a priority?


For a lot of people for a long time, yes. People were complaining about the syntax based upon specific types and amounts of whitespace in the 1980s. Now note from the rest of the very discussion here in front of us that people took action on this almost twenty years ago. And that's just Johnny-come-lately GNU make. Microsoft nmake accepted either spaces or tab characters pretty much a decade before that. And there were plenty of others, from Borland's to multiple shareware versions. Even I got in on the act with my own private reimplementation of make. Somewhere around 1990, if memory serves.


This is true, but it's important not to lose sight of the difference between "questioning if there may be a better way" and bikeshedding. The former is often useful; the latter usually isn't.


Make 4.2 was released this year. Many people prefer spaces over tabs and many editors insert spaces by default. Most compilers can deal with both spaces and tabs in their parsers, so I assume that it's not an insurmountable problem to make both work for Makefiles, or at least improve the error message. So calling it a "usability hole" is completely accurate.


Most sane editors automatically switch to tabs when editing makefiles.

Anyway, my point was, not everything needs to be customizable to your liking. Sometimes things have a set of simple rules you just have to follow. I for one like that we are limited to tabs; otherwise we would have problems with different numbers of spaces in the same file, and by the way, exactly how many spaces are a tab? I like 4, but maybe you like 8 or, god forbid, 2?


In case anyone's curious, I've been using Makefiles for web development for a long time. The src files of a webpage are un-minified javascript, less/sass files, and thumbnails. The dst files are the minified/built/thumbnailed output. This suits Make particularly well.

https://gist.github.com/zx2c4/11de49b2780c787b3ed5e1ee394857...

That's the Makefile I use on https://www.edgesecurity.com/ if you're interested. It also has in it some very simple ssh/rsync deployment mechanism. Since the deployment relies on the built files, I can simply run:

    $ make deploy
And the entire webpage is built and deployed in one step.

Every webpage I manage has a variant of this, with all sorts of rules for building and managing things. It turns out to be much easier than grunt or ant or random bash scripts or vagrant or whatever else kids use these days.


This tutorial is awful.

You should read the manual, which is shorter than that page!

make is NOT:

    alias:
        commands
this will bite you when you least expect it. Make is:

    targetfile: inputfiles
        commands to get targetfile to exist
and you can use things like

    %.css_min: %.css
        sass stuff

    .PHONY: release
    # above is needed otherwise you won't build if there is a release/ dir
    release: release-min.css

    release-min.css: (list *.css_min files, there are helpers for it)
        cat *.css_min > release-min.css

Anyway, you get the idea: work with files, and think about why you need those files.
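Fleshed out with $(wildcard) as one of those helpers, it might look something like this (`minify` here is a hypothetical stand-in for sass/cssnano/whatever tool you actually use):

```make
CSS    := $(wildcard *.css)
MINCSS := $(CSS:.css=.css_min)

# 'minify' is a placeholder; substitute your real minifier
%.css_min: %.css
	minify < $< > $@

.PHONY: release
release: release-min.css

release-min.css: $(MINCSS)
	cat $(MINCSS) > $@
```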


Yea, you can just do what he is doing using shell scripts anyway. I don't see why you'd use make for it.


Meanwhile, there’s a perfectly good introduction in the GNU Make manual:

https://www.gnu.org/software/make/manual/make.html#Introduct...

Also available in print: https://shop.fsf.org/books-docs/gnu-make-version-381


I learned how to use make by reading OpenBSD's man page for it[1]. It's written extremely well, and is concise yet comprehensive. Of course, it's specific to BSD make, but most of the concepts are portable to GNU make with some caveats. FreeBSD and NetBSD, iirc, both have similarly high-quality manuals for their respective makes, and I've come to favor BSD make in general, using bmake[2] on Linux and Cygwin for my own projects (bmake being a portable, packaged-up version of NetBSD's make).

[1]: http://man.openbsd.org/OpenBSD-current/man1/make.1

[2]: http://www.crufty.net/help/sjg/bmake.htm


Unless your distro excludes the Texinfo docs, the same manual is probably also available locally:

    info make
GNU tools often use man pages for a quick reference of the options, return codes, etc. Unfortunately, as info(1) isn't as well known as man(1), it's easy to miss that many GNU tools have detailed manuals.


thank you for posting this


I like to use makefiles for managing docker containers / images. For my needs it's a more practical and flexible solution than other docker orchestration / management tools I've tried. A few things I've learned that might be useful:

* (cd /path/to/something && make goal)

* You can use .DEFAULT_GOAL := goal_name to set the default target run by 'make' with no target name specified.

* Adding .SILENT will suppress echoing of the commands make runs (use --no-print-directory to hide the entering/leaving directory messages). Probably wise to be careful with this one until you're 100% sure you're not ending up in an unintended directory doing nasty things.

* You can set a color variable with `tput setaf x` and a RESET variable with `tput sgr0` to add colors to your make output. For example @echo "$(RED)ERROR!$(RESET)"

* Use [ ! -f ... ] guards at the start of goals that rely on external scripts/programs. Otherwise make will run until it hits the error, which will leave you with a partially completed goal.

    goal:
        @if [ ! -f ./scripts/blah.sh ]; then echo "ERROR: ./scripts/blah.sh not found"; exit 1; fi
        @echo "without the if ! -f this command would run"
        @echo "and so would this one"
        @./scripts/blah.sh $(IMPORTANT_STUFFS)
        @echo "but not this one"
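The tput trick above, spelled out (the 2>/dev/null keeps the $(shell ...) calls from complaining when TERM is unset):

```make
RED   := $(shell tput setaf 1 2>/dev/null)
RESET := $(shell tput sgr0 2>/dev/null)

fail:
	@echo "$(RED)ERROR!$(RESET) something went wrong"
```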


> You can use .DEFAULT_GOAL := goal_name to set the default target run by 'make' with no target name specified.

I had always, perhaps erroneously, thought that the very first goal in the file was the default one.


IIRC .DEFAULT_GOAL is a relatively new feature. If it's not set goals will be processed in the normal order.


This is true -- the first goal declared is the default.
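Both behaviours in one sketch:

```make
build:
	@echo building

clean:
	@echo cleaning

# Without the next line, plain `make` runs `build` (the first
# target). With it (GNU Make 3.81+), plain `make` runs `clean`.
.DEFAULT_GOAL := clean
```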


I routinely use a Makefile for building my frontend resources, which works quite well:

https://github.com/adewes/gitboard/blob/master/Makefile

Using the inotify tools it's even possible to automatically trigger a rebuild whenever a file changes.

For me, no need to use Gulp anymore, as Makefiles are easier to reason about, more composable and make use of existing infrastructure instead of reinventing the wheel over and over again.


> You need to create a makefile to tell make what to do.

Nope. You can say "make hello" and have make automatically use its default rules - which can be configured - to e.g. compile hello.c into a hello executable. I use this frequently.


Make's default implicit rule set is very useful for C, C++, Fortran, Pascal, and basically nothing else.

Out of curiousity: how do you configure the implicit ruleset? I wasn't able to find documentation on that (the closest I found was https://ftp.gnu.org/old-gnu/Manuals/make-3.79.1/html_chapter..., which explicitly documents the Make implicit rules but says nothing about changing/augmenting them).


Many implicit rules make use of variables, which you can change as you see fit. For example, creating an object file from a C file will do something like "$(CC) $(CFLAGS) -c -o $@ $<". You can change CC to change the compiler used, and add stuff to CFLAGS to add compiler options, all without touching the rule itself.

Other than that there's not much configuration, but the predefined rules are usually quite simple, so it's pretty easy to just write your own implicit rules if the predefined ones don't suit your needs.
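For example, this sketch never spells out a compile command, yet `make` will build hello.o using the built-in rule:

```make
CC     := gcc
CFLAGS := -O2 -Wall

# The built-in %.o: %.c rule effectively runs:
#   $(CC) $(CPPFLAGS) $(CFLAGS) -c -o hello.o hello.c
all: hello.o
```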


I hate implicit rules. Also, the %: %.c implicit rule is especially idiotic: it causes make to check this rule for every single file, so if you have hello.h, then it will search for hello.h.c too.


Well, you either force people to name their files correctly, or you make the computer do a little extra bookkeeping so that they don't have to. This is the silliest thing to hate about make that I've ever heard of.


A better title would have been 'Make 101'. It's an absolute beginner's tutorial.


A more precise title would have been 'Make 101'. I also wouldn't have read an article by that title, although (not knowing any make) I'm glad that I did read this one.


Something unmentioned so far: when you start getting "advanced", the make scene is much like the shell scene, and it's fairly easy to write something that only runs on GNU make or only on BSD make, with plenty of opportunity for argument about that being a bug or a feature etc.

So there's that. Make being such a good idea and a popular standard, there's no shortage of people implementing not quite compatible make-like-systems.

Another fun tip is make your makefile executable and try the first line like

#!/usr/bin/env gmake

or whatever seems appropriate. Or perhaps GNU make is installed as "make" on your OS or you use an "alternatives" link farm or whatever. Some people will flip out if you put arguments on that command line, others will pat you on the back and high-five. So there's that "fun".


One really neat thing about make: If you include other makefiles then before including them make checks if there are prerequisites of these files and rebuilds them as needed. One can use this to automatically generate header dependencies of translation units for C/C++ projects. Some of the relevant lines in one of my Makefiles:

     include ${DEPENDS}
     
     ${DEPDIR}/%.d: ${SRCDIR}/%.${SRCEXT}
     	@mkdir -p ${shell dirname $@} ; \
     	$(CC) -MM ${INCLUDE} -mmcu='${AVRTARGET}' $< >> $@
 
If any file in the "include" line needs updating then it is generated by gcc and included in the same make run. I didn't find good documentation about this feature; I discovered it with make's pretty verbose -d option. Previously I did something like this using multiple makefiles and recursively calling make, which is generally frowned upon (for valid reasons).


I think these days you're better off using -MMD to generate header dependencies at the same time as compiling, rather than doing it in a separate rule.

    %.o: %.c
            $(CC) $(CFLAGS) -MMD -c $<
    -include $(OBJECTS:.o=.d)


Is this always correct? The %.d files don't depend explicitly on the %.c files, so they aren't rebuilt at the beginning of the make run, meaning stale dependencies are included after a change in header dependencies. I'm still trying to construct an example where such a makefile breaks, though.


The only case it becomes problematic is if you delete a header file, as make won't know how to remake it and won't recompile the C file (removing the dependency) until it has done so. There are two options to resolve this:

The first is to also specify -MP on the GCC command line, which causes GCC to emit an empty rule for each of the header files, like this:

    foo.o: foo.c foo.h bar.h
    foo.h:
    bar.h:
Alternatively, you can write a generic empty header creation rule:

    %.h:
Which has the same effect.


Neither make nor any of its alternatives or improvements get header files completely right. This is because compilers simply do not emit all of the information. A build tool not only has to know about the header files found, to monitor them for changes, but also has to know about the places where header files were not found, to monitor for files suddenly appearing there. I wrote a C++ preprocessor that would spit out both sets of information, for use with redo, but I've never published it. Without this information things can end up not being re-built correctly in a number of cases that one can encounter in practice.


> A build tool not only has to know about the header files found, to monitor them for changes, but also has to know about the places where header files were not found, to monitor for files suddenly appearing there.

I wonder how my makefile fails here.


I don't see what this has to do with hipsters. The fonts are not very cool.


Abuse of a top-level domain which occurs at the end of the author's name - that gets about a 7/10 on my hipsterometer.


Not knowing something any CS graduate should already know, and then feeling so great about discovering it that you just have to write a blog post about it?

That gets an 8/10 on my hipsterometer


I own forhipsters.com, what does that score?


5/10 but only because it's not forhipste.rs


Here's a more in-depth and easy to read guide for beginners: http://www.computerhope.com/unix/umake.htm

If you want to see some interesting/convoluted Makefiles, get a very complex, large piece of software (something from Gnome, possibly) and use autoconf & automake to generate the Makefiles, then pore over them to see how they tick.


For the target audience (total Make beginners) poring through autogenerated makefiles might not be such good advice. It's functionally equivalent to telling people to learn Bash by studying a `configure` script, or learning JavaScript by studying Emscripten output.

If you're decent with Make and want to learn advanced tricks, then by all means go right ahead, but 95% of Make users will never need to delve into such details.


Fixing autoconf/automake issues (particularly when autotools is upgraded) is one of the least fun things to do.

Make is fine. Automake is horrific.


I feel like the author just structured their build dependencies wrong. I used to work on a project that had a 16,000 line Makefile, where almost the entire file was actively in use. So you can abuse any build system.


Another way is using npm-install-changed for the "npm install" problem and fix everything else with other npm scripts like npm-run-all. Use make if you like it but if everything else is npm you can solve most problems using "npm run" and npm scripts.

https://github.com/oNaiPs/npm-install-changed

http://blog.keithcirkel.co.uk/how-to-use-npm-as-a-build-tool...

[edit: formatting]


Supplying even a trivial Makefile is advantageous if your target audience knows it. They'll see a Makefile and just do `make && make check && make install` without needing to browse your README.

For example, Debian's packaging process is heavily optimized towards Makefiles that supply the standard targets, and you can reduce a lot of friction for packagers if you supply a well-behaved Makefile with your program. (Even if developers would usually `go build` or whatever instead of `make`.)


Yes, it is good to adapt to your target audience. If that is Linux users, a Makefile might be the best. For a lot of other users, like JavaScript web developers, npm might be better. For end users on Windows, an MSI or EXE installation file is probably best. Makefiles are unusual on Windows, which makes them a somewhat bad solution even for developers on Windows machines. If you want to create a system that works on multiple platforms there are probably better ways. At least make sure not to use some *nix-only syntax in the makefile or call grep or awk.


Makefiles are only considered unusual on Windows if one ignores decades of .MAK files. (-:


I use make to manage JS and container builds. Couldn't imagine using anything else right now.


Answer me this: does anyone have a good cross-platform method of writing Makefiles? I guess with Windows now having bash, I can actually just target bash scripts?


Take a look at makefiles from various suckless projects. For example: http://git.suckless.org/dwm/tree/Makefile

Their makefiles work with GNU Make and BSD Make; don't know about nmake (the Windows one, AFAIR).


You _can_ use the GNU Autotools on Windows, though you need a shell installed (and a vaguely sane POSIX environment). CMake does a good job targeting a bunch of environments, including Windows.


CMake? Something about having to write a build system for my build system always makes me feel like I'm doing something wrong, though ;-)


Write make files, or write CMake files.


Pop make quiz: what is the difference between these two Makefiles?

    foo: bar

    bar:

versus:

    .PHONY: foo

    foo:

If a file named bar happens to already exist, the first Makefile stops forcing a rebuild of foo, while the .PHONY version runs unconditionally.
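Right; you can watch it happen (GNU make assumed):

```shell
# bar acts as a "force target" until a real file named bar exists
printf 'foo: bar\n\t@echo building foo\nbar:\n' > Makefile

make          # prints: building foo
touch bar foo
make          # now reports that 'foo' is up to date; recipe skipped
```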


Making Make Make Sense from @izs is great:

https://www.youtube.com/watch?v=dsqBSgdQz_8


Make is old and crufty, but it's still the best at what it does.

What I'd really love right now is a program which can look at a Go source tree and write make rules such that I don't have to run go build or go install unless dependencies have actually changed. It's really annoying to be pushing new images when nothing has actually changed.


> but it's still the best at what it does.

If "what it does" means "build complex software projects" I think your claim is rather debatable.

GHC is currently switching away from make after having already overhauled its make-based build system two times:

http://research.microsoft.com/en-us/um/people/simonpj/papers...


The problem with $buildtool is that 99.99% of the time the people implementing it had no idea how make worked and invented something easier to use in the particular domain (grunt, gulp, webpack and many more). Make creates a DAG of files and resolves targets; the examples given in this article also fail to understand how make works.


The depressing thing is that after decades of trying, there still doesn't seem to be anything better than Make.

And Make is terrible.

At least if you have to deal with more than one directory.


Make is a great tool for automating simple things, but if you use a programming language you should probably use native automation tools for common tasks. If you have any non-trivial logic or want to enable code-reuse make doesn't seem like the right choice either.


I hate this. Every programming language that bolts on its own build environment. Make is language agnostic and that's a huge benefit, it means you have to learn about just one build tool and its peculiarities and that will allow you to build many different projects in different languages.

All these languages with their half-baked build tools that won't accept that they may have to play nice with other languages and their build tools are not helping.

Of course it is great when you write a language to also throw in a build tool, but in the end if the build tool re-implements 30% or so of make in a broken way I don't see the point.

Make does have its limitations, but most ordinary projects get nowhere close to reaching those.


> Of course it is great when you write a language to also throw in a build tool, but in the end if the build tool re-implements 30% or so of make in a broken way I don't see the point.

Well, other build tools sometimes also do a lot more than make ever did. (And in some areas, perhaps less.)

Example: Cabal/cabal-install (for Haskell) can automatically fetch all your project's dependencies and automatically compile them for you.

> Make does have its limitations, but most ordinary projects get nowhere close to reaching those.

I suppose it depends on what you mean by "ordinary", but IME Makefiles, by virtue of being purely file-based, are useless for most of my projects. There are many languages (e.g. Java/Scala, Haskell) where it just doesn't fit for normal development. As an example: tracking intra-file dependencies for proper recompilation is really difficult for Scala code without reimplementing a huge chunk of a Scala compiler. (I'd venture to suggest that you'd be foolish to even try doing this in Make.)


> Tracking intra-file dependencies for proper recompilation is really difficult for Scala code without reimplementing a huge chunk of a Scala compiler.

tup (http://gittup.org/tup) has a really nice way of handling interdependencies. They set up some filesystem magic to figure out which files were read while compiling a certain target, and record these files as dependencies for the target.

Another instance of a hard problem ("reimplementing [...] a Scala compiler") that turns into a very easy problem once you look at it from the right angle.
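For a flavour of what that looks like, a minimal Tupfile might read (file names are hypothetical):

    : foo.c |> gcc -c %f -o %o |> foo.o
    : bar.c |> gcc -c %f -o %o |> bar.o
    : foo.o bar.o |> gcc %f -o %o |> app

Because tup traces which files gcc actually opens while each rule runs (headers included), you never hand-maintain a dependency list; the recorded reads become the dependencies automatically.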


I love tup so much. It's a "functional reactive" build tool that practically nobody's heard of, so I'd argue that it's even more "hipster" than make (whatever that means).

When I decided I needed a single build system to replace a hodgepodge of stuff, I looked at make because that was apparently the go-to thing.

It took me about 2 minutes to decide that I wanted no part of make. I'd assumed it was basically what Tup is. I'm lucky that I learned about Tup from a thread on this site, because it's made me so happy (especially once the initial pain was over... and yeah, you have to change).

So whenever I see these posts, I make sure that someone is mentioning Tup as a clean alternative.

edit: to be explicit, Tup is obsessed with speed and correctness, and handles incremental builds optimally (see Mike Shal's whitepaper on this). The "price" is that you can't modify files during a build. A file is either "regular" (not touched by build) or "generated" (produced wholly from one build rule). You have to describe the complete DAG up front. Because of all this, Tup will actually remove build targets when they're not needed, which I believe no other build system does. I could never go back.


The only practical problem with tup is that it's not make, which creates friction for packagers. And since I'm doing some packaging work myself, I'm heavily optimizing for packager happiness.


Sure, tup is very nice, but it doesn't solve the problem in a way that leads to anywhere near efficient re-compilation on small changes (for Scala, at least -- it might be workable for Java). For that you need something like zinc[1]. This is because the dependency units in Scala don't map cleanly to files.

> Another instance of a hard problem ("reimplementing [...] a Scala compiler") that turns into a very easy problem once you look at it from the right angle.

Only if you don't solve the problem fully. TBH, not even zinc solves the problem 100% and sometimes does a little bit of unnecessary re-compilation. It is very good, though.

[1] https://github.com/typesafehub/zinc


> Make is language agnostic and that's a huge benefit, it means you have to learn about just one build tool and its peculiarities and that will allow you to build many different projects in different languages.

You need to learn the Make build tool (easy) and then you need to learn bash scripting to get anywhere (argh!).

The OP has linked an example Makefile and I wouldn't be able to maintain/debug it without reading plenty of manpages:

https://github.com/Financial-Times/n-makefile/blob/master/Ma...


I am not sure if you can get rid of some of those commands with any other build tool. There is simply no build tool that includes everything, so sometimes some scripting will be required. Then, why not go with bash, which you hopefully already know?

Also, a lot of the complexity is due to him being new to make:

    asset%:
    	@if [ -e webpack.config.js ]; then ... fi
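A more conventional way to express that kind of conditional in GNU make is to decide at parse time with $(wildcard) instead of shelling out to test inside the recipe; a sketch (target name and commands hypothetical):

    ifneq ($(wildcard webpack.config.js),)
    assets:
        webpack --config webpack.config.js
    else
    assets:
        @echo "no webpack config; skipping assets"
    endif

$(wildcard webpack.config.js) expands to the empty string when the file is absent, so the right rule is chosen once, when the Makefile is read.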


... which is of course addressing make as merely a "build tool" for "projects".

One of the normal conventions for Daniel J. Bernstein's tinydns is to use make. No "projects" or "builds" are involved. The goal is not to create a program. Rather, make is used to ensure that the binary form of the DNS database is re-built whenever the system administrator has edited the text form. The administrator simply has to remember to run "make", and the Makefile re-builds, and atomically publishes, the new DNS database.

The simplest makefiles are of the scale exhibited at http://cr.yp.to/djbdns/run-server.html
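That convention is essentially a two-line Makefile: tinydns-data reads the text file "data" in the current directory and atomically replaces "data.cdb". A sketch (the install path of tinydns-data varies):

    data.cdb: data
        /usr/local/bin/tinydns-data

Edit "data", run "make", and the served database is rebuilt and swapped in atomically; run "make" again and nothing happens, because data.cdb is already newer than data.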

But one can create significantly more ambitious makefiles than those. (-:


Notice how in the article nothing is "built", all that happens is an npm install. It sounds like make is used to manage a project and perhaps do things like running unit tests, pushing code to stage/qa/prod, fetch data from remote hosts to test locally, running linters and code analysis tools, generating assets, and many more.

Is make the right tool for executing such tasks? Well, it's not bad for some I guess, and ok if this one project is all you have.
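When make is used that way it is typically just a set of .PHONY targets wrapping shell commands; a sketch with hypothetical commands:

    .PHONY: test lint deploy
    test:
        npm test
    lint:
        npm run lint
    deploy: test lint
        ./scripts/deploy.sh

None of these produce files, so .PHONY forces them to run every time; the only make feature really being used is dependency ordering (deploy runs test and lint first).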


That's true, but it also leads to bloated special purpose tools that take minutes to run, as described in the introduction of the article.

Sometimes you're better off having something that does exactly what you need, regardless of speed. Sometimes you need something fast, lightweight, ubiquitous, and reliable despite its quirks.


I guess make could be for hipsters in the same way that vinyl is for hipsters, i.e. obsolete technology that should have gone away by the end of the 80s.


make is not really used in this post; he wrote shell scripts with a single tab of indentation.



