I’ve been using Make for years now, but usually only for very basic things. I always feel like it can do much more.
Are there any good docs/books/articles about advanced usage of make?
> Of course, the manual itself is really very good!
I actually don't agree with this. It's really good as a "user guide", but it's not the best as a "reference manual". It does not always clearly define terms, it freely mixes technical terms and hand-waving language, the writing style tends toward an "expository prose" that makes it hard to find specific pieces of information. In general, it could do with some reorganization using the Diátaxis principles: https://diataxis.fr/
It does have pretty good indices, and the quick reference is very helpful. However the HTML version (can't speak for the others) needs a lot more anchors, to make it easy to cross-link to individual elements, rather than just to chapter headings.
Huh. I actually thought the manual was great, but you make excellent points that I happen to agree with.
I'm very comfortable with the manual now when I need to look something up, but that's because I've spent years looking things up in it, so I have a fairly good idea of where to find things. Even so, it's sometimes confusing, e.g. where do you find the definition of automatic variables like `$@`, `$^` and the like? When I started learning Make, I remember it being infuriatingly hard to find the entry for a topic of interest.
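(For anyone hunting the same thing: the automatic variables turn out to live in the "Automatic Variables" section of the chapter on implicit rules, which is part of why they're hard to find. A minimal rule showing the three most common ones, with placeholder file names:)

```make
# $@ = the target, $^ = all prerequisites, $< = the first prerequisite
app: main.o util.o
	$(CC) -o $@ $^

%.o: %.c
	$(CC) -c -o $@ $<
```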
While the manual is great (as are most of GNU's major works!) compared to most of the software ecosystem [1], you're quite right that it can still improve a lot.
[1] It's kind of amazing how even a company like Apple can have some pretty underwhelming documentation, see today's post about ExtensionKit https://news.ycombinator.com/item?id=33409558
I'm a big fan of using a standalone Makefile to manage the build of a project rather than having the IDE do it by "magic." Makefiles can also be used to handle a lot of repetitive processes or tasks that have to happen in a certain order based on dependencies -- they aren't limited to running compilers.
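For example, here's a sketch of a "task runner" style Makefile (the script names are made up), where the targets are tasks rather than files:

```make
# .PHONY marks these as tasks, not files; Make still runs them in
# dependency order, and only the chain you actually ask for.
.PHONY: build test deploy

deploy: test            # deploying requires passing tests
	./scripts/deploy.sh

test: build             # testing requires a finished build
	./scripts/run-tests.sh

build:
	./scripts/build.sh
```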
One thing that's helpful is to study how real-world projects use Make. Try to understand them, and look up pieces you're missing from the reference material. You'll probably end up having to dive into autotools, too, for some of it.
[3] Install GNU Coreutils (just type “brew install coreutils” in the terminal)
Command Line Tools is the package that contains the compiler (clang), Git and all the expected development tools.
Obviously, depending on what kind of development you’re doing, you’ll install the runtime packages and utilities you would have installed on Linux, BSD or Windows anyway.
If you’re a web developer, for example, you'll probably need to install NodeJS, and a bunch of bundlers, linters, pre/post processors, task runners (Gulp, Grunt, Webpack, Parcel, etc.).
No matter which platform you’re on, nobody loves their default terminal emulator, so you’ll need to install iTerm2, Alacritty, Kitty or my new favorite, WezTerm.
Oh and the editor! macOS comes with Vim and Emacs, but you’ll want your favorite distribution/version/fork. (If you’re into VS Code, you can brew install vscodium). Even if the bundled version is what you want, then there’s the plugins, extensions, packages you need. Again, regardless of platform.
The point is, no matter what platform you choose, you need to install a bunch of things to get going. macOS isn’t significantly worse in this regard.
Why did you bother to install Homebrew in step #2, though, if you are unwilling to then use it to install something as trivial as a newer version of make?
mostly i use make for small tasks that will also run on other people's computers -- not worth the hassle to ask them to install a newer make just for that.
even for myself, as much as possible i try to just rely on what's built-in, if reasonable. time is precious!
the mac is shitty in a million ways, but it's still by far the least shitty, for me, out of the windows, linux, mac trinity. tho at this point it's been years since i've spent much time non-mac, so maybe the situation has improved -- i'll probably give it another go once asahi linux matures (or its m1 support is adopted by other distros).
(also the arm mac notebook hardware is just insanely good, imo.)
The GnuWin32 port of Make [1] is version 3.81. I've used it when I needed GNU Make on Windows. I know there are other ways to get newer versions of Make on Windows, but the GnuWin32 port is simple, standalone, and it still works despite its age...
If this wasn’t LWN, I’d call it “blogspam”, but I know LWN does this to have their own archive which they can link to. But this does not mean that there’s any good reason for anyone outside of LWN to use LWN’s copy of the announcement, unless the original link had gone stale or something.
LWN is not a general mailing list archive; AFAIK, LWN only archives the posts which they want to link to in an accompanying or related article. LWN does not, AFAIK, archive every post on, in this case, the info-gnu list.
> Previously each target in an explicit grouped target rule was considered individually: if the targets needed by the build were not out of date, the recipe was not run even if other targets in the group were out of date. Now if any of the grouped targets are needed by the build, then if any of the grouped targets are out of date the recipe is run and all targets in the group are considered updated.
This is a welcome change! I always thought this was how Make worked anyway, until I tried it and it didn't. Unfortunately none of my coworkers using stock Make on their Macs can benefit out of the box, but since they all have Homebrew already, I can at least insist that they install v4.4.
> .WAIT
What's a use case for this? Is the idea that you can use it for explicitly sequencing build order of dependencies, without making them actually dependent on each other? Why would you want that?
> let, intcmp
The Turing Tarpit grows... do we have a Scheme interpreter in Make yet?
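They're fairly tame as helpers go, though. A sketch of both (requires GNU Make 4.4+):

```make
# $(let ...) binds names to the words of a list; the last name
# takes whatever is left over.
first = $(let head rest,$1,$(head))
# $(intcmp lhs,rhs,lt,eq,gt) compares two integers and expands
# to the matching branch.
sign  = $(intcmp $1,0,negative,zero,positive)

$(info $(call first,a b c))   # prints: a
$(info $(call sign,-5))       # prints: negative
```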
> What's a use case for this? Is the idea that you can use it for explicitly sequencing build order of dependencies, without making them actually dependent on each other? Why would you want that?
A use case I've wanted this for for a long time is build phases: `make install` should build things if necessary, but it'd also be great to have it build everything before it installs anything.
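Sketched with `.WAIT` (4.4+; the phase targets here are hypothetical):

```make
.PHONY: install build-all install-all

# Everything to the left of .WAIT finishes before anything to the
# right of it starts, even under -j, but without creating a real
# dependency edge: install-all isn't considered out of date merely
# because build-all's outputs changed.
install: build-all .WAIT install-all
```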
I tried reading that part of the changelog a bunch of times, but my brain is having a difficult time parsing it... Could someone please give an example of the problem with the grouped targets?
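Edit: after staring at it some more, here's my best reading, as a sketch (grouped targets use `&:`, which needs GNU Make 4.3+; the yacc rule is just illustrative):

```make
# One recipe produces both outputs, so they're declared as a group.
parser.c parser.h &: parser.y
	yacc -d parser.y
	mv y.tab.c parser.c
	mv y.tab.h parser.h
```

Suppose `parser.c` exists and is newer than `parser.y`, but `parser.h` has been deleted. Before 4.4, `make parser.c` considered only `parser.c`, saw it was up to date, and did nothing, even though its group-mate was missing. With 4.4, asking for any member of the group runs the recipe if any member is out of date, and the whole group then counts as updated.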
> What's a use case for this? Is the idea that you can use it for explicitly sequencing build order of dependencies, without making them actually dependent on each other? Why would you want that?