
God I hate make. I hate make so very, very much. Compiling code isn't that hard. I swear to the programming lords on high it isn't.

Here's a wonderfully useful open source project - Google gperftools [1]. It includes TCMalloc amongst other things. The Makefile.in is 6,390 lines long. Configure is 20,767 lines long. Libtool is 10,247 lines long. That's fucking insane.

Compiling open source is such a pain in the ass, particularly if you're trying to use it in a cross-platform Windows + OS X + Linux environment. One of my favorite things about iOS middleware is that one of their #1 goals is to make it as easy as possible to integrate into your project. Usually as simple as add a lib, include a header, and call one single line of code to initialize their system (in the case of crash report generation and gathering).

I work on a professional project. We have our own content and code build pipeline. It supports many platforms. I don't want anyone's god damn make bullshit. I want to drop some source code and library dependencies into my source tree that already exists and click my build button that already exists.

</rant>

[1] https://code.google.com/p/gperftools/




What you are describing is autoconf, not make. Make by itself is actually a very handy tool for performing tasks that have a dependency graph.

Autoconf... Well, I can't disagree. It's a hack built on top of a hack and should probably be rethought. Once autoconf is done generating Makefiles, make itself is generally trouble-free.
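Make's core idea really is that small: a target rebuilds only when it's older than its prerequisites. A minimal sketch (file names invented):

```make
# Rebuild prog only when an object is newer; rebuild an object only
# when its .c or a listed header changed.
prog: main.o util.o
	$(CC) -o prog main.o util.o

main.o: main.c util.h
util.o: util.c util.h
```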

http://freecode.com/articles/stop-the-autoconf-insanity-why-...


Everyone says that, until they're put in charge of babysitting an old HPUX or AIX box and it's time to install something. Then no one complains about Autoconf again (though they can't bring themselves to praise it, either).

Automake, on the other hand...


I'm okay with autoconf. Automake however is an abomination.


mk-configure is a "lightweight" autotools replacement

<http://sourceforge.net/projects/mk-configure/>


That's a GNU autotools setup, not a pure Makefile setup. A simple Makefile setup is much, much easier. A current C project I'm working on has around 150 lines' worth of Makefile stuff, and that covers compiling source, running flex over lexer files, an option for creating a .tar.gz of the current source, and an option to run tests on the compiled object files. It's also dead simple to maintain (the only time it requires a direct update is when adding test cases, and that's because you have to specify which source .c files to test).

Now to be fair, I am sacrificing some options here. The biggest is that autotools runs tests on the installation environment and tests the available functions and standards compliance, which in theory allows compiling the source on any system that has autotools, which is why it's so huge. You can't do that with standard Makefiles. I just stick close to the standard and avoid any non-standard extensions that I don't need.
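A plain-Makefile setup along those lines might look roughly like this (names, flags, and the test script are invented for illustration, not the commenter's actual file):

```make
CC     = cc
CFLAGS = -std=c99 -Wall
SRCS   = main.c table.c
OBJS   = $(SRCS:.c=.o) lexer.o

myprog: $(OBJS)
	$(CC) $(CFLAGS) -o myprog $(OBJS)

# flex generates the lexer's C source
lexer.c: lexer.l
	flex -o lexer.c lexer.l

# tarball of the current source
dist:
	tar czf myprog.tar.gz *.c *.l Makefile

# tests run against compiled objects; the list is maintained by hand
check: $(OBJS)
	./run-tests.sh $(OBJS)
```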


Autotools is crap. Configure on its own isn't godawful -- it scans for a LOT of Unix variants, and does sniff for features reasonably well. On my more cynical days, I'd say it does a good job of making things portable from unix to unix. Everything else about autotools is unmitigated crap, though.

I've written my own generic make library. The library itself is 95 lines of script, and handles all the dependency sniffing, library and binary building, and so on that I need in a reusable way.[1]

The makefiles themselves just list the sources and object files. They're typically only a few lines, like so[2]:

    BIN = mybinary
    OBJ = foo.o bar.o baz.o
    include ../mk/c.mk
That's it.

[1] http://git.eigenstate.org/ori/mc.git/tree/mk/c.mk

[2] http://git.eigenstate.org/ori/mc.git/tree/6/Makefile
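For the curious, a shared c.mk in that spirit might contain something like the following. This is a guess at the general pattern, not the contents of the linked file:

```make
# c.mk -- shared rules; each Makefile sets BIN and OBJ, then includes this.
CFLAGS += -Wall -g

$(BIN): $(OBJ)
	$(CC) $(LDFLAGS) -o $(BIN) $(OBJ)

# Dependency sniffing: have the compiler emit one .d file per object...
%.d: %.c
	$(CC) -MM $< > $@

# ...and fold those back in so header changes trigger rebuilds.
-include $(OBJ:.o=.d)

clean:
	rm -f $(BIN) $(OBJ) $(OBJ:.o=.d)
```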


And yet most contemporary hipster build systems are all just shiny make reinventions.


Actually, most contemporary hipster build systems are bad reimplementations of make. Yes, make is a PITA, but every other build system is worse.


I have yet to see a 'hipster build system' that mixes shell and make language or uses punctuation for variables.

Are all things made since 1977 hipster?


Yes.

Get off my lawn, hippie.


Most new things are inspired by something else to the extent that they can be viewed as reinventions. So that's a moot point, and doesn't contribute at all.


My preferred hipster build system (CMake) actually leverages make on UNIX platforms and nmake on Windows :). It just replaces most of the autoconf mess.
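That split is the whole trick: one CMakeLists.txt, from which CMake generates native build files per platform. A toy example (project name made up):

```cmake
cmake_minimum_required(VERSION 2.8)
project(myapp C)
add_executable(myapp main.c util.c)

# cmake -G "Unix Makefiles" .   -> Makefile driven by make on UNIX
# cmake -G "NMake Makefiles" .  -> Makefile driven by nmake on Windows
```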


> It just replaces most of the autoconf mess.

With a non-standard mess of its own ...

CMake is pretty bad at doing things (standard paths, install targets, etc.) that the GNU folks solved a long time ago. Yes, the Autotools are a royal PITA, but at least a pain that one knows how to deal with.


With a non-standard mess of its own ...

Perhaps, but not any that I have had problems with.

E.g. an application that we distribute uses Qt, Boost, Berkeley DB XML, libxml2, libxslt, etc. Producing signed application bundles for OS X, MSI installers for Windows, and packages for Ubuntu has been nearly painless. And that's with clang on OS X, Visual C++ on Windows, and gcc on Linux. If it's easy to produce binaries on the three most popular platforms, with three different compilers, I don't see the problem.

We have tried autotools before. But it's a pain on Windows with Visual Studio. Let alone that I can quickly generate a Visual Studio project to do some debugging.


Sure, if you do the packaging for a restricted set of environments yourself, CMake certainly works fine. I do the same for a lot of projects and know what CMake is capable of.

But when it comes to the differences between all those Linux distributions, the respective packager will be very glad to see that he can customize install prefixes (no, CMAKE_INSTALL_PREFIX is not enough) and use standard make targets.

Your list of dependencies shows libraries that are well covered by the stock CMake modules but try getting a build variable that is not LIBS or CFLAGS from a library that can only be queried with pkg-config. Impossible.


But when it comes to the differences between all those Linux distributions,

That's a fair point. However, most often, I am more interested in accommodating the 99.8% of the population that uses Windows, OS X, or one of the major Linux distributions, than the tiny group that runs Sabayon and is able to get things compiled themselves if necessary.


CMake certainly has its own set of peculiarities, but sometimes it can work to let you build stuff on Windows with VC++ or plain MinGW without having to use MSYS/Cygwin.

Creating portable software and then distributing it with a POSIX-only build system seems wasteful.


Reinventions that are cross-platform (this actually matters).


What platforms doesn't make support? AFAIK GNU make supports pretty much everything and there are versions of make for Solaris/BSDs.


Make exists everywhere, but you have to explicitly write separate rules for every system.
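In practice that often means GNU-make-only conditionals keyed on uname, one branch per system (flag values here are illustrative):

```make
# Select per-OS flags at parse time; each new platform gets another branch.
UNAME := $(shell uname -s)
ifeq ($(UNAME),Darwin)
    SO_FLAGS = -dynamiclib
else
    SO_FLAGS = -shared -fPIC
endif
```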


Auto* is still the most cross-platform build system I've ever used. It let me compile a project from 2003 on Windows Vista; how many other systems can you say that for?


Good enough for me.


Count me in as a Makefile hater. I'm even using it to manage a portable /home directory and it's just making me hate it even more. Why did all the alternatives have to fail or be worse than Make?


If you're managing dotfiles with Make (which I attempted once...) then may I direct you to GNU Stow instead? It's much easier to manage a bunch of files centrally and just symlink to them all:

http://brandon.invergo.net/news/2012-05-26-using-gnu-stow-to...


The problem I seem to be having with various dotfiles tools is they seem to be doing too much, where a script with a bunch of calls to ln -s would do.
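For concreteness, a toy version of that "bunch of calls to ln -s" approach, written as a loop; the directory layout and file name are illustrative:

```shell
#!/bin/sh
# Real files live in ~/dotfiles; each gets a dot-prefixed symlink in $HOME.
DOTDIR=$HOME/dotfiles
mkdir -p "$DOTDIR"
touch "$DOTDIR/vimrc"                       # stand-in dotfile for the demo
for f in "$DOTDIR"/*; do
    ln -sfn "$f" "$HOME/.$(basename "$f")"  # -f replaces stale links on re-run
done
```

Re-running it is harmless, which is most of what the heavier dotfile tools buy you.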


May I introduce you to dircombine? Take multiple directories full of dotfiles and symlink them all into another directory.

http://git.kitenet.net/?p=joey/home.git;a=blob_plain;f=bin/d...


I've come to think that build systems are a very personal utility. Everyone has their favourite. Mine's fabricate.py, for example. Over time I've built up a library of script snippets and shortcuts and so on which I'm familiar with, comfortable with, and exactly fulfil all my use cases. It's all very clever. :)

Yet when I download some random project's source code, I groan at any sophistry in the build process at all. I'm not interested in your build system - I'm interested in the application itself. Maybe I want to try and fix a bug, or have a half-baked idea for a new feature. I don't need dependency checking, incremental rebuilding, parallel building, and all that stuff you get from a fully operational build system at this point. I only need to build the project - once - as I decide whether to stick around. Sure, if I start working on it for serious, rebuilding over and over - then I'll bother to learn the native build system, and read any complicated scripts. Build systems are an optimization for active developers. They're a utility that is supposed to save time.

Of course, you're never going to get everyone in the world to agree on the same build system. We all have different desires and needs for what machines it should run on, how automated, how much general system administration it should wrap up, how abstractly the build should be described, etc. It's a bit like one's dot files or choice of text editor - my ideal build is tailored just for me but I wouldn't expect it to satisfy anyone else.

So now I wish that everyone who distributes software as source code would do this: include a shell script that builds the project. Just the list of commands, in the order that they are executed, that carries out a full build on the author's system. That's what it comes down to, in the end, isn't it? Your fancy build system should be able to log this out automatically. (Of course then you still include all the fancy build stuff as well, for those interested.)

Of course it's extremely unlikely that your shell script will work on my system without modification. There are probably machine-specific pathnames in there, for a start. We might not even use the same shell! It's basically pseudocode. But compare a straight list of imperative shell commands that doesn't work against a program of some sort with its own idiosyncratic syntax and logic, a hundred-page manual, and the requirement for me to install something - which also doesn't "just work". As long as you know how to call your compilers and linkers and so on - which you should - the former is going to be easier to tweak into submission, to get that first successful build. After all, if I need much more than that I'll probably just recreate the build in my favourite system anyway.


Thank you for making me feel a little bit vindicated. This is precisely how I build my current side project, I essentially mask all of the build/runtime options/tweaking behind a shell script and call all of it through that. ('./run remake', for example, or './run with valgrind' if you're in a debugging mood)

For me, the makefile itself wasn't the problem; I've been rather aggressive about keeping it as pretty much just a dependency enumeration with flag lists, and (arrogantly) it is rather clean. But the runtime flags and things I need to wrap around the executable pushed me to the script.

(I honestly worried that this was sloppy, since all it does is exactly what it looks like: mask really ugly complexity behind a shiny frontend, which always makes me wonder whether that complexity wasn't undue. But it does give the advantage your last paragraph mentions, a more modular pseudocode of the various components of building/running.)
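A dispatcher in that style can be tiny. This hypothetical './run' uses echo as a dry run standing in for the real build/launch commands; 'mybinary' is a placeholder:

```shell
#!/bin/sh
# './run remake', './run with valgrind', './run' -- one entry point,
# ugly flags hidden inside.
run() {
    case "$*" in
        remake)          echo "make -B" ;;                     # full rebuild
        "with valgrind") echo "make && valgrind ./mybinary" ;; # debugging mood
        *)               echo "make && ./mybinary" ;;          # default run
    esac
}
run remake
```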



