

Imprisoned by the Haskell Toolchain - andrewdotnich
http://jackkelly.name/blog/archives/2013/01/06/imprisoned_by_the_haskell_toolchain/index.html

======
klodolph
Given how much I've had to fight with Automake and Libtool to get really
simple things done, they're hardly the things you want to hold up as shining
beacons.

1. Do you want to compile certain source files with different build flags?
Hah, no! Automake only supports it through obscene hacks, and those break if
you're using Libtool.

2. Do you want to make a plugin? The best way is to ditch Libtool completely,
make an _executable_ target in Automake, and add the linker flags yourself. It
makes you feel like you're banging two rocks together, and it's not portable,
but at least it gets the job done.

3. Do you have any linker flags? Hah, no! Libtool will mess with them and
they won't work.

The other bit is that Libtool is basically all magic, and Automake is
basically all macros. The amount of magic that Libtool does to _fool_ you into
thinking you're not writing dynamically linked code is enough to make you puke
and makes a mess when it breaks, and Automake's macro system is terrible.

Just imagine, if you will, that you want to compile one file in your library
with the flag -msse3, to produce a dynamic library that has to run on systems
both with and without SSE3. At runtime you can use cpuid to decide whether or
not to call the functions in that file.

All of my searching has led me to the conclusion that this is impossible if
you want to use Automake and Libtool, and easy if you ditch both of them.

~~~
anonymous
Basically, these days I recommend cmake as the sane build system that runs
everywhere. I don't know about PCs running MacOS, but it works just fine under
Linux or Windows and you can do whatever magic you want; it makes simple
things easy and complex ones possible.

~~~
klodolph
I wouldn't recommend CMake without qualifications.

It is vastly inferior from the perspective of the end user. Any autotools
package can be handled with "./configure && make && sudo make install", and
you can change it up by "./configure --without-gtk --enable-network
--prefix=/opt/crazy CFLAGS='-O0'" and then make a package with "make install
DESTDIR=/tmp/package". Try doing that in CMake and you'll see what I mean.
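For comparison, the rough CMake equivalent of that autotools session looks
something like the following (the WITH_GTK and ENABLE_NETWORK toggles are
hypothetical project-defined options; the CMAKE_* cache variables are CMake's
own, and DESTDIR works with its Makefile generator):

    
      $ mkdir build && cd build
      $ cmake -DWITH_GTK=OFF -DENABLE_NETWORK=ON \
              -DCMAKE_INSTALL_PREFIX=/opt/crazy -DCMAKE_C_FLAGS='-O0' ..
      $ make
      $ make install DESTDIR=/tmp/package
    

Note that every switch is a cache variable you have to look up per project,
where "./configure --help" gives you a uniform listing.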

The syntax is also really terrible, if/else in particular. The documentation
is not much better: it's available online only as one big block of text, and
they want to sell you a book.

Finally, CMake doesn't solve the one problem I was complaining the most about,
which was the ability to set per-file compilation flags. This is not an
optional feature in my eyes -- it's necessary for avoiding cross-contamination
of -D_GNU_SOURCE turds picked up by pkg-config, and it's necessary for writing
libraries that use processor features that might not be available at runtime,
like SSE3.

I think CMake, like Autotools, has a niche in which it does best. "Simple
things easy and complex things possible" is a nice motto, but the truth is
really boring -- CMake and Autotools make _different_ things easy and
_different_ things possible. CMake's niche is medium-large projects with
medium amounts of complexity. Projects that are complex, like Firefox or
FFmpeg, tend to use Autoconf with custom build systems.

------
fpgeek
The Haskell toolchain has its share of weaknesses (e.g. the impurity of
package builds has been painful recently), but I also feel the need to point
out that GHC learned lesson 1 a long time ago:

[http://www.haskell.org/ghc/docs/latest/html/users_guide/sepa...](http://www.haskell.org/ghc/docs/latest/html/users_guide/separate-compilation.html#makefile-dependencies)

Edit: I should also add that, as an expected corollary, if you use ghc -M to
generate your Makefile dependencies, parallel make works fine. I used ghc that
way for years. I think there was a corner case if you interrupted a parallel
build, but it was easy to deal with.
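A minimal sketch of that setup, following the pattern in the linked
users-guide section (module names and flags here are made up):

    
      HC      = ghc
      HC_OPTS = -O
      OBJS    = Main.o Utils.o
      
      program : $(OBJS)
      	$(HC) -o $@ $(HC_OPTS) $(OBJS)
      
      %.o : %.hs
      	$(HC) -c $< $(HC_OPTS)
      
      # "make depend" rewrites the dependency section of this Makefile
      # so modules rebuild in the right order, including under "make -j".
      depend :
      	$(HC) -M $(HC_OPTS) $(OBJS:.o=.hs)
    

After running "make depend" once, ghc -M keeps the inter-module dependency
edges in the Makefile itself, which is what makes parallel make safe.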

------
ezyang
Yeah, this happened to me when I was integrating Ur/Web and Haskell; Ur/Web
generated C and wanted to handle linking, but to link against Haskell
libraries I needed GHC to do everything. There's a happy ending to the story,
though: we patched Ur/Web to use ghc instead of gcc to do compilation, and
everything worked out great. :-)

------
lmm
The reason people do this is that the C toolchain is terrible, especially on
Linux.
Want to depend on a specific version of a library? Whoops, no, you can't do
that. Want to extend make? Well, you either do it in make or in shell, neither
of which are nice languages. Want to compile and link your code once? Good
luck fighting libtool. Want a package with dependencies? You'll have to choose
whether to work on debian or on not-debian. And actually the debian
maintainers are going to move all your files around anyway, because the FHS
committee certainly knew better than you where your program's files should go.

It's bad enough that I start to wish everyone would just use the JVM, where you
have maven. Truly reproducible builds, depend on whatever range of library
versions you work with (and it's no problem if two programs want to use
different versions), a structured and testable plugin system for extending the
build system, and you can use it to build any language (there are some
benighted fools who write their own tools like SBT, but they will at least
stay compatible, so they can usually be safely ignored).

------
chimeracoder
> The ultimate problem is that people insist on rolling their own sucky
> versions of build systems and package managers. (Though cabal and ghc --make
> suck less than most, I'll admit).

Cabal is probably the #1 thing keeping me from writing more Haskell code. I've
had so many issues with conflicting versions of various libraries being
required, and incredibly cryptic error messages upon failure.

Which is sad, because Haskell otherwise seems rather attractive.

It's also ironic, because I would have imagined that a package manager for a
purely functional programming language would be a bit more robust[1].

[1] <http://nixos.org/>

~~~
carterschonwald
Please give concrete examples of the problems you've had. Likewise, when you
have concrete problems, have you asked for help on #haskell irc or the
Haskell-Cafe mailing list? The community members on both those channels are
super helpful and can probably help you quickly resolve / debug your problems.

People: hand-wavy complaints that do not give concrete examples are hard to
fix. Please communicate concrete instances of your problems so that either a)
someone can help you solve them now or b) there is a concrete example of a
problem that folks can fix.

That said, any serious dev work really should use the cabal-dev tool; it
avoids the build problems that new GHC users tend to hit.

~~~
stevekemp
I'm not a haskell user, but I did recently try to install a haskell project on
my Debian Squeeze host:

    
    
      $ sudo apt-get install ghc6 cabal-install
      $ sudo cabal update
      $ sudo cabal install hakyll
    

That failed with an error of the form "installing XX version YY requires FOO
version BAR". That gave me zero help in completing the installation, and so I
gave up.

FWIW: <https://github.com/skx/static-site-generators>

~~~
mercurial
Don't sudo cabal install whatever. Actually, avoid cabal install without sudo
too, unless you're working with software that uses dynamic recompilation, like
xmonad. Use cabal-dev instead; it will save you a world of pain.

~~~
stevekemp
I'm confused by your comments relating to sudo ..?

That said, it seems that cabal-dev is not packaged for Debian. Come the wheezy
release of Debian, the hakyll project I wished to try out will be available as
a .deb, so I'll try it again then.

~~~
mercurial
What I mean is, "cabal install" is usually a bad idea, with or without sudo.

However, you _can_ do "cabal install cabal-dev" and rely on cabal-dev for the
rest. It means you don't have a global package cache, but in this case it's
really for the best. Don't forget to add .cabal/bin to your $PATH if you want
access to cabal-dev.
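Put together, the workflow described above looks roughly like this (the
project directory is hypothetical; the .cabal/bin path is the default cabal
install location mentioned earlier):

    
      $ cabal install cabal-dev
      $ export PATH=$HOME/.cabal/bin:$PATH
      $ cd my-project/        # any directory with a .cabal file
      $ cabal-dev install     # builds into a local cabal-dev/ sandbox
    

Each project then resolves its dependencies in its own sandbox instead of
fighting over one global package database.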

~~~
stevekemp
Thanks; I'll give that a try shortly.

