
Is Haskell the Cure? - bobfunk
http://mathias-biilmann.net/posts/2011/10/is-haskell-the-cure
======
substack
I gave haskell a shot as some of my earlier github repos indicate:
<https://github.com/substack>. I even wrote my blog in haskell with happstack,
since snap hadn't gotten popular yet.

Haskell is very hard, but even after 3 years of pretty intensive use, I never
really felt productive with haskell in the same way that I've felt in other
languages. Anything I did in haskell required a lot of thinking up-front and
tinkering in the REPL to make sure the types all agreed and it was rather
time-consuming to assemble a selection of data types and functions that mapped
adequately onto the problem I was trying to solve. More often than not, the
result was something of an unsightly jumble that would take a long time to
convert into nicer-looking code or to make an API less terrible to use.

I built an underwater ROV control system in haskell in 2010 which went well
enough, but I had to tinker with the RTS scheduling constantly to keep the
more processor-hungry threads from starving out the other ones for CPU. The
system worked, but I had no idea what horrors GHC was performing on my behalf.

Later I built the first prototype of my startup with haskell, but the sheer
volume of things that I didn't know kept getting in the way of getting stuff
done in a reasonable time frame. Then we started to incrementally phase out
haskell in favor of node.

I write a lot of node.js now and it's really nice. The whole runtime system
easily fits into my head all at once and the execution model is simple and
predictable. I can also spend more time writing and testing software and less
time learning obscure theories so that the libraries and abstractions I use
make sense.

The point in the article about haskell being "too clever for this benchmark"
sums up haskell generally in my experience.

~~~
bobfunk
That's pretty much what I mean when I call Haskell hard to learn. For me it
has also been a steep learning curve, and my experience hasn't been
altogether different from yours.

I started out maybe 5 years ago following tutorials, reading up on all the
metaphors about Monads and doing project Euler problems.

After a while I started to tackle some small web related things with Haskell
and had exactly your experience of running into a lack of understanding of how
the system works and wrapping my head around functional datatypes.

I pretty much gave up on Haskell as a practical language at that point, but
something kept me coming back once in a while.

Then at a point I had a use for making a small web service fast and the Node
prototype I made performed badly and crashed in spectacular ways under high
loads. I found Snap and made a quick prototype in Haskell. At that point the
experience of years of small experiments must finally have made something
click. In a very short time I had a very fast service using almost no memory.
It's deployed in production (as a part of <http://www.webpop.com>) and has
been extremely stable.

By now I think I've crossed some kind of barrier, and feel like I'm both being
productive and having fun when writing Haskell, but it really didn't come easy
to me and all else being equal my experience tells me that a good deal of my
colleagues would have an even harder time.

~~~
Homunculiheaded
I think part of the issue with learning Haskell is that it inverts the
typical learning strategy for programming languages. Usually the best advice
is to read a little, then write a lot: you can just look at some published
code and go "ah yes, that's how you do it". But I find, for better or worse,
that Haskell really requires you to understand before you code, which in the
end means your study-to-code ratio is very different from almost any other
language's.

Most languages, even Lisps, are somewhat tolerant of 'programming by
guessing' for beginners: you write terrible code that works, learn more, and
see what you did wrong. Haskell is very unforgiving of this. If you don't
understand why it works, it probably won't.

~~~
thesz
I think you're wrong at "if you don't understand why it works it probably
won't".

While I long ago gave up PUI (Programming Under Influence), I still
occasionally do some in Haskell. After a litre of beer I am pretty dumb, but
I can follow the compiler's clues to get something working.

Most of the time it still works the next day, when I'm sober. That's in
contrast with C/C++. Scripting languages give you some power like that, but I
can screw myself with them much more violently.

In my humble opinion, Haskell is the language of choice for drunken
programmers.

------
jrockway
Let's have a talk about the $ operator. When you use it more than once per
line, you're writing code that looks weird and is hard to read. Switch to the
similar function-composition operator, and everything looks more idiomatic.

Instead of:

    fibServer x = quickHttpServe $ writeBS $ B.pack $ show (fibonacci x)

Just write:

    fibServer = quickHttpServe . writeBS . B.pack . show . fibonacci

The case for $ is where you want application instead of composition:

    fibOf42Server = quickHttpServe . writeBS . B.pack . show . fibonacci $ 42

I even write things like:

    main = print =<< foo

instead of

    main = foo >>= print

for consistency.

Anyway, it's a little style thing, but it's nice to use the composition
operator (.) when you want composition and the application operator ($) when
you want application. It makes the code look nicer and shows its intent more
clearly. And really, they are different concepts, even if they both
type-check the same.

And finally, remember that function application, by default, is the
highest-precedence operator in Haskell. When you write:

    foo . (bar 42) . baz

It's the same as:

    foo . bar 42 . baz

Because of operator precedence. $ only exists to change the order of
operations for a particular expression.

~~~
Peaker
I'll add that a nice little benefit of using:

    a . b . c . d $ e

over:

    a $ b $ c $ d $ e

is that any sub-expression taken from the first expression is valid and can be
refactored out into its own name. (.) is associative and ($) is not.
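A tiny sketch of that point (the function names here are invented for illustration):

```haskell
-- Composed with (.), any adjacent run of stages is itself a function
-- that can be pulled out and named, because composition is associative.
render :: Int -> String
render = reverse . show . (* 2)

-- Extracting the middle stages is a purely local edit:
doubleShown :: Int -> String
doubleShown = show . (* 2)

render' :: Int -> String
render' = reverse . doubleShown
-- In the ($)-chained version, "show $ (* 2)" is not even a valid
-- sub-expression you could lift out without rewriting it first.
```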

------
the_mat
Please stop with the toy benchmarks and pretty one-liners that show how
awesome Haskell is.

There is a growing list of smart programmers who get all enchanted with
Haskell, jump into it wholeheartedly, and end up frustrated (see the link at
the bottom of this message). GHC makes the typical C++ compiler seem fast.
Once code grows past homework-problem size, all hope of understanding memory
usage is lost. I don't think people really get how bad that is. The whole
culture of Haskell is based around static checking, yet you have to run a
program in order to find out if it blows your memory limit several times over.

Haskell is still a neat language, but we need less advocacy based on toy
programs and more honest realism.

(Here's a typical, non-superficial example:
<http://wagerlabs.com/haskell-vs-erlang-reloaded-0>)

~~~
gregwebs
That post is from 2005. The situation is entirely different today with
respect to speed (both the compiler and the addition of the ByteString and
Text libraries) and productive libraries, particularly for web development.
Likewise, it is a rare case that you would run into memory consumption
issues.

I do agree that there is entirely too much enthusiastic toying around in
Haskell and not enough real world users and honesty about limitations.

~~~
the_mat
To me, there's a dichotomy between the "if it compiles, it usually works"
aspect of Haskell (often touted as superior to the dynamic-typing,
test-driven approach) and the fact that you can't get a picture of memory
usage until you run and profile the code. In my experience,
hard-to-understand memory consumption issues are common and take effort to
solve.

Reference:
<http://blog.ezyang.com/2011/06/pinpointing-space-leaks-in-big-programs/>

~~~
gregwebs
That blog post says nothing about the frequency of memory leaks; it shows
that there are good tools to help you in your effort to solve them. One thing
to keep in mind is that there are memory issues in every language. I just
debugged one in Ruby yesterday, and there wasn't a good tool readily available
for that effort. Do you know what the memory consumption of your programs in
other languages is before running them?

As a contradictory anecdote, I have never once had a memory consumption issue
with Haskell code. Haskell is actually in a nice position w/respect to memory
now that enumerators (which always use constant memory) are taking hold. I
have no doubt you encountered many memory leaks, but I don't think your
experience completely generalises to modern Haskell.
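For what it's worth, the classic Haskell space leak that those profiling posts chase down looks something like this (a sketch, not code from either article):

```haskell
import Data.List (foldl')

-- The lazy foldl defers every (+), building a thunk chain
-- (((0 + 1) + 2) + ...) that is only collapsed at the very end;
-- on a large enough list this can exhaust memory.
leakySum :: [Int] -> Int
leakySum = foldl (+) 0

-- foldl' forces the accumulator at each step and runs in
-- constant space.
strictSum :: [Int] -> Int
strictSum = foldl' (+) 0

main :: IO ()
main = print (strictSum [1 .. 1000000])
```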

------
Peaker
It is somewhat of a shame that the learning curve plays such a significant
role for career programmers.

You would expect that people who spend years and years working with their
tools would be willing to put a few weeks or months into learning their most
important tool: the programming language. It seems most programmers get
frustrated and abandon learning different programming paradigms very quickly.

~~~
stonemetal
Not to be too contrarian, but until I see proof to the contrary I think
Norvig said it best:

 _In terms of programming-in-the-large, at Google and elsewhere, I think that
language choice is not as important as all the other choices: if you have the
right overall architecture, the right team of programmers, the right
development process that allows for rapid development with continuous
improvement, then many languages will work for you;_

~~~
thesz
Architecture heavily depends on the language. You have to make different
choices for C++ than for Java, not to mention Haskell.

Also, I think the comma before each "the right" in Norvig's statement means
logical AND. If we rewrite the statement, it looks like this: _if you have
the right overall architecture AND the right team of programmers AND the
right development process that allows for rapid development with continuous
improvement, then many languages will work for you._

There are too many ANDs here. In most realistic situations you cannot afford
such luxury.

Also, the choice of Haskell (or a similar language) lets you address at
least two points from Norvig's statement: the right team and the right
development process.

Those who have learned and applied Haskell almost cannot form the wrong
team. Almost: we cannot rule out failure completely.

The right development process is almost ensured by a strong type system.
Type systems like Haskell's can be viewed as a tool for propagating
requirement changes through the complete program.

(That's why it seems hard to introduce or change a constructor in a data
type.)
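A toy illustration of that parenthetical (the Payment type is invented for the example): add a constructor, and GHC's exhaustiveness checker points at every function that must change.

```haskell
{-# OPTIONS_GHC -Wall #-}

-- Suppose a requirement change adds the 'Wire' constructor. With
-- -Wall, every pattern match still covering only the old
-- constructors becomes an incomplete-pattern warning.
data Payment = Cash | Card String | Wire String

describe :: Payment -> String
describe Cash       = "cash"
describe (Card n)   = "card " ++ n
describe (Wire ref) = "wire " ++ ref
-- Deleting the last equation makes GHC flag 'describe' as
-- non-exhaustive: the type system spreads the change for us.

main :: IO ()
main = putStrLn (describe (Wire "REF-1"))
```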

So, all in all, I think that languages make a difference here. For many
languages you have to fulfill those three points yourself; for some
languages, those points fulfill themselves.

~~~
eru
As much as I loathe Java and co., I have to grant that their mature
development tools make up a bit for the languages' weaknesses.

That is: in Haskell you make a change to a type and propagate it until the
compiler stops complaining. In Java you click some `refactor' button in your
IDE, and your changes propagate through the code base automatically.

That's less a comment on the languages, since Haskell will probably grow
better tools some day, and more a comment on their relative stages of
maturity.

------
T_S_
The ecosystem for Haskell is improving rapidly. My startup built a computer
vision application on top of easyVision with the intention of rewriting it
in ObjC. Instead, we are working with the Haskell community to target the
mobile platform. A year ago that would have been a dicey bet.

About our Haskell experience: Yes, the learning curve seems steep, but mainly
because of the things you have to _unlearn_ (OMG no for-loops!). However,
functions are the most modular things ever invented. That translates into an
uncanny ability to add features quickly. A sophisticated type system catches
many errors at compile time.

------
ghc
I love Haskell; I do a lot of work with it. That said, I use Python for the
web. As nice as Snap is, Haskell just doesn't have the vast array of quality
libraries for web development that Python does. Lately, this means that I do
web development in Flask, and heavy lifting in Haskell.

~~~
vegai
Check out <http://www.yesodweb.com/>

------
exogen
I'll say this again: Ted shouldn't have wasted everyone's time highlighting
the response time of the request. He effectively benchmarked V8 right on his
blog and then called it slow. Now everyone's busy demonstrating that part to
be untrue.

Ironically, every one of his clients in the ab concurrency test will receive
its response _before_ the users of the hypothetically parallel Python and
Ruby services, because Node responds an order of magnitude faster. So he
didn't actually demonstrate a problem.

------
gvb
Why is a Fibonacci sequence used as a benchmark for an argument for
_concurrent_ programming? The Fibonacci sequence is a _recursive_ algorithm
that inherently has dependencies on previous calculations that prevent
effective concurrent execution. The Fibonacci algorithm executed concurrently
is going to spend an inordinate amount of time creating tasks that do a
trivial calculation (add two numbers together).

If you want to benchmark _concurrency_, at least pick a benchmark algorithm
that _exercises concurrency_. The FFT comes to mind, but there are probably
lots of better examples (that is a challenge to HNers ;-).

~~~
lobster_johnson
He was not benchmarking concurrency; he was pointing out that Node is a
single-threaded system that essentially implements old-style cooperative
multitasking, where a single task will block everything else. He could have
used sleep() and it would have illustrated the same point (more elegantly,
even, since half of the responses miss the point entirely and focus on the
Fibonacci part).

Node developers probably don't do a lot of computationally complex stuff, but
when they do, they have to think about the concurrency problem. Even something
as trivial as sorting a large list or parsing a huge chunk of JSON is going to
stop all other requests from executing.
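By way of contrast, GHC's runtime preemptively schedules its lightweight threads, so CPU-bound work does not have to stall everything else. A minimal sketch (not a web server, and the fib here is deliberately naive):

```haskell
import Control.Concurrent

-- A deliberately slow, CPU-bound computation.
fib :: Integer -> Integer
fib n | n < 2     = n
      | otherwise = fib (n - 1) + fib (n - 2)

main :: IO ()
main = do
  done <- newEmptyMVar
  -- The heavy work runs on a green thread; GHC preempts it at
  -- allocation points, unlike Node's event loop, which runs a
  -- synchronous computation to completion before anything else.
  _ <- forkIO (putMVar done (fib 25))
  putStrLn "main thread still responsive"
  result <- takeMVar done
  print result
```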

~~~
gvb
But he isn't benchmarking _web server concurrency_, because he is doing a
single curl:

    $ time curl http://localhost:8000
    165580141
    real 0m0.016s
    user 0m0.007s
    sys 0m0.005s

So he is running the Fibonacci (40) _once_ with the web server. The only
concurrency / parallelism that is happening is in the _recursive_ Fibonacci
algorithm. I stand by my contention that the Fibonacci algorithm is a very
poor test of concurrency / parallelism.

I stand by my contention that he should have implemented an algorithm that
could be solved in _concurrent_ pieces and then benchmarked node.js against
his favorite language. If the algorithm cannot be parallelized effectively, it
doesn't matter how many tasks you spawn to solve it (cooperative or
otherwise), the _algorithm's dependencies_ will cause all the tasks to block
and effectively serialize their execution.

~~~
lobster_johnson
I'm referring to the original post
(<http://teddziuba.com/2011/10/node-js-is-cancer.html>), which did test
concurrency. The Haskell guy missed the point entirely and seems to have
given up after Haskell was shown to memoize his function.

------
heisenmink
Wasn't the point of the original post that node.js blocks the event loop
while it executes functions, and thus effectively kills concurrency? Not how
fast it calculates Fibonacci numbers and sends them over HTTP...

~~~
Niten
The point was that it kills parallelism: Node is just a single-threaded
event loop, running on a single core. And since computing Fibonacci numbers
is a CPU-bound activity, that type of benchmark would be relevant but for
the memoization bit.

EDIT: Well also, the author would have to actually benchmark this vs. Node
with many concurrent clients in order for it to be relevant; here he's just
timing a single request from start to finish, which obviously doesn't say
anything about how this scales.

------
dorian-graph
Hasn't the author already said that measuring Fibonacci was not the point of
his tirade? Which makes the line in this post, "I think a lot of people
missed the main point of Dziuba's troll", slightly amusing. Is there now
going to be someone running this 'benchmark' in whatever language they can?
One of the blogs already posted says its author is going to find time to run
it in C.

I'll do my part. Delphi, here I come. ;)

~~~
bobfunk
My benchmark was mostly a parody, since Haskell just memoized the call and
never really did the work.

The point of the article was more the difference between the languages that
really tackle concurrency (Haskell, Clojure, Go, Erlang) and Node's way of
simply offering one solution that works for a lot of problems where the
common scripting languages (especially PHP) don't work that well.
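For the curious, the kind of sharing that produces that memoization can be written deliberately; this is a sketch, not the article's actual code:

```haskell
-- A top-level lazy list: each element is computed once and then
-- shared, so every request for fib n after the first is just a
-- list lookup rather than a recomputation.
fibs :: [Integer]
fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

fib :: Int -> Integer
fib n = fibs !! n

main :: IO ()
main = print (fib 40)
```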

~~~
alexatkeplar
Don't forget Akka (esp Scala-Akka) in that list!

~~~
gtani
I just put up a [[actors in _ lang]] list

<http://stackoverflow.com/questions/7566548/actor-based-distributed-concurrency-libraries-for-ocaml-and-other-languages>

------
agentultra
I don't think Haskell itself is really the answer.

If I read the article correctly, it's simply concurrency and parallelism
that matter.

There are a host of languages that do that quite well and Haskell just happens
to be one of them.

~~~
thesz
How do those languages fare in other fields, like constraint programming?

<http://hackage.haskell.org/package/monadiccp>

The real point is that Haskell is quite good in many areas and excellent in
parallelism and concurrency, while other languages are excellent in
concurrency but not so good in other areas.

Those many languages are the answer for the sole field of concurrency;
Haskell is the answer when you combine many fields, one of which could
happen to be concurrency.

~~~
agentultra
Sure, Haskell is good for more than just concurrent programming, but the
article was leaning on concurrency and parallelism in its comparison with
Node.js.

And I should also point out that said "blub" languages can implement those
features which they lack and Haskell includes by design. Some have better
features than Haskell, IMO (e.g. Qi/Shen's sequent types and the ability to
turn the type system off when you don't need it).

Again, Haskell is a good language. I just don't see it as a "cure." There are
many other options.

~~~
Peaker
How can the "blub" languages implement Haskell's type classes as libraries?
Or generalized type inference? What about Haskell's higher-kinded
polymorphism? And pattern matching?

Lisp can do some of these, but it is not exactly a "blub" language. Is there
a nice comparison of Qi and Haskell? Once you implement such a large,
non-trivial system (such as an advanced type system), I really doubt that
using Lisp macros is easier than implementing a compiler. Macros that do
such non-trivial things also do not compose well, so I doubt Lisp is
beneficial for this purpose.

------
shocks
For every application there is a language best suited... Let's stop trying to
force every language to be good at everything and then compare them as though
they were all the same, shall we?

------
kennystone
What's more important is applying some of Haskell's key concepts: functional
programming and dividing your program into tiny self-contained parts. You
can write this way in most languages: Ruby, Python, Scala, etc. The fancier
parts of Haskell (lazy evaluation, static typing, whatever) are less
important to making software that works than its functional nature.

~~~
Peaker
The static typing is essential to making software that works (and scales).
Dynamic typing requires a lot more test code and test code is expensive to
write, maintain, and repeatedly execute.

------
megaman821
I don't think the point of Ted Dziuba's rant was that every request is a
large calculation. In Node.js, if most results are small and generated
quickly, then when one large calculation request comes in, all the small
ones stop going out until it is done. A Haskell web server like Snap should
not have this problem.

------
giardini
If Haskell is the Cure, what is the Disease? 8-))

Haskell occupies a niche similar to Hamilton's quaternions (for classical
physics) and Heisenberg's matrices (for quantum mechanics) - not mandatory,
inaccessible to the masses and abandoned with haste once a more intuitive tool
is found.

But they will always be there if you need them.

