

Very fast, scalable mutable maps and hashes for Haskell - dons
http://donsbot.wordpress.com/2009/09/26/very-fast-scalable-mutable-maps-and-hashes-for-haskell/
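For context on what the linked article is about — mutable maps usable from IO in Haskell — here is a minimal sketch of the pattern, using the `hashtables` package API as an assumption (the post itself discusses other implementations, such as Judy-array bindings):

```haskell
-- A hedged sketch of a mutable hash table in Haskell's IO monad,
-- assuming the `hashtables` package (Data.HashTable.IO).
import qualified Data.HashTable.IO as H

type HashTable k v = H.BasicHashTable k v

main :: IO ()
main = do
  ht <- H.new :: IO (HashTable String Int)
  H.insert ht "one" 1          -- in-place, destructive update
  H.insert ht "two" 2
  mv <- H.lookup ht "one"
  print mv                     -- Just 1
```

Mutation here is confined to IO (or ST), which is exactly the "sweet spot" the thread below argues about: the table is destructively updated, but the types keep that fact visible.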

======
yason
I'm a lazy bastard and didn't read the complete article, but isn't mutability
pretty much a "NO!" in Haskell? I mean, it was the Haskell folks who wound up
implementing monads just to emulate mutable state without actual mutation.

Or are they finally going to look for the sweet spot between the two extremes
of the im/mutability continuum...

~~~
codexon
All programming languages must have side effects somewhere, or else you would
never be able to observe their output.

I believe this work was spurred by my comments on Haskell here:

[http://www.codexon.com/posts/why-arent-functional-languages-...](http://www.codexon.com/posts/why-arent-functional-languages-like-haskell-and-ocaml-popular)

Within 10 minutes of publishing the article, there was a flood of angry
Haskell supporters ready to decry every single point I had. Needless to say I
still stand by what I wrote. And I take this as further evidence that I had
struck a chord.

~~~
a-priori
Trolls, too, "strike a chord". That doesn't mean they're right. Just saying.

I just read your post and the comments. I think Don (who, coincidentally,
posted this article as well) did a pretty thorough job of dissecting your
arguments. What, exactly, is your purpose in reiterating them?

~~~
codexon
He absolutely did not dissect my arguments. His responses were either canned
or simply irrelevant.

1\. Haskell does not have good hash tables.

His response: just use HsJudy. He forgets to mention that it isn't even a hash
table, and that it's LGPL besides. An amusing fact is that the default hash
table implementation is so slow that someone had to implement their own from
scratch for the Language Shootout benchmark:

[http://shootout.alioth.debian.org/u32q/benchmark.php?test=kn...](http://shootout.alioth.debian.org/u32q/benchmark.php?test=knucleotide&lang=ghc&id=3)

2\. Haskell's lazy evaluation has bad consequences for SMP programming.

His response: First he tries to argue against it. Then when I convince him
that I am not just making things up, he finally backtracks and says that's why
he made a library to cover up the shortcomings.

3\. Haskell's default static bundling of LGPL GMP.

I didn't specify the word "distribution", but I was talking in part about
commercial usage where this is a reasonable assumption. He decided to harp on
this single specific word and declared the whole point to be invalid.

~~~
jrockway
These are all problems (to someone), but these aren't the reasons why people
don't use Haskell. You are projecting your minor complaints onto the world as
a whole, and that simply doesn't make for a good blog post.

1\. I have written a bit of Haskell software. I have never suffered any
performance problems by using a tree instead of a hash table. O(1) is nice,
but O(log n) stays very small even for very large n. "You are not Google", but even
Google does fine with trees. If you are actually experiencing a performance
problem, try HsJudy. If this doesn't solve your real-world performance
problem, then you can complain. Right now you sound like you just read the
Wikipedia page for "hash table" but have never had a dataset bigger than 10
elements.
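The tree-based alternative being defended above is the standard purely functional map. As a minimal sketch of that approach (using `Data.Map` from the containers package; the names `phone` and `phone'` are just illustrative):

```haskell
-- O(log n) lookups and inserts on a balanced binary tree,
-- with no mutation: insert returns a *new* map that shares
-- structure with the old one.
import qualified Data.Map as M

phone :: M.Map String Int
phone = M.fromList [("alice", 1234), ("bob", 5678)]

main :: IO ()
main = do
  print (M.lookup "alice" phone)       -- Just 1234
  print (M.lookup "carol" phone)       -- Nothing
  let phone' = M.insert "carol" 9999 phone
  print (M.size phone')                -- 3
  print (M.size phone)                 -- 2 (original untouched)
```

The trade-off under debate: each operation costs O(log n) rather than O(1), but the structure is persistent and safe to share across threads without locks.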

2\. Lazy evaluation does have consequences, sure... but so does strict
evaluation. You are trading one set of problems for another set of problems.
What's nice about Haskell is that you can use strict evaluation when you think
it's required.
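For what "use strict evaluation when you think it's required" looks like in practice, here is a small sketch using `foldl'` and bang patterns (standard GHC features; the function names are just illustrative):

```haskell
{-# LANGUAGE BangPatterns #-}
-- Opting into strictness in Haskell where laziness would hurt.
import Data.List (foldl')

-- foldl' forces the accumulator at each step, running in
-- constant space, where lazy foldl would build a thunk chain.
sumStrict :: [Int] -> Int
sumStrict = foldl' (+) 0

-- Bang patterns (!s, !n) force both accumulators on every
-- recursive call, avoiding a space leak in the running sums.
mean :: [Double] -> Double
mean = go 0 (0 :: Int)
  where
    go !s !n []     = s / fromIntegral n
    go !s !n (x:xs) = go (s + x) (n + 1) xs

main :: IO ()
main = do
  print (sumStrict [1 .. 1000000])   -- 500000500000
  print (mean [1, 2, 3, 4])          -- 2.5
```

The point being traded on in the thread: the default is lazy, but strictness is available locally, per binding or per argument, where profiling shows it matters.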

3\. LGPL is not a problem for me.

Any software I would distribute to outside users would be GPL'd anyway. Any
software that is worth money to keep secret is not distributed anyway. (Not
everyone is like this, of course, but like I said... this isn't a problem for
_me_.)

Anyway, you should just say you don't like Haskell if you don't like Haskell.
Instead you are trying to say that it is objectively bad, which requires more
proof than your blog post provided.

~~~
codexon
1\. Using the "You are not Google" argument shows a blatant disregard for the
facts. If you do the math, you will see that tries and ternary trees often
lose to hash tables past 60,000 elements, and that is before accounting for
the cache misses incurred by jumping between nodes. I'm sorry, but you are the
one who sounds like you've never had a dataset bigger than 10 elements.

And speaking of Google, have you ever heard of this?

<http://code.google.com/p/google-sparsehash/>

2\. The point is that lazy evaluation has more problems than strict evaluation, not fewer.
If you read the comments you will see that Dons actually agreed with me after
arguing the contrary. Even Simon Peyton Jones believes that "the next Haskell
should be strict".

3\. LGPL is a problem for many companies. If you don't really care about it,
good for you. You are in the minority.

In conclusion, I thought that Hacker News would be immune to Haskell
groupthink, but I was unfortunately wrong.

~~~
jrockway
_In conclusion, I thought that Hacker News would be immune from Haskell group
think, but I was unfortunately wrong._

I'm not Hacker News, I'm one poster.

~~~
codexon
And that's why I have multiple downvotes and everyone else arguing against me
has multiple upvotes...

------
tel
Oi! Those bar graphs confused me for so very long. A better graphic would have
been a scatterplot or a time-series chart, and the x-axis label should be
something like "repetition no." or "trial no."

