It depends on your toolbox. FP (Haskell-style FP anyway) appears at first to hobble a lot of mutable algorithms, but (a) it has its own toolbox with its own strengths and (b) you can embed mutability in such a way as to use all of your standard mutable algorithms just the same.
From a personal perspective, I program a lot in Haskell and I find it perfectly fluent.
Just don't expect to take an algorithm predicated on assumptions about a particular memory model and translate it unmodified, at least not without using the embedding trick I mentioned above.
Fortunately, you're in luck. The whole point of this series is to culminate in a description of the "embedding trick" I referenced above. Part 2 is written and being edited now. It introduces a slow, simple, buggy way of writing Union/Find "purely".
Part 3 will fix the bug and then explore a fast embedding trick.
It's not yet, but (to cheat a bit here) eventually we're going to use `Mem` to get quite intimate with `ST` and show why `ST` is worth reaching for, perhaps more often than people anticipate.
If you'd like, try to instantiate something along the lines of `Mem (State (IntMap v))`. It's slightly trickier than it looks due to some typing issues (which don't matter, but I'll save the reveal). It's also likely to be slightly buggy.
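Since the `Mem` class itself isn't spelled out in this thread, here's one guess at what such an instance could look like. The class shape below (`ref`/`deref`/`set` over `Int` keys) is my assumption, not the series' actual definition, and it deliberately keeps the bugs the comment hints at:

```haskell
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE FlexibleInstances     #-}
import           Control.Monad.State
import           Data.IntMap (IntMap)
import qualified Data.IntMap as IntMap

-- Hypothetical interface; the real `Mem` class from the series may differ.
class Monad m => Mem m v where
  ref   :: v -> m Int        -- allocate a fresh cell
  deref :: Int -> m v        -- read a cell (partial if the key is bogus!)
  set   :: Int -> v -> m ()  -- overwrite a cell

-- The naive instance over `State (IntMap v)`. Note the typing wrinkle:
-- this needs multi-parameter classes, and `deref` is where the bug hides.
instance Mem (State (IntMap v)) v where
  ref v = do
    m <- get
    let k = IntMap.size m            -- fresh-key scheme; simple but fragile
    put (IntMap.insert k v m)
    return k
  deref k = gets (IntMap.! k)        -- throws on a missing key
  set k v = modify (IntMap.insert k v)
```

For instance, `evalState (do { k <- ref 10; set k 20; deref k }) IntMap.empty` works, but `deref` on a key you never allocated blows up at runtime, which is exactly the sort of bug the purely-functional treatment later has to confront.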
Yeah, exception resolution is pretty weird in Haskell due to laziness. In some sense, you end up thinking of exceptions as being buried in values in the language instead of exposed by computation; this is exactly your null-value landmine problem.
Fortunately, in idiomatic Haskell exceptions are considered an expert feature (basically I see them only in concurrent IO or resource-management code) and partial functions, usually introduced by incomplete pattern matching like the one hiding inside `None.get`, are verboten.
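A tiny illustration of what "buried in values" means: the landmine below sits inertly inside the list, and only detonates when something actually forces that element.

```haskell
import Control.Exception (ErrorCall, evaluate, try)

main :: IO ()
main = do
  -- The exception is a value sitting inside the list, not a control effect:
  let xs = [1, error "landmine", 3] :: [Int]
  print (length xs)   -- fine: length walks the spine, never the elements
  -- Forcing the buried element is what finally throws:
  r <- try (evaluate (xs !! 1)) :: IO (Either ErrorCall Int)
  putStrLn (either (const "boom, but only when forced") show r)
```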
Syntactic naturality seems like it's mostly a function of familiarity. That said, the example has, for a lot of mathematical reasons, a great deal of semantic naturality.
It's far from immediately obvious, but `foldr (:) []` is the way to copy singly-linked lists. In particular, if you look at a linked list as a degenerate tree, then what `foldr f z` does is replace all of the nodes with `f` and the single leaf with `z`. Since cons, i.e. `(:)`, creates a node and nil, i.e. `[]`, is that single leaf, `foldr (:) []` is a very natural way to do nothing at all.
So it's kind of the most boring interesting example imaginable.
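Spelled out on a small list:

```haskell
-- `foldr f z` replaces each (:) with f and the final [] with z,
-- so `foldr (:) []` rebuilds the exact same spine: a copy.
copy :: [a] -> [a]
copy = foldr (:) []

main :: IO ()
main =
  -- foldr (:) [] (1:(2:(3:[]))) = 1:(2:(3:[]))
  print (copy [1, 2, 3 :: Int])   -- [1,2,3]
```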
I personally think the priority-queue one is somewhat simpler than a fixed-array algorithm. It also produces an infinite list of primes (or a generator) instead of just the primes up to a particular bound, which is especially important in Haskell.
I'd love to see it as it shows off some great data structures available in the libraries (like priority queues) but it's getting longish.
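Not the full priority-queue version, but here's a compact sketch of the same incremental idea, using `Data.Map` as a stand-in heap (a real version would use an actual priority queue, e.g. from the `pqueue` package):

```haskell
import qualified Data.Map.Strict as Map

-- Infinite list of primes via an incremental sieve. The Map plays the
-- role of a priority queue keyed on "next composite to cross off",
-- mapping each such composite to the primes that generate it.
primes :: [Integer]
primes = sieve [2 ..]
  where
    sieve (p : xs) = p : go xs (Map.singleton (p * p) [p])
    sieve []       = []
    go (x : xs) table = case Map.lookup x table of
      -- x isn't a scheduled composite, so it's prime; schedule x*x.
      Nothing -> x : go xs (Map.insert (x * x) [x] table)
      -- x is composite; reschedule each generating prime q at x+q.
      Just qs -> go xs (foldl reinsert (Map.delete x table) qs)
      where reinsert t q = Map.insertWith (++) (x + q) [q] t
    go [] _ = []
```

Being lazy and infinite, this composes with `take`, `takeWhile (< n)`, and friends, which is the generator-style payoff the array version can't give you.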
The mutable array version is basically the same as imperative code everywhere:

    import Control.Monad (forM_, when)
    import Data.Array.ST (newArray, readArray, writeArray, runSTUArray)
    import Data.Array.Unboxed (UArray)

    sieveUA :: Int -> UArray Int Bool
    sieveUA top = runSTUArray $ do
        let m = (top - 1) `div` 2
            r = floor . sqrt $ fromIntegral (top + 1)
        sieve <- newArray (1, m) True
        forM_ [1 .. r `div` 2] $ \i -> do
          isPrime <- readArray sieve i
          when isPrime $
            forM_ [2*i*(i+1), 2*i*(i+2)+1 .. m] $ \j ->
              writeArray sieve j False
        return sieve
It's a bit noisy syntactically and doesn't show off as much interesting stuff.