Hacker News | tel's comments

DLX might be a fun algorithm to explore too. Union-Find is one I was more familiar with, but I think you're correct that DLX cannot have a reasonable, direct "pure" translation.

reply


My next post shows the balanced tree approach. It's still in the works, but if you're eager then it wouldn't be difficult to find the draft in the same repo as the code.

reply


It depends on your toolbox. FP (Haskell-FP anyway) appears at first to hobble a lot of mutable algorithms, but (a) it has its own toolbox with its own strengths and (b) you can embed mutability in such a way that you can use all of your standard mutable algorithms just the same.

From a personal perspective, I program a lot in Haskell and I find it perfectly fluent.

Just don't expect to take an algorithm predicated on assumptions of a particular memory model and translate it unmodified without at least using the embedding trick I mentioned above.
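
As a taste of the trick, here's a minimal sketch using the standard `Control.Monad.ST` machinery (not necessarily the formulation this series builds up to): a function that is pure from the outside but an imperative loop on the inside.

    import Control.Monad.ST (runST)
    import Data.STRef (modifySTRef', newSTRef, readSTRef)

    -- The mutation lives entirely inside runST and cannot leak out,
    -- so sumST is an ordinary pure function to its callers.
    sumST :: [Int] -> Int
    sumST xs = runST $ do
      acc <- newSTRef 0
      mapM_ (\x -> modifySTRef' acc (+ x)) xs
      readSTRef acc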

---

Fortunately, you're in luck. The whole point of this series is to culminate in a description of the "embedding trick" I referenced above. Part 2 is written and being edited now. It introduces a slow, simple, buggy way of writing Union/Find "purely".

Part 3 will fix the bug and then explore a fast embedding trick.

reply

judk 16 hours ago | link

When will Haskell have a standard library that provides fast embedding, so programmers don't have to resort to creative tricks to get working algorithms?

reply

dllthomas 16 hours ago | link

I was under the impression that was ST.

reply

tel 12 hours ago | link

Exactly!

reply


I was considering implementing something similar to the Overmars methods mentioned on page 90. Maybe I'll read this paper and see if I can tack it on at the end... Thanks for the link!

reply


After someone asked for it in /r/haskell, I added a link to the complete code for this post.

https://github.com/tel/tel.github.io/tree/master/public/code...

Thus far there are no dependencies on outside libraries (though eventually I'll have to use `containers`) so you can just drop it into GHCi and play around!

reply


Does it not seem possible to you?

Hopefully by part 3 I'll be able to poke very precisely at the seam between mutable and immutable algorithms and unlock how to pick between them with greater refinement.

reply

rwosync 23 hours ago | link

I'm glad that you've taken the time to write more. Your posts have been very articulate, especially the one on type systems.

reply

tel 23 hours ago | link

Thanks! It means a lot to hear. It's also quite motivating :)

reply


It's not yet, but—to cheat a bit here—eventually we're going to use `Mem` to get quite intimate with `ST` and show why it's important to use `ST` perhaps more often than people even anticipate.

If you'd like, try to instantiate something along the lines of `Mem (State (IntMap v))`. It's slightly trickier than it looks due to some typing issues (which don't matter, but I'll save the reveal). It's also likely to be slightly buggy.
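
To sketch what I mean (hypothetical names here; the `Mem` class in the posts may well be shaped differently, and this leans on `mtl` and `containers`):

    {-# LANGUAGE FlexibleInstances, MultiParamTypeClasses #-}

    import Control.Monad.State (State, gets, modify)
    import Data.IntMap (IntMap)
    import qualified Data.IntMap as IntMap

    -- A memory-like interface: allocate a cell and read one back.
    class Monad m => Mem v m where
      ref   :: v -> m Int
      deref :: Int -> m v

    -- The instance suggested above: State over an IntMap.
    instance Mem v (State (IntMap v)) where
      ref v = do
        n <- gets IntMap.size            -- "fresh" key: buggy if cells
        modify (IntMap.insert n v)       -- are ever deleted
        return n
      deref n = gets (IntMap.! n)

The LANGUAGE pragmas are a hint at the typing issues mentioned above.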

reply


Yeah, exception resolution is pretty weird in Haskell due to laziness. In some sense, you end up thinking of exceptions as being buried in values in the language instead of exposed by computation; this is your null-value landmine problem exactly.

http://research.microsoft.com/en-us/um/people/simonpj/papers...

This is also why the Control.Exception module exposes `evaluate`, which embeds a value in an IO computation and thus resolves this problem... If you remember to use it.

http://hackage.haskell.org/package/base-4.7.0.0/docs/Control...
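
To make the landmine concrete, a small sketch (just `base`):

    import Control.Exception (ArithException, evaluate, try)
    import Data.Either (isRight)

    main :: IO ()
    main = do
      -- `try` catches nothing here: `return` merely wraps the thunk,
      -- so we get a Right with the exception still buried inside it.
      r1 <- try (return (div 1 0)) :: IO (Either ArithException Int)
      -- `evaluate` forces the value within IO, so `try` can catch it.
      r2 <- try (evaluate (div 1 0)) :: IO (Either ArithException Int)
      print (isRight r1, isRight r2)  -- prints (True,False)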

Fortunately, in idiomatic Haskell exceptions are considered to be an expert feature (basically I see them in concurrent IO or resource management code only) and partial functions, usually introduced by incomplete pattern matching like `None.get` does, are verboten.

It's not perfect, but -Wall will help with that.

reply


Syntactic naturality seems like it's mostly a function of familiarity. That said, the example has, for a lot of mathematical reasons, a great deal of semantic naturality.

It's far from immediately obvious, but `foldr (:) []` is the way to copy singly-linked lists. In particular, if you look at a linked list as a degenerate tree, then what `foldr f z` does is replace all of the nodes with `f` and the single leaf with `z`. Since cons, i.e. (:), creates a node and nil, i.e. [], is that single leaf, `foldr (:) []` is a very natural way to do nothing at all.
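
Written out, it's just the substitution described above:

    -- foldr f z rewrites   1 : (2 : (3 : []))
    -- into                 1 `f` (2 `f` (3 `f` z))
    -- so with f = (:) and z = [] the spine is rebuilt node for node.
    copy :: [a] -> [a]
    copy = foldr (:) []

    -- In GHCi: copy [1,2,3] == [1,2,3]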

So it's kind of the most boring interesting example imaginable.

reply


I personally think the priority queue one is somewhat simpler than a fixed array algorithm. It also produces an infinite list of primes (or generator) instead of the primes up to a particular number. This is especially important in Haskell.

I'd love to see it as it shows off some great data structures available in the libraries (like priority queues) but it's getting longish.
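
For a rough shape of it, here's a sketch that stands in `Data.Map` for a dedicated priority queue structure (a real version would use one of the proper priority queue libraries):

    import qualified Data.Map as Map

    -- Incremental sieve: the Map maps each upcoming composite to the
    -- primes that generate it, playing the role of a priority queue.
    primes :: [Integer]
    primes = 2 : sieve [3,5..] Map.empty
      where
        -- the candidate list is infinite, so the [] case never arises
        sieve (x:xs) table = case Map.lookup x table of
            -- x hit no scheduled composite: it's prime; schedule x*x
            Nothing -> x : sieve xs (Map.insert (x*x) [x] table)
            -- x is composite: move each witness p to its next odd multiple
            Just ps -> sieve xs (foldr reinsert (Map.delete x table) ps)
          where
            reinsert p = Map.insertWith (++) (x + 2*p) [p]

    -- In GHCi: take 10 primes == [2,3,5,7,11,13,17,19,23,29]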

The mutable array version is basically the same as imperative code everywhere:

    import Control.Monad (forM_, when)
    import Data.Array.ST (newArray, readArray, writeArray, runSTUArray)
    import Data.Array.Unboxed (UArray)

    -- Index i stands for the odd number 2*i+1; evens are never stored.
    sieveUA :: Int -> UArray Int Bool
    sieveUA top = runSTUArray $ do
        let m = (top-1) `div` 2
            r = floor . sqrt $ fromIntegral top + 1
        sieve <- newArray (1,m) True
        forM_ [1..r `div` 2] $ \i -> do
          isPrime <- readArray sieve i
          when isPrime $ do
            -- cross off odd multiples of p = 2*i+1, starting at p*p
            forM_ [2*i*(i+1), 2*i*(i+2)+1..m] $ \j -> do
              writeArray sieve j False
        return sieve
It's a bit noisy syntactically and doesn't show off as much interesting stuff.

reply
