
> isolate the logic of map, filter etc from lists or any particular context

So, Functors (fmap)? fmap still uses regular old functions, no extra machinery.
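To make the point concrete, here is a small sketch showing that fmap is just an ordinary function applied inside whatever Functor you happen to have, lists or otherwise:

```haskell
import Data.Char (toUpper)

main :: IO ()
main = do
  print (fmap (+1) [1, 2, 3 :: Int])   -- lists:  [2,3,4]
  print (fmap (+1) (Just (41 :: Int))) -- Maybe:  Just 42
  print (fmap toUpper ('k', 'x'))      -- pairs (second slot): ('k','X')
```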

http://clojuredocs.org/clojure_contrib/clojure.contrib.gener...

http://learnyouahaskell.com/making-our-own-types-and-typecla...

I feel the same confusion as you. In Haskell terms, what has Rich implemented, or perhaps even invented?

I'm not sure I grok this, but I think that the main points are:

- reducers can work on tree structures, and thus can exploit parallelism. This would be like using a map that requires only the Foldable typeclass

- In Haskell you have stream/vector fusion, but it's not obvious when GHC will actually apply it, so you might want to use something like Control.Monad.Stream or Data.Vector. In theory it might be generalized to all Foldables, but in practice, for now, it might be a good enough compromise to stick to a limited number of types (the ones that support such transducers)

So: nothing terribly new or groundbreaking, but it might bring something like stream fusion to the masses (of Clojure developers :P )
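On the first point: a sketch of what "map over anything Foldable" looks like, using a hypothetical mapToList helper (not a standard library function). On a tree, the folds over the two subtrees are independent, which is where the parallelism opportunity comes from:

```haskell
{-# LANGUAGE DeriveFoldable #-}

-- A binary tree; DeriveFoldable gives us an in-order fold for free.
data Tree a = Leaf | Node (Tree a) a (Tree a)
  deriving Foldable

-- Map expressed purely via Foldable: works on lists, trees, Maybe, ...
-- On a tree, the left- and right-subtree folds are independent and
-- could in principle be evaluated in parallel.
mapToList :: Foldable t => (a -> b) -> t a -> [b]
mapToList f = foldMap (\x -> [f x])

main :: IO ()
main = print (mapToList (*2) (Node (Node Leaf 1 Leaf) 2 (Node Leaf (3 :: Int) Leaf)))
-- [2,4,6]
```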

Comparing transducers to stream fusion undersells them; their applicability is far broader than that.

They are more comparable to Oleg's Enumerators (http://okmij.org/ftp/Haskell/Iteratee/describe.pdf), in that you compose a series of computations and then push data through them. The type signature is similar:

    type Iteratee el m a -- a processor of 'els' in the monad 'm' returning a type 'a'

    type Enumerator el m a = Iteratee el m a -> m (Iteratee el m a)

The Enumerators library is complicated by the presence of monads and by trying to automatically close handles when the stream is processed. In some ways it seems that the goal of solving the lazy IO problem led to missing a much simpler abstraction. Transducers seem to be simpler and more focused on just abstracting stream/map-reduce computation.
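For comparison, here is roughly what that simpler abstraction looks like in Haskell terms; this is a sketch of the idea, not Clojure's actual API. A transducer is just a transformation of reducing ("step") functions, with no reference to any source collection:

```haskell
-- A reducing (step) function, and a transformation of such functions.
type Reducer a r      = r -> a -> r
type Transducer a b r = Reducer a r -> Reducer b r

-- The analogue of (map f): feed the step function transformed elements.
mapping :: (b -> a) -> Transducer a b r
mapping f step = \acc x -> step acc (f x)

-- The analogue of (filter p): skip the step when the predicate fails.
filtering :: (a -> Bool) -> Transducer a a r
filtering p step = \acc x -> if p x then step acc x else acc

main :: IO ()
main = do
  -- Compose once, then run with any fold: increment, keep evens, sum.
  let xform = mapping (+1) . filtering even
  print (foldl (xform (+)) 0 [1 .. 10 :: Int])  -- 2+4+6+8+10 = 30
```

Note that xform never mentions lists; the same composed transformation could drive a fold over a tree, a channel, or anything else with a reducing step.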

While not mentioned in the blog post, the transducers implementation supports both early termination and result completion/flushing/cleanup.

There does seem to be some overlap with stream fusion, which was all about exploiting the optimization opportunities that arise when you separate the collection traversal from the kernel performed on each element.

We called the bits in the middle "step functions", which could be combined with "consumers", "producers" and "transformers".

And the algebra generalizes hugely: not just to collections, but to things like concurrent processes, dataflow programs, etc.


Things to think about in a non-Haskell setting: how do you prevent the reordering of side effects? Can reordered exceptions or non-termination be observed?
