They're cool and all—I've characterized them (partially, perhaps) as CPS encoded flatmaps over lists and also as stateful left fold transformers, the latter of which is much more general—but they're more like a rich ecosystem for list transformations than any kind of fundamental abstraction.
I would love a comparison in the form of Haskell, converted to Lisp syntax, and wired up to JVM.
Haskell is strongly typed and lazily evaluated, which easily enables functions to take only one argument at a time. Although a function could take a tuple parameter, it can be, and usually is, rewritten to take each component of the tuple as a separate parameter, which keeps the strong typing and built-in currying simple, makes higher structures like monads possible, and lets the syntax be designed to suit this style.
Lisp functions, on the other hand, must take many arguments at a time to cater to the isomorphicity of the language. It's therefore much more difficult for parameters to be typed, or to curry them. The syntax requires explicit visual nesting. Inter-translating between this style and the Haskell style therefore is difficult.
I'd even suggest the Haskell and Lisp styles are two different peaks on the ladders of programming language abstraction, and the poster children of each, monads and macros, just don't interoperate very well, simply because of the different required foundations of each language.
And dear god do I wish people would stop abusing "isomorphic".
> I don't find it difficult to intertranslate at all.
I should have also mentioned "variadic" with regards to lists and macros. Although it's possible to inter-translate, the required add-ons such as Template Haskell macros and Typed Clojure make each language more clunky. The pure form of each language is based on two mutually exclusive foundations, i.e. strongly typed auto-curried parameters and variadic untyped macro parameters.
And variadic functions are easily translated as well. They're much more syntactic convenience than true semantic variation. Typically, variadic functions encode defaulting which, in Haskell style, is just ignored. Otherwise, you just encode them as separate functions with different names. That can be annoying, but in my experience it rarely is. Worst case, you can often abstract over most of the polymorphism using a typeclass.
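To make that translation concrete, here's a sketch in Haskell (the names `str`, `str1`, `str2`, `StrArgs` are all hypothetical, not from any library): a Clojure-style variadic string-building function rendered either as separately named fixed arities, or with the familiar typeclass-over-result-type trick.

```haskell
{-# LANGUAGE FlexibleInstances #-}

-- Option 1: separate names per arity.
str1 :: Show a => a -> String
str1 = show

str2 :: (Show a, Show b) => a -> b -> String
str2 a b = show a ++ show b

-- Option 2: abstract the "variadicity" over the result type
-- (the same trick Haskell's printf uses).
class StrArgs r where
  strAcc :: String -> r

instance StrArgs String where
  strAcc = id

instance (Show a, StrArgs r) => StrArgs (a -> r) where
  strAcc acc a = strAcc (acc ++ show a)

str :: StrArgs r => r
str = strAcc ""
```

With this, `str (1 :: Int) True :: String` accepts as many arguments as you like; the result-type annotation resolves the "arity".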
I'm sure you could manufacture some Clojure code which takes advantage of non-obvious macros, bizarre variadicity, and massive dynamic typing... but it'd be really hard to understand as a human.
Human intelligibility tends to drive Clojure code to be easily translatable.
True story: the transducer announcement has mostly made me read up on the Haskell fold and lens libraries...
type Transducer a b =
  forall r . (b -> r -> r) -> (a -> r -> r)

which is equivalent, step by step, to each of:

forall r . (b -> r -> r) -> (a -> r -> r)
forall r . (r -> b -> r) -> (r -> a -> r)
a -> (forall r . (b -> r -> r) -> r -> r)   [non-obvious, but true]
a -> [b]
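For what it's worth, here's a sketch of how concrete transducers inhabit the first formulation (the helper names `mapping`, `filtering`, `compT`, `runT` are mine, not standard):

```haskell
{-# LANGUAGE RankNTypes #-}

type Transducer a b = forall r . (b -> r -> r) -> (a -> r -> r)

-- Mapping and filtering inhabit the type directly.
mapping :: (a -> b) -> Transducer a b
mapping f step a r = step (f a) r

filtering :: (a -> Bool) -> Transducer a a
filtering p step a r = if p a then step a r else r

-- Composition is just nesting of reducing-function transformers.
compT :: Transducer a b -> Transducer b c -> Transducer a c
compT t u step = t (u step)

-- To actually run one, hand it a reducing function and fold.
runT :: Transducer a b -> [a] -> [b]
runT t = foldr (t (:)) []
```

For example, `runT (compT (mapping (+1)) (filtering even)) [1,2,3,4,5]` increments each element and then keeps the even results, giving `[2,4,6]`, without ever materializing the intermediate list.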
data Fold i o where
  Fold :: (i -> x -> x) -> x -> (x -> o) -> Fold i o

type Transducer a b = forall r . Fold b r -> Fold a r
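A sketch of this Fold-transformer view, with a hypothetical `mappingF` playing the transducer role (names `runFold`, `sumF`, `mappingF` are mine):

```haskell
{-# LANGUAGE GADTs #-}

-- A left-fold packaged as a first-class value: step, seed, and finisher,
-- with the accumulator type x hidden existentially.
data Fold i o where
  Fold :: (i -> x -> x) -> x -> (x -> o) -> Fold i o

runFold :: Fold i o -> [i] -> o
runFold (Fold step z out) = out . foldr step z

sumF :: Fold Int Int
sumF = Fold (+) 0 id

-- A transducer in this formulation rewrites whole Folds:
-- it only touches the step function, leaving seed and finisher alone.
mappingF :: (a -> b) -> Fold b r -> Fold a r
mappingF f (Fold step z out) = Fold (step . f) z out
```

So `runFold (mappingF (*2) sumF) [1,2,3]` doubles each input on the way into the sum, yielding 12.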
to :: (a -> [b]) -> (a -> (forall r . (b -> r -> r) -> r -> r))
to f a = h (f a)
  where
    h b cons nil =
      case b of
        (x:xs) -> cons x (h xs cons nil)
        []     -> nil

from :: (a -> (forall r . (b -> r -> r) -> r -> r)) -> (a -> [b])
from f a = f a (:) []
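Here's a self-contained version of that round trip, with the Church-encoded list built by foldr and eliminated with (:) and [] (same `to`/`from` names as above, definitions compressed):

```haskell
{-# LANGUAGE RankNTypes #-}

-- Church-encode the output list: replace every cons/nil with the
-- caller's cons/nil.
to :: (a -> [b]) -> (a -> (forall r . (b -> r -> r) -> r -> r))
to f a cons nil = foldr cons nil (f a)

-- Decode by handing the Church list the real constructors.
from :: (a -> (forall r . (b -> r -> r) -> r -> r)) -> (a -> [b])
from f a = f a (:) []
```

Round-tripping recovers the original function: `from (to (\n -> [1 .. n])) 4` gives `[1,2,3,4]`.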
I ran this example through Clojure in "let's write a transducer", and it plain doesn't work; I think the mapping given above works for all transducers that are actually valid, but I think the type definition allows for broken ones.
Again, I'm not sure & still learning, so please explain if I've gone wrong here.
In an impure setting none of the math above holds.
Edit: note that this doesn't mean you have to throw away local or global state. You can recapture that by using state machine transformers or even Kleisli state machine transformers (although that gets pretty heavy!)
newtype Mu f = Mu (forall r . (f r -> r) -> r)
newtype Mu f = Mu (f (Mu f))
The really fascinating part is that my Mu is equivalent to your Mu in Haskell... because Haskell is Turing complete. In a world where least and greatest fixed points differ, in a world where data and codata differ, Mu has a cousin Nu:
data Mu f = Mu (forall r . (f r -> r) -> r)
data Nu f = forall r . Nu (r -> f r) r
If you want a total language, replace the general fixed point with this pair of fixed points. Mu only lets you construct things finitely and tear them down with catamorphisms, and Nu only lets you destruct finitely things you've built with anamorphisms.
Even in Haskell you can treat certain uses of types as being data-like or codata-like by viewing them more preferentially through Mu or Nu---though you can always convert both to Fix
newtype Fix f = Fix (f (Fix f))
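A sketch of those embeddings, with a toy Peano functor to check them (all helper names here are mine):

```haskell
{-# LANGUAGE RankNTypes, ExistentialQuantification, DeriveFunctor #-}

newtype Fix f = Fix (f (Fix f))
newtype Mu f  = Mu (forall r . (f r -> r) -> r)
data Nu f     = forall r . Nu (r -> f r) r

-- In Haskell all three coincide; both Mu and Nu embed into Fix.
muToFix :: Mu f -> Fix f
muToFix (Mu fold) = fold Fix

nuToFix :: Functor f => Nu f -> Fix f
nuToFix (Nu step s) = Fix (fmap (\s' -> nuToFix (Nu step s')) (step s))

-- Tiny demo functor: Peano naturals.
data NatF r = Z | S r deriving Functor

toInt :: Fix NatF -> Int
toInt (Fix Z)     = 0
toInt (Fix (S n)) = 1 + toInt n

-- Two, built the Mu way (fold over any algebra)...
two :: Mu NatF
two = Mu (\alg -> alg (S (alg (S (alg Z)))))

-- ...and the Nu way (unfold from a seed).
twoN :: Nu NatF
twoN = Nu (\n -> if n <= 0 then Z else S (n - 1)) (2 :: Int)
```

Both `toInt (muToFix two)` and `toInt (nuToFix twoN)` come out as 2; in a total language the Nu unfold could just as well describe an infinite value that Mu cannot.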
That step had me really confused.
Function composition usually operates on non-collection inputs. If you compose f and g, that usually means f takes some input and then g takes as input the output of f.
Transducers are a generalization of composition in the sense that they take a collection of the inputs f would accept.
So, in that sense, function composition could be a specific instance of transducers with number of inputs = 1.
Transducers are a subset of functions. They cannot be generalizing them.
Re: the characterisation, the only thing I'd add is that they have explicit start and end calls. Start resets the state, end can clean up resources if necessary.
I tried thinking about what output structure a generalised transducer would form, and came to the conclusion it was a foldable monoid i.e. pretty list-like.
EDIT: s/composed functions/composed logic functions/
I think Rich's innovation here is extremely clever and quite subtle, but it's pretty Clojure-specific, both in terms of the problem it solves and the way it solves it.
Now what if this is done to reduce itself? What should a reduced arity (reduce fun) return? I think that is a noop: nothing needs to be done to fun to make it reduce under reduce, so (reduce fun) -> fun. I.e. to transduce the reducer "fun", an arbitrary function, into a function X such that (reduce X list) will do the job of (reduce fun list), we simply equate X with fun.
In case anyone is unaware, LINQ lets you represent queries as a series of function calls, which it captures as Expression objects forming an in-memory AST.
Clojure's are more fundamental than other languages?
This already avoids the creation of intermediate structure since you just keep transforming the reduction function, but you have this sort of useless "reducible" thing attached. Mostly, the trouble is that you were afflicted by the kingdom of nouns---you don't really need a structure to think of first class objects.
Instead, you can just consider the various ways of transforming reduction functions. They all compose as (reverse) functions (you can see them as a category) and you can take your resultant "transducer" and apply it to a source and sink structure to map out of the source and into the sink.
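A sketch of this "just transform reduction functions" view (the names `Reducer`, `mappingR`, `filteringR`, `evensDoubled` are mine); note the reversal: composed with plain (.), the leftmost transformer touches the data first.

```haskell
-- A reduction function in the foldl shape.
type Reducer r a = r -> a -> r

-- Transformers of reduction functions; no collection in sight.
mappingR :: (a -> b) -> Reducer r b -> Reducer r a
mappingR f step acc a = step acc (f a)

filteringR :: (a -> Bool) -> Reducer r a -> Reducer r a
filteringR p step acc a = if p a then step acc a else acc

-- They compose as ordinary functions, in reverse data order:
-- filter first, then double.
evensDoubled :: Reducer r Int -> Reducer r Int
evensDoubled = filteringR even . mappingR (* 2)
```

Only at the end do you pick a source and a sink: `foldl (evensDoubled (+)) 0 [1..5]` keeps 2 and 4, doubles them, and sums to 12.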