
Clojure's Transducers are as fundamental as function composition - ithayer
http://thecomputersarewinning.com/post/Transducers-Are-Fundamental/
======
tel
What does "as fundamental as function composition" mean here? The article just
appears to describe transducers. Transducers _compose_ via ("reverse")
function composition, sure, but that just means that they _are_ functions of a
type... and considerably less fundamental than _functions_ since they're a
specialization of that class of things.

They're cool and all—I've characterized them (partially, perhaps) as CPS-encoded
flatmaps over lists and also as stateful left fold transformers, the latter of
which is much more general—but they're more like a rich ecosystem for list
transformations than any kind of _fundamental_ abstraction.
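
The "stateful left fold transformer" view can be sketched in a few lines of
Python (names like `mapping` and `filtering` are mine, not from the thread): a
transducer is just a function that takes a reduction (step) function and
returns a new one.

```python
from functools import reduce

# A "transducer" here is a function from one reduction function to another.
# mapping(f) transforms a step function so each input is passed through f first.
def mapping(f):
    def transducer(step):
        def new_step(acc, x):
            return step(acc, f(x))
        return new_step
    return transducer

# filtering(pred) transforms a step function so it skips non-matching inputs.
def filtering(pred):
    def transducer(step):
        def new_step(acc, x):
            return step(acc, x) if pred(x) else acc
        return new_step
    return transducer

# Transducers compose by ordinary function composition; note the stages
# read left-to-right over the data ("reverse" composition).
def compose(t1, t2):
    return lambda step: t1(t2(step))

xform = compose(mapping(lambda x: x * 2), filtering(lambda x: x > 4))
append = lambda acc, x: acc + [x]
result = reduce(xform(append), [1, 2, 3, 4], [])
# doubles each element, then keeps those greater than 4: result == [6, 8]
```

This is only the stateless core; real Clojure transducers also handle early
termination and per-transducer state, which is where the "stateful" part of
the characterization comes in.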

~~~
judk
This is typical of Clojure, as a sort of anti-Haskell. It starts with
empirical structures invented by folks without PL theory training, and then
wriggles to find an explanation in terms of standard theory. You can see this
here on HN when Rich talks to Haskell folks and people translate his writing
into standard language.

I would love a comparison in the form of Haskell, converted to Lisp syntax,
and wired up to JVM.

~~~
vorg
You relate Haskell to standard theory and Clojure to non-PL-theory structures.
I think the real issue is that strongly-typed lazy functional code (Haskell)
and dynamically-typed isomorphic code (Clojure, and Lisp) have an "impedance
mismatch" such that they don't inter-translate very well. I think that's based
on the different foundation of each language.

Haskell is strongly typed and lazily evaluated, which easily enables functions
to take only one argument at a time. Although a function could take a tuple
parameter, it can be, and usually is, rewritten to take each component of the
tuple as a separate parameter. That makes the strong typing and built-in
currying simple, makes higher structures like monads possible, and lets the
syntax be designed to suit this style.

Lisp functions, on the other hand, must take many arguments at a time to cater
to the isomorphicity of the language. It's therefore much more difficult for
parameters to be typed, or to curry them. The syntax requires explicit visual
nesting. Inter-translating between this style and the Haskell style therefore
is difficult.

I'd even suggest the Haskell and Lisp styles are two different peaks on the
ladders of programming language abstraction, and the poster child of each,
monads and macros, just don't interoperate very well, simply because of the
different required foundations of each language.

~~~
tel
I don't find it difficult to intertranslate at all. Trivial actually: the
currying is a complete non-issue. For pure, terminating code
laziness/strictness hardly matters.

And dear god do I wish people would stop abusing "isomorphic".

~~~
metasoarous
I'm guessing that "homoiconicity" was intended, not "isomorphic". The former
makes sense in context.

~~~
vorg
I did mean "homoiconic". It was 5am when I wrote it.

> I don't find it difficult to intertranslate at all.

I should have also mentioned "variadic" with regards to lists and macros.
Although it's possible to inter-translate, the required add-ons such as
Template Haskell macros and Typed Clojure make each language more clunky. The
pure form of each language is based on two mutually exclusive foundations,
i.e. strongly typed auto-curried parameters and variadic untyped macro
parameters.

~~~
tel
Many macros can be translated to Haskell due to laziness. It's true that some
can be difficult to translate or end up requiring TH. That's a downside, but
it's rare.

And variadic functions are easily translated as well. They're much more
syntactic convenience than true semantic variation. Typically, variadic
functions encode defaulting which, in Haskell style, is just ignored.
Otherwise, you just encode them as separate functions with different names.
That can be annoying, but in my experience it rarely is. Worst case, you can
often abstract over most of the polymorphism using a typeclass.

I'm sure you could manufacture some Clojure code which takes advantage of non-
obvious macros, bizarre variadicity, and massive dynamic typing... but it'd be
really hard to understand as a human.

Human intelligibility tends to drive Clojure code to be easily translatable.

------
dons
Title shows author doesn't understand programming language design. No, a small
set of higher-order functions[1] is not as fundamental as the concept of
higher-order functions in the first place.

[1]:
[http://www.reddit.com/r/haskell/comments/2cv6l4/clojures_tra...](http://www.reddit.com/r/haskell/comments/2cv6l4/clojures_transducers_are_perverse_lenses/cjjgbbs?context=3)

~~~
ithayer
Author here, thanks for the comment -- since that seems to have been lost, I'm
using "fundamental" because transducers let you describe your logic so that it
can be applied over sequences and non-sequences in a way that you cannot by
just applying composed logic functions through existing Clojure machinery.

EDIT: s/composed functions/composed logic functions/
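
One way to read that claim (a Python sketch, with invented names, not the
article's code): because a transducer only transforms the step function and
never names a collection, the same logic can drive entirely different
reduction contexts.

```python
from functools import reduce

# A transducer mentions no collection, only the step function it wraps.
def mapping(f):
    return lambda step: lambda acc, x: step(acc, f(x))

xform = mapping(lambda x: x + 1)

# The same transformed logic drives two different sinks:
into_list = xform(lambda acc, x: acc + [x])   # accumulate into a list
into_sum = xform(lambda acc, x: acc + x)      # accumulate into a number

assert reduce(into_list, [1, 2, 3], []) == [2, 3, 4]
assert reduce(into_sum, [1, 2, 3], 0) == 9
```

In Clojure the "non-sequence" contexts include things like core.async
channels; the sketch above only shows the sequence side of that generality.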

------
moomin
If you ignore the arities, ignore the internal state, and correctly observe
the unwritten rules, yes transducers act like function composition. I looked
into this here: [http://www.colourcoding.net/blog/archive/2014/08/16/lets-write-a-transducer.aspx](http://www.colourcoding.net/blog/archive/2014/08/16/lets-write-a-transducer.aspx)

I think Rich's innovation here is extremely clever and quite subtle, but it's
pretty Clojure-specific, both in terms of the problem it solves and the way it
solves it.

~~~
lkrubner
The overall pattern of map-reduce is possible in most languages, certainly any
language that has closures. And the idea of composing multiple reducing
functions together is something that comes up in many languages. In a language
without closures, you could do something similar by walking an accumulating
object through multiple loops, but that is awkward and ugly. In Javascript,
you could certainly do something like this through the clever use of multiple
closures, composing the closures together. But transducers formalize this as
an idiomatic part of Clojure.
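
The closure-composition idea is sketched here in Python rather than
Javascript (the helper name is hypothetical): the per-element logic of
several stages is fused into one closure, so a single pass replaces one loop
per stage.

```python
from functools import reduce

# Fuse "double each element" and "keep the big ones" into one closure over
# the downstream emit function, instead of two separate loops with an
# intermediate list between them.
def double_then_keep_big(emit):
    def step(acc, x):
        y = x * 2
        return emit(acc, y) if y > 4 else acc
    return step

result = reduce(double_then_keep_big(lambda acc, x: acc + [x]),
                [1, 2, 3, 4, 5], [])
# single pass: result == [6, 8, 10]
```

The awkward alternative the comment alludes to would be two loops, each
threading the accumulator through and materializing a list in between.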

~~~
Tuna-Fish
In languages with more leeway on execution order, this problem can be made to
go away. For example, Haskell's stream fusion combines back-to-back calls to
list processing functions into a single iteration over the list.

~~~
seanmcdirmid
Lazy evaluation can also be encoded fairly easily in strict languages using
constructs like lazy stream abstractions.
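
As a sketch of that encoding in a strict language, Python generators give
pull-driven, fusion-like pipelines: each stage pulls one element at a time,
so no intermediate list is ever materialized.

```python
# Each stage is a lazy stream: it pulls one element at a time on demand.
def doubled(xs):
    for x in xs:
        yield x * 2

def big(xs):
    for x in xs:
        if x > 4:
            yield x

# The two stages are fused into one pull-driven pass over the source;
# nothing is forced until the consumer asks for elements.
pipeline = big(doubled(range(1, 6)))
result = list(pipeline)
# result == [6, 8, 10]
```

This is a library-level encoding of the behavior; Haskell's stream fusion,
by contrast, is a compiler optimization that rewrites the intermediate
structure away automatically.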

~~~
jeremyjh
Uh, sure but does any other language have a compiler that actually implements
stream fusion?

~~~
gazarsgo
Not in the compiler, but enjoy a Microsoft Research paper about Steno, a C#
library that claims to be superior to stream fusion (section 8.2):
[http://research.microsoft.com/pubs/173946/paper-pldi.pdf](http://research.microsoft.com/pubs/173946/paper-pldi.pdf)

In case anyone is unaware, LINQ lets you write queries as a series of
function calls, which it represents as a series of Expression objects forming
an in-memory AST.

~~~
seanmcdirmid
There is also the dynamic language run-time (DLR), which as a generalization
of LINQ (to support statements) is pretty powerful when writing up these tools
on .NET. I use it often in my work.

------
dschiptsov
A subset of higher-order functions is more fundamental than operations
defined for the whole set?)

And Clojure's are more fundamental than other languages'?

------
anon4
What is the difference (or what is gained) from transducing a reducer over
mapping (filtering, etc) a list and reducing it? Is it a clojure-specific
optimisation?

~~~
tel
It avoids intermediate structure and enables more sources and sinks to work.
For instance, if you build a reducer, you're basically adjoining a "reducible"
and a "reduction function" and then transforming the reducer by transforming
that reduction function.

This already avoids the creation of intermediate structure since you just keep
transforming the reduction function, but you have this sort of useless
"reducible" thing attached. Mostly, the trouble is that you were afflicted by
the kingdom of nouns---you don't really need a structure to think of first
class objects.

Instead, you can just consider the various ways of transforming reduction
functions. They all compose as (reverse) functions (you can see them as a
category) and you can take your resultant "transducer" and apply it to a
source and sink structure to map out of the source and into the sink.
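
Concretely (a Python sketch of the comment's phrasing, with invented names):
the transducer is nothing but the reduction-function transformer, and only at
the very end do you pick a source and a sink.

```python
from functools import reduce

# Transducers: transformers of reduction functions, mentioning no collection.
def mapping(f):
    return lambda step: lambda acc, x: step(acc, f(x))

def filtering(pred):
    return lambda step: lambda acc, x: step(acc, x) if pred(x) else acc

# "Reverse" composition: data flows left to right (filter, then stringify).
xform = lambda step: filtering(lambda x: x % 2 == 0)(mapping(str)(step))

# Only now choose a source and a sink; no intermediate collection is built,
# and there is no useless "reducible" noun carried along the way.
def transduce(xform, sink_step, init, source):
    return reduce(xform(sink_step), source, init)

as_list = transduce(xform, lambda acc, x: acc + [x], [], range(6))
as_str = transduce(xform, lambda acc, x: acc + x, "", range(6))
# as_list == ['0', '2', '4'], as_str == '024'
```

The point of the sketch: `xform` is a first-class value you can hand to any
source/sink pair, which is exactly the decoupling the comment describes.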

------
rebcabin
In addition to map transducers (which are 1-to-1) and filter transducers
(which are many-to-1), flatMap transducers (which are 1-to-many) should be
fundamental.

~~~
tel
flatMap is the fundamental transducer (of a particular model). To be clear,
the function a -> [b] subsumes mapping and filtering---if [b] is always a
single element then a -> [b] is a map, if [b] is always either 0-or-1 elements
then a -> [b] is a filter (possibly adjoined to a map).
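
That subsumption is easy to make concrete in Python (the name `catmapping`
is mine, echoing Clojure's `mapcat`): a flatMap-style transducer built from a
function of type a -> [b] recovers both mapping and filtering as special
cases.

```python
from functools import reduce

# catmapping(f) is the flatMap-style transducer: f returns a list of outputs
# per input, and each output is fed to the downstream step function.
def catmapping(f):
    def transducer(step):
        def new_step(acc, x):
            for y in f(x):
                acc = step(acc, y)
            return acc
        return new_step
    return transducer

# Map as flatMap: always exactly one output element.
mapping = lambda f: catmapping(lambda x: [f(x)])
# Filter as flatMap: zero or one output elements.
filtering = lambda pred: catmapping(lambda x: [x] if pred(x) else [])

append = lambda acc, x: acc + [x]
squares = reduce(mapping(lambda x: x * x)(append), [1, 2, 3], [])
odds = reduce(filtering(lambda x: x % 2 == 1)(append), [1, 2, 3], [])
# squares == [1, 4, 9], odds == [1, 3]
```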

