

The Y Combinator, or How to Succeed at Recursion Without Really Recursing - edw519
http://mvanier.livejournal.com/2897.html

======
ced
Math envy.

Why do hackers focus on it so much? It has only very remote relevance to
normal programming.

The beautiful idea I see behind the Y combinator is that of formalism. Once
you describe something in mathematical terms, magic can happen. But this would
be much better illustrated by a good introduction to analysis or algebra. Even
number theory has some cool results, and it's easier to grasp.

In my second course at university, the teacher proved by contradiction that 1
>= 0:

    
    
       Assume 1 is negative, 1 < 0
       Multiply both sides by 1, so that 1*1 > 0*1 (multiplication by a negative flips the inequality)
       1 being an identity, this implies 1 > 0, contradiction
       

That was one of the few Wow moments I had in my education. Short and to the
point.

(that proof above is not really satisfactory, especially without the context
and the axioms, but it served its purpose well five years ago)

~~~
silentbicycle
1\. Conceptual elegance. It seems to me that recursion, and the complexity
that emerges from using it, fascinates many people. GEB comes to mind.

2\. Directly dealing with these mathematical formalisms (the Y combinator, SKI
combinators (<http://en.wikipedia.org/wiki/SKI_combinator_calculus>), etc.)
can become really important in programs such as interpreters.

~~~
ced
1\. I was writing and rewriting a rebuttal, but you're right... There is
something cool about achieving self-reference without a name.

2\. Really? If you feel like elaborating / providing a link, I'd appreciate it.

~~~
silentbicycle
For example, recently I wrote a simple interpreter for a stack-based / postfix
language kind of like Forth and Joy
(<http://en.wikipedia.org/wiki/Joy_programming_language>) because I'm trying
to get my mind around that language family*. It's still very basic and
doesn't have much more built into it than the core Lisp PG discusses in The
Roots of Lisp (<http://paulgraham.com/rootsoflisp.html>), but I really
noticed how the specific interplay of those sorts of forms ends up setting much
of the subsequent language. Lambda calculus, at least the passing acquaintance
I have with it, has given me a conceptual vocabulary with which to think about
evaluation, scoping, and which aspects of a language are redundant, can
themselves be used to derive others, etc. I guess #2 could have been better
said, "Understanding these mathematical formalisms ... can be really helpful
when ...".

As a more concrete example, I originally had words (functions, so to speak)
defined Forth-style, e.g.

    
    
      : avg + 2 / ;
    

Which roughly means, "start recording (:) for a function called 'avg' in which
we pop two numbers from the stack, add them, and push the result (+), then
push the number 2, pop the two numbers on the stack and divide them (so
sum/2), then stop recording (;)". Then I realized that once I added syntax to
the lexer for lists ("blocks"), which are pushed onto the stack as one unit, I
could instead use a list of symbols like quoted code, and define the same word
by

    
    
      [+ 2 /] "avg" def
    

which pretty literally means, "pop a function name off the stack, pop a block
of code off the stack, and associate them in the local environment." The
interpreter no longer needs any state for whether it is in recording or
executing mode, or to treat certain words (e.g. ";") specially so they can
interrupt recording, etc. (Adding code-as-data really adds to a
language's capacity for abstraction.) Similarly, "[+ 2 /] do" works for an
anonymous function of sorts. Much of the rest of the language is contained by
how some of the core idioms for things like evaluation and definitions work,
and formalisms like the lambda calculus and combinatory logic seem to be
frameworks for studying those processes explicitly.
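To make the blocks-plus-def idea concrete, here is a toy sketch of that kind of evaluator, translated into Python (my actual interpreter is in OCaml; the names here are made up for illustration, and only `+`, `/`, `def`, and `do` are wired up):

```python
# Toy postfix evaluator: blocks are plain Python lists pushed onto
# the stack as one unit; `def` binds a name to a block, `do` runs an
# anonymous block, and defined words execute their block when seen.
def run(program, stack=None, env=None):
    stack = [] if stack is None else stack
    env = {} if env is None else env
    for tok in program:
        if isinstance(tok, list):           # a block: push unevaluated
            stack.append(tok)
        elif tok == "+":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif tok == "/":
            b, a = stack.pop(), stack.pop()
            stack.append(a / b)
        elif tok == "def":                  # pop name, pop block, bind
            name, block = stack.pop(), stack.pop()
            env[name] = block
        elif tok == "do":                   # run an anonymous block
            run(stack.pop(), stack, env)
        elif isinstance(tok, str) and tok in env:
            run(env[tok], stack, env)       # execute a defined word
        else:
            stack.append(tok)               # literals: numbers, names
    return stack

# [+ 2 /] "avg" def   then   3 5 avg
stack = run([["+", 2, "/"], "avg", "def", 3, 5, "avg"])
```

With that, `[+ 2 /] "avg" def 3 5 avg` leaves the average on the stack, and `3 5 [+ 2 /] do` does the same anonymously.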

By the way, if you'd like to write a quick interpreter as a learning
experience, Forth has my recommendation. It's small and (aggressively) simple,
but still supports an impressive amount of abstraction. The parser for it is
even simpler than for a Lisp, and that's really saying something. (Here is a
nice example of one written in heavily documented i386 assembly:
<http://www.annexia.org/_file/jonesforth.s.txt> )

Second, OCaml seems to be particularly well-suited for writing interpreters
(and compilers, I'm sure), though in this case it was _serious_ overkill.

* I'm kind of whining here, but AFAICT it isn't possible to install a working version of Factor (<http://factorcode.org/>) on OpenBSD (or, probably, any Unix) without downgrading to a version of the OS using the same dynamic libraries, installing their binary version, and then progressively updating both in tandem until it's running with current libraries. Maybe I just don't have enough experience bootstrapping new language environments, but that was a disappointment -- it looks like a cool language.

(Also, 1 isn't really an argument so much as an observation.)

~~~
ced
Thanks a lot.

~~~
silentbicycle
One more --

Also see chapter 22 in Shriram Krishnamurthi's _Programming Languages:
Application and Interpretation_.
([http://www.cs.brown.edu/~sk/Publications/Books/ProgLangs/200...](http://www.cs.brown.edu/~sk/Publications/Books/ProgLangs/2007-04-26/))

The goal of the chapter is essentially, "Now that we've built our own small
programming language on top of Scheme, how can we reduce the number of Scheme
elements that it relies on directly?" Among other things, he shows how to use
the Y combinator to allow recursive function definitions without having to use
Scheme's define.
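The core trick is small enough to show inline. This is the applicative-order variant (often called Z, since the classic Y diverges under strict evaluation), sketched in Python rather than the book's Scheme:

```python
# Applicative-order Y (the "Z" combinator): the extra (lambda v: ...)
# wrapper delays the self-application x(x) so a strict language
# doesn't loop forever while building the function.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(
              lambda x: f(lambda v: x(x)(v)))

# A factorial with no self-referential definition anywhere:
fact = Z(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))
print(fact(5))  # 120
```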

------
luigi
"Recursing" and "recurse" aren't words. Recursion comes from the word recur. A
function recurs, not recurses. Got taught that during the infamous CS1321
Scheme course at Georgia Tech in 2001.

~~~
scott_s
They are now. Words become a part of the language because enough people use
them.

~~~
13ren
"English is what English-speaking people speak"

~~~
silentbicycle
Nicely played. :)

------
databus
God, if I see another code example using Factorial I am going to hurl.

~~~
silentbicycle
What are some other simple examples of recursion, though?

Quicksort would be pretty good for this, but the straightforward recursive
version has several edge cases that lead to bad performance.

Getting the length of a linked list is probably a better example:

    
    
      (* ocaml *)
      let rec length l = match l with
         | []   -> 0
         | _::t -> 1 + length t
    
      ; scheme
      (define (length l)
         (if (null? l)
             0
             (+ 1 (length (cdr l)))))
    

Neither of these is tail-recursive, of course, but that's another post
entirely.
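For completeness, the accumulator-passing shape looks like this. (Sketching in Python with a hypothetical (head, tail) pair encoding of lists; Python itself won't reuse the stack frame, but this is the form an OCaml or Scheme compiler turns into a loop.)

```python
# length with an accumulator: the recursive call is the last thing
# the function does, so nothing is left pending on the stack.
# Lists are encoded as nested pairs: (1, (2, (3, None))); None = nil.
def length(l, acc=0):
    if l is None:
        return acc
    _head, tail = l
    return length(tail, acc + 1)  # tail call

print(length((1, (2, (3, None)))))  # 3
```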

Any other ideas?

~~~
13ren
Does performance matter when seeking "simple" examples?

I vote for quicksort - not because it's quick, but because its implementation
has the same form as its proof. It's simple if you use the Erlang-style
version.
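For reference, the version I mean looks something like this (a Python sketch of the Erlang/Haskell-style two-filter quicksort; fine as a teaching example, though it copies lists and degrades to O(n^2) on already-sorted input):

```python
# Quicksort the way its proof reads: everything below the pivot,
# then the pivot, then everything at or above it, each side sorted
# by the same argument recursively.
def qsort(xs):
    if not xs:
        return []
    pivot, rest = xs[0], xs[1:]
    return (qsort([x for x in rest if x < pivot])
            + [pivot]
            + qsort([x for x in rest if x >= pivot]))

print(qsort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```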

