An intriguing discovery of Computer Science: The Y combinator [video] (youtube.com)
137 points by lgm__ on Aug 28, 2022 | 38 comments



For a fun introduction to this field, check out Raymond Smullyan's To Mock a Mockingbird. If you have kids, this and his other books are a great way to let them get interested in puzzle solving and math without realizing that they're doing math (he cites such an anecdote in another book).


That’s easy though:

  Mockingbird mockMockingbird = Mockito.mock(Mockingbird.class);
  MockingbirdMocker mockingbirdMocker = new MockingbirdMocker();

  mockingbirdMocker.mock(mockMockingbird);

  verify(mockMockingbird).mock();


   when(mockingBird).dontSing(thenReturn(diamondRing));


Great suggestion, thanks!!! I'm about to teach simply-typed lambda calculus and fixed-point combinators in my class. This book is a great suggestion (as are all books by Smullyan). A shame that the pace of courses today and bite-size youtubization have killed the joy of sitting around with a single puzzle in a cozy nook or on a bus.


Lately I've really gotten into reading Dover math books. They're written with such a love of the underlying topic without much concern for practical applications (most predate modern computers). For example, I read a book about graph theory before delving into the Neo4j documentation, and the code makes so much more sense.


Thank you, I'd never heard of Smullyan; I'll check that out.


Here are a few additional written articles on what a Y combinator is.

- Stack Overflow, "What is a Y-combinator?": https://stackoverflow.com/questions/93526/what-is-a-y-combin...

- Wikipedia, "Fixed-point combinator," at the Y combinator subsection: https://en.wikipedia.org/wiki/Fixed-point_combinator#Y_combi...

- For a more academic source, an article from a Johns Hopkins computer science professor: https://www.cs.jhu.edu/~jason/465/readings/lambdacalc.html


Hence the name Y Combinator, a startup that generates other startups.


And never stops. (Which would be the Z-Combinator!)


Which is apparently a thing! https://www.zcombinator.org/


I'll be honest, I didn't know what the Y combinator is. It was one of those things I just took for granted. Mind blown.


A good video that made it click for me is the following, which derives the Y combinator from first principles:

https://m.youtube.com/watch?v=FITJMJjASUs


The author skips the (standard) construction of numbers in lambda calculus, the so-called Church numerals C0,C1,C2,... which iterate a given function a certain number of times on a given argument. So

    C0 f x => x
    C1 f x => f x
    C2 f x => f (f x)
Curiously, for these numbers, you can define the factorial function without any use of recursion, instead using their built-in iterative behaviour. Written in the standard notation for lambda calculus, it is

    factorial = λn.λf.n(λf.λn.n(f(λf.λx.n f(f x))))(λx.f)(λx.x)
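If you want to play with these, here is a rough Python transliteration of the numerals (the names C0, C1, C2 and the to_int decoder are mine, just for illustration):

    # Church numerals in Python: Cn(f)(x) applies f to x, n times
    C0 = lambda f: lambda x: x
    C1 = lambda f: lambda x: f(x)
    C2 = lambda f: lambda x: f(f(x))

    # Decode a Church numeral by iterating +1 starting from 0
    to_int = lambda n: n(lambda k: k + 1)(0)
    print(to_int(C2))  # 2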


What I don't get is: when does it decrement and check whether we're at the base case?


The definition of factorial (fac for short) can be broken down as

    succ = \n\f\x. n f (f x)    -- the successor function for Church numerals
    F    = \c\n. n (c (succ n))
    fac  = \n\f. n F (\x.f) id  -- id = \x.x, eta-equivalent to Church 1
As you can see there is no use of a decrement function. Instead, numbers 1..n are recreated by the iterated successor function.

We can follow the reduction of fac 3 as

    0 = \f\x. x
    1 = \f\x. f x
    2 = \f\x. f(f x)
    3 = \f\x. f(f(f x))
    fac 3 = \f. F (F (F (\x.f))) 1
          = \f. 1 (F (F (\x.f))  2)
          = \f. 1 (2 (F (\x.f)   3))
          = \f. 1 (2 (3 ((\x.f)  4)))
          = \f. 1 (2 (3 f))
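Transliterated into Python (a sketch; three and to_int are my names for the Church numeral 3 and a decoder), the definitions can be checked directly:

    # Successor for Church numerals
    succ = lambda n: lambda f: lambda x: n(f)(f(x))
    # One step: apply the numeral n to the continuation at succ n
    # (iterating a composition is what acts as multiplication here)
    F = lambda c: lambda n: n(c(succ(n)))
    # Factorial with no recursion: iterate F, n times
    fac = lambda n: lambda f: n(F)(lambda x: f)(lambda x: x)

    three = lambda f: lambda x: f(f(f(x)))
    to_int = lambda n: n(lambda k: k + 1)(0)
    print(to_int(fac(three)))  # 6, matching the reduction above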


I have to plug my favorite "explanation" of the Y combinator, which is called "Lawvere's fixed point theorem".

In a cartesian closed category, if there is a function p : A -> (A -> B) that has a left-inverse, i.e., e : (A -> B) -> A with p o e = id_(A -> B) then B has a fixed point combinator Y : (B -> B) -> B, i.e., every function from B to itself has a fixed point: Y(f) = f(Y(f)).

The proof is like the Y combinator in untyped lambda calculus, but with extra ps and es. This is because one way to understand untyped lambda calculus is that you are working with a single type D that satisfies D = (D -> D) (so p,e are identities).
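Spelling the proof out in lambda notation (q is my name for the diagonal term; p and e are as above):

    q   = \a. f (p a a)         -- the diagonal map, q : A -> B
    Y f = q (e q)
        = f (p (e q) (e q))     -- unfold q at a = e q
        = f (q (e q))           -- p (e q) = q, since p o e = id
        = f (Y f)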

I like this because it makes the connection between the Y combinator and diagonalization arguments like Cantor's theorem explicit.

For instance, Cantor's theorem says that the powerset of a set always has higher cardinality. We can reformulate this as saying that there is no onto function p : X -> (X -> 2), because a subset of X is the same as a function from X to booleans (2 = {true, false}). An onto function has a left inverse (using the axiom of choice), and so if we have an onto function X -> (X -> 2), then every function (2 -> 2) has a fixed point, but this isn't true! If we took Y(not), we would construct a boolean b : 2 such that b = not(b), which is impossible, so we have a contradiction. I found this helpful in understanding why the "not" appears in just the right place in the classical proof.
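Unwinding Y(not) by hand recovers the textbook diagonal argument (x0 is my name for a preimage, which exists because p is onto):

    d    = \x. not (p x x)      -- the diagonal subset, d : X -> 2
    d    = p x0                 -- for some x0, since p is onto
    d x0 = not (p x0 x0)        -- by definition of d
         = not (d x0)           -- substitute d = p x0: contradiction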


> if there is a function p : A -> (A -> B) that has a left-inverse, i.e., e : (A -> B) -> A with p o e = id_(A -> B) then B has a fixed point combinator

I think you meant to write "right inverse" here.

Agreed that Lawvere's theorem is very cool. It helps that Cartesian closed categories are basically just categories where you can do typed lambda calculus, so the underlying link is already very close.


I admit I hadn't thought this was the origin of YC's name. I didn't research it; I just thought maybe "Y" was actually supposed to be "why" and "combinator" had something to do with _combining synergies_ or something. You know, something cryptic that SV people think is cool.

This, on the other hand, is pretty clever with the recursion reference. Sometimes I forget that YC was a wild idea at its time too, aiming to realize other wild ideas. Thanks for sharing.


In case you didn't know, PG is a big proponent of Lisp, and I believe he had a big role in naming YC as YC. His well-known work (I guess not for the current generation) is creating the very first (kinda) automated ML-based spam filter. I believe he implemented it in Lisp. It's a beautiful piece of work.

http://www.paulgraham.com/spam.html


HN is also built with Arc, Paul Graham's Lisp dialect.


This piqued my interest, so I read the article below, which explains things very well, although I'm only about 70% of the way to fully comprehending it:

https://lucasfcosta.com/2018/05/20/Y-The-Most-Beautiful-Idea...


An explanation by Gerald Sussman himself: https://www.youtube.com/watch?v=0m6hoOelZH8#t=1h12m30s


That is quite the first video of a channel!


I never understood this obsession with lambda calculus or the Y combinator. If most computers have an instruction set that is rich and has support for loops and types and other things, what use is it to waste your time understanding this? It's like reading poetry, perhaps? There is no goal but to come up with puzzles so you can show a cool solution? Why? Why?


The Y combinator (or any other fixed-point combinator) shows that the lambda calculus can express general recursion, which is key to its Turing-completeness. The lambda calculus in itself is an extremely simple declarative language. Its simplicity and Turing-completeness make it a perfect target for programming language research. I think it's fair to say that most modern programming language ideas were demonstrated as an extension of the lambda calculus first. So without the Y combinator, you might not have Rust's traits, Python's yield, or C#'s LINQ.
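To see the trick concretely, here is a sketch in Python. Python evaluates eagerly, so this is the Z combinator, the eta-expanded variant of Y that works in strict languages:

    # Z combinator: a fixed-point combinator for strict evaluation
    Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

    # A non-recursive "step" function; Z ties the recursive knot for us
    fact_step = lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1)
    factorial = Z(fact_step)
    print(factorial(5))  # 120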


>I never understood this obsession with lambda calculus or the Y combinator.

It's a very foundational thing--at the foundation of theoretical computer science and of a lot of programming languages. Understanding it pays a lot of dividends over your life--even if you don't use it directly. By now it's a mandatory course in many CS programs anyway--that tells you how foundational it is.

>If most computers have an instruction set that is rich and has support for loops and types and other things, what use is it to waste your time understanding this?

Because lambda calculus is much simpler than that and still can do everything. As an engineer you strive for simplicity, right?

Lambda calculus is so simple, you can learn it (all of it!) in a week. You will be able to compute everything that can be computed--WITHOUT having jumps, loops, types, instruction sets, numbers, booleans, lists and so on at the foundation.

The foundation is function definition and function calls. The argument is a function, and so is the result. It's functions all the way down. You might think it's functions and numbers? Nope. Booleans? Nope. Lists? Nope. Loops? Nope. It's JUST functions at the foundation.
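To make "it's just functions" concrete, here is a Python sketch of the standard Church encodings of booleans and pairs (the names are mine):

    # Booleans as two-argument selectors; "if c then t else e" is just c(t)(e)
    true  = lambda a: lambda b: a
    false = lambda a: lambda b: b
    print(true("then-branch")("else-branch"))  # then-branch

    # A pair is a function that remembers its parts by closing over them
    pair = lambda a: lambda b: lambda sel: sel(a)(b)
    fst  = lambda p: p(true)
    snd  = lambda p: p(false)
    p = pair(1)(2)
    print(fst(p), snd(p))  # 1 2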

This makes it much easier to build programming languages and/or analyze programs because you have only very few primitives that the end user (programmer) can still use to build whatever they want! Get this, you can remove almost everything and still it works perfectly. The end result will look mostly like any other source code you are used to.

The most interesting thing about the Y combinator is that it's NOT A PRIMITIVE. That is so weird. You have a language that has recursion nowhere in the primitives, and it's able to recurse.


> Because lambda calculus is much simpler than that and still can do everything. As an engineer you strive for simplicity, right?

Brainf_ck is just as simple, and it's also Turing-complete. You can't really say one way of expressing computation is more foundational than another, since they're all computationally equivalent. BF might not be as elegant as lambda calculus, but if your claim were true, I don't see why it isn't included as an alternative in the CS curriculum.

The real reason, IMHO, is that computer science historically evolved from disciplines that studied logic systems, and the lambda calculus was a system proposed by Alonzo Church. There's a lot of worship of formal systems within the CS community, and in a self-referential way that makes them important, because everyone else worth their salt is expected to know this stuff.


It's part of theoretical computer science. You asking this is the same as someone in the 15th century asking why bother with the square root of negative 1, and what it could possibly be used for.

Lambda calculus can be considered fundamental to how computations can be represented via simple building blocks--it's in the same category as a Turing machine. If a problem can be represented as a lambda calculus expression, you know that it is theoretically solvable--or if you prove that it can't be, then you know the problem is unsolvable (like the halting problem).


> what use is it to waste your time understanding this? It's like reading poetry, perhaps?

You answered your own question :-).

Does one need to learn/understand λ-calculus to be good at their day-to-day programming job? Not at all. However, if you like to appreciate the beauty of these purely abstract structures and concepts, then yes, you need to learn it.

I hold a similar opinion of abstract paintings, or, say, wine tasting. But that doesn't mean they are useless. To be human is to experience such beauty, or to push our reasoning abilities.


Those computers with rich instruction sets are built up of smaller units made of smaller units, down to individual transistors (typically 2 kinds). It's pretty cool that you can build something so complex from a couple types of components and wires.

Lambda calculus is the analogous building block for algorithms. With just a few primitives (including the Y combinator) you can have a powerful system upon which you can build any (non-quantum) algorithm at all.

I find it deeply pleasing that all our fancy systems can be broken down to such simple components, and I'm glad to belong to the profession of people who are good at this task.


But the cool part is that the YC is not actually a primitive. Primitives in lambda calculus are essentially just variables, function definition, and application. At a first glance, you wouldn't tell that you could express infinite computation from such rudimentary tools, but the YC shows that you actually can.
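A quick way to see that (a Python rendering; "omega" is the standard name for this term in the literature):

    # Self-application alone already yields unbounded computation.
    # In the lambda calculus, omega reduces to itself forever; in Python,
    # evaluating w(w) recurses until it hits the interpreter's stack limit.
    w = lambda x: x(x)
    # w(w)  # uncomment to watch it raise RecursionError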


It gives you a new lens through which to view your work. Not strictly necessary, but your understanding of the world will grow. Not unlike poetry.


This was my attempt to understand and implement the Y combinator a while back: https://gist.github.com/bluepnume/6973b6efb7d48edcbf19


It requires some thinking, but the video gives a complete explanation, and is accessible to anybody with a strong programming background.


In all the recursion explanation, there is a minor edit I noticed:

At timestamp 10:16 the presenter originally said "g", but then edited the video later to overlay the audio of him saying "f" a second later :)


I am amazed you noticed this. It is correct. I noticed my mistake, didn't want to spend an hour putting back my camera and light at the same location, and proceeded to clumsily edit the audio :)


A Y-combinator was proposed to be added to the ISO C++ Standard Library. It was not adopted, as everyone agreed it would not be useful. If anyone ever did find a use, they could copy its proposed implementation into their own code. But everybody knows that will not happen.


This site is honestly one of the worst sites on the web. It's dumber than slashdot at its stupidest, but with a big dash of arrogance combined with Dunning-Kruger hall of mirrors effect. At its worst (which is frequent), it'll give the daily stormer a run for its money.

"Hacker" (lol) News is pure, unadulterated, proto-fascist garbage for narcissistic jerks.



