
Functional Bits: Lambda-calculus based algorithmic information theory [pdf] - theaeolist
https://tromp.github.io/cl/LC.pdf
======
tromp
Author here. The paper is currently in a state of flux, as I started adding
material about the graphical notation, but didn't get very far before my
attention drifted to other topics. So please have a look at the following page
as well

[http://tromp.github.io/cl/cl.html](http://tromp.github.io/cl/cl.html)

which links to an explanation of the graphical notation, to my corresponding
IOCCC entry from 2012, and to a Wikipedia page on Binary Lambda Calculus that
has since been deleted.

~~~
joe_the_user
Serious question.

Is lambda calculus a good match for algorithmic information theory? The
versions of AIT I've read involved either Turing machines, hand-waving about
computation, or recursive functions. Lambda calculus seems more complex and
thus harder to integrate with an AIT that mostly needs to know that strings
are programs, that not all programs terminate, that some programs can give a
substring of their string as output, and so forth.

~~~
tromp
In my view, lambda calculus is a perfect match for AIT, since the language is
both so simple and so expressive. This should be readily apparent from the
concise self-interpreter. The versions of AIT that you've seen based on TMs
probably didn't have any explicit constants, because determining them would be
much too cumbersome. Chaitin was the first to prove theorems in AIT with
actual programs, and he resorted to LISP. I'm positioning lambda calculus as
an even better choice.

------
carapace
If you like this, see also [1], as well as the programming languages Joy,
Iota, and Jot [2]. And maybe "Algorithmically probable mutations reproduce
aspects of evolution such as convergence rate, genetic memory, and modularity":
[https://arxiv.org/abs/1709.00268v8](https://arxiv.org/abs/1709.00268v8)

> In the context of his Metabiology programme, Gregory Chaitin, a founder of
> the theory of algorithmic information, introduced a theoretical
> computational model that evolves ‘organisms’ relative to their environment
> considerably faster than classical random mutation. While theoretically
> sound, the ideas had not been tested and further advancements were needed
> for their actual implementation. Here we follow an experimental approach
> heavily based on the theory that Chaitin himself helped found. We apply his
> ideas on evolution operating in software space on synthetic and biological
> examples and even if further investigation is needed this work represents
> the first step towards testing and advancing a sound algorithmic framework
> for biological evolution.

[1]
[https://wiki.haskell.org/Chaitin%27s_construction](https://wiki.haskell.org/Chaitin%27s_construction)

[2]
[https://www.nyu.edu/projects/barker/Iota/](https://www.nyu.edu/projects/barker/Iota/)

~~~
tree_of_item
That Arxiv link is fascinating, thanks for that.

~~~
carapace
Cheers.

------
motohagiography
As a layman, it is a personal aspiration to be able to understand this precise
paper. It's very appealing to be able to express and reason about the related
problems of complexity and edit distances using a generative(?) notation like
lambda calculus that you can hack on in accessible languages like Haskell.

I also like that the citations include The Emperor's New Mind; Gödel, Escher,
Bach; the Brainfuck homepage; and Haskell. The effects of these ideas on a
couple of generations of young minds are bearing fruit.

It feels like we're on the brink of gamifying a lot of important math.

------
salimmadjd
Is there a video lecture version of this paper that someone can recommend?

This is a great paper, but I burn so much mental energy trying to focus when I
read something vs. when I listen to a video lecture. So I end up retaining
more by watching, and even more when I also take some notes.

~~~
chriswarbo
Not sure about this specific paper, but I've been following John Tromp's work
on this for a while. Its current homepage seems to be on GitHub
[https://tromp.github.io/cl/cl.html](https://tromp.github.io/cl/cl.html)

The paper assumes knowledge of combinatory logic and lambda calculus, which
you could probably find videos about. This paper uses pretty standard
notation, so almost any video/course/book/blog/etc. should be OK. The main
thing this paper does is to define a way to encode such programs as a string
of bits.

Note that combinatory logic is one of the simplest Turing-complete programming
languages, but it's _so_ simple that it's basically unusable for anything more
elaborate than toy examples (the paper actually has a quote from Chaitin
saying this!). Once you're comfortable playing with little examples containing
a handful of symbols, that's basically all you really need.
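Playing with such little examples is easy to automate. Here's a minimal
sketch of a normal-order S/K reducer in Javascript (the term representation
and function names are choices made here, not anything from the paper):

```javascript
// A minimal normal-order reducer for S/K terms. Terms are represented
// (our choice) as: the strings 'S' and 'K', free variables like 'x',
// and two-element arrays [f, a] for application.
function step(t) {
  if (!Array.isArray(t)) return null;                  // atoms don't reduce
  // K a b -> a
  if (Array.isArray(t[0]) && t[0][0] === 'K') return t[0][1];
  // S a b c -> a c (b c)
  if (Array.isArray(t[0]) && Array.isArray(t[0][0]) && t[0][0][0] === 'S') {
    const a = t[0][0][1], b = t[0][1], c = t[1];
    return [[a, c], [b, c]];
  }
  const f = step(t[0]);                                // reduce the head first
  if (f !== null) return [f, t[1]];
  const x = step(t[1]);
  if (x !== null) return [t[0], x];
  return null;                                         // already in normal form
}

function normalize(t) {
  for (let n = step(t); n !== null; n = step(t)) t = n;
  return t;
}
```

For example, `normalize([[['S','K'],'K'],'x'])` reduces S K K x to `'x'`,
showing that S K K behaves as the identity.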

There's a nice book of mathematical puzzles
[https://en.wikipedia.org/wiki/To_Mock_a_Mockingbird](https://en.wikipedia.org/wiki/To_Mock_a_Mockingbird)
which is actually based on combinatory logic. This is why combinatory logic
terms are sometimes referred to as "birds", e.g. in this Haskell library:
[https://hackage.haskell.org/package/data-aviary-0.4.0/docs/Data-Aviary-Birds.html](https://hackage.haskell.org/package/data-aviary-0.4.0/docs/Data-Aviary-Birds.html)

If you're familiar with other programming languages, it's usually pretty easy
to implement combinatory logic and play with it. For example, here are 'S' and
'K' in Javascript:

    
    
        function k(x, y) { return x; }
    
        function s(x, y, z) { return x(z)(y(z)); }
    

This isn't _quite_ right, due to most languages using strict evaluation, but
the following Haskell is essentially correct:

    
    
        k x y = x
        s x y z = x z (y z)
    

There's also the esoteric language "unlambda" which implements these directly,
including the "monadic IO" that the paper mentions
[https://en.wikipedia.org/wiki/Unlambda](https://en.wikipedia.org/wiki/Unlambda)

Lambda calculus is more tricky than combinatory logic, since it contains
variables. It's also more widely used (e.g. as the basis for many functional
programming languages, like Haskell and Scheme), so there should be more
material available. In essence, when you see something like:

    
    
        λa b c.x y z
    

You can think of it as acting like this Javascript:

    
    
        function(a, b, c) {
          return x(y)(z);
        }
    
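As a concrete instance of that translation, the Church numeral 2, written
λf x.f (f x), becomes the following (a sketch; `two` and `inc` are names
chosen here, not from the paper):

```javascript
// λf x. f (f x) -- the Church numeral 2, applying f twice
function two(f) {
  return function (x) {
    return f(f(x));
  };
}

const inc = n => n + 1;
console.log(two(inc)(0)); // → 2
```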

When it comes to Kolmogorov complexity, maybe the Wikipedia page and its
citations will help (
[https://en.wikipedia.org/wiki/Kolmogorov_complexity](https://en.wikipedia.org/wiki/Kolmogorov_complexity)
)? Notice that all of the definitions, etc. on that page assume that we're
talking about some particular programming language or machine, but the
examples use pseudocode rather than a "real" programming language. This paper
is basically saying that we should use lambda calculus as that language, and
we can measure the size/length of a program by encoding it as binary according
to the method given.

~~~
sagebird
Re birds: the sage bird is the Y combinator.

------
tomxor
HN feels psychic sometimes! This paper happens to tie together a lot of the
foundations I was lacking (out of ignorance) that I need for a personal
research project. So happy to have found it :)

