The Lambda Calculus for Absolute Dummies (like myself):
A short introduction to the Lambda Calculus: http://www.cs.bham.ac.uk/~axj/pub/papers/lambda-calculus.pdf
Lambda Calculus - Step by Step:
He provides theory and code that start from the untyped lambda calculus and progress to the typed lambda calculus, System F, and higher-order systems. Note that the article above was written by Barendregt, who summarized higher-order systems in terms of the "lambda cube". The Pierce book provides enough theory and context to understand why that's important by the time he gets to it in chapter 30.
I did a 5-part lesson on my blog. It starts from zero knowledge and ends up with two Fibonacci calculators (one recursive, one iterative):
(There's also a theorycrafting post I did a bit later to build a rudimentary object-oriented programming paradigm in the Lambda calculus.)
For a good introduction, I would recommend J. Roger Hindley's introduction to the lambda calculus and combinatory logic entitled "Lambda-Calculus and Combinators: An Introduction". It includes some interesting info on SKI combinator calculus in addition to the lambda calculus.
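As a rough illustration of the SKI combinator calculus Hindley covers (this sketch is mine, not from the book), the three combinators transliterate directly into curried Python lambdas, and the classic derivation that S K K behaves like the identity falls right out:

```python
# The three SKI combinators as curried Python lambdas.
S = lambda x: lambda y: lambda z: x(z)(y(z))  # S x y z = x z (y z)
K = lambda x: lambda y: x                     # K x y = x
I = lambda x: x                               # I x = x

# The classic result: S K K reduces to I, so it returns its argument unchanged.
skk = S(K)(K)
print(skk(42))  # 42, same as I(42)
```

S K K x reduces as K x (K x) = x, which is why I is redundant and the whole calculus can be built from S and K alone.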
Back in the days before parallel programming, large-scale software engineering, networks, cloud computation, etc., it wasn't always clear that anything was wrong with the TM model. But given how computing works today, it's clear (to me at least) that LC is a much, much better substrate for designing computation.
The point of lambda calculus isn't to be able to program servers or whatever. The point is to be able to express ideas like beta-reduction and Y-combinators in a succinct way, and a way that's easy to manipulate with a pencil (or chalk). "Notation as a tool of thought", and all that. Turing machines are similar: they're excellent for modeling computation, but you wouldn't want to use one for actual programming. They're not made for that.
As someone else said: the fact that it is so remarkably close to several actual programming languages is pretty astonishing, and shows that there really is some deep brilliance to Church's original formulation. Turing is more famous for his machines, but it's interesting to note that while no programming language resembles a "raw" Turing machine, MANY languages resemble lambda calculus.
As I said: it's not made for actual programming. Doing actual programming in lambda calculus is like trying to use the Peano axioms to calculate how much money you need to pay in the supermarket.
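That said, the beta-reduction and Y-combinator ideas mentioned above are easy to demonstrate in any language with first-class lambdas. A quick sketch in Python, using the Z combinator (the eta-expanded cousin of Y) because Python's strict evaluation makes the textbook Y diverge:

```python
# Beta-reduction: applying a lambda substitutes the argument for the
# bound variable, so (\x. x + 1) 2 reduces to 3.
assert (lambda x: x + 1)(2) == 3

# Z combinator: Z = \f.(\x.f (\v.x x v)) (\x.f (\v.x x v))
# The inner \v. wrapper delays the self-application so strict
# evaluation doesn't loop forever.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Anonymous recursion: factorial without the function ever naming itself.
fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
print(fact(5))  # 120
```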
Lambda calculus gives you a set of rules for rewriting your program into an equivalent one (equivalent by the definition of the calculus), and for executing the program itself. That means most of the machinery for deciding when code should be executed, whether by the compiler or at run time, is already in the compiler.
That's something imperative languages struggle with.
zero = \x\y.x;
one = \x\y.y;
consz = \xs\p.p zero xs;
oneS = \cont\x\xs\p.p x (xs cont);
zeroS = \cont\x\xs.consz (xs cont);
primes = \N\p.p one (let SN = \r.oneS (N r); F = SN F in (primes SN) F);
main = consz (consz (primes zeroS))
\io.(\zero.(\one.(\consz.(\oneS.(\zeroS.(\primes.(\main.main) (consz (consz (primes zeroS)))) ((\f.(\x.x x) (\x.f (x x))) (\primes.\N.\p.p one ((\SN.(\F.primes SN F) ((\f.(\x.x x) (\x.f (x x))) (\F.SN F))) (\r.oneS (N r)))))) (\cont.\x.\xs.consz (xs cont))) (\cont.\x.\xs.\p.p x (xs cont))) (\xs.\p.p zero xs)) (\x.\y.y)) (\x.\y.x)
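For anyone trying to parse the definitions above: zero, one, and consz are just Church-style selector functions. A rough transliteration of those three into Python lambdas, purely as an illustration of how the pairing works (the recursive primes stream itself needs lazy evaluation, so it is not reproduced here):

```python
# Church-style selectors from the snippet above:
zero = lambda x: lambda y: x   # zero = \x\y.x picks its first argument
one  = lambda x: lambda y: y   # one  = \x\y.y picks its second argument

# consz prepends a `zero` bit to a stream: consz = \xs\p.p zero xs
consz = lambda xs: lambda p: p(zero)(xs)

# A pair built this way is inspected by handing it a selector.
head = lambda s: s(lambda bit: lambda rest: bit)
tail = lambda s: s(lambda bit: lambda rest: rest)

s = consz("rest-of-stream")
assert head(s) is zero
assert tail(s) == "rest-of-stream"
```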
Have you used Scheme?
"Scheme is a very simple language, much easier to implement than many other languages of comparable expressive power. This ease is attributable to the use of lambda calculus to derive much of the syntax of the language from more primitive forms."
Lambda calculus is the basis of functional programming, where a function is a portable black box containing instructions and returning a result. That is, a function is an abstraction such that a container of many instructions may be represented by a single reference. Lambda calculus also accounts for nested functions as a means of expressing layers of abstraction. One such style of layered functions is writing a function of several arguments as a chain of single-argument functions, each creating and returning the next, which is called currying. Currying allows for the creation of an abstraction whose computation is not ready at the time of creation, because it is still awaiting additional input via arguments, but is ready once fully applied.
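A minimal sketch of currying in Python (the add function here is a hypothetical example, not from the thread): the two-argument function is rebuilt as a chain of one-argument functions, and partial application yields a new function still waiting for the rest of its input:

```python
# Ordinary two-argument function.
def add(x, y):
    return x + y

# Its curried form: one argument at a time.
add_curried = lambda x: lambda y: x + y

# Partial application: the computation is "not ready" until the
# final argument arrives.
add2 = add_curried(2)   # a function still awaiting y
assert add2(3) == 5
assert add_curried(2)(3) == add(2, 3)
```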
In my own code I use nested functions to create a separation of concerns. This allows me to write code where portions are isolated from other portions but available by reference where needed. This separation provides a formal means of internal architecture and organization by which complexity is reduced.
A major consideration available from lambda calculus is lexical scope, a concept similar to abstraction layers but applied to reference resolution.
Why? Because it allows a function to create and return a new, distinct instance of a function skeleton coded in its inner scope. Part of that function instance is the values of the variables in its outer scope, captured as they were at the time the outer function returned.
That means it is easy to create new function instances. Function instances are "first-class citizens", as much a part of the machinery of such a language as any data structure.
In this sense the "inner function" coded inside an outer one is not a function at all. It is just an encoding, a set of instructions for creating a new function; those instructions are executed to produce a fresh function instance each time you execute the outer function.
Whereas the outermost function is a function by itself, since there can only ever be one instance of it.
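A small Python illustration of that point (make_counter is a hypothetical example): each call to the outer function re-executes the inner "encoding" and stamps out a fresh function instance closed over its own copy of the outer variables:

```python
def make_counter(start):
    # `count` lives in the outer function's lexical scope.
    count = start

    def counter():
        # This inner def is re-executed on every make_counter call,
        # producing a new function instance closed over its own `count`.
        nonlocal count
        count += 1
        return count

    return counter

a = make_counter(0)
b = make_counter(100)   # a distinct instance with its own captured scope
assert a() == 1
assert a() == 2
assert b() == 101       # b's state is independent of a's
```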
I'm regarding this subject as
Lambda calculus > lambda functions > serverless > "business logic I can define, upload, and let idle on the internet without worrying about managing the computers that actually make that business logic available worldwide behind a URL"
I know the HN crowd is skeptical about serverless, but as a non-developer I find it pretty mind-blowing what barriers AWS + lambda functions are taking down.