
Timeline for Logic, λ-Calculus, and Programming Language Theory (2012) [pdf] - adamnemecek
http://fm.csl.sri.com/SSFT15/Timeline.pages.pdf
======
pron
To those interested in the history of logic and computation, and the
relationship between them (and algebra) -- which, BTW, was not at all
accidental or surprising but completely by design -- I've compiled an
anthology of mostly primary sources starting with Aristotle and going up to
the 1950s: [https://pron.github.io/computation-logic-
algebra](https://pron.github.io/computation-logic-algebra)

~~~
lioeters
Over the course of a few weeks, I've been going through your "History of
Computation, Logic and Algebra". I just wanted to say: brilliant work, thank
you! The way you present the history is really insightful for understanding
the current of philosophy and ideas that led up to our modern conception and
use of computation.

------
linguae
I’ve taken some graduate courses in programming languages where I learned
about the lambda calculus and type theory, and I really appreciate this post
since it provides a nice timeline of how programming language theory has
evolved. One topic I’m very interested in is how computation can be expressed
in many different ways, and the tradeoffs behind the various expressions.

I apologize if this is off-topic, but I wonder whether there has been work on
algorithm analysis using the lambda calculus. The texts I’ve read on
algorithms describe them in procedural code, and the analysis then counts the
number of operations that get executed. However, I would imagine the
description would be different for non-procedural styles of programming. I’ve
heard of Okasaki's book “Purely Functional Data Structures” but I haven’t read
it yet. I’m wondering whether there has been work on algorithm analysis at the
lambda-calculus level.
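
The question above can be made concrete with a toy sketch (editorial
illustration, not from the thread, all names hypothetical): the standard cost
measure at the lambda-calculus level is the number of beta-reduction steps,
analogous to counting procedure calls in procedural code. Below, a minimal
normal-order reducer counts the steps needed to normalize 2 + 2 on Church
numerals. Capture-avoiding substitution is elided; this is safe here only
because the terms being substituted are closed or capture-free.

```python
# Terms: ("var", name) | ("lam", name, body) | ("app", fun, arg)

def subst(term, name, value):
    """Naive substitution (no capture avoidance; fine for closed values)."""
    kind = term[0]
    if kind == "var":
        return value if term[1] == name else term
    if kind == "lam":
        return term if term[1] == name else ("lam", term[1], subst(term[2], name, value))
    return ("app", subst(term[1], name, value), subst(term[2], name, value))

def step(term):
    """Perform one leftmost-outermost beta reduction, or return None."""
    kind = term[0]
    if kind == "app":
        fun, arg = term[1], term[2]
        if fun[0] == "lam":
            return subst(fun[2], fun[1], arg)
        reduced = step(fun)
        if reduced is not None:
            return ("app", reduced, arg)
        reduced = step(arg)
        if reduced is not None:
            return ("app", fun, reduced)
        return None
    if kind == "lam":
        reduced = step(term[2])
        return None if reduced is None else ("lam", term[1], reduced)
    return None  # a variable is already in normal form

def normalize(term):
    """Reduce to normal form, returning (normal_form, beta_steps)."""
    steps = 0
    while True:
        reduced = step(term)
        if reduced is None:
            return term, steps
        term, steps = reduced, steps + 1

# Church numeral 2 and Church addition
TWO = ("lam", "f", ("lam", "x",
       ("app", ("var", "f"), ("app", ("var", "f"), ("var", "x")))))
ADD = ("lam", "m", ("lam", "n", ("lam", "f", ("lam", "x",
       ("app", ("app", ("var", "m"), ("var", "f")),
        ("app", ("app", ("var", "n"), ("var", "f")), ("var", "x")))))))

normal_form, steps = normalize(("app", ("app", ADD, TWO), TWO))
```

Here `normal_form` is the Church numeral for 4, and `steps` is the reduction
count one would analyze, in the same way one analyzes call counts in
procedural code.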

~~~
pwm
(Quick and dirty post from my phone.) Purely functional data structures will
in general cost you an extra O(log n) factor because of the underlying tree
representation used for immutability (immutability is achieved by copying the
path in the tree from the root to the changed node, so old versions are
retained). Also, some data structures are very difficult to express in a
purely functional way.
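
The path copying described above can be sketched in a few lines (an editorial
illustration using a plain unbalanced binary search tree in Python, not code
from the thread): inserting rebuilds only the nodes on the search path, and
every untouched subtree is shared between the old and new versions.

```python
class Node:
    __slots__ = ("left", "value", "right")
    def __init__(self, left, value, right):
        self.left, self.value, self.right = left, value, right

def insert(node, x):
    """Return a new version of the tree containing x; the old version stays valid."""
    if node is None:
        return Node(None, x, None)
    if x < node.value:   # rebuild this node, share the right subtree
        return Node(insert(node.left, x), node.value, node.right)
    if x > node.value:   # rebuild this node, share the left subtree
        return Node(node.left, node.value, insert(node.right, x))
    return node          # already present: share the whole subtree

def to_list(node):
    """In-order traversal, i.e. the sorted contents."""
    if node is None:
        return []
    return to_list(node.left) + [node.value] + to_list(node.right)
```

Each insert copies one node per level, hence the O(log n) overhead for a
balanced tree; the old root still sees the old contents, which is exactly the
persistence that mutable structures give up.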

~~~
adjkant
True on the first part, but these days many applications don't have
efficiency requirements tight enough for that log(n) factor to matter. As
always, though, you choose the tools based on the job at hand, not the other
way around. In my experience, the appeal of the functional philosophy is
mostly the mental-model and consistency advantages its features provide.

~~~
pwm
Don’t get me wrong, you are preaching to the choir (I write Haskell at work);
I just thought it was worth mentioning to the OP. In practice, sophisticated
compilers like GHC produce insanely fast binaries and keep getting better.

------
platz
Philip Wadler showcases and comments on some of this history in his conference
talk "Propositions as Types"
[https://www.youtube.com/watch?v=IOiZatlZtGU](https://www.youtube.com/watch?v=IOiZatlZtGU)

~~~
proc0
This talk is responsible for my outrage whenever people say math is
invented.

------
dvt
Homotopy Type Theory doesn't really contribute much to logic (it's basically
just type theory with a few extra rules) or programming languages (computer-
assisted proofs are easier in HoTT, but that's about it).

HoTT is a foundations of mathematics thing (think ZF/ZFC), so it's a bit weird
to see it on the list. But it's kind of hot right now, so you see it just
about everywhere.

~~~
adamnemecek
It connects logic, topology and type systems. How is this not a big deal? It's
also computation friendly.

~~~
dvt
Logic and type systems have been connected since Frege (he coined "functional"
and "non-functional" types[1]). Computation friendly is interesting, but not
really revolutionary. Topology I'm not an expert in so I can't comment.

[1] On Function and Concept, 1891;
[http://fitelson.org/proseminar/frege_fac.pdf](http://fitelson.org/proseminar/frege_fac.pdf)

------
david_draco
It is missing Plankalkül by Zuse
([https://en.wikipedia.org/wiki/Plankalk%C3%BCl](https://en.wikipedia.org/wiki/Plankalk%C3%BCl)),
which heavily influenced ALGOL.

------
kjeetgill
This seems like as good a thread as any to ask my perennial question:

Is there some sort of logic or formalism that would let me algebraically
represent a program like bubble sort, and then algebraically simplify it into
quicksort?

------
User23
Any timeline of logic that fails to include Charles Sanders Peirce[1] is
woefully incomplete. He discovered the existential and universal quantifiers
and did seminal work on the calculus of relations, to name just a few items.
If you haven't heard of him, I strongly recommend at least reading the article
I linked. He's a great American genius who is virtually forgotten.

[1]
[https://en.wikipedia.org/wiki/Charles_Sanders_Peirce](https://en.wikipedia.org/wiki/Charles_Sanders_Peirce)

------
divan
Is there any good book that provides an overview and reasoning behind all
those concepts in this timeline?

~~~
yaseer
Not that I'm aware of.

If you produced a mind-map to show their inter-relatedness, it would be quite
a dense network.

A quick sketch of some of the nodes and edges might start to look like:

Logic <--> Gödel's theorems

Gödel's theorems <--> Proof theory

Proof theory <--> Linear logic

Computability <--> Turing machines

Computability <--> Lambda calculus

Combinatory logic <--> Lambda calculus

Type theory <--> Lambda calculus

Homotopy type theory <--> Type theory

Pi calculus <--> Process calculi / distributed systems

Category theory <--> Everything...

And that's far from complete. I imagine synthesising all the concepts into a
single book with a cohesive narrative would be quite hard without some deeper
unifying theory uniting them (category theory and HoTT may be the best
contenders).

In this case, a wiki might best capture the structure. Although I'd love to be
proved wrong and find such a book!

~~~
chewxy
ncatlab.org

