Books. Some books on my shelf covering these topics:
"Type Theory and Functional Programming" by Simon Thompson (PDF available via quick search)
First 120 pages of "Type Theory and Formal Proof" by Rob Nederpelt / Herman Geuvers
For some early stuff on dependent types (also covered in the above), check out "Intuitionistic Type Theory" by Per Martin-Löf
"Computation and Reasoning: A Type Theory for Computer Science" by Zhaohui Luo
"Type-Driven Development with Idris" by Edwin Brady
"Gentle Introduction to Dependent Types with Idris" by Boro Sitnikovski
"Types and Programming Languages" by Pierce
"Practical Foundations for Programming Languages" by Robert Harper (or any of his YouTube videos re/Type Theory)
"The Little Typer" by Friedman and Christiansen
"Modal Homotopy Type Theory: The Prospect of a New Logic for Philosophy" by David Corfield (just took this one to OBX North Carolina to enjoy on my beach vacation last week! Got some salt water in those pages. Also speaks to Category Theory.)
For Category Theory:
"The Joy of Abstraction" (2023) by Euginia Cheng
"Basic Category Theory for Computer Scientists" by Benjamin C. Pierce
"Conceptual Mathematics: A first introduction to categories" by Lawvere and Schanuel
"Categories for the Working Philosopher" by Landry
Okay, but then you'd show that 1/-1 = -1, 1/-0.5 = -2, 1/-0.25 = -4, and so on. The closer you get to 0 from the negative side, the more negative the result. So if you actually reached 0 in that progression, you'd have to conclude the result is negative infinity. And then you've shown that negative infinity is equal to infinity.
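A quick numeric sketch of that progression in TypeScript (the signed-zero behavior at the end is IEEE 754 floating point, not part of the parent's argument, but it's a fun aside: the float encoding remembers which side you approached from):

```typescript
// 1/x as x approaches 0 from the negative side: each step doubles the magnitude.
for (const x of [-1, -0.5, -0.25, -0.125, -0.0625]) {
  console.log(x, 1 / x); // -1, -2, -4, -8, -16: diverging toward -Infinity
}

// IEEE 754 has signed zeros, so division by zero picks the matching infinity:
console.log(1 / -0); // -Infinity
console.log(1 / 0);  // Infinity
```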
I do this too in some apps. Very near exactly how you show it.
How do you enforce required fields in the data model? By eliminating the possibility of null values, you eliminate the possibility of not-null constraints in the data model. I end up having to enforce these types of (non-complex) constraints outside the relational model. A missing row is not detectable in the same way as a null column value.
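To illustrate the point about missing rows (a hedged sketch with made-up names, not anyone's actual schema): once an optional attribute becomes its own relation, "required" can only be enforced by querying for the absence of a matching row, not by a declarative not-null constraint.

```typescript
// Without nulls, an "optional" column becomes a separate relation;
// presence of a row means the value exists.
type Person = { id: number; name: string };
type PersonEmail = { personId: number; email: string };

const people: Person[] = [{ id: 1, name: "Ada" }, { id: 2, name: "Alan" }];
const emails: PersonEmail[] = [{ personId: 1, email: "ada@example.com" }];

// Enforcing "email is required" now lives outside the schema:
// it is a query for violations, not a column constraint.
const missingEmail = people.filter(p => !emails.some(e => e.personId === p.id));
console.log(missingEmail.map(p => p.name)); // ["Alan"]
```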
It's been on my list of things to explore for a while... the relationship between defunctionalization, continuations, and "Algebraic Effects".
The reason... I like seeing how different languages handle asynchronous code vs synchronous code. F# has its way, C# has its own, JavaScript has async/await, colored functions, Project Loom, coroutines, React Fibers, etc. I'm intrigued that a language with built-in algebraic effects can do async without any other changes to the language (as if async and sync were the same).
Is there any relation to defunctionalization for this kind of stuff?
I don't think the article's SIGIO example made much sense -- if you have delimited continuations, you _don't_ need to defunctionalize them if they're staying in-process.
The Reynolds paper is one of my favorites. It was re-typeset in 1998 making it a lot easier to follow the math [0]. It's fun to compare it to a copy of the original from 1972 [1].
Philip Wadler had this to say about the paper: "Certain papers change your life. McCarthy's 'Recursive Functions of Symbolic Expressions and their Computation by Machine (Part I)' (1960) changed mine, and so did Landin's 'The Next 700 Programming Languages' (1966). And I remember the moment, halfway through my graduate career, when Guy Steele handed me Reynolds's 'Definitional Interpreters for Higher-Order Programming Languages' (1972)."[2] This is how I discovered it I believe.
The paper is exceedingly approachable. It was so well written that I immediately purchased a used copy of Reynolds's book on programming languages (which I did not have as easy a time with as the paper; it still remains unfinished on my bookshelf).
When they republished the paper in 1998, Reynolds wrote about how the paper came to be [3], and I believe about the discoveries of continuations [4].
I recently implemented Reynolds's meta-circular interpreter in TypeScript and serialized the abstract syntax into JSON. Coincidentally, a few days later I saw a post on HN about "executable JSON" or some such "programming language" that the creator was very proud of making into a product of sorts. (Found it... JSON Logic: https://news.ycombinator.com/item?id=27306263.) Cue Greenspun's tenth rule. I chuckled as I looked at the JSON Logic syntax, knowing that a little Reynolds interpreter with its AST serialized to JSON is infinitely more powerful and extensible (allowing higher-order functions and such). I highly recommend anyone reading this write the 50 or so lines of TypeScript necessary to implement Reynolds's meta-circular interpreter (EXTREMELY EASY and nearly identical line-for-line to the 1998 paper, only in TypeScript instead of lambda calculus).
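To give a flavor of what those 50-ish lines look like, here is a minimal sketch in the same spirit (my own naming and a deliberately tiny expression language, not the full interpreter from the 1998 paper): the AST is plain JSON-serializable data, and higher-order functions fall out for free.

```typescript
// A tagged-union AST; every node is plain JSON.
type Expr =
  | { tag: "lit"; value: number }
  | { tag: "var"; name: string }
  | { tag: "lam"; param: string; body: Expr }
  | { tag: "app"; fn: Expr; arg: Expr }
  | { tag: "add"; left: Expr; right: Expr };

type Value = number | ((v: Value) => Value);
type Env = (name: string) => Value;

const emptyEnv: Env = (name) => { throw new Error(`unbound variable: ${name}`); };
const extend = (env: Env, name: string, value: Value): Env =>
  (n) => (n === name ? value : env(n));

function evaluate(e: Expr, env: Env): Value {
  switch (e.tag) {
    case "lit": return e.value;
    case "var": return env(e.name);
    case "lam": // a lambda evaluates to a host-language closure
      return (v: Value) => evaluate(e.body, extend(env, e.param, v));
    case "app": {
      const fn = evaluate(e.fn, env);
      if (typeof fn !== "function") throw new Error("not a function");
      return fn(evaluate(e.arg, env));
    }
    case "add": {
      const l = evaluate(e.left, env), r = evaluate(e.right, env);
      if (typeof l !== "number" || typeof r !== "number") throw new Error("not numbers");
      return l + r;
    }
  }
}

// ((λx. λy. x + y) 2) 3 -- round-tripped through JSON to show the AST is pure data.
const prog: Expr = {
  tag: "app",
  fn: { tag: "app",
        fn: { tag: "lam", param: "x",
              body: { tag: "lam", param: "y",
                      body: { tag: "add",
                              left: { tag: "var", name: "x" },
                              right: { tag: "var", name: "y" } } } },
        arg: { tag: "lit", value: 2 } },
  arg: { tag: "lit", value: 3 },
};
console.log(evaluate(JSON.parse(JSON.stringify(prog)), emptyEnv)); // 5
```

Adding a new operator is one more variant in the union and one more case in the switch, which is the extensibility advantage over a fixed operator table.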
Please share the code! I'd love to learn about this space with the benefit of it, as I am not well versed in functional languages. Also, on the JSON Logic comment: I am using it in production to serialize the product of a GUI query builder and execute it on the backend. I don't know of a better solution; do you?
The continuous aggregates portion of this blog (along with the breakdown of Transition, Combine, and Final Functions) reminded me of "A Theory of Changes for Higher-Order Languages: Incrementalizing Lambda-Calculi by Static Differentiation" (Giarrusso et al.) [0].
Particularly the part of getting logically-consistent results via different routes of computation. David Kohn says this in the blog post: "But, you have to have in-depth (and sometimes rather arcane) knowledge about each aggregate’s internals to know which ones meet the above criteria – and therefore, which ones you can re-aggregate."
I think Giarrusso addresses the internal requirements in a precise mathematical manner (though I may be wrong). Giarrusso Section 2.1 is the specific area I'm thinking of, particularly Definition 2.1 and then later the discussion of the abelian group, where certain binary operations must be commutative and associative. Giarrusso Equation 1 in the introduction defines externally what it means to be logically consistent.
Also, Giarrusso talks about "Self-Maintainability", where updating the continuous result only needs to look at the changes to the input. What was obvious to me from reading Giarrusso was that something simple like "AVG" is not self-maintainable unless you keep the intermediate state (sum and count) separately. Kohn's distinction of Transition Function, Combine Function, and Final Function - together with Giarrusso's abelian group and change structure - is kind of a big deal in making arbitrary functions "continuous"-capable while being "self-maintainable".
I can see designing a [data] structure that adheres to the specific algebraic laws defined by Giarrusso's abelian group along with the 3 functions from Kohn (missing from Giarrusso). You could then feed this structure to a function that spits out two new functions for you: a normal function that can be used to calculate a result, and its "continuous" version, which only needs the changes to the inputs. For example, "avg()" and "avg_continuous()" could be automatically derived from a data structure of algebraically well-behaved operations.
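The AVG case can be sketched concretely (my own naming; a toy model of the idea, not TimescaleDB's or Giarrusso's actual implementation): the intermediate state (sum, count) forms an abelian group under componentwise addition, so partial states can be combined in any order, and inverses let you retract rows.

```typescript
// Intermediate state for AVG: self-maintainable because it carries
// sum and count separately, as discussed above.
type AvgState = { sum: number; count: number };

const init: AvgState = { sum: 0, count: 0 };

// Transition: fold one new input row into the state.
const transition = (s: AvgState, x: number): AvgState =>
  ({ sum: s.sum + x, count: s.count + 1 });

// Combine: merge two partial states (associative and commutative).
const combine = (a: AvgState, b: AvgState): AvgState =>
  ({ sum: a.sum + b.sum, count: a.count + b.count });

// Inverse: the group structure is what lets us handle deletes.
const invert = (s: AvgState): AvgState => ({ sum: -s.sum, count: -s.count });

// Final: project the intermediate state to the aggregate's result.
const final = (s: AvgState): number => s.sum / s.count;

// Full recompute and incremental update agree:
const batch = [1, 2, 3, 4].reduce(transition, init); // { sum: 10, count: 4 }
const delta = transition(init, 5);                   // a new row arrives
console.log(final(combine(batch, delta)));           // 3 (avg of 1..5)
console.log(final(combine(combine(batch, delta), invert(delta)))); // 2.5 (row retracted)
```

Given only `init`, `transition`, `combine`, `invert`, and `final` satisfying those laws, a generic driver could derive both the batch and the "continuous" versions of the aggregate automatically.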
Plus this would have a really cool Category Theory diagram for it!
David: absolutely love the animated gifs and pictures. Very well written indeed.
Thanks for this comment and link; I think this area is one of the most interesting frontiers in computer science today. We're starting to see products like TimescaleDB here and Materialize elsewhere that build on these ideas, but in five or ten years' time the developer story here is going to be absolutely incredible. There is so much potential in being able to write declarative specifications for very complex stateful dataflows that are then just maintained quickly and automagically, especially if we can do it without arbitrary windows and watermarks. Very exciting!
"Type Theory and Functional Programming" by Simon Thompson (PDF available via quick search)
First 120 pages of "Type Theory and Formal Proof" by Rob Nederpelt / Herman Geuvers
For some early stuff on dependent types (also covered in the above), check out "Intuitionistic Type Theory" by Per Martin-Lof
"Computation and Reasoning: A Type Theory for Computer Science" by Zhaouhui Luo
"Type-Driven Development with Idris" by Edwin Brady
"Gentle Introduction to Dependent Types with Idris" by Boro Sitnikovski
"Types and Programming Languages" by Pierce
"Practical Foundations for Programming Languages" by Robert Harper (or any of his YouTube videos re/Type Theory)
"The Little Typer" by Friedman and Christiansen
"Modal Homotopy Type Theory: The Prospect of a New Logic for Philosophy" by David Corfield (just took this one to OBX North Carolina to enjoy on my beach vacation last week! Got some salt water in those pages. Also speaks to Category Theory.)
For Category Theory:
"The Joy of Abstraction" (2023) by Euginia Cheng
"Basic Category Theory for Computer Scientists" by Benjamin C. Pierce
"Conceptual Mathematics: A first introduction to categories" by Lawvere and Schanuel
"Categories for the Working Philosopher" by Landry