That's fun and a bit surprising, but maybe it shouldn't be. I'm reminded that Dana Scott and Christopher Strachey showed that by using lattices, or complete partial orders with a bit of topology, to model the graphs of possible computations of a program, you can, just as in analysis, define least upper and lower bounds and a notion of continuity, and from those derive a construction of the limit of a computation analogous to a limit in analysis. They called this model a domain.

That bit of math is the basis of denotational semantics of programming languages. It's necessary because sets are largely sufficient as a basis for algebra and analysis, but not for programs, which have evils like partial functions, side effects, and variable assignment. I believe Strachey, with Scott, also introduced the formal notions of sum, product, and recursive types, and showed how definitions or models of recursive types and functions could be made well founded through their construction of limits on domains.

An early tech report on it can be found here: https://www.cs.ox.ac.uk/files/3228/PRG06.pdf and David Schmidt has a more recent free book on the topic.
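To make the limit construction concrete, here's a minimal Python sketch (my own illustration, not from the report): a recursive definition like factorial is the least fixed point of a functional F, obtained as the limit of the chain of finite approximations F(⊥), F(F(⊥)), …, where ⊥ is the everywhere-undefined partial function, modeled here as an empty dict.

```python
# Kleene-style fixpoint iteration: a partial function is a dict
# mapping known inputs to outputs; {} plays the role of ⊥,
# the everywhere-undefined function.

def F(f):
    """One step of the functional defining factorial: extends
    the approximation f by one more level of recursion."""
    g = {0: 1}
    for n, fn in f.items():
        g[n + 1] = (n + 1) * fn
    return g

# The ascending chain ⊥ ⊑ F(⊥) ⊑ F(F(⊥)) ⊑ ... of approximations;
# the (least) fixed point is their limit.
approx = {}
for _ in range(6):
    approx = F(approx)

print(approx)  # {0: 1, 1: 1, 2: 2, 3: 6, 4: 24, 5: 120}
```

Each iteration pins down factorial on one more input; the full function only exists "in the limit", which is exactly what the domain structure makes well defined.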

 Another interesting area of program semantics that hasn’t seen much attention, afaik, is the design of languages whose programs form some algebraic structure. My area of research is concatenative programming, where concatenation of two programs denotes the composition of those programs, and the empty program is the identity function (on the “program state”, which is usually a stack). That means that the syntax and semantics of the language both form monoids, and there’s a homomorphism from the syntax onto the semantics.

Several years ago, I was trying to come up with languages with other, more interesting algebraic structures, so you could apply theorems from those structures to programs. A particular challenge is a language whose structure forms a nontrivial ring¹, in particular a Euclidean ring: then you could find the greatest common divisor of two programs, and because a Euclidean ring is a unique factorisation domain, you could also factor a program into prime subprograms. I was fascinated by what that might mean and whether it could be useful.

Unfortunately, it was that “nontrivial” aspect that I could never get past. The language always ended up insufficiently powerful to express anything of interest, its algebraic structure was trivial (e.g. there’s only one program), or it ended up having a weaker structure (e.g. an idempotent semiring). Chris Pressey also worked on this concept a bunch around 2007, and produced some results like Cabra² and Burro³, but ran into similar dead ends like Potro⁴.
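The monoid homomorphism is easy to demonstrate in miniature. Here's a small Python sketch (token names and stack representation are my own, not any particular concatenative language): programs are tuples of tokens under concatenation, denotations are stack functions under composition, and `denote` maps one monoid onto the other.

```python
# Syntax monoid: programs are tuples of tokens; concatenation is the
# operation and the empty program () is the identity.
# Semantics monoid: functions on stacks under composition, with the
# identity function as the unit.

SEMANTICS = {
    "1":    lambda s: s + [1],
    "+":    lambda s: s[:-2] + [s[-2] + s[-1]],
    "dup":  lambda s: s + [s[-1]],
    "drop": lambda s: s[:-1],
}

def denote(program):
    """The homomorphism from syntax to semantics."""
    def run(stack):
        for token in program:
            stack = SEMANTICS[token](stack)
        return stack
    return run

# Homomorphism property: denote(p + q) acts like denote(q) after denote(p).
p, q = ("1", "1"), ("+", "dup")
assert denote(p + q)([]) == denote(q)(denote(p)([]))  # both give [2, 2]
# Identity: the empty program denotes the identity on stacks.
assert denote(())([1, 2, 3]) == [1, 2, 3]
```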
 This algebraic approach to program semantics is basically how categorical semantics works. The syntactic equational theory of the simply typed lambda calculus with pairs, for example, is that of the "free" cartesian closed category. It can be mapped (in a way that respects the cartesian closed structure) into any cartesian closed category, giving you a semantics.

A monoid can be seen as a one-object category (the monoid elements are the morphisms on that object, and the monoid operator is morphism composition), so perhaps concatenative languages have categorical semantics too. (Although it seems like the semantics of "quote", or whatever [ foo bar baz ] is in Forth, might make things a little interesting, i.e. require more structure than a monoid homomorphism. I expect you really want cartesian closed categories, and quote is probably curry or apply.)
 I don’t know how concatenative languages fit into category theory. I would guess Cartesian closed categories are the place to start, but that’s not really my area of expertise.

Maybe you can offer some insight if I tell you that the quotation syntax “[ … ]” in a concatenative language corresponds to lambda abstraction:

```
      e : a
---------------------
[ e ] : ∀s. s → s × a
```

And the quotation operator “quote”, which takes a value on the stack and returns it quoted, corresponds to eta-expansion (or lifting lambda abstraction into a term, if you like) and has the type:

```
quote : ∀sa. s × a → s × (∀t. t → t × a)
```

Generally the standard Turing-complete basis in concatenative calculus is the following set of combinators:

```
compose : ∀rstu. r × (s → t) × (t → u) → r × (s → u)
swap    : ∀sab. s × a × b → s × b × a
drop    : ∀sa. s × a → s
dup     : ∀sa. s × a → s × a × a
quote   : ∀sa. s × a → s × (∀t. t → t × a)
apply   : ∀st. s × (s → t) → t
```

(Although you can get away with fewer if they’re equivalent in power.)

The first four correspond to the standard B, C, K, and W combinators from combinatory logic, which give you all the substructural rules from logic: swapping is exchange, dropping is weakening, and copying is contraction. Without swap/drop/dup, it’s equivalent to ordered linear lambda calculus, which can be interpreted in any category (since it’s just B and I); what are the categorical equivalents of linear (BC), affine (BCK), ordered (BKW), and relevant (BCW) logics?
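For an operational reading of that basis, here's a small Python sketch (my own; the stack is a list and a quoted program is a Python function from stacks to stacks) of the six combinators:

```python
# Stack = Python list; a quoted program = a function stack -> stack.

def compose(s):
    # r × (s→t) × (t→u) → r × (s→u): pop two quotations, push their composite
    g, f = s.pop(), s.pop()
    s.append(lambda st: g(f(st)))
    return s

def swap(s):                    # exchange (C)
    s[-1], s[-2] = s[-2], s[-1]
    return s

def drop(s):                    # weakening (K)
    s.pop()
    return s

def dup(s):                     # contraction (W)
    s.append(s[-1])
    return s

def quote(s):
    # s × a → s × (∀t. t → t × a): wrap the top value as a program
    a = s.pop()
    s.append(lambda st: st + [a])
    return s

def apply(s):
    # s × (s→t) → t: pop a quotation and run it on the rest of the stack
    f = s.pop()
    return f(s)

# [ 2 ] [ 3 ] compose apply  ==  push 2, then push 3
stack = [lambda st: st + [2], lambda st: st + [3]]
print(apply(compose(stack)))  # [2, 3]
```

Note the mutation-heavy style is just for brevity; a faithful semantics would use pure functions on immutable stacks.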
 Oh neat! (I'm working on using Joy in a practical way; I had an implementation in Python, but it turns out Prolog is more conducive: https://osdn.net/projects/joypy/scm/hg/Joypy/blobs/tip/thun/...)

> Unfortunately, it was that “nontrivial” aspect that I could never get past. The language always ended up insufficiently powerful to express anything of interest, its algebraic structure was trivial (e.g. there’s only one program), or it ended up having a weaker structure (e.g. an idempotent semiring). Chris Pressey also worked on this concept a bunch around 2007, and produced some results like Cabra² and Burro³, but ran into similar dead ends like Potro⁴.

Maybe that in itself is an important result?
 > Maybe that in itself is an important result?

Well, I didn’t have the skills to actually prove the negative then; I was just never able to find a positive example. I’m still not convinced that it’s impossible, if I ever get back to it, or if someone wants to pick it up as a problem to tinker with.
 Also take a look at the Boom Hierarchy [1], which came from the same era (Bird–Meertens formalism, etc.). It describes relationships between data structures built from four algebraic properties of the construction operator: having a unit, commutativity, associativity, and idempotence. The resulting 16 combinations include the familiar set, bag, list, and tree, but also other, less useful ones, such as the non-empty mobile.
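The tree → list → bag → set corner of the hierarchy is easy to see concretely. A rough Python sketch (my own illustration): start from binary join-trees, then impose one law at a time, each of which quotients away some structure.

```python
# Boom hierarchy sketch: trees built from a binary `join`, then laws
# imposed one at a time:
#   associativity        -> list  (the tree shape no longer matters)
#   + commutativity      -> bag   (the order no longer matters)
#   + idempotence        -> set   (the multiplicity no longer matters)

def to_list(t):
    """Flatten a join-tree; associativity makes the shape irrelevant,
    so only the left-to-right sequence of leaves survives."""
    if isinstance(t, tuple):
        left, right = t
        return to_list(left) + to_list(right)
    return [t]

tree = ((1, 2), (2, (3, 1)))   # tree: shape, order, multiplicity all matter
xs   = to_list(tree)           # list: [1, 2, 2, 3, 1]
bag  = sorted(xs)              # bag (as sorted list): [1, 1, 2, 2, 3]
st   = sorted(set(xs))         # set: [1, 2, 3]
```

Each step forgets information, which is why theorems proved at one level transfer down the hierarchy but not back up.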
