Lattices, and by extension semilattices, are one of my favourite algebraic structures in the context of programming. Their applications are innumerable, from access control, to distributed systems, to constraining the flow of privileged information.
For example, CvRDTs (a class of conflict-free replicated data types) are built from join-semilattices with a monotonic merge operation. If you can prove a few basic properties about your data + the merge operation over it, you can construct a CRDT. It won't necessarily be efficient, but it's a good starting point for reasoning about the problem.
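To make that concrete, here's a minimal sketch (illustrative Python, not from any particular CRDT library) of a state-based grow-only counter, one of the simplest CvRDTs: each replica keeps a map of per-replica counts, increments are monotonic, and merge is pointwise max, i.e. the join of the semilattice.

    # Sketch of a G-Counter CvRDT; all names are illustrative.
    class GCounter:
        def __init__(self, replica_id):
            self.replica_id = replica_id
            self.counts = {}  # replica id -> count; entries only ever grow

        def increment(self):
            # Monotonic update: a replica's own slot only increases.
            self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + 1

        def value(self):
            return sum(self.counts.values())

        def merge(self, other):
            # Pointwise max is the lattice join: commutative, associative,
            # and idempotent, so replicas can merge in any order, any number
            # of times, and still converge.
            for rid, n in other.counts.items():
                self.counts[rid] = max(self.counts.get(rid, 0), n)

    a, b = GCounter("a"), GCounter("b")
    a.increment(); a.increment(); b.increment()
    a.merge(b); b.merge(a)
    assert a.value() == b.value() == 3  # both replicas converge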
People spend a lot of time trying to sell abstract algebra using monads, but in my opinion it's a lot easier to grasp the applications of lattices.
Similarly, the most general form of rational decision theory, or rational choice theory, is based on lattices. That makes intuitive sense, since preference relations *are* partially ordered sets.
Then, analyzing rational (vs., for example, boundedly rational) choice behavior is equivalent to analyzing lattice optimization problems (monotonicity).
Then, parameterizing the choice situation and/or introducing strategic dependencies between actors leads to analyzing sub-/supermodular correspondences on these sets. In fact, the most general version would make use of quasi-supermodularity. I think Milgrom & Shannon (1994) show that this is the most general way one can think of coherent (rational) decision theory.
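For reference, in my own notation rather than anything from the comment above: a function f on a lattice is supermodular when

    f(x \vee y) + f(x \wedge y) \ge f(x) + f(y)

and quasi-supermodular when only the ordinal weakening holds:

    f(x) \ge f(x \wedge y) \implies f(x \vee y) \ge f(y)
    f(x) >   f(x \wedge y) \implies f(x \vee y) >   f(y)

which is the condition the monotone comparative statics results are stated in terms of.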
Yes, of course, but even then we are left to describe such intransitivities in a useful manner rather than conclude the actor is simply non-rational.
If it were so, then human behavior would be impossible to analyze. Instead, if we focus on the causes and consequences of non-transitive preferences - like in behavioral economics - we may regain the ability to do analysis.
Lattices are also ordered (by intellectual descent) with respect to early explorations in philosophy, such as using transitivity on "Socrates is a man" and "All men are mortal".
Without lattices, we would never have had the ambiguous phrase "computer scientists commonly choose models which have bottoms, but prefer them topless" occurring in a serious textbook.
Edit: agreed, intellectual descent is only partially ordered, but the interval between Aristotle's example syllogism and lattices contains a chain. Guess I should've said "also comparable (by intellectual descent)".
Not the parent and this might be different from what they meant, but this reminded me of Lindsey Kuper's PhD work on something called LVars (lattice variables):

"We present LVars, a new model for deterministic-by-construction parallel programming that generalizes existing single-assignment models to allow multiple assignments that are monotonically increasing with respect to a user-specified lattice. LVars ensure determinism by allowing only monotonic writes and “threshold” reads that block until a lower bound is reached." (from https://users.soe.ucsc.edu/~lkuper/papers/lvars-fhpc13.pdf)

"The LVars model allows communication through shared monotonic data structures to which information can only be added, never removed, and for which the order in which information is added is not observable." (from https://users.soe.ucsc.edu/~lkuper/papers/effectzoo-pldi14.p... )

More at https://users.soe.ucsc.edu/~lkuper/
LVars are one of my favorite concepts, and they have roots in Vijay Saraswat's thesis on concurrent constraint programming, which is parametric over a choice of lattice. I second the recommendation for the LVars series of papers, and I also highly recommend digging up a copy of Saraswat's thesis work, which was published as a book. https://mitpress.mit.edu/books/concurrent-constraint-program...
The semantics of determinate CCP programs (and by extension, LVars programs) can be described really beautifully in terms of closure operators on the underlying domain. I'm not sure this made it into the initial published thesis, but this is an accessible paper on the topic: http://www.lix.polytechnique.fr/comete/stages/references/ccp...
A lattice has a least upper bound (join) and a greatest lower bound (meet) for every pair of elements. A semilattice only has one of the two.
A monotonic operation is one that’s either non-decreasing or non-increasing. That is, it never reverses direction.
So in the context of conflict-free replicated data types, you can think of something like an append-only database as a basic example. Over time, the database can only grow or stay the same. It can never shrink. This allows you to split the database into pieces and distribute them. You always know they can be merged back together because they’re append-only. This makes it a join semilattice with the union of corresponding tables being the least upper bound operation.
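As a sketch of that last point (toy code, not any real database): a grow-only set of rows behaves exactly like the append-only database described above, with union as the least upper bound.

    # Sketch of an append-only ("grow-only") store; union is the least upper
    # bound, so any two replicas or shards can always be merged.
    def merge(replica_a, replica_b):
        return replica_a | replica_b

    shard_1 = {("users", 1, "alice"), ("users", 2, "bob")}
    shard_2 = {("users", 2, "bob"), ("users", 3, "carol")}

    assert merge(shard_1, shard_2) == merge(shard_2, shard_1)  # commutative
    assert merge(shard_1, shard_1) == shard_1                  # idempotent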
Here's another perspective on lattices: it's called Formal Concept Analysis [1].
The core idea is that every binary relation (think of a binary matrix representing, for example, "Animal a has Property p") gives rise to a lattice of `concepts` associated with that relation. This lattice turns out to be very nice (it is complete: every supremum and infimum exists, even infinite ones), and conversely, every complete lattice is such a concept lattice in a very natural way (the relation that leads back to the complete lattice is then `a is smaller than b`).
This lets you translate phenomena between the world of lattices and the world of relations, with many applications in, for example, data mining.
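A toy sketch of the construction (the relation and all names here are made up for illustration): derive the two maps of the Galois connection from a binary relation, then enumerate the pairs that determine each other; those pairs are exactly the concepts.

    from itertools import combinations

    # Binary relation: object -> set of attributes ("Animal a has Property p").
    relation = {
        "frog":  {"aquatic", "small"},
        "shark": {"aquatic", "predator"},
        "eagle": {"predator", "small"},
    }
    objects = set(relation)
    attributes = set().union(*relation.values())

    def common_attributes(objs):
        # Derivation operator A -> A': attributes shared by all given objects.
        return attributes if not objs else set.intersection(*(relation[o] for o in objs))

    def common_objects(attrs):
        # Derivation operator B -> B': objects having all given attributes.
        return {o for o in objects if attrs <= relation[o]}

    # A formal concept is a pair (extent, intent) where each determines the other.
    concepts = set()
    for r in range(len(objects) + 1):
        for objs in combinations(sorted(objects), r):
            intent = common_attributes(set(objs))
            extent = common_objects(intent)  # closure of the object set
            concepts.add((frozenset(extent), frozenset(intent)))

    # Ordered by extent inclusion, these form a complete lattice.
    for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
        print(sorted(extent), "|", sorted(intent))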
The rabbit hole goes deep here. Formal concept analysis is essentially the study of certain Galois connections on lattices derived from relations on sets. You can generalize relations to profunctors on categories and Galois connections to adjunctions.
Tai-Danae Bradley works with these ideas applied to probability distributions on a product space. The same formal ideas developed in this linear algebraic context lead to a very rich theory.
This looks interesting! I hadn't heard of it before. But I do think that saying that "every supremum and infimum exists" is overselling it a bit. They exist, but from the example in the Wikipedia article it looks like they only exist trivially because you define unique supremum and infimum nodes. In the body of water classification example on Wikipedia the bottom-most node is a "concept" of a body of water that is both "temporary" and "constant" at the same time. As these are antonyms, this concept is unsatisfiable, and the node is entirely artificial, not corresponding to any possibly existing object.
I haven't tried to understand the more advanced extensions listed on the Wikipedia page, they might be able to rectify this kind of problem.
I'm not sure if I'm just being dumb, but are these all actual lattices (formally speaking)? The division examples don't have (at least not explicitly) a LUB for {4, 6, 10} (under divisibility that would be lcm(4, 6, 10) = 60).
Yes, and this is not the only error in the article. I'm not even sure what this is supposed to mean:
""" Every operation which preserves a lattice and doesn't use "incomparable" objects is equivalent to addition. """
And the statement at the end: "any lattice is equivalent to another lattice where the relationship is set inclusion." This only holds for distributive lattices.
For another take on lattice theory, and ordered structures more generally:
Lattices are very applicable to several areas of computer science. One of my favorite results involving lattices is the Knaster–Tarski theorem[0], which can be applied to areas ranging from grammars (e.g. showing the existence of a set containing all the strings generated by a BNF grammar) to semantics of programming languages (e.g. giving semantics of looping programs as a least fixed point).
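As a small illustration of the grammar example (hypothetical code, deliberately tiny grammar): the set of nullable nonterminals is the least fixed point of a monotone function on the powerset lattice of nonterminals, and because that lattice is finite you can reach the fixed point by plain Kleene iteration from the bottom element, the empty set.

    # Grammar: A -> B C | 'a';  B -> '' ;  C -> B | 'c'
    grammar = {
        "A": [["B", "C"], ["a"]],
        "B": [[]],                 # empty production: B is directly nullable
        "C": [["B"], ["c"]],
    }

    def step(nullable):
        # Monotone on the powerset lattice: only ever adds nonterminals.
        return nullable | {
            nt for nt, prods in grammar.items()
            if any(all(sym in nullable for sym in prod) for prod in prods)
        }

    # Kleene iteration: bottom, f(bottom), f(f(bottom)), ... climbs to the lfp.
    fixed = set()
    while step(fixed) != fixed:
        fixed = step(fixed)

    print(fixed)  # {'B', 'C', 'A'}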
Thanks! Is there any relationship? Some of the graphical examples [1] in your first link look like they could be related to the lattices in the second link.
If you program in a statically-typed language with interfaces like Java or C#, you have already encountered lattices though you may not have realized it. In those languages, the subtype relation between pairs of types forms a semi-lattice.
That means that for any pair of types, there is a least upper bound type that is a supertype of both of them. (When discussing this, we usually allow a type to be considered its own supertype. So by "X is a supertype of Y" we mean something more like "every instance of Y is also an X". So String is its own supertype because every String is also a String.) For example, the least upper bound of Object and List is Object. The least upper bound of ArrayList and Stack is List, etc.
This is important because there are places where you need to find a type that contains all values of two other types. For example, say you are type checking:
foo(condition ? a : b);
Is this a valid call? To determine that, you need to see if the type of the argument matches the declared parameter type on foo. But what is the type of a conditional expression? The value could have the type of a or the type of b. You need a type that subsumes both of those. The answer most languages use is to calculate the least upper bound of the types of the two branches. That works only because there is a least upper bound for every pair of types, and we know that because we know types form a semi-lattice.
Are types a full lattice? For that, you also need a greatest lower bound, or a "meet". That means for any pair of types there needs to be a type that is a sub-type of both of them. In some languages that doesn't exist. But languages that have a "bottom" type give you this property. The bottom type, by definition, is a subtype of all types. So whenever you need the greatest lower bound, if there isn't any other obvious candidate (like if one of the types is directly a subtype of the other), you can just throw your hands up and use bottom.
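Here's a toy sketch of the least-upper-bound computation (a hypothetical single-inheritance hierarchy; real languages with multiple interface inheritance are messier, and upper bounds may not even be unique there):

    # child -> parent in a made-up single-inheritance hierarchy
    parents = {
        "ArrayList": "List",
        "Stack": "List",
        "List": "Object",
        "String": "Object",
    }

    def supertypes(t):
        # All supertypes of t, including t itself, most specific first.
        chain = [t]
        while t in parents:
            t = parents[t]
            chain.append(t)
        return chain

    def least_upper_bound(a, b):
        # First supertype of a that is also a supertype of b.
        bs = set(supertypes(b))
        return next(t for t in supertypes(a) if t in bs)

    print(least_upper_bound("ArrayList", "Stack"))  # List
    print(least_upper_bound("String", "List"))      # Object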
A really great low-barrier-to-entry constraint language building off the concepts of lattices is Cuelang. Its main developer is a former Golang core developer, and the language is a superset of JSON meant for validating YAML, JSON, CSV, etc. It does a good job of maintaining mathematical purity with regard to unification, disjunction, etc., while providing an easy-to-use tool for the common problem of validating configurations.
Is anyone else getting a "[Math Processing Error]" in red at a bunch of places?
I'm seeing:
>For example, [Math Processing Error], so multiplying by an integer "preserves" this lattice (on the Division section).
This one is even worse:
>The relevant fact about logarithms is that [Math Processing Error], meaning that the problem of multiplying [Math Processing Error] and [Math Processing Error] can be reduced to the problem of adding their logarithms. (in the Everything is Addition section)
I'm not sure if I'm understanding it correctly - for my mother to be the common ancestor of the set consisting of me and my mother, is it necessary for my mother to be her own ancestor? I don't think a person is considered their own ancestor in the common usage of the term.
Maybe I'm just being pedantic - the numerical examples make more sense to me since any number can be its own factor.
Based on that description of Pijul, it does seem to be a semilattice: it has a merge operation that's associative and commutative, and while it doesn't explicitly state that it's idempotent, it's hard to imagine merging two states and getting back anything other than the same state.
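If you wanted to sanity-check those three laws for a concrete state type, a quick randomized test might look like this (purely illustrative: a "state" here is just a set of patch ids, which is far simpler than a real repo):

    import random

    def merge(x, y):
        return x | y

    def random_state():
        return frozenset(random.sample(range(10), random.randint(0, 5)))

    # Spot-check the join-semilattice laws on random states.
    for _ in range(1000):
        a, b, c = random_state(), random_state(), random_state()
        assert merge(a, b) == merge(b, a)                      # commutative
        assert merge(a, merge(b, c)) == merge(merge(a, b), c)  # associative
        assert merge(a, a) == a                                # idempotent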
I am one of the authors of Pijul. It is indeed a semilattice, but not exactly "commutative": the only operations that commute are operations that could be done in parallel. For example, adding a file doesn't commute with editing it.
But unlike in Darcs, where conflicts are messy and not super well-defined in all cases (not to mention that they sometimes cause Darcs to run for a time exponential in the size of the repo's history), two conflicting edits always commute in Pijul.
The main points of Pijul were (1) to solve the soundness issues with conflicts and (2) to fix the exponential problems.
Firstly, you would only have a semilattice (i.e. you have join but not meet). It sort of gives you the notion of a merge, but maybe not the best one. One thing to note is that it isn't a good model of common version control systems, as join must be associative and those systems do not have associative merges.
I guess you could imagine your state to be the join of all its patches. That makes sense if you have an "add foo" patch and a "delete foo" patch and join them to get an "empty, but remembering that foo was there once" state (which is required because joining "add foo" again must do nothing). But you also need to accept that "delete foo" on its own is a valid state (as is any "resolve this merge conflict" patch), and it feels not great for that to be a valid state.
Generally I think you want slightly more constraints (than none) on which things may be merged or which repo states are valid.