
A Graphical Introduction to Lattices - nickysielicki
https://philosophyforprogrammers.blogspot.com/2013/06/a-graphical-introduction-to-lattices.html
======
QuinnWilton
Lattices, and by extension semilattices, are one of my favourite algebraic
structures in the context of programming. Their applications are innumerable,
from access control, to distributed systems, to constraining the information
flow of privileged information.

For example, CvRDTs (state-based conflict-free replicated data types) are
built on semilattices: replica states form a join-semilattice, and merging two
states is the join. If you can prove a few basic properties about your data
and the merge operation over it (commutativity, associativity, idempotence),
you can construct a CRDT. It won't necessarily be efficient, but it's a good
starting point for reasoning about the problem.
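
As a minimal sketch (Python, illustrative names only, not any particular
library's API): a grow-only counter is a CvRDT because its states form a
join-semilattice and merge is the join.

```python
# Minimal sketch of a state-based CRDT (CvRDT): a grow-only counter.
# States (dicts of per-replica counts) form a join-semilattice, and
# merge is the join: commutative, associative, and idempotent.

class GCounter:
    def __init__(self, replica_id):
        self.replica_id = replica_id
        self.counts = {}  # replica_id -> count observed from that replica

    def increment(self):
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + 1

    def value(self):
        return sum(self.counts.values())

    def merge(self, other):
        # Pointwise max is the least upper bound of the two states.
        merged = GCounter(self.replica_id)
        for r in set(self.counts) | set(other.counts):
            merged.counts[r] = max(self.counts.get(r, 0),
                                   other.counts.get(r, 0))
        return merged

a = GCounter("a"); a.increment(); a.increment()
b = GCounter("b"); b.increment()
m = a.merge(b)
assert m.value() == 3
assert m.merge(m).counts == m.counts  # merge is idempotent
```

The same recipe works for any state type where you can define a join;
efficiency is a separate concern, as noted above.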

People spend a lot of time trying to sell abstract algebra using monads, but
in my opinion it's a lot easier to grasp the applications of lattices.

~~~
cced
Do you have any examples or links to code samples?

~~~
tom_mellior
Not the parent and this might be different from what they meant, but this
reminded me of Lindsey Kuper's PhD work on something called LVars (lattice
variables): "We present LVars, a new model for deterministic-by-construction
parallel programming that generalizes existing single-assignment models to
allow multiple assignments that are monotonically increasing with respect to a
user-specified lattice. LVars ensure determinism by allowing only monotonic
writes and “threshold” reads that block until a lower bound is reached." (from
[https://users.soe.ucsc.edu/~lkuper/papers/lvars-fhpc13.pdf](https://users.soe.ucsc.edu/~lkuper/papers/lvars-fhpc13.pdf) );
"The LVars model allows communication through shared monotonic datastructures
to which information can only be added, never removed, and for which the order
in which information is added is not observable." (from
[https://users.soe.ucsc.edu/~lkuper/papers/effectzoo-pldi14.pdf](https://users.soe.ucsc.edu/~lkuper/papers/effectzoo-pldi14.pdf)
); more at
[https://users.soe.ucsc.edu/~lkuper/](https://users.soe.ucsc.edu/~lkuper/)
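
To make the "monotonic writes, threshold reads" idea concrete, here is a toy
single-variable sketch in Python (illustrative only, not the actual LVars
API): a shared value that only grows in the max-lattice on integers, with a
blocking threshold read.

```python
import threading

# Toy illustration of the LVars idea: a shared variable that can only
# grow in a lattice (here, max on non-negative ints), with a "threshold
# read" that blocks until a lower bound is reached.

class MaxLVar:
    def __init__(self):
        self._value = 0
        self._cond = threading.Condition()

    def put(self, v):
        # Monotonic write: join the new value into the current state.
        with self._cond:
            self._value = max(self._value, v)
            self._cond.notify_all()

    def get_threshold(self, bound):
        # Block until the value is at least `bound`, then return the
        # bound itself (not the exact value), which keeps reads
        # deterministic regardless of scheduling.
        with self._cond:
            self._cond.wait_for(lambda: self._value >= bound)
            return bound

lv = MaxLVar()
t = threading.Thread(target=lambda: lv.put(5))
t.start()
assert lv.get_threshold(3) == 3  # blocks until the writer catches up
t.join()
```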

~~~
Twisol
LVars are one of my favorite concepts, and they have roots in Vijay Saraswat's
thesis on concurrent constraint programming, which is parametric over a choice
of lattice. I second the recommendation for the LVars series of papers, and I
also highly recommend digging up a copy of Saraswat's thesis work, which was
published as a book. [https://mitpress.mit.edu/books/concurrent-constraint-programming](https://mitpress.mit.edu/books/concurrent-constraint-programming)

The semantics of determinate CCP programs (and by extension, LVars programs)
can be described really beautifully in terms of closure operators on the
underlying domain. I'm not sure this made it into the initial published
thesis, but this is an accessible paper on the topic:
[http://www.lix.polytechnique.fr/comete/stages/references/ccp...](http://www.lix.polytechnique.fr/comete/stages/references/ccp.pdf)

------
Garlef
Here's another perspective on lattices: it's called Formal Concept Analysis
[1].

The core idea is that every binary relation (think of a binary matrix
representing, for example, "Animal a has Property p") leads to a lattice of
`concepts` associated to this relation. This lattice turns out to be very
nice (it is complete: every supremum and infimum exists, even infinite ones),
and conversely, every complete lattice is such a concept lattice in a very
natural way (the relation that leads back to the complete lattice is then `a
is smaller than b`).

This allows one to translate phenomena between the world of lattices and the
world of relations, with many applications in, for example, data mining.

[1]
[https://en.wikipedia.org/wiki/Formal_concept_analysis](https://en.wikipedia.org/wiki/Formal_concept_analysis)
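
To make the core idea concrete, here is a tiny Python sketch (toy data,
illustrative names): the two derivation operators of a formal context form a
Galois connection, and the pairs (A, B) with A' = B and B' = A are the
concepts.

```python
# Toy formal context: "animal a has property p".
# The two derivation operators below form a Galois connection; pairs
# (A, B) of objects and properties with A' = B and B' = A are the
# formal concepts, and ordered by inclusion they form a complete lattice.

relation = {
    ("frog",   "lives_in_water"), ("frog",  "can_jump"),
    ("dog",    "has_fur"),        ("dog",   "can_jump"),
    ("salmon", "lives_in_water"),
}
objects = {a for a, _ in relation}
properties = {p for _, p in relation}

def common_properties(animals):
    # A |-> A': all properties shared by every animal in A.
    return {p for p in properties if all((a, p) in relation for a in animals)}

def common_objects(props):
    # B |-> B': all animals having every property in B.
    return {a for a in objects if all((a, p) in relation for p in props)}

# Closing {"frog", "dog"} under the two operators yields the extent of
# the concept they generate: everything that can jump.
extent = common_objects(common_properties({"frog", "dog"}))
assert extent == {"frog", "dog"}
```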

~~~
danharaj
The rabbit hole goes deep here. Formal concept analysis is essentially the
study of certain Galois connections on lattices derived from relations on
sets. You can generalize relations to profunctors on categories and Galois
connections to adjunctions.

[https://link.springer.com/chapter/10.1007%2F978-3-642-29892-...](https://link.springer.com/chapter/10.1007%2F978-3-642-29892-9_24)

[https://golem.ph.utexas.edu/category/2013/08/the_nucleus_of_...](https://golem.ph.utexas.edu/category/2013/08/the_nucleus_of_a_profunctor_so.html)

Tai-Danae Bradley works with these ideas applied to probability distributions
on a product space. The same formal ideas developed in this linear algebraic
context lead to a very rich theory.

[https://johncarlosbaez.wordpress.com/2020/05/07/formal-concepts/](https://johncarlosbaez.wordpress.com/2020/05/07/formal-concepts/)

------
throwaway_pdp09
I'm not sure if I'm just being dumb, but are these all actual lattices
(formally speaking)? The division examples don't have (at least not
explicitly) a LUB for {4, 6, 10}.

~~~
mathgenius
Yes, and this is not the only error in the article. I'm not even sure what
this is supposed to mean:

""" Every operation which preserves a lattice and doesn't use "incomparable"
objects is equivalent to addition. """

And the statement at the end: "any lattice is equivalent to another lattice
where the relationship is set inclusion." This only holds for distributive
lattices.

For another take on lattice theory, and ordered structures more generally:

[https://www.azimuthproject.org/azimuth/show/Applied+Category...](https://www.azimuthproject.org/azimuth/show/Applied+Category+Theory+Course#Course)

------
siraben
Lattices are very applicable to several areas of computer science. One of my
favorite results involving lattices is the Knaster–Tarski theorem[0], which
can be applied to areas ranging from grammars (e.g. showing the existence of a
set containing all the strings generated by a BNF grammar) to semantics of
programming languages (e.g. giving semantics of looping programs as a least
fixed point).

Domain theory[1] also makes heavy use of this.

[0]
[https://en.wikipedia.org/wiki/Knaster%E2%80%93Tarski_theorem](https://en.wikipedia.org/wiki/Knaster%E2%80%93Tarski_theorem)

[1]
[https://en.wikipedia.org/wiki/Domain_theory](https://en.wikipedia.org/wiki/Domain_theory)
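
As a toy illustration (Python, made-up grammar): the language of a grammar,
truncated to strings of bounded length so the powerset lattice is finite, is
the least fixed point of a monotone step function, reached by iterating from
the bottom element (the empty set).

```python
# Toy Knaster-Tarski / Kleene iteration: the language of the grammar
#   S -> "ab" | "a" S "b"
# truncated to strings of length <= 6, as the least fixed point of a
# monotone function on a finite powerset lattice ordered by inclusion.

MAX_LEN = 6

def step(known):
    # Monotone: a bigger input set can only produce a bigger output set.
    new = {"ab"} | {"a" + s + "b" for s in known}
    return known | {s for s in new if len(s) <= MAX_LEN}

def least_fixed_point(f, bottom=frozenset()):
    # Iterate f from bottom until nothing new is added.
    current = bottom
    while True:
        nxt = f(current)
        if nxt == current:
            return current
        current = nxt

language = least_fixed_point(step)
assert language == {"ab", "aabb", "aaabbb"}
```

The same iteration scheme is what dataflow analyses and abstract
interpreters run on their own (finite-height) lattices.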

------
awirth
Note, this is about
[https://en.wikipedia.org/wiki/Lattice_(order)](https://en.wikipedia.org/wiki/Lattice_\(order\))
as in partially ordered sets, not
[https://en.wikipedia.org/wiki/Lattice_(group)](https://en.wikipedia.org/wiki/Lattice_\(group\))
as in geometry (and used by lattice cryptography)

~~~
chrispeel
Thanks! Is there any relationship? Some of the graphical examples [1] in your
first link look like they could have a relationship to lattices of the second
link.

[1]
[https://en.wikipedia.org/wiki/Lattice_(order)#Examples](https://en.wikipedia.org/wiki/Lattice_\(order\)#Examples)

~~~
danharaj
The common generalization of semilattices and groups are inverse semigroups
which are one way of capturing the idea of partial, or local, symmetries.

[https://en.wikipedia.org/wiki/Inverse_semigroup](https://en.wikipedia.org/wiki/Inverse_semigroup)

------
munificent
If you program in a statically-typed language with interfaces like Java or C#,
you have already encountered lattices though you may not have realized it. In
those languages, the subtype relation between pairs of types forms a
semi-lattice.

That means that for any pair of types, there is a least upper bound type that
is a supertype of both of them. (When discussing this, we usually allow a type
to be considered its own supertype. So by "X is a supertype of Y" we mean
something more like "every instance of Y is also an X". So String is its own
supertype because every String is also a String.) For example, the least upper
bound of Object and List is Object. The least upper bound of ArrayList and
Stack is List, etc.

This is important because there are places where you need to find a type that
contains all values of two other types. For example, say you are type
checking:

    
    
        foo(condition ? a : b);
    

Is this a valid call? To determine that, you need to see if the type of the
argument matches the declared parameter type on foo. But what is the type of
the conditional expression? The value could have the type of a or the type of
b. You need a type that subsumes both of those. The answer most languages use
is to calculate the
least upper bound of the types of the two branches. That works only because
there _is_ a least upper bound for every pair of types, and we know that
because we know types form a semi-lattice.
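
A sketch of that LUB computation in Python, over a hypothetical
single-inheritance hierarchy (real Java/C# LUB also has to deal with
interfaces, where a unique least upper bound is harder to pin down):

```python
# Sketch: least upper bound of two types in a toy single-inheritance
# hierarchy. child -> parent; "Object" is the top of the semilattice.

parents = {
    "ArrayList": "List", "Stack": "List",
    "List": "Collection", "Collection": "Object",
    "String": "Object",
}

def supertypes(t):
    # A type counts as its own supertype, as discussed above.
    chain = [t]
    while t in parents:
        t = parents[t]
        chain.append(t)
    return chain

def least_upper_bound(a, b):
    ancestors_of_b = set(supertypes(b))
    # Walk up from `a`; the first ancestor shared with `b` is the LUB.
    for t in supertypes(a):
        if t in ancestors_of_b:
            return t

assert least_upper_bound("ArrayList", "Stack") == "List"
assert least_upper_bound("List", "String") == "Object"
```

A type checker would call `least_upper_bound` on the types of the two
branches of the conditional, then check the result against foo's parameter.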

Are types a _full_ lattice? For that, you also need a greatest lower bound, or
a "meet". That means for any pair of types there needs to be a type that is a
_sub_-type of both of them. In some languages that doesn't exist. But
languages that have a "bottom" type give you this property. The bottom type,
by definition, is a subtype of all types. So whenever you need the greatest
lower bound, if there isn't any other obvious candidate (like if one of the
types is directly a subtype of the other), you can just throw your hands up
and use bottom.

------
rudolph9
A really great low-barrier-to-entry constraint language building on the
concepts of lattices is Cuelang. Its main developer is a former Golang core
developer, and the language is a superset of JSON meant for validating YAML,
JSON, CSV, etc. It does a good job of maintaining mathematical purity with
regard to unification, disjunction, etc., while providing an easy-to-use tool
for the common problem of validating configurations.

[https://cuelang.org/docs/concepts/logic/](https://cuelang.org/docs/concepts/logic/)

~~~
rudolph9
The notable difference between CUE and other easy-to-use configuration tools
is that it's a long way before you hit the bottom!

There are still some core features and bugs being worked out but you can do
exceptionally interesting things with it.
[https://gist.github.com/rudolph9/005f35c33e9255519d93c26058e...](https://gist.github.com/rudolph9/005f35c33e9255519d93c26058e9d470)

------
eyeundersand
Nice article.

Is anyone else getting a "[Math Processing Error]" in red at a bunch of
places?

I'm seeing: >For example, [Math Processing Error], so multiplying by an
integer "preserves" this lattice (on the Division section).

This one is even worse: >The relevant fact about logarithms is that [Math
Processing Error], meaning that the problem of multiplying [Math Processing
Error] and [Math Processing Error] can be reduced to the problem of adding
their logarithms. (in the Everything is Addition section)

~~~
tom_mellior
No. The formulas are generated using MathJax, maybe you are blocking their
JavaScript?

~~~
eyeundersand
Oh, yeah. Silly me! Appreciate it :)

------
ghostpepper
I'm not sure if I'm understanding it correctly - for my mother to be the
common ancestor of the set consisting of me and my mother, is it necessary for
my mother to be her own ancestor? I don't think a person is considered their
own ancestor in the common usage of the term.

Maybe I'm just being pedantic - the numerical examples make more sense to me
since any number can be its own factor.

------
jmiskovic
Would it be useful to consider code diffs as additions to and subtractions
from a code base, and then implement source versioning using a lattice
structure?

~~~
tom_mellior
I don't think it's a lattice per se, but Darcs had a notion of an "algebra of
patches":
[http://darcs.net/Theory/GaneshPatchAlgebra](http://darcs.net/Theory/GaneshPatchAlgebra).
Pijul does something similar, claiming to have "a formally correct version of
Darcs' theory of patches"
[https://pijul.org/manual/why_pijul.html](https://pijul.org/manual/why_pijul.html)

I've never used Pijul, but I miss Darcs almost every day when interacting with
Git.

~~~
bkirwi
Based on that description of Pijul, it does seem to be a semilattice: it has
a merge operation that's associative and commutative, and while it doesn't
explicitly state that it's idempotent, it's hard to imagine merging a state
with itself and getting back anything other than the same state.

Not super familiar with Pijul though!

~~~
pmeunier
I am one of the authors of Pijul. It is indeed a semilattice, but not exactly
"commutative": the only operations that commute are operations that could be
done in parallel. For example, adding a file doesn't commute with editing it.

But unlike in Darcs, where conflicts are messy and not super well-defined in
all cases (not to mention that they sometimes cause Darcs to run for a time
exponential in the size of the repo's history), two conflicting edits always
commute in Pijul.

The main points of Pijul were (1) to solve the soundness issues with conflicts
and (2) to fix the exponential problems.

