String Diagrams for Linear Algebra (graphicallinearalgebra.net)
142 points by saroyas 10 months ago | 25 comments



We have established that there is a homomorphism called θ from B, the PROP of diagrams, to Mat, the PROP of matrices.

I've never seen such a complex subject as linear algebra presented with such clarity and simplicity.


Are you being sarcastic?


Ya think? I like how Werner Heisenberg (re)invented matrix multiplication. If e_ij represents a transition from i -> j, and e_ij e_kl = e_il if j == k and 0 otherwise, then A = sum_ij a_ij e_ij represents the matrix A = [a_ij], and AB, computed with this rule, is exactly matrix multiplication.
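For concreteness, here's a little sketch (mine, in Python/NumPy; the helper name e is made up) checking that rule with explicit matrix units:

    # Matrix units e_ij (1 in row i, column j, 0 elsewhere) and
    # Heisenberg's rule: e_ij e_kl = e_il if j == k, else 0.
    import numpy as np

    def e(i, j, n=2):
        m = np.zeros((n, n))
        m[i, j] = 1.0
        return m

    print(np.array_equal(e(0, 1) @ e(1, 0), e(0, 0)))  # True (j == k)
    print(np.all(e(0, 1) @ e(0, 1) == 0))              # True (j != k)

    # A = sum_ij a_ij e_ij recovers A, and the rule then gives ordinary
    # matrix multiplication for AB.
    A = np.array([[1.0, 2.0], [3.0, 4.0]])
    rebuilt = sum(A[i, j] * e(i, j) for i in range(2) for j in range(2))
    print(np.array_equal(rebuilt, A))                  # True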


Why this episode, OP? IMO the proof of fullness is quite belabored. In fact the very next episode gives a much better one-page proof, in the section starting "Here’s what we mean by this rather cryptic remark". It's faithfulness that's more interesting, i.e. are the graphical rules sufficient to prove that any two diagrams that produce the same outputs given the same inputs are equal, i.e. is the list of graphical rules complete?

A string diagram is also equivalent to a program in a generalized stack language (in the stronger sense of equivalence up to differences in the topological arrangement of the wires, not up to the full graphical rules). Unlike a pure stack language, you can operate on values at any point in the stack, not just the top. The four operations are: drop an element from the stack; push a zero onto the stack; pop two elements and push their sum; pop an element and push two copies of it. Obviously any program in this language computes a linear function of its initial stack (there is a full homomorphism to matrices). A stack language captures the same ability of an input to be consumed that a string diagram does.
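Here's a minimal sketch of that language (my own; the names drop/zero/add/copy and the vector encoding are mine, not from the article). A stack of n values is a vector in R^n, each operation is a matrix, and running a program is matrix multiplication:

    import numpy as np

    def drop(i, n):
        # remove element i from an n-element stack: (n-1) x n matrix
        return np.delete(np.eye(n), i, axis=0)

    def zero(i, n):
        # insert a 0 at position i: (n+1) x n matrix
        return np.insert(np.eye(n), i, 0.0, axis=0)

    def add(i, n):
        # replace elements i and i+1 by their sum: (n-1) x n matrix
        m = np.delete(np.eye(n), i, axis=0)
        m[i, i] = 1.0  # row i now reads both x_i and x_{i+1}
        return m

    def copy(i, n):
        # duplicate element i: (n+1) x n matrix
        return np.insert(np.eye(n), i, np.eye(n)[i], axis=0)

    # Program: duplicate the single stack element, then add the two copies.
    # Later operations multiply on the left.
    prog = add(0, 2) @ copy(0, 1)
    print(prog)  # [[2.]] -- the linear map x -> 2x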


> Unlike a pure stack language, you can operate on values at any point in the stack, not just the top.

There's a language, Joy, that has combinators that allow you to work deeper in the stack, e.g. "dip":

       a b c [F] dip
    -------------------
          a b F c


    dipd == [dip] cons dip


    a b c [F] dipd
    a b c [F] [dip] cons dip
    a b c [[F] dip]      dip
    a b    [F] dip c
    a       F    b c
And so on...

I think it's a "Category Theory" paradigm language, whatever that might mean.

Joy resembles the point-free form of Haskell from Conal Elliott's "Compiling to categories" (http://conal.net/papers/compiling-to-categories/).


Matrices, although great for computers, are often complicated and unintuitive. This blog was my first insight into an alternative symbolic interpretation of linear algebra. It has often led to problems and complicated equations being represented far more simply, as well as teaching me to be more flexible and creative with my own notation.


Agreed. The graphical notation elegantly demonstrates that a lot of symbolic manipulation just boils down to pushing parts of a formula around until they fit a specific pattern, so that you can apply an identity and replace it with a different pattern.


> It has often led to problems and complicated equations being represented with far more simplicity.

How? AFAICT, the string diagram contains far too much incidental information for doing linear algebra. It contains even more information than a fully parenthesized tree of additions, because it even encodes how duplicated expressions are evaluated (e.g. it tells you whether in (x+y)+(x+y) you should evaluate (x+y) twice, or only once and then reuse the value). If we wrote a string diagram as a normal linear system, the whole question of fullness and faithfulness would be dead obvious.
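For instance (a toy example of mine): recomputing (x+y) and computing it once then reusing it are different diagrams but denote the same linear map, which is exactly the incidental information I mean:

    import numpy as np

    x, y = 3.0, 5.0

    v_recompute = (x + y) + (x + y)   # evaluate (x+y) twice
    s = x + y
    v_shared = s + s                  # evaluate once, reuse the value

    M = np.array([[2.0, 2.0]])        # the common denotation as a matrix
    print(v_recompute, v_shared, M @ np.array([x, y]))  # 16.0 16.0 [16.]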


Related: a recent paper (2019) from Google introducing TensorNetwork and the graphical notation: https://arxiv.org/abs/1905.01330

Blog post from Google: https://ai.googleblog.com/2019/06/introducing-tensornetwork-...

and an official website about the diagrams: https://tensornetwork.org/


Without an ad blocker, this page redirects to a malware landing page for me.


Ditto.


Is this (closely) related to Penrose tensor notation (https://en.wikipedia.org/wiki/Penrose_graphical_notation)?


Yes. The difference is that here the operation represented by putting wires next to each other is the direct sum, whereas in the Penrose notation the operation is the tensor product. They're both examples of the string diagram notation for monoidal categories. (https://forum.azimuthproject.org/discussion/2335/lecture-73-...)
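To make the contrast concrete, a small sketch (mine; NumPy has kron built in but no direct sum, so that helper is hand-rolled):

    import numpy as np

    def direct_sum(a, b):
        # block-diagonal matrix with a and b on the diagonal
        out = np.zeros((a.shape[0] + b.shape[0], a.shape[1] + b.shape[1]))
        out[:a.shape[0], :a.shape[1]] = a
        out[a.shape[0]:, a.shape[1]:] = b
        return out

    A = np.array([[1.0, 2.0]])    # 1x2
    B = np.array([[3.0], [4.0]])  # 2x1

    print(direct_sum(A, B))  # 3x3: dimensions add (this notation)
    print(np.kron(A, B))     # 2x2: dimensions multiply (Penrose notation)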


I expected it to -- graphically, it looks very similar to Penrose notation! -- but at first glance it seems to be quite different.

Penrose notation is very high level: edges correspond to tensor dimensions, joining corresponds roughly to tensor contraction. When you're working with multiple complicated multi-dimensional tensors, it can be a beautiful way to express things.

This notation seems to be expressing something a bit lower level: it seems that edges correspond to values and dots correspond to copy/add. So, roughly, they visually express linear functions. (I didn't read the full post; it's possible that they expand beyond this initial meaning.) Diagrams of a similar style are used in other contexts (e.g. expressing the weights of neural networks), often as a pedagogical tool.


It might be easier to think of it as a kind of Feynman diagram: a sum over paths. That's what matrix multiplication is: a kind of sum over weighted paths. A matrix records all the ways to get from the set of "columns" to the set of "rows".
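Concretely (a toy example of mine): entry (i, k) of AB is the total weight of the two-step paths from k through every intermediate j to i:

    import numpy as np

    A = np.array([[1, 2], [3, 4]])
    B = np.array([[5, 6], [7, 8]])

    # (AB)[i, k] = sum_j A[i, j] * B[j, k]: sum over paths k -> j -> i
    paths = [[sum(A[i, j] * B[j, k] for j in range(2)) for k in range(2)]
             for i in range(2)]
    print(np.array(paths))  # [[19 22] [43 50]]
    print(A @ B)            # the same matrix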


Yes


So,

I looked at this page, the whole site, and even slept on the questions involved. My experience from being a student, doing a bit of math tutoring, and doing a bit of math teaching is that linear algebra is one of the simplest "real" math courses a student encounters. Matrices are elaborate, but they're just pretty simple operations done repeatedly - and the geometric intuition of matrices as rotations and other transformations of linear spaces is both useful and one of the easier intuitions to get, among "basic" higher math intuitions.

Now, this page comes along with the overall message "Wow, linear algebra is hard to understand, here's a simpler way to approach it" and then [a stream of stuff no undergraduate could possibly get, and that I struggle with despite having an MA in math] (also a structure that's not at all related to the concrete geometric interpretation of linear algebra). And, after scanning the thing's structure, I have at least come away with what I think the motivation is.

The thing that's "complicated" and "unintuitive" to our author is that the standard development of linear algebra begins with operations on individual vectors, describing the mechanics in detail. It then jumps to the properties of sets of vectors and how one subspace can be orthogonal to another.

So, basically, it takes informal set theory as implicit scaffolding for the entire theory. And if you are a mathematician (or one kind of mathematician), set theory is a grungy, undesirable thing. The mechanics of computation are also grungy. I'm tempted to say everything is grungy until you reach a supreme level of abstraction. But the computation + set approach definitely constructs the structure "bottom-up". What the author is doing instead is constructing things top-down: combining algebraic operations as strings constructs the equivalent of subspaces at the same time as it describes what addition "is". Everything gets constructed from one "piece".

I feel like I can see the appeal of this, but I can't actually see it producing anything useful to me at the levels of abstraction I like to work with.

It also makes me think of how set theory is a bit like the object orientation of mathematics. It's a messy, do-anything, constructive exercise. I can understand the urge of someone who likes their foundations to be enlightening to base their structures on things of great apparent elegance. Perhaps I'm pessimistic enough to believe that one is always going to face "grunge".


Dunno. Matrix algebra seems perfectly intuitive to me. I don't get what this adds except making it harder to write things down. (The insight elsewhere in the thread that you can also do this as an appropriately constructed functional programming language is more useful - at least you can write that down!)


Would going through this course (if one can call it that) help me learn linear algebra, or is it more of a "Here is a different angle for those who get it" type of thing?


The latter IMO.


This is a poor man's Chu spaces: http://chu.stanford.edu


It's a poor man's category theory. The PROPs being manipulated here form a category. I'm not sure how Chu spaces in particular are related, although I don't doubt that they are.


Ok, but where are the string diagrams for Chu spaces?


https://boxbase.org/entries/2019/jul/15/chu-construction/

Strings are the canonical example for Chu spaces.


This page displays some malicious and dodgy ads.



