
String Diagrams for Linear Algebra - saroyas
https://graphicallinearalgebra.net/2015/06/09/matrices-diagrammatically/
======
KKKKkkkk1
_We have established that there is a homomorphism called θ from B, the PROP of
diagrams, to Mat, the PROP of matrices._

I've never seen such a complex subject as linear algebra presented with such
clarity and simplicity.

~~~
lonelappde
Are you being sarcastic?

~~~
keithalewis
Ya think? I like how Werner Heisenberg (re)invented matrix multiplication. If
e_ij represents a transition from i -> j and e_ij e_kl = e_il if j == k and 0
otherwise, then A = sum_ij a_ij e_ij represents the matrix A = [a_ij] and AB,
computed with this rule, is matrix multiplication.
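A quick numeric check of that rule (my own numpy sketch; the helper `e` and the variable names are mine, not from the comment):

```python
import numpy as np

n = 3

def e(i, j):
    """Matrix unit e_ij: a 1 in row i, column j, zeros elsewhere."""
    m = np.zeros((n, n))
    m[i, j] = 1.0
    return m

# Heisenberg's rule: e_ij e_kl = e_il if j == k, else 0.
assert np.array_equal(e(0, 1) @ e(1, 2), e(0, 2))
assert np.array_equal(e(0, 1) @ e(2, 2), np.zeros((n, n)))

# Expanding A = sum_ij a_ij e_ij and multiplying term by term with
# this rule reproduces ordinary matrix multiplication.
rng = np.random.default_rng(0)
A = rng.integers(0, 5, (n, n)).astype(float)
B = rng.integers(0, 5, (n, n)).astype(float)
AB = sum(A[i, j] * B[k, l] * e(i, l)
         for i in range(n) for j in range(n)
         for k in range(n) for l in range(n)
         if j == k)
assert np.array_equal(AB, A @ B)
```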

------
edflsafoiewq
Why this episode, OP? IMO the proof of fullness is quite belabored. In fact the
very next episode gives a much better one-page proof in the section starting
"Here’s what we mean by this rather cryptic remark". It's faithfulness that's
more interesting, i.e. are the graphical rules sufficient to prove that any two
diagrams that produce the same outputs given the same inputs are equal, i.e. is
the list of graphical rules complete?

A string diagram is also equivalent to a program in a generalized-stack
language (in the stronger sense of equivalence up to differences in the
topological arrangement of the wires, not up to the full graphical rules).
Unlike a pure stack language, you can operate on values at any point in the
stack, not just the top. The four operations are: drop an element from the
stack; push a zero onto the stack; pop two elements and push their sum; pop
one element and push two copies of it. Obviously any program in this language
computes a linear function of its initial stack (there is a full homomorphism
to matrices). A stack language captures the same ability of an input to be
consumed that a string diagram does.
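To make this concrete, here is a toy interpreter for such a language (entirely my own sketch; the op names, the depth-index convention, and `matrix_of` are assumptions, not from the comment). Running a program on each basis vector recovers the matrix of the linear map it computes:

```python
def run(program, stack):
    """Execute a list of (op, i) pairs; i is a depth in the stack (0 = top)."""
    s = list(stack)
    for op, i in program:
        if op == "drop":      # remove the element at depth i
            del s[i]
        elif op == "zero":    # insert a zero at depth i
            s.insert(i, 0.0)
        elif op == "add":     # replace elements at depths i, i+1 by their sum
            s[i:i + 2] = [s[i] + s[i + 1]]
        elif op == "dup":     # replace the element at depth i by two copies
            s[i:i + 1] = [s[i], s[i]]
    return s

def matrix_of(program, n_in, n_out):
    """Extract the matrix of a program by running it on each basis vector."""
    cols = [run(program, [1.0 if k == j else 0.0 for k in range(n_in)])
            for j in range(n_in)]
    return [[cols[j][i] for j in range(n_in)] for i in range(n_out)]

# Example: (x, y) -> (x, x + y)
prog = [("dup", 0), ("add", 1)]
assert run(prog, [3.0, 4.0]) == [3.0, 7.0]
assert matrix_of(prog, 2, 2) == [[1.0, 0.0], [1.0, 1.0]]
```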

~~~
carapace
> Unlike a pure stack language, you can operate on values at any point in the
> stack, not just the top.

There's a language, Joy, that has combinators that allow you to work deeper in
the stack, e.g. "dip":

    
           a b c [F] dip
        -------------------
              a b F c
    
        dipd == [dip] cons dip
    
        a b c [F] dipd
        a b c [F] [dip] cons dip
        a b c [[F] dip]      dip
        a b    [F] dip c
        a       F    b c
    

And so on...

I think it's a "Category Theory" paradigm language, whatever that might mean.

Joy resembles the point-free form of Haskell from Conal Elliott's "Compiling
to categories"
[http://conal.net/papers/compiling-to-categories/](http://conal.net/papers/compiling-to-categories/)

------
saroyas
Matrices, although great for computers, are often complicated and unintuitive.
This blog was my first insight into an alternative symbolic notation for
linear algebra. It has often let problems and complicated equations be
represented far more simply, and it has taught me to be more flexible and
creative with my own notation.

~~~
yorwba
Agreed. The graphical notation elegantly demonstrates that a lot of symbolic
manipulation just boils down to pushing parts of a formula around until they
fit a specific pattern, so that you can apply an identity and replace it with
a different pattern.

------
antpls
Related: a recent paper (2019) from Google introducing TensorNetwork and the
graphical notation:
[https://arxiv.org/abs/1905.01330](https://arxiv.org/abs/1905.01330)

Blog post from Google:
[https://ai.googleblog.com/2019/06/introducing-tensornetwork-open-source.html?m=1](https://ai.googleblog.com/2019/06/introducing-tensornetwork-open-source.html?m=1)

and an official website about the diagrams :
[https://tensornetwork.org/](https://tensornetwork.org/)

------
rudedogg
Without an ad blocker, this page redirects to a malware landing page for me.

~~~
MaysonL
Ditto.

------
Myrmornis
Is this (closely) related to Penrose tensor notation
[https://en.wikipedia.org/wiki/Penrose_graphical_notation](https://en.wikipedia.org/wiki/Penrose_graphical_notation)?

~~~
OscarCunningham
Yes. The difference is that here the operation represented by putting wires
next to each other is the direct sum, whereas in the Penrose notation the
operation is the tensor product. They're both examples of the string diagram
notation for monoidal categories.
([https://forum.azimuthproject.org/discussion/2335/lecture-73-chapter-4-string-diagrams-and-strictification](https://forum.azimuthproject.org/discussion/2335/lecture-73-chapter-4-string-diagrams-and-strictification))
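For concreteness, here is what the two monoidal products look like on matrices (a numpy sketch of my own; the variable names are mine):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

# Graphical linear algebra: wires side by side = direct sum (block diagonal).
direct_sum = np.block([[A, np.zeros((2, 2))],
                       [np.zeros((2, 2)), B]])

# Penrose notation: wires side by side = tensor (Kronecker) product.
tensor = np.kron(A, B)

assert np.array_equal(direct_sum[:2, :2], A)   # A acts on the first 2 dims
assert np.array_equal(tensor[:2, :2], A[0, 0] * B)
```

Both constructions are monoidal products on the category of matrices; the string-diagram calculus itself is agnostic about which one the parallel composition of wires denotes.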

------
joe_the_user
So,

I looked at this page, the whole site, and even slept on the questions
involved. My experience as a student, and from doing a bit of math tutoring
and a bit of math teaching, is that linear algebra is one of the simplest
"real" math courses a student encounters. Matrices are elaborate, but they're
just repeatedly doing something pretty simple, and the geometric intuition of
matrices as translation and rotation of linear spaces is both useful and one
of the easier intuitions to acquire among "basic" higher math intuitions.

Now, this page comes along with the overall message "Wow, linear algebra is
hard to understand, here's a simpler way to approach it" and then [ _stream of
stuff no undergraduate could possibly get and that I struggle with despite
having an MA in math_ ] (also a structure that's not at all related to the
concrete geometric interpretation of linear algebra). And, after scanning the
thing's structure, I have at least come away with what I think the motivation
is.

The thing that's "complicated" and "unintuitive" to our author is that the
development of linear algebra begins with operations on individual vectors,
describing the mechanics in detail. It then jumps to the qualities of sets of
vectors and how one subspace can be orthogonal to another.

So, basically, it takes informal set theory as implicit scaffolding for the
entire theory. And if you are a mathematician (or one kind of mathematician),
set theory is a grungy, undesirable thing. The mechanics of computation are
also grungy. I'm tempted to say everything is grungy until you reach a supreme
level of abstraction. But the computation-plus-sets approach definitely
constructs the structure "bottom up". What the author is doing instead is
constructing things top down. Combining algebraic operations as strings
constructs the equivalent of subspaces at the same time it describes what
addition "is". Everything gets constructed from one "piece".

I feel like I can see the appeal of this, but I can't actually see it
producing anything useful to me at the levels of abstraction I like to work
with.

It also makes me think that set theory is a bit like the object orientation of
mathematics. It's a messy, do-anything, constructive exercise. I can
understand the urge of someone who likes their foundations to be enlightening
to base their structures on things of great apparent elegance. Perhaps I'm
pessimistic enough to believe that one is always going to face "grunge".

------
abrax3141
Dunno. Matrix algebra seems perfectly intuitive to me. I don’t get what this
adds except making it harder to write things down. (The insights elsewhere
that you can also do this as an appropriately constructed functional
programming language are more useful; at least you can write that down!)

------
wodenokoto
Would going through this course (if one can call it that) help me learn linear
algebra, or is more of a "Here is a different angle for those who get it" type
of thing?

~~~
edflsafoiewq
The latter IMO.

------
adamnemecek
This is a poor man's Chu spaces
[http://chu.stanford.edu](http://chu.stanford.edu)

~~~
mathgenius
Ok, but where are the string diagrams for Chu spaces?

~~~
adamnemecek
[https://boxbase.org/entries/2019/jul/15/chu-
construction/](https://boxbase.org/entries/2019/jul/15/chu-construction/)

Strings are the canonical example of Chu spaces.

------
Andr1
This page displays some malicious and dodgy ads.

