
Graphical Linear Algebra - guerrilla
https://graphicallinearalgebra.net/
======
spekcular
The short version appears to be that the author has managed to replace
standard linear algebra notation with an alternative (but mathematically
equivalent) graphical notation, which is harder to do computations and write
programs in.

This submission is the latest in a trend of what I would classify as "math-
lite" category theory and Haskell articles that reach the HN front page, which
purport to explain something interesting but end up just rehashing standard
mathematics in opaque ways.

I really wish this would stop. Fortunately (unfortunately?) I've been down
this road a few times and know to avoid getting sucked in, but I can easily
see a bright, curious person wasting a lot of time before realizing that the
content is merely linguistic, not mathematical.

Is (for example) category theory a useful organizational tool in certain
abstract branches of mathematics? Sure (those branches being, basically,
algebraic topology and algebraic geometry). Is it a grand unified theory of
math? No, it's just some useful vocabulary. Most mathematicians go their
entire lives without writing a paper mentioning categories. Getting excited
about category theory is like getting excited about matrix notation – useful,
sure, but not where the meat is.

I also find any claims that category theory is relevant to the average working
programmer to be dubious at best.

(And yes, I realize the graphical notation presented in this article is not
category theory. I am making a broader point about certain kinds of articles I
see, which also applies here.)

~~~
guerrilla
> "math-lite"

The series is intended for a general audience and starts by laying the
groundwork necessary for exposition of this paper: [1]. Article 4 [2] explains
why it is introduced this way.

> end up just rehashing standard mathematics in opaque ways

It's just an explanation of string diagrams [3][4], pretty transparent and
standard.

I find it funny that your response to this is so similar to that of the
abacists to Fibonacci.[5]

[1]. [https://arxiv.org/abs/1403.7048](https://arxiv.org/abs/1403.7048)

[2]. [https://graphicallinearalgebra.net/2015/04/29/dumbing-down-magic-lego-and-the-rules-of-the-game-part-1/](https://graphicallinearalgebra.net/2015/04/29/dumbing-down-magic-lego-and-the-rules-of-the-game-part-1/)

[3]. [https://ncatlab.org/nlab/show/string+diagram](https://ncatlab.org/nlab/show/string+diagram)

[4]. [https://en.wikipedia.org/wiki/String_diagram](https://en.wikipedia.org/wiki/String_diagram)

[5]. [https://graphicallinearalgebra.net/2015/04/26/adding-part-1-and-mr-fibonacci/](https://graphicallinearalgebra.net/2015/04/26/adding-part-1-and-mr-fibonacci/)

~~~
spekcular
I don't believe the exposition is written well (judged as either an exposition
of linear algebra or just the graphical calculus the article develops). But
this is perhaps a subjective point, and others here have already commented on
this in detail.

More importantly, I would like to remark that string diagrams are _not_
standard. The vast majority of mathematicians have never read the definition
of a string diagram (or even a monoidal category), precisely because it is not
standard and not needed for most (any?) useful mathematical work. They also
have obvious disadvantages when compared to the usual notation, as I pointed
out earlier.

When I read articles by category theory boosters, I get the sense (rightly or
wrongly) that they think the world revolves around them and they have stumbled
onto some deep and fundamental truths. This is not the case. It's a niche of a
niche.

~~~
graphlinalg
It's interesting to suddenly get all these hits... I haven't touched the blog
in a long time.

I'm sorry that you didn't find it well-written -- it wasn't written with you
in mind. Originally I wanted to write about my research in a way that was
understandable to a lay person, but I quickly abandoned that and went for the
mythical "second year undergrad" level.

You have pretty strong thoughts about what is "useful mathematical work". Are
you the high priest and decider of the usefulness of mathematics? To be
honest, it almost sounds like some category theorist was super mean to you...

One more thing. Arguments like "niche of a niche" are sociological - just
because something is niche doesn't mean it's not important (or, god forbid,
fun!). Maths is one of the most conservative fields in this sense; what is and
is not considered "standard" is extremely political.

~~~
stablenode
"Are you the high priest and decider of the usefulness of mathematics? To be
honest, it almost sounds like some category theorist was super mean to you..."

Interesting and totally not ad hominem response... I believe Kevin Buzzard (an
actual mathematician) had a few words about this last year:
[https://youtu.be/Dp-mQ3HxgDE?t=1039](https://youtu.be/Dp-mQ3HxgDE?t=1039)

~~~
feanaro
Given the tone of the poster who this reply was for, it seems entirely
warranted.

~~~
stablenode
feanaro: fine, if you think so.

I find it interesting that actual mathematicians working in Category Theory,
such as Tom Leinster (who wrote a lovely little introductory text on Category
Theory [not covering monads though] and made it freely available on arXiv:
[https://arxiv.org/abs/1612.09375](https://arxiv.org/abs/1612.09375)), are able
to engage in polite discussion about the contentious viewpoints on the use of
CT without resorting to personal attacks (see 2-3 minutes of his talk from a
few years back:
[https://youtu.be/UoutGluNVlI?t=410](https://youtu.be/UoutGluNVlI?t=410)).
That stands in sharp contrast to some of the evangelists, whose attitude in
response to criticism often reeks of arrogance and puts people off taking CT
seriously, which I think is a great shame.

------
siraben
When I took a course on linear algebra, the professor introduced the notion of
vector spaces and linear transformations before talking about matrices, and in
fact one can derive the matrix multiplication algorithm from the requirement
that it correspond to composition of linear transformations. I thought the
algebraic notation was very concise and well suited to linear algebra.
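The derivation mentioned above can be made concrete in a few lines: represent a linear map by its matrix, and the matrix of a composite map is then forced, column by column, by requiring that "multiply then apply" agree with "apply in sequence". A pure-Python sketch (the function names are my own, not from any course):

```python
def apply(M, v):
    """Apply a matrix (a list of rows) to a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in M]

def compose(A, B):
    """Matrix of the composite map v -> A(B(v)).

    Column j of the result is A applied to column j of B; unfolding this
    recovers the usual rule C[i][j] = sum_k A[i][k] * B[k][j].
    """
    n = len(B[0])
    cols = [apply(A, [row[j] for row in B]) for j in range(n)]
    return [[cols[j][i] for j in range(n)] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
v = [5, 7]
# The specification: the product matrix acts like the composed maps.
assert apply(compose(A, B), v) == apply(A, apply(B, v))
```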

While new notation like this is interesting, I do wonder how useful it is from
a pedagogical standpoint for linear algebra. Category theory, OTOH, is one of
the areas of mathematics where diagrams rule over equations, because a single
diagram can encode so many equations.

For an excellent introduction to linear algebra I recommend Linear Algebra
Done Right[0], which Springer offered for free for a while (it doesn't seem to
be free anymore, though; contact me if you want the PDF).

[0] [http://linear.axler.net/](http://linear.axler.net/)

------
Tainnor
Regardless of the usefulness of category theory, the author seems to imply
that "traditional linear algebra" is about matrices.

This is not true. This may be true from an engineering perspective, or maybe
if you've studied in the US, but over here, linear algebra starts with fields,
vector spaces, homomorphisms. We establish pretty early on that matrices and
linear transformations are in essence the same thing (up to choice of basis),
and then we go on to prefer coordinate-free proofs. But, of course, we do
discuss the computational aspect (i.e. the Gauss algorithm) because this is
exactly what makes linear algebra so useful: many questions can be answered
extremely efficiently.
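As a small illustration of the computational side mentioned above, here is a bare-bones Gaussian elimination solver for A x = b in pure Python. This is only a sketch (partial pivoting, no rank or singularity handling), not a production routine:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    # Build the augmented matrix [A | b], copying rows so A is untouched.
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        # Partial pivoting: bring the row with the largest pivot up.
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        # Eliminate the entries below the pivot.
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            M[r] = [x - f * y for x, y in zip(M[r], M[i])]
    # Back-substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# 2x + y = 5 and x + 3y = 10  ->  x = 1, y = 3
assert [round(v, 9) for v in solve([[2, 1], [1, 3]], [5, 10])] == [1, 3]
```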

Whether you want to frame things in the language of category theory or not, is
up to anyone, but the "standard mathematical treatment" is quite beautiful and
intriguing already (if I want to feel inspired, I open up my copy of Halmos's
"Finite Dimensional Vector Spaces", not necessarily the best introduction for
someone who is new, but very beautifully written for someone who has already
seen linear algebra).

~~~
wenc
Just curious, what country would this be? I know many European countries favor
a more theoretical approach.

I definitely learned linear algebra from the numerical and matrix perspective
(though yes, we did cover the idea of vector spaces and linear transformations
too). In engineering school, the focus tended to be on the craft rather than
the theory, and in retrospect I think it was the right approach for
engineering majors, whose primary concern was execution. It helped engineers
use MATLAB to solve problems. The syllabus looked like this:

[https://ocw.mit.edu/courses/mathematics/18-06sc-linear-algebra-fall-2011/syllabus/](https://ocw.mit.edu/courses/mathematics/18-06sc-linear-algebra-fall-2011/syllabus/)

And this:

[https://math.nyu.edu/media/math/filer_public/a4/00/a4008ffa-f5b0-48df-9b76-357d32397009/math-ua_140_fall_2019.pdf](https://math.nyu.edu/media/math/filer_public/a4/00/a4008ffa-f5b0-48df-9b76-357d32397009/math-ua_140_fall_2019.pdf)

A syllabus for math majors might look quite different.

The US approach to linear algebra continued to be useful for me in grad school
as an engineering major (I have an engineering Ph.D.). To be honest, for what
I was/am doing, I don't think I would have derived much benefit from a more
theoretical approach (but that's just me. People who work closer to theory may
have a different opinion).

In my work, math has a retrospective role -- I first build something and later
recognize patterns that fit a certain theory.

For theory-builders, however, math has a constructive role. Theory-builders
are relatively rare, though.

~~~
beagle3
I studied engineering with a European-style curriculum, which starts with
groups, rings, vector spaces, linear transformations, eigenvalues, etc., and
only then proves that matrices can represent any linear transformation between
finite-dimensional spaces, mostly going on from there. It did seem roundabout
at the time.

But then, _everything_ was using the same terms, and was relatively simple and
straightforward. Fourier transforms? Laplace transforms? They are all just
linear transformations. Functional analysis? It's a lot of inner products,
Hermitian forms and eigensystems, but we were familiar with all the properties,
so we only concerned ourselves with what's _different_ about it (e.g. the
spectral theorem). Coding theory? It's finite fields; we did those in linear
algebra, now it's just applications. Stationary distributions of Markov
processes? An application of linear algebra plus probability, but it works
mostly the same whether the process is discrete (and representable by a
matrix) or continuous (when it isn't).

The syllabus for engineering students was basically the same as for math
students, except we only covered proofs in class & homework (but did not have
to be able to reproduce them in tests), and that we mostly went through a
simple logical progression of ideas rather than a historical one (e.g.,
Cauchy's theorem on analytic complex functions was essentially a two-line
application of Green's theorem in the plane, and we spent ~30 minutes
discussing it; math students spent ~6 weeks proving it the way Cauchy
historically did, progressing from simpler to more complex structures).

------
CharlesMerriam2
Meh. Sorry.

Start with [Essence of Linear Algebra](https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab)
by [3Blue1Brown](https://www.youtube.com/channel/UCYO_jab_esuFRV4b17AJtAw)
for an intuitive overview of all the major concepts.

While working through a full linear algebra course, it is important to develop
your own software as you go: some sort of command-line tool to enter matrices
and run operations on them. As you progress through the course, you will keep
adding operations.

You should work on some of the most common tricks, like using 4 by 4 matrices
for computer graphics and understanding what a PCA does.
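A minimal pure-Python sketch of the kind of grow-as-you-go toolkit described above, including the 4x4 homogeneous-coordinate trick from computer graphics. The function names and the choice of operations are illustrative, not from any particular course:

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def translation(dx, dy, dz):
    """4x4 translation matrix in homogeneous coordinates:
    the last column carries the offset, so an affine move
    becomes a plain matrix multiply."""
    T = identity(4)
    T[0][3], T[1][3], T[2][3] = dx, dy, dz
    return T

point = [[1], [2], [3], [1]]              # column vector with w = 1
moved = matmul(translation(10, 0, 0), point)
assert moved == [[11], [2], [3], [1]]
```

From here, each new topic in the course (determinants, inverses, eigenvalue iteration, and eventually PCA on top of them) becomes one more function added to the same file.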

~~~
s1t5
As much as I like 3blue1brown, watching those videos is not a good way to
learn. It's maybe a good way to get some intuition about a topic and much
closer to the "infotainment" category. A good textbook + solving problems +
maybe writing code works much better for actual learning.

------
edflsafoiewq
The usual notation for a linear transformation

    X' = aX + bY
    Y' = cX + dY

minimizes irrelevant information. But the diagram for one seems to bring it to
the fore: a diagram is basically a system of fully parenthesized, unsimplified
expressions, e.g.

    X' = (3X + (2(X+4Y) - 6X)) + Y
    ...

In fact, it contains even more information than that, since it also tells you
if, in (X+Y)+(X+Y), you are supposed to compute (X+Y) once and reuse the value
or compute it twice.
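As a concrete check of that point, here is the example wiring evaluated both ways in Python: once reusing the shared sub-expression, and once in collapsed form. The simplified right-hand side (-X + 9Y) is my own algebra, not from the article:

```python
def wired(X, Y):
    """Evaluate X' = (3X + (2(X+4Y) - 6X)) + Y as wired:
    the sub-wire X + 4Y is computed once and reused."""
    s = X + 4 * Y
    return (3 * X + (2 * s - 6 * X)) + Y

def simplified(X, Y):
    """The same map after collapsing: X' = -X + 9Y."""
    return -X + 9 * Y

# The two readings agree on a grid of sample inputs.
assert all(wired(x, y) == simplified(x, y)
           for x in range(-3, 4) for y in range(-3, 4))
```

Whether the shared wire is computed once or twice changes the evaluation strategy the diagram describes, but not the linear map it denotes.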

~~~
tel
Depends on what you mean by relevant information. GLA exposes the many
compositional natures of linear transformations. The fact that there is a
compact matrix representation is a wonderful treat.

~~~
edflsafoiewq
Can you be more concrete?

~~~
tel
A lot of category theory is just about composition, how different structures
compose and how compositions of those structures continue to be composable.

So in this case, GLA builds the theory of linear transformations from
trivial/simple pieces and their various ways of being composed. It also
discusses the mechanisms for proof and reduction (you noted how GLA is clear
about whether a sub-computation is reused or not, it'll also be clear about
how those two choices are equivalent).

So it's really not studying "just" linear transformations (which are, in the
finite-dimensional case, summarized by a matrix of numbers/field elements) but
also the theory of their construction, manipulation, and simplification. It
gives you a rich language for talking about how two linear combinations might
be related to one another, something that's more challenging to access from a
matrix.

~~~
edflsafoiewq
Well, that wasn't very concrete. How about this: can you give an example of a
linear-algebraic fact that is more easily shown by reasoning about these
diagrams than by just using the numbers-moving-along-wires interpretation to
immediately turn it into regular equations?

For example, the only thing I can think of is maybe (AB)^T = B^T A^T.
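That identity, at least, is easy to sanity-check numerically with a throwaway pure-Python sketch (the helper names are mine):

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def T(M):
    """Transpose a matrix."""
    return [list(col) for col in zip(*M)]

A = [[1, 2, 3], [4, 5, 6]]          # 2x3
B = [[7, 8], [9, 10], [11, 12]]     # 3x2
# (AB)^T = B^T A^T
assert T(matmul(A, B)) == matmul(T(B), T(A))
```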

------
enriquto
It would be nice to have a single-page complete summary of this notation. The
whole text is way too verbose and boring (at least the first two chapters).

~~~
guerrilla
There's a paper linked to here which he says contains all the "spoilers":
[http://arxiv.org/abs/1403.7048](http://arxiv.org/abs/1403.7048)

> If you are impatient and want to get to the conclusion quickly, here’s the
> link to our paper where the basic theory is explained (Spoiler warning!). We
> will also discuss several applications of graphical linear algebra that have
> been developed in recent years.

------
ianhorn
Diagrammatic notations seem to pop up all over the place, but in my mind they
all share a difficulty that interesting complex graphs tend to not be planar
(can't be drawn without criss-crossing lines). That makes it hard to organize
in a skimmable/glanceable way. I'd love to see what a future with ubiquitous
3D volumetric displays or VR or whatever could do for diagrammatic notations.
You might suddenly be able to organize larger diagrams and I wouldn't be
surprised if diagrammatic notations became more useful and started popping up
more in that hypothetical future.

------
adamnemecek
I would also invite people to check out Chu spaces
([http://chu.stanford.edu/](http://chu.stanford.edu/)). They are essentially
matrices with closure under norm (meaning you always have an inverse). This
puts them on the same side as probability, linear logic, C*-algebra, and
constructive mathematics.

------
gosukiwi
As a software developer, I remember basically nothing about Linear Algebra
from my time at University :( Besides graphics, what are some applications for
linear algebra in software?

~~~
categorybooks
I believe you'll find some applications here:
[https://codingthematrix.com/](https://codingthematrix.com/)

~~~
gosukiwi
Wow, that's exactly what I was asking for :) Thanks!

------
jovial_cavalier
This is not nearly interesting enough to justify the Stephenson-esque asides
on "English snobbishness" and the bits about whether Fibonacci had to apply
for a grant for Arabic numerals.

------
rurban
I loved the Makelele story in the intro. I hardly remember him, and didn't
recognize that Real was so much worse without him. Linalg really is the most
underappreciated technique.

------
da39a3ee
I would like to read this but I really need a version without the verbiage and
humour. I'm really not convinced it's necessary to try to make these things so
"accessible".

~~~
guerrilla
Then read the paper that it's based on which is linked to by the blog and my
other comments.

~~~
da39a3ee
Thanks, I didn't mean to offend. Which is the best starting point? Based on
the blog post I was hoping to get some introduction by way of familiar
concepts such as linear algebra and homomorphism, before confronting
applications to theoretical computer science. Anyway, is the best starting
point "Diagrammatic Algebra: From Linear to Concurrent Systems" or "The
Calculus of Signal Flow Diagrams I: Linear relations on streams" or something
else?

------
deltron3030
Reminds me of petri nets.

~~~
jshholland
Not entirely a coincidence - see this paper (coauthored by the guy behind the
blog in the OP):
[https://dl.acm.org/doi/10.1145/3290338](https://dl.acm.org/doi/10.1145/3290338)

~~~
deltron3030
And coauthored by you obviously :)

Thanks a bunch, very interesting. I've stumbled upon petri nets as a state
machine alternative when looking for graphical modeling tools for functional
programming (I'm a designer learning Elixir). Here's a talk:
[https://www.youtube.com/watch?v=aWnGPaputGE](https://www.youtube.com/watch?v=aWnGPaputGE)

------
CodeArtisan
There is also
[http://immersivemath.com/ila/index.html](http://immersivemath.com/ila/index.html)

------
ouroboros1
This looks interesting! I get the impression there would not be much friction
to implement some of these mechanisms in Haskell.

~~~
solomonb
I think it would be a challenge without dependent types.

------
mlwhiz
Linear algebra is often taught so poorly. So I am happy someone took the time
to create such relevant content. Thanks.

