
Graphical linear algebra: from matrices to monoids - jxub
https://graphicallinearalgebra.net/
======
InclinedPlane
I'll always recommend 3blue1brown's incomparable videos on linear algebra:
[https://www.youtube.com/watch?v=kjBOesZCoqc&list=PLZHQObOWTQ...](https://www.youtube.com/watch?v=kjBOesZCoqc&list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab)

~~~
0x4FFC8F
The best, hands down on YouTube.

------
BoiledCabbage
Really interesting. He also seems to re-derive an implementation of Forth.

In episode 6, right under the diagram for Crema di Mascarpone, he has a
formula.

Does anyone who uses Forth know if it has that circle-plus operator? It seems
to really lend itself to conciseness - and I hadn't come across anything like
it last time I looked at Forth.

[https://graphicallinearalgebra.net/2015/05/06/crema-di-masca...](https://graphicallinearalgebra.net/2015/05/06/crema-di-mascarpone-rules-of-the-game-part-2-and-diagrammatic-reasoning/)

~~~
evincarofautumn
It’s not surprising: there’s a deep relationship among concatenative languages,
Hughes’ arrows, combinator calculus, category theory, linear algebra, logic,
even topology. I haven’t read through this but I guess the ⊕ operator would be
“***” (parallel composition) in arrow notation; in Haskell it has the type
“Arrow a ⇒ a b c → a b′ c′ → a (b, b′) (c, c′)”, or “(b → c) → (b′ → c′) → (b,
b′) → (c, c′)” specialised to the function arrow.
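For concreteness, a minimal GHC sketch of “***” specialised to the function
arrow (the function name here is just illustrative):

```haskell
import Control.Arrow ((***))

-- Parallel composition: run each function on its own component of a
-- pair, the arrow-notation analogue of the diagrams' circle-plus.
scaleAndNegate :: (Int, Int) -> (Int, Int)
scaleAndNegate = (* 2) *** negate
```

So scaleAndNegate (3, 5) gives (6, -5): the left wire is doubled, the right
wire negated, independently and side by side.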

------
proc0
I've been looking for a categorical take on neural networks and have found
nothing, and this might get me started into translating some existing NN
papers to CT.

~~~
danharaj
[https://arxiv.org/abs/1711.10455](https://arxiv.org/abs/1711.10455)

~~~
proc0
Amazing! Thank you. And I'll say, CT is so much better on the eyes than
matrix/linear calculus.

------
Myrmornis
How closely is this related to Penrose's graphical tensor notation?

[https://en.wikipedia.org/wiki/Penrose_graphical_notation](https://en.wikipedia.org/wiki/Penrose_graphical_notation)

~~~
letlambda
I think the major difference is this notation has the count of loose wires
corresponding to dimension, and Penrose notation has the count of loose wires
corresponding to tensor valence.

Here, a diagram with 1 left and 2 right wires corresponds to a 2d vector; in
Penrose's notation it would correspond to a (1,2) tensor that takes
k-dimensional vectors to k×k matrices.

But... they certainly have a similar feel. I wonder if you could build up the
Penrose notation out of this.

------
grenoire
This was absolutely amazing! Reminds me of the lambda calculus a lot.

------
2_listerine_pls
What problem does the graphical representation solve?

~~~
danharaj
The structure is more obviously modular and the algebraic laws become
topological phenomena, which makes it easier to consider more complicated
structures than vector spaces.

~~~
macawfish
You might be interested in Leavitt path algebras!

