I do wish the formulas for the dot product and determinant were derived from the geometric explanation, rather than justified with it afterwards. I always appreciated that in classes.
There are some more advanced topics in lin alg that I would have loved to see get the full visual intuitive treatment when I was learning these things.
- SVD, because it's more general and less pathological than eigenvalue decomposition, and often more useful.
- A linear transformation as consisting of (if I remember right, it's been a little while) a choice of eigenvectors and eigenvalues, "divided" by the extra degrees of freedom that come from repeated eigenvalues.
- The "taxonomy" of normal matrices and the polar decomposition (obviously comes after complex matrices).
And there's a nice visualization of the mechanical algorithm of matrix multiplication that looks way more "plausible" than the usual one: draw your two input matrices and your output matrix on grids on three faces of a rectangular prism, meeting at a corner. Then each entry of the output matrix is the dot product of the two vectors that intersect at that coordinate, and the whole picture only fits together if all the dimensions match up correctly.
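In code, the rule that picture encodes is just "each output entry is a row-of-A dotted with a column-of-B, and the inner dimensions have to agree." A minimal sketch (plain Python, no libraries):

```python
def matmul(A, B):
    """Multiply matrices given as lists of rows: C[i][j] = row i of A . column j of B."""
    n, k = len(A), len(A[0])
    k2, m = len(B), len(B[0])
    # This assert is the "dimensions must match up" part of the prism picture.
    assert k == k2, "inner dimensions must match"
    return [[sum(A[i][t] * B[t][j] for t in range(k)) for j in range(m)]
            for i in range(n)]

A = [[1, 2], [3, 4]]         # 2x2
B = [[5, 6, 7], [8, 9, 10]]  # 2x3
C = matmul(A, B)             # 2x3
print(C)  # [[21, 24, 27], [47, 54, 61]]
```

Passing a 2x3 matrix as both arguments trips the assert, which is exactly the case where the prism can't be drawn with all three grids lining up.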