In one of the last videos in the (relatively short) series, he discusses eigen-*:
~'eigen-stuffs are straight-forward but only make sense if you have a solid visual understanding of the pre-requisites (linear transformations, determinants, linear systems of equations, change of basis, etc.). Confusion about eigen-stuffs usually has more to do with a shaky foundation than the eigen-things themselves'
All of the videos in the series, including this later one on eigen-things, focus on animations to show what the number crunching is doing to the coordinate system.
He's the creator I support the most on Patreon: https://www.patreon.com/3blue1brown
(edit: ugh, contains an ad for MacKeeper)
Incidentally, that particular grammatical quirk is one I've seen a number of times. Is it a thing for people coming from Russian? Or just general difficulty with remembering the order of the two subjects in two-subject phrases?
Any recommendations? I would love to look into channels that you think are like 3Blue1Brown but in other subjects (natural sciences, history, art etc.).
PBS Space Time and Eons are both awesome:
* PBS Space Time, covers cosmology and quantum physics:
* PBS Eons, for geology and paleontology:
* Smarter Every Day, Destin's enthusiasm is contagious:
* Extra Credits, whose various topics (video game design, history, and recently the history of sci-fi) are great:
* Today I Found Out is just on this side of clickbaity, and is this age's "Ripley's Believe It Or Not", but still interesting and more importantly well researched:
* Crash Course, of course:
* Gaming Historian, for some insight into the making of systems that formed my (and earlier) childhood:
* Minute Physics, whose latest few videos made Special Relativity understandable to this peon:
* Practical Engineering, for some insight into civil engineering topics that we take for granted:
* Real Engineering, for insight into various other mechanical engineering topics:
* Standup Maths, whose host Matt Parker was the first to make maths approachable for me again (before 3Blue1Brown took the lead):
* Steve Mould, who covers various topics both mathematical and physical. You may have seen that gif of him demonstrating the "levitating" siphoning "pearl necklace" (also a friend of Matt Parker, above):
* The 8-Bit Guy, for some history of early home computer systems:
* Numberphile, the second-greatest math channel (after 3Blue1Brown), whose recent video finally made me take the Golden Ratio seriously rather than dismissing it as an architectural gimmick/conspiracy theory:
* Mathologer, another good math channel (but I must sheepishly admit I prefer 3Blue1Brown... sensing a pattern here?):
* Periodic Videos, for chemistry and physics, often featuring the iconic Dr Martyn Poliakoff:
* NileRed, for some homegrown chemistry; I particularly appreciate the candor of the approach and results:
Not quite as "educational", but still very, very good:
* Every Frame A Painting, now finished, but a great explanation of what makes good cinematography:
* NoClip, long-form documentaries about the making-of video games. Danny O'Dwyer is a treasure:
Apologies for the link spam, this list turned out longer than I expected as I went down my subscriptions, and I've probably missed a few worthy ones!
Final edit:
I discovered many of these channels through referrals from others I was watching, including the Twitter feeds of their authors. Turns out the educational landscape on YouTube is a well-connected graph!
I like tinkering with things so I have some mechanical / electrical engineering / makers on my list as well.
I've listed my favorites from that list here:
* Tom Scott - Lots of interesting unique topics related to engineering, programming, and history
* Colin Furze - Lots of crazy welding projects
* Mark Rober - Ex-NASA engineer, similar to Smarter Every Day
* Hacksmith - Interesting project build videos, normally related to Marvel comics
* LiveOverflow - Web / binary hacking with great case studies
* Wendover Productions - Great documentaries on things like airplane and shipping logistics
* Strange Parts - Ex-programmer who sources components and parts overseas
Also, a small rant about YouTube's recommendation engine: it doesn't take my subscriptions into account well enough, offering many videos from channels I don't subscribe to and not enough from those I do. Does it work better for you with your firehose of subscriptions?
I tend to have a really bad habit of subscribing to things I don't really need to. I use subscribing more as a way to "bookmark" channels I like. I do the same with GitHub; I'm the type of person that has 700-ish repos bookmarked.
I do put alerts on channels I know won't constantly spam me with too many videos (e.g. Smarter Every Day, Wendover Productions, Mark Rober, Tom Scott, Primitive Technology). Usually I watch every video that gets released there.
I don't put alerts on channels that are good quality but high-volume (e.g. Casey Neistat, Tested, Vsauce, etc.). Normally I just binge-watch these in one sitting.
I don't put alerts on any programming or shiny-tech channels, like tutorials on how to use React. I go to YouTube to get away from all of that, and I'd naturally seek those videos out with search terms anyhow.
I hit the like button on every video I've watched that doesn't suck, and leave comments to timestamp videos. I dislike videos that are poorly explained or didn't work for me (these are usually "how to install xyz" videos). This way I can instantly know whether I've seen a video before and found it useful, or, when debugging something years later (IT-type stuff), whether it didn't work out.
If I liked a video, I check whether the top comment is mine.
Basically, I use Google and YouTube as my own personal search repository. It's like forking a GitHub repo and adding annotations; I treat it the same way. If I'm going to Google something in the end nine times out of ten anyway, I might as well make the process easier next time.
I'm still looking for a better YouTube UX that doesn't suck. Haven't found it yet.
Here's how I'd like to have it organized:
1. High-quality, low-spam videos first, up top in the feed.
2. Everything else, sorted by video category and/or playlist.
3. General recommendations from YouTube.
Also his new channel DONG ("Do Online Now Guys"... wut?):
Also I should've mentioned Veritasium:
Neither of those subjects really clicked with me until I could visualize it in a 2D / 3D representation
I’ve written a bunch of scientific data analysis code. I have a science PhD. I’ve written large image analysis pipelines that worked as well as the state of the art, been published, etc.
For the most part I’ve found basic math and heuristics to be good enough. Every so often I go relearn calculus. But honestly, none of this stuff ever seems to come in handy. Maybe it’s because most of what I encounter is novel datasets where there’s no established method?
I reasonably regularly pick up new discrete methods, but the numerical stuff never seems super useful...
I don’t know, just a confession I guess... it never comes up on interviews either for what it’s worth.
First, linear transforms map spheres to ellipsoids. For a symmetric transform, the axes of the ellipsoid are the eigenvectors (for a general transform they're the singular vectors).
Second, linear transforms map (hyper)cubes to parallelepipeds. If you start with a unit cube, the volume of the parallelepiped is the (absolute value of the) determinant of the transform.
That more or less covers covariances, PCA, and change of variables. Whenever I try to understand or re-derive a fact in probability, I almost always end up back at one or the other fact.
They're also useful in multivariate calculus, which is really just stitched-together linear algebra.
The symmetric case is by far the most relevant for probability theory though.
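Both facts are easy to poke at in NumPy. A toy check for the symmetric case (the matrix here is chosen arbitrarily for illustration):

```python
import numpy as np

# A symmetric (covariance-like) matrix: its eigenvectors are the
# axes of the ellipse that the unit circle maps to.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eigh(A)  # eigh: for symmetric matrices
print(eigvals)  # axis lengths of the ellipse: [1. 3.]

# The determinant is the product of the eigenvalues: the factor by
# which the transform scales areas (volumes in higher dimensions).
print(np.linalg.det(A))  # 3.0 = 1.0 * 3.0
```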
(1, 0, 0)
(0, 1, 0)
(0, 0, 1)
To find out where those axes are after a 3x3 matrix transform, you just read off the first, second, and third columns of the matrix respectively. Then you can mentally visualize another unit cube in the new coordinate system using those three vectors as the edges of the cube.
Really basic change-of-basis stuff, but academic lectures don't emphasize how useful it is to be able to look at a matrix of numbers and immediately know what it does.
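A quick NumPy illustration of the column-reading trick (the matrix is an arbitrary example):

```python
import numpy as np

# The columns of a matrix are the images of the standard basis vectors:
# they tell you where the unit cube's axes land after the transform.
M = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 1.0]])

e1, e2, e3 = np.eye(3)  # the unit cube's axes

print(M @ e1)  # first column:  [2. 0. 0.]
print(M @ e2)  # second column: [0. 3. 0.]
print(M @ e3)  # third column:  [1. 0. 1.]
```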
They're in the same class as logarithms and Fourier transforms IMHO. You won't need to calculate them by hand, but you should know what they do and why they're important.
If you just want to be a statistical script kiddie you do you. :)
Right now I'm taking a mathematical modeling course. Still, the only use case I've ever found was... other courses! I'd already modeled a little bit in a biology course. Sure, in real life I could model this or that, but the truth is that a very rough estimate guided by experience and "feeling" has always been enough. There are too many variables that cannot be accurately measured, so building a nice model is kind of useless.
For example, I was just asked today about the performance of the crypto-hash-connected data storage and exchange library I wrote. Now that sounds like something I could model! Only experience tells me that's useless. The only worthwhile answer is to set up a concrete scenario, with a concrete app using it, concrete network and concrete systems, and test it. Could be anything from smartphones to well-connected servers. Sure I could create a sophisticated model and simulation - and it would be useless.
Maybe I'm just a bit, or more than just a bit, disappointed that all the considerable amount of math I learned in my life didn't seem to be of nearly as much use as I would have hoped. I'm also frustrated each time such a topic comes up and everyone is so excited about how great it is, and I always feel like I'm missing something despite trying hard, like the color blind guy looking at paintings. I mean the usefulness to me, not understanding it.
IMHO, it comes down to individual beliefs about mathematical realism. Is there anything inherently real about math, or is it just a man-made, arbitrary set of cognitive tools? Is it valid to presume the existence of a Grand Mathematical Framework that can solve any problem a priori? Or, is every problem unique and independent of mathematical developments?
From the little I've read about math history, it seems pretty clear that the Problems came first, and the Mathematics followed. Infinitesimal calculus, game theory, etc. were mathematical ideas developed primarily to solve real problems. Then 20th-century formalism came along and rebranded much of mathematics under a "clean" framework, while giving little attention to the human environment in which much of it was developed.
To me, it is a great shame that abstract mathematical concepts are made further abstract (e.g. in math education) by distancing them from their human roots. Instead of forcing oneself to understand this mathematical "new testament", I think it's far more productive to adopt this sort of irreverent attitude towards math as you describe.
>"As far as the laws of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality."
I mean, I can and have coded it, but I can't be sure how it will behave in all situations.
So I can understand these papers, I'm going back to study math properly. I'm not fully convinced it's really needed (though how could I tell?), but I'm fully convinced it's needed to understand the papers.
Math is the Latin of CFD.
The most useful info I learnt at university were a couple of equations: Bernoulli's (for general observation about expected pressure drop), Ergun's (for flow through packed beds) and the general laws of thermodynamics.
Those are mostly enough to be able to sketch out an intuitive 'guess' about expected behavior in a large range of systems and the underlying math is not particularly demanding.
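For instance, a back-of-the-envelope Bernoulli check is only a couple of lines (the numbers here are made up: water flowing into a narrower section of a level pipe, no losses):

```python
# Bernoulli along a level streamline, incompressible, no losses:
#   p1 + 0.5*rho*v1**2 = p2 + 0.5*rho*v2**2
rho = 1000.0       # water, kg/m^3
v1, v2 = 2.0, 4.0  # m/s: flow speeds before/after the constriction

dp = 0.5 * rho * (v2**2 - v1**2)  # pressure drop p1 - p2, in Pa
print(dp)  # 6000.0 Pa, i.e. a 6 kPa drop across the constriction
```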
You guess ("manufacture") a solution to the PDE. Then you plug it into the PDE. It won't be correct, so you just add source terms to make it correct. You now have an analytical solution for this PDE with those source terms.
You can now run your simulation (with those source terms) and compare it with the "correct" solution.
Unfortunately, you'll introduce discretization error, because your delta x and delta t aren't infinitesimal. But from the order of accuracy of your discretization method, you know how the error should change with delta x and delta t. So you graph error against delta x on log-log axes and check whether the slope matches the order of accuracy. Apparently this shows up even minor bugs in your code really well.
This approach scares me a little, because how do I know my math is correct? I need a better appreciation of how the order of accuracy interacts over time as well as space, the behaviour of error summaries, and how log-log plotting works.
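Not real CFD code, but the log-log convergence check is easy to try on a toy problem. A sketch using a 2nd-order central difference on a known function:

```python
import numpy as np

# Observed-order-of-accuracy check: a 2nd-order central difference
# approximating d/dx of sin(x), compared against the exact cos(x).
def deriv_error(dx):
    x = np.arange(0.0, 2 * np.pi, dx)
    approx = (np.sin(x + dx) - np.sin(x - dx)) / (2 * dx)  # central diff
    return np.max(np.abs(approx - np.cos(x)))              # max error

dxs = np.array([0.1, 0.05, 0.025, 0.0125])
errs = np.array([deriv_error(dx) for dx in dxs])

# On a log-log plot, the slope of error vs. dx is the observed order.
slope = np.polyfit(np.log(dxs), np.log(errs), 1)[0]
print(slope)  # ≈ 2, matching the scheme's theoretical order
```

If a bug drops the observed slope below the theoretical order, you know something is wrong even when the answers "look" plausible.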
I'd be pretty nervous riding in an airplane that didn't use modern control theory, going over a bridge that didn't use FEA, or riding in a self-driving car that ran on a Raspberry Pi instead of an RTOS...
My feeling is having a basic knowledge of testing/caching/memory management is way more useful when you're doing large image analysis.
I really would like to find an application in my work, because without that I find new techniques don’t really stick and after a few months I forget them...
Forgetting stuff you don't use is pretty normal, the important thing is to be able to recognize when a technique you don't remember the details of might be applicable, and to know where to look to refresh your memory.
Usually I go through the code and ask "what am I trying to do here" and "can I do this a better way". A lot of the aforementioned speedups have come because the previous developer was obviously trying to do something, like create a linear projector, but was following some math formula literally, and so created a bunch of extraneous, huge matrices.
It's a simple fix, but adds up when you're dealing with massive datasets.
Just off the top of my head, you could have encountered eigenvectors/eigenvalues:
- if you've ever used spectral graph algorithms
- if you've ever done dimensionality reduction via principal component analysis
- if you've ever calculated the steady-state distribution of a Markov chain
A fun book on this is https://openlibrary.org/books/OL2398351M/The_algebraic_eigen...
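The Markov chain case is a nice one to try by hand. A toy two-state example (transition matrix made up for illustration, columns summing to 1):

```python
import numpy as np

# The steady state of a Markov chain is the eigenvector of the
# transition matrix for eigenvalue 1 (columns sum to 1).
P = np.array([[0.9, 0.5],
              [0.1, 0.5]])

eigvals, eigvecs = np.linalg.eig(P)
i = np.argmin(np.abs(eigvals - 1.0))  # pick the eigenvalue-1 mode
steady = np.real(eigvecs[:, i])
steady /= steady.sum()                # normalize to a distribution
print(steady)  # ≈ [0.833, 0.167]: applying P leaves it unchanged
```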
That's a bit uncharitable. A Fourier decomposition can absolutely be understood as an explicit bag of calculus tricks, with no loss of precision or generality. And an awful lot can be done with just those tools -- you don't need to explain JPEG compression or VLBI astronomy in terms of eigenvectors, for example.
Obviously (heh, "obviously") it's true that the decomposition's basis functions form an orthogonal basis, so technically we're "really" operating in a linear space, and that view has expressive power too. But there are lots of ways of looking at problems.
To wit, you're not wrong. You're just... Well, you know.
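For what it's worth, the two views are easy to reconcile numerically. A quick check that the DFT basis vectors really are mutually orthogonal:

```python
import numpy as np

# The "linear space" view of Fourier: the DFT basis vectors
# exp(2*pi*i*k*n/N) are mutually orthogonal, with norm sqrt(N).
N = 8
n = np.arange(N)
basis = np.exp(2j * np.pi * np.outer(n, n) / N)  # column k = frequency k

gram = basis.conj().T @ basis  # all pairwise inner products
print(np.allclose(gram, N * np.eye(N)))  # True: off-diagonals vanish
```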
So... yes I guess I’ve just not seen anywhere in my work where this stuff has proved useful...
Maybe one day vicapow and I will make a triumphant return to the explorables space, but life has a way of getting in the way as you get older.
Multivariable function extrema? Just look at the eigenvalues of the Hessian.
Jacobi method convergence? Eigenvalues of the update matrix.
RNN gradient explosion? Of course, eigenvalues.
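For example, classifying a critical point from the Hessian's eigenvalues (toy function f(x, y) = x^2 + 3y^2 - xy, with its critical point at the origin):

```python
import numpy as np

# Hessian of f(x, y) = x**2 + 3*y**2 - x*y, constant everywhere:
H = np.array([[ 2.0, -1.0],   # [d2f/dx2,  d2f/dxdy]
              [-1.0,  6.0]])  # [d2f/dydx, d2f/dy2 ]

eigvals = np.linalg.eigvalsh(H)
print(eigvals)  # both positive -> the origin is a local minimum
```

All eigenvalues positive means a local minimum, all negative a maximum, mixed signs a saddle point.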
To imagine finding the eigenset, just ask, could I draw a line through 0,0 such that any point I put on it would stay on it after the matrix acted?
Maybe I'm being pedantic, but rotation matrices do have eigenvectors and eigenvalues; the eigenvalues are just complex, because unit-modulus complex numbers act as rotations in the complex plane.
It's exactly like saying x^2 = -1 has no solutions: it has two solutions, like any other quadratic, but neither of them is real.
In three dimensions, rotation matrices have three eigenvalues, one of them being 1, and the eigenvector corresponding to that eigenvalue is, naturally, the rotation axis.
Rotations have eigenvectors: a 2D rotation has two complex eigenvectors, a 3D rotation has one real and two complex eigenvectors, ...
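Easy to verify numerically; a NumPy sketch (angle chosen arbitrarily):

```python
import numpy as np

theta = np.pi / 3  # 60 degrees, arbitrary

# 2D rotation: both eigenvalues are complex, exp(+/- i*theta).
R2 = np.array([[np.cos(theta), -np.sin(theta)],
               [np.sin(theta),  np.cos(theta)]])
print(np.linalg.eigvals(R2))  # ≈ [0.5+0.866j, 0.5-0.866j]

# 3D rotation about the z-axis: one eigenvalue is exactly 1, and its
# eigenvector is the rotation axis.
R3 = np.eye(3)
R3[:2, :2] = R2
vals, vecs = np.linalg.eig(R3)
i = np.argmin(np.abs(vals - 1.0))         # find the eigenvalue-1 mode
print(np.real(vecs[:, i]))  # ≈ [0, 0, 1] (up to sign): the z-axis
```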