Check out this 156-page tome, "Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges": https://arxiv.org/abs/2104.13478
The intro says that it "...serves a dual purpose: on one hand, it provides a common mathematical framework to study the most successful neural network architectures, such as CNNs, RNNs, GNNs, and Transformers. On the other hand, it gives a constructive procedure to incorporate prior physical knowledge into neural architectures and provide principled way to build future architectures yet to be invented."
Working all the way through that, besides relearning a lot of my undergrad EE math (some time in the previous century), I learned a whole new bunch of differential geometry that will help next time I open a General Relativity book for fun.
I have very little formal education in advanced maths, but I'm highly motivated to learn the math needed to understand AI. Should I take a stab at parsing through and trying to understand this paper (maybe even using AI to help, heh), or would that be counter-productive from the get-go, and would I be better off following some structured courses in the prerequisite maths before trying to understand these research papers?
And please mention any prereqs you'd recommend. I also find the math-is-fun site to be excellent when I need to brush up on something from long ago and want a concise explanation, i.e. a 10-minute review: more than a few pithy sentences, yet less than a dozen-hour diatribe.
I once worked with a company that provided IM services to hyper-competitive, testosterone-poisoned options traders. On the first fine trading day of a new year in January, our IM provider rolled out an incompatible "upgrade" to some DLL that we (our software, hence our customers) relied on, and it broke our service. Our customers, ahem, let their displeasure be known.
Another developer and I were tasked with fixing it. The Customer Service manager (although one of the most conniving, politically destructive assholes I have ever not-quite worked with) actually carried a crap umbrella. Instead of constantly flaming us with how many millions of dollars our outage was costing every minute, he held up that umbrella and diverted the crap. His forbearance let us focus. He discreetly approached every 20 minutes, toes not quite entering the office, calmly inquiring how it was going. In just over an hour (between his visits 3 and 4), Nate and I had the diagnosis and the fix, and had rolled it out to production, to the relief of pension funds worldwide.
As much as I dislike the memory of that manager to this day, I praise his wisdom every chance I get.
A little stoichiometry suggests that, ignoring oxygen, hydrogen, and energy input, the cited worldwide market for C2H4 would be satisfied by just about 1 gigaton of CO2. So if "we need to process gigatons of CO2 annually", that ethylene's gonna pile up.
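For anyone who wants to redo that back-of-the-envelope carbon balance, here's a minimal sketch; the only inputs are the two molar masses, and you can compare the result against whatever annual C2H4 market figure the article actually cites:

    # Back-of-the-envelope check of the carbon balance above: 2 CO2 -> 1 C2H4
    # (oxygen, hydrogen, and energy input ignored, as in the comment).
    M_CO2 = 44.01    # g/mol
    M_C2H4 = 28.05   # g/mol

    co2_per_ethylene = 2 * M_CO2 / M_C2H4   # tonnes of CO2 per tonne of C2H4, ~3.14
    print(f"{co2_per_ethylene:.2f} t CO2 per t C2H4")

    # So ~1 Gt of CO2 maps to roughly 0.3 Gt of ethylene; compare that against
    # whatever annual C2H4 market figure is being cited.
    print(f"1 Gt CO2 -> ~{1.0 / co2_per_ethylene:.2f} Gt C2H4")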
Earth's escape velocity is about 11.2 km/s, which is roughly Mach 33 at sea level. They have some more engineering to do, maybe even invent something better than carbon fibers.
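A quick sanity check on that conversion; both constants are textbook values (surface escape velocity and the standard-atmosphere sea-level speed of sound), not numbers from the article:

    # Quick check of the escape-velocity-to-Mach conversion above.
    v_escape = 11_186.0   # m/s, escape velocity from Earth's surface
    a_sound = 340.3       # m/s, speed of sound at sea level (ISA standard atmosphere)
    print(f"Mach {v_escape / a_sound:.1f}")   # prints: Mach 32.9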