"There are exactly four normed division algebras: the real numbers ($\R$), complex numbers ($\C$), quaternions ($\H$), and octonions ($\O$). The real numbers are the dependable breadwinner of the family, the complete ordered field we all rely on. The complex numbers are a slightly flashier but still respectable younger brother: not ordered, but algebraically complete. The quaternions, being noncommutative, are the eccentric cousin who is shunned at important family gatherings. But the octonions are the crazy old uncle nobody lets out of the attic: they are /nonassociative/."
Imaginary numbers are needed to take the square root of a negative number, and complex numbers result from combining these new numbers with the reals. Complex numbers also let you find roots of polynomials that have no solutions among the real numbers alone.
But I have no similar comparison of what I can do with a quaternion or octonion that I can't do with a complex number. I remember seeing some ω-based number system derived from the cube roots of either 1 or -1 (I forget which, but ω and -ω were the solutions that weren't 1 or -1), but it did all the same things that complex numbers do and was considered mostly uninteresting.
It also seems like there is a pattern for going infinitely beyond the octonions, but everything past that point behaves essentially the same. Are the octonions even needed in the way complex numbers are needed, or do they just make some math problems easier to work with?
So Hamilton was trying really hard to find a way to have 3-dimensional numbers that behaved nicely, and he couldn't do it. But he did find 4-dimensional numbers -- the quaternions. And they were really neat. You can use them quite well for classical mechanics, and electromagnetism, and even special relativity. So, why don't we?
We look at Maxwell's equations of electromagnetism today, and they're really nice, single-line vector formulas. You can also write them as nice, single-line quaternion formulas. Our notion of vector didn't exist at the time the quaternions were first used, and it was a boon to have quaternion notation to simplify some of these physical laws. Vectors and quaternions competed for a bit, and vectors won since they generalize to arbitrary dimensions.
Hidden inside of quaternion multiplication, you can find the three-dimensional versions of the dot product and cross product. And they do have some theoretically interesting properties for number theory and abstract algebra. In the end, however, sometimes items are discarded in favor of better items. I'd rate quaternions as one of the coolest items that ultimately wound up in the discard pile.
I don't know if that's significant but it was how I stumbled on the concept in high school when I was messing with Direct X.
If you just have two you can slerp (or not), but if you have a large number of them (weights from an animation system, for example), a basic weighted sum followed by normalizing is shockingly well behaved and extremely fast.
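A minimal sketch of that weighted-sum-then-normalize trick (plain Python; the function names are mine, and a real animation system would batch this). The one subtlety is that q and -q encode the same rotation, so each input is first flipped into a common hemisphere:

```python
import math

def normalize(q):
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def blend(quaternions, weights):
    """Weighted sum of unit quaternions, then renormalize.

    q and -q encode the same rotation, so each input is first flipped
    into the hemisphere of the first quaternion before summing.
    """
    ref = quaternions[0]
    acc = [0.0, 0.0, 0.0, 0.0]
    for q, w in zip(quaternions, weights):
        if sum(a * b for a, b in zip(q, ref)) < 0:   # dot(q, ref) < 0
            q = tuple(-c for c in q)
        for i in range(4):
            acc[i] += w * q[i]
    return normalize(acc)

# Blend three rotations about the z axis (0, 60 and 90 degrees), equal weights.
qs = [(1.0, 0.0, 0.0, 0.0),
      (math.cos(math.pi / 6), 0.0, 0.0, math.sin(math.pi / 6)),
      (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))]
q = blend(qs, [1 / 3, 1 / 3, 1 / 3])   # still a unit quaternion about z
```

The result is again a unit quaternion, i.e. a valid rotation, which is exactly why this cheap trick is so well behaved.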
The subalgebras formed by scalar and bivectors (1, x^y, y^z, z^x)--or the monovectors and the pseudoscalar (x, y, z, x^y^z)--from a three-dimensional geometric algebra have a lot of the same mathematical properties as quaternions.
I don't know whether or not octonions have a similar relationship with a 4-D geometric algebra, where one dimension is a timelike dimension, because that is one gnarly mess of anticommutative, nonassociative math to wade through.
Quaternions tell you what happens if there are three different i's that have this property. And it turns out this leads to a nice algebra that expresses 3D rotations very well (and even 4D, if you believe the article).
Octonions tell you what happens if there are seven such i's. And it leads to a cool algebra that helps express... we don't quite know yet.
Considering other numbers of i's doesn't lead to anything coherent, and neither does choosing different results for the product of one i with another.
So... i^2=-1 apparently can have either zero, one, three or seven solutions and they have to have very specific relationships between themselves for calculations to make sense
In math infinity turns up all the time so having there be exactly a finite number of anything feels weird.
They're nuts, though.
Sure. Unit quaternions form a double-cover of SO(3).
In other words, you can encode a rotation of a 3-dimensional object with a single unit quaternion.
But wait, there's more! You could do the same with a matrix, or a triple of angles. Why not do that?
Answer: interpolation. The "natural" way you want to go from one rotation to another corresponds to exponentiation of quaternions. If you linearly interpolate matrices, the intermediate steps will do something nasty: they won't even be rotations!
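A small sketch of the difference (pure Python; the slerp formula is the standard one). Linearly interpolating two rotation matrices halfway can collapse space entirely, while slerping the corresponding unit quaternions stays on the unit sphere:

```python
import math

def lerp_matrix(A, B, t):
    """Entrywise linear interpolation of two 3x3 matrices."""
    return [[(1 - t) * a + t * b for a, b in zip(ra, rb)]
            for ra, rb in zip(A, B)]

def det3(M):
    """Determinant of a 3x3 matrix (a rotation would have det +1)."""
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(q0, q1))))
    theta = math.acos(dot)
    if theta < 1e-9:                      # nearly identical inputs
        return q0
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

I3    = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]     # identity rotation
Rz180 = [[-1, 0, 0], [0, -1, 0], [0, 0, 1]]   # 180 degrees about z

M = lerp_matrix(I3, Rz180, 0.5)               # halfway "rotation"
qm = slerp((1, 0, 0, 0), (0, 0, 0, 1), 0.5)   # halfway: 90 degrees about z
```

The matrix midpoint has determinant 0 (it squashes everything onto the z axis), while the quaternion midpoint is still a unit quaternion and hence a genuine rotation.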
The natural way to implement the Arcball interface for rotations is using quaternions. Here, I have implemented it in ProcessingJS and written up the math behind it.
Quaternions (like complex numbers) can do other things too, but this alone is a good start. Also gives you intuition why they aren't commutative: because rotations in 3-space aren't.
TL;DR: Unit complex numbers = rotations of plane. Unit quaternions = rotations of 3-space.
PS: you shouldn't think of complex numbers as the solution to the problem of "taking the square root of -1". Think of them as "how can I multiply/divide a 2D vector by another 2D vector?" - there's only one way to do it sanely (multiply/divide lengths, add/subtract angles). This is what the complex numbers are.
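That "multiply lengths, add angles" rule is easy to see numerically with the standard library:

```python
import cmath
import math

z1 = cmath.rect(2.0, math.radians(30))   # length 2, angle 30 degrees
z2 = cmath.rect(3.0, math.radians(45))   # length 3, angle 45 degrees
z = z1 * z2

assert math.isclose(abs(z), 6.0)                        # lengths multiply: 2 * 3
assert math.isclose(cmath.phase(z), math.radians(75))   # angles add: 30 + 45
```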
Hamilton was trying to solve the same problem in 3D, and couldn't (turns out, it is not possible), but solved it in 4 dimensions, and later found many applications for them.
It's also good to think about complex numbers in the way described here because it neatly abstracts the concept of numbers and maps them to a form of dimensionality. Great comment all around.
On an unrelated note, thanks for that Feynman integral trick post.
No, you can blame Bourbaki for that. People such as V. Arnold decried the way mathematics is now presented.
It was from Hamilton's book that I learned what the word vector means and why it's used. It simply means carrier (as in malaria vector, which you might have heard from biologists) - and it carries the space, by a translation!
Such lucidity is absent from all linear algebra books I've seen.
We need to go back to the presentation style of 19th century, where not only the result, but the thought process is presented. Today's papers look like they are written for formal verification systems.
I also have an anecdote similar to yours regarding Hamilton and vectors: I think it was in one of the "Analysis Infinitorum" (Euler) that I found the natural logarithm being called the "hyperbolic logarithm" (it was the English translation of course). When I was a kid I was perplexed by how everyone seemed to insist on using e as the base of their logarithms and exponentials -- why the hell? Reading Euler's treatment of the subject would have been very satisfying then.
Using unit quaternions (aka versors) to encode the rotation instead, you cannot lose a degree of freedom. 4x4 matrices also solve this problem, but quaternions are more compact and efficient. They also give very smooth interpolation for computer-assisted animations.
Also related to spinors in quantum physics and exists as a subalgebra of some conformal geometric algebras.
You also have ij = k, jk = i, ki = j
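Those three identities, plus i^2 = j^2 = k^2 = -1, pin down the whole multiplication table. A minimal Hamilton product on (w, x, y, z) tuples checks them, and shows the noncommutativity (ji = -k):

```python
def qmul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) tuples."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)
k = (0, 0, 0, 1)

assert qmul(i, j) == k               # ij = k
assert qmul(j, k) == i               # jk = i
assert qmul(k, i) == j               # ki = j
assert qmul(j, i) == (0, 0, 0, -1)   # ji = -k: not commutative
```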
What are these thing? I have no idea, nor did I know there are 4 more - and only 4 more.
From that point of view, quaternions and octonions solve problems in the same way more common numbers do, they just have different sets of properties and so help solving a different set of problems. Of course, they can only do this if we study them well enough to have a sufficiently large suite of concepts and relations in our toolbox.
IDK much about octonions, but quaternions have similar algebraic properties to rotation groups and can be used to encode rotations in 3D graphics.
No, they are not. That problem is not well defined. A symbol is not a solution.
Complex numbers are needed to solve polynomials of higher degree. E.g. x^4 + x^2 = -1 can be simplified to y^2 + y = -1 with y = x^2, which wouldn't have a solution among the reals.
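The example can be checked numerically: solve the quadratic in y over the complex numbers, then take square roots to get all four roots of x^4 + x^2 + 1 = 0, none of which is real:

```python
import cmath

# y^2 + y + 1 = 0 by the quadratic formula; the discriminant is negative,
# so the roots only exist once we allow complex numbers.
disc = cmath.sqrt(1 - 4 * 1 * 1)              # sqrt(-3)
ys = [(-1 + disc) / 2, (-1 - disc) / 2]
xs = [s * cmath.sqrt(y) for y in ys for s in (1, -1)]   # x = +-sqrt(y)

for x in xs:
    assert abs(x ** 4 + x ** 2 + 1) < 1e-12   # a root of the original equation
    assert abs(x.imag) > 0.5                  # and not a real number
```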
1. I think they meant "the only kinds of numbers constructed in this way".
2. Sedenions can still be added, multiplied, subtracted and divided. It's just that multiplication and division lose most of their useful properties. With octonions you've already lost associativity and commutativity, though.
Specifically they lose the property of not having zero divisors.
There exist sedenions a, b != 0 such that ab = 0
Of course associativity doesn't hold in the octonions either, but it holds just enough for cancellation to work.
N(xy) = N(x)N(y), where N is called the norm.
Without this property, you have zero divisors.
edit: throwawaymath uses better notation: |xy| = |x| • |y|
E.g. (reading the dash as a minus sign): N = 3, x = 2, y = 3
N(xy) = 3(2 . 3) = 18
N(x)N(y) - N = 3(2) . 3(3) - 3 = 51, which is not 18
So to be explicit, they're saying |xy| = |x| • |y| implies you cannot have a 0 divisor.
The "norm" of a number is more or less its absolute value: its size, its magnitude. So "1" has norm 1, but "-1" also has norm 1, as does "i", and "-5" or "5i" or "4-3i" all have norm 5. So these four mathematical structures (R, C, H, and O) all have the property that the norm of a product is equal to the product of the norms. Things get really obnoxious (or at least really unfamiliar) if you don't have that property.
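The |xy| = |x|·|y| rule is easy to spot-check for quaternions (a numeric spot-check, not a proof; the Hamilton product below is the standard one):

```python
import math
import random

def qmul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) tuples."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

def norm(q):
    return math.sqrt(sum(c * c for c in q))

random.seed(0)
for _ in range(1000):
    x = tuple(random.uniform(-1, 1) for _ in range(4))
    y = tuple(random.uniform(-1, 1) for _ in range(4))
    assert math.isclose(norm(qmul(x, y)), norm(x) * norm(y))
```

And this is exactly why the property rules out zero divisors: if x and y are nonzero then |x||y| > 0, so |xy| > 0 and xy cannot vanish.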
Edit: Sorry for the repetition! I've got to remember to reload these pages before replying.
“What I had was an out-of-control intuition that these algebras were key to understanding particle physics, and I was willing to follow this intuition off a cliff if need be. Some might say I did.”
Wait? Since when did Einstein not accept Quantum Mechanics? He won the Nobel Prize for his work in discovering an important part of Quantum Mechanics.
Einstein rejected the Copenhagen Interpretation of Quantum Mechanics. That's not the same thing as rejecting Quantum Mechanics.
I've seen no claims that he would object to either the Everett or Bohm Interpretations of QM. (But then again, he died before they were invented.)
Einstein, by the way, is far from the only critic of the Copenhagen Interpretation. Quite a few physicists are drawn to the Everett Interpretation instead, for instance.
It would be very interesting to know how Einstein would have received the Everett Interpretation.
This is what Einstein had to say about Bohm's Interpretation, in a letter that Einstein wrote to Bohm about six months before he died:
"In the last few years several attempts have been made to complete quantum theory as you have also attempted. But it seems to me that we are still quite remote from a satisfactory solution to the problem. I myself have tried to approach this by generalising the law of gravitation. But I must confess that I was not able to find a way to explain the atomistic character of nature. My opinion is that if an objective description through the field as an elementary concept is not possible, then one has to find a possibility to avoid the continuum (together with space and time) altogether. But I have not the slightest idea what kind of elementary concepts could be used in such a theory."
From my reading of this, it seems that Einstein's greatest concern here is that GR and QM had not yet been unified.
There is a sense in which no competent scientist can fully accept either QM or GR, despite their incredible accuracy in making predictions, because we know that they are incompatible with each other, and consequently, both wrong or incomplete in some very important manner.
Otherwise you'd have to take into account what's happening at the other end of the universe in order to successfully predict the outcome of experiments. (Which I hear is what the people working on Bohmian mechanics struggle with.)
On the other hand, it's certainly been proven that this cannot be used to transmit information faster than the speed of light. So causality is preserved in any case.
But didn't Einstein write the following in his own book?
“I believe in intuition and inspiration. … At times I feel certain I am right while not knowing the reason. When the eclipse of 1919 confirmed my intuition, I was not in the least surprised. In fact I would have been astonished that it turned out otherwise. Imagination is more important than knowledge. For knowledge is limited, whereas imagination embraces the entire world, stimulating progress, giving birth to evolution. It is, strictly speaking, a real factor in scientific research.”
On my first reading I took "novel study path" to mean new research techniques or something along those lines, which seems obvious and is not likely what you meant.
And you still need a good amount of creativity to come up with special relativity and to a certain extent, with the photoelectric effect (though we could say that was the 'easy' one)
But yes, the puzzling results and existing mathematics were part of it. But there is a reason we hear much more about Einstein than Lorentz or Michelson–Morley.
If you mean the Michelson–Morley experiment (apparent zero velocity of earth with respect to the ether), apparently Einstein himself claimed that wasn’t a motivation.
I believe it was all about resolving the long-standing "action at a distance" question raised by Newtonian physics (Newton himself noted that this was philosophically disturbing but didn’t venture an answer). Intuition and philosophy, rather than experimental evidence and mathematics.
There's no shortage of specious ideas. From outright quackery, the naive, or the simply misguided, there are countless ways to waste a lot of time and money on ideas that will lead nowhere. Especially when all you're guided by is intuition.
That's not to say that intuition isn't valuable; rather, intuition shouldn't be the only guiding principle. If an idea is 'real' in the sense that it will produce substantive research findings, it's reasonable to expect some kind of evidence for this. Maybe you're trying to show that A -> D. Well, showing a bit about B or C can go a long way to convincing people it's worth looking at the link between A and D.
The counter-argument is that it's possible there is not B or C, that a large and courageous leap is required to get to D. That's indeed possible, and arguably has been demonstrated with some famous results. But it doesn't follow that you simply must take everyone's giant leaps seriously and give them funding.
Ultimately, we go by proxies. If A -> D is required, well, at least show us that you got from A to B in some other issue. Give us evidence, however imprecise, that you might be that 1 in a 1,000 (or 1,000,000?) leaper that lands somewhere successfully.
So yes, there's an art to knowing when an idea is 'ripe' for a wider audience, and when it's time to stake your career on it. There's no reason you can't work on something in the background or during a sabbatical. The notion that 'academic incrementalism' is so destructive is a tenuous one, and certainly isn't being demonstrated here. To argue this is to say that Dixon would have been successful, if only he had gotten this or that position. Yet it appears that his approach was the issue, not financial or departmental support. I'd argue that the academic system correctly identified that his idea wasn't ready yet. Now that some demonstrable progress is being made, even if it falls far short of 'D', it's attracting attention and enthusiasm.
Finally, the notion that academia produces no actual research is farcical.
(I remember three separate occasions in my undergrad particle physics class where we actually went through all the calculations involved with the velocity addition formula and finally saw that SOL was wacky).
I wonder if that is uncommon or if the article is just a bit ungenerous towards our understanding of non-associative objects.
So this quote about mathematicians seems a bit off. Baez is talking about mathematical physicists, I guess, not logicians and computer scientists. I know he is not a functional programmer from various online remarks he has made. So maybe it's more of a personal perspective that we shouldn't take too seriously.
'“Nonassociative things are strongly disliked by mathematicians,” said John Baez, a mathematical physicist at the University of California, Riverside, and a leading expert on the octonions. “Because while it’s very easy to imagine noncommutative situations — putting on shoes then socks is different from socks then shoes — it’s very difficult to think of a nonassociative situation.” If, instead of putting on socks then shoes, you first put your socks into your shoes, technically you should still then be able to put your feet into both and get the same result. “The parentheses feel artificial.”'
And to add to that, they form a Lie algebra, i.e. they are anticommutative: AB = -BA.
I’ve also been exploring this relationship between dual quaternions and linear logic. It’s pretty wild. I’m curious if anyone has any opinions on this.
Linear logic always reminded me of the no-cloning property in quantum physics, where a resource can only be used once -- no copying for multiple use.
Can you elaborate on the sense in which you think of linear logic as "squaring to zero"? UPDATE: A little googling throws up this: http://www.seas.upenn.edu/~sweirich/types/archive/1999-2003/...
So this is kinda roundabout, but there's homotopy type theory, which states that topology = logic = type theory. For a second let's say there's a fourth "element" in this equation and that it's quaternions (this isn't a stretch; I can provide some papers that hint at this).
Linear logic is classical logic with the added reference to time without explicitly referencing time.
Dual quaternions are all about modeling deformation over time.
Therefore, I would conclude that linear logic and dual quaternions are related.
Check the guy on the right getting deformed like an idiot. Compare it with the smooth criminal on the left.
Offset curve deformations blow dual quaternions out of the water for rigging!
This Clifford algebra has an odd and even part, and the even part is an algebra in its own right, and is very useful for representing “rotations” in the space.
With the Euclidean metric, with one input dimension, the even Clifford algebra is just the reals. With two dimensions, you get the complex numbers. With three, the quaternions. However, since a Clifford algebra is always associative, you don’t get the octonions in this way.
Finally, if you put in four dimensions and the minkowski metric, the even Clifford algebra construction gives you a lovely algebra representing spatial rotations in three of the dimensions, and “boosts” along the time direction - exactly what you need to do calculations in special relativity.
Kenichi's book works from basic vector calculus through Hamilton, Grassmann and Clifford. Grassmann was not even a mathematician by trade, but a school teacher and a linguist. He translated the Rig Veda. His big mathematical work did not get the attention when written that it did posthumously. I now know things with cool names like differential forms and fiber bundles. I'm becoming quite the Grassmann fanboy!
I am thinking that if I can learn these algebras (Grassmann, Clifford, etc.) I will be able to deal with more abstract thinking in geometry and handle it more concisely and logically. Almost like when I use the J programming language for math!
I think quaternions don't hold past 3 dimensions, so what about octonions?
this could be a nice implementation trick for an automatic differentiation framework.
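If the trick meant here is the dual-number part (a nilpotent ε with ε² = 0), that is indeed how forward-mode automatic differentiation works: carry the pair (f(x), f′(x)) through every operation. A toy sketch (the class and function names are mine):

```python
class Dual:
    """Toy dual numbers a + b*eps with eps*eps = 0.

    Arithmetic on (f(x), f'(x)) pairs reproduces the chain rule,
    which is forward-mode automatic differentiation in miniature.
    """

    def __init__(self, real, eps=0.0):
        self.real, self.eps = real, eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.real + other.real, self.eps + other.eps)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, because eps^2 = 0
        return Dual(self.real * other.real,
                    self.real * other.eps + self.eps * other.real)

    __rmul__ = __mul__

def f(x):
    return 3 * x * x * x + 2 * x   # f(x) = 3x^3 + 2x, so f'(x) = 9x^2 + 2

y = f(Dual(2.0, 1.0))              # seed the derivative dx/dx = 1
# y.real == 28.0 (the value f(2)), y.eps == 38.0 (the derivative f'(2))
```

No symbolic manipulation and no finite differences: the derivative simply falls out of the arithmetic, exactly to machine precision.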
Many years later I remembered my old question and started looking it up. It turns out that eight is the sum of 1+3+3+1 perhaps similarly to what's in the article.
Spherical harmonics end up giving rise to a three-dimensional 'overtone' series (borrowing from my understanding of music theory). In the first order there's only one mode of vibration. In the second order there are three additional modes. The summands above are something like positive and negative degrees of freedom for each mode in the second shell.
Here's a diagram of what the modes look like in each order:
...and here's an animation of a sphere undergoing the differing modes of vibration:
If I understand correctly, the math related to atomic orbitals can be described with 3 dimensions of space: x, y and z plus one more orthogonal dimension of frequency/time which would mean quaternions would be most directly applicable?
Not true. Any field (in the algebra sense) has these properties. 
Quaternions and octonions also have weirder properties: quaternions are non-commutative (jk = -kj) and octonions are non-associative: a(bc) != (ab)c.
I think the article meant these are the only Euclidean Hurwitz algebras, which is a far cry from the claim.
You really do need the additional constraint of the Hurwitz form to restrict the possibilities to R, C, H, O ...
(Of course this has nothing to do with the work of Furey - I'm just seconding that the claim in the article is incomplete and inexact as worded.)
> The octonions’ seemingly unphysical nonassociativity has crippled many physicists’ efforts to exploit them.
I'd say nonassociativity is an extremely physical property. And the example with shoes and socks, if anything, shows just that. In an idealised mathematical model, if you put socks inside shoes and then insert feet into the socks, you could end up with the same thing as if you put feet in socks and then both in shoes. But in the real world, or in any accurate model of the physical world, you'll end up with a very different result.
I feel that nonassociativity is what's missing to take into account time, whose unidirectionality is suspiciously missing from almost all of physics.
Edit: I remember seeing an update from him two years ago where he addresses some of the criticism:
I'm fuzzy on the details now, but the criticism was along the lines of "mixing things together that don't make sense".
Why are pairs not commonly called duples?
Take a theory, add a bit about a Unicorn, or a Dragon. Now you have a more complicated theory. You can arbitrarily contrive a more complicated solution.
But the simplest? You've found a local minimum. And because it's more stable you can compare it to competing theories without trivial refutation. See: the 3-5 theories of dark matter.
If you have infinitely many possible theories, the only logical thing to do is to stay with the simplest theory until it can't explain the world anymore, then choose the next simplest one.
There is also, I think, an information-theoretic argument: when creating a theory you are trying to compress all the observations. The better the compression, the better the theory.
I don't understand why people down-vote you, by the way.
The key is that in science, you never 'conclude' (in the sense of finalize) anything. Everything is provisional until new evidence denies your current understanding.
Honestly I haven't seen attempts at making this process more rigorous when applied to physics. There's the large corpus of machine learning research, which provides concrete results and even concrete comparison tools, but the times I've asked a physicist I've been dismissed. It seems incredibly valuable in the face of scarce or costly experimental data, which is quite relevant today.
Marcus Hutter has expressed this idea quite well ( https://arxiv.org/pdf/0912.5434 ) arguing that (a) smaller/simpler theories have more predictive power and (b) the "size" of a theory includes the complexity of its equations and the parameters needed to specify some result. The latter is important because some theories trade off between these two: e.g. a multiverse theory might have simple equations ("every possibility happens somewhere") but require very precise "coordinates" to pin-point the actual possibility that we observe.
Not sure if other physicists know of or take it seriously though.
> The latter is important because some theories trade off between these two: e.g. a multiverse theory might have simple equations ("every possibility happens somewhere") but require very precise "coordinates" to pin-point the actual possibility that we observe.
I think this is an important observation that's quite obvious to ML researchers et al. but again seems to escape current physics discussions. An example is the endless drama about "fine tuning": if your new theory requires many fewer bits of equation description, the fact that it requires fine tuning is irrelevant as long as the additional model-parameter precision uses fewer bits; then it should be the preferred candidate.
W.r.t. [computational] multiverse theories (and variants such as Tegmark's MUH, Schmidhuber's, and others), I do believe they're an inevitable progression of physics/philosophy. I just think it's a bit pretentious to have any certainty about a particular flavor. I feel there's still much philosophical and mathematical ground to be covered; it tests the limits of our imagination. It seriously feels like a very important step for humanity at large though -- finally approaching metaphysical theories that actually make sense, and explain the basis of much about humanity, existence, ethics, etc. I think it's an important void to be filled after the decline of religion, hopefully in conjunction with the spread of humanism.
A theory is just a list of assertions. In the absence of any evidence, each assertion is just as likely to be true as false. Therefore the probability of a theory with n assertions is 2^-n. So the more assertions there are, the less probability.
Given equally true theories scientists look to other properties to establish theory quality, and simplicity is a philosophically important one: it represents the belief that natural laws should be as simple and elegant as evidence allows them to be.
I would say that studying octonions exclusively would be something I personally would avoid, as I would rather try to study the four structures (reals to octonions) together, either more generally (e.g., group theory and ring theory) or more abstractly (e.g., as members of categories) and form an opinion on whether I think octonions in particular are useful for the questions that I want to ask.
That is not to say that the research here is not interesting, but it is difficult to judge that from "popular" mathematics articles. I got the impression that the author of the article places a much higher priority on the pictures accompanying the post.
I remember that I found it interesting that studying the four dimensional spacetime bears more fruit than stopping at three space dimensions, and that at the same time from complex numbers the next structure ends up also having four dimensions (i.e., being modelled by 4-tuples). However, apart from being interesting in this narrow sense, I do not know whether this suggests any creative yet precise mathematical questions.
But there doesn't seem to be much reason to believe that such a path exists; there are countless mathematical objects you could pick as a starting point, and almost none of them have a path leading to the universe at the horizon. And even of those with a path going roughly in the right direction, many will take a turn before you arrive at your destination.
So it seems much more promising to me to start at the universe and the mathematical structures describing our observations in a straightforward way, and then explore from there the surrounding mathematical structures to see if they are a better fit, suggest new ideas, or whatever.
Quaternions were a very popular way of expressing "classical" physics around the 19th century (and the vector algebra we know today is in some ways just a derivative of quaternion algebra). Complex numbers are extremely useful in many fields even today. It's hardly in the middle of nowhere.
We toss our known quantities into a void, anticipating that if some impossible, imaginary thing really can fill the gap, and if or when it does, we'll catch the rebound off of it, and the rest of the universe proceeds predictably.
Somehow, we're always put into a position where we have to close our eyes, fly blind for some undisclosed intervening moment of unspecified length, and when we open our eyes again, we're grounded by familiar territory again.
It really is kind of stultifying.
By the way, any idea why they don't mention the sedenions?
When the best minds we have divide into two camps, one saying that (a) "with X and Y we don't have enough information to solve Z" and the other saying (b) we do but we need to think harder, the first camp builds a particle collider, the other creates what, string theory? Aren't we doing "fuzzy science" here?
It seems that the experts in one of the camps should go back and retrace their steps, because somewhere along the line they made an assumption based on some data that they (I assume) forgot to encode into their equations, and now they have trouble taking it to the next level. Why is it not clear to us that either (a) or (b) is true?
Isn't that kind of what it would mean if it turned out octonions were at its core?
>In mathematics, the octonions are a normed division algebra over the real numbers.
wtf is a normed division algebra??
>In mathematics, Hurwitz's theorem is a theorem [...] solving the Hurwitz problem for finite-dimensional unital real non-associative algebras endowed with a positive-definite quadratic form.
I have the same problem when I try to understand anything statistics related, I get hit by a barrage of unknown words
and my brain just melts.
Is there any place that explains mathematical concepts in ... different ways?
3 Blue 1 Brown: https://www.youtube.com/channel/UCYO_jab_esuFRV4b17AJtAw
To understand what's going on here in any meaningful sense, the parent commenter should pick up an accessible undergrad textbook on abstract algebra. It doesn't have to be particularly advanced. Then they'll have a better foundation for understanding the algebraic features of various number systems.
An "algebra" is a set of thingies that have binary operations defined on them (i.e., operations that take two thingies as input and output another thingie). The thingies are usually referred to as "elements" of the algebra. Any algebra will at least have the "add" and "multiply" operations, but might also have others.
A "division algebra" is an algebra where the operations are add, subtract, multiply, and divide, and all of them work basically the same as they do with ordinary numbers (but they won't have all of the same properties--see below).
A "normed division algebra" is a division algebra with one additional operation called "norm", that takes as input any element of the algebra and outputs a real number, the "norm" of that element. On the ordinary real numbers, the norm is just the absolute value.
The "over the real numbers" part means that you can construct the normed division algebra by starting with real numbers; or, to put it another way, all of the elements of the algebra are "made of" real numbers. The simplest way of viewing this is as a repeated operation of pairing: complex numbers are made of pairs of real numbers, quaternions are made of pairs of complex numbers (hence sets of four real numbers, hence "quater"), and octonions are made of pairs of quaternions (hence sets of eight real numbers, hence "octo"). But each step in this series loses a key property. The reals are totally ordered; the complexes are not. The complexes are commutative under multiplication; the quaternions are not. The quaternions are associative under multiplication; the octonions are not.
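The pairing step described here is the Cayley–Dickson construction, and it can be written down directly. The sketch below uses one standard sign convention, (a, b)(c, d) = (ac - d̄b, da + bc̄) (conventions vary across references, but the properties checked are convention-independent), and verifies which property dies at each doubling, including the sedenions' zero divisors:

```python
def conj(x):
    """Cayley-Dickson conjugate: negate everything but the real part."""
    if len(x) == 1:
        return x[:]
    h = len(x) // 2
    return conj(x[:h]) + [-c for c in x[h:]]

def cd_mul(x, y):
    """Cayley-Dickson product: (a, b)(c, d) = (ac - conj(d)b, da + b conj(c))."""
    if len(x) == 1:
        return [x[0] * y[0]]
    h = len(x) // 2
    a, b = x[:h], x[h:]
    c, d = y[:h], y[h:]
    left  = [s - t for s, t in zip(cd_mul(a, c), cd_mul(conj(d), b))]
    right = [s + t for s, t in zip(cd_mul(d, a), cd_mul(b, conj(c)))]
    return left + right

def basis(n, i):
    e = [0.0] * n
    e[i] = 1.0
    return e

# dim 2 (complex numbers): i*i = -1, still commutative.
i2 = basis(2, 1)
assert cd_mul(i2, i2) == [-1.0, 0.0]

# dim 4 (quaternions): commutativity is gone, associativity survives.
e1, e2, e3 = basis(4, 1), basis(4, 2), basis(4, 3)
assert cd_mul(e1, e2) != cd_mul(e2, e1)
assert cd_mul(cd_mul(e1, e2), e3) == cd_mul(e1, cd_mul(e2, e3))

# dim 8 (octonions): associativity is gone too.
E = [basis(8, i) for i in range(8)]
assert cd_mul(cd_mul(E[1], E[2]), E[4]) != cd_mul(E[1], cd_mul(E[2], E[4]))

# dim 16 (sedenions): nonzero x, y with x*y = 0 (zero divisors).
tab = [[cd_mul(basis(16, i), basis(16, j)) for j in range(16)]
       for i in range(16)]
zero_divisor = next(
    ((a, b, c, d, s)
     for a in range(1, 16) for b in range(a + 1, 16)
     for c in range(1, 16) for d in range(c + 1, 16)
     for s in (1.0, -1.0)
     if all(tab[a][c][k] + s * tab[a][d][k]
            + tab[b][c][k] + s * tab[b][d][k] == 0.0 for k in range(16))),
    None)
assert zero_divisor is not None   # some (e_a + e_b)(e_c +- e_d) = 0
```

The search at the end finds nonzero sedenions of the form (e_a + e_b)(e_c ± e_d) that multiply to zero, which is exactly the loss of the "norm of a product equals product of norms" property described above.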
> Is there any place that explains mathematical concepts in ... different ways?
Not really, because the only way people have found to really understand mathematical concepts is to build them up out of simpler mathematical concepts. That means you can't just encounter a complicated mathematical concept and expect to understand it if you don't understand all the simpler concepts it is built from. There are no shortcuts.
Jokes aside, I too think we need a new framework for popularizing math that actually tells you what you need to know, without just handwaving at it, BUT without the amount of technical detail of a mathematics class.
What do you know, I think this is possible, too. You can communicate a surprising amount of information if you use words properly.
Of course, since this has never been done except for basic maths, it would be quite a task to embark on, and one would only do it if it made economic sense.
Oh man, have I got bad news for you about the amount of technical detail present in math classes.
What if writing wikipedia articles was treated like publishing on prestigious journals? (somehow)
I mean, I suppose you could try to walk the graph of definitions of terms in Wikipedia to try to understand one of the articles. But I'm pretty sure that's not an acyclic graph, and it's not clear where to start in that process.
Here's what that looks like: https://github.com/adam-golab/react-developer-roadmap
Then consider maths, how long and how many people have been contributing to that field?
normed -> there is a norm, a way of saying how big an element is. In the reals, |x| = x if x is positive and -x if x is negative. In the complex numbers, |a+bi| = (a^2 + b^2)^(1/2).
Division -> Division (except possibly by zero) is always possible. That means for a and b not zero there exists c such that a=cb (c is a divided by b)
I could be wrong about these things since I'm quite rusty
a) there exists a 1 in R
b) for all x in R, there exists y in R such that x * y = 1.
I'm pretty sure these are equivalent.
(==>) Let a = 1, and b in R be arbitrary. Then there exists a c such that 1 = c * b. So inverses exist!
(<==) Let a and b in R be arbitrary. Then a = a * b^-1 * b, so c = a * b^-1. QED.