The background story of Évariste Galois is phenomenal. He was expelled from school twice, fought in a revolutionary unit, was imprisoned for threatening the life of the king and revolutionised mathematics forever. He died in a duel over a woman at the age of 20! http://en.wikipedia.org/wiki/Évariste_Galois
Imagine what he would have done if he had lived to 40? There were some really fascinating characters in mathematics in the 19th century. A far cry from the world's stereotype of the repressed, bespectacled geek or the boring image of mathematics given by high-school classes.
"This article is an attempt to sift some of the facts of Galois's life from the embroidery. It will not be an entirely complete account and will assume the reader is familiar with the story, presumably through Bell's version. Because these authors have emphasized the end of Galois's life, I will do so here. As will become apparent, many of the statements just cited are at at worst nonsensical, or at best have no basis in the known facts."
> Imagine what he would have done if he had lived to 40?
Sorry, can't. Marriage alone is a major productivity killer, then there are also kids and general life problems. There was a study that basically built a histogram of mathematicians' (?) productivity vs age. The peak was around 25 years old, followed by a very steep decline.
Bela Bollobas, Tim Gowers, Ron Graham, John H Conway, Elwyn Berlekamp, Richard Guy, Reinhard Diestel, Alan Baker.
The problem is that you only hear about the truly eccentric. Ordinary people who are also extraordinary mathematicians never get romanticised, so you won't have heard of them. I've just picked a few there from my own personal acquaintance.
There's been a little interesting work that suggests that there are different archetypes among the extremely creative that roughly correspond to 'early peakers' and 'late peakers'.
I just skimmed this Wired article, but it seems like a good summary (note the '06 date, though).
Perhaps 'early' and 'late' are not as important as the fact that there is a peak: people don't seem to stay famous or productive in a sustained way. The 'early'/'late' divide is probably a continuum; you can peak at any time, but you tend not to hold it.
There's plenty more that's left out. Even after going nearly blind, Euler averaged one mathematical paper per week in 1775. To assume you or I are like Euler is hubris. To assume otherwise is doing a disservice to humanity.
Imagine how many geniuses of Galois's caliber have come and gone in the last two centuries but never had a chance to make a significant contribution to the world, discouraged or oppressed by a suffocating school system or driven onto a different track by a dysfunctional family life.
On the other hand, mathematicians at the University of Chicago learn quickly that the name of Valois Cafeteria (http://www.valoisrestaurant.com) is not pronounced the way that they think it is.
When I was learning Galois theory, I found Keith Conrad's notes really helpful for understanding the details: http://www.math.uconn.edu/~kconrad/blurbs/. For anyone whose interest was piqued by this post, the subject is mostly covered by the set of notes titled "Galois correspondence".
Mark Kac and Stanislaw Ulam explained these concepts pretty intuitively in their beautiful book Mathematics and Logic. But who was the first person to think that mathematics could be explained without diagrams? Or equations?
Edit: In particular they showed on pages 58-60, without using jargon, how the idea of permutations leads to Cardano's formulas for the cubic.
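For anyone who wants to see what is being derived there, this is the standard statement (my summary, not the book's notation): every cubic x^3 + ax^2 + bx + c reduces to a depressed cubic x^3 + px + q = 0 by the shift x -> x - a/3, and Cardano's formula then gives

\[
x = \sqrt[3]{-\frac{q}{2} + \sqrt{\frac{q^{2}}{4} + \frac{p^{3}}{27}}} + \sqrt[3]{-\frac{q}{2} - \sqrt{\frac{q^{2}}{4} + \frac{p^{3}}{27}}},
\]

with the two cube roots chosen so that their product is -p/3. The connection to permutations is that the quantities under the cube roots are, up to scaling, the cubes of the Lagrange resolvents of the three roots; those are permuted among themselves by S_3, which is what makes them expressible in terms of the coefficients.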
Does anyone have any insight into how Galois made the mental leap to his solution? Everything I've read says (or implies) that his solution came completely out of left field, i.e. it wasn't really related to anything that had come before him.
And along those lines, does anyone know of an English translation of Galois' paper?
I recently started reading an algebra book with the goal of understanding the Galois-theoretic proof of the unsolvability of the quintic. This is generally considered the capstone of a complete two-semester undergraduate algebra sequence for a math major who is not pursuing graduate-level pure math.
The interesting part IMO is the analysis of those normal subgroup chains and understanding the isomorphism to splitting fields.
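To make those chains concrete, here is the textbook contrast between degree 4 and degree 5 (standard material, not tied to the particular book above):

\[
S_4 \triangleright A_4 \triangleright V_4 \triangleright \{e\},
\]

with abelian quotients \mathbb{Z}/2, \mathbb{Z}/3 and \mathbb{Z}/2 \times \mathbb{Z}/2, so S_4 is solvable and the general quartic is solvable by radicals. For n \ge 5 the only proper nontrivial normal subgroup of S_n is A_n, which is simple and non-abelian, so no chain with abelian quotients exists; transported through the Galois correspondence to the splitting field of the general polynomial, that is precisely the unsolvability of the quintic.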
Definitions and theorems without proofs, examples, or illustrations are like a box without the gift inside. That post built up a pile of terminology but then ended before showing any content. At least it shows where to look to find the solutions.
Nice text, but "This is a very limited set of operations, and certainly not all real numbers can be written this way -- π clearly can’t be written this way." caught my attention. I know it is true, but if that is 'clear', one could just as well claim "clearly, quintics cannot be solved" and be done with it.
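For what it's worth, the standard justification the article skips (in my notation): anything you can write with field operations and radicals sits at the top of a finite tower

\[
\mathbb{Q} = F_0 \subset F_1 \subset \cdots \subset F_k, \qquad F_{i+1} = F_i\!\left(\sqrt[n_i]{a_i}\right), \; a_i \in F_i,
\]

where each step is a finite extension, so every element of F_k is algebraic over \mathbb{Q}. But Lindemann proved in 1882 that \pi is transcendental, so \pi lies in no such tower. In other words, the 'clearly' quietly leans on a deep transcendence theorem, which is rather my point.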