Math is the manipulation of abstract symbols according to abstract rules. If you don't like symbols, you don't like math. If you are illiterate in symbols you are illiterate in math.
The word is often used imprecisely, though. Because so many real-world problems can be translated into math, there is a temptation to equate "math" with "any problem that can be expressed in math". Thus you frequently see poetic statements like "my cat is great at solving differential equations", or "music is math because it's all about harmonic series and Fourier analysis". But these things aren't literally true. You can put a bucket under a flowing faucet, and it will collect all the water, but that isn't really integration. The bucket isn't doing math.
And being ignorant of math isn't the same as being stupid. As the OP points out, you can get a lot of quantitative reasoning done without using math. A classic Fun Fact About Math is that it took thousands of years to invent the number zero. And it's true. But that doesn't mean that the ancient Egyptians used to waste hours staring into newly-emptied buckets and baskets in stunned amazement, murmuring "what on earth is that" to themselves in Coptic. People understood what "having no objects" meant long before there was a symbol for "the number of objects in an empty basket". It was the highly abstract symbol "zero", and the highly abstract operations involving zero, that had to be invented. 
It's worthwhile to recognize that interpreting the real world in terms of abstract symbols, and vice versa, is a terribly difficult skill that requires lots of practice. (In my case, I was well through grad school before many bits of physics clicked.) And it's worthwhile to recognize that you can often do without math: You can reason quantitatively without it. Birds do it! Bees do it! But don't pretend that you're doing math unless you are actually doing math. The abstractions are the math.
 Or, rather, discovered. Although we'd better stop there, because I won't be able to cope with the ensuing philosophical back-and-forth.
"0" is a formal symbol with particular formal behavior
"empty/missing/none" is a well-known physical concept
Zero is a precise, powerful mathematical object that both of these can represent.
This is difficult to deny. Unless you want to discredit most widely recognized mathematicians throughout history, you have to accept that the formal language of math is relatively new. Furthermore, it's alive and growing, inconsistent and incomplete. There is a meaningful frontier, and there you can observe mathematicians really studying something else and furiously creating the formal language to describe it.
In this light, metaphor is absolutely a useful tool in the same class as formal language for explaining and reasoning about math. You're right to point out the non-equivalence of the two, but the author's Kill Math project is still very much math. Furthermore, I'm anecdotally a supporter of the author's belief that doing math competently requires knowing the metaphorical side, since your symbolic projects may fail or be unclear.
I'd be willing to accept that metaphor will never be as powerful as formal language, but denying the metaphorical does a disservice to the way (I'd wager) most people understand math.
At the heart of this trouble of definitions are Gödel's incompleteness theorems. The practical effect of their discovery was the destruction of the dreams of the formalists, who had for years hoped to discover the essential shape of the formal language from which all math would spring. With incompleteness, however, we are forced to admit that we can meaningfully study the behavior of mathematical objects that the language of math cannot reason about.
Then we extend that language, of course.
I have no time to craft a nuanced reply so let me just take a shortcut and concede: If you can get the student from "here are a bunch of physical concepts" to "here is a mathematical object, with interesting abstract properties that you can reason about" without introducing formalism you'll have succeeded in teaching math. Excellent.
Is this plan really going to work very often? It is easy to say that you can derive, say, the utility of "zero" without ever doing any arithmetic -- just as it is easy to say that you can be a full-fledged computer scientist without ever touching a computer -- but in practice?
It's true that the presence of a supercomputer in everyone's pocket will change this argument significantly. But simulations go only so far. They, too, are only metaphors, and if you don't know enough to tinker under their covers they are rather inflexible metaphors. Your classical mechanics simulator is not going to discover quantum mechanics for you.
In my experience learning about mathematical abstractions requires all of the above tools -- you tinker with the formalism, you ponder the physical analogies, you draw mental pictures of clouds and colors, you play with a simulator, you build some circuits in the lab, you go for a walk, you tinker with the formalism again, and six years later you finally get it.
I agree completely that the process of learning mathematics is probably highly multidimensional for... pretty much everyone ever. In particular, it's easy to see how formal descriptions can push mathematical generalization forward far before we have a suitable concept of the mathematical object we're describing.
I think we're all (incl. the op) in some kind of agreement here about the didacticism of math. The op didn't disregard the power and utility of mathematical languages — he came from being trained pretty heavily in engineering math, at least up to playing with higher-order differential equations — but instead was, perhaps not directly, arguing for increased metaphorical/physical descriptions in taught mathematics. He's just responding to the rather eye-opening feeling one gets when one starts to realize that math is so interpretable!
I think that's a perfectly fair argument to have. I know that in my own experience, I never understood the joy of math until the day linear algebra took on an interpretation as transformations of linear spaces.
So we're just sort of all oscillating here in strong rebuttals of whatever interpretation of the "heart" of math the prior author champions for a while. Which is fun but unproductive.
I like the challenge of taking someone from physical concepts and metaphors directly to a mathematical object. I think it'd be possible, and maybe even useful when someone first starts to learn real math, but certainly it's not the most efficient way to become well-read. It'd be a lot like explaining the meaning of, I dunno, Día de los Muertos without immersing someone in Mexican culture and the Spanish language. A single point of contact can be forced, but you lose so much context and fluency.
More directly, what I meant to say is that since, for any particular choice of mathematical formalism, there exist true theorems which cannot be proven within it, we need to operate with tools beyond mere symbolic manipulation. That was the death knell of Hilbert's Program, and it solidly separated the formal specification of math from "that thing which we're studying".
I'm not really sure what you mean about the "formal specification" of math vs. "that thing we're studying". An informal (i.e., not expressed in ZFC + first-order predicate calculus) proof of something non-trivial can go on for dozens, if not hundreds, of dense pages of symbols. If I recall correctly, Whitehead and Russell's Principia Mathematica derived arithmetic from logical axioms (their own theory of types, as it happens, rather than ZFC), and it took the whole book.
I did a little reading to refresh myself on the subject, and this stood out as a good summary of the topic:
"In a sense, the crisis has not been resolved, but faded away: most mathematicians either do not work from axiomatic systems, or if they do, do not doubt the consistency of ZFC, generally their preferred axiomatic system. In most of mathematics as it is practiced, the various logical paradoxes never played a role anyway, and in those branches in which they do (such as logic and category theory), they may be avoided."
I mostly wanted to walk around the historical event I mentioned, the breaking of Hilbert's Program. At the time, it seemed that a formal specification of math would provide a complete picture of what math was! Once the Program was finished, the job of mathematician would dwindle into that of "computer" (of the abacus sort) or dissolve into other fields which interpreted the canon.
I'm not sure which death stroke was stronger, the incredible opaqueness and complexity of formal systems like ZFC or Gödel showing that what Hilbert was attempting was outright impossible, but Hilbert's Program was killed before it even seriously took off, leaving the study of mathematics, and the practical formalisms we use to study it, pretty ad hoc instead of grand and unified.
I'm unifying that with the fact that the way math seems to be practiced never starts from the formal language; instead it first comes from imagining some kind of "mathematical object" and then taming its behavior with formalisms. You could consider the two to be one and the same and argue that the difference is purely philosophical, and this is where I'd invoke Gödel and point out that there definitely exist things we could benefit from reasoning about that your formal language fails to describe. This existence proof separates the class of true things from the class of provable things and makes their distinction more than philosophical.
Now, talking about what a "mathematical object" is gets you to the bleeding heart of the philosophy of science and epistemology. It's a tough question!
As a final note, ZFC is ZF + the Axiom of Choice... and, yes, most practicing mathematicians just accept AC so that they can integrate or whatever. The formal world without AC is very sparse, but nobody has any real idea what that arbitrary decision means. I know there has been some significant study of ZF without Choice, though it's been "impractical"; I don't know if anyone is willing or able to state that ZF without Choice is in any way worse than ZFC. "Impractical" is a mathematician's favorite adjective, so they're just two extant formal systems which disagree quite a lot on important things, and we mostly pay attention to ZFC.
At that point, all the linear algebra I couldn't figure out for the life of me all those years finally made sense. And it was the same for most of my classmates. After that, whenever I saw xY, I thought "the vector x is being moved into a new space", and all the equations made sense to me.
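That reading, "the matrix moves the vector into a new space," can be made concrete in a few lines of code (my own illustration, not the commenter's; plain lists stand in for vectors):

```python
# A 2x2 matrix as a machine that moves vectors into a new space.
# This particular one rotates the plane 90 degrees counterclockwise.
def apply(matrix, vector):
    """Multiply a matrix (a list of rows) by a column vector."""
    return [sum(m * v for m, v in zip(row, vector)) for row in matrix]

rotate90 = [[0, -1],
            [1,  0]]

print(apply(rotate90, [1, 0]))  # -> [0, 1]: the x-axis vector lands on the y-axis
```

The equations stay the same; only the mental picture changes from "grid of numbers" to "transformation."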
You could explain what an SVM is with equations to me all day, but it's only when you say "you're trying to get the plane to separate your data by a margin as wide as possible" do I actually get it, and then all the math becomes easy.
Different people have different ways of manipulating the abstract symbols, and for me it's to equate them to something I already have experience in. Then I can get the solutions intuitively, rather than pore over pages and pages of equations.
In the end, I quit academia precisely because I couldn't manipulate symbols, and thus my way of learning wasn't compatible with everyone's way of teaching. Maybe I can come up with something better if someone explains things to me in terms I can understand.
I think anyone is capable of grokking anything; it's just that the time taken to do so is variable. People who give up on "learning" something (academia in your case) just don't want to spend that time.
p.s. I also gave up on academia for the same reasons :)
* Math is full of symbology with implied meaning. For example, theta is often used for 'angle'. How many other symbols have implied meaning like that? Granted, it forms dense, concise, precise papers. Which brings me to my second point.
* If you don't know the symbology, it's difficult to read it. I believe that people suffer reading comprehension problems if they don't know how to verbalize a symbol like 'θ'.
* Lastly, the symbols make it very difficult to google for concepts.
That's the definition of a calculus, not the entirety of math.
>More generally, calculus (plural calculi) refers to any method or system of calculation guided by the symbolic manipulation of expressions. Some examples of other well-known calculi are propositional calculus, variational calculus, lambda calculus, pi calculus, and join calculus.
Math, fundamentally, is about abstract concepts, not symbols.
Sure we have intuition. But intuition can be wrong. Not all problems are as simple to explain as zero and empty buckets. Intuition is also worthless if you cannot communicate those ideas in an unambiguous fashion.
So if the OP really develops a method for communicating the concepts described by math in a much more efficient way, I am all for it! But I find that highly unlikely when the article itself admits that he has no idea what this will look like.
I have always been a proponent of having more than one way to teach a certain subject. Studies have shown that different people perceive and interact with information in very different ways. The current "one size fits all methodology" to teaching is lame, to say the least. So a new way of teaching people mathematics is welcome.
But that's not to say the old way didn't work for people like me. If this site kills math so that English majors can cope, I hope someday someone will kill poetry so that I can cope.
When humans try to learn symbolic math / How many of them struggle with the test! / The teacher thought of like a psychopath / Dishonoring the realm of human zest
“We must have our emotions!” students cry / “Or else we'll run around like apes, confused / Our brains are built for stories, not to scry / A world of numbers, strangled and abused.”
The teacher sighs, “They always drag their feet / Unless they're cornered, up against the wall. / To risk my job with answers incomplete! / They'll never use it later, after all.”
Then, big surprise! The math is found at fault / Tear-stained by cringing memories of school / “Dispense with all the symbols, and Exalt / Thine Intuition”—that shall be the rule.
Professors' lamentations curse the air / Hung out to dry for calling any bluff / “To shun defective math must be unfair / For surely no one understands the stuff.”
So woe to ye from near the world of forms / Who strain to show the populace your realms / They're immunized against your grand transforms / And explanation only overwhelms.
(Now, please don't take this poem at its word / Or treat it as authoritative fact / Exaggerated story and absurd / Polemic leave specifics inexact
The author's nearly made of symbols, note— / Despite the slow decay of some to blanks / So though he doesn't mean to seem to gloat / He'd rather keep his “freakish” symbols, thanks.)
Modded up for this line alone.
[Not that there's anything wrong with the rest of the lines. ;) ]
There are several pretty obvious problems with mathematical notation.
- Meaningless one-letter names.
- Meaning of notations is usually highly context-dependent. (Exponentiation and matrix inverse operation are denoted in exactly the same way, for example.)
- A lot depends on arbitrary conventions.
- People rarely explain how notations work syntactically. It's all ad hoc, learn-by-example.
Usually only the third can be represented visually, though often mathematicians develop diagrams which help internalise the complex notions involved in analysis and algebra too.
For example, in algebraic geometry, the more familiar notions of geometric objects such as curves and surfaces are replaced with purely algebraic notions, such as schemes. This is because of various categorical equivalences between geometric objects (on the geometric side) and various algebraic objects (on the algebraic side). But on the algebraic side, schemes are a very expansive generalisation of things that actually correspond to geometric (and visualisable) objects.
I once went to a teaching seminar on the use of a package called GeoGebra for the teaching of mathematics. None of us mathematicians could bring ourselves to put up our hands and ask how one might represent a complex of modules over a noetherian ring pictorially in GeoGebra. There's this fundamental misunderstanding amongst educators that symbolic mathematics is not essential to understanding maths.
This is an important insight when it comes to computer programs though. The same thing happens in computer science. You get splits between things that are geometric, symbolic and purely computational.
Often I get really annoyed at people showing off their latest concurrent programming paradigm by implementing a GUI or event loop for some graphical or network application. They forget that many things simply don't fit into that paradigm.
I equally get annoyed at computer scientists for forgetting that the number of integers is not about 10. Sometimes us mathematicians really want to do things with matrices of ten thousand by ten thousand entries.
We don't need mathematics to figure out how a swinging pendulum works; we can do that intuitively. (Actually, this isn't completely true, but we can at least get the general idea.) Simple problems like that are worked out in classes so that students can get used to the mathematics. In domains where our intuition fails us - e.g. quantum mechanics, high-energy physics, statistical mechanics, not to mention 11+ dimensional formulations of string theory, infinite or fractional dimensional spaces, and more esoteric theoretical mathematics and physics - we rely on mathematical symbols and abstraction to guide us, because our finely honed physical intuition is useless (and sometimes worse than useless).
I'm interested in seeing what the author does with concepts like superposition and n-dimensional spaces. Replacing them with graphs and animations is not going to cut it.
1) Abandon the absolutely batty practice of representing everything with a single symbol. It's crazy. We still can't reliably represent mathematical symbols over the internet in text; we have to rely on images. I'm obviously biased, but mapping mathematical functions to actual words (à la programming) would be a big win, imo.
2) Lock up all the physicists, mathematicians, engineers, logicians, you name it, and have them agree on a single unified notation. Every symbol ought to have one meaning and everyone needs to stop stealing symbols from a different field and giving it a new meaning! I'd wager that more than half of the symbols in this list (http://en.wikipedia.org/wiki/List_of_mathematical_symbols) have several valid interpretations. No thanks!
"Let V1 and V2 be a subspaces of W, such that their intersection is zero. Let f be a mapping from a direct sum of V1 and V2, such that it takes a vector, whose first component is x and second y to a difference of x and y multiplied by two."
And this was an easy example; I can think of _a lot_ harder ones.
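For contrast, here is roughly what that paragraph of prose compresses to in symbols, f(x, y) = 2(x - y), and in code (a hypothetical rendering of my own, with vectors as tuples of numbers):

```python
# The prose above, rendered symbolically: f(x, y) = 2 * (x - y),
# where x is drawn from the subspace V1 and y from V2.
def f(x, y):
    """Map a pair (x, y) from the direct sum V1 (+) V2 to 2*(x - y)."""
    return tuple(2 * (a - b) for a, b in zip(x, y))

print(f((3, 1), (1, 1)))  # -> (4, 0)
```

One line of notation versus a paragraph of prose is exactly the trade the thread is arguing about.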
There is no regulating body for mathematical notation -- it can be (and usually is) created by introducing it in some paper or book by the mathematician who invented it and regards it as useful. Frequently, more than one notation is introduced, but usually only one survives -- hopefully the best one. The only ways to encounter several different notations in use at once are either reading very old works, which is not a good idea anyway, or the most recent ones, but I presume that people who are able to read those are also able to get over such a minor problem.
Seriously, I believe that mathematical notation is a lot clearer, more intuitive and easier to understand than the syntactic rules of many programming languages, for instance C++. Symbol overloading almost never poses a problem, since the intended meaning is usually obvious from the context. If one frequently misunderstands the intended meaning, it is a sign that he does not really get the concepts involved, and the fact that he is confused by notation is the smallest of his problems.
Moreover, symbol overloading most often takes place only when the sign represents the same idea in all contexts. For instance, one usually uses the '+' sign to represent a binary commutative operation whatever structure we are talking about, because, well, it represents a similar idea. One can go even further and say that symbols like \oplus and \times, in most of the contexts they are used in (Cartesian product of sets, direct product/sum of rings/groups/modules/vector spaces/mappings), actually represent exactly the same idea -- namely, the notion of product/coproduct in some category.
There are a lot of different symbols in use in math. If we abandoned symbol overloading, we would need to introduce many, many new symbols, and this would create real confusion.
Heh. Heh. Heh.
So I would have believed until I tried to learn differential geometry. The default is to eliminate all parts of the notation that are unambiguous. Proving that they are unambiguous is left as an exercise to the reader, and the exercise is often non-trivial. Furthermore widely used constants vary by factors of 2 pi depending on who is using it.
Of course, I agree that any idea of a "symbol standardization committee" for mathematics is both crazy and stupid.
integrate(start, end, function)
Of course, I can think of a few problems with this. The fact that the large 'S' symbol is understood by everybody, regardless of their language, is one advantage of symbols that comes immediately to mind.
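For what it's worth, the word-based form proposed above is easy to sketch; this is a toy midpoint-rule approximation of my own, not anyone's actual proposal:

```python
def integrate(start, end, function, steps=100_000):
    """Approximate the definite integral of `function` over [start, end]
    with the composite midpoint rule."""
    width = (end - start) / steps
    return sum(function(start + (i + 0.5) * width) for i in range(steps)) * width

# The integral of x^2 from 0 to 1 is 1/3.
print(round(integrate(0, 1, lambda x: x * x), 6))  # -> 0.333333
```

The named-function form is certainly more googleable than the elongated S, which is half the parent's point.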
Calculus is a beautiful subject with a host of dazzling applications. As a teacher of calculus for more than 50 years and as an author of a couple of textbooks on the subject, I was stunned to learn that many standard problems in calculus can be easily solved by an innovative visual approach that makes no use of formulas. Here's a sample of three such problems:
The second paragraph is even worse. This guy simply has no idea what he is writing about. "Assigning meaning to sets of symbols" is just abstracting away unnecessary details and focusing on the important information in a problem. "Blindly shuffling symbols according to arcane rules" -- um, calling math "arcane" is a clear indicator of a person's lack of understanding. The rules are not arcane; everything has an explanation (a proof) and is derived from other things in a logical way (and as far as we know, the world acts logically) -- some of them are axioms, which seem to be abstractions of the most basic properties. Also, "shuffling blindly" is in fact spotting patterns in things on different levels.
Sure, explaining things on a more intuitive level, using e.g. graphical representations, is sometimes really helpful. However, sometimes intuition doesn't work: how do you graph a 3-dimensional manifold embedded in R^4? Or a finite field? Also, I can't imagine any other way of doing math that would be consistent and useful besides the one we are using now.
I'm actually quite surprised by that. Those are both excellent schools with top notch math departments, and the example that he mentioned in the article (about not really grokking the second order ODE and what it meant re: the phase space plot, etc.) indicates that he took a seriously badly taught class. I'm kind of surprised you could get through a diff eq's course at either school without having such basic stuff taught to you...makes me wonder who the teacher was, maybe it was pawned off on a grad student?
Edit: whoops...just looked back at the article, turns out he didn't take that diff eq course at Caltech or Berkeley, it was at a local college. That explains a lot. I'm not going to say that there are no good teachers at mediocre schools, nor that there are no bad teachers at the good ones, but on average there's a huge discrepancy in the quality of the classes.
I had similar experiences, where I took classes at a local college while in high school, and thought I was stupid or something when I didn't "get" them, only to find that when I took them again at a better school they were, in fact, very easy topics.
I blame the textbook writers in part: for one example, Serge Lang's math books are extremely difficult as a rule, and leave a lot of the scaffolding out. Scaffolding which, when he taught classes himself, he always filled in to make for an amazingly smooth and effortless learning experience, but which lesser teachers would never think to talk about (perhaps because they don't understand it themselves, or at least don't understand how important it is to explain). It's really a shame that even now, after so many centuries of teaching math to people, the effectiveness of the process is still so utterly dependent on the teacher. Hopefully things like the Khan Academy will begin to rectify these problems.
Maybe it's not a shame. Maybe it's just an indication that teaching is hard, like art, science, programming, and discovering new theorems, not easy like answering phones in a call center or being a short-order cook.
I am all for finding a way to explain quantitative concepts in a new way. However, it will be extremely difficult to avoid falling into the trap of "reinventing the wheel" if all we're talking about is coming up with a new set of symbols.
A certain recipe serves 3, but the cook is only cooking for 2, so she needs to 2/3 all of the ingredients. The recipe calls for 3/4 cup of flour. The cook measures out 3/4 cup of flour, spreads it into a circle on the counter, takes a 1/3 piece out of the circle and puts it back into the bag. That's 2/3 of 3/4.
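As a quick check of the arithmetic in that example (my addition, using Python's exact fractions), 2/3 of 3/4 cup does come out to a tidy half cup:

```python
from fractions import Fraction

# Scaling the recipe: 2/3 of the 3/4 cup of flour.
flour = Fraction(3, 4) * Fraction(2, 3)
print(flour)  # -> 1/2
```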
Much easier to eyeball 1/3 when it's laid out in a rectangle as opposed to a circle. Author credibility -1
Did you just call the set of symbols evolved by mathematicians for thousands of years mindless? Credibility -2
Finally the two animated examples given are clever but not groundbreakingly clear. -3
It's a neat project but maybe you could think a little harder about defining your problem.
Maybe for you, but certainly not for me, and I'm guessing most bakers would agree with me. Bakers are used to circles because of pies. I can eyeball a third of a circle, but I'd have trouble eyeballing a third of a rectangle that I couldn't fold.
Further, the baker often works by feel, so an exact amount is not needed in these circumstances.
"Did you just call the set of symbols evolved by mathematicians for thousands of years mindless?"
Perhaps a better word would have been arbitrary, but there's no fundamental reason we pick y = mx + b. The letters y, m, x, and b are picked arbitrarily, and we pick them without questioning whether they are optimal for initial learning.
I grokked math as a kid, but it was precisely because I was able to make the leap that the language of math was arbitrary and substitutable while other kids were stuck not understanding the meaning.
Then we need to teach them that, not a new set of symbols. Again, I think the crucial insight here which you uncovered is that people are distracted/confused by the symbology, perhaps trying to take everything too literally.
By the way, the way to "eyeball" a third is to use your two hands (rotate them so palms facing each other) to divide into sections A B and C; since we can very accurately eyeball a 50/50 split, you simply compare A to B and B to C, then adjust your hands until A=B and B=C. Bam, you have thirds. Once you get good at this you just mentally visualize invisible dividers instead of actually using your hands.
When it comes to a circle, if you are staring at pies all day then maybe you are better than average, but many studies have shown that humans are horrible at discerning angles other than 180 and 90 degrees.
You've given him some arbitrary credibility rating of -3 because his examples weren't exactly how you would have done them, except that you didn't write them; he did. Your contribution was to write some bitchy comment about it.
A mere incremental change isn't worth the hardship of giving up the status quo.
His argument is that analyzing a differential equation without exploring it in phase space was like analyzing a piece of sheet music without actually hearing it.
And if music education were taught in that way-- by looking at music purely as the manipulation of concrete symbols-- I imagine some of us would be writing "kill music (as it is currently taught)" blog posts as well.
EDIT: minor grammar fix
They made some students learn something using concrete examples only, and then expected them to be able to abstract. That's just dumb.
I learn the maths, and apply to various situations in my head to make sure I understand it.
If we were to follow this guy's ideas it would cause even more of a class divide between those who can understand the "magical symbols" and those who can't. Doing what he suggests would mean that the non-cognoscenti wouldn't even have access to the understanding of simple algebraic equations.
"Next, calculate the fitness of the algorithm and add it to the pool if it is better than the worst of the last generation: <math here>"
That's exactly the opposite of my impression. Most papers are full of text, and symbols are not the main feature. For instance:
http://ttic.uchicago.edu/~yury/papers/kuratowski.pdf Graph theory, the symbolism is next to nonexistent.
http://math.berkeley.edu/~aboocher/math/tietze.pdf Topology, still symbolism does not take much space.
http://www.jstor.org/stable/1989708 Classical and very highly technical, yet the ratio of text to symbolism is still in favour.
One example: using Robinson's infinitesimals makes it very easy to write code for forward-mode exact differentiation (not symbolic, not approximate). But justifying these simplifications is hard. The question is how much math can be simplified by analogous means without wrecking the foundations.
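A minimal sketch of that idea, assuming the usual dual-number formulation of forward-mode differentiation (class and function names are my own invention):

```python
# Forward-mode exact differentiation via dual numbers: carry a value
# together with its derivative, and let arithmetic propagate both.
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):  # product rule, baked into the type
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)

    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f at x + epsilon and read the derivative off the epsilon part."""
    return f(Dual(x, 1.0)).deriv

# d/dx (x^2 + 3x) at x = 2 is 2*2 + 3 = 7 -- exact, not approximated.
print(derivative(lambda x: x * x + 3 * x, 2.0))  # -> 7.0
```

The code really is this short; it's the justification (why "x + epsilon" is a legitimate object) that takes the hard math.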
Another example: many really useful systems have to deal with uncertainty. I have yet to see a system that allows programmers to easily build such models. I have seen some nice ideas: probability monads, Bayesian networks, etc. But how many non-specialists are prepared to use such tools? Happy to have HNers prove me wrong on this one.
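For the curious, the probability-monad idea can be sketched in a few lines; this is a toy of my own, with a distribution represented as a dict from outcomes to probabilities:

```python
# A toy probability monad: unit wraps a certain outcome, bind feeds each
# outcome through a function that returns a distribution, then mixes.
def unit(x):
    return {x: 1.0}

def bind(dist, f):
    out = {}
    for x, p in dist.items():
        for y, q in f(x).items():
            out[y] = out.get(y, 0.0) + p * q
    return out

coin = {"H": 0.5, "T": 0.5}

# Flip twice, count heads.
two_flips = bind(coin, lambda a: bind(coin, lambda b: unit((a, b))))
heads = bind(two_flips, lambda pair: unit(pair.count("H")))
print(heads)  # -> {2: 0.25, 1: 0.5, 0: 0.25}
```

The machinery is tiny; the usability problem the comment raises is in composing models much bigger than a coin flip.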
That's basically the approach Rosetta Stone takes, and it works fairly well.
I have a friend who was able to buy a laxative in Italy without knowing a word of Italian. It's a funny story. I doubt the same approach would work with math.
I remember learning calculus in college. The professor went up to the board, scribbled down symbols, and took us through various procedures. I was absolutely, utterly lost until my father (an engineer) told me that "a derivative is a rate of change."
At that instant, I understood everything. My professor never said this.
This taught me to approach math concepts-first, and that helped, but I've always had a problem with math. To make a long story short: I hate math for the same reason that I hate Perl. My mind recoils in horror from messy, crufty languages.
At the very least, all math lessons should begin by teaching the language and the concepts that the various symbols, arrangements, etc. refer to. Only once the language is thoroughly grasped should they proceed to methods, procedures, and problems. Right now it's like teaching Chinese literature before teaching Chinese...
So I'm sympathetic to the author's desire for better visualization and teaching tools.
But when I reached college, I became frustrated with math. It just wasn't easy anymore, the way programming was: I could pick up a programming book, read it in a weekend, and understand it. But when I tried to read an advanced math text, I became lost after 10 pages.
Eventually, I figured out what had happened: The information density of college-level math texts is insane. Even if you're bright and talented, it may take you a day to understand a single page. And there's no substitute for working carefully, finding concrete examples, and slowly building a deep understanding.
Here's an example that involves programming. Once upon a time, I needed to understand monads, in hope of finding a better way to represent Bayesian probabilities.
I started with the monad laws, a handful of equations relating unit, map, and join. I read countless monad tutorials, and dozens of papers. I read every silly example of how monads are like containers, space suits, C++ templates, and who knows what else.
I wrote little libraries. I learned category theory. I wrote a monad tutorial. I eventually wrote a paper explaining a whole family of probability monads.
And then one day, I thought about the monad laws again. I realized, "Hey, that's it. That's all. Just unit, map, join, and a handful of equations. Anything that quacks like a monad is a monad. How did I ever think this was complicated?"
But when I look at the monad laws today, there's this huge structure of connections in my head. All that work, just to grasp something so simple, and so easy.
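The unit/map/join formulation really is just a handful of equations. A rough sketch, using Python lists as the example monad (the function names and law checks here are my own phrasing, not from any particular text):

```python
# The monad laws for lists: unit wraps a value, fmap maps over the
# structure, join flattens exactly one level of nesting.

def unit(x):
    return [x]

def fmap(f, xs):
    return [f(x) for x in xs]

def join(xss):
    return [x for xs in xss for x in xs]

m = [1, 2, 3]
mmm = [[[1], [2]], [[3, 4]]]  # a triply-nested value for associativity

# Identity laws: join . unit == id, and join . fmap(unit) == id.
assert join(unit(m)) == m
assert join(fmap(unit, m)) == m

# Associativity: join . join == join . fmap(join).
assert join(join(mmm)) == join(fmap(join, mmm))
```

Anything with a `unit`, `fmap`, and `join` satisfying those three equations quacks like a monad.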
So I'm all for building better visualizations, and for helping people to understand math intuitively. That's an important step along the path. But math doesn't stop at an intuitive understanding. When you really understand it, the equations will suddenly be easy, and everything will fit together.
And then you'll encounter the miracle of math: Your deep understanding will become the raw material for the next level. Counting prepares you for addition, addition prepares you for multiplication, basic arithmetic for algebra, algebra for calculus, and so on. And someday, I hope that my rudimentary understanding of category theory will prepare me to understand why adjoint functors are interesting.
My favorite example is this: Complex Variables and Applications by Brown and Churchill. This book has been in print for 70 years or something, and it's somewhere around 400 pages in the current edition, I believe. The professor I did research for had an early-'80s edition, and it had almost 100 fewer pages than the current edition. There wasn't really anything new added between the versions (chapters are only about 5-10 pages, so there's something like 65 of them). I ended up using mine for the problems and his for reading, because I have ADHD and the wordiness absolutely kills me. Symbols and relationships are much more meaningful to me than words describing them. The real nightmare with the ADHD sets in because of the break in context when you have to switch between two or three pages to find the next theorem, formula, or proof.
I retook that class twice.
On the other hand, I utterly and completely rocked my Advanced Electrodynamics course, outscoring even the graduate students, in a course which even made use of the stuff we were learning in Complex Variables (as well as PDEs and all that fun stuff). Why? I had a crazy Russian professor who hated all the current textbooks (I'm looking at you, Griffiths) for the same reason I hated them: too many words and not enough symbols. So he wrote his own notes for every lesson and made his own homework. He said he originally wrote those notes when he first came here and his English was worse, so there's little or no explanation, just proofs -- math and symbols. A few of these would span two pages, and very rarely three, but there wasn't the context break you get in many college-level books, just beautiful math and lots of intermediate steps. The intermediate steps, almost never provided in most textbook proofs, really help visual learners like me and provide stepping stones for the inevitable manipulation you will perform with those equations in your homework and on tests.
I still have all his notes; I want to bind them up some day when I get the chance.
Then... y'know. Actually doing so :)
(I realize that scanning tons of pages and ensuring quality isn't exactly a small undertaking, but I'm sure the HN community would greatly appreciate your efforts if such a thing were possible.)
Thanks (or at least, thanks anyway)!
It's similar to Griffiths, but I feel like it has more examples and is a little more concise in those examples than Griffiths (although I did borrow Griffiths occasionally). I grabbed one on eBay for $5 or something, and it was well worth it.
But Lorrain and Corson just start out with special relativity, without really providing any impetus for why it is necessary, which can be found in the very first paragraph of Einstein's paper on relativity. Historically, physics has been a mystery that keeps unraveling with time. However, LnC destroy the mystery by telling us who the murderer was in the opening chapter :-) We don't really get to appreciate how relativity was discovered. I think teaching the mystery is a very important and easily overlooked part of physics education, which Griffiths seems to appreciate but LnC don't.
(To pique your curiosity: read it to understand why Hipmunk is awesome - there's much more to it than just that of course)
[Edit: wow! http://worrydream.com/cv/bret_victor_resume.pdf]
Which makes him the perfect Apple employee, as they cover the space from hardware engineering through user experience better than anyone.
Just started 'What is Mathematics?' by Courant, and I am totally hooked. He talks about everything from why we chose to adopt the decimal system to why pi was needed to solve certain problems.
As Hammock says, the author has to be clear about defining the problem itself. Is it mathematics as it is represented today that is lacking, or the method of teaching?
> using concrete representations and intuition-guided exploration.
Is it just me, or are these two lines very much at odds with each other?
For instance, if I want to talk about 26-dimensional discontinuous space (the space of the basic alphabet), there is no visualization that can help you grasp the totality of the matter.
I believe there are a few interpretations of what math actually is (formal reasoning, interpretation, etc), philosophy of math 'junk'. I'll leave that to a more-beered time. :-)
The downside of the growing public awareness of people on the autism spectrum, some of whom are geeks, has been the slow trend towards conflating intellectualism with atypical mental function.
Some people like to call right now the "victory of the geeks". I suspect that within the next 10 years, nerdy kids will start being diagnosed as having Asperger's by school counselors and the like with about as much care, caution, and accuracy as we saw with ADHD.
And code cannot be misinterpreted: either it runs or it does not.
So, dear mathematicians, please make the leap and cast your ideas as Python code.
Sadly, ponies are not the best (or even appropriate) way to represent software, just as Python code is not useful in describing math.
With that newfound interest in mathematics, I set about finding a book that I could use to teach myself. I won't say how I came across it, but I ended up with "Practical Mathematics" by C.I. Palmer - the 1919 publication, very old. The book is intended for working adults in mechanical trades; the problems were often posed as real-world problems from the technical trades. The author also used much more technical language! It was refreshing! Even my high school math textbooks felt like the authors were trying to teach kids and regarded their audience as nothing more than insufferable children. This book has been immeasurably valuable to me. I now feel less "darkness" and confusion when I see a math problem, and I'm finally seeing the utility in my everyday life of the things I'm learning (my gf had a wedge table she was selling on Craigslist and needed to know the length of the arc, for example). I actually know how to add and subtract fractions; it's no longer a mystical act of numbers disappearing here and showing up there.
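The wedge-table problem is a nice example of practical math: from two tape-measure readings (the chord across the wedge and its height), you can recover the radius and then the arc length. A sketch, with made-up measurements for illustration:

```python
import math

# Arc length of a circular segment from its chord and height (sagitta).
# The radius comes from the sagitta formula, the angle from basic trig.

def arc_length(chord, height):
    radius = (chord ** 2 + 4 * height ** 2) / (8 * height)
    angle = 2 * math.asin(chord / (2 * radius))  # central angle in radians
    return radius * angle

# Sanity check: a chord of 2 with height 1 is a semicircle of radius 1,
# so the arc is pi.
print(round(arc_length(2.0, 1.0), 5))  # 3.14159
```

That's exactly the kind of shop-floor calculation Palmer's book was written for.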
The greatest thing about that book? Certain operations that most schools only ever taught me as a mechanical process were explained to me with the underlying fundamentals in "Practical Mathematics". In 1919 there were no calculators; it all had to be done by hand, even square roots, and you couldn't really survive without knowing why or how certain mechanical operations are used the way they are in math.
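Square roots by hand are a good example of a mechanical process with an understandable core: guess, divide, average, repeat (Heron's method, which old practical texts taught in one form or another). A minimal sketch, with an arbitrary starting guess:

```python
# Heron's method: if guess is too big, n/guess is too small (and vice
# versa), so their average is a better estimate of sqrt(n). Assumes n > 0.

def sqrt_by_hand(n, steps=8):
    guess = n if n > 1 else 1.0
    for _ in range(steps):
        guess = (guess + n / guess) / 2  # average the guess and n/guess
    return guess

print(round(sqrt_by_hand(2.0), 6))  # 1.414214
```

A few averaging steps with pencil and paper get you several correct digits, which is why the method survived for millennia.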
I'm not sure if college or higher mathematics gets into that stuff, but it was a revelation for me, and I'm well on my way through the algebras and trig now - I didn't even make it out of pre-algebra in the traditional educational system. I can also see the beauty of mathematics, something I never thought I would understand about mathematicians back when I hated the subject.
There will never be any "killing" of math. But our educational system has a long way to go.
Lessons I draw from this:
- In designing a website that outsiders visit, you should probably follow this rule: if your page must include a heavily resource-intensive object, it should either be visible and obvious at the top of the page, or have a "Start" or "Play" button and not run until the user presses it.
- Firefox should get some (easily accessible) way to see resource usage broken down by tab. Chrome does this; it's probably made easy by the fact that Chrome makes a separate OS process for each tab (or group of tabs). I wouldn't recommend switching to that model, because it would be work and because I wouldn't like it (I don't like how it clutters up the global process list in Activity Monitor), but it would be nice if you could see CPU and memory usage broken down by tab.
- (Optional) Browsers shouldn't run purely graphical animations if they're not visible (like if they're several pages down). This might not be perfectly achievable--e.g. if animations started when you scrolled down to them, then one animation slightly higher than another might start earlier when the designer wanted them to be synchronized. Maybe you would instead have background things keep time without actually rendering anything. This might not work for cases where the nth frame depends on the conditions of the n-1th frame, and so you'd have to run the whole simulation in the background anyway--though maybe leaving out the "draw" part would make a big difference, I don't know. But if it's a series of static frames, like a GIF, then I think this would work--when the GIF is offscreen, the browser would just keep incrementing a frame number (mod the number of frames in the GIF).
Math and language are two faces of the same ability we have to model the world in arbitrary symbols. Just like math, language has symbols and syntax that have meaning to us regardless of representation (spoken, written, or digital).
E.g., his claim about symbol manipulation is total nonsense.
His direction is a waste of time.
If he wants to improve materials for learning math, then fine, but he should first learn some math.
He should start with the books and papers of P. Halmos, one of the best writers of math ever.
Note: My Ph.D. dissertation research was on the math of stochastic optimal control.