* * *
Offloading computation only works when you understand what the computations are and why we do them. That's something that must be learned; it isn't knowledge that springs fully formed into our minds as soon as we step into a classroom.
Carrying out computations thus gives us explicit and implicit knowledge of how the things we may eventually automate actually work. But it's also valuable because it trains us to compute in a precise and effective manner—a capability that remains useful later on. For instance, in logic it's often important to be able to carry out syntactic manipulations (e.g. into normal forms) in one's head, or even tacitly.
I'm sure there are plenty of examples from other areas of mathematics where computation is important, it's just that we do it so automatically that we don't think about it. Often I've found that students have trouble following proofs that take logically and computationally innocent steps without saying what's going on. Here I don't mean things like applying AC in the background, but just simple tricks like de Morgan's laws or taking the contrapositive. They have difficulty because they haven't taken those steps often enough themselves to have internalised them.
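For concreteness, the two "simple tricks" in question, written out:

```latex
% De Morgan's laws: negation swaps conjunction and disjunction
\neg(P \land Q) \;\equiv\; \neg P \lor \neg Q
\qquad
\neg(P \lor Q) \;\equiv\; \neg P \land \neg Q

% Contrapositive: an implication is equivalent to its contrapositive
(P \rightarrow Q) \;\equiv\; (\neg Q \rightarrow \neg P)
```

A proof that silently replaces "if A then B" with "if not B then not A" is trivial to follow once these are internalised, and baffling before.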
And in particular I would like to hold up Electricity and Magnetism 2. Calculating the momentum of a magnetic field, in all but the most trivial cases, takes a full sheet of paper: rows and rows of eight-inch-long equations as you carry out the tedious work of canceling terms, moving things in and out of square roots, and multiplying large polynomials together. It's all basic algebra you learn in high school, but it's such a slog, and so time consuming, that you lose track of the big picture and end up with little better understanding at the end.
As far as I know, that's why things like tensor and bra-ket notation had to be invented in the first place. Without a compressed notation, the ability to get a correct answer to any interesting problem became less a question of knowledge and more a question of the probability of transcription and sign-flip errors.
Not that anybody teaches sophomores tensor notation.
- What is the tangent line? How does it connect with the derivative?
- What is a limit? How is it used to make the above rigorous?
- What is the Fundamental Theorem of Calculus? Why, non-rigorously, would you expect it to be true?
That is not a random list. That's a list of the most important concepts taught in the first Calculus course or two. If you couldn't give a quick impromptu explanation of ALL of them, then you failed to master the key concepts. (Don't worry, most can't.)
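For the third item, here is the statement itself, with the usual non-rigorous picture: the accumulated-area function grows at a rate equal to the height of the graph, because the sliver of area added between $x$ and $x + \Delta x$ is approximately $f(x)\,\Delta x$.

```latex
% Fundamental Theorem of Calculus (f continuous on [a, b]):
\frac{d}{dx} \int_a^x f(t)\,dt = f(x),
\qquad
\int_a^b F'(x)\,dx = F(b) - F(a)
```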
To get to Terry Tao's formal math stage, you'd need to take proof-heavy courses such as real analysis.
But what I mean is that the 25th time you're doing an integral to ram home some trigonometric identity, or working out a Fourier series for a PDE course, it's not because anybody hopes this is the time you get the epiphany; it's because the teachers need something for the grade books and you need to be able to do it during a midterm.
Assuming Wolfram wasn't engaged in just an attempt to sell more Mathematica licenses, I would assume that was kind of his point. If you dump most of the endless repetition onto Maxima/Maple/Mathematica, you could actually spend the semester on the concepts and proving them, instead of focusing so heavily on the student's facility at algebraic manipulation.
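A minimal sketch of the offloading idea, using plain Python numerical integration in place of a full CAS. The square wave and its well-known Fourier coefficients ($b_n = 4/(\pi n)$ for odd $n$, $0$ for even $n$) are my illustrative choices, not anything from the discussion above: the machine grinds out the integrals, and the student's time goes to why the odd harmonics survive and the even ones vanish.

```python
import math

def fourier_sine_coeff(f, n, period=2 * math.pi, steps=10_000):
    """Approximate the n-th Fourier sine coefficient
    b_n = (2/T) * integral over one period of f(x) * sin(2*pi*n*x/T) dx
    with a simple midpoint rule."""
    h = period / steps
    total = 0.0
    for k in range(steps):
        x = (k + 0.5) * h  # midpoint of the k-th subinterval
        total += f(x) * math.sin(2 * math.pi * n * x / period)
    return 2.0 * total * h / period

# Square wave: +1 on (0, pi), -1 on (pi, 2*pi), extended periodically.
square = lambda x: 1.0 if (x % (2 * math.pi)) < math.pi else -1.0

# Known closed form: b_n = 4/(pi*n) for odd n, 0 for even n.
for n in range(1, 6):
    approx = fourier_sine_coeff(square, n)
    exact = 4 / (math.pi * n) if n % 2 == 1 else 0.0
    print(n, round(approx, 4), round(exact, 4))
```

The point isn't that students should never integrate by hand; it's that the 25th repetition of the same integral can be delegated once the pattern is understood.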
Now, having had to do everything by hand, I have the knee-jerk reaction of "well, I had to do it, so they should do it too," but then I also remember that it sucked giant balls. As I see it, students definitely need a pretty solid facility at doing this sort of shit, and so we get the classic "where do we draw the line" problem, which means I should probably be counted not as a proponent of Wolfram so much as a sympathizer (in this regard; fuck NKS).
*Also: while I didn't take real analysis, I did get a minor in math, which included Basic Concepts of Mathematics, or as I tend to remember it, "that semester of not being able to divide because it's not defined over the integers." It was certainly a purely proof-oriented course, and my Numerical Methods 1 & 2 were at least 50% proof-based, so I've done the formal-rigor thing.
I understand the grader was in a hurry, and the trig identity demonstrating that my answer was, in fact, equivalent to the standard one is not easy. But I had the right answer! And proved it was right, right there on the test!
I still remember the outrage. Over a question that did not matter then (I got an A+ in the course either way) or now.
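A quick way to do what that grader didn't have time for: before wading into the identity, spot-check numerically that the two expressions agree at many sample points. The particular pair below (the half-angle identity $\sin^2 x = (1 - \cos 2x)/2$) is my stand-in, since the original answer isn't given.

```python
import math
import random

def numerically_equal(f, g, trials=1000, tol=1e-9):
    """Sample both expressions at random points; any disagreement
    beyond tol proves they are different functions. Agreement at many
    points is strong (not conclusive) evidence they are the same."""
    rng = random.Random(0)  # fixed seed for reproducibility
    for _ in range(trials):
        x = rng.uniform(-10.0, 10.0)
        if abs(f(x) - g(x)) > tol:
            return False
    return True

# Two forms of the same answer, related by a half-angle identity.
mine = lambda x: math.sin(x) ** 2
standard = lambda x: (1 - math.cos(2 * x)) / 2

print(numerically_equal(mine, standard))  # True: agrees at all 1000 points
```

This doesn't replace the algebraic proof, but it would have told the grader in a second that the two answers were worth a closer look.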