Calculus Learning Guide (betterexplained.com)
472 points by fauria on Apr 13, 2016 | 97 comments



Apart from Keisler, mentioned there, I think there are two lovely and rigorous references especially appropriate for anyone with a CS background:

* Henle & Kleinberg [1]

* Hubbard & Hubbard [2]

Both are breathtakingly beautiful. Henle & Kleinberg is small and neat. Hubbard & Hubbard integrates calculus and linear algebra. In some ways it reminds me of SICP, described by Peter Norvig as "[...] a rich framework onto which you can add new learning over a career". [3]

[1] https://books.google.co.uk/books/about/Infinitesimal_Calculu...

[2] http://matrixeditions.com/5thUnifiedApproach.html

[3] http://www.amazon.com/review/R403HR4VL71K8


This is fantastic!

After nearly 20 years as a self-taught programmer, I've been thinking about going back to school to get a CS degree (or master's).

Since I haven't taken linear algebra or calculus those are some pretty big areas of worry for me.

I've been looking for some good resources on the two. I started watching the video for year one calculus from MIT open courseware (specifically this one: http://ocw.mit.edu/courses/mathematics/18-01-single-variable...) ... and was so lost I had to back out and google "what is calculus?"

I'm pretty basic here :)

Turns out it's the math of "measuring change"

A colleague pointed me to Khan Academy as well, but this is a welcome resource. Thanks.


Kalid from BetterExplained here, glad you like it!

I had similar confusions; despite years of engineering classes, I didn't build an intuition for Calculus until a decade later. It felt like such a waste. (Yes, I could memorize rules; but could I visualize the product and quotient rule?)

I decided to Elon Musk It™ and reason from first principles (http://betterexplained.com/articles/honest-learning/).

What would a learning plan look like, assuming limited motivation, a need for genuine understanding, and a goal of starting with appreciation and moving to proficiency?

Just like learning music, I'd listen to the song, then clap along to it, then play a few chords along with the band, then learn a bit of music notation, and then practice playing it. I wouldn't start with scales (er... limits) for weeks on end. Not if I was honest about my motivations.

Ultimately, we have to acknowledge the approach that helps us. For me, investing a few hours of exploration before the formalities is well worth it.


Kalid, thanks for your amazing content!

Reading this from the link you provided: "In my ideal world, every Wikipedia topic would have a guide that took you from the 1-minute version to a full technical understanding. Go as far as you wish, make meaningful progress at each step, and have fun along the way."

Have you thought about how to build/scale it? Because it would seem like an amazing site, and maybe a good startup.


Thanks Petra!

I've been kicking around ideas for an "intuitive wikipedia", guides explicitly (and realistically) designed to get you from curiosity to mastery.

Wikipedia has a few problems:

* Too much detail for beginners (to be fair, it's a reference, not a tutorial)

* Too difficult to contribute. Even excellent contributors have to fight to get changes in.

* Generic, boring, academic "tone of voice" (kills the fun for me)

Previously I'd tried to make apps, forums, etc. to help aggregate this, but it was over-engineering. My idea:

1) Have a topic like the Fourier Transform

2) Write a tutorial (http://betterexplained.com/articles/an-interactive-guide-to-...)

3) Collect feedback [see comments on article], examples, diagrams, etc. Make it easy to contribute an insight.

4) Weave the final result into an honest, realistic guide from curiosity to mastery. (I don't have one built yet for the Fourier Transform.)

I'd prefer to organically grow the guide out of our shared experience vs. "imagining" what a hypothetical student needs. (The hypothetical student should study limits for weeks, then derivatives, then integrals. In my experience it's the other way around.)

For scaling, getting the first few right should hopefully make a template others can take and run with. As long as we're dealing with text, it's infinitely editable and updatable. (Why are we so enamored with video? We can't fix our inevitable mistakes!)

I'd love to chat more if you like! Just ping me at kalid.azad@gmail.com.


> "In my ideal world, every Wikipedia topic would have a guide that took you from the 1-minute version to a full technical understanding. Go as far as you wish, make meaningful progress at each step, and have fun along the way."

This is my biggest gripe with Wikipedia for mathematical topics - as it stands, it's only useful if you already know the material, in which case you probably shouldn't be using Wikipedia as a reference / refresher except as a last resort.


Exactly. Wikipedia helps me remember things I've forgotten, but it's slow going when trying to learn a new topic. The first paragraph includes a set of links that you have to recursively follow just to parse it.


I just wanted to jump in here and add that trig never made sense to me until I read your "Intuitive Trigonometry" page. I was well into calculus and (even after getting an A in my college trig class) suddenly realized that I didn't have even the slightest idea what a sine, cosine, or tangent actually was. Some googling led me to your page and it suddenly made sense!


This looks amazing, and is so timely! I'm taking a supply chain management class. Though the pre-reqs said basic algebra was fine, they mention a lot of Calculus and I feel like I'm not fully understanding the material without it.

One of my biggest frustrations about learning math is that I don't get the concept, and it makes it hard for me to just work through equations. So your approach looks fantastic to me. The motivation level and how much time is available-- that's exactly what I need. The 1 minute breakdown was perfect. I learned something immediately, and more importantly-- I was then excited and intrigued about learning.

I seriously can't thank you enough for this.


That's awesome to hear, thank you. The goal of education should be to spark the fire so we learn more. Really glad to hear the intro made the rest more intriguing :).


Kalid - also your explanation of complex numbers is absolutely amazing. Sure I could manipulate them by memorizing rules, but I never really knew what they meant or why they were important.

I also love the Numberphile explanation of quaternions (building up from real to complex numbers, etc.) - https://www.youtube.com/watch?v=3BR8tK-LuB0

-Sujan


Thanks Sujan! I've been poking around with quaternions, appreciate the pointer :).


Yes, this is really the best course on calculus I've ever run across. Thank you so much for this amazing guide!


Thank you!


MIT also has supplementary lectures like "Big Picture of Calculus" with Gilbert Strang.

http://ocw.mit.edu/resources/res-18-005-highlights-of-calcul...

He also put his textbook online for free: http://ocw.mit.edu/resources/res-18-001-calculus-online-text...


I would recommend the book "What is Mathematics" to you. It covers a wide range of math topics from first principles. The beginning of each chapter is light reading, but gets more rigorous as the chapter progresses. It talks about mathematical notation a good amount, which is surprisingly helpful because many math ideas are intuitive but hidden behind extremely inaccessible notation.

The Kindle version has bad formatting, but the Google Books version works well on all devices supported by Google Books.


Two other resources I've used are:

this OSU / Coursera Calculus course:

https://mooculus.osu.edu/

and these Youtube videos:

https://www.youtube.com/user/mathbff

https://www.youtube.com/watch?v=UcWsDwg1XwM


Yes!!! MathBFF is awesome!


Hackaday also had a great series that nicely answers the question "what is calculus?". There are some really elegant explanations in there. Start here: http://hackaday.com/2015/12/15/calculus-is-not-as-hard-as-yo...


An interesting place to apply calculus is computer graphics. Unlike a lot of math, calculus is extremely practical here, and sometimes it's more fun to make a game than to beat your head on practice problems.


If you want to try a hardcopy book, a mathematician friend of mine suggested "The Cartoon Guide To Calculus" which is available very cheaply on Amazon. Her recommendation has helped me immensely to learn calculus.


Same! I took one college-level math class and never looked back. At this point I think I'd get a marginal boost to my programming career; mostly I want to learn because it looks fun and I've never studied it.


You don't know what you're missing without a background in calculus. Calculus is the mathematics underlying human-scale physics, and therefore all of engineering, art, and design. It is, alongside statistics, the fundamental mathematical theory underlying most of economics and biology. I tell kids going into calculus to pay extra attention: it will change the way they view the entire world. And it will. It is absolutely no accident that the Enlightenment, the Industrial Revolution, and the modern era began with the discovery of calculus.

If you don't have calculus under your belt, I'd definitely suggest starting there.


I've often thought that you could give people a huge leg up on calculus if you could first explain to them the discrete calculus, possibly with a programming language like Haskell. It would look like this:

We're going to start thinking of infinite sequences. A sequence starts with a value, and then there is another value, and so on, and so on. We might number the values in these sequences and then write down rules for them; for example we might write $x_n = (n + 1)^2$ to denote the sequence of square numbers starting with 1, `map (^2) [1..]`; if you're not familiar with this notation then we'll write out the first five explicitly:

    squares = 1 : 4 : 9 : 16 : 25 : map (^2) [6..]
I'm going to teach you one list operator: a function which subtracts from each number in a sequence the number that comes before it in that sequence. For the first element it just subtracts 0, so that element passes through unchanged. This is written in Haskell as:

    delta seq = zipWith (-) seq (0:seq)
When we use the above we get:

    Prelude> take 30 (delta squares)
    [1,3,5,7,9,11,13,15,17,19,21,23,25,27,29,31,33,35,37,39,41,43,45,47,49,51,53,55,57,59]
(Here the `take 30` just stops the computer from trying to give us all infinity of the results!)

Interestingly, the differences between square numbers are the odd numbers. This is not too hard to understand once you know it -- $(n + 1)^2 = n^2 + 2n + 1$, so of course after you subtract $n^2$ you get $2n + 1$.

Similarly when we start looking at the various counting sequences like [1..] (which again if you don't know Haskell is the same as 1:2:3:4:5:[6..], the infinite sequence of successive integers starting with 1) and [2..] we find:

    Prelude> take 30 (delta [1..])
    [1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1]
    Prelude> take 30 (delta [2..])
    [2,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1]
    Prelude> take 30 (delta [3..])
    [3,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1]
We notice that they're all the same except for that very first element, which basically just tells us the first value. But actually, `delta` has a special property, which is that it is invertible: just like you can invert `\x -> x - 3` with `\x -> x + 3`, you can invert this `zipWith (-)` beast with `scanl1 (+)`:

    Prelude> let sigma seq = scanl1 (+) seq
    Prelude> take 30 (sigma (delta [3..]))
    [3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32]
    Prelude> take 20 (sigma (delta squares))
    [1,4,9,16,25,36,49,64,81,100,121,144,169,196,225,256,289,324,361,400]
What is this magical beast `sigma`? What is `scanl1`? It's a function which processes a sequence by passing along an "accumulating value", but unlike `foldr` and `foldl`, where you only get that accumulating value at the end of the sequence, `scanl` gives it to you at each step of the journey, including the first. So, as with `foldl1`, we start from the more general case, `scanl`:

    scanl fn val list = val : case list of
        [] -> []
        x : xs -> scanl fn (fn val x) xs
So we always emit a sequence whose next element is `val` and whose follow-up sequence is either empty (if the source sequence is empty) or consists of scanl'ing the rest of the sequence with the same function, but with val updated to `fn val x`. Each new element gets combined into the existing value with `fn`, and the results are steadily emitted as we go.

Then the "1" in `scanl1` means, just like the "1" in `foldl1`, "read the first value from the first element of the list":

    scanl1 fn list = case list of
        [] -> []
        x : xs -> scanl fn x xs
So when we defined `sigma` as `scanl1 (+)` we just specified this `fn` to be one which adds the two values together. The above recomputation of the squares from the odd numbers actually just works like this:

    1                                =  1
    1 + 3                  =  1 +  3 =  4
    1 + 3 + 5              =  4 +  5 =  9
    1 + 3 + 5 + 7          =  9 +  7 = 16
    1 + 3 + 5 + 7 + 9      = 16 +  9 = 25
    1 + 3 + 5 + 7 + 9 + 11 = 25 + 11 = 36
Notice as the second column here indicates, we don't need to recompute all of these pluses each time: we can just hold onto the last number we've computed, the previous item from the last column, and add that to the new item to get the new value.

I'm asking you to notice the last point because if you study it carefully enough -- that we generate each new value by adding the next element of the sequence to the last value we emitted -- you'll start to ask, "wait, is `delta . sigma` also an identity transformation, just like we saw `sigma . delta` is?" And the answer is a resounding yes. This is actually a very common property for inverses: usually once you've proven that `f . g = id`, there's also an argument that `g . f = id`.

So, we've learned that just like `(+)` undoes `(-)` and vice versa, summing up a bunch of terms from some start point to a current place undoes taking differences between nearest-neighbors, and vice versa.

I haven't actually proven either side of this, but it's not too difficult: to see that delta undoes sigma, notice that for n > 0 we have $$

    \sum_{k=0}^{n} x_k  -  \sum_{k=0}^{n-1} x_k  = x_n 
$$ (and the x_0 case is just x_0 - 0 = x_0, so that works too). To see that sigma undoes delta is a little more complicated, but just write out the n'th term to see: $$

    (x_0 - 0) + (x_1 - x_0) + (x_2 - x_1) + \dots + (x_{n-1} - x_{n-2}) + (x_n - x_{n-1}).
$$ Notice that except for the first and the last term, each term contains two parts: a +x_k which will cancel with the next term and a -x_{k-1} which will cancel with the previous term. This is called a "telescoping" series because it's really long but it collapses like a telescope due to perfect cancellations between adjacent terms. The only thing that's left at the end is $(-0) + x_n,$ which of course is just $x_n.$ So `sigma` has indeed undone `delta`.

To follow up, first I would probably define:

    instance Num n => Num [n] where
        (+) = zipWith (+); (*) = zipWith (*); (-) = zipWith (-)
        abs = map abs; negate = map negate; signum = map signum
        fromInteger = repeat . fromInteger

    k !* seq = map (k *) seq
so that we have sequence algebra and we can multiply sequences by constants; then I would introduce people to basic discrete-calculus results like `delta (f + g) = delta f + delta g` and `delta (k !* f) = k !* delta f` and `delta (f * g) = f * delta g + g * delta f - delta f * delta g` (the minus sign on the last term because this `delta` looks backward to the previous element rather than forward).
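A quick spot-check of that last rule in GHCi (my addition, not in the original write-up; it assumes `delta` and the `Num [n]` instance above are in scope):

    Prelude> let f = [1..]; g = map (^2) [1..]
    Prelude> take 5 (delta (f * g))    -- f * g is the cubes
    [1,7,19,37,61]
    Prelude> take 5 (f * delta g + g * delta f - delta f * delta g)
    [1,7,19,37,61]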

Finally comes the big reveal. Derivatives, to difference a continuous function, treat that function as if it were a straight line when you zoom in close enough; to make this work we have to divide `delta f` by some $dx$. Similarly integrals, to be the proper inverses of derivatives, contain a `(dx *)` operation. So if you define a grid of size dx over the x's that you care about, u < x < ∞, then each function corresponds to a sequence $f_k = f(x_k) = f(u + k \cdot dx)$. Then `map (/ dx) . delta` generates a version of `delta` whose results converge, but not (in general) on the trivial answer 0; rather it converges on the "instantaneous slope" at x. Meanwhile `map (dx *) . sigma` perfectly undoes this, and calculates the area under the curve from u to x. From this, observe that the product rule gets easier with these limits (the `delta f * delta g` cross term, once divided by dx, shrinks to nothing), and then introduce the formal definitions (limits; the largest element of a partition going to zero for integrals; a simple "h" parameter going to zero for derivatives).
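To make that last step concrete, here's a minimal sketch (mine, not part of the discrete-calculus write-up above; it reuses `delta` and `sigma`):

    -- Sample f on a grid x_k = u + k*dx:
    sample :: Double -> Double -> (Double -> Double) -> [Double]
    sample dx u f = map (\k -> f (u + k * dx)) [0..]

    -- map (/ dx) . delta, dropping the first element (where delta subtracted 0):
    deriv :: Double -> Double -> (Double -> Double) -> [Double]
    deriv dx u f = map (/ dx) (tail (delta (sample dx u f)))

    -- map (dx *) . sigma: a running Riemann sum, i.e. the area from u out to x_k:
    integ :: Double -> Double -> (Double -> Double) -> [Double]
    integ dx u f = map (dx *) (sigma (sample dx u f))

    -- ghci> take 3 (deriv 1e-6 0 sin)   -- each element ≈ cos 0 = 1.0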


The more advanced mathematics I learn, the more I realise that in many cases I do not have a fundamental understanding of many topics, which leads me back to the beginning. More often than not, Better Explained is at the end of that path. If only this were how mathematics was taught!


I never got calculus.

I mean, I got the ideas behind it and could talk with my professors about it, but I couldn't solve many of those equations.

It was a strange time at university...

I even helped friends to get these exams done, but I myself always failed. Luckily we only had one math lecture about calculus, the 4 others were about different stuff.

And it was always the fundamentals that got me. The whole rearranging-equations stuff that all of school math is about totally eluded me. And this is what the calculus exams were about. At least the professors saw that I got the other math exams right, so they knew I wasn't an imbecile, haha


Calculus didn't really 'click' for me until I had to apply it in my college physics classes.

That's when it all came together.

Physics felt like the word problems you'd see in elementary school math. I.e., taking the concept from a theoretical 3+4=? to "Jim has three apples and Mary gives him four more, how many does he have?", and bringing that to calculus.


Yes. Many mathematical concepts didn't click for me in the math lectures.

The "aha"-moments I got in "computer graphics", "audio engineering", and "artificial intelligence"


I've a similar story! Working through the Nature of Code[1] was exactly the point in my life where differential equations and vector algebra suddenly made sense to me. Once I became the 'creator' I could see how this math described the real world.

[1]: http://natureofcode.com/


Do you remember what aspects of Calculus didn't click for you at first, that is when it was presented to you initially as pure mathematics? For example, was it the "idea" of a limit, or just not knowing at first why anyone (even mathematicians) would want to know the rate of change of a function?


More of the latter. And how to really think of derivatives beyond just the slope of a function. Or seeing an integral as a continuous sum of infinitesimals instead of just the area under a curve (which is only one way of visualising). And what does that 'dx' really mean inside the integrand. And where integration-by-parts comes from.

And Taylor expansions too. My high school calc teacher just put the equation for the Taylor series on the board one day. No explanation of where it came from, or how to think about it. When I asked if he could 'prove' it to give some context and make it seem less like black magic, the other kids in the class said "I'm sure the formula's right if it's in a textbook". But I didn't care about it being right, I cared about seeing where it came from (and it's not that hard to derive, actually). The Taylor expansion is like the screwdriver of physics; it's used everywhere.

Other than calculus, matrices also didn't make much sense to me until physics. Eg, why is matrix multiplication so complicated, and what do these things actually represent? Especially opaque was the concept of eigenvectors and eigenvalues, until we used them prolifically in quantum mechanics.


Thanks for the response, I asked it out of genuine interest in how we discover new ideas, sometimes ignore or are unable to see their importance, adopt them and then share them. Even forget those ideas and perhaps have to re-discover them later. I personally empathize with the stuff you listed. In particular, I had trouble accepting at first why matrix multiplication is defined the way that it is.

As someone who feels "closer" at this point to applied math than pure math - largely because it is literally more tangible - I admit a fascination with pure math, as if it were the human pursuit of universal beauty in its ineffable, permanent, and probably unattainable form. That same pursuit of reason and order has brought renown and fame to some, but mania and loss to many. It's both the pride of humanity and terrifying, in equal measure. Like staring at a Van Gogh, and finally realizing how marvelous, impactful, and insightful a statement the artist may have intended, from such humble circumstances and with so few tools.


I am never comfortable with my understanding of a topic until I can teach it (or at least pass it on). There's just always an annoying, niggling feeling that if I were to be faced with a student who kept asking "why", I'd be a bumbling mess. I was the same as you for linear algebra. Everyone was always so happy to learn the rules, answer the questions like it's a script, and pass the exam. This frustrated me immensely. Sat down last year and started from the beginning, and I'm glad I did. Sometimes the key is to wipe the slate of existing knowledge clean and start again!


For me, the "why" came from the word (or story) problems -- the exact problems that most people really hated. There is one problem from calculus that really stuck with me, and made me really understand limits:

"You are on an island, which is 10 miles from the shoreline. You can row a boat 4 miles an hour, and you can run 7 miles an hour. You have a friend in need that lives on the shore, 6 miles down from the part of the (straight) shoreline that is closest to the island. Where on the shore do you need to land the boat to minimize your travel time?"

So obviously, aiming the boat directly to your destination minimizes your distance, but you are only going to be traveling at 4 mph. And if you aim straight to the shore you minimize your time in the slow boat, but your distance is longer. The visual solution is to plot the total time based on various landing points on the shore, then find which one is minimal. The exact solution is to make a formula that represents both legs of the trip with the landing point being your variable, take a derivative of the formula and solve for 0 (if I remember correctly).
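To make that concrete (my sketch of the setup described above, worth double-checking): let x be the landing point's distance from the spot on shore closest to the island. The total time is T(x) = sqrt(10^2 + x^2)/4 + (6 - x)/7, and setting T'(x) = x / (4 sqrt(100 + x^2)) - 1/7 = 0 gives 49 x^2 = 16 (100 + x^2), i.e. 33 x^2 = 1600, so x ≈ 6.96. Fun wrinkle, if I've set it up right: that critical point lies past the friend (x > 6), so on the allowed stretch of shore the minimum is at the endpoint - you'd just row straight to the friend.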


I definitely had similar issues with learning math when I was in school. You mention that you sat down last year and started from the beginning... would you mind sharing a little more about where you started, that is what your beginning was, and what sorts of resources you used? I'm thinking of doing something similar and am looking for avenues to take.


Well funnily enough I started back at what a matrix really is. What does it mean to multiply them, add them, invert them and so on. The first good resource I used was Better Explained! I also took out some Linear Algebra text books from the library. First was Linear Algebra for Dummies (don't knock it!) and then an undergraduate text book. I've still a ways to go but it was a good start.


I agree, and would say that teaching others something forces you to look for weak points in your own understanding and firm them up. Teaching, for me, makes me think about the "why" more, which is something I always struggle with. Going through the motions is the easy part.


hehe...

Funny thing is, if people who learn about software engineering ask me "why" I often know the answers. But I wouldn't know how to start teaching someone this topic...

Linear algebra was one of those parts of math that I understood rather fast.


I love attempts to shake up maths education, and I reckon the first passage is a pretty neat explanation, but it seems to slide quickly downhill from there.

The first three chapters are a bit airy-fairy - I get that that's its schtick, but they're also pretty light on content.

The fourth reads like it's been taken straight out of a maths textbook by a "cool" teacher - engaging but not fundamentally simpler than what one learns in class.

The fifth teaches how to use Wolfram Alpha to perform calculus. I could teach a dog to use Wolfram Alpha.

It all seems to suffer a bit from the "Dummies guide" syndrome - it's easy to simplify the first few lessons on a subject, but you pretty soon hit a wall when you reach the parts that actually require a bit of mental legwork.

Calculus is not a simple subject. By teaching it better, you can probably improve the situation by, like, 20%, but you can't magically make a complicated subject trivial. It just doesn't seem like the way forward for education. We need to make the difficulty more rewarding, not pretend that it doesn't exist.


>It all seems to suffer a bit from the "Dummies guide" syndrome

I think your criticism is misguided. His tutorial is obviously meant to make the introduction to the topic more accessible. He does that by using pictures and diagrams to build intuitions. He can do that because he extensively uses drawing tools to help the learner "visualize" what Calculus is about.

Compare that with a typical calculus teacher with a whiteboard. Because the teacher's drawing skills are limited, he/she dives right into the delta-epsilon limit definition: lim as h -> 0 of (f(x+h) - f(x))/h

Yeah, there's some hand drawing of tangent lines to converge on an "instantaneous change", but that's not enough for a lot of students to "get" Calculus. The symbolic notation is demonstrated more often than pictures. It therefore devolves into mindless "plug & chug" for the rest of the semester.

As for the Wolfram Alpha comment, the idea is about "immersion". There are other tools besides pencil-&-paper that also "do" Calculus and they can be used to verify self-study Q&A.

Likewise, if I teach Differential Equations to programmers, it's helpful if I "immerse" them into other tools to numerically solve them such as MS Excel, MATLAB, Wolfram Alpha, C/C++ code, etc.

His BetterExplained doesn't replace a rigorous Calculus textbook but it can help with the "aha!" moment to understand what the dry Calc textbook is trying to teach.


I was taught calculus during my last two years of high school from a starting point of the conic sections (parabolas, ellipses, etc.). Basically what you said: drawing of tangent lines and so on.

Limits and the fundamental theorem came after this.

We used old school TI graphics calculators in the classroom. This was 15+ years ago, so students today probably have better tools available (in Australia some schools give laptops to students now, so at least something like Excel is available); we certainly had better than pencil and paper in the late '90s.

My first year university calculus course covered 2 years of high school math in about 5 weeks and moved on to things like Taylor series, Laplace transforms, etc. There were people in the lectures who hadn't studied calculus at high school, but they were the minority. I'd imagine if this was their first exposure it would have been a struggle.


>His tutorial is obviously meant to make the introduction to the topic more accessible.

And I'm saying he doesn't much succeed at that. After the first few chapters, the content is either trivial or very similar to what you'd find in a textbook.

>As for the Wolfram Alpha comment, the idea is about "immersion".

You could be right, but it seems like a generous appraisal. It reads to me more like an attempt to brush over the fact that the student doesn't actually know any calculus yet.

If your intent is to teach that calculus can be performed intuitively, teach that - don't teach how to phrase your questions to get a computer to solve them.


> It reads to me more like an attempt to brush over the fact that the student doesn't actually know any calculus yet. ... - don't teach how you can phrase your question to get a computer to solve them.

No, he's not showing WolframAlpha as a mental shortcut to cover up deficiencies of not knowing calculus.

The point is to utilize it as a diagnostic tool to check the student's understanding of the previous Ch 1 - 4. If the student understands the previous chapters' vocabulary and concepts, he "double-checks" that understanding by seeing if he can "describe" problems to WolframAlpha. Again, the idea is to create a feedback loop for the self-directed learner.

Instead of using WolframAlpha, one could be shown how to use the TI-83 graphing calculator for the same purpose. However, since the student is likely reading Betterexplained on the web, he can just conveniently click over to wolframalpha.com and "check his calculus knowledge" there. As a bonus, WolframAlpha is easier to use, draws prettier graphs, and doesn't cost $100.


>It all seems to suffer a bit from the "Dummies guide" syndrome - it's easy to simplify the first few lessons on a subject, but you pretty soon hit a wall when you reach the parts that actually require a bit of mental legwork.

And yet Dummies Guides are often the first book I'll look for as an introduction to a subject. I know they'll be light reading, easily consumable, and convey information that is simple and considered "mainstream". The wall you speak of likely means you've leveled up your understanding, and now have a better idea of where to go next.


What I'd love to see in math education is not dumbing down, although finding a way to make people enthusiastic at first is important. I like math because I liked math. If people don't take pleasure in it, then education fails. But here's my point: I want teachers to state clearly what there is to understand. "Working hard" doesn't cut it for me. Give me a few really hard exercises that will require my brain to change fundamentally, then leave me on them. Affirm that patience is key, that one WILL understand after looking at this elusive idea. Because that's why odd concepts are odd: they don't fit in our brain patterns, and thus it takes time to reconnect the dots.


Learning new material is hard at the beginning. You don't have any context and the terms are foreign. You might not even know Wolfram Alpha exists if you're uninitiated. Despite its ease of use, some people appreciate a guided approach to learning.

While the subject can't be trivialized, if someone gains learning momentum (whether by interest or confidence) then this kind of material did a great deal to help them.


Calculus is an elegant subject that is not nearly as difficult to pick up as its reputation leads people to believe. But there are two main problems with learning calculus today.

One is that mathematics in general is taught very poorly (well, all subjects really, but mathematics bears the brunt even more so). There's a great concentration on "teaching to the test", rote memorization, "plug and chug", churn through as much "material" as possible and then forget it over summer, that sort of thing. Cargo cult education, basically. This is very damaging to the teaching of calculus because it is very much a subject that is dependent on some very fundamental core concepts. If you don't "get" those core concepts, then calculus will be forever opaque to you. Worse, this pattern of "education" applies all the way down the line to every math subject taught in school, including the pre-requisites for calculus: advanced algebra and trigonometry.

And calculus is very dependent on actually knowing that pre-requisite math. Not being dependent on plug-and-chug, guess-and-check, "eliminate 2 wrong answers from the multiple choice question and then guess", or "cram for the test and then forget it". All of those techniques work well at gaming the educational system, but they leave you weak in the actual skills of applying trigonometry and advanced algebra to problems, and using them as tools while learning other, more advanced, mathematics. So you have a typical calculus course trying to lay down enormous concrete slabs of new concepts and "material" but it just sinks into the bog rather than building up a structure sitting on a solid foundation.

Additionally, calculus is often used as a filter course. Calculus has a potentially infinite amount of "trivia" associated with it. All manner of rules, substitutions, identities, definitions, theorems, and so forth. As such, calculus effectively has a "difficulty dial" that any teacher can set arbitrarily depending on how much memorization and busy work they want their students to slog through. And many teachers decide to set this at just high enough of a level to make calculus a real pain in the ass for most students. There's always a million justifications for doing this, but most of them are pretty weak. There's a justification for acquiring familiarity with all of these things, and proving you can walk through the steps to do harder problems, but in practice it's mostly useless. People who actually use calculus day to day generally just keep a cheat sheet around of integration and differentiation rules and substitutions and whatnot, or they employ a computer to do the heavy lifting. Once you understand the concepts there's not much value in slogging through many mindless examples just for bragging rights.

All of this leads to people getting the impression that calculus requires superhuman skills to learn (no, it just requires actually being caught up with the pre-requisites beforehand) and also that the application of calculus tends to be boring and monotonous. If education did a better job of ensuring the pre-calculus fundamentals were thoroughly in place, and concentrated more on the concepts and applications of calculus (like physics, engineering, geometry, control systems, computer graphics, and so much more), then I think not only would a lot more students succeed at learning calculus, they would enjoy it too.


Being caught up with the prerequisites is a huge issue. If someone is weak in arithmetic, they'll struggle with algebra. If weak in algebra, they'll struggle in calculus.

If they learned by "guess and try," they'll simply hit a wall when problems cease to have a number as an answer.


Also, if it takes you, say, 2 minutes to struggle through a simple multiple choice algebra problem that you're able to overcome through lots of brute force methods but you still get the right answer, you can pass algebra tests easily, but you're going to struggle when you hit calculus. You need to be able to solve some of those problems mostly in your head in seconds, and the methods need to be intuitive and familiar to you. Moreover, the rest of your work depends on having the right answer to that problem. If you can get the right answer 90% of the time with a lot of effort that might be good enough to pass a class, but it's not actually good enough to use those skills practically. In an easy calculus problem you might have to work through dozens of little simple algebra problems, and if it takes minutes to do each one, and you get the wrong answer 10% or more of the time, then calculus problems are going to take hours and you're going to get the wrong answer most of the time.


Trigonometry and advanced algebra are really a problem for me when studying calculus. Any tips to improve that?


I wish I had a better answer. Find a good book or set of resources and go through it again. A good method might be to go through chapter by chapter and skim through the subjects and the problem sets, if you see something that isn't "slam dunk" easy then consider reviewing that material and just reading through that chapter and doing the problem sets until it makes sense. Similarly, if you're working through something in calculus and you hit a speed bump, try to figure out exactly what you're struggling with and then go back to your algebra/trig materials and focus in on that until you've learned it.

Khan Academy has a ton of videos on algebra and trig on youtube, so that's also another good resource. If you have the money, consider hiring a tutor for a few short sessions, they can be remarkably affordable. Also, if you're in college there are study groups and office hours, so consider taking advantage of those.

The most important thing in math is to make sure you're actually learning the material rather than just skimming over it and passing the tests.

You can also try various games, though sadly I don't have any recommendations.


To quote this site: "Review the intuitive definition. Rephrase technical definitions in terms that make sense to you."

I openly admit that the technical definitions are often how I had intuitively imagined the material back when it was taught in school; I just didn't encounter those definitions until I studied mathematics. I really wish that calculus (and the rest of mathematics) were taught in a much more abstract way in school - it really makes understanding the material much easier, since the "intuitive" definitions suffer a lot from being leaky abstractions (and mathematics is all about banging on the boundaries of the definitions until they move or fall down).


I'm the opposite (which, I admit, is backwards for a lot of engineers.) I need to have a high-level view to provide a context for the details. It doesn't help me to spend a week on defining and proving integrals without first understanding that the goal is to find the area of a rectangle or circle, or to track planetary motion. Once I see the problem, I can think about how we arrived at the solution, rather than present it as a solution in search of a problem. Newton wanted to describe planetary motion, so he invented calculus (Leibniz too, etc.) to solve it, not the other way around.


> It doesn't help me to spend a week on defining and proving integrals without first understanding that the goal is to find the area of a rectangle or circle

I will only discuss this point (but for your other points similar arguments can be made). So you want to measure some n-dimensional volume? This is a highly non-trivial problem, since it is quite possible to find subsets of R^n where you can't assign volumes in a sensible way:

> https://en.wikipedia.org/wiki/Banach%E2%80%93Tarski_paradox

If you don't care about this subtlety you end up in such a paradox.

So you think hard about how you can define "measurable" in a rigorous sense and come up with the definition of a sigma algebra, and think about how you can even define measures (say, of volume) on a sigma algebra. Because these can be ugly to describe, you will immediately think about how one can describe the sigma algebra and the measure in an easier way. So you come up with generators and note that the measure is defined uniquely by its value on the generator.
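(For reference, the standard definitions being alluded to, added here for readers who haven't seen them: a sigma algebra on a set X is a collection Σ of subsets that contains X itself and is closed under complements and countable unions; a measure is a map μ : Σ → [0, ∞] with μ(∅) = 0 that is countably additive over disjoint unions, μ(∪ A_k) = Σ μ(A_k).)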

Doing this process on the real line using the Borel sigma algebra/measure is not that complicated (so you solved the problem for the real case). From here it is only a small step to the Lebesgue-Stieltjes integral in R^1 that is really worth doing.

But now you remember that you wanted to compute higher-dimensional volumes. So you ask how can you recycle what you've already done? The answer is obvious: You define product sigma algebras and product measures.

As you have seen (Banach-Tarski paradox), all this is necessary just to even define volumes. I really know no easier way that is both useful in practice (at least able to obtain the volume of spheres, i.e. not just simple objects such as simplices) and avoids the problem that was laid out by Banach and Tarski.


I don't like this attitude at all. Euclid, Archimedes, Newton, Leibniz, Riemann, and many others did a fine job correctly calculating areas, volumes, and so on without need of measure theory.

Why is it so important to avoid the problem laid out by Banach-Tarski on the first go through the concept? We, as a species, did not even realize such a problem existed until about 1920, and yet we still managed to build stuff based on what we knew.


> I don't like this attitude at all. Euclid, Archimedes, Newton, Leibniz, Riemann, and many others did a fine job correctly calculating areas, volumes, and so on without need of measure theory.

The history of mathematics is a history of ever-increasing standards for proofs. "Proofs" that, say, Euler wrote down, which were perfectly valid by the standards of their time, would probably not be accepted by today's standards. I personally also think it's plausible that today's proof standard in "typical" math papers will not be acceptable in 100 years, when they will perhaps additionally have to be computer-checkable (and computer-checked).


The issue of proof is very different from the issue of learning. I thought it was the latter being discussed.


Mathematics is about proofs. Learning mathematics thus means learning about proofs.


I do not agree with your first statement. Mathematics is about developing and studying tools to solve problems. Proofs are only one aspect of that.


> Mathematics is about developing and studying tools to solve problems.

There clearly exist methods for solving problems that have nothing to do with mathematics. This should falsify your thesis.


I think maybe you made a mistake, because the exchange reads like this:

A: Knives are for eating.

B: Eating is only a part of it. Knives are for handling food.

A: You can handle food with a fork. Your claim is false.

Specifically, "math is for solving problems" does not imply "problems are only solved by math".


> I really wish that calculus (and the rest of mathematics) would be taught in a much more abstract way in school

That's valid, but you're probably in a very small minority. I took calculus in high school and then real analysis in college, and I still usually think about calculus geometrically whenever possible.


They should be taught in their historical context. Calculus was a very practical field, with axioms being discovered, and theories written based on engineering need.


A strange opinion: Do you also think that programming languages should be taught in their historical context? If not, why do you think mathematics should be taught this way?

My opinion is clear: One should teach the material in the best modern understanding that is possible. If this is more complicated but worth it, that is perfectly fine and even necessary.


Completely agree with this.

I used to marvel/sob quietly at how an entire PhD/lifetime's work could be summarized into a single sentence in a given textbook, but humans' single greatest advantage is that we can absorb other people's knowledge and wisdom orders of magnitude faster than it took them to acquire it.

A given human life is finite in length, and that length is relatively narrow in distribution. We are able to progress as a race because we don't have to waste time relearning how to start a fire or engineer a wheel from first principles, much less derive calculus in the absence of other knowledge. If only we could find ways to teach current maths at a younger age, we would raise the next generation of mathematicians faster!


In particular your last sentence could come from me. :-)


> Do you also think that programming languages should be taught in the historical context?

Not the OP, but: yes, absolutely.


So instead of teaching C in a semester one should teach you

- Fortran

- ALGOL 68

- B

- BCPL

- K&R C (with its historic syntax containing elements that are now disallowed)

- C89/C90

- C95

- C99

- C11

and what reasons there were to introduce the changes/extensions?

My opinion is clear: Teach the most modern version with its best practices and leave out all the baggage that is only of historic importance.


Some of those can and should be skipped.

You're also missing a native assembly language.

Programming should be taught from the point of view of writing your own Forth interpreter, up through BIOSes based off of it, then into higher-level languages.

Logical composition, however, should be taught inversely. Take a high level language (SQL comes to mind, yes that's high level in some respects!) and teach how to isolate and logically differentiate between elements.

However I'm also for the more simple base test first.

Show someone a few examples of how to play Mastermind (game), see if it clicks.

Show someone basic register algebra, as one labeled context, and then show them VHDL style flow definition statements as another context. Keep the contexts labeled.

Anyone that can apply those rules to 'play the games' is probably capable of being a programmer; and anyone that doesn't find it fun probably shouldn't /be/ a programmer.


Your approach might make sense: But this is clearly not teaching programming from the historical context.


This might be good for a "History of Programming Languages" course. To teach programming languages in an historical context is a different task. While teaching the most modern version with its best practices, one could mention that feature X was first used in language L, for example.


> To teach programming languages in an historical context, is a different task.

To present the topic in a historical context the students first have to know what was context/standard at that time, under what circumstances the solutions of their time were developed, whether the restrictions still hold etc. In other words: You would have to include half of a "History of Programming Languages" course, since this is surely not common knowledge.


Has anyone completed the Calculus courses on Khan Academy (Integral, Differential and Multivariable)? If so, would you recommend Khan Academy for Calculus instead of the resources mentioned on Better Explained? I'm working through the Linear Algebra courses now on KA and they are very good.


I haven't seen the Khan Academy Calculus videos that require signup, but my answer assumes they are similar to (or exactly the same as) the YouTube videos presented by Salman Khan.

If you're already making great progress from Khan vids for Linear Algebra, you'll probably have the same success with Khan's Calculus.

S Khan's approach is friendly but it's also very similar to a traditional teacher using a whiteboard in a classroom. For example, S Khan dives right into the formal concepts of "functions" and "limits" and all the associated notation. This mirrors how a professor would present it. In fact, Salman often remarks that he has his "college textbook" open as a resource to order the sequence of topics and demonstrate examples.

Kalid Azad's approach with BetterExplained is to present images, pictures, and analogies that aren't found in typical Calculus textbooks. This helps many learners get past the "mystery" of what the topic is about. You'll notice that the first chapters of Azad's approach do not discuss "limits"[1]. If you're already doing fine with Linear Algebra, you may not need all the pretty pictures as a gentle introduction.

[1] The idea of "limits" is just one example of a philosophical difference in how to teach calculus. Isaac Newton didn't use the rigorous concept of "limits" when he invented calculus. His peers derided his approach as "ghosts of departed quantities". It was later mathematicians, putting calculus on more rigorous foundations (Real Analysis), who retrofitted "limits" onto it. It's interesting that we now think the best pedagogy for high school students is to force "limits" on them to learn calculus because ... It's More Rigorous. Unfortunately, "limits" are not intuitive.


Thanks for the advice. I actually really liked the "How To Learn Trigonometry Intuitively"[1] article on Better Explained, as my trig experience was limited to memorizing the formulas. It was only when I ventured into making a basic game that I learned the actual practical uses of trigonometry. I wish math in school focused more on why and less on how, especially since all schools have access to computers and knowing where/when to apply mathematical theories and concepts is so important.

[1] http://betterexplained.com/articles/intuitive-trigonometry/


Your history of Calculus is a little... muddled.

Newton's approach was based on the idea that rate of change was a fundamental quantity. This led to fluxions. His notation for Calculus was x with a dot over it. He used his method as a rough notation to figure out what the answers should be, before coming up with pure geometric constructions to use in the Principia Mathematica.

Inspired by the Principia, Leibniz reinvented Calculus with the idea of infinitesimals. Infinitesimals are nonzero numbers that are smaller in magnitude than any positive real number. He published, which started a priority dispute between him and Newton.

Leibniz gave us our dy/dx notation. In the original, d is an operator meaning "look an infinitesimal away". Literally d(y) = y(x+dx)-y(x). (This converges faster if you make it y(x+dx/2)-y(x-dx/2)...) You do the calculation, then drop any infinitesimals and get your answer. Ever wondered why the second derivative is d^2y/dx^2? That's because you do d(d(y))/d(x)d(x).
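That parenthetical about the centered version converging faster is easy to check numerically - a quick sketch (mine, in Haskell since that's already this thread's language of choice):

    forward, central :: (Double -> Double) -> Double -> Double -> Double
    forward f dx x = (f (x + dx) - f x) / dx
    central f dx x = (f (x + dx/2) - f (x - dx/2)) / dx
    -- forward sin 1e-4 1 - cos 1  ≈ -4e-5    (error shrinks like dx)
    -- central sin 1e-4 1 - cos 1  ≈ -2e-10   (error shrinks like dx^2)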

The famous quote of "ghosts of departed quantities" was written by George Berkeley as a criticism of infinitesimals. Far from being a peer of Newton's, he was a bishop who was demonstrating that science had leaps of faith as big as any in religion. His criticisms were accurate.

The fact that math's foundations were crap occasionally inspired effort. But not much. Lagrange's contribution was to view everything as a power series, and do formal derivatives that way. To him we owe our f' notation.

Then Joseph Fourier upended the whole apple cart with Fourier series. He took infinite sums of nice functions like sin and cos, and wound up with a step function. What happens if your step is an infinitesmal away? This was a crisis.

Cauchy addressed this by treating infinitesimals as sequences that go to 0. This approach failed for the chain rule when the derivative is 0, but that's where the idea of Cauchy sequences comes from.

Weierstrass in the 1870s came up with the modern limit approach. Mathematics was (very literally) rewritten.

Since then limits have proven to be a horrible pedagogical tool, but that is how we teach. However there are serious pedagogical suggestions, ranging from skipping over the justification for Calculus, to using infinitesimals (turns out you can make precise sense of them with modern set theory), to writing the tangent line approximation with little-o notation. (The last suggestion is from Knuth! See https://micromath.wordpress.com/2008/04/14/donald-knuth-calc... for details.)
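(In symbols, Knuth's suggestion amounts to defining the derivative via the tangent-line approximation f(x + h) = f(x) + f'(x) h + o(h) as h -> 0: the o(h) error just has to vanish faster than h, with no epsilon-delta machinery up front.)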

Sorry for the history lesson, I love this topic. :-)


>The famous quote of "ghosts of departed quantities" was written by George Berkeley as a criticism of infinitesmals.

It was a criticism of both Newton and Leibniz:

"And what are these Fluxions? The Velocities of evanescent Increments? And what are these same evanescent Increments? They are neither finite Quantities nor Quantities infinitely small, nor yet nothing. May we not call them the ghosts of departed quantities?" -- from https://en.wikipedia.org/wiki/The_Analyst#Content

Your history is more detailed, but it also focuses more on the evolving notations and the intermediate intellectual steps between the Principia Mathematica and today's rigorous Real Analysis. I had the lesser ambition of pointing out that the non-intuitive "limits" are not what Newton used, which is somewhat ironic considering we force them on today's teenagers to "learn calculus".


Is there any forum for discussing this type of thing?

I have some questions about issues like this, but if you ask in a place like MathOverflow you often get unintuitive answers, as if written for the working mathematician, rather than explanations.

For example, I've always thought of dy as intuitively involving some kind of higher-order function over another;

d[y(x), a] = y(x+a) - y(x)

such that;

if I(x) = x so that d[I(x), a] = d[x, a]

and d[x, a] = a

then d[y(x), a]/d[x, a] = d[y(x), d[x, a]]/d[x, a]

so we drop reference to a (or take lim as a -> 0) and get dy(x)/dx = dy/dx

But look, there's an implicit 'a' or 'dx' in that y(x), plus y(x) is often written just 'y' so that you can always switch around which the free variable is (an equation just describes the relationship, e.g. y = 2x can generate either y(x) = 2 * x or x(y) = y/2)

Also, since the imagined HO function d[f, a] actually takes a function for the first arg, there needs to be an implicit identity function for dx, so really df(x)/dh(x) means "rate of change of f, for change in x, relative to the (normalised) rate of change of h, for change in x". Then the identity function becomes the default, such that dx is short for dI(x).
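Rendering that d[f, a] directly in Haskell, just to sanity-check the idea (my sketch):

    d :: (Double -> Double) -> Double -> Double -> Double
    d f a x = f (x + a) - f x

    -- d[y, a] / d[I, a] with the identity I x = x, so d[I, a] x = a:
    dq :: (Double -> Double) -> Double -> Double -> Double
    dq y a x = d y a x / d (\t -> t) a x

    -- ghci> dq (^2) 1e-6 3   -- ≈ 6, the derivative of x^2 at 3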

Maybe this describes the same idea:

https://en.wikipedia.org/wiki/Differential_operator https://en.wikipedia.org/wiki/Derivative#The_derivative_as_a...

But it's a little undecipherable to me...


Not that I know of.

I learned it during grad school because it was more fun than the stuff I was officially learning.

Your intuitive notion is not that different from the original infinitesimal notation.

What Wikipedia is describing is that "derivative of" is an operator. Meaning it takes functions and returns other functions. So D(x^2) = 2x. Which is not exactly the same thing as what you wanted, but it is related.

When you go on in math, you learn that there are a million ways to think about things. OK, dozens at least. As an illustration Thurston listed a bunch for derivative in http://www.math.toronto.edu/mccann/199/thurston.pdf. I strongly suspect that this is a real list he maintained, and he didn't jump from definition 7 to 37 without filling in 29 definitions in the middle.

But when you listen to mathematicians, the odds are that they aren't picking any way of thinking about things that you would have thought of. And maybe they DID try your way, then found it didn't work because X, Y and Z. Or maybe they did try it, but moved on and everyone has forgotten it. Figuring out which of these is true can be legitimately hard, and doesn't seem to interest most mathematicians.

But if you find a good forum, let me know. It might be fun for me to visit. :-)


I recently started reading Amir Alexander's 2014 book "Infinitesimal: How a Dangerous Mathematical Theory Shaped the Modern World". It's a really great read, this is pre-Newton, pre-Leibniz. People were doing calculus more than two hundred years before those guys! They struggled with paradoxes to do with infinitely many zero length points making up a line, and the Jesuits just outright banned that stuff. Partly because it was protestant mathematics, but mostly because they wanted a version of mathematics that was somehow eternally unchanging and crystal clear (like how God is supposed to be.) Boy did that not work out. Anyway, these early mathematicians used imagery like the threads in a cloth to describe how infinitely many lines form a plane surface, or the pages of a book to help understand how infinitely many planes form a solid. So cool! In the beginning it was Italians working this stuff out (including Galileo). But then the Jesuits totally banned it thereby casting a pall over Italian mathematics for hundreds of years!

For a modern treatment of infinitesimals, check out "smooth infinitesimal analysis" [1]. The way to avoid all the contradictions is to weaken the underlying logic: in this case they remove the law of excluded middle, so it's not the case that something is either true or it is false. I think the Jesuits would definitely have trouble with that!

[1] https://xorshammer.com/2008/08/11/smooth-infinitesimal-analy...


> The way to avoid all the contradictions is to weaken the underlying logic

Pet peeve: I would argue that constructive logic is stronger, in the sense that it allows more reliable (read: usable in the real world) results than classical logic allows. But that's a value judgement, and there's no definitive right answer here.

Totally agree re: infinitesimals. Dual numbers[0] are crazy useful in computer science (cf. Google's Ceres Solver[1]) - quick sketch below.

[0] https://en.wikipedia.org/wiki/Dual_number

[1] http://ceres-solver.org/
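A minimal dual-number sketch (mine, not how Ceres implements it): numbers a + b·eps with eps^2 = 0, where the eps coefficient carries the derivative along for free:

    data Dual = Dual Double Double deriving Show

    instance Num Dual where
      Dual a b + Dual c d = Dual (a + c) (b + d)
      Dual a b * Dual c d = Dual (a * c) (a * d + b * c)  -- the eps^2 term vanishes
      negate (Dual a b)   = Dual (negate a) (negate b)
      abs    (Dual a b)   = Dual (abs a) (b * signum a)
      signum (Dual a _)   = Dual (signum a) 0
      fromInteger n       = Dual (fromInteger n) 0

    -- ghci> let f x = x*x*x + 2*x in f (Dual 3 1)
    -- Dual 33.0 29.0    -- i.e. (f 3, f' 3), since f' x = 3x^2 + 2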


I agree with you, there's going to be more shades of grey using intuitionistic logic. I guess I was taking the point of view of the Jesuits!

So actually I understand about dual numbers, but from what I see this is not the same thing as smooth infinitesimal analysis (there's no playing around with the logic). I feel like if we scratch this a bit more we are going to understand what Grothendieck was on about with topos theory [1].

[1] http://www.oliviacaramello.com/Unification/Unification.htm


Removing the law of the excluded middle strengthens the underlying theory. It makes it constructivist...


I am blown away by the "Calculus in One Minute" explanation. And I came in with a high bar, because I already knew about the good material on betterexplained.com.


I love Better Explained. Drawing a parallel between multiplication/integration and division/differentiation is extremely powerful and almost never taught! It suddenly pulls seemingly "complex" topics into the realm of the familiar.


Someone posted an article a while back about working to calculus using functions, I think it was titled something like "If Socrates knew functions"? I'm not sure. Does it ring a bell for anyone?


Looks like "If Archimedes Knew Calculus", about "quantum" calculus. Very interesting, I'm about half-way through already:

http://www.math.harvard.edu/~knill/pedagogy/pechakucha/


To correct myself, the actual title is: "If Archimedes Knew Functions"

Same link:

http://www.math.harvard.edu/~knill/pedagogy/pechakucha/


Thank you so much!


I'm not sure if I like the example with the integration of the circle circumference into the circle area. Yes, it sounds intuitive. But here's another problem with a wrong solution that sounds intuitive: to compute the hypotenuse C of a triangle with other (perpendicular) sides A and B, you can simply replace the hypotenuse by a staircase of infinitesimal steps, and hence you find that C = A+B. We're overlooking something here, but that is not the point. The point is that the proof should be constructed so that we cannot possibly overlook something.

Conclusion: don't rely on intuition with these kinds of problems.


I haven't finished it but I read most of the late James Stewart's "Calculus: Early Transcendentals" and found it very instructive. It would make a good supplement for anyone learning Calculus.


Stuff like Khan Academy, or even MIT OCW are great, but there can be holes wrt more advanced or unusual math. There also seems to be an "established" way of teaching things (what/how), such that alternatives aren't offered.

Things like Wikipedia are quite poor at explaining math, especially when set theory gets pulled in. Set theory almost guarantees no intuition for me...


This is absolutely fantastic. Great method for teaching new concepts to those new to the subject matter. I have an engineering background and have to say that this approach is refreshing to read. It's like a bag of chocolate chip cookies for the brain :)


Looks nice; didn't know that first example; shall give it a read later, thanks!


Children should learn Calculus instead of Algebra 1 & 2.

The BIG problem is that the difference between a GOOD Calculus teacher and a BAD Calculus teacher is so HUGE that it changes the outcomes of their students.


As usual, great stuff from Kalid.


Brilliant write-up!




