
Why Do We Need Limits and Infinitesimals? - bhousel
http://betterexplained.com/articles/why-do-we-need-limits-and-infinitesimals/
======
btilly
Calculus can be rigorously done with limits, continuous functions, little-o
notation, or infinitesimals. The key word is "or": use one approach and don't
jump around. (After you understand multiple approaches you can see how they
connect and then jump around.)

Of those approaches infinitesimals look the simplest but require the most
complex mathematical machinery to really do right. Yes, you can construct a
non-standard model of the real number system with a transfer principle that
lets you move proofs between the standard and non-standard models, which lets
you formulate calculus using infinitesimals. But using a construction built on
the axiom of choice to prove that the derivative of x^2 is 2x is kind of like
using a sledgehammer to push a tack into a cork board.
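For concreteness, here is a minimal sketch of that computation in the
nonstandard style, where ε is a nonzero infinitesimal and st(·) is the
standard-part map (all the heavy machinery, i.e. the construction of the
hyperreals, is hidden inside those two notions):

```latex
\frac{(x+\varepsilon)^2 - x^2}{\varepsilon}
  \;=\; \frac{2x\varepsilon + \varepsilon^2}{\varepsilon}
  \;=\; 2x + \varepsilon,
\qquad
\operatorname{st}(2x + \varepsilon) \;=\; 2x .
```

The algebra is two lines; justifying why st(·) behaves well is where the
sledgehammer lives.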

~~~
kalid
Great point. I think one of the issues with calculus is that it can be studied
at so many levels. Most high school classes give lip service to limits
(somehow they're important), and then get into symbol manipulation (move the
exponent down, subtract 1, etc.).

I think the most important thing in the beginning is to learn to think
conceptually about calculus, about how to break a shape into a simpler model
and solve it (not just manipulate models that are given to you). Later on (in
analysis, which I've never formally taken :) ) the details about the transfer
principle, epsilon-delta proofs, etc. can emerge, but realistically only a
very, very small fraction of students need that. That fraction can be
increased if they are able to use calculus regularly (with infinitesimals,
say) and then wonder about the details of how it all works.

I like to ask: Would it be such a bad thing if everyone understood calculus as
Newton did, at a baseline level? Many people taking calculus leave with no
understanding whatsoever, but the same can't be said for geometry or even
algebra (you can make references to parallel lines or equations and feel that
people will 'get it', but mention a derivative or limit to someone 10 years
after calc class and you'll get a blank stare). I advocate having rough mental
models that are successively refined over time (vs. presenting the most
rigorous version first).

~~~
btilly
History note for you. Newton understood calculus through the idea of fluxions.
It is often claimed that an insistence on teaching calculus with Newton's
fluxions instead of Leibniz's infinitesimals held back research in England for
a long time. (Another random note: Bishop Berkeley's famous comment about
"ghosts of departed quantities" was a criticism of fluxions, not
infinitesimals. In fact he had positive things to say about infinitesimals.)
So yes, I do think it would be bad for students to understand calculus like
Newton did. :-)

That said, to my eyes the primary problem with how we teach calculus is that
we introduce too many big concepts at the same time. The key picture that
people should come away with is the tangent line. But when you say that the
derivative is a function defined at each point as the slope of the tangent
line at that point, most people experience brain freeze.

Instead I would like to see students get to the point where they can calculate
tangent lines. Then stop. Work out lots of tangent lines for lots of
functions. Introduce them to how the tangent line can be used to estimate
error bars. (Typical word problem: if a rock takes 5 ± 0.5 s to fall off a
cliff, how tall is the cliff?) And only after the picture and mechanics of the
tangent line are well established should we move on to defining the
derivative.
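That word problem can be worked out with nothing but the tangent-line slope.
A short sketch, assuming the usual free-fall model h(t) = g·t²/2 with no air
resistance (g and the measured time here are illustrative values, not from the
original problem statement):

```python
# Tangent-line (linear) error estimate for the falling-rock problem:
# the uncertainty in height is approximately slope * uncertainty in time.

g = 9.8  # gravitational acceleration, m/s^2 (assumed)

def height(t):
    """Distance fallen after t seconds: h(t) = g*t^2 / 2."""
    return 0.5 * g * t ** 2

def height_slope(t):
    """Derivative dh/dt = g*t, i.e. the slope of the tangent line at t."""
    return g * t

t, dt = 5.0, 0.5           # measured fall time: 5 +- 0.5 s
h = height(t)              # best estimate of the cliff height
dh = height_slope(t) * dt  # tangent-line estimate of the error bar

print(f"height = {h:.1f} m +- {dh:.1f} m")  # height = 122.5 m +- 24.5 m
```

The point of the exercise is that the student never needs the word
"derivative": the slope of the tangent line at t = 5 s is enough to turn a
±0.5 s timing error into a ±24.5 m height error.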

This approach, if done right, would be rigorous but informal. It should avoid
the problem of causing brain freeze by presenting too many layers of
generality at once. And it would really cement the key image behind half of
calculus.

~~~
kalid
Ah, thank you for the clarification -- yes, Newton loved his fluxions, didn't
he? In future examples I'll s/Newton/Leibniz/ :).

I think the general point is true about the brain freeze and too many big
topics -- it's way too easy to just throw a definition out there, say "this
represents slope which is the rate of change" and move on to the next idea.

------
RiderOfGiraffes
Admirable intent, and it seems pretty good, but I've seen far too many people
follow advice like this and end up producing confusing nonsense while thinking
they're doing the right thing.

By all means take the concepts, but don't try to apply them like that. There
are _so_ many pitfalls that this doesn't help you identify, let alone avoid.

If it helps you grok stuff, great. I think I'm pretty clear on how to do a
quadruple coronary bypass - I'll just figure out the details as I go along,
shall I?

~~~
kalid
Hey, thanks for the comment.

Yep, there is a fundamental tension between intuition and rigor.
Unfortunately, I feel that rigor (or not even rigor, but rote details) is
overemphasized, without giving any context, and more importantly without
giving any inspiration to the student. It's like focusing on the spelling of a
poem, not the content.

I do want to jump into the rigorous definitions in later posts (such as when
limits exist, when they don't, why it's important) but I find intuition ->
rigor is an easier path to navigate than rigor -> intuition [which often
becomes rigor -> memorization]. If someone had just said "Hey, limits and
infinitesimals help us solve the problem of 'How do we make an approximation
which is still accurate enough?'" calculus would have clicked a lot faster for
me :).

~~~
RiderOfGiraffes
I think we're pretty much in agreement. I would rather, though, see a real
example with proper rigor, but introduced and explained starting with
intuition and then developed forwards into the detail.

The problem is, as I'm sure you're aware, just how many traps and pitfalls
there are for the insufficiently paranoid. Working "intuitively" has a
tendency not to instill that paranoia.

And, as you say, working just on the rigor tends to result in rote work with
no understanding.

I think only time and practice are the real solution. The article, though, is
good for someone who already has the "rote" understanding and needs to "grok"
the whole thing.

Similar criticisms can be aimed at my stuff:
<http://news.ycombinator.com/item?id=940169>

~~~
kalid
Yep, I agree with your points about pure intuition leading you down blind
alleys unless you have some structure in place.

For the examples, my approach is definitely tailored for people who need to
grok the subject after getting a rote understanding. It's not so much to
become an expert but to realize what insights would help on the path to
becoming an expert.

I figure there are thousands of other sites for the formal definition and
practice problems :). I'll check out the article now, looks interesting.

~~~
billswift
I studied calculus on my own in my twenties, and I think the textbooks' focus
on rigor led me down a lot of blind alleys before I started trying to come to
an intuitive, often graphical, understanding of the problem before attacking
the rigorous view. (There is a good textbook from 1977 that was reprinted by
Dover in 1998, Morris Kline's "Calculus: An Intuitive and Physical Approach",
that I wish I had available when I was studying.)

