
Calculus with pics and gifs - archibaldJ
http://0a.io/0a-explains-calculus
======
amitkgupta84
I haven't read the entire thing; my barometer for deciding whether to read the
whole thing was the section on limits. The intuition the author tries to
develop is very wrong. The whole section on the limit being the smallest
impossible number to reach is more misleading than it is instructive. It's
dead wrong for probably the most trivial kind of function, the constant
functions. If f is a constant function, its limit (as x approaches any value)
is not only not an impossible value for f to obtain, it's the only possible
value! Ignoring constant functions, it breaks down with even trivial examples,
e.g. the limit of x sin(1/x) as x approaches 0. The limit is 0, but 0 is not
impossible to obtain as x approaches 0; in fact f(x) = 0 for infinitely many x
in any arbitrarily small neighbourhood of 0.

The real idea is much simpler than "what's the biggest impossible number",
it's "what number am I getting closer to."

Also, there wasn't even an example of when a limit doesn't exist, which is
quite a significant omission. The function f(x) = sin(1/x) is a good contrast
to the previous function; in this case, the limit does not exist as x
approaches 0, and it's easy to see here that it's because f(x) doesn't get
closer to any value, it keeps oscillating wildly between -1 and 1.
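If it helps to see this numerically, here is a quick sketch in plain Python (the function names f and g are mine, not from the article) tabulating both functions at points approaching 0:

```python
import math

def f(x):
    # x * sin(1/x): the limit as x -> 0 exists and equals 0
    return x * math.sin(1 / x)

def g(x):
    # sin(1/x): the limit as x -> 0 does not exist
    return math.sin(1 / x)

for n in range(1, 6):
    x = 10.0 ** -n
    print(f"x = {x:g}   x*sin(1/x) = {f(x): .6f}   sin(1/x) = {g(x): .6f}")
```

The first column shrinks toward 0 no matter what the sine factor does, because it's squeezed between -x and x; the second column just keeps bouncing around inside [-1, 1].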

~~~
archibaldJ
Hi, OP here. Thank you so much for pointing that out. That was a big mistake I
made while trying to bring readers to a different way of viewing limits, which
can be useful as an introduction to the (ε, δ)-definition. Yes, a limit
shouldn't be defined this way, which is why I used the expression " _can be
viewed_ " rather than " _is_ " at the start. But I got too carried away and
forgot to clarify that there are cases where we shouldn't view it that way.

So I would like to thank you once again for letting me know about this huge
mistake that I failed to notice. I was busy and didn't see your comment until
now; I have just added a clarification about it in the article.

The reason there's so little about limits in this article is that I intended
to go through the limit section as quickly as possible and get started on
differentiation and integration, which are what most readers are on my page
for. On second thought I decided to at least pen down the (ε, δ)-definition,
and so I added the blockquote with the "biggest/smallest-impossible" analogy,
which unfortunately ended up being the last thing you saw in my article before
you closed your tab. I'm actually planning to write another article, one
specially for the concept of limits, covering topics from continuity,
existence of limits (as you mentioned), and one-sided limits to sequences,
Taylor series, limit points, neighborhoods, and a bit of topology.

~~~
Retric
The other thing you're missing about limits is functions like tangent, where
they approach different values on each side of a point.

The reason limits are introduced in calculus is that you need a continuous
function for the basic assumptions that allow integration/differentiation to
work. Basically, the limits at all the points you care about need to agree or
you can't take the derivative. In other words, f(x) ~= f(x + 1/infinity) ~=
f(x - 1/infinity) for all x.
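A quick numerical sketch of the tangent example (plain Python, my own illustration, not from the article): approach π/2 from each side and watch the one-sided behaviours disagree wildly:

```python
import math

a = math.pi / 2  # tan is undefined at pi/2
for h in (1e-2, 1e-4, 1e-6):
    left = math.tan(a - h)   # approaching from below: blows up toward +infinity
    right = math.tan(a + h)  # approaching from above: blows up toward -infinity
    print(f"h = {h:g}   tan(a-h) = {left:.4e}   tan(a+h) = {right:.4e}")
```

Since the two one-sided behaviours never agree, no single number qualifies as the limit at π/2.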

[http://en.m.wikipedia.org/wiki/Continuous_function](http://en.m.wikipedia.org/wiki/Continuous_function)

[http://en.m.wikipedia.org/wiki/Fundamental_theorem_of_calcul...](http://en.m.wikipedia.org/wiki/Fundamental_theorem_of_calculus)

PS: I still like the overall presentation.

~~~
archibaldJ
Yup. That is continuity and one-sided limits, which I will cover in the next
article. I actually mentioned that in the comment above.

Other things that you would find in a lot of calculus textbooks but that I
didn't cover are complex functions, the applications of calculus (e.g. in
Newtonian physics, in optimization problems), and results like L'Hôpital's
Rule, the Squeeze Theorem, etc. I didn't want to get into too much detail in
this article because my plan was to be concise and get straight to the point.
I want to write it in a way that anyone who is new to calculus can grasp the
concepts and get an idea of what calculus is about in the shortest time
possible. On a side note, most of the functions they will be dealing with are
elementary functions, which are continuous over their domains.

And thanks.

------
scrollaway
Related: Better Explained, a fantastic "intuition-first math" website, has an
excellent Calculus primer. The book is free to view, and there are paid-for
video courses with it.

[http://betterexplained.com/calculus/](http://betterexplained.com/calculus/)

------
analog31
>>> A limit is the number you are "expected" to get from a function (or
algebraic expression) when it takes in a certain input. By "expected" it is
referring to the expectation of the output when x "approaches" a certain
value.

>>> Limit can be viewed as either the biggest or smallest impossible number
for a function to output, when you put in numbers that are slightly smaller or
bigger than what x is approaching.

I realize this is intended to be an elementary discussion of calculus, but
both of these definitions made me cringe. The second seems closer to the
formal definition that I remember learning. And the formal definitions might
not be necessary for making practical use of calculus. But I would call them
explanations or analogies rather than definitions.

I taught college freshman math for one semester. The idea that the right
answer is the right answer because it's what the teacher "expected" is a
strong misconception that my students somehow formed while in high school.

~~~
archibaldJ
Just for the record, the (ε, δ)-definition of limit is later mentioned as _one
way of formally defining limit_ after the analogies.

~~~
analog31
Ah, that's cool. I missed it.

------
cjslep
Maybe a nitpick, but in this image:
[http://0a.io/assets/img/1d_2d_3.png](http://0a.io/assets/img/1d_2d_3.png)

The red triangle should be above the line so it touches the x-axis, as the
"area under the curve" should always be measured towards the x-axis, not the
y-axis. The picture as is _just happens_ to give the right answer for a
straight line going through the origin, but it illustrates the wrong
reasoning.

~~~
archibaldJ
OP here. Thanks so much for pointing that out. No wonder I got this strange
feeling every time I glanced through that part. I've just fixed it.

------
anon4
> _What is a function?_

> _A function can be seen as a machine that takes in value and gives you back
> another value. It is what we use in maths to map numbers (input) to other
> numbers (output)._

That's absolutely wrong (mathematically). A function is a relation between two
sets, where each value from the first set is related to at most one value from
the second set.

It's kind of weird to read discussions on how "functions in programming
languages are weird because they're not functions in the mathematical sense",
but then you see how for most people a function denotes a machine that
performs work in time (all physical concepts), rather than the idealised
mathematical definition. Doubly weird when the author is someone trying to
explain calculus and has presumably sat through a lot of high-level
mathematics classes.

~~~
Chinjut
What's wrong with thinking of such a relation as a "machine that takes in [a]
value and gives you back another value"? This seems to me needless pedantry.

~~~
Someone
That machine can only handle computable functions
([http://en.m.wikipedia.org/wiki/Computable_function](http://en.m.wikipedia.org/wiki/Computable_function)).

Although many reasonable choices are equivalent, you should also be more
precise about what you mean by 'a machine'. If your machine is a regular
expression matcher, for example, it cannot determine whether an arbitrary
string contains matched pairs of parentheses
([http://en.m.wikipedia.org/wiki/Chomsky_hierarchy](http://en.m.wikipedia.org/wiki/Chomsky_hierarchy)).

~~~
pavelrub
"Machine" is an abstract term, not limited to Turing machines. Somebody who is
just learning about functions has no understanding of what "computable" means,
nor why should something be "uncomputable" according to some specific
definition of a machine. We can invent in our minds all kinds of machines that
don't exist in the real world: Oracle machines have existed in recursion
theory since the time of Turing.

If f is a function that tells us whether a given Turing machine halts on a
given input, there is nothing wrong with imagining it as a machine that
receives a string and outputs a True/False answer. That no such machine can
exist in the real world under some physical interpretation of the
Church-Turing thesis is just a further observation; it has nothing to do with
the correctness or usefulness of this formulation.

------
chuckcode
I think Gilbert Strang at MIT does a great intro with his "Big Picture" of
Calculus overview. I really enjoy his starting with how calculus is a mapping
between two related functions without getting into the mechanics of
calculating them until later.
[https://www.youtube.com/watch?v=UcWsDwg1XwM](https://www.youtube.com/watch?v=UcWsDwg1XwM)

------
mrcactu5
I don't think it's important that his narrative be perfect, as long as it's
well-presented. There are thousands of calculus books on the market and nobody
reads them because they all suck.

He might consider taking out the Will Ferrell gif, since it has nothing to do
with calculus.

~~~
archibaldJ
Thanks! I totally agree with you. That is certainly one of the things that
inspired me to try writing explanatory articles on maths.

And well, that escalated quickly.

------
catusia
"Gottfried Wilhelm Leibniz, a great German mathematician, came up with this
notation in the 17th century _when he was still alive_." That, and the
definition of function made me go through it with different eyes. But very
nice to see this approach anyway.

------
quacker
_And as we can see, definite integration is more of a local operation, while
indefinite integration is a global operation._

I think this was borrowed from here:
[http://math.stackexchange.com/a/20636](http://math.stackexchange.com/a/20636)?
Either way, it'd be cool to link to this question in that "why integrals are
hard" section, as it has a bunch of great responses.

~~~
cabinpark
I don't believe it was borrowed, as it is something that is definitely taught
in courses. I remember learning it at some point early on in my mathematics
education.

I definitely knew it by my vector calculus course (2nd year), where we used
the various named theorems to translate between the differential and integral
forms of Maxwell's equations.

------
chm
Interesting. I'm helping my sister with Differential Calculus this semester. I
had her calculate the rate of change of sin(x) between different pairs of
points, so she could see that it takes different values when the
"traditional" rate-of-change rule is applied. It's then easier to justify the
derivative concept of "slope at every point".

------
annamarie
Pretty good. I think Larry Gonick's _The Cartoon Guide to Calculus_ [0] is
better. It has a lot of narrative elements as well.

[0] [http://www.amazon.com/Cartoon-Guide-Calculus-
Guides/dp/00616...](http://www.amazon.com/Cartoon-Guide-Calculus-
Guides/dp/0061689092/ref=sr_1_1?ie=UTF8&qid=1321901206&sr=8-1)

------
quarterwave
One way to understand limits is compound interest. Start with a dollar and
compound 100% in a year to get $2. Halving the interest rate and compounding
twice gives $2.25. The "limit" of this less-interest-more-often process, ad
infinitum, is e (2.718...). This introduces both the infinitesimal and the
infinite.
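For the curious, the process is easy to tabulate (a minimal Python sketch; the schedule of n values is my choice):

```python
# Halve the rate and double the compounding, over and over:
# n compounding periods at rate 1/n each -> (1 + 1/n)**n dollars.
for n in (1, 2, 4, 12, 365, 10**6):
    print(f"n = {n:>7}   (1 + 1/n)^n = {(1 + 1/n) ** n:.6f}")
```

n = 1 gives the $2, n = 2 gives the $2.25, and the column creeps up toward e ≈ 2.718282.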

~~~
ivan_ah
Excellent example: e = lim_{n -> ∞} (1 + 1/n)^n

The concept of infinity (∞) is perhaps the most important new idea in
calculus. Specifically, calculus is about procedures with an infinite number
of steps, or with infinitely small steps.

The derivative is a slope calculation (rise/run) with an infinitely short run.
The integral is a rectangles-approximation-to-an-area using infinitely thin
rectangles, and series are summation procedures with an infinite number of
steps.

High school math deals with procedures with a finite number of steps, whereas
in calculus we learn to use infinity as part of our calculations. The reason
limits are important is that they allow us to make certain statements that
would otherwise not be true:

    
    
        1/n ≠ 0 even if n is huge    
             but    lim_{n -> ∞} 1/n = 0 
        
        sum([1/2^n for n in range(0,N)]) = 1.999999... ≠ 2
             but    sum([1/2^n for n in range(0,∞)]) =  2
    

The equality in both of the above examples depends on (mentally) carrying out
a procedure with an infinite number of steps.

(examples taken from my math book)
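The finite half of the second example can be run as real Python (note that Python spells 2^n as 2**n):

```python
# Partial sums of 1 + 1/2 + 1/4 + ... creep toward 2 but never reach it.
for N in (5, 10, 20, 50):
    partial = sum(1 / 2 ** n for n in range(0, N))
    print(f"N = {N:>2}   sum = {partial:.15f}")
```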

------
officialjunk
The issue I take with the way most calculus is taught is starting with limits
and derivatives. This is not the way humans discovered calculus: we discovered
integral calculus (finding the area under a curve) a couple of thousand years
earlier. It is really cool that differentiation and integration are inverse
operations, but starting with integration is, in my opinion, easier to grasp,
and lets the student derive formulas they already know: the area of a square,
triangle, circle, etc. Once that gap is closed, introducing derivatives makes
more sense.

~~~
xamuel
Depends on the definition of calculus. Without the fundamental theorem (which
absolutely requires differentiation), explicitly calculating those areas is
unreasonably hard (undoable in all but the simplest cases, and in the cases
where it is doable, you usually have to be Archimedes to do it). Sure, you
could start the course with _numerical_ integration, but this really wouldn't
help anything else.

(You might be interested, though, to know that your suggestion is essentially
how higher math is sometimes done. The natural logarithm function is defined
by ln(x) = int_1^x (1/t) dt, which if you look closely doesn't involve
exponentials in any way, allowing you to non-circularly define the exponential
function as the inverse of the logarithm function, and once you've got the
exponential function, the game is on.)
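As a sketch of how that definition can be made concrete, here is my own toy numerical version (a midpoint Riemann sum over [1, x], assuming x > 1; nothing here comes from the thread):

```python
import math

def my_ln(x, steps=100_000):
    # Approximate ln(x) = integral from 1 to x of (1/t) dt
    # with a midpoint Riemann sum; no exponentials anywhere.
    h = (x - 1) / steps
    return h * sum(1 / (1 + (i + 0.5) * h) for i in range(steps))

for x in (2.0, math.e, 10.0):
    print(f"my_ln({x:.5f}) = {my_ln(x):.6f}   math.log = {math.log(x):.6f}")
```

Inverting this function (e.g. by bisection) would then yield the exponential without ever assuming it, which is the non-circularity being described.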

------
PeterWhittaker
Thoroughly enjoyable review. I haven't touched calculus/analysis since the
mid-80s; that was a good read.

------
hadoukenio
That is great.

If anyone knows of anything like this for set theory or
probability/statistics, I would love to see it.

~~~
tomrod
Good idea! If I can find time this quarter I'll give it a shot.

