
Calculus for the Curious - diodorus
https://thonyc.wordpress.com/2019/10/16/calculus-for-the-curious/
======
ncfausti
I just stumbled upon Infinite Powers in B&N, and after reading a bit, decided
to buy it. Loving it so far. I’ve been curious about calculus for a long time,
since I’ve never taken a formal class on it. Now that I’m in a graduate CS
program, I feel like I’m missing the deeper understanding of many of the
formulas that are presented.

My plan is to get through it to get some background on the main ideas of
calculus, then work through khan academy and/or read through Aleksandrov’s
Mathematics Contents/Meaning.

If anyone knows of active forums/q&a/online practice for self-learning
calculus, it’d be a huge help if you could share.

~~~
lilott8
Oh, hey. Are you me? I am wrapping up my time in a CS grad program, and I also
never took calculus as a formal class.

Now, I will say that, save for machine learning/AI, calculus isn't really
necessary; the world is completely discrete.

That being said, that doesn't mean that knowing calculus wouldn't _enhance_
your ability to understand and digest some of the more difficult reductions
and proofs in, say, a theory of computation course.

I relied on "The Calculus Tutoring Handbook"[0]. I wanted a book that had
answers to _all_ the exercises for confidence building purposes. The book goes
slow and provides a great amount of detail -- the authors are pretty good at
not hand-waving.

I also found r/learnmath useful as an "I have a problem and can't ask anyone"
site. They are really friendly.

[0] [https://www.amazon.com/Calculus-Tutoring-Book-Carol-Ash/dp/0780310446](https://www.amazon.com/Calculus-Tutoring-Book-Carol-Ash/dp/0780310446)

~~~
traderjane
Am I wrong to say that even if the universe of your concern is discrete,
calculus can at least describe the behavior of recursive discrete processes,
among other things?

~~~
TheOtherHobbes
Depends on the process and the exact form of the discreteness.

Discreteness introduces discontinuities and errors, and it's usually possible
to describe the errors analytically. But there are situations where discrete
systems become numerically unstable and blow up while the smooth analytic
equivalent has no problems.
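To make that concrete (a toy sketch of my own, not from the article): forward Euler on dy/dt = -50y is the classic example. The analytic solution exp(-50t) decays smoothly to zero for any starting point, but the discrete update multiplies by (1 - 50h) each step, so it blows up as soon as the step size h exceeds 0.04:

```python
# Forward Euler on dy/dt = -50*y with y(0) = 1.
# Analytic solution: y(t) = exp(-50*t), which decays smoothly.
# Discrete update: y_{n+1} = y_n * (1 - 50*h), which blows up
# whenever |1 - 50*h| > 1, i.e. whenever h > 0.04.

def euler(h, steps):
    y = 1.0
    for _ in range(steps):
        y += h * (-50.0 * y)  # one forward-Euler step
    return y

stable = euler(0.01, 100)    # per-step factor 0.5: decays like the true solution
unstable = euler(0.05, 100)  # per-step factor -1.5: oscillates and explodes

print(stable)    # tiny, near zero
print(unstable)  # astronomically large in magnitude
```

Same equation, same smooth limit; only the discretization differs.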

------
yessenia1
I used to always associate 'math smarts' with 'code smarts'. I spent most of my
life telling myself that since I was bad at math I had no hope of learning how
to code. Now I'm 2 weeks into a coding bootcamp after losing my job and am
realizing they are completely different parts of the brain.

I believe in myself more with every push to Heroku. Maybe I will do a calculus
class next and prove to myself I can learn anything. Anyone have
recommendations for an online calc course for the math-insecure?

~~~
mav3rick
What did you do before, if you don't mind me asking? Would you have done an
investment banking bootcamp if it guaranteed higher pay?

~~~
yessenia1
Apple Retail. I didn't qualify for a Genius position after trying very hard,
and never felt like I fit in. It inspired me to play with Swift Playgrounds,
which led me to the bootcamp program, which I am LOVING.

------
pugio
I just finished reading this, and thoroughly enjoyed it. It provides a clear
description of the fundamental intuition at the heart of calculus (chop stuff
up infinitely, then put it back together again), and a mix of the historical
background of its development and its ancient and modern applications.

All books of this nature are somewhat idiosyncratic – it's not a history, not
a textbook, nor an applied maths book; it's a packet of passion sent by
someone who's clearly excited and enthralled with his topic: "Here's a story
about something really cool! Maybe you'll think it's cool too!"

I've been investigating recent books with different approaches to calculus
(trying to build a free online course that empowers people in maths). Here are
some other books I can recommend/mention:

[0] "Burn Math Class: And Reinvent Mathematics for Yourself" Goes from 1+1 all
the way to derivatives and integrals. My favorite work of math demystification
and pedagogy.

[1] "Calculus Reordered: A History of the Big Ideas" Released this year,
exactly as the title says. An accessible history, explaining each idea as it
enters the world stage. I've only just started this, but it's definitely a
more historically thorough, albeit less engaging, book than Infinite Powers.
(Since historical accuracy seems to be TFA's main focus, I wonder what they
would think of this book.)

[2] "Change Is the Only Constant: The Wisdom of Calculus in a Madcap World"
Another just-published book – this is the year for Calculus! Amazon has
lost/delayed my preordered copy, but from the author's other work I expect
this to be a LOT of fun.

[3] "Magnificent Principia: Exploring Isaac Newton's Masterpiece" Sort of "the
annotated Newton". Outlines Sir Isaac's history, social environment, and
development of the Principia Mathematica. The bulk of this book is going
through each section of the Principia, translating the language into modern
speech/formulations (where needed), and explaining what Newton was getting at.
Also not as gripping as Infinite Powers, but a great way of reading and
understanding one of the most foundational scientific/mathematical texts of
all time.

[4] "Introductory Calculus For Infants" I'm about to have my first child, so
am naturally collecting suitable reading material for the budding babe
(suggestions welcome!).

[0] [https://amzn.to/2pVXJwj](https://amzn.to/2pVXJwj)

[1] [https://amzn.to/2pMCZr5](https://amzn.to/2pMCZr5)

[2] [https://amzn.to/31Ny32e](https://amzn.to/31Ny32e)

[3] [https://amzn.to/2BIimyM](https://amzn.to/2BIimyM)

[4] [https://amzn.to/33ZxbJj](https://amzn.to/33ZxbJj)

------
8bitsrule
Thony's math and astronomy posts are great. _Very_ careful with the details (a
real scholar), yet as readable as Asimov. Quite brave of Strogatz!

(Notice the tag cloud on that page? ... he's been at it for a while!)

------
lordnacho
"See for example Euler, who made great strides in the development of calculus
without any really defined concepts of convergence, divergence or limits, but
who doesn’t appear here at all."

Can anyone say more about this? I always suspected this was so, because the
end of high school / start of uni covers roughly that period of mathematics,
and I always thought there was something about convergence/limits that was
missing; it seemed very ad hoc.

~~~
braindeath
Euler lived in the 1700s. Not sure how that would have affected your training
unless you’re much older than average.

Cauchy and Taylor both formalized many concepts, again in the 18th and 19th
centuries.

What are you thinking is ad hoc?

~~~
soVeryTired
In high school mathematics, you're not really given the definition of a limit.
Consider the definition of the derivative

limit as h -> 0 of (f(x+h) - f(x)) / h.

That's well-defined on (0, inf) but not on [0, inf). So you can't just
evaluate at h=0 and be done with it.

The intuition is 'as h gets smaller and smaller, the ratio gets closer and
closer to a new function of x'. But many high-school students aren't given a
clear definition of what it means for one function to be 'close' to another,
or what it means for x to 'get smaller and smaller'.

To see the confusion more clearly, try having the debate about whether
0.999... = 1 with someone who doesn't understand what a limit is.
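To make the "smaller and smaller" intuition concrete, here's a quick sketch of my own (f(x) = x^2 at x = 3, so the true derivative is 6). Algebraically the quotient works out to 6 + h, so it gets as close to 6 as you like, but it is never actually evaluated at h = 0:

```python
# The difference quotient (f(x+h) - f(x)) / h for f(x) = x**2 at x = 3.
# Expanding: ((3+h)**2 - 9) / h = (6h + h**2) / h = 6 + h for h != 0.
# The limit as h -> 0 is 6, even though h = 0 itself is never allowed.

def diff_quotient(f, x, h):
    return (f(x + h) - f(x)) / h

f = lambda x: x * x
for h in (1.0, 0.1, 0.01, 0.001):
    # values march toward 6 (approximately 7, 6.1, 6.01, 6.001)
    print(h, diff_quotient(f, 3.0, h))
```

Setting h = 0 in `diff_quotient` would raise a ZeroDivisionError, which is exactly the point: the limit is defined without ever touching h = 0.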

~~~
dxdy
Let n = 0.999... Then multiply both sides of the equality by 10 so that we
have 10n = 9.999... Then subtract n from both sides of the resulting equality
to get 9n = 9.000... Finally, divide both sides by 9 and voila, we have n = 1,
which is what we wanted to show.

~~~
soVeryTired
My gut feeling is that that proof isn't quite correct, since you haven't used
the notion of a limit anywhere. There's a fundamental fact about convergence
of geometric series that you need to use.

I think your proof goes wrong since you haven't justified how arithmetic
operations work with infinite decimals. AFAIK the only way to add non-
terminating decimals is to convert them to fractions (or sequences of
fractions, as with pi, e, etc.), add the fractions, and convert them back. So
if you convert 0.999... and 9.999... to fractions, you've assumed the
conclusion.
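For what it's worth, the geometric-series fact can be checked with exact rational arithmetic (a quick sketch of my own using Python's Fraction, so there's no floating-point noise): the n-th partial sum of 9/10 + 9/100 + 9/1000 + ... is exactly 1 - 10^-n, which is why the limit, and hence 0.999..., is exactly 1.

```python
from fractions import Fraction

# Partial sums of 0.9 + 0.09 + 0.009 + ... computed as exact rationals.
# The n-th partial sum is exactly 1 - 10**-n, so the sequence converges to 1.
def partial_sum(n):
    return sum(Fraction(9, 10 ** k) for k in range(1, n + 1))

print(partial_sum(5))       # 99999/100000
print(1 - partial_sum(50))  # 1/10**50: positive for every finite n, but -> 0
```

Every partial sum falls short of 1, which is what the limit definition handles: the shortfall 10^-n can be made smaller than any given epsilon.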

To play devil's advocate, I can try to rephrase your proof without infinite
decimal arithmetic as follows.

Assume

n = 0.999... = 1 - epsilon, where epsilon is 'infinitesimal' (an ill-defined
version of not-quite-zero). We'd like to show that epsilon is zero.

10n = 9.999... = 10 - 10epsilon

9n = 9.999... - (1 - epsilon) = 9 - 9epsilon

9n = 8.999... + epsilon = 9 - 9epsilon

The only way to get the epsilons to cancel is to assume epsilon = 0, which is
to assume the conclusion.

------
ngcc_hk
It is an odd review, mixing negative comments with positive appreciation. The
last bit got me, as I am not a historian but want to know the latest
developments in the field, and lately the Zeno etc. kind of philosophical
discussion.

------
samwhiteUK
Holy, commas, Batman

