
Ask HN: What is calculus used for in Computer Science? - zabana
Hello HN,

This might sound like a stupidly simple question to most of you, but I'm currently learning computer science theory and calculus comes up very often. I was wondering why that is, and what the relationship between algebra and computer science is in general. (I'd like to develop a broader vision of the field.)

Thanks in advance

EDIT: typo.
======
ll350
I don't really understand your question: since you say you are "learning
computer science theory and calculus comes up", it would appear that you
already have the answer to "What is calculus used for in Computer Science?"
In the interest of trying to be helpful, I'll provide some of my thoughts
after completing a Computer Science degree at a large research university in
the US.

The vast majority of computer science THEORY involves discrete math, at least
at the undergraduate level. There are numerous APPLICATIONS that use calculus
and some of them have already been mentioned. Initially I thought it odd that
we were required to complete multi-variable calculus for our degree, which
seemed to have little to do with the 'discrete' math world and more with
'continuous' math topics. However, as time went on I encountered topics in
advanced courses where you were expected to be familiar with calculus at that
level. The notable exception to the lack of calculus in computer science
'theory' that I recall relates to probability, as it is used in algorithm
analysis. Hope I didn't misunderstand your question completely.

------
hackermailman
This story about Feynman using calculus and continuous functions to analyze
and approximate discrete systems sort of answers your request for a broader
view. [http://longnow.org/essays/richard-feynman-connection-
machine...](http://longnow.org/essays/richard-feynman-connection-machine/)

"By the end of that summer of 1983, Richard had completed his analysis of the
behavior of the router, and much to our surprise and amusement, he presented
his answer in the form of a set of partial differential equations. To a
physicist this may seem natural, but to a computer designer, treating a set of
boolean circuits as a continuous, differentiable system is a bit strange.
Feynman's router equations were in terms of variables representing continuous
quantities such as "the average number of 1 bits in a message address." I was
much more accustomed to seeing analysis in terms of inductive proof and case
analysis than taking the derivative of "the number of 1's" with respect to
time. Our discrete analysis said we needed seven buffers per chip; Feynman's
equations suggested that we only needed five. We decided to play it safe and
ignore Feynman.

The decision to ignore Feynman's analysis was made in September, but by next
spring we were up against a wall. The chips that we had designed were slightly
too big to manufacture and the only way to solve the problem was to cut the
number of buffers per chip back to five. Since Feynman's equations claimed we
could do this safely, his unconventional methods of analysis started looking
better and better to us. We decided to go ahead and make the chips with the
smaller number of buffers. Fortunately, he was right. When we put together the
chips the machine worked. The first program run on the machine in April of
1985 was Conway's game of Life."

------
eggie5
In convex optimisation, which comes up often in machine learning, gradients
(partial derivatives) are used. In deep learning, the chain rule is applied
in backpropagation.
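A minimal sketch of the idea above (all names here are illustrative, not from the comment): fitting a single sigmoid "neuron" by gradient descent, where the gradient of the loss is assembled factor by factor with the chain rule, exactly as backpropagation does at scale.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(x, target, w=0.0, lr=1.0, steps=200):
    """Gradient descent on L = (sigmoid(w*x) - target)^2."""
    for _ in range(steps):
        z = w * x                     # forward pass
        y = sigmoid(z)
        # chain rule: dL/dw = dL/dy * dy/dz * dz/dw
        dL_dy = 2.0 * (y - target)
        dy_dz = y * (1.0 - y)         # derivative of the sigmoid
        dz_dw = x
        w -= lr * dL_dy * dy_dz * dz_dw
    return w

w = train(x=1.0, target=0.8)
print(abs(sigmoid(w * 1.0) - 0.8) < 1e-2)  # True: output is near the target
```

Real frameworks automate exactly this bookkeeping (automatic differentiation), but the calculus underneath is the same chain-rule product.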

------
trcollinson
The interesting part about Calculus, at least in the US, is that it is held in
such high esteem. As if it is the pivotal accomplishment in a mathematical
career. Calculus in many other educational systems is actually taught much
earlier as a normal part of a mathematical curriculum (early high school in
Russia, for example). Basic calculus looks at a rate of change, or the
sensitivity of a system's output to a change in its variables (the
derivative). An instantaneous rate of change is a good example of a
derivative: how fast is a rock falling 2 seconds after it was dropped from a
building? The other area studied early is the integral, i.e. the area under
a curve; displacement is a classic example. More practically, you can find
things like the velocity from the acceleration.
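The two ideas above can be sketched numerically with the falling-rock example (the numbers and function names are my own illustration): a finite difference approximates the derivative, and the trapezoid rule approximates the area under a curve.

```python
g = 9.81  # gravitational acceleration, m/s^2

def position(t):
    """Distance fallen from rest after t seconds: s(t) = 0.5*g*t^2."""
    return 0.5 * g * t * t

def velocity(t):
    """Speed after t seconds: v(t) = g*t."""
    return g * t

# Derivative ~= instantaneous rate of change: approximate v(2) from
# positions using a small central finite difference.
h = 1e-6
v_approx = (position(2 + h) - position(2 - h)) / (2 * h)
print(round(v_approx, 2))  # 19.62, i.e. g * 2

# Integral ~= area under the velocity curve: recover the displacement
# over 0..2 s with the trapezoid rule.
n = 10000
dt = 2.0 / n
area = sum(0.5 * (velocity(i * dt) + velocity((i + 1) * dt)) * dt
           for i in range(n))
print(round(area, 2))  # 19.62 m, matching position(2)
```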

I know, I haven't answered your question yet. I have used Calculus, even much
more advanced calculus, in computer science applications such as using it to
calculate the salient region of interest in a photograph, for example. But in
reality, I rarely use it. That being said, studying calculus and higher forms
of mathematics has given me a broader sense of how best to solve problems.
It gives me the ability to see a problem and realize that brute force isn't
the only answer to the problem. Mathematics shows that solutions can be
nuanced and beautiful. Calculus is a good start in seeing that. Applying that
ability to look at a problem and find nuanced and beautiful solutions is key
to good software development.

Again, I am not sure that actually answers your question. But I find calculus
to be mind expanding. I hope you'll enjoy it as much as I have.

Now if you'd like to see an actual application of calculus being used in
computer science you might want to read about Richard Feynman at Thinking
Machines Corporation: [http://longnow.org/essays/richard-feynman-connection-
machine...](http://longnow.org/essays/richard-feynman-connection-machine/) .
One of the quotes from the article always makes me smile: "By the end of that
summer of 1983, Richard had completed his analysis of the behavior of the
router, and much to our surprise and amusement, he presented his answer in the
form of a set of partial differential equations."

There is also a rather funny urban legend/joke about teaching Calculus in the
US versus other countries:

"A certain well known mathematician from the USSR, we'll call him Professor
P.T. (these are not his initials...), upon his arrival at Harvard University,
was scheduled to teach Math 1a (the first semester of freshman calculus.) He
asked his fellow faculty members what he was supposed to teach in this course,
and they told him: limits, continuity, differentiability, and a little bit of
indefinite integration.

The next day he came back and asked, 'What am I supposed to cover in the
second lecture?'"

Enjoy learning Calculus!

~~~
_jn
The US high school curriculum tries to provide separate courses in a way that
may be great for creative subjects or languages, but simply doesn't work well
for math. The level of interconnection (e.g. the uses of calculus in
probability theory and geometry) means that basic calculus (limits,
continuity, one-variable derivatives and integrals) really should be an
integral (sorry, couldn't resist) part of the curriculum, yet it is largely
ignored.

------
bjourne
Algebra means symbol manipulation. I'm sure you can see how manipulating
symbols is essential to computer science. Calculus isn't used much directly
because it depends on infinite processes, but numerical approximations based
on results from calculus are. For example, the trigonometric functions can be
implemented using Taylor polynomials, which, thanks to calculus, we know are
good approximations of them.
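A quick sketch of that idea (a simplification: production math libraries typically use more refined polynomial approximations plus range reduction, but the Taylor polynomial shows the principle): approximate sin(x) by the partial sum x - x^3/3! + x^5/5! - ..., whose error Taylor's theorem from calculus lets us bound.

```python
import math

def sin_taylor(x, terms=8):
    """Taylor polynomial of sin around 0, using the first `terms` terms."""
    total = 0.0
    for n in range(terms):
        total += (-1) ** n * x ** (2 * n + 1) / math.factorial(2 * n + 1)
    return total

for x in (0.1, 1.0, math.pi / 2):
    print(x, sin_taylor(x), math.sin(x))  # the two columns agree closely
```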

------
seattle_spring
Mostly just for overly-complex interview questions.

------
lukaslalinsky
Calculus is a low-level tool that you need to know in order to think and
communicate effectively about problems that deal with numbers. Signal
processing is probably a big chunk of these problems, covering things like
computer vision, audio DSP, etc.
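A tiny illustration of the connection (my own example, not from the comment): the first difference of a sampled signal is the discrete analogue of the derivative, and it acts as a crude high-pass filter. Image-gradient-based edge detection in computer vision rests on the same idea.

```python
# A square pulse, sampled: the "derivative" (first difference) spikes
# exactly at the edges of the pulse.
signal = [0, 0, 0, 1, 1, 1, 0, 0]
diff = [b - a for a, b in zip(signal, signal[1:])]
print(diff)  # [0, 0, 1, 0, 0, -1, 0]
```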

------
PaulHoule
If you see it coming up often you might want to catalog the ways in which you
see it coming up.

