
Why there is no Hitchhiker’s Guide to Mathematics for Programmers - disgruntledphd2
http://jeremykun.com/2013/02/08/why-there-is-no-hitchhikers-guide-to-mathematics-for-programmers/
======
danso
> _Perhaps one of the most prominent objections to devoting a lot of time to
> mathematics is that it can be years before you ever apply mathematics to
> writing programs. On one hand, this is an extremely valid concern. If you
> love writing programs and designing software, then mathematics is nothing
> more than a tool to help you write better programs._

Calling it an "extremely valid concern" is understating it. The dilemma is
that programmers have many, many roads to follow if they want to improve their
product, all of them discrete, deep fields in their own right. Math is great, and
certainly studying it will expand your mind...but so will studying engineering
practices, statistics (which yes, is a kind of math, but not as theoretical as
the kind the OP is advocating for), user interface design, and knowledge
domains (medicine, legal, earth sciences, etc.)...all of which, as far as most
programmers can tell, will yield concrete rewards much more easily than will
mastering higher level mathematics.

~~~
tikhonj
Abstract math is somewhat special just like studying programming languages and
semantics is somewhat special: it's something of a meta-science, applicable to
_all_ the programming you do.

Everything else is specific to the domain you're programming in; these "meta"
fields (I need a better name for them) have programming or reasoning _itself_
as the domain.

------
tunesmith
I've come up against this recently and I think a lot of imperative programmers
will. For me, the basic pattern, starting from a career based on perl/php/java
was this series of realizations:

\- EECS circuit design is based on a conceit that gives chips a speed limit

\- as a result, chips aren't getting much faster, and multi-core designs are
happening instead

\- parallel programming will become more and more essential to make a buck as
a programmer

\- imperative languages aren't great for parallel programming due to mutable
variables

\- functional languages are better for parallel programming due to immutable
variables

\- functional languages are a lot closer to math concepts than imperative
languages

\- uh-oh

So for me it's been a fun few months of stumbling through self-directed study
in stats, probability, and math. Bayesian probability from some articles by
Yudkowsky. A fast, clumsy review of zeroth-order logic, first-order logic, and
ZFC set theory. Anki decks out the wazoo. Now I'm plodding through Learn You A
Haskell.

I would _like_ to work more with actual mathematical proofs but I'm finding
that tough to get into myself. It was fun paging through metamath for a few
days, though.

~~~
bcoates
Message passing is an ancient and well-understood mechanism for producing
parallel, scalable, mutable-state imperative programs. Feel free to continue
to not learn math :)

~~~
tunesmith
But that's not as much fun. :-)

------
zepolud
Actually, there _is_ something that comes pretty close to this--it's called
The Princeton Companion to Mathematics, curated by T. Gowers but mostly
written by about a hundred of the top mathematicians in their respective
fields, and is one of the most beautiful books I've ever read.

------
dxbydt
> “The proof is trivial.” This is the mathematician’s version of piping output
> to /dev/null.

Heh heh. Never thought about it like that. Amazing write-up, btw. As somebody
who dabbled in both CS & Math, I'd say the cultures are vastly different. You
can spend years, decades even, just teaching undergrad calc courses while
having barely 1-2 papers to your name, and yet you'd be considered a
legitimate mathematician & get paid & all that. I know scores of math
professors who are in that category. With CS, if you don't have productive
output in a week, you are just idling & companies will fire you. CS academia
isn't a whole lot different...the paper output is a lot higher, though much of
it is backed by programmatic machinery.

As a consequence, a standardized notation/syntax has developed out of sheer
pragmatic necessity. Everybody knows what you mean when you say it's a hash, or
a bst, or an lfsr, or a trie or a monad ( you wish :) Well, at least some of
these are standard concepts across all programming languages.

With math, you can labor for years in some obscure field ( heh heh a
mathematician will kill for that pun!) in which less than ten people know what
you mean. At a conference, Hilbert is supposed to have gotten up and asked a
speaker "What do you mean by a Hilbert space?". That should tell you
something. The syntax is remarkably nonstandard & even simple things like
edges, nodes, edge weight, graph correspondingly become arcs, vertices,
payload, network depending on where the literature originates from. When I
studied a few math papers in grad school, I had a lurking suspicion the author
was going out of his way to obscure his thought process & result. With CS
papers, you at least get straightforward pseudocode & you can run off to your
favorite language whether php or haskell & give it a shot. There are math
texts out there where you pick up one of them, you feel like a complete fool,
you pick up the other, you instantly get the point, and they are both talking
about the same exact thing !! Artin vs. Dummit & Foote vs. Herstein come to
mind...I got 1000% more out of Herstein than the other two.

I agree with the author: in the absence of standard math terminology, I'm
afraid programmers struggling to learn advanced math just learn the bits &
pieces necessary to get their job done & move back to fighting their daily
battles with git rebase & jira tickets.

~~~
bad_user
I've seen CS papers that use mainstream programming languages directly,
instead of pseudocode, so you can just copy/paste it.

However, many CS papers aren't an easy read either, even if they talk about
something really practical. I remember reading the paper on String B-trees,
which is a data structure for string search optimized for external storage (so
you don't have to fit the whole index in RAM), and even though I understood all
the concepts, the language was rather opaque for no reason.

~~~
jfb
Writing prose, even highly structured prose, is an unevenly distributed skill.
Thankfully, it's also perfectly possible to get better at it. Would that more
academics (lawyers, &c.) would try.

~~~
j2kun
Amen to that. I can't tell you how many research papers I've read that muddle
around with all the wrong words.

------
bcoates
If this is intended as a defense of Math culture, I think it falls short. He
does a good job of pointing out the corresponding problems in the state of
programming:

 _Indeed, the opposite problems are familiar to a beginning programmer when
they aren’t in a group of active programmers. Why is it that people give up or
don’t enjoy programming? Is it because they have a hard time getting honest
help from rudely abrupt moderators on help websites like stackoverflow? Is it
because often when one wants to learn the basics, they are overloaded with the
entirety of the documentation and the overwhelming resources of the internet
and all its inhabitants? Is it because compiler errors are nonsensically
exact, but very rarely helpful? Is it because when you learn it alone, you are
bombarded with contradicting messages about what you should be doing and why
(and often for the wrong reasons)?_

The difference is that the CS community recognizes that these are problems;
every single thing he's complaining about is an open problem being taken
seriously and attacked from multiple directions, and there is hope for serious
improvement in the coming decades. Anyone who thinks rude snobs, bad
documentation, or useless compiler errors are beneficial is rightly ridiculed
as a smug weenie or accused of having an ulterior motive.

By contrast, mathematicians are defensive and complacent about their arcane,
non-inclusive notation and communication: "At this point you might see all of
this as my complaining, but in truth I’m saying this notational flexibility
and ambiguity is a benefit." Look at the litany of problems he just presented.
Consider the fact that mathematics is _not_ the only complicated subject that
requires complicated, flexible, and rigorous notation. It just isn't credible
that the shitty state of mathematical notation is either necessary or
unavoidable. The occasional counterexample, where someone with a good
understanding of a subject presents it in full rigor without resorting to the
usual obfuscation, is a hint of what could be.

If your publications cannot be read without an expert interpreter, they are
defective. Hypertext has been around for decades; if you're going to invent
your own ad-hoc (or even standardized!) syntax to solve a problem, your readers
have a right to have you document the meaning of your notation.

~~~
john_b
This isn't entirely true. Numerous mathematicians have publicly bemoaned the
difficulty of communication between mathematicians in different specialties.
So it's definitely not something that is ignored.

The question is what to do about it. We're not just talking about confusion
arising from different notation between mathematical specialties (resolving
that would be as easy as defining your notation in an appendix), but
different, equally valid ways of mathematical thinking.

In some ways, it may be better to think of different mathematical specialties
as different programming languages. Proficiency in one will help you, but
won't guarantee that you can interpret another. Except that in mathematics,
the differences are more extreme. If you have two Turing-complete programming
languages, then you have two different tools that can solve the same class of
problems. But different fields in mathematics deal with entirely different
mathematical objects which require a _conceptual_ instead of _notational_ leap
on the part of the reader. It's not simply a matter of figuring out how to
write for loops or manipulate strings in the new language. You actually have
entirely different ideas in each, and trying to impose some common notational
standard among them is fraught with problems.

~~~
skybrian
Fair enough. But even though there's no hope of coming up with a single
programming language for everything, we do have quite good tools for popular
areas. It seems like being able to mechanize the error-checking of proofs in
certain of the more useful and popular subfields of math might be just what's
needed by non-mathematicians?

~~~
tikhonj
Well, we do have proof assistants. I think people are starting to use those in
more diverse fields of math now--I remember seeing an article about an
algebraic proof in Coq, but I don't remember any of the details.

------
hdivider
_"For the programmer who is truly interested in improving their mathematical
skills, the first line of attack should now be obvious. Become an expert at
applying the basic methods of proof"_

This may be good advice if you want to get into pure mathematics, but huge
parts of maths are not pure.

Applied mathematics is very relevant for programmers - especially game
programmers. It's not a sin to sacrifice mathematical rigour for the sake of
discovery.

It doesn't even have to be a direct map (e.g. actual implementations of
calculus or such). Just understanding various bits and pieces in granular
detail can help you get precise and testable quantitative ideas for problems.

For instance, a while ago I wanted to implement a kind of smooth up-and-down
motion for an object in my game. Almost immediately I thought of equations of
this form:

(y position) = (some amplitude)* sin((some frequency)*(time)) + (some value to
denote the origin)

This is a first step for some fairly interesting up-and-down motion - the rest
is all a matter of finding suitable constants.
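That first step translates almost directly into code. A minimal sketch in Python (the function name and the default amplitude, frequency, and origin are my placeholders; finding values that feel right is exactly the constant-hunting described above):

```python
import math

def bob_y(t, amplitude=10.0, frequency=2.0, origin=100.0):
    """Vertical position at time t for smooth up-and-down motion.

    amplitude, frequency, and origin are placeholder constants to tune;
    the formula is (some amplitude) * sin((some frequency) * t) + origin.
    """
    return amplitude * math.sin(frequency * t) + origin
```

Feeding it a steadily increasing t (say, each frame's timestamp) produces the oscillation; layering a second sine term with a different frequency is one easy way to make the motion more interesting.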

Similar things come to mind when you want to move an object smoothly to
another position along a line, with a decreasing speed:

position = ((position you want to move to) - (current position)) / (some
constant) + (some minimum speed)
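Iterating that formula once per frame gives an ease-out toward the target. A minimal Python sketch of the idea (the divisor and minimum speed are placeholder constants to tune, and the overshoot guard is my addition, not part of the formula above):

```python
def step_toward(current, target, divisor=8.0, min_speed=0.5):
    """Advance one frame toward target, with speed proportional to the
    remaining distance so the object decelerates as it approaches.

    divisor and min_speed are placeholder constants to tune.
    """
    speed = (target - current) / divisor
    # Enforce a minimum speed so the final approach doesn't take forever.
    if abs(speed) < min_speed:
        speed = min_speed if target >= current else -min_speed
    # Snap to the target instead of overshooting on the last step.
    if abs(target - current) <= abs(speed):
        return target
    return current + speed
```

Calling it in a loop (`pos = step_toward(pos, goal)` each frame) moves the object quickly at first, then more and more slowly, until it lands exactly on the goal.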

This stuff just flashes before your mind if you've done plenty of physics (or
'mechanics' in mathematics) before. What's more, you'll know countless other
ways of increasing the complexity of your equations without having to do much
trial & error hacking (except to find things like constants and such).

For games, I'm convinced that there's still an incredibly huge variety of
mathematical behaviour that has yet to be harnessed for the implementation of
actual game mechanics. (Games usually just use some game engine's
implementation of only some parts of classical mechanics, for instance.)

Never underestimate the relevance of applied mathematics to real-world
programming, even though it doesn't really occupy a spectacularly prominent
position in CS-focused courses.

------
rcfox
I think the biggest stumbling block for me was all of the stupid tricks
involved in the proofs I had to do for school.

"If I add sin(x) to both sides of the equation, then I can use such-and-such
identity which makes this equal foo."

If you don't make the correct mental leap, then you get completely stuck and
it's really easy to give up.

~~~
davorak
It is sometimes useful for me to break down this type of problem into:

* enough memory to remember the identities

* fast enough recall of identities from memory @

* fast enough exploration of possible solutions @

@ Fast enough is when you can do it all within the amount of time it takes for
solving the problem to give enjoyment rather than ending in frustration.

Without the above I have to use different, usually more intentional,
strategies.

------
cschmidt
I don't know about the OP's point on the value of proof to programmers. I do
optimization and machine learning, and I _use_ all kinds of math on a regular
basis. But I don't prove things, and rarely need to follow someone else's
proof either. You can get a lot out of applied math without touching a proof.
And if it isn't applied, then you really don't need it ;-).

I agree there is value in learning how to read math notation, so there is a
valid point in there somewhere.

~~~
coolSCV
An analogous case is the programmer who relies on libraries and abstraction
and "knows just enough to be dangerous." Statistics and optimization
mathematics in particular are vulnerable to this. If you haven't read the
proofs or you cannot prove it yourself, then you truly don't understand it.
I'm not claiming this is you since your post doesn't imply that.

However as an example, if someone is solving linear or non-linear programs
using <insert software or library here> but hasn't bothered to thoroughly
study the mathematics underlying either and is just plugging in numbers from
the constraint equations they came up with, then I would not want that person
doing any sensitivity analysis. There would just be no way to be confident
that they are interpreting the results correctly.

~~~
cschmidt
I've benefited from lots of theoretical classes, and you're absolutely right
that you need to understand the theory of what you're doing. For example, I
took a lovely class from Prof. John Hooker (at CMU) that derived Linear
Programming duality as a natural consequence of Farkas' Lemma, rather than
just mechanically grinding through LP tableau. I certainly followed the proofs
discussed in many classes.

However, I can honestly say I got that understanding while doing hardly any
proofs of my own. Graph theory was the only class where I "did proofs", and it
wasn't my favorite. (Later I did algorithmic graph theory and network
flows - and that was fantastic, with no proofs.)

I guess my point is just that an engineer can understand and use a lot of
pretty advanced math without spending much effort learning about proof
techniques. That certainly wouldn't be how I'd spend my time. (Of course, if
you enjoy doing that then knock yourself out.)

------
derefr
> It’s as if the syntax of a programming language changed _depending on who
> was writing the program!_

This isn't unique to mathematics; it's the reason it's hard to scale the
number of programmers working together on a Lisp codebase. Each programmer
writes their own notation (macros), and ends up down their own rabbit-hole of
a sub-language that nobody else can immediately understand.

------
andrewflnr
This seems like a problem of terminology. It is valid to desire a better
acquaintance with the theorems others have proven, just as it is valid to
desire a better acquaintance with the user-facing features of Chrome, iTunes,
Excel, etc. I guess "using math" is different from "doing math", and most
people are more interested in the former.

There is a bit of a continuum, though. Using Excel effectively involves a
smidgen of programming, similar to how "using math" effectively really means
using other peoples' theorems to "prove" things, very informally, about your
own domain. Maybe that's why people are more likely to get confused between
"using" math and "doing" math.

------
jholman
There's a lot of interesting and valuable stuff in this post (enthusiastically
upvoted!), but I think there are a lot of errors, too.

First of all, let's acknowledge that many (the great majority of?) programmers
have little or no use for math, and know it, and are correct. That's totally
fine. I think danso's comment captures this well, with references to UI
design, domain knowledge, engineering practices, etc, and dxbydt similarly
mentions "git rebase and jira tickets". So let's assume we're restricting
ourselves to programmers who do have a use for math (and hopefully know it).

FIRST PROBLEM

TFA's introductory claim is apparently that there are programmers who say
"math is useful, I want to use it, I don't want to read or write proofs", and
that this is ludicrous. It's absolutely not ludicrous.

Yes, a working mathematician's job is to explore and understand the space of
mathematics, and yes, that is inextricably bound up in proof. And I would
suggest that learning more about math proof is a soul-enriching use of
anyone's time (in the same broad vein as learning Latin). But techniques of
calculation are one of the valuable outputs of mathematicians, and it is
reasonable and effective for a person with a task to say "I would like to have
more powerful techniques of calculation, and I don't want to learn proofs".

For example: certainly it is possible to learn how to do Gaussian elimination,
and use it to solve systems of linear equations, to good effect, without being
able to _prove_ that elementary row operations preserve the solutions. Maybe it
helps? Maybe it doesn't? If you want to invent a new technique, I imagine that
in that case being able to prove the correctness of the existing technique
probably helps a great deal. But what's unreasonable about the person who says
"I have these equations, I need to find the values (if any) of these variables
here that make all the equations true, and I'm too busy to learn any proofs
today (and too busy to write any proofs ever)"?
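The technique itself is short enough to sketch in Python, and it is usable exactly as described, with no proof in sight of why the row operations are safe (the function name and the partial-pivoting detail are mine):

```python
def solve_linear_system(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.

    A is a list of rows, b the right-hand side; returns the list x.
    A minimal sketch: assumes the system has a unique solution.
    """
    n = len(A)
    # Build the augmented matrix so row operations act on both sides at once.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: bring up the row with the largest entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate the entries below the pivot with elementary row operations.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x
```

For instance, `solve_linear_system([[2, 1], [1, 3]], [5, 10])` returns x = 1, y = 3 for the system 2x + y = 5, x + 3y = 10.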

For a simpler example, we used arithmetic just fine before ZFC.

SECOND PROBLEM

Maybe this is actually two problems, intertwined. And maybe both of these
problems are quibbling.

Primus, he says "Mathematics is cousin to programming in terms of the learning
curve, obscure culture, and the amount of time one spends confused". I think
this is highly questionable. Mathematics is far harder and more baroque than
programming, not least because as a discipline it's nearly one hundred times
older.

Secundus, after acknowledging the insanely annoying tendency of math papers to
make up their own notation without ever defining it, he claims that the non-
mathematician's confusion is comparable to the non-programmer's confusion at
for(;;);. No, it's not comparable. The math case is hard because someone
failed to tell you what you need to know, and no one wrote it down anywhere.
The for-loop is only hard if you did the opposite: failed to read what was
actually written down (e.g. in K&R) and took someone's word for it that the
standard and general form of the for-loop is for(int i=0;i<LIMIT;i++){ / _do
stuff with i_ / }

My claim here is that learning math - even just the calculation techniques -
is genuinely, inherently hard. Partly that's because it's so overwhelmingly
useful that if it weren't, we'd already know it; clever students know more
calculation technique by age 15 than most professional programmers know about
programming. But on the other hand it's also contingently hard, because
mathematicians are generally neither interested in nor incentivized to export
their product so that outsiders can make use of it. Which I guess isn't so
terrible;
once someone else labours mightily to learn all that very-hard math, you can
hire her at an exorbitant rate to make use of the calculating techniques she
has learned. Or if you want to get that benefit without paying the exorbitant
rates to outsource the math, you can roll up your own sleeves. Good luck!

~~~
j2kun
OP here. My point was that if you want to be _better_ at math, then you need
to get comfortable with proofs. There are lots of instances of using math for
useful things, but if you don't do proofs you won't get any better at
understanding mathematics. There are a lot of times when people want to know
more about a particular mathematical fact (they genuinely want to understand
why it is the way it is) but the minute you tell them, they stop wanting to
hear it. This is what I think is ludicrous.

The for loop analogy may be a bit of a stretch, but the reasons for it are as
wacky as those for the mathematical syntax of inventing or dropping indices.
Literally the C99 standard says that if a test expression is missing in a for
loop, it's replaced by a nonzero constant. In my mind as a programmer and as a
mathematician, that's as arbitrarily confusing as dropping the limits of an
indexing variable in a summation. They both just allow you to write/type less
and have the expression still mean something.

~~~
EvanMiller
> if you don't do proofs you won't get any better at understanding mathematics

There is a vast world of mathematical understanding available to programmers
who don't like proofs. I think there is some miscommunication here about what
constitutes mathematics. Mathematicians tend to think of the art of proving
things, whereas engineers (and programmers, and scientists) think of equations
and how to solve them. I think mathematicians actually do a disservice to
engineers by trying to beat them over the head with real analysis and set
theory (which is why most physics and engineering departments end up having to
teach their own mathematical methods courses).

The "foundational" knowledge of mathematics -- epsilons, deltas, Cantor sets,
Banach spaces, etc. -- is irrelevant to almost all practitioners. Newton,
Taylor, Euler, Gauss, Laplace, Bessel, Maxwell and the gang got pretty far
without it. When programmers say they want to learn math, they mean they want
practical knowledge: interesting functions, how to compute them, and their
properties that can be applied with utility. They don't want -- or need -- to
spend time poring over proofs.

~~~
j2kun
It seems you're under the impression that real analysis is the foundation of
mathematics (I certainly wouldn't consider Cantor sets or Banach spaces in
there).

Let's say you invent a new machine learning algorithm: how can you know how
well it does? If you invent an algorithm to do anything, how can you be sure
it's correct? Well, you can always test it on some things, but if it's really a
new algorithm, then you need to prove it does what you say it does in all
possible cases.

It's fine and dandy if someone tells you that there is this function that you
can compute and use in this way and it's the best for its purpose, but this is
an extremely rare case. I believe I have mentioned this elsewhere in comments,
but there is almost never a clear-cut answer in mathematics that someone can
just use out of the box. It almost always involves tweaking to fit a
particular application, and to really understand how to tweak an algorithm you
need to understand why the algorithm works the way it does.

~~~
jholman
I agree with you that if you want to prove the correctness of a new algorithm,
you need to _prove_ the correctness of that new algorithm. You're gonna want
some proving skills there.

How often are programmers developing new algorithms? I guess it depends on
where you draw the line for "algorithm", but let me propose two possible
places for the line. On the one hand, you might use existing techniques in a
new function. In this case, the proving technique you need is the theory of
invariants and so on, as Dijkstra ranted about. You're not gonna need a lot of
practice with higher-level math proofs for that. That's even assuming you care
about "proving that it does what you say it does in all possible cases", which
of course we all know is done by roughly 0% of programmers, especially with
emphasis on the "in all possible".

On the other hand, you might make a bigger development, that requires more
mathematical expertise. I suggest for your consideration that the correct
description of a person doing this latter work is not "programmer", but
"professor" (or grad student), and that this is relatively rare, and if this
is what you meant all along, you've been using misleading language. I think
you have it absolutely backwards when you say "this is an extremely rare
case"; I think the case where you need real mathematical proof techniques is
the extremely rare case, even in relatively math-heavy areas, e.g. numeric
simulation. What you need, as I've said before, are techniques of calculation.

Also, my ML knowledge is weak, but from my modest knowledge, I don't think any
of it works provably. It works pro _b_ ably. :) Seems a weird example to
choose.

At this point, btw, I acknowledge that I'm thoroughly into "quibble"
territory. =]

------
anonymoushn
This is good advice for programmers _who want to become better
mathematicians_. That seems to me to be distinct from wanting to be better at
applying mathematics to programming problems. In the same way that wanting to
be a better gardener is distinct from wanting to understand the aspects of
gardening that will help you write programs relevant to that domain, with the
obvious caveat that mathematics-related knowledge is relevant to a much wider
range of problems people try to solve using software than gardening-related
knowledge.

In school I appreciated the sorts of classes that stressed proof the most,
like topology or real analysis 2, because they made me a better mathematician.
When programming, I am frequently glad that I took some upper level math
class, but generally for reasons unrelated to whether I've actually _done
mathematics_ in that area. For instance, I could take advantage of the
equivalence of the square metric and the Manhattan metric in a 2-space (up to
a 45 degree rotation of the space) regardless of whether my topology class had
emphasized proofs. The attitude portrayed as absurd in the article ("I don't
want to write or read proofs, just use theorems") doesn't seem that far off
from one that would work pretty well. The main issue with that attitude is the
belief that a large body of theorems can be understood without understanding
any of their proofs. Programmers don't need to be able to write proofs that
aren't extremely trivial, but people who want to apply ideas from some area of
mathematics probably need to understand it in a deeper way than memorizing
theorems.
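The metric equivalence mentioned above is easy to sanity-check numerically. A Python sketch (function names are mine; the 45-degree rotation is scaled by sqrt(2) to keep the arithmetic exact, which is where the factor of 2 comes from):

```python
def chebyshev(p, q):
    """The "square" metric: the larger of the coordinate-wise distances."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

def manhattan(p, q):
    """The taxicab metric: the sum of the coordinate-wise distances."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def rot45(p):
    # A 45-degree rotation scaled by sqrt(2): (x, y) -> (x + y, x - y).
    return (p[0] + p[1], p[0] - p[1])

# After rotating, the Manhattan distance is exactly twice the original
# Chebyshev distance, since |dx+dy| + |dx-dy| == 2 * max(|dx|, |dy|).
```

So a nearest-neighbor query under one metric can be answered under the other by rotating the points first - the kind of borrowed theorem that's useful without its proof.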

------
gambler
The argument of the author sounds like an argument of someone who is addicted
to writing ugly Perl one-liners. Many, many, complicated and co-dependent one-
liners. "But I need this expressive power to do things!" Perhaps your primary
concern should be about explaining the things you did, rather than doing them.
Especially if your job is to explain something to people.

------
xwowsersx
I feel like so much of this discussion is way too general. It's impossible to
debate how/how much math is needed for "programming" without speaking about a
particular (programming) domain and the subset of math, if any, needed for it.

In other words, "no math is needed" and "you need proofs" are probably both
true. There's some realm of programming in which no math is needed, ever. And
then there's other programming that requires lots of math. But the math
required might be linear algebra in some cases and some other specific branch
of math in another area. This is just a reflection of the fact that
"programming" is an extremely wide area. There are some realms of programming
where the "programming" aspect of it is virtually subsidiary to the math: the
programming is just automating or systemizing some math, which is where the
real work is taking place.

So, my takeaway from seeing this discussion and similar comments so many
times is that everyone can agree that math is an enriching subject. However,
from the standpoint of its relationship with your programming, going really
deep into one particular area of math without a clear idea of if/how it's
going to help you accomplish something previously unattainable is probably not
the best use of your time.

------
thyrsus
He's analogizing to the fictional "Hitchhiker's Guide to the Galaxy". There
does exist an analog to the real world "Hitchhiker's Guide to the Galaxy":
Carl E. Linderholm's "Mathematics Made Difficult". Utterly hilarious.

Regrettably, it's out of print, and you have a choice of paying upwards of
$138.50 (abebooks.com) for a used copy of a book of jokes, or finding the
pirated pdf I once encountered in a search (I've no idea how complete that
was).

~~~
j2kun
My upstairs neighbor has this book, and I've been meaning to steal it from him
for a few months now :)

------
alok-g
+1 for the book recommendation alone [1]. I finally see a book on set theory
that explicitly states that the term "set" by itself is undefined.

Reading the first chapter preview at Amazon, I already see the need for a
human to explain something to me:

Two sentences from the book [1]: "A possible relation between sets, more
elementary than belonging, is equality." and "Two sets are equal if and only
if they have the same elements". These already seem to be contradicting each
other -- how can equality be more basic than belonging if the former is
defined in terms of the latter?

While I agree with OP on the difficulty I face when learning advanced
mathematics (notation, generally tacitly assumed by the authors), one thing
that also bugs me is the use of human language (!) as a part of the proofs. How do
I know that the proof is correct and is not impacted by something fundamental
about the language itself? Merely via inclusion of a human language in the
proof, a lot more axioms "may" have been included than what meets the eye.
Probably not, but I often find it hard to convince myself of this.

[1] www.amazon.com/Naive-Theory-Undergraduate-Texts-Mathematics/dp/0387900926

~~~
xyzzyz
_Two sentences from the book [1]: "A possible relation between sets, more
elementary than belonging, is equality." and "Two sets are equal if and only
if they have the same elements". These already seem to be contradicting each
other -- how can equality be more basic than belonging if the former is
defined in terms of the latter._

Axiom of extensionality states that the sets A and B are equal _if and only
if_ they have the same elements.

The point here is this: if A and B are the same set, then whenever some x
belongs to A, it also belongs to B, because B is just a different name for A.
Similarly, if something belongs to B, it also belongs to A. So, the
implication "A is B implies A and B have the same elements" is just a logical
tautology, axiom of extensionality does not say anything new in this case.

Now suppose you have two sets, C and D, such that whenever something belongs
to C, it also belongs to D, and when something belongs to D, it also belongs
to C -- in other words, they have the same elements. Can we conclude that C is
the same set as D? Without the extensionality axiom, no. That's why we usually
assume the axiom of extensionality, which says exactly that whenever sets have
the same elements, they are the same, so that we don't end up with the weird
situation of two sets that are indistinguishable within set theory, yet
different.
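As a concrete aside (my own illustration, not part of the parent comment):
Python's built-in sets happen to behave extensionally, so equality is
determined entirely by membership, never by how a set was constructed:

```python
# Python sets are extensional: two sets compare equal exactly when
# they have the same elements.
a = {1, 2, 3}
b = {3, 2, 1, 1}   # different order, duplicate collapses
print(a == b)      # True: same elements, so the "same" set

# Membership is the only thing that matters; construction method
# is not part of a set's identity.
c = set(range(1, 4))
print(a == c)      # True
```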

------
plainOldText
I know this is a bit off topic, but does anyone know of a source (book, video,
website, etc) which you could use to review the math from 0 to Calculus in one
day? A sort of big picture overview of how math evolves from simple to
complex. I've always wanted to do it, just to keep the math in my head fresh.

~~~
nborwankar
I'll do an outline here - then follow each outline item on Khan Academy for
steps and details:

* numbers - natural numbers, integers (+ve/-ve); addition and subtraction

* numbers - rationals, reals, multiplication/division

* numbers - fraction and decimal representations - addition/subtraction/mult/division of fractions and decimals

* sets and functions - mappings, single valued and multivalued, domain and range

* powers and exponents

* algebra - functions and equations linear equations, quadratic equations, simultaneous equations

* geometry and trig

^^ all that was pre-calculus, and I may have condensed the steps a lot

* limits <-- this is basic stuff for calculus - all of calculus is about starting with finite expressions and then taking limits as something becomes infinite or infinitesimal.

* continuity of functions

* derivatives

* mean value theorem

* integrals

Differentiation and integration are often taught in that order, but
integration is much easier to understand intuitively, since areas and volumes
are concrete and tangible; velocity as a vector tangent to the direction of
motion is less tangible to most.
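To make that intuition concrete, here is a rough Python sketch (my addition,
not the commenter's) approximating an area with a Riemann sum - the finite
expression whose limit is the integral:

```python
# Approximate the area under f(x) = x**2 on [0, 1] with n rectangles.
# As n grows, the sum approaches the exact integral, 1/3 -- taking
# that limit is what the integral *is*.
def riemann_sum(f, a, b, n):
    width = (b - a) / n
    # Sample f at the left edge of each of the n rectangles.
    return sum(f(a + i * width) * width for i in range(n))

f = lambda x: x * x
for n in (10, 100, 1000):
    print(n, riemann_sum(f, 0.0, 1.0, n))
# The printed values approach 1/3 from below as n increases.
```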

After this point you take a sidestep and start the process again at a higher
level with

* sequences and series (limits on steroids)

* linear algebra (simultaneous equations on steroids)

* multivar calc (calculus on steroids)

Then there's complex analysis, differential equations, partial differential
equations and so on.

Note that here I am taking the usual, more 'applied' approach to math, which
is tangible. The more 'pure' approach - abstract and much harder for most - is
to go: logic and number systems, real analysis, algebra, complex analysis,
measure theory / probability, functional analysis ... which is what you'd do
if you were on a pure math rather than an engineering math track.

Hope that helps.

~~~
plainOldText
Wow, thanks. You took the time to write all that. That is awesome.

------
compee
I work on mathematics applied to various computer science problems with a
couple of startups.

If anyone is interested in learning a bit of mathematics to structure programs
(for real-world usage, even!) feel free to let me know what you'd be looking
for. I'd love to help!

------
Uncompetative
The Hitch-Hiker's Guide to the Galaxy already parodies Maths:

<http://hhgproject.org/entries/bistromathics.html>

Clearly, Douglas Adams spent a lot of time in Bistros, as the second review on
this site echoes his sentiments about the odd behaviour of waiters and
nonabsolute numbers such as time of arrival:

<http://www.yelp.co.uk/biz/giannis-italian-bistro-san-ramon>

------
ekm2
I think contest math is the best kind of math for a programmer, since it sets
out to solve problems rather than necessarily come up with theories like
regular math.

------
nborwankar
Re: the original post - IMHO it has many logical flaws, and while it is trying
to draw attention to the fact that programmers need to learn math, it may end
up scaring people away.

Background: I have a BS in EE from IIT Bombay, an MS in Applied Math from USC
and an almost-PhD (everything but thesis) in Pure Math.

I also loved teaching math as a TA and was able to teach engineers
multivariable calculus and differential equations so that they lost their fear
of it.

But for the last 20 years I have been in the database business. In the last
year I have been very enthusiastically returning to math in the form of data
science.

I find that especially today it is necessary for math to be made accessible
without having to prove theorems - that's for professional mathematicians.

Programmers use hardware often without knowing diddly-squat about
semiconductors. And that's just fine if you don't want to or need to. We drive
cars, use mobile phones and all sorts of machinery without needing to know
anything about internal combustion or the CDMA algorithms or even how theory
of relativity figures into GPS.

I think the insistence on setting a high bar for people to learn/use/apply
math is unnecessary mystification and obscurantism.

Math is beautiful, useful and powerful. If you know math you should want to
simplify its teaching and instruction, thinking hard about making it
useful for the non-mathematician and also making it easier to access in a
breadth-first fashion rather than the depth-first fashion in which it is
currently practiced.

I know many people, engineers and mathematicians who believe that
simplification is "dumbing down". I disagree vehemently.

Today if you are a programmer and want to learn math, I have a few
suggestions:

a) If you want to apply it in engineering, take a couple of classes on
Coursera/Udacity - it will be a slow and steady way to re-activate your math
neurons (it's also insurance against Alzheimer's to keep learning new and hard
stuff).

b) Learn some statistics, learn to use R. (Coursera again)

c) Learn some linear algebra, minimal, then learn MATLAB (Andrew Ng's Machine
Learning class on Coursera, or a similar one by Abu-Mostafa of Caltech)

d) If you absolutely don't want to learn a new language then look at Machine
Learning by Marsland, which uses Python, and then pick up NumPy/SciPy and look
up the exciting stuff that's happening with the IPython Notebook.

e) If your focus is on pure CS, then take the Algorithms track(s) by
Roughgarden (Stanford) or Sedgewick (Princeton) on Coursera.

This way you don't have the notation issues - R has a syntax, MATLAB has a
syntax; just learn it. Python hopefully is not even an issue.

Start using your math brain a little at a time - absolutely no proofs required
- and THEN once you see the beauty and want to learn more and want to learn
why, THEN dig deeper with the fundamentals. If you don't that's fine too - use
it, put its power to use and become a better developer and engineer. Math is
not trivial but it's not as hard as it's made out to be. There's way too much
noise in the channel.

Good luck.

(feel free to connect with me on twitter if you want more help (@nitin))

~~~
j2kun
I certainly don't think programmers need to learn math. The vast majority of
programming requires no math, and there are tons of out-of-the-box theorems
and algorithms that programmers can use successfully to do wonderful things.

I am just pointing out why math can scare programmers away. The first step in
overcoming that fear is to identify it.

And the fact is that programmers do need to tweak the algorithms they use.
Your analogy with a car works perfectly: can you expect someone who only knows
how to drive a car to be able to convert a sedan into something that can go
off-road? Then how can you expect someone who only knows how to call a
function to alter its contents to suit their particular application?

I also think that I haven't set a high bar. The basic four methods of proof
are quite a low bar (this could, and should, be taught in high school
mathematics, and it baffles me why it's not). Learning common syntax will come
as you learn.

~~~
nborwankar
It's true you may have set a low bar re: proofs, I just don't think any formal
proofs are needed at the first step.

Learning about numerical convergence and computer arithmetic, so you can
understand why your MATLAB linear algebra program is not giving a useful
result, is far more valuable and doesn't need any proofs. It needs a knowledge
of how to _use_ theorems and results, and what the conditions are for a result
to be applicable. But there is no need to do the proof, especially since many,
many proofs are quite idiosyncratic and give no help in suggesting how one
comes to think like that.
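A tiny Python sketch (my own illustration, not the commenter's) of the kind of
computer-arithmetic surprise meant here - no proofs needed, just awareness:

```python
# Floating-point arithmetic is finite-precision binary, so familiar
# decimal identities quietly fail.
a = 0.1 + 0.2
print(a == 0.3)              # False: a is actually 0.30000000000000004
print(abs(a - 0.3) < 1e-9)   # True: compare with a tolerance instead

# Absorption: adding 1.0 to 1e16 is lost entirely, because the gap
# between adjacent doubles near 1e16 is 2.
print((1e16 + 1.0) - 1e16)   # 0.0, not 1.0
```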

Using the analogy of the car, I am talking about teaching someone who rides a
bike to drive a car. My explicit non-goal is being a mechanic of _any_ kind.
So your example of modifying your car is a strawman. That's exactly what I
assert a student driver doesn't need and shouldn't have to do. It is going to
be pretty much useless in actually learning how to steer and how to follow
traffic rules. One is about proper usage, the other is about understanding
internals. Sorry, I still disagree, and reassert that proofs are no use at all
and a huge distraction for programmers trying to learn math. There is so much
valuable and useful work a programmer can do even using existing methods
without ever modifying them. Especially with special-purpose languages like R
and MATLAB - the underlying algorithms are very mature and a first-time user
is not expected to modify them.

Quite separately, I think it's extremely valuable to learn/relearn the
foundations of math, but this is for anyone and not specific to what a
programmer needs to do useful things with math. Yes, it should be taught in
high school math. It's not taught because the teachers didn't learn it that
way, and so it goes ...

------
DanBC
Sorry, but Helvetica Neue and font weight 300 make the page unreadable for a
number of people.

------
ilaksh
Math is just obfuscated code that you have to run in your head.

------
vorvzakone
tl;dr: i'm a snobby research student who is trying to trash developers who
didn't need college. (yeah. that was an ad hominem. come at me.)

~~~
sthatipamala
That's not at all what the author is trying to say. He's pointing out the
shortcomings of mathematics notation and why that can be off-putting to
otherwise motivated programmers.

~~~
NegativeK
Not only is it not at all what the author is trying to say, I'm honestly
confused at how you could arrive at that.

