
Mathematicians are chronically lost and confused  - aditgupta
http://j2kun.svbtle.com/mathematicians-are-chronically-lost-and-confused
======
nly
Secondary math education, for me in the UK, didn't deal with anything outside
of elementary algebra, Euclidean geometry, some statistics, and relatively
simple calculus. Nobody talked to us about imaginary or complex numbers,
Bayes' theorem, decision theory, or non-trivial mechanics problems until I was
in college (age 16+). Nobody mentioned matrices, broader number theory or
discrete transforms until I was in university. I studied EE not compsci.
Things like algorithmic complexity I had to learn for myself and from Knuth.
I'm trying to grok group theory right now to help with my understanding of
crypto. Before this, it was never mentioned throughout my education, so I
don't know what courses you would have had to take to learn that. The fact
that I didn't even know group theory was important to crypto until after I had
made the choice strikes me as a bad sign.

The common theme at every level is learning cherry-picked skills, before
you're even _told what the branches of mathematics even are_. Everything seems
disjointed because you're not taught to look past the trees for the forest.
Most people, in fact, even technical folk, go through their entire lives without
knowing the forest even exists. Any idiot can point to a random part of their
anatomy and posit that there's a field of study dedicated to it. The same goes
for mechanics or computer science. You just can't do that with mathematics as
a student.

I loathe academic papers. Often I find I spend days or weeks deciphering
mathematics in compsci papers only to find the underlying concept is intuitive
and plain, but you're forced to learn it bottom up, constructing the author's
original genius from the cryptic scrawlings they left in their paper... and
you realise a couple of block diagrams and a few short paragraphs could have
made the process a lot less frustrating.

So many ideas seem closed to mortals because of the nature of mathematics.

~~~
cs702
_Often I find I spend days or weeks deciphering mathematics in compsci papers
only to find the underlying concept is intuitive and plain, but you're forced
to learn it bottom up, constructing the author's original genius from the
cryptic scrawlings they left in their paper... and you realise a couple of
block diagrams and a few short paragraphs could have made the process a lot
less frustrating._

This is SO TRUE.

The same thing happens to me regularly, and not just with "computer science"
but with other technical fields, hard sciences, and mathematics. The purpose
of most academic papers is not to explain (let alone teach!) ideas in an
intuitive manner, but rather to express them in formal, correct, unambiguous
terms -- that is, to make them as _accurate_ and _critique-proof_ as possible
for publication in some journal.

~~~
coldpie
Just as a specific example, I had this experience with Bayes' Theorem
<[http://en.wikipedia.org/wiki/Bayes%27_theorem>](http://en.wikipedia.org/wiki/Bayes%27_theorem>).
As an informal paper for my computer security class, we used Bayes' theorem to
implement aimbot detection in a simple FPS. It sounds like a big, complicated
theorem with a special name that some genius had to come up with and has
complicated notation involving probabilities and logic symbols.

And then it's basically (paraphrasing with reckless abandon) just the
probability of your event divided by the total probability space. Lots of
words and jargon
and theory given in countless papers and articles, and it pretty much just
boils down to intuitive addition, multiplication, and division.

And our aimbot detector actually worked pretty damn well! Just gather some
data points to determine probabilities, plug them into the simple formula, and
it was always correct in our test cases.
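
The arithmetic really is that plain. Here is a minimal sketch of the Bayes' theorem calculation for a detector like the one described (the probabilities and the "headshot streak" feature are invented for illustration; the original project's data isn't described in enough detail to reproduce):

```python
# Bayes' theorem: posterior = likelihood * prior / total probability of
# the evidence. All numbers below are made up for illustration.

def p_aimbot(p_evidence_given_bot, p_evidence_given_human, p_bot):
    """P(bot | evidence) via Bayes' theorem."""
    p_human = 1.0 - p_bot
    numerator = p_evidence_given_bot * p_bot
    evidence = numerator + p_evidence_given_human * p_human
    return numerator / evidence

# say this headshot streak shows up in 60% of known bots, 2% of humans,
# and 5% of players run bots:
posterior = p_aimbot(0.60, 0.02, 0.05)
print(round(posterior, 3))  # → 0.612
```

A rare event that is common among bots pushes the posterior well above the 5% prior, which is exactly the "plug the data points into the simple formula" step.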

~~~
applesinjuice
Fixed link for the lazy:
[http://en.wikipedia.org/wiki/Bayes_theorem](http://en.wikipedia.org/wiki/Bayes_theorem)

~~~
coldpie
Thanks. I reported this bug nearly two years ago :)

[https://news.ycombinator.com/item?id=4112327](https://news.ycombinator.com/item?id=4112327)

------
japhyr
I currently teach math to at-risk students. I don't read all of these
submissions about math education, but I skim the comments on most of them. The
comments people make change the way I teach math.

I have always done a decent job of teaching math. I focus on helping students
understand concepts, even when they are focusing on mechanics. I use words
like "shortcut" and "more efficient method" rather than "trick" when showing
students more efficient ways to solve problems. I have students do problems
and projects that relate to their post-high-school goals.

But with the routines of school life, I get away from the fun of math from
time to time. The comments on these submissions often remind me to go in and
just tell stories about math:

\- "Hey everyone, did you know that some infinities are bigger than other
infinities?"

\- "Hey everyone, do you have any idea how your passwords are actually stored
on facebook/ twitter/ etc.?"

\- "Have any of you heard the story about the elementary teacher who got mad
at their class, and told everyone to add up all the numbers from 1 to 100? One
kid did it in less than a minute, do you want to see how he did it?"
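
The trick in that last story (it's the famous Gauss anecdote) checks out in a couple of lines: pair 1 with 100, 2 with 99, and so on, giving 50 pairs that each sum to 101, i.e. n(n+1)/2:

```python
# Gauss's trick: 50 pairs (1+100, 2+99, ...) each summing to 101,
# so the total is n * (n + 1) / 2 -- no minute of adding required.
n = 100
gauss = n * (n + 1) // 2       # 50 pairs * 101
brute = sum(range(1, n + 1))   # the long way the teacher intended
print(gauss, brute)  # → 5050 5050
```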

Thanks everyone, for sharing your perspective on your own math education, and
about how you use math in your professional lives as well. Your stories help.

~~~
csours
I love stories like these. I haven't gone much past calculus, what stories can
I look for that will take me farther?

~~~
wetmore
You may recall that one can use the quadratic formula to find solutions to a
quadratic equation (polynomial with highest term being x^2). Did you know that
one of the guys who proved there is no such formula for quintic equations
(highest term x^5) died in a duel when he was only 20? [1]

There is a town with a particular rule when it comes to facial hair: those who
do not shave themselves are shaved by the barber. But then who shaves the
barber? [2]

The other poster mentioned different infinities. One "size" of infinity is
called "countable infinity" and is the infinity describing the size of the
natural numbers (1,2,3,...). Say we have a hotel with a countably infinite
number of rooms. I've been travelling all day and I show up at the hotel, and
talk to the clerk at the front desk. He tells me every room is full, but when
he sees the sad look on my face he tells me not to worry - he can make room
for me. He simply moves the person in room 1 to room 2, the person in room 2
to room 3, room 3 to room 4, etc... And then the first room is empty for me,
and everyone still has a room. [3]
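
The clerk's trick is just the map n → n + 1: every guest lands in a distinct room, yet room 1 ends up assigned to nobody. A quick finite peek at the idea (my sketch, only looking at the first ten guests):

```python
# Hilbert's hotel shift: guest in room n moves to room n + 1.
# No two guests collide, and room 1 is left with nobody assigned to it.
def new_room(n):
    return n + 1

occupied_after_move = {new_room(n) for n in range(1, 11)}  # first 10 guests

print(sorted(occupied_after_move))  # rooms 2 through 11
print(1 in occupied_after_move)     # → False: room 1 is free
```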

[1]
[http://en.wikipedia.org/wiki/%C3%89variste_Galois](http://en.wikipedia.org/wiki/%C3%89variste_Galois)
[2]
[http://en.wikipedia.org/wiki/Russell's_paradox](http://en.wikipedia.org/wiki/Russell's_paradox)
[3]
[http://en.wikipedia.org/wiki/Hilbert's_paradox_of_the_Grand_...](http://en.wikipedia.org/wiki/Hilbert's_paradox_of_the_Grand_Hotel)

------
acjohnson55
I've felt this is the case for a long time. A lot of people have a smooth
experience in math for years until they hit their first serious discontinuity.
That could happen anywhere: times tables, fraction arithmetic, two-step
equations, geometric proofs, radicals, limits, or maybe even college math. The
reaction is nearly universal though. The person thinks, "holy crap, I guess
I'm actually _not_ good at math", anxiety strikes, and they freeze up.

Some people eventually find their way around this first roadblock, future
discontinuities in understanding become less stressful, and eventually they
are understood to be a completely normal part of the process.

But the usual experience is that a person's math confidence is blown and as
the math truck barrels on ahead, they never catch up. They understandably
accept the identity of not being "good at math".

What's missing in math pedagogy at most schools is a systematic way to deal
with the discontinuities when they strike, especially that first time. We can
prepare students to deal with that panic. The tough part is that the math
teacher probably has 90 students on roster, but the discontinuity could hit
pretty much any given lesson, for some given student.

I know so many people who have come back to intermediate math later in life
and breezed through it, armed with intellectual confidence gained from other
fields. They look back and wonder how they came to be so intimidated by math
in their younger days. We've got to give younger people the tools and
knowledge for overcoming this intimidation at a younger age. We've got to kill
"I'm just not good at math".

~~~
japhyr
_as the math truck barrels on ahead_

I've been teaching math to at-risk high school students for the last 10 years.
I have spent more time than anything else helping students understand that
they are not stupid, that something just got in the way of their learning at
one point and nothing made sense after that. I'm going to use your quote in
some of these conversations now.

What most of my students think: "I could never do math, I fucking hate it, and
I might drop out because I will never finish my math credits. I can't do math
because it's stupid and meaningless and I will never get it."

What really happened to get people off track?

\- Some just didn't follow one topic in some early grade, nothing else made
sense after that, and no teacher was prepared to get them back on track.

\- Parents split up, student couldn't focus in school for 6 months, they got
off track.

\- Parent/ sibling/ significant person passed away when student was young,
couldn't focus for 6 months-2 years, no way to get back on track.

Any number of other external events happen, and it is perfectly reasonable for
students to get off track in math.

 _a systematic way to deal with the discontinuities when they strike,
especially that first time_

Exactly. I would like to see every elementary school have a math specialist,
who knows advanced math, to help students with their overall understanding
when they get off track. Helping a kid master some mechanics does a little to
get them back on track, but diagnosing misunderstandings takes more math
expertise than most elementary teachers have.

I could go on forever; thank you for putting some of these issues so clearly
in focus.

~~~
jackmaney
> I would like to see every elementary school have a math specialist, who
> knows advanced math, to help students with their overall understanding when
> they get off track.

Note: This response is US-centric.

These individuals are exceedingly rare (if they exist at all). In fact, I
would be absolutely shocked if 100 such people existed. College students who
go into Elementary Education are stereotypically terrified of mathematics, and
they have (at most) one required math course. This course is a "general
methods" course that essentially acts as a survey of the elementary school
mathematics that they'll be teaching.

~~~
pflats
There are more than you think, I assure you. Consider the people who write
elementary curricula, who implement them in large cities, who teach middle
school and high school mathematics but might prefer to teach _just
mathematics_ at an elementary level. Consider NCTM[1], TERC[2], EDC[3], and
UChicago[4], and their programs and work. Consider the math coaches, who
instruct their peer elementary teachers on teaching mathematics.

Until these positions exist, are respected, and _are not first on the chopping
block the next time budget cuts roll around_, these people will continue to
exist under the radar. (I'd gladly transfer into an Elementary Math Specialist
position, if I were sure it wouldn't threaten my family's livelihood.)

[1]:
[http://www.nctm.org/resources/elementary.aspx](http://www.nctm.org/resources/elementary.aspx)

[2]:
[https://www.terc.edu/display/About/Mission+and+Vision](https://www.terc.edu/display/About/Mission+and+Vision)

[3]: [http://ltd.edc.org/mathematics](http://ltd.edc.org/mathematics)

[4]: [http://everydaymath.uchicago.edu](http://everydaymath.uchicago.edu)

------
yomritoyj
Mathematicians are indeed lost and confused, but in a very different way from
beginning students. One must pay one's dues in what Terence Tao calls the
"rigorous" phase before one can become productively confused in the "post-
rigorous" phase.
[http://terrytao.wordpress.com/career-advice/there%E2%80%99s-more-to-mathematics-than-rigour-and-proofs/](http://terrytao.wordpress.com/career-advice/there%E2%80%99s-more-to-mathematics-than-rigour-and-proofs/)

------
api
I completely agree about the power of math, and why programmers should learn
it. There are two problems with math:

(1) Math is IMHO the worst taught of all academic subjects.

It's taught as if it were not a language. Math profs and books on mathematics
_never_ explain what the symbols mean. They just throw symbols at you and then
do tricks with them and expect you to figure out that this symbol means
"derivative" in this context. I have literally seen math texts that _never_
explain the language itself, introducing reams of new math with no definitions
for the mathematical notation used.

I've looked for a good "dictionary of math" -- a book that explains every
mathematical notation in existence and what it means _conceptually_ -- and
have never found such a thing. It's like some medieval guild craft that is
passed down only by direct lineage among mathematicians.

Concepts are often never explained either. I remember struggling in calculus.
The professor showed us how to do a derivative, so I mechanically followed but
had no idea why I was doing what I was doing. I called up my father and he
said one single sentence to me: "A derivative is a rate of change."

 _A derivative is a rate of change._

I completed his thought: so an integral is its inverse. Bingo. From then on I
understood calculus. The professor never explained this, and the textbook did
in such an unclear and oblique way that the concept was never adequately
communicated. It's one g'damn sentence! The whole of calculus! Just f'ing say
it! "A derivative is a rate of change!"
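
That one sentence, and its inverse, can even be demonstrated numerically (my illustration; step size and function chosen arbitrarily): approximate the rate of change of f(x) = x² with finite differences, then add those rates back up and recover the change in f.

```python
# "A derivative is a rate of change": f'(x) is how fast f changes at x.
# Approximate it with finite differences, then add the rates back up
# (a crude integral) and recover the original change in f.
h = 0.001
xs = [i * h for i in range(1001)]  # grid over [0, 1]
f = [x * x for x in xs]            # f(x) = x^2

# rate of change between neighbouring points: approximates f'(x) = 2x
rate = [(f[i + 1] - f[i]) / h for i in range(1000)]

total = sum(rate) * h  # summing the rates recovers f(1) - f(0)

print(round(rate[500], 2))  # → 1.0 (≈ 2 * 0.5)
print(round(total, 2))      # → 1.0 (= f(1) - f(0))
```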

(2) The notation is horrible.

If math were a programming language it would be C++, maybe even Perl. There
are many symbols to do the same thing. Every sub-discipline or application-
area of mathematics seems to have its own quirky style of notation and
sometimes these styles even conflict with each other.

Yet baroque languages like C++ and Perl at least document their syntax. If you
read an intro to C++ book it begins its chapter on templates by explaining
both what templates are for and the fact that type<int> means "type is
templated on int."

Math doesn't do this. It doesn't explain its syntax. See point #1 above.

~~~
j2kun
I agree wholeheartedly with how frustrating it is. I think part of the problem
is that really great mathematicians are encouraged to stay as far away from
teaching (and improving their teaching) as possible, and great teachers are
often discouraged from pursuing more mathematics for a variety of reasons. And
when I personally teach calculus I make sure to explain derivatives in the way
you want on the very first day (before describing limits or anything else).

As to your second point, I think notation is a big problem, but it's a bit of
a straw man. With very few exceptions that I doubt you would ever find
yourself in, I have never met a professor or mathematician that would not
explain notation if you asked (gladly stopping in the middle of a lecture or
talk to clarify). There is still a lot of it, but every mathematician who is
presenting the mathematics can explain the notation to any degree of precision
you could ever want, and nearly all of my colleagues have stopped a speaker
for exactly this reason at some point.

I think the bigger problem is trying to read mathematics by yourself, without
the ability to ask questions. And even after understanding the notation, I
feel programmers have bigger problems, which I've expanded on in this post
[1]; the main difference from learning programming is that there are simply
more free and open resources for learning programming. This is probably
because programmers invented the internet and filled it with their favorite
content first.

But one point I make is that mathematical notation is inherently ad-hoc, and
the only kinds of notation that stick around are the kinds that get used ad-
hoc enough times to become standard. And even then people will make up their
own notation for no other reason than that it's their favorite (Physicists are
really good at this, and perhaps ironically it drives mathematicians crazy).
Because of that (and because notation is introduced often to be rigorous, not
to explain a concept) you're unlikely to ever find such a dictionary. Sorry :(

[1]: [http://jeremykun.com/2013/02/08/why-there-is-no-hitchhikers-guide-to-mathematics-for-programmers/](http://jeremykun.com/2013/02/08/why-there-is-no-hitchhikers-guide-to-mathematics-for-programmers/)

~~~
api
The problem is really very simple.

First you teach the basics of the language. Then you teach how to express
concepts in that language and what those concepts _mean_. Finally, you teach
how to manipulate those concepts to build new higher-order forms.

Mathematics is taught like this:

First, students are shown how to manipulate symbols they do not understand.
During this process, sometimes (if you're lucky) these symbols are explained
in a piecemeal and oblique way. Sometimes conceptual meaning is discussed at
the end to wrap things up (oh by the way this is what you'd use this for, now
let's move on), but this is rare. Mostly you just get elaborate dances of
symbols thrown at you with no explanation to tie what you're doing to any
problem, reality, or conceptual meaning. In the end most students end up
memorizing these meaningless opaque incantations and never understand why
anyone would be interested in math.

~~~
cma
I don't think just doing everything from first principles is reasonable.
Building up the natural numbers from set theory etc. would just be frustrating
for kids.

It would be like trying to teach them their native language by grammar
diagrams instead of immersion:

[http://en.wikipedia.org/wiki/Language_immersion](http://en.wikipedia.org/wiki/Language_immersion)

Math is not exactly the same as natural language, but there are tradeoffs to
doing it one way or the other and there needs to be balance.

------
ColinWright
I highly recommend reading this. I didn't agree 100% with everything, and you
probably won't either, but it's an excellent insight into what learning and
doing math is about, and what it's like.

I'd love to read alternate viewpoints, but this is an excellent read.

------
chwolfe
The entire post was enjoyable but I found the last paragraph to have the most
actionable advice:

 _What’s much more useful is recording what the deep insights are, and storing
them for recollection later. Because every important mathematical idea has a
deep insight, and these insights are your best friends. They’re your
mathematical “nose,” and they’ll help guide you through the mansion._

------
zacinbusiness
I really enjoyed this because it captures so much of the frustration that I
felt early in my programming career - especially in college when I had
classmates several years my junior who were (as far as I could tell)
mathematics and programming wunderkinds. I also think that this is the sort of
rhetoric that should be used to begin teaching children basic mathematics and
more advanced concepts as well, because I still recall many of my classmates
in elementary and even high school who simply felt like failures, or that they
weren't smart enough to understand things because they didn't "get" it the
first, or fourth, or fiftieth time.

------
dalke
"If you’re going to get anywhere in learning mathematics, you need to learn to
be comfortable not understanding something."

This is true for all research.

And I don't mean just the physical sciences either. Historians and
sociologists are also chronically "lost and confused." Otherwise it wouldn't
be a topic worthy of study.

This is why students who are "good at X", whether it be math, German, sports,
or programming, may become frustrated when they find out that "good at
researching X" is a very different matter.

~~~
j2kun
I think it's a good point, but I still think the kind of lost and confused in
mathematics is more embarrassingly extreme. Imagine a few hundred historians
trying to discern when King George I died, and after 50 years of work they
conclude, "All we know for sure is that it was between the day he was born and
yesterday." A startlingly large part of mathematics feels like this.

And I think the reason is that "prevailing theories" mean nothing in
mathematics.

~~~
dalke
That's a poor comparison. I find it hard to believe that mathematicians are
still trying to decide if the set {1, 2, 3} is finite or infinite.

A "startlingly large part" of all science fields is like this.

Physicists don't even know if the gravitational mass of an object is really
the same as its inertial mass. Or if there are true magnetic monopoles. And
that's after over a century of trying.

Immunologists have barely scratched the surface of how that field works.

Economists make lots of conjectures, with lots of math to back it up, but it's
not a perfect predictor of the human economic system.

Biological evolution still surprises us, 150 years after Darwin and nearly 100
years after the neodarwinian synthesis.

Chemists still don't come close to handling some of the reactions that natural
systems have figured out.

And so on.

~~~
j2kun
Good point.

------
weavie
I started off doing a combined maths and computer science degree.

With both computer science and maths you are chronically confused. The
difference being with computer science it doesn't matter so much if you don't
understand something, if you can get it to work you know you are on the right
track. Maths is much more cumulative: each proof builds on a previous one, so
if you fail to understand one step you are screwed from that point on.

After the first year I realised I didn't actually enjoy being permanently
confused and so I ditched the maths to focus on computers. I do regret this.
It didn't take long at all before I forgot all that knowledge I had spent
years sweating over.

------
bpyne
I wish this post had been around when I finished my undergraduate degree in
Mathematics. I would have taken my adviser's advice to go to grad school. At
the time, I remember telling him that I felt like I had barely made it through
the program. Apparently I wasn't alone. Amazing the difference 25 years and
the internet make.

------
fidotron
This misses the dangerous part, which is that mathematicians in groups can
confuse each other into accepting ideas which are basically nonsensical,
especially if the counter-argument relies on some obvious but merely intuitive
observation of reality that cannot be easily formalised within their chosen
framework of the moment.

As a consequence, it wouldn't surprise me if the overwhelming majority of
maths were actually incoherent nonsense, and the people who understood this
thought they were just very confused due to being shouted down all the time,
while the really confused people are the ones oblivious to their own
situation.

~~~
pflats
I'm going to be rather dismissive in my reply, and for that, I apologize,
because I'm not quite sure how else to respond.

This is more or less a non-issue. Thanks to mathematicians building on Euclid
for the last 2300 years, we have a system of mathematics built on a few basic
principles (that you would not disagree with) and deductive reasoning. If you
take a theorem that is accepted as proven, you can almost definitely follow an
immense chain of logic back to the fundamentals. It will take you a ridiculous
amount of time to do so, but it is possible.

If you're referring to specific debates in the math community (e.g. "I feel
that the general math community accepting the axiom of choice was a bad idea")
then that's worth being specific about in your post.

~~~
AimHere
>we have a system of mathematics built on a few basic principles (that you
would not disagree with) and deductive reasoning

I think it's even better than that; mathematicians don't necessarily care
whether the reader 'agrees' with the axioms, or whether they're in any sense
'true' or 'false'. Mathematics is always of the form 'if these axioms are
true, this theorem follows from it'.

The real world and the notions which people consider to be self-evidently true
is just some messy slimy gooey gunk best left to psychoanalysts and
theoretical physicists and sewer workers and the like.

~~~
pflats
Oh, I agree, that's the best part. Pick your rules: oh, you picked those
seven? You've got a ring; here are your math rules!

In that specific case, though, I figured I'd point out that the basic rules of
math aren't usually things people squabble over. (Although I do enjoy a little
mathematical philosophy from time to time.)
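
As an illustration of "you picked those seven? You've got a ring" (my example, not from the thread): the integers mod 5, with the usual addition and multiplication, pass a brute-force check of a sample of the ring axioms.

```python
# Brute-force check that Z/5Z satisfies a sample of the ring axioms.
from itertools import product

R = range(5)

def add(a, b):
    return (a + b) % 5

def mul(a, b):
    return (a * b) % 5

# addition is associative and commutative
assert all(add(a, add(b, c)) == add(add(a, b), c) for a, b, c in product(R, repeat=3))
assert all(add(a, b) == add(b, a) for a, b in product(R, repeat=2))
# 0 and 1 are the additive and multiplicative identities
assert all(add(a, 0) == a and mul(a, 1) == a for a in R)
# every element has an additive inverse
assert all(any(add(a, b) == 0 for b in R) for a in R)
# multiplication distributes over addition
assert all(mul(a, add(b, c)) == add(mul(a, b), mul(a, c))
           for a, b, c in product(R, repeat=3))
print("Z/5Z passes these ring checks")
```

Swap in different operations or a different modulus and the same checks tell you whether your chosen rules still give you a ring.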

~~~
pseut
Even better is when, to everyone's surprise, those seven turn out to be
relevant for describing real world phenomena.

------
ChristianMarks
Fair enough. I tried in my youth to solve every problem I came across. There
were many I couldn't solve. It took a while before I developed the wisdom and
discipline not to solve every problem no matter how long it took. By a while I
mean decades. I sacrificed the possibility of family life, have stopped
talking to my uncomprehending stepfather, and have kept my social interactions
to an absolute minimum to pursue my consuming interest. (I mention this as a
point of pride.) I find myself continually astonished by the ingenuity of
solutions I probably could never have imagined after years of work. Perhaps,
after a lifetime of effort that must be continually maintained, I have
attained the level of an entering freshman at Harvard. At this stage, I may be
reduced at best to connoisseurship of some aspects of mathematics.

Now for some reflections on attitudes. Mathematicians sometimes act as if they
believe that expertise in mathematics transfers to expertise in mathematics
education. Suppose you are a sensitive student, lacking in confidence. You
open Körner's beautiful book on Fourier Analysis, and the first thing you are
greeted with is "This book is meant neither as a drill book for the successful
student nor as a lifebelt for the unsuccessful student." Körner does not
mention other references suitable for the successful and the unsuccessful
student. You take this comment to mean that Körner would let the unsuccessful
student drown. There is no implication, but this is the psychological import,
the implicature. Why mention the unsuccessful student at all? Why not say who
the book is for, without planting this gratuitous image in the reader's mind?
It would take some time to return to this book, to get past the wonder at a
mind capable of such an incidental, dismissive, off-handed acknowledgement of
"the unsuccessful student."

You could say this is "overthinking." Such remarks, microaggressions as they
are termed today, "perpetrated against those due to gender, sexual
orientation, and _ability status_", are sometimes revealed in the asides of
mathematical authors [1].

And now if only mathematics educators would evaluate their students on the
state of their confusion!

[1]
[http://en.wikipedia.org/wiki/Microaggression](http://en.wikipedia.org/wiki/Microaggression)

------
graycat
No, the OP is giving bad advice.

Reading good foundational text books carefully is darned good advice. But for
solving every exercise before moving on, no, that's not a good idea. Instead,
be willing to be happy solving some 90-99% of the exercises. For the rest,
guess, with some evidence, that they are incorrectly stated, out of place,
just too darned hard, or some such. If you insist on solving 100%, then get on
the Internet and look for solutions.

Next, if you read some foundational text books, then in each subject also read
several competing text books: rely mostly on one, but also look at least a
little at the others for views from 'a different angle' that can be a big
help. Why? Because likely no text book is perfect and, instead, in some places
is awkward, unclear, misleading, clumsy, etc. So, views from a 'different
angle' can make it much easier to learn both better and faster.

His description of doing applications by just getting what you really need and
forgetting the rest can be done, but is not so good. Instead, having a good
foundation helps a lot. And, commonly for an application in an important
field, there really is some good material in that field that you should
understand along with the application. Else you risk doing the application
significantly less well than you could have.

His description from Wiles is more or less okay for doing some research but,
really, not for learning. And for research, more of a 'strategic' overview,
with the 'lay of the land', would be good: for publishing not just one okay,
likely isolated, paper but a series of better papers that yield a nice
'contribution'.

~~~
nbouscal
You misread the article. He is not recommending that students solve every
exercise; he's recommending the exact opposite.

~~~
graycat
No, I'm correct: He set up an extreme straw man to knock it down. I clearly
agreed that his extreme straw man is foolish. There is a common reason
students fall for his straw man: They are concerned that if there is an
exercise they can't work they are missing something important. My advice was,
instead, for a very diligent student, to solve 90-99% of the exercises and
just let go of the last few as ill-posed, stated in error, or out of place, or
to use the Internet, etc.

To do just the "opposite" of his straw man is not good -- for solid
foundational material, Halmos, Rudin, Royden, etc., the exercises are darned
important. Right, the Rudin exercises where you have to consider
uncountability are not so good. The Royden exercises on upper and lower
semi-continuity are a lot of work for a little curiosity you likely won't see
again. The Fleming exercise, that every bounded linear functional on an
intersection of finitely many closed half-spaces achieves a maximum value, is
misplaced. Etc. The abstract algebra book I had had an exercise where the
student had to reinvent Sylow's theorem; a student wrote the author and got
back a letter that the purpose of the exercise was to see if a student could
reinvent Sylow's theorem -- bummer, misplaced exercise.

I'm correct.

~~~
nbouscal
This is a good example of how being correct is completely irrelevant if you
can't communicate it well. That said, I still maintain that you're thoroughly
misunderstanding the position the OP was arguing for.

~~~
j2kun
For the record, I'm saying three things:

1\. I hear about lots of people trying to do every exercise.

2\. I think this is bad if it causes them to quit out of frustration.

3\. I provide some tips on how to deal with frustration, and to know that
you're in good company.

------
napowitzu
This is true with many, many things. Very often it is the connections between
ideas that yield the deep understanding, not the ideas themselves. Focusing
too intensely on a single idea or subject results in not making connections
and, consequently, not really understanding.

~~~
j2kun
I think that, for whatever reason, people tend to think mathematics is somehow
different.

------
edtechdev
"If you’re going to get anywhere in learning mathematics, you need to learn to
be comfortable not understanding something."

That's true of everything. It's fear and anxiety that prevents a lot of people
from learning and trying new things. I keep trying to tell students or family
members when they are learning to do stuff on the computer, just right click
everything, just google anything you can think of, don't worry about it being
perfect, don't worry about breaking anything. You have to hold back showing
them the "answers" or else they become dependent.

------
Bahamut
I think this is a good read, although I don't agree with all of it - I'm of
the mind that there is immense value in being able to figure out difficult
proofs. The process develops your logical ability.

~~~
michaelochurch
_I'm of the mind that there is immense value in being able to figure out
difficult proofs._

Absolutely.

However, the rabbit hole is very deep. Many papers make leaps from one
sentence to the next that, if you're not familiar with the field, can take a
couple days to figure out. Even then, real world proofs are informal and
therefore not air-tight. They're close enough, almost always, but there's a
reason why a mathematical proof isn't considered valid unless it's lived for
two years under peer scrutiny.

One could drill down to _formal_ proof in the Gödelian sense, in which proofs
are mere typography and can be checked mechanically, but that's not how most
of real mathematics is done and, practically speaking, most of it _can't_ be
done that way and remain useful to humans (like assembly language, it's too
low-level for most applications).

~~~
nnq
> most of it can't be done that way and remain useful to humans (like assembly
> language, it's too low-level for most applications).

Sincere question (I'm not a mathematician): _why can't it be done that way?!_

On top of an assembly language you can create a higher-level language, and on
top of that an even higher-level one, and _it is airtight_; it has to be, or
the code won't compile or will throw a runtime exception. The compiler or
interpreter doesn't just roll a die when it comes across an ambiguous
statement! You just can't have ambiguous statements, so starting from a
"precise" assembler, everything built on top can be absolutely "air tight"
at the language level.

(Now, concerning what the program actually ends up doing (like something other
than you intended), or that sometimes you trade off security for speed and get
a buffer overflow: OK, these things happen, but usually _not at the language
level!_ And when they do, like C programs exploiting compiler behavior that is
undefined but known for certain targets, it's either advanced malicious
obfuscation or a random rookie mistake.)

So explaining the question: why can't one build a higher level mathematical
language _bottom up_ , starting from an "assembler" of machine-checkable proof
steps and building one or a few levels of higher level human-friendly
languages that still map unambiguously to the lower level one?

Just because mathematical language has evolved in a _top down_ fashion,
starting with describing proofs in words or symbols derived from words, and
then developing more and more precise language and systems, it doesn't mean
that one can't go the reverse route, _bottom up_, and maybe meet closer to the
top, in a way, so that the resulting new mathematical language will be similar
enough to the classical one not to scare everyone away, right?

...and the benefits seem _immense!_ Imagine:

(1) replacing years of peer review with machine checking of basic correctness
(+ some machine testing on huge data samples, for testable proofs, just to be
sure there was no bug)

(2) AI expert systems making real contributions to math by actually
discovering new proofs AND presenting them in a language understandable to
humans, so humans learn from them and discover new techniques

EDIT+: (3) allowing the development of much more advanced theories, because
just as in software you can build much larger systems once you learn how to
write more "bug free" code, the actual complexity of proofs could be much
larger, and maybe new realms of mathematics will become accessible to human
understanding once we have a "linguistic aid" for reducing the percentage of
faulty proofs and the time spent debugging them
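
For what it's worth, such an "assembler" of machine-checkable proof steps
already exists in proof assistants like Lean, Coq, and Isabelle, with tactic
languages and libraries acting as the higher levels built on top. A tiny Lean
sketch of what kernel-checked proofs look like:

```lean
-- The kernel verifies this by computation; nothing is taken on faith.
example : 2 + 2 = 4 := rfl

-- Reusing a library lemma is like calling an already-verified subroutine:
theorem swap_sum (m n : Nat) : m + n = n + m := Nat.add_comm m n
```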

~~~
alex-g
This is a very good thought. Some current projects are trying to develop
computable mathematical foundations in a more structured way. Homotopy type
theory ([http://homotopytypetheory.org/](http://homotopytypetheory.org/)) is
one example that has a lot of buzz around it just now, but automated theorem
proving has been trying to work with higher-order concepts for ages now.

In the classical approach of "compiling" everything into sets/logic/etc., you
end up with just the assembly language problem that's being discussed, where
all the high-level structure vanishes. In order to do your bottom-up approach
instead, one of the things that needs to happen is to make the theory really
_compositional_ , so that once you've defined some abstraction or higher-level
concept, you can use it in constructions and proofs without having to break
the abstraction. You don't need to know - and in fact you shouldn't be able to
find out - just how the natural numbers were constructed, as long as they work
by the right rules. This motivates the use of type theory to describe
mathematical objects, and say which operations are allowed. We want to be able
to add two numbers and get another number, but we don't want to be able to
intersect two numbers as if they were sets, _even if_ they happen to have been
built out of sets.
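
As a concrete illustration of that type discipline, here is a sketch in Lean
(any dependently typed proof assistant would do): the allowed operation
typechecks, while the set-theoretic one cannot even be stated.

```lean
-- `Nat` exposes arithmetic, not its encoding:
example (m n : Nat) : Nat := m + n    -- allowed: addition is in the interface

-- Intersecting two numbers is not false but ill-typed, so the
-- question cannot even be asked:
-- example (m n : Nat) := m ∩ n       -- rejected by the type checker
```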

So I think you are right - or at least, there are plenty of people who agree
with you that this is a good idea. It is difficult to _actually_ do, of
course, but that's life.

~~~
nnq
> We want to be able to add two numbers and get another number, but we don't
> want to be able to intersect two numbers as if they were sets, even if they
> happen to have been built out of sets.

Can't we do this in current mathematics?! I mean, no physicist or engineer
ever thinks of numbers as sets, even if you are the kind of physicist that
reads and understands mathematical proofs.

~~~
alex-g
Right, this is how mathematics really works. But formalizations of mathematics
may suffer from leaky abstractions. If we prove facts about numbers by
compiling them into sets, and then using set-theoretic axioms, we might
accidentally make it possible to prove things about numbers that are incorrect
or meaningless.

~~~
nnq
Isn't this an abstraction problem that you solve by simply providing an
"interface", or an equivalent concept like access specifiers in OOP
(private/protected)? All other modules that use the number module for applied
math would just see the "interface" (let's call it GeneralNumber -- as far as
I know there are a few alternate ways of defining numbers besides sets,
right?), not the particular "implementation of numbers as sets".

For more abstract algebras, or who knows what, the "numbers module" might also
implement another, more advanced interface that exposes more of the
implementation: a "SetsNumber" interface. If you then have a proof that uses
this interpretation of numbers, one tied to a particular "implementation",
there is nothing incorrect about it leading to weird or "meaningless" results:
they would be correct for SetsNumber but not for GeneralNumber (or someone
might need to take a good look and see if they can be made to work for
GeneralNumber too).

(I know, the words are all wrong; it probably sounds either "all wrong" or
like gibberish to mathematicians who don't also happen to be programmers
...someone should figure out more appropriate terms :) )

And about leaky abstractions: I think they happen a lot in software because of
the tradeoffs we make, like 'but we also need access to that low-level stuff
to tweak performance', 'but we need it done yesterday, so there's no time to
think it through and find the right model', or 'our model has contradictions
and inconsistencies, but it's good enough at delivering usable tools to the
end-user, so we'll leave it "wrong" because we want to focus on something that
brings more business value right now', etc. Also, there's a biggie: for some
problems, using no abstraction is not good enough (initial development/
prototyping speed is just too small), but if you figure out the right
abstraction, it will end up being understandable only by people with 'IQ over
n' or 'advanced knowledge of hairy theoretical topic x', and you can't hire
just these kinds of people to maintain the product, so you knowingly choose
something that's leaky but works and can be maintained by mediocre
programmers, hopefully even outsourced :)

~~~
alex-g
Yes, the interface idea is basically the right thing, but there are technical
difficulties which aren't immediately obvious. For example, we'd like to be
able to prove that two different implementations of the natural numbers are
equivalent (one can do this mathematically, so if our system is going to
handle general mathematics then it has to be capable of doing this). So we
have to think quite hard about what this "equivalence" actually means. It's
not enough to say that they satisfy the same axioms, because in general there
can be all kinds of models for some set of axioms. It's closer to say that all
number-theoretic statements about numbers-v1 are true of numbers-v2, and vice
versa: but you can see that this is starting to get a bit hairy in terms of
computable proofs.

A related problem, which speaks to the leaky abstraction issue, is "proof
irrelevance". Typically, if I've proved something, it shouldn't matter exactly
how I did it. But it turns out to be tricky to make sure that the proof
objects in the system don't accidentally carry too much information about
where they came from. Sure, you can define a way to erase the details, but you
still have to prove that erasing doesn't mess up the deductive system.

None of this is insurmountable, but it's a glimpse into the reasons why
encoding mathematics computationally is not trivial.

------
baby
That's what I tell people around me. Studying math is hard because it makes
you feel stupid. You always feel lost, you always feel like you missed so many
things when you're starting to learn something new, you always feel like your
questions are stupid (until you realize that the rest of the class is just as
lost as well).

Especially with talented professors (at Lyon 1, in France, the professors are
not really good educators, but they are geniuses): they make you feel bad for
not understanding things that seem so simple to them.

Studying math is depressing if you take it too seriously.

------
mathattack
It's strange to hear mathematics described as a search for art and structure
more than as computation. Unfortunately, most of my math education was on the
computational/applied side; I'm only getting into number theory and more
esoteric math later in life, for fun. As a parent, I think we can't let the
school system destroy our kids' love of math through too much rote learning.
We have to make it fun for them. (Same with music, btw.)

------
GIFtheory
Reminds me of this great quotation, which Oksendal places before the preface
to his stochastic differential equations book:

We have not succeeded in answering all our problems. The answers we have found
only serve to raise a whole set of new questions. In some ways we feel we are
as confused as ever, but we believe we are confused on a higher level and
about more important things.

-- Posted outside the mathematics reading room, Tromsø University

------
trevorhartman
Jeremy, I really appreciate this post and all the _excellent_ content over at
Math ∩ Programming. Thanks, and please keep it up!

~~~
j2kun
I'm just so happy that I get to read everyone's interesting stories and
thoughts in the HN comment threads! HN is really one of the highest-quality
places for discussion on the web ^_^

------
jmnicolas
Before reading a few articles of this kind, I never suspected there was such
depth in maths.

There's already so much to learn in programming, but I'm sure I'd love to dive
into maths (without the pressure of school, like "understand this or you're an
idiot").

~~~
pseut
Without that pressure, you won't learn it. :)

------
beltex
Loved this post.

FYI, the Andrew Wiles quote is from the opening of an awesome BBC documentary
about how he solved Fermat's Last Theorem -
[http://www.youtube.com/watch?v=7FnXgprKgSE](http://www.youtube.com/watch?v=7FnXgprKgSE)

------
nilkn
Heh, I knew this title seemed awfully familiar. Here's the discussion on
Hacker News which (presumably) spawned this:

[https://news.ycombinator.com/item?id=7331693](https://news.ycombinator.com/item?id=7331693)

------
vsbuffalo
Funny, this title is the same as a recent thread on HN:
[https://news.ycombinator.com/item?id=7331791](https://news.ycombinator.com/item?id=7331791)

~~~
j2kun
That comment received so much positive feedback that I decided to write an
article expanding on the idea :)

------
minikomi
First time I've intentionally kudos'd

------
egdelwonk
What's the best way to relearn math?

~~~
dwaltrip
Focus as much as possible on the underlying concepts of any math topic and on
how they connect to other concepts. Try to boil these concepts down to the
simplest, clearest form that makes sense to you. Test your boiled-down
conceptual understanding by applying it to related exercises/problems you
haven't tried before and seeing what happens.

For each sub-topic, most math books give you the tools first and then the
problems they should be used on. Read the problems first, and think about how
you might solve them (don't expect to figure it out, but if you do, great!).
Then go back and learn the tools, trying mostly to discern the "how" and the
"why" as opposed to the "what". Math is all about the "how" and the "why". As
some motivation: whenever a new thing clicks, it is very satisfying! :) But it
definitely is a tough process.

Good luck!

~~~
twobits
'discern the "how" and the "why" as opposed to the "what". Math is all about
"how" and the "why".'

Could you please expand on the difference between the "how" and the "what"?

------
gaius
A svbtle article worth reading, deserves an upvote.

