
How to Read Mathematics - ColinWright
http://www.people.vcu.edu/~dcranston/490/handouts/math-read.html
======
wimagguc
Feynman’s method for understanding complex problems is so simple and elegant!
From _Surely You’re Joking, Mr. Feynman!_:

 _“I can’t understand anything in general unless I’m carrying along in my mind
a specific example and watching it go. Some people think in the beginning that
I’m kind of slow and I don’t understand the problem, because I ask a lot of
these “dumb” questions: “Is a cathode plus or minus? Is an an-ion this way, or
that way?” But later, when the guy’s in the middle of a bunch of equations,
he’ll say something and I’ll say, “Wait a minute! There’s an error! That can’t
be right!” The guy looks at his equations, and sure enough, after a while, he
finds the mistake and wonders, “How the hell did this guy, who hardly
understood at the beginning, find that mistake in the mess of all these
equations?” He thinks I’m following the steps mathematically, but that’s not
what I’m doing. I have the specific, physical example of what he’s trying to
analyze, and I know from instinct and experience the properties of the thing.
So when the equation says it should behave so-and-so, and I know that’s the
wrong way around, I jump up and say, “Wait! There’s a mistake!”_

~~~
sundarurfriend
Also from the same book:

 _"I had a scheme, which I still use today when somebody is explaining
something that I’m trying to understand: I keep making up examples. For
instance, the mathematicians would come in with a terrific theorem, and
they’re all excited. As they’re telling me the conditions of the theorem, I
construct something which fits all the conditions. You know, you have a set
(one ball) – disjoint (two balls). Then the balls turn colors, grow hairs, or
whatever, in my head as they put more conditions on. Finally they state the
theorem, which is some dumb thing about the ball which isn’t true for my hairy
green ball thing, so I say, ‘False!’"_

~~~
j2kun
What kind of second-rate mathematician tries to prove a theorem without
writing down examples? That's Proving Theorems 101.

~~~
Koshkin
Right. The very first step in trying to prove a theorem is to try and
_disprove_ it (by looking for a counter-example).

~~~
jwdunne
Which are a form of example. You don't store those counterexamples in your
head - you write them down. And if they don't turn out to be counterexamples,
what are they? Examples.

------
jordigh
One thing that I really would like to impress upon computer people is that
mathematics is not (usually) a formal computer language. Symbols and technical
terms change meaning depending on the author, people write with different
"accents". There's a little ambiguity and informality at times, but a whole
lot less than other kinds of writing.

Mathematics is (usually) written for humans, not computers. Don't attempt to
read mathematics as you read code. To keep making bad analogies, it's a bit
more like reading music. Picking the right tempo, the notes to accentuate, the
interpretation -- this all requires a human. Computers still make crude
approximations.

~~~
retox
Why is that? Why is there not a universal way of writing mathematics that is
not ambiguous and can be read by anyone that understands the 'language'?

I have a deep dread and fear of numbers and mathematics in general because I
don't understand them and I have never learned. Now I learn that there isn't
one thing to learn but a vast array? No thanks. I had this apparently romantic
view that an equation is an equation (the same equation) anywhere in the
world.

~~~
Smaug123
Why is there not a universal way of writing _anything_ that is not ambiguous
and can be read by anyone that understands the 'language'? Because humans have
a tendency to defy attempts at classification and constraint. Constructed
real-world languages (Lojban, Esperanto) haven't really taken off, while non-
constructed languages mix and match whenever they feel like it ("le parking",
"Schadenfreude"). The same is true in maths: people use what is easiest.

~~~
erasemus
Because whenever we read something we have to _guess_ the meaning. To the
human mind, all communication is inherently contextual and metaphorical. To
put it another way, each individual has their own internal private language
into which everything they read must be translated.

~~~
emacsgifs
It's as if we truly believe that 0 is positively charged and 1 is negatively
charged.

Or to take it a step further, that a certain arrangement of electrons is
"negative" and the inverse arrangement is "positive".

I often tell CS people that everything in computing is metaphor and it
honestly disturbs me that 80% of the time I get blank looks.

~~~
sevensor
> I often tell CS people that everything in computing is metaphor and it
> honestly disturbs me that 80% of the time I get blank looks

This is why people think Machine "Learning" means that the Singularity is
near.

------
zitterbewegung
How I read mathematics: first, find the paper. Start with
[http://front.math.ucdavis.edu](http://front.math.ucdavis.edu). If you can't
find it there, use Google Scholar. If that fails, check the author's academic
website. If that fails, see if the author is alive and try to contact them;
then repeat for their grad students. If all else fails, post it to r/math or
HN and ask for the pdf.

Next, see if the paper is even worth reading. Scan the abstract and the
conclusions. If it still looks relevant, do a quick scan of the whole text and
note the things you don't understand. Look up everything you don't understand
on Wikipedia, then follow the citations on those Wikipedia pages to gain more
understanding. If you still can't understand the paper, reread it and attempt
to redo the calculations. If all else fails, ask someone to help you read it.
Whenever you get confused, write your questions on the paper. If you figure
out that the paper is meaningless, look for another one.

If you come up with a new question that you want to pursue, try to answer it.
If you don't have access to a professor, or if they are impeding your work,
use Stack Exchange. Either that or try to invalidate your new theory yourself.
Recognize that you can be wrong.

~~~
Iv
Also, a very good piece of advice I received a long time ago but dismissed
until recently: some articles are just badly written.

The author sometimes forgets to mention what B actually is, changes the
meaning of the multiplication operator implicitly, assumes the formula given
is inside an integral... I could go on.

A maths article is an article. A lot are imperfect; quite a few are bad.
Sometimes it is not the reader that is lacking.

That being said, do your homework, and only consider that possibility after
giving the article an hour of your time or so.

~~~
nicklaf
I remember reading this exact piece of advice, but have been trying to
remember where.

Would anybody be able to provide the blog post I believe we both read, which
says almost exactly what Iv wrote? I could have sworn Michael Nielsen wrote
it, but I was unable to find it last time I searched. So I am following OP's
advice and asking here. :P

~~~
nicklaf
I may have found the post by Michael Nielsen I was looking for... on Hacker
News!

[https://news.ycombinator.com/item?id=666615](https://news.ycombinator.com/item?id=666615)

In particular, this piece of advice he gives still rings true to my ears:

 _Often, when struggling with a book or paper, it's not you that's the
problem, it's the author. Finding another source can quickly clear stuff up._

------
lostcolony
"Mathematical ideas are by nature precise and well defined, so that a precise
description is possible in a very short space"

"A well-written math text will be careful to use a word in one sense only"

So many papers are not that careful, and thus not so precise. There are only
so many Greek letters, so they end up overloaded. I can't count the number of
times I've struggled over an equation, only to find out that the reason it
made no sense and/or the value for a specific example came out differently was
that the author assumed a variable meant something different than I did (and
my assumption was perfectly valid in another context. Sometimes another
context in -the same paper-). The whole experience is incredibly frustrating.

Echoing another point here, if you just show a sufficiently non-trivial
example and walk me through it, I grok it quickly. -Then- give me the equation
if you must.

~~~
vidarh
My main exposure to mathematics is via CS papers, and there this seems doubly
true, to the point where my warning bells go off if a paper presents its
solutions as equations: it's all too often a sign that the authors are going
to hand-wave away essential details that would obviously be missing had they
presented code or even pseudo-code.

E.g. I did my MSc. on statistical approaches to reducing error rates in OCR,
and so many of the papers I reviewed for my literature review left out
absolutely critical information when they presented equations etc. that most
of the time I spent implementing the methods went into trying to reverse
engineer the missing information (and often that was only possible because
many of the papers used one of a few well-established public datasets for
their experiments).

To me, for those CS papers, maths was a warning that what lay ahead was
likely _not_ precise or well defined. It's incredibly frustrating, as there
was generally no good reason for it: it wasn't a matter of leaving out large
bulks of complicated code. Often it could be as simple as leaving out the
concrete values of given parameters.

I'm not implying maths has no place in CS papers, but the CS papers I've seen
that use it best have tended to _also_ present code fragments, or to use
maths very sparingly, coupled with painstaking definitions of variables etc.

~~~
lostcolony
Yeah, my attempt at a master's in CS (before deciding the time would be better
spent elsewhere) left me feeling largely the same. "Oh, isn't that equation
elegant. Oh, wait, actually trying to implement it... what the hell, where is
all the other information I need?! That's a Gaussian... but where do the
parameters for that Gaussian come from? What is this variable? WHAT IS THIS
VARIABLE?!" etc.

I had a 4.0 departmental GPA in CS from the same school I tried to get a
master's in, but yeah, the master's experience was so bad, and working through
those issues so time consuming, that I just said "Screw it, I can get a better
return on this amount of time doing my own things." Even if it's learning the
same things, it's to an explicit goal rather than just 'understanding' (and an
eventual test/assignment that may or may not relate to what I care about), and
I'm not constrained by an academic policy that prevents me from going to
others to have them explain exactly what I need to know.
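
For what it's worth, the unstated detail in that kind of paper is often
nothing deeper than the sample statistics. A hypothetical sketch (the dataset
and names here are made up purely for illustration) of what spelling out "it's
a Gaussian" actually looks like:

```python
# Hypothetical sketch: making a paper's unstated Gaussian explicit.
# Here the parameters are just the maximum-likelihood estimates from
# the data -- exactly the kind of detail papers tend to omit.
import math

def fit_gaussian(samples):
    """Estimate (mu, sigma) from the data: sample mean and std dev."""
    n = len(samples)
    mu = sum(samples) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in samples) / n)
    return mu, sigma

def gaussian_pdf(x, mu, sigma):
    """The density a paper would write as N(x; mu, sigma^2)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

# Fit to a toy dataset, then evaluate the density at its peak.
mu, sigma = fit_gaussian([9.8, 10.1, 10.0, 9.9, 10.2])
print(mu, sigma, gaussian_pdf(10.0, mu, sigma))
```

Two lines of definitions would have answered "where do the parameters come
from?", yet they are what so often goes missing.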

------
wiz21c
FTA :

>>> One can now check that the next statement is true with a certain amount of
essentially mechanical, though perhaps laborious, checking. I, the author,
could do it, but it would use up a large amount of space and perhaps not
accomplish much, since it'd be best for you to go ahead and do the computation
to clarify for yourself what's going on here. I promise that no new ideas are
involved, though of course you might need to think a little in order to find
just the right combination of good ideas to apply.

I hate that. When I program my computer, I specify all the steps to accomplish
something. If it's too verbose, then I abstract it in a function.

I even add comments to make sure that a human reader easily and completely
understands what I do.

I do that because I want the computer and the human reader to be productive in
their understanding of what I do. And because I want the human reader to see
that it's either elegant or just mechanical.

I'm not so pretentious as to say "hey, I've hidden a few details because if
you are as smart as I am, then you'll understand easily".

I ran into this when learning maths. As a student I was sometimes lost because
I always thought maths was hard. Had I been given all the details, I'd have
seen it was indeed much easier than I thought and wouldn't have been
intimidated. Now I'm older, I know all of that, and I'm much better at
mathematics. But what a waste of time.

And the space argument, come on. Just put all the stuff in appendices and
it'll be fine.

~~~
klibertp
> I hate that. When I program my computer, I specify all the steps to
> accomplish something. If it's too verbose, then I abstract it in a function.

> I even add comments to make sure that a human reader easily and completely
> understands what I do.

Yeah, honestly, every single time I read a maths-related paper I feel like I
would be better off with a well-written example implementation in any
programming language.

Then I recall what code written by mathematicians tends to look like, and I'm
not so sure anymore...

Seriously, the fact that math people choose to ignore the last 50 years of
software engineering development, which was in large part focused on
readability and maintainability, instead of adopting those techniques
themselves, is really baffling. As it is, aside from a couple of really well-
written articles, reading any paper has a high chance of frustrating me to
tears...

~~~
gizmo686
Because mathematicians have thousands of years of history in which they
learned how to write _math_ in a way that is readable to other mathematicians.

Software engineering is a very different activity from the math that
mathematicians do. It makes sense that it has a different standard for what
counts as readable.

There are undoubtedly ways to improve mathematical writing, but it seems
highly arrogant to say that your <100-year-old field has discovered a massive
improvement that mathematics has missed for centuries.

~~~
gfodor
Appeal to tradition; a poor argument. It's not arrogance to assert that
perhaps a field needs to update its methods.

~~~
gizmo686
A software engineer suggesting that math should adopt the practices of
software engineering sounds a lot like someone trying to generalize their
domain into another domain, and saying that they know better than the domain
experts in the field they are generalizing into.

Occasionally they are right. Most of the time, however, the domain experts are
right.

~~~
klibertp
As I said in another comment[1], I'm not even talking about software
engineering, just the good practices of everyday programming.

Could you please tell me why you think the name `x` is better than `element`
(when talking about some set), the name `X` is better than `training_set`, and
`∑` is better than `sum`?

[1] Here:
[https://news.ycombinator.com/item?id=15914024](https://news.ycombinator.com/item?id=15914024)
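
To make the comparison concrete, here is a throwaway Python sketch of the same
computation written both ways (the descriptive names `training_set`, `element`
and friends echo the comment; the function itself, a mean squared error, is
just an arbitrary illustration):

```python
# The same expression written twice: once with math-style names,
# once with the descriptive names from the comment above.

def loss(X, y, f):
    # Math-style: terse, relies on the reader knowing the conventions.
    n = len(X)
    return sum((f(x) - t) ** 2 for x, t in zip(X, y)) / n

def mean_squared_error(training_set, targets, model):
    # Programming-style: each name says what the thing is.
    total = sum((model(element) - target) ** 2
                for element, target in zip(training_set, targets))
    return total / len(training_set)

identity = lambda v: v
print(loss([1, 2, 3], [1, 2, 4], identity))                # 0.3333333333333333
print(mean_squared_error([1, 2, 3], [1, 2, 4], identity))  # 0.3333333333333333
```

Both compute the same number; the argument is purely about which one a
newcomer can read without a glossary.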

------
vinchuco
I disagree with 'do not read too fast':

Skim it all. Skim again the next day. But the third time, get pencil and
paper, take notes, and put in effort. If it's neither uncomfortable nor
exciting, you're doing it wrong.

~~~
catnaroek
Just... no. If you actually have to understand it (rather than, say, peer-
review it), then make an honest attempt to understand it right from the
beginning. For instance, suppose it takes you 30 minutes to skim through a
chapter of a math book. Then you've wasted 30 minutes you could've spent
understanding the chapter's first section.

~~~
sillysaurus3
What if everyone learns differently?

~~~
criloz2
Haha, true. I envy people who can just sit down, read a
book/chapter/article/blog linearly, and come out enlightened. I can't. I need
to first skim almost 90% of the material, then start again from the bottom,
reading the last part and trying to figure out how the earlier parts are
connected or why they matter for understanding what is written.

I think people who read linearly don't miss important info, but I just can't
do it.

~~~
catnaroek
Nobody is born knowing how to read effectively. It is a matter of practice and
discipline.

~~~
heavenlyblue
Have you actually read any of the papers? While I tend to agree with you in an
ideal situation where you may assume the writer is <probably> good, that is
usually not the case with real-life papers.

It's not even the writing per se - it's the structure, or rather the
omissions. I would rather skim first to find which parts I may not be able to
grasp right away, and to find out whether there's any point spending time on
this, than go on and read the pieces from the citation list.

~~~
catnaroek
> Have you actually read any of _the_ [emphasis mine] papers?

What specific papers are you talking about? TFA doesn't contain a single
occurrence of the word “paper”. On the other hand, it contains several
occurrences of the word “book”. And, most importantly, it is obviously written
for newcomers to mathematics, who for obvious reasons will spend more time
reading carefully written books rather than papers written in a rush.

------
sulam
I’m curious if anyone has advice on how to understand all the symbols used in
math papers. Frequently I find this is the first barrier for me in reading
them.

~~~
kalid
Shameless plug, I write about math on BetterExplained.com

For each symbol, I try to make an analogy. For "e", for example, I have the
notion of "continuous growth". The formal definition is this:

[https://betterexplained.com/ColorizedMath/content/img/E_(mathematical_constant).png](https://betterexplained.com/ColorizedMath/content/img/E_\(mathematical_constant\).png)

"The base for continuous growth is the unit quantity earning unit interest for
unit time."

Once you see the role of each part of the definition, the idea snaps into
place. I just wrote about it here:
[https://betterexplained.com/articles/colorized-math-equations/](https://betterexplained.com/articles/colorized-math-equations/)

Hope that helps!
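
The definition quoted above can also be checked numerically: compound a unit
quantity at unit (100%) interest over more and more sub-periods of unit time,
and the final balance approaches e. A quick sketch:

```python
# e as "unit quantity earning unit interest for unit time":
# compound 100% interest over n ever-smaller periods and watch
# the final balance approach e ≈ 2.718282.
import math

for n in (1, 10, 1_000, 1_000_000):
    balance = (1 + 1 / n) ** n
    print(f"{n:>9} compounding periods -> {balance:.6f}")

print(f"math.e                       -> {math.e:.6f}")
```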

~~~
klibertp
Just wanted to thank you for the great job you do on BetterExplained. I was
able to understand and put to use some concepts only thanks to your articles,
most recently the Fourier transform. I'd be much better at math if people
writing educational materials realized that there are more effective ways to
teach (and to learn) than what they themselves experienced. Thank you!

~~~
kalid
I appreciate it, thank you! I have a similar wish, hoping more people would
share math in the way that truly helps them.

------
nathell
> The author usually spends months discovering things, and going down blind
> alleys. At the end, he organizes it all into a story that covers up all the
> mistakes (and related motivation), and presents the completed idea in clean
> neat flow. The way to really understand the idea is to re-create what the
> author left out. Read between the lines.

Pity there's little mathematical writing that actually _does_ contain these
blind alleys and mistakes. Part of why I enjoy "Concrete Mathematics" by
Graham/Knuth/Patashnik is that it's one of the rare counterexamples.

~~~
chestervonwinch
I was thinking of ideas for a talk recently, and I thought of "Things I tried
that didn't work and one that did" or something to that effect. However, when
I looked through my old failed approaches, I found the amount of documentation
was proportional to the level of success of the approach, making such a talk
nearly impossible.

------
killjoywashere
There's a contrast between the comments here about Feynman and the author's
example of birthdays. The birthday data is totally random, and the author
asserts this is the sort of thing mathematicians enjoy. It strikes me that
Wolfram does this too: he'll demo something by saying "let's get some data"
and then generate a big pile of random numbers.

I usually space out at that point. I understand the utility of it, but I find
it worthless in most cases. I'd rather understand a concrete example well,
then check my understanding by trying the random case, then try to think of
some clever or experience based edge case or counterexample.

I think Feynman, especially later in life, probably had the gift of jumping
straight to step 3: he understood that his most valuable contribution could
often be simply expanding the space of possible examples. And random doesn't
really get you the whole space, and certainly doesn't get you the interesting
parts of the space.

------
a-saleh
One good piece of advice our calculus lecturer gave us was:

"Proofs are not meant to be understood by reading them from beginning to end.
They are usually meant to be `checked` that way. Most of the time, I start
reading the proof from the conclusion and work backwards. Often I write my
proofs that way as well."

------
mad44
Good advice. Here is my advice for reading research papers.
[http://muratbuffalo.blogspot.com/2013/07/how-i-read-research-paper.html](http://muratbuffalo.blogspot.com/2013/07/how-i-read-research-paper.html)

------
ocfnash
A rather obvious but often overlooked piece of advice is to "read the
masters".

IMHO it is very easy to waste tonnes of time reading papers / books that are
OK but still vastly inferior to an account by a master.

Want to learn index theory: read Atiyah and Singer's original papers.

Interested in the classification of quadratic forms: read Serre's _A Course in
Arithmetic_.

Need to learn about the recent breakthroughs on prime tuples: read Tao!

Find time to read Weil, insist on reading Poincare, come up with an excuse to
read Scholze etc.

Top mathematics literature is often spectacularly well written.

See also: [https://mathoverflow.net/questions/28268/do-you-read-the-masters](https://mathoverflow.net/questions/28268/do-you-read-the-masters)

------
billfruit
Mathematical notation is often too vague, too informal, and too dependent on
context. Frankly, there is too much ambiguous overloading of symbols. It seems
the language has evolved to be succinct and short to write, rather than
explicit and unambiguous. I know maths people say that it is meant to be read
by people and not by computers, but more people are accustomed to reading code
than ever before. I feel that math notation needs a reform that makes it
formal, unambiguous, and capable of describing imperative manipulations rather
than being merely declarative, with clear annotations of the type information
of entities.

~~~
kachnuv_ocasek
That's what type theory is for. There have been some very interesting
developments in the last 40 years that you might want to check out.

------
rkachowski
I like this approach. I've always been stumped by phrases like "it follows
easily", and my eyes tend to glaze over whenever I encounter that LaTeX math
formula typeface. Breaking it down like this suddenly helps me appreciate the
other layer of communication in math writing, which isn't just about
expressing a specific mathematical idea, but also about expressing how to
appreciate it.

One gripe is the circularity of:

> Before you start to read, make sure you know what the author expects you to
> know.

How can you know what the contents of the text require before you know (read)
the contents of the text?

~~~
ColinWright
As others have said elsewhere, skim it once or twice, ignoring the detail,
reading the text, getting an overall gist of the way it goes.

Then brew a fresh coffee, grab pad and pencil, roll up your sleeves, and make
notes as you go.

------
agentultra
What I most enjoy about reading mathematics is convincing myself that what I'm
reading is true. I like to keep a pad of dot-graph paper and a pencil with me
as I read to break down the theorems, lemmas, and expressions as I go; try
derivations, etc. Martin Gardner got me into the habit as I read his articles
while I was growing up.

These days I can even toy with the idea in Lean[0] or TLA+ and find different
ways of formulating the results. What a time to be alive!

[0] [http://leanprover.github.io/](http://leanprover.github.io/)
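
As a taste of what that toying can look like, here is a minimal Lean 4 sketch
(a deliberately trivial statement, just to show the shape of the workflow:
state a fact you've read, then convince Lean, and yourself, that it holds):

```lean
-- State a small fact and discharge it with a library lemma.
theorem sum_comm (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```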

------
dan_mctree
It's too bad that in the majority of cases you won't have someone beside you
answering all the questions. Without that, the reader would've gotten stuck
long ago.

------
dekhn
After many years of frustration at not being able to read physics papers, a
helpful physics guy told me: when it's bolded, that means it's a matrix; and
that other one, they didn't say so, but it's a vector, so this step is a
vector-matrix operation.

(that's why, today, I use numpy, where the operations are much easier to see).
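
A small numpy sketch of that convention made explicit: what a paper writes as
y = Ax (A bold by convention, x a vector by omission) becomes visible shapes
and a visible operator:

```python
# What a physics paper writes as "y = A x" (bold A a matrix, x a
# vector, by unstated convention) is explicit shapes in numpy.
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])  # (2, 2) matrix -- the "bolded" symbol
x = np.array([1.0, -1.0])   # (2,) vector -- the one "they didn't say"

y = A @ x                   # the matrix-vector step, visibly so
print(A.shape, x.shape, y.shape)  # (2, 2) (2,) (2,)
print(y)                          # [-1. -1.]
```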

------
nullsmack
Also here: [http://web.stonehill.edu/compsci/History_Math/math-read.htm](http://web.stonehill.edu/compsci/History_Math/math-read.htm)
In fact, I think this must be the original source.

------
g00gler
I’ve been struggling with this lately as I’ve been studying ML/Stat. Thanks
ColinWright!

------
Kaka4
Pure mathematics is, in its way, the poetry of logical ideas - Albert Einstein

------
Dowwie
This advice could be adapted to more general advice on learning

------
tomerbd
I would add - draw!

------
x14
Just my perspective, but mathematics seems like badly-written software to me.
Shitty single-letter variable names with all the code in main() without any
separation of concerns.

~~~
platz
Try reading some proofs in Coq to see how badly software does at making
mathematics readable.

Nevertheless, I agree that for things like white papers the overloading of
identifiers is frustrating.

