
Will Our Understanding of Math Deteriorate Over Time? - yummyfajitas
http://blog.computationalcomplexity.org/2015/07/will-our-understanding-of-math.html
======
jacobolus
_“In mathematics and theoretical computer science, we read research papers
primarily to find research questions to work on, or find techniques we can use
to prove new theorems.”_

This is why figuring out an elegant, concise, and powerful set of mathematical
models which apply to multiple domains, and then devoting effort to
simplifying, organizing, and explaining those ideas in an accessible way is so
important.

Incentives for researchers are mostly to push and prod at the boundaries of a
field, but in my opinion mathematical ideas are only of marginal value in
themselves; more important is the way they help us understand and interact
with the physical universe. For that, building communities, developing
effective languages and notations, codifying our understanding, and making it
accessible both to newcomers and to outsiders is the most important task for a
field, and perhaps for our society generally.

Just like with software projects or companies, the most “success” comes from
helping a range of other people solve their problems and extend their
abilities, not from making technically beautiful art projects for their own
sake (not that there’s anything inherently wrong with those).

Perhaps more generally, while theorem proving has overwhelmingly dominated
pure mathematics and related fields for the past 80–100 years, and has been an
important tool since Euclid, theorem proving is only one way of approaching
the world, and in my opinion is a mere tool, not an end in itself. Just like
simulation is a tool, or drawing pictures is a tool, or statistical analysis
is a tool.

I like this bit from Feynman:
[https://www.youtube.com/watch?v=YaUlqXRPMmY](https://www.youtube.com/watch?v=YaUlqXRPMmY)

~~~
blintzing
But sometimes the connection between "beautiful art project" and "practical
tools" is totally unexpected. We often invest time in projects that seem
simply like "beautiful art", and then much later stumble upon something
practical.

I think a great example of this is cryptography. Its foundations come from
number theory (prime numbers, modular arithmetic, elliptic curves), but number
theory, before the advent of computing, was possibly the most useless kind of
mathematical 'art' that could have existed. I imagine it was the mathematical
equivalent of frolicking in the fields.

Mathematicians explored Fermat's little theorem starting in 1640, but they
didn't do it because they knew it'd be useful several hundred years later in
RSA. They did it simply because math is worth exploring in itself.
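As an illustrative aside (a hedged sketch, not part of the original thread):
the theorem's role in RSA can be seen with the standard textbook toy
parameters p = 61, q = 53. Fermat's little theorem, and its generalization
(Euler's theorem), are exactly what make RSA decryption invert encryption.

```python
def fermat_holds(a: int, p: int) -> bool:
    """Fermat's little theorem: a^(p-1) ≡ 1 (mod p) for prime p
    when gcd(a, p) == 1."""
    return pow(a, p - 1, p) == 1

# Textbook toy RSA parameters (far too small for real use).
p, q = 61, 53
n = p * q                # 3233, the public modulus
phi = (p - 1) * (q - 1)  # 3120
e, d = 17, 2753          # chosen so that e*d ≡ 1 (mod phi)

message = 65
ciphertext = pow(message, e, n)    # encrypt with the public key
recovered = pow(ciphertext, d, n)  # decrypt with the private key

assert fermat_holds(2, 61)
assert recovered == message
```

Of course, the 17th-century number theorists had no inkling of any of this;
that is exactly the point.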

Even if you don't subscribe to the idea that we should pursue math for math's
sake, history shows us that it's very difficult to know what parts of math
will be useful to humanity, especially hundreds of years later. Since people
work best on what they find interesting, mathematicians should continue
exploring the topics that most interest them, because we really can't say with
any certainty what will prove useful (or even essential) to future
generations.

~~~
Houshalter
That's a popular meme but it's mostly false. See
[http://lesswrong.com/lw/4kt/the_value_of_theoretical_researc...](http://lesswrong.com/lw/4kt/the_value_of_theoretical_research/)

The vast majority of "useless" mathematics really does turn out to be useless.
In the rare exceptions, there's not much evidence that doing the work
beforehand is actually an advantage. E.g. Einstein wasn't aware of most of the
work on non-Euclidean geometry before developing relativity, IIRC.

Stuff like prime numbers has eaten up millions of brain-hours of highly
intelligent people. I remember thinking it was weird that so many Project
Euler problems were about prime numbers. And when I looked up what their
applications were, I couldn't find anything significant beyond cryptography.

And they seem to have been chosen for cryptography simply because they were a
well-studied problem with certain properties, not because cryptography
inherently needs prime numbers and would be impossible without centuries of
previous work studying them.

~~~
coliveira
> Einstein wasn't aware of most of the work on non-Euclidean geometry before
> developing relativity IIRC.

That's the worst example you could find, because Einstein didn't develop the
mathematics for general relativity. He relied on the math invented in the 19th
century for non-Euclidean geometry. If nobody had thought about such a "silly"
geometry with "no practical value", it would probably have taken much longer,
because the necessary results would have been out of reach for Einstein.

~~~
sdenton4
Riemann's contribution is overlooked far, far too often. The early
non-Euclidean geometries were spaces of constant curvature - spherical and
hyperbolic - and Riemann brought the idea of a manifold, and the notion of
having a geometry that changes as you move around the space. And he did it in
a fantastic lecture with only one equation in 1854, a good 50 years before
special relativity.

Einstein was also definitely familiar with the work of Helmholtz, who did some
fascinating work on non-Euclidean geometry in the context of ophthalmology:
Lenses change the amount of curvature we perceive in space (think of fish-eye
lenses), and provide a great jumping off point for the notion that the
universe might not be as flat as it appears.

The Dover book 'Beyond Geometry' collects a bunch of the major papers in non-
Euclidean geometry leading up to relativity, and is a fantastic read.

------
tokenadult
This is a very good and thought-provoking essay for a short blog post, and I
have already shared it in a Facebook community heavily populated by
professional mathematicians (where the moderator, with a Ph.D. in math from
Berkeley, has given it a thumbs up). Thanks for sharing.

I really like the overall point of the post that mathematics once known can be
forgotten or neglected, and mathematics written up for mathematics journals
can be difficult to understand. Professor John Stillwell writes, in the
preface to his book _Numbers and Geometry_ (New York: Springer-Verlag, 1998):

"What should every aspiring mathematician know? The answer for most of the
20th century has been: calculus. . . . Mathematics today is . . . much more
than calculus; and the calculus now taught is, sadly, much less than it used
to be. Little by little, calculus has been deprived of the algebra, geometry,
and logic it needs to sustain it, until many institutions have had to put it
on high-tech life-support systems. A subject struggling to survive is hardly a
good introduction to the vigor of real mathematics.

". . . . In the current situation, we need to revive not only calculus, but
also algebra, geometry, and the whole idea that mathematics is a rigorous,
cumulative discipline in which each mathematician stands on the shoulders of
giants.

"The best way to teach real mathematics, I believe, is to start deeper down,
with the elementary ideas of number and space. Everyone concedes that these
are fundamental, but they have been scandalously neglected, perhaps in the
naive belief that anyone learning calculus has outgrown them. In fact,
arithmetic, algebra, and geometry can never be outgrown, and the most
rewarding path to higher mathematics sustains their development alongside the
'advanced' branches such as calculus. Also, by maintaining ties between these
disciplines, it is possible to present a more unified view of mathematics, yet
at the same time to include more spice and variety."

Stillwell demonstrates what he means about the interconnectedness and depth of
"elementary" topics in the rest of his book, which is a delight to read and
full of thought-provoking problems.

[http://www.amazon.com/gp/product/0387982892/](http://www.amazon.com/gp/product/0387982892/)

~~~
eli_gottlieb
>"The best way to teach real mathematics, I believe, is to start deeper down,
with the elementary ideas of number and space. Everyone concedes that these
are fundamental, but they have been scandalously neglected, perhaps in the
naive belief that anyone learning calculus has outgrown them. In fact,
arithmetic, algebra, and geometry can never be outgrown, and the most
rewarding path to higher mathematics sustains their development alongside the
'advanced' branches such as calculus. Also, by maintaining ties between these
disciplines, it is possible to present a more unified view of mathematics, yet
at the same time to include more spice and variety."

While I do agree, we have to remember why most math classes actually exist: to
teach calculus to physicists and engineers, and, as my stepfather's
undergraduate advisor once said, "to keep the children from running in the
halls".

(That's using the mathematician's extremely self-centered view of "children"
as "anyone who has yet to ace two semesters of real analysis".)

I've been starting into real analysis myself via Pugh's textbook[1] after not
taking a serious math class since multivariable calculus, and found that, once
I get past the applied stuff, I really _like_ the approach of building up
calculus from its foundations in real numbers (taken as Dedekind cuts), limits
(Cauchy-convergent sequences), the set-theoretic construction of functions,
and the construction of topological and metric spaces "from scratch". But I
can tell that I like it because, deep down, I have the mind of a theoretical
computer scientist (which is what I like to be when I'm not writing firmware),
which is a kind of mathematician. I appreciate that someone has to teach the
applied classes to the people who _aren't_ going to kvetch about "how can I
trust that works!?" and who _demand_ to just get their math over with as
quickly as possible.

[1] -- [http://www.amazon.com/Mathematical-Analysis-Undergraduate-Te...](http://www.amazon.com/Mathematical-Analysis-Undergraduate-Texts-Mathematics/dp/144192941X)

~~~
arh68
> _I've been starting into real analysis myself via Pugh's textbook[1] after
> not taking a serious math class since multivariable calculus_

Do you have recommendations for other books? I stopped at multivariable
calculus as well. For what it's worth, those yellow Graduate Texts in
Mathematics books feel like reading TAOCP or CLRS; I'm looking for more
approachable textbooks. I feel like I'm not even up to the 1800s, math-wise,
not even up to Gauss.

~~~
jacobolus
There are hundreds of good math textbooks to recommend, it really depends on
your interests.

For a broad overview at an undergraduate level, one that does a great job of
explaining the context of various mathematical topics, the Russian books from
the 1950s, _Mathematics: Its Content, Methods and Meaning_ by Aleksandrov,
Kolmogorov, and Lavrentiev, are pretty fun. Amazon link to the one-volume
Dover reprint (but I’d recommend finding a used three-volume hardback copy):
[http://amzn.com/0486409163](http://amzn.com/0486409163)

Or check out John Stillwell’s _Mathematics and its History_ :
[http://amzn.com/144196052X](http://amzn.com/144196052X)

------
lmm
But the proofs survive because they are proofs; if they don't communicate the
proof of the result then they have failed and should not be accepted by
journals. At the extreme end, machine-checkable proofs are in standard,
documented formats; an alien reading them in ten thousand years should still
be able to understand what's going on, at least if they understand the
notation and the axioms.

~~~
Tyr42
And code does what code does. Given some binary executable, an alien reading
it far in the future should be able to understand what's going on, at least if
they understand the architecture. :P

I've written a few machine-checked proofs, and there are really two ways that
I've seen: either writing it for the next human to read, or writing just
enough that the checker accepts it. The latter makes free use of tactics like
`crush`, which brute-force solutions out of the current assumptions, exploring
the search space automatically. That's really convenient, but it can make
reading the proof very unenlightening.
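(`crush` itself is from Coq, in Adam Chlipala's CPDT library. As a hedged
illustration of the two styles, here is a Lean 4 analogue: the same trivial
lemma closed once by an opaque automation tactic, and once with the reasoning
step named explicitly for the next reader.)

```lean
-- Automation style: `omega` closes the goal in one step, but the
-- resulting proof records nothing a human reader can follow.
theorem le_trans_auto (a b c : Nat) (h₁ : a ≤ b) (h₂ : b ≤ c) : a ≤ c := by
  omega

-- Explicit style: the one reasoning step is named for the next reader.
theorem le_trans_manual (a b c : Nat) (h₁ : a ≤ b) (h₂ : b ≤ c) : a ≤ c :=
  Nat.le_trans h₁ h₂
```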

~~~
eli_gottlieb
Chlipala, if you're on here, I want to be the second person to say: `crush` is
the _least_ communicative tactic I have _ever_ seen. Where the hell is
`unsafePerformIO` when I want some reporting on _exactly_ how you crushed my
proof goals!?

------
stared
Ludwik Fleck's "Genesis and Development of a Scientific Fact" goes very much
in the line of "[science] only exists in a living community of mathematicians
that spreads understanding and breathes life into ideas both old and new."
(Written pre-WW2, it served as an inspiration for Kuhn.) Its most eye-opening
example is the history of [the concept/knowledge/science/... of] syphilis,
from ancient to modern times.

PDF (of print from 1979):
[http://www.evolocus.com/Textbooks/Fleck1979.pdf](http://www.evolocus.com/Textbooks/Fleck1979.pdf)

------
pc2g4d
Integrate concise and effective explanations into the relevant Wikipedia
articles and you at least give future generations a good head start on
understanding these things.

~~~
jacobolus
Most of the Wikipedia articles on technical subjects, and especially on
mathematical topics, are terrible as introductory exposition. They are
mathematical topics, are terrible as introductory exposition. They are
jargony, highly technical, and self-referential. They usually contain much
that is irrelevant, and they almost never properly explain the context for an
idea.

The main problem is that Wikipedia articles are tiny and atomic, so it’s
difficult to synthesize and organize ideas into a coherent story. The culture
of Wikipedia frowns on the kind of exposition found in textbooks or lectures.
And perhaps most importantly, no one is responsible for either individual
articles or sets of related articles in a field. Working within those confines
is not the best way to spend your time if the goal is to give future
generations a leg up, in my opinion.

If you want to learn about mathematics, even a mediocre textbook is nearly
always better than the relevant Wikipedia pages. The Wikipedia pages are then
useful later, as a reference, for people who already understand their content.

~~~
vezzy-fnord
_The Wikipedia pages are then useful later, as a reference, for people who
already understand their content._

Which, if I'm not wrong, is exactly the intent of an encyclopedia. It's a
reference work.

~~~
marcosdumay
If the only people capable of understanding what you are saying are the people
that already know it, saying it is quite useless.

Reference material or not.

~~~
AnimalMuppet
If I (still) knew it, I wouldn't need reference material. But I agree, if I
never knew it, then reference material is... not quite useless, but quite
useless _to me at the moment_.

------
stephencanon
Of course. Most modern mathematicians aren't fluent with half the material in
(the ~100-year-old text) Whittaker and Watson's "A Course of Modern Analysis".
This was standard material even 60 years ago. You can get a PhD in mathematics
today without once seeing an elliptic function, because computers are good
enough at numerically solving the problems they were once used to solve
symbolically.
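(To illustrate the point, a hedged sketch: a value that Whittaker-and-Watson-era
readers would have obtained from tables of elliptic integrals, the complete
elliptic integral of the first kind K(m), can now be computed numerically in a
few lines via the arithmetic-geometric mean. The function name `ellipk` here
is just a label for this sketch, echoing the common library naming.)

```python
import math

def ellipk(m: float) -> float:
    """Complete elliptic integral of the first kind K(m), computed via
    the arithmetic-geometric mean: K(m) = pi / (2 * AGM(1, sqrt(1 - m)))."""
    a, b = 1.0, math.sqrt(1.0 - m)
    while abs(a - b) > 1e-15:       # AGM converges quadratically
        a, b = (a + b) / 2.0, math.sqrt(a * b)
    return math.pi / (2.0 * a)

# K(0) is exactly pi/2, since AGM(1, 1) = 1.
assert abs(ellipk(0.0) - math.pi / 2) < 1e-12
```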

~~~
cbd1984
How many people know how to multiply two numbers expressed in Roman numeral
format without reference to an algorism (not a typo!) or other methods based
on Hindu-Arabic numerals?

How many people are fast at computing fifth roots without recourse to
computational tools such as Hindu-Arabic numerals?

Are those things math or arithmetic?
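(One classical answer to the first question, sketched here as an aside: the
doubling-and-halving method, often called Russian peasant multiplication,
works without positional numerals at all, since it only needs halving,
doubling, and adding, all of which are easy with Roman numerals.)

```python
def double_and_halve(a: int, b: int) -> int:
    """Multiply a * b by repeated halving of a and doubling of b,
    summing the doubled values on the rows where a is odd."""
    total = 0
    while a > 0:
        if a % 2 == 1:   # odd row: keep this doubled value
            total += b
        a //= 2          # halve, dropping any remainder
        b *= 2           # double
    return total

# double_and_halve(13, 17) == 221
```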

~~~
knodi123
> multiply two numbers expressed in Roman numeral format without reference to
> an algorism (not a typo!)

Thanks for making me waste my entire afternoon on wikipedia.

------
ripter
To make it readable, paste this in the console:

    
    
        document.querySelector('.date-outer').style.backgroundColor = 'white';

~~~
thrownaway2424
4real. What a color scheme!

------
nemoniac
Whatever about mathematics, this is certainly true of computing.
Well-understood ideas are continually being reinvented, frequently badly. New
programming languages and frameworks spring up like mushrooms, and everyone
wants to jump on board the next big thing.

Nowhere is it more true that those who don't know the past are condemned to
repeat it.

------
spiritplumber
Most people use math less and less (even your average cashier will have issues
if the machine isn't working).

Will Myron Aub give us the feeling of power back?

[http://downlode.org/Etext/power.html](http://downlode.org/Etext/power.html)
by Isaac Asimov on just this topic.

~~~
LimpWristed
I am a cashier. Never use math. I doubt anyone outside academia and parts of
industry ever uses math in its proper sense.

~~~
JadeNB
It may be that there are few industries that require it professionally, but
just because one's _profession_ doesn't require it doesn't mean that one
can't, or doesn't, use it. Consider, for example, Fermat, Young, and legions
of other, perhaps less well known, amateur mathematicians (for example,
Garfield
([http://en.wikipedia.org/wiki/Pythagorean_theorem#cite_ref-Garfield_22-0](http://en.wikipedia.org/wiki/Pythagorean_theorem#cite_ref-Garfield_22-0))
and Napoleon
([https://en.wikipedia.org/wiki/Napoleon%27s_theorem](https://en.wikipedia.org/wiki/Napoleon%27s_theorem)))!

------
agentgt
I read the SA article the blog refers to, and I couldn't decide whether that
particular colossal theorem on symmetry was just an isolated incident or
whether "deterioration" is really happening to many disciplines/theories of
math. It is certainly an obvious fact that things become popular, then
eventually forgotten, and then sometimes brought back. There are also
different levels of understanding: breadth vs. depth. I recall at one point
there was concern about the opposite, that is, too much depth and not enough
breadth (the above theorem is a depth problem, as many mathematicians know of
the theorem, just not the exact proof).

I still think the unpublished-results problem, i.e. "publication bias", is a
bigger issue, which I suppose is in a similar vein. Supposedly Google was
working on that.

~~~
anatoly
The theorem the SA article is talking about - CFSG, the Classification of
Finite Simple Groups - is somewhat special in that respect. Lots of things in
math fall out of fashion and get forgotten, often whole subfields. CFSG is
different because the theorem itself is so basic and important that it's not
likely to be forgotten in any foreseeable future. But its _proof_ is so long
and complicated that it's not even clear that there's any one person who
understands all of it, and the heap of details is not organised well enough
for someone to just study it from books/articles without the help of people
who lived through proving it back in the '70s.

Suppose there just isn't enough interest in the younger generation of
mathematicians to study _the proof_ , even if the old guard are able to
organize it better before they retire. Then we may reach a situation in which
CFSG will still be used as a proved theorem and not a conjecture - because
it's so powerful and important in many fields of math - but its proof will be
lost to collective memory. I'm not sure, but I think that state of affairs
might be without precedent.

(Here's a quote from Gian-Carlo Rota's _Indiscrete Thoughts_ on forgotten and
rediscovered math:

"The history of mathematics is replete with injustice. There is a tendency to
exhibit toward the past a forgetful, oversimplifying, hero-worshiping attitude
that we have come to identify with mass behavior. Great advances in science
are pinned on a few extraordinary white-maned individuals. [...]

One consequence of this sociological law is that whenever a forgotten branch
of mathematics comes back into fashion after a period of neglect only the main
outlines of the theory are remembered, those you would find in the works of
the Great Men. The bulk of the theory is likely to be rediscovered from
scratch by smart young mathematicians who have realized that their future
careers depend on publishing research papers rather than on rummaging through
dusty old journals.

In all mathematics, it would be hard to find a more blatant instance of this
regrettable state of affairs than the theory of symmetric functions. Each
generation rediscovers them and presents them in the latest jargon. Today it
is K-theory, yesterday it was categories and functors, and the day before,
group representations. Behind these and several other attractive theories
stands one immutable source: the ordinary, crude definition of the symmetric
functions and the identities they satisfy.")

~~~
reagency
How are categories equivalent to group representations?

~~~
j2kun
I agree. To some extent new generations invent notations that subsume existing
ideas, but I don't think they are claiming to reinvent symmetric functions.

------
sklogic
Exactly, this unfortunate "saturate and move on" pattern can be observed in
pretty much any area.

------
hayd
Ah, I remember a lecture course where we classified all groups of order up to
1000.

Joy.... "Like having your brains smashed out by a slice of lemon wrapped
around a large gold brick."

~~~
hayd
Not sure why the downvotes, it _was_ an interesting course: a lot of tricks to
show isomorphisms / reasons why certain groups couldn't exist. It just went up
to order 1000, which was a _lot_ of cases (30 lectures + exercises worth).

