What does it feel like to do maths? (2016) (maths.org)
72 points by notagain on Dec 7, 2022 | 19 comments



I used to walk to work. It took 40 minutes, and with nothing else to occupy my mind I would pick some math problem and see what I could do with it.

I remember ducking into the library one time when I was almost at work to see if my solution was the same as the classical one. It wasn't! I knew then the joy of discovery.

Of course I wasn't the first to think of it but that didn't matter, the joy is the same.


>Of course I wasn't the first to think of it but that didn't matter, the joy is the same.

Exactly right.

When I was in undergrad physics, I signed up for the intro comp-sci course (which was recommended back then but not yet required). First we learned the basic syntax of C, and in the second unit we learned about sorting algorithms. The prof introduced us to bubble sort, selection sort, and insertion sort, and also to the idea that the efficiency of a sorting algorithm was proportional to how many comparisons it required.

Being a physics student, I knew that there were physical processes, like driven granular systems, that sorted objects by size without mathematically comparing anything at all. I wondered if I could beat the efficiency of all three of those algorithms by programming the computer to do something similar, treating integers like differently-sized grains of sand.

Over the weekend I first coded an algorithm I called jostle sort that sent the unsorted integers flying across a 2D array a distance equal to their magnitude, then picked them up again from left to right, top to bottom. This worked, but slowly because it had to look at every cell of the 2D array.
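Roughly, the idea was something like this (a sketch reconstructed from memory rather than my original code; how duplicates stack and the exact pick-up order are guesses, and it assumes non-negative integers no larger than some max_value):

    #include <stdlib.h>

    /* Rough sketch of the "jostle sort" idea: each value flies into the
       column equal to its magnitude, duplicates stacking in successive
       rows, then everything is picked back up by scanning every cell of
       the 2D array -- which is why it is slow. */
    void jostle_sort(int *a, int n, int max_value)
    {
        int rows = n, cols = max_value + 1;
        int *grid  = malloc((size_t)rows * cols * sizeof *grid);
        int *depth = calloc(cols, sizeof *depth);   /* values landed per column */

        for (int i = 0; i < rows * cols; i++)
            grid[i] = -1;                           /* -1 marks an empty cell */

        for (int i = 0; i < n; i++) {
            int v = a[i];
            grid[depth[v] * cols + v] = v;          /* fly a distance equal to the value */
            depth[v]++;
        }

        int k = 0;                                  /* pick the values back up, */
        for (int c = 0; c < cols; c++)              /* visiting every cell of the grid */
            for (int r = 0; r < rows; r++)
                if (grid[r * cols + c] != -1)
                    a[k++] = grid[r * cols + c];

        free(grid);
        free(depth);
    }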

Thinking about it further, I imagined tipping the 2D array up from the bottom edge, so that all the integers in a column slid into a pile in the top row. I realized I only needed the count of how many integers ended up in each column, not the columns themselves. I could just initialize a 1D array of zeroes the size of the max value to be sorted, and increment the value at index n whenever I encountered n in the unsorted array. This worked really well, easily beating the other three algorithms on the criteria we'd been asked to test on our homework assignment.
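In modern terms it looked something like this (again a sketch rather than my original homework code; the name is a placeholder, and it assumes non-negative integers bounded by max_value):

    #include <stdlib.h>

    /* The 1D version: count how many values land in each "column",
       then read the piles back left to right. */
    void pile_sort(int *a, int n, int max_value)
    {
        int *count = calloc(max_value + 1, sizeof *count);

        for (int i = 0; i < n; i++)
            count[a[i]]++;                  /* pile height for each value */

        int k = 0;
        for (int v = 0; v <= max_value; v++)
            while (count[v]-- > 0)
                a[k++] = v;                 /* read the piles back in order */

        free(count);
    }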

I'd invented counting sort.

I would later learn that I wasn't the first, but that didn't lessen my pleasure. It's still my favorite sorting algorithm, because of that joy of discovery.


> Of course I wasn't the first to think of it but that didn't matter, the joy is the same.

Thank you for emphasising this key point! There's pleasure and there can be prestige in being the first to solve a mathematical problem, but there's joy and accomplishment just in solving the problem yourself, even if your approach is exactly the classical one. I am a rock climber as well as a mathematician, and I compare it to that: the first person to summit a route will get the big-name recognition, but it's still worth it for me to come along and summit myself later, even if I use exactly the same sequence as tens of people who have gone before me.


I always recommend the book "Love and Math", by Edward Frenkel. The book is fairly approachable I think, and it mixes his own biography with examples of interesting math concepts and connections.

Funnily enough, Frenkel works in the Langlands program that Wiles talks about here, and the book explains a little more about it and why it is interesting.

Also, I really like this quote from the article:

> Yes, some people are brighter than others but I really believe that most people can really get to quite a good level in mathematics if they're prepared to deal with these more psychological issues of how to handle the situation of being stuck.

I have found this to be true in my experience. Sometimes people find maths hard at the beginning because it's a different way of thinking, much more abstract, and you don't usually have an intuition when you start. But in the end, most of it is practice. Also, some things only click when you go back to them, or need them for more advanced concepts.


The big thing I always wondered about was: what do you do when you are stuck on something that nobody else has an answer to?

I did contest math as a kid, loved it, but of course there's always an answer. Often if I didn't get the answer, there would be some kind of "oh yeah why didn't I think of that" when I heard the solution. So you would always be bailed out eventually. But also sometimes I'd get this bolt from the blue and solve the thing. It would be some weird little trick that didn't seem to come from anywhere or lead anywhere other than this one solution. I'd also see these bolts in other people's solutions. That is both satisfying and insanely frustrating, because how do you know you'll keep thinking of these things?

At some point, you will hit a rock that nobody you know, even really smart people who have done this their whole lives, can jiggle. What do you do?

I also wonder how it works when working on "proper" mathematical problems. Are people who do that basically experts at the little problems? Or is it like coding, where you can design a fairly large system and rely on the parts to come together one way or another? Because when I read a math text, it seems really quite dense with things that require a lot of details to be correct. If it's a house of cards spanning a hundred page thesis, how brittle is it?


I recently read a 900-page biography of Newton, which includes a lot of detail about his mathematical accomplishments (book link: https://www.amazon.in/Never-Rest-Biography-Cambridge-Paperba...).

Key things that stand out:

+ Deep scholarship: learn from your predecessors as much as possible and pick up their techniques, but at the same time be sceptical; do not believe too much, even when you're learning/mastering those techniques. Ex: Newton mastered Descartes' work early in his career and used that mathematics to open up calculus, but at a later stage he became disillusioned with the analytic method and converted his thinking to purely geometric methods.

+ Imitate the masters before you: Newton was really one of the deepest students of the ancients. He'd ferret out obscure passages and use them to solve his problems. Ex: he knew the intricacies of the analysis/synthesis method from Pappus, which most people barely know at the depth he did.

+ Dedication to truth, desiring certainty and taking nothing less. Insanely high standards.


> Imitate the masters

«Ved å studere mesterne, og ikke deres elever»


Google Translate says "By studying the masters, and not their students"; apparently it's Norwegian, and apparently a quote from the mathematician Abel (whose name I know from Abelian groups, but who is probably better known to some here).

en.wikipedia.org/wiki/Niels_Henrik_Abel


"Read the source, Luke"

This one came to mind, which I think captures the same idea


> At some point, you will hit a rock that nobody you know, even really smart people who have done this their whole lives, can jiggle. What do you do?

Keep working. Or leave it, come back after some time (hours, days, weeks, months... even years).

> Are people who do that basically experts at the little problems? Or is it like coding, where you can design a fairly large system and rely on the parts to come together one way or another?

I haven't worked much in pure math research, but from my understanding it's a process of adding details. You have something you want to prove and a base you start from. You have an intuition about what you need to be true in order to prove the thing, so you start developing those proofs too. Along the way you'll find you need certain conditions, and you might find that certain concepts reappear, so it's helpful to define them.

> Because when I read a math text, it seems really quite dense with things that require a lot of details to be correct. If it's a house of cards spanning a hundred page thesis, how brittle is it?

While technically it is a house of cards (i.e., if a proof is wrong everything that depends on it fails), most of the time you won't find such fundamental errors. Maybe a proof has some missing details that can be amended, or the same statement can be proven another way, or the theorem needs just tiny changes or extra conditions to be fixed.


> even years

the tide comes in for all problems, but for some more rapidly than others:

cf https://en.wikipedia.org/wiki/Parallel_postulate#History

    Problems worthy
    of attack
    prove their worth
    by fighting back.


For some reason I associate that rhyme with Piet Hein, based on the game Soma.


For the reason that it (along with many others, eg https://www.phys.ufl.edu/~thorn/grooks.html ) is his.


You solve it yourself; that is all there is to it. I have solved novel math problems, and all you do is think about the problem until you have solved it. There are many strategies for doing that, but in the end it's all you, thinking about different ways to solve it.


What certain long term math problems have felt like to me:

https://www.sargentsfineart.com/img/nisla/all/nisla-perplexe...

“Perplexed” by Nisla.

I have a print hanging in my office.


Related:

Andrew Wiles: what does it feel like to do maths? - https://news.ycombinator.com/item?id=13116335 - Dec 2016 (71 comments)

Andrew Wiles: what does it feel like to do maths? - https://news.ycombinator.com/item?id=13088294 - Dec 2016 (1 comment)


That was a deeply beautiful interview. Such profound yet relatable thoughts, humbly expressed.


In my experience it's like solving puzzles. Maybe like getting really into Go or some similar game.


Like being stuck in mud.



