Mathematics as thought (aeon.co)
147 points by magoghm 59 days ago | 78 comments



I was a bit disappointed that this article did not offer more practical examples of the ties between the development of mathematics and the history of ideas in general.

For instance, the vision of mathematicians like Hilbert and Russell of a cohesive mathematics defined from first principles seems a very Victorian notion that the universe is knowable if you simply search hard enough. It's the same Enlightenment frame of mind that resulted in intellectual creations like the Oxford English Dictionary.

Similarly, the results of Gödel, Church, Turing, and others showing the limits of mathematical consistency seem of a piece with the confounding discoveries of quantum physics, which replaced the static models of classical physics. They seem to correspond with a darker vision of the limitations of human intellect that emerges in abstract art and in poetry like Eliot's 'The Waste Land' around the same time.

If you push this too far you end up sounding like an idiot (perhaps I'm already there), but there are clearly discernible connections across science, art, literature, and politics, both now and deep into the past. It might be a personal projection, but Brahms concertos always strike me as the music of a people who believed in an orderly, Newtonian universe and upheld the static political order of the Ancien Régime.

At the end of the day most intellectuals read the same books and attend the same salons, however defined. There is continuous cross-fertilization between different realms of thought. The resulting connections are there if you just look for them.



Edit: 'Brahms' should be 'J.S. Bach'.


"How mathematics works rests on no absolute timeless standard, despite what many assume today, given its precision and efficacy"

Exactly, this is key to teaching people to like math: stop trying to train me in it. It's like math has become so saddled with politics or ego that people want to train everybody in the same system of values, and they are willing to lie to get you to use the same symbols, the same way of thinking, etc. Disclaimer, I'm really bad at math. But I hate how math and physics are taught, it should always be taught from a spirit of exploration as if you are discovering new islands or continents and you get to name the things you discover.

Try getting a math professor to admit that he doesn't actually have an intuitive understanding for any of the stuff he teaches. I believe the vast majority probably don't have any such understanding but they are understandably scared to admit it. It's a cycle of lies.

Who are you to say that I have to use that particular symbol or term to describe this math? People have a large stake in interoperability between mathematicians so they teach us these symbols as if they are facts of nature when in reality they are just arbitrary drawings that people invented to name stuff that they found in nature. Fields of math stack on top of each other because they happen to work together but in reality we don't know why. They just do and it's interesting that they do so we keep doing it that way because it works and hasn't broken yet and it's useful. There were cultures that did math with a completely different set of symbols, not just symbols but even way of thinking. Multiplication, addition, entire base ideas were thought of differently. Yet our current way of doing math is taught to us as dogmatic, accept it or you are a troublemaker. No wonder people don't like math, nobody wants to be a slave to somebody else's ideas and value system.

Every math class should start with something like "This is the symbol for addition: '+'. This was invented by some person, you could use something else, you could try to find some other way to add numbers but the way we teach you to think has proven to be fairly fast and convenient so we teach it." That in my opinion is how you make people interested in math because you treat them as equals instead of subjects to be trained in your favorite cultural way of doing math. Math should be seen like a weird natural phenomenon that we observe, from the get-go. From childhood on. Explain it like that and I believe kids (and adults) will love to discover more about it.


> Try getting a math professor to admit that he doesn't actually have an intuitive understanding for any of the stuff he teaches. I believe the vast majority probably don't have any such understanding but they are understandably scared to admit it.

Mathematicians are famous for embracing the fact that they have no intuition. For example, here's a famous quote from Geoff Hinton:

To deal with hyper-planes in a 14-dimensional space, visualize a 3-D space and say 'fourteen' to yourself very loudly. Everyone does it.

And here's a famous quote from G. H. Hardy:

In mathematics, you never understand things; you just get used to them.

The idea behind both of these quips is that there is no intuitive way to visualize these weird mathematical objects. The main reason people struggle with math (in my experience) is that they assume there must be an intuition. There's not. Math is weird. And all mathematicians feel that way about math.


>Mathematicians are famous for embracing the fact that they have no intuition.

That's just plainly false. Mathematics is all about intuition.

That's how we all know that the Riemann hypothesis is true, regardless of whether there's a proof of it. Or that Fermat's Last Theorem holds.

That's how we still do math, even though we never really had a solid system of axioms for it -- and Gödel showed that, in some sense, it's not even possible.

That's how even famous theorems were proven after many attempts, with people accepting proofs with errors in them.

That's how Calculus was invented before the notions of the limit, derivative, and integral -- the most fundamental ones! -- were solidly written down. Newton and Leibniz built up the math upon heresy, and everyone accepted it, because it felt right.

What you are describing is the surprise that mathematics still gives to practicing mathematicians.

And your first quote is a trick on how to extend one's intuition, not abandon it!


We know that FLT is true because Andrew Wiles has proven it. We don't know that RH is true. We have strong reasons to believe it is true, since we have directly found a large number of nontrivial zeros on the line Re(s) = 1/2 and none anywhere else. I do believe intuition exists in the process of doing mathematics, but I don't think either of these are good examples.


How about in the process of reading a proof? Each single step is validated by the intuition of the reader.


I have to disagree. The lesson I have drawn from studying mathematics is that intuition cannot be trusted and that even obvious things must be proven. Or you must admit they cannot and convert those things to axioms.

Russell's Paradox [0] is a good example of intuition going wrong in the field of set theory. As I recall, Zermelo-Fraenkel (ZFC) set theory eliminated the known paradoxes of naive set theory, but it took decades to complete. There are formally undecidable propositions in ZFC, but that's different (to me at least) from incorporating outright contradictions.


> And your first quote is a trick on how to extend one's intuition, not abandon it!

Can you elaborate on this? Do you mean that the point is 'to free up your intuition, abandon your hope of truly visualising this stuff, and make do with a hybrid of simplified visualisation and abstract thought'?


In a way, yes.

But also: it is surprising how well building up the intuition in 3 dimensions works for N dimensions. You can build up a working visual intuition for Linear Algebra just by doing everything in 3D, for example. There's a textbook that takes this approach: "Practical Linear Algebra".

But, of course, this comes with a caveat: it works until it doesn't. The volume of the unit ball increases for dimensions 1-5, then starts to go down [1]. There exists an exotic R^4: a manifold that is homeomorphic, but not diffeomorphic, to R^4 [2]. And so on.
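(You can check the ball-volume claim numerically; a quick Python sketch using the closed form V_n = pi^(n/2) / Gamma(n/2 + 1):)

  # Volume of the unit n-ball: V_n = pi^(n/2) / Gamma(n/2 + 1).
  # The values climb up to n = 5 and shrink from there on.
  from math import pi, gamma

  for n in range(1, 11):
      v = pi ** (n / 2) / gamma(n / 2 + 1)
      print(f"n={n:2d}  V={v:.4f}")
  # Peaks at n = 5 (V ~ 5.26); by n = 10 it's down to ~2.55 and falling.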

But for a lot of (intro) differential geometry and topology, (e.g. concepts like covering spaces), I haven't found many examples where intuition based on 3 dimensions breaks.

A friend of mine[3] once told me that geometry is the art of getting correct results from incorrect diagrams. I think that the quote we're discussing is in a similar spirit.

[1]https://en.wikipedia.org/wiki/Volume_of_an_n-ball

[2]https://en.wikipedia.org/wiki/Exotic_R4

[3]Dmytro Karabash


> To deal with hyper-planes in a 14-dimensional space, visualize a 3-D space and say 'fourteen' to yourself very loudly. Everyone does it.

Another version that I heard is: "Start with visualizing an N-dimensional space, then let N be 14."


> In mathematics, you never understand things; you just get used to them.

That was von Neumann.


He was a notorious formalist, and a damn good one; but it's hardly the only style of mathematics. Compare with Grothendieck, who also invented a whole new field of mathematics but had a completely different approach: saturate oneself in the problem until one understands it, then write the solution down.

http://www.landsburg.com/grothendieck/mclarty1.pdf


Have you considered that this perspective is also an almost perfect ego preserving technique?

It's not my fault I'm bad at math; it's just that those other guys have memorized the language. I just think different, so of course I wouldn't be suited to the normal way of doing things. If only someone had nurtured my special way of thinking and treated me as the individual that I am, I'd surely pass by those who so readily conformed to the standard without really understanding. We should teach math with a childlike curiosity, because children have infinite potential (and so do I).

I also suck at math. I have had these thoughts. I don't think they are healthy or constructive; they probably come from a place of pride and ressentiment. I'm not saying you have these thoughts or that there is a direct connection between what you wrote and what I did, but it certainly reminded me of that thought pattern.


If math is invented, math is the union of thought, in relation to an objective measurement of reality, because the language expresses consistent and testable truths that can be observed and measured.

If math is discovered, math is the reality, that is tested against itself.

As for thought patterns and special ways of thinking - there's nothing wrong with nurturing an interest.

I think any thought patterns that exaggerate or minimize one's ability in math simply get in the way of doing math, eventually. Pure mathematics is done for itself. Applied mathematics is done for something else.

These things are so simple when they can cleanly be separated in descriptive rhetoric, but in the actual domains of math, they could not be more complicated.


Mathematics is invented - it is only an approximation to reality. The "truths" of mathematics are what you start with - I am finding that looking at the Metamath system is a good way to see this.

It is a very useful tool in many areas but is just plain useless in others. Unfortunately, in my little opinion, there are too many people who give too much credence to the use of mathematics to discover the "truths" of the universe. They have an idealised perspective as to its efficacy instead of having a more pragmatic perspective.


I don't know if you can concretely say that mathematics is invented. Where does it come from? You observe reality, but what if reality is math?

You can't say it's outside the realm of possibility that some people have a base observation that rests more on math than on anything else.

Your transition into pragmatism is appropriate for your argument, but I don't know if it's appropriate for mathematics that concerns computation. The universe in the argument I am constructing would be the universe of math, not the physical universe. In that regard, I think pragmatism can still be held as a primary value that can realistically be met.


Where does mathematics come from? Good question. How about a not so satisfactory answer - we like to make sense of the world around us and we also have the ability to map the world around into patterns.

With that, we try to put what we observe into some possibly useful form. Now those attempts at making patterns can take a number of different forms, of which one is mathematics.

In the case of mathematics, we create axioms as a basic framework on which we build the various mathematical edifices that we use. We have applicable rules that are used and from that people come up with all the various fields in mathematics that exist today.

However, as part of a challenge that I accepted some time ago, I am finding that everything we use in mathematics is but a map of reality and not reality itself. No matter what mathematics one uses in whatever area one uses mathematics, it is only an approximation to the reality around us. When the rubber hits the road, our mathematics often fails us. Forty years ago, one of my engineering lecturers advised all of us students at the time that though mathematics could give precise answers, reality had a penchant for not being precise at all, as in, there is much variation in the physical mediums in which we would work. Always treat all mathematical results obtained as only being approximate and design for the worst case scenario.

Every map in existence is but an inaccurate description of the observable universe. From the very small to the very large, from the very simple to the very complex, mathematics gives us a possible way of inadequately describing what we see. Though, in its inadequacy, mathematics provides a means of studying and manipulating the universe around us from which we can get useful results.

So we have all sorts of interesting technology and science with which we manipulate our environments.

The problem I see is that there are those who attribute an unwarranted "honour" (I suppose) to mathematics without stopping and seeing its limitations and that it is a tool which we have created or invented so that we can make sense of the universe around us. Studying the Metamath system is interesting for me in that by using some very simple rules we can build all the mathematics of today.

What I find interesting in your last paragraph is that you, yourself, seem to make the distinction between reality and mathematics, which would indicate, at least to me, that you recognise that it is but a tool created for a use and not something that had a pre-existence that we could discover. However, I may be completely misunderstanding your argument at this point.


>Every map in existence is but an inaccurate description of the observable universe.

That's the thing though: mathematics can be inspired by the world around us, but more often than not, it does not aim to describe it.

Mathematicians working, say, on exotic 4-manifolds are not concerned in the least with how well these structures describe anything out there in your "real world". The short answer is: they don't. The longer answer is: 4-manifolds are the real world, or rather a part of it; perhaps the only way you can observe and interact with that part of the real world is through mathematics.


> What I find interesting in your last paragraph is that you, yourself, seem to make the distinction between reality and mathematics, which would indicate, at least to me, that you recognise that it is but a tool created for a use and not something that had a pre-existence that we could discover.

No, I continue to express uncertainty and doubt concerning the origin of math. If math is the ground for some people, math does not have an origin. Math is the origin.

I understand that may be a very difficult paradigm to think with if you are not used to thinking in that framework, but I am like that. Math, ground. Reality is defined by math. We may be incorrect in the translation, but that's a work in progress. Math that comes from reality is math that is incomplete.

I can't give you a question that yields a satisfactory answer until it does.


Fair enough.


Thank you, that means a lot.


Was this computer-generated?


Is this some variant of nerd bullying?


I'm sorry, but your complaint makes as much sense as telling a coach that it would be easier to get people into fitness if they weren't made to do pushups because pushups are unpleasant.

There is no way to acquire complex concepts without acquiring a language that can state those concepts. This takes work and discipline. It requires training. And if you're going to train someone anyway, it would be monumental idiocy to allow them to be trained in a way that won't let them communicate with others.

If someone refuses to go through the training, the result is that, like you, they can't learn to be good at math. Just like you won't become physically fit without a certain amount of work.

Now the training can be made more interesting. There are ways to improve it. But asking for no training, and instead just have people observe it as a weird thing from the outside? That will no more build the mental pathways that let you do math than watching people weight lift will make you stronger.


This strikes me as fairly unreasonable.

The full range of human thought is probably possible to express in the language my sibling and I made up as children, but I'm not arrogant enough to assert that my teachers are all incompetent liars just because they refuse to grade essays written in our made-up language.

Every mathematics professor I've ever known would readily agree that part of teaching mathematics is teaching how to communicate ideas using the established linguistic customs of modern mathematics. That's reasonable, just like it's reasonable that literature professors insist on essays written in modern English.


There is a form of this critique which is correct, but requires a lot more mathematical sophistication than you can get from starting from this viewpoint.

For example, Fields Medalist Terence Tao essentially believes the same thing, but sees it as an extension of having gone through a phase of rigour to understand how in particular to ignore the details of a system: https://terrytao.wordpress.com/career-advice/theres-more-to-...

And modus ponens, modus tollens. Consider the view that if any approach to mathematics is valid, including your idiosyncratic version, then the style they teach is going to be at least as valid. To actually understand why you would want to choose one mathematical system over another requires familiarity with both systems, and so some flexibility is demanded of you.


"Master your instrument, master the music & then forget all that & just play." ~ Charlie Parker


I would trust a math professor to have an intuitive and abstract understanding of the kind of algebraic or notational concerns you have. Math people reuse symbols and invent new algebras all the time, so they don't hold these symbols (or their syntax) sacred at all. There is, in fact, a healthy professional irreverence.

Perhaps you would have benefited from a constructivist approach to mathematics, which is far less "magical", but most math is taught for engineering, so it's understandable why math professors teach differently.

* https://en.wikipedia.org/wiki/Church_encoding
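For a taste of what the linked Church-encoding style looks like, here's a minimal sketch in Python (an illustration of the idea, not code from the linked page):

  # Church numerals: numbers represented as higher-order functions.
  zero = lambda f: lambda x: x
  succ = lambda n: lambda f: lambda x: f(n(f)(x))
  add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

  def to_int(n):
      """Collapse a Church numeral to an int by counting applications."""
      return n(lambda k: k + 1)(0)

  three = succ(succ(succ(zero)))
  assert to_int(three) == 3
  assert to_int(add(three)(three)) == 6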


> Try getting a math professor to admit that he doesn't actually have an intuitive understanding for any of the stuff he teaches.

I have an intuition for everything I teach.

I cannot speak for others, of course, and I don't teach 14-dimensional stuff, myself, so maybe there are areas where no intuition is possible but if I don't have an intuition then I keep plugging until I get one.


I am in the process of writing a step-by-step elementary algebra equation solver that solves equations the way a human does. One thing I have learned from this experience is that a good way for a person to discover how much they don't know about even the simpler parts of mathematics is to try teaching it to a computer. This leads me to suspect that Meai is correct about most mathematicians not having a complete understanding of the material they teach.


> mathematicians not having a complete understanding

Oh, I'm happy to concede that there is always more to know. But that is a far cry from the poster's assertion that people teaching the class do not have any intuition about what they are teaching.


Please do share. Have you looked at the various CASs that exist to see how these are implemented?


I am the main developer of the MathPiper CAS (http://mathpiper.org), and I forked it from the Yacas CAS in 2008. MathPiper is a rewriting system that is similar in design to Mathematica. Most CASs don't solve elementary algebra equations the way humans solve them by hand, so these systems can't show the steps they took in a form that is easily understandable by a human.

However, a few CASs do solve elementary algebra equations the way humans do, and the most famous of these is probably PRESS (Prolog Equation Solving System). The story of PRESS's development is an interesting one.

In the 1970s and 1980s, a group of artificial intelligence researchers led by Dr. Alan Bundy at Edinburgh University conducted research on how computers can be "taught" to solve elementary algebra equations the way humans do. The first thing they did was to try and figure out exactly how mathematicians do mathematics. They were surprised to learn that a significant number of the techniques that mathematicians used to perform mathematics were not written down anywhere. They were not in any textbooks, nor were they in any journals or research papers. As the researchers dug deeper, they discovered that the techniques did not have names, and they were not taught explicitly. The researchers concluded that mathematicians were using these techniques unconsciously. (Alan Bundy. The Computer Modelling of Mathematical Reasoning. Academic Press, 1983, p.164.)

Why were these researchers the first people in history to discover this information? I think it’s because computers were the first "students" in history that absolutely refused to learn any mathematics that was not taught explicitly. The researchers then devoted years of effort to discovering and naming the unwritten techniques that mathematicians used to perform mathematics. When they "taught" PRESS these techniques, it was able to perform mathematics similar to the way humans typically would.

The step-by-step solver I am building is based on PRESS, and here is an example of what I have working so far:

https://www.youtube.com/watch?v=cy6bwNBkAK0
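To make the "solve it like a human" idea concrete, here is a toy sketch (my own illustration, not MathPiper or PRESS code) of the kind of step-recording isolation such a solver performs:

  # Toy "isolation" solver: undo the operations around x one at a time,
  # recording each line of working the way a student would on paper.
  from fractions import Fraction

  def solve_linear(a, b, c):
      """Solve a*x + b = c, returning the root and the written-out steps."""
      steps = [f"{a}x + {b} = {c}"]
      rhs = Fraction(c) - Fraction(b)   # subtract b from both sides
      steps.append(f"{a}x = {rhs}")
      x = rhs / Fraction(a)             # divide both sides by a
      steps.append(f"x = {x}")
      return x, steps

  x, steps = solve_linear(2, 3, 11)
  print("\n".join(steps))   # 2x + 3 = 11, then 2x = 8, then x = 4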


Thanks. I haven't come across PRESS before in my various investigations, nor have I come across MathPiper either. I'll have a close look at these to see what more I can learn.


Reminds me of when Feynman invented his own trigonometric notation (but then later switched to the notation everyone else uses, just for the sake of making himself understood).

https://www.cambiaresearch.com/articles/83/rule-6---adapt-an...


Math is not a natural phenomenon; it's a social construct: methods and definitions that some people find convenient. It's invented, not discovered. Realizing that helped me grok linear algebra and statistics.


What you are complaining about is that you are being taught:

a) a system of mathematics,
b) the wrong one,
c) in an extremely disconnected fashion.

I have a particular gift for mathematics. But it drove me nuts trying to go through a Western curriculum. Often, everything was backward, over-simplified to the point of stupidity, or just straight....


As someone with a math PhD who has taught mathematics at a university level, I agree with your sentiment, but your ire is misdirected.

-----------------------------------------------

First, let me highlight what I agree with:

>But I hate how math and physics are taught, it should always be taught from a spirit of exploration as if you are discovering new islands or continents and you get to name the things you discover.

Absolutely.

>Who are you to say that I have to use that particular symbol or term to describe this math? [..] they are just arbitrary drawings that people invented to name stuff that they found in nature.

Indeed, the notation is just a choice - and a tool.

>It's a cycle of lies.

It's an open secret :)

>Every math class should start with something like "This is the symbol for addition: '+'. This was invented by some person, you could use something else,

Absolutely. Old texts didn't have the "=" sign, people would write "eq." or something like that -- "=" is a relatively modern invention.

And in math papers, notation is often invented on the spot. That's why papers often start by defining all the symbols -- otherwise, there might not be a way to know what "a * b" means. It can mean whatever the author felt like that day. Maybe it's a quandle product! Maybe multiplication. Who knows.

>you could try to find some other way to add numbers but the way we teach you to think has proven to be fairly fast and convenient so we teach it."

Absolutely. I would like to add: "..and here are five other ways you could do the same thing - can you find a sixth?".

>There were cultures that did math with a completely different set of symbols, not just symbols but even way of thinking.

That's why we teach different notation: it encourages a different way to think. For example, in Calculus, we have Leibniz's "d/dx" and Newton's dot notation for the same notion of derivative.
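Side by side, for instance (a small LaTeX fragment as illustration):

  % The same derivative in both notations: Leibniz's ratio-style d/dx,
  % and Newton's dot, traditionally reserved for time derivatives.
  \[
    \frac{dy}{dx} \qquad \dot{y} = \frac{dy}{dt}
  \]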

> Math should be seen like a weird natural phenomenon that we observe, from the get-go

That's one perspective that many mathematicians do have. We do teach it -- in philosophy; it's the Platonic Universe. "Is mathematics invented, or discovered?" is a never-ending debate.

>Fields of math stack on top of each other because they happen to work together but in reality we don't know why. They just do and it's interesting that they do so we keep doing it that way because it works and hasn't broken yet and it's useful.

More of the same - yes, that's the view that is quite common.

>Yet our current way of doing math is taught to us as dogmatic, accept it or you are a troublemaker. No wonder people don't like math, nobody wants to be a slave to somebody else's ideas and value system. That in my opinion is how you make people interested in math because you treat them as equals instead of subjects to be trained in your favorite cultural way of doing math. [...] From childhood on. Explain it like that and I believe kids (and adults) will love to discover more about it.

Yes, sadly, that's the problem with the way mathematics is often taught -- by people who either don't know better or are forced to teach it that way because they are a part of a system.

That's not how math professors are teaching it to their kids, I guarantee you that.

You would greatly enjoy reading Lockhart's "A Mathematician's Lament" [1], which shares your sentiment.

------------------------------ Now, my main point:

> Try getting a math professor to admit that he doesn't actually have an intuitive understanding for any of the stuff he teaches. I believe the vast majority probably don't have any such understanding but they are understandably scared to admit it.

First, no need to be aggressive.

Secondly, that's quite an accusation!

I am not going to refute it other than by saying that it's plainly false -- at least by the time you get to college.

At a high school level and earlier... the problem is that one often doesn't get to do mathematics unless they are in a research program for math. That is sad, and we end up with people who teach math and never got to do it -- because doing it is just doing what you are writing about: playing, exploring, being surprised by the fuckery of the universe.

What is true even at universities, though, is that the administration sets requirements on the curriculum and how it should be taught -- and there's, sadly, little room in it for intuition, vision, play, and experimentation, which are the only things that should be taught.

Many people are fighting to change this.

Finally:

>Disclaimer, I'm really bad at math.

You are not. You have the right perspective. Armed with it, you are equipped to learn it far better than many others. Let us (the professors, teachers, random people who know some math that you don't yet know, etc.) help you.

Ask any professor about any math concept in this way: "What's a way to look at it? How should I really see it?" - and their answer will go into everything you are talking about.

----------------------------

[1]https://www.maa.org/external_archive/devlin/LockhartsLament....


"And in math papers notation is often invented on the spot. [...] It can mean whatever the author felt like that day. Maybe it's quandle product! Maybe multiplication. Who knows."

This shows another longtime insight I think I have about math: tool-wise, we are in the stone age of math. Every formula somebody writes down should automatically be an executable program that can be debugged, or at the very least one that has intellisense/autocomplete/documentation pop up over its symbols. Forget executability for now but even just having a convenient way of parsing a formula, having its notation explained to you automatically... that would be so important. Also I don't think that a math notation which has symbols that don't even exist on ascii keyboards should be acceptable. To me, math notation is a programming language like any other, and it shouldn't be so hard to write it, find documentation for it, get autocompletion for it, and verify it.

I get that math was invented in the time of paper and pencil, but we need to find a way to write math more efficiently on the computer now so we can get better and faster at it, so people can copy and paste math on the internet without having to use picture formats or specialized software to render it.

You probably agree that the vast majority of math papers are never read by more than two people, the original author and the reviewer. Imagine the alternative: every paper could be like a programming library that should be plug&playable on the spot without even understanding it. Just hit that button on the top of the paper that reads "play" or "execute" and let's see the test cases/subproofs/partial proofs running green in a checklist. Then let's read the API docs to see which final formulas are usable to me now that you wrote this paper.

Whoever sits at these prestigious journals in my opinion is already being paid for this very job: to make sure that papers are accessible. Well, to me that also means development of tooling, conventions, etc. Not just a simple online search box where you might find a whole paper via a keyword.

"Is mathematics invented, or discovered?" is a never-ending debate."

I think this is the wrong focus, because I heard this question when I was younger and it made no sense to me. The act of asking it (often playfully, probably thereby already admitting that it's a wasteful question, which is surprising in itself) was confusing, because intuitively I think we all know the answer; it's obvious. But as soon as you ask, you imply that it's an open debate and thereby create confusion. Everybody already knows the answer: math is obviously an invention in most ways, because humans invented the symbols, but it's also obviously an approximation of something inherent in the universe. We may not know the exact percentages of how close we are to the universe's true inner workings, but that wasn't the question anyway.

I will read your link.


The act of formalizing mathematics in a way that is machine verifiable is an active area of research, but not one that is much of a concern to the majority of mathematicians. I will say that with the current state of the art, writing machine-verifiable proofs is often quite difficult and unintuitive, and not yet possible for many branches of math. The foundations upon which generations of mathematics are built do not lend themselves to machine-verifiable proofs. Placing these fields on such foundations is very non-trivial, but people are trying.
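As a sense of what "machine verifiable" means in practice, here is a tiny example in Lean 4 (an illustration of the genre, not tied to any particular formalization effort):

  -- A machine-checked proof: the kernel verifies every step.
  -- Commutativity of addition on Nat, via the standard library lemma.
  theorem add_comm_example (a b : Nat) : a + b = b + a :=
    Nat.add_comm a b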

To your points on notation, I think requiring math to use purely ascii characters would end up being more of a burden than you expect. There are a lot of concepts in math and having more characters to use to represent them in a small space is helpful. I would much rather see a phi (one char) than a word representing something because it's easier to parse. Succinct notation lets you abstract big concepts and express relationships between them in a big-picture sort of way without requiring multiple lines. Then it's easier to remember the resulting relation.

It's also worth noting that a piece of notation generally doesn't have one global use throughout mathematics. As long as notation is defined it's generally not a problem.


Ever try to freestyle your own symbols while writing code? It doesn't go so well. There is extremely little freedom in the characters that you can code with. Coding is also rife with abstraction and complexity. All of the problems and barriers to learning that you describe exist with coding and are arguably worse. Yet, like mathematics, computer science is budding and creative; you just need to drudge through some busy work before you get to the edges of current knowledge.

Your remark about professors not knowing what they teach in some sense is mean-spirited.

If you don't like how mathematics is written or taught then just write something else and use different language. All you really need to do is communicate your logical point. With computers, on the other hand, you can't just write your own language or run any type of software on your hardware.


>Forget executability for now but even just having a convenient way of parsing a formula, having its notation explained to you automatically... that would be so important

>To me, math notation is a programming language...

That's the thing though - it's not. It's a language, but not a programming language. It's not strict. It's by humans, for humans.

Why is it so clunky sometimes? Because explaining things is not easy. People are trying their best, but in the end, they get together after the conference talks over a glass of beer and go "Well, here's what's really going on there".

Have you ever been in a state where you know what you want to say, but just can't find the right words for it? That's the perpetual state of mathematical writing.

>You probably agree that the vast majority of math papers are never read by more than two people, the original author and the reviewer.

Unlikely. Usually, there are groups of people who get together at conferences and talk about what they do. Mathematicians rarely work in isolation.

>Every paper could be like a programming library

In a way, they are -- but the hardware is your brain. You can't make the computer do the work for you -- no more than we could improve on this very comment. In the end, the paper is communicating ideas.

Yes, there's work on people formalizing math to turn proofs into computer programs. The result is machine-verifiable, but unreadable - as is often the case with code anyway; without documentation, it's not easy to understand what the code is doing.

>Then let's read the API docs to see which final formulas are usable to me now that you wrote this paper.

But math is not about formulas. Often it's about concepts, constructions, patterns, ways of looking at things.

>We need to find a way to write math more efficiently on the computer now so we can get better and faster at it

I don't think tooling is the bottleneck, really. We have LaTeX, which is easy enough to use, in my opinion.
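(For anyone who hasn't used it: a complete LaTeX document for a display formula is only a few lines of source. The Gaussian integral, as a stand-in example:)

  % Minimal compilable document; the formula itself is three lines.
  \documentclass{article}
  \begin{document}
  \[
    \int_{-\infty}^{\infty} e^{-x^2}\,dx = \sqrt{\pi}
  \]
  \end{document}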

---------------- That said, one person that would agree with you is Stephen Wolfram.

Mathematica Notebooks are pretty much exactly what you describe: they are executable papers, where you can mix text with computations and code, etc.

One reason they are not the norm is that Mathematica is a product that costs $$$ (although the engine is free with the Raspberry Pi, and the Notebook reader is free, IIRC). They are going the way of the cloud now, though.

The competing FOSS solution is SageMath: http://www.sagemath.org/ -- but it's more code-for-math than the concept you describe. Mathematica hits it on the nose.

It's been around for a while, but still didn't really catch on.

Part of it is proprietary format, part of it is inertia -- but part of it is that it's often not what the authors want.

What the authors want is to tell a story, not create an executable object. The symbols are just crutches.

I think the real problem is that the storytelling aspect is thrown away in many papers, leaving room only for the result. People don't like showing the dirty work, the unfruitful steps, the thinking that brought them there. The informal, gritty stuff. (They leave it for beers after the talks.)

But that's not how it used to be. I was trying to learn about quaternions one day, and found the original lectures by Hamilton, their inventor. It read like a novel. That's how math writing should be. It degenerated in the last 100 years or so, but it's coming back to life now, I think.

>Whoever sits at these prestigious journals in my opinion is already being paid for this very job: to make sure that papers are accessible.

HAAHAHAHAHAHAHAH. HA. HA.

Sorry, my friend, let me ruin your world view here.

First, effectively nobody sits at the journals. Mathematicians write papers; other mathematicians review them, voluntarily. The journals are often little more than matchmakers. That's why we are having a revolution of sorts now: people are starting to ask why we need the journals in the first place. And some people outright believe that we don't -- that arXiv (the website where mathematicians put their papers without review) is enough.

Secondly, nobody gets paid. Mathematicians don't get paid to write papers, reviewers don't get paid to review. There is an immense pressure to publish, but it's not like one gets paid per paper.

If you mean the publishers that host the papers - their purpose is to make money off subscriptions, and that's about it. #downwithelsevier

Why do people still try writing good papers? Because they want the ideas in those papers to spread. Why do papers still suck? Because explaining something clearly is hard.

>Also I don't think that a math notation which has symbols that don't even exist on ascii keyboards should be acceptable.

And everyone should just speak English. Увы, увы, было бы довольно печально жить в таком мире. ("Alas, alas, it would be rather sad to live in such a world.")

>I will read your link.

Please please please come back here and share your thoughts when you do! Can't wait to hear them.


On what planet do mathematicians not have an intuition for what they study?


Nice parallel with Iverson's 1979 Turing Award lecture, "Notation as a Tool of Thought": http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pd...


Opening up with the claim that computer software is "based on the ideas of Claude Shannon" is strange. Does anyone seriously believe that Shannon's contributions to the foundations of computing stand so far above others'?


Turing - On Computable Numbers, with an Application to the Entscheidungsproblem - 1937

Shannon - A Mathematical Theory of Communication - 1948

Shannon calculated the entropy of arbitrary symbol systems over noisy communication channels, and made some first steps towards practical data compression algorithms.

Turing formalised the concept of computability using an abstraction of a mechanised logic system.

I'd say Turing wins easily - we still say logical systems are Turing Complete rather than Shannon Compatible.
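For concreteness, the entropy Shannon defined is H = -sum(p_i * log2(p_i)), the expected information per symbol; a quick sketch:

  # Shannon entropy of a symbol distribution.
  from math import log2

  def entropy(probs):
      return -sum(p * log2(p) for p in probs if p > 0)

  print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
  print(entropy([0.9, 0.1]))   # ~0.47 bits: a biased coin carries less info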


Don’t forget Shannon’s 1937 master’s thesis, A Symbolic Analysis of Relay and Switching Circuits.


Yeah, he had the insight that you could use 19th-century Boolean logic and switching circuits to do mathematical operations, making modern computers possible. It's disappointing that he's not better known for this, to say the least; e.g. that anyone on here doesn't know his huge importance.



I thought the same; my account is at [1]. Maybe Shannon [1937] influenced the ENIAC-ers? Dyson hardly mentions him (4 references in [2]), of which the most notable is:

"Claude Shannon, whose mathematical theory of communication showed how a computer built from unreliable components could be made to function reliably from one cycle to the next."

That sounds ... like a basis for translating Turing machines into hardware, not software.

[1] http://whatarecomputersfor.net/versatile-information-machine...

[2] https://www.goodreads.com/book/show/12625589-turing-s-cathed...


> Does anyone seriously believe that Shannon's contributions to the foundations of computing stand so far above others'?

Not far above, but he's right there with Church and Turing. Church and Turing laid the foundations of computer science. You could argue Shannon gave us computer engineering.

Shannon created information theory, and his proof of the equivalence between electric circuits and Boolean algebra is the foundation of physical computing today. All modern computing devices go back to his proof and his adder.

Shannon tied the electrical (circuits) to the mathematical (Boolean algebra), which one could argue is computer engineering.
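A small sketch of that circuit/algebra correspondence (my illustration, not Shannon's own construction): a one-bit full adder written purely as Boolean operations, checked against ordinary arithmetic.

  # One-bit full adder as Boolean algebra.
  def full_adder(a: bool, b: bool, cin: bool) -> tuple[bool, bool]:
      """Return (sum, carry_out) for one bit position."""
      s = a ^ b ^ cin                         # XOR chain gives the sum bit
      cout = (a and b) or (cin and (a ^ b))   # carry logic
      return s, cout

  # Check all 8 input combinations against integer arithmetic.
  for a in (False, True):
      for b in (False, True):
          for c in (False, True):
              s, cout = full_adder(a, b, c)
              assert int(a) + int(b) + int(c) == int(s) + 2 * int(cout)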


I do, but that sentence is weird.


> For instance, say you’re in a race at school. You do surprisingly well and beat most of your classmates. All things being equal, the next time around, you’re actually not likely to do as well, relative to the other runners.

If the events are independent (as I believe the author is assuming at this point), then your performance in the first race has no effect on your performance in the next race, right? Maybe it's just a poor choice of wording, but it comes dangerously close to reinforcing a common misconception about probability (Gambler's fallacy).


No, the author is not assuming that. And in fact the opposite is true.

The assumption is that your performance in any given race is some combination of luck and ability. If you perform extremely well, it should be assumed that both were in your favor that time. The next time you'll still have whatever ability is in play, but you are unlikely to be as lucky again.

The result is that your performance is likely to be good, though not stellar. Your racing one day and the next are not independent events; your ability creates a correlation. But your performance one day is also not a linear prediction of your expected performance the next.

This is called regression to the mean. It comes into play in everything from stock picking to poker.
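A quick simulation makes the selection effect visible (a sketch; the additive ability-plus-noise model is my assumption, not anything from the article):

  # Regression to the mean: performance = ability + luck.
  # Pick the winner of race 1, then look at their rank in race 2.
  import random

  random.seed(0)
  n_runners, trials = 30, 10_000
  rank_sum = 0
  for _ in range(trials):
      ability = [random.gauss(0, 1) for _ in range(n_runners)]
      race1 = [a + random.gauss(0, 1) for a in ability]
      winner = max(range(n_runners), key=lambda i: race1[i])
      race2 = [a + random.gauss(0, 1) for a in ability]
      rank_sum += 1 + sum(race2[j] > race2[winner] for j in range(n_runners))
  # Average race-2 rank of the race-1 winner: well above the field's
  # average of 15.5, but noticeably short of 1st place.
  print(rank_sum / trials)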


You are committing the gambler’s fallacy. There’s no reason to expect your “luck” (whatever you mean by that) to decrease rather than increase the next time you race.

This is different from saying that you are likely to win again. You’re just as likely to win as you were the first time around, which may be high or low depending on ability.


No, I am not committing the gambler's fallacy. Though I can see why you might think that. There are enough things that sound similar.

I'm instead talking about a more subtle detail. Which is that selecting a group based on performance results in selecting people in part for having been lucky. They were lucky to have done that well that time, were lucky to be in your group, and their future performance probably won't be as good.

The result is called regression to the mean. See https://academic.oup.com/ije/article/34/1/215/638499 for an example of statisticians talking about it. See http://onlinestatbook.com/stat_sim/reg_to_mean/index.html for a more introductory tutorial. And see http://wmbriggs.com/post/63/ for an example of an article discussing this counterintuitive phenomenon in the context of sports. (Namely the "Sports Illustrated curse": the future performance of athletes whose performance was good enough to get them on the cover of Sports Illustrated drops after their article appears.)


I think I see what you mean now.

If you roll 20 dice and select the highest, the next time you throw that very same die it’ll probably have a lower value, simply as a consequence of outcome-dependent selection + a fixed probability. It’s not that the probability has changed, just that you didn’t choose the first outcome according to the true generating distribution.

Comparing this highest value with the highest value across all dice when you roll them all again a second time is a different matter. It's no more likely to be lower than higher.

It's best to avoid using the term “luck” altogether in discussions of probability, since it’s easy to misinterpret.


That's exactly the right idea.


Drop out the ability part and focus on a pure-luck scenario: take 2 dice (2d6) and roll them. You get a 12. What is more probable on your next roll, another 12 (probability 1/36) or a 7 (probability 6/36)?

With regard to the race, if you do really well, much better than your average, without any fundamental change to your ability that's the same as rolling a 12. It's an unlikely (though possible) event that happened to occur. Over a series of races, however, we would not anticipate a repeat of that performance (that is, it's a low probability event, and a series of such low probability events has even lower probability).

The gambler's fallacy goes the other way. It's the belief that if we've had a string of bad luck [0], then we should wager on good luck being around the corner. That is, the gambler assumes a low probability event is "inevitable" after a series of high probability events. Regression to the mean: A high probability event should be anticipated after a low probability event.

[0] In gambling this usually means getting close to the average, that is: the average hand is the lowest valued hand, the rarest hand is the highest valued. So a gambler making this fallacy would wager on seeing a royal flush after several hundred hands of only seeing pairs.


I understand what the GP was saying now (see my sibling comment).

The gambler’s fallacy can apply the other way, e.g. “This die has rolled over 3 many times, so it will probably roll 3 or lower next time.” So that's not really the issue.

Your dice example is not illustrating the point the GP is trying to make. It’s just comparing the probabilities of two outcomes of the same distribution (the sum-of-two-dice distribution).

This has nothing to do with the selection effect the GP is talking about. That would be recording the value of the die that rolled highest (i.e. taking the maximum over dice) and then focusing on the same die next time (i.e. not taking the maximum over dice).

It’s comparing the probability distribution of max {X_1, X_2, X_3, ...} with the probability distribution of any individual X_i. If A is sampled from the former and B is sampled from the latter, P(B < A) > P(B > A).
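A sketch of exactly that comparison, with 20 six-sided dice standing in for the X_i (the dice and counts are my choice, just for illustration):

  # A = max of 20 dice (selection); B = a single fresh die.
  # Empirically P(B < A) dwarfs P(B > A).
  import random

  random.seed(1)
  trials = 100_000
  below = above = 0
  for _ in range(trials):
      a = max(random.randint(1, 6) for _ in range(20))
      b = random.randint(1, 6)
      below += b < a
      above += b > a
  print(below / trials, above / trials)   # roughly 0.83 vs. under 0.01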


To diagram it out:

  a = ability
  l = luck
  p = performance

    a     l
     \   /
      \ /
       v
       p
Ability and luck are each contributing factors. Your ability is likely to be consistent (within some stretch of time); your luck is less consistent. So if you average a 7-minute mile and you somehow run it in 6, unless your fundamental ability has changed it was luck (weather, mental state, wear on your shoes) that pushed you so far outside your norm, and you should anticipate returning to your average in the future.


What is “luck” supposed to mean in your diagram?


Consider children of immigrants. Look at the ones who are tall, taller than their parents. Are they tall because they got good nutrition, etc.? Many people supposed that they have natural tallness genes and their parents were only shorter because of tough times in the old country.

Now look at their children, the grandchildren of the immigrants. They tend back toward average.

The tall kids were tall because, in part, there is a natural variation and they happened to get on one side of the bell curve. Luck, if you will.


Any contributing factor outside your base ability. Weather (too hot, too cold, just right), mental state (did someone put you on edge, sleep poorly), etc.

So some of those are more or less under your control. Maybe you can control your mental state better than I can. For me, a particularly stressful day at work doesn't lend itself to the quiet mind I need for a run. Those are my 36-minute 5 km days. But nicer weather, seeing the geese on the pond as they migrate, can put my mind at ease even on those stressful days, and I might pull off one of my faster runs. I can't control that.


so performance is ability plus the contribution of all other factors that are not ability.

tough to argue with that one.


I mean, we can break out luck to be all the various factors. But it can be summed up into one (for the purposes of this discussion). And each of those will still have some mean that they tend toward. The weather, outside San Diego, isn't going to be perfect year round. Your relationships (impacting mental state) won't all be rosy. Your sleep can be interrupted by a neighbor moving in at 3am. Though that last one is atypical. And if it happens, and your race time is 8 minutes instead of the average of 7, you should still expect to be closer to 7 the next race barring similar misfortune.


I think the main point is that performance depends on some factors that we can reasonably expect to remain constant between races ('ability'), and others that are likely to vary quasi-randomly between races ('luck').

Those names are imperfect but the underlying logic is sound: the winner of a race is relatively likely to have both high 'ability' in himself, and higher 'luck' than usual for that particular race. In the next race, his 'ability' will remain high, but he will probably have roughly average luck.


If you have a competition where everyone flips coins, and you are eliminated as soon as you flip a tail, and you win, it's likely that you will also win the next time. This is not because the events are not independent, but because you are always unlikely to win.

Likewise, if you get a royal flush in a hand of poker, you probably won't get as good a hand next time. Again, this is not because the deals aren't independent events, but because it is always unlikely to get a royal flush.


Good points.

> Likewise, if you get a royal flush in a hand of poker, you probably won't get as good a hand next time.

I understand what you mean, but I think that most people who haven't taken a probability class would misinterpret this. A sentence of the form "if [this] then [that]" is usually meant to express dependence. Don't you think that most people would interpret your sentence as: "Because I got a royal flush this time, I probably won't get one next time."?

I think the average pop-sci article reader will get confused by this.


It's really hard to map subtle statements about probability to natural English sentences. Pretty much any non-awkward formulation of that sentence is going to admit misinterpretations.


> it's likely that you will also win the next time. This is... because you are always unlikely to win.

Did you mean to write “it’s not likely that you will also win next time”?


I did. Unfortunately the editing window is closed. Thank you.


I don't think it's phrased to make the second outcome dependent on the first. That part of the article was referring to the racer regressing to their mean. Given the assumption that they overperformed in their first race (based on the phrasing "surprisingly well"), the conclusion is still correct, because performing at their average level (whatever that may be) is more likely than overperforming again.


I was surprised at this passage, too. It begins by quoting the Oxford dictionary, which always raises alarm bells that the author's ideas on the topic are inchoate, but the wording of the 'racing' example is an unforgivable blunder. The 'mean' referred to in 'regression to the mean' is the mean of the individual, not of the pack.


I don't see why you think that. Let's say each of your classmates has a number of dice representing their innate ability, and rolls all of them. You have 5 dice, do surprisingly well, and roll four sixes and a four for a total of 28, beating some classmates who have 6, 7, or even 8 dice.

If you repeat this, it’s unlikely you will do similarly well.


I thought this was a really beautiful article. All I have to add is that it seemed to avoid discussing philosophy as a discipline, even when it seems impossible to ignore. For example:

>Perhaps more than any other subject, mathematics is about the study of ideas.



