There’s more to mathematics than rigour and proofs (2007) (terrytao.wordpress.com)
145 points by _ttg 67 days ago | 90 comments



One day I'll retire and go back to school. The idea of learning Math - really learning & understanding Math - as a fun pastime is so appealing.

What's stopping me now? That sweet overpaid SDE salary and the endless obligations that come from being an adult. I suspect I am not alone...


I’ve had a very similar experience and my solution was to incrementally move into more and more math intensive jobs, starting from regular old SWE working on a web app to working on an aerospace-related web app that involved lots of physics and geospatial calculations and then moving toward MLE/DS jobs. All without a STEM degree, teaching myself the math as I go. It hasn’t been easy but I enjoy what I do more and more over time.


This is a bit tangential, but was the lack of a STEM degree ever an obstacle in getting those jobs? Just asking because I'm currently in a similar position, regular SWE job, but I'd like to eventually move to something more mathy.


Impressive :)


Keep the salary. If the goal is to change the world, to understand reality, to have an impactful life, to have a good standard of living, etc., math is one of the hardest ways of achieving it. It's such a saturated field. Almost everything you can imagine has been done to the highest possible degree of abstraction, every stone overturned except for things which may take a lifetime even to try to understand. Writing a blog post is probably far more fulfilling, and also a doable challenge. A top mathematician may spend years working on a result that, if he's lucky, may be worthy of a footnote somewhere.

As a field, I think math is well past the point of diminishing returns, imho. It's like asking "what was the last big philosophical discovery?"... yeah... hard to think of one. Maybe the p-zombie concept or the simulation hypothesis. But new and important books, fiction and non-fiction, are being written all the time.


I'm not going to gainsay your experience, but it doesn't match mine. Much of what you do in graduate school is to discover where the accessible areas of research are--where do we have a foothold, and are making progress, and what are some achievable results?

There are big and hairy problems that are bad investments for a young mathematician. I would steer students clear of the Collatz conjecture. But once you get up to speed in your research area, you usually find interesting problems thick on the ground.

Tenure-track positions are competitive, but I don't think there is a lack of interesting things to work on.


When I worked on math I found that no matter what problem I was working on, someone had already solved it completely, or to a higher level of abstraction than I had. It got discouraging after a while.


I think the whole point would be to retire and do math for fun, not desiring anything more than the joy of discovery, no footnotes, no recognition, just math.

99.99% of everyone won't be remembered for their "contributions" so why not do something you enjoy?


Even if no one remembers you, your work is a contribution. Learning for its own sake is a hobby for yourself, like watching TV or reading a book.


Let me be remembered for watching TV. I would be utterly delighted if that’s my long term legacy. “They finally rested and enjoyed the most banal show imaginable. It was relaxing. Any other lasting contributions will be in other records should you care.” There I’ve written my eulogy.


My grandfather's last 10 years were spent developing diabetes, watching those three shows, and playing Microsoft solitaire until death. If that's what you want, then, sure, you can be remembered that way.


This is why I'm happy to simply read the well established facts of a variety of fields. There's more than enough to learn for a lifetime, and from what I read about research there's a heck of a lot of BS in the way for small incremental gains.

Having said that, if something does come along that interests someone, they should try it. A friend of mine is doing his 2nd PhD 40 years after his first, having found his way into it via a love of jazz music.


I totally agree with what you say, but there's a lot of low hanging fruit in mathematizing biology.

It's not easy, but it's certainly very impactful.


can you expand on that? sounds really interesting


Drop me an email, contact info in profile :)

Otherwise, I will try to come back to this comment in a few days. I'm a bit busy with an urgent deadline, but I'm glad to expand on this later on!


This is what I imagine late-stage security will look like once OSes mature their stacks and such. The advent/widespread use of robust memory protections like PAC and CHERI is going to be so depressing for those on the offensive side.

I wish I just had more time to do math like the OP but the saturation of the field especially with people who can deeply understand the abstraction is very very intimidating.


About the simulation hypothesis, how is that different from Descartes' evil demon?

https://en.m.wikipedia.org/wiki/Evil_demon

(apart from saying that the evil demon is a future type computer)


The simulation stuff is just “Plato’s Cave”:Reloaded for people who never understood the concept of the cave in the first place. At a basic level you can interpret it from Gödel’s incompleteness theorems, which state that systems of logic need some form of observer to function. That observer concept reaches way back across different philosophical domains and authors as well.


That is not at all what the incompleteness theorems say.

The first incompleteness theorem says that for any consistent formal system T (with a recursively enumerable set of axioms) capable of expressing elementary arithmetic, T can express a statement which it can neither prove nor disprove.

The second incompleteness theorem says that T can't prove the statement "T is consistent". (I've still glossed over a number of technical details here; pick up a book on model theory if you want all the messy internals.)

First-order logic on its own (without arithmetic axioms) is notably not capable of expressing elementary arithmetic. And observers aren't involved in any way.


Yes, it is.

Sorry if you can’t read deeply into it or something. I’m not posting for grad students. I can sense you just like to correct people. Ahhhh I’m so wrong, you’re right?


We've banned this account for continuing to break the site guidelines after we asked you to stop.

https://news.ycombinator.com/newsguidelines.html


Godel's theorems mean something quite specific, and rely on an equally specific set of hypotheses.

It's tempting to try to apply them (or rather the same kind of conclusions) in other (non-math) contexts, but it's far from obvious that you'll get something sensible. While you can play with the ideas, invoking Godel's theorems outside of their specific context doesn't make much sense.


I usually think of logics as search algorithms, since that's how the semantics for the meta-language describing them are defined.

I guess you could call a Turing Machine implementing the search algorithm for proofs implied by a logic an "observer", since it produces "reachability observations" i.e. proofs.
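That "observer" idea can be made concrete with a toy sketch. The two-rule string-rewriting system below is made up for this example (it is not a real logic): the search enumerates derivations breadth-first and reports which strings are reachable from the axiom, i.e. which "theorems" have proofs.

```python
from collections import deque

def provable(target, max_steps=10000):
    """Breadth-first search over derivations in a toy string-rewriting
    system: axiom 'a'; rules x -> x + 'b' and x -> x + x.
    The search procedure is the 'observer': it produces reachability
    observations (derivations) for strings in the system."""
    seen = {"a"}
    queue = deque(["a"])
    steps = 0
    while queue and steps < max_steps:
        s = queue.popleft()
        steps += 1
        if s == target:
            return True
        for nxt in (s + "b", s + s):
            # Prune strings longer than the target; the rules only grow strings.
            if len(nxt) <= len(target) and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return target in seen
```

For instance, "abab" is derivable (a → ab → abab), while "ba" is not, since every derivable string starts with "a".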


Just a minor thing to note here: there's the foundations of mathematics... questions like, what are numbers and other mathematical structures? How is it that math, any math at all, can potentially accurately describe parts of reality? Symbolic representations of math are just squiggly lines... why should squiggly lines have any special relationship to the nature of the universe? Details like that are far from being understood.


Maybe I'm misunderstanding you, but it sounds like you're just describing abstract algebra. See linear algebra or group theory as topic titles.


I couldn't wait that long; I started researching stochastic processes a few years ago and have been developing a theory of supertasks.


What was this utter gibberish I just read? I’m absolutely positive you have the least idea of what modern mathematical research entails or is even about. Please refrain from sharing your opinions on subject matters which you clearly lack any understanding of.


I'm actually looking into this. Just an hour ago I mailed the local university that I won't be doing any courses there. My initial plan was to do a bachelor's at around 50% speed. But working and having two girls (1.5 years and 3 weeks old) makes that quite impossible. And looking at photos of myself at 17 makes me feel rather out of place at a university at age 42.

The Open University has an AI master's that I'm thinking about right now. It has about 25% of the math that I want to learn, so that would be a good start. I did some prep work (an official high school math certificate) over the last few months, and I noticed that I need a schedule to keep me going.

One thing that I'm quite certain about, is that doing math is the most important thing. And doing math leads to more doing math.


RE learning math: you might want to check out some of my (non-free) books on MECH+CALC https://minireference.com/static/excerpts/noBSmathphys_v5_pr... and LINEAR ALGEBRA https://minireference.com/static/excerpts/noBSLA_v2_preview.... They are written especially with adult readers in mind.

RE schedule to keep you going: I've had some students use the concept maps as a "world map" (like the stage map in Mario World) and check off boxes as they progress through them (each concept corresponds to roughly one section). See https://minireference.com/static/conceptmaps/math_and_physic... and https://minireference.com/static/conceptmaps/linear_algebra_... I guess you could time-box these and do N sections each week to make this into a schedule. Make sure you dedicate lots of time to the exercises/problems, because that's when the real learning happens...


Be sure not to retire too late. At an older age, with all the experience you will have under your belt, picking up new concepts and methods will not be a problem, but retaining details in memory will. You have no idea how incredibly hard it will be. So start early.


Same situation, but I've started! I only spend about 30 minutes every other day, and it's extremely slow-going, but so fulfilling. I had the same goal as you -- _really_ learning & understanding math.

I finished pre-Algebra last year and I'm halfway through an Algebra text by Gelfand & Shen now. My friends look at me funny when I tell them I'm re-learning Math from the ground up for fun (esp. with a degree in CS) but it has been so rewarding. I probably won't get to finishing Calculus for another couple years but I'm already having so much fun. Stumbled upon deriving some exponent laws last month by accident and truly understanding the sum and difference of squares has been awesome.


I'm still grumpy that I accepted I knew what real numbers were just because I could recite back the definition given by the teacher. There is so much depth there if you go looking...


I contend no one understands real numbers if they can’t explain (without resorting to essentially purely symbolic rigor) why you can cover the rationals with intervals of arbitrarily small total length, but you can’t do the same with R.


I've been thinking about the reals a lot... beyond the rationals are all the real numbers, like pi, that have finite definitions (even if those definitions, like pi's, require infinite computation).

But there is also this vast set of reals that are simply undefinable, non-repeating sequences. These numbers are unmentionable and unknowable. Does it really even make sense to say that this subset of the reals exists in the same way that definable numbers do?


Yes, of course. Because definable (really, computable) numbers don’t really “exist” either.


We can use definable numbers. Pi is used with such massive frequency that it seems silly to say it has the same nonexistence as numbers that we can literally never even mention, let alone use.

My assertion is that there is something wonky about these undefinable numbers, and that wonkiness is directly related to how absolutely massive the infinity of the reals is.


I feel like that's downstream of seeing the reals as the rationals with the holes plugged, no? Obviously you can start from either end but everything special about the reals comes from being the complete version of the rationals.


Well, to understand an explanation one must already know something, otherwise first you have to explain those other things, e.g. the simple fact that unlike the reals the set of rational numbers is countable…


Cardinality hasn’t much to do with it since there are uncountable sets which you may cover like that.


Sure, but for countable subsets (such as the rationals) it is easy to show.


Sure. Explaining what's different about the reals is the hard part.


How so? Even a real interval of a finite length cannot be covered by any set of intervals of a smaller total length. (Unless the person you are trying to explain this to starts raising questions about the meaning of 'interval' or 'length', in which case the meaning of the original question becomes just as uncertain in the first place.)


> Even a real interval of a finite length cannot be covered by any set of intervals of a smaller total length.

You’ve only restated the problem without saying _why_ covering the reals is different.


Because the 'intervals' in question are already 'real'?


Well, first you would have to explain what you mean by your statement.


I mean that for any epsilon > 0, you can have a set of intervals of the form (a_i, b_i) where every rational number is in some interval and the sum over all i of b_i - a_i < epsilon.

That is, you can cover the rationals with intervals of arbitrarily small total length.
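That construction is concrete enough to sketch in code. Below is a small Python illustration (the enumeration scheme and function name are my own, not from the comment): enumerate the reduced fractions in (0, 1) by increasing denominator and give the i-th one an interval of length ε/2^(i+1), so the lengths sum to at most ε · (1/2 + 1/4 + ...) = ε, no matter how many rationals you cover.

```python
from fractions import Fraction

def cover_rationals(epsilon, n_terms):
    """Cover the first n_terms reduced rationals in (0, 1) with open
    intervals whose total length stays strictly below epsilon: the
    i-th rational gets an interval of length epsilon / 2**(i + 1)."""
    intervals = []
    q = 2
    while len(intervals) < n_terms:
        for p in range(1, q):
            if Fraction(p, q).denominator != q:
                continue  # skip non-reduced duplicates like 2/4
            i = len(intervals)
            half = epsilon / 2 ** (i + 2)  # half-width of the i-th interval
            center = Fraction(p, q)
            intervals.append((center - half, center + half))
            if len(intervals) == n_terms:
                break
        q += 1
    return intervals

eps = Fraction(1, 100)
intervals = cover_rationals(eps, 500)
total_length = sum(b - a for a, b in intervals)
assert total_length < eps  # 500 rationals covered, total length still < 1/100
```

The same geometric-series trick works for any countable set, which is exactly why the reals (being uncountable) resist it.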


I see. But is that not just a simple conclusion of the fact that the rationals are countable, and that there exists a converging series of positive numbers?


Right but explaining why the reals are different is the tricky part


The way I explain this super simply to people is absolute value.

To most people, it just means that when you take the absolute value of a negative number it becomes positive, and the absolute value of a positive number stays positive.

Now there is more to it, but how you might think of absolute value instead is as a distance function, particularly how far away from zero you are on a number line.

This is way over simplified, but an example of how there can be a little more buried beneath the surface.
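In code, that distance-function view is just a two-argument wrapper around abs (a trivial sketch; the names are my own):

```python
def dist(x, y):
    """Distance between two points on the number line: d(x, y) = |x - y|."""
    return abs(x - y)

# |x| is just the distance from x to zero:
assert dist(-3, 0) == 3
assert dist(5, 0) == abs(5)

# and it satisfies the triangle inequality, like any metric:
assert dist(-3, 5) <= dist(-3, 0) + dist(0, 5)
```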


In the same boat. One of my dreams is to go back to school and learn vector analysis, differential equations, differential geometry, classical mechanics, electromagnetism, special relativity and finally general relativity.

Actually quite doable as long as one can grit through the math, some of which doesn't need a back-to-back read.


Why would you need to go back to school to learn those things?

At best school is a syllabus telling you what people think you should know, fairly easy to get a hold of, and maybe a bit of accountability to make sure you actually learn it.


Yeah, I agree with that. I can also buy a few books and go from there. But essentially I cannot do that now because of X, Y and Z.


If your goal of learning is for your own benefit, it's much easier than actually going to school and doing exams, assignments, and getting credits which involves a massive amount of formalities.

A good starting point could be MIT OCW 18.01, 18.02, and 18.03. Do all their problem sets and get as much understanding as you want. It corresponds to a first-year university engineering curriculum.


My lofty future academic pursuit for fun when I can relax from my professional programming career would probably be compsci and then research if I’m still interested. Gonna get really old and tired working in the semicolon mines, then turn around and learn all the stuff I was supposed to learn when I started. At least that’s my picture of my casual retirement interests.


One of my biggest regrets is not having studied (more) math. But would I regret not studying CS had I studied math?

You really can't win :/


Same. But we can keep looking for opportunities to learn math in our spare time. For me though, old CS books have a lure all their own :) Maybe an algorithms book (plus MathOverflow) or TLA+ as a gateway.


CS is math...


I’d argue math is actually (a proper subset of) CS.


Math, at its broadest, is the study of formal systems. Computer science is the study of a particular formal system. While it is a powerful enough system to contain all of math within it, there are many such systems nested within each other. Is the Turing machine formalism more powerful? No. Is it more efficient or intuitive? Also no. It requires axiomatic reasoning to construct, and then reproduces it internally. Math and CS are set equal, but one predefines an entry point and the other does not. From the human perspective math gives rise to CS, which is just one of math's many children.


I’d argue that logic is a subfield of computability theory, and so by extension we (in CS) absorb math.


Like I said, you can indeed construct logic through CS! But you can also obviously construct CS from logic. You can also construct logic through the natural numbers, so number theory absorbs math and by extension CS? Fact is anything reasonably complex can replicate everything else. Exactly one of these fields, however, is specifically carved out as the study of any formal system, and it isn't CS.


I’m arguing that logic is a subfield of CS, so “constructing” CS “from” logic isn’t a contradiction.


CS only involves computable numbers (Turing), which are a subset of the real numbers. By definition CS is a subset of math, not vice versa.


But the real numbers cannot be faithfully represented in a computer!
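A quick Python sketch of that point (the helper name is my own): floats only store nearby dyadic rationals, and the best a program can hold for a real like sqrt(2) is a finite approximation scheme, never the number itself.

```python
from fractions import Fraction
from math import isqrt

# Binary floating point stores the nearest dyadic fraction, so even the
# rational 0.1 is not represented exactly:
assert Fraction(0.1) != Fraction(1, 10)
assert 0.1 + 0.2 != 0.3  # the rounding errors show up in arithmetic

def sqrt2_scaled(n):
    """floor(sqrt(2) * 10**n), computed exactly with integer arithmetic.
    Larger n yields ever more digits of sqrt(2), but any single call
    only ever produces a finite approximation."""
    return isqrt(2 * 10 ** (2 * n))

assert sqrt2_scaled(3) == 1414  # sqrt(2) = 1.414...
```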


So? Neither can the list of possible computer programs, yet CS studies them


It does not look like CS studies the real numbers.


Each real number can be represented as a subset of the natural numbers. CS surely studies these.


But not the representations themselves (which is what's important in this case).


In a practical sense, not really.


Aim for financial independence on that SDE and you can retire a few decades ahead of schedule. And you’ll have plenty of time for that sweet math learning.



not alone, or more like the vast majority

I'm still trying to find a part time dev gig so I can just focus on graphs and advanced combinatorics


You can’t really learn Math at school. Not really from the perspective of understanding mathematical beauty deeply. I wish there was an outlet or means to do so. I always found schooling to be woefully inept at assisting in learning the craft.


Yes, you can't do something if you chose to do something else instead.

Clearly you find earning/spending money more appealing than math.


What a ridiculous oversimplification of life...


I love Terry Tao’s writing on math. One thing that strikes me about him is that, despite being an absolute technical powerhouse, he writes in a very down to earth style that connects disparate areas of math - e.g. his article on “what is a gauge” https://terrytao.wordpress.com/2008/09/27/what-is-a-gauge/am... where he explains how dimensional analysis might be viewed as a change of coordinates. Too often exposition in math is myopic and fails to impart a unique perspective on the subject, but Tao imbues his writing with a wisdom that I consider the sign of a true genius.


Related:

There’s more to mathematics than rigour and proofs - https://news.ycombinator.com/item?id=9517619 - May 2015 (32 comments)

There’s more to mathematics than rigour and proofs - https://news.ycombinator.com/item?id=4769216 - Nov 2012 (36 comments)


I understood math much better after following the Software Foundations tutorials:

https://softwarefoundations.cis.upenn.edu/

Another good starting point (for mathematicians) would be the Lean community group and the Math lib project:

https://arxiv.org/pdf/1910.09336.pdf

https://github.com/leanprover-community/mathlib


I feel like this is how all domain expertise works, no? Start with intuition, which helps you solidify some of the foundation. Flesh out the foundation and start building complicated structures. Now that you've built up the experience, go back and use your intuition to figure out new types of buildings to build.


This is remarkably accurate and resonates with me a lot. I did mathematical olympiads in high school, where intuition to crack problems plays a major role. Then went on to college to study an undergraduate degree in maths (concentration in analysis). Analysis requires, at least in its rigorous foundations, care and a skilled command of logic/quantifiers (more than elementary abstract algebra, in my humble and biased opinion), often very scrupulously. Then in my graduate studies, intuition along with the maturity of rigor work together to produce new theorems. I'm impressed that several times I can look at a paper or series of results and read them "diagonally" to get the motivation without scanning all the text (of course, if the aim is to cite/build on top of/generalize/apply it, then close attention to the reasoning should be paid).


You can infer from this how Artificial Intelligence and Logic should be combined: AI enables a "post-rigorous" mode, and Logic is how you know you are still doing something sensible, and how you expand your sure footing.


Going from #2 to #3 means being at the stage where you can look at something and think "no, this cannot work" or "maybe this can work" without having to do all the steps.


Which are the newest developments in math? Someone told me Geometric Algebra has been around for a long time but wasn't really useful until some recent theorems.


It really depends on which field you are talking about. I'd say it is very hard to find an area of math that's completely new, but you will often find existing areas where novel perspectives are driving math forward. Geometric algebra may be a hot topic in some areas, but the ideas of exterior algebra go back more than a century at this point, so is it really new??

Just to humor you though, I think Deep Operator learning is a vastly exciting new field which combines ideas from functional analysis and deep learning in order to do things like solving PDEs.


I can't wait for theorem provers to be commonplace.


A proof that no one would understand is not a good proof. The ideal approach to proving theorems, at least according to how Grothendieck did it, is to build a beautiful theory in which the proof becomes elementary.


It's quite a big assumption to think truth can always be bent so as to satisfy our ridiculously limited cognition. And math has been used instrumentally from the very beginning, so results are often much more important than the process. Theoreticians may still value elegance because that gives them pleasure or whatever, but few other people care about that as long as they can use the results.


The results aren’t often nearly as useful as the techniques used to find them. For example, a completely opaque proof resolving P vs NP is completely useless to theoretical CS.


Mathematics needs imagination to come up with conjectures that need proving; without that there is nothing to apply rigor to.


All you need to get by in life is a solid understanding of a little applied math and statistics. Leave the rest for professors and hobbyists.



