Hacker News
Is Math Real? (maa.org)
208 points by hhs 8 months ago | 336 comments



> one view is that mathematics is a stiff and fixed set of rules and algorithms while the other view is that mathematics is flexible and our understanding of math comes from questioning of why mathematics functions so effectively

I haven't read the book, but this dichotomy is not an intrinsic feature of how people have pursued mathematics historically.

The "stiffness" and heavy focus on rigor and precision developed gradually over the 19th century because people were being loose cannons - primarily around calculus.


This is going to be a classic "computer programmer wants math to be like computers and doesn't get it" kind of take, but in my view the problem of mathematical rigour (or lack thereof) has only become worse over the 20th century, and we certainly did not resolve any of the underlying issues in a foundational sense or a practical sense. In a practical sense it's become much worse, and we have many more layers now. In a foundational sense, we succeeded in giving up, because we learned that we can in some sense pick and choose whatever is most convenient for our line of research.

That's probably okay if we view mathematics in the way this book (I have not read it; I'm going based on the description here) advocates, as a sort of toy for playing with arguments. And I'm certainly not saying math should ever be viewed as an empirical discipline or constrained by that kind of thinking. But I don't think I'm the only one who takes one look at things in the realm of, say, higher category theory and thinks it's mostly word- and symbol-manipulation games, lacking any real mathematical content that could not be discovered at a lower and more understandable (and less likely to produce new research) level of abstraction.

I guess I've sort of betrayed that I am pretty firmly a platonist in that respect, so make of that what you will.

Like I said, this is not an unusual opinion for a computer person to have and I'm sure it's fairly annoying to any pure mathematician at this point. But I think it's still fair if we want to understand what turns certain people off of pursuing mathematics further.


I'm not a mathematician so anyone please correct me if I'm wrong, but isn't the point of category theory to provide abstractions so your proofs are very general and cover a lot at once? Of course, it needs to be very abstract to achieve that.


This is a classic example of what happens when people give up on rigour:

https://en.wikipedia.org/wiki/Italian_school_of_algebraic_ge...

TL;DR: They started producing false results.

Note that this was not about capital-F Foundations of Mathematics like (arguably) the foundational crisis of math that had its origins in the 19th century, but rather about lowercase-f foundations of a particular field, in this case algebraic geometry.

Weil's foundations of AG in his eponymous 1946 book were horrible and messy, but they solved the issue (even today there are a lot of celebrated results that can only be found expressed in Weil's language). Later, in the 1960s, Grothendieck provided the elegant language of schemes, in which people generally learn and research AG today, and which helped prove long-standing problems like the Weil conjectures and (to some degree) FLT. Category theory was, in this case, essential to proving theorems about "real mathematical content" like numbers and points.


Is that really the case? I'm not an expert in AG, but I would imagine that any major result published in Weil's framework can either be trivially converted to schemes or has been ported over by now.


IMHO, in order to be able to speak about these subjects, we must either be mathematicians or have read Logicomix (http://www.logicomix.com). It is the easiest book to read. Another useful one is https://en.wikipedia.org/wiki/G%C3%B6del,_Escher,_Bach


This subject actually doesn't belong in mathematics: it's philosophy of mathematics. I.e. no matter how much you study mathematics you will not be able to answer (or even attempt to answer) these questions because no mathematical tools or disciplines are designed for that.

I also believe that the emphasis on "rigor" here is misplaced. The argument isn't about whether mathematical rules are rigorous or not. The argument is about whether mathematics exists independent of mathematicians (and they discover it the way an astronomer peers into a telescope and discovers new stars) vs mathematics being created by mathematicians' minds (similar to how an architect designs a building: there wasn't one before, and now there's a concept of a new building with so many walls, floors, windows, etc.)

I believe that mathematics is art, not science. I.e. mathematicians create new rules, they don't discover them. The whole argument to support this point would be too long to write it in a single post, but the general idea is that mathematics is a system that can easily describe counterfactual worlds. We use it to also describe our physical world because, of course, it can do that. But then asking the question about the "surprising effectiveness" is moot: we deliberately made it to be as effective as possible, so how is it so surprising that it is?


> The whole argument to support this point would be too long to write it in a single post, but the general idea is that mathematics is a system that can easily describe counterfactual worlds. We use it to also describe our physical world because, of course, it can do that. But then asking the question about the "surprising effectiveness" is moot: we deliberately made it to be as effective as possible, so how is it so surprising that it is?

The "surprising effectiveness" is, I think, one of three things. First, the surprise is that we can create mathematics that describes our physical world.

The second surprise is that, when we find out something new, it often takes the form of existing mathematics that we didn't design to describe the physical world. (Though, from your point of view, I suppose you could say that we created mathematics to describe everything that could be described by mathematics, and so it's not surprising that something was there to describe reality.)

The third surprise (maybe this is just a restatement of the first one) is that mathematics really describes the physical world. It's not that we find some math that describes it, and then we change the situation a little bit and we need to find some new math. The surprise is that the math describes what is going on so well that it applies to situations that we didn't know about when we devised the math that applied. That is, it's predictive, not just descriptive.


Well, the last one is a kind of surprise like... if you believe in real numbers, then there are some non-repeating real numbers (pi is believed to be one of them) whose digits contain every finite digit sequence. To someone unfamiliar with the concept it sounds amazing and improbable that a single number contains every finite string of digits. But this is because we aren't used to dealing with infinite things.

Similarly, if mathematics is so powerful as to be able to describe any physical reality, it shouldn't be surprising that it can describe ours, no matter how complex and detailed.
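For a concrete, provable instance of the "contains every finite digit string" property (for pi it remains only a conjecture), Champernowne's constant 0.123456789101112... works; a minimal Python sketch (the truncation length is an arbitrary choice of mine):

```python
# Champernowne's constant is built by concatenating the positive integers;
# it provably contains every finite digit string somewhere in its expansion.
digits = "".join(str(i) for i in range(1, 2000))  # a finite prefix of its digits

print("1234" in digits)  # True: appears right at the start, "12345..."
# Every two-digit string appears, since the numbers 10..99 occur literally:
print(all(str(n) in digits for n in range(10, 100)))  # True
```

Of course a finite prefix can only witness finitely many strings; the full constant contains them all, which is the sense in which one infinite object "embeds" all finite ones.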


All ideas are discoveries, not creations. The fact that anybody else can come up with the same idea lends credence to this. That's how ideas work - they are transcendental.


Your argument is similar in spirit to the claim that from any collection of sets, even an infinite one, a single element can be selected from each to form a new set (the so-called "axiom of choice"). I.e. you equate the mere possibility of existence with a proof of existence (that's not a slur; there are a lot of mathematicians who do just that).

I don't see a reason to think that coincidentally arriving at the same ideas means that ideas are discovered. We use the same framework, with the same rules. It's not unlikely that we'll create the same ideas, because we all use the same rules, but the framework is so vast... it has so many rule combinations that it's mind-boggling to think all ideas already exist in this framework.

More practically, we call an action a "discovery" when we (unbeknownst to us) faced the consequences of the phenomenon being discovered, but didn't know why we were facing them. An astronomer who finds a "new" star was ever so slightly influenced by that star's light, gravity etc. A marine biologist who discovers a new deep-water fish was ever so slightly affected by that fish through a complicated chain linking many different species through biosphere.

An artist drawing a new painting isn't discovering it in the same sense. She isn't interested in how an existing painting was connected to the consequences of her life or the lives of the whole human species. By selecting, out of all the possible ways paint can be laid down on canvas, her particular way of doing it, she creates something genuinely new, or as new as it can possibly be. It's counterproductive to label this activity "discovery", because then we lose an important distinction between the nature of the work of an astronomer and that of a painter (or a mathematician).

There are sculptors who'd jokingly say that they "discover" the statue in a stone slab. But they do so in a sarcastic kind of way, really (well, artists are weird and will make a lot of nonsense claims just to trigger a non-artistic audience). But, deep down, nobody believes that they are searching for and finding good images, melodies or novels. I cannot really imagine a mechanism through which I'd discover the answer I'm writing to you. It's a lot easier to explain what I wrote by saying that I meant to write it.


It's not my argument. It's actually a very old argument.

https://en.wikipedia.org/wiki/Theory_of_forms


IMHO, the book was too loose with the truth (even though it purported to write about mathematical truths). It's a work of fiction (beyond just filling in the characters' words to each other), but that wasn't revealed until the end, which left me sour. It pretends to be history right up until the end.

GEB isn't really related to this subject, although it contains some of the same themes and characters.


It's still not stiff enough. Mathematicians really ought to settle on a sound and coordinated syntax.


Syntax is about the least of people's worries when they worry about 'mathematical rigor'.

Math is made for people to read and write, and they have different 'domain specific languages' for different parts of math.

Now what would be useful is a tool, perhaps something like a large language model, to automatically translate from the notation used in one area of math into another.

That can't be a fully mechanical procedure (hence the need for something flexible like an LLM), because it's part and parcel of human mathematics to abuse notation here and there in the name of ergonomics.

My personal pet peeve is defining a function like f(x) := x + 3, and then treating f(x) as the name of the function, instead of just f. But really, it's just a harmless abuse of notation when done by some humans to other humans.


While I agree with the point you make, I think that syntax has a huge impact. Maybe not on mathematicians themselves, but I am certain that mathematical syntax has done more to scare people away from the field than the actual mathematical problems themselves.

For example, the convention of using the Greek alphabet for certain things. This is totally arbitrary, and you could have also used emoticons instead (had they existed). But what it means is that the pupil, before tackling the meat of the mathematical problem, has to accept weird-looking letters they have never seen, for no real reason whatsoever.

And I say that as someone who can fluently read the Greek alphabet.


It is not arbitrary, it is heritage. Apart from the fact that Greek was the de facto scientific language of the west (which is no longer the case), I think we can agree that the characters of a single alphabet are not enough for notation, especially given the fact that it is very useful to be able to discern different types of entities, e.g. constants from variables or vectors from matrices.

If we changed symbols now, it would create an even bigger mess. Because the people that learn the new symbols, could not read any textbook published before 0 A.D. (Anno Discombobuli)


So you read my comment and thought: "That guy who can read the Greek alphabet doesn't know there are historical roots for Greek letters in mathematical notation".

Sure, back then, when everybody who learned trigonometry had a classical education including Ancient Greek, picking Greek letters when the Latin alphabet wouldn't do was a rational decision. It just hasn't aged well.


I'm just saying that changing the symbols will make things worse, not better.

Virtually all the textbooks use those symbols. Do you have a viable and better alternative to suggest or are you just complaining? And I didn't assume that you know or don't know something, I just wrote it down for the sake of the argument. We are not having a private conversation, we are contributing to a public discussion.


That's not what they said: they made the claim that a single alphabet is insufficient, and thus Greek makes as good a choice as any, since it was used historically too (and allows easy differentiation).

Math also uses a lot of made-up, math-specific symbols (integrals, summations, arrows, operators...).

If you want to claim that Greek is the problem, how would you solve the problem of insufficient symbols?


There are simple practical reasons, though. We don't do that to be fancy. There are simply not enough letters in the Latin alphabet to not have common intersection in writing. We like to use the same letters for objects of the same type (like x, y for coordinates or i, j, k, l for indices) because that increases readability significantly. But it does mean that you run out quite quickly.

Adding another alphabet alleviates those issues somewhat, but even with Greek letters added in we still run into this problem fairly commonly.


> There are simply not enough letters in the Latin alphabet to not have common intersection in writing.

Agreed. And yet... some of the approaches taken to deal with that can be wildly annoying. Actually, using the Greek letters is probably the best of the lot, since they are a completely different set of characters with known pronunciations.

OTOH, sometimes you'll see people use both upper-case and lower-case Latin letters in the same problem, forcing you to read it in stilted language like "The derivative of Big X with respect to y, plus the integral of Little x ..." Aaarrgggh.[1]

And then you get the "stylized" letters, which are (mostly) just Latin letters, but have no obvious unique pronunciation or verbalization without going through contortions. I mean, what do you say for "𝔑" especially if there is also a "n" on the page? And who's even going to recognize these monstrosities unless you're already a mathematician: 𝔖, 𝔚, 𝖄? Aaarrggggghhh.

[1]: to be fair, you could have the same problem with mixed case of Greek letters, but I haven't seen that as a common problem. But maybe that's only because I'm not a mathematician. shrug


Well, you could use variable names longer than a single letter?


It's a trade-off between brevity and verbosity; some mathematicians do that. Often the longer variable names are in ALL-CAPS. Since juxtaposition means multiplication (ab = a*b), it's important to be able to tell variables apart from products.


Yes, it is. I just brought it up because it's an important factor when talking about the need for additional symbols.

Btw, even mathematicians don't mind writing out `sin` or `cos` or `ln` in their formulas. So they are certainly not completely averse to multiple letters.


I do not think mixing up 'f(x)' and 'f' is harmless, given mathematics is all about clarity. In any quality text, 'f(x)' denotes the value of 'f' at 'x', and 'f' denotes the function 'f' itself. Also, speaking of notation, I wonder why you used ':=' instead of '=' to define 'f'. There is no computation going on, right?


Mixing up 'f' and 'f(x)' is mostly harmless in practice. The underlying principles are still clear enough. (And I say that as someone who would _really_ like to make the argument that people who mess this up are somehow unclear in their thinking. No, they are mostly fine.)

Getting 'f' vs 'f(x)' right is really important for programmers, who deal with higher-order functions all the time. Most mathematicians don't fall into that category.

You could say calculus deals with higher-order functions, like the derivative, and that's a valid way to look at it. But most people get by just fine using special-purpose notation for the derivative and not thinking of it as a function just like 'f'.

I used := to emphasize that I am defining 'f' here, not just writing down any old equation. (E.g. like in the example "Find all functions f such that f(x + 1) = x * f(x).")

Though if you wanted to be pedantic about notation, I could have written it with the x on the other side of the :=, like f := \x -> x + 3 (Haskell-inspired notation) or f := (x |-> x + 3), where |-> means the little arrow I draw by hand to denote a mapping when I'm writing math on a chalkboard or piece of paper.

I'm not sure why := would denote a computation? At most you might want to use it to denote an assignment in a mutable context?
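For what it's worth, the distinction is easy to demonstrate in code; a minimal Python sketch (the names are mine, not from the thread):

```python
# 'f' names the function itself; 'f(3)' is the value of f at 3.
def f(x):
    return x + 3

# Higher-order use: map() needs the function 'f', not a value 'f(x)'.
results = list(map(f, [1, 2, 3]))
print(results)  # [4, 5, 6]

value = f(3)
print(value)  # 6 -- a plain number; map(value, ...) would fail, since 6 isn't callable
```

This is exactly the situation where conflating 'f' with 'f(x)' stops being harmless: a higher-order function consumes the former and chokes on the latter.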


> Getting 'f' vs 'f(x)' right is really important for programmers, who deal with higher-order functions all the time. Most mathematicians don't fall into that category.

Mathematicians deal with higher order functions all the time, e.g. in functional analysis.


The amount of "cheating" (as in, notation/language abuse) in functional analysis is much worse than that. People routinely call points in L^2[0,1] "functions"... OTOH, I don't think it leads to serious problems.

OTOH, the lack of rigor is definitely one of the problems of contemporary math. Many years ago, when I was a student, I studied one paper, coauthored by two people - call them X and Y. X was a very established mathematician, Y was a relative newcomer. There was one (set-theoretical) argument I couldn't understand, so I asked Y (he was my advisor's friend) about it. He told me "yeah, X asked me this, too, and I told him to use Zorn's lemma, and after a moment of thinking, he said, «yeah, that would work»". I'm not a set theorist myself, but it smelled suspicious to me, so I asked another friend, who knew much more about set theory than me. He smiled and said "of course it's wrong, it's a very common mistake".

Had X and Y written out the argument more rigorously, we'd have one less published result with no correct proof...

And I have quite a few other anecdotes like this, unfortunately.

One professor at my former faculty once told me how he approaches refereeing papers. "For the first 30 minutes, I try to prove the main result myself. If I don't succeed, I spend the next 30 minutes trying to find a counterexample. This way I write most reviews in half an hour."

A few years ago I coauthored a book about non-linear analysis, covering quite a few interesting topics. One of the coauthors insisted on writing out proofs in detail and rigorously, and now we joke that our book is the first one where some (quite established and well-known in this field) theorems are proved correctly for the first time. (And that includes proofs with gaps/mistakes in both research papers and monographs, btw.)


Integration and differentiation are examples of higher order functions. And so are many things you can do to groups.

I specifically meant getting this is important for programmers who deal with higher order functions.

It is not so important for mathematicians who deal with higher order functions.

Mostly because the intended audience for their writings is smarter than a computer, and there's typically more context.
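To make "differentiation is a higher-order function" concrete, here is a small numerical Python sketch (the central-difference approximation is just one possible choice of mine):

```python
# The derivative operator takes a function and returns
# (a numerical approximation of) another function.
def derivative(f, h=1e-6):
    def df(x):
        return (f(x + h) - f(x - h)) / (2 * h)  # central difference
    return df

square = lambda x: x * x
dsquare = derivative(square)   # dsquare approximates x |-> 2x
print(round(dsquare(3.0), 3))  # 6.0
```

Here the "f vs f(x)" distinction is forced on you: `derivative(square)` works, `derivative(square(3))` would not.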


> Getting 'f' vs 'f(x)' right is really important for programmers, who deal with higher-order functions all the time. Most mathematicians don't fall into that category.

Operators and functionals?


It's usually clear from context what you mean, even if you work with operators and functionals.


> Also, speaking of notation, I wonder why you used ':=' instead of '=' to define 'f'. There is no computation going on, right?

In math, := is typically used to denote a definition. Using equality (=) only makes sense if both sides of the equality sign already have a definition.


Well, to be fair, = is also very often used for definitions. And the reader has to figure out from context which meaning of = applies.


> In math, := is typically used to denote a definition.

I mentioned this up-thread, but is that why := is assignment in Pascal? Wasn't Pascal the main academic language for a while?


Yeah, I think that has the same origin. Though I'm not sure whether programming languages or maths came first. For programming languages, it apparently first appeared in ALGOL in 1958 [1].

Edit: On math.SE [2] someone claims that it's notation borrowed from programming. Someone else claims that it was introduced by Bourbaki, which might predate programming, as Bourbaki started publishing in the 1930s. However, I couldn't find any evidence of this from skimming a few Bourbaki PDFs.

[1] https://en.wikipedia.org/wiki/Assignment_(computer_science)

[2] https://math.stackexchange.com/a/25215/312406


I agree, but I'm by no means even remotely an expert. Mixing up f and f(x) seems pretty bad to me.

f = y+3 makes sense; f(x) = y+3 does not make sense (at least to me); f(y) = y+3 makes sense, however.

f(x) is a function of x correct? It's articulated as "f of x".

> I wonder why you used ':=' instead of '=' to define 'f'. There is no computation going on, right?

:= is assignment in Pascal, IIRC; maybe that's where it's coming from.


Something like 'f(x) = y + 3' can make perfect sense, depending on context.

For example, that could describe a constant function that doesn't depend on x, and y is a free variable that gets its value from context.

Or y could implicitly be a function of x. That happens a lot in calculus or physics.
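One way to make "y gets its value from context" concrete is a Python sketch using a closure (this encoding is an assumption of mine, not the only reading of 'f(x) = y + 3'):

```python
# 'f(x) = y + 3' read as a constant function of x, where the free
# variable y is bound by the surrounding context (here, a closure).
def make_f(y):
    def f(x):        # f ignores its argument x; its value comes from the enclosing y
        return y + 3
    return f

f = make_f(10)
print(f(0), f(999))  # 13 13 -- constant in x
```

Changing the context (calling `make_f` with a different y) changes the function, which is exactly the "y gets its value from context" reading.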


Syntax and notation are an incredibly minor "problem". We have much more significant (and interesting) stuff to deal with.

Judging maths on its syntax is like judging a poem or work of literature on its font. It really isn't a central thing.


This book is not really addressing the more common "is math real" question of it being empirical or invented. For an interesting take on that question, see the first section of the second part of Daniel Shanks' Solved and Unsolved Problems in Number Theory. He makes some interesting points about the old Pythagorean views.


For me, both questions "is math real" and "is math discovered or invented" miss the point. Math is a model of the universe in the same sense that a world map is a model of the earth.

Is a map real? Well, it is. I can see it on my desk. Is the earth real? It is too, but they are not the same. In that sense map is also not "real".

Is the map discovered? Well, it uses data that was mostly discovered, but some parts were "invented" or edited for simplification for the map to be useful.

The real question should be "is math useful as a model?". We all know the most basic parts are, but some mathematicians forget that they are dealing with an imperfect model and keep finding paradoxes. It's like forgetting the distortions caused by the Mercator projection and being surprised that real-world distances are not proportional to map distances.

That's the reason I always liked engineering more than maths. When programming, you "import" only the libraries you need and find useful for the task, making sure that they are compatible with each other. Mathematicians "import" all axioms, call them maths, and are surprised when they get paradoxes.


You're taking a distinct philosophical stance, but you're also being unnecessarily dismissive by claiming other stances "miss the point".

Math is nothing like a map -- maps are approximations of something real and they don't have any kind of internal consistency or complexity.

But there's a good argument that math is the fundamental nature of the universe, and mathematical discoveries lead to predictions of real-world behavior. While maps don't predict a thing.

The philosophical discussion isn't around whether math is useful for tracing the arc of a ball in the air, for which it always will be merely a useful approximation. It's more around math as the language of the universe, in things like quantum physics -- there's no "approximation" here, it's more the nature of reality itself.

And here, the philosophical questions around whether our descriptions of quantum physics are "invented" or "discovered" go quite deep, and necessarily involve the nature of human knowledge itself. For many people, these don't "miss the point" at all -- they're some of the deepest, most profoundly meaningful questions that exist.


> You're taking a distinct philosophical stance, but you're also being unnecessarily dismissive by claiming other stances "miss the point".

I read my comment again and I was surprised, as I did not intend this tone. I’m sorry for being dismissive and for generalising too much about mathematicians.

Could you elaborate or point me to a formulation of the “language of the universe” argument you mentioned that avoids mentioning quantum physics? I don’t understand quantum physics and I’d like to avoid falling for the quantum physics fallacy [1].

[1]: https://www.logicallyfallacious.com/logicalfallacies/Quantum...


No worries! Just wanted to make sure you were aware of other perspectives.

And a good place in general to start is always Wikipedia:

https://en.wikipedia.org/wiki/Philosophy_of_mathematics

And none of this has anything to do with the "quantum physics fallacy" at all. Philosophically, it's simply an argument about the most basic physical understanding of our universe, and right now that happens to be quantum physics.


> It's more around math as the language of the universe, in things like quantum physics -- there's no "approximation" here, it's more the nature of reality itself.

Why is it that everyone thinks of mathematical models of quantum mechanics as much closer to the "nature of reality" than any other mathematical model? If anything the constant disagreements between quantum mechanics and physical models at other scales should make it clear that all the models we have are wrong by virtue of incompatibility.


I think the question should go even deeper. There are so many fundamental axioms that must be accepted on faith alone. The question I usually start with is "Can anyone prove that numbers exist outside of our imagination?" I'm not talking simply about perception. Even I believe that if I perceive that I am hit with a brick, then the brick exists. But we have no senses that can detect numbers. When I've asked this question of the several mathematicians I know, the answer has always been ~ Yeah, good question ~ and then they move on.


Feynman has some useful words about this phenomenon of always wanting to dig deeper: https://youtu.be/36GT2zI8lVA?si=Boiqod3GXHVMyE_s


Why do you think we should go deeper with pointless questions? What would you do with the answer if someone provided one?


> "Can anyone prove that numbers exist outside of our imagination?"

What do you mean by "exist" here?


> Math is a model of the universe in the same sense that a world map is a model of the earth.

Except math can hypothetically model any consistent universe, not just our universe, which kind of undercuts the argument that it uses data that was mostly discovered, or that it's merely a model.

I think the most general view is that math is the study of structure, and some structures are real (in the sense that they exist in our universe), and some are not but we can still "discover" them by selective permutation or enumeration of axioms.


Cartography can also model any consistent universe, and I fail to see how that changes anything for the “it’s just a model” argument.

We can permute and enumerate symbols for mountains, rivers and roads on a piece of paper. Maybe we would even get some “interesting” results, like a map of the Lord of the Rings universe. How would that change anything?


I think you are both getting lost in the weeds trying to make this metaphor work, or not work.

Math is simply the logical conclusion of a set of conditions someone accepts as inherently true. If this, then that. Follow this logic far enough and you end up where we are today.


I think this conflates logic and mathematics. Some would dispute that logic underpins mathematics. Counting can be analyzed logically, but it does not in any meaningful sense seem to depend on logic.


Sure it does. How else would you prove that one number follows or precedes another? I think you are conflating the act of physically counting with the logical foundation of our number systems.


It doesn't seem correct to equate mathematics with proof. If I express a mathematical construction like the whole numbers (let Whole = Zero | Succ Whole), and I build further constructions on that foundation, am I doing mathematics? If so, then it seems mathematics does not depend on logic, as logic depends on propositions and there are no propositions to be seen.

Certainly you can analyze such constructions using logic, but that's again conflating logic with mathematics. There's overlap, but they aren't strictly the same.
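The 'Whole = Zero | Succ Whole' construction above can be rendered as a Python sketch (the class encoding is my assumption; the comment's notation is closer to an algebraic data type in a language like Haskell):

```python
# A minimal rendering of 'Whole = Zero | Succ Whole':
# a whole number is either Zero, or the successor of another whole number.
from dataclasses import dataclass

class Whole:
    pass

@dataclass
class Zero(Whole):
    pass

@dataclass
class Succ(Whole):
    pred: Whole

def to_int(n: Whole) -> int:
    """Interpret a Whole value as an ordinary integer by counting Succ layers."""
    count = 0
    while isinstance(n, Succ):
        count += 1
        n = n.pred
    return count

three = Succ(Succ(Succ(Zero())))
print(to_int(three))  # 3
```

Nothing here is a proposition; we are just building and interpreting structure, which is the commenter's point that such construction doesn't obviously depend on logic in the predicate-calculus sense.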


I would say that a mathematical construction is based in logic, maybe not in the traditional sense, but there is definitely logic behind the construction itself. I think we are talking past one another, what I mean by logic here is more nebulous than predicate calculus. There is an innate logic behind the philosophy of mathematics and I believe you cannot divorce mathematics from logic.


Show me a cartographic map of a 5-dimensional universe.


> Math is a model of the universe in the same sense that a world map is a model of the earth.

That describes the pre-1900s math we inherited from the Greeks. With the advent of non-Euclidean geometry and abstract math, math is no longer bound to objective 'reality'.


> Math is a model of the universe

And I would dare to disagree right here. Math contains many structures that we don't know from our universe and that probably do not exist in our universe. If math is a model of universe, why is there a Mandelbrot set?


Yes, that is an intrinsic property of all models. They are imperfect, and we accept it as long as the models are useful for some purposes.

My map has text written on it saying “Pacific Ocean”, yet I would not complain if I went to that place and couldn’t find a giant object in the ocean that looks like a letter P from the sky.


> This book is not really addressing the more common "is math real" question of it being empirical or invented.

Please note, this is mentioned at the beginning of the review:

"I settled in to read the book “Is Math Real?” expecting to become embroiled in the age-old controversy of whether math is invented or math is discovered. Instead, I found myself confronted with two viewpoints of mathematics: one view is that mathematics is a stiff and fixed set of rules and algorithms while the other view is that mathematics is flexible and our understanding of math comes from questioning of why mathematics functions so effectively.

The premise of “Is Math Real?” is that people have different emotions about math. Some love the math and have little difficulty determining the correct answer to a problem while others loathe and dislike the math and have a difficult time ascertaining the correct response. Many times, a student is humbled or chastised for asking ‘a stupid question’. Author Cheng states that there are no stupid questions. In fact, the most profound concepts in mathematics are learned from asking the simplest of questions.”


Math is invented with a purpose to help humans understand the economy and the world around them. Economy, in turn, was also invented to help humans organize their resource use.

The more humans understood the world, the more they tried to apply math and other sciences (also invented by humans) in order to explain it.

It's not even a question. Two apples will always be two apples. It's just that, without math, it would be "an apple and another apple next to it".


Pythagorean ideas — well, it has been some of the most enriching philosophy I’ve ever encountered. It’s a rabbit hole, for sure.


It's interesting that mathematicians, when asked philosophically, might have all kinds of interesting and nuanced ideas about this topic. But when you let them get back to their mathematics, they behave as if they believed deep down in their heart that mathematics exists independently of the observer.

(It's a working attitude that works well in practice. Just like a geocentric world view works well enough for most celestial navigation you can do without computers.)


I am waiting for the universe to do something non-mathematical. There is no reason to believe it can't :-)


Sorry to break it to you, but most mathematicians don't give a s*t. Even worse, a large percentage of mathematicians (not sure if "most", but I'm afraid yes) do not usually have "interesting and nuanced ideas" (nor opinions) on anything.


This book is on my reading list. I listened to an interview of the author who said that their primary motivation of the book was less about teaching math, but teaching the value of looking at the world abstractly to determine patterns and constructs to explain our shared experience.

Is math real or not? It doesn't matter she posits, it works and continues to work and we can learn from its existence that almost everything can be explained and is "predictable" given enough inputs.


I don't know if it's real but it's definitely complex.


Came here looking for this comment, not disappointed.


OTOH it can also be rational and natural.


OTOOH sometimes it's irrational and transcendental so there's that.


And, amongst the reals, the irrational infinitely outnumber the wholesome


Get out.


A better question is - is math an inherent part of the universe or is it a uniquely human construct?


Math is a human construct to describe inherent parts of the universe (among other things).

The universe is compatible with math not because the math is part of it, but because the universe is what math was invented to describe. Most fundamentally, relationships between related structures and sizes. Obviously the universe is full of those since an order does emanate in ours. So aliens probably also have math and even discovering the same relationships etc but that still won't make math an inherent part of the universe to me. The universe doesn't care for math, it just is. Intelligent beings want to describe and discuss it though so we keep inventing math in order to do so and speak a common language.

Personally I find this very simple and not controversial at all. Math was simply invented as a system for us to teach and jot down things so that we don't lose knowledge across generations or for example colleagues.


Right, mathematics is a language for very formally and precisely describing consistent relationships. Since the universe is persistent and consistent, such descriptions are useful to us.

It may not be possible to know why the universe is this particular way, but I don't think the universe is consistent 'because of maths'. The universe is this way for unknown reasons, but languages don't define or create the things they describe.

To prove this, we can construct descriptions of things that do not or cannot physically exist. Frodo the Hobbit, for example, in English. I'm sure there are equivalent expressions in maths that don't relate to physical things. The description, and therefore the concept exist (same thing), but the thing itself does not. Another way to say it is that the description does not correspond to something that is physically real. So we can construct mathematical descriptions of unreal or hypothetical things, and we can construct English language descriptions of such things. That's just a feature of languages.


The mathematical universe hypothesis says reality is mathematics. Not just that math can model reality, but that the universe is a mathematical structure.

That isn't really at odds with what you are saying though, it's just a lower level.


Mathematics is an inherent part of the Universe. What we Discover/Invent are Models/Notation for identifying and using them. The difference today is that we have built and extended the Abstractions/Relationships to such an extent that people feel it is not "Real" which is of course not the case.

The best argument for this is the various structures across the World (eg. Pyramids in Egypt/Central-Latin America, Temples/Structures in India, Aqueducts from ancient Rome, Great wall from ancient China etc.) spanning thousands of years which could not have been built without a knowledge of Mathematics as we define it today. Their approach and models/notation may have been different but the essence of the Mathematical Abstraction is the same.


Sometimes math can be pretty surprising, such as when complex numbers might be shown to be necessary to explain quantum mechanics:

https://www.nature.com/articles/s41586-021-04160-4

Or when group theory predicts subatomic particles through symmetries:

https://www.britannica.com/science/subatomic-particle/Hidden...

But other things like continuity and limits, as well as various topological spaces, seem to be purely mathematical constructs.


Continuity and limits, and some relevant topological spaces are absolutely required to describe very basic fundamental physics. You can't write down even stuff like Newton's laws without calculus, and you can't do calculus without something equivalent to limits.
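
As a small illustration of this (my own hypothetical example, not from the thread): even solving Newton's second law numerically smuggles limits back in, because a finite-step approximation only recovers the exact answer in the limit as the step size shrinks to zero:

```python
# Free fall under constant gravity: a = -g, integrated with explicit Euler steps.
# The exact answer, x(t) = g*t**2/2 fallen, is recovered only in the limit dt -> 0.
g = 9.81  # m/s^2

def fall_distance(t_end, dt):
    steps = round(t_end / dt)
    x, v = 0.0, 0.0
    for _ in range(steps):
        x += v * dt   # position update uses the current velocity
        v -= g * dt   # velocity update from F = m*a with a = -g
    return -x         # distance fallen (positive)

exact = g * 1.0**2 / 2           # 4.905 m after one second
print(fall_distance(1.0, 1e-4))  # approaches 4.905 as dt shrinks
```

Shrinking `dt` drives the numerical answer toward the exact one, which is precisely the limit concept at work.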


See my other comment here: https://news.ycombinator.com/item?id=37656080

As the other commentator points out, "Continuity and Limits" are fundamental to explaining physical phenomena (via differential equations) and are not "purely mathematical constructs".

PS: You might find this interesting; Imaginary Numbers are Real - https://www.youtube.com/playlist?list=PLiaHhY2iBX9g6KIvZ_703...


You point to the argument that "we have to calculate with them, therefore they are real", but another aspect of the debate is about which mathematical entities are real in the sense of being physically realizable. From this perspective, it's not even clear that real numbers are real, e.g. whether and how it would be physically possible for a real-number based quantity to be present in a finite space.


If Mathematics helps in explaining Nature/Universe then it is "Real" no matter if the explanation goes through a whole stack of abstractions or not. There are two main parts;

a) A Formal System consisting of Set of Objects, Operations, Mappings, Axioms and Logic Rules. We invent the symbols and notations to express these.

b) A Domain of Discourse/Interpretation in which the above is applied to map to "Reality".

We Humans have an innate sense of Quantities, Proportion, Objects and Relationships which is what can be called the "Mathematical sense". Even the most uneducated goatherd can count his goats (eg. using pattern-matching with one stone per goat) without knowing anything about the number system. He can also compare his bunch with his neighbour's and tell you which is larger. If you throw a ball at him he can estimate its trajectory and move accordingly to catch it. We have merely abstracted out the essentials from the above and modeled them as Set Theory, Integer/Real Number lines, Rate of change of one quantity w.r.t. another etc. and labeled these as "Mathematics". The models are by design "abstract" but once applied to a "domain" become concrete.


I agree with everything you say but that's not really what I was talking about.

Physical realizability concerns the question whether some type of entity can in principle exist in the known physical universe, whether its physical existence would violate existing laws of nature. The question is independent of the question whether there is (also) a Platonic realm of mathematical objects (although there is a connection if you are neither a constructivist nor a Platonist). As far as I know, nobody doubts that integer quantities can be physically realized without violating existing laws. Likewise, you can say that a square is an abstraction from a square macroscopic object, even though no side of that object can be perfectly square in nature.

However, the case with real numbers is a bit different from the square. It doesn't make much sense to claim that real numbers are abstractions from quantities that exist as finite, quantized integers in empirical actuality. But if it's not an abstraction from something that clearly can be physically realized, then it is meaningful to ask whether a real-number quantity can exist in the physical universe. From what I remember, some philosophers and physicists think the answer is No.


>It doesn't make much sense to claim that real numbers are abstractions from quantities that exist as finite, quantized integers in empirical actuality. But if it's not an abstraction from something that clearly can be physically realized, then it is meaningful to ask whether a real-number quantity can exist in the physical universe.

Real numbers are absolutely "physically realizable" (in the sense that you are defining it) in the Physical World. If you have 3 litres of water and you give me half, you have just "realized" the Real Number 1.5 from a "quantized integer" 3. Incidentally even integers are just an abstraction of attributes of collections of things i.e. cardinality of a set of things. This is why I tell people to look at Mathematics as a Formal System + Domain of Discourse in the Real World. You do all your symbol manipulations in the former and at the end map it to the real world to see whether it is valid.


You're right that some real numbers are physically realizable, but not all of them. The debate usually focuses on irrational numbers. Please bear in mind this is an existing debate in the philosophy of mathematics, not my personal invention, and I have to apologize for being somewhat vague about it. I tried to find a paper I've stumbled across years ago but couldn't find it, so I'm writing from distant memory.

Anyway, the argument goes roughly like this: Real numbers also include the irrational numbers, and if these were physically realized, then they would contain an infinite amount of information within a finite space. This violates various physical laws.

Now don't get me wrong, this is all controversial. The idea is, for example, that π cannot be physically realized because it has an infinite decimal expansion. Some people would agree, others would disagree.

I understand why you disagree, but bear in mind my original point was not to argue that real numbers aren't physically real, but rather that there is no general agreement about this issue among people who muse about these kinds of philosophical questions. The question is relevant for foundational views about mathematics. If certain real numbers like 1/3 and π cannot be physically realized, they cannot be abstractions from something encountered in nature (at least not in the sense of "abstraction" according to which some properties are ignored). The view remains compatible with regarding them as mental constructions and compatible with mathematical Platonism, though.


The difficulties with Irrational Numbers were obvious from the beginning (https://en.wikipedia.org/wiki/Irrational_number#History). Many of them are very "Real" and occur naturally in the Universe (eg. Pi = Circumference of a circle / Diameter of the circle) but our modeling of them doesn't feel "natural" and hence the confusion. You might find the following interesting;

1) God created the Irrational Numbers : https://www.welovephilosophy.com/2014/03/26/god-created-the-...

2) What is a real-world metaphor for irrational numbers? : https://math.stackexchange.com/questions/2065998/what-is-a-r...

> Anyway, the argument goes roughly like this: Real numbers also include the irrational numbers, and if these were physically realized, then they would contain an infinite amount of information within a finite space. This violates various physical laws.

See Do irrational numbers contain infinite information? : https://www.quora.com/Do-irrational-numbers-contain-infinite...


Well, Yes, that's what the debate is about. Just to make this clear (in case someone else reads this thread), this is an ongoing debate in the philosophy of mathematics and not just something people muse about on Quora and Stackexchange. What's important for me is that the notion of physical realizability is understood correctly. It does not pertain to philosophical arguments or common sense, it really means that something can be present as a quantity in nature (actuality) without violating existing physical laws. When someone argues that a mathematical structure or entity cannot be physically realized, the argument must concretely show that the realization would violate some currently well-confirmed laws of nature or fundamental physical principles.

There is no disagreement with you. I just wanted to clarify that (hope you don't mind). I didn't have just any arguments against irrational numbers in mind but a specific type of arguments.


Your idea of physical realizability seems to me to be questionable. Because there are many abstractions in a chain starting from mental concept to a final physical object where each link is necessary to get to the final result (eg. a modern computer). Do not get caught up in metaphysical arguments of philosophers on mathematics which are often just playing with words rather than substance (eg. my link above to God created the irrational numbers by a PhD in Philosophy).

Finally to conclude this thread; i highly recommend reading The Unreasonable Effectiveness of Mathematics in the Natural Sciences by Eugene Wigner if you haven't already done so.

1) Summary on wikipedia - https://en.wikipedia.org/wiki/The_Unreasonable_Effectiveness...

2) Complete paper - https://www.maths.ed.ac.uk/~v1ranick/papers/wigner.pdf


> your idea of physical realizability seems to me to be questionable.

I made it abundantly clear that is not my idea but an ongoing discussion in the philosophy of mathematics. You're telling me to not get caught in philosophical arguments and in the very sentence before that presuppose the idea that mathematics is a mental construction, which is just one out of many philosophical views in that area. By the way, I'm interested in philosophical issues because I am a philosopher. Just because you don't like these issues or find them "questionable" doesn't mean anything. Some of my colleagues defend an Anti-Fregean formal foundation of mathematics to which physical realizability seems to pose a huge problem. If they want to get their papers published, they'll have to address the issue.

I'm aware of Wigner's paper, it's a well-known classic. Finally, to conclude this thread from my perspective, while I'm personally not interested in the (broadly conceived) metaphysics of mathematics and am happy to leave these issues to mathematicians interested in them, the way you're just presupposing that mathematical objects are mere mental constructions cannot really count as engaging with the problems yet. If you read what I wrote above again, you'll realize that I presented an argument why irrational numbers cannot be abstractions, yet you keep talking about abstractions. In a nutshell, it's not that simple.


>it's not even clear that real numbers are real,

See https://en.wikipedia.org/wiki/Real_number

> whether and how it would physically possible for a real-number based quantity to be present in a finite space

If you have a continuous function between two points in space then you need Real numbers.


There’s so much more math though. More than will ever be seen or used in structures, thought about, etc even in principle. E.g. if string theory isn’t real the math still is, so where is it? "Part of the universe" is a pretty unsatisfying answer.


>There’s so much more math though.

True, that is why I used the words "Discover/Invent". Also I used physical structures as an example since they are the most visible and unarguable evidence of "Real" Mathematics from the earliest times.

You can invent abstractions to model concrete things(eg. all that is needed to model a skyscraper) or to model still further abstractions in a chain (eg. vector spaces for multidimensional/functional/etc. spaces). We only realize that it is "Real" when it is "Applied" in the concrete World (eg. number theory in cryptography).


I’m not disagreeing, just thinking your line of argument must say more about where all this perpetually undiscovered math is confined in the universe. It sounds like you’re forced to believe math is abstract in nature then, but doesn’t that conflict with your earlier points?


https://news.ycombinator.com/item?id=37662094

>where all this perpetually undiscovered math is confined in the universe

It is in the very fabric/structure of the Universe itself, e.g. the calculus buried in the motion of planets around the Sun and discovered by Newton.


Agree. Our math is convenient for us humans living in our universe.

But how much of our math is just a poor approximation of our universe? Like Newton's gravity was.

If our math only _approximates_ the world, if we discovered something that explains things better, it would all be irrelevant.

There are a lot of hints that something big is missing in our maths as a means of explanation. Like the decimal expansions of the constants Pi and Euler's number going on forever without repeating, quantum randomness...


A good line of questioning is to explore the constants that arise in physics, of which there are nineteen[3].

E.g. "Why is the speed of light what it is?".

~299,792,458 meters per second. But the meter is itself defined in terms of the speed of light, so this number is very human-math-specific.

So instead, you want to look at the speed of light in terms of other physical constants to find a "dimensionless" constant.

This leads us to the fine-structure constant[1], which is a single number that pops out when you relate a few of these experimentally measured constants to each other.

0.0072973525693 ≃ 1/137

This is a number that, if it were any different, would mean the universe would not exist in the way it does.

Something very human is the notion of "1". Counting things is very important to intelligent life.

I was thinking the other day, about the world from the perspective of a tree. It doesn't care about counting things. So "1" is irrelevant to it. It's an invented concept by humans.

And most of our mathematical thinking is based around this.

There could be an infinitely deeper and more complicated maths to explain things.

It's like looking at a leaf without a microscope to figure out biological processes. Until the 1600s, biologists could only study what their eyes could see.

All this quantum randomness feels like we are still just looking at a leaf with our eyes.

[1]: https://en.wikipedia.org/wiki/Fine-structure_constant

[2]: https://en.wikipedia.org/wiki/Dimensionless_physical_constan...

[3]: https://en.wikipedia.org/wiki/Physical_constant#Number_of_fu...
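
The "dimensionless" step described above can be made concrete. Plugging in the CODATA values of the constants involved (hardcoded below; e, h, and c are exact by definition since the 2019 SI revision, ε₀ is measured), the fine-structure constant α = e² / (4π·ε₀·ħ·c) drops out with no units left over:

```python
import math

# CODATA values (assumed here; check NIST for current figures)
e    = 1.602176634e-19      # elementary charge, C (exact)
h    = 6.62607015e-34       # Planck constant, J*s (exact)
hbar = h / (2 * math.pi)    # reduced Planck constant
c    = 299792458            # speed of light, m/s (exact)
eps0 = 8.8541878128e-12     # vacuum permittivity, F/m (measured)

alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(alpha, 1 / alpha)  # ~0.0072973525693 and ~137.036
```

All the units (coulombs, joules, metres, seconds) cancel, which is what makes α a candidate for a number any physicist in any unit system would agree on.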


Plants may not have a concept of numbers or math, but they seem to follow mathematical patterns regardless.

https://thatsmaths.com/2014/06/05/sunflowers-and-fibonacci-m...
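
The pattern the link describes rests on a simple numeric fact: ratios of consecutive Fibonacci numbers converge to the golden ratio φ = (1 + √5)/2, which is why sunflower spiral counts come in Fibonacci pairs. A quick check:

```python
import math

def fib(n):
    """n-th Fibonacci number, fib(0) = 0, fib(1) = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

phi = (1 + math.sqrt(5)) / 2   # golden ratio, ~1.6180339887
ratio = fib(30) / fib(29)      # 832040 / 514229
print(ratio, phi)              # the two agree to roughly 12 decimal places
```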


How do you know that trees don't care about counting things? Have you ever been a tree?


Would in a completely different universe with completely different physics where some life and civilization (for whatever that means in those different physics) manage to emerge, this civilization have similar mathematics, or completely different?

Is it possible for a universe to exist where those who can think and would follow all possible logic rules, find that e.g. the natural base of logarithms turns out to be something else than 2.71828, or get different prime numbers in the integers despite using similar addition and multiplication rules, or other such changes..., or would they find exactly the same?

I think they would find exactly the same (when it comes to the real actual logic, they may use different conventions and focus on different things if they got e.g. a different amount of dimensions in their universe etc...), I simply can't think how following logic rules could conclude something else no matter in what universe...


This depends a lot on how different you imagine this universe could be.

For example, in this different universe, if I have an apple and you give me another apple, how many apples do I have? If I have 2, just like in our own universe, then you're probably right. But what if I have 3 apples? What if I still have 1 apple?

We can certainly create number systems that don't behave like the integers, or addition operations where 1 + 1 = something other than 2. We haven't explored many of those too much because they're not very interesting, but they still have structure and may have similar concepts to what we call prime numbers etc. The integers and regular addition happen to be much more useful for understanding our world than all of these other systems.

In a vastly different universe, the opposite may happen: if they studied this weird operation where 1+1=2, they would reach the same conclusions as we do. But they never study it, because it doesn't match their universe at all.
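
As a toy version of this point (my own illustration): arithmetic modulo 2 is a perfectly consistent system in which 1 + 1 = 0, and it still carries genuine structure; it is the field GF(2) that digital circuits and error-correcting codes run on.

```python
def add2(a, b):
    """Addition in GF(2): the same rule as XOR on bits."""
    return (a + b) % 2

print(add2(1, 1))  # 0, not 2 -- yet the operation is still associative and commutative
assert all(add2(a, b) == add2(b, a) for a in (0, 1) for b in (0, 1))
```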


> the natural base of logarithms turns out to be something else than 2.71828

The Euler number has a very concrete definition (or, actually, quite a few equivalent ones). The answer is clear if definitions are the same (all - including the operations we perform and structures we use).

Yet, math we know revolves around the abstraction of (discrete) language AND that we operate with things that we count. Even if we were slime molds (well within the same universe), we may have never developed the concept of integers. At the same time, there could have been 3D geometry without words.
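
Two of those equivalent definitions can be compared directly: both the compound-interest limit (1 + 1/n)^n and the series Σ 1/k! home in on the same 2.71828..., which is the sense in which anyone following the same definitions lands on the same number:

```python
import math

limit_form  = (1 + 1 / 1_000_000) ** 1_000_000               # (1 + 1/n)^n for large n
series_form = sum(1 / math.factorial(k) for k in range(20))  # sum of 1/k!

print(limit_form, series_form, math.e)
# the limit form converges slowly (error ~ e/2n); the series is accurate to machine precision
```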


Maths on other planets or in other universes can be different only if they have different sets of axioms and/or different logical inference rules.


Reader added context: humans are an inherent part of the universe.


We are a way for the universe to know itself.

Carl Sagan


"Through our eyes, the universe is perceiving itself. Through our ears, the universe is listening to its harmonies. We are the witnesses through which the universe becomes conscious of its glory, of its magnificence." - Alan Watts


The universe has a mind (human) and its current opinion is that math is inherent and discovered.


There's also the viewpoint that the universe is part of mathematics.


The mathematical model of the universe is part of mathematics.


But the universe itself might be made of mathematics, to the extent that it is made of anything at all.


it depends a lot on who you ask, so that part is human, and the part that you didn't specify is ambiguous enough that the answer can only be yeah no.


It can be both.


I've read other books by this author and felt that, for math books, they were overly political.

Is this one any different?


Yeah, I think the mathematical exposition was pretty good, but I couldn't bring myself to finish the book because of the culture war she kept trying to shove in.


Are you referring to this book in particular or her writing in general?


I am referring to her other book, The Joy of Abstraction.


Can you elaborate on what you mean by that?


Not the person you’re asking, but the reviews of previous books make similar complaints:

https://www.amazon.com/Joy-Abstraction-Exploration-Category-...


> This is not just a mathematics book, but unfortunately also a polemic of progressive politics. For example, on p.42 there is a helpful definition of “cisgender”, in case you didn’t know. Then, on p.319, the notion of isomorphism — ubiquitous in mathematics, and really quite simple — is explained by considering two categories induced by a partial ordering, one consisting of concepts involving rich, white and male, and another involving rich, white and cisgender. It is then shown that rich white cisgender women have the same status in the second category as rich white males do in the first. Just below this, there is a helpful diagram showing that each of these categories is a 3-fold product of the category consisting of two objects and one arrow, namely (people with structural power —> people who are structurally oppressed). Really, there are many simpler ways to get these ideas across.


Yeah, I try to avoid the culture war stuff, but this seems pretty unnecessary for a math book.


She used the same example in some lecture on youtube, too.


A good starting point for questions like this is the SEP article: https://plato.stanford.edu/entries/philosophy-mathematics/


Here's the irony.

In science and therefore reality as we know it nothing can be proven to be true. Things can only be falsified in science.

This occurs because if we make 10 million observations that verify a hypothesis we still haven't proven anything to be true because there always exists the possibility that a subsequent observation falsifies the entire hypothesis.

While we can't prove anything in reality is real, we can prove things to be true in mathematics. Proof is the domain of math and logic not science.

Therein lies the irony. We don't know if anything is real in science and therefore reality as we know it but we can verifiably know whether things are real in the universe of math.

It really puts the question in perspective: what does it even mean to be real?


But this just isn't the case, for your claim to be true we would have to be constrained to only do _inductive_ reasoning.

When we do _deductive_ reasoning, and we feel as though the reasoning has been done correctly, then we know something _assuredly_. Mathematics and empirical science are very different in this respect, that much of mathematics is purely deductive.


Inductive reasoning is by nature probabilistic so it can't prove anything.

See: https://en.m.wikipedia.org/wiki/Inductive_reasoning

The first paragraph mentions this. As science is statistical in nature, inductive reasoning, being probabilistic, suffers from the same problem.


Your claim was that we can only know about reality through inductive reasoning, and this is the part I'm disputing, not that science itself is not inductive.

Mathematical reasoning is deductive, and mathematics is part of reality.


> Your claim was that we can only know about reality through inductive reasoning, and this is the part I'm disputing, not that science itself is not inductive.

My claim still stands resolute in dispute of your claim. You can't know anything through probability/inductive reasoning. At best you can say something is "probably" true, but even this is a limit that's impossible to reach. If you observe something 1 billion times and that observation confirms your hypothesis, you never know if the next 10 trillion observations can deny your hypothesis completely. So even saying something is "probably true" can't even be done. Nothing can be truly known in science and therefore reality as we know it.

>Mathematical reasoning is deductive, and mathematics is part of reality.

It's only part of reality in the same way a fantasy novel is part of reality. It just so happens that it matches our observations. But observations are not always the same or consistent; how will you verify that logic and math consistently hold true in reality? You would recursively use stats and science to determine their veracity, which suffers from the same issue as I stated above... you can't prove anything to be true with science.

Math and logic are axioms of reality. We simply assume them to be true and there's no way to prove that they're true. But here's the kicker: EVEN if we assume math and logic to be true, we STILL can't prove anything to be true in reality. This is because of exactly what I'm talking about above... we can never know the true sample size of all possible observations. Any amount of observations or samples we have is finite, but the universe is unknown and samples are potentially infinite. Therefore any sample could be 1/1000000000 of what's out there.

I'm not making any of this up. This is real stuff: https://en.wikipedia.org/wiki/Falsifiability

Quotation from the article:

"One of the questions in the scientific method is: how does one move from observations to scientific laws? This is the problem of induction. Suppose we want to put the hypothesis that all swans are white to the test. We come across a white swan. We cannot validly argue (or induce) from "here is a white swan" to "all swans are white"; doing so would require a logical fallacy such as, for example, affirming the consequent.[4]"


This is a more direct link to what I'm talking about:

https://en.wikipedia.org/wiki/Problem_of_induction


> what does it even mean to be real?

exactly the question which has to be answered before answering "is X real?"


I can already tell that a vast majority of the replies here are going to respond to the title and not the post


> The question “is math real?” is answered in the epilog of this book. Cheng tells us that math is real because it is an idea and ideas are real.

Solved!


If ideas are real, do you have any idea about entities that are not real? I mean, what is her exact definition of "unreal" things? Something that cannot be formally described?


I'll let Parmenides take that one

> It needs must be that what can be spoken and thought is; for it is possible for it to be, and it is not possible for what is nothing to be.


This is not an alternative to people reading the article


Agreed. I just thought this little paragraph at the end of the article, which does directly address the title of the article, was amusing.


I've encountered this argument before, but also applied to other subject such as eg Latin:

> 3) math has an indirect usefulness which is a way of thinking that is transferable to a myriad of disciplines and solutions to problems in everyday life. And it is this third reason that makes math relevant for most people.

Are there actually any good studies that show this in a counterfactual setting? Like, do we actually know that spending time teaching maths (or, less plausibly, Latin) helps students acquire these abstract skills more than other subjects? Is this "transferability" of meta-skills a testable outcome?


I’m skeptical of (3) as well, and yet I encourage students to study math through calculus, and to study Latin.

My feeling is that the utility of higher math, Latin, history and literature comes every minute of every day, as you experience life as someone who has familiarity with those things, and your life will be richer and fuller.

This is decidedly not testable. And yet I still believe it.


There are plenty of ways to use this stuff in everyday life.

I took logic in college, and although the "logic as english statements" stuff was sort of confusing, the symbolic stuff like A&B = !A|!B stuff has helped with computers all my life.

It was only much later in life that I ran back into logic as english statements in a way that made sense as practical.

I read a book where they took apart the statement:

  If you loved me, you would take me to the movies.
Most people in relationships will respond to this with:

  Well, I just took you to the movies last week!  Why do you want to go again tonight, we had other plans!  etc...
But the book explained that with "If X, then Y" it was futile to address Y. You must address X:

  Wait, do you think I don't love you?  Of COURSE I love you!
...just hard to unpack this in the middle of an emotional situation unless you've studied the logic

:)


I would highly advise against treating informal natural language statements as logical statements. It is impossible to know what someone who says "If you loved me, you would take me to the movies" actually means without far more context. The fact that it has the form of an X=>Y statement is at best a hint, but it definitely shouldn't be taken as literally as that.


I think you meant

  !(A&B)
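For what it's worth, the corrected identity can be machine-checked over all four truth assignments; a quick Python sketch:

```python
from itertools import product

# De Morgan's law: not (A and B)  ==  (not A) or (not B)
for a, b in product([False, True], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))

print("De Morgan's law holds for all four truth assignments")
```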
Anyway… I made a video called “Why Think Mathematically?” as the first in a series called “Thinking Mathematically” on a YouTube channel years ago:

https://www.youtube.com/@thinkingmathematically

It might help to answer the question


An intro to formal logic class was the first thing that made me think of becoming a professional programmer. The class had a lab portion that used a program called Tarski's World that I remember as being a lot of fun.


The correct answer is

  Since when have you thought that I didn't love you?
Read The Gentle Art of Verbal Self-Defense for why.


I actually think point (3) is a good one, but am still extremely skeptical of teaching students more maths.

Basically, I believe that there's a heavy correlation between being good at maths and being good at solving every day (and not so every day) problems.

But I don't believe that teaching math at school for even more hours will make much of a difference. Most schools are terrible at teaching anything.


I feel I became much better at writing non-fiction after taking math at university (mainly calculus, linear algebra and statistics).

Going through proofs and proving things on your own really transferred to being able to better present arguments. The diversity of the math I learned has helped to reflect on things from different perspectives.

In sum this helped with everything from thinking more and better about the core issue at hand, writing argument chains in the correct order, cutting down on irrelevant stuff and more.

I've used this to significantly help the grades of both my SO and a family member, who both took non-math topics, by improving their hand-ins. I didn't know their field so was strictly improving the structure and presentation, and asking for clarifications where I felt the arguments didn't add up, and have them write down the answer.

I feel it still helps me a lot writing emails at work and similar.


That (3) is a very interesting issue. My suspicion is, it basically says "knowing logic is good for your well-being".

The skill of logical inference — even at the level of very basic syllogisms — is both very much underappreciated and underdeveloped in the American college population, at least from my personal experience. As good citizens, we all collectively should grab a couple of Martin Gardner's or Lewis Carroll's books off the shelf and give them a good read. I predict it will do much good... and if I'm wrong, it certainly won't do any harm!


Here is a study: https://journals.plos.org/plosone/article?id=10.1371/journal...

And, since LLMs are so bad at math currently, we may find that, by improving their math ability with gobs of synthetic data, we get improvements in general reasoning.


Thanks, this is a nice reference! Interesting that the abstract and introduction mention the lack of existing evidence - seems like an under-studied question?

One aspect that's quite easy to criticise about this study is that it uses existing groups of students with different levels of maths training. This means that there is possibly self-selection etc, and one may argue there might also be a causal effect in the opposite direction (e.g. folks that are good at reasoning like to do maths.)


> Are there actually any good studies that show this in a counterfactual setting? Like, do we actually know that spending time teaching maths (or, less plausibly, Latin) helps students acquire these abstract skills more than other subjects? Is this "transferability" of meta-skills a testable outcome?

We know that if you try to train someone in math (or in Latin), and they do well, then they will also do well at other things in the rest of their life. Some people would like to give the credit for that good performance to the Latin training.


Brains model patterns in signals, math could be described as the brain's language for expressing higher order manipulations on all those models with communicable symbols. So it's kind of both real and invented, because brains invent, and brains are also just real objects and any models they form came from somewhere.


The statement that brains model patterns rests on the supposition that patterns to be modeled exist.


The notion of existence is doing a lot of the leg work. An epistemological solipsist can say things "exist" when they have a model of their subjective experience in which things can exist. It gets real funky because your interpretation of qualia results in a model that includes the brain, and thus a model of how you are interpreting things as a model in your model.


If there isn't then I guess we're all just talking to ourselves anyways so it doesn't really matter.


Am I the only person who hates it when people say “anyways” when they mean “anyway”?


Perhaps it's acceptable in moderation, like "ain't". It's needed for a rhyme in a well-known song by Billy Joel, for what it's worth. On the other hand, there is a different song by Billy Joel in which "anyway" is used for a rhyme.

No, I'm not a Billy Joel expert or fanatic. I just happened to notice, all right?


That stray character doesn't change the meaning of the sentence, so yeah, stylistic complaints are subjective and beside the point.

We're not in school.


This introduction is a very pleasant read and adequately dense and easy at the same time.


... the typical working mathematician is a Platonist on weekdays and a formalist on Sundays

Philip Davis, Reuben Hersh, "The Mathematical Experience"


Meanwhile the actual working mathematicians, better known as engineers, still use geometry on weekdays.


Engineers are their own special species. They aren't just 'working mathematicians'.


Not if you use sqrt(-1)


I think I heard that all numbers in the real world are irrational. So that means most of math is not real, except of course the irrationals like pi :)


I've come to the conclusion that all numbers in the real world are integers and real numbers are a human construct necessary every time we select a unit that is too large. There is evidence (e.g. quantization) that at the most fundamental level, reality is discrete. Math is layer upon layer of abstract toolsets to operate on integers. In that sense for me, it is very real, but invented.


The fact that reality can be described by calculus would suggest that reality is continuous and not discrete. But I haven't seen any real evidence of either claim.


The more interesting question is are the numbers in the universe the subset of real numbers that are computable.


Why is that interesting?


If they are non-computable, it would indicate that the universe cannot be modeled by mathematics.


Modellability or Computability?

There are incomputable models.


Pi is no more of a human construct than any of the integers. Pi is inherent in Nature as are the integers, but more mysterious. If you accept the existence of the integers then infinity exists and therefore pi exists too; it does not need to be constructed.


There are no circular objects in fundamental physics.

Pi shows up in many physics equations, but that’s entirely due to our choice of units.


There may be no circular objects, but that's mostly because the universe isn't two-dimensional. There certainly appear to be spherical objects (to varying degrees of approximation), and pi pops out quite naturally when you ask questions about the geometry of such fundamental objects as a black hole.


That seems like a very weird statement to make. Many numbers in the real world are integral (two objects, one electron, etc.). Thanks to quantum physics, most measurements are integral too.


Explained better above. There are more irrational numbers, almost guaranteeing that any number you come across in nature is irrational. Interesting thought, since I think one of the things that makes numbers irrational is that there is no formula for them. So it's kind of a cheeky way to say no math formula can ever describe the real world, since all the numbers are irrational.


> There are more irrational numbers, almost guaranteeing that any number you come across in nature is irrational

There is nothing that says that the distribution between rational and irrational numbers that show up in nature is the same as the distribution in our construction of the real numbers.


You’re missing the point: physics has shown the world to be discretized. Almost every number you go out and measure IS ACTUALLY AN INTEGER. In that sense the real number system doesn’t exist. It’s super useful, yes, but it’s an abstraction away from reality.


I thought that "infinities of infinities" ala Cantor exist though. I agree with you that reality being quantized means everything in physical reality can be normalized to integers.


> physics has shown the world to be discretized.

It's a bit more complicated than that.


I am a physicist, and I believe what I said is accurate. Although yes I am (1) oversimplifying, and (2) assuming that some form of quantum gravity is correct.


To expand a bit with some examples: the energy a photon in a standing wave in a specific cavity can have is quantised. But a photon out in space can have any old energy it wants to. (Of course, a given energy level will correspond to a specific wavelength etc.)

Similarly, an electron in a single isolated atom has specific quantised energy levels. But if you look at the electrons in a hunk of copper, they are essentially free to absorb and emit energy in almost arbitrary amounts.

An even stronger example is time: as far as I am aware, time is not quantised in any of our accepted theories.

There's some reasonable speculation that ultimately everything is quantised at the Planck scale, including time. But that's just a very reasonable hunch, not something that 'physics has shown'. (And you already point out that trying to marry quantum mechanics with general relativity is a hot mess.)

I agree that 'quantisation all the way down' is the way to bet. But that's just speculation.

(But I strongly disagree with your claim that physics is built on integers. Yes, it might be discretised, but there are plenty of discrete structures that are not integers. Look at a Rubik's cube for a simple example. On top of that: almost any real-world measurement is better described by a probability distribution than by a single number, be that an integer or otherwise.)


But pi is ONE "real" irrational number. The golden ratio is ONE more. That's TWO. QED the rational numbers 1 and 2 are "real"?


What's "real" about π? It's a tool. Circles are useful, but there aren't any.


It is better stated that: since there are "so many more" irrational numbers than rational ones, if you were to pick a real number "at random," the probability that it would be rational is zero. The "many more" and "random" ideas are made precise in measure theory (and elsewhere).
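Sketching the measure-theory step (a standard epsilon-cover argument): enumerate the rationals in [0,1] as q_1, q_2, ..., and cover the n-th one with an open interval of length ε/2^n. Then

```latex
\mu\big(\mathbb{Q}\cap[0,1]\big) \;\le\; \sum_{n=1}^{\infty}\frac{\varepsilon}{2^{n}} \;=\; \varepsilon .
```

Since ε > 0 was arbitrary, the rationals have measure zero, so a uniform pick from [0,1] lands on a rational with probability zero.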


If you actually look at how real numbers are constructed, they are quite bizarre. The simple concept of the number line becomes a quite complicated set of sets that follow certain conditions.

(Sqrt(2) as a real number is actually encoded as a Dedekind cut: the set of all rationals less than sqrt(2) on the number line.)


> The "many more" and "random" ideas are made precise in measure theory (and elsewhere).

Well... one of the consequences of that precision is the theorem that there is no such concept as choosing a real number "at random".


There is an infinite quantity of both rational and irrational numbers, so isn't it therefore impossible for there to be more of one than of the other? Or is the reasoning that, because there is an infinite quantity of irrational numbers between any two given rational numbers, there are therefore many more irrational numbers than rational numbers? I would have thought that there being an infinite quantity of both makes it impossible to compare the quantities.


There is a mapping from counting numbers (1, 2, 3, ...) to rationals and back again that shows these quantities are the same; for every element in set A there's an element in set B and vice versa.

This is not the case for irrationals... therefore it is concluded that the infinity of irrationals is a larger infinity than the infinity of rationals.

See:

https://en.wikipedia.org/wiki/Cantor%27s_diagonal_argument

https://mathworld.wolfram.com/CantorDiagonalMethod.html
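One such mapping, for anyone curious, is the Calkin-Wilf sequence: starting from 1, Newman's successor formula x → 1/(2⌊x⌋ - x + 1) visits every positive rational exactly once, giving a genuine bijection with the counting numbers. A sketch in Python:

```python
from fractions import Fraction
from math import floor

def calkin_wilf(n):
    """Yield the first n positive rationals, each exactly once (Calkin-Wilf order)."""
    x = Fraction(1, 1)
    for _ in range(n):
        yield x
        x = 1 / (2 * floor(x) - x + 1)  # Newman's successor formula

first = list(calkin_wilf(15))
print([str(f) for f in first])  # ['1', '1/2', '2', '1/3', '3/2', '2/3', '3', ...]
assert len(set(first)) == len(first)  # no repeats: each rational appears once
```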


> There is a mapping from counting numbers (1, 2, 3, ...) to rationals and back again that shows these quantities are the same; for every element in set A there's an element in set B and vice versa.

That is true, but it's never taught. I don't even know what that mapping is, though I've seen it mentioned once in a popular treatment.

What's taught is always the mapping from naturals to rationals that overcounts the rationals, hitting them all an infinite number of times. (Because it's very easy to show a bijection between the naturals and the ordered pairs, but while (2,3) and (4,6) are distinct ordered pairs, they do not represent distinct rationals.)

But then all you've shown is that the naturals are at least as numerous as the rationals. To show that the naturals and the rationals have the same cardinality, you either rely on the idea that the naturals are the smallest infinite set, or you appeal to the fact that the naturals are a subset of the rationals.


There are also infinitely many rationals between any two distinct irrationals.

My favorite way of visualizing the difference uses the fact that every rational has a repeating decimal after some nth decimal place, and no irrational has a repeating decimal. Say you want to construct a number x, where 0 < x < 1, by drawing integers 0 through 9 randomly from a hat. Each integer drawn from the hat is placed at the end of the decimal; for example, if you draw 1,3,7,4 then the decimal becomes 0.1374. You then draw, say, 1, and it becomes 0.13741, and so on. If you could draw infinitely many times from the hat, what is the probability that you'll construct a number with a repeating sequence? That would give a rational number.
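The "every rational eventually repeats" fact falls straight out of long division: there are only q possible remainders when dividing by q, so one must recur, and from that point the digits cycle. A small Python sketch (function name mine):

```python
def decimal_expansion(p, q):
    """Long-divide p/q (with 0 < p < q) into (non-repeating prefix, repetend).

    Only q distinct remainders exist, so either we hit 0 (terminating
    decimal) or some remainder recurs and the digits cycle from there.
    """
    digits, seen = [], {}
    r = p % q
    while r != 0 and r not in seen:
        seen[r] = len(digits)   # remember where this remainder first appeared
        r *= 10
        digits.append(str(r // q))
        r %= q
    if r == 0:
        return "".join(digits), ""           # terminating decimal
    start = seen[r]                          # cycle begins at first occurrence
    return "".join(digits[:start]), "".join(digits[start:])

print(decimal_expansion(1, 6))  # ('1', '6')       i.e. 0.1(6)
print(decimal_expansion(3, 7))  # ('', '428571')   i.e. 0.(428571)
```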


> There is an infinite quantity of both rational and irrational numbers, so isn't it therefore impossible for there to be more of one than of the other?

Mathematicians can even meaningfully compare infinities.

See eg https://www.cantorsparadise.com/this-may-seem-more-irrationa... or https://math.stackexchange.com/questions/474415/intuitive-ex...

You can also look at eg a uniform random variable on the interval between 0 to 1. The probability of hitting a rational number is 0%. The probability of hitting an irrational number is 100%.

> Or is the reasoning that, because there is an infinite quantity of irrational numbers between any two given rational numbers, there are therefore many more irrational numbers than rational numbers?

No, that's not enough. There are also an infinitely many rational numbers between any two given irrational numbers.


> because there is an infinite quantity of irrational numbers between any two given rational numbers

Indeed, you've grasped the core of it. There's no rule you can write for irrational numbers such that "b is the next number after a", because there are infinitely many numbers between a and b that you'd be missing. You can't count them, i.e. you can't map them to integers.

Uncountable Infinities > Countable Infinities


While the thrust of your argument is correct, you're missing an important point. There are an infinite number of rational numbers between any rationals a and b as well, and the rational numbers don't have the concept of the 'next' number either. Yet the rationals are countable.

The argument as to why the irrational numbers are uncountable and the rationals are countable is more involved than what you've made out. But very simply, you can think of it as: you need an infinite string of digits to describe each irrational number, but each rational number can be written as two finite strings of digits (in the form A/B, where A and B are integers). So to write out the irrationals you have an infinite number of strings, where each string is also infinitely long, while with the rationals you have an infinite number of strings, but each string is finite.


> the rational number don't have the concept of the 'next' number either. Yet the rationals are Countable.

That's literally the same thing. What is counting if it isn't being able to say what the next thing is? Do you have a mapping to integers or not? If so, then every n has n+1.

I know it was more complicated, but jaza had the essence of it. Without what they observed the whole thing falls apart. Yeah, it still needs proof, but I'm pretty sure five other comments went there.

> So to write out the irrationals you have an infinite number of strings, where each string is also infinitely long, while with the rationals you have an infinite number of strings, but each string is finite.

You've set the table but forgotten the feast! You're missing the step where you demonstrate that there's a number that isn't in this list. (Hint: think diagonally.)


> What is counting if it isn't being able to say what the next thing is? Do you have a mapping to integers or not? If so, then every n has n+1.

The point I was trying to make is that there is no concept of 'next' inherent to the rationals, nor is there any natural or canonical ordering. The ordering and what comes 'next' is entirely a property of which arbitrary mapping you choose (I'm partial to Gödel numbering). The resultant order that your mapping imposes on the rationals is rarely useful or meaningful.


The rationals are a totally ordered set. There definitely is a natural, canonical ordering to the rationals. It's the same numeric-magnitude metric we use all the time. 1/3 is less than 2/3.

That ordering doesn't have the property that all sets of rationals contain a least element, or that any rational has a successor rational. (That would make them "well ordered".) But it's a natural ordering.

>> The argument as to why the irrational numbers are uncountable and the rationals are countable is more involved than what you've made out. But very simply you can think of it as you need an infinite string of digits to describe each irrational number, but each rational number can be written as two finite strings of digits (in the form A/B, where A and B are integers). So to write our the irrationals you have an infinite number of strings, where each string is also infinitely long, while with the rationals you have an infinite number of strings, but each string is finite.

This argument doesn't actually work. If there were only a countable number of irrational numbers, you could specify them all fully by doing no more than a countable amount of work, even stipulating that describing a single irrational number requires listing a countably infinite number of digits.


>> because there is an infinite quantity of irrational numbers between any two given rational numbers

> Indeed, you've grasped the core of it.

What? That's not the core of anything. It tells you that the irrationals are dense in the real number line. You know what other set is dense in the real line? The rationals.


I maintain that the rest still hinges on that observation. See other reply in thread.


How is that possible? We've made the same observation about the irrationals and the rationals. We want to make a followup observation that is true of the irrationals but not the rationals. Our first observation obviously can't be related.


Counter-intuitive as it may seem, math does have the notion of larger infinities.


This point of view is conflating two meanings of the world "real".

The "real" in "real numbers" ultimately has not much to do with our everyday notion of real. I'd rather treat it as an arbitrary name. You could as well call them "asdfasdf numbers" and they'd remain the same.


Last time I counted my kids, their number was definitely rational.


2 is also a real number.


Really not the best title for this book.


> Oftentimes, math is taught as a set of rules and processes divined from an authority in which learners must memorize the rules and strictly follow the processes in order to satisfactorily find the one and only correct answer

I was surprised later in life when the majority of people I talked to felt Math was this way (and a good thing). To them, math is a set of rules you learn to follow to the T and use them on other problems.

For me, everything a Math teacher conveyed was more of a recommendation, a suggested tool that I could incorporate into my tool box. The methods they employed to solve problems were a matter of preference to me, rather than rigid rules.

This way of thinking has always had some pros and cons. I never solved a problem the way a 'grader' was expecting. Some teachers loved the creativity and efficiency; other TAs just marked it as zero. It made applying what I learned to other things easier, but on exams I would always be stressed for time because I would spend it deciding which way I was going to answer the problem.

That being said, I am very good at math: I scored well in HS/college, got a 165/170 Math GRE score, and work in a field where math is important (Data Scientist).


If math is not real, it should be complex then.


Everything is real by definition


Is value real? Does it matter?


It is if you use pascal :D


Do you mean Pascal the person? :P


I was making a terrible joke about the old “real” type for floating point in pascal. Or possibly PASCAL :D


Unfortunately, it takes extraordinary effort by math teachers to make mathematics interesting to students. Even if some teachers are willing to put in the effort to teach this wonderful subject in a way that makes sense to students, it's not possible to turn all mathematics teachers into teachers who understand how mathematics MUST be taught.

I hope all these books, YouTube videos, and websites can help in making students curious about mathematics and explore further on their own. It is hopeless to even expect the school systems in the US, and in a lot of other countries, to change the way they work - teachers are not given enough time to spend on the topics in mathematics. It takes a lot of extra effort by the students.


The actual reality is most students don’t have the cognitive horsepower for much past long division. I’ve seen adults struggle with basic algebra: “but what is X?” And so on. And that’s fine. Billions of people are going to live satisfying and fulfilling lives without ever knowing the chain rule.

Edit: I thought it was pretty advertent.


I think you inadvertently highlighted what I suspect is the real issue; a ruthless focus on the path to calculus to the detriment of other useful maths.


Other maths are more difficult to grasp than calculus. Calculus at least has physical applications and can be visualized.


I think I disagree! Calculus is great and all (truly), but logic, boolean algebra, probability, statistics, even linear algebra all have really interesting and useful insights and are very approachable. I'm sure there are other maths I've never encountered that can also be enjoyed with minimal pre-requisite knowledge, though that might be more in the 'abstract and difficult' territory.

The bits you need from calculus to approach those other subjects are also very minimal and approachable relative to the content of a calculus prerequisite (where applicable).


Yes. Next question.


some of it is integer


You might as well ask "Are ideas real?" in the broader philosophic sense. Yes, they (with math as a subset) are real existents within consciousness. The question of mathematical ideas' connection to reality is epistemological. "Real reality" pertains to metaphysics - the externally perceivable stuff of reality that exists independent of our minds (as in, eternally prior to, and subsequent to, the existence of any humans or other beings with conceptual consciousness.) Math is an epistemological tool to abstract the causality of existence into a simplified structure amenable to consideration and manipulation by human consciousness.

A quadratic equation can be used to approximate the coordinates of the trajectory of a thrown object in a gravity field. I think the really interesting question is why that particular equation reflects that trajectory. Arithmetic operators - multiplication and addition in that case - in a particular order, are approximating the causal operations of existence that are actually at work. I think of this as the philosophy of mathematics and much more should be done to investigate it.
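To make that concrete (a standard sketch, ignoring air resistance and taking the launch point as the origin): eliminating time t from the equations of motion is exactly what produces the quadratic,

```latex
x(t) = v_x t, \qquad y(t) = v_y t - \tfrac{1}{2} g t^{2}
\quad\Longrightarrow\quad
y(x) = \frac{v_y}{v_x}\,x - \frac{g}{2 v_x^{2}}\,x^{2} .
```

The multiplication and addition in the final form are bookkeeping for the two causal influences at work: uniform horizontal motion and constant vertical acceleration.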


> Math is an epistemological tool to abstract the causality of existence into a simplified structure

The mathematics to accurately predict or relay reality is still complex enough that it’s often beyond us. You’re right in that it’s a tool to understand, but if we’re using simplified math for simplified reality, is it really epistemological?

As you suggest, math isn’t outside the boundary of philosophic investigation. It never was in the past, and I don’t think better approximations change that calculation.


You made me remember this article you might have read before: https://www.wired.com/story/our-machines-now-have-knowledge-...


Thanks, I’ll give it a read


> the externally perceivable stuff of reality that exists independent of our minds

How can any perception be independent of mind consciousness? And how can mathematics reflect anything other than that consciousness?


It's perceivable as long as a consciousness exists with sense organs. Existence doesn't depend on that perception. Existence existed long before any minds existed and will exist long after. ("Long" is a euphemism here for eternally.)


> real existents within consciousness

"Within consciousness" doesn't belong here. Even if all sentient life in the universe died out tomorrow, the idea of triangles (and "triangularity") would still exist.


well… on the other hand: have triangles ever been experienced without a consciousness?

this "the idea of triangles would still exist" statement looks very reasonable, but can we ever prove it? you cannot remove ALL consciousness to verify. at least a bit of consciousness must be in the system to do the experiment.

yes, it very much coincides with all of our understanding of reality that abstract ideas like triangularity are independent from any consciousness. at least any consciousness known or conceivable to us so far.

so my extension to this statement would be - admittedly supernaturally-sounding - that: Even if all sentient life in the universe died out tomorrow, the idea of triangles (and "triangularity") would still exist, possibly because there is a consciousness above/beyond the conceivable universe to sustain their existence.


We're very quickly coming to the realization that information is a physical quantity, like mass or energy. (And can be measured and follows some sort of physical laws.)

If so, then consciousness doesn't need to exist for information structures to exist.


My view is that Math is real but the way it's represented is arbitrary. I'm sure that math could have looked very different than it does today and it would have worked better for a different kind of people. I find that some mathematical abstractions and ways of describing certain ideas are over-complicated and almost seem to be intended to confuse and obfuscate rather than to clearly explain. I feel similarly about certain fields of science; the ideas are really very simple but whenever you hear someone explaining them, the explanation seems to be full of gaps, rely on the audience to make certain assumptions, use custom terminology or terminology repurposed from different fields which can create confusion.

I often felt that being good at math requires having a natural talent for making the right assumptions to fill in any gaps or ambiguity in explanations. I feel like the language of math does not do justice to the complex, intricate ideas it tries to convey. On the other hand, being good at programming requires the opposite; it's about being able to resist making assumptions.


That's really interesting, can you give some examples?


I feel like almost all the descriptions of advanced (university level) math concepts are confusing when you first hear them. For example, the definition of a Taylor series according to Wikipedia:

"In mathematics, the Taylor series or Taylor expansion of a function is an infinite sum of terms that are expressed in terms of the function's derivatives at a single point. For most common functions, the function and the sum of its Taylor series are equal near this point."

This kind of description really throws me off. Especially "terms that are expressed in terms of the function's derivatives at a single point." First of all, superficially, the different meanings of the word 'term' throw me off, and even when I get over that, it's not clear what is meant by 'in terms of': in what terms? What kind of relationship are we talking about here? I need to keep reading a lot more to fill that gap... In the meantime, I'm in a state of confusion and will need to re-read that sentence later to make sense of it once I have more info.

I'm not sure how to solve that problem to explain it more clearly but it feels like definitions should gradually build up without gaps. Is it even solvable? I would prefer not having read that definition at all TBH as it only serves to confuse me, it has way too many possible interpretations. I prefer to jump straight to the formula.
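For comparison, the formula says the same thing far more compactly. The Taylor series of f at a point a is

```latex
T(x) \;=\; \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\,(x-a)^{n} ,
```

where each "term" is just the n-th derivative of f evaluated at the single point a, divided by n!, times (x-a)^n. For well-behaved (analytic) functions, f(x) = T(x) near a.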


Completely agree.

I think there is a cultural component that has transferred throughout history whereby people need to signal their level of education to others.

You've probably had it happen yourself without realizing.

You hear someone explaining something in a simplistic way, and notice yourself wondering how deeply they understand the topic. Then when you are explaining the topic, you don't want people to question your own knowledge level like you did to the other person, so you use techniques to signal the depth of your knowledge.

This might be fancy words, or skipping over simplistic things.

And I think this just becomes second nature.

You can see it with programming languages. If I told you to rate a Rust dev vs JS dev, you are thinking Rust is harder to learn so they must be smarter.

It can also just be a challenge to imagine how you thought about a concept when you were initially learning it.

English is pretty terrible for explaining a lot of math too. Math is better understood visually, but back in the day you couldn't exactly share an interactive diagram.


That's some really bad wording indeed. But it may vary from one source to another. The more intriguing point of your comment is about assumptions. Can you provide a relevant example?


It's hard for me to provide a specific example because (for me) it applies to many different fields of advanced math. It seems like people who are good at math have some preconceived idea in their heads about what the purpose of a math concept is going to be and that helps them to make sense of new concepts faster. Maybe that's what people refer to as a 'mathematical intuition'?

To me (who is not naturally gifted at math), it often seems like math has no specific direction; it appears to explore almost every direction arbitrarily. I can't usually tell which part is supposed to be interesting or potentially meaningful so I don't know what to focus on or what to look for when I'm learning it.

It's like if someone gives you a confusing and vague instruction or question, it helps if you know what the reason is. Like if someone asks you "what day is it?", it helps if you know the intention behind the question or else you can't know for sure if you should answer with "27th of September" or "Wednesday". You don't fully understand the question without knowing the intention behind it. To me, learning math presents a much more extreme variant of that effect.


Mathematics is, in part, the study of universal traits:

(1) Ratios between things in the world.

(2) Logical relations between things in the world.

(3) Absolute distinctions between things in the world, in a nominative sense, which is to say in the sense that numbers can be assigned to things or elements of things.

Mathematics that relates to one of the three use-cases above is absolutely real, hence its unreasonable effectiveness in the natural sciences. (See Wigner: https://www.maths.ed.ac.uk/~v1ranick/papers/wigner.pdf )

When mathematics does not relate to one of the three -- e.g., in Cantor's theories of "countable" and "uncountable" infinite sets -- it is totally unreal. A construct, or a game played with logic, that has no prior or intrinsic relevance to the material world.
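For what it's worth, the countable/uncountable distinction is at least mechanical: Cantor's diagonal argument is effectively an algorithm. Here's a finite toy sketch (the function name `diagonalize` is mine, and fixed-length 0/1 rows stand in for infinite binary sequences):

```python
def diagonalize(rows):
    """Given a list of equal-length 0/1 rows (a finite stand-in for an
    enumeration of infinite binary sequences), build a row that differs
    from the i-th row at position i -- so it cannot equal any row in
    the list."""
    return [1 - rows[i][i] for i in range(len(rows))]

rows = [
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 0, 0, 1],
]
d = diagonalize(rows)
print(d)          # [1, 0, 1, 0]
print(d in rows)  # False -- the diagonal row escapes the enumeration
```

In the infinite case the same flip at every index shows no enumeration of binary sequences can be complete, which is all "uncountable" means here.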


So when an application is found for a previously pure area of theory (e.g. number theory and RSA), does math become real?

Is realness not an intrinsic property, but just a judgement of how useful something is?

Is being real a real property of something or just a construct?


Math is in language. In a similar way that we can have fiction and non-fiction, with different uses of language, you can have 'pure' and 'applied' uses of math. Using 'real' as the goal kind of misses the point. It's not 'imaginary', if that's the antithesis of 'real' in this whole conceptual discussion. But, like fiction to non-fiction, it's still a public phenomenon that "touches" the world in language as a shared, communal element in a form of life. Math is remarkably useful as a representation of the world and for us solving problems, but you can also give directions to a location to someone with just hand signals and barking like a dog. In a way, math is not special. But the degree of accuracy that we've developed in math is what makes it remarkable.


The study of math in language is called semiotics, and thanks to a bunch of French poseurs it’s become a joke word among the genuinely intellectually curious. Nevertheless, attempts have been made to tie it all together, most notably by the brilliant and mostly forgotten Charles S. Peirce.

Whence the pragmatic maxim:

  Consider what effects, that might conceivably have practical bearings, we conceive the object of our conception to have. Then, our conception of these effects is the whole of our conception of the object.
Pragmatically speaking, math is no more or less real than that.


> Using 'real' as the goal kind of misses the point

It misses the point of the question "is math real"?

Tbf, I am actually sympathetic to the position that "realness" is not really relevant, or a well-defined term, when it comes to what is basically a descriptive language of patterns and relations. But "is it real" is the question we started the thread on.


Okay, here is precisely where the Simulation Hypothesis becomes useful and interesting. I know, I know. But bear with me for a minute...

The universe, as we know it, is simulatable. In other words, it can apparently be reduced to mathematics in precisely those three senses I outlined above. (If it helps, you can imagine electrons and other subatomic particles as bits of information in a coordinate space, which are constrained to operate in accordance with rules that govern logical relations between things.)

What mathematics is real? Anything that would relate to that universe or any of its constituent parts, in _any_ meaningful sense. We don't need to have found a use for it, and surely there's a great deal that is still undiscovered.

What mathematics is unreal? Whatever is, a priori, absolutely unnecessary to the existence of such a universe -- or, even worse, would break a universe described in terms of logic were it somehow made manifest.


Given that logics (of all sorts) are subsets of mathematics, I'm not sure one can convincingly hoist (any) logic above all other mathematics to make arguments.


> When mathematics does not relate to one of the three -- for e.g., in Cantor's theories of "countable" and "uncountable" infinite sets -- it is totally unreal.

Are you considering the set of natural numbers to be “unreal”? Or, if you consider the set of natural numbers to be real, do you not consider power sets to be a logical relation?


I'm not a finitist, but it's a highly defensible philosophical position.

As for this:

> Or, if you consider the set of natural numbers to be real, do you not consider power sets to be a logical relation?

Skolem solved that quite neatly already: Every "uncountable" set has a countable model. Thus the power set is, in fact, no larger than the set of natural numbers, because both can be fully described in a countable manner. What I'm describing is necessarily an abstraction of an abstraction, though, so I'd consider it "unreal" by definition.


I know what “a model of a theory” (such as a model of a theory of sets, even one which asserts that some expression defines an uncountable set) means.

I’m not quite sure what a “model of a set” means. I suppose maybe you mean like, the set in some model, which is described by the given description of a set in the theory?

> Thus the power set is, in fact, no larger than the set of natural numbers, because both can be fully described in a countable manner.

I don’t think this follows.

There are countable models of set theory. And for these models, there is (in the meta-theory) a bijection between the set representing the set of real numbers, and the set (in the meta theory) of natural numbers.

I don’t think this establishes that “in fact” there is a bijection between the real numbers and the natural numbers.

Rather, in any model of any of the usual set theories, there will be no bijection between the set of real numbers and the set of natural numbers. (Of course there will not be, because these theories entail that there is no such bijection.)

If I take a non-standard model of arithmetic and, for some non-standard natural number n, consider the uniform distribution of (non-standard) natural numbers less than n, then for every standard natural number, the probability of getting that natural number from that distribution will be equal. Are we therefore to conclude that there is a uniform probability distribution over all the standard natural numbers? By no means!


Skolem's own proofs are a little bit opaque, so I've constructed a new algorithmic proof of the downward Löwenheim-Skolem Theorem which is easier to comprehend. I have yet to publish it, but it'll probably turn up somewhere else towards the end of the year:

Preliminaries: (1) Let T be a first-order theory with a countable or uncountable infinite model M. (2) Let L be the language of T. (3) T is assumed to be a set of sentences (closed formulas) in L.

Algorithm steps:

1. Initialize Countable Set S: Start with S = Const(L), the set of all constant symbols in L.

2. Extend Language: For each n-ary relation R in L and each n-tuple (c1, ..., cn) of elements in S, if M⊨R(c1, ..., cn), then add a new constant symbol c to S and extend L to L' by adding c.

3. Iterate for All Formulas: For each formula ϕ(x, x1, ..., xn) in L and each n-tuple (c1, ..., cn) of elements in S: if M⊨∃x ϕ(x, c1, ..., cn), then pick some a in M such that M⊨ϕ(a, c1, ..., cn). Add a new constant symbol c to S to represent a, and extend L to L' by adding c.

4. Closure: Repeat steps 2 and 3 until S no longer changes. Since T and L are countable, this process will eventually result in a countable set S.

5. Construct Countable Model N: Take the substructure of M generated by S as N. By construction, N is countable.

6. Elementary Submodel Check: By construction, N is an elementary submodel of M. This is because for any formula ϕ and any n-tuple (c1, ..., cn) from S, M⊨ϕ(c1, ..., cn) if and only if N⊨ϕ(c1, ..., cn).

Conclusion: N is a countable elementary submodel of M, and therefore T has a countable model. End.
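The closure in steps 2 and 3 is mechanical enough to sketch in code. This is a toy of mine, not part of the proof: the names `skolem_hull` and `witness_fns` are invented, and each formula "∃x ϕ(x, c)" is stood in for by a unary witness function over a finite stand-in domain:

```python
def skolem_hull(seed, witness_fns):
    """Toy closure: repeatedly add witnesses for elements already
    collected until nothing new appears. A real Skolem hull closes
    under witnesses for countably many formulas with tuples of
    parameters; here each formula is a unary function returning a
    witness element, or None when no witness exists."""
    S = set(seed)
    changed = True
    while changed:
        changed = False
        for f in witness_fns:
            for c in list(S):
                a = f(c)
                if a is not None and a not in S:
                    S.add(a)
                    changed = True
    return S

# Witness for "x has a successor" in a finite stand-in model {0, ..., 9}:
succ = lambda c: c + 1 if c + 1 < 10 else None
print(sorted(skolem_hull({0}, [succ])))  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

Since each pass only ever adds one witness per (function, element) pair, the hull stays countable even when the ambient model is not, which is the point of the construction.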

You're right about the bijection, but, in the countable model, the set that "represents" the real numbers is countable (from the perspective of the meta-theory). However, within the model itself, this set still satisfies all the axioms that make it "look" uncountable. For example, there's no bijection between this set and the set representing the natural numbers within the model, even though such a bijection exists in the meta-theory.

This is why Skolem's solution can be most succinctly described: "Every set has a countable model if one steps outside the set and constructs a countable model from its elements."

In Kleene's “Introduction to Metamathematics,” he describes the situation as follows: "Either we must maintain that the concepts of an arbitrary subset of a given set, and of a non-enumerable set, are a priori concepts which elude characterization by any finite or enumerably infinite system of elementary axioms; or else (if we stick to what can be explicitly characterized by elementary axioms, as we may well wish to in consequence of the set-theoretic paradoxes) we must accept the set-theoretic concepts, in particular that of non-enumerability, as being relative, so that a set which is non-enumerable in a given axiomatization may become enumerable in another, and no absolute non-enumerability exists."


To be fair, it's not too strange to consider both the natural numbers and the power sets to be, to a greater or lesser extent, abstractions that allow us to reason about all the things we can logically put in one of them.

After all both of them are mostly full of things we will never need, encounter, or even be able to define.


I kind of think the core of the question of whether math is real is whether abstractions are "real".


I was going to post something similar and then saw yours, and it was better. We can add that math is language, and sometimes language describes things that are real, and sometimes not.


> When mathematics does not relate to one of the three -- for e.g., in Cantor's theories of "countable" and "uncountable" infinite sets -- it is totally unreal.

I find that to be quite a bold statement. The question as to whether "mathematics is real" feels firmly rooted in philosophy. First, I would question what is meant by real and unreal. Second, I would question if the answer matters whether the existence of the real and the unreal are not yet discovered, or even imagined.


You're describing physics more than math, in my opinion. But it's hard to say things like set theory aren't real when a lot of calculus works nicely and describes the world very accurately. That toolbox is heavily built on the "unreal" things in math you're describing as not intrinsically relevant. The weird parts of math often speak to some deeper thing you're discounting. Hopefully I'm not reading into your comment too much.


I think the tie between mathematics and the real world is more of a fortunate coincidence than a necessary tie, or even causality. Humans would still study math (as we know it now) even if the real world didn't exist, or if it somehow became not describable by math, because there's an innate desire for logic and solving puzzles.



And yet irrational numbers are real.


“The question “is math real?” is answered in the epilog of this book. Cheng tells us that math is real because it is an idea and ideas are real.”

Alright great lol


As a data point, the author was interviewed on NPR's "The Indicator"[1], and the way this review summarizes it might not capture what her point was:

> CHENG: I'm not trying to answer whether math is real or not. I'm trying to show that considering the question at all leads us to interesting thoughts. And in the end, what I say is that, with all these questions, I don't think there are yes-or-no answers, and we shouldn't claim that there are. What we should do instead is say there is a sense in which - you know, what is the sense in which math is real, and what is the sense in which math isn't real?

> And the thing is, I think a lot of people who say math isn't real are using that to say, oh, so it's irrelevant and stupid. Why should we study it? It's made up. And what I want to say is that just because it is made up doesn't mean that it's irrelevant. And actually, the fact that it's kind of made up makes it really powerful because - well, it makes it really, in a way, more accessible because you don't need a lot of money to get it. All you need is an imagination. And I think that's a really amazing thing about it. And just like fiction isn't real, but fiction can give us insights about the world around us to highlight much more, specifically, things about society. And that's what I think is powerful about abstract math as well - because we're not constrained by reality.

I haven't read the book, but it seems like the title may be a throwaway question meant to pique the reader's interest and say "let's get philosophical about math".

---

[1] https://www.npr.org/transcripts/1193035114


This is an idea that goes back to Pythagoras and Plato.

Think of a triangle. Now draw that triangle on paper. If you look closely enough, you'll see "imperfections" in the triangle you just drew. Now ask yourself: "how do I know this thing I just drew is imperfect? Where did the idea of a perfect triangle come from?"

Plato would say the perfect triangle comes from the realm of "forms". This mystical place which is "more real" than "reality" because everything there is perfect and everything here is just a flawed approximation. Plato also said that this is the place where our souls go when we die, and we engage in "congress" with the forms and then return to earth, reincarnated. When we learn things, we aren't learning something new but actually recalling memories of the forms. This is why everyone knows what a perfect triangle is but no one has ever seen one in the physical world.


Because the Platonic soul (psuche) is different from consciousness. Actually, Philolaus (first Pythagorean to write a book) said that conscious feelings come from the combination of the mathematical soul with the body. So, Pythagorean soul can be viewed as the set of logical or conceptual forms— which constitute a person, yet can also be passed on from person to person. That’s a very different kind of reincarnation…!


What is “real”? Well, it’s just an idea… really


There is, almost certainly, an objective universe that exists outside of human minds, and it also seems likely that all of mathematics can be determined through objective mechanisms.


Shoulda put that on the preface just to save everybody some time!


Redefining terms until the question doesn’t matter is a totally valid way to solve a problem /s

