What Even Is a Number? (drmaciver.com)
93 points by ColinWright 28 days ago | 163 comments



I had a fun mind-play with my kids; I asked them if numbers like "one" and "two" really exist. They said yes, of course.

OK, I said, show me a plain old "one". Nope, that's one fork; nope that's one ball; nope, you get the idea.

You cannot show just "one" unattached to anything else.

The concept of "one" is an idea. It only exists as a "real" concept in your mind because it exists as the same concept in the minds of others, and it's a very useful concept in describing the world around us.

This odd fact becomes ever more apparent when you find out about cultures that have very different approaches to quantitative reasoning, like the Pirahã in Brazil [1]. They don't have number words beyond "one"; words roughly meaning "some" and "many" are used for anything more than that.

[1] https://www.sciencedaily.com/releases/2008/07/080714111940.h...


That's just "exists as physical objects exist".

I.e., you're asking someone to find you an object with spatio-temporal properties standing in for something which, by its nature, has no spatio-temporal properties.

Likewise you could ask, "find me a smile?!!" and then deny every example was a smile because "it was only a face smiling!".

"Existence" is defined relative to a context in which it can be evaluated, and it -- in general -- does not imply any particular properties.

"To exist", in general, is "to be found in some domain".

2 can be found in the domain of numbers


This reminds me of my Italian grammar teacher telling me that nouns can be broken into two categories: ‘concrete’ and ‘abstract’ (for example, ‘chair’ and ‘love’ respectively). Then came a totally mind-bending digression for my eight-year-old mind when she mumbled “well, actually all nouns are abstract, because there’s no such thing as a chair divorced from reality; there’s only instantiations of these abstract concepts we consider close enough to be concrete”. That totally blew my mind. Together with reading “The Computational Beauty of Nature” when it first came out nine years ago, those were the two key experiences that drove me towards choosing to get a degree in mathematics.


  > all nouns are abstract because there’s no such thing as a chair divorced from reality,
  > there’s only instantiations of these abstract concepts we consider close enough to be
  > concrete
Very Plato for an 8-year-old.



Agree. Maybe it is more helpful to think of numbers as adjectives than nouns. Oneness plus twoness equals threeness. The part of speech is sort of irrelevant but it obviates a debate over whether or not instantiation of the unattached idea of a number is important, while still allowing math to occur.


Yes, I could have really tortured my kids and said, you can't even show me a "ball", because it will always be a specific ball and not the archetypal concept of "ball".

I probably should, it'd be fun and they're a bit older than when I pulled the "one" game.


Make sure you say "show me ball" instead of "a ball"...


The "technical" term amongst philosophers is "ontological category":

http://blog.rongarret.info/2015/02/31-flavors-of-ontology.ht...


I believe that in fact the numbers do exist as physical objects exist. Just in their own universe.

When we reason about numbers, we're embedding a representation of a universe of numbers into our universe.

What we call "physical existence" is probably just "math all the way down" too. Just not in such a way that the number two per se can be an entity for us to behold.


> I believe that in fact the numbers do exist as physical objects exist. Just in their own universe.

How is it any different than believing that unicorns exist as physical objects, just in their own universe?

> What we call "physical existence" could...just be "math all the way down".

Or, math is also a number of simplified alternate views of reality of varying degrees of "applicability." (Where "applicability" has to do with predictive power over reality.)


> How is it any different than believing that unicorns exist as physical objects, just in their own universe?

Good question, but I think there might be a difference. Suppose, in a galaxy far, far away, there are aliens. They probably don't have the concept of unicorns. However, they probably have discovered numbers, pi, Pythagoras' theorem, e^(i pi) + 1 = 0, etc.

So, does the fact that something is discovered (rather than invented), and can be discovered many times independently in identical fashion, confer it some sort of elevated ontological status?


> However, they probably have discovered numbers, pi, Pythagoras' theorem, e^(i pi) + 1 = 0, etc.

We can only be confident of that if they're in this universe. Also, this is only a conjecture. It's just something some of us imagine would be true. You can't elevate such an idea to the level of empirical data.

> So, does the fact that something is discovered (rather than invented), and can be discovered many times independently in identical fashion, confer it some sort of elevated ontological status?

Unicorns. Kirin. Narwhals. Rhinos. So you have two independent origins in culture for such a creature, plus two appearances of something similar in nature. Does that confer it some sort of elevated ontological status?


> Unicorns. Kirin. Narwhals. Rhinos. So you have two independent origins in culture for such a creature, plus two appearances of something similar in nature.

It's not at all clear that the cultural constructs are independent of each other or real world creatures. Particularly in the Western case, some of the earliest Greek accounts of the unicorn as well as Marco Polo’s account confirming its existence and difference from the contemporary evolution of the image stem from the distant East (from the European perspective) and clearly are describing a rhinoceros. They are also very similar to accounts of the Kirin.


The Kirin was a giraffe, and it upset Confucius, as it should not have appeared in his chaotic time.


Some accounts of the Kirin, yes, appear quite likely to be giraffes; others are nearly identical to the Western accounts of the monoceros of distant India (one of the origins of the unicorn legend) and are almost certainly accounts of rhinoceros.


I would sadly have to say no. Because in some completely different universe unlike ours, the aliens have no concept of, say, "hemoglobin" because there isn't matter like ours: no atoms, no iron, no carbon. Unlike numbers, hemoglobin and unicorns contain a kind of arbitrary complexity. Yet one is definitely real. Some real entities are far removed from elementary math (even though they may ultimately rest upon it, many levels down). The question is: are unicorns bunk, or are they like hemoglobin?


I'm not claiming that if something exists (haemoglobin), we must have a notion of it and true statements about it in every universe.

Rather: if we have a notion of something and statements about it in every universe, then it must exist. However, come to think of it, that's bonkers.

Still, it's remarkable that mathematics allows us to make reproducible true statements about fictional objects.


> They probably don't have the concept of unicorns.

What would happen to your epistemology if every alien civilization has unicorns but some lack π?


> every alien civilization has unicorns

I knew it, Unicorns are real!!11!

I am not sure I can conceive of aliens that don't have pi (how can they have cylindrical rockets?).

Anyway, maybe we should let the professionals deal with this.

https://plato.stanford.edu/entries/platonism-mathematics/

https://plato.stanford.edu/entries/intuitionism/

https://plato.stanford.edu/entries/mathphil-indis/

https://plato.stanford.edu/entries/nominalism-mathematics/

https://plato.stanford.edu/entries/philosophy-mathematics/


> How is it any different than believing that unicorns exist as physical objects, just in their own universe?

We have very well-developed representations of the number universe that we can embed into our own universe. These things are governed by precise axioms. By contrast, we don't have such well-developed "unicorn universe".

(For starters, what is the definition of "unicorn"? If it's just a horse with a horn, then such a thing is plausible with genetic engineering in our universe.) As we endow the unicorn with additional properties, it becomes less and less clear that there exists a universe where such a thing can exist other than as an imaginary being. If the concept of a unicorn could be axiomatized, then it could constitute a universe.

> Or, math is also a number of simplified alternate views of reality.

Well, that's the math that we do; not the math that we (possibly) are.


> We have very well-developed representations of the number universe that we can embed into our own universe. These things are governed by precise axioms. By contrast, we don't have such well-developed "unicorn universe".

Indeed. The current better known attempts at a "unicorn universe" have problems with power scaling and the square-cube law. https://www.reddit.com/r/mylittlepony/comments/mrs81/accordi...

Do you have a good definition of what constitutes a "well-developed" universe?


>Do you have a good definition of what constitutes a "well-developed" universe?

An easy definition would be one where everyone agrees on the exact mechanics and sequence of states. The Fibonacci sequence is well-developed: it has an exact definition that every competent simulator can simulate the same way. If I say my favorite number is the 1000th item in the sequence (starting with 0, 1, 1 as items #1, #2, #3), then everyone can calculate the 1000th item the same way.
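The Fibonacci example can be made concrete; a minimal Python sketch (the indexing, with items #1, #2, #3 being 0, 1, 1, follows the comment above):

```python
# The sequence has one exact rule, so every competent simulator
# computes the same values. Items #1, #2, #3 are 0, 1, 1.
def fib(n):
    a, b = 0, 1              # items #1 and #2
    for _ in range(n - 1):
        a, b = b, a + b      # the single unambiguous rule
    return a

assert [fib(i) for i in range(1, 8)] == [0, 1, 1, 2, 3, 5, 8]

# Anyone who runs the same rule gets the identical 1000th item.
favorite = fib(1000)
```

Python's arbitrary-precision integers mean `fib(1000)` is exact, not an approximation, so "everyone calculates it the same way" holds literally.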

If we all try to talk about what happens in the "unicorn universe", everyone has their own different ideas about that. There are no unambiguous rules for how the unicorn universe is calculated. Everyone has a different "unicorn universe" in their heads, and each of those universes operates on its owner's whims rather than on any rules we can describe to each other.


> If we all try to talk about what happens in the "unicorn universe", everyone has their own different ideas about that.

Homo sapiens has this issue with this universe. Even if you subset that group to just scientists, there are still differences of opinion.


Reality itself is well-defined in that it acts consistently with itself. We as humans don't have a well-defined embedding of this universe into itself. We would need to know the complete underlying rules of physics for that.


That other "universe" is just patterns embedded in our brains. That's all you need.

You could likewise argue that sounds exist in a different universe, but are only "embedded" into the air using pressure waves, but why would you ever think you needed a different universe of sounds when you've already got pressure waves in air? I mean... what's "not enough" about that for you?

> What we call "physical existence" is probably just "math all the way down" too.

This is a needless diversion. Mathematics is the study of structure. Physical reality has structure, and very well could "be structure" in some equivalent sense, but to say it is math is just abject poetic nonsense. This change of language would literally make no significant difference to anybody about anything. You may as well be declaring the world to be flat and asking everybody else to go and change all our language to make it true.


> abject poetic nonsense.

No it isn't! Either it is math all the way down, or else it isn't.

If it isn't, then what is at the bottom?

So this is actually an existential question.

Suppose we reach a final description of reality that is entirely accurate (no longer an approximation that fails to account for some structures and processes).

Won't that description consist of nothing but math?

Then, if it is complete, how does that description account for the fact that reality isn't math? If there is a difference between reality and math, can't that be described with math, so then it goes away?

If you want math to just be a description of a structure that itself isn't math, then the structure has to have attributes that are not covered by the math. The math is then using a substitute structure with fewer properties as a proxy for that structure. Once all the attributes are rolled into the math, there ceases to be a difference between that target structure and the math; it is that mathematical object. To prove otherwise requires one to name the difference. If two entities are completely described and found to be the same, they are just one entity.

> You may as well be declaring the world to be flat.

It is testable that the world isn't flat.


My guess is that eventually, if we manage to ever reach that point of total knowledge, there are certain properties of the universe that exist for no reason; they just are. Whether that's at the level of today's fundamental particles/qft or much deeper is anyone's guess.

Example: space-time exists in n(>=4) dimensions, there are several quantum fields that exist in that space-time and evolve/interact according to the equations of qft, gravity has some mathematical description (better than the ones we have now). And that's it, there's nothing else for us to know. Of course, we can never know for certain whether we've bottomed-out our description of reality, so physicists will always have some hope.

Now, if the universe is described entirely by mathematics, bootstrapped by a few base cases (the existence of fields/particles/space-time/strings/etc), I'd be comfortable with that. I'm not sure what a truly satisfying understanding of reality would look like beyond that.


> there are certain properties of the universe that exist for no reason; they just are.

But we have that in math and logic: axioms!

> Of course, we can never know for certain whether we've bottomed-out our description of reality

That we do not know is a fact; but is it a fact that we can never know?

What if one fine day we do know?


> But we have that in math and logic: axioms!

Yes and no. Axioms are chosen such that the math that is based on them makes sense to us (because the math reflects what we'd expect in the real world) and isn't self-contradictory.

So they do exist for a reason.

> That we do not know is a fact; but is it a fact that we can never know?

What if our universe actually has randomness that appears to come from "outside the system", i.e. there is really no way to predict the outcome of certain things from inside the system.

Picture yourself sitting in a box playing Mikado (pick-up sticks). In theory you should be able to perfectly predict the outcome of every move you make - except there's a stranger shaking the box you're sitting in at truly random intervals.

For our reality this would mean that there's no way for us to develop an understanding of certain things and predict their behavior.

So essentially the end of the line for all sciences except theology and philosophy.

This possibility makes me somewhat uncomfortable.

On the other hand maybe everything is perfectly deterministic within our universe.


>Suppose we reach a final description of reality that is entirely accurate (no longer an approximation that fails to account for some structures and processes).

> Won't that description consist of nothing but math?

Math is just words. Any description of anything can be reduced to math if you assume the proper math will be developed to describe it. But the same could be said of any subject. You could say the universe is Mandarin Chinese if you find a perfect description in that language. You could say the universe is binary code if you found a perfect description of it in binary. You could say the universe is tree bark if you found a perfect description of it in tree bark.

You're saying nothing useful. You're just confusing the map for the territory. (And arguably, you will never find a complete description of the entire universe that fits in a network of human brains, so the universe can't be math for that reason alone.)


> You're just confusing the map for the territory.

I am not at all! We do math with symbols and diagrams, but those symbols and diagrams are about something: they are maps, and there is a territory that they are about. When I say "math", I include the concepts, not just the symbolic, graphical and other representations.

So what I'm saying is that if/when mathematics provides a complete map to that territory which is "the world", that territory must then be made of exactly the same stuff as other described-by-math territory-entities like "ellipse", "pi", "finite field", "Hilbert space", ... the map of mathematics doesn't describe any other kind of territory.

I'm not saying that the symbols and pictures used to carry out math (the "maps") are the world.

> you will never find a complete description of the entire universe that fits in a network of human brains

The description of the state of it as it is before us is monstrously large, but the initial conditions and rules might not be. E.g. the Mandelbrot set is just iterating on z^2 + c for various c. To describe isn't necessarily to evaluate.
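That one-line rule can be turned into an (approximate) membership test; a minimal Python sketch with illustrative cutoffs (100 iterations, escape radius 2), not a definitive implementation:

```python
# "To describe isn't necessarily to evaluate": the Mandelbrot set's
# rule is one line -- iterate z := z^2 + c -- even though the
# resulting object is endlessly detailed.
def in_mandelbrot(c, max_iter=100):
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False   # provably escapes to infinity
    return True            # hasn't escaped yet (only an approximation)

assert in_mandelbrot(0)        # 0 -> 0 -> 0 -> ... stays put forever
assert not in_mandelbrot(1)    # 0, 1, 2, 5, 26, ... escapes quickly
```

The `True` branch is necessarily an approximation: a point that hasn't escaped after 100 iterations might still escape later, which is exactly the gap between describing the rule and fully evaluating it.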


The concepts are also just symbols, just in different media. They are also a map.

>if/when mathematics provides a complete map to that territory which is "the world",

Then it will encode all the information that exists in the universe. Which will not fit in your head, and thus will not be mathematics. Just because a territory is trivially a map of itself doesn't mean it is useful to call it a map.

>The description of the state of it as it is before us is monstrously large, but the initial conditions and rules might not be.

Reality isn't just the initial rules. It has state. That state must be described for you to have a complete description of reality. Yes, we could mathematically describe the complete laws of nature. But reality is necessarily more than just that, because the state is really damn relevant. You can't just ignore it because it doesn't fit your argument.


>Reality isn't just the initial rules. It has state. That state must be described for you to have a complete description of reality.

If reality is deterministic, then all the state is derivable from the rules and the initial state, and it's possible we could figure out the rules and initial state and become relatively sure of them. (We probably wouldn't be able to calculate the future, since evaluating the rules within reality probably necessarily takes more steps than reality takes during that time. But we might be able to calculate enough to see that it matches up with what we think the early universe must have looked like. We might notice that even the smallest of tweaks to the rules or initial state creates an avalanche of changes that doesn't line up with what we believe the early universe looked like, so we can be sure there are no free parameters.) It's fully possible that the rules and initial state could fit in someone's head. They might be ridiculously simple, like Conway's game of life in a few more dimensions.
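The "ridiculously simple, like Conway's game of life" point can be made concrete; a minimal Game of Life sketch in Python (coordinates and the glider pattern chosen purely for illustration):

```python
# A few lines of rules plus a tiny initial state, deterministically
# generating unbounded structure. Live cells are a set of (x, y) pairs.
from collections import Counter

def step(live):
    # Count live neighbours of every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A glider: after four steps the same shape reappears, shifted by (1, 1).
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
cells = glider
for _ in range(4):
    cells = step(cells)
assert cells == {(x + 1, y + 1) for (x, y) in glider}
```

The rules and initial state here fit in a few lines and in anyone's head, yet the future states are fully determined - the flavor of "small description, large universe" the comment is gesturing at.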

(If you think "reality can't be deterministic because true randomness or quantum events exist", well, you can fit true randomness into a deterministic theory by having the world branch off at every random event outcome. The Many-Worlds Interpretation could be viewed as doing something like this for quantum events; it's a deterministic theory.)


Even if reality is deterministic, you still have to know which state to derive. And that process will have to account for all of the same information.


The strange thing about mathematics, as opposed to other noncorporeal ideas, is that it is nearly uniquely suited to being transmitted between brains. My idea of the number "three" and your idea of the number "three" are probably very similar. This sensation that they have an independent existence leads people to the Platonic notion of a mathematical "realm" or "universe".

(personally I think mathematics exists in our brains, but also and crucially in the interactions between our brains, since a proof is never accepted until it is reviewed and so on. A proof that's just in my brain is no proof at all, even if it happens to be correct)


You could say that about most language we use: it's meant for transmitting thoughts between human brains. Why would it be bad at it?

Besides, there are plenty of mathematical concepts which are hard to understand. I don't see any uniqueness to math here.


Yeah, mathematical objects like numbers are concepts in our brains, or imaginary objects. Numbers in particular are imaginary objects used for counting. You can set up an isomorphism between these numbers and real-world objects, e.g., to count them.



> You cannot show just "one" unattached to anything else.

Also interesting to try to show "zero" of something. Zero is fun because it took longer to exist as a number than one & two, and there was some debate over whether it should be a number. "How can nothing be something?"


Historically true, but glad it's there so ℤ is a group. The way to think about 0 for a child is: what you add to anything so it stays the same, i.e., nothing.


Hate to be pedantic, but ℤ︎ is a set (not a group) and even with 0 adjoined it only forms a group under the operators of addition, subtraction, and multiplication. If you want a set that also forms a group under division you need to ‘upgrade’ to the set ℚ︎.


ℤ︎ (…, -2, -1, 0, 1, 2, …) and + forms a group. It is also a ring with * and +. ℚ︎ with * and + forms a field.


Yes, that's what qubex said, in correcting the error in your previous comment.


ℤ︎ with * is not a group, there is no inverse for 0. When being pedantic it's good to be correct.


You are perfectly correct, I was wrong. Upon closer reflection it appears impossible to have a nontrivial set (meaning other than {0}) that forms a group both under addition and under multiplication, because the identity element of the former does not have an inverse in the latter. However, take this with a grain of salt, because I haven't had time to prove this to my own satisfaction yet (and I am loath to go googling for answers in my own field of expertise).

My abstract algebra professor would be horrified by my misstatement, and I shall bring flowers as a sign of atonement and remembrance to his grave at the earliest available opportunity.
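The obstruction named above (no multiplicative inverse for 0, nor for any integer other than ±1) can be checked mechanically; a small Python sketch, with the caveat that a finite search is illustration, not proof:

```python
# Illustrative only: a finite search can't prove facts about all of Z,
# but it shows the asymmetry between + and * on the integers.
sample = range(-10, 11)

# Under +, every integer n has an inverse, namely -n: n + (-n) = 0.
assert all(n + (-n) == 0 for n in sample)

# Under *, n has an integer inverse only if n * m == 1 for some
# integer m. That holds for 1 and -1 alone; for 0 it can never hold,
# since 0 * m == 0 != 1 for every m.
def has_integer_reciprocal(n, search=range(-1000, 1001)):
    return any(n * m == 1 for m in search)

assert has_integer_reciprocal(1) and has_integer_reciprocal(-1)
assert not has_integer_reciprocal(0)   # the obstruction named above
assert not has_integer_reciprocal(2)   # 1/2 is rational, not integral
```

This is exactly why the ‘upgrade’ mentioned earlier goes to ℚ︎ (minus 0, for multiplication): there every nonzero n/m has the reciprocal m/n.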


Ah! I wrote the below before seeing your comment:

Numbers don't exist. Take the number two. You can have two apples, but that's not the number two. You can write the numeral "2", but that's not the number two. It's one line. The word "two" has three letters; it's obviously not the number two. In fact, the number two doesn't exist (or its existence is not contingent on an arrangement of matter/energy. No pattern of matter/energy/space/time is actually THE number two.)

To me, it's wonderful to reflect on the epistemological status of numbers. They don't seem to exist, but we talk about them as if they do. Certainly they are useful. But they don't exist any more than, say, Sherlock Holmes does.

It's also much fun to reflect that, whatever the epistemological status of number, that status is shared by computer languages and algorithms and much else. E.g. the C language doesn't exist. The C standard(s) exist, many C compilers exist, lots and lots of C code exist, etc., but the language itself doesn't exist anywhere, any more than does the number two.


Numbers don't exist - unless you're a Platonist.

The experience of numbers certainly exists. The reason you recognise one-ness and two-ness in objects is because your brain experiences a difference between one-object and a-pair-of-those-objects.

The experience is subjective. If you have two objects that are identical except for the fact that one is red and the other is blue, you can parse them as one-each-of-red-and-blue or as two-identical-things. Both interpretations are logically consistent, but the most useful interpretation depends on the context, and your specific needs as defined by the context.

It's easy to underestimate how subjective math is.

I've long suspected that this is why math eventually dissolves into incompleteness theorems. You can't prove a subjective experience, and eventually all mathematical reasoning reduces to subjectivity - even something basic like true/false, which is essentially just an interpretation of experience. (And can be very unreliable.)

Math is really an introspective map of experiences and processes that we find collectively consistent. We like to tell ourselves it's external and objective. But how can we tell the difference between true objectivity and subjective consistency when we only have shared human subjectivity to use as a reference?


Proof that at least one number exists: Take for our logic first order logic with equality[1]. Take for our definition of the word "number" the Peano axioms[2]. The first Peano axiom is "0 is a natural number." Define "N" to be the unary relation "is a natural number." In our formal language, we write the first Peano axiom as "N0". By the principle of existential generalization[3], we have "∃x Nx", which we read as "there exists x such that x is a natural number." Q.E.D.

I assume you find the above deeply unsatisfying. Most people do. Yet I find it clarifies the matter entirely! If a person's conception of "number" is consistent with the Peano axioms, and if that person's use of the word "exists" includes the existential logical quantifier, then for that person "numbers exist" is a theorem. If a person uses other definitions, the matter may be otherwise.

[1]: https://plato.stanford.edu/entries/logic-classical/

[2]: https://en.wikipedia.org/wiki/Peano_axioms

[3]: https://en.wikipedia.org/wiki/Existential_generalization
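For the curious, the derivation above can even be machine-checked; a minimal sketch in Lean 4 (assuming Lean's built-in `Nat` stands in for the Peano naturals, and rendering "Nx" as mere inhabitation of `Nat`):

```lean
-- "N0" plus existential generalization, as a one-line proof:
-- the witness 0 inhabits Nat, so "there exists x such that x is
-- a natural number" (rendered here as ∃ x : Nat, True) holds.
example : ∃ x : Nat, True := ⟨0, trivial⟩
```

The anonymous constructor `⟨0, trivial⟩` is the existential-generalization step: supply the witness 0, then discharge the (here trivial) property.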


Does an apple exist? You can show me an apple, but I would wag my finger in exactly the same way you are at people trying to show you “one”. The apple you’re showing me is not the same thing as the word “apple”, which is an abstract category that encompasses all types of apples - Honeycrisp, Granny Smith, etc. You could say a Honeycrisp is like an instance of apple, which is a base class, so then it is indeed an apple. But then you’d have to say an apple is also a “fruit” and also a “food”, even though those words also encompass broad categories of things.

If you buy the above, you’d have to say that a single apple is also “one”. After all, it’s just another broad category that a single apple fits in to. If not, I think you’d have to reject the fact that a honeycrisp is an apple, which seems untenable.


Reflecting on this stuff is my golf...

Even if I agree that THE number "one" is "just another broad category that a single apple fits in to", I think the epistemological status of the Broad Category is then itself just as problematical, existence-wise, as the original concept of "number", eh? We can at least display a pair of apples and "count" to "two" with them, etc. What hope do we have of enumerating (no pun intended) the Broad Category of "one"? And what do we gain from introducing it?


Numbers do not have a physical existence, but we can talk about them because in some capacity they absolutely do exist: conceptually.

There is the Tao and then there are the 10,000 things.


Numbers exist in a different way to Sherlock Holmes, who of course exists as a fictional character. One of the things people look for when searching radio signals for alien life is prime numbers, as intelligent species would discover those, but they are not produced by mechanical processes like quasars. How could we and they discover them if they don't exist at least in some form?


> Numbers don't exist.

Only for the sense of "exist" which is roughly equivalent to "can poke a hypothetical indestructible stick at it, given enough energy."

> the C language doesn't exist. The C standard(s) exist, many C compilers exist, lots and lots of C code exist, etc., but the language itself doesn't exist anywhere, any more than does the number two.

For the purpose of tax law, software is defined as a "tangible good." However, in other areas of the law, it's considered intangible.


You should keep in mind that being unable to show or find an example doesn't disprove the category. You'll never show me a fork yesterday, but that doesn't mean there were no forks yesterday.


I’m a mathematician and you seem a bit confused. Natural numbers ({1, 2, 3, ... }) are for counting. Pointing at a bunch of objects as prime examples of that which can be counted with natural numbers is as close as one can get (short of just scribbling down the numerals and kind-of circumventing the point). Other classes of numbers (particularly the Reals) are for measuring. Again, short of writing down examples of decimal expansions, a child could point to a (non-integer) mark on a measuring tape.

I get the feeling you’re not quite so well versed on this stuff as you think you are. Want a true example of ‘one’: the number of solutions to the intersection of two linear equations that are not parallel. Want an example of ‘two’: the number of solutions between a quadratic equation and a linear equation (given that certain existence criteria are met).

I think you need to revise your measure theory.
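The two examples above can be spelled out numerically; a sketch with hypothetical concrete equations (any non-degenerate choices would do):

```python
# "One": two non-parallel lines meet in exactly one point.
a1, b1 = 2.0, 1.0    # y = 2x + 1   (hypothetical example line)
a2, b2 = -1.0, 4.0   # y = -x + 4   (hypothetical example line)
assert a1 != a2      # not parallel, so exactly one solution exists
x = (b2 - b1) / (a1 - a2)   # the unique crossing: x = 1, y = 3

# "Two": a line y = m*x + c meets the parabola y = x^2 where
# x^2 - m*x - c = 0; the discriminant m^2 + 4c counts the solutions.
m, c = 0.0, 4.0      # the line y = 4 (meets the existence criterion)
disc = m * m + 4 * c
n_solutions = 2 if disc > 0 else (1 if disc == 0 else 0)
assert n_solutions == 2     # the parabola crosses y = 4 at x = -2, +2
```

The point being made: "one" and "two" here count solutions, with no reference to forks or balls.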


I think they're just trying to say "1" is not a tangible object in the physical universe, not that it does not exist as a concept. The natural numbers are just data, information, they don't exist beyond the first dimension except as descriptors to things in further dimensions, just like lines can be built up into higher dimensions but cannot be physically created for us to observe.

Maybe you're reading too much into this.


How is two straight lines crossing in one and only one place not an example of a single occurrence without reference to real-world objects (which the OP dismissed as “not really being a number”)?

As for reading “too much into this”: as I stated, I’m a mathematician; of course I’m going to take this seriously and make an effort to disprove naive misconceptions.


"Reification" is the term to search for if you wish to delve into the extensive Philosophical bibliography on this topic that's accumulated over the last few centuries. Within mathematics "Intuitionist Mathematics" is the rubric for the anti-reification view. This relies on computability as a substitute for "existence." The Intuitionist view has pretty much triumphed.


I’m all for reification. It’s not even limited to mathematics: ‘democracy’ is just another example of a reification from another realm of human endeavor. It gets implemented in different countries with slightly different axioms and varying degrees of success, but in aggregate those nations that abide by some form of it are accepted as functional equals by others who have likewise implemented their own instantiation of the overarching ideal.


That's not what reification means; it's more a form of Platonism (Idealism), plus. See The Meno and its discussion of the forms (say, the "form" of a spoon). Plato would reify democracy (although he hated it), but most of us certainly do not. A nominalist (even a Berkeley follower) could discuss democracy perfectly well.


I'm pretty well versed in what reification means. Fundamentally it entails taking a theoretical concept and treating it as if it were a tangible, almost embodied, thing.

Maybe my example wasn't the best but this “democracy thing” gets discussed as if it has material properties: that it can expand, that it can be transported from one place to another, that it can be expelled from a locale, and so forth. Those are pretty material attributes to assign to something that is actually notional.


Now if you look up tangible...

I think you're caviling, at this point.


“Tangible” means touchable and has a Latin root; “tangere” still means “to touch” (though somewhat archaic) in my native Italian. That same Latin root is why we call linear equations that share the gradient of a given curve evaluated at the point where they cross to be “tangential”.

Look, we could go on with this all day. What’s your point? To show your terminology and grasp of topics is greater than mine? Is this some kind of pissing match? Because I’m really not interested in these schoolyard antics.


So, how do you pinch a democracy, physically? By avoiding answering my point in my last message, you're still caviling. One progresses to the end of a conversation by honestly answering the points made, and you aren't interested in doing that.


This is why Aristotle said that the intellect cannot be material. The human intellect is able to receive and understand abstractions. However, whenever you try to represent an abstraction as a physical object or in a physical medium, it is no longer abstract. It is true that you can write about abstractions using physical ink on physical paper, but those are just symbols; they are not the abstraction. It is only when a person reads the symbols, understands them, and forms the abstraction in his mind that the abstraction comes into existence.

So, we can create programs that appear to do intelligent things, but it is the human observer that brings meaning to what the program is doing. The program might output the letters "dog" when a dog passes in front of its camera, but it's the human who reads "dog" that knows what a dog actually is.


That sounds like the illusion of consciousness. Neural networks show that you don't need "dog" in order to do whatever you need to with the idea of a dog.


You might even say the unattached "one" is more real than the fork. At the end of the universe, that fork is long gone, but "one" is still there.


But if there is no one around to appreciate or think of "one", does it still exist?


Where is "one" then, if it's still "there"? :P


A couple of my favourites:

- Child is put into perfectly sealed cube. They have a torch, which provides light. Since the light cannot escape the cube, what happens when they turn the torch off? Where does it go?

- If you put a brain in a jar and wired up electrodes which pulsed signals up every nerve ending etc. to simulate reality - would the brain know it was in a jar? How? (The Matrix being the modern representation.)



In the case of the cube the internal temperature would rise slightly. The kid is also emitting some heat, so that would contribute to the rising temperature too. Eventually the kid would die from lack of oxygen. If you provided oxygen to the child and removed carbon dioxide, the system would no longer be sealed: adiabatic expansion of the added oxygen would lower the temperature, and removal of carbon dioxide would extract some heat from the cube.

Observed externally the mass of the cube (assuming you didn’t provide a life-support system as I suggested) would remain unchanged because all mass energy is still within the cube. It has simply rearranged itself in a manner unobservable to an external observer.

When they turn the light off you have a black body cavity with an intruder. Ultimately all the light would get reflected back and forth until it fell on various parts of his body whose colour allows them to absorb the incident light waves. The child would get slightly, probably imperceptibly warmer. His surface molecules would move faster and thus he’d have more heat energy.


By extension, does God exist? I don't know the answer but he seems to be very useful (at least to some members of the society!)


Sure (s)he does - inside all of us. This works from both a spiritual and a human-nature point of view.



So by your definition, one is an adjective, so asking them for a noun that represents the adjective "one" violates the definition. It's sort of a trivially false disingenuous question.


I was once posed a related question by a three-year-old: "How do you make a number?"

It was a simply worded but delightfully insightful wondering. I did my best in the moment and answered, "One way we can make numbers is by counting." I've continued to mull this question over since then. I'm drawn to this child's conception of numbers as something we make. It makes sense to me as a teacher of young children and a wonderer about math myself, yet it doesn't really square with what we've been told about numbers in most educational settings.


One of the Peano axioms is that every natural number is either zero or the successor of another. It's easy to explain to kids without jargon: a number either counts nothing at all, or it counts things that have an extra item compared to some other thing.
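To make that concrete, here's a toy sketch in Python (the tuple encoding is purely illustrative, not any standard construction): zero is an empty tuple, and each successor wraps the previous number.

```python
# Toy Peano naturals: a number is either ZERO or succ(another number).
ZERO = ()

def succ(n):
    return (n,)  # "one more than n"

def count(n):
    # Unwrap until we reach ZERO; the number of wraps is the number.
    depth = 0
    while n != ZERO:
        n = n[0]
        depth += 1
    return depth

three = succ(succ(succ(ZERO)))
assert count(three) == 3
```

Everything about a natural number can in principle be recovered from just these two constructors.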


This kind of breaks down once you move past natural numbers, but for early learners of mathematics I agree it's useful to think about.


But other kinds of numbers can be made from that:

- Differences of counting numbers (integers)

- Ratios of numbers (rationals)

- Limits to sequences of numbers (reals)

- Solutions to polynomials made from numbers (complex)

You can think of each of these as adding a layer of behavior to the basic “each number is nothing or one more than another number”.


I agree with your sequence until the last two.

The second to last, that should be Cauchy sequences.

And for the last, demonstrating that the algebraic closure of the reals is the complex numbers from first principles is much harder than describing complex numbers as pairs of reals with a multiplication rule that (a,b) * (c,d) = (ac - bd, ad + bc) and then much later proving that it is algebraically closed through complex analysis.
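That pairs-with-a-rule definition is small enough to write down directly; a minimal sketch (not a full complex-number implementation):

```python
# Complex numbers as pairs (a, b) of reals, multiplied by the rule
# (a, b) * (c, d) = (ac - bd, ad + bc).
def cmul(ab, cd):
    a, b = ab
    c, d = cd
    return (a * c - b * d, a * d + b * c)

# The pair (0, 1) plays the role of i: its square is (-1, 0), i.e. -1.
assert cmul((0, 1), (0, 1)) == (-1, 0)
```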

For those who don't know the proof, the idea is this. Liouville's theorem says that if a function is differentiable everywhere, and it is bounded, then it must be constant. Now suppose that p(z) is a polynomial. Consider the function 1/p(z). You can show that as z approaches infinity, it approaches zero. It is not constant. Therefore it must not be differentiable or not bounded. It doesn't take too much work from there to prove that it blows up somewhere, and the spot that it blows up is a point where p(z) is 0.

Apply unique factorization for polynomials (see http://sites.millersville.edu/bikenaga/abstract-algebra-2/po... for that proof) and you quickly get the fact that the complex numbers are algebraically closed.


A much easier to understand definition for the set of reals is probably to use infinite decimal expansions that don't end in 9999 etc. Equivalence classes of Cauchy sequences are tougher to explain, I'd say.


In that case, good luck coming up with a good definition around multiplication where 3 * 0.33333... works out right. And then proving arithmetic properties like the associative law. And then proving that when you do the reals in decimal, you get the same system as the reals in binary.

It sounds harder, but is actually easier to go through the Cauchy sequence definition and then point out that the decimal representation naturally gives rise to a Cauchy sequence. So, for example, 3.1415926535... gives you (3, 31/10, 314/100, 3141/1000, ...). And as Cauchy sequences, of course, (1, 1, 1, 1,...) is easily proved to be the same as (9/10, 99/100, 999/1000, ...).
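A small sketch of that correspondence using exact rational arithmetic (the digits of pi are hard-coded here purely for illustration):

```python
from fractions import Fraction

def truncations(int_part, digits):
    # Successive decimal truncations as exact rationals:
    # 3, 31/10, 314/100, 3141/1000, ... for 3.1415...
    seq = [Fraction(int_part)]
    num, den = int_part, 1
    for d in digits:
        num = num * 10 + d
        den *= 10
        seq.append(Fraction(num, den))
    return seq

pi_seq = truncations(3, [1, 4, 1, 5, 9])
assert pi_seq[:3] == [3, Fraction(31, 10), Fraction(314, 100)]
```

Each truncation differs from the next by at most a shrinking power of ten, which is what makes the sequence Cauchy.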


Oh right, 3 * 0.33333... is an issue, yeah. Still, you need to define that equivalence relation on the Cauchy sequences and then explain what a set modulo a relation means.


This is all standard mathematics.

The equivalence relationship is that the sequence (x_1, x_2, x_3, ...) is equivalent to (y_1, y_2, y_3, ...) if and only if the limit as n goes to infinity of x_n - y_n = 0.

Formally, the real number represented by (x_1, x_2, x_3, ...) is the set of all Cauchy sequences which are equivalent to that one. Since "equivalent to" is transitive, any Cauchy sequence in that set will define the same set.

Addition and multiplication are defined elementwise. Proving that they are well-defined is relatively straightforward. Their algebraic properties follow for free. Any rational number q can be mapped to the Cauchy sequence (q, q, q, ...) which leads to a unique real number that we somewhat sloppily call q again.

I've left some details out, but this construction is well-understood, and is how we define the completion of a metric space.
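On finite prefixes, the elementwise definitions and the (1, 1, 1, ...) vs (9/10, 99/100, ...) example from upthread look like this (a finite prefix can only illustrate the limit claims, not prove them):

```python
from fractions import Fraction

# Elementwise addition and multiplication on (prefixes of) sequences.
def add(xs, ys):
    return [x + y for x, y in zip(xs, ys)]

def mul(xs, ys):
    return [x * y for x, y in zip(xs, ys)]

# The difference between (1, 1, 1, ...) and (9/10, 99/100, ...)
# is (1/10, 1/100, ...), which tends to 0; so under the equivalence
# relation they represent the same real number.
ones  = [Fraction(1)] * 5
nines = [Fraction(10**k - 1, 10**k) for k in range(1, 6)]
diffs = [a - b for a, b in zip(ones, nines)]
assert all(d == Fraction(1, 10**k) for k, d in enumerate(diffs, 1))
```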


One way to explain the real numbers to a child might be:

There are whole numbers (integers, terminology not accurate but it'll do for now), and then there are numbers called fractions (rationals) which are either whole numbers themselves, or they fall between two whole numbers, and we get them by dividing one whole number by another.

The real numbers are either fractions, or they are numbers that fall between two fractions. If it's one inch from the center of a circle to its edge, the distance around the circle is not a fraction. In inches it's about 3.14; it falls between 3 and 3 1/7, and no fraction you can possibly choose will ever get it exactly, it will be at best a little under or a little over.

You can get other non-fraction real numbers by finding square roots, but there are infinitely many of these numbers between any two fractions, not all of which are square roots. And if you try to take the square root of a negative number, you won't get a real number at all, and we call these numbers "imaginary".

Hearing stuff like this as a kid blew my mind, a bit like learning about black holes. It set me up to enjoy math throughout my life.


Nitpick, but sequences of numbers can diverge (e.g. tend to infinity) or oscillate without settling; it's more accurate to say limits of Cauchy sequences.


> You can see some of that history of what we call some of the different sorts of numbers: e.g. Negative, irrational, and imaginary numbers. Each of these represents a bitter argument about whether a new type of number should be allowed, all of which were eventually won by the people on the side of the new numbers...

Was there ever a proposed type of number that is today not considered a number? Like i=squareroot(-1), was there ever an attempt to do math with something like y=1/0 or another illegal construct?


> was there ever an attempt to do math with something like y=1/0 or another illegal construct?

Wheels are a type of algebra where division by 0 is defined.

https://en.wikipedia.org/wiki/Wheel_theory

https://math.stackexchange.com/questions/994508/wheel-theory...

To call wheels "obscure" is to make them out to be better-known than they are, however.


I had not heard of Wheel theory, quite fascinating.

But eek! From the wikipedia article:

    0x ≠ 0 in the general case
    x − x ≠ 0 in the general case
Like, you get the ability to divide by zero, but at what cost? Multiplying by zero is also not zero! Seems bonkers.

Are there any practical uses for this kind of algebra?


Likely no (the peculiar distributive laws seem unusable to me). But I have come across similar rules that I think may have a real usage. It sometimes seems to me that it might make more sense to keep 'factors' attached to 0s, such that x-x = (1-1)x = 0x might be 'sound', and to keep track of 'powers' of 0s by having factors of 0 attached to 0s: (0)0 = 0^2 != 0, etc.

If you keep factors like this, then you could implement L'Hopital's rule without the result only being true if considered under a limit: say, lim(x->0) (5x^2/ 3x^2) = 5/3 could be computed as (5)0^2 / (3)0^2 = 5/3.

This is not like Wheels though; it requires that any power of 0 be a distinct number. I of course have no idea if it is sound or meaningful, but I do find myself thinking about it a lot.


Those "factor" zeros look like infinitesimal forms, which are well studied.


There is a similarity, but no one claims infinitesimals are _literally 0_, and certainly no one would claim that 2-2 = 2 epsilon.


The original version of Calculus by Newton used "fluxions". That doesn't correspond to any number system that we use today.

Leibniz's (re?)invention of Calculus used infinitesimals. Infinitesimals as understood by mathematicians then do not correspond to any numbers we use today. (Yes, yes, something else called infinitesimals does show up in nonstandard analysis and the notation deliberately looks the same. But the underlying concepts are more..complicated.)


> But the underlying concepts are more..complicated.

Can you expand on this?


Can you expand on this?

Yes, maybe more than you want. :-)

Abraham Robinson's version of non-standard analysis went like this.

You start with the standard model of the real numbers and associated concepts like sets, functions, and so on. Using something called the ultrafilter construction, you construct a new, larger model of the real numbers+stuff, and a mapping from the standard model to the nonstandard model. So we have nonstandard numbers, nonstandard sets, nonstandard functions and so on. Some of which are just mappings of the standard ones, and others of which are new.

Thanks to something called the transfer principle, all statements in first order logic about things involving real numbers remain true about the nonstandard versions of the same.

In other words within the nonstandard model, the world looks the same as within the standard model. But there is a key feature. From the construction we know that the nonstandard model has numbers in it that are closer to 0 than any real number except 0. That set is the infinitesimals. The infinitesimals are a set identifiable from the construction, but are NOT a nonstandard set. And 1/infinitesimal gives you infinite numbers whose absolute value is larger than any standard real.

Now here is the point of the whole construction. Suppose that (f(x + dx) - f(x))/dx only varies by an infinitesimal from a non-standard version of a real across all possible infinitesimals. Then it turns out that that real is the derivative in the usual notation. Similarly, if a Riemann sum that breaks an area into N pieces always gives the same answer to within an infinitesimal for all infinite integers N, then it turns out that the function is Riemann integrable, and that answer is the actual integral. And with these two insights, most of the handwavy arguments that people used to use can be rescued.

Some mathematicians have found that the infinitesimal notation helps them think through problems, and some important new theorems were proven using this approach. However, those proofs can be translated back to more standard notation. Many students intuitively prefer the infinitesimal notation over limits. But when you unpack it, does it make sense to invoke the axiom of choice to define the derivative? (The ultrafilter construction uses the axiom of choice. Alternatives exist, but they use complex model-theoretic mechanics under the hood. Really understanding them requires a lot of machinery.)


Infinitesimals [1][2][3] are probably the best example. Though there's a whole range of theories, the basic version is just a magic number, d, such that d^2 == 0. So (x+d)^3 = x^3 + 3dx^2 + 0, giving us a more "natural" path to get to the notion of derivatives without resorting to limits. They get hard to manage, though, so they haven't really made it into the mainstream.

[1] https://en.wikipedia.org/wiki/Non-standard_analysis

[2] https://en.wikipedia.org/wiki/Smooth_infinitesimal_analysis

[3] https://en.wikipedia.org/wiki/Dual_number
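The dual-number version is easy to sketch: represent a + b·d as a pair and drop every d² term during multiplication. (This is an illustrative toy, only enough to differentiate polynomials.)

```python
# Dual numbers: (a, b) stands for a + b*d with d*d = 0.
# Evaluating a polynomial at x + d yields (f(x), f'(x)).
class Dual:
    def __init__(self, a, b):
        self.a, self.b = a, b

    def __add__(self, other):
        return Dual(self.a + other.a, self.b + other.b)

    def __mul__(self, other):
        # (a + b d)(c + e d) = ac + (ae + bc) d, since d^2 = 0
        return Dual(self.a * other.a,
                    self.a * other.b + self.b * other.a)

x = Dual(2.0, 1.0)   # x + d at x = 2
y = x * x * x        # (x + d)^3
assert y.a == 8.0    # value of x^3 at 2
assert y.b == 12.0   # derivative 3x^2 at 2
```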


Surreals and Hyperreals have lots of "extra" numbers that don't exist in the reals. They're a bit niche though.


I think mathematicians have mostly moved past the idea of fighting over whether something is or isn't a number. There are lots of interesting number systems that never made it into widespread use in the same way that the "core" number systems did, and there are lots that see niche usage, but people are pretty relaxed about whether they are "actually" numbers - they might or might not typically be referred to as numbers, but the strongest negative answer to "Is this a number?" you'd typically see would probably be to shrug and say "sure I guess, if you like?"

One example of previous attempts which we'd now consider to be invalid is a lot of operations on infinite series. In the early days of analysis you'd get people concluding things like "1 - 1 + 1 - 1 + ... = 1/2", which gets you into more hot water the more you look at it. The problem here isn't that they are philosophically unsound per se - you can define all sorts of notions of "infinite sum" that make this work, like Cesàro summation - but they don't behave as nicely as people intuitively expected them to, and the naive versions of them don't really work.
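Cesàro summation of that series is easy to check numerically; a minimal sketch:

```python
from fractions import Fraction

# Partial sums of 1 - 1 + 1 - 1 + ... oscillate: 1, 0, 1, 0, ...
partials, s = [], 0
for n in range(1000):
    s += (-1) ** n
    partials.append(s)

# ...but the averages of the first k partial sums (the Cesàro means)
# settle at exactly 1/2 for even k.
cesaro = [Fraction(sum(partials[:k]), k) for k in (10, 100, 1000)]
assert all(c == Fraction(1, 2) for c in cesaro)
```

So "= 1/2" is recoverable, but only by changing what "sum" means, which is exactly the kind of surprise that got early analysts into trouble.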


You can consider almost anything a number if you can do some internally consistent operations with it.

Hamilton attempted a theory of triplets before settling on 4d quaternions, so that would be one example of a failed number system off the top of my head.


There is a video (https://www.youtube.com/watch?v=S4zfmcTC5bM) from the PBS Infinite Series that covered this topic as well, for people who want a visual version of the explanation. I'm still sad that the series/channel was shut down though, such a great series.

PS: this article/video essentially defines the naturals from the fundamental set theory axioms. This other video (https://www.youtube.com/watch?v=KTUVdXI2vng) from the same PBS series shows how you can then use the naturals to construct other types of common numbers, up to the reals.


As a follow on people may enjoy Feynman's lecture on Algebra http://www.feynmanlectures.caltech.edu/I_22.html and some audio clips http://www.feynman.com/the-animated-feynman-lectures/

>To discuss this subject we start in the middle. We suppose that we already know what integers are, what zero is, and what it means to increase a number by one unit. You may say, “That is not in the middle!” But it is the middle from a mathematical standpoint, because we could go even further back and describe the theory of sets in order to derive some of these properties of integers. But we are not going in that direction, the direction of mathematical philosophy and mathematical logic, but rather in the other direction, from the assumption that we know what integers are and we know how to count. ...


”This means we eventually end up either with an empty sequence (the set is empty) or with a sequence with only + operations in it”

For valid starting points, that is true, but it doesn’t follow from the text. The loop has an error exit (”If the sequence starts with − then something has gone wrong”), and the text doesn’t show that you won’t get there for valid inputs, and showing that isn’t trivial.

For example, if the sequence is ++---+, the first iteration removes the second and third item, leaving +--+, and the second iteration removes the first and second item, yielding -+.

The ‘program’ crashes there because the input is invalid, but proving that it never will crash for valid inputs without resorting to “that’s how integers behave” isn’t trivial.
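A hypothetical reconstruction of that loop (the cancellation rule assumed here, "delete the first adjacent +- pair", is one reading that reproduces the trace above):

```python
def cancel(seq):
    # Repeatedly delete the first adjacent "+-" pair. Valid inputs
    # should end as all "+" (or empty); the error exit fires when the
    # sequence starts with "-".
    seq = list(seq)
    while True:
        if not seq or all(c == "+" for c in seq):
            return "".join(seq)
        if seq[0] == "-":
            raise ValueError("something has gone wrong: " + "".join(seq))
        i = next(k for k in range(len(seq) - 1)
                 if seq[k] == "+" and seq[k + 1] == "-")
        del seq[i:i + 2]

assert cancel("++-") == "+"
# cancel("++---+") steps through "+--+" and then "-+", where it raises.
```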


My usual answer is that math isn't too concerned with what things are, but with how they work with each other. So the answer would be: anything that works like the Peano axioms says it should. The "what it is" can be filled in later, and that's what makes math so powerful.


Article seems very verbose and only really addresses the natural numbers (it mentions negatives and rationals in passing).

In case anyone just wants actual answers without reading pages and pages of prose, here is one set of constructions (there are other competitors too).

The natural 0 is the empty set. The natural 1 is the singleton {0}. The natural 2 is {0,1}. In general, the natural n+1 is {0,...,n}. Exercise to the reader: define appropriate arithmetical functions on the naturals.
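That first paragraph runs in a few lines of Python, with frozenset standing in for "sets can contain sets":

```python
# Von Neumann naturals: 0 is the empty set, and n+1 is n ∪ {n},
# i.e. {0, ..., n}.
def nat(n):
    s = frozenset()
    for _ in range(n):
        s = s | {s}
    return s

assert len(nat(4)) == 4   # 4 = {0, 1, 2, 3} has four elements
assert nat(2) in nat(4)   # under this encoding, 2 is literally a member of 4
```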

Define a relation ~ on pairs (a,b) of naturals by saying (a,b)~(c,d) if and only if b+c=a+d. Exercise to the reader: ~ is an equivalence relation. Its equivalence classes are called "integers". The equivalence class containing (a,b) represents the integer b-a. For example, the integer -1 is the equivalence class {(1,0),(2,1),(3,2),...}. Exercise: define appropriate arithmetical functions on the integers.

Define a relation ~ on pairs (m,n) (n nonzero) of integers by saying (m,n)~(p,q) if and only if mq=pn. Exercise: Show ~ is an equivalence relation. Its equivalence classes are called "rationals". The equivalence class containing (m,n) represents the rational m/n. For example, 1/2 is the equivalence class {(1,2),(-1,-2),(2,4),(-2,-4),...}={(k,2k) for all nonzero integers k}. Exercise: define arithmetic operations on the rationals.
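The two equivalence relations can be written down as predicates on pairs; a minimal sketch:

```python
def int_equiv(ab, cd):
    # (a, b) ~ (c, d) iff b + c == a + d; both pairs encode b - a.
    (a, b), (c, d) = ab, cd
    return b + c == a + d

def rat_equiv(mn, pq):
    # (m, n) ~ (p, q) iff m*q == p*n; both pairs encode m/n.
    (m, n), (p, q) = mn, pq
    return m * q == p * n

assert int_equiv((1, 0), (3, 2))    # both represent the integer -1
assert rat_equiv((1, 2), (2, 4))    # both represent the rational 1/2
```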

To get from the rationals to the reals, see https://en.wikipedia.org/wiki/Dedekind_cut


The thing that I don't like about this sort of construction is that it also implies all sorts of nonsense. For example, I can ask the question "Is 2 a member of 4", which is clearly nonsensical, but will get the answer "yes" from this model.

Type theory and category theory give us a much better way of constructing these sorts of objects without having to resort to creating constructions with all sorts of side effects. Not to mention that this is one possible construction of the natural numbers in set theory; in other formulations the statement "2 is a member of 4" would not be true, making things even more confusing. For example, we have the other standard approach of 0 = {}, 1 = {{}, 0}, n = {{}, n-1}. The untyped lambda calculus and things like Church numerals have the same defect.

If we can start assigning types, then we can reason about these things in a much more concrete manner without these nasty side effects of theorems that only make sense because of definitional shortcuts. We can even derive isomorphisms between different derivations of the same objects to show that two different definitions (like the Peano numerals and binary numbers) of a "natural number" can be used exactly interchangeably, because statements like "2 is a member of 4" are simply not expressible given the derivation.


> The thing that I don't like about this sort of construction is that it also implies all sorts of nonsense. For example, I can ask the question "Is 2 a member of 4", which is clearly nonsensical, but will get the answer "yes" from this model.

Why? Any set of 4 things also contains 2 things. Where's the nonsense? Within this definition, that's consistent.

> Type theory and category theory give us a much better way of constructing these sorts of objects without having to resort to creating constructions with all sorts of side effects.

Out of curiosity, what is the category theoretic construction that avoids Russell's paradox? I don't know that there isn't one, but I can't think of it off the top of my head. I know there are category theoretic constructions in general (I responded to one someone else posted in this thread).


> Why? Any set of 4 things also contains 2 things. Where's the nonsense? Within this definition, that's consistent.

That's the problem -- it's only within this definition. In other definitions, notably the nesting example that I gave, that theorem is false. And the theorem makes no sense in and of itself, because we're talking about numbers, so it is unexpected that the membership operator would apply at all.

Whereas I can take two definitions of the natural numbers

  PN =
  Peano_One: PN
  Peano_Succ: PN -> PN

  BN =
  Binary_One: BN
  Binary_2x: BN -> BN
  Binary_2xp1: BN -> BN
and I can define a Plus: PN x PN -> PN and a Plus: BN x BN -> BN, and so on, and once I can define Binary_Succ: BN -> BN and Peano_2x and Peano_2xp1 I can prove that these are isomorphic types, so all theorems derivable from PN apply to BN and vice versa, not just the convenient theorems that don't use any syntax from the meta-language (e.g. set theory).
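A sketch of those two definitions and a conversion that witnesses the correspondence, with tagged tuples standing in for constructors (the encoding is mine; the PN/BN names are from above):

```python
# PN: ("one",) or ("succ", pn)            -- unary naturals
# BN: ("one",), ("2x", bn), ("2x+1", bn)  -- binary naturals

def pn_to_int(pn):
    # Count successors down to ("one",).
    n = 0
    while pn != ("one",):
        _, pn = pn
        n += 1
    return n + 1

def bn_to_int(bn):
    if bn == ("one",):
        return 1
    tag, rest = bn
    return 2 * bn_to_int(rest) + (1 if tag == "2x+1" else 0)

five_pn = ("succ", ("succ", ("succ", ("succ", ("one",)))))
five_bn = ("2x+1", ("2x", ("one",)))   # 2*(2*1) + 1 = 5
assert pn_to_int(five_pn) == bn_to_int(five_bn) == 5
```

Because nothing outside the constructors is expressible, statements like "is 2 a member of 4" simply cannot be asked of either representation.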


What you're saying isn't demonstrating any inconsistency. A thing is only true in mathematics if it follows from the definitions. If you change your definitions, you should expect that theorems built upon those definitions will no longer hold.

Stated another way, consistency is only a coherent mathematical concept from the perspective of a specific set of definitions. There's no problem here.


I don't claim that this makes it inconsistent. Just that it admits meaningless concepts. You can attribute this, if you like, to the fact that I am a computer scientist, and see things through that lens.

If you give me a statement saying "For two sets, A and B, is 'A union intersection B' empty?", the statement would not be well-formed, so you could just say that it is not a valid proposition without asserting anything about its truthiness. Similarly, a good foundation of mathematics should be able to reject a statement like "2 is a member of 4" as poorly formed, and make no statement about its truthiness.

I guess what I'm saying is that set theory is a bad basis for mathematics and should just be regarded as an interesting relic from previous attempts to formalize mathematics so that we can move on to more powerful systems.


It's similar to the way in C you can malloc memory for two data structures, and then check which pointer is smaller. Say, one pointer points at the data for Mario, and the other pointer points at the data for Luigi. You look at the raw pointers and notice that the Mario pointer is a smaller 64-bit number than the Luigi pointer. That seems absurd or arbitrary because why should "Mario" be "smaller" than "Luigi"? But it doesn't really matter: the game still runs fine and nothing breaks as a result of the arbitrariness.


The definition of the Peano numerals has two constructors, Zero: N and Succ: N -> N. Russell's paradox (assuming that we're talking about the idea of "set of all sets that are not members of themselves") is avoided simply because the objects produced are not sets, and sets have no exalted position within the mechanics of category theory.

Talking about concepts like "the category of categories that don't contain themselves" is kind of navel-gazey; and ends up falling apart in most constructive variants just because you can't give a comprehensive construction of elements of this category.

Admittedly I'm throwing some concepts from intuitionism and type theory into the mix here; if I took the time I could make these statements more precise.


I see what you're getting at. My (similarly hand-wavy) perspective is that constructions using category theory end up defining natural numbers as categories of sets with cardinality n. What I was really getting at is, how do you define your relation for the category in such a way that isn't pathological ("n is the category of sets with cardinality n except all such sets containing n")?

Alternatively, what are you selecting as the objects for your category if not sets of a given cardinality?

To be clear, I consider a lot of discussion about the foundations of set theory (and paradoxes thereof) to be pretty navel-gazey.


I'm getting flashbacks of Baby Rudin's first chapter from this comment :)

For anyone who wants to learn more about this: what the parent comment is describing is the construction of natural numbers as cardinalities of sets. This is the (modified) Frege definition which avoids Russell's paradox - you can use that as a jumping off point for further dives into set theory and number systems.

To define Peano arithmetic you start with a successor function, then an addition function, then multiplication, etc.


”an equivalent relation.”

Auto-correct at work, I guess. For those not familiar with the term: if Google says did you mean equivalence relation, Google is right.


Thanks, fixed


Your castle is built on sand. What, then, is {}?


One of the axioms of set theory is the axiom of the empty set, which states that there exists at least one set which has no elements. Another axiom of set theory is the axiom of extensionality, which states that two sets are equal if they have the exact same elements: from which it follows that all sets without elements are identical, i.e., there is only one set without elements. We call that the empty set.

Other axioms of set theory are used to formalize other steps in my post. https://en.wikipedia.org/wiki/Zermelo%E2%80%93Fraenkel_set_t... For example, the Axiom of Infinity is used to formalize the handwavy part where I said "In general, the natural n+1 is {0,...,n}".


An axiom is simply something made up.

At the end of the day, all of our Mathematics rests on foundations that are made up.

It's difficult to say what the empty set is. Because it isn't really anything at all.


Well, yeah. Axioms are not deduced, they are the starting points for the various theories that are deduced from them. (There is not much you can deduce ex nihilo.)

Did you have some further point?

If you were hoping that mathematics would have - or thinking that it _should_ have - some ontological foundation that is not in this sense "made up", I'm afraid it simply doesn't.


There are legitimate criticisms you can levy against set theory, but I'm starting to lose you here. I'm not really following your point anymore - this seems like arguing about whether or not mathematics is invented or discovered.

Are you trying to argue coming up with new definitions isn't worthwhile if the axioms don't have a foundation in reality? If so, why? If not, what are you saying?


The original post is called "What even is a number?". The comment I replied to tried to answer that question with "some formalism".

What I'm saying is: questioning foundations leads you to new foundations. New foundations that you can then question all over again. It's turtles all the way down.

If you're actually looking to use math, this is a futile exercise. It works, so just use it. Essentially, I am re-iterating von Neumann's statement:

> In mathematics you don't understand things. You just get used to them.


In that case I'd agree with you. I'm not particularly keen on rehashing foundations of mathematics either.


Do you complain similarly when Euclid states that for any two distinct points, there exists an infinitely long straight line which passes through those two points? We cannot point to that line, we can only point to a small portion of it within our field of vision, and have to extrapolate it to infinity.

Either there is an empty set, or there isn't. If there is one, I win. If not, then let S be the set of all empty sets. There are none, so S is the empty set and I win again.


This is the same response I gave to throwawaymath:

The original post is called "What even is a number?". The comment I replied to tried to answer that question with "some formalism".

What I'm saying is: questioning foundations leads you to new foundations. New foundations that you can then question all over again. It's turtles all the way down.

If you're actually looking to use math, this is a futile exercise. It works, so just use it. Essentially, I am re-iterating von Neumann's statement:

> In mathematics you don't understand things. You just get used to them.


>In mathematics you don't understand things. You just get used to them.

Yep, one of my favorite math quotes and highly under-rated.

One thing I've found is that if you find yourself using the wrong "types", you're almost certainly going in the wrong direction. For example, if you're trying to solve a complex analysis exercise and you find yourself thinking about how the complex numbers = the real-coefficient polynomials modulo the ideal (x^2+1), STOP: You're getting nowhere! :)


All of mathematics is built on sand in the sense that there necessarily are going to be undefined terms. There’s no way around this. Think about the English language. Grab a dictionary. Look up any word in it. Look at the definition of that word. Pick a word in that definition. Repeat, and eventually you’ll end up with a definition that contains one of the words you previously looked up. So English is built on sand too, and yet we are still able to communicate.


> the natural 0 is the empty set.


That is a definition of "natural 0", not a definition of "empty set". GP was pointing out that GGP was using the concept of "empty set" to provide a definition for the natural numbers, without having first provided a definition for the empty set.

Replies that consist solely of throwing a quote at someone are kind of rude even if you're in the right.


I don't see why GGP should have to reinvent set theory to use it in his construction of numbers. Set theory and the idea of {} is well defined, and hardly "built on sand".


He doesn't necessarily have to. But when he defines "natural zero" as "the empty set", and someone asks "what's the empty set?", and you say "natural zero", you're being circular.


{} is the empty set, which as mentioned is natural 0.
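And each successor is just the set of everything before it. The von Neumann construction (0 = {}, n+1 = n ∪ {n}) fits in a few lines of Python with frozensets — a sketch, naming mine:

```python
def von_neumann(n):
    """Von Neumann natural: 0 = {}, and n+1 = n ∪ {n}."""
    s = frozenset()           # natural 0 is the empty set {}
    for _ in range(n):
        s = s | {s}           # successor: add the set so far as an element
    return s

# The encoding of n has exactly n elements:
print(len(von_neumann(4)))  # 4
```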


GP understands the notation; he means to cast aspersions on the axiomatic project.


Fair enough, though there weren't really any "aspersions" cast. I'm aware you can unravel set theory if you care enough about that particular exercise. But, in context, you could also interpret that comment as being written by someone who isn't aware that the empty set is defined as {}.

Responding to someone's definition that natural 0 = {} that mathematics is a "castle built on sand" isn't exactly a cogent criticism of set theory.


I was surprised by no discussion of ordinality. Cardinality was something I also specifically expected.

Counts of things are described here, but the Greeks went to great lengths with geometry to discover numbers (and even killed people who came up with irrational answers).


Most of computer science deals with finite things. As such, it isn't necessary to introduce cardinal numbers or ordinal numbers; natural numbers suffice. If you do, you quickly get into territory that requires a lot of math background: well-orderings, the cardinal comparability hypothesis, transfinite recursion, etc.


I read this as "What is an even number" and spent the entire article stoked for when he got to the part about a number being even.

Still, lovely article.


Don't even numbers have a pretty easy definition? A number is even if its prime factorization contains at least one 2.


Oh sure, but then the article started talking about sheep and rocks and I'm like "Oh boy, I bet I totally don't understand what it means to be even!"


Well, you can reduce the +/- notation further by allowing yourself (after removing all +/- pairs) to remove pairs of sequential + tokens; if it reduces to an empty string, it's an even number, otherwise it's an odd number.
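A quick sketch of that reduction (counting tokens rather than literally deleting them, which gives the same answer; extending the rule to '--' pairs for negatives is my own assumption):

```python
def reduces_to_empty(tokens):
    """True iff a string of '+'/'-' tokens reduces to '' under the rules:
    first delete +/- pairs, then delete pairs of sequential identical tokens."""
    plus, minus = tokens.count('+'), tokens.count('-')
    # each +/- pair removes one of each; what's left is all the same sign
    remaining = abs(plus - minus)
    # each ++ (or --) pair removes two tokens, so it empties iff remaining is even
    return remaining % 2 == 0

print(reduces_to_empty('++++'))  # True  (represents 4)
print(reduces_to_empty('++-'))   # False (represents 1)
```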


It's extremely obvious that the author is a computer scientist


Exactly! I didn't know the author, but couldn't help but check on his background after reading the blogpost.

It's almost a dead giveaway in excerpts like 'The notion of a counter is not that of a single concrete class of object, it is an abstract description of the behaviour of a system providing certain operations satisfying certain rules.' and 'A number is not any one thing, it is any one of any number of things that implement some operations (and there are different types of number depending on what operations you want to implement).'


And the very first thing he did was define natural numbers as a grammar of operations with rules for interpretation


I mean, I guess I am? I'm technically doing a PhD in it at the moment, but my opinions on philosophy of mathematics don't have much to do with the finer details of test-case reduction.

My actual degree, which is where most of my philosophy of mathematics opinions were developed, is in very pure mathematics. I've done quite a lot of software development since and that's definitely shaped the framing, but the core philosophy is one that I've had since long before I knew much about computer science at all.


> What even is a number?

It's a category. 1 is the category of singletons, 2 is the category of pairs, etc.

As to what numbers are, they're an ordered collection of categories.

Edit: To whoever downvoted this comment, kindly explain. Insofar as I'm aware this is textbook maths, psychology, and philosophy.


I downvoted it; it's dismissive and obnoxious to just provide an intuitionless one-liner explanation: "what are numbers? they're X", as though there were no complicated ideas involved, or as though it should be obvious or trivial. The whole point of an article about this is to acknowledge and grapple with the subtleties.


If you define the natural numbers this way you'll run into Russell's paradox. For practical purposes that's fine and this is an elegant, modern restatement of Frege.

But if you need to avoid Russell's paradox the sets defining each natural n can't contain n as an element. The easiest such construction was outlined elsewhere in this thread by xamael. Under your category theoretic construction, every natural number n is defined as the category of sets having cardinality n. But then every nth category will necessarily contain infinitely many sets with cardinality n that also contain n. That causes the paradox.

Unfortunately I don't think you can construct the naturals in a category theoretic way while avoiding Russell's paradox since any category by cardinality will fall into that trap. But if you don't need to mind that problem, this is neat.

Responding to your edit: I didn't downvote you; in fact I upvoted this comment because it's correct and it was gray at the time of my writing. My comment is just a point of clarification.


I don't understand how Russell's paradox comes in. The set of all eg. pairs does not contain itself.


If you've defined the natural number 2 to be an arbitrary set with cardinality 2, you're including sets which contain the number 2. That's the basic form of Russell's paradox.

If you define the natural number 2 to be the category of pairs, your objects are the sets with cardinality 2, and your relations between objects are equivalence relations. As a consequence your category 2 will contain sets which contain 2 itself.


Why is that a problem?


I love articles like this. Thanks for sharing. Ultimately math is just a language invented to describe the world. Which is what any language does. And it's a really useful language for sharing certain kinds of intelligence. I got into a discussion with my son last night about sine waves. Which don't really exist, per se, but they are a useful language construct for thinking about waves (especially sound waves as we were discussing). Shared intelligence like this is, as some have maintained, a major differentiator between humans and pre-humans. So the very fact we can argue about what is a number is part of what it is to be human.


Sine waves do exist. They are the 2D projection of a circular spiral, which is fundamental to the properties of the physical world. This is rarely explained fully when these concepts are introduced in trigonometry (which should more properly be called "circleometry") because of the higher level maths required before introducing imaginary exponentials.
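Numerically (a quick sketch): sin t is the second coordinate of the point e^{it} tracing the unit circle; stack those circles along a t-axis and you get the spiral, project it back down and you get the sine wave.

```python
import cmath
import math

# sin(t) is the projection of the spiral point (cos t, sin t, t)
# onto its second coordinate, i.e. the imaginary part of e^{it}.
for t in (0.0, 0.7, 2.0, -1.3):
    assert abs(cmath.exp(1j * t).imag - math.sin(t)) < 1e-12
```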

The computer you are using would not work without sine waves. The nanometer-scale chips couldn't be manufactured without them, and the high-speed signals transmitted over its wiring would not reach their destination without our knowing how to manipulate them.

A deeper question is what to make of "uncountable" transcendental numbers.


It may be more than a language to describe the world. Take Euler's identity, e^(i*pi) = -1: one of the most striking equations in maths, yet it doesn't really describe anything physical in particular, and it's true regardless of the nature of the world.
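(You can also watch the identity hold numerically, up to floating-point error — a trivial sketch:)

```python
import cmath

# e^(i*pi) lands within floating-point error of -1
z = cmath.exp(1j * cmath.pi)
print(abs(z - (-1)))  # on the order of 1e-16
```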




Here's a silly definition:

A number is a sequence of 1 or more digits, optionally beginning with a '-' and optionally followed by a '.' and one or more digits, where digits are the characters {'0' '1' '2' '3' '4' '5' '6' '7' '8' '9'}.

Examples: 1, 25, 3.2, -8

"1 + 2" is not a number, but its evaluation is
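That grammar fits in a single regular expression (a sketch; anchoring with fullmatch is my choice):

```python
import re

# one or more digits, optional leading '-', optional '.' plus digits
NUMBER = re.compile(r'-?[0-9]+(?:\.[0-9]+)?')

def is_number(s):
    return NUMBER.fullmatch(s) is not None

print([is_number(s) for s in ('1', '25', '3.2', '-8', '1 + 2')])
# [True, True, True, True, False]
```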


This appears quite similar to the difficulty in explaining Monads (from the programming perspective rather than the mathematical)


I think counting is natural to humans. The correspondence-based theory the OP used is not natural: to use it, one has to start using writing to do the correspondence, or use shells/goods/coins.

That covers the positive integers.

I'm not sure how natural the real numbers are, other than pi via geometry; the irrationals also come from geometry.

Using algebra and the group-ring-field approach came very late.


The article ends with "Almost certainly not, but if so why would we care?" which I find (along with its inverse "Almost certainly, but if not why would we care") to be my answer to many philosophical questions.


This really just boils down to: Counting is an intuitive thing for us and it's useful to mathematically define it.


Can you expand? I think the author chose counting because of its intuitiveness, but they then go on to show that a very simple definition of counting can lead to some non-intuitive concepts, and that even restricting yourself to that simple definition of counting can lead to classifying something controversial (i.e. infinity) as a number.


Number is an abstraction. What is an abstraction? It is a definition for a set of things by stating what is common to all things considered to be in that set.

"1" is the set of all things which have exactly one element. It is an abstraction for all such things.

What does it mean to "have one element"? I think that's a more difficult question. Maybe it should be an axiom?


I often hear people say mathematics is the language of science. Since learning about cryptocurrency, I've come to realize that, more precisely, mathematics is the language for describing energy transformation.

So a number can be seen as a metaphysical representation of energy.



