The number line freaks me out (2016) (mathwithbaddrawings.com)
137 points by mananaysiempre 2 days ago | 139 comments





It seems to be an article about all those "harmless" lies we tell students.

The vast majority of people think mathematics is about numbers, when it is actually about relations, and numbers are just some of the entities whose relations mathematics studies.

Nobody is born with this misconception; we teach it, and test it, and thereby ingrain it in the minds of every student, most of whom will never study mathematics at a level that makes them go "wait, what?". The overwhelming majority of people never get to this level.

I suspect this is also why statistics feels so counterintuitive to so many people, including me. The Monty Hall problem is only a problem to those who are naive about probability, which is most people, because most of us don't learn any of this stuff early enough to form long lasting, correct instincts.

It's not fair to students to bake "harmless" lies into their early education, as a way to simplify the topic such that it becomes more easily teachable. We've only done this because teaching is hard, and thus expensive. Education is expensive, at every step. It's not fair or productive to build a gate around proper education that makes it available only to those who can afford it at the level where the early misconceptions get corrected. Even those people end up spending a lot of cognitive capital on all those "wait, what?" moments, when their cognitive capital would be better spent elsewhere.


> The vast majority of people think mathematics is about numbers, when it is actually about relations

It is somewhat unfortunate that mathematics is two different things, simultaneously very closely related and very different. One is the abstract study of relationships between axiomatic entities, and the other is arithmetic.

The vast majority of people out there need only arithmetic, and boy, do they really need it: calculating tax, taxi fares, shopping bills, splitting bills, etc. And to some extent, you need the abstract maths to understand arithmetic.

We have one curriculum for that vast majority of people and for the few who move on to academic maths. Simplifying ideas like integers to number lines doesn't seem like a high price to pay.


I recently taught a student who "was always bad at math" how to use it to make his marionette puppet servo-controlled. On the way we encountered a lot of scaling ranges, working with angles, trigonometric functions for motion synthesis, all kinds of complex remappings with saturation functions, random walks, linear interpolation functions, and so on.

He was absolutely stunned and asked me why mathematics wasn't taught that way all the time. Instead of a bunch of things he had to do, he came to see it as a toolbox with things you can use.

And I myself wonder why the hell my maths teachers failed at making this easier as well. I distinctly remember my math teacher who failed to answer me when I asked, after months of solving integrals, why we need those. I had to figure that out myself, pre-internet.


> when it is actually about relations

I think of it more as, math is ultimately about symbols. Like, if a mathematician says that "2 = 2" is a true statement, a reasonable onlooker might ask "Does that mean that all twos are interchangeable? Or that there's a unique concept called two and it equals itself?" And the mathematician replies, "Neither! It means that the string of symbols '2 = 2' is reducible to the symbol 'true', given certain axiomatic symbolic transformations. Nothing more, nothing less!".

And obviously we can project concepts onto the symbols, like "integer" and "real number", and talk usefully about them, but those are the map and the symbols are the terrain, as it were. At the edge cases where we're not sure what to think, we have to discard the concepts and consult the symbols.


> math is ultimately about symbols. [...] Neither! It means that the string of symbols '2 = 2' is reducible to the symbol 'true', given certain axiomatic symbolic transformations. Nothing more, nothing less!

If that were true, math would be useless, and nothing more than an esoteric artform.

The true power of math comes from the correspondence between those symbolic transformations and observation from the real world. Two objects that look alike can be placed in juxtaposition with any other (different) two objects that look alike, and no matter how much we move them around, as long as we don't add or remove any objects, they can still be placed in the same juxtaposition as before (while this description may seem verbose and clumsy, in the real world it does not need a description - it is a much more primitive sensory perception, learned at an early age).

> obviously we can project concepts onto the symbols, like "integer" and "real number", and talk usefully about them, but those are the map and the symbols are the terrain

It wouldn't be "obvious" that we can project concepts onto symbols, if we didn't discover that symbols correspond to concepts and that symbolic transformations can help us predict the future. Thus I'd say it's the other way around: symbols are the map that we know how to read - of the terrain that we can't traverse easily.


Platonist mathematicians do exist.

The vast majority of mathematicians are platonists. They think math is real.


They might not admit it if they're not drunk enough.

I'll edit out the part of my comment where I suggested they don't.

You can't, so you didn't.

Weirdly aggressive, but fair enough.

No aggression intended; I just wanted to clarify for anyone reading the thread afterwards who thought I might be responding to a different version of your comment than the one that's there now.

And then there are undecidable statements and their ilk.

> The Monty Hall problem is only a problem to those who are naive about probability, which is most people, because most of us don't learn any of this stuff early enough to form long lasting, correct instincts.

I think it’s more than that… we come with some built-in heuristics for probability, which mostly work pretty well. Until they don’t.


I would argue our built-in heuristics for probability are pretty bad, which is why the Monty Hall problem is so hard for most people to grasp (even though it is a relatively straightforward application of probability). Probabilistic thinking comes much less naturally to the human mind than deterministic thinking.

I find that the bad intuition on the Monty Hall problem is mostly due to the small delta of going from 1 in 3 doors to 1 in 2. If you change it to start with 1000 doors, I find it a lot more intuitively convincing.
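A quick Monte Carlo sketch of both strategies (a hypothetical `monty` helper, using only Python's standard random module; with 3 doors switching wins about 2/3 of the time, and with 1000 doors about 99.9%):

    import random

    def monty(switch, doors=3, trials=100_000):
        wins = 0
        for _ in range(trials):
            car = random.randrange(doors)    # door hiding the car
            pick = random.randrange(doors)   # contestant's first pick
            # The host opens every remaining door except one: if you
            # picked the car, a random goat door stays shut; otherwise
            # the car's door stays shut.
            if pick == car:
                kept = random.choice([d for d in range(doors) if d != pick])
            else:
                kept = car
            final = kept if switch else pick
            wins += final == car
        return wins / trials

    print(monty(switch=False))              # ~0.333
    print(monty(switch=True))               # ~0.667
    print(monty(switch=True, doors=1000))   # ~0.999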

Same here, although I've talked to people who were equally confused with that formulation.

> It's not fair to students to bake "harmless" lies into their early education, as a way to simplify the topic such that it becomes more easily teachable

Children's brains are not fully developed. I see no gain from telling a 6-year-old that "most numbers aren't countable". Especially because most numbers are never used or interacted with in any way, shape, or form. It's not "lying", it's separating concepts and prioritizing.


> We've only done this because teaching is hard, and thus expensive.

That's just silly. We've done that to make the math useful and possible to teach. Unless you're saying you're able to start with sets of numbers and define a ring for kids before explaining what 1+1 is.


>>when it is actually about relations, and numbers are just some of the entities whose relations mathematics studies.

Well, no. Math is the study of anything at its most atomic machinations. That mostly involves:

- Making hypotheses, that is, assumptions about starting conditions and rules of play.

- Evolving the system you just created, such that conclusions are consistent with the rules of play.

The real deal is that good math involves lots of paperwork, to the extent you could almost say math is a writing skill more than a thinking skill.

Think of it like generating a lengthy changelog.


I would say instead that math is a game. A universal game with no predefined rules at all and only one guideline: if the rules you make up lead to a contradiction, then the rules are probably boring. If your rules say that 1+1=3, then you can prove anything and the whole thing becomes uninteresting.

Mathematicians have come up with various rules (axioms) that seem to work pretty well. And they spend a great deal of time figuring out their consequences. But it may still happen that the rules have a contradiction and they need to come up with a different set.

Sometimes mathematicians add extra rules when they run into a roadblock. And part of the meta-game is to come up with the minimum set of extra rules they need to keep going. Sometimes they spend time figuring out whether some of the existing rules are needed at all.


>>I would say instead that math is a game.

Yup, and as you keep going, the levels go up too!

But the core ideas are simple:

1. Start somewhere you understand things well enough to make sense of them.

2. Make the smallest possible, atomic change to some aspect of the thing you know from step 1.

3. Test if the change sticks. If yes, repeat steps 1-3.

4. If the change doesn't stick, go back to step 1. Now either make a different change to the same thing, or make a new change to a different thing. Repeat steps 1-3.

As you can see, you write a lot. Like, really a lot. Math is just writing skills.


> The Monty Hall problem is only a problem to those who are naive about probability, which is most people, because most of us don't learn any of this stuff early enough to form long lasting, correct instincts.

I mean, maybe? Depends on what your definition of being naive about probabilities is. The Monty Hall problem has a sordid history of even very learned mathematicians specialising in probability getting it very wrong. For example, Paul Erdős got it wrong[1] (until someone walked him through it).

Now maybe you count Erdős as someone who is naive about probability. In which case I guess you are right. But that puts the bar very high then.

1: https://sites.oxy.edu/lengyel/M372/Vazsonyi2003/vazs30_1.pdf


Like Art is about making pleasing artifacts whereas it's really about making business connections and laundering money?

One thing not mentioned that still gets to me is that all the numbers we know, or will ever know, form a set of measure zero on the real number line. The lovecraftian nightmare numbers, the infinite maw of the unknowable, aren't some rare exception; they're everything.

This video parallels the article, going further. I love the tag line, "We know none of the numbers." https://www.youtube.com/watch?v=5TkIe60y2GI

Another number line mind-blown moment is that the complex plane is actually a half-plane since the distinction between i and -i is arbitrary, so any graph in the complex plane has to be symmetric about the real number line.

Are you sure? The distinction between i and -i is no more arbitrary than the distinction between 1 and -1. Example of an asymmetric graph: Im(x) = 1

Expanding the definition of the imaginary part, this says

(x - x∗) / 2i = 1

where x∗ denotes the conjugate. If you replace i with -i, the graph will be precisely the conjugate (mirror image) of the original graph.


Nope. Take any true mathematical sentence and (consistently) replace i with -i and it remains true; that is not the case for 1 and -1. Im(x) = 1 is meaningless; it would have to be Im(x) = ±1. (In fact you'll only ever see complex numbers in the form of a±bi, never a+bi alone.)

It's why you can't say e.g. -i < i; the signs on purely imaginary numbers are not an ordering.
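A quick numeric spot-check of that symmetry (a sketch; Python's built-in complex type stands in for C, and conjugation plays the role of swapping i and -i everywhere):

    import random

    # Conjugation is a field automorphism: it respects + and *, so any
    # arithmetic identity remains true after the swap of i and -i.
    for _ in range(1000):
        a = complex(random.uniform(-5, 5), random.uniform(-5, 5))
        b = complex(random.uniform(-5, 5), random.uniform(-5, 5))
        assert abs((a + b).conjugate() - (a.conjugate() + b.conjugate())) < 1e-9
        assert abs((a * b).conjugate() - (a.conjugate() * b.conjugate())) < 1e-9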


f(x) = |x + i|

Non-symmetric real-valued function on C.


And indistinguishable from f(x) = |x - i|

The choice of one as +i and the other as -i is arbitrary, which is not true with 1 and -1.


Seems pretty true of 1 and -1 as well. Map R with g(x) = -x, and f(x) = |x - 1| in your new mapping is indistinguishable from f(x) = |x + 1| in R.

In any case I’d say this is arbitrary like using + for addition and - for subtraction. It seems like you’re just talking about the symbols themselves. I’m not sure how you get to half plane from there.


1 and -1 are distinguishable: one of them equals its square, the other does not.

Sure, but I'm not sure I'm understanding the argument. I don't understand how a function like f(z) = e^z has to be symmetric about the real number line or how i and -i aren't distinguishable with something like Im(z) > 0. Is there a proof somewhere I can read?

It falls out of complex numbers satisfying the conditions of a field, though I don't know of a specific "proof" of that (you generally don't "prove" definitions). You could equally say "i is indistinguishable from 1/i" or "i's additive inverse is its multiplicative inverse"; in either case it's an arbitrary choice which of the conjugates is positive and which is negative. The key being that you cannot say "i > -i" because of that.

I interpreted your words as "the complex plane has topology of a plane where conjugates are glued together".

Ah, gotcha. I think I wasn’t understanding exactly what was being said. Thanks for the explanation.

They are conjugate elements (both roots of the minimal polynomial of C as an extension of R), so they satisfy all the same algebraic properties over R, but they are certainly distinguishable as elements of C.

For example "i" satisfies the polynomial "x-i=0" and "-i" doesn't. It's just that you can't find any such polynomial with real coefficients that differentiates them.

Of course there are lots of non-algebraic ways to distinguish them too. Or did you mean something stronger?


I meant that the distinction between i and -i is entirely arbitrary, because it is

That's not a mathematically meaningful statement. I'm just trying to understand what you mean in more detail.

Any true mathematical sentence containing i is still true if you (consistently) replace i with -i. It's why complex numbers are always in the form of a±bi, because there isn't anything that distinguishes a+bi from a-bi other than the conventional sign.

Okay, thanks, I see where you're coming from. That's elementary equivalence, which has some caveats. The truth of sentences that only refer to complex numbers and elementary operations on them is preserved by mapping i to -i. But it's not true if you start involving other structures.

Complex numbers are generally only in that form when obtained as roots of a polynomial. There are lots of applications where different signs have different interpretations. You can say it's a convention, which is true, but that's not quite the same as saying the two signs are the same thing.


There are an infinite number of numbers between each number on the number line.

That's always something fun to think about.


And those two numbers are boundaries of an interval bijective to the whole number line...

I think it’s slightly more fun to say there are an infinite number of numbers between each number on the INFINITE number line

... and the article doesn't even mention the concept of +/- infinity and the fun things you can do with that!

There is more structure within the non-computable numbers. For example, a non-computable number can still be "definable", i.e. you can describe it: say, a real number whose nth binary digit records whether the nth Turing machine halts.
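In symbols (a sketch, assuming some fixed enumeration TM_1, TM_2, ... of Turing machines):

    x = \sum_{n=1}^{\infty} \frac{h_n}{2^n},
    \qquad h_n = \begin{cases} 1 & \text{if } TM_n \text{ halts} \\ 0 & \text{otherwise} \end{cases}

The number x is perfectly well defined, but an algorithm that printed its digits would solve the halting problem.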

Since the set of all English sentences is countable, whereas there are uncountably many real numbers, it follows that there must be numbers that are NOT even definable.

Think about that.


Why would this follow? There are many more ants and yet I can use a single word to refer to them, and certainly the biology text that defines them contains fewer words than there are ants.

You can certainly refer to the set of all real numbers, but that’s not the same as referring to one particular number.

I see. It still does not make sense to me that there are necessarily undefinable numbers. That is, for any given number x there is not an English sentence y that defines the number. That sentence may need to be arbitrarily long, but so what? English has more flexibility than math in defining things, maybe that is the disconnect. It seems to me that insofar English can define any number, it can define all of them.

It feels to me like this is trying to draw an equivalence between language and mathematics yet disallowing the inherent ambiguity of language. At that point, the comparison is just silly.


You mean "NOT even definable"?

Oops. Thanks for pointing out.

> Since the set of all English sentences is countable

Is it? Where can I read a proof? I have a feeling it's an uncountable set, but I would be happy to see a proof one way or another.


A spoken English sentence is a finite string of phonemes. The set of allowable phonemes is finite. Given a finite set X, the set of all finite strings of elements of X is countable.

(The latter statement holds because for any given n, the set X_n of all strings of length n is finite. So you can count the members of X_0, then count the members of X_1, and so on, and by continuing on in that way you'll eventually count out all the finite strings over X. You never run out of numbers to assign to the next set, because at each point the set of numbers you've already assigned is finite (it's smaller in size than X_0, ..., X_n combined, for some n).

In fact, even if you allow countably infinitely many phonemes to be used, the X_n sets will still be countable, if not finite, and in that case their union is still countable: to see that, you can take enumerations of each set and put them together as columns in a matrix. Even though the matrix is infinite in both dimensions, it has finite diagonals, so you can enumerate its cells by going a diagonal at a time, like this (the numbers reflect the cells' order in the enumeration):

    1  3  6  10 15
    2  5  9  14
    4  8  13
    7  12
    11
However, if you allow sentences to be countably infinitely long, then even when you only have finitely many phonemes, the set of all sentences will be uncountable, because in that case each countably infinitely long sentence can be mapped to a real number represented as an expansion in some base, and you can apply Cantor's diagonal argument. The "just count out each X_n separately" argument doesn't work in this case because it only applies to the sentences of finite length.)
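As a concrete sketch of the "count each X_n in turn" argument (a hypothetical `all_finite_strings` helper, using nothing beyond Python's standard itertools):

    from itertools import count, product

    def all_finite_strings(alphabet):
        # Enumerate every finite string over a finite alphabet,
        # shortest first: each string receives a finite index.
        for n in count(0):                        # lengths 0, 1, 2, ...
            for chars in product(alphabet, repeat=n):
                yield ''.join(chars)

    gen = all_finite_strings('ab')
    print([next(gen) for _ in range(8)])
    # ['', 'a', 'b', 'aa', 'ab', 'ba', 'bb', 'aaa']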

The set of text files is clearly countable because it's made of binary. Do you think you can make an English sentence that can't be written into a text file?

If humanity lives forever, it will keep on inventing new words and therefore new sentences. So the question of whether or not language is finite is really the same question as whether or not the universe is.

> If humanity lives forever, it will keep on inventing new words and therefore new sentences.

That just increases the fraction of text files that count as "English". Which doesn't affect the argument.

> the question of whether or not language is finite

does not need to be answered. If English has a thousand words and never gains another one, the list of English sentences is countably infinite. If English gains 10% more words every year forever, the list of English sentences is still countably infinite.


Finite is not the same thing as countable. https://mathinsight.org/definition/countably_infinite

Eventually that will be something that is not English in any way we'd recognize, but formally it will not stop being English straight away.

See The Library of Babel, by Jorge Luis Borges:

https://archive.org/details/TheLibraryOfBabel


You are right, because you can recursively add clauses: 'Buffalo buffalo...' or 'This was my dad's dad's dad's...'. If you think you have the full set, you can always add one more.

You seem to misunderstand the concept of countable infinity

Haha, oh you're right, I was misframing the word "countable" in my head as meaning "finite".

I even provide the definition of countable infinity in my counterargument without realising it, though maybe that too is a misunderstanding.


> Simon Gregg calls noncomputable numbers “the dark matter of the number world”...

Thanks but no, we don't need more of this kind of bs naming. "dark matter" already ruined physics because it implies something mysterious and magical is going on whereas it's quite the contrary. I hate it when people dumb down beautiful abstract concepts to the point that it's not only not intuitive, it actually makes the thing less accessible to those who are not in the know.


Not just abstract concepts. We dumb down moderately complex concrete things to stupidly simple explanations and names too. It's soooo bad. We basically encourage each other to have hazy 30,000 ft view of things when the details aren't really that difficult, it just takes a little work.

In what way is dark matter not mysterious? It's a mystery, so mysterious seems like the perfect description...

mysterious ≠ magic

I didn't say it was magic

Just forget the number line after high school. The number line is needed for engineering graphs and economic data.

This helps explain the different kinds of numbers in that recent HN article about how hard it is to make an accurate calculator.


I find it hard to wrap my head around non computable numbers. How can I even “point to one” of them if I can’t express/describe it? And if I cannot communicate which number I’m referring to, does it really exist? In what way do they exist?

It is a good question. I can say that, briefly, you have to take two things for granted. 1, that the real numbers can be constructed, and 2, that the number of computable numbers is countable because the number of programs describing them is countable. Therefore there must exist uncomputable numbers (and in fact, 'almost all' real numbers are uncomputable).

I can accept the second thing, but how do you mean the first one in a way that doesn’t fall back onto the second one? What is the way to “construct” reals that isn’t a program?

I mean construction in the mathematical sense. If you believe, for example, that the rational numbers exist, then it is easy to construct the real numbers. Very roughly, we look at the set of all sequences of rational numbers that converge in a specific way (technically, all Cauchy sequences) and call this the "set of real numbers" (technically, after taking an appropriate equivalence class, since intuitively multiple sequences can converge to the same real number). [1] has a few other constructions. This is very different from a program, which has its own definition.

[1] https://en.wikipedia.org/wiki/Construction_of_the_real_numbe...
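To make the Cauchy-sequence picture concrete, here is a minimal sketch (assuming only Python's fractions module): Newton's iteration produces a sequence of exact rationals that converges to sqrt(2), even though the limit itself is irrational.

    from fractions import Fraction

    x = Fraction(1)
    for _ in range(5):
        x = (x + 2 / x) / 2    # Newton's step for x^2 = 2
        print(x, float(x))
    # 3/2, 17/12, 577/408, ... every term is rational, the limit is not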


So, if I understand correctly, you are saying that many (actually most) Cauchy sequences converge to uncomputable numbers. I think I understand a bit more, though I still have an issue with defining such a Cauchy sequence, because either I can describe it exactly (in which case the number it converges to can have an infinite expansion but it's computable, like pi) or I cannot define it exactly, in which case I don't know what number it converges to. I think my gap is that I don't understand how these constructions differ from programs. By program I mean a sequence of instructions, which may have an infinite number of steps because of loops, but it still has a finite description. I agree that all such programs are countable. Do you mean programs with an infinite description?

Even programs with infinite descriptions are only countably infinite, but the reals are uncountable.

This construction by Cauchy sequences looks innocent, but it's the diagonalization argument in disguise. (Start with all the rationals, make sequences out of them, make one that picks one from all, sort them into equivalence classes, and try to map them back to the rationals, notice that you will end up with more equivalence classes.)

The trick is basically that between every rational you can fit an infinite number of irrationals (using the rationals via these sequences). And exactly in this way these are "programs" -- like diagonalization itself. The fact that we can't give programs for most of them is because they are non-computable. (And it's the definition, the indirect proof is above via the cardinalities.)

[but it's dangerously late here, so double check my ramblings ... https://math.stackexchange.com/a/1488502 ]


Wait, the number of programs is countable? Are we saying that programs must be of finite length? (because if not a diagonal approach would prove them to be uncountable)

“Countable” as used in mathematics does not necessarily imply finite. The integers are “countably infinite”, and so is anything you can put in a 1:1 correspondence with integers.

But as I said, if programs are of infinite length, then a diagonalization argument proves the computable numbers not to be countable I think.

Ah yes, of course you're right. I think it does make sense to assume the programs are finite, I don't think numbers described by an infinite program should be considered computable.

Yup! Programs are assumed to have finite length, in the sense that the program must have a finite description. Of course, it may use recursion or include a loop that runs forever, for example.

> In what way do they exist?

In a way that makes the real number line continuous. Those numbers have to be there if we want the set to have properties useful for practical applications like algebra.


As far as I understand, if you look at a number between 0 and 1 with a truly random infinite decimal expansion:

    0.22134967842153005356...
then there is absolutely no pattern in the digits, so a program that wants to compute it can do no better than storing all the digits. But then the program would have infinite size.

Not all non-computable numbers are indescribable. Chaitin's constant[1] is non-computable but can be described.

[1] https://en.m.wikipedia.org/wiki/Chaitin%27s_constant


I can write a program that will visit every number between 0 and 1, but it will take infinitely long to run and use infinite memory.

But such a program can't visit every real number in the interval, because there are uncountably many, but the program will only run for countably infinitely many steps.

You're right, I actually hadn't grasped this, I realised later on but it was too late to edit my comment. And to be honest I still don't completely get it.

Since you can count the cycles your computer takes, at any point in time it's outputting a countable digit in a list of numbers that is also countable. Accelerating it to infinity lets you finish both of those tasks, but it doesn't break you into another realm.

If you get a nondeterministic computer, where at every digit it splits into 10 identical computers that each pick one of the 10 options, then when you run that for countably infinitely many cycles you'll find that you have uncountably infinitely many computers, and you have finally calculated every real number.

The cardinality of the real numbers is 2 to the power of the cardinality of the natural numbers.


You can't point to one of them, but you can point to infinite sets of them.

It's because it's recursive in nature: every segment contains a new line, and every segment of that line again contains an infinite set of sub-segments.

Infinity in a box, right in front of your numeric microscope.

Which is why dividing by zero is exactly the same operation. You take something finite and you unpack the boxes, in parallel. Every time the operator hits something finite, it unpacks a new set of parallel boxes. The sum of all the boxes is an infinity with a signature.

And those parallel-running, overlapping infinite sequences form the irrational numbers.


Nothing so spooky deserves to be called a "Real" number.

I'm probably not very clever - but I don't get why calling the gap "fractions" is problematic.

The example the author gives of "fractions" is... rational numbers, and then he proceeds to say "what about irrational numbers". But in my mind (and this is probably where I'm wrong?) an irrational number is still a fraction of a whole number, just one we cannot express "properly" (yet).


"x is rational" means there are two integers p,q such that x=p/q. So for example, 2/3 is rational (p=2, q=3), but the square root of 2 is not rational (there is no such fraction). The last part is not very obvious (it greatly distressed the Pythagoreans when they figured it out) but there are a bunch of proofs in Wikipedia:

https://en.wikipedia.org/wiki/Square_root_of_2#Proofs_of_irr...
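In brief, the standard proof sketch runs:

    \sqrt{2} = \frac{p}{q} \text{ in lowest terms} \implies p^2 = 2q^2
    \implies p \text{ even, say } p = 2r \implies 2r^2 = q^2
    \implies q \text{ even,}

contradicting "lowest terms", so no such fraction exists.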


A fraction is some integer over another integer. Those are not fractions, though they may be in a colloquial sense.

Well, you're wrong. An irrational number can't be described by a fraction. That's the very definition.

They mean "fraction" as in "part", the same way that an arm is a fraction of a whole body. But it's more about the words we use in everyday language than about mathematical definitions.

Also, I think I remember that the definition of a rational number implies fractions of integers. Otherwise I could write π as π/1 and give you a rational representation of π.


From what I remember, "things you can mathematically define" splits into "things for which you can construct a Turing machine which computes them" and things for which you cannot.

If we define "x is computable" as "there exists a Turing machine T(x) which takes n as input and produces the nth digit of x", then there are numbers which are definable but not computable.


Even “e” and “pi” would have been noncomputable at one point in time.

But the noncomputable numbers make me wonder if our notion of mathematics is too general/powerful.


We call a real number computable if there is an algorithm that can compute it to arbitrarily high precision. So e and pi have always been computable.
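A sketch of what such an algorithm can look like for e (a hypothetical `approx_e` helper, using exact arithmetic from Python's fractions module): sum the series 1/0! + 1/1! + ... until the tail is provably below the requested tolerance.

    from fractions import Fraction

    def approx_e(n):
        # Sum 1/k! until the next term, which bounds the remaining
        # tail to within a factor of 2, drops below 10^-(n+1).
        target = Fraction(1, 10 ** (n + 1))
        total, term, k = Fraction(0), Fraction(1), 0
        while term >= target:
            total += term
            k += 1
            term /= k
        return total             # within 2 * 10^-(n+1) of e

    print(float(approx_e(12)))   # 2.718281828459045...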

I think in this case it's Reals that are too general. What is the virtue of these uncomputable numbers? If we can't compute/express them then what can we do with them?

Was pi ever really uncomputable? You can draw a really big circle and measure it in multiple ways.

And when e was defined as a symbol, it was with a computation, (1 + 1/n)^n


Math is the study of futility. Futility to calculate, to understand, to define, to rationalize. 1 is the only number. Everything else is a name.

> 1 is the only number. Everything else is a name.

http://scihi.org/leopold-kronecker/


I’d paraphrase. I’d say that zero and one are the only objects. Everything else is a name.

So in a sense math is exploration of the relation between existence and nonexistence.


You can't show me zero of something, which is why you can't divide by it. Zero is a placeholder for what we can't show, which is also why negative numbers exist on the opposite side of it. Zero isn't a number; it too is a name.

> You can't show me zero of something which is why you can't divide by it.

I've shown you zero fish. The number of fish I've shown you is zero. If I tracked you down, brought a fish with me and showed it to you, I'd have shown you 1 fish, but I haven't.


I did mention objects, not numbers. Maybe “symbols” representing concepts. I’d go with “name” for both zero and one — nonexistence and existence.

The underlying problem is that infinity doesn't exist. It's a convenient illusion to make special cases go away. It's possible to have entirely constructive mathematics. In a true constructive model, everything can be constructed in a finite number of steps. There are only integers, no reals.

Well sure, there is a constructive subset of the topic we call "mathematics", just like there is a subset that admits only numbers less than or equal to 5. But if you want the whole topic of mathematics, it's going to include nonconstructive theorems like the Banach-Tarski paradox. One could take a philosophical view for or against the idea that the non-measurable set in the paradox platonically "exists", but either way, those theorems are a legitimate part of mathematics. At best you can say the theorems are about mythological entities rather than "real" ones.

I like to think of continuity as always having more "resolution" available if a sharper "picture" is required. It is _weird_ though.

Constructive mathematics can handle rational numbers. Rational numbers are continuous, in the sense of being infinitely subdividable. That is, between N/M and (N+1)/M lies (2N+1)/(2M).

Integers are already infinite and you need infinity to express things like convergent sequences. Real numbers are a whole new level of wtf altogether

See https://math.stackexchange.com/questions/4216831/what-does-c... for some discussion on convergence using constructive mathematics.

It's the definition of "exists" that is contested. To me, the infinity of anything is a generator (a program that continues to print digits). Does some infinite number exist "now"? Not really (it's being printed). But is the number being printed larger than any given integer? Yes. So, does infinity exist? ¯\_(ツ)_/¯

"The Emperor's New Mind" is a great book on this and related topics.


You can represent pi on a number line but it is absolutely completely impossible to randomly put a dot down on a number line and have it be pi. You can achieve endless measurable precision with decimal rational numbers. So randomly placing a dot on a line will always be a rational number.

Right???


Not necessarily. If you assume an ideal ruler and compass, it's pretty easy to construct the square root of any number, which means that you can easily put a dot on that number line which is provably not a rational number. There is the question of whether continuity actually exists (are time and space quantized like matter and energy, or are they continuous? this is currently unknown) and the fact that your paper and the line on it are composed of discrete molecules. But if the real numbers are, in fact, real, then the probability that your dot is at a rational point is actually 0, since while the number of rationals in [0,1] is infinite, it's only countably infinite, and the number of irrationals in [0,1] is uncountably infinite, meaning |ℚ|/|ℝ\ℚ|=0.

Actually, the opposite is true. The rationals have measure zero, so a number selected uniformly at random from 0 to 1 (formalized way of placing a dot on a number line), has a 0% chance of being rational.
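The standard argument, sketched: enumerate the rationals as q_1, q_2, ... and cover q_n with an interval of length ε/2^n. The total length of the cover is then at most

    \sum_{n=1}^{\infty} \frac{\varepsilon}{2^n} = \varepsilon

for every ε > 0, so the rationals have measure zero.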

You mean in the real world or in math?

In the real world, correct, assuming the "dot" and "number line" consist of real-world materials.

In math, you need to define "randomly placing a dot" first, because it's proven there isn't a uniform distribution over real numbers ("pick randomly" is usually a colloquial way to say "pick from a uniform distribution.")


> it is absolutely completely impossible to randomly put a dot down on a number line and have it be pi

It's not impossible, it just has zero probability of occurring.


I wonder if this was the inspiration for this numberphile video? https://www.youtube.com/watch?v=5TkIe60y2GI

The video does go further than the article.


When you say "most" of the numbers are non-computable, the word "most" is meaningless in this context. There are infinitely many of each kind of number you have listed there. You can't compare one infinity to another and say that one kind of infinity is bigger than the other. The concept of comparison (smaller/bigger) doesn't exist outside of finite numbers. Cantor was just what people thought he was - a crank who did not consider the bounds of logical comparison.

And as for the line itself, the line is not made up of numbers. The line is made up of continuity, while numbers are cuts in that continuum. An infinite number of cuts does not make up a continuous piece. Mathematical continuity (or extent or measure or span) is the essence of the imaginary spatial existence. It is not composed of cuts. A cut is a non-existence, the complete opposite of existence.


This is not the case. A set A can be said to be larger than a set B if there exists an injection from B to A, but not from A to B. This is a well-defined extension of the concept of size from finite numbers, and preserves all the properties you might expect (e.g. transitivity).

Probabilistically it’s true. Pick a number at random from the uniform distribution of values between zero and one. With nearly total certainty, this value cannot be represented except by enumerating every single one of its infinite digits.

What does it mean to pick a number at random between zero and one? Does "picking a number at random" even make sense for an infinitely large set where you can't describe most items? Isn't "picking" the act of describing an item? (This might sound like a stupid question but I'm sure there is a mathematical definition of "picking")

If I pick a number at random using some method for picking that requires me to identify what I picked then 100% of the time I'll get a number I can identify, such as the number that is the solution to x^2=2, or the ratio between a square and a circle, or the quotient of 3 and 7. All those numbers I can't describe will never be picked.

I can do infinitely many coin flips and say the number I picked has the decimals described by that binary sequence. But I'd never be done picking...


Do you know measure theory? It gives you a formal definition of "almost all" or "almost surely" based on subsets which have the same measure as the full set they're in. Like the irrational numbers between 0 and 1.

The concept of comparison exists for infinite numbers. That is not controversial; it's more like 101 mathematical analysis at university.

If you say one kind of infinite is smaller than the other kind, then the first kind no longer qualifies to be called infinite, as it is smaller than something else. So first you need to define what an infinite is.

Also infinite is not a number. And comparison exists only for numbers.


The finite numbers extend easily to "cardinal numbers", which may be infinite: https://en.m.wikipedia.org/wiki/Cardinal_number

This is not right.

There are infinitely many integers.

There are infinitely many real numbers between each pair of integers.

Thus there are more real numbers than integers.


Unfortunately this line of argument doesn't quite work either. You could replace "real numbers" by "rational numbers" and it would still be true except for the last line. The size of the integers is the same as the size of the rationals. You have to think in terms of injective functions.

Searching for meaning is all good, but sometimes it's just the umbrella man - https://www.youtube.com/watch?v=yznRGS9f-jI

Recent and related is the discussion of "Dedekind's subtle knife": https://news.ycombinator.com/item?id=43084200

> It should be a timidating

Total nitpick, but I think the "in" in "intimidating" means "into a state of being timid" and not "in-" in the sense of the opposite of "timidating".


Maybe intended as wordplay with scrutable, which is a word?

It's a perfectly cromulent word

Is a cromulent word a cromule?


