> The resulting system is unsound (in the sense of Tarski) because it asserts the existence of natural numbers that have no "written form" (i.e. the existence of natural numbers that are larger than any term you can write to denote a natural number).
Isn't that basically the definition of the natural numbers, i.e. if you write down any natural number (say n) I can always construct a natural number that is larger than it (like n+1)?
This surprisingly doesn't mean repeatedly adding 1 will exhaust all natural numbers -- there are models for the natural numbers with elements that can't be reached this way!
The ultrafilter construction gives one such model. You take the set of all sequences of natural numbers (n1, n2, n3, ...) then use an ultrafilter to decide which of these sequences are considered to be equal. The usual operations of natural numbers are defined term-by-term. You can think of the usual natural numbers as being the sequences (0,0,0,...), (1,1,1,...), (2,2,2,...) and so on. However, there are many additional numbers in this system, like (0,1,2,...) that are strictly greater than all the usual numbers, and these cannot be reached by repeatedly adding one. (The reason (0,1,2,...) is greater than (n,n,n,...) is that if we do the comparison term-by-term we get (false,false,...,false,true,true,true,...), and the ultrafilter will decide the comparison is true because it is true for all but finitely many terms. Ultrafilters are devices to consistently turn infinite sequences of trues and falses into a single decision, but every ultrafilter will make a true decision in the case there are only finitely many falses.)
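If it helps to see the moving parts, here's a toy Python sketch of that construction (my own illustration, not anything rigorous). One caveat: a genuine ultrafilter can't be computed -- its existence uses the axiom of choice -- so the "decision" below only handles boolean sequences that settle down before a finite horizon, which is enough for these examples.

    # Hypernaturals modeled as functions index -> natural number;
    # operations are defined term-by-term. 'decide' is a stand-in
    # for the ultrafilter: it applies the easy cofinite rule, checked
    # only up to a finite horizon.

    def decide(pred, horizon=1000):
        tail = [pred(i) for i in range(horizon, 2 * horizon)]
        if all(tail):
            return True      # true on a cofinite set (as far as we looked)
        if not any(tail):
            return False     # false on a cofinite set (as far as we looked)
        raise ValueError("sequence hasn't settled; a real ultrafilter "
                         "would still decide it, just not computably")

    def const(n):            # the embedded copy of an ordinary natural
        return lambda i: n

    omega = lambda i: i      # the nonstandard element (0, 1, 2, ...)

    def add(a, b):           # addition, term-by-term
        return lambda i: a(i) + b(i)

    def less(a, b):          # comparison, decided by the 'ultrafilter'
        return decide(lambda i: a(i) < b(i))

    print(less(const(5), omega))               # True: 5 < (0,1,2,...)
    print(less(omega, add(omega, const(1))))   # True: omega < omega + 1

Every ordinary natural number embeds as const(n), and omega sits above all of them under this order.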
Counter-intuitively, proofs by induction still work for this system... speaking with no authority here, maybe an intuition is that it's doing a hypercomputation.
> This surprisingly doesn't mean repeatedly adding 1 will exhaust all natural numbers -- there are models for the natural numbers with elements that can't be reached this way!
Isn't that just true for models of a first-order axiomatisation of the natural numbers? As far as I know, if you write down the Peano axioms in second order logic (as is required), you do get the natural numbers uniquely, and applying the successor function over and over again does exhaust all the natural numbers eventually.
The reason why you get non-standard models of arithmetic in first-order logic is the Löwenheim-Skolem theorem, but that isn't available for second-order logic.
Yeah, my understanding is that second-order arithmetic picks out the usual natural numbers, but it is a much stronger system since you can additionally quantify over subsets of naturals in your formulas.
The context of the comment is ZFC and its concept of the natural numbers (in particular, what I had in mind was: what is its minimal infinite inductive set?). ZFC is a first-order theory, so "model" means model of a first-order theory.
> The reason why you get non-standard models of arithmetic in first-order logic is the Löwenheim-Skolem theorem
The ultrafilter construction I described is a way around the compactness theorem, in that it constructs new models semantically rather than syntactically. Ultrafilters are also a good way to prove the compactness theorem. But in any case, it's essentially the first half of the upwards Löwenheim-Skolem theorem before applying the downwards Löwenheim-Skolem theorem -- I think the significance of that theorem is that you can get models of any infinite cardinality you want, rather than just a strictly larger one. (Though I'm no logician.)
>The ultrafilter construction gives one such model.
It's worth noting that this construction relies on the axiom of choice, so it exists in ZFC but not in ZF. Generally, a lot of counter-intuitive constructions disappear when you eliminate the axiom of choice (such as the Banach–Tarski paradox and non-continuous functions).
I think the step function doesn't depend on choice. Say f(x) = 0 if x < 0 and f(x) = 1 if x >= 0. Also, suppose we define the reals through Dedekind cuts. We can define f(x) by checking whether 0 is an element of x as a cut. We can prove f is discontinuous in the usual way.
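For concreteness, here's a toy version of that definition (my own sketch, modeling a cut as a membership predicate on the rationals):

    # A Dedekind cut for x, modeled as the predicate q -> (q < x).
    # The step function just asks whether 0 is in the cut; no choice
    # is involved. (With this convention the jump lands so that
    # f(x) = 1 exactly when x > 0 -- still discontinuous at 0.)

    from fractions import Fraction

    def cut_of(x):
        return lambda q: q < x

    def step(cut):
        return 1 if cut(Fraction(0)) else 0

    print(step(cut_of(Fraction(-1, 2))))   # 0
    print(step(cut_of(Fraction(1, 2))))    # 1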
What I had heard is that Brouwer proved, using some kind of constructive/intuitionistic logic, that every function is continuous, but ZF is still based on classical logic. I believe the step function is not definable constructively.
Ok, but I'm not sure what your point is. Throwing out only the axiom of choice (which is the difference between ZFC and ZF) does not make things constructive. If you're thinking of Diaconescu's theorem, it's that choice implies the law of the excluded middle, but the converse is certainly not true.
I was objecting to this:
> Generally a lot of counter-intuitive constructions disappear when eliminating the axiom of choice (such as [...] non-continuous functions).
Can you explain what use these have? If natural numbers are (n, n, n, ...) then you have just made a new type of number not comparable to natural numbers.
A theoretical use is that it shows that the axioms of the natural numbers (the Peano axioms) aren't enough to pin down what we think the natural numbers should be -- there are these crazy non-standard natural numbers out there!
In the context of the discussion, this non-standard model of the natural numbers gives some intuition about what happens if the consistency of ZFC is independent of ZFC and you add in the axiom that ZFC is inconsistent: your natural numbers will have to be something similarly weird.
> you have just made a new type of number not comparable to natural numbers
But they are comparable. The everyday natural numbers are embedded in this new system, which is what these (n,n,n,...) elements represent. One way to interpret what we've done here is to add in infinities to the number system so that all the normal operations still work, and there's even a total ordering on them!
Using the real numbers instead of the naturals, you get the so-called nonstandard real numbers, which is the number system used in nonstandard analysis. Nonstandard analysis is a way to do analysis without epsilon-delta proofs. Nonstandard reals include infinitesimals and infinities, similar to nonstandard naturals having infinities. There are even textbooks on the subject -- I haven't read them, but they claim it makes analysis easier.
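To give one concrete picture (my own illustration, reusing the term-by-term comparison from upthread): the sequence (1, 1/2, 1/3, ...) acts as a positive infinitesimal, because for any fixed real r > 0 the comparison 1/n < r holds for all but finitely many n, while every term stays above 0.

    # eps = (1, 1/2, 1/3, ...) as a candidate infinitesimal.
    # For any fixed r > 0, eps(n) < r for all but finitely many n,
    # so the ultrafilter rules eps < r; but eps(n) > 0 for every n,
    # so 0 < eps. We spot-check the tail of each comparison.

    from fractions import Fraction

    def eps(n):
        return Fraction(1, n + 1)

    r = Fraction(1, 1000)   # any fixed positive real would do

    tail = range(10_000, 10_100)
    print(all(eps(n) < r for n in tail))   # True: eventually below r
    print(all(eps(n) > 0 for n in tail))   # True: always above 0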
The last thing that comes to mind is that nonstandard numbers of this exact type show up when you study certain infinite-dimensional number systems and calculate the points (like in the end of my comment).
Dropping down to Peano Arithmetic for a moment, we can consider adding a new constant 'c' for a natural number to the language, along with the following infinite list of axioms about this remarkable constant:
- 0 < c
- 1 < c
- 2 < c
- 3 < c
...
Adding all these axioms is consistent, i.e. you can do induction up to 'c', whatever it is. Why is it consistent? Because if there were a contradiction, the proof of that contradiction would be finite, and hence could only use finitely many of these new axioms (this is a so-called compactness argument). But clearly any finite subset of this list of axioms is consistent, because it has a model where c is just defined to be 1 more than the largest numeral appearing in that list.
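To make the finite-subset step concrete, here's the witness spelled out (my own sketch):

    # Any finite subset of the axioms {k < c : k a numeral} has a
    # model inside the ordinary naturals: just interpret c as one
    # more than the largest numeral mentioned.

    def witness(numerals):
        """Given the finite set of k's appearing in axioms 'k < c',
        return a standard value of c satisfying all of them."""
        return max(numerals) + 1

    finite_subset = {0, 1, 2, 3, 57}
    c = witness(finite_subset)
    assert all(k < c for k in finite_subset)
    print(c)   # 58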
But all of those infinitely many axioms taken together create an unsound system, because they claim that 'c' denotes a natural number larger than every written numeral.
Heading back to ZFC land, it turns out that (assuming ¬Con(ZFC) is independent of ZFC) adding ¬Con(ZFC) to ZFC is similarly unsound, in that it yields only models that have elements larger than every written numeral.
This sounds a little bit like an argument against systems with an infinite number of axioms to me. Or maybe it's fine but only under certain conditions?
You can find a natural number that is bigger than any one natural number, but you can't write one down that's bigger than every natural number in the ZFC sense.
Do you mean that given finite resources (like time and matter) we would be unable to express the number? Or are you talking about a sort of recursive thing where writing down a number which is a sum of all natural numbers up to and including itself is impossible since that number is bigger each time you look at it?
> that's bigger than every natural number in the ZFC sense.
I'm not sure what you mean by this.
Even for nonstandard models of the natural numbers there will never be a single number larger than all other numbers, since that would violate the Peano axioms.
By "in the ZFC sense", did you mean "in the standard model"?
For this entire argument I will be operating under the assumption that ¬Con(ZFC) is independent of ZFC.
ZFC+¬Con(ZFC) proves "there EXISTS a natural number x such that x is an encoding of a proof in ZFC of 0=1".
Now suppose there is actually a numeral n, i.e. a specific written term built from symbols such as (+,*,1), such that
ZFC+¬Con(ZFC) proves that "n is an encoding of a proof in ZFC of 0=1". We can mechanically decode such an n and check whether it is indeed a proof in ZFC of "0=1" or not.
There are two possibilities: (a) the check is successful, and thus we have found a proof of a contradiction in ZFC. But if ZFC is inconsistent, then it proves everything. In particular, ZFC proves ¬Con(ZFC), which violates our assumption that ¬Con(ZFC) is independent of ZFC.
Alternatively, (b) our check fails and n does not encode a proof in ZFC of 0=1. But the statement "n does not encode a proof in ZFC of 0=1" is a true Delta_0 statement, and we can prove by induction that ZFC proves every true Delta_0 statement. Thus ZFC proves "n does not encode a proof in ZFC of 0=1", and hence ZFC+¬Con(ZFC) proves "n does not encode a proof in ZFC of 0=1". But now we have found a statement Q such that ZFC+¬Con(ZFC) proves both Q and ¬Q. This means that ZFC+¬Con(ZFC) is inconsistent. But if ZFC+¬Con(ZFC) is inconsistent, then by the deduction theorem ZFC proves ¬¬Con(ZFC), or equivalently ZFC proves Con(ZFC). This contradicts our assumption that ¬Con(ZFC) is independent of ZFC (it also implies that ZFC is inconsistent).
Thus we are left with the conclusion that there is no such numeral n. However, it is still the case that ZFC+¬Con(ZFC) proves "there EXISTS a natural number x such that x is an encoding of a proof in ZFC of 0=1", and the only way this can be the case is if any such x is a "natural number" that no numeral denotes.
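Schematically, the case analysis looks like this (everything below is hypothetical scaffolding: decode and checks_out stand in for a Gödel-numbering decoder and a mechanical ZFC proof checker, which exist in principle but aren't implemented here):

    # Hypothetical scaffolding for the case analysis above. 'decode'
    # and 'checks_out' are stand-ins for a Godel-numbering decoder
    # and a mechanical ZFC proof checker; both are computable in
    # principle, neither is implemented here.

    def decode(n):
        """Hypothetical: turn the numeral n into a candidate proof."""
        raise NotImplementedError

    def checks_out(candidate):
        """Hypothetical: verify a candidate ZFC proof of 0=1."""
        raise NotImplementedError

    def case_analysis(n):
        if checks_out(decode(n)):
            # (a) we hold an actual ZFC proof of 0=1: ZFC is
            # inconsistent, hence proves everything, including
            # not-Con(ZFC) -- contradicting independence.
            return "case (a): independence violated"
        else:
            # (b) "n does not encode such a proof" is a true Delta_0
            # statement, so ZFC proves it; ZFC + not-Con(ZFC) then
            # proves a statement and its negation, so by the
            # deduction theorem ZFC proves Con(ZFC) -- again
            # contradicting independence.
            return "case (b): independence violated"

Either branch contradicts the independence assumption, which is why no such numeral n can exist.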
On any storage system you will eventually run out of memory, so there will always be a maximum integer that you can write down with what you have, and we can only write down a vanishingly small subset of the natural numbers.
Of course, this isn’t what people mean when they talk about “writing down” a number, but I’ve long thought that assuming the existence of natural numbers that you can’t write down is poetically apt.
In some sense, the set of natural numbers is “too big” to be about everyday finite numbers alone and it seems fitting to acknowledge that our usual axioms can’t exclude weirdo large finite numbers that are simply too big to ever reach.