
For those interested in constructive and intuitionistic approaches here Dummett's [0] Elements of Intuitionism is an extremely good read.

Intuitionism is a constructive foundation for mathematics which notes that (a) any attempt to deny the uncountability of the reals leads to difficulties and (b) any attempt to define them internally violates constructivity.

The resolution proposed is to posit the existence of "free choice sequences". These are essentially "unpredictable" sequences of potentially infinite binary choices. As there is no a priori reason to believe they can be predicted, they can be much larger than what is computable and thus can be used to give a characterization of the reals. Atop this you build a constructive understanding of free choice reals which behaves very nicely (at least foundationally... it points out all kinds of weirdnesses about what we assume classically to be the structure of the reals).

What's very nice about this solution is that it sidesteps the difficulty. Free choice is a weaker thing to ask for than finite/constructive reals, but finite/constructive reals could be transparently encoded into free choice sequences and all the math would just work.
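(Not from Dummett, just a rough illustration of the "lawlike" half of this: one standard way to present a constructive real in code is as a rule that yields arbitrarily good rational approximations. A free choice sequence exposes the same interface, except there is no rule behind it -- you only ever learn finitely many of its values. A Python sketch, with names of my own choosing:)

    from fractions import Fraction

    # Sketch: a "lawlike" real is a rule taking n to a rational within 2**-n.
    def sqrt2(n):
        # Bisection gives an approximation of sqrt(2) to within 2**-n.
        lo, hi = Fraction(1), Fraction(2)
        while hi - lo > Fraction(1, 2**n):
            mid = (lo + hi) / 2
            if mid * mid <= 2:
                lo = mid
            else:
                hi = mid
        return lo

    # Arithmetic asks each argument for a little more precision.
    def add(x, y):
        return lambda n: x(n + 1) + y(n + 1)

    print(float(sqrt2(20)))              # ~1.4142135
    print(float(add(sqrt2, sqrt2)(20)))  # ~2.8284271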

[0] https://www.amazon.com/Elements-Intuitionism-Oxford-Logic-Gu...




It sidesteps the difficulty by making the math itself (at least for the time being) much more difficult, and that is the reason it was rejected by most mathematicians in the Hilbert/Brouwer debates. Because here's the question: suppose you can't philosophically justify the "existence" of the real numbers, yet they coincide perfectly with observation and result in math that is much simpler than constructive math. Should you reject them? Brouwer -- who was the first to recognize just how much ordinary math relies on non-constructive principles -- said yes, because non-constructive math is philosophically wrong, period. Hilbert -- who was a finitist -- instead suggested that the propositions of math come in two flavors: real propositions, those that are finitary and can be taken to say something about physical reality, and ideal propositions, which are not. He said that as long as the ideal propositions are consistent with the real ones, they should not be rejected on a priori philosophical grounds even if no finitary meaning can be assigned to them. I.e. they are philosophically justified after the fact by virtue of their consistency with the real propositions. This philosophical classification of mathematics into "real" and "ideal" is called formalism, because it does not require that the ideal propositions be assigned a finitary meaning beyond their formal statement (as a finite string of characters).

Of course, most mathematicians are not finitist, so they require neither intuitionism nor formalism -- both essentially finitist philosophies -- and are Platonists, believing that even ideal objects that are beyond physical reality and computation have a "real existence" in some Platonic sense.

BTW, I think that after Turing (who used Brouwer's choice sequences in his construction of computable numbers) it is no longer necessary to rely on "free choice" (or lawless) sequences because of the halting theorem, and both lawlike and lawless sequences can be unified, and Brouwer's "creating subject" identified with a Turing machine. But I'm not sure about that. Turing -- who was a mathematical philosopher himself -- rejected any dogmatic a priori philosophy of mathematics, except for common sense, as the one true foundation, and suggested that the value of a formal system be derived not from its a priori philosophy but from its ad hoc utility.


Wonderful to know that Hilbert was a finitist, and that https://en.wikipedia.org/wiki/Finitism is an official camp.

Without knowing too much about the subject, I've vaguely wondered about this idea for a long time now, but I figured it was likely an unrespectable position. I think it's too bad that beginners are often shielded from controversies in foundations.

Within the past few years I ran across an alternate approach to calculus which, if I recall correctly, achieves the same basic results but without the same notion of infinitely small slices and so on... now I can't find it to link.


Yep, and as you can see in that Wikipedia article, both Brouwer and Hilbert were finitists. It is not that it is an unrespectable opinion so much as that mathematicians these days -- from what little I know -- are not generally expected to hold any dogmatic opinion on philosophical foundations, but maybe that attitude will shift again.

> I think it's too bad that beginners are often shielded from controversies in foundations.

Mathematicians generally shouldn't worry about foundations, as it is the intention of foundations to stay hidden -- except in cases where the foundation requires a rewrite of much of math, as in the case of Brouwer's intuitionism, but constructive math is pretty advanced anyway. Foundations are usually the concern of logicians, but from what little I've seen in online discussions, it seems like many logicians aren't interested in philosophy, either, and that's a real shame.

> I ran across an alternate approach to calculus, which if I recall correctly, achieves the same basic results, but without the same notion of infinitely small slices and so on

There's constructive analysis[1], which recreates analysis within the framework of constructive math (by finite means etc.), and there's also non-standard calculus[2] (which I know nothing about), which treats infinitesimals as actual numbers; that may be the opposite of what you meant, but maybe not -- I saw something about there being constructive versions of non-standard analysis, too.

[1]: https://en.wikipedia.org/wiki/Constructive_analysis

[2]: https://en.wikipedia.org/wiki/Non-standard_calculus


I think you're right on the money bringing Turing into this.

I don't have a real, formal horse in this race, but the way that I have things arranged in my mind is that these are all disagreements about two things in logical foundations: the need to consider time/resources and the conception of logical systems as closed or open.

Brouwer and Hilbert accept that time is a factor, but handle it in different implicit ways. Hilbert differentiates between finite (time accessible) and infinite (not finite) and then makes a plea that this divide doesn't much matter. His Program sought to prove it and then Turing and Godel and the like shattered that dream. Brouwer philosophically treated proof as communication and thus Intuitionism handles time and resource naturally (though not explicitly like a Kripke model).

Brouwer was forced by this choice to leave his mathematical foundations open ended as well. At any given level of effort there is a finite set of things we can prove/talk about/know and the rest must be left unknown, but likewise at any point in time we can incrementally increase our knowledge. Intuitionism is complex in order to handle that openness.

Bringing Turing in makes openness and expense/time immediately obvious. Turing machines are a model for an open world of knowledge: once you have a partial halting oracle G I can build a new program which your oracle fails on. Godel took it to the next level: your own formal system cannot prove its own consistency, but using it I can create a larger universe where it can be proven consistent.

In each case, the result arises from wanting a formal closure of your foundation and both use self-reference to create a (co-)inductive means of incrementally enlarging your system forever.
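(For concreteness, the standard diagonalization behind that halting point, as a Python sketch -- `halts` here is a hypothetical oracle, which is exactly the thing the argument shows cannot exist as a total program:)

    def halts(program, argument):
        # Hypothetical partial halting oracle; the point is that no
        # total implementation with this signature can exist.
        raise NotImplementedError

    def diagonal(program):
        # Do the opposite of whatever the oracle predicts about a
        # program run on its own source.
        if halts(program, program):
            while True:
                pass   # loop forever
        else:
            return     # halt immediately

    # Whatever halts(diagonal, diagonal) answers, diagonal(diagonal)
    # does the opposite, so the oracle fails on it.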

So I think Brouwer beats them to the punch in a certain way by taking that continuous finitary expansion as foundational itself.

Free choice can't be modeled formally by "Turing machines" because we possess a means for writing them all down. On the other hand, if I give you the trace of a Turing machine you have no (finite) means to predict it so "Turing machine traces" do provide the largeness you need. I don't really know how to handle the fact that someone might want to form a bijection between the two. To me it makes me think immediately of information hiding behind an abstract interface: even though you could eventually know an implementation of the interface, you spend most of your time working ignorant of it and often have more power for doing so.

I'm definitely no expert here, but I think it's all really fascinating. Definitely agree that Brouwer didn't produce anything of much use to day-to-day mathematicians, but that's sort of Hilbert's (and Turing's) point: practical common sense doesn't seem too impeded by this.

I think as computer scientists we're essentially carrying out a certain mathematical/philosophical program in this debate, fwiw. Turing showed that if you take working computationally very seriously, you will run into open-endedness and time-as-resource constraints very quickly. Programming is a long exercise in discovering just how seriously you have to take both of those things (interfaces, abstraction on one side, resources, human patience on the other).

Brouwer and intuitionism are subsequently having a bit of a resurgence in mechanized proving. The non-computability of the reals rears its ugly head every time someone working in computational optimization has to write `compare_epsilon` instead of just `equals`. Or, perhaps even better, every time someone gets bitten by thinking IEEE reals are "mathematical reals".
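(The IEEE point in one concrete bite -- a minimal Python sketch; `compare_epsilon` is just a stand-in name of mine, while `math.isclose` is a real library version of the same idea:)

    import math

    # Binary floats are not the mathematical reals:
    print(0.1 + 0.2 == 0.3)   # False
    print(0.1 + 0.2)          # 0.30000000000000004

    # So numerical code compares within a tolerance instead of exactly.
    def compare_epsilon(a, b, eps=1e-9):
        return abs(a - b) <= eps * max(1.0, abs(a), abs(b))

    print(compare_epsilon(0.1 + 0.2, 0.3))  # True
    print(math.isclose(0.1 + 0.2, 0.3))     # True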

I also really like to imagine this because I love this idea that each programmer is unknowingly taking very tiny steps along a grand philosophical program. Unromantically it's obviously true, but there's something fun in trying to imagine just how profoundly computers may yet still impact our understanding of the world.


> I think you're right on the money bringing Turing into this.

Well, his most famous paper, the one in which he first discovered the essence of what computation is (rather than merely asking what functions can be computed by a "definite procedure"), was named On Computable Numbers, and he uses the very problem of the real numbers as a segue into the more general topic of computation.

> Free choice can't be modeled formally by "Turing machines" because we possess a means for writing them all down. On the other hand, if I give you the trace of a Turing machine you have no (finite) means to predict it so "Turing machine traces" do provide the largeness you need. I don't really know how to handle the fact that someone might want to form a bijection between the two.

I don't think there is any such bijection (if only for the cardinality argument you mention), but that it turns out (thanks to halting), that lawful and lawless sequences are equally opaque: it is as impossible to determine (using finite means) whether two lawful sequences are equal as it is to determine whether two lawless sequences are, which is all that's required (I think) for the theory. Therefore the same idea that Brouwer had in mind does not really require lawless sequences (which, I think, he wouldn't have introduced if he'd known about halting).
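(A sketch of why even lawful sequences are opaque in that sense -- my own illustration, using a hypothetical bounded simulator `runs_within(machine, steps)`: deciding equality of two perfectly lawlike sequences would already decide halting.)

    def runs_within(machine, steps):
        # Hypothetical: simulate `machine` for `steps` steps and report
        # whether it has halted by then. This part is computable.
        raise NotImplementedError

    def zeros(n):
        return 0

    def probe(machine):
        # 0 until (if ever) the machine halts, then 1 from that point on.
        # A perfectly lawful (computable) sequence.
        return lambda n: 1 if runs_within(machine, n) else 0

    # probe(machine) equals zeros at every index exactly when `machine`
    # never halts, so a decider for sequence equality decides halting.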

> Turing showed that if you take working computationally very seriously, you will run into open-endedness and time-as-resource constraints very quickly. Programming is a long exercise in discovering just how seriously you have to take both of those things (interfaces, abstraction on one side, resources, human patience on the other).

It's a bit more complicated, as Turing himself -- in his published conversations with Wittgenstein -- opposed intuitionism as a foundation, believing that math neither has nor requires one true formal foundation. After all, Formalism is also finitistic, and you can just as well use a program to prove a theorem about (classical) real numbers using finite means; there's no need to construct -- i.e. give specific meaning to -- each real number. That was Hilbert's point: it is enough to reason about mathematical propositions in a finitary manner, and there's no actual need to construct the objects in their domain of discourse with finitary means.

> Brouwer and intuitionism are subsequently having a bit of a resurgence in mechanized proving.

Yes, but I think the relationship is more complicated. Again, a mechanical prover can prove classical propositions -- by using deduction rules -- just as it can prove intuitionistic propositions. The resurgence with respect to mechanical provers, as far as I understand, has two reasons: 1. it is easier to create a tiny universal core for a mechanical prover based on intuitionistic logic, and classical logic can then be built on top of that. 2. much of the research on the theory of functional programming rests on the desire (explicitly stated by Martin-Löf) to unify programming and constructive math. I personally find this approach misguided for various reasons (I'm not a mathematician or a logician, so I see this from a practicing programmer's perspective), but I can understand the appeal.

> there's something fun in trying to imagine just how profoundly computers may yet still impact our understanding of the world.

Oh, absolutely. Once Turing realized that computation is in a very strong sense a fundamental natural phenomenon, he understood the many implications (on logic, AI, biology and physics).


> it is as impossible to determine whether two lawful sequences are equal as it is to determine whether two lawless sequences are

I also think that's enough.

I think when you mention that it's easier to build a mechanistic theorem prover on an intuitionistic foundation, it's a bit more of a "foundational" problem than it seems. Theorem provers need (some kind of partial) equality on propositions and on proofs in order to function, but propositions are too rich to support finite equality. This backs you into a more intuitionistic approach.


> but propositions are too rich to support finite equality

Why do you say that? The finitary deduction rules of all logic systems provide both equivalence (<=>) and partial order (=> or |-) relations. Classical logic gives rise to a boolean algebra, while intuitionistic logic forms a Heyting algebra, of which boolean algebra is a special case. Heyting algebras are more general, but both are bounded lattices, and both are perfectly fine.
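(A concrete example, my own and not from the comment above: the smallest Heyting algebra that isn't boolean is the chain 0 < 1/2 < 1, with meet = min, join = max, and a -> b defined as 1 when a <= b and b otherwise. A few lines of Python show double negation and excluded middle failing:)

    from fractions import Fraction

    HALF = Fraction(1, 2)

    def imp(a, b):
        return Fraction(1) if a <= b else b

    def neg(a):
        return imp(a, Fraction(0))   # negation = implication into bottom

    print(neg(HALF))             # 0
    print(neg(neg(HALF)))        # 1   -> double negation isn't the identity
    print(max(HALF, neg(HALF)))  # 1/2 -> excluded middle isn't top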


Sorry, on the propositions themselves, yes, but it's hard to internalize equality: the big sticking point between ITT, OTT, HTT, etc.


It's only hard if you're constructive (and so all relations have to be computable, including equality).


All numbers are ultimately "probabilistic" in calculations.

Sure you can state them as a "fact", but that undermines quantum theory itself and the Universe.

If you want to do the math right you really need to consider terms as peak probabilities and work from there. (See Chaos theory)

IMHO

Of course doing math with hard constraints is perfectly adequate for most macroscopic calculations.

That said, errors do add up given sufficient entropy and time.

It's quite simple to add fuzziness into the equations but quite a pain in the ass to compute with, plus you will in most cases get the same answer anyway.

As far as real numbers, they are no different from integers, just shift the decimal.



