
It seems to me that if your argument is valid then the situation is even worse for machine learning.

That is to say, if machine learning cannot use the real numbers, then it cannot solve any problem that depends on any property of the reals, such as producing a machine-learned function to compute the square root of two.

Now, it may be true (in fact I think it likely) that the cardinality of the reals is a somewhat different kind of property than their denseness, enough so that machine learning can approximate the denseness but not the cardinality, but the reason for this doesn't seem to be simply "machine learning doesn't use the real numbers".




That is not surprising at all. The set of real numbers that can be produced by a computer is countable, which means that almost all real numbers are uncomputable. (Square roots of natural numbers are computable, though.)
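To the parenthetical: a computable real like the square root of two can be approximated to any requested precision by a terminating program. A minimal sketch using Newton's method with exact rational arithmetic (the function name and tolerance are illustrative, not from any particular library):

```python
from fractions import Fraction

def sqrt2_approx(eps: Fraction) -> Fraction:
    """Return a rational x with |x*x - 2| < eps, via Newton's method.

    Exact Fraction arithmetic avoids any floating-point rounding,
    so the guarantee on the residual is exact.
    """
    x = Fraction(2)  # starting guess; Newton iterates stay above sqrt(2)
    while abs(x * x - 2) >= eps:
        x = (x + 2 / x) / 2  # Newton step for f(x) = x^2 - 2
    return x

# Approximate sqrt(2) so that x^2 is within 10^-20 of 2.
approx = sqrt2_approx(Fraction(1, 10**20))
```

This is exactly the sense in which a countable machine can still "use" an irrational number: not by representing it exactly, but by producing arbitrarily good rational approximations on demand.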



