
You don't understand. I am making this assertion as a former physicist. We have stumbled upon extremely general function approximators.

It works much the same way we use equations to make inferences about reality. But because of the nature of mathematical notation and the limits of human ability, math, though powerful, has its limits. E.g. you can write out the idealized equations for heat propagation in a conductive medium, but solving them for a real object requires numerical simulation.
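To make the heat-propagation point concrete, here is a minimal sketch of the standard fallback: when the closed-form solution is out of reach, you discretize and simulate. This is an explicit finite-difference scheme for the 1D heat equation u_t = alpha * u_xx; all parameter values here are illustrative choices, not anything from the comment.

```python
def simulate_heat_1d(u0, alpha=0.01, dx=0.1, dt=0.1, steps=100):
    """Explicit Euler / central-difference steps for u_t = alpha * u_xx,
    with the endpoints held fixed (Dirichlet boundary conditions)."""
    r = alpha * dt / dx**2          # needs r <= 0.5 for stability
    u = list(u0)
    for _ in range(steps):
        u = [u[0]] + [
            u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
            for i in range(1, len(u) - 1)
        ] + [u[-1]]
    return u

# A hot spot in the middle of a cold rod diffuses outward:
u0 = [0.0] * 21
u0[10] = 1.0
u = simulate_heat_1d(u0)
```

The closed-form solution exists only for idealized geometries and initial conditions; for a real object you run exactly this kind of grid-based simulation.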

Deep neural nets are the next step. Now you can train these networks to infer not just the general idealized behaviors but specific details, discrete values for X and Y on a fine grid, beyond the practical limits of applied mathematics.
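A toy illustration of that function-approximation claim, written from scratch so it is self-contained; the architecture, learning rate, and target function are arbitrary choices of mine, not anyone's production method. A one-hidden-layer network is trained by plain stochastic gradient descent to approximate f(x) = x^2 on a fine grid of points:

```python
import math
import random

random.seed(0)
H = 16                                          # hidden units
w = [random.uniform(-1, 1) for _ in range(H)]   # input weights
b = [random.uniform(-1, 1) for _ in range(H)]   # hidden biases
v = [random.uniform(-1, 1) for _ in range(H)]   # output weights
c = 0.0                                         # output bias

xs = [i / 20 - 1.0 for i in range(41)]          # fine grid on [-1, 1]
ys = [x * x for x in xs]                        # target function values

def predict(x):
    return sum(v[j] * math.tanh(w[j] * x + b[j]) for j in range(H)) + c

def mse():
    return sum((predict(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

loss_before = mse()
lr = 0.05
for _ in range(2000):
    for x, y in zip(xs, ys):
        hs = [math.tanh(w[j] * x + b[j]) for j in range(H)]
        err = sum(v[j] * hs[j] for j in range(H)) + c - y
        for j in range(H):
            g = err * v[j] * (1 - hs[j] ** 2)   # chain rule through tanh
            v[j] -= lr * err * hs[j]
            w[j] -= lr * g * x
            b[j] -= lr * g
        c -= lr * err
loss_after = mse()
```

The network never sees the formula x^2, only the gridded values, yet it learns to reproduce them. That is the "general function approximator" idea in miniature; real systems just scale the same recipe up by many orders of magnitude.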

But this is much bigger. It turns out that, much as idealized equations apply across many problems (e.g. exponential growth arising from a simple differential equation), neural nets generalize to all manner of real-world problems, provided the training data is appropriately curated.
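The exponential-growth aside above, made concrete: forward-Euler integration of dy/dt = k*y converges to the closed-form solution y(t) = y0 * exp(k*t) as the step size shrinks. The specific values of k, t, and the step counts are illustrative.

```python
import math

def euler_growth(y0, k, t, steps):
    """Integrate dy/dt = k*y from time 0 to t with forward Euler."""
    dt = t / steps
    y = y0
    for _ in range(steps):
        y += k * y * dt             # Euler step: y_{n+1} = y_n + k*y_n*dt
    return y

exact = 1.0 * math.exp(0.5 * 2.0)   # closed form with y0=1, k=0.5, t=2
coarse = euler_growth(1.0, 0.5, 2.0, 10)
fine = euler_growth(1.0, 0.5, 2.0, 10000)
```

This is the pattern the comment leans on: the idealized equation gives you the shape of the behavior, and numerical refinement recovers the specifics.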

These neural networks excel at learning human-like heuristics with machine-level precision. You can make inferences for both continuous and discrete probabilistic systems. This is a major development, and it's only just starting: the pieces have finally come together in the last few years.


