Deep learning of physical laws from scarce data (arxiv.org)
36 points by che_shr_cat 29 days ago | 13 comments



I've suspected for a while that this is how we unlock the next phase in theoretical physics, such as unifying QM and gravity. We may have hit some kind of human cognitive limit or blind region, so maybe we need to deploy some bigger (but more specialized) guns.


Maybe. The challenge you run into when devising radically new theory is that the optimization problem being solved in this work takes place within a broad but still-constrained "theory space". Specifically, from my quick read it looks like they're inferring components of a PDE of the form u_t = ϕL, where ϕ is a large dictionary of possible functions of u and its spatial derivatives, and L is a sparse coefficient matrix. This form covers a really broad class of PDEs, but, for example, whatever theory ends up unifying QM and gravity probably won't be a fancier PDE of this form that we haven't found yet.
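To make the "theory space" point concrete, here's a minimal sketch of that kind of dictionary-plus-sparse-coefficients discovery, in the spirit of SINDy/PDE-FIND rather than the paper's exact method. Everything here (the advection-equation test data, the candidate terms, the 0.1 threshold) is my own illustrative choice, not from the paper:

```python
import numpy as np

# Synthetic data: u(x, t) = sin(x - t) solves the advection equation u_t = -u_x.
x = np.linspace(0, 2 * np.pi, 256)
t = np.linspace(0, 2, 101)
X, T = np.meshgrid(x, t, indexing="ij")
u = np.sin(X - T)

# Numerical derivatives on the grid (second-order finite differences).
u_t = np.gradient(u, t, axis=1)
u_x = np.gradient(u, x, axis=0)
u_xx = np.gradient(u_x, x, axis=0)

# Dictionary Phi: each column is a candidate right-hand-side term.
names = ["u", "u_x", "u_xx", "u*u_x"]
Phi = np.column_stack([c.ravel() for c in (u, u_x, u_xx, u * u_x)])
target = u_t.ravel()

# Sequentially thresholded least squares: fit, zero small coefficients, refit.
coef = np.linalg.lstsq(Phi, target, rcond=None)[0]
for _ in range(10):
    small = np.abs(coef) < 0.1
    coef[small] = 0.0
    keep = ~small
    if keep.any():
        coef[keep] = np.linalg.lstsq(Phi[:, keep], target, rcond=None)[0]

discovered = {n: round(float(c), 2) for n, c in zip(names, coef) if c != 0.0}
print(discovered)  # expect roughly {'u_x': -1.0}
```

The recovered equation is only ever a sparse combination of whatever terms you put in the dictionary, which is exactly the constraint on "radically new theory": the method can't discover a law whose form lies outside ϕ.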


Personally, I think Stephen Wolfram and his colleagues are on the right track on that one: https://writings.stephenwolfram.com/2020/04/finally-we-may-h...


I'd be wary of that. He needs to make some predictions that can be tested, or it's just another string theory/brane theory/many-worlds-style theory that makes a lot of claims but hasn't advanced the science at all. We need testable predictions, preferably new particles in energy regimes we can actually reach with current or near-future tech.

Edit: There's a big danger in accepting theories because they have simplicity or beauty. That's how we ended up with String Theory and Fundamental Theoretical Physics has basically stalled since the '50s because of it.


The Standard Model was formulated in the 70s.


Ironically, the blog post where he announces that he's cracked the Theory Of Everything has the most humble tone of anything by him I've ever read.


Might be wishful thinking. The energy levels where a unifying theory might pop out seem to be out of reach right now. Doesn't seem that we're just missing something that's already in our data.


Our current data is incomplete. It's possible then that there's more than one model that fits it, and that our current model is wrong but still manages to fit the data that we have. If we train a naive learning system on the same data, it might find another model that also fits the data but is closer to the right model.

As a historical precedent consider Newtonian vs. Einsteinian mechanics. Newtonian mechanics works fine at familiar scales of time, space, force, power, mass, etc. Einsteinian mechanics also works fine at those scales. If that's all the data you have, they are equivalent theories. Yet Einsteinian mechanics also predicts things far beyond that scale correctly while Newtonian mechanics does not.


Yes, that's what I said: the data you need for a unifying theory is unavailable because we don't have colliders that can reach the right energies. Giant neural networks don't fix that.


Ahh... so our learning system may find multiple models but without access to those higher energies we may have no way to know which is correct.

I still think it's an interesting thing to attempt and may suggest new experimental predictions. What if an alternative model makes a novel prediction that is testable?


Could new physics be lurking at obtainable energies with dark matter and dark energy?


discover antigravity from LIGO data or something heh


2D incompressible Navier-Stokes ain't too shabby. The problems they investigated should have been in the abstract.

3D compressible N.-S. where error is measured via some functional of interest after direct numerical simulation of the discovered equations would be neat to see. I am curious which conservation laws it could figure out.



