Learning coupled differential equations subject to non-conservative forces (arxiv.org)
52 points by tonisunset 25 days ago | 25 comments

This kind of thing is prevalent in academia; it is probably totally useless, and at best so incomplete as to not tell you anything.

There is a lot of research going back to the 1930s on this problem: how do you estimate an energy potential from available data. The machine learning approach is really the variational approach, but using very expensive (many-parameter) functions instead of cleverly choosing your fitting function so that its parameters are physically meaningful and significantly fewer in number. Physically meaningful parameters help build a story of the underlying physics, while having fewer of them means you need fewer data points.

> cleverly choosing your fitting function

One benefit of ML is being able to replace clever subject matter experts (who may be expensive and hard to identify) with ML generalists.

Frequently, you still need a clever subject matter expert to figure out what form of input to feed the neural network. E.g., someone has to figure out how to encode a crystal structure for a neural network. The most obvious choices (e.g. voxel models) are intractable without huge amounts of computational resources, which aren't generally available.

Your data needs to be very cheap.

And you don't have to be that clever. For periodic functions, sines and cosines; for potential energies, quadratics, polynomials, and exponentials do a great job.
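A minimal sketch of the point above, with made-up numbers: a harmonic potential U(x) = k/2 (x - x0)^2 + c has just three physically meaningful parameters, so a handful of noisy samples recovers it by ordinary least squares (the fit is linear in the polynomial coefficients).

```python
import numpy as np

# Hypothetical example: recover a harmonic potential
#   U(x) = 0.5*k*(x - x0)^2 + c
# from 11 noisy samples. Three physically meaningful parameters
# (stiffness k, equilibrium x0, offset c) instead of thousands of weights.
rng = np.random.default_rng(0)
k_true, x0_true, c_true = 2.0, 0.5, -1.0

x = np.linspace(-2, 3, 11)                       # only 11 "data points"
U = 0.5 * k_true * (x - x0_true) ** 2 + c_true
U_noisy = U + rng.normal(scale=0.01, size=x.size)

# A quadratic is linear in its coefficients, so np.polyfit suffices.
a, b, c = np.polyfit(x, U_noisy, 2)              # U ≈ a*x^2 + b*x + c
k_fit = 2 * a                                    # map back to physics
x0_fit = -b / (2 * a)

print(round(k_fit, 2), round(x0_fit, 2))         # close to 2.0 and 0.5
```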

One of the key things that makes neural networks generally inferior here is that you need enough data to learn the governing equations (physics that is always true) as well as the constitutive law (the part of the physics specific to your problem), as opposed to just the constitutive law.

Of course there are notable exceptions. Neural networks have been doing a great job of processing microscopy data and building 3D models from a series of 2D images.

And then the ML generalists (sometimes painfully) relearn the same lessons that the subject matter experts have learned over generations.

While that’s the poster’s dream, I wonder whether it actually works in practice, rather than being a liability, as a sibling comment points out.

IME, imposing domain knowledge (usually directly into the model, or sometimes also in the training data distribution) is crucial to getting things working in an efficient and robust manner.

Deep NNs are already competitive with specialized pipelines for translation and speech.

The primary downside is that they require more data to train.

More importantly, thanks to the internet there is a huge amount of data that is reasonably accessible. Many of the science problems I have dealt with have something like 11 data "points", and experiments may take a month to perform.

Sure, but that’s such a tiny corner among generic problems. To generalize from there seems unwarranted.

But is it a benefit or a liability? Reminds me a lot of the arguments in Susskind and Susskind's "The Future of the Professions."

Yes, try fitting a function to less than a pi/4 slice of a period of data...

I did this recently. We wanted to validate our thermal control model and we had ~20 heaters and 24 hours to do the test. Heaters couldn't cycle faster than about 4 hours due to thermal mass, so we had to fit our periodic function in less than a quarter of a period of cycle time. It works fine if you have a good initial guess to the period and you know your solution is periodic.
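A toy recreation of that scenario (all numbers hypothetical): a ~16-hour temperature cycle observed for only 3.5 hours. Since a sinusoid is linear in everything except its period, one simple approach is to scan candidate periods near the initial guess and solve the rest by least squares; with a short data window, the good initial guess is what keeps the search in the right basin.

```python
import numpy as np

# Hypothetical thermal-test scenario: a 16 h temperature cycle observed
# for only 3.5 h (under a quarter period), with a known-periodic model.
rng = np.random.default_rng(1)
true_period = 16.0                               # hours
t = np.linspace(0, 3.5, 30)                      # short test window
y = 5.0 * np.sin(2 * np.pi * t / true_period + 0.3) + 20.0
y += rng.normal(scale=0.01, size=t.size)         # small sensor noise

def residual(period):
    # For a fixed period, y ≈ a*sin(wt) + b*cos(wt) + c is a linear fit.
    w = 2 * np.pi / period
    A = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.sum((A @ coef - y) ** 2)

# The good initial guess matters: scan only a window around the
# expected period rather than all possible periods.
candidates = np.linspace(12.0, 20.0, 801)
best = candidates[np.argmin([residual(p) for p in candidates])]
print(round(best, 1))                            # recovered period, hours
```

With more noise or a wider search window the period estimate degrades quickly, which matches the comment's point about needing a good initial guess. (A nonlinear optimizer like scipy's curve_fit with a p0 near the truth would be the more conventional route.)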

This is a really interesting idea, using machine learning to analyze dynamical systems. I've had this idea kicking around in my head about using machine learning on complex time series data for a sort of anomaly detection. You could use example time series data to train a neural network to find conserved quantities in the system, essentially trying to learn the internal energy of the system, and then changes in that internal energy would be your anomalies.
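The idea above can be sketched without any ML at all: below, the "learned" invariant is simply the known energy of an ideal oscillator (in the ML version a network would be trained to output a quantity that stays constant on the training data), and a jump in it flags the anomaly. All parameters are illustrative.

```python
import numpy as np

# Toy conserved-quantity anomaly detector: for an ideal oscillator the
# energy E = v^2/2 + w^2 x^2 / 2 is conserved, so a jump in E is an anomaly.
w = 2.0
dt = 0.001
x, v = 1.0, 0.0
energies = []
for step in range(10000):
    if step == 6000:
        v += 0.5                      # injected anomaly: an external "kick"
    # leapfrog integration of x'' = -w^2 x (conserves energy very well)
    v += -w**2 * x * dt / 2
    x += v * dt
    v += -w**2 * x * dt / 2
    energies.append(0.5 * v**2 + 0.5 * w**2 * x**2)

e = np.array(energies)
jump = np.argmax(np.abs(np.diff(e))) + 1   # step with the biggest energy change
print(jump)                                # → 6000, the anomalous step
```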

It looks like this paper is getting at the same basic idea, though obviously actually fleshed out and with better results than I would have thought were possible. It goes not for anomaly detection but for actual forecasting, directly isolating the non-conservative force.

I'll need to start reading the related papers. I wonder if people have tried doing this with nonlinear equations, or with systems where the dynamics aren't well defined. I imagine actually forecasting a nonlinear system wouldn't be possible beyond short time frames, but maybe you could still learn the underlying equation and dynamics? It would be interesting to see if you could give a neural network a bunch of nonlinear data and have it find equilibrium points, identify regions where the dynamics are almost linear, etc.

People are working on it (disclosure, that includes me - https://arxiv.org/abs/2005.13028)

There is a special interest group at The Alan Turing Institute in the UK dedicated to working on problems of this sort.

There are a number of challenges that vary depending on both the system and the forcing.

It's actually not an interesting or original idea, and this is a shit-tier paper; it's generally a way for a physicist to try to get a job when finishing grad school.

Echo state networks do insanely well on the Mackey-Glass equation, and ESNs are randomly connected and use linear regression on the output node.
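That architecture is simple enough to sketch in a few lines: a fixed, random, sparsely connected reservoir, with only a linear readout fitted by ridge regression. (A short quasi-periodic signal stands in for the Mackey-Glass series here to keep the sketch self-contained; the mechanics are the same.)

```python
import numpy as np

# Minimal echo state network: random fixed reservoir + linear readout.
rng = np.random.default_rng(42)
N = 200                                        # reservoir size

# Sparse random reservoir, rescaled to spectral radius < 1 for stability.
W = rng.normal(size=(N, N)) * (rng.random((N, N)) < 0.1)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=N)

t = np.arange(3000)
u = np.sin(0.2 * t) + 0.5 * np.sin(0.311 * t)  # stand-in for Mackey-Glass

# Drive the reservoir; the recurrent weights are never trained.
states = np.zeros((len(u), N))
x = np.zeros(N)
for i in range(len(u) - 1):
    x = np.tanh(W @ x + W_in * u[i])
    states[i + 1] = x                          # states[i+1] has seen u[:i+1]

# Ridge-regression readout: predict u[j] from the reservoir state.
X, y = states[500:2000], u[500:2000]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)

pred = states[2000:] @ W_out                   # held-out one-step predictions
err = np.sqrt(np.mean((pred - u[2000:]) ** 2))
print(err < 0.1)                               # tiny one-step error
```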

> this is a shit tier paper

I agree with the content if not the style of your comment, but it would do better to say why the results are trivial.

Arxiv is not peer-reviewed and is usually used for preprint versions before submitting a paper to a journal. It also makes the information accessible for free to anybody.

Not sure if this is close to the problems you want to study, but NASA applied Deep learning to anomaly detection on spacecraft telemetry and has some early results here: https://arxiv.org/pdf/1802.04431.pdf

I'm very interested in anomaly detection problems, so if you find anything good out there do comment back in!

This system might be able to determine causal relationships, as well as expended energy vs. delivered energy and which components are driving vs. being driven.

>90% of all energy spent on compute will be on matrix multiplications.

This is a busy week for neural differential equation solvers. Here's another interesting one using a neural network to solve complicated PDEs like Navier-Stokes: https://arxiv.org/abs/2010.08895

This is interesting and differs from other ODE ML approaches in that it can deal with non-conservative forces, like friction. The design allows for a shallow network with only 3 free parameters and is therefore very cheap to train.

This seems restricted to oscillating systems. Neural ODEs (DiffEqFlux) seem a bit more general, able to learn any kind of dynamical system.

As an example application, it would be cool to have a vehicle know when its shocks need to be replaced, or are too far out of specification, just from monitoring suspension-arm angle sensors.

Yo, your car's shocks need replacing. It is currently underdamped and going to bounce around like a Ford Fiesta in an LA earthquake.

Hey buddy, those pricey tuner shocks are problematically over-damped which will lead to a rough ride, wear issues, and handling problems.

I don't understand why you need "machine learning" for that. The engineers already have FEA models from which they build reduced-order dynamic equations. By monitoring the response, you can estimate the damping factor relatively easily, which could tell you whether your shocks need to be replaced.

I mean, some people take machine learning to include anything with parameter fitting, but this seems like classic engineering to me.
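The classic-engineering version of that damping estimate is the logarithmic decrement: from two successive peaks of a decaying oscillation, delta = ln(x1/x2) gives the damping ratio zeta = delta / sqrt(4*pi^2 + delta^2). A sketch with illustrative numbers:

```python
import numpy as np

# Estimate the damping ratio of a shock from its free-decay response
# using the logarithmic decrement (no ML required).
zeta_true = 0.05                       # lightly damped, hypothetical value
wn = 10.0                              # natural frequency, rad/s
wd = wn * np.sqrt(1 - zeta_true**2)    # damped frequency

t = np.linspace(0, 5, 5001)
x = np.exp(-zeta_true * wn * t) * np.cos(wd * t)   # free-decay response

# Find successive positive peaks of the sampled signal.
peaks = [i for i in range(1, len(x) - 1)
         if x[i] > x[i - 1] and x[i] > x[i + 1] and x[i] > 0]

delta = np.log(x[peaks[0]] / x[peaks[1]])          # logarithmic decrement
zeta_est = delta / np.sqrt(4 * np.pi**2 + delta**2)
print(round(zeta_est, 3))              # → 0.05, recovering zeta_true
```

Comparing the estimated zeta against the as-designed value is exactly the "is this shock out of spec" feedback the parent comment describes.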

Yeah, you just need an accelerometer and you can find out if your damper is broken with an FFT. OEMs are just very price sensitive, so only expensive vehicles will get this, even if it's a cheap component.
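A sketch of that FFT check (parameters are illustrative, not from any real vehicle): a worn, underdamped shock rings much longer after a bump, so its resonance peak towers over a healthy one in the accelerometer spectrum.

```python
import numpy as np

def free_decay(zeta, wn=2 * np.pi * 1.5, fs=100.0, T=20.0):
    # Accelerometer-style free-decay response after hitting a bump:
    # ~1.5 Hz suspension mode, sampled at 100 Hz for 20 s.
    t = np.arange(0, T, 1 / fs)
    wd = wn * np.sqrt(1 - zeta**2)
    return np.exp(-zeta * wn * t) * np.cos(wd * t)

peak = {}
for label, zeta in [("healthy", 0.4), ("worn", 0.05)]:
    spec = np.abs(np.fft.rfft(free_decay(zeta)))
    peak[label] = spec.max()

# The underdamped (worn) damper produces a far sharper resonance peak.
print(peak["worn"] > 3 * peak["healthy"])   # → True
```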

Analyzing data collected over time to see how shocks degrade.
