A better approach to gravity: how we made EGM2008 faster (elodin.systems)
88 points by sphw 50 days ago | 23 comments



Exactly what order and degree were you using to evaluate the model? For objects in LEO, variations in drag and solar pressure become more significant than the uncertainty in the gravity field somewhere well below 127th order (40 microseconds on my machine, your mileage may vary), so you can safely truncate the model for simulations. GRACE worked by making many passes so that those perturbations could be averaged out of the measurement. But for practical applications, those tiny terms are irrelevant.

IERS Technical Note 36 section 6.1 gives recommendations for model truncation if you are looking for justification. https://iers-conventions.obspm.fr/content/tn36.pdf
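To put rough numbers on the truncation savings: the coefficient count grows quadratically with maximum degree, so cutting off at degree 127 discards well over 99% of the terms. A back-of-the-envelope sketch (this counts (C_nm, S_nm) pairs for degrees 2..N; the article's exact figure differs slightly because EGM2008's maximum order is lower than its maximum degree):

```python
def coefficient_pairs(n_max: int) -> int:
    """Number of (C_nm, S_nm) spherical-harmonic coefficient pairs
    for degrees n = 2..n_max (degrees 0 and 1 are conventionally
    folded into the central term and the choice of origin)."""
    # sum_{n=2}^{n_max} (n + 1) pairs, one per order m = 0..n
    return (n_max + 1) * (n_max + 2) // 2 - 3

full = coefficient_pairs(2159)   # ~2.33 million pairs
trunc = coefficient_pairs(127)   # 8,253 pairs
print(f"truncated model keeps {trunc / full:.4%} of the terms")
```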


Always super satisfying to take a 1.2s calculation and make it orders of magnitude faster. Recently I had a complicated calculation mostly done in SQLite (with some C callbacks to do core floating point ops) that was taking 1.5s; rewrote it into a hand-crafted incremental computation network and got the calculation down to 6ms.
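For anyone unfamiliar with the pattern: an "incremental computation network" makes each derived value a node that caches its result and recomputes only when an upstream input changes. A toy illustration of the idea (not the actual implementation from the comment above):

```python
class Node:
    """A derived value that caches its result and recomputes
    only when marked dirty by an upstream change."""
    def __init__(self, fn, *deps):
        self.fn, self.deps = fn, deps
        self.dirty, self._value = True, None
        self.dependents = []
        for d in deps:
            d.dependents.append(self)

    def invalidate(self):
        self.dirty = True
        for node in self.dependents:
            node.invalidate()

    def value(self):
        if self.dirty:
            self._value = self.fn(*(d.value() for d in self.deps))
            self.dirty = False
        return self._value

class Input(Node):
    """A leaf value; setting it dirties only its downstream nodes."""
    def __init__(self, v):
        super().__init__(None)
        self._value, self.dirty = v, False

    def set(self, v):
        self._value = v
        for node in self.dependents:
            node.invalidate()

a, b = Input(2.0), Input(3.0)
total = Node(lambda x, y: x + y, a, b)
assert total.value() == 5.0
a.set(10.0)          # only nodes downstream of `a` recompute
assert total.value() == 13.0
```

The speedup comes from the fact that after a small input change, only the dirty subgraph is re-evaluated rather than the whole calculation.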


Do I understand right that it's summing ~2 million spherical harmonic terms, and the optimization is that this is accomplished in ~200 ms (per test point?)? Does this force field feed into something like a Runge-Kutta solver for predicting orbits? (I don't know much about orbital mechanics.)


Yup, your understanding is correct. When set to full fidelity, it computes and sums 2,331,720 terms. What we optimized is the way the spherical harmonics are generated, which allows for the generation and summation to happen in less than 250 ms. Once that force information is generated, it is passed to a semi-implicit integrator – we also support RK4, but this test was using semi-implicit. That allows you to run orbital dynamics simulations; our primary use case is for testing satellite control systems.
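For anyone curious what "semi-implicit" means here: a semi-implicit (symplectic) Euler step updates velocity first and then advances position with the *new* velocity, which keeps energy errors bounded over long orbital simulations. A minimal sketch, using a point-mass gravity stand-in for the full EGM2008 field (the constants and initial state are illustrative, not from the article):

```python
import numpy as np

MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def accel(x):
    """Point-mass gravity; a full simulation would evaluate the
    spherical-harmonic field here instead."""
    r = np.linalg.norm(x)
    return -MU * x / r**3

def semi_implicit_step(x, v, dt):
    # velocity uses acceleration at the old position,
    # position uses the updated velocity -> symplectic Euler
    v = v + accel(x) * dt
    x = x + v * dt
    return x, v

# circular LEO-ish orbit at radius 7000 km
x = np.array([7.0e6, 0.0, 0.0])
v = np.array([0.0, np.sqrt(MU / 7.0e6), 0.0])
for _ in range(600):
    x, v = semi_implicit_step(x, v, 1.0)  # 1 s steps
```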


Related: Dr. Martin used a PINN (physics-informed neural network) to create a gravity model [0]; there's also a video [1].

[0] https://github.com/MartinAstro/GravNN

[1] https://www.youtube.com/watch?v=1UNtZ4sOVEI


The latest paper is here; pretty exciting: https://arxiv.org/abs/2312.10257


Technical nit: spherical harmonics are used in quantum mechanics but were developed for classical mechanics.


Where do they show up in classical mechanics?


Anywhere you have to solve the Laplace equation in spherical coordinates. For example with the gravitational field.
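Concretely, the exterior solution of Laplace's equation in spherical coordinates is exactly the expansion geopotential models like EGM2008 use (standard notation, with normalized associated Legendre functions; a sketch of the conventional form, not copied from the article):

```latex
U(r,\phi,\lambda) = \frac{GM}{r}\left[1 + \sum_{n=2}^{N}\sum_{m=0}^{n}
  \left(\frac{a}{r}\right)^{n} \bar{P}_{nm}(\sin\phi)
  \left(\bar{C}_{nm}\cos m\lambda + \bar{S}_{nm}\sin m\lambda\right)\right]
```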


Vibrational modes of spherical stuff.


Fair point. We'll fix up the language.


It's a pretty minor point, but the pedant in me appreciates it nonetheless!

It's also true that since most people are not subjected to the horrors of learning about PDEs, the place where they may be exposed to spherical harmonics is in atomic orbitals from high school chemistry, so I could see where you were coming from.


I don't think there's any need for the "Technical nit:" preface; it reads a bit aggressive to me, and without substantiation.

Why not say your piece and be done at that point in time? You can respond to counter arguments as they arise in child comments.

Why not kick off with "Here at {wherever}, we find that ... crossing the streams is a really bad idea" or similar?


> Who’s the fastest kid on the block now, MATLAB?

Was MATLAB ever the fastest kid?

What's outlined in the article has, from my PoV, been true since at least 1980 (and older coders will likely chime in with earlier tales): handy implementation and simulation libraries generally work and provide good reference results to check against, but you can almost always get an order of magnitude faster if you can put the time in.

I put the earlier, pre-2008-epoch models for Earth's gravity and magnetics through a similar workout some time ago. For our use case we spent some time and money building a custom TI DSP chip pipeline to get the turnaround that made data exploration playful rather than an overnight grind.


Disable NextDNS or other DNS ad-blocking tools if the site's content doesn't load.


Thank you for giving a plausible explanation as to why the main content of the website doesn’t load, but the rest does! Looking at my NextDNS logs, the website does indeed seem to have plenty of telemetry. I’ll keep it blocked.


Sorry to hear you had to block the site. Do you have issues loading other Webflow sites because of NextDNS? I've found many threads on issues between them, but not much in the way of a solution.


What's the accuracy/speed tradeoff for this model compared to one based on computing density points and using Barnes-Hut?


I've never seen anyone use Barnes-Hut for an earth gravity model. I'd be curious to see how an implementation like that would work. I think Barnes-Hut is typically used for large N-body simulations where each particle moves independently, like for galactic simulations.

One of the biggest barriers to alternate geopotential models is the availability of trusted datasets. In another comment someone linked to a PINN-based method that looks super promising.


Excuse my ignorance, but how big of a lookup table would you need to achieve the same outcome?


LUTs are commonly used in geodesy applications on or near the Earth's surface. The full multipole model is used for orbital applications to account for the way that local lumpiness in Earth's mass distribution is smoothed out with increasing distance from the surface. It might be reasonable to build a 3D LUT for use at Starlink scale or higher, but certainly not for individual satellites.
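A sketch of what such a 3D LUT could look like, assuming you precompute field values on a regular grid and interpolate between nodes (scipy's RegularGridInterpolator; the grid bounds, resolution, and placeholder field below are all illustrative):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Placeholder "truth" field to tabulate; a real LUT would store
# EGM2008 accelerations evaluated at each grid node.
def truth(x, y, z):
    return -x  # stand-in for one acceleration component

# Regular grid over a region of interest (bounds are illustrative).
xs = ys = zs = np.linspace(-8.0e6, 8.0e6, 33)
X, Y, Z = np.meshgrid(xs, ys, zs, indexing="ij")
table = truth(X, Y, Z)

lut = RegularGridInterpolator((xs, ys, zs), table)
a_x = lut([[7.0e6, 1.0e5, -2.0e5]])[0]  # trilinear lookup
```

The table size is the catch: covering a 3D shell at useful resolution multiplies out quickly, which is why this only pays off when you amortize it over many satellites.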


TL;DR: JAX (the GPU-accelerated Python ML framework)

Pretty cool to see applications of all this good stuff developed for ML in other areas! I guess a side effect is that the calculations should now be differentiable and you could optimize various parameters using gradient descent if you wanted. You could even use it as a component of a neural net and backpropagate gradients through it.

Also, it's not clear whether they used GPUs, but they probably could for an even better speedup in scenarios with large batches of calculations.
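A quick illustration of the differentiability point above, using a point-mass potential as a stand-in for the full harmonic model (the actual Elodin code isn't shown in this thread, so this is just the general JAX pattern):

```python
import jax
import jax.numpy as jnp

MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def potential(x):
    """Point-mass potential; swap in the full spherical-harmonic
    evaluation and the gradient below still works unchanged."""
    return -MU / jnp.linalg.norm(x)

# Acceleration is minus the gradient of the potential,
# and JAX derives it automatically.
accel = jax.grad(lambda x: -potential(x))

a = accel(jnp.array([7.0e6, 0.0, 0.0]))  # points back toward the origin
```

The same machinery lets you backpropagate through the whole force model if you embed it in a larger optimization or a neural net.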


No offense, but this doesn't seem "revolutionary" at all; people have been doing hierarchical representations and multipole expansions since forever. The presentation of a single fixed-resolution bitmap of dense equations, with no introduction of terms, feels like it's trying to do proof by intimidation.

When doing path guiding in Monte Carlo path tracing you often have lots of nasty low-probability / high-contribution paths that make gravitational simulation seem easy. After all, if you could a priori simulate light paths efficiently, you could efficiently solve optical computers / the Halting Problem.



