Unfortunately, it seems to have been taken down. A shame - it really was everything! But as a huge CFD research code it is a bit hard to sift through anyway.
I'd recommend the papers, then. Stuff like this, from these guys (Jonathan Regele and Oleg Vasilyev), led to the SpaceX software: https://www.researchgate.net/publication/239395184_An_adapti...
An earlier starting point might be something like this: http://ms.mcmaster.ca/kevla/pdf/articles/Kevlahan_SISC_v26.p...
This line of work dates back to the mid-90s, as you will see from the sources.
Some simple examples with some nice features:
Seaplane water simulation, made with RealFlow: https://www.youtube.com/watch?v=nnv-95w1d5A
In contrast, mesh-based (and grid-based, as in the article) simulators approximate the Navier-Stokes equations and converge to the exact solution as the mesh element sizes approach zero.
Mesh-based simulators solve a global system of equations in every frame, with as many unknowns as there are mesh elements across the entire domain. Solving such a large system (especially in 3D) is very expensive. Furthermore, the size and shape of the mesh elements drastically affect the numerical stability of the solution, and generating good meshes is not a solved problem in all cases. For these reasons, we really only use mesh-based simulators when we need a high degree of scientific accuracy.
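To make the cost concrete, here is a toy sketch (my own, in Python with SciPy, not taken from any particular solver) that assembles the standard 5-point pressure Poisson matrix: an n-by-n grid already gives n^2 unknowns in 2D, and the same construction in 3D gives n^3.

    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    def poisson_matrix(n):
        # Standard 5-point Laplacian on an n x n grid (Dirichlet boundaries),
        # built as a Kronecker sum of 1D second-difference matrices.
        T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
        I = sp.identity(n)
        return (sp.kron(I, T) + sp.kron(T, I)).tocsr()

    for n in (32, 64, 128):            # doubling the resolution...
        A = poisson_matrix(n)          # ...quadruples the unknowns in 2D
        b = np.ones(n * n)
        x, info = spla.cg(A, b)        # one unknown per grid cell
        print(n * n, "unknowns, converged:", info == 0)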
Depends on what you mean by "particle-based". Lattice Boltzmann models, given specific constraints, can easily be shown to converge to the Navier-Stokes equations (and, if desired, even to higher-order fluid dynamics models) via the Chapman-Enskog expansion.
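If it helps, here is a minimal D2Q9 BGK sketch (my own toy code on a periodic domain, with parameters I picked for illustration); in the Chapman-Enskog limit this scheme behaves like incompressible Navier-Stokes with viscosity nu = (tau - 1/2)/3 in lattice units:

    import numpy as np

    # D2Q9 lattice: 9 discrete velocities and their quadrature weights.
    c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                  [1, 1], [-1, 1], [-1, -1], [1, -1]])
    w = np.array([4/9] + [1/9]*4 + [1/36]*4)
    nx, ny, tau = 128, 128, 0.6      # nu = (tau - 0.5) / 3

    def equilibrium(rho, ux, uy):
        # Second-order Maxwell-Boltzmann expansion at each lattice site.
        cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
        usq = ux**2 + uy**2
        return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

    # Uniform density plus a small sinusoidal shear to get things moving.
    rho = np.ones((nx, ny))
    ux = np.tile(0.05 * np.sin(2 * np.pi * np.arange(ny) / ny), (nx, 1))
    uy = np.zeros((nx, ny))
    f = equilibrium(rho, ux, uy)

    for step in range(1000):
        rho = f.sum(axis=0)                                # density moment
        ux = (c[:, 0, None, None] * f).sum(axis=0) / rho   # velocity moments
        uy = (c[:, 1, None, None] * f).sum(axis=0) / rho
        f -= (f - equilibrium(rho, ux, uy)) / tau          # BGK collision
        for i in range(9):                                 # streaming step
            f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)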
(Maybe there's another Navier-Stokes solution with the same initial conditions, maybe it's unique, doesn't matter!)
Do you have some references for this? I was under the impression that convergence results have been shown for purely Lagrangian approaches like SPH.
I recall that there are algorithms for generating divergence-free random fields, but I'm not particularly familiar with them. In another code I worked with when I was younger, I recall discussing this issue with a much more senior researcher, and if I remember correctly he said not to worry about the initial condition failing to satisfy the divergence constraint (the divergence was non-zero but known), as the numerical method would enforce the constraint at all future time steps. Presumably one could modify the algorithm (this was a "pressure projection scheme") to take an arbitrary velocity field along with the desired divergence field and correct the initial velocity field to be divergence-free. This may not preserve the desired statistical properties of the noise, however.
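For what it's worth, a minimal sketch of that correction step (my own code, not the scheme described above): an FFT-based Leray projection that strips the gradient part of a random velocity field on a periodic domain, leaving it divergence-free. As noted, it does perturb the statistics of the noise.

    import numpy as np

    def project_divergence_free(u, v):
        # Remove the curl-free (gradient) part of a periodic 2D field in
        # Fourier space: u_hat <- u_hat - k (k . u_hat) / |k|^2.
        n = u.shape[0]
        k = 2 * np.pi * np.fft.fftfreq(n)
        kx, ky = np.meshgrid(k, k, indexing="ij")
        k2 = kx**2 + ky**2
        k2[0, 0] = 1.0  # avoid 0/0 on the zero-frequency (mean) mode

        u_hat, v_hat = np.fft.fft2(u), np.fft.fft2(v)
        div_hat = kx * u_hat + ky * v_hat
        u_hat -= kx * div_hat / k2
        v_hat -= ky * div_hat / k2
        return np.fft.ifft2(u_hat).real, np.fft.ifft2(v_hat).real

    rng = np.random.default_rng(0)
    u, v = rng.standard_normal((2, 64, 64))  # random, non-solenoidal noise
    u, v = project_divergence_free(u, v)     # now divergence-free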
In reality, even if you can manage perfectly symmetric initial and boundary conditions, thermal noise would still cause this symmetry breaking.
Since these systems are physically unstable, the source of initial perturbation is unimportant as long as it is small.
To be clear, it absolutely is correct to say that this would lead to symmetry breaking. Depending on the numerical method and the computer, it might take a long time, if I recall correctly.
Would flames not be symmetrical if all of the above, plus everything imaginable (quantum physics, solar activity, dark matter, whatever), were controlled for?
Maybe if you had a perfectly homogeneous air density (which you'd never see, even in a really good vacuum near absolute zero; you can thank QM for that) that stayed homogeneous forever. The problem in real-world conditions is that nothing is perfectly homogeneous (entropy exists), so things heat up unevenly, some parts gain more velocity than others, and even a small offset can create a large change in outcome. The short answer is that these systems are chaotic in nature, so they are not stable.
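As a toy illustration of that sensitivity (my own sketch, using the Lorenz system rather than an actual fluid solver): two trajectories whose initial conditions differ by 1e-12, roughly the size of a rounding error, end up completely decorrelated.

    import numpy as np

    def lorenz_step(s, dt=0.005, sigma=10.0, rho=28.0, beta=8/3):
        # One forward-Euler step of the Lorenz equations; crude, but
        # enough to show the exponential separation of trajectories.
        x, y, z = s
        return s + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

    a = np.array([1.0, 1.0, 1.0])
    b = a + np.array([1e-12, 0.0, 0.0])  # perturbation ~ one rounding error
    for step in range(12001):
        a, b = lorenz_step(a), lorenz_step(b)
        if step % 3000 == 0:
            print(step, np.linalg.norm(a - b))  # grows by orders of magnitude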
But that doesn't have anything to do with simulations (except in what you are trying to emulate). In simulations, numerical accuracy plays a role, but what you really look for is whether the result is realistic, because the real world has random events. That's more what I was trying to get at: you want something that represents reality, not an oversimplified example you can't use in a meaningful way. Even with these inaccuracies you can get representative models (they will reflect what happens in a physical experiment). I say representative because you aren't going to account for all those factors in a simulation, but you are accurate enough to draw extremely effective conclusions. I'll even note that some people add random noise to their simulations (I don't know if this author did, but numerics can play that role).
Replace the word random with unpredictable and it shows why some people are bothered.
To say something is fundamentally unpredictable means that it is immeasurable, not merely complex.
No, it just means that the measure cannot be predicted in advance.
OTOH, the observer effect can make measurement problematic.
And just to be clear, you do not think an observer has to be a conscious being, right? I only ask because pop culture science gives this impression. A photon can be an observer.
Computer-generated art is fascinating.
(I'm not sure how phone screen savers even work. I actually didn't even know you could set one on phones.)
Visual Simulation of Smoke (physbam.stanford.edu/~fedkiw/papers/stanford2001-01.pdf)
This reminded me of the recent Jupiter images: https://solarsystem.nasa.gov/planets/jupiter/galleries
I cannot imagine the amount of work that goes into something like this.
If anyone in engineering has insight into this, I'd love to know.
Technically, non-fluid simulations could also be sped up using GPUs? Dynamics, solid state mechanics, thermal simulations, etc.
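(A hedged sketch of what I mean, assuming the CuPy package and an NVIDIA GPU, with a NumPy fallback otherwise: the same explicit thermal-diffusion stencil runs on CPU or GPU just by swapping the array module.)

    import numpy as np
    try:
        import cupy as xp  # GPU arrays, if available
    except ImportError:
        xp = np            # plain CPU fallback

    n, alpha, dt = 512, 1.0, 0.1          # dt * alpha <= 0.25 for stability
    T = xp.zeros((n, n))
    T[n // 2, n // 2] = 1000.0            # hot spot in the middle

    for _ in range(100):                  # explicit heat-equation steps
        lap = (xp.roll(T, 1, 0) + xp.roll(T, -1, 0) +
               xp.roll(T, 1, 1) + xp.roll(T, -1, 1) - 4 * T)
        T = T + alpha * dt * lap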
Often they use back-end solvers that are very old. For example, I use FEMAP professionally, and it's essentially a pre- and post-processor over NASTRAN, which is way older than me. Adding GPGPU to it would be difficult indeed, and no one will pick a new, possibly less accurate solver without it having been robustly proven (it's a chicken-and-egg problem, in a way).
Also, GPUs aren’t suited to all problems. You still have the memory limits of GPUs which aren’t as Large as traditional RAM (I see no GPUs with 32-64GB RAM). They’re not the silver bullet people sometimes hope for.
Lastly, the people who do this work overlap surprisingly little with the crowd that keeps up with flashy new IT developments. They're not Silicon Valley types with their finger on the pulse of the latest and greatest. Most just use PCs as a tool and wouldn't know the benefits GPGPU could provide. To them, video cards are just video cards.
LLNL and ORNL are right now rolling out two new gigantic GPU systems, Sierra and Summit: https://www.nextplatform.com/2017/09/19... There will be plenty more GPU CFD simulations running on those systems.
The world needs to evolve. Those who don't, disappear.
If there is a way to do it without sacrificing accuracy, then GPU computation would be absolutely amazing. Imagine the productivity boost and the time, power, and infrastructure savings you'd gain by using a GPU (if it is possible, that is). Imagine being able to simulate engineering problems in near real time without having to wait hours for a solver to converge.
I don't have experience with the solvers, but if there is a way to enable GPU computation, then why the hell not!? Silicon Valley types or not, people who are engaged in CFD (I used to work on Lockheed's flutter dynamics team) are certainly not "old fashioned" as you describe. GPU to them wouldn't just be a "kid's video game toy" - I can assure you, having worked with these folks.
less detailed, but real-time on the CPU
and Fluvio, which has GPU acceleration.
Jokes aside, this is a beautiful simulation of 2D fluid dynamics pumped to the screen using OpenGL.
I shared my shift with four intimidating Icelandic men.
I usually stood by the line, making sure that any fish that were hooked got inside the boat.
During one of my shifts, under heavy storms, an absolutely massive knot came up. I tried to untie it, but ended up having to call for help.
The Icelanders went to town on it, but did not make a lot of progress.
In frustration, one of them screamed at me to go get the knot book from the captain.
I ran as fast as I could. Up on deck, a huge wave hit, and I almost fell overboard.
Got up to the bridge. Captain just laughed and told me to fuck off.
When I got back, they had untied the knot.
I never lived it down.
We tested if the reservoir of GPU fluid is low, but after we put the transistors on the megahertz scale it's pretty clear that your monitor won't ever go into X mode. It's completely out of sync. The only thing left is to update the source codes, or you'll never boot again.