GPU Fluid (benedikt-bitterli.me)
532 points by pablode 8 months ago | 79 comments



His 3D fluid simulation and renderer is even more beautiful! https://benedikt-bitterli.me/vorton-fluid.html


This is mind blowing!


Doesn't seem to be on GitHub, unfortunately.


I came across a nice video/presentation from some guys at SpaceX on using GPUs to model and visualise the fluid flows inside rocket engines (specifically the Raptor engine) and around spacecraft during reentry, as well as a good explanation of how they do it efficiently using adaptive grids. Their explanations are quite approachable as well.

https://www.youtube.com/watch?v=txk-VO1hzBY


Do they have code available?


Note the citations at the end of the video (44:30 mark). Code from some of the original research the SpaceX work was based on, adaptive wavelet collocation methods, used to be online in a Bitbucket repo here: https://git.plgrid.pl/scm/tmh/wavelet-original-code.git

Unfortunately it seems to have been taken down. A shame - it really had everything! But as a huge CFD research library, the code is a bit hard to sift through anyway.

I'd recommend the papers, then. Work like this, from Jonathan Regele and Oleg Vasilyev, led to the SpaceX software: https://www.researchgate.net/publication/239395184_An_adapti...

An earlier starting point might be something like this: http://ms.mcmaster.ca/kevla/pdf/articles/Kevlahan_SISC_v26.p...

This line of work dates back to the mid 90s as you will see from the sources.


Very unlikely; this software is an enormous commercial advantage. Not so much because of the cost of the software itself, but because of the advantage of speeding up the development of engines, etc. through shortened iterations. SpaceX developed this because they couldn't find any reasonable solutions already on the market. I suspect that "reasonable" included not only cost and availability but also the ability to adjust the product to their needs.



If you like things like fluid simulation and would like to create realistic simulations, I would recommend RealFlow [1]. It's easy to model huge, real-life-looking fluid simulations with physical objects in this software. You can choose to have things like foam created automatically. You don't need any knowledge of programming or physics: just set the source, set up your scene, click simulate, and watch the magic happen. It's only a hobby for me, but it's great fun.

Some simple examples:

Seaplane Water Simulation: https://www.youtube.com/watch?v=R9TZNiZYhq0

Some nice features: https://www.youtube.com/watch?v=PuS1zD6kCyc

Made with realflow: https://www.youtube.com/watch?v=nnv-95w1d5A

[1] http://www.nextlimit.com/realflow/


It's worth pointing out that particle-based fluid simulators are not physically accurate. They compute the positions of zero-volume particles in space at every frame (fluid has volume). This is convenient for performance purposes since you can evaluate each particle position mostly in parallel and the results look convincing enough for a movie or a game.
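
To illustrate the idea, here is a toy sketch of a per-particle update (my own minimal NumPy example, not any particular solver; a real SPH code would also add pressure and viscosity forces computed from neighbouring particles):

    import numpy as np

    # Toy particle advection: each row is updated independently, which is
    # why this style of simulation parallelizes so easily on a GPU.
    N, dt = 10_000, 1.0 / 60.0
    pos = np.random.rand(N, 3)              # zero-volume particle positions
    vel = np.zeros((N, 3))                  # particle velocities
    gravity = np.array([0.0, -9.81, 0.0])

    def step(pos, vel, dt):
        vel = vel + gravity * dt            # external forces only in this sketch
        pos = pos + vel * dt                # symplectic Euler update
        return pos, vel

    pos, vel = step(pos, vel, dt)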

In contrast, mesh-based (and grid-based like in the article) simulators approximate the Navier-Stokes equations and converge to the exact solution as the mesh element sizes approach zero.

Mesh-based simulators solve a global system of equations in every frame. This system has as many variables as there are mesh elements over the entire domain. Solving such a large system (especially in 3D) is very expensive. Furthermore, the size and shape of the mesh elements drastically affect the numerical stability of the solution, and generating good meshes is not a solved problem in all cases. For these reasons, we really only use mesh-based simulators when we need a high degree of scientific accuracy.


> It's worth pointing out that particle-based fluid simulators are not physically accurate.

Depends on what you mean by "particle-based". Lattice Boltzmann models, given specific constraints, can be easily shown to converge to the Navier-Stokes (and, if desired, even higher order fluid dynamics models) equations via the Chapman-Enskog expansion.
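
For reference, a bare-bones D2Q9 BGK lattice Boltzmann step looks roughly like this (my own simplified NumPy sketch, not taken from any specific code; boundary conditions and forcing are omitted):

    import numpy as np

    # D2Q9 lattice: 9 discrete velocities and their weights.
    c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                  [1, 1], [-1, 1], [-1, -1], [1, -1]])
    w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)
    tau = 0.6                                     # relaxation time (sets viscosity)

    nx, ny = 256, 256
    f = np.ones((9, nx, ny)) * w[:, None, None]   # distributions at rest

    def lbm_step(f):
        # Macroscopic density and velocity (the moments of f).
        rho = f.sum(axis=0)
        u = np.einsum('iab,id->dab', f, c) / rho

        # Equilibrium distribution; the Chapman-Enskog expansion around this
        # is what recovers the Navier-Stokes equations.
        cu = np.einsum('id,dab->iab', c, u)
        usq = (u ** 2).sum(axis=0)
        feq = w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

        # BGK collision, then streaming along each lattice direction.
        f = f - (f - feq) / tau
        for i in range(9):
            f[i] = np.roll(f[i], shift=tuple(c[i]), axis=(0, 1))
        return f

    f = lbm_step(f)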


How can you show convergence when it is not shown that there is a unique solution to Navier-Stokes?


There's no contradiction. You show that what it converges to satisfies Navier-Stokes.

(Maybe there's another Navier-Stokes solution with the same initial conditions, maybe it's unique, doesn't matter!)


> It's worth pointing out that particle-based fluid simulators are not physically accurate.

Do you have some references for this? I was under the impression that convergence results have been shown for purely Lagrangian approaches like SPH.


His page needs a trigger warning for Imposter Syndrome.


For those who would like to round off this excellent post with some classy, non-GPU, retro-platform fluid, please check out Fluid Fire, a great little 1k Amiga bootsector intro.

https://www.youtube.com/watch?v=OKatv85Bei4


In the Hot Black Smoke video, how does the asymmetry occur? Is it due to imperfections in the calculations?


Fluid dynamicist here. If a simulation is too symmetric, one can insert random noise into the initial conditions. I've done this in the past, and it's fine, even physically justified as reality is rarely so perfect. There are measures of the noise that one can match.


How do you ensure the initial conditions are divergence-free after introducing the noise?


Great question. I don't recall precisely what the research code I used back then did. The velocity field for constant-mass-density incompressible flow needs to be divergence-free, but other variables and other situations do not have the same restriction (though they might need to satisfy some other conditions). The code might have introduced noise in the temperature field.

I recall that there are algorithms for generating divergence-free random fields, but I'm not particularly familiar with them. In another code I worked with when I was younger, I recall discussing this issue with a much more senior researcher, and if I remember correctly he said not to worry about the initial condition violating the divergence constraint (the divergence was non-zero but known), as the numerical method will enforce the constraint for all future time steps. Presumably one could modify the algorithm (this was a "pressure projection scheme") to take an arbitrary velocity field along with the desired divergence field and correct the initial velocity field to be divergence-free. This may not preserve the desired statistical properties of the noise, however.


Small perturbations in the velocity field are totally fine even if they are not divergence-free. Most fluid dynamics codes use a discrete Poisson equation to enforce conservation of mass (i.e. a divergence-free velocity field). Here is some background: https://en.wikipedia.org/wiki/Discrete_Poisson_equation#Appl...
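
Roughly, a projection step on a 2D grid can look like this (a simplified sketch of my own using Jacobi iteration on a collocated grid; production codes typically use staggered grids and better solvers):

    import numpy as np

    def project(u, v, h, iters=60):
        # Discrete divergence of the velocity field (central differences, interior only).
        div = np.zeros_like(u)
        div[1:-1, 1:-1] = ((u[1:-1, 2:] - u[1:-1, :-2]) +
                           (v[2:, 1:-1] - v[:-2, 1:-1])) / (2 * h)

        # Jacobi iterations on the discrete Poisson equation  laplacian(p) = div.
        p = np.zeros_like(u)
        for _ in range(iters):
            p[1:-1, 1:-1] = (p[1:-1, 2:] + p[1:-1, :-2] +
                             p[2:, 1:-1] + p[:-2, 1:-1] -
                             h * h * div[1:-1, 1:-1]) / 4.0

        # Subtract the pressure gradient; what remains is (approximately) divergence-free.
        u[1:-1, 1:-1] -= (p[1:-1, 2:] - p[1:-1, :-2]) / (2 * h)
        v[1:-1, 1:-1] -= (p[2:, 1:-1] - p[:-2, 1:-1]) / (2 * h)
        return u, v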


One trick would be to generate a random field and then take its curl. Perhaps there are better ways though.
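
Something like this, for instance (a small 2D sketch of my own; scipy's gaussian_filter is used just to smooth the noise): take a random scalar potential and use its curl as the velocity perturbation, which is divergence-free by construction, up to discretisation error at the boundary.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def divergence_free_noise(nx, ny, h=1.0, sigma=4.0, seed=0):
        # Smoothed random scalar potential psi.
        rng = np.random.default_rng(seed)
        psi = gaussian_filter(rng.standard_normal((nx, ny)), sigma)

        # 2D curl of psi: (u, v) = (d psi / d axis1, -d psi / d axis0).
        dpsi_d0, dpsi_d1 = np.gradient(psi, h)
        return dpsi_d1, -dpsi_d0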


Yes, numerical error breaks the symmetry. It could be as small as the rounding error due to finite precision, but most likely a few orders of magnitude bigger thanks to errors in the numerical method.

In reality, even if you can manage perfectly symmetric initial and boundary conditions, thermal noise would still cause this symmetry breaking.

Since these systems are physically unstable, the source of initial perturbation is unimportant as long as it is small.


As an addition to my other comment: I've read and heard that some fluid dynamics researchers/engineers rely on numerical error to break symmetries and lead to turbulence, etc., but it would be hard to call this physical. I think a better approach would be, as I said in my other comment, adding noise to the initial condition. Relying on numerical error is particularly problematic for studies of the transition to turbulence, as presumably such a phenomenon depends strongly on the details of the accumulation of errors/noise, and relying on numerical errors is essentially adding noise without quantifying it.

To be clear, it absolutely is correct to say that this would lead to symmetry breaking. Depending on the numerical method and the computer, it might take a long time, if I recall correctly.


Why would you expect symmetry? Smoke from a real flame does not have symmetry.


I think they're asking how to create the asymmetry and if the imperfections in the calculations are exploited to do so.


Wouldn't that physical analogy be caused by factors unknown to this algorithm though? Air flux, non-homogeneous air, air impurity etc.

Would flames not be symmetrical if all of the above, plus everything imaginable like quantum physics, solar activity, dark matter, whatever, were controlled for?


If we're going to the level of QM, then no, because QM is inherently random. Radioactive decay, for example, is a great way to make true random number generators. The real world has truly random events, and while that bothers many people, physicists don't generally argue about this point (anymore).

Maybe if you had a perfectly homogeneous air density (which you'd never see, even in a really good vacuum near absolute zero temperature; you can thank QM for that) that stayed homogeneous forever. The problem in real-world conditions is that nothing is perfectly homogeneous (entropy exists), so things will heat up unevenly, certain parts will gain more velocity than others, and just a small offset can create a large change in outcome. The short answer is that these systems are chaotic in nature, so they are not stable.

But that doesn't have much to do with simulations (except for what you are trying to emulate). In a simulation, numerical accuracy plays a role, but what you really look for is whether the result is realistic, because the real world has random events. That's more what I was trying to get at. You want something that represents reality, not an oversimplified example you can't use in a meaningful way. Even with these inaccuracies you can get representative models (they will reflect what happens in a physical experiment). I say representative because you aren't going to account for all those factors in a simulation, but you are accurate enough to draw extremely effective conclusions. I'll also note that some people add random noise to their simulations (I don't know if this author did, but numerical error can play that role).


> The real world has truly random events, and while it bothers many people..

Replace the word random with unpredictable and it shows why some people are bothered.

To say something is fundamentally unpredictable means that it is immeasurable and not merely complex.


Random is the word we use in physics. It isn't wrong; you're basically just giving the definition of random. True random, not pseudo-random, which is what computers give you (and is usually called pseudo-random explicitly). And you can measure the events, you just can't PREDICT them with certainty. Though you can predict them stochastically.


> To say something is fundamentally unpredictable means that it is immeasurable.

No, it just means that the measure cannot be predicted in advance.

OTOH, the observer effect can make measurement problematic.


This is true (the Heisenberg uncertainty principle says more that you can only get so much resolution in your measurement), but I was staying more general because there are other events that are random. Radioactive decay is completely random and isn't affected by an observer.


That is because you don't actually observe the radioactive decay event; you observe its byproducts. And the presence of an observer of those byproducts certainly changes their behavior (attenuation/scattering, energy, absorption, depending on the detection method), as opposed to if there were no observer looking at that byproduct.


I was more referencing the actual event of decay. Sure, the byproducts exhibit the standard quantum effects; they are particles, after all.

And just to be clear, you do not think an observer has to be a conscious being, right? I only ask because pop culture science gives this impression. A photon can be an observer.



There is a Reddit community for Generative Art:

https://www.reddit.com/r/generative/

Computer-generated art is fascinating.


Beautiful. If he were to sell hi-res prints of some of these I'm sure people would buy them.


I watched it fullscreen on a mobile phone and in the later stages it was mesmerizing. A fluid simulation as phone screen saver would be awesome.


Sadly, a fluid simulation as a phone screen saver would be a battery killer.


How about, a video / animation of one instead of a real time one?

(I'm not sure how phone screen savers even work. I actually didn't even know you could set one on phones.)


With inputs from the motion sensors, even more so.


I've used this as a live wallpaper on my phones for quite a few years. It's a nice fast effect, not truly like this example but gets close enough to the idea. I bought the full version too to customize some colors.

https://play.google.com/store/apps/details?id=com.formisk.al...


I thought the iPhone X did this (I haven't used one) but after looking it up it's just a video file [0]

[0] http://universaleverything.com/projects/iphone-x-wallpapers/


Ah, brings back fond memories. I always loved physics simulations and managed to implement a fluid solver (after reading [1]) back in 2005. When I got the basic 2D version working, I spent maybe 20 minutes just stirring the vortices with a mouse - so fun!

[1] Visual Simulation of Smoke (physbam.stanford.edu/~fedkiw/papers/stanford2001-01.pdf)


> white fluid and heavy, black fluid flow past each other, creating the Kelvin-Helmholtz instability

This reminded me of the recent Jupiter images: https://solarsystem.nasa.gov/planets/jupiter/galleries


I'd just been thinking about car repair, so I had hopes of something more unusual here, maybe like the blue smoke of some electronics. https://en.wikipedia.org/wiki/Magic_smoke


The most impressive version of this I have seen is from NASA/Boeing: https://youtu.be/-D5N_OnZ_Tg

I cannot imagine the amount of work that goes into something like this.


I am wondering whether engineering simulations for fluids take advantage of GPUs. I studied CFD back in 2005 or so; we used ANSYS Fluent as a solver, and it took forever to converge with sufficient accuracy.

If anyone in engineering has insight into this, I'd love to know.

Technically, non-fluid simulations could also be sped up using GPUs? Dynamics, solid state mechanics, thermal simulations, etc.


In my experience from solid mechanics, generally no. I don't know exactly why, but my guess would be that there's a significant lag in the development of these packages. They are conservative and heavily favour accuracy over speedy new techniques.

Often they use back-end solvers that are very old. For example, I use FEMAP professionally and it's essentially a pre- and post-processor over NASTRAN, which is way older than me. Adding GPGPU to it would be difficult indeed, and no one will pick a less accurate new solver without it having been robustly proven (it's a chicken-and-egg problem, in a way).

Also, GPUs aren't suited to all problems. You still have the memory limits of GPUs, which aren't as large as traditional RAM (I see no GPUs with 32-64GB of RAM). They're not the silver bullet people sometimes hope for.

Lastly, the people who do this overlap less with the flashy, IT-development-aware crowd than you'd expect. They're not Silicon Valley types with their finger on the pulse of the latest and greatest. Most just use PCs as a tool and wouldn't know the benefits GPGPU could provide. To them, video cards are just video cards.


There are plenty of scientific and engineering fluid dynamics codes that take advantage of GPUs. The current fastest supercomputer in the US, Titan at ORNL, uses GPUs. Here’s a CFD code that ran on it: https://www.olcf.ornl.gov/2014/06/13/ramgen-takes-turbomachi.... ORNL likes their OpenACC; here’s a GTC presentation on the code: http://on-demand.gputechconf.com/gtc/2014/presentations/S475...

LLNL and ORNL are right now rolling out two new gigantic GPU systems, Sierra and Summit: https://www.google.com/amp/s/www.nextplatform.com/2017/09/19.... There will be plenty more GPU CFD simulations running on those systems.


I understand the difficulty with legacy code, but your opinion about "Silicon Valley" types, etc. is very off-putting.

The world needs to evolve. Those who do not, disappear.

If there is a way to do it without sacrificing accuracy, then GPU computation would be absolutely amazing. Imagine the productivity boost, time savings, and power and infrastructure savings you'd gain by using a GPU (if it is possible, that is). Imagine being able to simulate engineering problems in near real time without having to wait hours for a solver to converge.

I don't have experience with the solvers, but if there is a way to enable GPU computation, then why the hell not!? Silicon Valley types or not. People who are engaged in CFD (I used to work on Lockheed's flutter dynamics team) are certainly not "old fashioned" as you describe. A GPU to them wouldn't just be a "kid's video game toy" - I can assure you of that, having worked with these folks.


Perhaps the main issue is that even if a PDE solver supports distributed CPU parallelism, the distributed block solvers typically do not decouple into the thousands of independent threads that GPUs are good at. Because the PDE problems and solvers are tightly coupled, they do not easily parallelize to GPUs and don't allow for simple recompilation with GPU targets. Most often an existing code would require a complete rewrite/redesign (man-years of work for big code bases), at least if there are to be any gains. New codes utilizing GPUs are coming, particularly in academia. From what I've seen, one can expect around a 10x improvement switching to GPUs, so it's good, but not orders of magnitude better considering the work involved.



This posting makes me unaccountably happy.


Reminds me of this recent demoscene production: backwards fluid dynamics that generates bad poetry text in 8kb.

https://www.pouet.net/prod.php?which=62974


Graphics programming seems very interesting to me, especially after watching cool stuff like this. Can anyone point me to good resources where I can start learning it?


What he's doing, writing CFD codes from scratch, is basically all math. You should be pretty comfortable solving PDEs by hand, then learn about the numerical solutions.


Unfortunately it takes a lot of persistence, because mathematics is a huge prerequisite. Most people want the bells and whistles but aren't willing to hunker down and learn partial differential equations and other context, which can be less buzzworthy and somewhat mundane. I think math is cool though; it's the language of complexity.


You can also start with simpler things, like drawing 3D graphs of equations, rendering and shading 3D shapes, and plotting Mandelbrot sets. For me, that was fun and visual, with simpler math. Example of sin(x)+cos(y): http://maxima-online.org/examples.html
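
For example, the sin(x)+cos(y) surface takes only a few lines with NumPy and matplotlib (a minimal sketch):

    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(-2 * np.pi, 2 * np.pi, 200)
    y = np.linspace(-2 * np.pi, 2 * np.pi, 200)
    X, Y = np.meshgrid(x, y)
    Z = np.sin(X) + np.cos(Y)

    fig = plt.figure()
    ax = fig.add_subplot(projection='3d')   # 3D surface plot of z = sin(x) + cos(y)
    ax.plot_surface(X, Y, Z, cmap='viridis')
    plt.show()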


Do you want to learn how to use graphics APIs to do cool stuff? Or create graphics APIs, using cool math? Or learn CPU/GPU innards?


Much more CFD, still images, animations, and artworks: http://markjstock.com/


Fluid simulation is hard. I found that a curl-noise-based fluid simulation works quite well, especially for mobile game development.


A realtime fluid solution is available in Unity: Megaflow.

Less detailed, but realtime on the CPU:

https://www.assetstore.unity3d.com/en/#!/content/24340

There is also Fluvio, which has GPU acceleration.


Wouldn't a multigrid approach yield more scalable acceleration than using a GPU?


Probably depends on the resolution -- I've found that a good GPU-based CG (conjugate gradient) implementation can chew through moderately sized problems faster than multigrid. I would also imagine the blog author could get away with some fairly loose tolerances on the Poisson solve (at the cost of some compressibility) and still have the motion look great.
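
To make that concrete, the core of an unpreconditioned CG solve for the pressure Poisson equation is just stencil applications and reductions, which is why it maps well to a GPU (a matrix-free sketch of my own, written in NumPy for readability):

    import numpy as np

    def apply_A(p, h):
        # Matrix-free negative 5-point Laplacian (SPD with zero Dirichlet boundaries).
        out = np.zeros_like(p)
        out[1:-1, 1:-1] = (4 * p[1:-1, 1:-1] -
                           p[1:-1, 2:] - p[1:-1, :-2] -
                           p[2:, 1:-1] - p[:-2, 1:-1]) / (h * h)
        return out

    def solve_poisson_cg(b, h, tol=1e-3, max_iters=500):
        # For pressure projection, b would be the negative divergence of the velocity.
        # A looser tol finishes sooner at the cost of a little residual compressibility.
        p = np.zeros_like(b)
        r = b - apply_A(p, h)
        d = r.copy()
        rs = np.sum(r * r)
        for _ in range(max_iters):
            Ad = apply_A(d, h)
            alpha = rs / np.sum(d * Ad)
            p += alpha * d
            r -= alpha * Ad
            rs_new = np.sum(r * r)
            if np.sqrt(rs_new) < tol:
                break
            d = r + (rs_new / rs) * d
            rs = rs_new
        return p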


Multigrid is a family of algorithms and a GPU is a piece of hardware.


I found them very relaxing. Like watching your coffee.


Many of the instabilities you see in his sims are familiar from everyday observation, like mixing cream into coffee. See for example, the following two which are the most familiar:

https://en.wikipedia.org/wiki/Rayleigh%E2%80%93Taylor_instab...

https://en.wikipedia.org/wiki/Kelvin%E2%80%93Helmholtz_insta...


OFF-TOPIC: When someone has a problem with their computer, I'm going to start telling them they need GPU fluid.

Jokes aside, this is a beautiful simulation of 2D fluid dynamics pumped to the screen using OpenGL.


In my teenage years, I went fishing aboard a long liner.

I shared my shift with four intimidating Icelandic men. I usually stood by the line, making sure that any fish that were hooked got inside the boat.

During one of my shifts, under heavy storms, an absolutely massive knot came up. I tried to untie it, but ended up having to call for help. The Icelanders went to town on it, but did not make a lot of progress.

In frustration, one of them screamed at me to go get the knot book from the captain. I ran as fast as I could. Up on deck, a huge wave hit, and I almost fell overboard.

Got up to the bridge. Captain just laughed and told me to fuck off.

When I got back, they had untied the knot.

I never lived it down.


We have a lot of these in the military (M203 blank adapter, box of grid squares, chemlight batteries, testing the armor on gun trucks or the shocks on armor like APCs or tanks, etc). But my favorite by far was convincing a FNG to collect a few exhaust samples from the Bradleys. This guy was a friend of mine in the civilian world and a really smart dude, but man he fell hook, line, and sinker for that one. He would climb up there on the vehicle with a trash bag, cover the massive exhaust port, then tell the driver to fire it up. It just blew thick, black engine smoke all over him, covering him in soot. Several times. When he finally managed to get a big cloud of exhaust in the trash bag (with the whole platoon standing around watching), I took it and opened the bag and looked in there and said, "Nah, man, this one's not quite good enough, gotta get another one"... Finally he ended up walking across the whole formation to the motor sergeant with this giant transparent trash bag full of engine exhaust, covered head to toe in black gunk, totally proud that he finally got a good sample.


When it releases the magic smoke, it means it's burnt through its supply of GPU fluid.


I hear you can get that at auto parts stores. It's right next to the blinker fluid.


I can see how you might think you have a virus, with that strange message box and those encrypted files, but you actually have a problem with your GPU.

We tested if the reservoir of GPU fluid is low, but after we put the transistors on the megahertz scale it's pretty clear that your monitor won't ever go into X mode. It's completely out of sync. The only thing left is to update the source codes, or you'll never boot again.


When the apprentice has nothing else to do, send him to buy some GPU fluid.


Off-topic, but I used to work at Domino's and the running gag for new hires was to yell at them to get the "dough repair kit" from the back in the middle of the Friday evening rush.


If you have a liquid cooled GPU then GPU Fluid is a real thing...


That's exactly what I thought this item would be about before I clicked it.


Just make sure they check whether the GPU fan spins left or right - right-handed GPU fluid will make the spurving bearings in a left-handed GPU seize up real quick.



