
GPU Fluid - pablode
https://benedikt-bitterli.me/gpu-fluid.html
======
tehsauce
His 3D fluid simulation and renderer is even more beautiful!
[https://benedikt-bitterli.me/vorton-fluid.html](https://benedikt-bitterli.me/vorton-fluid.html)

~~~
pantalaimon
Doesn't seem to be on GitHub, unfortunately.

------
andrewwharton
I came across a nice video/presentation from some guys at SpaceX on using GPUs
to model and visualise the fluid flows inside rocket engines (specifically the
Raptor engine) and around spacecraft during reentry, as well as a good
explanation of how they do it efficiently using adaptive grids. Their
explanations are quite approachable as well.

[https://www.youtube.com/watch?v=txk-VO1hzBY](https://www.youtube.com/watch?v=txk-VO1hzBY)

~~~
Bromskloss
Do they have code available?

~~~
tobmlt
Note the citations at the end of the video (44:30 mark). Code from some of
the original research the SpaceX work was based on, adaptive wavelet
collocation methods, used to be online in a Bitbucket repo here:
[https://git.plgrid.pl/scm/tmh/wavelet-original-code.git](https://git.plgrid.pl/scm/tmh/wavelet-original-code.git)

Unfortunately it seems to have been taken down. A shame - it was really
everything! But as a huge CFD library research code it is a bit hard to sift
through anyway.

I'd recommend the papers, then. Stuff like this, from these guys (Jonathan
Regele and Oleg Vasilyev), led to the SpaceX software:
[https://www.researchgate.net/publication/239395184_An_adapti...](https://www.researchgate.net/publication/239395184_An_adaptive_wavelet-collocation_method_for_shock_computations)

An earlier starting point might be something like this:
[http://ms.mcmaster.ca/kevla/pdf/articles/Kevlahan_SISC_v26.p...](http://ms.mcmaster.ca/kevla/pdf/articles/Kevlahan_SISC_v26.pdf)

This line of work dates back to the mid 90s as you will see from the sources.

------
diimdeep
He also made this 2D light renderer
[http://htmlpreview.github.io/?https://github.com/tunabrain/t...](http://htmlpreview.github.io/?https://github.com/tunabrain/tantalum/blob/master/tantalum.html)

------
lossolo
If you like things like fluid simulation and would like to create realistic
simulations yourself, I'd recommend RealFlow [1]. It's easy to model huge,
real-life-looking fluid simulations with physical objects in this software. You
can have it create things like foam automatically. You don't need any knowledge
of programming or physics: just set the source, set up your scene, click
simulate and watch the magic happen. It's only a hobby for me, but it's great
fun.

Some simple examples:

Seaplane Water Simulation:
[https://www.youtube.com/watch?v=R9TZNiZYhq0](https://www.youtube.com/watch?v=R9TZNiZYhq0)

Some nice features:
[https://www.youtube.com/watch?v=PuS1zD6kCyc](https://www.youtube.com/watch?v=PuS1zD6kCyc)

Made with realflow:
[https://www.youtube.com/watch?v=nnv-95w1d5A](https://www.youtube.com/watch?v=nnv-95w1d5A)

[1] [http://www.nextlimit.com/realflow/](http://www.nextlimit.com/realflow/)

~~~
fwilliams
It's worth pointing out that particle-based fluid simulators are not
physically accurate. They compute the positions of zero-volume particles in
space at every frame (fluid has volume). This is convenient for performance
purposes since you can evaluate each particle position mostly in parallel and
the results look convincing enough for a movie or a game.

In contrast, mesh-based (and grid-based like in the article) simulators
approximate the Navier-Stokes equations and converge to the exact solution as
the mesh element sizes approach zero.

Mesh-based simulators solve a global system of equations in every frame. This
system has variables equal in number to total mesh elements over the entire
domain. Solving such a large system (especially in 3D) is very expensive.
Furthermore, the size and shape of the mesh elements drastically affect the
numerical stability of the solution, and generating good meshes is not a
solved problem in all cases. For these reasons, we really only use mesh-based
simulators when we need a high degree of scientific accuracy.
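
To make the "global system" point concrete, here is a minimal sketch (my own NumPy illustration, not the article's GPU code) of a Jacobi iteration for the 2D pressure Poisson equation that grid-based solvers have to solve every frame; each cell depends on its four neighbours, which couples the whole grid together:

```python
import numpy as np

def jacobi_poisson(div, iters=200):
    """Solve the pressure Poisson equation lap(p) = div on a 2D grid
    with unit spacing and zero pressure on the boundary, using plain
    Jacobi iteration."""
    p = np.zeros_like(div)
    for _ in range(iters):
        p_new = np.zeros_like(p)
        # Each interior cell becomes the average of its four neighbours
        # minus the local divergence; this is the source of the coupling.
        p_new[1:-1, 1:-1] = 0.25 * (p[2:, 1:-1] + p[:-2, 1:-1] +
                                    p[1:-1, 2:] + p[1:-1, :-2] -
                                    div[1:-1, 1:-1])
        p = p_new
    return p
```

Jacobi is chosen here only because each sweep is trivially parallel (every cell updates independently from the previous iterate), which is exactly the property that maps well to GPUs; production solvers typically use conjugate gradient or multigrid instead.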

~~~
keldaris
> It's worth pointing out that particle-based fluid simulators are not
> physically accurate.

Depends on what you mean by "particle-based". Lattice Boltzmann models, given
specific constraints, can easily be shown to converge to the Navier-Stokes
equations (and, if desired, even higher-order fluid dynamics models) via the
Chapman-Enskog expansion.
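
For anyone curious what that looks like in practice, here is a bare-bones D2Q9 BGK stream-and-collide update (a hypothetical NumPy sketch of my own; real LBM codes add boundary conditions and forcing):

```python
import numpy as np

# D2Q9 lattice: 9 discrete velocities and their weights.
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, u):
    """BGK equilibrium distribution, second order in the velocity u."""
    cu = np.einsum('id,xyd->xyi', c, u)               # c_i . u per cell
    usq = np.einsum('xyd,xyd->xy', u, u)[..., None]   # |u|^2 per cell
    return rho[..., None] * w * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def lbm_step(f, tau=0.8):
    """One stream-and-collide update of the D2Q9 BGK model on a
    periodic grid; f has shape (nx, ny, 9)."""
    rho = f.sum(axis=-1)                              # density moment
    u = np.einsum('xyi,id->xyd', f, c) / rho[..., None]  # momentum moment
    f = f + (equilibrium(rho, u) - f) / tau           # collide: relax to eq.
    for i, (cx, cy) in enumerate(c):                  # stream: shift along c_i
        f[..., i] = np.roll(np.roll(f[..., i], cx, axis=0), cy, axis=1)
    return f
```

Both the collision (which conserves mass and momentum by construction) and the streaming step are purely local, which is what makes LBM such a natural fit for GPUs.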

~~~
whyever
How can you show convergence when it is not shown that there is a unique
solution to Navier-Stokes?

~~~
pzone
There's no contradiction. You show that whatever it converges to satisfies
Navier-Stokes.

(Maybe there's another Navier-Stokes solution with the same initial
conditions, maybe it's unique, doesn't matter!)

------
munificent
His page needs a trigger warning for Imposter Syndrome.

------
grewil2
For those who would like to round off this excellent post with some classy,
non-GPU, retro-platform fluid, please check out Fluid Fire, a great little 1k
Amiga bootsector intro.

[https://www.youtube.com/watch?v=OKatv85Bei4](https://www.youtube.com/watch?v=OKatv85Bei4)

------
fhood
In the Hot Black Smoke video, how does the asymmetry occur? Is it due to
imperfections in the calculations?

~~~
godelski
Why would you expect symmetry? Smoke from a real flame does not have symmetry.

~~~
usmannk
I think they're asking how to create the asymmetry and if the imperfections in
the calculations are exploited to do so.

------
milesokeefe
source: [https://github.com/tunabrain/gpu-fluid](https://github.com/tunabrain/gpu-fluid)

------
msoad
There is a Reddit community for Generative Art:

[https://www.reddit.com/r/generative/](https://www.reddit.com/r/generative/)

Computer generated art is fascinating

------
nicolashahn
Beautiful. If he were to sell hi-res prints of some of these, I'm sure people
would buy them.

------
tqkxzugoaupvwqr
I watched it fullscreen on a mobile phone and in the later stages it was
mesmerizing. A fluid simulation as phone screen saver would be awesome.

~~~
slavik81
Sadly, a fluid simulation as a phone screen saver would be a battery killer.

~~~
jdironman
How about, a video / animation of one instead of a real time one?

(I'm not sure how phone screen savers even work. I actually didn't even know
you could set one on phones.)

------
gattr
Ah, brings back fond memories. I always loved physics simulations and managed
to implement a fluid solver (after reading [1]) back in 2005. When I got the
basic 2D version working, I spent maybe 20 minutes just stirring the vortices
with a mouse - so fun!

[1] Visual Simulation of Smoke
(physbam.stanford.edu/~fedkiw/papers/stanford2001-01.pdf)

------
coling
> white fluid and heavy, black fluid flow past each other, creating the
> Kelvin-Helmholtz instability

This reminded me of the recent Jupiter images:
[https://solarsystem.nasa.gov/planets/jupiter/galleries](https://solarsystem.nasa.gov/planets/jupiter/galleries)

------
kbutler
I've just been thinking about car repair, so I had hopes of something more
unusual here, maybe like the blue smoke of some electronics.
[https://en.wikipedia.org/wiki/Magic_smoke](https://en.wikipedia.org/wiki/Magic_smoke)

------
p_eter_p
The most impressive version of this I have seen is from NASA/Boeing:
[https://youtu.be/-D5N_OnZ_Tg](https://youtu.be/-D5N_OnZ_Tg)

I cannot imagine the amount of work that goes into something like this.

------
npgatech
I am wondering: do engineering simulations for fluids take advantage of GPUs? I
studied CFD back in 2005 or so; we used ANSYS Fluent as a solver and it took
forever to converge with sufficient accuracy.

If anyone in engineering has insight into this, I'd love to know.

Technically, non-fluid simulations could also be sped up using GPUs? Dynamics,
solid state mechanics, thermal simulations, etc.

~~~
NamTaf
In my experience from solid mechanics, generally no. I don’t know exactly why,
but my guess would be that there’s a significant lag time in the development of
these packages. They are conservative and heavily favour accuracy over speedy
new techniques.

Often they use back-end solvers that are very old. For example, I use FEMAP
professionally and it’s essentially a pre- and post-processor over NASTRAN,
which is way older than me. Adding GPGPU to it would be difficult indeed, and
no one will pick a less accurate new solver without it having been robustly
proven (it’s a chicken-and-egg problem, in a way).

Also, GPUs aren’t suited to all problems. You still have the memory limits of
GPUs, which aren’t as large as traditional RAM (I see no GPUs with 32-64GB of
RAM). They’re not the silver bullet people sometimes hope for.

Lastly, the people who do this are surprisingly less overlapping with the
flashy new IT development-aware crowd than you’d expect. They’re not Silicon
Valley types with their finger on the pulse of the latest and greatest. Most
just use PCs as a tool and wouldn’t know the benefits GPGPU could provide. To
them, video cards are just video cards.

~~~
fermienrico
I understand the difficulty with legacy code, but your opinion about "Silicon
Valley" types, etc. is very off-putting.

The world needs to evolve. Those who do not, disappear.

If there is a way without sacrificing accuracy, then GPU computation would be
absolutely amazing. Imagine the productivity boost, time savings, power and
infrastructure savings you'd gain by using a GPU (if it is possible that is).
Imagine being able to simulate engineering problems in near real time without
having to wait for hours for a solver to converge.

I don't have experience with the solvers, but if there is a way to enable GPU
computation, then why the hell not!? Silicon Valley types or not. People who
are engaged in CFD (I used to work on Lockheed's flutter dynamics team) are
certainly not "old fashioned" as you describe. A GPU to them wouldn't just be a
"kid's video game toy", I can assure you, having worked with these folks.

~~~
bauta-steen
Perhaps the main issue is that even if a PDE solver supports distributed CPU
parallelism, the distributed block solvers do not typically allow for
decoupling into the thousands of independent threads that GPUs are good at.
Because the PDE problems and solvers are tightly coupled, they do not easily
parallelize to GPUs and don't allow for simple recompilation with GPU targets.
Most often an existing code would require a complete rewrite/redesign
(man-years of work for big code bases), at least if there are to be any gains
to be had. There are new codes coming, particularly in academia, utilizing
GPUs. From what I've seen, one can expect around a 10x improvement switching to
GPUs, so it's good, but not orders of magnitude better considering the work
involved.

------
ncmncm
This posting makes me unaccountably happy.

------
bane
Reminds me of this recent demoscene production: backwards fluid dynamics
that generates bad poetry text in 8kb.

[https://www.pouet.net/prod.php?which=62974](https://www.pouet.net/prod.php?which=62974)

------
k_138z
Graphics programming seems very interesting to me, especially after
watching cool stuff like this. Can anyone point me to good resources where
I can start learning it?

~~~
starpilot
What he's doing, writing CFD codes from scratch, is basically all math. You
should be pretty comfortable solving PDEs by hand, then learn about the
numerical solutions.
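
As a first taste of the "numerical solutions" side, something like an explicit finite-difference scheme for 1D diffusion is about the simplest PDE solver one can write (a toy sketch of my own, not from any particular resource):

```python
import numpy as np

def diffuse(u, nu=0.1, dt=0.1, steps=100):
    """Explicit finite-difference solve of the 1D diffusion equation
    u_t = nu * u_xx on a unit-spaced grid with fixed endpoints.
    Stable as long as nu*dt <= 0.5 (the usual stability restriction
    for forward Euler in time, central differences in space)."""
    u = u.copy()
    for _ in range(steps):
        # Second difference approximates u_xx at each interior point.
        u[1:-1] += nu * dt * (u[2:] - 2*u[1:-1] + u[:-2])
    return u
```

Running this on an initial spike shows the peak smearing out over time, exactly the behaviour the analytical solution (a spreading Gaussian) predicts; comparing the two is a good first exercise.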

------
spot
much more CFD, still images, animations, artworks:
[http://markjstock.com/](http://markjstock.com/)

------
amitbr
Fluid simulation is hard. I found that a curl-noise-based fluid simulation
works quite well, especially for mobile game development.
------
tezza
A realtime fluid solution is available in Unity: MegaFlow.

It's less detailed, but realtime on the CPU.

[https://www.assetstore.unity3d.com/en/#!/content/24340](https://www.assetstore.unity3d.com/en/#!/content/24340)

There's also Fluvio, which has GPU acceleration.

------
amelius
Wouldn't a multigrid approach yield a more scalable acceleration compared to
using a GPU?

~~~
electricslpnsld
Probably depends on the resolution -- I've found a good GPU-based CG
implementation can chew through moderately sized problems faster than
multigrid. I would also imagine the blog author could get away with some
fairly loose tolerances on the Poisson solve (at the cost of some
compressibility) and still have the motion look great.
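
For illustration, a matrix-free conjugate gradient in NumPy (my own sketch, not the article's code), where `tol` is the loose-tolerance knob the comment above describes; on a GPU, `apply_A` would be the stencil kernel applied to the pressure grid:

```python
import numpy as np

def conjugate_gradient(apply_A, b, tol=1e-2, max_iter=100):
    """Matrix-free CG: apply_A(x) computes A @ x for a symmetric
    positive-definite A. A loose tol trades residual accuracy (a bit
    of compressibility, for a fluid's Poisson solve) for far fewer
    iterations."""
    x = np.zeros_like(b)
    r = b - apply_A(x)         # initial residual
    p = r.copy()               # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs) < tol * np.linalg.norm(b):
            break              # relative residual below tolerance
        Ap = apply_A(p)
        alpha = rs / (p @ Ap)  # optimal step along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p   # next A-conjugate direction
        rs = rs_new
    return x
```

Every operation here is a dot product, a vector update, or one application of the operator, all of which parallelize well, which is why CG is a common choice for GPU Poisson solves.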

------
quickthrower2
I found them very relaxing. Like watching your coffee.

~~~
noobermin
Many of the instabilities you see in his sims are familiar from everyday
observation, like mixing cream into coffee. See for example, the following two
which are the most familiar:

[https://en.wikipedia.org/wiki/Rayleigh%E2%80%93Taylor_instab...](https://en.wikipedia.org/wiki/Rayleigh%E2%80%93Taylor_instability)

[https://en.wikipedia.org/wiki/Kelvin%E2%80%93Helmholtz_insta...](https://en.wikipedia.org/wiki/Kelvin%E2%80%93Helmholtz_instability)

------
drefanzor
OFF-TOPIC: When someone has a problem with their computer, I'm going to start
telling them they need GPU fluid.

Jokes aside, this is a beautiful simulation of 2D fluid dynamics pumped to the
screen using OpenGL.

~~~
mar77i
When the apprentice has nothing else to do, send him to buy some GPU fluid.

~~~
bwilliams18
If you have a liquid cooled GPU then GPU Fluid is a real thing...

~~~
userbinator
That's exactly what I thought this item would be about before I clicked it.

