
KABOOM in 180 lines of bare C++ - haqreu
https://github.com/ssloy/tinykaboom/wiki
======
acidburnNSA
Add in some lookup tables from experimental nuclear interaction probability
data and you've got yourself a Monte Carlo radiation transport system capable
of simulating all sorts of systems in precise geometric detail. This was
indeed one of the early uses of computers back in the 1950s.

For example: [https://en.wikipedia.org/wiki/Monte_Carlo_N-
Particle_Transpo...](https://en.wikipedia.org/wiki/Monte_Carlo_N-
Particle_Transport_Code)

------
themodelplumber
Wow, this is very similar to the way I use procedural 3D textures and
displacement maps to draw planets and things. So much so that I don't even see
the explosion, just a growing planet (really, just tweak the mapped color
gradient a bit).

Example:

[https://www.friendlyskies.net/images/266.jpg](https://www.friendlyskies.net/images/266.jpg)

So much of 3D graphics is "hmm, this random thing I just made slightly
resembles an $X". "OK, so let's say I just created a method for modeling $X."
:-) Really fun stuff.

~~~
GuB-42
That's a common trick in PC 4k intros.

Despite the apparent variety, most 4k intros rely on a single effect, maybe
two. There isn't enough space for more. The only things that change are the
parameters.

Interestingly, the technique used here is signed distance field raymarching. A
technique that is used by maybe 90% of all 4k intros today.

So basically, write the code in GLSL instead of C, add music, play a bit with
the parameters, use a good exe packer like crinkler and you have a nice 4k
intro.

~~~
themodelplumber
That's really cool, in part because it's comforting to hear. :-) Sometimes I
feel a bit guilty about how much variety can be had from abusing a single
distance function + various texture maps, camera angles and focal settings,
environment colors, etc.

~~~
semi-extrinsic
The signed distance function also finds application in real physics
simulations of fluids, where it goes under the name of "level set method". Ron
Fedkiw, who's worked a lot on it, is one of the very few computational
physicists who
also holds an Academy Award. His homepage is very interesting:

[http://physbam.stanford.edu/~fedkiw/](http://physbam.stanford.edu/~fedkiw/)
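
For context, the level set method represents the fluid interface implicitly as
the zero set of a function \(\phi\), which is often maintained as a signed
distance function and advected with the flow. Schematically (a standard
textbook formulation, not specific to Fedkiw's papers):

```latex
\Gamma(t) = \{\mathbf{x} : \phi(\mathbf{x}, t) = 0\},
\qquad
\frac{\partial \phi}{\partial t} + \mathbf{v} \cdot \nabla \phi = 0,
\qquad
\lVert \nabla \phi \rVert = 1 \;\; \text{(signed distance property)}
```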

------
lbj
As pretty as this is, I recommend reading all of his tutorials. Lots of it is
old stuff that most of you will have seen takes on before, but he's got some
great angles and insights hidden in each post.

Start here:
[https://github.com/ssloy/tinyrenderer/wiki/Lesson-1:-Bresenh...](https://github.com/ssloy/tinyrenderer/wiki/Lesson-1:-Bresenham%E2%80%99s-Line-
Drawing-Algorithm)

------
nertzy
I was hoping that this was a clone of the Atari game.

[https://en.wikipedia.org/wiki/Kaboom!_(video_game)](https://en.wikipedia.org/wiki/Kaboom!_\(video_game\))

~~~
grumpyDev66
Me too!

------
jcoffland
Really well written. The other articles in the series on computer graphics are
excellent too. The use of GitHub to show diffs of each step is quite
effective.

~~~
mi_lk
> The use of GitHub to show diffs of each step is quite effective.

Little off-topic, but that's one of the reasons I like to use Magit on Emacs,
you can see the diff of each commit interactively and intuitively.

[1]:[https://magit.vc/](https://magit.vc/)

~~~
codetrotter
Git can show you the diff of any commit as well, though I guess maybe the
interactivity you mentioned is important.

Anyhow, with just git.

Diff of most recent commit:

    
    
      git show HEAD
    

Diff of preceding commit:

    
    
      git show HEAD^
    

Diff of the commit before that:

    
    
      git show HEAD^^
    

Of course you don’t want to type ^ 200 times. Fortunately you don’t have to.
Diff of the commit three steps before the most recent:

    
    
      git show HEAD~3
    

Or you can look at the log first

    
    
      git log
    

And find a specific commit and then use the first few characters of the id of
that commit, e.g.

    
    
      git show af4c
    

And of course I would be remiss not to mention

    
    
      git diff
    

And

    
    
      git diff --cached
    

These two show you unstaged and staged changes _before_ you commit. I use
these commands _all the time_. So much that they are two of the commands I
have created very short two-letter aliases for in my .bashrc

    
    
      alias di="git diff --cached"
      alias dp="git diff"

~~~
viraptor
Alternatively, you can use `tig` which allows you to browse the commit log,
diffs, and blames interactively. It's a terminal ui tool.

------
edoo
Impressive effect.

I didn't realize OpenMP was so easy to use. It isn't realtime, but you could
bake up some cool effects with this.

AMD FX8320 3.5GHz

    
    
      $ time ./tinykaboom
      real 0m4.176s
      user 0m28.631s
      sys  0m0.012s
    

~~~
nazri1
I saw the OpenMP pragma and thought to myself "neat! should be fun to watch
the cores work hard at this", went ahead and compiled and ran it, and smiled
at the 400% cpu usage in top.

    
    
        $ time ./tinykaboom 
        ./tinykaboom  78.08s user 0.02s system 369% cpu 21.159 total
    

Then I wondered how it would fare if I were to port it to Go, so I hastily
did, thinking "hmmm, this should run a bit slower than the C++ version". But
surprisingly it ran more than twice as fast:

    
    
        $ go build ./tinykaboom.go
        $ time ./tinykaboom 
        ./tinykaboom  34.32s user 0.03s system 368% cpu 9.315 total
    

[https://github.com/holygeek/tinykaboom/blob/master/tinykaboo...](https://github.com/holygeek/tinykaboom/blob/master/tinykaboom.go)

Here's the corresponding perf report:

Go:

    
    
        Samples: 103K of event 'cycles:pp', Event count (approx.): 37252033995665
        Overhead  Command     Shared Object      Symbol
          32.17%  tinykaboom  tinykaboom         [.] math.sin
          28.80%  tinykaboom  tinykaboom         [.] main.hash
          11.81%  tinykaboom  tinykaboom         [.] main.rotate
           7.76%  tinykaboom  tinykaboom         [.] math.Min
           5.18%  tinykaboom  tinykaboom         [.] main.lerpFloat64
           4.25%  tinykaboom  tinykaboom         [.] main.noise
           2.59%  tinykaboom  tinykaboom         [.] runtime.mallocgc
           2.59%  tinykaboom  tinykaboom         [.] main.fractal_brownian_motion
           2.58%  tinykaboom  tinykaboom         [.] main.signed_distance
    

c++:

    
    
        Samples: 234K of event 'cycles:pp', Event count (approx.): 86721459552303
        Overhead  Command     Shared Object        Symbol
          67.93%  tinykaboom  libm-2.23.so         [.] __sin_avx
          30.80%  tinykaboom  tinykaboom           [.] _Z5noiseRK3vecILm3EfE
           1.27%  tinykaboom  libm-2.23.so         [.] __floorf_sse41
           0.00%  tinykaboom  tinykaboom           [.] _Z23fractal_brownian_motionRK3vecILm3EfE
           0.00%  tinykaboom  tinykaboom           [.] floorf@plt
    

If anyone can give suggestions on how to make the tinykaboom.cpp faster that
would be neat!

~~~
haldean
What were your compilation flags?

~~~
nazri1
I used the default flags in CMakeLists.txt (-O3).

I ran the comparison again on another machine that I have and this time their
performances are about the same:

c++:

    
    
        $ time ./tinykaboom
        ./tinykaboom  46.72s user 0.01s system 364% cpu 12.804 total
    

go:

    
    
        $ time ./tinykaboom     
        ./tinykaboom  42.50s user 0.07s system 350% cpu 12.161 total

------
etaioinshrdlu
This technique (distance field sphere tracing) was popularized in a big way by
the community at [https://www.shadertoy.com/](https://www.shadertoy.com/)

~~~
userbinator
Inigo Quilez created that site; he could be said to be the one who really
popularised it all. He has a bunch of articles on his site about distance
fields:

[https://www.iquilezles.org/www/articles/distfunctions/distfu...](https://www.iquilezles.org/www/articles/distfunctions/distfunctions.htm)

[https://iquilezles.org/www/articles/raymarchingdf/raymarchin...](https://iquilezles.org/www/articles/raymarchingdf/raymarchingdf.htm)

He also has videos on YouTube, some of which show the creation of a scene
starting from the basics.

------
tpaschalis
I ported this [1], as well as ssloy's previous article [2] that was featured
recently on HN, to Go.

I wanted to say his series is an _excellent_ introduction to graphics. As a
sysadmin, I had never drawn more than simple lines, and the whole thing seemed
daunting.
Now I see how fun it can be, and I'm waiting for next weekend to pick up some
more!

[1] [https://github.com/tpaschalis/go-
tinykaboom](https://github.com/tpaschalis/go-tinykaboom)

[2] [https://github.com/tpaschalis/go-
tinyraytracer](https://github.com/tpaschalis/go-tinyraytracer)

------
gammateam
Impressive

I was willing to criticize the line count if it had a bunch of dependencies,
but 180 lines seems accurate and I'll give a pass to the obligatory core C++
include statements

Way to go!

------
tombert
I suppose this shows a bit of a sickness inside me: I have a strong aversion
to reading these articles because of my pure distaste for C++, which is stupid
because this article is great.

Still, it's hard for me to not want to create a series of blog posts under the
title "All those cool graphics tutorials in <language I like better>".

------
franciscop
This is amazing, thanks for sharing! One peculiarity of fire is that it does
not cast shadows, while smoke does (might be useful for more realistic
effects):

[https://physics.stackexchange.com/questions/372117/shadow-
of...](https://physics.stackexchange.com/questions/372117/shadow-of-fire-
doesnt-exist)

------
wprapido
Brings back fond memories from the demoscene days back in the 90's! Cheers!

------
angeal1131
Oh. I thought it was the Atari 2600 game.

~~~
butterisgood
Kaboom! was awesome... need paddle controllers to really appreciate it though.

------
jrjarrett
Aw, here I thought it was an implementation of the 1980's Atari 2600
Activision game :/

[https://en.wikipedia.org/wiki/Kaboom!_(video_game)](https://en.wikipedia.org/wiki/Kaboom!_\(video_game\))

------
mistrial9
Reminds me of a patch to the MacOS WindowMgr that made an explosion when you
closed a window. It didn't win MacHack in Ann Arbor, MI that year, but it was
fun.

------
Solar19
Nice. Does Visual Studio support the level or version of OpenMP used in this
code? (see the pragma)

~~~
haqreu
Yes it does, however you'd need to modify CMakeLists.txt, as it is written for
g++

------
hestefisk
This is pretty cool. Reminds me of my old escapades on the demo scene (The
Party!) in the late '90s.

------
erikpukinskis
SDFs are amazing.

------
cheick1
I would like to learn C and C++. Who can help me, please?

------
Const-me
Modern GPUs are just too powerful to ignore.

Especially for graphics-related stuff.

Yet HN community seems to ignore GPUs.

When I recently published my hobby project that renders much more advanced
procedurally-generated stuff, I only got a single upvote:
[https://news.ycombinator.com/item?id=18921046](https://news.ycombinator.com/item?id=18921046)

~~~
holga
You completely missed the point of why people like this..

~~~
Const-me
The OP's 180 lines of C++ could be rewritten into a similar number of lines of
HLSL or GLSL. The resulting code would render in realtime while also consuming
less electricity.

Indeed, I don't see why people like this. In the modern world, doing graphics
on the CPU is very inefficient.

~~~
vidarh
You miss the point. The point is not doing it efficiently, but explaining the
concepts without being distracted by getting it running on specific hardware
and the like. Read the introduction to the overall series.

~~~
Const-me
> The point is not doing it efficiently

Have you read the linked article? It says "I want to have a simple stuff
applicable to video games."

> without being distracted by getting it running on specific hardware

Just target Windows and use Direct3D; 99% of PC game developers do just that.
The last GPU that didn't support D3D feature level 11.0 was Intel Sandy Bridge
from 2011. Everything newer than that supports 11.0, and unlike OpenGL with
its extensions, the majority of features are mandatory. I have very rarely
seen compatibility issues across GPUs in recent years, and when I did, it was
a driver bug.

~~~
vidarh
The _algorithm_ is applicable to video games. As you yourself point out it'd
not be a lot of effort to rewrite. I suggest _you_ read the very next
paragraph, which ends:

> I do not pursue speed/optimization at all, my goal is to show the underlying
> principles.

...

> Just target Windows and use Direct3D, 99% of PC game developers do just
> that.

Misses the point of the series. From the introduction (linked at the top of
the page):

> I do not want to show how to write applications for OpenGL. I want to show
> how OpenGL works. I am deeply convinced that it is impossible to write
> efficient applications using 3D libraries without understanding this.

The exact same could be said for Direct3D. He gives his students a class to
read/write TGA images and set pixels for that article, which should make it
exceedingly clear that the _point_ is to ensure the focus is the algorithm and
no irrelevant details to teach the principles without people being sidelined
by worrying about libraries, and differences between platforms and the like.

And in any case, I explained to you what the appeal of this to people here is.
That you think it could be done differently does not change that the appeal to
people is exactly that there are no dependencies like Direct3D or Windows or
anything else (I don't have Windows anywhere, so for me that would have made
it relatively uninteresting; as it would for a lot of other people here). I
don't care about the performance; I care about the concepts.

~~~
Const-me
> I am deeply convinced that it is impossible to write efficient applications
> using 3D libraries without understanding this.

What he explains is almost irrelevant for efficiency. Other things are
relevant: early Z, tiled rendering, and other aspects of GPU architecture such
as resource types, the cache hierarchy, fixed-function pipeline steps, warps,
and many others.

> I care about the concepts.

I've been programming C++ for a living since 2000, about half of that time on
something relevant to 3D graphics, both games and CAD. You no longer need a
deep understanding of the rasterizer. A vague understanding of what the
hardware does, and how to control it, is already enough. Only people working
at companies like Nvidia or Chaos Group need that info, IMO.

~~~
mcbits
Yes, irrelevant, because the article is simply describing how to combine a
sphere+bumps+noise to look like an explosion.

Your project seems to be interesting in its own way, but I don't see why you'd
juxtapose it with this tutorial, other than that they both involve pixels.

~~~
Const-me
The project was just an example of HN bias against GPUs. At least for doing
graphics on GPUs.

Try searching "GPU" on this site. 100% of the first page of results are about
using GPGPU in the cloud. Do you think that matches what people buy GPUs for,
or the amount of code developers write for them?

~~~
mcbits
I searched "GPU" and almost all of the top results were people doing
weird/unusual things with the GPU: terminal emulator, Postgres query
acceleration, stripped down HTML engine, APL compiler, etc. A search of
"Direct3D" suggests HN doesn't have much interest in Direct3D, though.

~~~
Const-me
> weird/unusual things with the GPU: terminal emulator, Postgres query
> acceleration, stripped down HTML engine, APL compiler

Exactly. People here mostly do general-purpose computations on them, even
though the “G” stands for graphics.

Search for “Graphics”, and the majority of the top results are for CPU-based
stuff: pixel graphics in the terminal, Python, Skia. There are some results
for GPU-based graphics, like the graphics studies for MGS and GTA, but they're
a minority.

I think graphics is what the majority of users use their hardware for, but
it's under-represented here.

