Hacker News
Marble Marcher: ray-traced real time game on the surface of evolving fractals (itch.io)
345 points by kowdermeister 3 months ago | hide | past | web | favorite | 41 comments



This program utilizes a powerful technique called raymarching, raytracing's big brother. Instead of tracing along each ray in tiny fragments and returning intersections, the ray is 'marched' in larger steps, vastly increasing render speed (the downfall of classical raytracers).

How do we 'march' rays without missing geometry? With the aid of signed distance functions: functions which take an arbitrary point in 3-space and return its signed distance to the nearest surface, evaluating to 0 when that point is on the 'shell' of the geometry. For example, the signed distance function for a sphere:

  float sdfSphere(vec3 p, float r)
  {
      return length(p) - r;
  }

This lets us do interesting things such as transforming the point p to transform our object, taking the minimum of two SDFs to union shapes together, taking the maximum to produce a venn-diagram intersection shape, or taking max(d1, -d2) to 'cutout' a hole from one. The logic is a bit backwards at first, but it clicks quite quickly and one can rapidly construct complex geometry without much optimization/shader magic. For example, this is a realtime capture of a 64k self-contained executable. No 3d models, no textures, no assets - just raymarching and math: [https://www.youtube.com/watch?v=ie4u2i_5OdE]
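A minimal sketch of those ideas outside a shader, in Python (names like `sdf_sphere` and `march` are my own, and this is the plain sphere-tracing loop, not the game's actual code):

```python
import math

def sdf_sphere(p, center, r):
    # Negative inside the sphere, zero on the shell, positive outside.
    return math.dist(p, center) - r

def sdf_union(d1, d2):
    return min(d1, d2)

def sdf_intersect(d1, d2):
    return max(d1, d2)

def sdf_subtract(d1, d2):
    # 'Cutout': keep shape 1 but carve away shape 2.
    return max(d1, -d2)

def march(origin, direction, sdf, max_steps=128, eps=1e-4, max_dist=100.0):
    # Sphere tracing: the SDF value is always a safe step size,
    # so we can take large steps without missing geometry.
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < eps:
            return t  # close enough: call it a hit
        t += d
        if t > max_dist:
            break
    return None  # miss
```

Marching a unit sphere at the origin from (0, 0, -5) along +z hits at t = 4.0, and combining the union/intersect/subtract helpers gives the CSG-style shapes described above.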

Approachable overview of raymarching & SDFs: [https://jamie-wong.com/2016/07/15/ray-marching-signed-distan...]

Learn more: [https://iquilezles.org/www/articles/raymarchingdf/raymarchin...]


> Instead of tracing along each ray in tiny fragments and returning intersections

I dunno if that’s how it’s done now, but the old CG end-of-semester assignment was a constructive solid geometry ray-tracer to render not-LEGO bricks. It was simple ray-intersection against simple geometry unioned/diffed together, and didn’t use any stepping along rays.

Classic raytracers never march or step: the 'downfall' is that they need to test every object in the scene for intersection, and the objects need to be easy to test for ray intersection.
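For contrast, that classic intersect-everything loop might be sketched like this in Python (hypothetical names; assumes a unit-length ray direction):

```python
import math

def intersect_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t (a quadratic).
    oc = tuple(o - c for o, c in zip(origin, center))
    b = sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c  # discriminant; assumes direction is unit length
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = -b - math.sqrt(disc)
    return t if t > 0 else None

def closest_hit(origin, direction, spheres):
    # A classic ray tracer tests every object and keeps the nearest hit.
    best = None
    for center, radius in spheres:
        t = intersect_sphere(origin, direction, center, radius)
        if t is not None and (best is None or t < best):
            best = t
    return best
```

No stepping anywhere: each intersection is computed in closed form, and the cost comes from looping over every object.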


Very true, that was a bad description of raytracing.


One optimization is to break a scene into chunks. First you see which chunk a ray intersects, and then test the ray against each object in that chunk.

This lets you avoid testing most objects for intersections, but it adds bookkeeping overhead, so it only works really well for specific kinds of scenes, i.e. not the classic single ball and a few walls.

There are also multiple ways to handle chunks. For a space RTS game you might box up all the ships in a squad and first test the box around the squad before testing each ship. For Minecraft you can just split the world into arbitrary boxes that don't change frequently.
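The box-before-contents test can be sketched as a standard slab test in Python (my own sketch, not from any particular engine):

```python
def ray_hits_aabb(origin, direction, box_min, box_max):
    # Slab test: cheaply reject a chunk before testing each object inside it.
    tmin, tmax = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            if o < lo or o > hi:
                return False  # parallel to this slab and outside it
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            tmin, tmax = max(tmin, t1), min(tmax, t2)
            if tmin > tmax:
                return False  # slab intervals don't overlap: miss
    return True
```

If this returns False for a squad's box, none of the ships inside need to be tested.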


This optimization can be made systematic. From a polygon-based geometry, an algorithm automatically computes a set of "boxes" (actually, hyperpolyhedra in the general case, polygons in 2D). This allows you to very quickly find what a ray intersects without checking all objects. See https://en.wikipedia.org/wiki/Binary_space_partitioning It's what enabled Doom to render complex geometry in real time on an i386 processor.


Except that the state of the art has evolved to bounding volume hierarchies, which are both easier to handle and more performant. I won't recommend implementing BSP or kd-trees because they have many more corner cases and numerical issues.


Absolutely, testing against bounding boxes is a quick way to exclude candidates, and frustum culling with an octree (or similar) also greatly speeds up intersection tests.


An advantage of ray-marching is that the acceleration structure is built into the shader math, so it requires no additional memory or loading. This allows the fractal to evolve in realtime, which could not be done with ray-tracing without regenerating the bounding boxes or octree every single frame.


Your description portrays raymarching as an optimization over raytracing and that's simply not the case at all.

Raymarching enables rendering objects/scenes raytracing can't readily handle (like procedurally generated fractals), and tends to be costly because of how the ray must be "marched" towards the surfaces rather than directly computing intersections.

The only form of raytracing I'm familiar with that employs anything resembling raymarching is the classical 80s-era KD-tree acceleration structure paper which stepped the ray at hyperplane boundaries, to confine the costly ray->object intersection tests to objects in the intersected axis-aligned bounding boxes.


Ray tracing does not do any kind of stepping/marching or anything similar. It's incredibly odd that you would claim it does, especially so authoritatively.


You got the terminology quite a bit wrong:

- ray tracing refers to the general problem of finding an intersection between a ray and the closest scene surface. It is most commonly associated with techniques that intersect discrete primitives like triangles and usually employ acceleration structures like bounding volume hierarchies or kd-trees to make that fast.

- ray marching is a related but actually separate technique where the algorithm steps along the ray in discrete steps. It can accumulate values along the way, if needed. This is used e.g. for volume visualizations (e.g. 3D CT scan visualization or clouds in movies). You can also use that concept to find intersections with surfaces that do not have/require closed-form solutions for ray intersections. This is how you eventually arrive at sphere tracing for signed distance fields and the stunning images by iq et al. that are created entirely from math expressions for the surfaces.
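The accumulating variant can be sketched in a few lines of Python (a conceptual sketch with fixed steps, not any particular renderer's code):

```python
def march_density(origin, direction, density, steps=64, step_size=0.1):
    # Fixed-step ray marching: instead of looking for a surface hit,
    # accumulate a value (e.g. fog or CT density) sampled along the ray.
    total = 0.0
    for i in range(steps):
        # Sample at the midpoint of each step (midpoint rule).
        p = tuple(o + (i + 0.5) * step_size * d for o, d in zip(origin, direction))
        total += density(p) * step_size
    return total
```

Sphere tracing is the special case where the step size comes from an SDF and the loop stops at the first surface instead of integrating.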


>Marble Marcher is entirely ray-traced in real time and is played on the surface of evolving fractals.

That’s the very first line of the source page. So why do you say it uses raymarching? Are these terms used interchangeably as in marching is a method to do tracing?


The source code is available on GitHub, and it does indeed seem to use ray-marching:

https://github.com/HackerPoet/MarbleMarcher/blob/d436b6f8006...


Additionally, it is mentioned in the video on the page.


Most people don't know what ray-marching is but are familiar with ray-tracing which is similar. So I used the correct "ray-marching" term where I have a more technical audience like my videos, and I use "ray-tracing" for a more general audience like a gamer would see. I would have said ray-marching here on HN, but someone else posted this, not me :P


Because of the distance estimation functions and raymarching code in the fragment shader (https://github.com/HackerPoet/MarbleMarcher/blob/master/asse...)


I did a little open source game recently where all the rendering is based on SDFs, rendered in GLSL but without raymarching—all 2D: https://github.com/westoncb/under-game

If anyone wants to see the SDF/rendering bits, it's all in this file: https://github.com/westoncb/under-game/blob/master/js/gameFr...


What an excellent explanation! Can this work in reverse, as in given an image of dimension d can a signed distance function be used to infer the geometry? I'm hoping it might be useful for visualizing/cataloguing strange attractors in audio or sequential data, for example, or extracting beautiful idealized geometries from blocky voxels.


A nicely written explanation that is also, unfortunately, incorrect.


This captures two of my childhood computing obsessions. I spent so long messing around with Polyray and generating raytraced pictures for converting into Magic Eye stereograms. I also had some Windows 3.1 fractal generator program, where you could mess around with the formulas and zoom in (slowly) and make them cycle colours.

Looking forward to playing this a lot. I particularly like the clever use of fractal mathematics to make accurate collision detection physics. Though it helps that he's using a sphere as the user-controlled object.


I thought you were talking about the raytracer I was obsessed with as a kid, POVRay, but indeed there is also a project called polyray.

I spent many a weekend rendering hundreds of .tga files and throwing them into .flc files since that was the only way I knew how to make "videos". I mostly did renderings of planets turning with texture maps, trying to get something similar to what I saw in Star Control 2, and I told myself I was making them for my own video game, but I never got past fiddling around with povray. Anyway it looks like polyray has a lot of the same facilities. I wonder how I was unaware of it for the couple years I played with povray?


I remember staying up all night waiting for my first renders to complete on my brand new amazingly fast 80286 with 80287 math coprocessor. It was amazing.


For some time now I've wanted to modify my ray tracer to do realtime RDS (random dot stereograms). Just another item on my bucket list that I might never get around to. I suspect doing high frame rates and animation may keep your eyes adjusted very well, but that intuition might be wrong.


I played with Bryce 3D (from the MetaCreations era) as a child. Sooo many generic landscapes.


You should check out Tom Beddard, a master in the area who has partnered with the founder of MetaCreations. A little googling will show you some cool stuff.


What a super well done video! Really benefits the game itself. Clear, concise, intriguing and so positive. Can't wait to try this when I have time.


I'm constantly shocked at how well experts in their field can also teach, explain, edit, and share their knowledge these days.


Thank you!


A couple years ago I made a very minimal game engine that allowed simple objects to be intersected with SDFs for physics, a bit similar to this. It also had some other cool ideas in it like 3d portals and gravity-bent rays of light. This is the old version with pictures in the readme https://github.com/PWhiddy/Raymarch-Engine And here is a version with no pictures in the readme but an actual build system (if you want to try it out) https://github.com/PWhiddy/time


This looks beautiful! Is the engine available as a library?


Reminds me of Marble Madness https://en.wikipedia.org/wiki/Marble_Madness except with cool real time fractal generation. Definitely has potential.


Very cool. Think this game also used fractals:

https://en.wikipedia.org/wiki/Rescue_on_Fractalus!


Reminiscent of the exploration scenes in Lem's Solaris, where the fractal-like surface eruptions are explored by spaceship/plane. Unfortunately neither of the movies had those.


My Pascal class in high school had a raytracing project for our Amiga A500s. I seem to recall a 320x200 view was split between 4-6 machines and it took 3-4 days to render.


I ran this on my 2008 high-endish business laptop on lowest settings and it ran at about 2 frames per second (though the indicator was showing 11) and wasn't responsive. I love the idea, though.


You'll need a dedicated GPU, as the post says.


Has anyone compiled this for Linux/OSX?


Not precompiled, but there's a PR with Linux support: https://github.com/HackerPoet/MarbleMarcher/pull/1

Works well.


Or wasm?


Looks like it's using GLSL 400 shaders, so I don't think it would work for wasm unfortunately. Not sure how many non-WebGL features it's using though. Might be straightforward to port.

EDIT: bigger issue is probably that SFML apparently doesn't support wasm: https://github.com/SFML/SFML/issues/1494


This ran at a very fluctuating FPS, but never below 120 at 1600x900, so I think you've done a very good job. Love the fractals!



