Show HN: Metaballs (varun.ca)
673 points by winkerVSbecks 5 months ago | 93 comments

From the grandparent article:

> Well, for 40 bouncing circles, on a 700x500 grid, that would be on the order of 14 million operations. If we want to have a nice smooth 60fps animation, that would be 840 million operations per second. JavaScript engines may be fast nowadays, but not that fast.

The math is super-cool, and efficiency is important for finding isosurfaces in higher dimensions, but those aren't really scary numbers for normal programs. Just tinting the screen at 2880x1800 is ~5 million operations per frame. GPUs can handle it.

A simple way to render is to draw a quad for the metaball, using the metaball kernel function in the fragment shader. Use additive blending while rendering to a texture for the first pass, then render the texture to screen with thresholding for the second pass. The end result is per-pixel sampling of the isosurface.
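For illustration, here is the same two-pass math sketched on the CPU in plain JavaScript (the 1/d² kernel, the threshold value, and the function names are mine; on the GPU, pass one is additive blending into a texture and pass two is a fullscreen threshold shader):

```javascript
// Pass 1 (conceptually): sum each metaball's kernel into a field.
function fieldAt(x, y, balls) {
  let sum = 0;
  for (const b of balls) {
    const dx = x - b.x, dy = y - b.y;
    sum += (b.r * b.r) / (dx * dx + dy * dy + 1e-9); // classic 1/d^2 kernel
  }
  return sum;
}

// Pass 2 (conceptually): threshold the field to get the isosurface.
function insideIsosurface(x, y, balls, threshold = 1) {
  return fieldAt(x, y, balls) >= threshold;
}
```

A GPU evaluates this same sum per fragment, in parallel, which is why 40 balls at 700x500 is easy for it.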

Admittedly, it's kind of a brute-force solution, but even the integrated GPU on my laptop can render thousands of metaballs like that at HiDPI resolutions.

(Specifically, I use a Gaussian kernel for my metaballs. It requires exp, which is more expensive computationally than a few multiplies. I render 1500 of them at 2880x1671 at 5ms per frame on an Intel Iris Pro [Haswell].)

Though, the work scales with fragment count, so a few large metaballs may be as costly as many smaller ones. For large numbers of metaballs, you probably also want to use instancing, so you'd need OpenGL ES 3.0 / WebGL 2.0, which are fairly recent.

But 40 metaballs with a simple kernel at 700x500? That's easy for a GPU.

I believe 2D canvas rendering is performed on the CPU rather than the GPU.

You wouldn't use a 2D context; you can use WebGL shaders instead. Besides that, most operations on a 2D context are performed on the GPU.

Most is done on the GPU nowadays - only getImageData() and putImageData() have to go via the CPU.

You can BTW also "cheat" your way to meta-balls: https://stackoverflow.com/questions/17177748/increasing-real...

much of it is offloaded to the GPU by recent browsers

The important bit is getting the metaball function into the fragment shader. I'm not really a web guy, but I know you can do that with WebGL.

For a canvas with a more limited API, you can still do it if images are GPU accelerated with a composite mode like "lighter". If that's the case, you can do basically the same thing by first rendering the metaball function to an image once, and then drawing that image for each metaball. Doing it via an image introduces extra aliasing artifacts, but might get around the API limitations.

Edit: I suppose you would still want to find a GPU-accelerated threshold function for the step after that.
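A sketch of that image-based approach, assuming a browser canvas (the sprite size, the gradient falloff, the ball fields, and the function names are all illustrative, not from the original code):

```javascript
// Render the falloff kernel once to an offscreen canvas...
function makeKernelSprite(size) {
  const c = document.createElement('canvas');
  c.width = c.height = size;
  const ctx = c.getContext('2d');
  const g = ctx.createRadialGradient(size / 2, size / 2, 0,
                                     size / 2, size / 2, size / 2);
  g.addColorStop(0, 'rgba(255,255,255,1)'); // peak at the center
  g.addColorStop(1, 'rgba(255,255,255,0)'); // falls to zero at the edge
  ctx.fillStyle = g;
  ctx.fillRect(0, 0, size, size);
  return c;
}

// ...then additively stamp it once per metaball.
function drawBalls(ctx, sprite, balls) {
  ctx.globalCompositeOperation = 'lighter'; // additive blend
  for (const b of balls) {
    ctx.drawImage(sprite, b.x - b.r, b.y - b.r, 2 * b.r, 2 * b.r);
  }
}
```

The thresholding step would still need to follow, as the edit above notes.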

I remember what I did to optimize my own metaballs: after prerendering each ball size that was used once, to its own off-screen buffer at (0, 0), I would just add up the values looked up at each pixel's relative position (signs removed) in the appropriate precomputed map.

It does take some memory, but n operations per pixel for each frame for n balls plus the overhead of transforming into actual color was still pretty great. Instead of saying "GPU can do this", I'd rather ask, hey, can we do even better than that?

I used to do that, but sampling from a texture seems to be about as expensive as directly calculating the metaball function per fragment.

It is cool to see what we can do. That is one of the things I really like about winkerVSbecks' approach. It's interesting and different. Better for some uses, too, which is always nice to see.

Feels more organic to me if the original metaball gets smaller as the other one moves out (like it's stealing material). I haven't worked out the correct math, but a quick PoC is here:


That is weirdly satisfying to watch...

Would make a great loader icon.

Keeping the area 100% accurately constant would require a pretty insane closed-form equation, even ignoring the amount of area added by the stretched portions.

Yes, in this case a numeric approach would probably be the way to go:

- Assume that R1, R2 are the radii of the discs and A_ORIG is the original area (e.g. PI * R1^2)

- Calculate the area A for a given R1, R2

- Multiply R1 and R2 with SQRT(A_ORIG / A)

- Repeat

If this doesn't converge after a few iterations, you can use Newton's method, or even a simple binary search, to find the correct radii very quickly. A(k R1, k R2) should be monotonic in k, so solving it numerically for a given value should be trivial.
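A sketch of that fixed-point iteration, using the standard two-disc union area as A (which, per the caveat above, ignores the area added by the stretched connector; the function names are mine):

```javascript
// Area of the union of two discs with radii r1, r2 whose centers are
// d apart (standard circle-circle intersection / lens-area formula).
function unionArea(r1, r2, d) {
  const a1 = Math.PI * r1 * r1, a2 = Math.PI * r2 * r2;
  if (d >= r1 + r2) return a1 + a2;                    // disjoint
  if (d <= Math.abs(r1 - r2)) return Math.max(a1, a2); // one inside the other
  const lens =
    r1 * r1 * Math.acos((d * d + r1 * r1 - r2 * r2) / (2 * d * r1)) +
    r2 * r2 * Math.acos((d * d + r2 * r2 - r1 * r1) / (2 * d * r2)) -
    0.5 * Math.sqrt((-d + r1 + r2) * (d + r1 - r2) *
                    (d - r1 + r2) * (d + r1 + r2));
  return a1 + a2 - lens;
}

// Rescale both radii by the same factor k = sqrt(A_ORIG / A) until the
// union area matches the original area (the scheme described above).
function conserveArea(r1, r2, d, targetArea, iters = 20) {
  for (let i = 0; i < iters; i++) {
    const k = Math.sqrt(targetArea / unionArea(r1, r2, d));
    r1 *= k;
    r2 *= k;
  }
  return [r1, r2];
}
```

For disjoint discs the area scales exactly with k², so one step suffices; with overlap it takes a few iterations, as the comment above anticipates.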

That is awesome!

Oh wow, it's a gaussian blur and basically a threshold function (boosting the contrast) using SVG filters! I didn't know this was possible. Very clever!

The select example looks amazing.

Interesting approach! Coincidentally, I published an article [0] on this very topic last month. It uses sampling, so it's close to the approach mentioned in the Jamie Wong article you (and I) linked to, but with a path-tracing step capable of producing an SVG path definition. I'd be interested to see how the performance of these two methods stack up to each other for a given quality level.

[0] https://eightsquaredsoftware.com/articles/metaball.html

References https://codepen.io/keithclark/pen/sEbFz which does metaballs using pure CSS (using a combination of a contrast filter with a blur filter).

Beautiful article.

this is so well done! I'm going to have to try out real metaballs now.

FYI the demos are impossible to use on an iOS device since trying to drag the time slider causes the browser back animation to start.

Thanks for pointing that out. I did test on an iPad, but perhaps it had a non-current version of Mobile Safari or navigation gestures were disabled.

Whoa, this is amazing too!

In https://codepen.io/winkerVSbecks/pen/NazWxg, there's a "hitch" as the discs touch due to a first-derivative discontinuity. Here's a version which extrapolates the u1 and u2 variables, making the transition much smoother: https://codepen.io/panic_/pen/BwvjmK.

This is just calling out for a propagation wave effect through the parent ball at the moment of separation.

Much better. There are definitely many improvements that could be made, but that was the main one. The other big one is how the bigger disc doesn't change size.

The approximation OP does is a good start but still far from being real metaballs.

That bubble slider is super cute. https://codepen.io/chrisgannon/pen/GZNgLw

I'm extremely impressed with the implementation. I'm not sure what I would say if presented with a design like this slider for web. Wouldn't have imagined it would work this beautifully as well.

Amazing showcase.

It definitely moved the opposite way I expected it to, but this way grew on me, hah.

It's not a perfect UI element - you can't actually see the options without scrolling through it all, but I could imagine something similar being a pretty cool little thing in the right context

Good to see you on the front page, Varun. :- )

It's possible to do this somewhat efficiently beyond two balls with GLSL and lots of uniforms (or a UBO), since metaballs from the graphics perspective are really just distance fields.

If you want more than a few balls, you can do it in two passes: one to produce the distance field, and one to threshold it.

As an added benefit, it's straightforward to generalize these approaches to any two-dimensional continuous function.

the same effect (albeit with less crisp edges) can also be achieved by abusing the blur filter in CSS -- with no need for JavaScript or SVG at all.


I did something fairly similar to this here: https://codepen.io/thomcc/pen/vLzyPY (I need to look into why this isn't running at 60fps anymore on my laptop, it certainly used to...)

The big difference is that it prerenders a gradient for each ball (it uses html5 canvas for that, but doing it with webgl is completely doable, although a bit more work), which is used as a distance field.

> I need to look into why this isn't running at 60fps anymore on my laptop, it certainly used to...

Runs at 60fps for me on a Chromebook from 2014. I suspect you're looking at it on macOS, which has had very poor (arguably the poorest of any x86 platform) OpenGL drivers for the last four or five years.

I am, but it certainly was at 60fps on my laptop when I wrote it, which was also a mac.

Agreed, but it does run at 60fps on my 2015 MacBook Pro running macOS Sierra.

Absolutely. This is by far the worst way to render metaballs. But I love that you can do this as just paths.

Far from the worst: if the update rate is reasonably low, this approach makes it a lot simpler to handle high-resolution displays and event registration. It should also run on a few more browsers and devices.

For everyone complaining about the lack of meatballs… here are some: https://codepen.io/winkerVSbecks/full/oGJLwo/

The math can be optimized by at least an order of magnitude.

Trigonometric functions are expensive, especially the inverse ones.

If v = 0.5, see [1] for how to find the sine/cosine of maxSpread * v. For angleBetweenCenters + maxSpread * v, see [2] for how to find the sine/cosine of a sum of angles.

If you do all that math in symbolic form (you can use Maple or Mathematica or something similar), you'll get equivalent formulae for p1-p4 that don't use any trigonometry, only simple arithmetic and probably a square root.

[1] https://en.wikipedia.org/wiki/List_of_trigonometric_identiti... [2] https://en.wikipedia.org/wiki/List_of_trigonometric_identiti...
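Concretely, the three ingredients look like this (a sketch; the function names are mine): the direction cosines come straight from the center-to-center vector with no atan2, the half-angle identities handle maxSpread * v for v = 0.5, and the angle-sum identities combine the two:

```javascript
// cos/sin of the center-to-center angle, straight from the vector:
// no atan2 followed by cos/sin needed.
function dirCosSin(dx, dy) {
  const d = Math.hypot(dx, dy);
  return [dx / d, dy / d];
}

// Half-angle identities: cos(a/2) and sin(a/2) from cos(a) alone
// (valid for 0 <= a <= PI, where both halves are non-negative).
function halfAngle(cosA) {
  return [Math.sqrt((1 + cosA) / 2), Math.sqrt((1 - cosA) / 2)];
}

// Angle-sum identities: cos/sin of (a + b) from cos/sin of a and b.
function sumAngle([cosA, sinA], [cosB, sinB]) {
  return [cosA * cosB - sinA * sinB, sinA * cosB + cosA * sinB];
}
```

Composing these gives the sine/cosine of angleBetweenCenters + maxSpread * v with no trig calls at all, as the parent comment suggests.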

I implemented a 2D angle class for cases like that, where you would otherwise use these slow trigonometric functions. It's C++ but should be easy to convert to JavaScript or any other OO language.


> Metaballs, not to be confused with meatballs

I once reviewed an academic paper at a major CS conference that misspelled metaballs as meatballs throughout.

After years of living in the US, I still have trouble remembering that the word is "cockpit", not "cocktip". That became hilariously obvious when, at work, we had to use a library that uses the term "cockpit" for one of its main components.

A nice example of the the impotence of proofreading![1]

[1] https://www.youtube.com/watch?v=OonDPGwAyfQ

And the importance (oh...I see what you did there...well played)

You think that was a mistake?

I just started learning GLSL shaders. As practice, I wrote a pseudo-metaball joystick. I didn't know about metaballs, but now that I do, I can do some more research and improve my next iteration.

Touch blob joystick shader: https://www.shadertoy.com/view/4lfcRf


About 2 minutes in, there's an excellent realtime metaballs implementation that ran smoothly on a 486 at 66 MHz. Metaballs were an extremely popular effect in the early '90s.

I wonder what the first demo to use metaballs was. Two candidates:



How about first on the C64? Here's Booze in 2010:


Paper.js is truly a great source of vector drawing tricks. Curious how difficult it would be to extend this technique beyond two circles. Might have to dust off some old experiments ... :)

Oh no... I shook it a bunch and it broke apart ;_;

Metaballs are always nice, but I find this page (linked in the article), which shows compass-and-straightedge constructions, to be especially nifty:


See the Euclidea and Pythagorea phone apps for puzzle games based on geometric principles.

During or just before WW2, Roy Liming developed analytic techniques for calculating a similar class of blend or fillet. They were taken up in aircraft design, a field I never would have imagined using implicit surfaces! I think it was Edgar Schmued's design for the P-51 Mustang that famously used Liming's work.

Liming wrote a book, but it's rare. Some technical discussion towards the end of this page: http://homepages.inf.ed.ac.uk/rbf/CVonline/LOCAL_COPIES/BOWY...

Was this the technique used in World of Goo?

I believe (the amazing) World of Goo used stretched bitmaps* (at least in the prototype).

* https://www.gamasutra.com/view/feature/2438/how_to_prototype...

This actually refreshes my memory. I had to implement some metaballs myself some years back for a fluid simulation.

I had to struggle with metaball rendering on canvas back then. It was so slow. Now I guess a pixel shader in WebGL can do a better job.

Check this out too: https://asadmemon.com/SPHjs/ source: https://github.com/asadm/SPHjs

I would love to incorporate this in some of the UI design work we do for startups. Are there more similar libraries available? We could reference them to our network of clients (mostly developer-driven startups) to help translate some of the design ideas we propose. If you know of other projects like Metaballs, please do share below or ping me (details in my bio).

We've used the blur+contrast approach successfully in EventDrops [1], a time series visualisation based on d3.js. It all happens client-side, with OK performance. Not sure the SVG approach brings more in this case.

[1] https://marmelab.com/EventDrops/

An alternative method (with potentially different applications) that I found interesting. The visual aids in both articles are very good.


Andrew Glassner published a paper on something extremely similar back in 2015:

"Globs: A Primitive Shape for Graceful Blends Between Circles"


This is pretty awesome.

I wonder how much would need to be adjusted to provide a scaling factor to the first metaball such that the area was constant (thus ending up with two equally sized metaballs), or even to use the speed of the pull to determine the second ball's size.

This is cool. Just watched a related talk from Casey Muratori about this yesterday: https://www.youtube.com/watch?v=SDS5gLSiLg0

What's the practical, commercial use for something like this? It looks like it must take a lot of time and effort to get this right.

Or is this university stuff? Or even spare time stuff?

I clicked through this hoping that someone had done a 'Show HN' for a literal plate of meatballs.

I am from Sweden and read "meatballs" when I clicked. Just imagine my disappointment.

Cool stuff though.

I keep seeing Metaballs as Meatballs.

I did too! All the way until I read this comment

there is a popular vvvv shader that implements metaballs, see https://vvvv.org/blog/debug2-2

Could this extend to 3D?

That's marching cubes though, which is different from the parametric/Bézier approach in the article?

Marching cubes is the rendering technique; it's based on distance fields, which are a similar idea to isosurfaces.

With the SVG/Bézier approach I doubt you could do 3D, true :)

¯\_(ツ)_/¯ you could probably make 3D bezier curves to generate the connector. The real iso-surface metaball definitely works in 3D.

My first experience with metaballs was in 3D as a feature in Truespace (https://en.wikipedia.org/wiki/TrueSpace) where they were used for modeling.

In order to render a surface, you have to either use a contouring algorithm like marching cubes to generate a mesh like the above three.js demo, or raytrace or raymarch them. Because metaballs describe a distance function, it's really easy to use SDF raymarching, and there is a whole category dedicated to metaball shaders on Shadertoy (https://www.shadertoy.com/results?query=tag%3Dmetaballs).
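A sketch of that raymarching loop in JavaScript rather than GLSL (here blending per-ball sphere SDFs with a polynomial smooth-min, a common Shadertoy stand-in for the summed-field isosurface; names and constants are mine):

```javascript
// Polynomial smooth minimum: blends two distance values so the
// union of sphere SDFs gets a metaball-like rounded joint.
function smin(a, b, k) {
  const h = Math.max(k - Math.abs(a - b), 0) / k;
  return Math.min(a, b) - h * h * k * 0.25;
}

// Distance estimate for the whole scene: smooth-min over all balls.
function sceneSDF(p, balls, k = 0.5) {
  let d = Infinity;
  for (const b of balls) {
    const dist = Math.hypot(p[0] - b.x, p[1] - b.y, p[2] - b.z) - b.r;
    d = smin(d, dist, k);
  }
  return d;
}

// Sphere tracing: step along the ray by the distance estimate until
// we hit the surface (return t) or give up (return -1).
function raymarch(origin, dir, balls) {
  let t = 0;
  for (let i = 0; i < 128; i++) {
    const p = [origin[0] + dir[0] * t,
               origin[1] + dir[1] * t,
               origin[2] + dir[2] * t];
    const d = sceneSDF(p, balls);
    if (d < 1e-4) return t; // hit
    t += d;
    if (t > 100) break;     // marched past the scene
  }
  return -1;                // miss
}
```

A fragment shader runs this same loop per pixel, with the hit point fed into shading.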

Jim Blinn was doing that in the '80s. http://delivery.acm.org/10.1145/360000/357310/p235-blinn.pdf

Like this? https://www.youtube.com/watch?v=wcdKHCp9foY

Works on Unix, Mac, iOS and Android

A code mirror is here - https://github.com/Zygo/xscreensaver

W.P. van Paassen's metaballs C code is here -


Also, you can set various parameter tweaks for the metaballs from the command line or settings - count, radius etc.

You can do 3D metaballs with Blender.

It already does, for water effects.

No one to my knowledge actually uses metaballs for surfacing fluids.

What is used instead?

Fluid simulation; see https://developer.nvidia.com/particles and https://www.youtube.com/watch?v=2gp7-ejkwBQ

Metaballs are way too expensive.

From the associated article for that video:

> SPH ... with 500 000 particles ... about 2.5 fps on my GTX 1070

Still slower than what CNCD & Fairlight demonstrated in 2011 with "Numb Res", at 120fps (stereo 3D) on a GeForce GTX 280:

> The demo features up to 500,000 particles running under 3D SPH in realtime on the GPU, with surface tension and viscosity terms; this is in combination with collisions, meshing, high end effects like MLAA and depth of field, and plenty of lighting effects


Metaballs would not be for simulating fluids but for creating the simulated fluid's surface. In your youtube link it would be a step between "simulating particles" and "meshed result".

"Fluid simulation" was a bit of a nonsense response to that question, but there are similarities between metaballs and fluid simulations. The kernel functions used to interpolate Smoothed Particle Hydrodynamics samples are basically the same thing as metaball functions. The main difference is that you probably don't need the isosurface during simulation.

On a related note, one of the annoying things about metaballs for fluid surfacing is that there's some spooky action at a distance. Two drops of water will reach out towards each other as they come closer together, which makes no physical sense at all.
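That kernel similarity is easy to see side by side: the poly6 density kernel from Müller et al.'s 2003 SPH paper is, up to a normalization constant, a Wyvill-style metaball falloff (a sketch; function names are mine, and both bumps go to zero at the support radius):

```javascript
// SPH poly6 density kernel: a smooth bump, zero beyond radius h.
function poly6(r, h) {
  if (r >= h) return 0;
  const t = h * h - r * r;
  return (315 / (64 * Math.PI * Math.pow(h, 9))) * t * t * t;
}

// A typical compactly supported metaball falloff (Wyvill-style):
// the same (1 - d^2/R^2)^3 bump without the SPH normalization.
function wyvill(d, R) {
  if (d >= R) return 0;
  const t = 1 - (d * d) / (R * R);
  return t * t * t;
}
```

For h = R = 1 the two differ everywhere by the constant factor 315/(64π), which is why SPH interpolation and metaball field evaluation feel like the same machinery.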

Totally right. I meant to say "fluid dynamics," as in, approximating Jacobians, etc.

Surprised I've never seen metaballs before. Very cool.

I prefer Regular Ordinary Swedish Metaballs™

That's one spacey metaball! (sorry...)

> CodePen requires a referrer to render this. Your browser isn't sending one.


Apparently it's to prevent phishing. You should be able to click on "Edit on Codepen" to see it.

(Not affiliated with them, I just found https://blog.codepen.io/2017/10/05/regarding-referer-headers...)

I was disappointed that this was not about meatballs, for I am hungry.

