That will be one of the goals of an upcoming article.
But it can't be implemented exactly the way Apple does it (with a delayed transition when switching between dark and light).
What is possible, though, is to average the image currently behind the object and extrapolate that average to either black or white.
The layer on top would then use the opposite.
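A rough sketch of that averaging step, assuming the backdrop has already been drawn into a canvas (you can't read live DOM pixels directly); the function names here are just illustrative:

```ts
// Minimal sketch: average the brightness of the region behind the object.
// Assumes the backdrop image has been drawn into a canvas beforehand.
function averageLuminance(
  ctx: CanvasRenderingContext2D,
  x: number,
  y: number,
  w: number,
  h: number,
): number {
  const { data } = ctx.getImageData(x, y, w, h);
  let sum = 0;
  // Rec. 709 luma coefficients approximate perceived brightness.
  for (let i = 0; i < data.length; i += 4) {
    sum += 0.2126 * data[i] + 0.7152 * data[i + 1] + 0.0722 * data[i + 2];
  }
  return sum / (data.length / 4); // 0 (black) .. 255 (white)
}

// "Extrapolate" the average to the nearest extreme; the layer on top then
// takes the opposite, so content stays readable over any backdrop.
function pickOverlayTheme(luma: number): "dark" | "light" {
  return luma > 127 ? "dark" : "light"; // bright backdrop -> dark overlay
}
```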
I considered WebGL, and I agree—a shader is more performant for real-time effects.
But WebGL comes with drawbacks:
- You need JS code running before anything shows up.
- Shaders can’t directly manipulate the DOM render. To make refraction work, you’d have to re-render everything into a canvas—which isn’t really “the web” anymore.
With the SVG/CSS approach, you can pre-render the displacement map (at build time or on the backend) and get the refraction visible on the very first frame. Plus, it integrates cleanly with existing, traditional UIs.
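As a sketch of what that looks like in practice (the `/maps/glass.png` path and `#glass-refraction` id are hypothetical, and support for SVG filters inside `backdrop-filter` still varies across browsers):

```ts
// Sketch: inject an SVG filter that consumes a pre-rendered displacement map.
// The map can be generated at build time or on the backend, so the refraction
// is available on the very first frame with no JS warm-up.
const filterMarkup = `
  <svg width="0" height="0" style="position:absolute">
    <filter id="glass-refraction" x="0" y="0" width="100%" height="100%">
      <!-- Pre-rendered map: R/G channels encode per-pixel x/y offsets -->
      <feImage href="/maps/glass.png" preserveAspectRatio="none" result="map"/>
      <feDisplacementMap in="SourceGraphic" in2="map" scale="40"
                         xChannelSelector="R" yChannelSelector="G"/>
    </filter>
  </svg>`;
document.body.insertAdjacentHTML("beforeend", filterMarkup);

// Applying it through backdrop-filter distorts whatever renders behind the
// element, so the rest of the page stays plain DOM (no canvas re-render).
const glass = document.querySelector<HTMLElement>(".glass");
if (glass) glass.style.backdropFilter = "url(#glass-refraction)";
```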
That said, this approach could definitely be improved. Ideally we’d have shader-like features in the SVG Filter spec (there was a proposal, but it seems abandoned). There are some matrix operations available in SVG Filters, but they’re limited—and for my first blog post I wanted to focus more on pedagogy, art, and technique than heavy optimization.
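To illustrate the "limited" part: the matrix operations are fixed per-pixel linear color transforms like the feColorMatrix below; unlike a shader, they can't read neighbouring pixels or branch.

```ts
// feColorMatrix applies a fixed 4x5 linear map to each pixel's RGBA.
// This example (for illustration) desaturates using Rec. 709 luma weights.
const desaturate = `
  <feColorMatrix type="matrix"
    values="0.2126 0.7152 0.0722 0 0
            0.2126 0.7152 0.0722 0 0
            0.2126 0.7152 0.0722 0 0
            0      0      0      1 0"/>`;
```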
I planned to fix the performance issues before posting here (since I knew HN would be quick to point that out), but somebody posted it first. You’re absolutely right — it’s pretty slow right now and needs optimization.
And it’s not just the refraction/displacement map: plenty of other parts, like visualisations, aren’t optimized yet either.
lol this demo is SO cool. You have nothing to be anything but proud and happy about. You did excellently, and this UI is the perfect realization of the idea. Well done!
It ran perfectly smoothly with no perf hit on a 2020 M1 MacBook Air. There are no issues with this.