He's also involved with libraries for Mapbox and Leaflet, so if you're doing something with maps on the web, there's a good chance you've used some of his code directly or indirectly.
Who said one can't be a Renaissance man these days?
(because sometimes, people have an older, not updated library, for example)
Can someone explain that? I assume the problem is something like, "user clicks on 2D map, find which objects are near the click". A naive solution might store the objects in an array sorted by X coordinate. The problem there is that the search for objects near (x, y) would still have to discard many objects that are far away on the Y axis. Somehow this thing solves that problem?
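Roughly, yes. A spatial index partitions space by both axes at once, so a query only ever inspects the few objects whose region overlaps the query region. Here's a minimal sketch of the idea using a uniform grid (hypothetical code, not the library's actual API; RBush proper uses an R-tree of nested bounding boxes, but the pruning principle is the same):

```javascript
// Bucket points by BOTH x and y, so a query near (x, y) only inspects
// nearby cells instead of scanning objects that are far away on either axis.
const CELL = 10;
const key = (x, y) => `${Math.floor(x / CELL)}:${Math.floor(y / CELL)}`;

function buildGrid(points) {
  const grid = new Map();
  for (const p of points) {
    const k = key(p.x, p.y);
    if (!grid.has(k)) grid.set(k, []);
    grid.get(k).push(p);
  }
  return grid;
}

// Collect candidates from the 3x3 block of cells around the query point,
// then do an exact distance check only on those few candidates.
// (Correct for radius <= CELL; a real index handles arbitrary extents.)
function near(grid, x, y, radius) {
  const result = [];
  const cx = Math.floor(x / CELL), cy = Math.floor(y / CELL);
  for (let i = cx - 1; i <= cx + 1; i++) {
    for (let j = cy - 1; j <= cy + 1; j++) {
      for (const p of grid.get(`${i}:${j}`) || []) {
        if ((p.x - x) ** 2 + (p.y - y) ** 2 <= radius * radius) result.push(p);
      }
    }
  }
  return result;
}

const grid = buildGrid([{ x: 5, y: 5 }, { x: 5, y: 500 }, { x: 300, y: 6 }]);
// The two far-away points are never even distance-checked.
const hits = near(grid, 4, 4, 5);
```

An R-tree does the same kind of pruning hierarchically: objects are grouped into bounding boxes, those into bigger boxes, and a search descends only into boxes that intersect the query rectangle.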
I think of these as a kind of prefilter.
But still: you did port it from C++, and consciously or not, that optimization still applies.
Here's a good thread with benchmarks that compare Earcut (also one of my libraries) to C++ and AssemblyScript WebAssembly ports: https://github.com/mapbox/mapbox-gl-js/issues/4835 (in short, mostly slower than JS)
Another anecdotal example is that C++ and Rust ports of https://github.com/mapbox/delaunator (my Delaunay triangulation library) are only 10-15% faster.
My impression is that JS engines can handle straightforward imperative code on typed arrays very well. But is there more stuff that I should pay attention to when writing performance-sensitive code that acts on large datasets in JS?
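To make the "imperative code on typed arrays" pattern concrete, here's a small sketch (my own illustration, not from any particular library): coordinates stored in one flat `Float64Array` rather than an array of `{x, y}` objects, processed in a plain monomorphic loop. This avoids per-item allocation and GC pressure, and JS engines JIT this kind of loop very effectively.

```javascript
// Flat typed-array layout: [x0, y0, x1, y1, ...] instead of one object per point.
const n = 1000;
const coords = new Float64Array(n * 2);
for (let i = 0; i < n; i++) {
  coords[2 * i] = i;       // x_i = i
  coords[2 * i + 1] = 2 * i; // y_i = 2i
}

// Allocation-free hot loop: sum of squared distances to the origin.
function sumSq(coords) {
  let sum = 0;
  for (let i = 0; i < coords.length; i += 2) {
    sum += coords[i] * coords[i] + coords[i + 1] * coords[i + 1];
  }
  return sum;
}

const total = sumSq(coords);
```

Beyond this, the usual advice I've seen is: keep functions monomorphic (always call them with the same shapes), avoid allocating inside hot loops, and preallocate output buffers where possible.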