I feel like a "hold my beer" version of this would just accept webhooks from your artifact registry and pre-pull new versions of running images, or maybe of images that node has seen in the last X hours.
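A minimal sketch of that idea, assuming a hypothetical webhook payload shape (`{"repository": ..., "tag": ...}`) and using plain `docker pull` -- real registries (Docker Hub, Harbor, ECR via EventBridge) each have their own event formats:

```python
import json
import subprocess
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

SEEN_WINDOW_SECS = 24 * 3600  # the "last X hours" window; value is arbitrary here

# repo -> last time this node ran a container from it (hypothetical cache,
# in practice you'd populate this from the local container runtime)
recently_seen = {}

def should_prepull(repo, now=None):
    """Pre-pull only images this node has actually used within the window."""
    now = time.time() if now is None else now
    last = recently_seen.get(repo)
    return last is not None and (now - last) <= SEEN_WINDOW_SECS

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        event = json.loads(body)
        # Assumed payload shape; adapt to your registry's webhook format.
        repo, tag = event["repository"], event["tag"]
        if should_prepull(repo):
            # Fire and forget: warm the local image cache in the background.
            subprocess.Popen(["docker", "pull", f"{repo}:{tag}"])
        self.send_response(204)
        self.end_headers()

# To run: HTTPServer(("", 8080), WebhookHandler).serve_forever()
```

The filtering matters: without `should_prepull`, every push to a busy registry would hammer every node's disk and network.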
Caching is just part of it: it ensures faster downloads on nearby nodes, but the revolutionary part is that even a 100 GB image can start in a few seconds, with files streamed as they're accessed.
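The core trick can be illustrated with a toy sketch: don't download the layer up front, fetch only the byte ranges a read actually touches (over HTTP range requests against the registry, in a real implementation). The `fetch_chunk` callback and chunk size here are stand-ins, not any real API:

```python
class LazyLayer:
    """Toy on-demand reader: pulls and caches only the chunks that get read."""

    def __init__(self, fetch_chunk, size, chunk=4096):
        self._fetch = fetch_chunk  # callable(offset, length) -> bytes
        self._size = size
        self._chunk = chunk
        self._cache = {}           # chunk index -> bytes already pulled

    def read(self, offset, length):
        out = bytearray()
        end = min(offset + length, self._size)
        while offset < end:
            idx = offset // self._chunk
            if idx not in self._cache:
                start = idx * self._chunk
                self._cache[idx] = self._fetch(start, min(self._chunk, self._size - start))
            within = offset - idx * self._chunk
            take = min(end - offset, self._chunk - within)
            out += self._cache[idx][within:within + take]
            offset += take
        return bytes(out)
```

A container that only ever opens a handful of files out of a 100 GB image then transfers megabytes, not the whole layer, and startup cost is proportional to what's actually read.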
It's not a new idea; I remember at least one project from a few years ago that used a modified Docker engine to download accessed files on the fly, but this is the first time a cloud provider has offered it.