What's the use-case for workers outside of CDN edge compute?
I understand why I'd want to do edge compute: it lets me shift work (in some cases at least) from my backend to the CDN, so my backend gets less traffic and users (possibly) get responses faster since the workers run on CDN edge nodes.
But obviously if I'm running workerd myself, I'm not doing edge compute (unless I'm also operating my own CDN). And if I'm not doing edge compute, just deploying code or a container to a server seems easier. Is it for the case where you want to do serverless stuff in a runtime you host yourself?
The blog post attempted to explain what makes workerd interesting as a platform, independent of Cloudflare. It allows for a new development model where you split your application into smaller components. Maybe that's interesting to you, maybe it isn't.
But the other reason you might develop on workerd is that you want the option to deploy to Cloudflare later. The advantage of Cloudflare isn't just that it's "edge" and therefore closer to clients -- it's also that, much of the time, deploying code on Workers is simply easier than managing servers on traditional cloud providers.
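To make the "smaller components" idea concrete, here's a rough sketch of one worker delegating to another through a service binding. The names here (a gateway worker, an AUTH binding, the /login path) are made up for illustration; the point is that the second component is reached by an in-process call wired up in the workerd config, not by a network hop to a separate server.

    // Hypothetical gateway worker. env.AUTH is assumed to be a service
    // binding declared in the workerd config, pointing at another worker
    // running in the same workerd process.
    export default {
      async fetch(request, env) {
        const url = new URL(request.url);
        if (url.pathname.startsWith("/login")) {
          // Delegate auth traffic to the separate auth component.
          return env.AUTH.fetch(request);
        }
        return new Response("handled by the gateway component");
      }
    };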
From a development point of view, workerd would encourage this new nanoservices/lambda/function paradigm because it would be much easier to develop and test in a self-hosted environment or even in a CI/CD pipeline.
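For example, the whole local loop can be as small as a single module file plus a workerd config. The handler below is only a minimal sketch (the JSON payload and layout are arbitrary); the relevant part is that it uses the same standard Workers module syntax you could later deploy to Cloudflare unchanged.

    // Minimal worker module for local development and testing.
    export default {
      async fetch(request) {
        const path = new URL(request.url).pathname;
        return new Response(JSON.stringify({ ok: true, path }), {
          headers: { "content-type": "application/json" },
        });
      }
    };

Point a workerd config at that module and run it with workerd's serve command (e.g. workerd serve config.capnp, where the config filename is just an example) to get a local HTTP endpoint that integration tests in a CI job can hit directly.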