For simple problem domains (a blog, a mom-and-pop store website, etc.) it's trivial to pre-generate content. For larger content systems you can run into a much more complicated dependency tree, and then you have to choose between keeping the dependency logic accurate and regenerating the entire content set on any change.
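A rough sketch of that trade-off (all names here are made up for illustration): track which pages depend on each piece of source content so an edit only rebuilds the affected pages, with a full rebuild as the escape hatch when the dependencies aren't known.

```python
# Map of source content -> pages that must be rebuilt when it changes.
DEPENDENTS = {
    "pricing-table": ["/pricing", "/", "/enterprise"],
    "blog/post-42":  ["/blog/post-42", "/blog", "/feed.xml"],
}

def pages_to_regenerate(changed_items):
    """Return the set of pages affected by the changed source items."""
    pages = set()
    for item in changed_items:
        # If we don't know the dependencies, fall back to rebuilding
        # everything -- the "regenerate the entire content set" option.
        if item not in DEPENDENTS:
            return {"*"}
        pages.update(DEPENDENTS[item])
    return pages

print(pages_to_regenerate(["pricing-table"]))  # {'/pricing', '/', '/enterprise'}
```

Keeping DEPENDENTS accurate is the hard part; the moment it drifts from reality you're serving stale pages without knowing it.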
It also turns out that content sets that change infrequently but unpredictably are a pain to cache. You can cache them for a short time (as long as stale content can be tolerated), but then you lose cache effectiveness. Or you can cache them forever with some sort of generational/versioned cache, but that doesn't interface with named, public resources very well. Telling your visitors and Google that it's yourdomain.com/v12345/pricing rather than yourdomain.com/v12344/pricing doesn't really fly.
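A minimal sketch of the two options, with hypothetical names and an in-memory dict standing in for whatever cache you actually use:

```python
import time

CACHE = {}               # key -> (stored_at, body)
SHORT_TTL = 60           # seconds of staleness we're willing to tolerate
CONTENT_VERSION = 12345  # bumped whenever the content set changes

def get_with_ttl(path, render):
    """Option 1: cache under the stable public path, expire after SHORT_TTL.
    Visitors may see content up to SHORT_TTL seconds stale."""
    hit = CACHE.get(path)
    if hit and time.time() - hit[0] < SHORT_TTL:
        return hit[1]
    body = render(path)
    CACHE[path] = (time.time(), body)
    return body

def versioned_url(path):
    """Option 2: cache forever by baking the version into the URL itself --
    but now the named, public resource changes on every content bump
    (/v12345/pricing), which is exactly the problem described above."""
    return f"/v{CONTENT_VERSION}{path}"
```

Neither function is the "right" answer; they just make the trade-off concrete: option 1 trades freshness for a stable URL, option 2 trades the stable URL for perfect cacheability.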
I definitely share your surprise that this counts as novel, though. For many situations it's just easier to run extra boxes to handle the load of generating dynamic content on the fly, over and over again. That's good for SuperMicro and AWS. It's not so good for the planet.
I'm very excited to see Jeremy's approach to addressing the problem.