Not only is it a "default option" that is making the web less decentralized, it is also way overkill for something as simple as this. Make sure nginx serves cached content (with the right headers) instead of reading it from disk on every request, and maybe, just maybe, throw a static cache like Varnish in front.
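A minimal sketch of what that might look like in nginx; the domain, paths and cache lifetimes below are placeholders, not anything from the original setup:

    server {
        listen 80;
        server_name example.com;        # placeholder domain
        root /var/www/example;          # placeholder document root

        # Tell clients and intermediate caches to keep static assets around;
        # "expires" sets both the Expires and Cache-Control: max-age headers.
        location ~* \.(css|js|png|jpg|svg|ico)$ {
            expires 7d;
        }

        # Cache open file handles so nginx isn't going back to disk on every hit.
        open_file_cache max=1000 inactive=60s;
        open_file_cache_valid 60s;
    }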
For the times my websites hit the frontpage of HN, a simple nginx instance (without any cache in front like Varnish) on a $5/month Digital Ocean server was enough to handle things.
You don't even need nginx; I've got a single Node process on a Heroku "hobby" dyno serving my site, and it's weathered several front-page visits peaking at 15 requests per second without problems. The important thing is static rendering (assuming it applies to your site's content).
Having a blog post of mine get posted to HN (many years ago) was how I discovered I'd accidentally introduced a bug into my web server config that meant the app server behind it was serving the static assets as well as the dynamic stuff. It entered a state of wedgitude quite rapidly when traffic ramped up.
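For reference, the intended split usually looks something like the sketch below; the upstream port and paths are made up:

    upstream app {
        server 127.0.0.1:8000;      # hypothetical app server
    }

    server {
        listen 80;
        server_name example.com;

        # Static assets come straight off disk from nginx...
        location /static/ {
            root /var/www/example;
            expires 1d;
        }

        # ...and only the dynamic routes reach the app server.
        location / {
            proxy_pass http://app;
        }
    }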
Sadly I don't see any alternative. The need for CDNs is a direct result of the structure of the internet and modern HTTP. We could imagine alternative infrastructures - decentralized transparent network-layer caching so that the network itself caches data and responds to requests with cached results - but the end-to-end structure of TCP and HTTPS makes that impossible, for better or for worse. So we have to use CDNs.
In this case, this is not a virtue. Using a CDN is optional, reversible, and likely to spread your content out so that it is far less centralised than anything hand-rolled.
The best way is to use GitHub Pages + Cloudflare. This way you can also use a static site generator like Hugo, built by GitHub Actions, and you can even deploy some code to Cloudflare Workers.
Here's how I set this up for an OSS game engine I work on:
I can't bring myself to believe that an external caching layer is absolutely required to handle the load from something like a Hacker News front page. I can see WordPress with no caching being taken down by 10 to 20 requests per second, but any kind of local caching should handle that load easily. Does HN produce more traffic than that?
I've been on the frontpage before and my logs indicate that the peak load was somewhere north of 350 requests/sec, though the non-peak (while still on the frontpage) was typically 10-20 requests/second. That is for a very simple page consisting of an HTML document, CSS, an image, a favicon and no JavaScript (e.g. https://calpaterson.com/printers.html). If you can serve from a reverse proxy cache locally, that will be fine, but if some PHP/Python/NodeJS runs on each request you can run into trouble with the small servers people typically use for their side-project's website.
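A rough sketch of that kind of local reverse-proxy cache in nginx ("micro-caching"), assuming a dynamic app listening on port 8000; even a cache validity of a few seconds flattens a frontpage spike because most requests never reach the app:

    # Goes in the http block: a small shared cache zone on disk.
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=microcache:10m max_size=100m;

    server {
        listen 80;
        server_name example.com;        # placeholder domain

        location / {
            proxy_cache microcache;
            proxy_cache_valid 200 5s;                   # cache successful responses briefly
            proxy_cache_use_stale updating error timeout;
            proxy_pass http://127.0.0.1:8000;           # hypothetical PHP/Python/Node app
        }
    }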