
A non-trivial website that has no back end - oxplot
https://blog.oxplot.com/the-art-of-barebackness/
======
asuffield
This feels like it's playing games with words a lot. What they've built here
is a service that is perfectly cacheable: the same content is served to every
user, and can be precomputed in advance. This allows them to run their page
generation code in a batch generator instead of while handling the HTTP
request.

It's certainly a very good idea to serve as much of your content from cache as
is possible, and I don't think anybody would be surprised by that statement.
However, they still have the same amount of page generation code, they're just
running it at a different time. I'm not sure I'd call that "no back end". I'd
call it an interesting choice of tradeoffs to optimise query latency.
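The tradeoff described here can be sketched in a few lines. This is an illustrative toy, not the site's actual code; `render_page`, `SITES`, and the file layout are all made up:

```python
# Sketch: the same page-generation code can run per-request (classic
# back end) or ahead of time as a batch job that emits static files.
# All names here are illustrative assumptions.

import pathlib

SITES = {"example.com": {"rank": 1}, "oxplot.com": {"rank": 2}}

def render_page(site, data):
    # The generation logic is identical in both modes.
    return f"<h1>{site}</h1><p>rank: {data['rank']}</p>"

# Option A: render during the HTTP request.
def handle_request(path):
    site = path.lstrip("/")
    return render_page(site, SITES[site])

# Option B: batch generator -- precompute every page to static files,
# then serve them from a dumb file server or CDN with no back end
# in the request path.
def build_static(outdir):
    out = pathlib.Path(outdir)
    out.mkdir(parents=True, exist_ok=True)
    for site, data in SITES.items():
        (out / f"{site}.html").write_text(render_page(site, data))

build_static("public")
```

Either way the generation code exists and runs; only the timing (and therefore the query latency) changes.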

~~~
icebraining
Yeap, the approach was reasonably common in the past - e.g. Movable Type has a
dynamic administration backend that generates static public pages; Slash (the
codebase powering early Slashdot) used to re-generate static HTML files
whenever someone posted a new comment, etc.

------
amluto
This is a good post. It would be an even better post if the first sentence
were removed.

~~~
datapolitical
Removing the first two would be even better.

------
markbnj
This website has a back end, imo. It has crawlers, indexers, jobs, etc. Really
the only thing it doesn't have is an API, a monolithic request handler, or a
database. Well, they don't show a DB, but they probably have at least a list
of sites to crawl. Also worth pointing out that, unless they are doing
geolocation or otherwise trying to pin down what a valid request is for a
specific IP, someone should theoretically be able to extract the entire
dataset as static files.

Lastly, I agree with the other poster who found the first sentence
disagreeable. I am not offended, but it just seemed out of place in a
semi-smarmy way.

~~~
oxplot
> someone should theoretically be able to extract the entire dataset as static
> files

Yep, not only possible but trivial. One just has to figure out where the hash
links are in each blob and follow them.
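The thread doesn't show the site's actual blob format, but the extraction oxplot describes amounts to a graph walk over hash links. A hypothetical sketch, assuming blobs are plain text addressed by short hex names (the `BLOBS` store and `HASH_RE` pattern are invented for illustration):

```python
# Hypothetical sketch of extracting the full dataset: start from a root
# blob, find hash-style links inside it, and fetch everything reachable.
# The blob contents and link shape are assumptions, not the site's scheme.

import re

# Stand-in for the static file store: blob name -> blob content.
BLOBS = {
    "root": "index linking to a1b2c3 and d4e5f6",
    "a1b2c3": "leaf blob, links to d4e5f6",
    "d4e5f6": "leaf blob, no further links",
}

HASH_RE = re.compile(r"\b[0-9a-f]{6}\b")  # assumed link shape

def crawl(start):
    """Breadth-first walk over hash links; returns every blob reached."""
    seen, queue = {}, [start]
    while queue:
        name = queue.pop(0)
        if name in seen:
            continue
        seen[name] = BLOBS[name]
        queue.extend(h for h in HASH_RE.findall(seen[name]) if h in BLOBS)
    return seen

dataset = crawl("root")
```

Since every blob is a static file, nothing rate-limits or personalizes the walk; the whole dataset falls out of following links from the root.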

