Fifteen years ago this made even more sense. I wrote a static site generator in good ol' Perl 5 in the 90s – and I'm sure I wasn't the only one to do this. (Everything was based on file hierarchies, and not much structure or setup was required beyond that. Directories were mangled into intermediate data files in pseudo-XML, which were reassembled only when there was an update local to that hierarchy. XML wasn't the big thing yet, but since the generator had to mangle HTML anyway, it was somewhat obvious to embed data structures in tags.)
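For illustration, a minimal Perl 5 sketch of that kind of approach (not the original code): it walks a content tree and writes one intermediate pseudo-XML data file per directory, which a later pass could reassemble into index pages. The content/ directory, the .index.xml filename, and the <entry> tag are made-up conventions for the sketch.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Find;

    # Collect per-directory metadata from a tree of HTML files.
    my %entries;    # directory => [ [ file, title ], ... ]

    find(sub {
        return unless /\.html$/;
        open my $fh, '<', $_ or return;
        my $html = do { local $/; <$fh> };
        close $fh;
        my ($title) = $html =~ m{<title>(.*?)</title>}si;
        push @{ $entries{$File::Find::dir} }, [ $_, defined $title ? $title : $_ ];
    }, 'content');

    # Write one pseudo-XML data file per directory; a later pass would
    # reassemble these into HTML, and only for hierarchies that changed.
    # (No escaping here -- it's pseudo-XML, as in the description above.)
    for my $dir (keys %entries) {
        open my $out, '>', "$dir/.index.xml" or die "cannot write $dir/.index.xml: $!";
        print {$out} "<index>\n";
        print {$out} qq{  <entry file="$_->[0]">$_->[1]</entry>\n} for @{ $entries{$dir} };
        print {$out} "</index>\n";
        close $out;
    }

The incremental part – regenerating only the hierarchy that changed – would then hinge on comparing these intermediate files against the rendered output.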

Back then, this really had some advantages: updates were rare but views comparably frequent, while databases were either not that performant or quite expensive. This way, you could serve everything from cache (remember Squid?), and, compared to a dynamic site, it was really quick, even in admin mode. Given modern machines and the amounts of memory they come with, it's quite ironic to see this come back, when the triumph of the LAMP stack happened on comparably modest hardware. Nevertheless, if you have only a few updates and lots of views, it's a good step towards green computing. (Save some mountaintops! [1])

On the other hand, there is some "magical" limit of flexibility and complexity beyond which things soon tend towards unmaintainable code. So the judgement is left to you, depending on the purpose.

Edit: [1] "Mountaintop mining" on Google Images: https://www.google.com/search?tbm=isch&hl=en&q=mountaintop+m...
