Nobody's teaching the far superior old ways; all anybody is learning anymore is overcomplicated JavaScript stack garbage. And most of these "Show HN" type posts seem to be someone trying to show off a learning project.
And if you take all that and abstract it out by running it inside another VM, then you can just pack up your infra and move it to wherever, whenever you want!
I can't tell if this is parody or not. Hosting on vercel is the easiest, fastest way to prod I can think of. AWS is fairly complicated if you're not used to it, but honestly just setting up a DB and getting a connection string is very easy. Also, PHP sucks. If this is parody, I apologize.
I don't think this is a parody either; this is like saying the easiest way to format text is to use Word, and then poo-pooing HTML...
Maybe to someone who only knows "modern web dev" it seems like a parody; most people don't start with "let's learn 10 overcomplex frameworks and use a template generator which only takes one step".
For generating horrible bloated apps as fast as possible you may have something "easy", but I will probably never use vercel; I'm much more likely to write a script and expose it on a port somewhere.
Unless I've misgauged vercel and you can feed it a python script or something...
Meanwhile vercel is literally “now”. Like all you need to do is cd to the directory and type “now” and press enter.
https://docs.ycombinator.lol has been running for years without fail. It’ll probably shut off someday and be lost to history, but I’d bet on it lasting a lot longer than that ec2 instance. Free, too.
You have to admit that it scales pretty well though. Most people who set this up with PHP and nginx wouldn't be able to do it in a way that happily handles the tens of requests per second.
Well... but does it really need to scale, if it is designed properly?
For instance, the distorted image is being loaded as a base64 data blob within the HTML. The size of that page is 401 kB, most of which is that base64 data. The actual JPEG is just 97 kB (the non-distorted image is about 300 kB). If the image is served separately, browsers will cache it for those who just reload the page (there is no need to update it anyway, since it is only updated on the server every 400 requests).
So with a 100 Mbps connection you can easily (assuming unique users, thus no caching in the browsers) serve not just tens of requests/s, but over a hundred. Put the image on a CDN, pre-degrade it into many levels, and with the same server just serving HTML, you can do thousands of requests/s.
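Quick back-of-the-envelope using the figures above (a rough sketch; the ~10 kB size for the HTML once the base64 blob is stripped out is my own guess, and TCP/TLS and header overhead will shave a bit off both numbers):

```python
# Rough throughput estimate for a 100 Mbps uplink, using the sizes quoted above.

LINK_KB_PER_S = 100_000_000 / 8 / 1000    # 100 Mbps ~= 12,500 kB/s

inline_page_kb = 401    # HTML with the image embedded as a base64 blob
jpeg_kb = 97            # the distorted image served as a separate file
bare_html_kb = 10       # assumed size of the page once the blob is removed

print(f"inline base64:  {LINK_KB_PER_S / inline_page_kb:.0f} req/s")            # ~31
print(f"separate jpeg:  {LINK_KB_PER_S / (bare_html_kb + jpeg_kb):.0f} req/s")  # ~117
```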
Sounds (over)complicated. Wouldn't a static HTML page with a PHP/JS backend and SQLite, or even without one, just grepping logs via cron, do the trick?
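Something like this, run from cron, would cover the "grep the logs" part (a rough sketch, not the actual site's code; the log path, counter file, and the degrade step are all my assumptions):

```python
#!/usr/bin/env python3
"""Run from cron every minute: count hits in the access log and
re-degrade the image once every 400 requests."""

from pathlib import Path

LOG = Path("/var/log/nginx/access.log")   # assumed log location
COUNTER = Path("/var/tmp/hit_counter")    # remembers the count from the last run
THRESHOLD = 400                           # degrade the image every 400 hits

def degrade_image():
    # Placeholder: re-encode the JPEG at a lower quality here,
    # e.g. with Pillow or a shell call to ImageMagick.
    pass

def main():
    seen = int(COUNTER.read_text()) if COUNTER.exists() else 0
    with LOG.open() as f:
        total = sum(1 for _ in f)               # total requests logged so far
    if total // THRESHOLD > seen // THRESHOLD:  # crossed a 400-hit boundary
        degrade_image()
    COUNTER.write_text(str(total))

if __name__ == "__main__":
    main()
```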
Having nginx as a front-end, with static file caching enabled, would handle the load from the HN front page just fine, even on an RPi.