


> He probably thinks the internet works on static 5k html pages, while the norm is 100kb, dynamically generated pages.

I just work on web stuff that people actually use. It's 2026, thousands of requests per second is nothing. You'll probably be fine even with stock apache2 and some janky php scripts.

A single gbit line will serve a 100 kB page a thousand times a second without issues.
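Back-of-envelope (Python, ignoring TCP/TLS and HTTP header overhead):

    # rough ceiling for a 1 Gbit/s link pushing 100 kB pages
    link_bits_per_s = 1_000_000_000          # 1 Gbit/s
    page_bytes = 100 * 1000                  # 100 kB per page
    print(link_bits_per_s / 8 / page_bytes)  # -> 1250.0 pages per second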

Dynamically generated pages you can't easily serve at rates in excess of tens of thousands of requests per second from commodity hardware are extremely rare.


For most web apps, bandwidth won't be the issue; they'll still be I/O bound, or maybe CPU bound.


Sure, but CPUs and I/O are so fast now that it's genuinely difficult to hit those bottlenecks unless you're doing something weird.

Also, hardware these days is good enough that a CRUD web app could very well be bandwidth limited.


I don’t think you realize how fast modern CPUs are. If this stresses your server out, you probably have no business hosting things publicly on that server. This person is hosting stuff on Vercel using serverless, which is the root of their problem.

4 requests per second is just noise. It’s like complaining about car noise when deciding to buy a house next to the freeway. Exposing things publicly on the internet means _anyone_ can try talking to your server. Real users, bots, hackers, whatever. You can’t guarantee bots are bug-free!

Dynamic content is _typically_ served to logged-in users. Content that is public facing is typically cached, for obvious reasons. Of course Meta should fix this… but using Vercel and serverless in this manner is a very poor choice.
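Roughly what that looks like in practice: mark public pages as cacheable so a CDN or reverse proxy absorbs most of the bot traffic, and keep per-user responses out of shared caches. A minimal Flask-style sketch (routes and max-age values are made up for illustration):

    from flask import Flask, make_response

    app = Flask(__name__)

    @app.route("/post/<int:post_id>")
    def public_post(post_id):
        # public content: let the CDN / reverse proxy cache it,
        # so bots mostly hit the cache instead of the app
        resp = make_response(f"<h1>Post {post_id}</h1>")
        resp.headers["Cache-Control"] = "public, max-age=300"
        return resp

    @app.route("/dashboard")
    def dashboard():
        # logged-in content: never cached at the edge
        resp = make_response("<h1>Your dashboard</h1>")
        resp.headers["Cache-Control"] = "private, no-store"
        return resp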


Meta isn’t going to fix this because they have your mindset.

Meanwhile, my website with 48M pages over 8 domains is getting hammered with over 200 req/s 24/7 from AI bots in addition to the regular search engine bots. It seems like every day new bots appear that all want to download every single one of my URLs.

To me it’s not background noise. It’s a problem. Serving them burns a lot of CPU and bandwidth. I could get by with 95% fewer resources and have faster response times for my actual users if these bots would just bugger off.


even 100 kB dynamically generated pages should be a piece of cake. if it's CRUD-like (as original op's site is), it should be downright trivial to serve that much from, like... shared hosting (although even a VPS would be much better).

(in original op's case, i clocked 197 requests using 20.60 MB while browsing their site for a little bit. most of it is static assets and i had caching disabled so each new pageload loaded stuff like the apple touch icons.)

honestly you could probably put it behind nginx for the statics and just use bog-standard postgres or prolly even sqlite. nice bonus: you don't have to worry about cold start times either!
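something like this, sketched in python (the db path, table and route are made up; nginx would serve /static/* itself and proxy the rest to the app):

    import sqlite3
    from flask import Flask, jsonify

    app = Flask(__name__)
    DB = "app.db"  # hypothetical path

    def query(sql, args=()):
        # a fresh connection per query is plenty at a few requests per second
        with sqlite3.connect(DB) as con:
            con.row_factory = sqlite3.Row
            return [dict(row) for row in con.execute(sql, args)]

    @app.route("/api/posts")
    def posts():
        # assumes a `posts` table exists; purely illustrative
        return jsonify(query("SELECT id, title FROM posts ORDER BY id DESC LIMIT 20"))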


I don't have a car. I don't need one because trains exist. My website can also handle 4 requests per second.



