
There are a few comments in here that predictably suggest that simple static sites can handle large request rates easily.

Sure, that's true - but to move the conversation forward: how would you measure the complexity of serving web requests, so that more meaningful cost comparisons are possible?

(Bandwidth wouldn't be quite right, or at least not sufficient - maybe something like the I/O, memory and compute resources used per request?)
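One way to make that concrete (a rough sketch of my own, not something from the thread): instrument each request handler to record CPU time, peak memory allocated, and bytes sent, then compare those per-request profiles between a static site and a dynamic app. The handler, the metric choices, and the index.html path below are all illustrative assumptions.

    # Hypothetical sketch: measure per-request CPU, memory, and bandwidth.
    import time
    import tracemalloc

    def measure_request_cost(handler):
        """Wrap a request handler and report the resources it consumed."""
        def wrapper(request):
            tracemalloc.start()
            cpu_start = time.process_time()

            response_body = handler(request)

            cpu_used = time.process_time() - cpu_start     # CPU seconds
            _, peak_mem = tracemalloc.get_traced_memory()  # peak bytes allocated
            tracemalloc.stop()
            bytes_sent = len(response_body)                # crude bandwidth proxy

            print(f"cpu={cpu_used:.6f}s mem={peak_mem}B sent={bytes_sent}B")
            return response_body
        return wrapper

    @measure_request_cost
    def serve_static(request):
        # Static-file case: cost is dominated by disk I/O and bytes sent.
        # Assumes an index.html exists alongside the script.
        with open("index.html", "rb") as f:
            return f.read()

How you then weight CPU versus memory versus bytes into a single "cost per request" number is the harder, provider-specific part of the comparison.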


