Hacker News

There are a few comments in here that predictably suggest that simple static sites can handle large request rates easily.

Sure, that's true, but to move the conversation forward: how would you measure the complexity of serving web requests, so that more meaningful cost comparisons can be made?

(Bandwidth alone wouldn't be quite right, or at least not sufficient; maybe some combination of I/O, memory, and compute resources used?)
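One crude way to make such a comparison concrete is a weighted resource model per request. The sketch below is purely illustrative: the resource fields, unit prices, and sample numbers are all made-up assumptions, not real cloud pricing, but it shows how a static file hit and a dynamic, database-backed request could be compared on a single cost axis.

```python
from dataclasses import dataclass

# Hypothetical per-request resource profile; field names are assumptions.
@dataclass
class RequestProfile:
    cpu_ms: float    # compute time consumed, milliseconds
    mem_mb_s: float  # memory held, megabyte-seconds
    io_bytes: int    # disk/network I/O performed, bytes

def request_cost(p: RequestProfile,
                 cpu_price: float = 1e-5,   # $ per CPU-ms (assumed)
                 mem_price: float = 1e-6,   # $ per MB-s (assumed)
                 io_price: float = 1e-10):  # $ per byte (assumed)
    """Weighted sum of resources used by one request."""
    return (p.cpu_ms * cpu_price
            + p.mem_mb_s * mem_price
            + p.io_bytes * io_price)

# Made-up example profiles: a cached static hit vs. a dynamic page render.
static_hit = RequestProfile(cpu_ms=0.1, mem_mb_s=0.01, io_bytes=50_000)
dynamic_hit = RequestProfile(cpu_ms=25.0, mem_mb_s=5.0, io_bytes=200_000)

print(request_cost(static_hit) < request_cost(dynamic_hit))  # True
```

The weights are the hard part in practice: they would have to come from measured billing data, and the model ignores fixed costs and burst behaviour entirely.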





