
This creates a surprising situation.

We have a batch process that uses S3 to temporarily store some big datasets while another process streams and processes them and, once complete, removes them. The whole thing takes about an hour.

So our S3 bill could be, let's say, $10/mo. If you went and looked at that bill and thought setting a $20 cap would give you reasonable headroom, you'd be sorely surprised the next time the batch job triggered and S3 stopped accepting writes, or other services started getting shut down or failing.

Under this system, the budget and the actual bill have to differ by a factor of more than 10 (over $240 here). And that in turn means the cap doesn't stop your bill from running 10x past what you actually "wanted" to limit it to, which is more than the $200 under discussion.
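To make that concrete, here's a rough back-of-the-envelope sketch (my numbers, not anything from the thread): assuming roughly 10 TB staged temporarily and S3 Standard at about $0.023/GB-month, a forecast-style cap sees a ~$235/month run rate during that hour, even though the hour itself costs around 30 cents.

    # Hypothetical illustration: why a $20 cap on a $10/mo bill trips
    # during the batch window. The 10 TB size and ~$0.023/GB-month rate
    # are assumptions, not figures from the comment above.
    STAGED_GB = 10 * 1024          # temporary dataset parked in S3
    PRICE_PER_GB_MONTH = 0.023     # approximate S3 Standard storage rate
    HOURS_PER_MONTH = 730

    # What a budget that extrapolates the current run rate would "see"
    projected_monthly = STAGED_GB * PRICE_PER_GB_MONTH
    # What the hour of temporary storage actually costs
    actual_hourly = projected_monthly / HOURS_PER_MONTH

    print(f"projected monthly rate while staged: ${projected_monthly:.2f}")
    print(f"actual cost of the one-hour window:  ${actual_hourly:.2f}")

Under those assumptions the projected figure lands in the $235-ish range, which is where a "more than 10x the $20 cap" number comes from, while the real incremental cost is pocket change.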



