This is a pet peeve of mine, and I wonder how many HNers share this feeling. Why doesn't Amazon provide hard spending limits? Back in 2006 [1] this feature was supposed to be "in the works", but clearly it's not.
Just as with the low security of many domain registrars, I guess there are not as many horror stories as my paranoid mind would lead me to believe. Any thoughts?
PS: Yes, I understand the dangers of enabling these limits, which ideally should be accompanied by previous alerts, etc.
This is one of those ideas that makes all the sense in the world on paper. Then someone turns on the limits, and you hear a big news story about how "NewHotness.com" went down on Black Friday because of the Hard Limits feature, and how AWS should never offer a feature like this because "you never know how much you might scale in a flood of users" and "that is the whole point of cloud computing!"
Amazon is just saving themselves the heartache of all the irony :)
I imagine that must be their thinking. It's clearly not a lack of technical resources. Moreover, Amazon has already shown good will [1] when negligence led to an unexpectedly high bill.
I'm surprised (though empirically without much reason) not to hear stories from NewHotness et al. about "how AWS burned five times our monthly budget in one day of frenzy". Even though those new visitors may well be welcome for the business, I imagine there are many folks who would like some control (say, the option to double the limit), perhaps at the risk of leaving the site offline for a few minutes.
Would a spending limit require terminating the user's EC2 instances, deleting their data in S3 and DynamoDB, etc.? If not, these things could continue to accrue cost.
Amazon's not amazing at announcing useful timelines for their products/services. There are presumably a bunch of internal factors at play, but spending limits aren't the only thing that's been "in the works" for ages. If you're holding out for something that may or may not be coming (e.g. DevPay in your region, hard spending limits, long-term queues, scheduling, the ability to CloudFront S3 'subfolders', etc.), it's frustrating, but on the other hand, they do tend to release well-thought-out services that make sense.
If it's any consolation, I have heard of individuals getting huge bills and having them cleared by Amazon when it became apparent that they hadn't willingly racked up that many fees (like the guy who included his S3 images in a Google spreadsheet, causing several TB of requests - http://www.behind-the-enemy-lines.com/2012/04/google-attack-... ).
Why not take the approach of monitoring it instead and triggering alarms based on specific traffic? So if you see too much foo traffic relative to HTTP, trigger an alert. This is one of the things we use Boundary for.
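For what it's worth, the alert-then-enforce behavior people keep asking for boils down to a couple of thresholds. Here's a minimal sketch of that logic; every name, threshold, and hook below is made up for illustration, and none of it is a real AWS or Boundary API:

```python
# Hypothetical alert-then-enforce logic for a spending limit.
# The 80% warning fraction and the action names are assumptions.

def check_spend(current_spend, monthly_limit, alert_fraction=0.8):
    """Decide what to do at the current spend level.

    'suspend' -> spend reached the hard limit
    'alert'   -> spend crossed the early-warning threshold
    'ok'      -> nothing to do
    """
    if current_spend >= monthly_limit:
        return "suspend"
    if current_spend >= alert_fraction * monthly_limit:
        return "alert"
    return "ok"


def raise_limit(monthly_limit, factor=2):
    """Let the owner raise the cap (e.g. double it) after an alert,
    instead of letting the site go dark."""
    return monthly_limit * factor
```

The point of the early alert is exactly the control people mention above: the owner gets a window to double the limit before the hard cap kicks in.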
[1] https://forums.aws.amazon.com/thread.jspa?threadID=10532