At an hourly rate of $120, thinking for 10 minutes about saving RAM costs my client $20. With that money he could have bought 1GB of extra RAM. I'm sure that 99.999% of the time, the RAM saved would be less than 1GB. It's a simple cost/benefit equation. If you can save more than 1GB by thinking about it for 10 minutes, you're writing shitty code to begin with.
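The break-even point above is simple enough to sketch in a few lines. All the numbers here are the hypothetical ones from the comment ($120/hour, 10 minutes, $20/GB), not measured prices:

```python
# Back-of-envelope break-even calculation, using the illustrative
# numbers from the comment above (assumptions, not real quotes).
HOURLY_RATE = 120.0       # developer cost, $/hour
MINUTES_SPENT = 10        # time spent thinking about saving RAM
RAM_PRICE_PER_GB = 20.0   # assumed price of 1GB of extra RAM

optimization_cost = HOURLY_RATE * MINUTES_SPENT / 60   # dollars spent thinking
break_even_gb = optimization_cost / RAM_PRICE_PER_GB   # GB you must save to come out ahead

print(f"Thinking cost: ${optimization_cost:.2f}")
print(f"Break-even savings: {break_even_gb:.1f} GB")
```

Under these assumptions the optimization only pays off if it saves more than 1GB, which is the whole argument in two divisions.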
There are times when the benefits are greater, for example when your software runs on a million instances or you're working on a hardware-intensive game, but those are certainly not the common case.
The cost of 1GB of extra RAM is $20 in a certain range, namely where your system fits fairly comfortably on a single machine.
Once you get past a certain level, though, the cost of the next 1GB isn't $20, it's $20 plus the cost of another computer plus the cost of exploiting multiple machines rather than just running on a single one.
Then it's $20/GB for a while again, then $20 plus the cost of adding another machine, and at a certain point you need to add the cost of dealing with the fact that performance no longer scales linearly with the amount of hardware.
That last bit might be a concern only in fairly rare cases. But the first, where you make the transition from needing one machine to get the job done to needing more than one, isn't so rare. And that can be a big, big cost.
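The step-function cost curve described above can be made concrete with a tiny model. The per-machine capacity and machine price below are illustrative assumptions, not figures from the thread:

```python
# Sketch of the step-function cost model: RAM is ~$20/GB while you still
# fit on your existing machines, but crossing a machine boundary adds the
# cost of a whole new box. Capacity and machine cost are made-up numbers.
RAM_PRICE_PER_GB = 20.0
MACHINE_CAPACITY_GB = 64     # assumed RAM ceiling per machine
NEW_MACHINE_COST = 2000.0    # assumed cost of provisioning another box

def marginal_cost(current_gb: int) -> float:
    """Cost of the next 1GB, given the total GB already provisioned."""
    # If every machine is full, the next GB forces a new machine.
    at_boundary = current_gb > 0 and current_gb % MACHINE_CAPACITY_GB == 0
    return RAM_PRICE_PER_GB + (NEW_MACHINE_COST if at_boundary else 0.0)

print(marginal_cost(32))   # mid-machine: just the RAM price
print(marginal_cost(64))   # at the boundary: RAM price plus a new machine
```

The model also leaves out the engineering cost of making the software run on multiple machines at all, which the comment above rightly calls the big one.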
(Very similar considerations apply to CPU time, of course. Typically more so.)
We can't afford hourly rates of $120, and RAM most definitely doesn't cost just $20 once you factor in the inevitable new VM or machine. RAM is critical for maintaining concurrent connections, and there are times you need to keep large datasets in memory to avoid hitting the database.
This casual disregard for resources would explain why a lot of startups run into infrastructure issues so quickly and settle on terrible solutions (or outright snake oil) to mitigate problems that shouldn't exist in the first place.
People need to start realizing that The Cloud is only a metaphor. Hardware isn't intangible, and neither is its cost.
Servers don't grow on trees; they live on a rack and grow on electricity and NAND.
Yes, $20 for the memory, plus $200 in time spent getting approval, another $200 to physically install it (because you "obviously" can't just open up that server; there are procedures for that kind of stuff), and then $2000 in time wasted by the users while they spent six months waiting for that one RAM stick to get installed.
It's easy to think everyone has their act together like Facebook or Google, but most companies I've dealt with have hardware upgrade processes measured in months or years, not hours or days. You absolutely have to take responsibility for your work as a programmer and make stuff run fast, instead of labeling it somebody else's problem.