It depends on the type of optimization. For page caching I definitely agree, not because it's a waste of effort, but two other distinct reasons:

First, because you might be papering over more fundamental performance issues that will still hurt you in the long term, and that will be even harder to spot once caching is in place.

Second, because cache invalidation is quite often non-trivial: doing it right may require a somewhat thick layer of code and a certain discipline going forward. That will slow down development if you are in rapid pivot mode.
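To make the "thick layer of code" point concrete, here is a minimal sketch of the usual TTL-plus-explicit-invalidation pattern in Python. The names (PageCache, render_fn) and the TTL default are illustrative assumptions, not any particular framework's API:

    import time

    class PageCache:
        def __init__(self, ttl_seconds=300):
            self.ttl = ttl_seconds
            self.store = {}  # key -> (rendered_html, expiry_timestamp)

        def get(self, key, render_fn):
            entry = self.store.get(key)
            if entry and entry[1] > time.time():
                return entry[0]          # fresh hit
            html = render_fn()           # miss or stale: re-render
            self.store[key] = (html, time.time() + self.ttl)
            return html

        def invalidate(self, key):
            # Must be called from every write path that changes the
            # page's underlying data -- forgetting one is the classic bug.
            self.store.pop(key, None)

    # Usage:
    cache = PageCache(ttl_seconds=60)
    html = cache.get("/home", lambda: "<html>...</html>")
    cache.invalidate("/home")  # e.g. after a post on the page is edited

The hard part isn't this class; it's the discipline of making sure every write path that touches the underlying data actually calls invalidate.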

However, if your performance fundamentals are already resolved, the business model is in place, and you expect the code to be around for a while, then the effort you put into caching will be amortized over many years. It pays dividends not only in server costs, but also in user experience, thanks to fast page loads, and in correctness, because you will have time to get the caching right rather than scrambling to add it at scale and potentially serving stale data to millions of people instead of thousands.



