
Cool as the intellectual challenge of tight coding may be, running on the bare minimum of resources also makes things more vulnerable to DDoS, the Slashdot effect, and less-than-ethical people running abusive scraping tools that don't respect robots.txt. As someone on the receiving end of the very rare 3am phone call for networking-related emergencies, I try to provision a sufficient margin of resources above "the bare minimum" to ensure I'm not woken up by some asshat with a misconfigured mass HTTP mirroring tool.

RAM is now so cheap that, for anything small to mid-sized, you can trivially afford to keep the entire database cached at all times, with only very rare disk I/O.
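
A quick sanity check before committing to that is just measuring how big the database actually is on disk. A minimal sketch, assuming a MySQL backend (one of the databases RT commonly sits on); the query is plain information_schema fare:

    -- Rough on-disk size per schema (data + indexes), in GB.
    SELECT table_schema,
           ROUND(SUM(data_length + index_length) / 1024 / 1024 / 1024, 1) AS size_gb
    FROM   information_schema.tables
    GROUP  BY table_schema;

If the total sits comfortably under your RAM budget, the whole working set can stay resident.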

As an example, we have a Request Tracker ticket database for a fairly large ISP that comes to a grand total of under 40GB and lives entirely in RAM. That's tens of thousands of tickets with attachments and full-body text search enabled. For those not familiar with RT4, it's a convoluted mess of Perl scripts.
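
For a dataset that size, the main knob is just sizing the database's cache above it. A minimal sketch, assuming MySQL/InnoDB; the file path and the 48G figure are my own assumptions, picked to sit comfortably above a ~40GB dataset:

    # /etc/mysql/conf.d/buffer-pool.cnf  (hypothetical path)
    [mysqld]
    # Keep the whole ~40GB working set resident; disk I/O is then
    # mostly writes and cache warm-up after a restart.
    innodb_buffer_pool_size      = 48G
    innodb_buffer_pool_instances = 8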

I could probably run my primary authoritative master DNS on bind9 on Debian stable on a 15-year-old Pentium 4 with 256MB of RAM, but I don't...
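
For what it's worth, the authoritative-only side of that setup really is tiny. A minimal sketch of the named.conf side, assuming BIND 9.9.4+ for response rate limiting (the zone name is a placeholder); RRL is exactly the sort of knob that blunts the DDoS-style abuse mentioned above:

    // /etc/bind/named.conf.options
    options {
        directory "/var/cache/bind";
        recursion no;                 // authoritative only, never an open resolver
        rate-limit {
            responses-per-second 10;  // cap identical answers per client per second
        };
    };

    // /etc/bind/named.conf.local
    zone "example.com" {
        type master;
        file "/etc/bind/db.example.com";
    };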



