
You'd do it because it's simple:

  npm install toobusy

then somewhere in your project you add the toobusy() check at the top of your request handling.
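A minimal sketch of how that check usually looks. The `isBusy` predicate here is a hypothetical stand-in for the real `toobusy()` call, which measures event-loop lag:

```javascript
// Wrap a request handler so it sheds load with a 503 when busy.
// isBusy is a stand-in for the real toobusy() event-loop-lag check.
function makeBusyGuard(isBusy, handler) {
  return function (request, response) {
    if (isBusy()) {
      response.statusCode = 503;
      return response.end('Server too busy, try again shortly.');
    }
    return handler(request, response);
  };
}
```

The point is that the guard runs before any real work, so a pinned server spends almost nothing on the requests it rejects.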

If you only need one server, that's going to be a lot faster than putting anything in front of it.


A single server setup might be simpler, but it won't be faster. Varnish serving a cached page from memory is going to be faster than 5 asynchronous calls that take 5ms of CPU time (plus filesystem I/O, in the case of the database and template given in the example). With Varnish, even with a 1 second TTL (and 1 second grace), your first request takes the 5ms hit, but the next 199 in that second are served from memory.

Now with Varnish serving 199 out of 200 requests from memory, if your backend is still toobusy, by all means serve a 503, and Varnish can cache that too.
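For reference, a hypothetical VCL fragment for that setup, assuming Varnish 4+ syntax:

```vcl
# Cache everything for 1 second, and keep serving the stale copy
# for 1 more second (grace) while the backend rebuilds it.
sub vcl_backend_response {
    set beresp.ttl = 1s;
    set beresp.grace = 1s;
}
```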

You have a problem, and so you try to solve it with caching. Now you have two problems.

I think that's how that quote goes.

I think of caching for apps like clothes for people...while you could survive naked, it's more comfortable with clothes, and you're protected from heat, cold, etc. Of course there is the problem of being over/under dressed, but you have to wear something.

That's super unnecessary for a single server; you can literally just use a variable outside of your request handling as a cache.

  var cache;

  module.exports = function(request, response) {
    if (cache) {
      return response.end(cache);
    }

    // get my data from wherever
    cache = the_data;

    return response.end(cache);
  };
Now 199 out of 200 requests are from memory, there are zero extra moving parts, and you're using a cool part of the language instead of a 3rd party tool you have to select and configure.

How do you selectively serve the cached response to some users based on cookies, headers, etc., and expire it after a set amount of time? How would you gather stats about how many cache hits vs misses you have? How do you serve the cached response when your server is pinned? These are problems 3rd party tools have solved.

You can expire things with setInterval and a counter. You serve the cached response whenever you have one; if you're using the cache as a fallback, you'd combine it with something like toobusy so you have a functional (if not fresh) 'under load' page.
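A sketch of that interval-based expiry plus hit/miss counters, where `fetchData` is a hypothetical stand-in for the real database/template work:

```javascript
var cache = null;
var hits = 0;
var misses = 0;

// Serve from memory when we can; rebuild the cache when we can't.
// fetchData is a hypothetical stand-in for the real data work.
function handle(request, response, fetchData) {
  if (cache !== null) {
    hits++;
    return response.end(cache);
  }
  misses++;
  cache = fetchData();
  return response.end(cache);
}

// Expire once a second so content is never more than a second stale.
// unref() keeps this timer from holding the process open on its own.
setInterval(function () { cache = null; }, 1000).unref();
```

The counters give you the hit/miss stats the parent comment asked about, for free.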

Where it gets really fun with NodeJS is you can do all of your data work outside of the requests so you can pull all of your content out at the start and refresh it on an interval independently of the users, which can eliminate some or all of their trips to the database if you're lucky and it fits in ram and is viable etc.
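A sketch of that pattern, where `loadContent` is a hypothetical stand-in for pulling everything out of the database: the refresh runs on its own timer, so request handlers never touch the database at all.

```javascript
var content = null;

// Pull everything at startup, then refresh on an interval,
// completely independent of incoming requests.
function startRefreshing(loadContent, intervalMs) {
  content = loadContent();
  setInterval(function () {
    content = loadContent();
  }, intervalMs).unref();
}

// Request handlers just read whatever is currently in memory.
function handler(request, response) {
  response.end(content);
}
```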

The fewer moving parts that need to cooperate to serve your site, the better - most things don't warrant a deep stack of technology to serve HTML and run CRUD operations.

And what if your whole site is dynamic? Then Varnish doesn't do anything for you.

Very few sites are so dynamic that nothing can be cached for at least a second. Even a heavily dynamic site like HN, with people commenting all the time, could at least cache pages for logged-out users.
