Fixing the callback spaghetti in node.js (github.com)
222 points by koush on Sept 29, 2011 | 101 comments

Very cool, but it's too bad you have to deal with all that to begin with.

I've been using LuaJIT embedded in Nginx (LuaNginxModule). Lua supports coroutines, so a function can just yield. Here's a brief example:

    -- Query a database using an http backend.
    -- Yields and handles other requests until the reply is complete.
    local record = util.getUserRecord(userId)

    -- Send some text to the client. Yields control while
    -- the actual transfer is in progress.
    ngx.print( "Result=" )
    -- Send the result, encoded as JSON, to the client
    -- Again, this call doesn't block the server.
    ngx.print( cjson.encode(record) )
With code like the above I can easily handle thousands of concurrent connections per second on the lowest-end Linode VPS available, with barely any load on the box -- and I'm told it should be able to handle 40k+ connections per second if I were to do any tuning. Oh, and I have only 512MB of RAM, which it doesn't even come close to using under load. And the longest request took less than 500ms at high load.

I've been using OpenResty [1] which has the Lua module and a bunch of others all configured together. Works great, and I can't complain about the performance.

Someday I'm sure I'll hand the maintenance of this off, and then I might regret not using one of the "popular" frameworks. But the code is SO straightforward using this stack -- and what I'm using it for is so simple -- I think not.

[1] http://openresty.org/

IIRC, the original plan for Node involved coroutines (maybe even Lua?). But, they were scrapped because they introduced a lot of the same problems as full threading. You never know if the state of the world is going to change in ways unrelated to the function you are about to call because some deep-nested subroutine might yield.

You never know if the state of the world is going to change

It's the same situation when using callbacks.

The difference is that, for the duration of the callback, you know the state will only change in ways directly related to the functions you are calling. Between callbacks, anything could happen. But, at least you have an island of sanity within the callback. With coroutines however, any function you call might block for some obscure reason (logging some debug info to a file, for example). That means you can only be sure that the world won't move behind your back as long as you don't call any functions :)

I don't have a lot of experience in this area. I'm just reporting back what I remember hearing.

The difference is that, for the duration of the callback, you know the state will only change in ways directly related to the functions you are calling.

That is true for a single threaded reactor, but in practice I haven't found it to be an especially useful property because naturally that doesn't include external state (i.e. the database).

Getting atomicity right in evented-code has in fact often been a hairier issue for me than doing the same with co-routines or threads, because when you're not allowed to block ever then you quickly find yourself in a situation where you need a retry-mechanism.

Such codebases then tend to quickly converge towards the actor-pattern (tied together by queues) which, ironically, could be had much easier by starting out with co-routines in first place.

Isn't that sort of the point? If you use node, you never know if the state of the world is going to change in ways unrelated to the callback you are about to pass, because you expect the world to change between now and the invocation of the callback.

The real reason is that V8 does not support coroutines.

I think corysama is referring to Ryan Dahl experimenting with Lua (as well as C and Haskell) before settling on JS/V8.

I had several failed private projects doing the same on C, Lua, and Haskell. Haskell is pretty ideal but I’m not smart enough to hack the GHC. Lua is less ideal but a lovely language - I did not like that it already had a lot of libraries written with blocking code. No matter what I did, someone was going to load an existing blocking Lua library in and ruin it. C has similar problems as Lua and is not as widely accessible. There is room for a Node.js-like libc at some point – I would like to work on that some day.

V8 came out around the same time I was working on these things and I had a rather sudden epiphany that JavaScript was actually the perfect language for what I wanted: single threaded, without preconceived notions of “server-side” I/O, without existing libraries.


The threading model in Haskell (GHC) would be ideal, as it's essentially abstracted away from the developer. There's certainly the potential for a performance advantage there, as well as easier-to-maintain code. The advantage Node.js has is that the vast majority of web developers already know JavaScript. It doesn't need to be the "perfect" server-side implementation, it just needs to be good enough. Its success is due to its ease of accessibility, something he would have missed out on had he gone the Haskell path.

It's not just potential - all the major Haskell web frameworks can handle more concurrent requests than node.js and scale to multi-core. http://www.yesodweb.com/blog/2011/03/preliminary-warp-cross-...

I don't think node.js is good enough because you have to deal with the issue being discussed here. In Haskell, as you said, you just write normal code with no worries about mutable state messing with your thread.

Mozilla's JS engine does, and there is at least some work being done to hook Node up to it.


Ahh, but here's the deal: The way the Nginx Lua module is written, you store everything in local variables. Those stay consistent across yields, and are tied to a single invocation of the Lua function.

So except for things you should EXPECT to change (like the state of a database you're querying) between calls, the "world" you (as a programmer) should care about stays perfectly consistent.

Unless I'm not understanding what you're asking -- is there some situation that I haven't encountered where the state of something that isn't a local variable matters?

The problem wasn't intrinsic to coroutines so much as the way they made the event loop reentrant, which is a horribly complex thing to track. In most cases like that you'd either suspend the loop or have coroutine-local event loops.

Of course people like to point at the overhead of native threads and assume coroutines have similar overhead, which is total bunk. Ironically, event sources in node use a stack-like tracking element which brings back a similar sort of overhead you see in coroutines in Lua, for example.

I doubt we'll see node take another shot at coroutines, but that's okay. Node will do fine without coroutines, but it will come at the cost of making certain types of code a little less natural (same as evented code in threads becomes unnatural).

Can you give an example of this?

This kind of stuff always makes me doubt why particular stacks are the most popular.

Wondering, is the OpenResty solution otherwise single-threaded-with-an-eventloop much like node?

Well, you can configure how many processes nginx runs with at startup, but otherwise yes it is an event loop based system. Lua is just a module like anything else in Nginx, and uses the Nginx event loop.

Ah right, I see.

From the nginx wiki:

> Unlike Apache's mod_lua and Lighttpd's mod_magnet, Lua code written atop this module can be 100% non-blocking on network traffic as long as you use the ngx.location.capture or ngx.location.capture_multi interfaces to let the nginx core do all your requests to mysql, postgresql, memcached, redis, upstream http web services, and etc etc etc (see HttpDrizzleModule, ngx_postgres, HttpMemcModule, HttpRedis2Module and HttpProxyModule modules for details).

Is that talk-via-nginx-commands thing cumbersome in practice?

No, it's pretty easy. An example:

    local doc = ngx.location.capture(  "/couchdb/hamster/"..uid );
This queries the local CouchDB for a particular user record, and stores it in a local variable.

I have been writing Node since late 2009 and Javascript with Rhino before that and Javascript with Rails before that. Before Node, I was used to linear program execution and was hesitant to switch to writing asynchronous code.

My mental model of code was one-dimensional: do this, then do that, then do that. So I was used to exception handling for errors, and my programs were like trains on a track. This was a comfortable abstraction, but the way in which my programs were written did not reflect what they were doing in the real-world: reading from disk, reading from the network, waiting for something, doing something, responding to something, receiving something in chunks.

Now, looking back, I prefer writing code that reflects what my programs are doing. There is more headspace, another dimension, no longer one thing after another, but a stack of things hovering and happening at the same time, interacting with each other, moving forwards through time.

Now, instead of trying to make concurrent work appear non-concurrent, I prefer to embrace concurrency and see how I can write for concurrency. This is almost certainly different to the way in which I would have written synchronous code. Code for me is less complex now and shorter now than when it was synchronous. It feels richer and more descriptive of the work being performed. It's also faster and more reliable. In essence, I have learned to write better concurrent code.

FWIW, "Javascript with Rhino" supported co-routines, and frameworks like Apache Cocoon took advantage of this to play tricks similar to Smalltalk's Seaside. (That said, you might have meant something different, as Node and Rails are both web tiers, whereas Rhino is an engine like V8.)

Callback spaghetti is a sign that you're doing something wrong.

The first example from this page shows a request handler initialising a database connection and then executing a query. That's terrible separation!

Callback "spaghetti" actually does a great job of highlighting when you're not abstracting enough: any more than about 5 indents and you should be seriously considering refactoring your approach.


    app.get('/price', function(req, res) {
      db.openConnection('host', 12345, function(err, conn) {
        conn.query('select * from products where id=?', [req.param('product')], function(err, results) {
          // ...
        });
      });
    });
becomes something like

    app.get('/price', function(req, res) {
      products.fetchOne(function(err, product) {
        res.render('product', { product: product });
      });
    });
Also, as other posts on this page mentioned, try{}catch{} is not how errors are handled in Node.JS, plenty of async operations will gain a fresh stack, and cannot be caught in this way.
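To see why, here's a small self-contained sketch (asyncOperation is a hypothetical stand-in for any node-style API, not a real module): the callback runs on a fresh stack, after the try block has already exited, so failures must travel through the err argument instead.

```javascript
var handledViaCallback = false;

function asyncOperation(callback) {
  process.nextTick(function () {
    // Simulate a failure. Throwing here would be uncatchable by the
    // caller's try/catch, so it is passed as the first argument.
    callback(new Error('boom'), null);
  });
}

try {
  asyncOperation(function (err, result) {
    if (err) {
      handledViaCallback = true; // the only place this error can be seen
      return;
    }
    // normal processing
  });
} catch (e) {
  // Never reached for errors raised inside the callback: this stack
  // frame is gone by the time the callback fires.
}
```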

  > Callback spaghetti is a sign that you're doing something wrong.
I have some Node.js code that does direct uploading via Amazon's S3 multipart uploading API -

  * multipart form processing, callbacks
  * each part requires multiple S3 API calls, callbacks 
  * parse XML results from the API, callbacks
Granted not all workflows are this complex. But many are - and they will result in callback hell. But saying that people are doing something "wrong" is at odds with the reality that complex workflows are a fact of life.

You misunderstand me, I think. Callback spaghetti is different from complex code that uses lots of callbacks.

I'm not saying you don't need to have a series of nested callbacks to do things, I'm saying you should hide these behind the appropriate abstractions for the task you're writing.

In your case the bullet points listed are exactly those layers of abstraction.

The request handler processes the form and calls the S3 layer for each file. The S3 layer then calls the APIs and passes off the responses to the XML parsing layer - which gives it back a useful JS object detailing the response, the API layer then provides a response to the request handler which formats the response and sends it to the client.

The workflow I'm implementing in Node is even more complicated than this, but good use of abstractions and control flow libraries[1] means that it's extremely rare for any code to be indented more than 5 blocks.

[1] https://github.com/caolan/async is my lib of choice, but plenty of other suitable choices exist.
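The flattening that such control flow libraries buy you can be sketched with a tiny hand-rolled waterfall. This is a simplified illustration in the spirit of caolan/async, not the library's actual implementation: each task receives the previous task's results plus a callback, and the chain stops at the first error.

```javascript
// Minimal waterfall sketch. Note: consumes (mutates) the tasks array.
function waterfall(tasks, done) {
  function next(err) {
    var args = Array.prototype.slice.call(arguments, 1);
    if (err || tasks.length === 0) {
      return done.apply(null, [err].concat(args));
    }
    var task = tasks.shift();
    task.apply(null, args.concat([next]));
  }
  next(null);
}

// Usage: three "async" steps, flat instead of nested.
waterfall([
  function (cb) { setTimeout(function () { cb(null, 2); }, 0); },
  function (n, cb) { cb(null, n * 3); },
  function (n, cb) { cb(null, n + 1); }
], function (err, result) {
  console.log(result); // 7
});
```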

becomes something like

You comfortably omitted the error handling (admittedly the original snippet didn't have that either).

However in production code it's not optional, and that's where node.js code tends to get really nasty.

On this point I don't disagree, for this case you'd expect the data access layer to deal with all the database errors and return something useful to the request layer:

    app.get('/price', function(req, res) {
      products.fetchOne(function(err, product) {
        if (err) {
          res.render('error', { error: err });
        } else {
          res.render('product', { product: product });
        }
      });
    });

you'd expect the data access layer to deal with all the database errors and return something useful

There you nailed the problem. It's a constant headache, especially since library authors have different ideas about the format and semantics of "something useful".

Before you know it you have re-invented your own half-baked exception-framework, to normalize/wrap/re-throw those ErrBacks. And then a week later you notice that rollback/retry and event cascades need a whole new level of treatment.

There's a reason why mature event-frameworks such as Twisted have never quite taken over the mainstream. It's sad to see node burn all its powder on a niche programming-style that just doesn't fly for the majority of applications.

I really like this idea, but it doesn't seem to do error handling correctly.

This is not going to catch most errors occurring in an asynchronous APIs:

    try {
      asyncOperation(function(err, result) {
         // ...
      });
    } catch (e) {
      // ...
    }
Most errors will occur asynchronously, thus the convention of "err" being the first argument to asynchronous API callbacks in Node.

Unless this supports that convention, your code should actually look like:

    async function magic() {
      try {
        // code here
        await err, bar = doSomething();
        if (err) {
          throw err;
        }
        // more code here
        await err, boo = doAnotherThing();
        if (err) {
          throw err;
        }
        // do even more stuff here
      } catch (e) {
        // handle the error
      }
    }
...or something similar. It's better than the alternative, but not great.

This certainly could support the "err" first convention, but APIs that don't use that convention wouldn't work correctly.

Not necessarily. A callback function with the standard signature under the hood is implied. If the author is controlling flow into and out of these execution contexts there is no reason an error passed to the implicit callback cannot cause an exception to be thrown into the original execution. Throwing exceptions into coroutines is a fairly normal pattern.
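With ES6 generators (which postdate this thread), the pattern can be sketched in JavaScript itself: a small driver resumes a suspended function either with a value (gen.next) or with an exception (gen.throw), so an err-first callback result surfaces as an ordinary try/catch. This is an illustrative sketch, not any particular library's implementation.

```javascript
// Run a generator-based coroutine whose yielded values are assumed to
// be node-style thunks: fn(callback) with an err-first callback.
function run(genFn) {
  var gen = genFn();
  function step(err, value) {
    var state;
    try {
      // Resume the coroutine: with an exception if the callback
      // reported an error, with a value otherwise.
      state = err ? gen.throw(err) : gen.next(value);
    } catch (e) {
      return console.log('uncaught in coroutine:', e.message);
    }
    if (state.done) return;
    state.value(step);
  }
  step(null);
}

run(function* () {
  try {
    yield function (cb) { cb(new Error('boom')); };
  } catch (e) {
    console.log('caught inside coroutine:', e.message);
  }
});
```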

True, but the existing Node APIs don't do that, they return the error as an argument to the callback.

The wiki page says:

    // these two lines are equivalent
    await a, b, c, d = moo(1, 2, 3);
    moo(1, 2, 3, function(a, b, c, d) {} );
So I don't see why not...

    // ...these two lines would be also equivalent
    await error, foo = bar(1, 2, 3);
    bar(1, 2, 3, function(error, foo) {} );

Exactly. The error object is a return value of the "await"-ed function, not an exception that's thrown.

Here's a concrete example. Normally in Node you do something like this:

  fs.readFile('/etc/passwd', function (err, data) {
    if (err) {
      // handle error
    }
    // normal processing
  });
The await version would look like this:

  await err, data = fs.readFile('/etc/passwd');
  if (err) {
    // handle error
  }
  // normal processing
Ideally it would look like this:

  try {
    await data = fs.readFile('/etc/passwd');
    // normal processing
  } catch (error) {
    // handle error
  }

This is certainly something that is doable and crossed my mind, but I did not want to have the await be restricted to functions that conform to the typical return arguments. For now at least.

How about

  try await foo = bar(baz)
as shorthand for

  await error, foo = bar(baz)
  if (error) throw error

I wrote a library (30 loc) to handle sequence and parallel flows.

Using only that, I rarely go over 80 character column limit that I impose on myself. There is absolutely zero callback spaghetti whether it's 2 or 25 functions deep in the chain.
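The commenter's 30-line library isn't shown, but a minimal sequence/parallel helper along those lines might look something like this (a sketch under that assumption, not the actual library):

```javascript
// Run node-style tasks one after another, stopping at the first error.
function serial(tasks, done) {
  var i = 0;
  (function next(err) {
    if (err || i === tasks.length) return done(err);
    tasks[i++](next);
  })(null);
}

// Start all tasks at once and collect results in order.
function parallel(tasks, done) {
  var results = [], pending = tasks.length, failed = false;
  if (pending === 0) return done(null, results);
  tasks.forEach(function (task, i) {
    task(function (err, value) {
      if (failed) return;
      if (err) { failed = true; return done(err); }
      results[i] = value;
      if (--pending === 0) done(null, results);
    });
  });
}
```

Chaining named functions through helpers like these is what keeps the nesting flat no matter how deep the sequence goes.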

Tbh, callback spaghetti only happens to newer async programmers in the same way that a newer programmer will write arrow code with if/else statements.

It's simply not a problem that needs to be addressed other than educating people who are new to node.js with some example tutorials that use an async helper library.

So instead of spaghetti code (a tangled mess) it now is macaroni code (small pieces of code everywhere, with unclear execution flow)? :-)

At least that's my experience with structuring asynchronous event-driven programs (without coroutines).

Yes :) Still waiting for a proper execSync ( https://github.com/joyent/node/issues/1167 ).

Without it small build/utility scripts turn into macaroni cheese unless you resort back to Python/Ruby/Bash - more languages, harder to maintain.

Pretty much, but I find it quite readable and maintainable. The execution flow is very clear if you look at the comments in my gist.

I think this is coroutines under the hood?

IMHO the fact that the accepted solution to this problem is "use an async helper library" is a bad sign. This is so fundamental it should be part of the language, or at least runtime.

Actually that's a good point. I'd like to change my stance to it would be nice to have this language feature, but there are current workarounds that are decent.

I agree with you; I don't see it as a big issue. Anyway: I'm excited that there are new/other paradigms out there. I'm not yet convinced that this is the ultimate (if there is such a thing) solution, but I will give it a try.

Got a link? I'd be interested to know how it deals with errors/exceptions, for example.

I've been using: https://github.com/caolan/async

It looks pretty decent in CoffeeScript:

  blah = (done) ->
    async.series [
      (next) -> foo next
      (next) => @bar.baz 1, 2, 3, next
      (next) ->
        x = y
        z w next
    ], done

Looks concise, but doesn't much help with the fundamental problems of callback code - exceptions don't work right, and "callbacks all the way down" whenever something starts having any async computation.

What do you mean by "callbacks all the way down" whenever something starts having any async computation?

Imagine function a calls b() and function b calls c(), and they're all sync.

Now, let's say c() changes and needs to call someAsyncFunction() and provide a callback. Which means c itself needs to take a callback. Which means b needs to provide a callback, so b needs to also take a callback, so a needs to provide a callback. And so forth.

Callbacks are infectious - once anybody in the call stack needs one, everybody needs one even if they just pass it on down the stack. Unless you don't need to do anything with return values, but that's fairly rare.
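A minimal sketch of the effect (someAsyncFunction stands in for any async API): once c() becomes async, b and a must each grow a callback parameter just to relay the result.

```javascript
function someAsyncFunction(cb) {
  setTimeout(function () { cb(null, 42); }, 0);
}

function c(cb) {          // was: function c() { return 42; }
  someAsyncFunction(cb);
}

function b(cb) {          // was: function b() { return c(); }
  c(cb);
}

function a(cb) {          // was: function a() { return b(); }
  b(cb);
}

a(function (err, value) {
  console.log(value); // 42
});
```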

This is why I'm planning on diving into Go when I eventually have a serious real-world need for a lot of concurrency. None of this kind of mucking around is necessary.

There was some discussion of adding this to CoffeeScript. The issues on GitHub seem stalled. Would really love to see it happen soon!

My understanding is that the CoffeeScript status is won't fix because the transformation violates some constraints about the complexity of the mapping that CoffeeScript values.

C#/.NET is planning async/await features for C# 5: http://blogs.msdn.com/b/csharpfaq/archive/2010/10/28/async.a...

and it's in the current release of F# (current being pre-Build 2.0, i.e. RTM for over a year, though I can't comment on whether you'd really like to have VS Ultimate to make it work)


The current release is included in VS Professional, Ultimate isn't required. It seems like the last recent installer for VS Shell is the CTP, so expect some difficulties if you want to go the free route on windows, although it's possible that SharpDevelop integrates with the latest source drop.

Anyway, it shouldn't be too difficult to build the compiler yourself and set up a basic emacs/vim + mono environment for linux.

I think every competent programmer who comes to Node for the first time thinks "hoo boy, better fix this callback stuff first", and immediately writes a module to do just that. I know I did! :)

koush just took this to the next level.

But you know what. Just stop fighting the callback model. Adapt your coding idioms and move on...

In JavaScript callback spaghetti would kill me, but in CoffeeScript it's a breeze and NOT something I want to "abstract away" at all, for various reasons. I suggest the solution to JavaScript's convoluted anonymous function syntax is CoffeeScript, or not inlining.

That "callback spaghetti" is called continuation-passing style.

"Programs can be automatically transformed from direct style to CPS." [1]

Do the math.

[1] http://en.wikipedia.org/wiki/Continuation-passing_style
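For the unfamiliar, the direct-style-to-CPS transformation can be sketched in a few lines: in CPS a function never returns a value; it passes the result to an explicit continuation argument instead.

```javascript
// Direct style: the result comes back via return.
function addDirect(x, y) {
  return x + y;
}

// CPS: the "return" becomes a call to the continuation k.
function addCPS(x, y, k) {
  k(x + y);
}

addCPS(1, 2, function (sum) {
  console.log(sum); // 3
});
```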

I don't fully understand why this makes callback hell better just because it is known under a nicer name. CPS has its use cases, but application code is not one of them.

Specifically, CPS was intended as a mechanical program transformation to remove the stack, not a style for programmers to actually write their programs in.

And in a lot of other languages writing in CPS incurs a performance penalty (lucky for me it seems to be < 1 order of magnitude in clojure); I assume this is the case in node.js too?

You don't have to use inline functions all the time, you can do it like this:

  function foo () {.... }
  function bar () {....foo(); }
  function barfoo () {....bar(); }


And remember that 'bind' can save your life

this.io.sockets.on('connection', this.onConnection.bind(this));
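A small sketch of why: a method passed as a bare callback loses its `this` binding, and `bind` pins it back. (The `counter` object here is just an illustration.)

```javascript
var counter = {
  count: 0,
  increment: function () { this.count++; }
};

function invoke(cb) { cb(); }

try {
  invoke(counter.increment);  // `this` is not `counter` here
} catch (e) {
  // In strict mode this is a TypeError; in sloppy mode it silently
  // increments a property on the global object instead.
}
console.log(counter.count);   // still 0

invoke(counter.increment.bind(counter));
console.log(counter.count);   // 1
```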

The proper fix is to introduce coroutines.

That's exactly how this was done. I'm going to implement "yield" as well when I have some time. They both just leverage coroutines.

node-fibers is one attempt to put coroutines into node. I like it because it uses libcoro, which is nice and tight and uses ucontext, and because it works as a module without forking the core of node. Once you've 'fiberized' the server modules to start each request in its own fiber and implemented a call-with-current-continuation on Function.prototype, we're pretty much at the same place you've arrived but without new keywords.

Ah I missed this. I once tried to implement coroutines in v8 but v8 makes use of getspecific and setspecific pthread methods which makes it more complicated.

How does this compare to http://chris6f.com/synchronous-nodejs ?

Not sure to be honest! I am still very new to the scene so I am unaware of this project. At first glance, we seem to have different approaches towards the same goal though.

That project requires fibers, in the first place, and doesn't seem nearly as elegant.

Addressing the same problem (and more) in both node and the browser, via compiling-JS-to-JS: http://onilabs.com/stratifiedjs

(disclaimer: I am the guy who worked on the now-defunct `defer` support in Coffeescript, and am now working with the onilabs folk on the stratifiedJS runtime)

This is great, but I'd rather not use a fork of Node, and Joyent isn't going to integrate this, unfortunately. I wish that Google would just add yield to V8, but they won't do that unless Apple adds it to whatever the hell their interpreter is called now (Squirrelfish?).

In the node.js project I just finished, I was also forced to write some "async" c# for a separate system that the node.js project communicated with. Now, I'll be the first to admit, I don't know c#... but, it was the first time I had used both node.js AND c#. I have to say, the c# approach was a horrible exercise in pain and agony. I started to run into significant nesting in node.js, then realized it was because my approach was flawed. I abstracted more, and I also started using async.js (https://github.com/caolan/async), and everything was well in the world.

1) use async.js (classic javascript) 2) learn asynchronous patterns 3) profit

It's a lot of syntactic sugar for one very specific type of monad. Why not just patch the language to allow easy interoperation with any monad?

For example, F#'s let! binding works with any monad not just async.

I'm not sure monads work so well in languages without type-checking. That's certainly my experience.

This is very elegant. Just a couple of questions -

Using await - is the program flow suspended until the corresponding function returns , or does await keyword act more like a 'pause and continue' mechanism.

It exits the function, and resumes it when the callback completes. So other requests can be processed while the async function is (a)waiting.

Perfect :).

I like how the buzz around server-side javascript is progressing slowly from "the next big thing!!!" to "okay, it has its shortcomings".

You will pry Python from my dead, cold hands :)

Seriously, it sounds like Python handles these sorts of cases much better, with "yield". I wonder why everyone suddenly discovered JS for asynchronous programming and rushed to it when Twisted has existed for years. Is it because of V8?

It's partly because of the performance of V8 (my SMTP server, Haraka, is about 8-10 times faster than the Perl version I worked on, Qpsmtpd, for example), and partly because of the "sync library" problem - with Perl (or Python) most libraries that access things on the network or on disk are written synchronously. That's not the case with Node - everything is assumed to be async.

I've written a LOT of async code in Perl and C, and ran across this problem countless times. It really is a blessing to not have to worry about it.

I don't think a lot of node's traction is coming from Python programmers (though when I last looked at Python I got the impression Twisted had a reputation for being spottily documented; has that improved in recent years?).

The attraction of using JavaScript server-side for many is the possibility of using the same language client- and server-side, with the relative ease of a native data encapsulation (JSON) for transferring state between the two. Even back in the VB6 days (I say "back in the days" ruefully: we are still supporting the product!) I used JS for a large chunk of validation code server-side; this meant that the same logic was used in both places, aside from some wrapper code to arrange for the code on both sides to be presented with the same data model. From the point of view of a relative beginner to server-side coding who is adept at JavaScript on the client side, this is particularly attractive (much like Python being available in browsers would be to seasoned server-side Python programmers).

The attraction of node in particular is a mixed bag:

* Momentum. People are using it so people are using it.

* The event driven nature, if properly handled, can make it very memory efficient.

* The speed of V8, which at the time was a step or two ahead of other javascript engines (since node first turned up there has been plenty of active development in this area, with a different story each month about one or the other JS engine beating everything else in some benchmark or other, so I'm not sure which has any sort of upper hand at the moment).

* It arrived at the right time, and development from "proof of concept" to a relatively mature product happened fast enough that people didn't become disillusioned soon after the initial excitement.

* Decent library support, partly due to the number of people creating the above-mentioned momentum. It is easy for an alternative server-side stack to suffer in this area and eventually die because of it. Though the rapid development may make this bite back a bit, as a lot of bindings from six months ago that have not been actively maintained might not work because of changes in node over that time.

* Other Javascript based server-side stacks have started to die off already, such as Jaxer (http://en.wikipedia.org/wiki/Jaxer#Aptana_Jaxer) for one example, partly due to the popularity of node.js though in many cases it was already happening (or they didn't gain much traction to start with).

I definitely think the iterator case needs work:


Will Conant wrote a great article discussing three different libraries (StratifiedJS, Streamline, node-fibers) to do with the problem: http://blog.willconant.com/post/7523275566/continuations-in-...

http://tamejs.org/ also has a good write-up.

Man, just use python+gevent and be done with it...

Changing JS is just not a great idea and I think Node.js has been wise to avoid it.

If you're going to innovate, then design a language that compiles down to JS that provides the innovation. CoffeeScript.

If you want to take that a step further, look at ClojureScript. Want delimited continuations? Fine. All w/o requiring you to fork Node.js or CoffeeScript.

It's great to see coroutines becoming more mainstream. I think every programming language could do well with some way to abstract over the program counter like this - ie. code that appears sequentially being executed interleaved with other code (deterministically, unlike threading).

Very nice. I've also been using the async node.js module, which is quite handy for some other patterns. Of course, having support directly in node would be the best. Can't wait for the yield keyword!

Did anyone check out Jscex? It basically has the same goal but is implemented as a library. https://github.com/JeffreyZhao/jscex

Or you can just do this:

   foo.step1 = function(){ do_some_thing(foo.step2) }
   foo.step2 = function(arg){ do_another_thing( ... ) }
   // enjoy

I don't accept that there's a "problem" that needs "fixing".

There's room for improvement. Most programmers shouldn't need to know how to write asynchronous code, just as they shouldn't need to know how to write concurrent code. Consider mutexes vs Java's 'synchronized' vs transactional memory: there's a way forward, and it's to reduce complexity for the average programmer.

Remember that not everyone is as awesomely brilliant as you are.

Reduce complexity: sure. But if we have requirements that can be well defined within an async FP paradigm, then maybe an async FP environment isn't such a 'problem'... and many of us "average programmers" already know lisp. If nodejs doesn't suit everyone, probably not everyone should use nodejs. But if we're going to have a go at it, let's not immediately dismiss node's natural style as "an absolute mess"

Handling concurrency should be an accepted part of the skill set of GUI development.

Writing a user interface that can respond to user input, whilst also being able to handle and respond to a long running data access or computational requests is a concurrent problem.

Having a single-threaded model with callbacks, like the JavaScript browser model, is one of the less complex ways to handle this.

I agree that a less complex model is appropriate for some developers/applications. But to call yourself a "skilled-professional GUI-developer", you need to get awesomely brilliant enough to handle this.

I've been wanting to implement something like this for a while! Great job. It looks elegant and basically pushes the continuations into the background.

koush, nice! I use your code every day when I reboot my phone, but I appreciate this useful extension to node almost as much. :)

It seems the async variables are just like Mozart's dataflow variables.

Perhaps it would be interesting to see how it was done in Twisted: http://twistedmatrix.com/documents/current/api/twisted.inter...

We'd like to think that the C# guys were looking our way when they came up with async/await, but there's no proof. :3

I love the way Twisted does it. Completely part of the way the language is supposed to work. It's a work of art IMO.

Isn't this the best way so far to do these things? It turns asynchronous code into synchronous. I seem to remember Guido recommending this approach too.

JavaScript itself already gives developers enough power to beautify messy async code. Below is a rewrite of the first example on that page, with async function composition:

    /* https://gist.github.com/1250314 */
    var db = require('somedatabaseprovider'),
        compose = require('functools').compose;

    compose.async(getApp, connect, select)({ url: '/price', host: 'host', pass: '123' }, function(error, shift) {
      // ...
    });
It's that easy to abstract those messy callbacks using some functional tools.

There's this great thing called npmjs.org that is a place for stuff like this, if I am not mistaken.
