Node v8.0.0 Released (nodejs.org)
531 points by petercooper on May 30, 2017 | hide | past | web | favorite | 177 comments

Short ver: async/await is now in an LTS release of node. Anything that returns a promise can now be awaited inline - i.e., no endless .then() chaining - provided you've started an async context:

  const writeFile = util.promisify(fs.writeFile),
    request = util.promisify('superagent'),
    stat = util.promisify(fs.stat),
    sorts = require('sorts'),
    log = console.log.bind(console)

  const getJeansAndSaveThem = async function(){
    var jeans = await request.get('https://example.com/api/v1/product/trousers').query({
      label: 'Levi Strauss',
      isInStock: true,
      maxResults: 500
    })
    jeans = jeans.sort(sorts.alphabetical)
    await writeFile('jeans.json', jeans)
    const status = await stat('jeans.json');
    log(`I just got some data from an API and saved the results to a file. The file was born at ${status.birthtime}`)
  }
Note: you should add error handling; I'm new to async/await, and this code is to demonstrate a concept on HN, not to run your life support system. ;-)

As noted, this was already possible in Node 7.x. But to use it, you needed a function that returned a promise, and many functions don't, having been written before promises were common - even libraries released in 2016. Even today many Node.js developers don't seem to use async/await, at least in my country, and so they live in braces {} hell.

The improvement in the 8.0.0 version is that such an "oldie" function can now be promisified in one call. A great addition.

You could always promisify functions manually, or with this tiny package https://github.com/sindresorhus/pify but yeah it's nice to see that in the standard library.

util.promisify is now in Node core as shown in the Original Comment

By "oldie" function, do you mean functions that expect node-style callbacks?

  e.g. function myCallback (err, result) {...}

Correct. Most node core APIs are written this way, and now core ships a utility function[0] to wrap those functions in a promise.

[0]: https://medium.com/@styfle/promises-in-node-js-8-x-core-d6a8...

It seems to me that Node should ship promisify-ed versions of all these functions (e.g. fs.readFileAsync). Over time they could replace the implementation with ones that use the OS Async APIs, though perhaps this step is unnecessary if you're already using an event loop underneath.

Until then you can use `mz` for a modernised API[0].

[0] https://github.com/normalize/mz

I mean, they've been debating about how to add a promisified API for years (see for example https://github.com/nodejs/node/pull/5020) but nothing has come of it. util.promisify is a welcome addition, but very late for a first step.

How incredibly uncanny. Just yesterday morning, I posted this question!


FWIW v8.0.0 is not an LTS release. Technically the first v8.x LTS release will happen in October. See this repo for more details: https://github.com/nodejs/lts

Async/await support (without the --harmony flag) was made default in version 7.6 when they updated V8 to version 5.5.

The biggest additions here with regard to async/await are the new promisify util and the async_hooks.

Edit: Changed wording and included more detail.

Of course, but most people don't run non-LTS node releases :-)

Edit: originally wrote 'unstable'. Meant 'non-LTS'

The 7.6 release was no more unstable than the 8.0 release. Unless you're referring to the fact that 8.0 is going to be an LTS release. But if that's the case, it's not happening for a while yet; 6.10 is still the current LTS version.

You're right, corrected.

Node 7.6 is a stable release. Are you talking about LTS?

And some performance improvements. And the fact that there are more to come when they upgrade to V8 5.9 soonish.

>request = util.promisify('superagent')

I don't think that's required. Superagent already returns promises.

Ah you're right!


Alas too late to edit comment.

Sure, but having `util.promisify` in core modules means you no longer need an external dependency here.

Hm? It's not an external dependency. Superagent returns a promise.

Replying to myself: since this comment is so popular, here's a fully working version for node 8:

  const util = require('util'),
    fs = require('fs'),
    writeFile = util.promisify(fs.writeFile),
    request = require('superagent'),
    stat = util.promisify(fs.stat),
    sorts = require('sorts'),
    log = console.log;

  const getPhotosAndSaveThem = async function(){
    try {
      const response = await request.get('https://jsonplaceholder.typicode.com/photos');
      const photos = response.body.sort(sorts.byKey('title'))
      await writeFile('photos.json', JSON.stringify(photos, null, 2))
      const status = await stat('photos.json');
      log(`I got some data from an API and saved the responses at ${status.birthtime}`)
    } catch (e) {
      log(`Oh no, something went wrong!`, e)
    }
  }


Is there a roadmap for supplying async/await interfaces for the entire stdlib? There's still some event/callback based stuff in there that doesn't return promises or conform to the callback convention promisify requires.

Yeah, most of the time I need to handle errors with a try/catch block.

rant: why care about debuggability and testability

There should also be something like

to replace

  await Promise.all 

There are many ways to compose promises, no point in elevating some of them to keyword status. If you need to do anything interesting with promises, just use a promises library. The point of async/await is to free you from having to use callbacks, not to eliminate the need for libraries.

Every async function will return a Promise anyway. I am simply interested in resolving a collection of Promises in a clean way.

I don't see what's unclean about Promise.all or how creating a 1:1 alias for it makes it cleaner.

Also, many times you want to limit the concurrency of your promise execution which isn't something you can do with an array of promises. You'd be back to using `await` + something like https://www.npmjs.com/package/promise.map.
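A limiter is also small enough to hand-roll if you'd rather avoid the dependency; here's a sketch (`mapLimit` is a made-up name, not a core API - the linked package does this and more for you):

```javascript
// Concurrency-limited mapping: at most `limit` fn() calls in flight at once.
async function mapLimit(items, limit, fn) {
  const results = new Array(items.length);
  let next = 0;

  // Each worker synchronously claims the next index, so no two workers
  // ever process the same item.
  async function worker() {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i], i);
    }
  }

  const workers = Array.from({ length: Math.min(limit, items.length) }, worker);
  await Promise.all(workers);
  return results;
}
```

Usage would look like `await mapLimit(urls, 3, fetchUrl)`, keeping at most 3 requests in flight.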

I'm someone that used to use `co` where you could go:

    const [a, b] = yield [promiseA(), promiseB()]
But I prefer the simplicity and consistency of having to use something like Promise.all or require('promise.map').

It's the ambiguity for new programmers. With something like awaitAll, new programmers wouldn't need to learn about Promises at all.

You have a point with concurrency, but handling concurrency is another beast altogether.

I have a hard time understanding how adding alias indirection disambiguates anything much less spares one from learning something.

How does one get to the point where they want to await multiple promises, yet they need to be insulated from the very presence of the `Promise` object?

Well at least in my mental model a Promise is an object that eventually resolves to something within its .then() method. In contrast my mental model of await is that of a blocking call. One can internalize the latter without the former.

Given Promise.all resolves to an ordered array surely this would work:

    const [foo, bar, baz] = await returnerOfPromiseCollection();
That would assume there was either a) a catch in that collect or b) some error checking, but still...

It "works" but foo won't contain the resolved value, just the promise.

The code is awaiting an array, which is already "resolved", so it's returned right away. Its contents are not relevant to `await`.

You are correct! My apologies, I misremembered how that works >_<

Yes, once you have a promise you have to deal with it as such through the whole chain.

Promise.all is not even that good. If one Promise rejects, the results of the rest are lost; it does not wait for all of them to settle.

I prefer something like p-settle most of the time https://github.com/sindresorhus/p-settle
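A minimal settle helper is easy to sketch yourself, similar in spirit to p-settle (the field names here are our own choice, not p-settle's API):

```javascript
// Wait for every promise and keep both outcomes instead of
// short-circuiting on the first rejection.
const settleAll = promises =>
  Promise.all(promises.map(p =>
    Promise.resolve(p).then(
      value => ({ isFulfilled: true, value }),
      reason => ({ isFulfilled: false, reason })
    )
  ));
```

Since every mapped promise resolves (to a result object), the outer Promise.all never rejects.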

Pretty sure there used to be a syntax in the spec for handling that. Something like `await*` I believe. It got removed pretty early on in the process though.

Any interesting reasons as to why?

I wasn't really involved in those discussions, but I suspect it's for similar reasons to what others in the comments here are arguing; they wanted to keep the syntax minimal and decided that adding a special syntactic sugar for something that could already be accomplished by `Promise.all(...)` was unnecessary.

You can make it yourself fairly easily I guess:

    const awaitAll = Promise.all.bind(Promise);

    async function foo() {
        var [val1, val2] = await awaitAll([func1(), func2()]);
    }
It isn't ideal but it's a little cleaner

+1. I'm a big fan of top level function binds to remove boilerplate and keep code readable:

  const log = console.log.bind(console), 
    query = document.querySelector.bind(document),
    queryAll = document.querySelectorAll.bind(document);

You don't need to bind `console.log` by the way (at least not in any recent version of Node.js or browsers).

  await all([fn1, fn2]) ;)
I wonder when we will get s-expressions for function calls

  (awaitAll [fn1, fn2])

Just write (iced) coffeescript, s-exp are totally valid there:


Yeah, s-exps are the only thing I miss :(

You can use .map to call await on every element in the array, or define a new function so you don't have to write that all the time:

    const awaitAll = (futures) => futures.map(f => await f);
edit: this is wrong but I can't delete it. Haven't used async as much as promises.

Well, an async function always returns a Promise, and await simply suspends the function until the Promise is resolved before continuing execution. That's why I am wondering why they decided to await only a single Promise.

Don't you need to be in an async context to call await?

Async await being part of core, not behind a flag, is probably my favorite new addition.

1) This was possible in node 7

2) A request can fail and can timeout.

3) Writing to a file can fail.

And you get the idea. Don't omit error handling. Error handling is business logic too.

Being relaxed around error handling is the recipe for cascading server failure.

I'm just getting started with async/await (like most people I stick to LTS releases) so please forgive the lack of error handling in the demo code!

Note that node 8.0.0 is not LTS. Node 8.x will be an LTS release around October.

As someone who spends all day volunteering on Stack Overflow, I wish people providing examples were a little better about emphasizing error handling.

You should see the destructive effects of bad examples there.

You're welcome to add it.

In their snippet, errors would be handled altogether on the promise you get from `getJeansAndSaveThem()`.

You don't have enough information to be outraged at the toy example.

And that's a very implicit way of handling the error. Implicit behavior is good when you are encapsulating stuff (e.g: I don't need to know about the details of internal combustion engines, just pushing on the accelerator of my car so it moves forward).

But in this case encapsulation is a leaky abstraction that just makes the diagnosing of an error more cumbersome.

Oh please. The point of the post was to illustrate the feature to those wondering "what's async/await?", error handling would be complete noise for that purpose.

I would argue with you, but I don't really have to prove you wrong.

You will eventually prove yourself wrong, since deemphasizing error handling goes wrong very quickly.

One of the nice things is that you can just let async functions fail and they won’t throw asynchronously and kill the process. So no, error handling isn’t important to include in a small example.

Previously that characteristic was also a massive pain point for debugging Node apps though - neglecting error handling would result in silently swallowed errors and leave people (particularly newbies) scratching their heads.

Then they added unhandled rejection warnings by default, and all was ok again - but I see why someone might insist that all examples have error handling.

That’s not something that should be handled inside an async function (and therefore not in this example). If you don’t want to continue a promise chain, terminate it:

  .catch(error => {
    process.nextTick(() => { throw error; });
  });

You want to use setImmediate rather than process.nextTick.

Mmm… no, I didn’t want to. Why do you say that?

Obviously the example code is inside another function - `await` can only be used inside a function marked with `async`, which always returns a promise.

That means error handling may not be necessary inside that function. Errors are caught automatically and returned to the caller as a rejected promise! You need error handling at the top level, but not necessarily inside each function.
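A tiny sketch of that behavior:

```javascript
// Errors thrown anywhere inside an async function surface as a rejected
// promise, so one handler at the call site covers the whole body.
async function doWork() {
  JSON.parse('not json'); // throws synchronously, still becomes a rejection
}

doWork().catch(err => {
  console.error('handled at the top level:', err.message);
});
```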

"Note that, when referring to Node.js release versions, we have dropped the "v" in Node.js 8. Previous versions were commonly referred to as v0.10, v0.12, v4, v6, etc. In order to avoid confusion with V8, the underlying JavaScript engine, we've dropped the "v" and call it Node.js 8."

I was wondering how this would be handled. I guess old habits die hard since this article title includes the "v".

Odd then that nvm still calls it v8.0.0

That's a dumb reason to drop the "v". smh

For anyone not in the JS/Node worlds, this is a significant release that people are particularly excited about. It was also delayed somewhat due to wanting to align with V8 which should, however, be totally worth it :-)

Other relevant posts digging into new features include http://codingsans.com/blog/node-8 and https://blog.risingstack.com/important-features-fixes-node-j...

For me, this one brought happiness:

Node.js 8.0.0 includes a new util.promisify() API that allows standard Node.js callback style APIs to be wrapped in a function that returns a Promise. An example use of util.promisify() is shown below.

This is great stuff. This enables writing code using async and await at all times, which is what any sane developer would do when writing code for Node.js.

PSA: `setTimeout` can be `promisify`'d directly despite featuring flipped arguments, thanks to a built-in `util.promisify.custom` definition.

Promisify is a super simple function to write yourself though.

  const promisify = (fn, ctx) => (...args) =>
    new Promise((resolve, reject) =>
      fn.apply(ctx, [ ...args,
        (err, data) => err ? reject(err) : resolve(data)
      ]));
Almost. Don't forget about the case where callback receives multiple arguments, and the fact that someone might decide to change the API for callback signature like `(err, result)` to `(err, result, stats)`, for example.

Promises only support a single success value, so if you're "promisifying" something then you're only going to get the second argument as the resolved value. The new util.promisify() doesn't provide multi-argument functionality [1] and will only resolve the second argument [2], unless you define a custom function on the original function.

[1]: https://nodejs.org/api/util.html#util_util_promisify_origina...

[2]: https://github.com/nodejs/node/blob/ef16319effe396622666268c...

A bunch of functions in core return multiple things to callback. A lot more in userland do the same. Spec or not, that is still something that needs to be accounted for.

Those are handled by `util.promisify()` by having the original function define special properties on itself `Symbol('util.promisify.custom')` and `Symbol('customPromisifyArgs')`. But it's not something handled by default (e.g. resolving an array if the callback gets more than 2 arguments passed to it).

But why? If you're going to take special steps to support `util.promisify`, you might as well just offer a promise-based API and skip `util.promisify` entirely.

The problem is that everyone does that slightly differently.

Having a stdlib function makes for an easy "let's just use this everywhere" answer.

What is nice is that it eats into bluebird. I love bluebird, and there are some nice utility functions. But if you only used it to promisify or create promises, there may be no need to keep it.

Does Node support filtered errors like Bluebird?

   somePromise.then(function() {
       return a.b.c.d();
   }).catch(TypeError, ReferenceError, function(e) {
       //Will end up here on programmer error
   }).catch(NetworkError, TimeoutError, function(e) {
       //Will end up here on expected everyday network errors
   }).catch(function(e) {
       //Catch any unexpected errors
   });
It's super useful.

Looks really clean. Does bluebird handle this (https://hackage.haskell.org/package/base-) problem?

    Sometimes you want to catch two different sorts of exception. You could do something like

    f = expr `catch` \ (ex :: ArithException) -> handleArith ex
             `catch` \ (ex :: IOException)    -> handleIO    ex

    However, there are a couple of problems with this approach. The first is that having two exception handlers is
    inefficient. However, the more serious issue is that the second exception handler will catch exceptions in the
    first, e.g.  in the example above, if handleArith throws an IOException then the second exception handler will 
    catch it. 
    Instead, we provide a function catches, which would be used thus:

    f = expr `catches` [Handler (\ex :: ArithException -> handleArith ex),
                        Handler (\ex :: IOException    -> handleIO    ex)]


  .catch(ArithException, IOException, handleError)
Edit: Nevermind, I think what you want to be able to do is provide two different error handlers, but essentially catch them at the same time so that if you throw inside one of them, the second one wouldn't catch it.

Not related to the aforementioned Bluebird feature, but I think that's the very reason the promise spec allows you to specify an error callback as the second argument to .then. I guess you can always fallback to an if/switch statement if it's a concern (which is what you'd do with await and try/catch).

The standard way to handle errors coming from 'await' is try/catch, and any errors can be handled in the catch block as if they were coming from a synchronous context. So, you'd now filter async errors the same way you filter synchronous ones.
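A sketch of that pattern (`fetchUser` is a hypothetical stand-in for a real async call, not a library API):

```javascript
// Hypothetical async operation standing in for a real network call.
async function fetchUser(id) {
  if (typeof id !== 'number') throw new TypeError('id must be a number');
  return { id, name: 'Ada' };
}

// Filtering errors by type with plain try/catch plus instanceof checks,
// much like Bluebird's filtered .catch.
async function loadUser(id) {
  try {
    return await fetchUser(id);
  } catch (e) {
    if (e instanceof TypeError || e instanceof RangeError) {
      throw e; // programmer error: rethrow, don't swallow bugs
    }
    return null; // treat anything else as an expected failure
  }
}
```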

I have not tried, but I would guess no. Why not give it a try and see?

I constantly use `.map` and `.reduce` out of bluebird. I'm not sure I will replace these soon, since they're outside the Promises/A+ spec.

As a matter of fact, does anyone have a benchmark of the new nodejs 8's promise implementation against bluebird? So far, bluebird was faster than the native implementation for me.

If I remember correctly there is a 4.5x speed up in the bundled V8 (chrome engine) implementation, making it on-par speed-wise with bluebird, however that is hardly your bottleneck anyway.

It's not entirely clear how to interpret that; can you provide a summary?

The short story is that native promises are faster now except for the "promisification" part.

The benchmark was designed for realistic use in a node environment, where most of the libraries come callback based. Because of that a very fast "promisify" is really important. Native promises don't provide one so the naive implementation using standards-compatible API is quite slow.

Bluebird's promisify is a lot faster since it relies on non-standard (as in non-ES6-standard) internals instead of using the promise constructor as an ES6-based promisifier would need to do.

edit: on second thought, I haven't looked at the included `util.promisify` - it could be taking advantage of non-public internal V8 promise APIs.

Yes! I wrote about promisify a few weeks ago[0]

[0]: https://medium.com/@styfle/promises-in-node-js-8-x-core-d6a8...

I still don't get the need for Promises. Almost all examples, just like the one in the provided link, talk about solving callback hell with Promises, while callback hell is just a bad way of writing software imo. Look at the code below and please tell me why your example with Promises is a better solution.

  function logAppendResult( err ) {
    if (err) console.err('Failed to update file');
    else console.log('Successfully updated file');
  }

  function logWriteResult( err ) {
    if (err) console.err('Failed to create file');
    else console.log('Successfully created file');
  }

  function handleFile( filename, fileExists ) {
    const timestamp = new Date().toISOString();
    ( fileExists )
      ? fs.appendFile( filename, `Updated file on ${timestamp}\n`, logAppendResult )
      : fs.writeFile( filename, `Created file on ${timestamp}\n`, logWriteResult );
  }

  function main() {
    const filename = './example.txt';
    exists( filename, (fileExists) => handleFile(filename, fileExists) );
  }
Takes a look at the code... logAppendResult almost same as logWriteResult...

Well, who are we to deprive you from the joy of writing lots of boilerplate code?

Have you ever tried to run more than one operation at once and collect the results?

(There are lots of other reasons to use the promise abstraction – having a type that can be transformed is extremely useful and natural – but that one’s pretty significant.)

    if (err) console.err('Failed to create file');
Congratulations, you're writing Go in JavaScript! :)

nah, this is valid javascript, Go came later, they might have copied a thing or 2 from C, like javascript did.

When callbacks are used with discipline, they are not much different from promises. The problem is when "discipline" part meets "human" part, though it's still true for promises, perhaps to a lesser degree.

The real value for promises is async/await.

Is this a joke?

You've repeated the if(err) check in three places.

None of your error handling bubbles up so the handlers (i.e. Logging to console.error) are buried in the individual functions.

You've pretended to avoid nested callbacks by inlining them using arrow functions.

For your sake I hope that was sarcastic.

No, it's not a joke. A junior dev can read this code and it will be very hard to create bugs.

Inlining with arrows is just a more functional approach, as I would write in Coffee:

  exists filename, (fileExists) -> handleFile filename, fileExists
Anything wrong with that line of code?

I just try very hard to keep my code as simple as possible.

I hope you know it's trivial to write a log function that accepts different callees so you end up with only 1 'if (err)'.

I didn't test, but I wouldn't be surprised if my example runs several times faster as well, btw.

Yes it doesn't handle the error case.

And if you replace the if(err) lines with a log(...) function it doesn't reduce them to one place. It makes you repeat the log(...) function everywhere. And you'd still need the if statement to handle the control flow.

Simple code is great, but not handling errors doesn't cut it for non throwaway applications.

You assume a lot.

We can refactor another round:

  function logFsResult( type, err ){
    var msg= '';
    switch ( type ) {
      case 'append': msg= ( err ) ? 'Failed to update file' : 'Successfully updated file'; break;
      case 'write': msg= ( err ) ? 'Failed to write file' : 'Successfully created file'; break;
      default: msg= 'logFsResult error, missing or invalid first argument: '+ (type || '')
    }
    ( err ) ? console.err( msg ) : console.log( msg );
  }

  function handleFile( filename, fileExists ) {
    const timestamp = new Date().toISOString();
    ( fileExists )
      ? fs.appendFile( filename, `Updated file on ${timestamp}\n`, logFsResult.bind(null, 'append') )
      : fs.writeFile( filename, `Created file on ${timestamp}\n`, logFsResult.bind(null, 'write') );
  }

Whenever you program in ECMAScript2016("Javascript"), you should take advantage of its features. Right now you're coding pretty much using the same way one would do it in C. Take advantage that functions are first-class in ES2016. Take advantage of JSON.

For example you could do something like this (sorry, don't have time to open my IDE to try the code i'm going to write):

    logFsResult = (type, err) => {
        map_action = {
          "append": {
            True: "Successfully updated file.",
            False: "Failed to update file."
          },
          "write": {
            True: "Successfully created file.",
            False: "Failed to write file."
          }
        }

        message = map_action[type][(err != null)] // obtain the message
        method = (err)? console.err : console.log // choose logger
        method(err) //invoke the correct logger with the message
    }

This is easier-to-maintain code. It separates the messages from the logic. It allows you to easily configure/change the messages to log, and the method (function) used for logging, without having to touch the decision logic. In your original example, the logic of the code was mingled with the error messages.

You could say this example was more "idiomatic" ES2016.

> Whenever you program in ECMAScript2016("Javascript"), you should take advantage of its features.

That's a trap, I should rather make my code as readable, scalable and bug free as possible regardless of ESxxx.

I can refactor in 10 other ways (different styles) coming to the same result, but that's not what my point was about. Using promises and so on is just taste or preference: if you like it you can use it, if not, do without. I've seen amazing spaghetti with promises and callbacks as well.

Easy to nitpick btw:

you compare err != null?? besides not using strict mode, what should err be? a String? So, what will happen if err is undefined or an empty String?

Then you call logFsResult with err while it is not used. Did you even consider what happens if the value of type is not available in map_action? It'll be the end of the feast!

last one: True and False as an object key are not Booleans, so if you have your IDE up and running, the code will fail.

Now, you can try to solve this with promises, just as you can try brush your teeth with a broomstick.

> you compare err != null?? besides not using strict mode, what should err be? a String? So, what will happen if err is undefined or an empty String?

YOU are the one who made that comparison, not me. You originally wrote the following line:

   ( err ) ? console.err( msg ) : console.log( msg );
What do you think the "?" operator does with "err"?

> Then you call logFsResult with err while it is not used..

It seems you don't understand the code. I'm not calling "logFsResult", I am defining a function called logFsResult. You also did the same: you defined logFsResult to receive the "err" parameter.

  function logFsResult( type, err ){
  	var msg= '';
  	switch ( type ) {
> That's a trap, I should rather make my code as readable, scalable and bug free as possible regardless of ESxxx.

ES6 allows you to write more readable code than ES5. Take a look at the features.

That's still not error handling, that's error logging. Try writing code that depends on the success of the previous operation to perform another operation and you'll quickly find yourself in the callback soup.

Honestly that looks hideous.

You might like the make_esc/errify pattern [0]. You can apply the same function to any number of error callbacks in order to unify error handling. Works great with Iced CoffeeScript but also works well without. I can provide more examples but I think you'll get it.

[0]: https://github.com/nextorigin/el-borracho/blob/master/src/re...

P.S. Iced3 compiles to ES6 await so all of this has been working together for quite a while.

With Promises we conceptualize a control flow construct like callbacks. It is much easier to reason about a concept rather than following code execution paths. With async await we return control flow to the current scope.

One benefit comes from being able to use `async`/`await`.

Using `try`/`catch` with `async`/`await` is a bit awkward, though, which is especially unfortunate because it's the #1 place you should be handling errors.

LightScript has a language feature[0] that makes it less awkward (sort of an`Either`/`Result` type) that I'm thinking about submitting to TC39.

[0] – http://www.lightscript.org/docs/#safe-await
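A userland approximation of what safe-await sugars over (the helper name `to` is ours, not LightScript's):

```javascript
// Fold a promise into an [error, value] pair so no try/catch is needed.
const to = promise => promise.then(value => [null, value], error => [error, null]);

async function demo() {
  const [err, data] = await to(Promise.resolve(42));
  if (err) {
    console.error('failed:', err);
    return;
  }
  console.log('got', data);
}

demo();
```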

I will agree; to be honest, I tried promises in my own projects and am now writing them for someone else. But to me Async.js is just cleaner and nicer. The funniest part is Bluebird's docs on transitioning from it to Promises: the promise example is bigger and messier.

Previously I had used a module called "denodeify" that achieves the same thing as promisify. I can see why the node maintainers used a different name :-D

I've been using https://github.com/sindresorhus/pify which is… also a different name but a similar one

Wow, what about good old promises? I totally dislike async, await, because they force you to switch to async code-flow-thinking.

>I totally dislike async, await, because they force you to switch to async code-flow-thinking.

If you started programming with Node, it's like you've learned doing everything the wrong way.

The async/await brings back the sane, synchronous, reasoning. There's a reason that from Lisp to Haskell, every language tried to get rid of the error-prone callback spaghetti.

I have not started programming with JavaScript. The point is that node js is event driven and you can hardly escape from callbacks, except with syntactic sugar.

Async await is an argument for people like you, who are trying hard to change its nature. It was added to the specs because "browsers got stuck with JavaScript".

There are better server side languages, so you can use them.

>The point is that node js is event driven and you can hardly escape from callbacks, except with syntactic sugar.

You can hardly escape from imperative code except with syntactic sugar either. That doesn't mean that one should program in the lower layers of abstraction of a platform. If we followed that, C programming would be all gotos instead of functions and the usual control flow (even "if" and "for" are syntactic sugar on top of assembly constructs).

>Async await is an argument for people like you, who are trying hard to change its nature.

If it wasn't for people who tried hard to change its nature, JS would still be the same ho-hum language it was its first 15 years (I was there). Node.js was itself an attempt to "change" the nature of JS, moving it from client to server side.

The need for callbacks stems from the fact that JS as a language was never specifically event oriented -- any more than in any other language. It just supported DOM events in the browser environment, which for the first 10-15 years of the web were just simple one level callback handler (button clicked, do that). Hardly any kind of asynchronous programming to write home about. Aside from having first class functions, JS was not particularly designed for evented code. That's where callbacks came in, as a poor man's way to deal with evented code -- other languages have had coroutines, promises etc for 30+ years.

async / await works only with Promises. As the name util.promisify indicates, it generates promises.

You never need to use async / await.

Thanks. I actually wrote the comment, because of :

> This enables writing code using async and await at all times, which is what any sane developer would do when writing code for Node.js.

Async and await do not make your code work synchronously; they make it look synchronous.

I know there's much bigger things in this release to be excited about, but I'm so happy they're allowing trailing commas in function args/params.

Curious why?

I get the usefulness for arrays and object declarations. Particularly to have cleaner diffs. But why for function calls?

Prettifying long function signatures and calls. You can spread them across multiple lines with the same trailing comma syntax you might use with a multiline array or object literal.
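A sketch of the newly legal syntax (names here are made up):

```javascript
// Trailing commas in both parameter lists and call sites keep
// multi-line diffs to one line per change.
function createServer(
  host,
  port,
  timeout, // trailing comma is now allowed here
) {
  return { host, port, timeout };
}

const server = createServer(
  'localhost',
  8080,
  5000, // and here
);
```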

> Particular to have cleaner diffs. But why for function calls?

Exactly the same reason :)

I suppose but that's usually an indication that an object would be more apt. If you have enough parameters that it needs to be wrapped, they're probably hard to track too.

Well no, since you can use object destructuring.

    const fn = ({ arg1 = '', arg2 = [], arg3 = true } = {}) => { }

But isn't this already valid with trailing commas?

One small benefit is that it's now easier to generate code.

I like it for `compose()` function calls. Changing the parameters and order is fairly common, so the trailing comma becomes convenient for moving and adding parameters.

This is exciting news! I'm a long-time LAMP developer (now mostly Laravel) and have been experimenting with NodeJS for an upcoming API project. As Javascript becomes a larger part of each new project, using one language throughout the entire stack is becoming much more compelling.

Is Express still considered the de facto web framework for NodeJS? Or are other frameworks better suited for someone used to the "batteries-included" philosophy of Laravel? I'm watching the new "Learning Node" course from WesBos since he covers async/await, and Express seems very similar to most MVC frameworks.

Koa.js provides a promise-based API that's well-suited to be used with async functions. It has the distinction of being the official successor to Express (created by the same people). Having used it for several years now, I find Express feels a bit kludgy and error-prone.

That said, Express will be around for quite some time, due to its name-recognition and large install base.

Express is the most popular. As a freelancer I always enjoy seeing Express as the framework of choice, since the produced code is very easy to maintain and very hard to let become unmaintainable.

As a matter of fact, what I don't recommend is Sails [1], which tries to do as much as it can and is quite inflexible in terms of technical decisions.

1 : http://sailsjs.com

My passion right now is feathers+nuxt+vuejs. If you're really wanting to keep the "Laravel" feeling though, there is also adonisjs.

AdonisJS is definitely very close to Laravel. I'm not a big fan of the generator syntax, but I assume they will migrate to async/await with the Node v8 announcement.

Might look at Hapi (https://hapijs.com/) for a more "batteries-included" experience - but Express is still a great choice for just busting out a simple http service.

You can also take a look at Huncwot [1] which is being built specifically with async/await & new versions of JavaScript in mind. PS. I'm the author.

[1]: https://github.com/zaiste/huncwot

Express is still very common, but other than that and Hapi, Sails.js (http://sailsjs.com/) is probably a great place to start. It's a full-featured MVC framework built on top of Express.

Does anyone have a link to better explanation of the changes to `debugger`?

> The legacy command line debugger is being removed in Node.js 8. As a command line replacement, node-inspect has been integrated directly into the Node.js runtime. Additionally, the V8 Inspector debugger, which arrived previously as an experimental feature in Node.js 6, is being upgraded to a fully supported feature.

It sounds like `node debug` will no longer work? But it is replaced with something that's better? What is `node-inspect` and where can I learn about it?

Good questions – the new facilities are improvements AFAICT:

    node --inspect index.js

    node --inspect --debug-brk index.js
then open `chrome://inspect` in Chrome.


Can I debug node 8 from the command line, or do I need X / chrome now?

You can also debug, to a limited degree, from the command line:


> Node.js's debugger client is not a full-featured debugger, but simple step and inspection are possible.

Thanks. I use the REPL debugger often, glad to see it's not going away.

Note: --debug-brk has been deprecated in Node 8, you can use --inspect-brk (--inspect not needed) instead.


It's explained there. Basically, `node debug` will still work, they just had to change the command line debugger to support the new protocol, since the old protocol was removed from V8.

But unless you really need to debug from the command line, --inspect/--inspect-brk is the way to go. You don't necessarily have to use Chrome either, these days IDE debuggers support this protocol as well.

This is a big release. Async/await in stable core is something I've been (literally) waiting 6 years for.

Many people have criticized Node's cooperative multithreading model, for reasons both good and uninformed. Yet it is beyond dispute that the model is popular.

Async/await is a giant leap forward toward making Node usable for beginner and expert alike. This release is a celebration.

For those of you with existing applications looking to migrate, try `--turbo --ignition` to emulate (most of) the V8 5.9 pipeline. Anecdotally, microbenchmark-style code regresses slightly, real-world code improves by as much as 2x. Exciting times.

> Anecdotally, microbenchmark-style code regresses slightly, real-world code improves by as much as 2x.

Curious, any hunches as to why?

Well, the optimizing pipeline has completely changed in V8 5.8 (if you use `--turbo --ignition`) and in 5.9+. It's been simplified and most importantly, does not deoptimize on quite so many common features (such as try/catch). More information at http://benediktmeurer.de/2017/03/01/v8-behind-the-scenes-feb... and some of his other articles.

In my testing it appears that TurboFan cannot optimize certain patterns as well as Crankshaft did, but there's no reason to believe those regressions will remain as TF evolves. Optimizing more code is much more important for real apps.

This just motivated me to play around a little bit with the JS async/await implementation. What I found interesting is that async functions will always return promises, even if an immediate value could be returned - for example, in the following function:

    async function getFromCacheOrRemote() {
      if (random()) {
        return "Got it";
      } else {
        await DoSomethingLongRunnning();
        return "Got from network";
      }
    }
The function will return a Promise independent of which branch is taken, although it could return a string for the first branch. Does anybody know the reason? From a consumer point of view it does not matter if the consumer uses await, since await accepts both immediates and Promises. Is it because always returning promises is more convenient for users who use Promise combinators instead of await, and less bug-prone? Or does it maybe even benefit JS runtime optimizations if the return type is always a Promise - even though the promise in both cases might be of a different subtype?

For most applications it probably doesn't matter anyway. However, returning and awaiting immediate values eliminates an allocation and an event loop iteration compared to using a Promise, which is helpful for high-performance code. This is why C# introduced custom awaitables and things like ValueTask<T>.
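
A small illustration of the wrapping (function name is hypothetical):

```javascript
// Even when an async function returns a plain value without awaiting
// anything, the caller still receives a Promise wrapping that value.
async function immediate() {
  return 42;
}

const result = immediate();
console.log(result instanceof Promise); // true
result.then(value => console.log(value)); // 42
```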

It would be weird for async functions to be synchronous wouldn't it?

Also, the consumer is not required to use await with a called async function. They can use .then on it, or pass it around to other functions that consume promises. Or you could pass the async function to decorators that consume promise-returning functions. Using experimental decorator syntax:

    async function() {
Yes it is a trade off of performance for convenience and consistency.

Also... the right side of await will effectively be wrapped in a simple Promise.resolve() if it is not a promise. Proof: http://babeljs.io/repl/#?babili=false&evaluate=true&lineWrap...
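
A quick sketch of that wrapping behavior:

```javascript
// Awaiting a non-promise behaves like awaiting Promise.resolve(value):
async function compare() {
  const a = await 42;                    // plain value, still awaitable
  const b = await Promise.resolve(42);   // explicitly wrapped
  return a === b;
}

compare().then(console.log); // true
```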

Weird: depends. I think that an async function sometimes completing synchronously is not weird - it can happen in many scenarios where the least common denominator is "the function might sometimes take a long time and therefore has to be async". Having a different return type depending on the path taken (Value or Promise<Value>) is definitely weird and should not be encouraged (although supported by JS). I'm therefore in favor of the decision that an async function always returns a Promise - I just wanted to know if anybody has more insight into how the decision was taken and whether the performance impact was considered.

Thanks for bringing up earlier that an await on a value will wrap it into a promise! That means even the following "workaround" would not work:

    function getFromCacheOrRemote() {
      if (random()) {
        return "Got it";
      } else {
        return DoSomethingLongRunnningWhichMightUseAsyncAwaitInternally()
          .then(() => "Got from network");
      }
    }

    var result = await getFromCacheOrRemote();
Here getFromCacheOrRemote is correctly inferred as type string | Promise<string> in TypeScript. However, if an await on the function still triggers a Promise creation, and the await an event loop iteration, it won't buy anything compared to the simple solution. Seems like, to profit from synchronous completions, some steps are also needed at the call site, like:

    var maybePromise = getFromCacheOrRemote();
    var result;
    if (typeof maybePromise === 'object' && maybePromise.then != null)
      result = await maybePromise;
    else
      result = maybePromise;
And just for clarification: I wouldn't encourage any normal application to do this kind of things, the normal async functions should be great for them. However for some libraries (e.g. high-performance networking libraries) these optimizations can make sense. And e.g. the awaitable ValueTask<T> in C# was created with exactly those scenarios in mind.

This "unleashes Zalgo" and is considered a major anti-pattern when designing async APIs: http://blog.izs.me/post/59142742143/designing-apis-for-async...

Is it? I only do some moderate node work, so I could be missing the discussion on this, but I've never seen this brought up or talked about before. I read the blog posts, but is this really a sweeping rule in the js community or is it just the opinion of a few people? Because no one seems to be doing it.

It's a general rule that everyone follows. Things should always be sync or async, not a mix of both.

Returning a promise is the contract of an async function, the whole mechanism is built on top of promises and is supposed to integrate with them seamlessly.

Async functions will always return a Promise; a function doesn't know (or shouldn't) whether the work it's doing is synchronous. Mixing the two is a common JS anti-pattern and should be avoided if possible. The Promise should resolve automatically, so in some cases it resolves in almost synchronous time.
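
A sketch of the anti-pattern and its fix (the readCached names and fakeRead helper are illustrative):

```javascript
const cache = {};
// Stand-in for a slow async source (e.g. fs.readFile):
const fakeRead = (key, cb) => setImmediate(() => cb(null, 'data:' + key));

// Anti-pattern ("releasing Zalgo"): the callback fires synchronously
// on a cache hit but asynchronously on a miss, so callers can't rely
// on a consistent ordering.
function readCachedZalgo(key, cb) {
  if (cache[key]) return cb(null, cache[key]);   // synchronous path
  fakeRead(key, (err, data) => {                 // asynchronous path
    if (!err) cache[key] = data;
    cb(err, data);
  });
}

// Consistent version: defer the cached path with process.nextTick so
// the callback is always invoked asynchronously.
function readCachedConsistent(key, cb) {
  if (cache[key]) return process.nextTick(() => cb(null, cache[key]));
  fakeRead(key, (err, data) => {
    if (!err) cache[key] = data;
    cb(err, data);
  });
}
```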

It really is a significant release: the word 'significant' is used in 7 of the first 13 lines of the release statement.



It looks like somebody needs to set up the deb repository for 8.x, the installation script[1] is there, but there's no repo[2] for the node 8.x itself.

I also think this[3] url needs to get an update to reflect the new release.

edit-> Considering Debian Stretch will be released June 17th, it would be nice to have a repo for this release, I mean ..node_8.x/dists/stretch/Release.. instead of only jessie's and sid's.

[1]https://deb.nodesource.com/setup_8.x [2]https://deb.nodesource.com/node_8.x/dists/jessie/Release [3]https://nodejs.org/en/download/package-manager/#debian-and-u...

Another way is to use "n": https://github.com/tj/n

$ nvm install 8

Long awaited release full of joy!


I just finished cleaning my home folder out of the ~100,000 files npm created over the past couple of months. I just build interesting Node projects I come across to check them out and it's gotten that big. I wonder what it's like for regular node devs.

Sane dependency management isn't free! I have some tips, though:

  # find all node_modules dirs under the cwd on *nix and show their sizes (gsort is for mac with homebrew coreutils; use `sort` otherwise)
  find . -name "node_modules" -type d -prune -exec du -sh '{}' + | gsort -hr

  # exclude node_modules from time machine backups
  mdfind 'kMDItemFSName == node_modules' -0 | xargs -0 tmutil addexclusion

Doesn't the same apply to any nontrivial programming language / dependency management system that works from source? e.g. Go?

I mean Maven's repository is usually pretty big too. It's usually compiled .jars, but IDEs can opt to download sources + documentation too. A lot of Java applications end up downloading half the internet as well.

Long story short, any non-trivial development / library / framework / software has a lot of dependencies.

I cleaned out a random code folder this weekend. Deleted nearly 2 million files from npm.

I created a barebones vue app and had 12,000 :/

Sadly, this is one of the worst parts of Node.js development. Import one NPM package and it will import hundreds of other packages. The sad thing is that probably many of them are:

- no bigger than 50 lines of code
- probably "unpromified"
- probably unmaintained

I really like ECMAScript 2016 and the concept behind Node.js, but the NPM ecosystem really isn't pretty.

>node-inspect has been integrated directly into the Node.js runtime

Is node-inspect the same thing as node-inspector or something else?

Disclaimer: Author of `node-inspect`

No, `node inspect` is the new command line debugger for `node --inspect` which replaces `node debug` for `node --debug`. The name is derived from `node --inspect` and has no relation to `node-inspector`.

Thanks for your work! CLI has always been my primary method of debugging node, and I'd hate to see it go away.

Is this pretty much identical to the old debugger?

Yep, the same commands should still work. There's some additional commands now for CPU/memory profiles. But that's the biggest difference - hopefully.

When you run something, you can add the --inspect flag to debug.

1. node --inspect app.js
2. In chrome do `about:inspect`

you now have debugger attached. WIN!

Looking forward to writing unit tests for luwa. Node v8.0 should include wasm 1.0

The promisify stuff looks rather clunky. Aren't there better options?

That's only needed to convert old callback style code to promise-compatible code. Most libraries either return promises now or have an option to, so this isn't necessary.

Node.js should get promise versions of its current callback-based APIs:

    const fs = require("fs");
    fs.writeFile("helloworld.txt", "Hello, World", (error) => {
        if (error) throw error;
        console.log("done!");
    });
Should be:

    const fs = require("fs");
    fs.writeFilePromise("helloworld.txt", "Hello, World")
        .then(() => console.log("done!"), error => console.error(error));

It really should be:

        try {
            await fs.writeFilePromise("helloworld.txt", "Hello, World")
            console.log("done!")
        } catch (error) {
            console.error(error)
        }
This takes advantage of promises fully.

This was tried many years ago. At that time, Promises/A+ was not finalized, and the community could not agree on which Promise specification was best, or even if one was needed at all.

Callbacks are lightest-weight re: CPU & memory overhead, so it was decided that core APIs should implement that, and developers can opt in to promises using promisify (via e.g. Bluebird or the new `util.promisify()`) as they need. But baking that kind of assumption into core could have led to significant pain.
