Short ver: async await is now in an LTS release of node. Anything that returns a promise can now be run inline - ie, no endless .then() chaining - provided you've started an async context:
    const util = require('util'),
          fs = require('fs'),
          writeFile = util.promisify(fs.writeFile),
          stat = util.promisify(fs.stat),
          request = require('superagent'), // superagent requests are already thenable, no promisify needed
          sorts = require('sorts'),
          log = console.log.bind(console)
    const getJeansAndSaveThem = async function () {
      const response = await request.get('https://example.com/api/v1/product/trousers').query({
        label: 'Levi Strauss',
        isInStock: true,
        maxResults: 500
      })
      const jeans = response.body.sort(sorts.alphabetical)
      await writeFile('jeans.json', JSON.stringify(jeans))
      const status = await stat('jeans.json')
      log(`I just got some data from an API and saved the results to a file. The file was born at ${status.birthtime}`)
    }
Note: you should add error handling, I'm new to async/await and this code is to demonstrate a concept on HN, not to run your life support system. ;-)
As noted, this was already possible in Node 7.x. But to use it, you needed a function that returned a promise, and many functions don't, having been developed earlier. That's true even of libraries written in 2016. Even today many Node.js developers don't seem to use async/await. At least in my country... And thus they live in braces {} hell.
The improvement in 8.0.0 is that such an "oldie" function can be promisified. A great improvement.
You could always promisify functions manually, or with this tiny package https://github.com/sindresorhus/pify but yeah it's nice to see that in the standard library.
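For anyone curious what such a helper boils down to, here's a bare-bones sketch of the general promisify pattern (not pify's actual implementation; `delayedDouble` is just a made-up callback-style function):

```javascript
// Minimal sketch of promisification: wrap an (err, result)-style
// callback API in a function that returns a Promise instead.
function promisify(fn) {
  return function (...args) {
    return new Promise((resolve, reject) => {
      fn(...args, (err, result) => {
        if (err) reject(err)
        else resolve(result)
      })
    })
  }
}

// Usage: promisify a callback-style function and await/then it.
const delayedDouble = (n, cb) => setImmediate(() => cb(null, n * 2))
const double = promisify(delayedDouble)
double(21).then(x => console.log(x)) // logs 42
```

Real implementations add more (multi-argument callbacks, `this` binding), but the core is just bridging the callback into a Promise executor.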
It seems to me that Node should ship promisified versions of all these functions (e.g. fs.readFileAsync). Over time they could replace the implementation with ones that use the OS async APIs, though perhaps this step is unnecessary if you're already using an event loop underneath.
I mean, they've been debating about how to add a promisified API for years (see for example https://github.com/nodejs/node/pull/5020) but nothing has come of it. util.promisify is a welcome addition, but very late for a first step.
FWIW v8.0.0 is not an LTS release. Technically the first v8.x LTS release will happen in October. See this repo for more details: https://github.com/nodejs/lts
The 7.6 release was no more unstable than the 8.0 release. Unless you're referring to the fact that 8.0 is going to be an LTS release. But if that's the case, it's not happening for a while yet; 6.10 is still the current LTS version.
Is there a roadmap for supplying async/await interfaces for the entire stdlib? There's still some event/callback based stuff in there that doesn't return promises or conform to the callback standard promisify requires.
There are many ways to compose promises, no point in elevating some of them to keyword status. If you need to do anything interesting with promises, just use a promises library. The point of async/await is to free you from having to use callbacks, not to eliminate the need for libraries.
I don't see what's unclean about Promise.all or how creating a 1:1 alias for it makes it cleaner.
Also, many times you want to limit the concurrency of your promise execution which isn't something you can do with an array of promises. You'd be back to using `await` + something like https://www.npmjs.com/package/promise.map.
I'm someone that used to use `co` where you could go:
const [a, b] = yield [promiseA(), promiseB()]
But I prefer the simplicity and consistency of having to use something like Promise.all or require('promise.map').
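For what it's worth, a concurrency-limited map is only a handful of lines if you'd rather not take the dependency. This is a rough sketch of the general idea, not the promise.map package's actual implementation:

```javascript
// Sketch: run an async mapper over items with at most `limit`
// operations in flight at once. Each worker repeatedly pulls the
// next unclaimed index until the queue is drained.
async function mapLimited(items, limit, mapper) {
  const results = new Array(items.length)
  let next = 0
  async function worker() {
    while (next < items.length) {
      const i = next++ // claimed synchronously, so no two workers share an index
      results[i] = await mapper(items[i], i)
    }
  }
  // Start `limit` workers and wait for all of them to finish.
  await Promise.all(Array.from({ length: Math.min(limit, items.length) }, worker))
  return results
}

// Usage: at most 2 mapper calls run concurrently.
mapLimited([1, 2, 3, 4], 2, async n => n * 2).then(r => console.log(r)) // logs [ 2, 4, 6, 8 ]
```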
I have a hard time understanding how adding alias indirection disambiguates anything much less spares one from learning something.
How does one get to the point where they want to await multiple promises, yet they need to be insulated from the very presence of the `Promise` object?
Well at least in my mental model a Promise is an object that eventually resolves to something within its .then() method. In contrast my mental model of await is that of a blocking call. One can internalize the latter without the former.
Pretty sure there used to be a syntax in the spec for handling that. Something like `await*` I believe. It got removed pretty early on in the process though.
I wasn't really involved in those discussions, but I suspect it's for similar reasons to what others in the comments here are arguing; they wanted to keep the syntax minimal and decided that adding a special syntactic sugar for something that could already be accomplished by `Promise.all(...)` was unnecessary.
Well, an async function always returns a Promise. await simply suspends execution until the Promise is resolved. That's why I am wondering why they decided to await only a single Promise.
And that's a very implicit way of handling the error. Implicit behavior is good when you are encapsulating stuff (e.g. I don't need to know the details of internal combustion engines, just that pushing the accelerator of my car makes it move forward).
But in this case encapsulation is a leaky abstraction that just makes the diagnosing of an error more cumbersome.
Oh please. The point of the post was to illustrate the feature to those wondering "what's async/await?", error handling would be complete noise for that purpose.
One of the nice things is that you can just let async functions fail and they won’t throw asynchronously and kill the process. So no, error handling isn’t important to include in a small example.
Previously that characteristic was also a massive pain point for debugging Node apps though - neglecting error handling would result in silently swallowed errors and leave people (particularly newbies) scratching their heads.
Then they added unhandled rejection warnings by default, and all was ok again - but I see why someone might insist that all examples have error handling.
That’s not something that should be handled inside an async function (and therefore not in this example). If you don’t want to continue a promise chain, terminate it:
Obviously the example code is inside another function - `await` can only be used inside a function marked with `async`, which always returns a promise.
That means error handling may not be necessary inside that function. Errors are caught automatically and returned to the caller as rejected promise! You need error handling at the top level but not necessarily inside each function.
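In other words, something like this (with made-up function names):

```javascript
// Errors thrown anywhere inside an async function reject the
// promise it returns, so one try/catch at the call site suffices.
async function readConfig() {
  throw new Error('config missing') // no try/catch needed here
}

async function main() {
  try {
    await readConfig()
  } catch (err) {
    // The throw above surfaces here, at the top level.
    console.error('handled at the top level:', err.message)
  }
}

main()
```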
"Note that, when referring to Node.js release versions, we have dropped the "v" in Node.js 8. Previous versions were commonly referred to as v0.10, v0.12, v4, v6, etc. In order to avoid confusion with V8, the underlying JavaScript engine, we've dropped the "v" and call it Node.js 8."
I was wondering how this would be handled.
I guess old habits die hard since this article title includes the "v".
For anyone not in the JS/Node worlds, this is a significant release that people are particularly excited about. It was also delayed somewhat due to wanting to align with V8 which should, however, be totally worth it :-)
Node.js 8.0.0 includes a new util.promisify() API that allows standard Node.js callback style APIs to be wrapped in a function that returns a Promise. An example use of util.promisify() is shown below.
This is great stuff. This enables writing code using async and await at all times, which is what any sane developer would do when writing code for Node.js.
Almost. Don't forget about the case where callback receives multiple arguments, and the fact that someone might decide to change the API for callback signature like `(err, result)` to `(err, result, stats)`, for example.
Promises only support a single success value, so if you're "promisifying" something you're only going to get the second callback argument as the resolved value. The new util.promisify() doesn't provide said functionality [1] and will only resolve the second argument [2], unless you define a custom function on the original function.
A bunch of functions in core return multiple things to callback. A lot more in userland do the same. Spec or not, that is still something that needs to be accounted for.
Those are handled by `util.promisify()` by having the original function define special properties on itself `Symbol('util.promisify.custom')` and `Symbol('customPromisifyArgs')`. But it's not something handled by default (e.g. resolving an array if the callback gets more than 2 arguments passed to it).
What is nice is that it eats into bluebird. I love bluebird, and there are some nice utility functions. But if you only used it to promisify or create promises, there may be no need to keep it.
    somePromise.then(function() {
        return a.b.c.d();
    }).catch(TypeError, ReferenceError, function(e) {
        // Will end up here on programmer error
    }).catch(NetworkError, TimeoutError, function(e) {
        // Will end up here on expected everyday network errors
    }).catch(function(e) {
        // Catch any unexpected errors
    });
Sometimes you want to catch two different sorts of exception. You could do something like
    f = expr `catch` \ (ex :: ArithException) -> handleArith ex
             `catch` \ (ex :: IOException)    -> handleIO ex
However, there are a couple of problems with this approach. The first is that having two exception handlers is
inefficient. However, the more serious issue is that the second exception handler will catch exceptions in the
first, e.g. in the example above, if handleArith throws an IOException then the second exception handler will
catch it.
Instead, we provide a function catches, which would be used thus:
    f = expr `catches` [Handler (\ (ex :: ArithException) -> handleArith ex),
                        Handler (\ (ex :: IOException)    -> handleIO ex)]
Edit: Nevermind, I think what you want to be able to do is provide two different error handlers, but essentially catch them at the same time so that if you throw inside one of them, the second one wouldn't catch it.
Not related to the aforementioned Bluebird feature, but I think that's the very reason the promise spec allows you to specify an error callback as the second argument to .then. I guess you can always fallback to an if/switch statement if it's a concern (which is what you'd do with await and try/catch).
The standard way to handle errors coming from 'await' is try/catch, and any errors can be handled in the catch block as if they were coming from a synchronous context. So, you'd now filter async errors the same way you filter synchronous ones.
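For example, with a hypothetical error class:

```javascript
// Filtering async errors the synchronous way: one try/catch
// plus instanceof checks, instead of typed .catch() overloads.
class TimeoutError extends Error {}

async function fetchData() {
  throw new TimeoutError('took too long')
}

async function main() {
  try {
    await fetchData()
  } catch (err) {
    if (err instanceof TimeoutError) {
      console.log('retrying after timeout')
    } else {
      throw err // rethrow anything we don't know how to handle
    }
  }
}

main()
```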
I constantly use `.map` and `.reduce` out of bluebird. I'm not sure I will replace these soon, since they're outside the Promises/A+ spec.
As a matter of fact, does anyone have a benchmark of the new Node 8 promise implementation against bluebird? So far bluebird has been faster than the native implementation.
If I remember correctly there is a 4.5x speed up in the bundled V8 (chrome engine) implementation, making it on-par speed-wise with bluebird, however that is hardly your bottleneck anyway.
The short story is that native promises are faster now except for the "promisification" part.
The benchmark was designed for realistic use in a node environment, where most of the libraries come callback based. Because of that a very fast "promisify" is really important. Native promises don't provide one so the naive implementation using standards-compatible API is quite slow.
Bluebird's promisify is a lot faster since it relies on non-standard (as in non-ES6-standard) internals instead of using the promise constructor as an ES6-based promisifier would need to do.
edit: on second thought, I haven't looked at the included `util.promisify` - it could be taking advantage of non-public internal V8 promise APIs.
I still don't get the need for Promises. Almost all examples, just like the one in the provided link, talk about solving callback hell with Promises, while callback hell is just a bad way of writing software imo. Look at the code below and please tell me why your example with Promises is a better solution.
    function logAppendResult( err ) {
        if (err) console.error('Failed to update file');
        else console.log('Successfully updated file');
    }

    function logWriteResult( err ) {
        if (err) console.error('Failed to create file');
        else console.log('Successfully created file');
    }

    function handleFile( filename, fileExists ) {
        const timestamp = new Date().toISOString();
        ( fileExists )
            ? fs.appendFile( filename, `Updated file on ${timestamp}\n`, logAppendResult )
            : fs.writeFile( filename, `Created file on ${timestamp}\n`, logWriteResult );
    }

    function main() {
        const filename = './example.txt';
        exists( filename, (fileExists) => handleFile(filename, fileExists) );
    }
Have you ever tried to run more than one operation at once and collect the results?
(There are lots of other reasons to use the promise abstraction – having a type that can be transformed is extremely useful and natural – but that one’s pretty significant.)
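For instance, starting two operations concurrently and collecting both results is a one-liner with Promise.all (a toy example with a made-up `delay` helper):

```javascript
// Two independent async operations, started concurrently; the
// results come back in argument order once both have settled.
const delay = (ms, value) =>
  new Promise(resolve => setTimeout(() => resolve(value), ms))

async function main() {
  const [a, b] = await Promise.all([
    delay(20, 'first'),
    delay(10, 'second')
  ])
  console.log(a, b) // logs: first second
}

main()
```

With bare callbacks you'd be hand-rolling a completion counter and a shared results array to get the same behavior.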
When callbacks are used with discipline, they are not much different from promises. The problem is when "discipline" part meets "human" part, though it's still true for promises, perhaps to a lesser degree.
And if you replace the if(err) lines with a log(...) function it doesn't reduce them to one place. It makes you repeat the log(...) function everywhere. And you'd still need the if statement to handle the control flow.
Simple code is great, but not handling errors doesn't cut it for non throwaway applications.
Whenever you program in ECMAScript 2016 ("JavaScript"), you should take advantage of its features. Right now you're coding pretty much the same way one would in C. Take advantage of functions being first-class in ES2016. Take advantage of JSON.
For example you could do something like this (sorry, don't have time to open my IDE to try the code i'm going to write):
    logFsResult = (type, err) => {
        map_action = {
            "append": {
                True: "Succesfully updated file.",
                False: "Failed to update file."
            },
            "write": {
                True: "Successfuly created file.",
                False: "Failed to write file."
            }
        }
        message = map_action[type][(err != null)] // obtain the message
        method = (err)? console.err : console.log // choose logger
        method(err) // invoke the correct logger with the message
    }
This is easier-to-maintain code. It separates the messages to use from the logic. It allows you to easily configure/change the messages to log, and the method (function) for logging the error, without having to touch the decision logic. In your original example, the logic of the code was mingled with the error messages.
You could say this example was more "idiomatic" ES2016.
> Whenever you program in ECMAScript2016("Javascript"), you should take advantage of its features.
That's a trap, I should rather make my code as readable, scalable and bug free as possible regardless of ESxxx.
I can refactor in 10 other ways (different styles) coming to the same result, but that's not what my point was about. Using promises and so on is just taste or preference: if you like it you can use it, if not, do without. I've seen amazing spaghetti with promises and callbacks as well.
Easy to nitpick btw:
You compare err != null?? Besides not using strict mode, what should err be? A string? So what will happen if err is undefined or an empty string?
Then you call logFsResult with err while it is not used. Did you even consider what happens if the value of type is not available in map_action? It'll be the end of the feast!
last one: True and False as an object key are not Booleans, so if you have your IDE up and running, the code will fail.
Now, you can try to solve this with promises, just as you can try brush your teeth with a broomstick.
What do you think the "?" operator does with "err"?
> Then you call logFsResult with err while it is not used..
It seems you don't understand the code. I'm not calling logFsResult, I am defining a function called logFsResult. You also did the same: you defined logFsResult to receive the "err" parameter.
    function logFsResult( type, err ){
        var msg = '';
        switch ( type ) {
> That's a trap, I should rather make my code as readable, scalable and bug free as possible regardless of ESxxx.
ES6 allows you to write more readable code than ES5. Take a look at the features.
That's still not error handling, that's error logging. Try writing code that depends on the success of the previous operation to perform another operation and you'll quickly find yourself in the callback soup.
You might like the make_esc/errify pattern [0]. You can apply the same function to any number of error callbacks in order to unify error handling. Works great with Iced CoffeeScript but also works well without. I can provide more examples but I think you'll get it.
With Promises we get a control flow construct that conceptualizes callbacks. It is much easier to reason about a concept than to follow code execution paths. With async/await we return control flow to the current scope.
One benefit comes from being able to use `async`/`await`.
Using `try`/`catch` with `async`/`await` is a bit awkward, though, which is especially unfortunate because it's the #1 place you should be handling errors.
LightScript has a language feature [0] that makes it less awkward (sort of an `Either`/`Result` type) that I'm thinking about submitting to TC39.
I will agree; to be honest I tried promises in my own projects and am now writing them for someone. But to me Async.js is just cleaner, nicer, better. The funniest part is the Bluebird doc on transitioning from it to Promises - the promise example is bigger and messier.
Previously I had used a module called "denodeify" that achieves the same thing as promisify. I can see why the node maintainers used a different name :-D
>I totally dislike async, await, because they force you to switch to async code-flow-thinking.
If you started programming with Node, it's like you've learned doing everything the wrong way.
The async/await brings back the sane, synchronous, reasoning. There's a reason that from Lisp to Haskell, every language tried to get rid of the error-prone callback spaghetti.
I have not started programming with JavaScript. The point is that Node.js is event driven and you can hardly escape from callbacks, except with syntactic sugar.
Async/await is an argument for people like you, who are trying hard to change its nature. It was added to the spec because "browsers got stuck with JavaScript".
There are better server side languages, so you can use them.
>The point is that Node.js is event driven and you can hardly escape from callbacks, except with syntactic sugar.
You can hardly escape from imperative code except without syntactic sugar either. That doesn't mean that one should program in the lower layer of abstraction of a platform. If we followed that, C programming would be all gotos instead of functions and the usual control flow (even "if" and "for" are syntactic sugar on top of assembly constructs).
>Async/await is an argument for people like you, who are trying hard to change its nature.
If it wasn't for people who tried hard to change its nature, JS would still be the same ho-hum language it was its first 15 years (I was there). Node.js was itself an attempt to "change" the nature of JS, moving it from client to server side.
The need for callbacks stems from the fact that JS as a language was never specifically event oriented -- any more than any other language. It just supported DOM events in the browser environment, which for the first 10-15 years of the web were just simple one-level callback handlers (button clicked, do that). Hardly any kind of asynchronous programming to write home about. Aside from having first-class functions, JS was not particularly designed for evented code. That's where callbacks came in, as a poor man's way to deal with evented code -- other languages have had coroutines, promises etc. for 30+ years.
Prettifying long function signatures and calls. You can spread them across multiple lines with the same trailing comma syntax you might use with a multiline array or object literal.
I suppose but that's usually an indication that an object would be more apt. If you have enough parameters that it needs to be wrapped, they're probably hard to track too.
I like it for `compose()` function calls. Changing the parameters and order is fairly common, so the trailing comma becomes convenient for moving and adding parameters.
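For example, both of these are now valid syntax (ES2017 trailing commas in parameter lists and call sites, shipped with Node 8's V8):

```javascript
// Trailing commas in parameter lists and call sites mean that
// reordering or appending an argument is a one-line diff.
function createUser(
  name,
  email,
  isAdmin, // adding a parameter after this line doesn't touch it
) {
  return { name, email, isAdmin }
}

const user = createUser(
  'Ada',
  'ada@example.com',
  true,
)

console.log(user.name) // logs: Ada
```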
This is exciting news! I'm a long-time LAMP developer (now mostly Laravel) and have been experimenting with NodeJS for an upcoming API project. As Javascript becomes a larger part of each new project, using one language throughout the entire stack is becoming much more compelling.
Is Express still considered the de facto web framework for NodeJS? Or are other frameworks better suited for someone used to the "batteries-included" philosophy of Laravel. I'm watching the new "Learning Node" course from WesBos since he covers async/await and Express seems very similar to most MVC frameworks.
Koa.js provides a promise-based API that's well-suited to be used with async functions. It has the distinction of being the official successor to Express (created by the same people). Having used it for several years now, I find Express feels a bit kludgy and error-prone.
That said, Express will be around for quite some time, due to its name-recognition and large install base.
Express is the most popular. As a freelancer I always enjoy when I see express as the framework of choice, since it's very easy to maintain the produced code and is very hard to go unmaintainable.
As a matter of fact, what I don't recommend is Sails [1], which tries to do as much as it can and is quite inflexible in terms of technical decisions
AdonisJS is definitely very close to Laravel. I'm not a big fan of the generator syntax, but I assume they will migrate to async/await with the Node v8 announcement.
Might look at Hapi (https://hapijs.com/) for a more "batteries-included" experience - but Express is still a great choice for just busting out a simple http service.
Express is still very common, but other than that and Hapi, Sails.js (http://sailsjs.com/) is probably a great place to start. It's a full-featured MVC framework built on top of Express.
Does anyone have a link to better explanation of the changes to `debugger`?
> The legacy command line debugger is being removed in Node.js 8. As a command line replacement, node-inspect has been integrated directly into the Node.js runtime. Additionally, the V8 Inspector debugger, which arrived previously as an experimental feature in Node.js 6, is being upgraded to a fully supported feature.
It sounds like `node debug` will no longer work? But it is replaced with something that's better? What is `node-inspect` and where can I learn about it?
It's explained there. Basically, `node debug` will still work, they just had to change the command line debugger to support the new protocol, since the old protocol was removed from V8.
But unless you really need to debug from the command line, --inspect/--inspect-brk is the way to go. You don't necessarily have to use Chrome either, these days IDE debuggers support this protocol as well.
This is a big release. Async/await in stable core is something I've been (literally) waiting 6 years for.
Many people have criticized Node's cooperative multithreading model, with both good and uninformed reasons. Yet, it is without dispute that the model is popular.
Async/await is a giant leap forward toward making Node usable for beginner and expert alike. This release is a celebration.
For those of you with existing applications looking to migrate, try `--turbo --ignition` to emulate most of the V8 5.9 pipeline. Anecdotally, microbenchmark-style code regresses slightly, while real-world code improves by as much as 2x. Exciting times.
Well, the optimizing pipeline has completely changed in V8 5.8 (if you use `--turbo --ignition`) and in 5.9+. It's been simplified and most importantly, does not deoptimize on quite so many common features (such as try/catch). More information at http://benediktmeurer.de/2017/03/01/v8-behind-the-scenes-feb... and some of his other articles.
In my testing it appears that TurboFan cannot optimize certain patterns as well as Crankshaft did, but there's no reason to believe those regressions will remain as TF evolves. Optimizing more code is much more important for real apps.
This just motivated me to play around a little bit with JS async/await implementation. What I found interesting is that async functions will always return promises, even if an immediate value could be returned. Like for example in the following function:
    async function getFromCacheOrRemote() {
        if (random()) {
            return "Got it";
        } else {
            await DoSomethingLongRunning();
            return "Got from network";
        }
    }
The function will return a Promise regardless of which branch is taken, although it could return a string for the first branch. Does anybody know the reason? From a consumer point of view it does not matter whether the consumer uses await, since await accepts both immediate values and Promises. Is it because always returning promises is more convenient for users who use Promise combinators instead of await, and less bug-prone? Or does it maybe even benefit JS runtime optimizations if the return type is always a Promise - even though the promise in the two cases might be of a different subtype?
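You can confirm the behavior directly (a small sketch; both branches hand the caller a Promise):

```javascript
// Even when an async function returns a plain value on one
// branch, the caller always receives a Promise wrapping it.
async function getValue(useCache) {
  if (useCache) return 'cached'         // plain string...
  return await Promise.resolve('slow')  // ...or an awaited promise
}

console.log(getValue(true) instanceof Promise)  // logs: true
console.log(getValue(false) instanceof Promise) // logs: true
```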
For most applications it probably doesn't matter anyway. However returning and awaiting immediate values eliminates an allocation and eventloop iteration compared to using a Promise, which is helpful for high-performance code. This is why C# now introduced custom awaitables and things like ValueTask<T>.
It would be weird for async functions to be synchronous wouldn't it?
Also, the consumer is not required to use await with a called async function. They can use .then on it, or pass it around to other functions that consume promises. Or you could pass the async function to decorators that consume promise-returning functions. Using experimental decorator syntax:
    @debouncePromise
    async function() {
        ...
    }
Yes it is a trade off of performance for convenience and consistency.
Weird: depends. I think that an async function might sometimes complete synchronously is not weird - it can happen in many scenarios where the least common denominator is "the function might sometimes take a long time and therefore has to be async". Having a different return type depending on the taken path (Value or Promise<Value>) is definitely weird and should not be encouraged (although supported by JS). I'm therefore in favor of the decision that an async function always returns a Promise - I just wanted to know if anybody has more insight into how the decision was taken and whether the performance impact was considered.
Thanks for bringing up that an await on a plain value will still wrap it in a promise! That means even the following "workaround" would not work:
    function getFromCacheOrRemote() {
        if (random()) {
            return "Got it";
        } else {
            return DoSomethingLongRunningWhichMightUseAsyncAwaitInternally()
                .then(() => "Got from network");
        }
    }

    var result = await getFromCacheOrRemote();
Here getFromCacheOrRemote is correctly inferred as type string | Promise<string> in TypeScript. However, if an await on the function still triggers a Promise creation, and the await an event loop iteration, it won't buy anything compared to the simple solution. It seems that to profit from synchronous completions some steps are also needed at the call site, like:
    var maybePromise = getFromCacheOrRemote();
    var result;
    if (typeof maybePromise === 'object' && maybePromise.then != null)
        result = await maybePromise;
    else
        result = maybePromise;
And just for clarification: I wouldn't encourage any normal application to do this kind of things, the normal async functions should be great for them. However for some libraries (e.g. high-performance networking libraries) these optimizations can make sense. And e.g. the awaitable ValueTask<T> in C# was created with exactly those scenarios in mind.
Is it? I only do some moderate Node work, so I could be missing the discussion on this, but I've never seen this brought up or talked about before. I read the blog posts, but is this really a sweeping rule in the JS community or is it just the opinion of a few people? Because no one seems to be doing it.
Returning a promise is the contract of an async function, the whole mechanism is built on top of promises and is supposed to integrate with them seamlessly.
Async functions will always return a Promise; the caller doesn't know (and shouldn't care) whether the work inside is synchronous. A sometimes-sync, sometimes-async API is a common JS anti-pattern and should be avoided if possible. The Promise will still resolve automatically, so in some ways it completes in almost synchronous time.
It looks like somebody needs to set up the deb repository for 8.x, the installation script[1] is there, but there's no repo[2] for the node 8.x itself.
I also think this[3] url needs to get an update to reflect the new release.
Edit: Considering Debian Stretch will be released June 17th, it would be nice to have a repo for this release, i.e. ..node_8.x/dists/stretch/Release.. instead of only jessie's and sid's.
I just finished cleaning my home folder of the ~100,000 files npm created over the past couple of months. I just build interesting Node projects I come across to check them out, and it's gotten that big. I wonder what it's like for regular Node devs.
Sane dependency management isn't free! I have some tips, though:
    # find all node_modules under the cwd on *nix (gsort is for mac with homebrew coreutils; use `sort` otherwise)
    find . -name "node_modules" -type d -prune -exec du -sh '{}' + | gsort -hr

    # exclude node_modules from Time Machine backups
    mdfind 'kMDItemFSName == node_modules' -0 | xargs -0 tmutil addexclusion
Doesn't the same apply to any nontrivial programming language / dependency management system that works from source? e.g. Go?
I mean, Maven's repository is usually pretty big too. It's usually compiled .jars, but IDEs can opt to download sources + documentation too. A lot of Java applications end up downloading half the internet as well.
Long story short, any non-trivial development / library / framework / software has a lot of dependencies.
Sadly, this is one of the worst parts of Node.js development. Import one NPM package and it will import hundreds of other packages. The sad thing is that probably many of them are:
- no bigger than 50 lines of code
- probably "unpromified"
- probably unmaintained
I really like ECMAScript2016 and the concept behind Node.js, but the NPM ecosystem is really something that isn't pretty.
No, `node inspect` is the new command line debugger for `node --inspect` which replaces `node debug` for `node --debug`. The name is derived from `node --inspect` and has no relation to `node-inspector`.
That's only needed to convert old callback style code to promise-compatible code. Most libraries either return promises now or have an option to, so this isn't necessary.
This was tried many years ago. At that time, Promises/A+ was not finalized, and the community could not agree on which Promise specification was best, or even if one was needed at all.
Callbacks are lightest-weight re: CPU & memory overhead, so it was decided that core APIs should implement that, and developers can override using promisify (via e.g. Bluebird or the new `util.promisify()`) as they need. But putting that kind of assumption in core could lead to significant pain.