
Same reason given in the bluebird library documentation:

> Promise.reduce will start calling the reducer as soon as possible, this is why you might want to use it over Promise.all (which awaits for the entire array before you can call Array#reduce on it).

Whether this is ever necessary is another matter :)
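For reference, the behavior the docs describe can be sketched as a minimal stand-in (this is a hypothetical `promiseReduce` helper in the spirit of bluebird's Promise.reduce, not its actual implementation):

```javascript
// Hypothetical sketch of a sequential promise reducer: it awaits each
// item in order and feeds the resolved value to the reducer, so the
// reduction begins as soon as the first item settles rather than after
// the whole array has resolved.
async function promiseReduce(items, reducer, initialValue) {
  let accumulator = initialValue
  for (const item of items) {
    const value = await item // works for promises and plain values
    accumulator = await reducer(accumulator, value)
  }
  return accumulator
}
```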




    let accumulator = 0
    for (const item of array) {
      const value = await item
      accumulator += value  // your reduction step here
    }
This is equivalent; it doesn't use the 'cool' reduce features, but in my opinion it is much easier to read.


Wouldn't this code only execute one promise at a time? I thought Promise.all allowed promises to resolve in parallel.


Indeed. You most likely should do `await Promise.all` and then do the reduction.
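For example, a sketch assuming an array of promises resolving to numbers:

```javascript
// Kick off all the work up front, then reduce over the resolved values.
// Note that Promise.all rejects as soon as any input promise rejects.
const promises = [Promise.resolve(1), Promise.resolve(2), Promise.resolve(3)]
const results = await Promise.all(promises)
const total = results.reduce((acc, value) => acc + value, 0)
```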


If each item is already a promise, rather than a function returning a promise, the underlying work is already "executing" in parallel; the loop only awaits the results one at a time.


Sorry, in the sense of being identical to the original code in the linked post (reduce), not the comment.


I suppose this might be useful when you are querying one or more APIs with multiple requests and some will certainly return seconds before others.

This way you could have the same reducer handle the results and begin updating the UI as the results come in.

An example real-world app might be a price comparison tool or social media aggregator.


> I suppose this might be useful in situations where you are querying an API(s) with multiple requests and some will certainly return seconds before others.

But it's still a serialized operation, so the parallelism is limited. What's really needed is a "parallel reduce" that, like POSIX's select, consumes whichever promises are ready at each step and reduces them in an arbitrary order.
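A sketch of such a completion-order reduce, built on Promise.race over the still-pending promises (`raceReduce` is an invented name, and this only makes sense when the reducer doesn't care about order):

```javascript
// Hypothetical completion-order reduce: repeatedly race the pending
// promises and fold in whichever one settles first. Only valid when the
// reducer is commutative/associative, since arrival order is arbitrary.
async function raceReduce(promises, reducer, initialValue) {
  // Tag each promise with its index so it can be removed once settled.
  const pending = new Map(
    promises.map((p, i) => [i, Promise.resolve(p).then(value => ({ i, value }))])
  )
  let accumulator = initialValue
  while (pending.size > 0) {
    const { i, value } = await Promise.race(pending.values())
    pending.delete(i)
    accumulator = reducer(accumulator, value)
  }
  return accumulator
}
```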


A nice use case for sure. Seems possible with some kind of iterator/generator wrapper rather than the mess in the OP however.
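One possible shape for that wrapper is an async generator that yields results in completion order (a sketch; `inCompletionOrder` is an invented name):

```javascript
// Hypothetical generator wrapper: yields each promise's result as soon
// as it settles, regardless of input order, by racing the pending set.
async function* inCompletionOrder(promises) {
  const pending = new Map(
    promises.map((p, i) => [i, Promise.resolve(p).then(value => [i, value])])
  )
  while (pending.size > 0) {
    const [i, value] = await Promise.race(pending.values())
    pending.delete(i)
    yield value
  }
}
```

Reduction then becomes an ordinary `for await...of` loop over the wrapper.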


The example isn't using Promise.reduce.



