Hacker News

> How can we make it better? Let's start by removing the requirement for identity value to always be the promise.

I challenge the view that allowing the identity value to be something other than a Promise is 'making it better'. Pointless abstraction is one of my pet peeves in this industry. This looks like it has gone from a fairly straightforward, if kludgy, piece of code to something far more complex. Why not just:

  const listOfPromises = [...]

  const result = Promise.all(listOfPromises).then(results => {
    return results.reduce((acc, next) => acc + next)
  })


?



Same reason given in the bluebird library documentation:

> Promise.reduce will start calling the reducer as soon as possible, this is why you might want to use it over Promise.all (which awaits for the entire array before you can call Array#reduce on it).

Whether this is ever necessary is another matter :)
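
For reference, the behaviour bluebird describes can be sketched in a few lines (a hypothetical `promiseReduce` helper, not bluebird's actual implementation): the reducer runs as soon as each item in turn has resolved, instead of waiting for the whole array.

```javascript
// Hypothetical sketch of a bluebird-style Promise.reduce: the reducer
// is called as soon as each item resolves (in array order), rather
// than after every promise has settled as with Promise.all.
async function promiseReduce(items, reducer, initialValue) {
  let acc = initialValue
  for (const item of items) {
    // `item` may be a plain value or a promise; await handles both.
    acc = await reducer(acc, await item)
  }
  return acc
}
```

So `promiseReduce([p1, p2], (a, b) => a + b, 0)` adds each result the moment that promise settles, in order.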


    async function sumSequentially(array) {
      let accumulator = 0
      for (const item of array) {
        const value = await item
        accumulator += value // your code here
      }
      return accumulator
    }
This is identical, and while it doesn't use 'cool' reduce features it's much easier to read, in my opinion.


Wouldn’t this code only execute 1 promise at a time? I thought Promise.all allowed promises to be resolved in parallel


Indeed. You most likely should do `await Promise.all` and then do the reduction.


If item is already a promise, and not a function returning a promise, they would be "executing" in parallel.
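
To illustrate the point (with a hypothetical `delay` helper): the promises start running when they are created, so awaiting them one at a time only serialises the reads, not the underlying work.

```javascript
// Hypothetical delay helper: resolves with `value` after `ms` milliseconds.
const delay = (ms, value) =>
  new Promise(resolve => setTimeout(() => resolve(value), ms))

// Awaits each promise in order, but the timers were already started
// when the array was built, so they run concurrently.
async function sumInOrder(promises) {
  let total = 0
  for (const p of promises) {
    total += await p
  }
  return total
}

// Two 50ms promises created up front finish in ~50ms total, not ~100ms.
```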


Sorry, in the sense of being identical to the original code in the linked post (reduce), not the comment.


I suppose this might be useful in situations where you are querying an API(s) with multiple requests and some will certainly return seconds before others.

This way you could have the same reducer handle the results and begin updating the UI as the results come in.

An example real-world app might be a price comparison tool or social media aggregator.


> I suppose this might be useful in situations where you are querying an API(s) with multiple requests and some will certainly return seconds before others.

But it's still a serialized operation, so the parallelism is still limited. What's really needed is a "parallel reduce" using something like C's select function, which would reduce in an arbitrary order using whichever promises are ready at any given step.
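
A sketch of such a "parallel reduce", using Promise.race where C would use select (the `raceReduce` name is hypothetical): at each step the reducer consumes whichever pending promise settles first, so reduction happens in completion order rather than array order.

```javascript
// Hypothetical "parallel reduce": folds results in whatever order the
// promises settle, using Promise.race over the still-pending set.
async function raceReduce(promises, reducer, initialValue) {
  // Tag each promise with a key so the settled one can be removed.
  const pending = new Map(promises.map((p, i) =>
    [i, Promise.resolve(p).then(value => [i, value])]))
  let acc = initialValue
  while (pending.size > 0) {
    const [i, value] = await Promise.race(pending.values())
    pending.delete(i)
    acc = await reducer(acc, value)
  }
  return acc
}
```

With three requests resolving at, say, 10ms, 20ms and 30ms, the reducer sees them in that completion order regardless of their positions in the array.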


A nice use case for sure. Seems possible with some kind of iterator/generator wrapper rather than the mess in the OP however.


The example isn't using Promise.reduce.


> Pointless abstraction is one of my pet peeves in this industry. This looks like it has gone from a fairly straightforward, if kludgy, piece of code to something far more complex. Why not just: [code]

Your example code works just fine for promises, of course, but not all monads support a coalescing operation like Promise.all.

So even though this article only discusses folding over Promises, the core idea can be generalised to any monad (such as Promise, Result, Option, or anything else).


> not all monads support a coalescing operation like Promise.all.

Actually, they do. Haskell calls it sequence :: (Traversable t, Monad m) => t (m a) -> m (t a) [1]

It works by consuming the structure outside the monad and rebuilding it inside. A possible implementation specialized for lists is

  sequence [] = return []
  sequence (h:t) = do
    h' <- h
    t' <- sequence t
    return (h':t')
[1] http://hackage.haskell.org/package/base-4.10.0.0/docs/Prelud...
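
The same definition transcribed to promises, using only `then` (the promise analogue of bind) and no Promise.all — a sketch, with `sequence` as a local helper name:

```javascript
// sequence : Array<Promise<a>> -> Promise<Array<a>>, built from `then` alone.
function sequence(promises) {
  return promises.reduce(
    (accPromise, p) =>
      accPromise.then(acc => p.then(value => [...acc, value])),
    Promise.resolve([]))
}
```

This is just the fold form of the recursive Haskell definition above.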


Sorry, I should've been more clear. You're right - you can absolutely build sequence out of the bind operation for any monad.

Promise.all is not just sequence, though; there are some additional subtleties to it. In particular, the fail-fast behaviour:

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...

That's the kind of fundamental coalescing operation that you cannot implement with bind on plain monads.
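
The difference shows up in timings (a sketch; `delay`, `failAfter`, and this bind-only `sequence` are hypothetical helpers): `Promise.all` rejects as soon as any input rejects, while a bind-based sequence only observes a rejection once it reaches that element in order.

```javascript
// Hypothetical helpers for the timing comparison.
const delay = (ms, value) =>
  new Promise(resolve => setTimeout(() => resolve(value), ms))
const failAfter = (ms, message) =>
  new Promise((_, reject) => setTimeout(() => reject(new Error(message)), ms))

// sequence built only from `then` (bind), as in the Haskell version.
function sequence(promises) {
  return promises.reduce(
    (accPromise, p) =>
      accPromise.then(acc => p.then(value => [...acc, value])),
    Promise.resolve([]))
}

// Measures how long a promise takes to reject.
async function timeToReject(promise) {
  const start = Date.now()
  try { await promise } catch (_) { /* rejection expected */ }
  return Date.now() - start
}

async function demo() {
  const make = () => {
    const slow = delay(100, 'ok')
    const fast = failAfter(10, 'boom')
    // No-op handler so the early rejection does not crash Node while
    // the bind-based sequence is still waiting on `slow`.
    fast.catch(() => {})
    return [slow, fast]
  }
  const allMs = await timeToReject(Promise.all(make())) // rejects at ~10ms
  const seqMs = await timeToReject(sequence(make()))    // rejects at ~100ms
  return { allMs, seqMs }
}
```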



