
Folding Promises in JavaScript - anon335dtzbvc
https://www.codementor.io/vladimirgorej/folding-promises-in-javascript-b1l7mwh93
======
adamjc
>How can we make it better ? Let's start by removing the requirement for
identity value to always be the promise.

I challenge the view that allowing the identity value to be something other
than a Promise is 'making it better'. Pointless abstraction is one of my
pet peeves in this industry. This looks like it has gone from a fairly
straightforward, if kludgy, piece of code to something far more complex. Why
not just:

    
    
      const listOfPromises = [...]
    
      const result = Promise.all(listOfPromises).then(results => {
        return results.reduce((acc, next) => acc + next)
      })
    

?

~~~
megawatthours
Same reason given in the bluebird library documentation:

> Promise.reduce will start calling the reducer as soon as possible, this is
> why you might want to use it over Promise.all (which awaits for the entire
> array before you can call Array#reduce on it).

Whether this is ever necessary is another matter :)
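For illustration, a minimal sketch of an eager promise fold along those lines (`reduceAsap` is a hypothetical helper, not Bluebird's actual implementation):

```javascript
// Start folding as each promise settles, in order, instead of
// awaiting the whole array first and then reducing plain values.
const reduceAsap = (promises, reducer, initial) =>
  promises.reduce(
    (accP, nextP) =>
      accP.then(acc => Promise.resolve(nextP).then(next => reducer(acc, next))),
    Promise.resolve(initial)
  )
```

So `reduceAsap([p1, p2, p3], (a, b) => a + b, 0)` resolves to the sum without ever materializing the full array of results.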

~~~
codefined

        // inside an async function:
        let accumulator = 0
        for (const item of array) {
          const value = await item
          accumulator += value
        }
    

This is identical; it doesn't use 'cool' reduce features, but it is much
easier to read in my opinion.

~~~
todd3834
Wouldn’t this code only execute 1 promise at a time? I thought Promise.all
allowed promises to be resolved in parallel

~~~
theprotocol
Indeed. You most likely should do `await Promise.all` and then do the
reduction.
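A minimal sketch of that approach, assuming the goal is summing as in the article (`sumPromises` is a hypothetical name):

```javascript
// Resolve all the promises in parallel, then reduce the plain values.
const sumPromises = async (listOfPromises) => {
  const values = await Promise.all(listOfPromises)
  return values.reduce((acc, next) => acc + next, 0)
}
```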

------
noelwelsh
I don't think this is very well written. It doesn't start with any motivating
problem, it introduces terms (functor) without defining them, and a lot of
what is discussed doesn't apply to solving the problem.

------
fair24
> Folding Promises in JavaScript

Or: how to make simple things complex and make a codebase a complete puzzle
for those who come after you?

~~~
Androider
I feel nothing has improved code readability like the recent mainstreaming of
map/filter/fold/reduce and "const all the things". This type of code is so
easy to follow, reason about and trivial to debug at every step, once you
internalize the few primitive functions.

I don't think you need to necessarily memorize these transformation names, but
writing these types of functions is all I seem to be doing these days,
transforming one thing into another line for line.

~~~
jacobr
I feel the code I wrote while on this bandwagon is the hardest for others,
and even for me, to understand today.

    
    
        pullAllBy(pluck(things, 'bar').map(compose(xor, lol, rofl)).reduce(differenceWith('id')))
    

Just write your transformations inline and go work on the next feature.

~~~
splintercell
I feel like the functional code you wrote was written to intentionally
obfuscate what is being done. For instance, composing a bunch of methods
inline doesn't have any utility for it unless you define what compose(xor,
lol, rofl) really means.

Ideally this is how it should be written for maximum readability.

    
    
      things
      |> pluck('bar')
      |> map(xor)
      |> map(lol)
      |> map(rofl)
      |> reduce(differenceWith('id'))
    
    

The code reads equally well without the pipe operator, but the proposal to
introduce it is in the works [1].

Here it is using lodash (not even lodash-fp), and this is going to do a single
for loop when executing because this is lazy.

    
    
      _.chain(things)
       .pluck('bar')
       .map(_.xor)
       .map(_.lol)
       .map(_.rofl)
       .reduce(_.differenceWith('id'))
       .value();
    

It's no less readable than the code you'd write using unfolded
transformations.

1\. [https://github.com/tc39/proposal-pipeline-operator](https://github.com/tc39/proposal-pipeline-operator)

~~~
gaastonsr
I agree, but you shouldn't map 3 times when you can map once over the data.

    
    
       things
        |> pluck('bar')
        |> map(compose(rofl, lol, xor))
    |> reduce(differenceWith('id'))

~~~
splintercell
I made my case against compose in the post.

Composing xor, lol and rofl isn't any better (especially in terms of
readability) than the individual maps.

What would be better is this:

    
    
      const makeHilarious = compose(rofl, lol, xor);
      
      things
        |> pluck('bar')
        |> map(makeHilarious)
    |> reduce(differenceWith('id'))

~~~
jacobr
This means that to change this code you first have to look for the
makeHilarious definition, then the definitions of rofl, lol and xor, then
figure out what they all do separately and together, and if you can change
them without breaking anything else in your application.

    
    
        things
          .map(thing => thing.bar)
          .map(thing => {
            // whatever happens in rofl
            // whatever happens in lol
          })
          .reduce((acc, thing) => {
            // more stuff
          }, {})
    

This is code that is easy to understand and safe to change. The maps could be
combined into one function body if it's convenient.

~~~
splintercell
I anticipated that makeHilarious is a method used more than once; even if it
is not, once you look at its definition you understand the intent behind "xor
-> lol -> rofl".

It is one thing to quickly be able to understand that the person is doing an
xor, then a lol and then a rofl on each element of an array, and a whole
other thing to understand what the combination of these three actions over
an array means. The Python school of "code is read more than it's written"
heavily stresses how easily the code should be understood the first time
someone reads it, but not whether it's easy to reason about.

The beauty of declarative-style programming isn't more readable code
immediately, but rather how easy it is to understand and reason about the
code once you have internalized the vocabulary.

For instance, imagine reading a novel which is written like this:

"After Jack was done from the place where he went to do things for money
everyday, he entered an establishment which served drinks that get you
inebriated for money. This establishment was one he frequented regularly and
preferred it over the others. He asked the man behind the counter for a wheat
fermented brewed drink. After putting the drink to his lips and pouring it in
to his mouth, he felt a sense of calmness enter his mind. It pushed all the
thoughts which occupied his mind away, as he earlier desired before entering
this establishment."

As opposed to:

"Jack really needed a drink after hard day at work. He went to his favorite
pub, and ordered his favorite beer. After finishing the pint, he finally felt
relaxed."

The Python philosophy (which permeates the imperative world) is to describe
everything in the simplest possible terms, just in case there are people who
may not understand what work, pub, beer, bartender, and relaxed mean. But
this just gets in the way of understanding the actual purpose of the code.

This is at least the basic philosophy behind not using for loops everywhere.

------
CapacitorSet
I can't quite understand the difference between endomorphism ("input and
output of the transformer must be from the same category") and homomorphism
("structure preserving transformation. We always stay in the same category").
Can someone help?

~~~
lsjroberts
I believe homomorphism is a subset of endomorphism.

So a function that turns an array into another array of different length would
be endomorphic (since it maintains the same type), but not homomorphic since
it has a different structure (a different set of keys).

~~~
catnaroek
The other way around. A homomorphism is a structure-preserving map between two
arbitrary objects, whereas an endomorphism is a homomorphism where the source
and target objects coincide.
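For a concrete illustration (hypothetical examples, not from the article):

```javascript
// Homomorphism: a structure-preserving map between two different objects,
// e.g. Array -> Number via length, which turns concatenation into addition:
const len = xs => xs.length
// len(xs.concat(ys)) === len(xs) + len(ys)

// Endomorphism: a map whose source and target coincide, e.g. Number -> Number:
const double = n => n * 2
```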

------
molf
With async/await this can become:

    
    
        const reduceP = async (fn, identity, listP) => {
          const values = await Promise.all(listP)
          return values.reduce(fn, identity)
        }
    

The whole thing feels like a synthetic and overcomplicated example, though. In
practice I'm sure I'd just write:

    
    
        let total = 0
        while (listP.length > 0) {
          total += await listP.pop()
        }

~~~
megawatthours
That code does the same thing as
[https://news.ycombinator.com/item?id=15302465](https://news.ycombinator.com/item?id=15302465)
but not the same thing as the code in the article.

------
egeozcan
I don't know much about these concepts, but isn't `const objToArray = ({ a })
=> [a]` losing data, namely the key of the value in the object? I'm
asking because it says that "Isomorphism is a pair of transformations between
two categories with no data loss".

In any case, this is very helpful, thanks for writing/sharing.

~~~
paavohtl
It's a pair of transformations between [A] and { a: A }, not between arbitrary
arrays and objects.

As long as you know what the transformation is, you can convert between them
without data loss.
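Spelled out, the pair looks like this (`arrayToObj` is a hypothetical name for the inverse):

```javascript
// An isomorphism pair between { a: A } and [A]:
const objToArray = ({ a }) => [a]
const arrayToObj = ([a]) => ({ a })

// Round-tripping loses no data:
// arrayToObj(objToArray({ a: 42 })) gives back { a: 42 }
```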

------
fortythirteen
"Programs must be written for people to read, and only incidentally for
machines to execute." - Harold Abelson

------
porlune
The author mentions the library Bluebird, which I think is a fantastic
library. The 'mapSeries' method it offers is also very useful when iterating
over an array of values that need to be 'promisified' and mapped in the given
order. You can even set 'concurrency' as an option, which puts a limit on the
concurrent promises that can run (great for reducing API load).
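A plain-promise sketch of what a series-mapping helper does (an assumed re-implementation for illustration, not Bluebird's actual code):

```javascript
// Apply an async mapper to each item strictly one at a time, in order.
const mapSeries = async (items, mapper) => {
  const results = []
  for (const item of items) {
    results.push(await mapper(item)) // wait before starting the next one
  }
  return results
}
```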

------
chajath
I've written a JavaScript library for folding and mapping recurring
promises (i.e. promises that resolve to a value, part of which points to the
"next" promise):

[https://github.com/google/chained-promise](https://github.com/google/chained-promise)
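For illustration, a minimal sketch of folding such a recurring promise (the `{ value, next }` shape is an assumption here, not necessarily the library's API):

```javascript
// Fold a chain where each resolved value may point at the "next" promise.
const foldChain = async (promise, reducer, acc) => {
  let current = promise
  while (current) {
    const { value, next } = await current
    acc = reducer(acc, value)
    current = next // null/undefined ends the chain
  }
  return acc
}
```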

------
minitech
With async (it’s just monads!):

    
    
      listOfPromises.reduce(
        async (m, n) => await m + await n,
        0,
      )

