What is the best way to limit concurrency when using ES6's Promise.all()?

Frontend · open · 17 answers · 757 views
执念已碎 · 2020-11-29 21:28

I have some code that is iterating over a list that was queried out of a database and making an HTTP request for each element in that list. That list can sometimes be a reasonably large number (in the thousands), and I would like to make sure I am not hitting a web server with thousands of concurrent HTTP requests.

17 Answers
  •  春和景丽
    2020-11-29 21:46

    If you know how iterators work and how they are consumed, you won't need any extra library, since it is very easy to build your own concurrency control. Let me demonstrate:

    /* [Symbol.iterator]() is equivalent to .values()
    const iterator = [1,2,3][Symbol.iterator]() */
    const iterator = [1,2,3].values()
    
    
    // loop over all items with for..of
    for (const x of iterator) {
      console.log('x:', x)
      
      // notice how this inner loop continues the same iterator
      // and consumes the rest of it, so the outer loop
      // never logs another x
      for (const y of iterator) {
        console.log('y:', y)
      }
    }

    We can share that same iterator across several workers: each call to .next() hands out the next item exactly once, so no item is ever processed twice.

    If you use .entries() instead of .values(), you get an iterator of [index, value] pairs, which I will demonstrate below with a concurrency of 2:

    const sleep = t => new Promise(rs => setTimeout(rs, t))
    
    async function doWork(iterator) {
      for (let [index, item] of iterator) {
        await sleep(1000)
        console.log(index + ': ' + item)
      }
    }
    
    const iterator = Array.from('abcdefghij').entries()
    const workers = new Array(2).fill(iterator).map(doWork)
    //    ^--- starts two workers sharing the same iterator
    
    Promise.allSettled(workers).then(() => console.log('done'))

    The benefit of this approach is that you can use a generator function as the source instead of having everything materialized up front.
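
    To illustrate that point, here is a minimal sketch using a generator as the work source. The pages() generator and its URLs are made up for illustration; the worker pattern is the same one as above, just renamed to avoid clashing with the earlier snippet:

    ```javascript
    // A generator yields work items lazily instead of building the
    // whole list up front; each worker pulls the next value only
    // when it is ready for more work.
    function* pages() {
      for (let i = 1; i <= 6; i++) {
        yield `/api/page/${i}` // hypothetical URL
      }
    }

    const pause = t => new Promise(rs => setTimeout(rs, t))

    async function fetchPage(iterator, results) {
      for (const url of iterator) {
        await pause(100) // stand-in for an actual HTTP request
        results.push(url)
      }
    }

    const source = pages()
    const results = []
    // two workers draining the same generator concurrently
    const pageWorkers = new Array(2).fill(source).map(it => fetchPage(it, results))

    Promise.allSettled(pageWorkers).then(() =>
      console.log(`fetched ${results.length} pages`))
    ```

    Because both workers pull from the same generator object, the items are handed out on demand and no page is fetched twice, regardless of which worker is faster.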


    Note: the difference between this and, for example, async-pool is that this spawns two independent workers. If one worker throws at, say, index 5, it won't stop the other worker from finishing the remaining items; you simply drop from a concurrency of 2 to 1 instead of stopping entirely. So my advice is to catch all errors inside the doWork function.
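
    A minimal sketch of that advice, with an artificial failure injected so one item throws (the failing item and the delay are made up for illustration):

    ```javascript
    const delay = t => new Promise(rs => setTimeout(rs, t))

    // Wrapping the loop body in try/catch keeps a worker alive when
    // one item fails, so concurrency stays at 2 instead of degrading.
    async function doSafeWork(iterator, errors) {
      for (const [index, item] of iterator) {
        try {
          await delay(50)
          if (item === 'c') throw new Error('failed on ' + item) // simulated failure
          console.log(index + ': ' + item)
        } catch (err) {
          errors.push(err) // record the error and keep consuming items
        }
      }
    }

    const shared = Array.from('abcde').entries()
    const errors = []
    const safeWorkers = new Array(2).fill(shared).map(it => doSafeWork(it, errors))

    Promise.allSettled(safeWorkers).then(() =>
      console.log(`done with ${errors.length} error(s)`))
    ```

    Because the rejection never escapes doSafeWork, both worker promises resolve and every non-failing item is still processed.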
