
I have some code that is iterating over a list that was queried out of a database and making an HTTP request for each element in that list. That list can sometimes be a reasonably large number (in the thousands), and I would like to make sure I am not hitting a web server with thousands of concurrent HTTP requests.

An abbreviated version of this code currently looks something like this...

function getCounts() {
  return users.map(user => {
    return new Promise(resolve => {
      remoteServer.getCount(user) // makes an HTTP request
      .then(() => {
        /* snip */
        resolve();
      });
    });
  });
}

Promise.all(getCounts()).then(() => { /* snip */});

This code is running on Node 4.3.2. To reiterate, can Promise.all be managed so that only a certain number of Promises are in progress at any given time?


32 Answers

169

P-Limit

I have compared promise concurrency limiting with a custom script, Bluebird, es6-promise-pool, and p-limit. I believe that p-limit has the simplest, most stripped-down implementation for this need. See their documentation.

Requirements

A runtime that supports async/await, so the example below can be used as written.

My Example

In this example, we need to run a function for every URL in the array (for example, an API request); here it is called fetchData(). If we had an array of thousands of items to process, limiting concurrency would definitely be useful to save CPU and memory resources.

const pLimit = require('p-limit');

// Example: concurrency of 3 promises at once
const limit = pLimit(3);

let urls = [
    "http://www.exampleone.com/",
    "http://www.exampletwo.com/",
    "http://www.examplethree.com/",
    "http://www.examplefour.com/",
]

// Create an array of our promises using map (fetchData() returns a promise)
let promises = urls.map(url => {

    // wrap the function we are calling in the limit function we defined above
    return limit(() => fetchData(url));
});

(async () => {
    // Only three promises are run at once (as defined above)
    const result = await Promise.all(promises);
    console.log(result);
})();

The logged result is an array of your resolved promises' response data.
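For completeness, fetchData() is never defined above; it stands in for any promise-returning function. A minimal hypothetical stand-in (assuming Node 18+ with a global fetch, or any other HTTP client) might be:

// hypothetical stand-in: any function that returns a promise works here
async function fetchData(url) {
    const response = await fetch(url); // assumes a global fetch (Node 18+)
    return response.text();
}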

7
  • 8
    This was by far the best library I've seen for limiting simultaneous requests. And great example, thanks! Commented Apr 17, 2019 at 16:32
  • 4
    Thanks for doing the comparison. Have you compared against github.com/rxaviers/async-pool?
    – ahong
    Commented May 31, 2019 at 9:41
  • 2
Besides, this one has about 20M weekly downloads on npm vs. about 100-200k for the other libs mentioned in other answers.
    – vir us
    Commented Jan 2, 2021 at 15:21
  • 1
    @RobHinchliff you cannot use await in a function that is not async.
    – Adrian Pop
    Commented Mar 9, 2023 at 13:27
  • 5
    @AndyRay sir this answer is 5 years old and p-limit has 10x more use. What maturity are you worried about Commented Nov 20, 2023 at 3:22
136

Using Array.prototype.splice

while (funcs.length) {
  // 100 at a time
  await Promise.all( funcs.splice(0, 100).map(f => f()) )
}
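As a more complete sketch of the same idea (the task functions here are made up, just to show the batching shape inside an async context):

const sleep = ms => new Promise(r => setTimeout(r, ms));
// hypothetical tasks: 1000 functions that each return a promise
const funcs = Array.from({ length: 1000 }, (_, i) => () => sleep(10).then(() => i));

(async () => {
  const results = [];
  while (funcs.length) {
    // run up to 100 tasks, wait for the whole batch, then take the next 100
    results.push(...await Promise.all(funcs.splice(0, 100).map(f => f())));
  }
  console.log(results.length); // 1000
})();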
8
  • 83
This runs functions in batches instead of as a pool, where one function would be called immediately when another finishes.
    – cltsang
    Commented Apr 16, 2020 at 4:35
  • 4
It took a second to grasp what it's doing given the lack of context around it, e.g. that it is a batch instead of a pool. You are reordering the array every time you splice from the beginning or the middle (the engine has to reindex all items). A theoretically better-performing alternative is to take items from the end instead, arr.splice(-100); if the order does not matter, maybe you can reverse the array :P
    – Endless
    Commented Jun 26, 2020 at 22:16
  • 9
    Very useful for running in batches. Note: the next batch will not start until the current batch 100% complete. Commented Jul 25, 2020 at 1:12
  • 2
    Very cool. However, it only batches, which means it won't push a new promise when another promise resolves.
    – adi518
    Commented Jun 15, 2021 at 22:45
  • Is this blocking the event loop?
    – Álvaro
    Commented Jun 21, 2021 at 21:03
91

If you know how iterators work and how they are consumed, you wouldn't need any extra library, since it becomes very easy to build your own concurrency limiter yourself. Let me demonstrate:

/* [Symbol.iterator]() is equivalent to .values()
const iterator = [1,2,3][Symbol.iterator]() */
const iterator = [1,2,3].values()


// loop over all items with for..of
for (const x of iterator) {
  console.log('x:', x)
  
  // notice how this loop continues the same iterator
  // and consumes the rest of it, so the
  // outer loop doesn't log any more x's
  for (const y of iterator) {
    console.log('y:', y)
  }
}

We can use the same iterator and share it across workers.

If you had used .entries() instead of .values(), you would have gotten an iterator that yields [index, value], which I will demonstrate below with a concurrency of 2:

const sleep = t => new Promise(rs => setTimeout(rs, t))
const iterator = Array.from('abcdefghij').entries()
// const results = [] || Array(someLength)

async function doWork (iterator, i) {
  for (let [index, item] of iterator) {
    await sleep(1000)
    console.log(`Worker#${i}: ${index},${item}`)

    // in case you need to store the results in order
    // results[index] = item + item

    // or if the order does not matter
    // results.push(item + item)
  }
}

const workers = Array(2).fill(iterator).map(doWork)
//    ^--- starts two workers sharing the same iterator

Promise.allSettled(workers).then(console.log.bind(null, 'done'))

The benefit of this is that you can have a generator function instead of having everything ready at once.

What's even more awesome is that you can do stream.Readable.from(iterator) in Node (and eventually in WHATWG streams as well). With transferable ReadableStream, this has the potential to be very useful in the future if you are also working with web workers for performance.


Note: the difference between this and the example async-pool is that this spawns two workers, so if one worker throws an error at, say, index 5, it won't stop the other worker from doing the rest. You just go from a concurrency of 2 down to 1 (it won't stop everything). So my advice is to catch all errors inside the doWork function.
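For example, a hedged variant of doWork that catches per-item errors, so one failed item doesn't kill its worker, could look like this:

async function doWork (iterator, i) {
  for (const [index, item] of iterator) {
    try {
      await sleep(1000) // your real async task goes here
      console.log(`Worker#${i}: ${index},${item}`)
    } catch (err) {
      // log and move on, so this worker keeps pulling items from the shared iterator
      console.error(`Worker#${i} failed on item ${index}:`, err)
    }
  }
}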

10
  • 2
    This is definitely a cool approach! Just make sure your concurrency does not exceed the length of your task list (if you care about results anyway) as you may end up with extras! Commented Oct 2, 2020 at 5:31
Something that might be cooler later is when streams get Readable.from(iterator) support. Chrome has already made streams transferable, so you could create readable streams and send them off to web workers, and all of them would end up using the same underlying iterator.
    – Endless
    Commented Oct 2, 2020 at 9:37
  • It definitely should be published as an NPM module and I'd like to use it.
    – Huan
    Commented Aug 17, 2021 at 4:59
  • @KrisOye I don't reproduce an issue when concurrency exceeds the length of the task list. I just tried running the code snippet with 20 workers (new Array(20)); it ran to completion as desired, with no extras. (The extra workers finished instantly, as the iterator was done before they launched.) Commented Sep 15, 2021 at 0:32
  • 3
    I love this approach—the key insight is that if multiple async functions iterate over the same iterator with for...of, they cooperatively work through the iterator's elements seamlessly, yielding compact and high-performance code. But the example here I think is quite obfuscated, so I rewrote it significantly to highlight this key insight: gist.github.com/fasiha/7f20043a12ce93401d8473aee037d90a (runs nicely in TypeScript Playground). Commented Sep 22, 2022 at 2:13
79

Note that Promise.all() doesn't trigger the promises to start their work, creating the promise itself does.

With that in mind, one solution would be to check whenever a promise is resolved whether a new promise should be started or whether you're already at the limit.

However, there is really no need to reinvent the wheel here. One library that you could use for this purpose is es6-promise-pool. From their examples:

var PromisePool = require('es6-promise-pool')
 
var promiseProducer = function () {
  // Your code goes here. 
  // If there is work left to be done, return the next work item as a promise. 
  // Otherwise, return null to indicate that all promises have been created. 
  // Scroll down for an example. 
}
 
// The number of promises to process simultaneously. 
var concurrency = 3
 
// Create a pool. 
var pool = new PromisePool(promiseProducer, concurrency)
 
// Start the pool. 
var poolPromise = pool.start()
 
// Wait for the pool to settle. 
poolPromise.then(function () {
  console.log('All promises fulfilled')
}, function (error) {
  console.log('Some promise rejected: ' + error.message)
})
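For the question's use case, a hedged sketch of what promiseProducer could look like (users and remoteServer.getCount come from the question):

var i = 0
var promiseProducer = function () {
  if (i >= users.length) {
    return null // no more work; the pool settles once in-flight promises finish
  }
  var user = users[i++]
  return remoteServer.getCount(user) // the next work item, as a promise
}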
6
  • 42
    It's unfortunate that es6-promise-pool reinvents Promise instead of using them. I suggest this concise solution instead (if you're using ES6 or ES7 already) github.com/rxaviers/async-pool Commented May 21, 2018 at 21:02
  • 3
Took a look at both; async-pool looks way better! More straightforward and more lightweight.
    – Endless
    Commented Jun 25, 2018 at 10:00
  • 3
    I have also found p-limit to be the most simple implementation. See my example below. stackoverflow.com/a/52262024/8177355 Commented Sep 10, 2018 at 16:35
  • 2
I think tiny-async-pool is a far better, non-intrusive, and rather natural solution for limiting concurrency of promises.
  • 1
    @RafaelXavier since version 2.0.0 es6-promise-pool does not depend on ES6-Promise anymore.
    – Boris
    Commented Mar 31, 2021 at 16:12
25

Instead of using promises for limiting HTTP requests, use Node's built-in http.Agent.maxSockets. This removes the requirement of using a library or writing your own pooling code, and has the added advantage of more control over what you're limiting.

agent.maxSockets

By default set to Infinity. Determines how many concurrent sockets the agent can have open per origin. Origin is either a 'host:port' or 'host:port:localAddress' combination.

For example:

var http = require('http');
var agent = new http.Agent({maxSockets: 5}); // 5 concurrent connections per origin
var request = http.request({..., agent: agent}, ...);

If making multiple requests to the same origin, it might also benefit you to set keepAlive to true (see docs above for more info).
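For instance, a minimal sketch (host and paths are made up) where every request shares the same agent, so at most 5 sockets per origin are open at once and the rest are queued by Node:

var http = require('http');
var agent = new http.Agent({ maxSockets: 5, keepAlive: true });

for (var i = 0; i < 1000; i++) {
  // all requests share the agent, so only 5 run concurrently per origin
  http.get({ host: 'example.com', path: '/count/' + i, agent: agent }, function (res) {
    res.resume(); // drain the response so the socket can be reused
  });
}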

2
  • 18
    Still, creating thousands of closures immediately and pooling sockets doesn't appear to be very efficient?
    – Bergi
    Commented Nov 16, 2016 at 18:42
  • 1
    @Bergi also, I'd wager this will increase the chances of timeouts at various levels (starting at the level of the HTTP client itself). [still upvoted because I think it's an interesting approach to point out at least] Commented Nov 23, 2023 at 15:07
23

As all others in this answer thread have pointed out, Promise.all() won't do the right thing if you need to limit concurrency. But ideally you shouldn't even want to wait until all of the Promises are done before processing them.

Instead, you want to process each result as soon as it becomes available, so you don't have to wait for the very last promise to finish before you start iterating over them.

So, here's a code sample that does just that, based partly on the answer by Endless and also on this answer by T.J. Crowder.

EDIT: I've turned this little snippet into a library, concurrency-limit-runner.

// example tasks that sleep and return a number
// in real life, you'd probably fetch URLs or something
const tasks = [];
for (let i = 0; i < 20; i++) {
    tasks.push(async () => {
        console.log(`start ${i}`);
        await sleep(Math.random() * 1000);
        console.log(`end ${i}`);
        return i;
    });
}
function sleep(ms) { return new Promise(r => setTimeout(r, ms)); }

(async () => {
    for await (let value of runTasks(3, tasks.values())) {
        console.log(`output ${value}`);
    }
})();

async function* runTasks(maxConcurrency, taskIterator) {
    async function* createWorkerIterator() {
        // Each AsyncGenerator that this function* creates is a worker,
        // polling for tasks from the shared taskIterator. Sharing the
        // taskIterator ensures that each worker gets unique tasks.
        for (const task of taskIterator) yield await task();
    }

    const asyncIterators = new Array(maxConcurrency);
    for (let i = 0; i < maxConcurrency; i++) {
        asyncIterators[i] = createWorkerIterator();
    }
    yield* raceAsyncIterators(asyncIterators);
}

async function* raceAsyncIterators(asyncIterators) {
    async function nextResultWithItsIterator(iterator) {
        return { result: await iterator.next(), iterator: iterator };
    }
    /** @type Map<AsyncIterator<T>,
        Promise<{result: IteratorResult<T>, iterator: AsyncIterator<T>}>> */
    const promises = new Map();
    for (const iterator of asyncIterators) {
        promises.set(iterator, nextResultWithItsIterator(iterator));
    }
    while (promises.size) {
        const { result, iterator } = await Promise.race(promises.values());
        if (result.done) {
            promises.delete(iterator);
        } else {
            promises.set(iterator, nextResultWithItsIterator(iterator));
            yield result.value;
        }
    }
}

There's a lot of magic in here; let me explain.

This solution is built around async generator functions, which many JS developers may not be familiar with.

A generator function (aka function* function) returns a "generator," an iterator of results. Generator functions are allowed to use the yield keyword where you might have normally used a return keyword. The first time a caller calls next() on the generator (or uses a for...of loop), the function* function runs until it yields a value; that becomes the next() value of the iterator. But the subsequent time next() is called, the generator function resumes from the yield statement, right where it left off, even if it's in the middle of a loop. (You can also yield*, to yield all of the results of another generator function.)

An "async generator function" (async function*) is a generator function that returns an "async iterator," which is an iterator of promises. You can call for await...of on an async iterator. Async generator functions can use the await keyword, as you might do in any async function.

In the example, we call runTasks() with an array of task functions; we call .values() on the array to convert the array into an iterator.

runTasks() is an async generator function, so we can call it with a for await...of loop. Each time the loop runs, we'll process the result of the latest completed task.

runTasks() creates N async iterators, the "workers." Each worker polls for tasks from the shared taskIterator, ensuring that each worker gets a unique task.

The example calls runTasks with 3 concurrent workers, so no more than 3 tasks are launched at the same time. When any task completes, we immediately queue up the next task. (This is superior to "batching", where you do 3 tasks at once, await all three of them, and don't start the next batch of three until the entire previous batch has finished.)

runTasks() concludes by "racing" its async iterators with yield* raceAsyncIterators(). raceAsyncIterators() is like Promise.race() but it races N iterators of Promises instead of just N Promises; it returns an async iterator that yields the results of resolved Promises.

raceAsyncIterators() starts by defining a promises Map from each of the iterators to promises. Each promise is a promise for an iteration result along with the iterator that generated it.

With the promises map, we can Promise.race() the values of the map, giving us the winning iteration result and its iterator. If the iterator is completely done, we remove it from the map; otherwise we replace its Promise in the promises map with the iterator's next() Promise and yield result.value.

In conclusion, runTasks() is an async generator function that yields the results of racing N concurrent async iterators of tasks, so the end user can just for await (let value of runTasks(3, tasks.values())) to process each result as soon as it becomes available.

3
  • 2
I managed to use this function to run a promise pool of max items and it performs exceptionally well, plus it has low overhead compared to solutions like @supercharge/promise-pool. This answer deserves more votes.
    – bmaggi
    Commented Dec 5, 2021 at 20:23
  • Any idea how it would be possible to add a functionality to remove items from the tasks while the iterator is running? Would that be even possible, given that tasks.values() which creates the initial iterator of tasks is called at the start and won't be changeable once the iterator runs?
    – dennis
    Commented Sep 9, 2024 at 11:49
  • Rather than "removing" tasks from the list, I recommend writing code in the task to allow the task to be "skipped." So, at the first line of the task, it would check some other data structure to see if it should skip itself; if it does, it would return early. Commented Sep 9, 2024 at 15:09
21

bluebird's Promise.map can take a concurrency option to control how many promises should be running in parallel. Sometimes it is easier than .all because you don't need to create the promise array.

const Promise = require('bluebird')

function getCounts() {
  return Promise.map(users, user => {
    return new Promise(resolve => {
      remoteServer.getCount(user) // makes an HTTP request
      .then(() => {
        /* snip */
        resolve();
       });
    });
  }, {concurrency: 10}); // <---- at most 10 http requests at a time
}
2
  • 1
Bluebird is great if you need faster promises, and ~18kb of extra junk if you only use it for one thing ;)
    – Endless
    Commented Jun 25, 2018 at 10:03
  • 4
It all depends on how important the one thing is for you and whether there is another faster/easier/better way. A typical trade-off. I will choose ease of use and function over a few kb, but YMMV.
9

Unfortunately there is no way to do it with native Promise.all, so you have to be creative.

This is the quickest, most concise way I could find without using any outside libraries.

It makes use of a JavaScript feature called iterators. The iterator basically keeps track of which items have been processed and which haven't.

In order to use it in code, you create an array of async functions. Each async function asks the same iterator for the next item that needs to be processed. Each function processes its own item asynchronously, and when done asks the iterator for a new one. Once the iterator runs out of items, all the functions complete.

Thanks to @Endless for inspiration.

const items = [
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2',
  'https://httpbin.org/bytes/2'
]

// get a cursor that keeps track of what items have already been processed.
let cursor = items.entries();

// create 5 for loops that each run off the same cursor which keeps track of location
let numWorkers = 5;
Array(numWorkers).fill().forEach(async () => {
    for (let [index, url] of cursor){
        console.log('getting url is ', index, url)
        // run your async task instead of this next line
        var text = await fetch(url).then(res => res.text())
        console.log('text is', text.slice(0, 20))
    }
})

8
  • Curious as to why this got marked down. It is very similar to what I came up with. Commented Oct 3, 2020 at 18:48
  • 1
    Thank you for offering an answer that doesn't immediately rely on some unnecessary, bloated library. Commented Dec 6, 2022 at 22:45
  • 1
    @Alex take a look at the code again. Note that I am doing an async forEach on an array of 5 items - this represents worker count. In the loop, you'll notice that there is an asynchronous fetch, so the forEach has to be async in order to allow that. The code is confusing, but the async is so that each of the workers can do async tasks. The "magic" is that each for loop is iterating over the same cursor! But that's how iterators work, different loops can iterate the same cursor "at the same time"-obviously not really but close enough with the event loop (if you don't know what that is google) Commented Aug 24, 2023 at 12:40
  • 1
    @user3413723 Thanks and sorry, looks like I've misread the code and was incorrect on both accounts. Indeed the code should work for limited concurrency - I missed the fact that the limited concurrency is powered by iterators. It's very clever - TIL.
    – snobb
    Commented Aug 25, 2023 at 14:56
  • 1
    @Alex I actually spent a lot of time trying to understand how this works after I read Endless answer. Live and learn! Thanks for responding, I always like to hear what people think about my comments! :) Commented Aug 25, 2023 at 19:53
4

I suggest the library async-pool: https://github.com/rxaviers/async-pool

npm install tiny-async-pool

Description:

Run multiple promise-returning & async functions with limited concurrency using native ES6/ES7

asyncPool runs multiple promise-returning & async functions in a limited concurrency pool. It rejects immediately as soon as one of the promises rejects. It resolves when all the promises complete. It calls the iterator function as soon as possible (under the concurrency limit).

Usage:

const timeout = i => new Promise(resolve => setTimeout(() => resolve(i), i));
await asyncPool(2, [1000, 5000, 3000, 2000], timeout);
// Call iterator (i = 1000)
// Call iterator (i = 5000)
// Pool limit of 2 reached, wait for the quicker one to complete...
// 1000 finishes
// Call iterator (i = 3000)
// Pool limit of 2 reached, wait for the quicker one to complete...
// 3000 finishes
// Call iterator (i = 2000)
// Iteration is complete, wait until the running ones complete...
// 5000 finishes
// 2000 finishes
// Resolves, results are passed in given array order `[1000, 5000, 3000, 2000]`.
1
  • I need to handle rejects with Promise.allSettled() so this does not work
    – Álvaro
    Commented Oct 5, 2021 at 18:09
3

Here is my ES7 solution: a copy-paste-friendly and feature-complete Promise.all()/map() alternative, with a concurrency limit.

Similar to Promise.all(), it maintains return order, as well as a fallback for non-promise return values.

I also included a comparison of the different implementations, as it illustrates some aspects a few of the other solutions have missed.

Usage

const asyncFn = delay => new Promise(resolve => setTimeout(() => resolve(), delay));
const args = [30, 20, 15, 10];
await asyncPool(args, arg => asyncFn(arg), 4); // concurrency limit of 4

Implementation

async function asyncBatch(args, fn, limit = 8) {
  // Copy arguments to avoid side effects
  args = [...args];
  const outs = [];
  while (args.length) {
    const batch = args.splice(0, limit);
    const out = await Promise.all(batch.map(fn));
    outs.push(...out);
  }
  return outs;
}

async function asyncPool(args, fn, limit = 8) {
  return new Promise((resolve) => {
    // Copy arguments to avoid side effect, reverse queue as
    // pop is faster than shift
    const argQueue = [...args].reverse();
    let count = 0;
    const outs = [];
    const pollNext = () => {
      if (argQueue.length === 0 && count === 0) {
        resolve(outs);
      } else {
        while (count < limit && argQueue.length) {
          const index = args.length - argQueue.length;
          const arg = argQueue.pop();
          count += 1;
          const out = fn(arg);
          const processOut = (out, index) => {
            outs[index] = out;
            count -= 1;
            pollNext();
          };
          if (typeof out === 'object' && out.then) {
            out.then(out => processOut(out, index));
          } else {
            processOut(out, index);
          }
        }
      }
    };
    pollNext();
  });
}

Comparison

// A simple async function that returns after the given delay
// and prints its value to allow us to determine the response order
const asyncFn = delay => new Promise(resolve => setTimeout(() => {
  console.log(delay);
  resolve(delay);
}, delay));

// List of arguments to the asyncFn function
const args = [30, 20, 15, 10];

// As a comparison of the different implementations, a low concurrency
// limit of 2 is used in order to highlight the performance differences.
// If a limit greater than or equal to args.length is used the results
// would be identical.

// Vanilla Promise.all/map combo
const out1 = await Promise.all(args.map(arg => asyncFn(arg)));
// prints: 10, 15, 20, 30
// total time: 30ms

// Pooled implementation
const out2 = await asyncPool(args, arg => asyncFn(arg), 2);
// prints: 20, 30, 15, 10
// total time: 40ms

// Batched implementation
const out3 = await asyncBatch(args, arg => asyncFn(arg), 2);
// prints: 20, 30, 10, 15
// total time: 45ms

console.log(out1, out2, out3); // prints: [30, 20, 15, 10] x 3

// Conclusion: Execution order and performance is different,
// but return order is still identical

Conclusion

asyncPool() should be the best solution as it allows new requests to start as soon as any previous one finishes.

asyncBatch() is included as a comparison, as its implementation is simpler to understand, but it should be slower in performance since all requests in the same batch are required to finish before the next batch starts.

In this contrived example, the non-limited vanilla Promise.all() is of course the fastest, while the others could perform more desirably in a real-world congestion scenario.

Update

The async-pool library that others have already suggested is probably a better alternative to my implementation as it works almost identically and has a more concise implementation with a clever usage of Promise.race(): https://github.com/rxaviers/async-pool/blob/master/lib/es7.js

Hopefully my answer can still serve an educational value.

3

A semaphore is a well-known concurrency primitive that was designed to solve problems like this. It's a very universal construct, and implementations of semaphores exist in many languages. This is how one would use a semaphore to solve this issue:

const { Semaphore } = require('async-mutex');

async function main() {
  const s = new Semaphore(100);
  const res = await Promise.all(
    users.map((user) =>
      s.runExclusive(() => remoteServer.getCount(user))
    )
  );
  return res;
}

I'm using the Semaphore implementation from async-mutex; it has decent documentation and TypeScript support.

If you want to dig deep into topics like this you can take a look at the book "The Little Book of Semaphores" which is freely available as PDF here
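For illustration only, a minimal promise-based semaphore (a sketch of the idea, not the async-mutex implementation) could look like this:

class SimpleSemaphore {
  constructor(max) { this.free = max; this.waiters = []; }
  acquire() {
    if (this.free > 0) { this.free--; return Promise.resolve(); }
    return new Promise(resolve => this.waiters.push(resolve)); // wait for a slot
  }
  release() {
    const next = this.waiters.shift();
    if (next) next(); // hand the slot directly to the next waiter
    else this.free++; // or return it to the pool
  }
  async runExclusive(fn) {
    await this.acquire();
    try { return await fn(); } finally { this.release(); }
  }
}

With this sketch, s.runExclusive(() => remoteServer.getCount(user)) behaves the same way as in the example above: at most max calls run at once.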

2

The concurrent function below will return a Promise which resolves to an array of resolved promise values, while implementing a concurrency limit. No 3rd party library.

// waits 50 ms then resolves to the passed-in arg
const sleepAndResolve = s => new Promise(rs => setTimeout(()=>rs(s), 50))

// queue 100 promises
const funcs = []
for(let i=0; i<100; i++) funcs.push(()=>sleepAndResolve(i))

//run the promises with a max concurrency of 10
concurrent(10,funcs) 
.then(console.log) // prints [0,1,2...,99]
.catch(()=>console.log("there was an error"))

/**
 * Run concurrent promises with a maximum concurrency level
 * @param concurrency The number of concurrently running promises
 * @param funcs An array of functions that return promises
 * @returns a promise that resolves to an array of the resolved values from the promises returned by funcs
 */
function concurrent(concurrency, funcs) {
    return new Promise((resolve, reject) => {
        let index = -1;
        const p = [];
        for (let i = 0; i < Math.max(1, Math.min(concurrency, funcs.length)); i++)
            runPromise();
        function runPromise() {
            if (++index < funcs.length)
                (p[p.length] = funcs[index]()).then(runPromise).catch(reject);
            else if (index === funcs.length)
                Promise.all(p).then(resolve).catch(reject);
        }
    });
}

Here's the TypeScript version if you are interested:

/**
 * Run concurrent promises with a maximum concurrency level
 * @param concurrency The number of concurrently running promises
 * @param funcs An array of functions that return promises
 * @returns a promise that resolves to an array of the resolved values from the promises returned by funcs
 */
function concurrent<V>(concurrency:number, funcs:(()=>Promise<V>)[]):Promise<V[]> {
  return new Promise((resolve,reject)=>{
    let index = -1;
    const p:Promise<V>[] = []
    for(let i=0; i<Math.max(1,Math.min(concurrency, funcs.length)); i++) runPromise()
    function runPromise() {
      if (++index < funcs.length) (p[p.length] = funcs[index]()).then(runPromise).catch(reject)
      else if (index === funcs.length) Promise.all(p).then(resolve).catch(reject)
    }
  })
}
1

Here is a basic example combining streaming and p-limit. It streams an HTTP read stream to MongoDB.

const stream = require('stream');
const util = require('util');
const pLimit = require('p-limit');
const es = require('event-stream');
const JSONStream = require('JSONStream');
const streamToMongoDB = require('stream-to-mongo-db').streamToMongoDB;


const pipeline = util.promisify(stream.pipeline)

const outputDBConfig = {
    dbURL: 'yr-db-url',
    collection: 'some-collection'
};
const limit = pLimit(3);

const yrAsyncStreamingFunction = async (readStream) => {
        const mongoWriteStream = streamToMongoDB(outputDBConfig);
        const mapperStream = es.map((data, done) => {
                let someDataPromise = limit(() => yr_async_call_to_somewhere())

                    someDataPromise.then(
                        function handleResolve(someData) {

                            data.someData = someData;    
                            done(null, data);
                        },
                        function handleError(error) {
                            done(error)
                        }
                    );
                })

            await pipeline(
                readStream,
                JSONStream.parse('*'),
                mapperStream,
                mongoWriteStream
            );
        }
1

So many good solutions. I started out with the elegant solution posted by @Endless and ended up with this little extension method that does not use any external libraries, nor does it run in batches (although it assumes you have features like async/await, etc.):

Promise.allWithLimit = async (taskList, limit = 5) => {
    const iterator = taskList.entries();
    let results = new Array(taskList.length);
    let workerThreads = new Array(limit).fill(0).map(() => 
        new Promise(async (resolve, reject) => {
            try {
                let entry = iterator.next();
                while (!entry.done) {
                    let [index, promise] = entry.value;
                    try {
                        results[index] = await promise;
                        entry = iterator.next();
                    }
                    catch (err) {
                        results[index] = err;
                    }
                }
                // No more work to do
                resolve(true); 
            }
            catch (err) {
                // This worker is dead
                reject(err);
            }
        }));

    await Promise.all(workerThreads);
    return results;
};


    const demoTasks = new Array(10).fill(0).map((v,i) => new Promise(resolve => {
       let n = (i + 1) * 5;
       setTimeout(() => {
          console.log(`Did nothing for ${n} seconds`);
          resolve(n);
       }, n * 1000);
    }));

    var results = Promise.allWithLimit(demoTasks);

1
  • @tcooc's answer was quite cool. Didn't know about it and will leverage it in the future.
  • I also enjoyed @MatthewRideout's answer, but it uses an external library!!

Whenever possible, I give a shot at developing this kind of thing on my own, rather than going for a library. You end up learning a lot of concepts which seemed daunting before.

 class Pool{
        constructor(maxAsync) {
            this.maxAsync = maxAsync;
            this.asyncOperationsQueue = [];
            this.currentAsyncOperations = 0
        }

        runAnother() {
            if (this.asyncOperationsQueue.length > 0 && this.currentAsyncOperations < this.maxAsync) {
                this.currentAsyncOperations += 1;
                this.asyncOperationsQueue.pop()()
                    .then(() => { this.currentAsyncOperations -= 1; this.runAnother() }, () => { this.currentAsyncOperations -= 1; this.runAnother() })
            }
        }

        add(f){  // the argument f is a function of signature () => Promise
            this.runAnother();
            return new Promise((resolve, reject) => {
                this.asyncOperationsQueue.push(
                    () => f().then(resolve).catch(reject)
                )
            })
        }
    }

//#######################################################
//                        TESTS
//#######################################################

function dbCall(id, timeout, fail) {
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            if (fail) {
               reject(`Error for id ${id}`);
            } else {
                resolve(id);
            }
        }, timeout)
    }
    )
}


const dbQuery1 = () => dbCall(1, 5000, false);
const dbQuery2 = () => dbCall(2, 5000, false);
const dbQuery3 = () => dbCall(3, 5000, false);
const dbQuery4 = () => dbCall(4, 5000, true);
const dbQuery5 = () => dbCall(5, 5000, false);


const cappedPool = new Pool(2);

const dbQuery1Res = cappedPool.add(dbQuery1).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery2Res = cappedPool.add(dbQuery2).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery3Res = cappedPool.add(dbQuery3).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery4Res = cappedPool.add(dbQuery4).catch(i => i).then(i => console.log(`Resolved: ${i}`))
const dbQuery5Res = cappedPool.add(dbQuery5).catch(i => i).then(i => console.log(`Resolved: ${i}`))

This approach provides a nice API, similar to thread pools in Scala/Java.
After creating one instance of the pool with const cappedPool = new Pool(2), you provide promises to it with simply cappedPool.add(() => myPromise).
Obviously we must ensure that the promise does not start immediately, which is why we must "provide it lazily" with the help of the function.

Most importantly, notice that the result of the method add is a Promise which will be completed/resolved with the value of your original promise! This makes for a very intuitive use.

const resultPromise = cappedPool.add( () => dbCall(...))
resultPromise
.then( actualResult => {
   // Do something with the result form the DB
  }
)
1

This solution uses an async generator to manage concurrent promises with vanilla javascript. The throttle generator takes 3 arguments:

  • An array of values to be supplied as arguments to a promise-generating function. (e.g. An array of URLs.)
  • A function that returns a promise. (e.g. Returns a promise for an HTTP request.)
  • An integer that represents the maximum concurrent promises allowed.

Promises are only instantiated as required in order to reduce memory consumption. Results can be iterated over using a for await...of statement.

The example below provides a function to check promise state, the throttle async generator, and a simple function that returns a promise based on setTimeout. The async IIFE at the end defines the reservoir of timeout values, sets the async iterable returned by throttle, then iterates over the results as they resolve.

If you would like a more complete example for HTTP requests, let me know in the comments.

Please note that Node.js 16+ is required in order to run this example (it relies on async generators and Promise.any()).

const promiseState = function( promise ) {
  const control = Symbol();

  return Promise
    .race([ promise, control ])
    .then( value => ( value === control ) ? 'pending' : 'fulfilled' )
    .catch( () => 'rejected' );
}

const throttle = async function* ( reservoir, promiseClass, highWaterMark ) {
  let iterable = reservoir.splice( 0, highWaterMark ).map( item => promiseClass( item ) );

  while ( iterable.length > 0 ) {
    await Promise.any( iterable );

    const pending = [];
    const resolved = [];

    for ( const currentValue of iterable ) {
      if ( await promiseState( currentValue ) === 'pending' ) {
        pending.push( currentValue );
      } else {
        resolved.push( currentValue );
      }
    }

    console.log({ pending, resolved, reservoir });

    iterable = [
      ...pending,
      ...reservoir.splice( 0, highWaterMark - pending.length ).map( value => promiseClass( value ) )
    ];

    yield Promise.allSettled( resolved );
  }
}

const getTimeout = delay => new Promise( ( resolve, reject ) => {
  setTimeout(resolve, delay, delay);
} );

( async () => {
  const test = [ 1100, 1200, 1300, 10000, 11000, 9000, 5000, 6000, 3000, 4000, 1000, 2000, 3500 ];

  const throttledRequests = throttle( test, getTimeout, 4 );

  for await ( const timeout of throttledRequests ) {
    console.log( timeout );
  }
} )();

1

Using tiny-async-pool ES9 for await...of API, you can do the following:

const asyncPool = require("tiny-async-pool");
const getCount = async (user) => ([user, await remoteServer.getCount(user)]);
const concurrency = 2;

for await (const [user, count] of asyncPool(concurrency, users, getCount)) {
  console.log(user, count);
}

The above asyncPool function returns an async iterator that yields as soon as a promise completes (under concurrency limit) and it rejects immediately as soon as one of the promises rejects.
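If individual rejections should not abort the whole iteration, one hedged workaround is to catch inside the iterator function, so the pool itself never sees a rejection:

// each task resolves to either a value or a captured error, so the pool never rejects
const getCountSafe = async (user) => {
  try {
    return [user, await remoteServer.getCount(user), null];
  } catch (err) {
    return [user, null, err];
  }
};

for await (const [user, count, err] of asyncPool(concurrency, users, getCountSafe)) {
  if (err) console.error(`request failed for ${user}:`, err);
  else console.log(user, count);
}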

1

I know there are a lot of answers already, but I ended up using a very simple solution that requires no library or sleep and uses only a few commands. Promise.all() simply lets you know when all the promises passed to it are finalized. So, you can check on the queue intermittently to see if it is ready for more work; if so, add more processes.

For example:

// init vars
const batchSize = 5
const calls = []
// loop through data and run processes  
for (let [index, data] of [1,2,3].entries()) {
   // pile on async processes 
   calls.push(doSomethingAsyncWithData(data))
   // every 5th concurrent call, wait for them to finish before adding more
   if ((index + 1) % batchSize === 0) await Promise.all(calls)
}
// clean up for any data to process left over if smaller than batch size
const allFinishedProcs = await Promise.all(calls)
1

No external libraries. Just plain JS.

It can be resolved using recursion.

The idea is that initially we immediately execute the maximum allowed number of queries and each of these queries should recursively initiate a new query on its completion.

In this example I collect successful responses together with errors, and I execute all queries, but it's possible to slightly modify the algorithm if you want to terminate batch execution on the first failure.

async function batchQuery(queries, limit) {
  limit = Math.min(queries.length, limit);

  return new Promise((resolve, reject) => {
    const responsesOrErrors = new Array(queries.length);
    let startedCount = 0;
    let finishedCount = 0;
    let hasErrors = false;

    function recursiveQuery() {
      let index = startedCount++;

      doQuery(queries[index])
        .then(res => {
          responsesOrErrors[index] = res;
        })
        .catch(error => {
          responsesOrErrors[index] = error;
          hasErrors = true;
        })
        .finally(() => {
          finishedCount++;
          if (finishedCount === queries.length) {
            hasErrors ? reject(responsesOrErrors) : resolve(responsesOrErrors);
          } else if (startedCount < queries.length) {
            recursiveQuery();
          }
        });
    }

    for (let i = 0; i < limit; i++) {
      recursiveQuery();
    }
  });
}

async function doQuery(query) {
  console.log(`${query} started`);
  const delay = Math.floor(Math.random() * 1500);
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      if (delay <= 1000) {
        console.log(`${query} finished successfully`);
        resolve(`${query} success`);
      } else {
        console.log(`${query} finished with error`);
        reject(`${query} error`);
      }
    }, delay);
  });
}

const queries = new Array(10).fill('query').map((query, index) => `${query}_${index + 1}`);

batchQuery(queries, 3)
  .then(responses => console.log('All successful', responses))
  .catch(responsesWithErrors => console.log('All with several failed', responsesWithErrors));

2
  • This approach completely misses error handling.
    – Bergi
    Commented May 28, 2021 at 19:25
@Bergi I've added error handling to my answer.
    – Anton Fil
    Commented Dec 26, 2022 at 21:30
1

If you want to use an external package, you can use p-limit:

import pLimit from 'p-limit';

const limit = pLimit(1);

const input = [
    limit(() => fetchSomething('foo')),
    limit(() => fetchSomething('bar')),
    limit(() => doSomething())
];

// Only one promise is run at once
const result = await Promise.all(input);
console.log(result);
0

So I tried to make some of the examples shown here work for my code, but since this was only for an import script and not production code, using the npm package batch-promises was surely the easiest path for me.

NOTE: Requires runtime to support Promise or to be polyfilled.

API: batchPromises(int: batchSize, array: Collection, i => Promise: Iteratee)
The Promise: Iteratee will be called after each batch.

Use:
import batchPromises from 'batch-promises';
 
batchPromises(2, [1,2,3,4,5], i => new Promise((resolve, reject) => {
 
  // The iteratee will fire after each batch resulting in the following behaviour:
  // @ 100ms resolve items 1 and 2 (first batch of 2)
  // @ 200ms resolve items 3 and 4 (second batch of 2)
  // @ 300ms resolve remaining item 5 (last remaining batch)
  setTimeout(() => {
    resolve(i);
  }, 100);
}))
.then(results => {
  console.log(results); // [1,2,3,4,5]
});

0

Recursion is the answer if you don't want to use external libraries:

downloadAll(someArrayWithData){
  var self = this;

  var tracker = function(next){
    return self.someExpensiveRequest(someArrayWithData[next])
    .then(function(){
      next++;//This updates the next in the tracker function parameter
      if(next < someArrayWithData.length){//Did I finish processing all my data?
        return tracker(next);//Go to the next promise
      }
    });
  }

  return tracker(0); 
}
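The tracker above processes items strictly one at a time. To get a concurrency of N with the same recursive idea, one sketch (reusing the same hypothetical someExpensiveRequest) is to launch several trackers that share one counter:

downloadAllWithLimit(someArrayWithData, limit){
  var self = this;
  var next = 0; // shared by all trackers

  var tracker = function(){
    if (next >= someArrayWithData.length) return Promise.resolve();
    var current = next++; // claim an index before the async call starts
    return self.someExpensiveRequest(someArrayWithData[current])
      .then(function(){
        return tracker(); // this "slot" immediately picks up the next unclaimed item
      });
  }

  // start `limit` trackers; together they work through the shared counter
  var trackers = [];
  for (var i = 0; i < Math.min(limit, someArrayWithData.length); i++) trackers.push(tracker());
  return Promise.all(trackers);
}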
0

Expanding on the answer posted by @deceleratedcaviar, I created a 'batch' utility function that takes as arguments an array of values, a concurrency limit, and a processing function. Yes, I realize that using Promise.all this way is more akin to batch processing than true concurrency, but if the goal is to limit the number of HTTP calls at any one time, I go with this approach due to its simplicity and the lack of need for an external library.

async function batch(o) {
  let arr = o.arr
  let resp = []
  while (arr.length) {
    let subset = arr.splice(0, o.limit)
    let results = await Promise.all(subset.map(o.process))
    resp.push(results)
  }
  return [].concat.apply([], resp)
}

let arr = []
for (let i = 0; i < 250; i++) { arr.push(i) }

async function calc(val) { return val * 100 }

(async () => {
  let resp = await batch({
    arr: arr,
    limit: 100,
    process: calc
  })
  console.log(resp)
})();

0

One more solution with a custom promise library (CPromise):

    import { CPromise } from "c-promise2";
    import cpFetch from "cp-fetch";
    
    const promise = CPromise.all(
      function* () {
        const urls = [
          "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=1",
          "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=2",
          "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=3",
          "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=4",
          "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=5",
          "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=6",
          "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=7"
        ];
    
        for (const url of urls) {
          yield cpFetch(url); // add a promise to the pool
          console.log(`Request [${url}] completed`);
        }
      },
      { concurrency: 2 }
    ).then(
      (v) => console.log(`Done: `, v),
      (e) => console.warn(`Failed: ${e}`)
    );
    
    // yeah, we are able to cancel the task and abort pending network requests
    // setTimeout(() => promise.cancel(), 4500);

    import { CPromise } from "c-promise2";
    import cpFetch from "cp-fetch";
    
    const promise = CPromise.all(
      [
        "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=1",
        "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=2",
        "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=3",
        "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=4",
        "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=5",
        "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=6",
        "https://run.mocky.io/v3/7b038025-fc5f-4564-90eb-4373f0721822?mocky-delay=2s&x=7"
      ],
      {
        mapper: (url) => {
          console.log(`Request [${url}]`);
          return cpFetch(url);
        },
        concurrency: 2
      }
    ).then(
      (v) => console.log(`Done: `, v),
      (e) => console.warn(`Failed: ${e}`)
    );
    
    // yeah, we are able to cancel the task and abort pending network requests
    //setTimeout(() => promise.cancel(), 4500);

0

Warning: this has not been benchmarked for efficiency and does a lot of array copying/creation.

If you want a more functional approach you could do something like:

import chunk from 'lodash.chunk';

const maxConcurrency = (max) => (dataArr, promiseFn) =>
  chunk(dataArr, max).reduce(
      async (agg, batch) => [
          ...(await agg),
          ...(await Promise.all(batch.map(promiseFn)))
      ],
      []
  );

and then you could use it like:

const randomFn = (data) =>
    new Promise((res) => setTimeout(
      () => res(data + 1),
        Math.random() * 1000
      ));


const result = await maxConcurrency(5)(
    [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
    randomFn
);
console.log('result+++', result);
0

I have a solution that creates chunks and uses the .reduce function to wait for each chunk's Promise.all to finish. I also add some delay in case the promises have rate limits.

export function delay(ms: number) {
  return new Promise<void>((resolve) => setTimeout(resolve, ms));
}

export const chunk = <T>(arr: T[], size: number): T[][] => [
  ...Array(Math.ceil(arr.length / size)),
].map((_, i) => arr.slice(size * i, size + size * i));

const myIdList = []; // all items
const groupedIdList = chunk(myIdList, 20); // grouped by 20 items

await groupedIdList.reduce(async (prev, subIdList) => {
  await prev;
  // Make sure we wait for 500 ms after processing every page to prevent overloading the calls.
  const data = await Promise.all(subIdList.map(myPromise));
  await delay(500);
}, Promise.resolve());
0

It is possible to limit requests to the server by using https://www.npmjs.com/package/job-pipe

Basically you create a pipe and tell it how many concurrent requests you want:

const pipe = createPipe({ throughput: 6, maxQueueSize: Infinity })

Then you take your function which performs the call and force it through the pipe, so that only a limited number of calls run at the same time:

const makeCall = async () => {...}
const limitedMakeCall = pipe(makeCall)

Finally, you call this method as many times as you need, as if it were unchanged, and it will limit itself to however many parallel executions it can handle:

await limitedMakeCall()
await limitedMakeCall()
await limitedMakeCall()
await limitedMakeCall()
await limitedMakeCall()
....
await limitedMakeCall()

Profit.

0

I suggest not downloading packages and not writing hundreds of lines of code:

async function async_arr<T1, T2>(
    arr: T1[],
    func: (x: T1) => Promise<T2> | T2, //can be sync or async
    limit = 5
) {
    let results: T2[] = [];
    let workers = [];
    let current = Math.min(arr.length, limit);
    async function process(i) {
        if (i < arr.length) {
            results[i] = await Promise.resolve(func(arr[i]));
            await process(current++);
        }
    }
    for (let i = 0; i < current; i++) {
        workers.push(process(i));
    }
    await Promise.all(workers);
    return results;
}
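A hypothetical usage, inside an async function (urls and fetchPage are illustrative names):

// process at most 5 URLs at a time; results come back in input order
const pages = await async_arr(urls, (url) => fetchPage(url), 5);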
0

Here's my recipe, based on killdash9's answer. It allows you to choose the behaviour on exceptions (Promise.all vs Promise.allSettled).

// Given an array of async functions, runs them in parallel,
// with at most maxConcurrency simultaneous executions
// Except for that, behaves the same as Promise.all,
// unless allSettled is true, where it behaves as Promise.allSettled  

function concurrentRun(maxConcurrency = 10, funcs = [], allSettled = false) {
  if (funcs.length <= maxConcurrency) {
    const ps = funcs.map(f => f());
    return allSettled ? Promise.allSettled(ps) : Promise.all(ps);
  }
  return new Promise((resolve, reject) => {
    let idx = -1;
    const ps = new Array(funcs.length);
    function nextPromise() {
      idx += 1;
      if (idx < funcs.length) {
        (ps[idx] = funcs[idx]()).then(nextPromise).catch(allSettled ? nextPromise : reject);
      } else if (idx === funcs.length) {
        (allSettled ? Promise.allSettled(ps) : Promise.all(ps)).then(resolve).catch(reject);
      }
    }
    for (let i = 0; i < maxConcurrency; i += 1) nextPromise();
  });
}
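A hedged usage example, inside an async function (urls and the fetch wrapper are illustrative):

const tasks = urls.map(url => () => fetch(url).then(res => res.json()));

// at most 10 requests in flight; rejects on the first failure, like Promise.all
const results = await concurrentRun(10, tasks);

// or collect successes and failures, like Promise.allSettled
const settled = await concurrentRun(10, tasks, true);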
0

The solution below uses iter-ops library, and operator waitRace that controls concurrency:

import {pipeAsync, map, waitRace} from 'iter-ops';

const i = pipeAsync(
    users, // inputs iterator/iterable
    map(u => remoteServer.getCount(u)), // create an async request for each user
    waitRace(10) // race-resolve up to 10 promises at a time
)
   .catch(err => {/* handle rejections */});

for await(const p of i) {
    //=> p = resolved value
}
