How to chunk/parallelize Node.js API requests with async/await?


For each id in an array of ids, I want to make an API request to an endpoint somapi.com/todo/{id}. Let's say the API is rate-limited to 25 requests per second and a request takes 1 second on average. We'll assume 100,000 ids. My first implementation would be something like:

const ids = [...Array(100000).keys()];

// Sequential: each request waits for the previous one to finish
const downloadTodos = async (ids) => {
    for (const id of ids) {
        await downloadTodo(id);
    }
};

But that would be slow and far from utilizing the maximum rate limit. How can I make downloadTodos faster using parallelization and async/await? Maybe by awaiting N requests in parallel at any given time?
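
For reference, downloadTodo is assumed to look roughly like this (a minimal sketch using the built-in fetch of Node.js 18+; somapi.com is just the placeholder endpoint from above):

const downloadTodo = async (id) => {
    // Hypothetical request against the placeholder endpoint
    const response = await fetch(`https://somapi.com/todo/${id}`);
    if (!response.ok) {
        throw new Error(`Todo ${id} failed with status ${response.status}`);
    }
    return response.json();
};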

CodePudding user response:

First, chunk the array into sub-arrays of 25 ids each. Then use Promise.allSettled to wait for all 25 requests in a chunk to be resolved or rejected and push the outcomes into a results array. Also make sure at least one second passes before the next chunk starts, so you stay under the rate limit.

const chunkSize = 25;
const waitTimeMs = 1000;

function sleep(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
}

async function download(ids) {
    const result = [];
    for (let i = 0; i < ids.length; i += chunkSize) {
        // Chunk the ids
        const chunk = ids.slice(i, i + chunkSize);
        const [settled] = await Promise.all([
            // Download a chunk 
            Promise.allSettled(chunk.map(async id => {
                // return downloadTodo(id);
                return Promise.resolve(id);
            })),
            // Make sure at least 1 second passes before the next chunk is downloaded
            sleep(waitTimeMs) 
        ]);
        // Add the chunk to the result (rejected promises have no .value,
        // so they show up as undefined)
        result.push(...settled.map(s => s.value));
    }
    return result;
}

download([...Array(100).keys()]).then(r => console.log(r));
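
One caveat of the chunked approach: the next chunk cannot start until the slowest request of the current chunk has finished, so a single slow response stalls all 25 slots. To keep N requests in flight at all times, as the question suggests, a minimal worker-pool sketch could look like the following (it reuses the hypothetical downloadTodo from the question; note it caps concurrency rather than requests per second, which only respects the 25/sec limit while requests take about a second each):

const concurrency = 25;

async function downloadAll(ids) {
    const results = new Array(ids.length);
    let next = 0;
    // Start 25 workers; each repeatedly claims the next unprocessed id
    const workers = Array.from({ length: concurrency }, async () => {
        while (next < ids.length) {
            const index = next++;
            try {
                results[index] = await downloadTodo(ids[index]);
            } catch (err) {
                results[index] = undefined; // mirror allSettled's handling of rejections
            }
        }
    });
    await Promise.all(workers);
    return results;
}

Because JavaScript runs single-threaded, the next++ claim is race-free as long as there is no await between the length check and the increment.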
