NodeJS ExpressJS process array in parallel


I have an array of IDs, and I need to send a request to another microservice for each ID, which takes a lot of time. Is it possible to do this in parallel? My code looks like this:

const ids = [1,2,3,4]
const objects = await Promise.all(ids.map(id => this._getData(id)))

CodePudding user response:

JavaScript concurrency in general

You can put any I/O operation on the JavaScript event queue and then wait for all of those operations to resolve, as long as each one exposes a callback or Promise you can listen to.
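As a minimal sketch of that idea (using Node's built-in fs.readFile as the example I/O operation; the function name readFileAsync is just for illustration), a callback-based call can be wrapped in a Promise so it can be awaited:

const fs = require('fs')

// Wrap a callback-based I/O call in a Promise so it can be awaited.
function readFileAsync(path) {
  return new Promise((resolve, reject) => {
    fs.readFile(path, 'utf8', (err, data) => {
      if (err) reject(err)
      else resolve(data)
    })
  })
}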

Promise.all()

A network call is one such I/O operation. With most HTTP request libraries, and with fetch, you get a Promise back when you create a request. You can use Promise.all, as in your example, to wait for any number of requests you've created to finish. So, given that your _getData function returns a Promise, this should work, yes.
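For illustration, here is a sketch of what that could look like end to end (assuming _getData makes an HTTP call with Node's built-in fetch; the URL is hypothetical):

const ids = [1, 2, 3, 4]

// Hypothetical implementation: each call is an independent HTTP request.
async function _getData(id) {
  const res = await fetch(`https://example.com/items/${id}`) // hypothetical URL
  return res.json()
}

// All requests start immediately; await resolves once the slowest one finishes,
// so total time is roughly that of the single slowest request, not the sum.
const objects = await Promise.all(ids.map((id) => _getData(id)))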

If you expect some of those _getData calls to fail, then you can use Promise.allSettled() and process the results based on the status property.
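For example, a sketch of splitting settled results by status (reusing the this._getData call from your code):

const results = await Promise.allSettled(ids.map((id) => this._getData(id)))

// Unlike Promise.all, this never rejects; each result reports its own outcome.
const objects = results
  .filter((r) => r.status === 'fulfilled')
  .map((r) => r.value)

const errors = results
  .filter((r) => r.status === 'rejected')
  .map((r) => r.reason)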

More reading

You can read up on Promises and the JavaScript concurrency model (event loop) to understand this more clearly.

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/all

https://developer.mozilla.org/en-US/docs/Web/JavaScript/EventLoop

CodePudding user response:

app.param('user', function (req, res, next, id) {
  // Try to get the user details from the User model
  // and attach them to the request object.
  User.find(id, function (err, user) {
    if (err) {
      next(err)
    } else if (user) {
      req.user = user
      next()
    } else {
      next(new Error('failed to load user'))
    }
  })
})
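For context: app.param registers middleware that runs whenever a route parameter named user appears, so a route such as the hypothetical one below receives req.user already populated:

app.get('/user/:user', function (req, res) {
  // req.user was attached by the app.param('user', ...) handler above.
  res.send(req.user)
})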
