The array has about 100,000 elements.
When I loop over it, I notice that the page freezes for a short period of time.
I read that to avoid this I need to loop over chunks of the array inside a setTimeout
call, to let the browser "breathe" in between these calls.
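Something like this (simplified; updateElement stands in for whatever I actually do to each element) is what I understood from that advice:

function processChunked(array, chunkSize) {
  let i = 0;
  function doChunk() {
    const end = Math.min(i + chunkSize, array.length);
    for (; i < end; i++) {
      updateElement(array[i]); // placeholder for the real per-element work
    }
    if (i < array.length) {
      setTimeout(doChunk, 0); // give the browser a chance to repaint between chunks
    }
  }
  doChunk();
}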
But what if I use promises instead of setTimeout and run them in parallel instead of one after the other? Each promise would loop over a portion of the array and update it. Would this create unexpected issues, like the array not being updated correctly?
CodePudding user response:
It all depends on how you implement the Promises. Just be aware that if you're calling an async function, everything up to the point where you await is still synchronous code. So if you called Promise.all on a bunch of calls to this:
async function workOnSlice(array, index, count) {
  // There's no await in here, so calling this runs the whole loop synchronously.
  for (let i = index; i < index + count; i++) {
    doSomethingExpensive(array[i]);
  }
}
...it would still commandeer the CPU in the same way. Also, keep in mind that Promise.all doesn't actually run any code in parallel; the only thing it does is allow async waits to happen in parallel, and once those waits have completed, it's still a single JS thread that runs the task code.
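For example (the chunk size here is arbitrary), fanning the slices out through Promise.all would still run every doSomethingExpensive call back to back on the main thread:

const CHUNK = 1000;
const jobs = [];
for (let i = 0; i < array.length; i += CHUNK) {
  jobs.push(workOnSlice(array, i, Math.min(CHUNK, array.length - i)));
}
// Each workOnSlice call has already done all of its work synchronously by the
// time its promise exists; Promise.all just waits on already-settled promises.
await Promise.all(jobs);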
I'd recommend using a worker process/thread instead for actual parallel processing, if possible. Or if you don't care about parallel processing and just want to yield to the event loop periodically, you could queue the items to process and have a process/wait loop.
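A rough sketch of the yield-to-the-event-loop approach (the chunk size and the setTimeout-based pause are just one way to do it):

async function processInChunks(array, chunkSize = 1000) {
  for (let i = 0; i < array.length; i += chunkSize) {
    const end = Math.min(i + chunkSize, array.length);
    for (let j = i; j < end; j++) {
      doSomethingExpensive(array[j]);
    }
    // Yield so the browser can handle input and repaint between chunks.
    await new Promise(resolve => setTimeout(resolve, 0));
  }
}

And a minimal sketch of the Web Worker route, assuming doSomethingExpensive can run inside the worker and bigArray is your array (note that postMessage copies the data via structured clone, so the main thread gets an updated copy back rather than the same array object):

// worker.js
self.onmessage = (e) => {
  const array = e.data;
  for (let i = 0; i < array.length; i++) {
    doSomethingExpensive(array[i]);
  }
  self.postMessage(array);
};

// main thread
const worker = new Worker('worker.js');
worker.onmessage = (e) => {
  const updated = e.data; // use the processed copy here
};
worker.postMessage(bigArray);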