I have an API from which I need to query N pages of data. Because I do not want to overload the API, I want to do it sequentially and without blocking the main thread.
The code would be something like this:
var res = []; // all data from api
var totalPages = 10;
var pageSize = 100;
for (let page = 0; page < totalPages; page++) {
  // load using jQuery ajax request
  $.get('api.php', { page: page, page_size: pageSize }, function(result) {
    res.push(...result); // add data to resulting array
  });
}
But this approach has a few issues:
- Since it is async, it will just run all the requests in parallel, overloading the API as a result. I need them to still run async, but each request should start only when the previous one is done.
- Since all the calls are async, by the end of the loop we still wouldn't have the requested data - it would still be loading in the background. We need to somehow wait for all the callbacks to finish before returning res to the other code that needs it.
- There is no way to make every callback pass its result to the next one, and that is the only way to stop loading when some callback receives a "stop loading"/"no more data" response from the server.
Is there a way to fix these issues without using any third-party libraries or promises? Just plain old vanilla JavaScript. Sorry if something looks unclear, I am not very experienced in JS.
CodePudding user response:
You can use simple recursion here, like this:
var res = []; // all data from api
var totalPages = 10;
var pageSize = 100;
const loader = page => {
  // load using jQuery ajax request
  $.get('api.php', { page: page, page_size: pageSize }, function(result) {
    res.push(...result); // add data to resulting array
    if (page + 1 < totalPages)
      loader(page + 1); // next request starts only after this one finished
    else
      console.log('DONE!');
  });
};

loader(0);
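This solves your first issue (sequential requests), but note that res is only complete inside the final callback. To cover your other two points you can thread a completion callback through the recursion and stop as soon as the server signals there is no more data. Here is a minimal sketch, assuming the API returns an empty array on the last page (adjust the stop check to whatever your server actually sends); loadAll and onDone are just illustrative names:

var totalPages = 10;
var pageSize = 100;

// Loads pages sequentially, collecting everything into `res`,
// then hands the full array to `onDone`.
function loadAll(onDone) {
  var res = [];
  var loader = function(page) {
    $.get('api.php', { page: page, page_size: pageSize }, function(result) {
      res.push(...result);
      // Stop when the server has no more data or we hit the page limit.
      if (result.length === 0 || page + 1 >= totalPages)
        onDone(res); // all data is available only from this point on
      else
        loader(page + 1); // each request starts after the previous finished
    });
  };
  loader(0);
}

// Usage: any code that needs the data runs inside the callback.
loadAll(function(allData) {
  console.log('Loaded ' + allData.length + ' records');
});

Since you are not using promises, a callback like this is the only way to hand the finished array to other code: anything that needs the data has to run inside (or be called from) onDone.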