I have the following code:
```javascript
newIds = [];
for (let i = 0; i < combinedIds.length; i++) {
  thisId = combinedIds[i];
  console.log(checkForAssoc(thisId));
}

async function checkForAssoc(docId) {
  check = newIds.includes(docId);
  if (check) {
    return;
  } else {
    newIds.push(docId);
  }
  docToCheck = await adminModel.letterModel.findById(docId);
  if (docToCheck.assocLetters.length > 0) {
    newDocToCheckArry = docToCheck.assocLetters;
    for (let n = 0; n < newDocToCheckArry.length; n++) {
      checkForAssoc(newDocToCheckArry[n]);
    }
  }
  return await newIds;
}
```
I am trying to loop over an array combinedIds
(which contains a number of mongoDB ids) and based on that create a new array newIds
by fetching other ids from the backend.
I am using a recursive async function, and everything works fine; however, since the function returns a promise, I am not sure how to convert it to a simple array of values, because it's within a for loop. Any help and advice appreciated.
CodePudding user response:
It might be that you have misunderstood what await
and async
are. Imagine you have the following code:
```javascript
async function something() {
  const v = await doRequest();
  const promise = v.body;
  return promise;
}
```
This is the exact same as:
```javascript
function something() {
  const promise = doRequest().then(v => v.body);
  return promise;
}
```
Now, from another async function, you can call either of these with `await`, and the result is whatever `v.body` is. However, if you are not in an async function, both of these return a promise, and you need to call `then` on it with a function to be run when it gets fulfilled, like this:

```javascript
something().then(vBody => console.log(vBody));
```
`async` and `await` are just tools that let a function be written in a more eager style even though it only ever deals with promises.
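To make that concrete, here is a runnable sketch of the two equivalent forms above, with a stubbed `doRequest` (the stub body is invented for illustration, and the functions are renamed so both can coexist):

```javascript
// Invented stub: resolves with an object that has a `body` property.
function doRequest() {
  return Promise.resolve({ body: 'hello' });
}

// The async/await form.
async function somethingAwait() {
  const v = await doRequest();
  return v.body;
}

// The equivalent .then form.
function somethingThen() {
  return doRequest().then(v => v.body);
}

// Outside an async function, both calls hand you a promise,
// so you attach a .then to get at the value.
somethingAwait().then(body => console.log(body)); // logs 'hello'
somethingThen().then(body => console.log(body));  // logs 'hello'
```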
Looking at your code, `return await newIds` will return an already-fulfilled promise, since `newIds` isn't a promise but an array. It will not wait for the array to be fully populated, because you are not awaiting the recursive call or doing anything with the promise it returns. You are then doing `console.log` on a promise, since all async functions return promises. You also have a bug: you should have `await`-ed the recursive call, or kept the promises you get in the loop and passed them to `Promise.all`, so that the IO waits run in parallel instead of being chained.
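Putting those fixes together, your function could look something like the sketch below. The mock `findById` stands in for your real `adminModel`, and `newIds` is kept local instead of global; the names `collectAssoc` and `letters` are invented for the example:

```javascript
// Mock stand-in for adminModel.letterModel.findById (your real model differs).
const letters = {
  a: { assocLetters: ['b', 'c'] },
  b: { assocLetters: ['a'] },
  c: { assocLetters: [] },
};
const adminModel = {
  letterModel: { findById: (id) => Promise.resolve(letters[id]) },
};

async function collectAssoc(combinedIds) {
  const newIds = [];
  async function checkForAssoc(docId) {
    if (newIds.includes(docId)) return; // already seen: stop recursing
    newIds.push(docId);
    const doc = await adminModel.letterModel.findById(docId);
    // Await every recursive call; Promise.all runs the lookups in parallel.
    await Promise.all(doc.assocLetters.map(checkForAssoc));
  }
  await Promise.all(combinedIds.map(checkForAssoc));
  return newIds; // by now the array is fully populated
}

collectAssoc(['a']).then(ids => console.log(ids)); // logs ['a', 'b', 'c']
```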
CodePudding user response:
It looks to me as though you're crawling the data, with the result for each input key possibly containing more keys to search.
If I've got this right, I think it makes sense to separate out the general crawling mechanism from the specifics of your code, configuring a crawler with functions that deal with your particular data.
This is an implementation of that idea, with a dummy version of `adminModel`:
```javascript
const crawl = (search, extract, convert, collect) => async (inits) => {
  const toCheck = new Set(inits)
  const found = new Map()
  while (toCheck.size !== 0) {
    const first = toCheck.values().next().value
    const res = await search(first)
    found.set(first, convert(res))
    const newVals = extract(res)
    newVals.forEach((val) => found.has(val) || toCheck.add(val))
    toCheck.delete(first)
  }
  return collect(found)
}

const findLetters = crawl(
  (key) => adminModel.letterModel.findById(key),
  (x) => x.assocLetters,
  (x) => null,
  (m) => [...m.keys()]
)

const combinedIds = ['a', 'e']

findLetters(combinedIds)
  .then(letters => console.log('Results:', letters))
  .catch(console.warn)
```
```javascript
// Dummy version of adminModel
const adminModel = {letterModel: {
  findById: ((data) => (id) => console.log(`Getting ${id}`) ||
    new Promise((res, rej) => setTimeout(() => res(data[id]), 250))
  )({
    a: {value: '1', assocLetters: ['c', 'e']},
    b: {value: '2', assocLetters: ['c', 'd']},
    c: {value: '3', assocLetters: []},
    d: {value: '4', assocLetters: ['a', 'd', 'e']},
    e: {value: '5', assocLetters: ['b']},
    f: {value: '6', assocLetters: ['f', 'g']},
    g: {value: '7', assocLetters: []},
  })
}}
```
`crawl` here is a general-purpose data crawler. It takes four parameters, and returns a function that takes an array of input keys (here, `combinedIds`) and returns a Promise.
The inputs are:

- `search`, which takes a key and returns a promise for some result
- `extract`, which takes that result and chooses all keys it contains
- `convert`, which turns that result into something you want to save
- `collect`, which takes a Map from keys to the result of `convert` and returns your desired final output

The function returned will take an array of keys and return a promise for the output of `collect`.
```
crawl :: ((a -> Promise b), (b -> [a]), (b -> c), ((Map a c) -> d)) -> [a] -> Promise d
```
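A toy example may help show how the four parameters line up with that signature. This crawls a plain in-memory graph with no IO at all; the graph, its `next`/`label` fields, and the name `labelsByKey` are all invented, and `crawl` is repeated so the snippet runs on its own:

```javascript
const crawl = (search, extract, convert, collect) => async (inits) => {
  const toCheck = new Set(inits)
  const found = new Map()
  while (toCheck.size !== 0) {
    const first = toCheck.values().next().value
    const res = await search(first)
    found.set(first, convert(res))
    extract(res).forEach((val) => found.has(val) || toCheck.add(val))
    toCheck.delete(first)
  }
  return collect(found)
}

// Toy graph: each node carries a label and a list of further keys.
const graph = {
  a: { label: 'A', next: ['b'] },
  b: { label: 'B', next: ['a', 'c'] },
  c: { label: 'C', next: [] },
}

const labelsByKey = crawl(
  (key) => Promise.resolve(graph[key]), // search:  key -> promise of a node
  (node) => node.next,                  // extract: node -> more keys
  (node) => node.label,                 // convert: node -> value to save
  (m) => Object.fromEntries(m),         // collect: Map -> final output
)

labelsByKey(['a']).then(out => console.log(out)) // { a: 'A', b: 'B', c: 'C' }
```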
In our case, we have:

- `search`: `(key) => adminModel.letterModel.findById(key)`. I don't know what this does, but I created a dummy version in the snippet above. If that `findById` function doesn't reference `this`, you might be able to refer to it directly, replacing `(key) => adminModel.letterModel.findById(key)` with `adminModel.letterModel.findById`
- `extract`: a function which takes the `assocLetters` property from its input
- `convert`: a dummy function, since it seems we'll only need the keys
- `collect`: a function which extracts the keys from the `Map` supplied
The heart of this, of course, is `crawl`. It's a very lightweight data crawler. It does its calls sequentially; if we wanted, we could get more sophisticated and write one that bundles calls. That would be straightforward, and wouldn't alter the calling interface. But that can wait.
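For the curious, that bundled variant could look something like this sketch, which fires off each "wave" of pending keys in parallel with `Promise.all` while keeping the same four-parameter interface (the name `crawlParallel` is invented, and this hasn't been tested against a real model):

```javascript
// Like crawl, but searches each wave of pending keys in parallel.
const crawlParallel = (search, extract, convert, collect) => async (inits) => {
  const found = new Map()
  let wave = [...new Set(inits)]
  while (wave.length > 0) {
    // All searches in this wave run concurrently.
    const results = await Promise.all(wave.map(search))
    wave.forEach((key, i) => found.set(key, convert(results[i])))
    // Next wave: every extracted key we haven't already searched.
    wave = [...new Set(results.flatMap(extract))].filter(k => !found.has(k))
  }
  return collect(found)
}
```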
The function returned by `crawl(search, extract, convert, collect)` takes an initial array of keys you want to use, and keeps two collections: the keys you still want to check, starting with the initial values (`toCheck`), and a map from the keys already searched to the result of the `convert` call (`found`). So long as `toCheck` is not empty, we call `search` for its first key, take any new keys found by calling `extract` on the result and add them to `toCheck`, assign the result of `convert` to our key inside `found`, and remove the key from `toCheck`. When the queue is empty we call `collect` on `found` and return the results.
One big advantage of this style is that the abstraction above is no harder -- and often easier -- to think about than a custom version for your specific case. But now you can use it for many purposes -- say a web crawler -- and the code for your problem is as simple as this:
```javascript
const findLetters = crawl(
  (key) => adminModel.letterModel.findById(key),
  (x) => x.assocLetters,
  (x) => null,
  (m) => [...m.keys()]
)

const combinedIds = ['a', 'e']

findLetters(combinedIds)
  .then(letters => console.log('Results:', letters))
  .catch(console.warn)
```