There is an initial array that needs to be mapped and filtered at the same time:
let data = [
  {
    "id": 1,
    "parentId": null,
    "dxId": "1_id",
    "dxParentId": null,
    "defName": "group1"
  },
  {
    "id": 12,
    "parentId": null,
    "dxId": "12_id",
    "dxParentId": null,
    "defName": "group3"
  },
  {
    "id": 8,
    "parentId": 1,
    "dxId": "1_id/13",
    "dxParentId": "1_id",
    "defName": "group4"
  },
  {
    "id": 5,
    "parentId": 1,
    "dxId": "1_id/20",
    "dxParentId": "1_id",
    "defName": "user1"
  },
  {
    "id": 5,
    "parentId": 1,
    "dxId": "1_id/20",
    "dxParentId": "12_id",
    "defName": "user1"
  },
];
I filter by the presence of the dxParentId property and collect the resulting array of strings (not the initial array of objects).
Originally I did it with the reduce method:
resultArr = data.reduce((filtered, obj) => {
  if (obj.dxParentId !== null) {
    if (!filtered.includes(obj.dxParentId)) {
      filtered.push(obj.dxParentId);
    }
  }
  return filtered;
}, []);
console.log(resultArr , 'result'); // ['1_id', '12_id'] 'result'
Then I discovered that this can also be done with the flatMap array method:
resultArr = data.flatMap((obj) => obj.dxParentId !== null ? [obj.dxParentId] : []);
console.log(resultArr , 'result'); //['1_id', '1_id', '12_id'] 'result'
Notice that in the reduce version I run an additional check against the accumulating array (filtered) using includes, which gives me the deduplicated result I need.
The question is: how do I do the same via flatMap? Is it possible to access the accumulating result at each iteration?
Additionally: which approach is better for this case in terms of performance? Maybe forEach? Thanks in advance.
CodePudding user response:
You cannot access the array that is being built in flatMap. In mapping, and also mapping-with-flattening, the callback is supposed to return a value that depends only on the particular item.
You might simply not want to use reduce at all for an imperative algorithm like this:
const resultArr = [];
for (const obj of data) {
  if (obj.dxParentId !== null) {
    if (!resultArr.includes(obj.dxParentId)) {
      resultArr.push(obj.dxParentId);
    }
  }
}
console.log(resultArr, 'result');
However, to get unique values from an array there is a much better (simpler and more efficient) way:
const resultArr = Array.from(new Set(data.map(obj => obj.dxParentId).filter(id => id !== null)));
console.log(resultArr, 'result');
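If you do want to keep flatMap, a simple option is to combine it with the same Set idea: flatMap drops the nulls, then a Set removes duplicates while preserving first-seen order. A sketch, using an abbreviated version of the data from the question:

```javascript
// Abbreviated sample in the same shape as the question's data.
const data = [
  { id: 1, parentId: null, dxId: "1_id", dxParentId: null, defName: "group1" },
  { id: 8, parentId: 1, dxId: "1_id/13", dxParentId: "1_id", defName: "group4" },
  { id: 5, parentId: 1, dxId: "1_id/20", dxParentId: "1_id", defName: "user1" },
  { id: 5, parentId: 1, dxId: "1_id/20", dxParentId: "12_id", defName: "user1" },
];

// flatMap returns [] for nulls (so they vanish), then the Set deduplicates.
const resultArr = [
  ...new Set(data.flatMap((obj) => (obj.dxParentId !== null ? [obj.dxParentId] : []))),
];

console.log(resultArr); // ['1_id', '12_id']
```

This keeps each flatMap callback a pure per-item mapping, and does the deduplication as a separate step afterwards.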
CodePudding user response:
So while you can't view the array flatMap is building while it's building it, you can create an object just outside of flatMap and use it to track which ids you've added or not.
const seen = {};
const resultArr = data.flatMap((obj) => {
  const id = obj.dxParentId;
  if (id in seen) return [];
  if (id !== null) {
    seen[id] = true;
    return [id];
  }
  return [];
});
console.log(resultArr, "result"); // ['1_id', '12_id'] 'result'
As far as time and space complexity goes, this solution is much faster but uses more memory (in most cases that's a good tradeoff).
Array.prototype.includes() traverses the array, so each membership check is O(n); `key in object` is constant time on average, but building the object costs O(n) extra space.
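A Set is the more idiomatic structure for this kind of membership test than a plain object (it avoids edge cases such as ids that collide with inherited object keys). A sketch of the same flatMap-with-external-tracking approach using Set, on an abbreviated sample of the question's data:

```javascript
// Abbreviated sample in the same shape as the question's data.
const data = [
  { id: 1, parentId: null, dxId: "1_id", dxParentId: null, defName: "group1" },
  { id: 8, parentId: 1, dxId: "1_id/13", dxParentId: "1_id", defName: "group4" },
  { id: 5, parentId: 1, dxId: "1_id/20", dxParentId: "1_id", defName: "user1" },
  { id: 5, parentId: 1, dxId: "1_id/20", dxParentId: "12_id", defName: "user1" },
];

const seen = new Set();
const resultArr = data.flatMap((obj) => {
  const id = obj.dxParentId;
  // Skip nulls and anything we've already emitted.
  if (id === null || seen.has(id)) return [];
  seen.add(id);
  return [id];
});

console.log(resultArr); // ['1_id', '12_id']
```

Set.prototype.has is an average-constant-time lookup, just like the `in` check on the plain object.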
In my opinion, the best solution combines the constant-time lookup of an object with the scoping of reduce.
const resultArr = data.reduce(
  (acc, obj) => {
    const id = obj.dxParentId;
    const { seen, filtered } = acc;
    if (id in seen) return acc;
    if (id !== null) {
      seen[id] = true;
      filtered.push(id);
    }
    return acc;
  },
  { seen: {}, filtered: [] }
).filtered;
console.log(resultArr, "result"); // ['1_id', '12_id'] 'result'
This solution has the same time and space complexity as the flatMap version, but the seen object becomes eligible for garbage collection as soon as the reduce call finishes, so the extra memory isn't as large of a concern.
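For completeness, the same reduce pattern can carry a Set in its accumulator instead of a plain object; a sketch on an abbreviated sample of the question's data:

```javascript
// Abbreviated sample in the same shape as the question's data.
const data = [
  { id: 1, parentId: null, dxId: "1_id", dxParentId: null, defName: "group1" },
  { id: 8, parentId: 1, dxId: "1_id/13", dxParentId: "1_id", defName: "group4" },
  { id: 5, parentId: 1, dxId: "1_id/20", dxParentId: "1_id", defName: "user1" },
  { id: 5, parentId: 1, dxId: "1_id/20", dxParentId: "12_id", defName: "user1" },
];

// The accumulator bundles the Set used for O(1) lookups with the
// output array; only the array is kept at the end.
const resultArr = data.reduce(
  (acc, obj) => {
    const id = obj.dxParentId;
    if (id !== null && !acc.seen.has(id)) {
      acc.seen.add(id);
      acc.filtered.push(id);
    }
    return acc;
  },
  { seen: new Set(), filtered: [] }
).filtered;

console.log(resultArr); // ['1_id', '12_id']
```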