The structure of my object array looks like this:
[
  { _id: "id1", metadata: { data: ["somedata1"], link: [] } },
  { _id: "id2", metadata: { data: ["somedata2"], link: ["id2", "id3"] } },
  { _id: "id3", metadata: { data: ["somedata3"], link: ["id2", "id3"] } },
  { _id: "id4", metadata: { data: ["somedata4"] } }
]
As you can see, there is an optional link key, which connects objects to each other. I now need to convert each object into an array element, merging the linked objects into the same element. So the result should look like this:
[
  [
    { _id: "id1", metadata: { data: ["somedata1"], link: [] } }
  ],
  [
    { _id: "id2", metadata: { data: ["somedata2"], link: ["id2", "id3"] } },
    { _id: "id3", metadata: { data: ["somedata3"], link: ["id2", "id3"] } }
  ],
  [
    { _id: "id4", metadata: { data: ["somedata4"] } }
  ]
]
I think I would iterate through all objects, but I don't know how to merge the linked objects into one array element without getting duplicates.
const result = []
data.map(d => {
  if (!d.metadata.link?.length)
    result.push([d])
  else
    result.push(
      data.filter((item) => d.metadata.link.indexOf(item._id) !== -1)
    )
  // but this results in duplicate arrays, as id2 and id3 have the same link content
})
CodePudding user response:
First, if you just want to loop over the data array, it's better to use array.forEach rather than array.map, which is meant to return a new array of transformed data.
Then one idea is to add an if before pushing into the result array, to check whether the data has already been added, for example with array.some:
if (!result.some(oneArr => oneArr.some(oneData => oneData._id === d._id)))
const data = [
  { _id: "id1", metadata: { data: ["somedata1"], link: [] } },
  { _id: "id2", metadata: { data: ["somedata2"], link: ["id2", "id3"] } },
  { _id: "id3", metadata: { data: ["somedata3"], link: ["id2", "id3"] } },
  { _id: "id4", metadata: { data: ["somedata4"] } }
];

const result = [];
data.forEach(d => {
  if (!d.metadata.link?.length) {
    // no links: the object forms a group of its own
    result.push([d]);
  } else if (!result.some(oneArr => oneArr.some(oneData => oneData._id === d._id))) {
    // only push the linked group if this object hasn't been grouped yet
    result.push(
      data.filter((item) => d.metadata.link.indexOf(item._id) !== -1)
    );
  }
});
console.log(result);
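For the sample data above, this should log the grouped arrays, matching the expected result from the question (objects abbreviated here for readability):
[
  [ { _id: "id1", ... } ],
  [ { _id: "id2", ... }, { _id: "id3", ... } ],
  [ { _id: "id4", ... } ]
]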
CodePudding user response:
With the approach below, objects won't get duplicated: the data is grouped by its link values. It doesn't add much complexity either; it runs in O(N).
I hope this is what you're looking for.
const data = [
  { _id: "id1", metadata: { data: ["somedata1"], link: [] } },
  { _id: "id2", metadata: { data: ["somedata2"], link: ["id2", "id3"] } },
  { _id: "id3", metadata: { data: ["somedata3"], link: ["id2", "id3"] } },
  { _id: "id4", metadata: { data: ["somedata4"] } }
];

const result = data.reduce((acumD, d) => {
  const link = d?.metadata?.link;
  if (!link) {
    // no link key at all
    if (acumD['NONE']) {
      acumD['NONE'].push(d);
    } else {
      acumD['NONE'] = [d];
    }
  } else if (link.length === 0) {
    // link key present but empty
    if (acumD['EMPTY']) {
      acumD['EMPTY'].push(d);
    } else {
      acumD['EMPTY'] = [d];
    }
  } else {
    // group by the joined link ids
    const linkString = link.join(',');
    if (acumD[linkString]) {
      acumD[linkString].push(d);
    } else {
      acumD[linkString] = [d];
    }
  }
  return acumD;
}, {});

console.log('result', Object.values(result));
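To make the grouping concrete: before Object.values is applied, the accumulator for the sample data should look roughly like this (keys are the joined link ids, plus the EMPTY/NONE buckets; objects abbreviated):
{
  "EMPTY": [ { _id: "id1", ... } ],
  "id2,id3": [ { _id: "id2", ... }, { _id: "id3", ... } ],
  "NONE": [ { _id: "id4", ... } ]
}
Object.values then drops the keys and returns just the grouped arrays.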