Which is the best way to eliminate duplicated items of an array of arrays in javascript?


I have an array of arrays like this:

let array = [
  [402457.9590403921, 4621707.254796494],
  [402457.9590403921, 4621707.254796494],
  [402457.9590403921, 4621707.254796494],
  [402442.9590403921, 462145.254796494],
  [402442.9590403921, 462145.254796494],
]

and I would like to remove the repeated items. The result should be:

let array = [
  [402457.9590403921, 4621707.254796494],
  [402442.9590403921, 462145.254796494],
]

Thanks in advance

CodePudding user response:

Try this, which uses a Set to quickly record what has been seen so far. Note that `indexOf`/`includes` can't find duplicate inner arrays directly, because two distinct arrays with the same contents are never equal under `===`; serializing each inner array to a JSON string gives the Set a value it can compare, and it's much faster than re-scanning the array for every item:

const outerArr = [
  [402457.9590403921, 4621707.254796494],
  [402457.9590403921, 4621707.254796494],
  [402457.9590403921, 4621707.254796494],
  [402442.9590403921, 462145.254796494],
  [402442.9590403921, 462145.254796494]
];

const seen = new Set();
const newArr = outerArr.filter(innerArr => {
  const key = JSON.stringify(innerArr); // equal arrays produce equal strings

  if (seen.has(key)) {
    return false; // duplicate, drop it
  }

  seen.add(key);
  return true;
});

console.log(newArr);

CodePudding user response:

Convert each item to a string, then use a Set to get unique entries, then parse the string to get an array back.

const strings = array.map(a => JSON.stringify(a)); // one string per inner array
const unique = new Set(strings);                   // the Set drops equal strings
const result = Array.from(unique, s => JSON.parse(s));
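For reference, here is that idea as one complete, runnable snippet applied to the question's data (the variable names `strings` and `unique` come from the answer above; the rest is a sketch):

```javascript
// Deduplicate an array of arrays by round-tripping through JSON.
const array = [
  [402457.9590403921, 4621707.254796494],
  [402457.9590403921, 4621707.254796494],
  [402457.9590403921, 4621707.254796494],
  [402442.9590403921, 462145.254796494],
  [402442.9590403921, 462145.254796494],
];

const strings = array.map(a => JSON.stringify(a)); // stringify each inner array
const unique = new Set(strings);                   // Set keeps one copy of each string
const result = Array.from(unique, s => JSON.parse(s)); // parse back to arrays

console.log(result);
// → [[402457.9590403921, 4621707.254796494], [402442.9590403921, 462145.254796494]]
```

JSON.stringify produces the shortest decimal that round-trips for each double, so parsing the string back yields the exact same numbers.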

CodePudding user response:

Create a set of unique JSON representations of items in the array, and then map those back to objects/arrays.

let array = [
  [402457.9590403921, 4621707.254796494],
  [402457.9590403921, 4621707.254796494],
  [402457.9590403921, 4621707.254796494],
  [402442.9590403921, 462145.254796494],
  [402442.9590403921, 462145.254796494],
]

console.log([...new Set(array.map(JSON.stringify))].map(JSON.parse))
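All of the answers boil down to the same idea: derive a string key per inner array and keep only the first occurrence of each key. A sketch of that as a reusable helper (`uniqueByJson` is a made-up name, not a library function):

```javascript
// Keep the first occurrence of each item, where "same" means having the
// same JSON serialization. Works for arrays of arrays (or objects) of
// JSON-safe values; the order of first occurrences is preserved.
function uniqueByJson(items) {
  const seen = new Set();
  return items.filter(item => {
    const key = JSON.stringify(item);
    if (seen.has(key)) return false; // duplicate of an earlier item
    seen.add(key);
    return true;
  });
}

console.log(uniqueByJson([
  [402457.9590403921, 4621707.254796494],
  [402457.9590403921, 4621707.254796494],
  [402442.9590403921, 462145.254796494],
]));
// → [[402457.9590403921, 4621707.254796494], [402442.9590403921, 462145.254796494]]
```

One caveat: two nested objects whose keys are listed in different orders stringify differently, so this comparison is by serialized form, not deep structural equality.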
