Partition a list of strings with duplicates into different lists with no local duplicates


In Python, how can I partition a list of strings into several lists, each containing no duplicates, taking as many strings as possible in each pass until every item has been placed in a list?

Input:

["1", "2", "2", "3", "3", "3", "4", "5"]

Output:

["1", "2", "3", "4", "5"]
["2", "3"]
["3"]

How can I do this efficiently?

CodePudding user response:

Here's an easy way, provided the elements can be used as dict keys:

def partition(xs):
    from collections import Counter
    result = []
    c = Counter(xs)
    while c:
        uniq = list(c) # one of each key remaining
        result.append(uniq)
        c -= Counter(uniq) # and remove one of each of those
    return result

Then

>>> xs = ["1", "2", "2", "3", "3", "3", "4", "5"]
>>> partition(xs)
[['1', '2', '3', '4', '5'], ['2', '3'], ['3']]
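If the elements can't be used as dict keys (i.e. they are unhashable), a sorting-based variant achieves the same partition as long as the items are comparable. This is a sketch under that assumption, not part of the original answer; `partition_sortable` is a hypothetical name:

```python
from itertools import groupby

def partition_sortable(xs):
    """Partition comparable (possibly unhashable) items so that
    no sub-list contains a duplicate."""
    result = []
    # sorting brings equal items together; a group of n equal items
    # contributes one copy to each of the first n sub-lists
    for _, group in groupby(sorted(xs)):
        for i, item in enumerate(group):
            if i == len(result):
                result.append([])  # grow output lazily as deeper groups appear
            result[i].append(item)
    return result
```

This trades the `Counter` requirement (hashable keys) for a sortability requirement, at O(n log n) cost for the sort.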

CodePudding user response:

from collections import Counter

def spli(arr):
    if not arr:
        return []
    counts = Counter(arr)
    # the highest multiplicity determines how many lists are needed
    n_lists = counts.most_common(1)[0][1]
    final = [[] for _ in range(n_lists)]
    # an element occurring n times goes into the first n lists
    for item, count in counts.items():
        for i in range(count):
            final[i].append(item)
    return final

Output:

>>> spli(["1", "2", "2", "3", "3", "3", "4", "5" ])
[['1', '2', '3', '4', '5'], ['2', '3'], ['3']]