Why is the time complexity of this permutation algorithm n*n! and not n^3 * n!?


The author claims that the time complexity of this permutation algorithm:

from collections import deque

def find_permutations(nums):
    
    nums_l = len(nums)
    perms = deque()
    perms.append([])
    
    for num in nums:                     # (1)
        for _ in range(len(perms)):      # (2)
            perm = perms.popleft() 
            for j in range(len(perm) + 1): # (3)
                new_perm = list(perm)    # (4)
                new_perm.insert(j, num)  # (5)
                perms.append(new_perm)   # (6)
    
    return perms

is n*n!

But I don't understand why. Here is my computation of time complexity for each line of code:

(1): len(nums) ~ n
(2): len(perms) ~ n!
(3): len(perm) ~ n
(4): n
(5): n
(6): n

So total Time complexity is:

n * n! * n * (n + n + n) ~ n^3 * n!

Am I wrong?

CodePudding user response:

Before getting to the main issue, there is a minor error at step (6): append is O(1), not O(n). But that doesn't influence the overall complexity.

Your analysis goes wrong at step (2):

for _ in range(len(perms)):

This does not perform O(n!) iterations in every pass. The first time it executes only 1 iteration, the second time still 1 iteration, the third time 2 iterations, then 2 * 3, then 2 * 3 * 4, ... and only the last time (n-1)! iterations.
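
You can confirm those counts empirically with a small instrumented version of the algorithm; this is just a sketch, and the helper name and the choice of n are mine:

from collections import deque

def count_loop2_iterations(n):
    # Count how many times loop (2) runs in each pass of loop (1).
    perms = deque([[]])
    counts = []
    for num in range(n):                    # loop (1)
        counts.append(len(perms))           # iterations loop (2) will do in this pass
        for _ in range(len(perms)):         # loop (2)
            perm = perms.popleft()
            for j in range(len(perm) + 1):  # loop (3)
                new_perm = list(perm)
                new_perm.insert(j, num)
                perms.append(new_perm)
    return counts

print(count_loop2_iterations(5))  # [1, 1, 2, 6, 24], i.e. (k-1)! in pass k, not n!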

Then there is also an overestimation at step (3):

for j in range(len(perm) + 1)

The size of perm is initially small and only grows with each iteration of the outer loop (1): in the first iteration the permutation(s) have size 0, in the next iteration of loop (1) they have size 1, ... then 2, ... up to n-1.

So for the total number of iterations of the inner loop, we have this sum:

1 * 1 + 1 * 2 + 2 * 3 + (2 * 3) * 4 + ... + k! + ... + n!

An easier way to look at this is to realise that, at each iteration of the outer loop, this code produces all permutations that are one item longer than in its previous iteration. So in the first iteration you get all permutations of size 1 (using the first input value), in the second iteration you get all permutations of size 2 (using the first two input values), ...etc. It is a bottom-up approach. Also notice that popleft is executed just the right number of times to remove all permutations of the previous (outer) iteration, while append is executed for all new permutations (which only get popped in the next outer iteration).

So we can now easily see that append is executed this number of times:

1! + 2! + 3! + 4! + ... + n! = O(n!)

If we now factor in the complexity of step (4) (which "absorbs" the complexity of step (5)), we get:

1·1! + 2·2! + 3·3! + 4·4! + ... + n·n! = O(n·n!)
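
If you want to check these two sums empirically, here is a minimal instrumented sketch of the same algorithm (the helper name, the counters and the choice of n are just for illustration):

from collections import deque
from math import factorial

def count_work(nums):
    # Instrumented copy of the algorithm: count appends and copied elements.
    perms = deque([[]])
    appends = 0   # executions of step (6)
    copied = 0    # elements in each new permutation: a proxy for steps (4) + (5)
    for num in nums:
        for _ in range(len(perms)):
            perm = perms.popleft()
            for j in range(len(perm) + 1):
                new_perm = list(perm)      # step (4)
                new_perm.insert(j, num)    # step (5)
                perms.append(new_perm)     # step (6)
                appends += 1
                copied += len(new_perm)
    return appends, copied

n = 7
appends, copied = count_work(range(n))
print(appends, sum(factorial(k) for k in range(1, n + 1)))      # 5913 5913
print(copied,  sum(k * factorial(k) for k in range(1, n + 1)))  # 40319 40319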

CodePudding user response:

This algorithm generates permutations by inserting each number into all possible positions of the existing permutations.

The first loop clearly runs O(n) times.

Let's see what's happening in the other loops. Their cost is dominated by the final iteration of the first loop (the worst case), where they are:

  1. going over all (n-1)! permutations (generated in the previous iterations) - O((n-1)!)
  2. for each of them, inserting the new number at each of its n possible positions, copying the permutation every time - O(n) per copy

In total that is (n-1)! * n * n = n * n! for the final iteration, and all the earlier iterations together cost much less,

so it is O(n * n!) overall
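
A rough way to see that the final iteration really dominates is to measure the copy work done in each pass of the outer loop; the helper below is just a sketch (name and n chosen for illustration):

from collections import deque

def work_per_pass(n):
    # Elements copied into new permutations during each pass of the outer loop.
    per_pass = []
    perms = deque([[]])
    for num in range(n):
        work = 0
        for _ in range(len(perms)):
            perm = perms.popleft()
            for j in range(len(perm) + 1):
                new_perm = list(perm)
                new_perm.insert(j, num)
                perms.append(new_perm)
                work += len(new_perm)
        per_pass.append(work)
    return per_pass

print(work_per_pass(6))  # [1, 4, 18, 96, 600, 4320]: the last pass alone is n * n!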
