Complexity analysis for the permutations algorithm


I'm trying to understand the time and space complexity of an algorithm for generating all permutations of an array. Given a partially built permutation where k out of n elements are already selected, the algorithm selects element k+1 from the remaining n-k elements and calls itself to select the remaining n-k-1 elements:

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public static List<List<Integer>> permutations(List<Integer> A) {
    List<List<Integer>> result = new ArrayList<>();
    permutations(A, 0, result);
    return result;
}

public static void permutations(List<Integer> A, int start, List<List<Integer>> result) {
    // Base case: only one element remains, so the permutation is complete.
    if (A.size() - 1 == start) {
        result.add(new ArrayList<>(A));
        return;
    }

    for (int i = start; i < A.size(); i++) {
        Collections.swap(A, start, i);       // choose element i for position start
        permutations(A, start + 1, result);  // permute the remaining elements
        Collections.swap(A, start, i);       // undo the swap (backtrack)
    }
}
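For concreteness, here is the method (with the increments restored) run on a three-element list; the wrapper class name `PermutationsDemo` is just for this example:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class PermutationsDemo {
    public static List<List<Integer>> permutations(List<Integer> A) {
        List<List<Integer>> result = new ArrayList<>();
        permutations(A, 0, result);
        return result;
    }

    static void permutations(List<Integer> A, int start, List<List<Integer>> result) {
        // Base case: only one element remains, so the permutation is complete.
        if (A.size() - 1 == start) {
            result.add(new ArrayList<>(A));
            return;
        }
        for (int i = start; i < A.size(); i++) {
            Collections.swap(A, start, i);       // choose element i for position start
            permutations(A, start + 1, result);  // permute the remaining elements
            Collections.swap(A, start, i);       // undo the swap (backtrack)
        }
    }

    public static void main(String[] args) {
        List<List<Integer>> result = permutations(new ArrayList<>(Arrays.asList(1, 2, 3)));
        // 3! = 6 permutations, each of length 3, so 18 integers are written in total.
        System.out.println(result.size() + " permutations: " + result);
    }
}
```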

My thoughts are that in each call we swap the collection's elements 2n times, where n is the number of elements left to permute, and make n recursive calls. So the running time seems to fit the recurrence relation T(n) = nT(n-1) + n = n[(n-1)T(n-2) + (n-1)] + n = ... = n + n(n-1) + n(n-1)(n-2) + ... + n! = n![1/(n-1)! + 1/(n-2)! + ... + 1] ≈ n!·e, hence the time complexity is O(n!) and the space complexity is O(max(n!, n)), where n! is the total number of permutations and n is the height of the recursion tree.

This problem is taken from the Elements of Programming Interviews book, which says the time complexity is O(n·n!) because "The number of function calls C(n) = 1 + nC(n-1) ... [which solves to] O(n!) ... [and] ... we do O(n) computation per call outside of the recursive calls".

Which time complexity is correct?

CodePudding user response:

The time complexity of this algorithm, counted by the number of basic operations performed, is Θ(n · n!). Think about the size of the result list when the algorithm terminates: it contains n! permutations, each of length n, and we cannot create a list with n · n! total elements in less time than that. The space complexity is the same, since the recursion stack only ever holds O(n) frames at a time, so the size of the output list dominates the space complexity.

If you count only the number of recursive calls to permutations(), the function is called O(n!) times, although this is usually not what is meant by 'time complexity' without further specification. In other words, you can generate all permutations in O(n!) time, as long as you don't read or write those permutations after they are generated.
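One way to check the call-count claim empirically is to instrument the function with a counter and compare the measured count against the book's recurrence C(n) = 1 + nC(n-1). The counter field and class name below are mine, not part of the original code:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class CallCounter {
    static long calls;

    static void permutations(List<Integer> A, int start, List<List<Integer>> result) {
        calls++;  // count every invocation, including the base case
        if (A.size() - 1 == start) {
            result.add(new ArrayList<>(A));
            return;
        }
        for (int i = start; i < A.size(); i++) {
            Collections.swap(A, start, i);
            permutations(A, start + 1, result);
            Collections.swap(A, start, i);
        }
    }

    // The recurrence from the book: C(1) = 1, C(n) = 1 + n * C(n-1).
    static long c(int n) {
        return n == 1 ? 1 : 1 + n * c(n - 1);
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 7; n++) {
            List<Integer> a = new ArrayList<>();
            for (int v = 0; v < n; v++) a.add(v);
            calls = 0;
            permutations(a, 0, new ArrayList<>());
            // Measured call count matches the recurrence exactly.
            System.out.println("n=" + n + "  measured=" + calls + "  C(n)=" + c(n));
        }
    }
}
```

For n = 3 this gives 10 calls (1 + 3·(1 + 2·1)), which is O(n!) but not n · n!, illustrating the distinction drawn above.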

The part where your derivation of the run-time breaks down is in the definition of T(n). If you define T(n) as 'the run-time of permutations(A, start) when the input, A, has length n', then you cannot define it recursively in terms of T(n-1) or any other value of T(), because the length of the input in every recursive call is still n, the length of A.

A more useful way to define T(n) is by specifying it as the run-time of permutations(A', start), when A' is any permutation of a fixed, initial array A, and A.length - start == n. It's easy to write the recurrence relation here:

T(x) = x * T(x-1) + O(x)     if x > 1
T(1) = A.length

This takes into account the fact that the last recursive call, T(1), has to perform O(A.length) work to copy that array to the output, and this new recurrence gives the result from the textbook.
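To see that this recurrence really is Θ(n · n!), one can evaluate it numerically, taking the O(x) term as exactly x (an assumption for illustration; any constant factor gives the same asymptotics) and T(1) = n for the copy at the leaf:

```java
public class RecurrenceCheck {
    // T(1) = n (copying one permutation of length n to the output);
    // T(x) = x * T(x-1) + x for x > 1, taking the O(x) term as exactly x.
    static long t(int x, int n) {
        return x == 1 ? n : x * t(x - 1, n) + x;
    }

    static long factorial(int n) {
        long f = 1;
        for (int i = 2; i <= n; i++) f *= i;
        return f;
    }

    public static void main(String[] args) {
        for (int n = 2; n <= 12; n++) {
            long tn = t(n, n);
            long nFact = n * factorial(n);
            // The ratio stays bounded (it tends toward 1), so T(n) = Theta(n * n!).
            System.out.printf("n=%2d  T(n)=%d  n*n!=%d  ratio=%.3f%n",
                              n, tn, nFact, (double) tn / nFact);
        }
    }
}
```

The n! leaves contribute n · n! for the copies alone, and the internal overhead adds only O(n!) on top, so the ratio T(n) / (n · n!) stays between 1 and 2 for every n tried.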
