python time complexity quick sort algorithm


def partition(A, l, r):
    # Take A[l] as the pivot and move it to the end of the range.
    p = A[l]
    A[l], A[r] = A[r], A[l]
    s = l  # boundary of the region holding elements <= pivot

    for i in range(l, r):
        if A[i] <= p:
            # Grow the "<= pivot" region by swapping A[i] into it.
            A[i], A[s] = A[s], A[i]
            s += 1

    # Put the pivot back between the two regions and return its index.
    A[s], A[r] = A[r], A[s]

    return s

def quicksort(A, l, r):
    if l < r:
        # Partition, then recursively sort the parts on each side of the pivot.
        q = partition(A, l, r)
        quicksort(A, l, q - 1)
        quicksort(A, q + 1, r)
    return A

I've written what I think is the quicksort algorithm. I've noticed that the time complexity of partition is O(n) because of the for loop, and the complexity of quicksort itself also seems to be at least O(n). The question: how is it possible for the entire code to have a total time complexity of O(n log n)?
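
As a quick sanity check, here is one way the functions above could be driven (a minimal sketch; the list contents and the index range are just example values):

A = [7, 2, 9, 4, 3, 8, 6, 1]
# Sort the whole list in place: l = 0, r = last index.
quicksort(A, 0, len(A) - 1)
print(A)  # [1, 2, 3, 4, 6, 7, 8, 9]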

CodePudding user response:

The log n comes from the recursion.
Each recursive call splits the array around the pivot, so (on average) the recursion is about log n levels deep, and at each level the for loop inside partition does O(n) work in total across all the sub-arrays at that level. Multiplying the two gives O(n log n).

How does the log n come from the recursion? Every time you call the function, it calls itself again 2 times, once for each half (see the sketch after the example below).
Example:

  • Calling it once, with no recursion, you run the for loop 2^0 times / 1 time
  • Calling it with one level of recursion, you run the for loop 2^1 times / 2 times (it would actually be 3 calls in total, but those inner loops don't each iterate over all n elements)
  • Calling it with 2 levels of recursion, you run the for loop 2^2 times / 4 times
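
To make that concrete, here is a small instrumented sketch (not from the original post; the names quicksort_instrumented, max_depth and loop_iterations are made up for illustration). It counts the recursion depth and the total number of partition-loop iterations; for a random list of n elements the depth stays around a small multiple of log2(n), while the total loop work stays on the order of n * log2(n):

import math
import random

def quicksort_instrumented(A, l, r, depth=0, stats=None):
    # stats is a hypothetical dict used only to count work for this demo.
    if stats is None:
        stats = {"max_depth": 0, "loop_iterations": 0}
    if l < r:
        stats["max_depth"] = max(stats["max_depth"], depth)
        # Same partition logic as above, inlined so the loop can be counted.
        p = A[l]
        A[l], A[r] = A[r], A[l]
        s = l
        for i in range(l, r):
            stats["loop_iterations"] += 1  # one unit of partition work
            if A[i] <= p:
                A[i], A[s] = A[s], A[i]
                s += 1
        A[s], A[r] = A[r], A[s]
        quicksort_instrumented(A, l, s - 1, depth + 1, stats)
        quicksort_instrumented(A, s + 1, r, depth + 1, stats)
    return stats

n = 1024
A = random.sample(range(10 * n), n)
stats = quicksort_instrumented(A, 0, n - 1)
print("n           =", n)
print("log2(n)     =", round(math.log2(n), 1))
print("max depth   =", stats["max_depth"])        # usually a small multiple of log2(n)
print("loop work   =", stats["loop_iterations"])  # on the order of n * log2(n) on average
print("n * log2(n) =", round(n * math.log2(n)))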

CodePudding user response:

You can check out the link; I hope it helps.
