Question about quicksort and its worst-case time complexity, assuming you always choose the middle pivot


If you pick a pivot that is either the smallest or largest element, the next recursive call has size n - 1. If this keeps happening, so the subsequent calls have size n - 2, n - 3 and so on, you get quicksort's worst-case time complexity of O(n^2).
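(For reference, that pattern makes the partitioning work add up to (n - 1) + (n - 2) + ... + 1 = n(n - 1)/2 comparisons in total, which is where the O(n^2) bound comes from.)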

The thing is, to get this pattern of n - 1, n - 2, n - 3, etc., you need to pick a bad pivot every time, meaning the pivot always has to be the largest or smallest element. My question is: assuming you always pick the middle element as the pivot, what ordering of, e.g., {2, 3, 1, 6, 5, 7} would yield quadratic complexity?

Through some testing, I discovered that {4, 5, 6, 7, 3, 2, 1} seems to follow the worst-case pattern (again, always choosing the middle pivot) up until the call of size n - 2, where it splits into an even-sized and an odd-sized partition. From what I understand, the total time complexity is the sum of the work done at each partitioning level. So what I want to know is: does {4, 5, 6, 7, 3, 2, 1} still produce the worst-case time complexity even though it does not follow the n - 1 pattern all the way through? And if it doesn't, how would I go about finding an ordering of a sequence of sorted integers that does, with the restriction that I always choose the middle pivot?
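One practical way to test a candidate ordering is to instrument a middle-pivot quicksort with a comparison counter and compare the total against n(n - 1)/2. Below is a rough Python sketch (the name quicksort_count is just illustrative), assuming a Hoare-style partition with pivot = array[(lo + hi)/2]; whether a given input actually hits the quadratic pattern depends on the exact partition scheme, so treat the counts as specific to this implementation.

def quicksort_count(a, lo, hi):
    # Middle-pivot quicksort with a Hoare-style partition.
    # Returns the number of element comparisons, so the total can be
    # checked against n(n - 1)/2 for a worst-case run.
    if lo >= hi:
        return 0
    pivot = a[(lo + hi) // 2]
    i, j, comps = lo, hi, 0
    while i <= j:
        while a[i] < pivot:
            i += 1
            comps += 1
        comps += 1                 # the comparison that stopped the left scan
        while a[j] > pivot:
            j -= 1
            comps += 1
        comps += 1                 # the comparison that stopped the right scan
        if i <= j:
            a[i], a[j] = a[j], a[i]
            i += 1
            j -= 1
    return comps + quicksort_count(a, lo, j) + quicksort_count(a, i, hi)

data = [4, 5, 6, 7, 3, 2, 1]
n = len(data)
print(quicksort_count(data, 0, n - 1), "comparisons; n(n - 1)/2 =", n * (n - 1) // 2)

For a genuine worst case the count should come out close to n(n - 1)/2, whereas balanced splits keep it around n log2 n.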

Any help would be greatly appreciated; I have been stuck on this problem for quite a while!

CodePudding user response:

Using the question's method, and assuming pivot = array[(lo + hi)/2], then for a pattern where the pivot is always the smallest element, one such sequence is decreasing even numbers followed by increasing odd numbers:

{6, 4, 2, 1, 3, 5, 7}
{6, 4, 2, 3, 5, 7}
{6, 4, 3, 5, 7}
{6, 4, 5, 7}
{6, 5, 7}
{6, 7}
{7}
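If it helps to verify this kind of pattern mechanically, here is a small Python sketch (the name trace_middle_pivots is made up) that mirrors the hand trace above: it repeatedly looks at the middle element of whatever is left, reports whether it is the current minimum, and then drops it. Note that this follows the simplified "drop the pivot, keep the rest in order" view used in the trace; a real partition routine may reorder the surviving elements, so whether a given input is worst case for your code still depends on the partition scheme.

def trace_middle_pivots(a):
    # Mirror the trace above: take the middle element of what remains
    # (pivot = array[(lo + hi)/2] with lo = 0, hi = len(a) - 1),
    # report whether it is the current minimum, then remove it.
    a = list(a)
    while a:
        pivot = a[(len(a) - 1) // 2]
        print(a, "-> pivot", pivot, "(smallest)" if pivot == min(a) else "(not smallest)")
        a.remove(pivot)

trace_middle_pivots([6, 4, 2, 1, 3, 5, 7])

Running it on {6, 4, 2, 1, 3, 5, 7} should print the same shrinking lists as above, with the pivot flagged as the smallest at every step.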