Complexity of T(n) = 2T(n/2) + n/2 (without the master theorem)?


I am looking at the best-case running time for merge sort, and have found the following recurrence relation: T(n) = 2T(n/2) + n/2. I am aware that merge sort is theta(n log n) in all cases. In attempting to solve this recurrence relation, I use telescoping:

T(n) = 2*T(n/2) + n/2
T(n) = 2^2*T(n/4) + n/4 + n/2
T(n) = 2^k*T(1) + (n/2 + n/4 + ... + n/2^k)
2^k = n -> log_2(n) = k
T(n) = n + n*(1/2 + 1/4 + ... + 1/n)

I am unsure how to solve the summation in the last part, and I'm not even sure that it is correct. My thinking is that there would be log_2(n) total terms in the summation. How can I derive that T(n) = 2T(n/2) + n/2 is theta(n log n) without using the master theorem?

CodePudding user response:

As pointed out in the comments, your calculation is wrong: unrolling one level multiplies the number of subproblems by 2 but halves the cost of each, so every level contributes n/2 in total.

T(n) = 2*T(n/2) + n/2
T(n) = 2*(2*T(n/4) + n/4) + n/2 = 4*T(n/4) + 2*(n/4) + n/2 = 4*T(n/4) + 2*(n/2)
T(n) = 4*(2*T(n/8) + n/8) + 2*(n/2) = 8*T(n/8) + (n/2) + 2*(n/2) = 8*T(n/8) + 3*(n/2)
...
T(n) = 2^k * T(n / 2^k) + k*(n/2),       2^k = n --->  k = log(n)
T(n) = n * T(1) + log(n) * (n/2)
T(n) = n + n*log(n)/2          (taking T(1) = 1)

Therefore T(n) = Θ(n*log(n)), which matches the known complexity of merge sort.
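If you want to convince yourself the closed form is right, you can evaluate the recurrence directly for powers of two and compare it against n*T(1) + (n/2)*log_2(n). This is just a sanity-check sketch; the base case T(1) = 1 is an assumption (any constant works, it only shifts the linear term):

```python
def T(n):
    """Evaluate T(n) = 2*T(n/2) + n/2 directly, for n a power of two.

    Assumed base case: T(1) = 1.
    """
    if n == 1:
        return 1
    return 2 * T(n // 2) + n // 2

# Closed form derived above: T(n) = n*T(1) + (n/2)*log_2(n)
for k in range(1, 11):
    n = 2 ** k
    closed = n + (n // 2) * k   # k = log_2(n)
    assert T(n) == closed, (n, T(n), closed)
```

The exact agreement for every power of two confirms that the n/2 term really does contribute (n/2) per level over log_2(n) levels, giving Θ(n log n).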
