What is the time complexity of this recursion?

Time:04-18

def s_r(n):
    if(n==1):
       return 1
    temp = 1
    for i in range(int(n)):
        temp += i
    return temp * s_r(n/2) * s_r(n/2) * s_r(n/2) * s_r(n/2) 

Using a recursion tree, what is the Big O? And how do we rewrite this function to make only one recursive call?

I could only do the first part, for which I got O(n^2). This was one of my exam questions, and I would like to know the answer and some guidance. Thank you

CodePudding user response:

As @kcsquared said in the comments, I also believe this function is O(n²). The code can be refactored into one recursive call by storing the result of that call (or simply raising it to the fourth power). You can also replace the loop with the built-in sum:

def s_r(n):
    if n == 1:
        return 1
    return (1 + sum(range(int(n)))) * s_r(n // 2) ** 4
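One way to verify such a refactor is to compare it against the loop-based original for a range of inputs. A quick sanity check (the reference version below uses `//` so it terminates for all integers, and note that `temp` ends up as `1 + sum(range(n))`, not `sum(range(n + 1))`):

```python
def s_r_ref(n):
    # reference: the original four-call structure, with // so it terminates
    if n == 1:
        return 1
    temp = 1
    for i in range(n):
        temp += i
    return temp * s_r_ref(n // 2) ** 4

def s_r_fast(n):
    # single recursive call; (1 + sum(range(n))) matches temp above
    if n == 1:
        return 1
    return (1 + sum(range(n))) * s_r_fast(n // 2) ** 4

for n in range(1, 100):
    assert s_r_fast(n) == s_r_ref(n)
```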

CodePudding user response:

First, note that the program is incorrect: n/2 is floating-point division in Python (since Python 3), so the program does not terminate unless n is a power of 2 (or eventually rounds to a power of 2). A corrected version of the program that works for all integers n >= 1 is this:

def s_r(n):
    if(n==1):
       return 1
    temp = 1
    for i in range(n):
        temp += i
    return temp * s_r(n//2) * s_r(n//2) * s_r(n//2) * s_r(n//2) 

If T(n) is the number of arithmetic operations performed by the function, then we have the recurrence relation:

T(1) = 0
T(n) = n + 4T(n//2)

T(n) is Theta(n^2) -- for n a power of 2, telescoping gives: n + 4(n/2) + 16(n/4) + ... = 1n + 2n + 4n + ... + (n/2)n = (n-1)n
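The closed form is easy to confirm numerically by evaluating the recurrence exactly as written (a small check, nothing more):

```python
def T(n):
    # T(1) = 0, T(n) = n + 4*T(n//2): arithmetic operations done by s_r
    return 0 if n == 1 else n + 4 * T(n // 2)

# for powers of two the recurrence telescopes to n*(n-1), i.e. Theta(n^2)
for m in range(13):
    n = 2 ** m
    assert T(n) == n * (n - 1)
```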

We can rewrite the program to use Theta(log n) arithmetic operations. First, the temp variable is 1 + 0 + 1 + ... + (n-1) = n(n-1)/2 + 1. Second, we can avoid making the same recursive call 4 times.

def s_r(n):
    return (1 + n * (n-1) // 2) * s_r(n//2) ** 4 if n > 1 else 1
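To see the logarithmic behaviour concretely, one can count the recursive calls with a hypothetical instrumented wrapper (not part of the answer's code): for n = 2**m the call chain is n, n//2, ..., 1, i.e. exactly m + 1 calls.

```python
def s_r_counted(n, counter):
    # same single-call recursion, but tallying invocations in counter[0]
    counter[0] += 1
    if n == 1:
        return 1
    return (1 + n * (n - 1) // 2) * s_r_counted(n // 2, counter) ** 4

for m in range(16):
    c = [0]
    s_r_counted(2 ** m, c)
    assert c[0] == m + 1   # Theta(log n) recursive calls
```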

Back to complexity, I have been careful to say that the first function does Theta(n^2) arithmetic operations and the second Theta(log n) arithmetic operations, rather than use the expression "time complexity". The result of the function grows FAST, so it's not practical to assume arithmetic operations run in O(1) time. If we print the length of the result and the time taken to compute it (using the second, faster version of the code) for powers of 2 using this...

import math
import timeit

def s_r(n):
    return (1 + n * (n-1) // 2) * s_r(n//2) ** 4 if n > 1 else 1


for i in range(16):
    n = 2 ** i
    start = timeit.default_timer()
    r = s_r(n)
    end = timeit.default_timer()
    # note: math.log is the natural log, so 'digits' counts ln-based digits
    print('n=2**%d,' % i, 'digits=%d,' % (int(math.log(r)) + 1), 'time=%.3gsec' % (end - start))

... we get this table, in which you can see the number of digits in the result grows FAST (s_r(2**14) has 101 million digits for example), and the time measured does not grow with log(n), but when n is doubled the time increases by something like a factor of 10, so it grows something like n^3 or n^4.

n=2**0, digits=1, time=6e-07sec
n=2**1, digits=1, time=1.5e-06sec
n=2**2, digits=5, time=9e-07sec
n=2**3, digits=23, time=1.1e-06sec
n=2**4, digits=94, time=2.1e-06sec
n=2**5, digits=382, time=2.6e-06sec
n=2**6, digits=1533, time=3.8e-06sec
n=2**7, digits=6140, time=3.99e-05sec
n=2**8, digits=24569, time=0.000105sec
n=2**9, digits=98286, time=0.000835sec
n=2**10, digits=393154, time=0.00668sec
n=2**11, digits=1572628, time=0.0592sec
n=2**12, digits=6290527, time=0.516sec
n=2**13, digits=25162123, time=4.69sec
n=2**14, digits=100648510, time=42.1sec
n=2**15, digits=402594059, time=377sec

Note that it's not wrong to say the time complexity of the original function is O(n^2) and the improved version of the code O(log n), it's just that these describe a measure of the program (arithmetic operations) and are not at all useful as an estimate of actual program running time.
