I am having trouble reasoning about the time complexity of this. I was writing a backtracking function to solve a problem. To simplify, let's just say I have a list of size "a" and I am allowed to place a 0 or a 1 into each element of the list. After trying all combinations, I return. This is clearly O(2^a).
However, what if during each recursive call I also have a constant amount of work to do? I am stuck reasoning through what the complexity is in that case. Can you point me to sources? From my undergrad studies all I remember is the Master theorem, but it doesn't apply here. (We subtract rather than divide to get the subproblem.)
def myfunc(x, a):
    if x == a:
        return
    myfunc2()  # some constant-time work
    myfunc(x + 1, a)
    myfunc(x + 1, a)
CodePudding user response:
In your case, the recurrence is T(n) = m + 2T(n - 1), where m is the constant work done per call. Although we can't apply the Master theorem here, we can unroll it by substitution:

T(n) = m + 2T(n - 1)
     = m + 2(m + 2T(n - 2))
     = m + 2m + 4(m + 2T(n - 3))
     = ...
     = ∑(i = 0 to n - 1) m·2^i + 2^n·T(0)

Evaluating this, we have m(2^n - 1) plus the base-case term, or Θ(2^n).
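As a quick sanity check of that closed form, here is a sketch that counts calls empirically. The `counter` dict is instrumentation added for illustration; it is not part of the original code. With this recursion, total calls should be 2^(a+1) - 1 and constant-work steps 2^a - 1, both Θ(2^a):

```python
def myfunc(x, a, counter):
    counter["calls"] += 1
    if x == a:
        return
    counter["work"] += 1  # stands in for myfunc2()'s constant-time work
    myfunc(x + 1, a, counter)
    myfunc(x + 1, a, counter)

for a in range(1, 12):
    counter = {"calls": 0, "work": 0}
    myfunc(0, a, counter)
    assert counter["calls"] == 2 ** (a + 1) - 1  # all nodes of the call tree
    assert counter["work"] == 2 ** a - 1          # internal nodes only
```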
Recursion doesn't really offer you any benefit here beyond readability. You could see savings if you combined this with memoization of what you've already evaluated, however. In that case, evaluating the time complexity becomes more... complex.
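To illustrate the memoization point: since this particular myfunc's behaviour depends only on its arguments, caching it (sketched here with functools.lru_cache; the calls counter is added for illustration) collapses each duplicated subtree, dropping the call count from Θ(2^a) to Θ(a):

```python
from functools import lru_cache

calls = 0

@lru_cache(maxsize=None)  # valid only because the call is a pure function of (x, a)
def myfunc(x, a):
    global calls
    calls += 1
    if x == a:
        return
    myfunc(x + 1, a)
    myfunc(x + 1, a)  # cache hit: the duplicate subtree is never re-expanded

myfunc(0, 30)
assert calls == 31  # a + 1 distinct calls instead of 2^(a+1) - 1
```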
CodePudding user response:
The time complexity is obviously exponential, and the myfunc2() contribution is negligible unless x is less than 2. Maybe if myfunc2() is searching for 42.