I've been trying to solve a problem (I'm new to recursion) and here's the given code:
def f(n):
    print(n)
    if (0 <= n) and (n <= 1000000000):
        f(n-2)
        f(n+5)
        f(n+7)
I need to count how many unique numbers are printed when f(0) is called. First I tried appending each unique number to a list instead of printing everything, but that raised 'RecursionError: maximum recursion depth exceeded in comparison'. Then I raised the value of sys.setrecursionlimit and the code ran without the error, but it printed nothing:
import sys
sys.setrecursionlimit(2000000000)

lst = []

def f(n):
    if n not in lst:
        lst.append(n)
    if (0 <= n) and (n <= 1000000000):
        f(n-2)
        f(n+5)
        f(n+7)

f(0)
print(len(lst))
What should I do to solve the problem?
CodePudding user response:
The code looks like it's going to blow up no matter how high you set the recursion limit, since the number of calls grows exponentially with each level: every in-range value spawns three more calls, and nothing stops the same values from being revisited. (Some calls stop immediately, such as f(n-2) from the very first call, but most don't.) The depth itself is also unbounded, because the values cycle (for example 0 → 5 → 3 → 1 → 6 → 4 → 2 → 0), so raising the limit only postpones the crash. After a few dozen levels the interpreter may already have billions of calls to keep track of.
To me it looks more like a mathematical puzzle that could maybe be solved with pen and paper. Running the code doesn't seem to be feasible.
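If you want to experiment anyway, a non-recursive worklist with a visited set avoids the depth problem entirely. Below is a minimal sketch of that idea; LIMIT and the helper name count_printed are my own placeholders, and the real bound of 1000000000 would still be far too large for an in-memory Python set, so this is only useful for exploring small bounds and spotting the pattern by hand.

LIMIT = 100   # assumed small bound for experimentation; the real problem uses 1000000000

def count_printed(limit):
    # Iterative replacement for the recursion: a stack of values still to process
    # and a set of values that have already been "printed".
    seen = set()
    stack = [0]                      # corresponds to calling f(0)
    while stack:
        n = stack.pop()
        if n in seen:
            continue
        seen.add(n)                  # f prints n unconditionally, so every reached n counts
        if 0 <= n <= limit:          # only in-range values make further calls
            stack.extend((n - 2, n + 5, n + 7))
    return seen

printed = count_printed(LIMIT)
print(len(printed))                  # number of unique values for the small bound
print(sorted(printed)[:20])          # inspect the small cases to look for a pattern

Running this for a few small values of LIMIT should make the reachable set obvious, and from there the count for the full range can be worked out on paper.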