Python Time Complexity of a small program


I want to understand how to calculate the time complexity of the following algorithm. Basically, it is just a random-number generator function that takes a single parameter, size, and keeps appending random numbers to a list while the length of the list is different from the size parameter.

I am using random.randrange and datetime.

I am trying to understand how I could calculate the approximate time it will take to run, instead of actually running it. What would be the actual math with these parameters?

I ran some tests with inputs 1, 10, 100, 1000, 10000, and 100000. Here are my results in a table:

Input     Time (seconds)
1         0.000016
10        0.000022
100       0.000081
1000      0.000690
10000     0.007082
100000    0.068690
1000000   still running while I write this

From what I understand, the time complexity grows linearly with N, but how could I do the math to get an approximate time without actually computing it?
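For example, if the time really is c * N for some constant c, then from my table c ≈ 0.068690 / 100000 ≈ 6.9e-07 seconds per element, which would predict roughly 0.7 seconds for N = 1,000,000. Is that the right way to do the math?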

Here's the code:

from random import randrange
from datetime import datetime

def randomRegisteredVisits(size):
    randomNumbersList = []
    # Append one random number per iteration until the list reaches `size`.
    while len(randomNumbersList) != size:
        randomNumbersList.append(randrange(0, size))
    return randomNumbersList

start_time = datetime.now()
miTestofRandom = randomRegisteredVisits(1000000)
end_time = datetime.now()
print('Duration: {}'.format(end_time - start_time))

CodePudding user response:

The time complexity is O(n), so it is linear.

It's linear because there is a single loop that runs once for each element of the result. If there were a loop inside the loop, the time complexity would be O(n*n), i.e. O(n^2).
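As a minimal sketch of the difference (the function names here are made up for illustration):

def single_loop(n):
    # One loop doing constant work per iteration: n iterations -> O(n)
    total = 0
    for i in range(n):
        total += i
    return total

def nested_loops(n):
    # A loop inside a loop: n * n iterations in total -> O(n^2)
    total = 0
    for i in range(n):
        for j in range(n):
            total += i * j
    return total

Since each iteration of your loop does a constant amount of work (one randrange call and one list append), the total time is roughly a constant times n, which matches the roughly tenfold jumps between the rows of your table.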

CodePudding user response:

Maybe you can use the Python scikit-learn library and fit a linear regression on a table like the one you posted, with the input size as X and the time as Y; then you could "predict" the computing time of the algorithm from those features. I hope this helps you.
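As a minimal sketch of that idea, using the timings from the question as training data (this assumes scikit-learn and NumPy are installed, and the prediction only applies to the machine the timings came from):

import numpy as np
from sklearn.linear_model import LinearRegression

# Input sizes (X) and measured times in seconds (y), taken from the question.
X = np.array([[1], [10], [100], [1000], [10000], [100000]])
y = np.array([0.000016, 0.000022, 0.000081, 0.000690, 0.007082, 0.068690])

model = LinearRegression().fit(X, y)

# Extrapolate to an input size of 1,000,000.
predicted = model.predict(np.array([[1000000]]))[0]
print('predicted time for n=1,000,000: {:.3f} s'.format(predicted))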
