How to calculate the big O notation given time taken to execute?


I've managed to find articles and videos explaining how to calculate the big-O notation of functions, but I can't find one that explains how to work with it given the time an algorithm takes to complete. Consider the following problem: "An algorithm has time complexity O(n²). When you implement and test run it, you find that it takes 20 seconds to run when n is 200000. Approximately how long can we assume the algorithm takes to run for 100000 values?"

How would one go about solving this? Or in general, if I know the worst case and have been given the time it takes to run an algorithm, how can I figure out the big-O of that algorithm?

CodePudding user response:

We can assume it takes approximately 5 seconds.

If the relationship between n and runtime were that the algorithm takes exactly c×n² seconds for some constant c, then halving n, as in the question, gives c×(n/2)² = (c×n²)/4 seconds, i.e., a quarter of the time. A quarter of 20 seconds is 5 seconds.

The relationship almost certainly isn't exactly that, but it is likely at least roughly that, and those n values are large enough unless the algorithm is unusual. So, since the question says "assume" and "approximately", 5 seconds is the right answer.
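As a minimal sketch of that calculation (assuming the runtime is modeled as exactly c×n², with made-up variable names):

```python
# Sketch: estimate runtime at a new input size, assuming runtime ≈ c * n**2.
measured_n = 200_000        # input size from the question
measured_seconds = 20.0     # measured runtime at that size
target_n = 100_000          # input size we want a prediction for

# Fit the constant c from the single measurement: c = t / n**2
c = measured_seconds / measured_n**2

# Predict the runtime at the new size under the same quadratic model.
# Equivalently: measured_seconds * (target_n / measured_n)**2
predicted_seconds = c * target_n**2
print(predicted_seconds)    # 5.0
```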

CodePudding user response:

Big-O notation is about the behavior of a function as the function argument goes to infinity.

Execution time at finite input sizes cannot be used to determine a big-O for the time complexity, nor can such a big-O be used to determine the execution time for any given specific input size.
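To illustrate that point, here is a small hypothetical sketch: a single timing measurement is consistent with many different growth rates. Both of the (made-up) models below are fitted to the same 20-second data point, yet they predict different runtimes at larger n, so the measurement alone cannot pin down the big-O.

```python
import math

# A single measurement: 20 seconds at n = 200,000.
n0, t0 = 200_000, 20.0

# Two different hypothetical complexity models, both fitted to that one point.
c_quadratic = t0 / n0**2                      # t(n) = c * n^2
c_quad_log  = t0 / (n0**2 * math.log(n0))     # t(n) = c * n^2 * log(n)

# Both models reproduce the measurement exactly, but diverge at larger n.
for n in (200_000, 2_000_000, 20_000_000):
    quad = c_quadratic * n**2
    quadlog = c_quad_log * n**2 * math.log(n)
    print(n, round(quad, 1), round(quadlog, 1))
```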
