In this code fragment, the loop variable i is advanced by adding its own logarithm to its value on each iteration. What will be the big-O time complexity of this fragment?
i = 10
while(i<n) {
i = i + log(i);
}
CodePudding user response:
Interesting question! Here's an initial step toward working out the runtime.
Let's imagine that we have reached a point in the loop where the value of i has reached 2^k for some natural number k. Then for i to increase to 2^(k+1), we'll need approximately 2^k / k iterations to elapse. Why? That's because
- The amount i needs to increase is 2^(k+1) - 2^k = 2^k(2 - 1) = 2^k, and
- At each step, increasing i by log i will increase i by roughly log 2^k = k.
We can therefore break the algorithm apart into "stages." In the first stage, we grow from size 2^3 to size 2^4, requiring (roughly) 2^3 / 3 steps. In the second stage, we grow from size 2^4 to size 2^5, requiring (roughly) 2^4 / 4 steps. After repeating this process many times, we eventually grow from size n/2 = 2^(log n - 1) to size n = 2^(log n), requiring (roughly) 2^(log n) / log n steps. The total amount of work done is therefore given by
2^3 / 3 + 2^4 / 4 + 2^5 / 5 + ... + 2^(log n) / log n.
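As a quick sanity check of these stage counts, here is a minimal Python sketch (my addition, not part of the original answer). It assumes base-2 logarithms (the base only changes constant factors), uses the starting value i = 10 from the question, and the choice n = 2^20 is arbitrary; it counts how many steps the loop spends in each doubling stage.

import math

i, n = 10.0, 2 ** 20                   # n is an arbitrary power of two
stage = math.floor(math.log2(i))       # current k with 2^k <= i < 2^(k+1)
steps_in_stage = 0
while i < n:
    i += math.log2(i)                  # same update as the fragment in question
    steps_in_stage += 1
    if i >= 2 ** (stage + 1):          # stage k finished: i reached 2^(k+1)
        print(f"stage {stage}: {steps_in_stage} steps, predicted ~ {2 ** stage // stage}")
        stage += 1
        steps_in_stage = 0

The printed step counts should stay close to the predicted 2^k / k values.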
The goal now is to find some bounds on this expression. We can see that the sum is at least equal to its last term, which is 2^(log n) / log n = n / log n, so the work done is Ω(n / log n). We can also see that the work done is less than
2^3 + 2^4 + 2^5 + ... + 2^(log n)
≤ 2^(log n + 1) (sum of a geometric series)
= 2n,
so the work done is O(n). That sandwiches the runtime between Ω(n / log n) and O(n).
It turns out that Θ(n / log n) is indeed a tight bound here, which can be proved by doing a more nuanced analysis of the summation.
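For what it's worth, here is a small empirical check (again a Python sketch of my own, assuming base-2 logarithms and i starting at 10): the iteration count divided by n / log n settles near a constant as n grows, which is consistent with the Θ(n / log n) bound.

import math

def count_iterations(n):
    """Count iterations of: i = 10; while i < n: i = i + log2(i)."""
    i, count = 10.0, 0
    while i < n:
        i += math.log2(i)
        count += 1
    return count

for exp in (10, 14, 18, 22):
    n = 2 ** exp
    ratio = count_iterations(n) / (n / math.log2(n))
    print(f"n = 2^{exp}: iterations / (n / log2 n) = {ratio:.3f}")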
CodePudding user response:
Let's look briefly at the definition of g(n) = O(f(n)): saying that the function g is of order O(f(n)) means there exist a number n0 and a constant c such that g(n) <= c·f(n) for all n > n0.
Looking at the worst-case scenario, the while loop runs at most n times (i starts at 10 and grows by log(i) ≥ 1 on every iteration), which means we can say your code is of order O(n).
Now suppose that we replace the loop with
(*) while(i<n) {
i = i + i;
}
which clearly grows i faster than the original loop and therefore needs no more iterations, so we can use it to estimate a lower bound. Examining (*), we see that the counter doubles on each iteration, so it takes only about log2(n) iterations for i to reach n; the code in (*) therefore has worst-case asymptotic runtime O(log n).
Now, since the original code sits between these two cases, we can say its asymptotic lower bound is Ω(log n) and its asymptotic upper bound is O(n).
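To see this sandwich concretely, here is a short Python sketch (my own addition; the function names and the choice n = 2^20 are purely for illustration, and base-2 logarithms are assumed) comparing the iteration counts of the original loop and the doubling loop (*) against the trivial upper bound n.

import math

def original(n):          # the fragment in question: i = i + log(i)
    i, c = 10.0, 0
    while i < n:
        i += math.log2(i)
        c += 1
    return c

def doubling(n):          # the modified loop (*): i = i + i
    i, c = 10, 0
    while i < n:
        i += i
        c += 1
    return c

n = 2 ** 20
print("doubling (*) iterations:", doubling(n))   # about log2(n), the lower bound
print("original iterations    :", original(n))   # lands between the two bounds
print("upper bound (n)        :", n)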