int cnt = 0;
for (int i = 1; i < n; i++) {
    for (int j = i; j < n; j++) {
        for (int k = j * j; k < n; k++) {
            cnt++;
        }
    }
}
I have no idea how to approach this. How do I analyze the time complexity of this code?
CodePudding user response:
The first approach would be to look at the loops separately, meaning that we have three O(·) factors connected by a product. Hence,
Complexity of the algorithm = O(outer loop) * O(middle loop) * O(inner loop)
Now look at each loop separately:
Outer loop: This is the simplest one. i increments from 1 to n, resulting in O(n).
Middle loop: This is less obvious. The termination condition is still n, but the starting value is i, meaning that the larger i gets, the less time it takes to finish the loop. Asymptotically, however, this only saves a constant factor, so the middle loop is still O(n), giving O(n^2) up to and including the second loop.
Inner loop: We see that the iterator starts at j * j, i.e., the starting point depends quadratically on the second loop's variable, which we said to be O(n). Since we again only look at the complexity asymptotically, we can assume that j rises linearly, and since k starts at a quadratically growing value and runs until n, it takes on the order of sqrt(n) iterations until n is reached. Meaning that the innermost loop has a running time of O(sqrt(n)).
Putting all these results together,
O(n * n * sqrt(n)) = O(n^2 * sqrt(n))
CodePudding user response:
It's easy to see that the code is Omega(n²) - the two outer loops execute around n²/2 times.
The inner k loop executes zero times unless j is less than sqrt(n). Even when it executes zero times, it takes some computation to evaluate the loop condition, so it's O(1) work in these cases.
When j is less than sqrt(n), i must also be less than sqrt(n), since by the construction of the loops, j is always greater than or equal to i. In these cases, the k loop does n - j² iterations. We can construct a bound for the total amount of work in the inner loop in these cases: both i and j are less than sqrt(n), and there's at worst O(n) work done in the k loop, so there's at most O(n²) (i.e., sqrt(n) * sqrt(n) * n) total work done in the inner loop.
There's also at most O(n²) total work done for the cases where the inner loop is trivial (i.e., when j ≥ sqrt(n)).
This gives a proof that the runtime complexity of the code is Θ(n²).
Methods that look at nested loops individually and multiply big-O bounds for them do not in general give tight bounds, and this question is an example where such a method fails.
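The Θ(n²) result is easy to check empirically. A minimal Python sketch (a direct translation of the original triple loop, written by me for illustration) counts the innermost increments for a few values of n; the ratio cnt / n² settles near a constant (about 1/4) rather than growing like sqrt(n), which supports the tight Θ(n²) bound over the O(n² * sqrt(n)) estimate:

```python
def count_iterations(n):
    # Direct translation of the original triple loop:
    # counts how many times cnt++ would execute.
    cnt = 0
    for i in range(1, n):
        for j in range(i, n):
            # range(j * j, n) is empty once j * j >= n,
            # matching the "zero iterations" cases above.
            for k in range(j * j, n):
                cnt += 1
    return cnt

for n in (100, 200, 400, 800):
    cnt = count_iterations(n)
    print(n, cnt, round(cnt / n**2, 4))  # ratio is close to 0.25
```

Counting each loop-condition check as O(1) work as well adds roughly n²/2 checks from the two outer loops, so both components of the total work are Θ(n²), consistent with the proof above.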