Time complexity of an algorithm that runs 1 + 2 + ... + n times

Time:07-02

To start off, I found this Stack Overflow question that states the time complexity to be O(n^2), but it doesn't answer the question of why O(n^2) is the time complexity; instead it asks for an example of such an algorithm. From my understanding, an algorithm that runs 1 + 2 + 3 + ... + n times would be less than O(n^2). For example, take this function:

function f(n: number): number {
    let sum = 0;

    for (let i = 0; i < n; i++) {
        for (let j = 0; j < i + 1; j++) {
            sum += 1;
        }
    }

    return sum;
}

Here are some input and return values:

n   sum
1     1
2     3
3     6
4    10
5    15
6    21
7    28

From this table you can see that this algorithm runs in less than O(n^2) time but more than O(n). I also realize that an algorithm that runs 1 + (1 + 2) + (1 + 2 + 3) + ... + (1 + 2 + 3 + ... + n) times is true O(n^2) time complexity. For the algorithm stated in the problem, do we just say it runs in O(n^2) because it runs more than O(log n) times?

CodePudding user response:

It's known that 1 + 2 + ... + n has the closed form n * (n + 1) / 2. Even if you didn't know that, consider that when i reaches its largest value, n - 1, the inner loop runs n times. So the outer loop runs exactly n times (over i), each iteration running the inner loop at most n times (over j), which makes the O(n^2) bound more apparent.
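As a quick sanity check (my sketch, not part of the original answer, with hypothetical names `loopCount` and `closedForm`), the nested loops from the question can be compared against the closed form:

```typescript
// Count iterations exactly the way the question's nested loops do.
function loopCount(n: number): number {
    let sum = 0;
    for (let i = 0; i < n; i++) {
        for (let j = 0; j < i + 1; j++) {
            sum += 1;
        }
    }
    return sum;
}

// The closed form for 1 + 2 + ... + n.
function closedForm(n: number): number {
    return n * (n + 1) / 2;
}

for (const n of [1, 5, 7, 100]) {
    console.log(n, loopCount(n), closedForm(n)); // the two values agree
}
```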

I agree that the count would be exactly n^2 if the inner loop also ran from 0 to n, so you have reason to think that a loop with i from 0 to n and an inner loop with j from 0 to i performs better, and that's true. But with big-O notation you're measuring the growth rate of the algorithm's running time, not the exact number of operations: n * (n + 1) / 2 = n^2 / 2 + n / 2, and the dominant term is still quadratic, so it is O(n^2).

P.S. O(log n) complexity is usually achieved when you repeatedly split the main problem into smaller sub-problems.
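To illustrate that aside (my example, not the answerer's): binary search is a classic O(log n) algorithm, because each comparison discards half of the remaining search space:

```typescript
// Binary search over a sorted array: the range [lo, hi] halves on
// every iteration, so the loop runs O(log n) times.
function binarySearch(arr: number[], target: number): number {
    let lo = 0;
    let hi = arr.length - 1;
    while (lo <= hi) {
        const mid = (lo + hi) >> 1;
        if (arr[mid] === target) return mid;
        if (arr[mid] < target) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1; // not found
}
```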

CodePudding user response:

I think you should interpret the table differently. O(N^2) complexity says that if you double the input N, the runtime should roughly quadruple (take 4 times as long). In this case, the function(n: number) returns a number mirroring its runtime; I use f(N) as shorthand for it.

So say N goes from 1 to 2, which means the input has doubled (2/1 = 2). The runtime then has gone from f(1) to f(2), meaning it has increased f(2)/f(1) = 3/1 = 3 times. That is not 4 times, but the big-O complexity measure is asymptotic, dealing with the situation where N approaches infinity. If we test another input doubling from the table, we have f(6)/f(3) = 21/6 = 3.5, which is already closer to 4.

Let us now go beyond the table and try more doublings with bigger N. For example, f(200)/f(100) = 20100/5050 ≈ 3.980 and f(5000)/f(2500) = 12502500/3126250 ≈ 3.999. The trend is clear: as N approaches infinity, a doubled input tends toward a quadrupled runtime, and that is the hallmark of O(N^2).
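The doubling test above can be sketched directly from the closed form f(N) = N * (N + 1) / 2 (this is my sketch of the answer's reasoning, not code from the post):

```typescript
// f(N) = number of iterations the question's algorithm performs.
const f = (n: number): number => n * (n + 1) / 2;

// Ratio of runtimes when the input doubles; tends toward 4 as N grows.
const doublingRatio = (n: number): number => f(2 * n) / f(n);

for (const n of [1, 3, 100, 2500]) {
    console.log(n, doublingRatio(n).toFixed(3));
}
```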
