Time Complexity in asymptotic analysis: log n and log(n + m)


Just some interesting discussion inspired by a conversation in my class.

There are two algorithms: one has time complexity log n and the other log(n + m).

Am I correct to argue that for average cases the log(n + m) one will perform faster, while they make no difference in running time when considered asymptotically? Taking the limit of f1'/f2' results in a constant, therefore they have the same order of growth.

Thanks!

CodePudding user response:

As far as I can see from the question, n and m are independent variables. So when stating that

O(log(m + n)) = O(log(n))

it should hold for any m, which it does not: a counterexample is

m = exp(n)

O(log(m + n)) = O(log(n + exp(n))) = O(log(exp(n))) = O(n) > O(log(n))

That's why in the general case we can only say that

O(log(m + n)) >= O(log(n))
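A quick numerical sketch of the counterexample above (not part of the original answer): with m = exp(n), the ratio log(n + m) / log(n) keeps growing, so the two bounds cannot differ by only a constant factor.

```python
import math

# Counterexample m = exp(n): the ratio log(n + e^n) / log(n)
# grows roughly like n / log(n), i.e. without bound.
for n in (10, 100, 700):          # n = 700 stays below float overflow
    ratio = math.log(n + math.exp(n)) / math.log(n)
    print(f"n = {n:4d}  ratio = {ratio:.1f}")
```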

An interesting question is when O(log(m + n)) = O(log(n)) does hold. If m grows no faster than some polynomial in n, i.e. O(m) <= O(P(n)) where P has degree k:

O(log(m + n)) = O(log(P(n) + n)) = O(log(P(n))) = k * O(log(n)) = O(log(n))
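The polynomial case can be checked numerically as well (a sketch with the assumed bound P(n) = n**3, so k = 3): the ratio log(m + n) / log(n) settles at the constant k instead of growing.

```python
import math

# With m bounded by a polynomial, here m = n**3, the ratio
# log(m + n) / log(n) approaches the degree k = 3, a constant,
# illustrating O(log(m + n)) = O(log(n)).
for n in (10, 1000, 10**6):
    m = n**3
    print(n, math.log(m + n) / math.log(n))
```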

Only in the case of multigraphs do we ever have more than polynomially many edges, O(m) > O(P(n)): even the complete graph Kn contains only m = n * (n - 1) / 2 = P(n) edges. That's why

O(log(m + n)) = O(log(n))

holds for ordinary graphs (no parallel/multiple edges, no loops).
