I have been googling Big O notation for the past three days and have come to the following conclusions.
Big O notation just tells us how the time taken by an algorithm grows as we increase the input size.
We can only compare algorithms from two different classes, not from the same class; i.e., we can compare O(n^2) with O(n), but not with O(n^2) of some other algorithm. We omit the constants because we are only concerned with growth.
After all this clarification, I still have some doubts:
We know that O(10n) is slower than O(n^2) for n < 10. Then how can we say that an algorithm of order O(n) is always better than O(n^2) without actually considering the constants? How can we state that the algorithm of time complexity O(n) is always best?
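For concreteness, here is a minimal Python sketch (the expressions 10*n and n*n just stand in for hypothetical running times) that shows the crossover at n = 10:

```
# Compare the hypothetical costs 10*n (linear) and n*n (quadratic)
# around the crossover point n = 10.
for n in [1, 5, 9, 10, 11, 20, 100]:
    linear, quadratic = 10 * n, n * n
    winner = "10n" if linear < quadratic else "n^2" if quadratic < linear else "tie"
    print(f"n={n:>3}: 10n={linear:>5}, n^2={quadratic:>6}, cheaper: {winner}")
```

For every n < 10 the quadratic expression is actually smaller, which is exactly what my doubt is about.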
If we have f(n) = 10n^2 + 5n + 6, we omit the 10 because the function always increases by that constant factor. But why do we also omit 5n + 6 (which is not constant)? Isn't it significant in the function's growth?
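As a quick numeric check, this sketch tabulates what fraction of f(n) = 10n^2 + 5n + 6 comes from the lower-order part 5n + 6:

```
# Fraction of f(n) = 10n^2 + 5n + 6 contributed by the lower-order terms 5n + 6.
for n in [1, 10, 100, 1_000, 1_000_000]:
    f = 10 * n**2 + 5 * n + 6
    share = (5 * n + 6) / f
    print(f"n={n:>9}: f(n)={f:>16}, (5n+6)/f(n) = {share:.6f}")
```

As n grows, 5n + 6 contributes a vanishing fraction of f(n), even though it is not constant.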
CodePudding user response:
"Big O notation just tells us how the time taken by an algorithm grows as we increase the input size."
It only tells us how the time taken grows asymptotically. This means you cannot even make conclusions about the time taken for one input size compared to the time taken for another input size. Given an implemented function