O(log n) algorithm in theory runs much slower in practice


The algorithm calculates 2^n recursively. I have used the Master theorem to determine that the time complexity is indeed O(log n), which seems to be correct according to other sources online.

def pow(n):
    """Return 2**n, where n is a nonnegative integer."""
    if n == 0:
        return 1
    x = pow(n//2)       # recurse on half the exponent
    if n%2 == 0:
        return x*x      # 2**n == (2**(n//2))**2 when n is even
    return 2*x*x        # one extra factor of 2 when n is odd

My problem is that it does not run in logarithmic time when I measure it in Python for many large values of n. Instead, it seems to be running in something like O(n) or even O(n log n) time. Is this because of the large multiplications in the return statements, or something else? If so, can the algorithm even be considered O(log n) if that is not what determines the final outcome in terms of speed?
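For reference, a minimal timing harness along these lines reproduces the measurement (a sketch; it assumes the pow above is in scope, and the particular values of n are arbitrary):

import timeit

# Time pow(n) for doubling values of n. If the algorithm were truly
# O(log n) end to end, these timings would grow very slowly.
for n in [2**k for k in range(10, 21)]:
    t = timeit.timeit(lambda: pow(n), number=10) / 10
    print(f"n = {n:>8}: {t:.6f} s")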

CodePudding user response:

The complexity is O(log n) assuming each of the numbers fits in one word of memory. That assumption doesn't hold here: the result 2**n has n+1 bits and outgrows a machine word almost immediately.
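A quick way to see that assumption break (a minimal sketch; 64 bits stands in for a machine word):

# The result of pow(n) has n+1 bits, so it outgrows a 64-bit
# machine word almost immediately.
for n in (10, 100, 1000, 10000):
    print(n, (2**n).bit_length())  # prints n+1 for each n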

CodePudding user response:

There are O(log n) recursive calls, or "primitive operations". However, several of those operations are themselves not O(1).

For operands with d digits, n // 2 already costs O(d), and schoolbook x * x costs O(d^2); here x grows to roughly n bits, so the final squarings are the expensive ones. The O(d^2) bound for multiplication is not tight: for large numbers, optimized algorithms are used, but so far all known algorithms are Ω(d log d).
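One way to observe that growth directly is to time the squaring alone (a rough sketch; the exact ratios depend on the CPython version and its internal multiplication thresholds, so the Karatsuba figure is an assumption about the interpreter):

import timeit

# Time squaring a d-bit number for doubling values of d. With schoolbook
# multiplication the time would roughly quadruple per doubling; with
# Karatsuba it grows by about a factor of 2**1.585, i.e. roughly 3x.
for d in [2**k for k in range(14, 20)]:
    x = (1 << d) - 1  # a d-bit number
    t = timeit.timeit(lambda: x * x, number=100) / 100
    print(f"{d:>7} bits: {t:.6f} s")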

So even if your benchmark is implemented correctly (which can itself be a real challenge, depending on the interpreter and libraries used), you'll need to take the cost of these operations into account.

The actual complexity of your algorithm is therefore most likely around O(n^x) for some x with 1 < x ≤ 2, depending on the multiplication algorithm used: the operand sizes halve at each level of the recursion, so the total cost is a geometric sum dominated by the final squaring of roughly n-bit numbers.
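To estimate x empirically, one can compare timings at doubling values of n (a sketch; it assumes the pow from the question is in scope, and the expected ratio assumes CPython's Karatsuba multiplication):

import timeit

# If pow(n) costs about C * n**x, doubling n should multiply the time
# by about 2**x. With Karatsuba multiplication (x ~ 1.585) the ratio
# should settle near 3.
prev = None
for n in [2**k for k in range(14, 20)]:
    t = timeit.timeit(lambda: pow(n), number=5) / 5
    print(f"n = {n:>7}: {t:.6f} s" + (f"  ratio ~ {t/prev:.2f}" if prev else ""))
    prev = t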
