I have an algorithm with time complexity 2n^2, and on some machine it takes time x to execute. The question is: if my machine is 32 times faster and the time stays the same, how much data will it process?
ty <3
CodePudding user response:
Realistically, this question is underspecified, because complexity alone does not determine exact runtimes or operation counts.
However, since we are given the exact expression 2n^2 rather than big-O notation (an asymptotic bound), I will work with these assumptions:
- This is purely theoretical: 2n^2 is the exact number of operations for an input of size n
- Changing machines affects nothing except the speed per operation, and M2 is faster than M1 by exactly a factor of 32
So, let's say you process D data points on M1 and d data points on M2, each run taking time x.
If the complexity expression gives the exact time taken on M1, then:

2 * D^2 = x

Since M2 is 32 times faster, it completes 32 times as many operations in the same time x:

2 * d^2 = 32 * 2 * D^2
=> d^2 = 32 * D^2
=> d = 4 * sqrt(2) * D ≈ 5.66 * D
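As a sanity check, here is a minimal Python sketch of that algebra. The function name `ops` and the sample value `D = 1000` are arbitrary choices for illustration, not part of the original problem:

```python
import math

def ops(n: float) -> float:
    """Exact operation count assumed by the problem: 2 * n^2."""
    return 2 * n ** 2

SPEEDUP = 32   # M2 performs 32x as many operations per unit time as M1
D = 1000       # example input size handled by M1 (arbitrary)

# In the same wall-clock time, M2's operation budget is SPEEDUP * ops(D).
# Solving 2 * d^2 = SPEEDUP * 2 * D^2 for d gives d = sqrt(SPEEDUP) * D.
d = math.sqrt(SPEEDUP) * D

print(f"M1 handles D = {D}, M2 handles d = {d:.2f}")
print(f"Ratio d / D = {d / D:.4f} (expected 4 * sqrt(2) = {4 * math.sqrt(2):.4f})")

# Sanity check: M2's operation budget exactly covers ops(d)
assert math.isclose(ops(d), SPEEDUP * ops(D))
```

More generally, for a quadratic algorithm a k-times speedup scales the processable input size by sqrt(k); here sqrt(32) = 4 * sqrt(2).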