Consider the following code for a function f that uses a predicate isPrime.

    f :: Integer -> Integer
    f n = head (filter isPrime [0..n])
Assuming that isPrime k computes a boolean in O(k) time, what is the time complexity of f n?
My thought process was that filter is linear time, and we are given that isPrime is also linear time, so the time complexity of f n should be O(n).
Is this correct and is there an easy way to eyeball the time complexity of code?
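For concreteness, an isPrime that fits this assumption could be plain trial division; any O(k) implementation works, and this one is only an illustrative sketch:

    -- One possible isPrime satisfying the O(k) assumption: trial division
    -- over all candidate divisors up to k - 1, i.e. at most k - 2 divisions.
    isPrime :: Integer -> Bool
    isPrime k = k >= 2 && null [d | d <- [2 .. k - 1], k `mod` d == 0]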
CodePudding user response:
If we can assume that isPrime returns True for 2, then f n takes constant time. Indeed, it will first call isPrime 0, then isPrime 1, and finally isPrime 2, at which point head stops. All of these function calls take constant time, since isPrime k runs in O(k) and k is at most 2 here.
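One way to see this concretely is to attach a trace to isPrime and watch which calls are actually forced. The isPrime below is just an illustrative trial-division version, not something given in the question:

    import Debug.Trace (trace)

    -- Illustrative isPrime that logs every call it receives.
    isPrime :: Integer -> Bool
    isPrime k =
      trace ("isPrime " ++ show k) $
        k >= 2 && null [d | d <- [2 .. k - 1], k `mod` d == 0]

    f :: Integer -> Integer
    f n = head (filter isPrime [0..n])

    main :: IO ()
    main = print (f 1000000)
    -- Traces only "isPrime 0", "isPrime 1" and "isPrime 2" (on stderr),
    -- then prints 2: no matter how large n is, only three calls are forced.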
If we make no assumptions about what isPrime returns, then the fact that isPrime is a linear-time function means that, in the worst case, f n takes O(n²) time. Indeed, if none of the earlier elements passes the test, we call isPrime k for every k from 0 to n, and each call runs in O(k), so the total is O(0 + 1 + … + n) = O(n²).
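For intuition about that worst case, here is a hypothetical predicate (not the real isPrime) that does O(k) work on every call and only succeeds at n itself, so head is forced to run every single test:

    -- Hypothetical worst case: an O(k)-per-call test that only succeeds on n.
    -- head (filter (slowTest n) [0 .. n]) must then evaluate the test on
    -- 0, 1, ..., n, for a total of O(0 + 1 + ... + n) = O(n^2) work.
    slowTest :: Integer -> Integer -> Bool
    slowTest n k = sum [1 .. k] `seq` (k == n)  -- the sum stands in for O(k) work

    worstCase :: Integer -> Integer
    worstCase n = head (filter (slowTest n) [0 .. n])  -- returns n after O(n^2) work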
Is this correct and is there an easy way to eyeball the time complexity of code?
Due to Haskell's laziness, analyzing the time complexity is more complicated. For example, the time complexity of filter isPrime [0..n] depends on the function that consumes the resulting list (here head). head stops at the first item of the list, and since filter is lazy, the remaining items are never computed. So for lazy functions, the notion of the time complexity of a function on its own is less straightforward: the cost depends on how much of the result is actually demanded.
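One way to make the "cost depends on the consumer" point visible is to trace a predicate and compare two consumers of the same filtered list. The code below is only an illustration with a made-up predicate:

    import Debug.Trace (trace)

    -- Illustrative predicate that logs every element it is asked about.
    tracedEven :: Integer -> Bool
    tracedEven k = trace ("testing " ++ show k) (even k)

    main :: IO ()
    main = do
      -- head demands the list only up to the first match:
      -- traces "testing 1" and "testing 2", then prints 2.
      print (head (filter tracedEven [1 .. 10]))
      -- length demands the whole list: traces "testing 1" through
      -- "testing 10", then prints 5.
      print (length (filter tracedEven [1 .. 10]))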