Are algorithms with high time complexity ever used in the real world for small inputs?


Let's say we have a problem where one algorithm, call it algorithm_1, solves it with time complexity O(n^2), and another, call it algorithm_2, solves it with time complexity O(n). In practice, though, we observe that algorithm_1 is faster for n < 1000 and algorithm_2 is faster otherwise.

Why can't we just write code like this:

if (n < 1000)
  do algorithm_1
else
  do algorithm_2

Is this a real thing programmers do, or are there downsides to it?

For a small program, this seems like a good idea.

CodePudding user response:

This does happen in the real world! For example, a famous hybrid sorting algorithm is Timsort:

Timsort

Details of a typical implementation:

The run size is taken to be 32, and the input array is divided into runs of that size.

Each run is sorted with a simple insertion sort. After the individual runs are sorted, they are merged pairwise, as in merge sort.

The size of the merged subarrays doubles after every pass.

Insertion sort has complexity O(N^2) but is faster for tiny lists; merge sort has complexity O(N log N), so it is better for longer lists.
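
A minimal Python sketch of that scheme (this is the simplified textbook version, not the real Timsort, which also detects natural runs and uses galloping; the run size of 32 comes from the description above):

import random

RUN = 32  # pieces of this size are sorted with insertion sort

def insertion_sort(a, left, right):
    # Sort a[left..right] in place; quadratic, but very fast on tiny slices.
    for i in range(left + 1, right + 1):
        key = a[i]
        j = i - 1
        while j >= left and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def merge(a, left, mid, right):
    # Merge the adjacent sorted slices a[left..mid] and a[mid+1..right].
    lo, hi = a[left:mid + 1], a[mid + 1:right + 1]
    i = j = 0
    k = left
    while i < len(lo) and j < len(hi):
        if lo[i] <= hi[j]:
            a[k] = lo[i]
            i += 1
        else:
            a[k] = hi[j]
            j += 1
        k += 1
    # Copy whichever half still has leftovers.
    a[k:k + len(lo) - i] = lo[i:]
    k += len(lo) - i
    a[k:k + len(hi) - j] = hi[j:]

def hybrid_sort(a):
    n = len(a)
    # Pass 1: insertion-sort each run of RUN elements.
    for start in range(0, n, RUN):
        insertion_sort(a, start, min(start + RUN - 1, n - 1))
    # Pass 2+: merge runs pairwise, doubling the merged size each pass.
    size = RUN
    while size < n:
        for left in range(0, n, 2 * size):
            mid = left + size - 1
            right = min(left + 2 * size - 1, n - 1)
            if mid < right:
                merge(a, left, mid, right)
        size *= 2

data = [random.randrange(1000) for _ in range(500)]
hybrid_sort(data)
assert data == sorted(data)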

Introsort

Another example is introsort, the sorting algorithm used by std::sort in the C++ standard library:

Introsort or introspective sort is a hybrid sorting algorithm that provides both fast average performance and (asymptotically) optimal worst-case performance. It begins with quicksort, switches to heapsort when the recursion depth exceeds a level based on (the logarithm of) the number of elements being sorted, and switches to insertion sort when the number of elements is below some threshold. This combines the good parts of all three algorithms: practical performance comparable to quicksort on typical data sets, and worst-case O(n log n) runtime thanks to the heapsort. Since the three algorithms it uses are comparison sorts, it is also a comparison sort.
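
A rough Python sketch of that strategy (the real implementations live in C++ inside std::sort; the depth limit of 2*log2(n) and the small-slice cutoff of 16 are typical values, assumed here):

import heapq
from math import log2

SMALL = 16  # below this slice size, insertion sort takes over (assumed cutoff)

def insertion_sort(a, lo, hi):
    for i in range(lo + 1, hi + 1):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def heapsort(a, lo, hi):
    # Stand-in for in-place heapsort: heapify a copy, pop back in order.
    heap = a[lo:hi + 1]
    heapq.heapify(heap)
    for k in range(lo, hi + 1):
        a[k] = heapq.heappop(heap)

def partition(a, lo, hi):
    # Lomuto partition with the middle element as pivot.
    mid = (lo + hi) // 2
    a[mid], a[hi] = a[hi], a[mid]
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

def introsort(a, lo=0, hi=None, depth=None):
    if hi is None:
        hi = len(a) - 1
    if depth is None:
        depth = 2 * int(log2(len(a))) if a else 0
    while hi - lo > SMALL:
        if depth == 0:
            heapsort(a, lo, hi)  # quicksort degenerated: fall back to O(n log n)
            return
        depth -= 1
        p = partition(a, lo, hi)
        introsort(a, p + 1, hi, depth)  # recurse on one side...
        hi = p - 1                      # ...and loop on the other
    insertion_sort(a, lo, hi)  # tiny slice left: insertion sort finishes it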

The downside: more complexity

The downside of combining several algorithms for a single task is, clearly, increased complexity. It is worth it if you are writing standard-library code for a programming language that will be reused millions or even billions of times. For smaller projects, saving developer time over machine time by implementing only one algorithm is often the better choice.

CodePudding user response:

Many optimization problems can only be attacked with approximation algorithms or brute-force/inefficient algorithms that have high time complexity.

Older implementations of quicksort switch to insertion sort for small subarrays because it is efficient on small data sets, better than most other simple quadratic algorithms such as selection sort or bubble sort.
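
A sketch of that cutoff idea in Python (the threshold of 10 is illustrative; real libraries tune it empirically):

CUTOFF = 10  # illustrative threshold, tuned empirically in real libraries

def insertion_sort(a, lo, hi):
    for i in range(lo + 1, hi + 1):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if hi - lo < CUTOFF:
        insertion_sort(a, lo, hi)  # small slice: hand off to insertion sort
        return
    pivot = a[(lo + hi) // 2]
    i, j = lo, hi
    while i <= j:  # Hoare-style partition around the pivot
        while a[i] < pivot:
            i += 1
        while a[j] > pivot:
            j -= 1
        if i <= j:
            a[i], a[j] = a[j], a[i]
            i += 1
            j -= 1
    quicksort(a, lo, j)
    quicksort(a, i, hi)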

There are also use cases that are computationally hard by design, such as cryptocurrency mining algorithms and CPU cycle measurement...
