What is this obscure sorting algorithm called?


I came up with an obscure sorting algorithm. Given that it's so simple, it must have been invented and named before, so I was wondering what it's called.

It has a very unusual constraint: it only works for inputs whose keys are exactly 0 to n-1 (or equivalent). That's a very strong constraint that makes it useless in practice, but maybe one can construct some artificial settings in which it's useful. The algorithm repeatedly swaps the element at position 0 into its final position until the array is sorted. Pseudocode (in Python):

def swap(array, i, j):
    array[i], array[j] = array[j], array[i]

def obscure_sort(array, key=lambda x: x):
    if len(array) < 2:
        return
    sorted_until = 1
    while True:
        if key(array[0]) != 0:
            # Swap the element at position 0 to its final position.
            swap(array, 0, key(array[0]))
        else:
            # Find the next element that isn't in its final position.
            while key(array[sorted_until]) == sorted_until:
                sorted_until += 1
                # If we happen to reach the end, we're done.
                if sorted_until == len(array):
                    return
            # Swap the newfound first unsorted element to position 0.
            swap(array, 0, sorted_until)
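
To sanity-check it, here's a small usage example (with the identity key, so every element is its own key):

if __name__ == "__main__":
    data = [3, 1, 4, 0, 2]
    obscure_sort(data)
    print(data)  # -> [0, 1, 2, 3, 4]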

The algorithm actually runs in O(n). It's not completely trivial to see that, and I'll leave out the analysis unless someone is really interested.

Does anyone know if this has a name?

CodePudding user response:

This is a slight variation of a restricted cycle sort, probably closest to the algorithm from section 3 of this paper.

Normally with cycle sort on the keys A = [0, 1, ..., (A.length - 1)], you would loop through the array testing indices 0 to A.length - 1 as 'cycle starts', looking for cycles to rotate. One 'rotation' is done by holding a temporary variable 'temp' (initially the value at our cycle start) and repeatedly doing swap(temp, A[temp]) until we are back at the start of the cycle (i.e., until temp == A[temp]).
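
For reference, that rotation could be written like this for identity keys (my own sketch, not the paper's code):

def cycle_sort_permutation(A):
    # Assumes A is a permutation of 0..len(A)-1 (identity keys).
    for cycle_start in range(len(A)):
        temp = A[cycle_start]      # the value that belongs at index temp
        while temp != A[temp]:     # rotate until the cycle closes
            next_temp = A[temp]    # swap(temp, A[temp])
            A[temp] = temp
            temp = next_temp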

Here, in contrast, we add 0 at the back of the cycle, and 'A[0]' takes the place of 'temp'. We use the operation swap(A[0], A[A[0]]), so that in general, an element x that's moved takes the journey A[old] -> A[0] -> A[x] rather than A[old] -> temp -> A[x].

In the linear time algorithm described in the paper above, upon starting loop iteration i, all of the elements 0, 1, ..., i-1 are in place and never moved again. This algorithm is similar: if it were written in the same loop style, elements 0, 1, ..., i-1 would also be in place at the start of iteration i, but element 0 is not fixed; it is moved repeatedly during an iteration.
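
If I've read that invariant correctly, your algorithm rewritten in that loop style would look roughly like this (my reconstruction, identity keys assumed):

def obscure_sort_loop_style(A):
    # Assumes A is a permutation of 0..len(A)-1 (identity keys).
    for i in range(1, len(A)):
        # Invariant: positions 1..i-1 already hold 1..i-1.
        if A[i] != i:
            # Bring the misplaced element to position 0...
            A[0], A[i] = A[i], A[0]
            # ...then keep swapping A[0] to its final position
            # until 0 itself lands back at position 0.
            while A[0] != 0:
                j = A[0]
                A[0], A[j] = A[j], A[0]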

As a small example:

Traditional Cycle Sort
Initially, A = [1, 3, 0, 2]

Step 1: A = [1, 3, 0, 2], temp = 1, with cycle_start = 0
Step 2: A = [1, 1, 0, 2], temp = 3
Step 3: A = [1, 1, 0, 3], temp = 2
Step 4: A = [1, 1, 2, 3], temp = 0
Step 5: A = [0, 1, 2, 3], temp = 1; stop since temp == A[temp]

Custom Cycle-like Sort
A = [1, 3, 0, 2]

Step 1: A = [1, 3, 0, 2]
Step 2: A = [3, 1, 0, 2]
Step 3: A = [2, 1, 0, 3]
Step 4: A = [0, 1, 2, 3]

Note that this new sort can take more steps than the normal cycle sort, since 'adding 0 at the back of the cycle' can add an additional swap operation per cycle. The total number of array swaps, though, is linear (and at most twice the array length).
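
That bound is easy to check empirically. Here's a throwaway harness (my own, counting swaps in an instrumented copy of the algorithm, identity keys):

import random

def count_swaps(n):
    # Instrumented copy of the algorithm above, identity keys.
    A = list(range(n))
    random.shuffle(A)
    swaps = 0
    sorted_until = 1
    while True:
        if A[0] != 0:
            j = A[0]
            A[0], A[j] = A[j], A[0]
            swaps += 1
        else:
            while A[sorted_until] == sorted_until:
                sorted_until += 1
                if sorted_until == n:
                    return swaps
            A[0], A[sorted_until] = A[sorted_until], A[0]
            swaps += 1

for n in (10, 100, 1000):
    worst = max(count_swaps(n) for _ in range(100))
    print(n, worst, worst <= 2 * n)  # stays within the 2n bound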
