# Is there an ideal comparison sort?

Every computer science student studies, or at least encounters, sorting algorithms early in their career. There are many well-known sorting algorithms, including bubble sort, selection sort, insertion sort, Shellsort, heapsort, merge sort, quicksort, and radix sort. Each has different characteristics and tradeoffs. The open question I would like to ask is: is there a comparison sorting algorithm that has all of the ideal properties?

- Comparison sort
A comparison sort extracts information from the list’s elements only by comparing two of them at a time. Comparing \(x\) and \(y\) yields exactly one of three results: \(x < y\), \(x = y\), or \(x > y\). A comparison sort extracts no other information, such as an element’s magnitude or bit representation; this rules out algorithms like radix sort. It is known that a comparison sort on a list of \(n\) elements (performed on a serial machine) must have worst-case time complexity \(\Omega(n \log n)\).
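To make the model concrete, here is a minimal Python sketch (the `Counted` wrapper is my own illustration, not from any particular source): it wraps values so that the only information a sort can extract is the outcome of `<` comparisons, and it counts them.

```python
import random

class Counted:
    """Wraps a value so a sort can learn about it only via comparisons."""
    comparisons = 0  # shared counter across all comparisons

    def __init__(self, value):
        self.value = value

    def __lt__(self, other):
        Counted.comparisons += 1
        return self.value < other.value

n = 100
data = [Counted(x) for x in random.sample(range(1000), n)]
data.sort()  # list.sort only ever calls __lt__, so it is a comparison sort

# Any correct comparison sort must make at least n - 1 comparisons on any
# input just to link all elements into one order; the worst case over all
# inputs is the stronger Omega(n log n) bound.
assert Counted.comparisons >= n - 1
assert [c.value for c in data] == sorted(c.value for c in data)
```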

- Worst-case time complexity
The naive sorting algorithms, like bubble sort and insertion sort, have time complexity \(\Theta(n^2)\). The best comparison sorts achieve worst-case time complexity \(\Theta(n \log n)\), matching the lower bound. Quicksort, however, has average-case time complexity \(\Theta(n \log n)\) but worst-case time complexity \(\Theta(n^2)\).
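The gap shows up readily in practice. A textbook first-element-pivot quicksort (my own sketch, not any library's implementation) makes roughly \(n \log n\) comparisons on a shuffled input but exactly \(n(n-1)/2\) on an already sorted one:

```python
import random

def quicksort(a):
    """First-element-pivot quicksort on list a; returns comparisons made."""
    comparisons = 0

    def sort(lo, hi):
        nonlocal comparisons
        if hi - lo <= 1:
            return
        pivot = a[lo]
        i = lo + 1                          # boundary of the "< pivot" region
        for j in range(lo + 1, hi):
            comparisons += 1
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[lo], a[i - 1] = a[i - 1], a[lo]   # put pivot between the regions
        sort(lo, i - 1)
        sort(i, hi)

    sort(0, len(a))
    return comparisons

n = 300
random_cost = quicksort(random.sample(range(n), n))  # roughly n log n
sorted_cost = quicksort(list(range(n)))              # exactly n(n-1)/2
```

On sorted input every partition is maximally lopsided, so `sorted_cost` is \(299 + 298 + \dots + 1 = 44850\); the shuffled run comes in far lower.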

- Worst-case auxiliary space complexity
Some sorting algorithms operate in place, using \(\Theta(1)\) extra (auxiliary) space. Others need substantially more, up to \(\Theta(n)\). Ideally, a sorting algorithm should use only \(O(1)\) extra space.
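The merge step of merge sort illustrates where the \(\Theta(n)\) cost comes from: it is fast and stable precisely because it copies into a scratch buffer (a minimal sketch):

```python
def merge(left, right):
    """Stably merge two sorted lists into a new list.

    The output buffer is what costs merge sort its Theta(n) auxiliary space.
    """
    out = []   # Theta(n) extra space
    i = j = 0
    while i < len(left) and j < len(right):
        # "<=" favours the left run, keeping equal elements in original order
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out
```

For example, `merge([1, 3, 5], [2, 4, 6])` returns `[1, 2, 3, 4, 5, 6]`. Doing the same merge without the buffer is exactly the difficulty behind the question at the end of this post.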

- Stability
Elements are sorted by their keys, but they may carry extra data that the comparison ignores. A stable sorting algorithm preserves the relative order of elements that have equal keys. This matters when sorting the same list by one key and then by another: after the second sort, ties are still broken by the first key. Any unstable sorting algorithm can be made stable by tagging each element with its original index, at the cost of \(O(n)\) extra space.
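Python's built-in sort is stable, so the two-pass trick works directly; the second snippet shows the index-tagging idea for making any sort stable (the data here is made up for illustration):

```python
# Two-pass sorting: records are (name, grade) pairs.
records = [("dave", "B"), ("alice", "A"), ("bob", "B"), ("carol", "A")]

by_name = sorted(records)                        # pass 1: sort by name
by_grade = sorted(by_name, key=lambda r: r[1])   # pass 2: stable sort by grade
# Ties on grade keep the name order from pass 1:
# [('alice', 'A'), ('carol', 'A'), ('bob', 'B'), ('dave', 'B')]

# The O(n) trick: tag each element with its original index, so that equal
# keys are broken by position -- this makes any sort behave stably.
items = ["bb", "a", "cc", "dd"]
decorated = sorted((len(x), i) for i, x in enumerate(items))  # O(n) tags
by_length = [items[i] for _, i in decorated]     # ['a', 'bb', 'cc', 'dd']
```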

Here are the characteristics of popular sorting algorithms:

| Algorithm | Comparison sort | Worst-case time | Worst-case extra space | Stable |
|---|---|---|---|---|
| Bubble sort | Yes | \(\Theta(n^2)\) | \(\Theta(1)\) | Yes |
| Selection sort | Yes | \(\Theta(n^2)\) | \(\Theta(1)\) | No |
| Insertion sort | Yes | \(\Theta(n^2)\) | \(\Theta(1)\) | Yes |
| Shellsort | Yes | Between \(\Theta(n \log^2 n)\) and \(\Theta(n^2)\), depending on the gap sequence | \(\Theta(1)\) | No |
| Heapsort | Yes | \(\Theta(n \log n)\) | \(\Theta(1)\) | No |
| Merge sort | Yes | \(\Theta(n \log n)\) | \(\Theta(n)\) | Yes |
| Quicksort | Yes | \(\Theta(n^2)\) | \(\Theta(\log n)\) | No |
| Radix sort | No | \(\Theta(nk)\), where \(k\) is the key length in bits | \(\Theta(n)\) | Yes |

So my question is: is there a comparison sorting algorithm with worst-case time complexity \(\Theta(n \log n)\), worst-case auxiliary space complexity \(\Theta(1)\) (or perhaps up to \(O(\log n)\)), that is also stable?

I have come across mentions of an in-place stable merge sort algorithm, but was unable to find a description that I could understand. Examples include [0], [1], [2], [3], [4].
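I don't understand those algorithms well enough to reproduce them, but a simple sketch of my own shows why the combination is hard: a stable merge in \(O(1)\) extra space is easy if you give up the time bound. This shifting version is stable and in place, but \(\Theta(n^2)\) in the worst case:

```python
def inplace_stable_merge(a, lo, mid, hi):
    """Stably merge the sorted runs a[lo:mid] and a[mid:hi] in place.

    O(1) auxiliary space, but O(n^2) time in the worst case: each
    out-of-place right-run element is rotated in by shifting the left run.
    """
    while lo < mid and mid < hi:
        if a[mid] < a[lo]:                  # strict "<" preserves stability
            v = a[mid]
            for k in range(mid, lo, -1):    # shift a[lo:mid] right one slot
                a[k] = a[k - 1]
            a[lo] = v
            mid += 1
        lo += 1

a = [1, 3, 5, 2, 4, 6]
inplace_stable_merge(a, 0, 3, 6)   # a becomes [1, 2, 3, 4, 5, 6]
```

As far as I can tell, the cited in-place merge sort algorithms replace this element-by-element shifting with block rotations and internal buffers to recover \(O(n \log n)\) time, which is where the complexity I could not follow comes in.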