I've been thinking about this question ever since I learnt about the $O(n\log(n))$ sorting algorithms such as MergeSort, QuickSort (whose average case is essentially its worst case given a good choice of pivot) and HeapSort. I've been wondering: can we possibly achieve a better worst-case complexity class? Could we do so using specific hardware implementations, or with a sort that uses a large amount of extra space? There are indeed RadixSort and BucketSort, but those are only $O(n)$ in specific use cases.
2 Answers
To expand on what blue-dino wrote: it is indeed impossible to build a general sorting algorithm whose worst-case (or even average-case) complexity is better than $O(n\log(n))$. Here "general sorting algorithm" refers to the comparison-based model of sorting: the only operation the algorithm may perform on elements is a comparison of two of them, which tells us which one is smaller (or that they are equal), and we get no further information.
The proof of this lower bound is built around a "comparison tree" whose leaves cover all possible orderings of $n$ distinct elements, and it shows that such a tree must always contain a branch of length $\Omega(n\log(n))$, which corresponds to an equally long run of the algorithm. The same bound can also be shown to hold for the average length of a branch in the comparison tree, which gives a lower bound on the average-case complexity as well.
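To make the counting argument concrete, here is a short sketch (the standard calculation, not tied to any particular textbook's presentation): a comparison tree that sorts every input must have at least $n!$ leaves, one per possible ordering, and a binary tree of height $h$ has at most $2^h$ leaves, so

$$2^h \ge n! \quad\Longrightarrow\quad h \ge \log_2(n!) \ge \log_2\!\left(\left(\tfrac{n}{2}\right)^{n/2}\right) = \tfrac{n}{2}\log_2\tfrac{n}{2} = \Omega(n\log n),$$

where the middle inequality uses the fact that the largest $n/2$ factors of $n!$ are each at least $n/2$.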
No, $n\lg n$ is a theoretical lower bound for general (comparison-based) sorting. You can get linear time for special cases. A proof of this fact can be found in Cormen et al.'s Introduction to Algorithms (CLRS).
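As an illustration of the kind of special case the question mentions (RadixSort/BucketSort-style sorting of bounded integer keys), here is a minimal counting-sort sketch; the function name and the `max_key` parameter are my own choices for the example, not something from either answer:

```python
def counting_sort(values, max_key):
    """Sort non-negative integers in [0, max_key] in O(n + max_key) time.

    This sidesteps the Omega(n log n) bound only because it never compares
    elements: it exploits the extra structure of bounded integer keys.
    """
    counts = [0] * (max_key + 1)
    for v in values:                 # tally each key: O(n)
        counts[v] += 1
    result = []
    for key, c in enumerate(counts): # emit each key in order, c times: O(n + max_key)
        result.extend([key] * c)
    return result

print(counting_sort([3, 1, 4, 1, 5, 9, 2, 6], max_key=9))
# [1, 1, 2, 3, 4, 5, 6, 9]
```

The moment the keys can come from an unbounded domain and the only available operation is pairwise comparison, this trick no longer applies and the $\Omega(n\log n)$ bound kicks in.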