Is it theoretically possible to sort an array of n integers in amortized O(n) time?
What about achieving a worst-case complexity of O(n)?
Any page on the intertubes that deals with comparison-based sorts will tell you that you cannot sort faster than O(n lg n) with comparison sorts. That is, if your sorting algorithm decides the order solely by comparing two elements against each other, you cannot do better than that in the worst case. Examples include quicksort, bubble sort, and mergesort.
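In case it helps, the usual counting argument behind that bound goes like this: a comparison sort can be viewed as a binary decision tree with at least n! leaves, one per possible input ordering, so its height (the worst-case number of comparisons) is at least

$$\lg(n!) \;\ge\; \lg\!\left(\left(\tfrac{n}{2}\right)^{n/2}\right) \;=\; \tfrac{n}{2}\,\lg\tfrac{n}{2} \;=\; \Omega(n \lg n).$$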
Some algorithms, like counting sort, bucket sort, or radix sort, do not use comparisons. Instead, they rely on properties of the data itself, such as the range of its values or the number of digits in each value.
Those algorithms might have faster complexities. Here is an example scenario:
You are sorting `10^6` integers, and each integer is between `0` and `10`. Then you can just count the number of zeros, ones, twos, etc., and spit them back out in sorted order. That is how counting sort works, in `O(n + m)`, where `m` is the number of values your datum can take (in this case, `m = 11`).
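A minimal counting-sort sketch in Python for that scenario (the function name and the hard-coded `0..10` range are illustrative assumptions, not anything from the question):

```python
import random

def counting_sort(data, max_value):
    # One counter per possible value: m = max_value + 1 buckets.
    counts = [0] * (max_value + 1)
    for x in data:                               # O(n): tally each value
        counts[x] += 1
    result = []
    for value, count in enumerate(counts):      # O(m) passes, O(n) output
        result.extend([value] * count)
    return result

# Example: a million integers, each between 0 and 10.
data = [random.randint(0, 10) for _ in range(10**6)]
assert counting_sort(data, 10) == sorted(data)
```

Note there is not a single comparison between two data elements anywhere; the values index directly into the `counts` array, which is exactly why the `O(n lg n)` bound does not apply.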
Another:
You are sorting `10^6` binary strings that are all at most `5` characters in length. You can use radix sort for that: do one bucketing pass per character position, starting from the last (fifth) character and working back to the first, with a position past a short string's end counting as lower than any character. As long as each pass is a stable sort, ties from later passes preserve the order established by earlier ones, and you end up with a perfectly sorted list in `O(nm)`, where `m` is the number of digits or bits in your datum (in this case, `m = 5`).
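Here is a least-significant-character-first (LSD) radix sort sketch in Python for that scenario (the sentinel bucket for short strings is my own assumption about how to order strings of different lengths):

```python
def radix_sort_binary(strings, width=5):
    # Stable LSD radix sort for binary strings of length <= width.
    # Missing positions (short strings) map to a sentinel bucket that
    # sorts before '0', so "0" correctly precedes "00".
    order = {None: 0, "0": 1, "1": 2}
    for pos in range(width - 1, -1, -1):      # least significant pass first
        buckets = [[], [], []]                # sentinel, '0', '1'
        for s in strings:
            key = s[pos] if pos < len(s) else None
            buckets[order[key]].append(s)     # appending keeps each pass stable
        strings = [s for bucket in buckets for s in bucket]
    return strings

print(radix_sort_binary(["101", "1", "0", "11", "00010"]))
# -> ['0', '00010', '1', '101', '11']
```

Each of the `m` passes touches all `n` strings once, giving the `O(nm)` total; since `m` is a constant `5` here, that is effectively linear in `n`.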
But in the general case, you cannot reliably sort faster than O(n lg n) with a comparison sort.