What sort algorithm provides the best worst-case performance?

[愿得一人] 2020-12-15 14:44

What is the fastest known sort algorithm for the absolute worst case? I don't care about the best case, and I'm assuming a gigantic data set, if that even matters.

16 answers
  • 2020-12-15 14:52

    For the man with a limitless budget

    Facetious but correct: sorting networks trade space (in real hardware terms) for better-than-O(n log n) sorting!

    Without resorting to such hardware (which is unlikely to be available), you have a lower bound of O(n log n) for the best comparison sorts.

    O(n log n) worst case performance (no particular order)

    • Binary Tree Sort
    • Merge Sort
    • Heap Sort
    • Smooth Sort
    • Intro Sort

    Beating the n log n bound

    If your data is amenable to it, you can beat the n log n restriction, but then the running time depends on the number of bits in the input data as well.

    Radix sort and bucket sort are probably the best-known examples of this. Without more information about your particular requirements, it is not fruitful to consider these in more depth.
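    To make the trade-off concrete, here is a minimal LSD radix sort sketch for non-negative integers. It runs in O(d * (n + base)) time, where d is the number of digits, so it beats the comparison-sort lower bound when keys are short; the function name and base parameter are illustrative choices, not from the original answer.

```python
def radix_sort(nums, base=10):
    """LSD radix sort for non-negative integers.

    Runs in O(d * (n + base)) where d is the digit count of the
    largest key, sidestepping the O(n log n) comparison lower bound.
    """
    if not nums:
        return []
    result = list(nums)
    max_val = max(result)
    exp = 1
    while max_val // exp > 0:
        # Distribute into buckets by the current digit (stable).
        buckets = [[] for _ in range(base)]
        for x in result:
            buckets[(x // exp) % base].append(x)
        # Collect buckets in order; stability makes later passes correct.
        result = [x for bucket in buckets for x in bucket]
        exp *= base
    return result
```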

  • 2020-12-15 14:58

    Quicksort is usually the fastest, but if you want a good worst case, try Heapsort or Mergesort. Both have O(n log n) worst-case performance.
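    As a sketch of the heapsort option, the standard-library heap gives the O(n log n) worst-case bound directly: heapify is O(n) and each of the n pops is O(log n), regardless of input order. The function name here is illustrative.

```python
import heapq

def heap_sort(items):
    """Heapsort via the standard-library binary heap.

    heapify is O(n); each of the n pops is O(log n), so the
    worst case is O(n log n) for any input order.
    """
    heap = list(items)
    heapq.heapify(heap)
    return [heapq.heappop(heap) for _ in range(len(heap))]
```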

  • 2020-12-15 14:58

    See Quick Sort Vs Merge Sort for a comparison of Quicksort and Mergesort, which are two of the better algorithms in most cases.

  • 2020-12-15 14:58

    I've always preferred merge sort, as it's stable (meaning that if two elements are equal from a sorting perspective, then their relative order is explicitly preserved), but quicksort is good as well.
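    A minimal top-down merge sort sketch showing where the stability comes from: the merge step takes from the left half on ties, so equal keys keep their original relative order. The key parameter is an illustrative addition to make the stability visible.

```python
def merge_sort(items, key=lambda x: x):
    """Stable top-down merge sort, O(n log n) in the worst case."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid], key)
    right = merge_sort(items[mid:], key)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        # "<=" prefers the left element on ties, preserving the
        # original relative order of equal keys (stability).
        if key(left[i]) <= key(right[j]):
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

    Sorting records by a non-unique key shows the guarantee: elements with equal keys come out in the order they went in.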

  • 2020-12-15 14:59

    If you have a gigantic data set (i.e. much larger than available memory), you likely have your data on disk/tape/something-with-expensive-random-access, so you need an external sort.

    Merge sort works well in that case; unlike most other sorts, it doesn't involve random reads/writes.
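    The external-sort pattern can be sketched in a few lines: sort memory-sized chunks (in real use each sorted run would be written to its own file on disk), then do a streaming k-way merge, which reads every run strictly sequentially. This in-memory version is a simplification, not a full external sort.

```python
import heapq

def external_sort(stream, chunk_size):
    """Sketch of an external merge sort.

    Sorts memory-sized runs (here kept in memory; on real data each
    run would be spilled to disk), then lazily k-way merges them so
    every run is read sequentially. Merge cost: O(n log k) comparisons
    for k runs.
    """
    runs, chunk = [], []
    for item in stream:
        chunk.append(item)
        if len(chunk) == chunk_size:
            runs.append(sorted(chunk))  # in real use: write run to disk
            chunk = []
    if chunk:
        runs.append(sorted(chunk))
    return heapq.merge(*runs)  # lazy, sequential k-way merge
```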

  • 2020-12-15 15:04

    It depends both on the type of data and the type of resources. For example, there are parallel algorithms that beat Quicksort, but given how you asked the question, it's unlikely you have access to them. There are times when the "worst case" for one algorithm is the "best case" for another (nearly sorted data is problematic with Quick and Merge, but fast with much simpler techniques).
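    Insertion sort is the classic example of such a simpler technique: O(n^2) in the worst case, but roughly O(n + d) where d is the number of inversions, so it is near-linear on almost-sorted input. A minimal sketch:

```python
def insertion_sort(items):
    """Insertion sort: O(n^2) worst case, but O(n + d) where d is
    the number of inversions, so near-linear on almost-sorted data.
    """
    a = list(items)
    for i in range(1, len(a)):
        x = a[i]
        j = i - 1
        # Shift larger elements right; on nearly sorted input this
        # inner loop rarely runs, giving close-to-linear time.
        while j >= 0 and a[j] > x:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x
    return a
```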
