
Comparison-based sorting algorithms perform a number of different operations to accomplish the sorting, so why are comparisons treated as the dominant time cost? While I understand the standard analyses of the asymptotic behavior of the number of comparison operations, I don't quite understand why the costs of the other types of operations are negligible.

[Edit 2014-08-26]

If I run the same mergesort implementation on two different computers (with possibly different architectures, etc.), how can one argue that the running time of mergesort divided by the number of compares approaches a (possibly different) constant as the problem size increases?
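
One way to see this empirically (a minimal sketch of my own, not part of the original question): instrument mergesort to count element comparisons, then run a doubling experiment and watch the ratio of wall-clock time to compares. On a given machine the ratio should level off at a machine-dependent constant:

```python
import random
import time

def mergesort(a, counter):
    """Top-down mergesort that tallies element comparisons in counter[0]."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = mergesort(a[:mid], counter)
    right = mergesort(a[mid:], counter)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1                      # one element comparison
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

n = 1 << 12
while n <= 1 << 18:                          # doubling experiment
    data = [random.random() for _ in range(n)]
    counter = [0]
    t0 = time.perf_counter()
    mergesort(data, counter)
    elapsed = time.perf_counter() - t0
    print(f"n={n:>8}  time/compares = {elapsed / counter[0]:.3e} s")
    n *= 2
```

Running the same script on two machines should yield two different but individually stable constants, which is exactly the behavior the edit asks about.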

user78219

2 Answers


Because the number of comparisons dominates the number of other operations. Since the costs of all basic operations are usually about the same (within a constant factor of each other), the costs of the non-comparison operations get absorbed into the constants of the asymptotic measure.
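
To make the constant-factor claim concrete, here is a hedged sketch (the instrumentation is mine, not the answerer's) that counts comparisons and the other dominant per-step work, element shifts, in insertion sort; the shift count never exceeds the comparison count, so it vanishes into the asymptotic constant:

```python
import random

def insertion_sort_counts(a):
    """Insertion sort that tallies comparisons and element shifts separately."""
    compares = shifts = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            compares += 1              # one element comparison
            if a[j] <= key:
                break
            a[j + 1] = a[j]            # at most one shift per comparison
            shifts += 1
            j -= 1
        a[j + 1] = key
    return compares, shifts

for n in (1000, 2000, 4000):
    data = [random.random() for _ in range(n)]
    c, s = insertion_sort_counts(data)
    print(f"n={n}: compares={c}, shifts={s}, shifts/compares={s/c:.2f}")
```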

zpavlinovic

It mostly has to do with the nature of the algorithm. In selection or insertion sort, for example, the important work is the comparisons: the extra loop overhead is subsumed in that count. In essence, as @bellpeace noted, each comparison comes with at most a constant number of loop operations, so in the long run you are only adding a constant multiple to the work done. This would be different if you were considering, say, radix sort, where you aren't comparing elements but rather inspecting their bit (or digit) values, which you can do without comparisons (more or less).
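
For contrast, a hedged sketch of LSD radix sort (the implementation details are mine, not from the answer): elements are routed into buckets by their digit values and are never compared against one another; the single `max` call only fixes the number of digit passes.

```python
def radix_sort(nums, base=10):
    """LSD radix sort for non-negative integers; no element-to-element compares."""
    if not nums:
        return nums
    largest = max(nums)        # only determines how many digit passes to make
    exp = 1
    while largest // exp > 0:
        buckets = [[] for _ in range(base)]
        for x in nums:
            buckets[(x // exp) % base].append(x)   # route by current digit
        nums = [x for bucket in buckets for x in bucket]
        exp *= base
    return nums

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
# [2, 24, 45, 66, 75, 90, 170, 802]
```

Here the natural unit of work is a digit inspection rather than a comparison, which is why comparison counting is the right cost model only for comparison-based sorts.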

Rick Decker