
Big-O describes an upper bound on run time. Is that not the definition of "worst-case"?

For example, how can we say that a hash table insertion requires O(1) time on average? Constant time is the best case, and n time is the worst case. How can an upper bound describe an average run time? In what sense is that bound "upper"?

Cow Pow Pow

1 Answer


Wikipedia says:

Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity.

So Big-O is, by definition, just an upper bound on a function's growth; it is not the definition of "worst case".

Wikipedia defines best, worst, and average cases as

In computer science, best, worst, and average cases of a given algorithm express what the resource usage is at least, at most and on average, respectively.

By resource we usually refer to the time and/or space (memory) required to solve a certain problem.

It also says:

Big O notation is a convenient way to express the worst-case scenario for a given algorithm, although it can also be used to express the average-case - for example, the worst-case scenario for quicksort is $O(n^2)$, but the average-case run-time is $O(n\log{n})$.
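The quicksort contrast in the quote above can be made concrete by counting work directly. Here is a small sketch (the first-element-pivot implementation and names are invented for illustration) that counts comparisons on a random input versus an already-sorted one, where a first-element pivot degenerates:

```python
import random
import sys

sys.setrecursionlimit(5000)  # sorted input recurses ~n levels deep

def quicksort(a):
    """Quicksort with a first-element pivot.
    Returns (sorted list, number of element comparisons)."""
    if len(a) <= 1:
        return a, 0
    pivot, rest = a[0], a[1:]
    less, more = [], []
    for x in rest:               # one comparison per element in this pass
        (less if x < pivot else more).append(x)
    ls, lc = quicksort(less)
    ms, mc = quicksort(more)
    return ls + [pivot] + ms, lc + mc + len(rest)

n = 500
random.seed(0)
_, avg_cmp = quicksort([random.random() for _ in range(n)])
_, worst_cmp = quicksort(list(range(n)))  # sorted input: worst case for this pivot

print(avg_cmp)    # on the order of n log n for a random input
print(worst_cmp)  # exactly n(n-1)/2 = 124750: the O(n^2) worst case
```

The same algorithm has two different cost functions, and Big-O bounds each of them separately: the worst-case function is in O(n²), the average-case function is in O(n log n).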

To sum up: Big-O notation is a purely mathematical concept for upper-bounding a function's growth. In computer science it is most often applied to an algorithm's worst-case running time, but it can equally well bound the average-case running time. That resolves the apparent contradiction in the question: "O(1) on average" means the average-case cost function, taken over all inputs, is bounded above by a constant, while the worst-case cost function of the same algorithm may grow like n.
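The hash table case from the question can be demonstrated with a toy chaining table (a hedged sketch; `ToyHashTable` and the adversarial hash function are invented for illustration). With a reasonable hash, chains stay short and each insert touches O(1) entries on average; with a hash that sends every key to one bucket, the n-th insert scans n-1 entries:

```python
class ToyHashTable:
    """Minimal separate-chaining hash table for cost measurement."""

    def __init__(self, nbuckets=64, hash_fn=hash):
        self.buckets = [[] for _ in range(nbuckets)]
        self.hash_fn = hash_fn

    def insert(self, key, value):
        """Insert or update a key; return how many entries were scanned."""
        bucket = self.buckets[self.hash_fn(key) % len(self.buckets)]
        probes = 0
        for i, (k, _) in enumerate(bucket):  # cost = length of this chain
            probes += 1
            if k == key:
                bucket[i] = (key, value)
                return probes
        bucket.append((key, value))
        return probes

good = ToyHashTable()                    # default hash spreads keys across buckets
bad = ToyHashTable(hash_fn=lambda k: 0)  # adversarial hash: every key collides

good_cost = [good.insert(i, i) for i in range(1000)]
bad_cost = [bad.insert(i, i) for i in range(1000)]

print(max(good_cost))  # chains stay short: each insert scans only a few entries
print(max(bad_cost))   # one long chain: the last insert scans 999 entries, O(n)
```

Both statements use Big-O as an upper bound; they just bound different cost functions of the same operation.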


fade2black