
I am new to Advanced Algorithms and I have studied various examples on Google and StackExchange. What I understand is:

  1. We get $O(\log n)$ complexity when the input of size $n$ is divided (e.g., halved) on each recursive call (especially in divide and conquer).

  2. I know that for binary search we have time complexity $O(n \log n)$. I understood the $\log n$ part: each recursive step halves the full list of $n$ elements until the required element is found. But why is it multiplied by $n$, when we only traverse half of the remaining elements on each step?

  3. Please give me an example explaining the complexity $O(n^2 \log n)$. I hope this will help me understand the above two questions much better.


1 Answer

  1. You are correct in thinking binary search is $O(\log n)$; it shouldn't be multiplied by $n$.

  2. Popular (comparison-based) sorting algorithms are $O(n \log n)$.

  3. 3SUM (i.e., finding 3 elements in an array that sum to zero) using binary search is $O(n^2 \log n)$.

    The pseudo-code:

    Sort the array
    For each element
      For each other element
        Do a binary search for a 3rd element that completes a zero sum
    

    Although the problem can be solved in $O(n^2)$ in a different way, this should still serve as a decent example.
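
Here is a minimal Python sketch of that approach (the function name `three_sum_exists` is my own; the binary search uses `bisect_left` from Python's standard library):

    from bisect import bisect_left

    def three_sum_exists(nums):
        """True if some three elements sum to zero.

        O(n log n) sort, then O(n^2) pairs times an O(log n)
        search each: O(n^2 log n) overall.
        """
        a = sorted(nums)  # binary search requires sorted input
        n = len(a)
        for i in range(n):
            for j in range(i + 1, n):
                target = -(a[i] + a[j])
                # Search strictly to the right of j so no element is reused.
                k = bisect_left(a, target, j + 1)
                if k < n and a[k] == target:
                    return True
        return False

    print(three_sum_exists([-4, -1, 0, 2, 5]))  # True: -4 + -1 + 5 = 0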

Explanation of merge-sort complexity:

Merge-sort, for example, splits the array into 2 parts repeatedly (similar to binary search), but there are some differences:

  • Binary search throws away the other half, whereas merge-sort processes both halves.
  • Binary search does a simple $O(1)$ check at each step, whereas merge-sort needs to do an $O(n)$ merge at each level. This should already make the difference between $O(\log n)$ and $O(n \log n)$ make sense.

For a quick check, ask how much work, on average, is done for each element.
Note that a single merge is linear time, thus $O(1)$ per element.
You recurse down $O(\log n)$ levels, and at each level there's a merge, so each element is involved in $O(\log n)$ merges.
And there are $O(n)$ elements.
Thus we have a time complexity of $O(n \log n)$.
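
To make the two differences above concrete, here is a minimal merge-sort sketch in Python (a standard textbook implementation, not from the original post); the comments mark the recursion into both halves and the $O(n)$ merge:

    def merge_sort(a):
        """O(log n) levels of recursion, O(n) merge work per level."""
        if len(a) <= 1:
            return a  # base case: already sorted
        mid = len(a) // 2
        left = merge_sort(a[:mid])    # unlike binary search, we recurse
        right = merge_sort(a[mid:])   # into BOTH halves
        # The merge touches every element once: O(n) at this level.
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]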

There is also a more mathematical analysis:

Let $T(n)$ be the time used to sort $n$ elements. As we can perform the splitting and merging in linear time, these two steps take $cn$ time, for some constant $c$. So,
$$T(n) = 2T(n/2) + cn.$$

From here you work your way down to $T(1)$, and the remaining terms give you the $O(n \log n)$ running time. Or use the master theorem.
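
Concretely, unrolling the recurrence (assuming $n$ is a power of $2$):
$$T(n) = 2T(n/2) + cn = 4T(n/4) + 2cn = \dots = 2^k \, T(n/2^k) + kcn.$$
After $k = \log_2 n$ levels we reach $T(1)$, so $T(n) = nT(1) + cn \log_2 n = O(n \log n)$.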

If both of these are unclear, it should be easy enough to find another resource explaining the complexity of merge-sort.
