
I came across the following problem:

Find the time complexity of the recurrence relation below:

$T(n)=\begin{cases} 2T(n/2)+C, & n>1\\ C, & n=1 \end{cases}$

The solution was given as follows:

$\begin{align} T(n) & = C+2C+4C+\cdots+nC\\ & = C (1+2+4+\cdots+2^k) \quad\text{where } 2^k=n \\ & = C \left(\frac{1 \cdot (2^{k+1}-1)}{2-1}\right) \\ & = C (2^{k+1}-1) \\ & = C (2n-1) \\ & = O(n) \\ \end{align}$

I feel the above gives the order of the return value, but not the order of the computation involved.

This function makes two recursive calls with argument $n/2$. Each of these two makes another two recursive calls, for a total of 4, with argument $n/2/2=n/4$. Each of these four makes another two, for a total of 8, with argument $n/2/2/2=n/8$. And so on. Thus the calls form a complete binary tree, which terminates when $n/2^h=1$, that is, when $2^h=n$.
$\therefore$ the height of the binary tree is $h=\log_2 n$, and the order of computation will be the number of nodes in the complete binary tree of height $h$, which equals $2^{h+1}-1=2^{\log_2 n+1}-1=O(2^{\log_2 n})$.
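As a quick sanity check (a rough sketch of my own, not part of the given solution; it assumes $n$ is a power of two), counting the nodes of this call tree in Python agrees with $2^{h+1}-1 = 2n-1$:

```python
# Rough sketch: count the nodes of the call tree of T(n), assuming n is a power of two.
def calls(n):
    """Number of invocations made while evaluating T(n)."""
    if n == 1:
        return 1                   # leaf of the call tree
    return 1 + 2 * calls(n // 2)   # this node plus its two recursive subtrees

for n in (1, 2, 4, 8, 16, 1024):
    assert calls(n) == 2 * n - 1   # equals 2^(h+1) - 1 with h = log2(n)
```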

Am I correct about this final time complexity and the overall interpretation (the first being the time complexity of the return value, and the one I came up with being the time complexity of the computation involved)?

RajS

3 Answers


Here is a correct version of your original problem.

Find the time complexity of an algorithm which takes $T(n)$ time/operations to finish when the input size is $n$, where $T(n)$ satisfies the initial condition

$$T(1) = C$$ and the recurrence relation $$T(n) = 2T(n/2)+C\ \text{ if } n>1$$ for some constant $C>0$.

Another correct version would be "Find the asymptotics, in terms of big $O$ or $\Omega$ notation, of the function $T(n)$ that ...".

My answer is based on the above versions of your original problem.

Your question, and much more beyond it, has been answered in greater generality in this reference question and answer, and in particular by the first case of the master theorem.

Assume throughout this answer and the question that $n$ is a positive integer and that "/" denotes integer division when applied to two positive integers, as in Python/Java/C/C++/C#. Without these assumptions, $T(1.5)$ and $T(3)$ may not be defined.

Here is a correct computation. Suppose $2^h\le n\lt 2^{h+1}$ for some nonnegative integer $h$. \begin{align} T(n) & = 2T(n/2) + (2^1-1)C\\ & = 2(2T(n/4) + C) + (2^1-1)C\\ & = 2^2T(n/2^2) + (2^2-1)C\\ & \cdots\\ & = 2^hT(1) + (2^h-1)C\\ & = (2^{h+1}-1)C\\ & = \Omega(n) \end{align} The last step follows from the fact that $Cn\le (2^{h+1}-1)C\lt 2Cn$.
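As a sanity check (a small sketch of mine, not part of the derivation above; the constant $C$ and the range of $n$ are arbitrary choices), evaluating the recurrence with integer division matches the closed form $(2^{h+1}-1)C$:

```python
# Sketch: evaluate T(n) with integer division and compare to (2^(h+1) - 1) * C,
# where h is the unique nonnegative integer with 2^h <= n < 2^(h+1).
C = 1  # any positive constant

def T(n):
    return C if n == 1 else 2 * T(n // 2) + C

for n in range(1, 1000):
    h = n.bit_length() - 1                  # 2^h <= n < 2^(h+1)
    assert T(n) == (2 ** (h + 1) - 1) * C
```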

John L.

The question is ill-posed. Recurrence relations don't have time complexities.

  • Recurrences have solutions.
  • Algorithms have running times.
  • Problems have time complexities.
David Richerby

You could also use the Master Theorem:

$T(n) = aT(\frac{n}{b})+f(n)$

In your case, $a = 2$, $b = 2$, and $f(n) = \Theta(1)$.

Since $f(n) = \Theta(n^0)$ and $0 < \log_2 2$, Case 1 applies, and therefore

$T(n) \in \Theta(n^{\log_2 2})= \Theta(n)$.
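For completeness, the case being invoked is, in one standard formulation (not quoted from the post above):

$$f(n) = O\!\left(n^{\log_b a - \epsilon}\right) \text{ for some } \epsilon > 0 \;\Longrightarrow\; T(n) = \Theta\!\left(n^{\log_b a}\right),$$

which applies here with $\epsilon = 1$, since $f(n) = \Theta(1) = O\!\left(n^{\log_2 2 - 1}\right)$.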

Fred Guth