
I am trying to either solve or find a tight bound $\Theta$ for the following recurrence relation:

$$T(n) = x + T(n-\log_2 n)\,.$$

Here $x$ is some nonzero constant (we may take $x = 1$ for simplicity). So far, I have been able to prove the following:

\begin{align*} T(n) &\in O(n)\\ T(n) &\notin \Omega(n)\\ T(n) &\in \Omega (n^p)\quad \forall p \in (0,1)\,. \end{align*}

But now I am stuck: I cannot figure out how to solve the recurrence, or how to find a $g(n)$ for which I can prove a tight bound $T(n) \in \Theta(g(n))$. Does anybody know how to solve this?
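For concreteness, here is a minimal numerical sketch (assuming $x = 1$ and a hypothetical base case $T(m) = 0$ for $m \le 2$); the ratio $T(n)/n$ keeps shrinking, which matches $T(n) \notin \Omega(n)$:

```python
from math import log2

def T(n, x=1.0):
    """Iteratively evaluate T(n) = x + T(n - log2 n),
    assuming a hypothetical base case T(m) = 0 for m <= 2."""
    total, m = 0.0, float(n)
    while m > 2:
        total += x
        m -= log2(m)
    return total

for n in (10**3, 10**4, 10**5, 10**6):
    print(n, T(n), T(n) / n)
```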

If you want to know, this recurrence arises in the following context: in maxiphobic heaps [1], where the "weight" of a heap is its number of nodes, the merge operation is recursive, but its complexity is $\Theta(\log n)$ even in the worst case. If the meaning of the weight is changed to the height of the heap, the complexity on average random inputs remains $\Theta(\log n)$, but one can construct a pathological worst-case input in which each merge step recursively calls itself on almost all of its input, shedding only a degenerate subtree of roughly $\log_2 n$ elements. I want to find a tight bound for the complexity of that worst case.

[1] C. Okasaki. Alternatives to Two Classic Data Structures. In Proceedings of the 36th SIGCSE Technical Symposium on Computer Science Education, pp. 162–165. ACM, 2005.

josedavid

1 Answer


Assuming an appropriate base case, it is easy to see that $T(n) \geq (n/\log_2 n) \cdot x$: at each step we subtract at most $\log_2 n$, so it takes at least $n/\log_2 n$ steps to reach zero.
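As a quick sanity check (not a proof step), the number of recursion steps can be counted numerically under the same hypothetical assumptions ($x = 1$, base case $T(m) = 0$ for $m \le 2$) and compared with the $(n-2)/\log_2 n$ steps that the argument above guarantees are needed to get from $n$ down to the base case:

```python
from math import log2

def steps(n):
    """Count recursion steps until the argument drops to the assumed
    base case m <= 2; each step subtracts log2 of the current argument,
    which is at most log2 n."""
    count, m = 0, float(n)
    while m > 2:
        m -= log2(m)
        count += 1
    return count

for n in (10**3, 10**4, 10**5, 10**6):
    s = steps(n)
    # each step removes at most log2(n), so at least (n - 2)/log2(n) steps are needed
    print(n, s, s >= (n - 2) / log2(n))
```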

In the other direction, denote by $m$ the current argument ($n$, then $n-\log_2 n$, and so on). As long as $m \geq \sqrt{n}$, the argument decreases by at least $(\log_2 n)/2$ at each step, and so it takes at most $2n/\log_2 n$ steps to reach $\sqrt{n}$. Then it takes at most $4\sqrt{n}/\log_2 n$ steps to reach $n^{1/4}$, and so on. After $\log\log n$ such phases, $m$ is $O(1)$. This gives an upper bound proportional to $x$ times $$ \frac{2n}{\log_2 n} + \frac{4\sqrt{n}}{\log_2 n} + \cdots \leq \frac{2n}{\log_2 n} + 4\sqrt{n} + 8n^{1/4} + \cdots + 2\log n \cdot n^{1/\log n} = O(n/\log n + \sqrt{n}\log\log n) = O(n/\log n). $$ Thus $T(n) = \Theta(xn/\log n)$.
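Again purely as a numerical sketch (with $x = 1$ and a hypothetical base case $T(m) = 0$ for $m \le 2$), the ratio $T(n)\log_2 n / n$ stays bounded, consistent with the $\Theta(n/\log n)$ bound:

```python
from math import log2

def T(n, x=1.0):
    """Iteratively evaluate T(n) = x + T(n - log2 n),
    assuming a hypothetical base case T(m) = 0 for m <= 2."""
    total, m = 0.0, float(n)
    while m > 2:
        total += x
        m -= log2(m)
    return total

for n in (10**3, 10**4, 10**5, 10**6):
    # a bounded ratio is consistent with T(n) = Theta(n / log n)
    print(n, T(n) * log2(n) / n)
```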

Yuval Filmus