10

What is the complexity of computing $n^{n^2},\;n \in \mathbb{N}$?

Raphael
  • 73,212
  • 30
  • 182
  • 400

2 Answers

12

By using the fast Fourier transform, multiplications on $k$-bit numbers can be done in time $\tilde{O}(k)$ (where the tilde signifies that we're ignoring polylogarithmic factors). By repeated squaring, we can compute $n^{n^2}$ with $O(\log n)$ multiplications, and each multiplication involves no number larger than $n^{n^2}$, which has roughly $n^2 \log_2 n$ bits. So the total amount of time required is $\tilde{O}(n^2(\log n)^2)=\tilde{O}(n^2)$.
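
For concreteness, here is a minimal square-and-multiply sketch in Python (which has arbitrary-precision integers); in practice one would just call `pow(n, n*n)`, and actually achieving the $\tilde{O}(k)$ per-multiplication bound requires FFT-based big-integer multiplication (CPython, for instance, uses Karatsuba instead).

```python
def power_by_squaring(base: int, exp: int) -> int:
    """Compute base**exp using O(log exp) multiplications (square-and-multiply)."""
    result = 1
    while exp > 0:
        if exp & 1:       # current exponent bit is 1: fold base into the result
            result *= base
        base *= base      # square for the next bit of the exponent
        exp >>= 1
    return result

n = 7
assert power_by_squaring(n, n * n) == n ** (n * n)  # O(log n^2) = O(log n) multiplications
```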

Micah
  • 221
  • 1
  • 6
3

Edited in response to comments. The time to compute $f(n) = n^{n^2}$ can be decomposed into the time required to compute $f_1(n) = n^2$ and the time required to compute $n^{f_1(n)}$. I'll assume that multiplying an $a$-bit number by a $b$-bit number takes exactly $ab$ time by the schoolbook method; additions, etc., take constant time. As a result, computing $n^2$ takes $\log_2^2 n$ time.
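
To make the $ab$ cost model concrete, here is a minimal sketch (mine, not part of the answer) of base-2 schoolbook multiplication: each of the $b$ bits of one operand triggers at most one addition of a shifted $a$-bit number, for roughly $ab$ bit operations in total.

```python
def schoolbook_multiply(x: int, y: int) -> int:
    """Base-2 long multiplication: for each bit of y, conditionally add a
    shifted copy of x. With x having a bits and y having b bits, this does
    on the order of a*b bit operations -- the cost model assumed above."""
    result, shift = 0, 0
    while y > 0:
        if y & 1:
            result += x << shift  # add x aligned to the current bit of y
        y >>= 1
        shift += 1
    return result

assert schoolbook_multiply(1000, 1000) == 10**6  # squaring n costs ~log2(n)^2
```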

Suppose that we use binary exponentiation to compute $f(n)$. Binary exponentiation performs two kinds of operations: squaring the current product, and multiplying the current product by $n$, according to whether the current bit in the binary expansion of the exponent $n^2$ is 0 or 1. In the worst case, $n^2$ is a power of two, so that binary exponentiation repeatedly squares its current product until it reaches the answer. Note that $n^2$ has $m' = \lceil 2 \log_2 n \rceil$ bits, so that the number of such squarings is $m = m' - 1$. This is the case we analyze further below.
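
Before doing the arithmetic, a short Python sketch (the function name is mine) illustrates this worst case: each squaring roughly doubles the bit length of the current product.

```python
def squaring_bit_lengths(n: int, m: int) -> list[int]:
    """Square n repeatedly m times (the all-squarings worst case, where the
    exponent is a power of two) and record each intermediate bit length."""
    x, lengths = n, []
    for _ in range(m):
        x *= x  # one squaring step of binary exponentiation
        lengths.append(x.bit_length())
    return lengths

print(squaring_bit_lengths(1000, 5))  # [20, 40, 80, 160, 319]: roughly doubling
```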

The first squaring takes $t_1 = \log_2^2 n$ time, resulting in an $o_1 = 2 \log_2 n$-bit product. The second squaring multiplies two $o_1$-bit numbers and runs in $t_2 = o_1^2$ time, resulting in an $o_2 = 2 o_1$-bit product. Continuing, the $i$-th step takes $t_i = 4^{i-1} \log_2^2 n$ time and outputs an $o_i = 2^i \log_2 n$-bit product. This process stops at the $m$-th step; as a result, it takes time

$T_{exp} = \sum_{i=1}^{m} t_i = \log_2^2 n \sum_{i=1}^{m} 4^{i-1} = \frac{4^m - 1}{3} \log_2^2 n$.

When the cost of the initial squaring that computes the exponent $n^2$ is included, we need time at most

$T_{exp} + \log_2^2 n$.
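
Under the same idealized cost model (bit lengths treated as reals that exactly double per squaring, floors and ceilings dropped), a short sketch can confirm the geometric sum above; the function name is illustrative only.

```python
import math

def simulated_exponentiation_cost(n: int, m: int) -> float:
    """Sum the per-step schoolbook costs t_i = o_{i-1}^2, where o_0 = log2(n)
    and each squaring doubles the operand's bit length."""
    o, total = math.log2(n), 0.0
    for _ in range(m):
        total += o * o  # schoolbook squaring of an o-bit number costs o^2
        o *= 2          # the product has twice as many bits
    return total

n, m = 1000, 10
closed_form = (4**m - 1) / 3 * math.log2(n) ** 2
assert math.isclose(simulated_exponentiation_cost(n, m), closed_form)
```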

Note

  • I omitted some floors and ceilings in the computations, hoping they would not materially affect the answer.
  • I deliberately omitted an $O$-based analysis in favour of an exact upper bound just to be rigorous.
  • The above reasoning also makes clear why my earlier analysis was flawed. The $O$ notation was used in a fast-and-loose way, and it conveniently omitted constants so that, for instance, $\sum t_i $ magically became $O(\log n)$.
  • The multiplications can always be sped up by the FFT and other fast-multiplication methods.
PKG
  • 1,489
  • 10
  • 15