Everyone keeps claiming that integer factoring is in $NP$ but I just don't get it... Even with the simplest algorithm (trial division by all integers up to $\sqrt{n}$) the complexity is only $O(\sqrt{n}\log n)$... How is that not in $P$? Is there something I'm missing?
1 Answer
One thing to keep in mind when dealing with natural numbers (among other objects, but naturals are the key case here) is the encoding: the definitions of $P$ and $NP$ are stated in terms of the length of the encoding of the input on a Turing machine (or some closely equivalent model).
So the input to integer factoring, as a decision problem, is typically two numbers $n, k \in \mathbb{N}$, and the question is whether $n$ has a nontrivial factor $d$ with $1 < d \leq k$.
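As a concrete reference point, here is a minimal Python sketch of that decision problem via trial division (the function name is mine, not standard):

```python
def has_factor_at_most(n: int, k: int) -> bool:
    # Decide: does n have a nontrivial factor d with 1 < d <= k?
    # Plain trial division: at most k - 1 divisibility tests.
    for d in range(2, k + 1):
        if n % d == 0:
            return True
    return False

print(has_factor_at_most(91, 10))   # True  (91 = 7 * 13)
print(has_factor_at_most(97, 10))   # False (97 is prime)
```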
Now, the magnitude of $n$ is $n$, but the size of its encoding may be only $O(\log n)$ (in binary, for example). This is exponentially smaller than $n$: if we set $n' = \log_{2} n$, then $n = 2^{n'}$.
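To make that gap tangible, a quick illustrative check in Python (the particular number is an arbitrary example):

```python
n = 2**100 + 277        # some 101-bit number
print(n.bit_length())   # 101 -- the input size n' in bits
print(n)                # the magnitude: about 1.27 * 10**30
```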
The "obvious" $\sqrt{n}\log n$ algorithm therefore runs in time $2^{n'/2} \cdot n'$, which is exponential in the input size $n'$.
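To see that exponential growth concretely, here is a back-of-the-envelope sketch of how the trial-division count $2^{n'/2}$ scales with the bit length $n'$ (the chosen bit lengths are arbitrary examples):

```python
for bits in (32, 64, 128, 256):      # input sizes n' in bits
    divisions = 2 ** (bits // 2)     # ~ sqrt(n) trial divisions
    print(f"n' = {bits:>3} bits  ->  ~{divisions:.2e} divisions")
```

Doubling the length of the input squares the number of divisions, which is exactly the exponential behaviour being described.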