
Everyone keeps claiming that integer factoring is in $NP$, but I just don't get it... Even with the simplest algorithm (trial division by all integers up to $\sqrt{n}$) the complexity should be $O(\sqrt{n}\log n)$... How is that not in $P$? Is there something I'm missing?
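For concreteness, here is a minimal sketch of the trial-division algorithm I have in mind (the Python function name is just illustrative):

```python
def trial_division(n: int) -> list[int]:
    """Factor n by trying every candidate divisor up to sqrt(n):
    roughly sqrt(n) divisions, each costing about log(n) bit operations."""
    factors = []
    d = 2
    while d * d <= n:        # only candidates up to sqrt(n) are needed
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:                # whatever is left over is prime
        factors.append(n)
    return factors
```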

Confused

1 Answer


One thing to remember when dealing with natural numbers (and other objects, but naturals are the central ones here) is the encoding: the definitions of $P$ and $NP$ are stated in terms of the length of the encoding of the input on a Turing machine (or some closely equivalent model).

So the input to integer factoring, as a decision problem, is typically two numbers $n$ and $k$ in $\mathbb{N}$, and the question is whether $n$ has a factor $d$ with $1 < d \leq k$.
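As a sketch of this decision formulation (the function name is mine), the brute-force check looks like:

```python
def has_small_factor(n: int, k: int) -> bool:
    """Decision version of factoring: does n have a factor d with 1 < d <= k?"""
    for d in range(2, min(k, n - 1) + 1):   # skip the trivial divisors 1 and n
        if n % d == 0:
            return True
    return False
```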

So the magnitude of $n$ is $n$, but the size of its encoding is only $O(\log n)$ (for example, about $\log_2 n$ bits in binary). This is exponentially smaller than $n$ itself (i.e. if we write $n' = \log_2 n$ for the input length, then $n = 2^{n'}$).
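A quick illustration of that gap (the particular number is arbitrary):

```python
n = 1_000_003
print(n)                # magnitude: about a million
print(n.bit_length())   # encoding length in binary: only 20 bits
```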

So the "obvious" $\sqrt{n}\log n$ algorithm actually runs in time $2^{n'/2}\cdot n'$, which is exponential in the input length $n'$.
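As a rough illustration (the choice of a 2048-bit input, e.g. an RSA-sized modulus, is just for the sake of example): for $n' = 2048$ the bound becomes
$$2^{n'/2} \cdot n' = 2^{1024} \cdot 2048 \approx 3.7 \times 10^{311}$$
division steps, which is utterly infeasible even though the input itself is only 2048 bits long.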

Luke Mathieson