
I've been reading around the area of complexity of arithmetic operations using logic gates; one thing that is confusing me is that $\Theta(n^2)$ is quoted as the complexity of multiplication by iterative addition. But addition of a number requires $\log_2(n)$ operations, one for each bit (or about 8 times that if you count each NAND gate involved). So it strikes me as obvious that adding that number $n$ times will have a complexity of $n \log_2(n)$, which is definitely less than $\Theta(n^2)$. So where is this additional factor of $\frac{n}{\log_2(n)}$ coming from?

Raphael
Duke of Sam

1 Answer


Addition of numbers of size $n$ (that is, $n$ bits long) takes time $O(n)$. Don't confuse a number with its encoding size, which is logarithmically smaller.
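
To make the bookkeeping concrete (an illustrative calculation; the symbols $N$ and $n$ are my notation, where $N$ is the value and $n$ its bit length):

$$n = \lceil \log_2(N+1) \rceil, \qquad \text{cost of one addition} = O(n) = O(\log N).$$

Measured in the value $N$, one addition indeed costs $O(\log N)$; measured in the bit length $n$, it costs $O(n)$. The two $n$'s in the question's $n \log_2(n)$ estimate refer to different quantities.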

When multiplying an $n$-bit integer $a$ by an $n$-bit integer $b$ using the iterative addition algorithm, you are adding up to $n$ shifted copies of $a$. Each addition costs you $O(n)$ rather than $O(\log n)$: the numbers $a, b$ themselves can be as large as $2^n$, so $n$ already is the logarithm of their value. That gives up to $n$ additions at $O(n)$ each, i.e. $\Theta(n^2)$ in total.
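
Here is a minimal sketch of the iterative (shift-and-add) algorithm described above, in Python; the function name and the reliance on Python's built-in big integers are my own illustration, not part of the original answer:

```python
# Shift-and-add multiplication: add a shifted copy of a
# for every set bit of b. Uses Python's arbitrary-precision ints.

def shift_and_add(a: int, b: int) -> int:
    """Multiply a by b by summing shifted copies of a."""
    product = 0
    shift = 0
    while b:
        if b & 1:                  # current bit of b is set...
            product += a << shift  # ...so add a shifted copy: one O(n)-bit addition
        b >>= 1                    # move to the next bit of b
        shift += 1                 # the loop runs once per bit, i.e. up to n times
    return product

assert shift_and_add(13, 11) == 143
```

The loop body runs once per bit of $b$, i.e. up to $n$ times, and each `product += a << shift` adds numbers up to about $2n$ bits long, so it costs $O(n)$ bit operations; $n \cdot O(n)$ gives the $\Theta(n^2)$ total.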

Yuval Filmus