I've been reading around the area of complexity and arithmetic operations using logic gates; one thing that is confusing me is that $\Theta(n^2)$ is quoted as the complexity of multiplication by iterative addition. But addition of a number requires $\log_2 n$ operations, one for each bit (or eight times that, counting the NAND gates involved). So it strikes me as obvious that adding that number $n$ times will have a complexity of $n \log_2 n$, which is definitely less than $\Theta(n^2)$. So where is this additional factor of $\frac{n}{\log_2 n}$ coming from?
1 Answer
Addition of an $n$-bit number takes time $O(n)$. Don't confuse a number with its encoding: the encoding's size is logarithmically smaller than the number itself.
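To make the $O(n)$ cost concrete, here is a minimal ripple-carry addition sketch in Python (the name `add_bits` and the LSB-first bit-list representation are illustrative choices, not anything from the answer): it performs one full-adder step per bit position, so adding two $n$-bit numbers takes $\Theta(n)$ bit operations.

```python
def add_bits(a, b):
    """Add two numbers given as bit lists (least significant bit first).

    One full-adder step per bit position, so adding two n-bit numbers
    costs Theta(n) bit operations: linear in the bit length, not in
    the value of the numbers.
    """
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a))   # pad both operands to the same length
    b = b + [0] * (n - len(b))
    result, carry = [], 0
    for i in range(n):           # n iterations: one per bit position
        s = a[i] + b[i] + carry
        result.append(s & 1)     # sum bit
        carry = s >> 1           # carry into the next position
    if carry:
        result.append(carry)
    return result
```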
When multiplying an $n$-bit integer $a$ by an $n$-bit integer $b$ using the iterative addition algorithm, you are adding up to $n$ shifted copies of $a$. Each addition costs you $O(n)$ rather than $O(\log n)$: the operands have $n$ bits, and the numbers $a, b$ themselves could be as large as $2^n$.
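A sketch of that iterative (shift-and-add) multiplication, built on the `add_bits` helper above: up to $n$ additions, each on operands of $O(n)$ bits, giving the quoted $\Theta(n^2)$ total.

```python
def multiply_bits(a, b):
    """Multiply two bit lists (LSB first) by iterative addition:
    for every set bit of b, add a correspondingly shifted copy of a.

    Up to n additions, each costing O(n) bit operations via
    add_bits, for Theta(n^2) in total.
    """
    product = [0]
    for i, bit in enumerate(b):          # n iterations over the bits of b
        if bit:
            shifted = [0] * i + a        # shift a left by i positions
            product = add_bits(product, shifted)  # each add is O(n)
    return product

# Example: 3 * 3 = 9, i.e. [1, 1] * [1, 1] -> [1, 0, 0, 1]
print(multiply_bits([1, 1], [1, 1]))
```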
Yuval Filmus