
Can I multiply Big-O time complexities?

For example: $O(n) \cdot O(n) = O(n^2)$?

UPDATE:

The question came from my observation that different sources analyze their algorithms in different ways. For example, some use notation like $O(n \cdot n + n \cdot \log n)$ while others write something more like $O(n) \cdot O(n) + O(n) \cdot O(\log n)$. I was wondering whether these notations are equivalent.

user3840170
illuminato

2 Answers


Yes, you can and yes, it is.

Considering, for example, the non-negative case, we have a more general property: $$O(f)\cdot O(g) = O(f\cdot g )$$ Take $ \varphi \in O(f) \cdot O(g) $. Then $\varphi = \varphi_{1} \cdot \varphi_{2}$ where $$\exists C_{1} > 0,\ \exists N_{1} \in \mathbb{N},\ \forall n > N_{1},\ \varphi_{1} \leqslant C_{1} \cdot f $$ $$\exists C_{2} > 0,\ \exists N_{2} \in \mathbb{N},\ \forall n > N_{2},\ \varphi_{2} \leqslant C_{2} \cdot g $$ Combining these, $\varphi = \varphi_{1} \cdot \varphi_{2} \leqslant C_{1} \cdot f \cdot C_{2} \cdot g = C_{1} \cdot C_{2} \cdot f \cdot g $ for $ n > N = \max(N_{1},N_{2}) $.
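To make the product property concrete, here is a worked instance of my own (the functions and constants are hypothetical, chosen for illustration): take $f(n) = g(n) = n$, $\varphi_{1}(n) = 3n$ and $\varphi_{2}(n) = 5n$. Then $$\varphi(n) = \varphi_{1}(n) \cdot \varphi_{2}(n) = 15 n^{2} \leqslant 15 \cdot n \cdot n,$$ so $\varphi \in O(n \cdot n) = O(n^{2})$, with $C = C_{1} \cdot C_{2} = 3 \cdot 5 = 15$ and any $N \geqslant 0$.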

Accordingly, we can also see that there is a second general property: $$O(f)+O(g) = O(f+ g )$$ The proof uses the same technique.
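The sum property can be checked the same way; as a quick worked instance of my own (not from the answer), suppose $\varphi_{1} \leqslant C_{1} \cdot n$ and $\varphi_{2} \leqslant C_{2} \cdot \log n$ for $n > N$. Then $$\varphi_{1} + \varphi_{2} \leqslant C_{1} \cdot n + C_{2} \cdot \log n \leqslant (C_{1} + C_{2}) \cdot (n + \log n),$$ so $\varphi_{1} + \varphi_{2} \in O(n + \log n)$; and since $\log n \leqslant n$ for $n \geqslant 1$, this set is simply $O(n)$.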

Definitions: $$O(f)\cdot O(g) = \left\lbrace a\cdot b \colon (a,b)\in O(f)\times O(g) \right\rbrace $$ $$O(f)+O(g) = \left\lbrace a + b \colon (a,b)\in O(f)\times O(g) \right\rbrace $$

zkutch

It's an abuse of notation.

$O(n)$ is a set of functions.

So $O(n)*O(n)$ is not really defined. $O(n)\times O(n)$ is defined, but as the Cartesian product of the set $O(n)$ with itself, which is probably not what you are after.

Within the context of algorithm analysis, most would interpret $O(f)*O(g)$ to mean that we perform some operation a number of times bounded by $f$, where each operation uses an amount of the relevant resource (e.g. time or space) bounded by $g$. Under that interpretation, $O(f)*O(g)\subset O(f*g)$.
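A minimal sketch of that interpretation (my own example, not from the answer): the outer loop runs $O(f) = O(n)$ times, each iteration does $O(g) = O(n)$ work, and the total operation count lands in $O(n \cdot n) = O(n^2)$.

```python
def count_operations(n):
    """Count the inner-loop steps of a doubly nested loop over range(n)."""
    ops = 0
    for _ in range(n):      # O(f) = O(n) iterations
        for _ in range(n):  # O(g) = O(n) work per iteration
            ops += 1        # one unit of the bounded resource (here: a step)
    return ops

# For input size n the count is exactly n * n, a member of O(n^2).
```

Counting actual steps like this is a handy sanity check that the informal "loops multiply" rule matches the set-theoretic statement $O(f)*O(g)\subset O(f*g)$.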

Taemyr
  • 605
  • 3
  • 9