4

Given the problem

$A x = b$

for some nonsingular matrix $A \in \mathbb{R}^{n \times n}$ and $b \in \mathbb{R}^n$, one can compute $x$ via the Cholesky factorization in $O(n^3)$.

If $A$ is known to be symmetric, entrywise positive (i.e. $A_{ij} > 0$), and positive definite, is it possible to solve $Ax = b$ faster than $O(n^3)$?
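For reference, the baseline Cholesky approach mentioned above can be sketched with SciPy; the matrix and right-hand side below are made-up example data, constructed so that $A$ is symmetric positive definite with positive entries:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

# Hypothetical example data: build a symmetric positive definite
# matrix with strictly positive entries.
n = 4
rng = np.random.default_rng(0)
M = rng.uniform(0.1, 1.0, size=(n, n))
A = M @ M.T + n * np.eye(n)   # SPD by construction, all entries > 0
b = rng.uniform(size=n)

# O(n^3) Cholesky factorization, followed by two O(n^2)
# triangular solves.
c, low = cho_factor(A)
x = cho_solve((c, low), b)

print(np.allclose(A @ x, b))
```

The factorization itself dominates the cost; once it is computed, additional right-hand sides can be solved in $O(n^2)$ each.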

Adam
  • Thanks to the symmetry, you gain a factor of 2 with the Cholesky decomposition approach, described at length on Wikipedia, but it is still $O(n^3)$. – Matt Mar 26 '20 at 17:30

1 Answer

0

Yes it is, but the best method depends heavily on the structure of $A$. For example, you can use the preconditioned conjugate gradient method, whose complexity is $O(n)$. You can also study what MATLAB®'s `A\b` does (it tries to exploit the structure of $A$ and adapts how it solves the system accordingly). In any case, for an actual implementation you should use existing libraries rather than implement it yourself.

  • Are you sure about O(n)? I found that the complexity actually depends on the sparsity and condition number of the matrix: http://math.stackexchange.com/questions/607423/what-is-the-time-complexity-of-conjugate-gradient-method .. which would leave me at $O(n^2)$ for a non-sparse but well-conditioned matrix. – Adam Aug 27 '14 at 09:07
  • @Adam You can be sure $O(n)$ is wrong in the general (non-sparse) case, since there are $O(n^2)$ entries in $A$ which need to be looked at (because in the general case they all affect the answer, so you can't skip any), yielding a clear $O(n^2)$ lower bound. Probably the $O(n)$ "complexity" is referring to the number of conjugate gradient steps, but those steps contain matrix operations requiring $O(n^2)$ time, so we are back to $O(n^3)$ for this approach, and the preconditioning is probably also of this order. – Matt Mar 18 '20 at 09:21