15

I have computed the Cholesky factorization of a positive semidefinite matrix $\Theta$. However, I wish to know the diagonal elements $\Theta^{-1}_{ii}$ of the inverse of $\Theta$. Is it possible to do this using the Cholesky factor that I have computed? Or will finding the eigenvalues alone (without the orthonormal matrices of an SVD) help this cause? Are there any other suggestions or alternative decompositions that will aid in finding the diagonal of the inverse matrix?

I've seen that random projections do wonders for inverting matrices. Could something like this be applied here?

sachinruk
    I don't see why this question should be on hold. The subtext of the question seems to be clear and constitutes a non-trivial question: what is an optimal way to compute the diagonal elements of the inverse of a symmetric positive semi-definite matrix? The naive way to compute the entire inverse is $O(n^3)$. But can one get just the diagonal with a smaller asymptotic exponent? – Igor Khavkine Oct 16 '14 at 20:48
  • Not a solution of your problem. However, the Schur complement formula tells us that $\frac{1}{(\Theta^{-1})_{ii}}=\Theta_{ii}-r_{i}\tilde\Theta^{-1}c_{i}$, where $r_{i}=(\Theta_{i1},\ldots,\hat\Theta_{ii},\ldots,\Theta_{in})$, $c_{i}=(\Theta_{1i},\ldots,\hat\Theta_{ii},\ldots,\Theta_{ni})$ ($\hat a$ means $a$ is removed), and $\tilde\Theta$ is obtained from $\Theta$ by removing the $i$th row and $i$th column. – Indrajit Nov 02 '14 at 18:45
  • The following question seems related: https://math.stackexchange.com/questions/64420/is-there-a-faster-way-to-calculate-a-few-diagonal-elements-of-the-inverse-of-a-h – jochen Dec 04 '17 at 20:51

4 Answers

9

I stumbled onto this question when trying to answer a similar question:

I want a diagonal matrix that best approximates the inverse of a matrix ${\bf B} \succ 0$.

I'll post my answer to that question here in case it helps others (and maybe the OP). In this case, "best" means nearest in the least-squares (Frobenius) sense.

$$\textbf{d}^*(\textbf{B}) = \operatorname{argmin}_{\textbf{d}} \tfrac 1 2 \| \textbf{B} \operatorname{diag} (\textbf{d}) - \textbf{I} \|_F^2$$

This is separable in $d_i$ and differentiable. Setting the gradient to zero brings us to the closed form (and very cheap) solution

$$[\textbf{d}^*]_i = \frac {b_{ii} } { \| \textbf{b}_i \|^2}$$

where $\textbf{b}_i$ is the $i$th column of $\textbf{B}$.

(Note that in the complex case you'd need to conjugate.)

I wouldn't be surprised if this has been known for 100 years, but I couldn't easily find it.
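A minimal NumPy sketch of this closed-form solution (function and variable names are my own, and I assume the real case, so no conjugates are needed):

```python
import numpy as np

def diag_approx_inverse(B):
    """Best diagonal d (in the Frobenius sense) with B @ diag(d) ~ I."""
    # Closed form from above: d_i = B_ii / ||b_i||^2,
    # where b_i is the i-th column of B.
    return np.diag(B) / np.sum(B * B, axis=0)

# Example on a positive definite Gram-style matrix
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
B = A.T @ A + 0.1 * np.eye(5)
d = diag_approx_inverse(B)
```

Since the objective is separable and strictly convex in each $d_i$, any perturbation of `d` can only increase $\tfrac12\|\textbf{B}\operatorname{diag}(\textbf{d})-\textbf{I}\|_F^2$.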

  • This is well-known and is called "sparse approximate inverse". These are generally used as preconditioners. – max Mar 07 '18 at 16:26
  • So not 100 years old, but more like 20 or so. Thanks for commenting with the correct terminology. – Mark Borgerding Mar 07 '18 at 19:42
  • What if $B$ is not posdef, not even symmetric: $|b_i|^2 \to |b_{row\ i}| |b_{col\ i}|$ ? Would you have a test case or two ? – denis May 15 '20 at 16:19
  • @denis , this was to cheaply approximate the inverse of a Gram matrix. So B was always pos semidef. Adding epsilon to the diagonal elements makes it definite in practice. – Mark Borgerding May 15 '20 at 20:10
4

Tang and Saad have a method that uses random vectors (not necessarily projections):

A Probing Method for Computing the Diagonal of the Matrix Inverse
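Setting their structured probing aside, the basic idea can be sketched with plain unstructured Rademacher probes (a Hutchinson-style diagonal estimator in the spirit of Bekas, Kokiopoulou and Saad; Tang and Saad's contribution is choosing better, structured probe vectors via graph coloring). A rough NumPy/SciPy sketch that reuses the Cholesky factorization the OP already has — names are mine:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def estimate_inverse_diagonal(Theta, n_probes=400, seed=0):
    """Monte-Carlo estimate of diag(Theta^{-1}) via linear solves."""
    rng = np.random.default_rng(seed)
    n = Theta.shape[0]
    factor = cho_factor(Theta)        # reuse the Cholesky factorization
    acc = np.zeros(n)
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe vector
        acc += v * cho_solve(factor, v)       # elementwise v * (Theta^{-1} v)
    return acc / n_probes

# Example: a diagonally dominant SPD matrix, where few probes suffice
n = 6
Theta = np.diag(np.arange(1.0, n + 1)) + 0.05 * (np.ones((n, n)) - np.eye(n))
est = estimate_inverse_diagonal(Theta)
```

Since $\mathbb{E}[v_i v_j]=\delta_{ij}$ for Rademacher vectors, each term has expectation $(\Theta^{-1})_{ii}$; the variance is driven by the off-diagonal entries of $\Theta^{-1}$, which is exactly what structured probing exploits.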

Yair Daon
3

If you have the Cholesky decomposition, you can easily compute the whole matrix inverse. Since

$$\Theta = R^* R$$

where $R$ is upper-triangular, then you can find $\Theta^{-1}$ by solving

$$R^* R X = I$$

where $I$ is the identity. The latter system can be solved by forward and backward substitution.

If you only want the diagonal entries of $X$, you could save perhaps half the computation by stopping the backward substitution process (for each column) when you get to the diagonal entry.
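A short SciPy sketch of the two triangular solves described above (the test matrix and variable names are mine):

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
Theta = A.T @ A + np.eye(4)          # a positive definite test matrix
R = cholesky(Theta)                  # upper-triangular: Theta = R^T R
# Solve R^T R X = I in two triangular sweeps:
Y = solve_triangular(R, np.eye(4), trans='T')   # forward: R^T Y = I
X = solve_triangular(R, Y)                      # backward: R X = Y
diag_inv = np.diag(X)                # diagonal of Theta^{-1}
```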

1

I think that we cannot achieve complexity below $\sim n^3/3$. Indeed, let $\Theta=LL^*$ where $L$ is lower triangular. Clearly $\Theta^{-1}={L^*}^{-1}L^{-1}$; let $L^{-1}=[u_{i,j}]$. Then (F) $(\Theta^{-1})_{i,i}=\sum_{j\geq i}|u_{j,i}|^2$. Thus we must know the $|u_{j,i}|$, and (in my opinion) this effectively requires knowing the $u_{j,i}$, that is, $L^{-1}$ itself. Computing $L^{-1}$ has complexity $\sim n^3/3$ (cf. "A Fast Triangular Matrix Inversion" by R. Mahfoudhi). In the second step, computing the $(\Theta^{-1})_{i,i}$ with (F) costs only $O(n^2)$.
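A small NumPy/SciPy sketch of this route — invert $L$ once, then take squared column norms per (F) (real case; the test matrix and names are mine):

```python
import numpy as np
from scipy.linalg import solve_triangular

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
Theta = A @ A.T + np.eye(4)            # a positive definite test matrix
L = np.linalg.cholesky(Theta)          # lower triangular: Theta = L L^T
U = solve_triangular(L, np.eye(4), lower=True)   # U = L^{-1}, ~ n^3/3 flops
diag_inv = np.sum(U * U, axis=0)       # (F): squared column norms of L^{-1}
```

This matches the count above: the triangular inversion dominates at $\sim n^3/3$, and the column sums of squares are the $O(n^2)$ second step.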