2

I've already proved this when $A$ is positive definite, using the fact that $A^{-\frac{1}{2}}$ exists and that $AB = A^{\frac{1}{2}}\big(A^{\frac{1}{2}}BA^{\frac{1}{2}}\big)A^{-\frac{1}{2}}$ is similar to the symmetric matrix $A^{\frac{1}{2}}BA^{\frac{1}{2}}$. But when $A$ is only positive semi-definite I don't know if that argument still works. I do know that $A^{\frac{1}{2}}$ and $B^{\frac{1}{2}}$ are still defined.
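Here is the numerical sanity check I used for the definite case (just a sketch, assuming Python with numpy is available; the particular construction of $A$ and $B$ is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
G = rng.standard_normal((n, n))
A = G @ G.T + np.eye(n)              # positive definite
H = rng.standard_normal((n, n - 1))
B = H @ H.T                          # positive semi-definite, singular

w, V = np.linalg.eigh(A)             # A = V diag(w) V^T with w > 0
sqrtA = V @ np.diag(np.sqrt(w)) @ V.T
inv_sqrtA = V @ np.diag(1 / np.sqrt(w)) @ V.T

# A^{-1/2} (AB) A^{1/2} equals the symmetric matrix A^{1/2} B A^{1/2},
# so AB is similar to a diagonalizable matrix.
print(np.allclose(inv_sqrtA @ (A @ B) @ sqrtA, sqrtA @ B @ sqrtA))  # True
```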

I'm trying to show that $A^{\frac{1}{2}}BA^{\frac{1}{2}}$ is similar to $A^{\frac{1}{2}}A^{\frac{1}{2}}B = AB$ in the semi-definite case, but I'm having trouble with that.

Any hint would help, thanks!

aalamo1
  • 23
  • I believe that $A$ and $B$ need to share a common basis of eigenvectors in order to commute, and in general $AB$ is only guaranteed to be diagonalizable if $A$ and $B$ commute. Commuting means $AB=BA$. – Craig Hicks Apr 12 '20 at 06:54
  • Are you sure that $AB$ is guaranteed to be diagonalizable if $A$ and $B$ commute? I'm not sure about that. Diagonalization is different from simultaneous diagonalization. – aalamo1 Apr 12 '20 at 07:06
  • Simultaneous diagonalization is a property of a set of matrices, while the comment is about the single matrix $AB$. – Craig Hicks Apr 12 '20 at 14:31
  • OK, I see, but I still think the statement is true. – aalamo1 Apr 12 '20 at 15:16
  • 3
    The statement is true and proven in Corollary 2.3 in this article: https://reader.elsevier.com/reader/sd/pii/002437959190239S?token=6D0F50BB70575D120D81774C90B25337FC1A180C9721693CA64B6C81BCF09C5BDD1BA218CF8CC2FF49ABB47781E72E04 – Jan Apr 12 '20 at 16:02
  • To the OP: when writing a conjecture, you should clearly state that fact in the title and body of the question. –  Apr 12 '20 at 17:42

3 Answers

3

By a change of orthonormal basis, we may assume that
$$ A=\pmatrix{A_{r\times r}'&0\\ 0&0_{(n-r)\times(n-r)}} \text{ and } B=\pmatrix{B_{r\times r}'&\ast\\ \ast&B_2}, $$
where $A'$ is positive definite. Then, by performing another change of orthonormal basis (or by orthogonally diagonalising $B'$, followed by permuting the diagonal entries of $B'$), we may further assume that
$$ A=\pmatrix{A_1&X^T&0\\ X&A_2&0\\ 0&0&0} \text{ and } B=\pmatrix{B_1&0&Y^T\\ 0&0&Z^T\\ Y&Z&B_2}, $$
where $A'=\pmatrix{A_1&X^T\\ X&A_2}$ and $B'=\pmatrix{B_1&0\\ 0&0}$, with $B_1$ positive definite. Now, since $B$ is positive semidefinite and its middle diagonal sub-block is zero, the sub-block $Z$ must be zero (in a positive semidefinite matrix, a zero diagonal block forces the off-diagonal blocks in the same rows and columns to vanish). It follows that
$$ AB =\pmatrix{A_1&X^T&0\\ X&A_2&0\\ 0&0&0}\pmatrix{B_1&0&Y^T\\ 0&0&0\\ Y&0&B_2} =\pmatrix{A_1B_1&0&A_1Y^T\\ XB_1&0&XY^T\\ 0&0&0} $$
is similar to the positive semidefinite matrix $A_1^{1/2}B_1A_1^{1/2}\oplus0\oplus0$. The left and right factors below are inverses of each other, as direct multiplication confirms, so this is indeed a similarity:
\begin{aligned} &\pmatrix{A_1^{-1/2}&0&A_1^{-1/2}B_1^{-1}Y^T\\ -XA_1^{-1}&I&0\\ 0&0&I} \pmatrix{A_1B_1&0&A_1Y^T\\ XB_1&0&XY^T\\ 0&0&0} \pmatrix{A_1^{1/2}&0&-B_1^{-1}Y^T\\ XA_1^{-1/2}&I&-XA_1^{-1}B_1^{-1}Y^T\\ 0&0&I}\\ &=\pmatrix{A_1^{1/2}B_1&0&A_1^{1/2}Y^T\\ 0&0&0\\ 0&0&0} \pmatrix{A_1^{1/2}&0&-B_1^{-1}Y^T\\ XA_1^{-1/2}&I&-XA_1^{-1}B_1^{-1}Y^T\\ 0&0&I}\\ &=\pmatrix{A_1^{1/2}B_1A_1^{1/2}&0&0\\ 0&0&0\\ 0&0&0}. \end{aligned}
Hence $AB$ is diagonalisable.
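As a numerical illustration of this conclusion (a sketch only, assuming numpy; the helper `rand_psd` and the seed are arbitrary choices for the illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def rand_psd(n, rank):
    """Random n x n positive semidefinite matrix of the given rank."""
    G = rng.standard_normal((n, rank))
    return G @ G.T

n = 6
A, B = rand_psd(n, 3), rand_psd(n, 4)   # both singular
M = A @ B

w, V = np.linalg.eig(M)
# eigenvalues are real and nonnegative (M is similar to a PSD matrix)...
print(np.allclose(w.imag, 0), bool(np.all(w.real > -1e-9)))
# ...and V diag(w) V^{-1} reconstructs M (up to roundoff), so M is diagonalizable
print(np.allclose(V @ np.diag(w) @ np.linalg.inv(V), M))
```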

user1551
  • 149,263
  • If you have time could you please explain the third-to-last equation line? – Craig Hicks Apr 12 '20 at 20:26
  • That is, the choice of the left and right matrices in the line after "is similar to the positive semidefinite matrix". – Craig Hicks Apr 12 '20 at 20:30
  • But isn't that for a particular $A$ and $B$? We can assume $A$ has the form you gave, but $B$ may be different. – Toni Mhax Apr 13 '20 at 13:33
  • I've never seen such a thing. What if $B$ is invertible, with no zeros on the diagonal, after your matrix products? – Toni Mhax Apr 13 '20 at 15:52
  • @ToniMhax If $B$ is positive definite, the zero sub-blocks are empty. – user1551 Apr 13 '20 at 15:59
  • OK, but you should have written out the unitaries you used; otherwise no one can recover the final expressions for $A$ and $B$, since there are two changes of basis. I am deleting the comments. Best – Toni Mhax Apr 14 '20 at 03:52
  • I'm sorry, I don't see what $A_1$ is. – aalamo1 Apr 14 '20 at 17:08
  • @aalamo1 If $B_1$ is $k\times k$, then $A_1$ is the leading principal $k\times k$ submatrix of $A'$. In other words, we partition $A'$ and $B'$ in the same way, and call the submatrix at the top left corner of $A'$ $A_1$. – user1551 Apr 14 '20 at 17:56
2

preliminaries
$A^\frac{1}{2}BA^\frac{1}{2}$ is real symmetric positive semi-definite because
$\mathbf x^TA^\frac{1}{2}BA^\frac{1}{2}\mathbf x = \big\Vert B^\frac{1}{2}A^\frac{1}{2}\mathbf x\big\Vert_2^2\geq 0$ for any $\mathbf x \in \mathbb R^n$.
By the cyclic property of the trace we also know
$\text{trace}\Big(\big(A^\frac{1}{2}BA^\frac{1}{2}\big)^k\Big)= \text{trace}\Big(\big(AB\big)^k\Big)$
for all natural numbers $k$, so by Newton's identities the two matrices have the same characteristic polynomial

define
$C:=A^\frac{1}{2}BA^\frac{1}{2}$
then for natural numbers $k$ (including $k=0$) we have the useful identity
$A^\frac{1}{2}C^k A^\frac{1}{2}B = (AB)^{k+1}= (AB)(AB)^{k}$

since $C$ is real symmetric, it is diagonalizable over the reals, and we proceed with a minimal polynomial argument
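a quick numerical check of these preliminaries (a sketch, assuming numpy; the square root is computed via an eigendecomposition, and the example matrices are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
Ga, Gb = rng.standard_normal((n, 2)), rng.standard_normal((n, 3))
A, B = Ga @ Ga.T, Gb @ Gb.T                  # singular PSD matrices

wa, Va = np.linalg.eigh(A)
sqrtA = Va @ np.diag(np.sqrt(np.clip(wa, 0, None))) @ Va.T
C = sqrtA @ B @ sqrtA                        # symmetric PSD
M = A @ B

for k in range(1, 6):
    # trace(C^k) == trace((AB)^k)
    assert np.isclose(np.trace(np.linalg.matrix_power(C, k)),
                      np.trace(np.linalg.matrix_power(M, k)))
    # A^{1/2} C^k A^{1/2} B == (AB)^{k+1}
    assert np.allclose(sqrtA @ np.linalg.matrix_power(C, k) @ sqrtA @ B,
                       np.linalg.matrix_power(M, k + 1))
print("preliminaries check out")
```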

main argument
we have
$p(C) = \mathbf 0$
where $p$ is the minimal polynomial of $C$; since $C$ is diagonalizable, $p$ is the characteristic polynomial of $C$ with its repeated roots removed. This implies

$\mathbf 0 =A^\frac{1}{2} p(C)A^\frac{1}{2}B = AB\cdot p(AB)=g\big(AB\big), \quad\text{where } g(x):=x\,p(x)$

Thus the polynomial $g$ annihilates $AB$, and $g$ has no repeated roots except possibly at zero.

since an annihilating polynomial with a simple root $\lambda$ forces all Jordan blocks for $\lambda$ to be $1\times 1$, we know that for every eigenvalue $\lambda \gt 0$ of $\big(AB\big)$ we have
$\text{geometric multiplicity}_{\text{of }\lambda}\big(AB\big)= \text{algebraic multiplicity}_{\text{of }\lambda}\big(AB\big) $

in other words: all positive eigenvalues of $\big(AB\big)$ are semi-simple
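a numerical sketch of this step (assuming numpy; recovering the distinct eigenvalues of $C$ by rounding is a numerical shortcut, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
Ga, Gb = rng.standard_normal((n, 2)), rng.standard_normal((n, 3))
A, B = Ga @ Ga.T, Gb @ Gb.T

wa, Va = np.linalg.eigh(A)
sqrtA = Va @ np.diag(np.sqrt(np.clip(wa, 0, None))) @ Va.T
C = sqrtA @ B @ sqrtA
M = A @ B

# p(x) = product over distinct eigenvalues lam of C of (x - lam),
# and g(x) = x * p(x) should annihilate AB
lams = np.unique(np.round(np.linalg.eigvalsh(C), 8))
g_at_M = M.copy()
for lam in lams:
    g_at_M = g_at_M @ (M - lam * np.eye(n))
print(np.linalg.norm(g_at_M))                # ~ 0, i.e. g(AB) = 0
```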

semisimplicity of $\lambda =0$
the underlying ideas in the inequalities below are (i) the submultiplicativity of rank, $\text{rank}(XY)\leq\min\big(\text{rank}(X),\text{rank}(Y)\big)$, which holds over any field, (ii) the usefulness of the matrix square root, something of a special feature in $\mathbb R$ or $\mathbb C$ associated with positive semi-definiteness, and (iii) that in $\mathbb R$ or $\mathbb C$ we have
$\text{rank}\big(Z^*Z\big) = \text{rank}\big(Z\big)$ and $\text{rank}\big(Z^*Z\big)=\text{rank}\big(ZZ^*\big)$
for a crude proof of (iii): use the polar decomposition
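a quick numerical check of (iii) (a sketch, assuming numpy; the shape and rank of $Z$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
Z = rng.standard_normal((6, 3)) @ rng.standard_normal((3, 5))  # 6 x 5, rank 3
r = np.linalg.matrix_rank
print(r(Z), r(Z.T @ Z), r(Z @ Z.T))  # all equal: rank(Z*Z) = rank(Z) = rank(ZZ*)
```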

To prove that zero is semisimple, or equivalently that there is no 'shortage' of eigenvectors associated with the eigenvalue $0$, we need to estimate the rank of $AB$:

$\text{rank}\big(AB\big)$
$=\text{rank}\big(B^*A^*AB\big)$
$=\text{rank}\big(BA^2B\big)$
$\leq\text{rank}\big(B^\frac{1}{2}A^2B\big)$
$\leq\text{rank}\big(B^\frac{1}{2}A^2B^\frac{1}{2}\big)$
$=\text{rank}\big(ABA\big)$
$\leq\text{rank}\big(A^\frac{1}{2}BA\big)$
$\leq\text{rank}\big(A^\frac{1}{2}BA^\frac{1}{2}\big)$
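this chain can be checked numerically (a sketch, assuming numpy; `psd_sqrt` is an illustrative helper for the eigendecomposition-based square root):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 6
Ga, Gb = rng.standard_normal((n, 3)), rng.standard_normal((n, 4))
A, B = Ga @ Ga.T, Gb @ Gb.T

def psd_sqrt(M):
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.sqrt(np.clip(w, 0, None))) @ V.T

sA, sB = psd_sqrt(A), psd_sqrt(B)
r = np.linalg.matrix_rank
print(r(A @ B), r(B @ A @ A @ B), r(sB @ A @ A @ B), r(sB @ A @ A @ sB),
      r(A @ B @ A), r(sA @ B @ A), r(sA @ B @ sA))
# nondecreasing along the chain (in fact all equal for generic A, B)
```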

Negating the rank chain above and adding $n$ to both sides, we can apply rank-nullity to get
$\text{algebraic multiplicity}_{\text{of }\lambda \text{= 0}}\big(A^\frac{1}{2}BA^\frac{1}{2}\big)$
$=\text{geometric multiplicity}_{\text{of }\lambda \text{= 0}}\big(A^\frac{1}{2}BA^\frac{1}{2}\big) $
$\leq \text{geometric multiplicity}_{\text{of }\lambda \text{= 0}}\big(AB\big) $
$\leq \text{algebraic multiplicity}_{\text{of }\lambda \text{= 0}}\big(AB\big) $
$=\text{algebraic multiplicity}_{\text{of }\lambda \text{= 0}}\big(A^\frac{1}{2}BA^\frac{1}{2}\big)$

that is, we have
$\text{geometric multiplicity}_{\text{of }\lambda \text{= 0}}\big(AB\big)= \text{algebraic multiplicity}_{\text{of }\lambda \text{= 0}}\big(AB\big) $

so we also know the eigenvalue $0$ is semi-simple for $\big(AB\big)$, which completes the proof: every eigenvalue of $AB$ is semi-simple, hence $AB$ is diagonalizable.
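The multiplicity count at $\lambda = 0$ can also be checked numerically (a sketch, assuming numpy; treating eigenvalues of magnitude below $10^{-8}$ as zero is a numerical shortcut):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 6
Ga, Gb = rng.standard_normal((n, 3)), rng.standard_normal((n, 4))
A, B = Ga @ Ga.T, Gb @ Gb.T
M = A @ B

geo = n - np.linalg.matrix_rank(M)                      # dim ker(AB)
alg = int(np.sum(np.abs(np.linalg.eigvals(M)) < 1e-8))  # zero eigenvalues, with multiplicity
print(geo, alg)                                         # equal: lambda = 0 is semi-simple
```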

user8675309
  • 12,193
1

For my own reference, here is a proof modified from user8675309's excellent answer.

If $A$ is nonsingular, then $AB$ is similar to $\sqrt{A}B\sqrt{A}$ and hence it is diagonalisable.

If $A$ is singular, the minimal polynomial of $\sqrt{A}B\sqrt{A}$ will be of the form $xg(x)$ for some polynomial $g$. Therefore \begin{align*} &\left(\sqrt{A}B\sqrt{A}\right)g\left(\sqrt{A}B\sqrt{A}\right)=0\\ &\implies (AB)(AB)g(AB) =\sqrt{A}\left(\sqrt{A}B\sqrt{A}\right)g\left(\sqrt{A}B\sqrt{A}\right)\sqrt{A}B =0\\ &\implies \left(\sqrt{B}A\sqrt{B}\right) \left(\sqrt{B}A\sqrt{B}\right) \sqrt{B}g(AB) =\sqrt{B}(AB)(AB)g(AB)=0\\ &\implies \left(\sqrt{B}A\sqrt{B}\right) \sqrt{B}g(AB)=0\\ &\implies \left(\sqrt{A}\sqrt{B}\right) \sqrt{B}g(AB)=0\\ &\implies ABg(AB)=0.\\ \end{align*} (The third and fourth implications use the fact that $Z^\ast Zw=0$ implies $Zw=0$, applied to $Z=\sqrt{B}A\sqrt{B}$ and $Z=\sqrt{A}\sqrt{B}$ respectively.) In other words, $xg(x)$ annihilates $AB$. As $\sqrt{A}B\sqrt{A}$ is diagonalisable, its minimal polynomial $xg(x)$ is a product of distinct linear factors. Hence the minimal polynomial of $AB$, which divides the annihilating polynomial $xg(x)$, must also be a product of distinct linear factors. Therefore $AB$ is diagonalisable. Furthermore, as $AB$ has the same spectrum (with multiplicities) as $\sqrt{A}B\sqrt{A}$ and both are diagonalisable, the two products are similar to each other.
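A numerical illustration of the final similarity claim (a sketch, assuming numpy; the square root is computed via an eigendecomposition):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 6
Ga, Gb = rng.standard_normal((n, 3)), rng.standard_normal((n, 4))
A, B = Ga @ Ga.T, Gb @ Gb.T            # both PSD, both singular

w, V = np.linalg.eigh(A)
sqrtA = V @ np.diag(np.sqrt(np.clip(w, 0, None))) @ V.T

# AB and sqrt(A) B sqrt(A) have the same spectrum, with multiplicities
spec_AB = np.sort(np.linalg.eigvals(A @ B).real)
spec_C = np.sort(np.linalg.eigvalsh(sqrtA @ B @ sqrtA))
print(np.allclose(spec_AB, spec_C))    # True
```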

user1551
  • 149,263