How do I go about constructing two matrices $A$ and $B$ such that the pseudoinverse of $AB$ is not equal to the pseudoinverse of $B$ times the pseudoinverse of $A$?
1 Answer
Try some $2 \times 2$ examples, at least one of them singular, say with $1$'s and $0$'s as entries. It shouldn't take too many tries to get one that works. You could even try it with $A=B$.
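If you want to automate the search, here is a brute-force sketch (my own illustration, assuming Python with numpy; `np.linalg.pinv` computes the Moore-Penrose pseudoinverse):

```python
import itertools
import numpy as np

# All 16 matrices of size 2x2 with entries in {0, 1}.
mats = [np.array(bits, dtype=float).reshape(2, 2)
        for bits in itertools.product([0, 1], repeat=4)]

# Search for a pair where pinv(AB) differs from pinv(B) pinv(A).
for A, B in itertools.product(mats, repeat=2):
    lhs = np.linalg.pinv(A @ B)
    rhs = np.linalg.pinv(B) @ np.linalg.pinv(A)
    if not np.allclose(lhs, rhs):
        print("A =", A.tolist())
        print("B =", B.tolist())
        break
```

One pair you can check by hand is $A = B = \pmatrix{1 & 1\cr 0 & 0\cr}$: then $A^2 = A$, so $(A^2)^+ = A^+ = \frac{1}{2}A^T$, while $(A^+)^2 = \frac{1}{4}A^T$.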
EDIT: Let's do the case $A = B$ for $2 \times 2$ matrices. The SVD of
$A$ is $A = U \Sigma V^T$, where $U$ and $V$ are orthogonal matrices and $\Sigma$ is diagonal with nonnegative entries.
I'll assume $U$ and $V$ are rotations, so
$$ U = R_\theta = \pmatrix{\cos(\theta) & \sin(\theta)\cr -\sin(\theta) & \cos(\theta)\cr}$$
and $V = R_\phi$. Since we want $A$ to be singular, $\Sigma = \pmatrix{\sigma_1 & 0\cr 0 & 0\cr}$. Then
$$ A = \sigma_1 \pmatrix{\cos(\phi) \cos(\theta) & -\sin(\phi) \cos(\theta)\cr
-\cos(\phi) \sin(\theta) & \sin(\phi) \sin(\theta)}$$
Compute the pseudoinverses of $A$ and $A^2$. Writing $A = \sigma_1 u v^T$ with the unit vectors $u = \pmatrix{\cos(\theta)\cr -\sin(\theta)\cr}$ and $v = \pmatrix{\cos(\phi)\cr -\sin(\phi)\cr}$, we have $A^+ = \sigma_1^{-1} v u^T$ and $A^2 = \sigma_1^2 \cos(\phi-\theta)\, u v^T$. If $\sigma_1 > 0$ and $\cos(\phi-\theta) \ne 0$, I find the condition for $(A^2)^+ = (A^+)^2$ to be
$\sin(\phi-\theta) = 0$. (When $\cos(\phi-\theta) = 0$ we get $A^2 = 0$, and both sides vanish.)
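As a numeric sanity check of that condition (again a sketch assuming numpy; the helper names `rot` and `make_A` are mine):

```python
import numpy as np

def rot(t):
    # The rotation matrix R_t from above.
    return np.array([[np.cos(t), np.sin(t)],
                     [-np.sin(t), np.cos(t)]])

def make_A(theta, phi, sigma1=1.0):
    # A = U Sigma V^T with U = R_theta, V = R_phi, Sigma = diag(sigma1, 0).
    return rot(theta) @ np.diag([sigma1, 0.0]) @ rot(phi).T

def identity_holds(theta, phi):
    A = make_A(theta, phi)
    return np.allclose(np.linalg.pinv(A @ A),
                       np.linalg.pinv(A) @ np.linalg.pinv(A))

print(identity_holds(0.3, 1.0))  # sin(phi - theta) != 0  ->  False
print(identity_holds(0.3, 0.3))  # phi = theta            ->  True
```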
- Is there a scientific way to determine this, or is trial and error the only way? – user100503 Mar 25 '14 at 22:36
- Thank you very much. I will take a more detailed look, but this is making sense. – user100503 Mar 26 '14 at 02:03