
Okay: if a matrix RREFs to the identity, then you can express it as a product of elementary row-operation matrices applied to the identity, and you can find the inverse by going backwards; that's trivial.
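For concreteness, here is a toy numerical sketch of that direction (the matrix and the specific row operations are my own example, not from the post, assuming NumPy):

```python
import numpy as np

# Toy example: row-reduce A to I with elementary matrices E1..E4,
# so E4 @ E3 @ E2 @ E1 @ A = I, and hence A^{-1} = E4 @ E3 @ E2 @ E1
# -- i.e. the same row operations applied to the identity.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

E1 = np.array([[1.0, 0.0], [-0.5, 1.0]])  # R2 <- R2 - (1/2) R1
E2 = np.array([[1.0, 0.0], [0.0, 2.0]])   # R2 <- 2 R2
E3 = np.array([[1.0, -1.0], [0.0, 1.0]])  # R1 <- R1 - R2
E4 = np.array([[0.5, 0.0], [0.0, 1.0]])   # R1 <- (1/2) R1

product = E4 @ E3 @ E2 @ E1
print(np.allclose(product @ A, np.eye(2)))     # the operations reduce A to I
print(np.allclose(product, np.linalg.inv(A)))  # and multiply out to A^{-1}
```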

Going the other way is more difficult, I am struggling to show:

A matrix has an inverse $\implies$ it RREFs to the identity matrix

My work (if you can call it that) so far: take the contrapositive — suppose a matrix RREFs to something other than the identity matrix; I must show the matrix has no inverse.
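One possible way to finish that contrapositive (my own sketch, not from the original post): if the RREF $R$ of $A$ is not $I_n$, then $R$ has fewer than $n$ pivots, so its last row is zero. Since $A = ER$ for some invertible product $E$ of elementary matrices, invertibility of $A$ would force $R = E^{-1}A$ to be invertible too. But a matrix whose last row is zero cannot be invertible, since for any $B$

$$(RB)_{nj} = \sum_{m} R_{nm} B_{mj} = 0 \quad \text{for all } j,$$

so $RB$ never equals $I_n$ (whose $(n,n)$ entry is $1$) — a contradiction.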

Alec Teal

1 Answer


Suppose $A$ is an $n\times n$ matrix that has an inverse. Recall that, using Gauss-Jordan elimination, we can use row operations to reduce the block matrix $[A|I_n]$ to $[I_n|A^{-1}]$, where $I_n$ is the identity matrix. That is to say, we can use row operations to reduce $A$ to $I_n$. Now, since $I_n$ is in RREF and the RREF of a matrix is unique (as you want to prove in your other post), the RREF of $A$ must be $I_n$ (or, in your terminology, $A$ RREFs to $I_n$).
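A compact numerical illustration of that $[A|I_n] \to [I_n|A^{-1}]$ reduction (the function name and the pivoting details are my own choices, assuming NumPy and an invertible input):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Row-reduce the augmented block [A | I_n] to [I_n | A^{-1}].

    Illustrative sketch only: assumes A is square and invertible.
    """
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])  # the block [A | I_n]
    for col in range(n):
        # Partial pivoting: swap up the row with the largest entry in this column.
        pivot = col + np.argmax(np.abs(M[col:, col]))
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]                    # scale so the pivot becomes 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]   # clear the rest of the column
    # Left block is now I_n; the right block is A^{-1}.
    return M[:, n:]

A = np.array([[2.0, 1.0], [1.0, 1.0]])
A_inv = gauss_jordan_inverse(A)
print(np.allclose(A @ A_inv, np.eye(2)))  # the right block really is A^{-1}
```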

Paul
  • Hate to be annoying, but is there a proof that $[A|I_n]$ RREFs to $[I_n|A^{-1}]$? (It annoys me that I can't do this.) – Alec Teal Dec 10 '14 at 23:44
  • You can refer to here: http://math.stackexchange.com/questions/164471/proof-that-gauss-jordan-elimination-works – Paul Dec 10 '14 at 23:53