When we perform row operations on a matrix, say to reduce it to reduced row echelon form, how does this help us identify the linearly independent columns when we are clearly making changes to the rows?
- You can think of it in terms of the determinant: if the determinant is nonzero (columns linearly independent), row operations will not make it zero (linearly dependent). – Matheus Nunes Mar 19 '20 at 06:52
- Think of the matrix as a system of linear equations. Why do we add equations on different lines to each other? – Brevan Ellefsen Mar 19 '20 at 06:53
1 Answer
You need to know two things. First, performing a row operation on a matrix $A$ is the same as multiplying $A$ on the left by an invertible matrix $P$. Second, if the columns of $A$ are $c_1, c_2, \dots, c_n$, then the columns of $PA$ are $Pc_1, Pc_2, \dots, Pc_n$. Since $P$ is invertible, for scalars $r_1, r_2, \dots, r_n$ we have $$r_1c_1+r_2c_2+\dots+r_nc_n=0$$ iff $$r_1Pc_1+r_2Pc_2+\dots+r_nPc_n=0,$$ which is the same as saying that a relation of linear dependence holds among the columns of $A$ iff the same relation holds among the columns of $PA$.
P. Lawrence
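Not part of the original answer, but here is a minimal numerical sketch of the same point, assuming SymPy is available. The third column of the example matrix is deliberately chosen as $c_1 + 2c_2$, and the same dependence relation holds before and after row reduction, while the pivot positions reported by `rref` mark the independent columns of the original matrix.

```python
# Illustrative sketch (assumes SymPy is installed): row reduction preserves
# linear dependence relations among the columns.
from sympy import Matrix

# Columns c1, c2, c3 with c3 = c1 + 2*c2, a deliberate dependence.
A = Matrix([[1, 0, 1],
            [2, 1, 4],
            [3, 1, 5]])

# R is the reduced row echelon form of A, i.e. R = P*A for some invertible P.
R, pivots = A.rref()

# Coefficients of the relation c1 + 2*c2 - c3 = 0.
r = Matrix([1, 2, -1])

print(A * r)    # zero vector: the relation holds for the columns of A
print(R * r)    # zero vector again: the same relation holds for the columns of R
print(pivots)   # (0, 1): the pivot columns, i.e. the positions of independent columns of A
```

Since both products are zero and the pivot indices are positions in the original matrix, reading off the pivot columns of the reduced form tells you which columns of $A$ form an independent set, even though the row operations changed the entries.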