
Could anyone help me with this proof without using determinants? I tried two ways.

Let $A$ be a matrix. If $A$ has the property that each row sums to zero, then there does not exist any matrix $X$ such that $AX=I$, where $I$ denotes the identity matrix.

I get stuck with the first approach. The other way was to prove it by contradiction, and I failed at that too.

Andy Z

7 Answers


Hint: You can sum the elements of a row by multiplying that row by a column vector of $1$'s. Can you now find a matrix $X$ (with appropriate columns) such that $AX=O$, where $O$ denotes the zero matrix?

Jimmy R.

If each row of the matrix sums to zero, then the matrix has the eigenvalue $0$, with eigenvector $(1,1,\dots,1)^T$. As a result its kernel has dimension $\ge1$, i.e. there is a nonzero solution to $AX=0$, hence the matrix is not invertible.

  • Well, as I mentioned, it's better to avoid anything related to determinants. This is an exercise following matrix multiplication, so I don't think we need deep concepts to prove it. – Andy Z Nov 11 '15 at 14:11
    @H.Zhu Eigenvalues don't need to be related to determinants. There are several ways to define them without using determinants and characteristic polynomials – Hippalectryon Nov 11 '15 at 14:14

Assuming $A$ is an $n \times n$ matrix, let $v_0 = (0,0,\dots,0)$ and $v_1 = (1,1,\dots,1)$ be $n$-element column vectors. Since each row of $A$ sums to zero, it follows that $$A v_1 = (0,0,\dots,0) = A v_0,$$ so $A$ is not injective and cannot have a left inverse. Since $A$ is square, it cannot have a right inverse either: $AX=I$ would make $A$ surjective, hence (by rank–nullity) injective.
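Not part of the original answer, but as a quick numerical sanity check of this argument (assuming numpy is available), one can build a random matrix with zero row sums and verify that it sends the all-ones vector to zero and is singular:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4
A = rng.standard_normal((n, n))
A[:, -1] = -A[:, :-1].sum(axis=1)  # force every row sum to zero

v0 = np.zeros(n)
v1 = np.ones(n)

# Both v0 and v1 are mapped to the zero vector, so A is not injective.
assert np.allclose(A @ v1, A @ v0)

# Consequently A is singular: its rank is strictly below n.
assert np.linalg.matrix_rank(A) < n
```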


If each row of $A$ sums to zero, then the sum of the column vectors constituting $A$ is the zero vector. So the columns of $A$ are linearly dependent, and therefore the matrix is singular (i.e. it has no inverse).

John Bentin

Hint: A matrix $A$ with this property has the vector $(1,1,\dots,1)$ in its kernel, thereby giving $\dim(\ker A)>0$.

Nitin Uniyal

Theorem. If $A$ is an $n$-by-$n$ matrix, then $A$ is not invertible if and only if zero is an eigenvalue of $A$.

If $e$ denotes the all-ones vector of appropriate size, then, by hypothesis, $Ae = 0 = 0e$, i.e., zero is an eigenvalue of $A$. By the theorem, $A$ is not invertible.
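As a numerical illustration of this argument (numpy assumed; not part of the original answer), one can check that $0$ indeed appears among the eigenvalues of such a matrix:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 5
A = rng.standard_normal((n, n))
A[:, -1] = -A[:, :-1].sum(axis=1)  # zero row sums, as in the hypothesis

e = np.ones(n)
# e is an eigenvector for the eigenvalue 0: A e = 0 = 0 * e.
assert np.allclose(A @ e, np.zeros(n))

# 0 shows up (to numerical precision) among the eigenvalues of A.
eigvals = np.linalg.eigvals(A)
assert np.min(np.abs(eigvals)) < 1e-8
```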

Pietro Paparella

I'm not too sure, but my reasoning is this: if each row of $A$ sums to $0$, then the columns of $A$ are linearly dependent, because they add up to the zero vector. Equivalently, the rows of $A^T$ are linearly dependent. Therefore the matrix $A$ is not invertible by the Invertible Matrix Theorem, and its determinant is equal to $0$.