11

An example is:

$$A= \begin{bmatrix} 3 & 2 \\ 1 & 4 \end{bmatrix} \leftrightarrow -\begin{bmatrix} 1 & 4 \\ 3 & 2 \end{bmatrix} \leftrightarrow -\begin{bmatrix} 1 & 4 \\ 0 & -10 \end{bmatrix} = H $$

The row operations I used are $$(1) R_1 \leftrightarrow R_2$$

$$(2) -3R_1 + R_2 \to R_2$$

The characteristic polynomial of the original matrix:

$$|A - \lambda I| = (\lambda-2)(\lambda-5) $$

The characteristic polynomial of the reduced matrix:

$$|H - \lambda I| = -(1-\lambda)(-10-\lambda) $$

Why are they different?

This was working for a couple of other questions.

EDIT: Here is a matrix whose reduced form has the same eigenvalues:

$$A= \begin{bmatrix} 2 & 0 & 1 \\ 6 & 4 & -3 \\ 2 & 0 & 3 \end{bmatrix} \leftrightarrow \begin{bmatrix} 1 & 0 & 1 \\ 9 & 4 & -3 \\ -1 & 0 & 3 \end{bmatrix} \leftrightarrow \begin{bmatrix} 1 & 0 & 0 \\ 9 & 4 & -12 \\ -1 & 0 & 4 \end{bmatrix} = H $$

The column operations were $$-C_3 + C_1 \to C_1$$

$$-C_1 + C_3 \to C_3$$

The eigenvalues are the same for $A$ and $H$.
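A quick NumPy sanity check of both examples (a sketch of my own, not part of the original computation; I take $H$ to be the reduced matrix itself, ignoring the determinant-sign bookkeeping):

```python
import numpy as np

# First example: the row operations change the eigenvalues.
A = np.array([[3.0, 2.0], [1.0, 4.0]])
H = np.array([[1.0, 4.0], [0.0, -10.0]])   # after R1 <-> R2, then -3R1 + R2 -> R2
print(np.sort(np.linalg.eigvals(A)))  # roots of (x-2)(x-5): approximately 2 and 5
print(np.sort(np.linalg.eigvals(H)))  # roots of (1-x)(-10-x): approximately -10 and 1

# Second example: these particular column operations happen to preserve them.
A2 = np.array([[2.0, 0.0, 1.0], [6.0, 4.0, -3.0], [2.0, 0.0, 3.0]])
H2 = np.array([[1.0, 0.0, 0.0], [9.0, 4.0, -12.0], [-1.0, 0.0, 4.0]])
print(np.sort_complex(np.linalg.eigvals(A2)))  # approximately 1, 4, 4
print(np.sort_complex(np.linalg.eigvals(H2)))  # approximately 1, 4, 4
```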

operatorerror
user349557
  • I wonder which were the couple of questions for which this was working, because as you can see below, it does not apply in general. However, the row operations don't change the determinant (as you can see above); that follows from multiplicativity of the determinant. – Sarvesh Ravichandran Iyer Apr 01 '17 at 22:32
  • I'll put it in an edit – user349557 Apr 01 '17 at 22:36
  • Very interesting. In fact, this was a very special case, since some other set of elementary operations could possibly have changed the eigenvalues, while this specific one did not. – Sarvesh Ravichandran Iyer Apr 01 '17 at 22:41
  • Yeah I was trying to find a shortcut. I wondered why none of my textbook solutions ever used row operations. Makes sense now – user349557 Apr 01 '17 at 22:43
  • All right. Of course, +1 for this question. Now you know it, though: the operations are never meant to retain the eigenvalues; they only retain the determinant. – Sarvesh Ravichandran Iyer Apr 01 '17 at 22:44

2 Answers

10

There's no reason this will work because row operations don't preserve similarity and so might change the eigenvalues. More explicitly, by performing row operations you move from a matrix $A$ to a matrix $PA$ where $P$ is invertible and in general $PA$ is not similar to $A$.

To see this even more explicitly, assume that $A$ is diagonal with entries $(1,0)$ and multiply the first row by $2$. Clearly the eigenvalues of the resulting matrix will be $(2,0)$ while those of the original matrix were $(1,0)$.
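The diagonal example can be sketched in a few lines of NumPy (the variable names are mine):

```python
import numpy as np

# A is diagonal with entries (1, 0); multiplying its first row by 2
# is left-multiplication by the elementary matrix P = diag(2, 1).
A = np.diag([1.0, 0.0])
P = np.diag([2.0, 1.0])

print(np.sort(np.linalg.eigvals(A)))      # approximately [0., 1.]
print(np.sort(np.linalg.eigvals(P @ A)))  # approximately [0., 2.]
```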

levap
  • So you shouldn't be doing row operations when working with eigenvalues? – user349557 Apr 01 '17 at 22:29
    @user349557 Correct: you are modifying your matrix in a way that may change its eigenvalues. Operations of the form $SAS^{-1}$ are guaranteed not to change the eigenvalues of $A$. levap's point and mine is that row reduction is not an operation of this form – operatorerror Apr 01 '17 at 22:31
  • You definitely can't use row operations on the matrix $A$ and expect the eigenvalues to stay the same. Sometimes it is useful to do row operations on the matrix $A - xI$ and record the effect on the determinant. Since the eigenvalues of $A$ are the roots of $\det(A - xI)$ and row operations change the determinant in known way, this can be of help but this is not the same as doing the row operations on the matrix $A$. – levap Apr 01 '17 at 22:32
  • In general, it is much more difficult to find the eigenvalues of a matrix (you need to solve a polynomial equation $\det(A - xI) = 0$) than to perform row reduction. – levap Apr 01 '17 at 22:33
6

The eigenvalues of a matrix are the solutions to the polynomial equation $$ \det(A-\lambda I)=0 $$ not the solutions of $$ \det(SA-\lambda I)=0 $$ where $S$ is a product of elementary matrices. When you row reduce, you are (usually) multiplying on the left by an invertible matrix. The roots of the two polynomials above will not in general coincide.
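The contrast can be sketched numerically (a NumPy sketch of my own; the matrices are random, not from the question): left-multiplying by an invertible $S$ generally changes the eigenvalues, while conjugating by $S$ never does.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

S = np.eye(3)
S[1, 0] = -3.0                        # elementary matrix for -3*R1 + R2 -> R2

row_reduced = S @ A                   # row operation: not similar to A in general
similar = S @ A @ np.linalg.inv(S)    # similarity transform: same eigenvalues

eig = lambda M: np.sort_complex(np.linalg.eigvals(M))
print(np.allclose(eig(A), eig(similar)))      # True
print(np.allclose(eig(A), eig(row_reduced)))  # generically False
```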

operatorerror