
For $$ A=\left[\begin{array}{lllll} 1 & 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 & 1 \end{array}\right] $$

Using MATLAB, one can easily show that $\lambda_1 = 0$ and $\lambda_2 = 5$ are the eigenvalues of $A$.
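
For completeness, here is a minimal Python/NumPy sketch of the same numerical check:

```python
import numpy as np

# The 5x5 all-ones matrix A from the question.
A = np.ones((5, 5))

# Numerically computed eigenvalues: four (floating-point) zeros and a single 5.
print(np.round(np.linalg.eigvals(A), 10))
```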

However, I would like to do it by hand.

Since $\det(A) = 0$, we know that $\lambda_1 = 0$ is an eigenvalue.

However, I am having difficulty finding the other value. I know if we take $det(A-\lambda I) = 0$, then we get the characteristic equation and can find eigenvalues.

But, I think it is too much to do for such a special matrix.

On the other hand, if I can establish that the algebraic multiplicity of $\lambda_1 = 0$ is $4$, then I can argue as follows: since $A$ must have $5$ eigenvalues (not necessarily distinct), and $4$ of them equal $0$, the remaining eigenvalue is $\lambda_2 = \sum_i a_{ii} - 4\lambda_1 = 5 - 4\cdot 0 = 5$.

Is it possible to efficiently determine, just by inspecting the matrix, that the algebraic multiplicity of $\lambda_1 = 0$ is $4$?

Sai Nikhil
  • You can write down all of the eigenvectors very explicitly. For starters, the eigenvector corresponding to eigenvalue $5$ is $(1, 1, 1, 1, 1)$. – Qiaochu Yuan Dec 11 '20 at 04:34
  • I don't want to compute eigenvectors at all. May I know the reason why you suggested that? – Sai Nikhil Dec 11 '20 at 04:39
  • It is by far the fastest way to do this computation. You can also do it by computing the characteristic polynomial using row reduction, but it is genuinely really easy to compute the eigenvectors, I promise. – Qiaochu Yuan Dec 11 '20 at 04:40
  • How did you find 5 is an eigenvalue? – Sai Nikhil Dec 11 '20 at 04:40
  • Multiply the matrix by the vector $(1, 1, 1, 1, 1)$! – Qiaochu Yuan Dec 11 '20 at 04:40
  • How did you get this magic vector? – Sai Nikhil Dec 11 '20 at 04:42
  • It's not magic! Just look at what the matrix actually does! It takes a vector $(x_1, x_2, x_3, x_4, x_5)$ and returns the vector all of whose entries are the sum $x_1 + x_2 + x_3 + x_4 + x_5$. In other words, the range is $1$-dimensional and spanned by $(1, 1, 1, 1, 1)$ so this is the only possible eigenvector corresponding to a nonzero eigenvalue, and the other eigenvectors must have eigenvalue $0$. – Qiaochu Yuan Dec 11 '20 at 04:43
  • Is there any relation between algebraic multiplicity and $dim(ker(A-\lambda I))$? – Sai Nikhil Dec 11 '20 at 04:57
  • In general, try to plug in the vector $(1, \zeta, \zeta^2, \zeta^3, \zeta^4) $, where $\zeta$ is a 5-th root of unity. Since there are five of them, you will get five distinct eigenvectors. This works any time the row below is obtained by shifting positions by one. For example if the first row is $1 \ 2 \ 3 \ 4 \ 5$ , the second should be $ 2\ 3 \ 4 \ 5 \ 1$, and so on for the following rows. – Andrea Marino Jan 07 '21 at 20:16
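
A quick numerical check of the root-of-unity trick from the last comment above, sketched in Python/NumPy (the vectors below are the suggested $(1, \zeta^k, \zeta^{2k}, \zeta^{3k}, \zeta^{4k})$):

```python
import numpy as np

A = np.ones((5, 5))
zeta = np.exp(2j * np.pi / 5)              # primitive 5th root of unity

for k in range(5):
    v = zeta ** (k * np.arange(5))         # (1, zeta^k, zeta^{2k}, zeta^{3k}, zeta^{4k})
    lam = 5 if k == 0 else 0               # predicted eigenvalue for this vector
    print(k, np.allclose(A @ v, lam * v))  # True for every k
```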

3 Answers


It is obvious that $\mbox{rank}(A)=1$. Therefore, by the rank-nullity theorem, the null space of $A$ has dimension $4$.

This means that the eigenspace for the eigenvalue $\lambda=0$ is 4-dimensional.

It is a fundamental result of linear algebra that the algebraic multiplicity of an eigenvalue is greater than or equal to the dimension of the corresponding eigenspace. [This is typically stated as "algebraic multiplicity" $\geq$ "geometric multiplicity".]

This means that $\lambda=0$ has algebraic multiplicity at least $4$.

It follows that $\lambda_1=\lambda_2=\lambda_3=\lambda_4=0$.

Finally, $$\lambda_1+\lambda_2+\lambda_3+\lambda_4+\lambda_5= \mbox{tr}(A)=5$$

This gives $\lambda_5=5$. (In particular, since $\lambda_5 \neq 0$, the algebraic multiplicity of $0$ is exactly $4$.)
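
A quick numerical sanity check of the two facts this argument rests on (rank and trace), as a minimal Python/NumPy sketch:

```python
import numpy as np

A = np.ones((5, 5))
print(np.linalg.matrix_rank(A))  # 1, so the null space is 4-dimensional
print(np.trace(A))               # 5.0, the sum of all five eigenvalues
```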

N. S.

The columns of the following matrix are eigenvectors of $A$: the first column has eigenvalue $5$, and the remaining four have eigenvalue $0$. $$ \left( \begin{array}{rrrrr} 1 & -1 & -1 & -1 & -1 \\ 1 & 1 & -1 & -1 & -1 \\ 1 & 0 & 2 & -1 & -1 \\ 1 & 0 & 0 & 3 & -1 \\ 1 & 0 & 0 & 0 & 4 \\ \end{array} \right). $$
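
A minimal Python/NumPy sketch to check this claim, assuming the first column has eigenvalue $5$ and the remaining columns eigenvalue $0$:

```python
import numpy as np

A = np.ones((5, 5))
V = np.array([[1, -1, -1, -1, -1],
              [1,  1, -1, -1, -1],
              [1,  0,  2, -1, -1],
              [1,  0,  0,  3, -1],
              [1,  0,  0,  0,  4]], dtype=float)
D = np.diag([5.0, 0, 0, 0, 0])     # assumed eigenvalue for each column of V
print(np.allclose(A @ V, V @ D))   # True: A maps each column to (eigenvalue * column)
```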

Will Jagy

Yes, you can. Note that it is a symmetric matrix and that all of its row (column) vectors are scalar multiples of a single row (column) vector, so the rows span only one dimension. Treating the matrix as a transformation, this means the transformed vectors also span only one dimension. So there is at most one nonzero real eigenvalue; all of the remaining eigenvalues have to be zero.
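
Written out explicitly for the all-ones matrix $A$: for any vector $x$,

$$ Ax = \begin{bmatrix} x_1 + x_2 + x_3 + x_4 + x_5 \\ \vdots \\ x_1 + x_2 + x_3 + x_4 + x_5 \end{bmatrix} = (x_1 + x_2 + x_3 + x_4 + x_5)\begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix}, $$

so the image of $A$ is spanned by $(1,1,1,1,1)^T$. Any eigenvector with a nonzero eigenvalue must lie in this one-dimensional image, which is why at most one eigenvalue can be nonzero.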

Also, for a matrix like the one above it is easy to see that the eigenvector for the nonzero eigenvalue is of the form $\begin{bmatrix} 1\\1\\\vdots\\1 \end{bmatrix}$. For example, if $\begin{bmatrix} 1 & 2 & 1\\1 & 2 & 1\\1 & 2 & 1\end{bmatrix}\begin{bmatrix} x\\y\\z\end{bmatrix} = \lambda\begin{bmatrix} x\\y\\z\end{bmatrix}$, then

$x+2y+z = \lambda x$, $\quad x+2y+z = \lambda y$, $\quad x+2y+z = \lambda z$,
so for $\lambda \neq 0$ we get $x=y=z$, and then $x+2x+x = 4x = \lambda x$ gives $\lambda = 4$. This also independently confirms that there is only one nonzero eigenvalue.

For a more technical view, see: What is the relation between rank of a matrix, its eigenvalues and eigenvectors.

novice_2
  • Okay, seems like you're the only one who read my question correctly. Just one more doubt before I upvote and tick. Can you mathematically write "row vectors spans only one dimension $\implies$ only $1$ non-zero eigenvalue"? – Sai Nikhil Dec 11 '20 at 05:33
  • @SaiNikhil I edited my answer to clarify further. – novice_2 Dec 11 '20 at 06:03
  • The second part of what I wrote was incorrect, so I deleted it. But the first part and the link should still be fine. – novice_2 Jan 07 '21 at 20:11
  • and I added what I should have written then. – novice_2 Jan 08 '21 at 11:42