Assume one knows the eigenvalues $(\lambda_i)$ of a real $n \times n$ matrix $M$. Let $b$ be a vector in $\mathbb{R}^n$ and construct the matrix $J$ by $$J_{ij} = M_{ij}b_j.$$ Can we deduce the eigenvalues of $J$ from the eigenvalues $\lambda_i$ and the vector $b$?
- J is a vector... – Oct 10 '11 at 17:18
- @percusse: No, I think $J$ is a matrix. – mellow Oct 10 '11 at 17:20
- Okay, so you're postmultiplying $M$ with the diagonal matrix $\mathrm{diag}(b)$... I would say there doesn't seem to be a nice relationship between the eigenvalues of that and the eigenvalues of your original matrix... – J. M. ain't a mathematician Oct 10 '11 at 17:21
- To make it clearer what $J$ is, you could write $$J=M\,\mathrm{diag}(\vec{b}).$$ – anon Oct 10 '11 at 17:23
- A related question... – J. M. ain't a mathematician Oct 10 '11 at 17:27
- @mellow It was a little bit puzzling for me since $b$ is a vector. I see that you are not summing over indices. – Oct 10 '11 at 17:44
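The identification suggested in the comments, $J = M\,\mathrm{diag}(b)$, can be checked numerically. A minimal sketch with NumPy (the random $3\times 3$ matrix and seed are illustrative, not from the question):

```python
import numpy as np

# Check that the elementwise definition J_ij = M_ij * b_j
# coincides with the matrix product M @ diag(b).
rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
b = rng.standard_normal(3)

J_elementwise = M * b       # broadcasting multiplies column j of M by b_j
J_diag = M @ np.diag(b)

assert np.allclose(J_elementwise, J_diag)
```

Broadcasting `M * b` scales each column $j$ of $M$ by $b_j$, which is exactly right-multiplication by $\mathrm{diag}(b)$.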
1 Answer
No. For example, suppose $n=2$ and we know that $M$ has eigenvalues $0$ and $2$ and that $b=(1,0)$. Then we might have $M=\begin{pmatrix} 2 & 0 \\ 0 & 0 \end{pmatrix}$, in which case $J=\begin{pmatrix} 2 & 0 \\ 0 & 0 \end{pmatrix}$ has eigenvalues $0$ and $2$. On the other hand, we might have $M=\begin{pmatrix} 0 & 0 \\ 0 & 2 \end{pmatrix}$, in which case $J=\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$ has $0$ as its only eigenvalue.
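The counterexample above can be verified numerically; a short NumPy sketch (the variable names are mine):

```python
import numpy as np

# Two matrices with the same eigenvalues {0, 2}, and b = (1, 0).
b = np.array([1.0, 0.0])
M1 = np.array([[2.0, 0.0],
               [0.0, 0.0]])
M2 = np.array([[0.0, 0.0],
               [0.0, 2.0]])

# J = M @ diag(b), i.e. J_ij = M_ij * b_j
J1 = M1 @ np.diag(b)
J2 = M2 @ np.diag(b)

print(sorted(np.linalg.eigvals(J1).real))  # [0.0, 2.0]
print(sorted(np.linalg.eigvals(J2).real))  # [0.0, 0.0]
```

Both $M_1$ and $M_2$ have spectrum $\{0, 2\}$, yet the resulting $J$'s have different spectra, so the eigenvalues of $M$ together with $b$ do not determine the eigenvalues of $J$.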
Chris Eagle