
I am posed with the following question:

Suppose that $T\in \mathcal{L}(V)$, where $V$ is a finite-dimensional vector space, is such that every vector in $V$ is an eigenvector of $T$. Prove that T is a scalar multiple of the identity function.


My attempt goes as follows:

If $T$ is a scalar multiple of the identity function, then $Tv=av$ for all $v\in V$, where $a$ does not depend on $v$. We will start by assuming that this may not necessarily be true, and work our way to the result.

From the problem statement, we know that $Tv=a_v v$, where $a_v$ may depend on the choice of $v$. To show that it doesn't, consider two non-zero vectors $v_1$ and $v_2$, both in $V$ (it would be pointless to consider zero vectors), and consider $T(v_1+v_2)$. Let us first treat the case where $v_2$ is not a scalar multiple of $v_1$, so that $\{v_1, v_2\}$ is a linearly independent set. On one hand we have

$$\begin{align*} T(v_1+v_2)&=\alpha (v_1+v_2)\\ &= \alpha v_1 + \alpha v_2 \end{align*}$$

which is true because of the assumption that every vector is an eigenvector. On the other hand, we have

$$\begin{align*} T(v_1+v_2)&=T(v_1)+T(v_2)\\ &=a_1 v_1 + a_2 v_2 \end{align*}$$

So we are left with the following equality

$$a_1 v_1 + a_2 v_2=\alpha v_1 + \alpha v_2$$ $$\implies (a_1 - \alpha)v_1 + (a_2 - \alpha)v_2 = 0$$

Because $v_1$ and $v_2$ are linearly independent, $a_1=a_2=\alpha$.
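As a quick numerical sanity check of this step (a sketch in Python with NumPy; the matrix $\operatorname{diag}(1,2)$ and the determinant test are my own illustrative choices, not part of the original argument): for a non-scalar operator, the sum of two eigenvectors with distinct eigenvalues fails to be an eigenvector, which is exactly the situation the equality above rules out.

```python
import numpy as np

# A non-scalar operator: e1 and e2 are eigenvectors with distinct eigenvalues.
T = np.diag([1.0, 2.0])
v1 = np.array([1.0, 0.0])  # eigenvalue a1 = 1
v2 = np.array([0.0, 1.0])  # eigenvalue a2 = 2

w = v1 + v2
Tw = T @ w  # = (1, 2), which is not a scalar multiple of w = (1, 1)

# w is an eigenvector iff Tw and w are linearly dependent, i.e. the
# 2x2 matrix [w | Tw] is singular.
is_eigenvector = np.isclose(np.linalg.det(np.column_stack([w, Tw])), 0.0)
print(is_eigenvector)  # False: a1 != a2 breaks the eigenvector property
```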

Now consider $\beta v$, a scalar multiple of $v$.

$$\begin{align*} T(\beta v)&= \beta T(v)\\ &= \beta (a v)\\ &= a (\beta v) \end{align*} $$
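The scaling step can also be illustrated numerically (again a sketch with NumPy; `eigenvalue_of` is a hypothetical helper based on the Rayleigh quotient, which recovers $a$ whenever $Tv = av$):

```python
import numpy as np

T = np.diag([1.0, 2.0])  # non-scalar, but e1 is still an eigenvector

def eigenvalue_of(T, v):
    """Recover a with T v = a v; only meaningful when v is an eigenvector."""
    Tv = T @ v
    return (Tv @ v) / (v @ v)  # Rayleigh quotient, exact for eigenvectors

v = np.array([1.0, 0.0])
print(eigenvalue_of(T, v))        # 1.0
print(eigenvalue_of(T, 2.5 * v))  # 1.0 -- the eigenvalue is constant on span{v}
```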

Because every vector in $V$ other than an arbitrary vector $v$ is either a scalar multiple of $v$ or the sum of $v$ and some other vector in $V$, we conclude that $Tv=av$ for all $v\in V$.


Is my proof valid? Please critique my proof holistically.

P.S.: Is the same proof valid for infinite-dimensional vector spaces? I ask because the answer would hinge on the validity of the last paragraph, and I don't know whether it holds in infinite-dimensional vector spaces.

1 Answer


Here's as holistic an approach to the present problem as I've got:

This very same question actually came up in a research project on differential equations I last worked on a couple of years ago, so it is something with which I am well acquainted. My proof was essentially the same as OP Arturo Don Juan's, so I believe I can affirm the correctness of his argument with confidence.

Just for completeness, here's the way I wrote it up (from my research notes):

Let $\Bbb F$ be any field, and let $V$ be any vector space over $\Bbb F$ (note we don't require $\dim V < \infty$). If $T \in \mathcal L(V)$ is such that every non-zero $v \in V$ is an eigenvector, then there is an $\alpha \in \Bbb F$ such that $T(v) = \alpha v$ for all $v \in V$.

Proof: The hypothesis that every (non-zero) $v \in V$ is an eigenvector of $T$ may be expressed by the equation

$Tv = \phi(v) v, \tag{1}$

where $\phi(v) \in \Bbb F$ for every $v$; that is, $\phi:V \to \Bbb F$ is a function taking vectors in $V$ to scalars in $\Bbb F$. If $u \in V$ is related to $v \in V$ by the equation $u = \beta v$ for some $\beta \in \Bbb F$, we have

$\phi(u)u = \phi(\beta v)(\beta v) = T(\beta v) = \beta T(v) = \beta \phi(v) v = \phi(v) \beta v, \tag{2}$

or in short form,

$\phi(\beta v)(\beta v) = \phi(v) \beta v; \tag{3}$

when $\beta v \ne 0$, (3) shows that

$\phi(\beta v) = \phi(v); \tag{4}$

i.e., $\phi(v)$ depends only on the one-dimensional subspace $v$ generates, that is, on $\text{span}\{v\}$; this implies that, for $v_1, v_2$ non-zero and linearly dependent, $\phi(v_1) = \phi(v_2)$. In the event that $v_1, v_2$ are linearly independent, we may write, in a manner analogous to our OP,

$T(v_1 + v_2) = \phi(v_1 + v_2) (v_1 + v_2) = \phi(v_1 + v_2) v_1 + \phi(v_1 + v_2) v_2 \tag{5}$

and

$T(v_1 + v_2) = T(v_1) + T(v_2) = \phi(v_1)v_1 + \phi(v_2)v_2; \tag{6}$

combining (5) and (6):

$\phi(v_1 + v_2) v_1 + \phi(v_1 + v_2) v_2 = \phi(v_1)v_1 + \phi(v_2)v_2; \tag{7}$

by the linear independence of $v_1$ and $v_2$, we see from (7) that

$\phi(v_1) = \phi(v_1 + v_2) = \phi(v_2); \tag{8}$

since $v_1, v_2$ are an arbitrary pair of linearly independent vectors, this result combined with (3)-(4) shows that $\phi(v)$ is constant on the non-zero vectors of $V$; taking $\alpha = \phi(v)$ for any non-zero $v \in V$ then yields the complete solution. QED.
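For what it's worth, the finite-dimensional statement is easy to test numerically. Here is a rough Python/NumPy sketch (the function names are my own, not a standard API); by the argument above, it suffices to check the basis vectors $e_i$ and the sums $e_i + e_j$:

```python
import numpy as np

def is_eigenvector(T, v, tol=1e-9):
    """True if v != 0 and T v is a scalar multiple of v."""
    if np.allclose(v, 0):
        return False
    Tv = T @ v
    # Tv is parallel to v iff the matrix [v | Tv] has rank at most 1.
    return np.linalg.matrix_rank(np.column_stack([v, Tv]), tol=tol) <= 1

def every_basis_combo_is_eigenvector(T):
    """Check e_i and e_i + e_j (i < j) -- enough, by the argument above,
    to force T to be a scalar multiple of the identity."""
    n = T.shape[0]
    basis = np.eye(n)
    vecs = [basis[i] for i in range(n)]
    vecs += [basis[i] + basis[j] for i in range(n) for j in range(i + 1, n)]
    return all(is_eigenvector(T, v) for v in vecs)

print(every_basis_combo_is_eigenvector(3.0 * np.eye(4)))     # True: T = 3I
print(every_basis_combo_is_eigenvector(np.diag([1.0, 2.0]))) # False: non-scalar
```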

Robert Lewis