I am posed with the following question:
Suppose that $T\in \mathcal{L}(V)$, where $V$ is a finite-dimensional vector space, is such that every vector in $V$ is an eigenvector of $T$. Prove that $T$ is a scalar multiple of the identity function.
My attempt goes as follows:
If $T$ is a scalar multiple of the identity function, then $Tv=av$ for all $v\in V$, where $a$ does not depend on $v$. We will start by allowing the scalar to depend on $v$ and work our way to this result.
From the problem statement, we know that every $v\in V$ satisfies $Tv=av$ for some scalar $a$ that may depend on the choice of $v$. To show that it does not, consider two non-zero vectors $v_1$ and $v_2$ in $V$, with $Tv_1=a_1 v_1$ and $Tv_2=a_2 v_2$ (it would be pointless to consider zero vectors), and look at $T(v_1+v_2)$. Let us first consider the case where $v_2$ is not a scalar multiple of $v_1$, so that $v_1$ and $v_2$ form a linearly independent set. On one hand we have
$$\begin{align*} T(v_1+v_2)&=\alpha (v_1+v_2)\\ &= \alpha v_1 + \alpha v_2 \end{align*}$$
which is true because of the assumption that every vector (in particular $v_1+v_2$) is an eigenvector of $T$. On the other hand, we have
$$\begin{align*} T(v_1+v_2)&=T(v_1)+T(v_2)\\ &=a_1 v_1 + a_2 v_2 \end{align*}$$
So we are left with the following equality
$$a_1 v_1 + a_2 v_2=\alpha v_1 + \alpha v_2$$ $$\implies (a_1 - \alpha)v_1 + (a_2 - \alpha)v_2 = 0$$
Because $v_1$ and $v_2$ are linearly independent, both coefficients must be zero, so $a_1=a_2=\alpha$.
Now consider the remaining case: a vector $\beta v$ that is a scalar multiple of $v$, where $Tv=av$.
$$\begin{align*} T(\beta v)&= \beta T(v)\\ &= \beta (a v)\\ &= a (\beta v) \end{align*} $$
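To make the link with the earlier case split explicit (this is the same computation, just relabelled under the assumption $v_2=\beta v_1$ with $\beta\neq 0$):
$$a_2 v_2 = T(v_2) = T(\beta v_1) = \beta T(v_1) = \beta a_1 v_1 = a_1(\beta v_1) = a_1 v_2,$$
and since $v_2\neq 0$ this forces $a_2=a_1$.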
Because every vector in $V$ that is not equal to a given vector $v$ is either a scalar multiple of $v$ or the sum of $v$ and some other vector in $V$, we conclude that $Tv=av$ for all $v\in V$.
Is my proof valid? Please critique it holistically.
P.S.: Is the same proof valid for infinite-dimensional vector spaces? I ask because the answer hinges on the validity of the last paragraph, and I don't know whether it holds in infinite-dimensional vector spaces.