In addition to my comment, let me suggest another proof that leads to a slightly larger insight.
For any matrix $M$, and $k \ge 1$, if $x$ is an eigenvector for $\lambda$ then
$$
M^k x = \lambda^k x.
$$
The proof is a quick induction on $k$: the base case $k = 1$ is the definition of an eigenvector, and the inductive step is $M^k x = M(M^{k-1}x) = M(\lambda^{k-1}x) = \lambda^{k-1}(Mx) = \lambda^k x$.
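If you like, you can sanity-check the lemma numerically. Here's a small sketch with numpy, using an arbitrary matrix of my own choosing (nothing about the particular entries matters):

```python
import numpy as np

# An arbitrary example matrix (upper triangular, so its eigenvalues
# are the diagonal entries 2 and 3).
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Grab one eigenpair (lam, x) with M @ x = lam * x.
eigvals, eigvecs = np.linalg.eig(M)
lam, x = eigvals[0], eigvecs[:, 0]

# The lemma says M^k x = lam^k x for every k >= 1.
k = 4
lhs = np.linalg.matrix_power(M, k) @ x
rhs = lam**k * x
print(np.allclose(lhs, rhs))  # True
```

Of course this is just one matrix and one $k$, not a proof; the induction above is the proof.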
Now start from $A^2 = A$. Rewrite this as
$$
A^2 - A = 0
$$
and then let $x$ be an eigenvector for $\lambda$, an eigenvalue of $A$. We then get
\begin{align}
(A^2 - A)x &= 0 \\
A^2x - Ax &= 0 \\
\lambda^2x - \lambda x &= 0 & \text{by applying the lemma}\\
(\lambda^2 - \lambda) x &= 0 \\
\lambda^2 - \lambda &= 0 & \text{because $x$ is nonzero.}
\end{align}
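So every eigenvalue of an idempotent matrix is a root of $\lambda^2 - \lambda$, i.e. is $0$ or $1$. Here's a quick numerical check with an idempotent matrix I made up for the purpose:

```python
import numpy as np

# An example idempotent matrix: A @ A == A.
A = np.array([[1.0, 1.0],
              [0.0, 0.0]])
assert np.allclose(A @ A, A)

# Its eigenvalues should all satisfy lam^2 - lam = 0.
eigvals = np.linalg.eig(A)[0]
print(np.allclose(eigvals**2 - eigvals, 0))  # True
```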
Now the polynomial that $A$ satisfied, namely $A^2 - A = 0$, was nothing special. Suppose instead that we knew that $A^3 - 3A^2 - A + I = 0$. We can write this as $p(A) = 0$, where $p(t) = t^3 - 3t^2 - t + 1$. By exactly the same kind of argument, we'd find that if $\lambda$ is an eigenvalue of $A$, then $p(\lambda) = 0$.
Summary: if a matrix $A$ satisfies $p(A) = 0$ for some polynomial $p$, then every eigenvalue $\lambda$ of $A$ satisfies $p(\lambda) = 0$.
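To see the general statement in action, here's one more numerical sketch. I pick a matrix and a polynomial of my own choosing that annihilates it: $A = \operatorname{diag}(2, 3)$ satisfies $p(A) = 0$ for $p(t) = t^2 - 5t + 6 = (t-2)(t-3)$, and the eigenvalues of $A$ are indeed roots of $p$:

```python
import numpy as np

# p(t) = t^2 - 5t + 6, evaluated at a matrix and at a scalar.
def p_mat(X):
    return X @ X - 5 * X + 6 * np.eye(len(X))

def p_scalar(lam):
    return lam**2 - 5 * lam + 6

# An example matrix annihilated by p: p(A) = (A - 2I)(A - 3I) = 0.
A = np.diag([2.0, 3.0])
assert np.allclose(p_mat(A), 0)

# The summary predicts p(lam) = 0 for every eigenvalue lam of A.
eigvals = np.linalg.eigvals(A)
print(np.allclose([p_scalar(lam) for lam in eigvals], 0))  # True
```

(By Cayley--Hamilton, every matrix is annihilated by its characteristic polynomial, so such a $p$ always exists; the summary says the eigenvalues are always among its roots.)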