$\newcommand{\metric}[2]{\langle #1,#2 \rangle}$$\newcommand{\R}{\mathbb{R}}$Here's another proof. The argument is an adjustment of the proof of the singular value decomposition of a matrix. I don't claim it's the shortest proof ever, but I think it is quite informative.
The geometric viewpoint. A symmetric $n\times n$-matrix $A$ gives rise to a bilinear form on $\mathbb{R}^n$:
$$
\metric{v}{w} = w^T A v.
$$
Associated with this form is the quadratic form $Q(v)=\metric{v}{v}=v^T A v$.
Consider this bilinear form as a generalized inner product on $\R^n$. In this sense, think of $\metric{v}{v}$ as the "squared length" of a vector $v\in\R^n$. If all eigenvalues of $A$ are positive, then $\metric{v}{v}$ is indeed always a positive number;
in this case $\metric{v}{w}$ is an inner product. If $A$ has negative or zero eigenvalues, this viewpoint is less intuitive. Nevertheless, such bilinear forms do occur naturally (e.g. the Minkowski metric on $\R^4$).
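For concreteness (a standard example, here with the $(-,+,+,+)$ sign convention), the Minkowski form on $\R^4$ comes from the symmetric matrix
$$
\eta = \begin{pmatrix} -1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix},
\qquad
\metric{v}{w} = w^T \eta v = -v_1 w_1 + v_2 w_2 + v_3 w_3 + v_4 w_4,
$$
so the "squared length" $\metric{v}{v}$ can be positive, zero, or negative.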
A natural question arises. How do the "distances" behave on $\R^n$ equipped with this bilinear form? For this, we investigate the map
$$
f: S^{n-1}\subset\R^n\to \R: v \mapsto v^T A v.
$$
We are going to proceed as follows. First we determine a vector $v_1 \in S^{n-1}$ at which $f$ attains its maximum. Then we find a vector $v_2 \in S^{n-1} \cap \mathrm{span}\{v_1\}^\perp$ at which $f$ is maximal on the orthogonal complement of $v_1$. Repeating this argument yields an orthonormal basis $\{v_1,\ldots, v_n\}$ of eigenvectors with eigenvalues $f(v_1),\ldots, f(v_n)$. Since $f$ is real valued, the eigenvalues are real.
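This maximize-then-restrict construction can also be illustrated numerically. The sketch below is my own illustration, not part of the proof: `greedy_spectral_basis` is a hypothetical helper that finds each maximizer by power iteration on a shifted matrix (shifting makes all eigenvalues positive, so the maximizer of $f$ on the remaining subspace is the dominant eigenvector there), then passes to the orthogonal complement.

```python
import numpy as np

def greedy_spectral_basis(A, iters=500, seed=0):
    """Mimic the proof: repeatedly maximize f(v) = v^T A v over
    unit vectors orthogonal to the eigenvectors found so far."""
    n = A.shape[0]
    # Shift A so all eigenvalues are positive; then the maximizer of f
    # on the remaining subspace is the dominant eigenvector there,
    # which power iteration can find.
    shift = np.abs(A).sum() + 1.0
    B = A + shift * np.eye(n)
    P = np.eye(n)                      # projector onto remaining subspace
    rng = np.random.default_rng(seed)
    vecs, vals = [], []
    for _ in range(n):
        v = P @ rng.standard_normal(n)
        for _ in range(iters):
            v = P @ (B @ v)            # stay inside the subspace
            v /= np.linalg.norm(v)
        vecs.append(v)
        vals.append(v @ A @ v)         # the maximal value f(v) is the eigenvalue
        P -= np.outer(v, v)            # pass to the orthogonal complement
    return np.array(vecs).T, np.array(vals)

A = np.array([[2.0, 1.0], [1.0, 2.0]])
V, lams = greedy_spectral_basis(A)
# lams should be close to [3, 1], and the columns of V should be
# orthonormal eigenvectors of A.
```

This is only a numerical sketch under genericity assumptions (distinct eigenvalues, a random start with nonzero overlap); it is not meant as a robust eigensolver.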
Theorem. The matrix $A$ admits an orthonormal basis $\{v_1,\ldots, v_n\}$ of eigenvectors with real eigenvalues.
Proof. Basis step. Since $S^{n-1}$ is compact, there is a $v_1\in S^{n-1}$ such that $f(v_1)$ is maximal. Now take a $w \in S^{n-1}\cap \mathrm{span}\{v_1\}^\perp$. Consider the curve
$$
\alpha\colon (-\epsilon,\epsilon)\to\R^n : t \mapsto \cos t\, v_1 + \sin t\, w.
$$
Note that $\alpha(0)=v_1$ and $\alpha'(0)=w$. Now consider the composition $g(t)=f(\alpha(t))$. Its derivative is
$$
\begin{align*}
g'(t) &= (-\sin t\, v_1 + \cos t \, w)^T A (\cos t\, v_1 + \sin t\,w) \\
& \qquad + (\cos t\, v_1 + \sin t\,w)^T A (-\sin t\, v_1 + \cos t \, w) \\
&= 2 (-\sin t\, v_1 + \cos t \, w)^T A (\cos t\, v_1 + \sin t\,w) .
\end{align*}
$$
Here we used the symmetry of $A$. Note that $g$ attains its maximum at $t=0$ since $\alpha(0)=v_1$. Therefore $g'(0)=2 w^T A v_1$ must be zero. Since $w$ is an arbitrary unit vector perpendicular to $v_1$, $Av_1$ must be a multiple of $v_1$, say $Av_1 = \lambda v_1$. Then $\lambda = v_1^T A v_1 = f(v_1)$, so we conclude that $v_1$ is an eigenvector with real eigenvalue $f(v_1)$.
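To see the basis step in action, here is a small worked example of my own (not part of the proof). Take
$$
A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \qquad v(\theta) = (\cos\theta, \sin\theta)^T \in S^1.
$$
Then
$$
f(v(\theta)) = 2\cos^2\theta + 2\sin\theta\cos\theta + 2\sin^2\theta = 2 + \sin 2\theta,
$$
which is maximal at $\theta = \pi/4$, giving $v_1 = \tfrac{1}{\sqrt{2}}(1,1)^T$ with $f(v_1) = 3$; indeed $A v_1 = 3 v_1$.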
Induction step. Suppose we have already found an orthonormal set $\{v_1,\ldots, v_k\}$ of eigenvectors with real eigenvalues. We apply the same argument to $f$ restricted to the subsphere $S^{n-1}\cap \mathrm{span}\{v_1,\ldots,v_k\}^\perp$, obtaining a maximizer $v_{k+1}$ with $w^T A v_{k+1}=0$ for every unit vector $w$ perpendicular to $v_1,\ldots,v_k,v_{k+1}$. Moreover, by symmetry $v_i^T A v_{k+1} = (A v_i)^T v_{k+1} = f(v_i)\, v_i^T v_{k+1} = 0$ for $i \le k$, so $A v_{k+1}$ is again a multiple of $v_{k+1}$, namely $A v_{k+1} = f(v_{k+1})\, v_{k+1}$. $\square$