This is a good complex analysis problem, even though everything is stated in terms of symmetric real matrices. Complex analysis gives you a good way to get hold of spectral objects by using contour integrals. For example, the symmetric matrix $A$ has an orthonormal basis of eigenvectors $\{ e_{j}\}_{j=1}^{n}$ with corresponding eigenvalues $\sigma(A)=\{ \lambda_{j}\}$, assumed here to be distinct, which can be used to write the resolvent
$$
(\lambda I -A)^{-1}x = \sum_{j=1}^{n}\frac{1}{\lambda-\lambda_{j}}(x,e_{j})e_{j}.
$$
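For completeness, this formula can be checked directly: expanding $x=\sum_{j}(x,e_{j})e_{j}$ and using $Ae_{j}=\lambda_{j}e_{j}$,
$$
(\lambda I - A)\sum_{j=1}^{n}\frac{(x,e_{j})}{\lambda-\lambda_{j}}e_{j}=\sum_{j=1}^{n}\frac{(x,e_{j})}{\lambda-\lambda_{j}}(\lambda-\lambda_{j})e_{j}=\sum_{j=1}^{n}(x,e_{j})e_{j}=x.
$$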
The resolvent has a simple pole at each eigenvalue, and the residue at $\lambda_{j}$ is the orthogonal projection $R_{j}$ onto the eigenspace for $\lambda_{j}$:
$$
R_{j}x = (x,e_{j})e_{j}.
$$
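By the residue theorem, $R_{j}$ can also be written as a contour integral over a small positively-oriented circle $C_{j}$ enclosing $\lambda_{j}$ and no other eigenvalue; this is the representation that gets perturbed below:
$$
R_{j} = \frac{1}{2\pi i}\oint_{C_{j}}(\lambda I - A)^{-1}\,d\lambda .
$$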
Notice that, working in the norm of $\mathbb{C}^{n}$,
$$
\|(\lambda I -A)^{-1}x\|^{2}= \sum_{j=1}^{n}\frac{|(x,e_{j})|^{2}}{|\lambda-\lambda_{j}|^{2}}\le \frac{1}{\mbox{dist}(\lambda,\sigma(A))^{2}}\sum_{j=1}^{n}|(x,e_{j})|^{2} = \frac{\|x\|^{2}}{\mbox{dist}(\lambda,\sigma(A))^{2}}.
$$
That is,
$$
\|(\lambda I -A)^{-1}\| \le 1/\mbox{dist}(\lambda,\sigma(A)).
$$
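If you want to see this numerically, here is a quick sanity check (a numpy sketch; the random matrix and the point $\lambda$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                         # random real symmetric matrix
eigs = np.linalg.eigvalsh(A)

lam = 1.5 + 0.7j                          # an arbitrary point off the spectrum
resolvent = np.linalg.inv(lam * np.eye(n) - A)
lhs = np.linalg.norm(resolvent, 2)        # operator (spectral) norm of the resolvent
rhs = 1.0 / np.min(np.abs(lam - eigs))    # 1 / dist(lam, sigma(A))
print(lhs, rhs)                           # these agree: for symmetric A the bound is sharp
```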
If you perturb $A$ by $\epsilon V$, where $V$ is also symmetric and real, and where $\epsilon$ is real, then, for any fixed $\lambda\notin\sigma(A)$, the matrix $(\lambda I -A-\epsilon V)$ remains invertible provided $\epsilon$ is small enough. In particular,
$$
(\lambda I - A -\epsilon V)^{-1}=(\lambda I -A)^{-1}(I-\epsilon V(\lambda I-A)^{-1})^{-1} = (\lambda I -A)^{-1}\sum_{k=0}^{\infty}\epsilon^{k}(V(\lambda I -A)^{-1})^{k}.
$$
The above converges in matrix operator norm to the expected inverse if
$$
|\epsilon|\|V\|/\mbox{dist}(\lambda,\sigma(A)) < 1\;\;\;\mbox{ or, equivalently, }\;\;\; |\epsilon| < \|V\|^{-1}\mbox{dist}(\lambda,\sigma(A)).
$$
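Here is a short numerical check of the Neumann series (a numpy sketch; the truncation at 30 terms and the choice of $\epsilon$ at half the radius of convergence are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
V = rng.standard_normal((n, n)); V = (V + V.T) / 2

lam = 2.0 + 1.0j                               # fixed point off sigma(A)
dist = np.min(np.abs(lam - np.linalg.eigvalsh(A)))
eps = 0.5 * dist / np.linalg.norm(V, 2)        # halfway inside the radius of convergence

R = np.linalg.inv(lam * np.eye(n) - A)         # unperturbed resolvent
series = sum(eps ** k * R @ np.linalg.matrix_power(V @ R, k) for k in range(30))
direct = np.linalg.inv(lam * np.eye(n) - A - eps * V)
print(np.linalg.norm(series - direct, 2))      # small, on the order of the truncated tail
```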
By drawing a fixed system of circular contours $C_{j}$, one around each $\lambda_{j}$, you can make $\epsilon$ small enough that all of the singularities of $(\lambda I - A-\epsilon V)^{-1}$ lie inside these contours, and that the integrals of $(\lambda I -A-\epsilon V)^{-1}$ over the $C_{j}$ vary continuously with $\epsilon$ in a small range of $0$. So the eigenvalues of $A+\epsilon V$ remain inside these circles for small $\epsilon$. Because the eigenspaces of $A$ all have dimension 1, the same is true of $A+\epsilon V$ for small enough $\epsilon$; if some eigenspace had dimension 2 or more, the corresponding eigenvalue could bifurcate under the perturbation.
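As a numerical sanity check of the containment (a numpy sketch; for symmetric matrices each sorted eigenvalue moves by at most $\epsilon\|V\|$, so for small $\epsilon$ it stays inside a fixed circle around the unperturbed eigenvalue):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
V = rng.standard_normal((n, n)); V = (V + V.T) / 2

eps = 1e-3
shift = np.abs(np.linalg.eigvalsh(A + eps * V) - np.linalg.eigvalsh(A))
print(shift.max(), eps * np.linalg.norm(V, 2))   # max shift is below eps*||V||
```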
You can argue that the eigenspaces stay one-dimensional for small $\epsilon$ by taking the trace of the orthogonal projection matrices $P_{j}^{\epsilon}$ defined through contour integrals: the trace is the dimension of the range of $P_{j}^{\epsilon}$, so it is an integer; it equals 1 at $\epsilon=0$; and it must vary continuously for small $\epsilon$ because the following contour integral does:
$$
P_{j}^{\epsilon} =\frac{1}{2\pi i}\oint_{C_{j}}(\lambda I - A-\epsilon V)^{-1}d\lambda .
$$
Note: $P_{j}^{\epsilon}$ is the projection onto the span of the eigenvectors of $A+\epsilon V$ corresponding to the eigenvalues enclosed by $C_{j}$. Also, $P_{j}^{\epsilon}e_{j}$ is non-zero for $\epsilon$ near $0$, and must be an eigenvector of $A+\epsilon V$ with the unique eigenvalue enclosed by $C_{j}$. I'm not sure what normalization does to the eigenvector function $\epsilon\mapsto P_{j}^{\epsilon}e_{j}$, however.
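Here is a rough numerical illustration of this contour-integral construction (a numpy sketch; the contour is discretized by the trapezoid rule, and the circle radius and the size of $\epsilon$ are arbitrary small choices):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
V = rng.standard_normal((n, n)); V = (V + V.T) / 2

eigs, U = np.linalg.eigh(A)
j = 2
e_j, lam_j = U[:, j], eigs[j]
r = 0.4 * np.min(np.abs(np.delete(eigs, j) - lam_j))   # circle C_j isolating lam_j
eps = 0.01 * r / np.linalg.norm(V, 2)                  # keeps the perturbed eigenvalue inside C_j

# trapezoid rule for P_j^eps = (1/(2*pi*i)) * integral over C_j of the perturbed resolvent
M = A + eps * V
N = 400
P = np.zeros((n, n), dtype=complex)
for t in 2 * np.pi * np.arange(N) / N:
    lam = lam_j + r * np.exp(1j * t)                   # point on C_j
    dlam = 1j * r * np.exp(1j * t) * (2 * np.pi / N)
    P += np.linalg.inv(lam * np.eye(n) - M) * (dlam / (2j * np.pi))

print(np.trace(P).real)                          # ~1: the enclosed eigenvalue stays simple
v = (P @ e_j).real                               # P_j^eps e_j, approximately an eigenvector of M
v /= np.linalg.norm(v)
print(np.linalg.norm(M @ v - (v @ M @ v) * v))   # small eigen-residual
```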
EXPANSION: I will add the explicit expansion offered by the above. The expansion of the resolvent $(\lambda I - A-\epsilon V)^{-1}$, which is valid for $|\epsilon|\|V\| < \mbox{dist}(\lambda,\sigma(A))$, is, through first order in $\epsilon$,
$$
(\lambda I - A)^{-1} + \epsilon(\lambda I - A)^{-1}V(\lambda I - A)^{-1}
$$
When applied to a vector $x$, this can be written as
$$
(\lambda I - A)^{-1}x + \epsilon\sum_{k=1}^{n}\sum_{m=1}^{n} \frac{(x,e_{m})(Ve_{m},e_{k})\,e_{k}}{(\lambda-\lambda_{m})(\lambda-\lambda_{k})}.
$$
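To see where the double sum comes from, apply the eigenvector expansion of the resolvent twice: first
$$
(\lambda I - A)^{-1}V(\lambda I - A)^{-1}x=(\lambda I - A)^{-1}\sum_{m=1}^{n}\frac{(x,e_{m})}{\lambda-\lambda_{m}}Ve_{m},
$$
and then expand each $Ve_{m}=\sum_{k=1}^{n}(Ve_{m},e_{k})e_{k}$ before applying $(\lambda I - A)^{-1}$ once more.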
Now integrate $\frac{1}{2\pi i}\oint_{C_{j}}(\cdot)\,d\lambda$ over the circle $C_{j}$ that encloses only $\lambda_{j}$ (and, for small $\epsilon$, also the perturbed eigenvalue $\lambda_{j}'$). In the double sum, the only terms that are singular inside $C_{j}$ are those with $m=j$ or $k=j$. The term with $m=k=j$ contains $(\lambda-\lambda_{j})^{-2}$, which integrates to 0 because its residue is 0. The terms with exactly one index equal to $j$ contain $(\lambda-\lambda_{j})^{-1}(\lambda-\lambda_{k})^{-1}$ with $k\ne j$, and these integrate to $(\lambda_{j}-\lambda_{k})^{-1}$ because $(\lambda-\lambda_{k})^{-1}$ is holomorphic inside the circle around $\lambda_{j}$. The zeroth-order term $(\lambda I-A)^{-1}$ integrates to the projection $P_{j}x=(x,e_{j})e_{j}$. Collecting both groups of cross terms, one obtains, to first order in $\epsilon$,
$$
P_{j}^{\epsilon}x = (x,e_{j})e_{j} +\epsilon\sum_{k=1,\,k\ne j}^{n}\frac{(x,e_{j})(Ve_{j},e_{k})\,e_{k}+(x,e_{k})(Ve_{k},e_{j})\,e_{j}}{\lambda_{j}-\lambda_{k}}.
$$
A good first-order expansion of the perturbed eigenvector is, therefore, $P_{j}^{\epsilon}e_{j}$; taking $x=e_{j}$, the second group of terms drops out because $(e_{j},e_{k})=0$ for $k\ne j$, leaving
$$
e_{j}' = e_{j} +\epsilon \sum_{k=1,(k\ne j)}^{n} \frac{(Ve_j,e_k)}{\lambda_j-\lambda_k}e_k.
$$
This needs to be normalized in length. Afterwards, the perturbed eigenvalue is recovered, up to higher-order corrections, from the Rayleigh quotient $\lambda_j' = ((A+\epsilon V)e_{j}',e_{j}')$.
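Finally, a numerical sanity check of these first-order formulas against a direct diagonalization (a numpy sketch; the random matrices, the index $j$, and $\epsilon=10^{-4}$ are arbitrary, and the eigenvalues of $A$ are assumed well separated relative to $\epsilon\|V\|$):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 6
A = rng.standard_normal((n, n)); A = (A + A.T) / 2
V = rng.standard_normal((n, n)); V = (V + V.T) / 2

eigs, U = np.linalg.eigh(A)                 # columns of U are the e_k
j, eps = 3, 1e-4

# first-order eigenvector: e_j + eps * sum_{k != j} (V e_j, e_k)/(lam_j - lam_k) * e_k
c = U.T @ (V @ U[:, j])                     # c[k] = (V e_j, e_k)
v = U[:, j] + eps * sum(c[k] / (eigs[j] - eigs[k]) * U[:, k] for k in range(n) if k != j)
v /= np.linalg.norm(v)                      # normalize in length
lam_approx = v @ (A + eps * V) @ v          # Rayleigh quotient

exact_eigs, exact_U = np.linalg.eigh(A + eps * V)
print(abs(lam_approx - exact_eigs[j]))                    # higher order in eps
print(min(np.linalg.norm(v - exact_U[:, j]),
          np.linalg.norm(v + exact_U[:, j])))             # O(eps^2); eigenvector sign is arbitrary
```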