I am convinced that, more likely than not, it is impossible to isolate the eigenvectors without referencing the eigenvalues $\lambda$, the zeroes of the characteristic polynomial, or any immediately recognisable derivative thereof.
Nevertheless, my instructor sent me an article that comes as close to a solution as I expect is possible: it does not require the eigenvalues from the start, and it is a single process that returns both eigenvalues and eigenvectors, rather than an eigenvalue-finding process glued to an eigenvector-finding process and called “one process.”
The algorithm is described by William A. McWorter, Jr., and Leroy F. Meyers in their paper “Computing Eigenvalues and Eigenvectors Without Determinants” in Mathematics Magazine, vol. 71, no. 1 (Feb. 1998):
Let $\newcommand{\u}{\vec{u}} \u$ be any nonzero vector in $\newcommand{\F}{\Bbb{F}} \F^n$ [the same field that matrix $A$ comes from]. Since $\F^n$ has finite dimension $n$, the $n+1$ vectors $\u, A\u, A^2\u, \dots, A^n\u$ are linearly dependent. Let $k$ be the smallest positive integer such that $a_0\u+a_1A\u+a_2A^2\u+\cdots+a_kA^k\u=\vec0$, for some $a_0,\dots,a_k$ in $\F$ with $a_k\neq0$. Algebraic closure ensures that the polynomial $a_0+a_1t+a_2t^2+\cdots+a_kt^k$ in $\F[t]$ is factorable as $(t-\lambda)\,Q(t)$ for some $\lambda$ in $\F$ and some polynomial $Q(t)$ in $\F[t]$. Hence $(A-\lambda I)\,Q(A)\,\u=\vec0$ [this is part of the proof, not the process]. The minimality of $k$ implies that the vector $Q(A)\,\u$ is nonzero and so is an eigenvector. . . .
If you can eyeball the factorisation $(t-\lambda)\,Q(t)$, then at least you don’t have to start with the eigenvalues.
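For concreteness, here is a small worked instance of the process (the matrix and starting vector are my own choices, not taken from the paper). Take
$$A=\begin{pmatrix}2&1\\1&2\end{pmatrix}\quad\text{over }\Bbb{C},\qquad \vec{u}=\begin{pmatrix}1\\0\end{pmatrix},\qquad\text{so}\qquad A\vec{u}=\begin{pmatrix}2\\1\end{pmatrix},\quad A^2\vec{u}=\begin{pmatrix}5\\4\end{pmatrix}.$$
Since $\vec{u}$ and $A\vec{u}$ are linearly independent, the smallest $k$ is $2$, and solving $a_0\vec{u}+a_1A\vec{u}+a_2A^2\vec{u}=\vec0$ gives (up to scaling) $a_0=3$, $a_1=-4$, $a_2=1$. The polynomial $3-4t+t^2$ factors by inspection as $(t-1)(t-3)$, so taking $\lambda=3$ and $Q(t)=t-1$ yields
$$Q(A)\,\vec{u}=A\vec{u}-\vec{u}=\begin{pmatrix}1\\1\end{pmatrix},$$
which is indeed an eigenvector for the eigenvalue $3$; the other factor, $\lambda=1$ with $Q(t)=t-3$, gives the eigenvector $A\vec{u}-3\vec{u}=\begin{pmatrix}-1\\1\end{pmatrix}$. Determinants and the characteristic polynomial never appear: the eigenvalues come out as roots of the dependence polynomial attached to $\vec{u}$.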