There is a proof of the Cayley–Hamilton theorem presented in Introduction to Commutative Algebra by Atiyah and MacDonald, near the section on Nakayama's lemma. I am interested in how to turn it into a coordinate-free proof.
I will present the proof below. I am primarily interested in cases where the base ring is a PID or a field, but I am open to generalisations as long as they do not complicate things.
Let $k$ be a field or a PID, let $V$ be a free $k$-module of finite rank $n$ (so an $n$-dimensional vector space when $k$ is a field), and let $A: V \to V$ be a $k$-linear map. We want to show that $A$ is a root of its own characteristic polynomial.
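For concreteness, here is the $2 \times 2$ case of the statement: if $A$ has matrix $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$, its characteristic polynomial is $\chi_A(t) = t^2 - (a+d)t + (ad-bc)$, and the theorem asserts $$ A^2 - (a+d)A + (ad-bc)\operatorname{id}_V = 0.$$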
Pick a basis $(e_1, \ldots, e_n)$ of $V$ over $k$, so that we can talk about $A$ as a matrix in the usual way: $$ A e_j = \sum_i a_{ij} e_i.$$ Let $R \subseteq \operatorname{End}_k(V)$ be the $k$-subalgebra generated by $A$: it is spanned over $k$ by the powers of $A$, so it is a commutative algebra over $k$. Define the $n \times n$ matrix $M$ over $R$ (note the transpose $a_{ji}$ in the second term, which is needed below): $$ M \in \operatorname{Mat}_n(R), \quad M_{ij} = \delta_{ij} A - a_{ji} \operatorname{id}_V.$$ Let $V^n$ have the obvious $R$-module structure $$ r \cdot (v_1, \ldots, v_n) = (rv_1, \ldots, rv_n),$$ and define the special element $$ \underline{e} \in V^n, \quad \underline{e} = (e_1, \ldots, e_n).$$ The proof is now surprisingly easy: $\det(M)$ is the characteristic polynomial of $A^T$ evaluated at $A$, hence the characteristic polynomial of $A$ evaluated at $A$, since a matrix and its transpose have the same characteristic polynomial. We want to show this is zero. On the other hand, a direct calculation (written out below) gives $M \underline{e} = 0$, so multiplying on the left by the adjugate matrix yields $\operatorname{adj}(M)M \underline{e} = \det(M) \underline{e} = 0$; here the adjugate identity $\operatorname{adj}(M)M = \det(M) I_n$ holds over any commutative ring, which is exactly why commutativity of $R$ matters. Since $\det(M) e_i = 0$ for every basis vector $e_i$, it follows that $\det(M) \in \operatorname{End}_k(V)$ is the zero map.
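Spelled out, the calculation that $M \underline{e} = 0$ is a one-liner, and it is exactly where the transpose in the definition of $M$ is used: $$ (M \underline{e})_i = \sum_j M_{ij}\, e_j = A e_i - \sum_j a_{ji}\, e_j = 0,$$ since $A e_i = \sum_j a_{ji} e_j$ is just the defining relation $A e_j = \sum_i a_{ij} e_i$ with the indices relabelled.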
In the proof above we did two coordinatey things: we picked a basis of $V$, and we wrote down $M$ directly as a matrix. I'm wondering if we can get around both, while keeping the essence of the proof the same. I also like how direct this proof is: it does not even go via the ring $R[t]$, for instance.
I'll give my thinking so far. The replacement for $V^n$ should probably be $V \otimes_k V^*$, considered as an $R$-module where $R$ acts on the first component: $r \cdot (u \otimes \xi) = (ru) \otimes \xi$. The replacement for $\underline{e}$ should be the element $\sum_i e_i \otimes e_i^*$, where $(e_i^*)$ is the dual basis; despite the formulation here, this is independent of the basis, since under the canonical isomorphism $V \otimes_k V^* \cong \operatorname{End}_k(V)$ it corresponds to $\operatorname{id}_V$. (The naive candidate $\sum_i e_i \otimes e_i \in V \otimes_k V$ does depend on the basis, which is one reason the dual has to enter.) The replacement for $M$ should be the $R$-linear operator $$ \mathcal{M} \in \operatorname{End}_R(V \otimes_k V^*), \quad \mathcal{M}(u \otimes \xi) = Au \otimes \xi - u \otimes A^{\vee}\xi,$$ where $A^{\vee}\xi = \xi \circ A$ is the dual map; under the isomorphism with $\operatorname{End}_k(V)$ this is just the commutator $T \mapsto AT - TA$, which visibly kills $\operatorname{id}_V$. I can at least see that if I feed $u \otimes e_j^*$ into $\mathcal{M}$, I get a "matrix" resembling $M$ above. However, I feel a bit uncomfortable calling this a matrix, since $V \otimes_k V^*$ does not need to be free as an $R$-module, does it?
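As a sanity check, here is the verification that $\mathcal{M}$ kills the canonical element, using $A^{\vee} e_i^* = \sum_j a_{ij} e_j^*$: $$ \mathcal{M}\Big( \sum_i e_i \otimes e_i^* \Big) = \sum_i A e_i \otimes e_i^* - \sum_{i,j} a_{ij}\, e_i \otimes e_j^* = \sum_{i,j} a_{ji}\, e_j \otimes e_i^* - \sum_{i,j} a_{ij}\, e_i \otimes e_j^* = 0,$$ the two sums being equal after swapping the names of $i$ and $j$.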
Perhaps someone can help me out here?