As Randall said in the comments, the elements of a polynomial ring are better viewed as formal polynomials. Meaning that they are not functions at all! See this old thread for a discussion.
In algebra we, of course, still want to evaluate polynomials. We want to plug in many possible entities in place of that $x$. For this to work well we usually restrict ourselves to commutative rings $R$. The reason is that we want evaluation to play nicely with polynomial arithmetic, in the sense that we expect the rules (here $\alpha$ is whatever we plug in)
- $f(x)+g(x)=h(x)\implies f(\alpha)+g(\alpha)=h(\alpha)$,
- $f(x)g(x)=h(x)\implies f(\alpha)g(\alpha)=h(\alpha)$
to hold for all polynomials $f,g$.
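To make the two rules concrete, here is a small sketch in Python, with polynomials over $\mathbb{Z}$ stored as coefficient lists $[a_0,a_1,\dots,a_n]$. The helper names (`poly_add`, `poly_mul`, `evaluate`) are illustrative choices, not any library's API.

```python
# Polynomials over Z as coefficient lists [a0, a1, ..., an].
# poly_add, poly_mul, evaluate are illustrative helpers, not a library API.

def poly_add(f, g):
    n = max(len(f), len(g))
    f = f + [0] * (n - len(f))
    g = g + [0] * (n - len(g))
    return [a + b for a, b in zip(f, g)]

def poly_mul(f, g):
    h = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            h[i + j] += a * b
    return h

def evaluate(f, alpha):
    # Direct evaluation: sum of a_i * alpha^i.
    return sum(a * alpha**i for i, a in enumerate(f))

f = [1, 2, 3]   # 1 + 2x + 3x^2
g = [4, 5]      # 4 + 5x
alpha = 7

# The two compatibility rules from the text:
assert evaluate(poly_add(f, g), alpha) == evaluate(f, alpha) + evaluate(g, alpha)
assert evaluate(poly_mul(f, g), alpha) == evaluate(f, alpha) * evaluate(g, alpha)
```

Over a commutative ring such as $\mathbb{Z}$ both assertions always hold; the point of the rest of the answer is that the second one can fail when $\alpha$ does not commute with the coefficients.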
The following is kind of an umbrella result on this theme:
Lemma. Assume that $R$ is a commutative ring, $S$ is another ring that contains $R$ as a subring, and $\alpha\in S$ commutes with all the elements of $R$. That is,
$$r\alpha=\alpha r$$ for all $r\in R$. Then the evaluation mapping $ev_\alpha:R[x]\to S$,
$$f(x)=\sum_{i=0}^na_ix^i\mapsto \sum_{i=0}^na_i\alpha^i$$
is a homomorphism of rings. In particular, the rules above work.
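A sketch of the key step, showing exactly where the assumption $r\alpha=\alpha r$ is used: multiplicativity on monomials. On one hand
$$ev_\alpha\big((ax^i)(bx^j)\big)=ev_\alpha\big(ab\,x^{i+j}\big)=ab\,\alpha^{i+j},$$
while on the other hand
$$ev_\alpha(ax^i)\,ev_\alpha(bx^j)=a\alpha^i\,b\alpha^j.$$
These agree precisely because $\alpha^i b=b\alpha^i$ (apply $b\alpha=\alpha b$ repeatedly, $i$ times). Without that hypothesis the factor $b$ cannot be moved past $\alpha^i$, and $ev_\alpha$ need not respect products.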
See this post for a quick explanation of why commutativity is needed, and this answer for an example of what goes spectacularly wrong when we drop it: substituting $x=j$ into the factorization
$$x^2+1=(x-i)(x+i),$$
where $i,j$ are quaternions.
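The quaternion failure can be checked numerically. Below is a sketch with a minimal hand-rolled quaternion product, tuples $(a,b,c,d)$ standing for $a+bi+cj+dk$; the helpers `qmul` and `qadd` are mine, not a library's.

```python
# Quaternions as tuples (a, b, c, d) = a + b*i + c*j + d*k.
# qmul and qadd are illustrative helpers, not a library API.

def qmul(p, q):
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return (a1*a2 - b1*b2 - c1*c2 - d1*d2,
            a1*b2 + b1*a2 + c1*d2 - d1*c2,
            a1*c2 - b1*d2 + c1*a2 + d1*b2,
            a1*d2 + b1*c2 - c1*b2 + d1*a2)

def qadd(p, q):
    return tuple(x + y for x, y in zip(p, q))

def qneg(q):
    return tuple(-x for x in q)

one = (1, 0, 0, 0)
i   = (0, 1, 0, 0)
j   = (0, 0, 1, 0)

# Plugging x = j into the left side x^2 + 1 gives 0 ...
assert qadd(qmul(j, j), one) == (0, 0, 0, 0)

# ... but plugging it into the right side (x - i)(x + i) gives -2k != 0,
# because j does not commute with the coefficient i (ij = k but ji = -k).
assert qmul(qadd(j, qneg(i)), qadd(j, i)) == (0, 0, 0, -2)
```

So $j^2+1=0$ while $(j-i)(j+i)=-2k\ne 0$: evaluation at $j$ fails to be a ring homomorphism on $\mathbb{C}[x]$, exactly because $j$ does not commute with $i$.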
The Lemma means that if $R$ is a commutative ring and $f(x)\in R[x]$, then we can, without much worry, evaluate $f(\alpha)$ when, for example,
- $R$ is an integral domain, and $\alpha$ is an element of its field of fractions $K$.
- Or, if $\alpha$ is an element of an extension field of $K$.
- Or, if $\alpha$ is a square matrix with entries in a ring that contains $R$ in its center (here we need to adopt the usual convention of identifying $R$ with scalar matrices).
- But there is no maximal possible ring covering all the cases. In every case the domain and codomain of the evaluation mapping need to match up at least somewhat.
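As an illustration of the matrix case, here is a sketch using NumPy. Since scalar matrices lie in the center of the matrix ring, a factorization over $\mathbb{Z}$ survives evaluation at any square integer matrix; the polynomial $f(x)=x^2-5x+6=(x-2)(x-3)$ and the matrix $A$ are arbitrary choices of mine.

```python
import numpy as np

# f(x) = x^2 - 5x + 6 = (x - 2)(x - 3) in Z[x]; any square matrix A works,
# because the scalar matrices 2I and 3I commute with A.
A = np.array([[1, 2],
              [3, 4]])
I = np.eye(2, dtype=int)

lhs = A @ A - 5 * A + 6 * I          # f(A), evaluated term by term
rhs = (A - 2 * I) @ (A - 3 * I)      # the factorization, evaluated at A

assert np.array_equal(lhs, rhs)      # evaluation respects the factorization
```

This is the matrix-ring instance of the Lemma: here $S$ is the ring of $2\times 2$ integer matrices, $R=\mathbb{Z}$ sits inside it as the scalar matrices, and $\alpha=A$ commutes with all of $R$.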