I have a problem that I have been stuck on for several days now, involving polynomial reduction: Suppose $h(x)$ is a rational function: \begin{equation} h(x) = \frac{n(x)}{d(x)}\,, \end{equation} and let $b(x)$ be a polynomial. I consider the polynomial reduction of $h(x)$ with respect to $(b(x)-\delta)$, where $\delta$ is a constant parameter and the reduction is carried out in the variable $x$, so that we have \begin{equation} r(x,\delta) = h(x) \mod (b(x)-\delta)\,. \end{equation} This remainder defines a function $r$ of two variables. The question is to show that $r$ satisfies \begin{equation} r(x,b(x)) = h(x) \end{equation} for any $h$ and $b$.
Example:
Let \begin{equation} h(x) = \frac{x}{2 x^2-x+1}\,, \qquad b(x) = x^2 - 3x + 5\,. \end{equation} We have \begin{equation} r(x,\delta) = \frac{-5 \delta +2\, \delta\, x-9 x+25}{4 \delta ^2-31\, \delta +71}\,, \end{equation} (where the inverse of $d(x)$ modulo $(b(x)-\delta)$ was used as an intermediate step) and thus \begin{equation} r(x,x^2-3x+5) = \frac{2 \left(x^2-3 x+5\right) x-5 \left(x^2-3 x+5\right)-9 x+25}{4 \left(x^2-3 x+5\right)^2-31 \left(x^2-3 x+5\right)+71} = \frac{x}{2 x^2-x+1} = h(x)\,. \end{equation}
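This example can be checked mechanically. Here is a short sympy sketch using the polynomials quoted above (I am assuming `sympy.rem` is happy with $\delta$ as a symbolic coefficient, which should be unproblematic since the modulus is monic in $x$):

```python
import sympy as sp

x, delta = sp.symbols('x delta')

n, d = x, 2*x**2 - x + 1          # h = n/d
b = x**2 - 3*x + 5

# the remainder r(x, delta) quoted above, split into numerator and denominator
N = 2*delta*x - 5*delta - 9*x + 25
D = 4*delta**2 - 31*delta + 71
r = N / D

# d(x)*r(x,delta) == n(x) modulo (b(x) - delta), i.e. r really is h reduced
print(sp.rem(sp.expand(d*N - n*D), b - delta, x))   # expected: 0

# and substituting delta -> b(x) recovers h(x)
print(sp.simplify(r.subs(delta, b) - n/d))          # expected: 0
```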
Partial Solution:
For the special case $d(x)=1$, so that $h(x)=n(x)$ is a polynomial, I think I have an answer. We can repeatedly take polynomial quotients and remainders with respect to $b(x)$ to obtain \begin{equation} n(x) = r_0(x)+q_{0}(x)b(x) =r_0(x) + (q_1(x)b(x) + r_1(x))b(x) = r_0(x) + r_1(x)b(x) + r_2(x)b(x)^2 + \cdots + r_m(x)b(x)^m\,. \end{equation} This expansion terminates because each quotient satisfies $\deg q_{i+1} = \deg q_i - \deg b$, so after finitely many steps the quotient vanishes; every remainder satisfies $\deg r_i < \deg b$. Taking this expression modulo $(b(x) - \delta)$ is equivalent to the replacement $b(x) \to \delta$ (since $b(x) \equiv \delta \pmod{b(x)-\delta}$ and each $r_i$ has degree less than $\deg b$, the result is already fully reduced), and thus we obtain \begin{equation} r(x,\delta) = n(x) \mod (b(x)-\delta) = r_0(x) + r_1(x)\delta + \cdots + r_m(x)\delta^m\,. \end{equation} Substituting $\delta = b(x)$ then reproduces the first expansion, so $r(x,b(x)) = n(x) = h(x)$ in this case. If $d(x)\neq 1$ one can attempt a very similar argument, but the problem is that the series is not guaranteed to terminate: if \begin{equation} \frac{n(x)}{d(x)}=\frac{q_n(x)}{q_d(x)}b(x) + r_0(x)\,, \end{equation} then in general $q_d(x)$ has the same degree as $d(x)$, so the "series" is infinite.
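The polynomial case can also be checked mechanically. A minimal sympy sketch of the "base-$b(x)$" expansion (the particular $n$ and $b$ below are just made-up examples for illustration):

```python
import sympy as sp

x, delta = sp.symbols('x delta')

def base_b_digits(n, b, x):
    """Coefficients r_i(x) with n = sum_i r_i(x)*b(x)**i and deg r_i < deg b,
    obtained by repeatedly taking quotient and remainder by b(x)."""
    digits = []
    q = sp.expand(n)
    while q != 0:
        q, rem = sp.div(q, b, x)
        digits.append(rem)
    return digits

# made-up example polynomials, only to illustrate the argument
n = x**5 - 4*x**3 + 7*x + 2
b = x**2 - 3*x + 5

digits = base_b_digits(n, b, x)

# replacing b(x)**i by delta**i gives the remainder of n modulo (b(x) - delta)...
r = sum(c*delta**i for i, c in enumerate(digits))
print(sp.expand(r - sp.rem(n, b - delta, x)))    # expected: 0

# ...and substituting delta -> b(x) gives n back
print(sp.expand(r.subs(delta, b) - n))           # expected: 0
```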
The only "intuition" I could think of to tackle the full case is that one can write \begin{equation} h(x) = q(x)(b(x)-\delta) + r(x,\delta)\,, \end{equation} and thus if $\delta = b(x)$ the quotient cancels and thus $r(x,b(x))=h(x)$. However, I don't think this is valid because the above identity only holds if $\gcd(d(x),(b(x)-\delta))=1$ and $\gcd(d(x),0)=d(x)$.
I also tried thinking of arguments involving Bézout's identity but didn't get anywhere.
Does anyone have an idea on how to proceed? Any help would be greatly appreciated. I feel like this is a problem that would be best formulated in terms of polynomial rings and ideals, but I unfortunately have no formal training in that regard.
Thank you very much!
Edit: This question is similar to the one found here, for the case $b(x) = x$. However, I would like to show that this holds for a general (not necessarily linear) polynomial $b(x)$.
$$\begin{align} d,r(y)&= \ \ \ n\ \ +\ \ (y-b)f(y)\ \ \Rightarrow \ \ \ \ \ \ \ d\ r(b) = n\[.3em] d_1(y), r(y) &= n_1(y) + (y-b)f_1(y)\ \Rightarrow\ d_1(b),r(b) = n_1(b)\end{align}\ \Rightarrow\ \frac{n}d = r(b) = \frac{n_1(b)}{d_1(b)}\qquad$$
– Bill Dubuque Aug 11 '24 at 18:49