I'm studying Rotman's Advanced Modern Algebra, and I have trouble understanding Example 3.118 on page 187. It states: "Consider the evaluation map $\varphi:\mathbb{R}[x]\to\mathbb{C}$ given by $\varphi:f(x)\mapsto f(i)$." The book then proves surjectivity, and then that $(x^2+1)\subseteq\ker\varphi$. These steps I understand. But then it states: "For the reverse inclusion, take $g(x)\in\ker\varphi$. Now $i$ is a root of $g(x)$, and so $\gcd(g,x^2+1)\neq 1$ in $\mathbb{C}[x]$; therefore, $\gcd(g,x^2+1)\neq 1$ in $\mathbb{R}[x]$."
I get that $\gcd(g,x^2+1)\neq 1$ in $\mathbb{C}[x]$: since $i$ is a root of both $g(x)$ and $x^2+1$, both are divisible by $x-i$. But why does this carry over to $\mathbb{R}[x]$, where the factor $x-i$ is not available?
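To make the situation concrete (this is my own illustration, not part of Rotman's example), here is a quick SymPy check with the sample polynomial $g(x)=x^3+x$, which lies in $\ker\varphi$ since $i^3+i=0$. The gcd with $x^2+1$ is already nontrivial when computed with real (in fact rational) coefficients:

```python
import sympy as sp

x = sp.symbols('x')

# Sample element of ker(phi): g(x) = x^3 + x = x*(x^2 + 1),
# so g(i) = i^3 + i = -i + i = 0.
g = x**3 + x
m = x**2 + 1

# Over C[x], g and m share the common factor x - i.
# The gcd below is computed with rational/real coefficients,
# yet it still comes out nontrivial.
print(sp.gcd(g, m))  # x**2 + 1
```

Of course, this only shows the phenomenon for one choice of $g$; the question is why it must hold for every $g\in\ker\varphi$.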
Here is a picture of the whole example, with the part I have trouble with highlighted in yellow.