
I was working on the linked problem, in the setting where we have a finite field extension $L|K$ and a polynomial $f$ irreducible over $K$, and I came up with the following proof: Suppose that $\alpha$ is a root of $f$ in $L$. Then $[K(\alpha):K]=\deg(f)$; further, since $L|K$ is finite, the Tower Law gives $[L:K]=[L:K(\alpha)][K(\alpha):K]$, so $[L:K]$ is divisible by $\deg(f)$. As I saw in that answer, a solution along these lines was downvoted. Why is it wrong, or why is the other solution preferable?
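Spelled out, what this argument actually establishes is the contrapositive (my paraphrase, using only the tower-law equality above): if $\alpha\in L$ is a root of $f$, then $$\deg(f)=[K(\alpha):K]\ \big|\ [L:K],$$ so whenever $\deg(f)\nmid[L:K]$, the polynomial $f$ can have no root in $L$.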

NB: I know that the Tower Law may only be applied in the finite case. Could this be the reason?

stomfaig

1 Answer


It is wrong because it assumes that if $f(x)$ is reducible over $L$, then it must have a root in $L$. This is just not true.

If $f(x)$ has a root in $L$ (and is not linear), then it is reducible over $L$. But a polynomial can be reducible over $L$ while having no roots in $L$. For example, $x^4-2$ is irreducible over $K=\mathbb{Q}$ and has no roots in $L=\mathbb{Q}(\sqrt{2})$, but it factors as $$x^4 -2 = (x^2-\sqrt{2})(x^2+\sqrt{2})$$ over $L$. So it is reducible over $L$, yet has no roots there.
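For what it's worth, the counterexample can be checked mechanically. Here is a minimal sketch using SymPy (my addition, not part of the original argument; it assumes SymPy's `factor` with the `extension` keyword and its `minimal_polynomial` helper):

```python
from sympy import symbols, sqrt, factor, minimal_polynomial, Rational

x = symbols('x')
f = x**4 - 2

# Over K = Q, f is irreducible: factor() returns it unchanged.
print(factor(f))                     # x**4 - 2

# Over L = Q(sqrt(2)), f splits into two quadratic factors.
print(factor(f, extension=sqrt(2)))  # (x**2 - sqrt(2))*(x**2 + sqrt(2))

# Yet f has no root in L: its real roots are +-2**(1/4), whose
# minimal polynomial over Q has degree 4, while [L:Q] = 2.
print(minimal_polynomial(2**Rational(1, 4), x))  # x**4 - 2
```

The last check is the question's tower-law argument run in reverse: $2^{1/4}$ has degree $4$ over $\mathbb{Q}$, and $4$ does not divide $[L:\mathbb{Q}]=2$, so $2^{1/4}$ cannot lie in $L$.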

Arturo Magidin