
I am trying to prove that $\{1, x, x^2, ..., x^n\}$ is a linearly independent set with no hand waving and without using the fact that it is a basis for $P_n$ or the Fundamental Theorem of Algebra.

Suppose $a_0 + a_1x + ... + a_nx^n = 0$ for all $x$. How do I show $a_i = 0$ for $0 \leq i \leq n$?

sobrio35
  • similar http://math.stackexchange.com/questions/944039/question-concerning-the-linear-independence-of-1-x-xm?rq=1 – janmarqz Nov 14 '15 at 18:45
  • This uses the Fundamental Theorem of Algebra, which I excluded in my question. – sobrio35 Nov 14 '15 at 18:54
  • The determinant of the Vandermonde matrix for $n+1$ different values of $x$ is non-zero; so for these values, the vectors are l.i. (a sketch of this argument is spelled out just after these comments). – Aravind Nov 14 '15 at 18:56
  • ok, ok, but what about this: http://math.stackexchange.com/questions/1528822/linear-independence-of-polynomials?rq=1 – janmarqz Nov 14 '15 at 18:58
  • Yeah this was my original, but poorly stated and I think I caused a mess :-( I was basically looking for a simplified, lower level, chapter 2 or 3 kind of answer. Thanks for looking it up though. – sobrio35 Nov 14 '15 at 19:28
  • The fundamental theorem of algebra is NOT that a polynomial of degree $n>0$ has at most $n$ zeroes. It is that a polynomial $P$ of degree $n>0$ with real or complex coefficients satisfies $P(z)=0$ for at least one complex or real $z$. – DanielWainfleet Nov 14 '15 at 19:55
  • Thanks @user254665. You are right, that wasn't using the FTA. – sobrio35 Nov 14 '15 at 20:35
  • @janmarqz Thanks, that first reference actually was a good solution; I was in chapter 2 and that solution was in chapter 4. I was looking for a chapter 2 style solution, but maybe it's hard to explain without using chapter 4 tools. – sobrio35 Nov 14 '15 at 20:37
  • If $x$ is a real or complex variable, then you can take derivatives of both sides to show that the $a_i$ are $0$. But this is also true for general fields where differentiation is not possible. – Paul Sinclair Nov 14 '15 at 20:40
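To make the Vandermonde suggestion from the comments concrete, here is a minimal sketch (the choice of the $n+1$ distinct evaluation points $x_0,\dots,x_n$ is free): if $a_0+a_1x+\dots+a_nx^n=0$ for all $x$, then evaluating at $x_0,\dots,x_n$ gives the linear system
$$\begin{pmatrix} 1 & x_0 & \cdots & x_0^n \\ 1 & x_1 & \cdots & x_1^n \\ \vdots & \vdots & & \vdots \\ 1 & x_n & \cdots & x_n^n \end{pmatrix}\begin{pmatrix} a_0 \\ a_1 \\ \vdots \\ a_n \end{pmatrix}=\begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix}.$$
The coefficient matrix is the Vandermonde matrix, whose determinant $\prod_{0\le i<j\le n}(x_j-x_i)$ is nonzero when the points are distinct, so the only solution is $a_0=\dots=a_n=0$.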

1 Answer


By induction on $n$. The case $n=0$ is immediate: if $a_0=0$ for all $x$, then $a_0=0$. Suppose case $n$ holds, let $a_0,\dots,a_{n+1}$ be constants, and suppose $P(x)=\sum_{j=0}^{n+1}a_jx^j=0$ for all $x$. If $a_{n+1}=0$, this reduces to case $n$. If $a_{n+1}\ne 0$, observe that for $x\ne 0$,
$$P(x)=a_{n+1}x^{n+1}\bigl(1+Q(x)\bigr), \qquad \text{where } Q(x)=\sum_{j=0}^{n} \frac{a_j}{a_{n+1}}\,\frac{1}{x^{n+1-j}}.$$
Let $M=\max\left\{ \frac{|a_j|}{|a_{n+1}|} : 0\leq j\leq n\right\}$. Whenever $|x|>\max\bigl(1,\,2M(n+1)\bigr)$ we have, for each $0\leq j\leq n$,
$$\frac{|a_j|}{|a_{n+1}|\,|x|^{n+1-j}} \leq \frac{M}{|x|}\leq \frac{1}{2(n+1)},$$
which implies $|Q(x)|\leq 1/2$, and hence
$$|P(x)|\geq |a_{n+1}x^{n+1}|\cdot\bigl(1-|Q(x)|\bigr)\geq \frac{|a_{n+1}x^{n+1}|}{2}>0,$$
contradicting $P(x)=0$ for all $x$. So $a_{n+1}=0$ after all, and we are reduced to case $n$ again.
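To see the constants in this estimate at work, here is the smallest nontrivial case, the step from $n=0$ to $n+1=1$ (a worked instance of the bound above, nothing new is assumed): take $P(x)=a_0+a_1x$ with $a_1\ne 0$. Then $Q(x)=\dfrac{a_0}{a_1x}$ and $M=\dfrac{|a_0|}{|a_1|}$, so for $|x|>\max(1,2M)$,
$$|Q(x)|=\frac{M}{|x|}\leq \frac12 \quad\text{and hence}\quad |P(x)|\geq |a_1x|\cdot\Bigl(1-\frac12\Bigr)=\frac{|a_1x|}{2}>0,$$
contradicting $P(x)=0$ for all $x$; so $a_1=0$, and then $a_0=0$ by the base case.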

janmarqz