2

I'm trying to show that $\{1, x, x^2, \dots, x^n\}$ is a linearly independent set (in $P_n$) without being circular, that is, without using either the Fundamental Theorem of Algebra or the fact that this is the standard basis for $P_n$.

I understand that if $a,b,c$ are distinct positive integers and $t_1, t_2$ are arbitrary constants, then the identity $$t_1 x^a + t_2 x^b = x^c$$ is impossible.

Is there a nice way to express this idea in general terms though?

The solutions I have seen so far seem to be of the hand-waving variety, and I'm looking for something a bit clearer.

Thanks!

janmarqz
  • 10,891
Dan
  • 29
  • 2
    The polynomials $a_0+a_1x+\dots+a_nx^n$ and $b_0+b_1x+\dots+b_nx^n$ (where any coefficient can be $0$) are equal if and only if $a_0=b_0,a_1=b_1,\dots,a_n=b_n$ *by definition*. So those polynomials are linearly independent because of the definition of equality between polynomials. There's not much more to prove. – egreg Nov 14 '15 at 17:19
  • 5
    @Dan egreg's comment is insightful and you should definitely consider it and possibly adjust your question. I suspect you are implicitly defining two polynomials to be equal if they take the same values at every $x$, rather than as formal sums of powers of an indeterminate. In this case it's important to note that "equality" depends on the exact domain of $x$: the polynomials $x^5$ and $x$ are certainly different over $\mathbb R$, but their values are indistinguishable over $\mathbb Z/5\mathbb Z$; even in the latter context, it would be non-standard to call $x^5$ and $x$ the same polynomial. – Erick Wong Nov 14 '15 at 17:26

4 Answers

5

In abstract algebra, a polynomial is just the sequence of its coefficients; any sequence (with terms in the base field) is allowed, so long as its terms are zero from some point on. Two sequences are equal if and only if they are equal termwise: by definition, if $(a_n)_{n\ge0}$ and $(b_n)_{n\ge0}$ are the coefficient sequences of two polynomials, the polynomials are equal if and only if $a_n=b_n$ for all $n\ge0$.

The zero polynomial is the one where the sequence has all zero terms.

Addition of sequences is performed term by term, just as for polynomials defined in the intuitive way. So $$ \alpha_0\cdot 1+\alpha_1x+\dots+\alpha_nx^n=0 $$ if and only if $\alpha_0=0,\alpha_1=0,\dots,\alpha_n=0$, by definition. There's nothing more to prove (and it's not circular).
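
(To make the formal viewpoint concrete, here is a minimal Python sketch of this representation; the tuple encoding and helper names are my own illustration, not any standard library.)

```python
# Minimal sketch: a polynomial is stored as its tuple of coefficients
# (a_0, a_1, ..., a_n), with trailing zeros trimmed; the zero polynomial
# is the empty tuple ().
def poly(coeffs):
    coeffs = list(coeffs)
    while coeffs and coeffs[-1] == 0:
        coeffs.pop()
    return tuple(coeffs)

# The combination a_0*1 + a_1*x + ... + a_n*x^n is, by construction,
# nothing but the coefficient tuple itself, so it equals the zero
# polynomial exactly when every a_k is zero.
def combination(alphas):
    return poly(alphas)

print(combination([0, 0, 0, 0]))  # ()         -- the zero polynomial
print(combination([1, 0, -2]))    # (1, 0, -2) -- not the zero polynomial
```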


Erick Wong's comment is really important. Since we want to use polynomials over any ring (or, in particular, fields), it's not possible to identify a polynomial with the function it defines. Over the field $\{0,1,2\}$ with three elements, the function associated to the polynomial $x(x-1)(x-2)$ only assumes the value $0$, but we don't want to consider it as the zero polynomial.
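
(A quick numeric illustration of this last point, assuming the three-element field is modeled as the integers mod $3$:)

```python
# Over the three-element field (integers mod 3), the polynomial
# x*(x-1)*(x-2) = x^3 - 3x^2 + 2x reduces to coefficients (0, 2, 0, 1),
# i.e. 2x + x^3.  Its coefficients are not all zero, yet the function
# it defines vanishes at every field element.
p = 3
coeffs = [0, 2, 0, 1]  # a_0 + a_1*x + a_2*x^2 + a_3*x^3, reduced mod 3
values = [sum(a * pow(x, k, p) for k, a in enumerate(coeffs)) % p for x in range(p)]
print(values)  # [0, 0, 0] -- the zero function, but not the zero polynomial
```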

egreg
  • 244,946
2

Try analysis: with $a<b<c$, differentiating your equality $c$ times gives $$0=c!,$$ which is impossible, so you have your contradiction.
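
(A quick sanity check of this computation, using sympy with the concrete exponents $a=1$, $b=2$, $c=3$, which are my own choice; any distinct exponents with $a<b<c$ behave the same way.)

```python
import sympy as sp

x, t1, t2 = sp.symbols('x t1 t2')
a, b, c = 1, 2, 3                 # hypothetical concrete exponents with a < b < c

lhs = t1 * x**a + t2 * x**b
rhs = x**c

# Differentiating c times: the left side (degree < c) vanishes,
# while the right side becomes c!, so the supposed identity would force 0 = c!.
print(sp.diff(lhs, x, c))   # 0
print(sp.diff(rhs, x, c))   # 6  (= 3!)
```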

Balloon
  • 8,624
1

How about using induction: Having shown that $\{1,x,x^2,\ldots,x^{n-1}\}$ are linearly independent, let's deduce that $\{1,x,x^2,\ldots,x^{n}\}$ are linearly independent. Indeed, suppose that $c_0+c_1x+c_2x^2+\cdots+c_nx^n=0$ for all $x$. Then in particular (take $x=0$) we must have $c_0=0$, and consequently $$ x\cdot(c_1+c_2x+c_3x^2+\cdots+c_nx^{n-1})=0,\qquad\forall x, $$ which implies that $c_1+c_2x+c_3x^2+\cdots+c_nx^{n-1}=0$ for all $x$, and in turn that $c_1=c_2=\cdots=c_n=0$, by the induction hypothesis.

John Dawkins
  • 29,845
  • 1
  • 23
  • 39
  • There is a subtle flaw here: you only get $c_1 + c_2 x + c_3 x^2 + \cdots + c_n x^{n-1} = 0$ for all $x \ne 0$. Moreover, your induction makes critical use of the fact that $x=0$ is in the zero locus of the lower-degree polynomial, which isn't true. So your induction hypothesis needs to be adjusted (which exposes the fact that this argument only really goes through in infinite fields). – Erick Wong Nov 14 '15 at 17:30
  • That first point was a detail left to the OP in my outline of an argument. Of course $c_1+c_2x+\dots+c_nx^{n-1}=0$ first for all $x\not=0$, and then even for $x=0$ by continuity. The OP will correct me if I am wrong, but the wording of the question left me with the impression that he is working with polynomials over the real field (or perhaps the complex). – John Dawkins Nov 14 '15 at 17:46
  • 1
    Got it, thanks. I'm happy to retract my downvote (but I can't do it without some minor edit). – Erick Wong Nov 14 '15 at 17:47
0

Here is a sketch of a non-calculus proof. For this answer, I am assuming you are interested in polynomials over the real or complex numbers and are thinking of them in terms of their values as functions; that is, I am taking a more analysis-oriented view rather than the algebraic view that others such as egreg have already discussed.

Given a set of points $x_i$, consider the Vandermonde matrix with entries $$M_{ij} = x_i^j.$$ That is, the first column is the polynomial $f(x)=1$ evaluated at all the points, the second column is $f(x)=x$ evaluated at all the points, and so forth.
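
(For concreteness, here is one way to build this matrix, assuming numpy and the convention that rows index the points and columns index the monomials; the variable names are my own.)

```python
import numpy as np

# Points at which the monomials 1, x, x^2, ..., x^n are evaluated.
x_points = np.arange(5)                    # 0, 1, 2, 3, 4  (so n = 4 here)

# With increasing=True the columns are x^0, x^1, ..., x^n, so M[i, j] = x_points[i]**j.
M = np.vander(x_points, increasing=True)
print(M)
```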

If we can find any set of $n+1$ points $x_i$ such that $M$ is invertible, then the polynomials are linearly independent on that set of points: a linear combination with coefficient vector $c$ that vanishes at every $x_i$ gives $Mc=0$, which forces $c=0$. In particular, the polynomials are linearly independent as functions.

Consider evaluating the polynomials on the set of points $0,1,2,3,\dots,n$. The Vandermonde matrix is:

$$\begin{bmatrix} 1 & 0 & 0 & 0 & \dots & 0 \\ 1 & 1 & 1 & 1 & \dots & 1 \\ 1 & 2 & 4 & 8 & \dots & 2^n \\ 1 & 3 & 9 & 27 & \dots & 3^n \\ \vdots & \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & n & n^2 & n^3 & \dots & n^n \end{bmatrix}.$$

If one performs Gaussian elimination (without pivoting) to compute an LU factorization of this matrix, the process never breaks down, and the diagonal entries of the $U$ factor take the factorial values $$u_{kk}=k! \qquad (k=0,1,\dots,n),$$ which can be proven by a straightforward but tedious induction.
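
(Here is a short numerical check of this claim; it implements the unpivoted elimination directly, since library LU routines typically pivot. This is only a sanity check under the stated conventions, not the induction proof itself.)

```python
import numpy as np
from math import factorial

n = 6
M = np.vander(np.arange(n + 1), increasing=True).astype(float)  # points 0, 1, ..., n

# Unpivoted Gaussian elimination (Doolittle): reduce M to the upper-triangular
# factor U in place; no row swaps are ever needed for these points.
U = M.copy()
for k in range(n):
    for i in range(k + 1, n + 1):
        U[i, k:] -= (U[i, k] / U[k, k]) * U[k, k:]

print(np.diag(U))                             # [1. 1. 2. 6. 24. 120. 720.] = 0!, 1!, ..., n!
print([factorial(k) for k in range(n + 1)])   # matches
```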

Since $M$ has an LU factorization in which every diagonal entry of $U$ is nonzero, it is invertible, and so the polynomials are linearly independent.

Nick Alger
  • 19,977