Here is a sketch of a non-calculus proof. For this answer, I am assuming you are interested in polynomials over the real or complex numbers and are thinking of them in terms of their values as functions; that is, I am taking a more analysis-oriented view rather than the algebraic view that others such as egreg have already discussed.
Given a set of points $x_0, x_1, \dots, x_n$, consider the Vandermonde matrix with entries
$$M_{ij} = x_i^j, \qquad i, j = 0, 1, \dots, n.$$
That is, the first column is the polynomial $f(x)=1$ evaluated at all the points, the second column is $f(x)=x$ evaluated at all the points, and so forth.
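As a quick illustration (not part of the proof), NumPy's `numpy.vander` builds exactly this matrix when called with `increasing=True`, which matches the convention $M_{ij} = x_i^j$:

```python
import numpy as np

x = np.arange(4)                   # the points 0, 1, 2, 3
M = np.vander(x, increasing=True)  # column j holds x**j
print(M)
# [[ 1  0  0  0]
#  [ 1  1  1  1]
#  [ 1  2  4  8]
#  [ 1  3  9 27]]
```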
If we can find any set of points $x_i$ such that $M$ is invertible, then the polynomials $1, x, x^2, \dots, x^n$ are linearly independent as functions on that set of points, and so they are linearly independent in general.
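To spell out that last step: a linear dependence among the monomials, evaluated at each of the points, is precisely a null vector of $M$,
$$\sum_{j=0}^{n} c_j x^j \equiv 0 \;\Longrightarrow\; \sum_{j=0}^{n} x_i^j c_j = 0 \text{ for each } i \;\Longleftrightarrow\; Mc = 0,$$
so an invertible $M$ forces $c = 0$, i.e. the dependence was trivial.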
Consider evaluating the polynomials on the set of points $0,1,2,3,\dots,n$. The Vandermonde matrix is:
$$\begin{bmatrix}
1 & 0 & 0 & 0 & \dots & 0 \\
1 & 1 & 1 & 1 & \dots & 1 \\
1 & 2 & 4 & 8 & \dots & 2^n \\
1 & 3 & 9 & 27 & \dots & 3^n \\
\vdots & \vdots & \vdots & \vdots & \ddots & \vdots \\
1 & n & n^2 & n^3 & \dots & n^n
\end{bmatrix}.$$
If one performs Gaussian elimination (without pivoting) to compute an LU factorization of this matrix, the process never breaks down (no zero pivot ever appears), and the diagonal entries of the $U$ factor take the factorial values
$$u_{kk} = k!, \qquad k = 0, 1, \dots, n,$$
which can be proven by a straightforward but tedious induction.
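As a sanity check rather than a proof, here is a minimal sketch that carries out the elimination exactly over the rationals and confirms the factorial pattern for a small $n$. The helper `lu_no_pivot` is my own naming, not a library routine; note that SciPy's built-in LU uses partial pivoting, which would reorder the rows and change the diagonal.

```python
from fractions import Fraction
from math import factorial

def lu_no_pivot(M):
    """LU factorization by Gaussian elimination without pivoting.

    Returns (L, U); raises ZeroDivisionError on a zero pivot.
    Exact rational arithmetic avoids any floating-point doubt.
    """
    n = len(M)
    U = [[Fraction(v) for v in row] for row in M]
    L = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(k + 1, n):
            L[i][k] = U[i][k] / U[k][k]          # multiplier for row i
            for j in range(k, n):
                U[i][j] -= L[i][k] * U[k][j]     # eliminate below the pivot
    return L, U

n = 6
# Vandermonde matrix M[i][j] = i**j for i, j = 0, ..., n
M = [[i ** j for j in range(n + 1)] for i in range(n + 1)]
_, U = lu_no_pivot(M)
print([int(U[k][k]) for k in range(n + 1)])      # [1, 1, 2, 6, 24, 120, 720]
assert all(U[k][k] == factorial(k) for k in range(n + 1))
```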
Since the elimination completes with nonzero diagonal entries in $U$, we get $\det M = \prod_{k=0}^{n} k! \neq 0$; hence $M$ is invertible, and so the polynomials are linearly independent.