
From Serge Lang's Linear Algebra:

Let $x_1$, $x_2$, $x_3$ be numbers. Show that:

$$\begin{vmatrix} 1 & x_1 & x_1^2\\ 1 &x_2 & x_2^2\\ 1 & x_3 & x_3^2 \end{vmatrix}=(x_2-x_1)(x_3-x_1)(x_3-x_2)$$

The determinant above seems to be a specific case of the Vandermonde determinant:

$$ \begin{vmatrix} 1 & x_1 & \cdots & x_1^{n-1}\\ 1 & x_2 & \cdots & x_2^{n-1}\\ \vdots & \vdots & \ddots & \vdots\\ 1 & x_n & \cdots & x_n^{n-1} \end{vmatrix}=\prod_{i, j}(x_i - x_j), \quad \forall\, (1 \leq i \leq n) \land (1 \leq j \leq n) $$
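(As a quick sanity check of the $3\times3$ identity, here is a small SymPy sketch I ran; the symbol names are mine, and it is only a symbolic verification, not a proof:)

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')

# The 3x3 matrix from Lang's exercise
V = sp.Matrix([[1, x1, x1**2],
               [1, x2, x2**2],
               [1, x3, x3**2]])

claimed = (x2 - x1)*(x3 - x1)*(x3 - x2)

# Prints 0, so the determinant and the claimed product agree identically
print(sp.expand(V.det() - claimed))
```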


I'm trying to prove the specific case and then generalize it to arbitrary Vandermonde matrices.

My incomplete "proof"

Since the determinant is a multilinear alternating function, it can be seen that adding a scalar multiple of one column (resp. row) to another column (resp. row) does not change its value (I omitted the proof to avoid too much text).
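(As a small SymPy check of that invariance, and of the related scaling rule, here is a sketch; the $2\times2$ example and symbol names are mine:)

```python
import sympy as sp

a, b, c, d, t = sp.symbols('a b c d t')
M = sp.Matrix([[a, b],
               [c, d]])

# Adding t * (column 0) to column 1 leaves the determinant unchanged
M_add = M.copy()
M_add[:, 1] = M[:, 1] + t*M[:, 0]
print(sp.simplify(M_add.det() - M.det()))      # prints 0

# Scaling column 0 by t multiplies the determinant by t
M_scale = M.copy()
M_scale[:, 0] = t*M[:, 0]
print(sp.simplify(M_scale.det() - t*M.det()))  # prints 0
```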

Thus, considering that $x_1$ is a scalar, we can multiply each column but the last one of our specific Vandermonde matrix by $x_1$ and then, working from right to left, subtract the $(n-1)$-th column from the $n$-th:

$$\begin{vmatrix} 1 & x_1 & x_1^2\\ 1 &x_2 & x_2^2\\ 1 & x_3 & x_3^2 \end{vmatrix}=\begin{vmatrix} x_1 & 0 & 0 \\ x_1 & x_2 - x_1 & x^{2}_2 - x^{2}_1\\ x_1 & x_3 - x_1 & x^{2}_3 - x^{2}_1 \end{vmatrix}$$

Then, using the expansion rule along the first row (since all its elements except $x_1$ are zero):

$$... =x_1\begin{vmatrix} x_2 - x_1 & x^{2}_2 - x^{2}_1\\ x_3 - x_1 & x^{2}_3 - x^{2}_1 \end{vmatrix}=(x_1x_2-x^2_1)(x^2_{3}-x^2_1)-(x^{2}_2x_1 - x^{3}_1)(x_3x_1 - x^2_1)$$

The first expansion seems interesting because it contains $x_2 - x_1$ and $x_3 - x_1$ (which are the first two factors of the specific Vandermonde determinant), but further expansion does not give satisfying results.

Question:

Is this a good, simple start for inductively "proving" the relation between the Vandermonde determinant and its factors? If so, what does it lack to show the complete result? Did I make a mistake during the evaluation?

Thank you!

ShellRox
  • The formulae do not reflect the operations you describe. – Bernard Aug 17 '19 at 19:24
  • @Bernard I apologize, I can't understand your statement properly, do you mean that I incorrectly applied expansion formula along the first row? – ShellRox Aug 17 '19 at 19:26
  • 2
    No. I mean the operations you describe do not lead to what is in the right matrix (first equation). – Bernard Aug 17 '19 at 19:29
  • I think I made a mistake in description, I should've said multiply all columns but the final one with $x_1$, and then other operations described would lead to the right matrix, correct? – ShellRox Aug 17 '19 at 19:34
  • 1
    I don't think so. What you should do is subtract $x_1\times$ the first column from the second column, and subtract $x_1^2\times $ the first column from the third. – Bernard Aug 17 '19 at 19:41
  • @ShellRox Warning. the formula works whenever $1\leq i<j \leq n$ in the product operator. – DIEGO R. May 04 '22 at 01:32

3 Answers


"Since determinant is a multilinear alternating function, it can be seen that adding a scalar multiple of one column (resp. row) to other column (resp. row) does not change the value (I omitted the proof to avoid too much text) " is right. But $$ \begin{vmatrix} 1 & x_1 & x_1^2\\ 1 &x_2 & x_2^2\\ 1 & x_3 & x_3^2 \end{vmatrix} \neq \begin{vmatrix} x_1 & 0 & 0 \\ x_1 & x_2 - x_1 & x^{2}_2 - x^{2}_1\\ x_1 & x_3 - x_1 & x^{2}_3 - x^{2}_1 \end{vmatrix} \neq (x_1x_2-x^2_1)(x^2_{3}-x^2_1)-(x^{2}_2x_1 - x^{3}_1)(x_3x_1 - x^2_1) $$ Remember that when you multiply a row or a column by $\lambda$, the determinant is multiplied by $\lambda$. And be careful when distributing $x_1$. We have \begin{align} \begin{vmatrix} 1 & x_1 & x_1^2\\ 1 &x_2 & x_2^2\\ 1 & x_3 & x_3^2 \end{vmatrix} &= x_1 \begin{vmatrix} x_1 & 0 & 0 \\ x_1 & x_2 - x_1 & x^{2}_2 - x^{2}_1\\ x_1 & x_3 - x_1 & x^{2}_3 - x^{2}_1 \end{vmatrix}\\ &= x_1^2 \begin{vmatrix} x_2 - x_1 & x^{2}_2 - x^{2}_1\\ x_3 - x_1 & x^{2}_3 - x^{2}_1 \end{vmatrix}\\ &= x_1^2((x_2 - x_1)(x^{2}_3 - x^{2}_1) - (x^{2}_2 - x^{2}_1)(x_3 - x_1))\\ &\neq (x_1x_2-x^2_1)(x^2_{3}-x^2_1)-(x^{2}_2x_1 - x^{3}_1)(x_3x_1 - x^2_1) \end{align} Keep in mind that we are trying to have the simplest possible factors. Here, you can do \begin{align} \begin{vmatrix} 1 & x_1 & x_1^2\\ 1 &x_2 & x_2^2\\ 1 & x_3 & x_3^2 \end{vmatrix}&=_{L_3 \leftarrow L_3 - L_2 \text{ and } L_2 \leftarrow L_2 - L_1} \begin{vmatrix} 1 & x_1 & x_1^2\\ 0 &x_2 -x_1& (x_2 - x_1)(x_2+x_1)\\ 0 & x_3 - x_2 & (x_3 - x_2)(x_3+x_2) \end{vmatrix}\\ &=_{L_3 \leftarrow L_3 - L_2} (x_2 - x_1)(x_3-x_2) \begin{vmatrix} 1 & x_1 & x_1^2\\ 0 &1& x_2 + x_1\\ 0 & 0 & x_3 -x_1 \end{vmatrix}\\ &=(x_2 - x_1)(x_3-x_2)(x_3-x_1) \end{align}

Monadologie

The general proof is not difficult.

From the definition of a determinant (a sum of products), the expansion must be a polynomial in $x_1,x_2,\dots,x_n$ of total degree $0+1+2+\cdots+(n-1)=\dfrac{(n-1)n}2$, and the coefficient of every term is $\pm1$.

On the other hand, the determinant vanishes whenever $x_j=x_k$ (two rows are then equal), so the polynomial must be a multiple of

$$(x_1-x_2)(x_1-x_3)(x_1-x_4)\cdots(x_1-x_n)\\ (x_2-x_3)(x_2-x_4)\cdots(x_2-x_n)\\ (x_3-x_4)\cdots(x_3-x_n)\\ \cdots\\ (x_{n-1}-x_n)$$ ($\dfrac{(n-1)n}2$ factors).

Hence the determinant has no choice but to be $\pm$ this product; the sign can be pinned down by comparing the coefficient of the diagonal monomial $x_2x_3^2\cdots x_n^{n-1}$ on both sides.


For the $3\times3$ case,

$$\begin{vmatrix} 1 & x_1 & x_1^2\\ 1 &x_2 & x_2^2\\ 1 & x_3 & x_3^2 \end{vmatrix}= \begin{vmatrix} 1 & x_1 & x_1^2\\ 0 &x_2-x_1 & x_2^2-x_1^2\\ 0 & x_3-x_1 & x_3^2-x_1^2 \end{vmatrix}=\begin{vmatrix} x_2-x_1 & x_2^2-x_1^2\\ x_3-x_1 & x_3^2-x_1^2 \end{vmatrix}=(x_2-x_1)(x_3-x_1)\begin{vmatrix} 1&x_2+x_1 \\1& x_3+x_1 \end{vmatrix}=(x_2-x_1)(x_3-x_1)(x_3-x_2).$$
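(For what it's worth, a SymPy sketch of the degree-and-vanishing argument in the $4\times4$ case; the check and the $i<j$ product convention are mine:)

```python
import sympy as sp

x = sp.symbols('x1:5')  # (x1, x2, x3, x4)

# 4x4 Vandermonde matrix: entry (i, j) is x_{i+1}**j
V = sp.Matrix(4, 4, lambda i, j: x[i]**j)
det = sp.expand(V.det())

p = sp.Poly(det, *x)
print(p.total_degree())       # prints 6 = 0+1+2+3
print(set(p.coeffs()))        # every coefficient is +1 or -1

# The product of the 6 differences x_j - x_i with i < j
prod = sp.Mul(*[x[j] - x[i] for i in range(4) for j in range(i + 1, 4)])
print(sp.expand(det - prod))  # prints 0
```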

  • Note: This requires some knowledge about the polynomial ring in $n$ variables -- either that it is a UFD, or at least that common multiples of distinct $x_i-x_j$'s (with $i<j$) are multiples of their product. – darij grinberg Aug 17 '19 at 20:25
  • Thank you for the response. I don't possess sufficient knowledge of the algebraic structures that @darijgrinberg mentioned (thank you very much for the prerequisites), but how is the alternating property of the determinant related to the factors of the polynomial? Perhaps due to the consequence that Darij mentioned, or something else? (I apologize if I misunderstood this.) – ShellRox Aug 17 '19 at 21:36
  • I'm sorry, I didn't pay attention to the connection between the alternating property and the vanishing of the polynomial. Now it's understandable, thank you! – ShellRox Aug 18 '19 at 07:37

Using induction on $n$ is maybe the most convenient way. Let's prove that if the result is true for $n - 1$, then it is true for $n$. So, consider the Vandermonde matrix
$$ V =\begin{pmatrix} 1 & x_{1} & x_{1}^{2} & \cdots & x_{1}^{n-1} \\ 1 & x_{2} & x_{2}^{2} & \cdots & x_{2}^{n-1} \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_{n -1} & x_{n - 1}^{2} & \cdots & x_{n-1}^{n-1} \\ 1 & x_{n} & x_{n}^{2} & \cdots & x_{n}^{n-1} \end{pmatrix} $$

Now consider $\det(V)$ as a polynomial of degree $n-1$ in the variable $x_{n}$. Expanding $\det(V)$ in cofactors along the $n$-th row (the row containing the powers of $x_n$), we get
$$ \det(V) = a_{n-1} x_{n}^{n-1} + a_{n-2} x_{n}^{n-2} + \cdots + a_{0}, \qquad a_{n-1} = \det \begin{pmatrix} 1 & x_{1} & x_{1}^{2} & \cdots & x_{1}^{n-2} \\ 1 & x_{2} & x_{2}^{2} & \cdots & x_{2}^{n-2} \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_{n -1} & x_{n - 1}^{2} & \cdots & x_{n-1}^{n-2} \end{pmatrix} $$
(note that $a_{n-1}$ is the leading coefficient of the polynomial $\det(V)$: it is the cofactor of the entry $x_n^{n-1}$, and it does not involve $x_n$).

In addition, $x_{1}, x_{2}, \dots, x_{n-1}$ are roots of $\det(V)$ (substituting $x_n = x_i$ makes two rows equal), so we can write
$$ \det(V) = a_{n-1}(x_{n}- x_{1}) (x_{n} - x_{2}) \cdots (x_{n} - x_{n-1}). $$
By the induction hypothesis,
$$ a_{n-1} = \prod_{1 \leq i < j \leq n-1} (x_{j} - x_{i}). $$
Hence,
$$ \det(V) = \left[\prod_{1 \leq i < j \leq n-1}(x_{j} - x_{i})\right] (x_{n} - x_{1}) \cdots (x_{n} - x_{n-1}) = \prod_{1 \leq i < j \leq n} (x_{j} - x_{i}). $$
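(A SymPy illustration of the induction step for $n=4$; the concrete check and symbol names are mine:)

```python
import sympy as sp

x1, x2, x3, x4 = sp.symbols('x1 x2 x3 x4')

V = sp.Matrix([[1, x1, x1**2, x1**3],
               [1, x2, x2**2, x2**3],
               [1, x3, x3**2, x3**3],
               [1, x4, x4**2, x4**3]])

# det(V) viewed as a polynomial in x4 (degree 3)
p = sp.Poly(V.det(), x4)
a_lead = p.all_coeffs()[0]   # leading coefficient a_{n-1}

# The leading coefficient equals the 3x3 Vandermonde determinant
V3 = sp.Matrix([[1, x1, x1**2],
                [1, x2, x2**2],
                [1, x3, x3**2]])
print(sp.expand(a_lead - V3.det()))   # prints 0

# x1, x2, x3 are roots: substituting any of them makes two rows equal
print([sp.expand(V.det().subs(x4, r)) for r in (x1, x2, x3)])  # prints [0, 0, 0]
```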

DIEGO R.