I've been working through Georgi Shilov's dense but pleasant *Linear Algebra*. To my chagrin, I'm stuck on Problem 10 of the first chapter, which has withstood a few hours' toil.
The problem is to prove that the equation below holds, where the vertical bars signify taking the determinant: $$ \begin{vmatrix} 1 & 1 & ...& 1 \\ x_1 & x_2 & ... & x_n \\ x_1^2 & x_2^2 & ... & x_n^2 \\ \vdots & \vdots & \ddots & \vdots \\ x_1^{n-2} & x_2^{n-2} & ...& x_n^{n-2} \\ x_1^n & x_2^n & ...& x_n^n \\ \end{vmatrix} = \begin{vmatrix} 1 & 1 & ...& 1 \\ x_1 & x_2 & ... & x_n \\ x_1^2 & x_2^2 & ... & x_n^2 \\ \vdots & \vdots & \ddots & \vdots \\ x_1^{n-2} & x_2^{n-2} & ...& x_n^{n-2} \\ x_1^{n-1} & x_2^{n-1} & ...& x_n^{n-1} \\ \end{vmatrix} \times \sum_{k=1}^nx_k $$ Let us call the left-hand matrix $A$ and the right-hand matrix $B$.
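(As a quick sanity check on my reading of the statement, not from the book: the $n = 2$ case asserts $$ \begin{vmatrix} 1 & 1 \\ x_1^2 & x_2^2 \end{vmatrix} = \begin{vmatrix} 1 & 1 \\ x_1 & x_2 \end{vmatrix} \times (x_1 + x_2), $$ i.e. $x_2^2 - x_1^2 = (x_2 - x_1)(x_1 + x_2)$, which holds.)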
Now, here's the rub. Based on the book's presentation (and the hint at the back), there should be a solution using the following facts/operations:
(1) $\det(A)$ is a polynomial in $x_n$ of degree $n$, with roots $x_1, x_2, ..., x_{n-1}$, so it factors as:
$$\det(A) = (\alpha + \beta x_n)\prod_{k=1}^{n-1}(x_n-x_k)$$
(2) Cofactor expansion along the rightmost column of $A$. That is:
$$\det(A) = C_n^1 + x_n C_n^2 + ... + x_n^{n-2}C_n^{n-1} + x_n^nC_n^n$$ where $C_n^j$ is the cofactor of the $j$th element in the rightmost column.
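For concreteness (my own illustration, not from the book), at $n = 3$ this expansion reads $$\det(A) = \begin{vmatrix} x_1 & x_2 \\ x_1^3 & x_2^3 \end{vmatrix} - x_3\begin{vmatrix} 1 & 1 \\ x_1^3 & x_2^3 \end{vmatrix} + x_3^3\begin{vmatrix} 1 & 1 \\ x_1 & x_2 \end{vmatrix}, $$ with the signs $(-1)^{j+3}$ folded into the cofactors.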
My observations:
- The polynomial in (1) has the following leading and next-to-leading terms (see Wolfram):
$$ \det(A) = \beta x_n^n - \beta\left(\sum_{k=1}^{n-1}x_k\right)x_n^{n-1} + \alpha x_n^{n-1} + \cdots $$
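(To spell out where these come from: the product in (1) expands as $$\prod_{k=1}^{n-1}(x_n-x_k) = x_n^{n-1} - \left(\sum_{k=1}^{n-1}x_k\right)x_n^{n-2} + \cdots,$$ and multiplying by $\alpha + \beta x_n$ produces exactly the three displayed terms.)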
- The polynomial in (2) has leading coefficient $C_n^n$, since the cofactors $C_n^j$ do not involve $x_n$. Therefore $\beta = C_n^n$.
- The polynomial in (2) has no $x_n^{n-1}$ term. Therefore, setting the $x_n^{n-1}$ coefficient from Observation 1 to zero:
$$ \beta\left(\sum_{k=1}^{n-1}x_k\right) = C_n^n\left(\sum_{k=1}^{n-1}x_k\right) = \alpha $$
- $\alpha$ has the exact form of the right-hand side of our equation, were it one row and column smaller!
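Putting these together (unless I've slipped up somewhere), substituting $\alpha = C_n^n\sum_{k=1}^{n-1}x_k$ and $\beta = C_n^n$ back into (1) gives $$\det(A) = C_n^n\left(\sum_{k=1}^{n}x_k\right)\prod_{k=1}^{n-1}(x_n-x_k).$$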
Unfortunately, I can't see how to finish from here. Any pointers?