
I want to prove the following identity

$\sum_{i=1}^{k}x_i\Delta(x_1,\ldots,x_i+t,\ldots,x_k)=\left(x_1+x_2+\cdots+x_k+{k \choose 2}t\right)\cdot\Delta(x_1,x_2,\ldots,x_k),$

where we write $\Delta(l_1,\ldots,l_k)$ for $\prod_{i<j}(l_i-l_j).$

I have checked that this identity is true for $k=2,3.$ I tried to calculate the LHS using the Vandermonde determinant, but it does not seem to work. I sense that some properties of symmetric functions could be useful. Any help will be appreciated. :)
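
One can confirm the identity symbolically for small $k$ with a short SymPy sketch along these lines (the helper name `delta` is ad hoc, not from the post):

```python
# Ad hoc sanity check of the identity for small k, using SymPy.
from itertools import combinations

import sympy as sp


def delta(vals):
    """Return prod_{i<j} (vals[i] - vals[j])."""
    return sp.prod([a - b for a, b in combinations(vals, 2)])


t = sp.symbols('t')
for k in range(2, 5):
    x = sp.symbols(f'x1:{k + 1}')  # the tuple (x1, ..., xk)
    lhs = sum(x[i] * delta(x[:i] + (x[i] + t,) + x[i + 1:]) for i in range(k))
    rhs = (sum(x) + sp.binomial(k, 2) * t) * delta(x)
    assert sp.expand(lhs - rhs) == 0, k
print("identity verified symbolically for k = 2, 3, 4")
```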


The problem comes from Fulton's book "Young Tableaux: With Applications to Representation Theory and Geometry". It is stated as a fact in Exercise 10 on page 55.

[screenshot of Exercise 10 from the book]

However, I cannot prove this.


He uses the following fact to prove the hook length formula:

[screenshot of Exercise 9 (page 54) from the book]

He then sets $x_i=l_i$ and $t=-1$ in Exercise 10.

  • I have been interested in a nice proof of this identity for a while! (I remember there being a proof using root counting, but I don't remember the details, except that I didn't like them.) This sounds like it could have a combinatorial meaning (RS insertion in some tableau?). – darij grinberg Oct 12 '16 at 01:14
  • I am interested in how this identity leads to the Hook Length formula; could you or someone else sketch the rest of the argument? – alphacapture Oct 12 '16 at 16:32
  • Thanks for your answer! Fulton's book uses the fact of Exercise 9 in page54. – VerMoriarty Oct 13 '16 at 08:44

2 Answers


The difference $\mathrm{LHS}-\mathrm{RHS}$ is a polynomial of degree $\binom{k}{2}+1$ in $x_1, x_2, \ldots, x_k, t$.

First, note that the equation holds whenever $x_i=x_j$ for some pair $i \neq j$. Indeed, the RHS is then 0, and every term on the LHS is 0 except $x_i\Delta(x_1, \ldots, x_i+t, \ldots, x_k)$ and $x_j\Delta(x_1, \ldots, x_j+t, \ldots, x_k)$, and these two cancel out: all factors involving neither $x_i$ nor $x_j$ agree in both terms; the factors involving exactly one of them can be paired up, since $(x_i+t-x_m)(x_m-x_j)$ equals $(x_i-x_m)(x_m-(x_j+t))$ when $x_i=x_j$; and the only remaining factors, $x_i+t-x_j$ in one term and $x_i-(x_j+t)$ in the other, are negatives of each other.

Therefore, $x_i-x_j$ is a factor of the difference for any $i, j$, giving $\binom{k}{2}$ factors.

When $t=0$, the equation is true by the distributive law: both sides equal $\left(x_1+x_2+\cdots+x_k\right)\Delta(x_1, x_2, \ldots, x_k)$. So $t$ is also a factor of the difference of the LHS and the RHS.

Therefore, the difference $\mathrm{LHS}-\mathrm{RHS}$ must be of the form $Ct\prod_{i<j}{(x_i-x_j)}$ for some constant $C$. To prove that $C=0$, it suffices to show that the equation is true for some values of $x_1,x_2,\ldots,x_k,t$ where no two $x_i$ are equal and $t\neq0$.

I will choose $x_i=i$ and $t=-1$.

$$\textrm{LHS}=\sum_{i=1}^{k}i\Delta(1,\ldots, i-1,i-1,i+1,\ldots,k),$$

and every term with $i\geq2$ vanishes because its argument list contains $i-1$ twice; the only surviving term is the first, which is $1\cdot\Delta(0,2,3,\ldots,k).$

$$\textrm{RHS}=\left(1+2+\ldots+k+\binom{k}{2}(-1)\right)\cdot\Delta(1,2,\ldots,k)=k\Delta(1,2,\ldots,k).$$

So, the goal is to show

$$\Delta(0,2,\ldots,k)=k\Delta(1,2,\ldots,k).$$

To do this, note that the factors in the $\Delta$ part not involving the first entry are the same in both. For the LHS, the factors involving the first entry are $-2,-3,\ldots,-k$, and for the RHS, they are $-1,-2,\ldots,-(k-1)$; the ratio of the two products is $\frac{(-2)(-3)\cdots(-k)}{(-1)(-2)\cdots(-(k-1))}=\frac{k!}{(k-1)!}=k$, so we are done.
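
As a further sanity check, here is a short Python sketch (an ad hoc addition, not part of the original answer) confirming $\Delta(0,2,\ldots,k)=k\,\Delta(1,2,\ldots,k)$ numerically for several values of $k$:

```python
# Numerical check of Delta(0,2,...,k) = k * Delta(1,2,...,k).
from itertools import combinations
from math import prod


def delta(vals):
    # prod_{i<j} (vals[i] - vals[j])
    return prod(a - b for a, b in combinations(vals, 2))


for k in range(2, 9):
    lhs = delta([0] + list(range(2, k + 1)))
    rhs = k * delta(list(range(1, k + 1)))
    assert lhs == rhs, k
print("checked for k = 2, ..., 8")
```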

– alphacapture

1. There exists a solution without the polynomial identity trick; it is long but rather conservative in its methods:

  • Fix some $i$. Vandermonde tells us that $\Delta\left(x_1, x_2, \ldots, x_i + t, \ldots, x_k\right)$ is the determinant of the $k\times k$-matrix whose $\left(u,v\right)$-th entry is $x_u^{k-v}$ if $u \neq i$ and $\left(x_i+t\right)^{k-v}$ if $u = i$.
  • Expand this determinant along the $i$-th row. Do the same with $\Delta\left(x_1, x_2, \ldots, x_k\right)$. Notice that the cofactors will be the same for both determinants.
  • In the expansion of $\Delta\left(x_1, x_2, \ldots, x_i + t, \ldots, x_k\right)$, expand each $\left(x_i + t\right)^{k-v}$ by the binomial formula: $\left(x_i + t\right)^{k-v} = \sum\limits_{\ell=0}^{k-v} \dbinom{k-v}{\ell} t^\ell x_i^{k-v-\ell}$.
  • Subtract $\Delta\left(x_1, x_2, \ldots, x_k\right)$; this steals one addend from the binomial formula (i.e., the $\sum\limits_{\ell=0}^{k-v} \dbinom{k-v}{\ell} t^\ell x_i^{k-v-\ell}$ is turned into $\sum\limits_{\ell=1}^{k-v} \dbinom{k-v}{\ell} t^\ell x_i^{k-v-\ell}$).
  • Conclude a double sum expression for $\Delta\left(x_1, x_2, \ldots, x_i + t, \ldots, x_k\right) - \Delta\left(x_1, x_2, \ldots, x_k\right)$. Multiply it by $x_i$ and sum over all $i$. The result is a triple sum (i.e., a sum of sums of sums). Your goal is to prove that this result is $\dbinom{k}{2} t \Delta\left(x_1, x_2, \ldots, x_k\right)$.
  • Transform the triple sum so that the summation over $i$ becomes the innermost sum. Then, the sum over $i$ is a Laplace expansion of a certain determinant along its $v$-th column. If $\ell=1$, then this latter determinant is $\Delta\left(x_1, x_2, \ldots, x_k\right)$, whereas otherwise it is zero (since it has two equal columns). Use this to get rid of the two inner sums. The outer sum now becomes $\sum\limits_{v=1}^{k} \dbinom{k-v}{1} t \Delta\left(x_1, x_2, \ldots, x_k\right) = \dbinom{k}{2} t \Delta\left(x_1, x_2, \ldots, x_k\right)$, which is what you wanted.

The details can be found in the solution to Exercise 6.34 in my Notes on the combinatorial fundamentals of algebra (version of 10 January 2019) (there is also a version on my website which is being kept up-to-date, but its numbering might shift).
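
To make the key step concrete, the following SymPy sketch (an illustrative sanity check added here, not taken from the notes; it uses 0-indexed rows and columns) verifies that the inner sum over $i$ gives $\Delta(x_1,\ldots,x_k)$ for $\ell=1$ and $0$ for $\ell\geq2$:

```python
# Ad hoc sketch of the key step: replacing column v of the Vandermonde
# matrix by the entries x_i^(k-v-l) and expanding along that column
# gives Delta for l = 1 and 0 for l >= 2 (the replaced column then
# duplicates column v + l - 1).
import sympy as sp

k = 4
x = sp.symbols(f'x1:{k + 1}')
V = sp.Matrix(k, k, lambda u, v: x[u] ** (k - 1 - v))  # Vandermonde matrix
D = V.det()  # equals Delta(x1, ..., xk)

for v in range(k):             # fix a column v
    for l in range(1, k - v):  # l as in the binomial expansion
        s = sum(x[i] ** (k - v - l) * V.cofactor(i, v) for i in range(k))
        assert sp.expand(s - (D if l == 1 else 0)) == 0
print("inner sums verified for k =", k)
```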

2. I am wondering if there is a combinatorial proof (by sign-reversing involution). Combinatorial proofs of the Vandermonde determinant formula can be found in various places.

3. Here are four variations on the identity from the original post:

Variation 1. We have \begin{equation} \sum\limits_{i=1}^k \Delta\left(x_1, x_2, \ldots, x_i + t, \ldots, x_k\right) = k \Delta\left(x_1, x_2, \ldots, x_k\right) . \end{equation}

Variation 2. For each $i \in \left\{1,2,\ldots,k\right\}$, set $y_i = \prod\limits_{1\leq j\leq k;\ j \neq i} x_j$. Then, \begin{equation} \sum\limits_{i=1}^k y_i \Delta\left(x_1, x_2, \ldots, x_i + t, \ldots, x_k\right) = - \dfrac{\prod\limits_{i=1}^k \left(x_i-t\right) - \prod\limits_{i=1}^k x_i}{t} \Delta\left(x_1, x_2, \ldots, x_k\right) . \end{equation}

Variation 3. For each $m \in \left\{0,1,\ldots,k-1\right\}$, we have \begin{equation} \sum\limits_{i=1}^k x_i^m \Delta\left(x_1, x_2, \ldots, x_{i-1}, t, x_{i+1}, \ldots, x_k\right) = t^m \Delta\left(x_1, x_2, \ldots, x_k\right) . \end{equation}

Variation 4. For each $i \in \left\{1,2,\ldots,k\right\}$, set $y_i = \prod\limits_{1\leq j\leq k;\ j \neq i} x_j$. Then, \begin{equation} \sum\limits_{i=1}^k y_i \Delta\left(x_1, x_2, \ldots, x_{i-1}, t, x_{i+1}, \ldots, x_k\right) = - \dfrac{\prod\limits_{i=1}^k \left(x_i-t\right) - \prod\limits_{i=1}^k x_i}{t} \Delta\left(x_1, x_2, \ldots, x_k\right) . \end{equation}
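
All four variations can be checked symbolically for small $k$; the following SymPy sketch (an illustrative addition with ad hoc names, not part of the original answer) does so for $k=4$, multiplying Variations 2 and 4 through by $t$ so that everything stays polynomial:

```python
# Ad hoc check of Variations 1-4 for k = 4.
from itertools import combinations

import sympy as sp


def delta(vals):
    # prod_{i<j} (vals[i] - vals[j])
    return sp.prod([a - b for a, b in combinations(vals, 2)])


t = sp.symbols('t')
k = 4
x = sp.symbols(f'x1:{k + 1}')
y = [sp.prod([x[j] for j in range(k) if j != i]) for i in range(k)]
D = delta(x)
num = sp.prod(x) - sp.prod([xi - t for xi in x])  # = t * (RHS factor in Var. 2, 4)

# Variation 1
v1 = sum(delta(x[:i] + (x[i] + t,) + x[i + 1:]) for i in range(k))
assert sp.expand(v1 - k * D) == 0

# Variation 2 (multiplied through by t)
v2 = sum(y[i] * delta(x[:i] + (x[i] + t,) + x[i + 1:]) for i in range(k))
assert sp.expand(t * v2 - num * D) == 0

# Variation 3, for each m in {0, ..., k-1}
for m in range(k):
    v3 = sum(x[i] ** m * delta(x[:i] + (t,) + x[i + 1:]) for i in range(k))
    assert sp.expand(v3 - t ** m * D) == 0

# Variation 4 (multiplied through by t)
v4 = sum(y[i] * delta(x[:i] + (t,) + x[i + 1:]) for i in range(k))
assert sp.expand(t * v4 - num * D) == 0
print("Variations 1-4 verified for k =", k)
```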

Variation 1 is easier than the original problem. It has an algebraic proof which is similar to (but easier than) the above argument (Proposition 7.192 in my Notes on the combinatorial fundamentals of algebra). I am wondering whether it has a combinatorial proof using tournaments as well.

Variation 2, on the other hand, appears significantly harder. I have two proofs. One uses the alternant expression for the elementary symmetric functions. Another is more elementary but irritatingly long (see Proposition 7.198 and Proposition 7.199 (d) in my Notes on the combinatorial fundamentals of algebra). A combinatorial proof would be particularly interesting here.

Variation 3 is fairly easy again (it is Proposition 7.194 in my Notes on the combinatorial fundamentals of algebra). It is also equivalent (when the denominators make sense) to the famous formula $\sum\limits_{i=1}^k x_i^m \dfrac{\prod\limits_{j \neq i} \left(t-x_j\right)}{\prod\limits_{j \neq i} \left(x_i-x_j\right)} = t^m$, which can be seen as an explicit description of a Lagrange interpolating polynomial. Again, I don't have a combinatorial proof.
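
Here is a quick SymPy check of that interpolation formula for $k=4$ (again an ad hoc illustration, not from the original answer):

```python
# Ad hoc check of the Lagrange interpolation identity for k = 4.
import sympy as sp

t = sp.symbols('t')
k = 4
x = sp.symbols(f'x1:{k + 1}')
for m in range(k):
    s = sum(
        x[i] ** m
        * sp.prod([t - x[j] for j in range(k) if j != i])
        / sp.prod([x[i] - x[j] for j in range(k) if j != i])
        for i in range(k)
    )
    assert sp.cancel(s - t ** m) == 0
print("interpolation identity verified for m = 0, ...,", k - 1)
```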

Variation 4 is again painful to prove (Proposition 7.205 in my Notes on the combinatorial fundamentals of algebra).