
Let

$$ c^{n_1,n_2,k}_{m_1,m_2} = \sum_{i=0}^k (-1)^i \binom{k}{i} \binom{n_1+i}{m_1} \binom{n_2+i}{m_2}. $$

I have reasons to believe that $c^{n_1,n_2,k}_{m_1,m_2} = 0$ for $k > m_1 + m_2$. However, so far I have failed to come up with a proof. I know that for $n_i \ge m_i$ one can express

$$ c^{n_1,n_2,k}_{m_1,m_2} = \binom{n_1}{m_1}\binom{n_2}{m_2} \, {}_3F_2(-k, n_1+1,n_2+1;n_1-m_1+1,n_2-m_2+1;1), $$ with ${}_3F_2$ a generalised hypergeometric function. But this representation is not defined when $m_i > n_i$, while the first definition clearly is (with $\binom{n}{m} = 0$ for $n < m$).

I numerically sampled the identity $c^{n_1,n_2,k}_{m_1,m_2} = 0$ for $n_1, n_2, m_1, m_2 \in \{0,\dots,100\}$ and $k \in m_1+m_2+1 + \{0,\dots,100\}$ and found no counterexample. Thanks in advance for any help.
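For reference, the sampling can be reproduced with a short Python sketch (ranges shrunk here for speed; the helper name `c` is mine, and `comb(n, m)` already returns $0$ for $n < m$):

```python
from math import comb

def c(n1, n2, m1, m2, k):
    """The alternating sum from the question."""
    return sum((-1)**i * comb(k, i) * comb(n1 + i, m1) * comb(n2 + i, m2)
               for i in range(k + 1))

# Spot-check the conjectured vanishing for k > m1 + m2.
for n1 in range(6):
    for n2 in range(6):
        for m1 in range(6):
            for m2 in range(6):
                for extra in range(1, 4):
                    assert c(n1, n2, m1, m2, m1 + m2 + extra) == 0
```

At $k = m_1 + m_2$ the sum is nonzero in general, so the conjectured threshold is sharp.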


Motivation:

I came to this problem when working with the power series

$$ f(x) = \sum_{k=0}^\infty \frac{(-x)^k}{k!} \binom{n_1+k}{m_1} \binom{n_2+k}{m_2}. $$

Playing around with this, it is easy to prove for small values of $n_i, m_i$ that $f(x) = h(x) e^{-x}$ with $h$ a polynomial. To show this for arbitrary $n_i$, $m_i$, I multiplied $f$ by $1 = e^{-x} e^x$, expanded $e^x$ in its power series, combined the two power series via their convolution, and ended up with the definition above (up to a factor of $k!$). If I can now show that $c^{n_1,n_2,k}_{m_1,m_2} = 0$ for $k$ large enough, this proves the separation of $f$ into an exponential and a polynomial part.
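Concretely, the convolution gives $e^x f(x) = \sum_k c^{n_1,n_2,k}_{m_1,m_2}\, x^k / k!$, so under the conjecture $h$ has coefficients $c^{n_1,n_2,k}_{m_1,m_2}/k!$ up to degree $m_1+m_2$. This can be sanity-checked numerically; a small Python sketch with arbitrarily chosen parameters:

```python
from math import comb, factorial, exp

n1, n2, m1, m2 = 3, 4, 2, 1   # arbitrary sample parameters

def h_coeff(k):
    """k-th coefficient of h: c^{n1,n2,k}_{m1,m2} / k!."""
    s = sum((-1)**i * comb(k, i) * comb(n1 + i, m1) * comb(n2 + i, m2)
            for i in range(k + 1))
    return s / factorial(k)

x = 0.7
# h truncated at degree m1 + m2 (all higher coefficients vanish if the
# conjecture holds) versus a long partial sum of the series for f.
h = sum(h_coeff(k) * x**k for k in range(m1 + m2 + 1))
f = sum((-x)**k / factorial(k) * comb(n1 + k, m1) * comb(n2 + k, m2)
        for k in range(60))
assert abs(f - h * exp(-x)) < 1e-9
```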

manthano
  • What is the motivation for this question? – Jean Marie Nov 15 '21 at 15:00
  • @JeanMarie added motivation to the text.

    You can also express the defining equation of $f$ via hypergeometric functions, but they suffer from the same $m_i > n_i$ problem, and I also didn't find a way to extract useful information from them, while the outlined way seems much clearer... if correct :D.

    – manthano Nov 15 '21 at 15:23

1 Answer


The (forward, unit-step) finite difference of a function is defined as $$ \Delta f(x) = f(x + 1) - f(x) $$ and its iterates as $$ \Delta^n f(x) = \Delta\left(\Delta^{n-1} f(x)\right) = \sum_{k=0}^{n} (-1)^{n-k} \binom{n}{k} f(x+k). $$
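The equivalence of the iterated form and the alternating binomial sum is easy to verify numerically; a minimal Python sketch (the helper names `delta_iter` and `delta_sum` are mine):

```python
from math import comb

def delta_iter(f, x, n):
    """Delta^n f(x) by iterating Delta f(x) = f(x+1) - f(x)."""
    if n == 0:
        return f(x)
    return delta_iter(f, x + 1, n - 1) - delta_iter(f, x, n - 1)

def delta_sum(f, x, n):
    """Delta^n f(x) via the alternating binomial sum."""
    return sum((-1)**(n - k) * comb(n, k) * f(x + k) for k in range(n + 1))

g = lambda x: x**3 - 2*x + 1
assert all(delta_iter(g, x, n) == delta_sum(g, x, n)
           for x in range(5) for n in range(6))
```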

The binomial $$ \binom{a+x}{m} $$ is a polynomial in $x$ of degree $m$, and the product $$ \binom{a+x}{m} \binom{c+x}{q} $$ is a polynomial of degree $m+q$.

Your expression is just $$ (-1)^k \Delta^k $$ applied to a polynomial of degree $m_1 + m_2$, which is identically null whenever $k > m_1 + m_2$.
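This observation can be spot-checked directly; a small Python sketch with arbitrarily chosen parameters:

```python
from math import comb

n1, n2, m1, m2 = 5, 2, 3, 4      # arbitrary sample values

# p has degree m1 + m2 as a polynomial in x.
p = lambda x: comb(n1 + x, m1) * comb(n2 + x, m2)

def delta_n(f, x, n):
    """Delta^n f(x) via the alternating binomial sum."""
    return sum((-1)**(n - i) * comb(n, i) * f(x + i) for i in range(n + 1))

# c^{n1,n2,k}_{m1,m2} = (-1)^k Delta^k p(0), so it vanishes once
# k exceeds the degree m1 + m2.
for k in range(m1 + m2 + 1, m1 + m2 + 6):
    assert delta_n(p, 0, k) == 0
```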

Following the suggestion of @JeanMarie and yours, let's explore the properties of $\Delta$ applied to polynomials a bit more.

Let $x^{\underline{q}}$ denote the falling factorial.
For $0 \le q \in \mathbb{N}$ the falling factorial can be defined as $$ x^{\underline{q}} = x(x-1)(x-2)\cdots(x-q+1) = \prod_{k=0}^{q-1} (x-k), $$ which is clearly a polynomial of degree $q$.

You can easily check that, for $n, q \in \mathbb{N}$, $$ \begin{aligned} & x^{\underline{0}} = 1 \\ & \Delta x^{\underline{q}} = q\, x^{\underline{q-1}} \\ & \Delta^n x^{\underline{q}} = q(q-1)\cdots(q-n+1)\, x^{\underline{q-n}} = q^{\underline{n}}\, x^{\underline{q-n}} && (0 \le n < q) \\ & \Delta^q x^{\underline{q}} \equiv q! && (n = q) \\ & \Delta^n x^{\underline{q}} \equiv 0 && (0 \le q < n) \end{aligned} $$ i.e. the $\Delta$ operator on $x^{\underline{q}}$ "mimics" the $D$ operator on $x^q$ (cf. umbral calculus).
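The rule $\Delta x^{\underline{q}} = q\, x^{\underline{q-1}}$ can be verified for small cases; a minimal Python sketch (the helper `ff` is mine):

```python
def ff(x, q):
    """Falling factorial x(x-1)...(x-q+1); ff(x, 0) = 1 (empty product)."""
    r = 1
    for k in range(q):
        r *= x - k
    return r

delta = lambda f, x: f(x + 1) - f(x)

# Discrete analogue of D x^q = q x^(q-1), checked on a grid of integers.
for q in range(1, 8):
    for x in range(-3, 6):
        assert delta(lambda t: ff(t, q), x) == q * ff(x, q - 1)
```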

We can expand the product defining the falling factorial as $$ x^{\underline{q}} = \prod_{k=0}^{q-1} (x-k) = \sum_{j=0}^{q} (-1)^{q-j} \begin{bmatrix} q \\ j \end{bmatrix} x^{j}, $$ where the square bracket denotes the unsigned Stirling number of the first kind.

The matrix of the Stirling numbers is lower triangular and invertible, and $$ x^{q} = \sum_{k=0}^{q} \begin{Bmatrix} q \\ k \end{Bmatrix} x^{\underline{k}}, $$ where the curly bracket denotes the Stirling number of the second kind.
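The second-kind expansion can be checked by computing the Stirling numbers from their standard recurrence; a Python sketch (illustrative only):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(n, k):
    """Stirling numbers of the second kind via the usual recurrence."""
    if n == k == 0:
        return 1
    if n == 0 or k == 0:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

def ff(x, q):
    """Falling factorial x(x-1)...(x-q+1)."""
    r = 1
    for i in range(q):
        r *= x - i
    return r

# x^q = sum_k {q over k} x^(falling k), for several integer x and degrees q.
for q in range(7):
    for x in range(-4, 8):
        assert x**q == sum(stirling2(q, k) * ff(x, k) for k in range(q + 1))
```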

So the sequence $$ \left\{ x^{\underline{0}}, x^{\underline{1}}, \ldots, x^{\underline{q}}, \ldots \right\} $$ constitutes a polynomial basis (for finite degree, at least).

Since $\Delta$ is a linear operator, $$ \Delta^n x^{\underline{d}} \equiv 0 \quad\Rightarrow\quad \Delta^n x^{d} \equiv 0 \quad\Rightarrow\quad \Delta^n p_d(x) \equiv 0 \qquad (0 \le d < n), $$ which is the answer given above.

Coming to the Newton series, in my opinion the best way to "grasp it" is through the mutual relation of $\Delta$ with the shift operator $$ E:\;\; Ef(x) = f(x+1) \quad\Leftrightarrow\quad \Delta = E - I, $$ so that $$ \Delta^n f(x) = (E - I)^n f(x) = \sum_{k=0}^{n} (-1)^{n-k} \binom{n}{k} E^{k} f(x) = \sum_{k=0}^{n} (-1)^{n-k} \binom{n}{k} f(x+k), $$ as we already saw.

We now invoke the binomial transform inversion to write $$ E^n = \sum_{k=0}^{n} \binom{n}{k} \Delta^{k} \quad\Leftrightarrow\quad E^n f(x) = f(x+n) = \sum_{k=0}^{n} \binom{n}{k} \Delta^{k} f(x), $$ which is the (forward) Newton expansion.
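The Newton expansion is easy to test on a sample polynomial; a small Python sketch:

```python
from math import comb

def delta_n(f, x, n):
    """Delta^n f(x) via the alternating binomial sum."""
    return sum((-1)**(n - i) * comb(n, i) * f(x + i) for i in range(n + 1))

g = lambda t: t**4 - 3*t + 2   # arbitrary sample polynomial

# Forward Newton expansion: f(x+n) = sum_k C(n,k) Delta^k f(x).
for x in range(4):
    for n in range(8):
        assert g(x + n) == sum(comb(n, k) * delta_n(g, x, k)
                               for k in range(n + 1))
```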

Finally, taking a generic polynomial $$ p_d(x) = a_d x^d + a_{d-1} x^{d-1} + \cdots + a_0 x^0 = \sum_{j=0}^{d} a_j x^j, $$ we can equivalently write it as $$ p_d(x) = b_d x^{\underline{d}} + b_{d-1} x^{\underline{d-1}} + \cdots + b_0 x^{\underline{0}} = \sum_{j=0}^{d} b_j x^{\underline{j}} = d!\, b_d \binom{x}{d} + (d-1)!\, b_{d-1} \binom{x}{d-1} + \cdots + 0!\, b_0 \binom{x}{0}. $$

Using the relations $x^{\underline{q}} \leftrightarrow x^{j}$ provided above, together with the inversion relation of the Stirling numbers, we arrive at $$ b_k = \sum_{j=0}^{d} a_j \begin{Bmatrix} j \\ k \end{Bmatrix} \quad\Leftrightarrow\quad a_k = \sum_{j=0}^{d} (-1)^{j-k} \begin{bmatrix} j \\ k \end{bmatrix} b_j, $$ which is the premise to the answer signalled by @JeanMarie.
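The coefficient conversion $a_j \mapsto b_k$ can be verified on a sample polynomial; a Python sketch with an arbitrarily chosen coefficient list:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(n, k):
    """Stirling numbers of the second kind via the usual recurrence."""
    if n == k == 0:
        return 1
    if n == 0 or k == 0:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

def ff(x, q):
    """Falling factorial x(x-1)...(x-q+1)."""
    r = 1
    for i in range(q):
        r *= x - i
    return r

a = [2, -1, 0, 5]                      # p(x) = 5x^3 - x + 2, arbitrary
d = len(a) - 1
# b_k = sum_j a_j {j over k}: monomial basis -> falling-factorial basis.
b = [sum(a[j] * stirling2(j, k) for j in range(d + 1)) for k in range(d + 1)]

# Both representations agree at several integer points.
for x in range(-3, 6):
    assert sum(a[j] * x**j for j in range(d + 1)) == \
           sum(b[k] * ff(x, k) for k in range(d + 1))
```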

This, very succinctly, is the circle of relations I was mentioning in my comment:
it would have become more apparent had we adopted the matrix/vector representation of polynomials and the corresponding base change.

G Cab
  • [+1] Congratulations. It seems (now!) evident, but I wasn't able to see it! Another (more complicated) way to reach the same conclusion is to use another expression for $\Delta^n f(x)$ that can be found in this very recent answer – Jean Marie Nov 15 '21 at 19:34
  • 1
    @jeanmarie hi, Jean Marie, thanks for the appreciation and for the additional link. However, I am quite aware of the connection between polynomials, powers, falling factorials, Stirling numbers, Newton series... don't know if the OP is interested in that – G Cab Nov 15 '21 at 21:03
  • 1
    @GCab on the contrary, the connection to that seems very interesting, and I think it is worth linking to that answer! Besides that, thank you for the very clear answer! – manthano Nov 16 '21 at 07:42
  • 1
    @manthano: I added a concise explanation of the relations among difference, Stirling, Newton etc. and the cited answer. – G Cab Nov 16 '21 at 17:26
  • @GCab Thank you very much for the great amount of added details. I took the freedom of fixing two minor mistakes. If I could, I would also upvote your answer a second time. – manthano Nov 17 '21 at 08:28
  • @manthano: glad it helps, .. and thanks for clearing the typos ! – G Cab Nov 17 '21 at 11:28
  • @manthano: concerning your $f(x)$, maybe you already know that it can be expressed as a hypergeometric function (generalized 2,2) – G Cab Nov 17 '21 at 11:54
  • @GCab yes, that was actually the starting point for me. However, knowing that $f(x)$ = polynomial * exponential, even if I cannot write down the polynomial, is more useful to me than the abstract closed-form representation. (Plus, the definition of the hypergeometric function is again a little problematic if $n_i < m_i$.) – manthano Nov 17 '21 at 14:53