
Motivation

For $k=1,2,3$, a product of $k$ factors can be expressed as a linear combination of $k$-th powers of linear combinations of its factors:

$$\begin{align} x_1&=1 (x_1)^1 \tag{1}\\ x_1x_2&=\frac{1}{4}(x_1+x_2)^2-\frac{1}{4}(x_1-x_2)^2 \tag{2}\\ x_1x_2x_3&=\frac{1}{24}(x_1+x_2+x_3)^3−\frac{1}{24}(−x_1+x_2+x_3)^3−\frac{1}{24}(x_1−x_2+x_3)^3−\frac{1}{24}(x_1+x_2−x_3)^3 \tag{3}\end{align}$$

This leads to the general form: $$\prod_{j=1}^k x_j=\sum_{i=1}^{n_k}a_i\left(\sum_{j=1}^k b_{i,j}x_j\right)^k \tag{4}$$ where, for example, for $k=2$: $n_2=2$, $a_1=\frac{1}{4}$, $a_2=-\frac{1}{4}$, $b_{1,1}=1$, $b_{1,2}=1$, $b_{2,1}=1$, $b_{2,2}=-1$.
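For instance, identity $(2)$ can be checked by direct expansion:
$$\frac{1}{4}(x_1+x_2)^2-\frac{1}{4}(x_1-x_2)^2=\frac{1}{4}\left[(x_1^2+2x_1x_2+x_2^2)-(x_1^2-2x_1x_2+x_2^2)\right]=x_1x_2 .$$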

However, this simple pattern does not continue for $k>3$.

Questions

Which real coefficients $a_i,b_{i,j}$ in $(4)$ yield the product $x_1 x_2 x_3 x_4$?

Which real coefficients $a_i,b_{i,j}$ yield $\prod_{j=1}^k x_j$ for general $k\in \mathbb{N}^+$?

Note:

The smallest possible $n_k$ is of interest. Some of the $b_{i,j}$ may be zero.

3 Answers


For $k=4$ we have

$$x_1 x_2 x_3 x_4 = \frac{1}{192} ({x_1}+{x_2}+{x_3}+{x_4})^4+\frac{1}{384} \left(({x_1}+{x_2}-{x_3}-{x_4})^4+({x_1}-{x_2}+{x_3}-{x_4})^4+(-{x_1}+{x_2}+{x_3}-{x_4})^4+({x_1}-{x_2}-{x_3}+{x_4})^4+(-{x_1}+{x_2}-{x_3}+{x_4})^4+(-{x_1}-{x_2}+{x_3}+{x_4})^4\right)+\frac{1}{192} \left(-({x_1}+{x_2}+{x_3}-{x_4})^4-({x_1}+{x_2}-{x_3}+{x_4})^4-({x_1}-{x_2}+{x_3}+{x_4})^4-(-{x_1}+{x_2}+{x_3}+{x_4})^4\right) $$

Maybe you recognize a pattern: the denominators are, for instance, $192=4 \cdot 6 \cdot 8$ and $384=2 \cdot 4 \cdot 6 \cdot 8$.

They can be found like so:

  • Construct an ansatz that is fully symmetric in all variables $x_i$ and involves the unknown linear parameters $a,b,c$.
  • Expand and collect w.r.t. $a,b,c$
  • Compare coefficients. E.g. in the above case, with the fractions replaced by $a,b,c$, the three groups contribute $96 b$, $-24 a$ and $-144 c$ to the coefficient of $x_1 x_2 x_3 x_4$. Since this coefficient is what you want to equal $1$, your first linear equation is $96 b - 24 a -144 c =1$. Repeat for other combinations of the $x_i$ until you have enough independent linear equations to solve for $a,b,c$; e.g. for $x_i^4$ you get $-4b-a-6c=0$.
  • Solve the resulting linear system for $a,b,c$ (a short sympy sketch of the procedure follows this list).
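A minimal sketch of this procedure in sympy (my own illustration, not the answerer's actual computation; the grouping by number of minus signs and the names `a, b, c` are my choices, so the resulting fractions are attached to slightly different groups than in the identity above):

```python
# Coefficient comparison for k = 4: symmetric ansatz with three unknowns a, b, c,
# one per class of sign patterns (grouped by the number of minus signs).
from itertools import product
import sympy as sp

x = sp.symbols('x1:5')                      # x1, x2, x3, x4
a, b, c = sp.symbols('a b c')

groups = {m: [] for m in range(5)}          # m = number of minus signs in the pattern
for signs in product((1, -1), repeat=4):
    groups[signs.count(-1)].append(sum(s * xi for s, xi in zip(signs, x))**4)

# Fully symmetric ansatz: a for 0 or 4 minus signs, b for 2, c for 1 or 3.
ansatz = a * sum(groups[0] + groups[4]) + b * sum(groups[2]) + c * sum(groups[1] + groups[3])

# Compare coefficients of the expanded ansatz with x1*x2*x3*x4 and solve the linear system.
eqs = sp.Poly(sp.expand(ansatz - x[0] * x[1] * x[2] * x[3]), *x).coeffs()
print(sp.solve(eqs, [a, b, c], dict=True))  # expected: [{a: 1/384, b: 1/384, c: -1/384}]
```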

With some combinatorics it should be possible to work out the general rule.


Granular bastard's own answer:

$${\prod_{i=1}^k x_i =\frac{1}{(2k)!!} \sum_{(s_1, s_2, \dots, s_k) \in \{-1,1\}^k}\left[\left(\prod_{i=1}^k s_i \right)\left(\sum_{i=1}^k {s_i x_i}\right)^k\right]},$$ where $\{-1, 1\}^k := \Big\{(s_1, s_2, \dots, s_k) \mid s_i \in \{-1, 1\} \text{ for all positive integers } i \leq k \Big\}$ is the set of all $2^k$ tuples of length $k$ whose entries are either $-1$ or $1$. So, for example, the set $\{-1,1\}^2$ is $\{(-1,-1),(-1,1),(1,-1),(1,1)\}$.
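For example, for $k=2$ the right-hand side is
$$\frac{1}{8}\left[(x_1+x_2)^2-(-x_1+x_2)^2-(x_1-x_2)^2+(-x_1-x_2)^2\right]=\frac{1}{8}\left[2(x_1+x_2)^2-2(x_1-x_2)^2\right]=x_1x_2 ,$$
with the signs $+,-,-,+$ coming from $s_1 s_2$ for the four tuples $(1,1),(-1,1),(1,-1),(-1,-1)$.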


Sketch of a proof

Some lingo:

  • linear combination: $\pm x_1\pm x_2\pm\dots\pm x_{k-1}\pm x_k$
  • completely mixed monomial: $x_1 x_2 \cdots x_k$
  • an incompletely mixed monomial is any monomial that is not completely mixed, for example $x_1^k$, $x_1 x_2^{k-1}$, $x_2^2 x_3 \cdots x_k$, etc.
  1. The number of different linear combinations due to the sign permutations is $p=2^k$. The number of completely mixed monomials $x_1 x_2 \cdots x_k$ obtained when expanding a single linear combination $(\pm x_1 \pm x_2\pm \dots \pm x_k)^k$ is $k!$. Therefore, the completely mixed monomial occurs $k!\times 2^k=(2k)!!$ times when the powers of all linear combinations are expanded completely. Due to the construction of the above expression with $(-1)^l=\prod_{i=1}^k s_i$, where $l$ is the number of minus signs in each bracket, every completely mixed monomial $x_1 x_2 \cdots x_k$ appears with expansion coefficient $+1$. Therefore, the completely mixed monomials sum up to $(2k)!!\, x_1 x_2 \cdots x_k$.

  2. In each incompletely mixed monomial at least one $x_j$ is missing. For each such missing $x_j$ we can split the $p=2^k$ signed terms into $2^{k-1}$ pairs that differ only in the sign of $x_j$, namely $+(\pm x_1\dots + x_j \dots\pm x_k)^k$ and $-(\pm x_1 \dots- x_j \dots \pm x_k)^k$; the sign in front flips because $\prod_{i=1}^k s_i$ changes sign with $s_j$. When the powers are expanded, the monomials that lack $x_j$ are identical in both members of such a pair, so they cancel each other out.

Therefore, only the completely mixed monomials are not cancelled out, which leads to the above expression.


Writing the terms symmetrically, for $k=1,2,3,4$:

$$\begin{align} &(2\times 1)\text{!!}\prod_{i=1}^1 x_i=2x_1=\\ &+\left(+x_1\right)^1\\ &-\left(-x_1\right)^1\\ \\ &(2\times 2)\text{!!}\prod_{i=1}^2 x_i=8x_1x_2=\\ &+\left(+x_1+x_2\right)^2+\left(-x_1-x_2\right)^2\\ &-\left(+x_1-x_2\right)^2-\left(-x_1+x_2\right)^2\\ \\ &(2\times 3)\text{!!}\prod_{i=1}^3 x_i=48x_1x_2x_3=\\ &+\left(+x_1+x_2+x_3\right)^3+\left(+x_1-x_2-x_3\right)^3+\left(-x_1+x_2-x_3\right)^3+\left(-x_1-x_2+x_3\right)^3\\ &-\left(-x_1-x_2-x_3\right)^3-\left(-x_1+x_2+x_3\right)^3-\left(+x_1-x_2+x_3\right)^3-\left(+x_1+x_2-x_3\right)^3\\ \\ &(2\times 4)\text{!!}\prod_{i=1}^4 x_i=384x_1x_2x_3x_4=\\ &+\left(+x_1+x_2+x_3+x_4\right)^4+\left(-x_1-x_2-x_3-x_4\right)^4+\left(+x_1+x_2-x_3-x_4\right)^4+\left(+x_1-x_2+x_3-x_4\right)^4 +\left(-x_1+x_2+x_3-x_4\right)^4+\left(+x_1-x_2-x_3+x_4\right)^4+\left(-x_1+x_2-x_3+x_4\right)^4+\left(-x_1-x_2+x_3+x_4\right)^4\\ &-\left(-x_1+x_2+x_3+x_4\right)^4-\left(+x_1-x_2+x_3+x_4\right)^4-\left(+x_1+x_2-x_3+x_4\right)^4-\left(+x_1+x_2+x_3-x_4\right)^4 -\left(+x_1-x_2-x_3-x_4\right)^4-\left(-x_1+x_2-x_3-x_4\right)^4-\left(-x_1-x_2+x_3-x_4\right)^4-\left(-x_1-x_2-x_3+x_4\right)^4 ,\end{align}$$

where $!!$ is the double factorial, we see that the $2^k$ summands run over all sign permutations of the $x_i$. The sign in front of a bracket is negative/positive if the number of negative signs inside the bracket is odd/even. Using a programming language, this construction rule can be verified for $1\le k\le28$.
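A minimal sketch of such a check in sympy (my own code, not the original program; it verifies the construction rule symbolically for small $k$, and larger $k$ quickly becomes expensive because the expansion has exponentially many terms):

```python
# Verify  (2k)!! * x1*...*xk  ==  sum over all sign tuples s of  prod(s) * (sum_i s_i*x_i)**k .
from itertools import product
import sympy as sp

def check(k):
    x = sp.symbols(f'x1:{k + 1}')             # x1, ..., xk
    rhs = sum(sp.prod(s) * sum(si * xi for si, xi in zip(s, x))**k
              for s in product((1, -1), repeat=k))
    return sp.expand(rhs - sp.factorial2(2 * k) * sp.prod(x)) == 0

print(all(check(k) for k in range(1, 7)))      # True
```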

Using the formalism described here we can write a conjecture for $k\in \mathbb{N}^+$: $$\color{blue}{\prod_{i=1}^k x_i =\frac{1}{(2k)!!} \sum_{(s_1, s_2, \dots, s_k) \in \{-1,1\}^k}\left[\left(\prod_{i=1}^k s_i \right)\left(\sum_{i=1}^k {s_i x_i}\right)^k\right]},$$ where $\{-1, 1\}^k := \Big\{(s_1, s_2, \dots, s_k) \mid s_i \in \{-1, 1\} \text{ for all positive integers } i \leq k \Big\}$ is the set of all $2^k$ tuples of length $k$ whose entries are either $-1$ or $1$. So, for example, the set $\{-1,1\}^2$ is $\{(-1,-1),(-1,1),(1,-1),(1,1)\}$.

It would be interesting to see a beautiful mathematical proof for $k\in \mathbb{N}^+$.


We can rewrite the last formula given by @granularbastard as
$$\begin{align}
(2m)!!\prod_{k=1}^m x_k
&=\sum_{(s_1,s_2,\dots,s_m)\in\{-1,1\}^m}\left(\prod_{k=1}^m s_k\right)\left(\sum_{k=1}^m s_k x_k\right)^m\\
&=\sum_{(s_1,s_2,\dots,s_m)\in\{-1,1\}^m}s_m\left(\prod_{k=1}^{m-1}s_k\right)\left(\sum_{k=1}^{m-1}s_k x_k+s_m x_m\right)^m\\
&=\sum_{(s_1,s_2,\dots,s_{m-1})\in\{-1,1\}^{m-1}}\left(\prod_{k=1}^{m-1}s_k\right)\left(\sum_{k=1}^{m-1}s_k x_k+x_m\right)^m
-\sum_{(s_1,s_2,\dots,s_{m-1})\in\{-1,1\}^{m-1}}\left(\prod_{k=1}^{m-1}s_k\right)\left(\sum_{k=1}^{m-1}s_k x_k-x_m\right)^m\\
&=\sum_{(s_1,s_2,\dots,s_{m-1})\in\{-1,1\}^{m-1}}\left(\prod_{k=1}^{m-1}s_k\right)\sum_{j=0}^m\binom{m}{j}x_m^j\left(\sum_{k=1}^{m-1}s_k x_k\right)^{m-j}
-\sum_{(s_1,s_2,\dots,s_{m-1})\in\{-1,1\}^{m-1}}\left(\prod_{k=1}^{m-1}s_k\right)\sum_{j=0}^m\binom{m}{j}(-1)^j x_m^j\left(\sum_{k=1}^{m-1}s_k x_k\right)^{m-j}\\
&=2\sum_{(s_1,s_2,\dots,s_{m-1})\in\{-1,1\}^{m-1}}\left(\prod_{k=1}^{m-1}s_k\right)\sum_{j=0}^m\binom{m}{2j+1}x_m^{2j+1}\left(\sum_{k=1}^{m-1}s_k x_k\right)^{m-1-2j}\\
&=2\sum_{j=0}^m\binom{m}{2j+1}x_m^{2j+1}\sum_{(s_1,s_2,\dots,s_{m-1})\in\{-1,1\}^{m-1}}\left(\prod_{k=1}^{m-1}s_k\right)\left(\sum_{k=1}^{m-1}s_k x_k\right)^{m-1-2j}
\end{align}$$

It remains to demonstrate that
$$\sum_{(s_1,s_2,\dots,s_{m-1})\in\{-1,1\}^{m-1}}\left(\prod_{k=1}^{m-1}s_k\right)\left(\sum_{k=1}^{m-1}s_k x_k\right)^q=0\qquad\text{for }0\le q<m-1,$$
which checks out numerically for the first few values of $m$.
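A small sympy sketch (again my own code) of that numerical check:

```python
# Check that sum over s in {-1,1}^(m-1) of prod(s) * (sum_i s_i*x_i)**q vanishes for 0 <= q < m-1.
from itertools import product
import sympy as sp

def signed_power_sum(m, q):
    x = sp.symbols(f'x1:{m}')                  # x1, ..., x_{m-1}
    return sp.expand(sum(sp.prod(s) * sum(si * xi for si, xi in zip(s, x))**q
                         for s in product((1, -1), repeat=m - 1)))

print(all(signed_power_sum(m, q) == 0
          for m in range(2, 8) for q in range(m - 1)))   # True
```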

Given that, we have the recursion step
$$(2m)!!\prod_{k=1}^m x_k
=2m\,x_m\sum_{(s_1,s_2,\dots,s_{m-1})\in\{-1,1\}^{m-1}}\left(\prod_{k=1}^{m-1}s_k\right)\left(\sum_{k=1}^{m-1}s_k x_k\right)^{m-1}
=2m\,x_m\,\bigl(2(m-1)\bigr)!!\prod_{k=1}^{m-1}x_k ,$$
which is consistent since $2m\,\bigl(2(m-1)\bigr)!!=(2m)!!$.

Indeed, taking the derivative w.r.t. $x_m$ of both sides,
$$\begin{align}
\frac{\partial}{\partial x_m}(2m)!!\prod_{k=1}^m x_k
&=(2m)!!\prod_{k=1}^{m-1}x_k\\
&=\frac{\partial}{\partial x_m}\sum_{(s_1,s_2,\dots,s_m)\in\{-1,1\}^m}\left(\prod_{k=1}^m s_k\right)\left(\sum_{k=1}^m s_k x_k\right)^m\\
&=m\sum_{(s_1,s_2,\dots,s_m)\in\{-1,1\}^m}\left(s_m^2\prod_{k=1}^{m-1}s_k\right)\left(\sum_{k=1}^m s_k x_k\right)^{m-1}\\
&=m\sum_{(s_1,s_2,\dots,s_m)\in\{-1,1\}^m}\left(\prod_{k=1}^{m-1}s_k\right)\left(\sum_{k=1}^m s_k x_k\right)^{m-1}\\
&=\frac{\partial}{\partial x_m}\,2\sum_{j=0}^m\binom{m}{2j+1}x_m^{2j+1}\sum_{(s_1,s_2,\dots,s_{m-1})\in\{-1,1\}^{m-1}}\left(\prod_{k=1}^{m-1}s_k\right)\left(\sum_{k=1}^{m-1}s_k x_k\right)^{m-1-2j}\\
&=2\sum_{j=0}^m(2j+1)\binom{m}{2j+1}x_m^{2j}\sum_{(s_1,s_2,\dots,s_{m-1})\in\{-1,1\}^{m-1}}\left(\prod_{k=1}^{m-1}s_k\right)\left(\sum_{k=1}^{m-1}s_k x_k\right)^{m-1-2j}
\end{align}$$
$$\Rightarrow\quad
\begin{cases}
\dfrac{(2m)!!}{m}\prod_{k=1}^{m-1}x_k=\displaystyle\sum_{(s_1,s_2,\dots,s_m)\in\{-1,1\}^m}\left(\prod_{k=1}^{m-1}s_k\right)\left(\sum_{k=1}^m s_k x_k\right)^{m-1}\\[2ex]
\displaystyle\sum_{(s_1,s_2,\dots,s_{m-1})\in\{-1,1\}^{m-1}}\left(\prod_{k=1}^{m-1}s_k\right)\left(\sum_{k=1}^{m-1}s_k x_k\right)^{m-1-2j}=0\quad\text{for }0<j
\end{cases}$$

Thus the combined hypothesis
$$\left((2m)!!\prod_{k=1}^m x_k=\sum_{(s_1,s_2,\dots,s_m)\in\{-1,1\}^m}\left(\prod_{k=1}^m s_k\right)\left(\sum_{k=1}^m s_k x_k\right)^m\right)
\;\wedge\;
\left(\left[0<j\le\frac{m-1}{2}\right]\sum_{(s_1,s_2,\dots,s_{m-1})\in\{-1,1\}^{m-1}}\left(\prod_{k=1}^{m-1}s_k\right)\left(\sum_{k=1}^{m-1}s_k x_k\right)^{m-1-2j}=0\right)$$
(the square bracket denotes the Iverson bracket), which holds for $m=1,2,3$, is then demonstrated by induction.

G Cab