I'm trying to figure out whether there is a way to upper bound the expectation of a product of positive scalar random variables, $\rm I\kern-.3em E P_n = \rm I\kern-.3em E\prod_{i=1}^n X_i$, that are somewhat correlated, for example $Cov (\log X_i, \log X_j) = c^{|i-j|}$ with $c<1$, and that all have the same mean $\rm I\kern-.3em E X_i = \mu$, but I'm stuck. If the random variables are independent it's easy, since the expectation factors: $\rm I\kern-.3em E P_n = \prod_{i=1}^n \rm I\kern-.3em E X_i = \mu^n$. I would like the upper bound to be in terms of $\mu$ and $c$.
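One concrete way to simulate such a situation is to take the $X_i$ lognormal, i.e. $\log X_i$ jointly Gaussian with exactly the covariance above, and shift the Gaussian mean so that every $X_i$ has mean $\mu$. This is only a sketch of one model consistent with the setup; the values of `n`, `c`, `mu` are arbitrary illustrations:

```python
import numpy as np

# One concrete model matching the setup: log X_i jointly Gaussian with
# Cov(log X_i, log X_j) = c^{|i-j|} (so Var(log X_i) = 1), with the
# Gaussian mean m chosen so that E X_i = mu for every i
# (lognormal mean: E exp(G) = exp(m + Var(G)/2)).
# n, c, mu are arbitrary illustrative values.
n, c, mu = 5, 0.6, 1.2
idx = np.arange(n)
cov = c ** np.abs(idx[:, None] - idx[None, :])
m = np.log(mu) - 0.5  # ensures E exp(G_i) = mu, since Var G_i = 1

rng = np.random.default_rng(0)
G = rng.multivariate_normal(np.full(n, m), cov, size=200_000)
X = np.exp(G)

print("sample E X_i    :", X.mean(axis=0))             # each close to mu
print("sample Cov(logX):", np.round(np.cov(G.T), 2))   # close to c^{|i-j|}
# The sample mean of the product is heavy-tailed here, so compare against
# the exact lognormal value E prod X_i = exp(n*m + 0.5 * sum(cov)) instead.
print("exact E prod X_i:", np.exp(n * m + 0.5 * cov.sum()))
print("mu^n            :", mu ** n)
```

Note how strongly the positive correlations inflate $\rm I\kern-.3em E \prod X_i$ above $\mu^n$ in this model, which is why a bound depending on $c$ is needed.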
-
Do you have an idea for the case $n=2$, for example? Is it possible to simulate such a situation with a computer program? – Jean Marie Apr 28 '24 at 12:53
-
It might be possible, but I'm not sure I'd learn much from a simulation with $n=2$. I took this question as inspiration: https://math.stackexchange.com/questions/245327/weak-law-of-large-numbers-for-dependent-random-variables-with-bounded-covariance – Luca Herrtti Apr 28 '24 at 14:49
1 Answer
$\newcommand{\E}{{\rm I\kern-.3em E}}$
I found a solution that is satisfactory to me, so I'm posting it here in case it is useful to somebody else; I'd also like to see whether somebody finds something more general. Define $Y_i$ by $X_i = \mu + Y_i$, so that $\E Y_i = 0$. I use two conditions. The first, which I call Decaying Covariance, is $|Cov (Y_i, Y_j)| \leq \hat{V}c^{|i-j|}$, where $\hat{V}$ is an upper bound on the variance and $c<1$. The second, which I call Covariance Dominance, says that every higher-order interaction is smaller than the pairwise one, e.g. $|\E\sum_{k<j<i} Y_iY_jY_k|\leq |\E\sum_{j<i} Y_iY_j|$. This may not be fully general, but I think it holds often, and it is an easy scenario to understand, so the final result is still useful. Let us first apply the definition of $Y_i$ and Covariance Dominance:
\begin{align} \E\prod X_i =&\E\prod (\mu + Y_i) \\ =&\mu ^n + \mu^{n-1}\Big(\E\sum Y_i\Big) + \mu^{n-2}\Big(\E\sum_{j<i} Y_iY_j\Big) \\ & + \mu^{n-3}\Big(\E\sum_{k<j<i} Y_iY_jY_k\Big) +\cdots \\ \leq&\mu ^n + \mu^{n-2}\Big|\E\sum_{j<i} Y_iY_j\Big| + \mu^{n-3}\Big|\E\sum_{k<j<i} Y_iY_jY_k\Big| +\cdots \\ \leq&\mu ^n + \mu^{n-2}\Big|\E\sum_{j<i} Y_iY_j\Big| + \mu^{n-3}\Big|\E\sum_{j<i} Y_iY_j\Big| +\cdots \\ =&\mu ^n + \Big(\sum_{k=0}^{n-2}\mu^{k}\Big)\Big|\E\sum_{j<i} Y_iY_j\Big| \\ =&\mu ^n + \chi\Big|\E\sum_{j<i} Y_iY_j\Big| \\ \chi =& \begin{cases} n-1 & \text{if } \mu = 1\\ \frac{1-\mu^{n-1}}{1-\mu} & \text{otherwise} \end{cases} \end{align}
Here the first inequality uses $\E Y_i = 0$ to drop the linear term, and the second uses Covariance Dominance. The interaction orders $2$ through $n$ contribute coefficients $\mu^{n-2}$ down to $\mu^{0}$, so $\chi = \sum_{k=0}^{n-2}\mu^k$ is a sum of $n-1$ terms.
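A quick numerical sanity check of the closed form for $\chi = \sum_{k=0}^{n-2}\mu^k$ (one term per interaction order from $2$ to $n$), with arbitrary test values:

```python
# Sanity check: chi = sum_{k=0}^{n-2} mu^k equals (1 - mu^(n-1)) / (1 - mu)
# for mu != 1, and n - 1 for mu = 1 (arbitrary test values below).
def chi(n, mu):
    return n - 1 if mu == 1 else (1 - mu ** (n - 1)) / (1 - mu)

for n in (2, 3, 7):
    for mu in (0.5, 1, 1.3):
        brute = sum(mu ** k for k in range(n - 1))
        assert abs(brute - chi(n, mu)) < 1e-12
print("chi closed form verified")
```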
Therefore, applying the Decaying Covariance condition, we have:
\begin{align} \E\prod X_i \leq&\mu ^n + \chi\Big|\E\sum_{j<i} Y_iY_j\Big| \\ \leq&\mu ^n + \chi\sum_{i=1}^n\sum_{j=1}^{i-1} |\E Y_iY_j| \\ \leq&\mu ^n + \chi \hat{V}\sum_{i=1}^n\sum_{j=1}^{i-1} c^{i-j} \\ =&\mu ^n + \chi \hat{V}\,\frac{c}{1-c}\Big(n-\frac{1-c^n}{1-c} \Big) \end{align}
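The final equality rests on the identity $\sum_{i=1}^n\sum_{j=1}^{i-1} c^{i-j} = \frac{c}{1-c}\big(n-\frac{1-c^n}{1-c}\big)$, which can be checked numerically against the brute-force double sum (arbitrary values of $n$ and $c$):

```python
# Check the closed form
#   sum_{i=1}^n sum_{j=1}^{i-1} c^(i-j) = c/(1-c) * (n - (1-c^n)/(1-c))
# against the brute-force double sum, for a few arbitrary n and c < 1.
for n in (2, 5, 20):
    for c in (0.1, 0.5, 0.9):
        brute = sum(c ** (i - j) for i in range(1, n + 1) for j in range(1, i))
        closed = c / (1 - c) * (n - (1 - c ** n) / (1 - c))
        assert abs(brute - closed) < 1e-10
print("geometric double-sum closed form verified")
```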
where you obtain the last line by applying the sum formula for a geometric progression a couple of times, plus some algebra; this essentially fixes some mistakes in the proof that can be found here: Weak Law of Large Numbers for Dependent Random Variables with Bounded Covariance.
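To see the bound in action, here is a hedged end-to-end sketch on one concrete model: I take the $Y_i$ jointly Gaussian with covariance exactly $\hat{V}c^{|i-j|}$, with $\hat{V}$ small enough that $X_i = \mu + Y_i$ is positive with overwhelming probability (for small $\hat{V}$ the higher-order Gaussian moments are dominated by the pairwise ones, so Covariance Dominance is plausible here); all numeric values are arbitrary illustrations.

```python
import numpy as np

# End-to-end check of the bound
#   E prod X_i  <=  mu^n + chi * V_hat * c/(1-c) * (n - (1-c^n)/(1-c))
# on one concrete model: Y jointly Gaussian, Cov(Y_i, Y_j) = V_hat * c^|i-j|.
# V_hat is small so X_i = mu + Y_i is positive with overwhelming probability;
# all numbers are arbitrary illustrative choices.
n, mu, c, V_hat = 5, 1.2, 0.5, 0.04

idx = np.arange(n)
cov = V_hat * c ** np.abs(idx[:, None] - idx[None, :])
rng = np.random.default_rng(1)
Y = rng.multivariate_normal(np.zeros(n), cov, size=400_000)

sample_mean = np.prod(mu + Y, axis=1).mean()
chi = (1 - mu ** (n - 1)) / (1 - mu)  # mu != 1 branch
bound = mu ** n + chi * V_hat * c / (1 - c) * (n - (1 - c ** n) / (1 - c))

print(f"sample E prod X_i = {sample_mean:.4f}")
print(f"upper bound       = {bound:.4f}")
```

In this run the sample mean lands between $\mu^n$ and the bound, as expected: the positive correlations push $\E\prod X_i$ above $\mu^n$, and the $\chi\hat{V}$ term absorbs the excess.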