
Information:

a-) $X$ and $Y$ are two continuous random variables on $\mathbb{R}$ having continuous distribution functions $F$ and $G$ with $G(y)\geq F(y)$ for all $y$.

b-) $S^X_n=\sum_{i=1}^n X_i$, $S^Y_n=\sum_{i=1}^n Y_i$, $A>0$, and $B<0$, where $X_i$ and $Y_i$ are i.i.d. copies of $X$ and $Y$ respectively.

c-) $E[X]<0$ and $E[Y]<0$.

What I know:

By coupling (since $G\geq F$), I know that there exists a pair of random variables $(X^{'},Y^{'})$ such that $X^{'}$ has the same distribution as $X$, $Y^{'}$ has the same distribution as $Y$, and $X^{'}\geq Y^{'}$ almost surely. Applying this coupling coordinatewise to i.i.d. pairs $(X^{'}_i,Y^{'}_i)$, I also have $S_n^{X^{'}}=\sum_{i=1}^n X^{'}_i\geq \sum_{i=1}^n Y^{'}_i=S_n^{Y^{'}}$. Since this holds for all $n$, I am able to compare the following stopping times:

$$\tau_A^{X^{'}}=\inf\{n\geq 0:S_n^{X^{'}}\geq A\}$$ $$\tau_A^{Y^{'}}=\inf\{n\geq 0:S_n^{Y^{'}}\geq A\}$$ $$\tau_B^{X^{'}}=\inf\{n\geq 0:S_n^{X^{'}}\leq B\}$$ $$\tau_B^{Y^{'}}=\inf\{n\geq 0:S_n^{Y^{'}}\leq B\}$$

with $\tau_A^{X^{'}}\leq \tau_A^{Y^{'}}$, since $S_n^{Y^{'}}\geq A$ implies $S_n^{X^{'}}\geq A$, and similarly $\tau_B^{X^{'}}\geq \tau_B^{Y^{'}}$, since $S_n^{X^{'}}\leq B$ implies $S_n^{Y^{'}}\leq B$.
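For concreteness, here is a small numerical sketch of the quantile coupling $X^{'}=F^{-1}(U)$, $Y^{'}=G^{-1}(U)$ with a common uniform $U$ (the normal distributions below are placeholder choices, not part of the question):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Placeholder distributions with G(y) >= F(y) for all y (same scale, lower mean for Y)
# and E[X] < 0, E[Y] < 0.
F = norm(loc=-0.25, scale=1.0)   # distribution function of X
G = norm(loc=-0.50, scale=1.0)   # distribution function of Y

# Quantile coupling: a single uniform U drives both variables.
U = rng.uniform(size=100_000)
X_coupled = F.ppf(U)             # X' = F^{-1}(U), distributed according to F
Y_coupled = G.ppf(U)             # Y' = G^{-1}(U), distributed according to G

# G >= F pointwise implies G^{-1} <= F^{-1}, hence X' >= Y' almost surely.
print("fraction of samples with X' >= Y':", np.mean(X_coupled >= Y_coupled))  # 1.0
print("sample means:", X_coupled.mean(), Y_coupled.mean())  # approx -0.25 and -0.50
```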

Claim-$1$:

$$E[\min\{\tau_A^{X^{'}},\tau_B^{X^{'}}\}]\geq E[\min\{\tau_A^{Y^{'}},\tau_B^{Y^{'}}\}]$$

holds for any $(A,B)$ and $(X,Y)$.

Claim-$2$:

$$E[\min\{\tau_A^{X^{'}},\tau_B^{X^{'}}\}]\geq E[\min\{\tau_A^{Y^{'}},\tau_B^{Y^{'}}\}]$$

holds for any $(A,B)$ and $(X,Y)$, if additionally $\partial F/\partial G$ is increasing.

  • I guess this seems like a very basic question, but I might also be mistaken. – Seyhmus Güngören May 02 '14 at 16:18
  • Unless I am missing something obvious, there is no difference between Claim-1 and Claim-2, other than the fact that Claim-2 has an additional hypothesis. – Ian May 04 '14 at 01:41
  • @Ian yes that is true. It is however possible that there might be a counterexample for Claim $1$ and not for Claim $2$. – Seyhmus Güngören May 04 '14 at 01:53
  • Note: you can find $X'$, $Y'$ such that $X'$ has the same distribution as $X$ and $Y'$ has the same distribution as $Y$, but you can't say that $(X', Y')$ has the same joint distribution as $(X, Y)$. – Thomas May 06 '14 at 12:32
  • @Thomas thank you very much for the comment. You are right. $(X,Y)=(X^{'},Y^{'})$ is not intended to imply equality of joint distributions; it was just to reduce the abuse of notation. I will edit it in a minute. – Seyhmus Güngören May 06 '14 at 13:00

2 Answers


I think that Claim 1 is not true in general. The following counterexample uses discrete-valued variables.

Take $X=\mathrm{Bernoulli}(\frac{1}{2})-\frac{5}{8}$ and $Y=\mathrm{Bernoulli}(\frac{1}{2})-\frac{7}{8}$. Then $E[X]=-\frac{1}{8}$ and $E[Y]=-\frac{3}{8}$. Take $A=\frac{1}{4}$ and $B=-\frac{1}{2}$. Since $X_1\in\{-\frac{5}{8},\frac{3}{8}\}$, the $X$-walk leaves $(B,A)$ on its first step, so $\min\{\tau_A^{X^{'}},\tau_B^{X^{'}}\}=1$; on the other hand, $P[\min\{\tau_A^{Y^{'}},\tau_B^{Y^{'}}\}>1]\geq\frac{1}{2}$, because the $Y$-walk stays inside $(B,A)$ whenever its first step is $+\frac{1}{8}$. Hence $E[\min\{\tau_A^{X^{'}},\tau_B^{X^{'}}\}]=1$ while $E[\min\{\tau_A^{Y^{'}},\tau_B^{Y^{'}}\}]\geq \frac{3}{2}$, which contradicts Claim 1.
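A quick Monte Carlo sanity check of this example (a sketch added for illustration; the helper name `expected_min_tau` and the Monte Carlo parameters are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)

def expected_min_tau(step_sampler, A, B, n_paths=100_000, max_steps=1_000):
    """Estimate E[min(tau_A, tau_B)] for S_n = X_1 + ... + X_n with i.i.d. steps.

    tau_A = inf{n >= 1 : S_n >= A}, tau_B = inf{n >= 1 : S_n <= B};
    each path is truncated at max_steps (irrelevant here, the walk exits fast).
    """
    total = 0.0
    for _ in range(n_paths):
        s, n = 0.0, 0
        while n < max_steps:
            s += step_sampler()
            n += 1
            if s >= A or s <= B:
                break
        total += n
    return total / n_paths

A, B = 0.25, -0.5
step_X = lambda: rng.integers(0, 2) - 5 / 8   # Bernoulli(1/2) - 5/8
step_Y = lambda: rng.integers(0, 2) - 7 / 8   # Bernoulli(1/2) - 7/8

print("X walk:", expected_min_tau(step_X, A, B))  # exactly 1: the walk exits at step 1
print("Y walk:", expected_min_tau(step_Y, A, B))  # about 1.5, matching the bound above
```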

But a continuous counterexample can be constructed by approximation. Let $Z$ be a Gaussian variable with mean $0$ and variance $1$. For $\epsilon>0$, define $X^{(\epsilon)}=X+\epsilon Z$ and $Y^{(\epsilon)}=Y+\epsilon Z$. Then $X^{(\epsilon)}$ and $Y^{(\epsilon)}$ have continuous distributions with negative means and (writing $F_\epsilon,G_\epsilon$ for their distribution functions) satisfy $G_\epsilon\geq F_\epsilon$, so they meet the conditions of the question. We will show that for $A=\frac{1}{4}$ and $B=-\frac{1}{2}$, Claim 1 is violated for small enough $\epsilon$. Roughly speaking, the small perturbation $\epsilon Z$ increases $E[\min\{\tau_A^{X^{(\epsilon)}},\tau_B^{X^{(\epsilon)}}\}]$ only a little, and it cannot decrease $E[\min\{\tau_A^{Y^{(\epsilon)}},\tau_B^{Y^{(\epsilon)}}\}]$ by much.

To be more precise, let $(X_i)_i$ be i.i.d. copies of $X$ and $(Z_i)_i$ be i.i.d. copies of $Z$, and denote by $S^{X^{(\epsilon)}}_k$ the partial sum of $X_i^{(\epsilon)}=X_i+\epsilon Z_i$. Then
$$P[S^{X^{(\epsilon)}}_k\geq B]\leq P\left[X_1+\cdots+X_k-kE[X]\geq \frac{k}{16}\right]+P\left[\epsilon(Z_1+\cdots+Z_k)+kE[X]\geq -\frac{k}{16}-\frac{1}{2}\right].$$
Recall that $E[X]=-\frac{1}{8}$. By Hoeffding's inequality,
$$P\left[X_1+\cdots+X_k-kE[X]\geq \frac{k}{16}\right]\leq \exp\left(-\frac{k}{128}\right).$$
Since $\epsilon(Z_1+\cdots+Z_k)$ is a mean-zero Gaussian variable with variance $k\epsilon^2$,
$$P\left[\epsilon(Z_1+\cdots+Z_k)+kE[X]\geq -\frac{k}{16}-\frac{1}{2}\right]\leq P\left[\mathcal{N}(0,1)\geq \frac{k-8}{16\sqrt{k}\,\epsilon}\right].$$

Hence there exists a universal constant $C$ such that $\sup\limits_{0<\epsilon\leq 1}E[\tau^{X^{(\epsilon)}}_{B},\,\tau^{X^{(\epsilon)}}_{B}>k]\leq \frac{C}{k}$. (Note that $P[\tau^{X^{(\epsilon)}}_{B}\geq k+1]\leq P[S^{X^{(\epsilon)}}_k\geq B]$.) In other words, the tail of $\min(\tau^{X^{(\epsilon)}}_{B},\tau^{X^{(\epsilon)}}_{A})$ is uniformly small. Pick $k=K_0$ large enough (uniformly in $0<\epsilon\leq 1$) such that
$$E[\min(\tau^{X^{(\epsilon)}}_{B},\tau^{X^{(\epsilon)}}_{A}),\,\min(\tau^{X^{(\epsilon)}}_{B},\tau^{X^{(\epsilon)}}_{A})\geq K_0]\leq 0.01.$$

On the other hand, the walk fails to exit at the first step only if $|\epsilon Z_1|>\frac{1}{8}$, since $X_1+\epsilon Z_1$ must land in $(B,A)=(-\frac{1}{2},\frac{1}{4})$ while $X_1\in\{-\frac{5}{8},\frac{3}{8}\}$. Therefore, for $i=2,\ldots,K_0-1$,
$$P[\min(\tau^{X^{(\epsilon)}}_{B},\tau^{X^{(\epsilon)}}_{A})=i]\leq P[\min(\tau^{X^{(\epsilon)}}_{B},\tau^{X^{(\epsilon)}}_{A})>1]\leq P[\epsilon |Z_1|>1/8],$$
and thus
$$E[\min(\tau^{X^{(\epsilon)}}_{B},\tau^{X^{(\epsilon)}}_{A}),\,1<\min(\tau^{X^{(\epsilon)}}_{B},\tau^{X^{(\epsilon)}}_{A})<K_0]\leq (K_0)^2\,P[\epsilon |Z_1|>1/8].$$
We can therefore pick a small enough $\epsilon$ such that
$$E[\min(\tau^{X^{(\epsilon)}}_{B},\tau^{X^{(\epsilon)}}_{A})]\leq 1+0.02.$$
By a similar (and simpler) argument,
$$E[\min(\tau^{Y^{(\epsilon)}}_{B},\tau^{Y^{(\epsilon)}}_{A})]\geq 1.5-0.01,$$
so Claim 1 fails for small enough $\epsilon$.
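A rough numerical illustration of the smoothing step (again only a sketch; $\epsilon=0.01$ and the Monte Carlo parameters are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
A, B, eps = 0.25, -0.5, 0.01

def expected_min_tau_smoothed(shift, n_paths=100_000, max_steps=10_000):
    """E[min(tau_A, tau_B)] for the walk with i.i.d. steps Bernoulli(1/2) + shift + eps*Z."""
    total = 0.0
    for _ in range(n_paths):
        s, n = 0.0, 0
        while n < max_steps:
            s += rng.integers(0, 2) + shift + eps * rng.standard_normal()
            n += 1
            if s >= A or s <= B:
                break
        total += n
    return total / n_paths

# X^(eps) has steps Bernoulli(1/2) - 5/8 + eps*Z; Y^(eps) has steps Bernoulli(1/2) - 7/8 + eps*Z.
print("X^(eps):", expected_min_tau_smoothed(-5 / 8))  # stays close to 1
print("Y^(eps):", expected_min_tau_smoothed(-7 / 8))  # stays close to 1.5
```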

Guest
  • I am aware of some discrete constructions, even in the same sample space. I would be happy to see a counterexample, if there is one, on $\mathbb{R}$ where both $X$ and $Y$ have densities that are non-zero everywhere (with the continuity conditions as given in the question). If I translate this answer to the real numbers by replacing the Bernoulli with a Gaussian, I see that Claim 1 still holds, for example. But perhaps Gaussian mixtures with small variances could work, being analogous to this discrete example. – Seyhmus Güngören May 07 '14 at 17:11
  • I added some explanation for continuous variables with non-zero densities. But I don't know whether Claim 2 is correct or not; it seems to be beyond my ability. – Guest May 07 '14 at 20:59

Claim $2$ is also incorrect. I found a counterexample for that.

EDIT: I started from the Bernoulli r.v.s suggested by Guest. Since they are discrete, I took Gaussian mixtures imitating the mean-shifted Bernoulli r.v.s.

Then, step by step, I decreased the variances of the densities (both components' variances in the mixture model). At some point the difference $E[\min\{\tau_A^{X^{'}},\tau_B^{X^{'}}\}]-E[\min\{\tau_A^{Y^{'}},\tau_B^{Y^{'}}\}]$ became negative, which showed that Claim $1$ no longer held. My code searched over $A\times B$ with $A\in[0,4]$ and $B\in[-4,0]$ in steps of $0.01$.

After finding a counterexample on the real numbers for Claim $1$, I tried to additionally satisfy the condition that $\partial F/\partial G$ is increasing (the extra hypothesis of Claim $2$). This is possible if the variances of the Gaussian densities are the same, so I made them equal to each other.
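For the single-Gaussian densities used in the final example below, equal variances do make the ratio increasing: writing $f$ and $g$ for the densities of $F$ and $G$, with common variance $\sigma^2$ and means $\mu_F>\mu_G$,

$$\frac{f(x)}{g(x)}=\exp\!\left(\frac{-(x-\mu_F)^2+(x-\mu_G)^2}{2\sigma^2}\right)=\exp\!\left(\frac{(\mu_F-\mu_G)\,x}{\sigma^2}+\frac{\mu_G^2-\mu_F^2}{2\sigma^2}\right),$$

which is increasing in $x$ since $\mu_F>\mu_G$.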

In the last step, the only parameter left to vary was the common variance, so I changed it and ran the same search again. Here are the final densities that give a counterexample:

$$G\sim\mathcal{N}(-0.5,\,2.27)$$ and $$F\sim\mathcal{N}(-0.25,\,2.27)$$ with $A=0.3$ and $B=-4$. The counterexample still holds if $B$ is made more negative, $A$ is taken even closer to $0$, or the variance is increased.

Here are the results:

$$E[\min\{\tau_A^{X^{'}},\tau_B^{X^{'}}\}]=3.4350$$

$$E[\min\{\tau_A^{Y^{'}},\tau_B^{Y^{'}}\}]=3.5346$$

So I have $$E[\min\{\tau_A^{X^{'}},\tau_B^{X^{'}}\}]<E[\min\{\tau_A^{Y^{'}},\tau_B^{Y^{'}}\}]$$

If I change the variance to $4.27$ for both distributions, then I get

$$E[\min\{\tau_A^{X^{'}},\tau_B^{X^{'}}\}]=2.5484$$

$$E[\min\{\tau_A^{Y^{'}},\tau_B^{Y^{'}}\}]=2.6154$$

All counterexamples seem to violate the claim only by a small margin. I used MATLAB with $10^6$ simulation points, so I am very confident in the results.
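For reference, here is a minimal Python sketch of the same kind of Monte Carlo estimate (the original computation was done in MATLAB; here $2.27$ is taken as the variance, as in the text, and $10^5$ paths are used to keep it quick):

```python
import numpy as np

rng = np.random.default_rng(3)

def expected_min_tau_gaussian(mean, var, A, B, n_paths=100_000, max_steps=10_000):
    """Monte Carlo estimate of E[min(tau_A, tau_B)] for a Gaussian random walk."""
    sd = np.sqrt(var)
    total = 0.0
    for _ in range(n_paths):
        s, n = 0.0, 0
        while n < max_steps:
            s += mean + sd * rng.standard_normal()
            n += 1
            if s >= A or s <= B:
                break
        total += n
    return total / n_paths

A, B = 0.3, -4.0
print("F ~ N(-0.25, 2.27):", expected_min_tau_gaussian(-0.25, 2.27, A, B))  # reported above: 3.4350
print("G ~ N(-0.50, 2.27):", expected_min_tau_gaussian(-0.50, 2.27, A, B))  # reported above: 3.5346
```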