
I tried to prove the statement below.

I think there are a lot of errors in my proof, because I am a newbie.

I was guided by the web page Weak Law of Large Numbers for Dependent Random Variables with Bounded Covariance.

I am stuck at the last line. Could you give me a little help?

EDIT: I have now written out the proof in my own way. Is it right?

Thank you very much!


[Assumption]

$X(t)$ is a Gaussian process with stationary increments, but not independent increments.

Therefore $Y(n)=X(t_{n+1})-X(t_{n})$, for $|t_{n+1}-t_{n}|=c$, is identically distributed but dependent.

For $Y(n)$, $0<\rho_{Y_{t}Y_{k}}<1$ for $0<t<k$, and $\lim_{|t-k|\to\infty}\rho_{Y_{t}Y_{k}}=0$,

and $Y(n)$ follows a Gaussian distribution, $Y \sim N(\mu, \sigma^2)$.


[Here is the statement I want to prove]

For the random variable $T(k)=\unicode{x1D7D9}_{\{Y_{k}\leq x\}}$,

by the weak law of large numbers,
$$ \lim_{n\to\infty}\frac{1}{n} \sum^{n}_{k=1}\unicode{x1D7D9}_{\{Y_{k}\leq x\}} = P(Y \leq x) \quad \text{(in probability).}$$


[Proof]

By Chebyshev's inequality, for any random variable $Z$ with finite variance, $P[|Z-E[Z]| \ge \delta] \le \frac{Var(Z)}{\delta^2}$.

For $\bar{T}=\frac{1}{n}\sum^n_{k=1}T_k$,

$E[\bar{T}]=P(Y_{k} \le x) = P(Y \le x) = p$, and $Var[\bar{T}]=\frac{Var[\sum_k T_k]}{n^2}=\frac{\sigma^2}{n^2}$, where $\sigma^2 := Var[\sum_k T_k]$ (not the variance of $Y$).

$$P[|\bar{T}_n-p| \ge \delta] \le \frac{\sigma^2}{n^2\delta^2} \quad \text{for any } \delta > 0.$$

$\sigma^{2}=\sum^n_{i=1}\sum^n_{j=1} Cov(T_{i},T_{j})$

$\,\,\,\,\,\,$ $=nc+2\sum^n_{i=1}\sum^n_{j=i+1} Cov(T_{i},T_{j}),$

$\,\,\,\,\,\,\,\,\,\,\,$ $\text{since } Cov(T_i,T_i)= Var(T_i) = p-p^2 =: c \text{ for every } i, \text{ where } c \text{ is a constant.}$
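As a quick sanity check (not part of the proof), the identity $Var(T_i) = p - p^2$ can be confirmed by simulation; taking $Y$ standard normal and $x = 0.5$ is purely an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices: Y ~ N(0, 1) and threshold x = 0.5.
x = 0.5
y = rng.standard_normal(1_000_000)
t = (y <= x).astype(float)  # the indicator T = 1{Y <= x}

p = t.mean()      # estimates P(Y <= x)
var_t = t.var()   # sample variance of T

# Since T is {0,1}-valued, Var(T) = E[T^2] - E[T]^2 = p - p^2.
print(p, var_t, p * (1 - p))
```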


For $0 \le t < k$, $Cov(T_t,T_k)=E[T_{t}T_{k}]-E[T_{t}]E[T_{k}]$

$=P(Y_{t} \le x,\, Y_{k} \le x) - P(Y_{t} \le x)P(Y_{k} \le x)$.


At this point, since the increments of a Gaussian process are jointly Gaussian, $\rho_{Y_{t}Y_{k}} \to 0$ as $|t-k|\to\infty$ means that $Y_{t}$ and $Y_{k}$ become independent in the limit (for jointly Gaussian variables, zero correlation implies independence).

Therefore, $Cov(T_t,T_k) \to 0$ as $|t-k|\to\infty$.
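This step can also be checked by Monte Carlo: for a bivariate standard normal pair (a simplified stand-in for $(Y_t, Y_k)$, with $\mu=0$, $\sigma=1$, and threshold $x=0$ all being illustrative assumptions), the covariance of the indicators shrinks toward $0$ as $\rho$ does:

```python
import numpy as np

rng = np.random.default_rng(1)
x = 0.0
n = 1_000_000

def indicator_cov(rho):
    """Monte Carlo estimate of Cov(1{Z1 <= x}, 1{Z2 <= x}) for a
    bivariate standard normal pair with correlation rho."""
    z1 = rng.standard_normal(n)
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)
    t1 = (z1 <= x).astype(float)
    t2 = (z2 <= x).astype(float)
    return np.mean(t1 * t2) - t1.mean() * t2.mean()

# The indicator covariance decreases toward 0 as rho does.
for rho in (0.9, 0.5, 0.1, 0.0):
    print(rho, indicator_cov(rho))
```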


Now fix $\delta > 0$ (for Chebyshev) and $\epsilon > 0$. Since $Cov(T_{i},T_{j}) \to 0$ as $|i-j| \to \infty$, we can take $N$ (depending only on $\epsilon$, not on $n$) such that $Cov(T_{i}, T_{j}) < \epsilon$ whenever $|i-j| > N$. Then, for $n > N$,

$$\sum^n_{i=1}\sum^n_{j=i+1} Cov(T_{i},T_{j})=\sum^n_{i=1}\sum^{\min(i+N,\,n)}_{j=i+1} Cov(T_{i},T_{j})+\sum^n_{i=1}\sum^{n}_{j=i+N+1} Cov(T_{i},T_{j})$$ $$ \le \sum^n_{i=1}Nc + \sum^n_{i=1}\sum^n_{j=i+N+1}\epsilon \le nNc+n^2\epsilon,$$ $$\text{using } Cov(T_i,T_j) \le \sqrt{Var(T_i)Var(T_j)} = c \text{ for the terms with } |i-j| \le N.$$


$$\therefore P[|\bar{T}_n-p| \ge \delta] \le \frac{\sigma^2}{n^2\delta^2} \le \frac{nc+2nNc+2n^2\epsilon }{n^2\delta^2} \longrightarrow \frac{2\epsilon}{\delta^2} \text{ as } n \to \infty. $$

Since $\epsilon > 0$ was arbitrary, $\lim_{n\to\infty} P[|\bar{T}_n-p| \ge \delta] = 0$ for every $\delta > 0$.


$$ \text{Therefore } \frac{1}{n}\sum^{n}_{k=1}\unicode{x1D7D9}_{\{Y_{k}\leq x\}} \longrightarrow P(Y \leq x) \text{ in probability as } n\to\infty. \qquad\blacksquare$$
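Not part of the proof, but the conclusion can be illustrated numerically. As a hypothetical example of a sequence satisfying the assumptions, a stationary AR(1) Gaussian sequence has $Corr(Y_t, Y_k) = \phi^{|t-k|} \to 0$, so the average of the indicators should approach $P(Y \le x) = \Phi(x)$ (here $\mu=0$, $\sigma=1$, $x=1$ are illustrative choices):

```python
from math import erf, sqrt

import numpy as np

rng = np.random.default_rng(2)

# Stationary AR(1) Gaussian sequence: Y_k = phi*Y_{k-1} + sqrt(1-phi^2)*e_k,
# so each Y_k ~ N(0, 1) and Corr(Y_t, Y_k) = phi**|t-k| -> 0 as |t-k| -> inf.
phi = 0.8
n = 200_000
e = rng.standard_normal(n)
y = np.empty(n)
y[0] = e[0]
for k in range(1, n):
    y[k] = phi * y[k - 1] + sqrt(1 - phi**2) * e[k]

x = 1.0
empirical = np.mean(y <= x)            # (1/n) * sum of indicators 1{Y_k <= x}
target = 0.5 * (1 + erf(x / sqrt(2)))  # P(Y <= x) = Phi(1), about 0.8413

print(empirical, target)
```

Despite the dependence between the $Y_k$, the empirical average lands close to $\Phi(1)$, as the statement predicts.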
