
Let $(X_{jn})_{1\leq j \leq n}$ be a triangular array of $p$-dimensional random vectors, independent within each row. Suppose $X_{jn} \sim \mu_{jn}$ and

1. $E X_{jn}= \int_{\mathbb R^p} x \, d\mu_{jn}=0$;

2. $\lim_{n \to \infty} \max_{1\leq j \leq n} P(|X_{jn}|> \epsilon)=0$ for all $\epsilon > 0$;

3. $\operatorname{var}(S_n):=\sum_{j=1}^n \int_{\mathbb R^p} |x|^2 \, d\mu_{jn} \leq C < \infty$ for all $n \in \mathbb N$.

Assume that $S_n := \sum_{j=1}^n X_{jn} \Longrightarrow X$ for some $X$.
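For orientation, a standard example satisfying 1–3 is the classical CLT array $X_{jn}=\xi_j/\sqrt n$, with $\xi_1,\xi_2,\dots$ i.i.d., mean zero and covariance $I_p$; then $S_n \Longrightarrow X = N(0,I_p)$, and the limiting Lévy measure $\nu$ appearing below is zero.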

Now consider $Y_{jn} \sim CP(1,\mu_{jn})$ [a compound Poisson distribution, where the parameter of the Poisson r.v. is $\lambda =1$ for all $(j,n)$ and the compounded vectors are copies of $X_{jn}$]. Define $$S_n' := \sum_{j=1}^n Y_{jn}.$$ It is easy to show that $E[S_n']=E[S_n]=0$ and $\operatorname{var}[S_n']=\operatorname{var}[S_n]$. Moreover, we can show that the characteristic function of $S'_n$ is given by $$\varphi_{S_n'}(u)=\exp\left\{ \int_{\mathbb R^p} \left[e^{iu'x} - 1 \right] d\nu_n \right\} = \exp\left\{ \int_{\mathbb R^p} \left[e^{iu'x} - 1 - iu'x \right] d\nu_n \right\}, \quad \nu_n(E):= \sum_{j=1}^n \int_E d\mu_{jn}, \quad E \hbox{ a Borel set.}$$
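To sketch where the first formula comes from: conditioning on the Poisson count $N_{jn}\sim \mathrm{Poisson}(1)$ gives, for a single summand, $$\varphi_{Y_{jn}}(u)=\sum_{k\geq 0}\frac{e^{-1}}{k!}\left(\int_{\mathbb R^p} e^{iu'x}\, d\mu_{jn}\right)^{k}=\exp\left\{\int_{\mathbb R^p}\left[e^{iu'x}-1\right]d\mu_{jn}\right\},$$ and multiplying over $1\leq j\leq n$ (row independence) turns the sum of exponents into the integral against $\nu_n$. The second equality in the display above then holds because $\int_{\mathbb R^p} u'x \, d\nu_n=0$ by condition 1.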

By an accompanying-laws argument (Section 3.7 of Varadhan's lecture notes), we have that $S_n = \sum_{j=1}^n X_{jn} \Longrightarrow X$ if and only if $$S_n'= \sum_{j=1}^n Y_{jn} \Longrightarrow X.$$ Using Theorem 8.7, page 41, of Sato's book, $X$ is infinitely divisible (I.D.) and its characteristic function is $$\varphi_{X}(u) = \exp\left\{ -\frac{u'\sigma u}{2} + \int_{\mathbb R^p} \left[e^{iu'x} - 1 - iu'x \right] d\nu \right\}.$$ Moreover, $$\int f \, d\nu_n \to \int f \, d\nu \quad (n \to \infty),\quad \forall f \in \mathcal C_\#$$ ($\mathcal C_\#$ is the class of bounded continuous functions vanishing on a neighborhood of $0$). The mentioned theorem has another implication involving $\sigma$, but I don't think it will be useful to mention it here. According to this question, the last integral convergence is equivalent to \begin{equation}\label{asd}\tag{I} \nu_n(E) \to \nu(E), \quad \forall E \in \mathcal{C}_\nu \hbox{ with } 0 \notin \bar E, \end{equation} where $\mathcal{C}_\nu$ denotes the $\nu$-continuity sets (Borel sets $E$ with $\nu(\partial E)=0$) and $\bar E$ is the closure of $E$.
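One standard way to see this equivalence (a sketch, under the reading of $\mathcal C_\nu$ just given): for any $\epsilon>0$ with $\nu(\{|x|=\epsilon\})=0$, testing against $f\in\mathcal C_\#$ supported in $\{|x|>\epsilon/2\}$ shows that the restrictions $\nu_n|_{\{|x|>\epsilon\}}$ converge weakly, as finite measures, to $\nu|_{\{|x|>\epsilon\}}$; the portmanteau theorem then yields (\ref{asd}) for $\nu$-continuity sets $E$ with $0\notin\bar E$, and the converse follows by approximating $f\in\mathcal C_\#$ by simple functions over such sets.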

Question:

Since $\int_{\mathbb R^p} x \, d\nu_n = \sum_{j=1}^n \int_{\mathbb R^p} x \, d\mu_{jn} = 0$ for all $n$, I suspect that $\int_{\mathbb R^p} x \, d\nu = 0$. How can I show this?

Although the $\nu_n$ are not probability measures (each $\nu_n$ is a sum of $n$ probability measures, so it has total mass $n$), the convergence in (\ref{asd}) looks a lot like weak convergence of measures. Furthermore, given that $\sup_n \int |x|^2 \, d\nu_n(x) \le C$, I hoped to apply some uniform-integrability-type result to conclude that $\int x \, d\nu_n(x) \to \int x \, d\nu(x)$. Since $\int x \, d\nu_n(x) = 0$, this would give the desired result. But I don't know how to do this rigorously.
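To make precise where the moment bound does and does not help (a remark added for clarity): the contribution away from the origin is controlled, since $|x|\leq |x|^2/\epsilon$ on $\{|x|>\epsilon\}$ gives $$\left|\int_{\{|x|>\epsilon\}} x\, d\nu_n\right| \leq \int_{\{|x|>\epsilon\}} |x| \, d\nu_n \leq \frac{1}{\epsilon}\int_{\mathbb R^p} |x|^2\, d\nu_n \leq \frac{C}{\epsilon},$$ but near the origin $\nu_n$ has total mass of order $n$, so $\int_{\{|x|\leq \epsilon\}} x\, d\nu_n$ (which equals minus the integral over $\{|x|>\epsilon\}$) need not vanish as $\epsilon\downarrow 0$ uniformly in $n$; any uniform-integrability argument would have to supply exactly this missing control.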

Update

As said before, the mentioned theorem has another implication involving $\sigma$, and it was my fault for not specifying it. Allow me to add it.

First, for any $\epsilon>0$, define the symmetric nonnegative-definite matrix $\sigma_{n,\epsilon}$ by (strictly speaking, the theorem involves a certain matrix $\sigma_n$, but in this case it is zero): \begin{equation}\label{new}\tag{II} \langle u, \sigma_{n,\epsilon}u \rangle := \int_{|x|\leq \epsilon}\langle u ,x\rangle^2 d\nu_n(x), \quad u \in \mathbb R^p. \end{equation} Then: \begin{equation}\label{new2}\tag{III} \lim_{\epsilon \downarrow 0} \limsup_{n \to \infty} \left| \langle u, \sigma_{n,\epsilon}u \rangle - \langle u, \sigma u \rangle \right|=0. \end{equation}
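In the scalar case $p=1$, (\ref{new})–(\ref{new2}) read simply as $$\lim_{\epsilon\downarrow 0}\limsup_{n\to\infty}\left|\int_{|x|\leq\epsilon} x^2\, d\nu_n(x)-\sigma\right|=0,$$ i.e., the truncated second moments of $\nu_n$ identify the Gaussian variance $\sigma$ in the limiting triplet.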

As noted below, the question as initially posed has already been answered. I would appreciate any new answer that uses the additional hypotheses given in (\ref{new}) and (\ref{new2}).


1 Answer


First, thanks to PSE for pointing out an error in my former answer.

Now I will give an example showing that, in general, $\int xd\nu_n\not\to \int xd\nu$.

Let $\mu_{jn}=(1-\frac1n)\delta_{\{\frac1n\}}+\frac1n\delta_{\{\frac1n-1\}}$, $1\le j\le n$, i.e., \begin{equation*} \mathsf{P}(X_{jn}=\tfrac{1}{n})=1-\tfrac1n,\quad \mathsf{P}(X_{jn}=\tfrac{1}{n}-1)=\tfrac1n, \qquad 1\le j\le n. \end{equation*}
Then \begin{align*} &1. \quad \mathsf{E}[X_{jn}]=0,\\ &2. \quad \lim_{n\to\infty}\max_{1\le j\le n}\mathsf{P}(|X_{jn}|>\epsilon)=0, \quad \forall \epsilon >0,\\ &3. \quad \mathsf{var}[S_n]= n\, \mathsf{var}[X_{1n}]=n\, \tfrac{1-1/n}{n}\le 1. \end{align*} It is also easy to verify the following: for $\nu=\delta_{-1}$, \begin{gather*} \nu_n=\sum_{j=1}^n\mu_{jn}=(n-1)\delta_{\{\frac1n\}}+\delta_{\{\frac1n-1\}},\\ \lim_{n\to\infty}\int f\,\mathrm{d}\nu_n= f(-1)=\int f\,\mathrm{d}\nu, \quad \forall f\in C_{\#},\\ S_n=\sum_{j=1}^nX_{jn}=1-K_n\Longrightarrow 1-N, \end{gather*} where $K_n\sim\mathrm{Bin}(n,\tfrac1n)$ counts the indices $j$ with $X_{jn}=\tfrac1n-1$ and $N\sim\mathrm{Poisson}(1)$; indeed $\varphi_{1-N}(u)=\exp\{iu+e^{-iu}-1\}=\exp\left\{\int\left[e^{iux}-1-iux\right]d\delta_{-1}\right\}$, so the limit is I.D. with $\sigma=0$ and $\nu=\delta_{-1}$. Finally, \begin{equation*} \int xd\nu_n=(n-1)\frac1n+\Big(\frac1n-1\Big)=0,\qquad \int xd\nu=-1. \end{equation*} Hence \begin{equation*} \lim_{n\to\infty}\int xd\nu_n \ne \int xd\nu . \end{equation*}
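A quick numerical check of the example (a minimal Python sketch; the variable names are mine, and $\nu_n$ is encoded by its two atoms and their masses):

```python
import numpy as np

# Sanity check of the counterexample (p = 1):
# nu_n = (n-1) * delta_{1/n} + delta_{1/n - 1},  nu = delta_{-1}.
rng = np.random.default_rng(0)

for n in [10, 100, 1000, 10_000]:
    atoms = np.array([1.0 / n, 1.0 / n - 1.0])   # support points of nu_n
    masses = np.array([n - 1.0, 1.0])            # nu_n({atom}); total mass n

    mean_nu_n = masses @ atoms                   # int x d(nu_n): identically 0
    second_moment = masses @ atoms**2            # int x^2 d(nu_n) = 1 - 1/n <= 1
    away_mass = masses[np.abs(atoms) > 0.5].sum()  # nu_n(|x|>1/2) -> nu(|x|>1/2) = 1

    # Monte Carlo for S_n = 1 - Binomial(n, 1/n)  =>  1 - Poisson(1):
    s_n = 1.0 - rng.binomial(n, 1.0 / n, size=100_000)
    print(f"n={n:6d}  int x dnu_n={mean_nu_n:+.1e}  "
          f"int x^2 dnu_n={second_moment:.4f}  "
          f"nu_n(|x|>1/2)={away_mass:.0f}  E[S_n]~{s_n.mean():+.3f}")

# int x d(nu_n) stays 0 for every n, while nu_n -> delta_{-1} away from 0,
# whose mean is -1: the means of nu_n do not converge to the mean of nu.
```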

Remarks: Regarding the convergence of means and variances, the following book is helpful:
B. V. Gnedenko and A. N. Kolmogorov, Limit Distributions for Sums of Independent Random Variables, Addison-Wesley, 1968; Theorem 19.3, p. 91.

JGWang
  • Thanks. My intuition failed, then. Do you have any idea what minimal hypotheses I should add for this to be true? – PSE Mar 06 '23 at 21:07
  • The mentioned theorem has another implication involving $\sigma$ and it was my fault for not specifying. Allow me to add it. – PSE Mar 07 '23 at 06:18
  • First, for any $\epsilon>0$, define the symmetric nonnegative-definite matrix $\sigma_{n,\epsilon}$ by (strictly speaking, this involves a certain $\sigma_n$, but in this case it's zero): \begin{equation}\label{new}\tag{II} \langle u, \sigma_{n,\epsilon}u \rangle := \int_{|x|\leq \epsilon}\langle u ,x\rangle^2 d\nu_n(x), \quad u \in \mathbb R^p \end{equation} Then: \begin{equation}\label{new2}\tag{III} \lim_{\epsilon \downarrow 0} \limsup_{n \to \infty} \left| \langle u, \sigma_{n,\epsilon}u \rangle - \langle u, \sigma u \rangle \right|=0 \end{equation} For your example, we have: – PSE Mar 07 '23 at 06:18
  • $\sigma_{n,\epsilon}u^2 =\langle u, \sigma_{n,\epsilon}u \rangle= u^2 \int_{|x|\leq \epsilon}x^2d\nu_n(x)$. For $n$ large enough, we have $\sigma_{n,\epsilon}u^2= u^2 \frac{(n-1)}{n^2}$. So $$ \left| \langle u, \sigma_{n,\epsilon}u \rangle - \langle u, \sigma u \rangle \right|= u^2\left| \frac{(n-1)}{n^2} - \sigma\right|$$ is such that $\limsup_{n \to \infty} \left| \langle u, \sigma_{n,\epsilon}u \rangle - \langle u, \sigma u \rangle \right|= u^2\sigma$. Consequently: – PSE Mar 07 '23 at 06:18
  • $$\lim_{\epsilon \downarrow 0} \limsup_{n \to \infty} \left| \langle u, \sigma_{n,\epsilon}u \rangle - \langle u, \sigma u \rangle \right|=u^2 \sigma \neq0$$

    Do you think that using (II) and (III) is enough to answer my initial question?

    – PSE Mar 07 '23 at 06:19
  • @PSE, to answer your new questions, I have added a remark to my former answer. – JGWang Mar 07 '23 at 09:54
  • I could not find the reference or the theorem you mentioned. I'll try once more with this new hypothesis. Anyway, thanks for your help. – PSE Mar 07 '23 at 14:48
  • Can you help with this question? https://mathoverflow.net/questions/455496/a-question-related-to-a-certain-convergence-of-levy-measures – PSE Oct 09 '23 at 07:45
  • Thanks for your comment. That question (MO455496) seems too complex for me. – JGWang Oct 10 '23 at 01:57