
I'm looking for an example of sequences of random variables $(X_n), (Y_n), (Z_n)$ such that $X_n, Y_n, Z_n \sim \mathcal U(0,1)$, $(X_n,Y_n), (X_n, Z_n), (Y_n, Z_n)$ converge in distribution, but the vector $(X_n,Y_n, Z_n)$ does not converge.

Using copula theory we can relax the assumptions. If $X_n,Y_n,Z_n$ are absolutely continuous and the marginals do not depend on $n$, then

$$C_n(s,t,r) := F_{(X_n,Y_n,Z_n)}(F_{X_1}^{-1}(s), F_{Y_1}^{-1}(t), F_{Z_1}^{-1}(r))$$

defines a joint distribution function with $\mathcal U(0,1)$-marginals. Here, $F$ denotes the (joint) distribution function and $F^{-1}$ the generalized inverse of the distribution function.

Then convergence of the distribution of $(X_n,Y_n,Z_n)$ is equivalent to $C_n$ converging pointwise, similarly for the bivariate marginals. So let me rephrase my initial sentence:

I'm looking for an example of sequences of absolutely continuous random variables $(X_n), (Y_n), (Z_n)$ such that the marginals of $X_n, Y_n, Z_n$ do not depend on $n$, $(X_n,Y_n), (X_n, Z_n), (Y_n, Z_n)$ converge in distribution, but the vector $(X_n,Y_n, Z_n)$ does not converge.

To begin with, this is not possible if $(X_n,Y_n,Z_n)$ is (centered) Gaussian: if the bivariate marginals converge, then the covariance matrix converges, and hence so does the characteristic function. So we need something more sophisticated.
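To illustrate the Gaussian remark numerically (my own sketch, not part of the argument): every entry of the trivariate covariance matrix is a bivariate or univariate quantity, so convergence of the bivariate marginals forces convergence of the whole matrix, and with it of the characteristic function $\varphi(t) = \exp(-\frac{1}{2} t^\top \Sigma_n t)$. The covariance sequence below is hypothetical, chosen only so that the pairwise covariances converge.

```python
import numpy as np

def cov_matrix(n):
    # hypothetical pairwise covariances, each converging as n -> infinity
    rho_xy = 0.5 + 1.0 / n
    rho_xz = -0.2 + 1.0 / n
    rho_yz = 0.1 - 1.0 / n
    # the full 3x3 matrix is assembled entirely from bivariate quantities
    return np.array([[1.0, rho_xy, rho_xz],
                     [rho_xy, 1.0, rho_yz],
                     [rho_xz, rho_yz, 1.0]])

def char_fun(sigma, t):
    # characteristic function of a centered Gaussian, evaluated at t
    return np.exp(-0.5 * t @ sigma @ t)

t = np.array([1.0, -2.0, 0.5])
limit = cov_matrix(10**9)  # numerically indistinguishable from the limit
err10 = abs(char_fun(cov_matrix(10), t) - char_fun(limit, t))
err1000 = abs(char_fun(cov_matrix(1000), t) - char_fun(limit, t))
print(err10, err1000)  # the error shrinks as n grows
```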

Stefan Perko

3 Answers


This example is inspired by some answers to the question Examples of pairwise independent but not independent continuous random variables.

Let $\Gamma_1$ be the subset $\left[0, \frac{1}{2}\right]^2 \cup\left[\frac{1}{2}, 1\right]^2$ of $\Gamma := [0,1]^2$. Let $(U_i)_{1 \le i \le 3}$ be i.i.d. with law $\mathcal{U}([0,1])$.

Define $$\quad\left\{\begin{array}{l}X := U_1 \\ Y := U_2 \\ Z := \frac{1}{2} \left[\mathbb{1}_{(X, Y) \in \Gamma_1} + U_3\right] \\ Z' := \frac{1}{2} \left[\mathbb{1}_{(X, Y) \in \Gamma \backslash \Gamma_1} + U_3\right]\end{array}\right.$$ Then $X, Y, Z, Z'$ each follow the uniform law on $[0, 1]$, and $\lbrace X, Y, Z \rbrace$ are pairwise independent but not mutually independent; likewise for $\lbrace X, Y, Z' \rbrace$.
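As a sanity check (my addition, not part of the original answer), the stated (in-)dependence properties can be verified by a quick Monte Carlo simulation in Python:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
U1, U2, U3 = rng.random((3, N))

X, Y = U1, U2
# Gamma_1 = [0,1/2]^2 union [1/2,1]^2
in_gamma1 = ((X <= 0.5) & (Y <= 0.5)) | ((X >= 0.5) & (Y >= 0.5))
Z = 0.5 * (in_gamma1 + U3)
Zp = 0.5 * (~in_gamma1 + U3)

# Pairwise independence: P(X <= 1/2, Z <= 1/2) should be ~ 1/4,
# exactly as for independent uniforms.
p_pair = np.mean((X <= 0.5) & (Z <= 0.5))

# But not mutual independence: given (X, Y) in Gamma_1,
# Z lies in [1/2, 1] while Z' lies in [0, 1/2].
p_cond_z = np.mean(Z[in_gamma1] <= 0.5)    # ~ 0
p_cond_zp = np.mean(Zp[in_gamma1] <= 0.5)  # ~ 1

print(p_pair, p_cond_z, p_cond_zp)
```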

If we now define for all $n \geq 0$ $$ \left\{\begin{array}{l} X_n:=X \\ Y_n:=Y \\ Z_{2 n}:=Z \\ Z_{2 n+1}:=Z^{\prime} \end{array}\right. $$

we then have

$$ \left\{\begin{array}{l} \left(X_n, Y_n\right) \xrightarrow[n \rightarrow \infty]{\mathcal{L}} \mathcal{U}\left([0,1]^2\right) \\ \left(Y_n, Z_n\right) \xrightarrow[n \rightarrow \infty]{\mathcal{L}} \mathcal{U}\left([0,1]^2\right) \\ \left(Z_n, X_n\right) \xrightarrow[n \rightarrow \infty]{\mathcal{L}} \mathcal{U}\left([0,1]^2\right) \end{array}\right. $$

But for all $n \geq 0$, $$ \left\{\begin{array}{l} \mathbb{P}\left(\left(X_{2 n}, Y_{2 n}, Z_{2 n}\right) \in \Gamma_1 \times\left[0, \frac{1}{2}\right]\right)=0 \\ \mathbb{P}\left(\left(X_{2 n+1}, Y_{2 n+1}, Z_{2 n+1}\right) \in \Gamma_1 \times\left[0, \frac{1}{2}\right]\right)=\frac{1}{2} \end{array}\right. $$ Indeed, $Z \geq \frac{1}{2}$ almost surely on the event $(X,Y) \in \Gamma_1$, whereas $Z' = \frac{1}{2} U_3 \leq \frac{1}{2}$ there, and $\mathbb{P}\left((X,Y) \in \Gamma_1\right) = \frac{1}{2}$.

So the even and odd subsequences of $\left(X_n, Y_n, Z_n\right)_{n \geqslant 0}$ have different (constant) distributions, and the sequence has no limiting distribution.

Fefe
  • It seems a bit tedious to verify the (in-) dependence properties stated for $X,Y,Z,Z'$. Can you provide some insight into those? Maybe it is explained in the linked question, but I didn't really find anything at first glance looking like this example. – Stefan Perko Aug 21 '24 at 12:40
  • An argument could be $(i)$ $X \perp Y$ by definition $(ii)$ $X \perp Z$ because $\mathcal{L}(Z | X = x) = \mathcal{L}(\frac{1}{2} \left[\mathbb{1}_{(x, Y) \in \Gamma_1} + U_3\right]) = \mathcal{U}([0,1])$ so doesn't depend on $x$ $(iii)$ likewise for $Y \perp Z$ $(iv)$ and they are not mutually independent because $\mathbb{P}(Z \leq \frac{1}{2} | (X, Y ) \in \Gamma_1) = 0 \neq \mathbb{P}(Z \leq \frac{1}{2})$ for example. – Fefe Oct 03 '24 at 09:54

It suffices to find a vector $(X,Y,Z)$ such that $(X,Y)$, $(Y,Z)$ and $(X,Z)$ have the uniform distribution on the unit square but $(X,Y,Z)$ does not have the uniform distribution on the unit cube of $\mathbb R^3$. Then we can take a sequence $(X_n,Y_n,Z_n)$ such that for $n$ even, $(X_n,Y_n,Z_n)=(X,Y,Z)$, and for $n$ odd, $(X_n,Y_n,Z_n)$ has the uniform distribution on the unit cube of $\mathbb R^3$.

Define $g(x)=\mathbf{1}_{[0,1/2)}(x)-\mathbf{1}_{[1/2,1)}(x)$ and $$ f(x,y,z):=\left(1-g(x)g(y)g(z)\right)\mathbf{1}_{[0,1)}(x)\mathbf{1}_{[0,1)}(y)\mathbf{1}_{[0,1)}(z). $$ Then $f$ is a density (it is nonnegative, since $|g| \leq 1$, and integrates to $1$), and since $\int_0^1 g(z)\,dz=0$, the density of $(X,Y)$ is uniform on the unit square. By symmetry, the same holds for $(X,Z)$ and $(Y,Z)$. But $f$ is not the product of the marginal densities.
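A quick numerical check (my addition, not part of the original answer): sampling from $f$ by rejection (note $f \leq 2$ on the cube), the pairs look uniform on the unit square while the triple visibly is not, since $f$ vanishes on $[0,1/2)^3$:

```python
import numpy as np

rng = np.random.default_rng(1)

def g(x):
    return np.where(x < 0.5, 1.0, -1.0)

def f(x, y, z):
    # density on [0,1)^3, bounded above by 2
    return 1.0 - g(x) * g(y) * g(z)

# Rejection sampling from f: accept a uniform point with probability f/2.
N = 400_000
pts = rng.random((N, 3))
accept = rng.random(N) < f(pts[:, 0], pts[:, 1], pts[:, 2]) / 2.0
X, Y, Z = pts[accept].T

# Each pair is uniform on the unit square: P(X < 1/2, Y < 1/2) ~ 1/4 ...
p_pair = np.mean((X < 0.5) & (Y < 0.5))

# ... but the triple is not uniform on the cube: f = 0 on [0,1/2)^3,
# so this probability is 0 rather than 1/8.
p_triple = np.mean((X < 0.5) & (Y < 0.5) & (Z < 0.5))

print(p_pair, p_triple)
```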

Davide Giraudo

Here is a counterexample, a slight simplification of Davide Giraudo's answer. For even $n$ let $(X_n,Y_n,Z_n)=(U, V, U+V\bmod 1)$, and for odd $n$ let $(X_n,Y_n,Z_n)=(U,V,W)$, where $(U,V,W)$ are iid uniforms on $[0,1]$. All the 2-dimensional margins $(X_n,Y_n)$, $(X_n,Z_n)$, and $(Y_n,Z_n)$ are uniformly distributed on $[0,1]\times[0,1]$, but the joint distribution of $(X_n,Y_n,Z_n)$ depends on the parity of $n$.

The distribution of $X_n+Y_n-Z_n\bmod 1$ is degenerate for even $n$ and uniform on $[0,1]$ for odd $n$. A bounded continuous test function witnessing the failure of convergence is, for example, $\cos(2\pi(X_n+Y_n-Z_n))$.
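A small simulation (my addition) confirming that this test function separates the two joint laws:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200_000
U, V, W = rng.random((3, N))

Z_even = (U + V) % 1.0   # even n: Z_n = U + V mod 1
Z_odd = W                # odd n: Z_n is an independent uniform

# E[cos(2*pi*(X_n + Y_n - Z_n))] differs along the two subsequences:
# for even n the argument U + V - Z_even is an integer, so the cosine is 1;
# for odd n the argument mod 1 is uniform, so the mean is ~ 0.
m_even = np.mean(np.cos(2 * np.pi * (U + V - Z_even)))
m_odd = np.mean(np.cos(2 * np.pi * (U + V - Z_odd)))
print(m_even, m_odd)
```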

The same idea, said slightly differently: If $\mathbb T^k$ denotes the $k$-torus $[0,1)^k$, with component-wise addition mod 1, the probability law of $(U,V)$ is Haar measure on $\mathbb T^2$ and of $(U,V,W)$ is Haar measure on $\mathbb T^3$. The push-forward of the former under the map $(u,v)\mapsto(u,v,u+v)$ is not the latter.

kimchi lover