
Suppose we are given absolutely continuous random variables $(X_n), (Y_n), (Z_n)$ such that the marginals of $X_n, Y_n, Z_n$ do not depend on $n$ and $(X_n,Y_n), (X_n, Z_n), (Y_n, Z_n)$ converge in distribution (but $(X_n,Y_n,Z_n)$ may not converge, see this thread).

Is there always a sequence of random vectors $V_n:=(X_n', Y_n', Z_n')$ with the same bivariate marginals as $(X_n,Y_n,Z_n)$ such that $V_n$ converges in distribution?

Stefan Perko

1 Answer


I believe the answer is no.

The construction turned out to be quite long; it would have been easier if you hadn't insisted that the random variables be absolutely continuous and have the same marginal distribution for each $n$. To make the construction easier to read, here are a few ideas to note before starting:

(1) We will arrange for $Y_n$ and $Z_n$ to be deterministic functions of $X_n$. A consequence is that if $(X_n', Y_n', Z_n')$ has the same bivariate marginals as $(X_n,Y_n,Z_n)$, then in fact $(X_n', Y_n', Z_n')$ has the same distribution as $(X_n,Y_n,Z_n)$.

(2) All $X_n$, $Y_n$ and $Z_n$ will have univariate marginal distributions which are uniform on $(0,2)$.

(3) We will arrange for different distributional limits of $(X_n, Y_n, Z_n)$ along the subsequences of even and odd $n$. To see the difference most simply, look at the integer parts $\lfloor X_n \rfloor$, $\lfloor Y_n \rfloor$, and $\lfloor Z_n \rfloor$, which are each either $0$ or $1$. When $n$ is odd, either $1$ or $3$ of them will equal $1$; when $n$ is even, either $0$ or $2$ of them will. However, all the bivariate marginals will converge.

Here goes. For each $n$, let $U_n\sim U(0,1)$.

Let $B_{n,k}$ be the $k$th binary bit of $U_n$, namely $B_{n,k}\equiv\lfloor 2^{k} U_n \rfloor \pmod 2$. Then $$ U_n = \sum_{k=1}^\infty 2^{-k}B_{n,k}. $$

We are going to extract the $n$th and $(n+1)$st of these bits. Let's write $\newcommand{\tU}{\tilde{U}}$ $\tU_n$ for what we get if we omit those bits from the binary expansion of $U_n$, namely $$ \tU_n := \sum_{k=1}^{n-1} 2^{-k}B_{n,k} + \sum_{k=n}^\infty 2^{-k} B_{n, k+2}. $$ Notice that $\tU_n$ also has the $U(0,1)$ distribution and is independent of $(B_{n,n}, B_{n,n+1})$.
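For concreteness, here is a minimal numerical sketch of the bits $B_{n,k}$ and of $\tU_n$ in Python (the helper names `bits` and `drop_bits` are my own invention, and a double only carries about $52$ significant bits, so this is only a good approximation for moderate $n$):

```python
import numpy as np

def bits(u, num_bits):
    """First num_bits binary digits of u in (0,1):
    bits(u, m)[k-1] = B_k = floor(2^k * u) mod 2."""
    b = np.empty(num_bits, dtype=int)
    for k in range(num_bits):
        u *= 2.0
        b[k] = int(u)   # leading binary digit
        u -= b[k]       # keep the fractional part
    return b

def drop_bits(u, n, num_bits=50):
    """Approximate tilde-U_n: rebuild u from its binary expansion
    with bits n and n+1 deleted (later bits shift up two places)."""
    b = bits(u, num_bits + 2)
    kept = np.concatenate([b[:n - 1], b[n + 1:]])  # omit B_n and B_{n+1}
    return float(np.sum(kept * 0.5 ** np.arange(1, len(kept) + 1)))
```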

Now define the random vectors as follows. For even $n$, define $$ (X_n, Y_n, Z_n) = \begin{cases} (U_n, \tU_n, \tU_n) & \text{ if } (B_{n,n}, B_{n,n+1}) = (0,0) \\ (U_n, 1+\tU_n, 1+\tU_n) & \text{ if } (B_{n,n}, B_{n,n+1}) = (0,1) \\ (1+U_n, \tU_n, 1+\tU_n) & \text{ if } (B_{n,n}, B_{n,n+1}) = (1,0) \\ (1+U_n, 1+\tU_n, \tU_n) & \text{ if } (B_{n,n}, B_{n,n+1}) = (1,1) \end{cases}. $$ (The way in which the different cases are assigned to the different values of $(B_{n,n}, B_{n,n+1})$ is not important; what matters is that each case gets probability $1/4$.)

For odd $n$, instead define $$ (X_n, Y_n, Z_n) = \begin{cases} (1+U_n, \tU_n, \tU_n) & \text{ if } (B_{n,n}, B_{n,n+1}) = (0,0) \\ (1+U_n, 1+\tU_n, 1+\tU_n) & \text{ if } (B_{n,n}, B_{n,n+1}) = (0,1) \\ (U_n, \tU_n, 1+\tU_n) & \text{ if } (B_{n,n}, B_{n,n+1}) = (1,0) \\ (U_n, 1+\tU_n, \tU_n) & \text{ if } (B_{n,n}, B_{n,n+1}) = (1,1) \end{cases}. $$
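Translating the two case tables into code (again only a sketch, reusing the hypothetical helpers above):

```python
def sample_xyz(n, rng):
    """One draw of (X_n, Y_n, Z_n) according to the case tables above."""
    u = rng.random()                  # U_n ~ U(0,1)
    b = bits(u, n + 1)
    bn, bn1 = int(b[n - 1]), int(b[n])  # B_{n,n} and B_{n,n+1}
    tu = drop_bits(u, n)              # tilde-U_n
    if n % 2 == 0:  # even n: 0 or 2 integer parts equal 1
        table = {(0, 0): (u, tu, tu),
                 (0, 1): (u, 1 + tu, 1 + tu),
                 (1, 0): (1 + u, tu, 1 + tu),
                 (1, 1): (1 + u, 1 + tu, tu)}
    else:           # odd n: 1 or 3 integer parts equal 1
        table = {(0, 0): (1 + u, tu, tu),
                 (0, 1): (1 + u, 1 + tu, 1 + tu),
                 (1, 0): (u, tu, 1 + tu),
                 (1, 1): (u, 1 + tu, tu)}
    return table[(bn, bn1)]
```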

Now one can check all of (1), (2), and (3) above. The limit of any of the bivariate marginals is that of $(A+U, B+U)$, where $A$ and $B$ are Bernoulli($1/2$), and $U\sim U(0,1)$, with $A,B,U$ independent. However, the distribution of the whole triple has different limits along the even and odd subsequences. By (1), the same is also true for any sequence $(X_n', Y_n', Z_n')$ which has the same bivariate marginals as $(X_n, Y_n, Z_n)$.
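One can also check points (2) and (3) empirically with a quick Monte Carlo run over the sketch above (moderate $n$ only, because of floating-point precision):

```python
rng = np.random.default_rng(0)
for n in (10, 11):  # one even and one odd index
    samples = [sample_xyz(n, rng) for _ in range(100_000)]
    # Point (3): parity of the number of integer parts equal to 1.
    parities = {(int(x) + int(y) + int(z)) % 2 for x, y, z in samples}
    print(f"n={n}: parities of floor sums = {parities}")
    # Bivariate check: each pair (floor X, floor Y) should have frequency ~1/4.
    counts = {}
    for x, y, _ in samples:
        key = (int(x), int(y))
        counts[key] = counts.get(key, 0) + 1
    freqs = {k: round(v / len(samples), 3) for k, v in sorted(counts.items())}
    print(f"n={n}: empirical law of (floor X, floor Y) = {freqs}")
```

For even $n$ the parity set should come out as $\{0\}$ (zero or two integer parts equal $1$) and for odd $n$ as $\{1\}$, while the four integer-part pairs should each have empirical frequency close to $1/4$ for every $n$, matching the claimed bivariate limit.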

  • By the way, you didn't ask for $(X_n, Y_n, Z_n)$ to be jointly absolutely continuous, but with a bit more fiddling about I think one can cover that case too. I believe one can also arrange for the limiting bivariate distributions to be jointly absolutely continuous. – Nicholas Burbank Oct 01 '24 at 20:32
  • Perhaps a stupid question, but why is (1) true? Everything else I understand. Yeah, you're right. I should have asked for the vector to be jointly abs. cont., but in any case you've answered the question otherwise (and I'm a bit shocked by the answer...). – Stefan Perko Oct 04 '24 at 09:39
  • @StefanPerko Just the following: let $g$ be any function. If $X$ and $X'$ have the same distribution, then $(X, g(X))$ and $(X', g(X'))$ also have the same distribution, because $P((X,g(X))\in A)$ $=P(X\in \{x:(x,g(x))\in A\})$ $= P(X'\in \{x:(x,g(x))\in A\})$ $=P((X', g(X'))\in A)$. In the setting above we have that $X_n$ and $X_n'$ have the same distribution, and we write $(Y_n, Z_n)=g(X_n)$ and $(Y_n', Z_n')=g(X_n')$. – Nicholas Burbank Oct 08 '24 at 07:58