I'm looking for an example of sequences of random variables $(X_n), (Y_n), (Z_n)$ such that $X_n, Y_n, Z_n \sim \mathcal U(0,1)$, the pairs $(X_n,Y_n)$, $(X_n, Z_n)$, $(Y_n, Z_n)$ converge in distribution, but the vector $(X_n,Y_n, Z_n)$ does not converge in distribution.
Using copula theory we can relax the assumption of uniform marginals. If $X_n, Y_n, Z_n$ are absolutely continuous and their marginal distributions do not depend on $n$, then
$$C_n(s,t,r) := F_{(X_n,Y_n,Z_n)}(F_{X_1}^{-1}(s), F_{Y_1}^{-1}(t), F_{Z_1}^{-1}(r))$$
defines a joint distribution function with $\mathcal U(0,1)$-marginals. Here, $F$ denotes the (joint) distribution function and $F^{-1}$ the generalized inverse of the distribution function.
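The construction above can also be checked numerically: replacing the compositions with $F^{-1}$ by rank-based (empirical) probability integral transforms gives a Monte Carlo estimate of $C_n$. A minimal sketch, where the `empirical_copula` helper and the equicorrelated Gaussian example are illustrative choices of my own, not part of the question:

```python
import numpy as np

def empirical_copula(samples, s, t, r):
    """Monte Carlo estimate of C_n(s, t, r) from draws of (X_n, Y_n, Z_n).

    Each coordinate is mapped to (0, 1) by its ranks (an empirical
    probability integral transform), which emulates composing the joint
    distribution function with the inverse marginals.
    """
    n = samples.shape[0]
    ranks = np.argsort(np.argsort(samples, axis=0), axis=0) + 1
    u = ranks / (n + 1)  # approximately U(0,1) marginals
    return np.mean((u[:, 0] <= s) & (u[:, 1] <= t) & (u[:, 2] <= r))

# Illustration: a centered Gaussian vector with equicorrelation 1/2.
# The trivariate orthant formula gives
# C(1/2, 1/2, 1/2) = 1/8 + (3 / (4 pi)) * arcsin(1/2) = 1/4.
rng = np.random.default_rng(0)
cov = np.full((3, 3), 0.5) + 0.5 * np.eye(3)
x = rng.multivariate_normal(np.zeros(3), cov, size=200_000)
print(empirical_copula(x, 0.5, 0.5, 0.5))  # close to 0.25
```

With enough samples, such an estimate can be used to compare $C_n$ at a few grid points for successive $n$ and spot non-convergence empirically.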
Convergence in distribution of $(X_n,Y_n,Z_n)$ is then equivalent to pointwise convergence of $C_n$, and similarly for the bivariate marginals. So let me rephrase my initial question:
I'm looking for an example of sequences of absolutely continuous random variables $(X_n), (Y_n), (Z_n)$ such that the marginal distributions of $X_n, Y_n, Z_n$ do not depend on $n$, the pairs $(X_n,Y_n)$, $(X_n, Z_n)$, $(Y_n, Z_n)$ converge in distribution, but the vector $(X_n,Y_n, Z_n)$ does not converge in distribution.
To begin with, such an example is not possible if $(X_n,Y_n,Z_n)$ is (centered) Gaussian: if the bivariate marginals converge, then the covariance matrix converges, and hence so does the characteristic function. So we need something more sophisticated.
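To spell out the Gaussian obstruction: the characteristic function of a centered Gaussian vector depends only on the covariance matrix, every entry of which is determined by a univariate or bivariate marginal:
$$\varphi_{(X_n,Y_n,Z_n)}(t) = \exp\!\Big(-\tfrac12\, t^\top \Sigma_n t\Big), \qquad \Sigma_n = \begin{pmatrix} \operatorname{Var} X_n & \operatorname{Cov}(X_n,Y_n) & \operatorname{Cov}(X_n,Z_n) \\ \operatorname{Cov}(X_n,Y_n) & \operatorname{Var} Y_n & \operatorname{Cov}(Y_n,Z_n) \\ \operatorname{Cov}(X_n,Z_n) & \operatorname{Cov}(Y_n,Z_n) & \operatorname{Var} Z_n \end{pmatrix}.$$
Convergence of the bivariate marginals forces $\Sigma_n \to \Sigma$, so $\varphi_{(X_n,Y_n,Z_n)}$ converges pointwise, and Lévy's continuity theorem yields convergence in distribution of the full vector.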