Consider the following situation: $X_1,X_2,X_3$ are mutually independent, and we know the distributions of $(X_1,X_2)$ and $(X_2,X_3)$, and hence of course also the distributions of the individual random variables $X_1$, $X_2$ and $X_3$.
May we derive the distribution of $(X_1+X_2,X_2+X_3)$ from this?
It is known that since $X_1$ and $X_2$ are independent, the distribution of $X_1+X_2$ is just the convolution of the distributions of the two summands. My first thought was the following: the distribution of a vector-valued random variable is the product of the distributions of its coordinate random variables if the coordinates are mutually independent. Thus we would know the distribution of $(X_1+X_2,X_2+X_3)$ if $X_1+X_2$ and $X_2+X_3$ were independent. But this is not generally true (both sums share the summand $X_2$); see e.g. here.
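A quick numerical check of the failure of independence (using, purely for illustration, three independent standard normals): if $X_1+X_2$ and $X_2+X_3$ were independent, their covariance would be $0$, but by bilinearity it equals $\operatorname{Var}(X_2)$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
# Three mutually independent standard normals (an illustrative choice).
x1, x2, x3 = rng.standard_normal((3, n))

s1 = x1 + x2  # X_1 + X_2
s2 = x2 + x3  # X_2 + X_3

# Independence would force Cov(s1, s2) = 0, but
# Cov(X_1 + X_2, X_2 + X_3) = Var(X_2) = 1 here.
print(np.cov(s1, s2)[0, 1])  # close to 1
```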
Is it then true that $(X_1,X_2)$ and $(X_2,X_3)$ are independent, so that the distribution of their sum is the convolution of their distributions? Consider $P[(X_1,X_2)\in A,(X_2,X_3) \in B]$ for $A,B \in \mathcal{B}(\mathbb{R}^2)$. Does it generally factor? I doubt it, since both vectors share the coordinate $X_2$...
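The doubt can be confirmed with a concrete choice of Borel sets (again taking three independent standard normals just for illustration): with $A=\mathbb{R}\times(-\infty,0]$ and $B=(-\infty,0]\times\mathbb{R}$, both events reduce to $\{X_2\le 0\}$, so the joint probability is $1/2$ while the product is $1/4$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x1, x2, x3 = rng.standard_normal((3, n))

# A = R x (-inf, 0] and B = (-inf, 0] x R:
# both {(X_1, X_2) in A} and {(X_2, X_3) in B} are just {X_2 <= 0}.
in_A = x2 <= 0
in_B = x2 <= 0

joint = np.mean(in_A & in_B)             # approx 1/2
product = np.mean(in_A) * np.mean(in_B)  # approx 1/4
print(joint, product)
```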
The context is that I have a stochastic process $\{X_t\}$ with independent increments, and I know the distribution of $X_0$ and of any vector of increments $(X_{t_1}-X_0,\dots,X_{t_n}-X_{t_{n-1}})$ for any $t_1<\dots<t_n$. I would like to obtain the distribution of $(X_{t_1},\dots,X_{t_n})$, and since $(X_{t_1},\dots,X_{t_n})=(X_{t_1}-X_0,\dots,X_{t_n}-X_{t_{n-1}})+(X_0,\dots,X_{t_{n-1}}-X_{t_{n-2}})$ [NOT CORRECT, SEE EDIT], I arrive at the situation described above.
Thanks in advance!
EDIT: Of course it's not correct that $(X_{t_j}-X_{t_{j-1}})+(X_{t_{j-1}}-X_{t_{j-2}})=X_{t_j}$. Instead we add to $(X_{t_j}-X_{t_{j-1}})$ all the increments to its left: assuming $X_0=0$ (and writing $t_0=0$), $X_{t_j}=(X_{t_j}-X_{t_{j-1}})+\sum_{k=1}^{j-1}(X_{t_k}-X_{t_{k-1}})$. But the situation remains the same.
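The corrected map from increments to positions is just a cumulative sum. A minimal sketch, assuming $X_0=0$ and taking (as a purely illustrative example) Gaussian increments with variance equal to the time step:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.array([0.0, 0.5, 1.2, 2.0, 3.5])  # illustrative times t_0 = 0 < t_1 < ... < t_n

# Independent increments X_{t_k} - X_{t_{k-1}}; the Gaussian law here is
# only an example, the cumulative-sum step works for any increment law.
increments = rng.normal(0.0, np.sqrt(np.diff(t)))

# Assuming X_0 = 0, the positions are the partial sums of the increments:
# X_{t_j} = sum_{k=1}^{j} (X_{t_k} - X_{t_{k-1}}).
positions = np.cumsum(increments)  # (X_{t_1}, ..., X_{t_n})
print(positions)
```

Differencing the positions (with $X_0=0$ prepended) recovers the increments, so the two vectors carry the same information.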