
Consider the following situation: $X_1,X_2,X_3$ are mutually independent, and we know the distributions of $(X_1,X_2)$ and $(X_2,X_3)$ - and thus of course also the distributions of the individual random variables $X_1,X_2$ and $X_3$.

May we derive the distribution of $(X_1+X_2,X_2+X_3)$ from this?

It is known that, since $X_1$ and $X_2$ are independent, the distribution of $X_1+X_2$ is just the convolution of the distributions of the two summands. My first thought was the following: the distribution of a vector-valued random variable is the product of the distributions of its coordinates if the coordinates are mutually independent. Thus we would know the distribution of $(X_1+X_2,X_2+X_3)$ if $X_1+X_2$ and $X_2+X_3$ were independent. But this is not generally true, see e.g. here.
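As a quick numerical sketch of that last point (my own illustration, taking all three variables to be i.i.d. standard normal for concreteness): $X_1+X_2$ and $X_2+X_3$ share the summand $X_2$, so they are correlated and hence cannot be independent.

```python
import numpy as np

# Sketch: X1, X2, X3 i.i.d. standard normal (an assumption for illustration).
# X1+X2 and X2+X3 share the summand X2, so they are correlated.
rng = np.random.default_rng(0)
n = 200_000
x1, x2, x3 = rng.standard_normal((3, n))

s, t = x1 + x2, x2 + x3

# Cov(X1+X2, X2+X3) = Var(X2) = 1, while each sum has variance 2,
# so the correlation should be close to 1/2 - far from the 0 that
# independence would require.
corr = np.corrcoef(s, t)[0, 1]
print(round(corr, 2))  # → 0.5
```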

Is it then true that $(X_1,X_2)$ and $(X_2,X_3)$ are independent, so that the distribution of their sum is the convolution of their distributions? Consider $P[(X_1,X_2)\in A,(X_2,X_3)\in B]$ for $A,B\in\mathcal{B}(\mathbb{R}^2)$. Does it generally "factor"? I doubt this is true...

The context is that I have a stochastic process $\{X_t\}$ with independent increments, and I know the distributions of $X_0$ and of any vector of increments $(X_{t_1}-X_0,...,X_{t_n}-X_{t_{n-1}})$ for any $t_1<...<t_n$. I would like to obtain the distribution of $(X_{t_1},...,X_{t_n})$, and since $(X_{t_1},...,X_{t_n})=(X_{t_1}-X_0,...,X_{t_n}-X_{t_{n-1}})+(X_0,...,X_{t_{n-1}}-X_{t_{n-2}})$ [NOT CORRECT, SEE EDIT], I arrive at the situation described above.

Thanks in advance!

EDIT: Of course it is not correct that $(X_{t_j}-X_{t_{j-1}})+(X_{t_{j-1}}-X_{t_{j-2}})=X_{t_j}$. Instead we add to $(X_{t_j}-X_{t_{j-1}})$ all the coordinates to its left: $X_{t_j}=(X_{t_j}-X_{t_{j-1}})+\sum_{k=1}^{j-1}(X_{t_k}-X_{t_{k-1}})$ if we assume $X_0=0$. But the situation remains the same.
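The corrected identity is just a telescoping sum, which can be checked numerically (a minimal sketch, assuming $X_0=0$ and using a path of arbitrary values for illustration):

```python
import numpy as np

# Sketch of the corrected identity with X_0 = 0:
#   X_{t_j} = sum_{k=1}^{j} (X_{t_k} - X_{t_{k-1}})  (telescoping sum).
rng = np.random.default_rng(1)
n = 5
x = np.cumsum(rng.standard_normal(n))  # a path X_{t_1}, ..., X_{t_n}, X_0 = 0

# Increments (X_{t_1} - X_0, ..., X_{t_n} - X_{t_{n-1}}).
increments = np.diff(np.concatenate(([0.0], x)))

# Summing the increments up to index j recovers X_{t_j} exactly.
recovered = np.cumsum(increments)
print(np.allclose(recovered, x))  # → True
```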

MrFranzén

1 Answer


By simply marginalising, you can derive the distribution $\mu_j$ of each $X_j$ separately from your input. By independence, this lets you recreate the distribution of $(X_1,X_2,X_3)$: it is simply $\nu:=\mu_1\otimes\mu_2\otimes\mu_3$. Once you have this distribution, you can definitely recreate the distribution of $(X_1+X_2,X_2+X_3)$ by applying the map $(f,g)$, where $f(x_1,x_2,x_3)=x_1+x_2$ and $g(x_1,x_2,x_3)=x_2+x_3$.
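This construction can be sketched by Monte Carlo (my own illustration, again taking all three marginals to be standard normal as an assumption): sample from the product measure, then apply the map $(f,g)$ row-wise to get samples from the push-forward.

```python
import numpy as np

# Sketch of the construction above: sample from mu_1 ⊗ mu_2 ⊗ mu_3
# (here all standard normal, an assumption for illustration), then push
# forward under (f, g)(x1, x2, x3) = (x1 + x2, x2 + x3).
rng = np.random.default_rng(2)
n = 100_000
x = rng.standard_normal((n, 3))  # rows are draws of (X1, X2, X3)

# Apply the map (f, g) row-wise to get draws of (X1+X2, X2+X3).
pushforward = np.column_stack((x[:, 0] + x[:, 1], x[:, 1] + x[:, 2]))

# Sanity check: each coordinate has variance Var(X1) + Var(X2) = 2.
print(np.round(pushforward.var(axis=0), 1))
```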

Now, what the measure $(f,g)(\nu)$ (the push-forward of $\nu$ under this map) looks like in general varies quite a bit, depending on the exact nature of your increments. In the classical set-up of Brownian Motion, you appeal to the stability of Gaussian distributions under linear transformations to say something intelligent.

  • Thanks! Would you mind expanding on the case for Brownian Motion? – MrFranzén Jul 25 '19 at 14:25
  • 1
    In the Brownian Motion case, you know that $(X_{t_1}-X_0,...,X_{t_n}-X_{t_{n-1}})$ is jointly Gaussian, and hence any linear transformation of it (like the one you noted that recreates the distribution of $(X_{t_j})_{1\leq j\leq n}$) results in a jointly Gaussian variable. Then you know that its distribution is uniquely determined by its mean and its covariance structure, and calculating the covariance is, in the case of Brownian Motion, straightforward and, in the general set-up, typically much more elementary than computing all of a joint distribution. – WoolierThanThou Jul 25 '19 at 14:29
  • Right! So the key is the characterization of a jointly Gaussian distribution as remaining jointly Gaussian under linear transformations, and then noting that $(X_{t_1}-X_0,...,X_{t_n}-X_{t_{n-1}})+(X_0,...,X_{t_{n-1}}-X_{t_{n-2}})$ may be written as $A(X_{t_1}-X_0,...,X_{t_n}-X_{t_{n-1}})$ for a correctly chosen matrix $A$? – MrFranzén Jul 25 '19 at 14:47
  • Yes. Remember that in this case $X_0=0.$ Otherwise, either you'd have to say that the whole thing is a function of $(X_0,X_{t_1}-X_0,...,X_{t_n}-X_{t_{n-1}}),$ or you'd recover the distribution of $X_{t_j}-X_0,$ which should also suffice for most purposes. – WoolierThanThou Jul 25 '19 at 14:51
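The covariance computation discussed in the comments can be sketched concretely (my own illustration, assuming $X_0=0$ and standard Brownian Motion): the matrix $A$ is the lower-triangular matrix of ones, the increments have covariance $\operatorname{diag}(t_j-t_{j-1})$, and $A\,\operatorname{diag}(\Delta t)\,A^\top$ recovers the familiar covariance $\operatorname{Cov}(X_{t_i},X_{t_j})=\min(t_i,t_j)$.

```python
import numpy as np

# Sketch (assuming X_0 = 0, standard Brownian Motion): increments over
# [t_{j-1}, t_j] are independent N(0, t_j - t_{j-1}), and
# (X_{t_1}, ..., X_{t_n}) = A @ increments with A lower-triangular ones.
t = np.array([0.5, 1.0, 2.5, 4.0])               # hypothetical time grid
dt = np.diff(np.concatenate(([0.0], t)))         # increment variances

A = np.tril(np.ones((len(t), len(t))))           # maps increments to values
cov = A @ np.diag(dt) @ A.T                      # covariance of (X_{t_1}, ..., X_{t_n})

# This should equal the classical Cov(X_s, X_t) = min(s, t).
print(np.allclose(cov, np.minimum.outer(t, t)))  # → True
```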