Let $(X,Y)$ be a random vector with joint moment generating function $$M(t_1,t_2) = \frac{1}{(1-(t_1+t_2))(1-t_2)}$$ Let $Z=X+Y$. Then, Var(Z) is equal to: (IIT JAM MS 2021, Q21)
Using $M_{X+Y}(t) = M_{X,Y}(t, t)$ gives: $$M(t,t) = \frac{1}{(1-2t)(1-t)}=\frac{1}{(1-2t)} \times \frac{1}{(1-t)}$$
And this can be split into the MGFs of two independent random variables $U$ and $V$: $$M_U(t)=\frac{1}{1-2t}=\frac{0.5}{0.5-t}, \quad\text{so } U \sim Exp(\lambda=0.5),$$ $$M_V(t)=\frac{1}{1-t}, \quad\text{so } V \sim Exp(\lambda=1),$$ using the rate parametrization $M(t)=\frac{\lambda}{\lambda-t}$ for $t<\lambda$.
$M_Z(t)=M_{U+V}(t)=M_U(t)M_V(t)$, which (taking $U$ and $V$ to be independent) means the covariance term is zero: $$Var(Z)=Var(U+V)=Var(U)+Var(V)+2\,Cov(U,V)=\frac{1}{0.5^2}+\frac{1}{1^2}=4+1=5$$
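As a cross-check, the same value falls out of the series expansion of $M_Z(t)$ directly, with no splitting at all (partial fractions, then reading off the moments): $$M_Z(t)=\frac{1}{(1-2t)(1-t)}=\frac{2}{1-2t}-\frac{1}{1-t}=\sum_{n\ge 0}\left(2^{n+1}-1\right)t^n,$$ so $E[Z^n]=n!\left(2^{n+1}-1\right)$, giving $E[Z]=3$, $E[Z^2]=14$, and $Var(Z)=14-3^2=5$.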
This matches the correct answer from the official answer key (which is indeed 5). However, I am skeptical about this solution: $X$ and $Y$ are not independent, and this seems to be a way of converting their sum into a sum of two independent random variables. It gives the right answer, but is it valid? And if it isn't, up to what point is it valid?
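To be concrete about the dependence: setting one argument to zero gives the marginal MGFs $M_X(t_1)=M(t_1,0)=\frac{1}{1-t_1}$ and $M_Y(t_2)=M(0,t_2)=\frac{1}{(1-t_2)^2}$, and their product is not $M(t_1,t_2)$, so $X$ and $Y$ are indeed dependent. A quick symbolic check with sympy (just my own verification of the algebra):

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2')
M = 1 / ((1 - (t1 + t2)) * (1 - t2))  # given joint MGF

M_X = M.subs(t2, 0)                   # marginal MGF of X: 1/(1 - t1)
M_Y = M.subs(t1, 0)                   # marginal MGF of Y: 1/(1 - t2)**2
print(sp.simplify(M - M_X * M_Y))     # nonzero, so X and Y are not independent
```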
The general question of interest is: can the sum of two dependent random variables be split into a sum of two independent random variables using this kind of MGF manipulation?
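For a numerical sanity check on the value itself: one concrete pair with this joint MGF (assuming I have matched it correctly) is $X \sim Exp(1)$ and $Y = X + W$ with $W \sim Exp(1)$ independent of $X$, since then $E[e^{t_1 X + t_2 Y}] = E[e^{(t_1+t_2)X}]\,E[e^{t_2 W}] = \frac{1}{(1-(t_1+t_2))(1-t_2)}$. Simulating this pair, the sample variance of $Z=X+Y$ lands near 5:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# One concrete construction matching the given joint MGF (see above):
# X ~ Exp(1), W ~ Exp(1) independent of X, and Y = X + W.
x = rng.exponential(scale=1.0, size=n)
w = rng.exponential(scale=1.0, size=n)
y = x + w

z = x + y
print(np.var(z))            # ~ 5  (= 4*Var(X) + Var(W), since Z = 2X + W)
print(np.cov(x, y)[0, 1])   # ~ 1, so X and Y are clearly dependent
```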