2

(this is a revision of Does the composition of a random variable-valued function with itself induce dependence?)

Say I have a probability space $(\mathbb{R}, \Sigma, \mu)$ and a function $f$ of the form $f: \mathbb{R} \times \mathbb{R} \rightarrow \mathbb{R}$ such that for any distinct $x_1,x_2 \in \mathbb{R}$ the functions $f(x_1, -)$ and $f(x_2, -)$ are non-degenerate independent random variables over $(\mathbb{R}, \Sigma, \mu)$.

Now let's define the random variables $G$ and $H$ as: \begin{align} G(y) = f(f(x_1, y), y) \\ H(y) = f(f(x_2, y), y) \\ \end{align}

Are $G$ and $H$ independent?

gigalord
  • 337

1 Answer

2

Let $\Omega$ be a general sample space (possibly different from the set $[0,1]$). Let $f:\mathbb{R}\times \Omega \rightarrow \mathbb{R}$ be a function that satisfies your assumptions: for outcomes $y$ in the sample space $\Omega$, the random variables $f(x,y)$ (indexed by $x \in \mathbb{R}$) are non-degenerate and pairwise independent. In particular, $f(x,y)$ is a measurable function of the outcome $y$ for each $x \in \mathbb{R}$. For notational simplicity define $V_x(y) = f(x,y)$. Non-degeneracy implies that for each $x \in \mathbb{R}$ there is a threshold $h_x$ such that $$P[V_x(y)>h_x] \in (0,1)$$

Define the function $g:\mathbb{R}\times \Omega \rightarrow \mathbb{R}$ by $$ g(x,y) = \left\{ \begin{array}{ll} 3 &\mbox{ if $x=0$ and $V_0(y)>h_0$} \\ 2 & \mbox{ if $x=0$ and $V_0(y)\leq h_0$}\\ 0 & \mbox{if $x\neq 0$ and $V_x(y)>h_x$}\\ 1 & \mbox{if $x \neq 0$ and $V_x(y)\leq h_x$} \end{array}\right.$$ Then for each distinct $x_1,x_2 \in \mathbb{R}$ the functions $g(x_1,y)$ and $g(x_2,y)$ are measurable functions of $y$, non-degenerate, and are independent random variables. However for $x_1=1$ and $x_2=2$: $$G(y)=g(g(1,y),y) =\left\{\begin{array}{ll} g(0,y) & \mbox{if $V_1(y)>h_1$} \\ g(1,y) & \mbox{ if $V_1(y)\leq h_1$} \end{array}\right.$$ $$H(y)=g(g(2,y),y) =\left\{\begin{array}{ll} g(0,y) & \mbox{if $V_2(y)>h_2$} \\ g(1,y) & \mbox{ if $V_2(y)\leq h_2$} \end{array}\right.$$

Then, by independence of $V_2$ and $V_0$, $$P[H =3]=P[V_2(y)>h_2]\,P[V_0(y)>h_0] > 0 $$ and similarly $P[G=2]=P[V_1(y)>h_1]\,P[V_0(y)\leq h_0]>0$, so conditioning on $\{G=2\}$ makes sense. But $\{G=2\}$ requires $V_0(y)\leq h_0$, while $\{H=3\}$ requires $V_0(y)>h_0$, so $$P[H=3\mid G=2]=0 \neq P[H=3]$$ So $G$ and $H$ are not independent.
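If it helps to see this concretely, here is a quick simulation sketch. It only uses the three indices $x \in \{0,1,2\}$ that the argument actually needs, with i.i.d. fair coin flips standing in for the events $\{V_x(y) > h_x\}$ (any probability in $(0,1)$ would do):

```python
import random

def g(x, v):
    # v[x] is the indicator of the event {V_x(y) > h_x}
    if x == 0:
        return 3 if v[0] else 2
    return 0 if v[x] else 1

random.seed(0)
n = 100_000
count_H3 = count_G2 = count_both = 0
for _ in range(n):
    # One sample outcome: independent coin flips for V_0, V_1, V_2
    v = {x: random.random() < 0.5 for x in (0, 1, 2)}
    G = g(g(1, v), v)
    H = g(g(2, v), v)
    count_H3 += (H == 3)
    count_G2 += (G == 2)
    count_both += (G == 2 and H == 3)

print(count_H3 / n)    # ≈ P[H=3] = 1/4
print(count_G2 / n)    # ≈ P[G=2] = 1/4
print(count_both / n)  # exactly 0: {G=2} forces V_0 ≤ h_0, {H=3} needs V_0 > h_0
```

The empirical frequencies of $\{H=3\}$ and $\{G=2\}$ are each near $1/4$, yet the joint event never occurs, so $G$ and $H$ cannot be independent.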

Michael
  • 26,378
  • This of course assumes that such indexed random variables $f(x,y)$ on the outcome space $y \in [0,1]$, pairwise independent and non-degenerate for all indices $x \in \mathbb{R}$, exist (which I do not think is possible). – Michael Jan 02 '20 at 19:44
  • 2
    Since the sigma algebra and the measure are left unspecified in the question, I am not completely sure if such a family of independent RV can exist. At least for the usual Borel sigma algebra (or in fact for any countably generated sigma algebra), such RV indeed do not exist; see for instance here: https://math.stackexchange.com/questions/1549807/showing-that-there-do-not-exist-uncountably-many-independent-non-constant-rando – PhoemueX Jan 02 '20 at 22:54
  • @PhoemueX : Thanks for the link! That is more directly relevant than the link I gave in the prior question. Indeed I don't think it is possible either, for the sample space $[0,1]$, but it was easier to construct the above counter-example (assuming existence) than to prove nonexistence. – Michael Jan 02 '20 at 23:20
  • I have edited the counter-example to hold for general sample spaces $\Omega$, not necessarily $\Omega =[0,1]$ (the proof does not change). We can indeed have an uncountably infinite number of pairwise independent random variables when the sample space is larger than $[0,1]$ (via the Kolmogorov extension theorem). – Michael Jan 02 '20 at 23:21