
Let $X_1, X_2$ be independent real random variables. Does there exist a non-zero measurable function $f:\mathbb{R}\to\mathbb{R}$ such that $$\frac{f(X_1)}{X_1+X_2}\text{ is independent of }X_1?$$

I am especially interested in the case when $X_1, X_2$ are continuous, but we do not assume anything else about them.

My ideas are the following: if $\frac{f(X_1)}{X_1+X_2}$ does not depend on $X_1$, then $\frac{f(X_1)}{X_1+X_2}=g(X_2)$ for some function $g$. Rewriting this, we get $X_1+X_2 = \frac{f(X_1)}{g(X_2)}$. But on the left-hand side we have a sum, while on the right-hand side we have a product of a function of $X_1$ and a function of $X_2$, which looks like a contradiction. However, these steps are not rigorous, and I am not sure I can make them; in particular, the existence of such a $g$ probably does not hold.

Albert Paradek
  • Note that the composition of measurable functions with differently defined $\sigma$-algebras need not be measurable, so $f(X_1)$ may or may not be a random variable in general. – FShrike Sep 23 '22 at 21:07
  • And, by “not depend”, does that mean to say it is constant wrt changes in $X_1$, or just independent as an independence of random variables? I’m not sure about the existence of $g$. – FShrike Sep 23 '22 at 21:08
  • Well, if $f(X_1)$ is not a random variable, then it doesn't make sense to talk about its independence of $X_1$ and hence, such $f$ does not satisfy the property. So it is enough to consider $f$ such that $f, X_1$ are defined on the same $\sigma$-algebras, right? Thanks for the comment – Albert Paradek Sep 23 '22 at 21:10
  • And by "not depend", yes, I mean as an independence of random variables. I wrote "does not depend" since I kind of wanted to emphasise that $f(x)/(x+y)$ is not a function of $x$ from calculus point of view. But I am not sure if we can do the same step with random variables. – Albert Paradek Sep 23 '22 at 21:12
  • f(X) constant will work. – herb steinberg Sep 23 '22 at 21:57
  • Note that the hypothesis that $$\Psi(X_1,X_2) \text{ is independent of } X_1 \tag{*}$$ does *not* imply that $$\Psi(X_1,X_2)=g(X_2). \tag{**}$$ E.g. suppose that $X_i$ are independent standard normal variables, and $$\Psi(X_1,X_2)=X_2 \cdot \text{sgn}(X_1).$$ – Yuval Peres Sep 24 '22 at 01:08
  • f(X) constant will not work, since $42/(X_1+X_2)$ is not independent of $X_1$. Only $0$ would work, but $f$ cannot be the zero function by assumption. – Albert Paradek Sep 24 '22 at 10:39
  • @YuvalPeres thanks for the thought; it is interesting that such a case can happen. However, does it mean that I am wrong and there exists such an $f$? – Albert Paradek Sep 24 '22 at 10:40
  • I expect that you are right. – Yuval Peres Sep 24 '22 at 10:56
  • What are the quantifiers here? $\forall X_1\forall X_2 \exists f$ or $\exists X_1\exists X_2\exists f$, or something else? – mihaild Sep 24 '22 at 21:12
  • Well, the question is put as: characterize the set of random variables such that $\exists f$... But at this point, I would like to see at least $\exists X_1, \exists X_2, \exists f$, since I cannot find any such triplet (maybe if the random variables are somehow degenerate, but I am not sure about it). – Albert Paradek Sep 25 '22 at 09:36
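
A brief note on the counterexample in Yuval Peres's comment above (a sketch, under his assumption that $X_1, X_2$ are independent standard normals): conditionally on $X_1 = x$ with $x \neq 0$, the variable $\Psi(X_1,X_2) = X_2 \cdot \text{sgn}(x)$ is again standard normal, because the law of $X_2$ is symmetric and $X_2$ is independent of $X_1$. The conditional law of $\Psi$ therefore does not depend on $x$, so $\Psi$ is independent of $X_1$; yet $\Psi$ is not almost surely equal to any function of $X_2$ alone, since given $X_2 = t \neq 0$ it equals $\pm t$ with probability $\frac{1}{2}$ each.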

1 Answer


If we are interested just in existence, there is such an example. Let $\xi = \frac{f(X_1)}{X_1 + X_2}$.

Assume $X_1$ takes values $x_1$ and $x_2$ with probabilities $\frac{1}{2}$ each, and $X_2$ independently takes values $y_1$ and $y_2$, also with probabilities $\frac{1}{2}$ each.

Then, if $X_1 = x_1$, $\xi$ takes the values $\frac{f(x_1)}{x_1 + y_1}$ and $\frac{f(x_1)}{x_1 + y_2}$ with probability $\frac{1}{2}$ each, and if $X_1 = x_2$, it takes the values $\frac{f(x_2)}{x_2 + y_1}$ and $\frac{f(x_2)}{x_2 + y_2}$.

As we want $\xi$ to be independent of $X_1$, the two sets of values (each taken with probability $\frac{1}{2}$) should be the same. Assume that, as in Yuval's example in the comments, the two values of $X_1$ transpose the values of $\xi$, so we have

$$\begin{cases} \frac{f(x_1)}{x_1 + y_1} = \frac{f(x_2)}{x_2 + y_2}\\ \frac{f(x_1)}{x_1 + y_2} = \frac{f(x_2)}{x_2 + y_1} \end{cases}$$

This is equivalent to $$\begin{cases} x_1 + y_1 + x_2 + y_2 = 0\\ f(x_1) = -f(x_2) \end{cases}$$ provided none of the denominators above is equal to $0$.
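
A short check of the equivalence (a sketch, assuming additionally that $y_1 \neq y_2$ and $f(x_1) \neq 0$): cross-multiplying the two equations gives
$$f(x_1)(x_2+y_2) = f(x_2)(x_1+y_1), \qquad f(x_1)(x_2+y_1) = f(x_2)(x_1+y_2).$$
Subtracting them yields $(f(x_1)+f(x_2))(y_2-y_1) = 0$, hence $f(x_1) = -f(x_2)$; substituting back gives $x_2+y_2 = -(x_1+y_1)$, i.e. $x_1+x_2+y_1+y_2 = 0$. Conversely, if $x_1+x_2+y_1+y_2 = 0$ and $f(x_1) = -f(x_2)$, then $x_2+y_2 = -(x_1+y_1)$ and $x_2+y_1 = -(x_1+y_2)$, and both equalities of the system follow directly.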

So, we can take, for example, $x_1 = 1$, $x_2 = -1$, $f(x) = x$, $y_1 = 2$, $y_2 = -2$, and get

$$P\left(\xi = \frac{1}{3}\right) = P\left(\xi = \frac{1}{3} \mid X_1 = 1\right) = \frac{1}{2}$$ $$P\left(\xi = -1\right) = P\left(\xi = -1 \mid X_1 = 1\right) = \frac{1}{2}$$ and the same computation gives the same conditional probabilities given $X_1 = -1$.

Thus $\xi$ is independent of $X_1$.
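
For anyone who wants to double-check this, here is a minimal sketch (not part of the original answer) that enumerates the four equally likely outcomes of $(X_1, X_2)$ and compares the conditional distribution of $\xi$ given $X_1$ with its unconditional distribution:

```python
# Sanity check of the discrete example above (x1 = 1, x2 = -1, y1 = 2, y2 = -2,
# f(x) = x): enumerate the four equally likely outcomes of (X_1, X_2) and
# compare the conditional law of xi = f(X_1)/(X_1 + X_2) given X_1 with its
# unconditional law.
from fractions import Fraction
from collections import Counter


def f(x):
    return x  # f(x) = x, as chosen in the answer


xs = [1, -1]   # values of X_1, probability 1/2 each
ys = [2, -2]   # values of X_2, probability 1/2 each

# all four outcomes (X_1, xi), each with probability 1/4
joint = [(x, Fraction(f(x), x + y)) for x in xs for y in ys]

# unconditional distribution of xi
unconditional = Counter(xi for _, xi in joint)
print("unconditional:", {str(v): f"{c}/4" for v, c in unconditional.items()})

# conditional distribution of xi given each value of X_1
for x in xs:
    cond = Counter(xi for x1, xi in joint if x1 == x)
    print(f"given X_1 = {x}:", {str(v): f"{c}/2" for v, c in cond.items()})
```

Both conditional distributions agree with the unconditional one: $\xi$ equals $\frac{1}{3}$ or $-1$ with probability $\frac{1}{2}$ each.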

mihaild
  • Thanks for the example. It is interesting that such a case exists. However, this argument will not hold if $X_1, X_2$ are non-binary and can attain more than $3$ values; here, you have six parameters and two equations (the six parameters are $x_1, x_2, y_1, y_2, f(x_1), f(x_2)$). However, if $X_1, X_2$ can attain $k$ values, then you will have $3k$ parameters and $k(k-1)$ equations. Hence, more equations than parameters. Does it sometimes have a solution? I don't know. What about a continuous case? These questions are still bothering me a lot :D But thanks, I appreciate the time you spent. – Albert Paradek Sep 26 '22 at 16:02