
Let $X$, $Y$ be random variables on $ (Ω, \mathcal A, \mathbb P) $, let $ \mathcal F ⊂ \mathcal A $ be a sub-σ-algebra, and suppose $X$ is measurable w.r.t. $\mathcal F$. Let $X$ take values in $(E, \mathcal E)$ and $Y$ values in $(E', \mathcal E')$, and consider a measurable map $ g: E \times E' → \mathbb R$ that is either nonnegative, or bounded, or such that $ g(X,Y) ∈ \mathcal L_1(\mathbb P)$.

I wonder whether it holds that \begin{equation} \mathbb E[g(X,Y) | \mathcal F] = \mathbb E[g(x,Y) | \mathcal F] \mid_{x = X}. \tag{$*$} \end{equation} If $Y$ happens to be independent of $\mathcal F$, this is indeed the case, as shown in this answer. What can we say without independence? Does $(*)$ hold in general, or is there a counterexample? If it does not hold in general, does it hold under some additional assumptions on $X$ and $Y$?


Edit: Here is a proof attempt using the 'standard machinery' (and where I think it meets an obstacle):

We will use the functional monotone class theorem. Let $ H $ be the set of bounded measurable maps for which $(*)$ holds; it is a vector space containing the constants. For $A ∈ \mathcal E$, $ B ∈ \mathcal E'$ and $ g = I_{A \times B}$, it does indeed hold that \begin{align*} \mathbb E[g(X,Y) | \mathcal F] &= \mathbb E[I_A(X) I_B(Y) | \mathcal F] \\ &= I_A(X) \, \mathbb E[I_B(Y) | \mathcal F] \\ &= \mathbb E[I_A(x) I_B(Y) | \mathcal F] \mid_{x =X} = \mathbb E[g(x,Y) | \mathcal F] \mid_{x =X}, \end{align*} where the second equality uses that $I_A(X)$ is $\mathcal F$-measurable. So $H$ contains the indicators of rectangles $A \times B$ with $A ∈ \mathcal E$, $B ∈ \mathcal E'$, which form a $π$-system generating $\mathcal E \otimes \mathcal E'$. If we can show that $H$ is in addition closed under nonnegative bounded monotone convergence, it follows that $H$ contains all bounded $\mathcal E \otimes \mathcal E'$-measurable functions and we are (mostly) done. So let $g \geq 0$ be bounded and measurable, and let $(g_n) ⊂ H$ with $0 \leq g_n ↗ g$. Then \begin{align*} \mathbb E[g(X,Y) | \mathcal F] &\overset{\mathrm{a.s.}}{=} \lim_{n → ∞} \mathbb E[g_n(X,Y) | \mathcal F] \\ &= \lim_{n → ∞} \mathbb E[g_n(x,Y) | \mathcal F]\mid_{x = X} \\ &\overset{(?)}{=} \mathbb E[g(x,Y) | \mathcal F]\mid_{x = X}. \end{align*} I see the following issue with the last equality. Let $ G_n(ω,x) := \mathbb E[g_n(x,Y)| \mathcal F](ω)$. From $g_n ↗ g$ and monotone convergence for conditional expectations, for each fixed $x$ we get $G_n(ω,x) → G(ω,x) := \mathbb E[g(x,Y)| \mathcal F](ω)$ almost surely, i.e. for all $ω ∈ N_x^c$ where $\mathbb P [N_x ] = 0$. But the exceptional set $N_x$ of those $ω$ where convergence fails depends on $x$, so unless, say, $E$ is countable, these sets may accumulate and prevent convergence on a set of positive measure.
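The countable-$E$ escape hatch mentioned at the end can be illustrated concretely: on a finite probability space, $(*)$ holds pointwise by direct computation. This is a sanity check, not a proof; the space, the (non-uniform, dependence-inducing) measure and the maps $X$, $Y$, $g$ below are arbitrary illustrative choices.

```python
from fractions import Fraction

# Sanity check of (*) on a finite probability space. P is non-uniform so
# that X and Y below are dependent; F = sigma(X).
P = {0: Fraction(1, 10), 1: Fraction(2, 10), 2: Fraction(3, 10), 3: Fraction(4, 10)}
X = lambda w: w % 2              # F-measurable since F = sigma(X)
Y = lambda w: w // 2
g = lambda x, y: x * y + y ** 2  # some bounded measurable g

def cond_exp_given_X(h, w):
    """E[h | sigma(X)](w): weighted average of h over the atom {X = X(w)}."""
    atom = [v for v in P if X(v) == X(w)]
    mass = sum(P[v] for v in atom)
    return sum(h(v) * P[v] for v in atom) / mass

for w in P:
    lhs = cond_exp_given_X(lambda v: g(X(v), Y(v)), w)  # E[g(X,Y)|F](w)
    x = X(w)                                            # freeze x = X(w) ...
    rhs = cond_exp_given_X(lambda v: g(x, Y(v)), w)     # ... then condition
    assert lhs == rhs                                   # (*) holds exactly
```

The two sides agree termwise because $X(v) = X(w)$ on each atom of $\sigma(X)$; no exceptional sets arise when the state space is countable.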


Edit #2: A sufficient condition for $(*)$ to hold is that $Y$ takes values in a Borel (or Polish) space (see below). I have not found an example of failure of $(*)$ without these assumptions, so I'll leave the question open.

jro
  • The theorem which holds for $Y$ independent of ${\cal F}$ says something different: \begin{equation} \mathbb E[g(X,Y) | \mathcal F] = \mathbb E[g(x,Y)] \mid_{x = X} . \end{equation} That is: there is an ordinary expectation on the RHS, not a conditional one. This becomes false when $Y$ is not independent of ${\cal F}$. – Kurt G. Feb 24 '23 at 06:27
  • @geetha290krm, I added a proof attempt. I think the standard arguments are not quite sufficient here. – jro Feb 24 '23 at 11:39

3 Answers


The theorem which holds for $X$ being ${\cal F}$-measurable and $Y$ independent of $\cal F$ says something different: $$\tag{1} \mathbb E\big[g(X,Y)\big|\mathcal F\big]=\mathbb E\big[g(x,Y)\big]\mid_{x=X}. $$ That is: there is an ordinary expectation on the RHS, not a conditional one.

This is false when $Y$ is not independent of $\cal F\,:$

Counterexample

Let $Y=X$. Then $X$ and $Y$ are not independent and, for ${\cal F}=\sigma(X)\,,$ $Y$ is not independent of ${\cal F}\,.$ For $g(X,Y)=XY$ we have $$ \mathbb E\big[g(X,Y)\big|{\cal F}\big]=XY=X^2\,. $$ On the other hand, when $X$ (and therefore $Y$) has expectation zero, the RHS of (1) becomes $$ X\,\mathbb E[Y]=0\,, $$ which differs from $X^2$ whenever $X$ is not a.s. zero.
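This can be checked exactly on a two-point space; the concrete choices (uniform $P$ on $\{-1,+1\}$, $g(x,y)=xy$) are illustrative.

```python
from fractions import Fraction

# Exact check of the counterexample: Omega = {-1, +1} with uniform P,
# X(w) = w, Y = X, F = sigma(X), g(x, y) = x * y.  Since X is injective,
# sigma(X) is the full power set, so conditioning on F leaves any
# function of w unchanged.
P = {-1: Fraction(1, 2), 1: Fraction(1, 2)}
X = lambda w: w
Y = lambda w: w
g = lambda x, y: x * y

for w in P:
    lhs = g(X(w), Y(w))                             # E[g(X,Y)|F](w) = X(w)^2
    rhs_eq1 = sum(g(X(w), Y(v)) * P[v] for v in P)  # E[g(x,Y)]|_{x=X(w)} = X(w)*E[Y]
    assert lhs == 1      # left-hand side is identically X^2 = 1
    assert rhs_eq1 == 0  # RHS of (1) is identically X*E[Y] = 0, so (1) fails
```

As observed in the comments below, the conditional version $(*)$ gives $xY\mid_{x=X}=X^2$ here, matching the left-hand side, so this example only refutes (1).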

Kurt G.
  • Thank you for your answer! I agree that in the case in which $Y$ is independent from $\mathcal F$, eq. (1) holds. But in my understanding, this is equivalent to $\mathbb E[g(X,Y) | \mathcal F] = \mathbb E[g(x,Y) | \mathcal F] \mid_{x = X} $, since with independence, the conditional expectation becomes an unconditional one.

    For the original formulation without independence, the counterexample does not work, since $\mathbb E[g(x,Y) | \mathcal F] \mid_{x = X} = xY \mid_{x =X } = XY = \mathbb E[XY|\mathcal F]$.

    – jro Feb 24 '23 at 10:57
  • The version you have in OP I have never seen. Let's try the following. Assume $X,Y$ are both ${\cal F}$-measurable (not necessarily $Y$ independent). By the tower property $\mathbb E[g(X,Y)|{\cal F}]=\mathbb E[g(X,Y)|\sigma(X,Y)]\,.$ Then by the Doob-Dynkin lemma this is a deterministic function $f$ of $X,Y\,.$ This $f(x,y)$ is often written as $\mathbb E[g(X,Y)|X=x,Y=y]\,.$ I believe this is a more standard way of expressing your version. – Kurt G. Feb 24 '23 at 11:21
  • Actually, it is simpler. When $Y$ is as in my last comment then $\mathbb E[g(X,Y)|{\cal F}]=g(X,Y),.$ The case that needs to be studied is therefore $Y$ neither ${\cal F}$-measurable, nor independent thereof. This is usually approached by showing it first for functions $g(x,y)$ of the form $f(x)h(y)$ for which your formula is trivial. To show it for all functions a monotone class theorem can be used I think. @geetha290krm's comment is of a similar spirit. – Kurt G. Feb 24 '23 at 12:55
  • I have edited the OP with a proof attempt. It seems to me there is an argument missing though. – jro Feb 24 '23 at 13:10
  • Thanks. Before I try to unravel this just one more remark: the proof of my classic version (1) should be the same as your version and along the lines I mentioned in the previous comment. I cannot give you a reference but maybe you want to look for one in the internet. – Kurt G. Feb 24 '23 at 13:13
  • Agreed! I think the proof in the op (which is very similar to your idea) works when $Y$ is independent of $\mathcal F$ and then shows your eq. (1), since in this case (the conditional expectations being unconditional) the obstruction from exceptional sets piling to cause trouble does not happen. – jro Feb 24 '23 at 14:34

A sufficient condition for $(*)$ to hold is that $Y$ has a regular conditional distribution (r.c.d.) given $\mathcal F$ (see e.g. §8.3 in Klenke's Probability Theory for details on r.c.d.s). This is certainly the case when the measurable space $(E', \mathcal E')$ in which $Y$ takes its values is a Borel space, in particular also when $E'$ is a Polish space with Borel σ-algebra $\mathcal E'$ (Thm. 8.37 in the same book).


The proof is essentially the one given in the question above; the difficulty in the equality $(?)$ is circumvented using regularity of the conditional distribution. Details: Let $κ_{Y|\mathcal F}: Ω \times \mathcal E' → [0,1]$ be a r.c.d. of $Y$ given $\mathcal F$. The claim is that in the above setting, $$ \begin{aligned} \mathbb E[g(X,Y) | \mathcal F] &\overset{\mathrm{a.s.}}{=} ∫_{E'} g(X,y) \, κ_{Y|\mathcal F}(·, \mathrm{d}y) \\ &\overset{\mathrm{a.s.}}{=} \mathbb E[g(x,Y) | \mathcal F ]\mid_{x = X} \end{aligned} \tag{$**$} $$ holds. The second equality in $(**)$ follows from a general property of r.c.d.s (see e.g. Thm. 8.38 in Klenke's book), and it remains to show the first. Let $ H $ denote the set of bounded measurable maps for which this first equality in $(**)$ holds. Note that $H$ is a vector space containing the constants. For $A ∈ \mathcal E$, $ B ∈ \mathcal E'$, $ g := I_{A \times B}$ and $ω ∈ Ω$, we have \begin{align*} \int_{E'} g(X(\omega)&,y) \, \kappa_{Y|\mathcal{F}}(\omega, \mathrm{d}y) = \int_{E'} I_{A}(X(\omega)) I_{B}(y) \, \kappa_{Y|\mathcal{F}}(\omega, \mathrm{d}y) \\ &= I_{A}(X(\omega)) \, \kappa_{Y|\mathcal{F}}(\omega,B) \overset{\text{a.e. } ω}{=} I_{A}(X(\omega)) \, \mathbb{E}[ I_{\{ Y \in B \} } | \mathcal{F} ](\omega) \\ &= \mathbb{E}[ I_{A}(X) I_{B}(Y) | \mathcal{F}](\omega) = \mathbb{E}[g(X,Y) | \mathcal{F}](\omega), \end{align*} where the last line uses that $I_A(X)$ is $\mathcal F$-measurable. Hence $H$ contains the indicators of rectangles $A \times B$, which form a $π$-system generating $\mathcal E \otimes \mathcal E'$. By the functional monotone class theorem, it suffices to show that $H$ is in addition closed under nonnegative bounded monotone convergence. So let $g \geq 0$ be bounded and measurable, and let $(g_n) ⊂ H$ with $0 \leq g_n ↗ g$.
Then \begin{align*} \mathbb E[g(X,Y) | \mathcal F] &\overset{\mathrm{a.s.}}{=} \lim_{n → ∞} \mathbb E[g_n(X,Y) | \mathcal F] \\ &\overset{\mathrm{a.s.}}{=} \lim_{n → ∞} ∫_{E'} g_n(X,y) \, κ_{Y|\mathcal F}(·, \mathrm{d}y). \end{align*} Since $κ_{Y|\mathcal F}$ is an r.c.d., $κ_{Y|\mathcal F}(ω,·)$ is a measure for every $ω ∈ Ω$. Hence monotone convergence applies on the right-hand side pointwise for every $ω ∈ Ω$, and we obtain \begin{align*} \mathbb E[g(X,Y) | \mathcal F] \overset{\mathrm{a.s.}}{=} ∫_{E'} g(X,y) \, κ_{Y|\mathcal F}(·, \mathrm{d}y), \end{align*} so $g ∈ H$, proving the claim.
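For intuition, the first equality in $(**)$ can be checked exactly in a discrete setting, where the conditional pmf of $Y$ given $\mathcal F = \sigma(X)$ is automatically a regular version. The finite space and the maps $X$, $Y$, $g$ below are illustrative choices.

```python
from fractions import Fraction

# kappa(w, .) = P(Y = . | X = X(w)) is an r.c.d. of Y given sigma(X) on
# this finite space; we verify E[g(X,Y)|F] = integral of g(X, y) kappa(., dy).
P = {0: Fraction(1, 6), 1: Fraction(2, 6), 2: Fraction(1, 6), 3: Fraction(2, 6)}
X = lambda w: w % 2
Y = lambda w: w // 2
g = lambda x, y: (x + 1) * (y + 2)  # some bounded measurable g

def atom_of(w):
    """The sigma(X)-atom containing w, together with its probability mass."""
    atom = [v for v in P if X(v) == X(w)]
    return atom, sum(P[v] for v in atom)

def kappa(w, y):
    """P(Y = y | X = X(w))."""
    atom, mass = atom_of(w)
    return sum(P[v] for v in atom if Y(v) == y) / mass

for w in P:
    atom, mass = atom_of(w)
    # direct conditional expectation: average over the atom {X = X(w)}
    direct = sum(g(X(v), Y(v)) * P[v] for v in atom) / mass
    # kernel form: integrate g(X(w), .) against kappa(w, .)
    via_kernel = sum(g(X(w), y) * kappa(w, y) for y in {Y(v) for v in P})
    assert direct == via_kernel
```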

jro

If $X$ is $\mathcal{F}$-measurable and there exists a regular conditional probability $P_{Y\mid\mathcal{F}}$ of $Y$ given $\mathcal F$, then there also exists a regular conditional probability $P_{(X,Y)\mid\mathcal{F}}$ of the pair $(X,Y)$, and it satisfies $$P_{(X,Y)\mid\mathcal{F}}(A\times B)=I_A(X)\cdot P_{Y\mid\mathcal{F}}(B).$$ So: $$\mathbb E[g(X,Y)\mid \mathcal{F}]=\int g(x,y)\,\mathrm dP_{(X,Y)\mid\mathcal{F}}(x,y)=\int g(X,y)\,\mathrm dP_{Y\mid\mathcal{F}}(y),$$ or symbolically: $$\mathbb E[g(X,Y)\mid \mathcal{F}](\omega)=\mathbb E[g(X(\omega),Y)\mid \mathcal{F}](\omega).$$
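The product identity $P\big((X,Y)\in A\times B\mid\mathcal F\big)=I_A(X)\,P(Y\in B\mid\mathcal F)$ for $\mathcal F$-measurable $X$ can be checked exactly on a small finite space; the measure and maps below are illustrative.

```python
from fractions import Fraction

# F = sigma(X); on each atom {X = X(w)} the value X(v) is constant, so the
# event {X in A} factors out of the conditional probability as I_A(X).
P = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(2, 8), 3: Fraction(2, 8)}
X = lambda w: w % 2
Y = lambda w: w // 2

def cond_prob(w, event):
    """P(event | X = X(w)), for event a predicate on Omega."""
    atom = [v for v in P if X(v) == X(w)]
    mass = sum(P[v] for v in atom)
    return sum(P[v] for v in atom if event(v)) / mass

for w in P:
    for A in ({0}, {1}, {0, 1}):
        for B in ({0}, {1}, {0, 1}):
            joint = cond_prob(w, lambda v: X(v) in A and Y(v) in B)
            marginal = cond_prob(w, lambda v: Y(v) in B)
            indicator = 1 if X(w) in A else 0
            assert joint == indicator * marginal
```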

Speltzu