3

As I keep reading probability books, I keep running into issues that none of them seem to address.

For example,

for $\omega \in \Omega$ and independent random variables $X$ and $Y$, define $Z(\omega) = X(\omega) \cdot Y(\omega)$. Then, if $E[X]$, $E[Y]$, and $E[Z]$ are all defined, we know that $E[X] \cdot E[Y] = E[Z]$.

But I'm really curious: is there a situation where $E[X]$ and $E[Y]$ are defined, but $E[X \cdot Y]$ (i.e. $E[Z]$) is $\infty$, or even divergent? I wasn't able to think of one.

(Is it OK to post more than one question on the same day?)

Thanks again.

henry11
  • 75
  • That's not possible with $X$ and $Y$ independent. Do you still mean to include independence as an assumption? – Douglas Zare Apr 06 '11 at 20:19
  • Yes, that's what I had in mind. How come it's not possible? How does one prove that? – henry11 Apr 06 '11 at 20:22
  • The formula $E[XY]=E[X]E[Y]$, when $X$ and $Y$ are independent integrable random variables, can be proved using the Monotone Convergence Theorem. But this is not elementary... – Shai Covo Apr 07 '11 at 01:29
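
For contrast with Douglas Zare's comment above: without independence the answer is yes. Here is a minimal numerical sketch (assuming NumPy; the Pareto distribution with tail index $1.5$ is my own illustrative choice, not from the thread). Take $Y = X$, so $E[X] < \infty$ but $E[XY] = E[X^2] = \infty$; the empirical mean settles down while the empirical second moment does not:

```python
import numpy as np

rng = np.random.default_rng(0)

# Dependent case: Y = X, with X Pareto of tail index 1.5,
# so E[X] = 3 is finite but E[XY] = E[X^2] is infinite.
for size in (10**4, 10**5, 10**6, 10**7):
    x = rng.pareto(1.5, size=size) + 1.0  # classical Pareto on [1, inf)
    # the mean settles near 3; the second-moment estimate keeps growing
    print(size, x.mean(), (x * x).mean())
```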

2 Answers

1

No. If $X,Y$ are integrable (i.e. $E|X| < \infty$, $E|Y|<\infty$) and independent, then $Z=XY$ is integrable.

The first general proof I can think of uses the distribution measures $\mu_X$, $\mu_Y$ of $X$ and $Y$. We have $E|Z| = \iint |xy|\,\mu_X(dx)\,\mu_Y(dy)$, which by Tonelli's theorem equals $\int |x|\,\mu_X(dx) \int |y|\,\mu_Y(dy)$. But this is just $E|X|\,E|Y|$, which is finite.
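
As a purely numerical sanity check of this identity (not part of the proof), here is a minimal Monte Carlo sketch, assuming NumPy; the exponential and normal distributions are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Two independent integrable random variables.
x = rng.exponential(scale=1.0, size=n)      # E|X| = 1
y = rng.normal(loc=0.0, scale=1.0, size=n)  # E|Y| = sqrt(2/pi), about 0.798

print(np.abs(x * y).mean())                 # empirical E|XY|
print(np.abs(x).mean() * np.abs(y).mean())  # empirical E|X| * E|Y|
```

The two printed values should agree up to Monte Carlo error.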

Nate Eldredge
  • 101,664
  • I'm not familiar with all these terms and theorems; I have only gotten as far as probability distributions, random variables, and expectation. Is it possible to explain using those? Thank you. – henry11 Apr 06 '11 at 20:34
  • Isn't Fubini's theorem good enough? – Yuval Filmus Apr 06 '11 at 20:46
  • Yuval, I'm not familiar with that one either. Can you explain to me in general why it is not possible, without a formal proof? – henry11 Apr 06 '11 at 20:54
1

You can think of it this way. If $X$ and $Y$ are independent, the conditional distribution of $|Y|$ given $X$ is the same as the distribution of $|Y|$ itself. So $E[\,|X|\,|Y| \mid X\,] = |X|\,E[\,|Y| \mid X\,] = |X|\,E[|Y|]$, and $E[\,|X|\,|Y|\,] = E\bigl[\,E[\,|X|\,|Y| \mid X\,]\,\bigr] = E[|X|]\,E[|Y|]$.

Robert Israel
  • 470,583
  • To write something like $E[\,|X|\,|Y| \mid X\,]$, one must first assume that $XY$ is integrable. – Did Apr 06 '11 at 23:12
  • Although that is the assumption when it is written, for non-negative random variables there is a single non-negative random variable (not necessarily finite) which behaves like the conditional expectation, even if the original variable is not integrable. – yaakov Apr 07 '11 at 13:45
  • If you're worried about convergence, you may replace $|X|$ and $|Y|$ by truncated versions of them, and then take limits. – Robert Israel Apr 07 '11 at 18:07
  • @yaakov I know that, thanks. But I doubt a solution based on such a twist has much pedagogical value (especially when the question is to prove that integrability holds). – Did Apr 11 '11 at 05:52
  • See comment above. – Did Apr 11 '11 at 05:53
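
A minimal numerical sketch of the truncation idea from Robert Israel's comment above (assuming NumPy; the Pareto example with tail index $1.5$ is my own choice): replace $X$ and $Y$ by the bounded variables $\min(X, n)$ and $\min(Y, n)$, for which there are no convergence worries, and let $n \to \infty$; by monotone convergence the truncated expectations increase to $E|X|\,E|Y|$.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 10**6

# Independent, heavy-tailed but integrable: Pareto with tail index 1.5.
x = rng.pareto(1.5, size=m) + 1.0
y = rng.pareto(1.5, size=m) + 1.0

target = x.mean() * y.mean()  # empirical E|X| * E|Y| (about 3 * 3 = 9)

# E[min(X, n) * min(Y, n)] increases to E[XY] as n grows (monotone convergence).
for n in (1, 10, 100, 1000):
    print(n, (np.minimum(x, n) * np.minimum(y, n)).mean(), target)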