
I want to prove that, for any monotonic function $u$ (I take $u$ increasing and differentiable in what follows):

$$\operatorname{E}(u(X)) = \int_{-\infty}^\infty u(x) f(x) dx$$

I set $Y = u(X)$ and then use the CDF method to change variables in the following way:

$$F_Y (y) = \operatorname{P}(Y \le y) = \operatorname{P}(u(X) \le y) = \operatorname{P}(X \le u^{-1}(y)) = F_X(u^{-1}(y))$$

$$\implies f_X(u^{-1}(y)) = \left.\frac{d}{dt} F_X(t)\right|_{t \,=\, u^{-1}(y)}$$

And using the inverse function rule $(g^{-1})'(y) = \frac{1}{g'(g^{-1}(y))}$:

$$\implies f_Y(y) = \frac{d}{dy} F_Y(y) = \frac{d}{dy} F_X(u^{-1}(y)) = f_X(u^{-1}(y)) \cdot \frac{d}{dy} u^{-1}(y) = f_X(u^{-1}(y)) \, \frac{1}{u'(u^{-1}(y))}$$
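For example, if $X$ is standard normal and $u(x) = e^x$, then $u^{-1}(y) = \ln y$ and $u'(u^{-1}(y)) = y$, so this formula recovers the familiar lognormal density

$$f_Y(y) = \frac{1}{y\sqrt{2\pi}} \, e^{-(\ln y)^2/2}, \qquad y > 0.$$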

$$\begin{align} \operatorname{E}(u(X)) & = \operatorname{E}(Y) \\ & = \int_{-\infty}^\infty y \; f_Y (y) \, dy \\ & = \int_{-\infty}^\infty u(x) \; f_Y (u(x)) \; u'(x) \, dx & y = u(x) \implies dy = u'(x) \, dx \\ & = \int_{-\infty}^\infty u(x) \; f_X (u^{-1}(u(x))) \; \frac{1}{u'(x)} \; u'(x) \, dx \\ & = \int_{-\infty}^\infty u(x) \; f_X (x) \, dx \end{align}$$
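As a quick numerical sanity check (not part of the proof), here is a minimal sketch, assuming `numpy` and `scipy` are available, comparing a Monte Carlo estimate of $\operatorname{E}(u(X))$ with the integral $\int u(x) f_X(x) \, dx$ for $X \sim N(0,1)$ and $u(x) = e^x$:

```python
# Check E[u(X)] = ∫ u(x) f_X(x) dx for X ~ N(0, 1) and u(x) = e^x.
import numpy as np
from scipy import integrate, stats

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)

mc = np.exp(x).mean()                       # Monte Carlo estimate of E[u(X)]
lotus, _ = integrate.quad(lambda t: np.exp(t) * stats.norm.pdf(t),
                          -np.inf, np.inf)  # ∫ u(x) f_X(x) dx
exact = np.exp(0.5)                         # known lognormal mean, e^{1/2}

print(mc, lotus, exact)  # all three agree to about three decimals
```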

I know that the proof using the Jacobian method is much shorter and simpler, but is this proof valid? If not, what is wrong with it?

Thanks for any feedback!

  • What is your motivation for proving it this way? Btw, there is an elementary proof of this that does not use any Jacobians at all, and works without conditions on $u$ or the distribution of $X$ (except that the expectation must exist). – drhab Apr 19 '18 at 14:28
  • @drhab Could you post the link to that proof? This was an exam question and I feel like the shorter the proof, the less time I'll spend on this question... – Alex Lostado Apr 19 '18 at 16:13
  • My first instinct was that this was the definition of $\mathbb E[g(X)]$. Whether that's true or not depends on your approach, I suppose -- but you can find a good discussion of this topic here: https://math.stackexchange.com/questions/1277800/expected-value-of-a-function-of-a-random-variable – Aaron Montgomery Apr 19 '18 at 16:30
  • Search using the keyword "Law of the unconscious statistician" – StubbornAtom Apr 19 '18 at 17:46
  • Instead of providing a link I made it clear in an answer (too much for a comment). – drhab Apr 20 '18 at 09:02

1 Answer


This is not really an answer to your question but shows you another route.


First some basics.

Let $\langle\Omega,\mathcal{A},\mathsf{P}\rangle$ denote the probability measure space on which random variable $X$ is defined.

Then $X:\Omega\to\mathbb{R}$ is a measurable function in the sense that $X^{-1}\left(B\right)\in\mathcal{A}$ for every $B\in\mathcal{B}$, where $\mathcal{B}$ denotes the $\sigma$-algebra of Borel sets on $\mathbb{R}$.

Here $X^{-1}\left(B\right)=\left\{ \omega\in\Omega\mid X\left(\omega\right)\in B\right\} $ and this set is also denoted by $\left\{ X\in B\right\} $ and $\mathsf{P}\left(\left\{ X\in B\right\} \right)$ is abbreviated by $\mathsf{P}\left(X\in B\right)$.

Further, $X$ induces a probability measure $\mathsf{P}_{X}$ on $\mathcal{B}$, defined by $B\mapsto\mathsf{P}\left(X\in B\right)=\mathsf{P}\left(X^{-1}\left(B\right)\right)$.
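When $X$ has a density $f_{X}$, as in your question, this measure is simply

$$\mathsf{P}_{X}\left(B\right)=\int_{B}f_{X}\left(x\right)\,dx,$$

which connects the abstract setup here to the density formulation above.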

This together results in a probability measure space $\langle\mathbb{R},\mathcal{B},\mathsf{P}_{X}\rangle$.

If $X$ is integrable with respect to the measure $\mathsf{P}$, then it has a so-called expectation, defined as follows:

$$\mathsf{E}X=\int X\left(\omega\right)\mathsf{P}\left(d\omega\right)$$ and essential in this context is the equality: $$\int X\left(\omega\right)\mathsf{P}\left(d\omega\right)=\int x\mathsf{P}_{X}\left(dx\right)\tag1$$

The RHS of the equality is usually written as $\int xdF_{X}\left(x\right)$ where $F_{X}$ is the CDF of $X$, which completely determines $\mathsf{P}_{X}$.

This equality can be proved by first showing that it is valid for indicator functions.

Note for instance that - if $X=1_{B}$ for some $B\in\mathcal{B}$ - we will find: $F_{X}\left(x\right)=\begin{cases} 0 & \text{if }x<0\\ 1-\mathsf{P}\left(B\right) & \text{if }0\leq x<1\\ 1 & \text{otherwise} \end{cases}$

leading to $\int X\left(\omega\right)\mathsf{P}\left(d\omega\right)=\int1_{B}\left(\omega\right)\mathsf{P}\left(d\omega\right)=\mathsf{P}\left(B\right)=\int xdF_{X}\left(x\right)=\int x\mathsf{P}_{X}\left(dx\right)$.

Then it can easily be extended to step functions and finally to integrable functions.
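For instance, if $X=\sum_{i=1}^{n}a_{i}1_{B_{i}}$ is a step function with disjoint $B_{i}\in\mathcal{A}$, then $\mathsf{P}_{X}$ places mass $\mathsf{P}\left(B_{i}\right)$ on each value $a_{i}$, and both sides of $(1)$ reduce to the same finite sum:

$$\int X\left(\omega\right)\mathsf{P}\left(d\omega\right)=\sum_{i=1}^{n}a_{i}\mathsf{P}\left(B_{i}\right)=\int x\,\mathsf{P}_{X}\left(dx\right)$$

The general case then follows by monotone approximation of a nonnegative integrable $X$ by step functions, and by splitting $X=X^{+}-X^{-}$.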


With all this in place, the way is open to prove the equality in your question.

As above, let $\langle\Omega,\mathcal{A},\mathsf{P}\rangle$ denote the probability space on which the random variable $X$ is defined.

Let $U:\mathbb{R}\to\mathbb{R}$ be a random variable defined on the probability space $\langle\mathbb{R},\mathcal{B},\mathsf{P}_{X}\rangle$ and let it be integrable.

To emphasize that it is a random variable, I use the capital $U$ instead of $u$.

It induces, as stated above, a probability space $\langle\mathbb{R},\mathcal{B},\left(\mathsf{P}_{X}\right)_{U}\rangle$, and we easily find that: $$\left(\mathsf{P}_{X}\right)_{U}=\mathsf{P}_{U\circ X}$$ by noting that for $B\in\mathcal{B}$ we have: $\left(\mathsf{P}_{X}\right)_{U}\left(B\right)=\mathsf{P}_{X}\left(U^{-1}\left(B\right)\right)=\mathsf{P}\left(X^{-1}\left(U^{-1}\left(B\right)\right)\right)=\mathsf{P}\left(\left(U\circ X\right)^{-1}\left(B\right)\right)=\mathsf{P}_{U\circ X}\left(B\right)$

Based on that we find:

$\int U\left(x\right)dF_{X}\left(x\right)=\int U\left(x\right)\mathsf{P}_{X}\left(dx\right)=\int u\,\mathsf{P}_{U\circ X}\left(du\right)=\int U\circ X\left(\omega\right)\mathsf{P}\left(d\omega\right)=\mathsf{E}\left(U\circ X\right)$

The second and third equalities are both applications of $(1)$. The second uses that $U$ is a random variable on $\langle\mathbb{R},\mathcal{B},\mathsf{P}_{X}\rangle$ and the third uses that $U\circ X$ is a random variable on $\langle\Omega,\mathcal{A},\mathsf{P}\rangle$.
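As an illustration only (not part of the argument), here is a small numerical sketch, assuming `numpy` is available, of the identity $\left(\mathsf{P}_{X}\right)_{U}=\mathsf{P}_{U\circ X}$ for $X\sim\text{Exp}(1)$ and $U(x)=x^{2}$, taking $B=[1,4]$ so that $U^{-1}(B)$ meets the support of $X$ in $[1,2]$:

```python
# Compare the simulated P(U(X) ∈ [1, 4]) with P_X(U^{-1}([1, 4])).
# For X ~ Exp(1) and U(x) = x^2, U^{-1}([1, 4]) ∩ (0, ∞) = [1, 2],
# so the pushforward value is P(1 ≤ X ≤ 2) = e^{-1} - e^{-2}.
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=1_000_000)

empirical = np.mean((x**2 >= 1) & (x**2 <= 4))  # P_{U∘X}([1, 4]) by simulation
pushforward = np.exp(-1) - np.exp(-2)           # P_X([1, 2]) from the CDF of X

print(empirical, pushforward)  # agree to about three decimals
```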

drhab