Possibly related question: Making sense of measure-theoretic definition of random variable
Given a random variable $X$ on $(\Omega, \mathscr{F}, \mathbb{P})$, its law $\mathcal{L}_X$ and a Borel function $g: \mathbb{R} \to \mathbb{R}$,
$$E[g(X)] := \int_{\Omega} g(X(\omega)) d\mathbb{P}(\omega)$$
The change of variables theorem allows us to compute this as
$$E[g(X)] = \int_{\mathbb{R}} g(t) d\mathcal{L}_X(t)$$
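As a quick numerical illustration of this identity (a hedged sketch; the particular choice $g = \cos$ and $X \sim \text{Unif}(0,1)$ is mine, just for concreteness): the $\Omega$-side integral, estimated by sampling $\omega$, should agree with the $\mathbb{R}$-side integral computed against the law.

```python
import math
import random

# Ω-side: ∫_Ω g(X(ω)) dP(ω) with (Ω, F, P) = ([0,1], B[0,1], λ) and
# X(ω) = ω, so X ~ Unif(0,1). Estimated by Monte Carlo sampling of ω.
random.seed(1)
g = math.cos
N = 100_000
omega_side = sum(g(random.random()) for _ in range(N)) / N

# ℝ-side: ∫_ℝ g(t) dL_X(t) = ∫_0^1 cos(t) dt = sin(1), in closed form.
law_side = math.sin(1.0)

print(omega_side, law_side)  # the two sides should be close
```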
Dumb question: Without using change of variable theorem, how do we compute $E[g(X)]$?
---
Side question: Is the point of the change of variables formula to get back to Riemann or Riemann–Stieltjes integrals, so as to avoid computing the Lebesgue integral directly?
---
I guess the answer is to use the measure-theoretic definition of expectation for measurable functions. But since the proof of the change of variables formula itself proceeds through indicator, simple, nonnegative, and general measurable functions, it seems like we would just be reinventing the wheel. Humour me anyway, please: how exactly would we be reinventing the wheel?
Say, for example, $g(x) = x^2$ and $X \sim \text{Unif}([0,1])$. Then how do we compute
$$\int_{\Omega} X(\omega)^2 d\mathbb{P}(\omega) \tag{*}$$
?
Here's what I got so far.
$$ (*) = \int_{\Omega} (X(\omega)^2)^{+} \, d\mathbb{P}(\omega) - \int_{\Omega} (X(\omega)^2)^{-} \, d\mathbb{P}(\omega)$$
(with $(X^2)^{-} = 0$ here, since $X^2 \ge 0$), where we compute
$$\int_{\Omega} (X(\omega)^2)^{+} \, d\mathbb{P}(\omega) = \sup\left\{\int_{\Omega} h \, d\mathbb{P} : h \in SF^{+},\ h \le (X^2)^{+}\right\}$$
and where, for $h = a_1 1_{A_1} + \cdots + a_n 1_{A_n}$ with $A_1, \dots, A_n \in \mathscr F$, we compute
$$\int_{\Omega} h \, d\mathbb{P} = \int_{\Omega} \left(a_1 1_{A_1} + \cdots + a_n 1_{A_n}\right) d\mathbb{P} = \int_{\Omega} a_1 1_{A_1} \, d\mathbb{P} + \cdots + \int_{\Omega} a_n 1_{A_n} \, d\mathbb{P}$$
and finally
$$\int_{\Omega} a_1 1_{A_1} \, d\mathbb{P} = a_1 \int_{\Omega} 1_{A_1} \, d\mathbb{P} = a_1 \mathbb{P}(A_1).$$
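Concretely, on $([0,1], \mathscr B[0,1], \lambda)$ with $X(\omega) = \omega$, one standard choice of simple functions below $X^2$ is the dyadic staircase $h_n = \sum_k \frac{k}{2^n} 1_{A_k}$ with $A_k = \{\omega : k/2^n \le X(\omega)^2 < (k+1)/2^n\}$. Here $A_k$ is just the interval $\left[\sqrt{k/2^n}, \sqrt{(k+1)/2^n}\right)$, so each $\lambda(A_k)$ is available in closed form. A sketch of the resulting finite sum (my own illustration, not taken from a particular textbook):

```python
import math

def dyadic_simple_integral(n_levels):
    """∫_Ω h_n dP for the dyadic simple function h_n ≤ X², where
    X(ω) = ω on ([0,1], B[0,1], λ).  On A_k = [√(k/2ⁿ), √((k+1)/2ⁿ))
    the function h_n takes the constant value k/2ⁿ, so the integral
    is the finite sum Σ (k/2ⁿ) · λ(A_k)."""
    N = 2 ** n_levels
    total = 0.0
    for k in range(N):
        value = k / N                                        # a_k = k/2ⁿ
        measure = math.sqrt((k + 1) / N) - math.sqrt(k / N)  # λ(A_k)
        total += value * measure
    return total

# Since h_n ≤ X² < h_n + 2⁻ⁿ pointwise, these sums increase to
# ∫_Ω X² dP = 1/3 as n grows, staying within 2⁻ⁿ of it.
print(dyadic_simple_integral(10))  # just below 1/3
```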
Without using the change of variables formula, would we have to come up with indicator and simple functions that lead to a uniformly distributed random variable?
If so, what are these indicator and simple functions that lead to a uniform distribution, please?
If not, what should we do instead?
As for the probability space, I was thinking that $X$ being distributed as $\text{Unif}(0,1)$ means we can take $(\Omega, \mathscr F, \mathbb P) = ([0,1], \mathscr B[0,1], \lambda)$ or $([0,1], \mathscr M[0,1], \lambda)$. Is that right?
Actually, I was hoping there would be a way to define $X$ explicitly. For a discrete uniform distribution, say where $X$ represents the toss of a fair die, I guess we would have
$(\Omega, \mathscr F, \mathbb P) = (\{1, \dots ,6\}, 2^{\Omega}, \mathbb P)$ with $\mathbb P(\{\omega\}) = \frac16$, and $X(\omega) = \sum_{n=1}^{6} n \cdot 1_{\{n\}}(\omega)$
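With that representation, $E[X]$ for the die is just the finite simple-function computation $\sum_{n=1}^{6} n \, \mathbb P(\{n\}) = \frac{21}{6} = \frac72$, with no change of variables needed. A minimal sketch:

```python
from fractions import Fraction

# (Ω, F, P) = ({1,…,6}, 2^Ω, P({ω}) = 1/6); X(ω) = Σ n·1_{n}(ω) is a
# simple function, so ∫_Ω X dP = Σ n·P({n}) term by term.
P = {omega: Fraction(1, 6) for omega in range(1, 7)}
E_X = sum(n * P[n] for n in range(1, 7))
print(E_X)  # 7/2
```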
Then, writing the continuous analogue as $X(\omega) = \int_0^1 n \, 1_{\{\omega = n\}}(\omega) \, dn$ (by analogy with the sum above; I'm not sure this is legitimate),
$$E[X] = \int_{\Omega}\int_0^1 n \, 1_{\{\omega = n\}}(\omega) \, dn \, d\mathbb P(\omega)$$
$$ = \int_0^1 n \int_{\Omega} 1_{\{\omega = n\}}(\omega) \, d\mathbb P(\omega) \, dn \tag{by Fubini's?}$$
$$ = \int_0^1 n \, \mathbb P(\{\omega = n\}) \, dn$$
$$ = \int_0^1 n f_X(n) \, dn$$
$$ = \int_0^1 n \cdot 1 \, dn$$
$$= \frac{n^2}{2} \Big|_{0}^{1} = \frac12 - 0 = \frac12$$
As for the second moment,
$$E[X^2] = \int_{\Omega} (\int_0^1 n 1_{\{n = \omega\}}(\omega)dn)^2 d\mathbb P(\omega)$$
$$E[X^2] = \int_{\Omega} \int_0^1 n 1_{\{n = \omega\}}(\omega)dn \int_0^1 m 1_{\{m = \omega\}}(\omega)dm d\mathbb P(\omega)$$
$$E[X^2] = \int_{\Omega} \int_0^1 \int_0^1 n m 1_{\{n = m = \omega\}}(\omega)dn dm d\mathbb P(\omega)$$
$$E[X^2] = \int_{\Omega} \int_0^1 \int_0^1 n^2 1_{\{n = n = \omega\}}(\omega)dn dn d\mathbb P(\omega) \tag{??}$$
$$E[X^2] = \int_0^1 \int_0^1 n^2 dn dn \tag{??}$$
$$E[X^2] = \frac13$$
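Whatever one makes of the Fubini steps above, the end results $E[X] = \frac12$ and $E[X^2] = \frac13$ for $X \sim \text{Unif}(0,1)$ are easy to sanity-check numerically by sampling $\omega$ directly (a rough check, not a derivation):

```python
import random

random.seed(0)
N = 200_000
# ω drawn from λ on [0,1]; with X(ω) = ω these are draws of X itself.
samples = [random.random() for _ in range(N)]

m1 = sum(samples) / N                  # Monte Carlo estimate of ∫_Ω X dP
m2 = sum(x * x for x in samples) / N   # Monte Carlo estimate of ∫_Ω X² dP
print(m1, m2)  # close to 1/2 and 1/3
```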
I think I can do something similar for other discrete distributions, but the discrete uniform is genuinely a simple random variable, and the continuous uniform is at least explicit. What does $X \sim N(\mu,\sigma^2)$ look like? I guess it would be $X = X^+ - X^-$ where $X^{\pm}$ are suprema of simple functions. Should/can we use the central limit theorem? I'm thinking a Bernoulli is an indicator, a binomial is a simple function, and then we use the binomial to approximate the normal?
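On the Bernoulli → binomial → normal idea: a Bernoulli variable is indeed an indicator, a binomial is a simple function (a finite sum of indicators), and the CLT says the standardized binomial converges in distribution to $N(0,1)$. One way to watch this happen without any change of variables is to compute moments of the standardized binomial exactly from the pmf, as a finite simple-function sum, and compare them with the standard normal moments $E[Z^2] = 1$ and $E[Z^4] = 3$. A sketch, with the function name my own:

```python
import math

def standardized_binomial_moment(n, r):
    """E[((S_n − n/2)/√(n/4))^r] for S_n ~ Bin(n, 1/2), computed exactly
    as a finite sum against the pmf — i.e. the expectation of a simple
    function, no change of variables needed."""
    mu, sigma = n / 2, math.sqrt(n / 4)
    return sum(
        ((k - mu) / sigma) ** r * math.comb(n, k) * 0.5 ** n
        for k in range(n + 1)
    )

# Second moment is exactly 1 for every n; the fourth moment is 3 − 2/n,
# approaching the N(0,1) value 3 as n grows.
print(standardized_binomial_moment(1000, 2), standardized_binomial_moment(1000, 4))
```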
I guess I'm not making much sense, but what references/topics can I look up for something similar that does? For example, where can I read about explicit representations of, or approximations by, simple functions for random variables, in order to compute such integrals without the change of variables formula?