So, I recently (re-)discovered that random variables learned in elementary probability, such as an exponentially distributed random variable $X$ with cdf $F_X(x) = 1-e^{- \lambda x}$ for $x \ge 0$, can be represented explicitly as
$$X(\omega) := \frac{1}{\lambda} \ln\left(\frac{1}{1-\omega}\right)$$
on the probability space $((0,1), \mathscr B(0,1), \mu)$, where $\mu$ is Lebesgue measure.
This can be derived with the formula
$$X(\omega) = \sup\{x \in \mathbb{R}: F_X(x) < \omega\}$$
This is apparently called the Skorokhod representation (so named in David Williams' Probability with Martingales).
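As a quick sanity check (my own sketch, not part of the reference: the function name, $\lambda = 2$, and the sample size are arbitrary choices), one can push uniform draws through this map and compare the empirical tail of the resulting sample with $e^{-\lambda x}$:

```python
import math
import random

def skorokhod_exponential(omega: float, lam: float) -> float:
    """X(omega) = sup{x : F_X(x) < omega} for F_X(x) = 1 - exp(-lam*x)."""
    return math.log(1.0 / (1.0 - omega)) / lam

random.seed(0)
lam = 2.0
n = 200_000
# random.random() lies in [0, 1), so 1 - omega stays strictly positive
samples = [skorokhod_exponential(random.random(), lam) for _ in range(n)]

# Compare the empirical survival probability P(X > x) with e^(-lam*x)
for x in (0.1, 0.5, 1.0):
    empirical = sum(s > x for s in samples) / n
    theoretical = math.exp(-lam * x)
    assert abs(empirical - theoretical) < 0.01, (x, empirical, theoretical)
```

The agreement of the empirical tail with $e^{-\lambda x}$ is exactly the inverse-transform idea behind the representation.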
Now, apparently, $X(1-\omega)$ is not only also exponential, it even has the same parameter $\lambda$ as $X(\omega)$, i.e. the two have the same distribution.
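Indeed, this can be checked directly (a quick verification in the notation above): since $X(1-\omega) = \frac{1}{\lambda}\ln\frac{1}{\omega}$,
$$\mathbb P\big(X(1-\omega) > x\big) = \mu\Big\{\omega \in (0,1) : \tfrac{1}{\lambda}\ln\tfrac{1}{\omega} > x\Big\} = \mu\big\{\omega < e^{-\lambda x}\big\} = e^{-\lambda x}, \qquad x \ge 0,$$
which is the $\mathrm{Exp}(\lambda)$ tail.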
Similarly, for $X \sim Be(p)$, I have found each of the following to be $\sim Be(p)$:
$1_{(0,p)}(\omega)$
$1_{(0,p)}(1-\omega) = 1_{(1-p,1)}(\omega)$
$1_{(0,p)}(1-(\omega+\frac{1-p}{2})) = 1_{(1-p,1)}(\omega+\frac{1-p}{2}) = 1_{(\frac{1-p}{2},\frac{1+p}{2})}(\omega)$
$1_{(0,p)}(1-(\omega+\varepsilon)) = 1_{(1-p,1)}(\omega+\varepsilon) = 1_{(1-p-\varepsilon,1-\varepsilon)}(\omega)$ for $0< \varepsilon < 1-p < 1$
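The common feature of these examples is that only the length of the interval matters, not its location. A small sketch of that point (my own illustration; the helper name and the particular endpoints are arbitrary):

```python
import random

def indicator(a: float, b: float, omega: float) -> int:
    """1_{(a,b)}(omega): 1 if omega lies in the open interval (a, b)."""
    return 1 if a < omega < b else 0

random.seed(1)
p = 0.3
n = 100_000
omegas = [random.random() for _ in range(n)]

# Several translates of an interval of length p inside (0, 1):
# each indicator should be Bernoulli(p), since each interval has
# Lebesgue measure p.
for a in (0.0, 0.2, 0.5, 0.7):
    freq = sum(indicator(a, a + p, w) for w in omegas) / n
    assert abs(freq - p) < 0.01, (a, freq)
```

So every translate (and every reflection) of an interval of measure $p$ yields a $Be(p)$ indicator, which is what the chain of identities above exhibits.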
So, what conditions on $g$ are sufficient or necessary for $X(g(\omega))$ to have the same distribution as $X(\omega)$? I guess we at least need $g : (0,1) \to (0,1)$.
Also, does $g(\omega) = 1-\omega$ work for any $X$? It seems so, at least in distribution. A pointwise identity $X(1-\omega) = X(\omega)$ cannot hold (for the exponential above, $X(1-\omega) = \frac{1}{\lambda}\ln\frac{1}{\omega} \neq X(\omega)$ unless $\omega = \frac12$), but using $X(\omega) \le x \iff \omega \le F_X(x)$, which holds by the right-continuity of $F_X$:
$$\mu\{\omega : X(1-\omega) \le x\} = \mu\{\omega : 1-\omega \le F_X(x)\}$$
$$= \mu\big([1-F_X(x),1)\big) = F_X(x)$$
$$= \mu\big((0,F_X(x)]\big) = \mu\{\omega : X(\omega) \le x\}$$
QED?
Basically, the reflection $\omega \mapsto 1-\omega$ maps $(0,F_X(x)]$ onto $[1-F_X(x),1)$, and both intervals have the same Lebesgue measure $F_X(x)$.
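This suggests the sufficient condition is that $g$ preserves Lebesgue measure on $(0,1)$. An empirical sketch of that guess (my framing, not a proof: the reflection $g(\omega)=1-\omega$ and a rotation $g(\omega) = \omega + c \bmod 1$ with an arbitrary $c$ are both measure-preserving, so $X \circ g$ should have the same law as $X$):

```python
import math
import random

LAM = 2.0  # arbitrary rate parameter

def X(w: float) -> float:
    """Skorokhod representation of Exp(LAM): X(w) = (1/lam) * ln(1/(1-w))."""
    return math.log(1.0 / (1.0 - w)) / LAM

reflection = lambda w: 1.0 - w
rotation = lambda w: (w + 0.37) % 1.0  # c = 0.37 chosen arbitrarily

random.seed(2)
n = 200_000
# Keep omega strictly inside (0, 1) so 1 - g(omega) never vanishes.
omegas = [random.uniform(1e-12, 1.0 - 1e-12) for _ in range(n)]

for g in (lambda w: w, reflection, rotation):
    ys = [X(g(w)) for w in omegas]
    mean = sum(ys) / n
    tail = sum(y > 1.0 for y in ys) / n
    assert abs(mean - 1.0 / LAM) < 0.01          # E[X] = 1/lam
    assert abs(tail - math.exp(-LAM)) < 0.01     # P(X > 1) = e^{-lam}
```

Both transformed samples match the untransformed one in mean and tail, consistent with the measure-preservation heuristic; whether measure preservation is also necessary is part of the question above.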