
Let $d\in\mathbb N$ and $X$ be an $\mathbb R^d$-valued random variable on a probability space $(\Omega,\mathcal A,\operatorname P)$. Let $$F(x):=\operatorname P\left[X\le x\right]\;\;\;\text{for }x\in\mathbb R^d,$$ $$F_1(x_1):=\operatorname P\left[X_1\le x_1\right]\;\;\;\text{for }x_1\in\mathbb R$$ and $$F_i(x):=\operatorname P\left[X_i\le x_i\mid X_1\le x_1,\ldots,X_{i-1}\le x_{i-1}\right]\;\;\;\text{for }x\in\mathbb R^i\text{ and }i\in\{2,\ldots,d\}.$$ Now let $U$ be an $\mathbb R^d$-valued random variable on $(\Omega,\mathcal A,\operatorname P)$ with $U\sim\mathcal U_{[0,\:1)^d}$ (the uniform distribution on $[0,1)^d$) and define $$Y_1:=F_1^{-1}(U_1),$$ $$Y_i:=F_i^{-1}(Y_1,\ldots,Y_{i-1},U_i)\;\;\;\text{for }i\in\{2,\ldots,d\},$$ where $F_i^{-1}$ denotes the generalized inverse of $F_i$ with respect to its last argument.
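For concreteness, here is a small sanity check (in Python, with an arbitrary made-up pmf on $\{0,1\}^2$, so $d=2$) that the $F_i$ defined above multiply back to the joint distribution function, i.e. $F(x)=F_1(x_1)\,F_2(x_1,x_2)$; this is just the definition of conditional probability, telescoped:

```python
# Sanity check (illustrative pmf, not from any particular model): with F_2
# defined via conditioning on {X_1 <= x_1} as above, the product
# F_1(x_1) * F_2(x_1, x_2) recovers the joint cdf F(x_1, x_2).
pmf = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

def F(x1, x2):
    # joint cdf F(x) = P[X_1 <= x_1, X_2 <= x_2]
    return sum(p for (a, b), p in pmf.items() if a <= x1 and b <= x2)

def F1(x1):
    # marginal cdf of the first coordinate
    return sum(p for (a, _), p in pmf.items() if a <= x1)

def F2(x1, x2):
    # conditional cdf P[X_2 <= x_2 | X_1 <= x_1], as defined in the question
    return F(x1, x2) / F1(x1)

for x1 in (0, 1):
    for x2 in (0, 1):
        assert abs(F(x1, x2) - F1(x1) * F2(x1, x2)) < 1e-12
```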

How can we show that $X\sim Y$?

I'm only able to show this when $d=1$ (see below$^1$).


$^1$ Remember that $F:\mathbb R\to[0,1]$ is called distribution function if

  1. $F$ is nondecreasing;
  2. $F$ is right-continuous;
  3. $F(x)\xrightarrow{x\to-\infty}0$;
  4. $F(x)\xrightarrow{x\to\infty}1$.

Let $$F^{-1}(u):=\inf\{x\in\mathbb R:F(x)\ge u\}\;\;\;\text{for }u\in(0,1).$$ We can easily show that $$F^{-1}(u)\le x\Leftrightarrow u\le F(x)\;\;\;\text{for all }x\in\mathbb R\text{ and }u\in(0,1)\tag1$$ and hence $$\{u\in(0,1):F^{-1}(u)\le x\}=(0,F(x)]\cap(0,1)\;\;\;\text{for all }x\in\mathbb R.\tag2$$ Using this we immediately conclude that $$X:=F^{-1}$$ is a random variable on $((0,1),\mathcal B((0,1)),\mathcal U_{(0,\:1)})$, where $\mathcal U_{(0,\:1)}$ denotes the uniform distribution on $(0,1)$, with $$\operatorname P\left[X\le x\right]=F(x)\;\;\;\text{for all }x\in\mathbb R,\tag3$$ which is to say that $F$ is the distribution function of $X$.
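The equivalence $(1)$ can also be checked numerically for a concrete step cdf (an assumed example, not from the question: a distribution on $\{0,1,2\}$ with weights $0.2,0.5,0.3$):

```python
# Sketch: generalized inverse F^{-1}(u) = inf{x : F(x) >= u} for a step cdf,
# plus a numerical check of (1): F^{-1}(u) <= x  iff  u <= F(x).
# The support points and weights are an arbitrary illustrative choice.
points = [0, 1, 2]
weights = [0.2, 0.5, 0.3]

def F(x):
    # right-continuous step cdf of the discrete distribution above
    return sum(w for p, w in zip(points, weights) if p <= x)

def F_inv(u):
    # generalized inverse; the infimum is attained at a support point
    return min(p for p in points if F(p) >= u)

for k in range(1, 100):          # u ranges over (0, 1)
    u = k / 100
    for x in points:
        assert (F_inv(u) <= x) == (u <= F(x))
```

By $(2)$, pushing $\mathcal U_{(0,1)}$ forward through `F_inv` reproduces the weights above, which is exactly statement $(3)$ for this example.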

  • Shouldn't it be$$F_i(x)=P[X_i\le x\mid X_1\color{red}=x_1,\dots,X_{i-1}\color{red}=x_{i-1}]?$$Once you have chosen $Y_1,\dots,Y_{i-1}$, we know the first $i-1$ coordinates of $Y$ exactly, not just upper bounds. – Mike Earnest Nov 18 '22 at 22:09
  • @MikeEarnest Well, I don't think so since we might have $\operatorname P[X_1=x_1,\ldots,X_{i-1}=x_{i-1}]=0$ for all $x_1,\ldots,x_{i-1}$ and this would yield $F_i\equiv0$. BTW, note that the definition I gave also satisfies $$F(x)=\prod_{i=1}^dF_i(x_1,\ldots,x_i).$$ – 0xbadf00d Nov 18 '22 at 23:00
  • See this answer, which discusses generating $(U,V)$ with cdf $F(u,v)$. They use the inverse function of $P[V\le v\mid U=u]$, which agrees with my comment and contradicts your post. You are right that $P[X_1=x_1,\dots,X_{i-1}=x_{i-1}]$ can be zero, but the conditional distribution can still be defined in some sense (I cannot give the details now). – Mike Earnest Nov 18 '22 at 23:24
  • I may be mistaken, but I don't get how $F_i^{-1}(Y_1,...,Y_{i-1},U_i)$ makes sense: doesn't $F_i$ map to $[0,1]$? – Snoop Nov 19 '22 at 02:01
  • @Snoop Okay, the notation is a bit confusing. The inversion is defined as for $F^{-1}$ and is with respect to the last variable. – 0xbadf00d Nov 19 '22 at 09:34
