
I'm reading Chernoff's paper "A measure of asymptotic efficiency for tests of a hypothesis based on the sum of observations," and am trying to understand it in terms of measure theory. On page 495, it says:

"$S_n$ is the sum of $n$ independent observations $X_1,X_2,\ldots,X_n$ on a chance variable $X$."

Which of the following is the correct interpretation?

  1. Let $M$ be a measurable space and let $X:M\to\mathbb{R}$ be a measurable function. Then $S_n:M\times\cdots\times M\to\mathbb{R}$ is the function $S_n(p_1,\ldots,p_n)=X(p_1)+\cdots+X(p_n)$.
  2. Let $(M,\mu)$ be a probability space and let $X:M\to\mathbb{R}$ be a measurable function. Let $X_1,\ldots,X_n$ be real-valued measurable functions on $M$ such that $(X_i)_\sharp\mu=X_\sharp\mu$ as measures on $\mathbb{R}$ for each $i$ and such that $(X_1,\ldots,X_n)_\sharp\mu=(X_1)_\sharp\mu\times\cdots\times(X_n)_\sharp\mu$ as measures on $\mathbb{R}^n$. Define $S_n:M\to\mathbb{R}$ by $S_n(p)=X_1(p)+\cdots+X_n(p).$
  3. Something else?

(By $(X_1,\ldots,X_n)$ I mean the function $M\to\mathbb{R}^n$ given by $p\mapsto(X_1(p),\ldots,X_n(p))$.)
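(For readers less used to the pushforward notation, the conditions in (2) unpack as follows; this is just the standard definition, not anything specific to Chernoff's paper:

$$(X_i)_\sharp\mu(B)=\mu\bigl(X_i^{-1}(B)\bigr)\quad\text{for Borel }B\subseteq\mathbb{R},$$

and the product condition asserts that

$$\mu\bigl((X_1,\ldots,X_n)^{-1}(B_1\times\cdots\times B_n)\bigr)=\prod_{i=1}^n\mu\bigl(X_i^{-1}(B_i)\bigr),$$

i.e. the $X_i$ are independent and each has the same law as $X$.)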

(I posted this on the stats stackexchange four days ago with no response)


1 Answer


Both interpretations are fine. In (1) you explicitly constructed a probability space on which $X_1,\ldots,X_n$ and $S_n$ are defined: if $X$ lives on $(\Omega,\mathcal{F},\mathsf{P})$, then its independent copies $X_1,\ldots,X_n$ and $S_n$ are defined on the corresponding product space $(\Omega^n,\mathcal{F}^n,\mathsf{P}^n)$, with each $X_i:=X\circ \pi_i$, where $\pi_i$ is the projection onto the $i$-th coordinate. In (2) you assumed that there is a probability space on which all these random variables are defined and that the joint distribution of $(X_1,\ldots,X_n)$ is the product of the individual marginals. As (1) clearly shows, such a probability space always exists.
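To make the last claim concrete, here is a short verification (a standard computation, not taken from the paper) that the product construction satisfies the requirements of interpretation (2). For Borel sets $B_1,\ldots,B_n\subseteq\mathbb{R}$, since $X_i=X\circ\pi_i$,

$$\mathsf{P}^n\bigl(X_1\in B_1,\ldots,X_n\in B_n\bigr)=\mathsf{P}^n\bigl(X^{-1}(B_1)\times\cdots\times X^{-1}(B_n)\bigr)=\prod_{i=1}^n\mathsf{P}\bigl(X^{-1}(B_i)\bigr)=\prod_{i=1}^n\mathsf{P}(X\in B_i),$$

so each $X_i$ has the law of $X$ and the joint law of $(X_1,\ldots,X_n)$ is the product of the marginals; $S_n=X_1+\cdots+X_n$ is then a measurable function on $(\Omega^n,\mathcal{F}^n,\mathsf{P}^n)$, exactly as required in (2).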