
Given a sample $X^n=(x_1,x_2,\dots,x_n)$, do we consider the elements $x_i$ to be independent and identically distributed realizations of the same random variable $X$, or are they realizations of $n$ different independent and identically distributed random variables $X_1,X_2,\dots,X_n$ (observation $x_1$ is a realization of $X_1$, observation $x_2$ is a realization of $X_2$, etc.)?

I hope this makes sense.

  • I do not see how this makes a difference to the distribution. But to the extent that $X_1$ is a random variable rather than an observation $x_1$ of a random variable, I suspect that the second description is more applicable. – Henry Nov 29 '19 at 12:57
  • In terms of distribution it indeed doesn't make any difference. I edited the question so the distinction between random variables and observations is clearer. – Margarita Granat Nov 29 '19 at 13:12
  • Just in case you don't get an answer that makes these concepts clear to you, I would suggest checking the difference between a random variate (or realization) and a random variable. – Antoni Parellada Nov 29 '19 at 13:30
  • What exactly do you mean by "realization of a random variable"? A random variable $X$ is nothing more than a function with specific properties. For every outcome $\omega$ in sample space $\Omega$ a value $X(\omega)$ "shows up". Is that the realization that you are talking about? If so, then realize that there is only one such value if there is only one random variable $X$. – drhab Nov 29 '19 at 13:42
  • The key sentence is "The value of the random variable (that is, the function) $X$ at a point $\omega \in \Omega,$ i.e. $x=X(\omega)$, is called a realization of $X$." This would be for a single random variable. In your post $X^n=(x_1,x_2,\dots,x_n)$ is a realization of $n$ independent and identically distributed random variables, or the random vector $X=\begin{bmatrix}X_1,X_2,\cdots,X_n\end{bmatrix}^\top.$ – Antoni Parellada Nov 29 '19 at 13:53
  • @drhab In the interest of avoiding ambiguity, I will point out that the expression realization of a random variable is quite acceptable and commonly used. Are you taking exception with the expression in general, or in its use in the OP? – Antoni Parellada Nov 29 '19 at 15:10
  • @AntoniParellada thank you for your helpful comments – Margarita Granat Nov 29 '19 at 15:36
  • @AntoniParellada I was not familiar with the term realization in this context myself. My former comment expresses my conjectures about it, and the useful link you provided tells me that they are okay. I placed that comment just to check: am I on the same line as the OP here? – drhab Nov 30 '19 at 09:08

1 Answer


You may think of $(x_1,\ldots,x_n)$ as a realization of $n$ independent copies of $X$. Basically, there is a probability space $(\Omega,\mathcal{F},\mathsf{P})$ in the background such that $(x_1,\ldots,x_n)=(X_1(\omega),\ldots,X_n(\omega))$ for some $\omega\in\Omega$, which is chosen randomly according to $\mathsf{P}$. Thus the statement "independent and identically distributed realizations of the same random variable" doesn't quite make sense. That said, $(x_1,\ldots,x_n)$ is sometimes referred to as a random sample from a particular distribution (e.g., $F_X$).
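As a loose numerical illustration of this picture (my sketch, not part of the original answer): one can model the random choice of $\omega$ by the seed of a pseudo-random number generator. Fixing the seed fixes $\omega$, and the whole vector $(X_1(\omega),\ldots,X_n(\omega))$ is then determined at once. The distribution Uniform(0, 1), the sample size, and the seed values below are arbitrary choices for the sketch.

```python
import random

def realization(seed, n=5):
    """Return (X_1(ω), ..., X_n(ω)) for the ω encoded by `seed`.

    X_1, ..., X_n are n independent copies of X ~ Uniform(0, 1);
    a single choice of ω determines all n coordinates together.
    """
    rng = random.Random(seed)  # fixing the seed plays the role of fixing ω
    return tuple(rng.random() for _ in range(n))

# One realization of the random vector (X_1, ..., X_n):
x = realization(seed=42)
```

Calling `realization(42)` twice returns identical tuples (same $\omega$, same values), while `realization(43)` gives a different point of the sample space pushed through the same functions $X_1,\dots,X_n$.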

  • thank you for your answer, very helpful. So if those copies of $X$ are separate entities for each $x_i$ in $(x_1,...,x_n)$, can you comment on why "sometimes $(x_1,...,x_n)$ is referred to as a random sample from a particular distribution"? I mean it makes sense: you have a distribution and you draw values from it. But on the other hand, a particular distribution describes probabilities of values for a particular random variable. Do those "copies of $X$" shrink to one entity when people talk about samples from a distribution? – Margarita Granat Nov 29 '19 at 15:28
  • "Drawing a random sample from a distribution" means obtaining a realization of i.i.d. random variables having that distribution. I think that the origin of that phrase may be related to inverse transform sampling. –  Nov 29 '19 at 15:35
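To illustrate the inverse-transform idea mentioned in the comment above (a sketch with assumed parameters, not from the original thread): if $U \sim \mathrm{Uniform}(0,1)$ and $F$ is an invertible CDF, then $F^{-1}(U)$ has distribution $F$. For the exponential distribution with rate $\lambda$, $F(x)=1-e^{-\lambda x}$, so $F^{-1}(u) = -\ln(1-u)/\lambda$.

```python
import math
import random

def exponential_sample(n, lam, seed=0):
    """Draw n i.i.d. Exp(lam) values by inverse transform sampling."""
    rng = random.Random(seed)
    # F(x) = 1 - exp(-lam * x)  =>  F^{-1}(u) = -ln(1 - u) / lam
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

xs = exponential_sample(n=100_000, lam=2.0)
mean = sum(xs) / len(xs)  # should be close to 1 / lam = 0.5
```

Each uniform draw is pushed through $F^{-1}$, so the resulting values form one realization of $n$ i.i.d. copies of an $\mathrm{Exp}(\lambda)$ variable, matching the phrase "drawing a random sample from a distribution".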
  • I got one question: Imagine that we have $n$ uncorrelated (not independent) copies of $X$. Now imagine we have one realization $(x_1,...,x_n)$ of these $n$ copies. Can we use this realization $(x_1,...,x_n)$ to estimate the mean of $X$? In other words, can we use samples from uncorrelated identically distributed random variables to estimate the mean (for example, using the sample mean)? – Veljko Feb 06 '20 at 20:04
  • @Veljko Sure. The sample average of pairwise uncorrelated r.v.s. converges to the mean of these r.v.s. (a simple application of Chebyshev's inequality). –  Feb 06 '20 at 20:28
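A quick empirical check of this claim (my sketch; fully independent draws are used here, which are in particular pairwise uncorrelated, so this is a special case of the comment's setting): by Chebyshev's inequality the sample mean of identically distributed, pairwise uncorrelated variables with finite variance converges to the common mean.

```python
import random

def sample_mean(n, seed=0):
    """Average of n identically distributed Uniform(0, 1) draws.

    Independent draws are in particular pairwise uncorrelated, so the
    sample mean should approach the common mean 1/2 as n grows.
    """
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n)) / n
```

For $n = 100{,}000$ the standard deviation of the sample mean is about $\sqrt{1/12}/\sqrt{n} \approx 0.0009$, so the estimate lands very close to $1/2$.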
  • Not sure if you are still active, but I wondered if you could comment on the difference between your claim here and the claim made by the accepted answer on cross validated: https://math.stackexchange.com/questions/3455773/are-elements-of-a-sample-i-i-d-realizations-of-the-same-random-variable-or-real. Thank you~ – S.C. Mar 23 '24 at 15:32