I've been thinking about the concept of a random sample for hours, and I still don't really understand it.
Imagine I toss a fair coin three times and I want to study the number of heads. I define $\Omega=\{HHH,HHT,HTH,THH,HTT,THT,TTH,TTT\}$ and $X:\Omega\rightarrow\mathbb{R}$ by $X(\omega)=\text{number of heads in }\omega$. The distribution of $X$ is $\text{Binomial}(3,0.5)$.
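To make the setup concrete, here is a minimal Python sketch of this experiment (the names `Omega` and `X` below are just my own choice of identifiers):

```python
# Sample space of three tosses of a fair coin, and X = number of heads.
from itertools import product

Omega = ["".join(t) for t in product("HT", repeat=3)]  # HHH, HHT, ..., TTT

def X(omega: str) -> int:
    """Number of heads in the outcome omega."""
    return omega.count("H")

for omega in Omega:
    print(omega, X(omega))  # e.g. X("HTH") = 2
```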
The definition of a random sample is: we say that $\{X_1,\ldots,X_n\}$ is a random sample if the random variables $X_1,\ldots,X_n$ are independent and identically distributed. In this case, imagine that $X_1,\ldots,X_n\sim\text{Binomial}(3,0.5)$.
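In practice I would simulate such a sample by repeating the whole three-toss experiment $n$ times; here is a rough sketch of what I mean (assuming a fair coin, so each simulated value should follow $\text{Binomial}(3,0.5)$):

```python
import random

def toss_three_coins() -> str:
    """One run of the experiment: an outcome in Omega."""
    return "".join(random.choice("HT") for _ in range(3))

n = 10
# One realisation of X_1, ..., X_n: each value comes from its own independent run.
sample = [toss_three_coins().count("H") for _ in range(n)]
print(sample)  # e.g. [2, 1, 3, 0, 2, 2, 1, 2, 1, 3]
```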
My question is: we have $X_1,\ldots,X_n:\Omega\rightarrow\mathbb{R}$, and each $X_i$ represents the number of heads. Then don't we have $X_1(\omega)=X_2(\omega)=\ldots=X_n(\omega)$?
For example, in the law of large numbers, why don't we write $$\frac{X(\omega_1)+\ldots+X(\omega_n)}{n}\xrightarrow[n\to\infty]{} E[X]\,?$$
What I don't really understand is why the different observations of the same phenomenon are represented by $X_1(\omega),\ldots,X_n(\omega)$ and not by $X(\omega_1),\ldots,X(\omega_n)$.
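For reference, this is the kind of simulation of the law of large numbers I have in mind (a sketch only; here $E[X]=3\cdot 0.5=1.5$):

```python
import random

def heads_in_three_tosses() -> int:
    """Simulate one run of the experiment and count the heads."""
    return sum(random.random() < 0.5 for _ in range(3))

for n in (10, 1_000, 100_000):
    sample_mean = sum(heads_in_three_tosses() for _ in range(n)) / n
    print(n, sample_mean)  # the sample mean settles near 1.5 as n grows
```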