3

Let $\Omega = \mathbb{N}_0^2$, $\mathcal{A}=\mathcal{P}(\Omega)$ (the power set) and let $\mathbb{P}$ be the product of two Poisson distributions with parameters $\lambda_1,\lambda_2>0$, i.e. $$ \mathbb{P}(\{(n_1,n_2)\})=\frac{\lambda_1^{n_1} \lambda_2^{n_2}}{n_1!\,n_2!} e^{-\lambda_1-\lambda_2}.$$ Define $X:\Omega\rightarrow \mathbb{N}_0,\ (n_1,n_2)\mapsto n_1+n_2$. Prove that $X$ is Poisson distributed with parameter $\lambda_1+\lambda_2$.


I've seen various proofs similar to this one, Poisson Distribution of sum of two random independent variables $X$, $Y$, but those are about two independent random variables. Here it is about a single random variable. So what has to be shown is that
$X \sim \mathcal{P}(\lambda_1+\lambda_2).$
My attempt: \begin{align}\mathbb{P}(X=n) &=\mathbb{P}(n_1+n_2=n)\\&=\sum_{k=0}^n\mathbb{P}(n_1=k,n_2=n-k)\\&= \sum_{k=0}^n\mathbb{P}(n_1=k)\mathbb{P}(n_2=n-k) \\&=\sum^n_{k=0}\frac{\lambda_1^k\lambda_2^{n-k}}{k!(n-k)!}e^{-(\lambda_1+\lambda_2)} \\&=\frac{(\lambda_1+\lambda_2)^n}{n!}e^{-(\lambda_1+\lambda_2)} \end{align} This is enough to conclude $X \sim \mathcal{P}(\lambda_1+\lambda_2)$.

But my worry is this: inside the brackets of $\mathbb{P}$, am I allowed to use notation like $n_1=k, n_2=n-k$? From what I've learned, inside the brackets there should be a random variable equated with some real number, as in $\mathbb{P}_X(\{t\})=\mathbb{P}(X=t)$, but in the proof above a real number is equated with a real number, so I don't know whether it is correct.
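Not part of the proof, but as a quick numerical sanity check of the identity derived above (a minimal sketch assuming SciPy is available; the parameter values below are arbitrary):

```python
from scipy.stats import poisson

lam1, lam2 = 1.3, 2.7  # arbitrary example parameters

for n in range(10):
    # left-hand side: sum of the product pmf over the diagonal {(k, n-k) : 0 <= k <= n}
    lhs = sum(poisson.pmf(k, lam1) * poisson.pmf(n - k, lam2) for k in range(n + 1))
    # right-hand side: Poisson(lam1 + lam2) pmf at n
    rhs = poisson.pmf(n, lam1 + lam2)
    assert abs(lhs - rhs) < 1e-12, (n, lhs, rhs)

print("sum over the diagonal matches the Poisson(lam1 + lam2) pmf")
```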

user123
  • 334
  • You can use that $P(X=n)=f(n)$. – callculus42 Jul 23 '22 at 14:13
  • @callculus42 This probability does equal the density function at the point $n$, but my question is whether that notation also holds for non-random variables – user123 Jul 23 '22 at 17:08
  • "but they are basically between independent variables" Which is what you have here, because the probability mass is of the form $f_1(n_1)f_2(n_2)$. – J.G. Jul 23 '22 at 20:02

2 Answers

4

Your proof essentially has the right ideas. I believe that your confusion simply stems from how you are constructing the probability space.

Your probability space is explicitly the product space $(\mathbb{N}_0 \times \mathbb{N}_0, \mathcal{P}(\mathbb{N}_0) \otimes \mathcal{P}(\mathbb{N}_0), \mathbb{P}_{\lambda_1} \times \mathbb{P}_{\lambda_2})$, where $\mathbb{P}_{\lambda}$ is the Poisson probability measure on $(\mathbb{N}_0, \mathcal{P}(\mathbb{N}_0))$. Since you have explicitly defined probabilities on the non-negative integers, you don't need a random variable to map from another underlying probability space: you already have $\mathbb{P}_{\lambda_1}(\{ n_1 \}) = \frac{\lambda_1^{n_1}}{n_1!} e^{-\lambda_1}$ for all $n_1 \in \mathbb{N}_0$.

(Technical point: remember that a probability measure is defined on sets in the sigma-algebra. For a discrete measure, it is sufficient to define it on the singletons $\{ \{ n \} : n \in \mathbb{N}_0 \}$.)

Thus, to "fix" the notation in your proof, we can write $$ \begin{align} \mathbb{P}(X = n) &= \mathbb{P}(\{ (n_1, n_2) : X(n_1, n_2) = n \}) \\ &=\mathbb{P}(\{ (n_1, n_2) : n_1 + n_2 = n \}) \\ &= \mathbb{P}\left(\bigcup_{k=0}^n \{ (n_1, n_2) : n_1 = k, n_2 = n - k \} \right) = \dots \end{align} $$ The first line simply clarifies what the usual notation $\mathbb{P}(X = n)$ means: it is the probability of the event where the random variable takes on a certain value (i.e. set of 2-tuples $(n_1, n_2)$ that add to $n$). The rest continues as you have written. After splitting into a sum (by additivity), for each $0 \leq k \leq n$ we will have $\mathbb{P}(\{ (n_1, n_2) : n_1 = k, n_2 = n - k \}) = \mathbb{P}_{\lambda_1}(\{ k \}) \mathbb{P}_{\lambda_2}(\{ n - k\})$ since we have a product measure.

This notation is indeed not as easy to understand as the usual random variable notation (which allows for a more "probabilistic" way of thinking). By abstracting away the underlying probability space, we can simply say that $X_1 \sim \mathrm{Poisson}(\lambda_1)$ and $X_2 \sim \mathrm{Poisson}(\lambda_2)$ are two independent random variables on the same probability space, so that for $X = X_1 + X_2$ we can simply write $\mathbb{P}(X = n) = \mathbb{P}(X_1 + X_2 = n) = \sum_{k=0}^n \mathbb{P}\left( X_1 = k, X_2 = n-k\right) = \dots$. The key takeaway is that product measures are essentially the same as independent random variables.
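To make that takeaway concrete, here is a minimal simulation sketch (assuming NumPy and SciPy are available; the parameters are arbitrary choices, not from the question): summing two independent Poisson samples produces empirical frequencies that match the $\mathrm{Poisson}(\lambda_1+\lambda_2)$ pmf.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
lam1, lam2 = 1.3, 2.7                      # arbitrary example parameters
size = 200_000

# X = X1 + X2 with X1, X2 independent Poisson variables
samples = rng.poisson(lam1, size) + rng.poisson(lam2, size)

for n in range(8):
    empirical = np.mean(samples == n)      # relative frequency of {X = n}
    exact = poisson.pmf(n, lam1 + lam2)    # Poisson(lam1 + lam2) pmf at n
    print(f"n={n}: empirical {empirical:.4f} vs exact {exact:.4f}")
```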

JKL
  • 2,189
3

Your argument is good until the second line: from there, just use the definition of $\mathbb{P}$ directly (i.e., without referring to, or even being aware of, any "independence"). I also elaborate on the first two lines of your argument a little bit.
\begin{align} & \mathbb{P}(X = n) \\ =& \mathbb{P}(\{(n_1, n_2): n_1 + n_2 = n\}) \\ =& \mathbb{P}\left(\{(n_1, n_2): n_1 + n_2 = n\} \bigcap \bigcup_{k = 0}^\infty\{(n_1, n_2): n_1 = k\}\right) \tag{1} \\ =& \mathbb{P}\left(\bigcup_{k = 0}^\infty(\{(n_1, n_2): n_1 + n_2 = n\} \cap \{(n_1, n_2): n_1 = k\})\right) \tag{2} \\ =& \mathbb{P}\left(\bigcup_{k = 0}^n(\{(n_1, n_2): n_1 + n_2 = n\} \cap \{(n_1, n_2): n_1 = k\})\right) \tag{3} \\ =& \mathbb{P}\left(\bigcup_{k = 0}^n\{(n_1, n_2): n_1 = k, n_2 = n - k\}\right) \\ =& \sum_{k = 0}^n\mathbb{P}(\{(n_1, n_2): n_1=k, n_2=n-k\}) \tag{4} \\ =& \sum_{k = 0}^n\mathbb{P}(\{(k, n - k)\}) \\ =& \sum^n_{k=0}\frac{\lambda_1^k\lambda_2^{n-k}}{k!(n - k)!}e^{-(\lambda_1 + \lambda_2)} \tag{5} \\ =& \frac{(\lambda_1 + \lambda_2)^n}{n!}e^{-(\lambda_1 + \lambda_2)}. \end{align}

This shows $X \sim \text{Poisson}(\lambda_1 + \lambda_2)$.


Detailed explanation:

$(1)$: because $\Omega = \bigcup_{k = 0}^\infty\{(n_1, n_2): n_1 = k\}$.

$(2)$: distributivity of intersection over union.

$(3)$: when $k > n$, the intersection is $\emptyset$.

$(4)$: These $n + 1$ sets are pairwise disjoint, so additivity applies.

$(5)$: Use the specification of $\mathbb{P}$ directly.
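The final, untagged equality is the binomial theorem: $$\sum_{k=0}^n\frac{\lambda_1^k\lambda_2^{n-k}}{k!(n-k)!} = \frac{1}{n!}\sum_{k=0}^n\binom{n}{k}\lambda_1^k\lambda_2^{n-k} = \frac{(\lambda_1+\lambda_2)^n}{n!}.$$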

Zhanxiong
  • 15,126