
Would someone please suggest or discuss some real-life situations illustrating the falsities $1, 2$ below?
I'd like to develop intuition for why these are false. As a neophyte, I still need to compute the probabilities in the examples in the two answers to grasp them, so I haven't internalised these concepts yet.
Thus, I beg leave to ask about other real-life examples which require little or no computation.

I have tried, and would appreciate, less numerical examples than those at http://notesofastatisticswatcher.wordpress.com/2012/01/02/pairwise-independence-does-not-imply-mutual-independence/ and http://econ.la.psu.edu/~hbierens/INDEPENDENCE.PDF, and than Examples 1.22 and 1.23 on p. 39 of Introduction to Probability by Bertsekas.

$1.$ Pairwise Independence $\require{cancel} \cancel{\implies}$ (Mutual) Independence.

$2.$ Pairwise Independence $\require{cancel} \cancel{\Longleftarrow}$ (Mutual) Independence.

P38 defines (Mutual) Independence: events $A_1, \ldots, A_n$ are (mutually) independent if, for every $S \subseteq \{1, ..., n - 1, n\}$, $\Pr\left(\bigcap_{i \in S} A_i\right) = \prod_{i \in S} \Pr(A_i).$
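This definition can be checked mechanically on a finite, equiprobable sample space. Here is a minimal Python sketch (the function name `mutually_independent` and the predicate-based event representation are my own choices, not from the text): it tests the product condition for every subset $S$ with at least two elements.

```python
from itertools import combinations, product

def mutually_independent(events, outcomes):
    """Check Pr(intersection of A_i, i in S) = product of Pr(A_i), i in S,
    for every subset S with |S| >= 2.

    `events` is a list of predicates over an outcome; `outcomes` is a finite
    sample space whose elements are equally likely.
    """
    n = len(outcomes)
    pr = lambda pred: sum(1 for w in outcomes if pred(w)) / n
    for r in range(2, len(events) + 1):
        for S in combinations(events, r):
            joint = sum(1 for w in outcomes if all(A(w) for A in S)) / n
            prod_marginals = 1.0
            for A in S:
                prod_marginals *= pr(A)
            if abs(joint - prod_marginals) > 1e-12:
                return False
    return True

# Three fair coin tosses: "head on toss i" for i = 1, 2, 3
space = list(product("HT", repeat=3))  # 8 equally likely outcomes
A = lambda w: w[0] == "H"
B = lambda w: w[1] == "H"
C = lambda w: w[2] == "H"
print(mutually_independent([A, B, C], space))  # True
```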

3 Answers


Suppose three guys each toss a (fair) coin. The events "A and B match", "A and C match", "B and C match" are pairwise independent; the three events are not mutually independent.
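One way to convince yourself of this answer's claim is brute-force enumeration of the $2^3 = 8$ equally likely outcomes. A minimal Python sketch (the names `match` and `pr` are mine, for illustration only):

```python
from itertools import combinations, product

# Each of A, B, C tosses a fair coin; "XY" below means "X's and Y's coins match".
space = list(product("HT", repeat=3))  # 8 equally likely outcomes
match = {
    "AB": lambda w: w[0] == w[1],
    "AC": lambda w: w[0] == w[2],
    "BC": lambda w: w[1] == w[2],
}
pr = lambda pred: sum(pred(w) for w in space) / len(space)

# Every pair: Pr(both matches) = 1/4 = 1/2 * 1/2  ->  pairwise independent
for (_, Ei), (_, Ej) in combinations(match.items(), 2):
    assert pr(lambda w: Ei(w) and Ej(w)) == pr(Ei) * pr(Ej) == 0.25

# All three together: any two matches force the third, so the joint
# probability is 1/4, not the 1/8 mutual independence would require.
joint = pr(lambda w: all(E(w) for E in match.values()))
print(joint)  # 0.25, whereas Pr(AB) * Pr(AC) * Pr(BC) = 0.125
```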

bof

The usual (and perhaps the most basic) example is to throw two fair coins and to consider the three following events:

  • "The first coin shows heads"
  • "The second coin shows heads"
  • "The two coins agree"

Then, the probability of each of these events is $.5$, the probability of their intersection is $.25$ and the probability of each intersection of two of them is also $.25$.

Thus, they are pairwise independent but not mutually independent.
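These probabilities can be verified by enumerating the four equally likely outcomes; a minimal Python sketch (the event names `H1`, `H2`, `AGREE` are my own labels):

```python
from itertools import product

# Two fair coins: enumerate the 4 equally likely outcomes.
space = list(product("HT", repeat=2))
pr = lambda pred: sum(pred(w) for w in space) / len(space)

H1 = lambda w: w[0] == "H"      # the first coin shows heads
H2 = lambda w: w[1] == "H"      # the second coin shows heads
AGREE = lambda w: w[0] == w[1]  # the two coins agree

events = [H1, H2, AGREE]
print([pr(E) for E in events])                     # [0.5, 0.5, 0.5]
# every pair has intersection probability 1/4 = 1/2 * 1/2 ...
print(pr(lambda w: H1(w) and H2(w)))               # 0.25
print(pr(lambda w: H1(w) and AGREE(w)))            # 0.25
print(pr(lambda w: H2(w) and AGREE(w)))            # 0.25
# ... but so does the triple intersection (mutual independence needs 1/8)
print(pr(lambda w: H1(w) and H2(w) and AGREE(w)))  # 0.25
```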

Did
  • But this does not explain the intuition. I think any two of the pairs are independent because, in addition to what they have in common, each involves something extraneous (in our case a different toss) that distinguishes them. When you put two of them together, you know what happened in both tosses, and that determines everything here. – Bogdan Feb 14 '14 at 23:26
  • @Theta33 "Intuition" is in need of a definition here. For example, the mention of "temporal events" in your comment seems to me to muddy things, not help the intuition. – Did Feb 15 '14 at 06:15
  • I found this guide helpful for further elaborating this idea: http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-042j-mathematics-for-computer-science-spring-2005/lecture-notes/l19_prob_indep.pdf – jaynp Mar 01 '15 at 22:14

Mutual independence is stronger than pairwise independence, which means that mutual independence implies pairwise independence, but pairwise independence doesn't imply mutual independence.

To keep things simple, we consider just three events at a time; from this answer we can learn the conditions for mutual independence and for pairwise independence:

$A, B, C$ are mutually independent if $$P(A\cap B\cap C)=P(A)P(B)P(C)$$ $$P(A\cap B)=P(A)P(B)$$ $$P(A\cap C)=P(A)P(C)$$ $$P(B\cap C)=P(B)P(C).$$

On the other hand, $A, B, C$ are pairwise independent if $$P(A\cap B)=P(A)P(B)$$ $$P(A\cap C)=P(A)P(C)$$ $$P(B\cap C)=P(B)P(C).$$

An example: Consider that we toss a fair coin three times, and here are four events:

A: Head appears in the first toss.
B: Head appears in the second toss.
C: Head appears in the third toss.
D: A and B yield the same outcome.

Mutual independence:

First, consider only $A, B, C$ (treat $D$ as nonexistent). It is clear that they are mutually independent; here are two ways to see this.

  1. $$P(A\cap B\cap C)=P(A)P(B)P(C)=\frac{1}{8}$$ $$P(A\cap B)=P(A)P(B)=\frac{1}{4}$$ $$P(A\cap C)=P(A)P(C)=\frac{1}{4}$$ $$P(B\cap C)=P(B)P(C)=\frac{1}{4}.$$

  2. First calculate the joint probability: $$P(A\cap B\cap C)=P(A)P(B)P(C)=\frac{1}{8}.$$ Then obtain the marginal probability $P(B\cap C)$ by summing out $A$: $$P(B\cap C) = P(A\cap B\cap C) + P(\neg A\cap B\cap C)=\frac{1}{4}.$$ The other two pairs follow by the symmetry of $A$, $B$ and $C$.

Mutual independence implies pairwise independence because we can just marginalize out the variables not in each pair.

Pairwise independent but not mutually independent
Let's consider $A$, $B$ and $D$ (just treat $C$ as nonexistent). $$P(A\cap B\cap D)=\frac{1}{4}\neq P(A)P(B)P(D)=\frac{1}{8}$$ $$P(A\cap B)=P(A)P(B)=\frac{1}{4}$$ $$P(A\cap D)=P(A)P(D)=\frac{1}{4}$$ $$P(B\cap D)=P(B)P(D)=\frac{1}{4}.$$

We can see that $A$, $B$ and $C$ are mutually independent (and therefore pairwise independent), whereas $A$, $B$ and $D$ are only pairwise independent and are not mutually independent.
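The numbers above can be double-checked by enumerating the eight outcomes of three fair coin tosses; a short Python sketch (the helper `pr` is mine, for illustration):

```python
from itertools import product

# Three fair coin tosses: check the A, B, D computation numerically.
space = list(product("HT", repeat=3))  # 8 equally likely outcomes
pr = lambda pred: sum(pred(w) for w in space) / len(space)

A = lambda w: w[0] == "H"   # head on the first toss
B = lambda w: w[1] == "H"   # head on the second toss
D = lambda w: w[0] == w[1]  # first and second tosses agree

print(pr(lambda w: A(w) and B(w) and D(w)))        # 0.25 (A and B both heads forces D)
print(pr(A) * pr(B) * pr(D))                       # 0.125 -> not mutually independent
print(pr(lambda w: A(w) and D(w)), pr(A) * pr(D))  # 0.25 0.25 -> pairwise holds
```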

References:

  1. "Pairwise independence versus mutual independence" by Isaac (Ed) Leonard.
  2. "Mutually Independent Events" by Albert R. Meyer.
  3. "Probabilistic graphical models" by Stefano Ermon.
amWhy