Conditional probability and independence
Last modified — 09 Jan 2026
In general, the outcome of a random experiment can be any element of \(\Omega\).
Sometimes, we have “partial information” about which elements can occur.
Roll a die.
If \(A\) is the event of obtaining a “2”, then \(\mathbb{P}(A) = 1/6\).
But if the outcome is known to be even, then intuition suggests that \(\mathbb{P}(A) > 1/6\).
Two events play distinct roles in this example:
The event of interest \(A = \{ 2 \}\)
The conditioning event \[B= \{\text{outcome is even}\} = \{2, 4, 6\}\]
The conditioning event captures the “partial information”
The conditional probability of \(A\) given \(B\) (defined whenever \(\mathbb{P}(B) > 0\)) is \[\mathbb{P}(A \ \vert\ B) \, = \, \frac{\mathbb{P}(A \cap B)}{\mathbb{P}(B)}.\]
Just as \(\mathbb{P}(\cdot)\) is a function, for any fixed \(B\), \[\mathbb{Q}_B \left( \ \cdot \ \right) \, = \, \mathbb{P}\left(\ \cdot \ \ \vert\ B \right)\] is a function. Its argument is any event \(A \subseteq \Omega\).
Moreover, \(\mathbb{P}\left(\ \cdot \ \ \vert\ B \right)\) satisfies the three Axioms of a Probability.
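Indeed (a quick verification sketch, using the definition above): for any event \(A\), \(\mathbb{Q}_B(A) = \mathbb{P}(A \cap B)/\mathbb{P}(B) \ge 0\); also \(\mathbb{Q}_B(\Omega) = \mathbb{P}(\Omega \cap B)/\mathbb{P}(B) = \mathbb{P}(B)/\mathbb{P}(B) = 1\); and for disjoint \(A_1, A_2, \ldots\), the events \(A_i \cap B\) are also disjoint, so \[\mathbb{Q}_B\left(\bigcup_i A_i\right) = \frac{\mathbb{P}\left(\left(\bigcup_i A_i\right) \cap B\right)}{\mathbb{P}(B)} = \frac{\sum_i \mathbb{P}\left(A_i \cap B\right)}{\mathbb{P}(B)} = \sum_i \mathbb{Q}_B\left(A_i\right).\]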
Your friend rolls a fair die once, looks at it, and tells you that the result is even.
What is \(\mathbb{P}(\{2\} \ \vert\ \text{even})\)?
The event of interest \(A = \{ 2 \}\)
The conditioning event \(B= \{\text{outcome is even}\} = \{2, 4, 6\}\)
Therefore, \[ \mathbb{P}(A \ \vert\ B) = \frac{\mathbb{P}(A \cap B)}{\mathbb{P}(B)} = \frac{1/6}{1/2} = 1/3 > 1/6 = \mathbb{P}(A). \]
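The calculation can also be checked by brute-force enumeration; the following sketch (not part of the original notes) uses exact fractions:

```python
from fractions import Fraction

# Fair die: each outcome in {1, ..., 6} has probability 1/6.
p = {w: Fraction(1, 6) for w in range(1, 7)}

def prob(E):
    return sum(p[w] for w in E)

A = {2}          # event of interest
B = {2, 4, 6}    # conditioning event: the outcome is even

# Definition of conditional probability: P(A | B) = P(A ∩ B) / P(B).
P_A_given_B = prob(A & B) / prob(B)
print(P_A_given_B)   # 1/3
```

Using `Fraction` keeps the arithmetic exact, so the output is the rational number \(1/3\) rather than a rounded decimal.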
If \(\mathbb{P}(A_1) > 0\), then \[ \mathbb{P}\left( A_{1} \cap A_{2}\right) = \mathbb{P}\left(A_{2}\ \vert\ A_{1}\right) \, \mathbb{P}\left( A_{1}\right). \]
More generally (the multiplication rule), if \(\mathbb{P}(A_1 \cap \cdots \cap A_{n-1}) > 0\), then \[ \mathbb{P}\left(A_1 \cap \cdots \cap A_n\right) = \mathbb{P}\left(A_1\right) \, \mathbb{P}\left(A_2 \ \vert\ A_1\right) \, \mathbb{P}\left(A_3 \ \vert\ A_1 \cap A_2\right) \cdots \mathbb{P}\left(A_n \ \vert\ A_1 \cap \cdots \cap A_{n-1}\right). \]
The proof is the same for any \(n\): write each conditional probability as a ratio, and everything cancels except what we want.
An urn has 10 red balls and 40 black balls.
Three balls are randomly drawn without replacement.
Calculate the probability that:
The 1st ball drawn is red, the 2nd is black, and the 3rd is red.
The 3rd ball is red given that the 1st is red and the 2nd is black.
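Both answers follow from the multiplication rule; the sketch below (not part of the original notes) computes them exactly:

```python
from fractions import Fraction

# Urn: 10 red, 40 black; three draws without replacement.
# Multiplication rule: P(R1 ∩ B2 ∩ R3) = P(R1) P(B2 | R1) P(R3 | R1 ∩ B2).
p_r1 = Fraction(10, 50)             # 10 red among 50 balls
p_b2_given_r1 = Fraction(40, 49)    # 40 black among the 49 remaining
p_r3_given_r1b2 = Fraction(9, 48)   # 9 red among the 48 remaining

p_all = p_r1 * p_b2_given_r1 * p_r3_given_r1b2
print(p_all)               # 3/98   — answer to the first question
print(p_r3_given_r1b2)     # 3/16   — answer to the second question
```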
We say that \(B_{1}, \ldots, B_{n}\) is a partition of \(\Omega\) if
They are pairwise disjoint: \(B_{i}\cap B_{j} \, = \, \varnothing \quad \mbox{ for } i \ne j \, ,\)
They cover the whole sample space: \(\bigcup_{i=1}^{n} B_{i} \, = \, \Omega\)
\(A = A \cap \Omega = A \cap \left( \bigcup _{i=1}^{n}B_{i}\right) = \bigcup_{i=1}^{n}\left( A \cap B_{i}\right)\)
The events \(\left( A \cap B_{i}\right)\) are disjoint.
Therefore, by Axiom 3, we have \[\begin{aligned} \mathbb{P}\left( A\right) & = \mathbb{P}\left( \bigcup_{i=1}^{n} A \cap B_{i} \right) \\ &=\sum_{i=1}^{n} \mathbb{P}\left( A\cap B_{i}\right) \\ &=\sum_{i=1}^{n} \mathbb{P}\left( A\ \vert\ B_{i}\right) \, \mathbb{P}\left( B_{i}\right). \end{aligned}\]
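To illustrate the formula numerically (a sketch reusing the urn example above; not part of the original notes), partition on the colour of the first ball and compute the probability that the second ball drawn is red:

```python
from fractions import Fraction

# Urn: 10 red, 40 black. Partition: B1 = {1st ball red}, B2 = {1st ball black}.
p_first_red = Fraction(10, 50)
p_first_black = Fraction(40, 50)
p_second_red_given_first_red = Fraction(9, 49)
p_second_red_given_first_black = Fraction(10, 49)

# Law of total probability: P(A) = sum_i P(A | B_i) P(B_i).
p_second_red = (p_second_red_given_first_red * p_first_red
                + p_second_red_given_first_black * p_first_black)
print(p_second_red)   # 1/5 — the same as the unconditional P(red) = 10/50
```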
A prize is behind one of 3 doors, and a 🐐 is behind each of the other 2. You select a door; the host, who knows what is behind each door, then opens one of the 2 remaining doors, revealing a 🐐. The host asks
Would you like to switch to the remaining closed door?
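A quick simulation (an illustration, not part of the notes; the door labels and helper function are made up) suggests the well-known answer — switching wins about \(2/3\) of the time:

```python
import random

def monty_hall(switch, rng):
    """One round of the game; returns True if the player wins the prize."""
    doors = [0, 1, 2]
    prize = rng.choice(doors)
    pick = rng.choice(doors)
    # The host opens a door that is neither the player's pick nor the prize.
    opened = rng.choice([d for d in doors if d != pick and d != prize])
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

rng = random.Random(302)
n = 100_000
wins_switch = sum(monty_hall(True, rng) for _ in range(n)) / n
wins_stay = sum(monty_hall(False, rng) for _ in range(n)) / n
print(wins_switch, wins_stay)   # ≈ 0.667 and ≈ 0.333
```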
Two events \(A\) and \(B\) are independent if \(\mathbb{P}\left( A\cap B\right) = \mathbb{P}\left( A\right) \mathbb{P}\left( B\right)\). In that case, if \(\mathbb{P}(B) > 0\),
\[\mathbb{P}\left( A\ \vert\ B\right) = \frac{\mathbb{P}\left( A\cap B\right) }{\mathbb{P}\left( B\right) } = \frac{\mathbb{P}\left( A\right) \mathbb{P}\left( B\right) }{\mathbb{P}\left( B\right) } = \mathbb{P}\left( A\right).\]
Knowledge that \(B\) has occurred does not change the probability of \(A\), and vice versa: the occurrence of either event does not affect the probability of the other.
Thus the name: “independent events”
Whether two non-trivial events \(A\) and \(B\) are independent depends on the probability measure \(\mathbb{P}\), not only on the events themselves. Take \(\Omega = \{1, \ldots, 8\}\) and events \(A\) and \(B\) with \(A \cap B = \{4\}\). Then,
Case 1
If \(\mathbb{P}(\{i\}) = 1/8 \qquad \forall i\), then
\[\begin{aligned} \mathbb{P}\left( A\cap B\right) &= \mathbb{P}\left( \left\{ 4 \right\} \right) = 1/8, \qquad \text{ and } \qquad \mathbb{P}\left( A\right) \mathbb{P}\left( B\right) = 4/ 8 \times 2/8 = 1/8. \end{aligned}\]
The two agree, so \(A\) and \(B\) are independent under the uniform measure.
Case 2
If \(\mathbb{P}( \{ i \}) = i / 36 \qquad 1 \le i \le 8\), then
\[\begin{aligned} \mathbb{P}\left( A\cap B\right) &= \mathbb{P}\left( \left\{ 4\right\} \right) =4/36, \qquad \text{ and } \qquad \mathbb{P}\left( A\right) \, \mathbb{P}\left( B\right) = 10 / 36 \times 12/36. \end{aligned}\]
The two do not agree (\(120/1296 \ne 144/1296 = 4/36\)), so \(A\) and \(B\) are not independent under this measure.
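Both cases can be verified exactly; note that \(A = \{1,2,3,4\}\) and \(B = \{4,8\}\) in the sketch below are an assumption (the unique choice of events consistent with the probabilities shown in the two cases):

```python
from fractions import Fraction

# Omega = {1, ..., 8}; A and B are assumed (consistent with the cases above).
A, B = {1, 2, 3, 4}, {4, 8}

def independent(p):
    """Return True iff A and B are independent under the measure p."""
    def prob(E):
        return sum(p[w] for w in E)
    return prob(A & B) == prob(A) * prob(B)

uniform = {w: Fraction(1, 8) for w in range(1, 9)}    # Case 1
weighted = {w: Fraction(w, 36) for w in range(1, 9)}  # Case 2

print(independent(uniform))    # True  — independent under the uniform measure
print(independent(weighted))   # False — not independent under P({i}) = i/36
```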
Events \(A_1, \ldots, A_n\) are (mutually) independent if, for every subcollection \(A_{i_1}, \ldots, A_{i_k}\), \[\mathbb{P}\left( A_{i_1} \cap \cdots \cap A_{i_k}\right) = \mathbb{P}\left( A_{i_1}\right) \cdots \mathbb{P}\left( A_{i_k}\right).\]
For example, if \(n=3,\) then \(A_1\), \(A_2\), and \(A_3\) are independent if and only if all of the following hold:
\[\begin{aligned} \mathbb{P}\left( A_{1}\cap A_{2}\right) &= \mathbb{P}\left( A_{1}\right) \, \mathbb{P}\left( A_{2}\right), \\ \mathbb{P}\left( A_{1}\cap A_{3}\right) &= \mathbb{P}\left( A_{1}\right) \, \mathbb{P}\left( A_{3}\right),\\ \mathbb{P}\left( A_{2}\cap A_{3}\right) &= \mathbb{P}\left( A_{2}\right) \, \mathbb{P}\left( A_{3}\right),\\ \mathbb{P}\left( A_{1}\cap A_{2}\cap A_{3}\right) &= \mathbb{P}\left( A_{1}\right) \, \mathbb{P}\left( A_{2}\right) \, \mathbb{P}\left( A_{3}\right). \end{aligned}\]
We flip a fair coin twice. Define three events:
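The three events are not listed here; one classic choice (an assumption, commonly used to show that pairwise independence does not imply mutual independence) is \(A = \{\text{1st flip is H}\}\), \(B = \{\text{2nd flip is H}\}\), \(C = \{\text{both flips equal}\}\):

```python
from fractions import Fraction

# Sample space: two flips of a fair coin; each outcome has probability 1/4.
p = {w: Fraction(1, 4) for w in ["HH", "HT", "TH", "TT"]}

def prob(E):
    return sum(p[w] for w in E)

# A hypothetical (classic) choice of events — not stated in the notes:
A = {"HH", "HT"}   # 1st flip is heads
B = {"HH", "TH"}   # 2nd flip is heads
C = {"HH", "TT"}   # both flips show the same face

# Pairwise independence holds ...
print(prob(A & B) == prob(A) * prob(B))   # True
print(prob(A & C) == prob(A) * prob(C))   # True
print(prob(B & C) == prob(B) * prob(C))   # True

# ... but mutual independence fails: P(A ∩ B ∩ C) = 1/4, not 1/8.
print(prob(A & B & C) == prob(A) * prob(B) * prob(C))   # False
```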
A die is rolled repeatedly until we see a 6.
Let \(Z\) be the event that you eventually stop rolling. Show that \(\mathbb{P}(Z) = 1\).
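One way to argue (a sketch, assuming the rolls are independent): the complement \(Z^{c}\) requires avoiding a 6 forever, and \[\mathbb{P}\left(\text{no 6 in the first } n \text{ rolls}\right) = \left(\frac{5}{6}\right)^{n} \longrightarrow 0 \quad \text{as } n \to \infty.\] Since \(Z^{c} \subseteq \{\text{no 6 in the first } n \text{ rolls}\}\) for every \(n\), we get \(\mathbb{P}\left(Z^{c}\right) = 0\), and therefore \(\mathbb{P}(Z) = 1\).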
Stat 302 - Winter 2025/26