Conditional probability and independence
Last modified — 04 Feb 2026
In general, the outcome of a random experiment can be any element of \(\Omega\).
Sometimes, we have “partial information” about which elements can occur.
Roll a die.
If \(A\) is the event of obtaining a “2”, then \(\mathbb{P}(A) = 1/6\).
But if the outcome is known to be even, then intuition suggests that \(\mathbb{P}(A) > 1/6\).
Two events play distinct roles in this example:
The event of interest \(A = \{ 2 \}\)
The conditioning event \[B= \{\text{outcome is even}\} = \{2, 4, 6\}\]
The conditioning event captures the “partial information”
For any event \(B\) with \(\mathbb{P}(B) > 0\), the conditional probability of \(A\) given \(B\) is defined by \[\mathbb{P}\left( A\ \vert\ B\right) = \frac{\mathbb{P}\left( A\cap B\right)}{\mathbb{P}\left( B\right)}.\]
Just as \(\mathbb{P}(\cdot)\) is a function, for any fixed \(B\), \[\mathbb{Q}_B \left( \ \cdot \ \right) \, = \, \mathbb{P}\left(\ \cdot \ \ \vert\ B \right)\] is a function. Its argument is any event \(A \subseteq \Omega\).
Moreover, for fixed \(B\), \(\mathbb{P}\left(\ \cdot \ \ \vert\ B \right)\) satisfies the three axioms of probability.
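As a quick sketch of why this is true, write \(\mathbb{Q}_B(A) = \mathbb{P}(A \ \vert\ B)\) with \(\mathbb{P}(B) > 0\). Non-negativity and \(\mathbb{Q}_B(\Omega) = 1\) follow directly from the definition, \[\mathbb{Q}_B(A) = \frac{\mathbb{P}(A \cap B)}{\mathbb{P}(B)} \ge 0, \qquad \mathbb{Q}_B(\Omega) = \frac{\mathbb{P}(\Omega \cap B)}{\mathbb{P}(B)} = \frac{\mathbb{P}(B)}{\mathbb{P}(B)} = 1,\] and for disjoint events \(A_1, A_2, \ldots\) the events \(A_i \cap B\) are also disjoint, so \[\mathbb{Q}_B\left(\bigcup_{i} A_i\right) = \frac{\mathbb{P}\left(\bigcup_{i} (A_i \cap B)\right)}{\mathbb{P}(B)} = \sum_{i} \frac{\mathbb{P}(A_i \cap B)}{\mathbb{P}(B)} = \sum_{i} \mathbb{Q}_B(A_i).\]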
Your friend rolls a fair die once, looks at it, and tells you that the result is even.
What is \(\mathbb{P}(\{2\} \ \vert\ \text{even})\)?
The event of interest \(A = \{ 2 \}\)
The conditioning event \(B= \{\text{outcome is even}\} = \{2, 4, 6\}\)
Therefore, \[ \mathbb{P}(A \ \vert\ B) = \frac{\mathbb{P}(A \cap B)}{\mathbb{P}(B)} = \frac{1/6}{1/2} = 1/3 > 1/6 = \mathbb{P}(A). \]
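Since the six outcomes are equally likely, one way to read this result is that conditioning on \(B\) simply restricts the sample space to \(B\): \[\mathbb{P}(A \ \vert\ B) = \frac{|A \cap B|}{|B|} = \frac{|\{2\}|}{|\{2, 4, 6\}|} = \frac{1}{3}.\]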
If \(\mathbb{P}(A_1) > 0\), then \[ \mathbb{P}\left( A_{1} \cap A_{2}\right) = \mathbb{P}\left(A_{2}\ \vert\ A_{1}\right) \, \mathbb{P}\left( A_{1}\right). \]
More generally, if \(\mathbb{P}\left( A_{1} \cap \cdots \cap A_{n-1}\right) > 0\), the multiplication rule extends to \(n\) events: \[\mathbb{P}\left( A_{1} \cap \cdots \cap A_{n}\right) = \mathbb{P}\left( A_{1}\right) \, \mathbb{P}\left( A_{2}\ \vert\ A_{1}\right) \, \mathbb{P}\left( A_{3}\ \vert\ A_{1}\cap A_{2}\right) \cdots \mathbb{P}\left( A_{n}\ \vert\ A_{1}\cap \cdots \cap A_{n-1}\right).\] The proof is the same for any \(n\): write each conditional probability as a ratio, and everything cancels except what we want.
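For instance, with \(n = 3\) (and \(\mathbb{P}(A_1 \cap A_2) > 0\)) the right-hand side telescopes: \[\mathbb{P}(A_1) \cdot \frac{\mathbb{P}(A_1 \cap A_2)}{\mathbb{P}(A_1)} \cdot \frac{\mathbb{P}(A_1 \cap A_2 \cap A_3)}{\mathbb{P}(A_1 \cap A_2)} = \mathbb{P}(A_1 \cap A_2 \cap A_3).\]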
An urn has 10 red balls and 40 black balls.
Three balls are randomly drawn without replacement.
Calculate the probability that:
The 3rd ball is red given that the 1st is red and the 2nd is black.
The 1st ball drawn is red, the 2nd is black, and the 3rd is red.
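One way to work these out with the multiplication rule, writing \(R_i\) for “the \(i\)th ball drawn is red” and \(B_i\) for “the \(i\)th ball drawn is black” (notation introduced here for brevity): after one red and one black ball have been removed, 9 of the remaining 48 balls are red, so \[\mathbb{P}\left( R_{3}\ \vert\ R_{1} \cap B_{2}\right) = \frac{9}{48} = \frac{3}{16},\] and therefore \[\mathbb{P}\left( R_{1} \cap B_{2} \cap R_{3}\right) = \mathbb{P}\left( R_{1}\right) \, \mathbb{P}\left( B_{2}\ \vert\ R_{1}\right) \, \mathbb{P}\left( R_{3}\ \vert\ R_{1} \cap B_{2}\right) = \frac{10}{50} \cdot \frac{40}{49} \cdot \frac{9}{48} = \frac{3}{98}.\]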
We say that \(B_{1}, \ldots, B_{n}\) is a partition of \(\Omega\) if
They are disjoint: \(B_{i}\cap B_{j} \, = \, \varnothing \ \mbox{ for } i \ne j\),
They cover the whole sample space: \(\bigcup_{i=1}^{n} B_{i} \, = \, \Omega\)
\(A = A \cap \Omega = A \cap \left( \bigcup _{i=1}^{n}B_{i}\right) = \bigcup_{i=1}^{n}\left( A \cap B_{i}\right)\)
The events \(\left( A \cap B_{i}\right)\) are disjoint.
Therefore, by Axiom 3, we have \[\begin{aligned} \mathbb{P}\left( A\right) & = \mathbb{P}\left( \bigcup_{i=1}^{n} A \cap B_{i} \right) \\ &=\sum_{i=1}^{n} \mathbb{P}\left( A\cap B_{i}\right) \\ &=\sum_{i=1}^{n} \mathbb{P}\left( A\ \vert\ B_{i}\right) \, \mathbb{P}\left( B_{i}\right). \end{aligned}\]
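The simplest partition is \(\{B, B^{c}\}\), which gives \(\mathbb{P}(A) = \mathbb{P}(A \ \vert\ B)\,\mathbb{P}(B) + \mathbb{P}(A \ \vert\ B^{c})\,\mathbb{P}(B^{c})\). As an illustration with the urn above (writing \(R_i\) for “the \(i\)th ball drawn is red” and \(B_1\) for “the 1st ball drawn is black”), one way to find the probability that the 2nd ball is red is to condition on the colour of the 1st: \[\mathbb{P}(R_2) = \mathbb{P}(R_2 \ \vert\ R_1)\,\mathbb{P}(R_1) + \mathbb{P}(R_2 \ \vert\ B_1)\,\mathbb{P}(B_1) = \frac{9}{49}\cdot\frac{10}{50} + \frac{10}{49}\cdot\frac{40}{50} = \frac{490}{2450} = \frac{1}{5}.\]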
Behind one of three doors is a car; behind the other two are goats (the Monty Hall problem). You select a door; the host, who knows what is behind each door, then opens one of the 2 remaining doors, revealing a goat 🐐. The host asks:
Would you like to switch to the remaining closed door?
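One way to analyse this with the law of total probability, under the usual assumptions that the car is equally likely to be behind each door and that the host always opens a door hiding a goat: let \(C\) be the event that your initial choice hides the car (a label introduced here), so \(\mathbb{P}(C) = 1/3\). Switching wins exactly when your initial choice was a goat, so \[\mathbb{P}(\text{win by switching}) = \mathbb{P}(\text{win}\ \vert\ C)\,\mathbb{P}(C) + \mathbb{P}(\text{win}\ \vert\ C^{c})\,\mathbb{P}(C^{c}) = 0 \cdot \frac{1}{3} + 1 \cdot \frac{2}{3} = \frac{2}{3},\] compared with \(1/3\) for staying, so switching is the better strategy.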
Two events \(A\) and \(B\) are called independent if \(\mathbb{P}\left( A\cap B\right) = \mathbb{P}\left( A\right) \mathbb{P}\left( B\right)\). In that case, whenever \(\mathbb{P}(B) > 0\), \[\mathbb{P}\left( A\ \vert\ B\right) = \frac{\mathbb{P}\left( A\cap B\right) }{\mathbb{P}\left( B\right) } = \frac{\mathbb{P}\left( A\right) \mathbb{P}\left( B\right) }{\mathbb{P}\left( B\right) } = \mathbb{P}\left( A\right).\]
Knowledge that \(B\) has occurred does not change the probability of \(A\); by the symmetric argument, knowledge of the occurrence of either event does not affect the probability of the other.
Thus the name: “independent events”
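For example (a small illustration, separate from the examples above): roll a fair die once and let \(E = \{2, 4, 6\}\) and \(F = \{1, 2, 3, 4\}\). Then \[\mathbb{P}(E \cap F) = \mathbb{P}(\{2, 4\}) = \frac{1}{3} = \frac{1}{2} \cdot \frac{2}{3} = \mathbb{P}(E)\,\mathbb{P}(F),\] so \(E\) and \(F\) are independent even though they share outcomes: independence is a statement about probabilities, not about the events being disjoint.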
If \(A\) and \(B\) are non-trivial events, then \[A \text{ and } B \text{ are independent} \ \iff \ \mathbb{P}\left( A\ \vert\ B\right) = \mathbb{P}\left( A\right) \ \iff \ \mathbb{P}\left( B\ \vert\ A\right) = \mathbb{P}\left( B\right).\]
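Conversely, \(\mathbb{P}(A \ \vert\ B) = \mathbb{P}(A)\) together with \(\mathbb{P}(B) > 0\) is enough to give independence: multiplying through by \(\mathbb{P}(B)\) yields \[\mathbb{P}(A \cap B) = \mathbb{P}(A \ \vert\ B)\,\mathbb{P}(B) = \mathbb{P}(A)\,\mathbb{P}(B).\]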