Last modified — 29 Sep 2025
In general, the outcome of a random experiment can be any element of \(\Omega\).
Sometimes, we have “partial information” about the outcome.
Example: Roll a die. If \(A\) is the event of obtaining a “2”, then \(\mathbb{P}(A) = 1/6\).
But if the outcome is known to be even, then the chance should be higher.
These two events play distinct roles in this example:
The event of interest \(A\): \(A = \{ 2 \}\)
The conditioning event \(B\), the “partial information”: \[B = \{ 2, 4, 6 \}\]
Let \(A, B \subseteq \Omega\) and assume \(\mathbb{P}(B) > 0\)
The conditional probability of \(A\) given \(B\) is \[\mathbb{P}\left(A \ \vert\ B \right) \, = \, \frac{ \mathbb{P}\left( A \cap B \right) }{ \mathbb{P}\left( B \right) }\]
For any fixed \(B\), the function \[\mathbb{Q}_B \left( \ \cdot \ \right) \, = \, \mathbb{P}\left(\ \cdot \ \ \vert\ B \right)\] satisfies the three Axioms of a Probability.
The dot in “\(\mathbb{Q}_B(\cdot)\)” highlights the fact that \(\mathbb{Q}_B\) is a function; its argument is any event \(A \subseteq \Omega\).
We roll a fair die, once, and record the outcome.
Consider the following events:
\(A=\left\{ \text{4 or higher}\right\} =\{4,5,6\}\),
\(B=\left\{ \text{even number}\right\} =\{2,4,6\}\)
\(A\cap B=\left\{ 4,6\right\}\)
Therefore \[\mathbb{P}\left( A\ \vert\ B\right) =\frac{\mathbb{P}\left( A\cap B\right) }{\mathbb{P}\left( B\right) }=\frac{1/3}{1/2}=2/3\]
\(\mathbb{P}\left( A\right) =0.50\),
Knowing that the outcome is even increases this probability to \[\mathbb{P}\left( A\ \vert\ B\right) =0.667\]
Alternatively, if we knew that \[C=\left\{ \text{the number is odd}\right\}\] then we would have: \[\mathbb{P}\left( A\ \vert\ C\right) =\frac{\mathbb{P}\left( A\cap C\right)}{\mathbb{P}\left( C\right) }=\frac{1/6}{1/2}=0.333\]
Notation: \(\mathbb{P}\left( A\ \vert\ B\right) =\) “probability of \(A\) given \(B\)”
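These conditional probabilities are easy to check by simulation. Below is a minimal Python sketch (illustrative code, not part of the course material) that estimates \(\mathbb{P}(A \ \vert\ B)\) and \(\mathbb{P}(A \ \vert\ C)\) as ratios of counts:

```python
# Monte Carlo check of the die-roll example:
# P(A | B) should be close to 2/3 and P(A | C) close to 1/3.
import random

random.seed(1)
N = 100_000
rolls = [random.randint(1, 6) for _ in range(N)]

A = [r >= 4 for r in rolls]      # event A = {4, 5, 6}
B = [r % 2 == 0 for r in rolls]  # event B = {even}
C = [r % 2 == 1 for r in rolls]  # event C = {odd}

# conditional probability as a ratio of counts: #(A and B) / #(B)
p_A_given_B = sum(a and b for a, b in zip(A, B)) / sum(B)
p_A_given_C = sum(a and c for a, c in zip(A, C)) / sum(C)
print(p_A_given_B, p_A_given_C)  # approx 0.667 and 0.333
```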
If \(\mathbb{P}(A_1) > 0\), then \[\mathbb{P}\left( A_{1}\cap A_{2}\right) = \mathbb{P}\left( A_{2}\ \vert\ A_{1}\right) \, \mathbb{P}\left( A_{1}\right)\] More generally (the Multiplication Rule): if \(\mathbb{P}\left( A_{1}\cap \cdots \cap A_{n-1}\right) > 0\), then \[\mathbb{P}\left( A_{1}\cap A_{2}\cap \cdots \cap A_{n}\right) = \mathbb{P}\left( A_{1}\right) \, \mathbb{P}\left( A_{2}\ \vert\ A_{1}\right) \, \mathbb{P}\left( A_{3}\ \vert\ A_{1}\cap A_{2}\right) \cdots \mathbb{P}\left( A_{n}\ \vert\ A_{1}\cap \cdots \cap A_{n-1}\right)\]
Proof: To fix ideas, look at the case \(n=4\): \[\begin{aligned} \mathbb{P}\left( A_{1}\right) \, \mathbb{P}\left( A_{2}\ \vert\ A_{1}\right) \, \mathbb{P}\left( A_{3}\ \vert\ A_{1}\cap A_{2}\right) \, \mathbb{P}\left( A_{4}\ \vert\ A_{1}\cap A_{2}\cap A_{3}\right) &= \mathbb{P}\left( A_{1}\right) \, \frac{\mathbb{P}\left( A_{1}\cap A_{2}\right) }{\mathbb{P}\left( A_{1}\right) } \, \frac{\mathbb{P}\left( A_{1}\cap A_{2}\cap A_{3}\right) }{\mathbb{P}\left( A_{1}\cap A_{2}\right) } \, \frac{\mathbb{P}\left( A_{1}\cap A_{2}\cap A_{3}\cap A_{4}\right) }{\mathbb{P}\left( A_{1}\cap A_{2}\cap A_{3}\right) } \\ & \\ &= \mathbb{P}\left( A_{1}\cap A_{2}\cap A_{3}\cap A_{4}\right) \end{aligned}\] The general proof is the same, but requires more space.
An urn has 10 red balls and 40 black balls.
Three balls are randomly drawn without replacement. Calculate the probability that:
The first drawn ball is red, the 2nd is black and the 3rd is red.
The 3rd ball is red given that the 1st is red and the 2nd is black.
Part (a): the first drawn ball is red, the 2nd is black and the 3rd is red
Let \[R_{i} \, = \, \left\{ \text{the } i^{\text{th}} \text{ ball is red} \right\} \qquad \mbox{ and } \qquad B_{j} \, = \, \left\{ \text{the } j^{\text{th}} \text{ ball is black} \right\}\]
The event of interest is \[R_{1}\cap B_{2}\cap R_{3}\] By the Multiplication Rule: \[\mathbb{P}\left( R_{1}\cap B_{2}\cap R_{3}\right) = \mathbb{P}\left( R_{1}\right) \, \mathbb{P}\left( B_{2}\ \vert\ R_{1}\right) \, \mathbb{P}\left( R_{3}\ \vert\ R_{1}\cap B_{2}\right) = \frac{10}{50} \times \frac{40}{49} \times \frac{9}{48} = 0.03061\]
Part (b): the 3rd ball is red given that the 1st is red and the 2nd is black
Event of interest: \(R_{3}\)
Conditioning event: \(R_{1}\cap B_{2}\). We want \[\mathbb{P}\left( R_{3}\ \vert\ R_{1}\cap B_{2}\right) =\frac{\mathbb{P}\left( R_{1}\cap B_{2}\cap R_{3}\right) }{\mathbb{P}\left( R_{1}\cap B_{2}\right) }\]
Using part (a) we have \[\mathbb{P}\left( R_{1}\cap B_{2}\cap R_{3}\right) = 0.03061\]
Also \[\mathbb{P}\left( R_{1}\cap B_{2}\right) = \mathbb{P}\left( B_{2}\ \vert\ R_{1}\right) \, \mathbb{P}\left( R_{1}\right) = \frac{40}{49} \times \frac{10}{50} = 0.16327\]
Finally: \[\mathbb{P}\left( R_{3}\ \vert\ R_{1}\cap B_{2}\right) =\frac{0.03061}{0.16327} = 0.1875\] (exactly \(9/48\): after one red and one black ball are removed, 9 of the remaining 48 balls are red)
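As a sanity check, here is a simulation sketch of the urn example (illustrative code, not from the slides); it estimates both probabilities by repeatedly drawing three balls without replacement:

```python
# Simulation of the urn example: 10 red, 40 black, three draws
# without replacement. Variable names are illustrative.
import random

random.seed(1)
N = 200_000
count_RB = 0         # first red, second black (any third)
count_RB_then_R = 0  # ... and third red

urn = ["R"] * 10 + ["B"] * 40
for _ in range(N):
    draw = random.sample(urn, 3)  # sampling without replacement
    if draw[0] == "R" and draw[1] == "B":
        count_RB += 1
        if draw[2] == "R":
            count_RB_then_R += 1

print(count_RB_then_R / N)         # approx P(R1 and B2 and R3) = 0.0306
print(count_RB_then_R / count_RB)  # approx P(R3 | R1 and B2) = 0.1875
```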
We say that \(B_{1}, \ldots, B_{n}\) is a partition of \(\Omega\) if
They are disjoint \[B_{i}\cap B_{j} \, = \, \varnothing \quad \mbox{ for } i \ne j \, ,\]
They cover the whole sample space: \[\bigcup_{i=1}^{n} B_{i} \, = \, \Omega\]
In that case, for any \(A \subseteq \Omega\) (the Law of Total Probability): \[\mathbb{P}\left( A\right) =\sum_{i=1}^{n} \mathbb{P}\left( A\ \vert\ B_{i}\right) \, \mathbb{P}\left( B_{i}\right)\]
To prove it, note
\(A = A \cap \Omega =A \cap \left( \bigcup _{i=1}^{n}B_{i}\right) = \bigcup _{i=1}^{n}\left( A \cap B_{i}\right)\)
The events \(\left( A \cap B_{i}\right)\) are disjoint.
Therefore, using Axiom 3, we have \[\begin{aligned} \mathbb{P}\left( A\right) & = \mathbb{P}\left( \bigcup_{i=1}^{n} \left( A \cap B_{i} \right) \right) \\ & \\ &=\sum_{i=1}^{n} \mathbb{P}\left( A\cap B_{i}\right) \\ & \\ &=\sum_{i=1}^{n} \mathbb{P}\left( A\ \vert\ B_{i}\right) \, \mathbb{P}\left( B_{i}\right) \end{aligned}\]
Suppose that 1% of all email traffic is spam
Assume that 90% of all spam email IS WRITTEN IN ALL CAPS
Suppose that only 2% of non-spam email is written in all caps
If you receive an email, what is the probability that it is ALL IN CAPS?
Let \[\begin{aligned} S &= \left\{ \mbox{ email is spam } \right\} & C &= \left\{ \mbox{ email IS ALL IN CAPS } \right\} \end{aligned}\]
We need to find \(\mathbb{P} \left( C \right)\).
\(S\) and \(S^c\) are a partition of \(\Omega\)
Also, we know \[\mathbb{P}\left( S \right) = 0.01 \, , \quad \mathbb{P}\left( C \ \vert\ S \right) = 0.90 \, , \quad \mathbb{P}\left( C \ \vert\ S^c \right) = 0.02\]
Then \[\begin{aligned} \mathbb{P}\left( C \right) & = \mathbb{P}\left( C \ \vert\ S \right) \, \mathbb{P}\left( S \right) + \mathbb{P}\left( C \ \vert\ S^c \right) \, \mathbb{P}\left( S^c \right) \\ & \\ & = 0.90 \times 0.01 + 0.02 \times 0.99 \\ & \\ & = 0.0288 \end{aligned}\]
Suppose \(B_{1}\), \(B_{2}\), …, \(B_{n}\) is a partition of the sample space, and that \(\mathbb{P}(A) > 0\)
Then, for each \(i=1,\ldots,n\) we have (Bayes' Rule): \[\mathbb{P}\left( B_{i}\ \vert\ A\right) \, = \, \frac{\mathbb{P}\left( A\ \vert\ B_{i}\right) \, \mathbb{P}\left( B_{i}\right) }{\sum_{j=1}^{n} \mathbb{P}\left( A\ \vert\ B_{j}\right) \, \mathbb{P}\left( B_{j}\right) }\]
Suppose that 1% of all email traffic is spam
Assume that 90% of all spam email IS WRITTEN IN ALL CAPS
Suppose that only 2% of non-spam email is written in all caps
If you receive an email what is the probability that it is spam? \[\mathbb{P}\left( \mbox{ email is spam } \right) = 0.01\]
If the email IS ALL IN CAPS, what is the probability that it is spam? \[\mathbb{P}\left( \mbox{ email is spam } \ \vert\ \mbox{ IS ALL IN CAPS } \right) = \, ?\]
Let \[S = \left\{ \mbox{ email is spam } \right\} \quad \quad C = \left\{ \mbox{ email IS ALL IN CAPS } \right\}\]
We need to find \(\mathbb{P}\left( S \ \vert\ C \right)\).
We know \[\mathbb{P}\left(S \right) = 0.01 \, , \quad \mathbb{P}\left( C \ \vert\ S \right) = 0.90 \, , \text{ and } \quad \mathbb{P}\left( C \ \vert\ S^c \right) = 0.02\]
Then \[\begin{aligned} \mathbb{P}\left( S \ \vert\ C \right) &= \frac{\mathbb{P}\left( C \ \vert\ S \right) \, \mathbb{P}\left( S \right)}{\mathbb{P}\left( C \right) } = \frac{\mathbb{P}\left( C \ \vert\ S \right) \, \mathbb{P}\left( S \right)} {\mathbb{P}\left( C \ \vert\ S \right) \, \mathbb{P}\left( S \right) + \mathbb{P}\left( C \ \vert\ S^c \right) \, \mathbb{P}\left( S^c \right)}\\ & \\ &=0.3125. \end{aligned}\]
Observing that the email consists of ALL CAPS increases the likelihood of it being spam
from \[0.01 \quad \longrightarrow \quad 0.3125\]
That is a 31-fold increase! \[0.3125 = {\color{red} 31.25} \times 0.01\]
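The two spam computations fit in a few lines. The sketch below (illustrative, using the probabilities assumed in the example) computes \(\mathbb{P}(C)\) by total probability and \(\mathbb{P}(S \ \vert\ C)\) by Bayes' Rule:

```python
# Direct computation of P(C) and P(S | C) for the spam example.
p_S = 0.01           # P(spam)
p_C_given_S = 0.90   # P(all caps | spam)
p_C_given_Sc = 0.02  # P(all caps | not spam)

# total probability, then Bayes' rule
p_C = p_C_given_S * p_S + p_C_given_Sc * (1 - p_S)
p_S_given_C = p_C_given_S * p_S / p_C
print(p_C)          # 0.0288
print(p_S_given_C)  # 0.3125
```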
People are screened for infections (COVID-19!)
Patients are screened for cancer
Items are screened for defects
The result of the test can be
Positive: the test indicates that the condition is present
Negative: the test indicates that the condition is absent.
There are two events of interest \[D \, = \, \left\{ \text{\textbf{condition is present}}\right\}\] and \[T_{+} \, = \, \left\{ \text{\textbf{test result is positive}}\right\}\] We also write \(T_{-} = T_{+}^{c}\) for the event that the test result is negative.
These are different!! (why?)
Goal: patients with a negative test result should have a much smaller chance of having the condition: \[\mathbb{P}\left( D \vert T_{-}\right) \, \ll \, \mathbb{P}\left( D \right)\]
and: patients with a positive test should have a higher chance of having the condition: \[\mathbb{P}\left( D \vert T_{+}\right) \, \gg \, \mathbb{P}\left( D \right)\]
False positive: positive test, no condition \[T_{+} \, \cap \, D^{c}\]
False negative: negative test, condition present \[T_{-} \, \cap \, D\]
Overall screening error: \[\left( T_{+}\cap D^{c}\right) \, \cup \, \left( T_{-}\cap D\right) \quad \mbox{ disjoint}\]
Sensitivity of the test: \[\mathbb{P}\left( T_{+} \ \vert\ D \right) \, = \, 0.95 \quad \mbox{ say}\]
Specificity of the test: \[\mathbb{P}\left( T_{-} \ \vert\ D^{c} \right) \, = \, 0.99 \quad \mbox{ say}\]
Incidence of the condition: \[\mathbb{P}\left( D \right) \, = \, 0.02 \quad \mbox{ say}\]
In practice, sensitivity, specificity and incidence need to be estimated with data
Test performance is related to:
Probability that the condition is present given that the test resulted positive.
Probability that the condition is present given that the test resulted negative.
Probability of screening error.
\[\begin{aligned} \mathbb{P}\left( D\ \vert\ T_{+}\right) &= \frac{\mathbb{P}\left( T_{+}\ \vert\ D\right) \, \mathbb{P}\left( D\right) }{\mathbb{P}\left( T_{+}\right) } \\ & \\ &=\frac{0.02\times 0.95}{0.0288} \\ & \\ &= 0.65972 \\ & \\ &\gg \mathbb{P}(D) = 0.02 \end{aligned}\]
Here we used the Law of Total Probability: \[\mathbb{P}\left( T_{+}\right) = \mathbb{P}\left( T_{+}\ \vert\ D\right) \, \mathbb{P}\left( D\right) + \mathbb{P}\left(T_{+}\ \vert\ D^{c}\right) \, \mathbb{P}\left( D^{c}\right) = 0.95 \times 0.02 + 0.01 \times 0.98 = 0.0288\]
\[\begin{aligned} \text{Error} &=\left( D\cap T_{-}\right) \cup \left( D^{c}\cap T_{+}\right) & \text{(disjoint)} \end{aligned}\]
\[\begin{aligned} \mathbb{P}\left( \text{Error}\right) &=\mathbb{P}\left( D\cap T_{-}\right) + \mathbb{P}\left( D^{c}\cap T_{+}\right) \\ & \\ &= \mathbb{P}\left( T_{-}\ \vert\ D\right) \, \mathbb{P}\left( D\right) + \mathbb{P}\left( T_{+}\ \vert\ D^{c}\right) \, \mathbb{P}\left( D^{c}\right) \\ & \\ &=0.02\times \left( 1-0.95\right) +\left( 1-0.02\right) \times 0.01 \\ & \\ &=0.0108. \end{aligned}\]
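The three quantities of interest depend only on sensitivity, specificity and incidence, so they can be wrapped in a small helper. This is an illustrative sketch (the function name `screening` is an assumption, not from the slides):

```python
# Screening computations: marginal positive rate P(T+),
# positive predictive value P(D | T+), and overall error.
def screening(sens, spec, incidence):
    p_pos = sens * incidence + (1 - spec) * (1 - incidence)        # P(T+)
    ppv = sens * incidence / p_pos                                 # P(D | T+)
    error = (1 - sens) * incidence + (1 - spec) * (1 - incidence)  # P(error)
    return p_pos, ppv, error

print(screening(0.95, 0.99, 0.02))  # (0.0288, 0.6597..., 0.0108)
```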
Prisoners A, B and C are to be executed
One is randomly chosen to be pardoned
The warden knows who is pardoned, but is not allowed to reveal it
Prisoner A asks the warden which of the other two prisoners is not pardoned.
The warden says: B is not pardoned
Conditional on this warden’s answer, C is twice as likely as A to be pardoned
Why?
Let \(A\), \(B\), and \(C\) denote the events that the corresponding prisoner is pardoned, and let \[W \, = \, \left\{ \text{the warden says that B is not pardoned}\right\}\] A priori: \[\mathbb{P}\left( A\right) = \mathbb{P}\left( B\right) = \mathbb{P}\left( C\right) = \frac{1}{3}\]
We also assume (reasonably) that \[\begin{aligned} \mathbb{P}\left( W\ \vert\ B\right) &=0 & \quad \text{ (wardens do not lie)} \\ \mathbb{P}\left( W\ \vert\ A\right) &=1/2 & \quad \text{ (warden would flip a coin)} \\ \mathbb{P}\left( W\ \vert\ C\right) &=1 & \quad \text{ (warden cannot name A)} \end{aligned}\]
We can calculate \[\mathbb{P}\left( A\ \vert\ W\right) = \frac{ \mathbb{P}\left( W\ \vert\ A\right) \, \mathbb{P}\left( A\right) }{\mathbb{P}\left( W \right) }\]
Note that \[\begin{aligned} \mathbb{P}\left( W \right) & = \, \mathbb{P}\left( W\ \vert\ A\right) \, \mathbb{P}\left( A\right) + \mathbb{P}\left( W\ \vert\ B\right) \, \mathbb{P}\left( B\right) + \mathbb{P}\left( W\ \vert\ C\right) \, \mathbb{P}\left( C\right) \\ & \\ & = \, (1/2) \times (1/3) + 0 \times (1/3) + 1 \times (1/3)\\ & \\ &= 1/2 \end{aligned} \] thus \[ \begin{aligned} \mathbb{P}\left( A\ \vert\ W\right) &= \frac{ (1/2) \times (1/3) }{(1/2)}\\ & \\ &= 1/3 = \mathbb{P}\left( A \right) \end{aligned} \]
\(A\) didn’t learn anything new about his own fate.
What is \(\mathbb{P}(C\ \vert\ W)\) ?
We have \[\mathbb{P}\left( A\ \vert\ W\right) + \mathbb{P}\left( B\ \vert\ W\right) + \mathbb{P}\left( C\ \vert\ W\right) = 1\] and \[\mathbb{P}\left( B\ \vert\ W\right) =0\] Hence: \[\mathbb{P}\left( C\ \vert\ W\right) = 1-\mathbb{P}\left( A\ \vert\ W\right) = 1-\frac{1}{3} = \frac{2}{3} \, = \, 2 \ \mathbb{P}(C)\]
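A simulation makes the warden's answer feel less paradoxical. Below is a Monte Carlo sketch (illustrative code) implementing the assumed warden behaviour, including the coin flip when A is pardoned:

```python
# Three-prisoners puzzle: estimate P(A | W) and P(C | W),
# where W = {warden says "B is not pardoned"}.
import random

random.seed(1)
N = 300_000
says_B = 0   # number of runs where the warden names B
A_and_W = 0  # ... and A is pardoned
C_and_W = 0  # ... and C is pardoned

for _ in range(N):
    pardoned = random.choice("ABC")
    if pardoned == "A":
        named = random.choice("BC")  # warden flips a coin
    elif pardoned == "B":
        named = "C"                  # warden cannot name B (or A)
    else:
        named = "B"                  # C pardoned: warden must name B
    if named == "B":
        says_B += 1
        A_and_W += pardoned == "A"
        C_and_W += pardoned == "C"

print(A_and_W / says_B)  # approx P(A | W) = 1/3
print(C_and_W / says_B)  # approx P(C | W) = 2/3
```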
Systems may have several components.
Individual components may work or fail
Suppose the system has two components: “\(c_1\)” and “\(c_2\)”
Either or both of them may be working or have failed
All possible system statuses: \[\begin{aligned} D_{1} &=\left\{ \text{Only component }c_{1}\text{ has failed}\right\} , \\ D_{2} &=\left\{ \text{Only component }c_{2}\text{ has failed}\right\} , \\ D_{12} &=\left\{ \text{Both components have failed }\right\} , \\ D_{0} &=\left\{ \text{Both components are working }\right\} . \end{aligned}\]
Note that these are disjoint and \[\Omega \ = \ D_{0}\cup D_{1}\cup D_{2}\cup D_{12}\]
Assume that \[\begin{aligned} \mathbb{P}\left( D_{1}\right) &=0.01, & \mathbb{P}\left( D_{2}\right) &= 0.008, \\ \mathbb{P}\left( D_{12}\right) &=0.002, & \mathbb{P}\left( D_{0}\right) &= 0.98 \end{aligned}\]
There is a test with events \[T_{+}=\left\{ \text{Test is positive}\right\} \qquad\mbox{ and } \qquad T_{-}=\left\{ \text{Test is negative}\right\}\] and \[\begin{aligned} \mathbb{P}\left( T_{+}\ \vert\ D_{1}\right) &=0.95, & \mathbb{P}\left( T_{+}\ \vert\ D_{2}\right) &= 0.96 \\ \mathbb{P}\left( T_{+}\ \vert\ D_{12}\right) &=0.99 & \mathbb{P}\left( T_{+}\ \vert\ D_{0}\right) &= 0.01\end{aligned}\]
The following probabilities are of interest:
Probability that a specific component has failed given a positive test
Probability that a specific component has failed given a negative test
Probability that both components have failed given a positive test
Probability of testing error.
Probability “\(c_1\)” has failed given a positive test: \[\begin{aligned} \mathbb{P}\left( D_{1}\ \vert\ T_{+}\right) &=\frac{\mathbb{P}\left( D_{1}\cap T_{+}\right) }{% \mathbb{P}\left( T_{+}\right) } \\ & \\ &=\frac{\mathbb{P}\left( D_{1}\right) \, \mathbb{P}\left( T_{+}\ \vert\ D_{1}\right) }{\mathbb{P}\left( T_{+}\right) } \end{aligned}\]
We need \(\mathbb{P}\left( T_{+} \right)\),
which can be computed using the Law of Total Probability: \[\begin{aligned} \mathbb{P}\left( T_{+}\right) &= \mathbb{P}\left( T_{+}\ \vert\ D_{1}\right) \, \mathbb{P}(D_1) + \mathbb{P}\left( T_{+}\ \vert\ D_{2}\right) \, \mathbb{P}(D_2) + \mathbb{P}\left( T_{+}\ \vert\ D_{12}\right) \, \mathbb{P}(D_{12}) + \mathbb{P}\left( T_{+}\ \vert\ D_{0}\right) \, \mathbb{P}(D_0) \\ & \\ &= 0.01\times 0.95 + 0.008\times 0.96 + 0.002\times 0.99 + 0.98\times 0.01 = 0.02896. \end{aligned}\]
\[\mathbb{P}\left( D_{1}\ \vert\ T_{+}\right) = \frac{0.01\times 0.95}{0.02896} = 0.328 \gg \mathbb{P}(D_1) = 0.01\]
\[\begin{aligned} \mathbb{P}\left( D_{12}\ \vert\ T_{+}\right) &=\frac{0.002\times 0.99}{0.02896}= 0.06837 \\ & \\ \mathbb{P}\left( D_{0}\ \vert\ T_{+}\right) &=\frac{0.98\times 0.01}{0.02896}= 0.33840\end{aligned}\]
In summary, writing \(D = D_{0}^{c} = \left\{ \text{at least one component has failed}\right\}\): \[\mathbb{P}\left( D\ \vert\ T_{+}\right) = 1 - \mathbb{P}\left( D_{0}\ \vert\ T_{+}\right) = 1-0.33840=0.6616\]
Note that, before testing: \[\mathbb{P}\left( D\right) =0.01+0.008+0.002=0.02\]
\[\begin{aligned} \mathbb{P}\left( D_{1}\ \vert\ T_{-}\right) &=\frac{\mathbb{P}\left( D_{1}\cap T_{-}\right) }{\mathbb{P}\left( T_{-}\right) } \\ & \\ &=\frac{\mathbb{P}\left( D_{1}\right) \, \mathbb{P}\left( T_{-}\ \vert\ D_{1}\right) }{\mathbb{P}\left( T_{-}\right) } \\ & \\ &=\frac{\mathbb{P}\left( D_{1}\right) \, \mathbb{P}\left( T_{-}\ \vert\ D_{1}\right) }{1 - \mathbb{P}\left( T_{+}\right) } \\ & \\ &=\frac{0.01\times 0.05}{0.97104} \\ & \\ &\approx 0.0005. \end{aligned}\]
Good practice problem:
Verify that
\[\begin{aligned} \mathbb{P}\left( D_{2}\ \vert\ T_{-}\right) &\approx 0.0003 \\ & \\ \mathbb{P}\left( D_{12}\ \vert\ T_{-}\right) &\approx 0.00002 \\ & \\ \mathbb{P}\left( D_{0}\ \vert\ T_{-}\right) &\approx 0.9991 \end{aligned}\]
And hence \[\mathbb{P}\left( D\ \vert\ T_{-}\right) =1-0.9991=0.0009\]
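The same Bayes computation works for any partition of hypotheses. Here is a short sketch (illustrative code; the dictionary names are assumptions) that reproduces the posterior probabilities for both a positive and a negative test:

```python
# Exact posterior over the partition {D0, D1, D2, D12}
# given the test result, using the probabilities assumed above.
priors = {"D1": 0.01, "D2": 0.008, "D12": 0.002, "D0": 0.98}
p_pos = {"D1": 0.95, "D2": 0.96, "D12": 0.99, "D0": 0.01}  # P(T+ | Di)

def posterior(positive):
    # likelihood of the observed result under each hypothesis
    like = {k: (p if positive else 1 - p) for k, p in p_pos.items()}
    evidence = sum(like[k] * priors[k] for k in priors)  # P(T+) or P(T-)
    return {k: like[k] * priors[k] / evidence for k in priors}

print(posterior(True))   # D1: 0.328, D12: 0.068, D0: 0.338, ...
print(posterior(False))  # D0: 0.9991; the other entries are tiny
```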
Events \(A\) and \(B\) are independent if \[\mathbb{P}\left( A\cap B\right) \, = \, \mathbb{P}\left( A \right) \, \mathbb{P}\left( B\right)\]
If \(\mathbb{P}\left( B\right) >0\) this implies that \[\mathbb{P}\left( A\ \vert\ B\right) \, = \, \frac{\mathbb{P}\left( A\cap B\right) }{\mathbb{P}\left( B\right) } \, = \, \frac{\mathbb{P}\left( A\right) \, \mathbb{P}\left( B\right) }{\mathbb{P}\left( B\right) } \, = \, \mathbb{P}\left( A\right)\]
Knowledge about \(B\) occurring does not change the probability of \(A\).
Similarly: verify that “knowledge about \(A\) occurring does not change the probability of \(B\)”
Knowledge of the occurrence of either of these events does not affect the probability of the other.
Thus the name: “independent events”
Claim: if \(A\) and \(B\) are independent, then \(A^{c}\) and \(B\) are also independent. Proof:
\[\begin{aligned} \mathbb{P}\left( A^{c}\cap B\right) &=\mathbb{P}\left( B\right) - \mathbb{P}\left( A\cap B\right) \\ & \\ &= \mathbb{P}\left( B\right) - \mathbb{P}\left( A\right) \, \mathbb{P}\left( B\right) & \quad \mbox{ (independence)} \\ & \\ &= \mathbb{P}\left( B\right) \left[ 1- \mathbb{P}\left( A\right) \right] \\ & \\ &= \mathbb{P}\left( B\right) \, \mathbb{P}\left( A^{c}\right) & \quad \mbox{ (complement) } \end{aligned}\]
Prove that if \(A\) and \(B\) are independent then so are \(A\) and \(B^{c}\)
Prove that if \(A\) and \(B\) are independent then so are \(A^{c}\) and \(B^{c}\)
An event \(A\) is nontrivial if \(0<\mathbb{P}\left( A\right) <1\)
Let \(A\) and \(B\) be nontrivial events. Then:
If \(A\cap B=\varnothing\) then \(A\) and \(B\) are not independent
If \(A\subset B\) then \(A\) and \(B\) are not independent.
Proof of (1): \[ \mathbb{P}\left( A\ \vert\ B\right) =\frac{\mathbb{P}\left( A\cap B\right) }{\mathbb{P}\left( B\right) } = \frac{0}{\mathbb{P}\left( B\right)}= 0 \neq \mathbb{P}\left( A\right) \]
Proof of (2): \[ \mathbb{P}\left( A \ \vert\ B\right) =\frac{\mathbb{P}\left( A\cap B\right) }{\mathbb{P}\left( B\right) } =\frac{\mathbb{P}\left( A\right)}{\mathbb{P}\left( B\right) } \neq \mathbb{P}\left( A \right) \] Here \(A\cap B=A\) because \(A\subset B\), and the last step uses \(\mathbb{P}\left( B\right) <1\) and \(\mathbb{P}\left( A\right) >0\).
Let \(\Omega =\left\{ 1, 2, 3, 4, 5, 6, 7, 8 \right\}\)
All numbers equally likely.
Let \(A=\{1, 2, 3, 4\}\) and \(B=\left\{ 4, 8\right\}\)
Then:
\(\mathbb{P}\left( A\cap B\right) = \mathbb{P}\left( \left\{ 4 \right\} \right) = 1 / 8\)
\(\mathbb{P}\left( A\right) \, \mathbb{P}\left( B\right) = 4/ 8 \times 2/8 = 1/8\)
Hence, \(A\) and \(B\) are independent
\(A\) is half of \(\Omega\) and \(A\cap B\) is half of \(B\)
Assume that \[ \mathbb{P}\left( \left\{ i \right\} \right) = c \times i \qquad 1 \le i \le 8\] for some \(c \in \mathbb{R}\)
Prove that necessarily \(c = 1 / 36\)
So \[\mathbb{P}\left( A\cap B\right) = \mathbb{P}\left( \left\{ 4\right\} \right) =4/36 \qquad \text{ and } \qquad \mathbb{P}\left( A\right) \, \mathbb{P}\left( B\right) = 10 / 36 \times 12/36\]
And now \(A\) and \(B\) are not independent
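Both versions of the example can be checked mechanically. The sketch below (illustrative code) compares \(\mathbb{P}\left( A\cap B\right)\) with \(\mathbb{P}\left( A\right) \, \mathbb{P}\left( B\right)\) under the uniform weights and under \(\mathbb{P}\left( \left\{ i \right\} \right) = i/36\):

```python
# Independence check for A = {1,2,3,4}, B = {4,8} on Omega = {1,...,8}.
def check(prob):  # prob: dict mapping each outcome to its probability
    A, B = {1, 2, 3, 4}, {4, 8}
    pA = sum(prob[i] for i in A)
    pB = sum(prob[i] for i in B)
    pAB = sum(prob[i] for i in A & B)
    return pAB, pA * pB  # equal iff A and B are independent

print(check({i: 1 / 8 for i in range(1, 9)}))   # (0.125, 0.125): independent
print(check({i: i / 36 for i in range(1, 9)}))  # (0.111..., 0.0925...): not
```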
Definition: We say that the events \(A_{1},A_{2},...,A_{n}\) are independent if \[\mathbb{P}\left( A_{i_{1}}\cap A_{i_{2}}\cap \cdots \cap A_{i_{k}}\right) = \mathbb{P}\left( A_{i_{1}}\right) \, \mathbb{P}\left( A_{i_{2}}\right) \cdots \, \mathbb{P}\left( A_{i_{k}}\right)\] for all \(1\leq i_{1}<i_{2}<\cdots <i_{k}\leq n,\) and all \(1\leq k\leq n.\)
For example, if \(n=3,\) then, \(A_1\), \(A_2\), and \(A_3\) are independent if and only if all of the following hold:
\[\begin{aligned} \mathbb{P}\left( A_{1}\cap A_{2}\right) &= \mathbb{P}\left( A_{1}\text{ }\right) \, \mathbb{P}\left( A_{2}\right) \\ & \\ \mathbb{P}\left( A_{1}\cap A_{3}\right) &= \mathbb{P}\left( A_{1}\right) \, \mathbb{P}\left( A_{3}\right)\\ & \\ \mathbb{P}\left( A_{2}\cap A_{3}\right) &= \mathbb{P}\left( A_{2}\right) \, \mathbb{P}\left( A_{3}\right)\\ & \\ \mathbb{P}\left( A_{1}\cap A_{2}\cap A_{3}\right) &= \mathbb{P}\left( A_{1}\right) \, \mathbb{P}\left( A_{2}\right) \, \mathbb{P}\left( A_{3}\right) \end{aligned}\]
A system is made of many components integrated to perform some task.
Example: serial vs. parallel bridges
A given system can either be working or have failed
The reliability of a system is the probability that the system is working.
To fix ideas, let’s assume that a system has three components: a, b, and c.
Each component could be working or have failed.
Define the events \[\begin{aligned} A &=\left\{ \text{Component a is working}\right\} \\ B &=\left\{ \text{Component b is working}\right\} \\ C &=\left\{ \text{Component c is working}\right\} \end{aligned}\]
Components of a system may be arranged in series (sequentially)
Such a system fails when any one of the components fails.
Components of a system may be arranged in parallel
Such a system works when at least one of the components works.
Suppose events \(A,\) \(B\) and \(C\) are independent:
\[\begin{aligned} \mathbb{P}\left( A\cap B\cap C\right) &= \mathbb{P}\left( A\right) \, \mathbb{P}\left( B\right) \, \mathbb{P}\left( C\right) \\ & \\ \mathbb{P}\left( A\cap B\right) &= \mathbb{P}\left( A\right) \, \mathbb{P}\left( B\right) \\ & \\ \mathbb{P}\left( B\cap C\right) &= \mathbb{P}\left( B\right) \, \mathbb{P}\left( C\right) \\ & \\ \mathbb{P}\left( A\cap C\right) &= \mathbb{P}\left( A\right) \, \mathbb{P}\left( C\right) \end{aligned}\]
Suppose that \[ \mathbb{P}\left( A\right) = \mathbb{P}\left( B\right) = \mathbb{P}\left( C\right) =0.95.\] PROBLEM 1: Calculate the reliability of a serial system. That is: \[\mathbb{P} \left( \mbox{ system is working } \right)\]
In serial systems, all the components need to be working for the system to work: \[W = \left\{ \text{System is working} \right\} = A\cap B\cap C\]
Thus \[\begin{aligned} \mathbb{P}\left( W\right) &= \mathbb{P}\left( A\cap B\cap C\right) \\ & \\ &= \mathbb{P}\left( A\right) \, \mathbb{P}\left( B\right) \, \mathbb{P}\left( C\right) & \quad \text{ (independence)}\\ & \\ &=0.95^{3}\\ & \\ &=0.857 \end{aligned}\]
PROBLEM 2: Calculate the reliability of a parallel system.
Systems with parallel components work if at least one of their components works
Thus \[\begin{aligned} W &=\left\{ \text{System\ is working}\right\} \\ & \\ &=A\cup B\cup C \\ & \\ &=\left( A^{c}\cap B^{c}\cap C^{c}\right)^{c} &\mbox{ (De Morgan)} \end{aligned}\]
Then
\[\begin{aligned} \mathbb{P}\left( W\right) &= \mathbb{P}\left[ \left( A^{c}\cap B^{c}\cap C^{c}\right) ^{c}\right] \\ & \\ &=1-\mathbb{P}\left( A^{c}\cap B^{c}\cap C^{c}\right) \\ & \\ &=1-\mathbb{P}\left( A^{c}\right) \, \mathbb{P}\left( B^{c}\right) \, \mathbb{P}\left( C^{c}\right) & \quad \text{ (independence)} \\ & \\ &=1-\left( 1- \mathbb{P}\left( A\right) \right) \left( 1- \mathbb{P}\left( B\right) \right) \left( 1- \mathbb{P}\left( C\right) \right) \\ & \\ &=1-0.05^{3}\\ & \\ &\approx 0.9999. \end{aligned}\]
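The two formulas generalize to any number of independent components. A minimal sketch (illustrative helper names, assuming independent components):

```python
# Reliability of series and parallel arrangements
# of independent components.
from math import prod

def series(ps):
    # all components must work
    return prod(ps)

def parallel(ps):
    # at least one component works (complement of "all fail")
    return 1 - prod(1 - p for p in ps)

ps = [0.95, 0.95, 0.95]
print(series(ps))    # 0.857375
print(parallel(ps))  # 0.999875
```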
Combined system: suppose now there are four components a, b, c and d, each working independently with probability 0.95, and let \(D=\left\{ \text{Component d is working}\right\}\)
In this case \[W=\left( A\cup B\right) \cap \left( C\cup D\right)\]
One may verify that \(\left( A\cup B\right)\) and \(\left( C\cup D\right)\) are independent. Hint for the proof: show that \(A \cup B\) is independent of each of \(C\), \(D\), and \(C \cap D\), then expand \(\mathbb{P}\left( \left( A\cup B\right) \cap \left( C\cup D\right) \right)\) using inclusion–exclusion.
Therefore \[\begin{aligned} \mathbb{P}\left( W\right) &= \mathbb{P}\left( A\cup B\right) \, \mathbb{P}\left( C\cup D\right) &\quad \text{ independence} \\ & \\ &=\left( 2\times 0.95-0.95^{2}\right) \times \left( 2\times 0.95-0.95^{2}\right) \\ & \\ &=0.9950 \end{aligned}\]
Alternatively:
\[\begin{aligned} \mathbb{P}\left( W\right) &= \mathbb{P}\left( A\cup B\right) \, \mathbb{P}\left( C\cup D\right) & \quad \text{ independence} \\ & \\ &=\left[ 1-\mathbb{P}\left( A^{c}\cap B^{c}\right) \right] \, \left[ 1-\mathbb{P}\left( C^{c}\cap D^{c}\right) \right] &\quad \text{ De Morgan \& compl.} \\ & \\ &=\left[ 1-\mathbb{P}\left( A^{c}\right) \, \mathbb{P}\left( B^{c}\right) \right] \left[ 1- \mathbb{P}\left( C^{c}\right) \, \mathbb{P}\left( D^{c}\right) \right] & \quad \text{ independence} \\ & \\ &=\left( 1-0.05^{2}\right) \left( 1-0.05^{2}\right) \\ & \\ &= 0.9950 \end{aligned}\]
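The answer can also be verified by brute force, summing the probabilities of all \(2^{4}\) component states in which the system works. An illustrative sketch:

```python
# Brute-force check of the combined system W = (A or B) and (C or D):
# enumerate all 2^4 working/failed states of the four components.
from itertools import product

p = 0.95  # each component works independently with this probability
reliability = 0.0
for a, b, c, d in product([True, False], repeat=4):
    if (a or b) and (c or d):  # system works in this state
        weight = 1.0
        for works in (a, b, c, d):
            weight *= p if works else 1 - p
        reliability += weight

print(reliability)  # 0.99500625, i.e. approx 0.9950
```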
Stat 302 - Winter 2025/26