Normal distribution, CDFs, transformations, and joint distributions
Last modified — 24 Feb 2026
\[f_Z(z; \alpha, \lambda) = \frac{\lambda^\alpha}{\Gamma(\alpha)} z^{\alpha-1}e^{-\lambda z} I_{[0,\infty)}(z)\]
We know that \[1 = \int_0^\infty \frac{\lambda^\alpha}{\Gamma(\alpha)} z^{\alpha-1}e^{-\lambda z} \mathsf{d}z \Longrightarrow \frac{\Gamma(\alpha)}{\lambda^\alpha} = \int_0^\infty z^{\alpha-1}e^{-\lambda z} \mathsf{d}z.\]
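This identity can be checked numerically. The sketch below (with arbitrary illustrative values of \(\alpha\) and \(\lambda\), not taken from the notes) integrates the right-hand side with `scipy` and compares it to \(\Gamma(\alpha)/\lambda^\alpha\):

```python
from math import gamma, exp
from scipy.integrate import quad

alpha, lam = 2.5, 1.3  # arbitrary illustrative parameters

# Right-hand side: integral of z^(alpha-1) * exp(-lam * z) over [0, infinity)
integral, _ = quad(lambda z: z**(alpha - 1) * exp(-lam * z), 0, float("inf"))

# Left-hand side: Gamma(alpha) / lam^alpha
lhs = gamma(alpha) / lam**alpha

print(abs(lhs - integral) < 1e-8)
```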
Hint: Recall that \(\Gamma(n) = (n-1)!\) when \(n \in \{1,2,\dots\}\).

Let \(X\) be a random variable with CDF \(F_X\). Then:
Discrete RV
\[F_X(x) = \sum_{t \leq x} p_X(t)\]
Continuous RV
\[F_X(x) = \int_{-\infty}^x f_X(t) \mathsf{d}t\]

Let \(X \sim {\mathrm{Exp}}(\lambda)\), with pdf \[f_X(x) = \lambda e^{-\lambda x} I_{[0,\infty)}(x).\]
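Applying the continuous-RV definition of the CDF to this density gives the familiar closed form \(F_X(x) = 1 - e^{-\lambda x}\) for \(x \ge 0\). A quick numerical sanity check (the values of \(\lambda\) and \(x\) below are arbitrary choices):

```python
from math import exp
from scipy.integrate import quad

lam, x = 2.0, 1.5  # arbitrary illustrative values

# F_X(x) by integrating the pdf from 0 to x
num, _ = quad(lambda t: lam * exp(-lam * t), 0, x)

# F_X(x) from the closed form 1 - e^{-lam * x}
closed = 1 - exp(-lam * x)

print(abs(num - closed) < 1e-10)
```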
Let \(X_1, X_2,\dots\) be random variables with CDFs \(F_{X_1}, F_{X_2}, \dots\). Then the following hold:
Consider the scores on Midterm 1 in a class. Suppose that there are three types of students, modelled as follows:
Hint: If \(Y \sim \mathcal{N}(\mu, \sigma)\), then \(F_Y(y) = \Phi\left( \frac{y - \mu}{\sigma} \right)\).
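The standardization in the hint can be verified directly with `scipy` (the score parameters \(\mu\), \(\sigma\), and \(y\) below are hypothetical, not from the notes):

```python
from scipy.stats import norm

mu, sigma, y = 70.0, 10.0, 85.0  # hypothetical score parameters

direct = norm.cdf(y, loc=mu, scale=sigma)   # F_Y(y) evaluated directly
standardized = norm.cdf((y - mu) / sigma)   # Phi((y - mu) / sigma)

print(abs(direct - standardized) < 1e-12)
```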
Let \(X\) be a random variable with some distribution, and let \(Y = g(X)\) for some function \(g : {\mathbb{R}}\to {\mathbb{R}}\).
We want to find the distribution of \(Y\).
\[\mathbb{P}(Y \in A) = \mathbb{P}(g(X) \in A) = \mathbb{P}(X \in \{x : g(x) \in A\}).\]
If we can characterize these sets, we can find the distribution of \(Y\). This method always works, and is easy for discrete random variables.
Let \(X \sim {\mathrm{Binom}}(n, \theta)\) and define \(Y = n - X\). Find the PMF of \(Y\).
We have that \[\begin{aligned} p_Y(y) &= \sum_{x : g(x) = y} p_X(x) = \sum_{x : n - x = y} p_X(x) \\ &= p_X(n - y) & \text{only one $x$ satisfies this}\\ &= \binom{n}{n - y} \theta^{n - y} (1 - \theta)^{y}I_{\{0,\dots,n\}}(n-y) & \text{definition of Binomial}\\ &= \binom{n}{y} (1 - \theta)^{y} \theta^{n - y}I_{\{0,\dots,n\}}(y) & \text{symmetry of binomial coeff.} \end{aligned}\]
Therefore, \(Y \sim {\mathrm{Binom}}(n, 1 - \theta).\)
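The conclusion can be checked exactly by comparing the two PMFs term by term. The sketch below uses hypothetical values of \(n\) and \(\theta\); the left list computes \(p_Y(y) = p_X(n-y)\) as in the derivation, and the right list computes the \({\mathrm{Binom}}(n, 1-\theta)\) PMF directly:

```python
from math import comb

n, theta = 10, 0.3  # hypothetical parameters

# p_Y(y) via the distribution method: p_Y(y) = p_X(n - y)
p_via_X = [comb(n, n - y) * theta**(n - y) * (1 - theta)**y for y in range(n + 1)]

# p_Y(y) as the Binom(n, 1 - theta) PMF directly
p_binom = [comb(n, y) * (1 - theta)**y * theta**(n - y) for y in range(n + 1)]

print(all(abs(a - b) < 1e-12 for a, b in zip(p_via_X, p_binom)))
```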
For continuous random variables, you can also use the distribution method, and sometimes this is the easiest way.
Let \(X \sim {\mathrm{Unif}}(0, 1)\) and define \(Y = -\log(X)\). Find the PDF of \(Y\).
Note that \(h(x) = -\log(x)\) is monotonic on \((0,1)\), so we can use the Jacobian method. The support of \(X\) is \((0,1)\), so \(Y\) takes values in \((0, \infty)\).
We have that \(h^{-1}(z) = e^{-z}\) and \(\frac{\mathsf{d}}{\mathsf{d}z} h^{-1}(z) = -e^{-z}\).
\[\begin{aligned} f_Y(y) &= f_X(h^{-1}(y)) \left| \frac{\mathsf{d}}{\mathsf{d}y} (h^{-1}(y)) \right|\\ &= f_X(e^{-y}) \left| -e^{-y} \right| \\ &= 1 \times e^{-y} I_{(0, \infty)}(y). \end{aligned}\]
Therefore, \(Y \sim {\mathrm{Exp}}(1)\).
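A quick simulation supports the result: if \(X \sim {\mathrm{Unif}}(0,1)\), then \(-\log(X)\) should behave like an \({\mathrm{Exp}}(1)\) variable, which has mean \(1\) and \(\mathbb{P}(Y \le 1) = 1 - e^{-1}\). (The sample size and seed below are arbitrary choices.)

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.uniform(0.0, 1.0, size=200_000)  # draws of X ~ Unif(0, 1)
y = -np.log(u)                           # transformed draws Y = -log(X)

# Compare sample statistics to the Exp(1) values: mean 1 and P(Y <= 1) = 1 - e^{-1}
print(abs(y.mean() - 1.0) < 0.02)
print(abs((y <= 1.0).mean() - (1 - np.exp(-1))) < 0.02)
```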
Let \(X \sim {\mathrm{Unif}}\left( -1,1\right)\). Use the distribution method to find the PDF of \(Z = X^2\).
Let \(X \sim {\mathrm{Gam}}(\alpha, \lambda)\). Use the Jacobian method to find the PDF of \(Y = 1/X\).
Recall: \[\left\{ X \le a \, , \ Y \le b \right\} = \left\{ X \le a \right\} \cap \left\{ Y \le b \right\}.\]
Let \(X\) and \(Y\) be two random variables with joint CDF \(F_{X, Y}(x, y)\).
Roll two fair six-sided dice; let \(X\) be the smaller of the two values and \(Y\) the larger. Find the joint PMF of \(X\) and \(Y\).
| \(f_{X,Y}(x, y)\) | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| 1 | 1/36 | 2/36 | 2/36 | 2/36 | 2/36 | 2/36 |
| 2 | 0 | 1/36 | 2/36 | 2/36 | 2/36 | 2/36 |
| 3 | 0 | 0 | 1/36 | 2/36 | 2/36 | 2/36 |
| 4 | 0 | 0 | 0 | 1/36 | 2/36 | 2/36 |
| 5 | 0 | 0 | 0 | 0 | 1/36 | 2/36 |
| 6 | 0 | 0 | 0 | 0 | 0 | 1/36 |
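Assuming the table records the joint PMF of \(X = \min\) and \(Y = \max\) of two independent fair dice (which matches its entries: \(1/36\) on the diagonal, \(2/36\) above it), the table can be rebuilt by enumerating all 36 equally likely ordered outcomes:

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 ordered outcomes (a, b) of two fair dice and tabulate
# the joint PMF of X = min(a, b), Y = max(a, b) with exact fractions.
pmf = {}
for a, b in product(range(1, 7), repeat=2):
    key = (min(a, b), max(a, b))
    pmf[key] = pmf.get(key, Fraction(0)) + Fraction(1, 36)

print(pmf[(1, 1)] == Fraction(1, 36))   # diagonal entries
print(pmf[(1, 2)] == Fraction(2, 36))   # off-diagonal entries (x < y)
print(sum(pmf.values()) == 1)           # PMF sums to 1
```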
Important
All of this generalizes to more than two random variables.
In the special case where \(p=2\), \(\mu = (\mu_1, \mu_2)^{\mathsf{T}}\), and \(\Sigma = \begin{bmatrix} \sigma_1^2 & \rho \sigma_1 \sigma_2 \\ \rho \sigma_1 \sigma_2 & \sigma_2^2 \end{bmatrix}\), this “simplifies” to
\[\begin{aligned} &f_X(x; \mu, \Sigma)\\ &= \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left(\left(\frac{x_1-\mu_1}{\sigma_1}\right)^2 - 2\rho \frac{(x_1-\mu_1)(x_2-\mu_2)}{\sigma_1\sigma_2} + \left(\frac{x_2-\mu_2}{\sigma_2}\right)^2 \right) \right\}. \end{aligned}\]
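The bivariate formula can be cross-checked against a general multivariate normal density. The sketch below (with hypothetical parameter values) evaluates the expression above and compares it to `scipy.stats.multivariate_normal`:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical parameters for the p = 2 case
mu1, mu2, s1, s2, rho = 1.0, -0.5, 2.0, 1.5, 0.6
mean = np.array([mu1, mu2])
cov = np.array([[s1**2, rho * s1 * s2],
                [rho * s1 * s2, s2**2]])

x1, x2 = 0.3, 0.4  # arbitrary evaluation point

# The quadratic form inside the exponent of the bivariate formula
z = ((x1 - mu1) / s1)**2 \
    - 2 * rho * (x1 - mu1) * (x2 - mu2) / (s1 * s2) \
    + ((x2 - mu2) / s2)**2
by_formula = (np.exp(-z / (2 * (1 - rho**2)))
              / (2 * np.pi * s1 * s2 * np.sqrt(1 - rho**2)))

# The same density from the general multivariate normal pdf
by_scipy = multivariate_normal(mean, cov).pdf([x1, x2])

print(abs(by_formula - by_scipy) < 1e-9)
```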
Let \(X\) and \(Y\) be continuous random variables with PDF
\[ f_{X, Y}(x, y) = I_{[0,1]}(x)I_{[0,1]}(y) = I_{[0,1]^2}(x, y). \]
Stat 302 - Winter 2025/26