MGFs and conditional expectation
Last modified — 05 Apr 2026
We want to find \(\mathbb{E}[X]\) and \(\operatorname{Var}(X)\).
By the laws of total expectation and total variance, \[\begin{aligned} \mathbb{E}[X] &= \mathbb{E}[\mathbb{E}[X|\Lambda]] = \mathbb{E}[\Lambda] = \alpha / \beta,\\ \operatorname{Var}(X) &= \mathbb{E}[\operatorname{Var}(X|\Lambda)] + \operatorname{Var}(\mathbb{E}[X|\Lambda]) = \mathbb{E}[\Lambda] + \operatorname{Var}(\Lambda) = \alpha / \beta + \alpha / \beta^2. \end{aligned}\]
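This derivation (which assumes \(X \mid \Lambda \sim {\mathrm{Poiss}}(\Lambda)\) with \(\Lambda \sim {\mathrm{Gamma}}(\alpha, \beta)\) in the rate parameterization, consistent with \(\mathbb{E}[\Lambda] = \alpha/\beta\)) can be sanity-checked by simulation. A stdlib-only Python sketch; the parameter values are illustrative, not from the notes:

```python
import math
import random
import statistics

def sample_poisson(lam, rng):
    """Knuth's multiplication sampler for Poisson(lam); fine for moderate lam."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p < threshold:
            return k
        k += 1

rng = random.Random(42)
alpha, beta = 3.0, 2.0  # illustrative values; beta is a *rate*, so E[Lambda] = alpha/beta

xs = []
for _ in range(200_000):
    lam = rng.gammavariate(alpha, 1.0 / beta)  # gammavariate takes a *scale*, hence 1/beta
    xs.append(sample_poisson(lam, rng))

# Should be close to alpha/beta = 1.5 and alpha/beta + alpha/beta^2 = 2.25
print(statistics.fmean(xs), statistics.variance(xs))
```

The sample mean and variance should land near \(\alpha/\beta = 1.5\) and \(\alpha/\beta + \alpha/\beta^2 = 2.25\), matching the formulas above.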
Hint: There are two ways to do this. One involves solving ugly integrals; the other requires recognizing the distributions of \(X\) and \(Y|X\).
Let \(X \sim {\mathrm{Poiss}}(\lambda)\) and \(Y | X = x \sim {\mathrm{Binom}}(x, \theta)\). Then,
\[\begin{aligned} m_Y(t) &= \mathbb{E}[ e^{t \, Y} ] = \mathbb{E}[ \, \mathbb{E}[ e^{t \, Y} | X ] ] = \mathbb{E}\left[ \left(\theta e^t + 1 - \theta \right)^X \right] =\sum_{x=0}^\infty \left(\theta e^t + 1 - \theta \right)^x \cdot \frac{\lambda^x e^{-\lambda}}{x!} \\ &= e^{-\lambda} \sum_{x=0}^\infty \frac{\left(\lambda \, \theta \, e^t + \lambda (1 - \theta)\right)^x}{x!} \quad\quad \text{kernel of ${\mathrm{Poiss}}(\lambda \, \theta \, e^t + \lambda (1 - \theta))$}\\ &= e^{-\lambda} \exp\left(\lambda \, \theta \, e^t + \lambda (1 - \theta)\right)\\ &= e^{\lambda \, \theta \left( e^t - 1 \right)} \quad\quad \text{MGF of ${\mathrm{Poiss}}(\lambda \, \theta)$}. \end{aligned}\]
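The thinning result can be checked by simulation: drawing \(X \sim {\mathrm{Poiss}}(\lambda)\) and then \(Y \mid X \sim {\mathrm{Binom}}(X, \theta)\) should give \(Y\) the mean and variance of a \({\mathrm{Poiss}}(\lambda\theta)\) variable. A stdlib-only Python sketch with illustrative values of \(\lambda\) and \(\theta\):

```python
import math
import random
import statistics

def sample_poisson(lam, rng):
    """Knuth's multiplication sampler for Poisson(lam); fine for moderate lam."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p < threshold:
            return k
        k += 1

rng = random.Random(0)
lam, theta = 4.0, 0.3  # illustrative values

ys = []
for _ in range(200_000):
    x = sample_poisson(lam, rng)
    ys.append(sum(rng.random() < theta for _ in range(x)))  # Binom(x, theta) as Bernoulli sum

# Y ~ Poiss(lam * theta), so the sample mean and variance should both be near 1.2
print(statistics.fmean(ys), statistics.variance(ys))
```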
Hint: You should recall that the PMF of a Poisson random variable with parameter \(\lambda\) is \[p_X(x) = \frac{\lambda^x e^{-\lambda}}{x!}I_{\{0,1,2,\ldots\}}(x).\]
Let \(X\) and \(Y\) be two discrete random variables with joint PMF \[p_{X,Y}(x, y) = \frac{1}{11} I_{\{-4\}}(x) \bigg(I_{\{2\}}(y) + 2I_{\{3\}}(y) + 4I_{\{7\}}(y)\bigg) + \frac{1}{11} I_{\{6\}}(x) I_{\{2, 3, 7, 13\}}(y).\]
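As a quick consistency check, this joint PMF can be tabulated to confirm it sums to one and to read off the marginals. A Python sketch using exact fractions:

```python
from fractions import Fraction

# Joint PMF from the problem, stored as {(x, y): probability}
p = {(-4, 2): Fraction(1, 11), (-4, 3): Fraction(2, 11), (-4, 7): Fraction(4, 11)}
for y in (2, 3, 7, 13):
    p[(6, y)] = Fraction(1, 11)

total = sum(p.values())

p_x, p_y = {}, {}
for (x, y), prob in p.items():
    p_x[x] = p_x.get(x, Fraction(0)) + prob
    p_y[y] = p_y.get(y, Fraction(0)) + prob

print(total)  # total mass: 1
print(p_x)    # marginal of X: 7/11 at x = -4, 4/11 at x = 6
print(p_y)    # marginal of Y: 2/11, 3/11, 5/11, 1/11 at y = 2, 3, 7, 13
```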
Note that Markov’s inequality applied to \(|Y|\) implies that, for any random variable \(Y\) and any \(a > 0\),
\[\mathbb{P}(|Y| \geq a) \leq \mathbb{E}[|Y|] / a.\]
Then, for any \(a > 0\), \[\mathbb{P}(|X - \mu| \geq a) \leq \frac{\operatorname{Var}(X)}{a^2}.\]
Let \(X\) be a non-negative random variable with mean \(\mu\) and variance \(\mu\).
We want to examine the bounds on \(\mathbb{P}((X - \mu)/\mu \geq 1)\) given by Markov’s and Chebyshev’s inequalities.
Markov’s inequality gives \[\mathbb{P}((X - \mu)/\mu \geq 1) = \mathbb{P}(X \geq 2\mu) \leq \frac{\mathbb{E}[X]}{2\mu} = \frac{\mu}{2\mu} = \frac{1}{2}.\]
Chebyshev’s inequality gives \[\mathbb{P}((X - \mu)/\mu \geq 1) = \mathbb{P}(X - \mu \geq \mu) \leq \mathbb{P}(|X - \mu| \geq \mu) \leq \frac{\operatorname{Var}(X)}{\mu^2} = \frac{\mu}{\mu^2} = \frac{1}{\mu}.\]
So for any non-negative random variable with mean \(\mu\) and variance \(\mu\), Chebyshev’s inequality gives a tighter bound than Markov’s whenever \(\mu > 2\), since \(1/\mu < 1/2\) exactly when \(\mu > 2\).
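A \({\mathrm{Poiss}}(\mu)\) variable has mean and variance both equal to \(\mu\), so it fits this comparison exactly. The sketch below computes the exact tail probability \(\mathbb{P}(X \geq 2\mu)\) and both bounds for a few illustrative values of \(\mu\):

```python
import math

def poisson_tail_at_least(k, mu):
    """Exact P(X >= k) for X ~ Poiss(mu): 1 minus the PMF summed over 0, ..., k-1."""
    below = sum(math.exp(-mu) * mu**j / math.factorial(j) for j in range(k))
    return 1.0 - below

for mu in (1, 4, 10):  # Poiss(mu) has mean mu and variance mu
    exact = poisson_tail_at_least(2 * mu, mu)
    print(f"mu={mu}: exact={exact:.4f}, Markov bound=0.5, Chebyshev bound={1 / mu:.4f}")
```

Both bounds hold in every case, and for \(\mu > 2\) the Chebyshev bound \(1/\mu\) is the smaller (though still far from the exact tail probability).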
Hint: recall that \(\mathbb{E}[ \overline{X}_n ] = \mathbb{E}[X_1]\) and \(\operatorname{Var}( \overline{X}_n ) = \operatorname{Var}(X_1)/n\).
If, in addition, \(\operatorname{Var}(X) > 0\) and \(\operatorname{Var}(Y) > 0\), then \[|\operatorname{Corr}(X, Y)| \leq 1.\]
Recall that a function \(f: {\mathbb{R}}\to {\mathbb{R}}\) is convex if for any \(x, y \in {\mathbb{R}}\) and \(\lambda \in [0, 1]\), we have \[f(\lambda x + (1 - \lambda) y) \leq \lambda f(x) + (1 - \lambda) f(y).\]
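This definition can be probed numerically: for a convex function such as \(\exp\), the gap \(\lambda f(x) + (1-\lambda) f(y) - f(\lambda x + (1-\lambda) y)\) is never negative, while a non-convex function such as \(\sin\) violates the inequality for some triples. A small Python sketch over randomly drawn triples:

```python
import math
import random

def convexity_gap(f, x, y, lam):
    """lam*f(x) + (1 - lam)*f(y) - f(lam*x + (1 - lam)*y); nonnegative iff the inequality holds."""
    return lam * f(x) + (1 - lam) * f(y) - f(lam * x + (1 - lam) * y)

rng = random.Random(1)
trials = [(rng.uniform(-5, 5), rng.uniform(-5, 5), rng.random()) for _ in range(10_000)]

# exp is convex, so the gap should never be (more than negligibly) negative.
exp_min = min(convexity_gap(math.exp, x, y, lam) for x, y, lam in trials)

# sin is not convex on all of R, so some triples violate the inequality.
sin_min = min(convexity_gap(math.sin, x, y, lam) for x, y, lam in trials)

print(exp_min, sin_min)
```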
How many times should you play this game? Justify your answer.
Stat 302 - Winter 2025/26