Moment generating functions

All Questions
grandes-ecoles 2018 Q12 Compute MGF or characteristic function for a named distribution
We set $\sigma^{\prime} = \frac{1}{2\pi\sigma}$. Show that there exists a real $\mu$ such that $\mathcal{F}\left(g_{\sigma}\right) = \mu g_{\sigma^{\prime}}$. The value of $\mu$ need not be made explicit.
grandes-ecoles 2018 Q12 Compute MGF or characteristic function for a named distribution
For $n \in \mathbb { N } ^ { * }$, $U _ { n }$ is a random variable on $(\Omega , \mathcal { A } , \mathbb { P })$ following the uniform distribution on $\llbracket 1 , n \rrbracket$. We set $Y _ { n } = \frac { 1 } { n } U _ { n }$.
Calculate the moment generating function of the random variable $Y _ { n }$.
grandes-ecoles 2018 Q13 Pointwise limit of MGFs or characteristic functions (convergence in distribution)
For $n \in \mathbb { N } ^ { * }$, $U _ { n }$ is a random variable on $(\Omega , \mathcal { A } , \mathbb { P })$ following the uniform distribution on $\llbracket 1 , n \rrbracket$. We set $Y _ { n } = \frac { 1 } { n } U _ { n }$.
For $t \in \mathbb { R }$, calculate $\lim _ { n \rightarrow + \infty } M _ { Y _ { n } } ( t )$.
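These two questions can be sanity-checked numerically: $M_{Y_n}(t) = \frac{1}{n}\sum_{k=1}^{n} e^{tk/n}$ is a geometric sum, and as a right-endpoint Riemann sum it should converge to $\frac{e^t - 1}{t}$, the MGF of the uniform distribution on $[0,1]$. A quick check, with illustrative values of $t$ and $n$:

```python
import math

def mgf_Yn(t, n):
    # M_{Y_n}(t) = E[exp(t U_n / n)] with U_n uniform on {1, ..., n}
    return sum(math.exp(t * k / n) for k in range(1, n + 1)) / n

def mgf_Yn_closed(t, n):
    # geometric-sum closed form of the same quantity
    if t == 0:
        return 1.0
    r = math.exp(t / n)
    return r * (math.exp(t) - 1) / (n * (r - 1))

def mgf_limit(t):
    # MGF of the uniform distribution on [0, 1]: the Riemann-sum limit
    return (math.exp(t) - 1) / t if t != 0 else 1.0
```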
grandes-ecoles 2019 Q6 Pointwise limit of MGFs or characteristic functions (convergence in distribution)
We set $$\forall n \in \mathbb{N}^{\star}, \quad X_n = \sum_{k=1}^{n} \frac{\varepsilon_k}{2^k}$$ where $(\varepsilon_n)_{n \geqslant 1}$ is a sequence of independent random variables taking values in $\{-1,1\}$ with $\mathbb{P}(\varepsilon_n = 1) = \mathbb{P}(\varepsilon_n = -1) = 1/2$ for all $n \geqslant 1$.
Deduce the pointwise limit of the sequence of functions $(\varphi_n)_{n \geqslant 1}$ defined by $$\forall n \in \mathbb{N}^{\star}, \quad \varphi_n : \begin{aligned} \mathbb{R} &\rightarrow \mathbb{R} \\ t &\mapsto \mathbb{E}(\cos(t X_n)) \end{aligned}$$
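By independence, $\mathbb{E}(e^{\mathrm{i}tX_n}) = \prod_{k=1}^{n} \cos(t/2^k)$, which is real, so $\varphi_n(t)$ equals this product; the expected pointwise limit, by Viète's product formula, is $\frac{\sin t}{t}$ (with value $1$ at $t = 0$). A numerical sketch of this convergence:

```python
import math

def phi_n(t, n):
    # E[cos(t X_n)] = prod_{k=1}^n cos(t / 2^k), by independence of the eps_k
    p = 1.0
    for k in range(1, n + 1):
        p *= math.cos(t / 2**k)
    return p

def phi_limit(t):
    # Viete's formula: prod_{k>=1} cos(t / 2^k) = sin(t) / t
    return math.sin(t) / t if t != 0 else 1.0
```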
grandes-ecoles 2019 Q13 Concentration inequality via MGF and Markov's inequality (Chernoff method)
We fix $p, q \in [0,1]$ and let $(X_i)_{1 \leq i \leq n}$ be a family of $n$ mutually independent Bernoulli random variables with parameter $p$, taking values in $\{0,1\}$. We set $S_n = \sum_{i=1}^n X_i$. We assume in this question that $p < q$.
a. Justify that $$\mathbb{P}\left(\left|\frac{S_n}{n} - q\right| \leq \left|\frac{S_n}{n} - p\right|\right) = \mathbb{P}\left(S_n \geq \frac{p+q}{2}n\right)$$
b. Let $X$ be a Bernoulli random variable with parameter $p$. For $u > 0$, calculate $\mathbb{E}\left(e^{uX}\right)$.
c. Show that for all $u > 0$, $$\mathbb{P}\left(S_n \geq \frac{p+q}{2}n\right) \leq e^{-n\left(\frac{p+q}{2}u - \ln\left(1-p+pe^u\right)\right)}$$ Hint. One may assume that if $(Z_i)_{1 \leq i \leq n}$ are $n$ mutually independent random variables taking a finite number of values, then $\mathbb{E}\left(\prod_{i=1}^n Z_i\right) = \prod_{i=1}^n \mathbb{E}\left(Z_i\right)$.
d. Show that $\mathbb{P}\left(S_n \geq \frac{p+q}{2}n\right) \leq e^{-n\frac{(p-q)^2}{2}}$.
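Parts (b)–(d) are the Chernoff method: bound the tail by $e^{-u\frac{p+q}{2}n}\,\mathbb{E}(e^{uS_n})$ via Markov's inequality, factor the MGF by independence, and optimize over $u$. The final bound of part (d) can be compared against the exact binomial tail; a sketch with the illustrative values $p = 0.3$, $q = 0.6$, $n = 100$:

```python
import math

def binom_tail(n, p, x):
    # exact P(S_n >= x) for S_n ~ Binomial(n, p)
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(math.ceil(x), n + 1))

def chernoff_bound(n, p, q):
    # the bound of part (d): exp(-n (p - q)^2 / 2)
    return math.exp(-n * (p - q)**2 / 2)
```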
grandes-ecoles 2019 Q14 Concentration inequality via MGF and Markov's inequality (Chernoff method)
Prove Theorem 3: We fix $p, q \in [0,1]$. Let $n \geq 1$ be an integer and let $S_n$ be the sum of $n$ mutually independent Bernoulli random variables with parameter $p$, taking values in $\{0,1\}$. Then $$\mathbb{P}\left(\left|\frac{S_n}{n} - q\right| \leq \left|\frac{S_n}{n} - p\right|\right) \leq e^{-n\frac{(p-q)^2}{2}}.$$
grandes-ecoles 2020 Q1 Compute MGF or characteristic function for a named distribution
We assume in this question that $X ( \Omega )$ is a finite set of cardinality $r \in \mathbb { N } ^ { * }$. We denote $X ( \Omega ) = \left\{ x _ { 1 } , \ldots , x _ { r } \right\}$ where the $x _ { i }$ are pairwise distinct, and, for all integer $k \in \llbracket 1 , r \rrbracket , a _ { k } = \mathbb { P } \left( X = x _ { k } \right)$. Show that, for all real $t , \phi _ { X } ( t ) = \sum _ { k = 1 } ^ { r } a _ { k } \mathrm { e } ^ { \mathrm { i } t x _ { k } }$.
grandes-ecoles 2020 Q8 Concentration inequality via MGF and Markov's inequality (Chernoff method)
Let $n \geqslant 1$ be a natural integer, and let $(X_1, \ldots, X_n)$ be discrete real random variables that are mutually independent such that, for all $k \in \{1, \ldots, n\}$, $$P[X_k = 1] = P[X_k = -1] = \frac{1}{2}$$ We define $$S_n = \frac{1}{n} \sum_{k=1}^{n} X_k$$ as well as, for all $\lambda \in \mathbb{R}$, $$\psi(\lambda) = \log\left(\frac{1}{2}e^{\lambda} + \frac{1}{2}e^{-\lambda}\right)$$ For each $\lambda \geqslant 0$, we set $$m(\lambda) = \frac{E[X_1 \exp(\lambda X_1)]}{E[\exp(\lambda X_1)]}$$ as well as $$D_n(\lambda) = \exp(\lambda n S_n - n \psi(\lambda))$$ For all $n \geqslant 1$, $\lambda \geqslant 0$ and $\varepsilon > 0$, we denote by $I_n(\lambda, \varepsilon)$ the random variable defined by $$I_n(\lambda, \varepsilon) = \begin{cases} 1 & \text{if } |S_n - m(\lambda)| \leqslant \varepsilon \\ 0 & \text{otherwise.} \end{cases}$$
(a) Deduce, for each $\lambda \geqslant 0$ and $\varepsilon > 0$, the existence of a sequence $(u_n(\varepsilon))_{n \geqslant 1}$ that tends to 0 as $n$ tends to infinity and such that $$\frac{1}{n} \log P[S_n \geqslant m(\lambda) - \varepsilon] \geqslant \psi(\lambda) - \lambda m(\lambda) - \lambda \varepsilon + u_n(\varepsilon)$$
(b) Conclude that for all $t \in [0,1[$, $$\lim_{n \rightarrow \infty} \frac{1}{n} \log P[S_n \geqslant t] = \inf_{\lambda \geqslant 0} (\psi(\lambda) - \lambda t).$$
(c) Is the preceding formula still valid for $t = 1$?
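The limit in (b) is Cramér's large-deviation rate for Rademacher sums: with $\psi(\lambda) = \log\cosh\lambda$, the infimum over $\lambda \geqslant 0$ is attained at $\lambda = \operatorname{artanh}(t)$ for $t \in [0,1[$. A numerical sketch comparing the exact probability (via the binomial law of the number of $+1$ steps) with this infimum; the tolerance below reflects the $O(\log n / n)$ correction and is an illustrative choice:

```python
import math

def lhs(n, t):
    # (1/n) log P[S_n >= t] computed exactly: S_n = (2M - n)/n with
    # M ~ Binomial(n, 1/2), so S_n >= t  <=>  M >= n(1 + t)/2
    m0 = math.ceil(n * (1 + t) / 2)
    total = sum(math.comb(n, m) for m in range(m0, n + 1))
    return (math.log(total) - n * math.log(2)) / n

def rhs(t):
    # inf_{lambda >= 0} (psi(lambda) - lambda t), attained at lambda = artanh(t)
    lam = math.atanh(t)
    return math.log(math.cosh(lam)) - lam * t
```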
grandes-ecoles 2020 Q9 Existence and domain of the MGF
Show that for all $t \in \mathbb { R } , \left| \phi _ { X } ( t ) \right| \leqslant 1$.
grandes-ecoles 2020 Q10 MGF uniquely determines moments or distribution
Show that, if there exist $a \in \mathbb { R }$ and $t _ { 0 } \in \mathbb { R } ^ { * }$ such that $X ( \Omega ) \subset a + \frac { 2 \pi } { t _ { 0 } } \mathbb { Z }$, then $\left| \phi _ { X } \left( t _ { 0 } \right) \right| = 1$.
grandes-ecoles 2020 Q11 MGF uniquely determines moments or distribution
We assume that there exists $t _ { 0 } \in \mathbb { R } ^ { * }$ such that $\left| \phi _ { X } \left( t _ { 0 } \right) \right| = 1$. We assume further that $X ( \Omega )$ is countable and we use the notations of question 2 (i.e. $X ( \Omega ) = \left\{ x _ { n } , n \in \mathbb { N } \right\}$ with $a _ { n } = \mathbb { P } \left( X = x _ { n } \right)$). Show that there exists $a \in \mathbb { R }$ such that $\sum _ { n = 0 } ^ { + \infty } a _ { n } \exp \left( \mathrm { i } \left( t _ { 0 } x _ { n } - t _ { 0 } a \right) \right) = 1$.
grandes-ecoles 2020 Q12 MGF uniquely determines moments or distribution
We assume that there exists $t _ { 0 } \in \mathbb { R } ^ { * }$ such that $\left| \phi _ { X } \left( t _ { 0 } \right) \right| = 1$. We assume further that $X ( \Omega )$ is countable and we use the notations of question 2 (i.e. $X ( \Omega ) = \left\{ x _ { n } , n \in \mathbb { N } \right\}$ with $a _ { n } = \mathbb { P } \left( X = x _ { n } \right)$). Using the result of Q11, deduce that $\sum _ { n = 0 } ^ { + \infty } a _ { n } \left( 1 - \cos \left( t _ { 0 } x _ { n } - t _ { 0 } a \right) \right) = 0$.
grandes-ecoles 2020 Q13 MGF uniquely determines moments or distribution
We assume that there exists $t _ { 0 } \in \mathbb { R } ^ { * }$ such that $\left| \phi _ { X } \left( t _ { 0 } \right) \right| = 1$. We assume further that $X ( \Omega )$ is countable and we use the notations of question 2 (i.e. $X ( \Omega ) = \left\{ x _ { n } , n \in \mathbb { N } \right\}$ with $a _ { n } = \mathbb { P } \left( X = x _ { n } \right)$). Show that for all $n \in \mathbb { N }$, if $a _ { n } \neq 0$, then $x _ { n } \in a + \frac { 2 \pi } { t _ { 0 } } \mathbb { Z }$.
grandes-ecoles 2020 Q27 MGF uniquely determines moments or distribution
We admit that $\int _ { 0 } ^ { + \infty } \operatorname { sinc } ( s ) \mathrm { d } s = \frac { \pi } { 2 }$. If $a$ and $b$ are two real numbers, we denote by $K _ { a , b }$ the function defined for all real $t$ by $K _ { a , b } ( t ) = \begin{cases} \frac { \mathrm { e } ^ { \mathrm { i } t b } - \mathrm { e } ^ { \mathrm { i } t a } } { 2 \mathrm { i } t } & \text { if } t \neq 0 , \\ \frac { b - a } { 2 } & \text { if } t = 0 . \end{cases}$ Let $X : \Omega \rightarrow \mathbb { R }$ be a random variable such that $X ( \Omega )$ is finite. We assume that the real numbers $a$ and $b$ do not belong to $X ( \Omega )$. Show that $$\frac { 1 } { \pi } \int _ { - N } ^ { N } \phi _ { X } ( - t ) K _ { a , b } ( t ) \mathrm { d } t \xrightarrow [ N \rightarrow + \infty ] { } \mathbb { P } ( a < X < b )$$
grandes-ecoles 2020 Q29 Derivative formulas and recursive structure for characteristic functions
We fix a real random variable $X : \Omega \rightarrow \mathbb { R }$, whose image $X ( \Omega )$ is a countable set, with $X ( \Omega ) = \left\{ x _ { n } , n \in \mathbb { N } \right\}$ and $a _ { n } = \mathbb { P } \left( X = x _ { n } \right)$. Let $k \in \mathbb { N } ^ { * }$. We assume that $X$ admits a moment of order $k$. Deduce that $\phi _ { X }$ is of class $C ^ { k }$ on $\mathbb { R }$ and give an expression for the $k$-th derivative of $\phi _ { X }$.
grandes-ecoles 2020 Q38 Power series expansion of the characteristic function
Let $X : \Omega \rightarrow \mathbb { R }$ be a real-valued random variable. We assume that $X ( \Omega )$ is finite and we use the notation from question 1: $X ( \Omega ) = \left\{ x _ { 1 } , \ldots , x _ { r } \right\}$ with $a _ { k } = \mathbb { P } \left( X = x _ { k } \right)$. Show that $\phi _ { X }$ is expandable as a power series on $\mathbb { R }$ and, for all real $t , \phi _ { X } ( t ) = \sum _ { n = 0 } ^ { + \infty } \frac { ( \mathrm { i } t ) ^ { n } } { n ! } \mathbb { E } \left( X ^ { n } \right)$.
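For a finite support the expansion can be verified directly against a truncated series. A sketch; the support $\{-1, 0, 2\}$ and the weights below are an illustrative choice, not from the exam:

```python
import cmath
import math

# illustrative finite-support variable: P(X = x_k) = a_k
xs = [-1.0, 0.0, 2.0]
ps = [0.2, 0.5, 0.3]

def phi(t):
    # phi_X(t) = sum_k a_k exp(i t x_k)
    return sum(a * cmath.exp(1j * t * x) for x, a in zip(xs, ps))

def moment(n):
    # E[X^n]
    return sum(a * x**n for x, a in zip(xs, ps))

def phi_series(t, N=60):
    # truncation of sum_{n>=0} (it)^n / n! * E[X^n]
    return sum((1j * t)**n / math.factorial(n) * moment(n) for n in range(N + 1))
```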
grandes-ecoles 2020 Q40 Power series expansion of the characteristic function
Let $X : \Omega \rightarrow \mathbb { R }$ be a real-valued random variable. We assume that $X ( \Omega )$ is countable and we use the notation from question 2: $X ( \Omega ) = \left\{ x _ { n } , n \in \mathbb { N } \right\}$ with $a _ { n } = \mathbb { P } \left( X = x _ { n } \right)$. We also assume that, for all integer $n \in \mathbb { N } , X$ admits a moment of order $n$ and that there exists a real $R > 0$ such that $$\mathbb { E } \left( | X | ^ { n } \right) = O \left( \frac { n ^ { n } } { R ^ { n } } \right) \quad \text { when } n \rightarrow + \infty$$ Using the result of Q39, deduce that for all real $t \in \left[ - \frac { R } { \mathrm { e } } , \frac { R } { \mathrm { e } } \right]$, $$\phi _ { X } ( t ) = \sum _ { k = 0 } ^ { + \infty } \frac { ( \mathrm { i } t ) ^ { k } } { k ! } \mathbb { E } \left( X ^ { k } \right)$$
grandes-ecoles 2022 Q13 Existence and domain of the MGF
Let $X$ be a real random variable. Show that $\left| \Phi _ { X } ( \theta ) \right| \leq 1$ for all real $\theta$.
grandes-ecoles 2022 Q14 Compute MGF or characteristic function for a named distribution
In this question, we are given a real random variable $X$ following a geometric distribution with an arbitrary parameter $p \in ] 0,1 [$. We set $q = 1 - p$.
Show that for all $( a , b ) \in \mathbf { R } ^ { 2 }$ and all real $\theta$,
$$\Phi _ { a X + b } ( \theta ) = \frac { p e ^ { i ( a + b ) \theta } } { 1 - q e ^ { i a \theta } }$$
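The closed form can be checked against the defining series $\Phi_{aX+b}(\theta) = \sum_{k \geq 1} p\,q^{k-1} e^{\mathrm{i}\theta(ak+b)}$; the values of $a$, $b$, $\theta$, $p$ below are illustrative:

```python
import cmath

def phi_series(a, b, theta, p, K=2000):
    # partial sum of E[exp(i theta (aX + b))] for X geometric on {1, 2, ...}
    # with P(X = k) = p q^{k-1}; the neglected tail is O(q^K)
    q = 1 - p
    return sum(p * q**(k - 1) * cmath.exp(1j * theta * (a * k + b))
               for k in range(1, K + 1))

def phi_closed(a, b, theta, p):
    # the claimed closed form
    q = 1 - p
    return p * cmath.exp(1j * (a + b) * theta) / (1 - q * cmath.exp(1j * a * theta))
```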
grandes-ecoles 2022 Q21 Approximation or bound on characteristic function difference
We are given a centered real random variable $Y$ such that $Y ^ { 4 }$ has finite expectation.
Conclude that for all real $\theta$,
$$\left| \Phi _ { Y } ( \theta ) - \exp \left( - \frac { \mathbf { E } \left( Y ^ { 2 } \right) \theta ^ { 2 } } { 2 } \right) \right| \leq \frac { | \theta | ^ { 3 } } { 3 } \left( \mathbf { E } \left( Y ^ { 4 } \right) \right) ^ { 3 / 4 } + \frac { \theta ^ { 4 } } { 8 } \mathbf { E } \left( Y ^ { 4 } \right)$$
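The bound can be sanity-checked on a concrete centered variable. Taking $Y$ Rademacher (an illustrative choice, for which $\Phi_Y(\theta) = \cos\theta$ and $\mathbf{E}(Y^2) = \mathbf{E}(Y^4) = 1$):

```python
import math

def lhs(theta):
    # |Phi_Y(theta) - exp(-E(Y^2) theta^2 / 2)| with Y Rademacher,
    # so Phi_Y(theta) = cos(theta) and E(Y^2) = 1
    return abs(math.cos(theta) - math.exp(-theta**2 / 2))

def rhs(theta):
    # |theta|^3/3 * E(Y^4)^(3/4) + theta^4/8 * E(Y^4), with E(Y^4) = 1
    return abs(theta)**3 / 3 + theta**4 / 8
```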
grandes-ecoles 2022 Q40 Upper bound on MGF (sub-Gaussian or exponential inequalities)
Let $X _ { 1 } , \ldots , X _ { n } , Y _ { 1 } , \ldots , Y _ { n }$ be mutually independent random variables with the same distribution $\mathcal { R }$. We define the random vectors $X = \frac { 1 } { \sqrt { n } } \left( X _ { 1 } , \ldots , X _ { n } \right) ^ { \top }$ and $Y = \frac { 1 } { \sqrt { n } } \left( Y _ { 1 } , \ldots , Y _ { n } \right) ^ { \top }$ taking values in $\mathcal { M } _ { n , 1 } ( \mathbb { R } )$.
Deduce that, for every real number $t$, $$\mathbb { E } ( \exp ( t \langle X \mid Y \rangle ) ) \leqslant \exp \left( \frac { t ^ { 2 } } { 2 n } \right)$$
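The distribution $\mathcal{R}$ is not defined in this excerpt; assuming it is the Rademacher distribution (an assumption, consistent with the stated bound), each product $X_iY_i$ is again Rademacher, so $\mathbb{E}(\exp(t\langle X \mid Y\rangle)) = \cosh(t/n)^n$ and the claim reduces to $\cosh(x) \leqslant e^{x^2/2}$. A sketch under that assumption:

```python
import math

def mgf_inner(t, n):
    # E[exp(t <X|Y>)] when the common law R is Rademacher (assumption):
    # <X|Y> = (1/n) sum_i Z_i with Z_i = X_i Y_i i.i.d. Rademacher,
    # so the MGF factors as cosh(t/n)^n
    return math.cosh(t / n)**n
```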
grandes-ecoles 2024 Q24 Extract moments from the MGF or characteristic function
Let $\left( X _ { k } \right) _ { k \in \mathbf{N} ^ { * } }$ be independent random variables with the same distribution given by:
$$P \left( X _ { 1 } = - 1 \right) = P \left( X _ { 1 } = 1 \right) = \frac { 1 } { 2 }$$
For all $n \in \mathbf { N } ^ { * }$, we denote $S _ { n } = \sum _ { k = 1 } ^ { n } X _ { k }$.
Deduce that for all $n \in \mathbf { N } ^ { * }$:
$$E \left( \left| S _ { n } \right| \right) = \frac { 2 } { \pi } \int _ { 0 } ^ { + \infty } \frac { 1 - ( \cos ( t ) ) ^ { n } } { t ^ { 2 } } \mathrm {~d} t$$
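The identity can be checked numerically for a small odd $n$ (odd so that $\cos^n$ has zero mean and the truncated tail is $O(1/T^2)$), comparing Simpson's rule plus a $1/T$ tail estimate with $E(|S_n|)$ computed directly from the binomial law; direct enumeration gives $E(|S_5|) = 15/8$. The truncation point and step count below are illustrative:

```python
import math

def exact_abs_mean(n):
    # E|S_n| from the binomial law of the number m of +1 steps: S_n = 2m - n
    return sum(math.comb(n, m) * abs(2 * m - n) for m in range(n + 1)) / 2**n

def integral_formula(n, T=200.0, steps=40000):
    # (2/pi) [ integral_0^T (1 - cos(t)^n)/t^2 dt  (Simpson)  +  1/T (tail) ]
    def f(t):
        return n / 2 if t == 0.0 else (1 - math.cos(t)**n) / t**2
    h = T / steps
    s = f(0.0) + f(T)
    for i in range(1, steps):
        s += (4 if i % 2 else 2) * f(i * h)
    return (2 / math.pi) * (s * h / 3 + 1 / T)
```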
grandes-ecoles 2024 Q25 Extract moments from the MGF or characteristic function
Let $\left( X _ { k } \right) _ { k \in \mathbf{N} ^ { * } }$ be independent random variables with the same distribution given by:
$$P \left( X _ { 1 } = - 1 \right) = P \left( X _ { 1 } = 1 \right) = \frac { 1 } { 2 }$$
For all $n \in \mathbf { N } ^ { * }$, we denote $S _ { n } = \sum _ { k = 1 } ^ { n } X _ { k }$.
Conclude that:
$$\forall n \in \mathbf { N } ^ { * } , \quad E \left( \left| S _ { 2 n } \right| \right) = E \left( \left| S _ { 2 n - 1 } \right| \right) = \frac { ( 2 n - 1 ) ! } { 2 ^ { 2 n - 2 } ( ( n - 1 ) ! ) ^ { 2 } }$$
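Both the equality $E(|S_{2n}|) = E(|S_{2n-1}|)$ and the closed form can be confirmed by direct enumeration for small $n$; a quick check:

```python
import math

def exact_abs_mean(n):
    # E|S_n| by summing |2m - n| over the binomial law of the +1 count m
    return sum(math.comb(n, m) * abs(2 * m - n) for m in range(n + 1)) / 2**n

def closed_form(n):
    # (2n - 1)! / (2^(2n-2) ((n-1)!)^2)
    return math.factorial(2 * n - 1) / (2**(2 * n - 2) * math.factorial(n - 1)**2)
```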
grandes-ecoles 2025 Q5 Upper bound on MGF (sub-Gaussian or exponential inequalities)
Let $\left( X _ { i } \right) _ { i \in \llbracket 1 , n \rrbracket }$ be a sequence of independent random variables all following a Rademacher distribution. Show that: for all $t \geq 0$, for all $\left( c _ { 1 } , \ldots , c _ { n } \right) \in \mathbf { R } ^ { n }$, $$\mathbf { E } \left( \exp \left( t \sum _ { i = 1 } ^ { n } c _ { i } X _ { i } \right) \right) \leq \exp \left( \frac { t ^ { 2 } } { 2 } \sum _ { i = 1 } ^ { n } c _ { i } ^ { 2 } \right)$$
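This is the standard sub-Gaussian bound for Rademacher sums: by independence the left-hand side factors as $\prod_i \cosh(t c_i)$, and $\cosh(x) \leq e^{x^2/2}$ term by term. A brute-force check over all sign patterns, with illustrative $t$ and $c_i$:

```python
import itertools
import math

def exact_mgf(t, c):
    # E[exp(t sum_i c_i X_i)] by enumerating all 2^n equiprobable sign patterns
    n = len(c)
    return sum(math.exp(t * sum(s * ci for s, ci in zip(signs, c)))
               for signs in itertools.product((-1, 1), repeat=n)) / 2**n

def cosh_product(t, c):
    # the independence factorization: prod_i cosh(t c_i)
    return math.prod(math.cosh(t * ci) for ci in c)

def sub_gaussian_bound(t, c):
    # exp(t^2/2 * sum_i c_i^2)
    return math.exp(t**2 * sum(ci**2 for ci in c) / 2)
```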