UFM Statistics


grandes-ecoles 2017 QI.A.2 Existence and domain of the MGF
a) Suppose that $X$ is bounded. Justify that $X$ satisfies $(C_{\tau})$ for all $\tau$ in $\mathbb{R}^{+*}$.
b) Suppose that $X$ follows the geometric distribution with parameter $p \in ]0,1[$ $$\forall k \in \mathbb{N}^{*}, \quad P(X = k) = p(1-p)^{k-1}$$ What are the real numbers $t$ such that $E\left(\mathrm{e}^{tX}\right) < +\infty$? For these $t$, give a simple expression for $E\left(\mathrm{e}^{tX}\right)$.
c) Suppose that $X$ follows the Poisson distribution with parameter $\lambda$: $$\forall k \in \mathbb{N}, \quad P(X = k) = \mathrm{e}^{-\lambda} \frac{\lambda^{k}}{k!} \quad \text{where } \lambda \in \mathbb{R}^{+*}$$ What are the real numbers $t$ such that $E\left(\mathrm{e}^{tX}\right) < +\infty$? For these $t$, give a simple expression for $E\left(\mathrm{e}^{tX}\right)$.
grandes-ecoles 2017 QI.A.3 Existence and domain of the MGF
Let $a$ and $b$ be two real numbers such that $a < b$. Suppose $E\left(\mathrm{e}^{aX}\right) < +\infty$ and $E\left(\mathrm{e}^{bX}\right) < +\infty$.
a) Show $\forall t \in [a,b]$, $\mathrm{e}^{tX} \leqslant \mathrm{e}^{aX} + \mathrm{e}^{bX}$. Deduce that $E\left(\mathrm{e}^{tX}\right) < +\infty$. What can we conclude about the set $\left\{t \in \mathbb{R} ; E\left(\mathrm{e}^{tX}\right) < +\infty\right\}$?
b) Let $k$ be in $\mathbb{N}$, $t$ in $]a,b[$. We denote by $\theta_{k,t,a,b}$ the function $y \in \mathbb{R} \mapsto \frac{y^{k} \mathrm{e}^{ty}}{\mathrm{e}^{ay} + \mathrm{e}^{by}}$. Determine the limits of $\theta_{k,t,a,b}$ at $+\infty$ and $-\infty$. Show that this function is bounded on $\mathbb{R}$.
c) Show that $E\left(|X|^{k} \mathrm{e}^{tX}\right) < +\infty$.
d) We return to the notations of question b). Let $k$ be in $\mathbb{N}$, $c$ and $d$ be two real numbers such that $a < c < d < b$. Show that there exists $M_{k,a,b,c,d} \in \mathbb{R}^{+}$ such that for all $t \in [c,d]$ and for all $y \in \mathbb{R}$: $\left|\theta_{k,t,a,b}(y)\right| \leqslant M_{k,a,b,c,d}$.
grandes-ecoles 2017 QI.A.4 Existence and domain of the MGF
In this question, $\tau$ is an element of $\mathbb{R}^{+*}$ and $X$ satisfies $(C_{\tau})$.
a) Show that the set of real numbers $t$ such that $E\left(\mathrm{e}^{tX}\right) < +\infty$ is an interval $I$ containing $[-\tau, \tau]$. For $t$ in $I$, we denote $\varphi_{X}(t) = E\left(\mathrm{e}^{tX}\right)$.
b) Show that if $X(\Omega)$ is finite, $\varphi_{X}$ is continuous on $I$ and of class $C^{\infty}$ on the interior of $I$.
c) Suppose now that $X(\Omega)$ is a countably infinite set. We denote $X(\Omega) = \left\{x_{n} ; n \in \mathbb{N}^{*}\right\}$ where $\left(x_{n}\right)_{n \in \mathbb{N}^{*}}$ is a sequence of pairwise distinct real numbers and we set for all $n \in \mathbb{N}^{*}$, $p_{n} = P\left(X = x_{n}\right)$. Using the results established in question I.A.3 and two theorems relating to series of functions which you will state completely, show that $\varphi_{X}$ is continuous on $I$ and of class $C^{\infty}$ on the interior of $I$.
d) Verify that for $t$ in the interior of $I$ and $k$ in $\mathbb{N}$, $\varphi_{X}^{(k)}(t) = E\left(X^{k} \mathrm{e}^{tX}\right)$.
e) Let $\psi_{X} = \frac{\varphi_{X}^{\prime}}{\varphi_{X}}$. Show that $\psi_{X}$ is increasing on $I$ and that, if $X$ is not almost surely equal to a constant, $\psi_{X}$ is strictly increasing on $I$.
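As an illustration (outside the exam text), for $X \sim \mathcal{B}(p)$ one has $\varphi_{X}(t) = 1-p+p\mathrm{e}^{t}$ and $\varphi_{X}^{\prime}(t) = p\mathrm{e}^{t}$, so $\psi_{X}$ is available in closed form; a minimal Python check of its strict monotonicity on a grid:

```python
import math

# For X ~ Bernoulli(p): phi_X(t) = 1 - p + p e^t, phi_X'(t) = p e^t, so
# psi_X(t) = p e^t / (1 - p + p e^t). Question e) predicts psi_X is strictly
# increasing when X is not almost surely constant (here 0 < p < 1).

def psi_bernoulli(p, t):
    return p * math.exp(t) / (1 - p + p * math.exp(t))

p = 0.3
grid = [-3 + 0.1 * i for i in range(61)]          # t from -3 to 3
values = [psi_bernoulli(p, t) for t in grid]
print(all(a < b for a, b in zip(values, values[1:])))   # True: strictly increasing
```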
grandes-ecoles 2017 QII.B.1 Concentration inequality via MGF and Markov's inequality (Chernoff method)
The interval $I$ and the function $\varphi_{X}$ are defined as in question I.A.4. We suppose that $X$ satisfies $(C_{\tau})$ for some $\tau > 0$, that $X$ is not almost surely constant, and that $a$ is a real number such that $a > E(X)$.
Show that, for $n$ in $\mathbb{N}^{*}$ and $t$ in $I \cap \mathbb{R}^{+}$ $$E\left(\mathrm{e}^{tS_{n}}\right) = \left(\varphi_{X}(t)\right)^{n}, \quad P\left(S_{n} \geqslant na\right) \leqslant \frac{\varphi_{X}(t)^{n}}{\mathrm{e}^{nta}}$$
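Both displayed facts can be verified exactly when $X \sim \mathcal{B}(p)$, since $S_{n}$ then follows the binomial distribution $\mathcal{B}(n,p)$ and $\varphi_{X}(t) = 1-p+p\mathrm{e}^{t}$; a hedged Python sketch with arbitrarily chosen $n$, $p$, $a$, $t$:

```python
import math

# Exact check of the Chernoff-type bound for X ~ Bernoulli(p):
# S_n ~ Binomial(n, p) and phi_X(t) = 1 - p + p e^t.

def binom_tail(n, p, k0):
    # P(S_n >= k0), computed exactly from the binomial distribution
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k0, n + 1))

n, p, a, t = 50, 0.3, 0.5, 0.4     # a > E(X) = p, t > 0
phi = 1 - p + p * math.exp(t)
tail = binom_tail(n, p, math.ceil(n * a))
bound = phi**n / math.exp(n * t * a)
print(tail <= bound)   # True
```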
grandes-ecoles 2017 QII.B.2 Concentration inequality via MGF and Markov's inequality (Chernoff method)
The interval $I$ and the function $\varphi_{X}$ are defined as in question I.A.4. We suppose that $X$ satisfies $(C_{\tau})$ for some $\tau > 0$, that $X$ is not almost surely constant, and that $a$ is a real number such that $a > E(X)$. We define the function $\chi : I \rightarrow \mathbb{R},\ t \mapsto \ln\left(\varphi_{X}(t)\right) - ta$.
a) Show that the function $\chi$ is bounded below on $I \cap \mathbb{R}^{+}$. We denote by $\eta_{a}$ the infimum of $\chi$ on $I \cap \mathbb{R}^{+}$.
b) Give an equivalent of $\chi(t)$ as $t$ tends to 0. Deduce that $\eta_{a} < 0$.
c) Show $\forall n \in \mathbb{N}^{*}, \quad P\left(S_{n} \geqslant na\right) \leqslant \mathrm{e}^{n\eta_{a}}$. Deduce that $\gamma_{a} < 0$.
d) In each of the following two cases, determine the set of real numbers $a$ satisfying the conditions $P(X \geqslant a) > 0$ and $a > E(X)$; then, for $a$ satisfying these conditions, calculate $\eta_{a}$.
i. $X$ follows the Bernoulli distribution $\mathcal{B}(p)$ with $0 < p < 1$.
ii. $X$ follows the Poisson distribution $\mathcal{P}(\lambda)$ with $\lambda > 0$.
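As a numerical companion to this question (not a solution), the infimum $\eta_{a}$ can be approximated by minimising $\chi(t) = \ln\left(\varphi_{X}(t)\right) - ta$ over a grid of $t > 0$, using the classical log-MGFs $\ln\left(1-p+p\mathrm{e}^{t}\right)$ and $\lambda\left(\mathrm{e}^{t}-1\right)$; the sketch below confirms $\eta_{a} < 0$ in both cases for sample parameters satisfying the stated conditions:

```python
import math

# Grid minimisation of chi(t) = ln(phi_X(t)) - t a over t > 0, for the two
# named distributions; II.B.2b predicts the infimum eta_a is strictly negative.

def chi_bernoulli(p, a, t):
    return math.log(1 - p + p * math.exp(t)) - t * a

def chi_poisson(lam, a, t):
    return lam * (math.exp(t) - 1) - t * a   # ln(phi_X) for Poisson(lam)

grid = [0.01 * i for i in range(1, 1001)]                  # t in (0, 10]
eta_bern = min(chi_bernoulli(0.3, 0.5, t) for t in grid)   # a = 0.5 > E(X) = 0.3
eta_pois = min(chi_poisson(1.0, 2.0, t) for t in grid)     # a = 2 > E(X) = 1
print(eta_bern < 0, eta_pois < 0)   # True True
```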
grandes-ecoles 2017 QII.C.1 Concentration inequality via MGF and Markov's inequality (Chernoff method)
We suppose here that the infimum $\eta_{a}$ of the function $\chi$ on $I \cap \mathbb{R}^{+}$ is attained at a point $\sigma$ interior to $I \cap \mathbb{R}^{+}$. Let $t$ be a real number interior to $I$ such that $t > \sigma$, and let $b$ be a real number such that $b > \frac{\varphi_{X}^{\prime}(t)}{\varphi_{X}(t)}$.
a) Calculate $\sum_{x \in X(\Omega)} \frac{\mathrm{e}^{tx}}{E\left(\mathrm{e}^{tX}\right)} P(X = x)$.
We then admit (if necessary by modifying $(\Omega, \mathcal{A}, P)$) that there exists a random variable $X^{\prime}$ on $(\Omega, \mathcal{A})$ such that $X^{\prime}(\Omega) = X(\Omega)$ and whose probability distribution is given by $$\forall x \in X(\Omega), \quad P\left(X^{\prime} = x\right) = \frac{\mathrm{e}^{tx}}{E\left(\mathrm{e}^{tX}\right)} P(X = x)$$ and that there exists a sequence $\left(X_{n}^{\prime}\right)_{n \in \mathbb{N}^{*}}$ of mutually independent random variables defined on $(\Omega, \mathcal{A}, P)$ all following the same distribution as $X^{\prime}$.
b) Show $$E\left(X^{\prime}\right) = \frac{\varphi_{X}^{\prime}(t)}{\varphi_{X}(t)}, \quad E\left(X^{\prime}\right) > a$$
grandes-ecoles 2017 QII.C.2 Concentration inequality via MGF and Markov's inequality (Chernoff method)
We suppose here that the infimum $\eta_{a}$ of the function $\chi$ on $I \cap \mathbb{R}^{+}$ is attained at a point $\sigma$ interior to $I \cap \mathbb{R}^{+}$. Let $t$ be a real number interior to $I$ such that $t > \sigma$, and let $b$ be a real number such that $b > \frac{\varphi_{X}^{\prime}(t)}{\varphi_{X}(t)}$. We admit that, for $n$ in $\mathbb{N}^{*}$ and $f$ a map from $X(\Omega)^{n}$ to $\mathbb{R}^{+}$, we have $$E\left(f\left(X_{1}^{\prime}, \ldots, X_{n}^{\prime}\right)\right) = \frac{E\left(f\left(X_{1}, \ldots, X_{n}\right) \mathrm{e}^{tS_{n}}\right)}{\varphi_{X}(t)^{n}}$$
a) For $n$ in $\mathbb{N}^{*}$, we set $S_{n}^{\prime} = \sum_{k=1}^{n} X_{k}^{\prime}$. Show $P\left(na \leqslant S_{n}^{\prime} \leqslant nb\right) \leqslant P\left(S_{n} \geqslant na\right) \frac{\mathrm{e}^{ntb}}{\varphi_{X}(t)^{n}}$.
We may introduce the map $$f : \left|\, \begin{array}{cl} X(\Omega)^{n} & \rightarrow \mathbb{R} \\ \left(x_{1}, \ldots, x_{n}\right) & \mapsto \begin{cases} 1 & \text{if } na \leqslant \sum_{i=1}^{n} x_{i} \leqslant nb \\ 0 & \text{otherwise} \end{cases} \end{array} \right.$$
b) Using questions I.B.2, II.B.2c and a) above, finally show that $\eta_{a} = \gamma_{a}$.
grandes-ecoles 2017 QII.C.3 Concentration inequality via MGF and Markov's inequality (Chernoff method)
In this question you may use the results from II.B.2d.
a) Let $\alpha$ be in $]0, 1/2[$. For $n$ in $\mathbb{N}^{*}$, we set $$A_{n} = \left\{k \in \{0, \ldots, n\}, \left|k - \frac{n}{2}\right| \geqslant \alpha n\right\}, \quad U_{n} = \sum_{k \in A_{n}} \binom{n}{k}$$ Determine the limit of the sequence $\left(U_{n}^{1/n}\right)_{n \geqslant 1}$.
b) Let $\lambda$ be in $\mathbb{R}^{+*}$, $\alpha$ be in $]\lambda, +\infty[$. For $n$ in $\mathbb{N}^{*}$, we set $$T_{n} = \sum_{\substack{k \in \mathbb{N} \\ k \geqslant \alpha n}} \frac{n^{k} \lambda^{k}}{k!}$$ Determine the limit of the sequence $\left(T_{n}^{1/n}\right)_{n \geqslant 1}$.
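As a numerical aside (not part of the exam text), one classical route to limits of this kind squeezes the sum between its largest term and $(n+1)$ times that term, so that $U_{n}^{1/n}$ has the same limit as the largest term raised to the power $1/n$; the sketch below checks the squeeze for one choice of $\alpha$ and $n$:

```python
import math

# Squeeze check for U_n = sum over A_n of binom(n, k): since the sum has at most
# n+1 terms, biggest <= U_n <= (n+1) * biggest, hence
# 1 <= (U_n / biggest)^{1/n} <= (n+1)^{1/n}, and the right side tends to 1.

alpha, n = 0.25, 300
A = [k for k in range(n + 1) if abs(k - n / 2) >= alpha * n]
U = sum(math.comb(n, k) for k in A)
biggest = max(math.comb(n, k) for k in A)
ratio = (U / biggest) ** (1 / n)
print(1.0 <= ratio <= (n + 1) ** (1 / n))   # True
```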
grandes-ecoles 2017 QII.B.1 Existence and domain of the MGF
Let $\alpha$ be a strictly positive real and $X$ a discrete random variable admitting an exponential moment of order $\alpha$. Show that the random variable $e^{\alpha X}$ has finite expectation.
grandes-ecoles 2017 QII.B.2 Compute MGF or characteristic function for a named distribution
For each of the following real random variables, determine the strictly positive reals $\alpha$ such that the random variable admits an exponential moment of order $\alpha$ and calculate $\mathbb{E}\left(\mathrm{e}^{\alpha X}\right)$ in this case.
a) $X$ a random variable following a Poisson distribution with parameter $\lambda$, where $\lambda$ is a strictly positive real.
b) $Y$ a random variable following a geometric distribution with parameter $p$, where $p$ is a real strictly between 0 and 1.
c) $Z$ a random variable following a binomial distribution with parameters $n$ and $p$, where $n$ is a strictly positive integer and $p$ is a real strictly between 0 and 1.
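For c), the classical closed form is $\mathbb{E}\left(\mathrm{e}^{\alpha Z}\right) = \left(1-p+p\mathrm{e}^{\alpha}\right)^{n}$, valid for every real $\alpha$ since $Z$ is bounded; as a sanity check outside the exam text, a short Python comparison against the defining finite sum (parameters arbitrary):

```python
import math

# Binomial(n, p): E(e^{alpha Z}) computed from the definition versus the
# classical closed form (1 - p + p e^alpha)^n, an instance of the binomial theorem.

def mgf_binomial_sum(n, p, alpha):
    return sum(math.exp(alpha * k) * math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1))

n, p, alpha = 12, 0.4, 0.7
closed = (1 - p + p * math.exp(alpha)) ** n
print(abs(mgf_binomial_sum(n, p, alpha) - closed) < 1e-9)   # True
```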
grandes-ecoles 2017 QII.C.2 Existence and domain of the MGF
In subsection II.C, we consider $\varepsilon$ a strictly positive real, $X$ a discrete real random variable taking values in $\left\{x_{p}, p \in \mathbb{N}\right\}$, and $\left(X_{k}\right)_{k \in \mathbb{N}^{*}}$ a sequence of random variables that are mutually independent and have the same distribution as $X$. For every strictly positive integer $n$, we define the random variable $S_{n}$ by $S_{n}=\sum_{k=1}^{n} X_{k}$. We assume that the random variable $X$ admits an exponential moment of order $\alpha$ where $\alpha$ is a strictly positive real.
a) Show that the function $\Psi: t \mapsto \mathbb{E}\left(\mathrm{e}^{t X}\right)$ is defined and continuous on the segment $[-\alpha, \alpha]$.
b) Show that the function $\Psi$ is differentiable on the interval $]-\alpha, \alpha[$ and determine its derivative function.
grandes-ecoles 2017 QII.C.3 Extract moments from the MGF or characteristic function
In subsection II.C, we consider $\varepsilon$ a strictly positive real, $X$ a discrete real random variable taking values in $\left\{x_{p}, p \in \mathbb{N}\right\}$, and $\left(X_{k}\right)_{k \in \mathbb{N}^{*}}$ a sequence of random variables that are mutually independent and have the same distribution as $X$. For every strictly positive integer $n$, we define the random variable $S_{n}$ by $S_{n}=\sum_{k=1}^{n} X_{k}$. We assume that the random variable $X$ admits an exponential moment of order $\alpha$ where $\alpha$ is a strictly positive real. The function $\Psi: t \mapsto \mathbb{E}\left(\mathrm{e}^{tX}\right)$ is defined on $[-\alpha, \alpha]$.
We consider the function $f_{\varepsilon}$ defined by $$f_{\varepsilon}:\left\{\begin{array}{l}[-\alpha, \alpha] \rightarrow \mathbb{R}^{+} \\ t \mapsto \mathrm{e}^{-(m+\varepsilon) t} \Psi(t)\end{array}\right.$$ where $m$ denotes the expectation $\mathbb{E}(X)$.
a) Give the values of $f_{\varepsilon}(0)$ and $f_{\varepsilon}^{\prime}(0)$.
b) Deduce that there exists a real $t_{0}$ belonging to the interval $]0, \alpha[$ satisfying $0 < f_{\varepsilon}\left(t_{0}\right) < 1$.
grandes-ecoles 2017 QII.C.4 MGF of sums of independent random variables (product property)
In subsection II.C, we consider $\varepsilon$ a strictly positive real, $X$ a discrete real random variable taking values in $\left\{x_{p}, p \in \mathbb{N}\right\}$, and $\left(X_{k}\right)_{k \in \mathbb{N}^{*}}$ a sequence of random variables that are mutually independent and have the same distribution as $X$. For every strictly positive integer $n$, we define the random variable $S_{n}$ by $S_{n}=\sum_{k=1}^{n} X_{k}$. We assume that the random variable $X$ admits an exponential moment of order $\alpha$ where $\alpha$ is a strictly positive real. The function $\Psi: t \mapsto \mathbb{E}\left(\mathrm{e}^{tX}\right)$ is defined on $[-\alpha, \alpha]$.
Show that for every real $t$ belonging to the segment $[-\alpha, \alpha]$ and every $n$ belonging to $\mathbb{N}^{*}$, the real random variable $\mathrm{e}^{t S_{n}}$ has expectation equal to $(\Psi(t))^{n}$.
grandes-ecoles 2017 QII.C.5 Concentration inequality via MGF and Markov's inequality (Chernoff method)
In subsection II.C, we consider $\varepsilon$ a strictly positive real, $X$ a discrete real random variable taking values in $\left\{x_{p}, p \in \mathbb{N}\right\}$, and $\left(X_{k}\right)_{k \in \mathbb{N}^{*}}$ a sequence of random variables that are mutually independent and have the same distribution as $X$. For every strictly positive integer $n$, we define the random variable $S_{n}$ by $S_{n}=\sum_{k=1}^{n} X_{k}$. We assume that the random variable $X$ admits an exponential moment of order $\alpha$ where $\alpha$ is a strictly positive real. The function $\Psi: t \mapsto \mathbb{E}\left(\mathrm{e}^{tX}\right)$ is defined on $[-\alpha, \alpha]$, and $f_{\varepsilon}(t) = \mathrm{e}^{-(m+\varepsilon)t}\Psi(t)$, where $m$ denotes the expectation $\mathbb{E}(X)$.
a) Let $t$ be a real belonging to the interval $]0, \alpha]$ and let $n$ belong to $\mathbb{N}^{*}$. Show that $\mathbb{P}\left(\frac{S_{n}}{n} \geqslant m+\varepsilon\right)=\mathbb{P}\left(\mathrm{e}^{t S_{n}} \geqslant\left(\mathrm{e}^{t(m+\varepsilon)}\right)^{n}\right)$, then that $\mathbb{P}\left(\frac{S_{n}}{n} \geqslant m+\varepsilon\right) \leqslant\left(f_{\varepsilon}(t)\right)^{n}$.
b) Deduce that there exists a real $r$ belonging to the interval $]0,1[$ such that $\forall n \in \mathbb{N}^{*}, \mathbb{P}\left(\frac{S_{n}}{n} \geqslant m+\varepsilon\right) \leqslant r^{n}$.
grandes-ecoles 2017 QII.D.1 Existence and domain of the MGF
In subsection II.D, we assume that there exists a strictly positive real number $c$ such that the discrete real random variable $X$ satisfies $\mathbb{E}(X)=0$ and $\forall \omega \in \Omega,|X(\omega)| \leqslant c$.
Show that the random variable $X$ admits an exponential moment of order $\alpha$ for every strictly positive real number $\alpha$.
grandes-ecoles 2017 QII.D.3 Upper bound on MGF (sub-Gaussian or exponential inequalities)
In subsection II.D, we assume that there exists a strictly positive real number $c$ such that the discrete real random variable $X$ satisfies $\mathbb{E}(X)=0$ and $\forall \omega \in \Omega,|X(\omega)| \leqslant c$.
a) Show that $\mathbb{E}\left(\mathrm{e}^{X}\right) \leqslant \cosh(c)$.
b) Deduce that $\forall t \in \mathbb{R}^{+*}, \Psi(t) \leqslant \cosh(ct)$.
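A quick numerical illustration of a) (outside the exam text), using one bounded, centred variable: $X \in \{-1, 1.5\}$ with $P(X=-1)=0.6$ and $P(X=1.5)=0.4$, so that $\mathbb{E}(X)=0$ and $|X| \leqslant c$ with $c = 1.5$:

```python
import math

# One concrete instance of E(e^X) <= cosh(c) for a bounded, centred X.
# Here X in {-1, 1.5} with probabilities 0.6 and 0.4, so E(X) = 0 and c = 1.5.
# (The general inequality follows from convexity of exp on [-c, c].)

c = 1.5
e_exp = 0.6 * math.exp(-1.0) + 0.4 * math.exp(1.5)   # E(e^X)
print(e_exp <= math.cosh(c))   # True
```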
grandes-ecoles 2017 QII.D.4 Upper bound on MGF (sub-Gaussian or exponential inequalities)
In subsection II.D, we assume that there exists a strictly positive real number $c$ such that the discrete real random variable $X$ satisfies $\mathbb{E}(X)=0$ and $\forall \omega \in \Omega,|X(\omega)| \leqslant c$. The functions $\Psi$ and $f_{\varepsilon}$ are defined on $\mathbb{R}$, with $f_{\varepsilon}(t) = \mathrm{e}^{-\varepsilon t}\Psi(t)$ (since $m=0$).
Show that $\forall t \in \mathbb{R}^{+*}, f_{\varepsilon}(t) \leqslant \exp\left(-t\varepsilon+\frac{1}{2}c^{2}t^{2}\right)$.
grandes-ecoles 2017 QII.D.5 Concentration inequality via MGF and Markov's inequality (Chernoff method)
In subsection II.D, we assume that there exists a strictly positive real number $c$ such that the discrete real random variable $X$ satisfies $\mathbb{E}(X)=0$ and $\forall \omega \in \Omega,|X(\omega)| \leqslant c$. For every strictly positive integer $n$, $S_{n}=\sum_{k=1}^{n} X_{k}$ where $\left(X_{k}\right)$ are mutually independent with the same distribution as $X$.
Show that $\forall n \in \mathbb{N}^{*}, \mathbb{P}\left(\left|\frac{S_{n}}{n}\right| \geqslant \varepsilon\right) \leqslant 2 \exp\left(-n \frac{\varepsilon^{2}}{2c^{2}}\right)$.
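The bound can be checked exactly (outside the exam text) when $X$ is uniform on $\{-1,1\}$, so that $c = 1$ and $S_{n} = 2B - n$ with $B \sim \mathcal{B}(n, 1/2)$:

```python
import math

# Exact two-sided tail of S_n / n for Rademacher X versus the Hoeffding-type
# bound 2 exp(-n eps^2 / (2 c^2)), with c = 1.

def tail_exact(n, eps):
    # P(|S_n / n| >= eps) = P(|B - n/2| >= n*eps/2) with B ~ Binomial(n, 1/2)
    return sum(math.comb(n, b) * 0.5**n for b in range(n + 1)
               if abs(b - n / 2) >= n * eps / 2)

n, eps, c = 60, 0.3, 1.0
bound = 2 * math.exp(-n * eps**2 / (2 * c**2))
print(tail_exact(n, eps) <= bound)   # True
```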
grandes-ecoles 2018 QII.1 Upper bound on MGF (sub-Gaussian or exponential inequalities)
Let $k$ be a strictly positive integer and $U_{1}, \ldots, U_{k}$ a sequence of $k$ random variables taking values in $\{-1,1\}$, independent and uniformly distributed. We also denote $$S_{k} = \sum_{i=1}^{k} U_{i}$$
Let $\varphi : \mathbb{R} \rightarrow \mathbb{R}$ be the function defined by $\varphi(\lambda) = \ln\left(\mathbb{E}\left[e^{\lambda U_{1}}\right]\right)$. Establish that $$\forall \lambda \in \mathbb{R}, \quad \varphi(\lambda) \leqslant \frac{\lambda^{2}}{2}.$$
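As a numerical check (not part of the exam text): since $U_{1}$ is uniform on $\{-1,1\}$, the definition gives $\mathbb{E}\left[\mathrm{e}^{\lambda U_{1}}\right] = \cosh(\lambda)$, so the claim amounts to $\ln\cosh(\lambda) \leqslant \lambda^{2}/2$; a grid check:

```python
import math

# phi(lambda) = ln E[e^{lambda U_1}] computed from the definition (U_1 uniform
# on {-1, 1}); the claimed bound phi(lambda) <= lambda^2 / 2 is checked on a grid.

def phi(lam):
    return math.log(0.5 * (math.exp(lam) + math.exp(-lam)))

claim_holds = all(phi(lam) <= lam * lam / 2
                  for lam in [0.05 * i for i in range(-200, 201)])
print(claim_holds)   # True
```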
grandes-ecoles 2018 Q4 Existence and domain of the MGF
We assume that, for every non-zero natural integer $n$, $X$ admits a moment of order $n$ and that the power series $\sum_{n \geqslant 0} m_{n}(X) \frac{t^{n}}{n!}$ has a radius of convergence $R_{X} > 0$. For all $t \in ]-R_{X}, R_{X}[$, we denote $M_{X}(t) = \sum_{n=0}^{+\infty} m_{n}(X) \frac{t^{n}}{n!}$.
Using the results from the preamble, show that, for all $t \in ]-R_{X}, R_{X}[$, the random variable $\mathrm{e}^{tX}$ admits an expectation and that $M_{X}(t) = \mathbb{E}\left(\mathrm{e}^{tX}\right)$.
grandes-ecoles 2018 Q5 Existence and domain of the MGF
We assume that, for every non-zero natural integer $n$, $X$ admits a moment of order $n$ and that the power series $\sum_{n \geqslant 0} m_{n}(X) \frac{t^{n}}{n!}$ has a radius of convergence $R_{X} > 0$. For all $t \in ]-R_{X}, R_{X}[$, we denote $M_{X}(t) = \sum_{n=0}^{+\infty} m_{n}(X) \frac{t^{n}}{n!}$.
Show conversely that, if there exists a real $R > 0$ such that, for all $t \in ]-R, R[$, the random variable $\mathrm{e}^{tX}$ admits an expectation, then the domain of definition of the moment generating function of $X$ contains $]-R, R[$ and, for all $t \in ]-R, R[$, $M_{X}(t) = \mathbb{E}\left(\mathrm{e}^{tX}\right)$.
grandes-ecoles 2018 Q6 MGF of sums of independent random variables (product property)
We assume that $X$ and $Y$ are two independent discrete real-valued random variables with strictly positive values admitting moments of all orders. We denote by $R_{X}$ (respectively $R_{Y}$) the radius of convergence (assumed strictly positive) associated with the function $M_{X}$ (respectively $M_{Y}$).
Show that the random variable $X + Y$ admits moments of all orders and that $$\forall |t| < \min\left(R_{X}, R_{Y}\right), \quad M_{X+Y}(t) = M_{X}(t) \times M_{Y}(t)$$
grandes-ecoles 2018 Q8 Compute MGF or characteristic function for a named distribution
$\lambda$ is a fixed real number. We assume that $Z$ is a random variable on $(\Omega, \mathcal{A}, \mathbb{P})$ following the Poisson distribution with parameter $\lambda$.
Calculate the moment generating function of $Z$. Deduce the values of $m_{1}(Z)$ and $m_{2}(Z)$.
grandes-ecoles 2018 Q9 Compute MGF or characteristic function for a named distribution
Let $n$ be a non-zero natural integer. For $i \in \llbracket 1, n \rrbracket$, $X_{i}$ is a random variable on $(\Omega, \mathcal{A}, \mathbb{P})$ following a Bernoulli distribution with parameter $\lambda / n$. We assume that $X_{1}, X_{2}, \ldots, X_{n}$ are mutually independent and we set $S_{n} = \sum_{i=1}^{n} X_{i}$.
Calculate the moment generating function of the random variable $S_{n}$.
grandes-ecoles 2018 Q11 Compute MGF or characteristic function for a named distribution
Deduce that, for any real $\xi$, $\int_{-\infty}^{+\infty} \exp\left(-x^{2}\right) \exp(-\mathrm{i} 2\pi \xi x) \mathrm{d}x = \sqrt{\pi} \exp\left(-\pi^{2} \xi^{2}\right)$.
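A numerical cross-check of this identity at one frequency (trapezoidal sum; the imaginary part vanishes by parity, leaving the cosine integral):

```python
import math

# Check int exp(-x^2) exp(-i 2 pi xi x) dx = sqrt(pi) exp(-pi^2 xi^2) at xi = 0.3.
# By parity the sine part integrates to 0, so only the cosine part remains.

xi, h = 0.3, 0.01
xs = [i * h for i in range(-800, 801)]   # [-8, 8]; the tails beyond are ~e^{-64}
integral = h * sum(math.exp(-x * x) * math.cos(2 * math.pi * xi * x) for x in xs)
expected = math.sqrt(math.pi) * math.exp(-math.pi**2 * xi**2)
print(abs(integral - expected) < 1e-6)   # True
```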