LFM Stats And Pure


grandes-ecoles 2017 Q12 Probability Inequality and Tail Bound Proof
Let $m$ be a measure. We assume that there exists a constant $C > 0$ such that the inequality $$\int e^{\lambda f(x)} m(x)\,dx \leqslant \exp\left(\lambda \int f(x)\, m(x)\,dx + \frac{C\lambda^2}{4}\right) \tag{3}$$ holds for all $f \in \mathscr{C}_b^1$ with $|f'(x)| \leq 1$. Show that inequality (3) also holds for the function defined by $f(x) = x$. You may use the sequence of functions defined by $f_n(x) = n \arctan\left(\frac{x}{n}\right)$.
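As a sanity check on the suggested approximating sequence (an illustration, not part of the exam), the functions $f_n(x) = n\arctan(x/n)$ are indeed of class $\mathscr{C}^1$ with $|f_n'| \leqslant 1$, and they converge pointwise to $x$; a minimal Python sketch:

```python
import math

def f_n(n, x):
    # f_n(x) = n * arctan(x / n): of class C^1 and bounded by n * pi / 2
    return n * math.atan(x / n)

def f_n_prime(n, x):
    # f_n'(x) = 1 / (1 + (x/n)^2), always in (0, 1]
    return 1.0 / (1.0 + (x / n) ** 2)

# each f_n is 1-Lipschitz, so it satisfies the hypothesis of (3)
assert all(f_n_prime(10, x) <= 1.0 for x in range(-100, 101))

# pointwise convergence f_n(x) -> x as n grows
errors = [abs(f_n(n, 5.0) - 5.0) for n in (1, 10, 100, 1000)]
assert all(b < a for a, b in zip(errors, errors[1:]))
```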
grandes-ecoles 2017 Q13 Probability Inequality and Tail Bound Proof
Let $m$ be a measure. We assume that there exists a constant $C > 0$ such that for $\lambda \geq 0$, $$\int e^{\lambda f(x)} m(x)\,dx \leqslant \exp\left(\lambda \int f(x)\, m(x)\,dx + \frac{C\lambda^2}{4}\right) \tag{3}$$ holds (in particular for $f(x) = x$).
13a. Let $M = \int x\, m(x)\,dx$ and $a \geqslant M$. Show that $$\int_a^{+\infty} m(x)\,dx \leqslant \exp\left(-\frac{(a - M)^2}{C}\right)$$
13b. Conclude that for all $\alpha < \frac{1}{C}$, the function $x \mapsto e^{\alpha x^2} m(x)$ is integrable on $\mathbb{R}$.
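The Gaussian density $m(x) = e^{-x^2}/\sqrt{\pi}$ satisfies (3) with $C = 1$ and $M = 0$ (since $\int e^{\lambda x} m(x)\,dx = e^{\lambda^2/4}$), so 13a predicts $\int_a^{+\infty} m \leqslant e^{-a^2}$ for $a \geqslant 0$. A quick numerical check of that instance (an illustration, not part of the exam):

```python
import math

def m(x):
    # Gaussian density: satisfies (3) with C = 1 and M = 0 (this concrete
    # choice is an assumption made only for this check)
    return math.exp(-x * x) / math.sqrt(math.pi)

def tail(a, upper=10.0, steps=50_000):
    # midpoint-rule approximation of the tail integral of m over [a, upper]
    h = (upper - a) / steps
    return sum(m(a + (i + 0.5) * h) for i in range(steps)) * h

# 13a with C = 1 and M = 0 predicts tail(a) <= exp(-a^2) for a >= 0
for a in (0.0, 0.5, 1.0, 2.0):
    assert tail(a) <= math.exp(-a * a)
```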
grandes-ecoles 2017 Q14 Change of Variable and Integral Evaluation
Let $p, q, r : \mathbb{R} \rightarrow \mathbb{R}_*^+$ be three continuous functions with strictly positive values, integrable on $\mathbb{R}$.
14a. Show that there exists a bijective function $u : \,]0,1[\, \rightarrow \mathbb{R}$ of class $\mathscr{C}^1$ such that $$\forall t \in \,]0,1[\,, \quad u'(t)\, p(u(t)) = \int p(x)\,dx$$ Similarly, there exists an analogous function $v : \,]0,1[\, \rightarrow \mathbb{R}$ for $q$.
14b. We assume that $$\forall x, y \in \mathbb{R}, \quad p(x)\, q(y) \leqslant \left( r\left( \frac{x + y}{2} \right) \right)^2. \tag{4}$$ Show that $$\left( \int p(x)\,dx \right) \left( \int q(x)\,dx \right) \leqslant \left( \int r(x)\,dx \right)^2 \tag{5}$$ You may use, after having justified its validity, the change of variable defined by $x = \frac{u(t) + v(t)}{2}$ in the right-hand side of inequality (5).
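As an illustration (not part of the exam), taking $p = q = r$ to be the Gaussian $e^{-x^2}$ makes hypothesis (4) hold, and (5) then holds with equality (both sides equal $\pi$); a numerical sketch:

```python
import math

p = lambda x: math.exp(-x * x)
q = lambda y: math.exp(-y * y)
r = lambda z: math.exp(-z * z)

def integral(f, lo=-10.0, hi=10.0, steps=50_000):
    # midpoint-rule approximation of the integral of f over [lo, hi]
    h = (hi - lo) / steps
    return sum(f(lo + (i + 0.5) * h) for i in range(steps)) * h

# hypothesis (4): p(x) q(y) <= r((x+y)/2)^2, checked on a grid
grid = [i / 10 for i in range(-30, 31)]
assert all(p(x) * q(y) <= r((x + y) / 2) ** 2 + 1e-12 for x in grid for y in grid)

# conclusion (5): here both sides equal pi, so (5) holds with equality
lhs = integral(p) * integral(q)
rhs = integral(r) ** 2
assert lhs <= rhs + 1e-9
assert abs(rhs - math.pi) < 1e-4
```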
grandes-ecoles 2017 Q15 Probability Inequality and Tail Bound Proof
We recall that $\mu(x) = \frac{1}{\sqrt{\pi}} e^{-x^2}$ is a measure, and that for $A \in \operatorname{Int}$, $\mu(A) = \int \mathbb{1}_A(x)\, \mu(x)\, dx$. We denote $d(x, A) = \inf\{|x - y| : y \in A\}$. Let $A \subset \mathbb{R}$.
15a. Show that for all $x, y \in \mathbb{R}$, we have $$\exp\left( \frac{1}{2} d(x, A)^2 - x^2 \right) \mathbb{1}_A(y) \exp\left( -y^2 \right) \leqslant \exp\left( -\frac{(x + y)^2}{2} \right)$$
15b. We assume that $A \in \operatorname{Int}$ and that $\mu(A) > 0$. Deduce that $$\int \exp\left( \frac{1}{2} d(x, A)^2 \right) \mu(x)\, dx \leqslant \frac{1}{\mu(A)}$$
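For a concrete instance of 15b (an illustration, not from the exam): with the test set $A = [0, +\infty)$ one has $\mu(A) = \tfrac12$ and $\int \exp(\tfrac12 d(x,A)^2)\,\mu(x)\,dx = \tfrac12 + \tfrac{1}{\sqrt{2}} \approx 1.207 \leqslant 2$. A numerical check:

```python
import math

def mu(x):
    # mu(x) = e^{-x^2} / sqrt(pi)
    return math.exp(-x * x) / math.sqrt(math.pi)

def d(x):
    # distance from x to the test set A = [0, +inf) (a concrete choice)
    return max(0.0, -x)

def integral(f, lo=-10.0, hi=10.0, steps=50_000):
    # midpoint-rule approximation of the integral of f over [lo, hi]
    h = (hi - lo) / steps
    return sum(f(lo + (i + 0.5) * h) for i in range(steps)) * h

mu_A = integral(lambda x: mu(x) if x >= 0 else 0.0)  # mu(A) = 1/2
lhs = integral(lambda x: math.exp(0.5 * d(x) ** 2) * mu(x))

# 15b: lhs = 1/2 + 1/sqrt(2) ~ 1.207, well below 1 / mu(A) = 2
assert abs(mu_A - 0.5) < 1e-4
assert lhs <= 1.0 / mu_A
```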
grandes-ecoles 2017 Q16 Probability Inequality and Tail Bound Proof
We recall that $\mu(x) = \frac{1}{\sqrt{\pi}} e^{-x^2}$ is a measure, and that for $A \in \operatorname{Int}$, $\mu(A) = \int \mathbb{1}_A(x)\, \mu(x)\, dx$. Let $A \in \operatorname{Int}$. For $t \geqslant 0$, we define the set $A_t = \{x \in \mathbb{R} : d(x, A) \leqslant t\}$.
16a. Show that $A_t \in \operatorname{Int}$ for all $t \geqslant 0$.
16b. We further assume that $\mu(A) > 0$. Show that for all $t \geqslant 0$, we have $$1 - \mu(A_t) \leqslant \frac{e^{-t^2/2}}{\mu(A)}$$
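With the test set $A = \,]-\infty, 0]$ (so that $A_t = \,]-\infty, t]$ and $\mu(A) = \tfrac12$; an illustration, not from the exam), 16b can be checked numerically:

```python
import math

def mu(x):
    # mu(x) = e^{-x^2} / sqrt(pi)
    return math.exp(-x * x) / math.sqrt(math.pi)

def tail(t, hi=10.0, steps=50_000):
    # mu((t, +inf)) = 1 - mu(A_t) for A = ]-inf, 0], so that A_t = ]-inf, t]
    h = (hi - t) / steps
    return sum(mu(t + (i + 0.5) * h) for i in range(steps)) * h

mu_A = 0.5  # mu(]-inf, 0]) by symmetry of mu
# 16b: 1 - mu(A_t) <= exp(-t^2 / 2) / mu(A)
for t in (0.0, 0.5, 1.0, 2.0, 3.0):
    assert tail(t) <= math.exp(-t * t / 2) / mu_A
```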
grandes-ecoles 2018 QII.2 Probability Inequality and Tail Bound Proof
Let $k$ be a strictly positive integer and $U_{1}, \ldots, U_{k}$ a sequence of $k$ random variables taking values in $\{-1,1\}$, independent and uniformly distributed. We also denote $$S_{k} = \sum_{i=1}^{k} U_{i}$$ Let $\varphi(\lambda) = \ln\left(\mathbb{E}\left[e^{\lambda U_{1}}\right]\right)$.
Let $t \in \mathbb{R}$. Show that for all $\lambda > 0$, we have the inequality $$\mathbb{P}\left(S_{k} \geqslant t\right) \leqslant \exp(k\varphi(\lambda) - \lambda t).$$
grandes-ecoles 2018 QII.3 Probability Inequality and Tail Bound Proof
Let $k$ be a strictly positive integer and $U_{1}, \ldots, U_{k}$ a sequence of $k$ random variables taking values in $\{-1,1\}$, independent and uniformly distributed. We also denote $$S_{k} = \sum_{i=1}^{k} U_{i}$$
Deduce Hoeffding's inequality for $S_{k}$: for all $t > 0$, we have $$\mathbb{P}\left(S_{k} \geqslant t\right) \leqslant \exp\left(-\frac{t^{2}}{2k}\right).$$
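Since $S_k = 2B - k$ with $B \sim \mathrm{Binomial}(k, \tfrac12)$, the tail $\mathbb{P}(S_k \geqslant t)$ can be computed exactly and compared with the Hoeffding bound (an illustrative check, not part of the exam):

```python
import math

def p_tail(k, t):
    # exact P(S_k >= t): S_k = 2B - k with B ~ Binomial(k, 1/2)
    return sum(math.comb(k, b) for b in range(k + 1) if 2 * b - k >= t) / 2 ** k

# Hoeffding: P(S_k >= t) <= exp(-t^2 / (2k))
for k in (5, 10, 50):
    for t in (1, 2, 5, k):
        assert p_tail(k, t) <= math.exp(-t * t / (2 * k))
```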
grandes-ecoles 2018 QII.5 Probability Inequality and Tail Bound Proof
We introduce a uniformly distributed random variable $C : \Omega \rightarrow \mathcal{M}_{n}(\{-1,1\})$. For $\omega \in \Omega$, we denote by $C_{i,j}(\omega)$ the coefficients of the matrix $C(\omega)$.
Show that for all $t \geqslant 0$, we have $$\mathbb{P}\left(M(C) \geqslant t n^{3/2}\right) \leqslant \exp\left(-\left(\frac{t^{2}}{2} - 2\ln 2\right)n\right).$$
grandes-ecoles 2018 QII.6 Probability Inequality and Tail Bound Proof
We recall the notation $\underline{M}(n) = \min\left\{M(A) \mid A \in \mathcal{M}_{n}(\{-1,1\})\right\}$. Show that for all $n \geqslant 1$, we have $$\underline{M}(n) \leqslant 2\sqrt{\ln 2}\, n^{3/2}.$$
Hint: one may begin by showing that for all $\varepsilon > 0$, there exists a matrix $A$ in $\mathcal{M}_{n}(\{-1,1\})$ such that $$M(A) \leqslant (2\sqrt{\ln 2} + \varepsilon)\, n^{3/2}.$$
grandes-ecoles 2018 QV.2 Probability Inequality and Tail Bound Proof
We fix $A \in \mathcal{M}_{n}(\{-1,1\})$ and denote $$m(A) := \min(S(A) \cap \mathbb{N}).$$
By drawing inspiration from the previous question and the methods developed in Parts II and III, show that we also have $$m(A) \leqslant \sqrt{2n \ln(2n)}.$$
grandes-ecoles 2018 Q1 Integrability, Boundedness, and Regularity of Density/Distribution-Related Functions
Show that $g_{\sigma}$ is integrable on $\mathbb{R}$.
grandes-ecoles 2018 Q2 Change of Variable and Integral Evaluation
Assuming that $\int_{-\infty}^{+\infty} \exp\left(-x^{2}\right) \mathrm{d}x = \sqrt{\pi}$, give the value of $\int_{-\infty}^{+\infty} g_{\sigma}(x) \mathrm{d}x$.
grandes-ecoles 2018 Q2 Integrability, Boundedness, and Regularity of Density/Distribution-Related Functions
We denote by $\mathbb{R}_{N}[X]$ the vector space of polynomials with real coefficients of degree at most $N$. We define $A_{N}$ as the set of $P \in \mathbb{R}_{N}[X]$ such that $P(-1) = P(1) = 1$ and $P(x) \geqslant 0$ for all $x$ in the interval $[-1,1]$. We define on $\mathbb{R}_{N}[X]$ a linear form $L$ by $L(P) = \int_{-1}^{1} P(x)\,dx$. The infimum of $L$ on $A_N$ is denoted $a_N = \inf\{L(P) \mid P \in A_N\}$.
(a) Show that the infimum of $L$ on $A_{N}$ is attained.
In what follows, we denote by $B_{N}$ the set of $P \in A_{N}$ such that $L(P) = a_{N}$.
(b) Show that $B_{N}$ is a compact convex subset of $\mathbb{R}_{N}[X]$.
(c) Verify that $B_{N}$ contains an even polynomial.
grandes-ecoles 2018 Q4 Integrability, Boundedness, and Regularity of Density/Distribution-Related Functions
Let $f$ be a function from $\mathbb{R}$ to $\mathbb{C}$, continuous and integrable on $\mathbb{R}$. Show that, for any real $\xi$, the function $x \mapsto f(x) \exp(-\mathrm{i} 2\pi \xi x)$ from $\mathbb{R}$ to $\mathbb{C}$ is integrable on $\mathbb{R}$.
grandes-ecoles 2018 Q6 Expectation and Moment Inequality Proof
Let $p$ and $q$ be two strictly positive reals such that $\frac{1}{p} + \frac{1}{q} = 1$. Deduce that if $X$ and $Y$ are two real-valued random variables on the finite probability space $(\Omega, \mathcal{A}, \mathbb{P})$ then
$$\mathbb{E}(|XY|) \leqslant \mathbb{E}(|X|^{p})^{1/p} \mathbb{E}(|Y|^{q})^{1/q}$$
You may first show this result when $\mathbb{E}(|X|^{p}) = \mathbb{E}(|Y|^{q}) = 1$.
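Hölder's inequality above can be sanity-checked on a concrete finite probability space, here the uniform measure on $N$ sampled points (an illustration; the sample and the exponents $p = 3$, $q = 3/2$ are arbitrary choices):

```python
import random

random.seed(0)

# a concrete finite probability space: uniform measure on N sample points
# (the sample, and the exponents below, are arbitrary illustrative choices)
N = 1000
X = [random.gauss(0, 1) for _ in range(N)]
Y = [random.gauss(0, 1) for _ in range(N)]

def E(values):
    # expectation under the uniform measure
    return sum(values) / len(values)

p, q = 3.0, 1.5  # conjugate exponents: 1/3 + 2/3 = 1
lhs = E([abs(x * y) for x, y in zip(X, Y)])
rhs = E([abs(x) ** p for x in X]) ** (1 / p) * E([abs(y) ** q for y in Y]) ** (1 / q)
assert lhs <= rhs
```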
grandes-ecoles 2018 Q10 Probability Inequality and Tail Bound Proof
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative real $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Let $\delta$ be a real such that $0 \leqslant |\delta| \leqslant \sqrt{\frac{a}{b}}$. Justify that, for all real $t$,
$$\mathbb{P}(|X + \delta| \geqslant t) \leqslant \mathbb{P}(|X| \geqslant t - |\delta|)$$
grandes-ecoles 2018 Q12 Probability Inequality and Tail Bound Proof
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative real $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Let $\delta$ be a real such that $0 \leqslant |\delta| \leqslant \sqrt{\frac{a}{b}}$. Deduce that for all real $t$ such that $t \geqslant |\delta|$ we have
$$\mathbb{P}(|X + \delta| \geqslant t) \leqslant a \exp(a) \exp\left(-\frac{1}{2}bt^{2}\right)$$
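A concrete instance (an illustration only): a standard Gaussian $Z$ satisfies the hypothesis with $a = 2$, $b = \tfrac12$, since $\mathbb{P}(|Z| \geqslant t) = \operatorname{erfc}(t/\sqrt{2}) \leqslant 2e^{-t^2/2}$; taking $\delta = 1 \leqslant \sqrt{a/b} = 2$, the deduced bound can be checked numerically for $t \geqslant |\delta|$:

```python
import math

def gauss_tail(t):
    # P(Z >= t) for Z ~ N(0, 1)
    return 0.5 * math.erfc(t / math.sqrt(2))

# hypothesis: P(|Z| >= t) <= a exp(-b t^2) with a = 2, b = 1/2
a, b = 2.0, 0.5
for t in (0.0, 0.5, 1.0, 2.0, 4.0):
    assert 2 * gauss_tail(t) <= a * math.exp(-b * t * t)

# conclusion for the shifted variable Z + delta, with delta = 1 <= sqrt(a/b) = 2
delta = 1.0
for t in (1.0, 2.0, 4.0, 6.0):  # the question assumes t >= |delta|
    shifted = gauss_tail(t - delta) + gauss_tail(t + delta)  # P(|Z + delta| >= t)
    assert shifted <= a * math.exp(a) * math.exp(-0.5 * b * t * t)
```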
grandes-ecoles 2018 Q13 Probability Inequality and Tail Bound Proof
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative real $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Let $\delta$ be a real such that $0 \leqslant |\delta| \leqslant \sqrt{\frac{a}{b}}$. Justify that the inequality
$$\mathbb{P}(|X + \delta| \geqslant t) \leqslant a \exp(a) \exp\left(-\frac{1}{2}bt^{2}\right)$$
remains valid if $0 \leqslant t < |\delta|$.
grandes-ecoles 2018 Q14 Probability Inequality and Tail Bound Proof
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. The objective of this part is to show, for any non-empty closed convex set $C$ of $E$,
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right) \leqslant 1 \tag{II.1}$$
Handle the case where $C$ is a closed convex set of $E$ that does not meet $X(\Omega)$.
grandes-ecoles 2018 Q16 Probability Inequality and Tail Bound Proof
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. We assume that $C$ is a closed convex set of $E$ that meets $X(\Omega)$ in a single vector $u$. Deduce the expectation of $\exp\left(\frac{1}{8}d(X, u)^{2}\right)$ and show that it is less than or equal to $2^{n}$.
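For a brute-force check of small cases (an illustration, not part of the exam): when $u \in \{-1,1\}^n$ and $d$ is the Euclidean distance, $d(X, u)^2 = 4B$ with $B \sim \mathrm{Binomial}(n, \tfrac12)$, which gives the closed form $\left(\frac{1 + e^{1/2}}{2}\right)^n \leqslant 2^n$:

```python
import math
from itertools import product

def expectation(n, u):
    # E[exp(d(X, u)^2 / 8)] by enumerating the 2^n equally likely sign vectors
    # (d is the Euclidean distance; u is a fixed vector of X(Omega))
    total = 0.0
    for eps in product((-1, 1), repeat=n):
        d2 = sum((e - ui) ** 2 for e, ui in zip(eps, u))
        total += math.exp(d2 / 8)
    return total / 2 ** n

for n in (1, 2, 3, 6):
    u = tuple(1 for _ in range(n))
    value = expectation(n, u)
    # closed form ((1 + e^{1/2}) / 2)^n, which is indeed <= 2^n
    assert abs(value - ((1 + math.exp(0.5)) / 2) ** n) < 1e-9
    assert value <= 2 ** n
```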