Probability Inequality and Tail Bound Proof

Prove an inequality bounding the probability that a random variable exceeds a threshold, typically via exponential moment (Chernoff-type) or concentration arguments.

grandes-ecoles 2017 Q12
Let $m$ be a measure. We assume that there exists a constant $C > 0$ such that the inequality $$\int e ^ { \lambda f ( x ) } m ( x ) d x \leqslant \exp \left( \lambda \int f ( x ) m ( x ) d x + \frac { C \lambda ^ { 2 } } { 4 } \right) \tag{3}$$ holds for all $f \in \mathscr{C}_b^1$ with $|f'(x)| \leqslant 1$ for all $x$. Show that inequality (3) also holds for the function defined by $f ( x ) = x$. You may use the sequence of functions defined by $f _ { n } ( x ) = n \arctan \left( \frac { x } { n } \right)$.
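As a quick sanity check on the suggested sequence (a numerical sketch; the grid, tolerances, and values of $n$ are illustrative choices): each $f_n$ is bounded by $n\pi/2$, its derivative $f_n'(x) = \frac{1}{1+(x/n)^2}$ lies in $(0,1]$, and $f_n(x) \to x$ pointwise.

```python
import math

def f(n, x):
    # f_n(x) = n * arctan(x / n): bounded by n*pi/2, so f_n lies in C_b^1
    return n * math.atan(x / n)

def f_prime(n, x):
    # f_n'(x) = 1 / (1 + (x/n)^2), which lies in (0, 1]
    return 1.0 / (1.0 + (x / n) ** 2)

xs = [i / 10 for i in range(-100, 101)]                  # grid on [-10, 10]
assert all(abs(f(5, x)) <= 5 * math.pi / 2 for x in xs)  # boundedness
assert all(0 < f_prime(5, x) <= 1 for x in xs)           # |f_n'| <= 1
# pointwise convergence f_n(x) -> x as n grows (error is about x^3 / (3 n^2))
assert all(abs(f(10**5, x) - x) < 1e-6 for x in xs)
```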
grandes-ecoles 2017 Q13
Let $m$ be a measure. We assume that there exists a constant $C > 0$ such that for all $\lambda \geqslant 0$, the inequality $$\int e ^ { \lambda f ( x ) } m ( x ) d x \leqslant \exp \left( \lambda \int f ( x ) m ( x ) d x + \frac { C \lambda ^ { 2 } } { 4 } \right) \tag{3}$$ holds (in particular for $f(x) = x$).
13a. Let $M = \int x m ( x ) d x$ and $a \geqslant M$. Show that $$\int _ { a } ^ { + \infty } m ( x ) d x \leqslant \exp \left( - \frac { ( a - M ) ^ { 2 } } { C } \right)$$
13b. Conclude that for all $\alpha < \frac { 1 } { C }$, the function $x \mapsto e ^ { \alpha x ^ { 2 } } m ( x )$ is integrable on $\mathbb { R }$.
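The bounds of Q13 can be illustrated on a concrete measure (our choice, not imposed by the problem): for $m(x) = \frac{1}{\sqrt{\pi}} e^{-x^2}$ one has $\int e^{\lambda x} m(x)\,dx = e^{\lambda^2/4}$, so (3) holds for $f(x) = x$ with $C = 1$ and $M = 0$; 13a then reads $\int_a^{+\infty} m(x)\,dx \leqslant e^{-a^2}$, and the integral in 13b has the closed form $1/\sqrt{1-\alpha}$ for $\alpha < 1$.

```python
import math

# Assumed concrete case: m(x) = exp(-x^2)/sqrt(pi), so C = 1 and M = 0.
def tail(a):
    # exact Gaussian tail: ∫_a^∞ exp(-x^2)/sqrt(pi) dx = (1 - erf(a)) / 2
    return (1.0 - math.erf(a)) / 2.0

# 13a with C = 1, M = 0: tail(a) <= exp(-a^2) for a >= M = 0
for a in [0.0, 0.5, 1.0, 2.0, 3.0]:
    assert tail(a) <= math.exp(-a * a)

# 13b with C = 1: for alpha < 1, ∫ e^{αx²} m(x) dx = 1/sqrt(1-α) < ∞
# (the integrand is again a Gaussian); Riemann-sum check on [-12, 12]:
alpha, h = 0.5, 0.001
grid = [-12 + i * h for i in range(24001)]
integral = h * sum(math.exp((alpha - 1.0) * x * x) / math.sqrt(math.pi) for x in grid)
assert abs(integral - 1.0 / math.sqrt(1.0 - alpha)) < 1e-3
```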
grandes-ecoles 2017 Q15
We recall that $\mu ( x ) = \frac { 1 } { \sqrt { \pi } } e ^ { - x ^ { 2 } }$ is a measure, and that for $A \in \operatorname{Int}$, $\mu(A) = \int \mathbb{1}_A(x) \mu(x) dx$. We denote $d(x, A) = \inf\{|x - y| : y \in A\}$. Let $A \subset \mathbb { R }$.
15a. Show that for all $x , y \in \mathbb { R }$, we have $$\exp \left( \frac { 1 } { 2 } d ( x , A ) ^ { 2 } - x ^ { 2 } \right) \mathbb { 1 } _ { A } ( y ) \exp \left( - y ^ { 2 } \right) \leqslant \exp \left( - \frac { ( x + y ) ^ { 2 } } { 2 } \right)$$
15b. We assume that $A \in \operatorname { Int }$ and that $\mu ( A ) > 0$. Deduce that $$\int \exp \left( \frac { 1 } { 2 } d ( x , A ) ^ { 2 } \right) \mu ( x ) d x \leqslant \frac { 1 } { \mu ( A ) }$$
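For a concrete illustration of 15b (with the assumed choice $A = [0, +\infty)$, so $\mu(A) = \tfrac12$ and $d(x, A) = \max(-x, 0)$), the left-hand side has the closed form $\tfrac{1}{\sqrt{2}} + \tfrac12 \approx 1.207$, comfortably below $1/\mu(A) = 2$:

```python
import math

# Assumed concrete case: A = [0, ∞), so μ(A) = 1/2 and d(x, A) = max(-x, 0).
def integrand(x):
    d = max(-x, 0.0)
    return math.exp(0.5 * d * d) * math.exp(-x * x) / math.sqrt(math.pi)

# Riemann-sum approximation of the left-hand side on [-12, 12]
h = 0.001
grid = [-12 + i * h for i in range(24001)]
lhs = h * sum(integrand(x) for x in grid)

# closed form: ∫_{-∞}^0 e^{-x²/2}/√π dx + 1/2 = 1/√2 + 1/2
assert abs(lhs - (1.0 / math.sqrt(2.0) + 0.5)) < 1e-3
assert lhs <= 1.0 / 0.5  # the claimed bound 1/μ(A) = 2
```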
grandes-ecoles 2017 Q16
We recall that $\mu ( x ) = \frac { 1 } { \sqrt { \pi } } e ^ { - x ^ { 2 } }$ is a measure, and that for $A \in \operatorname{Int}$, $\mu(A) = \int \mathbb{1}_A(x) \mu(x) dx$. Let $A \in \operatorname{Int}$. For $t \geqslant 0$, we define the set $A _ { t } = \{ x \in \mathbb { R } : d ( x , A ) \leqslant t \}$.
16a. Show that $A _ { t } \in \operatorname { Int }$ for all $t \geqslant 0$.
16b. We further assume that $\mu ( A ) > 0$. Show that for all $t \geqslant 0$, we have $$1 - \mu \left( A _ { t } \right) \leqslant \frac { e ^ { - t ^ { 2 } / 2 } } { \mu ( A ) }$$
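A numerical illustration of 16b with the assumed half-line $A = (-\infty, 0]$: then $\mu(A) = \tfrac12$, $A_t = (-\infty, t]$, and $1 - \mu(A_t) = \tfrac12(1 - \operatorname{erf}(t))$ exactly:

```python
import math

# Assumed concrete case: A = (-∞, 0], so μ(A) = 1/2 and A_t = (-∞, t].
for t in [0.0, 0.5, 1.0, 2.0, 4.0]:
    tail = (1.0 - math.erf(t)) / 2.0          # exact value of 1 - μ(A_t)
    assert tail <= math.exp(-t * t / 2.0) / 0.5  # bound e^{-t²/2} / μ(A)
```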
grandes-ecoles 2018 QII.2
Let $k$ be a strictly positive integer and $U_{1}, \ldots, U_{k}$ a sequence of $k$ random variables taking values in $\{-1,1\}$, independent and uniformly distributed. We also denote $$S_{k} = \sum_{i=1}^{k} U_{i}$$ Let $\varphi(\lambda) = \ln\left(\mathbb{E}\left[e^{\lambda U_{1}}\right]\right)$.
Let $t \in \mathbb{R}$. Show that for all $\lambda > 0$, we have the inequality $$\mathbb{P}\left(S_{k} \geqslant t\right) \leqslant \exp(k\varphi(\lambda) - \lambda t).$$
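Here $\varphi(\lambda) = \ln\cosh(\lambda)$, since $U_1$ is uniform on $\{-1, 1\}$. The claimed Chernoff bound can be checked by exact enumeration for a small $k$ (the values of $k$, $t$, $\lambda$ below are illustrative):

```python
import math

# Exact P(S_k >= t): S_k = 2*(number of +1's) - k, each sign pattern
# having probability 2^{-k}.
def p_tail(k, t):
    return sum(math.comb(k, j) for j in range(k + 1) if 2 * j - k >= t) / 2 ** k

k = 10
for t in [0, 2, 4, 6]:
    for lam in [0.1, 0.3, 0.5, 1.0]:
        # phi(lambda) = ln E[e^{lambda U_1}] = ln cosh(lambda)
        bound = math.exp(k * math.log(math.cosh(lam)) - lam * t)
        assert p_tail(k, t) <= bound
```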
grandes-ecoles 2018 QII.3
Let $k$ be a strictly positive integer and $U_{1}, \ldots, U_{k}$ a sequence of $k$ random variables taking values in $\{-1,1\}$, independent and uniformly distributed. We also denote $$S_{k} = \sum_{i=1}^{k} U_{i}$$
Deduce Hoeffding's inequality for $S_{k}$: for all $t > 0$, we have $$\mathbb{P}\left(S_{k} \geqslant t\right) \leqslant \exp\left(-\frac{t^{2}}{2k}\right).$$
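Hoeffding's bound can be verified exactly for small $k$, since $S_k = 2B - k$ with $B$ binomial$(k, \tfrac12)$ (an exhaustive numerical check, not a proof; the tested values are illustrative):

```python
import math

# Exact P(S_k >= t) by enumerating the binomial count of +1's.
def p_tail(k, t):
    return sum(math.comb(k, j) for j in range(k + 1) if 2 * j - k >= t) / 2 ** k

for k in [5, 10, 20]:
    for t in [1, 2, 4, k // 2, k]:
        assert p_tail(k, t) <= math.exp(-t * t / (2 * k))
```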
grandes-ecoles 2018 QII.5
We introduce a uniformly distributed random variable $C : \Omega \rightarrow \mathcal{M}_{n}(\{-1,1\})$. For $\omega \in \Omega$, we denote by $C_{i,j}(\omega)$ the coefficients of the matrix $C(\omega)$.
Show that for all $t \geqslant 0$, we have $$\mathbb{P}\left(M(C) \geqslant t n^{3/2}\right) \leqslant \exp\left(-\left(\frac{t^{2}}{2} - 2\ln 2\right)n\right).$$
grandes-ecoles 2018 QII.6
We recall the notation $\underline{M}(n) = \min\left\{M(A) \mid A \in \mathcal{M}_{n}(\{-1,1\})\right\}$. Show that for all $n \geqslant 1$, we have $$\underline{M}(n) \leqslant 2\sqrt{\ln 2}\, n^{3/2}.$$
Hint: one may begin by showing that for all $\varepsilon > 0$, there exists a matrix $A$ in $\mathcal{M}_{n}(\{-1,1\})$ such that $$M(A) \leqslant (2\sqrt{\ln 2} + \varepsilon)\, n^{3/2}.$$
grandes-ecoles 2018 QV.2
We fix $A \in \mathcal{M}_{n}(\{-1,1\})$ and denote $$m(A) := \min(S(A) \cap \mathbb{N}).$$
By drawing inspiration from the previous question and the methods developed in Parts II and III, show that we also have $$m(A) \leqslant \sqrt{2n \ln(2n)}.$$
grandes-ecoles 2018 Q10
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative real $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Let $\delta$ be a real such that $0 \leqslant |\delta| \leqslant \sqrt{\frac{a}{b}}$. Justify that, for all real $t$,
$$\mathbb{P}(|X + \delta| \geqslant t) \leqslant \mathbb{P}(|X| \geqslant t - |\delta|)$$
grandes-ecoles 2018 Q12
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative real $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Let $\delta$ be a real such that $0 \leqslant |\delta| \leqslant \sqrt{\frac{a}{b}}$. Deduce that, for all real $t$ such that $t \geqslant |\delta|$, we have
$$\mathbb{P}(|X + \delta| \geqslant t) \leqslant a \exp(a) \exp\left(-\frac{1}{2}bt^{2}\right)$$
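The deduction rests on a scalar estimate: combining Q10 with the tail assumption gives $\mathbb{P}(|X+\delta| \geqslant t) \leqslant a e^{-b(t-|\delta|)^2}$, and $\frac{b t^2}{2} - b(t - |\delta|)^2$ attains its maximum $b\delta^2 \leqslant a$ at $t = 2|\delta|$. A grid check (the parameter values are illustrative):

```python
import math

# Check that a*e^{-b(t-|δ|)²} <= a*e^{a}*e^{-bt²/2} for t >= |δ|,
# i.e. b t²/2 - b(t-|δ|)² <= a, whose maximum over t is b δ² <= a.
for a, b in [(1.0, 1.0), (2.0, 0.5), (0.3, 4.0)]:
    delta = math.sqrt(a / b)          # the extreme admissible |δ|
    for i in range(1000):
        t = delta + i * 0.01          # grid of t >= |δ|
        lhs = a * math.exp(-b * (t - delta) ** 2)
        rhs = a * math.exp(a) * math.exp(-b * t * t / 2.0)
        assert lhs <= rhs * (1 + 1e-12)   # equality occurs at t = 2|δ|
```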
grandes-ecoles 2018 Q13
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative real $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Let $\delta$ be a real such that $0 \leqslant |\delta| \leqslant \sqrt{\frac{a}{b}}$. Justify that the inequality
$$\mathbb{P}(|X + \delta| \geqslant t) \leqslant a \exp(a) \exp\left(-\frac{1}{2}bt^{2}\right)$$
remains valid if $0 \leqslant t < |\delta|$.
grandes-ecoles 2018 Q14
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. The objective of this part is to show, for any non-empty closed convex set $C$ of $E$,
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right) \leqslant 1 \tag{II.1}$$
Handle the case where $C$ is a closed convex set of $E$ that does not meet $X(\Omega)$.
grandes-ecoles 2018 Q16
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. We assume that $C$ is a closed convex set of $E$ that meets $X(\Omega)$ in a single vector $u$. Compute the expectation of $\exp\left(\frac{1}{8}d(X, u)^{2}\right)$ and show that it is at most $2^{n}$.
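Since $u \in X(\Omega)$ has $\pm 1$ coordinates, $d(X, u)^2 = \sum_i (\varepsilon_i - u_i)^2$ with each term $0$ or $4$, and independence gives $\mathbb{E}\left(\exp\left(\tfrac18 d(X,u)^2\right)\right) = \left(\frac{1 + e^{1/2}}{2}\right)^n \leqslant 2^n$. An exhaustive check for a small $n$ (the choices $n = 6$ and $u = (1, \dots, 1)$ are illustrative; by symmetry any sign vector $u$ gives the same value):

```python
import math
from itertools import product

n = 6
u = [1] * n
vals = []
for eps in product([-1, 1], repeat=n):
    # d(X, u)^2 = sum of (eps_i - u_i)^2, each term is 0 or 4
    d2 = sum((e - ui) ** 2 for e, ui in zip(eps, u))
    vals.append(math.exp(d2 / 8.0))
expectation = sum(vals) / 2 ** n     # uniform law on the 2^n sign patterns

assert abs(expectation - ((1 + math.exp(0.5)) / 2) ** n) < 1e-9
assert expectation <= 2 ** n
```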
grandes-ecoles 2018 Q17
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. We assume that $C$ is a closed convex set of $E$ that meets $X(\Omega)$ in a single vector $u$. Justify that $d(X, C) \leqslant d(X, u)$ and deduce inequality
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right) \leqslant 1$$
in this case.
grandes-ecoles 2018 Q18
We propose to prove inequality
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right) \leqslant 1 \tag{II.1}$$
by induction on the dimension $n$ of $E$. We assume that $C$ is a closed convex set of $E$ such that $C \cap X(\Omega)$ contains at least two elements. Handle the case $n = 1$.
grandes-ecoles 2018 Q24
We denote
$$p_{+} = \mathbb{P}(X' \in C_{+1}) \quad \text{and} \quad p_{-} = \mathbb{P}(X' \in C_{-1})$$
We assume, without loss of generality, that $p_{+} \geqslant p_{-}$. Show that $p_{-} > 0$.
grandes-ecoles 2018 Q25
We denote $p_{+} = \mathbb{P}(X' \in C_{+1})$ and $p_{-} = \mathbb{P}(X' \in C_{-1})$, with $p_{+} \geqslant p_{-}$. Show that for all $\lambda$ in $[0, 1]$
$$\mathbb{E}\left(\left.\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right\rvert \, \varepsilon_{n} = -1\right) \leqslant \exp\left(\frac{\lambda^{2}}{2}\right) \mathbb{E}\left(\left(\exp\left(\frac{1}{8}d(X', C_{-1})^{2}\right)\right)^{1-\lambda} \cdot \left(\exp\left(\frac{1}{8}d(X', C_{+1})^{2}\right)\right)^{\lambda}\right)$$
grandes-ecoles 2018 Q26
We denote $p_{+} = \mathbb{P}(X' \in C_{+1})$ and $p_{-} = \mathbb{P}(X' \in C_{-1})$, with $p_{+} \geqslant p_{-}$. Deduce that
$$\mathbb{E}\left(\left.\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right\rvert \, \varepsilon_{n} = -1\right) \leqslant \exp\left(\frac{\lambda^{2}}{2}\right) \left(\mathbb{E}\left(\exp\left(\frac{1}{8}d(X', C_{-1})^{2}\right)\right)\right)^{1-\lambda} \cdot \left(\mathbb{E}\left(\exp\left(\frac{1}{8}d(X', C_{+1})^{2}\right)\right)\right)^{\lambda}$$
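The step from Q25 to Q26 is Hölder's inequality with exponents $\frac{1}{1-\lambda}$ and $\frac{1}{\lambda}$: $\mathbb{E}\left(Y^{1-\lambda} Z^{\lambda}\right) \leqslant (\mathbb{E} Y)^{1-\lambda} (\mathbb{E} Z)^{\lambda}$ for nonnegative $Y$ and $Z$. A numerical illustration on an assumed four-point uniform probability space:

```python
# Hölder check: E[Y^{1-λ} Z^{λ}] <= (E[Y])^{1-λ} (E[Z])^{λ} for Y, Z >= 0.
Y = [1.0, 3.0, 0.5, 2.0]    # illustrative values on a uniform 4-point space
Z = [2.0, 1.0, 4.0, 0.25]
for lam in [0.0, 0.25, 0.5, 0.75, 1.0]:
    lhs = sum(y ** (1 - lam) * z ** lam for y, z in zip(Y, Z)) / 4
    rhs = (sum(Y) / 4) ** (1 - lam) * (sum(Z) / 4) ** lam
    assert lhs <= rhs + 1e-12   # equality at λ = 0 and λ = 1
```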
grandes-ecoles 2018 Q27
We denote $p_{+} = \mathbb{P}(X' \in C_{+1})$ and $p_{-} = \mathbb{P}(X' \in C_{-1})$, with $p_{+} \geqslant p_{-}$. Using the induction hypothesis, justify that
$$\mathbb{E}\left(\left.\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right\rvert \, \varepsilon_{n} = 1\right) \leqslant \frac{1}{p_{+}}$$
grandes-ecoles 2018 Q28
We denote $p_{+} = \mathbb{P}(X' \in C_{+1})$ and $p_{-} = \mathbb{P}(X' \in C_{-1})$, with $p_{+} \geqslant p_{-}$. Deduce from the above that for all $\lambda$ in $[0, 1]$
$$\mathbb{E}\left(\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right) \leqslant \frac{1}{2}\left(\frac{1}{p_{+}} + \exp\left(\frac{\lambda^{2}}{2}\right) \frac{1}{(p_{-})^{1-\lambda}} \cdot \frac{1}{(p_{+})^{\lambda}}\right)$$
grandes-ecoles 2018 Q29
We denote $p_{+} = \mathbb{P}(X' \in C_{+1})$ and $p_{-} = \mathbb{P}(X' \in C_{-1})$, with $p_{+} \geqslant p_{-}$. We set $\lambda = 1 - \frac{p_{-}}{p_{+}}$. Show that
$$\mathbb{E}\left(\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right) \leqslant \frac{1}{2p_{+}}\left(1 + \exp\left(\frac{\lambda^{2}}{2}\right)(1 - \lambda)^{\lambda - 1}\right)$$
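In our reading of the standard argument (not stated verbatim in the problem), conditioning on $\varepsilon_n = \pm 1$ gives $\mathbb{P}(X \in C) = \frac{p_+ + p_-}{2}$, so with $r = p_-/p_+ \in (0, 1]$ and $\lambda = 1 - r$ (hence $(1-\lambda)^{\lambda - 1} = r^{-r}$), the conclusion of Q32 reduces to the scalar inequality $(1 + r)\left(1 + e^{(1-r)^2/2}\, r^{-r}\right) \leqslant 4$, with equality at $r = 1$. A grid check:

```python
import math

# Scalar inequality behind the induction step: with r = p_-/p_+ and λ = 1 - r,
#   (1 + r) * (1 + e^{λ²/2} * r^{-r}) <= 4   for r in (0, 1],
# with equality only at r = 1 (near r = 1 the deficit behaves like (1-r)³/2).
def g(r):
    lam = 1.0 - r
    return (1.0 + r) * (1.0 + math.exp(lam * lam / 2.0) * r ** (-r))

for i in range(1, 10001):
    r = i / 10000.0
    assert g(r) <= 4.0 + 1e-9
assert abs(g(1.0) - 4.0) < 1e-12
```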
grandes-ecoles 2018 Q32
Complete the proof of inequality
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right) \leqslant 1 \tag{II.1}$$
grandes-ecoles 2018 Q33
Deduce Talagrand's inequality: For every non-empty closed convex set $C$ of $E$ and for every strictly positive real number $t$
$$\mathbb{P}(X \in C) \cdot \mathbb{P}(d(X, C) \geqslant t) \leqslant \exp\left(-\frac{t^{2}}{8}\right)$$
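Talagrand's inequality can be checked exhaustively for a small $n$ with an assumed concrete convex set, here the half-space $C = \{x : \sum_i x_i \leqslant 0\}$, for which $d(x, C) = \max\left(\sum_i x_i, 0\right)/\sqrt{n}$:

```python
import math
from itertools import product

# Exhaustive check of Talagrand's inequality for n = 6 with the half-space
# C = {x : sum(x) <= 0}; X is uniform on the 2^n sign vectors.
n = 6
points = list(product([-1, 1], repeat=n))
in_c = sum(1 for x in points if sum(x) <= 0) / 2 ** n
for t in [0.5, 1.0, 1.5, 2.0]:
    far = sum(1 for x in points if max(sum(x), 0) / math.sqrt(n) >= t) / 2 ** n
    assert in_c * far <= math.exp(-t * t / 8.0)
```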
grandes-ecoles 2018 Q37
We consider the space $E = \mathcal{M}_{k,d}(\mathbb{R})$ equipped with the Frobenius norm $\|\cdot\|_{F}$. We fix a unit vector $u$ in $\mathbb{R}^{d}$, define $g(M) = \|M \cdot u\|$, and let $C = \{M \in \mathcal{M}_{k,d}(\mathbb{R}) \mid g(M) \leqslant r\}$. Let $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ be a random variable taking values in $\mathcal{M}_{k,d}(\mathbb{R})$, whose coefficients $\varepsilon_{ij}$ are independent Rademacher random variables. Let $r$ and $t$ be two real numbers, with $t > 0$. Deduce that
$$\mathbb{P}(g(X) \leqslant r) \cdot \mathbb{P}(g(X) \geqslant r + t) \leqslant \exp\left(-\frac{1}{8}t^{2}\right)$$