Probability Inequality and Tail Bound Proof

Prove an inequality bounding the probability of a random variable exceeding a threshold, typically involving exponential or concentration inequalities.

grandes-ecoles 2018 Q10
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative reals $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Let $\delta$ be a real such that $0 \leqslant |\delta| \leqslant \sqrt{\frac{a}{b}}$. Justify that, for all reals $t$,
$$\mathbb{P}(|X + \delta| \geqslant t) \leqslant \mathbb{P}(|X| \geqslant t - |\delta|)$$
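A sketch of the event inclusion behind this bound (the triangle inequality does all the work):

```latex
% Since |X| = |(X + \delta) - \delta| \geq |X + \delta| - |\delta|,
% the event on the left forces the event on the right:
\{\, |X + \delta| \geqslant t \,\} \subseteq \{\, |X| \geqslant t - |\delta| \,\},
% and monotonicity of \mathbb{P} gives
% \mathbb{P}(|X + \delta| \geqslant t) \leqslant \mathbb{P}(|X| \geqslant t - |\delta|).
```

For $t \leqslant |\delta|$ the right-hand event has probability $1$, so the inequality is trivial in that range.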
grandes-ecoles 2018 Q12
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative reals $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Let $\delta$ be a real such that $0 \leqslant |\delta| \leqslant \sqrt{\frac{a}{b}}$. Deduce that for all reals $t$ such that $t \geqslant |\delta|$, we have
$$\mathbb{P}(|X + \delta| \geqslant t) \leqslant a \exp(a) \exp\left(-\frac{1}{2}bt^{2}\right)$$
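One route to this deduction, sketched under the stated hypotheses: chain the shift inequality of Q10 with the elementary bound $(t - |\delta|)^{2} \geqslant \frac{t^{2}}{2} - |\delta|^{2}$, valid for all reals since $t^{2} \leqslant 2(t - |\delta|)^{2} + 2|\delta|^{2}$.

```latex
% For t \geqslant |\delta|:
\mathbb{P}(|X + \delta| \geqslant t)
  \leqslant \mathbb{P}(|X| \geqslant t - |\delta|)
  \leqslant a\, e^{-b(t - |\delta|)^{2}}
  \leqslant a\, e^{b|\delta|^{2}}\, e^{-\frac{1}{2}bt^{2}}
  \leqslant a\, e^{a}\, e^{-\frac{1}{2}bt^{2}},
% using b|\delta|^{2} \leqslant b \cdot \tfrac{a}{b} = a in the last step.
```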
grandes-ecoles 2018 Q13
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative reals $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Let $\delta$ be a real such that $0 \leqslant |\delta| \leqslant \sqrt{\frac{a}{b}}$. Justify that the inequality
$$\mathbb{P}(|X + \delta| \geqslant t) \leqslant a \exp(a) \exp\left(-\frac{1}{2}bt^{2}\right)$$
remains valid if $0 \leqslant t < |\delta|$.
grandes-ecoles 2018 Q14
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. The objective of this part is to show, for any non-empty closed convex set $C$ of $E$,
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8} d(X, C)^{2}\right)\right) \leqslant 1 \tag{II.1}$$
Handle the case where $C$ is a closed convex set of $E$ that does not meet $X(\Omega)$.
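Inequality (II.1) can be sanity-checked numerically for one illustrative convex set (our choice, not the problem's): the half-space $C = \{x : \langle x, v\rangle \leqslant 0\}$ with $v = (1,\ldots,1)/\sqrt{n}$, for which $d(x, C) = \max(\langle x, v\rangle, 0)$.

```python
import numpy as np

# Numerical sanity check of (II.1) for an illustrative half-space C:
# C = {x : <x, v> <= 0} with v = (1,...,1)/sqrt(n), so d(x, C) = max(<x, v>, 0).
rng = np.random.default_rng(0)
n, trials = 20, 200_000
eps = rng.choice([-1.0, 1.0], size=(trials, n))   # samples of X
proj = eps.sum(axis=1) / np.sqrt(n)               # <X, v>
dist = np.maximum(proj, 0.0)                      # d(X, C)

p_in_C = np.mean(proj <= 0.0)                     # P(X in C)
exp_moment = np.mean(np.exp(dist**2 / 8.0))       # E[exp(d(X,C)^2 / 8)]
product = p_in_C * exp_moment
print(f"P(X in C) * E[exp(d^2/8)] = {product:.4f}")
assert product <= 1.0
```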
grandes-ecoles 2018 Q17
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. We assume that $C$ is a closed convex set of $E$ that meets $X(\Omega)$ in a single vector $u$. Justify that $d(X, C) \leqslant d(X, u)$ and deduce inequality
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8} d(X, C)^{2}\right)\right) \leqslant 1 \tag{II.1}$$
in this case.
grandes-ecoles 2018 Q18
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. We assume that $C$ is a closed convex set of $E$ such that $C \cap X(\Omega)$ contains at least two elements. We propose to prove inequality
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8} d(X, C)^{2}\right)\right) \leqslant 1 \tag{II.1}$$
by induction on the dimension $n$ of $E$. Handle the case $n = 1$.
grandes-ecoles 2018 Q27
We denote
$$p_{+} = \mathbb{P}(X' \in C_{+1}) \quad \text{and} \quad p_{-} = \mathbb{P}(X' \in C_{-1})$$
We will assume, without loss of generality, that $p_{+} \geqslant p_{-}$. Using the induction hypothesis, justify that
$$\mathbb{E}\left(\left.\exp\left(\frac{1}{8} d(X, C)^{2}\right)\right\rvert \, \varepsilon_{n} = 1\right) \leqslant \frac{1}{p_{+}}$$
grandes-ecoles 2018 Q32
Complete the proof of inequality
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8} d(X, C)^{2}\right)\right) \leqslant 1 \tag{II.1}$$
grandes-ecoles 2018 Q33
Deduce Talagrand's inequality: For every non-empty closed convex set $C$ of $E$ and for every strictly positive real number $t$
$$\mathbb{P}(X \in C) \cdot \mathbb{P}(d(X, C) \geqslant t) \leqslant \exp\left(-\frac{t^{2}}{8}\right)$$
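Talagrand's inequality can be checked by Monte Carlo for one illustrative convex set (our choice, not the problem's): a half-space, for which the distance function is explicit.

```python
import numpy as np

# Monte Carlo sanity check of Talagrand's inequality for the half-space
# C = {x : <x, v> <= 0} with v = (1,...,1)/sqrt(n), where
# d(x, C) = max(<x, v>, 0).
rng = np.random.default_rng(1)
n, trials = 20, 200_000
eps = rng.choice([-1.0, 1.0], size=(trials, n))       # samples of X
dist = np.maximum(eps.sum(axis=1) / np.sqrt(n), 0.0)  # d(X, C)
p_in_C = np.mean(dist == 0.0)                         # P(X in C)

for t in (1.0, 2.0, 3.0):
    lhs = p_in_C * np.mean(dist >= t)
    rhs = np.exp(-t**2 / 8.0)
    print(f"t={t}: {lhs:.4f} <= {rhs:.4f}")
    assert lhs <= rhs
```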
grandes-ecoles 2018 Q37
We consider the space $E = \mathcal{M}_{k,d}(\mathbb{R})$ equipped with the inner product defined by
$$\forall (A, B) \in E^{2}, \quad \langle A \mid B \rangle = \operatorname{tr}\left(A^{\top} \cdot B\right)$$
We denote by $\|\cdot\|_{F}$ the associated Euclidean norm. We fix a unit vector $u = (u_{1}, \ldots, u_{d})$ in $\mathbb{R}^{d}$ (so $\|u\| = 1$), and define
$$g : \mathcal{M}_{k,d}(\mathbb{R}) \longrightarrow \mathbb{R}, \quad M \longmapsto \|M \cdot u\|$$
Let $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ be a random variable taking values in $\mathcal{M}_{k,d}(\mathbb{R})$, whose coefficients $\varepsilon_{ij}$ are independent Rademacher random variables. Let $C = \left\{M \in \mathcal{M}_{k,d}(\mathbb{R}) \mid g(M) \leqslant r\right\}$. Deduce that
$$\mathbb{P}(g(X) \leqslant r) \cdot \mathbb{P}(g(X) \geqslant r + t) \leqslant \exp\left(-\frac{1}{8} t^{2}\right)$$
grandes-ecoles 2018 Q39
We consider the space $E = \mathcal{M}_{k,d}(\mathbb{R})$ equipped with the inner product defined by
$$\forall (A, B) \in E^{2}, \quad \langle A \mid B \rangle = \operatorname{tr}\left(A^{\top} \cdot B\right)$$
We fix a unit vector $u = (u_{1}, \ldots, u_{d})$ in $\mathbb{R}^{d}$ (so $\|u\| = 1$), and define $g(M) = \|M \cdot u\|$. Let $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ be a random variable taking values in $\mathcal{M}_{k,d}(\mathbb{R})$, whose coefficients $\varepsilon_{ij}$ are independent Rademacher random variables. Deduce from the above that, for every strictly positive real number $t$,
$$\mathbb{P}(|g(X) - m| \geqslant t) \leqslant 4 \exp\left(-\frac{1}{8} t^{2}\right)$$
where $m$ is a median of $g(X)$.
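The target bound can be checked empirically. The unit vector $u = (1,\ldots,1)/\sqrt{d}$ below is an illustrative choice of ours, and the median is estimated from the sample.

```python
import numpy as np

# Empirical check of P(|g(X) - m| >= t) <= 4 exp(-t^2/8) for g(X) = ||X u||,
# with X a k x d Rademacher matrix and u = (1,...,1)/sqrt(d) (our choice).
rng = np.random.default_rng(2)
k, d, trials = 10, 10, 50_000
u = np.ones(d) / np.sqrt(d)
X = rng.choice([-1.0, 1.0], size=(trials, k, d))  # Rademacher matrices
g = np.linalg.norm(X @ u, axis=1)                 # g(X) = ||X u||
m = np.median(g)                                  # empirical median of g(X)

# The bound only bites for t > sqrt(8 ln 4) ~ 3.33; we check both regimes.
for t in (2.0, 4.0):
    lhs = np.mean(np.abs(g - m) >= t)
    rhs = 4.0 * np.exp(-t**2 / 8.0)
    print(f"t={t}: {lhs:.5f} <= {rhs:.5f}")
    assert lhs <= rhs
```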
grandes-ecoles 2019 Q18
We fix once and for all an integer $n$, to be thought of as very large. For each pair $(x,y) \in \{0,1\}^n$ such that $x \neq y$, we fix an integer $j_n(x,y)$ whose existence is proved in question 17c. We say that $x$ is better than $y$ given $E^1, E^2, \ldots, E^T$ if $$\left|\frac{1}{T}\sum_{i=1}^T E^i_{j_n(x,y)} - \mathbb{E}\left[O_{j_n(x,y)}(x)\right]\right| < \left|\frac{1}{T}\sum_{i=1}^T E^i_{j_n(x,y)} - \mathbb{E}\left[O_{j_n(x,y)}(y)\right]\right|$$ We set $R_{n,T}(E^1, E^2, \ldots, E^T) = x$ if, for all $y \neq x$, $x$ is better than $y$. If no such $x$ exists, we set $R_{n,T}(E^1, E^2, \ldots, E^T) = (0,0,\ldots,0)$.
Prove that if $T_n \geq e^{3\ln(n)n^{1/3}}$ then for all $x \in \{0,1\}^n$ and any sequence $$O^1(x), O^2(x), \ldots, O^{T_n}(x)$$ of $T_n$ random variables taking values in $\{0,1\}^n$ mutually independent with the same distribution as $O(x)$, we have $$\max_{x \in \{0,1\}^n} \mathbb{P}\left(R_{n,T_n}\left(O^1(x), O^2(x), \ldots, O^{T_n}(x)\right) \neq x\right) \leq u_n$$ where $(u_n)_{n \geq 1}$ is a sequence tending to 0 as $n$ tends to infinity.
Hint. One may start by writing, justifying it, that $$\mathbb{P}\left(R_{n,T}\left(O^1(x), O^2(x), \ldots, O^T(x)\right) \neq x\right) \leq \sum_{y \in \{0,1\}^n, y \neq x} \mathbb{P}\left(x \text{ is not better than } y \text{ given } O^1(x), O^2(x), \ldots, O^T(x)\right)$$
grandes-ecoles 2020 Q35
We define, for all $x \in \mathbb{R}$, $\lambda^*(x) = \sup_{t \geqslant 0}(tx - \lambda(t))$. We admit that the convergence of the sequence of functions $\left(t \mapsto \frac{\ln\left(E\left(\mathrm{e}^{tS_n}\right)\right)}{n}\right)_{n \in \mathbb{N}^*}$ towards $t \mapsto \ln(\gamma(t))$ is uniform on $\mathbb{R}^+$. Let $\varepsilon > 0$. Show that there exists an index $n_0 \in \mathbb{N}^*$ such that, for all $t \in \mathbb{R}^+$ and for all $n \in \mathbb{N}^*$, $$n \geqslant n_0 \Longrightarrow \ln\left(E\left(\mathrm{e}^{tS_n}\right)\right) \leqslant n(\lambda(t) + \varepsilon).$$
grandes-ecoles 2020 Q36
We define, for all $x \in \mathbb{R}$, $\lambda^*(x) = \sup_{t \geqslant 0}(tx - \lambda(t))$. Let $\varepsilon > 0$ and $n_0$ as in Q35. Using Markov's inequality applied to the random variable $e^{tS_n}$, show that for $a > 1$, $n \geqslant n_0$ and $t \geqslant 0$, $$P\left(S_n \geqslant nam\right) \leqslant \mathrm{e}^{-ntam} \mathrm{e}^{n(\lambda(t) + \varepsilon)}.$$
grandes-ecoles 2020 Q37
We define, for all $x \in \mathbb{R}$, $\lambda^*(x) = \sup_{t \geqslant 0}(tx - \lambda(t))$. Let $\varepsilon > 0$ and $n_0$ as in Q35. Deduce that for $n \geqslant n_0$, $$P\left(S_n \geqslant nam\right) \leqslant \mathrm{e}^{-n\left(\lambda^*(am) - \varepsilon\right)}.$$
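The two steps of Q36 and Q37 can be written as one chain (a sketch): Markov's inequality applied to $e^{tS_n}$, the bound from Q35, then optimization over $t \geqslant 0$.

```latex
% For t \geqslant 0 and n \geqslant n_0:
\mathbb{P}(S_n \geqslant nam)
  = \mathbb{P}\big(\mathrm{e}^{tS_n} \geqslant \mathrm{e}^{tnam}\big)
  \leqslant \mathrm{e}^{-tnam}\, E\big(\mathrm{e}^{tS_n}\big)
  \leqslant \mathrm{e}^{-ntam}\, \mathrm{e}^{n(\lambda(t) + \varepsilon)}.
% Taking the infimum of the exponent over t \geqslant 0:
% \inf_{t \geqslant 0} n\big(\lambda(t) + \varepsilon - tam\big)
%   = -n\big(\lambda^{*}(am) - \varepsilon\big),
% by the definition \lambda^{*}(x) = \sup_{t \geqslant 0}(tx - \lambda(t)).
```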
grandes-ecoles 2020 Q38
Give a concrete meaning to $m = \lim_{n \rightarrow +\infty} \frac{1}{n} E(S_n)$ in relation to the industrial process studied and interpret the inequality $P\left(S_n \geqslant nam\right) \leqslant \mathrm{e}^{-n(\lambda^*(am) - \varepsilon)}$. One may establish an intuitive link with the law of large numbers.
grandes-ecoles 2022 Q25
Show that there exists a real $\alpha > 0$ such that
$$\forall \theta \in [-\pi, \pi], \quad 1 - \cos\theta \geq \alpha\theta^{2}$$
Using question 9, deduce from this that there exist three real numbers $t_{0} > 0$, $\beta > 0$ and $\gamma > 0$ such that, for all $t \in \left]0, t_{0}\right]$ and all $\theta \in [-\pi, \pi]$,
$$|h(t, \theta)| \leq e^{-\beta\left(\sigma_{t}\theta\right)^{2}} \quad \text{or} \quad |h(t, \theta)| \leq e^{-\gamma\left(\sigma_{t}|\theta|\right)^{2/3}}$$
grandes-ecoles 2022 Q24
Show that there exists a real $a > 0$ such that $$\forall \theta \in [-\pi,\pi], 1-\cos\theta \geq a\theta^2.$$ Deduce that there exist three reals $t_0 > 0$, $\beta > 0$ and $\gamma > 0$ such that, for all $t \in ]0,t_0]$ and all $\theta \in [-\pi,\pi]$, $$\left|\frac{P(e^{-t}e^{i\theta})}{P(e^{-t})}\right| \leq e^{-\beta(t^{-3/2}\theta)^2} \quad \text{or} \quad \left|\frac{P(e^{-t}e^{i\theta})}{P(e^{-t})}\right| \leq e^{-\gamma(t^{-3/2}|\theta|)^{2/3}}.$$
grandes-ecoles 2022 Q32
With the notation of question 28, deduce that $$\mathbb{P}\left(M \in \mathcal{G}\ell_{n}(\mathbb{R})\right) \geqslant \frac{1}{2^{n-1}}.$$
grandes-ecoles 2022 Q41
Let $\sigma$ and $\lambda$ be two strictly positive real numbers and $Z$ a real random variable such that, for every real $t$, $\exp(tZ)$ has finite expectation, and $$\forall t \in \mathbb{R}, \quad \mathbb{E}(\exp(tZ)) \leqslant \exp\left(\frac{\sigma^{2} t^{2}}{2}\right)$$
By applying Markov's inequality to a suitably chosen random variable, prove that $$\forall t \in \mathbb{R}^{+}, \quad \mathbb{P}(Z \geqslant \lambda) \leqslant \exp\left(\frac{\sigma^{2} t^{2}}{2} - \lambda t\right)$$
grandes-ecoles 2022 Q42
Let $\sigma$ and $\lambda$ be two strictly positive real numbers and $Z$ a real random variable such that, for every real $t$, $\exp(tZ)$ has finite expectation, and $$\forall t \in \mathbb{R}, \quad \mathbb{E}(\exp(tZ)) \leqslant \exp\left(\frac{\sigma^{2} t^{2}}{2}\right)$$
Deduce that $$\mathbb{P}(|Z| \geqslant \lambda) \leqslant 2\exp\left(-\frac{\lambda^{2}}{2\sigma^{2}}\right)$$
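The bound can be sanity-checked against a distribution for which the hypothesis holds with equality: a standard normal $Z$ has $\mathbb{E}(e^{tZ}) = e^{t^{2}/2}$, i.e. $\sigma = 1$ (this test case is ours, not the problem's). Its exact two-sided tail is `erfc(lam / sqrt(2))`.

```python
import math

# For Z ~ N(0,1): E[exp(tZ)] = exp(t^2/2), so sigma = 1 and the bound reads
# P(|Z| >= lam) <= 2 exp(-lam^2/2). Exact tail: P(|Z| >= lam) = erfc(lam/sqrt(2)).
for lam in (0.5, 1.0, 2.0, 3.0):
    exact = math.erfc(lam / math.sqrt(2.0))
    bound = 2.0 * math.exp(-lam**2 / 2.0)
    print(f"lam={lam}: {exact:.5f} <= {bound:.5f}")
    assert exact <= bound
```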
grandes-ecoles 2022 Q43
Let $X_{1}, \ldots, X_{n}, Y_{1}, \ldots, Y_{n}$ be mutually independent random variables with the same distribution $\mathcal{R}$. We define the random vectors $X = \frac{1}{\sqrt{n}}\left(X_{1}, \ldots, X_{n}\right)^{\top}$ and $Y = \frac{1}{\sqrt{n}}\left(Y_{1}, \ldots, Y_{n}\right)^{\top}$, taking values in $\mathcal{M}_{n,1}(\mathbb{R})$.
Prove that $$\mathbb{P}(|\langle X \mid Y \rangle| \geqslant \varepsilon) \leqslant 2\exp\left(-\frac{\varepsilon^{2} n}{2}\right)$$
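A Monte Carlo check of the inner-product tail bound. Taking $\mathcal{R}$ to be the Rademacher distribution is our assumption here, consistent with the sub-Gaussian parameter $\sigma = 1$ used in the bound.

```python
import numpy as np

# Monte Carlo check of P(|<X|Y>| >= eps) <= 2 exp(-eps^2 n / 2), taking R to
# be the Rademacher distribution (an illustrative assumption).
rng = np.random.default_rng(3)
n, trials = 100, 50_000
X = rng.choice([-1.0, 1.0], size=(trials, n)) / np.sqrt(n)
Y = rng.choice([-1.0, 1.0], size=(trials, n)) / np.sqrt(n)
inner = np.einsum("ij,ij->i", X, Y)               # <X | Y> for each trial

for eps in (0.2, 0.3):
    lhs = np.mean(np.abs(inner) >= eps)
    rhs = 2.0 * np.exp(-eps**2 * n / 2.0)
    print(f"eps={eps}: {lhs:.4f} <= {rhs:.4f}")
    assert lhs <= rhs
```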
grandes-ecoles 2022 Q44
Let $N$ be a non-zero natural number and let $\left(X_{j}^{i}\right)_{1 \leqslant i \leqslant N,\, 1 \leqslant j \leqslant n}$ be a family of $n \times N$ mutually independent real random variables with the same distribution $\mathcal{R}$. For every $i \in \llbracket 1, N \rrbracket$, we set $X^{i} = \frac{1}{\sqrt{n}}\left(X_{1}^{i}, \ldots, X_{n}^{i}\right)^{\top}$.
Deduce from the previous questions that $$\mathbb{P}\left(\bigcup_{1 \leqslant i < j \leqslant N} \left\{\left|\left\langle X^{i} \mid X^{j}\right\rangle\right| \geqslant \varepsilon\right\}\right) \leqslant N(N-1)\exp\left(-\frac{\varepsilon^{2} n}{2}\right).$$
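The counting behind the constant, as a sketch: the union bound over the $\binom{N}{2}$ pairs, each pair controlled by the two-sided bound of the previous question.

```latex
\mathbb{P}\Big(\bigcup_{1 \leqslant i < j \leqslant N} \big\{\big|\langle X^{i} \mid X^{j} \rangle\big| \geqslant \varepsilon\big\}\Big)
  \leqslant \sum_{1 \leqslant i < j \leqslant N} \mathbb{P}\big(\big|\langle X^{i} \mid X^{j} \rangle\big| \geqslant \varepsilon\big)
  \leqslant \binom{N}{2} \cdot 2\exp\Big(-\frac{\varepsilon^{2} n}{2}\Big)
  = N(N-1)\exp\Big(-\frac{\varepsilon^{2} n}{2}\Big).
```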
grandes-ecoles 2022 Q45
Let $N$ be a non-zero natural number and let $\left(X_{j}^{i}\right)_{1 \leqslant i \leqslant N,\, 1 \leqslant j \leqslant n}$ be a family of $n \times N$ mutually independent real random variables with the same distribution $\mathcal{R}$. For every $i \in \llbracket 1, N \rrbracket$, we set $X^{i} = \frac{1}{\sqrt{n}}\left(X_{1}^{i}, \ldots, X_{n}^{i}\right)^{\top}$.
We assume that $n \geqslant 4\frac{\ln N}{\varepsilon^{2}}$. Prove that $$\mathbb{P}\left(\bigcup_{1 \leqslant i < j \leqslant N} \left\{\left|\left\langle X^{i} \mid X^{j}\right\rangle\right| \geqslant \varepsilon\right\}\right) < 1.$$
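Plugging the hypothesis on $n$ into the exponent (a sketch of the final step): $n \geqslant 4\frac{\ln N}{\varepsilon^{2}}$ gives $\frac{\varepsilon^{2} n}{2} \geqslant 2\ln N$, hence

```latex
N(N-1)\exp\Big(-\frac{\varepsilon^{2} n}{2}\Big)
  \leqslant N(N-1)\, e^{-2\ln N}
  = \frac{N(N-1)}{N^{2}}
  = \frac{N-1}{N} < 1.
```

In particular, for such $n$ there exists a realization of $X^{1}, \ldots, X^{N}$ whose pairwise inner products are all below $\varepsilon$ in absolute value.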