LFM Stats And Pure


grandes-ecoles 2018 Q16 Expectation and Moment Inequality Proof
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. We assume that $C$ is a closed convex set of $E$ that meets $X(\Omega)$ in a single vector $u$. Deduce the expectation of $\exp\left(\frac{1}{8} d(X, u)^{2}\right)$ and show that it is less than or equal to $2^{n}$.
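The bound in Q16 can be sanity-checked numerically. The sketch below enumerates all $2^n$ sign vectors, takes $u = e_1 + \cdots + e_n$ (by symmetry the choice of vertex does not matter), and compares the exact expectation with the closed form $\left(\frac{1+\sqrt{e}}{2}\right)^n$; this is an illustrative check, not part of the requested proof.

```python
from itertools import product
from math import exp

def expectation(n):
    # Exact E[exp(d(X, u)^2 / 8)] over all 2^n sign vectors, with u = e_1 + ... + e_n.
    total = 0.0
    for eps in product((-1, 1), repeat=n):
        d2 = sum((e - 1) ** 2 for e in eps)  # squared Euclidean distance from X to u
        total += exp(d2 / 8)
    return total / 2 ** n

# By independence, each coordinate contributes a factor (1 + e^{1/2}) / 2.
for n in range(1, 8):
    assert abs(expectation(n) - ((1 + exp(0.5)) / 2) ** n) < 1e-9
    assert expectation(n) <= 2 ** n
```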
grandes-ecoles 2018 Q17 Probability Inequality and Tail Bound Proof
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. We assume that $C$ is a closed convex set of $E$ that meets $X(\Omega)$ in a single vector $u$. Justify that $d(X, C) \leqslant d(X, u)$ and deduce inequality
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8} d(X, C)^{2}\right)\right) \leqslant 1 \tag{II.1}$$
in this case.
grandes-ecoles 2018 Q18 Probability Inequality and Tail Bound Proof
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. We assume that $C$ is a closed convex set of $E$ such that $C \cap X(\Omega)$ contains at least two elements. We propose to prove inequality
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8} d(X, C)^{2}\right)\right) \leqslant 1 \tag{II.1}$$
by induction on the dimension $n$ of $E$. Handle the case $n = 1$.
grandes-ecoles 2018 Q19 Expectation and Moment Inequality Proof
We assume that $f \in \mathcal{C}([0,1], \mathbb{R})$ satisfies: $$\exists \alpha \in \, ]0,1], \; \exists K \geq 0, \; \forall (y, z) \in [0,1]^{2}, \; |f(y) - f(z)| \leq K |y - z|^{\alpha}$$ and that $c(x) = 0$ for all $x \in [0,1]$. For all $n \in \mathbb{N}^{*}$, we define: $$B_{n} f(X) = \sum_{k=0}^{n} f\left(\frac{k}{n}\right) \binom{n}{k} X^{k} (1 - X)^{n-k}$$
Let $x \in \, ]0,1[$ and $n \in \mathbb{N}^{*}$. We consider $X_{1}, \ldots, X_{n}$ mutually independent random variables, all following the same Bernoulli distribution with parameter $x$. We set
$$S_{n} = \frac{X_{1} + \cdots + X_{n}}{n}$$
(a) Express $\mathbb{E}(S_{n})$, $\mathbb{V}(S_{n})$ and $\mathbb{E}(f(S_{n}))$ in terms of $x$, $n$ and the polynomial $B_{n} f$.
(b) Deduce the inequalities:
$$\sum_{k=0}^{n} \left| x - \frac{k}{n} \right| \binom{n}{k} x^{k} (1 - x)^{n-k} \leq \mathbb{V}(S_{n})^{\frac{1}{2}} \leq \frac{1}{2\sqrt{n}}$$
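The chain of inequalities in (b) can be checked numerically, using $\mathbb{V}(S_n) = x(1-x)/n$ and the fact that the binomial sum is exactly $\mathbb{E}|x - S_n|$ (illustrative only; a small tolerance guards against equality cases such as $x = 1/2$, $n = 1$):

```python
from math import comb, sqrt

def abs_deviation(x, n):
    # The binomial sum from (b), i.e. E|x - S_n|.
    return sum(abs(x - k / n) * comb(n, k) * x ** k * (1 - x) ** (n - k)
               for k in range(n + 1))

def check(x, n):
    var = x * (1 - x) / n  # V(S_n) for i.i.d. Bernoulli(x) variables
    eps = 1e-12            # tolerance: equality holds e.g. at x = 1/2, n = 1
    return (abs_deviation(x, n) <= sqrt(var) + eps
            and sqrt(var) <= 1 / (2 * sqrt(n)) + eps)

assert all(check(x, n) for x in (0.1, 0.3, 0.5, 0.9) for n in (1, 5, 20, 100))
```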
grandes-ecoles 2018 Q19 Verification of Probability Measure or Inner Product Properties
Let $n$ be an integer such that $n \geqslant 2$. We denote by $E' = \operatorname{Vect}(e_{1}, \ldots, e_{n-1})$ and by $\pi$ the orthogonal projection onto $E'$
$$\pi : \left\lvert \, \begin{aligned} E & \rightarrow E' \\ \sum_{i=1}^{n} x_{i} e_{i} & \mapsto \sum_{i=1}^{n-1} x_{i} e_{i} \end{aligned} \right.$$
We set $X' = \pi \circ X = \sum_{i=1}^{n-1} \varepsilon_{i} e_{i}$. For $t$ in $\{-1, 1\}$ we denote by $H_{t}$ the affine hyperplane $E' + te_{n}$ and set $C_{t} = \pi(C \cap H_{t})$.
Show, for $x' \in E'$ and $t \in \{-1, 1\}$, that $x' \in C_{t} \Longleftrightarrow x' + te_{n} \in C$.
grandes-ecoles 2018 Q20 Verification of Probability Measure or Inner Product Properties
Let $n$ be an integer such that $n \geqslant 2$. We denote by $E' = \operatorname{Vect}(e_{1}, \ldots, e_{n-1})$ and by $\pi$ the orthogonal projection onto $E'$
$$\pi : \left\lvert \, \begin{aligned} E & \rightarrow E' \\ \sum_{i=1}^{n} x_{i} e_{i} & \mapsto \sum_{i=1}^{n-1} x_{i} e_{i} \end{aligned} \right.$$
We set $X' = \pi \circ X = \sum_{i=1}^{n-1} \varepsilon_{i} e_{i}$. For $t$ in $\{-1, 1\}$ we denote by $H_{t}$ the affine hyperplane $E' + te_{n}$ and set $C_{t} = \pi(C \cap H_{t})$.
Show that $C_{+1}$ and $C_{-1}$ are non-empty closed convex sets of $E'$.
grandes-ecoles 2018 Q21 Expectation and Moment Inequality Proof
We assume that $f \in \mathcal{C}([0,1], \mathbb{R})$ satisfies: $$\exists \alpha \in \, ]0,1], \; \exists K \geq 0, \; \forall (y, z) \in [0,1]^{2}, \; |f(y) - f(z)| \leq K |y - z|^{\alpha}$$ For all $n \in \mathbb{N}^{*}$, define $B_{n} f(X) = \sum_{k=0}^{n} f\left(\frac{k}{n}\right) \binom{n}{k} X^{k} (1 - X)^{n-k}$.
Let $n \in \mathbb{N}^{*}$. Show that
$$\left\| f - B_{n} f \right\|_{\infty} \leq \frac{3K}{2} \cdot \frac{1}{n^{\alpha/2}}$$
Hint: one may first express $f(x) - B_{n} f(x)$ in terms of $\mathbb{E}(f(x) - f(S_{n}))$.
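As an illustration, the bound can be tested for $f(x) = \sqrt{x}$, which is Hölder with $\alpha = 1/2$ and $K = 1$ (since $|\sqrt{y} - \sqrt{z}| \leq \sqrt{|y - z|}$), by evaluating $B_n f$ on a grid; a sketch, not the proof:

```python
from math import comb, sqrt

def bernstein(f, n, x):
    # B_n f evaluated at x, straight from the definition.
    return sum(f(k / n) * comb(n, k) * x ** k * (1 - x) ** (n - k)
               for k in range(n + 1))

# f(x) = sqrt(x) satisfies |f(y) - f(z)| <= |y - z|^{1/2}: alpha = 1/2, K = 1.
alpha, K = 0.5, 1.0
for n in (1, 4, 16, 64):
    gap = max(abs(sqrt(x) - bernstein(sqrt, n, x))
              for x in (i / 200 for i in range(201)))
    assert gap <= 1.5 * K / n ** (alpha / 2)  # the (3K/2) n^{-alpha/2} bound
```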
grandes-ecoles 2018 Q21 Conditional Probability and Total Probability with Tree/Bayes Structure
Let $n$ be an integer such that $n \geqslant 2$. We denote by $E' = \operatorname{Vect}(e_{1}, \ldots, e_{n-1})$ and by $\pi$ the orthogonal projection onto $E'$. We set $X' = \pi \circ X = \sum_{i=1}^{n-1} \varepsilon_{i} e_{i}$. For $t$ in $\{-1, 1\}$ we set $C_{t} = \pi(C \cap H_{t})$, where $H_{t} = E' + te_{n}$, and denote by $Y_{t}$ the projection of $X'$ onto the non-empty closed convex set $C_{t}$.
Show that
$$\mathbb{P}(X \in C) = \frac{1}{2} \mathbb{P}(X' \in C_{+1}) + \frac{1}{2} \mathbb{P}(X' \in C_{-1})$$
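The identity amounts to conditioning on $\varepsilon_n$. A minimal enumeration check for a concrete convex set, here the halfspace $\{x : x_1 + \cdots + x_n \geq 0\}$ (chosen only for illustration), using the Q19 equivalence $x' \in C_t \iff x' + te_n \in C$:

```python
from itertools import product

n = 5

def in_C(x):
    # A concrete closed convex set: the halfspace {x : x_1 + ... + x_n >= 0}.
    return sum(x) >= 0

signs = list(product((-1, 1), repeat=n))
p_C = sum(in_C(x) for x in signs) / 2 ** n

# x' lies in C_t iff x' + t*e_n lies in C (the equivalence shown in question 19).
signs_prime = list(product((-1, 1), repeat=n - 1))

def p_Ct(t):
    return sum(in_C(xp + (t,)) for xp in signs_prime) / 2 ** (n - 1)

assert p_C == 0.5 * p_Ct(+1) + 0.5 * p_Ct(-1)
```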
grandes-ecoles 2018 Q22 Distribution of Transformed or Combined Random Variables
Let $t$ be a strictly positive real number. Using questions 20 and 12, together with the fact that if $u$ and $v$ are continuous, integrable functions from $\mathbb{R}$ to $\mathbb{R}$ satisfying $\mathcal{F}(u) = \mathcal{F}(v)$, then $u = v$, deduce the existence of a real number $\lambda_{t,\sigma}$ such that $$f(t, \cdot) = \lambda_{t,\sigma} g_{\sqrt{\sigma^{2}+2t}}$$
grandes-ecoles 2018 Q23 Change of Variable and Integral Evaluation
Show that the function $I : \left\lvert\, \begin{aligned} & \mathbb{R}_{+}^{*} \rightarrow \mathbb{R} \\ & t \mapsto \int_{-\infty}^{+\infty} f(t, x) \mathrm{d}x \end{aligned}\right.$ is constant. One may use the result of question 17.
grandes-ecoles 2018 Q24 Distribution of Transformed or Combined Random Variables
Deduce that, for any strictly positive real $t$, $f(t, \cdot) = g_{\sqrt{\sigma^{2}+2t}}$.
grandes-ecoles 2018 Q24 Probability Inequality and Tail Bound Proof
We denote
$$p_{+} = \mathbb{P}(X' \in C_{+1}) \quad \text{and} \quad p_{-} = \mathbb{P}(X' \in C_{-1})$$
We assume, without loss of generality, that $p_{+} \geqslant p_{-}$. Show that $p_{-} > 0$.
grandes-ecoles 2018 Q25 Expectation and Moment Inequality Proof
We denote
$$p_{+} = \mathbb{P}(X' \in C_{+1}) \quad \text{and} \quad p_{-} = \mathbb{P}(X' \in C_{-1})$$
We will assume, without loss of generality, that $p_{+} \geqslant p_{-}$. We have shown the inequality
$$d(X, C)^{2} \leqslant 4\lambda^{2} + (1 - \lambda) d(X', C_{\varepsilon_{n}})^{2} + \lambda d(X', C_{-\varepsilon_{n}})^{2}$$
Show that for all $\lambda$ in $[0, 1]$
$$\mathbb{E}\left(\left.\exp\left(\frac{1}{8} d(X, C)^{2}\right)\right\rvert \, \varepsilon_{n} = -1\right) \leqslant \exp\left(\frac{\lambda^{2}}{2}\right) \mathbb{E}\left(\left(\exp\left(\frac{1}{8} d(X', C_{-1})^{2}\right)\right)^{1-\lambda} \cdot \left(\exp\left(\frac{1}{8} d(X', C_{+1})^{2}\right)\right)^{\lambda}\right)$$
grandes-ecoles 2018 Q26 Expectation and Moment Inequality Proof
We denote
$$p_{+} = \mathbb{P}(X' \in C_{+1}) \quad \text{and} \quad p_{-} = \mathbb{P}(X' \in C_{-1})$$
We will assume, without loss of generality, that $p_{+} \geqslant p_{-}$.
Deduce that
$$\mathbb{E}\left(\left.\exp\left(\frac{1}{8} d(X, C)^{2}\right)\right\rvert \, \varepsilon_{n} = -1\right) \leqslant \exp\left(\frac{\lambda^{2}}{2}\right) \left(\mathbb{E}\left(\exp\left(\frac{1}{8} d(X', C_{-1})^{2}\right)\right)\right)^{1-\lambda} \cdot \left(\mathbb{E}\left(\exp\left(\frac{1}{8} d(X', C_{+1})^{2}\right)\right)\right)^{\lambda}$$
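The step from Q25 to Q26 is Hölder's inequality $\mathbb{E}(A^{1-\lambda} B^{\lambda}) \leq \mathbb{E}(A)^{1-\lambda}\,\mathbb{E}(B)^{\lambda}$, applied with exponents $\frac{1}{1-\lambda}$ and $\frac{1}{\lambda}$. A quick check on randomly generated positive data (illustrative only; the sample is hypothetical):

```python
import random

random.seed(0)

# A finite uniform sample space; A and B play the role of positive random variables.
A = [random.uniform(0.1, 5.0) for _ in range(100)]
B = [random.uniform(0.1, 5.0) for _ in range(100)]

def E(vals):
    return sum(vals) / len(vals)

def holder_gap(lam):
    # E(A)^{1-lam} E(B)^{lam} minus E(A^{1-lam} B^{lam}); should be >= 0.
    lhs = E([a ** (1 - lam) * b ** lam for a, b in zip(A, B)])
    return E(A) ** (1 - lam) * E(B) ** lam - lhs

assert all(holder_gap(l / 10) >= -1e-12 for l in range(11))
```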
grandes-ecoles 2018 Q27 Probability Inequality and Tail Bound Proof
We denote
$$p_{+} = \mathbb{P}(X' \in C_{+1}) \quad \text{and} \quad p_{-} = \mathbb{P}(X' \in C_{-1})$$
We will assume, without loss of generality, that $p_{+} \geqslant p_{-}$. Using the induction hypothesis, justify that
$$\mathbb{E}\left(\left.\exp\left(\frac{1}{8} d(X, C)^{2}\right)\right\rvert \, \varepsilon_{n} = 1\right) \leqslant \frac{1}{p_{+}}$$
grandes-ecoles 2018 Q28 Expectation and Moment Inequality Proof
We denote
$$p_{+} = \mathbb{P}(X' \in C_{+1}) \quad \text{and} \quad p_{-} = \mathbb{P}(X' \in C_{-1})$$
We will assume, without loss of generality, that $p_{+} \geqslant p_{-}$.
Deduce from the above that for all $\lambda$ in $[0, 1]$
$$\mathbb{E}\left(\exp\left(\frac{1}{8} d(X, C)^{2}\right)\right) \leqslant \frac{1}{2}\left(\frac{1}{p_{+}} + \exp\left(\frac{\lambda^{2}}{2}\right) \frac{1}{(p_{-})^{1-\lambda}} \cdot \frac{1}{(p_{+})^{\lambda}}\right)$$
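Once the induction is complete, inequality (II.1) holds for every closed convex $C$. It can be verified by exhaustive enumeration for concrete halfspaces $C = \{y : \langle a, y \rangle \geqslant b\}$, for which $d(x, C) = \max\!\left(0, \frac{b - \langle a, x \rangle}{\lVert a \rVert}\right)$; the parameters below are hypothetical, chosen only for illustration.

```python
from itertools import product
from math import exp, sqrt

def dist_to_halfspace(x, a, b):
    # d(x, C) for C = {y : <a, y> >= b} is max(0, (b - <a, x>) / ||a||).
    dot = sum(ai * xi for ai, xi in zip(a, x))
    return max(0.0, (b - dot) / sqrt(sum(ai * ai for ai in a)))

def lhs_II1(n, a, b):
    # P(X in C) * E[exp(d(X, C)^2 / 8)] by exhaustive enumeration of X(Omega).
    signs = list(product((-1, 1), repeat=n))
    p = sum(dist_to_halfspace(x, a, b) == 0.0 for x in signs) / 2 ** n
    m = sum(exp(dist_to_halfspace(x, a, b) ** 2 / 8) for x in signs) / 2 ** n
    return p * m

# Hypothetical halfspaces, each meeting X(Omega); (II.1) says the product is <= 1.
assert lhs_II1(6, (1, 1, 1, 1, 1, 1), 0) <= 1
assert lhs_II1(6, (1, 2, 1, 1, 3, 1), 2) <= 1
assert lhs_II1(5, (1, 1, 1, 1, 1), 3) <= 1
```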