LFM Stats And Pure


grandes-ecoles 2018 Q29 Expectation and Moment Inequality Proof
We denote
$$p_{+} = \mathbb{P}(X' \in C_{+1}) \quad \text{and} \quad p_{-} = \mathbb{P}(X' \in C_{-1})$$
We will assume, without loss of generality, that $p_{+} \geqslant p_{-}$. We set $\lambda = 1 - \frac{p_{-}}{p_{+}}$. Show that
$$\mathbb{E}\left(\exp\left(\frac{1}{8} d(X, C)^{2}\right)\right) \leqslant \frac{1}{2p_{+}}\left(1 + \exp\left(\frac{\lambda^{2}}{2}\right) (1 - \lambda)^{\lambda - 1}\right)$$
grandes-ecoles 2018 Q32 Probability Inequality and Tail Bound Proof
Complete the proof of inequality
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right) \leqslant 1 \tag{II.1}$$
grandes-ecoles 2018 Q33 Probability Inequality and Tail Bound Proof
Deduce Talagrand's inequality: For every non-empty closed convex set $C$ of $E$ and for every strictly positive real number $t$
$$\mathbb{P}(X \in C) \cdot \mathbb{P}(d(X, C) \geqslant t) \leqslant \exp\left(-\frac{t^{2}}{8}\right)$$
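As a numerical sanity check of the stated inequality (not a proof; taking $X$ to be a vector of independent Rademacher signs and $C$ a closed half-space, with all parameters below chosen for illustration only):

```python
import itertools, math

# Sanity check (illustrative, not a proof) of the Talagrand-type bound
#   P(X in C) * P(d(X, C) >= t) <= exp(-t^2 / 8)
# for X a vector of n independent Rademacher signs and C the closed
# half-space {x : <a, x> <= b}, a non-empty closed convex set whose
# Euclidean distance function is d(x, C) = max(0, <a, x> - b).
n = 10
a = [1.0 / math.sqrt(n)] * n                     # unit normal of the half-space
sums = [sum(ai * xi for ai, xi in zip(a, x))
        for x in itertools.product([-1, 1], repeat=n)]

ok = True
for b in [0.0, 0.5, 1.0, 2.0]:
    dist = [max(0.0, s - b) for s in sums]       # distance to C per outcome
    p_in = sum(v <= 1e-9 for v in dist) / len(dist)    # P(X in C)
    for t in [0.5, 1.0, 2.0, 3.0]:
        p_far = sum(v >= t for v in dist) / len(dist)  # P(d(X, C) >= t)
        ok = ok and (p_in * p_far <= math.exp(-t * t / 8) + 1e-12)
print(ok)
```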
grandes-ecoles 2018 Q34 Integrability, Boundedness, and Regularity of Density/Distribution-Related Functions
We consider the space $E = \mathcal{M}_{k,d}(\mathbb{R})$ equipped with the inner product defined by
$$\forall (A, B) \in E^{2}, \quad \langle A \mid B \rangle = \operatorname{tr}\left(A^{\top} \cdot B\right)$$
We denote by $\|\cdot\|_{F}$ the associated Euclidean norm. We fix a vector $u = (u_{1}, \ldots, u_{d})$ in $\mathbb{R}^{d}$ with $\|u\| = 1$, and define
$$g : \left\lvert \, \begin{aligned} & \mathcal{M}_{k,d}(\mathbb{R}) \rightarrow \mathbb{R} \\ & M \mapsto \|M \cdot u\| \end{aligned} \right.$$
Show that $C = \left\{M \in \mathcal{M}_{k,d}(\mathbb{R}) \mid g(M) \leqslant r\right\}$ is a convex and closed subset of $\mathcal{M}_{k,d}(\mathbb{R})$.
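The convexity claim comes down to the triangle inequality applied to the norm composed with the linear map $M \mapsto M \cdot u$. A quick spot check of that step on random matrices (the dimensions and the vector $u$ below are illustrative assumptions):

```python
import math, random

# Spot check (random matrices; illustrative) of the convexity step
#   g(s*A + (1-s)*B) <= s*g(A) + (1-s)*g(B),  g(M) = ||M u||,
# which makes the sublevel set C = {M : g(M) <= r} convex.
random.seed(0)
k, d = 4, 3
u = [1 / math.sqrt(d)] * d                       # a fixed unit vector (assumption)

def g(M):
    # ||M u|| for M given as a list of k rows of length d
    return math.sqrt(sum(sum(M[i][j] * u[j] for j in range(d)) ** 2
                         for i in range(k)))

def rand_mat():
    return [[random.uniform(-1, 1) for _ in range(d)] for _ in range(k)]

ok = True
for _ in range(200):
    A, B, s = rand_mat(), rand_mat(), random.random()
    M = [[s * A[i][j] + (1 - s) * B[i][j] for j in range(d)] for i in range(k)]
    ok = ok and (g(M) <= s * g(A) + (1 - s) * g(B) + 1e-9)
print(ok)
```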
grandes-ecoles 2018 Q37 Probability Inequality and Tail Bound Proof
We consider the space $E = \mathcal{M}_{k,d}(\mathbb{R})$ equipped with the Frobenius norm $\|\cdot\|_{F}$. We fix a unit vector $u$ in $\mathbb{R}^{d}$, define $g(M) = \|M \cdot u\|$, and let $C = \{M \in \mathcal{M}_{k,d}(\mathbb{R}) \mid g(M) \leqslant r\}$. Let $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ be a random variable taking values in $\mathcal{M}_{k,d}(\mathbb{R})$, whose coefficients $\varepsilon_{ij}$ are independent Rademacher random variables. Let $r$ and $t$ be two real numbers, with $t > 0$. Deduce that
$$\mathbb{P}(g(X) \leqslant r) \cdot \mathbb{P}(g(X) \geqslant r + t) \leqslant \exp\left(-\frac{1}{8}t^{2}\right)$$
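A small exhaustive check of this bound by enumerating all sign matrices (the values of $k$, $d$, $u$, $r$, $t$ are illustrative assumptions, not from the source):

```python
import itertools, math

# Enumeration check (illustrative parameters) of
#   P(g(X) <= r) * P(g(X) >= r + t) <= exp(-t^2 / 8)
# for X a k x d matrix of independent Rademacher signs and g(M) = ||M u||.
k, d = 3, 3
u = [1 / math.sqrt(d)] * d                       # a fixed unit vector (assumption)

def g(rows):
    # rows: tuple of k sign-rows of length d; returns ||M u||
    return math.sqrt(sum(sum(r[j] * u[j] for j in range(d)) ** 2 for r in rows))

matrices = list(itertools.product(itertools.product([-1, 1], repeat=d), repeat=k))
values = [g(m) for m in matrices]

ok = True
for r in [1.0, 2.0, 3.0]:
    p_low = sum(v <= r for v in values) / len(values)
    for t in [0.5, 1.0, 2.0]:
        p_high = sum(v >= r + t for v in values) / len(values)
        ok = ok and (p_low * p_high <= math.exp(-t * t / 8) + 1e-12)
print(ok)
```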
grandes-ecoles 2019 Q38 Convergence in Distribution or Probability
We consider the space $E = \mathcal{M}_{k,d}(\mathbb{R})$ equipped with the inner product defined by
$$\forall (A, B) \in E^{2}, \quad \langle A \mid B \rangle = \operatorname{tr}\left(A^{\top} \cdot B\right)$$
We fix a vector $(u_{1}, \ldots, u_{d})$ in $\mathbb{R}^{d}$ with $\|u\| = 1$, and define $g(M) = \|M \cdot u\|$. Let $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ be a random variable taking values in $\mathcal{M}_{k,d}(\mathbb{R})$, whose coefficients $\varepsilon_{ij}$ are independent Rademacher random variables. We say that a real number $m$ is a median of $g(X)$ when
$$\mathbb{P}(g(X) \geqslant m) \geqslant \frac{1}{2} \quad \text{and} \quad \mathbb{P}(g(X) \leqslant m) \geqslant \frac{1}{2}$$
Justify that $g(X)$ admits at least one median. One may consider the function $G$ from $\mathbb{R}$ to $\mathbb{R}$ such that, for every real number $t$, $G(t) = \mathbb{P}(g(X) \leqslant t)$, and examine the set $G^{-1}([1/2, 1])$.
grandes-ecoles 2018 Q39 Probability Inequality and Tail Bound Proof
We consider $g(X)$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable with independent Rademacher coefficients and $g(M) = \|M \cdot u\|$ for a fixed unit vector $u$. Deduce from the above that, for every strictly positive real number $t$
$$\mathbb{P}(|g(X) - m| \geqslant t) \leqslant 4\exp\left(-\frac{1}{8}t^{2}\right)$$
where $m$ is a median of $g(X)$.
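An empirical check of the median-concentration bound on a small fully enumerable case (the dimensions and the vector $u$ below are illustrative assumptions):

```python
import itertools, math

# Enumeration check (small k, d; illustrative) of
#   P(|g(X) - m| >= t) <= 4 * exp(-t^2 / 8)
# for X a k x d Rademacher matrix, g(M) = ||M u||, and m a median of g(X).
k, d = 3, 3
u = [1 / math.sqrt(d)] * d                       # a fixed unit vector (assumption)

def g(rows):
    return math.sqrt(sum(sum(r[j] * u[j] for j in range(d)) ** 2 for r in rows))

values = sorted(g(rows) for rows in
                itertools.product(itertools.product([-1, 1], repeat=d), repeat=k))
m = values[(len(values) - 1) // 2]   # a median: both tails carry mass >= 1/2

ok = True
for t in [0.25, 0.5, 1.0, 1.5, 2.0]:
    p = sum(abs(v - m) >= t for v in values) / len(values)
    ok = ok and (p <= 4 * math.exp(-t * t / 8) + 1e-12)
print(ok)
```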
grandes-ecoles 2018 Q40 Expectation and Moment Inequality Proof
We consider $g(X)$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable with independent Rademacher coefficients and $g(M) = \|M \cdot u\|$ for a fixed unit vector $u$, and $m$ is a median of $g(X)$. Deduce that $\mathbb{E}\left((g(X) - m)^{2}\right) \leqslant 32$.
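For orientation, the constant 32 is consistent with integrating the Q39 tail bound (a sketch, assuming that bound):

```latex
\begin{align*}
\mathbb{E}\big((g(X)-m)^2\big)
  &= \int_0^{+\infty} \mathbb{P}\big((g(X)-m)^2 \geqslant s\big)\,\mathrm{d}s
   = \int_0^{+\infty} \mathbb{P}\big(|g(X)-m| \geqslant \sqrt{s}\big)\,\mathrm{d}s \\
  &\leqslant \int_0^{+\infty} 4\exp\!\left(-\frac{s}{8}\right)\mathrm{d}s
   = 4 \cdot 8 = 32.
\end{align*}
```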
grandes-ecoles 2019 Q4 Convergence in Distribution or Probability
We set $$\forall n \in \mathbb{N}^{\star}, \quad X_n = \sum_{k=1}^{n} \frac{\varepsilon_k}{2^k}$$ where $(\varepsilon_n)_{n \geqslant 1}$ is a sequence of independent random variables taking values in $\{-1,1\}$ with $\mathbb{P}(\varepsilon_n = 1) = \mathbb{P}(\varepsilon_n = -1) = 1/2$ for all $n \geqslant 1$.
Study the continuity of $\lim_{n \rightarrow +\infty} \Phi_{X_n}$.
grandes-ecoles 2019 Q7 Integrability, Boundedness, and Regularity of Density/Distribution-Related Functions
We set $$\forall n \in \mathbb{N}^{\star}, \quad X_n = \sum_{k=1}^{n} \frac{\varepsilon_k}{2^k}$$ where $(\varepsilon_n)_{n \geqslant 1}$ is a sequence of independent random variables taking values in $\{-1,1\}$ with $\mathbb{P}(\varepsilon_n = 1) = \mathbb{P}(\varepsilon_n = -1) = 1/2$ for all $n \geqslant 1$, and $$\forall n \in \mathbb{N}^{\star}, \quad \varphi_n : \begin{aligned} \mathbb{R} &\rightarrow \mathbb{R} \\ t &\mapsto \mathbb{E}(\cos(t X_n)) \end{aligned}$$
Does the sequence of functions $(\varphi_n)_{n \geqslant 1}$ converge uniformly on $\mathbb{R}$?
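By independence of the signs, $\mathbb{E}(\cos(t X_n)) = \prod_{k=1}^{n} \cos(t/2^k)$, the closed form one typically uses to study $(\varphi_n)$. A quick enumeration check of that identity (the values of $n$ and $t$ below are arbitrary illustrative choices):

```python
import itertools, math

# Check (small n, illustrative t) that, by independence of the Rademacher
# signs, E[cos(t X_n)] equals the product of cos(t / 2^k) over k = 1..n,
# where X_n = sum_{k=1}^n eps_k / 2^k.
n, t = 6, 2.3
signs = itertools.product([-1, 1], repeat=n)
phi_direct = sum(math.cos(t * sum(e / 2 ** (k + 1) for k, e in enumerate(s)))
                 for s in signs) / 2 ** n
phi_product = math.prod(math.cos(t / 2 ** k) for k in range(1, n + 1))
print(abs(phi_direct - phi_product) < 1e-12)
```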
grandes-ecoles 2019 Q18 Probability Inequality and Tail Bound Proof
We fix once and for all an integer $n$, to be thought of as very large. For each pair $(x,y) \in \{0,1\}^n$ such that $x \neq y$, we fix an integer $j_n(x,y)$ whose existence is proved in question 17c. We say that $x$ is better than $y$ given $E^1, E^2, \ldots, E^T$ if $$\left|\frac{1}{T}\sum_{i=1}^T E^i_{j_n(x,y)} - \mathbb{E}\left[O_{j_n(x,y)}(x)\right]\right| < \left|\frac{1}{T}\sum_{i=1}^T E^i_{j_n(x,y)} - \mathbb{E}\left[O_{j_n(x,y)}(y)\right]\right|$$ We set $R_{n,T}(E^1, E^2, \ldots, E^T) = x$ if, for all $y \neq x$, $x$ is better than $y$. If no such $x$ exists, we set $R_{n,T}(E^1, E^2, \ldots, E^T) = (0,0,\ldots,0)$.
Prove that if $T_n \geq e^{3\ln(n)n^{1/3}}$ then for all $x \in \{0,1\}^n$ and any sequence $$O^1(x), O^2(x), \ldots, O^{T_n}(x)$$ of $T_n$ random variables taking values in $\{0,1\}^n$ mutually independent with the same distribution as $O(x)$, we have $$\max_{x \in \{0,1\}^n} \mathbb{P}\left(R_{n,T_n}\left(O^1(x), O^2(x), \ldots, O^{T_n}(x)\right) \neq x\right) \leq u_n$$ where $(u_n)_{n \geq 1}$ is a sequence tending to 0 as $n$ tends to infinity.
Hint. One may start by writing, justifying it, that $$\mathbb{P}\left(R_{n,T}\left(O^1(x), O^2(x), \ldots, O^T(x)\right) \neq x\right) \leq \sum_{y \in \{0,1\}^n, y \neq x} \mathbb{P}\left(x \text{ is not better than } y \text{ given } O^1(x), O^2(x), \ldots, O^T(x)\right)$$
grandes-ecoles 2019 Q19 Distribution of Transformed or Combined Random Variables
Let $(\Omega, \mathcal{A}, \mathbb{P})$ be a probability space, $(U_n)_{n \geqslant 1}$ a sequence of mutually independent random variables following a Bernoulli distribution with parameter $1/2$. We set $$\forall n \in \mathbb{N}^{\star}, \quad Y_n = \sum_{k=1}^{n} \frac{U_k}{2^k}.$$
Justify $$\forall n \in \mathbb{N}^{\star}, \quad \mathbb{P}(Y_n \in [0,1[) = 1.$$
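The claim rests on the deterministic bound $0 \leqslant Y_n \leqslant \sum_{k=1}^{n} 2^{-k} = 1 - 2^{-n} < 1$. An exhaustive check of that bound for a small $n$ (the value of $n$ is an illustrative choice):

```python
import itertools

# Enumeration check (small n, illustrative) that Y_n = sum_{k=1}^n U_k / 2^k
# always lies in [0, 1): the largest attainable value is 1 - 2^{-n} < 1.
n = 8
ok = all(0 <= sum(u / 2 ** (k + 1) for k, u in enumerate(us)) <= 1 - 2 ** -n
         for us in itertools.product([0, 1], repeat=n))
print(ok)
```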
grandes-ecoles 2019 Q24 Convergence in Distribution or Probability
Let $(\Omega, \mathcal{A}, \mathbb{P})$ be a probability space, $(U_n)_{n \geqslant 1}$ a sequence of mutually independent random variables following a Bernoulli distribution with parameter $1/2$. We set $$\forall n \in \mathbb{N}^{\star}, \quad Y_n = \sum_{k=1}^{n} \frac{U_k}{2^k}, \quad F_n(x) = \mathbb{P}(Y_n \leqslant x), \quad G_n(x) = \mathbb{P}(Y_n < x).$$
Let $x$ be a real number. Establish the monotonicity of the sequences $(F_n(x))_{n \geqslant 1}$ and $(G_n(x))_{n \geqslant 1}$.
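Since $Y_{n+1} = Y_n + U_{n+1}/2^{n+1} \geqslant Y_n$ pointwise, one expects the events $\{Y_{n+1} \leqslant x\}$ to shrink with $n$, and likewise for $\{Y_n < x\}$. A small enumeration check of this monotonicity (the ranges of $n$ and the grid of $x$ values are illustrative):

```python
import itertools

# Check (small n, illustrative x grid) that F_n(x) = P(Y_n <= x) is
# non-increasing in n; Y_{n+1} >= Y_n pointwise, so {Y_{n+1} <= x}
# is contained in {Y_n <= x}. The same holds for G_n(x) = P(Y_n < x).
def F(n, x):
    outcomes = itertools.product([0, 1], repeat=n)
    return sum(sum(u / 2 ** (k + 1) for k, u in enumerate(us)) <= x
               for us in outcomes) / 2 ** n

xs = [0.1, 0.3, 0.5, 0.9]
ok = all(F(n + 1, x) <= F(n, x) for n in range(1, 7) for x in xs)
print(ok)
```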