UFM Statistics

View all 201 questions →

grandes-ecoles 2018 Q27 Expectation of a Function of a Discrete Random Variable
Let $X$ be a random variable that follows the zeta distribution with parameter $x > 1$, i.e. $$\forall n \in \mathbb{N}^{*}, \quad \mathbb{P}(X = n) = \frac{1}{\zeta(x) n^{x}}$$ Using the result of Q26, deduce the variance of $X$.
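Q26 is not reproduced in this excerpt; assuming it provides the transfer formula $\mathbb{E}(h(X)) = \sum_{n \geqslant 1} h(n)\,\mathbb{P}(X = n)$, one computation would run as follows (note that $\mathbb{E}(X)$ requires $x > 2$ and the variance requires $x > 3$ for the series to converge):
$$\mathbb{E}(X) = \sum_{n=1}^{+\infty} \frac{n}{\zeta(x) n^{x}} = \frac{\zeta(x-1)}{\zeta(x)}, \qquad \mathbb{E}\left(X^{2}\right) = \frac{\zeta(x-2)}{\zeta(x)},$$
$$\mathbb{V}(X) = \mathbb{E}\left(X^{2}\right) - \mathbb{E}(X)^{2} = \frac{\zeta(x-2)}{\zeta(x)} - \left(\frac{\zeta(x-1)}{\zeta(x)}\right)^{2}.$$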
grandes-ecoles 2018 Q41 Expectation of a Function of a Discrete Random Variable
We consider the space $E = \mathcal{M}_{k,d}(\mathbb{R})$ equipped with the inner product defined by
$$\forall (A, B) \in E^{2}, \quad \langle A \mid B \rangle = \operatorname{tr}\left(A^{\top} \cdot B\right)$$
We fix a vector $(u_{1}, \ldots, u_{d})$ in $\mathbb{R}^{d}$ with $\|u\| = 1$, and define $g(M) = \|M \cdot u\|$. Let $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ be a random variable taking values in $\mathcal{M}_{k,d}(\mathbb{R})$, whose coefficients $\varepsilon_{ij}$ are independent Rademacher random variables.
Show that $\mathbb{E}\left(g(X)^{2}\right) = k$, and deduce that $\mathbb{E}(g(X)) \leqslant \sqrt{k}$.
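Not part of the exam, but the two claims are easy to sanity-check by Monte Carlo; the specific $k$, $d$, $u$ and sample size below are arbitrary illustrative choices:

```python
import math
import random

random.seed(0)
k, d, trials = 5, 4, 20000
u = [0.5, 0.5, 0.5, 0.5]  # a unit vector in R^d

def g_sample():
    # g(X) = ||X . u|| for a k x d matrix X of independent Rademacher signs;
    # each row contributes (sum_j eps_j * u_j)^2 to the squared norm
    return math.sqrt(sum(sum(random.choice((-1, 1)) * uj for uj in u) ** 2
                         for _ in range(k)))

samples = [g_sample() for _ in range(trials)]
mean_g2 = sum(s * s for s in samples) / trials   # should be close to k
mean_g = sum(samples) / trials                   # should not exceed sqrt(k)
print(mean_g2, mean_g, math.sqrt(k))
```

The gap between `mean_g` and $\sqrt{k}$ reflects the Jensen inequality $\mathbb{E}(g(X)) \leqslant \sqrt{\mathbb{E}(g(X)^{2})}$.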
grandes-ecoles 2018 Q42 Probability Bounds and Inequalities for Discrete Variables
We consider the space $E = \mathcal{M}_{k,d}(\mathbb{R})$ equipped with the inner product defined by
$$\forall (A, B) \in E^{2}, \quad \langle A \mid B \rangle = \operatorname{tr}\left(A^{\top} \cdot B\right)$$
We fix a vector $(u_{1}, \ldots, u_{d})$ in $\mathbb{R}^{d}$ with $\|u\| = 1$, and define $g(M) = \|M \cdot u\|$. Let $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ be a random variable taking values in $\mathcal{M}_{k,d}(\mathbb{R})$, whose coefficients $\varepsilon_{ij}$ are independent Rademacher random variables. Let $m$ be a median of $g(X)$.
Deduce that $(\sqrt{k} - m)^{2} \leqslant \mathbb{E}\left((g(X) - m)^{2}\right)$.
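One natural route, assuming the previous question's facts $\mathbb{E}\left(g(X)^{2}\right) = k$ and $\mathbb{E}(g(X)) \leqslant \sqrt{k}$, together with $m \geqslant 0$ (any median of the nonnegative variable $g(X)$ is nonnegative):
$$\mathbb{E}\left((g(X) - m)^{2}\right) = \mathbb{E}\left(g(X)^{2}\right) - 2m\,\mathbb{E}(g(X)) + m^{2} \geqslant k - 2m\sqrt{k} + m^{2} = (\sqrt{k} - m)^{2}.$$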
grandes-ecoles 2018 Q43 Probability Bounds and Inequalities for Discrete Variables
We consider the space $E = \mathcal{M}_{k,d}(\mathbb{R})$ equipped with the inner product defined by
$$\forall (A, B) \in E^{2}, \quad \langle A \mid B \rangle = \operatorname{tr}\left(A^{\top} \cdot B\right)$$
We fix a vector $(u_{1}, \ldots, u_{d})$ in $\mathbb{R}^{d}$ with $\|u\| = 1$, and define $g(M) = \|M \cdot u\|$. Let $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ be a random variable taking values in $\mathcal{M}_{k,d}(\mathbb{R})$, whose coefficients $\varepsilon_{ij}$ are independent Rademacher random variables.
Show that, for every strictly positive real number $t$
$$\mathbb{P}(|g(X) - \sqrt{k}| \geqslant t) \leqslant 4 \exp(4) \exp\left(-\frac{1}{16} t^{2}\right)$$
grandes-ecoles 2018 Q44 Probability Bounds and Inequalities for Discrete Variables
We consider the space $E = \mathcal{M}_{k,d}(\mathbb{R})$ equipped with the inner product defined by
$$\forall (A, B) \in E^{2}, \quad \langle A \mid B \rangle = \operatorname{tr}\left(A^{\top} \cdot B\right)$$
We fix a vector $(u_{1}, \ldots, u_{d})$ in $\mathbb{R}^{d}$ with $\|u\| = 1$, and define $g(M) = \|M \cdot u\|$. Let $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ be a random variable taking values in $\mathcal{M}_{k,d}(\mathbb{R})$, whose coefficients $\varepsilon_{ij}$ are independent Rademacher random variables. We set $A_{k} = \frac{X}{\sqrt{k}}$. Let $\varepsilon$ be in $]0, 1[$ and $\delta$ be in $]0, 1/2[$. We assume that $k \geqslant 160 \frac{\ln(1/\delta)}{\varepsilon^{2}}$.
Show that, for every unit vector $u$ in $\mathbb{R}^{d}$:
$$\mathbb{P}\left(\left|\left\|A_{k} \cdot u\right\| - 1\right| > \varepsilon\right) < \delta$$
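The constant $160$ can be checked numerically. Since $\|A_{k} \cdot u\| = g(X)/\sqrt{k}$, the event above is $|g(X) - \sqrt{k}| > \varepsilon\sqrt{k}$, i.e. $t = \varepsilon\sqrt{k}$ in Q43's bound; at the threshold $k = 160\ln(1/\delta)/\varepsilon^{2}$ the exponent $-k\varepsilon^{2}/16$ equals $-10\ln(1/\delta)$, and larger $k$ only helps. A small sketch of the resulting inequality $4e^{4}\delta^{10} < \delta$ on $]0, 1/2[$:

```python
import math

# At k = 160*ln(1/delta)/eps**2, Q43's bound evaluated at t = eps*sqrt(k)
# is 4 * e**4 * delta**10; we check it stays below delta on a grid of ]0, 1/2[
worst = max(4 * math.exp(4) * (i / 100) ** 10 - (i / 100) for i in range(1, 50))
print(worst)  # negative: the bound beats delta for delta = 0.01 .. 0.49
```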
grandes-ecoles 2018 Q45 Probability Bounds and Inequalities for Discrete Variables
We set $A_{k} = \frac{X}{\sqrt{k}}$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable taking values in $\mathcal{M}_{k,d}(\mathbb{R})$, whose coefficients $\varepsilon_{ij}$ are independent Rademacher random variables. Let $\varepsilon$ be in $]0, 1[$ and $\delta$ be in $]0, 1/2[$. We assume that $k \geqslant 160 \frac{\ln(1/\delta)}{\varepsilon^{2}}$. Let $v_{1}, \ldots, v_{N}$ be distinct vectors in $\mathbb{R}^{d}$. For every $(i, j) \in \llbracket 1, N \rrbracket^{2}$ such that $i < j$ we denote by $E_{ij}$ the event
$$(1 - \varepsilon) \|v_{i} - v_{j}\| \leqslant \|A_{k} \cdot v_{i} - A_{k} \cdot v_{j}\| \leqslant (1 + \varepsilon) \|v_{i} - v_{j}\|$$
Show that $\mathbb{P}\left(\overline{E_{ij}}\right) < \delta$, where $\overline{E_{ij}}$ denotes the complementary event of $E_{ij}$.
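A sketch of the expected reduction to Q44: for $i < j$, the vector $u = \frac{v_{i} - v_{j}}{\|v_{i} - v_{j}\|}$ is well defined (the $v_{i}$ are distinct) and is a unit vector, and by linearity
$$\|A_{k} \cdot v_{i} - A_{k} \cdot v_{j}\| = \|v_{i} - v_{j}\| \cdot \|A_{k} \cdot u\|,$$
so $\overline{E_{ij}}$ is exactly the event $\left|\|A_{k} \cdot u\| - 1\right| > \varepsilon$, whose probability is $< \delta$ by Q44.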
grandes-ecoles 2018 Q46 Probability Bounds and Inequalities for Discrete Variables
We set $A_{k} = \frac{X}{\sqrt{k}}$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable taking values in $\mathcal{M}_{k,d}(\mathbb{R})$, whose coefficients $\varepsilon_{ij}$ are independent Rademacher random variables. Let $\varepsilon$ be in $]0, 1[$ and $\delta$ be in $]0, 1/2[$. We assume that $k \geqslant 160 \frac{\ln(1/\delta)}{\varepsilon^{2}}$. Let $v_{1}, \ldots, v_{N}$ be distinct vectors in $\mathbb{R}^{d}$. For every $(i, j) \in \llbracket 1, N \rrbracket^{2}$ such that $i < j$ we denote by $E_{ij}$ the event
$$(1 - \varepsilon) \|v_{i} - v_{j}\| \leqslant \|A_{k} \cdot v_{i} - A_{k} \cdot v_{j}\| \leqslant (1 + \varepsilon) \|v_{i} - v_{j}\|$$
Deduce that $\mathbb{P}\left(\bigcap_{1 \leqslant i < j \leqslant N} E_{ij}\right) \geqslant 1 - \frac{N(N-1)}{2} \delta$.
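This is the union bound applied to the complements, using Q45 for each of the $\binom{N}{2}$ pairs:
$$\mathbb{P}\left(\bigcap_{1 \leqslant i < j \leqslant N} E_{ij}\right) = 1 - \mathbb{P}\left(\bigcup_{1 \leqslant i < j \leqslant N} \overline{E_{ij}}\right) \geqslant 1 - \sum_{1 \leqslant i < j \leqslant N} \mathbb{P}\left(\overline{E_{ij}}\right) > 1 - \frac{N(N-1)}{2}\,\delta.$$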
grandes-ecoles 2018 Q47 Probability Bounds and Inequalities for Discrete Variables
We set $A_{k} = \frac{X}{\sqrt{k}}$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable taking values in $\mathcal{M}_{k,d}(\mathbb{R})$, whose coefficients $\varepsilon_{ij}$ are independent Rademacher random variables. Let $\varepsilon$ be in $]0, 1[$ and $\delta$ be in $]0, 1/2[$. We assume that $k \geqslant 160 \frac{\ln(1/\delta)}{\varepsilon^{2}}$. Let $v_{1}, \ldots, v_{N}$ be distinct vectors in $\mathbb{R}^{d}$. For every $(i, j) \in \llbracket 1, N \rrbracket^{2}$ such that $i < j$ we denote by $E_{ij}$ the event
$$(1 - \varepsilon) \|v_{i} - v_{j}\| \leqslant \|A_{k} \cdot v_{i} - A_{k} \cdot v_{j}\| \leqslant (1 + \varepsilon) \|v_{i} - v_{j}\|$$
Deduce the Johnson-Lindenstrauss theorem: there exists a strictly positive absolute constant $c$ such that, for all natural integers $N$ and $d$ greater than or equal to $2$ and all distinct $v_{1}, \ldots, v_{N}$ in $\mathbb{R}^{d}$, it suffices that
$$k \geqslant c \frac{\ln(N)}{\varepsilon^{2}}$$
for there to exist an $\varepsilon$-isometry $f : \mathbb{R}^{d} \rightarrow \mathbb{R}^{k}$ for $v_{1}, \ldots, v_{N}$.
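An illustrative sketch (not the exam's requested proof): taking $\delta = N^{-4}$ in Q44 gives $k = 640\ln(N)/\varepsilon^{2}$, i.e. $c = 640$ with this crude choice (the exam's constant may differ), and makes the union bound over the $N(N-1)/2$ pairs small. The Rademacher projection $A_{k}$ then preserves all pairwise distances up to factor $1 \pm \varepsilon$ with overwhelming probability; the dimensions and points below are arbitrary:

```python
import math
import random

random.seed(0)

def jl_project(points, k):
    """Project points of R^d to R^k with A_k = (1/sqrt(k)) * (Rademacher matrix)."""
    d = len(points[0])
    A = [[random.choice((-1, 1)) for _ in range(d)] for _ in range(k)]
    s = 1 / math.sqrt(k)
    return [[s * sum(row[j] * p[j] for j in range(d)) for row in A] for p in points]

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

N, d, eps = 10, 30, 0.5
# delta = N**-4 keeps the union bound over the N(N-1)/2 pairs below 1/(2*N**2)
k = math.ceil(160 * math.log(N ** 4) / eps ** 2)
pts = [[random.uniform(-1, 1) for _ in range(d)] for _ in range(N)]
proj = jl_project(pts, k)
worst = max(abs(dist(proj[i], proj[j]) / dist(pts[i], pts[j]) - 1)
            for i in range(N) for j in range(i + 1, N))
print(k, worst)
```

With these loose constants $k$ is large, so the observed worst distortion is typically far below $\varepsilon$.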
grandes-ecoles 2019 Q3 Convergence of Expectations or Moments
Let $t$ be a real number. We set $$\forall n \in \mathbb{N}^{\star}, \quad X_n = \sum_{k=1}^{n} \frac{\varepsilon_k}{2^k}$$ where $(\varepsilon_n)_{n \geqslant 1}$ is a sequence of independent random variables taking values in $\{-1,1\}$ with $\mathbb{P}(\varepsilon_n = 1) = \mathbb{P}(\varepsilon_n = -1) = 1/2$ for all $n \geqslant 1$, and $$\operatorname{sinc}\, t = \begin{cases} \frac{\sin t}{t} & \text{if } t \neq 0 \\ 1 & \text{otherwise} \end{cases}$$
Determine the pointwise limit of the sequence of functions $(\Phi_{X_n})_{n \geqslant 1}$.
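Assuming $\Phi_{X_n}$ denotes the characteristic function of $X_n$ (it is not redefined in this excerpt), independence gives $\Phi_{X_n}(t) = \prod_{k=1}^{n} \cos(t/2^{k})$, and the classical identity $\sin t = 2^{n} \sin(t/2^{n}) \prod_{k=1}^{n} \cos(t/2^{k})$ suggests the pointwise limit $\operatorname{sinc} t$. A quick numerical sketch:

```python
import math

def phi_n(t, n):
    """Characteristic function of X_n: by independence and symmetry of eps_k,
    E[exp(i*t*X_n)] = prod_{k=1}^{n} cos(t / 2**k), which is real."""
    p = 1.0
    for k in range(1, n + 1):
        p *= math.cos(t / 2 ** k)
    return p

def sinc(t):
    return math.sin(t) / t if t != 0 else 1.0

for t in (0.0, 0.5, 1.0, 3.0, 10.0):
    print(t, phi_n(t, 40), sinc(t))  # the two columns agree closely
```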
grandes-ecoles 2019 Q5 Independence Proofs for Discrete Random Variables
We set $$\forall n \in \mathbb{N}^{\star}, \quad X_n = \sum_{k=1}^{n} \frac{\varepsilon_k}{2^k}$$ where $(\varepsilon_n)_{n \geqslant 1}$ is a sequence of independent random variables taking values in $\{-1,1\}$ with $\mathbb{P}(\varepsilon_n = 1) = \mathbb{P}(\varepsilon_n = -1) = 1/2$ for all $n \geqslant 1$.
Show that $X_n$ and $-X_n$ have the same distribution for all $n \in \mathbb{N}^{\star}$.
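The claim can be verified exactly for small $n$ by enumerating the $2^{n}$ equally likely sign patterns (the key point: $(\varepsilon_1, \ldots, \varepsilon_n) \mapsto (-\varepsilon_1, \ldots, -\varepsilon_n)$ is a law-preserving bijection). A sketch with $n = 4$, an arbitrary small choice:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

n = 4
# Exact law of X_n: enumerate the 2**n equally likely sign patterns
law = Counter()
for eps in product((-1, 1), repeat=n):
    x = sum(Fraction(e, 2 ** k) for k, e in enumerate(eps, start=1))
    law[x] += 1
# Law of -X_n: negate every atom, keep its weight
law_of_minus = Counter({-x: c for x, c in law.items()})
same = (law == law_of_minus)
print(same)  # True
```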
grandes-ecoles 2019 Q6 Expectation of a Function of a Discrete Random Variable
Using the result that $X_n$ and $-X_n$ have the same distribution, deduce the pointwise limit of the sequence of functions $(\varphi_n)_{n \geqslant 1}$ defined by $$\forall n \in \mathbb{N}^{\star}, \quad \varphi_n : \begin{aligned} \mathbb{R} &\rightarrow \mathbb{R} \\ t &\mapsto \mathbb{E}\left(\cos\left(t X_n\right)\right) \end{aligned}$$
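A sketch of one route: since $X_n$ and $-X_n$ have the same law, $\mathbb{E}(\sin(tX_n)) = 0$, so $\varphi_n$ coincides with the characteristic function, and by independence
$$\varphi_n(t) = \mathbb{E}\left(\mathrm{e}^{\mathrm{i}tX_n}\right) = \prod_{k=1}^{n} \mathbb{E}\left(\mathrm{e}^{\mathrm{i}t\varepsilon_k/2^k}\right) = \prod_{k=1}^{n} \cos\left(\frac{t}{2^{k}}\right) \xrightarrow[n \to +\infty]{} \operatorname{sinc} t,$$
the limit coming from the identity $\sin t = 2^{n} \sin(t/2^{n}) \prod_{k=1}^{n} \cos(t/2^{k})$.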
grandes-ecoles 2020 Q1 Probability Bounds and Inequalities for Discrete Variables
Let $Z$ be a discrete real random variable such that $\exp(\lambda Z)$ has finite expectation for all $\lambda > 0$. Show that for all $\lambda > 0$ and $t \in \mathbb{R}$, $$P[Z \geqslant t] \leqslant \exp(-\lambda t) E[\exp(\lambda Z)].$$
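This is the Chernoff/Markov bound; it can be checked exactly on a toy example. Below $Z$ is a sum of 8 Rademacher signs with $\lambda = 0.5$ and $t = 4$ (arbitrary illustrative choices), enumerated exhaustively:

```python
import math
from fractions import Fraction
from itertools import product

n, t, lam = 8, 4, 0.5
tail = Fraction(0)   # exact P[Z >= t]
mgf = 0.0            # E[exp(lam * Z)]
for eps in product((-1, 1), repeat=n):
    z = sum(eps)
    tail += Fraction(int(z >= t), 2 ** n)
    mgf += math.exp(lam * z) / 2 ** n
bound = math.exp(-lam * t) * mgf
print(float(tail), bound)  # the exact tail sits below the bound
```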
grandes-ecoles 2020 Q2 Probability Bounds and Inequalities for Discrete Variables
Let $n \geqslant 1$ be a natural integer, and let $(X_1, \ldots, X_n)$ be discrete real random variables that are mutually independent such that, for all $k \in \{1, \ldots, n\}$, $$P[X_k = 1] = P[X_k = -1] = \frac{1}{2}$$ We define $$S_n = \frac{1}{n} \sum_{k=1}^{n} X_k$$ Show that $P[S_n \geqslant 0] \geqslant \frac{1}{2}$.
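The inequality (which follows from the symmetry $S_n \sim -S_n$, so $\mathbb{P}[S_n \geqslant 0] = \mathbb{P}[S_n \leqslant 0]$ and the two events cover $\Omega$) can be checked exactly for small $n$ by enumeration:

```python
from fractions import Fraction
from itertools import product

def p_nonneg(n):
    """Exact P[S_n >= 0] by enumerating the 2**n equally likely sign patterns."""
    hits = sum(1 for eps in product((-1, 1), repeat=n) if sum(eps) >= 0)
    return Fraction(hits, 2 ** n)

probs = [(n, p_nonneg(n)) for n in range(1, 11)]
for n, p in probs:
    print(n, p)  # each probability is at least 1/2
```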
grandes-ecoles 2020 Q2 Expectation of a Function of a Discrete Random Variable
We assume in this question that $X(\Omega)$ is a countable set. We denote $X(\Omega) = \{x_n, n \in \mathbb{N}\}$ where the $x_n$ are pairwise distinct. For all $n \in \mathbb{N}$, we set $a_n = \mathbb{P}(X = x_n)$. Show that $\phi_X$ is defined on $\mathbb{R}$ and that, for all real $t$, $\phi_X(t) = \sum_{n=0}^{+\infty} a_n \mathrm{e}^{\mathrm{i}tx_n}$.
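Assuming $\phi_X$ denotes the characteristic function $t \mapsto \mathbb{E}\left(\mathrm{e}^{\mathrm{i}tX}\right)$ (it is not redefined in this excerpt), the key point is normal convergence of the series on $\mathbb{R}$:
$$\sum_{n=0}^{+\infty} \left| a_n \mathrm{e}^{\mathrm{i}tx_n} \right| = \sum_{n=0}^{+\infty} a_n = 1 < +\infty,$$
so the series converges absolutely for every real $t$, and the transfer theorem identifies its sum with $\mathbb{E}\left(\mathrm{e}^{\mathrm{i}tX}\right)$.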
grandes-ecoles 2020 Q3 Probability Bounds and Inequalities for Discrete Variables
Let $n \geqslant 1$ be a natural integer, and let $(X_1, \ldots, X_n)$ be discrete real random variables that are mutually independent such that, for all $k \in \{1, \ldots, n\}$, $$P[X_k = 1] = P[X_k = -1] = \frac{1}{2}$$ We define $$S_n = \frac{1}{n} \sum_{k=1}^{n} X_k$$ as well as, for all $\lambda \in \mathbb{R}$, $$\psi(\lambda) = \log\left(\frac{1}{2}e^{\lambda} + \frac{1}{2}e^{-\lambda}\right)$$ Show that for all $t \in \mathbb{R}$, we have $$\frac{1}{n} \log P[S_n \geqslant t] \leqslant \inf_{\lambda \geqslant 0} (\psi(\lambda) - \lambda t).$$
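A small numerical illustration of the bound (not the requested proof), with the arbitrary choices $n = 10$ and $t = 0.4$; the tail is computed exactly by enumeration, and the infimum is approximated on a grid (the grid minimum can only overestimate the true infimum, so the comparison is conservative):

```python
import math
from itertools import product

n, t = 10, 0.4

def psi(lam):
    return math.log(0.5 * math.exp(lam) + 0.5 * math.exp(-lam))

# Exact P[S_n >= t] by enumerating the 2**n sign patterns
tail = sum(1 for eps in product((-1, 1), repeat=n) if sum(eps) / n >= t) / 2 ** n
lhs = math.log(tail) / n
# Grid approximation of inf_{lambda >= 0} (psi(lambda) - lambda * t)
inf_grid = min(psi(l / 100) - (l / 100) * t for l in range(0, 1001))
print(lhs, inf_grid)  # lhs is below the (over-estimated) infimum
```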