2018 centrale-maths1__official

47 maths questions

Q1 Matrices Matrix Norm, Convergence, and Inequality
Let $a$ and $b$ be in $E$. Show the following relation and give a geometric interpretation:
$$\|a + b\|^{2} + \|a - b\|^{2} = 2(\|a\|^{2} + \|b\|^{2})$$
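As a numerical sanity check (not part of the exam), the identity can be verified on random vectors; the choice of $\mathbb{R}^3$ and the sampled vectors below are arbitrary:

```python
import numpy as np

# Arbitrary vectors in R^3 standing in for a and b in the Euclidean space E
rng = np.random.default_rng(0)
a = rng.standard_normal(3)
b = rng.standard_normal(3)

# Parallelogram law: ||a+b||^2 + ||a-b||^2 = 2(||a||^2 + ||b||^2)
lhs = np.linalg.norm(a + b) ** 2 + np.linalg.norm(a - b) ** 2
rhs = 2 * (np.linalg.norm(a) ** 2 + np.linalg.norm(b) ** 2)
```

Geometrically, this is the parallelogram law: the sum of the squares of the two diagonals equals the sum of the squares of the four sides.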
Q2 Proof Deduction or Consequence from Prior Results
Deduce that if $u$, $v$ and $v'$ in $E$ satisfy $v \neq v'$ and $\|u - v\| = \|u - v'\|$, then $\left\|u - \frac{v + v'}{2}\right\| < \|u - v\|$.
Q3 Proof Existence Proof
Let $F$ be a non-empty closed set of $E$ and $u$ in $E$. Show that there exists $v$ in $F$ such that
$$\forall w \in F, \quad \|u - v\| \leqslant \|u - w\|$$
Q4 Proof Deduction or Consequence from Prior Results
Deduce that if $C$ is a non-empty closed convex set of $E$ and $u$ is a vector of $E$ then there exists a unique $v$ in $C$ such that
$$\forall w \in C, \quad \|u - v\| \leqslant \|u - w\|$$
Q5 Proof Direct Proof of an Inequality
Let $p$ and $q$ be two strictly positive reals such that $\frac{1}{p} + \frac{1}{q} = 1$. Show that, for all non-negative reals $a$ and $b$,
$$ab \leqslant \frac{a^{p}}{p} + \frac{b^{q}}{q}$$
You may use the concavity of the logarithm.
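A quick numerical check of Young's inequality (not part of the exam); the exponent $p$ and the sampled values are arbitrary:

```python
import random

random.seed(1)
p = 1.7                  # arbitrary exponent p > 1
q = p / (p - 1)          # conjugate exponent, so that 1/p + 1/q = 1

# Young's inequality: a*b <= a^p/p + b^q/q for non-negative a, b
ok = True
for _ in range(1000):
    a = random.uniform(0.0, 10.0)
    b = random.uniform(0.0, 10.0)
    ok = ok and (a * b <= a**p / p + b**q / q + 1e-9)
```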
Q6 Continuous Probability Distributions and Random Variables Expectation and Moment Inequality Proof
Let $p$ and $q$ be two strictly positive reals such that $\frac{1}{p} + \frac{1}{q} = 1$. Deduce that if $X$ and $Y$ are two real-valued random variables on the finite probability space $(\Omega, \mathcal{A}, \mathbb{P})$ then
$$\mathbb{E}(|XY|) \leqslant \mathbb{E}(|X|^{p})^{1/p} \mathbb{E}(|Y|^{q})^{1/q}$$
You may first show this result when $\mathbb{E}(|X|^{p}) = \mathbb{E}(|Y|^{q}) = 1$.
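A numerical check of this inequality (Hölder's inequality) on a made-up finite probability space; the space size, measure, exponent and values of $X$, $Y$ below are all arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
m = 8                        # arbitrary finite sample space with m outcomes
probs = rng.random(m)
probs /= probs.sum()         # a probability measure on the m outcomes
X = rng.standard_normal(m)   # values of X on each outcome
Y = rng.standard_normal(m)   # values of Y on each outcome
p = 3.0
q = p / (p - 1)              # conjugate exponent

# Hoelder: E(|XY|) <= E(|X|^p)^(1/p) * E(|Y|^q)^(1/q)
lhs = np.sum(probs * np.abs(X * Y))
rhs = np.sum(probs * np.abs(X) ** p) ** (1 / p) * np.sum(probs * np.abs(Y) ** q) ** (1 / q)
```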
Q7 Discrete Random Variables Expectation of a Function of a Discrete Random Variable
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. Let $(A_{1}, \ldots, A_{m})$ be a complete system of events with non-zero probabilities. Show that
$$\mathbb{E}(X) = \sum_{i=1}^{m} \mathbb{P}(A_{i}) \cdot \mathbb{E}(X \mid A_{i})$$
Q8 Discrete Random Variables Integral or Series Representation of Moments
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative real $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Show that
$$\mathbb{E}(X^{2}) = 2 \int_{0}^{+\infty} t \mathbb{P}(|X| \geqslant t) \, dt$$
You may denote $X^{2}(\Omega) = \{y_{1}, \ldots, y_{n}\}$ with $0 \leqslant y_{1} < y_{2} < \cdots < y_{n}$.
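The identity can be checked exactly on a made-up discrete variable (not part of the exam): the tail function is a step function, so the integral reduces to a finite sum over the intervals between consecutive values.

```python
# Arbitrary discrete |X|: value vals[i] taken with probability probs[i]
vals = [0.5, 1.0, 2.0, 3.5]
probs = [0.1, 0.4, 0.3, 0.2]

second_moment = sum(p * v**2 for v, p in zip(vals, probs))

# P(|X| >= t) is constant, equal to P(|X| >= v_j), on each interval (prev, v_j],
# and the integral of 2t over (prev, v_j] is v_j^2 - prev^2, so
# 2 * integral of t * P(|X| >= t) dt can be computed exactly piece by piece.
integral = 0.0
prev = 0.0
for j, v in enumerate(vals):
    tail = sum(probs[j:])                # P(|X| >= t) for t in (prev, v]
    integral += tail * (v**2 - prev**2)
    prev = v
```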
Q9 Discrete Random Variables Probability Bounds and Inequalities for Discrete Variables
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative real $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Show that the second moment of $X$ is less than or equal to $\frac{a}{b}$.
Q10 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative real $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Let $\delta$ be a real such that $0 \leqslant |\delta| \leqslant \sqrt{\frac{a}{b}}$. Justify that, for all real $t$,
$$\mathbb{P}(|X + \delta| \geqslant t) \leqslant \mathbb{P}(|X| \geqslant t - |\delta|)$$
Q11 Inequalities Prove or Verify an Algebraic Inequality (AM-GM, Cauchy-Schwarz, etc.)
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative real $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Let $\delta$ be a real such that $0 \leqslant |\delta| \leqslant \sqrt{\frac{a}{b}}$. Show that, for all real $t$,
$$-b(t - |\delta|)^{2} \leqslant a - \frac{1}{2}bt^{2}$$
Q12 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative real $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Let $\delta$ be a real such that $0 \leqslant |\delta| \leqslant \sqrt{\frac{a}{b}}$. Deduce that for all real $t$ such that $t \geqslant |\delta|$ we have
$$\mathbb{P}(|X + \delta| \geqslant t) \leqslant a \exp(a) \exp\left(-\frac{1}{2}bt^{2}\right)$$
Q13 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
Let $X : \Omega \rightarrow \mathbb{R}$ be a real-valued random variable. We assume that there exist two strictly positive reals $a$ and $b$ such that, for all non-negative real $t$,
$$\mathbb{P}(|X| \geqslant t) \leqslant a \exp(-bt^{2})$$
Let $\delta$ be a real such that $0 \leqslant |\delta| \leqslant \sqrt{\frac{a}{b}}$. Justify that the inequality
$$\mathbb{P}(|X + \delta| \geqslant t) \leqslant a \exp(a) \exp\left(-\frac{1}{2}bt^{2}\right)$$
remains valid if $0 \leqslant t < |\delta|$.
Q14 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. The objective of this part is to show, for any non-empty closed convex set $C$ of $E$,
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right) \leqslant 1 \tag{II.1}$$
Handle the case where $C$ is a closed convex set of $E$ that does not meet $X(\Omega)$.
Q15 Binomial Distribution Derive or Prove a Binomial Distribution Identity
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. We assume that $C$ is a closed convex set of $E$ that meets $X(\Omega)$ in a single vector $u$. Show that $\frac{1}{4}d(X, u)^{2}$ follows a binomial distribution with parameters $n$ and $1/2$.
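A simulation sketch of this fact (not part of the exam): since $u$ lies in $X(\Omega)$ its coordinates are signs, each coordinate of $X$ disagrees with the corresponding coordinate of $u$ independently with probability $1/2$, and each disagreement contributes $4$ to $d(X, u)^2$. The dimension and trial count below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10
trials = 100_000

u = rng.choice([-1, 1], size=n)              # a fixed vector u in X(Omega)
eps = rng.choice([-1, 1], size=(trials, n))  # i.i.d. Rademacher coordinates of X

# Each coordinate contributes (eps_i - u_i)^2, i.e. 0 or 4, so d(X, u)^2 / 4
# counts the mismatched coordinates: a Binomial(n, 1/2) variable.
quarter_d2 = ((eps - u) ** 2).sum(axis=1) / 4
mean = quarter_d2.mean()                     # empirical mean, close to n/2
```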
Q16 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. We assume that $C$ is a closed convex set of $E$ that meets $X(\Omega)$ in a single vector $u$. Deduce the expectation of $\exp\left(\frac{1}{8}d(X, u)^{2}\right)$ and show that it is less than or equal to $2^{n}$.
Q17 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
Let $E$ be a Euclidean space of dimension $n \geqslant 1$ equipped with an orthonormal basis $(e_{1}, \ldots, e_{n})$. Let $\varepsilon_{1}, \ldots, \varepsilon_{n} : \Omega \rightarrow \{-1, 1\}$ be Rademacher random variables that are independent of each other. We set $X = \sum_{i=1}^{n} \varepsilon_{i} e_{i}$. We assume that $C$ is a closed convex set of $E$ that meets $X(\Omega)$ in a single vector $u$. Justify that $d(X, C) \leqslant d(X, u)$ and deduce inequality
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right) \leqslant 1$$
in this case.
Q18 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
We propose to prove the inequality
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right) \leqslant 1 \tag{II.1}$$
by induction on the dimension $n$ of $E$. We assume that $C$ is a closed convex set of $E$ such that $C \cap X(\Omega)$ contains at least two elements. Handle the case $n = 1$.
Q19 Continuous Probability Distributions and Random Variables Verification of Probability Measure or Inner Product Properties
Let $n$ be an integer such that $n \geqslant 2$. We denote by $E'$ the subspace $\operatorname{Vect}(e_{1}, \ldots, e_{n-1})$ and by $\pi$ the orthogonal projection onto $E'$:
$$\pi : \left\lvert \, \begin{aligned} E & \rightarrow E' \\ \sum_{i=1}^{n} x_{i} e_{i} & \mapsto \sum_{i=1}^{n-1} x_{i} e_{i} \end{aligned} \right.$$
We set $X' = \pi \circ X = \sum_{i=1}^{n-1} \varepsilon_{i} e_{i}$. For $t$ in $\{-1, 1\}$ we denote by $H_{t}$ the affine hyperplane $E' + te_{n}$ and set $C_{t} = \pi(C \cap H_{t})$.
Show, for $x' \in E'$ and $t \in \{-1, 1\}$, that $x' \in C_{t} \Longleftrightarrow x' + te_{n} \in C$.
Q20 Continuous Probability Distributions and Random Variables Verification of Probability Measure or Inner Product Properties
Let $n$ be an integer such that $n \geqslant 2$. We denote by $E'$ the subspace $\operatorname{Vect}(e_{1}, \ldots, e_{n-1})$ and by $\pi$ the orthogonal projection onto $E'$:
$$\pi : \left\lvert \, \begin{aligned} E & \rightarrow E' \\ \sum_{i=1}^{n} x_{i} e_{i} & \mapsto \sum_{i=1}^{n-1} x_{i} e_{i} \end{aligned} \right.$$
We set $X' = \pi \circ X = \sum_{i=1}^{n-1} \varepsilon_{i} e_{i}$. For $t$ in $\{-1, 1\}$ we denote by $H_{t}$ the affine hyperplane $E' + te_{n}$ and set $C_{t} = \pi(C \cap H_{t})$.
Show that $C_{+1}$ and $C_{-1}$ are non-empty closed convex sets of $E'$.
Q21 Probability Definitions Proof of a Probability Identity or Inequality
Let $n$ be an integer such that $n \geqslant 2$. We denote by $E'$ the subspace $\operatorname{Vect}(e_{1}, \ldots, e_{n-1})$ and by $\pi$ the orthogonal projection onto $E'$. We set $X' = \pi \circ X = \sum_{i=1}^{n-1} \varepsilon_{i} e_{i}$. For $t$ in $\{-1, 1\}$ we denote by $C_{t} = \pi(C \cap H_{t})$, where $H_{t} = E' + te_{n}$, and by $Y_{t}$ the projection of $X'$ onto the non-empty closed convex set $C_{t}$.
Show that
$$\mathbb{P}(X \in C) = \frac{1}{2}\mathbb{P}(X' \in C_{+1}) + \frac{1}{2}\mathbb{P}(X' \in C_{-1})$$
Q22 Vectors Introduction & 2D Inequality or Proof Involving Vectors
Let $\lambda$ be a real such that $0 \leqslant \lambda \leqslant 1$. For $t$ in $\{-1, 1\}$, $Y_{t}$ denotes the projection of $X'$ onto $C_{t}$. Show that
$$d(X, C) \leqslant \left\|(1 - \lambda)(Y_{\varepsilon_{n}} + \varepsilon_{n} e_{n}) + \lambda(Y_{-\varepsilon_{n}} - \varepsilon_{n} e_{n}) - X\right\|$$
Q23 Vectors Introduction & 2D Inequality or Proof Involving Vectors
Let $\lambda$ be a real such that $0 \leqslant \lambda \leqslant 1$. For $t$ in $\{-1, 1\}$, $Y_{t}$ denotes the projection of $X'$ onto $C_{t}$. Deduce that
$$d(X, C)^{2} \leqslant 4\lambda^{2} + \left\|(1 - \lambda)(Y_{\varepsilon_{n}} - X') + \lambda(Y_{-\varepsilon_{n}} - X')\right\|^{2}$$
then that
$$d(X, C)^{2} \leqslant 4\lambda^{2} + (1 - \lambda)\|Y_{\varepsilon_{n}} - X'\|^{2} + \lambda\|Y_{-\varepsilon_{n}} - X'\|^{2}$$
Thus, show the inequality
$$d(X, C)^{2} \leqslant 4\lambda^{2} + (1 - \lambda)d(X', C_{\varepsilon_{n}})^{2} + \lambda d(X', C_{-\varepsilon_{n}})^{2}$$
Q24 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
We denote
$$p_{+} = \mathbb{P}(X' \in C_{+1}) \quad \text{and} \quad p_{-} = \mathbb{P}(X' \in C_{-1})$$
We assume, without loss of generality, that $p_{+} \geqslant p_{-}$. Show that $p_{-} > 0$.
Q25 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
We denote $p_{+} = \mathbb{P}(X' \in C_{+1})$ and $p_{-} = \mathbb{P}(X' \in C_{-1})$, with $p_{+} \geqslant p_{-}$. Show that for all $\lambda$ in $[0, 1]$
$$\mathbb{E}\left(\left.\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right\rvert \, \varepsilon_{n} = -1\right) \leqslant \exp\left(\frac{\lambda^{2}}{2}\right) \mathbb{E}\left(\left(\exp\left(\frac{1}{8}d(X', C_{-1})^{2}\right)\right)^{1-\lambda} \cdot \left(\exp\left(\frac{1}{8}d(X', C_{+1})^{2}\right)\right)^{\lambda}\right)$$
Q26 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
We denote $p_{+} = \mathbb{P}(X' \in C_{+1})$ and $p_{-} = \mathbb{P}(X' \in C_{-1})$, with $p_{+} \geqslant p_{-}$. Deduce that
$$\mathbb{E}\left(\left.\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right\rvert \, \varepsilon_{n} = -1\right) \leqslant \exp\left(\frac{\lambda^{2}}{2}\right) \left(\mathbb{E}\left(\exp\left(\frac{1}{8}d(X', C_{-1})^{2}\right)\right)\right)^{1-\lambda} \cdot \left(\mathbb{E}\left(\exp\left(\frac{1}{8}d(X', C_{+1})^{2}\right)\right)\right)^{\lambda}$$
Q27 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
We denote $p_{+} = \mathbb{P}(X' \in C_{+1})$ and $p_{-} = \mathbb{P}(X' \in C_{-1})$, with $p_{+} \geqslant p_{-}$. Using the induction hypothesis, justify that
$$\mathbb{E}\left(\left.\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right\rvert \, \varepsilon_{n} = 1\right) \leqslant \frac{1}{p_{+}}$$
Q28 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
We denote $p_{+} = \mathbb{P}(X' \in C_{+1})$ and $p_{-} = \mathbb{P}(X' \in C_{-1})$, with $p_{+} \geqslant p_{-}$. Deduce from the above that for all $\lambda$ in $[0, 1]$
$$\mathbb{E}\left(\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right) \leqslant \frac{1}{2}\left(\frac{1}{p_{+}} + \exp\left(\frac{\lambda^{2}}{2}\right) \frac{1}{(p_{-})^{1-\lambda}} \cdot \frac{1}{(p_{+})^{\lambda}}\right)$$
Q29 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
We denote $p_{+} = \mathbb{P}(X' \in C_{+1})$ and $p_{-} = \mathbb{P}(X' \in C_{-1})$, with $p_{+} \geqslant p_{-}$. We set $\lambda = 1 - \frac{p_{-}}{p_{+}}$. Show that
$$\mathbb{E}\left(\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right) \leqslant \frac{1}{2p_{+}}\left(1 + \exp\left(\frac{\lambda^{2}}{2}\right)(1 - \lambda)^{\lambda - 1}\right)$$
Q30 Curve Sketching Variation Table and Monotonicity from Sign of Derivative
Show that for all $x \in [0, 1[$
$$\frac{x^{2}}{2} + (x - 1)\ln(1 - x) \leqslant \ln(2 + x) - \ln(2 - x)$$
One may study a suitable auxiliary function.
Q31 Curve Sketching Number of Solutions / Roots via Curve Analysis
Deduce that for all $x \in [0, 1[$
$$1 + \exp\left(\frac{x^{2}}{2}\right)(1 - x)^{x - 1} \leqslant \frac{4}{2 - x}$$
Q32 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
Complete the proof of the inequality
$$\mathbb{P}(X \in C) \cdot \mathbb{E}\left(\exp\left(\frac{1}{8}d(X, C)^{2}\right)\right) \leqslant 1 \tag{II.1}$$
Q33 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
Deduce Talagrand's inequality: For every non-empty closed convex set $C$ of $E$ and for every strictly positive real number $t$
$$\mathbb{P}(X \in C) \cdot \mathbb{P}(d(X, C) \geqslant t) \leqslant \exp\left(-\frac{t^{2}}{8}\right)$$
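A Monte Carlo illustration of Talagrand's inequality (not part of the exam), taking for $C$ a half-space, for which $d(X, C)$ has the closed form $\max(0, \langle X, w\rangle)$; the dimension, threshold $t$, and trial count below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20
trials = 100_000
t = 3.0

w = np.ones(n) / np.sqrt(n)      # unit normal; C = {x : <x, w> <= 0} is closed and convex
eps = rng.choice([-1, 1], size=(trials, n))
s = eps @ w                      # <X, w> for each simulated Rademacher vector X
d = np.maximum(s, 0.0)           # distance from X to the half-space C

p_in = (s <= 0.0).mean()         # estimate of P(X in C)
p_far = (d >= t).mean()          # estimate of P(d(X, C) >= t)
bound = np.exp(-t**2 / 8)        # Talagrand's bound exp(-t^2/8)
```

With these parameters the product `p_in * p_far` lands far below `bound`, as the inequality requires.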
Q34 Continuous Probability Distributions and Random Variables Integrability, Boundedness, and Regularity of Density/Distribution-Related Functions
We consider the space $E = \mathcal{M}_{k,d}(\mathbb{R})$ equipped with the inner product defined by
$$\forall (A, B) \in E^{2}, \quad \langle A \mid B \rangle = \operatorname{tr}\left(A^{\top} \cdot B\right)$$
We denote by $\|\cdot\|_{F}$ the associated Euclidean norm. We fix a vector $u = (u_{1}, \ldots, u_{d})$ in $\mathbb{R}^{d}$ with $\|u\| = 1$, and define
$$g : \left\lvert \, \begin{aligned} & \mathcal{M}_{k,d}(\mathbb{R}) \rightarrow \mathbb{R} \\ & M \mapsto \|M \cdot u\| \end{aligned} \right.$$
Show that $C = \left\{M \in \mathcal{M}_{k,d}(\mathbb{R}) \mid g(M) \leqslant r\right\}$ is a convex and closed subset of $\mathcal{M}_{k,d}(\mathbb{R})$.
Q35 Matrices Matrix Norm, Convergence, and Inequality
We consider the space $E = \mathcal{M}_{k,d}(\mathbb{R})$ equipped with the inner product defined by
$$\forall (A, B) \in E^{2}, \quad \langle A \mid B \rangle = \operatorname{tr}\left(A^{\top} \cdot B\right)$$
We denote by $\|\cdot\|_{F}$ the associated Euclidean norm. We fix a unit vector $u$ in $\mathbb{R}^{d}$ and define $g(M) = \|M \cdot u\|$. Show that for every matrix $M$ in $\mathcal{M}_{k,d}(\mathbb{R})$
$$\|M \cdot u\| \leqslant \|M\|_{F}$$
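A quick check of this bound on a random matrix (not part of the exam; the dimensions are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
k, d = 4, 6
M = rng.standard_normal((k, d))
u = rng.standard_normal(d)
u /= np.linalg.norm(u)           # unit vector u in R^d

lhs = np.linalg.norm(M @ u)      # ||M . u||
rhs = np.linalg.norm(M, 'fro')   # Frobenius norm ||M||_F
```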
Q36 Proof Deduction or Consequence from Prior Results
We consider the space $E = \mathcal{M}_{k,d}(\mathbb{R})$ equipped with the Frobenius norm $\|\cdot\|_{F}$. Let $r$ and $t$ be two real numbers, with $t > 0$. We fix a unit vector $u$ in $\mathbb{R}^{d}$, define $g(M) = \|M \cdot u\|$, and let $C = \{M \in \mathcal{M}_{k,d}(\mathbb{R}) \mid g(M) \leqslant r\}$. Show that for every matrix $M$ in $\mathcal{M}_{k,d}(\mathbb{R})$
$$d(M, C) < t \quad \Longrightarrow \quad g(M) < r + t$$
Q37 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
We consider the space $E = \mathcal{M}_{k,d}(\mathbb{R})$ equipped with the Frobenius norm $\|\cdot\|_{F}$. Let $r$ and $t$ be two real numbers, with $t > 0$. We fix a unit vector $u$ in $\mathbb{R}^{d}$, define $g(M) = \|M \cdot u\|$, and let $C = \{M \in \mathcal{M}_{k,d}(\mathbb{R}) \mid g(M) \leqslant r\}$. Let $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ be a random variable taking values in $\mathcal{M}_{k,d}(\mathbb{R})$, whose coefficients $\varepsilon_{ij}$ are independent Rademacher random variables. Deduce that
$$\mathbb{P}(g(X) \leqslant r) \cdot \mathbb{P}(g(X) \geqslant r + t) \leqslant \exp\left(-\frac{1}{8}t^{2}\right)$$
Q38 Continuous Probability Distributions and Random Variables Distribution of Transformed or Combined Random Variables
We consider $g(X)$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable with independent Rademacher coefficients and $g(M) = \|M \cdot u\|$ for a fixed unit vector $u$. Justify that $g(X)$ admits at least one median. One may consider the function $G$ from $\mathbb{R}$ to $\mathbb{R}$ such that, for every real number $t$, $G(t) = \mathbb{P}(g(X) \leqslant t)$, and examine the set $G^{-1}([1/2, 1])$.
Q39 Continuous Probability Distributions and Random Variables Probability Inequality and Tail Bound Proof
We consider $g(X)$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable with independent Rademacher coefficients and $g(M) = \|M \cdot u\|$ for a fixed unit vector $u$. Deduce from the above that, for every strictly positive real number $t$
$$\mathbb{P}(|g(X) - m| \geqslant t) \leqslant 4\exp\left(-\frac{1}{8}t^{2}\right)$$
where $m$ is a median of $g(X)$.
Q40 Continuous Probability Distributions and Random Variables Expectation and Moment Inequality Proof
We consider $g(X)$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable with independent Rademacher coefficients and $g(M) = \|M \cdot u\|$ for a fixed unit vector $u$, and $m$ is a median of $g(X)$. Deduce that $\mathbb{E}\left((g(X) - m)^{2}\right) \leqslant 32$.
Q41 Discrete Random Variables Expectation of a Function of a Discrete Random Variable
We consider $g(X)$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable with independent Rademacher coefficients and $g(M) = \|M \cdot u\|$ for a fixed unit vector $u$. Show that $\mathbb{E}\left(g(X)^{2}\right) = k$, and deduce that $\mathbb{E}(g(X)) \leqslant \sqrt{k}$.
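Both facts can be illustrated by simulation (not part of the exam): each row of $X \cdot u$ has second moment $\|u\|^2 = 1$, so $\mathbb{E}(g(X)^2) = k$, and $\mathbb{E}(g(X)) \leqslant \sqrt{k}$ follows from Jensen's inequality. The sizes below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(6)
k, d, trials = 5, 8, 100_000

u = rng.standard_normal(d)
u /= np.linalg.norm(u)                          # fixed unit vector u in R^d

eps = rng.choice([-1, 1], size=(trials, k, d))  # i.i.d. Rademacher matrices X
g = np.linalg.norm(eps @ u, axis=1)             # g(X) = ||X . u|| for each sample

mean_g2 = (g ** 2).mean()                       # close to k
mean_g = g.mean()                               # at most sqrt(k), by Jensen
```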
Q42 Discrete Random Variables Probability Bounds and Inequalities for Discrete Variables
We consider $g(X)$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable with independent Rademacher coefficients and $g(M) = \|M \cdot u\|$ for a fixed unit vector $u$, and $m$ is a median of $g(X)$. Deduce that $(\sqrt{k} - m)^{2} \leqslant \mathbb{E}\left((g(X) - m)^{2}\right)$.
Q43 Discrete Random Variables Probability Bounds and Inequalities for Discrete Variables
We consider $g(X)$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable with independent Rademacher coefficients and $g(M) = \|M \cdot u\|$ for a fixed unit vector $u$. Show that, for every strictly positive real number $t$
$$\mathbb{P}(|g(X) - \sqrt{k}| \geqslant t) \leqslant 4\exp(4)\exp\left(-\frac{1}{16}t^{2}\right)$$
Q44 Discrete Random Variables Probability Bounds and Inequalities for Discrete Variables
We set $A_{k} = \frac{X}{\sqrt{k}}$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable with independent Rademacher coefficients. Let $\varepsilon$ be in $]0, 1[$ and $\delta$ be in $]0, 1/2[$. We assume that $k \geqslant 160\frac{\ln(1/\delta)}{\varepsilon^{2}}$. Show that, for every unit vector $u$ in $\mathbb{R}^{d}$:
$$\mathbb{P}\left(\left|\|A_{k} \cdot u\| - 1\right| > \varepsilon\right) < \delta$$
Q45 Discrete Random Variables Probability Bounds and Inequalities for Discrete Variables
We keep the notations and hypotheses from above. Let $v_{1}, \ldots, v_{N}$ be distinct vectors in $\mathbb{R}^{d}$. We set $A_{k} = \frac{X}{\sqrt{k}}$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable with independent Rademacher coefficients. Let $\varepsilon \in ]0,1[$, $\delta \in ]0, 1/2[$, and $k \geqslant 160\frac{\ln(1/\delta)}{\varepsilon^{2}}$. For every $(i, j) \in \llbracket 1, N \rrbracket^{2}$ such that $i < j$ we denote by $E_{ij}$ the event
$$(1 - \varepsilon)\|v_{i} - v_{j}\| \leqslant \|A_{k} \cdot v_{i} - A_{k} \cdot v_{j}\| \leqslant (1 + \varepsilon)\|v_{i} - v_{j}\|$$
Show that $\mathbb{P}\left(\overline{E_{ij}}\right) < \delta$, where $\overline{E_{ij}}$ denotes the complementary event of $E_{ij}$.
Q46 Discrete Random Variables Probability Bounds and Inequalities for Discrete Variables
We keep the notations and hypotheses from above. Let $v_{1}, \ldots, v_{N}$ be distinct vectors in $\mathbb{R}^{d}$. We set $A_{k} = \frac{X}{\sqrt{k}}$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable with independent Rademacher coefficients. Let $\varepsilon \in ]0,1[$, $\delta \in ]0, 1/2[$, and $k \geqslant 160\frac{\ln(1/\delta)}{\varepsilon^{2}}$. For every $(i, j) \in \llbracket 1, N \rrbracket^{2}$ such that $i < j$, $E_{ij}$ denotes the event
$$(1 - \varepsilon)\|v_{i} - v_{j}\| \leqslant \|A_{k} \cdot v_{i} - A_{k} \cdot v_{j}\| \leqslant (1 + \varepsilon)\|v_{i} - v_{j}\|$$
Deduce that $\mathbb{P}\left(\bigcap_{1 \leqslant i < j \leqslant N} E_{ij}\right) \geqslant 1 - \frac{N(N-1)}{2}\delta$.
Q47 Discrete Random Variables Probability Bounds and Inequalities for Discrete Variables
We keep the notations and hypotheses from above. Let $v_{1}, \ldots, v_{N}$ be distinct vectors in $\mathbb{R}^{d}$. We set $A_{k} = \frac{X}{\sqrt{k}}$ where $X = (\varepsilon_{ij})_{1 \leqslant i \leqslant k, 1 \leqslant j \leqslant d}$ is a random variable with independent Rademacher coefficients. For every $(i, j) \in \llbracket 1, N \rrbracket^{2}$ such that $i < j$, $E_{ij}$ denotes the event
$$(1 - \varepsilon)\|v_{i} - v_{j}\| \leqslant \|A_{k} \cdot v_{i} - A_{k} \cdot v_{j}\| \leqslant (1 + \varepsilon)\|v_{i} - v_{j}\|$$
Deduce the Johnson-Lindenstrauss theorem: there exists an absolute constant $c > 0$ such that for all integers $N$ and $d$ greater than or equal to 2, for every $\varepsilon$ in $]0, 1[$ and for all distinct $v_{1}, \ldots, v_{N}$ in $\mathbb{R}^{d}$, it suffices that
$$k \geqslant c\frac{\ln(N)}{\varepsilon^{2}}$$
for there to exist an $\varepsilon$-isometry $f : \mathbb{R}^{d} \rightarrow \mathbb{R}^{k}$ for $v_{1}, \ldots, v_{N}$.
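To close, a small demonstration of the theorem in action (not part of the exam): a scaled Rademacher matrix $A_k = X/\sqrt{k}$ approximately preserves all pairwise distances of a point set. The sizes $N$, $d$, $k$ and the distortion $\varepsilon$ below are ad hoc illustrative choices (with $k$ well below the $160\ln(1/\delta)/\varepsilon^{2}$ bound of question 44), so success is only very likely rather than certified by the bound:

```python
import numpy as np

rng = np.random.default_rng(7)
N, d, k = 30, 2000, 500          # project from R^2000 down to R^500
eps_ = 0.5                       # target distortion epsilon

V = rng.standard_normal((N, d))                    # N points in R^d (distinct a.s.)
A = rng.choice([-1, 1], size=(k, d)) / np.sqrt(k)  # A_k = X / sqrt(k)
W = V @ A.T                                        # projected points in R^k

# Check the eps-isometry condition on every pair (i, j)
ok = True
for i in range(N):
    for j in range(i + 1, N):
        orig = np.linalg.norm(V[i] - V[j])
        proj = np.linalg.norm(W[i] - W[j])
        ok = ok and ((1 - eps_) * orig <= proj <= (1 + eps_) * orig)
```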