Discrete Random Variables

grandes-ecoles 2024 Q14a Convergence of Expectations or Moments
Let $n$ be a nonzero natural number. We consider, on the probability space $(\mathfrak{S}_n, \mathscr{P}(\mathfrak{S}_n))$ equipped with the uniform probability, the random variable $X_n$ defined by $X_n(\sigma) = \omega(\sigma)$.
Show that $$\frac{1}{n!} \sum_{\sigma \in \mathfrak{S}_n} \omega(\sigma)^2 \underset{n \rightarrow +\infty}{=} \ln(n)^2 + (2\gamma+1)\ln(n) + c + O\left(\frac{\ln(n)}{n}\right)$$ for a real number $c$ to be determined.
grandes-ecoles 2024 Q14b Convergence of Expectations or Moments
Let $n$ be a nonzero natural number. We consider, on the probability space $(\mathfrak{S}_n, \mathscr{P}(\mathfrak{S}_n))$ equipped with the uniform probability, the random variable $X_n$ defined by $X_n(\sigma) = \omega(\sigma)$.
Show that $$\frac{1}{n!} \sum_{\sigma \in \mathfrak{S}_n} (\omega(\sigma) - \ln(n))^2 \underset{n \rightarrow +\infty}{=} \ln(n) + c + O\left(\frac{\ln(n)}{n}\right).$$
grandes-ecoles 2024 Q18 Expectation and Variance via Combinatorial Counting
We consider a sequence of random variables $(X_n : \Omega \longrightarrow \{-1,1\})_{n \in \mathbf{N}}$ defined on the same probability space $(\Omega, \mathscr{A}, P)$, taking values in $\{-1,1\}$, mutually independent and centered. For every $n \in \mathbf{N}^*$, we denote $S_n = \sum_{k=1}^n X_k$. We fix the integer $n \geqslant 1$. The random variable $N_n : \Omega \longrightarrow \mathbf{N}$ counts, for every $\omega \in \Omega$, the number of equality indices of the path $(X_1(\omega), \cdots, X_{2n}(\omega))$. For every integer $i$ between $1$ and $n$, the event $A_i$ is defined by: $$A_i = \left\{\omega,\; 2i \text{ is an equality index of } (X_1(\omega), \cdots, X_{2n}(\omega))\right\}.$$ Show that the random variable $N_n$ has finite expectation and that its expectation $\mathbb{E}(N_n)$ equals: $$\mathbb{E}(N_n) = \sum_{i=1}^n \frac{\binom{2i}{i}}{4^i}.$$ [Hint: one may express the variable $N_n$ using indicator functions associated with the events $A_i$.]
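The formula can be verified by exhaustive enumeration for small $n$, assuming (as in the urn analogue later in this section) that $2i$ is an equality index exactly when $S_{2i} = 0$, i.e. equally many $+1$'s and $-1$'s among the first $2i$ steps; function names are illustrative.

```python
from itertools import product
from math import comb
from fractions import Fraction

def exact_expectation(n):
    """Exact E(N_n) by enumerating all 2^(2n) equally likely sign paths."""
    total = 0
    for signs in product((-1, 1), repeat=2 * n):
        s = 0
        for k, x in enumerate(signs, start=1):
            s += x
            if k % 2 == 0 and s == 0:  # 2i is an equality index when S_{2i} = 0
                total += 1
    return Fraction(total, 2 ** (2 * n))

def formula(n):
    return sum(Fraction(comb(2 * i, i), 4 ** i) for i in range(1, n + 1))
```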
grandes-ecoles 2024 Q19 Expectation and Variance of Sums of Independent Variables
Let $(X_k)_{k \in \mathbf{N}^*}$ be independent random variables with the same distribution given by:
$$P(X_1 = -1) = P(X_1 = 1) = \frac{1}{2}$$
For all $n \in \mathbf{N}^*$, we denote $S_n = \sum_{k=1}^n X_k$.
Determine, for all $n \in \mathbf{N}^*$, $E(S_n)$ and $V(S_n)$.
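Since each $X_k$ is centered with $X_k^2 = 1$, one expects $E(S_n) = 0$ and $V(S_n) = n$; a brute-force check over all sign vectors (a sketch, with illustrative names):

```python
from itertools import product
from fractions import Fraction

def moments(n):
    # Average S_n and S_n^2 over all 2^n equally likely sign vectors.
    sums = [sum(signs) for signs in product((-1, 1), repeat=n)]
    e = Fraction(sum(sums), 2 ** n)
    v = Fraction(sum(s * s for s in sums), 2 ** n) - e ** 2
    return e, v
```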
grandes-ecoles 2024 Q19 Expectation and Variance via Combinatorial Counting
Let $G_0 = (S_0, A_0)$ be a particular fixed graph with $s_0 = s_{G_0}$, $a_0 = a_{G_0}$, $s_0 \geq 2$, $a_0 \geq 1$. Let $X_n^0$ be the discrete real random variable defined on $\Omega_n$ such that, for $G \in \Omega_n$, the integer $X_n^0(G)$ equals the number of copies of $G_0$ contained in $G$. Express $X_n^0$ using random variables of the type $X_H$, and show that: $$\mathbf{E}\left(X_n^0\right) = \sum_{H \in \mathcal{C}_0} \mathbf{P}(H \subset G) \leq n^{s_0} p_n^{a_0}.$$
grandes-ecoles 2024 Q20 Expectation of a Function of a Discrete Random Variable
Let $S$ and $T$ be two independent random variables each taking a finite number of real values. Assume that $T$ and $- T$ follow the same distribution.
Show that:
$$E ( \cos ( S + T ) ) = E ( \cos ( S ) ) E ( \cos ( T ) )$$
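Expanding $\cos(S+T) = \cos S \cos T - \sin S \sin T$ and using $E(\sin T) = 0$ (symmetry of $T$) gives the result. The identity can be checked on arbitrary finite distributions; the values below are illustrative, with $T$ symmetric:

```python
import math
from itertools import product

# Illustrative finite laws as (value, probability) pairs; law(T) = law(-T).
S = [(0.3, 0.2), (1.1, 0.5), (-2.0, 0.3)]
T = [(-1.5, 0.25), (1.5, 0.25), (0.0, 0.5)]

lhs = sum(ps * pt * math.cos(s + t) for (s, ps), (t, pt) in product(S, T))
rhs = (sum(p * math.cos(s) for s, p in S)
       * sum(p * math.cos(t) for t, p in T))
```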
grandes-ecoles 2024 Q20 Probability Bounds and Inequalities for Discrete Variables
Let $G_0 = (S_0, A_0)$ be a particular fixed graph with $s_0 = s_{G_0}$, $a_0 = a_{G_0}$, $s_0 \geq 2$, $a_0 \geq 1$. Let $X_n^0$ be the discrete real random variable counting the number of copies of $G_0$ contained in $G \in \Omega_n$, and let $$\omega_0 = \min_{\substack{H \subset G_0 \\ a_H \geq 1}} \frac{s_H}{a_H}.$$ Deduce that if $p_n = \mathrm{o}\left(n^{-\omega_0}\right)$, then $\lim_{n \rightarrow +\infty} \mathbf{P}\left(X_n^0 > 0\right) = 0$. Hint: one may introduce a subgraph $H_0 \subset G_0$ achieving the minimum that defines $\omega_0$.
grandes-ecoles 2024 Q20 Expectation and Variance via Combinatorial Counting
In an urn containing $n$ white balls and $n$ black balls, we draw balls without replacement until the urn is empty; at each step, every remaining ball is equally likely to be drawn. For every integer $k$ between $1$ and $2n$, we say that $k$ is an equality index if, after the first $k$ draws, as many black balls as white balls remain in the urn. Note that the integer $2n$ is always an equality index. We denote by $M_n$ the random variable counting the number of equality indices $k$ between $1$ and $2n$.
Using, for example, the events $B_i$: ``the integer $i$ is an equality index'', show that the variable $M_n$ has finite expectation equal to: $$\mathbb{E}(M_n) = \sum_{i=0}^{n-1} \frac{\binom{2i}{i} \cdot \binom{2n-2i}{n-i}}{\binom{2n}{n}}.$$
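For small $n$ the expectation can be computed exactly by enumerating the $\binom{2n}{n}$ equally likely orders of the balls (a brute-force sketch; function names are illustrative):

```python
from itertools import combinations
from math import comb
from fractions import Fraction

def exact_expectation(n):
    """Exact E(M_n): enumerate the positions of the n white balls."""
    total = 0
    for white in map(set, combinations(range(2 * n), n)):
        drawn_white = 0
        for k in range(1, 2 * n + 1):
            drawn_white += (k - 1) in white
            # k is an equality index iff as many white as black balls remain
            if n - drawn_white == n - (k - drawn_white):
                total += 1
    return Fraction(total, comb(2 * n, n))

def formula(n):
    return sum(Fraction(comb(2 * i, i) * comb(2 * n - 2 * i, n - i), comb(2 * n, n))
               for i in range(n))
```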
grandes-ecoles 2024 Q21 Expectation of a Function of a Discrete Random Variable
Let $(X_k)_{k \in \mathbf{N}^*}$ be independent random variables with the same distribution given by:
$$P(X_1 = -1) = P(X_1 = 1) = \frac{1}{2}$$
For all $n \in \mathbf{N}^*$, we denote $S_n = \sum_{k=1}^n X_k$.
Deduce that for all $n \in \mathbf{N}^*$ and all $t \in \mathbf{R}$:
$$E(\cos(tS_n)) = (\cos(t))^n.$$
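An exact numerical check using the binomial law of $S_n$ (the number $j$ of $-1$'s is binomial with parameters $n$ and $1/2$; the function name is illustrative):

```python
import math
from math import comb

def E_cos_tSn(n, t):
    # S_n = n - 2j where j = #{k : X_k = -1} has law Binomial(n, 1/2)
    return sum(comb(n, j) / 2 ** n * math.cos(t * (n - 2 * j))
               for j in range(n + 1))
```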
grandes-ecoles 2024 Q21 Expectation and Variance via Combinatorial Counting
Let $G_0 = (S_0, A_0)$ be a particular fixed graph with $s_0 = s_{G_0}$, $a_0 = a_{G_0}$, $s_0 \geq 2$, $a_0 \geq 1$. We now assume that $\lim_{n \rightarrow +\infty}\left(n^{\omega_0} p_n\right) = +\infty$. Show that the expectation $\mathbf{E}\left(\left(X_n^0\right)^2\right)$ satisfies: $$\mathbf{E}\left(\left(X_n^0\right)^2\right) = \sum_{(H, H') \in \mathcal{C}_0^2} \mathbf{P}\left(H \cup H' \subset G\right) = \sum_{(H, H') \in \mathcal{C}_0^2} p_n^{2a_0 - a_{H \cap H'}}.$$
grandes-ecoles 2024 Q22 Expectation of a Function of a Discrete Random Variable
Let $(X_k)_{k \in \mathbf{N}^*}$ be independent random variables with the same distribution given by:
$$P(X_1 = -1) = P(X_1 = 1) = \frac{1}{2}$$
For all $n \in \mathbf{N}^*$, we denote $S_n = \sum_{k=1}^n X_k$.
Let $a, b \in \mathbf{R}$ with $a \neq 0$ and $|b| \leq |a|$. Show that
$$|a + b| = |a| + \operatorname{sign}(a)\, b$$
where $\operatorname{sign}(x) = x/|x|$ for nonzero real $x$. Deduce that:
$$\forall n \in \mathbf{N}^*, \quad E\left(|S_{2n}|\right) = E\left(|S_{2n-1}|\right)$$
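The equality of the two expectations can be verified exactly from the binomial law of $S_m$ (a sketch with an illustrative function name):

```python
from math import comb
from fractions import Fraction

def E_abs_S(m):
    # Exact E(|S_m|): S_m = m - 2j with probability C(m, j) / 2^m.
    return sum(Fraction(comb(m, j), 2 ** m) * abs(m - 2 * j)
               for j in range(m + 1))
```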
grandes-ecoles 2025 QI.5 Expectation and Variance via Combinatorial Counting
Let $k$ be an even integer in $\{2, \ldots, N\}$. We assume in this question that the random variables $X_1, \ldots, X_N$ are $k$-independent.
We introduce the following notations: $\mathcal{T}$ denotes the set $\{1, \ldots, N\}^k$. If $T = (n_1, \ldots, n_k) \in \mathcal{T}$ and $n \in \{1, \ldots, N\}$, we denote by $m_T(n)$ the multiplicity of $n$ in $T$, that is $$m_T(n) = \operatorname{Card}\left\{i \in \{1, \ldots, k\}; n_i = n\right\}$$ For $\ell \in \{1, \ldots, k\}$, we denote by $\mathcal{T}_\ell$ the set of $T$ in $\mathcal{T}$ involving exactly $\ell$ distinct indices, where each has multiplicity at least 2, namely: $T \in \mathcal{T}_\ell$ if $$\operatorname{Card}\left(\left\{n \in \{1, \ldots, N\}; m_T(n) > 0\right\}\right) = \ell,$$ and $$\forall n \in \{1, \ldots, N\}, \quad m_T(n) > 0 \Rightarrow m_T(n) \geq 2$$ Finally, we denote by $|\mathcal{T}_\ell|$ the cardinality of $\mathcal{T}_\ell$.
I.5.a) Determine $|\mathcal{T}_1|$ and $|\mathcal{T}_\ell|$ for $\ell > k/2$.
I.5.b) Justify $$\mathbb{E}\left[(S_N)^k\right] = \sum_{T \in \mathcal{T}} \prod_{n=1}^N \mathbb{E}\left[X_n^{m_T(n)}\right]$$ then $$\mathbb{E}\left[(S_N)^k\right] = \sum_{\ell=1}^{k/2} \sum_{T \in \mathcal{T}_\ell} \prod_{n=1}^N \mathbb{E}\left[X_n^{m_T(n)}\right]$$
I.5.c) Prove that $$\mathbb{E}\left[(S_N)^k\right] \leq \sum_{\ell=1}^{k/2} K^{k-2\ell} |\mathcal{T}_\ell|$$
I.5.d) Let $\ell \in \{1, \ldots, k/2\}$. Justify the following estimate: $$|\mathcal{T}_\ell| \leq \binom{N}{\ell} \ell^k \leq \frac{N^\ell}{\ell!} \ell^k$$ One may consider the set of $T \in \mathcal{T}$ involving at most $\ell$ distinct elements.
I.5.e) For $\ell \in \{1, \ldots, k/2\}$, prove that $$\ell! \geq \ell^\ell e^{-\ell}$$ then deduce that $$|\mathcal{T}_\ell| \leq (Ne)^\ell \left(\frac{k}{2}\right)^{k-\ell}$$
I.5.f) Prove that $$\mathbb{E}\left[(S_N)^k\right] \leq \left(\frac{Kk}{2}\right)^k \sum_{\ell=1}^{k/2} \left(\frac{2Ne}{kK^2}\right)^\ell$$
I.5.g) We assume $$kK^2 \leq N.$$ Prove that $$\mathbb{E}\left[(S_N)^k\right] \leq \frac{\theta}{\theta - 1} \left(\frac{Nek}{2}\right)^{k/2} \leq 2\left(\frac{Nek}{2}\right)^{k/2},$$ where $$\theta := \frac{2Ne}{kK^2}$$
I.5.h) Prove, under hypothesis (27) (i.e. $kK^2 \leq N$), the following estimate: for all $t > 0$, $$\mathbb{P}\left(|S_N| \geq t\sqrt{N}\right) \leq 2\left(\frac{\sqrt{ek/2}}{t}\right)^k$$
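The classification of tuples into the classes $\mathcal{T}_\ell$ (questions I.5.a and I.5.d) can be checked by brute force for small $N$ and $k$; this sketch uses only the definitions above, and the function name is illustrative.

```python
from itertools import product
from math import comb
from collections import Counter

def class_sizes(N, k):
    """|T_ell| for each ell: tuples in {1..N}^k all of whose values
    have multiplicity >= 2, keyed by the number of distinct values."""
    sizes = Counter()
    for T in product(range(1, N + 1), repeat=k):
        mult = Counter(T)
        if all(m >= 2 for m in mult.values()):
            sizes[len(mult)] += 1
    return sizes
```

For $N = k = 4$ this gives $|\mathcal{T}_1| = 4 = N$, $|\mathcal{T}_2| = 36 \leq \binom{4}{2}\,2^4$, and $|\mathcal{T}_\ell| = 0$ for $\ell > k/2$, consistent with I.5.a and I.5.d.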
grandes-ecoles 2025 QI.6 Probability Bounds and Inequalities for Discrete Variables
We now assume that the random variables $X_1, \ldots, X_N$ are independent, so that they are $k$-independent for all $k \in \{2, \ldots, N\}$. We now want to establish the following bound: there exist numerical constants $\alpha, \beta > 0$ (independent of $K \geq 1$ and $N$) such that for all $t \geq 0$, $$\mathbb{P}\left(|S_N| \geq t\sqrt{N}\right) \leq \beta \exp\left(-\alpha t^2/K^2\right)$$
I.6.a) Justify that it suffices to consider the case $K = 1$, which we will do in the next three questions.
I.6.b) Let $k$ be the largest even integer in $\{1, \ldots, N\}$ less than or equal to $\frac{2t^2}{e^2}$. Justify that (27) is satisfied if $$e \leq t \leq \frac{e}{\sqrt{2}}\sqrt{N}$$
I.6.c) Under hypothesis (32), prove that we have (31) with $$\beta = 2e, \quad \alpha = e^{-2}$$
I.6.d) Conclude that there exist numerical constants $\alpha, \beta > 0$ such that (31) is verified for all $t \geq 0$.
grandes-ecoles 2025 QII.1 Probability Bounds and Inequalities for Discrete Variables
Let $\gamma$ be a positive numerical constant. We assume the following property is satisfied: for every convex set $A \subset Q^N$, $$\mathbb{P}(X \in A) \mathbb{E}\left[\exp\left(\gamma \frac{d(X,A)^2}{4K^2}\right)\right] \leq 1.$$ We are given a 1-Lipschitz and convex function $F : \mathbb{R}^N \rightarrow \mathbb{R}$. The purpose of this question is to prove that (37) is then verified: $$\mathbb{P}(F(X) \geq m) \geq \frac{1}{2} \Longrightarrow \mathbb{P}(F(X) \leq m - t) \leq \beta e^{-\alpha t^2/K^2}$$
II.1.a) Let $s, \sigma \in \mathbb{R}$ with $s < \sigma$. By considering the set $$A_s = \left\{x \in Q^N; F(x) \leq s\right\}$$ show that $$\mathbb{P}(F(X) \leq s)\mathbb{P}(F(X) \geq \sigma) \leq \exp\left(-\gamma \frac{(\sigma - s)^2}{4K^2}\right)$$
II.1.b) Prove that (37) is verified.
grandes-ecoles 2025 QII.2 Probability Bounds and Inequalities for Discrete Variables
Let $x$ be an arbitrary point of $Q^N$. Let $P^N$ be the set of vertices of the hypercube $[0,1]^N$, that is the set of linear combinations of the $e_i$ for $i \in \{1, \ldots, N\}$ with coefficients 0 or 1. If $A$ is a non-empty subset of $Q^N$, we define the subsets $P_A(x)$ and $R_A(x)$ of $P^N$ as follows: let $H_i$ be the hyperplane orthogonal to $e_i$, generated by the $e_j$ for $j \neq i$. Then $z \in P_A(x)$ if there exists $a \in A$ such that $$\forall i \in \{1, \ldots, N\}, z \in H_i \Longrightarrow a - x \in H_i$$ while $z \in R_A(x)$ if there exists $a \in A$ such that $$\forall i \in \{1, \ldots, N\}, z \in H_i \Longleftrightarrow a - x \in H_i.$$
Given $A \subset Q^N$ non-empty and $x \in Q^N$, we also define the quantity $$q(x, A) := \inf\left\{|z|; z \in \Gamma(P_A(x))\right\}.$$ We moreover adopt the following convention: if $A$ is the empty set, we set $q(x, A) = 2N$.
II.2.a) If $z, z' \in P^N$, we denote by $z \leq z'$ when $\langle z, e_i \rangle \leq \langle z', e_i \rangle$ for all $i \in \{1, \ldots, N\}$. Prove that $$P_A(x) = \left\{z' \in P^N; \exists z \in R_A(x), z \leq z'\right\}$$
II.2.b) Let $x \in \mathbb{R}^N$. Justify the equivalences $$x \in A \Longleftrightarrow 0 \in P_A(x) \Longleftrightarrow P_A(x) = P^N$$
II.2.c) In dimension $N = 3$, give an example of a set $A$ for which $e_3 \notin P_A(0)$, and describe precisely the corresponding sets $R_A(0)$ and $P_A(0)$.
II.2.d) Let $B$ be a non-empty subset of $\mathbb{R}^N$. We denote by $\Gamma_0(B)$ the set of convex combinations of at most $N+1$ elements of $B$: $$\Gamma_0(B) := \left\{\sum_{j=1}^{N+1} \theta_j z_j; \theta_j \in [0,1], z_j \in B, \sum_{j=1}^{N+1} \theta_j = 1\right\}.$$ Prove that $\Gamma(B) = \Gamma_0(B)$.
Hint: you may prove that any convex combination of $m+1$ elements of $B$ with $m > N$ can be rewritten as a convex combination of at most $m$ elements of $B$.
II.2.e) Let $B$ be a non-empty subset of $\mathbb{R}^N$. Prove that $\Gamma(B)$ is a convex set, and that it is compact if $B$ is.
II.2.f) Draw and name (as a geometric object) the convex hull $\Gamma(B)$ in dimension $N = 3$, in the three following cases: $$B = \{e_1, e_2, e_1 + e_2\}, \quad B = \{e_1, e_2, e_1 + e_2, e_2 + e_3\}, \quad B = P^3.$$ For each of these examples, say whether $B$ can correspond to a set $P_A(x)$.
II.2.g) Let $A \subset Q^N$ non-empty and $x \in Q^N$. Justify that the infimum in (48) is attained.
II.2.h) Let $A \subset Q^N$ non-empty and $x \in Q^N$. Justify that $q(x, A) \leq \sqrt{N}$. Under what condition do we have $q(x, A) = 0$?
II.2.i) Let $x \in Q^N$ and $A \subset Q^N$ non-empty. Justify that $$q(x, A) = \inf\left\{|z|; z \in \Gamma(R_A(x))\right\}$$
II.2.j) Let $x \in Q^N$ and $A \subset Q^N$ with $A$ convex. Prove that $$d(x, A) \leq 2K\, q(x, A).$$
II.2.k) Let $\gamma \geq 0$ be a numerical constant. Prove that the property: ``for every convex set $A \subset Q^N$, we have $\mathbb{P}(X \in A)\mathbb{E}\left[\exp\left(\gamma\, q(X,A)^2\right)\right] \leq 1$'' implies $$\mathbb{P}(X \in A)\mathbb{E}\left[\exp\left(\gamma \frac{d(X,A)^2}{4K^2}\right)\right] \leq 1.$$
grandes-ecoles 2025 QII.3 Probability Bounds and Inequalities for Discrete Variables
Let $$\gamma_0 = \frac{1}{4}.$$ The purpose of the end of this part II is to prove, by induction on the dimension $N$, the following property: for every convex set $A \subset Q^N$, $$\mathbb{P}(X \in A)\mathbb{E}\left[\exp\left(\gamma_0\, q(X,A)^2\right)\right] \leq 1.$$ For $N$ a positive integer, we introduce the induction hypothesis $H_N$: ``Let $(\Omega, \mathcal{A}, \mathbb{P})$ be a probability space. Let $X_1, \ldots, X_N$ be random variables taking values in a finite set, independent and identically distributed, satisfying $\mathbb{P}(|X_n| \leq K) = 1$, and let $X = (X_i)_{1 \leq i \leq N}$ be the random vector with components $X_1, \ldots, X_N$. Then (53) is verified for $\gamma = \gamma_0 = \frac{1}{4}$.''
II.3.a) We consider the case $N = 1$. Prove that (53) is satisfied when $\gamma \leq \ln(2)$, and thus for $\gamma = \gamma_0$.
We now assume $N > 1$, and we fix $A \subset Q^N$ convex. We adopt the following notations: we decompose $$x = (\bar{x}, x_N) \quad \text{with} \quad \bar{x} = (x_i)_{1 \leq i \leq N-1} \in \mathbb{R}^{N-1}.$$ If $A \subset \mathbb{R}^N$ and $\theta \in \mathbb{R}$, we denote $$A_\theta := \left\{b \in \mathbb{R}^{N-1}; (b, \theta) \in A\right\}$$ the section of $A$ at level $\theta$. We also denote $$\bar{A} := \left\{\bar{a} \in \mathbb{R}^{N-1}; \exists \theta \in \mathbb{R}, (\bar{a}, \theta) \in A\right\}$$ the projection of $A$ onto $\mathbb{R}^{N-1}$.
II.3.b) Let $x \in Q^N$. Let $A \subset Q^N$ such that $A_{x_N}$ is non-empty. Let $\bar{z} \in P^{N-1}$. Prove that $$\bar{z} \in P_{A_{x_N}}(\bar{x}) \Longrightarrow (\bar{z}, 0) \in P_A(x)$$ and $$\bar{z} \in P_{\bar{A}}(\bar{x}) \Longrightarrow (\bar{z}, 1) \in P_A(x)$$
II.3.c) Prove that, for all $\lambda \in [0,1]$, we have $$q(x, A)^2 \leq (1-\lambda)^2 + \lambda\, q(\bar{x}, A_{x_N})^2 + (1-\lambda)\, q(\bar{x}, \bar{A})^2.$$
We fix $x_N \in \mathbb{R}$ such that $\mathbb{P}(X_N = x_N) > 0$ and we consider the probability $$\overline{\mathbb{P}}(B) := \mathbb{P}(B \mid X_N = x_N) = \frac{\mathbb{P}(B \cap \{X_N = x_N\})}{\mathbb{P}(X_N = x_N)} \quad \text{for } B \in \mathcal{A},$$ as well as the associated expectation $$\overline{\mathbb{E}}[Z] := \frac{1}{\mathbb{P}(X_N = x_N)} \mathbb{E}\left[Z\, \mathbf{1}_{\{X_N = x_N\}}\right]$$ for any random variable $Z$.
II.3.d) Assuming the induction hypothesis $H_{N-1}$, prove $$\overline{\mathbb{P}}(\bar{X} \in \bar{A})\, \overline{\mathbb{E}}\left[\exp\left(\gamma_0\, q(X,A)^2\right)\right] \leq e^{\gamma_0}$$ and justify that, for all $\lambda \in [0,1]$, we have $$\overline{\mathbb{P}}(\bar{X} \in A_{x_N})^\lambda\, \overline{\mathbb{P}}(\bar{X} \in \bar{A})^{1-\lambda}\, \overline{\mathbb{E}}\left[\exp\left(\gamma_0\, q(X,A)^2\right)\right] \leq e^{\gamma_0(1-\lambda)^2}.$$
Hint: you may assume H\"older's inequality: $$\overline{\mathbb{E}}\left[e^{\lambda Y} e^{(1-\lambda)Z}\right] \leq \left\{\overline{\mathbb{E}}\left[e^Y\right]\right\}^\lambda \left\{\overline{\mathbb{E}}\left[e^Z\right]\right\}^{(1-\lambda)}$$ for $\lambda \in [0,1]$ and $Y, Z$ random variables.
II.3.e) We assume $$\overline{\mathbb{P}}(\bar{X} \in \bar{A}) > 0,$$ and we define $$r = \frac{\overline{\mathbb{P}}(\bar{X} \in A_{x_N})}{\overline{\mathbb{P}}(\bar{X} \in \bar{A})}$$ Prove that $$r^\lambda e^{-\gamma_0(1-\lambda)^2}\, \overline{\mathbb{P}}(\bar{X} \in \bar{A})\, \overline{\mathbb{E}}\left[\exp\left(\gamma_0\, q(X,A)^2\right)\right] \leq 1.$$
II.3.f) We provisionally admit the following inequality: for all $\gamma \in [0, \gamma_0]$, for all $r \in ]0,1]$, $$\frac{1}{2-r} \leq \sup_{\lambda \in [0,1]} r^\lambda e^{-\gamma(1-\lambda)^2}$$ Justify that $$\overline{\mathbb{P}}(\bar{X} \in \bar{A})\, \overline{\mathbb{E}}\left[\exp\left(\gamma_0\, q(X,A)^2\right)\right] \leq (2 - r).$$ We shall distinguish the cases $\overline{\mathbb{P}}(\bar{X} \in A_{x_N}) > 0$ and $\overline{\mathbb{P}}(\bar{X} \in A_{x_N}) = 0$.
II.3.g) Prove that $$\mathbb{P}(X \in A)\, \mathbb{E}\left[\exp\left(\gamma_0\, q(X,A)^2\right) \mathbf{1}_{\{X_N = x_N\}}\right] \leq R(2-r)\, \mathbb{P}(X_N = x_N),$$ where $$R = \frac{\mathbb{P}(X \in A)}{\mathbb{P}(\bar{X} \in \bar{A})}$$
II.3.h) Prove that $$\mathbb{P}(X \in A)\, \mathbb{E}\left[\exp\left(\gamma_0\, q(X,A)^2\right)\right] \leq R(2-R),$$ where $R$ is defined in (72), then prove (53) and conclude the induction $H_{N-1} \Rightarrow H_N$. You should take care to account for the case where (66) is not verified.
II.3.i) Justify (69): for all $\gamma \in [0, \gamma_0]$, for all $r \in ]0,1]$, $$\frac{1}{2-r} \leq \sup_{\lambda \in [0,1]} r^\lambda e^{-\gamma(1-\lambda)^2}$$
grandes-ecoles 2025 Q10 Expectation and Variance of Sums of Independent Variables
Let $p \in [1, +\infty[$. Let $(X_i)_{i \in \llbracket 1, n \rrbracket}$ be a sequence of independent random variables all following a Rademacher distribution. Let $(c_1, \ldots, c_n) \in \mathbf{R}^n$. Show that $$\mathbf{E}\left(\left(\sum_{i=1}^n c_i X_i\right)^2\right) = \sum_{i=1}^n c_i^2.$$
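Expanding the square, the cross terms vanish by independence ($\mathbf{E}(X_iX_j) = 0$ for $i \neq j$) while $\mathbf{E}(X_i^2) = 1$. A brute-force check over all sign vectors (the coefficients below are illustrative):

```python
from itertools import product

def second_moment(c):
    # Average (sum c_i x_i)^2 over all 2^n equally likely Rademacher vectors.
    n = len(c)
    return sum(sum(ci * xi for ci, xi in zip(c, signs)) ** 2
               for signs in product((-1, 1), repeat=n)) / 2 ** n
```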
grandes-ecoles 2025 Q20 Expectation of a Function of a Discrete Random Variable
In this fourth part, $A \in \mathcal{S}_n(\mathbb{R})$ is a symmetric matrix whose eigenvalues are denoted $\lambda_1 \leqslant \lambda_2 \leqslant \cdots \leqslant \lambda_n$. We consider an arbitrary orthonormal basis $\left(\mathbf{u}_1, \ldots, \mathbf{u}_n\right)$. Let $\mathbf{U}$ be a random variable taking values in the finite set $\left\{\mathbf{u}_1, \ldots, \mathbf{u}_n\right\}$, following the uniform distribution on this set. Show that for all $\mathbf{w} \in \mathbb{R}^n$, we have $\mathbb{E}\left[\langle \mathbf{U}, \mathbf{w} \rangle^2\right] = \frac{1}{n} \|\mathbf{w}\|^2$.
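Since the $\mathbf{u}_i$ are orthonormal, $\sum_i \langle \mathbf{u}_i, \mathbf{w}\rangle^2 = \|\mathbf{w}\|^2$, and the uniform average is this sum divided by $n$. A numerical sketch, where the orthonormal basis is (by assumption) taken from a QR factorisation of a random matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # columns: an orthonormal basis
w = rng.standard_normal(n)

lhs = np.mean((Q.T @ w) ** 2)   # E[<U, w>^2] with U uniform on the columns of Q
rhs = np.dot(w, w) / n
```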
grandes-ecoles 2025 Q21 Expectation of a Function of a Discrete Random Variable
In this fourth part, $A \in \mathcal{S}_n(\mathbb{R})$ is a symmetric matrix whose eigenvalues are denoted $\lambda_1 \leqslant \lambda_2 \leqslant \cdots \leqslant \lambda_n$. For $x \in \mathbb{R}$ we denote $\chi_A(x) = \operatorname{det}\left(x \mathbb{I}_n - A\right)$. We consider any orthonormal basis $\left(\mathbf{u}_1, \ldots, \mathbf{u}_n\right)$. Let $\mathbf{U}$ be a random variable defined on a probability space $(\Omega, \mathcal{A}, \mathbb{P})$ taking values in the finite set $\left\{\mathbf{u}_1, \ldots, \mathbf{u}_n\right\}$, and which follows the uniform distribution on this set. We consider the random variable $B = A + \mathbf{U}\mathbf{U}^T$, and for all $x \in \mathbb{R}$, $\chi_B(x) = \operatorname{det}\left(x \mathbb{I}_n - B\right)$.
Let $x \in \mathbb{R} \backslash \left\{\lambda_1, \ldots, \lambda_n\right\}$. Show that the random variable $\chi_B(x)$ has finite expectation, and that, denoting by $\chi_A'$ the derivative of the polynomial $\chi_A$, we have $$\mathbb{E}\left[\chi_B(x)\right] = \chi_A(x) - \frac{1}{n} \chi_A'(x)$$
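A numerical sketch of the identity, using $\chi_A'(x) = \chi_A(x)\operatorname{tr}\big((x\mathbb{I}_n - A)^{-1}\big)$ for $x$ outside the spectrum; the random symmetric $A$, the QR-generated orthonormal basis, and the value $x = 7.3$ are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                                   # a symmetric matrix
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))    # orthonormal basis (columns)

def chi(S, x):
    return np.linalg.det(x * np.eye(len(S)) - S)

x = 7.3                                             # outside the spectrum of A
lhs = np.mean([chi(A + np.outer(u, u), x) for u in Q.T])   # E[chi_B(x)]

chiA = chi(A, x)
dchiA = chiA * np.trace(np.linalg.inv(x * np.eye(n) - A))  # chi_A'(x)
rhs = chiA - dchiA / n
```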
grandes-ecoles 2025 Q22 Expectation of a Function of a Discrete Random Variable
In this fourth part, $A \in \mathcal{S}_n(\mathbb{R})$ is a symmetric matrix whose eigenvalues are denoted $\lambda_1 \leqslant \lambda_2 \leqslant \cdots \leqslant \lambda_n$. For $x \in \mathbb{R}$ we denote $\chi_A(x) = \operatorname{det}\left(x \mathbb{I}_n - A\right)$. We consider any orthonormal basis $\left(\mathbf{u}_1, \ldots, \mathbf{u}_n\right)$. Let $\mathbf{U}$ be a random variable defined on a probability space $(\Omega, \mathcal{A}, \mathbb{P})$ taking values in the finite set $\left\{\mathbf{u}_1, \ldots, \mathbf{u}_n\right\}$, and which follows the uniform distribution on this set. We consider the random variable $B = A + \mathbf{U}\mathbf{U}^T$, and for all $x \in \mathbb{R}$, $\chi_B(x) = \operatorname{det}\left(x \mathbb{I}_n - B\right)$.
Show that for all $k \in \{1, 2, \ldots, n\}$, we have $$\mathbb{E}\left[\chi_B\left(\lambda_k\right)\right] = -\frac{1}{n} \chi_A'\left(\lambda_k\right)$$
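At an eigenvalue, $\chi_A(\lambda_k) = 0$ and, for simple eigenvalues, $\chi_A'(\lambda_k) = \prod_{j \neq k}(\lambda_k - \lambda_j)$, which allows a direct numerical check. The random $A$ and basis below are illustrative (eigenvalues of a random symmetric matrix are simple almost surely):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # orthonormal basis (columns)
lam = np.linalg.eigvalsh(A)                        # eigenvalues of A

def E_chiB(x):
    # E[chi_B(x)]: average det(x I - A - u u^T) over the n basis vectors u
    return np.mean([np.linalg.det(x * np.eye(n) - A - np.outer(u, u)) for u in Q.T])

lhs = [E_chiB(lam[k]) for k in range(n)]
# -chi_A'(lambda_k)/n, with chi_A'(lambda_k) = prod_{j != k} (lambda_k - lambda_j)
rhs = [-np.prod([lam[k] - lam[j] for j in range(n) if j != k]) / n
       for k in range(n)]
```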
grandes-ecoles 2025 Q23 Existence of Expectation or Moments
Justify that for all $n \geq 0$, the random variables $\tilde { S } _ { n } , \tilde { I } _ { n }$ and $\tilde { R } _ { n }$ as well as the random variables $\Delta \tilde { S } _ { n } , \Delta \tilde { I } _ { n }$ and $\Delta \tilde { R } _ { n }$, have finite expectation.
Here $\Delta U_n = U_{n+1} - U_n$ and the random variables take values in $\{0, \ldots, M\}$.
grandes-ecoles 2025 Q23 Expectation of a Function of a Discrete Random Variable
In this fourth part, $A \in \mathcal{S}_n(\mathbb{R})$ is a symmetric matrix whose eigenvalues are denoted $\lambda_1 \leqslant \lambda_2 \leqslant \cdots \leqslant \lambda_n$. For $x \in \mathbb{R}$ we denote $\chi_A(x) = \operatorname{det}\left(x \mathbb{I}_n - A\right)$. We consider any orthonormal basis $\left(\mathbf{u}_1, \ldots, \mathbf{u}_n\right)$. Let $\mathbf{U}$ be a random variable defined on a probability space $(\Omega, \mathcal{A}, \mathbb{P})$ taking values in the finite set $\left\{\mathbf{u}_1, \ldots, \mathbf{u}_n\right\}$, and which follows the uniform distribution on this set. We consider the random variable $B = A + \mathbf{U}\mathbf{U}^T$, and for all $x \in \mathbb{R}$, $\chi_B(x) = \operatorname{det}\left(x \mathbb{I}_n - B\right)$.
Prove that there exists $x \in \mathbb{R}$ such that $\mathbb{E}\left[\chi_B(x)\right] \neq 0$.
jee-main 2025 Q9 Probability Distribution Construction and Parameter Determination
Let $A = [a_{ij}]$ be a $2 \times 2$ matrix whose entries $a_{ij}$ are chosen at random from $\{0, 1\}$, all $16$ matrices being equally likely. Let the random variable $X$ denote the determinant of $A$. Then the variance of $X$ is:
(1) $\frac { 3 } { 4 }$
(2) $\frac { 5 } { 8 }$
(3) $\frac { 3 } { 8 }$
(4) $\frac { 1 } { 4 }$
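With all $16$ matrices equally likely, the distribution of $X = a_{11}a_{22} - a_{12}a_{21}$ can be enumerated directly; the check below confirms option (3):

```python
from itertools import product
from fractions import Fraction

# All 16 equally likely 2x2 matrices with entries in {0, 1}
dets = [a * d - b * c for a, b, c, d in product((0, 1), repeat=4)]
mean = Fraction(sum(dets), 16)
variance = Fraction(sum(x * x for x in dets), 16) - mean ** 2
```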