UFM Statistics


grandes-ecoles 2023 Q12 Probability Bounds and Inequalities for Discrete Variables
Let $X$ be a Bernoulli random variable with parameter $\lambda \in ]0,1[$. Show that $$d_{VT}\left(p_X, \pi_\lambda\right) = \lambda\left(1 - e^{-\lambda}\right).$$ Deduce that $$d_{VT}\left(p_X, \pi_\lambda\right) \leq \lambda^2.$$
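As a numerical sanity check (not part of the exam), one can compare a truncated total-variation sum against the closed form. The sketch below assumes the usual conventions, not stated in this excerpt: $\pi_\lambda$ is the Poisson distribution of parameter $\lambda$ and $d_{VT}(p,q) = \frac{1}{2}\sum_k |p(k)-q(k)|$.

```python
import math

# Assumes the convention d_VT(p, q) = (1/2) * sum_k |p(k) - q(k)|.
def tv_bernoulli_poisson(lam, terms=60):
    """Total variation distance between Bernoulli(lam) and Poisson(lam),
    truncating the Poisson support at `terms` (negligible tail for lam < 1)."""
    ber = [1 - lam, lam] + [0.0] * (terms - 1)
    poi = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(terms + 1)]
    return 0.5 * sum(abs(b - p) for b, p in zip(ber, poi))

for lam in (0.1, 0.3, 0.5, 0.9):
    closed_form = lam * (1 - math.exp(-lam))
    assert abs(tv_bernoulli_poisson(lam) - closed_form) < 1e-12
    assert closed_form <= lam**2  # since 1 - e^{-x} <= x
```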
grandes-ecoles 2023 Q13 Expectation and Variance of Sums of Independent Variables
We assume that for all $i,j \in \{1,\ldots,d\}$, $N_{i,j}$ is a random variable taking values in $\mathbb{N}$ and such that $N_{i,j}^2$ has finite expectation. For all $i \in \{1,\ldots,d\}$, we introduce the random variable $L_i = (N_{i,1}, \ldots, N_{i,d})$. We consider a family of independent random variables $(L_i^{n,k})_{n \geqslant 1, k \geqslant 1}$ where for all $i$, $n$, $k$, $L_i^{n,k}$ has the same distribution as $L_i$. Let $X_0 = (X_{0,i})_{1 \leqslant i \leqslant d}$ be a random variable taking values in $\mathscr{M}_{1,d}(\mathbb{N})$. We define by recursion for $n \geqslant 0$: $$X_{n+1} = \sum_{i=1}^{d} \sum_{k=1}^{X_{n,i}} L_i^{n,k}.$$ We introduce $M \in \mathscr{M}_d\left(\mathbb{R}_{+}\right)$ defined by $M_{i,j} = \mathbb{E}(N_{i,j})$ and $x_{n,j} = \mathbb{E}(X_{n,j})$.
(a) Show that, for all $y \in \mathscr{M}_{1,d}(\mathbb{N})$ and $1 \leqslant j \leqslant d$, $$\mathbb{E}\left(X_{n+1,j} \mathbf{1}_{X_n = y}\right) = (yM)_j \mathbb{P}(X_n = y).$$ (One may use without proof the fact that the random variables $L_i^{n,k}$ and $\mathbf{1}_{X_n = y}$ are independent.)
(b) Deduce that, for all $n \geqslant 0$, $$x_{n+1} = x_n M.$$
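The relation $x_{n+1} = x_n M$ can be illustrated by simulation. The sketch below is a minimal check under illustrative assumptions that are not taken from the exam: $d = 2$, offspring counts $N_{i,j} \sim \text{Poisson}(M_{i,j})$ (so that $\mathbb{E}(N_{i,j}) = M_{i,j}$), and a fixed $X_0$; the empirical mean of $X_n$ is compared with $x_0 M^n$.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's algorithm for a Poisson(lam) sample (the stdlib has none)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

# Illustrative parameters (not from the exam): d = 2 types.
M = [[0.5, 0.3], [0.2, 0.6]]
x0 = [3, 2]
n_gens, trials = 3, 20000
rng = random.Random(0)

totals = [0.0, 0.0]
for _ in range(trials):
    x = x0[:]
    for _ in range(n_gens):  # X_{n+1} = sum_i sum_{k <= X_{n,i}} L_i^{n,k}
        nxt = [0, 0]
        for i in range(2):
            for _ in range(x[i]):
                for j in range(2):
                    nxt[j] += poisson(M[i][j], rng)
        x = nxt
    for j in range(2):
        totals[j] += x[j]

# Theory (question 13b iterated): E(X_n) = x_0 M^n.
def vec_mat(v, A):
    return [sum(v[i] * A[i][j] for i in range(len(v))) for j in range(len(A[0]))]

expected = x0
for _ in range(n_gens):
    expected = vec_mat(expected, M)

for j in range(2):
    assert abs(totals[j] / trials - expected[j]) < 0.1
```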
grandes-ecoles 2023 Q14 Expectation and Variance of Sums of Independent Variables
Let $\mathscr{I}$ be a finite set and $(Y_i)_{i \in \mathscr{I}}$ be a family of random variables that are pairwise independent, take real values and whose squares have finite expectation. Show that $$\mathbb{E}\left(\left(\sum_{i \in \mathscr{I}} Y_i\right)^2\right) = \left(\sum_{i \in \mathscr{I}} \mathbb{E}(Y_i)\right)^2 + \sum_{i \in \mathscr{I}} \operatorname{Var}(Y_i).$$
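The point of the identity is that pairwise independence suffices. A hand-checkable illustration (my example, not the exam's) uses the classic family $X$, $Y$ uniform on $\{-1,1\}$ independent and $Z = XY$, which is pairwise independent but not mutually independent; exact enumeration over the four outcomes confirms the formula.

```python
from itertools import product
from fractions import Fraction

# Pairwise-independent but not mutually independent: X, Y uniform on {-1, 1},
# Z = X * Y.
outcomes = [(x, y, x * y) for x, y in product((-1, 1), repeat=2)]
prob = Fraction(1, len(outcomes))

def expectation(f):
    return sum(prob * f(*o) for o in outcomes)

mean_sum = expectation(lambda x, y, z: x + y + z)           # sum of E(Y_i)
second_moment = expectation(lambda x, y, z: (x + y + z)**2)

variances = []
for i in range(3):
    m = expectation(lambda *o, i=i: o[i])
    variances.append(expectation(lambda *o, i=i: o[i]**2) - m**2)

# E((sum Y_i)^2) = (sum E(Y_i))^2 + sum Var(Y_i)
assert second_moment == mean_sum**2 + sum(variances) == 3
```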
grandes-ecoles 2023 Q15 Expectation and Variance of Sums of Independent Variables
We use the setup of the third part. For $u \in \mathscr{M}_{d,1}(\mathbb{R})$, we denote by $T(u) = (T_i(u))_{1 \leqslant i \leqslant d} \in \mathbb{R}^d$ the vector defined by $$T_i(u) = \operatorname{Var}\left(\langle L_i, u \rangle\right) \quad \text{for } i \in \{1,\ldots,d\}.$$
(a) Show that for all $u \in \mathscr{M}_{d,1}(\mathbb{R})$, $y \in \mathscr{M}_{1,d}(\mathbb{N})$ and $n \geqslant 0$, $$\mathbb{E}\left(\langle X_{n+1}, u \rangle^2 \mathbf{1}_{X_n = y}\right) = \mathbb{P}(X_n = y)\left(\langle y, Mu \rangle^2 + \langle y, T(u) \rangle\right).$$ (One may use without proof the fact that, for all $n \geqslant 0$, the random variables $\sum_{j=1}^{d} u_j L_{i,j}^{n,k} \mathbf{1}_{X_n = y}$ are pairwise independent when $k$ and $i$ vary.)
(b) Show that for all $u \in \mathscr{M}_{d,1}(\mathbb{R})$ and $n \geqslant 0$, $$\mathbb{E}\left(\langle X_{n+1}, u \rangle^2\right) = \mathbb{E}\left(\langle X_n, Mu \rangle^2\right) + \langle x_0 M^n, T(u) \rangle.$$
grandes-ecoles 2023 Q16 Probability Distribution Construction and Parameter Determination
If $x$ and $y$ are two probability distributions on $\mathbf{N}$, we define the map $x * y : \mathbf{N} \rightarrow \mathbf{R}_+$ by $$\forall k \in \mathbf{N} \quad (x * y)(k) = \sum_{i=0}^{k} x(i) y(k-i) = \sum_{i+j=k} x(i) y(j).$$ Show that $x * y$ is a probability distribution on $\mathbf{N}$.
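A sketch of the required verification (not part of the statement): each $(x*y)(k)$ is a finite sum of nonnegative terms, hence nonnegative, and the total mass is obtained by regrouping the nonnegative double sum (Tonelli for families of nonnegative reals):

```latex
\sum_{k \in \mathbf{N}} (x * y)(k)
  = \sum_{k \in \mathbf{N}} \sum_{i+j=k} x(i)\, y(j)
  = \sum_{(i,j) \in \mathbf{N}^2} x(i)\, y(j)
  = \Bigl(\sum_{i \in \mathbf{N}} x(i)\Bigr) \Bigl(\sum_{j \in \mathbf{N}} y(j)\Bigr)
  = 1 \cdot 1 = 1.
```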
grandes-ecoles 2023 Q16 Expectation and Variance of Sums of Independent Variables
We use the setup of the third part. For $u \in \mathscr{M}_{d,1}(\mathbb{R})$, we denote by $T(u) = (T_i(u))_{1 \leqslant i \leqslant d} \in \mathbb{R}^d$ the vector defined by $$T_i(u) = \operatorname{Var}\left(\langle L_i, u \rangle\right) \quad \text{for } i \in \{1,\ldots,d\}.$$
Show that for all $n \geqslant 0$, $$\mathbb{E}\left(\langle X_n, u \rangle^2\right) = \mathbb{E}\left(\langle X_0, M^n u \rangle^2\right) + \sum_{k=0}^{n-1} \langle x_0 M^k, T\left(M^{n-1-k} u\right) \rangle$$ (with the convention that the sum indexed by $k$ is zero if $n = 0$).
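For orientation (a sketch, not the exam's worked solution): writing $a_n(u) = \mathbb{E}\left(\langle X_n, u \rangle^2\right)$, question 15(b) gives the recursion $a_{n+1}(u) = a_n(Mu) + \langle x_0 M^n, T(u) \rangle$, and unrolling it yields the closed form by induction on $n$:

```latex
a_n(u) = a_{n-1}(Mu) + \langle x_0 M^{n-1}, T(u) \rangle
       = a_{n-2}(M^2 u) + \langle x_0 M^{n-2}, T(Mu) \rangle
         + \langle x_0 M^{n-1}, T(u) \rangle
       = \cdots
       = a_0(M^n u) + \sum_{k=0}^{n-1} \langle x_0 M^k, T(M^{n-1-k} u) \rangle.
```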
grandes-ecoles 2023 Q17 Expectation and Variance of Sums of Independent Variables
Let $X$ and $Y$ be two independent random variables, taking values in $\mathbf{N}$, defined on the same probability space $(\Omega, \mathcal{A}, P)$. Prove the relation $$p_{X+Y} = p_X * p_Y$$
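As an illustration of both convolution questions at once (my example, not the exam's): for independent $X \sim \mathcal{B}(3,p)$ and $Y \sim \mathcal{B}(5,p)$, the convolution of the two binomial laws should again have total mass $1$ and equal the law of $X+Y$, which is $\mathcal{B}(8,p)$ by the standard stability of binomials with a common parameter. Exact arithmetic with `Fraction` avoids rounding.

```python
from fractions import Fraction
from math import comb

def binom_pmf(n, p):
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def convolve(x, y):
    """(x * y)(k) = sum_{i+j=k} x(i) y(j) for finitely supported pmfs."""
    out = [Fraction(0)] * (len(x) + len(y) - 1)
    for i, xi in enumerate(x):
        for j, yj in enumerate(y):
            out[i + j] += xi * yj
    return out

p = Fraction(2, 5)
px, py = binom_pmf(3, p), binom_pmf(5, p)
pxy = convolve(px, py)

assert sum(pxy) == 1           # x * y is again a probability distribution
assert pxy == binom_pmf(8, p)  # p_{X+Y} = p_X * p_Y for independent binomials
```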
grandes-ecoles 2023 Q17 Monotonicity and Convergence of Sequences Defined via Expectations
We use the notations of the previous parts. We assume that there exist an eigenvalue $\lambda > 0$ and an associated column eigenvector $h \in \mathscr{M}_{d,1}\left(\mathbb{R}_{+}^{*}\right)$: $$Mh = \lambda h,$$ and that there exist $\nu \in \mathscr{P}$ and $c > 0$ such that for all $i,j \in \{1,\ldots,d\}$, $$M_{i,j} \geqslant c\nu_j.$$
Show that there exist $\pi \in \mathscr{P}$ and $h' \in \mathscr{M}_{d,1}\left(\mathbb{R}_{+}^{*}\right)$ and $C > 0$ and $\gamma \in [0,1[$, such that $\pi M = \lambda \pi$ and for all $n \geqslant 0$, $$\sum_{i=1}^{d} \sum_{j=1}^{d} \left| \lambda^{-n} \left(M^n\right)_{i,j} - h_i' \pi_j \right| \leqslant C\gamma^n.$$
grandes-ecoles 2023 Q20 Probability Bounds and Inequalities for Discrete Variables
We use the notations of the previous parts. We assume that there exist an eigenvalue $\lambda > 0$ and an associated column eigenvector $h \in \mathscr{M}_{d,1}\left(\mathbb{R}_{+}^{*}\right)$ with $Mh = \lambda h$, and that there exist $\nu \in \mathscr{P}$ and $c > 0$ such that $M_{i,j} \geqslant c\nu_j$ for all $i,j$. Let $\pi \in \mathscr{P}$ be such that $\pi M = \lambda \pi$, and let $C > 0$, $\gamma \in [0,1[$ be as in question 17.
(a) Show that for all $n \geqslant 0$ and $u \in \mathscr{M}_{d,1}(\mathbb{R})$ such that $\langle u, \pi \rangle = 0$, $$\left\| M^n u \right\|_1 \leqslant C(\lambda\gamma)^n \|u\|_1.$$
(b) Deduce that there exists $C_1 \geqslant 0$ such that for all $n \geqslant 0$ and all column vectors $u \in \mathscr{M}_{d,1}(\mathbb{R})$ such that $\langle u, \pi \rangle = 0$, $$\mathbb{E}\left(\langle X_n, u \rangle^2\right) \leqslant C_1 \|u\|_1^2 \left(\lambda^{2n} \left(\sum_{k=0}^{n-1} \lambda^{-k} \gamma^{2n-2k}\right) + (\lambda\gamma)^{2n}\right).$$
grandes-ecoles 2023 Q21 Convergence of Expectations or Moments
We consider the matrix $H_t$, the stationary probability $\pi$, and $\lambda$ the smallest nonzero eigenvalue of $u : X \mapsto (I_N - K)X$. We have established that $\|H_t E_i - \pi[i] U\| \leq e^{-\lambda t} \sqrt{\pi[i]}$ and that $$H_t[i,j] - \pi[j] = \sum_{k=1}^{N} \left(H_{t/2}[i,k] - \pi[k]\right)\left(H_{t/2}[k,j] - \pi[j]\right)$$ Deduce that for all $(i,j) \in \llbracket 1;N \rrbracket^2$ and all $t \in \mathbf{R}_+$, $$\left|H_t[i,j] - \pi[j]\right| \leq e^{-\lambda t} \sqrt{\frac{\pi[j]}{\pi[i]}}$$ Determine $\lim_{t \rightarrow +\infty} H_t[i,j]$.
grandes-ecoles 2023 Q21 Monotonicity and Convergence of Sequences Defined via Expectations
We suppose in the rest of this part that $\lambda > 1$ and we introduce the random row vector $$W_n = \lambda^{-n}\left(X_n - \|X_n\|_1 \pi\right).$$
(a) Show that the series $\displaystyle\sum_{n \geqslant 1} \left(\sum_{k=0}^{n-1} \lambda^{-k} \gamma^{2n-2k}\right)$ converges.
(b) Let $w \in (\mathbb{R}_{+})^d$ and let $e_0 = (1,\ldots,1)$. Show that $$\left\langle w - \|w\|_1 \pi, \pi \right\rangle = \left\langle w, \pi - \langle \pi, \pi \rangle e_0 \right\rangle$$ and that the vector $\pi - \langle \pi, \pi \rangle e_0$ is orthogonal to $\pi$.
(c) Show that the series $\displaystyle\sum_{n \geqslant 0} \mathbb{E}\left(\|W_n\|_2^2\right)$ is convergent. Deduce that the sequence $\left(\mathbb{E}\left(\|W_n\|_2^2\right)\right)_{n \geqslant 0}$ tends to $0$. (One may for example decompose $X_n$ in a well-chosen orthonormal basis of $\mathbb{R}^d$.)
(d) Show that for all $\varepsilon > 0$, $$\lim_{n \rightarrow \infty} \mathbb{P}\left(\|W_n\|_2 \geqslant \varepsilon\right) = 0.$$
grandes-ecoles 2023 Q22 Probability Bounds and Inequalities for Discrete Variables
We suppose that $\lambda > 1$ and we use the random row vector $W_n = \lambda^{-n}\left(X_n - \|X_n\|_1 \pi\right)$.
Show that the event $\left\{\lim_{n \rightarrow +\infty} W_n = 0_{\mathbb{R}^d}\right\}$ has probability $1$. (One may begin by computing the probability of the event $$\left\{ \forall m \geqslant 0, \ \exists k \geqslant m \mid \|W_k\|_2 \geqslant \varepsilon \right\}$$ for all $\varepsilon > 0$.)
grandes-ecoles 2024 Q8c Expectation and Variance via Combinatorial Counting
For a natural number $n \geqslant 2$, we consider the probability space $(\mathfrak{S}_{n}, \mathscr{P}(\mathfrak{S}_{n}))$ equipped with the uniform probability. We define a random variable $Z_{n}$ by $Z_{n}(\sigma) = \nu(\sigma)$, where $\nu(\sigma)$ denotes the number of fixed points of $\sigma$.
Determine the average number of fixed points of a random permutation and its limit as $n$ tends to $+\infty$.
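A brute-force check over small symmetric groups (reading $\nu(\sigma)$ as the number of fixed points of $\sigma$) suggests the answer: by linearity of expectation each point is fixed with probability $1/n$, so the average is exactly $1$ for every $n$, and the limit is $1$ as well.

```python
from itertools import permutations
from fractions import Fraction

def average_fixed_points(n):
    """Exact average number of fixed points over the uniform law on S_n."""
    perms = list(permutations(range(n)))
    total = sum(sum(1 for i, v in enumerate(p) if i == v) for p in perms)
    return Fraction(total, len(perms))

# Each point is fixed with probability 1/n, so E(Z_n) = n * (1/n) = 1.
for n in range(2, 7):
    assert average_fixed_points(n) == 1
```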
grandes-ecoles 2024 Q9 Expectation and Variance via Combinatorial Counting
Let $n$ be a nonzero natural number. For any permutation $\sigma \in \mathfrak{S}_{n}$, we recall that there exists, up to order, a unique decomposition $\sigma = c_{1} c_{2} \cdots c_{\omega(\sigma)}$, where $\omega(\sigma) \in \mathbb{N}^{*}$ and $c_{1}, \ldots, c_{\omega(\sigma)}$ are cycles with disjoint supports of respective lengths $\ell_{1} \leqslant \ell_{2} \leqslant \cdots \leqslant \ell_{\omega(\sigma)}$ and $\ell_{1} + \ell_{2} + \cdots + \ell_{\omega(\sigma)} = n$. We thus obtain a map $\omega : \mathfrak{S}_{n} \rightarrow \mathbb{N}$. For an integer $k$ at most $n$, we denote by $s(n,k)$ the number of permutations $\sigma$ of $\mathfrak{S}_{n}$ such that $\omega(\sigma) = k$. We then consider, on the probability space $(\mathfrak{S}_{n}, \mathscr{P}(\mathfrak{S}_{n}))$ equipped with the uniform probability, the random variable $X_{n}$ defined by $X_{n}(\sigma) = \omega(\sigma)$.
Calculate, for $n \in \{2, 3, 4\}$, the quantity $\frac{1}{n!} \sum_{\sigma \in \mathfrak{S}_{n}} \omega(\sigma)$.
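The three requested values can be obtained by brute-force enumeration; they turn out to coincide with the harmonic numbers $H_2 = 3/2$, $H_3 = 11/6$, $H_4 = 25/12$, which hints at the general pattern used later.

```python
from itertools import permutations
from fractions import Fraction

def cycle_count(perm):
    """Number of cycles of a permutation given as a tuple (0-indexed)."""
    seen, cycles = set(), 0
    for start in range(len(perm)):
        if start not in seen:
            cycles += 1
            i = start
            while i not in seen:
                seen.add(i)
                i = perm[i]
    return cycles

def average_cycles(n):
    perms = list(permutations(range(n)))
    return Fraction(sum(cycle_count(p) for p in perms), len(perms))

# The values equal the harmonic numbers H_n = 1 + 1/2 + ... + 1/n.
assert average_cycles(2) == Fraction(3, 2)
assert average_cycles(3) == Fraction(11, 6)
assert average_cycles(4) == Fraction(25, 12)
```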
grandes-ecoles 2024 Q12 Probability Bounds and Inequalities for Discrete Variables
Let $X$ be a random variable defined on a probability space $(\Omega , \mathcal{A} , \mathbf{P})$ with values in $\mathbf{N}$ and admitting an expectation $\mathbf{E}(X)$ and a variance $\mathbf{V}(X)$. Show that if $\mathbf{E}(X) \neq 0$, then $\mathbf{P}(X = 0) \leq \frac{\mathbf{V}(X)}{(\mathbf{E}(X))^{2}}$. Hint: note that $(X = 0) \subset (|X - \mathbf{E}(X)| \geq \mathbf{E}(X))$.
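A quick numerical check of the inequality on two standard families (my choice of examples): a truncated Poisson, where $\mathbf{P}(X=0) = e^{-\lambda}$ and $\mathbf{V}(X)/\mathbf{E}(X)^2 = 1/\lambda$, and a binomial, where $\mathbf{P}(X=0) = (1-p)^n$ and the bound is $(1-p)/(np)$.

```python
import math

def check_bound(pmf):
    """Verify P(X = 0) <= V(X) / E(X)^2 for a finitely supported pmf on N."""
    e = sum(k * q for k, q in enumerate(pmf))
    v = sum(k * k * q for k, q in enumerate(pmf)) - e * e
    assert e != 0
    assert pmf[0] <= v / (e * e) + 1e-15

# Truncated Poisson(lam): P(X=0) = e^{-lam}, V/E^2 ~ 1/lam.
for lam in (0.5, 1.0, 2.0):
    check_bound([math.exp(-lam) * lam**k / math.factorial(k) for k in range(60)])

# Binomial(n, p): P(X=0) = (1-p)^n, V/E^2 = (1-p)/(np).
n, p = 10, 0.2
pmf = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
check_bound(pmf)
```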
grandes-ecoles 2024 Q12 Convergence of Expectations or Moments
Let $n$ be a nonzero natural number. For any permutation $\sigma \in \mathfrak{S}_{n}$, we recall that there exists, up to order, a unique decomposition $\sigma = c_{1} c_{2} \cdots c_{\omega(\sigma)}$, where $\omega(\sigma) \in \mathbb{N}^{*}$ and $c_{1}, \ldots, c_{\omega(\sigma)}$ are cycles with disjoint supports of respective lengths $\ell_{1} \leqslant \ell_{2} \leqslant \cdots \leqslant \ell_{\omega(\sigma)}$ and $\ell_{1} + \ell_{2} + \cdots + \ell_{\omega(\sigma)} = n$. We consider, on the probability space $(\mathfrak{S}_{n}, \mathscr{P}(\mathfrak{S}_{n}))$ equipped with the uniform probability, the random variable $X_{n}$ defined by $X_{n}(\sigma) = \omega(\sigma)$.
Prove that $\mathbb{E}\left[X_{n}\right] \underset{n \rightarrow +\infty}{=} \ln(n) + \gamma + O\left(\frac{1}{n}\right)$.
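Granting $\mathbb{E}[X_n] = H_n = \sum_{i=1}^{n} 1/i$ (which the surrounding questions suggest), the claim reduces to the classical expansion $H_n = \ln(n) + \gamma + O(1/n)$. Numerically the remainder behaves like $1/(2n)$, comfortably inside the stated $O(1/n)$:

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def harmonic(n):
    return sum(1.0 / i for i in range(1, n + 1))

for n in (10, 100, 1000, 10000):
    err = harmonic(n) - math.log(n) - EULER_GAMMA
    # The remainder is ~ 1/(2n), consistent with the O(1/n) claim.
    assert 0 < err < 1.0 / n
```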
grandes-ecoles 2024 Q13b Expectation and Variance via Combinatorial Counting
Let $n$ be a nonzero natural number. For any permutation $\sigma \in \mathfrak{S}_{n}$, we recall that there exists, up to order, a unique decomposition $\sigma = c_{1} c_{2} \cdots c_{\omega(\sigma)}$, where $\omega(\sigma) \in \mathbb{N}^{*}$ and $c_{1}, \ldots, c_{\omega(\sigma)}$ are cycles with disjoint supports of respective lengths $\ell_{1} \leqslant \ell_{2} \leqslant \cdots \leqslant \ell_{\omega(\sigma)}$ and $\ell_{1} + \ell_{2} + \cdots + \ell_{\omega(\sigma)} = n$. For an integer $k$ at most $n$, we denote by $s(n,k)$ the number of permutations $\sigma$ of $\mathfrak{S}_{n}$ such that $\omega(\sigma) = k$. We consider, on the probability space $(\mathfrak{S}_{n}, \mathscr{P}(\mathfrak{S}_{n}))$ equipped with the uniform probability, the random variable $X_{n}$ defined by $X_{n}(\sigma) = \omega(\sigma)$.
Deduce that $$\frac{1}{n!} \sum_{k=1}^{n} k^{2} s(n,k) = \mathbb{E}\left[X_{n}\right] + \left(\sum_{i=1}^{n} \sum_{j=1}^{n} \frac{1}{ij} - \sum_{i=1}^{n} \frac{1}{i^{2}}\right).$$
grandes-ecoles 2024 Q13a Expectation and Variance via Combinatorial Counting
Let $n$ be a nonzero natural number. For an integer $k$ at most $n$, we denote by $s(n,k)$ the number of permutations $\sigma$ of $\mathfrak{S}_n$ such that $\omega(\sigma) = k$.
Show that $$\frac{1}{n!} \sum_{k=1}^{n} k(k-1) s(n,k) = \sum_{i=1}^{n} \sum_{j=1}^{n} \frac{1}{ij} - \sum_{i=1}^{n} \frac{1}{i^2}.$$
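The identity can be verified exactly for small $n$ by generating the unsigned Stirling numbers of the first kind with the standard recurrence $s(n,k) = s(n-1,k-1) + (n-1)\,s(n-1,k)$, and noting that the double sum on the right is just $H_n^2$:

```python
from fractions import Fraction
from math import factorial

def stirling_first_kind(n):
    """Row of unsigned Stirling numbers of the first kind: entry k counts
    permutations of S_n with k cycles, via
    s(n, k) = s(n-1, k-1) + (n-1) * s(n-1, k)."""
    row = [1]  # n = 0
    for m in range(1, n + 1):
        prev = row + [0]
        row = [0] * (m + 1)
        for k in range(1, m + 1):
            row[k] = prev[k - 1] + (m - 1) * prev[k]
    return row

for n in range(1, 8):
    s = stirling_first_kind(n)
    lhs = Fraction(sum(k * (k - 1) * s[k] for k in range(n + 1)), factorial(n))
    H = sum(Fraction(1, i) for i in range(1, n + 1))
    rhs = H * H - sum(Fraction(1, i * i) for i in range(1, n + 1))
    assert lhs == rhs
```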
grandes-ecoles 2024 Q14 Probability Bounds and Inequalities for Discrete Variables
Show that if $p_n = o\left(\frac{1}{n^2}\right)$ in the neighborhood of $+\infty$, then $\lim_{n \rightarrow +\infty} \mathbf{P}\left(A_n > 0\right) = 0$.
grandes-ecoles 2024 Q14a Convergence of Expectations or Moments
Let $n$ be a nonzero natural number. For any permutation $\sigma \in \mathfrak{S}_{n}$, we recall that there exists, up to order, a unique decomposition $\sigma = c_{1} c_{2} \cdots c_{\omega(\sigma)}$, where $\omega(\sigma) \in \mathbb{N}^{*}$ and $c_{1}, \ldots, c_{\omega(\sigma)}$ are cycles with disjoint supports of respective lengths $\ell_{1} \leqslant \ell_{2} \leqslant \cdots \leqslant \ell_{\omega(\sigma)}$ and $\ell_{1} + \ell_{2} + \cdots + \ell_{\omega(\sigma)} = n$.
Show that $$\frac{1}{n!} \sum_{\sigma \in \mathfrak{S}_{n}} \omega(\sigma)^{2} \underset{n \rightarrow +\infty}{=} \ln(n)^{2} + (2\gamma+1)\ln(n) + c + O\left(\frac{\ln(n)}{n}\right)$$ for a real number $c$ to be specified.
grandes-ecoles 2024 Q14b Convergence of Expectations or Moments
Let $n$ be a nonzero natural number. For any permutation $\sigma \in \mathfrak{S}_{n}$, we recall that there exists, up to order, a unique decomposition $\sigma = c_{1} c_{2} \cdots c_{\omega(\sigma)}$, where $\omega(\sigma) \in \mathbb{N}^{*}$ and $c_{1}, \ldots, c_{\omega(\sigma)}$ are cycles with disjoint supports of respective lengths $\ell_{1} \leqslant \ell_{2} \leqslant \cdots \leqslant \ell_{\omega(\sigma)}$ and $\ell_{1} + \ell_{2} + \cdots + \ell_{\omega(\sigma)} = n$.
Show that $$\frac{1}{n!} \sum_{\sigma \in \mathfrak{S}_{n}} (\omega(\sigma) - \ln(n))^{2} \underset{n \rightarrow +\infty}{=} \ln(n) + c + O\left(\frac{\ln(n)}{n}\right)$$
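Without giving the constant away, one can watch the left-hand side minus $\ln(n)$ stabilize numerically. The sketch below assumes $\mathbb{E}[X_n] = H_n$ and the second-moment identity $\mathbb{E}[X_n^2] = H_n + H_n^2 - \sum_{i=1}^{n} 1/i^2$ (both taken from the surrounding questions 12 and 13b), and expands $\mathbb{E}\left((X_n - \ln n)^2\right) = \mathbb{E}[X_n^2] - 2\ln(n)\,\mathbb{E}[X_n] + \ln(n)^2$.

```python
import math

def harmonic(n, power=1):
    return sum(1.0 / i**power for i in range(1, n + 1))

def centered_second_moment(n):
    """E((X_n - ln n)^2) from E(X_n) = H_n and
    E(X_n^2) = H_n + H_n^2 - sum 1/i^2 (identity of question 13b)."""
    h1, h2 = harmonic(n), harmonic(n, 2)
    second = h1 + h1 * h1 - h2
    return second - 2 * math.log(n) * h1 + math.log(n) ** 2

# The difference E((X_n - ln n)^2) - ln n should approach a constant c.
vals = [centered_second_moment(10**k) - math.log(10**k) for k in (2, 3, 4, 5)]
gaps = [abs(b - a) for a, b in zip(vals, vals[1:])]
assert gaps[0] > gaps[1] > gaps[2]      # successive corrections shrink
assert abs(vals[-1] - vals[-2]) < 1e-3  # the sequence stabilizes
```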