Convergence of Expectations or Moments

Questions asking for a proof that a sequence of expectations, variances, or moments converges to a specified limit, including asymptotic expansions of expectations.

grandes-ecoles 2016 Q6b
Let $K > 0$ and $g : \mathbb{R} \rightarrow \mathbb{R}$ be a positive bounded function with support in $[0, K]$. The function $f$ is the pointwise limit of the sequence $f_n$. Show that the function $f$ satisfies the following equality on $\mathbb{R}$ $$f(x) = g(x) + \sum_{i=0}^{+\infty} p_i f\left(x - x_i\right) \tag{E}$$
grandes-ecoles 2019 Q3
Let $t$ be a real number. We set $$\forall n \in \mathbb{N}^{\star}, \quad X_n = \sum_{k=1}^{n} \frac{\varepsilon_k}{2^k}$$ where $(\varepsilon_n)_{n \geqslant 1}$ is a sequence of independent random variables taking values in $\{-1,1\}$ with $\mathbb{P}(\varepsilon_n = 1) = \mathbb{P}(\varepsilon_n = -1) = 1/2$ for all $n \geqslant 1$, and $$\operatorname{sinc}\, t = \begin{cases} \frac{\sin t}{t} & \text{if } t \neq 0 \\ 1 & \text{otherwise.} \end{cases}$$
Determine the pointwise limit of the sequence of functions $(\Phi_{X_n})_{n \geqslant 1}$.
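Each $\varepsilon_k$ is a symmetric $\pm 1$ variable, so $\Phi_{X_n}(t) = \prod_{k=1}^{n} \cos(t/2^k)$, and the expected pointwise limit $\operatorname{sinc} t$ is exactly the Viète-type product $\prod_{k \geqslant 1} \cos(t/2^k) = \frac{\sin t}{t}$. A minimal numerical sketch (the values of $t$ and $n$ below are illustrative choices, not from the problem):

```python
import math

def phi_Xn(t: float, n: int) -> float:
    """Characteristic function of X_n = sum_{k<=n} eps_k / 2^k:
    E[exp(i t eps_k / 2^k)] = cos(t / 2^k), so Phi_{X_n} is the product below."""
    out = 1.0
    for k in range(1, n + 1):
        out *= math.cos(t / 2**k)
    return out

def sinc(t: float) -> float:
    return math.sin(t) / t if t != 0 else 1.0

# The partial products approach sinc t as n grows.
t = 2.5
approx = phi_Xn(t, 60)
```

For $n = 60$ the remaining factors of the infinite product differ from $1$ by far less than machine precision, so the partial product agrees with $\operatorname{sinc} t$ to full floating-point accuracy.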
grandes-ecoles 2020 Q3
Show that $\phi_X$ is continuous on $\mathbb{R}$.
grandes-ecoles 2021 Q40
For all $(i,j) \in (\mathbb{N}^{\star})^{2}$ and for all $C > 0$, we set $\sigma_{ij}(C) = \sqrt{\mathbb{V}\left(X_{ij} \mathbb{1}_{|X_{ij}| \leqslant C}\right)}$. If $\sigma_{ij}(C) \neq 0$, we set $$\widehat{X}_{ij}(C) = \frac{1}{\sigma_{ij}(C)} \left(X_{ij} \mathbb{1}_{|X_{ij}| \leqslant C} - \mathbb{E}\left(X_{ij} \mathbb{1}_{|X_{ij}| \leqslant C}\right)\right).$$
Show that $$\lim_{C \rightarrow +\infty} \mathbb{E}\left(\left(X_{ij} - \widehat{X}_{ij}(C)\right)^{2}\right) = 0.$$
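In the random-matrix setting this question comes from, the entries are centered with unit variance; assuming for the sketch that $X_{ij}$ is standard normal (an assumption, not stated in the excerpt), the truncated mean vanishes by symmetry, $\sigma(C)^2 = \operatorname{erf}(C/\sqrt{2}) - 2C\varphi(C)$ with $\varphi$ the standard normal density, and a short computation ($\mathbb{E}[X\widehat{X}(C)] = \sigma(C)$, $\mathbb{E}[\widehat{X}(C)^2] = 1$) gives $\mathbb{E}[(X - \widehat{X}(C))^2] = 2(1 - \sigma(C)) \to 0$:

```python
import math

def sigma_C(C: float) -> float:
    """sqrt of Var(X 1_{|X|<=C}) for X ~ N(0,1); the truncated mean is 0
    by symmetry, so this is sqrt(E[X^2 1_{|X|<=C}])."""
    phi = math.exp(-C * C / 2) / math.sqrt(2 * math.pi)  # normal density at C
    var = math.erf(C / math.sqrt(2)) - 2 * C * phi
    return math.sqrt(var)

def truncation_mse(C: float) -> float:
    """E[(X - X_hat(C))^2] = 2 (1 - sigma(C)) for X ~ N(0,1)."""
    return 2 * (1 - sigma_C(C))
```

The same dominated-convergence argument works for any centered unit-variance entry with a finite second moment; the normal case just makes $\sigma(C)$ available in closed form.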
grandes-ecoles 2022 Q23
Given a real $t > 0$, we set, following the notations of part $\mathbf{C}$,
$$m_t := S_{1,1}(t) \quad \text{and} \quad \sigma_t := \sqrt{S_{2,1}(t)}.$$
Given reals $t > 0$ and $\theta$, we set
$$h(t, \theta) = e^{-i m_t \theta} \frac{P\left(e^{-t} e^{i\theta}\right)}{P\left(e^{-t}\right)}.$$
Let $\theta \in \mathbf{R}$ and $t \in \mathbf{R}_+^{*}$. We consider, for all $k \in \mathbf{N}^{*}$, a random variable $Z_k$ following the distribution $\mathcal{G}\left(1 - e^{-kt}\right)$, and we set $Y_k = k\left(Z_k - \mathrm{E}\left(Z_k\right)\right)$. Prove that
$$h(t, \theta) = \lim_{n \rightarrow +\infty} \prod_{k=1}^{n} \Phi_{Y_k}(\theta).$$
Deduce, using in particular question 21, the inequality
$$\left| h(t, \theta) - e^{-\frac{\sigma_t^2 \theta^2}{2}} \right| \leq K^{3/4} |\theta|^3 S_{3,3/4}(t) + K \theta^4 S_{4,1}(t).$$
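A numerical sanity check of the limit identity is possible once conventions for the part-C notation are fixed (all of the following are assumptions, since the excerpt does not define them): $\mathcal{G}(p)$ supported on $\{0, 1, 2, \ldots\}$ with $\mathbb{P}(Z = j) = p(1-p)^j$, $P(x) = \prod_{k \geqslant 1} (1 - x^k)^{-1}$ the partition generating function, and $m_t = S_{1,1}(t) = \sum_{k \geqslant 1} k e^{-kt}/(1 - e^{-kt})$. Under these conventions the truncated product of the $\Phi_{Y_k}$ matches the truncated $h(t,\theta)$:

```python
import cmath
import math

def phi_Y(k: int, t: float, theta: float) -> complex:
    """Characteristic function of Y_k = k (Z_k - E[Z_k]) where, under the
    assumed convention, Z_k ~ G(1 - e^{-kt}) on {0, 1, 2, ...}:
    Phi_Z(u) = p / (1 - (1-p) e^{iu}) and E[Z_k] = (1-p)/p."""
    q = math.exp(-k * t)  # q = 1 - p
    p = 1 - q
    mean = q / p
    return cmath.exp(-1j * k * theta * mean) * p / (1 - q * cmath.exp(1j * k * theta))

def h_truncated(t: float, theta: float, N: int) -> complex:
    """e^{-i m_t theta} P(e^{-t} e^{i theta}) / P(e^{-t}), with the partition
    product P and m_t = sum_k k e^{-kt}/(1 - e^{-kt}) both truncated at N."""
    x = math.exp(-t)
    m = sum(k * x**k / (1 - x**k) for k in range(1, N + 1))
    ratio = 1 + 0j
    for k in range(1, N + 1):
        ratio *= (1 - x**k) / (1 - (x * cmath.exp(1j * theta))**k)
    return cmath.exp(-1j * m * theta) * ratio

t, theta, N = 1.0, 0.7, 200
lhs = 1 + 0j
for k in range(1, N + 1):
    lhs *= phi_Y(k, t, theta)
```

At $t = 1$ the tail factors beyond $N = 200$ are negligible, so the two truncations agree to floating-point accuracy, which is consistent with the stated limit.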
grandes-ecoles 2023 Q21
We consider the matrix $H_t$, the stationary probability $\pi$, and $\lambda$ the smallest nonzero eigenvalue of $u : X \mapsto (I_N - K)X$. We have established that $\|H_t E_i - \pi[i] U\| \leq e^{-\lambda t} \sqrt{\pi[i]}$ and that $$H_t[i,j] - \pi[j] = \sum_{k=1}^{N} \left(H_{t/2}[i,k] - \pi[k]\right)\left(H_{t/2}[k,j] - \pi[j]\right)$$ Deduce that for all $(i,j) \in \llbracket 1;N \rrbracket^2$ and all $t \in \mathbf{R}_+$, $$\left|H_t[i,j] - \pi[j]\right| \leq e^{-\lambda t} \sqrt{\frac{\pi[j]}{\pi[i]}}$$ Determine $\lim_{t \rightarrow +\infty} H_t[i,j]$.
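The bound and the limit can be checked numerically on a small reversible chain, assuming the convention $H_t = e^{-t(I_N - K)}$ (not stated in the excerpt). The $3$-state chain below is an illustrative choice satisfying detailed balance with $\pi = (1/4, 1/2, 1/4)$; here $I - K$ has eigenvalues $\{0, 1/2, 1\}$, so $\lambda = 1/2$:

```python
import numpy as np

# Reversible 3-state chain (illustrative): pi[i] K[i,j] = pi[j] K[j,i].
K = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
pi = np.array([0.25, 0.50, 0.25])
lam = 0.5  # smallest nonzero eigenvalue of I - K

def H(t: float) -> np.ndarray:
    """H_t = exp(-t (I - K)) via eigendecomposition (assumed convention)."""
    w, V = np.linalg.eig(-(np.eye(3) - K))
    return (V @ np.diag(np.exp(t * w)) @ np.linalg.inv(V)).real

def bound_holds(t: float) -> bool:
    """Check |H_t[i,j] - pi[j]| <= e^{-lam t} sqrt(pi[j]/pi[i]) entrywise."""
    Ht = H(t)
    for i in range(3):
        for j in range(3):
            if abs(Ht[i, j] - pi[j]) > np.exp(-lam * t) * np.sqrt(pi[j] / pi[i]) + 1e-12:
                return False
    return True
```

As $t \to +\infty$ every row of $H_t$ converges to $\pi$, i.e. $H_t[i,j] \to \pi[j]$, which the exponential bound makes quantitative.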
grandes-ecoles 2024 Q12
Let $n$ be a nonzero natural number. For any permutation $\sigma \in \mathfrak{S}_{n}$, we recall that there exists a decomposition $\sigma = c_{1} c_{2} \cdots c_{\omega(\sigma)}$, unique up to the order of the factors, where $\omega(\sigma) \in \mathbb{N}^{*}$ and $c_{1}, \ldots, c_{\omega(\sigma)}$ are cycles with pairwise disjoint supports, of respective lengths $\ell_{1} \leqslant \ell_{2} \leqslant \cdots \leqslant \ell_{\omega(\sigma)}$ satisfying $\ell_{1} + \ell_{2} + \cdots + \ell_{\omega(\sigma)} = n$. We consider, on the probability space $(\mathfrak{S}_{n}, \mathscr{P}(\mathfrak{S}_{n}))$ equipped with the uniform probability, the random variable $X_{n}$ defined by $X_{n}(\sigma) = \omega(\sigma)$.
Prove that $\mathbb{E}\left[X_{n}\right] \underset{n \rightarrow +\infty}{=} \ln(n) + \gamma + O\left(\frac{1}{n}\right)$.
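The classical route is $\mathbb{E}[X_n] = H_n = \sum_{k=1}^{n} 1/k$ (for instance via the decomposition of $X_n$ as a sum of independent Bernoulli$(1/k)$ variables), after which $H_n = \ln(n) + \gamma + O(1/n)$. A sketch checking both facts, by brute force over $\mathfrak{S}_6$ and numerically for large $n$:

```python
import itertools
import math

def cycle_count(perm: tuple) -> int:
    """omega(sigma): number of cycles of a permutation of {0, ..., n-1}
    given in one-line notation."""
    seen, cycles = [False] * len(perm), 0
    for i in range(len(perm)):
        if not seen[i]:
            cycles += 1
            j = i
            while not seen[j]:
                seen[j], j = True, perm[j]
    return cycles

def harmonic(n: int) -> float:
    return sum(1.0 / k for k in range(1, n + 1))

# Exact check for small n: the mean number of cycles is H_n.
n = 6
mean_cycles = sum(cycle_count(p) for p in itertools.permutations(range(n))) / math.factorial(n)

# Asymptotic check: H_n - ln(n) - gamma = O(1/n) (in fact ~ 1/(2n)).
gamma = 0.5772156649015329
err = harmonic(10**5) - math.log(10**5) - gamma
```

The brute-force average over all $720$ permutations of $\mathfrak{S}_6$ equals $H_6 = 2.45$ exactly, and at $n = 10^5$ the remainder $H_n - \ln(n) - \gamma$ is of order $1/(2n)$.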
grandes-ecoles 2024 Q14a
Let $n$ be a nonzero natural number. For any permutation $\sigma \in \mathfrak{S}_{n}$, we recall that there exists a decomposition $\sigma = c_{1} c_{2} \cdots c_{\omega(\sigma)}$, unique up to the order of the factors, where $\omega(\sigma) \in \mathbb{N}^{*}$ and $c_{1}, \ldots, c_{\omega(\sigma)}$ are cycles with pairwise disjoint supports, of respective lengths $\ell_{1} \leqslant \ell_{2} \leqslant \cdots \leqslant \ell_{\omega(\sigma)}$ satisfying $\ell_{1} + \ell_{2} + \cdots + \ell_{\omega(\sigma)} = n$.
Show that $$\frac{1}{n!} \sum_{\sigma \in \mathfrak{S}_{n}} \omega(\sigma)^{2} \underset{n \rightarrow +\infty}{=} \ln(n)^{2} + (2\gamma+1)\ln(n) + c + O\left(\frac{\ln(n)}{n}\right)$$ for a real number $c$ to be specified.
grandes-ecoles 2024 Q14b
Let $n$ be a nonzero natural number. For any permutation $\sigma \in \mathfrak{S}_{n}$, we recall that there exists a decomposition $\sigma = c_{1} c_{2} \cdots c_{\omega(\sigma)}$, unique up to the order of the factors, where $\omega(\sigma) \in \mathbb{N}^{*}$ and $c_{1}, \ldots, c_{\omega(\sigma)}$ are cycles with pairwise disjoint supports, of respective lengths $\ell_{1} \leqslant \ell_{2} \leqslant \cdots \leqslant \ell_{\omega(\sigma)}$ satisfying $\ell_{1} + \ell_{2} + \cdots + \ell_{\omega(\sigma)} = n$.
Show that $$\frac{1}{n!} \sum_{\sigma \in \mathfrak{S}_{n}} (\omega(\sigma) - \ln(n))^{2} \underset{n \rightarrow +\infty}{=} \ln(n) + c + O\left(\frac{\ln(n)}{n}\right)$$
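With the same decomposition of $X_n$ into independent Bernoulli$(1/k)$ variables one gets the exact identities $\mathbb{E}[X_n^2] = H_n^2 + H_n - H_n^{(2)}$ and $\mathbb{E}[(X_n - \ln n)^2] = (H_n - \ln n)^2 + H_n - H_n^{(2)}$, where $H_n^{(2)} = \sum_{k \leqslant n} 1/k^2$; expanding $H_n = \ln(n) + \gamma + O(1/n)$ and $H_n^{(2)} = \pi^2/6 + O(1/n)$ suggests $c = \gamma^2 + \gamma - \pi^2/6$ in both questions 14a and 14b. A brute-force sketch of the exact identities over $\mathfrak{S}_6$:

```python
import itertools
import math

def cycle_count(perm: tuple) -> int:
    """omega(sigma) for a permutation of {0, ..., n-1} in one-line notation."""
    seen, cycles = [False] * len(perm), 0
    for i in range(len(perm)):
        if not seen[i]:
            cycles += 1
            j = i
            while not seen[j]:
                seen[j], j = True, perm[j]
    return cycles

def H(n: int, r: int = 1) -> float:
    """Generalized harmonic number sum_{k<=n} 1/k^r."""
    return sum(1.0 / k**r for k in range(1, n + 1))

n = 6
perms = list(itertools.permutations(range(n)))
second_moment = sum(cycle_count(p) ** 2 for p in perms) / math.factorial(n)
centered = sum((cycle_count(p) - math.log(n)) ** 2 for p in perms) / math.factorial(n)

# Exact identities from the independent Bernoulli(1/k) decomposition:
formula_second = H(n) ** 2 + H(n) - H(n, 2)
formula_centered = (H(n) - math.log(n)) ** 2 + H(n) - H(n, 2)
```

The two brute-force averages match the closed forms to floating-point precision, which is what makes the asymptotic expansions of 14a and 14b reduce to expansions of $H_n$ and $H_n^{(2)}$.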