Monotonicity and Convergence of Sequences Defined via Expectations

Questions asking for a proof that a sequence defined through expectations (e.g., $E(|T_n|)$) is monotone, bounded, or convergent, typically by exploiting properties of the underlying random variables.

grandes-ecoles 2016 QII.A.5
We consider a sequence $\left(X_{n}\right)_{n \in \mathbb{N}^{*}}$ of mutually independent random variables, taking values in $\{1, -1\}$ and such that, for all $k \in \mathbb{N}^{*}$, $$P\left(X_{k} = 1\right) = P\left(X_{k} = -1\right) = \frac{1}{2}$$ For all $n \in \mathbb{N}^{*}$, we set $S_{n} = X_{1} + \cdots + X_{n}$, and $u_{n} = \int_{0}^{\infty} \frac{1 - (\cos t)^{n}}{t^{2}} \mathrm{~d}t$.
Deduce from the previous question that, for all $n \in \mathbb{N}$, $u_{2n+1} = u_{2n+2}$.
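Since $E\left(\mathrm{e}^{\mathrm{i}tS_n}\right) = (\cos t)^n$, the previous question presumably establishes the standard identity $u_n = \frac{\pi}{2}\,E(|S_n|)$ (an assumption here), which reduces the claim to $E(|S_{2n+1}|) = E(|S_{2n+2}|)$. That reduction can be checked exactly with a short computation, as a sanity check:

```python
from math import comb

def abs_mean(n):
    """Exact E(|S_n|) for the simple symmetric walk: with k heads, S_n = 2k - n."""
    return sum(comb(n, k) * abs(2 * k - n) for k in range(n + 1)) / 2 ** n

# Under the assumed identity u_n = (pi/2) * E(|S_n|), checking
# u_{2n+1} = u_{2n+2} amounts to checking E(|S_{2n+1}|) = E(|S_{2n+2}|).
for n in range(6):
    assert abs(abs_mean(2 * n + 1) - abs_mean(2 * n + 2)) < 1e-12
```

For instance $E(|S_3|) = E(|S_4|) = \tfrac{3}{2}$, matching $u_3 = u_4$.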
grandes-ecoles 2016 QIII.A.1
We consider a sequence $\left(X_{n}\right)_{n \in \mathbb{N}^{*}}$ of mutually independent random variables, taking values in $\{1, -1\}$ and such that, for all $k \in \mathbb{N}^{*}$, $P\left(X_{k} = 1\right) = P\left(X_{k} = -1\right) = \frac{1}{2}$. We also consider a sequence $\left(a_{n}\right)_{n \in \mathbb{N}^{*}}$ of non-negative real numbers. For all $n \in \mathbb{N}^{*}$, we set $T_{n} = \sum_{k=1}^{n} a_{k} X_{k}$.
Show that the sequence $\left(E\left(\left|T_{n}\right|\right)\right)_{n \in \mathbb{N}^{*}}$ is increasing.
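For small $n$ the expectation $E(|T_n|)$ can be computed exactly by averaging over all $2^n$ sign patterns, which lets one observe the claimed monotonicity numerically (the coefficients below are an arbitrary illustrative choice):

```python
from itertools import product

def e_abs_T(a):
    """Exact E(|T_n|) with T_n = sum a_k X_k, the X_k uniform on {-1, 1}:
    average |sum eps_k a_k| over all 2^n sign vectors eps."""
    n = len(a)
    return sum(abs(sum(e * c for e, c in zip(eps, a)))
               for eps in product((-1, 1), repeat=n)) / 2 ** n

a = [0.7, 0.3, 1.2, 0.5, 0.9, 0.1]          # arbitrary non-negative coefficients
vals = [e_abs_T(a[:n]) for n in range(1, len(a) + 1)]
assert all(v <= w + 1e-12 for v, w in zip(vals, vals[1:]))   # increasing
```

The monotonicity reflects conditional Jensen: $E(T_{n+1} \mid T_n) = T_n$, so $E(|T_{n+1}|) \geqslant E(|T_n|)$.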
grandes-ecoles 2016 QIII.A.2
We consider a sequence $\left(X_{n}\right)_{n \in \mathbb{N}^{*}}$ of mutually independent random variables, taking values in $\{1, -1\}$ and such that, for all $k \in \mathbb{N}^{*}$, $P\left(X_{k} = 1\right) = P\left(X_{k} = -1\right) = \frac{1}{2}$. We also consider a sequence $\left(a_{n}\right)_{n \in \mathbb{N}^{*}}$ of non-negative real numbers. For all $n \in \mathbb{N}^{*}$, we set $T_{n} = \sum_{k=1}^{n} a_{k} X_{k}$.
Show that if the series $\sum a_{n}^{2}$ is convergent, then the sequence $\left(E\left(\left|T_{n}\right|\right)\right)_{n \in \mathbb{N}^{*}}$ is convergent.
grandes-ecoles 2016 Q4a
Let $K > 0$ and $g : \mathbb{R} \rightarrow \mathbb{R}$ be a positive bounded function with support in $[0, K]$. The sequence of functions $f_n : \mathbb{R} \rightarrow \mathbb{R}$ is defined for $n \geqslant 0$ by $$f_n(x) = \sum_{k=0}^{n} \mathbb{E}\left(g\left(x - S_k\right)\right)$$ Show that for all $x \in \mathbb{R}$, the sequence $\left(f_n(x)\right)_{n \geqslant 0}$ is increasing. We denote by $f(x)$ its limit in $\mathbb{R} \cup \{+\infty\}$.
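The monotonicity only uses $g \geqslant 0$: $f_{n+1}(x) - f_n(x) = \mathbb{E}\left(g\left(x - S_{n+1}\right)\right) \geqslant 0$, whatever the law of the walk. A sketch of a numerical check, hypothetically taking $S_k$ to be the $\pm 1$ coin-flip walk of the earlier parts and $g = \mathbf{1}_{[0,K]}$ (both choices are illustrative assumptions):

```python
from math import comb

def walk_dist(k):
    """Hypothetical choice of walk: S_k = X_1 + ... + X_k with fair +-1 steps;
    returns the exact law {value: probability} (S_0 = 0)."""
    return {2 * j - k: comb(k, j) / 2 ** k for j in range(k + 1)}

K = 1.0
g = lambda t: 1.0 if 0 <= t <= K else 0.0   # positive, bounded, support in [0, K]

def f(n, x):
    """f_n(x) = sum_{k=0}^{n} E(g(x - S_k)), computed exactly."""
    return sum(p * g(x - s) for k in range(n + 1)
               for s, p in walk_dist(k).items())

for x in (-0.5, 0.3, 2.0):
    vals = [f(n, x) for n in range(8)]
    assert all(v <= w + 1e-12 for v, w in zip(vals, vals[1:]))   # increasing in n
```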
grandes-ecoles 2016 Q4d
Let $K > 0$ and $g : \mathbb{R} \rightarrow \mathbb{R}$ be a positive bounded function with support in $[0, K]$. The sequence of functions $f_n : \mathbb{R} \rightarrow \mathbb{R}$ is defined for $n \geqslant 0$ by $$f_n(x) = \sum_{k=0}^{n} \mathbb{E}\left(g\left(x - S_k\right)\right)$$ Conclude that the sequence of functions $f_n$ converges pointwise to a positive bounded function $f$ whose support is included in $\mathbb{R}^+$.
grandes-ecoles 2017 QII.A.3
Let $a$ be a real number. We suppose that $P(X \geqslant a) > 0$. Show that the sequence $\left(\frac{\ln\left(P\left(S_{n} \geqslant na\right)\right)}{n}\right)_{n \geqslant 1}$ is well-defined and admits a nonpositive limit $\gamma_{a}$, satisfying $$\forall n \in \mathbb{N}^{*}, \quad P\left(S_{n} \geqslant na\right) \leqslant \mathrm{e}^{n\gamma_{a}}$$
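The key structural fact behind the limit is superadditivity: by independence, the events $\{S_n \geqslant na\}$ and $\{S_{n+m} - S_n \geqslant ma\}$ together force $\{S_{n+m} \geqslant (n+m)a\}$, so $p_{n+m} \geqslant p_n p_m$ where $p_n = P(S_n \geqslant na)$, and Fekete's lemma applies to $\ln p_n$. A numerical illustration of that inequality, hypothetically taking $X$ to be a fair $\pm 1$ step (the problem's $X$ is general):

```python
from math import comb

def p(n, a):
    """P(S_n >= n*a) for S_n a sum of n iid uniform {-1, 1} steps:
    with k up-steps, S_n = 2k - n."""
    return sum(comb(n, k) for k in range(n + 1) if 2 * k - n >= n * a) / 2 ** n

a = 0.2                                      # illustrative threshold with p(1, a) > 0
for n in range(1, 8):
    for m in range(1, 8):
        # superadditivity: p_{n+m} >= p_n * p_m
        assert p(n + m, a) >= p(n, a) * p(m, a) - 1e-15
```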
grandes-ecoles 2023 Q17
We use the notations of the previous parts. We assume that there exists an eigenvalue $\lambda > 0$ and an associated eigenvector column $h \in \mathscr{M}_{d,1}\left(\mathbb{R}_{+}^{*}\right)$: $$Mh = \lambda h,$$ and that there exist $\nu \in \mathscr{P}$ and $c > 0$ such that for all $i,j \in \{1,\ldots,d\}$, $$M_{i,j} \geqslant c\nu_j.$$
Show that there exist $\pi \in \mathscr{P}$ and $h' \in \mathscr{M}_{d,1}\left(\mathbb{R}_{+}^{*}\right)$ and $C > 0$ and $\gamma \in [0,1[$, such that $\pi M = \lambda \pi$ and for all $n \geqslant 0$, $$\sum_{i=1}^{d} \sum_{j=1}^{d} \left| \lambda^{-n} \left(M^n\right)_{i,j} - h_i' \pi_j \right| \leqslant C\gamma^n.$$
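The conclusion can be seen concretely on a small positive matrix where every ingredient is explicit. Take $M = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$: the Perron eigenvalue is $\lambda = 3$ with $h = (1,1)^{\top}$, the left probability eigenvector is $\pi = (\tfrac12, \tfrac12)$, and with $h' = h / \langle \pi, h\rangle = (1,1)^{\top}$ the limit matrix has entries $h'_i \pi_j = \tfrac12$; the second eigenvalue $1$ gives the geometric rate $\gamma = \tfrac13$. A sketch checking the stated bound for this example:

```python
# M = [[2, 1], [1, 2]]: lam = 3, pi = (1/2, 1/2), h' = (1, 1), gamma = 1/3.
M = [[2.0, 1.0], [1.0, 2.0]]
lam, pi, hp, gamma, C = 3.0, [0.5, 0.5], [1.0, 1.0], 1.0 / 3.0, 2.0

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# pi M = lam pi holds: (1/2, 1/2) M = (3/2, 3/2) = 3 * (1/2, 1/2).
A = [[1.0, 0.0], [0.0, 1.0]]                 # A = (M / lam)^n, starting at n = 0
for n in range(25):
    err = sum(abs(A[i][j] - hp[i] * pi[j]) for i in range(2) for j in range(2))
    assert err <= C * gamma ** n + 1e-12     # sum_{i,j} |lam^-n (M^n)_ij - h'_i pi_j|
    A = [[v / lam for v in row] for row in matmul(A, M)]
```

Here the bound is in fact an equality: $\lambda^{-n} M^n$ has entries $\tfrac12 \pm \tfrac12 (\tfrac13)^n$, so the error is exactly $2 \cdot (\tfrac13)^n$.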
grandes-ecoles 2023 Q21
We suppose in the rest of this part that $\lambda > 1$ and we introduce the random row vector $$W_n = \lambda^{-n}\left(X_n - \|X_n\|_1 \pi\right).$$
(a) Show that the series $\displaystyle\sum_{n \geqslant 1} \left(\sum_{k=0}^{n-1} \lambda^{-k} \gamma^{2n-2k}\right)$ converges.
(b) Let $w \in (\mathbb{R}_{+})^d$ and let $e_0 = (1,\ldots,1)$. Show that $$\left\langle w - \|w\|_1 \pi, \pi \right\rangle = \left\langle w, \pi - \langle \pi, \pi \rangle e_0 \right\rangle$$ and that the vector $\pi - \langle \pi, \pi \rangle e_0$ is orthogonal to $\pi$.
(c) Show that the series $\displaystyle\sum_{n \geqslant 0} \mathbb{E}\left(\|W_n\|_2^2\right)$ is convergent. Deduce that the sequence $\left(\mathbb{E}\left(\|W_n\|_2^2\right)\right)_{n \geqslant 0}$ tends to $0$. (One may for example decompose $X_n$ in a well-chosen orthonormal basis of $\mathbb{R}^d$.)
(d) Show that for all $\varepsilon > 0$, $$\lim_{n \rightarrow \infty} \mathbb{P}\left(\|W_n\|_2 \geqslant \varepsilon\right) = 0.$$
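For part (a), the inner sum $\sum_{k=0}^{n-1} \lambda^{-k} \gamma^{2n-2k}$ is a geometric-type sum in which every ratio involved ($\lambda^{-1}$ and $\gamma^2$) is below $1$, so the general term is $O\!\left(n \max(\lambda^{-1}, \gamma^2)^n\right)$ and the series converges. A numerical sketch with illustrative values $\lambda = 1.5$, $\gamma = 0.8$ (any $\lambda > 1$, $\gamma \in [0,1[$ would do):

```python
lam, gamma = 1.5, 0.8                        # assumed: lam > 1, gamma in [0, 1)

def term(n):
    """Inner sum: sum_{k=0}^{n-1} lam^-k * gamma^(2n - 2k)."""
    return sum(lam ** (-k) * gamma ** (2 * n - 2 * k) for k in range(n))

partial = [0.0]
for n in range(1, 401):
    partial.append(partial[-1] + term(n))

# The partial sums stabilize: the tail beyond n = 200 is negligible,
# consistent with convergence of the series.
assert partial[400] - partial[200] < 1e-10
assert partial[400] >= partial[200]
```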