grandes-ecoles 2023 Q13
Expectation and Variance of Sums of Independent Variables
We assume that for all $i,j \in \{1,\ldots,d\}$, $N_{i,j}$ is a random variable taking values in $\mathbb{N}$ such that $N_{i,j}^2$ has finite expectation. For all $i \in \{1,\ldots,d\}$, we introduce the random vector $L_i = (N_{i,1}, \ldots, N_{i,d})$. We consider a family of independent random variables $(L_i^{n,k})_{1 \leqslant i \leqslant d,\, n \geqslant 1,\, k \geqslant 1}$ where, for all $i$, $n$, $k$, $L_i^{n,k}$ has the same distribution as $L_i$. Let $X_0 = (X_{0,i})_{1 \leqslant i \leqslant d}$ be a random variable taking values in $\mathscr{M}_{1,d}(\mathbb{N})$. We define by recursion, for $n \geqslant 0$: $$X_{n+1} = \sum_{i=1}^{d} \sum_{k=1}^{X_{n,i}} L_i^{n,k}.$$ We introduce $M \in \mathscr{M}_d\left(\mathbb{R}_{+}\right)$ defined by $M_{i,j} = \mathbb{E}(N_{i,j})$, and set $x_{n,j} = \mathbb{E}(X_{n,j})$ and $x_n = (x_{n,1}, \ldots, x_{n,d}) \in \mathscr{M}_{1,d}(\mathbb{R}_{+})$.
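For intuition, the recursion defining $X_{n+1}$ can be simulated numerically. The sketch below is an illustration only: it takes $d = 2$, a deterministic $X_0$, and assumes (hypothetically, since the problem specifies no distribution) that the offspring counts $N_{i,j}$ are Poisson with means $M_{i,j}$.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 2
# Assumed mean matrix: M[i, j] = E(N_{i,j}). Values and the Poisson
# offspring law are illustrative assumptions, not part of the problem.
M = np.array([[0.5, 0.3],
              [0.2, 0.6]])

def step(x, rng):
    """One application of X_{n+1} = sum_i sum_{k=1..X_{n,i}} L_i^{n,k}."""
    out = np.zeros(d, dtype=np.int64)
    for i in range(d):
        # Draw X_{n,i} i.i.d. copies of L_i = (N_{i,1}, ..., N_{i,d}) and sum them
        out += rng.poisson(M[i], size=(x[i], d)).sum(axis=0).astype(np.int64)
    return out

x0 = np.array([3, 5])   # deterministic X_0 for simplicity
x1 = step(x0, rng)      # one sample of X_1
```

Each inner sum draws one independent copy of the row vector $L_i$ per individual of type $i$, matching the double sum in the recursion.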
(a) Show that, for all $y \in \mathscr{M}_{1,d}(\mathbb{N})$ and $1 \leqslant j \leqslant d$, $$\mathbb{E}\left(X_{n+1,j} \mathbf{1}_{X_n = y}\right) = (yM)_j \mathbb{P}(X_n = y).$$ (One may use without proof the fact that the random variables $L_i^{n,k}$ and $\mathbf{1}_{X_n = y}$ are independent.)
(b) Deduce that, for all $n \geqslant 0$, $$x_{n+1} = x_n M.$$
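The identity of part (b) can be sanity-checked by Monte Carlo. The sketch below again assumes, purely for illustration, $d = 2$, Poisson offspring with means $M_{i,j}$, and a deterministic $X_0$; it compares the empirical mean of $X_2$ with the value $x_0 M^2$ predicted by iterating $x_{n+1} = x_n M$.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 2
M = np.array([[0.5, 0.3],   # M[i, j] = E(N_{i,j}); values chosen for illustration
              [0.2, 0.6]])
x0 = np.array([3, 5])       # deterministic X_0 for simplicity

def step(x, rng):
    """One application of the recursion, with (assumed) Poisson offspring."""
    out = np.zeros(d, dtype=np.int64)
    for i in range(d):
        out += rng.poisson(M[i], size=(x[i], d)).sum(axis=0).astype(np.int64)
    return out

trials = 20_000
acc = np.zeros(d)
for _ in range(trials):
    acc += step(step(x0, rng), rng)   # one sample of X_2

empirical = acc / trials
theory = x0 @ M @ M                   # x_2 = x_0 M^2 by part (b)
print(empirical, theory)
```

With these parameters the empirical mean should land close to $x_0 M^2 = (2.03,\ 3.09)$, up to Monte Carlo error.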