Expectation and Variance of Sums of Independent Variables

Questions that ask to compute or prove results about $E(S)$ and/or $V(S)$, where $S$ is a sum of independent random variables, using linearity of expectation and the additivity of variance for independent variables.

grandes-ecoles 2016 QII.A.1
We consider a sequence $\left(X_{n}\right)_{n \in \mathbb{N}^{*}}$ of mutually independent random variables, taking values in $\{1, -1\}$ and such that, for all $k \in \mathbb{N}^{*}$, $$P\left(X_{k} = 1\right) = P\left(X_{k} = -1\right) = \frac{1}{2}$$ For all $n \in \mathbb{N}^{*}$, we set $S_{n} = X_{1} + \cdots + X_{n}$.
Determine the expectation and variance of $S_{n}$.
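Since $E(X_k) = 0$ and $V(X_k) = 1$, linearity of expectation and additivity of variance for independent variables give $E(S_n) = 0$ and $V(S_n) = n$. A minimal sketch checking this by exact enumeration (the function name `moments_of_Sn` is just for illustration):

```python
from itertools import product
from fractions import Fraction

def moments_of_Sn(n):
    """Exact E(S_n) and V(S_n) over all 2^n equally likely sign sequences."""
    p = Fraction(1, 2**n)
    mean = Fraction(0)
    second = Fraction(0)
    for signs in product((1, -1), repeat=n):
        s = sum(signs)
        mean += p * s
        second += p * s * s
    return mean, second - mean**2

for n in (1, 2, 5, 8):
    # Matches E(S_n) = 0 and V(S_n) = n.
    assert moments_of_Sn(n) == (0, n)
```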
grandes-ecoles 2016 Q6a
Let $K > 0$ and $g : \mathbb{R} \rightarrow \mathbb{R}$ be a positive bounded function with support in $[0, K]$. The sequence of functions $f_n : \mathbb{R} \rightarrow \mathbb{R}$ is defined for $n \geqslant 0$ by $$f_n(x) = \sum_{k=0}^{n} \mathbb{E}\left(g\left(x - S_k\right)\right)$$ Show that for all $n \in \mathbb{N}$ and $x \in \mathbb{R}$, $$f_{n+1}(x) = g(x) + \sum_{i=0}^{+\infty} p_i f_n\left(x - x_i\right)$$
grandes-ecoles 2021 Q2
What relation exists between $S_n$ and $Y_n$? Deduce the expectation and variance of $S_n$. Justify that $S_n$ and $n$ have the same parity.
grandes-ecoles 2022 Q20
Let $m_{i,j}$ ($1 \leqslant i, j \leqslant n$) be $n^2$ real random variables that are mutually independent, all following the distribution $\mathcal{R}$. The matrix random variable $M_n = \left(m_{i,j}\right)_{1 \leqslant i, j \leqslant n}$ takes values in $\mathcal{V}_{n,n}$. We set $\tau_n = \operatorname{tr}\left(M_n\right)$.
Calculate the expectation and the variance of the variable $\tau_n$.
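The trace is the sum of the $n$ mutually independent diagonal entries, so $E(\tau_n) = n\,E(m_{1,1})$ and $V(\tau_n) = n\,V(m_{1,1})$. A sketch with an assumed distribution $\mathcal{R}$ (the excerpt does not specify it; the uniform distribution on $\{1,\dots,6\}$ is chosen purely for illustration):

```python
from fractions import Fraction

# Assumed pmf for R (not given in the excerpt): uniform on {1,...,6}.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

mean = sum(p * x for x, p in pmf.items())               # E(m_ij) = 7/2
var = sum(p * x * x for x, p in pmf.items()) - mean**2  # V(m_ij) = 35/12

def trace_moments(n):
    # tau_n is a sum of the n independent diagonal entries, so linearity
    # and additivity of variance under independence give:
    return n * mean, n * var

assert trace_moments(4) == (Fraction(14), Fraction(35, 3))
```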
grandes-ecoles 2022 Q19
Let $(X_n)_{n \in \mathbb{N}}$ be a sequence of mutually independent random variables satisfying $\mathbb{P}(X_n = -1) = \mathbb{P}(X_n = 1) = \frac{1}{2}$ for all $n \in \mathbb{N}$, and let $(a_n)_{n \in \mathbb{N}}$ be a real sequence such that the series $\sum a_n^2$ converges. For all $N \in \mathbb{N}$, denote $S_N = \sum_{n=0}^N X_n a_n$. Let $(\phi(j))_{j \in \mathbb{N}}$ be a strictly increasing sequence of natural integers satisfying $\sum_{n > \phi(j)}^{+\infty} a_n^2 \leqslant \frac{1}{8^j}$ for all $j \in \mathbb{N}$. Express the expectation and variance of $S_{\phi(j+1)} - S_{\phi(j)}$ in terms of the terms of the sequence $(a_n)_{n \in \mathbb{N}}$.
grandes-ecoles 2023 Q7
Express $X_n$ using the $U_i$, $1 \leq i \leq n$. Deduce the expectation $\mathrm{E}\left(X_n\right)$ and the variance $\mathrm{V}\left(X_n\right)$.
grandes-ecoles 2023 Q17
Let $X$ and $Y$ be two independent random variables, taking values in $\mathbf{N}$, defined on the same probability space $(\Omega, \mathcal{A}, P)$. Prove the relation $$p_{X+Y} = p_X * p_Y$$
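The identity $p_{X+Y} = p_X * p_Y$, where $(p_X * p_Y)(n) = \sum_{k=0}^{n} p_X(k)\, p_Y(n-k)$, can be illustrated numerically; the two distributions below are arbitrary choices for the sketch:

```python
from fractions import Fraction

def convolve(pX, pY):
    """Discrete convolution: (pX * pY)(n) = sum_k pX(k) pY(n - k)."""
    out = {}
    for a, pa in pX.items():
        for b, pb in pY.items():
            out[a + b] = out.get(a + b, Fraction(0)) + pa * pb
    return out

# Arbitrary example distributions on N (an assumption for illustration).
pX = {0: Fraction(1, 2), 1: Fraction(1, 2)}
pY = {0: Fraction(1, 3), 1: Fraction(1, 3), 2: Fraction(1, 3)}
pS = convolve(pX, pY)

assert sum(pS.values()) == 1    # p_{X+Y} is a probability distribution
assert pS[0] == Fraction(1, 6)  # p_X(0) p_Y(0)
assert pS[1] == Fraction(1, 3)  # p_X(0) p_Y(1) + p_X(1) p_Y(0)
```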
grandes-ecoles 2023 Q13
We assume that for all $i,j \in \{1,\ldots,d\}$, $N_{i,j}$ is a random variable taking values in $\mathbb{N}$ and such that $N_{i,j}^2$ has finite expectation. For all $i \in \{1,\ldots,d\}$, we introduce the random variable $L_i = (N_{i,1}, \ldots, N_{i,d})$. We consider a family of independent random variables $(L_i^{n,k})_{n \geqslant 1, k \geqslant 1}$ where for all $i$, $n$, $k$, $L_i^{n,k}$ has the same distribution as $L_i$. Let $X_0 = (X_{0,i})_{1 \leqslant i \leqslant d}$ be a random variable taking values in $\mathscr{M}_{1,d}(\mathbb{N})$. We define by recursion for $n \geqslant 0$: $$X_{n+1} = \sum_{i=1}^{d} \sum_{k=1}^{X_{n,i}} L_i^{n,k}.$$ We introduce $M \in \mathscr{M}_d\left(\mathbb{R}_{+}\right)$ defined by $M_{i,j} = \mathbb{E}(N_{i,j})$ and $x_{n,j} = \mathbb{E}(X_{n,j})$.
(a) Show that, for all $y \in \mathscr{M}_{1,d}(\mathbb{N})$ and $1 \leqslant j \leqslant d$, $$\mathbb{E}\left(X_{n+1,j} \mathbf{1}_{X_n = y}\right) = (yM)_j \mathbb{P}(X_n = y).$$ (One may use without proof the fact that the random variables $L_i^{n,k}$ and $\mathbf{1}_{X_n = y}$ are independent.)
(b) Deduce that, for all $n \geqslant 0$, $$x_{n+1} = x_n M.$$
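The mean recursion $x_{n+1} = x_n M$ can be checked exactly in a tiny case. Everything in the sketch below is an assumed toy instance: $d = 2$, $X_0 = (1, 1)$ deterministic, and $N_{i,j} \sim \text{Bernoulli}(p_{i,j})$ with mutually independent components, so that $M_{i,j} = p_{i,j}$:

```python
from fractions import Fraction
from itertools import product

# Assumed Bernoulli parameters p[i][j] = E(N_ij) = M[i][j].
p = [[Fraction(1, 2), Fraction(1, 3)],
     [Fraction(1, 4), Fraction(1, 5)]]

# Enumerate the 2^4 outcomes of (N_11, N_12, N_21, N_22) exactly.
e_x1 = [Fraction(0), Fraction(0)]
for n11, n12, n21, n22 in product((0, 1), repeat=4):
    prob = Fraction(1)
    for val, q in ((n11, p[0][0]), (n12, p[0][1]),
                   (n21, p[1][0]), (n22, p[1][1])):
        prob *= q if val == 1 else 1 - q
    # With X_0 = (1, 1), one step gives X_1 = L_1 + L_2.
    e_x1[0] += prob * (n11 + n21)
    e_x1[1] += prob * (n12 + n22)

# Compare E(X_1) with x_0 M for x_0 = (1, 1).
x0M = [p[0][0] + p[1][0], p[0][1] + p[1][1]]
assert e_x1 == x0M
```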
grandes-ecoles 2023 Q14
Let $\mathscr{I}$ be a finite set and $(Y_i)_{i \in \mathscr{I}}$ be a family of random variables that are pairwise independent, take real values and whose squares have finite expectation. Show that $$\mathbb{E}\left(\left(\sum_{i \in \mathscr{I}} Y_i\right)^2\right) = \left(\sum_{i \in \mathscr{I}} \mathbb{E}(Y_i)\right)^2 + \sum_{i \in \mathscr{I}} \operatorname{Var}(Y_i).$$
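Since the $Y_i$ are pairwise independent, the cross-covariances vanish when the square is expanded, which is what produces $\left(\sum \mathbb{E}(Y_i)\right)^2 + \sum \operatorname{Var}(Y_i)$. An exact toy check (the uniform supports below are an arbitrary choice, and mutual independence is used, which implies pairwise independence):

```python
from itertools import product
from fractions import Fraction

# Mutually (hence pairwise) independent toy variables, each uniform on a
# small support -- an arbitrary choice for illustration.
supports = [(0, 1), (1, 2, 3), (-1, 1)]

# Left-hand side: E((Y_1 + Y_2 + Y_3)^2) by exact enumeration.
n_outcomes = 1
for s in supports:
    n_outcomes *= len(s)
lhs = sum(Fraction(sum(combo) ** 2, n_outcomes)
          for combo in product(*supports))

# Right-hand side: (sum of means)^2 + sum of variances.
means = [Fraction(sum(s), len(s)) for s in supports]
second = [Fraction(sum(x * x for x in s), len(s)) for s in supports]
variances = [m2 - m * m for m2, m in zip(second, means)]
rhs = sum(means) ** 2 + sum(variances)

assert lhs == rhs == Fraction(49, 6)
```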
grandes-ecoles 2023 Q15
We use the setup of the third part. For $u \in \mathscr{M}_{d,1}(\mathbb{R})$, we denote $T(u) = (T_i(u))_{1 \leqslant i \leqslant d} \in \mathbb{R}^d$ the vector defined by $$T_i(u) = \operatorname{Var}\left(\langle L_i, u \rangle\right) \quad \text{for } i \in \{1,\ldots,d\}.$$
(a) Show that for all $u \in \mathscr{M}_{d,1}(\mathbb{R})$, $y \in \mathscr{M}_{1,d}(\mathbb{N})$ and $n \geqslant 0$, $$\mathbb{E}\left(\langle X_{n+1}, u \rangle^2 \mathbf{1}_{X_n = y}\right) = \mathbb{P}(X_n = y)\left(\langle y, Mu \rangle^2 + \langle y, T(u) \rangle\right).$$ (One may use without proof the fact that, for all $n \geqslant 0$, the random variables $\sum_{j=1}^{d} u_j L_{i,j}^{n,k} \mathbf{1}_{X_n = y}$ are pairwise independent when $k$ and $i$ vary.)
(b) Show that for all $u \in \mathscr{M}_{d,1}(\mathbb{R})$ and $n \geqslant 0$, $$\mathbb{E}\left(\langle X_{n+1}, u \rangle^2\right) = \mathbb{E}\left(\langle X_n, Mu \rangle^2\right) + \langle x_0 M^n, T(u) \rangle.$$
grandes-ecoles 2023 Q16
We use the setup of the third part. For $u \in \mathscr{M}_{d,1}(\mathbb{R})$, we denote $T(u) = (T_i(u))_{1 \leqslant i \leqslant d} \in \mathbb{R}^d$ the vector defined by $$T_i(u) = \operatorname{Var}\left(\langle L_i, u \rangle\right) \quad \text{for } i \in \{1,\ldots,d\}.$$
Show that for all $n \geqslant 0$, $$\mathbb{E}\left(\langle X_n, u \rangle^2\right) = \mathbb{E}\left(\langle X_0, M^n u \rangle^2\right) + \sum_{k=0}^{n-1} \langle x_0 M^k, T\left(M^{n-1-k} u\right) \rangle$$ (with the convention that the sum indexed by $k$ is zero if $n = 0$).
grandes-ecoles 2024 Q19
Let $(X_k)_{k \in \mathbf{N}^*}$ be independent random variables with the same distribution given by:
$$P\left(X_1 = -1\right) = P\left(X_1 = 1\right) = \frac{1}{2}$$
For all $n \in \mathbf{N}^*$, we denote $S_n = \sum_{k=1}^{n} X_k$.
Determine, for all $n \in \mathbf{N}^*$, $E\left(S_n\right)$ and $V\left(S_n\right)$.
grandes-ecoles 2025 Q10
Let $p \in [1, +\infty[$. Let $(X_i)_{i \in \llbracket 1, n \rrbracket}$ be a sequence of independent random variables all following a Rademacher distribution. Let $(c_1, \ldots, c_n) \in \mathbf{R}^n$. Show that $$\mathbf{E}\left(\left(\sum_{i=1}^{n} c_i X_i\right)^2\right) = \sum_{i=1}^{n} c_i^2.$$
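Expanding the square, the cross terms $c_i c_j \mathbf{E}(X_i X_j)$ vanish for $i \neq j$ by independence, leaving $\sum_i c_i^2 \mathbf{E}(X_i^2) = \sum_i c_i^2$ since $X_i^2 = 1$. A sketch verifying this by exact enumeration (the coefficients are an arbitrary choice):

```python
from itertools import product
from fractions import Fraction

def second_moment(c):
    """Exact E((sum c_i X_i)^2) over all 2^n equally likely Rademacher signs."""
    n = len(c)
    p = Fraction(1, 2**n)
    return sum(p * sum(ci * s for ci, s in zip(c, signs)) ** 2
               for signs in product((1, -1), repeat=n))

c = [Fraction(1), Fraction(-2), Fraction(3)]
assert second_moment(c) == sum(ci**2 for ci in c)  # 1 + 4 + 9 = 14
```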