Deriving moments or distribution from a PGF

These questions ask you to extract the expectation, the variance, or the full probability distribution of a random variable from its generating function or from that function's derivatives.

grandes-ecoles 2015 QII.B
Let $(X_n)_{n\in\mathbb{N}^*}$ be a sequence of random variables, mutually independent, with the same distribution taking values in $\mathbb{N}$, and let $T$ be a random variable taking values in $\mathbb{N}$ independent of the previous ones. We denote by $G_X$ the generating function common to all the $X_n$. For $n\in\mathbb{N}$ and $\omega\in\Omega$, we set $S_n(\omega)=\sum_{k=1}^n X_k(\omega)$ and $S_0(\omega)=0$, then $S(\omega)=S_{T(\omega)}(\omega)$. We have shown $G_S=G_T\circ G_X$.
Deduce that, if $T$ and the $X_n$ have finite expectation, then so does $S$ and $E(S)=E(T)E(X_1)$.
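The identity $E(S)=E(T)E(X_1)$ follows from differentiating $G_S=G_T\circ G_X$ at $1$. It can be checked exactly by composing two finite-support PGFs as polynomials; the distributions below are hypothetical choices for illustration, not taken from the exam:

```python
from fractions import Fraction as F

def poly_mul(a, b):
    """Product of two polynomials given as coefficient lists (index = degree)."""
    out = [F(0)] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def compose(outer, inner):
    """outer(inner(t)) by Horner's scheme on the coefficient list of outer."""
    res = [F(0)]
    for c in reversed(outer):
        res = poly_mul(res, inner)
        res[0] += c
    return res

def mean_from_pgf(g):
    """E = G'(1) = sum over k of k * coeff_k, exact for a polynomial PGF."""
    return sum(k * c for k, c in enumerate(g))

# Hypothetical example: X uniform on {0,1,2}; T with P(T=0)=P(T=2)=1/2.
G_X = [F(1, 3), F(1, 3), F(1, 3)]
G_T = [F(1, 2), F(0), F(1, 2)]
G_S = compose(G_T, G_X)           # G_S = G_T o G_X

E_X = mean_from_pgf(G_X)
E_T = mean_from_pgf(G_T)
E_S = mean_from_pgf(G_S)
assert E_S == E_T * E_X           # Wald-type identity, exact
```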
grandes-ecoles 2015 QIII.A.2
Let $\mu$ be a probability distribution characterized by the sequence $(p_k)_{k\in\mathbb{N}}$ with $\sum_{k=0}^{+\infty}p_k=1$ and $p_0+p_1<1$. We define the Galton-Watson process with $Y_0=1$ and the recurrence above. We assume that any random variable with distribution $\mu$ has expectation $m$ and admits a finite variance.
Express, for $n\in\mathbb{N}$, the expectation of $Y_n$ in terms of $m$ and $n$.
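The chain rule applied to $G_{Y_n}=G_{Y_{n-1}}\circ G_\mu$ at $1$ gives the expected answer $E(Y_n)=m^n$. A sketch checking this exactly, for a hypothetical finite-support offspring law (not the exam's):

```python
from fractions import Fraction as F

def compose(outer, inner):
    """outer(inner(t)) for polynomial coefficient lists (index = degree)."""
    res = [F(0)]
    for c in reversed(outer):
        new = [F(0)] * (len(res) + len(inner) - 1)
        for i, a in enumerate(res):
            for j, b in enumerate(inner):
                new[i + j] += a * b
        res = new
        res[0] += c
    return res

# Hypothetical offspring law: p0 = 1/4, p1 = 1/4, p2 = 1/2, hence m = 5/4.
G = [F(1, 4), F(1, 4), F(1, 2)]
m = sum(k * c for k, c in enumerate(G))

Gn = [F(0), F(1)]               # G_{Y_0}(t) = t, since Y_0 = 1
for n in range(1, 5):
    Gn = compose(Gn, G)          # G_{Y_n} = G_{Y_{n-1}} o G
    mean = sum(k * c for k, c in enumerate(Gn))
    assert mean == m ** n        # E(Y_n) = m^n, exactly
```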
grandes-ecoles 2015 QIII.C.2
We consider the Galton-Watson process. We assume $m\leqslant 1$. We denote, for $n\in\mathbb{N}^*$, $Z_n=1+\sum_{i=1}^n Y_i$ and $Z=1+\sum_{n=1}^{+\infty}Y_n$.
a) Show that, for all $k\in\mathbb{N}$, $(P(Z_n\leqslant k))_{n\in\mathbb{N}^*}$ is a convergent sequence. Determine its limit.
b) Deduce that, for all $k\in\mathbb{N}$, $(P(Z_n=k))_{n\in\mathbb{N}^*}$ converges to $P(Z=k)$.
c) Show that, for all $s\in\left[0,1\right[$, all $n\in\mathbb{N}^*$ and all $K\in\mathbb{N}$, $$\left|G_{Z_n}(s)-G_Z(s)\right|\leqslant\sum_{k=0}^K\left|P(Z_n=k)-P(Z=k)\right|+\frac{s^K}{1-s}$$
d) Deduce that the sequence of functions $(G_{Z_n})$ converges pointwise to $G_Z$ on $[0,1]$.
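One possible route for c), a sketch rather than an official solution, splits the series and uses the crude bound $\left|P(Z_n=k)-P(Z=k)\right|\leqslant 1$ on the tail:

```latex
\left|G_{Z_n}(s)-G_Z(s)\right|
\leqslant \sum_{k=0}^{K}\left|P(Z_n=k)-P(Z=k)\right|s^k
        + \sum_{k=K+1}^{+\infty}s^k
\leqslant \sum_{k=0}^{K}\left|P(Z_n=k)-P(Z=k)\right|
        + \frac{s^{K+1}}{1-s}
\leqslant \sum_{k=0}^{K}\left|P(Z_n=k)-P(Z=k)\right|
        + \frac{s^{K}}{1-s}.
```

For d): given $s\in\left[0,1\right[$ and $\varepsilon>0$, choose $K$ with $s^K/(1-s)<\varepsilon/2$, then b) makes the finite sum smaller than $\varepsilon/2$ for $n$ large. At $s=1$ both sides equal $1$, granting that $Z$ is almost surely finite when $m\leqslant 1$.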
grandes-ecoles 2015 QIV.G
We assume that, for all $k\in\mathbb{N}$, $p_k=\frac{1}{2^{k+1}}$.
Express, for $s\in\left[0,1\right[$, $G_Z(s)$ in terms of $s$.
Deduce the distribution of $Z$.
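Here $G_\mu(s)=\sum_{k\geqslant 0}s^k/2^{k+1}=1/(2-s)$, so $m=1$, and the total-progeny functional equation $G_Z(s)=s\,G_\mu(G_Z(s))$ suggests $G_Z(s)=1-\sqrt{1-s}$, whose coefficients are Catalan-type probabilities. A sketch checking both claims with exact arithmetic (this is a derivation aid under those assumptions, not the exam's official solution):

```python
from fractions import Fraction as F
from math import comb, factorial, sqrt

def sqrt_coeff(k):
    """Coefficient of s^k in sqrt(1-s): binom(1/2, k) * (-1)^k, exactly."""
    num = F(1)
    for j in range(k):
        num *= F(1, 2) - j
    return num / factorial(k) * (-1) ** k

# Candidate G_Z(s) = 1 - sqrt(1-s): coefficient of s^k for k >= 1
# should match binom(2k-2, k-1) / (k * 2^(2k-1)).
for k in range(1, 12):
    c_k = -sqrt_coeff(k)
    closed = F(comb(2 * k - 2, k - 1), k * 2 ** (2 * k - 1))
    assert c_k == closed
    assert c_k > 0                 # genuine probabilities

# Numerical check of the functional equation G_Z(s) = s / (2 - G_Z(s)).
for s in (0.25, 0.5, 0.9):
    gz = 1 - sqrt(1 - s)
    assert abs(gz - s / (2 - gz)) < 1e-12
```

Since $G_Z(1)=1-\sqrt{0}=1$, these coefficients sum to $1$, consistent with $Z$ being almost surely finite when $m\leqslant 1$.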
grandes-ecoles 2017 QIVB
We are given a probability space $(\Omega,\mathcal{A},\mathbb{P})$. We define $H_k(X)=X(X-1)\cdots(X-k+1)$ for $k\in\mathbb{N}^*$ and $H_0(X)=1$. Let $m$ be a strictly positive integer. We say that a random variable $Y:\Omega\rightarrow\mathbb{N}$ admits a finite moment of order $m$ if the series $\sum n^m P(Y=n)$ converges, in which case $\mathbb{E}\left(Y^m\right)=\sum_{n=0}^{\infty}n^m\,\mathbb{P}(Y=n)$.
Let $Y : \Omega \rightarrow \mathbb { N }$ be a random variable admitting a finite moment of all orders.
IV.B.1) Show that the generating function $G_Y$ is of class $C^{\infty}$ on $[-1,1]$.
IV.B.2) Express $G_Y^{(k)}(1)$ using the polynomials $H_k(X)$ and the random variable $Y$.
IV.B.3) Does the generating function $G_Y$ necessarily have a radius of convergence strictly greater than $1$? One may use the power series $\sum e^{-\sqrt{n}}x^n$.
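For IV.B.2 the expected identity is $G_Y^{(k)}(1)=\mathbb{E}\left(H_k(Y)\right)$, the $k$-th factorial moment. A quick exact check for a hypothetical finite-support $Y$ (a polynomial PGF, so differentiation is term by term):

```python
from fractions import Fraction as F

# Hypothetical law of Y (not from the exam): P(Y = n) = coeffs[n].
coeffs = [F(1, 6), F(1, 3), F(1, 4), F(1, 4)]
assert sum(coeffs) == 1

def deriv(g):
    """Formal derivative of a polynomial coefficient list (index = degree)."""
    return [n * c for n, c in enumerate(g)][1:]

def G_deriv_at_1(g, k):
    """G_Y^{(k)}(1): differentiate the PGF k times, then evaluate at 1."""
    for _ in range(k):
        g = deriv(g)
    return sum(g)

def falling_factorial_moment(g, k):
    """E(H_k(Y)) with H_k(X) = X(X-1)...(X-k+1)."""
    total = F(0)
    for n, c in enumerate(g):
        h = 1
        for j in range(k):
            h *= n - j
        total += h * c
    return total

for k in range(5):
    assert G_deriv_at_1(coeffs, k) == falling_factorial_moment(coeffs, k)
```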
grandes-ecoles 2019 Q10
We have an infinite supply of black and white balls. An urn initially contains one black ball and one white ball. We perform a sequence of draws according to the following protocol:
  • we randomly draw a ball from the urn;
  • we replace the drawn ball in the urn;
  • we add to the urn a ball of the same color as the drawn ball.
We define the sequence $(X_{n})_{n \in \mathbb{N}}$ of random variables by $X_{0} = 1$ and, for all integers $n \geqslant 1$, $X_{n}$ gives the number of white balls in the urn after $n$ draws. We denote by $g_{n}$ the generating function of the random variable $X_{n}$.
Prove that, for all integers $n \in \mathbb{N}^{*}$ and all real $t$, $$g_{n}(t) = \frac{1}{n+1} \sum_{k=1}^{n+1} t^{k}$$
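The stated $g_n$ says precisely that $X_n$ is uniform on $\{1,\ldots,n+1\}$. This can be verified exactly by propagating the urn's state probabilities; the `Fraction` arithmetic and the number of steps are implementation choices:

```python
from fractions import Fraction as F

# dist maps (number of white balls) -> probability, after n draws.
# After n draws the urn holds n + 2 balls; initially 1 white and 1 black.
dist = {1: F(1)}
for n in range(8):
    total = n + 2                  # balls in the urn before draw n + 1
    new = {}
    for w, p in dist.items():
        # white drawn (prob w/total): a white ball is added
        new[w + 1] = new.get(w + 1, F(0)) + p * F(w, total)
        # black drawn: the white count is unchanged
        new[w] = new.get(w, F(0)) + p * F(total - w, total)
    dist = new
    # X_{n+1} should be uniform on {1, ..., n+2}
    assert dist == {k: F(1, n + 2) for k in range(1, n + 3)}
```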
grandes-ecoles 2020 Q30
We fix a real random variable $X:\Omega\rightarrow\mathbb{R}$, whose image $X(\Omega)$ is a countable set, with $X(\Omega)=\left\{x_n,\ n\in\mathbb{N}\right\}$ and $a_n=\mathbb{P}\left(X=x_n\right)$. Let $k\in\mathbb{N}^*$. We assume that $X$ admits a moment of order $k$. Deduce an expression of $\mathbb{E}\left(X^k\right)$ in terms of $\phi_X^{(k)}(0)$.
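The expected computation, a sketch assuming term-by-term differentiation of $\phi_X(t)=\sum_n a_n e^{itx_n}$ is licensed (the moment hypothesis dominates the differentiated series):

```latex
\phi_X^{(k)}(t)=\sum_{n=0}^{+\infty}a_n\,(ix_n)^k e^{itx_n}
\quad\Longrightarrow\quad
\phi_X^{(k)}(0)=i^k\sum_{n=0}^{+\infty}a_n x_n^k=i^k\,\mathbb{E}\left(X^k\right),
\qquad\text{hence}\qquad
\mathbb{E}\left(X^k\right)=\frac{\phi_X^{(k)}(0)}{i^k}.
```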
grandes-ecoles 2021 Q10
We set, for all $t \in I$, $$f(t)=\sum_{n=0}^{+\infty}C_n t^n \quad\text{and}\quad g(t)=2tf(t).$$ The generating function of $T$ is $G_T(t)=\sum_{n=0}^{\infty}\mathbb{P}(T=n)\,t^n$, defined for $t\in[-1,1]$. Using the previous questions, express $G_T$ using $g$ and $\mathbb{P}(T=0)$.
grandes-ecoles 2022 Q16
In this question, we are given a real random variable $X$ following a geometric distribution with parameter $p\in\left]0,1\right[$ arbitrary. We set $q=1-p$.
Show that there exists a sequence $\left(P_k\right)_{k\in\mathbb{N}}$ of polynomials with coefficients in $\mathbb{C}$, independent of $p$, such that
$$\forall\theta\in\mathbb{R},\ \forall k\in\mathbb{N},\quad \Phi_X^{(k)}(\theta)=p\,i^k e^{i\theta}\,\frac{P_k\left(qe^{i\theta}\right)}{\left(1-qe^{i\theta}\right)^{k+1}} \quad\text{and}\quad P_k(0)=1$$
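Writing $u=qe^{i\theta}$ (so $du/d\theta=iu$) and differentiating the claimed form once more yields, after clearing denominators, the recurrence $P_{k+1}(u)=(1+ku)P_k(u)+u(1-u)P_k'(u)$ with $P_0=1$. This is a derivation sketch, not necessarily the exam's intended route; the snippet below checks the first few polynomials and cross-checks the $k=1$ case against a finite difference of $\Phi_X(\theta)=pe^{i\theta}/(1-qe^{i\theta})$, with hypothetical parameter values:

```python
import cmath

def P(k):
    """P_k via the recurrence P_{m+1} = (1+mu)P_m + u(1-u)P_m', P_0 = 1
    (recurrence obtained by differentiating the claimed closed form)."""
    poly = [1]                                        # low degree first
    for m in range(k):
        d = [n * c for n, c in enumerate(poly)][1:]   # P_m'
        nxt = [0] * (len(poly) + 2)
        for n, c in enumerate(poly):
            nxt[n] += c                               # P_m
            nxt[n + 1] += m * c                       # + m*u*P_m
        for n, c in enumerate(d):
            nxt[n + 1] += c                           # + u*P_m'
            nxt[n + 2] -= c                           # - u^2*P_m'
        while len(nxt) > 1 and nxt[-1] == 0:
            nxt.pop()                                 # trim trailing zeros
        poly = nxt
    return poly

assert P(1) == [1]
assert P(2) == [1, 1]                                 # 1 + u
assert P(3) == [1, 4, 1]                              # 1 + 4u + u^2
assert all(P(k)[0] == 1 for k in range(6))            # P_k(0) = 1

# Numerical cross-check for k = 1, with hypothetical p = 0.3, theta = 0.8
# (geometric distribution supported on {1, 2, ...}).
p, q, theta, h = 0.3, 0.7, 0.8, 1e-6
Phi = lambda t: p * cmath.exp(1j * t) / (1 - q * cmath.exp(1j * t))
lhs = (Phi(theta + h) - Phi(theta - h)) / (2 * h)
rhs = p * 1j * cmath.exp(1j * theta) / (1 - q * cmath.exp(1j * theta)) ** 2
assert abs(lhs - rhs) < 1e-6
```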