MGF of sums of independent random variables (product property)

These questions ask you to show, or to use, the fact that the MGF (or characteristic function) of a sum of independent random variables equals the product of the individual MGFs.

grandes-ecoles 2017 QII.C.4
In subsection II.C, we consider $\varepsilon$ a strictly positive real, $X$ a discrete real random variable taking values in $\left\{x_{p}, p \in \mathbb{N}\right\}$, and $\left(X_{k}\right)_{k \in \mathbb{N}^{*}}$ a sequence of random variables that are mutually independent and have the same distribution as $X$. For every strictly positive integer $n$, we define the random variable $S_{n}$ by $S_{n}=\sum_{k=1}^{n} X_{k}$. We assume that the random variable $X$ admits an exponential moment of order $\alpha$ where $\alpha$ is a strictly positive real. The function $\Psi: t \mapsto \mathbb{E}\left(\mathrm{e}^{tX}\right)$ is defined on $[-\alpha, \alpha]$.
Show that for every real $t$ belonging to the segment $[-\alpha, \alpha]$ and every $n$ belonging to $\mathbb{N}^{*}$, the real random variable $\mathrm{e}^{t S_{n}}$ has expectation equal to $(\Psi(t))^{n}$.
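The identity $\mathbb{E}\big(\mathrm{e}^{tS_n}\big) = (\Psi(t))^n$ can be checked numerically in a concrete case. The sketch below takes $X \sim \mathrm{Bernoulli}(p)$ as an illustrative assumption (not part of the problem statement), so that $\Psi(t) = (1-p) + p\,\mathrm{e}^{t}$ and $S_n$ follows a binomial distribution whose expectation $\mathbb{E}\big(\mathrm{e}^{tS_n}\big)$ can be computed exactly:

```python
import math

# Illustrative assumption: X ~ Bernoulli(p), so S_n ~ Binomial(n, p)
# and Psi(t) = E[e^{tX}] = (1 - p) + p * e^t.
p, n, t = 0.3, 5, 0.7

psi = (1 - p) + p * math.exp(t)

# Exact E[e^{t S_n}] by summing e^{tk} against the binomial law of S_n.
lhs = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * math.exp(t * k)
          for k in range(n + 1))

# The binomial theorem collapses the sum to ((1-p) + p e^t)^n = Psi(t)^n.
assert abs(lhs - psi**n) < 1e-10
```

The assertion passes because the sum is exactly the binomial expansion of $\big((1-p) + p\,\mathrm{e}^{t}\big)^n$; the general proof proceeds the same way, factoring $\mathbb{E}\big(\mathrm{e}^{tS_n}\big) = \prod_{k=1}^{n}\mathbb{E}\big(\mathrm{e}^{tX_k}\big)$ by independence.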
grandes-ecoles 2018 Q6
We assume that $X$ and $Y$ are two independent discrete real-valued random variables with strictly positive values admitting moments of all orders. We denote by $R_X$ (respectively $R_Y$) the radius of convergence (assumed strictly positive) associated with the moment generating function $M_X$ (respectively $M_Y$).
Show that the random variable $X + Y$ admits moments of all orders and that $$\forall |t| < \min\left(R_X, R_Y\right), \quad M_{X+Y}(t) = M_X(t) \times M_Y(t).$$
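The product formula can likewise be verified on a small finite-support example. The distributions of $X$ and $Y$ below are illustrative assumptions chosen with strictly positive values; the distribution of $X + Y$ is built by convolution, which is exactly where independence enters:

```python
import math
from itertools import product

# Illustrative assumptions: independent X and Y with strictly positive
# values and finite support (so all MGFs are entire functions).
X = {1: 0.5, 2: 0.5}      # P(X = 1) = P(X = 2) = 0.5
Y = {1: 0.25, 3: 0.75}    # P(Y = 1) = 0.25, P(Y = 3) = 0.75

def mgf(dist, t):
    """M(t) = E[e^{tZ}] for a finite distribution {value: probability}."""
    return sum(p * math.exp(t * v) for v, p in dist.items())

# Law of X + Y via independence: P(X+Y = s) = sum over x+y = s of P(X=x)P(Y=y).
XY = {}
for (x, px), (y, py) in product(X.items(), Y.items()):
    XY[x + y] = XY.get(x + y, 0.0) + px * py

t = 0.4
assert abs(mgf(XY, t) - mgf(X, t) * mgf(Y, t)) < 1e-12
```

The assertion holds because $\mathbb{E}\big(\mathrm{e}^{t(X+Y)}\big) = \mathbb{E}\big(\mathrm{e}^{tX}\mathrm{e}^{tY}\big) = \mathbb{E}\big(\mathrm{e}^{tX}\big)\mathbb{E}\big(\mathrm{e}^{tY}\big)$, the factorization being justified by independence; the radii $R_X$, $R_Y$ only matter when the supports are infinite, which is where the condition $|t| < \min(R_X, R_Y)$ comes in.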