Extract moments from the MGF or characteristic function

These problems ask the solver to recover moments (mean, variance, or higher-order moments) of a random variable by differentiating a known MGF or characteristic function at zero, or by reading off the coefficients of its series expansion.
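As a quick illustration of the technique (not part of any problem below), the derivatives of an MGF at $t=0$ can be approximated by finite differences: $M'(0) = E[X]$ and $M''(0) = E[X^2]$. The sketch below uses the MGF of an Exponential distribution with rate $\lambda = 2$, $M(t) = \lambda/(\lambda - t)$, chosen purely as an example.

```python
import math

def mgf_exponential(t, lam=2.0):
    """MGF of an Exponential(lam) distribution, valid for t < lam."""
    return lam / (lam - t)

def moments_from_mgf(mgf, h=1e-5):
    """Estimate mean and variance from an MGF via central differences at t = 0."""
    m1 = (mgf(h) - mgf(-h)) / (2 * h)               # M'(0)  = E[X]
    m2 = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / h**2   # M''(0) = E[X^2]
    return m1, m2 - m1**2                           # (mean, variance)

mean, var = moments_from_mgf(mgf_exponential)
print(mean, var)   # both approximately 1/lam = 0.5 and 1/lam^2 = 0.25
```

For Exponential($\lambda$), the exact values are $E[X] = 1/\lambda$ and $\operatorname{Var}(X) = 1/\lambda^2$, which the finite-difference estimates reproduce to several decimal places.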

grandes-ecoles 2017 QII.C.3
In subsection II.C, we consider $\varepsilon$ a strictly positive real, $X$ a discrete real random variable taking values in $\left\{x_{p}, p \in \mathbb{N}\right\}$, and $\left(X_{k}\right)_{k \in \mathbb{N}^{*}}$ a sequence of random variables that are mutually independent and have the same distribution as $X$. For every strictly positive integer $n$, we define the random variable $S_{n}$ by $S_{n}=\sum_{k=1}^{n} X_{k}$. We assume that the random variable $X$ admits an exponential moment of order $\alpha$ where $\alpha$ is a strictly positive real. The function $\Psi: t \mapsto \mathbb{E}\left(\mathrm{e}^{tX}\right)$ is defined on $[-\alpha, \alpha]$.
We consider the function $f_{\varepsilon}$ defined by $$f_{\varepsilon}:\left\{\begin{array}{l}[-\alpha, \alpha] \rightarrow \mathbb{R}^{+} \\ t \mapsto \mathrm{e}^{-(m+\varepsilon) t} \Psi(t)\end{array}\right.$$ where $m = \mathbb{E}(X)$.
a) Give the values of $f_{\varepsilon}(0)$ and $f_{\varepsilon}^{\prime}(0)$.
b) Deduce that there exists a real $t_{0}$ belonging to the interval $]0, \alpha[$ satisfying $0 < f_{\varepsilon}\left(t_{0}\right) < 1$.
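A hedged numerical sanity check (not the exam solution): take $X$ Rademacher, i.e. $P(X=1)=P(X=-1)=\tfrac12$, so $m=\mathbb{E}(X)=0$ and $\Psi(t)=\cosh t$. Then $f_\varepsilon(0)=\Psi(0)=1$ and $f_\varepsilon'(0)=\Psi'(0)-(m+\varepsilon)=-\varepsilon<0$, so $f_\varepsilon$ dips strictly below 1 just to the right of 0 while staying positive.

```python
import math

# Illustrative choice (assumption, not from the problem): X Rademacher,
# so m = E(X) = 0 and Psi(t) = cosh(t).
eps = 0.1
m = 0.0

def f_eps(t):
    return math.exp(-(m + eps) * t) * math.cosh(t)

# a) f_eps(0) = Psi(0) = 1, and f_eps'(0) = Psi'(0) - (m + eps) = -eps
h = 1e-6
deriv0 = (f_eps(h) - f_eps(-h)) / (2 * h)
print(f_eps(0.0), deriv0)     # 1.0 and approximately -0.1

# b) since f_eps(0) = 1 and f_eps'(0) = -eps < 0, any small t0 > 0 works
t0 = 0.05
print(0.0 < f_eps(t0) < 1.0)  # True
```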
grandes-ecoles 2024 Q24
Let $\left( X _ { k } \right) _ { k \in \mathbf{N} ^ { * } }$ be independent random variables with the same distribution given by:
$$P \left( X _ { 1 } = - 1 \right) = P \left( X _ { 1 } = 1 \right) = \frac { 1 } { 2 }$$
For all $n \in \mathbf { N } ^ { * }$, we denote $S _ { n } = \sum _ { k = 1 } ^ { n } X _ { k }$.
Deduce that for all $n \in \mathbf { N } ^ { * }$:
$$E \left( \left| S _ { n } \right| \right) = \frac { 2 } { \pi } \int _ { 0 } ^ { + \infty } \frac { 1 - ( \cos ( t ) ) ^ { n } } { t ^ { 2 } } \mathrm {~d} t$$
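A numerical sanity check of this identity (an illustration, not the requested derivation): $E(|S_n|)$ can be computed exactly by enumerating the binomial distribution of $S_n = n - 2k$, and the integral can be approximated by a trapezoidal rule truncated at a large upper limit.

```python
import math

def exact_abs_mean(n):
    """E|S_n| by direct enumeration: S_n = n - 2k with probability C(n,k)/2^n."""
    return sum(math.comb(n, k) * abs(n - 2 * k) for k in range(n + 1)) / 2**n

def integral_formula(n, upper=1000.0, steps=500_000):
    """(2/pi) * integral over (0, upper] of (1 - cos(t)^n)/t^2, trapezoidal rule."""
    dt = upper / steps
    total, prev = 0.0, n / 2.0   # the integrand tends to n/2 as t -> 0
    for i in range(1, steps + 1):
        t = i * dt
        cur = (1.0 - math.cos(t) ** n) / (t * t)
        total += 0.5 * (prev + cur) * dt
        prev = cur
    return 2.0 / math.pi * total

results = {n: (exact_abs_mean(n), integral_formula(n)) for n in (1, 2, 3)}
for n, (exact, integral) in results.items():
    print(n, exact, integral)   # pairs agree to ~1e-3 (truncation at t = 1000)
```

The residual discrepancy comes from cutting the integral off at $t = 1000$; the tail is $O(1/\text{upper})$.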
grandes-ecoles 2024 Q25
Let $\left( X _ { k } \right) _ { k \in \mathbf{N} ^ { * } }$ be independent random variables with the same distribution given by:
$$P \left( X _ { 1 } = - 1 \right) = P \left( X _ { 1 } = 1 \right) = \frac { 1 } { 2 }$$
For all $n \in \mathbf { N } ^ { * }$, we denote $S _ { n } = \sum _ { k = 1 } ^ { n } X _ { k }$.
Conclude that:
$$\forall n \in \mathbf { N } ^ { * } , \quad E \left( \left| S _ { 2 n } \right| \right) = E \left( \left| S _ { 2 n - 1 } \right| \right) = \frac { ( 2 n - 1 ) ! } { 2 ^ { 2 n - 2 } ( ( n - 1 ) ! ) ^ { 2 } }$$
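The claimed closed form can be checked directly for small $n$ by enumerating the binomial distribution of $S_m = m - 2k$ (a verification sketch, not the requested proof):

```python
import math

def exact_abs_mean(m):
    """E|S_m| by enumerating the binomial distribution of S_m = m - 2k."""
    return sum(math.comb(m, k) * abs(m - 2 * k) for k in range(m + 1)) / 2**m

def closed_form(n):
    """Claimed common value of E|S_{2n}| and E|S_{2n-1}|."""
    return math.factorial(2 * n - 1) / (2**(2 * n - 2) * math.factorial(n - 1) ** 2)

for n in range(1, 8):
    assert math.isclose(exact_abs_mean(2 * n), closed_form(n))
    assert math.isclose(exact_abs_mean(2 * n - 1), closed_form(n))
print("closed form verified for n = 1..7")
```

Note in particular that $E(|S_{2n}|) = E(|S_{2n-1}|)$: adding one more $\pm 1$ step to an odd-length walk does not change the mean absolute position.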
todai-math 2015 Q3
Answer the following questions.
(1) Let $X$ be a real-valued random variable. Let $t$ be a real-valued variable. We define $\phi _ { X } ( t )$ for $X$ as
$$\phi _ { X } ( t ) = E _ { X } \left[ e ^ { t X } \right]$$
where $E _ { X } [ \cdot ]$ denotes the expectation taken with respect to $X$. Supposing that $\phi _ { X } ( t )$ is finite in a neighborhood of $t = 0$, give the mean and variance of $X$ using $\phi _ { X } ^ { \prime } ( 0 )$ and $\phi _ { X } ^ { \prime \prime } ( 0 )$. Here $\phi _ { X } ^ { \prime } ( t )$ and $\phi _ { X } ^ { \prime \prime } ( t )$ denote the first- and second-order derivatives of $\phi _ { X } ( t )$ with respect to $t$, respectively.
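The identities being asked for are $E[X] = \phi_X'(0)$ and $\operatorname{Var}(X) = \phi_X''(0) - \phi_X'(0)^2$. A quick check with a concrete distribution (an illustration, not the expected written answer), using the Poisson($\lambda$) MGF $\phi(t) = \exp(\lambda(e^t - 1))$ and its hand-computed derivatives $\phi'(t) = \lambda e^t \phi(t)$, $\phi''(t) = (\lambda e^t + (\lambda e^t)^2)\,\phi(t)$:

```python
import math

lam = 3.0
phi = lambda t: math.exp(lam * (math.exp(t) - 1.0))
dphi = lambda t: lam * math.exp(t) * phi(t)                              # phi'(t)
d2phi = lambda t: (lam * math.exp(t) + (lam * math.exp(t)) ** 2) * phi(t)  # phi''(t)

mean = dphi(0.0)                  # phi'(0)             = lam
variance = d2phi(0.0) - mean**2   # phi''(0) - phi'(0)^2 = lam
print(mean, variance)  # 3.0 3.0
```

For Poisson($\lambda$), mean and variance both equal $\lambda$, which the two derivative formulas reproduce exactly.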
(2) For a sequence of mutually independent random variables: $X _ { 1 } , X _ { 2 } , \ldots , X _ { N }$, suppose that each $X _ { j }$ is identically generated according to the 1-dimensional normal distribution with mean $\mu$ and variance $\sigma ^ { 2 }$. That is, the probability density function for each $X _ { j }$ is given by
$$p \left( X _ { j } = x \right) = \frac { 1 } { \sqrt { 2 \pi } \sigma } \exp \left( - \frac { ( x - \mu ) ^ { 2 } } { 2 \sigma ^ { 2 } } \right) .$$
Then calculate $\phi _ { X _ { j } } ( t )$. Also find the probability distribution according to which
$$Y = X _ { 1 } + X _ { 2 } + \cdots + X _ { N }$$
is generated. You can use the fact that for random variables $Z$ and $W$ with $\phi _ { Z } ( t ) = \phi _ { W } ( t )$, the probability distribution of $Z$ is the same as that of $W$.
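A numerical sketch of the expected answer (an assumption-labeled check, not the derivation): the normal MGF is $\phi_{X_j}(t) = \exp(\mu t + \sigma^2 t^2/2)$, and since $\phi_Y(t) = \phi_X(t)^N$ matches the MGF of a Normal($N\mu$, $N\sigma^2$), the cited uniqueness fact gives $Y \sim \mathcal{N}(N\mu, N\sigma^2)$. The parameter values below are arbitrary illustrations.

```python
import math
import random

mu, sigma, N = 0.5, 1.3, 4
t = 0.7
random.seed(42)

# Monte Carlo estimate of E[exp(t X)] for X ~ Normal(mu, sigma^2)
samples = [random.gauss(mu, sigma) for _ in range(200_000)]
mc = sum(math.exp(t * x) for x in samples) / len(samples)
closed = math.exp(mu * t + sigma**2 * t**2 / 2)   # claimed phi_X(t)
print(mc, closed)   # should agree to a few decimal places

# phi_Y(t) = phi_X(t)^N coincides with the MGF of Normal(N*mu, N*sigma^2)
phi_Y = closed**N
phi_normal = math.exp(N * mu * t + N * sigma**2 * t**2 / 2)
print(phi_Y, phi_normal)
```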
(3) Suppose that the number of terms $N$ in Question (2) is itself a random variable taking values in $\{ 1, 2, \ldots \}$, generated according to the geometric distribution with parameter $\theta$ $( 0 < \theta < 1 )$, whose probability function is given by
$$P ( N = n ) = ( 1 - \theta ) ^ { n - 1 } \theta .$$
For $Y = X _ { 1 } + X _ { 2 } + \cdots + X _ { N }$, define $\phi _ { Y } ( t )$ by
$$\phi _ { Y } ( t ) = E _ { Y } \left[ e ^ { t Y } \right]$$
Then calculate $\phi _ { Y } ( t )$ and express it using $\phi _ { X _ { j } } ( t )$. Since $\phi _ { X _ { j } } ( t )$ does not depend on $j$, you can write it as $\phi _ { X } ( t )$.
(4) Calculate the mean and variance of $Y$ in Question (3).
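The mean and variance of the compound sum $Y$ follow Wald-style identities: with $E[N] = 1/\theta$ and $\operatorname{Var}(N) = (1-\theta)/\theta^2$, one gets $E[Y] = \mu/\theta$ and $\operatorname{Var}(Y) = \sigma^2/\theta + \mu^2(1-\theta)/\theta^2$. A Monte Carlo sanity check of these values (illustrative parameter choices, not from the problem):

```python
import random
import statistics

mu, sigma, theta = 1.0, 2.0, 0.4
random.seed(1)

def sample_Y():
    # N ~ Geometric(theta) on {1, 2, ...}; Y = sum of N iid Normal(mu, sigma^2)
    n = 1
    while random.random() > theta:
        n += 1
    return sum(random.gauss(mu, sigma) for _ in range(n))

ys = [sample_Y() for _ in range(200_000)]
mean_hat = statistics.fmean(ys)
var_hat = statistics.pvariance(ys)

# Wald-style identities (a hedged check, not the official solution):
#   E[Y]   = mu / theta
#   Var(Y) = sigma^2 / theta + mu^2 (1 - theta) / theta^2
print(mean_hat, mu / theta)
print(var_hat, sigma**2 / theta + mu**2 * (1 - theta) / theta**2)
```

For Question (5), the same MGF supports a Chernoff-type bound $P(Y > \xi) \le \inf_{t} e^{-t\xi}\phi_Y(t)$ over the $t > 0$ for which $\phi_Y(t)$ is finite.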
(5) For given $\xi \left( > E _ { Y } [ Y ] \right)$, give an upper bound on the probability that $Y$ in Question (3) exceeds $\xi$, as a function of $\mu , \sigma , \theta$, and $\xi$ (not all of $\mu , \sigma , \theta$, and $\xi$ have to be used).