todai-math 2015 Q3
Extract moments from the MGF or characteristic function
Answer the following questions.
(1) Let $X$ be a real-valued random variable. Let $t$ be a real-valued variable. We define $\phi _ { X } ( t )$ for $X$ as
$$\phi _ { X } ( t ) = E _ { X } \left[ e ^ { t X } \right]$$
where $E _ { X } [ \cdot ]$ denotes the expectation taken with respect to $X$. Supposing that $\phi _ { X } ( t )$ is finite in a neighborhood of $t = 0$, give the mean and variance of $X$ using $\phi _ { X } ^ { \prime } ( 0 )$ and $\phi _ { X } ^ { \prime \prime } ( 0 )$. Here $\phi _ { X } ^ { \prime } ( t )$ and $\phi _ { X } ^ { \prime \prime } ( t )$ denote the first- and second-order derivatives of $\phi _ { X } ( t )$ with respect to $t$, respectively.
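The relations asked for in (1) are $E[X] = \phi_X'(0)$ and $\mathrm{Var}(X) = \phi_X''(0) - \phi_X'(0)^2$, and they can be sanity-checked numerically. The sketch below (a hypothetical test case, not part of the problem) uses the MGF of an Exponential($\lambda$) variable with the arbitrary choice $\lambda = 2$, for which the mean is $1/\lambda = 0.5$ and the variance is $1/\lambda^2 = 0.25$, and approximates the derivatives at $t = 0$ by central finite differences:

```python
import math

def phi(t, lam=2.0):
    # MGF of an Exponential(lam) variable (hypothetical test case):
    # phi(t) = lam / (lam - t), finite for t < lam
    return lam / (lam - t)

h = 1e-5
# Central differences approximate phi'(0) and phi''(0)
d1 = (phi(h) - phi(-h)) / (2 * h)                # ~ phi'(0) = E[X]
d2 = (phi(h) - 2 * phi(0.0) + phi(-h)) / h**2    # ~ phi''(0) = E[X^2]
mean = d1
var = d2 - d1**2                                 # Var(X) = phi''(0) - phi'(0)^2

print(round(mean, 4), round(var, 4))
```

Any MGF that is finite near $0$ would do here; the exponential is chosen only because its moments are easy to verify by hand.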
(2) For a sequence of mutually independent random variables $X _ { 1 } , X _ { 2 } , \ldots , X _ { N }$, suppose that each $X _ { j }$ is identically distributed according to the one-dimensional normal distribution with mean $\mu$ and variance $\sigma ^ { 2 }$. That is, the probability density function of each $X _ { j }$ is given by
$$p \left( X _ { j } = x \right) = \frac { 1 } { \sqrt { 2 \pi } \sigma } \exp \left( - \frac { ( x - \mu ) ^ { 2 } } { 2 \sigma ^ { 2 } } \right) .$$
Then calculate $\phi _ { X _ { j } } ( t )$. Also find the probability distribution according to which
$$Y = X _ { 1 } + X _ { 2 } + \cdots + X _ { N }$$
is generated. You can use the fact that for random variables $Z$ and $W$ with $\phi _ { Z } ( t ) = \phi _ { W } ( t )$, the probability distribution of $Z$ is the same as that of $W$.
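As a numerical cross-check on (2) (not part of the problem), the sketch below compares the normal MGF computed by trapezoidal integration of $e^{tx}p(x)$ against the closed form $\phi_X(t) = \exp(\mu t + \sigma^2 t^2 / 2)$, with the arbitrary choices $\mu = 1$, $\sigma = 2$:

```python
import math

MU, SIGMA = 1.0, 2.0   # example parameters (assumed for illustration)

def normal_pdf(x):
    # Density of N(MU, SIGMA^2)
    return math.exp(-(x - MU)**2 / (2 * SIGMA**2)) / (math.sqrt(2 * math.pi) * SIGMA)

def mgf_numeric(t, lo=-40.0, hi=40.0, n=80000):
    # E[e^{tX}] by trapezoidal integration of e^{tx} p(x)
    h = (hi - lo) / n
    s = 0.5 * (math.exp(t * lo) * normal_pdf(lo) + math.exp(t * hi) * normal_pdf(hi))
    for i in range(1, n):
        x = lo + i * h
        s += math.exp(t * x) * normal_pdf(x)
    return s * h

def mgf_closed(t):
    # Closed form of the normal MGF: exp(mu t + sigma^2 t^2 / 2)
    return math.exp(MU * t + SIGMA**2 * t**2 / 2)

t = 0.5
print(abs(mgf_numeric(t) - mgf_closed(t)) < 1e-4)
```

Since independence multiplies MGFs, $\phi_Y(t) = \phi_X(t)^N = \exp(N\mu t + N\sigma^2 t^2/2)$, which is the MGF of $\mathcal{N}(N\mu, N\sigma^2)$; by the uniqueness fact quoted above, $Y$ is normal with mean $N\mu$ and variance $N\sigma^2$.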
(3) Suppose that $N$ in Question (2) is itself a random variable taking values in $\{ 1 , 2 , \ldots \}$, generated according to the geometric distribution with parameter $\theta$ $( 0 < \theta < 1 )$, whose probability mass function is given by
$$P ( N = n ) = ( 1 - \theta ) ^ { n - 1 } \theta , \qquad n = 1 , 2 , \ldots .$$
For $Y = X _ { 1 } + X _ { 2 } + \cdots + X _ { N }$, define $\phi _ { Y } ( t )$ by
$$\phi _ { Y } ( t ) = E _ { Y } \left[ e ^ { t Y } \right]$$
Then calculate $\phi _ { Y } ( t )$ and express it using $\phi _ { X _ { j } } ( t )$. Since $\phi _ { X _ { j } } ( t )$ does not depend on $j$, you can write it as $\phi _ { X } ( t )$.
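Conditioning on $N$ reduces (3) to evaluating $E[\phi_X(t)^N]$, a geometric series in $x = \phi_X(t)$ that converges when $(1-\theta)x < 1$. The sketch below (illustrative values $\theta = 0.3$, $x = 0.9$, both assumptions) checks the resulting closed form $\theta x / (1 - (1-\theta)x)$ against a long partial sum of the series:

```python
theta = 0.3
x = 0.9          # stands in for phi_X(t); requires (1 - theta) * x < 1

# Partial sum of E[x^N] = sum_{n>=1} (1 - theta)^(n-1) * theta * x^n
partial = sum((1 - theta)**(n - 1) * theta * x**n for n in range(1, 2000))

# Geometric-series closed form
closed = theta * x / (1 - (1 - theta) * x)

print(abs(partial - closed) < 1e-12)
```

The truncation at $n = 2000$ is harmless here because the terms decay like $(0.63)^n$.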
(4) Calculate the mean and variance of $Y$ in Question (3).
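Using (1) on $\phi_Y$, the answers to (4) can be cross-checked numerically against the law-of-total-moments expressions $E[Y] = \mu/\theta$ and $\mathrm{Var}(Y) = \sigma^2/\theta + (1-\theta)\mu^2/\theta^2$. The sketch below differentiates $\phi_Y(t) = \theta\phi_X(t)/(1 - (1-\theta)\phi_X(t))$ by central differences, with the arbitrary choices $\mu = \sigma = 1$, $\theta = 0.5$ (so the targets are mean $2$ and variance $4$):

```python
import math

MU, SIGMA, THETA = 1.0, 1.0, 0.5   # example parameters (assumed)

def phi_X(t):
    return math.exp(MU * t + SIGMA**2 * t**2 / 2)

def phi_Y(t):
    # From (3): valid while (1 - theta) * phi_X(t) < 1
    return THETA * phi_X(t) / (1 - (1 - THETA) * phi_X(t))

h = 1e-4
d1 = (phi_Y(h) - phi_Y(-h)) / (2 * h)                 # ~ phi_Y'(0)
d2 = (phi_Y(h) - 2 * phi_Y(0.0) + phi_Y(-h)) / h**2   # ~ phi_Y''(0)
mean = d1
var = d2 - d1**2

# Compare with mu/theta and sigma^2/theta + (1 - theta) * mu^2/theta^2
print(round(mean, 3), round(var, 3))
```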
(5) For given $\xi \left( > E _ { Y } [ Y ] \right)$, give an upper bound on the probability that $Y$ in Question (3) exceeds $\xi$, as a function of $\mu , \sigma , \theta$, and $\xi$ (not all of $\mu , \sigma , \theta$, and $\xi$ have to be used).
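The natural tool for (5) is the Chernoff bound: for any $t > 0$ with $(1-\theta)\phi_X(t) < 1$, Markov's inequality applied to $e^{tY}$ gives $P(Y > \xi) \le e^{-t\xi}\phi_Y(t)$, and the bound may then be minimized over $t$. The sketch below (illustrative parameters $\mu = \sigma = 1$, $\theta = 0.5$, $\xi = 6 > E[Y] = 2$, all assumptions) evaluates the minimized bound by grid search:

```python
import math

MU, SIGMA, THETA = 1.0, 1.0, 0.5   # example parameters (assumed)
XI = 6.0                            # threshold, chosen > E[Y] = mu/theta = 2

def phi_X(t):
    return math.exp(MU * t + SIGMA**2 * t**2 / 2)

def phi_Y(t):
    return THETA * phi_X(t) / (1 - (1 - THETA) * phi_X(t))

# Chernoff: P(Y > xi) <= exp(-t * xi) * phi_Y(t) for any admissible t > 0;
# minimize the right-hand side over a grid of t, keeping only t for which
# the geometric series behind phi_Y converges.
best = min(
    math.exp(-t * XI) * phi_Y(t)
    for t in (k / 10000 for k in range(1, 10000))
    if (1 - THETA) * phi_X(t) < 1
)
print(0.0 < best < 1.0)
```

Because $\xi > E[Y]$, the exponent $-t\xi + \log\phi_Y(t)$ has negative slope at $t = 0$, so the minimized bound is strictly below $1$, i.e. nontrivial.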