Continuous Probability Distributions and Random Variables

grandes-ecoles 2024 Q20 Entropy, Information, or Log-Sobolev Functional Analysis
Establish the following inequality
$$\operatorname{Ent}_{\varphi}(f) \leq \frac{1}{2} \int_{-\infty}^{+\infty} \frac{f^{\prime 2}(x)}{f(x)} \varphi(x) \mathrm{d}x$$
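A quick quadrature sanity check of this log-Sobolev inequality. It assumes the usual conventions (not restated in this extract): $\varphi$ is the standard Gaussian density and $\operatorname{Ent}_\varphi(f) = \int f \ln f \,\varphi\,\mathrm{d}x - \left(\int f \varphi\,\mathrm{d}x\right)\ln\left(\int f \varphi\,\mathrm{d}x\right)$. The test function $f(x) = 1 + x^2$ is a hypothetical choice, not from the exam.

```python
import math

def gaussian_expect(g, T=10.0, steps=200000):
    """Midpoint-rule approximation of the integral of g(x)*phi(x) over [-T, T],
    phi the standard normal density; the tail beyond T is negligible here."""
    h = 2 * T / steps
    total = 0.0
    for i in range(steps):
        x = -T + (i + 0.5) * h
        total += g(x) * math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
    return total * h

def f(x):
    return 1 + x * x          # a smooth positive test function (not an extremal one)

def fp(x):
    return 2 * x              # f'

mean_f = gaussian_expect(f)   # E_phi[f] = 2 exactly
ent = gaussian_expect(lambda x: f(x) * math.log(f(x))) - mean_f * math.log(mean_f)
rhs = 0.5 * gaussian_expect(lambda x: fp(x) ** 2 / f(x))
assert abs(mean_f - 2.0) < 1e-6
assert ent <= rhs             # Ent_phi(f) <= (1/2) * integral of f'^2/f * phi dx
```

Exponential functions $f(x) = \mathrm{e}^{\lambda x}$ saturate the inequality, which is why a non-extremal $f$ is used for a strict numerical gap.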
grandes-ecoles 2025 QI.3 Expectation and Moment Inequality Proof
We assume that the random variables $X_1, \ldots, X_N$ are pairwise uncorrelated, that is: $$\forall 1 \leq m, n \leq N, \quad n \neq m \Rightarrow \mathbb{E}[X_n X_m] = 0.$$ Prove that $$\mathbb{E}\left[|S_N|^2\right] \leq N.$$ Deduce that, for all $t > 0$, $$\mathbb{P}\left(|S_N| > t\sqrt{N}\right) \leq \frac{1}{t^2}$$ where $S_N := X_1 + \cdots + X_N$.
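A Monte Carlo sanity check of the Chebyshev-type tail bound. Independent Rademacher variables are one concrete instance of pairwise-uncorrelated $X_n$ with $\mathbb{E}[X_n^2] \leq 1$ (an assumption presumably made in the earlier parts of the exam):

```python
import random

def tail_freq(N=100, t=2.0, trials=5000, seed=0):
    """Empirical P(|S_N| > t*sqrt(N)) where S_N is a sum of N independent
    Rademacher (+-1) variables, a simple case of the hypotheses above."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(rng.choice((-1, 1)) for _ in range(N))
        if abs(s) > t * N ** 0.5:
            hits += 1
    return hits / trials

p_hat = tail_freq()
assert p_hat <= 1 / 2.0 ** 2   # claimed bound: P(|S_N| > t*sqrt(N)) <= 1/t^2
```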
grandes-ecoles 2025 Q2 Expectation and Moment Inequality Proof
Let $p, q \in \left]1, +\infty\right[$ be such that $\frac{1}{p} + \frac{1}{q} = 1$. Let $X, Y \in L^0(\Omega)$, which we assume are both non-negative. Deduce the following inequality (Hölder's inequality): $$\mathbf{E}(XY) \leq \left(\mathbf{E}\left(X^p\right)\right)^{1/p} \left(\mathbf{E}\left(Y^q\right)\right)^{1/q}.$$ You may begin by treating the case where $\mathbf{E}\left(X^p\right) = \mathbf{E}\left(Y^q\right) = 1$.
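A numeric illustration of the inequality on a small finite probability space (the point values and weights below are arbitrary examples):

```python
def holder_check(x, y, probs, p):
    """Return E[XY] and E[X^p]^(1/p) * E[Y^q]^(1/q) for random variables
    taking values x, y on a finite space with the given probabilities."""
    q = p / (p - 1)                      # conjugate exponent: 1/p + 1/q = 1
    exy = sum(w * a * b for w, a, b in zip(probs, x, y))
    ex_p = sum(w * a ** p for w, a in zip(probs, x)) ** (1 / p)
    ey_q = sum(w * b ** q for w, b in zip(probs, y)) ** (1 / q)
    return exy, ex_p * ey_q

lhs, rhs = holder_check([1.0, 2.0, 0.5], [3.0, 1.0, 2.0], [0.2, 0.5, 0.3], p=3.0)
assert lhs <= rhs + 1e-12   # Holder: E(XY) <= E(X^p)^(1/p) * E(Y^q)^(1/q)
```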
grandes-ecoles 2025 Q6 Probability Inequality and Tail Bound Proof
Let $(X_i)_{i \in \llbracket 1, n \rrbracket}$ be a sequence of independent random variables all following a Rademacher distribution. Deduce that for all $t \geq 0$, all $x \geq 0$ and all $(c_1, \ldots, c_n) \in \mathbf{R}^n$, $$\mathbf{P}\left(\exp\left(x \left|\sum_{i=1}^n c_i X_i\right|\right) > \mathrm{e}^{tx}\right) \leq 2 \mathrm{e}^{-tx} \exp\left(\frac{x^2 \sum_{i=1}^n c_i^2}{2}\right).$$ You may use Markov's inequality.
grandes-ecoles 2025 Q7 Probability Inequality and Tail Bound Proof
Let $(X_i)_{i \in \llbracket 1, n \rrbracket}$ be a sequence of independent random variables all following a Rademacher distribution. Show that for all $t \geq 0$ and all non-zero $(c_1, \ldots, c_n) \in \mathbf{R}^n$, $$\mathbf{P}\left(\left|\sum_{i=1}^n c_i X_i\right| > t\right) \leq 2 \exp\left(-\frac{t^2}{2 \sum_{i=1}^n c_i^2}\right).$$
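A Monte Carlo check that the empirical tail of a weighted Rademacher sum sits below this sub-Gaussian bound (the weights $c_i = 0.5$ and threshold $t = 5$ are arbitrary test values):

```python
import math
import random

def tail_vs_bound(c, t, trials=20000, seed=1):
    """Empirical P(|sum c_i X_i| > t) for independent Rademacher X_i,
    alongside the bound 2*exp(-t^2 / (2 * sum c_i^2))."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(ci * rng.choice((-1, 1)) for ci in c)
        if abs(s) > t:
            hits += 1
    bound = 2 * math.exp(-t ** 2 / (2 * sum(ci ** 2 for ci in c)))
    return hits / trials, bound

freq, bound = tail_vs_bound([0.5] * 20, t=5.0)
assert freq <= bound   # the empirical tail respects the Hoeffding-type bound
```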
grandes-ecoles 2025 Q8 Change of Variable and Integral Evaluation
Let $p \in \left[1, +\infty\right[$. Let $X$ be a positive and finite real random variable. Let $F_X$ be the function defined for all $t \geq 0$ by $$F_X(t) = \mathbf{P}(X > t).$$ Show that the integral $\int_0^{+\infty} t^{p-1} F_X(t)\,\mathrm{d}t$ converges, then that $$\mathbf{E}\left(X^p\right) = p \int_0^{+\infty} t^{p-1} F_X(t)\,\mathrm{d}t.$$
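A numeric check of this layer-cake identity for one concrete case: $X$ exponential with rate $1$, so $F_X(t) = \mathrm{e}^{-t}$ and $\mathbf{E}(X^2) = 2$ (a hypothetical example, not part of the exam):

```python
import math

def layer_cake(p, rate=1.0, T=50.0, steps=200000):
    """Midpoint-rule value of p * integral_0^T t^(p-1) * P(X > t) dt for
    X ~ Exp(rate); the truncated tail beyond T is negligible here."""
    h = T / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * h
        total += t ** (p - 1) * math.exp(-rate * t)
    return p * total * h

rhs = layer_cake(p=2.0)        # should match E(X^2) = 2 / rate^2 = 2
assert abs(rhs - 2.0) < 1e-3
```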
grandes-ecoles 2025 Q9 Expectation and Moment Inequality Proof
Let $p \in \left[1, +\infty\right[$. Let $(X_i)_{i \in \llbracket 1, n \rrbracket}$ be a sequence of independent random variables all following a Rademacher distribution. Let $(c_1, \ldots, c_n) \in \mathbf{R}^n$. Suppose in this question that $\sum_{i=1}^n c_i^2 = 1$. Show that the integral $\int_0^{+\infty} t^3 \mathrm{e}^{-t^2/2}\,\mathrm{d}t$ converges, then that $$\mathbf{E}\left(\left(\sum_{i=1}^n c_i X_i\right)^4\right) \leq 8 \int_0^{+\infty} t^3 \mathrm{e}^{-t^2/2}\,\mathrm{d}t.$$
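An exact check of the fourth-moment bound for one normalized choice of weights, by enumerating all sign patterns (the substitution $u = t^2/2$ gives $\int_0^{+\infty} t^3 \mathrm{e}^{-t^2/2}\,\mathrm{d}t = 2$, so the right-hand side is $16$):

```python
import itertools
import math

def fourth_moment(c):
    """Exact E[(sum c_i X_i)^4] by enumerating all 2^n Rademacher sign patterns."""
    n = len(c)
    total = 0.0
    for signs in itertools.product((-1, 1), repeat=n):
        total += sum(s * ci for s, ci in zip(signs, c)) ** 4
    return total / 2 ** n

c = [1 / math.sqrt(5)] * 5            # normalized so that sum c_i^2 = 1
m4 = fourth_moment(c)
assert m4 <= 8 * 2.0                  # bound from the question: 16
# cross-check against the closed form 3*(sum c_i^2)^2 - 2*sum c_i^4 = 2.6
assert abs(m4 - 2.6) < 1e-12
```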
grandes-ecoles 2025 Q11 Expectation and Moment Inequality Proof
Let $p \in \left[1, +\infty\right[$. Let $(X_i)_{i \in \llbracket 1, n \rrbracket}$ be a sequence of independent random variables all following a Rademacher distribution. Let $(c_1, \ldots, c_n) \in \mathbf{R}^n$. Deduce that there exists a real $\beta_p > 0$ such that $$\mathbf{E}\left(\left|\sum_{i=1}^n c_i X_i\right|^p\right)^{1/p} \leq \beta_p\, \mathbf{E}\left(\left(\sum_{i=1}^n c_i X_i\right)^2\right)^{1/2}.$$
grandes-ecoles 2025 Q13 Expectation and Moment Inequality Proof
Let $p \in \left[1, +\infty\right[$. Let $(X_i)_{i \in \llbracket 1, n \rrbracket}$ be a sequence of independent random variables all following a Rademacher distribution. Let $(c_1, \ldots, c_n) \in \mathbf{R}^n$. Assume $1 \leq p < 2$. Justify that there exists $\theta \in \left]0, 1\right[$ such that $\frac{1}{2} = \frac{\theta}{p} + \frac{1-\theta}{4}$.
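Solving the interpolation equation for $\theta$ gives $\theta = \frac{p}{4-p}$, which lies in $\left]0,1\right[$ exactly when $1 \leq p < 2$. An exact-arithmetic check:

```python
from fractions import Fraction

def theta(p):
    """Solve 1/2 = theta/p + (1 - theta)/4 exactly: theta = p / (4 - p)."""
    p = Fraction(p)
    return p / (4 - p)

for p in (1, Fraction(3, 2), Fraction(19, 10)):
    th = theta(p)
    assert 0 < th < 1                                   # theta lies in ]0, 1[
    assert th / p + (1 - th) / 4 == Fraction(1, 2)      # exact arithmetic check
```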
grandes-ecoles 2025 Q17 Verification of Probability Measure or Inner Product Properties
Let $(X_i)_{i \in \mathbf{N}}$ be a sequence of independent random variables all following a Rademacher distribution. Show that the map $\varphi$ defined on $\left(L^0(\Omega)\right)^2$ by $$\forall X, Y \in L^0(\Omega), \quad \varphi(X, Y) = \mathbf{E}(XY)$$ is an inner product on $L^0(\Omega)$.
grandes-ecoles 2025 Q18 Verification of Probability Measure or Inner Product Properties
Let $(X_i)_{i \in \mathbf{N}}$ be a sequence of independent random variables all following a Rademacher distribution. Consider the map $\psi : u \in \mathbf{R}^{(\mathbf{N})} \mapsto \sum_{i=0}^{+\infty} u_i X_i$. Show that $\psi$ takes its values in $L^0(\Omega)$, then that $\psi$ preserves the inner product.
grandes-ecoles 2025 Q19 Expectation and Moment Inequality Proof
Let $(X_i)_{i \in \mathbf{N}}$ be a sequence of independent random variables all following a Rademacher distribution. Consider the map $\psi : u \in \mathbf{R}^{(\mathbf{N})} \mapsto \sum_{i=0}^{+\infty} u_i X_i$, and write $R = \psi\left(\mathbf{R}^{(\mathbf{N})}\right)$. Show that for all $p, q \in \left[1, +\infty\right[$, the norms $\|\cdot\|_p$ and $\|\cdot\|_q$ are equivalent on $R$.
grandes-ecoles 2025 Q20 Expectation and Moment Inequality Proof
In this part, we assume that $n$ is a power of 2: we write $n = 2^k$ with $k \in \mathbf{N}^\star$. Let $(a_1, \ldots, a_k) \in \mathbf{R}^k$. Show that $$\alpha_1 n \left\|(a_1, \ldots, a_k)\right\|_2^{\mathbf{R}^k} \leq \sum_{(\varepsilon_1, \ldots, \varepsilon_k) \in \{-1, 1\}^k} \left|\sum_{i=1}^k \varepsilon_i a_i\right| \leq \beta_1 n \left\|(a_1, \ldots, a_k)\right\|_2^{\mathbf{R}^k}.$$ You may use questions 11 and 16.
grandes-ecoles 2025 Q20 Expectation and Moment Inequality Proof
In this fourth part, $A \in \mathcal{S}_n(\mathbb{R})$ is a symmetric matrix whose eigenvalues are denoted $\lambda_1 \leqslant \lambda_2 \leqslant \cdots \leqslant \lambda_n$. For $x \in \mathbb{R}$ we denote $\chi_A(x) = \operatorname{det}\left(x \mathbb{I}_n - A\right)$. We consider any orthonormal basis $\left(\mathbf{u}_1, \ldots, \mathbf{u}_n\right)$. Let $\mathbf{U}$ be a random variable defined on a probability space $(\Omega, \mathcal{A}, \mathbb{P})$ taking values in the finite set $\left\{\mathbf{u}_1, \ldots, \mathbf{u}_n\right\}$, and which follows the uniform distribution on this set. We consider the random variable $B = A + \mathbf{U}\mathbf{U}^T$.
Show that for all $\mathbf{w} \in \mathbb{R}^n$, we have $\mathbb{E}\left[\langle \mathbf{U}, \mathbf{w} \rangle^2\right] = \frac{1}{n} \|\mathbf{w}\|^2$.
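A small sanity check of this identity in $\mathbb{R}^2$, using a rotated orthonormal basis (the angle and vector below are arbitrary test values); the identity is Parseval's relation averaged over the basis:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

theta = 0.9
u1 = (math.cos(theta), math.sin(theta))    # a rotated orthonormal basis of R^2
u2 = (-math.sin(theta), math.cos(theta))
w = (1.3, -0.7)
# U uniform on {u1, u2}: E[<U, w>^2] = (<u1, w>^2 + <u2, w>^2) / 2
lhs = (dot(u1, w) ** 2 + dot(u2, w) ** 2) / 2
assert math.isclose(lhs, dot(w, w) / 2)    # equals |w|^2 / n with n = 2
```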
isi-entrance 2024 Q6 Expectation and Moment Inequality Proof
Let $x_1, x_2, \ldots, x_n$ be non-negative real numbers such that $\sum_{i=1}^{n} x_i = 1$. What is the maximum possible value of $\sum_{i=1}^{n} \sqrt{x_i}$?
(A) 1
(B) $\sqrt{n}$
(C) $n^{3/4}$
(D) $n$
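By Cauchy-Schwarz, $\sum_i \sqrt{x_i} \leq \sqrt{n \sum_i x_i} = \sqrt{n}$, with equality at $x_i = 1/n$. A quick numeric check:

```python
def sqrt_sum(x):
    """Sum of sqrt(x_i) over a point of the simplex (sum x_i = 1)."""
    return sum(xi ** 0.5 for xi in x)

n = 16
uniform = [1.0 / n] * n                    # equal weights: the Cauchy-Schwarz extremal
assert abs(sqrt_sum(uniform) - n ** 0.5) < 1e-9
# other feasible points fall below sqrt(n)
for x in ([1.0] + [0.0] * (n - 1), [0.5, 0.25, 0.25] + [0.0] * (n - 3)):
    assert sqrt_sum(x) <= n ** 0.5
```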
isi-entrance 2024 Q6 Expectation and Moment Inequality Proof
Let $x_1, \ldots, x_{2024}$ be non-negative real numbers with $\sum_{i=1}^{2024} x_i = 1$. Find, with proof, the minimum and maximum possible values of the expression
$$\sum_{i=1}^{1012} x_i + \sum_{i=1013}^{2024} x_i^2.$$
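A numerical probe of the two natural candidate extremes (all mass in the linear block, or mass spread evenly over the squared block); the code only checks these candidates against random feasible points, it is not a proof:

```python
import random

def expr(x):
    """The quantity sum_{i<=1012} x_i + sum_{i>1012} x_i^2 on the simplex."""
    assert len(x) == 2024 and abs(sum(x) - 1) < 1e-9
    return sum(x[:1012]) + sum(xi ** 2 for xi in x[1012:])

all_first = [1.0] + [0.0] * 2023                   # all mass in the linear block
spread_second = [0.0] * 1012 + [1 / 1012] * 1012   # mass spread over the squared block
assert expr(all_first) == 1.0
assert abs(expr(spread_second) - 1 / 1012) < 1e-12
# random feasible points stay between the two candidate values
rng = random.Random(0)
for _ in range(100):
    raw = [rng.random() for _ in range(2024)]
    s = sum(raw)
    v = expr([u / s for u in raw])
    assert 1 / 1012 - 1e-12 <= v <= 1 + 1e-12
```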
todai-math 2022 QII.2 Distribution of Transformed or Combined Random Variables
Consider a situation where products are produced sequentially. The events producing defective products are independent and identically distributed, and a defective product is produced with a probability of $\phi$ $(0 \leq \phi \leq 1)$. Suppose that the probability of producing a defective product follows the Beta distribution
$$\operatorname{Beta}_{\mathrm{a},\mathrm{b}}(x) = \frac{1}{\mathrm{B}(\mathrm{a},\mathrm{b})} x^{\mathrm{a}-1}(1-x)^{\mathrm{b}-1} \quad (0 \leq x \leq 1),$$
for real numbers $\mathrm{a}(>1)$ and $\mathrm{b}(>1)$. Note that the Beta function $\mathrm{B}(\mathrm{a},\mathrm{b})$ is defined as
$$\mathrm{B}(\mathrm{a},\mathrm{b}) = \int_0^1 t^{\mathrm{a}-1}(1-t)^{\mathrm{b}-1} \mathrm{d}t$$
In Bayesian estimation, the parameter $\theta$ (in this case, $\phi$) that determines the probability is treated as a random variable, and we assume that its distribution is described by $\pi(\theta)$. We calculate $\pi(\theta \mid A)$ by
$$\pi(\theta \mid A) = \frac{\pi(\theta) P(A \mid \theta)}{P(A)}$$
where $\pi(\theta \mid A)$ is the posterior probability, $P(A \mid \theta)$ is the conditional occurrence probability that the event $A$ is observed under $\theta$, and $\pi(\theta)$ is the prior probability.
We assume that $\phi$, the probability of producing a defective product, follows the prior probability $\operatorname{Beta}_{\mathrm{a},\mathrm{b}}(\phi)$. Let $Q(\boldsymbol{v} \mid \phi)$ be the conditional occurrence probability of $\boldsymbol{v}$ under $\phi$ and $Q_{\mathrm{a},\mathrm{b}}(\boldsymbol{v})$ be the occurrence probability of $\boldsymbol{v}$. Obtain the posterior probability after $\boldsymbol{v}$ occurs.
todai-math 2022 QII.3 Distribution of Transformed or Combined Random Variables
Consider a situation where products are produced sequentially. The events producing defective products are independent and identically distributed, and a defective product is produced with a probability of $\phi$ $(0 \leq \phi \leq 1)$. Suppose that the probability of producing a defective product follows the Beta distribution
$$\operatorname{Beta}_{\mathrm{a},\mathrm{b}}(x) = \frac{1}{\mathrm{B}(\mathrm{a},\mathrm{b})} x^{\mathrm{a}-1}(1-x)^{\mathrm{b}-1} \quad (0 \leq x \leq 1),$$
for real numbers $\mathrm{a}(>1)$ and $\mathrm{b}(>1)$. Note that the Beta function $\mathrm{B}(\mathrm{a},\mathrm{b})$ is defined as
$$\mathrm{B}(\mathrm{a},\mathrm{b}) = \int_0^1 t^{\mathrm{a}-1}(1-t)^{\mathrm{b}-1} \mathrm{d}t$$
By defining $v_i = 1$ if the $i$-th product is a defective product, and $v_i = 0$ if it is not defective, we get a series $\boldsymbol{v} = (v_1, \cdots, v_N)$, where the values can be 0 or 1. Let $N_d(\boldsymbol{v})$ be the number of observations with value of 1 in $\boldsymbol{v}$.
Suppose that $Q(\boldsymbol{v} \mid \phi)$ in Question II.2 is the occurrence probability obtained in Question II.1. Setting $\mathrm{a} = 2$ and $\mathrm{b} = 50$, obtain $Q_{2,50}(\boldsymbol{v})$.
todai-math 2022 QII.4 Distribution of Transformed or Combined Random Variables
Consider a situation where products are produced sequentially. The events producing defective products are independent and identically distributed, and a defective product is produced with a probability of $\phi$ $(0 \leq \phi \leq 1)$. Suppose that the probability of producing a defective product follows the Beta distribution
$$\operatorname{Beta}_{\mathrm{a},\mathrm{b}}(x) = \frac{1}{\mathrm{B}(\mathrm{a},\mathrm{b})} x^{\mathrm{a}-1}(1-x)^{\mathrm{b}-1} \quad (0 \leq x \leq 1),$$
for real numbers $\mathrm{a}(>1)$ and $\mathrm{b}(>1)$. Note that the Beta function $\mathrm{B}(\mathrm{a},\mathrm{b})$ is defined as
$$\mathrm{B}(\mathrm{a},\mathrm{b}) = \int_0^1 t^{\mathrm{a}-1}(1-t)^{\mathrm{b}-1} \mathrm{d}t$$
In Question II.3, with $\mathrm{a} = 2$ and $\mathrm{b} = 50$, show that the posterior probability becomes the Beta distribution $\operatorname{Beta}_{\mathrm{a}^{\prime},\mathrm{b}^{\prime}}(\phi)$, and obtain $\mathrm{a}^{\prime}$ and $\mathrm{b}^{\prime}$.
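This is the standard Beta-Bernoulli conjugacy, which the question asks you to derive. A numeric check that the prior-times-likelihood product has the shape of a Beta density with updated parameters (the counts $N = 30$, $N_d = 3$ are hypothetical example values):

```python
import math

def beta_pdf(x, a, b):
    """Density of Beta(a, b) at x, via the Gamma function."""
    return (x ** (a - 1) * (1 - x) ** (b - 1)
            * math.gamma(a + b) / (math.gamma(a) * math.gamma(b)))

def unnorm_posterior(phi, a, b, N, Nd):
    """Beta(a, b) prior times the Bernoulli likelihood phi^Nd * (1-phi)^(N-Nd)."""
    return beta_pdf(phi, a, b) * phi ** Nd * (1 - phi) ** (N - Nd)

a, b, N, Nd = 2, 50, 30, 3   # hypothetical counts; any N, Nd behave the same way
ratios = [unnorm_posterior(phi, a, b, N, Nd) / beta_pdf(phi, a + Nd, b + N - Nd)
          for phi in (0.05, 0.1, 0.2, 0.4)]
# a constant ratio means the posterior is proportional to Beta(a + Nd, b + N - Nd)
assert all(math.isclose(r, ratios[0], rel_tol=1e-9) for r in ratios)
```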
todai-math 2022 QII.5 Distribution of Transformed or Combined Random Variables
Consider a situation where products are produced sequentially. The events producing defective products are independent and identically distributed, and a defective product is produced with a probability of $\phi$ $(0 \leq \phi \leq 1)$. Suppose that the probability of producing a defective product follows the Beta distribution
$$\operatorname{Beta}_{\mathrm{a},\mathrm{b}}(x) = \frac{1}{\mathrm{B}(\mathrm{a},\mathrm{b})} x^{\mathrm{a}-1}(1-x)^{\mathrm{b}-1} \quad (0 \leq x \leq 1),$$
for real numbers $\mathrm{a}(>1)$ and $\mathrm{b}(>1)$. Note that the Beta function $\mathrm{B}(\mathrm{a},\mathrm{b})$ is defined as
$$\mathrm{B}(\mathrm{a},\mathrm{b}) = \int_0^1 t^{\mathrm{a}-1}(1-t)^{\mathrm{b}-1} \mathrm{d}t$$
In Question II.4, where the posterior probability is the Beta distribution $\operatorname{Beta}_{\mathrm{a}^{\prime},\mathrm{b}^{\prime}}(\phi)$ with $\mathrm{a} = 2$ and $\mathrm{b} = 50$, obtain the value of $\phi$ that maximizes the posterior probability (the maximum a posteriori estimate).
todai-math 2022 Q3 Distribution of Transformed or Combined Random Variables
Consider the region $R$ defined by $0 < x < 1$ and $0 < y < 1$ in the $xy$-plane. We randomly select a point on $R$ and call the selected point A; we assume that A is uniformly distributed on $R$. Let AB be the perpendicular from A to the $y$-axis and AC the perpendicular from A to the $x$-axis, as shown in the figure. We call the rectangle OCAB "the rectangle of A", where O denotes the origin. Let $S$ be a random variable representing the area of the rectangle of A. Answer the following questions.
(1) Calculate the expected value of $S$.
(2) Calculate the probability that $S \leq r$ holds, where $0 < r < 1$.
(3) Calculate the probability density function of $S$.
Again consider the region $R$. Let $n$ be a positive integer. We select $n$ points on $R$ and refer to them as $\mathrm{A}_1, \mathrm{A}_2, \ldots, \mathrm{A}_n$. We assume that each point is uniformly distributed on $R$, and that $\mathrm{A}_i$ and $\mathrm{A}_j$ are selected independently for $i \neq j$. Answer the following question.
(4) Let $S_i$ be a random variable representing the area of the rectangle of $\mathrm{A}_i$. Let $Z$ be a random variable which is the minimum of $S_1, S_2, \ldots, S_n$. Calculate the probability density function of $Z$.
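A Monte Carlo sketch for these questions. The closed forms asserted below ($\mathbb{E}[S] = 1/4$, $\mathbb{P}(S \leq r) = r(1 - \ln r)$, and $\mathbb{P}(Z \leq r) = 1 - (1 - \mathbb{P}(S \leq r))^n$) are what the standard computation for a product of independent uniforms yields; treat this as a check, not a solution:

```python
import math
import random

def simulate(trials=100000, n=3, seed=7):
    """Monte Carlo for S = X*Y with X, Y iid Uniform(0,1), and Z = min of n copies."""
    rng = random.Random(seed)
    s_vals, z_vals = [], []
    for _ in range(trials):
        areas = [rng.random() * rng.random() for _ in range(n)]
        s_vals.append(areas[0])
        z_vals.append(min(areas))
    return s_vals, z_vals

s_vals, z_vals = simulate()
trials = len(s_vals)
assert abs(sum(s_vals) / trials - 0.25) < 0.01        # (1): E[S] = 1/4
r = 0.3
F = r * (1 - math.log(r))                             # (2): P(S <= r) = r(1 - ln r)
assert abs(sum(v <= r for v in s_vals) / trials - F) < 0.01
# (4): P(Z <= r) = 1 - (1 - F)^n for the minimum of n = 3 independent areas
assert abs(sum(v <= r for v in z_vals) / trials - (1 - (1 - F) ** 3)) < 0.01
```

Differentiating $\mathbb{P}(S \leq r)$ in $r$ then gives the density asked for in (3).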