LFM Stats And Pure


grandes-ecoles 2019 Q25 Convergence in Distribution or Probability
Let $(U_n)_{n \geqslant 1}$ be a sequence of mutually independent random variables following a Bernoulli distribution with parameter $1/2$. We set $Y_n = \sum_{k=1}^{n} \frac{U_k}{2^k}$, $F_n(x) = \mathbb{P}(Y_n \leqslant x)$ and $G_n(x) = \mathbb{P}(Y_n < x)$.
Using the monotonicity established in Q24, deduce the pointwise convergence of the sequences of functions $(F_n)_{n \geqslant 1}$ and $(G_n)_{n \geqslant 1}$.
grandes-ecoles 2019 Q25 Convergence in Distribution or Probability
Let $(\Omega, \mathcal{A}, \mathbb{P})$ be a probability space, $(U_n)_{n \geqslant 1}$ a sequence of mutually independent random variables following a Bernoulli distribution with parameter $1/2$. We set $$\forall n \in \mathbb{N}^{\star}, \quad Y_n = \sum_{k=1}^{n} \frac{U_k}{2^k}, \quad F_n(x) = \mathbb{P}(Y_n \leqslant x), \quad G_n(x) = \mathbb{P}(Y_n < x).$$
Deduce the pointwise convergence of the sequences of functions $(F_n)_{n \geqslant 1}$ and $(G_n)_{n \geqslant 1}$.
grandes-ecoles 2019 Q26 Convergence in Distribution or Probability
Let $(U_n)_{n \geqslant 1}$ be a sequence of mutually independent random variables following a Bernoulli distribution with parameter $1/2$. We set $Y_n = \sum_{k=1}^{n} \frac{U_k}{2^k}$, $F_n(x) = \mathbb{P}(Y_n \leqslant x)$ and $G_n(x) = \mathbb{P}(Y_n < x)$, and $D = \bigcup_{n \in \mathbb{N}^{\star}} D_n$ where $D_n = \left\{ \sum_{j=1}^{n} \frac{x_j}{2^j},\, (x_j)_{j \in \llbracket 1,n \rrbracket} \in \{0,1\}^n \right\}$.
Show $$\forall x \in D \cup \{1\}, \quad \lim_{n \rightarrow \infty} F_n(x) = x \quad \text{and} \quad \lim_{n \rightarrow \infty} G_n(x) = x.$$
grandes-ecoles 2019 Q26 Distribution of Transformed or Combined Random Variables
In the general model of a Pólya urn ($b = c = 0$, $a = d$), using the results established so far (in particular that $H = G$ on $D_{\rho}$), conclude that, for all integers $n$ and for all $k \in \llbracket 0, n \rrbracket$, $$P(X_{n} = a_{0} + ka) = \binom{n}{k} \frac{L_{k}(a_{0}/a) L_{n-k}(b_{0}/a)}{L_{n}(a_{0}/a + b_{0}/a)}$$
grandes-ecoles 2019 Q26 Convergence in Distribution or Probability
Let $(\Omega, \mathcal{A}, \mathbb{P})$ be a probability space, $(U_n)_{n \geqslant 1}$ a sequence of mutually independent random variables following a Bernoulli distribution with parameter $1/2$. We set $$\forall n \in \mathbb{N}^{\star}, \quad Y_n = \sum_{k=1}^{n} \frac{U_k}{2^k}, \quad F_n(x) = \mathbb{P}(Y_n \leqslant x), \quad G_n(x) = \mathbb{P}(Y_n < x).$$ We denote $D = \bigcup_{n \in \mathbb{N}^{\star}} D_n$ where $D_n = \left\{ \sum_{j=1}^{n} \frac{x_j}{2^j},\, (x_j)_{j \in \llbracket 1,n \rrbracket} \in \{0,1\}^n \right\}$.
Show $$\forall x \in D \cup \{1\}, \quad \lim_{n \rightarrow \infty} F_n(x) = x \quad \text{and} \quad \lim_{n \rightarrow \infty} G_n(x) = x.$$
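The claimed limits can be sanity-checked numerically: since the $U_k$ are i.i.d. fair bits, $Y_n$ is uniform on the dyadic grid $\{j/2^n : 0 \leqslant j < 2^n\}$, so $F_n$ and $G_n$ can be computed exactly. A minimal Python sketch (an illustration of the statement, not part of the exam):

```python
from fractions import Fraction

def F(n, x):
    """Exact F_n(x) = P(Y_n <= x); Y_n is uniform on {j/2^n : 0 <= j < 2^n}."""
    return Fraction(sum(1 for j in range(2 ** n) if Fraction(j, 2 ** n) <= x), 2 ** n)

def G(n, x):
    """Exact G_n(x) = P(Y_n < x)."""
    return Fraction(sum(1 for j in range(2 ** n) if Fraction(j, 2 ** n) < x), 2 ** n)

x = Fraction(3, 8)          # a point of D_3
for n in (4, 8, 12):
    print(n, float(F(n, x)), float(G(n, x)))   # both tend to 0.375
```

For dyadic $x = j/2^m$ and $n \geqslant m$ one finds $G_n(x) = x$ exactly and $F_n(x) = x + 2^{-n}$, consistent with both limits being $x$.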
grandes-ecoles 2019 Q27 Convergence in Distribution or Probability
Let $(U_n)_{n \geqslant 1}$ be a sequence of mutually independent random variables following a Bernoulli distribution with parameter $1/2$. We set $Y_n = \sum_{k=1}^{n} \frac{U_k}{2^k}$, $F_n(x) = \mathbb{P}(Y_n \leqslant x)$ and $G_n(x) = \mathbb{P}(Y_n < x)$.
Generalize the results obtained in Q26 to all $x \in [0,1]$.
grandes-ecoles 2019 Q27 Distribution of Transformed or Combined Random Variables
In the general model of a Pólya urn ($b = c = 0$, $a = d$), using the result of question 26, recover the result of question 10.
grandes-ecoles 2019 Q27 Convergence in Distribution or Probability
Let $(\Omega, \mathcal{A}, \mathbb{P})$ be a probability space, $(U_n)_{n \geqslant 1}$ a sequence of mutually independent random variables following a Bernoulli distribution with parameter $1/2$. We set $$\forall n \in \mathbb{N}^{\star}, \quad Y_n = \sum_{k=1}^{n} \frac{U_k}{2^k}, \quad F_n(x) = \mathbb{P}(Y_n \leqslant x), \quad G_n(x) = \mathbb{P}(Y_n < x).$$ We denote $D = \bigcup_{n \in \mathbb{N}^{\star}} D_n$ where $D_n = \left\{ \sum_{j=1}^{n} \frac{x_j}{2^j},\, (x_j)_{j \in \llbracket 1,n \rrbracket} \in \{0,1\}^n \right\}$.
Generalize the results obtained in the previous question to all $x \in [0,1]$.
grandes-ecoles 2019 Q28 Convergence in Distribution or Probability
Let $(U_n)_{n \geqslant 1}$ be a sequence of mutually independent random variables following a Bernoulli distribution with parameter $1/2$. We set $Y_n = \sum_{k=1}^{n} \frac{U_k}{2^k}$.
Show that for every non-empty interval $I \subset [0,1]$, we have $$\lim_{n \rightarrow \infty} \mathbb{P}(Y_n \in I) = \ell(I) \quad \text{with} \quad \ell(I) = \sup I - \inf I.$$
grandes-ecoles 2019 Q28 Expectation and Moment Inequality Proof
In the general model of a Pólya urn ($b = c = 0$, $a = d$), using the results of questions 16 and 19, determine the expectation of $X_{n}$.
grandes-ecoles 2019 Q28 Convergence in Distribution or Probability
Let $(\Omega, \mathcal{A}, \mathbb{P})$ be a probability space, $(U_n)_{n \geqslant 1}$ a sequence of mutually independent random variables following a Bernoulli distribution with parameter $1/2$. We set $$\forall n \in \mathbb{N}^{\star}, \quad Y_n = \sum_{k=1}^{n} \frac{U_k}{2^k}.$$
Show that for every non-empty interval $I \subset [0,1]$, we have $$\lim_{n \rightarrow \infty} \mathbb{P}(Y_n \in I) = \ell(I) \quad \text{with} \quad \ell(I) = \sup I - \inf I.$$
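As a numeric illustration of the claim (using again that $Y_n$ is uniform on the grid $\{j/2^n\}$), one can evaluate $\mathbb{P}(Y_n \in I)$ exactly for a sample interval; the interval $[0.2, 0.7]$ below is a hypothetical choice:

```python
def prob_in(n, lo, hi):
    """Exact P(Y_n in [lo, hi]); Y_n is uniform on {j/2^n : 0 <= j < 2^n}."""
    return sum(1 for j in range(2 ** n) if lo <= j / 2 ** n <= hi) / 2 ** n

# For I = [0.2, 0.7], P(Y_n in I) approaches l(I) = 0.5.
for n in (4, 8, 12):
    print(n, prob_in(n, 0.2, 0.7))
```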
grandes-ecoles 2019 Q29 Convergence in Distribution or Probability
Let $(U_n)_{n \geqslant 1}$ be a sequence of mutually independent random variables following a Bernoulli distribution with parameter $1/2$. We set $Y_n = \sum_{k=1}^{n} \frac{U_k}{2^k}$.
Using the result of Q28, deduce that for every continuous function $f$ from $[0,1]$ to $\mathbb{R}$, the sequence $(\mathbb{E}(f(Y_n)))_{n \geqslant 1}$ converges and specify its limit.
grandes-ecoles 2019 Q29 Convergence in Distribution or Probability
Let $(\Omega, \mathcal{A}, \mathbb{P})$ be a probability space, $(U_n)_{n \geqslant 1}$ a sequence of mutually independent random variables following a Bernoulli distribution with parameter $1/2$. We set $$\forall n \in \mathbb{N}^{\star}, \quad Y_n = \sum_{k=1}^{n} \frac{U_k}{2^k}.$$
Deduce that, for every continuous function $f$ from $[0,1]$ to $\mathbb{R}$, the sequence $(\mathbb{E}(f(Y_n)))_{n \geqslant 1}$ converges and specify its limit.
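Since $\mathbb{E}(f(Y_n)) = 2^{-n}\sum_{j=0}^{2^n-1} f(j/2^n)$ is a Riemann sum of $f$ over $[0,1]$, the limit is $\int_0^1 f(t)\,\mathrm{d}t$. A quick check with $f(t) = t^2$ (a hypothetical choice for illustration):

```python
def expect_f(n, f):
    """E[f(Y_n)] = 2^-n * sum_j f(j/2^n): a Riemann sum of f over [0, 1]."""
    return sum(f(j / 2 ** n) for j in range(2 ** n)) / 2 ** n

# With f(t) = t^2 the limit should be the integral 1/3.
print(expect_f(12, lambda t: t * t))   # close to 1/3
```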
grandes-ecoles 2019 Q30 Characteristic/Moment Generating Function Derivation
Using the result of Q29, propose another proof of the result obtained in question 6, i.e., the pointwise limit of the sequence of functions $(\varphi_n)_{n \geqslant 1}$ defined by $$\forall n \in \mathbb{N}^{\star}, \quad \varphi_n(t) = \mathbb{E}(\cos(t X_n)).$$
grandes-ecoles 2019 Q30 Characteristic/Moment Generating Function Derivation
Let $(\Omega, \mathcal{A}, \mathbb{P})$ be a probability space, $(U_n)_{n \geqslant 1}$ a sequence of mutually independent random variables following a Bernoulli distribution with parameter $1/2$. We set $$\forall n \in \mathbb{N}^{\star}, \quad Y_n = \sum_{k=1}^{n} \frac{U_k}{2^k}.$$ For every continuous function $f$ from $[0,1]$ to $\mathbb{R}$, the sequence $(\mathbb{E}(f(Y_n)))_{n \geqslant 1}$ converges to $\int_0^1 f(t)\,\mathrm{d}t$.
Using the previous result, propose another proof of the result obtained in question 6.
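Taking $f(u) = \cos(tu)$ in the previous result gives $\lim_{n} \mathbb{E}(\cos(t Y_n)) = \int_0^1 \cos(tu)\,\mathrm{d}u = \frac{\sin t}{t}$ for $t \neq 0$. Assuming the $\varphi_n$ of question 6 is this cosine transform (its definition is not reproduced in this entry), a numeric check:

```python
import math

def phi(n, t):
    """E[cos(t * Y_n)], computed exactly on the dyadic grid of Y_n."""
    return sum(math.cos(t * j / 2 ** n) for j in range(2 ** n)) / 2 ** n

t = 2.0
print(phi(12, t), math.sin(t) / t)   # the two values nearly agree
```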
grandes-ecoles 2020 Q4 Characteristic/Moment Generating Function Derivation
Let $(X_1, \ldots, X_n)$ be mutually independent discrete real random variables such that, for all $k \in \{1, \ldots, n\}$, $$P[X_k = 1] = P[X_k = -1] = \frac{1}{2}$$ For each $\lambda \geqslant 0$, we set $$m(\lambda) = \frac{E[X_1 \exp(\lambda X_1)]}{E[\exp(\lambda X_1)]}$$ Show that the function $m$ is strictly increasing on $\mathbb{R}_{+}$, and that for all $t \in [0,1[$, there exists a unique $\lambda \geqslant 0$ such that $m(\lambda) = t$.
grandes-ecoles 2020 Q4 Characteristic/Moment Generating Function Derivation
Let $n \geqslant 1$ be a natural integer, and let $(X_1, \ldots, X_n)$ be mutually independent discrete real random variables such that, for all $k \in \{1, \ldots, n\}$, $$P[X_k = 1] = P[X_k = -1] = \frac{1}{2}$$ For each $\lambda \geqslant 0$, we set $$m(\lambda) = \frac{E[X_1 \exp(\lambda X_1)]}{E[\exp(\lambda X_1)]}$$ Show that the function $m$ is strictly increasing on $\mathbb{R}_{+}$, and that for all $t \in [0,1[$, there exists a unique $\lambda \geqslant 0$ such that $m(\lambda) = t$.
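For these variables $E[X_1 \exp(\lambda X_1)] = \sinh\lambda$ and $E[\exp(\lambda X_1)] = \cosh\lambda$, so $m(\lambda) = \tanh\lambda$, which makes the claimed bijection from $\mathbb{R}_{+}$ onto $[0,1[$ explicit (the unique solution of $m(\lambda) = t$ is $\lambda = \operatorname{artanh} t$). A small numeric confirmation:

```python
import math

def m(lam):
    # E[X1 exp(lam X1)] = sinh(lam), E[exp(lam X1)] = cosh(lam), so m = tanh
    return math.sinh(lam) / math.cosh(lam)

# m is strictly increasing with m(0) = 0 and m(lam) -> 1 as lam -> infinity,
# so for t in [0, 1) the unique solution of m(lam) = t is lam = artanh(t).
t = 0.5
lam = math.atanh(t)
print(lam, m(lam))   # m(lam) recovers t = 0.5
```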
grandes-ecoles 2020 Q4 Characteristic/Moment Generating Function Derivation
Let $a$ and $b$ be two real numbers and $Y = aX + b$. For all real $t$, express $\phi_Y(t)$ in terms of $\phi_X$, $t$, $a$ and $b$.
grandes-ecoles 2020 Q5 Expectation and Moment Inequality Proof
Let $n \geqslant 1$ be a natural integer, and let $(X_1, \ldots, X_n)$ be discrete real random variables that are mutually independent such that, for all $k \in \{1, \ldots, n\}$, $$P[X_k = 1] = P[X_k = -1] = \frac{1}{2}$$ We define $$S_n = \frac{1}{n} \sum_{k=1}^{n} X_k$$ as well as, for all $\lambda \in \mathbb{R}$, $$\psi(\lambda) = \log\left(\frac{1}{2}e^{\lambda} + \frac{1}{2}e^{-\lambda}\right)$$ For each $\lambda \geqslant 0$, we set $$m(\lambda) = \frac{E[X_1 \exp(\lambda X_1)]}{E[\exp(\lambda X_1)]}$$ as well as $$D_n(\lambda) = \exp(\lambda n S_n - n \psi(\lambda))$$
(a) For $n \geqslant 2$ and $\lambda \geqslant 0$, show that $$E[(X_1 - m(\lambda))(X_2 - m(\lambda)) D_n(\lambda)] = 0$$
(b) Deduce that, for $n \geqslant 1$ and $\lambda \geqslant 0$, $$E[(S_n - m(\lambda))^2 D_n(\lambda)] \leqslant \frac{4}{n}.$$
grandes-ecoles 2020 Q5 Expectation and Moment Inequality Proof
Let $n \geqslant 1$ be a natural integer, and let $(X_1, \ldots, X_n)$ be mutually independent discrete real random variables such that, for all $k \in \{1, \ldots, n\}$, $$P[X_k = 1] = P[X_k = -1] = \frac{1}{2}$$ We define $$S_n = \frac{1}{n} \sum_{k=1}^{n} X_k$$ as well as, for all $\lambda \in \mathbb{R}$, $$\psi(\lambda) = \log\left(\frac{1}{2}e^{\lambda} + \frac{1}{2}e^{-\lambda}\right)$$ For each $\lambda \geqslant 0$, we set $$m(\lambda) = \frac{E[X_1 \exp(\lambda X_1)]}{E[\exp(\lambda X_1)]}$$ as well as $$D_n(\lambda) = \exp(\lambda n S_n - n \psi(\lambda))$$
(a) For $n \geqslant 2$ and $\lambda \geqslant 0$, show that $$E[(X_1 - m(\lambda))(X_2 - m(\lambda)) D_n(\lambda)] = 0$$
(b) Deduce that, for $n \geqslant 1$ and $\lambda \geqslant 0$, $$E[(S_n - m(\lambda))^2 D_n(\lambda)] \leqslant \frac{4}{n}.$$
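The bound in (b) can be checked by exact enumeration of all $2^n$ sign vectors for small $n$. In fact $D_n(\lambda)$ reweights the $X_k$ to an i.i.d. exponentially tilted law with mean $m(\lambda)$, under which the left-hand side equals $(1 - m(\lambda)^2)/n$ — an observation beyond what the question asks, stated here only to calibrate the check:

```python
import itertools, math

def lhs(n, lam):
    """Exact E[(S_n - m(lam))^2 D_n(lam)] over all 2^n equally likely sign vectors."""
    psi = math.log(0.5 * math.exp(lam) + 0.5 * math.exp(-lam))
    m = math.tanh(lam)
    total = 0.0
    for signs in itertools.product((-1, 1), repeat=n):
        s = sum(signs) / n
        total += (s - m) ** 2 * math.exp(lam * n * s - n * psi) / 2 ** n
    return total

n, lam = 10, 0.7
print(lhs(n, lam), 4 / n)   # the bound 4/n holds; the exact value is (1 - tanh(lam)^2)/n
```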
grandes-ecoles 2020 Q5 Characteristic/Moment Generating Function Derivation
Let $t \in \mathbb{R}$. Give an expression of $\phi_X(-t)$ in terms of $\phi_X(t)$. Deduce a necessary and sufficient condition on the image $\phi_X(\mathbb{R})$ for the function $\phi_X$ to be even.
grandes-ecoles 2020 Q8 Verification of Probability Measure or Inner Product Properties
For all $f, g \in \mathcal{E}$, we define $$(f \mid g) = \int_{-\infty}^{+\infty} f(y) g(y) \,\mathrm{d}y.$$
We define $\gamma_\lambda : \mathbf{R} \rightarrow \mathbf{R}$ by $\gamma_\lambda(y) = \exp\left(-y^2/\lambda\right)$ and for all $x \in \mathbf{R}$, $\tau_x(f)(y) = f(y-x)$.
(a) Show that for all $f \in \mathcal{E}$, we have $(f \mid f) \geq 0$ with equality if and only if $f = 0$.
(b) Show that for all $x \in \mathbf{R}$, $\tau_x\left(\gamma_\lambda\right)$ belongs to $\mathcal{E}$.
grandes-ecoles 2020 Q8 Integrability, Boundedness, and Regularity of Density/Distribution-Related Functions
In this part, $E$ denotes the vector space of continuous functions $f : [0,1] \rightarrow \mathbb{R}$, equipped with the inner product defined by, $$\forall (f,g) \in E^2, \quad \langle f,g \rangle = \int_0^1 f(t)g(t)\,\mathrm{d}t$$ For all $f \in E$, we set, $$\forall s \in [0,1], \quad T(f)(s) = \int_0^1 k_s(t) f(t)\,\mathrm{d}t$$ where $k_s(t) = \begin{cases} t(1-s) & \text{if } t < s \\ s(1-t) & \text{if } t \geqslant s. \end{cases}$ Show that $T$ is a continuous endomorphism of $E$.
grandes-ecoles 2020 Q8 Integrability, Boundedness, and Regularity of Density/Distribution-Related Functions
In this part, $E$ denotes the vector space of continuous functions $f : [0,1] \rightarrow \mathbb{R}$, equipped with the inner product defined by, $$\forall (f,g) \in E^2, \quad \langle f, g \rangle = \int_0^1 f(t) g(t) \, \mathrm{d}t$$ For all $s \in [0,1]$, we define the function $k_s$ by, $$\forall t \in [0,1], \quad k_s(t) = \begin{cases} t(1-s) & \text{if } t < s \\ s(1-t) & \text{if } t \geqslant s. \end{cases}$$ For all $f \in E$, we set, $$\forall s \in [0,1], \quad T(f)(s) = \int_0^1 k_s(t) f(t) \, \mathrm{d}t$$ Show that $T$ is a continuous endomorphism of $E$.
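One way to get a feel for $T$ (an observation not needed for the question): $k_s$ is the Green's function of $u \mapsto -u''$ with boundary conditions $u(0) = u(1) = 0$, so $\sin(\pi t)$ should be an eigenfunction of $T$ with eigenvalue $1/\pi^2$. A numeric sketch of this sanity check:

```python
import math

def T(f, s, n=4000):
    """Midpoint-rule approximation of T(f)(s) = integral of k_s(t) f(t) over [0, 1]."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        t = (i + 0.5) * h
        k = t * (1 - s) if t < s else s * (1 - t)
        total += k * f(t) * h
    return total

# Expected: T(sin(pi .))(s) is close to sin(pi s) / pi^2 for any s in [0, 1].
f = lambda t: math.sin(math.pi * t)
s = 0.3
print(T(f, s), f(s) / math.pi ** 2)
```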
grandes-ecoles 2020 Q9 Change of Variable and Integral Evaluation
We define $\gamma_\lambda(y) = \exp(-y^2/\lambda)$ and for all $x \in \mathbf{R}$, $\tau_x(f)(y) = f(y-x)$. For all $f, g \in \mathcal{E}$, $(f \mid g) = \int_{-\infty}^{+\infty} f(y)g(y)\,\mathrm{d}y$.
(a) Let $a > 0$. Show that there exists $c \geq 0$ such that for all $x \in \mathbf{R}$ we have $$\int_{-\infty}^{+\infty} \exp\left(-\frac{(y-x)^2}{\lambda}\right) \exp\left(-\frac{y^2}{a}\right) \mathrm{d}y = c \exp\left(-\frac{x^2}{a+\lambda}\right).$$ Hint: One may show the equality $$\frac{(y-x)^2}{\lambda} + \frac{y^2}{a} = \frac{a+\lambda}{a\lambda}\left(y - \frac{ax}{a+\lambda}\right)^2 + \frac{x^2}{a+\lambda}.$$
(b) Let $g \in \mathcal{E}$. We consider $C(g) : \mathbf{R} \rightarrow \mathbf{R}$ defined for all $x \in \mathbf{R}$ by $$C(g)(x) = \left(\tau_x(\gamma_\lambda) \mid g\right)$$ Show that $C(g) \in \mathcal{E}$.
(c) Show that $C : \mathcal{E} \rightarrow \mathcal{E}$ defines an endomorphism of $\mathcal{E}$.
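For part (a), completing the square as in the hint and using $\int_{\mathbb{R}} e^{-\alpha u^2}\,\mathrm{d}u = \sqrt{\pi/\alpha}$ gives the explicit constant $c = \sqrt{\pi a \lambda/(a+\lambda)}$ (worked out here; the question only asks for existence). A numeric check of the identity for sample values of $a$, $\lambda$ and $x$:

```python
import math

def conv_integral(x, a, lam, n=20000, L=30.0):
    """Midpoint-rule value of the integral of exp(-(y-x)^2/lam) exp(-y^2/a), truncated to [-L, L]."""
    h = 2 * L / n
    total = 0.0
    for i in range(n):
        y = -L + (i + 0.5) * h
        total += math.exp(-(y - x) ** 2 / lam - y ** 2 / a) * h
    return total

a, lam, x = 2.0, 3.0, 1.5
c = math.sqrt(math.pi * a * lam / (a + lam))   # from the Gaussian integral, after completing the square
print(conv_integral(x, a, lam), c * math.exp(-x ** 2 / (a + lam)))
```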