LFM Stats And Pure


grandes-ecoles 2022 Q31 Almost Sure Convergence and Random Series Properties
We fix $K \in \mathbb{N}^\star$ and consider a sequence of mutually independent random variables $(X_n)_{n \in \mathbb{N}}$ satisfying $\mathbb{P}(X_n = -1) = \mathbb{P}(X_n = 1) = \frac{1}{2}$ for all $n \in \mathbb{N}$, distinct real numbers $x_1 < \cdots < x_K$ in $[0,1]$, and a sequence of functions $(f_n)$ of class $\mathcal{C}^K$ on $[0,1]$ with real values satisfying: (H1) the function series $\sum f_n^{(K)}$ converges normally on $[0,1]$; (H2') for all $\ell \in \llbracket 1, K \rrbracket$, the numerical series $\sum f_n(x_\ell)^2$ is convergent.
Show that the event $$\left\{\text{for all } \ell \in \llbracket 1, K \rrbracket, \text{ the series } \sum X_n f_n(x_\ell) \text{ is convergent}\right\}$$ has probability 1.
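A standard route to this kind of statement (a sketch; the intended proof may rely on earlier parts of the problem):

```latex
For fixed $\ell$, the variables $X_n f_n(x_\ell)$ are independent and centered, with
\[
\sum_{n \geq 0} \mathbb{V}\big(X_n f_n(x_\ell)\big) = \sum_{n \geq 0} f_n(x_\ell)^2 < \infty
\quad \text{by (H2').}
\]
The Khinchine--Kolmogorov theorem (almost-sure convergence of a series of independent
centered random variables with summable variances) yields an event $A_\ell$ of
probability $1$ on which $\sum_n X_n f_n(x_\ell)$ converges, and
\[
\mathbb{P}\Big(\bigcap_{\ell=1}^{K} A_\ell\Big) = 1,
\]
since a finite intersection of almost-sure events is almost sure.
```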
grandes-ecoles 2022 Q32 Probability Inequality and Tail Bound Proof
With the notation of question 28, deduce that $$\mathbb{P}\left(M \in \mathrm{GL}_n(\mathbb{R})\right) \geqslant \frac{1}{2^{n-1}}.$$
grandes-ecoles 2022 Q32 Almost Sure Convergence and Random Series Properties
We fix $K \in \mathbb{N}^{\star}$ and consider a sequence of random variables $(X_n)_{n \in \mathbb{N}}$ satisfying $\mathbb{P}(X_n = -1) = \mathbb{P}(X_n = 1) = \frac{1}{2}$ (mutually independent), distinct real numbers $x_1 < \cdots < x_K$ in $[0,1]$, and a sequence of functions $(f_n)$ of class $\mathcal{C}^K$ on $[0,1]$ satisfying hypotheses (H1) and (H2'). Let $P_n \in \mathbb{R}_{K-1}[X]$ be a polynomial satisfying $P_n(x_\ell) = f_n(x_\ell)$ for all $\ell \in \llbracket 1, K \rrbracket$ (cf. question 7). Show that the event $$\left\{\begin{array}{l} \text{for all } k \in \llbracket 0, K \rrbracket, \text{ the function series } \sum X_n (f_n - P_n)^{(k)} \text{ is uniformly convergent on } [0,1], \\ \text{the function } \sum_{n=0}^{+\infty} X_n (f_n - P_n) \text{ is of class } \mathcal{C}^K, \\ \text{for all } k \in \llbracket 0, K \rrbracket, \left(\sum_{n=0}^{+\infty} X_n (f_n - P_n)\right)^{(k)} = \sum_{n=0}^{+\infty} X_n (f_n - P_n)^{(k)} \end{array}\right\}$$ has probability 1.
grandes-ecoles 2022 Q33 Almost Sure Convergence and Random Series Properties
We fix $K \in \mathbb{N}^{\star}$ and consider a sequence of random variables $(X_n)_{n \in \mathbb{N}}$ satisfying $\mathbb{P}(X_n = -1) = \mathbb{P}(X_n = 1) = \frac{1}{2}$ (mutually independent), distinct real numbers $x_1 < \cdots < x_K$ in $[0,1]$, and a sequence of functions $(f_n)$ of class $\mathcal{C}^K$ on $[0,1]$ satisfying hypotheses (H1) and (H2'). Show that the event $$\left\{\begin{array}{l} \text{for all } k \in \llbracket 0, K \rrbracket, \text{ the function series } \sum X_n f_n^{(k)} \text{ is uniformly convergent on } [0,1], \\ \text{the function } \sum_{n=0}^{+\infty} X_n f_n \text{ is of class } \mathcal{C}^K, \\ \text{for all } k \in \llbracket 0, K \rrbracket, \left(\sum_{n=0}^{+\infty} X_n f_n\right)^{(k)} = \sum_{n=0}^{+\infty} X_n f_n^{(k)} \end{array}\right\}$$ has probability 1.
grandes-ecoles 2022 Q34 Almost Sure Convergence and Random Series Properties
Give an example of an integer $K \in \mathbb{N}^{\star}$ for which the event in question Q33 occurs with the functions $f_n$ defined by $$\left\{\begin{array}{l} f_0 = 0 \\ f_n(x) = \ln\left(1 + \sin\left(\frac{x}{n}\right)\right) \quad \forall n \in \mathbb{N}^{\star}, \forall x \in [0,1]. \end{array}\right.$$
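One plausible candidate is $K = 2$: a hand computation (our own, not from the excerpt) gives $f_n''(x) = -\frac{1}{n^2(1+\sin(x/n))}$, so $\|f_n''\|_\infty = 1/n^2$ is summable, giving (H1) with $K = 2$, while $0 \leq f_n(x) \leq x/n$ gives (H2'). The sketch below double-checks the hand-derived closed form against finite differences:

```python
import math

def f(n, x):
    # f_n(x) = ln(1 + sin(x/n)) from the question statement
    return math.log(1.0 + math.sin(x / n))

def d2f(n, x):
    # hand-derived closed form (our assumption, checked below):
    # f_n''(x) = -1 / (n^2 (1 + sin(x/n)))
    return -1.0 / (n * n * (1.0 + math.sin(x / n)))

h = 1e-5
sup_norms = {}
for n in range(1, 31):
    worst = 0.0
    for k in range(101):
        x = k / 100.0
        fd = (f(n, x + h) - 2.0 * f(n, x) + f(n, x - h)) / (h * h)
        assert abs(fd - d2f(n, x)) < 1e-4   # closed form matches finite differences
        worst = max(worst, abs(fd))
    sup_norms[n] = worst
# sup over [0,1] of |f_n''| is attained at x = 0 and equals 1/n^2,
# so the series of sup-norms converges: (H1) holds with K = 2.
```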
grandes-ecoles 2022 Q36 Expectation and Moment Inequality Proof
We assume that $\Sigma_Y$ has $n$ distinct eigenvalues, ordered in strictly decreasing order $\lambda_1 > \cdots > \lambda_n$. We fix a vector $U_0$ such that $\mathbb{V}\left(U_0^\top Y\right) = \max_{U \in C} \mathbb{V}\left(U^\top Y\right)$, where $C = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \right\}$. We denote $$C' = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \text{ and } U_0^\top U = 0 \right\}.$$
Justify that $q_Y$ admits a maximum on $C'$.
grandes-ecoles 2022 Q37 Distribution of Transformed or Combined Random Variables
We assume that $\Sigma_Y$ has $n$ distinct eigenvalues, ordered in strictly decreasing order $\lambda_1 > \cdots > \lambda_n$. We fix a vector $U_0$ such that $\mathbb{V}\left(U_0^\top Y\right) = \max_{U \in C} \mathbb{V}\left(U^\top Y\right)$, where $C = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \right\}$. We denote $$C' = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \text{ and } U_0^\top U = 0 \right\}.$$
Determine the value of $\max_{U \in C'} \mathbb{V}\left(U^\top Y\right)$ and specify a vector $U_1 \in C'$ such that $$\max_{U \in C'} \mathbb{V}\left(U^\top Y\right) = \mathbb{V}\left(U_1^\top Y\right).$$
grandes-ecoles 2022 Q38 Distribution of Transformed or Combined Random Variables
We assume that $\Sigma_Y$ has $n$ distinct eigenvalues, ordered in strictly decreasing order $\lambda_1 > \cdots > \lambda_n$. We fix a vector $U_0$ such that $\mathbb{V}\left(U_0^\top Y\right) = \max_{U \in C} \mathbb{V}\left(U^\top Y\right)$, and a vector $U_1 \in C'$ such that $\mathbb{V}\left(U_1^\top Y\right) = \max_{U \in C'} \mathbb{V}\left(U^\top Y\right)$, where $$C' = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \text{ and } U_0^\top U = 0 \right\}.$$
Calculate the covariance of the discrete random variables $U_0^\top Y$ and $U_1^\top Y$ (to simplify notation, one may assume $Y$ is centered, that is, $\mathbb{E}(Y) = 0$).
grandes-ecoles 2022 Q39 Characteristic/Moment Generating Function Derivation
Let $X_1, \ldots, X_n, Y_1, \ldots, Y_n$ be mutually independent random variables with the same distribution $\mathcal{R}$ (where $X(\Omega) = \{-1,1\}$ and $\mathbb{P}(X=-1)=\mathbb{P}(X=1)=\frac{1}{2}$). We define the random vectors $X = \frac{1}{\sqrt{n}}\left(X_1, \ldots, X_n\right)^\top$ and $Y = \frac{1}{\sqrt{n}}\left(Y_1, \ldots, Y_n\right)^\top$ taking values in $\mathcal{M}_{n,1}(\mathbb{R})$.
Prove that, for every real number $t$, $$\mathbb{E}(\exp(t\langle X \mid Y \rangle)) = \left(\cosh\left(\frac{t}{n}\right)\right)^n.$$
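Since $\langle X \mid Y \rangle = \frac{1}{n}\sum_i X_i Y_i$ and the products $X_i Y_i$ are again i.i.d. with distribution $\mathcal{R}$, the identity can be checked exactly by enumerating sign patterns for small $n$ (a sanity check, not a proof):

```python
import itertools
import math

def mgf_exact(n, t):
    # E[exp(t <X|Y>)] with <X|Y> = (1/n) sum_i R_i, R_i = X_i Y_i uniform on {-1,1}:
    # average exp((t/n) * sum of signs) over all 2^n sign patterns.
    total = 0.0
    for signs in itertools.product((-1, 1), repeat=n):
        total += math.exp((t / n) * sum(signs))
    return total / 2.0 ** n

for n in (1, 2, 5, 8):
    for t in (-2.0, 0.5, 3.0):
        assert abs(mgf_exact(n, t) - math.cosh(t / n) ** n) < 1e-9
```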
grandes-ecoles 2022 Q41 Probability Inequality and Tail Bound Proof
Let $\sigma$ and $\lambda$ be two strictly positive real numbers and let $Z$ be a real random variable such that, for every $t \in \mathbb{R}$, $\exp(tZ)$ has finite expectation and $$\mathbb{E}(\exp(tZ)) \leqslant \exp\left(\frac{\sigma^2 t^2}{2}\right).$$
By applying Markov's inequality to a suitably chosen random variable, prove that $$\forall t \in \mathbb{R}^+, \quad \mathbb{P}(Z \geqslant \lambda) \leqslant \exp\left(\frac{\sigma^2 t^2}{2} - \lambda t\right).$$
grandes-ecoles 2022 Q42 Probability Inequality and Tail Bound Proof
Let $\sigma$ and $\lambda$ be two strictly positive real numbers and let $Z$ be a real random variable such that, for every $t \in \mathbb{R}$, $\exp(tZ)$ has finite expectation and $$\mathbb{E}(\exp(tZ)) \leqslant \exp\left(\frac{\sigma^2 t^2}{2}\right).$$
Deduce that $$\mathbb{P}(|Z| \geqslant \lambda) \leqslant 2\exp\left(-\frac{\lambda^2}{2\sigma^2}\right).$$
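For a concrete instance, a standard normal $Z$ satisfies the hypothesis with $\sigma = 1$ (its MGF is exactly $\exp(t^2/2)$), and its exact two-sided tail indeed sits below the stated bound; the sketch also checks numerically that minimising the Q41 bound $t \mapsto \exp(\sigma^2 t^2/2 - \lambda t)$ over $t \geqslant 0$, attained at $t = \lambda/\sigma^2$, yields $\exp(-\lambda^2/2\sigma^2)$:

```python
import math

for lam in (0.5, 1.0, 2.0, 3.0):
    exact_tail = math.erfc(lam / math.sqrt(2.0))   # P(|Z| >= lam), Z ~ N(0,1)
    bound = 2.0 * math.exp(-lam * lam / 2.0)
    assert exact_tail <= bound
    # minimising the Q41 bound over a grid of t >= 0 (sigma = 1)
    best = min(math.exp(t * t / 2.0 - lam * t)
               for t in (k / 1000.0 for k in range(5001)))
    assert abs(best - math.exp(-lam * lam / 2.0)) < 1e-3
```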
grandes-ecoles 2022 Q43 Probability Inequality and Tail Bound Proof
Let $X_1, \ldots, X_n, Y_1, \ldots, Y_n$ be mutually independent random variables with the same distribution $\mathcal{R}$. We define the random vectors $X = \frac{1}{\sqrt{n}}\left(X_1, \ldots, X_n\right)^\top$ and $Y = \frac{1}{\sqrt{n}}\left(Y_1, \ldots, Y_n\right)^\top$ taking values in $\mathcal{M}_{n,1}(\mathbb{R})$.
Prove that, for every $\varepsilon > 0$, $$\mathbb{P}(|\langle X \mid Y \rangle| \geqslant \varepsilon) \leqslant 2\exp\left(-\frac{\varepsilon^2 n}{2}\right).$$
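A seeded Monte Carlo sanity check of the bound (the true probability is much smaller; the inequality is comfortably loose):

```python
import math
import random

random.seed(0)
n, eps, trials = 100, 0.3, 20000
bound = 2.0 * math.exp(-eps * eps * n / 2.0)   # about 0.0222

hits = 0
for _ in range(trials):
    # <X|Y> = (1/n) sum_i X_i Y_i with X_i, Y_i independent Rademacher signs
    inner = sum(random.choice((-1, 1)) * random.choice((-1, 1))
                for _ in range(n)) / n
    if abs(inner) >= eps:
        hits += 1
empirical = hits / trials
assert empirical <= bound
```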
grandes-ecoles 2022 Q44 Probability Inequality and Tail Bound Proof
Let $N$ be a non-zero natural number and let $\left(X_j^i\right)_{1 \leqslant i \leqslant N, 1 \leqslant j \leqslant n}$ be a family of $nN$ mutually independent real random variables with the same distribution $\mathcal{R}$. For every $i \in \llbracket 1, N \rrbracket$, we set $X^i = \frac{1}{\sqrt{n}}\left(X_1^i, \ldots, X_n^i\right)^\top$.
Deduce from the previous questions that $$\mathbb{P}\left(\bigcup_{1 \leqslant i < j \leqslant N} \left\{\left|\left\langle X^i \mid X^j \right\rangle\right| \geqslant \varepsilon\right\}\right) \leqslant N(N-1)\exp\left(-\frac{\varepsilon^2 n}{2}\right).$$
grandes-ecoles 2022 Q45 Probability Inequality and Tail Bound Proof
Let $N$ be a non-zero natural number and let $\left(X_j^i\right)_{1 \leqslant i \leqslant N, 1 \leqslant j \leqslant n}$ be a family of $nN$ mutually independent real random variables with the same distribution $\mathcal{R}$. For every $i \in \llbracket 1, N \rrbracket$, we set $X^i = \frac{1}{\sqrt{n}}\left(X_1^i, \ldots, X_n^i\right)^\top$.
We assume that $n \geqslant \frac{4\ln N}{\varepsilon^2}$. Prove that $$\mathbb{P}\left(\bigcup_{1 \leqslant i < j \leqslant N} \left\{\left|\left\langle X^i \mid X^j \right\rangle\right| \geqslant \varepsilon\right\}\right) < 1.$$
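For the record, the arithmetic behind the deduction (granting the bound from the previous question and $N \geqslant 1$):

```latex
\[
n \geqslant \frac{4\ln N}{\varepsilon^2}
\;\Longrightarrow\;
\exp\!\Big(-\frac{\varepsilon^2 n}{2}\Big) \leqslant \mathrm{e}^{-2\ln N} = \frac{1}{N^2},
\qquad\text{so}\qquad
N(N-1)\exp\!\Big(-\frac{\varepsilon^2 n}{2}\Big) \leqslant 1 - \frac{1}{N} < 1.
\]
```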
grandes-ecoles 2022 Q46 Probability Inequality and Tail Bound Proof
Let $N$ be a non-zero natural number and let $\left(X_j^i\right)_{1 \leqslant i \leqslant N, 1 \leqslant j \leqslant n}$ be a family of $nN$ mutually independent real random variables with the same distribution $\mathcal{R}$. For every $i \in \llbracket 1, N \rrbracket$, we set $X^i = \frac{1}{\sqrt{n}}\left(X_1^i, \ldots, X_n^i\right)^\top$.
Deduce that, for every natural number $N$ less than or equal to $\exp\left(\frac{\varepsilon^2 n}{4}\right)$, there exists a family of $N$ unit vectors of $\mathbb{R}^n$ whose coherence parameter is bounded by $\varepsilon$.
grandes-ecoles 2022 Q46 Verification of Probability Measure or Inner Product Properties
For every integer $p \in \mathbb{N}^*$ and every $x > 0$, we set $P_p(x) = x\mathrm{e}^x g_p^{(p)}(x)$, where $g_p(x) = x^{p-1}\mathrm{e}^{-x}$. We recall that $P_p$ is a polynomial function of degree $p$, that $P_p \in E$, and that $P_p$ is an eigenvector of $U$ for the eigenvalue $1/p$. The inner product on $E$ is $\langle f \mid g \rangle = \int_0^{+\infty} f(t)g(t)\frac{\mathrm{e}^{-t}}{t}\,\mathrm{d}t$. Show that the polynomials $P_p$ are pairwise orthogonal in $E$.
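Writing $g_p^{(p)}(x) = q_p(x)\mathrm{e}^{-x}$ for a polynomial $q_p$ obtained by iterating $q \mapsto q' - q$ gives $P_p = x\,q_p$, so each inner product reduces to $\int_0^{+\infty} t\,q_p(t)q_r(t)\mathrm{e}^{-t}\,\mathrm{d}t$, computable exactly from $\int_0^{+\infty} t^k \mathrm{e}^{-t}\,\mathrm{d}t = k!$. A small exact check in integer arithmetic (our own reformulation of the statement):

```python
from math import factorial

def diff(poly):
    # derivative of a polynomial stored as coefficients [c0, c1, ...]
    return [k * c for k, c in enumerate(poly)][1:] or [0]

def q_poly(p):
    # g_p(x) = x^(p-1) e^{-x}; differentiating maps q e^{-x} to (q' - q) e^{-x},
    # so g_p^{(p)} = q_p e^{-x} with q_p given by p iterations of q -> q' - q.
    q = [0] * (p - 1) + [1]
    for _ in range(p):
        d = diff(q)
        d = d + [0] * (len(q) - len(d))
        q = [a - b for a, b in zip(d, q)]
    return q

def inner(p, r):
    # <P_p | P_r> = int_0^inf (t q_p)(t q_r) e^{-t}/t dt = sum_k c_k (k+1)!
    # where c is the coefficient list of q_p * q_r.
    a, b = q_poly(p), q_poly(r)
    c = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] += ai * bj
    return sum(ck * factorial(k + 1) for k, ck in enumerate(c))

assert q_poly(1) == [-1] and q_poly(2) == [-2, 1]   # P_1 = -x, P_2 = x^2 - 2x
assert all(inner(p, r) == 0 for p in range(1, 6) for r in range(1, 6) if p != r)
```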
grandes-ecoles 2023 Q4 Entropy, Information, or Log-Sobolev Functional Analysis
Let $X$ be a finite set and $p = (p_x)_{x \in X}$ a probability distribution on $X$. We assume that $p$ assigns positive mass to every point of $X$: $p_x > 0$ for all $x \in X$. We call entropy of $p$ the quantity $$H(p) = -\sum_{x \in X} p_x \ln(p_x)$$ We consider the set $Q_X = \{\boldsymbol{q} = (q_x)_{x \in X} \in \mathbb{R}^X \mid \forall x \in X, q_x \geq 0\}$. For all $\boldsymbol{q}, \boldsymbol{q}' \in Q_X$ such that $q_x' > 0$ for all $x \in X$, we define: $$\mathrm{KL}(\boldsymbol{q}, \boldsymbol{q}') = \sum_{x \in X} \varphi(q_x / q_x') q_x'$$ with $\varphi : \mathbb{R}_+ \rightarrow \mathbb{R}$ defined by $\varphi(x) = x \ln(x) - x + 1$ for $x > 0$ and extended to 0 by continuity.
(a) Specify $\varphi(0)$.
(b) Verify that $\varphi$ is continuous, strictly convex, nonnegative, and that $\varphi(x) = 0$ if and only if $x = 1$.
(c) Show that $Q_X$ is convex and that $\boldsymbol{q} \mapsto \mathrm{KL}(\boldsymbol{q}, \boldsymbol{q}')$ is strictly convex, nonnegative, and vanishes if and only if $\boldsymbol{q} = \boldsymbol{q}'$.
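A quick numerical illustration of (c), using the continuous extension $\varphi(0) = 1$ from (a); when both arguments are probability vectors, $\mathrm{KL}$ reduces to the usual relative entropy $\sum_x q_x \ln(q_x/q_x')$ (our restatement):

```python
import math

def phi(x):
    # phi(x) = x ln x - x + 1, extended by continuity: phi(0) = 1
    return 1.0 if x == 0.0 else x * math.log(x) - x + 1.0

def kl(q, qp):
    # KL(q, q') = sum_x phi(q_x / q'_x) q'_x, assuming q'_x > 0 everywhere
    return sum(phi(a / b) * b for a, b in zip(q, qp))

p = [0.2, 0.3, 0.5]
q = [0.1, 0.6, 0.3]
assert kl(p, p) == 0.0                      # vanishes when q = q'
assert kl(q, p) > 0.0
classic = sum(a * math.log(a / b) for a, b in zip(q, p))
assert abs(kl(q, p) - classic) < 1e-12      # relative-entropy form for prob. vectors
```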
grandes-ecoles 2023 Q7 Verification of Probability Measure or Inner Product Properties
We consider $\alpha = (\alpha_i)_{i \in I} \in (\mathbb{R}_+^*)^I$ and $\beta = (\beta_j)_{j \in J} \in (\mathbb{R}_+^*)^J$ such that $\sum_{i \in I} \alpha_i = \sum_{j \in J} \beta_j = 1$. We denote $$Q = \left\{(q_{ij})_{(i,j) \in I \times J} \in \mathbb{R}^{I \times J} \mid q_{ij} \geq 0 \text{ for all } (i,j) \in I \times J\right\}$$ and $$F(\alpha, \beta) = \left\{q \in Q \mid \sum_{j' \in J} q_{ij'} = \alpha_i \text{ and } \sum_{i' \in I} q_{i'j} = \beta_j \text{ for all } (i,j) \in I \times J\right\}.$$ Verify that $F(\alpha, \beta)$ is a convex set of the vector space $E = \mathbb{R}^{I \times J}$.
grandes-ecoles 2023 Q8 Conditional Probability and Total Probability with Tree/Bayes Structure
We consider $\alpha = (\alpha_i)_{i \in I} \in (\mathbb{R}_+^*)^I$ and $\beta = (\beta_j)_{j \in J} \in (\mathbb{R}_+^*)^J$ such that $\sum_{i \in I} \alpha_i = \sum_{j \in J} \beta_j = 1$. We denote $$F(\alpha, \beta) = \left\{q \in Q \mid \sum_{j' \in J} q_{ij'} = \alpha_i \text{ and } \sum_{i' \in I} q_{i'j} = \beta_j \text{ for all } (i,j) \in I \times J\right\}.$$ We denote by $\boldsymbol{p}$ the element of $F(\alpha, \beta)$ defined by $p_{ij} = \alpha_i \beta_j > 0$ for all $(i,j) \in I \times J$. Let $X_1$ and $X_2$ be two random variables such that $X_1$ takes values in $I$ and $X_2$ takes values in $J$.
(a) Verify that if $\boldsymbol{q} \in F(\alpha, \beta)$, then $\sum_{i \in I} \sum_{j \in J} q_{ij} = 1$.
(b) Assume that $P(X_1 = i, X_2 = j) = q_{ij}$ with $q \in F(\alpha, \beta)$. Calculate the distribution of $X_1$ and that of $X_2$ in terms of $\alpha$ and $\beta$.
(c) What can we say about $X_1$ and $X_2$ when $\boldsymbol{q} = \boldsymbol{p}$?
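For (a), (b) and (c), a minimal numerical illustration (the sets $I$, $J$ and the values below are arbitrary):

```python
alpha = [0.5, 0.3, 0.2]                       # law of X1 on I = {1, 2, 3}
beta = [0.25, 0.75]                           # law of X2 on J = {1, 2}
p = [[a * b for b in beta] for a in alpha]    # product coupling p_ij = alpha_i beta_j

rows = [sum(row) for row in p]                               # law of X1
cols = [sum(p[i][j] for i in range(3)) for j in range(2)]    # law of X2
assert all(abs(r - a) < 1e-12 for r, a in zip(rows, alpha))
assert all(abs(c - b) < 1e-12 for c, b in zip(cols, beta))
assert abs(sum(rows) - 1.0) < 1e-12           # part (a): total mass 1
# When q = p, P(X1=i, X2=j) = P(X1=i) P(X2=j): X1 and X2 are independent.
```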
grandes-ecoles 2023 Q10 Probability Inequality and Tail Bound Proof
Let $x > 0$. By writing that $\varphi(t) \leqslant \frac{t}{x}\varphi(t)$ for all $t \geqslant x$, show that $$\int_x^{+\infty} \varphi(t)\,\mathrm{d}t \leqslant \frac{\varphi(x)}{x},$$ where $\varphi(x) = \frac{1}{\sqrt{2\pi}}\mathrm{e}^{-x^2/2}$.
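This is the classical Mills-ratio bound on the Gaussian tail; since the exact tail equals $\frac{1}{2}\operatorname{erfc}(x/\sqrt{2})$, it can be sanity-checked numerically:

```python
import math

def gauss_density(x):
    # phi(x) = exp(-x^2/2) / sqrt(2 pi)
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def gauss_tail(x):
    # exact value of int_x^inf gauss_density(t) dt
    return 0.5 * math.erfc(x / math.sqrt(2.0))

for x in (0.1, 0.5, 1.0, 2.0, 4.0):
    assert gauss_tail(x) <= gauss_density(x) / x
```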
grandes-ecoles 2023 Q11 Integrability, Boundedness, and Regularity of Density/Distribution-Related Functions
For $\mu > 0$ and $\varphi \in \mathcal{C}_{c}(\mathbb{R})$, we define $T_{\mu} : \varphi \mapsto T_{\mu}\varphi$, where for all $x \in \mathbb{R}$,
$$T_{\mu}\varphi(x) = \frac{1}{2\mu} \int_{x-\mu}^{x+\mu} \varphi(t)\, dt$$
Show that $T_{\mu}$ is a linear map, which sends the space $\mathcal{C}_{c}(\mathbb{R})$ into itself, and that for all $\varphi \in \mathcal{C}_{c}(\mathbb{R})$ we have $\|T_{\mu}\varphi\|_{\infty} \leqslant \|\varphi\|_{\infty}$.
grandes-ecoles 2023 Q11 Entropy, Information, or Log-Sobolev Functional Analysis
We consider $\alpha = (\alpha_i)_{i \in I} \in (\mathbb{R}_+^*)^I$ and $\beta = (\beta_j)_{j \in J} \in (\mathbb{R}_+^*)^J$ such that $\sum_{i \in I} \alpha_i = \sum_{j \in J} \beta_j = 1$. We denote by $\boldsymbol{p}$ the element of $F(\alpha, \beta)$ defined by $p_{ij} = \alpha_i \beta_j > 0$ for all $(i,j) \in I \times J$. Let $C = (C_{ij})_{(i,j) \in I \times J} \in \mathbb{R}_+^{I \times J}$ and $\epsilon > 0$. We consider $J_\epsilon : Q \rightarrow \mathbb{R}$ defined by $$J_\epsilon(\boldsymbol{q}) = \sum_{ij} q_{ij} C_{ij} + \epsilon \operatorname{KL}(\boldsymbol{q}, \boldsymbol{p})$$ and $\boldsymbol{q}(\epsilon)$ the unique minimizer of $J_\epsilon$ on $F(\alpha, \beta)$.
(a) Verify that $q(\epsilon)_{ij} > 0$ for all $(i,j) \in I \times J$. (Hint: one may argue by contradiction, considering $\boldsymbol{q}(\epsilon, t) = (1-t)\boldsymbol{q}(\epsilon) + t\boldsymbol{p}$ for $t \in \,]0,1[$ and examining the behavior of $\varphi$ near $0$.)
(b) Show that this is no longer true if we assume that $\epsilon = 0$.
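For context (our identification, not stated in the excerpt): $J_\epsilon$ is the entropic optimal-transport objective, and first-order conditions give $q(\epsilon)_{ij} = u_i\, p_{ij}\, \mathrm{e}^{-C_{ij}/\epsilon}\, v_j$ with positive scalings $u, v$ fixed by the marginal constraints, which already suggests (a). The scalings can be computed by Sinkhorn iterations; a small sketch with arbitrary data:

```python
import math

alpha = [0.5, 0.5]
beta = [0.3, 0.7]
C = [[0.0, 1.0], [1.0, 0.0]]
eps = 1.0

# kernel K_ij = p_ij exp(-C_ij / eps), with p the product coupling
K = [[alpha[i] * beta[j] * math.exp(-C[i][j] / eps) for j in range(2)]
     for i in range(2)]

u, v = [1.0, 1.0], [1.0, 1.0]
for _ in range(300):   # alternate scaling to match row then column marginals
    u = [alpha[i] / sum(K[i][j] * v[j] for j in range(2)) for i in range(2)]
    v = [beta[j] / sum(K[i][j] * u[i] for i in range(2)) for j in range(2)]

q = [[u[i] * K[i][j] * v[j] for j in range(2)] for i in range(2)]
row = [sum(q[i]) for i in range(2)]
col = [sum(q[i][j] for i in range(2)) for j in range(2)]
assert all(abs(row[i] - alpha[i]) < 1e-9 for i in range(2))
assert all(abs(col[j] - beta[j]) < 1e-9 for j in range(2))
assert min(min(r) for r in q) > 0.0   # every entry strictly positive, as in (a)
```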