grandes-ecoles

Papers (191)
2025
centrale-maths1__official 40 centrale-maths2__official 42 mines-ponts-maths1__mp 20 mines-ponts-maths1__pc 21 mines-ponts-maths1__psi 21 mines-ponts-maths2__mp 28 mines-ponts-maths2__pc 24 mines-ponts-maths2__psi 26 polytechnique-maths-a__mp 27 polytechnique-maths__fui 16 polytechnique-maths__pc 27 x-ens-maths-a__mp 18 x-ens-maths-c__mp 9 x-ens-maths-d__mp 38 x-ens-maths__pc 27 x-ens-maths__psi 38
2024
centrale-maths1__official 28 centrale-maths2__official 29 geipi-polytech__maths 9 mines-ponts-maths1__mp 25 mines-ponts-maths1__pc 20 mines-ponts-maths1__psi 19 mines-ponts-maths2__mp 23 mines-ponts-maths2__pc 21 mines-ponts-maths2__psi 21 polytechnique-maths-a__mp 44 polytechnique-maths-b__mp 37 x-ens-maths-a__mp 43 x-ens-maths-b__mp 35 x-ens-maths-c__mp 22 x-ens-maths-d__mp 45 x-ens-maths__pc 24 x-ens-maths__psi 26
2023
centrale-maths1__official 44 centrale-maths2__official 33 e3a-polytech-maths__mp 4 mines-ponts-maths1__mp 15 mines-ponts-maths1__pc 23 mines-ponts-maths1__psi 23 mines-ponts-maths2__mp 22 mines-ponts-maths2__pc 18 mines-ponts-maths2__psi 22 polytechnique-maths__fui 23 x-ens-maths-a__mp 25 x-ens-maths-b__mp 24 x-ens-maths-c__mp 20 x-ens-maths-d__mp 20 x-ens-maths__pc 18 x-ens-maths__psi 15
2022
centrale-maths1__mp 48 centrale-maths1__official 48 centrale-maths1__pc 37 centrale-maths1__psi 43 centrale-maths2__mp 32 centrale-maths2__official 32 centrale-maths2__pc 39 centrale-maths2__psi 45 mines-ponts-maths1__mp 25 mines-ponts-maths1__pc 24 mines-ponts-maths1__psi 24 mines-ponts-maths2__mp 24 mines-ponts-maths2__pc 19 mines-ponts-maths2__psi 20 x-ens-maths-a__mp 13 x-ens-maths-b__mp 40 x-ens-maths-c__mp 27 x-ens-maths-d__mp 46 x-ens-maths1__mp 13 x-ens-maths2__mp 40 x-ens-maths__pc 15 x-ens-maths__pc_cpge 15 x-ens-maths__psi 22 x-ens-maths__psi_cpge 23
2021
centrale-maths1__mp 40 centrale-maths1__official 40 centrale-maths1__pc 36 centrale-maths1__psi 29 centrale-maths2__mp 30 centrale-maths2__official 29 centrale-maths2__pc 38 centrale-maths2__psi 37 x-ens-maths2__mp 39 x-ens-maths__pc 44
2020
centrale-maths1__mp 42 centrale-maths1__official 42 centrale-maths1__pc 36 centrale-maths1__psi 40 centrale-maths2__mp 38 centrale-maths2__official 38 centrale-maths2__pc 40 centrale-maths2__psi 39 mines-ponts-maths1__mp_cpge 24 mines-ponts-maths2__mp_cpge 21 x-ens-maths-a__mp_cpge 18 x-ens-maths-b__mp_cpge 20 x-ens-maths-d__mp 14 x-ens-maths1__mp 18 x-ens-maths2__mp 20 x-ens-maths__pc 18
2019
centrale-maths1__mp 37 centrale-maths1__official 37 centrale-maths1__pc 40 centrale-maths1__psi 39 centrale-maths2__mp 37 centrale-maths2__official 37 centrale-maths2__pc 39 centrale-maths2__psi 49 x-ens-maths1__mp 24 x-ens-maths__pc 18 x-ens-maths__psi 26
2018
centrale-maths1__mp 47 centrale-maths1__official 47 centrale-maths1__pc 41 centrale-maths1__psi 44 centrale-maths2__mp 44 centrale-maths2__official 44 centrale-maths2__pc 35 centrale-maths2__psi 38 x-ens-maths1__mp 19 x-ens-maths2__mp 17 x-ens-maths__pc 22 x-ens-maths__psi 24
2017
centrale-maths1__mp 45 centrale-maths1__official 45 centrale-maths1__pc 22 centrale-maths1__psi 17 centrale-maths2__mp 30 centrale-maths2__official 30 centrale-maths2__pc 28 centrale-maths2__psi 44 x-ens-maths1__mp 26 x-ens-maths2__mp 16 x-ens-maths__pc 18 x-ens-maths__psi 26
2016
centrale-maths1__mp 42 centrale-maths1__pc 31 centrale-maths1__psi 33 centrale-maths2__mp 25 centrale-maths2__pc 47 centrale-maths2__psi 27 x-ens-maths1__mp 18 x-ens-maths2__mp 46 x-ens-maths__pc 15 x-ens-maths__psi 20
2015
centrale-maths1__mp 42 centrale-maths1__pc 18 centrale-maths1__psi 42 centrale-maths2__mp 44 centrale-maths2__pc 18 centrale-maths2__psi 33 x-ens-maths1__mp 16 x-ens-maths2__mp 31 x-ens-maths__pc 30 x-ens-maths__psi 22
2014
centrale-maths1__mp 28 centrale-maths1__pc 26 centrale-maths1__psi 27 centrale-maths2__mp 24 centrale-maths2__pc 26 centrale-maths2__psi 27 x-ens-maths1__mp 9 x-ens-maths2__mp 16 x-ens-maths__pc 4 x-ens-maths__psi 24
2013
centrale-maths1__mp 22 centrale-maths1__pc 45 centrale-maths1__psi 29 centrale-maths2__mp 31 centrale-maths2__pc 52 centrale-maths2__psi 32 x-ens-maths1__mp 24 x-ens-maths2__mp 35 x-ens-maths__pc 22 x-ens-maths__psi 9
2012
centrale-maths1__mp 36 centrale-maths1__pc 28 centrale-maths1__psi 33 centrale-maths2__mp 27 centrale-maths2__psi 18
2011
centrale-maths1__mp 27 centrale-maths1__pc 17 centrale-maths1__psi 24 centrale-maths2__mp 29 centrale-maths2__pc 17 centrale-maths2__psi 10
2010
centrale-maths1__mp 19 centrale-maths1__pc 30 centrale-maths1__psi 13 centrale-maths2__mp 32 centrale-maths2__pc 37 centrale-maths2__psi 27
2022 centrale-maths1__pc

37 maths questions

Q1 Matrices Diagonalizability and Similarity
Prove that a matrix $A \in \mathcal{M}_{n}(\mathbb{R})$ is orthogonally diagonalizable if and only if it is symmetric.
Q2 Invariant Lines, Eigenvalues, and Eigenvectors Compute eigenvectors or eigenspaces
We set $A_1 = \left(\begin{array}{ccc} 3 & -2 & 4 \\ -2 & 6 & 2 \\ 4 & 2 & 3 \end{array}\right)$.
By observing the first and last column of $A_1$, determine an eigenvector of $A_1$ and the associated eigenvalue $\lambda_1$.
Q3 Invariant Lines, Eigenvalues, and Eigenvectors Compute eigenvalues of a given matrix
We set $A_1 = \left(\begin{array}{ccc} 3 & -2 & 4 \\ -2 & 6 & 2 \\ 4 & 2 & 3 \end{array}\right)$.
Determine the eigenspace of $A_1$ associated with the eigenvalue $\lambda_1$ and deduce the spectrum of $A_1$.
Q4 Invariant Lines, Eigenvalues, and Eigenvectors Diagonalize a matrix explicitly
We set $A_1 = \left(\begin{array}{ccc} 3 & -2 & 4 \\ -2 & 6 & 2 \\ 4 & 2 & 3 \end{array}\right)$.
Orthogonally diagonalize $A_1$.
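As a quick sanity check on Q2–Q4 (not part of the paper), the eigen-structure of $A_1$ can be verified numerically; the sketch below assumes numpy is available.

```python
import numpy as np

# The matrix A1 from the statements of Q2-Q4.
A1 = np.array([[3.0, -2.0, 4.0],
               [-2.0, 6.0, 2.0],
               [4.0, 2.0, 3.0]])

# The first and last columns of A1 sum to (7, 0, 7) = 7 * (1, 0, 1),
# so v = (1, 0, 1) is an eigenvector for the eigenvalue lambda_1 = 7.
v = np.array([1.0, 0.0, 1.0])
assert np.allclose(A1 @ v, 7 * v)

# A1 is symmetric, so eigh orthogonally diagonalizes it: A1 = P diag(w) P^T
# with P orthogonal and w sorted in ascending order.
w, P = np.linalg.eigh(A1)
print(np.round(w, 6))   # spectrum: -2, and 7 with multiplicity 2
assert np.allclose(P @ np.diag(w) @ P.T, A1)
assert np.allclose(P.T @ P, np.eye(3))
```

The spectrum $\{7, 7, -2\}$ and the eigenvector $(1, 0, 1)^\top$ match what Q2–Q4 ask to derive by hand.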
Q6 Proof Computation of a Limit, Value, or Explicit Formula
Write the matrix $H$ of the inner product $\phi(P,Q) = \int_0^1 P(t)Q(t)\,\mathrm{d}t$ in the canonical basis of $\mathbb{R}_{n-1}[X]$, that is, the matrix with general term $h_{i,j} = \phi\left(X^i, X^j\right)$ where the indices $i$ and $j$ vary between 0 and $n-1$.
Q7 Proof Direct Proof of a Stated Identity or Equality
Let $H$ be the matrix of the inner product $\phi(P,Q) = \int_0^1 P(t)Q(t)\,\mathrm{d}t$ in the canonical basis of $\mathbb{R}_{n-1}[X]$, with general term $h_{i,j} = \phi(X^i, X^j)$. Let $U \in \mathcal{M}_{n,1}(\mathbb{R})$. Express the product $U^\top H U$ using $\phi$ and the coefficients of $U$.
Q8 Matrices Eigenvalue and Characteristic Polynomial Analysis
Let $H$ be the matrix of the inner product $\phi(P,Q) = \int_0^1 P(t)Q(t)\,\mathrm{d}t$ in the canonical basis of $\mathbb{R}_{n-1}[X]$, with general term $h_{i,j} = \phi(X^i, X^j)$. Show that $H$ belongs to $\mathcal{S}_n(\mathbb{R})$ and that its eigenvalues are strictly positive.
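Concretely, $h_{i,j} = \int_0^1 t^{i+j}\,\mathrm{d}t = \frac{1}{i+j+1}$, so $H$ is the classical Hilbert matrix. A short numeric illustration of Q8's conclusion (numpy assumed; $n = 4$ is an arbitrary illustrative choice):

```python
import numpy as np

n = 4  # illustrative size, not fixed by the paper
# h_{i,j} = integral of t^(i+j) over [0, 1] = 1/(i+j+1): the Hilbert matrix.
H = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])

assert np.allclose(H, H.T)       # H is symmetric: H belongs to S_n(R)
eigvals = np.linalg.eigvalsh(H)
print(eigvals)                   # all strictly positive (tiny for larger n)
assert np.all(eigvals > 0)
```

Positivity reflects the proof idea of Q7–Q8: $U^\top H U = \int_0^1 \left(\sum_i u_i t^i\right)^2 \mathrm{d}t > 0$ for $U \neq 0$.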
Q9 Matrices Eigenvalue and Characteristic Polynomial Analysis
Show that, if $A$ is nilpotent, that is, if there exists $p \in \mathbb{N}^\star$ such that $A^p = 0_n$, then the spectral radius of $A$ is zero.
Q10 Matrices Matrix Norm, Convergence, and Inequality
We denote $C = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \right\}$. Prove that $C$ is a closed subset of $\mathcal{M}_{n,1}(\mathbb{R})$.
Q11 Matrices Matrix Norm, Convergence, and Inequality
We denote $C = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \right\}$. Deduce that the map $U \mapsto \left| U^\top A U \right|$ admits a maximum on $C$.
Q12 Matrices Eigenvalue and Characteristic Polynomial Analysis
We denote $C = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \right\}$. Show that $\rho(A) \leqslant \max_{U \in C} \left| U^\top A U \right|$.
Q13 Matrices Eigenvalue and Characteristic Polynomial Analysis
Let $A \in \mathcal{S}_n(\mathbb{R})$ and $C = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \right\}$. Prove that $\rho(A) = \max_{U \in C} \left| U^\top A U \right|$.
Q14 Matrices Eigenvalue and Characteristic Polynomial Analysis
Let $A \in \mathcal{S}_n(\mathbb{R})$ and $C = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \right\}$. We further assume that the eigenvalues of $A$ are all positive. Show then that $\rho(A) = \max_{U \in C} \left( U^\top A U \right)$.
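The chain Q10–Q14 can be probed numerically on the symmetric matrix $A_1$ from Q2 (numpy assumed; the random sampling below is an illustration, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[3.0, -2.0, 4.0],
              [-2.0, 6.0, 2.0],
              [4.0, 2.0, 3.0]])     # symmetric; spectrum {7, 7, -2}

w, P = np.linalg.eigh(A)
rho = np.max(np.abs(w))             # spectral radius = 7

# Random unit vectors never exceed rho ...
U = rng.normal(size=(3, 1000))
U /= np.linalg.norm(U, axis=0)
vals = np.abs(np.einsum('ij,ik,kj->j', U, A, U))   # |U^T A U|, column by column
assert vals.max() <= rho + 1e-9

# ... and a unit eigenvector for the eigenvalue of largest modulus attains it.
u0 = P[:, [np.argmax(np.abs(w))]]
print(abs((u0.T @ A @ u0).item()))  # ≈ 7 = rho(A)
```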
Q15 Proof Proof That a Map Has a Specific Property
Let $A \in \mathcal{S}_n(\mathbb{R})$ and $C = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \right\}$. Prove that the map $\rho$ defines a norm on $\mathcal{S}_n(\mathbb{R})$.
Q16 Discrete Random Variables Covariance Matrix and Multivariate Expectation
We consider $n$ discrete random variables $Y_1, \ldots, Y_n$ defined on $(\Omega, \mathcal{B}, \mathbb{P})$ with real values and define the random vector $Y(\omega) = \left(\begin{array}{c} Y_1(\omega) \\ \vdots \\ Y_n(\omega) \end{array}\right)$. The covariance matrix $\Sigma_Y$ has general term $\sigma_{i,j} = \operatorname{cov}(Y_i, Y_j)$.
Verify that $\Sigma_Y$ is a symmetric matrix, that $$\Sigma_Y = \mathbb{E}\left((Y - \mathbb{E}(Y))(Y - \mathbb{E}(Y))^\top\right)$$ and that, if $U$ is a constant vector in $\mathcal{M}_{n,1}(\mathbb{R})$, then $$\Sigma_{Y+U} = \Sigma_Y.$$
Q17 Discrete Random Variables Covariance Matrix and Multivariate Expectation
We consider $n$ discrete random variables $Y_1, \ldots, Y_n$ with random vector $Y$ and covariance matrix $\Sigma_Y$. Let $p \in \mathbb{N}^*$ and $M \in \mathcal{M}_{p,n}(\mathbb{R})$. We define the discrete random variable $Z = MY$, with values in $\mathcal{M}_{p,1}(\mathbb{R})$. Justify that $Z$ admits an expectation and express $\mathbb{E}(Z)$ in terms of $\mathbb{E}(Y)$. Show that $Z$ admits a covariance matrix $\Sigma_Z$ and that $$\Sigma_Z = M \Sigma_Y M^\top.$$
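The identity $\Sigma_Z = M \Sigma_Y M^\top$ also holds for sample covariances, which gives a cheap empirical check (numpy assumed; the values taken by $Y$ and the matrix $M$ below are arbitrary choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# A discrete random vector Y (3 coordinates, 100k samples) and Z = M Y.
Y = rng.choice([-1.0, 0.0, 2.0], size=(3, 100_000))
M = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])    # p = 2, n = 3

Sigma_Y = np.cov(Y)                 # rows of the input are the variables
Sigma_Z = np.cov(M @ Y)             # covariance of Z = M Y
assert np.allclose(Sigma_Z, M @ Sigma_Y @ M.T)
print(Sigma_Z.shape)                # (2, 2)
```

The assertion holds up to floating point because sample covariance is bilinear in exactly the way the exact covariance is.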
Q18 Matrices Matrix Decomposition and Factorization
We consider $n$ discrete random variables $Y_1, \ldots, Y_n$ with random vector $Y$ and covariance matrix $\Sigma_Y$. We denote by $P$ the change of basis matrix from the canonical basis of $\mathcal{M}_{n,1}(\mathbb{R})$ to an orthonormal basis formed by eigenvectors of $\Sigma_Y$. We define the discrete random variable $X = P^\top Y = \left(\begin{array}{c} X_1 \\ \vdots \\ X_n \end{array}\right)$.
Prove that $\Sigma_X$ is a diagonal matrix.
Q19 Discrete Random Variables Covariance Matrix and Multivariate Expectation
We consider $n$ discrete random variables $Y_1, \ldots, Y_n$ with random vector $Y$ and covariance matrix $\Sigma_Y$. We denote by $P$ the change of basis matrix from the canonical basis of $\mathcal{M}_{n,1}(\mathbb{R})$ to an orthonormal basis formed by eigenvectors of $\Sigma_Y$. We define the discrete random variable $X = P^\top Y$, and $\Sigma_X$ is a diagonal matrix.
Deduce that the eigenvalues of $\Sigma_Y$ are all positive.
Q20 Discrete Random Variables Covariance Matrix and Multivariate Expectation
We consider $n$ discrete random variables $Y_1, \ldots, Y_n$ with random vector $Y$ and covariance matrix $\Sigma_Y$. We denote by $P$ the change of basis matrix from the canonical basis of $\mathcal{M}_{n,1}(\mathbb{R})$ to an orthonormal basis formed by eigenvectors of $\Sigma_Y$. We define the discrete random variable $X = P^\top Y = \left(\begin{array}{c} X_1 \\ \vdots \\ X_n \end{array}\right)$.
Prove that the total variance of $X$ is equal to that of $Y$.
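A small simulation illustrating Q18–Q20 together, with a sample covariance standing in for $\Sigma_Y$ (numpy assumed; the distribution of $Y$ is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(2)

# A discrete vector Y with correlated coordinates.
Y = rng.choice([-1.0, 1.0, 3.0], size=(3, 50_000))
Y[1] += 0.8 * Y[0]                       # introduce correlation

Sigma_Y = np.cov(Y)
w, P = np.linalg.eigh(Sigma_Y)           # orthonormal eigenbasis of Sigma_Y
X = P.T @ Y
Sigma_X = np.cov(X)

# Sigma_X = P^T Sigma_Y P is diagonal (Q18), and the total variance,
# i.e. the trace, is unchanged (Q20) since the matrices are similar.
assert np.allclose(Sigma_X - np.diag(np.diag(Sigma_X)), 0, atol=1e-8)
assert np.isclose(np.trace(Sigma_X), np.trace(Sigma_Y))
```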
Q21 Discrete Random Variables Covariance Matrix and Multivariate Expectation
Let $D = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$ be a diagonal matrix whose diagonal coefficients $\lambda_i$ are all positive. Prove the existence of a discrete random variable $Z$ with values in $\mathcal{M}_{n,1}(\mathbb{R})$ such that $\Sigma_Z = D$.
Q22 Discrete Random Variables Covariance Matrix and Multivariate Expectation
Let $A \in \mathcal{S}_n(\mathbb{R})$ be a symmetric matrix whose eigenvalues are positive. Prove the existence of a discrete random variable $Y$ with values in $\mathcal{M}_{n,1}(\mathbb{R})$ such that $\Sigma_Y = A$.
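One standard route through Q21–Q22 (a plausible construction, not necessarily the official solution) takes independent discrete variables $Z_i = \pm\sqrt{\lambda_i}$, each sign with probability $\tfrac12$, so that $\Sigma_Z = D$, then sets $Y = PZ$ where $A = PDP^\top$. A numeric sketch with an illustrative matrix $A$ (numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(3)

# A symmetric matrix with positive eigenvalues, chosen for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])
w, P = np.linalg.eigh(A)        # A = P diag(w) P^T, here w = (1, 1, 3)

# Q21 step: Z_i = ±sqrt(w_i) with probability 1/2 each, independently,
# so E(Z) = 0 and Sigma_Z = diag(w).
Z = np.sqrt(w)[:, None] * rng.choice([-1.0, 1.0], size=(3, 200_000))
# Q22 step: Y = P Z then has Sigma_Y = P diag(w) P^T = A.
Y = P @ Z
print(np.round(np.cov(Y), 2))   # ≈ A, up to sampling error
```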
Q23 Discrete Random Variables Covariance Matrix and Multivariate Expectation
We consider $n$ discrete random variables $Y_1, \ldots, Y_n$ with random vector $Y$ and covariance matrix $\Sigma_Y$. Let $U = \left(\begin{array}{c} u_1 \\ \vdots \\ u_n \end{array}\right)$ in $\mathcal{M}_{n,1}(\mathbb{R})$. We define the discrete random variable $X = U^\top Y$.
Show that $X$ admits a variance and that $$\mathbb{V}(X) = U^\top \Sigma_Y U.$$
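The computation expected in Q23 is a direct bilinearity argument: each $Y_i$ admits a variance, hence is square-integrable, so the linear combination $X = \sum_i u_i Y_i$ admits a variance as well, and

$$\mathbb{V}(X) = \operatorname{cov}\Big(\sum_{i=1}^n u_i Y_i, \sum_{j=1}^n u_j Y_j\Big) = \sum_{i=1}^n \sum_{j=1}^n u_i u_j \operatorname{cov}(Y_i, Y_j) = U^\top \Sigma_Y U.$$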
Q24 Discrete Random Variables Covariance Matrix and Multivariate Expectation
We consider $n$ discrete random variables $Y_1, \ldots, Y_n$ with random vector $Y$ and covariance matrix $\Sigma_Y$. The objective is to show that $$\mathbb{P}\left(Y - \mathbb{E}(Y) \in \operatorname{Im}\Sigma_Y\right) = 1.$$ We denote by $r$ the rank of the covariance matrix of $Y$.
Handle the case where $r = n$.
Q25 Matrices Projection and Orthogonality
We consider $n$ discrete random variables $Y_1, \ldots, Y_n$ with random vector $Y$ and covariance matrix $\Sigma_Y$. The objective is to show that $\mathbb{P}\left(Y - \mathbb{E}(Y) \in \operatorname{Im}\Sigma_Y\right) = 1$. We now assume $r < n$ where $r$ is the rank of $\Sigma_Y$.
Prove that the kernel and the image of $\Sigma_Y$ are orthogonal complements of each other in $\mathcal{M}_{n,1}(\mathbb{R})$.
Q26 Continuous Probability Distributions and Random Variables Expectation and Moment Inequality Proof
We consider $n$ discrete random variables $Y_1, \ldots, Y_n$ with random vector $Y$ and covariance matrix $\Sigma_Y$. We assume $r < n$ where $r$ is the rank of $\Sigma_Y$. We set $d = \dim \ker \Sigma_Y$ and consider an orthonormal basis $(V_1, \ldots, V_d)$ of $\ker \Sigma_Y$.
Prove that $$\forall j \in \llbracket 1, d \rrbracket, \quad \mathbb{V}\left(V_j^\top(Y - \mathbb{E}(Y))\right) = 0.$$
Q27 Continuous Probability Distributions and Random Variables Verification of Probability Measure or Inner Product Properties
We consider $n$ discrete random variables $Y_1, \ldots, Y_n$ with random vector $Y$ and covariance matrix $\Sigma_Y$. We assume $r < n$ where $r$ is the rank of $\Sigma_Y$. We set $d = \dim \ker \Sigma_Y$ and consider an orthonormal basis $(V_1, \ldots, V_d)$ of $\ker \Sigma_Y$.
Deduce that $\mathbb{P}\left(V_j^\top(Y - \mathbb{E}(Y)) = 0\right) = 1$.
Q28 Discrete Random Variables Covariance Matrix and Multivariate Expectation
We consider $n$ discrete random variables $Y_1, \ldots, Y_n$ with random vector $Y$ and covariance matrix $\Sigma_Y$. We assume $r < n$ where $r$ is the rank of $\Sigma_Y$. We set $d = \dim \ker \Sigma_Y$ and consider an orthonormal basis $(V_1, \ldots, V_d)$ of $\ker \Sigma_Y$, and we have shown that $\mathbb{P}\left(V_j^\top(Y - \mathbb{E}(Y)) = 0\right) = 1$ for all $j \in \llbracket 1, d \rrbracket$.
Conclude that $\mathbb{P}\left(Y - \mathbb{E}(Y) \in \operatorname{Im}\Sigma_Y\right) = 1$.
Q29 Discrete Random Variables Covariance Matrix and Multivariate Expectation
We set $A_2 = \operatorname{diag}(9, 5, 4)$. Justify the existence of a random vector whose covariance matrix is $A_2$.
Q30 Discrete Random Variables Covariance Matrix and Multivariate Expectation
We set $A_2 = \operatorname{diag}(9, 5, 4)$ and $C = \left\{ U \in \mathcal{M}_{3,1}(\mathbb{R}) \mid U^\top U = 1 \right\}$. In this question only, we assume that $Y$ is a random variable with values in $\mathcal{M}_{3,1}(\mathbb{R})$ such that $\Sigma_Y = A_2$. Determine the maximum of $q_Y$ on $C$, where $q_Y(U) = \mathbb{V}(U^\top Y)$.
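Since $A_2$ is diagonal, the maximum can be bounded directly (a sketch of the expected reasoning, using $q_Y(U) = U^\top \Sigma_Y U$ from Q23):

$$q_Y(U) = U^\top A_2 U = 9u_1^2 + 5u_2^2 + 4u_3^2 \leqslant 9\left(u_1^2 + u_2^2 + u_3^2\right) = 9,$$

with equality at $U = (1, 0, 0)^\top \in C$, so $\max_{U \in C} q_Y(U) = 9$, the largest eigenvalue of $A_2$.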
Q31 Discrete Random Variables Covariance Matrix and Multivariate Expectation
We consider $n$ discrete random variables $Y_1, \ldots, Y_n$ with random vector $Y$ and covariance matrix $\Sigma_Y$. Let $C = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \right\}$ and $q_Y(U) = \mathbb{V}(U^\top Y)$.
In the general case, prove that the function $q_Y$ admits a maximum on $C$. Specify the value of this maximum as well as a vector $U_0 \in C$ such that $$\max_{U \in C} \mathbb{V}\left(U^\top Y\right) = \mathbb{V}\left(U_0^\top Y\right).$$
Q32 Matrices Structured Matrix Characterization
We assume that $\Sigma_Y$ satisfies $$\forall i \in \llbracket 1, n \rrbracket, \quad \sigma_{i,i} = \sigma^2 \quad \text{and} \quad \forall (i,j) \in \llbracket 1, n \rrbracket^2, \quad i \neq j \Longrightarrow \sigma_{i,j} = \sigma^2 \gamma$$ where $\sigma$ and $\gamma$ are two strictly positive real numbers. We denote by $J \in \mathcal{M}_n(\mathbb{R})$ the matrix whose coefficients are all equal to 1.
Prove that $\gamma \leqslant 1$ and express $\Sigma_Y$ in terms of $J$.
Q33 Matrices Eigenvalue and Characteristic Polynomial Analysis
We assume that $\Sigma_Y$ satisfies $$\forall i \in \llbracket 1, n \rrbracket, \quad \sigma_{i,i} = \sigma^2 \quad \text{and} \quad \forall (i,j) \in \llbracket 1, n \rrbracket^2, \quad i \neq j \Longrightarrow \sigma_{i,j} = \sigma^2 \gamma$$ where $\sigma$ and $\gamma$ are two strictly positive real numbers. We denote by $J \in \mathcal{M}_n(\mathbb{R})$ the matrix whose coefficients are all equal to 1.
Determine the eigenvalues of $J$ and the dimension of each associated eigenspace. Also determine an eigenvector associated with its eigenvalue of maximal modulus.
Q34 Matrices Projection and Orthogonality
We assume that $\Sigma_Y$ satisfies $$\forall i \in \llbracket 1, n \rrbracket, \quad \sigma_{i,i} = \sigma^2 \quad \text{and} \quad \forall (i,j) \in \llbracket 1, n \rrbracket^2, \quad i \neq j \Longrightarrow \sigma_{i,j} = \sigma^2 \gamma$$ where $\sigma$ and $\gamma$ are two strictly positive real numbers. We denote by $J \in \mathcal{M}_n(\mathbb{R})$ the matrix whose coefficients are all equal to 1.
Specify a unit vector $U_0$ such that the variance of $Z = U_0^\top Y$ is maximal.
Q35 Matrices Eigenvalue and Characteristic Polynomial Analysis
We assume that $\Sigma_Y$ satisfies $$\forall i \in \llbracket 1, n \rrbracket, \quad \sigma_{i,i} = \sigma^2 \quad \text{and} \quad \forall (i,j) \in \llbracket 1, n \rrbracket^2, \quad i \neq j \Longrightarrow \sigma_{i,j} = \sigma^2 \gamma$$ where $\sigma$ and $\gamma$ are two strictly positive real numbers. We denote by $J \in \mathcal{M}_n(\mathbb{R})$ the matrix whose coefficients are all equal to 1, and $U_0$ is a unit vector such that the variance of $Z = U_0^\top Y$ is maximal.
Calculate the percentage of total variance represented by $Z$, that is, the ratio $\dfrac{\mathbb{V}(Z)}{\mathbb{V}_T(Y)}$.
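The structure $\Sigma_Y = \sigma^2\left((1-\gamma) I_n + \gamma J\right)$ makes Q33–Q35 easy to check numerically (numpy assumed; the values of $n$, $\sigma^2$ and $\gamma$ below are arbitrary illustrative choices):

```python
import numpy as np

# Illustrative parameters, not fixed by the paper.
n, sigma2, gamma = 5, 2.0, 0.3

J = np.ones((n, n))
Sigma_Y = sigma2 * ((1.0 - gamma) * np.eye(n) + gamma * J)

w, P = np.linalg.eigh(Sigma_Y)
# Spectrum: sigma^2 (1 - gamma) with multiplicity n - 1, plus the top
# eigenvalue sigma^2 (1 + (n-1) gamma), carried by U0 = (1, ..., 1)/sqrt(n).
u0 = np.ones((n, 1)) / np.sqrt(n)
assert np.isclose(w[-1], sigma2 * (1 + (n - 1) * gamma))
assert np.isclose((u0.T @ Sigma_Y @ u0).item(), w[-1])

# Q35 ratio V(Z)/V_T(Y) = (1 + (n-1) gamma)/n, here (1 + 4*0.3)/5.
print(round(w[-1] / np.trace(Sigma_Y), 6))   # 0.44
```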
Q36 Continuous Probability Distributions and Random Variables Expectation and Moment Inequality Proof
We assume that $\Sigma_Y$ has $n$ distinct eigenvalues which we order in strictly decreasing order $\lambda_1 > \cdots > \lambda_n$. We fix a vector $U_0$ such that $\mathbb{V}\left(U_0^\top Y\right) = \max_{U \in C} \mathbb{V}\left(U^\top Y\right)$, where $C = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \right\}$. We denote $$C' = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \text{ and } U_0^\top U = 0 \right\}.$$
Justify that $q_Y$ admits a maximum on $C'$.
Q37 Continuous Probability Distributions and Random Variables Distribution of Transformed or Combined Random Variables
We assume that $\Sigma_Y$ has $n$ distinct eigenvalues which we order in strictly decreasing order $\lambda_1 > \cdots > \lambda_n$. We fix a vector $U_0$ such that $\mathbb{V}\left(U_0^\top Y\right) = \max_{U \in C} \mathbb{V}\left(U^\top Y\right)$, where $C = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \right\}$. We denote $$C' = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \text{ and } U_0^\top U = 0 \right\}.$$
Determine the value of $\max_{U \in C'} \mathbb{V}\left(U^\top Y\right)$ and specify a vector $U_1 \in C'$ such that $$\max_{U \in C'} \mathbb{V}\left(U^\top Y\right) = \mathbb{V}\left(U_1^\top Y\right).$$
Q38 Continuous Probability Distributions and Random Variables Distribution of Transformed or Combined Random Variables
We assume that $\Sigma_Y$ has $n$ distinct eigenvalues which we order in strictly decreasing order $\lambda_1 > \cdots > \lambda_n$. We fix a vector $U_0$ such that $\mathbb{V}\left(U_0^\top Y\right) = \max_{U \in C} \mathbb{V}\left(U^\top Y\right)$, and a vector $U_1 \in C'$ such that $\mathbb{V}\left(U_1^\top Y\right) = \max_{U \in C'} \mathbb{V}\left(U^\top Y\right)$, where $$C' = \left\{ U \in \mathcal{M}_{n,1}(\mathbb{R}) \mid U^\top U = 1 \text{ and } U_0^\top U = 0 \right\}.$$
Calculate the covariance of the discrete random variables $U_0^\top Y$ and $U_1^\top Y$ (to simplify notation, one may assume $Y$ is centered, that is, $\mathbb{E}(Y) = 0$).
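With $Y$ centered and $U_1$ a unit eigenvector of $\Sigma_Y$ associated with $\lambda_2$ (as identified in Q37), the covariance vanishes by orthogonality; a sketch of the computation:

$$\operatorname{cov}\left(U_0^\top Y, U_1^\top Y\right) = \mathbb{E}\left(U_0^\top Y \, Y^\top U_1\right) = U_0^\top \Sigma_Y U_1 = \lambda_2 \, U_0^\top U_1 = 0,$$

the last step because $U_1 \in C'$ is orthogonal to $U_0$ by the definition of $C'$.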