grandes-ecoles

Papers (191)
2025
centrale-maths1__official (40), centrale-maths2__official (42), mines-ponts-maths1__mp (20), mines-ponts-maths1__pc (21), mines-ponts-maths1__psi (21), mines-ponts-maths2__mp (28), mines-ponts-maths2__pc (24), mines-ponts-maths2__psi (26), polytechnique-maths-a__mp (27), polytechnique-maths__fui (16), polytechnique-maths__pc (27), x-ens-maths-a__mp (18), x-ens-maths-c__mp (9), x-ens-maths-d__mp (38), x-ens-maths__pc (27), x-ens-maths__psi (38)
2024
centrale-maths1__official (28), centrale-maths2__official (29), geipi-polytech__maths (9), mines-ponts-maths1__mp (25), mines-ponts-maths1__pc (20), mines-ponts-maths1__psi (19), mines-ponts-maths2__mp (23), mines-ponts-maths2__pc (21), mines-ponts-maths2__psi (21), polytechnique-maths-a__mp (44), polytechnique-maths-b__mp (37), x-ens-maths-a__mp (43), x-ens-maths-b__mp (35), x-ens-maths-c__mp (22), x-ens-maths-d__mp (45), x-ens-maths__pc (24), x-ens-maths__psi (26)
2023
centrale-maths1__official (44), centrale-maths2__official (33), e3a-polytech-maths__mp (4), mines-ponts-maths1__mp (15), mines-ponts-maths1__pc (23), mines-ponts-maths1__psi (23), mines-ponts-maths2__mp (22), mines-ponts-maths2__pc (18), mines-ponts-maths2__psi (22), polytechnique-maths__fui (23), x-ens-maths-a__mp (25), x-ens-maths-b__mp (24), x-ens-maths-c__mp (20), x-ens-maths-d__mp (20), x-ens-maths__pc (18), x-ens-maths__psi (15)
2022
centrale-maths1__mp (48), centrale-maths1__official (48), centrale-maths1__pc (37), centrale-maths1__psi (43), centrale-maths2__mp (32), centrale-maths2__official (32), centrale-maths2__pc (39), centrale-maths2__psi (45), mines-ponts-maths1__mp (25), mines-ponts-maths1__pc (24), mines-ponts-maths1__psi (24), mines-ponts-maths2__mp (24), mines-ponts-maths2__pc (19), mines-ponts-maths2__psi (20), x-ens-maths-a__mp (13), x-ens-maths-b__mp (40), x-ens-maths-c__mp (27), x-ens-maths-d__mp (46), x-ens-maths1__mp (13), x-ens-maths2__mp (40), x-ens-maths__pc (15), x-ens-maths__pc_cpge (15), x-ens-maths__psi (22), x-ens-maths__psi_cpge (23)
2021
centrale-maths1__mp (40), centrale-maths1__official (40), centrale-maths1__pc (36), centrale-maths1__psi (29), centrale-maths2__mp (30), centrale-maths2__official (29), centrale-maths2__pc (38), centrale-maths2__psi (37), x-ens-maths2__mp (39), x-ens-maths__pc (44)
2020
centrale-maths1__mp (42), centrale-maths1__official (42), centrale-maths1__pc (36), centrale-maths1__psi (40), centrale-maths2__mp (38), centrale-maths2__official (38), centrale-maths2__pc (40), centrale-maths2__psi (39), mines-ponts-maths1__mp_cpge (24), mines-ponts-maths2__mp_cpge (21), x-ens-maths-a__mp_cpge (18), x-ens-maths-b__mp_cpge (20), x-ens-maths-d__mp (14), x-ens-maths1__mp (18), x-ens-maths2__mp (20), x-ens-maths__pc (18)
2019
centrale-maths1__mp (37), centrale-maths1__official (37), centrale-maths1__pc (40), centrale-maths1__psi (39), centrale-maths2__mp (37), centrale-maths2__official (37), centrale-maths2__pc (39), centrale-maths2__psi (49), x-ens-maths1__mp (24), x-ens-maths__pc (18), x-ens-maths__psi (26)
2018
centrale-maths1__mp (47), centrale-maths1__official (47), centrale-maths1__pc (41), centrale-maths1__psi (44), centrale-maths2__mp (44), centrale-maths2__official (44), centrale-maths2__pc (35), centrale-maths2__psi (38), x-ens-maths1__mp (19), x-ens-maths2__mp (17), x-ens-maths__pc (22), x-ens-maths__psi (24)
2017
centrale-maths1__mp (45), centrale-maths1__official (45), centrale-maths1__pc (22), centrale-maths1__psi (17), centrale-maths2__mp (30), centrale-maths2__official (30), centrale-maths2__pc (28), centrale-maths2__psi (44), x-ens-maths1__mp (26), x-ens-maths2__mp (16), x-ens-maths__pc (18), x-ens-maths__psi (26)
2016
centrale-maths1__mp (42), centrale-maths1__pc (31), centrale-maths1__psi (33), centrale-maths2__mp (25), centrale-maths2__pc (47), centrale-maths2__psi (27), x-ens-maths1__mp (18), x-ens-maths2__mp (46), x-ens-maths__pc (15), x-ens-maths__psi (20)
2015
centrale-maths1__mp (42), centrale-maths1__pc (18), centrale-maths1__psi (42), centrale-maths2__mp (44), centrale-maths2__pc (18), centrale-maths2__psi (33), x-ens-maths1__mp (16), x-ens-maths2__mp (31), x-ens-maths__pc (30), x-ens-maths__psi (22)
2014
centrale-maths1__mp (28), centrale-maths1__pc (26), centrale-maths1__psi (27), centrale-maths2__mp (24), centrale-maths2__pc (26), centrale-maths2__psi (27), x-ens-maths1__mp (9), x-ens-maths2__mp (16), x-ens-maths__pc (4), x-ens-maths__psi (24)
2013
centrale-maths1__mp (22), centrale-maths1__pc (45), centrale-maths1__psi (29), centrale-maths2__mp (31), centrale-maths2__pc (52), centrale-maths2__psi (32), x-ens-maths1__mp (24), x-ens-maths2__mp (35), x-ens-maths__pc (22), x-ens-maths__psi (9)
2012
centrale-maths1__mp (36), centrale-maths1__pc (28), centrale-maths1__psi (33), centrale-maths2__mp (27), centrale-maths2__psi (18)
2011
centrale-maths1__mp (27), centrale-maths1__pc (17), centrale-maths1__psi (24), centrale-maths2__mp (29), centrale-maths2__pc (17), centrale-maths2__psi (10)
2010
centrale-maths1__mp (19), centrale-maths1__pc (30), centrale-maths1__psi (13), centrale-maths2__mp (32), centrale-maths2__pc (37), centrale-maths2__psi (27)
2025 x-ens-maths__pc

27 maths questions

Q1. Matrices: Determinant and Rank Computation
Let $\mathbf{u}, \mathbf{v} \in \mathbb{R}^n \backslash \{\mathbf{0}\}$. We set $M = \mathbf{u v}^T$. Show that $M$ is an $n \times n$ square matrix of rank 1.
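This can be sanity-checked numerically (an illustration, not a proof). A minimal pure-Python sketch with arbitrarily chosen example vectors: column $j$ of $M = \mathbf{u v}^T$ is $v_j \mathbf{u}$, so every $2 \times 2$ minor vanishes (rank $\leqslant 1$), while $M \neq 0$ (rank $\geqslant 1$).

```python
# Sanity check: M = u v^T is n x n of rank 1 (example vectors chosen freely).
u = [1.0, -2.0, 3.0]
v = [4.0, 0.5, -1.0]
n = len(u)
M = [[u[i] * v[j] for j in range(n)] for i in range(n)]

# every 2x2 minor of M vanishes, so rank(M) <= 1
minors_vanish = all(
    abs(M[i][j] * M[k][l] - M[i][l] * M[k][j]) < 1e-12
    for i in range(n) for k in range(i + 1, n)
    for j in range(n) for l in range(j + 1, n)
)
# M is not the zero matrix (u and v are nonzero), so rank(M) >= 1
nonzero = any(abs(M[i][j]) > 0 for i in range(n) for j in range(n))
rank_one = minors_vanish and nonzero
```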
Q2. Matrices: Determinant and Rank Computation
Calculate with justification the rank of the following matrix $J \in \mathcal{M}_n(\mathbb{R})$: $$J = \left(\begin{array}{cccc} 1 & 1 & \cdots & 1 \\ 1 & 1 & \cdots & 1 \\ \vdots & \vdots & \ddots & \vdots \\ 1 & 1 & \cdots & 1 \end{array}\right)$$
Q3. Matrices: Determinant and Rank Computation
Conversely, let $K \in \mathcal{M}_n(\mathbb{R})$ be a square matrix of rank 1. Show that there exist $\mathbf{u}, \mathbf{v} \in \mathbb{R}^n \backslash \{\mathbf{0}\}$ such that $K = \mathbf{u v}^T$.
Q4. Matrices: Matrix Algebra and Product Properties
Let $\mathbf{u}, \mathbf{v}, \mathbf{x}, \mathbf{y} \in \mathbb{R}^n \backslash \{0\}$. Show that $\mathbf{u v}^T = \mathbf{x y}^T$ if and only if there exists $\lambda \in \mathbb{R} \backslash \{0\}$ such that $$\mathbf{u} = \lambda \mathbf{x}, \quad \text{and} \quad \mathbf{v} = \frac{1}{\lambda} \mathbf{y}$$
Q5. Matrices: Diagonalizability and Similarity
Let $K \in \mathcal{M}_n(\mathbb{R})$ be a matrix of rank 1, and let $\mathbf{u}, \mathbf{v} \in \mathbb{R}^n$ be such that $K = \mathbf{u v}^T$.
(a) Show that $\operatorname{Tr}(K) = \langle \mathbf{v}, \mathbf{u} \rangle$.
(b) Show that $K^2 = \operatorname{Tr}(K) K$.
(c) Deduce that $K$ is diagonalizable if and only if $\operatorname{Tr}(K) \neq 0$.
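Parts (a) and (b) can be checked numerically on an arbitrary rank-1 example (values below are illustrative, not from the exam): $K^2 = \mathbf{u}(\mathbf{v}^T\mathbf{u})\mathbf{v}^T = \langle \mathbf{v}, \mathbf{u} \rangle K$.

```python
# Check Q5(a)-(b): Tr(K) = <v, u> and K^2 = Tr(K) * K for K = u v^T.
u = [1.0, 2.0, -1.0]
v = [3.0, 0.0, 5.0]
n = len(u)
K = [[u[i] * v[j] for j in range(n)] for i in range(n)]

trace_K = sum(K[i][i] for i in range(n))
dot_vu = sum(v[i] * u[i] for i in range(n))

K2 = [[sum(K[i][k] * K[k][j] for k in range(n)) for j in range(n)]
      for i in range(n)]

trace_matches = abs(trace_K - dot_vu) < 1e-12
square_matches = all(abs(K2[i][j] - trace_K * K[i][j]) < 1e-12
                     for i in range(n) for j in range(n))
```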
Q6. Matrices: Projection and Orthogonality
Let $P \in \mathcal{M}_n(\mathbb{R})$. Show that $P$ is an orthogonal projector of rank 1 if and only if there exists $\mathbf{y} \in \mathbb{R}^n$ with $\|\mathbf{y}\| = 1$ such that $P = \mathbf{y y}^T$.
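The "if" direction is easy to see numerically: for a unit vector $\mathbf{y}$, $P = \mathbf{y y}^T$ is symmetric and idempotent. A small sketch with an arbitrary example vector (normalised by hand):

```python
# Check that P = y y^T with ||y|| = 1 is a symmetric idempotent matrix.
import math

y_raw = [1.0, 2.0, 2.0]
norm = math.sqrt(sum(c * c for c in y_raw))
y = [c / norm for c in y_raw]          # normalise so that ||y|| = 1
n = len(y)
P = [[y[i] * y[j] for j in range(n)] for i in range(n)]

symmetric = all(P[i][j] == P[j][i] for i in range(n) for j in range(n))
# P^2 = y (y^T y) y^T = ||y||^2 * P = P
P2 = [[sum(P[i][k] * P[k][j] for k in range(n)) for j in range(n)]
      for i in range(n)]
idempotent = all(abs(P2[i][j] - P[i][j]) < 1e-12
                 for i in range(n) for j in range(n))
```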
Q7. Matrices: Matrix Algebra and Product Properties
Let $A \in \mathrm{GL}_n(\mathbb{R})$ be an invertible matrix, and let $\mathbf{u}, \mathbf{v} \in \mathbb{R}^n$. Calculate the block matrix product $$\left(\begin{array}{cc} \mathbb{I}_n & 0 \\ \mathbf{v}^T & 1 \end{array}\right) \left(\begin{array}{cc} \mathbb{I}_n + \mathbf{u}\mathbf{v}^T & \mathbf{u} \\ 0 & 1 \end{array}\right) \left(\begin{array}{cc} \mathbb{I}_n & 0 \\ -\mathbf{v}^T & 1 \end{array}\right)$$
Q8. Matrices: Determinant and Rank Computation
Let $A \in \mathrm{GL}_n(\mathbb{R})$ be an invertible matrix, and let $\mathbf{u}, \mathbf{v} \in \mathbb{R}^n$. Show that $$\operatorname{det}\left(\mathbb{I}_n + \mathbf{u}\mathbf{v}^T\right) = 1 + \langle \mathbf{v}, \mathbf{u} \rangle$$
Q9. Matrices: Determinant and Rank Computation
Let $A \in \mathrm{GL}_n(\mathbb{R})$ be an invertible matrix, and let $\mathbf{u}, \mathbf{v} \in \mathbb{R}^n$. Show more generally that $$\operatorname{det}\left(A + \mathbf{u}\mathbf{v}^T\right) = \operatorname{det}(A)\left(1 + \left\langle \mathbf{v}, A^{-1}\mathbf{u} \right\rangle\right).$$
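This determinant identity (and Q8, its special case $A = \mathbb{I}_n$) can be verified numerically. A sketch with an arbitrarily chosen diagonal $A$, diagonal only so that $\operatorname{det}(A)$ and $A^{-1}\mathbf{u}$ are trivial to compute by hand:

```python
# Check det(A + u v^T) = det(A) * (1 + <v, A^{-1} u>) on a 3x3 example.
def det3(M):
    # cofactor expansion of a 3x3 determinant along the first row
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
            - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
            + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

d = [2.0, -1.0, 4.0]                    # A = diag(d), invertible
u = [1.0, 3.0, -2.0]
v = [0.5, 1.0, 1.0]
M = [[(d[i] if i == j else 0.0) + u[i] * v[j] for j in range(3)]
     for i in range(3)]

det_A = d[0] * d[1] * d[2]
dot = sum(v[i] * u[i] / d[i] for i in range(3))   # <v, A^{-1} u>
lhs = det3(M)
rhs = det_A * (1.0 + dot)
```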
Q10. Matrices: Linear System and Inverse Existence
Let $A \in \mathrm{GL}_n(\mathbb{R})$ be an invertible matrix, and let $\mathbf{u}, \mathbf{v} \in \mathbb{R}^n$. Show that $A + \mathbf{u v}^T$ is invertible if and only if $\left\langle \mathbf{v}, A^{-1}\mathbf{u} \right\rangle \neq -1$.
Q11. Matrices: Linear System and Inverse Existence
Let $A \in \mathrm{GL}_n(\mathbb{R})$ be an invertible matrix, and let $\mathbf{u}, \mathbf{v} \in \mathbb{R}^n$. Suppose that $A + \mathbf{u v}^T$ is invertible. Show that $$\left(A + \mathbf{u v}^T\right)^{-1} = A^{-1} - \frac{A^{-1}\mathbf{u}\mathbf{v}^T A^{-1}}{1 + \left\langle \mathbf{v}, A^{-1}\mathbf{u} \right\rangle}$$
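This is the Sherman–Morrison formula; it can be checked by multiplying the claimed inverse back against $A + \mathbf{u v}^T$. A sketch with an arbitrary diagonal $A$ (diagonal only to make $A^{-1}$ immediate):

```python
# Check the Sherman-Morrison formula: (A + u v^T)^{-1} multiplies
# with A + u v^T to give the identity.
d = [2.0, -1.0, 4.0]                   # A = diag(d)
u = [1.0, 3.0, -2.0]
v = [0.5, 1.0, 1.0]
n = 3

Ainv = [[(1.0 / d[i] if i == j else 0.0) for j in range(n)] for i in range(n)]
Au = [u[i] / d[i] for i in range(n)]   # A^{-1} u
vA = [v[j] / d[j] for j in range(n)]   # v^T A^{-1}
denom = 1.0 + sum(v[i] * Au[i] for i in range(n))   # 1 + <v, A^{-1} u>

Binv = [[Ainv[i][j] - Au[i] * vA[j] / denom for j in range(n)]
        for i in range(n)]
B = [[(d[i] if i == j else 0.0) + u[i] * v[j] for j in range(n)]
     for i in range(n)]
prod = [[sum(B[i][k] * Binv[k][j] for k in range(n)) for j in range(n)]
        for i in range(n)]
is_identity = all(abs(prod[i][j] - (1.0 if i == j else 0.0)) < 1e-9
                  for i in range(n) for j in range(n))
```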
Q12. Matrices: Determinant and Rank Computation
Let $A \in \mathrm{GL}_n(\mathbb{R})$ be an invertible matrix, and let $\mathbf{u}, \mathbf{v} \in \mathbb{R}^n$. Let $C \in \mathcal{M}_n(\mathbb{R})$ be a matrix such that $\operatorname{det}(C) = 0$. Is it always true that $\operatorname{det}\left(C + \mathbf{u v}^T\right) = 0$? Justify your answer.
Q13. Matrices: Structured Matrix Characterization
We now consider the case where $A \in \mathcal{S}_n(\mathbb{R})$ is symmetric. Let $\mathbf{u} \in \mathbb{R}^n$ be such that $\|\mathbf{u}\| = 1$. We set $B = A + \mathbf{u}\mathbf{u}^T$. Show that $B \in \mathcal{S}_n(\mathbb{R})$.
Q14. Matrices: Projection and Orthogonality
Let $\left(\mathbf{v}_1, \ldots, \mathbf{v}_n\right)$ be any orthonormal basis of $\mathbb{R}^n$. Show that $$\mathbb{I}_n = \sum_{k=1}^n \mathbf{v}_k \mathbf{v}_k^T$$
Q15. Matrices: Matrix Decomposition and Factorization
We now consider the symmetric matrix $A$. By virtue of the spectral theorem, we denote by $\lambda_1 \leqslant \cdots \leqslant \lambda_n$ the eigenvalues of $A$, and $\left(\mathbf{w}_1, \ldots, \mathbf{w}_n\right)$ a corresponding orthonormal basis of eigenvectors.
(a) Show that $$A = \sum_{k=1}^n \lambda_k \mathbf{w}_k \mathbf{w}_k^T$$ (b) Show that for all $x \in \mathbb{R} \backslash \left\{\lambda_1, \ldots, \lambda_n\right\}$, we have $$\left(x \mathbb{I}_n - A\right)^{-1} = \sum_{k=1}^n \frac{1}{x - \lambda_k} \mathbf{w}_k \mathbf{w}_k^T$$
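Both identities can be illustrated on a small symmetric example chosen for convenience: $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ has eigenpairs $\left(1, \tfrac{1}{\sqrt{2}}(1,-1)\right)$ and $\left(3, \tfrac{1}{\sqrt{2}}(1,1)\right)$.

```python
# Check Q15(a)-(b) on A = [[2,1],[1,2]], eigenvalues 1 and 3.
import math

s = 1.0 / math.sqrt(2.0)
lams = [1.0, 3.0]
W = [[s, -s], [s, s]]                  # rows: orthonormal eigenvectors
A = [[2.0, 1.0], [1.0, 2.0]]

# (a) A = sum_k lambda_k w_k w_k^T
S = [[sum(lams[k] * W[k][i] * W[k][j] for k in range(2))
      for j in range(2)] for i in range(2)]
spectral_ok = all(abs(S[i][j] - A[i][j]) < 1e-12
                  for i in range(2) for j in range(2))

# (b) (x I - A)^{-1} = sum_k w_k w_k^T / (x - lambda_k), tested at x = 5
x = 5.0
R = [[sum(W[k][i] * W[k][j] / (x - lams[k]) for k in range(2))
      for j in range(2)] for i in range(2)]
xIA = [[(x if i == j else 0.0) - A[i][j] for j in range(2)] for i in range(2)]
prod = [[sum(xIA[i][k] * R[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
resolvent_ok = all(abs(prod[i][j] - (1.0 if i == j else 0.0)) < 1e-9
                   for i in range(2) for j in range(2))
```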
Q16. Matrices: Eigenvalue and Characteristic Polynomial Analysis
Let $\lambda$ be an eigenvalue of $A$ with multiplicity $m \geqslant 2$. We set $E = \operatorname{Ker}\left(A - \lambda \mathbb{I}_n\right)$.
(a) Show that $\operatorname{dim}\left(E \cap \{\mathbf{u}\}^\perp\right) \geqslant m - 1$.
(b) Deduce that $\lambda$ is an eigenvalue of $B$ with multiplicity at least $m - 1$.
Q17. Matrices: Determinant and Rank Computation
We denote by $\chi_A(x) = \operatorname{det}\left(x \mathbb{I}_n - A\right)$ the characteristic polynomial of $A$, and $\chi_B(x) = \operatorname{det}\left(x \mathbb{I}_n - B\right)$ that of $B$. Show that, for all $x \in \mathbb{R} \backslash \left\{\lambda_1, \ldots, \lambda_n\right\}$, we have $$\chi_B(x) = \chi_A(x)\left(1 - \sum_{k=1}^n \frac{\left\langle \mathbf{w}_k, \mathbf{u} \right\rangle^2}{x - \lambda_k}\right).$$
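The identity follows from Q9 applied to $x\mathbb{I}_n - A$, and can be spot-checked numerically. A sketch on the example $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ (eigenvalues $1, 3$) with $\mathbf{u} = (1, 0)$, for which $\langle \mathbf{w}_k, \mathbf{u} \rangle^2 = \tfrac{1}{2}$ for both eigenvectors:

```python
# Check chi_B(x) = chi_A(x) * (1 - sum_k <w_k,u>^2 / (x - lambda_k)).
s2 = 0.5                               # <w_k, u>^2 = 1/2 for both k
lams = [1.0, 3.0]

def chi(M, x):
    # characteristic polynomial det(x I - M) of a 2x2 matrix
    return (x - M[0][0]) * (x - M[1][1]) - M[0][1] * M[1][0]

A = [[2.0, 1.0], [1.0, 2.0]]
B = [[3.0, 1.0], [1.0, 2.0]]           # B = A + u u^T with u = (1, 0)

x = 4.5                                # any x outside {1, 3}
lhs = chi(B, x)
rhs = chi(A, x) * (1.0 - sum(s2 / (x - l) for l in lams))
```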
Q18. Matrices: Eigenvalue and Characteristic Polynomial Analysis
Let $J = \left\{k \in \{1, 2, \ldots, n\} : \left\langle \mathbf{w}_k, \mathbf{u} \right\rangle \neq 0\right\}$ denote the set of indices $k$ for which $\left\langle \mathbf{w}_k, \mathbf{u} \right\rangle \neq 0$.
(a) Show that $J \neq \varnothing$.
(b) Let $\ell \notin J$. Show that $\lambda_\ell$ is an eigenvalue of $B$.
(c) Suppose that $J = \{j\}$ for some $j \in \{1, 2, \ldots, n\}$. Show that the eigenvalues of $B$ are $$\left(\lambda_1, \lambda_2, \ldots, \lambda_{j-1}, \lambda_j + 1, \lambda_{j+1}, \ldots, \lambda_n\right).$$
Q19. Matrices: Eigenvalue and Characteristic Polynomial Analysis
Suppose in this question that $\lambda_1 < \lambda_2 < \cdots < \lambda_n$, and that $J = \{1, 2, \ldots, n\}$. For $x \in \mathbb{R} \backslash \left\{\lambda_1, \ldots, \lambda_n\right\}$ we set $$f(x) = \sum_{k=1}^n \frac{\left\langle \mathbf{w}_k, \mathbf{u} \right\rangle^2}{x - \lambda_k}$$ (a) Show that $f$ is of class $C^\infty$ on $\mathbb{R} \backslash \left\{\lambda_1, \ldots, \lambda_n\right\}$, and calculate its derivative $f'(x)$.
(b) Show that the equation $f(x) = 1$ has a unique solution in each interval $]\lambda_\ell, \lambda_{\ell+1}[$ for all $\ell \in \{1, 2, \ldots, n-1\}$, and in $]\lambda_n, +\infty[$.
(c) We denote by $\mu_1 \leqslant \mu_2 \leqslant \cdots \leqslant \mu_n$ the eigenvalues of $B$. Show that $$\lambda_1 < \mu_1 < \lambda_2 < \mu_2 < \cdots < \lambda_n < \mu_n$$
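The strict interlacing of part (c) can be observed on the running example $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$, $\mathbf{u} = (1,0)$ (for which indeed $J = \{1, 2\}$ and the eigenvalues $1 < 3$ of $A$ are simple):

```python
# Check lambda_1 < mu_1 < lambda_2 < mu_2 for B = A + u u^T = [[3,1],[1,2]].
import math

lam1, lam2 = 1.0, 3.0                  # eigenvalues of A = [[2,1],[1,2]]
# B has trace 5 and determinant 5, so mu_1, mu_2 solve x^2 - 5x + 5 = 0
tr, det = 5.0, 5.0
disc = math.sqrt(tr * tr - 4.0 * det)
mu1, mu2 = (tr - disc) / 2.0, (tr + disc) / 2.0

strict_interlacing = lam1 < mu1 < lam2 < mu2
```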
Q20. Continuous Probability Distributions and Random Variables: Expectation and Moment Inequality Proof
In this fourth part, $A \in \mathcal{S}_n(\mathbb{R})$ is a symmetric matrix whose eigenvalues are denoted $\lambda_1 \leqslant \lambda_2 \leqslant \cdots \leqslant \lambda_n$. For $x \in \mathbb{R}$ we denote $\chi_A(x) = \operatorname{det}\left(x \mathbb{I}_n - A\right)$. We consider any orthonormal basis $\left(\mathbf{u}_1, \ldots, \mathbf{u}_n\right)$. Let $\mathbf{U}$ be a random variable defined on a probability space $(\Omega, \mathcal{A}, \mathbb{P})$ taking values in the finite set $\left\{\mathbf{u}_1, \ldots, \mathbf{u}_n\right\}$, and which follows the uniform distribution on this set. We consider the random variable $B = A + \mathbf{U}\mathbf{U}^T$.
Show that for all $\mathbf{w} \in \mathbb{R}^n$, we have $\mathbb{E}\left[\langle \mathbf{U}, \mathbf{w} \rangle^2\right] = \frac{1}{n} \|\mathbf{w}\|^2$.
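The key point is Parseval's identity: summing $\langle \mathbf{u}_k, \mathbf{w} \rangle^2$ over an orthonormal basis gives $\|\mathbf{w}\|^2$, so the uniform average is $\|\mathbf{w}\|^2 / n$. A sketch in $\mathbb{R}^2$ with a rotated orthonormal basis (angle and $\mathbf{w}$ chosen arbitrarily):

```python
# Check E[<U, w>^2] = ||w||^2 / n for U uniform on an orthonormal basis.
import math

t = 0.7                                # arbitrary rotation angle
basis = [[math.cos(t), math.sin(t)], [-math.sin(t), math.cos(t)]]
w = [1.5, -2.0]
n = 2

expectation = sum(sum(e[i] * w[i] for i in range(n)) ** 2
                  for e in basis) / n
norm_sq = sum(c * c for c in w)
```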
Q21. Discrete Random Variables: Expectation of a Function of a Discrete Random Variable
Same setting as in Q20 (fourth part); we further define, for all $x \in \mathbb{R}$, $\chi_B(x) = \operatorname{det}\left(x \mathbb{I}_n - B\right)$.
Let $x \in \mathbb{R} \backslash \left\{\lambda_1, \ldots, \lambda_n\right\}$. Show that the random variable $\chi_B(x)$ has finite expectation, and that, denoting by $\chi_A'$ the derivative of the polynomial $\chi_A$, we have $$\mathbb{E}\left[\chi_B(x)\right] = \chi_A(x) - \frac{1}{n} \chi_A'(x)$$
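This identity can be checked on a convenient special case (my choice, not the exam's): $A = \operatorname{diag}(1, 3)$ with $\mathbf{U}$ uniform on the standard basis. When $\mathbf{U} = \mathbf{e}_k$, $B = A + \mathbf{e}_k\mathbf{e}_k^T$ simply shifts the $k$-th eigenvalue by $1$, and $\chi_A'(x) = \sum_k \prod_{j \neq k}(x - \lambda_j)$.

```python
# Check E[chi_B(x)] = chi_A(x) - chi_A'(x) / n with A = diag(1, 3).
lams = [1.0, 3.0]
n = len(lams)
x = 2.5                                # any real x works in this example

chi_A = (x - lams[0]) * (x - lams[1])
chi_A_prime = (x - lams[0]) + (x - lams[1])

# chi_B(x) when U = e_k: the k-th eigenvalue of A is shifted by +1
chi_B_values = []
for k in range(n):
    shifted = [l + (1.0 if i == k else 0.0) for i, l in enumerate(lams)]
    chi_B_values.append((x - shifted[0]) * (x - shifted[1]))

expected_chi_B = sum(chi_B_values) / n   # uniform average over the basis
rhs = chi_A - chi_A_prime / n
```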
Q22. Discrete Random Variables: Expectation of a Function of a Discrete Random Variable
Same setting and notation as in Q20 and Q21 (fourth part).
Show that for all $k \in \{1, 2, \ldots, n\}$, we have $$\mathbb{E}\left[\chi_B\left(\lambda_k\right)\right] = -\frac{1}{n} \chi_A'\left(\lambda_k\right)$$
Q23. Discrete Random Variables: Expectation of a Function of a Discrete Random Variable
Same setting and notation as in Q20 and Q21 (fourth part).
Prove that there exists $x \in \mathbb{R}$ such that $\mathbb{E}\left[\chi_B(x)\right] \neq 0$.
Q24. Matrices: Linear System and Inverse Existence
As in the third part, we suppose that $B = A + \mathbf{u}\mathbf{u}^T$ with $A \in \mathcal{S}_n(\mathbb{R})$ a symmetric matrix, and $\mathbf{u} \in \mathbb{R}^n$ a vector such that $\|\mathbf{u}\| = 1$. We denote by $\lambda_1 \leqslant \lambda_2 \leqslant \cdots \leqslant \lambda_n$ the eigenvalues of $A$ and $\mu_1 \leqslant \mu_2 \leqslant \cdots \leqslant \mu_n$ those of $B$. We admit that $$\lambda_1 \leqslant \mu_1 \leqslant \lambda_2 \leqslant \mu_2 \leqslant \cdots \leqslant \lambda_n \leqslant \mu_n.$$ We further suppose that there exists an integer $m \in \{1, 2, \ldots, n-1\}$ such that the eigenvalues of $A$ satisfy $$0 = \lambda_1 = \lambda_2 = \cdots = \lambda_m < \lambda_{m+1} \leqslant \cdots \leqslant \lambda_n.$$ Let $\varepsilon \in ]0, \lambda_{m+1}[$.
Justify that $(A - \varepsilon \mathbb{I}_n)$ is invertible.
Q25. Matrices: Linear System and Inverse Existence
We keep the setting and notation of Q24 and suppose, in addition, that $\left\langle \mathbf{u}, \left(A - \varepsilon \mathbb{I}_n\right)^{-1}\mathbf{u}\right\rangle < -1$.
Show that $(B - \varepsilon \mathbb{I}_n)$ is invertible.
Q26. Matrices: Linear System and Inverse Existence
Same setting and assumptions as in Q25.
Show that $\operatorname{Tr}\left(\left(B - \varepsilon \mathbb{I}_n\right)^{-1}\right) > \operatorname{Tr}\left(\left(A - \varepsilon \mathbb{I}_n\right)^{-1}\right)$.
Q27. Invariant lines and eigenvalues and vectors: Eigenvalue interlacing and spectral inequalities
Same setting and assumptions as in Q25.
Show that $\mu_m > \varepsilon$.
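The whole final part can be illustrated on a minimal example of my choosing: $A = \operatorname{diag}(0, 2)$ (so $m = 1$, $\lambda_2 = 2$), $\mathbf{u} = (1, 0)$ and $\varepsilon = 0.5 \in \left]0, \lambda_2\right[$. Then $\left\langle \mathbf{u}, (A - \varepsilon \mathbb{I}_n)^{-1}\mathbf{u}\right\rangle = -1/\varepsilon = -2 < -1$, $B = A + \mathbf{u}\mathbf{u}^T = \operatorname{diag}(1, 2)$, and the conclusions of Q25, Q26 and Q27 can be read off the eigenvalues directly:

```python
# Illustration of Q24-Q27: A = diag(0, 2), u = (1, 0), eps = 0.5.
eps = 0.5
lam_A = [0.0, 2.0]                     # eigenvalues of A (m = 1)
lam_B = [1.0, 2.0]                     # eigenvalues of B = A + u u^T = diag(1, 2)

quad = 1.0 / (lam_A[0] - eps)          # <u, (A - eps I)^{-1} u> = -1/eps
hypothesis = quad < -1.0               # the assumption of Q25-Q27 holds

# traces of the inverses are sums of 1/(eigenvalue - eps)
tr_A = sum(1.0 / (l - eps) for l in lam_A)   # Tr((A - eps I)^{-1})
tr_B = sum(1.0 / (l - eps) for l in lam_B)   # Tr((B - eps I)^{-1})
trace_gap = tr_B > tr_A                      # conclusion of Q26
mu_m_above_eps = lam_B[0] > eps              # conclusion of Q27: mu_1 > eps
```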