Matrices

Question Types
All Questions
grandes-ecoles 2024 Q23 Linear Transformation and Endomorphism Properties
In this part, we assume that $n \geqslant 4$. Let $u = (u_k)_{k \geqslant 0}$ be a sequence of complex numbers satisfying condition $(C^\star)$: $R_u > 1$.
Let $H \in \mathscr{M}_n(\mathbb{C})$ be the matrix given by $$H = \begin{pmatrix} 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \ddots & \vdots \\ \vdots & \ddots & \ddots & \ddots & 0 \\ \vdots & & \ddots & \ddots & 1 \\ 0 & \cdots & \cdots & 0 & 0 \end{pmatrix}.$$
(a) Determine the polynomial $\varphi_H$ in this case.
(b) Let $A = H + \alpha I_n$ where $\alpha \in \mathbb{C}$ is such that $|\alpha| < R_u$. Show that $$u(A) = \sum_{k=0}^{n-1} \frac{U^{(k)}(\alpha)}{k!} H^k$$ and deduce that $$u(A) = \begin{pmatrix} U(\alpha) & \frac{U^{(1)}(\alpha)}{1!} & \frac{U^{(2)}(\alpha)}{2!} & \cdots & \frac{U^{(n-1)}(\alpha)}{(n-1)!} \\ 0 & U(\alpha) & \frac{U^{(1)}(\alpha)}{1!} & \ddots & \vdots \\ \vdots & \ddots & \ddots & \ddots & \frac{U^{(2)}(\alpha)}{2!} \\ \vdots & & \ddots & \ddots & \frac{U^{(1)}(\alpha)}{1!} \\ 0 & \cdots & \cdots & 0 & U(\alpha) \end{pmatrix}.$$
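As a numerical illustration (not part of the exam statement), the Taylor-type identity in (b) can be checked for a hypothetical polynomial choice $U(z) = z^3$: since $H$ commutes with $I_n$ and $H^n = 0$, $U(A)$ must equal the finite Taylor sum at $\alpha$.

```python
import math

import numpy as np

n = 4
H = np.diag(np.ones(n - 1), k=1)      # ones on the superdiagonal, so H^n = 0
alpha = 0.5
A = H + alpha * np.eye(n)

# Hypothetical choice U(z) = z^3; its derivatives at alpha are:
derivs = [alpha**3, 3 * alpha**2, 6 * alpha, 6.0]   # U, U', U'', U'''

U_A = np.linalg.matrix_power(A, 3)
taylor = sum(d / math.factorial(k) * np.linalg.matrix_power(H, k)
             for k, d in enumerate(derivs))
assert np.allclose(U_A, taylor)

# The result is upper triangular Toeplitz with U(alpha) on the diagonal, as claimed.
assert np.allclose(np.diag(U_A), alpha**3 * np.ones(n))
```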
grandes-ecoles 2024 Q23 Projection and Orthogonality
We consider $R \in \mathrm{O}_{d}(\mathbb{R})$ with $\operatorname{det}(R) = -1$, and an orthonormal basis $(u_{1}, \ldots, u_{d})$ of $\mathbb{R}^{d}$ such that $Ru_{d} = -u_{d}$ and $R(E_{1}) = E_{1}$ where $E_{1} = \operatorname{Vect}(u_{1}, \ldots, u_{d-1})$. We consider a matrix $D = \operatorname{Diag}(\alpha_{1}, \ldots, \alpha_{d}) \in \mathscr{M}_{d}(\mathbb{R})$ diagonal with diagonal entries $\alpha_{i} \geqslant 0$ in decreasing order. We denote $U = (u_{1} | \ldots | u_{d})$.
  • [(a)] Verify that $\langle D, R \rangle = \langle S, R^{\prime} \rangle$ where $R^{\prime} = U^{T}RU$ and $S = U^{T}DU$.
  • [(b)] Show that the matrix $R_{0} = (R_{ij}^{\prime})_{1 \leqslant i,j \leqslant d-1} \in \mathscr{M}_{d-1}(\mathbb{R})$ belongs to $\mathrm{O}_{d-1}(\mathbb{R})$.
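The invariance in (a) can be checked numerically, assuming the inner product is the Frobenius one, $\langle A, B \rangle = \operatorname{tr}({}^t A B)$ (a convention not stated in this excerpt); the identity then follows from $U^T U = I$ and cyclicity of the trace.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

# Random orthogonal matrices standing in for U and R (assumptions for the demo).
U, _ = np.linalg.qr(rng.normal(size=(d, d)))
R, _ = np.linalg.qr(rng.normal(size=(d, d)))
D = np.diag(np.sort(rng.uniform(0, 1, size=d))[::-1])   # decreasing alpha_i >= 0

S = U.T @ D @ U
Rp = U.T @ R @ U

# tr(D^T R) equals tr(S^T R') because U^T U = I and the trace is cyclic.
assert np.isclose(np.trace(D.T @ R), np.trace(S.T @ Rp))
```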
grandes-ecoles 2024 Q24 Eigenvalue and Characteristic Polynomial Analysis
In this part, we assume that $n \geqslant 4$. Let $u = (u_k)_{k \geqslant 0}$ be a sequence of complex numbers satisfying condition $(C^\star)$: $R_u > 1$.
Let $G \in \mathscr{M}_n(\mathbb{C})$ be the matrix defined by $$G = Y\, {}^t Z$$ where $Y, Z \in \mathscr{M}_{n,1}(\mathbb{R})$ are two column vectors such that ${}^t Y Y = {}^t Z Z = 1$.
(a) Show that $G$ has rank 1 and give its image.
(b) Show that $0$ and ${}^t Z Y$ are the only eigenvalues of $G$.
(c) Deduce that $G \in \mathbb{M}_n(u)$.
(d) Determine $\varphi_G$ when ${}^t Z Y \neq 0$.
(e) Deduce that if ${}^t Z Y \neq 0$ then $$u(G) = U(0) I_n + \frac{U({}^t Z Y) - U(0)}{{}^t Z Y} G.$$
(f) Determine a simple expression for $u(G)$ when ${}^t Z Y = 0$.
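The key algebraic fact behind (d) and (e) is $G^2 = ({}^t Z Y)\, G$, so every polynomial in $G$ collapses onto $\operatorname{span}(I_n, G)$. A quick check with a hypothetical polynomial $U(z) = 1 + 2z + 3z^2$ (an illustrative stand-in, not the exam's $u$):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
Y = rng.normal(size=(n, 1))
Y /= np.linalg.norm(Y)                    # tY Y = 1
Z = rng.normal(size=(n, 1))
Z /= np.linalg.norm(Z)                    # tZ Z = 1
G = Y @ Z.T
t = (Z.T @ Y).item()                      # t = tZ Y, nonzero almost surely

assert np.linalg.matrix_rank(G) == 1      # (a): G has rank 1
assert np.allclose(G @ G, t * G)          # key relation G^2 = (tZ Y) G

# Hypothetical U(z) = 1 + 2z + 3z^2:
U = lambda z: 1 + 2 * z + 3 * z**2
U_G = np.eye(n) + 2 * G + 3 * (G @ G)
assert np.allclose(U_G, U(0) * np.eye(n) + (U(t) - U(0)) / t * G)   # (e)
```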
grandes-ecoles 2024 Q24 Projection and Orthogonality
We consider $R \in \mathrm{O}_{d}(\mathbb{R})$ with $\operatorname{det}(R) = -1$, and the notation from question 23. We set $S_{0} = (S_{ij})_{1 \leqslant i,j \leqslant d-1} \in \mathscr{M}_{d-1}(\mathbb{R})$.
  • [(a)] Show that $\langle D, R \rangle = \operatorname{tr}(S_{0} R_{0}) - S_{dd}$.
  • [(b)] Show that $\operatorname{tr}(S_{0} R_{0}) \leqslant \operatorname{tr}(S_{0})$.
  • [(c)] Show that $\operatorname{tr}(S_{0}) + S_{dd} = \operatorname{tr}(D)$ and deduce that $\langle D, R \rangle \leqslant \operatorname{tr}(D) - 2S_{dd}$.
grandes-ecoles 2024 Q25 Matrix Power Computation and Application
In this part, we assume that $n \geqslant 4$. Let $u = (u_k)_{k \geqslant 0}$ be a sequence of complex numbers satisfying condition $(C^\star)$: $R_u > 1$.
Let $F \in \mathscr{M}_n(\mathbb{C})$ be the matrix defined by $$[F]_{k,j} = \frac{1}{\sqrt{n}} \omega^{(k-1)(j-1)} \text{ for all } (k,j) \in \llbracket 1; n \rrbracket^2,$$ where $\omega = e^{-2\pi i/n}$ (here $i$ denotes the usual complex number satisfying $i^2 = -1$).
(a) Show that $F$ is invertible and that $F^{-1} = \bar{F}$.
(b) Show that $F^2 \in \mathscr{M}_n(\mathbb{R})$.
(c) Deduce that $F^4 = I_n$ and that $F \in \mathbb{M}_n(u)$.
(d) Deduce that $$\begin{aligned} u(F) = & \frac{1}{4}\left(U(1)(F + I_n) - U(-1)(F - I_n)\right)(F^2 + I_n) \\ & + \frac{i}{4}\left(U(i)(F + iI_n) - U(-i)(F - iI_n)\right)(F^2 - I_n) \end{aligned}$$
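Parts (a)–(c) for the normalized DFT matrix $F$ can be verified numerically (an illustration, not part of the exam statement); indices are 0-based below, matching $[F]_{k,j} = \omega^{(k-1)(j-1)}/\sqrt{n}$ in 1-based notation.

```python
import numpy as np

n = 5
k, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
omega = np.exp(-2j * np.pi / n)
F = omega ** (k * j) / np.sqrt(n)

assert np.allclose(F @ np.conj(F), np.eye(n))                  # (a): F^{-1} = conj(F)
assert np.allclose((F @ F).imag, 0)                            # (b): F^2 is real
assert np.allclose(np.linalg.matrix_power(F, 4), np.eye(n))    # (c): F^4 = I_n
```

In fact $F^2$ is the permutation matrix sending index $k$ to $-k \bmod n$, which squares to the identity.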
grandes-ecoles 2024 Q25 Matrix Norm, Convergence, and Inequality
We consider $R \in \mathrm{O}_{d}(\mathbb{R})$ with $\operatorname{det}(R) = -1$, $D = \operatorname{Diag}(\alpha_{1}, \ldots, \alpha_{d})$ with $\alpha_{i} \geqslant 0$ in decreasing order, $U = (U_{ij})_{1 \leqslant i,j \leqslant d}$, and $S_{dd}$ as defined in question 24.
  • [(a)] Show that $S_{dd} = \sum_{j=1}^{d} \alpha_{j} U_{jd}^{2}$.
  • [(b)] Deduce that $\langle D, R \rangle \leqslant \left(\sum_{i=1}^{d-1} \alpha_{i}\right) - \alpha_{d}$.
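The expansion in (a) is just the $(d,d)$ entry of $S = U^{T} D U$ written out; here is a numerical check with a random orthogonal $U$ (assumed orthogonal, as in the setup of question 23).

```python
import numpy as np

rng = np.random.default_rng(2)
d = 4

U, _ = np.linalg.qr(rng.normal(size=(d, d)))          # orthogonal; columns play u_1..u_d
alpha = np.sort(rng.uniform(0, 1, size=d))[::-1]      # alpha_1 >= ... >= alpha_d >= 0
D = np.diag(alpha)

S = U.T @ D @ U
# (a): the (d,d) entry of S expands as sum_j alpha_j * U_{jd}^2
assert np.isclose(S[-1, -1], np.sum(alpha * U[:, -1] ** 2))
```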
grandes-ecoles 2024 Q26 Matrix Power Computation and Application
In this part, we assume that $n \geqslant 4$. Let $u = (u_k)_{k \geqslant 0}$ be a sequence of complex numbers satisfying condition $(C^\star)$: $R_u > 1$.
Suppose that for all $k \in \mathbb{N}$, $u_k = \mathbb{P}(X = k)$ where $X$ is a random variable taking values in $\mathbb{N}$.
(a) Suppose that $X$ follows a binomial distribution with parameters $(N, p)$. Verify that $u$ satisfies condition $(C^\star)$ and find a simple expression for $u(A)$ for all $A \in \mathbb{M}_n(u)$.
(b) Suppose that $X$ follows a geometric distribution with parameter $p \in ]0,1[$. Verify that $u$ satisfies condition $(C^\star)$ and show that $$u(A) = p\left(I_n - (1-p)A\right)^{-1} A$$ for every diagonalizable matrix $A \in \mathbb{M}_n(u)$.
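The closed form in (b) can be checked numerically, assuming the geometric law supported on $\{1, 2, \ldots\}$, $\mathbb{P}(X = k) = p(1-p)^{k-1}$, and a hypothetical matrix $A$ whose eigenvalues are small enough for the series $\sum_k u_k A^k$ to converge:

```python
import numpy as np

p = 0.4
A = np.array([[0.5, 0.1],
              [0.0, 0.3]])   # hypothetical A; small eigenvalues, so the series converges

# u(A) = sum_{k>=1} p (1-p)^{k-1} A^k, truncated far past numerical convergence
series = sum(p * (1 - p) ** (k - 1) * np.linalg.matrix_power(A, k)
             for k in range(1, 200))
closed = p * np.linalg.inv(np.eye(2) - (1 - p) * A) @ A
assert np.allclose(series, closed)
```

The identity is the matrix geometric series $\sum_{m \geqslant 0} ((1-p)A)^m = (I - (1-p)A)^{-1}$, applied after factoring out $pA$.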
grandes-ecoles 2024 Q26 Matrix Decomposition and Factorization
Give the value of $\delta(\boldsymbol{x}, \boldsymbol{y})$ as a function of $V_{n}(\boldsymbol{x})$, $V_{n}(\boldsymbol{y})$ and the singular values of $Z(\boldsymbol{x}, \boldsymbol{y})$ in the case where $\operatorname{det}(Z(\boldsymbol{x}, \boldsymbol{y})) < 0$.
grandes-ecoles 2025 QP2-1 Matrix Norm, Convergence, and Inequality
Problem 2, Part 1: Adapted norms
We denote by $\mathrm{M}_d(\mathbb{C})$ the space of $d \times d$ square matrices with complex coefficients and we identify $\mathbb{C}^d$ with the space of column vectors of size $d$.
For a vector $x = (x_1, \ldots, x_d) \in \mathbb{C}^d$, we define $\|x\|_{\infty} = \max_{1 \leqslant i \leqslant d} |x_i|$ and $\|x\|_1 = \sum_{i=1}^{d} |x_i|$.
Let $A \in \mathrm{M}_d(\mathbb{C})$. Determine a necessary and sufficient condition on $A$ for the map $x \mapsto \|Ax\|_{\infty}$ to define a norm on $\mathbb{C}^d$.
grandes-ecoles 2025 QP2-2 Matrix Norm, Convergence, and Inequality
Problem 2, Part 1: Adapted norms
We denote by $\mathrm{M}_d(\mathbb{C})$ the space of $d \times d$ square matrices with complex coefficients and we identify $\mathbb{C}^d$ with the space of column vectors of size $d$.
For a vector $x = (x_1, \ldots, x_d) \in \mathbb{C}^d$, we define $\|x\|_{\infty} = \max_{1 \leqslant i \leqslant d} |x_i|$ and $\|x\|_1 = \sum_{i=1}^{d} |x_i|$.
Given a matrix $A \in \mathrm{M}_d(\mathbb{C})$ we define $$\|A\| = \sup_{\|x\|_{\infty} \leqslant 1} \|Ax\|_{\infty}.$$
a. Show that this defines a norm on $\mathrm{M}_d(\mathbb{C})$ and that there exists $x_0 \in \mathbb{C}^d$ such that $\|x_0\|_{\infty} = 1$ and $\|Ax_0\|_{\infty} = \|A\|$.
b. Show that for all $(A, B) \in \mathrm{M}_d(\mathbb{C})^2$ we have $\|AB\| \leqslant \|A\| \cdot \|B\|$.
grandes-ecoles 2025 QP2-3 Matrix Norm, Convergence, and Inequality
Problem 2, Part 1: Adapted norms
We denote by $\mathrm{M}_d(\mathbb{C})$ the space of $d \times d$ square matrices with complex coefficients and we identify $\mathbb{C}^d$ with the space of column vectors of size $d$.
For a vector $x = (x_1, \ldots, x_d) \in \mathbb{C}^d$, we define $\|x\|_{\infty} = \max_{1 \leqslant i \leqslant d} |x_i|$ and $\|x\|_1 = \sum_{i=1}^{d} |x_i|$.
Given a matrix $A \in \mathrm{M}_d(\mathbb{C})$ we define $\|A\| = \sup_{\|x\|_{\infty} \leqslant 1} \|Ax\|_{\infty}$. For $1 \leqslant i \leqslant d$ we define $L_i = (a_{i,j})_{1 \leqslant j \leqslant d}$ as the $i^{\mathrm{th}}$ row vector of $A$. Show that $$\|A\| = \max_{1 \leqslant i \leqslant d} \|L_i\|_1.$$
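This "max row 1-norm" formula can be probed numerically (a real example for simplicity; the complex case replaces signs by unit complex numbers). The supremum is attained at a vector of signs matched to the largest row:

```python
import numpy as np

rng = np.random.default_rng(3)
d = 3
A = rng.normal(size=(d, d))

row_norm = np.abs(A).sum(axis=1).max()      # max_i ||L_i||_1

# Attainment: x0 = sign pattern of the row with the largest 1-norm.
i = np.abs(A).sum(axis=1).argmax()
x0 = np.sign(A[i])
assert np.isclose(np.abs(A @ x0).max(), row_norm)

# No sampled x with ||x||_inf <= 1 does better:
for _ in range(200):
    x = rng.uniform(-1, 1, size=d)
    assert np.abs(A @ x).max() <= row_norm + 1e-12
```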
grandes-ecoles 2025 QP2-4 Diagonalizability and Similarity
Problem 2, Part 1: Adapted norms
We denote by $\mathrm{M}_d(\mathbb{C})$ the space of $d \times d$ square matrices with complex coefficients.
a. Let $u \in \mathcal{L}(\mathbb{C}^d)$ be an endomorphism of $\mathbb{C}^d$ and $M = (m_{i,j})_{1 \leqslant i,j \leqslant d}$ the matrix of $u$ in a basis $\mathcal{B} = (e_1, \ldots, e_d)$. Express the matrix $M' = (m'_{i,j})_{1 \leqslant i,j \leqslant d}$ of $u$ in the basis $\mathcal{B}' = (\alpha_1 e_1, \ldots, \alpha_d e_d)$, where the $\alpha_i$ are nonzero complex numbers.
b. Suppose that $M$ is upper triangular. Show that for all $\varepsilon > 0$ we can choose the $\alpha_i$ such that for $j > i$ we have $|m'_{i,j}| < \varepsilon$.
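The rescaling trick in (b) can be sketched numerically: in the basis $(\alpha_i e_i)$ with $\alpha_i = \varepsilon^i$, the new matrix is $M' = P^{-1} M P$ with $P = \operatorname{Diag}(\alpha_1, \ldots, \alpha_d)$, so $m'_{i,j} = \varepsilon^{j-i} m_{i,j}$ and every strict upper entry shrinks by at least a factor $\varepsilon$.

```python
import numpy as np

eps = 1e-3
M = np.triu(np.arange(1.0, 10.0).reshape(3, 3))   # an upper-triangular example
P = np.diag(eps ** np.arange(1, 4))               # alpha_i = eps^i

Mp = np.linalg.inv(P) @ M @ P                     # matrix of u in the basis (alpha_i e_i)

# m'_{ij} = eps^(j-i) m_{ij}: diagonal unchanged, strict upper part shrunk.
assert np.allclose(np.diag(Mp), np.diag(M))
assert np.abs(Mp[np.triu_indices(3, k=1)]).max() <= eps * np.abs(M).max()
```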
grandes-ecoles 2025 Q1 Diagonalizability and Similarity
Restriction of a diagonalizable endomorphism to a stable subspace
Let $V$ be a finite-dimensional vector space, let $h$ be an endomorphism of $V$ and let $W$ be a subspace stable under $h$. We denote by $h_W$ the endomorphism of $W$ induced by $h$, that is $h_W : W \rightarrow W$, $v \mapsto h(v)$. Prove that if $h$ is diagonalizable, then $h_W$ is also diagonalizable.
grandes-ecoles 2025 Q1 Determinant and Rank Computation
Let $\mathbf{u}, \mathbf{v} \in \mathbb{R}^n \backslash \{\mathbf{0}\}$. We set $M = \mathbf{u v}^T$. Show that $M$ is a square matrix of size $n \times n$, of rank 1.
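A quick numerical illustration of the claim: every row of $M = \mathbf{u}\mathbf{v}^T$ is a scalar multiple of $\mathbf{v}^T$, so the rank is 1.

```python
import numpy as np

u = np.array([[1.0], [2.0], [-1.0]])      # nonzero column vectors of R^3
v = np.array([[3.0], [0.0], [5.0]])
M = u @ v.T                               # outer product: row i is u_i * v^T

assert M.shape == (3, 3)
assert np.linalg.matrix_rank(M) == 1
```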
grandes-ecoles 2025 Q1 Diagonalizability and Similarity
Explain why the matrix $J_n$ is diagonalizable.
grandes-ecoles 2025 Q2 Linear Transformation and Endomorphism Properties
A matrix invariant
For a square matrix $M$ and a positive integer $k$, we denote $$\delta_k(M) = -\operatorname{dim}\ker M^{k-1} + 2\operatorname{dim}\ker M^k - \operatorname{dim}\ker M^{k+1}.$$
a) Prove that if two square matrices $M$ and $M'$ are similar, then $\delta_k(M) = \delta_k(M')$ for all $k$.
b) Let $r$ be a positive integer. Verify that for every positive integer $k$, $\delta_k(J_r)$ equals 1 if $k = r$ and 0 otherwise.
c) Let $M_1$ and $M_2$ be two square matrices and let $M = \operatorname{diag}(M_1, M_2)$. Prove the relation $\operatorname{dim}\ker M = \operatorname{dim}\ker M_1 + \operatorname{dim}\ker M_2$, and then that for every positive integer $k$, $$\delta_k(M) = \delta_k(M_1) + \delta_k(M_2).$$ You may use without proof the fact that all these relations extend to a block diagonal matrix $\operatorname{diag}(M_1, \ldots, M_s)$.
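Parts (b) and (c) can be checked numerically, taking $J_r$ to be the nilpotent Jordan block of size $r$ (the assumed convention of this excerpt), using $\operatorname{dim}\ker M = r - \operatorname{rank} M$:

```python
import numpy as np

def dim_ker(M):
    """dim ker M = size minus rank (rank-nullity theorem)."""
    return M.shape[0] - np.linalg.matrix_rank(M)

def delta(M, k):
    return (-dim_ker(np.linalg.matrix_power(M, k - 1))
            + 2 * dim_ker(np.linalg.matrix_power(M, k))
            - dim_ker(np.linalg.matrix_power(M, k + 1)))

# (b): delta_k(J_r) = 1 iff k = r, here with r = 3.
r = 3
J3 = np.diag(np.ones(r - 1), k=1)
assert [delta(J3, k) for k in range(1, 6)] == [0, 0, 1, 0, 0]

# (c): additivity on the block-diagonal matrix diag(J_3, J_2).
J2 = np.diag(np.ones(1), k=1)
M = np.block([[J3, np.zeros((3, 2))],
              [np.zeros((2, 3)), J2]])
assert [delta(M, k) for k in range(1, 5)] == [0, 1, 1, 0]
```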
grandes-ecoles 2025 Q2 Determinant and Rank Computation
Calculate with justification the rank of the following matrix $J \in \mathcal{M}_n(\mathbb{R})$: $$J = \left(\begin{array}{cccc} 1 & 1 & \cdots & 1 \\ 1 & 1 & \cdots & 1 \\ \vdots & \vdots & \ddots & \vdots \\ 1 & 1 & \cdots & 1 \end{array}\right)$$
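Since all columns of $J$ are equal (and nonzero), the rank is 1; a quick check, which also exhibits the eigenvalues $n$ and $0$ used further on:

```python
import numpy as np

n = 5
J = np.ones((n, n))

assert np.linalg.matrix_rank(J) == 1
# Eigenvalues: n once (eigenvector of all ones) and 0 with multiplicity n-1.
eig = np.sort(np.linalg.eigvalsh(J))
assert np.allclose(eig, [0] * (n - 1) + [n])
```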
grandes-ecoles 2025 Q2 Eigenvalue and Characteristic Polynomial Analysis
We denote by $\lambda_{\text{max}}$ the largest of the eigenvalues of $J_n$ and $\lambda_{\text{min}}$ the smallest. Show that $$\forall x \in \Lambda_n, \quad n\lambda_{\min} \leqslant \sum_{1 \leqslant i,j \leqslant n} J_n(i,j) x_i x_j \leqslant n\lambda_{\max}$$
grandes-ecoles 2025 Q3 Linear Transformation and Endomorphism Properties
The linear map $\widehat{\xi}$ and the endomorphism $\xi$
We denote by $\widehat{\xi} : \mathbb{C}[X^{\pm 1}] \rightarrow \mathcal{D}$ the linear map that to a Laurent polynomial $F$ associates $$\widehat{\xi}(F) = \Pi(XF),$$ and by $\xi = \widehat{\xi}|_{\mathcal{D}}$ the endomorphism of $\mathcal{D}$ induced by $\widehat{\xi}$.
a) Let $F$ be an element of $\mathbb{C}[X^{\pm 1}]$. Prove that $\widehat{\xi}(\Pi(F)) = \widehat{\xi}(F)$.
b) Let $P$ be a polynomial and let $F$ be an element of $\mathcal{D}$. Prove that $P(\xi)(F) = \Pi(PF)$.
grandes-ecoles 2025 Q3 Determinant and Rank Computation
Conversely, let $K \in \mathcal{M}_n(\mathbb{R})$ be a square matrix of rank 1. Show that there exist $\mathbf{u}, \mathbf{v} \in \mathbb{R}^n \backslash \{\mathbf{0}\}$ such that $K = \mathbf{u v}^T$.
grandes-ecoles 2025 Q4 Linear Transformation and Endomorphism Properties
Image and kernel of powers of $\xi$
Let $n$ be a natural number. Prove that $\xi^n$ is surjective and give a basis of the kernel of $\xi^n$.
grandes-ecoles 2025 Q4 Matrix Algebra and Product Properties
Let $\mathbf{u}, \mathbf{v}, \mathbf{x}, \mathbf{y} \in \mathbb{R}^n \backslash \{0\}$. Show that $\mathbf{u v}^T = \mathbf{x y}^T$ if and only if there exists $\lambda \in \mathbb{R} \backslash \{0\}$ such that $$\mathbf{u} = \lambda \mathbf{x}, \quad \text{and} \quad \mathbf{v} = \frac{1}{\lambda} \mathbf{y}$$
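The easy direction of this equivalence is a one-line computation, $(\lambda^{-1}\mathbf{u})(\lambda \mathbf{v})^T = \mathbf{u}\mathbf{v}^T$, illustrated here with arbitrary sample vectors and a hypothetical $\lambda = 2.5$:

```python
import numpy as np

u = np.array([1.0, 2.0, -1.0])
v = np.array([3.0, 0.0, 5.0])
lam = 2.5

# With u = lam * x and v = (1/lam) * y, the outer products coincide:
x = u / lam
y = lam * v
assert np.allclose(np.outer(u, v), np.outer(x, y))
```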