Compute eigenvectors or eigenspaces

These questions ask for eigenvectors, a basis of eigenvectors, or a description of eigenspaces (including their dimensions) for a given matrix or linear map.

csat-suneung 2016 Q24 3 marks
For the system of linear equations in $x, y$ $$\left( \begin{array}{cc} 1 & a-2 \\ 2 & -1 \end{array} \right) \binom{x}{y} = 3 \binom{x}{y},$$ find the value of the constant $a$ for which the system has a solution other than $x = 0, y = 0$. [3 points]
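As a quick check (not part of the original exam), with $M$ the given $2 \times 2$ matrix, a nonzero solution exists exactly when $\det(M - 3I) = 0$; a sympy sketch of that condition:

```python
import sympy as sp

a = sp.symbols('a')
M = sp.Matrix([[1, a - 2], [2, -1]])
# A nonzero solution of (M - 3I)v = 0 exists iff det(M - 3I) = 0
det = (M - 3 * sp.eye(2)).det()   # 12 - 2a
solutions = sp.solve(sp.Eq(det, 0), a)
print(solutions)  # [6]
```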
grandes-ecoles 2011 Q4
Show that there exists a basis $\left( e _ { i } \right) _ { 1 \leq i \leq n }$ of $\mathbb { R } ^ { n }$ and $n$ strictly positive real numbers $\lambda _ { i } \in \mathbb { R } ^ { + * } ( 1 \leq i \leq n )$ such that $$\forall i \in \{ 1 , \ldots , n \} , A ^ { - 1 } K e _ { i } = \lambda _ { i } e _ { i }$$
grandes-ecoles 2011 QI.B.3
For $n \in \mathbb{N}^*$, $A \in \mathcal{S}_n(\mathbb{R})$ and $i \in \llbracket 1; n \rrbracket$, we denote by $A^{(i)}$ the square matrix of order $i$ extracted from $A$, consisting of the first $i$ rows and the first $i$ columns of $A$. For all $n \in \mathbb{N}^*$, we say that a matrix $A$ of $\mathcal{S}_n(\mathbb{R})$ satisfies property $\mathcal{P}_n$ if $\operatorname{det}\left(A^{(i)}\right) > 0$ for all $i \in \llbracket 1; n \rrbracket$.
Let $n \in \mathbb{N}^*$. We assume that any matrix of $\mathcal{S}_n(\mathbb{R})$ satisfying property $\mathcal{P}_n$ is positive definite. We consider a matrix $A$ of $\mathcal{S}_{n+1}(\mathbb{R})$ satisfying property $\mathcal{P}_{n+1}$ and we assume by contradiction that $A$ is not positive definite.
a) Show then that $A$ admits two linearly independent eigenvectors associated with eigenvalues (not necessarily distinct) that are strictly negative.
b) Deduce that there exists $X \in \mathcal{M}_{n+1,1}(\mathbb{R})$ whose last component is zero and such that ${}^t X A X < 0$.
c) Conclude.
grandes-ecoles 2011 QV.C.6
We set $D = (d_{ij})_{(i,j) \in \llbracket 1,n\rrbracket^2} = (\sqrt{m_{ij}})_{(i,j) \in \llbracket 1,n\rrbracket^2} \in \mathcal{M}_n(\mathbb{R})$ and $M_c = \left((d_{ij} + c\xi_i^j)^2\right)$ with $c > 0$. Let $c^* = \alpha^*$ be the minimal constant found previously, and $X^*$ the associated vector.
a) Show that $\Psi(M_{c^*}) X^* = 0$.
We set $Y^* = \frac{2}{c^*} \Psi(M) X^*$.
b) Show that the column vector $\binom{Y^*}{X^*}$ is an eigenvector of the $2n \times 2n$ matrix $\left(\begin{array}{cc}0 & 2\Psi(M) \\ -I_n & -4\Psi(D)\end{array}\right)$ and that $c^*$ is an eigenvalue of this matrix.
grandes-ecoles 2013 QI.B.2
Let $A, B \in \mathcal{M}_n(\mathbb{R})$ and $P \in \mathrm{GL}_n(\mathbb{R})$ such that $B = P^{-1}AP$. Show that if $\lambda$ is an eigenvalue of $A$, then $E_\lambda(f_A) = f_P(E_\lambda(f_B))$.
grandes-ecoles 2013 QIV.A.2
In this section we consider a circle $\mathcal{C}(\Omega, r)$ with center $\Omega$ and non-zero radius $r$, intersecting the $x$-axis. We denote by $L_1$ and $L_2$, with coordinates respectively $(\lambda_1, 0)$ and $(\lambda_2, 0)$, with $\lambda_1 < \lambda_2$, the two intersection points of $\mathcal{C}(\Omega, r)$ with the $x$-axis. Let $A = \left(\begin{array}{ll} a & b \\ c & d \end{array}\right)$ be a matrix whose eigenvalue circle equals $\mathcal{C}(\Omega, r)$. We keep the notations $E, F, G, H$ from III.D.
Show that if $c \neq 0$, then $\left(\overrightarrow{L_1 E}, \overrightarrow{L_2 E}\right)$ is a basis of $\mathbb{R}^2$ consisting of eigenvectors for $f_A$.
grandes-ecoles 2013 QIV.A.3
In this section we consider a circle $\mathcal{C}(\Omega, r)$ with center $\Omega$ and non-zero radius $r$, intersecting the $x$-axis. We denote by $L_1$ and $L_2$, with coordinates respectively $(\lambda_1, 0)$ and $(\lambda_2, 0)$, with $\lambda_1 < \lambda_2$, the two intersection points of $\mathcal{C}(\Omega, r)$ with the $x$-axis. Let $A = \left(\begin{array}{ll} a & b \\ c & d \end{array}\right)$ be a matrix whose eigenvalue circle equals $\mathcal{C}(\Omega, r)$. We keep the notations $E, F, G, H$ from III.D.
When $c = 0$, can we give a basis of eigenvectors for $f_A$ using the eigenvalue circle and the eigenvalue rectangle?
grandes-ecoles 2013 QIV.A.4
In this section we consider a circle $\mathcal{C}(\Omega, r)$ with center $\Omega$ and non-zero radius $r$, intersecting the $x$-axis. We denote by $L_1$ and $L_2$, with coordinates respectively $(\lambda_1, 0)$ and $(\lambda_2, 0)$, with $\lambda_1 < \lambda_2$, the two intersection points of $\mathcal{C}(\Omega, r)$ with the $x$-axis. Let $A = \left(\begin{array}{ll} a & b \\ c & d \end{array}\right)$ be a matrix whose eigenvalue circle equals $\mathcal{C}(\Omega, r)$. We keep the notations $E, F, G, H$ from III.D.
Show that the square of the cosine of the angle between two eigenvectors of $A$ associated with two distinct eigenvalues is determined by the circle $\mathcal{C}(\Omega, r)$, and does not depend on the choice of a matrix $A$ whose eigenvalue circle equals $\mathcal{C}(\Omega, r)$ (one may, if useful, introduce the orthogonal projection of $\Omega$ onto the $x$-axis). What can be said when $A$ is symmetric?
grandes-ecoles 2013 QIV.B.2
In this section we consider a circle $\mathcal{C}(\Omega, r)$ with center $\Omega$ and non-zero radius $r$, tangent to the $x$-axis. We call $L$, with coordinates $(\lambda, 0)$, the point of tangency of $\mathcal{C}(\Omega, r)$ with the $x$-axis. Let $A$ be a matrix whose eigenvalue circle equals $\mathcal{C}(\Omega, r)$. We keep the notations $E, F, G, H$ from III.D.
Can we give an eigenvector using the points $L, E, F, G$ and $H$?
grandes-ecoles 2013 QIV.B.5
In this section we consider a circle $\mathcal{C}(\Omega, r)$ with center $\Omega$ and non-zero radius $r$, tangent to the $x$-axis. We call $L$, with coordinates $(\lambda, 0)$, the point of tangency of $\mathcal{C}(\Omega, r)$ with the $x$-axis. Let $A$ be a matrix whose eigenvalue circle equals $\mathcal{C}(\Omega, r)$, and let $\alpha$ be the unique non-zero real such that $A$ is directly orthogonally similar to $T_{\lambda,\alpha} = \left(\begin{array}{cc} \lambda & \alpha \\ 0 & \lambda \end{array}\right)$.
Show that there exists an orthonormal direct basis $(e_1, e_2)$ of the plane such that for all $u$ in $\mathbb{R}^2$, we have $f_A(u) = \lambda u + \alpha (e_2 \mid u) e_1$.
grandes-ecoles 2014 QI.A.1
Let $F$ and $G$ be two complementary subspaces of $E$ and $s$ the symmetry with respect to $F$ parallel to $G$.
a) Show that $F = F_s$ and $G = G_s$.
b) Show that $s \circ s = \operatorname{Id}_E$. Deduce that $s$ is an automorphism of $E$.
c) Determine the eigenvalues and eigenspaces of $s$; discuss according to the subspaces $F$ and $G$.
grandes-ecoles 2018 Q31
Let $\lambda$ be an eigenvalue of $B$ and $Y = \left(\begin{array}{c} y_{1} \\ \vdots \\ y_{q} \end{array}\right)$ an associated eigenvector. Show that, if we impose $y_{0} = y_{q+1} = 0$, then, for all $k \in \llbracket 1, q \rrbracket$, $y_{k-1} - \lambda y_{k} + y_{k+1} = 0$.
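The stated recurrence is consistent with $B$ being the $q \times q$ tridiagonal matrix with zero diagonal and unit off-diagonal entries; this form of $B$ is an assumption (the excerpt does not define it). A numpy check at an illustrative size:

```python
import numpy as np

q = 4  # illustrative size
# Assumed B: zero diagonal, ones on the sub- and super-diagonals
B = np.diag(np.ones(q - 1), 1) + np.diag(np.ones(q - 1), -1)
lam = np.linalg.eigvalsh(B)[0]          # an eigenvalue of B
Y = np.linalg.eigh(B)[1][:, 0]          # an associated eigenvector
y = np.concatenate(([0.0], Y, [0.0]))   # impose y_0 = y_{q+1} = 0
residuals = [y[k - 1] - lam * y[k] + y[k + 1] for k in range(1, q + 1)]
print(max(abs(r) for r in residuals))   # ~0: the recurrence holds for all k
```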
grandes-ecoles 2019 Q4
Let $\left(a_0, a_1, \ldots, a_{n-1}\right) \in \mathbb{K}^n$ and $Q(X) = X^n + a_{n-1}X^{n-1} + \cdots + a_0$. We consider the companion matrix
$$C_Q = \left(\begin{array}{cccccc} 0 & \cdots & \cdots & \cdots & 0 & -a_0 \\ 1 & 0 & \cdots & \cdots & 0 & -a_1 \\ 0 & 1 & \ddots & & \vdots & -a_2 \\ \vdots & \ddots & \ddots & \ddots & \vdots & \vdots \\ \vdots & & \ddots & 1 & 0 & -a_{n-2} \\ 0 & \cdots & \cdots & 0 & 1 & -a_{n-1} \end{array}\right).$$
Let $\lambda$ be an eigenvalue of $C_Q^{\top}$. Determine the dimension and a basis of the associated eigenspace.
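A concrete instance, with an illustrative cubic chosen only for the check (not from the problem): if $Q(\lambda) = 0$, the vector $(1, \lambda, \ldots, \lambda^{n-1})$ satisfies $C_Q^{\top} v = \lambda v$.

```python
import numpy as np

# Illustrative cubic: Q(X) = X^3 - 6X^2 + 11X - 6 = (X-1)(X-2)(X-3)
a0, a1, a2 = -6.0, 11.0, -6.0
C_Q = np.array([[0.0, 0.0, -a0],
                [1.0, 0.0, -a1],
                [0.0, 1.0, -a2]])
lam = 2.0                         # a root of Q, hence an eigenvalue of C_Q^T
v = np.array([1.0, lam, lam**2])  # candidate eigenvector (1, lam, ..., lam^{n-1})
print(C_Q.T @ v)                  # equals lam * v
```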
grandes-ecoles 2022 Q2
We set $A_1 = \left(\begin{array}{ccc} 3 & -2 & 4 \\ -2 & 6 & 2 \\ 4 & 2 & 3 \end{array}\right)$.
By observing the first and last column of $A_1$, determine an eigenvector of $A_1$ and the associated eigenvalue $\lambda_1$.
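The observation can be verified numerically: the first and last columns sum to $(7, 0, 7)^{\top}$, i.e. $7$ times $(1, 0, 1)^{\top}$. A numpy check:

```python
import numpy as np

A1 = np.array([[3, -2, 4],
               [-2, 6, 2],
               [4, 2, 3]])
v = np.array([1, 0, 1])  # first column + last column = 7 * v
print(A1 @ v)            # [7 0 7]: v is an eigenvector with lambda_1 = 7
```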
grandes-ecoles 2022 Q3.3
We denote by $G_0$ the subgroup of $G$ formed by elements $g$ such that $g(\mathcal{H})=\mathcal{H}$. For all $w\in V$ such that $B(w,w)>0$, we define the linear map $$s_w : v \mapsto v - 2\frac{B(v,w)}{B(w,w)}w.$$ Show that $s_w^2 = \mathrm{Id}_V$, and determine the eigenvalues and eigenspaces of $s_w$.
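For intuition, in the special case where $B$ is the standard dot product on $\mathbb{R}^3$ (an assumption made only for this illustration), $s_w$ has matrix $I - 2\,ww^{\top}/(w^{\top}w)$; a numpy check of the involution and of the eigenvalue $-1$ on $\operatorname{span}(w)$:

```python
import numpy as np

# Assumed special case: B(v, w) = v . w on R^3
w = np.array([1.0, 2.0, 2.0])                     # any w with B(w, w) > 0
S = np.eye(3) - 2.0 * np.outer(w, w) / (w @ w)    # matrix of s_w
print(np.allclose(S @ S, np.eye(3)))              # True: s_w^2 = Id
print(np.allclose(S @ w, -w))                     # True: w is an eigenvector for -1
# the eigenspace for +1 is the B-orthogonal hyperplane of w
```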
grandes-ecoles 2024 Q7
Deduce the characteristic polynomial of a graph with $n$ vertices whose non-isolated vertices form a star with $d$ branches, where $1 \leq d \leq n - 1$. Then determine the eigenvalues and eigenvectors of an adjacency matrix of this graph.
grandes-ecoles 2024 Q16
Let $Z \in \mathscr{M}_{d}(\mathbb{R})$ be an invertible matrix. We denote $\mathrm{S} = Z^{T}Z$. Show that there exists a decreasing family $(\lambda_{i})_{1 \leqslant i \leqslant d}$ of strictly positive real numbers and an orthonormal basis $(u_{1}, \ldots, u_{d})$ of $\mathbb{R}^{d}$ such that $Su_{i} = \lambda_{i} u_{i}$ for all $1 \leqslant i \leqslant d$.
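The statement is the spectral theorem applied to the symmetric positive-definite matrix $S = Z^{T}Z$; a numpy illustration with a randomly drawn (hence almost surely invertible) sample $Z$:

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal((4, 4))   # sample matrix, invertible with probability 1
S = Z.T @ Z                       # symmetric, positive definite since Z is invertible
vals, U = np.linalg.eigh(S)       # orthonormal eigenvectors, ascending eigenvalues
lams = vals[::-1]                 # reorder into a decreasing family
print((lams > 0).all())           # all eigenvalues strictly positive
print(np.allclose(U.T @ U, np.eye(4)))  # the eigenbasis is orthonormal
```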
todai-math 2017 Q1
Suppose that three-dimensional vectors $\left( \begin{array} { c } x _ { n } \\ y _ { n } \\ z _ { n } \end{array} \right)$ satisfy the equation
$$\left( \begin{array} { l } x _ { n + 1 } \\ y _ { n + 1 } \\ z _ { n + 1 } \end{array} \right) = A \left( \begin{array} { l } x _ { n } \\ y _ { n } \\ z _ { n } \end{array} \right) \quad ( n = 0,1,2 , \ldots )$$
where $x _ { 0 } , y _ { 0 } , z _ { 0 }$ and $\alpha$ are real numbers, and
$$A = \left( \begin{array} { c c c } 1 - 2 \alpha & \alpha & \alpha \\ \alpha & 1 - \alpha & 0 \\ \alpha & 0 & 1 - \alpha \end{array} \right) , \quad 0 < \alpha < \frac { 1 } { 3 }$$
Answer the following questions.
(1) Express $x _ { n } + y _ { n } + z _ { n }$ using $x _ { 0 } , y _ { 0 }$ and $z _ { 0 }$.
(2) Obtain the eigenvalues $\lambda _ { 1 } , \lambda _ { 2 }$ and $\lambda _ { 3 }$, and their corresponding eigenvectors $\boldsymbol { v } _ { \mathbf { 1 } } , \boldsymbol { v } _ { \mathbf { 2 } }$ and $\boldsymbol { v } _ { \mathbf { 3 } }$ of the matrix $A$.
(3) Express the matrix $A$ using $\lambda _ { 1 } , \lambda _ { 2 } , \lambda _ { 3 } , \boldsymbol { v } _ { 1 } , \boldsymbol { v } _ { 2 }$ and $\boldsymbol { v } _ { 3 }$.
(4) Express $\left( \begin{array} { l } x _ { n } \\ y _ { n } \\ z _ { n } \end{array} \right)$ using $x _ { 0 } , y _ { 0 } , z _ { 0 }$ and $\alpha$.
(5) Obtain $\lim _ { n \rightarrow \infty } \left( \begin{array} { l } x _ { n } \\ y _ { n } \\ z _ { n } \end{array} \right)$.
(6) Regard
$$f \left( x _ { 0 } , y _ { 0 } , z _ { 0 } \right) = \frac { \left( x _ { n } , y _ { n } , z _ { n } \right) \left( \begin{array} { l } x _ { n + 1 } \\ y _ { n + 1 } \\ z _ { n + 1 } \end{array} \right) } { \left( x _ { n } , y _ { n } , z _ { n } \right) \left( \begin{array} { l } x _ { n } \\ y _ { n } \\ z _ { n } \end{array} \right) }$$
as a function of $x _ { 0 } , y _ { 0 }$ and $z _ { 0 }$. Obtain the maximum and the minimum values of $f \left( x _ { 0 } , y _ { 0 } , z _ { 0 } \right)$, where we assume that $x _ { 0 } ^ { 2 } + y _ { 0 } ^ { 2 } + z _ { 0 } ^ { 2 } \neq 0$.
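A numerical sanity check of part (2) at a sample value of $\alpha$ (the value $0.2$ is an illustrative assumption within the stated range $0 < \alpha < \tfrac{1}{3}$):

```python
import numpy as np

alpha = 0.2  # sample value in (0, 1/3), chosen only for illustration
A = np.array([[1 - 2 * alpha, alpha, alpha],
              [alpha, 1 - alpha, 0],
              [alpha, 0, 1 - alpha]])
vals = np.sort(np.linalg.eigvalsh(A))   # A is symmetric, so eigvalsh applies
print(vals)  # [1 - 3*alpha, 1 - alpha, 1] = [0.4, 0.8, 1.0]
```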
todai-math 2019 Q2
I. Answer the following questions about the matrix $\boldsymbol { P }$: $$\boldsymbol { P } = \left( \begin{array} { c c c } 0 & 0 & \frac { 3 } { 2 } \\ 2 & 0 & 0 \\ 0 & \frac { 1 } { 3 } & 0 \end{array} \right)$$
  1. Obtain all eigenvalues of the matrix $\boldsymbol { P }$ and the corresponding eigenvectors with unit norms.
  2. Obtain $P ^ { 2 }$ and $P ^ { 3 }$.
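Since $P$ sends $e_1 \mapsto 2e_2 \mapsto \tfrac{2}{3} e_3 \mapsto e_1$, the cycle weights multiply to $2 \cdot \tfrac{1}{3} \cdot \tfrac{3}{2} = 1$, so $P^3 = I_3$ and the eigenvalues all have unit modulus. A numpy check:

```python
import numpy as np

P = np.array([[0, 0, 1.5],
              [2, 0, 0],
              [0, 1 / 3, 0]])
P3 = np.linalg.matrix_power(P, 3)
print(np.allclose(P3, np.eye(3)))        # True: P^3 is the identity
vals = np.linalg.eigvals(P)              # cube roots of unity
print(np.allclose(np.abs(vals), 1.0))    # True: each eigenvalue has modulus 1
```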

II. Let $\boldsymbol { A }$ be the real block-diagonal matrix $$\boldsymbol { A } = \left( \begin{array} { c c c c c } 0 & 0 & c & 0 & 0 \\ a & 0 & 0 & 0 & 0 \\ 0 & b & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & e \\ 0 & 0 & 0 & d & 0 \end{array} \right)$$ Express succinctly the necessary and sufficient condition on $a, b, c, d$, and $e$ for the existence of a positive integer $m$ such that $\boldsymbol { A } ^ { m }$ is the identity matrix (proof is not required).
III. Let $M$ be a square matrix of order 12 whose entries are all 0 or 1, such that each row and each column contains exactly one entry equal to 1. Let $k _ { 0 }$ be the smallest positive integer $k$ such that $M ^ { k }$ is the identity matrix. Over all such matrices $M$, give the maximum value of $k _ { 0 }$ (proof is not required).