
All Questions
Let $M = (m_{ij})_{(i,j) \in \llbracket 1,n\rrbracket^2} \in \mathcal{S}_n(\mathbb{R})$ be such that for every pair $(i,j) \in \llbracket 1,n\rrbracket^2$, $m_{ij} \geqslant 0$ and $m_{ii} = 0$. Conversely, we assume that the eigenvalues of $\Phi(M)$ are all non-positive, and we set $\Psi(M) = -\frac{1}{2}\Phi(M)$ and $r = \operatorname{rg}(\Psi(M))$.
1) Show that there exists a matrix $U \in \mathcal{M}_{r,n}(\mathbb{R})$ such that ${}^t UU = \Psi(M)$.
2) We denote $U_1, U_2, \cdots, U_n$ the columns of the matrix $U$. We seek to show that for every $(i,j) \in \llbracket 1,n\rrbracket^2$, $m_{ij} = \|U_i - U_j\|^2$.
a) Show that the $(U_i)$ are centered, that is, $\sum_{i=1}^n U_i = 0$.
b) Show that the matrix $N = (n_{ij})$ defined by: $$\forall (i,j) \in \llbracket 1,n\rrbracket^2, \quad n_{ij} = \|U_i - U_j\|^2$$ satisfies $\Psi(N) = \Psi(M)$.
c) Show that $M = N$ and conclude.
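The operator $\Phi$ is defined in a part of the problem not reproduced here; the sketch below assumes the classical multidimensional-scaling convention $\Phi(M) = JMJ$ with $J = I_n - \frac{1}{n}\mathbb{1}\,{}^t\mathbb{1}$, so that $\Psi(M) = -\frac{1}{2}JMJ$ is the centered Gram matrix. Under that assumption, a numpy check of questions 1) and 2): factor $\Psi(M) = {}^tUU$, verify that the columns of $U$ are centered, and recover $m_{ij} = \|U_i - U_j\|^2$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 6, 3

# Squared-distance matrix of n random points of R^3.
X = rng.normal(size=(r, n))
M = np.array([[np.sum((X[:, i] - X[:, j])**2) for j in range(n)]
              for i in range(n)])

# Assumed definition: Phi(M) = J M J with J the centering projector,
# hence Psi(M) = -1/2 J M J (classical multidimensional scaling).
J = np.eye(n) - np.ones((n, n)) / n
Psi = -0.5 * J @ M @ J

# Question 1: Psi(M) is PSD of rank r, so Psi(M) = tU U with U of size (r, n).
w, V = np.linalg.eigh(Psi)
pos = w > 1e-10
U = np.sqrt(w[pos])[:, None] * V[:, pos].T      # shape (rank, n)

assert U.shape[0] == 3                          # r = rg(Psi(M)) = 3
assert np.allclose(U.T @ U, Psi)                # tU U = Psi(M)
# Question 2a: the columns of U are centered.
assert np.allclose(U.sum(axis=1), 0)
# Question 2b-c: the distances are recovered, m_ij = ||U_i - U_j||^2.
N = np.array([[np.sum((U[:, i] - U[:, j])**2) for j in range(n)]
              for i in range(n)])
assert np.allclose(N, M)
```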
We consider four distinct points $A, B, C$ and $D$ in the canonical Euclidean space $\mathbb{R}^3$ such that $AB = BC = CD = DA = 1$, $AC = a > 0$ and $BD = b > 0$. We assume that the four points $A, B, C$ and $D$ exist and are coplanar. What relation do $a$ and $b$ then satisfy?
We consider four distinct points $A, B, C$ and $D$ in the canonical Euclidean space $\mathbb{R}^3$ such that $AB = BC = CD = DA = 1$, $AC = a > 0$ and $BD = b > 0$. We assume that the four distinct points $A, B, C$ and $D$ are not coplanar. We denote $I$ the midpoint of $[AC]$ and $J$ the midpoint of $[BD]$.
a) Show that $(IJ)$ is the common perpendicular to the lines $(AC)$ and $(BD)$.
b) By projecting the points $B$ and $D$ onto the plane containing $(AC)$ and perpendicular to $(IJ)$, show that $a^2 + b^2 < 4$.
We consider four distinct points $A, B, C$ and $D$ in the canonical Euclidean space $\mathbb{R}^3$ such that $AB = BC = CD = DA = 1$, $AC = a > 0$ and $BD = b > 0$.
Show that if strictly positive reals $a$ and $b$ satisfy the relation $a^2 + b^2 \leqslant 4$, then there indeed exist four distinct points $A, B, C$ and $D$ in the canonical Euclidean space $\mathbb{R}^3$ satisfying $AB = BC = CD = DA = 1$, $AC = a$ and $BD = b$.
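A numeric sketch of one possible construction (suggested by the midpoint questions above, and stated here as an assumption rather than the intended official solution): put $[AC]$ on the $x$-axis centered at the origin and $[BD]$ parallel to the $y$-axis at height $h$ with $h^2 = 1 - \frac{a^2+b^2}{4} \geqslant 0$, which is possible exactly when $a^2 + b^2 \leqslant 4$.

```python
import numpy as np

def rhombus_points(a, b):
    """Four points with AB = BC = CD = DA = 1, AC = a, BD = b,
    assuming a^2 + b^2 <= 4: [AC] on the x-axis, [BD] parallel to
    the y-axis at height h, with h^2 = 1 - (a^2 + b^2)/4 >= 0."""
    h = np.sqrt(1 - (a**2 + b**2) / 4)
    A = np.array([ a / 2, 0.0, 0.0])
    C = np.array([-a / 2, 0.0, 0.0])
    B = np.array([0.0,  b / 2, h])
    D = np.array([0.0, -b / 2, h])
    return A, B, C, D

A, B, C, D = rhombus_points(1.0, 1.2)          # 1.0 + 1.44 <= 4
d = np.linalg.norm
# Each side has squared length (a^2 + b^2)/4 + h^2 = 1.
assert np.isclose(d(A - B), 1) and np.isclose(d(B - C), 1)
assert np.isclose(d(C - D), 1) and np.isclose(d(D - A), 1)
assert np.isclose(d(A - C), 1.0) and np.isclose(d(B - D), 1.2)
# The points are coplanar exactly when h = 0, i.e. a^2 + b^2 = 4.
```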
We consider four points $U_1, U_2, U_3, U_4$ in $\mathbb{R}^3$ satisfying $U_1U_2 = U_2U_3 = U_3U_4 = U_4U_1 = 1$, $U_1U_3 = a$ and $U_2U_4 = b$. We use the notations of the previous parts with $n = 4$.
We set $M = \left(\|U_i - U_j\|^2\right)_{(i,j) \in \llbracket 1,4\rrbracket^2} \in \mathcal{S}_4(\mathbb{R})$.
Write the matrix $M$, then calculate $S(M)$ and $\sigma(M)$.
We consider four points $U_1, U_2, U_3, U_4$ in $\mathbb{R}^3$ satisfying $U_1U_2 = U_2U_3 = U_3U_4 = U_4U_1 = 1$, $U_1U_3 = a$ and $U_2U_4 = b$. We set $\Psi(M) = -\frac{1}{2}\Phi(M)$.
Show that the vectors $$\left(\begin{array}{r}1\\0\\-1\\0\end{array}\right), \left(\begin{array}{r}0\\1\\0\\-1\end{array}\right), \left(\begin{array}{r}-1\\1\\-1\\1\end{array}\right), \left(\begin{array}{l}1\\1\\1\\1\end{array}\right)$$ form a basis of eigenvectors of the matrix $\Psi(M)$ and determine the eigenvalues of the matrix $\Psi(M)$.
We consider four points $U_1, U_2, U_3, U_4$ in $\mathbb{R}^3$ satisfying $U_1U_2 = U_2U_3 = U_3U_4 = U_4U_1 = 1$, $U_1U_3 = a$ and $U_2U_4 = b$. We set $\Psi(M) = -\frac{1}{2}\Phi(M)$.
Determine the rank of $\Psi(M)$ according to the values taken by $a$ and $b$.
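Again assuming $\Phi(M) = JMJ$ with $J = I_4 - \frac{1}{4}\mathbb{1}\,{}^t\mathbb{1}$ (the operator is defined in a part not reproduced here), one finds that the four vectors of the previous question are eigenvectors of $\Psi(M)$ with eigenvalues $\frac{a^2}{2}$, $\frac{b^2}{2}$, $1 - \frac{a^2+b^2}{4}$ and $0$; a numpy check of this and of the resulting rank:

```python
import numpy as np

def psi(a, b):
    """Psi(M) = -1/2 J M J for the 4-point distance matrix, assuming
    Phi denotes double centering by J = I - (1/4) * ones."""
    M = np.array([[0.0,   1.0,  a**2, 1.0 ],
                  [1.0,   0.0,  1.0,  b**2],
                  [a**2,  1.0,  0.0,  1.0 ],
                  [1.0,   b**2, 1.0,  0.0 ]])
    J = np.eye(4) - np.ones((4, 4)) / 4
    return -0.5 * J @ M @ J

a, b = 1.1, 0.9
P = psi(a, b)
vecs = [np.array(v, dtype=float) for v in
        ([1, 0, -1, 0], [0, 1, 0, -1], [-1, 1, -1, 1], [1, 1, 1, 1])]
vals = [a**2 / 2, b**2 / 2, 1 - (a**2 + b**2) / 4, 0.0]
for v, lam in zip(vecs, vals):
    assert np.allclose(P @ v, lam * v)

# Since a, b > 0 the first two eigenvalues are > 0, so
# rg Psi(M) = 3 if a^2 + b^2 != 4, and 2 if a^2 + b^2 = 4.
assert np.linalg.matrix_rank(psi(1.1, 0.9)) == 3
assert np.linalg.matrix_rank(psi(np.sqrt(2), np.sqrt(2))) == 2
# Psi(M) PSD forces 1 - (a^2 + b^2)/4 >= 0, i.e. a^2 + b^2 <= 4.
```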
We consider four points $U_1, U_2, U_3, U_4$ in $\mathbb{R}^3$ satisfying $U_1U_2 = U_2U_3 = U_3U_4 = U_4U_1 = 1$, $U_1U_3 = a$ and $U_2U_4 = b$. We set $\Psi(M) = -\frac{1}{2}\Phi(M)$.
What equality do the reals $a$ and $b$ satisfy when the points $U_1, U_2, U_3$ and $U_4$ are coplanar?
We consider four points $U_1, U_2, U_3, U_4$ in $\mathbb{R}^3$ satisfying $U_1U_2 = U_2U_3 = U_3U_4 = U_4U_1 = 1$, $U_1U_3 = a$ and $U_2U_4 = b$. We set $\Psi(M) = -\frac{1}{2}\Phi(M)$.
Recover that the strictly positive reals $a$ and $b$ satisfy $a^2 + b^2 \leqslant 4$.
We consider four points $U_1, U_2, U_3, U_4$ in $\mathbb{R}^3$ satisfying $U_1U_2 = U_2U_3 = U_3U_4 = U_4U_1 = 1$, $U_1U_3 = a$ and $U_2U_4 = b$. We set $\Psi(M) = -\frac{1}{2}\Phi(M)$.
Conversely, if $a^2 + b^2 \leqslant 4$, give a family of points $U_1, U_2, U_3$ and $U_4$ satisfying the mutual distance constraints.
We consider a matrix $M = (m_{ij}) \in \mathcal{S}_n(\mathbb{R})$ such that for every $(i,j) \in \llbracket 1,n\rrbracket^2$, $m_{ij} \geqslant 0$ and $m_{ii} = 0$. We assume that $\Psi(M)$ has at least one strictly negative eigenvalue.
We seek to prove that there exists a unique symmetric matrix $T_0$ with non-negative eigenvalues that minimizes $\|\Psi(M) - T\|_{\mathcal{M}_n(\mathbb{R})}$ when $T$ ranges over $\mathcal{S}_n^+(\mathbb{R})$.
a) Show that $$\forall Q \in \mathcal{O}_n(\mathbb{R}), \forall A \in \mathcal{M}_n(\mathbb{R}), \quad \|{}^t QAQ\|_{\mathcal{M}_n(\mathbb{R})} = \|A\|_{\mathcal{M}_n(\mathbb{R})}$$
b) Justify the existence of a matrix $Q_0 \in \mathcal{O}_n(\mathbb{R})$ such that the matrix ${}^t Q_0 \Psi(M) Q_0$ is diagonal.
c) Show that a necessary condition for $T_0$ to minimize $\|\Psi(M) - T\|_{\mathcal{M}_n(\mathbb{R})}$ when $T$ ranges over $\mathcal{S}_n^+(\mathbb{R})$ is that the matrix ${}^t Q_0 T_0 Q_0$ be diagonal.
d) Prove the existence and uniqueness of the matrix $T_0$ sought.
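The matrix $T_0$ of this question turns out to be the spectral truncation of $\Psi(M)$: diagonalize and replace the strictly negative eigenvalues by $0$. A numerical sketch of that characterization (with a generic symmetric matrix standing in for $\Psi(M)$):

```python
import numpy as np

def nearest_psd(S):
    """Frobenius-nearest PSD matrix to a symmetric S: diagonalize
    and clip the negative eigenvalues to zero (the matrix T_0)."""
    w, Q = np.linalg.eigh(S)
    return (Q * np.maximum(w, 0)) @ Q.T

rng = np.random.default_rng(1)
S = rng.normal(size=(5, 5))
S = (S + S.T) / 2                      # symmetric, generically indefinite
T0 = nearest_psd(S)

assert np.allclose(T0, T0.T)
assert np.linalg.eigvalsh(T0).min() >= -1e-12
# The distance ||S - T0|| is the l2 norm of the negative eigenvalues.
assert np.isclose(np.linalg.norm(S - T0),
                  np.linalg.norm(np.minimum(np.linalg.eigvalsh(S), 0)))
# T0 beats random PSD competitors for the Frobenius distance.
best = np.linalg.norm(S - T0)
for _ in range(200):
    X = rng.normal(size=(5, 5))
    T = X @ X.T                        # an arbitrary PSD matrix
    assert np.linalg.norm(S - T) >= best - 1e-9
```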
We consider a matrix $M = (m_{ij}) \in \mathcal{S}_n(\mathbb{R})$ such that for every $(i,j) \in \llbracket 1,n\rrbracket^2$, $m_{ij} \geqslant 0$ and $m_{ii} = 0$. We assume that $\Psi(M)$ has at least one strictly negative eigenvalue. Let $T_0$ be the unique symmetric matrix with non-negative eigenvalues minimizing $\|\Psi(M) - T\|_{\mathcal{M}_n(\mathbb{R})}$ over $\mathcal{S}_n^+(\mathbb{R})$.
We assume in this question that $T_0$ is non-zero. We want to show that there exists a minimal integer $p \in \llbracket 1, n-1\rrbracket$, to be specified, for which one can determine vectors $U_1, U_2, \cdots, U_n$ of $\mathbb{R}^p$ satisfying the condition $\sum_{i=1}^n U_i = 0$ and for which the matrix $\widetilde{M} = \left(\|U_i - U_j\|^2\right)_{(i,j) \in \llbracket 1,n\rrbracket^2}$ satisfies the relation $\Psi(\widetilde{M}) = T_0$.
We use the notations of part II and denote $U = (U_1 | U_2 | \cdots | U_n)$.
a) Show that the integer $p$ satisfies $p \geqslant \operatorname{rg}(T_0)$ and that $\operatorname{rg}(T_0) \in \llbracket 1, n-1\rrbracket$.
b) Construct a matrix $U \in \mathcal{M}_{r,n}(\mathbb{R})$ such that ${}^t UU = T_0$ for $r = \operatorname{rg}(T_0)$.
Hint. Assuming that ${}^t Q_0 T_0 Q_0$ is of the form $\left(\begin{array}{cc}\Delta & \\ & 0_{n-r}\end{array}\right)$ with $\Delta \in \mathcal{M}_r(\mathbb{R})$ diagonal with non-zero diagonal entries, we will seek $U$ in the form $U = \left(\begin{array}{cc}\Delta_1 & 0\end{array}\right) {}^t Q_0 \in \mathcal{M}_{r,n}(\mathbb{R})$ with $\Delta_1 \in \mathcal{M}_r(\mathbb{R})$ diagonal.
c) Show that $\sum_{i=1}^n U_i = 0$ (we may study the vector $UZ$).
d) Deduce that $\Psi(\widetilde{M}) = T_0$ with $\widetilde{M} = \left(\|U_i - U_j\|^2\right)_{(i,j) \in \llbracket 1,n\rrbracket^2}$ and conclude.
In this part all matrices are square of size $n$, where $n$ is an integer greater than or equal to 2. We say that a real symmetric matrix is positive definite if and only if all its eigenvalues are strictly positive.
Let $A$ and $B$ be two positive definite symmetric matrices, and $\alpha$ and $\beta$ two strictly positive real numbers such that $\alpha + \beta = 1$; prove that: $$\det(\alpha A + \beta B) \geqslant (\det A)^{\alpha} (\det B)^{\beta}$$
In this part all matrices are square of size $n$, where $n$ is an integer greater than or equal to 2. We say that a real symmetric matrix is positive definite if and only if all its eigenvalues are strictly positive.
For $1 \leqslant i \leqslant k$, let $A_i$ be positive definite symmetric matrices and $\alpha_i$ strictly positive real numbers such that $\alpha_1 + \cdots + \alpha_k = 1$. Prove that $$\det\left(\alpha_1 A_1 + \cdots + \alpha_k A_k\right) \geqslant \left(\det A_1\right)^{\alpha_1} \cdots \left(\det A_k\right)^{\alpha_k}$$
One may reason by induction on $k$.
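The inequality of the two questions above (log-concavity of the determinant on the positive definite cone) can be sanity-checked numerically on random positive definite matrices and random convex weights:

```python
import numpy as np

rng = np.random.default_rng(2)

def random_spd(n):
    """A random positive definite symmetric matrix."""
    X = rng.normal(size=(n, n))
    return X @ X.T + np.eye(n)

# Check det(a1 A1 + ... + ak Ak) >= det(A1)^a1 * ... * det(Ak)^ak
# on random positive definite matrices and random weights summing to 1.
for _ in range(100):
    k, n = 4, 5
    mats = [random_spd(n) for _ in range(k)]
    alpha = rng.random(k)
    alpha /= alpha.sum()
    lhs = np.linalg.det(sum(a * A for a, A in zip(alpha, mats)))
    rhs = np.prod([np.linalg.det(A)**a for a, A in zip(alpha, mats)])
    assert lhs >= rhs * (1 - 1e-9)
```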
Let $n \in \mathbb{N}$. Using the sequence $\left(\alpha_k^{(n)}\right)_{k \in \mathbb{N}}$ of zeros of $\varphi_n$ constructed in IV.D.1, deduce that this sequence satisfies the asymptotic distribution property:
$$\forall c \in \left]0,1\right[, \quad \exists j \in \mathbb{N} \quad \text{such that} \quad \forall k \in \mathbb{N}, \quad 0 < \alpha_{j+k+1}^{(n)} - \alpha_{j+k}^{(n)} < \frac{\pi}{c}$$
In this question, $f$ denotes the linear form defined by $\forall M \in \mathcal{M}_n(\mathbb{R}), f(M) = \sum_{j=1}^n \sum_{i=j}^n m_{i,j}$, and $A$ is the matrix such that $\forall M \in \mathcal{M}_n(\mathbb{R}), f(M) = \operatorname{Tr}(AM)$.
Show that $M_n = \sum_{k=1}^n \dfrac{1}{2\cos\dfrac{k\pi}{2n+1}}$.
We denote $\alpha$ a real number such that $\alpha > -1/2$, $E$ the $\mathbb{R}$-vector space of functions of class $\mathcal{C}^\infty$ on $[-1,1]$ with real values, and $$S_\alpha(f,g) = \int_{-1}^{1} f(t)g(t)\left(1-t^2\right)^{\alpha - \frac{1}{2}} \mathrm{~d}t$$ Verify that $S_\alpha$ is an inner product on $E$.
We denote $\alpha > -1/2$, $E$ the $\mathbb{R}$-vector space of functions of class $\mathcal{C}^\infty$ on $[-1,1]$ with real values, and $$\varphi_\alpha(y) : t \mapsto \left(1-t^2\right)y''(t) - (2\alpha+1)t\,y'(t)$$ Justify that $\varphi_\alpha$ is an endomorphism of $E$. Is it injective?
We denote $\alpha > -1/2$, $E$ the $\mathbb{R}$-vector space of functions of class $\mathcal{C}^\infty$ on $[-1,1]$ with real values, $$\varphi_\alpha(y) : t \mapsto \left(1-t^2\right)y''(t) - (2\alpha+1)t\,y'(t)$$ and $$S_\alpha(f,g) = \int_{-1}^{1} f(t)g(t)\left(1-t^2\right)^{\alpha - \frac{1}{2}} \mathrm{~d}t$$ Show that $$\forall (f,g) \in E^2, \quad S_\alpha\left(\varphi_\alpha(f), g\right) = S_\alpha\left(f, \varphi_\alpha(g)\right)$$ One may calculate the derivative of $t \mapsto \left(1-t^2\right)^{\alpha+\frac{1}{2}} f'(t)$.
We denote $\alpha > -1/2$, $F_n$ the vector subspace of $E$ of polynomial functions of degree less than or equal to $n$ (where $n \in \mathbb{N}$), and $$\varphi_\alpha(y) : t \mapsto \left(1-t^2\right)y''(t) - (2\alpha+1)t\,y'(t)$$ Justify that $\varphi_\alpha$ induces on $F_n$ an endomorphism and that this induced endomorphism (still denoted $\varphi_\alpha$) is diagonalizable.
We denote $\alpha > -1/2$, $F_n$ the vector subspace of $E$ of polynomial functions of degree less than or equal to $n$ (where $n \in \mathbb{N}$), and $$\varphi_\alpha(y) : t \mapsto \left(1-t^2\right)y''(t) - (2\alpha+1)t\,y'(t)$$ Show that there exists a basis of $F_n$ consisting of eigenvectors of $\varphi_\alpha$ of pairwise distinct degrees.
We denote $\alpha > -1/2$, $F_n$ the vector subspace of $E$ of polynomial functions of degree less than or equal to $n$ (where $n \in \mathbb{N}$), and $$\varphi_\alpha(y) : t \mapsto \left(1-t^2\right)y''(t) - (2\alpha+1)t\,y'(t)$$ Verify that two eigenvectors of $\varphi_\alpha$ of distinct degrees are associated with distinct eigenvalues. One may be interested in the leading coefficient of a judiciously chosen polynomial.
We denote $\alpha > -1/2$, $F_n$ the vector subspace of $E$ of polynomial functions of degree less than or equal to $n$ (where $n \in \mathbb{N}$), $$\varphi_\alpha(y) : t \mapsto \left(1-t^2\right)y''(t) - (2\alpha+1)t\,y'(t)$$ and $$S_\alpha(f,g) = \int_{-1}^{1} f(t)g(t)\left(1-t^2\right)^{\alpha - \frac{1}{2}} \mathrm{~d}t$$ Justify that two eigenvectors of $\varphi_\alpha$ of distinct degrees are orthogonal (with respect to $S_\alpha$).
We denote $\alpha > -1/2$, $F_n$ the vector subspace of $E$ of polynomial functions of degree less than or equal to $n$ (where $n \in \mathbb{N}$), and $$\varphi_\alpha(y) : t \mapsto \left(1-t^2\right)y''(t) - (2\alpha+1)t\,y'(t)$$ Show that any eigenvector of $\varphi_\alpha$ of degree greater than or equal to 1 vanishes at least once in the interval $]-1,1[$.
We assume $\alpha = 1$. We denote $\|\cdot\|$ the norm associated with $S_1$, and $$\varphi_1(y) : t \mapsto \left(1-t^2\right)y''(t) - 3t\,y'(t)$$ Justify that, for all $k \in \mathbb{N}$, there exists a unique polynomial eigenvector of $\varphi_1$ of degree $k$, of norm 1 and with positive leading coefficient. We denote it $T_k$.
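For $\alpha = 1$ the weight is $\left(1-t^2\right)^{1/2}$ and the $T_k$ are, up to normalization, the Chebyshev polynomials of the second kind $U_k$; a sympy check that each $U_k$ is an eigenvector of $\varphi_1$ with eigenvalue $-k(k+2)$ (the value one obtains by looking at the leading coefficient, as suggested earlier):

```python
import sympy as sp

t = sp.symbols('t')
# For alpha = 1, the eigenvectors of phi_1 are (up to normalization)
# the Chebyshev polynomials of the second kind U_k, and
# (1 - t^2) U_k'' - 3 t U_k' = -k (k + 2) U_k.
for k in range(6):
    U = sp.chebyshevu(k, t)
    phi1 = (1 - t**2) * sp.diff(U, t, 2) - 3 * t * sp.diff(U, t)
    assert sp.expand(phi1 + k * (k + 2) * U) == 0
```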