QIV.A.4
Matrices
Projection and Orthogonality
Let $p \in ]0,1[$. Let $X_1, \ldots, X_n$ be mutually independent random variables, defined on a probability space $(\Omega, \mathcal{A}, P)$ and following the same Bernoulli distribution with parameter $p$. Let $S = X_1 + \ldots + X_n$.
If $\omega \in \Omega$, we introduce the column matrix $$U(\omega) = \begin{pmatrix} X_1(\omega) \\ \vdots \\ X_n(\omega) \end{pmatrix}$$ and the matrix $M(\omega) = U(\omega)\, {}^t(U(\omega))$. The map $M : \left\{\begin{array}{l} \Omega \rightarrow \mathcal{M}_n(\mathbb{R}) \\ \omega \mapsto M(\omega) \end{array}\right.$ is then a random variable with values in $\mathcal{M}_n(\mathbb{R})$.
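The construction above can be illustrated numerically. The sketch below is a hypothetical simulation, not part of the exercise: the values of $n$ and $p$ are arbitrary choices, and one sample of the vector $(X_1, \ldots, X_n)$ plays the role of a single outcome $\omega$.

```python
import numpy as np

# Arbitrary illustrative parameters (not fixed by the exercise).
rng = np.random.default_rng(0)
n, p = 5, 0.3

# One outcome omega: X_1(omega), ..., X_n(omega) are i.i.d. Bernoulli(p).
U = rng.binomial(1, p, size=(n, 1)).astype(float)  # column matrix U(omega)
M = U @ U.T                                        # M(omega) = U(omega) tU(omega)

# Since t(U tU) = U tU, the matrix M(omega) is symmetric.
assert M.shape == (n, n)
assert np.array_equal(M, M.T)
```

The symmetry checked at the end is the key structural fact behind question a): $M(\omega)$ is a real symmetric matrix.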
a) If $\omega \in \Omega$, justify that $M(\omega) \in \mathcal{X}_n$.
b) If $\omega \in \Omega$, justify that $\operatorname{tr}(M(\omega)) \in \{0, \ldots, n\}$, that $M(\omega)$ is diagonalizable over $\mathbb{R}$ and that $\operatorname{rg}(M(\omega)) \leqslant 1$.
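The claims in b) rest on two observations: each $X_i(\omega)$ lies in $\{0,1\}$, so $X_i(\omega)^2 = X_i(\omega)$ and hence $\operatorname{tr}(M(\omega)) = \sum_i X_i(\omega)^2 = S(\omega)$; and every column of $M(\omega)$ is a scalar multiple of $U(\omega)$, so the rank is at most $1$. A hedged numerical check of both facts (with arbitrary illustrative parameters):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 6, 0.4  # arbitrary illustrative values
U = rng.binomial(1, p, size=(n, 1)).astype(float)
M = U @ U.T
S = U.sum()  # S(omega) = X_1(omega) + ... + X_n(omega)

# tr(M) = sum X_i^2 = sum X_i = S, because each X_i is 0 or 1.
assert np.trace(M) == S and 0 <= S <= n

# Each column of M is a multiple of U, so rg(M) <= 1.
assert np.linalg.matrix_rank(M) <= 1
```

Diagonalizability itself follows from the symmetry of $M(\omega)$ (spectral theorem) and is not something a numerical rank check demonstrates.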
c) If $\omega \in \Omega$, justify that $M(\omega)$ is an orthogonal projection matrix if and only if $S(\omega) \in \{0,1\}$.
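For c), the useful identity is ${}^t(U(\omega))\, U(\omega) = \sum_i X_i(\omega)^2 = S(\omega)$, which gives $M(\omega)^2 = S(\omega)\, M(\omega)$; since $M(\omega)$ is symmetric, it is an orthogonal projection matrix exactly when it is idempotent. The sketch below checks this equivalence over many simulated outcomes (parameters again arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 4, 0.5  # arbitrary illustrative values

for _ in range(200):
    U = rng.binomial(1, p, size=(n, 1)).astype(float)
    M = U @ U.T
    S = U.sum()

    # tU U = sum X_i^2 = S, hence M^2 = U (tU U) tU = S * M.
    assert np.array_equal(M @ M, S * M)

    # M is symmetric, so it is an orthogonal projection matrix
    # iff M^2 = M, i.e. iff S * M = M, i.e. iff S is 0 or 1.
    is_projection = np.array_equal(M @ M, M)
    assert is_projection == (S in (0.0, 1.0))
```

Note that when $S(\omega) = 0$ the matrix $M(\omega)$ is the zero matrix, which is (trivially) the orthogonal projection onto $\{0\}$; when $S(\omega) = 1$, exactly one $X_i(\omega)$ equals $1$ and $M(\omega)$ projects onto the corresponding coordinate axis.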