2021 centrale-maths1__psi

29 maths questions

Q1 Proof: Direct Proof of a Stated Identity or Equality
Justify that, for every natural number $k$, $p _ { 1 } ^ { ( k ) } + \cdots + p _ { n } ^ { ( k ) } = 1$.
Q2 Matrices: Matrix Power Computation and Application
Show that, for every natural number $k$, $P ^ { ( k + 1 ) } = P ^ { ( k ) } T$.
Q3 Sequences and series, recurrence and convergence: Closed-form expression derivation
Deduce from this, for every natural number $k$, an expression for $P ^ { ( k ) }$ as a function of $T$, $k$ and $P ^ { ( 0 ) }$.
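Questions 2 and 3 amount to iterating the map $P \mapsto P T$. A minimal Python sketch, using an assumed two-state stochastic matrix (not one from the paper) whose invariant distribution is $(2/3, 1/3)$:

```python
# Iterate P^(k+1) = P^(k) T, so P^(k) = P^(0) T^k, on a hypothetical
# 2-state example (the matrix T below is not from the paper).
def row_times_matrix(P, T):
    """Multiply a row vector P by a square matrix T (lists of lists)."""
    n = len(T)
    return [sum(P[i] * T[i][j] for i in range(n)) for j in range(n)]

T = [[0.9, 0.1],
     [0.2, 0.8]]          # each row sums to 1: a stochastic matrix
P = [1.0, 0.0]            # initial distribution P^(0)
for _ in range(100):
    P = row_times_matrix(P, T)
# The iterates approach the invariant distribution (2/3, 1/3).
```

This is only an illustration of the recursion; the paper's questions ask for the closed form $P ^ { ( k ) } = P ^ { ( 0 ) } T ^ { k }$, which the loop computes step by step.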
Q4 Matrices: Linear Transformation and Endomorphism Properties
Suppose that the sequence of vectors $\left( P ^ { ( k ) } \right) _ { k \in \mathbb { N } }$ converges to a vector $P = \left( p _ { 1 } , \ldots , p _ { n } \right)$. Show that $P T = P$, that for all $i \in \llbracket 1 , n \rrbracket$, $p _ { i } \geqslant 0$ and that $p _ { 1 } + \cdots + p _ { n } = 1$.
Q5 Matrices: Structured Matrix Characterization
We consider the directed graph $G = ( S , A )$ where $$\left\{ \begin{array} { l } S = \{ 1,2,3,4 \} \\ A = \{ ( 1,2 ) , ( 2,1 ) , ( 1,3 ) , ( 3,1 ) , ( 1,4 ) , ( 4,1 ) , ( 2,3 ) , ( 3,2 ) , ( 2,4 ) , ( 4,2 ) , ( 3,4 ) , ( 4,3 ) \} \end{array} \right.$$ We assume that, when the point is on one of the vertices of the graph, it has the same probability of going to each of the three other vertices of the graph. We set $$J _ { 4 } = \left( \begin{array} { l l l l } 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \end{array} \right)$$ Express the transition matrix $T$ as a function of $J _ { 4 }$ and $I _ { 4 }$.
Q6 Matrices: Matrix Decomposition and Factorization
We consider the directed graph $G = ( S , A )$ where $$\left\{ \begin{array} { l } S = \{ 1,2,3,4 \} \\ A = \{ ( 1,2 ) , ( 2,1 ) , ( 1,3 ) , ( 3,1 ) , ( 1,4 ) , ( 4,1 ) , ( 2,3 ) , ( 3,2 ) , ( 2,4 ) , ( 4,2 ) , ( 3,4 ) , ( 4,3 ) \} \end{array} \right.$$ We assume that, when the point is on one of the vertices of the graph, it has the same probability of going to each of the three other vertices of the graph. We set $$J _ { 4 } = \left( \begin{array} { l l l l } 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \end{array} \right)$$ Prove that there exists a matrix $Q \in \mathcal { O } _ { 4 } ( \mathbb { R } )$ such that $$T = \frac { 1 } { 3 } Q \left( \begin{array} { c c c c } - 1 & 0 & 0 & 0 \\ 0 & - 1 & 0 & 0 \\ 0 & 0 & - 1 & 0 \\ 0 & 0 & 0 & 3 \end{array} \right) Q ^ { \top }$$
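Before proving the factorization, its consequences can be spot-checked numerically: with $T = \frac{1}{3}(J_4 - I_4)$, the all-ones vector should be fixed by $T$, and every vector orthogonal to it scaled by $-1/3$, matching the stated diagonal $\operatorname{diag}(-1,-1,-1,3)/3$. A pure-Python sketch (a check, not a proof):

```python
# Spot-check that T = (J_4 - I_4)/3 fixes the all-ones vector and
# scales its orthogonal complement by -1/3, as the diagonalization
# T = (1/3) Q diag(-1, -1, -1, 3) Q^T requires.
n = 4
T = [[0.0 if i == j else 1 / 3 for j in range(n)] for i in range(n)]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

ones = [1.0] * n
w = [1.0, -1.0, 0.0, 0.0]          # any vector orthogonal to `ones`
assert all(abs(x - 1.0) < 1e-12 for x in matvec(T, ones))   # eigenvalue 1
assert matvec(T, w) == [-x / 3 for x in w]                  # eigenvalue -1/3
```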
Q7 Matrices: Matrix Power Computation and Application
We consider the directed graph $G = ( S , A )$ where $$\left\{ \begin{array} { l } S = \{ 1,2,3,4 \} \\ A = \{ ( 1,2 ) , ( 2,1 ) , ( 1,3 ) , ( 3,1 ) , ( 1,4 ) , ( 4,1 ) , ( 2,3 ) , ( 3,2 ) , ( 2,4 ) , ( 4,2 ) , ( 3,4 ) , ( 4,3 ) \} \end{array} \right.$$ We assume that, when the point is on one of the vertices of the graph, it has the same probability of going to each of the three other vertices of the graph. Show that the sequence of matrices $\left( T ^ { k } \right) _ { k \in \mathbb { N } }$ converges and identify geometrically the endomorphism canonically associated with the limit matrix.
Q8 Matrices: Matrix Power Computation and Application
We consider the directed graph $G = ( S , A )$ where $$\left\{ \begin{array} { l } S = \{ 1,2,3,4 \} \\ A = \{ ( 1,2 ) , ( 2,1 ) , ( 1,3 ) , ( 3,1 ) , ( 1,4 ) , ( 4,1 ) , ( 2,3 ) , ( 3,2 ) , ( 2,4 ) , ( 4,2 ) , ( 3,4 ) , ( 4,3 ) \} \end{array} \right.$$ We assume that, when the point is on one of the vertices of the graph, it has the same probability of going to each of the three other vertices of the graph. Show that, for any row vector $P ^ { ( 0 ) } = \left( p _ { 1 } ^ { ( 0 ) } , p _ { 2 } ^ { ( 0 ) } , p _ { 3 } ^ { ( 0 ) } , p _ { 4 } ^ { ( 0 ) } \right)$, where for $1 \leqslant i \leqslant 4$, $p _ { i } ^ { ( 0 ) }$ is the probability that the point is initially on vertex $i$, the sequence $\left( P ^ { ( k ) } \right) _ { k \in \mathbb { N } }$ converges to the row vector $( 1 / 4,1 / 4,1 / 4,1 / 4 )$.
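The convergence in these two questions can be illustrated by computing matrix powers directly: since every eigenvalue of $T$ other than $1$ equals $-1/3$, each entry of $T ^ { k }$ approaches $1/4$ at rate $(1/3)^{k}$, so $P ^ { ( 0 ) } T ^ { k } \to ( 1 / 4 , 1 / 4 , 1 / 4 , 1 / 4 )$ for any initial distribution. A sketch:

```python
# Numeric illustration of T^k -> (1/4) J_4 for the complete-graph
# transition matrix T = (J_4 - I_4)/3.
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

T = [[0.0 if i == j else 1 / 3 for j in range(4)] for i in range(4)]
Tk = T
for _ in range(50):
    Tk = matmul(Tk, T)
# Every entry of T^k is 1/4 + O((1/3)^k); after 50 products the
# deviation is far below floating-point noise.
assert all(abs(entry - 0.25) < 1e-12 for row in Tk for entry in row)
```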
Q9 Matrices: Matrix Algebra and Product Properties
We consider the graph $G$ represented in Figure 2. We recall that, when the point is on one of the vertices of the graph, it has the same probability of going to each of the vertices to which it is connected. We assume that initially, the point is on vertex 1, so that $P ^ { ( 0 ) } = ( 1,0,0,0,0,0,0,0 )$. We denote $S _ { 1 } = \{ 1,3,6,8 \}$ and $S _ { 2 } = \{ 2,4,5,7 \}$.
Give the transition matrix $T$ of this graph and calculate $$( 1,1,1,1,1,1,1,1 ) T .$$
Q10 Discrete Probability Distributions: Proof of Distributional Properties or Symmetry
We consider the graph $G$ represented in Figure 2. We denote $S _ { 1 } = \{ 1,3,6,8 \}$ and $S _ { 2 } = \{ 2,4,5,7 \}$.
Show that, if the point is on a vertex of part $S _ { 1 }$ at a given step, it will be on a vertex of part $S _ { 2 }$ at the next step and that, if it is on a vertex of $S _ { 2 }$ at a given step, it will be on a vertex of $S _ { 1 }$ at the next step.
Q11 Discrete Probability Distributions: Markov Chain and Transition Matrix Analysis
We consider the graph $G$ represented in Figure 2. We denote $S _ { 1 } = \{ 1,3,6,8 \}$ and $S _ { 2 } = \{ 2,4,5,7 \}$. We assume that initially, the point is on vertex 1, so that $P ^ { ( 0 ) } = ( 1,0,0,0,0,0,0,0 )$.
Is the sequence of vectors $\left( P ^ { ( k ) } \right) _ { k \in \mathbb { N } }$ convergent?
Q12 Matrices: Linear System and Inverse Existence
Let $M \in \mathcal { M } _ { n } ( \mathbb { R } )$, all of whose coefficients are non-negative. Show that $M$ is a stochastic matrix if and only if $$M \left( \begin{array} { c } 1 \\ \vdots \\ 1 \end{array} \right) = \left( \begin{array} { c } 1 \\ \vdots \\ 1 \end{array} \right) .$$
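The criterion of this question translates directly into code: non-negative entries plus the condition $M ( 1 , \ldots , 1 ) ^ { \top } = ( 1 , \ldots , 1 ) ^ { \top }$, i.e. every row sums to 1. A small hypothetical helper:

```python
# A matrix with non-negative entries is stochastic iff every row sums
# to 1, i.e. M * (1, ..., 1)^T = (1, ..., 1)^T.
def is_stochastic(M, tol=1e-12):
    return (all(entry >= 0 for row in M for entry in row)
            and all(abs(sum(row) - 1.0) <= tol for row in M))

assert is_stochastic([[0.5, 0.5], [0.25, 0.75]])
assert not is_stochastic([[0.5, 0.6], [0.25, 0.75]])   # a row sums to 1.1
assert not is_stochastic([[1.5, -0.5], [0.0, 1.0]])    # negative entry
```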
Q13 Matrices: Linear System and Inverse Existence
Show that the transition matrix of a graph (defined in part I) is a stochastic matrix and that, for every natural number $k$, the vector $P ^ { ( k ) }$, also defined in part I, is a probability distribution.
Q14 Matrices: Linear Transformation and Endomorphism Properties
Let $M \in \mathcal { M } _ { n } ( \mathbb { R } )$ and $N \in \mathcal { M } _ { n } ( \mathbb { R } )$ be two stochastic matrices, $X \in \mathbb { R } ^ { n }$ a probability distribution and $\alpha \in [ 0,1 ]$. Show that $X M$ is a probability distribution.
Q15 Matrices: Matrix Algebra and Product Properties
Let $M \in \mathcal { M } _ { n } ( \mathbb { R } )$ and $N \in \mathcal { M } _ { n } ( \mathbb { R } )$ be two stochastic matrices, $X \in \mathbb { R } ^ { n }$ a probability distribution and $\alpha \in [ 0,1 ]$. Show that $M N$ is a stochastic matrix.
Q16 Matrices: Matrix Algebra and Product Properties
Let $M \in \mathcal { M } _ { n } ( \mathbb { R } )$ and $N \in \mathcal { M } _ { n } ( \mathbb { R } )$ be two stochastic matrices, $X \in \mathbb { R } ^ { n }$ a probability distribution and $\alpha \in [ 0,1 ]$. Show that $\alpha M + ( 1 - \alpha ) N$ is a stochastic matrix.
Q17 Invariant lines and eigenvalues and vectors: Eigenvalue constraints from matrix properties
Let $M = \left( m _ { i , j } \right)$ be a stochastic matrix of $\mathcal { M } _ { n } ( \mathbb { R } )$ and $\lambda$ an eigenvalue (real or complex) of $M$. We denote by $\left( u _ { 1 } , \ldots , u _ { n } \right)$ the components (real or complex), in the canonical basis, of an eigenvector $u$ associated with $\lambda$.
Let $h \in \{ 1 , \ldots , n \}$ such that $\left| u _ { h } \right| = \max _ { 1 \leqslant i \leqslant n } \left| u _ { i } \right|$. Show that $\left| \lambda - m _ { h , h } \right| \leqslant 1 - m _ { h , h }$. Deduce that $| \lambda | \leqslant 1$.
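These bounds can be sanity-checked on a family where the spectrum is explicit: for $\left( \begin{smallmatrix} a & 1 - a \\ b & 1 - b \end{smallmatrix} \right)$ with $a , b \in [ 0 , 1 ]$, the trace is $a + 1 - b$ and the determinant is $a - b$, so the eigenvalues are exactly $1$ and $a - b$. The sketch below verifies $| \lambda | \leqslant 1$, and also the disk bound $| \lambda - \delta | \leqslant 1 - \delta$ of question 18, over a grid:

```python
# For the 2x2 stochastic family [[a, 1-a], [b, 1-b]], the eigenvalues
# are exactly 1 and a - b (trace a + 1 - b, determinant a - b).
# Check |lambda| <= 1 and |lambda - delta| <= 1 - delta, where delta
# is the smallest diagonal entry.
checked = 0
grid = [k / 10 for k in range(11)]
for a in grid:
    for b in grid:
        second = a - b                 # the non-trivial eigenvalue
        assert abs(second) <= 1.0
        delta = min(a, 1 - b)          # smallest diagonal entry
        assert abs(second - delta) <= 1.0 - delta + 1e-12
        checked += 1
assert checked == 121
```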
Q18 Invariant lines and eigenvalues and vectors: Eigenvalue constraints from matrix properties
Let $M = \left( m _ { i , j } \right)$ be a stochastic matrix of $\mathcal { M } _ { n } ( \mathbb { R } )$ and $\lambda$ an eigenvalue (real or complex) of $M$. We denote by $\left( u _ { 1 } , \ldots , u _ { n } \right)$ the components (real or complex), in the canonical basis, of an eigenvector $u$ associated with $\lambda$.
Let $\delta = \min _ { 1 \leqslant i \leqslant n } m _ { i , i }$. Show that $| \lambda - \delta | \leqslant 1 - \delta$. Give a geometric interpretation of this result and show that, if all diagonal terms of $M$ are strictly positive, then 1 is the only eigenvalue of $M$ with modulus 1.
Q19 Matrices: Linear Transformation and Endomorphism Properties
We now assume that all coefficients $m _ { i , j } ( 1 \leqslant i , j \leqslant n )$ of the stochastic matrix $M$ are strictly positive.
Prove that $\operatorname { dim } \left( \operatorname { ker } \left( M - I _ { n } \right) \right) = 1$.
Hint: if $\left( u _ { 1 } , \ldots , u _ { n } \right)$ denotes the real components, in the canonical basis, of a vector of $\operatorname { ker } \left( M - I _ { n } \right)$, one may consider $\min _ { 1 \leqslant i \leqslant n } u _ { i }$.
Q20 Matrices: Linear Transformation and Endomorphism Properties
We now assume that all coefficients $m _ { i , j } ( 1 \leqslant i , j \leqslant n )$ of the stochastic matrix $M$ are strictly positive.
Deduce that there exists at most one probability distribution $X$ invariant by $M$, that is, satisfying $X M = X$.
Q21 Matrices: Matrix Power Computation and Application
We now assume that all coefficients $m _ { i , j } ( 1 \leqslant i , j \leqslant n )$ of the stochastic matrix $M$ are strictly positive. We set $\varepsilon = \min _ { 1 \leqslant i , j \leqslant n } m _ { i , j }$. We are interested in the sequence $\left( M ^ { k } \right) _ { k \in \mathbb { N } }$ of powers of $M$. We denote by $m _ { i , j } ^ { ( k ) }$ the coefficient of the matrix $M ^ { k }$ located in row $i$ and column $j$.
For all $j \in \llbracket 1 , n \rrbracket$, we set $$\left\{ \begin{array} { l } \alpha _ { j } ^ { ( k ) } = \min _ { 1 \leqslant i \leqslant n } m _ { i , j } ^ { ( k ) } , \\ \beta _ { j } ^ { ( k ) } = \max _ { 1 \leqslant i \leqslant n } m _ { i , j } ^ { ( k ) } . \end{array} \right.$$
In the following four questions, $j$ is a fixed integer in $\llbracket 1 , n \rrbracket$ and $k$ is fixed in $\mathbb { N }$. Prove the inequalities $\alpha _ { j } ^ { ( k ) } \leqslant \alpha _ { j } ^ { ( k + 1 ) } \leqslant \beta _ { j } ^ { ( k + 1 ) } \leqslant \beta _ { j } ^ { ( k ) }$.
Q22 Matrices: Matrix Entry and Coefficient Identities
We now assume that all coefficients $m _ { i , j } ( 1 \leqslant i , j \leqslant n )$ of the stochastic matrix $M$ are strictly positive. We set $\varepsilon = \min _ { 1 \leqslant i , j \leqslant n } m _ { i , j }$. We are interested in the sequence $\left( M ^ { k } \right) _ { k \in \mathbb { N } }$ of powers of $M$. We denote by $m _ { i , j } ^ { ( k ) }$ the coefficient of the matrix $M ^ { k }$ located in row $i$ and column $j$.
For all $j \in \llbracket 1 , n \rrbracket$, we set $$\left\{ \begin{array} { l } \alpha _ { j } ^ { ( k ) } = \min _ { 1 \leqslant i \leqslant n } m _ { i , j } ^ { ( k ) } , \\ \beta _ { j } ^ { ( k ) } = \max _ { 1 \leqslant i \leqslant n } m _ { i , j } ^ { ( k ) } . \end{array} \right.$$
In the following four questions, $j$ is a fixed integer in $\llbracket 1 , n \rrbracket$ and $k$ is fixed in $\mathbb { N }$. Prove that there exists a pair $\left( i _ { 0 } , j _ { 0 } \right) \in \llbracket 1 , n \rrbracket ^ { 2 }$ such that $$\alpha _ { j } ^ { ( k + 1 ) } - \alpha _ { j } ^ { ( k ) } \geqslant m _ { i _ { 0 } , j _ { 0 } } \left( \beta _ { j } ^ { ( k ) } - \alpha _ { j } ^ { ( k ) } \right) .$$
Q23 Matrices: Matrix Entry and Coefficient Identities
We now assume that all coefficients $m _ { i , j } ( 1 \leqslant i , j \leqslant n )$ of the stochastic matrix $M$ are strictly positive. We set $\varepsilon = \min _ { 1 \leqslant i , j \leqslant n } m _ { i , j }$. We are interested in the sequence $\left( M ^ { k } \right) _ { k \in \mathbb { N } }$ of powers of $M$. We denote by $m _ { i , j } ^ { ( k ) }$ the coefficient of the matrix $M ^ { k }$ located in row $i$ and column $j$.
For all $j \in \llbracket 1 , n \rrbracket$, we set $$\left\{ \begin{array} { l } \alpha _ { j } ^ { ( k ) } = \min _ { 1 \leqslant i \leqslant n } m _ { i , j } ^ { ( k ) } , \\ \beta _ { j } ^ { ( k ) } = \max _ { 1 \leqslant i \leqslant n } m _ { i , j } ^ { ( k ) } . \end{array} \right.$$
In the following four questions, $j$ is a fixed integer in $\llbracket 1 , n \rrbracket$ and $k$ is fixed in $\mathbb { N }$. Prove that there exists a pair $\left( i _ { 1 } , j _ { 1 } \right) \in \llbracket 1 , n \rrbracket ^ { 2 }$ such that $$\beta _ { j } ^ { ( k ) } - \beta _ { j } ^ { ( k + 1 ) } \geqslant m _ { i _ { 1 } , j _ { 1 } } \left( \beta _ { j } ^ { ( k ) } - \alpha _ { j } ^ { ( k ) } \right) .$$
Q24 Matrices: Matrix Norm, Convergence, and Inequality
We now assume that all coefficients $m _ { i , j } ( 1 \leqslant i , j \leqslant n )$ of the stochastic matrix $M$ are strictly positive. We set $\varepsilon = \min _ { 1 \leqslant i , j \leqslant n } m _ { i , j }$. We are interested in the sequence $\left( M ^ { k } \right) _ { k \in \mathbb { N } }$ of powers of $M$. We denote by $m _ { i , j } ^ { ( k ) }$ the coefficient of the matrix $M ^ { k }$ located in row $i$ and column $j$.
For all $j \in \llbracket 1 , n \rrbracket$, we set $$\left\{ \begin{array} { l } \alpha _ { j } ^ { ( k ) } = \min _ { 1 \leqslant i \leqslant n } m _ { i , j } ^ { ( k ) } , \\ \beta _ { j } ^ { ( k ) } = \max _ { 1 \leqslant i \leqslant n } m _ { i , j } ^ { ( k ) } . \end{array} \right.$$
Deduce that $\beta _ { j } ^ { ( k + 1 ) } - \alpha _ { j } ^ { ( k + 1 ) } \leqslant ( 1 - 2 \varepsilon ) \left( \beta _ { j } ^ { ( k ) } - \alpha _ { j } ^ { ( k ) } \right)$.
Q25 Matrices: Matrix Power Computation and Application
We now assume that all coefficients $m _ { i , j } ( 1 \leqslant i , j \leqslant n )$ of the stochastic matrix $M$ are strictly positive. We set $\varepsilon = \min _ { 1 \leqslant i , j \leqslant n } m _ { i , j }$.
Prove that the sequence $\left( M ^ { k } \right)$ converges to a stochastic matrix $B = \left( \begin{array} { c c c } b _ { 1 } & \cdots & b _ { n } \\ \vdots & & \vdots \\ b _ { 1 } & \cdots & b _ { n } \end{array} \right)$, all of whose rows are equal.
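The contraction from question 24 forces all rows of $M ^ { k }$ to coalesce: the spread of each column shrinks by a factor of at least $1 - 2 \varepsilon$ per step. A numeric sketch with an assumed strictly positive $3 \times 3$ stochastic matrix (hypothetical, not from the paper):

```python
# Sketch of M^k converging to a rank-one stochastic matrix B whose
# rows are all equal, for an assumed strictly positive example M.
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

M = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2]]      # epsilon = 0.1, so spread shrinks by >= 0.8/step
Mk = M
for _ in range(60):
    Mk = matmul(Mk, M)
# In the limit every column of M^k is constant, i.e. all rows agree.
col_spread = max(max(col) - min(col) for col in zip(*Mk))
assert col_spread < 1e-12
assert all(abs(sum(row) - 1.0) < 1e-9 for row in Mk)   # still stochastic
```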
Q29 Matrices: Structured Matrix Characterization
We model the web by a directed graph with $n$ vertices. For every integer $i \in \llbracket 1 , n \rrbracket$, $\lambda _ { i }$ denotes the number of outgoing edges from page $i$. We assume that no page points to itself. A surfer navigates the web in the following way: when on page $i$,
  • if page $i$ points to other pages, he goes randomly, with equal probability, to one of these pages;
  • if page $i$ points to no other page, he remains on page $i$.
Verify that the transition matrix associated with this navigation model is the matrix $A = \left( a _ { i , j } \right) _ { 1 \leqslant i , j \leqslant n }$ with $$a _ { i , i } = \begin{cases} 1 & \text { if page } i \text { points to no other page } \\ 0 & \text { otherwise } \end{cases} \qquad a _ { i , j } = \begin{cases} 1 / \lambda _ { i } & \text { if } i \rightarrow j \\ 0 & \text { if } i \nrightarrow j \end{cases} \quad \text { for } i \neq j .$$
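The definition of $A$ transcribes directly into code. A sketch assuming the web is given as an adjacency list (0-based indices; the example `links` below is hypothetical):

```python
# Build the matrix A of this question from an adjacency list:
# links[i] lists the pages that page i points to.
def transition_matrix(links):
    n = len(links)
    A = [[0.0] * n for _ in range(n)]
    for i, outgoing in enumerate(links):
        if outgoing:                   # lambda_i > 0: uniform over targets
            for j in outgoing:
                A[i][j] = 1.0 / len(outgoing)
        else:                          # dangling page: the surfer stays put
            A[i][i] = 1.0
    return A

links = [[1, 2], [2], []]              # page 2 (index 2) points nowhere
A = transition_matrix(links)
assert A[0] == [0.0, 0.5, 0.5]         # two outgoing links, 1/2 each
assert A[2] == [0.0, 0.0, 1.0]         # dangling page: a_{i,i} = 1
```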
Q30 Matrices: Structured Matrix Characterization
We model the web by a directed graph with $n$ vertices and let $A$ be the stochastic matrix described in question 29. We define $$B = ( 1 - \alpha ) A + \frac { \alpha } { n } J _ { n }$$ where $J _ { n }$ is the matrix in $\mathcal { M } _ { n } ( \mathbb { R } )$ whose coefficients are all equal to $1$ and $\alpha$ is a real number in $] 0,1 [$, called the damping factor.
Show that $B$ is a stochastic matrix whose coefficients are all strictly positive.
Q31 Matrices: Linear Transformation and Endomorphism Properties
We model the web by a directed graph with $n$ vertices and let $A$ be the stochastic matrix described in question 29. We define $$B = ( 1 - \alpha ) A + \frac { \alpha } { n } J _ { n }$$ where $J _ { n }$ is the matrix in $\mathcal { M } _ { n } ( \mathbb { R } )$ whose coefficients are all equal to $1$ and $\alpha$ is a real number in $] 0,1 [$, called the damping factor.
In the navigation model admitting $B$ as its transition matrix, give the probability of leaving a page containing no links to another page.
Q32 Matrices: Matrix Power Computation and Application
We model the web by a directed graph with $n$ vertices and let $A$ be the stochastic matrix described in question 29. We define $$B = ( 1 - \alpha ) A + \frac { \alpha } { n } J _ { n }$$ where $J _ { n }$ is the matrix in $\mathcal { M } _ { n } ( \mathbb { R } )$ whose coefficients are all equal to $1$ and $\alpha$ is a real number in $] 0,1 [$, called the damping factor.
Let $Q$ be a probability distribution. We define the sequence $\left( Q ^ { ( k ) } \right) _ { k \in \mathbb { N } }$ by $Q ^ { ( k ) } = Q B ^ { k }$ for every natural number $k$. Prove that the sequence $\left( Q ^ { ( k ) } \right) _ { k \in \mathbb { N } }$ converges and that its limit $Q ^ { \infty }$ satisfies conditions (i) and (ii) described in the introduction to this part; it thus provides relevance scores for the $n$ pages of the web. Express the relevance of each page $j$ as a function of the relevances of the pages pointing to it, distinguishing pages that point to at least one other page from those that do not.
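The whole scheme fits in a few lines: build $B = ( 1 - \alpha ) A + \frac { \alpha } { n } J _ { n }$ and iterate $Q ^ { ( k + 1 ) } = Q ^ { ( k ) } B$. A sketch on an assumed three-page web (the link structure and $\alpha = 0.15$ are illustrative choices, not from the paper):

```python
# PageRank-style iteration Q^(k+1) = Q^(k) B, with B the damped
# stochastic matrix of questions 30-32, on a hypothetical 3-page web.
def pagerank(A, alpha=0.15, iterations=200):
    n = len(A)
    B = [[(1 - alpha) * A[i][j] + alpha / n for j in range(n)]
         for i in range(n)]
    Q = [1.0 / n] * n                  # any initial probability distribution
    for _ in range(iterations):
        Q = [sum(Q[i] * B[i][j] for i in range(n)) for j in range(n)]
    return Q

A = [[0.0, 0.5, 0.5],                  # page 1 links to pages 2 and 3
     [0.0, 0.0, 1.0],                  # page 2 links to page 3
     [1.0, 0.0, 0.0]]                  # page 3 links to page 1
scores = pagerank(A)
assert abs(sum(scores) - 1.0) < 1e-9   # the limit is a distribution
assert min(scores) > 0                 # every page keeps positive relevance
```

Because $B$ has strictly positive coefficients, the earlier contraction argument guarantees the loop converges to the same $Q ^ { \infty }$ whatever the starting distribution.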