grandes-ecoles

Papers (191)

Each entry below gives a paper slug followed by its question count in parentheses.
2025
centrale-maths1__official (40), centrale-maths2__official (42), mines-ponts-maths1__mp (20), mines-ponts-maths1__pc (21), mines-ponts-maths1__psi (21), mines-ponts-maths2__mp (28), mines-ponts-maths2__pc (24), mines-ponts-maths2__psi (26), polytechnique-maths-a__mp (27), polytechnique-maths__fui (16), polytechnique-maths__pc (27), x-ens-maths-a__mp (18), x-ens-maths-c__mp (9), x-ens-maths-d__mp (38), x-ens-maths__pc (27), x-ens-maths__psi (38)
2024
centrale-maths1__official (28), centrale-maths2__official (29), geipi-polytech__maths (9), mines-ponts-maths1__mp (25), mines-ponts-maths1__pc (20), mines-ponts-maths1__psi (19), mines-ponts-maths2__mp (23), mines-ponts-maths2__pc (21), mines-ponts-maths2__psi (21), polytechnique-maths-a__mp (44), polytechnique-maths-b__mp (37), x-ens-maths-a__mp (43), x-ens-maths-b__mp (35), x-ens-maths-c__mp (22), x-ens-maths-d__mp (45), x-ens-maths__pc (24), x-ens-maths__psi (26)
2023
centrale-maths1__official (44), centrale-maths2__official (33), e3a-polytech-maths__mp (4), mines-ponts-maths1__mp (15), mines-ponts-maths1__pc (23), mines-ponts-maths1__psi (23), mines-ponts-maths2__mp (22), mines-ponts-maths2__pc (18), mines-ponts-maths2__psi (22), polytechnique-maths__fui (23), x-ens-maths-a__mp (25), x-ens-maths-b__mp (24), x-ens-maths-c__mp (20), x-ens-maths-d__mp (20), x-ens-maths__pc (18), x-ens-maths__psi (15)
2022
centrale-maths1__mp (48), centrale-maths1__official (48), centrale-maths1__pc (37), centrale-maths1__psi (43), centrale-maths2__mp (32), centrale-maths2__official (32), centrale-maths2__pc (39), centrale-maths2__psi (45), mines-ponts-maths1__mp (25), mines-ponts-maths1__pc (24), mines-ponts-maths1__psi (24), mines-ponts-maths2__mp (24), mines-ponts-maths2__pc (19), mines-ponts-maths2__psi (20), x-ens-maths-a__mp (13), x-ens-maths-b__mp (40), x-ens-maths-c__mp (27), x-ens-maths-d__mp (46), x-ens-maths1__mp (13), x-ens-maths2__mp (40), x-ens-maths__pc (15), x-ens-maths__pc_cpge (15), x-ens-maths__psi (22), x-ens-maths__psi_cpge (23)
2021
centrale-maths1__mp (40), centrale-maths1__official (40), centrale-maths1__pc (36), centrale-maths1__psi (29), centrale-maths2__mp (30), centrale-maths2__official (29), centrale-maths2__pc (38), centrale-maths2__psi (37), x-ens-maths2__mp (39), x-ens-maths__pc (44)
2020
centrale-maths1__mp (42), centrale-maths1__official (42), centrale-maths1__pc (36), centrale-maths1__psi (40), centrale-maths2__mp (38), centrale-maths2__official (38), centrale-maths2__pc (40), centrale-maths2__psi (39), mines-ponts-maths1__mp_cpge (24), mines-ponts-maths2__mp_cpge (21), x-ens-maths-a__mp_cpge (18), x-ens-maths-b__mp_cpge (20), x-ens-maths-d__mp (14), x-ens-maths1__mp (18), x-ens-maths2__mp (20), x-ens-maths__pc (18)
2019
centrale-maths1__mp (37), centrale-maths1__official (37), centrale-maths1__pc (40), centrale-maths1__psi (39), centrale-maths2__mp (37), centrale-maths2__official (37), centrale-maths2__pc (39), centrale-maths2__psi (49), x-ens-maths1__mp (24), x-ens-maths__pc (18), x-ens-maths__psi (26)
2018
centrale-maths1__mp (47), centrale-maths1__official (47), centrale-maths1__pc (41), centrale-maths1__psi (44), centrale-maths2__mp (44), centrale-maths2__official (44), centrale-maths2__pc (35), centrale-maths2__psi (38), x-ens-maths1__mp (19), x-ens-maths2__mp (17), x-ens-maths__pc (22), x-ens-maths__psi (24)
2017
centrale-maths1__mp (45), centrale-maths1__official (45), centrale-maths1__pc (22), centrale-maths1__psi (17), centrale-maths2__mp (30), centrale-maths2__official (30), centrale-maths2__pc (28), centrale-maths2__psi (44), x-ens-maths1__mp (26), x-ens-maths2__mp (16), x-ens-maths__pc (18), x-ens-maths__psi (26)
2016
centrale-maths1__mp (42), centrale-maths1__pc (31), centrale-maths1__psi (33), centrale-maths2__mp (25), centrale-maths2__pc (47), centrale-maths2__psi (27), x-ens-maths1__mp (18), x-ens-maths2__mp (46), x-ens-maths__pc (15), x-ens-maths__psi (20)
2015
centrale-maths1__mp (42), centrale-maths1__pc (18), centrale-maths1__psi (42), centrale-maths2__mp (44), centrale-maths2__pc (18), centrale-maths2__psi (33), x-ens-maths1__mp (16), x-ens-maths2__mp (31), x-ens-maths__pc (30), x-ens-maths__psi (22)
2014
centrale-maths1__mp (28), centrale-maths1__pc (26), centrale-maths1__psi (27), centrale-maths2__mp (24), centrale-maths2__pc (26), centrale-maths2__psi (27), x-ens-maths1__mp (9), x-ens-maths2__mp (16), x-ens-maths__pc (4), x-ens-maths__psi (24)
2013
centrale-maths1__mp (22), centrale-maths1__pc (45), centrale-maths1__psi (29), centrale-maths2__mp (31), centrale-maths2__pc (52), centrale-maths2__psi (32), x-ens-maths1__mp (24), x-ens-maths2__mp (35), x-ens-maths__pc (22), x-ens-maths__psi (9)
2012
centrale-maths1__mp (36), centrale-maths1__pc (28), centrale-maths1__psi (33), centrale-maths2__mp (27), centrale-maths2__psi (18)
2011
centrale-maths1__mp (27), centrale-maths1__pc (17), centrale-maths1__psi (24), centrale-maths2__mp (29), centrale-maths2__pc (17), centrale-maths2__psi (10)
2010
centrale-maths1__mp (19), centrale-maths1__pc (30), centrale-maths1__psi (13), centrale-maths2__mp (32), centrale-maths2__pc (37), centrale-maths2__psi (27)
2023 x-ens-maths__psi

15 maths questions

Q1 Convex Functions Sums of Convex Functions and Uniqueness of Minimizers
Let $E$ be a real vector space, let $C \subset E$ be a convex set, and let $f$ and $g$ be two convex functions from $C$ to $\mathbb{R}$.
(a) Show that $f + g$ is convex, and strictly convex if one of the two functions $f$ or $g$ is strictly convex.
(b) Assume $f$ is strictly convex. Verify that $f$ attains its minimum on $C$ at no more than one point of $C$.
Q2 Matrices Projection and Orthogonality
Let $A \in \mathcal{M}_{m,n}(\mathbb{R})$ be a matrix with $m$ rows and $n$ columns. We denote by $\langle u, v \rangle_{\mathbb{R}^n}$ the inner product between two vectors $u$ and $v$ of $\mathbb{R}^n$ and $\langle \mu, \nu \rangle_{\mathbb{R}^m}$ that between two vectors $\mu$ and $\nu$ of $\mathbb{R}^m$.
(a) Show that for all $(x, \nu) \in \mathbb{R}^n \times \mathbb{R}^m$, we have $$\langle Ax, \nu \rangle_{\mathbb{R}^m} = \left\langle x, A^\top \nu \right\rangle_{\mathbb{R}^n},$$ where $A^\top$ denotes the transpose matrix of $A$.
(b) Deduce that $\ker A \subset (\operatorname{Im} A^\top)^\perp$, where, for any vector subspace $E$ of $\mathbb{R}^n$, $E^\perp$ denotes the orthogonal complement of $E$ with respect to the inner product on $\mathbb{R}^n$.
(c) Show that $\ker A = (\operatorname{Im} A^\top)^\perp$.
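(The following NumPy sketch is an editorial illustration, not part of the exam: it checks the adjoint identity of (a) on random data, and verifies the orthogonality and dimension count behind (b) and (c) via the SVD. The sizes, seed, and variable names are our own choices.)

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5
A = rng.standard_normal((m, n))

# (a) adjoint identity: <Ax, nu>_R^m = <x, A^T nu>_R^n
x = rng.standard_normal(n)
nu = rng.standard_normal(m)
assert np.isclose(np.dot(A @ x, nu), np.dot(x, A.T @ nu))

# (b)-(c) ker A = (Im A^T)^perp: the trailing right-singular vectors span ker A,
# and the rows of A span Im A^T, so we can check orthogonality and dimensions.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
ker_basis = Vt[rank:].T                 # columns form a basis of ker A
assert np.allclose(A @ ker_basis, 0)    # kernel vectors are orthogonal to Im A^T
assert ker_basis.shape[1] + rank == n   # dim ker A + dim Im A^T = n (rank-nullity)
```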
Q3 Matrices Linear System and Inverse Existence
Let $A \in \mathcal{M}_{m,n}(\mathbb{R})$ be as in the previous question. Consider an open set $U \subset \mathbb{R}^n$, a $\mathcal{C}^1$ map $h : U \rightarrow \mathbb{R}$, and $b \in \mathbb{R}^m$. Assume that there exists a point $x_* \in U$ at which $h$ attains its minimum on the set $V_b = \{x \in U \mid Ax + b = 0\}$.
(a) Show that for all $u \in \mathbb{R}^n$ such that $Au = 0$ we have $\left\langle \nabla h(x_*), u \right\rangle_{\mathbb{R}^n} = 0$ where $\nabla h(x)$ denotes the gradient of $h$ at $x$.
(b) Show the existence of $\nu_* \in \mathbb{R}^m$ such that $\nabla h(x_*) - A^\top \nu_* = 0$.
(c) Deduce that the map $L : U \times \mathbb{R}^m \rightarrow \mathbb{R}$ defined by $L(x, \nu) = h(x) - \langle \nu, Ax + b \rangle_{\mathbb{R}^m}$ satisfies $\frac{\partial L}{\partial x_k}(x_*, \nu_*) = 0$ for all $1 \leq k \leq n$, where $\frac{\partial L}{\partial x_k}(x, \nu)$ denotes the partial derivative of $L$ with respect to the $k$-th coordinate of $x \in \mathbb{R}^n$.
(d) Conclude that if $U$ is convex and $h$ is convex on $U$, then $L$ admits a saddle point at $(x_*, \nu_*)$, that is, we have $$L(x_*, \nu) \leq L(x_*, \nu_*) \leq L(x, \nu_*)$$ for all $(x, \nu) \in U \times \mathbb{R}^m$.
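(An editorial illustration, not part of the exam: for the concrete choice $h(x) = \|x\|^2$, the conditions of (b) and (c) reduce to a linear system in $(x_*, \nu_*)$, which the NumPy sketch below solves and checks. The sizes and seed are arbitrary.)

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 2, 4
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# For h(x) = ||x||^2, questions (b)-(c) give the linear system
#   2 x* - A^T nu* = 0   (stationarity of L in x)
#   A x* + b = 0         (feasibility)
K = np.block([[2 * np.eye(n), -A.T],
              [A, np.zeros((m, m))]])
rhs = np.concatenate([np.zeros(n), -b])
sol = np.linalg.solve(K, rhs)
x_star, nu_star = sol[:n], sol[n:]

assert np.allclose(A @ x_star + b, 0)          # x* is feasible
assert np.allclose(2 * x_star, A.T @ nu_star)  # grad h(x*) = A^T nu*
```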
Q4 Continuous Probability Distributions and Random Variables Entropy, Information, or Log-Sobolev Functional Analysis
Let $X$ be a finite set and $p = (p_x)_{x \in X}$ a probability distribution on $X$. We assume that $p$ assigns positive probability to every point of $X$: $p_x > 0$ for all $x \in X$. The entropy of $p$ is the quantity $$H(p) = -\sum_{x \in X} p_x \ln(p_x).$$ We consider the set $Q_X = \{\boldsymbol{q} = (q_x)_{x \in X} \in \mathbb{R}^X \mid \forall x \in X,\ q_x \geq 0\}$. For all $\boldsymbol{q}, \boldsymbol{q}' \in Q_X$ such that $q'_x > 0$ for all $x \in X$, we define $$\mathrm{KL}(\boldsymbol{q}, \boldsymbol{q}') = \sum_{x \in X} \varphi(q_x / q'_x)\, q'_x,$$ where $\varphi : \mathbb{R}_+ \rightarrow \mathbb{R}$ is defined by $\varphi(x) = x \ln(x) - x + 1$ for $x > 0$ and extended to $0$ by continuity.
(a) Specify $\varphi(0)$.
(b) Verify that $\varphi$ is continuous, strictly convex, and nonnegative, and that $\varphi(x) = 0$ if and only if $x = 1$.
(c) Show that $Q_X$ is convex and that $\boldsymbol{q} \mapsto \mathrm{KL}(\boldsymbol{q}, \boldsymbol{q}')$ is strictly convex and nonnegative, and vanishes if and only if $\boldsymbol{q} = \boldsymbol{q}'$.
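(An editorial illustration, not part of the exam: since $x \ln x \to 0$ as $x \to 0^+$, continuity forces $\varphi(0) = 1$, and the KL functional above fits in a few lines of NumPy. The function names and test vectors are our own.)

```python
import numpy as np

def phi(x):
    """phi(x) = x ln(x) - x + 1, with phi(0) = 1 by continuity."""
    x = np.asarray(x, dtype=float)
    safe = np.maximum(x, 1e-300)                  # avoid log(0) warnings
    return np.where(x > 0, x * np.log(safe) - x + 1, 1.0)

def kl(q, qp):
    """KL(q, q') = sum_x phi(q_x / q'_x) q'_x, assuming q'_x > 0 for all x."""
    q, qp = np.asarray(q, float), np.asarray(qp, float)
    return float(np.sum(phi(q / qp) * qp))

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.25, 0.25, 0.5])
print(kl(q, p) > 0)   # True: KL is positive off the diagonal ...
print(kl(p, p))       # 0.0:  ... and vanishes exactly when q = q'
```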
Q5 Proof by induction Prove a summation inequality by induction
Let $X$ be a non-empty finite set and $c : X \rightarrow \{0,1\}^+$ an injective map, where $\{0,1\}^+$ denotes the set of non-empty finite binary words. We say that $c$ is a binary code on $X$. We further assume that $c$ is a prefix code, that is, for all $x \neq y$ in $X$, $c(x)$ is not a prefix of $c(y)$. We define $\bar{c} : X \rightarrow \{0,1\}^*$ such that for all $x \in X$, $c(x) = c(x)_1 \cdot \bar{c}(x)$, where $c(x)_1$ is the first letter of the word $c(x)$.
(a) Verify that for all $x \neq y \in X$, if $c(x)_1 = c(y)_1$ then $\bar{c}(x) \neq \bar{c}(y)$ and $\bar{c}(x)$ is not a prefix of $\bar{c}(y)$.
(b) For $a \in \{0,1\}$ we denote $X_a = \{x \in X \mid c(x)_1 = a\}$. Show that if $X_a$ contains at least two elements, then the restriction of $\bar{c}$ to $X_a$ is a prefix code on $X_a$.
(c) Deduce that $\sum_{x \in X} 2^{-|c(x)|} \leq 1$. (Hint: One may decompose the sum into a sum over $X_0$ and $X_1$ and reason by induction on $L(c) = \max\{|c(x)| \mid x \in X\}$)
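(An editorial illustration, not part of the exam: a quick self-contained check of the inequality in (c), the Kraft inequality, on a small prefix code of our own choosing.)

```python
# A prefix code on X = {a, b, c, d}; the example and helper names are ours.
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

def is_prefix_code(c):
    """True if no codeword is a prefix of another (distinct) codeword."""
    words = list(c.values())
    return all(not w.startswith(v)
               for w in words for v in words if v != w)

assert is_prefix_code(code)
kraft = sum(2 ** -len(w) for w in code.values())
print(kraft)   # 0.5 + 0.25 + 0.125 + 0.125 = 1.0 <= 1, as (c) requires
```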
Q7 Continuous Probability Distributions and Random Variables Verification of Probability Measure or Inner Product Properties
Let $I$ and $J$ be non-empty finite sets. We consider $\alpha = (\alpha_i)_{i \in I} \in (\mathbb{R}_+^*)^I$ and $\beta = (\beta_j)_{j \in J} \in (\mathbb{R}_+^*)^J$ such that $\sum_{i \in I} \alpha_i = \sum_{j \in J} \beta_j = 1$. We denote $$Q = \left\{(q_{ij})_{(i,j) \in I \times J} \in \mathbb{R}^{I \times J} \mid q_{ij} \geq 0 \text{ for all } (i,j) \in I \times J\right\}$$ and $$F(\alpha, \beta) = \left\{q \in Q \mid \sum_{j' \in J} q_{ij'} = \alpha_i \text{ and } \sum_{i' \in I} q_{i'j} = \beta_j \text{ for all } (i,j) \in I \times J\right\}.$$ Verify that $F(\alpha, \beta)$ is a convex subset of the vector space $E = \mathbb{R}^{I \times J}$.
Q8 Continuous Probability Distributions and Random Variables Conditional Probability and Total Probability with Tree/Bayes Structure
We consider $\alpha = (\alpha_i)_{i \in I} \in (\mathbb{R}_+^*)^I$ and $\beta = (\beta_j)_{j \in J} \in (\mathbb{R}_+^*)^J$ such that $\sum_{i \in I} \alpha_i = \sum_{j \in J} \beta_j = 1$. With $Q$ as in the previous question, we denote $$F(\alpha, \beta) = \left\{q \in Q \mid \sum_{j' \in J} q_{ij'} = \alpha_i \text{ and } \sum_{i' \in I} q_{i'j} = \beta_j \text{ for all } (i,j) \in I \times J\right\}.$$ We denote by $\boldsymbol{p}$ the element of $F(\alpha, \beta)$ defined by $p_{ij} = \alpha_i \beta_j > 0$ for all $(i,j) \in I \times J$. Let $X_1$ and $X_2$ be two random variables such that $X_1$ takes values in $I$ and $X_2$ takes values in $J$.
(a) Verify that if $\boldsymbol{q} \in F(\alpha, \beta)$, then $\sum_{i \in I} \sum_{j \in J} q_{ij} = 1$.
(b) Assume that $P(X_1 = i, X_2 = j) = q_{ij}$ with $q \in F(\alpha, \beta)$. Calculate the distribution of $X_1$ and that of $X_2$ in terms of $\alpha$ and $\beta$.
(c) What can we say about $X_1$ and $X_2$ when $\boldsymbol{q} = \boldsymbol{p}$?
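(An editorial illustration, not part of the exam: a short NumPy check, on example values of our own, that the independent coupling $p_{ij} = \alpha_i \beta_j$ sums to 1 and has marginals $\alpha$ and $\beta$, as in (a) and (b).)

```python
import numpy as np

alpha = np.array([0.3, 0.7])           # distribution of X1 on I
beta = np.array([0.2, 0.5, 0.3])       # distribution of X2 on J
p = np.outer(alpha, beta)              # the coupling p_ij = alpha_i beta_j

assert np.allclose(p.sum(axis=1), alpha)   # row marginals are alpha
assert np.allclose(p.sum(axis=0), beta)    # column marginals are beta
assert np.isclose(p.sum(), 1.0)            # total mass 1, as in (a)
# Under q = p, P(X1=i, X2=j) = P(X1=i) P(X2=j): X1 and X2 are independent.
```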
Q9 Proof Direct Proof of an Inequality
We consider $\alpha = (\alpha_i)_{i \in I} \in (\mathbb{R}_+^*)^I$ and $\beta = (\beta_j)_{j \in J} \in (\mathbb{R}_+^*)^J$ such that $\sum_{i \in I} \alpha_i = \sum_{j \in J} \beta_j = 1$. We denote by $\boldsymbol{p}$ the element of $F(\alpha, \beta)$ defined by $p_{ij} = \alpha_i \beta_j > 0$ for all $(i,j) \in I \times J$. Let $C = (C_{ij})_{(i,j) \in I \times J} \in \mathbb{R}_+^{I \times J}$ and $\epsilon > 0$. We consider $J_\epsilon : Q \rightarrow \mathbb{R}$ defined by $$J_\epsilon(\boldsymbol{q}) = \sum_{ij} q_{ij} C_{ij} + \epsilon \operatorname{KL}(\boldsymbol{q}, \boldsymbol{p})$$ where $\mathrm{KL}(\boldsymbol{q}, \boldsymbol{p})$ is defined by taking $X = I \times J$. Show that $J_\epsilon$ is strictly convex on $Q$.
Q10 Proof Existence Proof
We consider $\alpha = (\alpha_i)_{i \in I} \in (\mathbb{R}_+^*)^I$ and $\beta = (\beta_j)_{j \in J} \in (\mathbb{R}_+^*)^J$ such that $\sum_{i \in I} \alpha_i = \sum_{j \in J} \beta_j = 1$. We denote by $\boldsymbol{p}$ the element of $F(\alpha, \beta)$ defined by $p_{ij} = \alpha_i \beta_j > 0$ for all $(i,j) \in I \times J$. Let $C = (C_{ij})_{(i,j) \in I \times J} \in \mathbb{R}_+^{I \times J}$ and $\epsilon > 0$. We consider $J_\epsilon : Q \rightarrow \mathbb{R}$ defined by $$J_\epsilon(\boldsymbol{q}) = \sum_{ij} q_{ij} C_{ij} + \epsilon \operatorname{KL}(\boldsymbol{q}, \boldsymbol{p}).$$ (a) Verify that $F(\alpha, \beta)$ is a closed and bounded subset of $\mathbb{R}^{I \times J}$.
(b) Show that there exists a unique $\boldsymbol{q}(\epsilon) \in Q$ minimizing $J_\epsilon$ on $F(\alpha, \beta)$.
(c) By considering a simple counterexample, show that uniqueness is no longer true if we assume that $\epsilon = 0$.
Q11 Continuous Probability Distributions and Random Variables Entropy, Information, or Log-Sobolev Functional Analysis
We consider $\alpha = (\alpha_i)_{i \in I} \in (\mathbb{R}_+^*)^I$ and $\beta = (\beta_j)_{j \in J} \in (\mathbb{R}_+^*)^J$ such that $\sum_{i \in I} \alpha_i = \sum_{j \in J} \beta_j = 1$. We denote by $\boldsymbol{p}$ the element of $F(\alpha, \beta)$ defined by $p_{ij} = \alpha_i \beta_j > 0$ for all $(i,j) \in I \times J$. Let $C = (C_{ij})_{(i,j) \in I \times J} \in \mathbb{R}_+^{I \times J}$ and $\epsilon > 0$. We consider $J_\epsilon : Q \rightarrow \mathbb{R}$ defined by $$J_\epsilon(\boldsymbol{q}) = \sum_{ij} q_{ij} C_{ij} + \epsilon \operatorname{KL}(\boldsymbol{q}, \boldsymbol{p})$$ and $\boldsymbol{q}(\epsilon)$ the unique minimizer of $J_\epsilon$ on $F(\alpha, \beta)$.
(a) Verify that $q(\epsilon)_{ij} > 0$ for all $(i,j) \in I \times J$. (Hint: One may reason by contradiction, considering for all $t \in ]0,1[$ the point $\boldsymbol{q}(\epsilon, t) = (1-t)\,\boldsymbol{q}(\epsilon) + t\,\boldsymbol{p}$, then observing the behavior of $\varphi(x)$ near $x = 0$.)
(b) Show that this is no longer true if we assume that $\epsilon = 0$.
Q12 Continuous Probability Distributions and Random Variables Entropy, Information, or Log-Sobolev Functional Analysis
We define $Q_{>0} = (\mathbb{R}_+^*)^{I \times J}$ and $\mathscr{L} : Q_{>0} \times (\mathbb{R}^I \times \mathbb{R}^J) \rightarrow \mathbb{R}$ by $$\mathscr{L}(\boldsymbol{q}, (f, g)) = J_\epsilon(\boldsymbol{q}) + \sum_{i \in I} f_i \left(\alpha_i - \sum_{j \in J} q_{ij}\right) + \sum_{j \in J} g_j \left(\beta_j - \sum_{i \in I} q_{ij}\right).$$ (a) Verify that $Q_{>0}$ is an open convex subset of $\mathbb{R}^{I \times J}$.
(b) Show that there exists $(f(\epsilon), g(\epsilon)) \in \mathbb{R}^I \times \mathbb{R}^J$ such that $(\boldsymbol{q}(\epsilon), (f(\epsilon), g(\epsilon)))$ is a saddle point of $\mathscr{L}$. (Hint: One may identify $\mathbb{R}^{I \times J}$ with $\mathbb{R}^n$ and $\mathbb{R}^I \times \mathbb{R}^J$ with $\mathbb{R}^m$, for $n$ the cardinality of $I \times J$ and $m$ the sum of the cardinalities of $I$ and $J$, then use question 3 of part I.)
Q13 Stationary points and optimisation Existence or properties of extrema via abstract/theoretical argument
We define $Q_{>0} = (\mathbb{R}_+^*)^{I \times J}$ and $\mathscr{L} : Q_{>0} \times (\mathbb{R}^I \times \mathbb{R}^J) \rightarrow \mathbb{R}$ by $$\mathscr{L}(\boldsymbol{q}, (f, g)) = J_\epsilon(\boldsymbol{q}) + \sum_{i \in I} f_i \left(\alpha_i - \sum_{j \in J} q_{ij}\right) + \sum_{j \in J} g_j \left(\beta_j - \sum_{i \in I} q_{ij}\right).$$ (a) Show that for all $(f, g) \in \mathbb{R}^I \times \mathbb{R}^J$, the minimum of $\boldsymbol{q} \mapsto \mathscr{L}(\boldsymbol{q}, (f, g))$ on $Q_{>0}$ is attained at $q(f,g)_{ij} = e^{(f_i + g_j - C_{ij})/\epsilon} p_{ij}$.
(b) Calculate the value of $G(f, g) = \mathscr{L}(q(f,g), (f,g))$.
(c) Verify that $G$ is concave on $\mathbb{R}^I \times \mathbb{R}^J$.
Q14 Stationary points and optimisation Existence or properties of extrema via abstract/theoretical argument
Verify that if $f_* : \mathbb{R}^J \rightarrow \mathbb{R}^I$ and $g_* : \mathbb{R}^I \rightarrow \mathbb{R}^J$ are defined by $$f_*(g)_i = -\epsilon \log\left(\sum_{j \in J} e^{(g_j - C_{ij})/\epsilon} \beta_j\right) \text{ and } g_*(f)_j = -\epsilon \log\left(\sum_{i \in I} e^{(f_i - C_{ij})/\epsilon} \alpha_i\right)$$ then for all $(f, g) \in \mathbb{R}^I \times \mathbb{R}^J$, we have $\frac{\partial G}{\partial f_i}(f_*(g), g) = \frac{\partial G}{\partial g_j}(f, g_*(f)) = 0$ for all $(i,j) \in I \times J$.
Q15 Proof Direct Proof of an Inequality
Let $(f^0, g^0) \in \mathbb{R}^I \times \mathbb{R}^J$. For all $k \geq 0$, we consider $$g^{k+1} = g_*(f^k) \text{ and } f^{k+1} = f_*(g^{k+1}).$$ Show that the sequence $(G(f^k, g^k))_{k \geq 0}$ is increasing.
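(An editorial illustration, not part of the exam: questions 13 to 15 together describe Sinkhorn's alternating updates for entropically regularized optimal transport. The NumPy sketch below uses data of our own and assumes the closed form of $G$ asked for in question 13(b), namely $G(f,g) = \sum_i f_i \alpha_i + \sum_j g_j \beta_j + \epsilon \big(1 - \sum_{i,j} e^{(f_i + g_j - C_{ij})/\epsilon} p_{ij}\big)$, which follows from substituting $q(f,g)$ into $\mathscr{L}$; it checks numerically that the dual values never decrease and that the limiting coupling has the prescribed marginals.)

```python
import numpy as np

rng = np.random.default_rng(2)
nI, nJ, eps = 3, 4, 0.5
alpha = rng.dirichlet(np.ones(nI))     # first marginal (alpha_i > 0)
beta = rng.dirichlet(np.ones(nJ))      # second marginal (beta_j > 0)
C = rng.random((nI, nJ))               # a nonnegative cost matrix C_ij
p = np.outer(alpha, beta)              # independent coupling p_ij = alpha_i beta_j

def G(f, g):
    # dual value G(f, g) = L(q(f,g), (f,g)), closed form assumed as above
    q = np.exp((f[:, None] + g[None, :] - C) / eps) * p
    return f @ alpha + g @ beta + eps * (1.0 - q.sum())

def f_star(g):  # the map f_* of question 14
    return -eps * np.log(np.exp((g[None, :] - C) / eps) @ beta)

def g_star(f):  # the map g_* of question 14
    return -eps * np.log(np.exp((f[:, None] - C) / eps).T @ alpha)

f, g = np.zeros(nI), np.zeros(nJ)      # the starting point (f^0, g^0)
vals = [G(f, g)]
for _ in range(500):
    g = g_star(f)                      # g^{k+1} = g_*(f^k)
    f = f_star(g)                      # f^{k+1} = f_*(g^{k+1})
    vals.append(G(f, g))

# dual values never decrease along the iteration (question 15) ...
assert all(b >= a - 1e-12 for a, b in zip(vals, vals[1:]))
# ... and the limiting coupling q(f^k, g^k) recovers the marginals (question 16)
q = np.exp((f[:, None] + g[None, :] - C) / eps) * p
assert np.allclose(q.sum(axis=1), alpha) and np.allclose(q.sum(axis=0), beta)
```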
Q16 Proof Direct Proof of a Stated Identity or Equality
Let $(f^0, g^0) \in \mathbb{R}^I \times \mathbb{R}^J$. For all $k \geq 0$, we consider $$g^{k+1} = g_*(f^k) \text{ and } f^{k+1} = f_*(g^{k+1}).$$ Assume that there exist $f^\infty = (f_i^\infty)_{i \in I}$ and $g^\infty = (g_j^\infty)_{j \in J}$ such that $|f_i^k - f_i^\infty| \rightarrow 0$ and $|g_j^k - g_j^\infty| \rightarrow 0$ for all $i \in I$ and $j \in J$. We denote $G_* = \sup\{G(f,g) \mid (f,g) \in \mathbb{R}^I \times \mathbb{R}^J\}$.
(a) Show that $G(f^\infty, g^\infty) = G_*$.
(b) Show that $G(f(\epsilon), g(\epsilon)) = G_*$.
(c) Show that there exists a constant $a \in \mathbb{R}$ such that $f(\epsilon)_i = f_i^\infty + a$ and $g(\epsilon)_j = g_j^\infty - a$ for all $(i,j) \in I \times J$.
(d) Deduce that $q(f^k, g^k) \rightarrow q(\epsilon)$.