Q3
Consider an open set $U \subset \mathbb{R}^n$, a $\mathcal{C}^1$ function $h : U \rightarrow \mathbb{R}$, a matrix $A \in \mathbb{R}^{m \times n}$, and a vector $b \in \mathbb{R}^m$. Assume that there exists $x_* \in U$, a minimizer of $h$ on the set $V_b = \{x \in U \mid Ax + b = 0\}$.
(a) Show that for all $u \in \mathbb{R}^n$ such that $Au = 0$ we have $\left\langle \nabla h(x_*), u \right\rangle_{\mathbb{R}^n} = 0$ where $\nabla h(x)$ denotes the gradient of $h$ at $x$.
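As a numerical sanity check of (a) (a sketch, not a proof): take the convex example $h(x) = \tfrac{1}{2}\|x\|^2$, for which the minimizer of $h$ on $V_b$ is the minimum-norm point $x_* = -A^+ b$ ($A^+$ the pseudoinverse, assuming $A$ has full row rank). The gradient $\nabla h(x_*) = x_*$ is then orthogonal to $\ker A$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 2
A = rng.standard_normal((m, n))   # full row rank with probability 1
b = rng.standard_normal(m)

# h(x) = 0.5 ||x||^2, so grad h(x) = x; the minimizer of h on
# {x : Ax + b = 0} is the minimum-norm point x_* = -A^+ b.
x_star = -np.linalg.pinv(A) @ b
grad = x_star

# An orthonormal basis of ker A from the SVD: the last n - m right
# singular vectors span the null space of A.
_, _, Vt = np.linalg.svd(A)
kernel = Vt[m:].T                 # columns span ker A

# <grad h(x_*), u> = 0 for every u in ker A:
print(np.allclose(kernel.T @ grad, 0))
```

The check works because $x_* = -A^T (A A^T)^{-1} b$ lies in the row space of $A$, which is the orthogonal complement of $\ker A$.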
(b) Show the existence of $\nu_* \in \mathbb{R}^m$ such that $\nabla h(x_*) - A^T \nu_* = 0$.
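Continuing the same illustrative example ($h(x) = \tfrac{1}{2}\|x\|^2$, $x_* = -A^+ b$): part (a) says $\nabla h(x_*)$ is orthogonal to $\ker A$, i.e. it lies in $\operatorname{range}(A^T)$, so the system $A^T \nu = \nabla h(x_*)$ is solvable. A least-squares solve recovers such a $\nu_*$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 2
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Same example: h(x) = 0.5 ||x||^2, grad h(x) = x, x_* = -A^+ b.
x_star = -np.linalg.pinv(A) @ b
grad = x_star

# grad h(x_*) lies in range(A^T), so A^T nu = grad h(x_*) has an
# exact solution, which lstsq finds:
nu_star, *_ = np.linalg.lstsq(A.T, grad, rcond=None)
print(np.allclose(grad - A.T @ nu_star, 0))
```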
(c) Deduce that the function $L : U \times \mathbb{R}^m \rightarrow \mathbb{R}$ such that $L(x, \nu) = h(x) - \langle \nu, Ax + b \rangle_{\mathbb{R}^m}$ satisfies $\frac{\partial L}{\partial x_k}(x_*, \nu_*) = 0$ for all $1 \leq k \leq n$, where $\frac{\partial L}{\partial x_k}(x, \nu)$ denotes the partial derivative of $L$ with respect to the $k$-th coordinate of $x \in \mathbb{R}^n$.
(d) Conclude that if $U$ is convex and $h$ is convex on $U$, then $L$ admits a saddle point at $(x_*, \nu_*)$; that is, $$L(x_*, \nu) \leq L(x_*, \nu_*) \leq L(x, \nu_*)$$ for all $(x, \nu) \in U \times \mathbb{R}^m$.
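The saddle-point inequalities of (d) can also be checked numerically in the same illustrative setting ($h(x) = \tfrac{1}{2}\|x\|^2$, $U = \mathbb{R}^n$, $x_* = -A^+ b$, $\nu_*$ from part (b)). The left inequality is in fact an equality, since $A x_* + b = 0$ makes $L(x_*, \nu)$ independent of $\nu$; the right one holds because $x \mapsto L(x, \nu_*)$ is convex with a critical point at $x_*$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 2
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

h = lambda x: 0.5 * x @ x                  # convex C^1 example
L = lambda x, nu: h(x) - nu @ (A @ x + b)  # the Lagrangian above

x_star = -np.linalg.pinv(A) @ b            # constrained minimizer of h
nu_star, *_ = np.linalg.lstsq(A.T, x_star, rcond=None)  # grad h(x_*) = A^T nu_*

xs = rng.standard_normal((100, n))
nus = rng.standard_normal((100, m))
# L(x_*, nu) <= L(x_*, nu_*): equality up to rounding, since A x_* + b = 0.
print(all(L(x_star, nu) <= L(x_star, nu_star) + 1e-9 for nu in nus))
# L(x_*, nu_*) <= L(x, nu_*) for every sampled x: x_* minimizes L(., nu_*).
print(all(L(x_star, nu_star) <= L(x, nu_star) + 1e-9 for x in xs))
```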