kyotsu-test 2015 QCourse2-II-Q1
Given two points $\mathrm{A}(1,-1,0)$ and $\mathrm{B}(-2,1,2)$ in a coordinate space with origin O, set $\overrightarrow{\mathrm{OA}} = \vec{a}$ and $\overrightarrow{\mathrm{OB}} = \vec{b}$.
(1) First, we find the value of $t$ that minimizes $|\vec{a} + t\vec{b}|$. Since

$$|\vec{a} + t\vec{b}|^2 = \mathbf{A}t^2 - \mathbf{B}t + \mathbf{C},$$

$|\vec{a} + t\vec{b}|$ is minimized at $t = \frac{\mathbf{D}}{\mathbf{E}}$, and its minimum value is $\mathbf{F}$.
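The expansion $|\vec{a} + t\vec{b}|^2 = |\vec{b}|^2 t^2 + 2(\vec{a}\cdot\vec{b})\,t + |\vec{a}|^2$ behind the boxed quadratic can be checked numerically. A minimal sketch in plain Python (the helper `dot` and all variable names are mine, not part of the problem; note the statement writes the $t$ coefficient with a leading minus sign):

```python
# Coordinates from the problem: A(1, -1, 0), B(-2, 1, 2)
a = (1, -1, 0)
b = (-2, 1, 2)

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# |a + t b|^2 = |b|^2 t^2 + 2(a.b) t + |a|^2, a quadratic in t
coef2 = dot(b, b)        # coefficient of t^2
coef1 = 2 * dot(a, b)    # coefficient of t (statement's form absorbs the sign)
coef0 = dot(a, a)        # constant term

t_min = -coef1 / (2 * coef2)   # vertex of the upward-opening parabola
v = tuple(x + t_min * y for x, y in zip(a, b))
min_norm = dot(v, v) ** 0.5    # |a + t_min b|
```

A quick check such as `min_norm <= dot(tuple(x + 0.5*y for x, y in zip(a, b)), tuple(x + 0.5*y for x, y in zip(a, b))) ** 0.5` confirms the vertex value beats any other choice of $t$.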
(2) Next, the vectors $\vec{c}$ orthogonal to both $\vec{a}$ and $\vec{b}$ can be written as

$$\vec{c} = s(\mathbf{G}, \mathbf{H}, 1),$$

where $s$ is a non-zero real number. Now let C and D be the points such that $\overrightarrow{\mathrm{OC}} = (\mathbf{G}, \mathbf{H}, 1)$ and $\overrightarrow{\mathrm{OD}} = 3\vec{a} + \vec{b}$. Since $\angle\mathrm{CBD} = \frac{\pi}{\mathbf{I}}$, the area of triangle BCD is $\frac{\mathbf{J}\sqrt{\mathbf{K}}}{\mathbf{L}}$.
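The same kind of numeric sketch covers part (2): any vector orthogonal to both $\vec{a}$ and $\vec{b}$ is parallel to $\vec{a}\times\vec{b}$, and the triangle area follows from $\frac{1}{2}\sqrt{|\overrightarrow{\mathrm{BC}}|^2|\overrightarrow{\mathrm{BD}}|^2 - (\overrightarrow{\mathrm{BC}}\cdot\overrightarrow{\mathrm{BD}})^2}$ (the Lagrange identity). The helper functions below are mine, not part of the problem:

```python
import math

a = (1, -1, 0)   # OA
b = (-2, 1, 2)   # OB

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

# A vector orthogonal to both a and b is parallel to a x b;
# scale it so the third component equals 1, matching s(G, H, 1).
n = cross(a, b)
c = tuple(x / n[2] for x in n)              # OC

d = tuple(3*x + y for x, y in zip(a, b))    # OD = 3a + b
bc = tuple(x - y for x, y in zip(c, b))     # BC = OC - OB
bd = tuple(x - y for x, y in zip(d, b))     # BD = OD - OB

angle = math.acos(dot(bc, bd) / math.sqrt(dot(bc, bc) * dot(bd, bd)))
# area of triangle BCD = |BC x BD| / 2, via the Lagrange identity
area = 0.5 * math.sqrt(dot(bc, bc) * dot(bd, bd) - dot(bc, bd)**2)
```

Checking `dot(c, a)` and `dot(c, b)` against zero confirms the orthogonality conditions before trusting `angle` and `area`.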