In information theory, when an event $E$ occurs, the information content $I(E)$ of event $E$ is defined as follows:
$$I(E) = -\log_2 \mathrm{P}(E)$$
Which of the following statements in $\langle$Remarks$\rangle$ are correct? (Note: the probability $\mathrm{P}(E)$ of event $E$ is positive, and the unit of information content is bits.) [4 points]
$\langle$Remarks$\rangle$\\
ㄱ. If event $E$ is rolling an odd number with a single fair die, then $I(E) = 1$.\\
ㄴ. If two events $A$ and $B$ are independent and $\mathrm{P}(A \cap B) > 0$, then $I(A \cap B) = I(A) + I(B)$.\\
ㄷ. For two events $A$ and $B$ with $\mathrm{P}(A) > 0$ and $\mathrm{P}(B) > 0$, we have $2I(A \cup B) \leq I(A) + I(B)$.\\
(1) ㄱ\\
(2) ㄱ, ㄴ\\
(3) ㄱ, ㄷ\\
(4) ㄴ, ㄷ\\
(5) ㄱ, ㄴ, ㄷ
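The three statements can be checked numerically. A minimal Python sketch (not part of the original exam): ㄱ uses $\mathrm{P}(E) = 3/6 = 1/2$; ㄴ uses $\mathrm{P}(A \cap B) = \mathrm{P}(A)\mathrm{P}(B)$ for independent events; ㄷ follows from $\mathrm{P}(A \cup B) \geq \max(\mathrm{P}(A), \mathrm{P}(B)) \geq \sqrt{\mathrm{P}(A)\mathrm{P}(B)}$, which is spot-checked on randomly sampled feasible probabilities.

```python
import math
import random

def info(p):
    """Information content in bits: I(E) = -log2 P(E)."""
    return -math.log2(p)

# ㄱ: an odd number on a fair die has P(E) = 3/6 = 1/2, so I(E) = 1 bit.
print(info(1 / 2))  # 1.0

# ㄴ: for independent A and B, P(A ∩ B) = P(A) P(B), so the logs add.
pa, pb = 0.3, 0.25
print(math.isclose(info(pa * pb), info(pa) + info(pb)))  # True

# ㄷ: P(A ∪ B) >= max(P(A), P(B)) >= sqrt(P(A) P(B)),
# hence 2 I(A ∪ B) <= I(A) + I(B). Spot-check random feasible cases.
random.seed(0)
for _ in range(1000):
    pa = random.uniform(0.01, 1.0)
    pb = random.uniform(0.01, 1.0)
    lo, hi = max(pa, pb), min(1.0, pa + pb)
    p_union = random.uniform(lo, hi)  # any feasible value of P(A ∪ B)
    assert 2 * info(p_union) <= info(pa) + info(pb) + 1e-9
print("all three statements hold")
```

Since ㄱ, ㄴ, and ㄷ are all correct, the answer is (5).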