Proof of a General Conditional Expectation or Independence Property

The questions below ask for formal proofs of theoretical results involving conditional expectation, conditional probability identities, or independence of events in a general or abstract setting.

grandes-ecoles 2025 Q21
We fix $n \in \mathbf{N}^*$ and draw successively and with replacement two integers $p$ and $q$ according to the uniform distribution on $\llbracket 1, n \rrbracket$. Using the result $H_n \sim \ln n$ as $n \to +\infty$, show that $$\mathbf{P}(A_n \cup B_n) \sim \frac{\ln n}{n} \quad (n \to +\infty).$$
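The events $A_n$ and $B_n$ are defined earlier in the exam and are not reproduced here. The link between $H_n$ and the stated asymptotics can still be illustrated: for $p, q$ drawn uniformly and independently from $\llbracket 1, n \rrbracket$, the probability that $p$ divides $q$ is exactly $\frac{1}{n^2}\sum_{p=1}^{n}\lfloor n/p \rfloor \approx \frac{H_n}{n} \sim \frac{\ln n}{n}$. A minimal sketch checking this numerically (the divisibility event is an illustrative stand-in, not necessarily the exam's $A_n$):

```python
from math import log

def prob_divides(n):
    # Exact P(p divides q) for p, q uniform and independent on
    # {1, ..., n}: for each p there are floor(n/p) multiples of p
    # in {1, ..., n}, so the count of favourable pairs is the sum below.
    return sum(n // p for p in range(1, n + 1)) / n**2

n = 10_000
exact = prob_divides(n)
asymptotic = log(n) / n
# The ratio tends to 1 as n grows, consistent with H_n ~ ln n.
print(exact, asymptotic, exact / asymptotic)
```

The ratio approaches 1 slowly (the error term is of order $1/\ln n$), which is typical of harmonic-sum asymptotics.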
grandes-ecoles 2025 Q22
Let $Z$ be a random variable taking values in $\{ 0 , \ldots , M \}$. Show that:
$$\mathbf{E}[Z] = \sum_{(s,i,r) \in E} \left( \sum_{k=0}^{M} k\, \mathbf{P}\left( Z = k \mid (\tilde{S}_n, \tilde{I}_n, \tilde{R}_n) = (s,i,r) \right) \right) \mathbf{P}\left( (\tilde{S}_n, \tilde{I}_n, \tilde{R}_n) = (s,i,r) \right).$$
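This is the tower property (law of total expectation) with the triple $(\tilde{S}_n, \tilde{I}_n, \tilde{R}_n)$ as the conditioning variable; that triple comes from a model defined earlier in the exam and is not reproduced here. A minimal sketch, with a generic conditioning variable `W` standing in for the triple and an arbitrary toy joint distribution:

```python
from fractions import Fraction

# Toy joint distribution of (Z, W); W plays the role of the
# conditioning triple. Values are illustrative, not from the exam.
joint = {  # (z, w) -> P(Z = z, W = w)
    (0, 'a'): Fraction(1, 8), (1, 'a'): Fraction(1, 8),
    (0, 'b'): Fraction(1, 4), (2, 'b'): Fraction(1, 2),
}

# Direct expectation E[Z].
direct = sum(z * p for (z, w), p in joint.items())

# Tower formula: sum over w of E[Z | W = w] * P(W = w).
p_w = {}
for (z, w), p in joint.items():
    p_w[w] = p_w.get(w, Fraction(0)) + p

tower = Fraction(0)
for w, pw in p_w.items():
    cond_exp = sum(z * p / pw for (z, w2), p in joint.items() if w2 == w)
    tower += cond_exp * pw

print(direct, tower)  # the two computations agree
```

Using exact `Fraction` arithmetic makes the equality an identity rather than a floating-point coincidence.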
jee-advanced 2007 Q62
Let $E^c$ denote the complement of an event $E$. Let $E$, $F$, $G$ be pairwise independent events with $P(G) > 0$ and $P(E \cap F \cap G) = 0$. Then $P(E^c \cap F^c | G)$ equals
(A) $P(E^c) + P(F^c)$
(B) $P(E^c) - P(F^c)$
(C) $P(E^c) - P(F)$
(D) $P(E) - P(F^c)$
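One way to see the answer: expand the conditional probability by inclusion-exclusion on the complements, then use pairwise independence and the hypothesis $P(E \cap F \cap G) = 0$.

```latex
\begin{align*}
P(E^c \cap F^c \mid G)
  &= \frac{P(G) - P(E \cap G) - P(F \cap G) + P(E \cap F \cap G)}{P(G)} \\
  &= \frac{P(G) - P(E)P(G) - P(F)P(G) + 0}{P(G)}
     \qquad \text{(pairwise independence)} \\
  &= 1 - P(E) - P(F) = P(E^c) - P(F),
\end{align*}
```

which matches option (C).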