Conditional Probability and Total Probability with Tree/Bayes Structure

Compute probabilities using the law of total probability, conditional probabilities, or Bayes' theorem in a scenario with discrete conditioning events.

grandes-ecoles 2018 Q21
Let $n$ be an integer such that $n \geqslant 2$. We set $E' = \operatorname{Vect}(e_{1}, \ldots, e_{n-1})$ and denote by $\pi$ the orthogonal projection onto $E'$. We set $X' = \pi \circ X = \sum_{i=1}^{n-1} \varepsilon_{i} e_{i}$. For $t$ in $\{-1, 1\}$, let $H_t = E' + t e_n$ and $C_{t} = \pi(C \cap H_{t})$. For $t$ in $\{-1, 1\}$, we denote by $Y_{t}$ the projection of $X'$ onto the non-empty closed convex set $C_{t}$.
Show that
$$\mathbb{P}(X \in C) = \frac{1}{2} \mathbb{P}(X' \in C_{+1}) + \frac{1}{2} \mathbb{P}(X' \in C_{-1})$$
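The identity can be checked numerically on a toy instance. The sketch below assumes, as this excerpt suggests but does not fully restate, that $X = \sum_{i=1}^{n} \varepsilon_i e_i$ with the $\varepsilon_i$ i.i.d. uniform on $\{-1,1\}$ and $(e_i)$ the standard basis of $\mathbb{R}^n$; the convex set $C$ is a halfspace chosen purely for illustration.

```python
import itertools

# Assumed setup (not fully defined in this excerpt): X = sum_i eps_i e_i with
# eps_i i.i.d. uniform on {-1,1}, (e_i) the standard basis of R^n.
# C is an illustrative closed convex set: the halfspace {x : x_1+...+x_n >= 0}.
n = 4
in_C = lambda x: sum(x) >= 0

# Left-hand side: P(X in C), enumerating the 2^n equally likely sign vectors.
signs = itertools.product([-1, 1], repeat=n)
lhs = sum(in_C(x) for x in signs) / 2 ** n

# Right-hand side: condition on eps_n = t.  On that event X = X' + t*e_n lies
# in H_t, and X in C iff X' in C_t = pi(C ∩ H_t), i.e. iff (X', t) in C.
def p_t(t):
    subs = itertools.product([-1, 1], repeat=n - 1)
    return sum(in_C(xp + (t,)) for xp in subs) / 2 ** (n - 1)

rhs = 0.5 * p_t(+1) + 0.5 * p_t(-1)
print(lhs, rhs)  # → 0.6875 0.6875
```

The two sides agree because conditioning on $\varepsilon_n = t$ partitions the sample space into two events of probability $\tfrac12$ each, which is exactly the law of total probability used in the question.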
grandes-ecoles 2018 Q24
We denote
$$p_{+} = \mathbb{P}(X' \in C_{+1}) \quad \text{and} \quad p_{-} = \mathbb{P}(X' \in C_{-1})$$
We will assume, without loss of generality, that $p_{+} \geqslant p_{-}$.
Show that $p_{-} > 0$.
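As a remark (not a full solution), the identity of Q21 links $p_{+}$ and $p_{-}$ directly to $\mathbb{P}(X \in C)$:

```latex
p_{+} + p_{-}
  = \mathbb{P}(X' \in C_{+1}) + \mathbb{P}(X' \in C_{-1})
  = 2\,\mathbb{P}(X \in C).
```

So proving $p_{-} > 0$ amounts to ruling out that the smaller of the two slice probabilities vanishes, i.e. that all of the mass of $\{X \in C\}$ sits in a single slice $H_t$.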
grandes-ecoles 2023 Q8
We consider $\alpha = (\alpha_i)_{i \in I} \in (\mathbb{R}_+^*)^I$ and $\beta = (\beta_j)_{j \in J} \in (\mathbb{R}_+^*)^J$ such that $\sum_{i \in I} \alpha_i = \sum_{j \in J} \beta_j = 1$. We denote $$F(\alpha, \beta) = \left\{q \in Q \mid \sum_{j' \in J} q_{ij'} = \alpha_i \text{ and } \sum_{i' \in I} q_{i'j} = \beta_j \text{ for all } (i,j) \in I \times J\right\}.$$ We denote by $\boldsymbol{p}$ the element of $F(\alpha, \beta)$ defined by $p_{ij} = \alpha_i \beta_j > 0$ for all $(i,j) \in I \times J$. Let $X_1$ and $X_2$ be two random variables such that $X_1$ takes values in $I$ and $X_2$ takes values in $J$.
(a) Verify that if $\boldsymbol{q} \in F(\alpha, \beta)$, then $\sum_{i \in I} \sum_{j \in J} q_{ij} = 1$.
(b) Assume that $P(X_1 = i, X_2 = j) = q_{ij}$ for all $(i,j) \in I \times J$, with $\boldsymbol{q} \in F(\alpha, \beta)$. Calculate the distribution of $X_1$ and that of $X_2$ in terms of $\alpha$ and $\beta$.
(c) What can we say about $X_1$ and $X_2$ when $\boldsymbol{q} = \boldsymbol{p}$?
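The three parts can be illustrated on a small concrete coupling. The sketch below uses arbitrary illustrative values of $\alpha$ and $\beta$ (they are not given in the problem) and the product element $\boldsymbol{p}$ defined in the statement; the row and column sums play the roles of the marginal laws of $X_1$ and $X_2$.

```python
import numpy as np

# Illustrative instance with |I| = 2, |J| = 3 (values chosen for the example).
alpha = np.array([0.4, 0.6])        # candidate distribution of X1 on I
beta = np.array([0.2, 0.3, 0.5])    # candidate distribution of X2 on J

# The product coupling p_ij = alpha_i * beta_j from the statement.
p = np.outer(alpha, beta)

# (a) the entries of an element of F(alpha, beta) sum to 1.
assert np.isclose(p.sum(), 1.0)

# (b) the marginals: row sums recover alpha (law of X1),
#     column sums recover beta (law of X2).
assert np.allclose(p.sum(axis=1), alpha)
assert np.allclose(p.sum(axis=0), beta)

# (c) under q = p the joint law factors as the product of the marginals,
#     i.e. X1 and X2 are independent.
assert np.allclose(p, p.sum(axis=1, keepdims=True) * p.sum(axis=0, keepdims=True))
print("p lies in F(alpha, beta) and factors into its marginals")
```

The same row-sum/column-sum computation is exactly the law of total probability: $P(X_1 = i) = \sum_{j \in J} P(X_1 = i, X_2 = j)$.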