Full Solution Manual for
Probabilistic Machine Learning: An Introduction
Kevin Murphy
Part I Foundations
2.1 Conditional independence
1. By Bayes' rule,
\[
P(H \mid E_1, E_2) = \frac{P(E_1, E_2 \mid H)\, P(H)}{P(E_1, E_2)} \tag{1}
\]
Thus the information in (ii) is sufficient. In fact, we do not need $P(E_1, E_2)$, because it is just the normalization constant (it enforces the sum-to-one constraint). (i) and (iii) are insufficient.
2. If we further assume $E_1 \perp E_2 \mid H$, then
\[
P(H \mid E_1, E_2) = \frac{P(E_1 \mid H)\, P(E_2 \mid H)\, P(H)}{P(E_1, E_2)} \tag{2}
\]
so (i) and (ii) are obviously sufficient. (iii) is also sufficient, because we can compute $P(E_1, E_2)$ by normalization.
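To make case 2 concrete, here is a minimal numerical sketch in Python: it computes the posterior from $P(E_1 \mid H)$, $P(E_2 \mid H)$, and the prior alone, recovering $P(E_1, E_2)$ implicitly by normalization. The probability values are invented for illustration and are not part of the exercise.

import numpy as np

# Prior P(H) over two hypotheses, and per-evidence likelihoods P(E_i | H).
# All numbers are illustrative assumptions.
prior = np.array([0.3, 0.7])    # P(H = h)
lik_e1 = np.array([0.9, 0.2])   # P(E_1 = observed value | H = h)
lik_e2 = np.array([0.6, 0.4])   # P(E_2 = observed value | H = h)

# Conditional independence: P(E_1, E_2 | H) = P(E_1 | H) P(E_2 | H).
unnormalized = lik_e1 * lik_e2 * prior

# P(E_1, E_2) is just the sum of the unnormalized posterior.
posterior = unnormalized / unnormalized.sum()
print(posterior)                # P(H | E_1, E_2), sums to one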
2.2 Pairwise independence does not imply mutual independence

We provide two counterexamples.

Let $X_1$ and $X_2$ be independent binary random variables, and let $X_3 = X_1 \oplus X_2$, where $\oplus$ is the XOR operator. We have $p(X_3 \mid X_1, X_2) \neq p(X_3)$, since $X_3$ can be computed deterministically from $X_1$ and $X_2$. So the variables $\{X_1, X_2, X_3\}$ are not mutually independent. However, we also have $p(X_3 \mid X_1) = p(X_3)$, since without $X_2$, $X_1$ carries no information about $X_3$. So $X_1 \perp X_3$, and similarly $X_2 \perp X_3$. Hence $\{X_1, X_2, X_3\}$ are pairwise independent.

Here is a different example. Let there be four balls in a bag, numbered 1 to 4. Suppose we draw one at random, and define 3 events as follows:
$X_1$: ball 1 or 2 is drawn.
$X_2$: ball 2 or 3 is drawn.
$X_3$: ball 1 or 3 is drawn.
We have $p(X_1) = p(X_2) = p(X_3) = 0.5$. Also, $p(X_1, X_2) = p(X_2, X_3) = p(X_1, X_3) = 0.25$. Hence $p(X_1, X_2) = p(X_1)\, p(X_2)$, and similarly for the other pairs, so the events are pairwise independent. However, $p(X_1, X_2, X_3) = 0 \neq 1/8 = p(X_1)\, p(X_2)\, p(X_3)$, so they are not mutually independent.
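Both counterexamples can be checked by brute-force enumeration. Here is a minimal sketch for the four-ball construction (the XOR example can be verified the same way by enumerating the four settings of $(X_1, X_2)$): every pair of events factorizes, but the triple does not.

from itertools import combinations

# Sample space: one of four equally likely balls.
balls = [1, 2, 3, 4]
events = {
    "X1": {1, 2},  # ball 1 or 2 is drawn
    "X2": {2, 3},  # ball 2 or 3 is drawn
    "X3": {1, 3},  # ball 1 or 3 is drawn
}

def prob(subset):
    # Probability that the drawn ball lies in `subset` under a uniform draw.
    return len(subset) / len(balls)

# Pairwise independence: p(Xi, Xj) = p(Xi) p(Xj) for every pair.
for (a, sa), (b, sb) in combinations(events.items(), 2):
    assert prob(sa & sb) == prob(sa) * prob(sb), (a, b)

# No mutual independence: the triple intersection is empty.
triple = events["X1"] & events["X2"] & events["X3"]
print(prob(triple))   # 0.0, but p(X1) p(X2) p(X3) = 0.125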
2.3 Conditional independence iff joint factorizes
Independence $\Rightarrow$ factorization. Let $g(x, z) = p(x \mid z)$ and $h(y, z) = p(y \mid z)$. If $X \perp Y \mid Z$, then
\[
p(x, y \mid z) = p(x \mid z)\, p(y \mid z) = g(x, z)\, h(y, z) \tag{3}
\]
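The converse direction is not shown above. As a sketch of the standard argument (assuming the discrete case; the marginal sums $G$ and $H$ are notation introduced here, not from the original), suppose $p(x, y \mid z) = g(x, z)\, h(y, z)$ for some nonnegative functions $g$ and $h$. Then

\begin{align*}
p(x \mid z) &= \sum_y g(x, z)\, h(y, z) = g(x, z)\, H(z), \quad H(z) \triangleq \sum_y h(y, z) \\
p(y \mid z) &= \sum_x g(x, z)\, h(y, z) = h(y, z)\, G(z), \quad G(z) \triangleq \sum_x g(x, z) \\
1 &= \sum_{x, y} p(x, y \mid z) = G(z)\, H(z) \\
p(x \mid z)\, p(y \mid z) &= g(x, z)\, h(y, z)\, G(z)\, H(z) = p(x, y \mid z)
\end{align*}

so $X \perp Y \mid Z$.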