The mutual information between the pair (X, A) and the pair (Y, Z) is

\[
I(X, A; Y, Z) = \sum_{x, a, y, z} p(x, a, y, z) \log \frac{p(x, a, y, z)}{p(x, a)\, p(y, z)}.
\]

Similar to the identities above,

\[
I(X, A; Y, Z) = H(X, A) - H(X, A \mid Y, Z) = H(Y, Z) - H(Y, Z \mid X, A).
\]

Notice that the ";" operator tells us between which sets of random variables the mutual information is taken. Similarly, we can also use conditional mutual information, where the conditioning is on a set of random variables. For instance,

\[
I(X, A; Y, Z \mid B, C) = H(X, A \mid B, C) - H(X, A \mid Y, Z, B, C) = H(Y, Z \mid B, C) - H(Y, Z \mid X, A, B, C).
\]

Finally, we have the chain rule for mutual information (not hard to prove), which states that for any random variables $X_1, \ldots, X_n, Y$,

\[
I(X_1, X_2, \ldots, X_n; Y) = \sum_{i=1}^{n} I(X_i; Y \mid X_1, X_2, \ldots, X_{i-1}).
\]
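The chain rule can be checked numerically. The sketch below, which assumes a small hypothetical joint distribution over three binary variables (the distribution and all function names are illustrative, not from the notes), computes $I(X_1, X_2; Y)$ directly from the definition and compares it against $I(X_1; Y) + I(X_2; Y \mid X_1)$:

```python
import itertools
import math
import random

random.seed(0)

# Hypothetical joint pmf over (x1, x2, y), each binary: 8 random weights,
# normalized to sum to 1. Any strictly positive pmf works here.
vals = [random.random() for _ in range(8)]
Z = sum(vals)
p = {}
for idx, (x1, x2, y) in enumerate(itertools.product([0, 1], repeat=3)):
    p[(x1, x2, y)] = vals[idx] / Z

def marginal(keep):
    """Marginal pmf over the coordinate positions listed in `keep`."""
    m = {}
    for outcome, prob in p.items():
        key = tuple(outcome[i] for i in keep)
        m[key] = m.get(key, 0.0) + prob
    return m

def mi(A, B, C=()):
    """Conditional mutual information I(A; B | C) in bits, computed as
    sum_{a,b,c} p(a,b,c) * log2( p(c) p(a,b,c) / (p(a,c) p(b,c)) )."""
    pABC = marginal(A + B + C)
    pAC = marginal(A + C)
    pBC = marginal(B + C)
    pC = marginal(C)          # for C = (), this is {(): 1.0}
    total = 0.0
    for abc, prob in pABC.items():
        a = abc[:len(A)]
        b = abc[len(A):len(A) + len(B)]
        c = abc[len(A) + len(B):]
        if prob > 0:
            total += prob * math.log2(pC[c] * prob / (pAC[a + c] * pBC[b + c]))
    return total

# Chain rule: I(X1, X2; Y) = I(X1; Y) + I(X2; Y | X1).
lhs = mi((0, 1), (2,))
rhs = mi((0,), (2,)) + mi((1,), (2,), (0,))
print(lhs, rhs)
assert abs(lhs - rhs) < 1e-9
```

The two sides agree up to floating-point error for any joint pmf, which is exactly what the chain rule asserts; swapping in a different seed or more variables does not change this.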
- Spring '08