(a) Show X → (Y, Z) → W forms a Markov chain.
(b) Find I(X; W, Y).

5. (30 points) Two looks. Compare the mutual information I(X; Y1, Y2) that (Y1, Y2) provide about X with the sum of the individual mutual informations I(X; Y1) + I(X; Y2) for each of the following two probability mass functions.
(a) (15 points) Two independent looks: p(x, y1, y2) = p(x) p(y1 | x) p(y2 | x).
(b) (15 points) One look at two independents: p(x, y1, y2) = p(y1) p(y2) p(x | y1, y2).
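Not part of the original assignment, but a quick numerical sanity check can make the comparison in Problem 5 concrete. The sketch below is plain Python; the specific distributions it uses (a uniform binary X seen through two BSC(0.1) "looks" for case (a), and X = Y1 XOR Y2 with independent fair bits Y1, Y2 for case (b)) are illustrative assumptions, not values given in the problem.

from itertools import product
from math import log2

def mutual_information(joint):
    """I(A; B) in bits for a joint pmf given as a dict {(a, b): prob}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

def compare(joint_xy1y2, label):
    """Print I(X; Y1, Y2) versus I(X; Y1) + I(X; Y2) for a pmf {(x, y1, y2): prob}."""
    j_both, j1, j2 = {}, {}, {}
    for (x, y1, y2), p in joint_xy1y2.items():
        j_both[(x, (y1, y2))] = j_both.get((x, (y1, y2)), 0.0) + p
        j1[(x, y1)] = j1.get((x, y1), 0.0) + p
        j2[(x, y2)] = j2.get((x, y2), 0.0) + p
    i_both = mutual_information(j_both)
    i_sum = mutual_information(j1) + mutual_information(j2)
    print(f"{label}: I(X;Y1,Y2) = {i_both:.4f}   I(X;Y1) + I(X;Y2) = {i_sum:.4f}")

# Case (a), two independent looks: p(x, y1, y2) = p(x) p(y1|x) p(y2|x).
# Assumed example: uniform binary X observed through two BSC(eps) channels.
eps = 0.1

def bsc(y, x):
    """P(Y = y | X = x) for a binary symmetric channel with crossover eps."""
    return 1 - eps if y == x else eps

case_a = {(x, y1, y2): 0.5 * bsc(y1, x) * bsc(y2, x)
          for x, y1, y2 in product((0, 1), repeat=3)}
compare(case_a, "case (a)")

# Case (b), one look at two independents: p(x, y1, y2) = p(y1) p(y2) p(x|y1, y2).
# Assumed example: Y1, Y2 independent fair bits and X = Y1 XOR Y2.
case_b = {(y1 ^ y2, y1, y2): 0.25 for y1, y2 in product((0, 1), repeat=2)}
compare(case_b, "case (b)")

For these particular examples the two printed quantities land on opposite sides of I(X; Y1) + I(X; Y2), which is the contrast parts (a) and (b) ask you to explain in general.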