Definition 2.6 (Markov Chain) For random variables X1, X2, ..., Xn, where n ≥ 3, X1 → X2 → ... → Xn forms a Markov chain if

p(x1, x2, ..., xn) = p(x1, x2) p(x3|x2) ... p(xn|x_{n-1})

if p(x2), p(x3), ..., p(x_{n-1}) > 0, and p(x1, x2, ..., xn) = 0 otherwise.
Chapter 2
Information Measures
Raymond W. Yeung 2014
The Chinese University of Hong Kong
Thursday, 26 December, 13
In this chapter, the basic tools of information theory are introduced.
Chapter 1
The Science of Information
Information Theory
Founded by Claude E. Shannon (1916–2001)
"The Mathematical Theory of Communication"
Proposition 2.5 For random variables X, Y, and Z, X ⊥ Z | Y if and only if

p(x, y, z) = a(x, y) b(y, z)

for all x, y, and z such that p(y) > 0.
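The "if" direction of Proposition 2.5 can be checked numerically: take any nonnegative factors a(x, y) and b(y, z), normalize their product into a pmf, and verify that p(x, z | y) = p(x | y) p(z | y) for every y with p(y) > 0. The factor values below are illustrative assumptions.

```python
# Illustrative nonnegative factors (assumed values, not from the text).
a = {(0, 0): 0.3, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.4}
b = {(0, 0): 0.5, (0, 1): 0.5, (1, 0): 0.2, (1, 1): 0.8}

# Joint of the product form p(x, y, z) proportional to a(x, y) b(y, z).
p = {(x, y, z): a[(x, y)] * b[(y, z)]
     for x in (0, 1) for y in (0, 1) for z in (0, 1)}
total = sum(p.values())
p = {k: v / total for k, v in p.items()}  # normalize to a pmf

# Check p(x, z | y) = p(x | y) p(z | y) wherever p(y) > 0.
for y in (0, 1):
    py = sum(v for (x, yy, z), v in p.items() if yy == y)
    if py == 0:
        continue
    for x in (0, 1):
        for z in (0, 1):
            pxz_y = p[(x, y, z)] / py
            px_y = sum(p[(x, y, zz)] for zz in (0, 1)) / py
            pz_y = sum(p[(xx, y, z)] for xx in (0, 1)) / py
            assert abs(pxz_y - px_y * pz_y) < 1e-12
print("conditional independence verified")
```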
2.3 Continuity of Shannon's Information Measures for Fixed Finite Alphabets

Finite Alphabet vs Countable Alphabet
All Shannon's information measures are continuous when the alphabets are fixed and finite.
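A small sketch of what continuity means here: on a fixed finite alphabet, a small perturbation of the pmf changes the entropy by a small amount. The pmf and perturbation sizes below are illustrative assumptions.

```python
import math

def H(p):
    """Entropy in bits, with the convention 0 log 0 = 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Illustrative pmf on a fixed 3-symbol alphabet (assumed values).
p = [0.5, 0.25, 0.25]

# Shrink the perturbation; the entropy gap shrinks with it.
for eps in (1e-1, 1e-3, 1e-6):
    q = [0.5 + eps, 0.25 - eps, 0.25]  # still a valid pmf for small eps
    print(eps, abs(H(q) - H(p)))
```

Note that this continuity can fail for countable alphabets, which is the point of the finite-vs-countable contrast above.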
2.2 Shannon's Information Measures
Shannon's Information Measures
Entropy
Conditional entropy
Mutual information
Conditional mutual information
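The first three measures in the list can be computed from a small joint pmf using the standard identities H(Y|X) = H(X,Y) − H(X) and I(X;Y) = H(X) + H(Y) − H(X,Y); conditional mutual information I(X;Y|Z) follows the same pattern with a third variable. The joint pmf below is an illustrative assumption, not from the text.

```python
import math

# Illustrative joint pmf p(x, y) over {0, 1} x {0, 1} (assumed values).
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginals.
px = {x: sum(v for (xx, y), v in p.items() if xx == x) for x in (0, 1)}
py = {y: sum(v for (x, yy), v in p.items() if yy == y) for y in (0, 1)}

def H(dist):
    """Entropy in bits of a pmf given as a dict of probabilities."""
    return -sum(v * math.log2(v) for v in dist.values() if v > 0)

HX, HY, HXY = H(px), H(py), H(p)
H_Y_given_X = HXY - HX   # conditional entropy H(Y|X)
I_XY = HX + HY - HXY     # mutual information I(X;Y)
print(HX, H_Y_given_X, I_XY)
```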