Definition 2.6 (Markov Chain) For random variables X1, X2, ..., Xn, where n ≥ 3, X1 → X2 → ... → Xn forms a Markov chain if

p(x1, x2, ..., xn) = p(x1, x2) p(x3 | x2) ... p(xn | x_{n-1})   if p(x2), p(x3), ..., p(x_{n-1}) > 0,
p(x1, x2, ..., xn) = 0                                          otherwise.
Remark X1 → X2 → X3 if and only if X1 ⊥ X3 | X2.
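As a quick numerical sanity check of Definition 2.6 (for n = 3) and the remark, the following sketch builds a binary chain X1 → X2 → X3 from arbitrary made-up distributions (the numbers below are illustrative, not from the slides) and verifies both the factorization and the conditional independence X1 ⊥ X3 | X2:

```python
import itertools

# Illustrative toy distributions for a binary chain X1 -> X2 -> X3.
p1 = {0: 0.3, 1: 0.7}                                 # p(x1)
p2g1 = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}     # p(x2|x1)
p3g2 = {0: {0: 0.6, 1: 0.4}, 1: {0: 0.5, 1: 0.5}}     # p(x3|x2)

# Joint distribution p(x1,x2,x3) = p(x1) p(x2|x1) p(x3|x2).
p = {(a, b, c): p1[a] * p2g1[a][b] * p3g2[b][c]
     for a, b, c in itertools.product((0, 1), repeat=3)}

def marg(p, keep):
    """Marginalize the joint dict onto the coordinate indices in `keep`."""
    out = {}
    for x, v in p.items():
        k = tuple(x[i] for i in keep)
        out[k] = out.get(k, 0.0) + v
    return out

p12 = marg(p, (0, 1))   # p(x1, x2)
p23 = marg(p, (1, 2))   # p(x2, x3)
p2 = marg(p, (1,))      # p(x2)

# Definition 2.6 with n = 3: p(x1,x2,x3) = p(x1,x2) p(x3|x2) whenever p(x2) > 0.
for (a, b, c), v in p.items():
    assert abs(v - p12[(a, b)] * p23[(b, c)] / p2[(b,)]) < 1e-12

# Remark: the chain is equivalent to X1 ⊥ X3 | X2,
# i.e. p(x1, x3 | x2) = p(x1 | x2) p(x3 | x2).
p13g2_ok = all(
    abs(p[(a, b, c)] / p2[(b,)]
        - (p12[(a, b)] / p2[(b,)]) * (p23[(b, c)] / p2[(b,)])) < 1e-12
    for a, b, c in itertools.product((0, 1), repeat=3)
)
print(p13g2_ok)  # True: the chain structure implies X1 ⊥ X3 | X2
```

For this toy chain both checks pass, illustrating that the n = 3 case of the definition coincides with the conditional-independence condition in the remark.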
Chapter 2
Information Measures
Raymond W. Yeung 2014
The Chinese University of Hong Kong
Thursday, 26 December, 13
In this chapter we introduce the basic tools:
Probability preliminaries
Shannon's information measures
Chapter 1
The Science of Information
Raymond W. Yeung 2014
The Chinese University of Hong Kong
Information Theory
Founded by Claude E. Shannon (1916-2001)
"A Mathematical Theory of Communication," 1948
Study fundamental limits in communications
Proposition 2.5 For random variables X, Y, and Z, X ⊥ Z | Y if and only if

p(x, y, z) = a(x, y) b(y, z)

for all x, y, and z such that p(y) > 0.
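The factorization condition in Proposition 2.5 can be exercised numerically: take arbitrary nonnegative factors a(x, y) and b(y, z) (the values below are made up for illustration), normalize their product into a joint distribution, and confirm that X ⊥ Z | Y holds. A minimal sketch:

```python
import itertools

# Illustrative nonnegative factors a(x,y) and b(y,z); the values are arbitrary.
a = {(0, 0): 0.2, (0, 1): 0.5, (1, 0): 0.8, (1, 1): 0.1}
b = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.4, (1, 1): 0.9}

# Joint p(x,y,z) proportional to a(x,y) b(y,z), normalized to sum to 1.
raw = {(x, y, z): a[(x, y)] * b[(y, z)]
       for x, y, z in itertools.product((0, 1), repeat=3)}
total = sum(raw.values())
p = {k: v / total for k, v in raw.items()}

def marg(p, keep):
    """Marginalize the joint dict onto the coordinate indices in `keep`."""
    out = {}
    for x, v in p.items():
        k = tuple(x[i] for i in keep)
        out[k] = out.get(k, 0.0) + v
    return out

pxy, pyz, py = marg(p, (0, 1)), marg(p, (1, 2)), marg(p, (1,))

# X ⊥ Z | Y is equivalent to p(x,y,z) p(y) = p(x,y) p(y,z) for all x, y, z.
ci = all(abs(p[(x, y, z)] * py[(y,)] - pxy[(x, y)] * pyz[(y, z)]) < 1e-12
         for x, y, z in itertools.product((0, 1), repeat=3))
print(ci)  # True: the product form forces conditional independence
```

Whatever nonnegative factors are chosen, the product form forces p(x, y, z) p(y) = p(x, y) p(y, z), which is exactly the conditional independence the proposition asserts.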
Proposition 2.12 Let X1, X2, X3, and X4 be random variables such that p(x1, x2, x3, x4) is strictly positive. Then

X1 ⊥ X4 | (X2, X3) and X1 ⊥ X3 | (X2, X4)  ⟹  X1 ⊥ (X3, X4) | X2.

See textbook for a proof of the proposition. Not true if p is not strictly positive.
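To see why strict positivity matters, a standard style of counterexample (this particular construction is ours, not taken from the slides) sets X1 = X3 = X4 = U for a fair bit U and makes X2 constant, so p is not strictly positive. Both premises of Proposition 2.12 then hold while the conclusion fails:

```python
import itertools

# Counterexample with a non-strictly-positive p:
# U a fair bit, X1 = X3 = X4 = U, and X2 identically 0.
domain = list(itertools.product((0, 1), repeat=4))
p = {x: 0.0 for x in domain}      # keep zero entries so marginals cover all keys
p[(0, 0, 0, 0)] = 0.5
p[(1, 0, 1, 1)] = 0.5

def marg(p, keep):
    """Marginalize the joint dict onto the coordinate indices in `keep`."""
    out = {}
    for x, v in p.items():
        k = tuple(x[i] for i in keep)
        out[k] = out.get(k, 0.0) + v
    return out

def cond_indep(p, A, B, C):
    """Check X_A ⊥ X_B | X_C, assuming A, B, C together cover every
    coordinate: p(a,b,c) p(c) = p(a,c) p(b,c) for every outcome."""
    pac, pbc, pc = marg(p, A + C), marg(p, B + C), marg(p, C)
    return all(
        abs(v * pc[tuple(x[i] for i in C)]
            - pac[tuple(x[i] for i in A + C)]
            * pbc[tuple(x[i] for i in B + C)]) < 1e-12
        for x, v in p.items())

premise1 = cond_indep(p, (0,), (3,), (1, 2))    # X1 ⊥ X4 | (X2, X3)
premise2 = cond_indep(p, (0,), (2,), (1, 3))    # X1 ⊥ X3 | (X2, X4)
conclusion = cond_indep(p, (0,), (2, 3), (1,))  # X1 ⊥ (X3, X4) | X2
print(premise1, premise2, conclusion)  # True True False
```

Given X3 (or X4), X1 is already determined, so each premise holds trivially; but given only the constant X2, X1 and (X3, X4) are perfectly correlated, so the conclusion fails.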
2.3 Continuity of Shannon's Information Measures for Fixed Finite Alphabets
Finite Alphabet vs Countable Alphabet

All Shannon's information measures are continuous when the alphabets are fixed and finite. For countable alphabets, Shannon's information measures are in general no longer continuous.
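A classic style of example illustrates the failure of continuity on a countably infinite alphabet (the specific sequence P_n below is our illustration, not from the slides): P_n puts mass 1 - 1/n on symbol 0 and spreads the remaining mass 1/n uniformly over 2^(n²) further symbols. Then P_n converges to the point mass at 0 (entropy 0), yet H(P_n) grows without bound:

```python
from math import log2

# P_n: mass 1 - 1/n on symbol 0, mass 1/n spread uniformly over
# M_n = 2**(n*n) further symbols of a countably infinite alphabet.
def entropy(n):
    q = 1.0 - 1.0 / n  # mass at symbol 0
    # Closed form: each of the M_n tail symbols has prob 1/(n*M_n), so the
    # tail contributes (1/n) * log2(n * M_n) = (1/n) * (log2(n) + n*n).
    return -q * log2(q) + (1.0 / n) * (log2(n) + n * n)

def tv_to_point_mass(n):
    # Total variation distance between P_n and the point mass at 0.
    return 1.0 / n

for n in (2, 5, 10, 100):
    print(n, round(tv_to_point_mass(n), 4), round(entropy(n), 2))
```

The distance to the point mass shrinks like 1/n while H(P_n) grows roughly like n, so entropy cannot be continuous on a countable alphabet, in contrast with the fixed finite-alphabet case above.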