Module 2, Lecture 4
Fundamental Concepts: Information Inequalities II
G.L. Heileman
Stochastic Processes

A number of the inequalities we consider next are associated with Markov chains, a special type of stochastic process.

Definition (Discrete stochastic process). A discrete stochastic (or random) process is an indexed collection of discrete RVs $X_1, \dots, X_n = \{X_i\}$, in which there can be an arbitrary dependence among the $X_i$'s. Such a process is completely described by the joint pdf
$$\Pr\{X_1 = x_1, \dots, X_n = x_n\} = p(x_1, \dots, x_n) = p(x_1)\, p(x_2 \mid x_1)\, p(x_3 \mid x_2, x_1) \cdots p(x_n \mid x_{n-1}, \dots, x_1).$$

We have already considered the special case when the $X_i$'s are iid. The independence part of iid implies
$$p(x_1, \dots, x_n) = p(x_1)\, p(x_2) \cdots p(x_n),$$
and such a process is also referred to as a zero-memory source.
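To make the chain-rule factorization concrete, here is a minimal simulation sketch (an addition to these notes, not from the lecture): it samples a binary process one conditional at a time using a made-up dependence on the whole past, and evaluates the joint pmf of the iid (zero-memory) special case as a product of identical marginals.

```python
# Minimal sketch illustrating the chain-rule factorization
# p(x_1,...,x_n) = p(x_1) p(x_2|x_1) ... p(x_n|x_{n-1},...,x_1)
# for a binary process; the conditional used here is a toy, made-up dependence.
import numpy as np

rng = np.random.default_rng(0)

def p_next_given_past(past):
    """Toy conditional Pr{X_k = 1 | past}: depends on the fraction of 1s so far."""
    if len(past) == 0:
        return 0.5                      # marginal p(x_1)
    return 0.25 + 0.5 * np.mean(past)   # arbitrary dependence on the whole past

def sample_process(n):
    """Draw (x_1, ..., x_n) sequentially, one conditional at a time."""
    x = []
    for _ in range(n):
        x.append(int(rng.random() < p_next_given_past(x)))
    return x

def joint_pmf_iid(x, p1=0.5):
    """iid special case: the joint pmf is a product of identical marginals."""
    probs = [p1 if xi == 1 else 1 - p1 for xi in x]
    return float(np.prod(probs))

print(sample_process(10))
print(joint_pmf_iid([1, 0, 1]))   # 0.125 for a fair binary zero-memory source
```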
Stochastic Processes

Definition (Stationary stochastic process). A stochastic process is stationary if the joint distribution of any subset of the sequence of RVs is invariant with respect to shifts in the index, i.e.,
$$\Pr\{X_1 = x_1, \dots, X_n = x_n\} = \Pr\{X_{1+l} = x_1, \dots, X_{n+l} = x_n\}$$
for any shift $l$ and for any $x_1, \dots, x_n$. I.e., the pdf does not change over time.

An iid stochastic process is stationary.
Proof:
$$\Pr\{X_1 = x_1, \dots, X_n = x_n\} = \Pr\{X_1 = x_1\} \cdots \Pr\{X_n = x_n\} \quad \text{(independence)}$$
$$= \Pr\{X_{1+l} = x_1\} \cdots \Pr\{X_{n+l} = x_n\} \quad \text{(identically distributed)}$$
$$= \Pr\{X_{1+l} = x_1, \dots, X_{n+l} = x_n\} \quad \text{(independence)}$$
for any shift $l$.
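As a rough numerical illustration of this result (again an addition, not part of the notes), the following sketch draws many iid sample paths and compares the empirical joint distribution of $(X_1, X_2)$ with that of $(X_{1+l}, X_{2+l})$; the alphabet, shift, and sample sizes are arbitrary choices.

```python
# Rough check that an iid process is stationary: the empirical distribution of
# (X_1, X_2) should match that of (X_{1+l}, X_{2+l}) for any shift l.
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps, l = 100_000, 10, 3
paths = rng.integers(0, 2, size=(n_paths, n_steps))   # iid Bernoulli(1/2) paths

def empirical_pair_dist(a, b):
    """Empirical joint distribution of two coordinates over the alphabet {0, 1}."""
    dist = np.zeros((2, 2))
    for x, y in zip(a, b):
        dist[x, y] += 1
    return dist / len(a)

print(empirical_pair_dist(paths[:, 0], paths[:, 1]))
print(empirical_pair_dist(paths[:, l], paths[:, 1 + l]))  # approximately the same matrix
```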
Markov Chains

Definition ($m$-th order Markov chain). A discrete stochastic process $X_1, X_2, \dots$ is said to be an $m$-th order Markov chain (or process) if for $n = 1, 2, \dots$
$$p(x_{n+1} \mid x_n, \dots, x_1) = p(x_{n+1} \mid x_n, \dots, x_{n-m+1}).$$
I.e., the probability of the next "output" depends only on the previous $m$ outputs. Thus, the joint pdf of a first-order Markov chain can be written as
$$p(x_1, \dots, x_n) = p(x_1)\, p(x_2 \mid x_1)\, p(x_3 \mid x_2) \cdots p(x_n \mid x_{n-1}).$$
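A first-order chain can be simulated directly from this factorization. The sketch below is illustrative only; the two-state transition matrix $P$ and initial distribution are made-up examples, with $P[i, j] = \Pr\{X_{n+1} = j \mid X_n = i\}$.

```python
# Minimal sketch of a first-order (time-invariant) Markov chain simulator;
# the transition matrix P and initial distribution p0 are made-up examples.
import numpy as np

rng = np.random.default_rng(2)

P = np.array([[0.9, 0.1],     # P[i, j] = Pr{X_{n+1} = j | X_n = i}
              [0.4, 0.6]])
p0 = np.array([0.5, 0.5])     # distribution of X_1

def simulate_chain(n):
    """Sample (x_1, ..., x_n) using p(x_1) p(x_2|x_1) ... p(x_n|x_{n-1})."""
    x = [rng.choice(2, p=p0)]
    for _ in range(n - 1):
        x.append(rng.choice(2, p=P[x[-1]]))   # next output depends only on the previous one
    return x

def joint_pmf(x):
    """Joint pmf of a realization under the first-order factorization."""
    prob = p0[x[0]]
    for prev, cur in zip(x[:-1], x[1:]):
        prob *= P[prev, cur]
    return prob

path = simulate_chain(8)
print(path, joint_pmf(path))
```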
Definition (Time-invariant Markov chain). An $m$-th order Markov chain is said to be time invariant if the conditional probability $p(x_{n+1} \mid x_n, \dots, x_{n-m+1})$ is independent of $n$. I.e., the next output still depends on the previous $m$ outputs, but not on the time at which they occurred. Thus, for $n = m, m+1, \dots$,
$$\Pr\{X_{n+1} = \alpha_{m+1} \mid X_n = \alpha_m, \dots, X_{n-m+1} = \alpha_1\} = \Pr\{X_{m+1} = \alpha_{m+1} \mid X_m = \alpha_m, \dots, X_1 = \alpha_1\}$$
for any $\alpha_1, \dots, \alpha_{m+1}$.

In the text (and in these notes), the term Markov chain will generally mean a first-order, time-invariant Markov chain.
G.L. Heileman
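Time invariance can also be checked empirically (again an illustrative addition rather than material from the text): estimate $\Pr\{X_{n+1} = 1 \mid X_n = 0\}$ at two different times from many simulated paths of the made-up chain above; for a time-invariant chain the two estimates should agree.

```python
# Rough empirical check of time invariance for a first-order chain: estimate
# Pr{X_{n+1} = 1 | X_n = 0} at two different times n from many simulated paths.
import numpy as np

rng = np.random.default_rng(3)
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])               # made-up transition matrix, as above
n_paths, n_steps = 50_000, 12

paths = np.zeros((n_paths, n_steps), dtype=int)
paths[:, 0] = rng.integers(0, 2, size=n_paths)
for t in range(n_steps - 1):
    u = rng.random(n_paths)
    # move to state 1 with probability P[current state, 1]
    paths[:, t + 1] = (u < P[paths[:, t], 1]).astype(int)

def est_transition(t, i=0, j=1):
    """Empirical Pr{X_{t+1} = j | X_t = i} (0-based time index)."""
    at_i = paths[:, t] == i
    return np.mean(paths[at_i, t + 1] == j)

print(est_transition(2), est_transition(9))   # both close to P[0, 1] = 0.1
```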