ECE320 Solution Notes 9
Spring 2006, Cornell University, T. L. Fine

1. Recall the Bernoulli process of Section 3.6, in which the outcome space is X = {0, 1} and the probability of a sequence of binary-valued random variables is given by

P(X_1 = x_1, \ldots, X_n = x_n) = p^{\sum_{i=1}^{n} x_i} (1 - p)^{n - \sum_{i=1}^{n} x_i},

for some 0 < p < 1.

(a) Show that the Bernoulli process is also a Markov chain by evaluating the conditional probability P(X_n = x_n | X_1 = x_1, \ldots, X_{n-1} = x_{n-1}) and showing that it satisfies the Markov condition in that it does not depend upon x_1, \ldots, x_{n-2}. (Unusually, it will also turn out not to depend upon x_{n-1}.)

P(X_n = x_n | X_1 = x_1, \ldots, X_{n-1} = x_{n-1})
  = \frac{P(X_1 = x_1, \ldots, X_n = x_n)}{P(X_1 = x_1, \ldots, X_{n-1} = x_{n-1})}
  = \frac{p^{x_n + \sum_{i=1}^{n-1} x_i} (1 - p)^{n - x_n - \sum_{i=1}^{n-1} x_i}}{p^{\sum_{i=1}^{n-1} x_i} (1 - p)^{n - 1 - \sum_{i=1}^{n-1} x_i}}
  = p^{x_n} (1 - p)^{1 - x_n}.

The condition for being a Markov chain is satisfied, as the conditional probability does not depend upon x_1, \ldots, x_{n-2}. This Markov chain has only the two states X = {0, 1}.

(b) Identify the initial distribution \pi(1) for this Markov chain.

\pi(1) = [P(X_1 = 0), P(X_1 = 1)] = [1 - p, p],

which sums to one, as it should.

(c) Identify the one-step transition matrix P and see that you have a special case of all rows being identical. ...
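The algebra above can be checked numerically: for every history (x_1, \ldots, x_{n-1}), the ratio of joint probabilities should collapse to p^{x_n}(1-p)^{1-x_n}, and the resulting transition matrix should have identical rows [1-p, p]. A minimal sketch (the values p = 0.3 and n = 4 are illustrative choices, not from the notes):

```python
import itertools

p = 0.3  # illustrative parameter, any 0 < p < 1 works
n = 4    # illustrative sequence length

def seq_prob(xs):
    """Joint probability P(X_1 = x_1, ..., X_k = x_k) for a Bernoulli(p) process."""
    s = sum(xs)
    return p**s * (1 - p)**(len(xs) - s)

# Markov condition: the conditional probability
# P(X_n = x_n | X_1 = x_1, ..., X_{n-1} = x_{n-1})
# equals p^{x_n} (1 - p)^{1 - x_n} for every possible history.
for hist in itertools.product([0, 1], repeat=n - 1):
    for xn in (0, 1):
        cond = seq_prob(hist + (xn,)) / seq_prob(hist)
        assert abs(cond - p**xn * (1 - p)**(1 - xn)) < 1e-12

# (b) Initial distribution pi(1) = [P(X_1 = 0), P(X_1 = 1)] = [1 - p, p].
pi1 = [1 - p, p]

# (c) One-step transition matrix: both rows are identical, [1 - p, p],
# because X_n does not depend on the previous state X_{n-1}.
P = [[1 - p, p],
     [1 - p, p]]

print(pi1)
print(P[0] == P[1])
```

Since the conditional probability is free of x_{n-1}, each row of P is the same distribution [1 - p, p], which is the special case the problem asks you to notice.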
This homework help was uploaded on 09/25/2007 for the course ECE 3200 taught by Professor Fine during the Spring '06 term at Cornell University (Engineering School).