ECE320 Homework 9, Spring 2006, Cornell University, T. L. Fine

Please hand in this assignment at the end of lecture on Tuesday, 11 April. Use only your assigned three-digit code and not your name. Throughout, give reasons for your answers.

1. Recall the Bernoulli process of Section 3.6, in which the outcome space is X = \{0, 1\}^* and the probability of a sequence of binary-valued random variables is given by

P(X_1 = x_1, \ldots, X_n = x_n) = p^{\sum_{i=1}^n x_i} (1 - p)^{n - \sum_{i=1}^n x_i}

for some 0 < p < 1.

(a) Show that the Bernoulli process is also a Markov chain by evaluating the conditional probability P(X_n = x_n \mid X_1 = x_1, \ldots, X_{n-1} = x_{n-1}) and showing that it satisfies the Markov condition in that it does not depend upon x_1, \ldots, x_{n-2}. (Unusually, it will also turn out not to depend upon x_{n-1}.)

(b) Identify the initial distribution \pi(1) for this Markov chain.

(c) Identify the one-step transition matrix P and observe that you have a special case in which all rows are identical.
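As a numerical sanity check on the Markov-chain claim (this is a sketch to build intuition, not a substitute for the derivations the assignment asks for), one can simulate a Bernoulli(p) sequence and estimate the conditional frequency of X_n = 1 given each value of X_{n-1}; if the process is Markov with identical transition-matrix rows, both estimates should be close to p. The value p = 0.3 and the simulation length are arbitrary choices for illustration.

```python
import random

# Simulate a Bernoulli(p) process and tabulate one-step transitions
# x_{n-1} -> x_n to check, empirically, that the conditional frequency of
# X_n = 1 is the same regardless of the previous outcome -- consistent
# with a transition matrix whose rows are both (1 - p, p).
random.seed(0)          # fixed seed for reproducibility (arbitrary)
p = 0.3                 # illustrative choice of 0 < p < 1
n_steps = 200_000

xs = [1 if random.random() < p else 0 for _ in range(n_steps)]

# Count each transition (previous value, current value).
counts = {(a, b): 0 for a in (0, 1) for b in (0, 1)}
for prev, cur in zip(xs, xs[1:]):
    counts[(prev, cur)] += 1

# Estimated P(X_n = 1 | X_{n-1} = prev) for prev = 0 and prev = 1;
# both should be close to p.
for prev in (0, 1):
    total = counts[(prev, 0)] + counts[(prev, 1)]
    print(prev, counts[(prev, 1)] / total)
```

Both printed estimates should agree to within sampling error, which is the empirical signature of the "all rows identical" structure asked about in part (c).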
This homework help was uploaded on 09/25/2007 for the course ECE 3200 taught by Professor Fine during the Spring '06 term at Cornell.