Let $\mathcal{X}$ be a finite set and $p = \{p(x,y)\}_{x,y\in\mathcal{X}}$ an $|\mathcal{X}|\times|\mathcal{X}|$ matrix of nonnegative numbers such that, for any $x\in\mathcal{X}$:
$$\sum_{y\in\mathcal{X}} p(x,y) = 1. \quad (1)$$
A Markov chain is a probability measure over $(\mathcal{X}^{\mathbb{N}}, \mathcal{F})$, with $\mathcal{X}^{\mathbb{N}} = \{\omega = (\omega_0, \omega_1, \omega_2, \dots) : \omega_i \in \mathcal{X}\}$ and $\mathcal{F}$ the $\sigma$-algebra generated by cylindrical sets, such that
$$\mathbb{P}\big(\{\omega : \omega_0 = x_0,\, \omega_1 = x_1, \dots, \omega_n = x_n\}\big) = q(x_0)\prod_{i=0}^{n-1} p(x_i, x_{i+1}), \quad (2)$$
for any $n \ge 0$, where $q$ is a probability distribution on $\mathcal{X}$ (the initial condition). Check that this indeed defines a probability distribution, using the Kolmogorov extension theorem. Let $X_i(\omega) = \omega_i$ and recall that the tail $\sigma$-algebra is defined by
$$\mathcal{T} \equiv \bigcap_{n} \sigma\big(\{X_i\}_{i \ge n}\big). \quad (3)$$
Prove that, if $p(x,y) > 0$ for any $x, y \in \mathcal{X}$, then $\mathcal{T}$ is trivial. [Hint: it might be a good idea to remind yourself of the statement of the Perron–Frobenius theorem.]
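A minimal numerical sketch of the mechanism behind the hint: for a strictly positive row-stochastic matrix $p$, Perron–Frobenius gives a unique eigenvalue of maximal modulus (equal to $1$), so $p^n$ converges to a rank-one matrix whose identical rows are the stationary distribution $\pi$. The chain therefore forgets its starting state, which is the intuition for why $\mathcal{T}$ is trivial. The $3\times 3$ matrix below is an illustrative example, not from the text.

```python
import numpy as np

# Hypothetical transition matrix with p(x, y) > 0 for all x, y;
# each row sums to 1, as required by condition (1).
p = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])
assert np.allclose(p.sum(axis=1), 1.0)  # row-stochastic, condition (1)

# Perron–Frobenius: p^n converges to a rank-one matrix, i.e. all rows of
# p^n approach the same probability vector pi, regardless of the start.
pn = np.linalg.matrix_power(p, 50)
assert np.allclose(pn, pn[0], atol=1e-10)  # all rows (nearly) coincide

# The common row is the stationary distribution: pi @ p = pi.
pi = pn[0]
assert np.allclose(pi @ p, pi)
print(pi)
```

The asymptotic independence of $X_n$ from $X_0$ exhibited here is exactly what the proof of triviality exploits: any tail event has the same conditional probability given every starting state.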
Probability '11, Montanari, A.