O.H. Probability and Markov Chains – MATH 2561/2571 E09

2 Markov Chains

In various applications one considers collections of random variables which evolve in time in some random but prescribed manner (think, e.g., about consecutive flips of a coin combined with counting the number of heads observed). Such collections are called random (or stochastic) processes. A typical random process X is a family {X_t : t ∈ T} of random variables indexed by elements of some set T. When T = {0, 1, 2, ...} one speaks about a 'discrete-time' process; alternatively, for T = ℝ or T = [0, ∞) one has a 'continuous-time' process. In what follows we shall only consider discrete-time processes.

2.1 Markov property

Let (Ω, F, P) be a probability space and let {X_0, X_1, ...} be a sequence of random variables⁶ which take values in some countable set S, called the state space. We assume that each X_n is a discrete⁷ random variable which takes one of N possible values, where N = |S| (N may equal +∞).

Definition 2.1. The process X is a Markov chain if it satisfies the Markov property:

    P(X_{n+1} = x_{n+1} | X_0 = x_0, X_1 = x_1, ..., X_n = x_n) = P(X_{n+1} = x_{n+1} | X_n = x_n)    (2.1)

for all n ≥ 1 and all x_0, x_1, ..., x_{n+1} ∈ S.

Thinking of n as the 'present' and of n+1 as a 'future' moment of time, the Markov property (2.1) says that "given the present value of a Markov chain, its future behaviour does not depend on the past".

Remark 2.1.1. It is straightforward to check that the Markov property (2.1) is equivalent to the following statement: for each s ∈ S and every sequence {x_k : k ≥ 0} in S,

    P(X_{n+m} = s | X_0 = x_0, X_1 = x_1, ..., X_n = x_n) = P(X_{n+m} = s | X_n = x_n)

for any m, n ≥ 0.

The evolution of a chain is described by its 'initial distribution' μ_k := P(X_0 = k) and its 'transition probabilities' P(X_{n+1} = j | X_n = i); it can be quite complicated in general, since these probabilities depend upon the three quantities n, i, and j.

⁶ i.e., each X_n is an F-measurable mapping from Ω into S.
⁷ Without loss of generality, we can and shall assume that S is a subset of the integers.

Definition 2.2. A Markov chain X is called homogeneous if

    P(X_{n+1} = j | X_n = i) ≡ P(X_1 = j | X_0 = i)

for all n, i, j. The transition matrix P = (p_{ij}) is the |S| × |S| matrix of transition probabilities p_{ij} = P(X_{n+1} = j | X_n = i).

In what follows we shall only consider homogeneous Markov chains. The next claim characterizes transition matrices.

Theorem 2.3. P is a stochastic matrix, which is to say that
a) every entry of P is non-negative, p_{ij} ≥ 0;
b) each row sum of P equals one, i.e., for every i ∈ S we have Σ_j p_{ij} = 1.
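To make Definitions 2.1–2.2 and Theorem 2.3 concrete, the following is a minimal Python sketch (not part of the original notes; the two-state matrix, the function names is_stochastic and simulate_chain, and the use of NumPy are all illustrative assumptions). It checks conditions a) and b) of Theorem 2.3 and simulates a homogeneous chain whose every step depends on the current state only, as the Markov property (2.1) requires.

import numpy as np

def is_stochastic(P, tol=1e-12):
    """Check the two conditions of Theorem 2.3:
    a) every entry p_ij is non-negative;
    b) every row of P sums to one."""
    P = np.asarray(P, dtype=float)
    return bool((P >= -tol).all() and np.allclose(P.sum(axis=1), 1.0))

def simulate_chain(mu, P, n_steps, rng=None):
    """Simulate X_0, X_1, ..., X_{n_steps} of a homogeneous Markov chain
    on the state space S = {0, 1, ..., |S|-1}: X_0 is drawn from the
    initial distribution mu, and given X_n = i the next state j is drawn
    with probability p_ij, using the same matrix P at every step
    (homogeneity, Definition 2.2)."""
    if rng is None:
        rng = np.random.default_rng()
    P = np.asarray(P, dtype=float)
    assert is_stochastic(P), "P must be a stochastic matrix (Theorem 2.3)"
    states = np.arange(P.shape[0])
    x = rng.choice(states, p=mu)        # X_0 ~ mu
    path = [int(x)]
    for _ in range(n_steps):
        x = rng.choice(states, p=P[x])  # X_{n+1} depends on X_n only
        path.append(int(x))
    return path

# Illustrative two-state example (this matrix is an assumption, not from the notes):
P = [[0.9, 0.1],
     [0.4, 0.6]]
mu = [1.0, 0.0]                         # start in state 0 with probability 1
print(is_stochastic(P))                 # True
print(simulate_chain(mu, P, n_steps=10))

The point of the sketch is homogeneity: the same matrix P drives every transition, so P(X_{n+1} = j | X_n = i) does not depend on n, and the sampled path uses no information about states before the current one.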