
# Lecture 13 - Markov Chains


Markov Chains 11/15/2005

Most of our study of probability has dealt with independent trials processes. A Markov chain is a process in which the outcome of a given experiment can affect the outcome of the next experiment.
## Markov Chains

We have a set of *states*, S = {s_1, s_2, ..., s_r}. The process starts in one of these states and moves successively from one state to another. Each move is called a *step*. If the chain is currently in state s_i, then it moves to state s_j at the next step with a probability denoted by p_ij, and this probability does not depend upon which states the chain was in before the current state.
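The stepping rule above can be sketched in a few lines of Python. The three states and their transition probabilities below are illustrative assumptions, not values from the lecture; the only property the sketch relies on is that each row of probabilities sums to 1.

```python
import random

# Hypothetical transition probabilities p_ij for states s1, s2, s3
# (illustrative values, not from the lecture).  Row i lists the
# probability of moving from s_i to each s_j; each row sums to 1.
P = {
    "s1": {"s1": 0.50, "s2": 0.25, "s3": 0.25},
    "s2": {"s1": 0.50, "s2": 0.00, "s3": 0.50},
    "s3": {"s1": 0.25, "s2": 0.25, "s3": 0.50},
}

def step(state):
    """Take one step: choose the next state s_j with probability p_ij.

    Note that the choice depends only on the current state, not on any
    earlier history -- this is exactly the Markov property.
    """
    targets = list(P[state])
    weights = [P[state][t] for t in targets]
    return random.choices(targets, weights=weights)[0]

# Run the chain for a few steps starting from s1.
state = "s1"
for _ in range(5):
    state = step(state)
```

Because `step` looks only at `P[state]`, the simulation forgets the past automatically, which is the defining feature of a Markov chain as opposed to a general stochastic process.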

The probabilities p_ij are called *transition probabilities*. The process can remain in the state it is in, and this occurs with probability p_ii. An *initial probability distribution*, defined on S, specifies the starting state.
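A minimal sketch of how an initial distribution fits in, again with assumed numbers: the chain first draws its starting state from the initial distribution (called `mu` here, a name chosen for illustration), and the diagonal entries p_ii let it stay put on a given step.

```python
import random

# Illustrative two-state transition probabilities (assumed, not from
# the lecture).  p_ii is the probability of remaining in state s_i.
P = {
    "s1": {"s1": 0.6, "s2": 0.4},
    "s2": {"s1": 0.3, "s2": 0.7},
}

# Initial probability distribution on S: the chain starts in s_i with
# probability mu[s_i].  These values are also assumptions.
mu = {"s1": 0.5, "s2": 0.5}

def run_chain(n_steps):
    """Draw the starting state from mu, then take n_steps transitions."""
    states = list(mu)
    state = random.choices(states, weights=[mu[s] for s in states])[0]
    path = [state]
    for _ in range(n_steps):
        targets = list(P[state])
        state = random.choices(targets, weights=[P[state][t] for t in targets])[0]
        path.append(state)
    return path

path = run_chain(10)  # a list of 11 states: the start plus 10 steps
```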

*This note was uploaded on 07/16/2010 for the course MATH 20 taught by Professor Ionescu during the Fall '05 term at Dartmouth.*

