Markov Chain, part 2

December 12, 2010

1 The gambler's ruin problem

Consider the following problem.

Problem. Suppose that a gambler starts playing a game with an initial bankroll of $B. The game proceeds in turns, where at the end of each turn the gambler either wins $1 with probability $p$, or loses $1 with probability $q = 1 - p$. The player continues until he or she either makes it to $N, or goes bankrupt at $0. Determine the probability that the player eventually reaches $N.

We can represent this by a Markov chain having $N + 1$ states representing the amount of money that the player has: either $0, $1, ..., or $N. The transition probabilities are given as follows: $P_{0,0} = 1$; $P_{N,N} = 1$; and $P_{i,i+1} = p$ and $P_{i,i-1} = q$ for $i = 1, 2, \ldots, N - 1$. The corresponding transition matrix is

\[
P = \begin{pmatrix}
1 & 0 & 0 & 0 & \cdots & 0 & 0 & 0 \\
q & 0 & p & 0 & \cdots & 0 & 0 & 0 \\
0 & q & 0 & p & \cdots & 0 & 0 & 0 \\
\vdots & \vdots & \vdots & \vdots & \ddots & \vdots & \vdots & \vdots \\
0 & 0 & 0 & 0 & \cdots & q & 0 & p \\
0 & 0 & 0 & 0 & \cdots & 0 & 0 & 1
\end{pmatrix}.
\]
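To make the chain concrete, here is a minimal NumPy sketch (not from the notes; N, p, and the starting bankroll B are illustrative values) that builds the transition matrix above and estimates the probability of reaching $N from $B by repeatedly applying P until the probability mass settles on the two absorbing states:

    import numpy as np

    N = 10        # target fortune $N (illustrative value)
    p = 0.45      # probability of winning $1 on a turn (illustrative value)
    q = 1 - p     # probability of losing $1 on a turn
    B = 5         # starting bankroll $B (illustrative value)

    # Build the (N+1) x (N+1) transition matrix described above.
    P = np.zeros((N + 1, N + 1))
    P[0, 0] = 1.0          # $0 is absorbing (ruin)
    P[N, N] = 1.0          # $N is absorbing (goal reached)
    for i in range(1, N):
        P[i, i + 1] = p    # win $1
        P[i, i - 1] = q    # lose $1
    assert np.allclose(P.sum(axis=1), 1.0)  # each row is a probability distribution

    # Start with all mass on state B and iterate the chain; the mass
    # eventually concentrates on the absorbing states 0 and N.
    dist = np.zeros(N + 1)
    dist[B] = 1.0
    for _ in range(10_000):
        dist = dist @ P
    print(f"estimated P(reach ${N} from ${B}): {dist[N]:.4f}")

    # Sanity check against the classical closed form for p != q,
    # a standard result for the gambler's ruin problem:
    r = q / p
    print(f"closed form: {(1 - r**B) / (1 - r**N):.4f}")

Iterating the distribution vector rather than simulating individual games converges to the exact absorption probabilities for this small finite chain, and is far cheaper than averaging over many sample paths.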