stat2303_tutorial08

Trials are performed in sequence. Define a Markov chain as follows, where the state records the outcomes of the (n-1)th and nth trials:

    State   Outcomes of the (n-1)th and nth trials
    0       (Success, Success)
    1       (Failure, Success)
    2       (Failure, Failure)
    3       (Success, Failure)

If the last two trials were successes, the probability of success on the next trial is 0.8; otherwise, the probability of success on the next trial is 0.5. In the long run, what is the proportion of trials that are successes?

The transition matrix (rows = current state, columns = next state) is

            0     1     2     3
    0     0.8     0     0   0.2
    1     0.5     0     0   0.5
    2       0   0.5   0.5     0
    3       0   0.5   0.5     0

Note that states 0 and 2 have period 1, since p(0,0) = 0.8 > 0 and p(2,2) = 0.5 > 0. The period for ...
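As a check on the long-run question, here is a minimal sketch (not part of the original tutorial) that finds the stationary distribution of this chain by power iteration: repeatedly multiplying a distribution by the transition matrix until it converges. State numbering follows the table above, so the current trial is a success exactly in states 0 and 1.

```python
# Transition matrix of the chain; state i = outcomes of the last two trials:
# 0 = (S, S), 1 = (F, S), 2 = (F, F), 3 = (S, F).
P = [
    [0.8, 0.0, 0.0, 0.2],  # from (S,S): success w.p. 0.8 -> (S,S), else -> (S,F)
    [0.5, 0.0, 0.0, 0.5],  # from (F,S): success w.p. 0.5 -> (S,S), else -> (S,F)
    [0.0, 0.5, 0.5, 0.0],  # from (F,F): success w.p. 0.5 -> (F,S), else -> (F,F)
    [0.0, 0.5, 0.5, 0.0],  # from (S,F): success w.p. 0.5 -> (F,S), else -> (F,F)
]

def stationary(P, iters=200):
    """Power iteration: pi_{k+1} = pi_k P, starting from the uniform distribution.

    The chain is irreducible and aperiodic (self-loops at states 0 and 2),
    so the iteration converges to the unique stationary distribution pi.
    """
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

pi = stationary(P)
# The current trial is a success in states 0 and 1.
success = pi[0] + pi[1]
print([round(x, 4) for x in pi])  # [0.4545, 0.1818, 0.1818, 0.1818] = [5/11, 2/11, 2/11, 2/11]
print(round(success, 4))          # 0.6364, i.e. 7/11
```

Solving the balance equations pi = pi P by hand gives the same answer: pi = (5/11, 2/11, 2/11, 2/11), so the long-run proportion of successful trials is pi0 + pi1 = 7/11.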

This note was uploaded on 03/15/2014 for the course STAT 2303 taught by Professor Steven during the Fall '11 term at HKU.
