THE UNIVERSITY OF NEW SOUTH WALES
MONTH OF EXAMINATION: NOVEMBER 2008
Final Examination
ACTL2003 STOCHASTIC MODELS FOR ACTUARIAL APPLICATIONS
Sample Solutions and Marking Guide

SECTION I [50 MARKS]
START A NEW EXAMINATION BOOK. ANSWER ALL QUESTIONS. START EACH QUESTION ON A NEW PAGE.

Question 1 (10 marks)

a) Consider a Markov chain $\{X_n, n = 0, 1, \dots\}$ with state space $\{1, 2, \dots, 5\}$. Suppose its probability transition matrix is given as below:

$$P = \begin{pmatrix} 1/5 & 0 & 0 & 0 & 4/5 \\ 1/3 & 1/3 & 0 & 1/3 & 0 \\ 0 & 0 & 1/2 & 0 & 1/2 \\ 1/4 & 1/2 & 0 & 1/4 & 0 \\ 1/2 & 0 & 1/2 & 0 & 0 \end{pmatrix}$$

i) Classify the states, and determine whether they are recurrent or transient. [1 mark]

The states accessible from each state are:

1 → 1, 5, 3
2 → 1, 2, 4, 3, 5
3 → 3, 5, 1
4 → 1, 2, 4, 5, 3
5 → 1, 3, 5

Hence $\{1, 3, 5\}$ is recurrent and $\{2, 4\}$ is transient.

Marking guide: 0.5 for state classification and 0.5 for correct identification of recurrence or transience. Also take "$\{2, 4\}$ recurrent; $\{1, 3\}$ transient" as correct.

ii) Find the expected number of visits in each transient state starting in any transient state. [2 marks]

Rearrange the states and let the previous states 2 and 4 be denoted by 1 and 2, respectively. Then

$$P_T = \begin{pmatrix} 1/3 & 1/3 \\ 1/2 & 1/4 \end{pmatrix}$$

$$S = (I - P_T)^{-1} = \begin{pmatrix} 2/3 & -1/3 \\ -1/2 & 3/4 \end{pmatrix}^{-1} = \frac{1}{1/3} \begin{pmatrix} 3/4 & 1/3 \\ 1/2 & 2/3 \end{pmatrix} = \begin{pmatrix} 9/4 & 1 \\ 3/2 & 2 \end{pmatrix}$$

Hence, the mean numbers of visits are $s_{11} = 9/4$, $s_{12} = 1$, $s_{21} = 3/2$, and $s_{22} = 2$.

Marking guide: 0.5 for correct specification of $P_T$, 1 for the correct formula for $S$, and 0.5 for correct calculation.

iii) Given that $X_0$ is equally likely to be in any state, calculate the probability that two steps later the process is in state 3. [2 marks]

$$\Pr(X_2 = 3) = \sum_{i=1}^{5} \Pr(X_2 = 3 \mid X_0 = i) \Pr(X_0 = i) = \sum_{i=1}^{5} \frac{1}{5}\, p^{(2)}_{i3} = \left(\tfrac{1}{5}, \tfrac{1}{5}, \tfrac{1}{5}, \tfrac{1}{5}, \tfrac{1}{5}\right) P \begin{pmatrix} 0 \\ 0 \\ 1/2 \\ 0 \\ 1/2 \end{pmatrix} = \frac{1}{5}\left(\frac{2}{5} + 0 + \frac{1}{2} + 0 + \frac{1}{4}\right) = \frac{23}{100}$$

Marking guide: 1 for expressing the required probability in terms of conditional probabilities, 0.5 for understanding the conditional probabilities as transition probabilities, and 0.5 for calculation.
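Parts (ii) and (iii) of question 1a) lend themselves to a quick numerical cross-check. The sketch below, assuming NumPy and the 5×5 transition matrix from part (a) (transcribed here, so treat the entries as an assumption to be checked against the printed exam paper), recomputes the fundamental matrix $S$ and the two-step probability:

```python
import numpy as np

# Transition matrix from part (a); entries transcribed here as an
# assumption -- verify against the printed exam paper.
P = np.array([
    [1/5, 0,   0,   0,   4/5],
    [1/3, 1/3, 0,   1/3, 0  ],
    [0,   0,   1/2, 0,   1/2],
    [1/4, 1/2, 0,   1/4, 0  ],
    [1/2, 0,   1/2, 0,   0  ],
])

# Part (ii): fundamental matrix S = (I - P_T)^{-1} on the transient
# states {2, 4} (0-based indices 1 and 3).
PT = P[np.ix_([1, 3], [1, 3])]
S = np.linalg.inv(np.eye(2) - PT)   # expect [[9/4, 1], [3/2, 2]]

# Part (iii): Pr(X_2 = 3) with X_0 uniform on the five states.
pi0 = np.full(5, 1/5)
prob = (pi0 @ P @ P)[2]             # expect 23/100
```

This is only a sanity check, not part of the official solutions; in the exam the inverse of the 2×2 matrix should of course be computed by hand.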
(If a student realizes the necessity of calculating $P^{(2)}$, he/she will get 0.5 for this.)

b) Consider the matrix

$$\begin{pmatrix} x & 1 - y^2 \\ 1 - x & y \end{pmatrix}$$

i) Find all the possible values of $x$ and $y$ such that the above matrix is a probability transition matrix. [2 marks]

We require $0 \le x \le 1$, $0 \le y \le 1$, $0 \le 1 - x \le 1$, $0 \le 1 - y^2 \le 1$, $x + (1 - y^2) = 1$ and $(1 - x) + y = 1$. The last two equalities give $x = y^2$ and $y = x$, so $x = x^2$; hence $x = y = 0$ or $x = y = 1$.

Marking guide: 1.5 for listing all the conditions or the last two equalities, and 0.5 for the final result.

ii) Show in detail when the above matrix will be a probability transition matrix of a Markov chain whose limiting probabilities exist, and when it will not. [3 marks]

When $x = y = 0$, the matrix becomes

$$P = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$$

Noticing that for $n = 1, 2, \dots$,

$$P^{(2n)} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \quad \text{and} \quad P^{(2n+1)} = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix},$$

the limiting probabilities obviously do not exist. ...
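The periodicity argument in b)(ii) can likewise be verified numerically; a minimal sketch, assuming NumPy:

```python
import numpy as np

# x = y = 0: the chain flips deterministically between the two states.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Even powers of P are the identity and odd powers are P itself, so the
# sequence P^n oscillates and lim P^n does not exist.
P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)
```

The chain has period 2, which is exactly why the powers alternate between two fixed matrices instead of converging.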
This note was uploaded on 06/12/2011 for the course ASB 2003 taught by Professor Kim during the Three '11 term at University of New South Wales.