
# Probability and Random Variables: Solutions


Problem Solutions – Chapter 12

Problem 12.1.1 Solution

From the given Markov chain, the state transition matrix is

$$
\mathbf{P} = \begin{bmatrix} P_{00} & P_{01} & P_{02} \\ P_{10} & P_{11} & P_{12} \\ P_{20} & P_{21} & P_{22} \end{bmatrix}
= \begin{bmatrix} 0.5 & 0.5 & 0 \\ 0.5 & 0.5 & 0 \\ 0.25 & 0.25 & 0.5 \end{bmatrix} \tag{1}
$$

Problem 12.1.2 Solution

This problem is straightforward if we keep in mind that $P_{ij}$ is the probability of a transition from state $i$ to state $j$. From Example 12.1, the state transition matrix is

$$
\mathbf{P} = \begin{bmatrix} P_{00} & P_{01} \\ P_{10} & P_{11} \end{bmatrix}
= \begin{bmatrix} 1-p & p \\ q & 1-q \end{bmatrix} \tag{1}
$$

Problem 12.1.3 Solution

Under construction.

Problem 12.1.4 Solution

Under construction.

Problem 12.1.5 Solution

In this problem, it is helpful to go fact by fact to identify the information given.

". . . each read or write operation reads or writes an entire file and that files contain a geometric number of sectors with mean 50." This statement says that the length $L$ of a file has PMF

$$
P_L(l) = \begin{cases} (1-p)^{l-1}\,p & l = 1, 2, \ldots \\ 0 & \text{otherwise} \end{cases} \tag{1}
$$

with $p = 1/50 = 0.02$. This says that when we write a sector, we write another sector with probability $49/50 = 0.98$. In terms of our Markov chain, if we are in the write state, we write another sector and stay in the write state with probability $P_{22} = 0.98$. This fact also implies $P_{20} + P_{21} = 0.02$. Also, since files that are read obey the same length distribution,

$$
P_{11} = 0.98, \qquad P_{10} + P_{12} = 0.02 \tag{2}
$$
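As a quick numerical sanity check (not part of the original solution), the geometric PMF above with $p = 0.02$ can be verified to have mean $1/p = 50$:

```python
# Numerically check that the geometric PMF P_L(l) = (1-p)^(l-1) * p
# with p = 0.02 has mean 1/p = 50, as stated in Problem 12.1.5.
p = 0.02
# Truncate the infinite sum at l = 10000; the tail is negligible
# because 0.98 ** 9999 is astronomically small.
mean_L = sum(l * (1 - p) ** (l - 1) * p for l in range(1, 10001))
print(round(mean_L, 6))  # 50.0
```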

"Further, suppose idle periods last for a geometric time with mean 500." This statement simply says that given the system is idle, it remains idle for another unit of time with probability $P_{00} = 499/500 = 0.998$. This also says that $P_{01} + P_{02} = 0.002$.

"After an idle period, the system is equally likely to read or write a file." Given that $X_n = 0$ at time $n$, this statement says that the conditional probability satisfies

$$
P[X_{n+1} = 1 \mid X_n = 0, X_{n+1} \neq 0] = \frac{P_{01}}{P_{01} + P_{02}} = 0.5 \tag{3}
$$

Combined with the earlier fact that $P_{01} + P_{02} = 0.002$, we learn that

$$
P_{01} = P_{02} = 0.001 \tag{4}
$$

"Following the completion of a read, a write follows with probability 0.8." Here we learn that given $X_n = 1$ at time $n$, the conditional probability satisfies

$$
P[X_{n+1} = 2 \mid X_n = 1, X_{n+1} \neq 1] = \frac{P_{12}}{P_{10} + P_{12}} = 0.8 \tag{5}
$$

Combined with the earlier fact that $P_{10} + P_{12} = 0.02$, we learn that

$$
P_{10} = 0.004, \qquad P_{12} = 0.016 \tag{6}
$$

"However, on completion of a write operation, a read operation follows with probability 0.6." Now we find that given $X_n = 2$ at time $n$, the conditional probability satisfies

$$
P[X_{n+1} = 1 \mid X_n = 2, X_{n+1} \neq 2] = \frac{P_{21}}{P_{20} + P_{21}} = 0.6 \tag{7}
$$

Combined with the earlier fact that $P_{20} + P_{21} = 0.02$, we learn that

$$
P_{20} = 0.008, \qquad P_{21} = 0.012 \tag{8}
$$

Collecting these results, the complete chain (states 0 = idle, 1 = read, 2 = write; the original diagram is reproduced here as its transition matrix) is

$$
\mathbf{P} = \begin{bmatrix} 0.998 & 0.001 & 0.001 \\ 0.004 & 0.98 & 0.016 \\ 0.008 & 0.012 & 0.98 \end{bmatrix}
$$

Problem 12.1.6 Solution

Under construction.
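The transition probabilities derived in Problem 12.1.5 can be assembled and sanity-checked numerically; a minimal sketch using NumPy (the matrix entries are exactly the values derived above):

```python
import numpy as np

# Transition matrix assembled from the facts in Problem 12.1.5
# (states: 0 = idle, 1 = read, 2 = write).
P = np.array([[0.998, 0.001, 0.001],
              [0.004, 0.980, 0.016],
              [0.008, 0.012, 0.980]])

# A valid state transition matrix must have every row sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)
print(P.sum(axis=1))  # [1. 1. 1.]
```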
Problem 12.1.7 Solution

Under construction.

Problem 12.1.8 Solution

Under construction.

Problem 12.2.1 Solution

Under construction.

Problem 12.2.2 Solution

From the given Markov chain, the state transition matrix is

$$
\mathbf{P} = \begin{bmatrix} P_{00} & P_{01} & P_{02} \\ P_{10} & P_{11} & P_{12} \\ P_{20} & P_{21} & P_{22} \end{bmatrix}
= \begin{bmatrix} 0.5 & 0.5 & 0 \\ 0.5 & 0.5 & 0 \\ 0.25 & 0.25 & 0.5 \end{bmatrix} \tag{1}
$$

The way to find $\mathbf{P}^n$ is to make the decomposition $\mathbf{P} = S D S^{-1}$, where the columns of $S$ are the eigenvectors of $\mathbf{P}$ and $D$ is a diagonal matrix containing the eigenvalues of $\mathbf{P}$. The eigenvalues are

$$
\lambda_1 = 1, \qquad \lambda_2 = 0, \qquad \lambda_3 = 1/2 \tag{2}
$$

The corresponding eigenvectors are

$$
s_1 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \qquad
s_2 = \begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix}, \qquad
s_3 = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \tag{3}
$$
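The diagonalization in Problem 12.2.2 can be checked numerically; a sketch using NumPy, with the eigenvectors above taken as the columns of $S$ in the order $\lambda_1, \lambda_2, \lambda_3$ (an ordering assumption, since any ordering works as long as $D$ matches):

```python
import numpy as np

# P = S D S^{-1} implies P^n = S D^n S^{-1}, the approach of Problem 12.2.2.
P = np.array([[0.50, 0.50, 0.0],
              [0.50, 0.50, 0.0],
              [0.25, 0.25, 0.5]])

# Columns of S are the eigenvectors s1, s2, s3 from the solution.
S = np.array([[1.0,  1.0, 0.0],
              [1.0, -1.0, 0.0],
              [1.0,  0.0, 1.0]])
D = np.diag([1.0, 0.0, 0.5])  # eigenvalues 1, 0, 1/2

# Verify the decomposition reproduces P exactly.
assert np.allclose(S @ D @ np.linalg.inv(S), P)

# Compute P^10 via the decomposition and compare to direct powering.
P10 = S @ np.linalg.matrix_power(D, 10) @ np.linalg.inv(S)
assert np.allclose(P10, np.linalg.matrix_power(P, 10))
```

Diagonalization is preferable to repeated multiplication here because $D^n$ is just the elementwise powers of the eigenvalues, which also makes the $n \to \infty$ limit easy to read off.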

