MS&E 221 Midterm Examination
Ramesh Johari
February 14, 2007

Instructions

1. Take alternate seating.
2. Answer all questions in the spaces provided on these sheets. If needed, additional paper will be available at the front of the room. Answers given on any other paper will not be counted.
3. The examination begins at 1:20 pm, and ends at 2:30 pm.
4. Show your work! Partial credit will be given for correct reasoning.

Honor Code

In taking this examination, I acknowledge and accept the Stanford University Honor Code.

NAME (signed)
NAME (printed)

Problem 1 (40 points). Answer each of the following short answer questions. (10 points per question)

(a) True or false: in a chain on a finite state space that has two communicating classes and a
unique invariant distribution, at least one of the two classes is transient. (Justify your answer.)

TRUE. If both classes (say C1 and C2) were recurrent, then each would be closed. Each closed class Ci would then support an invariant distribution pi^i concentrated on Ci, and one could obtain a continuum of invariant distributions as convex combinations a*pi^1 + (1 - a)*pi^2, 0 <= a <= 1. This contradicts uniqueness, so at least one of the two classes must be transient.
(b) Suppose the irreducible, aperiodic transition matrix P has a greater second largest eigenvalue modulus (SLEM) than the irreducible, aperiodic transition matrix Q (both on the same finite state space). Which Markov chain converges faster to its equilibrium distribution, and why?
The chain with transition matrix Q converges faster. Let rho_P = SLEM of P and rho_Q = SLEM of Q. Let pi be the unique invariant distribution of P, and nu the unique invariant distribution of Q. Let (X_n) be a Markov chain with transition matrix P, and (Y_n) a Markov chain with transition matrix Q. Then
|P(X_n = j) - pi(j)| <= C1 * rho_P^n, and |P(Y_n = j) - nu(j)| <= C2 * rho_Q^n,
for constants C1, C2. Since rho_Q < rho_P, the bound for (Y_n) decays faster, so (Y_n) converges faster to its equilibrium distribution.
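The geometric bound above can be illustrated numerically. A minimal sketch (the two 2-state matrices below are arbitrary illustrative choices, not matrices from the exam):

```python
import numpy as np

# Two irreducible, aperiodic chains on the same 2-state space.
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])   # eigenvalues 1 and 0.8, so SLEM(P) = 0.8
Q = np.array([[0.6, 0.4],
              [0.4, 0.6]])   # eigenvalues 1 and 0.2, so SLEM(Q) = 0.2

def slem(M):
    """Second largest eigenvalue modulus."""
    mods = sorted(abs(np.linalg.eigvals(M)), reverse=True)
    return mods[1]

pi = np.array([0.5, 0.5])    # common invariant distribution (both doubly stochastic)

def tv_dist(M, n):
    """Total variation distance to pi after n steps, starting from state 0."""
    row = np.linalg.matrix_power(M, n)[0]
    return 0.5 * np.abs(row - pi).sum()

print(slem(P), slem(Q))                # SLEM(P) = 0.8 > 0.2 = SLEM(Q)
print(tv_dist(P, 10), tv_dist(Q, 10))  # the chain with the smaller SLEM is much closer
```

The distances shrink like SLEM^n, matching the bound in the answer above.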
(c) Let X0, X1, X2, ... be a Markov chain on the state space {0, 1, 2, 3}, and suppose a new process Y0, Y1, Y2, ... is defined according to:
Y_n = 0, if X_n is even; Y_n = 1, if X_n is odd.
Under exactly what conditions is the process (Y_n) also a Markov chain? Explain your answer.
The key conditions (necessary and sufficient) are: whenever i and j are both even,
sum over odd k of P(i, k) = sum over odd k of P(j, k),
and whenever i and j are both odd,
sum over even k of P(i, k) = sum over even k of P(j, k)
(here P is the transition matrix of (X_n)). Under these conditions, P(X_{n+1} even | X_n odd) does not change depending on which odd state the chain occupies (and similarly with the roles of even and odd exchanged), so all of the Y-transition probabilities are well defined and (Y_n) is a Markov chain.
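This lumpability condition can be checked mechanically for any given transition matrix. A small sketch (the matrix P below is an illustrative example satisfying the condition, not a matrix from the exam):

```python
import numpy as np

# States 0, 1, 2, 3; even states {0, 2}, odd states {1, 3}.
P = np.array([
    [0.1, 0.3, 0.2, 0.4],   # from 0: P(odd) = 0.3 + 0.4 = 0.7
    [0.2, 0.3, 0.3, 0.2],   # from 1: P(even) = 0.2 + 0.3 = 0.5
    [0.2, 0.5, 0.1, 0.2],   # from 2: P(odd) = 0.5 + 0.2 = 0.7
    [0.4, 0.4, 0.1, 0.1],   # from 3: P(even) = 0.4 + 0.1 = 0.5
])

even, odd = [0, 2], [1, 3]

def lumpable(P):
    """Y_n = parity of X_n is Markov iff the total probability of jumping to
    the odd states is the same from every even state, and the total
    probability of jumping to the even states is the same from every odd state."""
    to_odd = P[even][:, odd].sum(axis=1)    # one entry per even state
    to_even = P[odd][:, even].sum(axis=1)   # one entry per odd state
    return np.allclose(to_odd, to_odd[0]) and np.allclose(to_even, to_even[0])

print(lumpable(P))   # True for this P
```

Changing any single row (say making P(odd | X = 0) differ from P(odd | X = 2)) breaks the condition and the check returns False.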
number of visits to '1: before the ﬁrst return to j? Use. 4w Marital LLN My" if“), f1, k= (' O, owmyise.
1.1
‘ 7: ﬁx.) 7 ’ p a.
1m L,“ m t 2mm) = ﬂigmlxo ,1
T~>v T P E[Ra(,)X°:J]
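The identity pi_i / pi_j can be verified exactly on a small example with an absorbing-chain computation (the 3-state matrix below is an arbitrary irreducible illustration, not from the exam):

```python
import numpy as np

P = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.1, 0.5],
              [0.3, 0.3, 0.4]])

# Invariant distribution: left eigenvector of P for eigenvalue 1, normalized.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

i, j = 0, 1   # count visits to i between successive returns to j

# Make j absorbing: Q = P restricted to the non-j states.
keep = [s for s in range(3) if s != j]
Q = P[np.ix_(keep, keep)]
# N[k, l] = expected visits to state keep[l] before hitting j, starting from keep[k].
N = np.linalg.inv(np.eye(len(keep)) - Q)

# Start at j, take one step, then count visits to i before absorption at j.
col = keep.index(i)
visits = sum(P[j, keep[k]] * N[k, col] for k in range(len(keep)))

print(visits, pi[i] / pi[j])   # the two agree
```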
Problem 2 (30 points). Consider a Markov chain on the state space {1, ..., 8} with the following transition matrix.

          1     2     3     4     5     6     7     8
    1  [  0     1     0     0     0     0     0     0  ]
    2  [  0     0     1     0     0     0     0     0  ]
    3  [  1     0     0     0     0     0     0     0  ]
    4  [ 1/6    0    1/3    0    1/2    0     0     0  ]
P = 5  [  0     0     0    1/6    0     0    1/3   1/2 ]
    6  [  0     0     0     0     0    1/2   1/2    0  ]
    7  [  0     0     0     0     0     0     0     1  ]
    8  [  0     0     0     0     0    1/3   2/3    0  ]

(a) (6 points) What are the communicating classes of this chain, and which are closed?
Drawing the transition diagram, the communicating classes are {1, 2, 3}, {4, 5}, and {6, 7, 8}. The classes {1, 2, 3} and {6, 7, 8} are closed; {4, 5} is not closed.

(b) (6 points) Which communicating classes are positive recurrent, null recurrent, and/or transient?

{1, 2, 3}: positive recurrent (closed and finite). {4, 5}: transient (not closed). {6, 7, 8}: positive recurrent (closed and finite).

(c) (6 points) Find all invariant distributions of this chain.
C1 = {1, 2, 3}: the chain restricted to C1 has unique invariant distribution pi^1 = (1/3, 1/3, 1/3). C2 = {4, 5} is transient, so every invariant distribution puts zero mass there. C3 = {6, 7, 8}: the chain restricted to C3 has unique invariant distribution pi^3, found from
pi^3(6) = (1/2) pi^3(6) + (1/3) pi^3(8)
pi^3(7) = (1/2) pi^3(6) + (2/3) pi^3(8)
pi^3(8) = pi^3(7)
together with pi^3(6) + pi^3(7) + pi^3(8) = 1; this gives pi^3 = (1/4, 3/8, 3/8). So all invariant distributions are of the form
(a/3, a/3, a/3, 0, 0, (1 - a)/4, 3(1 - a)/8, 3(1 - a)/8), 0 <= a <= 1.
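The restricted-chain computations can be double-checked numerically, using the two closed blocks of the transition matrix as reconstructed above:

```python
import numpy as np

# P restricted to C1 = {1,2,3} (the 3-cycle) and to C3 = {6,7,8}.
P1 = np.array([[0, 1, 0],
               [0, 0, 1],
               [1, 0, 0]])
P3 = np.array([[1/2, 1/2, 0],
               [0,   0,   1],
               [1/3, 2/3, 0]])

def invariant(P):
    """Solve pi P = pi together with sum(pi) = 1 as a least-squares system."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.concatenate([np.zeros(n), [1.0]])
    return np.linalg.lstsq(A, b, rcond=None)[0]

print(invariant(P1))   # [1/3, 1/3, 1/3]
print(invariant(P3))   # [1/4, 3/8, 3/8]
```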
(d) (6 points) For what values of i does the following limit exist? Give the limit for those values of i where it exists.
lim_{n -> infinity} P(X_n = i | X_0 = 5)

Let C1 = {1, 2, 3}, C2 = {4, 5}, C3 = {6, 7, 8}, and let h_j(Ck) = P(hit Ck | X_0 = j). For i = 1, 2, 3: since C1 is periodic (with period 3), the limit does not exist. For i = 4, 5: since C2 is transient, the limit is 0. For i = 6, 7, 8: C3 is aperiodic, so the limit is h_5(C3) * pi^3(i), where pi^3 is the invariant distribution of the chain restricted to C3, as in (c). To find h_5(C3), note
h_4(C3) = (1/2) h_5(C3)
h_5(C3) = 5/6 + (1/6) h_4(C3) = 5/6 + (1/12) h_5(C3),
which gives h_5(C3) = 10/11.
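These limits can be checked by raising the full 8x8 matrix (as reconstructed above) to a large power and reading off the row for initial state 5; the three behaviors (oscillation on C1, decay on C2, convergence to h_5(C3) * pi^3 on C3) are all visible:

```python
import numpy as np

P = np.array([
    [0,   1,   0,   0,   0,   0,   0,   0  ],
    [0,   0,   1,   0,   0,   0,   0,   0  ],
    [1,   0,   0,   0,   0,   0,   0,   0  ],
    [1/6, 0,   1/3, 0,   1/2, 0,   0,   0  ],
    [0,   0,   0,   1/6, 0,   0,   1/3, 1/2],
    [0,   0,   0,   0,   0,   1/2, 1/2, 0  ],
    [0,   0,   0,   0,   0,   0,   0,   1  ],
    [0,   0,   0,   0,   0,   1/3, 2/3, 0  ],
])

row_a = np.linalg.matrix_power(P, 300)[4]   # start state 5 (0-based index 4)
row_b = np.linalg.matrix_power(P, 301)[4]

# i in {1,2,3}: no limit; the mass on C1 keeps cycling with period 3.
print(row_a[:3], row_b[:3])
# i in {4,5}: limit 0 (transient class).
print(row_a[3:5])
# i in {6,7,8}: limit h_5(C3) * pi^3 = (10/11) * (1/4, 3/8, 3/8).
print(row_a[5:], 10/11 * np.array([1/4, 3/8, 3/8]))
```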
(e) (6 points) Now suppose the seventh row of the matrix is changed to:
(0 1/2 0 0 0 0 0 1/2).
Does this change your answer to (d)? Why or why not?

Yes. Now {6, 7, 8} is no longer closed (from 7 the chain can reach 2), so this class becomes transient and the limit is 0 for i = 6, 7, 8. All the remaining limits stay the same.
Problem 3 (30 points). A salesman travels to four different cities that are located at the vertices of the unit square (i.e., the square with vertices (0,0), (0,1), (1,0), (1,1)). At each time step, the salesman jumps to one of the two adjacent vertices. He jumps vertically with probability p, and horizontally with probability q. In all cities except (0,0), with probability r the salesman stays in the same city in the next time step. In (0,0) (his home office), with probability r the salesman takes a vacation in the next time step. (Here p + q + r = 1.) Once on vacation, the salesman stays on vacation each period with probability a, and otherwise returns to work at his home office.

(a) (6 points) Describe the movement of the salesman as a Markov chain.
Let A = (0,0), B = (1,0), C = (0,1), D = (1,1), and let V denote vacation. The chain has state space {A, B, C, D, V} with transition probabilities:
From A: to B with probability q, to C with probability p, to V with probability r.
From B: to A with probability q, to D with probability p, stays at B with probability r.
From C: to A with probability p, to D with probability q, stays at C with probability r.
From D: to B with probability p, to C with probability q, stays at D with probability r.
From V: stays at V with probability a, to A with probability 1 - a.
(b) (8 points) Assuming r = 0, find the long run fraction of time the salesman spends in each city.

The balance equations are
pi_A = q pi_B + p pi_C
pi_B = q pi_A + p pi_D
pi_C = p pi_A + q pi_D
pi_D = p pi_B + q pi_C
pi_A + pi_B + pi_C + pi_D = 1.
By the Markov LLN, the solution gives the long run fraction of time spent in each city. Since p + q = 1 here, the transition matrix is doubly stochastic, so pi_A = pi_B = pi_C = pi_D = 1/4.
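The balance equations above can be solved numerically as a quick check (states ordered A, B, C, D as labeled in part (a); the specific p, q values are an arbitrary illustration):

```python
import numpy as np

p, q = 0.3, 0.7   # any p + q = 1 works when r = 0

# States A=(0,0), B=(1,0), C=(0,1), D=(1,1); r = 0, so no staying and no vacations.
P = np.array([
    [0, q, p, 0],   # A -> B horizontally, A -> C vertically
    [q, 0, 0, p],   # B -> A, B -> D
    [p, 0, 0, q],   # C -> A, C -> D
    [0, p, q, 0],   # D -> B, D -> C
])

n = 4
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi = np.linalg.lstsq(A, b, rcond=None)[0]
print(pi)   # [0.25, 0.25, 0.25, 0.25]
```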
(c) (8 points) Suppose the salesman is currently on vacation. Find the probability that, once the salesman returns to work, he goes on vacation again without ever visiting (1,1).

Let h_i = P(reach V before D | X_0 = i), for i = A, B, C. Once the salesman returns to work he is at A, so the required probability is h_A. From A he reaches V immediately with probability r:
h_A = r + p h_C + q h_B
h_B = q h_A + r h_B   (a vertical jump from B lands in (1,1))
h_C = p h_A + r h_C   (a horizontal jump from C lands in (1,1)).
Substituting, h_A = r + (p^2 h_A + q^2 h_A) / (1 - r). This is exactly the probability required.
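Solving the reduced equation gives h_A = r(1 - r) / ((1 - r) - p^2 - q^2); this can be sanity-checked against the full linear system (the numeric p, q below are arbitrary):

```python
import numpy as np

p, q = 0.3, 0.5
r = 1 - p - q     # here r = 0.2

# h_i = P(reach vacation before (1,1) | currently in city i), cities A, B, C.
# h_A = r + p*h_C + q*h_B       (from A: vacation immediately with probability r)
# h_B = q*h_A + r*h_B           (from B, a vertical jump hits (1,1): contributes 0)
# h_C = p*h_A + r*h_C           (from C, a horizontal jump hits (1,1))
A = np.array([
    [1,  -q,     -p   ],
    [-q, 1 - r,  0    ],
    [-p, 0,      1 - r],
])
b = np.array([r, 0, 0])
hA, hB, hC = np.linalg.solve(A, b)

# Matches the closed form obtained from h_A = r + (p**2 + q**2) * h_A / (1 - r).
print(hA, r * (1 - r) / ((1 - r) - p**2 - q**2))
```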
(d) (8 points) Suppose the salesman is currently in (1,1). Suppose also that he earns a reward of one dollar for each horizontal sales trip, and two dollars for each vertical sales trip. Give a set of equations that can be used to compute the expected reward he will earn once he leaves (1,1), until he returns to (1,1). You do not need to explicitly compute the answer.

Let k(i) be the expected reward earned until (1,1) is first hit, starting from state i. Since we start at D = (1,1), set k(D) = 0. Then
k(A) = p(2 + k(C)) + q(1 + k(B)) + r k(V)
k(B) = p(2 + k(D)) + q(1 + k(A)) + r k(B)
k(C) = p(2 + k(A)) + q(1 + k(D)) + r k(C)
k(V) = a k(V) + (1 - a) k(A).
In terms of k(.), the desired value is p(2 + k(B)) + q(1 + k(C)).
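The system above is linear and can be solved for any specific parameter values (the p, q, a below are arbitrary illustrations; k(D) = 0 is substituted in):

```python
import numpy as np

p, q = 0.3, 0.5
r = 1 - p - q      # 0.2
a = 0.4            # probability of staying on vacation

# Unknowns k(A), k(B), k(C), k(V); k(D) = 0 since D = (1,1) is the target.
# k(A) = p*(2 + k(C)) + q*(1 + k(B)) + r*k(V)
# k(B) = p*(2 + 0)    + q*(1 + k(A)) + r*k(B)
# k(C) = p*(2 + k(A)) + q*(1 + 0)    + r*k(C)
# k(V) = a*k(V) + (1 - a)*k(A)
M = np.array([
    [1,        -q,    -p,    -r   ],
    [-q,       1 - r, 0,     0    ],
    [-p,       0,     1 - r, 0    ],
    [-(1 - a), 0,     0,     1 - a],
])
b = np.array([2*p + q, 2*p + q, 2*p + q, 0])
kA, kB, kC, kV = np.linalg.solve(M, b)

# Expected reward once he leaves (1,1) until he returns:
answer = p * (2 + kB) + q * (1 + kC)
print(answer)
```

Note that the last equation forces k(V) = k(A): vacation steps earn nothing, so the vacation state only delays the walk without changing the expected reward.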
An alternate approach uses the Markov LLN. Define a reward function g(i, j) = 1 if the step from i to j is a horizontal jump, g(i, j) = 2 if it is a vertical jump, and g(i, j) = 0 otherwise (so staying put and all vacation transitions earn nothing). The long run average reward per step is then
pi_A (p*2 + q*1) + pi_B (p*2 + q*1) + pi_C (p*2 + q*1) + pi_D (p*2 + q*1) = (pi_A + pi_B + pi_C + pi_D)(2p + q)
(where pi is the unique invariant distribution). This is also
E[ sum_{n=1}^{R_D} g(X_{n-1}, X_n) | X_0 = D ] / E[ R_D | X_0 = D ],
where R_D is the first return time to D = (1,1). To obtain the quantities we need, solve the invariant distribution equations pi = pi P together with pi_A + pi_B + pi_C + pi_D + pi_V = 1.
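As a check on this alternate approach, the invariant distribution of the full five-state chain can be computed numerically; the long run average reward is (2p + q) times the total mass on the four cities (the p, q, a values below are illustrative):

```python
import numpy as np

p, q = 0.3, 0.5
r = 1 - p - q     # 0.2
a = 0.4

# States ordered A, B, C, D, V with A = (0,0), D = (1,1), V = vacation.
P = np.array([
    [0,     q, p, 0, r],   # A: no staying; with probability r he leaves for vacation
    [q,     r, 0, p, 0],   # B
    [p,     0, r, q, 0],   # C
    [0,     p, q, r, 0],   # D
    [1 - a, 0, 0, 0, a],   # V
])

n = 5
A_mat = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi = np.linalg.lstsq(A_mat, b, rcond=None)[0]

avg_reward = (2*p + q) * pi[:4].sum()   # vacation steps earn nothing
print(pi, avg_reward)
```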