Stat 150 Midterm 1 Spring 2015
Instructor: Allan Sly
Name:
SID:
There are 4 questions worth a total of 50 points plus 4 bonus points. Attempt all questions
and show your working; solutions without explanation will not receive full credit. Answer
the questions.
Stat 150 Stochastic Processes
Spring 2009
Lecture 1: To be determined
Lecturer: Jim Pitman
Stochastic Process: Random process, evolution over time/space. Especially
models for sequences of random variables:
X0 , X1 , X2 , . . .
Time = {0, 1, 2, . . .}
Stat 150 Stochastic Processes
Spring 2009
Lecture 5: Markov Chains and First Step Analysis
Lecturer: Jim Pitman
1 Further Analysis of Markov Chains
Q: Suppose λ is a row vector, f is a column vector, and P is a probability transition
matrix; then what is the meaning of λP f?
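A minimal numerical sketch of the standard interpretation (the matrix, distribution, and function values below are our own illustrative choices, not from the lecture): λP is the distribution of X_1 when X_0 ~ λ, so λP f is the expectation of f(X_1) started from λ.

```python
# Illustrative values (not from the notes): lam is an initial
# distribution (row vector), P a transition matrix, f a function on
# states (column vector). Then lam P is the law of X1, and
# lam P f = E[f(X1)] when X0 ~ lam.
P = [[0.9, 0.1],
     [0.5, 0.5]]          # transition probabilities
lam = [0.5, 0.5]          # initial distribution (row vector)
f = [1.0, 3.0]            # function on states (column vector)

# lam P = distribution of X1
lamP = [sum(lam[x] * P[x][y] for x in range(2)) for y in range(2)]
# (lam P) f = expected value of f(X1)
lamPf = sum(lamP[y] * f[y] for y in range(2))
print(lamP, lamPf)
```

Here lamP = (0.7, 0.3) is again a probability vector, and lamPf = 1.6 is a scalar expectation.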
Stat 150 Stochastic Processes
Spring 2009
Lecture 4: Conditional Independence and Markov Chain
Lecturer: Jim Pitman
1 Conditional Independence
Q: If X and Y are conditionally independent given Z, are X and Y independent? (Typically no.)
Write X ⫫ Y | Z to indicate that X and Y are conditionally independent given Z.
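A toy example of the "typically no" (the distribution is our own, not from the lecture): let Z be a fair coin and, given Z = z, let X and Y be i.i.d. Bernoulli(p_z). Then X ⫫ Y | Z by construction, yet X and Y are strongly dependent.

```python
# Hedged example (our own construction): X and Y are conditionally
# independent given Z, yet dependent unconditionally.
# Z is a fair coin; given Z = z, X and Y are i.i.d. Bernoulli(p_z).
p = {0: 0.1, 1: 0.9}
joint = {}  # joint[(x, y)] = P(X=x, Y=y), marginalizing out Z
for z in (0, 1):
    for x in (0, 1):
        for y in (0, 1):
            px = p[z] if x == 1 else 1 - p[z]
            py = p[z] if y == 1 else 1 - p[z]
            joint[(x, y)] = joint.get((x, y), 0.0) + 0.5 * px * py

p_x1 = joint[(1, 0)] + joint[(1, 1)]
p_y1 = joint[(0, 1)] + joint[(1, 1)]
# If X and Y were independent, P(X=1, Y=1) would equal P(X=1) P(Y=1).
print(joint[(1, 1)], p_x1 * p_y1)
```

P(X=1, Y=1) = 0.41 while P(X=1)P(Y=1) = 0.25, so independence fails.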
Stat 150 Stochastic Processes
Spring 2009
Lecture 3: Martingales and hitting probabilities for random walk
Lecturer: Jim Pitman
A sequence of random variables S_n, n ∈ {0, 1, 2, . . .}, is a martingale if
1) E|S_n| < ∞ for each n = 0, 1, 2, . . .
2) E(S_{n+1} | S_0, S_1, . . . , S_n) = S_n for each n.
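The defining property can be checked exactly for the simplest example, simple symmetric random walk (our own verification, enumerating all equally likely paths rather than simulating):

```python
from itertools import product

# Check the martingale property for simple symmetric random walk
# S_n = X_1 + ... + X_n with P(X_i = +1) = P(X_i = -1) = 1/2:
# for every prefix (X_1, X_2), the conditional mean of S_3 over the
# two equally likely continuations equals S_2.
verified = True
for prefix in product((-1, 1), repeat=2):
    s2 = sum(prefix)
    cond_mean = sum(s2 + step for step in (-1, 1)) / 2  # E(S_3 | X_1, X_2)
    verified = verified and (cond_mean == s2)
print(verified)
```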
Stat 150 Stochastic Processes
Spring 2009
Lecture 2: Conditional Expectation
Lecturer: Jim Pitman
Some useful facts (assume all random variables here have finite mean square):
E(Y g(X) | X) = g(X) E(Y | X)
Y − E(Y | X) is orthogonal to E(Y | X), and orthogonal also to g(X) for every function g of X.
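A numerical check of the orthogonality fact on a small discrete joint distribution (the probability table is our own toy example): the inner product E[(Y − E(Y|X)) · E(Y|X)] comes out to zero.

```python
# Toy joint distribution (our own): verify that Y - E(Y|X) is
# orthogonal to E(Y|X), i.e. E[(Y - E(Y|X)) * E(Y|X)] = 0.
probs = {(0, 1): 0.2, (0, 3): 0.3, (1, 2): 0.1, (1, 5): 0.4}  # P(X=x, Y=y)

# Compute E(Y | X = x) for each x.
px, eyx = {}, {}
for (x, y), p in probs.items():
    px[x] = px.get(x, 0.0) + p
    eyx[x] = eyx.get(x, 0.0) + p * y
for x in eyx:
    eyx[x] /= px[x]

# Inner product E[(Y - E(Y|X)) * E(Y|X)].
ip = sum(p * (y - eyx[x]) * eyx[x] for (x, y), p in probs.items())
print(ip)
```

The same computation with eyx[x] replaced by any other function of x also gives zero, which is the general orthogonality statement.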
Stat 150 Stochastic Processes
Spring 2009
Lecture 6: Markov Chains and First Step Analysis II
Lecturer: Jim Pitman
1 Further Analysis of Markov Chains
In class last time: We found that if h = P h, then
(1) E[h(X_{n+1}) | X_0, X_1, . . . , X_n] = h(X_n)
(2) h(X_n), n = 0, 1, 2, . . . , is a martingale.
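A concrete harmonic function (the chain and h are our own illustrative choices): for simple random walk on {0, 1, 2, 3} absorbed at both ends, h(x) = x/3 satisfies h = P h, so h(X_n) is a martingale.

```python
# Gambler's-ruin chain on {0, 1, 2, 3} with absorbing endpoints
# (our own example): check that h(x) = x/3 is harmonic, h = P h.
n = 4
P = [[0.0] * n for _ in range(n)]
P[0][0] = P[n - 1][n - 1] = 1.0          # absorbing boundary
for x in range(1, n - 1):
    P[x][x - 1] = P[x][x + 1] = 0.5      # symmetric interior steps
h = [x / (n - 1) for x in range(n)]
Ph = [sum(P[x][y] * h[y] for y in range(n)) for x in range(n)]
print(Ph)
```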
Stat 150 Stochastic Processes
Spring 2009
Lecture 7: Limits of Random Variables
Lecturer: Jim Pitman
Simplest case: pointwise limits.
Recall that formally a random variable X is a function of outcomes ω:
X : Ω → R, ω ↦ X(ω).
A sequence of random variables (functions
Stat 150 Stochastic Processes
Spring 2009
Lecture 12: Probability Generating Functions
Lecturer: Jim Pitman
(See text, around page 185.) Previous lecture showed the value of identifying a
sequence with a corresponding generating function defined by a power
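One standard payoff of this identification (our own worked example, not from the lecture): the generating function of a sum of independent variables is the product of their generating functions, which for coefficient lists is just polynomial multiplication.

```python
# PGF of a fair die as a coefficient list: coefficient of s^k is
# P(X = k). Multiplying the PGF by itself gives the distribution of
# the sum of two independent dice.
die = [0.0] + [1 / 6] * 6

def gf_multiply(a, b):
    """Multiply two generating functions given as coefficient lists."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

two_dice = gf_multiply(die, die)
print(two_dice[7])          # P(sum of two dice = 7)
```

The coefficient of s^7 is 6/36 = 1/6, matching the direct count of favorable outcomes.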
Stat 150 Stochastic Processes
Spring 2009
Lecture 10: The fundamental matrix (Green function)
Lecturer: Jim Pitman
1 The fundamental matrix (Green function)
Formulate for Markov chains with an absorbing boundary. (Applications to
other chains can be made
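A minimal sketch of the computation (the chain is our own choice): restrict the transition matrix to the transient states, call it Q, and invert I − Q; the entries of G = (I − Q)^{-1} are expected numbers of visits, and row sums are expected times to absorption.

```python
# Transient states {1, 2} of gambler's ruin on {0, 1, 2, 3} (our own
# example). Q is the transition matrix restricted to transient states;
# G = (I - Q)^{-1} is the fundamental matrix (Green function).
Q = [[0.0, 0.5],
     [0.5, 0.0]]
M = [[1 - Q[0][0], -Q[0][1]],
     [-Q[1][0], 1 - Q[1][1]]]              # M = I - Q
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
G = [[M[1][1] / det, -M[0][1] / det],      # 2x2 inverse by hand
     [-M[1][0] / det, M[0][0] / det]]
exp_absorb_time = [sum(row) for row in G]  # expected steps to absorption
print(G, exp_absorb_time)
```

From either interior state the walk takes 2 steps on average to be absorbed, as the row sums confirm.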
Stat 150 Stochastic Processes
Spring 2009
Lecture 9: Waiting for patterns
Lecturer: Jim Pitman
1 Waiting for patterns
Expected waiting time for patterns in Bernoulli trials
Suppose X_1, X_2, . . . are independent coin tosses with P(X_i = H) = p, P(X_i = T) = 1 − p.
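A seeded simulation (our own sanity check, not from the notes) of the classic answer for the pattern HH with a fair coin: the expected waiting time is 1/p + 1/p² = 2 + 4 = 6.

```python
import random

# Simulate the waiting time for the pattern HH with a fair coin
# (our own check): the known formula gives 1/p + 1/p^2 = 6.
random.seed(0)

def wait_for_hh(p=0.5):
    prev, n = None, 0
    while True:
        n += 1
        cur = random.random() < p      # True = heads
        if prev and cur:               # previous and current both heads
            return n
        prev = cur

trials = 100_000
mean_wait = sum(wait_for_hh() for _ in range(trials)) / trials
print(mean_wait)
```

Note that HT, by contrast, has expected waiting time 1/(pq) = 4 for a fair coin; overlapping patterns like HH take longer.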
Stat 150 Stochastic Processes
Spring 2009
Lecture 8: First passage and occupation times for random walk
Lecturer: Jim Pitman
1 First passage and occupation times for random walk
Gambler's ruin problem on {0, 1, . . . , b} with P(a, a − 1) = P(a, a + 1) = 1/2.
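The classical hitting probability for this symmetric walk is h(a) = P_a(reach b before 0) = a/b; a quick check (our own) that this h satisfies the first-step equations:

```python
# For symmetric gambler's ruin on {0, 1, ..., b}, verify that
# h(a) = a/b solves the first-step equations
# h(a) = (h(a-1) + h(a+1)) / 2, with h(0) = 0 and h(b) = 1.
b = 10
h = [a / b for a in range(b + 1)]
ok = all(abs(h[a] - (h[a - 1] + h[a + 1]) / 2) < 1e-12
         for a in range(1, b))
print(ok, h[3])
```

Since the first-step equations plus the boundary values determine h uniquely, this confirms h(a) = a/b.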
Stat 150 Homework # 1 Due February 6
Problems:
Q 1 Let S and T be stopping times. Which of the following are stopping times?
T − 1.
min{S, T}.
max{S, T}.
Answer: T_1 := T − 1 is not a ST. The event {T_1 ≤ n} = {T ≤ n + 1} is determined by the
process up to time n + 1.
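The positive cases rest on two set identities, which we can verify exhaustively for small values (our own check): {min{S, T} ≤ n} = {S ≤ n} ∪ {T ≤ n} and {max{S, T} ≤ n} = {S ≤ n} ∩ {T ≤ n}, and each right-hand side is determined by the process up to time n.

```python
# Verify the event identities behind "min and max of stopping times
# are stopping times", over a grid of small values.
verified = True
for S in range(5):
    for T in range(5):
        for n in range(5):
            verified &= (min(S, T) <= n) == ((S <= n) or (T <= n))
            verified &= (max(S, T) <= n) == ((S <= n) and (T <= n))
print(verified)
```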
Stat 150 - Section 3 - Reversible Markov Chains Solutions
Q1. (i) Denote π(x) := w(x)/W. Then
π(x)P(x, y) = (w(x)/W) · (w_{x,y}/w(x)) = w_{x,y}/W = w_{y,x}/W = (w(y)/W) · (w_{y,x}/w(y)) = π(y)P(y, x).
(ii) In this setup w(x) = Σ_y w_{x,y} = Σ_y π(x)P(x, y) = π(x), and W = Σ_x w(x) = Σ_x π(x) = 1.
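A numerical check of part (i) on a small weighted graph (the weights are our own): with P(x, y) = w_{x,y}/w(x) and π(x) = w(x)/W, detailed balance π(x)P(x, y) = π(y)P(y, x) holds for every pair of states.

```python
# Random walk on a weighted graph (weights our own choice): verify
# detailed balance pi(x) P(x, y) = pi(y) P(y, x) with
# pi(x) = w(x) / W, P(x, y) = w_{x,y} / w(x).
w = {(0, 1): 2.0, (1, 2): 3.0, (0, 2): 1.0}   # symmetric edge weights
wt = {}
for (x, y), v in w.items():
    wt[(x, y)] = wt[(y, x)] = v

states = [0, 1, 2]
wsum = {x: sum(wt.get((x, y), 0.0) for y in states) for x in states}
W = sum(wsum.values())
pi = {x: wsum[x] / W for x in states}
P = {(x, y): wt.get((x, y), 0.0) / wsum[x] for x in states for y in states}

balanced = all(abs(pi[x] * P[(x, y)] - pi[y] * P[(y, x)]) < 1e-12
               for x in states for y in states)
print(balanced)
```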
Stat 150 Homework # 3 Solutions
1: We take the set {1, 2} as the state space, with state i corresponding to contestant i being
asked the current question. Then
P = ( 7/10  3/10 )
    ( 2/10  8/10 )
We need to solve πP = π:
π_1 · 7/10 + π_2 · 2/10 = π_1,
π_1 · 3/10 + π_2 · 8/10 = π_2.
These equations, together with π_1 + π_2 = 1, give π_1 = 2/5, π_2 = 3/5.
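A quick check of the algebra: the two balance equations reduce to π_1 · 3/10 = π_2 · 2/10, so with normalization π_1 = 2/5 and π_2 = 3/5.

```python
# Solve pi P = pi for the two-state chain and verify the answer.
P = [[0.7, 0.3],
     [0.2, 0.8]]
# Balance: pi1 * P(1,2) = pi2 * P(2,1), with pi1 + pi2 = 1.
pi1 = P[1][0] / (P[0][1] + P[1][0])
pi2 = 1 - pi1
# Verify pi P = pi componentwise.
lhs = [pi1 * P[0][0] + pi2 * P[1][0],
       pi1 * P[0][1] + pi2 * P[1][1]]
print(pi1, pi2, lhs)
```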
Stat 150 Homework # 2 Solutions
1: First note that since π(x)P(x, y) = π(y)P(y, x) for all x, y, we have that P²(x, x) > 0 for
all x. Since 2 is prime, we get that either the chain is aperiodic or it has period two. Period
2 can indeed be obtained. Let G = (
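The simplest period-2 example (our own illustration of the claim): the two-state chain that deterministically swaps states. It is reversible, P²(x, x) = 1 > 0, but P(x, x) = 0, so returns happen only at even times.

```python
# Deterministic swap chain on two states: reversible with period 2.
# P^2 = I, so P^2(x, x) > 0 while P(x, x) = 0.
P = [[0.0, 1.0],
     [1.0, 0.0]]
P2 = [[sum(P[x][k] * P[k][y] for k in range(2)) for y in range(2)]
      for x in range(2)]
print(P2)
```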
Stat 150 Practice Midterm Spring 2015 Solutions
Instructor: Allan Sly
Name:
SID:
There are 4 questions worth a total of 61 points plus 4 bonus points. Attempt all questions
and show your working; solutions without explanation will not receive full credit.