Stat 150 Stochastic Processes
Spring 2009
Lecture 4: Conditional Independence and Markov Chain
Lecturer: Jim Pitman
1 Conditional Independence
Q: If X and Y are conditionally independent given Z, are X and Y independent? (Typically no.) Write X ⊥ Y | Z to in
Stat 150 Stochastic Processes
Spring 2009
Lecture 7: Limits of Random Variables
Lecturer: Jim Pitman
Simplest case: pointwise limits. Recall that formally a random variable X is a function of outcomes ω: X : Ω → R, ω ↦ X(ω). A sequence of random variables (functions)
Stat 150 Stochastic Processes
Spring 2009
Lecture 6: Markov Chains and First Step Analysis II
Lecturer: Jim Pitman
1 Further Analysis of Markov Chains
In class last time: We found that if h = P h, then (1) E[h(Xn+1 ) | X0 , X1 , . . . , Xn ] = h(Xn ) (2)
Stat 150 Stochastic Processes
Spring 2009
Lecture 5: Markov Chains and First Step Analysis
Lecturer: Jim Pitman
1 Further Analysis of Markov Chains
Q: Suppose we are given a row vector, a column vector f , and a probability transition matrix P ; then what is the mea
Lecture 17 : Long run behaviour of Markov chains
STAT 150 Spring 2006 Lecturer: Jim Pitman Scribe: Vincent Gee
Basic Case: S is finite, Markov matrix is P . Assume that some power of P has all entries > 0: ∃ k such that P k (i, j) > 0 for all i, j ∈ S. Such P is c
Statistics 150 (Stochastic Processes): Midterm Exam, Spring 2009. J. Pitman, U.C. Berkeley.
1. A sequence of random variables X1 , X2 , . . ., each with two possible values 0 and 1, is such that
P(X1 = 1) = p1 and for each n ≥ 1
P(Xn+1 = 1 | X1 , . . . , Xn
Statistics 150 (Stochastic Processes): Final Exam, Spring 2009. J. Pitman, U.C. Berkeley.
1. Suppose that a Markov matrix P indexed by a finite set has the property that for each state i:
∑_{j ≠ i} P (i, j) = ∑_{j ≠ i} P (j, i).
a) What does this property imply abou
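If the displayed condition holds (the off-diagonal row sum equals the off-diagonal column sum for each state i), then adding P (i, i) to both sides shows every column of P sums to 1, i.e. P is doubly stochastic, so the uniform distribution is invariant. A numerical sketch of this check, using a hypothetical symmetric 3-state matrix (symmetry is one easy way to satisfy the condition):

```python
# Assumed reading of the condition: sum_{j != i} P(i, j) == sum_{j != i} P(j, i)
# for every i. Adding P(i, i) to both sides makes each column sum equal the
# row sum (= 1), so P is doubly stochastic and the uniform pi is invariant.

# Hypothetical symmetric 3-state example:
P = [
    [0.5, 0.3, 0.2],
    [0.3, 0.4, 0.3],
    [0.2, 0.3, 0.5],
]
n = len(P)

# The condition holds:
for i in range(n):
    row = sum(P[i][j] for j in range(n) if j != i)
    col = sum(P[j][i] for j in range(n) if j != i)
    assert abs(row - col) < 1e-12

# Consequence 1: every column sums to 1 (doubly stochastic).
col_sums = [sum(P[i][j] for i in range(n)) for j in range(n)]
assert all(abs(s - 1.0) < 1e-9 for s in col_sums)

# Consequence 2: the uniform distribution is invariant: pi P = pi.
pi = [1 / n] * n
pi_P = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
assert all(abs(x - 1 / n) < 1e-9 for x in pi_P)
print("P is doubly stochastic; uniform pi is invariant")
```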
FIRST HITTING PROBABILITY WITH TWO BARRIERS
XIXI WANG, CITED FROM THE BOOK “APPLIED STOCHASTIC PROCESS” BY
YUANLIE LIN, TSINGHUA UNIVERSITY PRESS
Abstract. In class we’ve studied the first hitting probability for a Brownian
motion with drift. I have found
Lecture 21 : Continuous Time Markov Chains
STAT 150 Spring 2006 Lecturer: Jim Pitman Scribe: Stephen Bianchi
(These notes also include material from the subsequent guest lecture given by Ani Adhikari.) Consider a continuous time stochastic process (Xt
Kunal Mehta
15913699
Branching Processes
Lecture 13
Probability Generating Function:
φ(s) = generic notation for PGF
φ(s) = ∑_{n=0}^∞ s^n pn
where (p0 , p1 , p2 , . . .) is the probability distribution of some r.v. X:
P(X = n) = pn
E(s^X ) = φ(s)
To make co
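As a quick sanity check of the definition, the PGF of a Binomial(2, 1/2) variable (a hypothetical example chosen here, not one from the lecture) can be evaluated both from the sum ∑ s^n pn and from its closed form ((1 + s)/2)^2; also φ(1) = 1 and φ′(1) = E(X):

```python
def pgf(p, s):
    """phi(s) = sum_n s**n * p_n for a finite distribution p."""
    return sum(s**n * pn for n, pn in enumerate(p))

# Binomial(2, 1/2): (p0, p1, p2) = (1/4, 1/2, 1/4), closed form ((1+s)/2)**2.
p = [0.25, 0.5, 0.25]

assert abs(pgf(p, 1.0) - 1.0) < 1e-12            # phi(1) = total mass = 1
assert abs(pgf(p, 0.5) - ((1 + 0.5) / 2)**2) < 1e-12

# phi'(1) = E(X): symmetric difference quotient (exact here, phi is quadratic).
h = 1e-6
deriv = (pgf(p, 1 + h) - pgf(p, 1 - h)) / (2 * h)
assert abs(deriv - 1.0) < 1e-4                   # E(X) = 1 for Binomial(2, 1/2)
print("PGF checks passed")
```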
Stat 150 Stochastic Processes
Spring 2009
Lecture 8: First passage and occupation times for random walk
Lecturer: Jim Pitman
1 First passage and occupation times for random walk
Gambler's ruin problem on {0, 1, . . . , b} with P (a, a − 1) = P (a, a +
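The truncated line presumably sets P (a, a + 1) = p and P (a, a − 1) = q; under that assumption, first-step analysis gives h(a) := Pa (hit b before 0) satisfying h(a) = p h(a + 1) + q h(a − 1) with h(0) = 0, h(b) = 1, whose standard solution is h(a) = (1 − (q/p)^a)/(1 − (q/p)^b) for p ≠ q. A sketch checking the boundary values and the first-step equation:

```python
def hit_b_before_0(a, b, p):
    """h(a) = P_a(reach b before 0) for steps +1 w.p. p, -1 w.p. q = 1 - p."""
    q = 1 - p
    if abs(p - q) < 1e-12:
        return a / b                      # symmetric walk
    r = q / p
    return (1 - r**a) / (1 - r**b)

b, p = 10, 0.6
h = [hit_b_before_0(a, b, p) for a in range(b + 1)]

assert h[0] == 0.0 and abs(h[b] - 1.0) < 1e-12          # boundary conditions
for a in range(1, b):                                   # first-step equation
    assert abs(h[a] - (p * h[a + 1] + (1 - p) * h[a - 1])) < 1e-9
print(round(h[5], 4))                                   # 0.8836
```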
Stat 150 Stochastic Processes
Spring 2009
Lecture 9: Waiting for patterns
Lecturer: Jim Pitman
1 Waiting for patterns
Expected waiting time for patterns in Bernoulli trials Suppose X1 , X2 , . . . are independent coin tosses with P(Xi = H) = p, P(Xi = T
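One classical closed form (the overlap formula, sometimes attributed to Conway) sums 1/P(prefix) over every prefix of the pattern that is also a suffix. A sketch checking it against the well-known fair-coin answers, e.g. E(wait for HH) = 6 but E(wait for HT) = 4:

```python
def expected_wait(pattern, p_heads):
    """Expected tosses until `pattern` first appears, via the overlap formula:
    sum of 1/P(prefix) over every prefix that is also a suffix of the pattern."""
    prob = {"H": p_heads, "T": 1 - p_heads}
    total = 0.0
    for k in range(1, len(pattern) + 1):
        if pattern[:k] == pattern[-k:]:          # prefix of length k is a suffix
            pr = 1.0
            for c in pattern[:k]:
                pr *= prob[c]
            total += 1.0 / pr
    return total

assert expected_wait("HH", 0.5) == 6.0     # classic fair-coin answer
assert expected_wait("HT", 0.5) == 4.0     # HT arrives faster on average than HH
assert expected_wait("HHH", 0.5) == 14.0   # 2 + 4 + 8
print(expected_wait("HTH", 0.5))           # 10.0 = 2 + 8
```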
STAT 150 CLASS NOTES
Onur Kaya 16292609 May 18, 2006
Martingales: A sequence of random variables (Mn ) is a martingale relative to the sequence (Xn ) if: 1. Mn is some measurable function of X1 , X2 , . . . , Xn 2. E[Mn+1 | X1 , X2 , . . . , Xn ] = Mn Notice that (1)
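Condition 2 can be verified exhaustively for a simple example (an illustration chosen here, not taken from the notes): Mn = X1 + · · · + Xn with i.i.d. fair ±1 steps. Averaging over the next step, given any history, returns Mn exactly:

```python
from itertools import product

n = 3
for history in product([-1, 1], repeat=n):       # every possible (X_1, ..., X_n)
    M_n = sum(history)
    # E[M_{n+1} | X_1..X_n = history]: average over the two fair next steps.
    cond_exp = sum((M_n + x) * 0.5 for x in (-1, 1))
    assert cond_exp == M_n                       # exactly M_n, as condition 2 requires
print("E[M_{n+1} | history] = M_n for all", 2**n, "histories")
```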
Stat 150 Stochastic Processes
Spring 2009
Lecture 28: Brownian Bridge
Lecturer: Jim Pitman
From last time:
Problem: Find hitting probability for BM with drift μ, Dt := a + μt + Bt ,
Pa (Dt hits b before 0).
Idea: Find a suitable MG. Found e^{−2μDt} is a MG.
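Assuming the martingale is e^{−2μDt} (the standard exponential martingale for drift μ), optional stopping at T = first hit of {0, b} gives e^{−2μa} = p e^{−2μb} + (1 − p), hence p = (1 − e^{−2μa})/(1 − e^{−2μb}). A numerical sanity check of this formula:

```python
import math

def p_hit_b_first(a, b, mu):
    """P_a(D hits b before 0) from optional stopping of exp(-2*mu*D_t)."""
    return (1 - math.exp(-2 * mu * a)) / (1 - math.exp(-2 * mu * b))

b, mu = 5.0, 0.3
assert abs(p_hit_b_first(0.0, b, mu) - 0.0) < 1e-12   # start at 0: b never first
assert abs(p_hit_b_first(b, b, mu) - 1.0) < 1e-12     # start at b: already there
# As mu -> 0 this should recover the driftless answer a/b:
assert abs(p_hit_b_first(2.0, b, 1e-8) - 2.0 / 5.0) < 1e-4
print(round(p_hit_b_first(2.0, b, mu), 4))            # 0.7354
```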
Stat 150 Stochastic Processes
Spring 2009
Lecture 20: Markov Chains: Examples
Lecturer: Jim Pitman
A nice formula: Ei (number of hits on j before Ti ) = λj /λi . Domain of truth: P is irreducible, λ is an invariant measure: λP = λ, λj ≥ 0 for all j, and either λj < ∞ or (w
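In its standard form the formula reads Ei (number of hits on j before Ti ) = λj /λi for an invariant measure λ. A sketch checking this on a small hypothetical birth-death chain, computing the expected visits through the 2×2 inverse (I − Q)^{-1}:

```python
def inv2(M):
    """Inverse of a 2x2 matrix by the cofactor formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Hypothetical birth-death chain on {0, 1, 2}; detailed balance gives an
# invariant measure lambda proportional to (1, 2, 2).
P = [
    [0.50, 0.50, 0.00],
    [0.25, 0.25, 0.50],
    [0.00, 0.50, 0.50],
]
lam = [1.0, 2.0, 2.0]
for j in range(3):                                # lambda P = lambda
    assert abs(sum(lam[i] * P[i][j] for i in range(3)) - lam[j]) < 1e-12

# G = (I - Q)^{-1} with Q = P restricted to {1, 2}: expected visits before T_0.
G = inv2([[1 - P[1][1], -P[1][2]],
          [-P[2][1], 1 - P[2][2]]])

# E_0(hits on j before T_0) = sum_k P(0, k) G(k, j) = lambda_j / lambda_0.
for col, j in enumerate((1, 2)):
    visits = sum(P[0][k] * G[k - 1][col] for k in (1, 2))
    assert abs(visits - lam[j] / lam[0]) < 1e-9
print("E_0(hits on j before T_0) = lambda_j / lambda_0 checked")
```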
Stat 150 Stochastic Processes
Spring 2009
Lecture 18: Markov Chains: Examples
Lecturer: Jim Pitman
A nice collection of random walks on graphs is derived from random movement of a chess piece on a chess board. The state space of each walk is the set of 8
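For a piece that moves uniformly at random among its legal moves, the walk is a random walk on the move graph, and the stationary probability of a square is its degree divided by the total degree. A sketch for the knight (the values 2, 8, and 336 below are properties of the knight-move graph on the 8 × 8 board):

```python
MOVES = [(1, 2), (2, 1), (2, -1), (1, -2), (-1, -2), (-2, -1), (-2, 1), (-1, 2)]

def knight_degree(r, c):
    """Number of legal knight moves from square (r, c) on an 8x8 board."""
    return sum(0 <= r + dr < 8 and 0 <= c + dc < 8 for dr, dc in MOVES)

degrees = {(r, c): knight_degree(r, c) for r in range(8) for c in range(8)}
total = sum(degrees.values())

assert degrees[(0, 0)] == 2      # a corner square has only 2 legal moves
assert degrees[(3, 3)] == 8      # a central square has all 8
assert total == 336              # the knight-move graph has 168 edges
print("stationary probability of a corner square:", degrees[(0, 0)] / total)
```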
Stat 150 Stochastic Processes
Spring 2009
Lecture 17: Limit distributions for Markov Chains
Lecturer: Jim Pitman
Finite state space Markov chain X0 , X1 , . . . with state space S having N elements, N < ∞. Matrix P , Pi (Xn = j) = P n (i, j). Problem: Sup
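Since Pi (Xn = j) = P n (i, j), limiting behaviour can be watched by powering the matrix: when the powers converge, every row of P n approaches the same limit vector. A sketch with a hypothetical 2-state chain whose stationary distribution is (0.8, 0.2):

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[0.9, 0.1],
     [0.4, 0.6]]

Pn = P
for _ in range(60):              # Pn = P^61; second eigenvalue is 0.5, so converged
    Pn = matmul(Pn, P)

# Both rows approach pi = (0.8, 0.2), the solution of pi P = pi.
for row in Pn:
    assert abs(row[0] - 0.8) < 1e-9 and abs(row[1] - 0.2) < 1e-9
print(Pn)
```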
Stat 150 Stochastic Processes
Spring 2009
Lecture 14: Branching Processes and Random Walks
Lecturer: Jim Pitman
Common setting: p0 , p1 , p2 , . . . probability distribution of X on {0, 1, 2, . . . }.
• Two surprisingly related problems:
Problem 1: Use X
Stat 150 Stochastic Processes
Spring 2009
Lecture 13: Branching Processes
Lecturer: Jim Pitman
Consider the following branching process:
[Tree diagram of the branching process, showing Z0 = 2, Z2 = 3, Z3 = 2, Z4 = 2.]
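A branching process like the one pictured can be simulated generation by generation, giving each of the Zn individuals an i.i.d. offspring count. A minimal sketch with a hypothetical offspring distribution (p0 , p1 , p2 ) = (1/4, 1/2, 1/4) and Z0 = 2:

```python
import random

def next_generation(z, rng):
    """Total offspring of z individuals, i.i.d. counts in {0, 1, 2}."""
    return sum(rng.choices([0, 1, 2], weights=[0.25, 0.5, 0.25])[0]
               for _ in range(z))

rng = random.Random(0)
Z = [2]                          # Z_0 = 2, as in the diagram
for n in range(4):
    Z.append(next_generation(Z[-1], rng))
print("one sample path Z_0..Z_4:", Z)
```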
Stat 150 Stochastic Processes
Spring 2009
Lecture 12: Probability Generating Functions
Lecturer: Jim Pitman
(See text, around page 185.) Previous lecture showed the value of identifying a
sequence with a corresponding generating function defined by a power
Stat 150 Stochastic Processes
Spring 2009
Lecture 11: Return times for random walk
Lecturer: Jim Pitman
1 Recurrence/Transience
From last time: look at the simple random walk on Z with step probabilities p and q. Sn := a + X1 + · · · + Xn . Compute: u2n = P0 (S2n = 0), f2n = P0 (T0
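The return probabilities have the closed form u2n = C(2n, n) p^n q^n (n up-steps and n down-steps in some order), and the walk is recurrent iff ∑n u2n diverges, which happens only when p = q = 1/2. A quick numeric look:

```python
from math import comb

def u(n, p):
    """u_{2n} = P_0(S_{2n} = 0) = C(2n, n) p^n q^n."""
    return comb(2 * n, n) * p**n * (1 - p)**n

assert u(1, 0.5) == 0.5                      # C(2, 1) / 4

# Biased case (4pq < 1): u_{2n} decays geometrically, so the series converges
# (transience). The tail is already tiny:
p = 0.6
tail = sum(u(n, p) for n in range(200, 400))
assert tail < 0.01

# Fair case: u_{2n} ~ 1/sqrt(pi*n), so partial sums keep growing (recurrence):
s100 = sum(u(n, 0.5) for n in range(1, 100))
s400 = sum(u(n, 0.5) for n in range(1, 400))
assert s400 > s100 + 1
print(round(s100, 2), round(s400, 2))
```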
Stat 150 Stochastic Processes
Spring 2009
Lecture 10: The fundamental matrix (Green function)
Lecturer: Jim Pitman
1 The fundamental matrix (Green function)
Formulate for Markov chains with an absorbing boundary. (Applications to other chains can be made
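In this setup, ordering the states so that Q is the transient-to-transient block, G = (I − Q)^{−1} is the fundamental matrix: G(i, j) is the expected number of visits to j starting from i, and row sums give expected time to absorption. A sketch for fair gambler's ruin on {0, 1, 2, 3} (a small example chosen here) with absorbing 0, 3 and transient states {1, 2}:

```python
def inv2(M):
    """Inverse of a 2x2 matrix by the cofactor formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Fair gambler's ruin on {0, 1, 2, 3}: states 0 and 3 absorb, Q acts on {1, 2}.
Q = [[0.0, 0.5],
     [0.5, 0.0]]
G = inv2([[1 - Q[0][0], -Q[0][1]],
          [-Q[1][0], 1 - Q[1][1]]])

# G(i, j) = expected visits to j before absorption, starting from i.
assert abs(G[0][0] - 4 / 3) < 1e-9 and abs(G[0][1] - 2 / 3) < 1e-9
# Row sums = expected absorption time; matches a(b - a) = 1*2 = 2 for fair ruin.
assert abs(sum(G[0]) - 2.0) < 1e-9 and abs(sum(G[1]) - 2.0) < 1e-9
print("fundamental matrix G:", G)
```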
Lecture 6 : Markov Chains
STAT 150 Spring 2006 Lecturer: Jim Pitman Scribe: Alex Michalka
Markov Chains: Discrete time. Discrete (finite or countable) state space S. Process {Xn }. Homogeneous transition probability matrix P = {P (i, j); i, j ∈ S}. P (i,
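The data (S, P ) is all that is needed to simulate such a chain: each step draws the next state from the current state's row of P . A minimal sketch with a hypothetical 3-state chain (state c absorbing):

```python
import random

# Hypothetical 3-state chain; row entries are (next state, probability).
P = {
    "a": [("a", 0.5), ("b", 0.5)],
    "b": [("a", 0.3), ("c", 0.7)],
    "c": [("c", 1.0)],              # absorbing state
}

def step(state, rng):
    states, probs = zip(*P[state])
    return rng.choices(states, weights=probs)[0]

rng = random.Random(1)
path = ["a"]
for _ in range(10):
    path.append(step(path[-1], rng))
print(path)                          # once "c" is reached, the chain stays there
```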