E_0 \tau = a^2 \, \frac{b}{b-a} + b^2 \, \frac{-a}{b-a} = \frac{ab(a-b)}{b-a} = -ab

To give a rigorous proof now, we use Theorems 5.13 and 5.11 to conclude
0 = E_0\bigl(S_{\tau\wedge n}^2 - (\tau\wedge n)\bigr)
  = a^2 P(S_\tau = a,\, \tau \le n) + b^2 P(S_\tau = b,\, \tau \le n)
    + E_0(S_n^2;\, \tau > n) - E_0(\tau\wedge n)
P(\tau < \infty) = 1, and on \{\tau > n\} we have S_{\tau\wedge n}^2 \le \max\{a^2, b^2\}, so the third term tends to 0. To handle the fourth term, we note that by (1.6)

E_0(\tau \wedge n) = \sum_{m=0}^{n-1} P(\tau > m) \uparrow \sum_{m=0}^{\infty} P(\tau > m) = E_0 \tau \qquad (5.13)

Putting it all together, we have

0 = a^2 P_0(S_\tau = a) + b^2 P_0(S_\tau = b) - E_0 \tau

and we have proved the result.
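As a numerical sanity check (our addition, not part of the text), the exit probability P_0(S_\tau = a) = b/(b-a) and the expected exit time E_0 \tau = -ab can be estimated by simulation. The function name and parameters below are ours:

```python
import random

def exit_stats(a, b, trials=20000, seed=0):
    """Simulate symmetric simple random walk started at 0 until it
    exits the interval (a, b), with a < 0 < b.

    Returns (fraction of paths exiting at a, average exit time).
    Illustrative sketch; not from the text.
    """
    rng = random.Random(seed)
    hit_a = 0
    total_time = 0
    for _ in range(trials):
        s, n = 0, 0
        while a < s < b:
            s += rng.choice((-1, 1))  # symmetric +/-1 step
            n += 1
        hit_a += (s == a)
        total_time += n
    return hit_a / trials, total_time / trials

p_a, mean_tau = exit_stats(-3, 2)
# Theory predicts P_0(S_tau = a) = b/(b-a) = 2/5 and E_0 tau = -ab = 6.
```

With a = -3 and b = 2, the empirical frequency and mean should be close to the theoretical values 0.4 and 6.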
Consider now a random walk S_n = S_0 + X_1 + \cdots + X_n where X_1, X_2, \ldots are i.i.d. with mean \mu. From Example 5.2, M_n = S_n - n\mu is a martingale with respect to X_n.

Theorem 5.15. Wald's equation. If T is a stopping time with ET < \infty, then

E(S_T - S_0) = \mu ET
Recalling Example 5.9, which has \mu = 0 and S_0 = 1 but S_T = 0, shows that for symmetric simple random walk E_1 V_0 = \infty: if E_1 V_0 were finite, Wald's equation would give -1 = E(S_T - S_0) = \mu \cdot E_1 V_0 = 0, a contradiction.
Why is this true? Theorems 5.13 and 5.11 give

ES_0 = E(S_{T\wedge n}) - \mu E(T \wedge n)

As n \uparrow \infty, E_0(T \wedge n) \uparrow E_0 T by (5.13). To pass to the limit in the other term,
we note that

E\,\bigl|S_T - S_{T\wedge n}\bigr| \le E\Bigl(\sum_{m=n+1}^{T} |X_m| \,;\, T > n\Bigr)
This document was uploaded on 03/06/2014 for the course MATH 4740 at Cornell (Spring '10, Durrett).