Since Sn = Sn−1 + Yn−1, we can express Zn as

Zn = Zn−1 + Yn−1 − g + wXn − wXn−1.        (7.56)

Since E[Yn−1 | Xn−1 = i] = Ȳi and E[wXn | Xn−1 = i] = ∑j Pij wj, we have

E[Zn | Zn−1, Zn−2, . . . , Z1, Xn−1 = i] = Zn−1 + Ȳi − g + ∑j Pij wj − wi.        (7.57)

From (7.53), the final four terms in (7.57) sum to 0, so
E[Zn | Zn−1, . . . , Z1, Xn−1 = i] = Zn−1.        (7.58)

Since this is valid for all choices of Xn−1, we have E[Zn | Zn−1, . . . , Z1] = Zn−1. Since the expected values of all the reward variables Ȳi exist, we see that E[|Yn|] < ∞, so that E[|Zn|] < ∞ also. This verifies that {Zn; n ≥ 1} is a martingale. It can be verified similarly that E[Z1] = 0, so E[Zn] = 0 for all n ≥ 1.

7.6. MARTINGALES AND SUBMARTINGALES 303

In showing that {Zn; n ≥ 1} is a martingale, we actually showed something a little stronger.
That is, (7.58) is conditioned on Xn−1 as well as Zn−1 , . . . , Z1 . In the same way, it follows
that for all n > 1,
E[Zn | Zn−1, Xn−1, Zn−2, Xn−2, . . . , Z1, X1] = Zn−1.        (7.59)

In terms of the gambling analogy, this says that {Zn; n ≥ 1} is fair for each possible
past sequence of states. A martingale {Zn ; n ≥ 1} with this property (i.e., satisfying
(7.59)) is said to be a martingale relative to the joint process {Zn , Xn ; n ≥ 1}. We will use
this martingale later to discuss threshold crossing problems for Markov modulated random
walks. We shall see that the added property of being a martingale relative to {Zn , Xn }
gives us added ﬂexibility in deﬁning stopping times.
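As a concrete sanity check, the construction above can be reproduced numerically. The sketch below uses a hypothetical two-state chain (the values of P and Ȳi are illustrative assumptions, not from the text): it solves the relation (7.53) for the relative-reward vector w, confirms that the four terms in (7.57) cancel, and simulates Zn to confirm that E[Zn] = 0.

```python
import numpy as np

# Hypothetical 2-state chain and per-state expected rewards (illustrative values).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
Ybar = np.array([1.0, 3.0])            # Ybar[i] = E[Yn | Xn = i]; Yn deterministic here

# Steady-state vector: solve pi P = pi with components summing to 1.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
pi = np.linalg.lstsq(A, np.array([0.0, 0.0, 1.0]), rcond=None)[0]
g = pi @ Ybar                          # steady-state gain

# Relative-reward vector w of (7.53): Ybar[i] - g + sum_j P[i,j] w[j] - w[i] = 0.
# The system (I - P) w = Ybar - g is singular; pin w[0] = 0 for a unique solution.
M = np.eye(2) - P
w = np.zeros(2)
w[1:] = np.linalg.lstsq(M[:, 1:], Ybar - g, rcond=None)[0]
assert np.allclose(Ybar - g + P @ w - w, 0.0)   # the final four terms in (7.57) cancel

# Simulate Zn = Sn - n*g + w[Xn] - w[k] and check that E[Zn] is near 0.
rng = np.random.default_rng(0)
k, n, paths = 0, 12, 50_000            # fixed initial state X0 = k
X = np.full(paths, k)
S = np.zeros(paths)
for _ in range(n):
    S += Ybar[X]                       # reward of the current state
    X = (rng.random(paths) < P[X, 1]).astype(int)
Z = S - n * g + w[X] - w[k]
print(abs(Z.mean()))                   # small, consistent with E[Zn] = 0
```

The pinning w[0] = 0 is harmless: solutions of (7.53) differ only by an additive constant, and Zn depends on w only through differences w[Xn] − w[k].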
As an added bonus to this example, note that if {Xn ; n ≥ 0} is taken as the embedded
chain of a Markov process (or semi-Markov process), and if Yn is taken as the time interval from transition n to n + 1, then Sn becomes the epoch of the nth transition in the process.

7.6.3 Generating functions for Markov random walks

Consider the same Markov chain and reward variables as in the previous example, and
assume that for each pair of states i, j, the moment generating function

gij(r) = E[exp(rYn) | Xn = i, Xn+1 = j]        (7.60)

exists over some open interval (r−, r+) containing 0. Let [Γ(r)] be the matrix with terms Pij gij(r). Since [Γ(r)] is an irreducible nonnegative matrix, Theorem 4.6 shows that [Γ(r)] has a largest real eigenvalue, ρ(r) > 0, and an associated positive right eigenvector ν(r) = (ν1(r), . . . , νJ(r))T that is unique within a scale factor. We now show that the process
{Mn(r); n ≥ 1} defined by

Mn(r) = exp(rSn) νXn(r) / (ρ(r)n νk(r))        (7.61)

is a product-type martingale for each r ∈ (r−, r+). Since Sn = Sn−1 + Yn−1, we can express Mn(r) as

Mn(r) = Mn−1(r) exp(rYn−1) νXn(r) / (ρ(r) νXn−1(r)).        (7.62)

The expected value of the ratio in (7.62), conditional on Xn−1 = i, is
E[ exp(rYn−1) νXn(r) / (ρ(r) νi(r)) | Xn−1 = i ] = ∑j Pij gij(r) νj(r) / (ρ(r) νi(r)) = 1,        (7.63)

where, in the last step, we have used the fact that ν(r) is an eigenvector of [Γ(r)]. Thus,
E[Mn(r) | Mn−1(r), . . . , M1(r), Xn−1 = i] = Mn−1(r). Since this is true for all choices of i, the condition on Xn−1 = i can be removed and {Mn(r); n ≥ 1} is a martingale. Also, for n > 1,

E[Mn(r) | Mn−1(r), Xn−1, . . . , M1(r), X1] = Mn−1(r),        (7.64)

so that {Mn(r); n ≥ 1} is also a martingale relative to the joint process {Mn(r), Xn; n ≥ 1}.

304 CHAPTER 7. RANDOM WALKS, LARGE DEVIATIONS, AND MARTINGALES
It can be veriﬁed by the same argument as in (7.63) that E [M1 (r)] = 1. It then follows
that E [Mn (r)] = 1 for all n ≥ 1.
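The eigenvector identity behind (7.63), and the fact that E[Mn(r)] = 1, are both easy to check numerically. The sketch below assumes a hypothetical two-state chain in which Yn is deterministic given the transition (i, j), so that gij(r) = exp(r·y[i, j]); the matrices P and y are illustrative, not from the text.

```python
import numpy as np

# Hypothetical 2-state example: Yn deterministic given the transition (i, j).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
y = np.array([[1.0, 2.0],
              [0.5, 3.0]])
r = 0.2

Gamma = P * np.exp(r * y)              # [Gamma(r)]_ij = P_ij g_ij(r)

# Perron-Frobenius: the largest-magnitude eigenvalue of this positive matrix
# is real and positive, with an eigenvector scalable to be strictly positive.
eigvals, eigvecs = np.linalg.eig(Gamma)
i = np.argmax(np.abs(eigvals))
rho = eigvals[i].real
nu = eigvecs[:, i].real
nu = nu * np.sign(nu[0])               # fix the sign so nu > 0 componentwise

# Componentwise, sum_j P_ij g_ij(r) nu_j = rho * nu_i -- the step used in (7.63).
assert rho > 0 and np.all(nu > 0)
assert np.allclose(Gamma @ nu, rho * nu)

# Monte Carlo check that E[Mn(r)] = 1, with Mn(r) as in (7.61) and X0 = k.
rng = np.random.default_rng(0)
k, n, paths = 0, 8, 200_000
X = np.full(paths, k)
S = np.zeros(paths)
for _ in range(n):
    nxt = (rng.random(paths) < P[X, 1]).astype(int)   # P[X, 1] = Pr{next state is 1}
    S += y[X, nxt]
    X = nxt
M = np.exp(r * S) * nu[X] / (rho**n * nu[k])
print(M.mean())                        # close to 1
```

Note that the scale factor of ν(r) cancels in Mn(r), so the normalization of the eigenvector returned by the solver does not matter.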
One of the uses of this martingale is to provide exponential upper bounds, similar to (7.16),
to the probabilities of threshold crossings for Markov modulated random walks. Deﬁne
M̃n(r) = exp(rSn) minj(νj(r)) / (ρ(r)n νk(r)).        (7.65)
Then M̃n(r) ≤ Mn(r), so E[M̃n(r)] ≤ 1. For any µ > 0, the Markov inequality can be applied to M̃n(r) to get

Pr{M̃n(r) ≥ µ} ≤ E[M̃n(r)]/µ ≤ 1/µ.        (7.66)
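Applying (7.66) with µ chosen so that {Sn ≥ α} implies {M̃n(r) ≥ µ} gives an exponential bound ρ(r)n exp(−rα) νk(r)/minj(νj(r)) on Pr{Sn ≥ α}, which can then be minimized over r. The sketch below does this numerically on a hypothetical two-state chain (P, y, n, α are illustrative assumptions, not from the text); rewards are deterministic given the transition (i, j) and bounded, so the mgf exists for all r.

```python
import numpy as np

# Hypothetical chain and per-transition rewards (illustrative values).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
y = np.array([[1.0, 2.0],
              [0.5, 3.0]])
k, n, alpha = 0, 20, 40.0              # initial state X0 = k, horizon n, threshold

def perron(r):
    """Largest real eigenvalue rho(r) of [Gamma(r)] and a positive eigenvector nu(r)."""
    Gamma = P * np.exp(r * y)          # [Gamma(r)]_ij = P_ij g_ij(r)
    ev, V = np.linalg.eig(Gamma)
    i = np.argmax(np.abs(ev))
    nu = V[:, i].real
    return ev[i].real, nu * np.sign(nu[0])

def bound(r):
    # rho(r)^n exp(-r*alpha) nu_k(r) / min_j nu_j(r): the exponential bound on Pr{Sn >= alpha}
    rho, nu = perron(r)
    return rho**n * np.exp(-r * alpha) * nu[k] / nu.min()

best = min(bound(r) for r in np.linspace(0.01, 2.0, 200))   # optimize over a grid of r > 0

# Monte Carlo estimate of Pr{Sn >= alpha}; it should lie below the optimized bound.
rng = np.random.default_rng(1)
paths = 100_000
X = np.full(paths, k)
S = np.zeros(paths)
for _ in range(n):
    nxt = (rng.random(paths) < P[X, 1]).astype(int)
    S += y[X, nxt]
    X = nxt
p_hat = (S >= alpha).mean()
print(p_hat, best)                     # estimated probability vs. optimized bound
```

As with scalar Chernoff bounds, the grid search over r trades a little tightness for simplicity; a one-dimensional minimizer could be substituted.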
µ (7.66) For any given α, and any given r, 0 ≤ r < r+ , we can choose µ = exp(rα)ρ(r)−n minj (∫j (r))/∫k (r),
and for r > 0. Combining (7.65) and (7.66),
Pr {Sn ≥ α} ≤ ρ(r)n exp(−rα)∫k (r)/ min(∫j (r)).
j (7.67) This can be optimized over r to get the tightest bound in the same way as (7.16). 7.6.4 Scaled branching processes A ﬁnal example of a martingale is a “scaled down” version of a branching process {Xn ; n ≥
0}. Recall from Section 5.2 that, for each n, Xn is deﬁned as the aggregate number of
elements in generation n. Each element i of generation n, 1 ≤ i ≤ Xn has a number Yi,n
P
of oﬀspring which collectively constitute generation n + 1, i.e., Xn+1 = Xn Yi,n . The rv’s
i=1
Yi,n are IID over both i and n.
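The branching recursion can be checked in one generation: conditional on Xn−1 = m, the next generation is the sum of m IID offspring counts. The sketch below uses a hypothetical offspring distribution (0, 1, or 2 offspring with probabilities 0.2/0.5/0.3 — an illustrative assumption, not from the text).

```python
import numpy as np

# Hypothetical offspring distribution: 0, 1, or 2 offspring w.p. 0.2/0.5/0.3.
rng = np.random.default_rng(0)
probs = np.array([0.2, 0.5, 0.3])
Ybar = probs @ np.arange(3)            # mean offspring per element (1.1 here)

m = 50                                 # condition on X_{n-1} = m elements
trials = 20_000
# Each trial draws the offspring counts Y_{i,n} of all m elements and sums them.
Xn = np.array([rng.choice(3, size=m, p=probs).sum() for _ in range(trials)])
print(Xn.mean(), Ybar * m)             # sample mean of Xn is close to Ybar * m
```

Dividing out the growth factor, i.e., scaling Xn by Ȳn, is what removes the factor of Ȳ from the conditional mean, which is the direction this section is heading.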
Let Ȳ = E[Yi,n] be the mean number of offspring of each element of the population. Then E[Xn | Xn−1] = Ȳ Xn−1, which resembles a martingale except for the factor of Ȳ. We can c...