each of mean X̄. If J is a stopping time for {Xn; n ≥ 1}, E[J] < ∞, and SJ = X1 + X2 + · · · + XJ, then

    E[SJ] = X̄ E[J].    (3.10)

Proof: We can express SJ as

    SJ = Σ_{n=1}^∞ Xn In,    where In = 1 if J ≥ n and In = 0 if J < n.    (3.11)

By the definition of a stopping rule, In is independent of Xn, Xn+1, . . . , and thus in particular
independent of Xn. Thus, E[Xn In] = E[Xn] E[In] = X̄ E[In]. We then have
    E[SJ] = E[ Σ_{n=1}^∞ Xn In ]
          = Σ_{n=1}^∞ E[Xn In]    (3.12)
          = Σ_{n=1}^∞ X̄ E[In]
          = X̄ E[J].    (3.13)

The interchange of expectation and infinite sum in (3.12) is obviously valid for a finite sum,
and is also valid for the infinite sum if E[J] < ∞, although we do not prove that here. The
final step above comes from the observation that E[In] = Pr{In = 1} = Pr{J ≥ n} and,
since J is a positive integer rv, E[J] = Σ_{n≥1} Pr{J ≥ n}. One can also obtain the last step
by using J = Σ_{n≥1} In (see Exercise 3.4).

It can be seen from the proof that the result essentially holds under the weaker conditions
that the random variables Xn all have the same mean and that for each n, Xn and In are
uncorrelated. In this case, however, added conditions are necessary in order to exchange
the sum and the expectation in (3.12).
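Wald's equality can be checked by simulation. The sketch below is a minimal Monte Carlo check; the exponential interarrival distribution, the stopping threshold, and all variable names are illustrative choices of ours, not from the text. The stopping time is J = first n for which Sn exceeds the threshold, which depends only on X1, . . . , Xn.

```python
import random

# Monte Carlo check of Wald's equality, E[S_J] = X_bar * E[J].
# Stopping time J: first n for which S_n exceeds a threshold.
# Exponential interarrivals and the threshold are illustrative choices.
random.seed(1)

X_BAR = 2.0        # mean of each X_n
THRESHOLD = 10.0
TRIALS = 200_000

sum_SJ, sum_J = 0.0, 0
for _ in range(TRIALS):
    s, n = 0.0, 0
    while s <= THRESHOLD:              # stop at the first n with S_n > threshold
        s += random.expovariate(1.0 / X_BAR)
        n += 1
    sum_SJ += s
    sum_J += n

E_SJ, E_J = sum_SJ / TRIALS, sum_J / TRIALS
print(E_SJ, X_BAR * E_J)               # the two estimates should nearly agree
```

For this particular choice (memoryless intervals, threshold 10) both sides come out close to 12, since the overshoot beyond the threshold is again exponential with mean X̄.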
What this result essentially says in terms of gambling is that strategies for when to stop
betting are not really effective as far as the mean is concerned. This is one of those strange
results that sometimes appear obvious and sometimes appear very surprising, depending on
the application.
We next use Wald’s equality in evaluating m(t). Consider an experiment in which we
observe successive interarrival intervals until the sum ﬁrst exceeds t. From Figure 3.5,
note that SN (t)+1 is the epoch of the ﬁrst arrival after t, and thus N (t) + 1 is the number
of intervals observed until the sum exceeds t. We now show that N(t) + 1 is a stopping
time for the interarrival sequence {Xn ; n ≥ 1}. Informally, the decision to stop when the
sum exceeds t depends only on the interarrival intervals already observed. More formally,
N (t) + 1 is a random variable and the associated decision variable In has the value 1 for
N(t) + 1 ≥ n, which is equivalent to Sn−1 ≤ t. This is a function only of X1, . . . , Xn−1, and
thus independent of Xn, Xn+1, . . . , verifying that N(t) + 1 is a stopping time for X1, X2, . . . .
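A minimal sketch of this stopping rule on one sample path (the exponential interval distribution and the parameters are our illustrative choices): the number of intervals observed before the running sum first exceeds t is exactly N(t) + 1, and no look-ahead is ever needed.

```python
import random

# One sample path: the number of interarrival intervals observed until the
# running sum first exceeds t is exactly N(t) + 1, and the decision to stop
# depends only on intervals already observed.  Exponential intervals are
# an illustrative choice.
random.seed(7)
t = 25.0
mean_x = 1.5

s, n = 0.0, 0
while s <= t:                      # observe X_1, X_2, ... one at a time
    s += random.expovariate(1.0 / mean_x)
    n += 1                         # n intervals observed so far

N_t = n - 1                        # arrivals in (0, t]
print("N(t) =", N_t, "| intervals observed =", n, "| first epoch after t:", round(s, 2))
```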
[Figure: a line of slope X̄ from the origin, with the levels m(t) and m(t) + 1 marked against the epochs E[SN(t)], t, and E[SN(t)+1].]

Figure 3.5: Illustration of Wald's equality, (3.14), applied to N(t) + 1.
Note that N(t) is not a stopping time for X1, X2, . . . . For any given n, observation of
X1, . . . , Xn−1 with Sn−1 < t does not specify whether or not N(t) ≥ n. One would have
to peek ahead at Xn to verify whether or not Sn exceeds t.
Since N(t) + 1 is a stopping time, however, Wald's equality yields

    E[SN(t)+1] = X̄ E[N(t) + 1] = X̄ [m(t) + 1]
    m(t) = E[SN(t)+1]/X̄ − 1.    (3.14)

Since E[SN(t)+1] ≥ t, we have m(t) ≥ t/X̄ − 1, and

    m(t)/t ≥ 1/X̄ − 1/t.    (3.15)
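The lower bound (3.15) can be checked numerically. The sketch below estimates m(t) by simulation; the uniform interarrival distribution and the parameters are illustrative choices of ours.

```python
import random

# Numerical check of the bound (3.15): m(t)/t >= 1/X_bar - 1/t.
# Uniform[0, 2] interarrivals (mean X_bar = 1) are an illustrative choice.
random.seed(3)
X_BAR = 1.0
t = 20.0
TRIALS = 50_000

total_arrivals = 0
for _ in range(TRIALS):
    s, count = 0.0, 0
    while True:
        s += random.uniform(0.0, 2.0)
        if s > t:
            break
        count += 1                     # arrival at epoch s <= t
    total_arrivals += count

m_t = total_arrivals / TRIALS          # estimate of m(t) = E[N(t)]
print(m_t / t, ">=", 1.0 / X_BAR - 1.0 / t)
```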
If we had a similar upper bound on E[SN(t)+1] − t, we could easily show that m(t)/t
approaches 1/X̄ in the limit t → ∞, but unfortunately E[SN(t)+1] might be larger than t
by a surprising amount. The difference SN(t)+1 − t is the interval from t to the next arrival
and is known as the residual life of the renewal process at t. We shall see subsequently
that its expected value, in the limit as t → ∞, is E[X²]/(2E[X]). Substituting this into
(3.14), we find the same limiting expression for m(t) as in (3.9) (which was restricted to
interrenewal intervals with a rational Laplace transform). Since E[X²] can be arbitrarily
large, and even infinite, this does not show that m(t)/t → 1/X̄.
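As a sanity check on the limiting residual-life formula: for exponential interarrivals of rate λ, E[X²]/(2E[X]) = (2/λ²)/(2/λ) = 1/λ, which is the full mean interarrival time, consistent with the memoryless property. The simulation below (parameters are our illustrative choices) compares the sampled mean residual life at a large t against that value.

```python
import random

# Simulated mean residual life vs the limiting formula E[X^2]/(2 E[X]).
# For exponential interarrivals of rate lam, the formula gives 1/lam,
# consistent with the memoryless property.  Parameters are illustrative.
random.seed(11)
LAM = 0.5                  # rate; mean interarrival X_bar = 1/LAM = 2
t = 200.0                  # large t to approximate the limit
TRIALS = 20_000

total_residual = 0.0
for _ in range(TRIALS):
    s = 0.0
    while s <= t:
        s += random.expovariate(LAM)
    total_residual += s - t            # S_{N(t)+1} - t, the residual life at t

est = total_residual / TRIALS
predicted = (2.0 / LAM**2) / (2.0 / LAM)   # E[X^2]/(2E[X]) = 1/LAM
print(est, predicted)
```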
The reason that the expected residual life can be so large can be seen by an example.
Suppose that X is 0 with probability 1 − ε and 1/ε with probability ε, and that ε is very
small. Then X̄ = 1, but arrivals occur bunched together with a large gap of 1/ε between
successive bunches. Most points t lie in these gaps, and the residual life is large over most
of each gap (we discuss this example in more detail later). Fortunately, it turns out that the
familiar truncation method allows us to circumvent these problems and prove the following
theorem.
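The example above is easy to simulate. In the sketch below, ε and t are our illustrative choices; t is placed mid-gap, where the residual life is about 1/(2ε) = 50 even though X̄ = 1.

```python
import random

# The epsilon example: X = 0 w.p. 1 - eps, X = 1/eps w.p. eps, so X_bar = 1,
# yet almost every t falls in a gap of length 1/eps where the residual life
# is enormous.  eps and t are illustrative; t = 550 sits mid-gap between
# arrival bunches at multiples of 1/eps = 100.
random.seed(5)
EPS = 0.01
t = 550.0
TRIALS = 5_000

total_residual = 0.0
for _ in range(TRIALS):
    s = 0.0
    while s <= t:
        s += 0.0 if random.random() < 1.0 - EPS else 1.0 / EPS
    total_residual += s - t            # residual life S_{N(t)+1} - t

mean_residual = total_residual / TRIALS
print(mean_residual)                   # about 50 = 1/(2*eps), despite X_bar = 1
```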
Theorem 3.4 (The Elementary Renewal Theorem). Let {N(t); t ≥ 0} be a renewal
counting process with mean interrenewal interval X̄. Then lim_{t→∞} E[N(t)]/t = 1/X̄.

Proof: Let X̃i = Xi for Xi ≤ b and let X̃i = b for Xi > b. Since these truncated
random variables are IID, they form a related renewal counting process {Ñ(t); t > 0} with
m̃(t) = E[Ñ(t)] and S̃n = X̃1 + · · · + X̃n. Since the nth arrival in this truncated process
arrives no later than the nth arrival in the original process, Ñ(t) ≥ N(t), so m̃(t) ≥ m(t).
Finally, in the truncated process, E[S̃Ñ(t)+1] . . .
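Although the excerpt is cut off here, the pathwise comparison the proof relies on, namely that clipping each interarrival at b can only move arrival epochs earlier and hence gives Ñ(t) ≥ N(t), can be checked directly by simulation. The distribution, b, and t below are illustrative choices of ours.

```python
import random

# Pathwise check of the truncation step: clipping each interarrival at b
# can only move every arrival epoch earlier, so N_tilde(t) >= N(t) on
# every sample path.  b, t, and the distribution are illustrative.
random.seed(9)
b, t = 1.0, 50.0

def count_arrivals(xs, horizon):
    """Number of arrival epochs S_n = x_1 + ... + x_n that land in (0, horizon]."""
    s, n = 0.0, 0
    for x in xs:
        s += x
        if s > horizon:
            break
        n += 1
    return n

for _ in range(1_000):
    xs = [random.expovariate(0.5) for _ in range(1_000)]   # mean 2 each
    clipped = [min(x, b) for x in xs]
    assert count_arrivals(clipped, t) >= count_arrivals(xs, t)
print("N_tilde(t) >= N(t) held on every simulated path")
```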
This note was uploaded on 09/27/2010 for the course EE 229 taught by Professor R.srikant during the Spring '09 term at University of Illinois, Urbana Champaign.