must approach a limit as t → ∞; and third, the limits must be the same. The following example,
for a stochastic process very diﬀerent from a renewal process, shows that equality between
time and ensemble averages is not always satisﬁed for arbitrary processes.
Example 3.2.1. Let {Xi ; i ≥ 1} be a sequence of binary IID random variables, each taking
the value 0 with probability 1/2 and 2 with probability 1/2. Let {Mn ; n ≥ 1} be the product
process in which Mn = X1 X2 · · · Xn. Since Mn = 2^n if X1 to Xn each take the value 2 (an event of probability 2^{−n}) and Mn = 0 otherwise, we see that lim_{n→∞} Mn = 0 with probability 1. Also E[Mn] = 1 for all n ≥ 1. Thus the time average exists and equals 0 with probability 1 and the ensemble average exists and equals 1 for all n, but the two are different. The problem is that as n increases, the atypical event in which Mn = 2^n has a probability approaching 0, but still has a significant effect on the ensemble average.
Before establishing the results about ensemble averages, we state and briefly discuss the
central limit theorem for renewal processes.
Theorem 3.2 (Central Limit Theorem for N(t)). Assume that the inter-renewal intervals for a renewal counting process {N(t); t ≥ 0} have finite standard deviation σ > 0. Then

    lim_{t→∞} Pr{ [N(t) − t/X̄] / (σ X̄^{−3/2} √t) < α } = Φ(α),        (3.3)

where Φ(y) = ∫_{−∞}^{y} (1/√(2π)) exp(−x²/2) dx.
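As a numerical sanity check of (3.3) (a sketch only, not part of the proof): assume exponential inter-renewal times with rate 1, so that X̄ = σ = 1 and (3.3) reduces to Pr{(N(t) − t)/√t < α} → Φ(α). The function names below are illustrative.

```python
import math
import random

random.seed(0)

def count_renewals(t):
    """N(t): the number of partial sums S_n = X_1 + ... + X_n with S_n <= t."""
    s, n = 0.0, 0
    while True:
        s += random.expovariate(1.0)  # one inter-renewal interval
        if s > t:
            return n
        n += 1

t, trials, alpha = 400.0, 20_000, 1.0
# Empirical Pr{ (N(t) - t/Xbar) / (sigma * Xbar**(-3/2) * sqrt(t)) < alpha },
# which for Xbar = sigma = 1 is just Pr{ (N(t) - t) / sqrt(t) < alpha }.
hits = sum((count_renewals(t) - t) / math.sqrt(t) < alpha
           for _ in range(trials))
phi = 0.5 * (1.0 + math.erf(alpha / math.sqrt(2.0)))  # Phi(alpha)
print(hits / trials, phi)  # the two should agree to about two decimals
```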
[Figure 3.4: Illustration of the central limit theorem for renewal processes. A given integer n is shown on the vertical axis, and the corresponding mean, E[Sn] = nX̄, is shown on the horizontal axis. The horizontal line with arrows at height n indicates α standard deviations (α√n σ) from E[Sn], and the vertical line with arrows indicates the distance below t/X̄. The plotted line has slope 1/X̄.]

This says that N(t) tends to Gaussian with mean t/X̄ and standard deviation σ X̄^{−3/2} √t.

The theorem can be proved by applying the central limit theorem (CLT) for a sum of IID
rv’s, (1.56), to Sn and then using the identity {Sn ≤ t} = {N (t) ≥ n}. The general idea
is illustrated in Figure 3.4, but the details are somewhat tedious, and can be found, for
example, in [16]. We simply outline the argument here. For any real α, the CLT states that
    Pr{Sn ≤ nX̄ + α√n σ} ≈ Φ(α)
where Φ(α) = ∫_{−∞}^{α} (1/√(2π)) exp(−x²/2) dx and where the approximation becomes exact in the limit n → ∞. Letting
t = nX̄ + α√n σ, and using {Sn ≤ t} = {N(t) ≥ n},

    Pr{N(t) ≥ n} ≈ Φ(α).        (3.4)

Since t is monotonic in n for fixed α, we can express n in terms of t, getting
    n = t/X̄ − ασ√n/X̄ ≈ t/X̄ − ασ t^{1/2} X̄^{−3/2},

where the approximation replaces √n by its leading term (t/X̄)^{1/2}.
Substituting this into (3.4) establishes the theorem for −α, which establishes the theorem since α is arbitrary. The omitted details involve handling the approximations carefully.

3.3 Expected number of renewals

Let E[N(t)] be denoted by m(t) in what follows. We first find an exact expression for
m(t). This is often quite messy for large t, so we then ﬁnd the asymptotic behavior of
m(t). Since N(t)/t approaches 1/X̄ with probability 1, we expect m(t) to grow with a slope that asymptotically approaches 1/X̄, but we will find that this is not quite true in
general. Two somewhat weaker results, however, are true. The ﬁrst, called the elementary
renewal theorem (Theorem 3.4), states that lim_{t→∞} m(t)/t = 1/X̄. The second result, called Blackwell’s theorem (Theorem 3.5), states that, subject to some limitations on δ > 0, lim_{t→∞} [m(t + δ) − m(t)] = δ/X̄. This says essentially that the expected renewal rate approaches steady-state as t → ∞. We will find a large number of applications of Blackwell’s
theorem throughout the remainder of the text.
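Both limits can be checked by simulation. The sketch below (illustrative, not from the text) uses Uniform(0, 2) inter-renewal intervals, so X̄ = 1; the elementary renewal theorem then predicts m(t)/t ≈ 1 for large t, and Blackwell’s theorem predicts m(t + δ) − m(t) ≈ δ for any δ > 0, since the uniform distribution is non-arithmetic.

```python
import random

random.seed(0)

def renewals_up_to(t):
    """One sample of N(t) with Uniform(0, 2) inter-renewal intervals."""
    s, n = 0.0, 0
    while True:
        s += random.uniform(0.0, 2.0)
        if s > t:
            return n
        n += 1

def m_hat(t, trials=10_000):
    """Monte Carlo estimate of m(t) = E[N(t)]."""
    return sum(renewals_up_to(t) for _ in range(trials)) / trials

t, delta = 100.0, 1.0
ratio = m_hat(t) / t                # elementary renewal theorem: -> 1/Xbar = 1
incr = m_hat(t + delta) - m_hat(t)  # Blackwell: -> delta/Xbar = 1
print(ratio, incr)
```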
The exact calculation of m(t) makes use of the fact that the expectation of a nonnegative
random variable is the integral of its complementary distribution function,
    m(t) = E[N(t)] = Σ_{n=1}^{∞} Pr{N(t) ≥ n}.

Since the event {N(t) ≥ n} is the same as {Sn ≤ t}, m(t) is expressed in terms of the distribution
functions of Sn , n ≥ 1, as follows.
    m(t) = Σ_{n=1}^{∞} Pr{Sn ≤ t}.        (3.5)

Although this expression looks fairly simple, it becomes increasingly complex with increasing
t. As t increases, there is an increasing set of values of n for which Pr {Sn ≤ t} is signiﬁcant,
and Pr {Sn ≤ t} itself is not that easy to calculate if the interarrival distribution FX (x)
is complicated. The main utility of (3.5) comes from the fact that it leads to an integral
equation for m(t). Since Sn = Sn−1 + Xn for each n ≥ 1 (interpreting S0 as 0), and since
Xn and Sn−1 are independent, we can use the convolution equation (1.11) to get
    Pr{Sn ≤ t} = ∫_{x=0}^{t} Pr{Sn−1 ≤ t − x} dFX(x)        for n ≥ 2.

Substituting this in (3.5) for n ≥ 2 and using the fact th...
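The sum (3.5) can be evaluated directly in at least one case: for exponential interarrivals with rate λ, Sn has the Erlang distribution, so each term Pr{Sn ≤ t} is explicit, and the sum must recover the Poisson-process answer m(t) = λt. A small numerical sketch (illustrative names, truncation level chosen for convenience):

```python
import math

def erlang_cdf(n, lam, t):
    """Pr{S_n <= t} for exponential(lam) interarrivals:
    1 - sum_{k=0}^{n-1} e^{-lam t} (lam t)^k / k!, computed iteratively
    so that no individual factorial overflows."""
    x = lam * t
    term, total = math.exp(-x), 0.0
    for k in range(n):
        total += term          # term == e^{-x} * x**k / k!
        term *= x / (k + 1)
    return 1.0 - total

def m_of_t(lam, t, n_max=200):
    """Truncation of (3.5); terms with n >> lam*t are negligible."""
    return sum(erlang_cdf(n, lam, t) for n in range(1, n_max + 1))

lam, t = 2.0, 10.0
m = m_of_t(lam, t)
print(m)  # ≈ lam * t = 20.0
```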
This note was uploaded on 09/27/2010 for the course EE 229 taught by Professor R.srikant during the Spring '09 term at University of Illinois, Urbana Champaign.