can similarly define the probability of an arbitrary event A conditional on a given value
of a rv Y with a density as

Pr{A | Y = y} = lim_{δ→0} Pr{A, Y ∈ [y, y + δ]} / Pr{Y ∈ [y, y + δ]}.

For n rv's X1, . . . , Xn, we define statistical independence by the equation
F(x1, . . . , xn) = ∏_{i=1}^{n} Pr{Xi ≤ xi} = ∏_{i=1}^{n} FXi(xi)   for all values of x1, . . . , xn.   (1.18)

Another way to state this is that X1, . . . , Xn are independent if the events Xi ≤ xi for
1 ≤ i ≤ n are independent for all choices of x1 , . . . , xn . If the density or PMF exists, (1.18)
is equivalent to a product form for the density or mass function. A set of rv’s is said to be
pairwise independent if each pair of rv’s in the set is independent. As shown in Exercise
1.19, pairwise independence does not imply that the entire set is independent.
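As a concrete illustration of this distinction (the construction and variable names are ours, not the text's), let X and Y be IID fair binary rv's and let Z = X ⊕ Y. Any two of X, Y, Z are independent, but the triple is not, since Z is determined by X and Y. A short enumeration check:

```python
from itertools import product

# X, Y are fair binary rv's; Z = X XOR Y.  The 4 outcomes (x, y) are
# equally likely, so enumerate them to get the joint PMF of (X, Y, Z).
outcomes = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]
p = 1 / len(outcomes)  # each outcome has probability 1/4

def pr(event):
    """Probability that the predicate event(x, y, z) holds."""
    return sum(p for (x, y, z) in outcomes if event(x, y, z))

# Pairwise independence: Pr{A=a, B=b} = Pr{A=a} Pr{B=b} for each pair.
for a, b in product([0, 1], repeat=2):
    assert pr(lambda x, y, z: x == a and z == b) == \
        pr(lambda x, y, z: x == a) * pr(lambda x, y, z: z == b)
    assert pr(lambda x, y, z: y == a and z == b) == \
        pr(lambda x, y, z: y == a) * pr(lambda x, y, z: z == b)

# But the triple is NOT independent: {X=1, Y=1} forces Z = 0, so
# Pr{X=1, Y=1, Z=1} = 0, while the product of the marginals is 1/8.
print(pr(lambda x, y, z: x == 1 and y == 1 and z == 1))  # 0
print(pr(lambda x, y, z: x == 1)
      * pr(lambda x, y, z: y == 1)
      * pr(lambda x, y, z: z == 1))  # 0.125
```

This is the standard counterexample behind the exercise cited above: every pairwise product check passes, yet the three-way product in (1.18) fails.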
Independent rv’s are very often also identically distributed, i.e., they all have the same
distribution function. These cases arise so often that we abbreviate independent identically
distributed by IID. For the IID case (1.18) becomes
F(x1, . . . , xn) = ∏_{i=1}^{n} FX(xi).   (1.19)

1.3.5 Stochastic processes

A stochastic13 process (or random process) is an infinite collection of rv's, usually indexed
by an integer or a real number that is often interpreted as discrete or continuously varying
time. Thus each sample point of the probability model maps into an inﬁnite collection of
sample values of rv’s. If the index is regarded as time, then each sample point maps into a
function of time called a sample function. These sample functions might vary continuously
with time or might vary at only discrete times, and if they vary at discrete times, those
times can be deterministic or random.
In many cases, this collection of rv’s comprising the stochastic process is the only thing of
interest. In this case the sample points of the probability model can be taken to be the
sample functions of the process. Conceptually, then, each event is a set of sample functions.
Usually these events are characterized in terms of a ﬁnite set of rv’s.
As an example of sample functions that vary at only discrete times, we might be concerned
with arrivals to some system. The arrivals might model incoming jobs for a computer
system, arriving packets to a communication system, patients in a health care system, or
orders for some merchandising warehouse.
The Bernoulli process is an example of an arrival process and is probably the simplest
imaginable nontrivial stochastic process. The following example deﬁnes this process and
develops a few of its many properties. We will often return to it as an example.
Example 1.3.1. The Bernoulli process: A Bernoulli process is an IID sequence, Y1 , Y2 , . . . ,
of binary random variables. Let q = Pr {Yi = 1} and 1 − q = Pr {Yi = 0}. We visualize time
as a discrete variable and the event {Yi = 1} as an arrival at time i and {Yi = 0} as no
arrival. There can be at most one arrival at each discrete time. We visualize the process as
starting at time 0, with the first opportunity for an arrival at time 1.

13 Stochastic and random are synonyms, but random has become more popular for rv's and stochastic for
stochastic processes, perhaps because both the word random and the word process are used and misused in
so many ways.
Several of the random variables of interest in a Bernoulli process are illustrated in Figure
1.2. Note that the counting process N(t) = ∑_{i=1}^{t} Yi gives the number of arrivals up to and
including each time t. It is an alternate way to specify the process, since Yi = N(i) − N(i−1).
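These definitions can be sketched in code (a minimal sketch; the names q, Y, N, S, X follow the text, while the parameter values and seed are arbitrary choices of ours):

```python
import random

random.seed(1)
q = 0.3   # Pr{Y_i = 1}
n = 20    # time horizon

# One sample path of the Bernoulli process Y_1, ..., Y_n.
Y = [1 if random.random() < q else 0 for _ in range(n)]

# Counting process: N[t] = number of arrivals up to and including time t,
# with N[0] = 0 (the process starts at time 0, first arrival chance at 1).
N = [0]
for y in Y:
    N.append(N[-1] + y)

# Recover Y_i from N, as in the text: Y_i = N(i) - N(i-1).
assert all(Y[i - 1] == N[i] - N[i - 1] for i in range(1, n + 1))

# Arrival epochs S_j (times i with Y_i = 1) and interarrival intervals X_j:
# X_1 = S_1 and X_j = S_j - S_{j-1} for j > 1.
S = [i for i in range(1, n + 1) if Y[i - 1] == 1]
X = ([S[0]] + [S[j] - S[j - 1] for j in range(1, len(S))]) if S else []
print(S, X)
```

Note the design choice of storing N with an extra leading zero so that N[t] indexes directly by time, which keeps the identity Yi = N(i) − N(i−1) readable.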
(Sample path shown in the figure: Yi = 0, 1, 1, 0, 0, 1, 0, 0 for i = 1, . . . , 8, giving arrival
epochs S1 = 2, S2 = 3, S3 = 6.)

Figure 1.2: For a Bernoulli process with binary rv's Yi, the counting process at each
discrete time t is N(t) = ∑_{i=1}^{t} Yi. The arrival epoch Sj is the smallest t for which
N(t) = j, i.e., it is the epoch of the jth arrival. The interarrival interval Xj is S1 for
j = 1 and Sj − Sj−1 for j > 1.

The rv N(t), for any given t and q, is a binomial rv and is a good example of the sums of
IID rv's involved in the laws of large numbers. The PMF pN(t)(k) is the probability that k
out of t of the Yi's have the value 1. There are (t choose k) arrangements of t binary numbers with
k 1's, and each has probability q^k (1 − q)^{t−k}. Thus

pN(t)(k) = (t choose k) q^k (1 − q)^{t−k}.
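The counting argument behind this PMF can be checked by brute force for small t: sum the probabilities of all 2^t binary sequences with exactly k ones and compare with the binomial formula (a sketch; t and q are arbitrary values of ours):

```python
from itertools import product
from math import comb

t, q = 6, 0.3

def p_N(k):
    """Binomial PMF of N(t): probability of exactly k arrivals in t slots."""
    return comb(t, k) * q**k * (1 - q)**(t - k)

# Brute-force check: every length-t binary sequence with k ones has
# probability q^k (1-q)^(t-k); there are comb(t, k) such sequences.
for k in range(t + 1):
    brute = sum(
        q**sum(seq) * (1 - q)**(t - sum(seq))
        for seq in product([0, 1], repeat=t)
        if sum(seq) == k
    )
    assert abs(brute - p_N(k)) < 1e-12

print([round(p_N(k), 4) for k in range(t + 1)])
```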
The interarrival intervals Xj are also illustrated in the ﬁgure. The rv X1 is simply the time
of the ﬁrst arrival. It has the value 1 if Y1 = 1, and thus pX1 (1) = q . It has the value 2
if Y1 = 0 and Y2 = 1, so pX1 (2) = q (1 − q ). Continuing, we see that X1 has the geometric
PMF,
pX1(j) = q(1 − q)^{j−1}.
We can see that each subsequent Xj , conditional on Sj −1 , can be found in the same way
and has this same geometric PMF. Also, since this conditional PMF does not depend on
Sj −1 , Xj is independent of Sj −1 and thus14 also of all previous Xj...
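The geometric PMF of X1 can also be seen numerically by simulating the time of the first arrival many times and comparing the empirical frequencies with q(1 − q)^{j−1} (a sketch; q, the seed, and the sample size are our choices):

```python
import random
from collections import Counter

random.seed(0)
q, trials = 0.4, 200_000

def first_arrival_time():
    """Time X_1 of the first arrival in a fresh Bernoulli process."""
    i = 1
    while random.random() >= q:  # Y_i = 0 with probability 1 - q
        i += 1
    return i

counts = Counter(first_arrival_time() for _ in range(trials))

# Empirical frequencies vs. the exact geometric PMF pX1(j) = q (1-q)^(j-1).
for j in range(1, 6):
    empirical = counts[j] / trials
    exact = q * (1 - q)**(j - 1)
    assert abs(empirical - exact) < 0.01
    print(j, round(empirical, 4), round(exact, 4))
```

Because each trial restarts the process, the same simulation also illustrates why each Xj, conditional on Sj−1, has this same PMF: the Yi's after Sj−1 are a fresh IID sequence.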
This note was uploaded on 09/27/2010 for the course EE 229 taught by Professor R. Srikant during the Spring '09 term at the University of Illinois, Urbana-Champaign.