APPENDIX H
INTRODUCTION TO PROBABILITY AND RANDOM PROCESSES
This appendix is not intended to be a definitive dissertation on the subject of random processes. The major concepts, definitions, and results which are employed in the text are stated here with little discussion and no proof. The reader who requires a more complete presentation of this material is referred to any one of several excellent books on the subject: among them Davenport and Root (Ref. 2), Laning and Battin (Ref. 3), and Lee (Ref. 4). Possibly the most important function served by this appendix is the definition of the notation and of certain conventions used in the text.
PROBABILITY
Consider an event E which is a possible outcome of a random experiment. We denote by P(E) the probability of this event, and think of it intuitively as the limit, as the number of trials becomes large, of the ratio of the number of times E occurred to the number of times the experiment was tried. The joint event that A and B and C, etc., occurred is denoted by ABC..., and the probability of this joint event by P(ABC...). If these events A, B, C, etc., are mutually independent, which means that the occurrence of any one of them bears no relation to the occurrence of any other, the probability of the joint event is the product of the probabilities of the simple events. That is,

$$ P(ABC\ldots) = P(A)\,P(B)\,P(C)\ldots \tag{H.1} $$
if the events A, B, C, etc., are mutually independent. Actually, the mathematical definition of independence is the reverse of this statement, but the result of consequence is that independence of events and the multiplicative property of probabilities go together.
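The frequency interpretation of P(E) and the multiplicative property (H.1) can both be illustrated numerically. The following is a minimal sketch, not part of the text: it assumes two hypothetical independent events — A, a fair coin showing heads, and B, a fair die showing a six — and checks that the observed relative frequency of the joint event AB approaches P(A)P(B) as the number of trials grows.

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

N = 100_000  # number of independent trials
count_A = count_B = count_AB = 0

for _ in range(N):
    # Two independent sub-experiments per trial:
    # event A: a fair coin shows heads; event B: a fair die shows a six.
    a = random.random() < 0.5
    b = random.randint(1, 6) == 6
    count_A += a
    count_B += b
    count_AB += a and b

# Relative frequencies approximate the probabilities.
p_A = count_A / N
p_B = count_B / N
p_AB = count_AB / N

# For independent events, p_AB should be close to p_A * p_B (about 1/12).
print(p_A, p_B, p_AB, p_A * p_B)
```

Running the same experiment with dependent events (e.g., B defined in terms of the same coin as A) would break the agreement between p_AB and p_A * p_B, which is exactly the content of the multiplicative property.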
RANDOM VARIABLES
A random variable X is, in simplest terms, a variable which takes on values at random; it may be thought of as a function of the outcomes of some random experiment. The manner of specifying the probability with which different values are taken by the random variable is by the probability distribution function F(x), which is defined by

$$ F(x) = P(X \le x) \tag{H.2} $$

or by the probability density function f(x), which is defined by

$$ f(x) = \lim_{dx \to 0} \frac{F(x + dx) - F(x)}{dx} = \lim_{dx \to 0} \frac{P(x < X \le x + dx)}{dx} \tag{H.3} $$

The inverse of the defining relation for the probability density function is

$$ F(x) = \int_{-\infty}^{x} f(u)\, du \tag{H.4} $$

An evident characteristic of any probability distribution or density function is

$$ F(-\infty) = 0, \qquad F(\infty) = 1, \qquad \int_{-\infty}^{\infty} f(x)\, dx = 1 \tag{H.5} $$

From the definition, the interpretation of f(x) as the density of probability of the event that X takes a value in the vicinity of x is clear. This function is finite if the probability that X takes a value in the infinitesimal interval between x and x + dx (the interval closed on the right) is an infinitesimal of order dx. This is usually true of random variables which take values over a continuous range. If, however, X takes a set of discrete values x_i with nonzero probabilities p_i, f(x) is infinite at these values of x. This is accommodated by a set of delta functions weighted by the appropriate probabilities.
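The relationship between F(x) and f(x) can be checked empirically. The sketch below, not part of the text, assumes a hypothetical X uniform on [0, 1), for which F(x) = x and f(x) = 1; it builds the empirical distribution function from samples and then applies the finite-difference version of the limit defining f(x), with a small but nonzero dx standing in for the limit.

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

N = 100_000
samples = [random.random() for _ in range(N)]  # X uniform on [0, 1)

def F_hat(x):
    """Empirical distribution function: fraction of samples with X <= x."""
    return sum(s <= x for s in samples) / N

# Finite-difference approximation to f(x): (F(x + dx) - F(x)) / dx,
# with dx small but finite in place of the limit.
x, dx = 0.4, 0.01
Fx = F_hat(x)
f_hat = (F_hat(x + dx) - Fx) / dx

# For this uniform variable, F(0.4) = 0.4 and f(0.4) = 1.
print(Fx, f_hat)
```

For a discrete random variable, the same finite-difference quotient would blow up as dx shrinks at each value x_i carrying probability p_i, which is the numerical counterpart of the delta functions described above.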