Probability and Statistics with
Reliability, Queuing and
Computer Science Applications:
second edition by K.S. Trivedi
Publisher: John Wiley & Sons

Chapter 2: Discrete Random Variables

Dept. of Electrical & Computer Engineering
Duke University
Email: kst@ee.duke.edu
URL: www.ee.duke.edu/~kst
Copyright © 2006 by K.S. Trivedi

Random Variables
The sample space is often too large to deal with directly.
Recall that in a sequence of n Bernoulli trials, if we don't need the detailed information about the actual pattern of 0's and 1's but only the number of 0's and 1's, we can reduce the sample space from size 2^n to size just n + 1.
Such abstractions lead to the notion of a random variable.
Random Variables (cont'd)
Discrete RV X: takes a countable number of values.
– Property of a discrete RV: probability mass function p(xi) of X
Continuous RV X: takes an uncountably infinite number of different values (an interval or a collection of intervals). (In Chapter 3)
– Property of a continuous RV: probability density function f(x) of X
Note the distinction: pmf vs. pdf.
Notation: uppercase letters (X, Y) for RVs; lowercase (x, y) for their values.
Discrete Random Variables
A random variable (rv) X is a mapping (function)
from the sample space S to the set of real numbers.
i.e., a function that assigns a real number to each sample point.
A random variable is not a variable but a function.
If the image of X (the set of all values taken by X) is finite or countably infinite, X is a discrete rv.

Discrete Random Variables
The inverse image Ax of a real number x is the set of all sample points that are mapped by X into x:
Ax = {s ∈ S | X(s) = x}
It is easy to see that the set of all inverse images is mutually exclusive and collectively exhaustive:
Ax ∩ Ay = ∅ for x ≠ y, and ∪x Ax = S

Discrete Random Variables
(Example 2.1)
Consider a random experiment defined by a sequence of three
Bernoulli trials.
The sample space S consists of eight triples of 0s and 1s
Define a random variable X to be the total number of successes from three
trials
The values of the random variable X are {0,1,2,3}
X(0,0,0) =0
X(0,0,1)=X(0,1,0)=X(1,0,0)=1
X(0,1,1)=X(1,0,1)=X(1,1,0) =2
X(1,1,1) = 3
The inverse images of random variable X are A0 = {(0,0,0)} ; A1={(0,0,1),(0,1,0),(1,0,0)};
A2={(0,1,1),(1,1,0),(1,0,1)}; A3 = {(1,1,1)}
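As an illustration (not from the text), Example 2.1 can be checked by brute-force enumeration in Python:

```python
from itertools import product

# Enumerate the sample space of three Bernoulli trials and apply
# X(s) = number of 1's (successes) to every sample point.
S = list(product([0, 1], repeat=3))        # the 8 triples of 0s and 1s
X = {s: sum(s) for s in S}                 # the rv as a function on S

# Inverse images A_x = {s in S : X(s) = x}
A = {}
for s, x in X.items():
    A.setdefault(x, set()).add(s)

print({x: len(A[x]) for x in sorted(A)})   # → {0: 1, 1: 3, 2: 3, 3: 1}
```

The inverse-image sizes 1, 3, 3, 1 match the sets A0 through A3 listed above.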
Probability Mass Function (pmf)
Ax: the set of all sample points s such that X(s) = x. The pmf is defined as
pX(x) = P(X = x) = P(Ax)
i.e., the probability that the value of the random variable X obtained on a performance of the experiment is equal to x.
Equivalence
pmf:
Probability mass function
Discrete density function
Sometimes mistakenly called the distribution function.
The empirical version is called a histogram.

pmf Properties
The following properties hold for the pmf:
1. 0 ≤ pX(x) ≤ 1 for all x
2. ∑x pX(x) = 1, since the random variable assigns some value to each sample point.
Since a discrete rv X takes a finite or countably infinite set of values x1, x2, …, the last property above can be restated as
∑i pX(xi) = 1
Cumulative Distribution Function
Note that the pmf is defined for a specific rv value, i.e., pX(x) = P(X = x) is the probability of a set Ax. The function defined as
FX(x) = P(X ≤ x) = ∑(t ≤ x) pX(t)
is called the cumulative distribution function (CDF), or the probability distribution function, or simply the distribution function of the random variable X.
Equivalence & Notes
CDF (cumulative distribution function)
PDF (probability distribution function)
not recommended as confusion with pdf
(prob. density function) may arise
Distribution function
FX(t): the subscript indicates the name of the rv; it must be a capital letter. The subscript may be omitted if the rv is clear from the context. The argument is a dummy variable, so FX(t) and FX(y) are the same function, while FX(t) and FY(t) are different functions.
Distribution Function Properties
F(x) is a monotone nondecreasing function of x, since if x1 ≤ x2, then {X ≤ x1} ⊆ {X ≤ x2} and hence F(x1) ≤ F(x2).
Also 0 ≤ F(x) ≤ 1, with F(x) → 0 as x → −∞ and F(x) → 1 as x → +∞.

Common discrete random variables
Constant
Uniform
Bernoulli
Binomial
Geometric
Negative binomial
Poisson
Hypergeometric
Constant Random Variable
pmf: pX(c) = 1. CDF: FX(x) = 0 for x < c, and 1 for x ≥ c.
[Figure: pmf and CDF of the constant rv, each with a unit jump at c]
This is a deterministic quantity (value); it may seem strange to call it an rv, but often we need to mix deterministic and random quantities.
Constant Random Variable in SHARPE
In order to assign this distribution to a block in an rbd, to a basic event in a fault tree, or to a task in a task graph, we can only take two possible values of c: c = 0 and c = ∞. For other cases, as we will see later, approximations can be used.
block try
comp z zero * prob(0); time to failure is 0
comp I inf * prob(1); time to failure is ∞
…

Discrete Uniform Distribution
A discrete rv X that assumes n discrete values with equal probability 1/n.
Discrete uniform pmf: pX(xi) = 1/n, i = 1, 2, …, n.
Discrete uniform distribution function: assume X takes integer values 1, 2, 3, …, n; then, for 0 ≤ x ≤ n,
FX(x) = ⌊x⌋ / n
[Figure: discrete uniform pmf] [Figure: discrete uniform CDF]

Notation: Floor & Ceiling
Define ⌊x⌋ (floor) as the largest integer not exceeding x, and ⌈x⌉ (ceiling) as the smallest integer not less than x.

Bernoulli Random Variable
An rv generated by a single Bernoulli trial that has a binary-valued outcome {0, 1}.
Such a binary-valued random variable X is called the indicator or Bernoulli random variable, so that
Probability mass function: pX(1) = p1 = P(X = 1) = p
pX(0) = p0 = P(X = 0) = q = 1 − p

Bernoulli Distribution
CDF: FX(x) = 0 for x < 0; q for 0 ≤ x < 1; 1 for x ≥ 1 (with p + q = 1).
[Figure: CDF of the Bernoulli random variable]

Binomial Random Variable
A binomial rv arises from a fixed number n of independent Bernoulli trials (BTs).
RV Yn: no. of successes in n BTs.
Binomial pmf: b(k; n, p) = C(n, k) p^k (1 − p)^(n−k), k = 0, 1, …, n
Binomial CDF: B(t; n, p) = ∑(k = 0 to ⌊t⌋) b(k; n, p)
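These two formulas can be sketched in a few lines of Python (illustrative, not from the slides; `math.comb` requires Python 3.8+):

```python
from math import comb, floor

def b(k, n, p):
    """Binomial pmf b(k; n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def B(t, n, p):
    """Binomial CDF B(t; n, p): sum of b(k; n, p) for k = 0..floor(t)."""
    return sum(b(k, n, p) for k in range(floor(t) + 1))

print(b(2, 3, 0.5))   # → 0.375
```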
Regarding Parameters
Note that the binomial pmf (or distribution) is completely defined by the formula given earlier and by its "parameters" n and p. The binomial probability equation never changes, so we regard a binomial distribution as being defined by its parameters. This is typical of all probability distributions (using their own parameters, of course). One of the problems we often face in statistics is estimating the parameters after collecting data that we know (or believe) comes from a particular probability distribution (such as the binomial). We will deal with parameter estimation in Chapter 10.
Binomial Random Variable: pmf
[Figure: binomial pmf, pk vs. k]

Binomial Random Variable: CDF
[Figure: binomial CDF]

Conditions for application of the binomial distribution
Applicable wherever a series of trials is made such that:
– Each trial has two mutually exclusive outcomes, "success" and "failure" (or 1 and 0)
– The probability of success at each trial is a constant (and therefore so is the probability of failure)
– The outcomes of successive trials are mutually independent
Applications of the binomial pmf
Used frequently in quality control, reliability, survey sampling, and other industrial problems.
A typical situation where these conditions apply is one where the number of defective components is counted when several are selected from a large batch of components.
This situation can be generalized to many other applications, as in the following slides.

Applications of the binomial pmf
Reliability of a k-out-of-n system, where R is the reliability of an individual component:
Rkofn = 1 − B(k−1; n, R) = ∑(j = k to n) C(n, j) R^j (1 − R)^(n−j)
Series system (n-out-of-n):
Rseries = b(n; n, R) = R^n
Parallel system (1-out-of-n):
Rparallel = 1 − b(0; n, R) = ∑(j = 1 to n) C(n, j) R^j (1 − R)^(n−j) = 1 − (1 − R)^n

Binomial Random Variable in SHARPE
Binomial CDF:
block bin(q, k, n)
comp one prob(q)
kofn block0 k, n, one
end
bind n 5
bind R 0.9
loop k,1,n
expr sysprob(bin; 1-R, k, n)
* B(k-1; n, R) is computed and printed
end
end
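Independently of SHARPE, the k-out-of-n formula above can be cross-checked with a short Python sketch (illustrative, using the same n = 5 and R = 0.9):

```python
from math import comb

def rel_k_of_n(k, n, R):
    # P(at least k of n independent components work), each with
    # reliability R: the binomial tail sum from j = k to n.
    return sum(comb(n, j) * R**j * (1 - R)**(n - j) for j in range(k, n + 1))

n, R = 5, 0.9
for k in range(1, n + 1):
    print(k, round(rel_k_of_n(k, n, R), 5))
```

The k = n and k = 1 cases reduce to the series (R^n) and parallel (1 − (1 − R)^n) closed forms, which makes a convenient sanity check.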
Applications of the binomial dist.
Transmitting an LLC frame using n MAC blocks; p is the prob. of correctly transmitting one block.
Let pK(k) be the pmf of the rv K, the number of LLC transmissions required to transmit all n MAC blocks correctly. Then
pK(1) = b(n; n, p) = p^n
pK(2) = [1 − (1 − p)^2]^n − p^n
and in general
pK(k) = [1 − (1 − p)^k]^n − [1 − (1 − p)^(k−1)]^n
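Since P(K ≤ k) = [1 − (1 − p)^k]^n, the pmf terms telescope and should sum to 1; a quick numeric check (an illustrative sketch, with n and p chosen arbitrarily):

```python
def p_K(k, n, p):
    # pmf of the number of LLC transmissions needed until all n MAC
    # blocks have been received correctly (each succeeds w.p. p per try).
    F = lambda j: (1 - (1 - p)**j)**n      # CDF: P(K <= j)
    return F(k) - F(k - 1)

n, p = 10, 0.8
assert abs(p_K(1, n, p) - p**n) < 1e-12    # agrees with p_K(1) = p^n
print(round(sum(p_K(k, n, p) for k in range(1, 100)), 6))  # → 1.0
```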
Applications of the binomial dist.
Counting the number of defective chips in a sample of size 35, where 10% of chips are expected to be defective (p = 0.1 is the prob. of "success").
The observed fraction defective should be close to the binomial pmf. The observed data and the binomial pmf are shown in the following table.

Observed Data

Number of defects | Number of samples showing this number of defects | Fraction (of 800 samples)
 0 |  11 | 0.01375
 1 |  95 | 0.11875
 2 | 139 | 0.17375
 3 | 213 | 0.26625
 4 | 143 | 0.17875
 5 | 113 | 0.14125
 6 |  49 | 0.06125
 7 |  27 | 0.03375
 8 |   6 | 0.00750
 9 |   4 | 0.00500
10 |   0 | 0.00000

Binomial pmf
k = number of defects/sample | Data | b(k; 35, 0.1)
 0 | 0.01375 | 0.0250
 1 | 0.11875 | 0.0974
 2 | 0.17375 | 0.1839
 3 | 0.26625 | 0.2248
 4 | 0.17875 | 0.1998
 5 | 0.14125 | 0.1376
 6 | 0.06125 | 0.0765
 7 | 0.03375 | 0.0352
 8 | 0.00750 | 0.0137
 9 | 0.00500 | 0.0046
10 | 0.00000 | 0.0013

Comparing the model pmf with real data
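The model column b(k; 35, 0.1) can be regenerated with a few lines of Python (illustrative):

```python
from math import comb

# Observed fractions (of 800 samples) for k = 0..10 defects, from the table.
observed = [0.01375, 0.11875, 0.17375, 0.26625, 0.17875, 0.14125,
            0.06125, 0.03375, 0.00750, 0.00500, 0.00000]

def b(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

for k, frac in enumerate(observed):
    print(f"{k:2d}  {frac:.5f}  {b(k, 35, 0.1):.4f}")
```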
[Figure: bar chart comparing the observed fractions (Data) with the model pmf b(k; 35, 0.1) for k = 0, 1, …, 10]

Applications of the binomial dist.
Transmitting binary digits through a communication channel: the number of digits received correctly, Cn, out of n transmitted digits follows the binomial distribution B(k; n, p), where p = prob. of successfully transmitting one digit.
The prob. of exactly i errors, pe(i), and the prob. of an error-free transmission are given by:
pe(i) = pCn(n − i) = C(n, i) p^(n−i) (1 − p)^i, and
pe(0) = p^n.

Applications of the binomial dist.
Taking a random sample of 10 VLSI chips from a very large batch.
The number of defective chips in the sample has the pmf b(k; 10, p), where p = prob. that a randomly chosen chip is defective.
No defectives in the sample → accept the batch.
Any defective in the sample → reject the batch.
P("a batch is accepted") = P("no defectives") = (1 − p)^10.

Computation of the binomial pmf
Computation using the formula directly is numerically unstable.
A recursive formula should be used.
A normal or Poisson approximation can also be used; see page 75 of the blue book for details.
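The recursive scheme alluded to above can be sketched as follows (an illustration; it follows from the ratio b(k+1; n, p)/b(k; n, p) = p(n − k)/((k + 1)(1 − p))):

```python
def binomial_pmf_table(n, p):
    # Build b(0..n; n, p) recursively, avoiding huge binomial
    # coefficients multiplied by tiny powers.  (It still underflows
    # when (1 - p)**n underflows; that is the remaining instability
    # for extreme n and p.)
    q = 1 - p
    b = [q**n]
    for k in range(n):
        b.append(b[k] * p * (n - k) / ((k + 1) * q))
    return b

pmf = binomial_pmf_table(35, 0.1)
print(abs(sum(pmf) - 1.0) < 1e-9)   # → True
```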
The pmf can be symmetrical (for p = 0.5; Fig. I), positively skewed (for p < 0.5; Fig. II), or negatively skewed (for p > 0.5; Fig. III).
[Fig. I: symmetrical binomial pmf] [Fig. II: positively skewed binomial pmf] [Fig. III: negatively skewed binomial pmf]

Binomial Random Variable
We shall see later that the number of successes in n Bernoulli trials can be seen as the sum of the number of successes in each trial:
Yn = X1 + X2 + … + Xn
where the Xi's are independent, identically distributed Bernoulli random variables.

Geometric Distribution
Consider a sequence of Bernoulli trials up to and including the 1st success.
The sample space S is countably infinite in size.
Let p be the probability of success and q = 1 − p the probability of failure of each Bernoulli trial (recall independence). Then the rv Z, the number of trials up to and including the first success, has a geometric distribution with pmf and CDF
pZ(i) = q^(i−1) p, i = 1, 2, …
FZ(t) = ∑(i = 1 to ⌊t⌋) q^(i−1) p = 1 − q^⌊t⌋, t > 0
[Figure: geometric pmf example] [Figure: geometric CDF example]

Modified Geometric Distribution
Now let X be an rv counting the total number of trials up to but not including the 1st success, i.e., the total number of failed trials before the first success, so that Z = X + 1.
Then X is a modified geometric random variable with pmf:
pX(i) = q^i p = (1 − p)^i p, i = 0, 1, 2, …

Note
Many well-known papers don't clearly distinguish between the geometric and modified geometric distributions.
Be careful so that the right formula is used.

Geometric Distribution (contd.)
The geometric distribution (and its cousins) is the only discrete distribution that exhibits the MEMORYLESS property: future outcomes are independent of past events.
Let Z be the rv denoting the total number of trials up to and including the first success.
Assume n trials have been completed, all failures. Let Y denote the additional trials up to and including the first success, i.e., Z = n + Y, or Y = Z − n.
The conditional probability of Y is given by
P(Y = i | Z > n) = P(Z = n + i) / P(Z > n) = q^(n+i−1) p / q^n = q^(i−1) p

Geometric Distribution (contd.)
Thus, after n unsuccessful trials, the number of trials remaining
until the first success has the same pmf as Z had originally
(i.e., memoryless property)
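The memoryless property can be verified numerically (an illustrative check with arbitrary p and n):

```python
p, n = 0.3, 5
q = 1 - p

def geom_pmf(i):
    return q**(i - 1) * p            # P(Z = i), i = 1, 2, ...

# P(Y = i | Z > n) = P(Z = n + i) / P(Z > n), and P(Z > n) = q**n
for i in range(1, 10):
    cond = geom_pmf(n + i) / q**n
    assert abs(cond - geom_pmf(i)) < 1e-12   # same pmf as Z originally
print("memoryless check passed")
```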
Applications of the Geometric Dist.
Consider the scheduling of a computer system with a
fixed time slice.
At the end of a time slice, a program will either have completed execution, with probability p, or will need more computation, with probability q = 1 − p. The random variable denoting the number of time slices needed to complete the execution of a program is geometrically distributed.
The number of times the following statement is executed:
repeat S until B
is geometrically distributed, assuming that successive tests of condition B satisfy the conditions of Bernoulli trials.

Applications of the Geometric Dist.
The number of times the following statement is executed:
while (¬B) do S
is modified-geometrically distributed, assuming that successive tests of condition B satisfy the conditions of Bernoulli trials.

Negative Binomial Distribution
RV Tr: no. of trials up to and including the r-th success.
Image of Tr = {r, r+1, r+2, …}. Define events:
A: Tr = n
B: exactly r − 1 successes in the first n − 1 trials
C: the n-th trial is a success
Clearly A = B ∩ C, and since B and C are mutually independent,
P(Tr = n) = P(B) P(C) = C(n−1, r−1) p^(r−1) q^(n−r) · p = C(n−1, r−1) p^r (1 − p)^(n−r), n = r, r+1, r+2, …
This pmf is known as the negative binomial pmf.
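A small numeric check of this pmf (illustrative; r and p are arbitrary): the probabilities over n = r, r+1, … should sum to 1.

```python
from math import comb

def negbin_pmf(n, r, p):
    # P(T_r = n): exactly r-1 successes in the first n-1 trials,
    # then a success on the n-th trial.
    return comb(n - 1, r - 1) * p**r * (1 - p)**(n - r)

r, p = 3, 0.4
total = sum(negbin_pmf(n, r, p) for n in range(r, 400))
print(round(total, 6))   # → 1.0
```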
Poisson Random Variable
An rv such as "no. of arrivals in an interval (0, t]".
Assume λ is the arrival rate of jobs.
In a small interval Δt, the prob. of a new arrival is λΔt. If Δt is small enough, the probability of two arrivals in Δt may be neglected.
Suppose the interval (0, t] is divided into n subintervals of length t/n.
Suppose the arrival of a job in any subinterval is independent of the arrival of a job in any other subinterval.

Poisson Random Variable
For large n, the n subintervals can be thought of as constituting a sequence of Bernoulli trials with probability of success p = λt/n.
Therefore, the probability of k arrivals in a total of n subintervals is given by b(k; n, λt/n).
As n → ∞, this gives the Poisson pmf:
f(k; λt) = e^(−λt) (λt)^k / k!, k = 0, 1, 2, …

Poisson Random Variable (contd.)
The Poisson rv often occurs in situations such as "no. of packets (or calls) arriving in t sec." or "no. of components failing in t hours", etc.

Poisson Failure Model
Let N(t) be the number of (failure) events that occur in the time interval (0, t]. Then a (homogeneous) Poisson model for N(t) assumes:
1. The probability mass function (pmf) of N(t) is:
P{N(t) = k} = [(λt)^k / k!] e^(−λt), k = 0, 1, 2, …
where λ > 0 is the expected number of event (failure) occurrences per unit time.
2. The numbers of events in two non-overlapping intervals are mutually independent.

Note:
For a fixed t, N(t) is a random variable (in this case a discrete random variable known as the Poisson random variable).
The family {N(t), t ≥ 0} is a stochastic process, in this case the homogeneous Poisson process. We will study stochastic processes in Chapter 6 and beyond.

Poisson Failure Model (contd.)
The successive interevent times X1, X2, … in a homogeneous Poisson model are mutually independent and have a common exponential distribution given by:
P{X1 ≤ t} = 1 − e^(−λt), t ≥ 0
To show this: P(X1 > t) = P(N(t) = 0) = e^(−λt).
Thus the discrete random variable N(t), with the Poisson distribution, is related to the continuous random variable X1, which has an exponential distribution (see Chapter 3).
The mean interevent time is 1/λ, which in this case is the mean time to failure (see Chapter 4).

Poisson Random Variable
Probability mass function (pmf) (or discrete density function):
pk = P{N(t) = k} = e^(−λt) (λt)^k / k!
Distribution function (CDF):
F(x) = ∑(k = 0 to ⌊x⌋) e^(−λt) (λt)^k / k!

Poisson pmf
[Figure: Poisson pmf, λt = 1.0] [Figure: Poisson CDF, λt = 1.0]
[Figure: Poisson pmf, λt = 4.0] [Figure: Poisson CDF, λt = 4.0]

Poisson Approximation Example
The prob. of a defective VLSI chip is 0.01. Find P(no defective chip in a box of 100 chips).
Binomial pmf (n = 100, p = 0.01):
b(0; 100, 0.01) = C(100, 0) (0.01)^0 (0.99)^100 = 0.366
Poisson approximation (α = np = 100 × 0.01 = 1):
f(0; 1) = e^(−1) = 0.3679
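The two numbers can be reproduced in a couple of lines (illustrative):

```python
from math import comb, exp

n, p = 100, 0.01
b0 = comb(n, 0) * p**0 * (1 - p)**n    # exact binomial: 0.99**100
f0 = exp(-n * p)                       # Poisson approximation, alpha = np = 1
print(round(b0, 4), round(f0, 4))      # → 0.366 0.3679
```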
Another Poisson Example
Connections arrive at a switch at a rate of 12 per ms; the arrival distribution is Poisson. (a) What is the probability that exactly 12 calls arrive in one ms? (b) What is the probability that exactly 100 calls arrive in 10 ms? (c) What is the probability that the number of calls arriving in 2 ms is greater than 7 and less than or equal to 10?
Work it out yourself.

Computing Poisson pmf
Direct computation of Poisson probabilities is numerically unstable.
A recommended recursive formula is:
f(k+1; α) = α f(k; α) / (k + 1)
with the initialization step:
f(0; α) = exp(−α)
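This recursion can be sketched directly (illustrative; note that the initialization exp(−α) itself underflows for very large α, which is precisely the remaining instability):

```python
from math import exp

def poisson_pmf_table(alpha, kmax):
    # f(0; alpha) = exp(-alpha); f(k+1; alpha) = alpha * f(k; alpha) / (k+1)
    f = [exp(-alpha)]
    for k in range(kmax):
        f.append(alpha * f[k] / (k + 1))
    return f

f = poisson_pmf_table(4.0, 60)
print(abs(sum(f) - 1.0) < 1e-9)   # → True
```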
But even this formula is unstable for very large values of α; see Fox (1988) for an algorithm.

Hypergeometric pmf
Obtained when sampling without replacement (the binomial pmf resulted when sampling was done with replacement).
The probability of choosing k defective components in a random sample of m components, chosen without replacement from a total of n components, d of which are defective, is given by the hypergeometric pmf h(k; m, d, n):
h(k; m, d, n) = C(d, k) C(n − d, m − k) / C(n, m), max(0, m − n + d) ≤ k ≤ min(m, d)

Hypergeometric pmf: Example 2.11
Compute the probability of obtaining 3 defectives in a sample of size 10 taken without replacement from a box of 20 components containing 4 defectives.
Applying the formula with k = 3, m = 10, d = 4, n = 20, we get:
h(3; 10, 4, 20) = C(4, 3) C(16, 7) / C(20, 10) ≈ 0.2477
If we approximated this probability using a binomial distribution with corresponding parameters, we would have obtained b(3; 10, 0.20) = 0.2013, which is a considerable underestimate of the actual value.
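Example 2.11 can be reproduced numerically (an illustrative sketch):

```python
from math import comb

def h(k, m, d, n):
    # Hypergeometric pmf: k defectives in a sample of m drawn without
    # replacement from n components, d of which are defective.
    return comb(d, k) * comb(n - d, m - k) / comb(n, m)

exact = h(3, 10, 4, 20)
approx = comb(10, 3) * 0.2**3 * 0.8**7    # binomial b(3; 10, 0.2)
print(round(exact, 4), round(approx, 4))  # → 0.2477 0.2013
```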
Hypergeometric pmf: Example 2.12
Cellular wireless system with TDMA: in this system the base transceiver of each cell has n base repeaters, each of which provides m time-division-multiplexed channels.
Let k be the total number of talking channels in the system, which are allocated randomly to the users. Now, if a base repeater fails, the probability that i talking channels reside in the failed repeater is given by
pi = h(i; k, m, mn)

Hypergeometric pmf (Example 2.13)
Consider a s/w reliability growth model for estimating the number of residual faults in the s/w after the testing phase.
The s/w is subjected to a sequence of n test instances ti, i = 1, 2, …, n. Faults detected at each test are removed without inserting new ones.
The number of faults detected by ti is denoted by Ni.
The cumulative number of faults detected by test instances t1 through ti is given by the random variable Ci = N1 + N2 + … + Ni; i.e., the number of faults still undetected after the i-th test instance is m − Ci (where m is the initial number of faults).

Hypergeometric pmf (Example 2.13)
The probability that k faults are detected by the test instance t(i+1), given that ci faults are detected by test instances t1 through ti, is given by a hypergeometric pmf.

Probability Generating Function (PGF)
Helps in dealing with operations (e.g., sums) on nonnegative integer-valued rv's.
Letting P(X = k) = pk, the PGF of X is defined by
GX(z) = E[z^X] = ∑(k = 0 to ∞) pk z^k
One-to-one mapping: pmf (or CDF) ↔ PGF.
See page 98 for PGFs of some common pmfs; we will return to this topic in Chapter 4.

PGFs of Some Well-known Distributions
Bernoulli rv: G(z) = q + pz
Binomial rv: G(z) = (q + pz)^n
Modified geometric rv: G(z) = p / (1 − qz)
Poisson rv: G(z) = e^(−α(1−z))
Uniform rv (on {1, …, n}): G(z) = (1/n) ∑(k = 1 to n) z^k
Constant rv (= c): G(z) = z^c
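These closed forms can be checked numerically against the defining power series (an illustrative check at an arbitrary point z in the unit interval):

```python
from math import comb, exp

z, n, p, alpha = 0.7, 6, 0.3, 2.5
q = 1 - p

# Binomial: sum_k b(k; n, p) z^k should equal (q + p z)^n
series = sum(comb(n, k) * p**k * q**(n - k) * z**k for k in range(n + 1))
assert abs(series - (q + p * z)**n) < 1e-12

# Poisson: sum_k e^{-alpha} alpha^k / k! z^k should equal e^{-alpha (1 - z)}
term, total = exp(-alpha), 0.0
for k in range(200):
    total += term * z**k
    term *= alpha / (k + 1)
assert abs(total - exp(-alpha * (1 - z))) < 1e-12
print("PGF identities check out")
```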
PGF and Distributions
Thm 2.1: If two discrete R.V.s X and Y
have the same PGFs, then they must have the same distributions and pmfs.

Application of PGF: Recurrence Relation for Binomial pmf
G(z) = (q + pz)^n = ∑k pk z^k. Differentiating both sides we get
np (q + pz)^(n−1) = ∑k k pk z^(k−1)
Equating the coefficients of z^(k−1) on each side we get
k · b(k; n, p) = np · b(k−1; n−1, p)

Application of PGF: Another Recurrence Relation for Binomial pmf
Multiplying each side with (q + pz), we get:
np (q + pz)^n = ∑k k pk q z^(k−1) + ∑k k pk p z^k
Equating coefficients of z^k on each side we get:
np · pk = (k+1) q · p(k+1) + kp · pk, i.e., p(k+1) = [p (n − k) / ((k+1) q)] · pk

Application of PGF: Recurrence Relation for Poisson pmf
G(z) = e^(−α(1−z)) = ∑k f(k; α) z^k. Differentiating both sides and equating the coefficients of z^k on both sides we get:
f(k+1; α) = α f(k; α) / (k + 1)

Discrete Random Vectors
Let X = (X1, X2, …, Xr) be r rvs defined on a sample space S.
For each sample point s in S, each of the rvs X1, X2, …, Xr takes on one of its possible values: X(s) = (X1(s), X2(s), …, Xr(s)).
The random vector X = (X1, X2, …, Xr) is an r-dimensional vector-valued function.

Joint or compound pmf (properties)
The joint or compound pmf of a random vector X is defined to be
pX(x1, x2, …, xr) = P(X1 = x1, X2 = x2, …, Xr = xr)
The properties of this pmf are:
0 ≤ pX(x1, …, xr) ≤ 1, and ∑x1 … ∑xr pX(x1, …, xr) = 1

Joint or compound pmf (Example)
An interesting example of a compound pmf is the multinomial pmf.
Consider a sequence of n generalized Bernoulli trials, with r distinct outcomes in each trial with probabilities p1, p2, …, pr, where ∑(i = 1 to r) pi = 1.
Define a random vector X = (X1, X2, …, Xr), such that Xi is the number of trials that resulted in the i-th outcome.
Then the compound pmf of X is given by
pX(n1, n2, …, nr) = [n! / (n1! n2! … nr!)] p1^n1 p2^n2 … pr^nr, where ∑ ni = n

Joint or compound pmf (Example)
The marginal pmf of Xi may be computed by summing the joint pmf over all nj's except ni. The marginal pmf of each Xi is binomial with parameters n and pi.
Consider the case where a program requires I/O service from device i with probability pi at the end of a CPU burst, with ∑(i = 1 to r) pi = 1.
If n CPU bursts are observed, then the probability that ni of these will be directed to I/O device i (for i = 1, 2, …, r) is given by the multinomial pmf.
The number of I/O requests (out of n) directed to a specific device j has a binomial distribution with parameters n and pj.
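A small sketch of the multinomial pmf and its binomial marginal (illustrative; the device probabilities below are made up):

```python
from math import comb, factorial

def multinomial_pmf(counts, probs):
    # P(X1 = n1, ..., Xr = nr) = n!/(n1! ... nr!) * p1^n1 ... pr^nr
    n = sum(counts)
    coef = factorial(n)
    for ni in counts:
        coef //= factorial(ni)
    prob = float(coef)
    for ni, pi in zip(counts, probs):
        prob *= pi**ni
    return prob

probs = (0.5, 0.3, 0.2)          # hypothetical I/O-device probabilities
print(round(multinomial_pmf((2, 1, 1), probs), 4))   # → 0.18

# Marginal of X1 over all (n2, n3) with n2 + n3 = 2 is binomial b(2; 4, 0.5):
marg = sum(multinomial_pmf((2, j, 2 - j), probs) for j in range(3))
assert abs(marg - comb(4, 2) * 0.5**2 * 0.5**2) < 1e-12
```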
Independent Discrete Random Variables
X and Y are independent iff the joint pmf satisfies:
pX,Y(x, y) = pX(x) pY(y) for all x, y
Mutual independence of X1, …, Xr similarly implies that the joint pmf factors into the product of the marginal pmfs.
Pairwise independence vs. mutual independence: it is possible for every pair of random variables in the set {X1, X2, …, Xr} to be pairwise independent without the entire set being mutually independent.

Discrete Convolution
Let Z = X + Y. Then, if X and Y are independent,
pZ(t) = pX+Y(t) = ∑(x = 0 to t) pX(x) pY(t − x)
This sum is known as the discrete convolution.

Discrete Convolution
Given that X and Y are independent, consider the probability of the event Z = X + Y = t.
On a two-dimensional (x, y) event space, this event is represented by all the event points on the line x + y = t (as shown in the figure). The probability P(Z = t) can be computed by adding the probabilities of the event points on this line; hence
pZ(t) = ∑(x = 0 to t) pX(x) pY(t − x)
This summation is called the discrete convolution, and it gives the formula for the pmf of the sum of two nonnegative independent discrete random variables.
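The discrete convolution takes only a few lines of Python (illustrative); convolving a Bernoulli(p) pmf with itself should give the binomial(2, p) pmf, in line with the closure results that follow:

```python
def convolve_pmf(pX, pY):
    # pmfs given as lists indexed by value: p_Z(t) = sum_x p_X(x) p_Y(t - x)
    pZ = [0.0] * (len(pX) + len(pY) - 1)
    for x, px in enumerate(pX):
        for y, py in enumerate(pY):
            pZ[x + y] += px * py
    return pZ

p = 0.3
bern = [1 - p, p]                      # pmf of a Bernoulli(p) rv on {0, 1}
pZ = convolve_pmf(bern, bern)
print([round(v, 4) for v in pZ])       # → [0.49, 0.42, 0.09]
```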
Restricting to nonnegative integer-valued random variables, and using the probability generating function (PGF), the PGF of Z = X + Y can be represented as
GZ(z) = GX(z) GY(z)
In general, then, if the Xi's are independent and Z = X1 + X2 + … + Xr,
GZ(z) = ∏(i = 1 to r) GXi(z)

Some results on sums of discrete independent random variables
Let X1, X2, …, Xr be mutually independent.
– If Xi is binomially distributed with parameters ni and p, then the sum Z = ∑i Xi has the binomial distribution with parameters ∑i ni and p.
– If Xi has the (modified) negative binomial distribution with parameters αi and p, then Z = ∑i Xi has the (modified) negative binomial distribution with parameters ∑i αi and p.
– If Xi has the Poisson distribution with parameter αi, then Z = ∑i Xi has the Poisson distribution with parameter ∑i αi.

Theorem 2.2: Closure of Distributions Under Sum
[Table summarizing the closure-under-sum results above]