Generating Samples of Random Variables With Arbitrary Distributions

We have seen how to generate samples of uniformly distributed random variables over [0, 1]. However, this is not enough for building complex simulation models, since the probability distributions of many of the input random variables are anything but uniform. (For example, in a queueing model, the interarrival and service times of the customers may have exponential distributions.) Therefore, we need a way to generate samples of random variables having distributions other than the uniform distribution.

This problem is actually not as complicated as it seems (at least in theory). It turns out that one can "transform" a uniformly distributed random variable over [0, 1] into any other random variable with an arbitrary probability distribution. Here, we consider three
methods that can be used to accomplish this transformation: inversion, acceptance-rejection, and convolution.

1. Inversion Method

Continuous random variables — Assume that X is a continuous random variable and we want to generate samples of X. Let F be the cumulative distribution function (c.d.f.) of X. Since F is continuous and increasing, one can define its inverse, which we denote by F^{-1}(.). That is,

    F(x) = u  <=>  F^{-1}(u) = x.

The inversion method works as follows:

1. Generate U ~ U[0, 1].
2. Return X = F^{-1}(U).
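As a minimal sketch of the two steps above (the helper name is illustrative, not from the notes), any computable inverse c.d.f. can be plugged in:

```python
import random

def inversion_sample(inverse_cdf, rng=random.random):
    """Generate one sample by inversion."""
    u = rng()                 # Step 1: U ~ U[0, 1]
    return inverse_cdf(u)     # Step 2: X = F^{-1}(U)

# Example: F^{-1}(u) = 2u inverts the c.d.f. of U[0, 2].
x = inversion_sample(lambda u: 2.0 * u)
```

Passing a custom `rng` is only there to make the routine easy to test with a fixed uniform draw.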
One can show that if U ~ U[0, 1], then the c.d.f. of X is F. In order to see this, recall that the c.d.f. of a uniformly distributed random variable over [0, 1] is

    F_U(u) = 0 if u < 0,  u if 0 <= u <= 1,  1 if u > 1.

Letting U be a uniformly distributed random variable over [0, 1], set X = F^{-1}(U). Then, we have

    P{X <= x} = P{F^{-1}(U) <= x} = P{U <= F(x)} = F(x),

where the last equality uses the fact that 0 <= F(x) <= 1.

Example — We want to generate samples of an exponentially distributed random variable with
parameter λ. Remember that the c.d.f. of an exponentially distributed random variable with parameter λ is

    F(x) = 1 - e^{-λx} if x >= 0,  0 otherwise.

Computing F^{-1}(u): setting u = F(x) = 1 - e^{-λx} and solving for x gives x = -(1/λ) ln(1 - u). Thus, the inversion method is:

1. Generate U ~ U[0, 1].
2. Return X = -(1/λ) ln(1 - U).

(Since 1 - U is also uniformly distributed over [0, 1], one may equivalently return -(1/λ) ln U.)

The argument above mathematically proves that the inversion method works. But, what
is the intuition behind it? Assume that we want to generate samples of a random variable X whose probability density function (p.d.f.) looks like the curve sketched below.

[Figure: a p.d.f. over roughly x = -3 to 8, with peak height about 0.3.]

Since the p.d.f. is the derivative of the c.d.f., the c.d.f. will have the steepest rise where the p.d.f. takes large values. Therefore, the c.d.f. of X looks like the curve sketched below.

[Figure: the corresponding c.d.f. over the same range.]

Now, imagine "bombarding" the y-axis with random numbers {U1, U2, ...} and tracing the values {F^{-1}(U1), F^{-1}(U2), ...} on the x-axis. The values {F^{-1}(U1), F^{-1}(U2), ...} are closely clustered around the points where the c.d.f. is steep (that is, where the p.d.f. takes large values).

Example — X has Weibull distribution with scale and shape parameters β and α (β, α > 0).
is, where the p.d.f. takes large values). Estample — X has Weibull distribution with scale and shape parameters [7’ and o: (,6, o: > 0).
That is, the c.d.f. of X is given by 17 as)” 'f0< e 1 is: 0 otherwise. (Weibull distribution is widely used in reliability modeling to describe component lifetime.)
Develop an inversion method to generate samples of X. 39 Emmple W The p.d.f. of X is given by 1/3 if 0 g a; S l
f(:1:)= 2/3 if1<mg2
0 otherwise. Develop an inversion method to generate samples of X. it“ PM
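As a hedged sketch of how such closed-form recipes look in code: solving u = F(x) for the Weibull c.d.f. above gives x = β(-ln(1 - u))^{1/α}, and the exponential case was derived earlier. Function names are illustrative.

```python
import math
import random

def exponential_inverse(u, lam):
    # F^{-1}(u) for Exponential(lam): solve u = 1 - exp(-lam * x) for x.
    return -math.log(1.0 - u) / lam

def weibull_inverse(u, beta, alpha):
    # F^{-1}(u) for Weibull(scale beta, shape alpha):
    # solve u = 1 - exp(-(x / beta) ** alpha) for x.
    return beta * (-math.log(1.0 - u)) ** (1.0 / alpha)

# Inversion: feed a uniform draw through the inverse c.d.f.
x_exp = exponential_inverse(random.random(), lam=2.0)
x_wei = weibull_inverse(random.random(), beta=3.0, alpha=2.0)
```

Note that with α = 1 the Weibull inverse reduces to the exponential inverse with λ = 1/β, a quick sanity check on both formulas.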
Discrete random variables — Now, consider the case where the random variable of interest is discrete. In particular, assume that X is a random variable that takes values {x_1, x_2, ..., x_n}, and P{X = x_i} = p(x_i). Then, an intuitive way of generating samples of X is to compute F(x_i) = p(x_1) + ... + p(x_i), and use the following method:

1. Generate U ~ U[0, 1].
2. Return x_i such that F(x_{i-1}) < U <= F(x_i).

Example — Let the probability mass function of X be

    p(x) = P{X = x} = 0.3 if x = 1,  0.2 if x = 3,  0.1 if x = 4,  0.4 if x = 7.

Let

    F(x) = 0.3 if x = 1,  0.5 if x = 3,  0.6 if x = 4,  1.0 if x = 7.

Thus, if U ∈ (0, 0.3], then we return X = 1; if U ∈ (0.3, 0.5], then we return X = 3; if U ∈ (0.5, 0.6], then we return X = 4; if U ∈ (0.6, 1.0], then we return X = 7.
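A minimal sketch of this table lookup, using the values and probabilities of the example above (the linear scan is just one way to find the right interval):

```python
import random
from itertools import accumulate

values = [1, 3, 4, 7]
probs = [0.3, 0.2, 0.1, 0.4]
cdf = list(accumulate(probs))  # cumulative sums: F(x_i)

def discrete_sample(u=None):
    # Return x_i such that F(x_{i-1}) < u <= F(x_i).
    if u is None:
        u = random.random()
    for x, f in zip(values, cdf):
        if u <= f:
            return x
    return values[-1]  # guard against floating-point round-off
```

For a long support, a binary search (e.g. with the `bisect` module) over `cdf` would replace the linear scan.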
The approach above can be used for any discrete random variable, and in fact, it is an inversion method. Let X be a discrete random variable that takes values {x_1, x_2, ..., x_n}, with x_1 <= x_2 <= ... <= x_n and P{X = x_i} = p(x_i). The c.d.f. of X is

    F(x) = 0                        if x < x_1,
           p(x_1)                   if x_1 <= x < x_2,
           p(x_1) + p(x_2)          if x_2 <= x < x_3,
           ...
           p(x_1) + ... + p(x_i)    if x_i <= x < x_{i+1},
           ...
           1                        if x_n <= x,

which is a "right-continuous" step function.

[Figure: the step function, with a jump of height p(x_i) at each x_i.]

Then, generating U ~ U[0, 1] and returning x_i such that

    p(x_1) + ... + p(x_{i-1}) < U <= p(x_1) + ... + p(x_i)

is the inversion method applied to the c.d.f. above.

Example — X has geometric distribution with parameter p. That is,
    P{X = k} = (1 - p)^{k-1} p  for k = 1, 2, ....

Simple algebra shows that the c.d.f. of X is

    F(k) = P{X <= k} = sum_{i=1}^{k} P{X = i} = 1 - (1 - p)^k  for k = 1, 2, ....

Then, an inversion method to generate samples of X is:

1. Generate U ~ U[0, 1].
2. Return k such that F(k - 1) < U <= F(k), that is, 1 - (1 - p)^{k-1} < U <= 1 - (1 - p)^k. Solving for k gives k = ⌈ln(1 - U) / ln(1 - p)⌉.

Side note — While inversion is a general method that can be applied to any probability distribution, it may be computationally expensive. In particular, the computation of F^{-1}(.) may require numerical search.

Example — X has Beta(3,4) distribution. That is, the p.d.f. of X is given by

    f(x) = 60 x^3 (1 - x)^2 if 0 < x < 1,  0 otherwise.

We want to generate samples of X by using the inversion method. We have
    F(x) = 15x^4 - 24x^5 + 10x^6  for 0 <= x <= 1.

Thus, computing F^{-1}(.) requires finding the root of a 6th-order polynomial. However, the inversion method is quite efficient when the inverse of the c.d.f. can easily be computed.

2. Acceptance-Rejection Method

The acceptance-rejection method is well-suited for generating samples of random variables whose p.d.f.'s take positive values over a bounded region.

In particular, assume that we want to
p.d.f.’s take positive values over a bounded region. In particular, assume that we want to generate samples of a random variable X, whose
p.d.f. takes positive values only over the interval [mt]. That is, f = 0, whenever
maﬁa, 6]. Furthermore, assume that Ill/I is an upper bound on the p.d.f. of X. That is, Mr 2 max f
$E[a,b] Then, we can “enclose” the p.d.f. in the rectangle [63, b] X [0, M The acceptancerejection method works as follows: 1. Generate U1 N U[0, 1] and U2 N UlO, 1]. 2. Set Z1 2 «2+ (bﬁ a)U1, Z2 2 MUQ. 3. If Z2 3 (Z1), then return 21. Otherwise, go to Step 1.  The ﬁrst two steps of the method generate points uniformly distributed over the rectangle
[a, ] X [0, M Whenever one of these points fall below the p.d.f., we return the m—coordinate
of the point. Note that if 21 is a point at which the p.d.f. takes a large value, then the probability of accepting Z1 is relatively high. In order to see that the acceptance—rejection method works, let the random variable X9
denote the output of the method. Then,

    P{X° <= x} = P{Z1 <= x | acceptance} = P{Z1 <= x, Z2 <= f(Z1)} / P{Z2 <= f(Z1)}.

But P{Z1 <= x, Z2 <= f(Z1)} is the probability that (Z1, Z2) falls into the region under the p.d.f. and to the left of x. Since (Z1, Z2) is uniformly distributed over the rectangle [a, b] × [0, M], this probability is just

    (1 / ([b - a] M)) ∫_a^x f(t) dt.

Thus,

    P{X° <= x} = (1 / ([b - a] M)) ∫_a^x f(t) dt / P{Z2 <= f(Z1)}.

On the other hand, putting x = b in the expression above, we get

    1 = P{X° <= b} = (1 / ([b - a] M)) ∫_a^b f(t) dt / P{Z2 <= f(Z1)} = (1 / ([b - a] M)) / P{Z2 <= f(Z1)},

which implies that P{Z2 <= f(Z1)} = 1 / ([b - a] M). Then,

    P{X° <= x} = P{Z1 <= x | acceptance} = ∫_a^x f(t) dt.

That is, the p.d.f. of X° is f.

In Step 3 of the acceptance-rejection method, Z1 is accepted with probability

    p = P{Z2 <= f(Z1)} = 1 / ([b - a] M)

and rejected with probability 1 - 1 / ([b - a] M).
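The three steps above can be sketched as a generic routine, under the stated assumptions that f vanishes outside [a, b] and is bounded by M (the function name and the triangular test density are illustrative):

```python
import random

def acceptance_rejection(f, a, b, M):
    # Repeat until the point (Z1, Z2), uniform over [a, b] x [0, M],
    # falls below the graph of f; then return its x-coordinate.
    while True:
        z1 = a + (b - a) * random.random()  # Z1 = a + (b - a) U1
        z2 = M * random.random()            # Z2 = M U2
        if z2 <= f(z1):
            return z1

# Illustration: triangular p.d.f. f(x) = 2x on [0, 1], bounded by M = 2.
x = acceptance_rejection(lambda t: 2.0 * t, 0.0, 1.0, 2.0)
```

Here [b - a] M = 2, so on average two (U1, U2) pairs are needed per sample, matching the efficiency measure discussed next.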
Let N be the number of pairs (U1, U2) that are used to generate one sample. Then,

    P{N = k} = (1 - p)^{k-1} p  for k = 1, 2, ....

The expected number of pairs of random numbers used to generate one sample is

    E{N} = 1/p = [b - a] M.

Thus, a good measure of the efficiency of an acceptance-rejection algorithm is the closeness of [b - a] M to 1.

Example — X has Beta(2,2) distribution. That is, the p.d.f. of X is given by

    f(x) = 6 x (1 - x) if 0 <= x <= 1,  0 otherwise.

[Figure: the p.d.f., peaking at height 1.5 at x = 1/2.]

Then we can choose a = 0, b = 1, and M = 1.5, since max f = f(1/2) = 1.5. The following acceptance-rejection method generates samples of X:

1. Generate U1 ~ U[0, 1], U2 ~ U[0, 1].
2. If 1.5 U2 <= 6 U1 (1 - U1), return U1. Otherwise, go to Step 1.

On average, this algorithm requires E{N} = [1 - 0] × 1.5 = 1.5 pairs, that is, 3 uniform random numbers per sample generated.

3. Convolution Method

Assume that the random variable X can be written as
    X = Y1 + Y2 + ... + Yn,

where Y1, Y2, ..., Yn are independent random variables. Then, an obvious strategy to generate samples of X is to generate Y1, Y2, ..., Yn independently and sum them up. This is known as the convolution method, because the c.d.f. of X can be computed analytically as the convolution of the c.d.f.'s of Y1, Y2, ..., Yn.

Example — Let X be a binomially distributed random variable with parameters n and p. Then, X can be written as

    X = Y1 + Y2 + ... + Yn,

where Y1, Y2, ..., Yn are i.i.d. random variables, each having Bernoulli distribution with parameter p. Samples of Y1, Y2, ..., Yn can be generated by the following method:

1. Generate U ~ U[0, 1].
2. If U <= p, then return 1. Otherwise, return 0.

If n is not too large, then the convolution method is reasonably efficient. One should turn
to other alternatives when n is large.

Example — Let X be a gamma distributed random variable with scale parameter λ and shape parameter n. That is, the p.d.f. of X is given by

    f(x) = λ^n x^{n-1} e^{-λx} / (n - 1)! if x > 0,  0 otherwise.

If the shape parameter n is an integer, then X can be written as the sum of n i.i.d. random variables, each exponentially distributed with parameter λ. That is,

    X = Y1 + Y2 + ... + Yn,

where Y1, Y2, ..., Yn are exponentially distributed with parameter λ. Recall that one can generate samples of an exponentially distributed random variable with parameter λ by the following method:

1. Generate U ~ U[0, 1].
2. Return -(1/λ) ln U.

Then, one can generate samples of X by the following method:

1. Generate U1 ~ U[0, 1], U2 ~ U[0, 1], ..., Un ~ U[0, 1].
2. Return -(1/λ) ln U1 - (1/λ) ln U2 - ... - (1/λ) ln Un = -(1/λ) ln(U1 U2 ... Un).
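Both convolution examples can be sketched together (a sketch, not the notes' own code; the gamma routine assumes the shape parameter n is an integer, as stated above):

```python
import math
import random

def binomial_sample(n, p):
    # Sum of n i.i.d. Bernoulli(p) variables: Y_i = 1 if U_i <= p, else 0.
    return sum(1 for _ in range(n) if random.random() <= p)

def gamma_sample(n, lam):
    # Sum of n i.i.d. Exponential(lam) variables:
    # -(1/lam) ln(U1 U2 ... Un), computed as a sum of logs.
    return -sum(math.log(random.random()) for _ in range(n)) / lam
```

Summing the logarithms instead of multiplying the uniforms avoids underflow when n is moderately large; for very large n one would switch to another gamma sampler, as the notes suggest.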
This note was uploaded on 03/08/2012 for the course ORIE 4580 at Cornell University (Engineering School).
'08, TOPALOGLU