4703-10-Notes-ARM

Copyright © 2007 by Karl Sigman

1 Acceptance-Rejection Method

As we already know, finding an explicit formula for $F^{-1}(y)$ for the cdf of a rv $X$ we wish to generate, $F(x) = P(X \leq x)$, is not always possible. Moreover, even when it is, there may be alternative methods for generating a rv distributed as $F$ that are more efficient than the inverse transform method or other methods we have come across. Here we present a very clever method known as the acceptance-rejection method.

We start by assuming that the $F$ we wish to simulate from has a probability density function $f(x)$; that is, the continuous case. Later we will give a discrete version too, which is very similar.

The basic idea is to find an alternative probability distribution $G$, with density function $g(x)$, from which we already have an efficient algorithm for generating (e.g., the inverse transform method), but also such that the function $g(x)$ is "close" to $f(x)$. In particular, we assume that the ratio $f(x)/g(x)$ is bounded by a constant $c > 0$:

    $\sup_x \{ f(x)/g(x) \} \leq c$.

(And in practice we would want $c$ as close to 1 as possible.)

Here then is the algorithm for generating $X$ distributed as $F$:

Acceptance-Rejection Algorithm for continuous random variables

1. Generate a rv $Y$ distributed as $G$.
2. Generate $U \sim \mathrm{Unif}(0,1)$ (independent of $Y$).
3. If $U \leq \frac{f(Y)}{c\,g(Y)}$, then set $X = Y$ ("accept"); otherwise go back to 1 ("reject").

Before we prove this and give examples, several things are noteworthy:

- $f(Y)$ and $g(Y)$ are rvs, hence so is the ratio $\frac{f(Y)}{c\,g(Y)}$, and this ratio is independent of $U$ in Step 2.
- The ratio is bounded between 0 and 1: $0 < \frac{f(Y)}{c\,g(Y)} \leq 1$.
- The number of times $N$ that steps 1 and 2 need to be called (i.e., the number of iterations needed to successfully generate $X$) is itself a rv and has a geometric distribution with "success" probability $p = P\left(U \leq \frac{f(Y)}{c\,g(Y)}\right)$; that is, $P(N = n) = (1-p)^{n-1}p$, $n \geq 1$. Thus on average the number of iterations required is given by $E(N) = 1/p$.

In the end we obtain our …
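As a concrete illustration of the three steps above, here is a minimal sketch in Python (not part of the original notes). The function name acceptance_rejection, the helper g_sampler, and the half-normal/exponential usage at the bottom are assumptions chosen for illustration; the loop itself follows Steps 1-3 directly.

    import math
    import random

    def acceptance_rejection(f, g, g_sampler, c):
        """Return one draw X with density f, plus the number of iterations N.

        f         -- target density f(x)
        g         -- proposal density g(x), with sup_x f(x)/g(x) <= c
        g_sampler -- function returning a draw Y from G (e.g., via inverse transform)
        c         -- the bounding constant
        """
        n = 0
        while True:
            n += 1
            y = g_sampler()               # Step 1: generate Y ~ G
            u = random.random()           # Step 2: generate U ~ Unif(0,1), independent of Y
            if u <= f(y) / (c * g(y)):    # Step 3: accept if U <= f(Y)/(c g(Y))
                return y, n               # accept: X = Y
            # otherwise reject and repeat from Step 1

    # Hypothetical usage (an assumed example, not from this preview): half-normal target
    # f(x) = sqrt(2/pi) exp(-x^2/2), x >= 0, with an Exp(1) proposal g(x) = exp(-x),
    # for which sup_x f(x)/g(x) = sqrt(2e/pi), roughly 1.32.
    f = lambda x: math.sqrt(2.0 / math.pi) * math.exp(-x * x / 2.0)
    g = lambda x: math.exp(-x)
    g_sampler = lambda: -math.log(1.0 - random.random())   # inverse transform for Exp(1)
    c = math.sqrt(2.0 * math.e / math.pi)

    x, n = acceptance_rejection(f, g, g_sampler, c)

On average the loop runs $E(N) = 1/p$ times, which is why a proposal $G$ with $c$ close to 1 keeps the expected number of rejections small.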