# IE306 Systems Simulation, Lecture 11: Variance Reduction Techniques


**IE306 Systems Simulation**, Ali Rıza Kaylan ([email protected])

## Lecture 11: Variance Reduction Techniques

Outline:

- Statistical efficiency
- Common random numbers
- Antithetic variates
- Control variates
- Indirect estimation
- Conditioning
- Importance sampling
- Bootstrapping

## Statistical Efficiency

Which estimate is better? Desirable properties of an estimator $\hat{\theta}$ of $\theta$:

- Consistency: $\hat{\theta} \to \theta$ as $n \to \infty$
- Unbiasedness: $E(\hat{\theta}) = \theta$
- Minimal variance

For $\{Y_1, Y_2, \dots, Y_n\}$ i.i.d. random variables with $\mathrm{Var}(Y) = \sigma^2$,

$$\mathrm{Var}(\bar{Y}) = \frac{\sigma^2}{n}.$$

## Common Random Numbers (CRN)

Used for the comparison of two alternative systems:

$$H_0: \mu_1 = \mu_2 \qquad H_1: \mu_1 \neq \mu_2$$

Run $n$ replications of each system, inducing positive correlation within each pair of replications by driving both systems with the same random numbers. With

$$Z_j = Y_{1j} - Y_{2j}, \quad j = 1, 2, \dots, n, \qquad \bar{Z}(n) = \frac{1}{n} \sum_{j=1}^{n} Z_j,$$

$$\mathrm{Var}(Z_j) = \mathrm{Var}(Y_{1j}) + \mathrm{Var}(Y_{2j}) - 2\,\mathrm{Cov}(Y_{1j}, Y_{2j}),$$

so a positive covariance between the paired outputs reduces the variance of the estimated difference.

## Antithetic Variates (AV)

Applicable to simulating a single system:

$$H_0: \mu = \mu_0 \qquad H_1: \mu \neq \mu_0$$

Run $n$ pairs of replications, inducing negative correlation between the two runs of each pair. With

$$Z_j = \frac{Y_{1j} + Y_{2j}}{2}, \quad j = 1, 2, \dots, n, \qquad \bar{Z}(n) = \frac{1}{n} \sum_{j=1}^{n} Z_j,$$

$$\mathrm{Var}(Z_j) = \frac{\mathrm{Var}(Y_{1j}) + \mathrm{Var}(Y_{2j}) + 2\,\mathrm{Cov}(Y_{1j}, Y_{2j})}{4},$$

$$\mathrm{Var}(\bar{Z}(n)) = \frac{\mathrm{Var}(Y_{1j}) + \mathrm{Var}(Y_{2j}) + 2\,\mathrm{Cov}(Y_{1j}, Y_{2j})}{4n},$$

so a negative covariance within each pair reduces the variance of the estimator.

## Control Variates (CV)

Let $Y$ be an output variable; we want to estimate $\mu = E(Y)$. Suppose $X$ is another random variable whose expected value $E(X) = \nu$ is known, and $Y$ is thought to be correlated with $X$. Then $X$ is referred to as a control variate for $Y$, since it will be used to adjust $Y$:

$$Y_C = Y - a(X - \nu),$$

$$\mathrm{Var}(Y_C) = \mathrm{Var}(Y) + a^2\,\mathrm{Var}(X) - 2a\,\mathrm{Cov}(X, Y).$$

Hence $\mathrm{Var}(Y_C) < \mathrm{Var}(Y)$ if and only if

$$2a\,\mathrm{Cov}(X, Y) > a^2\,\mathrm{Var}(X).$$

## Indirect Estimation

Developed for queueing-type simulations.
Steady-state performance measures:

- $d$ = expected time spent in the queue
- $w$ = expected waiting time in the system
- $Q$ = expected number in the queue
- $L$ = expected number in the system

These are linked by Little's law: $Q = \lambda d$ and $L = \lambda w$.

The direct estimators from a single run serving $n$ customers over $[0, T(n)]$ are

$$\hat{d}(n) = \frac{1}{n} \sum_{i=1}^{n} D_i, \qquad \hat{w}(n) = \frac{1}{n} \sum_{i=1}^{n} W_i,$$

$$\hat{Q}(n) = \frac{1}{T(n)} \int_0^{T(n)} Q(t)\,dt, \qquad \hat{L}(n) = \frac{1}{T(n)} \int_0^{T(n)} L(t)\,dt.$$

Since $\hat{w}(n) = \hat{d}(n) + \bar{S}(n)$ and the mean service time $E(S)$ is known, the indirect estimators replace the sampled service-time average $\bar{S}(n)$ with its exact value and apply Little's law:

$$\tilde{w}(n) = \hat{d}(n) + E(S), \qquad \tilde{Q}(n) = \lambda \hat{d}(n), \qquad \tilde{L}(n) = \lambda \tilde{w}(n) = \lambda \left[ \hat{d}(n) + E(S) \right].$$

## Conditioning

Let $Y$ be an output random variable. Suppose there is some other random variable $Z$ such that, for any particular value $z$ of $Z$, the conditional expectation $E(Y \mid Z = z)$ can be computed analytically. Then

$$\mu = E(Y) = E_Z\left[ E(Y \mid Z) \right],$$

so averaging $E(Y \mid Z)$ over sampled values of $Z$ is an unbiased estimator of $\mu$, and its variance satisfies

$$\mathrm{Var}_Z\left[ E(Y \mid Z) \right] = \mathrm{Var}(Y) - E_Z\left[ \mathrm{Var}(Y \mid Z) \right] \le \mathrm{Var}(Y).$$

## Importance Sampling

This is a method to work around the small-area (rare-event) problem. If we want to evaluate tail probabilities, or integrals over very small regions, the random number generator may produce very few hits in that region. However, we can sample from a modified distribution that makes the region more likely, provided we account for the change when we do the summation:

$$\mu = E_f\left[ g(X) \right] = \int_{-\infty}^{\infty} g(x) f(x)\,dx = \int_{-\infty}^{\infty} \frac{g(x) f(x)}{h(x)}\, h(x)\,dx = E_h\left[ g^*(X) \right],$$

where $h$ is the new sampling density (with $h(x) > 0$ wherever $g(x) f(x) \neq 0$) and

$$g^*(x) = \frac{g(x) f(x)}{h(x)}.$$

## Bootstrapping

Popularized by Brad Efron, the bootstrap is a name generically applied to statistical resampling schemes that allow uncertainty in the data to be assessed from the data themselves. Given $\{Y_1, Y_2, \dots, Y_n\}$ i.i.d. with $\mathrm{Var}(Y) = \sigma^2$, compute a statistic $S$; for instance, the sample mean

$$\bar{Y} = \frac{1}{n} \sum_{i=1}^{n} Y_i.$$

Procedure:

- Draw $n$ values from the original data with replacement.
- Calculate the statistic $S'$ from the bootstrapped sample.
- Repeat $L$ times to build up a distribution of the uncertainty in $S$.
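A minimal Python sketch of common random numbers, assuming two toy "systems" that are just exponential service times with rates 1.0 and 1.2 (illustrative parameters, not from the lecture). Feeding both systems the same uniform in each replication induces the positive covariance that shrinks $\mathrm{Var}(Z_j)$:

```python
# Common random numbers (CRN) sketch: compare paired vs. independent sampling.
# The "systems" (exponential outputs with rates 1.0 and 1.2) are illustrative
# assumptions, not the lecture's model.
import math
import random

def system(u, rate):
    # Inverse-transform an Exponential(rate) output from a uniform u.
    return -math.log(1.0 - u) / rate

def crn_differences(n, seed=42):
    rng = random.Random(seed)
    diffs = []
    for _ in range(n):
        u = rng.random()              # the SAME uniform drives both systems
        diffs.append(system(u, 1.0) - system(u, 1.2))   # Z_j = Y_1j - Y_2j
    return diffs

def independent_differences(n, seed=42):
    rng = random.Random(seed)
    return [system(rng.random(), 1.0) - system(rng.random(), 1.2)
            for _ in range(n)]        # separate streams: no induced covariance

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

crn = crn_differences(10_000)
ind = independent_differences(10_000)
# Positive Cov(Y_1j, Y_2j) under CRN shrinks Var(Z_j) versus independent runs.
print(variance(crn), variance(ind))
```

Because both outputs are monotone in the same uniform, the sample variance of the CRN differences comes out far below the independent-streams variance.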
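A minimal antithetic-variates sketch, assuming the quantity of interest is the mean of an Exponential(1) output (an illustrative model, not from the lecture). Pairing each run that uses $u$ with a partner run that uses $1 - u$ induces the negative covariance in the $\mathrm{Var}(Z_j)$ formula above:

```python
# Antithetic variates (AV) sketch: pair runs on u and 1-u.
# Estimating the mean of an Exponential(1) output is an illustrative choice.
import math
import random

def y(u):
    # Monotone in u, so y(U) and y(1-U) are negatively correlated.
    return -math.log(1.0 - u)

def antithetic_pairs(n, seed=7):
    rng = random.Random(seed)
    zs = []
    for _ in range(n):
        u = rng.random()
        zs.append((y(u) + y(1.0 - u)) / 2.0)   # Z_j = (Y_1j + Y_2j) / 2
    return zs

def independent_pairs(n, seed=7):
    rng = random.Random(seed)
    return [(y(rng.random()) + y(rng.random())) / 2.0 for _ in range(n)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

av = antithetic_pairs(20_000)
indep = independent_pairs(20_000)
# Negative Cov(Y_1j, Y_2j) makes Var(Z_j) smaller than with independent pairs.
print(variance(av), variance(indep))
```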
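A control-variates sketch, assuming we estimate $\mu = E[e^U]$ for $U \sim \mathrm{Uniform}(0,1)$ (true value $e - 1$) with $X = U$ as the control, since $E(X) = \nu = 0.5$ is known. The coefficient $a$ is set to the sample estimate of the variance-minimizing choice $a^* = \mathrm{Cov}(X, Y)/\mathrm{Var}(X)$; the whole example is illustrative, not from the lecture:

```python
# Control variates (CV) sketch: Y = exp(U), control X = U with known E(X) = 0.5.
import math
import random

def control_variate_estimate(n, seed=11):
    rng = random.Random(seed)
    xs = [rng.random() for _ in range(n)]
    ys = [math.exp(x) for x in xs]
    # Sample estimate of the optimal coefficient a* = Cov(X, Y) / Var(X).
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    varx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    a = cov / varx
    nu = 0.5                                           # known E(X)
    yc = [y - a * (x - nu) for x, y in zip(xs, ys)]    # Y_C = Y - a(X - nu)
    return sum(yc) / n, ys, yc

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

est, ys, yc = control_variate_estimate(10_000)
# The adjusted outputs Y_C keep the mean (e - 1) but have much lower variance.
print(est, variance(ys), variance(yc))
```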
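An indirect-estimation sketch for an M/M/1 queue with assumed rates $\lambda = 0.5$, $\mu = 1.0$ (so $E(S) = 1$ and, in steady state, $d = 1$, $w = 2$); the parameters are illustrative, not from the lecture. Delays in queue are generated with the Lindley recursion, and the indirect estimators then use the known $E(S)$ and Little's law:

```python
# Indirect estimation sketch for an assumed M/M/1 queue (lam = 0.5, mu = 1.0).
import random

def mm1_delays(n, lam, mu, seed=9):
    # Lindley recursion for the delay in queue of successive customers:
    #   D_{i+1} = max(0, D_i + S_i - A_{i+1})
    rng = random.Random(seed)
    delays, d = [], 0.0
    for _ in range(n):
        delays.append(d)
        s = rng.expovariate(mu)     # service time of current customer
        a = rng.expovariate(lam)    # interarrival time to next customer
        d = max(0.0, d + s - a)
    return delays

lam, mu = 0.5, 1.0
delays = mm1_delays(200_000, lam, mu)
d_hat = sum(delays) / len(delays)    # direct estimator d_hat(n)
w_tilde = d_hat + 1.0 / mu           # indirect: w~ = d_hat + E(S)
q_tilde = lam * d_hat                # indirect: Q~ = lambda * d_hat
l_tilde = lam * w_tilde              # indirect: L~ = lambda * w~
print(d_hat, w_tilde, q_tilde, l_tilde)
```

For these assumed rates the steady-state values are $d = 1.0$, $w = 2.0$, $Q = 0.5$, $L = 1.0$, so the printed estimates should land near those targets.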
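A conditioning sketch, assuming the illustrative model $Z \sim \mathrm{Exponential}(1)$ and $Y \mid Z \sim \mathrm{Poisson}(Z)$, so $E(Y \mid Z) = Z$ is available analytically (this model is not from the lecture). Averaging $Z$ instead of simulated $Y$ values removes the $E_Z[\mathrm{Var}(Y \mid Z)]$ component of the variance:

```python
# Conditioning sketch: Z ~ Exp(1), Y | Z ~ Poisson(Z), E(Y | Z) = Z.
import math
import random

def poisson(rng, lam):
    # Knuth's multiplication method for sampling a Poisson(lam) variate.
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def crude_and_conditional(n, seed=5):
    rng = random.Random(seed)
    ys, cond = [], []
    for _ in range(n):
        z = rng.expovariate(1.0)
        ys.append(poisson(rng, z))   # crude estimator: simulate Y itself
        cond.append(z)               # conditional estimator: use E(Y | Z) = Z
    return ys, cond

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

ys, cond = crude_and_conditional(20_000)
# Here Var(E(Y|Z)) = Var(Z) = 1, while Var(Y) = E(Z) + Var(Z) = 2.
print(variance(cond), variance(ys))
```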
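An importance-sampling sketch, assuming the target is the tail probability $\mu = P(X > 4)$ for $X \sim N(0,1)$ (roughly $3.2 \times 10^{-5}$) and the sampling density is taken to be $h = N(4, 1)$, centered in the tail; both choices are illustrative, not from the lecture:

```python
# Importance sampling sketch: estimate P(X > 4) for X ~ N(0,1)
# by sampling from the assumed tilted density h = N(4,1) and reweighting.
import math
import random

def phi(x, mean=0.0):
    # Normal density with the given mean and standard deviation 1.
    return math.exp(-0.5 * (x - mean) ** 2) / math.sqrt(2.0 * math.pi)

def naive_estimate(n, seed=3):
    # Crude estimator: almost all draws miss the tail, so very few hits.
    rng = random.Random(seed)
    return sum(1 for _ in range(n) if rng.gauss(0.0, 1.0) > 4.0) / n

def importance_estimate(n, seed=3):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(4.0, 1.0)             # sample from h, inside the tail
        if x > 4.0:                         # g(x) = indicator{x > 4}
            total += phi(x) / phi(x, 4.0)   # g*(x) = g(x) f(x) / h(x)
    return total / n

naive = naive_estimate(100_000)
est = importance_estimate(100_000)
print(naive, est)
```

About half the tilted draws land in the region of interest, so the reweighted average is stable while the crude estimator rests on a handful of hits.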
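The bootstrap procedure above can be sketched directly, here applied to a small made-up data set with the sample mean as the statistic $S$ (the data values are illustrative):

```python
# Bootstrap sketch: resample the data with replacement, L times.
import random

def bootstrap(data, stat, L, seed=1):
    rng = random.Random(seed)
    n = len(data)
    reps = []
    for _ in range(L):
        # Draw n values from the original data with replacement.
        resample = [data[rng.randrange(n)] for _ in range(n)]
        reps.append(stat(resample))     # S' for this bootstrap sample
    return reps

def mean(xs):
    return sum(xs) / len(xs)

data = [2.1, 3.4, 1.9, 4.8, 3.3, 2.7, 5.1, 3.0, 2.2, 4.1]
reps = bootstrap(data, mean, L=2000)
# The spread of the L replicates estimates the uncertainty in S = mean(data).
m = mean(reps)
se = (sum((r - m) ** 2 for r in reps) / (len(reps) - 1)) ** 0.5
print(mean(data), se)
```

The standard deviation of the replicates serves as a bootstrap standard error for the sample mean, obtained from the data alone.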