Proceedings of the 2006 Winter Simulation Conference
L. F. Perrone, F. P. Wieland, J. Liu, B. G. Lawson, D. M. Nicol, and R. M. Fujimoto, eds.

WHITE NOISE ASSUMPTIONS REVISITED: REGRESSION METAMODELS AND EXPERIMENTAL DESIGNS IN PRACTICE

Jack P. C. Kleijnen
Center for Economic Research and Department of Information Systems and Management
Tilburg University
5000 LE Tilburg, THE NETHERLANDS

ABSTRACT

Classic linear regression metamodels and their concomitant experimental designs assume a univariate (not multivariate) simulation response and white noise. By definition, white noise is normally (Gaussian), independently (implying no common random numbers), and identically (constant variance) distributed with zero mean (valid metamodel). This advanced tutorial tries to answer the following questions: (i) How realistic are these classic assumptions in simulation practice? (ii) How can these assumptions be tested? (iii) If the assumptions are violated, can the simulation's I/O data be transformed such that the analysis becomes correct? (iv) If such transformations cannot be applied, which alternative statistical methods (for example, generalized least squares, bootstrapping, jackknifing) can then be applied?

1 INTRODUCTION

Simulation models may be either deterministic or random (stochastic). To investigate the Input/Output (I/O) behavior of these simulation models, analysts often use linear regression metamodels; for example, first-order and second-order polynomial approximations of the I/O function implied by the underlying simulation model. A good analysis (for example, a regression analysis) requires a good statistical design; for example, a fractional factorial such as a $2^{k-p}$ design. For more mathematical details and background information I refer to my old textbook Kleijnen (1987) and my forthcoming textbook Kleijnen (2007); a recent tutorial is Kleijnen (2006).

In this article, I revisit the classic assumptions for linear regression analysis and its concomitant designs. These classic assumptions stipulate univariate output and white noise. In practice, however, these assumptions usually do not hold. Indeed, in practice the simulation output (say) $\hat{w}$ is usually a multivariate random variable. For example, the simulation output (response) $\hat{w}_1$ may estimate the mean flow time, and $\hat{w}_2$ may estimate the 90% quantile of the waiting-time distribution. More examples will follow in Section 2.

White noise (say) $u$ is Normally, Independently, and Identically Distributed (NIID) with zero mean: $u \sim \text{NIID}(0, \sigma_u^2)$. This definition implies the following assumptions:

1. normally (Gaussian) distributed simulation responses;
2. no Common Random Numbers (CRN) across the (say) $n$ factor (input) combinations simulated;
3. a common variance (or homoscedasticity) of the simulation responses across these $n$ combinations;
4. a valid regression metamodel; i.e., zero expected values for the residuals of the fitted metamodel ...
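To make the setup above concrete, the following Python sketch fits a first-order polynomial regression metamodel by ordinary least squares (OLS) to replicated outputs of a small $2^{3-1}$ fractional-factorial design and then applies standard tests of assumptions 1 and 3. This is a minimal illustration, not taken from the paper: the toy simulation function, the coefficient values, the number of replications, and the choice of Shapiro-Wilk and Bartlett tests are all assumptions made here for demonstration.

    # Minimal sketch (illustrative, not from the paper): OLS metamodel fit plus
    # residual-assumption checks for a 2^(3-1) design with replicated outputs.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # 2^(3-1) design in coded units (-1, +1); the third factor column is the
    # product of the first two (generator x3 = x1*x2), so n = 4 combinations.
    base = np.array([[-1, -1], [-1, +1], [+1, -1], [+1, +1]], dtype=float)
    design = np.column_stack([base, base[:, 0] * base[:, 1]])
    n, k = design.shape
    m = 10  # replications per combination; separate random streams, so no CRN

    def toy_simulation(x, rng):
        """Stand-in for one replication of a stochastic simulation at combination x."""
        true_mean = 5.0 + 2.0 * x[0] - 1.5 * x[1] + 0.5 * x[2]
        return true_mean + rng.normal(scale=1.0)

    # m i.i.d. replications at each of the n factor combinations.
    y = np.array([[toy_simulation(x, rng) for _ in range(m)] for x in design])

    # First-order polynomial metamodel y = b0 + b1*x1 + b2*x2 + b3*x3 + u,
    # fitted by OLS to all n*m individual replications.
    X = np.repeat(np.column_stack([np.ones(n), design]), m, axis=0)
    beta_hat, *_ = np.linalg.lstsq(X, y.ravel(), rcond=None)
    print("OLS estimates of the regression coefficients:", np.round(beta_hat, 3))

    # Assumption 1 (normality): Shapiro-Wilk test per factor combination.
    for i, reps in enumerate(y):
        stat, p = stats.shapiro(reps)
        print(f"combination {i}: Shapiro-Wilk p-value = {p:.3f}")

    # Assumption 3 (common variance): Bartlett's test across the n combinations.
    stat, p = stats.bartlett(*y)
    print(f"Bartlett's test for homoscedasticity: p-value = {p:.3f}")

Because the toy simulation here draws Gaussian noise with a common variance and uses no common random numbers, both tests should usually fail to reject; in practice, rejections of these assumptions are exactly what motivate the transformations and alternative methods (generalized least squares, bootstrapping, jackknifing) listed in the abstract.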