
IE306 SYSTEMS SIMULATION
Ali Rıza Kaylan (kaylan@boun.edu.tr)

LECTURE 8 OUTLINE: INPUT MODELING
- Data collection and compilation
- Point estimation
- Goodness-of-fit tests

DATA COMPILATION
- Visual representation: histograms, box plots, stem-and-leaf diagrams
- Descriptive statistics
  - Location statistics: sample mean, median, mode
  - Dispersion statistics: sample variance, range

MEASURES OF STATISTICAL DISPERSION
- Interquartile range (IQR): the range between the third and first quartiles.
- 25% of the data are less than or equal to the first quartile (Q1 = 5), and 25% are greater than or equal to the third quartile (Q3 = 15), so the IQR is expected to cover about half of the data.
- Example data (sorted):

  i     1  2  3  4  5  6  7   8   9   10  11
  X(i)  2  4  5  7  8  9  10  12  15  15  18

  Q1 = 5, Q2 (median) = 9, Q3 = 15, IQR = 15 - 5 = 10.

BOX PLOT
- Univariate data display developed by John W. Tukey (1970).
- Procedure (a worked sketch of these computations follows the slides below):
  1. Calculate Q1, Q2, Q3.
  2. Calculate IQR = Q3 - Q1.
  3. Construct a box above the number line bounded on the left by Q1 and on the right by Q3. Indicate where Q2 (the median) lies inside the box. The mean of the data can also be marked with a point.
  4. Upper fence = Q3 + k*IQR, lower fence = Q1 - k*IQR.
- Outliers: any observation that lies more than 1.5*IQR below Q1 or more than 1.5*IQR above Q3. Outliers are indicated by open and closed dots.
- Whiskers: the smallest and largest values that are not considered outliers.

BOX PLOT (EXAMPLE)
[Box plot drawn over a number line from 1 to 10]
- Left whisker = 5 (smallest non-outlier observation)
- Q1 = 7, Q2 (median) = 8.5, Q3 = 9
- Right whisker = 10 (largest non-outlier observation)
- IQR = Q3 - Q1 = 2
- "Mild" outlier = 3.5 (between 1.5*IQR and 3*IQR below Q1)
- "Extreme" outlier = 0.5 (more than 3*IQR below Q1)
- Negatively skewed data

PARAMETER ESTIMATION
- Maximum likelihood estimation (MLE): based on the random sample {X1, X2, ..., Xn}, the likelihood function is the joint p.d.f. (p.m.f. for the discrete case)

  l(x_1, x_2, ..., x_n; θ) = f(x_1, x_2, ..., x_n) = ∏_{i=1}^{n} f(x_i)

  and the MLE maximizes ln l(x_1, x_2, ..., x_n; θ) with respect to θ.
- Procedure (see the numerical sketch after the slides):
  1. Take the natural logarithm of the likelihood function.
  2. Take the partial derivatives with respect to the parameters and set them equal to zero.
  3. Solve for the parameters.
  4. Check the second derivatives to make sure that a maximum is attained.

PARAMETER ESTIMATION (EXAMPLE)
- A coin is flipped and Head (X = 1) or Tail (X = 0) is observed. X is a Bernoulli random variable whose success probability is restricted to p ∈ {1/4, 3/4}.
  a) Based on this single Bernoulli trial, find the likelihood function.
  b) Find the MLE of the success probability p.
- Solution:
  a) L(p; x) = p if x = 1 and L(p; x) = 1 - p if x = 0; equivalently, L(p; x) = p^x (1 - p)^(1 - x), x = 0, 1.
  b) p̂(x) = 3/4 if x = 1 and p̂(x) = 1/4 if x = 0; equivalently, p̂(X) = (2X + 1)/4.

GOODNESS-OF-FIT TESTS
- Chi-square test
- Kolmogorov-Smirnov test
- Restrictions? How to use them?
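WORKED SKETCHES (supplementary)

The following is a minimal sketch of the box-plot quantities from the dispersion slides (Q1, Q2, Q3, IQR, Tukey fences, outliers), computed for the 11-observation sample given there. It assumes the "median of the lower/upper half" quartile convention; other quartile conventions give slightly different values.

```python
def quartiles(data):
    """Return (Q1, Q2, Q3) using the median-of-halves convention."""
    x = sorted(data)
    n = len(x)

    def median(v):
        m = len(v)
        mid = m // 2
        return v[mid] if m % 2 else (v[mid - 1] + v[mid]) / 2

    q2 = median(x)
    lower = x[: n // 2]          # exclude the overall median when n is odd
    upper = x[(n + 1) // 2:]
    return median(lower), q2, median(upper)

# Sample from the slide (11 observations)
sample = [2, 4, 5, 7, 8, 9, 10, 12, 15, 15, 18]

q1, q2, q3 = quartiles(sample)
iqr = q3 - q1
k = 1.5                                    # k = 1.5 flags "mild" outliers
lower_fence = q1 - k * iqr
upper_fence = q3 + k * iqr
outliers = [v for v in sample if v < lower_fence or v > upper_fence]

print(f"Q1={q1}, Q2={q2}, Q3={q3}, IQR={iqr}")    # Q1=5, Q2=9, Q3=15, IQR=10
print(f"fences: [{lower_fence}, {upper_fence}]")  # [-10.0, 30.0]
print(f"outliers: {outliers}")                    # [] -> whiskers are the min and max
```

With no observations outside the fences, the whiskers for this sample reach the minimum (2) and maximum (18).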
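The next sketch carries out the four-step MLE procedure numerically rather than by hand. The exponential model, the simulated data, and the helper name neg_log_likelihood are illustrative assumptions, not part of the lecture; the closed form λ̂ = 1/x̄ follows from setting d/dλ [n ln λ − λ Σx_i] = 0.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=200)    # pretend these are observed interarrival times

def neg_log_likelihood(lam):
    # ln l(x; lambda) = n*ln(lambda) - lambda*sum(x); we minimize its negative
    if lam <= 0:
        return np.inf
    return -(len(x) * np.log(lam) - lam * x.sum())

res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 10.0), method="bounded")

print("numerical MLE:", res.x)           # close to 1 / sample mean
print("closed form  :", 1 / x.mean())
```

Maximizing the log-likelihood (equivalently, minimizing its negative) gives the same answer as the analytic derivative, which is a useful check when no closed form exists.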

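A quick check of the Bernoulli example from the slides: with p restricted to {1/4, 3/4}, maximizing L(p; x) = p^x (1 − p)^(1 − x) over that set reproduces p̂(x) = (2x + 1)/4.

```python
def likelihood(p, x):
    return p**x * (1 - p)**(1 - x)

candidates = (0.25, 0.75)

for x in (0, 1):
    p_hat = max(candidates, key=lambda p: likelihood(p, x))
    print(f"x={x}: MLE p_hat={p_hat}, formula (2x+1)/4 = {(2 * x + 1) / 4}")
# x=0: p_hat = 0.25;  x=1: p_hat = 0.75
```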
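Finally, a hedged sketch of the two goodness-of-fit tests named on the last slide, applied to the same illustrative exponential data. The number of classes, the equal-probability binning, and the fitted distribution are assumptions made for the example, not prescriptions from the lecture.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.exponential(scale=2.0, size=200)
scale_hat = data.mean()                  # MLE of the exponential scale (1/lambda)

# Chi-square test: bin the data into k classes with equal expected probability
k = 8
edges = stats.expon.ppf(np.linspace(0, 1, k + 1), scale=scale_hat)
observed, _ = np.histogram(data, bins=edges)
expected = np.full(k, len(data) / k)     # equal-probability classes -> expected count n/k

# ddof=1 because one parameter (the scale) was estimated from the data
chi2_stat, chi2_p = stats.chisquare(observed, expected, ddof=1)
print(f"chi-square: stat={chi2_stat:.3f}, p-value={chi2_p:.3f}")

# Kolmogorov-Smirnov test against the fitted exponential CDF
# (note: with estimated parameters the standard K-S p-value is only approximate)
ks_stat, ks_p = stats.kstest(data, "expon", args=(0, scale_hat))
print(f"K-S: stat={ks_stat:.3f}, p-value={ks_p:.3f}")
```

Large p-values mean the exponential fit is not rejected at the usual significance levels; the caveats in the comments relate to the "Restrictions?" question on the slide.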