# Chapter 8: Nonparametric Bootstrap Methods


## 1. The Basic Bootstrap Method

- Goal: measure how close a statistical estimate is to the population parameter.
- For the population mean $\mu$:
  - the approximate 95% CI is $\bar{X} \pm 2 S_X / \sqrt{n}$
  - the margin of error, $2 S_X n^{-1/2}$, is a measure of how accurately $\bar{X}$ estimates $\mu$
- This chapter presents a simple method for obtaining a margin of error in problems where analytical solutions are not available ⇒ the method is called the **bootstrap**.

### (a) Mean Squared Error and Margin of Error

- The mean squared error (MSE) of an estimate:
  $$\mathrm{MSE} = E\bigl(\hat\theta - \theta\bigr)^2$$
  where $\theta$ is a population parameter and $\hat\theta$ is a statistical estimate.
- The Chebyshev–Markov inequality:
  $$\Pr\Bigl(|\hat\theta - \theta| \le k\sqrt{\mathrm{MSE}}\Bigr) \ge 1 - \frac{1}{k^2}$$
  - in the case of estimating $\mu$, the probability may be much higher
  - the quantity $k\sqrt{\mathrm{MSE}}$ is the margin of error of the estimate
  - the margin of error can be used as a rough measure of the accuracy of an estimate when other measures are not readily available
- When an explicit formula for the MSE of an estimate is not available, estimate it by repeated sampling:
  $$\widehat{\mathrm{MSE}} = \frac{1}{T}\sum_{t=1}^{T}\bigl(\hat\theta_t - \theta\bigr)^2$$
  where $\hat\theta_t$ is the estimate of $\theta$ from the $t$-th repetition of the sampling experiment and $T$ is the number of times the experiment is repeated.

### (b) The Bootstrap Estimate of MSE

- The population distribution is generally not known, so the data are used as a substitute for the population ⇒ resample the data.
- To simulate sampling from an infinite population, sampling is done **with replacement** ⇒ a *bootstrap sample*.
- The steps for obtaining a bootstrap estimate of MSE:
  i. Compute $\hat\theta$ from the original data $x$.
  ii. Take $T$ independent bootstrap samples $x^*_1, x^*_2, \ldots, x^*_T$, each consisting of $n$ data values drawn with replacement from $x$; typically $T \ge 1000$.
  iii. Compute $\hat\theta_{b,t}$, the estimate of $\theta$ from $x^*_t$.
  iv.
Obtain the bootstrap MSE as
  $$\widehat{\mathrm{MSE}} = \frac{1}{T}\sum_{t=1}^{T}\bigl(\hat\theta_{b,t} - \hat\theta\bigr)^2.$$

- Bootstrap variance and bias:
  - the bias $B$: $B = E\bigl(\hat\theta\bigr) - \theta$, and $\mathrm{MSE} = \mathrm{Var}\bigl(\hat\theta\bigr) + B^2$
  - in place of step iv, compute
    $$\hat{E} = \frac{1}{T}\sum_{t=1}^{T}\hat\theta_{b,t}, \qquad \hat{B} = \hat{E} - \hat\theta, \qquad \widehat{\mathrm{Var}}\bigl(\hat\theta\bigr) = \frac{1}{T}\sum_{t=1}^{T}\bigl(\hat\theta_{b,t} - \hat{E}\bigr)^2$$
- Number of bootstrap samples $T$:
  - Efron and Tibshirani (1993): for $50 \le T \le 200$, $\widehat{\mathrm{se}}_{\mathrm{boot}}$ is a good estimator of the standard error
  - Booth and Sarkar (1998): $T \ge 800$
  - if the computational burden is not too great, $T \ge 1000$ is safe
- Nonparametric vs. parametric bootstrap estimates:
  - nonparametric bootstrap: no assumptions are made about the functional form of the population distribution
  - parametric bootstrap: assumptions are made about the form of the population distribution; example with a normal distribution:
    i. compute $\hat\theta$, $\bar{x}$, and $S^2_x$ from the data $x$
    ii. Take ...
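
The four steps above can be sketched in Python. This is a minimal illustration, not code from the notes; the helper name `bootstrap_mse` and the choice of `np.median` as the example statistic are my own:

```python
import numpy as np

def bootstrap_mse(x, estimator, T=1000, seed=None):
    """Bootstrap estimate of MSE, following steps i-iv above."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x)
    n = len(x)
    theta_hat = estimator(x)  # step i: estimate from the original data
    # steps ii-iii: T bootstrap samples of size n, drawn with replacement,
    # with the estimate recomputed on each sample
    theta_boot = np.array([estimator(rng.choice(x, size=n, replace=True))
                           for _ in range(T)])
    # step iv: average squared deviation from the original estimate
    return np.mean((theta_boot - theta_hat) ** 2)

# Example: margin of error for the sample median, with k = 2
x = np.random.default_rng(0).normal(loc=10.0, scale=2.0, size=100)
mse = bootstrap_mse(x, np.median, T=1000, seed=1)
margin = 2 * np.sqrt(mse)
```

Any statistic works as `estimator` (mean, median, trimmed mean, ...), which is exactly why the bootstrap is useful when no analytical MSE formula exists.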
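The bias-variance replacement for step iv can be sketched the same way (again a hypothetical helper, not from the notes): average the bootstrap estimates to get $\hat{E}$, subtract the original estimate to get $\hat{B}$, and take the spread around $\hat{E}$ as the variance.

```python
import numpy as np

def bootstrap_bias_variance(x, estimator, T=1000, seed=None):
    """Bootstrap bias, variance, and MSE = Var + B^2, replacing step iv."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x)
    n = len(x)
    theta_hat = estimator(x)
    theta_boot = np.array([estimator(rng.choice(x, size=n, replace=True))
                           for _ in range(T)])
    e_hat = theta_boot.mean()                 # E-hat: mean of bootstrap estimates
    bias = e_hat - theta_hat                  # B-hat = E-hat - theta-hat
    var = np.mean((theta_boot - e_hat) ** 2)  # bootstrap variance
    mse = var + bias ** 2                     # MSE = Var + B^2
    return bias, var, mse

# Example: the divide-by-n variance estimator, which is biased for sigma^2
x = np.random.default_rng(2).normal(size=200)
bias, var, mse = bootstrap_bias_variance(x, lambda s: np.var(s), T=1000, seed=3)
```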
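The parametric example is cut off in this preview after step i. The sketch below completes it under the standard assumption for a normal model (the completion is mine, not verbatim from the notes): fit $\bar{x}$ and $S^2_x$, then draw each bootstrap sample from $N(\bar{x}, S^2_x)$ instead of resampling the data.

```python
import numpy as np

def parametric_bootstrap_mse(x, estimator, T=1000, seed=None):
    """Parametric bootstrap MSE under a normal model (assumed completion
    of the truncated example in the notes)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x)
    n = len(x)
    theta_hat = estimator(x)
    xbar, s = x.mean(), x.std(ddof=1)  # step i: fit the normal model
    # step ii (assumed): draw samples from N(xbar, s^2) rather than
    # resampling the data with replacement
    theta_boot = np.array([estimator(rng.normal(xbar, s, size=n))
                           for _ in range(T)])
    return np.mean((theta_boot - theta_hat) ** 2)

x = np.random.default_rng(4).normal(5.0, 1.0, size=100)
mse_param = parametric_bootstrap_mse(x, np.mean, T=1000, seed=5)
```

When the normal assumption holds, the parametric version can be more efficient; when it fails, the nonparametric version is the safer default.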