Lec_16 - Bootstrap Methods, Paweł Polak, GU4222 & GR5222


1/33 Bootstrap Methods
Paweł Polak (Columbia University)
April 10, 2017
GU4222 & GR5222: Nonparametric Statistics - Lecture 16
2/33 Bootstrapping
- jackknife estimates of the standard errors
- bootstrap for the iid observations y_1, ..., y_n
- bootstrap estimates of the standard errors
- bootstrap estimates of the bias of the estimator
- the construction of confidence intervals
- bootstrap for the parameters of the linear regression model
- bootstrap for the parameters of any regression model with heteroscedasticity
- nonparametric hypothesis testing using the bootstrap, as an alternative to permutation and Monte Carlo tests
- parametric vs. nonparametric bootstrap
- smoothed bootstrap
- bootstrap for dependent data
- when the bootstrap fails
3/33 What did people do before the bootstrap was invented?
Let Y_1, ..., Y_n be independent random variables with common distribution function F, which depends on some parameter θ.
- To get quantities like standard errors or confidence intervals, we need to know the distribution of our estimates θ̂ around the true value of θ.
- These sampling distributions follow from the distribution of the data, since our estimates are functions of the data.
- The two classical responses of statisticians were to focus on tractable special cases and to appeal to asymptotics.
- In 1957 the jackknife was invented. It was halfway between classical methodology and "computer-intensive statistics".
- The bootstrap approach proposed by Efron (1979) goes further: it is a very general method that combines estimation with simulation.
4/33 Standard Errors & the Jackknife
- A central element of frequentist inference is the standard error.
- It answers the question: how accurate is the estimate?
- However, direct standard-error formulas exist only for various forms of averaging, such as linear regression, and for hardly anything else.
- The Taylor series approximation known as the "delta method" extends these formulas to smooth functions of averages.
- The jackknife is a first step toward a computation-based, non-formulaic approach to standard errors.
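As a small illustration of the delta method mentioned above (my own sketch, not from the slides): for a smooth function g of the sample mean, se(g(ȳ)) ≈ |g'(ȳ)| · se(ȳ). The data and the choice g(x) = log(x) below are hypothetical.

```python
import numpy as np

# Hypothetical iid sample (any positive-valued data would do here).
rng = np.random.default_rng(1)
y = rng.exponential(scale=2.0, size=200)

# Standard error of the sample mean: sd(y)/sqrt(n).
ybar = y.mean()
se_mean = y.std(ddof=1) / np.sqrt(len(y))

# Delta method for g(x) = log(x): g'(x) = 1/x, so
# se(log(ybar)) ≈ |1/ybar| * se(ybar).
se_log_mean = se_mean / ybar
```

The same one-line approximation applies to any smooth g; only the derivative changes.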
5/33 The Jackknife Estimate of Standard Error
- The basic applications of the jackknife are one-sample problems, where we observe an iid sample y = (y_1, ..., y_n)^T from an unknown probability distribution F.
- A real-valued statistic θ̂ has been computed by applying some algorithm s(·) to y,
    θ̂ = s(y),
  and we wish to assign a standard error to θ̂; that is, we wish to estimate the standard deviation of θ̂ under the sampling model F.
- Let θ̂_(i) = s(y_(i)) be the statistic of interest computed with observation i removed from the sample. Then the jackknife estimate of standard error for θ̂ is
    se_jack = [ ((n-1)/n) * Σ_{i=1}^n ( θ̂_(i) - θ̂_(·) )² ]^{1/2},  where  θ̂_(·) = Σ_{i=1}^n θ̂_(i) / n.
- The fudge factor (n-1)/n makes se_jack agree with the usual standard error of the sample mean when θ̂ is the mean.
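The recipe above is a few lines of code. Below is a minimal sketch (my own implementation, assuming NumPy); the function name `jackknife_se` is not from the slides. As a sanity check, for the sample mean the jackknife SE reproduces the textbook formula sd(y)/sqrt(n), which is exactly what the (n-1)/n factor is calibrated to do.

```python
import numpy as np

def jackknife_se(y, stat):
    """Jackknife standard error of stat(y) for an iid sample y."""
    y = np.asarray(y)
    n = len(y)
    # Leave-one-out replications: theta_(i) = s(y with observation i removed).
    theta_i = np.array([stat(np.delete(y, i)) for i in range(n)])
    theta_dot = theta_i.mean()
    # se_jack = sqrt( (n-1)/n * sum (theta_(i) - theta_(.))^2 )
    return np.sqrt((n - 1) / n * np.sum((theta_i - theta_dot) ** 2))

# Sanity check on simulated data: for stat = mean, the jackknife SE
# equals the classical formula sd(y, ddof=1)/sqrt(n).
rng = np.random.default_rng(0)
y = rng.normal(size=50)
se_jack = jackknife_se(y, np.mean)
se_formula = y.std(ddof=1) / np.sqrt(len(y))
```

The same `jackknife_se` call works unchanged for statistics with no closed-form SE, e.g. `jackknife_se(y, np.median)`.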
