The Jackknife Approach

Jackknife estimators allow one to correct for a bias and its statistical error. The method was introduced in the 1950s in papers by Quenouille and Tukey. The jackknife method is recommended as the standard for error bar calculations: in unbiased situations the jackknife and the usual error bars agree, and otherwise the jackknife estimates are improvements, so that one cannot lose. In particular, the jackknife method solves the question of error propagation elegantly and with little effort.

The unbiased estimator of the expectation value $\widehat{x}$ is
$$\overline{x} \,=\, \frac{1}{N} \sum_{i=1}^{N} x_i \,.$$
Normally, bias problems occur when one estimates a non-linear function of $\widehat{x}$:
$$\widehat{f} \,=\, f(\widehat{x}) \,. \qquad (1)$$
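As a concrete illustration (this example is not part of the original notes and assumes independent measurements with variance $\sigma^2(x)$): for $f(x) = x^2$ the naive estimate $f(\overline{x})$ satisfies
$$\langle f(\overline{x}) \rangle \,=\, \langle \overline{x}^{\,2} \rangle \,=\, \widehat{x}^{\,2} + \sigma^2(\overline{x}) \,=\, \widehat{f} + \frac{\sigma^2(x)}{N} \,,$$
i.e., it overshoots the target $\widehat{f} = \widehat{x}^{\,2}$ by a term of order $1/N$.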
Typically, the bias of the estimator $\overline{f} = f(\overline{x})$ is of order $1/N$:
$$\mathrm{bias}(\overline{f}) \,=\, \widehat{f} - \langle \overline{f} \rangle \,=\, \frac{a_1}{N} + \frac{a_2}{N^2} + O\!\left(\frac{1}{N^3}\right) . \qquad (2)$$
Unfortunately, we lose the ability to estimate the variance $\sigma^2(\overline{f}) = \sigma^2(f)/N$ via the standard equation
$$s^2(\overline{f}) \,=\, \frac{1}{N}\, s^2(f) \,=\, \frac{1}{N\,(N-1)} \sum_{i=1}^{N} \left( f_i - \overline{f} \right)^2 , \qquad (3)$$
because $f_i = f(x_i)$ is not a valid estimator of $\widehat{f}$: $\widehat{f} - \langle f_i \rangle = O(1)$.
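In the same illustrative $f(x) = x^2$ example (again not part of the original notes), $\langle f_i \rangle = \langle x_i^2 \rangle = \widehat{x}^{\,2} + \sigma^2(x)$, so each $f_i$ misses the target $\widehat{f}$ by $\sigma^2(x)$, a discrepancy that does not shrink with $N$; this is why equation (3) cannot be used here.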
Also, in non-trivial applications it is almost always a bad idea to use standard error-propagation formulas with the aim to deduce $\triangle f$ from $\triangle x$. Jackknife methods are not only easier to implement, but also more precise and far more robust.

The error bar problem for the estimator $\overline{f}$ is conveniently overcome by using the two jackknife estimators $\overline{f}^J$ and $f^J_i$, defined by
$$\overline{f}^J \,=\, \frac{1}{N} \sum_{i=1}^{N} f^J_i \quad\text{with}\quad f^J_i \,=\, f(\overline{x}^J_i) \quad\text{and}\quad \overline{x}^J_i \,=\, \frac{1}{N-1} \sum_{k \ne i} x_k \,. \qquad (4)$$
The estimator for the variance $\sigma^2(\overline{f}^J)$ is
$$s^2_J(\overline{f}^J) \,=\, \frac{N-1}{N} \sum_{i=1}^{N} \left( f^J_i - \overline{f}^J \right)^2 . \qquad (5)$$
Straightforward algebra shows that in the unbiased case the jackknife variance estimator (5) reduces to the normal variance (3). Note that only of order $N$ (not $N^2$) operations are needed to construct the jackknife averages $\overline{x}^J_i$, $i = 1, \dots, N$, from the original data.
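A minimal sketch of how equations (4) and (5) translate into code is given below (Python with NumPy, purely for illustration; the function name `jackknife_error` and the synthetic example data are our own choices, not part of the notes):

```python
import numpy as np

def jackknife_error(x, f):
    """Jackknife estimate of f at the mean and its error bar, following Eqs. (4)-(5).

    x : 1-D array of N measurements x_i (assumed approximately independent)
    f : non-linear function of the mean, e.g. f = lambda m: m**2
    """
    x = np.asarray(x, dtype=float)
    N = len(x)

    # Jackknife averages x^J_i = (sum_k x_k - x_i) / (N - 1):
    # the total sum is computed once and reused, so this is O(N), not O(N^2).
    total = x.sum()
    xJ = (total - x) / (N - 1)

    # Jackknife estimators f^J_i and their mean, Eq. (4).
    fJ = np.array([f(m) for m in xJ])
    fJ_bar = fJ.mean()

    # Jackknife variance, Eq. (5); the error bar is its square root.
    s2J = (N - 1) / N * np.sum((fJ - fJ_bar) ** 2)
    return fJ_bar, np.sqrt(s2J)

# Usage sketch: estimate f(<x>) = <x>^2 from synthetic Gaussian data.
rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=0.5, size=1000)
estimate, error = jackknife_error(data, lambda m: m ** 2)
print(f"jackknife estimate = {estimate:.4f} +/- {error:.4f}")
```

For the identity $f(x) = x$ this routine reproduces the usual error bar of the mean, consistent with the remark above that (5) reduces to (3) in the unbiased case.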