Bootstrap and Jackknife Calculations in R
Version 6 April 2004
These notes work through a simple example to show how one can program R to do both jackknife and bootstrap sampling. We start with bootstrapping.
Bootstrap Calculations
R has a number of nice
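Although the fragment breaks off here, the core bootstrap computation the notes describe — resampling the data with replacement and recomputing a statistic on each resample — can be sketched as follows. This is a minimal Python illustration of the idea (the notes themselves program it in R; the data and the function name `bootstrap_se` are made up for illustration):

```python
import random

random.seed(0)

def bootstrap_se(data, stat, B=2000):
    """Bootstrap standard error of `stat`: draw B resamples of the data
    with replacement, recompute the statistic on each resample, and
    return the standard deviation of the replicates."""
    reps = []
    for _ in range(B):
        resample = [random.choice(data) for _ in data]
        reps.append(stat(resample))
    mean_rep = sum(reps) / B
    var_rep = sum((r - mean_rep) ** 2 for r in reps) / (B - 1)
    return var_rep ** 0.5

sample = [2.1, 4.4, 9.0, 11.7]   # made-up data
se_mean = bootstrap_se(sample, lambda xs: sum(xs) / len(xs))
```

The jackknife differs only in how resamples are formed: instead of sampling with replacement, each replicate leaves one observation out.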
Chapter 5. Hypothesis Testing
1 Nested Hypotheses
In this chapter we provide a theoretical discussion on testing of statistical hypotheses.
Neyman and Pearson (1933) presented Neyman-Pearson Fundamental Lemma which unfolded the various complex problems
Chapter 3. Asymptotic Methods
1 Modes of Convergence of a Sequence of Random Variables
Because exact calculations are often difficult, we make use of asymptotic
results. For example, we rely on approximations of probabilities when
computing significa
Chapter 2. Order Statistics
1 The Order Statistics
For a sample of independent observations X1, X2, . . . , Xn on a distribution F, the ordered
sample values
X(1) ≤ X(2) ≤ · · · ≤ X(n),
or, in more explicit notation,
X(1:n) ≤ X(2:n) ≤ · · · ≤ X(n:n),
are called the order statistics.
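Concretely, the order statistics are nothing more than the sorted sample values, as a quick Python illustration (with made-up data) shows:

```python
sample = [3.1, 0.7, 2.4, 1.9]      # made-up observations X1, ..., X4
order_stats = sorted(sample)       # X(1) <= X(2) <= X(3) <= X(4)
# order_stats[0] is the sample minimum X(1:n),
# order_stats[-1] is the sample maximum X(n:n)
```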
Chapter 4. Method of Maximum Likelihood
1 Introduction
Many statistical procedures are based on statistical models which specify under
which conditions the data are generated. Usually the assumption is made that
the set of observations x1 , . . . , xn i
Chapter 1. Bootstrap Method
1.1 Introduction
The Practice of Statistics
Statistics is the science of learning from experience, especially experience that arrives a little
bit at a time. Most people are not natural-born statisticians. Left to our own d
AN INTRODUCTION TO
EXTREME ORDER
STATISTICS AND
ACTUARIAL
APPLICATIONS
2004 ERM Symposium, Chicago
April 26, 2004
Sessions CS 1E, 2E, 3E:
Extreme Value Forum
H. N. Nagaraja
(hnn@stat.ohio-state.edu)
Ohio State University, Columbus
Hour 1: ORDER STATISTICS
Large Sample Theory
Homework 5: Maximum Likelihood Estimate, Testing, Asymptotic Distribution
Due Date: January 12th
1. Consider the classical Gaussian linear model Yi = μi + εi, 1 ≤ i ≤ n, where μi = zi^T β
and the εi are i.i.d. Gaussian with mean 0 and variance σ^2.
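In this model the maximum likelihood estimator of the coefficients coincides with the least-squares estimator. A small Python simulation sketches the computation for a scalar covariate with no intercept (the true parameter values and data are made up for illustration):

```python
import random

random.seed(2)

# Gaussian linear model with scalar covariate z_i and mu_i = beta * z_i.
beta_true, sigma = 2.0, 0.5        # made-up true values
z = [i / 10 for i in range(1, 51)]
y = [beta_true * zi + random.gauss(0, sigma) for zi in z]

# MLE = least squares; for a no-intercept model the closed form is
# beta_hat = sum(z_i * y_i) / sum(z_i ** 2).
beta_hat = sum(zi * yi for zi, yi in zip(z, y)) / sum(zi * zi for zi in z)
```

With n = 50 observations the estimate lands close to the true coefficient, consistent with the asymptotic normality of the MLE.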
Large Sample Theory
Homework 4: Methods of Estimation, Asymptotic Distribution, Probability and Conditioning
Due Date: December 1st
1. The Weibull distribution (after the Swedish physicist Waloddi Weibull, who proposed
the distribution in 1939 for the bre
Large Sample Theory
Homework 3: Probability and Conditioning
Due Date: November 10th
1. Let X be a random variable with range {0, 1, 2, . . .}. Show that if E(X) < ∞, then
E(X) = Σ_{n=1}^∞ P(X ≥ n).
2. Let X be a random variable having a c.d.f. F (x). Show
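The tail-sum identity of Problem 1, E(X) = Σ P(X ≥ n), can be checked numerically against a distribution whose mean is known — here a geometric distribution on {0, 1, 2, . . .}. This is a Python sanity check, not the requested proof:

```python
# Geometric distribution on {0, 1, 2, ...}: P(X = k) = (1 - p)**k * p,
# with known mean E(X) = (1 - p) / p and tail P(X >= n) = (1 - p)**n.
p = 0.3
mean = (1 - p) / p
# The tail probabilities decay geometrically, so the infinite sum
# can be truncated at a large N with negligible error.
tail_sum = sum((1 - p) ** n for n in range(1, 2000))
```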
Large Sample Theory
Homework 2: Order Statistics
Due Date: October 27th
1. Show that if X has a Beta(r, s) distribution, then
E(X^k) = [r(r + 1) · · · (r + (k − 1))] / [(r + s)(r + s + 1) · · · (r + s + (k − 1))],  k = 1, 2, . . . ,
and
Var(X) = rs / [(r + s)^2 (r + s + 1)].
2. Let X1, . . . , Xn be a sample from uniform U
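The Beta-moment identity of Problem 1 can be verified numerically, since for X ~ Beta(r, s) the moments also have the Gamma-function form E(X^k) = Γ(r + k)Γ(r + s) / (Γ(r)Γ(r + s + k)). This Python check compares the two forms and the variance formula (a numeric sanity check with arbitrary parameter values, not a proof):

```python
from math import gamma

def beta_moment_gamma(r, s, k):
    """E[X^k] for X ~ Beta(r, s), via the Gamma-function form."""
    return gamma(r + k) * gamma(r + s) / (gamma(r) * gamma(r + s + k))

def beta_moment_product(r, s, k):
    """The same moment via the product formula of Problem 1."""
    m = 1.0
    for i in range(k):
        m *= (r + i) / (r + s + i)
    return m

r, s = 2.5, 4.0   # arbitrary Beta parameters for the check
var = beta_moment_gamma(r, s, 2) - beta_moment_gamma(r, s, 1) ** 2
```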
Large Sample Theory
Homework 1: Bootstrap Method, CLT
Due Date: October 3rd, 2004
1. Suppose that someone collects a random sample of size 4 of a particular measurement. The observed values are {2, 4, 9, 12}.
(a) Find the bootstrap mean and variance of
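Assuming the truncated part (a) asks about the sample mean (an assumption, since the question is cut off), the ideal bootstrap can be computed exactly: the bootstrap mean of the sample mean equals the observed mean, and the bootstrap variance equals the plug-in variance divided by n. A Python sketch compares these exact values with a Monte Carlo approximation:

```python
import random

random.seed(1)

data = [2, 4, 9, 12]
n = len(data)

# Exact (ideal) bootstrap values for the sample mean, assuming that is
# the statistic the truncated question refers to.
xbar = sum(data) / n                              # sample mean
plug_in_var = sum((x - xbar) ** 2 for x in data) / n
exact_boot_var = plug_in_var / n                  # bootstrap variance of the mean

# Monte Carlo approximation with B resamples.
B = 20000
reps = []
for _ in range(B):
    resample = [random.choice(data) for _ in range(n)]
    reps.append(sum(resample) / n)
boot_mean = sum(reps) / B
boot_var = sum((r - boot_mean) ** 2 for r in reps) / B
```

The Monte Carlo values converge to the exact ones as B grows, which is the usual way bootstrap quantities are computed in practice.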
Topic 3: Tests in Parametric Models
Hypothesis Testing By Likelihood Methods
Let H0 denote a null hypothesis to be tested. Typically, we may represent H0
as a specified family F0 of distributions for the data.
For any test procedure T , we shall denote by