Chapter 2
Some Basic Large Sample Theory
1. Modes of Convergence
Convergence in distribution, d
Convergence in probability, p
Convergence almost surely, a.s.
Convergence in rth mean, r
2. Classical Limit Theorems
Weak and strong laws of large numbers
Cl
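The modes of convergence listed above are easy to probe by simulation. As an illustrative sketch (not part of the chapter itself), here is a Monte Carlo check of the weak law of large numbers, i.e. convergence in probability of the sample mean of i.i.d. Uniform(0, 1) draws to 1/2:

```python
import random

random.seed(0)

def deviation_rate(n, reps=2000, eps=0.05):
    """Fraction of replications where |sample mean - 1/2| exceeds eps."""
    exceed = 0
    for _ in range(reps):
        xbar = sum(random.random() for _ in range(n)) / n
        if abs(xbar - 0.5) > eps:
            exceed += 1
    return exceed / reps

# As n grows, P(|X̄_n - 1/2| > eps) should shrink toward 0.
small_n = deviation_rate(10)
large_n = deviation_rate(1000)
```

With n = 10 the deviation probability is large (roughly 0.58 by a normal approximation), while at n = 1000 it is essentially zero, matching the convergence-in-probability definition.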
Chapter 1
Special Distributions
1. Special Distributions
Bernoulli, binomial, geometric, and negative binomial
Sampling with and without replacement; Hypergeometric
Finite sample variance correction
Poisson and an informal Poisson process
Stationary and
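The with/without-replacement contrast above has a concrete numerical face: the hypergeometric variance equals the binomial variance times the finite population correction (N - n)/(N - 1). A small simulation sketch (illustrative only; the population sizes are arbitrary) checks this:

```python
import random

random.seed(1)
N, K, n = 50, 20, 10                  # population, successes, sample size
pop = [1] * K + [0] * (N - K)
p = K / N

# Theoretical variances of the number of successes in the sample
binom_var = n * p * (1 - p)                     # with replacement
hyper_var = binom_var * (N - n) / (N - 1)       # finite population correction

# Empirical variance when sampling WITHOUT replacement
reps = 10000
draws = [sum(random.sample(pop, n)) for _ in range(reps)]
m = sum(draws) / reps
emp_var = sum((d - m) ** 2 for d in draws) / reps
```

The empirical variance lands near the corrected value (about 1.96 here) and strictly below the with-replacement value 2.4, illustrating why the correction matters for finite populations.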
Chapter 3
Lower Bounds for Estimation
1. Introduction and Examples
2. Cramér-Rao lower bounds for parametric models
3. Regular Estimators; Superefficiency; LAN and Le Cam's three lemmas
4. Hájek's convolution theorem and local asymptotic minimax theorem
5
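As a hedged illustration of the Cramér-Rao bound (an added sketch, not the chapter's own example): for Bernoulli(p), the per-observation Fisher information is 1/(p(1 - p)), so the bound for unbiased estimators of p from n i.i.d. draws is p(1 - p)/n, which the sample mean attains:

```python
import random

random.seed(2)
p, n, reps = 0.3, 100, 10000
cr_bound = p * (1 - p) / n          # Cramér-Rao bound = 0.0021

# Empirical variance of the MLE (the sample mean) over many replications
estimates = []
for _ in range(reps):
    x = [1 if random.random() < p else 0 for _ in range(n)]
    estimates.append(sum(x) / n)

mean_hat = sum(estimates) / reps
var_hat = sum((e - mean_hat) ** 2 for e in estimates) / reps
```

The empirical variance sits on the bound (up to Monte Carlo error), showing the sample mean is efficient in this model.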
Chapter 8
Bootstrap and Jackknife Estimation of
Sampling Distributions
1. A General View of the Bootstrap
2. Bootstrap Methods
3. The Jackknife
4. Some limit theory for bootstrap methods
5. The bootstrap and the delta method
6. Bootstrap Tests and Boots
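The nonparametric bootstrap of Section 2 can be sketched in a few lines (an illustrative example, not the chapter's own code): estimate the standard error of the sample median by resampling with replacement from the data.

```python
import random
import statistics

random.seed(3)
data = [random.gauss(0, 1) for _ in range(50)]   # observed sample

# Resample with replacement B times; recompute the median each time.
B = 1000
boot_medians = [
    statistics.median(random.choices(data, k=len(data))) for _ in range(B)
]
boot_se = statistics.stdev(boot_medians)          # bootstrap SE estimate
```

For n = 50 standard normal data the true SE of the median is roughly sqrt(pi/2)/sqrt(50) ≈ 0.18, and the bootstrap estimate lands in that neighborhood without any analytic variance formula.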
Chapter 5
Bayes Methods
and Elementary Decision Theory
1. Elementary Decision Theory
2. Structure of the risk body: the finite case
3. The finite case: relations between Bayes, minimax, and admissibility
4. Posterior distributions
5. Finding Bayes rules
6. F
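A minimal worked sketch of Sections 4-5 (added for illustration; the prior and data are arbitrary): with a Beta(a, b) prior on a Bernoulli success probability and k successes in n trials, the posterior is Beta(a + k, b + n - k), and under squared-error loss the Bayes rule is the posterior mean.

```python
# Beta-Bernoulli conjugate update: posterior is Beta(a + k, b + n - k).
a, b = 1.0, 1.0          # uniform prior on p
n, k = 20, 7             # 7 successes in 20 trials

post_a, post_b = a + k, b + (n - k)
bayes_estimate = post_a / (post_a + post_b)   # posterior mean = Bayes rule
mle = k / n                                    # for comparison
```

Here the Bayes estimate 8/22 ≈ 0.364 is shrunk from the MLE 0.35 toward the prior mean 0.5, the usual qualitative effect of a proper prior.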
Chapter 4
Efficient Likelihood Estimation
and Related Tests
1. Maximum likelihood and efficient likelihood estimation
2. Likelihood ratio, Wald, and Rao (or score) tests
3. Examples
4. Consistency of Maximum Likelihood Estimates
5. The EM algorithm and
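The likelihood-ratio test of Section 2 admits a one-function sketch (illustrative, with made-up data): testing H0: p = p0 for a Bernoulli proportion, the statistic -2 log(LR) is referred to a chi-square(1) cutoff (3.84 at level 0.05).

```python
import math

def lrt_stat(k, n, p0):
    """-2 log likelihood ratio for H0: p = p0 vs. the MLE p_hat = k/n."""
    p_hat = k / n
    def loglik(p):
        return k * math.log(p) + (n - k) * math.log(1 - p)
    return 2 * (loglik(p_hat) - loglik(p0))

# Hypothetical data: 62 successes in 100 trials, testing p0 = 0.5.
stat = lrt_stat(k=62, n=100, p0=0.5)
reject = stat > 3.84          # chi-square(1) 0.95 quantile
```

Here the statistic is about 5.8, so H0: p = 0.5 is rejected at the 5% level; the Wald and score tests give asymptotically equivalent answers.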
Chapter 13
Sufficiency and Unbiased Estimation
1. Conditional Probability and Expectation
2. Sufficiency
3. Exponential families and sufficiency
4. Uses of sufficiency
5. Ancillarity and completeness
6. Unbiased estimation
7. Nonparametric unbiased esti
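Unbiased estimation (Section 6) has a standard first example worth a quick numeric check (an added sketch, not from the chapter): the sample variance with divisor n - 1 is unbiased for sigma^2, while the divisor-n version is biased downward by the factor (n - 1)/n.

```python
import random

random.seed(5)
n, reps, sigma2 = 5, 40000, 4.0

unb, biased = [], []
for _ in range(reps):
    x = [random.gauss(0, 2) for _ in range(n)]
    m = sum(x) / n
    ss = sum((xi - m) ** 2 for xi in x)
    unb.append(ss / (n - 1))      # unbiased estimator
    biased.append(ss / n)         # biased by factor (n - 1)/n

mean_unb = sum(unb) / reps
mean_biased = sum(biased) / reps
```

The divisor-(n - 1) average lands near the true sigma^2 = 4, while the divisor-n average settles near 4 × 4/5 = 3.2, exhibiting the bias exactly as predicted.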
Chapter 7
Statistical Functionals and the Delta
Method
1. Estimators as Functionals of F_n or P_n
2. Continuity of Functionals of F or P
3. Metrics for Distribution Functions F and Probability Distributions P
4. Differentiability of Functionals of F or P
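The differentiability heading above is the entry point to the delta method, which can be checked numerically (an illustrative sketch with an arbitrary smooth functional): if sqrt(n)(X̄ - mu) → N(0, sigma^2), then for smooth g, sqrt(n)(g(X̄) - g(mu)) → N(0, g'(mu)^2 sigma^2). With g(x) = x^2 and Uniform(0, 1) data, mu = 1/2, sigma^2 = 1/12, and the limiting variance is (2 mu)^2 sigma^2 = 1/12.

```python
import math
import random

random.seed(4)
n, reps = 400, 5000
mu, sigma2 = 0.5, 1.0 / 12.0
delta_var = (2 * mu) ** 2 * sigma2        # delta-method variance = 1/12

# Simulate sqrt(n) (g(X̄) - g(mu)) with g(x) = x^2 and compare variances.
vals = []
for _ in range(reps):
    xbar = sum(random.random() for _ in range(n)) / n
    vals.append(math.sqrt(n) * (xbar ** 2 - mu ** 2))

m = sum(vals) / reps
emp_var = sum((v - m) ** 2 for v in vals) / reps
```

The empirical variance matches 1/12 up to Monte Carlo error, confirming the first-order linearization that the delta method formalizes for differentiable functionals.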