Lecture 2
Admissibility and Bayesian
Two criteria will be introduced in what follows to evaluate and compare decision rules.
2.1
Admissibility
An order can be defined between two decision rules d and d′ based on their risk functions: d is said to be (at least) as good as d′ if R(θ, d) ≤ R(θ, d′) for all θ ∈ Θ.
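As a minimal sketch of this risk-based ordering (the model and loss below are illustrative assumptions, not from the notes): under squared-error loss with X_1, …, X_n iid N(θ, σ²), a linear rule d_a(X) = a·X̄ has risk R(θ, d_a) = a²σ²/n + (a − 1)²θ², so we can compare two rules pointwise in θ.

```python
# Illustrative sketch (assumed normal model, squared-error loss):
# risk of the linear rule d_a(X) = a * Xbar for estimating theta,
#   R(theta, d_a) = a^2 * sigma^2 / n + (a - 1)^2 * theta^2.
def risk(theta, a, sigma2=1.0, n=10):
    return a**2 * sigma2 / n + (a - 1)**2 * theta**2

# d1 = Xbar (a = 1) has constant risk; d2 = 0.5 * Xbar beats it near
# theta = 0 but loses for large |theta| -- neither rule dominates.
for theta in (0.0, 0.2, 1.0):
    print(theta, risk(theta, 1.0), risk(theta, 0.5))
```

This is the typical situation: risk functions cross, so the ordering is only partial, which is what motivates criteria such as admissibility.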
Lecture 1
Basic Elements in Decision Theory
1.1
Introduction
To put this course in the big picture, consider the reconstruction cycle for statistical modelling:
[Diagram: proposed model ↔ fitted model, connected by statistical inference; (bottom-up) descriptive statistics; (top-down) Monte Carlo.]
Lecture 8
Information Inequalities
One criterion for evaluating an estimator T(X) of q(θ), where θ is an unknown parameter while q is a known function, is the mean squared error (MSE)

R(θ, T) = E_θ[T(X) − q(θ)]² = Var_θ(T) + [E_θ T − q(θ)]² = variance + (bias)².
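The decomposition can be checked numerically; a small sketch (the Uniform(0, θ) example and the estimator T = max(X) are assumptions for illustration, not from the notes):

```python
# Monte Carlo check of MSE = variance + bias^2 for the (deliberately
# biased) estimator T = max(X_1, ..., X_n) of theta, X_i ~ Uniform(0, theta).
import random

random.seed(0)
theta, n, reps = 2.0, 5, 100_000
vals = [max(random.uniform(0, theta) for _ in range(n)) for _ in range(reps)]
mean_T = sum(vals) / reps
mse = sum((v - theta) ** 2 for v in vals) / reps
var = sum((v - mean_T) ** 2 for v in vals) / reps
bias2 = (mean_T - theta) ** 2
print(mse, var + bias2)  # the two quantities agree (exact identity)
```

Note that for the empirical moments the identity holds exactly, not just in expectation, so the two printed numbers match up to floating-point rounding.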
Lecture 10
Likelihood Ratio Tests
In this lecture, a basic framework of hypothesis testing will be presented, followed by an introduction to likelihood ratio tests.
10.1
Basic elements in hypothesis testing
Data model: X ~ f(x|θ), x ∈ 𝒳, θ ∈ Θ. Parameter space Θ ⊆ ℝ^d.
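As a concrete sketch of a likelihood ratio statistic in this framework (the N(θ, 1) example and the simple null H0: θ = 0 are assumptions for illustration, not taken from the notes): with X_1, …, X_n iid N(θ, 1), the supremum of the likelihood over H0 is attained at θ = 0 and over all of Θ at θ = x̄, giving λ(x) = exp(−n x̄²/2) and −2 log λ(x) = n x̄².

```python
# Sketch: generalized likelihood ratio for H0: theta = 0 vs H1: theta free,
# with X_i ~ N(theta, 1). The MLE under H1 is xbar, so
#   lambda(x) = exp(-n * xbar^2 / 2)  and  -2 log lambda = n * xbar^2.
def neg2_log_lambda(xs):
    n = len(xs)
    xbar = sum(xs) / n
    return n * xbar ** 2  # asymptotically chi-square(1) under H0

print(neg2_log_lambda([0.1, -0.2, 0.3, 0.0]))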
Lecture 7
Maximum Likelihood
Maximum likelihood has been the most extensively used frequentist method in point estimation. Unlike empirical frequencies introduced in Lecture 6, maximum likelihood is a fully model-based approach. See Efron's Wald Lecture p
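A minimal sketch of the model-based idea (the exponential model below is an assumed example, not one from the notes): with X_1, …, X_n iid Exponential(λ), the log-likelihood is n log λ − λ Σx_i, and setting its derivative to zero gives the closed-form maximizer λ̂ = n / Σx_i = 1/x̄.

```python
# Sketch: closed-form MLE of the exponential rate lambda.
# log L(lam) = n*log(lam) - lam*sum(x); d/dlam = 0  =>  lam_hat = n/sum(x).
def mle_exponential_rate(xs):
    return len(xs) / sum(xs)

print(mle_exponential_rate([0.5, 1.5, 2.0]))  # 3 / 4.0 = 0.75
```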
Lecture 11
Neyman-Pearson Theory
Neyman-Pearson theory sets a foundation for hypothesis testing. It enhances our understanding of the trade-off between type I and type II errors.
Definition 1. For α ∈ (0, 1), let T_α be the set of all level-α tests (including randomized tests).
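The type I / type II trade-off can be made concrete with a small sketch (the simple-vs-simple normal example is an illustrative assumption, not from the notes): for H0: μ = 0 vs H1: μ = 1 with a single X ~ N(μ, 1), the most powerful level-α test rejects for large X, and its power grows as α grows.

```python
# Sketch: power of the most powerful level-alpha test for
# H0: mu = 0 vs H1: mu = 1, one observation X ~ N(mu, 1).
from statistics import NormalDist

std = NormalDist()

def power(alpha, mu1=1.0):
    c = std.inv_cdf(1 - alpha)            # rejection threshold under H0
    return 1 - NormalDist(mu=mu1).cdf(c)  # P(reject H0) under H1

for alpha in (0.01, 0.05, 0.10):
    print(alpha, round(power(alpha), 4))  # power increases with alpha
```

Raising α (more type I error allowed) lowers the rejection threshold and hence raises power (less type II error) — exactly the trade-off the theory quantifies.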
Lecture 12
Some Extensions of Neyman-Pearson Theory
12.1
One-sided hypotheses and monotone likelihood ratios
Definition 1. F = {f(·|θ) : θ ∈ Θ} is said to have a monotone likelihood ratio (MLR) in a sufficient statistic U if for any θ₀ < θ₁, the ratio f(x|θ₁)/f(x|θ₀) is a nondecreasing function of U(x).
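As a quick numerical sketch of the definition (the N(θ, 1) family is an assumed example): for θ₀ < θ₁ the ratio f(x|θ₁)/f(x|θ₀) = exp((θ₁ − θ₀)x − (θ₁² − θ₀²)/2) is increasing in U(x) = x, so the family has an MLR in x.

```python
# Sketch: check numerically that N(theta, 1) has an MLR in U(x) = x.
# f(x|th1)/f(x|th0) = exp((th1 - th0)*x - (th1^2 - th0^2)/2) for th0 < th1.
import math

def lr(x, th0=0.0, th1=1.0):
    return math.exp((th1 - th0) * x - (th1**2 - th0**2) / 2)

xs = [-2, -1, 0, 1, 2]
ratios = [lr(x) for x in xs]
print(ratios == sorted(ratios))  # True: the ratio is monotone in x
```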
Lecture 16
Consistency
Unless mentioned otherwise, in the next few lectures we assume that {X_n} is a sequence of iid samples from P_θ, θ ∈ Θ, and will present some basic results in asymptotic statistics, i.e. large-sample theory.
Definition 1. T_n = T(X^n) is said to be consistent for q(θ) if T_n → q(θ) in probability under P_θ for every θ ∈ Θ.
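A small sketch of consistency in action (the Uniform(0, 1) example is an assumption for illustration): the sample mean T_n = X̄_n is consistent for the population mean 1/2, and its error visibly shrinks as n grows, in line with the law of large numbers.

```python
# Sketch: the sample mean of iid Uniform(0, 1) draws converges to 1/2.
import random

random.seed(1)

def sample_mean(n):
    return sum(random.random() for _ in range(n)) / n

for n in (10, 1000, 100_000):
    print(n, abs(sample_mean(n) - 0.5))  # error shrinks as n grows
```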