Text: Statistical Inference, by George Casella and Roger Berger.
Outline
1. Review of probability.
2. Important Discrete Distributions.
3. Important Continuous distributions.
4. Transformations, multivariate distributions.
5. Data reduction. Sufficiency.
6.

Lecture 7.
Bayesian Estimation. Here we assume that while we do not know the
exact value of the unknown parameter θ, we suppose that it is chosen
randomly from a set of possible values of θ, and we have reason to
believe that its distribution is given by some p
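The notes break off here. As a concrete illustration of the idea (a sketch of my own, not an example from the notes), suppose the unknown parameter is the success probability θ of a coin and the prior on θ is a Beta distribution; the posterior after binomial data is then again Beta:

```python
# Illustrative sketch (not from the notes): Bayesian updating with a
# conjugate Beta prior on a binomial success probability theta.
# Prior: theta ~ Beta(a, b).  Data: k successes in n trials.
# Posterior: theta | data ~ Beta(a + k, b + n - k).

def beta_binomial_posterior(a, b, k, n):
    """Return the parameters (a', b') of the posterior Beta distribution."""
    return a + k, b + (n - k)

# Uniform prior Beta(1, 1); observe 7 heads in 10 tosses.
a_post, b_post = beta_binomial_posterior(1, 1, 7, 10)

# The posterior mean is a common Bayes estimate of theta.
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, posterior_mean)  # 8 4 0.666...
```

The conjugacy here is what makes the update a one-line formula; for non-conjugate priors the posterior must in general be computed numerically.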

Lecture 6.
We have the identity

    D^2 e^g = e^g D^2 g + e^g [Dg]^2

Taking g = log f(θ, x) and differentiating f = e^g twice with respect to θ we have

    ∂^2 f(θ, x)/∂θ^2 = ∂^2 exp[log f(θ, x)]/∂θ^2
                     = [∂^2 log f(θ, x)/∂θ^2] f(θ, x) + [∂ log f(θ, x)/∂θ]^2 f(θ, x)

On the other hand if we differe
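As a quick numerical sanity check of the identity above (an added sketch; the normal density with mean θ and variance 1 is an assumed example, not the notes' choice), central finite differences confirm that the two sides agree:

```python
# Check numerically that
#   d^2 f/dtheta^2 = [d^2 log f/dtheta^2] f + [d log f/dtheta]^2 f
# for f(theta, x) = normal density with mean theta, variance 1.
import math

def f(theta, x):
    return math.exp(-(x - theta) ** 2 / 2) / math.sqrt(2 * math.pi)

def d1(g, t, h=1e-5):
    # central first difference
    return (g(t + h) - g(t - h)) / (2 * h)

def d2(g, t, h=1e-4):
    # central second difference
    return (g(t + h) - 2 * g(t) + g(t - h)) / h ** 2

theta, x = 0.3, 1.2
logf = lambda t: math.log(f(t, x))

lhs = d2(lambda t: f(t, x), theta)
rhs = (d2(logf, theta) + d1(logf, theta) ** 2) * f(theta, x)
print(lhs, rhs)  # the two sides agree to numerical precision
```

Taking expectations of the same identity is what yields the two equivalent forms of the Fisher information.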

Lecture 4.
Data is information. About what? About the underlying distribution that
the data is sampled from. We can always create data by simulation if we
know the underlying distribution exactly, in which case the data has no
value. In that case any lost

Lecture 5.
Sometimes we want to estimate a function f(θ) of θ rather than θ itself. If
f is a smooth function and t_n(x_1, . . . , x_n) is an estimate of θ with

    E[(t_n − θ)^2] ≈ v(θ)/n

by Taylor expansion we saw that f(t_n) − f(θ) ≈ f′(θ)(t_n − θ), and we expect

    E[(f(t_n) − f(θ))^2] ≈ [f′(θ)]^2 v(θ)/n
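This approximation can be checked by simulation. In the sketch below (my own example, not the notes'), t_n is the mean of n Bernoulli(θ) observations, so v(θ) = θ(1 − θ), and f(θ) = θ^2, so [f′(θ)]^2 v(θ)/n = 4θ^2 · θ(1 − θ)/n:

```python
# Monte Carlo check of the delta-method variance approximation
# for t_n = mean of n Bernoulli(theta) draws and f(theta) = theta^2.
import random

random.seed(0)
theta, n, reps = 0.4, 200, 20000

vals = []
for _ in range(reps):
    # t_n: sample mean of n Bernoulli(theta) observations
    t_n = sum(random.random() < theta for _ in range(n)) / n
    vals.append(t_n ** 2)  # f(t_n)

mean = sum(vals) / reps
mc_var = sum((v - mean) ** 2 for v in vals) / reps

predicted = (2 * theta) ** 2 * theta * (1 - theta) / n
print(mc_var, predicted)  # should agree within a few percent
```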

Lectures 2 and 3
Covariance between two random variables X and Y:

    Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y]

If X = F(x) and Y = G(x), then

    Cov(X, Y) = Σ_x F(x)G(x)p(x) − [Σ_x F(x)p(x)] [Σ_x G(x)p(x)]

In particular

    Var(Σ_i X_i) = Σ_i Var(X_i) + Σ_{i≠j} Cov(X_i, X_j)
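The variance-of-a-sum identity can be verified directly on a toy data set (an added sketch; the numbers below are made up for illustration):

```python
# Verify Var(sum_i X_i) = sum_i Var(X_i) + sum_{i != j} Cov(X_i, X_j)
# using the empirical (population-style, divide-by-n) moments of a
# small made-up data set of three jointly observed variables.

data = [  # five joint observations of (X_1, X_2, X_3)
    (1.0, 2.0, 0.5),
    (2.0, 1.0, 1.5),
    (3.0, 4.0, 2.5),
    (0.0, 1.0, 0.0),
    (4.0, 2.0, 3.0),
]
n, k = len(data), len(data[0])
cols = list(zip(*data))
means = [sum(c) / n for c in cols]

def cov(i, j):
    # empirical covariance of variables i and j (cov(i, i) is the variance)
    return sum((a - means[i]) * (b - means[j])
               for a, b in zip(cols[i], cols[j])) / n

# Left-hand side: variance of the row sums.
totals = [sum(row) for row in data]
m = sum(totals) / n
var_sum = sum((t - m) ** 2 for t in totals) / n

# Right-hand side: sum of variances plus all off-diagonal covariances.
rhs = (sum(cov(i, i) for i in range(k))
       + sum(cov(i, j) for i in range(k) for j in range(k) if i != j))
print(var_sum, rhs)  # identical up to floating-point rounding
```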

Assignment 6.
Due Oct 30.
1. We have a multinomial distribution with 4 possibilities 1, 2, 3, 4 with
probabilities {p_i(θ), i = 1, 2, 3, 4} given by

    p_1(θ) = θ^2/3,  p_2(θ) = 2θ(1 − θ)/3,  p_3(θ) = (1 − θ)^2/3,  p_4 = 2/3

that depend on a parameter θ between 0 and 1. We have

1. Review of Probability.
What is probability? Perform an experiment. The result is not predictable. One of finitely many possibilities R_1, R_2, . . . , R_k can occur. Some
are perhaps more likely than others. We assign nonnegative numbers p_i =
P[R_i] such that Σ_i p_i = 1.
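A minimal simulated illustration of such an assignment (added here; the fair die and the numbers are my own choice, not the notes'): give each face of a die probability p_i = 1/6 and check that the observed relative frequencies approach these p_i.

```python
# Assign p_i = 1/6 to each of the six faces of a fair die and check that
# simulated relative frequencies come close to these probabilities.
import random

random.seed(1)
n = 60000
counts = [0] * 6
for _ in range(n):
    counts[random.randrange(6)] += 1  # roll one fair die

freqs = [c / n for c in counts]
print(freqs)       # each entry close to 1/6 = 0.1666...
print(sum(freqs))  # the p_i (and hence the frequencies) sum to 1
```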

Assignment 5.
Due Oct 23.
1. If X_1, X_2, . . . , X_n are n independent observations from the uniform
distribution on [α, β] where α < β, i.e. the common density of {X_i} is given by

    f(α, β, x) = 1/(β − α);  α ≤ x ≤ β

what are the Maximum Likelihood Estimators for α and β?
2. If

Assignment 4.
Due Oct 16.
1. If {X_i} are independent observations from the uniform distribution on
[−θ, θ], i.e. with density given by

    f(θ, x) = 1/(2θ);  −θ ≤ x ≤ θ

Is there a sufficient statistic? What is it? Why is it sufficient?
2. For the uniform distribution on [0, θ)

Assignment 3 (revised).
Due Oct 2.
We have a coin that has probability p of coming up heads and q = 1 − p
of coming up tails when tossed. It is tossed n times, the number of
times X that heads appeared is counted, and t = X/n is offered as an
unbiased estimator

Assignment 2. Due Sept 25.
We have a population of size N which is very large and can be considered
to be infinite. It consists of two groups of equal size. An opinion survey is to
be conducted that results in a simple yes or no answer and the proportion
of