identically distributed random variables,
with mean µ and variance σ 2 . It might be that the mean and variance are unknown, and that the
distribution is not even known to be a particular type, so maximum likelihood estimation is not
appropriate. In this case it is reasonable to estimate µ and σ 2 by the sample mean and sample
variance defined as follows:

$$\overline{X} = \frac{1}{n}\sum_{k=1}^{n} X_k, \qquad \widehat{\sigma}^2 = \frac{1}{n-1}\sum_{k=1}^{n}\left(X_k - \overline{X}\right)^2.$$

156 CHAPTER 4. JOINTLY DISTRIBUTED RANDOM VARIABLES

Note the perhaps unexpected appearance of n − 1 in the sample variance. Of course, we should
have n ≥ 2 to estimate the variance (assuming we don't know the mean), so it is not surprising that the formula is not defined if n = 1. An estimator is called unbiased if the mean of the estimator is equal to the parameter that is being estimated.

(a) Is the sample mean an unbiased estimator of µ?
(b) Find the mean square error, $E[(\mu - \overline{X})^2]$, for estimation of the mean by the sample mean.
(c) Is the sample variance an unbiased estimator of σ²?
Solution. (a) By the linearity of expectation, $E[\overline{X}] = \frac{1}{n}\sum_{k=1}^{n} E[X_k] = \frac{1}{n}(n\mu) = \mu$, so the sample mean is an unbiased estimator of µ.
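The unbiasedness claims can also be checked numerically by averaging the two estimators over many independent samples. A minimal sketch (the choice of distribution, Normal(5, 2²), and the sample and trial sizes are arbitrary, not from the text):

```python
import random

def sample_mean(xs):
    """Sample mean: (1/n) * sum of the observations."""
    return sum(xs) / len(xs)

def sample_variance(xs):
    """Sample variance with the n - 1 divisor; requires n >= 2."""
    n = len(xs)
    m = sample_mean(xs)
    return sum((x - m) ** 2 for x in xs) / (n - 1)

# Monte Carlo check: draw many samples of size n from a distribution
# with known mean and variance, and average each estimator.
random.seed(0)
mu, sigma2 = 5.0, 4.0           # true mean and variance (assumed for the demo)
trials, n = 20000, 10
mean_est = var_est = 0.0
for _ in range(trials):
    xs = [random.gauss(mu, sigma2 ** 0.5) for _ in range(n)]
    mean_est += sample_mean(xs)
    var_est += sample_variance(xs)

print(mean_est / trials)        # close to 5.0
print(var_est / trials)         # close to 4.0
```

Replacing the n − 1 divisor by n in `sample_variance` would shift the second average down by a factor of (n − 1)/n, which is the bias the n − 1 divisor removes.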
This note was uploaded on 02/09/2014 for the course ISYE 2027 taught by Professor Zahrn during the Spring '08 term at Georgia Institute of Technology.