Econ 325 – Chapter 7.1 Properties of Point Estimators

Let the random sample $X_1, X_2, \ldots, X_n$ be a set of random variables that are independently and identically distributed. Population characteristics are summarized by parameters, whose true values are typically unknown. For example, the population mean is denoted by $\mu$.

An estimation rule can be specified for a parameter of interest. This estimation rule is called a point estimator. For example, a point estimator for the population mean $\mu$ is:

$$\bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i$$

An estimator is a random variable that is a function of the sample information. An estimator has a probability distribution, called the sampling distribution.

An applied study works with a data set. The numeric observations are $x_1, x_2, \ldots, x_n$. The estimation rule given by the estimator $\bar{X}$ can be used to calculate a point estimate of the population mean $\mu$:

$$\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i$$

An important distinction is made between an estimator and an estimate. A point estimator is a random variable; a point estimate is a numeric outcome. Different samples of data will have different numeric observations and, therefore, will result in different point estimates of the population parameter.
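To make the estimator/estimate distinction concrete, the following Python sketch (not part of the original notes; the normal population with mean 5, standard deviation 2, and sample size 25 are illustrative assumptions) applies the same estimation rule to several different samples and obtains a different point estimate each time.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

mu, sigma, n = 5.0, 2.0, 25   # assumed population mean, std. dev., and sample size

def sample_mean(x):
    """The estimation rule (the estimator): x_bar = (1/n) * sum of x_i."""
    return np.sum(x) / len(x)

# The same estimator applied to different samples gives different point estimates.
for s in range(3):
    x = rng.normal(mu, sigma, size=n)   # one sample of numeric observations
    print(f"sample {s + 1}: point estimate of mu = {sample_mean(x):.3f}")
```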
Properties of Point Estimators

Denote $\theta$ (the Greek letter theta) as a population parameter to be estimated (as a special case this may be the population mean $\mu$). Let $\hat{\theta}$ (theta-hat) be a point estimator of $\theta$. $\hat{\theta}$ is a function of the sample information:

$$\hat{\theta} = f(X_1, X_2, \ldots, X_n)$$

This estimator is a random variable with a sampling distribution.

$\hat{\theta}$ is said to be an unbiased estimator of $\theta$ if:

$$E(\hat{\theta}) = \theta$$

The bias of an estimator $\hat{\theta}$ is defined as:

$$\text{Bias}(\hat{\theta}) = E(\hat{\theta}) - \theta$$

It follows that the bias of an unbiased estimator is zero.

Example: Let $X_1, X_2, X_3$ be a random sample from a population with mean $\mu$. That is,

$$E(X_1) = E(X_2) = E(X_3) = \mu$$

Consider two alternative point estimators of $\mu$:

$$\bar{X} = \frac{1}{3}(X_1 + X_2 + X_3) \qquad \text{and} \qquad \bar{X}_W = \frac{1}{6}(X_1 + 4X_2 + X_3)$$

The second estimator is a weighted average of the sample information. To compare these estimators consider:

$$E(\bar{X}) = \frac{1}{3}[E(X_1) + E(X_2) + E(X_3)] = \frac{1}{3}(3\mu) = \mu$$

and

$$E(\bar{X}_W) = \frac{1}{6}[E(X_1) + 4E(X_2) + E(X_3)] = \frac{1}{6}(\mu + 4\mu + \mu) = \mu$$

Therefore, both estimators are unbiased estimators of the population mean $\mu$.
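A small Monte Carlo check can illustrate the unbiasedness result above. This sketch is not part of the course notes; the Uniform(0, 1) population (which has mean $\mu = 0.5$) and the number of replications are arbitrary assumptions. Averaging each estimator over many samples of size three should give values close to $\mu$.

```python
import numpy as np

rng = np.random.default_rng(seed=2)

mu = 0.5                      # true mean of a Uniform(0, 1) population
reps = 100_000

# Each row is one random sample (X1, X2, X3).
samples = rng.uniform(0.0, 1.0, size=(reps, 3))

x_bar = samples.mean(axis=1)                                        # (X1 + X2 + X3) / 3
x_bar_w = (samples[:, 0] + 4 * samples[:, 1] + samples[:, 2]) / 6   # weighted average

# Both Monte Carlo averages should be close to mu = 0.5.
print("average of X_bar   over samples:", x_bar.mean())
print("average of X_bar_W over samples:", x_bar_w.mean())
```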
The above example demonstrated that there may be several unbiased estimators of a population parameter of interest. A problem that arises is: how can an estimator be selected from among a number of competing unbiased estimators? A suggestion is to choose the estimator with minimum variance.

Let $\hat{\theta}_1$ and $\hat{\theta}_2$ be two unbiased estimators of the population parameter $\theta$. That is,

$$E(\hat{\theta}_1) = \theta \qquad \text{and} \qquad E(\hat{\theta}_2) = \theta$$

$\hat{\theta}_1$ is said to be more efficient than $\hat{\theta}_2$ if:

$$\text{Var}(\hat{\theta}_1) < \text{Var}(\hat{\theta}_2)$$

The relative efficiency of $\hat{\theta}_1$ with respect to $\hat{\theta}_2$ is the variance ratio:

$$\frac{\text{Var}(\hat{\theta}_2)}{\text{Var}(\hat{\theta}_1)}$$

If $\hat{\theta}$ is an unbiased estimator of $\theta$, and no other unbiased estimator has smaller variance than $\hat{\theta}$, then $\hat{\theta}$ is the minimum variance unbiased estimator (the most efficient estimator) of $\theta$.
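For the two estimators in the earlier example, independence of the observations gives $\text{Var}(\bar{X}) = \sigma^2/3$ and $\text{Var}(\bar{X}_W) = (1 + 16 + 1)\sigma^2/36 = \sigma^2/2$, so $\bar{X}$ is more efficient, with relative efficiency $1.5$. The Python sketch below (again an illustration, assuming a normal population with $\sigma = 1$) estimates both variances by simulation and computes the ratio.

```python
import numpy as np

rng = np.random.default_rng(seed=3)

sigma = 1.0                   # assumed population standard deviation
reps = 100_000

# Each row is one random sample (X1, X2, X3) from a population with sd = sigma.
samples = rng.normal(0.0, sigma, size=(reps, 3))

x_bar = samples.mean(axis=1)
x_bar_w = (samples[:, 0] + 4 * samples[:, 1] + samples[:, 2]) / 6

var_x_bar = x_bar.var()       # should be close to sigma**2 / 3
var_x_bar_w = x_bar_w.var()   # should be close to sigma**2 / 2

print("Var(X_bar)   approx:", var_x_bar)
print("Var(X_bar_W) approx:", var_x_bar_w)
print("relative efficiency Var(X_bar_W) / Var(X_bar) approx:", var_x_bar_w / var_x_bar)
```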