# Lecture #2 Learning Objectives


1. Understand properties which help define a “good” estimator.
2. Know how to find a point estimate.
3. Know how to construct and interpret an interval estimate (confidence interval) of a population mean.
4. Understand the concept of margin of error (sampling error).
5. Be able to use knowledge of a sampling distribution to make probability statements about the margin of error.
6. Be able to determine the size of a simple random sample necessary to estimate a population mean with a specified level of confidence and precision.
7. Know the definition of the following terms: Unbiasedness, Consistency, Efficiency, Sufficiency, Robustness, Resistance, Point Estimator, Point Estimate, Confidence Interval, Precision, Confidence Coefficient, Sampling Error, Confidence Level, Margin of Error.

> “I could prove God statistically.” - George Gallup (1901 - 1984)

BM330 – Lecture #2

## Definitions

A **Point Estimator** is a statistic that provides a point estimate, such as

    X̄ = (Σ Xᵢ) / n        S² = Σ (Xᵢ - X̄)² / (n - 1)

A **Point Estimate** is a single number calculated from sample data and used to estimate a parameter, such as x̄ = 3 or s² = 2.5.

**Example:** A nightclub is reviewing revenues for weekday nights. Management has determined that, to profit, it must have at least 50 patrons on a weekday night. The following sample of the number of patrons on weekday nights is obtained:

    48  50  50  46  30  60  52  49  47

1. What is the value of the point estimate for the population mean number of patrons?
2. What is the value of the point estimate for the population standard deviation of the number of patrons?
3. What is the value of the point estimate for the population proportion of nights with at least 50 patrons?
4. How might you calculate a point estimate of the standard error of X̄?
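The four questions above can be answered directly from the sample; a minimal sketch using only the standard library (variable names are my own):

```python
import statistics

# Weekday-night patron counts from the lecture example.
patrons = [48, 50, 50, 46, 30, 60, 52, 49, 47]
n = len(patrons)

# (1) Point estimate of the population mean: the sample mean x-bar.
x_bar = statistics.mean(patrons)

# (2) Point estimate of the population standard deviation: the sample
#     standard deviation s (uses n - 1 in the denominator).
s = statistics.stdev(patrons)

# (3) Point estimate of the population proportion of nights with at
#     least 50 patrons: the sample proportion p-hat.
p_hat = sum(1 for x in patrons if x >= 50) / n

# (4) Estimated standard error of x-bar: s / sqrt(n).
se_x_bar = s / n ** 0.5

print(x_bar)                  # → 48
print(round(s, 2))            # → 7.89
print(round(p_hat, 3))        # → 0.444  (4 of 9 nights)
print(round(se_x_bar, 2))     # → 2.63
```

Note that question (2) asks for a *point estimate* of σ, so the sample standard deviation s is used, and question (4) then plugs that same s into s/√n.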
## Properties of Statistics / Point Estimators

A large number of statistics can be computed from a given sample. Why do we select one statistic over another when we want to estimate an unknown population parameter? First and foremost, we choose estimators that provide relevant information. For example, what statistic might be most relevant when designing a retaining pond, or when summarizing salary information? Beyond relevance, we prefer estimators with the following desirable properties.

1. **Unbiasedness.** This criterion requires that the expected value (mean) of the estimator's sampling distribution equal the population parameter being estimated: E(θ̂) = θ. Is X̄ an unbiased estimator of μ?

   [Figure: sampling distribution of X̄, centered at μ]

   This makes intuitive sense: on average, an unbiased statistic should yield values close to the unknown parameter you are attempting to estimate.
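The claim E(X̄) = μ can be checked with a quick simulation; the population parameters below (μ = 50, σ = 8, n = 9) are illustrative assumptions, not values from the lecture:

```python
import random
import statistics

random.seed(1)
mu, sigma, n, trials = 50.0, 8.0, 9, 20_000  # assumed for illustration

# Draw many samples of size n and record each sample mean, building an
# empirical picture of the sampling distribution of X-bar.
sample_means = [
    statistics.mean(random.gauss(mu, sigma) for _ in range(n))
    for _ in range(trials)
]

# If X-bar is unbiased, the mean of the sampling distribution is mu.
grand_mean = statistics.mean(sample_means)
print(round(grand_mean, 1))
```

With 20,000 replications the average of the sample means lands within a small fraction of a patron of μ, which is exactly what unbiasedness predicts.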

2. **Efficiency (minimum variance).** The variance of an estimator should be as small as possible; among unbiased estimators, the one with the smallest sampling variance is preferred.
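As a sketch of relative efficiency (the normal population and sample size below are assumed for illustration), we can compare the empirical sampling variances of two estimators of μ, the sample mean and the sample median, both of which are unbiased for a symmetric population:

```python
import random
import statistics

random.seed(2)
mu, sigma, n, trials = 50.0, 8.0, 25, 10_000  # assumed for illustration

# Collect the sampling distributions of the mean and the median.
means, medians = [], []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(statistics.mean(sample))
    medians.append(statistics.median(sample))

var_mean = statistics.pvariance(means)
var_median = statistics.pvariance(medians)

# The mean has the smaller variance, so it is the more efficient
# estimator here; for normal data the ratio approaches pi/2 ≈ 1.57.
print(var_median > var_mean)
print(round(var_median / var_mean, 2))
```

Both estimators center on μ, but the median's sampling distribution is visibly wider, which is why the sample mean is the usual choice for normal data.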

## This note was uploaded on 02/22/2009 for the course BUSMGT 330m taught by Professor Kriska during the Spring '08 term at Ohio State.
