STAT511, Summer 2009 Lecture Notes
Chapter 6: Point Estimation
July 16, 2009 (W. Sharabati, Purdue University)

6.1 Point Estimation

Chapter Overview
- Point estimate and point estimator
  - Define point estimate and point estimator
  - Unbiased estimators
  - Minimum variance estimators
- Finding the standard error of an estimator
  - Deriving it directly
  - Bootstrapping
- Methods of point estimation
  - The method of moments
  - The method of maximum likelihood
- Estimating functions of parameters

Point Estimate and Point Estimator

Example 6.1.1 (not in the textbook). Suppose we want to find the proportion $p$ of FIV-infected feral cats in a specific area. It is impossible to check every feral cat in the area, so instead we can do the following:
1. Let $X$ be the random variable with $X = 1$ if a cat has FIV and $X = 0$ if not.
2. What is the distribution of $X$? Bernoulli, with $p$ unknown.
3. The question now becomes: how do we estimate the value of the parameter $p$ of this Bernoulli distribution?

RV $\rightarrow$ Distribution $\rightarrow$ Parameter of interest

Example 6.1.1, continued. To estimate the value of $p$:
1. We catch 25 feral cats at random and check them. This gives a random sample $X_1, X_2, \ldots, X_{25}$ of size $n = 25$ from the Bernoulli distribution.
2. We use the statistic $\hat{p} = \dfrac{X_1 + \cdots + X_{25}}{25}$. This statistic is a random variable; its value can be used to estimate $p$, and the statistic is called a point estimator of $p$. (A simulation sketch of this estimator appears below.)
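The following is a minimal Python sketch, not part of the original notes, of the sample-proportion estimator in Example 6.1.1. The value true_p = 0.3 is a made-up infection rate used only to simulate a sample of 25 cats; in practice $p$ is the unknown quantity being estimated.

```python
import random

random.seed(1)

true_p = 0.3   # hypothetical true infection rate, used only to simulate data
n = 25         # sample size, as in Example 6.1.1

# Draw X_1, ..., X_25 from a Bernoulli(true_p) distribution:
# X_i = 1 if cat i has FIV, 0 otherwise.
sample = [1 if random.random() < true_p else 0 for _ in range(n)]

# The statistic p_hat = (X_1 + ... + X_25) / 25 is the point estimator
# (a random variable); the number it yields for this particular sample
# is a point estimate of p.
p_hat = sum(sample) / n
print("sample:", sample)
print("point estimate p_hat =", p_hat)
```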
Example 6.1.1, concluded. Suppose we find that cats number 1, 5, 10, 15, and 23 are infected with FIV. Then $\hat{p} = \dfrac{5}{25} = 0.2$. This actual value $0.2$ of $\hat{p}$ is called the point estimate of $p$.

Random sample $\rightarrow$ Estimator $\rightarrow$ Estimate

Definitions and Differences

Definition 1. A point estimate of a parameter $\theta$ is a single number that can be regarded as a sensible value for $\theta$. A point estimate is obtained by selecting a suitable statistic and computing its value from the given sample data. The selected statistic is called the point estimator.

- The difference between a point estimate and a point estimator: a point estimate is a value, while a point estimator is a statistic.
- We usually write $\hat{\theta}$ for the point estimator of a parameter $\theta$.
- Different statistics can be used to estimate the same parameter, i.e., a parameter may have multiple point estimators.

Example 6.1.2 (Example 6.2 in the textbook). Assume the dielectric breakdown voltage for pieces of epoxy resin is normally distributed, and suppose we want to estimate the mean $\mu$ of the breakdown voltage. We measure 20 breakdown voltages at random and denote them $X_1, X_2, \ldots, X_{20}$; this is a random sample of size 20 from the normal distribution of interest. Suppose the observed voltage values are

{24.46, 25.61, 26.25, 26.42, 26.66, 27.15, 27.31, 27.54, 27.74, 27.94, 27.98, 28.04, 28.28, 28.49, 28.50, 28.87, 29.11, 29.13, 29.50, 30.88}.

Which point estimators could be used to estimate $\mu$? For instance (each is computed in the sketch below):
1. The sample mean: $\hat{\mu} = \bar{X}$.
2. The sample median: $\hat{\mu} = \tilde{X}$.
3. The average of the extremes: $\hat{\mu} = \dfrac{\min(X_i) + \max(X_i)}{2}$.
4. A trimmed mean, say the 10% trimmed mean (discard the smallest 10% and the largest 10% of the sample data, then average the rest): $\hat{\mu} = \bar{X}_{\mathrm{tr}(10)}$.
5. And so on.
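As a rough illustration (Python is not part of the original notes), the sketch below computes each of these candidate point estimates from the 20 observed voltages listed above, using only the standard library.

```python
import statistics

# The 20 observed breakdown voltages from Example 6.1.2.
voltages = [24.46, 25.61, 26.25, 26.42, 26.66, 27.15, 27.31, 27.54,
            27.74, 27.94, 27.98, 28.04, 28.28, 28.49, 28.50, 28.87,
            29.11, 29.13, 29.50, 30.88]

x = sorted(voltages)
n = len(x)

sample_mean   = statistics.mean(x)            # 1. x-bar
sample_median = statistics.median(x)          # 2. x-tilde
midrange      = (x[0] + x[-1]) / 2            # 3. average of the extremes
k = int(round(0.10 * n))                      # trim 10% from each end (k = 2 values)
trimmed_mean  = statistics.mean(x[k:n - k])   # 4. 10% trimmed mean

print(f"sample mean      = {sample_mean:.3f}")    # 27.793
print(f"sample median    = {sample_median:.3f}")  # 27.960
print(f"midrange         = {midrange:.3f}")       # 27.670
print(f"10% trimmed mean = {trimmed_mean:.3f}")   # 27.838
```

Each of these statistics is a legitimate point estimator of $\mu$; they simply give different point estimates for the same data.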