
# Lecture 19: Robust & Quantile Regression (Stat 704)


Lecture 19: Robust & Quantile Regression
Stat 704: Data Analysis I, Fall 2010
Tim Hanson, Ph.D., University of South Carolina

### 11.3: Robust regression

* Leverages h_ii and deleted residuals t_i are useful for finding cases with outlying x_i and Y_i (with respect to the model).
* Cook's D_i and DFFITS_i indicate which cases are highly influencing the fit of the model, i.e. the OLS estimate b.
* What should be done with influential and/or outlying cases? Are they transcription errors, or somehow unrepresentative of the target population?
* Outliers are often interesting in their own right and can help guide the building of a better model.
* Robust regression dampens the effect of outlying cases on estimation, providing a better fit to the majority of cases.
* It is useful in situations where there is no time for influence diagnostics or a more careful analysis.
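As a concrete illustration (not part of the slides), the leverages h_ii and Cook's distances D_i can be computed directly from the hat matrix. This NumPy sketch assumes a design matrix that includes an intercept column, and uses the standard formula D_i = e_i² h_ii / (p · MSE · (1 − h_ii)²); the function name and toy data are illustrative.

```python
import numpy as np

def influence_diagnostics(X, y):
    """Leverages h_ii and Cook's distances D_i for an OLS fit.

    X is the n x p design matrix (including an intercept column);
    y is the n-vector of responses.
    """
    n, p = X.shape
    H = X @ np.linalg.solve(X.T @ X, X.T)   # hat matrix
    h = np.diag(H)                          # leverages h_ii
    b = np.linalg.solve(X.T @ X, X.T @ y)   # OLS estimate b
    e = y - X @ b                           # ordinary residuals
    mse = e @ e / (n - p)
    # Cook's D_i: how much the whole fit shifts when case i is deleted
    D = e**2 / (p * mse) * h / (1 - h)**2
    return h, D

# Toy data with one contaminated case at the end
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
X = np.column_stack([np.ones(20), x])
y = 2 + 3 * x + rng.normal(scale=0.1, size=20)
y[-1] += 5                                  # shift the last response
h, D = influence_diagnostics(X, y)
print(D.argmax())                           # the contaminated case dominates
```

Note that the leverages always sum to p (the trace of the hat matrix), a quick sanity check on the computation.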
* Robust regression is effective when the error distribution is not normal but heavy-tailed.
* M-estimation is a general class of estimation methods: choose β to minimize

  Q(β) = Σ_{i=1}^{n} ρ(Y_i − x_i′β),

  where ρ(·) is some function.
* ρ(u) = u² gives the OLS estimate b.
* ρ(u) = |u| gives L1 regression, also called least absolute residual (LAR) regression.
* Huber's method, described next, builds a ρ(·) that is a compromise between OLS and LAR.
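To make the three choices of ρ(·) concrete, here is a small NumPy sketch. The Huber cutoff c = 1.345 (the conventional choice, giving roughly 95% efficiency under normal errors) is an assumption not stated on the slide:

```python
import numpy as np

def rho_ols(u):
    """OLS: quadratic loss, so large residuals dominate the fit."""
    return u**2

def rho_lar(u):
    """LAR / L1: absolute loss, so large residuals count only linearly."""
    return np.abs(u)

def rho_huber(u, c=1.345):
    """Huber: quadratic near zero (like OLS), linear in the tails (like LAR).

    The cutoff c = 1.345 is a conventional default, not from the slide.
    """
    return np.where(np.abs(u) <= c,
                    0.5 * u**2,
                    c * (np.abs(u) - 0.5 * c))

u = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
print(rho_huber(u))   # large |u| grows linearly, not quadratically
```

The two branches match in value and slope at |u| = c, which is what makes the compromise smooth.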

### Iteratively Reweighted Least Squares

Outlying values of the residuals r_i^(j) = Y_i − x_i′b^(j) are iteratively given less weight in the estimation process.
