# Lecture 24: Generalized Additive Models (Stat 704)


Stat 704: Data Analysis I, Fall 2010. Tim Hanson, Ph.D., University of South Carolina.

## Generalized additive models: additive predictors

Consider a linear regression problem:
$$Y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \epsilon_i,$$
where $\epsilon_1, \dots, \epsilon_n \stackrel{iid}{\sim} N(0, \sigma^2)$.

* Diagnostics (residual plots, added variable plots) might indicate poor fit of the basic model above.
* Remedial measures might include transforming the response, transforming one or both predictors, or both.
* One might also consider adding quadratic terms and/or an interaction term.
* Note: we only consider transforming continuous predictors!
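As a quick numerical sketch of the remedial measures above, the NumPy snippet below (simulated data, not from the course) compares the basic linear model to one augmented with a quadratic and an interaction column; both are fit by ordinary least squares. The coefficient values are arbitrary choices for illustration.

```python
import numpy as np

# Hypothetical data: the true surface includes a quadratic term in x1
# and an x1*x2 interaction, so the plain linear model fits poorly.
rng = np.random.default_rng(0)
n = 200
x1 = rng.uniform(0, 2, n)
x2 = rng.uniform(0, 2, n)
y = 1.0 + 2.0 * x1 + 0.5 * x2 + 1.5 * x1**2 - 1.0 * x1 * x2 + rng.normal(0, 0.1, n)

# Basic model: intercept + x1 + x2
X_basic = np.column_stack([np.ones(n), x1, x2])
beta_b, rss_b, *_ = np.linalg.lstsq(X_basic, y, rcond=None)

# Remedial model: add quadratic and interaction columns to the design matrix
X_full = np.column_stack([np.ones(n), x1, x2, x1**2, x1 * x2])
beta_f, rss_f, *_ = np.linalg.lstsq(X_full, y, rcond=None)

# The expanded model should leave a much smaller residual sum of squares
print(rss_b[0], rss_f[0])
```

In practice the decision to add such terms would be driven by the residual and added variable plots mentioned above, not by knowledge of the true surface.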
When considering a transformation of one predictor, an added variable plot can suggest a transformation (e.g. $\log(x)$, $1/x$) that might work if the other predictor is "correctly" specified. In general, a transformation is given by a function $x^* = g(x)$.

Say we decide that $x_{i1}$ should be log-transformed and the reciprocal of $x_{i2}$ should be used. Then the resulting model is
$$Y_i = \beta_0 + \beta_1 \log(x_{i1}) + \beta_2 / x_{i2} + \epsilon_i = \beta_0 + g_{\beta_1}(x_{i1}) + g_{\beta_2}(x_{i2}) + \epsilon_i,$$
where $g_{\beta_1}(x)$ and $g_{\beta_2}(x)$ are two functions specified by $\beta_1$ and $\beta_2$.
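The transformed model remains linear in the coefficients, so it can be fit by ordinary least squares once the transformed columns are built. A minimal NumPy sketch on simulated data, with hypothetical true coefficients $(2.0, 1.5, 3.0)$:

```python
import numpy as np

# Simulated data for the model Y = b0 + b1*log(x1) + b2/x2 + error
rng = np.random.default_rng(1)
n = 300
x1 = rng.uniform(1, 10, n)
x2 = rng.uniform(1, 10, n)
y = 2.0 + 1.5 * np.log(x1) + 3.0 / x2 + rng.normal(0, 0.05, n)

# The transformations enter only through the design matrix;
# the model is still linear in (b0, b1, b2).
X = np.column_stack([np.ones(n), np.log(x1), 1.0 / x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # estimates of (b0, b1, b2)
```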

Here we are specifying forms for $g_1(x \mid \beta_1)$ and $g_2(x \mid \beta_2)$ based on exploratory data analysis, but we could from the outset specify models for $g_1(x \mid \theta_1)$ and $g_2(x \mid \theta_2)$ that are rich enough to capture interesting and predictively useful aspects of how the predictors affect the response, and estimate these functions from the data.

One example of this is a basis expansion; for the $j$th predictor the transformation is
$$g_j(x) = \sum_{k=1}^{K_j} \theta_{jk} \psi_{jk}(x),$$
where $\{\psi_{jk}(\cdot)\}_{k=1}^{K_j}$ are B-spline basis functions, sines/cosines, etc. This approach has gained more favor among Bayesians, but is not the approach taken in SAS PROC GAM. PROC GAM makes use of cubic smoothing splines. This is an example of "nonparametric regression," which ironically connotes the inclusion of lots of parameters rather than fewer.
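The basis-expansion idea can be sketched directly: build a matrix whose columns are the basis functions $\psi_{jk}(x)$ evaluated at the data, then estimate the coefficients $\theta_{jk}$ by least squares. The snippet below uses a sine/cosine basis (one of the choices mentioned above) on simulated data; the basis size $K = 4$ is an arbitrary illustration, not a recommendation.

```python
import numpy as np

def fourier_basis(x, K):
    """Basis columns psi_k(x): pairs sin(k*pi*x), cos(k*pi*x) for k = 1..K."""
    cols = []
    for k in range(1, K + 1):
        cols.append(np.sin(k * np.pi * x))
        cols.append(np.cos(k * np.pi * x))
    return np.column_stack(cols)

# Simulated data from a smooth but unknown-to-the-analyst function g
rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 1, 200))
g_true = np.sin(2 * np.pi * x) + 0.5 * np.cos(np.pi * x)
y = g_true + rng.normal(0, 0.1, 200)

# Design matrix of basis functions (plus an intercept column),
# then least-squares estimation of the theta coefficients
Psi = np.column_stack([np.ones_like(x), fourier_basis(x, K=4)])
theta, *_ = np.linalg.lstsq(Psi, y, rcond=None)
g_hat = Psi @ theta
rmse = np.sqrt(np.mean((g_hat - g_true) ** 2))
print(rmse)  # fitted curve is close to the true g
```

With a B-spline basis the fitting step is identical; only the columns of `Psi` change.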
For simple regression data $\{(x_i, y_i)\}_{i=1}^n$, a cubic spline smoother $g(x)$ minimizes
$$\sum_{i=1}^n (y_i - g(x_i))^2 + \lambda \int_{-\infty}^{\infty} g''(x)^2 \, dx.$$
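PROC GAM is the SAS route to this fit; outside SAS, an analogous cubic smoothing spline is available in SciPy. A sketch on simulated data, noting that `UnivariateSpline` controls smoothness through a residual bound `s` (the fit satisfies $\sum_i (y_i - g(x_i))^2 \le s$) rather than through the penalty weight $\lambda$ directly:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Simulated noisy observations of a smooth curve
rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 1, 150))
y_true = np.sin(2 * np.pi * x)
y = y_true + rng.normal(0, 0.2, 150)

# Cubic (k=3) smoothing spline; a common heuristic sets the residual
# bound s near n * sigma^2 when the noise level is roughly known
spl = UnivariateSpline(x, y, k=3, s=150 * 0.2**2)
y_hat = spl(x)
rmse = np.sqrt(np.mean((y_hat - y_true) ** 2))
print(rmse)  # well below the noise level 0.2
```

Larger `s` (like larger $\lambda$) yields a smoother, flatter fit; `s = 0` interpolates the data exactly.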

