Lecture 24: Generalized Additive Models
Stat 704: Data Analysis I, Fall 2010
Tim Hanson, Ph.D.
University of South Carolina
T. Hanson (USC) Stat 704: Data Analysis I, Fall 2010 1 / 26

Generalized additive models: Additive predictors

Consider a linear regression problem:
$Y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \varepsilon_i$,
where $\varepsilon_1, \ldots, \varepsilon_n \stackrel{iid}{\sim} N(0, \sigma^2)$.

* Diagnostics (residual plots, added variable plots) might indicate poor fit of the basic model above.
* Remedial measures might include transforming the response, transforming one or both predictors, or both.
* One might also consider adding quadratic terms and/or an interaction term.
* Note: we only consider transforming continuous predictors!

When considering a transformation of one predictor, an added variable plot can suggest a transformation (e.g. $\log(x)$, $1/x$) that might work if the other predictor is correctly specified. In general, a transformation is given by a function $x^* = g(x)$.

Say we decide that $x_{i1}$ should be log-transformed and the reciprocal of $x_{i2}$ should be used. Then the resulting model is
$Y_i = \beta_0 + \beta_1 \log(x_{i1}) + \beta_2 / x_{i2} + \varepsilon_i = \beta_0 + g_1(x_{i1}) + g_2(x_{i2}) + \varepsilon_i$,
where $g_1(x)$ and $g_2(x)$ are two functions specified by $\beta_1$ and $\beta_2$.

Here we are specifying forms for $g_1(x_{\cdot 1})$ and $g_2(x_{\cdot 2})$ based on exploratory data analysis, but we could from the outset specify models for $g_1(x_{\cdot 1})$ and $g_2(x_{\cdot 2})$ that are rich enough to capture interesting and predictively useful aspects of how the predictors affect the response, and estimate these functions from the data.
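As an illustration (not part of the original slides), the transformed-predictor model above is still linear in $\beta_0, \beta_1, \beta_2$, so it can be fit by ordinary least squares after building the design matrix with $\log(x_{i1})$ and $1/x_{i2}$. A minimal sketch with simulated data; the variable names and true coefficient values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x1 = rng.uniform(1.0, 10.0, n)   # kept positive so log(x1) is defined
x2 = rng.uniform(0.5, 5.0, n)    # kept away from 0 so 1/x2 is defined

# Simulate from Y = 2 + 3*log(x1) - 1.5/x2 + eps, eps ~ N(0, 0.1^2)
beta_true = np.array([2.0, 3.0, -1.5])
y = (beta_true[0] + beta_true[1] * np.log(x1) + beta_true[2] / x2
     + rng.normal(0.0, 0.1, n))

# Design matrix using the chosen transformations g1(x) = log(x), g2(x) = 1/x
X = np.column_stack([np.ones(n), np.log(x1), 1.0 / x2])

# Ordinary least squares on the transformed predictors
beta_hat, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
```

Because the model is linear in the coefficients, nothing beyond standard least squares machinery is needed; the flexibility enters only through the fixed, pre-chosen transformations.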
One example of this is through a basis expansion; for the $j$-th predictor the transformation is
$g_j(x) = \sum_{k=1}^{K_j} \beta_{jk} \psi_{jk}(x)$,
where $\{\psi_{jk}(\cdot)\}_{k=1}^{K_j}$ are B-spline basis functions, sines/cosines, etc. This approach has gained favor among Bayesians, but it is not the approach taken in SAS PROC GAM. PROC GAM makes use of cubic smoothing splines. This is an example of nonparametric regression, which ironically connotes the inclusion of many parameters rather than fewer.

For simple regression data $\{(x_i, y_i)\}_{i=1}^n$, a cubic spline smoother $\hat{g}(x)$ minimizes
$\sum_{i=1}^n (y_i - g(x_i))^2 + \lambda \int [g''(x)]^2 \, dx$.
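The penalized criterion above can be illustrated with a discrete stand-in (not the algorithm PROC GAM actually uses): on an evenly spaced grid, replacing $\int [g''(x)]^2\,dx$ with a sum of squared second differences turns the problem into a ridge-type smoother with a closed-form solution. A sketch, with simulated data and a hypothetical smoothing parameter `lam`:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = np.linspace(0.0, 1.0, n)
y = np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.3, n)

# D applies second differences: (D g)_i = g_{i+2} - 2 g_{i+1} + g_i,
# a discrete analogue of g''(x) on an evenly spaced grid.
D = np.diff(np.eye(n), n=2, axis=0)

def penalized_smoother(y, lam):
    """Minimize sum_i (y_i - g_i)^2 + lam * ||D g||^2; the closed-form
    solution is g = (I + lam * D'D)^{-1} y."""
    n = len(y)
    return np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)

g_wiggly = penalized_smoother(y, 10.0)   # small penalty: tracks the data
g_stiff = penalized_smoother(y, 1e8)     # huge penalty: nearly a straight line
```

As $\lambda \to 0$ the fit interpolates the data, and as $\lambda \to \infty$ the curvature penalty forces the fit toward a straight line, mirroring the roles of the two terms in the cubic-spline criterion.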
This note was uploaded on 12/14/2011 for the course STAT 704 taught by Professor Staff during the Fall '11 term at South Carolina.