Variable Selection Methods

Principle of Parsimony (Occam's razor): choose fewer variables with sufficient explanatory power. This is a desirable modeling strategy, so the goal is to identify the smallest subset of covariates that provides a good fit. One way to achieve this is to retain only the significant predictors in the fitted multiple regression, but this may not work well if some variables are strongly correlated among themselves or if there are too many variables (e.g., more variables than observations). Two other possible strategies are:

- Best subset regression using Mallows' C_p statistic.
- Stepwise regression.

Best Subset Regression

For a model with p regression coefficients (i.e., p - 1 covariates plus the intercept), define its C_p value as

    C_p = RSS / s^2 - (N - 2p),

where

    RSS = residual sum of squares for the given model,
    s^2 = mean square error = RSS (for the complete model) / df (for the complete model),
    N   = number of observations.
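The C_p criterion above can be applied by fitting every subset of candidate covariates and picking the one with the smallest C_p. Here is a minimal sketch using a simulated dataset; the data, variable names, and helper function are illustrative assumptions, not part of the notes.

```python
import numpy as np
from itertools import combinations

# Simulated data (an assumption for illustration): N observations,
# k candidate covariates, of which only the first two truly matter.
rng = np.random.default_rng(0)
N, k = 50, 4
X = rng.normal(size=(N, k))
y = 2.0 + 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(size=N)

def rss(cols):
    """Residual sum of squares for the model: intercept + the given columns."""
    Xd = np.column_stack([np.ones(N)] + [X[:, j] for j in cols])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return resid @ resid

# s^2 from the complete model: RSS / df, with df = N - (k + 1) coefficients.
s2 = rss(range(k)) / (N - (k + 1))

# C_p = RSS / s^2 - (N - 2p), where p = number of regression coefficients.
results = []
for size in range(k + 1):
    for cols in combinations(range(k), size):
        p = size + 1  # intercept plus 'size' covariates
        cp = rss(cols) / s2 - (N - 2 * p)
        results.append((cols, cp))

best = min(results, key=lambda t: t[1])
print("best subset:", best[0], "C_p =", round(best[1], 2))
```

Note that for the complete model C_p equals p exactly by construction, so the criterion is informative only for comparing the smaller submodels; fitting all 2^k subsets is feasible only for modest k, which is why stepwise regression is offered as the alternative.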
Fall '08, Ma, P