varselect - Variable Selection Methods

Variable Selection Methods

Principle of Parsimony (Occam's razor): choose fewer variables with sufficient explanatory power. This is a desirable modeling strategy: the goal is to identify the smallest subset of covariates that provides a good fit.

One way of achieving this is to retain the significant predictors in the fitted multiple regression. This may not work well if some covariates are strongly correlated among themselves, or if there are too many covariates (e.g., more than the sample size). Two other possible strategies are:

- Best subset regression using Mallows' C_p statistic.
- Stepwise regression.

Best Subset Regression

For a model with p regression coefficients (i.e., p - 1 covariates plus the intercept), define its C_p value as

    C_p = RSS / s^2 - (N - 2p),

where

- RSS = residual sum of squares for the given model,
- s^2 = mean square error of the complete model = RSS (for the complete model) / df (for the complete model),
- N = number of observations. ...
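The best-subset search with Mallows' C_p can be sketched in a few lines of Python. This is a minimal illustration only: the synthetic data, the choice of true predictors, and the helper name `rss` are assumptions made here, not part of the original notes. A good submodel has C_p close to p; by construction the complete model always has C_p exactly equal to its own p.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: y truly depends on x0 and x1; x2 is pure noise.
N = 50
X = rng.normal(size=(N, 3))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.5, size=N)

def rss(cols):
    """Residual sum of squares for a model with an intercept
    plus the covariates indexed by `cols`."""
    Xd = np.column_stack([np.ones(N)] + [X[:, j] for j in cols])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    r = y - Xd @ beta
    return r @ r

# s^2 = MSE of the complete model = RSS(complete) / df(complete),
# with df = N - p_full and p_full = number of coefficients (incl. intercept).
full = (0, 1, 2)
p_full = len(full) + 1
s2 = rss(full) / (N - p_full)

# Evaluate C_p = RSS / s^2 - (N - 2p) for every non-empty subset.
for k in range(1, 4):
    for cols in itertools.combinations(range(3), k):
        p = k + 1                      # coefficients = covariates + intercept
        cp = rss(cols) / s2 - (N - 2 * p)
        print(cols, round(cp, 2))
```

Subsets that omit a true predictor (x0 or x1) produce a large C_p, while the subset {x0, x1} lands near C_p = 3, flagging it as the parsimonious choice.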