

*Applied Statistical Computing and Graphics*

# All Subsets Regression

```r
# All Subsets Regression
library(leaps)
attach(mydata)
leaps <- regsubsets(y ~ x1 + x2 + x3 + x4, data = mydata, nbest = 10)

# view results
summary(leaps)

# plot a table of models showing the variables in each model;
# models are ordered by the selection statistic
plot(leaps, scale = "r2")

# plot statistic by subset size
library(car)
subsets(leaps, statistic = "rsq")
```

Other options for the `scale` argument of plot( ) are bic, Cp, and adjr2. Other options for the `statistic` argument of subsets( ) are bic, cp, adjr2, and rss.

# Relative Importance

The relaimpo package provides measures of relative importance for each of the predictors in a model. See help(calc.relimp) for details on the four measures of relative importance provided.

```r
# Calculate Relative Importance for Each Predictor
library(relaimpo)
calc.relimp(fit, type = c("lmg", "last", "first", "pratt"), rela = TRUE)

# Bootstrap Measures of Relative Importance (1000 samples)
boot <- boot.relimp(fit, b = 1000,
                    type = c("lmg", "last", "first", "pratt"),
                    rank = TRUE, diff = TRUE, rela = TRUE)
booteval.relimp(boot)                     # print result
plot(booteval.relimp(boot, sort = TRUE))  # plot result
```

# Nonlinear Regression

The nls( ) function in the base stats package fits nonlinear least-squares models. See John Fox's *Nonlinear Regression and Nonlinear Least Squares* for an overview. Huet and colleagues' *Statistical Tools for Nonlinear Regression: A Practical Guide with S-PLUS and R Examples* is a valuable reference book.

# Robust Regression

There are many functions in R to aid with robust regression. For example, you can perform robust regression with the rlm( ) function in the MASS package. John Fox's (who else?) *Robust Regression* provides a good starting overview. The UCLA Statistical Computing website has robust regression examples. The robust package provides a comprehensive library of robust methods, including regression.
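As a minimal sketch of the rlm( ) approach mentioned above: the synthetic `mydata` below is a hypothetical stand-in for the dataset the examples assume, with a few outliers injected so the downweighting is visible.

```r
# Robust regression sketch with MASS::rlm() (Huber M-estimation, the default)
# NOTE: 'mydata' here is synthetic stand-in data, not from the text.
library(MASS)

set.seed(42)
mydata <- data.frame(x1 = rnorm(100), x2 = rnorm(100),
                     x3 = rnorm(100), x4 = rnorm(100))
mydata$y <- 1 + 2 * mydata$x1 - mydata$x2 + rnorm(100)
mydata$y[1:5] <- mydata$y[1:5] + 15   # inject a few gross outliers

rfit <- rlm(y ~ x1 + x2 + x3 + x4, data = mydata)
summary(rfit)   # coefficients and robust scale estimate
rfit$w[1:10]    # final IWLS weights; outlying rows get weights well below 1
```

Inspecting `rfit$w` is a quick diagnostic: observations that ordinary least squares would let dominate the fit show up with small weights.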
The robustbase package also provides basic robust statistics, including model-selection methods. And David Olive has provided a detailed online review of *Applied Robust Statistics* with sample R code.

# Comparing Models

You can compare nested models with the anova( ) function. The following code provides a simultaneous test that x3 and x4 add to linear prediction above and beyond x1 and x2.

```r
# compare models
fit1 <- lm(y ~ x1 + x2 + x3 + x4, data = mydata)
fit2 <- lm(y ~ x1 + x2, data = mydata)
anova(fit1, fit2)
```

# Cross Validation

You can do K-fold cross-validation using the cv.lm( ) function in the DAAG package.

```r
# K-fold cross-validation
library(DAAG)
cv.lm(df = mydata, fit, m = 3)  # 3-fold cross-validation
```

Sum the MSE for each fold, divide by the number of observations, and take the square root to get the cross-validated standard error of estimate.

You can assess R2 shrinkage via K-fold cross-validation. Using the crossval() function from the bootstrap package, do the following:

```r
# Assessing R2 shrinkage using 10-Fold Cross-Validation
fit <- lm(y ~ x1 + x2 + x3, data = mydata)
library(bootstrap)
```

# ANOVA

If you have been analyzing
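The R2-shrinkage example above breaks off after `library(bootstrap)`. The following is a hedged, self-contained sketch of how such a crossval( ) call is typically completed; the synthetic `mydata` stands in for the dataset the text assumes, and the helper names `theta.fit` / `theta.predict` are illustrative, written in the fit/predict form that bootstrap's crossval( ) expects.

```r
# Sketch: R2 shrinkage via 10-fold CV with bootstrap::crossval()
# NOTE: 'mydata' is synthetic stand-in data, not from the text.
library(bootstrap)

set.seed(1)
mydata <- data.frame(x1 = rnorm(50), x2 = rnorm(50), x3 = rnorm(50))
mydata$y <- 1 + mydata$x1 + 0.5 * mydata$x2 + rnorm(50)
fit <- lm(y ~ x1 + x2 + x3, data = mydata)

# fit/predict helpers in the form crossval() expects
theta.fit <- function(x, y) lsfit(x, y)
theta.predict <- function(fit, x) cbind(1, x) %*% fit$coef

X <- as.matrix(mydata[c("x1", "x2", "x3")])
y <- mydata$y
results <- crossval(X, y, theta.fit, theta.predict, ngroup = 10)

cor(y, fit$fitted.values)^2  # raw R-squared
cor(y, results$cv.fit)^2     # cross-validated R-squared
```

The gap between the raw and cross-validated R-squared is the shrinkage estimate: how much of the apparent fit is optimism from fitting and evaluating on the same data.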
