# Computing Primer for Applied Linear Regression, Third Edition, Using R and S-Plus

Sanford Weisberg
University of Minnesota School of Statistics
October 23, 2007

© 2005, Sanford Weisberg
Home website: www.stat.umn.edu/alr

## Contents

Introduction 1
0.1 Organization of this primer 4
0.2 Data files 5
  0.2.1 Documentation 5
  0.2.2 R data files and a package 6
  0.2.3 S-Plus data files and library 6
  0.2.4 Getting the data in text files 7
  0.2.5 An exceptional file 7
0.3 Scripts 7
0.4 The very basics 8
  0.4.1 Reading a data file 8
  0.4.2 Reading Excel files 9
  0.4.3 Saving text output and graphs 9
  0.4.4 Normal, F, t and χ² tables 10
0.5 Abbreviations to remember 11
0.6 Packages/Libraries for R and S-Plus 12
0.7 Copyright and Printing this Primer 12

1 Scatterplots and Regression 13
1.1 Scatterplots 13

1.2 Mean functions 16
1.3 Variance functions 16
1.4 Summary graph 16
1.5 Tools for looking at scatterplots 16
1.6 Scatterplot matrices 16

2 Simple Linear Regression 19
2.1 Ordinary least squares estimation 19
2.2 Least squares criterion 19
2.3 Estimating σ² 20
2.4 Properties of least squares estimates 20
2.5 Estimated variances 20
2.6 Comparing models: The analysis of variance 21
2.7 The coefficient of determination, R² 22
2.8 Confidence intervals and tests 23
2.9 The Residuals 26

3 Multiple Regression 27
3.1 Adding a term to a simple linear regression model 27
3.2 The Multiple Linear Regression Model 27
3.3 Terms and Predictors 27
3.4 Ordinary least squares 28
3.5 The analysis of variance 30
3.6 Predictions and fitted values 31

4 Drawing Conclusions 33
4.1 Understanding parameter estimates 33
  4.1.1 Rate of change 34
  4.1.2 Sign of estimates 34
  4.1.3 Interpretation depends on other terms in the mean function 34
  4.1.4 Rank deficient and over-parameterized models 34
4.2 Experimentation versus observation 34
4.3 Sampling from a normal population 34
4.4 More on R² 34
4.5 Missing data 34
4.6 Computationally intensive methods 36

5 Weights, Lack of Fit, and More 41
5.1 Weighted Least Squares 41
  5.1.1 Applications of weighted least squares 42
  5.1.2 Additional comments 42
5.2 Testing for lack of fit, variance known 42
5.3 Testing for lack of fit, variance unknown 43
5.4 General F testing 44
5.5 Joint confidence regions 45

6 Polynomials and Factors 47
6.1 Polynomial regression 47
  6.1.1 Polynomials with several predictors 48
  6.1.2 Using the delta method to estimate a minimum or a maximum 49
  6.1.3 Fractional polynomials 51
6.2 Factors 51
  6.2.1 No other predictors 53
  6.2.2 Adding a predictor: Comparing regression lines 53
6.3 Many factors 54
6.4 Partial one-dimensional mean functions 54
6.5 Random coefficient models 56

7 Transformations 59
7.1 Transformations and scatterplots 59
  7.1.1 Power transformations 59
  7.1.2 Transforming only the predictor variable 59
  7.1.3 Transforming the response only 63
  7.1.4 The Box and Cox method 64
7.2 Transformations and scatterplot matrices 65
  7.2.1 The 1D estimation result and linearly related predictors 66
  7.2.2
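As a taste of the workflow the contents above cover (scatterplots in Chapter 1, ordinary least squares fitting and confidence intervals in Chapter 2), here is a minimal R sketch. It uses R's built-in `cars` data set as a stand-in; the primer itself works with its own data files distributed through the alr3 package, so the variable names here are illustrative only.

```r
# Fit a simple linear regression by ordinary least squares,
# using the built-in cars data (stand-in for the primer's data files)
m1 <- lm(dist ~ speed, data = cars)

summary(m1)   # coefficient estimates, sigma-hat, R^2, overall F test
confint(m1)   # confidence intervals for the coefficients (Section 2.8)

# Scatterplot with the fitted mean function drawn through it (Chapter 1)
plot(dist ~ speed, data = cars)
abline(m1)
```

All of the calls above are base R; the same code runs essentially unchanged in S-Plus, which is why the primer treats the two systems together.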


## This note was uploaded on 10/22/2009 for the course STAT 425, taught by Professor Ping Ma during the Spring '09 term at the University of Illinois at Urbana–Champaign.
