STA 100 Lecture 25
Paul Baines
Department of Statistics
University of California, Davis
March 7th, 2011

Admin for the Day

- Final project due Friday, 3pm
- Office hours: project questions, please!
- Office hours Monday 9.30-11.30am, 3.00-4.30pm
- Office hours Tuesday 11.00am-12.00pm

References for Today: Rosner, Ch 11 (7th Ed.)
References for Wednesday: Rosner, Ch 11 (7th Ed.)

Testing Regression Parameters

We assumed that the response was a linear function of the explanatory variable. Did we need the intercept? Did we need the slope? Would a simpler model have been OK? We need to be able to test whether the regression parameters are necessary, i.e., is β = 0?

Testing Model Parameters

Recall that our linear model is:

    Y_i = α + β X_i + ε_i,   ε_i ~iid~ N(0, σ²).

Two simpler models would be:

    Y_i = β X_i + ε_i,   ε_i ~iid~ N(0, σ²),   [α = 0]   (1)
    Y_i = α + ε_i,       ε_i ~iid~ N(0, σ²),   [β = 0]   (2)

We can test the hypothesis that α = 0 or the hypothesis that β = 0, and see whether a simpler model would suffice.

Testing the Slope

1. H_0: β = 0 vs. H_1: β ≠ 0
2. Test statistic: t = β̂ / SE(β̂) = β̂ / (σ̂ / √s_xx), where s_xx = Σ_{i=1}^n (x_i − x̄)².
3. Reference distribution: under H_0, the test statistic t follows a t distribution with n − 2 degrees of freedom.
4. The p-value, as usual, is p = P(|t_{n−2}| > |t|).
5. Decide whether or not to reject depending on the value of p.
6. Interpret the meaning for your example.

Testing the Slope: Brain-Body Example

1. H_0: β = 0 vs. H_1: β ≠ 0
2. Test statistic: t = β̂ / SE(β̂) = 0.43580 / 0.08751 = 4.980. The estimate, standard error and test statistic are given in columns 1, 2 and 3 of the R output.
3. Reference distribution: under H_0, the test statistic t follows a t distribution with 25 degrees of freedom.
4. The p-value is p = P(|t_{25}| > |t|) = 0.0000392. The p-value is listed in column 4 of the R output.
5. Reject H_0 since p < α = 0.05.
6. 
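The six steps for testing the slope can be sketched in code. This is a minimal illustration, not part of the lecture: the data below are made up for demonstration (they are not the animals dataset), and the estimates are computed directly from the formulas above.

```python
import numpy as np
from scipy import stats

# Hypothetical data (illustrative only; not the brain-body dataset)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1, 5.9])
n = len(x)

# Least-squares estimates of slope (beta) and intercept (alpha)
s_xx = np.sum((x - x.mean()) ** 2)
beta_hat = np.sum((x - x.mean()) * (y - y.mean())) / s_xx
alpha_hat = y.mean() - beta_hat * x.mean()

# Residual standard error, on n - 2 degrees of freedom
resid = y - (alpha_hat + beta_hat * x)
sigma_hat = np.sqrt(np.sum(resid ** 2) / (n - 2))

# Test statistic t = beta_hat / SE(beta_hat) and two-sided p-value
se_beta = sigma_hat / np.sqrt(s_xx)
t_stat = beta_hat / se_beta
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)

print(t_stat, p_value)
```

The same test is what `scipy.stats.linregress` (or R's `summary(lm(...))`) reports for the slope, so the hand computation can be checked against it.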
The log-brain and log-body weights of animals appear to have a linear relationship (log-body weight is a statistically significant explanatory variable).

Doing Linear Regression

For the brain-body weight example we get:

> RegModel.2 <- lm(log(brain) ~ log(body), data=animals)
> summary(RegModel.2)
[snipped]
Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)  2.99064    0.47051   6.356 1.18e-06 ***
log(body)    0.43580    0.08751   4.980 3.93e-05 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
[snipped]

- The asterisks in the last column make it easy to see whether each variable is necessary.
- No asterisk (or dot) means that you can probably do without that variable (there is little evidence that its coefficient differs from zero).
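The `t value` and `Pr(>|t|)` columns in the R summary above can be reproduced from the first two columns alone: the t value is Estimate divided by Std. Error, and the p-value is the two-sided tail probability of a t distribution with n − 2 = 25 degrees of freedom. A short sketch (using SciPy's t distribution as a stand-in for R's `pt`):

```python
from scipy import stats

# Values taken from the log(body) row of the R summary table
estimate = 0.43580
std_error = 0.08751

t_value = estimate / std_error                    # column 3 of the output
p_value = 2 * stats.t.sf(abs(t_value), df=25)     # column 4, Pr(>|t|)

print(round(t_value, 3), p_value)
```

This recovers t = 4.980 and p ≈ 3.93e-05, matching the R output.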
This note was uploaded on 03/09/2011 for the course STAT 100 taught by Professor Drake during the Spring '10 term at UC Davis.