# Refitting the Regression with the Transformed Response


```{r}
# Transform the response with the Box-Cox exponent and refit the linear model
ypc_n <- ypc^lambpc
lpc_n <- lm(ypc_n ~ xpc)
summary(lpc_n)

# Plot the transformed data with the fitted regression line
plot(xpc, ypc_n)
abline(coef(lpc_n)[1], coef(lpc_n)[2])

# Residual diagnostics: residuals vs. x, absolute residuals, and a normal Q-Q plot
rpc_n <- residuals(lpc_n)
plot(xpc, rpc_n)
abline(0, 0)
plot(xpc, abs(rpc_n))
qqnorm(rpc_n)
```
Therefore, we find the new regression function is more appropriate, so we make predictions with it as below:

```{r}
# Observed responses with x near 20 (between 18 and 22), for comparison
ypc20e <- pc[, 2][which(pc[, 3] >= 18 & pc[, 3] <= 22)]
ypc20e

# Predict on the transformed scale at x = 20, then back-transform to the
# original scale by raising to the power 1/lambda
xpc20 <- 20
ypc_n20 <- coef(lpc_n)[1] + coef(lpc_n)[2] * xpc20
ypc20 <- ypc_n20^(1/lambpc)
ypc20
```

So we can see that the transformed regression function is indeed more appropriate: its error terms follow the normal distribution fairly closely, so the prediction should be more precise. However, there are only a few data points near x = 20, and the regression function weights every observation equally, which may introduce some error into this prediction.
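The transform-fit-predict-back-transform workflow above is language-agnostic. A minimal sketch in Python with synthetic data (the variable names, the exponent value, and the data-generating model here are all hypothetical stand-ins, not the document's dataset):

```python
import numpy as np

# Hypothetical setup: lambpc plays the role of the Box-Cox exponent,
# and (xpc, ypc) a predictor/response pair whose transformed response
# ypc^lambpc is roughly linear in xpc.
rng = np.random.default_rng(0)
lambpc = 0.5
xpc = rng.uniform(5, 35, 60)
ypc = (2.0 + 0.3 * xpc + rng.normal(0, 0.1, 60)) ** (1 / lambpc)

# Fit the linear model on the transformed scale: y^lambda ~ x.
slope, intercept = np.polyfit(xpc, ypc ** lambpc, 1)

# Predict at x = 20 on the transformed scale, then back-transform
# by raising to the power 1/lambda.
ypc_n20 = intercept + slope * 20
ypc20 = ypc_n20 ** (1 / lambpc)
```

Because the fit is done on the transformed scale, the back-transformed value `ypc20` is a prediction of the (approximate) median response at x = 20, not the mean, which is worth keeping in mind when comparing it with the observed values nearby.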