STAT 200 Chapter 10: Inference for Regression

Linear Regression (Section 10.1)

Recall that in Chapter 2 we explored the relationship between two quantitative variables by examining the scatterplot and measuring the strength of a linear association with the correlation coefficient. We also learnt techniques of simple linear regression that allow us to make predictions about a response variable from a given explanatory variable.

The linear regression model

In simple linear regression, we fit a straight line to the data (x_i, y_i), where x is the explanatory variable and y is the response variable. We say that we are fitting a linear model to the data. The linear model has the form

    Y_i = β_0 + β_1 X_i + ε_i,

where β_0 is the population intercept, β_1 is the population slope, and ε_i is the error term.

Assumption of the linear model: the ε_i are independent N(0, σ) random variables. This implies that for each fixed x value (X_i = x_i), the Y_i are independent random variables from a normal distribution:

    E(Y_i) = E(β_0 + β_1 x_i + ε_i) = β_0 + β_1 x_i + E(ε_i) = β_0 + β_1 x_i,   because E(ε_i) = 0
    V(Y_i) = V(β_0 + β_1 x_i + ε_i) = V(ε_i) = σ^2

Y_i is a linear combination of ε_i, which is normal, so Y_i is also normally distributed, i.e. Y_i ~ N(β_0 + β_1 x_i, σ).
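The model above can be illustrated with a small simulation. The sketch below (the parameter values β_0 = 2, β_1 = 0.5, σ = 1 and the sample size are chosen purely for illustration) generates data satisfying Y_i = β_0 + β_1 x_i + ε_i with ε_i independent N(0, σ), then recovers the slope and intercept by least squares:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) population parameters and sample size.
beta0, beta1, sigma = 2.0, 0.5, 1.0
n = 500

x = rng.uniform(0, 10, size=n)        # explanatory variable
eps = rng.normal(0, sigma, size=n)    # errors: independent N(0, sigma)
y = beta0 + beta1 * x + eps           # linear model: Y_i = b0 + b1*x_i + e_i

# Least-squares fit; polyfit with deg=1 returns (slope, intercept).
b1_hat, b0_hat = np.polyfit(x, y, deg=1)
print(f"intercept estimate: {b0_hat:.2f}  (true value {beta0})")
print(f"slope estimate:     {b1_hat:.2f}  (true value {beta1})")
```

With a sample this large, the estimates should land close to the population values, consistent with E(Y_i) = β_0 + β_1 x_i derived above.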
- Spring '10