# stat200ch10_winter10 - STAT 200 Chapter 10: Inference for Regression


## Linear Regression (Section 10.1)

• Recall that in Chapter 2 we explored the relationship between two quantitative variables by examining the scatterplot and measuring the strength of a linear association with the correlation coefficient. We also learned techniques of simple linear regression that allow us to make predictions about a response variable from a given explanatory variable.

• The linear regression model

In simple linear regression, we fit a straight line to the data (x_i, y_i), where x is the explanatory variable and y is the response variable. We say that we are fitting a linear model to the data. The linear model has the form

Y_i = β₀ + β₁ X_i + ε_i,

where β₀ is the population intercept, β₁ is the population slope, and ε_i is the error term.

Assumption of the linear model: the ε_i's are independent N(0, σ) random variables, which implies that for each fixed x value (X_i = x_i), the Y_i's are independent random variables from a normal distribution:

– E(Y_i) = E(β₀ + β₁ x_i + ε_i) = β₀ + β₁ x_i + E(ε_i) = β₀ + β₁ x_i, because E(ε_i) = 0;

– V(Y_i) = V(β₀ + β₁ x_i + ε_i) = V(ε_i) = σ², because β₀ + β₁ x_i is a constant;

– Y_i is a linear combination of ε_i, which is normal, so Y_i is also normally distributed, i.e. Y_i ~ N(β₀ + β₁ x_i, σ).
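The model above can be illustrated with a short simulation (a sketch for illustration, not part of the course notes; the parameter values β₀ = 2, β₁ = 0.5, σ = 1 are hypothetical). We generate data from Y_i = β₀ + β₁ x_i + ε_i with independent N(0, σ) errors and recover the slope and intercept with the usual least-squares formulas, which should land close to the true values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population parameters, chosen only for this example
beta0, beta1, sigma = 2.0, 0.5, 1.0

# Simulate from the linear model Y_i = beta0 + beta1 * x_i + eps_i
x = np.linspace(0.0, 10.0, 200)
eps = rng.normal(0.0, sigma, size=x.size)  # independent N(0, sigma) errors
y = beta0 + beta1 * x + eps

# Least-squares estimates: b1 = Sxy / Sxx, b0 = ybar - b1 * xbar
xbar, ybar = x.mean(), y.mean()
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
b0 = ybar - b1 * xbar

print(f"intercept estimate b0 = {b0:.3f}, slope estimate b1 = {b1:.3f}")
```

With n = 200 observations, the estimates are typically within a few hundredths of the true slope, reflecting that the standard error of b1 shrinks as Sxx grows.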