stat200ch10_winter10 - STAT 200 Chapter 10 Inference for Regression
STAT 200 Chapter 10: Inference for Regression

Linear Regression (Section 10.1)

Recall that in Chapter 2 we explored the relationship between two quantitative variables by examining the scatterplot and measuring the strength of a linear association with the correlation coefficient. We also learned the techniques of simple linear regression, which allow us to make predictions about a response variable from a given explanatory variable.

The linear regression model

In simple linear regression, we fit a straight line to the data (x_i, y_i), where x is the explanatory variable and y is the response variable. We say that we are fitting a linear model to the data. The linear model has the form

    Y_i = β₀ + β₁ X_i + ε_i,

where β₀ is the population intercept, β₁ is the population slope, and ε_i is the error term.

Assumption of the linear model: the ε_i's are independent N(0, σ) random variables. This implies that for each fixed x value (X_i = x_i), the Y_i's are independent random variables from a normal distribution:

    E(Y_i) = E(β₀ + β₁ x_i + ε_i) = β₀ + β₁ x_i + E(ε_i) = β₀ + β₁ x_i,   because E(ε_i) = 0
    V(Y_i) = V(β₀ + β₁ x_i + ε_i) = V(ε_i) = σ²

Y_i is a linear combination of ε_i, which is normal, so Y_i is also normally distributed, i.e. Y_i ~ N(β₀ + β₁ x_i, σ).
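The linear model above can be illustrated with a short simulation: generate data from Y_i = β₀ + β₁ x_i + ε_i with ε_i ~ N(0, σ), then recover the slope and intercept with the usual least-squares formulas. The parameter values β₀ = 2, β₁ = 0.5, σ = 1 below are hypothetical, chosen only for illustration and not taken from the notes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" population parameters (illustration only).
beta0, beta1, sigma = 2.0, 0.5, 1.0

# Fixed x values, with independent normal errors ε_i ~ N(0, σ).
n = 200
x = np.linspace(0.0, 10.0, n)
eps = rng.normal(0.0, sigma, size=n)
y = beta0 + beta1 * x + eps          # Y_i = β0 + β1 x_i + ε_i

# Least-squares estimates: b1 = Sxy / Sxx, b0 = ȳ − b1 x̄.
sxx = np.sum((x - x.mean()) ** 2)
sxy = np.sum((x - x.mean()) * (y - y.mean()))
b1 = sxy / sxx
b0 = y.mean() - b1 * x.mean()

print("intercept estimate:", round(b0, 3))
print("slope estimate:", round(b1, 3))
```

With n = 200 points the estimates should land close to the hypothetical true values, consistent with E(Y_i) = β₀ + β₁ x_i and V(Y_i) = σ² derived above.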