
# Lecture 9-2: Inference for Regression

So far we have discussed inference for one- and two-sample quantitative variables (e.g. *t*-tests) and for categorical variables (e.g. χ² tests). Now consider inference where we have a single quantitative response variable (Y) and a single quantitative explanatory (predictor) variable (X). We have already covered the descriptive tools: scatterplots, least-squares regression, and correlation. Now we consider inference for regression.
Recall that for the fuel consumption example the fitted regression line is

Fuel = 15.8 − 0.128 · Temp

If a different sample were used, we would hope the resulting regression line would be similar to the one found here, but it would not be exactly the same. We can imagine that there is a true line summarizing the relationship between Fuel and Temp. That true line is our "population line."
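The fitting itself can be sketched in a few lines of NumPy. The lecture's actual fuel-consumption data set is not shown in this excerpt, so the readings below are hypothetical values generated near the line above, for illustration only:

```python
import numpy as np

# Hypothetical (Temp, Fuel) readings -- the lecture's real data are not in
# this excerpt, so these values are made up to lie near Fuel = 15.8 - 0.128*Temp.
temp = np.array([20.0, 25.0, 31.0, 38.0, 45.0, 52.0, 60.0])
fuel = 15.8 - 0.128 * temp + np.array([0.2, -0.1, 0.15, -0.2, 0.1, -0.05, 0.0])

# np.polyfit with deg=1 returns [slope, intercept] (highest degree first)
b1, b0 = np.polyfit(temp, fuel, deg=1)
print(f"fitted line: Fuel = {b0:.2f} + ({b1:.3f}) * Temp")
```

Because the fake noise is small, the fitted slope and intercept land close to the values quoted in the lecture; with a different sample of noise they would shift slightly, which is exactly the point of the next slides.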

In statistics we make a clear distinction between population parameters and sample statistics. Consider the least-squares line computed from a sample as an estimate of the true regression line for a population.

The least-squares line fitted from the sample is:

ŷ = b₀ + b₁x

b₀ and b₁ are sample statistics, and therefore random!

Corresponding to the fitted line, the population line is:

μ_y|x = β₀ + β₁x (using Greek "betas")

β₀ and β₁ are population parameters, and μ_y|x = β₀ + β₁x is the mean value of y when the value of x is given.
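The claim that b₀ and b₁ are random can be seen directly by simulation: draw many samples from an assumed population line and refit each time. The true parameter values below are assumptions chosen to match the fuel example, not values from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1, sigma = 15.8, -0.128, 0.5   # assumed "true" population values
x = np.linspace(20, 60, 30)

# Refit the least-squares line on 1000 independent samples
slopes = []
for _ in range(1000):
    y = beta0 + beta1 * x + rng.normal(0, sigma, size=x.size)
    b1, b0 = np.polyfit(x, y, deg=1)
    slopes.append(b1)

slopes = np.array(slopes)
print("mean of b1 across samples:", slopes.mean())  # centered near beta1
print("SD of b1 across samples:  ", slopes.std())   # b1 varies sample to sample
```

The fitted slope b₁ is unbiased for β₁ on average, but each individual sample gives a different value: a sample statistic, not a fixed parameter.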
Given the population regression line μ_y|x = β₀ + β₁x:

- β₀ is the population Y-intercept: the mean value of Y when X = 0.*
- β₁ is the population slope: it describes the change in the mean response (Y) for a one-unit increase in X.

*Note: Be careful: interpreting β₀ this way only makes sense when x = 0 is within (or near) the range of the data.

Suppose we have a sample of n pairs of observations on the response variable Y and the predictor variable X:

(x₁, y₁), (x₂, y₂), (x₃, y₃), . . . , (xₙ, yₙ)

The statistical model for simple linear regression states that

yᵢ = β₀ + β₁xᵢ + εᵢ = μ_y|x + εᵢ

where μ_y|x = β₀ + β₁x is the mean value of y when the value of x is given, and ε (Greek epsilon) is an error term describing the leftover effect on y.
The precision of the regression line is measured by σ. The model can be written as:

DATA = FIT + RESIDUAL

In simple linear regression the FIT is μ_y|x = β₀ + β₁x, and the independent deviations εᵢ are the errors, with εᵢ ~ N(0, σ).
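A minimal sketch of the DATA = FIT + RESIDUAL decomposition, with σ estimated from the residuals. The true parameter values are again assumptions for illustration; the standard estimate of σ divides by n − 2 because two parameters (intercept and slope) were fitted:

```python
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1, sigma = 15.8, -0.128, 0.5          # assumed true parameters
x = np.linspace(20, 60, 50)
y = beta0 + beta1 * x + rng.normal(0, sigma, size=x.size)  # DATA = FIT + error

b1, b0 = np.polyfit(x, y, deg=1)
fit = b0 + b1 * x          # FIT (estimated)
residuals = y - fit        # RESIDUAL: estimates of the unobservable errors

# s estimates sigma; divide by n - 2 degrees of freedom
n = x.size
s = np.sqrt(np.sum(residuals**2) / (n - 2))
print("estimate of sigma:", s)
```

A by-product of least squares is that the residuals sum to (numerically) zero, mirroring the "mean of zero" assumption about the errors on the next slide.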

Assumptions about the model error terms, the εᵢ's:

- **Mean of zero**: at any given value of x, the population of potential error-term values has a mean equal to zero.
- **Constant variance**: at any given value of x, the population of potential error-term values has a variance that does not depend on the value of x.
- **Normal distribution**: at any given value of x, the population of potential error-term values has a normal distribution.
- **Independence**: values of the error term are statistically independent of each other.
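In practice these assumptions are checked through the residuals. A rough numeric sketch, on simulated data where the assumptions hold by construction (all parameter values are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(20, 60, 200)
y = 15.8 - 0.128 * x + rng.normal(0, 0.5, size=x.size)  # assumptions hold here

b1, b0 = np.polyfit(x, y, deg=1)
resid = y - (b0 + b1 * x)

# Mean of zero: residuals should average out to roughly 0
print("overall residual mean:", resid.mean())

# Constant variance: spread should look similar for low-x and high-x halves
low, high = resid[x < np.median(x)], resid[x >= np.median(x)]
print("low-x SD:", low.std(), " high-x SD:", high.std())
```

With real data one would also plot residuals against x (for constant variance) and a normal quantile plot of the residuals (for normality), rather than relying only on summary numbers.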

Consider the population regression line μ_y|x = β₀ + β₁x. We estimate this line from the sample.

## This note was uploaded on 01/16/2010 for the course BUAD 310 at USC.
