
# CH-15 PPT - Multiple Regression Analysis (November 30, 2010)

## Introduction

We can use the same ideas from simple linear regression to analyze relationships between a dependent variable and several independent variables. Multiple regression is an extension of simple linear regression for investigating how a response y is affected by several independent variables x1, ..., xk.

Our objectives are:
- find relationships between y and x1, ..., xk
- predict y using x1, ..., xk
## Examples

Monthly sales (y) of a retail store may depend on:
- x1 = advertising expenditure
- x2 = time of year
- x3 = size of inventory
- x4 = state of the economy

Body fat (y) may depend on:
- x1 = age
- x2 = sex
- x3 = body type

## Pertinent Questions

- Which of the independent variables are useful and which are not?
- How can we create a prediction equation that allows us to predict y using knowledge of x1, x2, x3, etc.?
- How strong is the relationship between y and the independent variables?
- How good is the prediction?
## The General Linear Model

y = β0 + β1·x1 + β2·x2 + ... + βk·xk + ε

- y is the dependent variable (response); x1, ..., xk are independent variables (predictors).
- The deterministic part of the model, E(y) = β0 + β1·x1 + β2·x2 + ... + βk·xk, describes the average value of y for any fixed values of x1, ..., xk.
- The observation y deviates from the deterministic part by an amount ε, the random error.
- We assume the random errors are independent normal random variables with mean zero and constant variance σ².
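The model above can be sketched in a few lines of code. This is a minimal illustration, not part of the original slides: the coefficient values, function names, and σ below are made-up assumptions chosen only to show how E(y) and a noisy observation y relate.

```python
import random

def mean_response(beta0, betas, xs):
    """Deterministic part E(y) = beta0 + beta1*x1 + ... + betak*xk."""
    return beta0 + sum(b * x for b, x in zip(betas, xs))

def simulate_y(beta0, betas, xs, sigma, rng):
    """One observation y = E(y) + eps, with eps ~ Normal(0, sigma^2)."""
    return mean_response(beta0, betas, xs) + rng.gauss(0.0, sigma)

rng = random.Random(0)
# Hypothetical coefficients: beta0 = 1.0, beta1 = 0.5, beta2 = -0.2
ey = mean_response(1.0, [0.5, -0.2], [2.0, 3.0])   # E(y) = 1 + 0.5*2 - 0.2*3 = 1.4
y = simulate_y(1.0, [0.5, -0.2], [2.0, 3.0], 0.1, rng)  # E(y) plus random error
```

Note that y differs from E(y) only by the random draw ε, mirroring the decomposition in the slide.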

## The Method of Least Squares

Data: n observations of the response y and the independent variables x1, ..., xk.

The best-fitting prediction equation is ŷ = b0 + b1·x1 + b2·x2 + ... + bk·xk. We choose our estimates b0, ..., bk to minimize

SSE = Σ (y − b0 − b1·x1 − b2·x2 − ... − bk·xk)²

The computation is usually done by a computer.
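Since the slide says the computation is usually done by a computer, here is one way such a computation can look. This is a sketch using `numpy.linalg.lstsq`; the data and variable names are invented for illustration.

```python
import numpy as np

def fit_least_squares(X, y):
    """Return (b0, b1, ..., bk) minimizing SSE = sum((y - yhat)^2)."""
    # Prepend a column of ones so b0 acts as the intercept.
    X1 = np.column_stack([np.ones(len(y)), X])
    b, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return b

# Sanity check with noise-free data generated from y = 2 + 3*x1 - 1*x2,
# so least squares should recover the coefficients exactly.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 3.0]])
y = 2 + 3 * X[:, 0] - 1 * X[:, 1]
b = fit_least_squares(X, y)  # close to [2, 3, -1]
```

With real (noisy) data the recovered b's estimate, rather than equal, the underlying β's.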
## Example

A data set contains the college GPA (y), high school GPA (x1), and study time (x2) of 59 randomly selected students. We are interested in regressing college GPA on high school GPA and study time. Regression line: ŷ = …
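The actual GPA data (and the fitted regression line) are not included in this preview, so the sketch below fits the same kind of model to synthetic data. Every number here is fabricated for illustration: the "true" coefficients, the error scale, and the predictor ranges are assumptions, not values from the course data set.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 59                                        # same sample size as the slide
hs_gpa = rng.uniform(2.0, 4.0, n)             # x1: high school GPA (assumed range)
study = rng.uniform(0.0, 30.0, n)             # x2: weekly study hours (assumed range)
eps = rng.normal(0.0, 0.15, n)                # random error, sigma assumed
# Hypothetical "true" model used only to generate data:
college_gpa = 0.5 + 0.6 * hs_gpa + 0.02 * study + eps

# Fit college GPA on high school GPA and study time.
X = np.column_stack([np.ones(n), hs_gpa, study])
b, *_ = np.linalg.lstsq(X, college_gpa, rcond=None)

# Prediction for a hypothetical student: HS GPA 3.5, 20 study hours per week.
yhat = b[0] + b[1] * 3.5 + b[2] * 20.0
```

The fitted b's play the role of the coefficients in the slide's (omitted) regression line, and `yhat` is the kind of prediction the "Pertinent Questions" slide asks about.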


## This note was uploaded on 11/05/2011 for the course BMGT 220 taught by Professor Bulmash during the Spring '08 term at Maryland.
