CH-15 PPT - Multiple Regression Analysis, November 30, 2010

Multiple Regression Analysis November 30, 2010
Introduction

We can use the same ideas from simple linear regression to analyze relationships between a dependent variable and several independent variables. Multiple regression extends simple linear regression to investigate how a response y is affected by several independent variables x1, ..., xk.

Our objectives are:
- to find relationships between y and x1, ..., xk
- to predict y using x1, ..., xk
Examples

Monthly sales (y) of a retail store may depend on:
- x1 = advertising expenditure
- x2 = time of year
- x3 = size of inventory
- x4 = state of the economy

Body fat (y) may depend on:
- x1 = age
- x2 = sex
- x3 = body type
Pertinent Questions

- Which of the independent variables are useful and which are not?
- How can we create a prediction equation that allows us to predict y using knowledge of x1, x2, x3, etc.?
- How strong is the relationship between y and the independent variables?
- How good is the prediction?
The General Linear Model

y = β0 + β1 x1 + β2 x2 + ... + βk xk + ε

- y is the dependent variable (response); x1, ..., xk are independent variables (predictors).
- The deterministic part of the model, E(y) = β0 + β1 x1 + β2 x2 + ... + βk xk, describes the average value of y for any fixed values of x1, ..., xk.
- The observation y deviates from the deterministic model by an amount ε, the random error.
- We assume the random errors are independent normal random variables with mean zero and constant variance σ².
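The model above can be simulated directly. The sketch below (coefficient values and predictor ranges are invented for illustration) generates observations as a deterministic mean E(y) plus independent normal errors with mean zero and constant variance:

```python
import numpy as np

# Illustrative simulation of the general linear model with k = 2 predictors.
# The coefficients, ranges, and sigma below are made-up values, not from the slides.
rng = np.random.default_rng(0)

beta = np.array([1.0, 2.0, -0.5])    # beta_0, beta_1, beta_2 (hypothetical)
sigma = 0.3                          # standard deviation of the random error

n = 100
x1 = rng.uniform(0, 10, n)
x2 = rng.uniform(0, 10, n)

# Deterministic part: E(y) = beta_0 + beta_1*x1 + beta_2*x2
mean_y = beta[0] + beta[1] * x1 + beta[2] * x2

# Observed y deviates from E(y) by independent N(0, sigma^2) errors
y = mean_y + rng.normal(0, sigma, n)
```

Averaged over many observations, the deviations y − E(y) are close to zero, which is exactly the mean-zero error assumption.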
The Method of Least Squares

Data: n observations of the response y and the independent variables x1, ..., xk.

The best-fitting prediction equation is

ŷ = b0 + b1 x1 + b2 x2 + ... + bk xk

We choose our estimates b0, ..., bk to minimize

SSE = Σ (y − b0 − b1 x1 − b2 x2 − ... − bk xk)²

The computation is usually done by a computer.
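As a sketch of what that computer computation looks like, the example below fits b0, ..., bk by least squares with NumPy's `lstsq`. The data are simulated from a known model (the true coefficients are invented for this demonstration) so the estimates can be checked against the truth:

```python
import numpy as np

# Fit b_0, b_1, b_2 by minimizing SSE, using numpy's least-squares solver.
# Data are simulated from known (made-up) coefficients (1.0, 2.0, -0.5).
rng = np.random.default_rng(1)
n = 200
x1 = rng.uniform(0, 10, n)
x2 = rng.uniform(0, 10, n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(0, 0.3, n)

# Design matrix: a column of ones for b_0, then one column per predictor
X = np.column_stack([np.ones(n), x1, x2])

# lstsq returns the minimizing coefficients and the residual sum of squares (SSE)
b, sse, rank, _ = np.linalg.lstsq(X, y, rcond=None)

# Prediction equation: y-hat = b0 + b1*x1 + b2*x2
y_hat = X @ b
```

With n = 200 observations and a small error variance, the estimates b land close to the true coefficients (1.0, 2.0, −0.5), and `sse` equals Σ(y − ŷ)² for the fitted line.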
Example

A data set contains the college GPA (y), high school GPA (x1), and study time (x2) of 59 randomly selected students. We are interested in regressing college GPA on high school GPA and study time. The fitted regression line has the form ŷ = b0 + b1 x1 + b2 x2.
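The original data and fitted coefficients are not reproduced here, so the sketch below uses invented GPA-like data of the same shape (59 students, two predictors) to show how such a regression is fit and then used to predict a new student's college GPA:

```python
import numpy as np

# Hypothetical version of the GPA example: all values below are invented
# for illustration, not taken from the actual 59-student data set.
rng = np.random.default_rng(2)
n = 59
hs_gpa = rng.uniform(2.0, 4.0, n)     # x1: high school GPA
study = rng.uniform(0, 30, n)         # x2: weekly study time (hours)
col_gpa = 0.5 + 0.7 * hs_gpa + 0.02 * study + rng.normal(0, 0.2, n)

# Fit y-hat = b0 + b1*x1 + b2*x2 by least squares
X = np.column_stack([np.ones(n), hs_gpa, study])
b, *_ = np.linalg.lstsq(X, col_gpa, rcond=None)

# Predict college GPA for a new student: HS GPA 3.5, 15 hours of study per week
new_student = np.array([1.0, 3.5, 15.0])
pred = float(new_student @ b)
```

This is the "predict y using x1, ..., xk" objective from the introduction: once b0, b1, b2 are estimated, any new (x1, x2) pair plugs straight into the prediction equation.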

This note was uploaded on 11/05/2011 for the course BMGT 220 taught by Professor Bulmash during the Spring '08 term at Maryland.
