
# ISYE6414 Summer 2010 Lecture 5: The Linear Model: Multiple Regression


Dr. Kobi Abayomi, July 6, 2010

## 1 Introduction

Multiple regression is the extension of simple linear regression from 1 to k predictors. The response variable (of interest) is modeled as a conditional expected value given the sample, as in OLS; the estimating equations are generally the multivariate versions (linear estimators) of the minimization of the sum of squared deviations; and the estimators themselves have properties which are multivariate extensions of the properties we are already familiar with from OLS. God is in the details[^1], however, and the presence of multiple predictors introduces manifold entanglements and a wider domain of assumptions that careful statisticians (like yourselves) must be sure to address.

*Sotto voce*: We aren't covering Chapter 5 in KNNL explicitly; you are expected to know the matrix algebra on your own.

## 2 Multiple Regression

In multiple regression, several predictors are used to model a single response variable. For each of n cases observed, values for the response (the y variable) and for each of the predictors (the x variables) are collected. The data will form an n × (k + 1) array, or matrix.

[^1]: Look up Ludwig Mies van der Rohe.
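The n × (k + 1) layout can be sketched in code. A minimal illustration (NumPy assumed; the numbers here are made up, not from the lecture): the response column sits alongside the k predictor columns, and prepending a column of ones for the intercept gives the design matrix.

```python
import numpy as np

# A toy data set: n = 4 cases, k = 2 predictors.
# (Values are made up purely for illustration.)
y = np.array([3.1, 4.0, 5.2, 6.1])            # the response column
X = np.array([[1.0, 0.5],
              [2.0, 0.7],
              [3.0, 0.9],
              [4.0, 1.1]])                     # column j holds predictor X_j

n, k = X.shape
# Prepending a column of ones (for the intercept beta_0) gives the
# n x (k + 1) design matrix used in the estimating equations.
design = np.column_stack([np.ones(n), X])
print(design.shape)  # (4, 3), i.e. n x (k + 1)
```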

$$
\begin{array}{c|ccccc}
i & Y & X_1 & X_2 & \cdots & X_k \\
\hline
1 & y_1 & x_{11} & x_{12} & \cdots & x_{1k} \\
2 & y_2 & x_{21} & x_{22} & \cdots & x_{2k} \\
\vdots & \vdots & \vdots & \vdots & & \vdots \\
n & y_n & x_{n1} & x_{n2} & \cdots & x_{nk}
\end{array}
$$

Figure 1: A matrix, or array, of observations for a linear regression model: n observations and k predictors.

Data presented in this format will yield estimates for a linear regression model. The model is specified by a linear equation

$$
Y = \beta_0 + \sum_{j=1}^{k} \beta_j X_j + \varepsilon
$$

where, as before, the $\beta$'s are unknown parameters, and $\varepsilon$ is an error term with $E(\varepsilon) = 0$ and $\mathrm{Var}(\varepsilon) = \sigma^2$. $Y$ is the response variable and $X_1, \ldots, X_k$ are the *predictors*. When $k = 2$ the equation

$$
Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2
$$

gives the equation of a plane: a two-dimensional surface.

Figure 2: Graph of the regression plane $y = 5 + 5x_1 + 7x_2$. This is a natural extension of a simple linear regression model. When $x_2$ is held constant, the surface restricted to two dimensions is the regression line $y = 5 + 5x_1$ (shifted by the constant $7x_2$); when $x_1$ is held constant, it is the line $y = 5 + 7x_2$ (shifted by the constant $5x_1$).
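A quick sketch of the "hold one predictor constant" reading of Figure 2 (the plane and its coefficients are taken from the figure; the evaluation points are arbitrary):

```python
# The regression plane of Figure 2, y = 5 + 5*x1 + 7*x2, as a function.
def plane(x1, x2):
    return 5 + 5 * x1 + 7 * x2

# Holding x2 fixed at 0 leaves the line y = 5 + 5*x1:
print(plane(2, 0))  # 5 + 5*2 = 15
# Holding x1 fixed at 0 leaves the line y = 5 + 7*x2:
print(plane(0, 3))  # 5 + 7*3 = 26
```

Fixing either predictor at any other constant shifts the intercept of the resulting line but not its slope.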
and

$$
Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \varepsilon
$$

includes the error term (with $E(\varepsilon) = 0$ and $\mathrm{Var}(\varepsilon) = \sigma^2 = \sigma_\varepsilon^2$) and yields the full probabilistic model. In general, it is often desirable for some predictors to be mathematical functions of others, in the sense that the resulting model may be much more successful in explaining variation than any model without such predictors.

Figure 3: Fig 3.a: graph of the regression surface $y = 5 + 5x_1^2 + 7x_2^2$. Fig 3.b: graph of the regression surface $y = 0 + \sin(x_1) + \cos(x_1)$. Each, though a nonlinear surface, can be fit with a regression equation. In the first model, let $x_1^2 = x_1'$ and $x_2^2 = x_2'$. In the second model, let $\sin(x_1) = x_1'$ and $\cos(x_1) = x_2'$.
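The substitution idea for the first model in Figure 3 can be sketched numerically (NumPy assumed; the data are simulated for illustration, not from the lecture): treating the squared predictors as new columns makes the model linear in the betas, so ordinary least squares applies unchanged.

```python
import numpy as np

# Sketch: fit y = 5 + 5*x1^2 + 7*x2^2 with a *linear* regression by
# substituting x1' = x1^2 and x2' = x2^2 (simulated, illustrative data).
rng = np.random.default_rng(1)
x1 = rng.uniform(-2, 2, size=100)
x2 = rng.uniform(-2, 2, size=100)
y = 5 + 5 * x1**2 + 7 * x2**2 + rng.normal(scale=0.5, size=100)

# The model is linear in the betas once the squared terms are treated
# as new predictors in the design matrix.
design = np.column_stack([np.ones_like(x1), x1**2, x2**2])
beta_hat, *_ = np.linalg.lstsq(design, y, rcond=None)
print(beta_hat)  # estimates should be close to [5, 5, 7]
```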


This note was uploaded on 09/01/2011 for the course ISYE 6414 taught by Professor Staff during the Fall '08 term at Georgia Tech.
