Lecture 15: Extensions of the Linear Model: Multiple Regression, Non-Linear Regression, Regression Diagnostics

Dr. Kobi Abayomi

April 13, 2009

1 Introduction: Multiple Regression Modeling

In multiple regression, several predictors are used to model a single response variable. For each of the n cases observed, values are collected for the response (the y variable) and for each of the predictors (the x variables). The data form an n × (k + 1) array:

    i    Y     X₁     X₂     …     Xₖ
    1    y₁    x₁₁    x₁₂    …     x₁ₖ
    2    y₂    x₂₁    x₂₂    …     x₂ₖ
    ⋮    ⋮     ⋮      ⋮            ⋮
    n    yₙ    xₙ₁    xₙ₂    …     xₙₖ

Figure 1: A matrix, or array, of observations for a linear regression model: n observations and k predictors. Data presented in this format will yield estimates for a linear regression model.

1.1 Extension of the Simple Linear Regression Model

The model is specified by a linear equation

    Y = β₀ + Σⱼ₌₁ᵏ βⱼ Xⱼ + ε

where, as before, the β's are unknown parameters, and ε is an error term with E(ε) = 0 and Var(ε) = σ². Y is the response variable and X₁, …, Xₖ are the predictors. When k = 2 the equation

    Y = β₀ + β₁X₁ + β₂X₂

gives the equation of a two-dimensional plane, or surface.

Figure 2: Graph of the regression plane y = 5 + 5x₁ + 7x₂.

This is a natural extension of the simple linear regression model. When x₂ is held constant, the surface reduces in two dimensions to the regression line y = 5 + 5x₁ (shifted up by the constant 7x₂); when x₁ is held constant, it reduces to the line y = 5 + 7x₂. Adding the error term,

    Y = β₀ + β₁X₁ + β₂X₂ + ε

with E(ε) = 0 and Var(ε) = σ², yields the full probabilistic model.

In general it is often desirable for some predictors to be mathematical functions of others, in the sense that the resulting model may be much more successful in explaining variation than any model without such predictors.

Figure 3: (a) Graph of the regression surface y = 5 + 5x₁² + 7x₂². (b) Graph of the regression surface y = sin(x₁) + cos(x₁).

Each, though a non-linear surface, can be fit with a regression equation. In the first model, let x₁* = x₁² and x₂* = x₂²; in the second, let x₁* = sin(x₁) and x₂* = cos(x₁). The point here is that the linear model, when extended to many (possibly transformed) variables, is extremely flexible. The "linear" in linear regression refers to linearity in the parameters: the response y is estimated as a linear combination of the predictors, which may themselves be transformations of the observed data. The transformed variables are simply arranged in an array such as the one given in Figure 1.

For the case of two independent variables x₁, x₂, four useful regression models are:

1. The first-order model:

    Y = β₀ + β₁x₁ + β₂x₂ + ε

2. The second-order, no-interaction model:

    Y = β₀ + β₁x₁ + β₂x₂ + β₃x₁² + β₄x₂² + ε

...
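As a concrete sketch of the ideas above (not part of the original lecture — the simulated data, coefficient values, and variable names are illustrative assumptions), both the first-order plane of Figure 2 and the second-order surface of Figure 3a can be fit by ordinary least squares with numpy. The non-linear surface is handled by the substitution trick just described: square the predictors first, then fit a model that is still linear in the β's.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical simulated data -- the lecture supplies no dataset.
x1 = rng.uniform(-3, 3, n)
x2 = rng.uniform(-3, 3, n)
eps = rng.normal(0, 1.0, n)   # error term: E(eps) = 0, Var(eps) = sigma^2

# First-order model: the plane of Figure 2, y = 5 + 5*x1 + 7*x2.
y_plane = 5 + 5 * x1 + 7 * x2 + eps
X = np.column_stack([np.ones(n), x1, x2])        # design matrix [1, x1, x2]
beta_plane, *_ = np.linalg.lstsq(X, y_plane, rcond=None)
print(beta_plane)                                # approximately [5, 5, 7]

# Second-order, no-interaction model: the surface of Figure 3a.
# y = 5 + 5*x1^2 + 7*x2^2 is non-linear in x1, x2 but linear in the betas,
# so substitute x1* = x1^2, x2* = x2^2 and fit exactly as before.
y_surf = 5 + 5 * x1**2 + 7 * x2**2 + eps
X_star = np.column_stack([np.ones(n), x1**2, x2**2])
beta_surf, *_ = np.linalg.lstsq(X_star, y_surf, rcond=None)
print(beta_surf)                                 # approximately [5, 5, 7]
```

np.linalg.lstsq returns the least-squares coefficient estimates directly from the design matrix; a statistics package would additionally report standard errors and the diagnostics discussed later in the lecture.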
This note was uploaded on 11/08/2009 for the course ISYE 2028 taught by Professor Shim during the Spring '07 term at Georgia Institute of Technology.
