Applied Probability Methods for Engineers
Slide Set 3
Chapter 12: Simple Linear Regression and Correlation

Simple Linear Regression
- Used when we believe some random variable is dependent on some other factor and is a linear function of that factor
- We consider a random variable Yᵢ that depends on the value of an independent variable xᵢ
- Assume that an observation yᵢ is the sum of a linear function of xᵢ and an error term εᵢ, i.e., yᵢ = β₀ + β₁xᵢ + εᵢ
- The error terms are generally assumed to be iid normal with zero mean and variance σ²
- The observations y₁, …, yₙ are therefore observations of independent random variables Yᵢ ~ N(β₀ + β₁xᵢ, σ²)
- The expected value of Yᵢ equals β₀ + β₁xᵢ

Simple Linear Regression (continued)
- y is the dependent variable and x is the explanatory variable
- β₀ is the intercept parameter and β₁ is the slope parameter
- The slope β₁ determines how the expected value of y changes as a function of x
- β₁ = 0 implies y and x are unrelated
- β₀, β₁, and σ² are typically estimated from a data set
- First look at a graph of the data to make sure linearity is a reasonable assumption

Example Data Set
- Factory electricity usage as a function of production
[slide figure: scatter plot of the example data]

Fitting a Regression Line
- How do we take a data set and fit the best line to it?
- Consider the vertical deviations of the points from the line
- Given the data set of (xᵢ, yᵢ) values, we want the β₀ and β₁ values that minimize the sum of squared deviations from the line
- The vertical error is εᵢ = yᵢ − (β₀ + β₁xᵢ)
- Why not minimize the sum of the errors?
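The model above can be made concrete with a short simulation. This is an illustrative sketch, not from the slides: the parameter values β₀ = 2.0, β₁ = 0.5, and σ = 1.0 are arbitrary choices, and the function name `simulate` is mine.

```python
import random

def simulate(xs, b0, b1, sigma, seed=0):
    """Draw one observation y_i = b0 + b1*x_i + e_i for each x_i,
    with iid errors e_i ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    return [b0 + b1 * x + rng.gauss(0.0, sigma) for x in xs]

# Hypothetical parameter values, chosen only for illustration.
xs = list(range(1, 21))
ys = simulate(xs, b0=2.0, b1=0.5, sigma=1.0)

# Each y_i is a draw from N(b0 + b1*x_i, sigma^2), so the realized
# errors should average out to roughly zero.
errs = [y - (2.0 + 0.5 * x) for x, y in zip(xs, ys)]
```

Plotting `ys` against `xs` at this point is exactly the "look at a graph of the data first" step the slides recommend.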
- Min Σᵢ εᵢ = Min Σᵢ {yᵢ − (β₀ + β₁xᵢ)} = Max Σᵢ (β₀ + β₁xᵢ)
- Since we don't have sign restrictions on β₀ and β₁, the "optimal" solution is β₀ = β₁ = ∞ (assuming the xᵢ are nonnegative)
- This is a meaningless result that essentially interprets negative errors as a good thing

Fitting a Regression Line (continued)
- We need to treat negative and positive errors as equally bad
- Min Σᵢ εᵢ² = Min Q = Σᵢ {yᵢ − (β₀ + β₁xᵢ)}²
- Setting the partial derivatives of Q to zero:
  ∂Q/∂β₀ = −2 Σᵢ₌₁ⁿ (yᵢ − β₀ − β₁xᵢ) = 0
  ∂Q/∂β₁ = −2 Σᵢ₌₁ⁿ xᵢ(yᵢ − β₀ − β₁xᵢ) = 0
- These yield the normal equations:
  Σᵢ₌₁ⁿ yᵢ = nβ₀ + β₁ Σᵢ₌₁ⁿ xᵢ
  Σᵢ₌₁ⁿ xᵢyᵢ = β₀ Σᵢ₌₁ⁿ xᵢ + β₁ Σᵢ₌₁ⁿ xᵢ²

Side Note
- Is Q a convex function of β₀ and β₁?
- Q is jointly convex in β₀ and β₁ if a ≥ 0, c ≥ 0, and ac − b² ≥ 0, where
  a = ∂²Q/∂β₀² = 2n
  c = ∂²Q/∂β₁² = 2 Σᵢ₌₁ⁿ xᵢ²
  b = ∂²Q/∂β₀∂β₁ = 2 Σᵢ₌₁ⁿ xᵢ
- Then ac − b² = 4n Σᵢ₌₁ⁿ xᵢ² − 4(Σᵢ₌₁ⁿ xᵢ)² ≥ 0 by the Cauchy–Schwarz inequality, so Q is jointly convex and the stationary point is a global minimum

Back to the Fitted Line
- Solve the two normal equations simultaneously:
  Σᵢ₌₁ⁿ yᵢ = nβ₀ + β₁ Σᵢ₌₁ⁿ xᵢ
  Σᵢ₌₁ⁿ xᵢyᵢ = β₀ Σᵢ₌₁ⁿ xᵢ + β₁ Σᵢ₌₁ⁿ xᵢ²
- The first equation gives β₀ = (1/n) Σᵢ₌₁ⁿ yᵢ − β₁ (1/n) Σᵢ₌₁ⁿ xᵢ = ȳ − β₁x̄
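The normal equations above can be solved in closed form, which the following sketch implements. Variable names (`fit_line`, `sx`, `sxy`, etc.) are mine, not the slides'; the algebra is the standard elimination of β₀ from the two equations.

```python
def fit_line(xs, ys):
    """Least-squares estimates (b0, b1) from the two normal equations:
       sum(y)  = n*b0   + b1*sum(x)
       sum(xy) = b0*sum(x) + b1*sum(x^2)"""
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    # Eliminating b0 gives the slope; the Cauchy-Schwarz argument in the
    # Side Note guarantees the denominator is nonnegative (positive unless
    # all x_i are equal).
    b1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    # From the first normal equation: b0 = ybar - b1 * xbar.
    b0 = sy / n - b1 * sx / n
    return b0, b1

# With noiseless data on the line y = 1 + 2x, the exact line is recovered.
b0, b1 = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
```

Running the fit on noisy simulated data instead would return estimates close to, but not exactly equal to, the true β₀ and β₁.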
Spring '07, JosephGeunes