Section 4 - Econ 140
GSI: Hedvig, Tarso, Xiaoyu

1 Review - Linear regression with one regressor

1.1 The OLS estimator, predicted values, and residuals

The OLS estimators of the slope $\beta_1$ and the intercept $\beta_0$ are
\[
\hat{\beta}_1 = \frac{\sum_{i=1}^{N} (Y_i - \bar{Y})(X_i - \bar{X})}{\sum_{i=1}^{N} (X_i - \bar{X})^2} = \frac{s_{XY}}{s_X^2},
\qquad
\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}.
\]
The OLS predicted values $\hat{Y}_i$ and residuals $\hat{u}_i$ are
\[
\hat{Y}_i = \hat{\beta}_0 + \hat{\beta}_1 X_i, \qquad
\hat{u}_i = Y_i - \hat{Y}_i, \qquad i = 1, \ldots, N.
\]
The estimated intercept $\hat{\beta}_0$, slope $\hat{\beta}_1$, and residuals $\hat{u}_i$ are computed from a sample of $N$ observations of $X_i$ and $Y_i$, $i = 1, \ldots, N$. They are estimates of the unknown true population intercept ($\beta_0$), slope ($\beta_1$), and error term ($u_i$), respectively.

1.2 Measures of Fit

For the regression above, the following measures of fit are defined.

$R^2$: the fraction of the sample variance of $Y_i$ explained by (or predicted by) $X_i$. It is a scale-free measure and can be written as
\[
R^2 = \frac{ESS}{TSS} = 1 - \frac{SSR}{TSS},
\]
where
\[
ESS = \sum_{i=1}^{N} (\hat{Y}_i - \bar{Y})^2 \quad \text{(explained sum of squares)},
\]
\[
SSR = \sum_{i=1}^{N} (Y_i - \hat{Y}_i)^2 = \sum_{i=1}^{N} \hat{u}_i^2 \quad \text{(sum of squared residuals)},
\]
\[
TSS = \sum_{i=1}^{N} (Y_i - \bar{Y})^2 \quad \text{(total sum of squares)}.
\]
The two expressions for $R^2$ are equivalent because $TSS = ESS + SSR$.
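To make the formulas concrete, the following is a minimal numerical sketch in Python (using NumPy); it is not part of the original handout, and the sample data and variable names are made up for illustration. It computes the OLS slope and intercept, the fitted values and residuals, and the measures of fit for a small sample, then checks that the two expressions for $R^2$ agree.

```python
# Illustrative sketch of the OLS formulas above (hypothetical data, not from the handout).
import numpy as np

# Hypothetical sample of N observations of X_i and Y_i
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.9])
N = len(y)

x_bar, y_bar = x.mean(), y.mean()

# OLS estimators: beta1_hat = s_XY / s_X^2, beta0_hat = Ybar - beta1_hat * Xbar
beta1_hat = np.sum((y - y_bar) * (x - x_bar)) / np.sum((x - x_bar) ** 2)
beta0_hat = y_bar - beta1_hat * x_bar

# Predicted values and residuals
y_hat = beta0_hat + beta1_hat * x
u_hat = y - y_hat

# Measures of fit
ESS = np.sum((y_hat - y_bar) ** 2)   # explained sum of squares
SSR = np.sum(u_hat ** 2)             # sum of squared residuals
TSS = np.sum((y - y_bar) ** 2)       # total sum of squares
R2 = ESS / TSS                       # equals 1 - SSR / TSS since TSS = ESS + SSR

print(f"beta0_hat = {beta0_hat:.3f}, beta1_hat = {beta1_hat:.3f}")
print(f"R^2 = {R2:.3f}, 1 - SSR/TSS = {1 - SSR / TSS:.3f}")
```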