Chapter 4.3

The regression R^2 is the fraction of the sample variance of Yi explained by (or predicted by) Xi. The dependent variable decomposes into the fitted value and the residual:

  Yi = Y^i + u^i

R^2 is the ratio of the sample variance of Y^i to the sample variance of Yi:

  R^2 = ESS / TSS

ESS (explained sum of squares): the sum of squared deviations of the predicted values Y^i from their average.
TSS (total sum of squares): the sum of squared deviations of Yi from its average.

R^2 can also be written in terms of SSR (sum of squared residuals), the sum of the squared OLS residuals:

  R^2 = 1 - SSR / TSS

SER: the standard error of the regression is an estimator of the standard deviation of the regression error ui.

The least squares assumptions:
Assumption 1: the conditional distribution of ui given Xi has a mean of zero. That is, the distribution of ui, conditional on Xi = x, has a mean of zero. Stated mathematically, E(ui | Xi = x) = 0, or in somewhat simpler notation, E(ui | Xi) = 0. This is equivalent to assuming that the population regression line is the conditional mean of Yi given Xi.
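The definitions above can be checked numerically. The sketch below (not from the notes; the data are made up for illustration) fits a one-regressor OLS line by hand and verifies that R^2 computed as ESS/TSS equals 1 - SSR/TSS, then computes the SER:

```python
import numpy as np

# Hypothetical sample data for illustration
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# OLS slope and intercept for Yi = b0 + b1*Xi + ui
beta1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta0 = y.mean() - beta1 * x.mean()

y_hat = beta0 + beta1 * x   # predicted values Y^i
u_hat = y - y_hat           # OLS residuals u^i

ESS = np.sum((y_hat - y_hat.mean()) ** 2)  # explained sum of squares
TSS = np.sum((y - y.mean()) ** 2)          # total sum of squares
SSR = np.sum(u_hat ** 2)                   # sum of squared residuals

r2_a = ESS / TSS        # R^2 as variance of fitted values over variance of Y
r2_b = 1 - SSR / TSS    # equivalent form using the residuals

# SER: estimator of the std. dev. of the regression error,
# with n - 2 degrees of freedom (intercept plus one regressor)
n = len(y)
SER = np.sqrt(SSR / (n - 2))

print(r2_a, r2_b, SER)
```

With an intercept in the regression, the two formulas for R^2 agree exactly, because the OLS residuals are uncorrelated with the fitted values, so TSS = ESS + SSR.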
This note was uploaded on 04/07/2011 for the course ECONOMICS 300 taught by Professor Sani during the Spring '11 term at Rutgers.