# Regression Modeling and Analysis (ECO6416)
Many problems in analyzing data involve describing how variables are related. The simplest model of the relationship between two variables is a linear, or straight-line, model. Linear regression is always linear in the coefficients being estimated, not necessarily in the variables themselves.

The crudest way to draw a linear model is to "eye-ball" a line through the data on a plot, but the more rigorous and conventional method is least squares, which finds the line minimizing the sum of squared vertical distances between the observed points and the fitted line. Realize that fitting the "best" line by eye is difficult, especially when there is much residual variability in the data.

Know that there is a simple connection between the numerical coefficients in the regression equation and the slope and intercept of the regression line. Know also that a single summary statistic, such as a correlation coefficient, does not tell the whole story; a scatterplot is an essential complement when examining the relationship between the two variables.

The regression line is a set of estimates for the variable plotted on the Y-axis. It has the form y = b + mx, where m is the slope of the line. The slope is the rise over the run: if a line goes up 2 for each 1 it goes over, then its slope is 2. The regression line passes through the point (mean of the x values, mean of the y values), known as the mean-mean point.

If you plug each x into the regression equation, you obtain a predicted value for y. The difference between the predicted y and the observed y is called a residual, or an error term. Some errors are positive and some are negative. Partitioning the three sums of squares: the sum of squares of the errors plus the sum of squares of the estimates (each measured about the mean) adds up to the total sum of squares of Y. The regression line is the line that minimizes the variance of the errors.
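The ideas above can be sketched in a few lines of code. This is a minimal illustration using only the Python standard library; the data and the names (`xs`, `ys`, `fit_line`) are invented for the example, not taken from the course notes. It fits the least-squares line, confirms it passes through the mean-mean point, and verifies that the sums of squares partition as described.

```python
# Illustrative sketch of simple least-squares regression (hypothetical data).
from statistics import mean

def fit_line(xs, ys):
    """Return (intercept b, slope m) of the least-squares line y = b + m*x."""
    x_bar, y_bar = mean(xs), mean(ys)
    # slope = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
    m = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
        sum((x - x_bar) ** 2 for x in xs)
    b = y_bar - m * x_bar   # forces the line through the mean-mean point
    return b, m

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]
b, m = fit_line(xs, ys)

preds = [b + m * x for x in xs]                 # predicted y for each x
errors = [y - p for y, p in zip(ys, preds)]     # residuals (observed - predicted)

y_bar = mean(ys)
sst = sum((y - y_bar) ** 2 for y in ys)         # total sum of squares of Y
ssr = sum((p - y_bar) ** 2 for p in preds)      # sum of squares of the estimates
sse = sum(e ** 2 for e in errors)               # sum of squares of the errors

assert abs(sum(errors)) < 1e-9                  # mean error is zero
assert abs(sst - (ssr + sse)) < 1e-9            # SST = SSR + SSE
```

Running this on the sample data gives a slope of 1.95 and an intercept of 0.15, and the two assertions confirm the zero-mean-error property and the sum-of-squares partition numerically.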
Because the mean error is zero, minimizing the variance of the errors is the same as minimizing the sum of squared errors. The reason for finding the best-fitting line is so that you can make a reasonable prediction of what y will be when x is known (not vice versa). The statistic r2 is the variance of the estimates divided by the variance of Y. The statistic r is the slope of the regression line expressed in terms of standard deviations; in other words, it is the slope of the regression line if we use the standardized X and Y. It is how many standard deviations the predicted y changes for each one-standard-deviation change in x.
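These two characterizations of r can also be checked directly. The sketch below (same hypothetical data and naming conventions as above, again using only the standard library) computes r2 as the variance of the estimates over the variance of Y, then recovers r as the slope of the regression fitted to the standardized variables.

```python
# Illustrative check: r^2 = var(estimates)/var(Y), and r = slope on standardized data.
from statistics import mean, pstdev, pvariance

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]

x_bar, y_bar = mean(xs), mean(ys)
m = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
    sum((x - x_bar) ** 2 for x in xs)
b = y_bar - m * x_bar
preds = [b + m * x for x in xs]

# r^2 as the variance of the estimates divided by the variance of Y
r_squared = pvariance(preds) / pvariance(ys)

# Standardize both variables (mean 0, standard deviation 1);
# the least-squares slope on the standardized data equals r.
zx = [(x - x_bar) / pstdev(xs) for x in xs]
zy = [(y - y_bar) / pstdev(ys) for y in ys]
r = sum(a * c for a, c in zip(zx, zy)) / sum(a * a for a in zx)

assert abs(r * r - r_squared) < 1e-9   # the two definitions agree
```

The final assertion confirms that the standardized-slope definition of r squares to the variance-ratio definition of r2, as the notes state.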