Lecture 7: Correlation and Regression

Temple University Ambler
Community and Regional Planning
CRP 410 Planning Methods
Professor Paulsen
October 10, 2005

CORRELATION and REGRESSION

Correlation: Correlation measures the strength of the relationship between variables, or the degree to which two variables are correlated (co-related). Another way to think of it is that it is a measure of the extent to which two variables "move together": as one changes, how does the other one change? The correlation measure is a "dimensionless" number, and can therefore be used to compare "apples" and "oranges," that is, variables measured in different units.

The most commonly used correlation coefficient is the Pearson correlation coefficient, usually labeled r. r ranges from -1 to 1, with negative numbers indicating a negative correlation and positive numbers indicating a positive relationship. A positive (negative) correlation implies that as one variable goes up, the other variable goes up (down). A correlation coefficient of zero indicates that there is no linear relationship between the two variables.

r = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2 \sum_{i=1}^{n}(y_i - \bar{y})^2}}

Pearson's correlation coefficient is a measure of the linear relationship between two variables. However, in the presence of extreme outliers, or in the presence of a non-linear relationship, Pearson's r may not detect a significant correlation. There is a different correlation measure that can be used which does not assume a linear relationship. This non-parametric measure of correlation is Spearman's rank correlation, which the book calls "Spearman's s" but which is sometimes also called Spearman's rho.

To implement Pearson's correlation in Microsoft Excel, the command is "=CORREL". With the StatPlus add-in, you can calculate the p-value with "=CORRELP". In StatPlus, correlation can be accessed under "multivariate analysis: correlation matrix."
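As a concrete illustration (not from the original lecture notes), here is a minimal Python sketch that computes Pearson's r in two ways: directly from the formula above, and with scipy.stats.pearsonr, which also reports a p-value, the role "=CORRELP" plays in StatPlus. The census-tract figures are made up for the example.

    # Minimal sketch: Pearson's r by hand and via scipy (hypothetical data).
    import numpy as np
    from scipy import stats

    # Hypothetical planning data for eight census tracts:
    # median household income (thousands of dollars) and homeownership rate (percent).
    income = np.array([32.0, 41.5, 55.0, 47.2, 60.3, 38.8, 52.1, 44.0])
    ownership = np.array([48.0, 55.0, 71.0, 62.0, 75.0, 50.0, 68.0, 58.0])

    # Direct implementation of the formula: the sum of cross-deviations divided by
    # the square root of the product of the two squared-deviation sums.
    x_dev = income - income.mean()
    y_dev = ownership - ownership.mean()
    r_manual = (x_dev * y_dev).sum() / np.sqrt((x_dev ** 2).sum() * (y_dev ** 2).sum())

    # scipy's version also returns the p-value for the null hypothesis of zero correlation.
    r_scipy, p_value = stats.pearsonr(income, ownership)

    print(f"r (by hand) = {r_manual:.4f}")
    print(f"r (scipy)   = {r_scipy:.4f}, p-value = {p_value:.4f}")

Both calculations give the same r; the p-value answers the same question "=CORRELP" does in the spreadsheet, namely whether the observed correlation could plausibly be zero.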
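To make the contrast between Pearson's r and Spearman's rho concrete, the following sketch (again an illustration, with made-up data) compares the two measures on a relationship that is monotonic but strongly non-linear; scipy.stats.spearmanr stands in for the rank-based measure described above.

    # Minimal sketch: Pearson vs. Spearman on a monotonic, non-linear relationship.
    import numpy as np
    from scipy import stats

    x = np.arange(1, 11, dtype=float)   # 1, 2, ..., 10
    y = x ** 4                          # always increasing, but far from a straight line

    r_pearson, _ = stats.pearsonr(x, y)
    rho_spearman, _ = stats.spearmanr(x, y)

    print(f"Pearson r    = {r_pearson:.3f}")     # noticeably below 1: the relationship is not linear
    print(f"Spearman rho = {rho_spearman:.3f}")  # exactly 1: the ranks move together perfectly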
Regression. Preliminary notes. It used to be common, when "doing" a regression, to simply use the word "regression" because everyone understood what that meant: an ordinary least-squares (OLS) linear regression. Now, of course, given the proliferation of computerized statistical packages, people can calculate a whole range of different types of "regressions." In this class, we will stick with the most basic linear regression, but please be aware that there really is a whole range of types of regressions. The most basic form of regression is the OLS regression.

What is a regression? Informally, it is a line fitted between two variables to estimate the (linear) relationship between the two variables (cf. book, p. 294). In Problem Set 1, we inserted a trend line, which was doing a regression. Or, in the case where we have more than one "predictor" variable, it is a multi-dimensional plane describing the relationship between the variables. One way to think about regression is that it is a way to test the statistical effect of one variable on another.
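As a sketch of the basic OLS regression described here (an illustration only; the variables and numbers are hypothetical, not from Problem Set 1), the code below fits the same kind of trend line with scipy.stats.linregress.

    # Minimal sketch: a simple OLS trend line, rent = intercept + slope * distance.
    import numpy as np
    from scipy import stats

    # Hypothetical data: distance from downtown (miles) and median rent (dollars).
    distance = np.array([1.0, 2.5, 4.0, 5.5, 7.0, 9.0, 11.0, 13.5])
    rent = np.array([1450.0, 1380.0, 1275.0, 1200.0, 1150.0, 1050.0, 980.0, 900.0])

    result = stats.linregress(distance, rent)

    print(f"intercept = {result.intercept:.1f}")    # predicted rent at distance 0
    print(f"slope     = {result.slope:.2f}")        # change in rent for each extra mile
    print(f"r-squared = {result.rvalue ** 2:.3f}")  # share of the variance in rent explained
    print(f"p-value   = {result.pvalue:.4f}")       # test of the null hypothesis that the slope is 0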
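For the case with more than one "predictor" variable mentioned above, the sketch below (hypothetical data again) fits the corresponding plane by ordinary least squares using numpy.linalg.lstsq; a dedicated regression routine would report the same coefficients along with standard errors.

    # Minimal sketch: OLS with two predictors fits a plane rather than a line.
    import numpy as np

    # Hypothetical data: rent explained by distance from downtown (miles)
    # and unit size (hundreds of square feet).
    distance = np.array([1.0, 2.5, 4.0, 5.5, 7.0, 9.0, 11.0, 13.5])
    size = np.array([6.5, 7.0, 8.0, 7.5, 9.0, 8.5, 10.0, 9.5])
    rent = np.array([1450.0, 1380.0, 1320.0, 1240.0, 1260.0, 1150.0, 1130.0, 1010.0])

    # Design matrix: a column of ones for the intercept, then one column per predictor.
    X = np.column_stack([np.ones_like(distance), distance, size])
    coefs, *_ = np.linalg.lstsq(X, rent, rcond=None)
    intercept, b_distance, b_size = coefs

    print(f"rent = {intercept:.0f} + ({b_distance:.1f})*distance + ({b_size:.1f})*size")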