Review: Correlation vs. Regression

What are the main differences between correlation and regression? What are the data requirements for each? What are their principal vulnerabilities? How do we establish causality?

Pearson correlation: a measure of linear association between two quantitative variables.

Data requirements
- A probability sample, if the analysis will be inferential as opposed to descriptive.
- For OLS regression, the outcome variable must be quantitative (interval or ratio); the explanatory variables may be quantitative or categorical (nominal or ordinal).

What are the disadvantages of using correlation to study the relationships between two or more variables? See Moore/McCabe, chapter 2.

Hypothesis tests for correlation
Ho: ρxy = 0    Ha: ρxy ≠ 0
We can use Pearson correlation not only descriptively but also inferentially. To use it inferentially, first use a scatterplot to check the bivariate relationship for linearity. If the relationship is sufficiently linear, proceed with the test.

In Stata: pwcorr vs. corr
- correlate (corr) uses listwise, or casewise, deletion: any observation (i.e., individual or case) for which any of the correlated variables has missing data is not used, so corr uses only observations with complete data on the examined variables. If, for the relationship between math and reading scores, observation #27 has, say, a missing math score, then corr (like regress) automatically drops observation #27. Because this is how regression handles missing data, corr corresponds to regression. However, corr does not permit hypothesis tests.
- pwcorr uses pairwise deletion: it uses all of the non-missing observations for each pair of variables (e.g., it would use observation #27's reading score even though #27's math score is missing). This does not correspond to the way regression works. However, pwcorr does permit hypothesis tests.
- Note: there is a way to use pwcorr so that, like regression analysis, it is based on casewise (i.e., listwise) deletion of missing observations. We'll demonstrate this later (a possible approach is sketched below).
- Use a Bonferroni or other multiple-test adjustment when simultaneously testing multiple correlation hypotheses:
  . pwcorr read write math science socst, obs sig star(.05) bonf
  Why is the multiple-test adjustment important?
- If the data have no missing values, then there's no problem using pwcorr.

Contingency Table vs. Pearson Correlation
What if the premises of parametric statistics don't hold?
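A minimal Stata sketch of the two commands compared above, assuming the variables read, write, math, science, and socst from the note's own pwcorr example are already loaded in memory (the specific dataset is an assumption here, not stated in the note):

    * Listwise (casewise) deletion: only observations complete on all
    * listed variables are used, matching how regress handles missing
    * data; no significance tests are reported.
    correlate read write math science socst

    * Pairwise deletion: each correlation uses all non-missing pairs.
    * obs prints the N for each pair, sig prints p-values,
    * star(.05) flags correlations significant at the .05 level,
    * and bonferroni adjusts the p-values for multiple testing.
    pwcorr read write math science socst, obs sig star(.05) bonferroni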
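The note defers its demonstration of listwise deletion with pwcorr. One common way to obtain that behavior, sketched here as an assumption rather than necessarily the instructor's method, is to restrict the estimation sample to complete cases with an if qualifier:

    * Keep only observations non-missing on every listed variable, so
    * pwcorr's sample matches what corr or regress would use, while
    * still reporting significance tests.
    pwcorr read write math science socst if !missing(read, write, math, science, socst), obs sig

When the data have no missing values, the pairwise and listwise samples coincide, which is why the note says pwcorr poses no problem in that case.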