Review: Correlation vs. Regression

What are the main differences between correlation and regression? What are the data requirements for each? What are their principal vulnerabilities? How do we establish causality?

Pearson correlation measures the linear association between two quantitative variables.

Data Requirements

A probability sample, if the analysis will be inferential as opposed to descriptive. For OLS regression, the outcome variable must be quantitative (interval or ratio); the explanatory variables may be quantitative or categorical (nominal or ordinal).

What are the disadvantages of using correlation to study the relationships between two or more variables? See Moore/McCabe, chapter 2.

Hypothesis tests for correlation

We can use Pearson correlation not only descriptively but also inferentially. To use it inferentially, first use a scatterplot to check the bivariate relationship for linearity. If the relationship is sufficiently linear, test:

    Ho: ρxy = 0    versus    Ha: ρxy ≠ 0

In Stata: pwcorr vs. corr

correlate (corr) uses listwise, or casewise, deletion: any observation (i.e., any individual or case) for which any of the correlated variables has missing data is not used. That is, corr uses only observations with complete data on the examined variables. If, for the relationship between math and reading scores, observation #27 has, say, a missing math score, then corr or regress will automatically drop observation #27. This is how regression works, so corr corresponds to regression. However, corr does not permit hypothesis tests.

pwcorr uses pairwise deletion: for each pair of variables, it uses all of the nonmissing observations (e.g., it would use observation #27's reading score, even though #27's math score is missing). This does not correspond to the way that regression works. However, pwcorr does permit hypothesis tests.

Note: There is a way to use pwcorr so that, like regression analysis, it is based on casewise (i.e., listwise) deletion of missing observations. We'll demonstrate this later.
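The listwise-vs-pairwise distinction above can be sketched outside Stata as well. The following is a minimal illustration in Python (pandas/scipy), using made-up test-score data in which one observation has a missing math score, analogous to the hypothetical observation #27. The variable names and values are invented for illustration; this is not the course's Stata output.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical score data; the third observation has a missing math score.
df = pd.DataFrame({
    "read":    [1.0, 2.0, 3.0,    4.0, 5.0,  6.0],
    "math":    [2.0, 4.0, np.nan, 8.0, 10.0, 12.0],
    "science": [1.0, 2.0, 10.0,   4.0, 5.0,  6.0],
})

# Listwise (casewise) deletion, like Stata's `corr`: drop every row that
# is missing ANY of the examined variables.
listwise = df.dropna()                                      # keeps 5 of 6 rows

# Pairwise deletion, like Stata's `pwcorr`: each pair of variables uses
# all observations that are nonmissing for THAT pair.
# (pandas .corr() between two Series is pairwise by default.)
r_rm_listwise = listwise["read"].corr(listwise["math"])
r_rm_pairwise = df["read"].corr(df["math"])                 # same pairs here

# For read vs. science, neither variable is missing, so pairwise deletion
# keeps all 6 rows, while listwise deletion still drops the row with the
# missing math score -- the two approaches give different correlations.
r_rs_listwise = listwise["read"].corr(listwise["science"])
r_rs_pairwise = df["read"].corr(df["science"])

# Inferential use (what pwcorr's sig option reports): a test of
# Ho: rho = 0 for read vs. science, using all complete pairs.
r, p = stats.pearsonr(df["read"], df["science"])
```

Because the row with the missing math score is also an outlier on science, the read-science correlation differs depending on which deletion rule is used, which is exactly why corr and pwcorr can disagree.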
Use a Bonferroni or other multiple-test adjustment when simultaneously testing multiple correlation hypotheses:

    . pwcorr read write math science socst, obs sig star(.05) bonf

Why is the multiple-test adjustment important? If the data have no missing values, then there's no problem using pwcorr.

Contingency Table vs. Pearson Correlation

What if the premises of parametric statistics don't hold?
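Why the adjustment matters: testing all pairwise correlations among five variables means 10 simultaneous tests, so the chance of at least one false rejection at the .05 level is far above .05. A minimal sketch of the Bonferroni idea (each p-value multiplied by the number of tests, capped at 1, which is what Stata's bonf option does) in Python, with simulated data standing in for the course's score variables:

```python
from itertools import combinations

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated stand-ins for the five score variables (no real course data).
names = ["read", "write", "math", "science", "socst"]
data = {name: rng.normal(50, 10, size=200) for name in names}

pairs = list(combinations(names, 2))
k = len(pairs)                       # 10 tests among 5 variables

results = {}
for a, b in pairs:
    r, p = stats.pearsonr(data[a], data[b])
    p_bonf = min(p * k, 1.0)         # Bonferroni: inflate each p-value by k
    results[(a, b)] = (r, p, p_bonf)
```

Each adjusted p-value is at least as large as its raw p-value, so a correlation must clear a much stricter bar (.05 / 10 = .005 on the raw scale) to be flagged as significant.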
This note was uploaded on 07/11/2011 for the course SYA 6305 taught by Professor Tardanico during the Fall '08 term at FIU.