o Total Sum of Squares (SST) – measures the squared difference b/w each data point and the average of the data points
  - Larger differences b/w the data points = larger SST
o Divide SST into its two parts: SST = SSR + SSE
o Sum of Squares Error (SSE) – measures the variation in the dependent variable that is explained by variables other than the independent variable
  SSE = ∑y² − b∑y − m∑xy
  - where b = y-intercept
  - m = slope
  In our example → SSE = 74.62
  (See the code sketch below for how SST, SSE, and SSR fit together.)
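The decomposition SST = SSR + SSE can be checked numerically. Below is a minimal Python sketch using made-up data (not the notes' study-hours example); the names x, y, m, and b are just illustrative.

```python
# Minimal sketch with made-up data (not the notes' example): partitioning
# the total variation in y into SSR (explained by x) and SSE (unexplained).
import numpy as np

x = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])        # independent variable (illustrative)
y = np.array([65.0, 68.0, 74.0, 77.0, 83.0, 88.0])  # dependent variable (illustrative)

m, b = np.polyfit(x, y, 1)          # m = slope, b = y-intercept of the fitted line
y_hat = m * x + b                   # predicted values of y
y_bar = y.mean()                    # average of the dependent variable

SST = np.sum((y - y_bar) ** 2)      # total variation in y
SSE = np.sum((y - y_hat) ** 2)      # variation NOT explained by x
SSR = np.sum((y_hat - y_bar) ** 2)  # variation explained by x

# The shortcut formula from the notes gives the same SSE:
SSE_shortcut = np.sum(y ** 2) - b * np.sum(y) - m * np.sum(x * y)

print(round(SST, 4), round(SSR + SSE, 4))     # SST = SSR + SSE
print(round(SSE, 4), round(SSE_shortcut, 4))  # the two SSE calculations agree
```

Note that the identity SST = SSR + SSE (and the shortcut SSE formula) holds exactly only when m and b are the least-squares estimates.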
o Sum of Squares Regression (SSR) – measures the amount of variation in the dependent variable that is explained by the independent variable
  SSR = ∑(predicted value of y at a given value of x − average value of the dependent variable from the sample)²
  SSR = SST − SSE
  In our example → SST = 237.5, so SSR = 237.5 − 74.62 = 162.88

Calculating the Coefficient of Determination
o Coefficient of Determination (R²) – measures the percentage of the total variation of our dependent variable that is explained by our independent variable from a sample
  R² = SSR / SST
  R² = (r)²
  In our example → R² = 162.88 / 237.5 = 0.686
  - Conclude that 68.6% of the total variation can be explained by the independent variable (hours of study)
o The Basics
  Value ranges from 0 – 100%
  Higher values = more desirable
  - b/c we want most of the variation to be explained by the independent variable
  - higher values indicate a stronger relationship b/w the dependent and independent variables
  Low values may indicate:
  - using the wrong independent variable
  - needing additional independent variables to explain the variation in the dependent variable
  (The two equivalent R² calculations are sketched in code after this list.)
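R² can be computed either from the sums of squares or by squaring the sample correlation coefficient r. A minimal sketch, again with made-up data rather than the notes' example:

```python
# Minimal sketch with made-up data: two equivalent routes to R²
# for a simple linear regression.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([3.1, 4.9, 7.2, 8.8, 11.1, 12.9])

m, b = np.polyfit(x, y, 1)
y_hat = m * x + b

SST = np.sum((y - y.mean()) ** 2)
SSR = np.sum((y_hat - y.mean()) ** 2)

r2_from_ss = SSR / SST                    # R² = SSR / SST
r2_from_r = np.corrcoef(x, y)[0, 1] ** 2  # R² = (r)²
print(round(r2_from_ss, 4), round(r2_from_r, 4))  # both give the same value
```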
Conducting a Hypothesis Test to Determine the Significance of the Coefficient of Determination
o Population Coefficient of Determination (ρ²) – measures, for an entire population, the percentage of the total variation of a dependent variable that is explained by an independent variable
  Unknown in our example
o Perform a hypothesis test to determine if ρ² is significantly different from zero, based on R²
  If we reject the null hypothesis:
  - we have enough evidence from our sample to conclude that a relationship does exist b/w the 2 variables
o Step 1: Hypothesis statements
  H0: ρ² ≤ 0
  H1: ρ² > 0
o Step 2: Set the level of significance
  Set α = 0.05
o Step 3: F-test statistic
  F = SSR ÷ (SSE / (n − 2))
  In our example → F = 162.88 ÷ (74.62 / 4) = 8.73
o Step 4: Find the critical F-score
  Identifies the rejection region for the hypothesis test
  Follows the F-distribution (Table 6, Appendix A)
  Degrees of freedom:
  - D1 = 1 (always 1 b/c there is only one independent variable)
  - D2 = n − 2
  In our example, α = 0.05, D1 = 1, D2 = 4 → 7.709
o Step 5: Compare the F-test statistic with the critical F-score
  8.73 (F) > 7.709 (critical F-score) → REJECT H0
o Step 6: State your conclusions
  Conclude that the population coefficient of determination (ρ²) is greater than zero – there appears to be a relationship b/w the two variables
  (The whole test is sketched in code after this list.)
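Here is a minimal sketch of this F-test using the example's sums of squares (SSE = 74.62, with SSR = 162.88 inferred from R² = 0.686 and SST = SSR + SSE); scipy's F-distribution supplies the critical score that Table 6 provides.

```python
# Minimal sketch: F-test for the significance of the coefficient of
# determination, plugging in the notes' example values.
from scipy import stats

SSR, SSE, n = 162.88, 74.62, 6   # n - 2 = 4 residual degrees of freedom
alpha = 0.05

F = SSR / (SSE / (n - 2))                          # test statistic, ≈ 8.73
F_crit = stats.f.ppf(1 - alpha, dfn=1, dfd=n - 2)  # critical F-score, ≈ 7.709

print(round(F, 2), round(F_crit, 3))
print("Reject H0" if F > F_crit else "Fail to reject H0")
```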
o In the REGRESSION OUTPUT – EXCEL
  Regression Statistics > R Square
  - = the coefficient of determination, R²
  Sums of squares (ANOVA table, SS column)
  - Residual row = error sum of squares (SSE)
  - Regression row = regression sum of squares (SSR)
  ANOVA F column = the F-test statistic
  ANOVA df column shows the degrees of freedom
  - D1 (Regression row)
  - D2 (Residual row)
  ANOVA Significance F = the p-value for the F-test
  (A code sketch mapping these fields to a statsmodels fit follows below.)
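For readers reproducing the Excel output in Python, here is a minimal sketch with made-up data showing where each Excel field lives on a fitted statsmodels OLS model.

```python
# Minimal sketch with made-up data: the same quantities the Excel regression
# output reports, read from a statsmodels OLS fit.
import numpy as np
import statsmodels.api as sm

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([3.1, 4.9, 7.2, 8.8, 11.1, 12.9])

model = sm.OLS(y, sm.add_constant(x)).fit()

print(model.rsquared)    # Regression Statistics > R Square (R²)
print(model.ess)         # ANOVA SS, Regression row (SSR)
print(model.ssr)         # ANOVA SS, Residual row (SSE)
print(model.df_model)    # ANOVA df, Regression row (D1)
print(model.df_resid)    # ANOVA df, Residual row (D2)
print(model.fvalue)      # ANOVA F column (F-test statistic)
print(model.f_pvalue)    # ANOVA Significance F (p-value)
```

One naming clash is worth noting: statsmodels' ssr attribute is the residual (error) sum of squares, i.e. what the notes call SSE, while ess is the explained (regression) sum of squares, i.e. SSR.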
