One-Way ANOVA
PSY 2801: Summer 2010
Jeff Jones, University of Minnesota
(More Than) Two Independent Samples

There is (at least) one caveat with the t-tests we have learned: we have no way of performing a test with more than two independent groups. What happens if we want to run an experiment, assign people to many independent groups, and perform a statistical test to detect differences? Can we do that with a t-test?

Well ... we could perform many t-tests, one for each pair of conditions, but that is inefficient and bad practice.

So, we have the format for the t-tests:

    t(df) = (Statistic − Parameter) / Standard Error

And, if we're only testing differences between groups, this reduces to:

    t(df) = Mean Difference / Standard Error

It would be nice if we could create a test of the form:

    F = Effect / Error

That way we could see if the experiment worked: whether there were differences amongst all of our (more than two) conditions.

Analysis of Variance

Analysis of Variance (ANOVA) is a statistical test that investigates whether an experiment worked: it examines whether there were differences amongst all of our conditions. In ANOVA, we want to see if the error variance is small relative to the actual differences between the conditions -- that way, we know whether the differences among the sample statistics actually represent differences in the parameters.
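To see why "many t-tests" gets out of hand, here is a small pure-Python sketch (the function names are mine, not from the notes): with g groups you need g(g − 1)/2 pairwise tests, and if each test independently uses α = .05, the chance of at least one false rejection somewhere in the family grows quickly.

```python
from itertools import combinations

def n_pairwise_tests(g):
    """Number of separate two-sample t-tests needed to compare g groups pairwise."""
    return len(list(combinations(range(g), 2)))

def familywise_error(alpha, k):
    """Chance of at least one Type I error across k independent tests at level alpha."""
    return 1 - (1 - alpha) ** k

print(n_pairwise_tests(3))                   # 3 tests for 3 groups
print(n_pairwise_tests(6))                   # 15 tests for 6 groups
print(round(familywise_error(0.05, 15), 3))  # roughly 0.537
```

With six groups, the nominal .05 error rate balloons to better than a coin flip -- which is exactly the problem ANOVA (and, later, post-hoc corrections) addresses.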
ANOVA Steps

Here are the basic steps for a One-Way ANOVA:

1. Form Null and Alternative Hypotheses
2. Set α level (as always)
3. Calculate Group Means (x̄1, x̄2, ..., x̄k)
4. Calculate Grand Mean (x̄G)
5. Calculate Sums of Squares (between, within, and total)
6. Find df (between and within)
7. Find Mean Squares (between and within)
8. Calculate F-statistic
9. Check the probability of the test statistic occurring (given what?)
10. Choose to reject or not reject H0

An Example ANOVA Set-Up

If we have three levels of an Independent Variable, this is typically what the data would look like:

[Figure: scores (roughly 35 to 65) plotted by group (1 to 3), with the Grand Mean marked by a green line.]

In this case, the green line marks the Grand Mean: the mean (x̄G) of everybody, ignoring group. Furthermore, Group 1 is furthest to the left, Group 2 is in the center, and Group 3 is furthest to the right.

Group Means

This is a diagram of the Group Means for an ANOVA:

[Figure: the same scores, with each group's sample mean marked by a red line.]

Each Group also has its respective sample mean (call them x̄1, x̄2, and x̄3). All three means are marked with red lines.

Breakdown of Sums of Squares

We could look at the squared difference between each point and the grand mean, ignoring group. If we add up all of these squared differences, the result is called SStotal, the Sums of Squares for the entire sample, irrespective of group membership. This is like a variance calculation without dividing by N − 1.

Each Sums of Squares is the consequence of two things. The first part of the squared difference between each point and the grand mean is due to group membership: the group means won't be in the same place, and each observation will be hovering around its respective group mean.
The second part of the squared difference between each point and the grand mean is due to individual (or error) variability: the observations in each group won't all be in the same place, so there will be some variation attributable to random noise around the respective group mean.

ANOVA Objective

Because of sampling error (sampling variation), even if all of the groups came from the same population, the sample means would still vary somewhat. What we're trying to figure out is whether the actual variation of the sample means is greater than that expected by chance ... assuming they all came from the same population. If the sample means vary too much, then we will say that they probably (according to our weird Null Hypothesis Significance Testing criterion) came from different populations.

ANOVA Table

To decide whether the groups came from the same population or different populations, it's easiest to fill out a table:

Source    Sums of Sq.   df   Mean Sq.   F-stat
Between        *         *       *        *
Within         *         *       *
Total          *         *

Each of the stars will have something filled in. Our objective is to find the F-stat, as that is our test statistic for an ANOVA.

Sums of Squares: Total

Here's the equation for the Sums of Squares Total:

    SST = Σ_{i=1}^{N} (x_i − x̄G)²

where N is the total number of people in all of the groups and x̄G is the grand mean. If we combined all of our groups into one big group, the Total Sums of Squares is like the variance of that big group without the "divided by N − 1" part.

Sums of Squares: Between

Here's the equation for the Sums of Squares Between:

    SSB = Σ_{j=1}^{g} n_j (x̄_j − x̄G)²

where g is the number of groups, j is the index for group, n_j is the number of people in group j, x̄_j is the mean of group j, and x̄G is the grand mean.
This asks: what would the Sums of Squares be if everybody were exactly at the mean of their respective group?

Sums of Squares: Within

Here's the equation for the Sums of Squares Within:

    SSW = Σ_{j=1}^{g} Σ_{i=1}^{n_j} (x_ji − x̄_j)²

Or the easier way of calculating it:

    SSW = SST − SSB

This is the combined spread within the groups. We're only looking at what's happening within the groups, not between the groups. If we add what's happening within the groups to what's happening between the groups, we have what's happening in total!

Degrees of Freedom

Just like Sums of Squares are additive, Degrees of Freedom are also additive:

    dfW = N − g
    dfB = g − 1
    dfT = dfB + dfW = (g − 1) + (N − g) = N − 1

The total degrees of freedom is similar to the denominator of the sample variance calculation ... or, equivalently, to the degrees of freedom of a one-sample t-test.

Mean Squares

OK, let's redefine "mean": instead of "Sum of Somethings" divided by "Number of Somethings," it's more like "Sum of Somethings" divided by "Respective Degrees of Freedom" (which is usually pretty close to "Number of Somethings"). That's why

    s² = Σ_{i=1}^{N} (x_i − x̄)² / (N − 1)

has an N − 1 in the denominator: N − 1 is the degrees of freedom for a sample variance. With this in mind:

    MSB = SSB / dfB
    MSW = SSW / dfW

F-Stat

Finally:

    F(dfB, dfW) = MSB / MSW

If there are no differences in the groups, these are both measures of the same thing.

If there are no differences in the groups, MSB should be close to the expected variance of the x̄ sampling distribution multiplied by the group size n. What is σ²_x̄ × n? Regardless of the differences in the groups, MSW should be like our pooled variance in the t-test (a weighted average of each group's variance).
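The Sums-of-Squares bookkeeping above can be checked numerically. Here is a pure-Python sketch (the function name is mine) that computes SST, SSB, and SSW for three small groups of five scores and confirms the additivity SST = SSB + SSW:

```python
def sums_of_squares(groups):
    """Decompose total variation into between- and within-group parts."""
    allx = [x for grp in groups for x in grp]
    grand = sum(allx) / len(allx)                  # grand mean, ignoring group
    means = [sum(grp) / len(grp) for grp in groups]

    sst = sum((x - grand) ** 2 for x in allx)
    ssb = sum(len(grp) * (m - grand) ** 2 for grp, m in zip(groups, means))
    ssw = sum((x - m) ** 2 for grp, m in zip(groups, means) for x in grp)
    return sst, ssb, ssw

# Three illustrative groups of five scores
sst, ssb, ssw = sums_of_squares([[20, 22, 21, 26, 31],
                                 [19, 21, 22, 23, 25],
                                 [19, 16, 15, 16, 19]])
print(sst, ssb, ssw)   # 246.0 130.0 116.0
assert abs(sst - (ssb + ssw)) < 1e-9
```

The final assertion is the "within plus between equals total" claim from the slide, verified on concrete numbers.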
F-Stat

If there are no differences in the groups:

    MSB ≈ σ²_x
    MSW ≈ σ²_x
    F = MSB / MSW ≈ σ²_x / σ²_x ≈ 1

What happens to this ratio if, instead of coming from the same population, the groups came from at least two different populations? Well, the denominator (MSW) would still estimate the pooled variance. However, the numerator (MSB) would now estimate the pooled variance plus the differences between the populations ... it would have an extra effect.

The F-Distribution

Since the ANOVA is just looking for "any effect" of treatment, it's looking at a variance ratio. Since variances are always positive, the F-distribution is always positive. If there is an effect, then the numerator will be much larger than the denominator and the ratio will be larger than 1.

In fact, the F-test is just a generalization of the Independent Samples t-test. Even stranger, if there are only 2 groups, the F-statistic is just the square of the t-statistic:

[Figure: histogram of random F scores with 1 and 100 df next to a histogram of random squared t scores with 100 df; the two distributions coincide.]

Since F is t², the F-test is only a one-tailed test.

A Research Question

OK, so let's take a simple research question to use an ANOVA. Suppose there were three drugs developed to make people happier: "Jeff's Drug," "New Drug," and "Bad Drug." We picked 15 people at random -- 5 took each of the drugs.
Furthermore, after a couple of days, each person was measured on a happiness questionnaire; higher scores indicate "more happiness." The researcher wants to decide whether there are any differences between the drugs on perceptions of happiness. Since we're supposed to make our hypotheses before looking at the data, I will save looking at the data for a future slide. We will also set our α level at .05.

Step 1: Setting up Hypotheses

The first step is to figure out your Null and Alternative Hypotheses:

    H0: µ_Jeff = µ_New = µ_Bad
    H1: At least one µ differs from at least one other µ

Things to remember:

1. We are testing something about the means in the population.
2. This is a one-directional test -- if the means differ from each other in any way, the F-stat will be large.
3. The Null Hypothesis will only hold if all of the means equal each other in the population.
4. The Alternative Hypothesis is confusing. If H0 isn't true, then something happened (some mean is different from some other mean) -- however, we don't know what happened without further tests.

The Data

OK -- here are our happiness scores by drug:

Jeff's Drug: 20 22 21 26 31
New Drug:    19 21 22 23 25
Bad Drug:    19 16 15 16 19

We want to see whether the mean happiness levels in the population differ among the groups.

The ANOVA Table

Let's look at the ANOVA Table again, so we can fill in the missing pieces:

Source    Sums of Sq.   df   Mean Sq.   F-stat
Between        *         *       *        *
Within         *         *       *
Total          *         *

OK -- the first thing to calculate is our Sums of Squares, which involves group means, grand means, and individual observations. We have the individual observations, so let's calculate the means.
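Before grinding through the arithmetic by hand, the group and grand means are easy to verify in a few lines of Python (a sketch using the happiness scores from the table above):

```python
# Happiness scores from the table above, one list per drug group.
data = {
    "Jeff's Drug": [20, 22, 21, 26, 31],
    "New Drug":    [19, 21, 22, 23, 25],
    "Bad Drug":    [19, 16, 15, 16, 19],
}

group_means = {name: sum(xs) / len(xs) for name, xs in data.items()}
grand_mean = sum(x for xs in data.values() for x in xs) / 15

print(group_means)
print(grand_mean)   # 21.0
```

These are the same values the next slides compute by hand.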
Step 3: Calculate Group Means

Jeff's Drug: 20 22 21 26 31
New Drug:    19 21 22 23 25
Bad Drug:    19 16 15 16 19

    x̄_J = (20 + 22 + 21 + 26 + 31)/5 = 120/5 = 24
    x̄_N = (19 + 21 + 22 + 23 + 25)/5 = 110/5 = 22
    x̄_B = (19 + 16 + 15 + 16 + 19)/5 = 85/5 = 17

Step 4: Calculate Grand Mean

If the groups all have the same size, the Grand Mean is equal to the mean of the group means:

    x̄_G = (24 + 22 + 17)/3 = 63/3 = 21

If the groups do not have the same size, you have to calculate the mean the regular way:

    x̄_G = (20 + 22 + 21 + 26 + 31 + 19 + 21 + 22 + 23 + 25 + 19 + 16 + 15 + 16 + 19)/15
        = 315/15 = 21

So, we have:

Jeff's Drug: x̄_J = 24
New Drug:    x̄_N = 22
Bad Drug:    x̄_B = 17
Grand Mean:  x̄_G = 21

Step 5: Calculate Sums of Squares

So, we know:

    SSB = Σ_{j=1}^{g} n_j (x̄_j − x̄_G)²

where n_j is the number of people in group j. Since all of the groups have the same number of people, n_j = n = 5 for all groups. So:

    SSB = 5(24 − 21)² + 5(22 − 21)² + 5(17 − 21)²
        = 5(3)² + 5(1)² + 5(−4)²
        = 5(9) + 5(1) + 5(16)
        = 45 + 5 + 80 = 130

We also know:

    SST = Σ_{i=1}^{N} (x_i − x̄_G)²

This is just the sample variance multiplied by N − 1. So, if you take each of the 15 observations, subtract 21, square the results, and add them up, you will get SST:

    SST = (20 − 21)² + (22 − 21)² + (21 − 21)² + (26 − 21)² + (31 − 21)²
        + (19 − 21)² + (21 − 21)² + (22 − 21)² + (23 − 21)² + (25 − 21)²
        + (19 − 21)² + (16 − 21)² + (15 − 21)² + (16 − 21)² + (19 − 21)²
        = 246

So, we now have SST and SSB.
To find SSW we can either use the formula:

    SSW = Σ_{j=1}^{g} Σ_{i=1}^{n_j} (x_ji − x̄_j)²

or remember that Sums of Squares are additive, SST = SSB + SSW:

    SSW = SST − SSB = 246 − 130 = 116

The ANOVA Table

Let's look at the ANOVA Table again, so we can fill in the missing pieces:

Source    Sums of Sq.   df   Mean Sq.   F-stat
Between       130        *       *        *
Within        116        *       *
Total         246        *

OK, now that we've figured out the Sums of Squares, let's find the degrees of freedom.

Step 6: Find Degrees of Freedom

Remember: we have 15 total people (N = 15); we have 5 in each group (n = 5); we have 3 groups (g = 3). What are the degrees of freedom?

    dfW = N − g = 15 − 3 = 12
    dfB = g − 1 = 3 − 1 = 2
    dfT = dfB + dfW = 2 + 12 = 14

The ANOVA Table

Source    Sums of Sq.   df   Mean Sq.   F-stat
Between       130        2       *        *
Within        116       12       *
Total         246       14

OK, now that we've found the degrees of freedom, let's find the Mean Squares.

Step 7: Mean Squares

All we have to remember is that a "Mean" is a "Sum of Somethings" divided by its "Degrees of Freedom":

    MSB = SSB / dfB = 130 / 2 = 65
    MSW = SSW / dfW = 116 / 12 = 9.667

The ANOVA Table

Source    Sums of Sq.   df   Mean Sq.   F-stat
Between       130        2      65        *
Within        116       12     9.667
Total         246       14

Finally, let's find the F-statistic by dividing MSB by MSW.

Step 8: The F-Statistic

    F = MSB / MSW = 65 / 9.667 = 6.72

So, our final ANOVA Table is:

Source    Sums of Sq.   df   Mean Sq.   F-stat
Between       130        2      65       6.72
Within        116       12     9.667
Total         246       14

That's basically it -- if you can fill in this table, you've accomplished the most important ANOVA work.

Step 9: Make Decision

OK -- so now comes the confusing "Make Decision" part. We have a specific F-table for each level of significance (Howell, pp. 593-594). We look across the top of the table for the Numerator Degrees of Freedom (2 in this case), and down the side of the table for the Denominator Degrees of Freedom (12 in this case). Where they cross is our critical value. So, our F-crit in this situation is:

    Fcrit(2, 12) = 3.89

Since the F-test is a one-tailed test, we reject for large F-values, which indicate a large variance of the means relative to the variance within the groups. Our critical value: F-crit = 3.89. Our test statistic: F-stat = 6.72. So, what do we do?

Step 9 & 10: Decision and Conclusion

We want to compare our F-stat to our critical value, which consequently compares our p-value to our α level. Since F = 6.72 > 3.89, we have p < α, so we reject H0 and conclude that there are differences in mean happiness levels between some of the groups.

Notice how limited this is -- we can only conclude that something happened. We know that there were differences ... somewhere. However, just based on our F-test, we don't know where the differences are. Some people dismiss the omnibus F-test as a weak straw man, since merely finding that differences exist somewhere is usually not all that useful.
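The whole table above can be reproduced in a short pure-Python sketch (the function and the keys of its returned dict are my own naming, not from the notes):

```python
def one_way_anova(groups):
    """Fill in the one-way ANOVA table for a list of groups of scores."""
    allx = [x for grp in groups for x in grp]
    N, g = len(allx), len(groups)
    grand = sum(allx) / N
    means = [sum(grp) / len(grp) for grp in groups]

    ssb = sum(len(grp) * (m - grand) ** 2 for grp, m in zip(groups, means))
    sst = sum((x - grand) ** 2 for x in allx)
    ssw = sst - ssb                        # Sums of Squares are additive
    dfb, dfw = g - 1, N - g                # degrees of freedom are additive too
    msb, msw = ssb / dfb, ssw / dfw
    return {"SSB": ssb, "SSW": ssw, "SST": sst,
            "dfB": dfb, "dfW": dfw,
            "MSB": msb, "MSW": msw, "F": msb / msw}

table = one_way_anova([[20, 22, 21, 26, 31],
                       [19, 21, 22, 23, 25],
                       [19, 16, 15, 16, 19]])
print(round(table["F"], 2))   # 6.72, which exceeds Fcrit(2, 12) = 3.89
```

The critical value 3.89 still comes from an F table (or software); the code only fills in the Sums of Squares, df, Mean Squares, and F columns.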
Break for Thought

Before we go about figuring out differences between groups, one of the cool things about ANOVA is the possibility of filling out the table from only a few pieces of information:

Source    Sums of Sq.   df   Mean Sq.   F-stat
Between     351.52       4       *        *
Within         *         *       *
Total       786.82      49

You should be able to fill in the stars just from the four pieces of information given, and you should also be able to compare the F-stat to a prespecified α level. How would you do that?

Post-Hoc Tests

The omnibus (overall) ANOVA tells us whether or not there is a difference somewhere. It does not tell us where the differences are. We do Post-Hoc ("after the fact") tests to figure that out. There are many ... many ... many Post-Hoc tests that serve many purposes. We will be interested in the most basic purpose: deciding which means are different from each other in the population.

Most Post-Hoc tests attempt to control the Family-wise Error rate: the probability that a family of comparisons contains at least one Type I error. The more tests you do, the more likely you are to have a Type I error. We will discuss three different Post-Hoc tests: Fisher's LSD, Bonferroni, and Tukey's HSD.

Fisher's LSD Test

1. Fisher's LSD (Least Significant Difference): a very liberal post-hoc test that basically performs t-tests in which MSW (also called MSE or MSerror) serves as the "pooled variance" of your new t-tests.

It can be used to perform t-tests on all pairs of groups; however, you should be cautious about doing this for many pairs. We had 3 groups, so we have 3 paired comparisons to make (1-2; 1-3; 2-3). It doesn't correct much for Family-wise Error: the compounded Type I Error that you will make (on the whole set of tests) by performing so many tests.
However, see Howell for a justification of its ability to control family-wise error under certain conditions.

Fisher's LSD Test

Things you will need:

1. A significant F test
2. Group Means
3. Total Sample Size (N)
4. Individual Group Sample Sizes (n_j)
5. Mean Squares Within (estimating your pooled variance)
6. df Within

You can then calculate:

    t = (x̄_i − x̄_j) / sqrt( MSE (1/n_i + 1/n_j) )

You would then compare it to the two-tailed critical t value with df = N − g. If the t statistic is more extreme than the two-tailed critical value, then the means are different in the population.

Fisher's LSD Test: Example

In our situation:

1. Group Means: x̄_J = 24, x̄_N = 22, x̄_B = 17
2. Total Sample Size: N = 15
3. Individual Group Sample Sizes: n_j = n = 5 for all groups
4. Mean Squares Within: MSW = 9.667
5. df Within: dfW = 12

Using α = .05, what's the two-tailed critical value where df = 12? tcrit(12) = 2.179.

We will first test the mean difference µ_J − µ_N:

    t = (24 − 22) / sqrt( 9.667 (1/5 + 1/5) ) = 2 / 1.966 = 1.02

Since t = 1.02 < tcrit = 2.179, we fail to reject the Null Hypothesis and conclude that these two drugs do not produce different amounts of happiness.

We will do one more test, µ_J − µ_B:

    t = (24 − 17) / sqrt( 9.667 (1/5 + 1/5) ) = 7 / 1.966 = 3.56

Since t = 3.56 > tcrit = 2.179, we reject the Null Hypothesis and conclude that these two drugs do produce different amounts of happiness.

Post-Hoc Tests

In the previous post-hoc comparisons (LSD) we performed two tests, each with α = .05. If they all have the same error rate, and if we perform a lot of tests, eventually we might reject H0 based solely on chance. We want to make α = .05 for all of the tests as a whole.
If H0 is true for all of the tests, we want the probability of rejecting any of the tests to be .05. This leads us to the Bonferroni correction.

Bonferroni

2. Bonferroni: the same as before, only the t-test critical value is adjusted for the number of tests.

Let's say we have 3 groups to compare, so we want to do 3 paired tests (1-2; 1-3; 2-3). Also, let's say our α = .05 for all the tests as a whole. Bonferroni says to look up our t-crit at level α/k, where k is the number of tests. This test is very conservative when there is a large number of comparisons we want to make.

Things you will need:

1. Group Means
2. Total Sample Size (N)
3. Individual Group Sample Sizes (n_j)
4. Mean Squares Within (estimating your pooled variance)
5. df Within

We use the following t statistic:

    t = (x̄_i − x̄_j) / sqrt( MSE (1/n_i + 1/n_j) )

This should look familiar -- we just did this for the LSD test. The difference between the LSD approach and the Bonferroni approach is how you obtain the critical t value. For the LSD test you look up the critical t value at some α level with N − g degrees of freedom. For the Bonferroni you use the same degrees of freedom (N − g) but a modified α value:

    αB = α / k

where k is the number of comparisons to be made. For example, if you were interested in computing three different mean comparisons at α = .05, the Bonferroni correction would be αB = .05/3 = .0167.

Bonferroni: Example

In our situation:

1. Group Means: x̄_J = 24, x̄_N = 22, x̄_B = 17
2. Total Sample Size: N = 15
3. Individual Group Sample Sizes: n_j = n = 5 for all groups
4. Mean Squares Within: MSW = 9.667
5. df Within: dfW = 12

For this example we will compute two t tests (k = 2). Using α = .05, the corrected α is .05/2 = .025. What's the two-tailed critical value where df = 12?
tcrit(12) = 2.56

First we will test H0: µ_J − µ_B = 0:

    t = (24 − 17) / sqrt( 9.667 (1/5 + 1/5) ) = 7 / 1.966 = 3.56

Because t = 3.56 > tcrit = 2.56, we reject the Null Hypothesis and conclude that these two drugs produce different amounts of happiness.

For the second test we will test H0: µ_N − µ_B = 0:

    t = (22 − 17) / sqrt( 9.667 (1/5 + 1/5) ) = 5 / 1.966 = 2.54

Because t = 2.54 < tcrit = 2.56, we fail to reject the Null Hypothesis: after the Bonferroni correction we cannot conclude that these two drugs produce different amounts of happiness. Notice that this same t would have beaten the uncorrected critical value (2.179) -- the correction is exactly what makes the test conservative.

Post-Hoc Tests

OK -- we have one test that is very liberal (LSD) and one test that can be very conservative (Bonferroni). Is there any test in the middle? Well, yes there is: a somewhat conservative test called Tukey's HSD. It finds the smallest mean difference that is Honestly Significant, and any mean difference surpassing it is declared significant. I won't make you perform that test; just know the rank ordering from liberal to conservative:

1. LSD
2. Tukey
3. Bonferroni

Furthermore, know that you use these tests to correct for family-wise error.

Assumptions of ANOVA

ANOVA is like an Independent Samples t-test, and it is also like a Regression (though we won't learn why here). Thus the assumptions:

1. Normality of the Populations
2. Homogeneity of Variance
3. Independence of Observations

Notice how the t-tests, the ANOVA, and the Regression all rely on the same basic assumptions in order to test significance.

Effect Size

ANOVA also has an effect size, and it is pretty easy to calculate.
It’s called “eta-squared”: SSB SST An interpretation of this statistic is that it is the “percentage of variance accounted for by group membership.” η2 = Or, it’s how much a person moves predictably toward the mean of their group compared to how much they move total (some of which is random). Think of it in a similar manner as R 2 from the Regression lecture. Jeff Jones One-Way ANOVA Eta Squared Example Source Between Within Total Sums of Sq. 130 116 246 df 2 12 14 SSB SST 130 = 246 = 0.53 Mean Sq. 65 9.667 F -stat 6.72 η2 = Jeff Jones One-Way ANOVA Extending One-Way ANOVA This lecture has been focused on One-Way ANOVA where there is one dependent variable and one independent variable (group membership). The ANOVA framework/test is much more general and can be used to do tests involving both multiple dependent variables and independent variables. The remainder of the lecture will be illustrating ANOVAs generality by working through an example with two independent variables and one dependent variable. The ideas presented here readily generalize to n independent variables and one dependent variable. Jeff Jones One-Way ANOVA Drug Example Extended Recall the Drug data for the One-Way ANOVA setup: Jeff Drug 20 22 21 26 31 New Drug 19 21 22 23 25 Bad Drug 19 16 15 16 19 Suppose that in addition to Drug Type, we are also interested in Age Differences, and how Drug Type interacts with Age to affect Happiness. Jeff Jones One-Way ANOVA Drug Example Extended Suppose we collected the following data Jeff Drug 20 22 21 26 31 28 New Drug 19 21 22 20 18 16 Bad Drug 19 16 15 31 28 33 Young Old We can then run an ANOVA to investigate the effect of Age on Happiness, the effect of Drug on Happiness, and the how the interaction between Age and Drug affects Happiness. Jeff Jones One-Way ANOVA Extending the ANOVA Table Recall the One-Way ANOVA Table Source Between Within Total Sums of Sq. * * * df * * * Mean Sq. 
* * F -stat * We have a Sums of Squares Row for Between - which was the difference among the Drug means. Now we will have two extra rows - one for Age, and another for the interaction. Jeff Jones One-Way ANOVA Generic Two-Way ANOVA Table Here, A and B stand in for two different independent variables (A could be Drug, and B could be Age, for example) Source A B A×B Error Total SS SSA SSB SSA×B SSE SST df a−1 b−1 (a − 1) × (b − 1) N − ab N −1 Mean Sq. MSA MSB MSA×B MSE F -stat MSA /MSE MSB /MSE MSA×B /MSE Jeff Jones One-Way ANOVA Two-Way Table for our Example If you were to do all the calculations, the ANOVA table would be Source Age Drug Age X Drug Error Total SS 174.22 96.44 211.11 48.67 530.44 df 1 2 2 12 17 Mean Sq. 174.22 48.22 105.56 4.056 F -stat 42.96 11.89 26.03 All the F statistics are significant at α = 0.05 in this example what does this mean? We will look at the interaction effect first. Jeff Jones One-Way ANOVA Interaction Effect It is always a good idea to plot the data to see what is going on. Jeff Jones One-Way ANOVA Main Effects We can also look at how the independent variables affect happiness. Jeff Jones ...
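The two-way table above can be reproduced with the same sums-of-squares logic, extended to a balanced two-factor design. Below is a pure-Python sketch (function name and data layout are my own); the key idea is that the between-cells variation splits into an A part, a B part, and whatever is left over, which is the interaction.

```python
def two_way_anova(cells):
    """Two-way ANOVA sums of squares for a balanced design.

    cells[i][j] holds the scores for level i of factor A and level j of B.
    """
    a, b = len(cells), len(cells[0])
    n = len(cells[0][0])                    # per-cell sample size (balanced)
    N = a * b * n
    allx = [x for row in cells for cell in row for x in cell]
    grand = sum(allx) / N

    def mean(xs):
        return sum(xs) / len(xs)

    a_means = [mean([x for cell in row for x in cell]) for row in cells]
    b_means = [mean([x for row in cells for x in row[j]]) for j in range(b)]
    cell_means = [[mean(cell) for cell in row] for row in cells]

    ss_a = b * n * sum((m - grand) ** 2 for m in a_means)
    ss_b = a * n * sum((m - grand) ** 2 for m in b_means)
    ss_cells = n * sum((cell_means[i][j] - grand) ** 2
                       for i in range(a) for j in range(b))
    ss_ab = ss_cells - ss_a - ss_b          # interaction = leftover between-cells SS
    ss_e = sum((x - cell_means[i][j]) ** 2
               for i in range(a) for j in range(b) for x in cells[i][j])
    return ss_a, ss_b, ss_ab, ss_e

# Rows = Age (Young, Old); columns = Drug (Jeff's, New, Bad)
young = [[20, 22, 21], [19, 21, 22], [19, 16, 15]]
old   = [[26, 31, 28], [20, 18, 16], [31, 28, 33]]
ss_age, ss_drug, ss_int, ss_err = two_way_anova([young, old])
print(round(ss_age, 2), round(ss_drug, 2),
      round(ss_int, 2), round(ss_err, 2))   # 174.22 96.44 211.11 48.67
```

Dividing each SS by its df (1, 2, 2, and 12 here) gives the Mean Squares, and each effect's MS over MSE gives the F statistics in the table above.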
This note was uploaded on 10/08/2010 for the course PSY 2801 taught by Professor Guyer during the Summer '08 term at Minnesota.