One-Way ANOVA
PSY 2801: Summer 2010
Jeff Jones, University of Minnesota

(More Than) Two Independent Samples

There is (at least) one caveat with the t tests we have learned: we have no way of performing a test with more than two independent groups. What happens if we want to run an experiment, assign people to many independent groups, and perform a statistical test to detect differences? Can we do that with a t test? Well ... we could perform many t tests, one for each pair of conditions, but that's inefficient and bad practice.
So, we have the general format for the t tests:

t(df) = (Statistic − Parameter) / Standard Error

And, if we're only testing differences between groups, this reduces to:

t(df) = Mean Difference / Standard Error

It would be nice if we could create a test of the form:

F = Effect / Error

That way we could see if the experiment worked, that is, whether there were differences amongst all of our (more than two) conditions.

Analysis of Variance

Analysis of Variance (ANOVA) is a statistical test that investigates whether an experiment worked: it examines whether there were differences amongst all of our conditions. In ANOVA, we want to see if the error variance is small relative to the actual differences between the conditions; that way, we know whether the differences in the sample statistics actually represent differences in the parameters.

ANOVA Steps
Here are the basic steps for a One-Way ANOVA:
1. Form Null and Alternative Hypotheses
2. Set α level (as always)
3. Calculate Group Means (x̄_1, x̄_2, ..., x̄_k)
4. Calculate Grand Mean (x̄_G)
5. Calculate Sums of Squares (between, within, and total)
6. Find df (between and within)
7. Find Mean Squares (between and within)
8. Calculate F statistic
9. Check the probability of the test statistic occurring (given what?)
10. Choose to reject or not reject H0
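As a preview, steps 3 through 9 can be sketched in a few lines of Python. This is a minimal sketch assuming scipy is available, and the three tiny groups here are made-up illustration data, not the drug example that comes later:

```python
import numpy as np
from scipy import stats

# Hypothetical data: three independent groups (g = 3)
groups = [np.array([1, 2, 3]), np.array([2, 3, 4]), np.array([4, 5, 6])]

# Steps 3-4: group means and grand mean
group_means = [g.mean() for g in groups]
grand_mean = np.concatenate(groups).mean()

# Step 5: sums of squares (between and within)
ssb = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, group_means))
ssw = sum(((g - g.mean()) ** 2).sum() for g in groups)

# Step 6: degrees of freedom
N, g = sum(len(grp) for grp in groups), len(groups)
df_between, df_within = g - 1, N - g

# Steps 7-8: mean squares and the F statistic
F = (ssb / df_between) / (ssw / df_within)

# Step 9: probability of an F at least this large, given that H0 is true
p = stats.f.sf(F, df_between, df_within)
print(F, p)
```

The same F and p come out of scipy's built-in one-way ANOVA, `stats.f_oneway(*groups)`.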
An Example ANOVA Set-Up
If we have three levels of an Independent Variable, this is typically what the data would look like:
ANOVA Illustration

[Figure: scatterplot of Scores (roughly 35 to 65) against Group (1, 2, 3); a horizontal green line marks the Grand Mean.]

In this case, the green line marks the Grand Mean: the mean (x̄_G) of everybody, ignoring group. Furthermore, Group 1 is furthest to the left, Group 2 is in the center, and Group 3 is furthest to the right.

Group Means
This is a diagram of the Group Means for an ANOVA: each group also has its respective sample mean (call them x̄_1, x̄_2, and x̄_3). All three means are marked with red lines.

Breakdown of Sums of Squares
We could look at the squared difference between each point and the grand mean, ignoring group. If we add up all of these squared differences, the result is called SS_total, the Sums of Squares for the entire sample, irrespective of group membership. This is like a variance calculation without dividing by N − 1.

This Sums of Squares is the consequence of two things. The first part of the squared difference between each point and the grand mean is due to group membership: the group means won't all be in the same place, and each observation will hover around its respective group mean. The second part is due to individual (or error) variability: the observations in each group won't all be in the same place, so there will be some effect attributable to random variation around the respective group mean.

ANOVA Objective
Because of sampling error (sampling variation), even if all of the groups came from the same population, the sample means would still vary somewhat. What we're trying to figure out is whether the actual variation of the sample means is greater than that expected by chance ... assuming they all came from the same population. If the sample means vary too much, then we will say that they probably (according to our weird Null Hypothesis Significance Testing criterion) came from different populations.

ANOVA Table
To decide whether the groups came from the same population or different populations, it's easiest to fill out a table:

Source    Sums of Sq.   df   Mean Sq.   F stat
Between        *         *       *        *
Within         *         *       *
Total          *         *

Each of the stars will have something filled in; our objective is to find the F stat, as that is our test statistic for an ANOVA.

Sums of Squares: Total
Here's the equation for the Sums of Squares Total:

SST = Σ_{i=1}^{N} (x_i − x̄_G)²

where N is the total number of people in all of the groups and x̄_G is the grand mean. If we combined all of our groups into one big group, the Total Sums of Squares is like the variance of that big group without the "divided by N − 1" part of it.

Sums of Squares: Between

Here's the equation for the Sums of Squares Between:

SSB = Σ_{j=1}^{g} n_j (x̄_j − x̄_G)²

where g is the number of groups, j is the index for group, n_j is the number of people in group j, and x̄_G is the grand mean. This is asking "what would the Sums of Squares be if everybody were exactly at the mean of their respective group?"

Sums of Squares: Within

Here's the equation for the Sums of Squares Within:

SSW = Σ_{j=1}^{g} Σ_{i=1}^{n_j} (x_{ji} − x̄_j)²

or, the easier way of calculating it:

SSW = SST − SSB

This is the combined spread within the groups. We're only looking at what's happening within the groups, not between the groups. If we add what's happening within the groups to what's happening between the groups, we have what's happening in total!

Degrees of Freedom

Just like Sums of Squares are additive, Degrees of Freedom are also additive:

dfB = g − 1
dfW = N − g
dfT = dfB + dfW = (g − 1) + (N − g) = N − 1

The degrees of freedom total is similar to the denominator of the sample variance calculation ... or, equivalently, to the degrees of freedom of a one-sample t test.

Mean Squares
OK, let's redefine "mean": instead of "Sum of Somethings" divided by "Number of Somethings," it's more like "Sum of Somethings" divided by "Respective Degrees of Freedom" (which is usually pretty close to "Number of Somethings"). That's why

s² = Σ_{i=1}^{N} (x_i − x̄)² / (N − 1)

has an N − 1 in the denominator: N − 1 is the degrees of freedom for a sample variance. With this in mind:

MSB = SSB / dfB
MSW = SSW / dfW

F Stat
Finally:

F(dfB, dfW) = MSB / MSW

If there are no differences in the groups, these are both measures of the same thing: MSB should be close to the expected variance of an x̄ sampling distribution multiplied by the group size n. What is (σ²_x̄ × n)? It's σ², the population variance. Regardless of the differences in the groups, MSW should be like our pooled variance in the t test (a weighted average of each group's variance).

F Stat
If there are no differences in the groups:

MSB ≈ σ²_x
MSW ≈ σ²_x

F = MSB / MSW ≈ σ²_x / σ²_x ≈ 1

What happens to this ratio if, instead of coming from the same population, the groups came from at least two different populations? Well, the denominator (MSW) would still estimate the pooled variance. However, the numerator (MSB) would now estimate the pooled variance plus the differences between the populations ... it should have an extra effect.

The F Distribution
Since the ANOVA is just looking for "any effect" of treatment, it's looking at a variance ratio, and since variances are always positive, the F distribution is always positive. If there is an effect, then the numerator will be much larger than the denominator and the ratio will be larger than 1. In fact, the F test is just a generalization of the Independent Samples t test. Even stranger, if there are only 2 groups, the F distribution is just the square of the t distribution.
The F distribution is a generalization of the t distribution:

[Figure: two histograms with matching shapes. Left: random F scores with 1 and 100 df. Right: random t scores with 100 df, squared. Both densities run from 0 to about 0.7 over scores 0 to 10.]

Since F is t², the F test is only a one-tailed test.
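This relationship is easy to verify numerically (a sketch assuming scipy): the upper-tail F critical value with (1, df) degrees of freedom equals the square of the two-tailed t critical value with df degrees of freedom.

```python
from scipy.stats import f, t

df = 100
# 95th percentile of F(1, 100) vs. squared 97.5th percentile of t(100):
# a two-tailed t test at alpha = .05 is a one-tailed F test at alpha = .05
f_crit = f.ppf(0.95, 1, df)
t_crit = t.ppf(0.975, df)
print(f_crit, t_crit ** 2)  # the two values agree
```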
A Research Question
OK, so let's take a simple research question to use an ANOVA: suppose there were three drugs developed to make people happier, "Jeff's Drug," "New Drug," and "Bad Drug." We picked 15 people at random; 5 took each of the drugs. After a couple of days, each person was measured on a happiness questionnaire, where higher scores indicate "more happiness." The researcher wants to decide if there are any differences between the drugs on perceptions of happiness. Since we're supposed to make our hypotheses before looking at the data, I will save looking at the data for a future slide. We will also set our α level at .05.
Step 1: Setting Up Hypotheses
The first step is to figure out your Null and Alternative Hypotheses:

H0: μ_Jeff = μ_New = μ_Bad
H1: at least one μ differs from at least one other μ

Things to remember:
1. We are testing something about the means in the population.
2. This is a one-directional test: if the means differ from each other in any way, the F stat will be large.
3. The Null Hypothesis will only hold if all of the means equal each other in the population.
4. The Alternative Hypothesis is confusing. If H0 isn't true, then something happened (some mean is different from some other mean); however, we don't know what happened without further tests.

The Data

OK, here are our happiness scores by drug:

Jeff's Drug   New Drug   Bad Drug
     20          19         19
     22          21         16
     21          22         15
     26          23         16
     31          25         19

We want to see whether mean happiness in the population differs among the groups.

The ANOVA Table
Let's take a look at the ANOVA Table again, so we can fill in the missing pieces:

Source    Sums of Sq.   df   Mean Sq.   F stat
Between        *         *       *        *
Within         *         *       *
Total          *         *

OK, the first thing to calculate is our Sums of Squares, which involves group means, grand means, and individual observations. We have the individual observations, so let's calculate the means.

Step 3: Calculate Group Means
Jeff's Drug   New Drug   Bad Drug
     20          19         19
     22          21         16
     21          22         15
     26          23         16
     31          25         19

x̄_J = (20 + 22 + 21 + 26 + 31)/5 = 120/5 = 24
x̄_N = (19 + 21 + 22 + 23 + 25)/5 = 110/5 = 22
x̄_B = (19 + 16 + 15 + 16 + 19)/5 = 85/5 = 17

Step 4: Calculate Grand Mean
If the groups all have the same size, the Grand Mean is equal to the mean of the group means:

x̄_G = (24 + 22 + 17)/3 = 63/3 = 21

If the groups do not have the same size, you have to calculate the mean the regular way:

x̄_G = (20 + 22 + 21 + 26 + 31 + 19 + 21 + 22 + 23 + 25 + 19 + 16 + 15 + 16 + 19)/15 = 315/15 = 21
So, we have:

Jeff's Drug   New Drug   Bad Drug
     20          19         19
     22          21         16
     21          22         15
     26          23         16
     31          25         19
x̄_J = 24     x̄_N = 22    x̄_B = 17

x̄_G = 21

Step 5: Calculate Sums of Squares
So, we know:

SSB = Σ_{j=1}^{g} n_j (x̄_j − x̄_G)²

where n_j is the number of people in group j. But since all of the groups have the same number of people, n_j = n = 5 for all groups. So:

SSB = Σ_{j=1}^{g} n (x̄_j − x̄_G)²
    = 5(24 − 21)² + 5(22 − 21)² + 5(17 − 21)²
    = 5(3)² + 5(1)² + 5(−4)²
    = 5(9) + 5(1) + 5(16)
    = 45 + 5 + 80 = 130
We also know:

SST = Σ_{i=1}^{N} (x_i − x̄_G)²

This is just the sample variance multiplied by N − 1. So, if you take each of the 15 observations, subtract 21, square the results, and add them up, you get SST:

SST = (20 − 21)² + (22 − 21)² + (21 − 21)² + (26 − 21)² + (31 − 21)²
    + (19 − 21)² + (21 − 21)² + (22 − 21)² + (23 − 21)² + (25 − 21)²
    + (19 − 21)² + (16 − 21)² + (15 − 21)² + (16 − 21)² + (19 − 21)²
    = 246
So, we now have SST and SSB. To find SSW we can either use the formula:

SSW = Σ_{j=1}^{g} Σ_{i=1}^{n_j} (x_{ji} − x̄_j)²

or remember that Sums of Squares are additive, SST = SSB + SSW. Since

SST = Σ_{i=1}^{N} (x_i − x̄_G)² = 246,

SSW = SST − SSB = 246 − 130 = 116
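These hand calculations can be checked with a short numpy sketch using the happiness data above:

```python
import numpy as np

jeff = np.array([20, 22, 21, 26, 31])
new = np.array([19, 21, 22, 23, 25])
bad = np.array([19, 16, 15, 16, 19])
groups = [jeff, new, bad]

grand = np.concatenate(groups).mean()                        # 21.0
ssb = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)  # between
sst = ((np.concatenate(groups) - grand) ** 2).sum()          # total
ssw = sst - ssb                                              # within, by additivity
print(ssb, ssw, sst)
```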
The ANOVA Table
Let's take a look at the ANOVA Table again, so we can fill in the missing pieces:

Source    Sums of Sq.   df   Mean Sq.   F stat
Between       130        *       *        *
Within        116        *       *
Total         246        *

OK, now that we figured out the Sums of Squares, let's find the degrees of freedom.

Step 6: Find Degrees of Freedom
Remember: we have 15 total people (N = 15); we have 5 in each group (n = 5); we have 3 groups (g = 3). What are the degrees of freedom?

dfB = g − 1 = 3 − 1 = 2
dfW = N − g = 15 − 3 = 12
dfT = dfB + dfW = 2 + 12 = 14

The ANOVA Table
Let's take a look at the ANOVA Table again, so we can fill in the missing pieces:

Source    Sums of Sq.   df   Mean Sq.   F stat
Between       130        2       *        *
Within        116       12       *
Total         246       14

OK, now that we found the degrees of freedom, let's find the Mean Squares.

Step 7: Mean Squares

All we have to remember is that a "Mean" is a "Sum of Somethings" divided by its "Degrees of Freedom":

MSB = SSB / dfB = 130 / 2 = 65
MSW = SSW / dfW = 116 / 12 = 9.667

The ANOVA Table
Let's take a look at the ANOVA Table again, so we can fill in the missing pieces:

Source    Sums of Sq.   df   Mean Sq.   F stat
Between       130        2      65        *
Within        116       12     9.667
Total         246       14

Finally, let's find the F statistic by dividing MSB by MSW.

Step 8: The F Statistic
F = MSB / MSW = 65 / 9.667 = 6.72

So, our final ANOVA Table is:

Source    Sums of Sq.   df   Mean Sq.   F stat
Between       130        2      65       6.72
Within        116       12     9.667
Total         246       14

That's basically it: if you can fill in this table, you've accomplished the most important ANOVA work.

Step 9: Make Decision

OK, now comes the confusing "Make Decision" part. We have a specific F table for each level of significance (Howell pp. 593-594). We look across the top of the table for the Numerator Degrees of Freedom (2 in this case), and down the side of the table for the Denominator Degrees of Freedom (12 in this case). Where they cross is our critical value. So, our F crit in this situation is Fcrit(2, 12) = 3.89. OK, now what do we do?
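If Howell's table isn't handy, the same critical value can be pulled from scipy's F distribution (a sketch):

```python
from scipy.stats import f

alpha = 0.05
df_between, df_within = 2, 12
# Upper-tail critical value: the F value cutting off the top 5%
f_crit = f.ppf(1 - alpha, df_between, df_within)
print(round(f_crit, 2))  # matches the tabled value of 3.89
```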
Since the F test is only a one-tailed test, we reject for large F values, which indicate a large variance of the means relative to the variance within the groups.

Our critical value: Fcrit = 3.89
Our test statistic: Fstat = 6.72

So, what do we do?
Step 9 & 10: Decision and Conclusion
We want to compare our F stat to our critical value, which amounts to comparing our p value to our α level. Since F = 6.72 > 3.89, we have p < α, so we reject H0 and conclude that there are differences in mean happiness levels between some of the groups. Notice how limited this is: we can only conclude that something happened. We know that there were differences ... somewhere. However, just based on our F test, we don't know where the differences are. Some people refer to the F test as a very weak straw man, since merely finding that some difference exists is usually not all that useful.
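Steps 5 through 10 collapse into a single call with scipy's `f_oneway`, using the happiness data above (a sketch):

```python
from scipy.stats import f_oneway

jeff = [20, 22, 21, 26, 31]
new = [19, 21, 22, 23, 25]
bad = [19, 16, 15, 16, 19]

# One-way ANOVA across the three independent groups
F, p = f_oneway(jeff, new, bad)
print(F, p)
```

The F statistic matches the hand-built table (6.72), and the p value falls below α = .05, matching our decision to reject H0.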
Break for Thought
Before we go about figuring out differences between groups, one of the cool things about ANOVA is the possibility of filling out the table with only a few pieces of information:

Source    Sums of Sq.   df   Mean Sq.   F stat
Between     351.52       4       *        *
Within         *         *       *
Total       786.82      49

You should be able to fill in the stars just from the four pieces of information given, and you should also be able to compare the F stat to a prespecified α level. How would you do that?
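One way to fill in the stars, sketched in Python (scipy is only needed for the critical value); every number below follows from the four given values plus the additivity of SS and df:

```python
from scipy.stats import f

ssb, sst = 351.52, 786.82
dfb, dft = 4, 49

ssw = sst - ssb    # Sums of Squares are additive
dfw = dft - dfb    # degrees of freedom are additive
msb = ssb / dfb
msw = ssw / dfw
F = msb / msw

f_crit = f.ppf(0.95, dfb, dfw)  # critical value at alpha = .05
print(ssw, dfw, round(F, 2), F > f_crit)
```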
Post-Hoc Tests
The omnibus (overall) ANOVA tells us whether or not there is a difference somewhere. It does not tell us where the differences are. We do Post-Hoc (after the fact) tests to figure that out. There are many ... many ... many Post-Hoc tests that serve many purposes. We will be interested in the most basic purpose: deciding which means are different from each other in the population. Most Post-Hoc tests attempt to control the Familywise Error rate: the probability that a family of comparisons contains at least one Type I error. The more tests you do, the more likely you are to have a Type I error. We will discuss three different Post-Hoc tests: Fisher's LSD, Bonferroni, and Tukey's HSD.
Fisher's LSD Test
1. Fisher's LSD (Least Significant Difference): a very liberal post-hoc test that basically performs t tests in which the MSW (also called MSE or MS_error) serves as the "pooled variance" of your new t tests.

It can be used to perform t tests on all of the paired groups; however, you should be cautious of doing this for many pairs. We had 3 groups, so we have 3 paired comparisons to make (1-2; 1-3; 2-3). It doesn't correct much for Familywise Error: the compounded Type I Error that you will make (on the whole set of tests) by performing so many tests. However, see Howell for a justification of its ability to control familywise error under certain conditions.
Things you will need:

1. A significant F test
2. Group Means
3. Total Sample Size (N)
4. Individual Group Sample Sizes (n_j)
5. Mean Squares Within (estimating your pooled variance)
6. df Within

You can then calculate:

t = (x̄_i − x̄_j) / sqrt(MSE (1/n_i + 1/n_j))

After calculating this t, you compare it to the two-tailed critical t value with df = N − g. If the t statistic is more extreme than the two-tailed critical value, then the means are different in the population.

Fisher's LSD Test: Example

In our situation:
1. Group Means: x̄_J = 24, x̄_N = 22, x̄_B = 17
2. Total Sample Size: N = 15
3. Individual Group Sample Sizes: n_j = n = 5 for all groups
4. Mean Squares Within: MSW = 9.667
5. df Within: dfW = 12

Using α = .05, what's the two-tailed critical value with df = 12? tcrit(12) = 2.179
We will first test the mean difference μ_J − μ_N:

t = (x̄_J − x̄_N) / sqrt(MSE (1/n + 1/n))
  = (24 − 22) / sqrt(9.667 (1/5 + 1/5))
  = 2 / 1.966 = 1.02

Since t = 1.02 < tcrit = 2.179, we fail to reject the Null Hypothesis and conclude that these two drugs do not produce different amounts of happiness.
We will do one more test, for μ_J − μ_B:

t = (x̄_J − x̄_B) / sqrt(MSE (1/n + 1/n))
  = (24 − 17) / sqrt(9.667 (1/5 + 1/5))
  = 7 / 1.966 = 3.56

Since t = 3.56 > tcrit = 2.179, we reject the Null Hypothesis and conclude that these two drugs do produce different amounts of happiness.

Post-Hoc Tests

In the previous post-hoc comparison (LSD) we performed two tests, each with α = .05. If they all have the same error rate, and if we perform a lot of tests, eventually we might reject H0 based solely on chance. We want to make α = .05 for all of the tests as a whole: if H0 is true for all of the tests, we want the probability of rejecting any of them to be .05. This leads us to the Bonferroni correction.

Bonferroni

2. Bonferroni: same as before, only the t test critical value is adjusted for the number of tests.
Let's say we have 3 groups to compare, so we want to do 3 paired tests (1-2; 1-3; 2-3). Also, let's say our α = .05 for all the tests as a whole. Bonferroni says to set the α for each test to α/k, where k is the number of tests. This test is very conservative when there is a large number of comparisons to make.

Bonferroni
Things you will need:

1. Group Means
2. Total Sample Size (N)
3. Individual Group Sample Sizes (n_j)
4. Mean Squares Within (estimating your pooled variance)
5. df Within

We use the following t statistic:

t = (x̄_i − x̄_j) / sqrt(MSE (1/n_i + 1/n_j))

This should look familiar; we just did this for the LSD test.

Bonferroni
The difference between the LSD approach and the Bonferroni approach is how you obtain the critical t value. For the LSD test you look up the critical t value at some α level with N − g degrees of freedom. For the Bonferroni you use the same degrees of freedom (N − g) but a modified α value: αB = α/k, where k is the number of comparisons to be made. For example, if you were interested in computing three different mean comparisons at α = .05, the Bonferroni correction would be αB = 0.05/3 ≈ 0.0167.

Bonferroni: Example
In our situation:

1. Group Means: x̄_J = 24, x̄_N = 22, x̄_B = 17
2. Total Sample Size: N = 15
3. Individual Group Sample Sizes: n_j = n = 5 for all groups
4. Mean Squares Within: MSW = 9.667
5. df Within: dfW = 12

For this example we will compute two t tests (k = 2). Using α = .05, the corrected α is .05/2 = .025. What's the two-tailed critical value with df = 12? tcrit(12) = 2.56
First we will test H0: μ_J − μ_B = 0:

t = (x̄_J − x̄_B) / sqrt(MSE (1/n + 1/n))
  = (24 − 17) / sqrt(9.667 (1/5 + 1/5))
  = 7 / 1.966 = 3.56

Because t = 3.56 > tcrit = 2.56, we reject the Null Hypothesis and conclude that these two drugs produce different amounts of happiness.
For the second test we will test H0: μ_N − μ_B = 0:

t = (x̄_N − x̄_B) / sqrt(MSE (1/n + 1/n))
  = (22 − 17) / sqrt(9.667 (1/5 + 1/5))
  = 5 / 1.966 = 2.54

Because t = 2.54 < tcrit = 2.56, we (just barely) fail to reject the Null Hypothesis: after the Bonferroni correction, this comparison is not significant, even though it would have been against the uncorrected LSD critical value of 2.179.

Post-Hoc Tests
OK, we have one test that is very liberal (LSD) and one test that can be very conservative (Bonferroni). Is there any test in the middle? Well, yes there is: a somewhat conservative test called Tukey's HSD. It finds the smallest mean difference that is Honestly Significant, and any pair of means differing by more than that amount is declared significantly different. I won't make you perform that test; just know the rank ordering from liberal to conservative:

1. LSD
2. Tukey
3. Bonferroni

Furthermore, know that you use these tests to correct for familywise error.
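The LSD and Bonferroni comparisons above can be reproduced in a few lines (a sketch assuming scipy; recall that the Bad Drug mean is 85/5 = 17):

```python
import math
from scipy.stats import t

msw, n, df_w = 116 / 12, 5, 12           # MSW = 9.667 from the ANOVA table
se = math.sqrt(msw * (1 / n + 1 / n))    # pooled standard error, about 1.966
means = {"jeff": 24, "new": 22, "bad": 17}

t_jeff_new = (means["jeff"] - means["new"]) / se  # about 1.02
t_jeff_bad = (means["jeff"] - means["bad"]) / se  # about 3.56
t_new_bad = (means["new"] - means["bad"]) / se    # about 2.54

lsd_crit = t.ppf(1 - 0.05 / 2, df_w)         # plain two-tailed .05 critical, about 2.179
bonf_crit = t.ppf(1 - (0.05 / 2) / 2, df_w)  # Bonferroni-corrected, k = 2 tests, about 2.56
print(lsd_crit, bonf_crit)
```

Note that New vs. Bad clears the LSD criterion but just misses the Bonferroni one: controlling familywise error costs power.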
Assumptions of ANOVA

ANOVA is like an Independent Samples t test, and it is also like a Regression (though we won't learn why here). Thus the assumptions:

1. Normality of the Populations
2. Homogeneity of Variance
3. Independence of Observations

Notice that the t tests, the ANOVA, and the Regression all rest on the same basic assumptions in order to test significance.

Effect Size
ANOVA also has an effect size, and it is pretty easy to calculate. It's called "eta-squared":

η² = SSB / SST

An interpretation of this statistic is that it is the "percentage of variance accounted for by group membership." Or, it's how much a person moves predictably toward the mean of their group compared to how much they move in total (some of which is random). Think of it in a similar manner as R² from the Regression lecture.

Eta Squared Example
Source    Sums of Sq.   df   Mean Sq.   F stat
Between       130        2      65       6.72
Within        116       12     9.667
Total         246       14

η² = SSB / SST = 130 / 246 = 0.53

Extending One-Way ANOVA

This lecture has focused on the One-Way ANOVA, where there is one dependent variable and one independent variable (group membership). The ANOVA framework/test is much more general and can be used for tests involving multiple independent variables and even multiple dependent variables. The remainder of the lecture illustrates ANOVA's generality by working through an example with two independent variables and one dependent variable. The ideas presented here readily generalize to n independent variables and one dependent variable.

Drug Example Extended
Recall the Drug data for the One-Way ANOVA setup:

Jeff's Drug   New Drug   Bad Drug
     20          19         19
     22          21         16
     21          22         15
     26          23         16
     31          25         19

Suppose that, in addition to Drug Type, we are also interested in Age Differences and in how Drug Type interacts with Age to affect Happiness.
Suppose we collected the following data:

        Jeff's Drug   New Drug   Bad Drug
Young       20           19         19
            22           21         16
            21           22         15
Old         26           20         31
            31           18         28
            28           16         33

We can then run an ANOVA to investigate the effect of Age on Happiness, the effect of Drug on Happiness, and how the interaction between Age and Drug affects Happiness.

Extending the ANOVA Table

Recall the One-Way ANOVA Table:

Source    Sums of Sq.   df   Mean Sq.   F stat
Between        *         *       *        *
Within         *         *       *
Total          *         *

We had a Sums of Squares row for Between, which captured the differences among the Drug means. Now we will have two extra rows: one for Age, and another for the interaction.

Generic Two-Way ANOVA Table

Here, A and B stand in for two different independent variables (A could be Drug, and B could be Age, for example):

Source   SS        df                 Mean Sq.   F stat
A        SSA       a − 1              MSA        MSA/MSE
B        SSB       b − 1              MSB        MSB/MSE
A × B    SSA×B     (a − 1) × (b − 1)  MSA×B      MSA×B/MSE
Error    SSE       N − ab             MSE
Total    SST       N − 1

Two-Way Table for our Example
If you were to do all the calculations, the ANOVA table would be:

Source       SS       df   Mean Sq.   F stat
Age         174.22     1    174.22     42.96
Drug         96.44     2     48.22     11.89
Age × Drug  211.11     2    105.56     26.03
Error        48.67    12     4.056
Total       530.44    17

All the F statistics are significant at α = 0.05 in this example. What does this mean? We will look at the interaction effect first.

Interaction Effect
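The two-way table can be reproduced by hand; here is a numpy sketch, where `y` is indexed as age by drug by replicate:

```python
import numpy as np

# happiness[age, drug, replicate]: age 0 = Young, 1 = Old;
# drug 0 = Jeff's, 1 = New, 2 = Bad
y = np.array([[[20, 22, 21], [19, 21, 22], [19, 16, 15]],
              [[26, 31, 28], [20, 18, 16], [31, 28, 33]]], dtype=float)

a, b, r = y.shape  # 2 ages, 3 drugs, 3 observations per cell
grand = y.mean()

# Main-effect SS: weighted squared deviations of marginal means from the grand mean
ss_age = b * r * ((y.mean(axis=(1, 2)) - grand) ** 2).sum()
ss_drug = a * r * ((y.mean(axis=(0, 2)) - grand) ** 2).sum()

# Interaction SS: cell-mean variation not explained by the two main effects
ss_cells = r * ((y.mean(axis=2) - grand) ** 2).sum()
ss_inter = ss_cells - ss_age - ss_drug

# Error SS: variation of observations around their own cell means
ss_error = ((y - y.mean(axis=2, keepdims=True)) ** 2).sum()
ss_total = ((y - grand) ** 2).sum()

ms_error = ss_error / (a * b * (r - 1))  # MSE with N - ab = 12 df
for name, ss in [("Age", ss_age), ("Drug", ss_drug),
                 ("Age x Drug", ss_inter), ("Error", ss_error)]:
    print(name, round(ss, 2))
```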
It is always a good idea to plot the data to see what is going on. [Interaction plot not preserved in this text version.]

Main Effects

We can also look at how the independent variables affect happiness. [Main-effects plots not preserved in this text version.]
This note was uploaded on 10/08/2010 for the course PSY 2801 taught by Professor Guyer during the Summer '08 term at Minnesota.