Chapter 7: Analysing the Data 
One-Way ANOVA

Mean 1   Mean 2   Mean 3   Mean 4   Mean 5
 7.0      6.9     11.0     13.4     12.0
With the five means above, we could compare Mean 1 against Mean 2, or against Mean 3, or against Mean 4, or against Mean 5. We could also compare Mean 2 against Mean 3, or against Mean 4, or against Mean 5. We could also compare Mean 3 against Mean 4, or against Mean 5. Finally, we could compare Mean 4 against Mean 5. This gives a total of 10 possible two-group comparisons. Obviously, the logic used for the t-test cannot immediately be transferred to ANOVA.
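The count of 10 comparisons follows from the number of ways of choosing 2 means out of 5. A quick check (a minimal sketch using Python's standard library):

```python
from math import comb

# Number of distinct two-group comparisons among k = 5 means:
# "5 choose 2" = (5 * 4) / 2 = 10
k = 5
print(comb(k, 2))  # 10
```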
Instead, ANOVA uses some simple logic of comparing variances (hence the name 'Analysis of Variance'). If the variance amongst the five means is significantly greater than our measure of random error variance, then our means must be more spread out than we would expect due to chance alone.
If the variance amongst our sample means is the same as the error variance, then you would expect an F = 1.00. If the variance amongst our sample means is greater than the error variance, you would get F > 1.00. What we need, therefore, is a way of deciding when this ratio is significantly greater than 1.00. (An F < 1.00 does not have much importance, and F is always > 0.0 because variances are always positive.)
The answer to this question is the distribution of the F-ratio. An F-ratio is merely the ratio of any two variances. In the case of the between-groups ANOVA, the variances we are interested in are the two nominated above.
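The F-ratio can be computed directly from raw scores: the between-groups variance (Mean Square) goes in the numerator and the within-groups variance in the denominator. A minimal sketch, using small made-up groups of scores (the data here are illustrative, not from the chapter):

```python
# Hypothetical raw scores for three groups (illustrative data only)
groups = [[6, 7, 8], [5, 7, 9], [10, 11, 12]]

k = len(groups)                                # number of groups
n_total = sum(len(g) for g in groups)          # total observations
grand_mean = sum(sum(g) for g in groups) / n_total

# Numerator: between-groups Sums of Squares and Mean Square
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
ms_between = ss_between / (k - 1)              # df = k - 1

# Denominator: within-groups Sums of Squares and Mean Square
ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
ms_within = ss_within / (n_total - k)          # df = N - k

F = ms_between / ms_within
print(F)  # 8.0 for this data
```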
F distributions depend on the degrees of freedom associated with the numerator in the ratio and the degrees of freedom associated with the denominator. Figure 7.1 shows three different F distributions corresponding to three different combinations of numerator df and denominator df.
Figure 7.1. Different F distributions for different combinations of numerator and denominator degrees of freedom. Notice that "variance expected from sampling error" is sometimes called "WITHIN" variance or "within-subjects" variance, which indicates where it comes from.
You will see that each distribution is not symmetrical and has a peak at about F = 1.00. With degrees of freedom = 3 and 12, a calculated F-value greater than 3.49 will be a significant result (p < .05). If the calculated F-value is greater than 5.95, the result will be significant at the α = .01 level. With 2 and 9 df, the corresponding values are 4.26 and 8.02. (You will be pleased to know that there are no one-tailed tests in ANOVA.)
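These critical values can be checked numerically. A minimal sketch using SciPy's F distribution (this assumes SciPy is installed; `f.ppf` returns the quantile, i.e. the critical value, for the given numerator and denominator df):

```python
from scipy.stats import f

# Critical F-values: the point below which 95% (or 99%) of the
# F distribution lies, for the df combinations quoted in the text
print(round(f.ppf(0.95, dfn=3, dfd=12), 2))  # 3.49
print(round(f.ppf(0.99, dfn=3, dfd=12), 2))  # 5.95
print(round(f.ppf(0.95, dfn=2, dfd=9), 2))   # 4.26
print(round(f.ppf(0.99, dfn=2, dfd=9), 2))   # 8.02
```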
Variance
Variance was covered earlier but as a reminder . . .
variance = (standard deviation)²
In ANOVA terminology, variance is often called Mean Square. This is because

variance = Sums of Squares / (N − 1)

That is, variance is equal to Sums of Squares divided by N − 1. N − 1 is approximately the number of observations, so variance is an average Sums of Squares, or Mean Square for short.
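This relationship is easy to verify directly. A minimal sketch, using the five means from the table above as the data (any set of scores would do):

```python
import statistics

data = [7.0, 6.9, 11.0, 13.4, 12.0]  # the five sample means from the table

n = len(data)
mean = sum(data) / n
ss = sum((x - mean) ** 2 for x in data)  # Sums of Squares
variance = ss / (n - 1)                  # Mean Square: SS / (N - 1)

print(variance)
print(statistics.stdev(data) ** 2)  # same value: (standard deviation) squared
```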
© Copyright 2000 University of New England, Armidale, NSW, 2351. All rights reserved Maintained by Dr Ian Price 