In ANOVA, we *always* report

**F** (the F-value), **df** (degrees of freedom), and **p** (statistical significance).

We report these 3 numbers for each effect -possibly just one for a one-way ANOVA. Now, p (“Sig.” in SPSS) tells us the likelihood of finding our sample results if some **effect is zero** in our population. A zero effect means that all means are exactly equal for some factor such as gender or experimental group.

However, some effect just being *not zero* isn't too interesting, is it? What we really want to know is:
how *strong* is the effect?
We can't conclude that p = 0.05 indicates a stronger effect than p = 0.10 because both are affected by sample sizes and other factors. So how can we quantify how strong effects are for comparing them within or across analyses?

Well, there are several **measures of effect size** that tell us just that. One that's often used is (partial) eta squared, denoted as η^{2} (η is the Greek letter eta).

## Partial Eta Squared - What Is It?

Partial η^{2} is the **proportion of variance accounted for** by some effect. If you really *really* want to know:
$$partial\;\eta^2 = \frac{SS_{effect}}{SS_{effect} + SS_{error}}$$

where SS is short for “sums of squares”, the amount of dispersion in our dependent variable. This means that partial η^{2} is the variance attributable to an effect divided by the variance that *could have been* attributable to this effect.
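In code, this formula is a one-liner. Below is a minimal pure-Python sketch; the sums of squares are made-up numbers for illustration, not values from this tutorial's data.

```python
def partial_eta_squared(ss_effect, ss_error):
    """Partial eta squared: the variance attributable to an effect,
    divided by that variance plus the error variance."""
    return ss_effect / (ss_effect + ss_error)

# Hypothetical sums of squares for one effect:
print(partial_eta_squared(120.0, 480.0))  # 120 / (120 + 480) = 0.2
```

Note that the denominator is *not* the total sums of squares: sums of squares claimed by other effects in the model are left out, which is exactly what makes it "partial".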

We can easily verify this -and many more calculations- by copy-pasting SPSS’ ANOVA output into this Google Sheet as shown below.

Note that in one-way ANOVA, we only have one effect. So the variance in our dependent variable is either attributed to the effect or it is error. So for one-way ANOVA
$$partial\;\eta^2 = \frac{SS_{effect}}{SS_{total}}$$

which is equal to (non partial) η^{2}. Let's now go and get (partial) η^{2} from SPSS.
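To make the sums-of-squares decomposition concrete, here is a small pure-Python sketch of a one-way ANOVA on made-up happiness scores (not the happy.sav data). It verifies that SS_total = SS_effect + SS_error and that partial η^{2} therefore equals plain η^{2} when there's only one factor.

```python
def one_way_eta_squared(groups):
    """groups: a list of lists of scores, one inner list per factor level.
    Returns (ss_effect, ss_error, ss_total, eta2, partial_eta2)."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    # Between-groups ("effect") sums of squares:
    ss_effect = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-groups ("error") sums of squares:
    ss_error = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    ss_total = sum((x - grand_mean) ** 2 for x in all_scores)
    eta2 = ss_effect / ss_total
    partial_eta2 = ss_effect / (ss_effect + ss_error)
    return ss_effect, ss_error, ss_total, eta2, partial_eta2

# Made-up happiness ratings for 3 employment groups:
groups = [[70, 75, 80], [60, 65, 70], [50, 55, 60]]
ss_e, ss_err, ss_t, eta2, p_eta2 = one_way_eta_squared(groups)
print(eta2, p_eta2)  # identical for one-way ANOVA
```

With more than one factor, SS_total also contains sums of squares from the other effects, so the two versions part ways.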

## Example: Happiness Study

A scientist asked 120 people to rate their own happiness on a 100 point scale. Some other questions were employment status, marital status and health. The data thus collected are in happy.sav, part of which is shown below.

We're especially interested in the **effect of employment on happiness**: (how) are they associated and does the association depend on health or marital status too? Let's first just examine employment with a one-way ANOVA.

## Eta Squared for One-Way ANOVA - Option 1

SPSS offers several options for running a one-way ANOVA and many students start off with the One-Way ANOVA dialog. However,
η^{2} is **completely absent from this dialog**.

We'll therefore use MEANS instead as shown below.

Clicking the Paste button results in the syntax below. Since it's way longer than necessary, I prefer just typing a short version that yields identical results. Let's run it.

## SPSS Syntax for Eta Squared from MEANS

**Means command as pasted from the menu.**

MEANS TABLES=happy BY employed
/CELLS=MEAN COUNT STDDEV
/STATISTICS ANOVA.

**Short version (creates identical output).**

means happy by employed
/statistics.

## Result

And there we have it: η^{2} = 0.166: some 17% of all variance in happiness is attributable to employment status. I'd say it's not an awful lot but certainly not negligible.

Note that SPSS labels this output “**Measures of Association**” rather than “effect size”. It could be argued that these terms are interchangeable, but the inconsistency is somewhat confusing anyway.

## Eta Squared for One-Way ANOVA - Option 2

Perhaps the **best way to run ANOVA** in SPSS is from the univariate GLM dialog. The screenshots below guide you through.

This results in the syntax shown below. Let's run it and see what happens.

## SPSS Syntax for Eta Squared from UNIANOVA

**One-way ANOVA with eta squared as pasted from Analyze - General Linear Model - Univariate.**

UNIANOVA happy BY employed
/METHOD=SSTYPE(3)
/INTERCEPT=INCLUDE
/PRINT=ETASQ
/CRITERIA=ALPHA(.05)
/DESIGN=employed.

## Result

We find partial η^{2} = 0.166. It was previously denoted as just η^{2} but these are identical for one-way ANOVA as already discussed.

## Partial Eta Squared for Multiway ANOVA

For multiway ANOVA -involving more than 1 factor- we can get partial η^{2} from GLM univariate as shown below.

As shown below, we now just add multiple independent variables (“fixed factors”). We then tick **Estimates of effect size** under **Options** and we're good to go.

## Partial Eta Squared Syntax Example

**Two-way ANOVA with partial eta squared. Pasted from Analyze - General Linear Model - Univariate.**

UNIANOVA happy BY employed healthy
/METHOD=SSTYPE(3)
/INTERCEPT=INCLUDE
/PRINT=ETASQ
/CRITERIA=ALPHA(.05)
/DESIGN=employed healthy employed*healthy.

## Result

First off, both main effects (employment and health) and the interaction between them are statistically significant. The effect of employment (partial η^{2} = 0.095) is roughly twice as strong as that of health (partial η^{2} = 0.048). And so on.

Note that you **couldn't possibly conclude this from their p-values** (p = 0.003 for employment and p = 0.018 for health). Although the effects are clearly *statistically* significant, the effect sizes are only moderate. We typically see this pattern with larger sample sizes.
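A quick way to see why p-values and effect sizes must be kept apart: duplicating a dataset leaves η^{2} untouched but inflates F (and thereby shrinks p). The pure-Python sketch below uses made-up scores, not the tutorial's data.

```python
def f_and_eta2(groups):
    """Return the one-way ANOVA F statistic and eta squared
    for a list of lists of scores (one inner list per group)."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    ss_effect = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_error = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_effect = len(groups) - 1
    df_error = len(all_scores) - len(groups)
    f = (ss_effect / df_effect) / (ss_error / df_error)
    return f, ss_effect / (ss_effect + ss_error)

small = [[70, 75, 80], [60, 65, 70]]
big = [g * 10 for g in small]  # same group means, 10x the sample size

f_small, eta_small = f_and_eta2(small)
f_big, eta_big = f_and_eta2(big)
print(eta_small == eta_big, f_big > f_small)  # eta squared unchanged, F much larger
```

So a smaller p only tells us the effect is more *detectable*, not that it's *stronger*.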

Last, we shouldn't really interpret our main effects because the interaction effect is statistically significant: F(2,114) = 4.9, p = 0.009. As explained in SPSS Two Way ANOVA - Basics Tutorial, we'd better inspect **simple effects** instead of main effects.

## Conclusion

We can get (partial) η^{2} for both one-way and multiway ANOVA from the GLM Univariate dialog, but it's restricted to one dependent variable at a time. Generally, I'd say **this is the way to go** for any ANOVA because it's the only option that gets us all the output we generally need -including post hoc tests and Levene's test.

We can run multiple one-way ANOVAs with η^{2} in one go with MEANS, but it **lacks important options** such as post hoc tests and Levene's test. These -but not η^{2}- are available from the One-Way ANOVA dialog. This renders both options rather inconvenient unless you need a very basic analysis.

Last, several authors prefer a different measure of effect size called **ω^{2}** (“omega squared”). Unfortunately, this seems completely absent from SPSS. For now at least, I guess η^{2} will have to do...

I hope you found this tutorial helpful. Thanks for reading!

## THIS TUTORIAL HAS 24 COMMENTS:

## By Ruben Geert van den Berg on April 2nd, 2020

Hi Farhad!

Great question! For a one-way ANOVA, eta-squared is equal to R-squared from running the same ANOVA as regression with dummy variables.

For factorial ANOVA, this doesn't have to be the case: if the factors are correlated, then eta-square for different factors don't add up to R-square for the entire model. This is because different factors account for some of the same variance in the outcome variable.

If factors are not correlated in ANOVA -mostly the case with controlled conditions or stratified sampling- their eta-squares *do* add up to R-square.

Keep in mind that any ANOVA and ANCOVA can be run as regression with dummy variables. These 2 techniques are both special cases of the General Linear Model.
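The one-way case of this equivalence is easy to check numerically: a dummy-coded regression on a single factor just predicts each case by its group mean, so R-square can be computed without any matrix algebra. A pure-Python sketch on made-up scores:

```python
def r_squared_group_means(groups):
    """R squared when each score is predicted by its group mean,
    i.e. the dummy-variable regression for a single factor."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    ss_total = sum((x - grand_mean) ** 2 for x in all_scores)
    # Residuals of the dummy regression are deviations from group means:
    ss_resid = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return 1 - ss_resid / ss_total

def eta_squared(groups):
    """Plain eta squared for a one-way ANOVA: SS_effect / SS_total."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    ss_effect = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_total = sum((x - grand_mean) ** 2 for x in all_scores)
    return ss_effect / ss_total

groups = [[70, 75, 80], [60, 65, 70], [50, 55, 60]]
print(r_squared_group_means(groups), eta_squared(groups))  # both 0.8
```

With multiple correlated factors this shortcut no longer works, for exactly the reason given above: the factors compete for the same variance.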

Hope that helps!

SPSS tutorials

## By Farhad, Marağa on April 8th, 2020

Dear Ruben,

Thank you for all your help and support.

A popular guideline (.02 = small; .13 = medium; .26 = large) for eta squared interpretation is offered in a YouTube video on eta squared. Is it for ANOVA? Or multiple regression? I got confused after watching it. Thank you very much.

## By Ruben Geert van den Berg on April 9th, 2020

Hi Farhad!

That guideline is rubbish. For the correct guideline, see Effect Size – A Quick Guide.

This refers to ANOVA. Regression uses different measures.

Hope that helps!

SPSS tutorials

## By Olusoji Olumide Odukoya on May 5th, 2020

Found the tutorial extremely useful. Many thanks to those involved in its development.

## By Mohammad on February 19th, 2021

I have a question, thank you for guiding me.

The most common calculations, such as Cohen's d, only compare the means of 2 groups. How does one calculate the effect sizes between multiple groups without having to resort to several individual calculations?