
Chapter 13: Comparing Several Means (One-Way ANOVA)



Page 1: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Chapter 13: Comparing Several Means (One-Way ANOVA)

Page 2: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

In Chapter 13:

13.1 Descriptive Statistics

13.2 The Problem of Multiple Comparisons

13.3 Analysis of Variance

13.4 Post Hoc Comparisons

13.5 The Equal Variance Assumption

13.6 Introduction to Nonparametric Tests

Page 3: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Illustrative Example: Data

Pets as moderators of a stress response. This chapter follows the analysis of data from a study in which heart rates (bpm) of participants were monitored after exposure to a psychological stressor. Participants were randomized to one of three groups:

• Group 1 - monitored in the presence of a pet dog
• Group 2 - monitored in the presence of a human friend
• Group 3 - monitored with neither dog nor human friend present

Page 4: Chapter 13:  Comparing Several Means  (One-Way ANOVA)


Illustrative Example: Data

Page 5: Chapter 13:  Comparing Several Means  (One-Way ANOVA)


SPSS Data Table

• Most computer programs require data in two columns

• One column is for the explanatory variable (group)

• One column is for the response variable (hrt_rate)

Page 6: Chapter 13:  Comparing Several Means  (One-Way ANOVA)


13.1 Descriptive Statistics

• Data are described and explored before moving to inferential calculations

• Here are summary statistics by group:

Page 7: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Exploring Group Differences

• John Tukey taught us the importance of exploratory data analysis (EDA)
• EDA techniques that apply:
  – Stemplots
  – Boxplots
  – Dotplots

John W. Tukey (1915-2000)

Page 8: Chapter 13:  Comparing Several Means  (One-Way ANOVA)


Side-by-Side Stemplots

Page 9: Chapter 13:  Comparing Several Means  (One-Way ANOVA)


Side-by-Side Boxplots

Page 10: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

§13.2 The Problem of Multiple Comparisons

• There are three possible t tests when comparing three groups:
  (1) H0: μ1 = μ2 versus Ha: μ1 ≠ μ2
  (2) H0: μ1 = μ3 versus Ha: μ1 ≠ μ3
  (3) H0: μ2 = μ3 versus Ha: μ2 ≠ μ3
• However, we do not perform separate t tests without modification → this would flag too many differences that are due to chance alone

Page 11: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Problem of Multiple Comparisons

• Family-wise error rate = probability of at least one false rejection of H0
• Assume three null hypotheses are true. At α = 0.05, Pr(retain all three H0s) = (1 − 0.05)³ = 0.857. Therefore, Pr(reject at least one) = 1 − 0.857 = 0.143; this is the family-wise error rate.
• The family-wise error rate is much greater than the intended α. This is "The Problem of Multiple Comparisons."

Page 12: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Problem of Multiple Comparisons

The more comparisons you make, the greater the family-wise error rate. The slide's table (reproduced by the sketch below) demonstrates the magnitude of the problem.
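As a rough check on the table's magnitudes, here is a minimal Python sketch (Python is an assumption here; the chapter itself uses SPSS) that applies the same simplification used above, namely independent tests each at α = 0.05:

```python
# Family-wise error rate for c independent tests, each at level alpha:
# FWER = 1 - (1 - alpha)^c
alpha = 0.05
for c in (1, 3, 6, 10, 20):
    fwer = 1 - (1 - alpha) ** c
    print(f"{c:2d} comparisons -> family-wise error rate = {fwer:.3f}")
```

With c = 3 this reproduces the 0.143 computed on the previous slide.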

Page 13: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Mitigating the Problem of Multiple Comparisons

Two-step approach:

1. Test for overall significance using a technique called "Analysis of Variance"
2. Do post hoc comparisons on the individual groups

Page 14: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

13.3 Analysis of Variance

• One-way ANalysis Of VAriance (ANOVA)
  – Categorical explanatory variable
  – Quantitative response variable
  – Tests group means for a significant difference
• Statistical hypotheses
  – H0: μ1 = μ2 = … = μk
  – Ha: at least one of the μi differs
• Method: compare variability between groups to variability within groups (F statistic)

Page 15: Chapter 13:  Comparing Several Means  (One-Way ANOVA)


Analysis of Variance Overview, cont.

R. A. Fisher (1890-1962)

The F in the F statistic stands for “Fisher”

Page 16: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Variability Between Groups

• Variability of group means around the grand mean → provides a "signal" of group difference
• Based on a statistic called the Mean Square Between (MSB)
• Notation:
  SSB ≡ sum of squares between
  dfB ≡ degrees of freedom between
  k ≡ number of groups
  x̄ ≡ grand mean
  x̄i ≡ mean of group i

Page 17: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Mean Square Between: Formula

• Sum of Squares Between [Groups]: $SS_B = \sum_{i=1}^{k} n_i(\bar{x}_i - \bar{x})^2$

• Degrees of Freedom Between: $df_B = k - 1$

• Mean Square Between: $MSB = \frac{SS_B}{df_B}$

Page 18: Chapter 13:  Comparing Several Means  (One-Way ANOVA)


Mean Square Between: Graphically

Page 19: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Mean Square Between: Example

For the "pets" data (see the ANOVA table on Page 29): SSB = 2387.685, dfB = 3 − 1 = 2, and MSB = 2387.685 / 2 = 1193.843.

Page 20: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Variability Within Groups

• Variability of data points within groups → quantifies random "noise"
• Based on a statistic called the Mean Square Within (MSW)
• Notation:
  SSW ≡ sum of squares within
  dfW ≡ degrees of freedom within
  N ≡ sample size, all groups combined
  ni ≡ sample size, group i
  s²i ≡ variance of group i

Page 21: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Mean Square Within: Formula

• Sum of Squares Within: $SS_W = \sum_{i=1}^{k} (n_i - 1)s_i^2$

• Degrees of Freedom Within: $df_W = N - k$

• Mean Square Within: $MSW = \frac{SS_W}{df_W}$
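To make the formulas concrete, here is a short Python sketch (a minimal illustration with made-up groups, not the study's data) that computes SSB, MSB, SSW, and MSW directly from the definitions above:

```python
from statistics import mean, variance

# Hypothetical example groups (not the actual "pets" data)
groups = [[68, 72, 75, 80], [85, 90, 88, 94], [77, 81, 79, 84]]

k = len(groups)                          # number of groups
N = sum(len(g) for g in groups)          # combined sample size
grand_mean = mean([x for g in groups for x in g])

# SSB = sum of n_i * (xbar_i - grand mean)^2; dfB = k - 1
ssb = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
msb = ssb / (k - 1)

# SSW = sum of (n_i - 1) * s_i^2; dfW = N - k
ssw = sum((len(g) - 1) * variance(g) for g in groups)
msw = ssw / (N - k)

print(f"MSB = {msb:.2f}, MSW = {msw:.2f}, Fstat = {msb / msw:.2f}")
```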

Page 22: Chapter 13:  Comparing Several Means  (One-Way ANOVA)


Mean Square Within: Graphically

Page 23: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Mean Square Within: Example

For the "pets" data: SSW = 3561.309, dfW = 45 − 3 = 42, and MSW = 3561.309 / 42 = 84.793.

Page 24: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

The F statistic and ANOVA table

• Data are arranged to form an ANOVA table
• The F statistic is the ratio of the MSB to the MSW:

$F_{stat} = \frac{MSB}{MSW} = \frac{1193.843}{84.793} = 14.08$

• The Fstat is a "signal-to-noise" ratio

Page 25: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Fstat and P-value

• The Fstat has numerator and denominator degrees of freedom: df1 and df2, respectively (corresponding to dfB and dfW)
• Convert the Fstat to a P-value with a computer program or Table D
• The P-value corresponds to the area in the right tail beyond the observed Fstat
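For example, with a program such as Python's scipy (an assumption; the text itself uses SPSS or Table D), the right-tail area can be computed directly:

```python
from scipy.stats import f

# Right-tail area beyond the observed Fstat, with df1 = 2 and df2 = 42
p_value = f.sf(14.08, dfn=2, dfd=42)   # sf = survival function = 1 - cdf
print(f"P = {p_value:.6f}")            # about 0.000021
```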

Page 26: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Table D ("F Table")

• The F table has limited listings for df2.
• You often must round down to the next available df2 (rounding down is preferable because it gives a conservative estimate).
• Wedge the Fstat between listings to find the approximate P-value.

Example: df1 = 2, df2 = 42. Table D does not have a df2 of 42; the next lowest df2 is 30. Fstat = 14.08 exceeds the largest critical value listed, so P < 0.001.

Page 27: Chapter 13:  Comparing Several Means  (One-Way ANOVA)


Fstat and P-value

P < 0.001

Page 28: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

ANOVA Example (Summary)

A. Hypotheses: H0: μ1 = μ2 = μ3 vs. Ha: at least one of the μi differs

B. Statistics: Fstat = 14.08 with 2 and 42 degrees of freedom

C. P-value = .000021 (via SPSS), providing highly significant evidence against H0; conclude that mean heart rates (an indicator of the effects of stress) differed among the groups

D. Significance level (optional): Results are significant at α = .00005

Page 29: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Computation

Because of the complexity of the computations, ANOVA statistics are often calculated by computer. SPSS output for the "pets" data:

ANOVA: beats per minute

                 Sum of Squares   df   Mean Square       F    Sig.
Between Groups         2387.685    2      1193.843  14.079    .000
Within Groups          3561.309   42        84.793
Total                  5948.994   44
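For readers working outside SPSS, here is a minimal sketch of the same kind of analysis in Python's scipy (the heart-rate values below are hypothetical placeholders, not the study's data):

```python
from scipy.stats import f_oneway

# Hypothetical heart-rate samples (bpm) for the three groups;
# placeholders, not the study's actual data
dog    = [69.2, 70.5, 75.1, 66.8, 74.3]
friend = [91.3, 88.2, 94.7, 90.1, 89.5]
alone  = [82.4, 85.6, 80.1, 84.2, 83.0]

fstat, p = f_oneway(dog, friend, alone)   # one-way ANOVA F test
print(f"F = {fstat:.3f}, P = {p:.6f}")
```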

Page 30: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

ANOVA and the t test (Optional)

ANOVA for two groups is equivalent to the equal variance (pooled) t test (§12.4):

• Both address H0: μ1 = μ2
• $df_W$ = df for the t test = $N - 2$
• $MSW = s^2_{pooled}$
• $F_{stat} = (t_{stat})^2$
• $F_{1,df_2,\alpha} = (t_{df_2,1-\alpha/2})^2$

Page 31: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

13.4 Post Hoc Comparisons

• The ANOVA Ha says "at least one population mean differs" but does not delineate which ones differ.
• Post hoc comparisons are pursued after rejection of the ANOVA H0 to delineate which means differ.

Page 32: Chapter 13:  Comparing Several Means  (One-Way ANOVA)


SPSS Post Hoc Comparison Procedures

Many post hoc comparison procedures exist. We cover the LSD and Bonferroni methods.

Page 33: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Least Significant Difference (LSD) Procedure

A. Hypotheses. H0: μi = μj vs. Ha: μi ≠ μj for each pair of groups i and j

B. Test statistic:

$t_{stat} = \frac{\bar{x}_i - \bar{x}_j}{SE}$, where $SE = \sqrt{MSW\left(\frac{1}{n_i} + \frac{1}{n_j}\right)}$ and $df = N - k$

C. P-value. Use a t table or software.

Do this after a significant ANOVA to protect against the problem of multiple comparisons.

Page 34: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

LSD Procedure: Example

For the "pets" illustrative data, we test H0: μ1 = μ2 by hand. The other tests will be done by computer.

A. Hypotheses. H0: μ1 = μ2 against Ha: μ1 ≠ μ2

B. Test statistic:

$SE = \sqrt{84.793\left(\frac{1}{15} + \frac{1}{15}\right)} = 3.362$

$t_{stat} = \frac{73.483 - 91.325}{3.362} = -5.31$ with $df = 45 - 3 = 42$

C. P-value. P = 0.0000039; highly significant evidence of a difference.
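The same hand calculation can be scripted. Here is a minimal Python sketch using the quantities above (scipy is an assumption; any t-distribution routine would do):

```python
from math import sqrt
from scipy.stats import t

# Quantities from the ANOVA above
msw, df = 84.793, 42                      # MSW and dfW = N - k
xbar1, xbar2, n1, n2 = 73.483, 91.325, 15, 15

se = sqrt(msw * (1 / n1 + 1 / n2))        # SE = sqrt(MSW(1/n_i + 1/n_j)) = 3.362
tstat = (xbar1 - xbar2) / se              # = -5.31
p = 2 * t.sf(abs(tstat), df)              # two-sided P-value
print(f"SE = {se:.3f}, t = {tstat:.2f}, P = {p:.7f}")
```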

Page 35: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

LSD Procedure, SPSS

Results for the illustrative "pets" data, one row per pairwise test:

H0: μ1 = μ2
H0: μ1 = μ3
H0: μ2 = μ3

Page 36: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

95% Confidence Interval, Mean Difference, LSD Method

$(1-\alpha)100\%$ CI for $\mu_i - \mu_j = (\bar{x}_i - \bar{x}_j) \pm t_{N-k,\,1-\frac{\alpha}{2}} \cdot \sqrt{MSW\left(\frac{1}{n_i} + \frac{1}{n_j}\right)}$

Page 37: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

95% CI, LSD Method, Example

Comparing Group 1 to Group 2:

95% CI for $\mu_1 - \mu_2 = (73.483 - 91.325) \pm t_{42,\,.975}\sqrt{84.793\left(\frac{1}{15}+\frac{1}{15}\right)}$
$= -17.842 \pm (2.021)(3.362)$
$= (-24.6,\ -11.0)$

Page 38: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Bonferroni Procedure

The Bonferroni procedure is implemented by multiplying the P-value from the LSD procedure by the number of post hoc comparisons c.

A. Hypotheses. H0: μ1 = μ2 against Ha: μ1 ≠ μ2

B. Test statistic. Same as for the LSD method.

C. P-value. The LSD method produced P = .0000039 (two-tailed). Since there were three post hoc comparisons, PBonf = 3 × .0000039 = .000012.

Page 39: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Bonferroni Confidence Interval

Let c represent the number of post hoc comparisons. Comparing Group 1 to Group 2:

95% CI for $\mu_1 - \mu_2 = (\bar{x}_1 - \bar{x}_2) \pm t_{N-k,\,1-\frac{\alpha}{2c}}\sqrt{MSW\left(\frac{1}{n_1}+\frac{1}{n_2}\right)}$
$= (73.483 - 91.325) \pm t_{42,\,.9917}\sqrt{84.793\left(\frac{1}{15}+\frac{1}{15}\right)}$
$= -17.842 \pm (2.51)(3.362)$
$= (-26.3,\ -9.4)$
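A sketch comparing the two intervals in Python (again assuming scipy; the book's table value t = 2.021 differs slightly from the exact quantile, so endpoints may differ in the last digit):

```python
from math import sqrt
from scipy.stats import t

msw, df, c = 84.793, 42, 3                   # c = number of post hoc comparisons
diff = 73.483 - 91.325                       # xbar1 - xbar2 = -17.842
se = sqrt(msw * (1 / 15 + 1 / 15))           # = 3.362

t_lsd  = t.ppf(0.975, df)                    # ~2.02 (no adjustment)
t_bonf = t.ppf(1 - 0.05 / (2 * c), df)       # ~2.50; alpha split across c tests

print(f"LSD 95% CI:        ({diff - t_lsd  * se:.1f}, {diff + t_lsd  * se:.1f})")
print(f"Bonferroni 95% CI: ({diff - t_bonf * se:.1f}, {diff + t_bonf * se:.1f})")
```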

Page 40: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Bonferroni Procedure, SPSS

P-values from the Bonferroni method are higher, and confidence intervals broader, than those from the LSD method, reflecting its more conservative approach.

Page 41: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

§13.5 The Equal Variance Assumption

• Conditions for ANOVA:
  1. Sampling independence
  2. Normal sampling distributions of the means
  3. Equal variance within population groups
• Let us focus on condition 3, since conditions 1 and 2 are covered elsewhere.
• Equal variance is called homoscedasticity. (Unequal variance = heteroscedasticity.)
• Homoscedasticity allows us to pool the group variances to form the MSW.

Page 42: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Assessing "Equal Variance"

1. Graphical exploration. Compare spreads visually with side-by-side plots.
2. Descriptive statistics. If one group's standard deviation is more than twice that of another, be alert to possible heteroscedasticity.
3. Test the variances. A statistical test can be applied (next slide).

Page 43: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Levene's Test of Variances

A. Hypotheses. H0: σ²1 = σ²2 = … = σ²k vs. Ha: at least one σ²i differs

B. Test statistic. The test is performed by computer. The test statistic is a particular type of Fstat based on rank-transformed deviations (see p. 283 for details).

C. P-value. The Fstat is converted to a P-value by the computational program. Interpretation of P is routine: a small P provides evidence against H0, suggesting heteroscedasticity.

Page 44: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Levene's Test – Example ("pets" data)

A. H0: σ²1 = σ²2 = σ²3 versus Ha: at least one σ²i differs

B. SPSS output (below). Fstat = 0.059 with 2 and 42 df

C. P = 0.943. Very weak evidence against H0 → retain the assumption of homoscedasticity.

Test of Homogeneity of Variances: beats per minute

Levene Statistic   df1   df2   Sig.
            .059     2    42   .943
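A minimal sketch of the same test in Python's scipy, using the hypothetical samples from earlier (note that scipy's `levene` defaults to median-centering, the Brown-Forsythe variant, so `center='mean'` is passed here for the classic Levene statistic; whether this matches SPSS's variant exactly is not established here):

```python
from scipy.stats import levene

# Hypothetical heart-rate samples, as before (not the study data)
dog    = [69.2, 70.5, 75.1, 66.8, 74.3]
friend = [91.3, 88.2, 94.7, 90.1, 89.5]
alone  = [82.4, 85.6, 80.1, 84.2, 83.0]

# center='mean' gives the classic Levene statistic;
# scipy's default, center='median', is the Brown-Forsythe variant
stat, p = levene(dog, friend, alone, center='mean')
print(f"Levene F = {stat:.3f}, P = {p:.3f}")   # large P -> homoscedasticity is tenable
```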

Page 45: Chapter 13:  Comparing Several Means  (One-Way ANOVA)


Analyzing Groups with Unequal Variance

• Stay descriptive. Use summary statistics and EDA methods to compare groups.

• Remove outliers, if appropriate (p. 287).

• Mathematically transform the data to compensate for heteroscedasticity (e.g., a long right tail can be pulled in with a log transform).

• Use robust non-parametric methods.

Page 46: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

13.6 Intro to Nonparametric Methods

Many nonparametric procedures are based on rank-transformed data ("rank tests"); the slide's table lists examples.

Page 47: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

The Kruskal-Wallis Test

• Let us explore the Kruskal-Wallis test as an example of a nonparametric test.
• The Kruskal-Wallis test is the nonparametric analogue of one-way ANOVA.
• It does not require the Normality or equal variance conditions for inference.
• It is based on rank-transformed data and tests whether the mean ranks differ significantly across groups.

Page 48: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Kruskal-Wallis Test

• The K-W hypotheses can be stated in terms of means or medians (depending on assumptions made about the population shapes). Let us use the latter.
• Let Mi ≡ the median of population i
• There are k groups
• H0: M1 = M2 = … = Mk
• Ha: at least one Mi differs

Page 49: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Kruskal-Wallis, Example

Alcohol and income. Data from a survey on alcohol consumption and income are presented.

Page 50: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Kruskal-Wallis Test, Example

We wish to test whether the means differ significantly, but graphical exploration and hypothesis testing indicate that the population variances are unequal:

Test of Homogeneity of Variances: Alcohol consumption

Levene Statistic   df1   df2   Sig.
          10.874     4   708   .000

Page 51: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Kruskal-Wallis Test, Example, cont.

A. Hypotheses. H0: M1 = M2 = M3 = M4 = M5 vs. Ha: at least one Mi differs

B. Test statistic. Computer programs use a chi-square approximation to the sampling distribution of the test statistic. SPSS derives a chi-square statistic = 7.793 with 4 df (next slide).

Page 52: Chapter 13:  Comparing Several Means  (One-Way ANOVA)

Kruskal-Wallis Test, Example, cont.

Test Statistics (Kruskal-Wallis test; grouping variable: Income)

Alcohol consumption
Chi-Square     7.793
df                 4
Asymp. Sig.     .099

Ranks: Alcohol consumption by Income group

Income      N   Mean Rank
1          46      303.58
2          88      344.68
3         140      385.17
4         250      345.26
5         189      370.40
Total     713

P = 0.099, providing marginally significant evidence against H0.
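To close, a minimal Python sketch of the Kruskal-Wallis test (hypothetical consumption values for three income groups, not the survey's five):

```python
from scipy.stats import kruskal

# Hypothetical alcohol-consumption values (drinks/week) for three
# income groups; the survey itself had five groups
low  = [0, 2, 5, 1, 3, 8, 0, 4]
mid  = [1, 6, 9, 2, 7, 3, 5, 10]
high = [4, 12, 7, 9, 15, 6, 8, 11]

hstat, p = kruskal(low, mid, high)   # H is approximately chi-square with k - 1 df
print(f"H = {hstat:.3f}, P = {p:.3f}")
```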