
Higher Education Study Comparing Clickers to Paper


DESCRIPTION

A poster presentation comparing paper-based testing to clicker-based testing.



[Figure 1: Student Preference. % of total students: Clicker 46.6 %, No Preference 15.5 %, Scantron 37.9 %]

MAJOR REFERENCES

1) Ainuson, K. 2008. The Effectiveness of Personal Response Systems as a Classroom Technology Tool at Clemson University. Paper presented at the 2008 APSA Teaching and Learning Conference, San Jose, CA.

2) Bonwell, C. C., and J. A. Eison. 1991. Active Learning: Creating Excitement in the Classroom. ERIC Digest (ED340272).

3) Wood, W. B. 2004. Clickers: A Teaching Gimmick that Works. Developmental Cell 7 (December): 796–8 (doi: 10.1016/j.devcel.2004.11.004).

4) MacArthur, J. R., and L. J. Jones. 2008. A review of literature reports of clickers applicable to college chemistry classrooms. Chemistry Education Research and Practice 9: 187–95 (doi: 10.1039/B812407H).

Pooja S. Jagadish¹ and Stacey E. Wild²
¹BA, College of Arts and Science, Vanderbilt University, Nashville, TN; ²Ph.D., Department of Biological Sciences, East Tennessee State University, Johnson City, TN

RESULTS

Summer 2010 and Fall 2010 quiz data, averaged across students: two-tailed, paired t-test, p = 0.0953 (not statistically significant). Spring 2012 survey results are shown in Figures 1–3.
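The poster reports only the outcome of this test; the analysis itself was run in Microsoft Excel 2010 and Wolfram Mathematica. Purely as an illustrative sketch (not the authors' workflow), the same two-tailed paired comparison can be set up in Python with SciPy on hypothetical per-student average quiz scores:

```python
# Illustrative sketch only: the score arrays below are hypothetical, not study data.
from scipy import stats

# One entry per student: average quiz score under each response method
clicker_avg  = [82.5, 74.0, 91.0, 68.5, 88.0, 79.5]
scantron_avg = [80.0, 76.5, 92.5, 70.0, 85.5, 81.0]

# Two-tailed, paired t-test (each student serves as their own control)
t_stat, p_value = stats.ttest_rel(clicker_avg, scantron_avg)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")  # compare against the p < .05 threshold
```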

CLICKER VS. SCANTRON: ASSESSING STUDENT ACHIEVEMENT BY QUIZZING METHOD AND RATIONALE FOR STUDENT PREFERENCE OF METHOD

PURPOSE

This study (a) determined whether students perform significantly better when answering questions on a Scantron vs. a Clicker, and (b) surveyed student preference for one response method over the other and the reasoning behind any existing preferences.

[Figure 2: Student Preference by Final Grade. % of total students by final grade (A, B, C, Other) and preferred response method (Scantron, No Preference, Clicker)]

ACKNOWLEDGEMENTS

◊ Samuel S. Marinelli, Michigan State University

◊ Dr. Edith Seier, Dept. of Mathematics, ETSU

◊ Dr. Alicia Bray, Tennessee Tech University

◊ Dr. Tina Rooks, Turning Technologies

METHODS & MATERIALS

CLASSROOM MATERIALS (images within Figure 1):
• Scantron Corporation 883-E Series bubble forms
• Turning Technologies ResponseCard XR or NXT “Clickers”: battery-operated devices that communicate via a radio or infrared signal with a receiver connected to the instructor’s computer (4)

STATISTICAL SOFTWARE: Microsoft Excel 2010; Wolfram Mathematica. Significance threshold: p < .05

Figure 1 (Top): Breakdown of student preference by response method.
Figure 2 (Middle): Final grade organized by preference of response method. “Other” includes D, F, and failure due to non-attendance (NF).
Figure 3 (Bottom): Reasoning behind student preference of Clicker or Scantron. Students used seven key terms to describe their preference: Easy, Fast, Tangible, Safe/Confident, Comfortable, Fun/Different, and Economical. See Table 1 (Top Right) for a description of each term.

[Figure 3: Reasoning by Preference for Spring 2012 Class. Counts of each reason cited, grouped by preferred method (Scantron Form vs. Clicker Device)]

SUMMER 2010 (n = 23): 19 quizzes = 4 Scantron + 5 Clicker + 10 assigned* Clicker OR Scantron

FALL 2010 (n = 98): 8 quizzes = 1 Scantron + 1 Clicker + 6 assigned* Clicker OR Scantron

Goal of Quizzes: Determine whether students fare better when responding via one method over the other.
*Method of response assigned by a random table generator (sketched below); students did not know the method until test time.

SPRING 2012 (n = 116): Online Google Forms survey, open only to the Spring 2012 class.
Goal of Survey: Assess student preference for the Clicker, the Scantron, or No Preference, and the reasoning behind any existing preferences.
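The random assignment of a response method to the mixed ("assigned") quizzes can be mimicked with a short script. This is only a sketch under stated assumptions: the study used a random table generator, and Python's random.choice (with an arbitrary seed) stands in for it here.

```python
# Sketch: randomly assign a response method to each "assigned" quiz.
# The study used a random table generator; random.choice is a stand-in for it.
import random

random.seed(2010)  # arbitrary seed, only so this example is repeatable

assigned_quizzes = 10  # e.g., Summer 2010: 10 of the 19 quizzes were randomly assigned
assignments = [random.choice(["Clicker", "Scantron"]) for _ in range(assigned_quizzes)]
print(assignments)  # students learned the method only at test time
```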

1. Easy:
   a. Straightforward to use
   b. Fewer items to remember to bring to test day
   c. Less clutter on desk
   d. Facilitates grading for professor

2. Fast:
   a. Faster initial input of answer
   b. Speed of return of grades
   c. Speed of feedback promotes class discussion
   d. Less time for method distribution

3. Tangible:
   a. Visual facilitates answer change
   b. Visual facilitates keeping track of place on quiz

4. Safe/Confident:
   a. Dissuades cheating and dishonesty
   b. Less prone to user error
   c. Less prone to grading error
   d. Less prone to submission error
   e. Lower probability of unforeseen problems (i.e., technology failures)
   f. Not feeling rushed

5. Comfortable:
   a. Familiarity with method
   b. Ability to use device competently
   c. General discomfort with technology
   d. Past experiences with method
   e. Greater physical comfort and better posture

6. Fun/Different:
   a. Not feeling test-like
   b. Novelty of method

7. Economical:
   a. Total expenditure of resources
   b. Total cost of material(s)

Table 1: Summary of Key Terms from the 2012 Student Survey

CONCLUSIONS

There is no difference in student achievement by method of response.

Based on chi-squared tests (reproduction sketch below):
• Students are statistically significantly more likely to have a preferred response method than not (p = 1.10 × 10⁻¹³).
• A statistically significant number of students preferred the Scantron for being Tangible (p = 1.06 × 10⁻³) and Comfortable (p = 1.07 × 10⁻³).
• A statistically significant number of students preferred the Clicker for being Easy (p = 8.87 × 10⁻⁵).

Based on Fisher's exact tests:
• A statistically significant number of students preferred the Clicker for being Fast (p = 0.00113) and Economical (p = 0.0137).
• No statistical significance was determined for Fun/Different (p = 1).
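As a hedged illustration of how such tests can be reproduced (the original analysis used Microsoft Excel 2010 and Wolfram Mathematica, not Python): the preference counts below are back-calculated from Figure 1 and n = 116 (46.6 %, 37.9 %, and 15.5 % correspond to 54 Clicker, 44 Scantron, and 18 No Preference students), while the 2 × 2 table for the Fisher's exact example uses hypothetical counts, since the per-reason tallies are not fully recoverable here.

```python
# Sketch of the reported significance tests; not the authors' original workflow.
from scipy import stats

# Preference counts inferred from Figure 1 percentages and n = 116:
# 54 Clicker + 44 Scantron = 98 students with a preference vs. 18 with no preference.
observed = [98, 18]
expected = [58, 58]  # null hypothesis: having a preference is a 50/50 coin flip
chi2, p = stats.chisquare(f_obs=observed, f_exp=expected)
print(f"chi-squared = {chi2:.2f}, p = {p:.2e}")  # ~1.1e-13, matching the reported value

# Fisher's exact test on a reason-by-preference 2x2 table (hypothetical counts):
# rows = cited / did not cite "Fast"; columns = preferred Clicker / preferred Scantron
table = [[18, 3],
         [36, 41]]
odds_ratio, p_fast = stats.fisher_exact(table, alternative="two-sided")
print(f"Fisher's exact p = {p_fast:.4f}")
```

One design note: a chi-squared goodness-of-fit comparison needs reasonably large expected counts, while Fisher's exact test remains valid for small per-reason counts, which may explain the split between the two families of tests reported above.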

INTRODUCTION

Traditional Assessment: paper-based bubble sheets (Scantrons)

Problems with Traditional Assessment:
• Dwindling financial resources + increasing university enrollment = larger lecture-based classes → decreased participation + passive learning of course materials (1, 2).
• Difficult to assess student comprehension until standardized tests are mechanically graded (1, 3).

Newer Solution: Student Response Systems (“Clickers”)

Pros of Clickers:
• Promote active learning through paperless, computer-graded questions that are incorporated into lecture
• Graphical representation of responses promotes peer discussion
• May be used for grading

Question: Is there a difference in student achievement on graded assessments depending on whether students respond by Clicker or by Scantron?