
The Moodle Quiz at the Open University:

how we use it & how that helps students

Tim Hunt, Senior Developer

5th MoodleMoot Croatia

& many dedicated OU teaching staff

The Open University

About the Open University

45 years old

All distance education

~200 000 students (part time)

Used Moodle since 2005

Where we are

Contains Ordnance Survey data © Crown copyright and database right. Licensed under CC BY-SA 3.0 via Wikimedia Commons: http://commons.wikimedia.org/wiki/File:Buckinghamshire_UK_location_map.svg


Moodle at the OU

A typical module website

Daily load

Lots of servers

The quiz at the OU

A typical quiz – Maths

A typical quiz – Languages

Overall use

It matters

End of module survey results

[Chart: end-of-module survey results, 0–100%, for:
KPI 01: Overall, I am satisfied with the quality of this module
Q33: Clear understanding of what was required to complete the assessed work
Q34: The assessment activities supported my learning]

It makes a difference

Level 3 physics (SM358)

SM358 assessment tasks

[Grid: number of iCMAs completed (0–6) against number of TMAs completed (0–4)]

Students completing tasks (% of students; rows: iCMAs completed, columns: TMAs completed)

0 TMAs / 1 TMA / 2 TMAs / 3 TMAs / 4 TMAs
0 iCMAs: 11.6% / 3.4% / 1.5% / 0.5% / 0.5%
1 iCMA: 1.5% / 1.0%
2 iCMAs: 1.5% / 2.4% / 1.5%
3 iCMAs: 1.5%
4 iCMAs: 5.3% / 2.4%
5 iCMAs: 0.5% / 3.9% / 5.8% / 8.2%
6 iCMAs: 0.5% / 0.5% / 5.8% / 5.8% / 34.3%

Average exam scores (rows: iCMAs completed, columns: TMAs completed)

0 TMAs / 1 TMA / 2 TMAs / 3 TMAs / 4 TMAs
0 iCMAs: 6.0
2 iCMAs: 17.0 / 24.0
3 iCMAs: 60.0
4 iCMAs: 43.7 / 62.0
5 iCMAs: 23.0 / 46.0 / 62.6 / 69.5
6 iCMAs: 35.3 / 60.8 / 77.5

Exam scores vs prediction (rows: iCMAs completed, columns: TMAs completed)

0 TMAs / 1 TMA / 2 TMAs / 3 TMAs / 4 TMAs
0 iCMAs: −20.8
2 iCMAs: −43.9 / −27.5
3 iCMAs: −9.0
4 iCMAs: −15.6 / +1.8
5 iCMAs: −3.8 / −11.1 / +1.4 / +2.4
6 iCMAs: −17.1 / +3.4 / +4.6
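The three tables above are cross-tabulations of per-student records. As an illustration only, the sketch below shows one way such tables can be produced with pandas; the column names and data are hypothetical, not the actual SM358 dataset.

```python
# A minimal sketch, assuming hypothetical per-student records of assignments
# completed and final exam score. Not the real SM358 data.
import pandas as pd

students = pd.DataFrame({
    "tmas_completed":  [4, 4, 3, 0, 4, 2],
    "icmas_completed": [6, 6, 5, 0, 5, 2],
    "exam_score":      [82, 74, 66, 5, 71, 21],
})

# Share of students in each (iCMAs, TMAs) cell, as a percentage.
completion = pd.crosstab(
    students["icmas_completed"], students["tmas_completed"], normalize="all"
) * 100

# Mean exam score for each cell.
mean_scores = students.pivot_table(
    index="icmas_completed", columns="tmas_completed",
    values="exam_score", aggfunc="mean",
)

print(completion.round(1))
print(mean_scores.round(1))
```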

Changing assessment type

T184 robotics & meaning of life

Before

10% mid-course iCMA

90% final written EMA

part 1: short answer

part 2: programming & essays

After

10% mid-course iCMA

30% final iCMA

60% final written EMA

Module completion rates

T184 completion rates
[Chart: completion rates (60–100%) for the May and October presentations, 2004–2011, marking the introduction of the CME]

Deadlines

SM358 iCMA51 submit date
[Chart: submission dates relative to the 2010 advisory deadline and the 2012 hard deadline]

Grades

Optional quizzes

Compulsory quizzes

Can computers grade sentences?

Spoiler: Yes!

How good are humans?

Question | Responses in analysis | Human markers' agreement with question author, range for 6 markers (%) | Human markers' agreement, mean of 6 markers (%) | Computer marking agreement with question author (%)
A | 189 | 97.4–100 | 98.9 | 99.5
B | 248 | 83.9–97.2 | 91.9 | 97.6
C | 150 | 80.7–94.0 | 86.9 | 94.7
D | 129 | 91.5–98.4 | 96.7 | 97.6
E | 92 | 92.4–97.8 | 95.1 | 98.9
F | 129 | 86.0–97.7 | 90.8 | 97.7
G | 132 | 66.7–90.2 | 83.2 | 89.4
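The agreement figures in this table are simple percentages: the share of responses where a marker's right/wrong decision matches the question author's. A minimal sketch, using made-up grading decisions rather than real OU response data:

```python
# Hypothetical illustration of percentage agreement with the question author.
def percent_agreement(marker_grades, author_grades):
    """Percentage of responses graded the same way by marker and author."""
    matches = sum(m == a for m, a in zip(marker_grades, author_grades))
    return 100.0 * matches / len(author_grades)

# Invented right/wrong decisions for five short-answer responses.
author = [1, 1, 0, 1, 0]
human_marker = [1, 1, 0, 0, 0]   # disagrees on one response
computer = [1, 1, 0, 1, 0]       # agrees on all five

print(percent_agreement(human_marker, author))  # 80.0
print(percent_agreement(computer, author))      # 100.0
```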

Comparing three algorithms

Percentage of responses where computer marking agreed with the question author:

Question | Responses in analysis | Computational linguistics (IAT) | Algorithmic manipulation of keywords (Pattern-match) | Regular expressions
A | 189 | 99.5 | 99.5 | 98.9
B | 248 | 97.6 | 98.8 | 98.0
C | 150 | 94.7 | 94.7 | 90.7
D | 129 | 97.6 | 96.1 | 97.7
E | 92 | 98.9 | 96.7 | 96.7
F | 129 | 97.7 | 88.4 | 89.2
G | 132 | 89.4 | 87.9 | 88.6
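As a rough illustration of the simplest of the three approaches, the sketch below marks free-text answers against a regular expression written by a question author. The question, pattern, and responses are invented; the OU's own keyword matcher is the qtype_pmatch plugin listed in the references, not this code.

```python
# A hypothetical regular-expression grader for one short-answer question.
import re

# Author's pattern: accept answers saying that the ice (or "it") melts/thaws.
PATTERN = re.compile(r"\b(ice|it)\b.*\b(melt|melts|thaws?)\b", re.IGNORECASE)

def mark(response: str) -> bool:
    """Return True if the response matches the author's pattern."""
    return bool(PATTERN.search(response))

for response in ["The ice melts", "It slowly thaws", "Nothing happens"]:
    print(response, "->", "correct" if mark(response) else "incorrect")
```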

Summary

References

• Overall iCMA usage numbers collated by Phil Butcher.

• End-of-module survey results from Student Analytics in the Institute of Educational Technology, via Linda Price.

• SM358 data from John Bolton.

• T184 data from Jon Rosewell.

• SDK125, S141, S151 & S240 data from Sally Jordan.

• Jordan, Sally (2014). Using e-assessment to learn about students and learning. International Journal of e-Assessment, 4(1).

• Jordan, Sally (2014). Adult science learners' mathematical mistakes: an analysis of responses to computer-marked questions. European Journal of Science and Mathematics Education, 2(2), pp. 63–86.

• Jordan, Sally (2012). Short-answer e-assessment questions: five years on. In: 2012 International Computer Assisted Assessment Conference, 10–11 July 2012, Southampton.

• Pattern-match question type: https://moodle.org/plugins/view/qtype_pmatch

• STACK question type (maths): https://moodle.org/plugins/view/qtype_stack

• Sangwin, Chris (2013). Computer Aided Assessment of Mathematics.

Key points

Getting the assessment right is important

Online quizzes can be powerful learning tools

Computers can grade much more than multiple choice, but only on behalf of a teacher

Analyse the data and you can learn:

• Is this quiz working?

• What are my students’ misconceptions?
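To make that last point concrete, here is a minimal sketch of one way to ask "is this quiz working?": computing a facility index (how easy a question is) and a crude discrimination estimate (how well it separates strong from weak students) from a matrix of scores. The numbers are invented; Moodle's own quiz statistics report calculates comparable indices from real attempt data.

```python
# Minimal sketch with invented data; requires Python 3.10+ for statistics.correlation.
import statistics

# Rows = students, columns = questions, values = fraction of marks earned.
scores = [
    [1.0, 0.5, 0.0],
    [1.0, 1.0, 0.5],
    [0.5, 0.0, 0.0],
    [1.0, 1.0, 1.0],
]

totals = [sum(row) for row in scores]
for q in range(len(scores[0])):
    question = [row[q] for row in scores]
    facility = statistics.mean(question)
    # Correlation between question score and total score: a crude discrimination index.
    discrimination = statistics.correlation(question, totals)
    print(f"Q{q + 1}: facility={facility:.2f} discrimination={discrimination:.2f}")
```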