Learning Analytics - value what we measure or measure what we value
Dr Mark Glynn (@glynnmark)

DIT eLearning Summer School 2015 - Analytics



Page 1: Dit elearning summer school 2015   analytics

Learning Analytics - value what we measure or measure what we value 

Dr Mark Glynn

@glynnmark

Page 2: Dit elearning summer school 2015   analytics

Contact details

[email protected]

• glynnmark

• http://enhancingteaching.com

Page 3: Dit elearning summer school 2015   analytics

Outline

• Introduction
• Motivation and goals
• Challenges
• Examples
• Technical bits
• Discussion
  – What would you like to analyse?
  – Collaboration

Page 4: Dit elearning summer school 2015   analytics

Teaching Enhancement Unit

• Online and Blended Learning Support
• Awards and Grants
• Credit Earning Modules
• Professional Development
• Workshops

Page 5: Dit elearning summer school 2015   analytics

Data Analytics

Data analytics is the science of extracting actionable insight from large amounts of raw data

DIT – MSc in Computing

Page 6: Dit elearning summer school 2015   analytics

YouTube

Page 7: Dit elearning summer school 2015   analytics

Tesco

Page 8: Dit elearning summer school 2015   analytics

Definition

Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs. A related field is educational data mining.

- Wikipedia

Page 9: Dit elearning summer school 2015   analytics

Discussion

What data do we already collect?

Page 10: Dit elearning summer school 2015   analytics

So much student data we could use

Demographics
• Age, home/term address, commuting distance, socio-economic status, family composition, school attended, census information, home property value, sibling activities

Academic performance
• CAO and Leaving Cert results, university exams, course preferences, performance relative to peers in school

Physical behaviour
• Library access, sports centre, clubs and societies, eduroam access yielding co-location with others and peer groupings, lecture/lab attendance

Online behaviour
• Mood and emotional analysis of Facebook, Twitter and Instagram activities, friends and their actual social network, access to the VLE (Moodle)

Page 11: Dit elearning summer school 2015   analytics

Motivation

Page 12: Dit elearning summer school 2015   analytics

Discussion

What challenges do you foresee in your institution?

Page 13: Dit elearning summer school 2015   analytics

Core Principles – Open University UK

• Learning analytics is a moral practice which should align with core organisational principles.

• The purpose and boundaries regarding the use of learning analytics should be well defined and visible.

• Students should be engaged as active agents in the implementation of learning analytics.

• The organisation should aim to be transparent regarding data collection and provide students with the opportunity to update their own data and consent agreements at regular intervals.

• Modelling and interventions based on analysis of data should be free from bias and aligned with appropriate theoretical and pedagogical frameworks wherever possible.

• Students are not wholly defined by their visible data or our interpretation of that data.

• Adoption of learning analytics within the organisation requires broad acceptance of the values and benefits (organisational culture) and the development of appropriate skills.

• The organisation has a responsibility to all stakeholders to use and extract meaning from student data for the benefit of students where feasible.

Page 14: Dit elearning summer school 2015   analytics

Amazon

Page 15: Dit elearning summer school 2015   analytics

Examples

Page 16: Dit elearning summer school 2015   analytics

The earlier we diagnose, the earlier we can treat 

John Carroll, Mark Glynn, Eabhnat Ní Fhloinn

@glynnmark

Page 17: Dit elearning summer school 2015   analytics

Maths Diagnostic test

Page 18: Dit elearning summer school 2015   analytics

Data Analytics on VLE Access Data – How much can we mine from a mouse click?

John Brennan, Owen Corrigan, Aly Egan, Mark Glynn, Alan F. Smeaton, Sinéad Smyth

@glynnmark

Page 19: Dit elearning summer school 2015   analytics

No significant difference in the entry profiles of participants vs. non-participants overall

PredictED Participant Profile

Page 20: Dit elearning summer school 2015   analytics

Total Moodle Activity – notice the periodicity

Page 21: Dit elearning summer school 2015   analytics

One example module – ideal!

Page 22: Dit elearning summer school 2015   analytics

LGxxx – Predictor confidence (ROC AUC)

Page 23: Dit elearning summer school 2015   analytics

SSxxx

Page 24: Dit elearning summer school 2015   analytics

[Stacked bar chart: composition of each module's Moodle course content by activity type (files, folders, pages, book, URL, wiki, glossary, database, feedback, choice, lesson, SCORM, quizzes, assignments, forums, wikis, workshops), shown as a percentage (0–100%) for modules LG116, MS136, LG101, HR101, LG127, ES125, BE101, SS103, CA103 and CA168.]

Page 25: Dit elearning summer school 2015   analytics

Study by numbers

• 17 modules across the university (first year, high failure rate, use Loop, periodicity, stability of content, lecturer on board)

• Offered to students on an opt-in/opt-out basis, over-18s only

• 76% of students opted in, 377 opted out, no difference among cohorts

• 10,245 emails sent to the 1,184 students who opted in, over 13 weekly email alerts

Page 26: Dit elearning summer school 2015   analytics

The Interventions – Lecturers’ Experience

Page 27: Dit elearning summer school 2015   analytics

Modules which work well …

• Have periodicity (repeatability) in Moodle access
• Confidence of the predictor increases over time
• Don't have high pass rates (< 0.95)
• Have a large number of students, early-stage

Page 28: Dit elearning summer school 2015   analytics

LGxxx: law based subject

Students/year ≈ 110
Pass rate = 0.78

Page 29: Dit elearning summer school 2015   analytics

Student Interventions: Feedback

Page 30: Dit elearning summer school 2015   analytics

Relative data

Page 31: Dit elearning summer school 2015   analytics

Student Experience of PredictED

Students who took part were asked to complete a short survey at the start of Semester 2 (N = 133, 11% response rate)

Question | Group 1 (more detailed email) | Group 2
% of respondents who opted out of PredictED during the course of the semester | 4.5% | 4.5%
% who changed their Loop usage as a result of the weekly emails | 43.3% | 28.9%
% who would take part again / are offered and are taking part again | 72.2% (45.6% / 26.6%) | 76.6% (46% / 30.6%)

Page 32: Dit elearning summer school 2015   analytics

33% said they changed how they used Loop. We asked them how?

• Studied more
  – “More study”
  – “Read some other articles online”
  – “Wrote more notes”
  – “I tried to apply myself much more, however yielded no results”
  – “It proved useful for getting tutorial work done”

• Used Loop more
  – “I tried harder to engage with my modules on loop”
  – “I think as it is recorded I did not hesitate to go on loop. And loop as become my first support of study.”
  – “I logged on more”
  – “I read most of the extra files under each topic, I usually would just look at the lecture notes.”
  – “I looked at more of the links on the course nes pages, which helped me to further my understanding of the topics”
  – “I learnt how often I need to log on to stay caught up.”

Page 33: Dit elearning summer school 2015   analytics

Did you change Loop usage for other modules?

• Most who commented used Loop more often for other modules
  – “More often”
  – “More efficient”
  – “Used loop more for other modules when i was logging onto loop for the module linked to PredictED”
  – “Felt more motivated to increase my Loop usage in general for all subjects”

• One realised that lecturers could see their Loop activity
  – “I realised that since teachers knew how much i was using loop, i had to try to mantain pages long on so it looked as if i used it a lot”

Page 34: Dit elearning summer school 2015   analytics

Module Average Performance: Participants vs. Non-Participants

Subject | Description | Non-Participant | Participant
BE101 | Introduction to Cell Biology and Biochemistry | 58.89 | 62.05
CA103 | Computer Systems | 70.28 | 71.34
CA168 | Digital World | 63.81 | 65.26
ES125 | Social & Personal Dev with Communication Skills | 67.00 | 66.46
HR101 | Psychology in Organisations | 59.43 | 63.32
LG101 | Introduction to Law | 53.33 | 54.85
LG116 | Introduction to Politics | 45.68 | 44.85
LG127 | Business Law | 60.57 | 61.82
MS136 | Mathematics for Economics and Business | 60.78 | 69.35
SS103 | Physiology for Health Sciences | 55.27 | 57.03
Overall (all modules) | | 58.36 | 61.22

Average scores for participants are higher in 8 of the 10 modules analysed, and significantly higher in BE101 and CA103.

Page 35: Dit elearning summer school 2015   analytics

Measuring the Flipping effect 

Patrick Doyle, Mark Glynn, Evelyn Kelleher

@glynnmark

Page 36: Dit elearning summer school 2015   analytics

Assessment Challenge

Page 37: Dit elearning summer school 2015   analytics

Logistics

• 200+ students
• 4 assignments each
• 5 minutes per assignment
• 10 lecturers
• 2 weeks of assessment

Page 38: Dit elearning summer school 2015   analytics

Marking guide

Page 39: Dit elearning summer school 2015   analytics

Related research

Comparing students who watched versus did not watch video one

Comparing means: t-test assuming unequal variances (heteroscedastic)

Descriptive statistics
Group | Sample size | Mean | Variance
Didn't watch | 84 | 51.86905 | 691.58505
Watched | 102 | 63.15686 | 576.74743

Two-tailed distribution: p-level 0.00284, t critical value (5%) 1.97402
One-tailed distribution: p-level 0.00142, t critical value (5%) 1.65387
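As a sanity check, the comparison above can be reproduced from the summary statistics alone. The following is a minimal sketch assuming SciPy; the variable names are mine, and the sample sizes, means and variances come straight from the table.

# Sketch: reproduce the "watched vs. didn't watch" comparison from the
# summary statistics above using Welch's t-test (unequal variances).
from scipy import stats

n1, mean1, var1 = 84, 51.86905, 691.58505    # didn't watch video one
n2, mean2, var2 = 102, 63.15686, 576.74743   # watched video one

# ttest_ind_from_stats expects standard deviations, so convert the variances.
t_stat, p_two_tailed = stats.ttest_ind_from_stats(
    mean1, var1 ** 0.5, n1,
    mean2, var2 ** 0.5, n2,
    equal_var=False,  # Welch's correction, matching the heteroscedastic test
)
print(f"t = {t_stat:.3f}, two-tailed p = {p_two_tailed:.5f}, "
      f"one-tailed p = {p_two_tailed / 2:.5f}")

With these inputs the two-tailed p-value should land close to the 0.00284 reported above.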

       

Page 40: Dit elearning summer school 2015   analytics

Discussion

What would you like to measure?

Page 41: Dit elearning summer school 2015   analytics

Selectively Analyzing your Course data?

@glynnmark @drjaneholland

Dr Jane Holland, RCSI; Eric Clarke, RCSI; Dr Mark Glynn, DCU; Dr Evelyn Kelleher, DCU

Page 42: Dit elearning summer school 2015   analytics

Constructive Alignment

Learning Outcomes

Page 43: Dit elearning summer school 2015   analytics

Particulars

• Attendance
  – Tutorials
  – Labs

• Moodle logs
• Defined times
• Assessment results

Page 44: Dit elearning summer school 2015   analytics

Excel results – video tracking

[Bar chart: “What students watched ‘x’ amount of videos” – percentage of students (0–70%) by number of videos watched (zero to seven), shown for “Watched” vs. “Watched before”.]

Page 45: Dit elearning summer school 2015   analytics

All activities

Page 46: Dit elearning summer school 2015   analytics

One activity in particular

Page 47: Dit elearning summer school 2015   analytics

Multiple activities

Page 48: Dit elearning summer school 2015   analytics

Health warning

Page 49: Dit elearning summer school 2015   analytics

Questions and discussion…

Page 50: Dit elearning summer school 2015   analytics

Talking to one another

[Diagram: institutional systems that need to talk to one another – LMS, SRS, CMS, Timetable, Wifi, Library]

Page 51: Dit elearning summer school 2015   analytics

Databridge

[Diagram: a MITM data bridge sitting between the Course Database, Timetable, ePortfolio, Wifi, LMS, Library and SRS]
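As a rough illustration of what such a bridge does, here is a minimal sketch of joining just two of those systems, the LMS and the SRS, on a shared student identifier. The file names and column names are invented for this example.

# Sketch: combine LMS activity with student-record data on a common key.
# Both input files and all column names are hypothetical.
import pandas as pd

lms_logs = pd.read_csv("lms_access_log.csv")   # student_id, module, clicks
srs = pd.read_csv("student_records.csv")       # student_id, programme, year

# Total LMS activity per student, then attach registration details from the SRS.
activity = lms_logs.groupby("student_id", as_index=False)["clicks"].sum()
bridged = activity.merge(srs, on="student_id", how="left")
print(bridged.head())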

Page 52: Dit elearning summer school 2015   analytics

Additional slides

Page 53: Dit elearning summer school 2015   analytics

Building classifiers for each week/each module

[Diagram: training data vs. testing data split]
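A minimal sketch of that week-by-week idea, under my own assumptions: each student's Moodle activity is reduced to a count per week, a fresh classifier is trained each week on only the weeks seen so far, and its confidence is reported as ROC AUC on held-out students. The slides do not name the model, so logistic regression here is purely illustrative.

# Sketch: one classifier per week, trained on cumulative activity features
# and scored on held-out students. Feature encoding and model are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def weekly_auc(weekly_counts, passed):
    """weekly_counts: (n_students, n_weeks) activity counts; passed: 0/1 outcomes."""
    weekly_counts = np.asarray(weekly_counts)
    aucs = []
    for week in range(1, weekly_counts.shape[1] + 1):
        X = weekly_counts[:, :week]  # only the data available up to this week
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, passed, test_size=0.3, random_state=0, stratify=passed)
        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        aucs.append(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
    return aucs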

Page 54: Dit elearning summer school 2015   analytics

Notes on model confidence
• Y axis is confidence as ROC AUC (not probability)
• X axis is time in weeks
• 0.5 or below is a poor result
• Most modules start at 0.5, when we don't have much information
• 0.6 is acceptable, 0.7 is really good (for this task)
• The model should increase in confidence over time
• Even if confidence increases overall, it may go up and down due to randomness
• It should trend upwards to be a valid model and a viable module choice (see the sketch after this list)
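The "trend upwards" rule of thumb in the notes above could be checked mechanically along these lines; the linear-slope test and the 0.6 cut-off are illustrative assumptions rather than the project's actual criterion.

# Sketch: keep a module only if its weekly ROC AUC trends upward and ends
# at a usable level. Thresholds are assumptions for illustration.
import numpy as np

def module_is_viable(weekly_auc, min_final_auc=0.6):
    weeks = np.arange(len(weekly_auc))
    slope = np.polyfit(weeks, weekly_auc, deg=1)[0]  # fitted AUC change per week
    return slope > 0 and weekly_auc[-1] >= min_final_auc

print(module_is_viable([0.50, 0.52, 0.58, 0.61, 0.65]))  # True: rises from chance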

Page 55: Dit elearning summer school 2015   analytics

BExxx: Intro to Cell Biology

Results/year ≈ 300
Pass rate = 0.86

Page 56: Dit elearning summer school 2015   analytics

BExxx

Page 57: Dit elearning summer school 2015   analytics

SSxx: Health Sciences

Results/year ≈ 150
Pass rate = 0.92

Page 58: Dit elearning summer school 2015   analytics

MSxxx

Page 59: Dit elearning summer school 2015   analytics

LGxxx

Page 60: Dit elearning summer school 2015   analytics

HR101

Page 61: Dit elearning summer school 2015   analytics

CAxxx

Page 62: Dit elearning summer school 2015   analytics

Some unusable modules

Modules where the ROC AUC increases slowly (e.g. stays below 0.6), such as PS122

Page 63: Dit elearning summer school 2015   analytics

Timescale for Rollout

• Still some issues to be resolved with transferring Moodle access log data

• Still have to resolve the mapping between student name, email address, Moodle ID and student number

• Still to resolve the timing of when we can get new registration data and updates to registrations (late registrations, change of module, change of course, etc.) …

• Should we get new, “clean” data each week?

Page 64: Dit elearning summer school 2015   analytics

Why did you take part?

• The majority of students wanted to learn/monitor their performance

• Many others were curious

• Some were interested in the Research aspect

• Some were just following advice

• Others were indifferent

Page 65: Dit elearning summer school 2015   analytics

How easy was it to understand the information in the emails? (1 = not at all easy, 5 = extremely easy)

• Average 3.97 (SD = 1.07)

• Very few had comments to make (19/133)
  – Most who commented wanted more detail.

Page 66: Dit elearning summer school 2015   analytics

Week 3

[Diagram: training data vs. testing data split]

Page 67: Dit elearning summer school 2015   analytics

Week 4

[Diagram: training data vs. testing data split]

Page 68: Dit elearning summer school 2015   analytics

Week 5

[Diagram: training data vs. testing data split]

Page 69: Dit elearning summer school 2015   analytics

Week 6

[Diagram: training data vs. testing data split]

Page 70: Dit elearning summer school 2015   analytics

Week 7

[Diagram: training data vs. testing data split]

Page 71: Dit elearning summer school 2015   analytics

Week 8

[Diagram: training data vs. testing data split]

Page 72: Dit elearning summer school 2015   analytics

Week 9

[Diagram: training data vs. testing data split]