Helping teachers understand their learners and their needs better in WebCT
Alan Masson
Philip Turbitt
University of Ulster
Session Overview
• E-learning in context
• VLE - agent for change?
• Promoting reflective practice within the classroom context
• Examine: cohort profile and user expectations - why and how
• Benefits
• Wrap
e-learning in context
• Demand high from learners
• Changes in practice largely staff-driven
• Training in the void
• Success hard to measure, quality hard to assure
• Success for whom?
• Institutionally, need to support large numbers of staff with diverse personal aims
• Need - impact on the LEARNER experience
• Realise cultural change - driver for (not driven by!!) technology
CETL(NI): Utilising Institutional e-Learning Services to Enhance the Learning Process
• Aim: “promote, facilitate and reward the adoption of a ‘learner centred’ reflective practice approach to the development of teaching and learning, in particular with respect to the use of e-learning technologies”
• Posts: 1x academic staff developer, 1x research associate, 2 tech posts, 4 content developers, learner advocate
• Cultural challenge: effecting changes in “teaching” practices - key to learning experience
VLE as an agent for change?
VLE data usage
• Tracking
• Monitoring
• Logging
• Retrospective - evaluation of existing practice
• Linkages to reflection / development processes?
Proactive agent for change?
• Need to challenge current practice
• Identify emerging tensions and constraints of current practice
– Between successive cohorts
– Between learners and “course”
– Between learners and tutors
• Promote issues of learner expectation (raising the learner perspective)
• Culturally - supportive not critical
Importance of the online classroom context
• It is the coalface - best place to locate services!
• Module code:
– inferred metadata (abc123j1x)
– Dynamic key to SRS, VLE and other data resources
• Provide pedagogy support dashboard - adapts on a module by module (and staff by staff) basis
Promoting reflective practice
• Full “cycle” support
• Learner focus
• Promote relevant resources
• Integrate with relevant internal and external repositories
• Manage use of resources adopted by academics?
• Inform and complement f2f activities
Reflection: key process
• Embed QAE processes / resources into classroom
• Provide an information framework to promote learner centric reflection
• Draw on range of institutional data
• Focus on both current data and changes from previous offerings
• Provide pedagogic prompts for reflection
• Assist staff to identify where they can impact on student learning
What can be achieved?
• SRS data opportunities
• WebCT CE4.x opportunities
• WebCT Vista / CE6 opportunities
Student cohort information
Data source:• Student record system (or data warehouse)
Key factor:
• Identify changes in cohort and suggest emerging challenges
Require:• Compare cohort with previous cohort
Student cohort data
Useful fields:
• Number of learners
• Course code
• Mode of study
• Age
• Gender
• “at risk” (likely field - resit flag)
• Disability?
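The cohort summary can be sketched as an aggregation over individual SRS records, in line with the privacy requirement that only aggregates are exposed. The field names (mode, age, resit_flag) are illustrative assumptions, not the actual student record system schema.

```python
from collections import Counter

def cohort_profile(records):
    """Summarise a cohort as aggregates only - no individual rows exposed.
    Field names below are assumed, not the real SRS schema."""
    n = len(records)
    return {
        "size": n,
        "mode": Counter(r["mode"] for r in records),
        "mean_age": sum(r["age"] for r in records) / n,
        "at_risk_pct": 100 * sum(r["resit_flag"] for r in records) / n,
    }

# Toy cohort of three learners
cohort = [
    {"mode": "FT", "age": 19, "resit_flag": 0},
    {"mode": "FT", "age": 21, "resit_flag": 1},
    {"mode": "PT", "age": 34, "resit_flag": 0},
]
print(cohort_profile(cohort))
```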
Specific Yr1 retention factors
• UCAS pts (entry grades)
• Entrance qualification type
• Course not 1st choice
Student Data presentation
Presentation approach
• Canned reports - no access to raw data
• Simple to interpret - tabular or graphical?
• Clear comparison between cohorts
• Trigger values - flag key changes with issues to consider
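The trigger-value idea can be sketched as a comparison of two cohort summaries, flagging any field whose relative change exceeds a threshold. The fields and thresholds below are purely illustrative assumptions.

```python
# Assumed relative-change thresholds per summary field (illustrative only)
TRIGGERS = {"size": 0.20, "mean_age": 0.15, "at_risk_pct": 0.25}

def flag_changes(previous, current, triggers=TRIGGERS):
    """Return the summary fields whose change between cohorts exceeds its trigger."""
    flags = []
    for field, threshold in triggers.items():
        old, new = previous[field], current[field]
        if old and abs(new - old) / old > threshold:
            flags.append(field)
    return flags

prev = {"size": 100, "mean_age": 20.0, "at_risk_pct": 10.0}
curr = {"size": 140, "mean_age": 21.0, "at_risk_pct": 18.0}
print(flag_changes(prev, curr))  # size up 40%, at-risk up 80%: both flagged
```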
Privacy considerations
• Must ensure individual data is NOT exposed
• Ensure adequate filters for small cohorts
• Ensure user has no access to data, only reports
• Ensure strict access policies for WHO can view reports
E-learning “experience” factors
Key factors
• Learners’ previous e-tool “experience” relative to the module toolset
• Learners’ previous e-tool “experience” relative to the tutor’s experience
e - “experience” data
Looking back - require historical data
• Vista users - Powersight module useful
• Likelihood - “old” Campus Edition as legacy
– Info in the system log files
– Similar underpinning requirements for both
– Use this to illustrate process today
Review of requirements
Permit tutors to (simply) identify:
• Tensions between learners’ e-tool “experience” and the tool use expected in the course
• Tensions in e-tool “experience” between the learner cohort and themselves
• NOT aiming to provide a formal evaluation process
• Rather, highlighting potential tensions for reflection
Initial perspective
• “course” data - tool use by cohort in previous offering of the course (phase 1)
• learner data - tool use by current cohort (in their previous courses)
• Comparison data - tool use differences between current cohort and tutor’s own “experience”
Experience or expectation?
• Experience - very difficult to quantify
– discriminating quantity v quality
– deeper report - more complex presentation
– reliability of analysis?
• Expectation - more transparent concept
– infer from previous usage
– simple to report
– better fit with user requirements
Which tools to focus on?
• Initial focus - key tools shortlisted
• Communication
– Discussion
– Mail
– Chat
– Calendar?
• Assessment
– Quiz
– Assignment
Getting data from “old CE”
Starting point - server log files
Review period - end of each semester
Structured, BUT very extensive (GBs of data)
First task - filter out non-relevant transactions
• Approach - egrep
• Example log line:
194.125.168.48 - 32263703 [01/May/2004:15:43:37 +0100] GET /SCRIPT/bms803c2x/scripts/student/serve_bulletin?ACTION=LIST&ARG1=1038900217&PAGE=0 HTTP/1.1 200 60751 4
Key tool info embedded within text strings
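The egrep filtering step can be sketched in Python as well. The set of tool script names below is an assumption inferred from the example log line, not the full Campus Edition script list.

```python
import re

# Assumed tool script names; serve_bulletin appears in the slide's example,
# the others are illustrative guesses at the equivalent CE scripts.
RELEVANT = re.compile(r"serve_(bulletin|mail|chat|quiz)")

def filter_log(lines):
    """Keep only transactions that touch one of the shortlisted tools."""
    return [line for line in lines if RELEVANT.search(line)]

log = [
    "194.125.168.48 - 32263703 [01/May/2004:15:43:37 +0100] "
    "GET /SCRIPT/bms803c2x/scripts/student/serve_bulletin?ACTION=LIST HTTP/1.1 200 60751",
    "194.125.168.48 - 32263703 [01/May/2004:15:43:40 +0100] "
    "GET /images/logo.gif HTTP/1.1 200 1021",
]
print(len(filter_log(log)))  # only the serve_bulletin line survives
```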
Structuring the log file data
• Parse out key variables:
194.168.231.6 - 32563603 [01/May/2004:23:25:15 +0100] bms803c2x student serve_bulletin? 1 200 11133 0
• Data fields now structured - ready for processing
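The parsing step can be sketched with a regular expression built around the example line. The exact log layout may differ between installations, so treat the pattern as an assumption rather than the project's actual extraction code.

```python
import re

# Named groups mirror the fields on the slide: IP, session id, timestamp,
# module code, user role, and tool script. Pattern is an assumption.
LINE = re.compile(
    r"(?P<ip>\S+) - (?P<session>\d+) \[(?P<ts>[^\]]+)\] "
    r".*?/(?P<module>\w+)/scripts/(?P<role>\w+)/(?P<tool>\w+)\?"
)

def parse_line(line):
    """Return the structured fields of a CE log line, or None if no match."""
    m = LINE.search(line)
    return m.groupdict() if m else None

sample = ("194.125.168.48 - 32263703 [01/May/2004:15:43:37 +0100] "
          "GET /SCRIPT/bms803c2x/scripts/student/serve_bulletin?ACTION=LIST HTTP/1.1 200 60751")
print(parse_line(sample))
```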
Evaluating tool embeddedness
Factors (per module cohort)
• Extent of tool use across cohort (% of learners)
• Depth of use (threshold for consideration)
• Value judgement:
– < x% = low
– > x% and < y% = medium
– > y% = high
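The banding above can be sketched as follows. The slide leaves the x and y cut-offs unspecified, so the 25% and 60% values and the min_hits depth threshold here are purely illustrative.

```python
LOW_CUTOFF, HIGH_CUTOFF = 25, 60  # assumed x, y percentage cut-offs

def embeddedness(hits_per_learner, cohort_size, min_hits=5):
    """Band tool use across a cohort; learners below min_hits
    don't count towards extent (the depth threshold)."""
    users = sum(1 for hits in hits_per_learner if hits >= min_hits)
    pct = 100 * users / cohort_size
    if pct < LOW_CUTOFF:
        return "low"
    return "medium" if pct < HIGH_CUTOFF else "high"

print(embeddedness([12, 8, 1], 10))  # 2 of 10 learners above threshold: low
print(embeddedness([12] * 7, 10))    # 7 of 10: high
```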
Evaluating historical tool use: current cohort
Factors (per student)
• Usage of tool across courses (extent)
• Depth of use (threshold for consideration)
• Value judgement (over previous semester):
– 0 modules = none
– 1 or 2 modules = medium
– All previous modules = high
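The learner-side value judgement is a simple mapping from prior-semester exposure to a band. How counts between "1 or 2" and "all modules" should be treated is not stated on the slide; they are banded as medium in this sketch.

```python
def learner_expectation(modules_using_tool, total_modules):
    """Band a learner's previous-semester exposure to a tool."""
    if modules_using_tool == 0:
        return "none"
    if modules_using_tool == total_modules:
        return "high"
    # Covers the "1 or 2 modules" rule; intermediate counts are
    # treated as medium here (an assumption).
    return "medium"

print(learner_expectation(0, 4), learner_expectation(2, 4), learner_expectation(4, 4))
```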
Evaluating historical tool use: tutor
Factors (for tutor)
• Usage of tool across courses (extent)
• Depth of use (threshold for consideration)
• Value judgement (over previous semester):
– 0 modules = none
– < 60% of supported modules = medium
– > 60% of supported modules = high
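The tutor banding uses a 60% share of supported modules rather than absolute counts, and can be sketched the same way:

```python
def tutor_expectation(modules_using_tool, modules_supported):
    """Band a tutor's previous-semester use of a tool across
    the modules they support (60% rule from the slide)."""
    if modules_using_tool == 0:
        return "none"
    share = 100 * modules_using_tool / modules_supported
    return "high" if share > 60 else "medium"

print(tutor_expectation(4, 5))  # 80% of supported modules: high
```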
Indicative report
Tool X             Low    Medium   High
Previous offering
Current cohort     20%    40%      40%
Tutor (you)
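Each row of the indicative report is the distribution of a group over the three bands. A sketch, with figures chosen to reproduce the 20% / 40% / 40% example row:

```python
from collections import Counter

def band_distribution(bands):
    """Percentage of a group falling in each report band."""
    counts = Counter(bands)
    total = len(bands)
    return {band: round(100 * counts.get(band, 0) / total)
            for band in ("low", "medium", "high")}

# Illustrative cohort matching the example report row
cohort_bands = ["low"] * 2 + ["medium"] * 4 + ["high"] * 4
print(band_distribution(cohort_bands))
```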
Benefits
Facilitates reflection by tutor
• Promotes awareness of learner expectations
• Increases awareness of key learner-course tool issues
• Reinforces relationship between the tutor and learner expectations
• Seed for effective changes in practice - initiates reflective cycle in an objective manner
Status of work
• Refining factor thresholds
• Scaled pilot - in conjunction with Vista migration-related staff development (focus on enhancement)
• Data extraction / processing: prototyped and scaling to production
• One component of overall CETL activities
– Interoperability with other aspects (case studies, tool support, learning design models etc.)
• Extensions for next phase: temporal and location factors
Promoting reflective practice