Evaluating Mobile Learning

DESCRIPTION

Presentation given at the Mobile Learning Early Researcher Symposium, Learning Lab, University of Wolverhampton.

Text of Evaluating Mobile Learning

  • 1. Mobile Learning Evaluation. Giasemi Vavoula, University of Leicester. MLearnResearch 09, 15/10/09.

  • 2. Overview
    - Evaluation (session) in context
    - Evaluation context
    - Part 1: What do we evaluate? (a framework)
    - Part 2: How do we evaluate it? (methods and tools)
    - Part 3: Practical and ethical considerations
    - Identifying assumptions

  • 3. Evaluation (session) in context
    - Evaluation, research, publishing, ethics, theorising

  • 4. Evaluation context
    - Part 1. What do we evaluate? The M3 Evaluation Framework (Vavoula & Sharples 2009)
    - Part 2. How do we evaluate it? Methods and tools; a case study of evaluation methods and tools within the M3 Framework
    - Part 3. Practical and ethical considerations: who evaluates and who is evaluated, where, and when

  • 5. Part 1. What do we evaluate? Technology, experience, institutional practice, personal practice.

  • 6. Part 1. M3 evaluation at three levels
    - Micro level: users' experience of the technology (usability, utility of functions)
    - Meso level: users' learning/educational experience (cognitive learning, breakthroughs, breakdowns)
    - Macro level: impact on institutional and personal learning/teaching practice (appropriation of the new technology: unexpected and envisaged use, new practices, further requirements)

  • 7. Part 1. M3 evaluation in three stages
    - Users' expectations (data collection)
    - Users' actual experience (data collection)
    - Expectations-reality gaps (data analysis)

  • 8. Part 1. Evaluation at three levels, in three stages, throughout the project lifecycle
    - Lifecycle phases: analyse requirements, design, implement, deploy; the micro, meso and macro levels are evaluated across them
    - The technology must be robust enough to support a full user trial
    - The technology must be deployed long enough to assess impact

  • 9. Part 2. How do we evaluate? Typical process: collect data, analyse data, answer/refine the research questions.

  • 10. Part 2. Case study: Myartspace
    - Elements of the visit shown on the slide: handout of phones, exploring the museum, recap of the learning task, logon, phone training, sharing/presenting, an example gallery, and collecting (marked at several points)

  • 11. Part 2. The case study in the greater scheme of things. Learning contexts compared, with vagueness increasing from the traditional classroom, through the museum school visit and the general museum visit, to mobile learning:
    - Learning tools: familiar and set (traditional classroom, museum school visit); unpredictable (general museum visit, mobile)
    - Learning method and activities: pre-determined (traditional classroom, museum school visit); unknown, some idea (general museum visit); unknown (mobile)
    - Learning objectives and outcomes: pre-set, external (traditional classroom, museum school visit); unknown (general museum visit, mobile)
    - Social setting: fixed (traditional classroom); known (museum school visit); unpredictable (general museum visit, mobile)
    - Location and space layout: fixed (traditional classroom); known but not standard (museum school visit, general museum visit); unpredictable (mobile)
  • 12. Part 2. Collect data at all levels: data sources
    - Stage 1 (expectations): design heuristics; system documentation; experience documentation; promotion materials; minutes of project meetings; project proposal; press coverage; scoping study / literature review; stakeholder/user interviews and focus groups
    - Stage 2 (reality): evaluation outcomes; requirements specification; user observations; stakeholder/user interviews and focus groups; user questionnaires; user-created artifacts; stakeholder consultation workshops; heuristic evaluation

  • 13. Part 2. Collect data at the micro level. Method: heuristic evaluation
    - Expectations: established design heuristics
    - Reality: experts undertaking heuristic evaluation
    - Gap analysis: analysis of the expert reports and production of (re)design recommendations

  • 14. Part 2. Collect data at the micro level. Method: technical testing
    - Expectations: data supplied by the system requirements
    - Reality: outcomes of system performance tests
    - Gap analysis: comparison of the performance data against the requirements

  • 15. Part 2. Collect data at the micro level. Method: full-scale user trial
    - Expectations: examine the system documentation (Teachers' Pack, lesson plans, online help) for descriptions of functionality; interview the teacher before the lesson to assess their level of knowledge and expectations of the functionality; observe training sessions at the museum and the school to document how the functionality is described to teachers and students; student questionnaires on expectations of system functionality in the forthcoming lesson
    - Reality: observe the lesson to establish the teacher's and students' actual experience of the functionality; interview the teacher after the lesson to clarify their experience; questionnaires and focus groups with students after the lesson to capture their experience of the functionality
    - Gap analysis: capture expectations-reality gaps in the user experience of functionality through reflective interpretation of the documentation analysis in the light of observations, interviews and focus groups with teachers and students, and critical incident analysis with students

  • 16. Part 2. Collect data at the meso level. Method: full-scale user trial
    - Expectations: analyse the description of the educational experience in the Teachers' Pack and lesson plans; interview teachers and museum educators before lessons about what they have planned for the students' learning experience; observe teachers and museum educators while they present the learning experience to students in the classroom/museum; student questionnaires on expectations of the learning experience in the forthcoming lesson
    - Reality: observe the educational experience in the museum/classroom; note critical incidents that show new forms of learning or educational interaction; note breakdowns; interviews/focus groups with teachers, museum educators and students on the educational experience in the museum/classroom
    - Gap analysis: capture expectations-reality gaps in the educational experience through reflective interpretation of the documentation analysis and observations, interviews/focus groups with teachers, students and museum educators, and critical incident analysis with students

  • 17. Part 2. Collect data at the macro level. Method: full-scale user trial
    - Expectations: analyse descriptions in the service promotion materials, the original proposal and the minutes of early project meetings; interview stakeholders to elicit initial expectations of the service's impact
    - Reality: review press coverage and interview stakeholders to document the impact/transformations effected by the service
    - Gap analysis: reflective analysis of expectations-reality gaps in the service's impact

  • 18. Part 2. Collect data at all levels. Methods: various, for requirements analysis
    - Expectations: scoping study of previous projects and related recommendations; consultation workshop on user experience to establish requirements
    - Reality: data supplied by the evaluation analysis
    - Gap analysis: workshop to finalise the educational and user requirements; revisions of the requirements in the light of evaluation findings
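Slides 12-18 assign each data-collection activity to a level of the M3 framework and a stage of the expectations-reality comparison. As an illustration only, and not part of the original deck, such a plan can be held in a small levels-by-stages lookup; the Python structure below is a sketch under that assumption, with the method names taken from the slides.

```python
# Illustrative sketch only: one way to organise an M3 evaluation plan as a
# levels-by-stages lookup. The structure is an assumption for illustration;
# the method names are taken from slides 12-18.
from typing import Dict, List, Tuple

m3_plan: Dict[Tuple[str, str], List[str]] = {
    ("micro", "expectations"): ["design heuristics", "system requirements",
                                "system documentation", "pre-lesson teacher interview",
                                "student questionnaires"],
    ("micro", "reality"):      ["expert heuristic evaluation", "system performance tests",
                                "lesson observation", "post-lesson interviews and focus groups"],
    ("micro", "gaps"):         ["analysis of expert reports", "critical incident analysis"],
    ("meso",  "expectations"): ["Teachers' Pack and lesson plan analysis",
                                "pre-lesson interviews with teachers and museum educators",
                                "student questionnaires"],
    ("meso",  "reality"):      ["observation in museum/classroom",
                                "notes on critical incidents and breakdowns",
                                "interviews and focus groups"],
    ("meso",  "gaps"):         ["reflective interpretation of observations and interviews"],
    ("macro", "expectations"): ["promotion materials", "project proposal",
                                "early meeting minutes", "stakeholder interviews"],
    ("macro", "reality"):      ["press coverage review", "stakeholder interviews"],
    ("macro", "gaps"):         ["reflective analysis of expectations-reality gaps"],
}

# Example query: which methods cover the meso level at the 'reality' stage?
print(m3_plan[("meso", "reality")])
```

Keeping the plan in one place like this makes it easy to spot, before the trial begins, any level-stage cell with no planned data source.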
  • 19. Part 2. Example of data analysis. Items collected during the visit:

                    TOTAL   Collected objects   Written comments   Sounds   Photographs
      Group avg.       58                   7                  7       11            33
      Class           637                  75                 77      121           364

    - Observation: a student can effectively process 5-10 items during a single post-visit lesson
    - Annotations on the slide: "It has a code"; "I want to take my own picture"; "How will I know what this photo is about?"; users expect to be able to record what their pictures are of

  • 20. Part 2. Example of data analysis (repeats the table and annotations of slide 19)
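To reproduce the group averages in the table above, divide the class totals by the number of groups. The number of groups is not stated on the slide; the sketch below assumes 11 groups, which is consistent with the ratios in the table (637 / 58 is approximately 11).

```python
# Illustrative arithmetic only. The class totals are from the slide; the
# number of groups (11) is an assumption inferred from 637 / 58, not a
# figure stated in the presentation.
class_totals = {"collected objects": 75, "written comments": 77,
                "sounds": 121, "photographs": 364}
n_groups = 11  # assumed

group_avg = {kind: round(count / n_groups) for kind, count in class_totals.items()}

print(sum(class_totals.values()))   # 637 items collected by the whole class
print(group_avg)                    # {'collected objects': 7, 'written comments': 7,
                                    #  'sounds': 11, 'photographs': 33}
print(round(sum(class_totals.values()) / n_groups))  # about 58 items per group
```

Set against the slide's observation that a student can effectively process 5-10 items in a single post-visit lesson, a per-group figure of around 58 items appears to be the contrast the analysis is drawing.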
  • 21. Part 2. Example of data analysis, traced across the levels
    - Micro: creating and collecting items is quick and easy; the system does not support annotating collected items
    - Meso: children enjoy the creativity and sense of ownership in creating their own content; frustration/confusion
    - Macro: change to the system to support photo annotation; new practice of reading the label into the phone after each photo

  • 22. Part 2. Example of data analysis (repeats the table and annotations of slide 19)