Using Learning Analytics to improve Serious Games (JCSG 2017)


  1. Using Learning Analytics to improve Serious Games. Baltasar Fernandez-Manjon, balta@fdi.ucm.es, @BaltaFM. e-UCM Research Group, www.e-ucm.es. Jornadas eMadrid, 04/07/2017. Realising an Applied Gaming Eco-System. JCSG 2017, Universidad Politécnica de Valencia.
  2. Open issues with serious games
     - Do serious games actually work?
     - Very few serious games have a full formal evaluation (e.g. pre-post)
     - Usually tested with a very limited number of users
     - Formal evaluation can be as expensive as creating the game (or even more expensive)
     - Evaluation is not considered a strong requirement
     - Games are difficult to deploy in the classroom
     - Teachers have very little information about what is happening while a game is being used
  3. Learning analytics: improving education based on data analysis
     - Data-driven: moving from purely theory-driven to evidence-based education
     - Off-line: analyzing data after use; discovering patterns of use; allows improving the educational experience for future learners
     - Real time: analyzing data while the system is in use to improve/adapt the current learning experience; also usable in actual face-to-face classes
  4. Game Analytics: application of analytics to game development and research
     - Telemetry: data obtained over distance (mobile games, MMOGs)
     - Game metrics: interpretable measures of game-related data (e.g. player behaviour)
  5. Game Learning Analytics (GLA) or informagic?
     - GLA is learning analytics applied to serious games: collect, analyze and visualize data from learners' interactions with serious games
     - Informagic: the false expectation of gaining full insight into the game's educational experience from only very shallow game-interaction data
     - Set more realistic expectations about learning analytics with serious games: requirements, outcomes, uses, cost/complexity
     - Pérez-Colado, I. J., Alonso-Fernández, C., Freire-Morán, M., Martínez-Ortiz, I., & Fernández-Manjón, B. (2018). Game Learning Analytics is not informagic! In IEEE Global Engineering Education Conference (EDUCON).
  6. Uses of Game Learning Analytics in educational games
     - Game testing (game analytics): Is the game reliable? How many students finish the game? What is the average time to complete it?
     - Game deployment in class (tools for teachers): real-time information to support the teacher; knowing what is happening when the game is deployed in class; stealth student evaluation
     - Formal game evaluation: from pre-post tests to evaluation based on game learning analytics?
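The game-testing metrics named above (completion rate, average time to finish) can be computed directly from a trace stream. A minimal sketch, assuming a simplified trace tuple of (player, verb, timestamp) rather than full xAPI statements; the field layout and verb names here are illustrative:

```python
from datetime import datetime

# Hypothetical, simplified traces: (player, verb, ISO timestamp).
# Real deployments would parse these fields out of xAPI statements.
traces = [
    ("s1", "initialized", "2017-07-04T10:00:00"),
    ("s1", "completed",   "2017-07-04T10:12:00"),
    ("s2", "initialized", "2017-07-04T10:01:00"),
    ("s3", "initialized", "2017-07-04T10:02:00"),
    ("s3", "completed",   "2017-07-04T10:20:00"),
]

def game_testing_metrics(traces):
    """Return (completion_rate, average_minutes_to_complete)."""
    starts, ends = {}, {}
    for player, verb, ts in traces:
        t = datetime.fromisoformat(ts)
        if verb == "initialized":
            starts[player] = t
        elif verb == "completed":
            ends[player] = t
    finished = [p for p in ends if p in starts]
    completion_rate = len(finished) / len(starts)
    avg_minutes = sum(
        (ends[p] - starts[p]).total_seconds() for p in finished
    ) / 60 / len(finished)
    return completion_rate, avg_minutes
```

In the toy data above, two of three players finish, taking 12 and 18 minutes respectively, so the metrics are a 2/3 completion rate and a 15-minute average.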
  7. Minimum game requirements for GLA
     - Most games are black boxes: no access to what is going on during gameplay
     - We need access to the game's internals: user interactions, changes in the game state or game variables
     - Or the game must communicate with the outside world using some logging framework
     - What is the meaning of that data? An adequate experimental design and setting is needed
     - Are users informed? Anonymization of the data may be required
  8. Game Learning Analytics
  9. Evidence-centered assessment design (ECD): the Conceptual Assessment Framework (CAF)
     - Student model: what are we measuring?
     - Evidence model: how do we measure it?
     - Task model: where do we measure it?
     - Assembly model: how much do we need to measure it?
     (Mislevy & Riconscente, 2005)
  10. Learning Analytics Model (LAM)
  11. LAM: stakeholders, activities, outcomes
  12. xAPI Serious Games application profile
     - Standard interaction model developed and implemented on top of the Experience API (xAPI) by UCM with ADL (Ángel Serrano et al., 2017)
     - The model allows tracking all in-game interactions as xAPI traces (e.g. level started or completed)
     - Now being extended in BEACONING to geolocalized games
     - IEEE standardization group on xAPI: https://www.adlnet.gov/serious-games-cop
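An xAPI trace is a JSON "statement" with an actor, a verb and an object, each identified by an IRI. A minimal sketch of building one such statement for a "level completed" event; the game URL, account home page and level id are made-up examples, and the exact verb and activity-type IRIs should be taken from the published xAPI Serious Games profile rather than from this sketch:

```python
import json

def make_statement(player_id, verb_id, level_id):
    """Build a minimal xAPI-style statement for a serious-game event.

    The IRIs used by callers are placeholders illustrating the general
    shape; consult the xAPI Serious Games profile for the authoritative
    verb and activity-type identifiers."""
    return {
        "actor": {
            "account": {"homePage": "https://example.org", "name": player_id}
        },
        "verb": {"id": verb_id},
        "object": {
            "id": f"https://example.org/games/first-aid/levels/{level_id}",
            "definition": {
                # Illustrative activity type in the style of the SG profile
                "type": "https://w3id.org/xapi/seriousgames/activity-types/level"
            },
        },
    }

stmt = make_statement(
    "student-42", "http://adlnet.gov/expapi/verbs/completed", "cpr-1"
)
print(json.dumps(stmt, indent=2))
```

A tracker embedded in the game would build statements like this for each interaction and POST them to the analytics collector.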
  13. Game Learning Analytics: can GLA be systematized? Realising an Applied Gaming Eco-System
  14. Game trackers and cloud analytics frameworks released as open code (GitHub): Java xAPI tracker, Unity xAPI tracker, C# xAPI tracker. https://github.com/e-ucm
  15. Architecture and technologies
     - Client applications (games, analytics frontend) used by players, developers, students, teachers and admins
     - Authentication and authorization (users, roles, resources, permissions) based on JWT (JSON Web Token)
     - Analytics backend: a collector ingests game-session traces into a Kafka queue; results are then analyzed and visualized per application
     - Open code, to be deployed on your own server: you own all your game analytics data
  16. Systematization of analytics dashboards
     - As long as traces follow the xAPI format, these analyses do not require further configuration
     - It is also possible to configure game-dependent analyses and visualizations for specific games and game characteristics
  17. Real-time analytics: alerts and warnings
     - Identify situations that may require teacher intervention; fully customizable alert and warning system for real-time teacher feedback
     - Inactive learner: triggers when no traces have been received in a given number of minutes (e.g. 2 minutes)
     - High % of incorrect answers: triggers when, after a minimum number of questions answered, more than a given % of the answers are wrong
     - Views for students that need attention, and for a specific student (names anonymized)
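The two alert rules above are simple to state as code. A minimal sketch; the 2-minute inactivity limit comes from the slide's example, while the minimum-answer count and wrong-answer ratio are assumed placeholder thresholds (the slide describes them as customizable):

```python
from datetime import datetime, timedelta

INACTIVITY_LIMIT = timedelta(minutes=2)  # example threshold from the slide
MIN_ANSWERS = 5                          # assumed minimum before the % alert can fire
MAX_WRONG_RATIO = 0.5                    # assumed "high % incorrect" threshold

def inactive(last_trace_time, now):
    """Inactive-learner warning: no traces received for the limit period."""
    return now - last_trace_time > INACTIVITY_LIMIT

def too_many_wrong(correct, wrong):
    """High-%-incorrect alert, evaluated only after a minimum number of answers."""
    total = correct + wrong
    return total >= MIN_ANSWERS and wrong / total > MAX_WRONG_RATIO
```

In a deployment these checks would run over the live trace stream and surface flagged students on the teacher's dashboard.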
  18. xAPI GLA in game authoring
     - Previous game engine: eAdventure (in Java), which helps to create educational point & click adventure games
     - Platform updated to uAdventure (in Unity)
     - Full integration of game learning analytics into the uAdventure authoring tool: uAdventure games include default analytics
     - Includes geolocalized games
  19. uAdventure: geolocalized serious games
     - Default analytics visualizations for geolocalized games
     - New geolocalized game scenes and actions
  20. Example: First Aid - CPR validated game
     - Collaboration with the Centro de Tecnologías Educativas de Aragón, Spain
     - Teaches middle and high school students to identify a cardiac arrest and perform cardiopulmonary resuscitation
     - Validated game: in 2011, with 340 students across 4 schools
     - Marchiori, E. J., Ferrer, G., Fernández-Manjón, B., Povar Marco, J., Suberviola González, J. F., & Giménez Valverde, A. (2012). Video-game instruction in basic life support maneuvers. Emergencias, 24, 433-437.
  21. BEACONING experiments: data collection
     - 227 students from 12 to 17 years old; game rebuilt with uAdventure
     - Each student completed: 1. pre-test (multiple-choice questions); 2. complete gameplay (xAPI traces); 3. post-test (multiple-choice questions)
     - 104 variables identified for each player
  22. Replicability of results: knowledge acquisition
     - Compared: the original experiment with the game, the original experiment's control group, and the current experiment
     - Lower learning gain than in the original experiment, but still significant
  23. GLA for improving serious game evaluation
     - Ideally we would like a better evaluation method for serious games, avoiding pre-post experiments and their high cost in time and effort
     - Our first approach: use data mining techniques to predict pre-test and post-test scores from the data tracked from in-game interactions
     - Aimed at measuring knowledge acquisition, but it could also work for measuring attitude change or awareness raised
  24. GLA data for improving serious game evaluation
     - Use data mining techniques to predict pre-test and post-test scores from the data tracked from in-game interactions
     1. To avoid the pre-test: determine the influence of previous knowledge on game results
     2. To avoid the post-test: determine how well game interactions predict post-test results when combined with the pre-test, and compare that to the predictive power of game interactions on their own
  25. Improving serious game evaluation: avoiding the pre-test
     - Create prediction models of the pre-test using only game data
     - Score predicted as a binary category: pass / fail
     - Prediction model used: Naïve Bayes
     - Results obtained:
       Method       Precision  Recall
       Naïve Bayes  0.69       0.84
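To make the approach concrete, here is a self-contained sketch of the kind of classifier involved: a Bernoulli Naïve Bayes over binary game-interaction features predicting a pass/fail label, plus the precision/recall metrics reported in the table. This is a from-scratch illustration, not the actual pipeline or data used in the experiment (which identified 104 variables per player):

```python
from math import log

def train_bernoulli_nb(X, y):
    """Train Bernoulli Naive Bayes with add-one (Laplace) smoothing.
    X: binary feature vectors (e.g. game-interaction indicators);
    y: 0/1 labels (fail/pass)."""
    n = len(y)
    classes = sorted(set(y))
    prior = {c: sum(1 for yc in y if yc == c) / n for c in classes}
    n_feat = len(X[0])
    # p[c][j] = P(feature j = 1 | class c), smoothed
    p = {}
    for c in classes:
        rows = [x for x, yc in zip(X, y) if yc == c]
        p[c] = [(sum(r[j] for r in rows) + 1) / (len(rows) + 2)
                for j in range(n_feat)]
    return prior, p

def predict(model, x):
    """Pick the class with the highest log-likelihood for features x."""
    prior, p = model
    def loglik(c):
        return log(prior[c]) + sum(
            log(p[c][j]) if xj else log(1 - p[c][j])
            for j, xj in enumerate(x))
    return max(prior, key=loglik)

def precision_recall(y_true, y_pred, positive=1):
    """Precision and recall for the positive (pass) class."""
    tp = sum(t == pr == positive for t, pr in zip(y_true, y_pred))
    fp = sum(pr == positive and t != positive for t, pr in zip(y_true, y_pred))
    fn = sum(t == positive and pr != positive for t, pr in zip(y_true, y_pred))
    return tp / (tp + fp), tp / (tp + fn)
```

In practice one would use an off-the-shelf implementation and cross-validation; the sketch only shows what "predict pass/fail from interaction data and report precision/recall" means operationally.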
  26. Improving serious game evaluation: avoiding the post-test
     - Create prediction models of the post-test score using pre-test + game data
     - Create prediction models of the post-test score using only game data
     - Score predicted as a numeric value (from 1 to 15 in our game) and as a binary category (pass / fail)
     - Prediction models used: trees and regression, with Naïve Bayes to compare results
  27. Improving serious game evaluation: avoiding the post-test, summary of results
     - Prediction of post-test score (1 to 15):
       Method             Pre-test  ASE
       Regression trees   Yes       4.92
       Regression trees   No        5.68
       Linear regression  Yes       5.81
       Linear regression  No        5.71
     - Prediction of post-test pass / fail:
       Method               Pre-test  Precision  Recall
       Decision trees       Yes       0.81       0.94
       Decision trees       No        0.88       0.92
       Logistic regression  Yes       0.89       0.98
       Logistic regression  No        0.87       0.98
       Naïve Bayes          Yes       0.92       0.89
       Naïve Bayes          No        0.89       0.90
     - Predictions are worse without pre-test data, but the results are still acceptable
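The ASE column for the numeric predictions is the average squared error between predicted and observed post-test scores. A one-function sketch, with toy numbers that are illustrative only and not taken from the experiment:

```python
def average_squared_error(y_true, y_pred):
    """ASE: mean squared difference between observed and predicted scores
    (post-test scores on the game's 1-15 scale)."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy example: residuals are 1, -1 and -2, so ASE = (1 + 1 + 4) / 3 = 2.0
ase = average_squared_error([10, 12, 8], [9, 13, 10])
```

On the 1-15 scale of the table, an ASE around 5 corresponds to typical prediction errors a little over 2 points.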
  28. New uses of games based on GLA
     - Avoiding the pre-test: games for evaluation
     - Avoiding the post-test: games for teaching and measuring learning, with or without a pre-test
  29. BEACONING GLA case study: Downtown
     - Serious game designed and developed to teach young people with Down syndrome to move around the city using the subway
     - Evaluated with 51 people with cognitive disabilities (mainly Down syndrome); 42 users with complete data
     - 3 h of gameplay per user; 300K xAPI analytics traces to analyze
  30. Case study: Downtown
     - From user requirements to a game design and its observables
     - Learn more about how and what is learned by people with Down syndrome
  31. Media coverage
  32. Game design and analysis workflow
     - GLAID (Game Learning Analytics for Intellectual Disabilities) analytics framework
     - Data collected during game sessions is handled and presented, from the designer's perspective, as individualized learning analyses (per user) and collective learning analyses (per group), tracking learning progress over time