1. On Evaluating the 80 Days Geography Game for School Students (Mahmoud Abdulwahed)
2. Pre-Evaluation
- Understand the utilized pedagogical theories.
- Understand the utilized cognitive theories, i.e. the Competence-based Knowledge Space Theory (CbKST).
- Understand the game architecture.
- Others, such as neuroscience, the gaming factor, etc.
3. The DEGs Evaluation Dimensions
4. The DEGs Learning Outcomes
- Intended learning outcomes, such as:
- and any other intended geography learning outcomes specified by the school curricula
- Occasional or unintended learning outcomes (nevertheless, it is still learning!), such as:
- and any other generic learning outcomes resulting from playing digital games
Two Types of Learning Outcomes Are Associated with DEGs
5. Pedagogical Variables of DGs
Generic Learning Types Associated with Digital Games (Prensky 2001)
6. Some Important Pedagogical Measures
- Depth of the gained knowledge, i.e. deep vs. surface learning.
- Sustainability of learning (direct impact on life-long learning), i.e. do students seek to learn more about geography outside the game environment as a result of DEG-based learning?
- Breadth of learning outcomes for DEGs vs. classical pedagogies.
- Time spent on the learning objectives.
- Change in learning styles, i.e. does the DEG increase the experiential learning style?
- Proximal, i.e. is the DEG an optimal platform for implementing an effective social learning experience for students in the classroom or the lab?
- Cyber, i.e. online collaborative learning with the DEG through the web
7. Some Proposed Methods and Tools
- Analysing the game's logged data
- Comparative evaluation, i.e. classical teaching vs. DEG based teaching.
- Learning styles inventories.
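As an illustration of the first proposed method, analysing the game's logged data: a minimal sketch, assuming a hypothetical event log in which each entry records a player entering or completing a task tied to a learning objective. The log format, event names, and objectives are illustrative assumptions, not the actual 80 Days logging schema.

```python
from collections import defaultdict

# Hypothetical log entries: (timestamp_seconds, player_id, event, learning_objective)
log = [
    (0,   "p1", "enter_task",    "map_reading"),
    (95,  "p1", "complete_task", "map_reading"),
    (95,  "p1", "enter_task",    "climate_zones"),
    (260, "p1", "complete_task", "climate_zones"),
]

def time_on_objective(entries):
    """Sum the time each player spends between entering and completing
    a task tied to a learning objective."""
    totals = defaultdict(float)
    open_tasks = {}  # (player, objective) -> timestamp of the matching enter_task
    for ts, player, event, objective in entries:
        key = (player, objective)
        if event == "enter_task":
            open_tasks[key] = ts
        elif event == "complete_task" and key in open_tasks:
            totals[objective] += ts - open_tasks.pop(key)
    return dict(totals)

print(time_on_objective(log))  # {'map_reading': 95.0, 'climate_zones': 165.0}
```

Aggregates like this (time on task per objective) feed directly into the pedagogical measures above, e.g. time spent on the learning objectives.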
8. Usability Problems of Digital Games
- Unpredictable / inconsistent response to users' actions
- Does not allow enough customization
- Artificial intelligence problems
- Mismatch between camera/view and action
- Does not let user skip non-playable content
- Difficult to control actions in the game
- Does not provide enough information on game status
- Does not provide adequate training and help
- Command sequences are too complex
- Visual representations are difficult to interpret
- Response to users' actions not timely enough
Generic List of DGs Usability Problems (Pinelle et al. 2008)
9. Setting Up the Usability Problems of DEGs
- Many of the generic DGs usability problems may also apply to DEGs.
- Build upon the educational usability findings of educational software research.
- An extension towards the notion of DEGs Educational Usability will be necessary when evaluating DEGs usability, e.g. does the combination of the learning elements make the game boring? Is knowledge difficult to navigate or retrieve? Do schools have enough IT equipment for running the 3D games (high-quality graphics cards)? Is too much time needed to learn basic knowledge? etc.
10. Usability Evaluation Methods
- Perspective-based Inspection
- Questionnaires (QUIS, PUEU, NAU, NHE, CSUQ, ASQ, PHUE, PUTQ, USE)
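The instruments listed above each have their own validated items and scoring rules; as a generic sketch of how Likert-style questionnaire responses are typically aggregated, assuming hypothetical item ids and a made-up reverse-scored item (not taken from QUIS or any of the named instruments):

```python
def score_likert(responses, reverse_items=(), scale_max=5):
    """Average a Likert questionnaire, flipping reverse-worded items.
    `responses` maps item id -> raw answer in 1..scale_max."""
    adjusted = [
        (scale_max + 1 - v) if item in reverse_items else v
        for item, v in responses.items()
    ]
    return sum(adjusted) / len(adjusted)

# Hypothetical answers on a 5-point scale; q3 is negatively worded.
answers = {"q1": 4, "q2": 5, "q3": 2}
print(round(score_likert(answers, reverse_items={"q3"}), 2))  # prints 4.33
```

Reverse scoring before averaging is the standard precaution with mixed-direction items; without it, negatively worded items would cancel out genuine agreement.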
From http://www.usabilityhome.com/
11. Usability Evaluation of DEGs Methodology: The Adoption, Adaptation, and Extension Principle
12. Cognitive and Affective Variables
- Gender differences in game perception?
- Preference in comparison to commercial educational games during leisure time, i.e. is it an alternative, a supplement, or no preference at all?
- Motivation towards schooling
13. Summary: Evaluating, How To?
Hybrid Evaluation Methods
- Data Analysis of the Game.
- Pedagogical measurement methods and instruments, e.g. the Learning Style Inventory (LSI).
- Cognitive measurement instruments, e.g. memory tests, mind maps, IQ tests, etc.
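To complement the tools above, a comparative pre/post design can be analysed by comparing per-student learning gains across two equivalent groups. A minimal sketch with made-up test scores; Welch's t statistic is one common choice here, not necessarily the test used in the project:

```python
from statistics import mean, stdev
from math import sqrt

def gain_scores(pre, post):
    """Per-student learning gain: post-test score minus pre-test score."""
    return [b - a for a, b in zip(pre, post)]

def welch_t(x, y):
    """Welch's t statistic for two independent samples (unequal variances)."""
    vx, vy = stdev(x) ** 2 / len(x), stdev(y) ** 2 / len(y)
    return (mean(x) - mean(y)) / sqrt(vx + vy)

# Hypothetical pre/post scores (out of 20) for equivalent groups.
exp_gain = gain_scores(pre=[8, 10, 9, 11], post=[14, 15, 13, 16])  # DEG-based teaching
ctl_gain = gain_scores(pre=[9, 10, 8, 11], post=[11, 12, 10, 13])  # classical teaching

print(mean(exp_gain), mean(ctl_gain), welch_t(exp_gain, ctl_gain))
```

With real samples, a p-value from the t distribution (e.g. `scipy.stats.ttest_ind` with `equal_var=False`) would then indicate whether the difference in outcomes between the two groups is significant.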
Suggested Tools: Comparative Evaluations
[Diagram: a pre/post comparative design with two equivalent groups. The experimental group takes a pre-test (X), receives the treatment, and takes a post-test (X); the control group takes the same pre- and post-tests (Y, Yt) without the treatment; the question is whether the outcomes differ.]
14. Conclusions
- Objectives-based evaluation, i.e. identifying the objectives of the game from all perspectives and evaluating whether they are met, and to what extent.
- In-game and out-of-game evaluation.
- Surveying the potential current tools and models of evaluation (pedagogical, usability, cognitive, etc.), then adapting and extending them; hence, developing novel models.
- Inventing new models from scratch when needed.
- Comparative evaluations when possible.
- Evaluation during the design process phase, and final product evaluation.
- Hybrid evaluation approach, i.e. quantitative and qualitative.
- Embedding more science in explaining the evaluation findings, i.e. relating the findings to the pedagogical and cognitive theories, and explaining them from systems dynamics and game theory perspectives.
- There are a lot of objectives and variables to be evaluated, and a load of evaluation experiments, methodologies, and models that can be designed or used; this is comprehensive work, BUT it promises a lot of high-quality publications!
- Robust and holistic evaluation models will be developed after gaining more insight into the project, contacting the partners, and getting a closer view of the associated multidisciplinary domains.