
Page 1

Using Schema Analysis for Feedback in Authoring Tools for Learning Environments

Harrie Passier* & Johan Jeuring**
Faculty of Informatics

* Open University of the Netherlands
** Open University of the Netherlands and University of Utrecht

Page 2

Overview

• Introduction
  – Context
  – Feedback
  – Lack of feedback
  – Research goal
  – Ontology-based feedback

• Using Schema Analysis for Feedback in Authoring Tools
  – Schemata
  – Schema representations
  – Schemata: abstract interpretations
  – Schema analysis
  – Two examples: completeness and synonyms

• Questions and discussion

Page 3

Context

• Faculty of Informatics of the OUNL

• Research interest: generating feedback
  – Feedback to students
    • Design education, such as modelling (UML class and object diagrams)
    • Mathematics courses (solving systems of linear equations)
  – Feedback to authors
    • Course development
    • Information from the student phase back to the author phase: optimisation of the e-course (sub-project of the Alfanet project – Audit module)

Page 4

Feedback

• Definition
  – Comparison of actual performance with some set standard (norm)
  – Used to assess progress, correct errors and improve performance

• An essential element of effective learning

Page 5

Lack of feedback

• Student side: eLearning systems frequently lack (semantically rich) feedback (Mory, 2003)

• Author side: eLearning systems are complex tools, so authoring mistakes are likely. To improve quality, authoring tools should include mechanisms for checking the authored information (Murray, 1999)

Page 6

Research goal

• Develop generic, domain- and task-independent feedback mechanisms that produce semantically rich feedback for learners and authors

• Three types of feedback
  – To a student during learning
  – To an author during course authoring
  – From a group of learners studying a course back to the author

• Ontologies are arguments of the general feedback engine
  – Reusability, flexibility and adaptability of knowledge structures (Aroyo, 2004)

Page 7

Ontology-based feedback: functional architecture

[Architecture diagram: five ontologies – domain ontology, model language ontology, task ontology, education ontology and feedback ontology – are inputs to a central feedback engine, which is connected to the eLearning system consisting of a player and an author tool.]

Page 8

Ontologies as norms

Examples:

• Author perspective:
  – Domain ontology (communication technology)
  – Course structure (IMS Learning Design – IMS LD)
  – Task ontology (steps to develop a course)
  – Education (inductive and deductive learning)
  – Feedback (preventive and corrective feedback)
  – …

• Student perspective:
  – Domain ontology (communication technology)
  – Model language ontology (UML)
  – …

Page 9

Using Schema Analysis for Feedback in Authoring Tools

Scope:

• Authoring

• Structural aspects
  – Course structure
  – Domain structure

Page 10

Schema

• An ontology specifies the objects in a domain of interest together with their characteristics in terms of attributes, roles and relations. Many aspects can be represented, such as categories (taxonomic hierarchy), time, events and composition.

• A schema is a particular kind of ontology: it describes the structure of a composite object, in which objects are related to other objects through ‘has_part’ or ‘uses’ relations.

Page 11

Schema representations

Two schemata:

• Domain schema: RDF triples <resource, property, value>

• Course structure: IMS LD (as a Document Type Definition, DTD)
  – Specific annotations added to content and structure:
    • New elements: Definition and Example
    • New attribute: Educational-strategy (Inductive | Deductive)
    • In practice many more elements can be added

[Diagram: domain schema example – wheel has_part rim, wheel has_part spoke]
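
The domain-schema representation can be made concrete with a small sketch. The Haskell encoding below is hypothetical (the slides do not give one); it writes the wheel example from the diagram as RDF-style <resource, property, value> triples:

type Resource = String
type Property = String
type Value    = String

-- An RDF-style statement <resource, property, value>.
type Triple = (Resource, Property, Value)

-- Hypothetical encoding of the wheel schema shown in the diagram.
wheelSchema :: [Triple]
wheelSchema =
  [ ("wheel", "has_part", "rim")
  , ("wheel", "has_part", "spoke") ]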

Page 12

Example IMS LD definition

<!ELEMENT Activity %Activity-model; >
<!ATTLIST Activity …
  Educational-strategy (Inductive | Deductive) >
<!ENTITY % Activity-model "(Metadata?, …, Activity-description)" >
<!ELEMENT Activity-description
  (Introduction?, What, How?, …, Feedback-description?) >
<!ELEMENT What %Extra-p; >
<!ENTITY % Extra-p "(… | Figure | Audio | Emphasis | List | … | Example | Definition)*" >

<!ELEMENT Definition (Description, Concept, RelatedConcept+) >
<!ATTLIST Definition
  Id   ID    #REQUIRED
  Name CDATA #REQUIRED >
<!ELEMENT Example (Description, Concept, RelatedConcept+) >
<!ATTLIST Example
  Id   ID    #REQUIRED
  Name CDATA #REQUIRED
  Belongs-to-definition IDREFS #REQUIRED >

Page 13

Schemata: abstract interpretations

Possible properties of a course:

• Completeness: Are all concepts that are used in the course defined somewhere?

• Correctness: Does the definition of a concept used in the course correspond to the definition of that concept in the ontology?

• Timeliness: Are all concepts used in a course defined on time?
  – Uses the educational-strategy attribute (inductive, deductive)

• Recursive concepts: Are there concepts defined in terms of themselves?

• Synonyms: Are there concepts with different names but exactly the same definition?

• Homonyms: Are there concepts with multiple, different definitions?

Page 14

Schema analysis

• The analyses take schemata as input

• We perform two types of analyses
  – Analysis of the structural properties of a single schema, for example the recursion property
  – Comparison of a schema with one or more other schemata, for example to test correctness

Page 15

Some definitions (I)

Suppose o = Ont [(a, [b,c]), (b, []), (c, [d,e]), (d, []), (e, [])] :: Ontology, where the letters represent concepts.

[Tree diagram: a has parts b and c; c has parts d and e]
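
The Ont notation reads like a Haskell datatype. The following is a minimal sketch of that representation (using strings for concept identifiers is an assumption):

type ConceptId  = String
type Production = (ConceptId, [ConceptId])  -- a concept together with its direct parts

newtype Ontology = Ont [Production]

-- The example ontology of this slide.
o :: Ontology
o = Ont [("a", ["b","c"]), ("b", []), ("c", ["d","e"]), ("d", []), ("e", [])]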

Page 16

Some definitions (II)

Then

• terminalConcepts = [(b, []), (d, []), (e, [])]

• nonTerminalConcepts = [(a, [b,c]), (c, [d,e])]

• allConcepts = [(a, [b,c]), (b, []), (c, [d,e]), (d, []), (e, [])]

• reachable nonTerminalConcepts allConcepts
    = [(a, [b,c,d,e]), (b, []), (c, [d,e]), (d, []), (e, [])]

• reachableTerminals nonTerminalConcepts nonTerminalConcepts
    = [(a, [b,d,e]), (c, [d,e])]

NB: these functions are based on fixpoint calculations, as in grammar analyses; a possible rendering follows below.

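A possible Haskell rendering of these functions, reusing ConceptId and Production from the sketch above. The fixpoint formulation follows the NB; the exact definitions are assumptions, not the authors' code:

import Data.List (nub, sort)

-- Split a production list into terminal and non-terminal concepts.
terminalConcepts, nonTerminalConcepts :: [Production] -> [Production]
terminalConcepts    = filter (null . snd)
nonTerminalConcepts = filter (not . null . snd)

-- The direct parts of a concept (empty for terminal concepts).
parts :: [Production] -> ConceptId -> [ConceptId]
parts ps c = concat [ rhs | (lhs, rhs) <- ps, lhs == c ]

-- All concepts reachable from a list of start concepts: expand
-- repeatedly until a fixed point is reached, as in grammar analyses.
reachableFrom :: [Production] -> [ConceptId] -> [ConceptId]
reachableFrom ps = fix step
  where
    step cs = sort (nub (cs ++ concatMap (parts ps) cs))
    fix f x = let x' = f x in if x' == x then x else fix f x'

-- For each concept, everything reachable from its parts.
reachable :: [Production] -> [Production] -> [Production]
reachable ps cs = [ (c, reachableFrom ps rhs) | (c, rhs) <- cs ]

-- As reachable, but keeping only terminal concepts.
reachableTerminals :: [Production] -> [Production] -> [Production]
reachableTerminals ps cs =
  [ (c, filter (null . parts ps) rs) | (c, rs) <- reachable ps cs ]

On the example ontology these definitions reproduce the results shown above.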

Page 17

Example I: Completeness

• Definition: Are all concepts used in the course defined somewhere?
  – Within a course
  – Within a domain ontology
  – Between a course and a domain ontology

• Steps (within a course), sketched in code below:
  – Determine the set of used concept ids
    • in the right- and left-hand sides of concepts within examples
    • in the right-hand sides of concepts within definitions
  – Determine the set of defined concept ids
    • in the left-hand sides of concepts within definitions
  – Check that each used concept appears in the set of defined concepts
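
A minimal Haskell sketch of these steps. It assumes the course's definitions and examples have already been reduced to (concept, [related concepts]) pairs; both this representation and the function name are assumptions:

import Data.List (nub, (\\))

-- Concepts that are used somewhere in the course but never defined.
undefinedConcepts :: [Production]  -- definitions
                  -> [Production]  -- examples
                  -> [ConceptId]
undefinedConcepts defs exs = nub used \\ defined
  where
    used    = concat [ c : rs | (c, rs) <- exs ]  -- both sides of examples
              ++ concatMap snd defs               -- right-hand sides of definitions
    defined = map fst defs                        -- left-hand sides of definitions

-- The course is complete when undefinedConcepts returns [].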

Page 18

Example II: Synonyms

• Concepts with different names may have exactly the same definition
  – Within an ontology

• Example
  – Concept a, (a, [c,d]), and concept b, (b, [c,d]), are synonyms

• Formal definition: given a set of productions, two concepts x and y are synonyms if their identifiers differ, Id_x ≠ Id_y, and (reachableTerminals productions x) equals (reachableTerminals productions y)

• Steps (see the sketch below)
  – Determine for each concept in the ontology all reachable terminal concepts
  – Collect the concepts that have the same reachable terminal concepts but different concept ids
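
Using reachableTerminals and nonTerminalConcepts from the earlier sketch, the synonym check might look as follows; restricting the search to non-terminal concepts (so that terminals are not all trivially equated by their empty terminal sets) is a design choice of this sketch:

import Data.List (sort, tails)

-- Pairs of distinct concepts whose reachable terminal concepts coincide.
synonyms :: [Production] -> [(ConceptId, ConceptId)]
synonyms ps =
  [ (x, y)
  | (x, tx) : rest <- tails rts  -- pair each concept with its successors,
  , (y, ty)        <- rest       -- so x and y are always distinct
  , sort tx == sort ty ]
  where rts = reachableTerminals ps (nonTerminalConcepts ps)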

Page 19

Questions?

Thank you!

Contact me: [email protected]