
Page 1: Online Assessment for Individualized Distributed Learning Applications

Online Assessment for Individualized Distributed

Learning Applications

Greg Chung

UCLA Graduate School of Education & Information Studies
National Center for Research on Evaluation, Standards, and Student Testing (CRESST)

Annual CRESST Conference

Los Angeles, CA
September 9, 2004

Page 2: Online Assessment for Individualized Distributed Learning Applications

Overview of Talk

• Distributed learning (DL) context

• Elements of a DL system

• Research examples

• Current work

Page 3: Online Assessment for Individualized Distributed Learning Applications

Distributed Learning Definition

The distribution via technology of training, education, and information that resides at one location to any number of learners who may be separated by time and space and who may interact with other parties (peers, instructor, system) synchronously or asynchronously.

Page 4: Online Assessment for Individualized Distributed Learning Applications

Characteristics

• Learner-centric

• Autonomous learner

• Asynchronous communication modes

• Varying degree of instructor support

Page 5: Online Assessment for Individualized Distributed Learning Applications

Typical Vision Statement

“Provide quality instruction to the right people, at the right time, and at the right place.”

Page 6: Online Assessment for Individualized Distributed Learning Applications

Implications of DL

• Operational

  • Anytime, anywhere learning implies anytime, anywhere assessment

  • Online, rapid scoring, immediate feedback to learner, actionable information

  • Individualized

• Research

  • Examine ways of extracting useful information about learners in an online context

Page 7: Online Assessment for Individualized Distributed Learning Applications

Instruction-Assessment Loop

[Figure: cycle linking Instruction, Assessment, and Decision]

Page 8: Online Assessment for Individualized Distributed Learning Applications

Elements of a DL System

• Framework to guide what information to extract from the online environment

• Method to synthesize disparate information types

• Automated reasoning support for interpreting knowledge and performance observations

Page 9: Online Assessment for Individualized Distributed Learning Applications

CRESST Assessment Model

[Figure: Learning at the center, surrounded by Content Knowledge, Problem Solving, Collaboration, Self-Regulation, and Communication]

Page 10: Online Assessment for Individualized Distributed Learning Applications

Data Fusion Strategy

[Figure: clickstream → indicator → construct]

• Event (clickstream): clicked on button 32; selected test item 2; spent 20 sec on help page 3

• Descriptive (indicator): adjusted bicycle pump design; performed (virtual) blood test correctly

• Inferential (construct): used the “generate-and-test” problem-solving strategy; used productive learning strategies; understood the fundamentals of rifle marksmanship
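
To make the three levels concrete, here is a minimal sketch of the event-to-indicator step in Python. The event names and aggregation rules are invented for illustration; they are not CRESST's actual software sensors.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One raw clickstream record: what the learner did, where, and for how long."""
    action: str     # e.g., "changed_dimension", "ran_simulation", "viewed_page"
    target: str     # e.g., "piston_width", "pump", "help_page_3"
    seconds: float  # dwell time

def descriptive_indicators(events: list[Event]) -> dict[str, bool]:
    """Aggregate low-value clicks into task-meaningful descriptions.
    These rules are illustrative placeholders, not the study's sensors."""
    return {
        "adjusted_pump_design": any(e.action == "changed_dimension" for e in events),
        "ran_simulation": any(e.action == "ran_simulation" for e in events),
        "consulted_help": sum(e.seconds for e in events
                              if e.target.startswith("help")) >= 20,
    }

# Example: three raw events roll up into three positive indicators.
log = [
    Event("changed_dimension", "piston_width", 4.0),
    Event("ran_simulation", "pump", 6.5),
    Event("viewed_page", "help_page_3", 20.0),
]
print(descriptive_indicators(log))
```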

Page 11: Online Assessment for Individualized Distributed Learning Applications

Research Examples

• Elements of DL system tested in several studies

• Pump simulation design task

• Tested whether the “generate-and-test” problem-solving strategy could be measured using simple aggregation of clickstream data

• Problem-solving task (IMMEX)

• Tested whether moment-to-moment learning processes could be measured from clickstream data (data fused with Bayesian networks)

Page 12: Online Assessment for Individualized Distributed Learning Applications

Research Examples

• Elements of DL system tested in several studies (continued)

• Knowledge of rifle marksmanship

• Tested individualized instruction based on measures of knowledge

• Data fused with Bayesian networks

Page 13: Online Assessment for Individualized Distributed Learning Applications

Research Example 1: Pump Design Task

• Can the “generate-and-test” problem-solving strategy be measured using clickstream data?

• Novel GUI to support measurement

Page 14: Online Assessment for Individualized Distributed Learning Applications

Generate-and-Test Processes

Page 15: Online Assessment for Individualized Distributed Learning Applications

[Screenshot of the pump design task, annotated with the instrumented event types:]

• Information events -- click and hold mouse down to view information

• Design events -- change dimensions of pump

• Design events -- run pump simulation

• Solve problem event -- commit to a design solution
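
Simple aggregation over these event types is enough to flag generate-and-test behavior. Below is a minimal sketch with assumed one-letter event codes (not the study's actual coding): each design change followed immediately by a simulation run counts as one generate-and-test cycle.

```python
# Assumed event codes, for illustration only:
# "I" = information event, "D" = design change,
# "R" = run simulation, "S" = solve (commit to a design).
def generate_test_cycles(events: str) -> int:
    """Count D->R transitions: each is one generate-and-test cycle."""
    return sum(1 for a, b in zip(events, events[1:]) if a == "D" and b == "R")

trace = "IIDDRDRDRS"                # view info, alternate design/run, then commit
print(generate_test_cycles(trace))  # 3 cycles
```

Because the count depends on the order of events, not just their frequencies, it preserves the sequence information that turned out to matter in this study.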

Page 16: Online Assessment for Individualized Distributed Learning Applications

Example 1 Results

Page 17: Online Assessment for Individualized Distributed Learning Applications

[Figure: correspondence between theory and online behavior]

Page 18: Online Assessment for Individualized Distributed Learning Applications

Example 1 Conclusion

• Findings consistent with generate-and-test problem-solving strategy

• Sequence of events was an important characteristic of the data

• Simple test of data fusion strategy

• Insertion of software sensors driven by cognitive demands of task

• Low-value clicks transformed into meaningful information

Page 19: Online Assessment for Individualized Distributed Learning Applications

Research Example 2: Problem-Solving Task

• Research Question

• To what extent can learning processes be modeled solely from clickstream (i.e., behavioral) data?

• More complex test of data fusion strategy in a different domain

• Use Bayesian networks to depict dependencies between cognitive processes and online behavior

Page 20: Online Assessment for Individualized Distributed Learning Applications

[Screenshot of the IMMEX task, annotated: parents; test procedures]

Page 21: Online Assessment for Individualized Distributed Learning Applications

Behavioral Indicator Example

Construct: “Understands a test procedure”

• Indicator: Not testing for a parent that could have been eliminated with a prior test

• Indicator: Successive reduction in the number of parents tested across tests

Construct: “Successful learning”

• Indicator: test -> library access of test -> test

• Indicator: library access of test -> test -> library access of test

• Indicator: 5s or more spent on library access of test -> test
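
Each indicator is a low-inference pattern over the event stream, which keeps the software side simple. Here is a sketch of the two sequence indicators for “successful learning,” assuming "T" marks a test and "L" marks a library access about that test; the labels are invented, not the IMMEX instrumentation.

```python
def has_subsequence(events: list[str], pattern: list[str]) -> bool:
    """True if `pattern` occurs as consecutive events in the stream."""
    n = len(pattern)
    return any(events[i:i + n] == pattern for i in range(len(events) - n + 1))

# Assumed event labels: "T" = ran a test, "L" = library access about that test.
stream = ["T", "L", "T", "L"]

indicators = {
    "test_library_test":    has_subsequence(stream, ["T", "L", "T"]),
    "library_test_library": has_subsequence(stream, ["L", "T", "L"]),
}
print(indicators)  # both True for this stream
```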

Page 22: Online Assessment for Individualized Distributed Learning Applications

Bayesian Network

[Figure: Bayesian network with inferred-process nodes feeding behavioral-indicator nodes]
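
At its core, the fusion step is ordinary Bayesian updating: each observed indicator shifts the probability of the inferred process it depends on. A hand-rolled sketch for one latent node and one binary indicator; all probabilities are invented for illustration.

```python
# One inferred-process node ("understands test procedure") with one
# binary behavioral-indicator child. All numbers are made up.
p_understands = 0.5    # prior on the latent node
p_ind_given_yes = 0.8  # P(indicator observed | understands)
p_ind_given_no = 0.3   # P(indicator observed | does not understand)

def posterior(indicator_observed: bool) -> float:
    """Bayes update of the latent node given one indicator observation."""
    like_yes = p_ind_given_yes if indicator_observed else 1 - p_ind_given_yes
    like_no = p_ind_given_no if indicator_observed else 1 - p_ind_given_no
    num = like_yes * p_understands
    return num / (num + like_no * (1 - p_understands))

print(posterior(True))   # ~0.73: observing the indicator raises the estimate
print(posterior(False))  # ~0.22: its absence lowers it
```

A full network chains many such updates across dependent nodes, but the inferential move is the same.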

Page 23: Online Assessment for Individualized Distributed Learning Applications

Example 2 Results

Overall, similar pattern of results between BN and think-aloud measures with respect to:

• Task performance measures

• High vs. low performers

• Scientific reasoning

Page 24: Online Assessment for Individualized Distributed Learning Applications

Example 2 Conclusion

• More complex test of data fusion strategy

• Descriptive measures derived from clickstream data

• Low complexity, low inference -- easy to program in software

• Inferences drawn from Bayesian network at level that is meaningful for instruction or assessment purposes

• Low-value clicks transformed into meaningful information

Page 25: Online Assessment for Individualized Distributed Learning Applications

Research Example 3: Knowledge of Rifle Marksmanship

• How can information from assessments be used to deliver individualized instructional recommendations in a distributed learning (DL) context?

Page 26: Online Assessment for Individualized Distributed Learning Applications

Linking Assessment and Instruction

[Figure: item-level scores feed a Bayesian network model of knowledge dependencies; the network outputs the probability of knowing each topic; a recommender combines those probabilities with an ontology of the marksmanship domain to deliver individualized feedback and content]
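
The recommender's core decision can be as simple as thresholding each topic's posterior and looking up linked content in the ontology. A minimal sketch follows; the topics, threshold value, and content paths are all hypothetical.

```python
# Posterior P(knows topic) from the Bayesian network -- invented values.
topic_probability = {"aiming": 0.85, "breath_control": 0.40, "trigger_control": 0.55}

# The ontology links each topic to instructional content -- hypothetical paths.
content_for = {
    "aiming": "content/aiming_basics.html",
    "breath_control": "content/breath_control_drill.html",
    "trigger_control": "content/trigger_squeeze.html",
}

THRESHOLD = 0.6  # serve content when the network doubts the topic is known

def recommend(probs: dict[str, float]) -> list[str]:
    """Return content for every topic whose knowledge estimate is below threshold."""
    return [content_for[t] for t, p in sorted(probs.items()) if p < THRESHOLD]

print(recommend(topic_probability))
# ['content/breath_control_drill.html', 'content/trigger_squeeze.html']
```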

Page 27: Online Assessment for Individualized Distributed Learning Applications

Example of Feedback and Content Delivery

Page 28: Online Assessment for Individualized Distributed Learning Applications

Example 3 Results

• BN probabilities increased for concepts that had instructional content served

• BN probabilities did not change for concepts that did not have instructional content

• BN probabilities corresponded with Marines’ self-ratings of their level of knowledge (80% agreement)

Page 29: Online Assessment for Individualized Distributed Learning Applications

Current Work

• Circuit analysis

  • Validating technique for use in Electrical Engineering gateway course

• Rifle marksmanship

  • Integrated test of general approach

  • Compare DL system, coach, and control conditions on shooting performance

Page 30: Online Assessment for Individualized Distributed Learning Applications

[Figure: architecture of the DL system, hosted on a laptop]

• Inputs to the estimator: knowledge, background, and physiological measures; dynamic user self-assessments; shooting performance (score, dispersion pattern, wobble, time-to-shot); clickstream (dwell pattern, access pattern, help-seeking pattern); state measures (worry, concentration, sight alignment, steadiness, nervousness, self-assessment of shooting knowledge)

• Estimator: Bayesian network capturing knowledge dependencies and knowledge × performance dependencies

• Recommender: uses a content ontology to nominate concepts or procedures and to recommend procedures and content

• Delivery: content (text, audio, video, pictures) via a Web page giving access to information and current estimates of knowledge

Page 31: Online Assessment for Individualized Distributed Learning Applications

Summary and Conclusion

• Distributed learning systems are likely to increase in education and training contexts (K-16, military, business)

• The cognitive demands underlying performance tasks provide strong guidance for developing online measures

• Extracting useful information from online behavior appears promising, but more research is needed

Page 32: Online Assessment for Individualized Distributed Learning Applications
Page 33: Online Assessment for Individualized Distributed Learning Applications

Backup

Page 34: Online Assessment for Individualized Distributed Learning Applications

Some 2000-01 Numbers

• 56% of all postsecondary institutions offered distance education courses

  • 90% of public 2-year

  • 89% of public 4-year

    • 48% degree-granting (undergraduate + graduate)

  • 40% of private 4-year

    • 33% degree-granting (undergraduate + graduate)

Source: 2004 NCES Indicator 32

Page 35: Online Assessment for Individualized Distributed Learning Applications

Review Process

• Reviewed 62 commercial and academic Web-based products

• Data sources: online searches, existing reviews, and online learning trade publications

• Criteria for inclusion in analyses:

• System claimed to have Web-based testing capability

• Broad criteria intended to maximize coverage of products

Page 36: Online Assessment for Individualized Distributed Learning Applications

Product/Vendor List

Anlon, BKM-elearning, Blackboard, Centra, Class Act (Darascott's), Click2learn, Computer Adaptive Technology, Convene (IZIOPro), CyberWISE, Docent, E-college, Edusystem, eno.com, e-path BuildKit, Aud Managekit, Eval, First Class, Generation21, iAuthor, IMS Assessment Designer, Infosource (content authoring tool), Interwise Millennium (enterprise communication platform), Intralearn, Jones e-education, Kenexa, Knowledge Planet, Learning Manager, Learning Space, Learnlinc/Testlinc, Librix performance management (Maritz), Macromedia (Authorware 6), Mentorware, Microsoft LRN Toolkit, MKLesson, NCS Pearson, Open Learning Agency of Australia, Pedagogue Testing (Formal Systems), People Sciences, PeopleSoft, Performance Assessment Network, Pinnacle, Plateau4 Learning Management System, Platte Canyon, Prometheus, Quelsys, QuestionMark Perception, RapidExam 2.0, Risc, Saba, Sage, Smartforce, Technomedia, TEDS Learning on Demand, THINQ Training Server, TopClass, Trainersoft 7 Professional, TRIADS, Tutorial Gateway, Ucompass Educator, Vcampus, Virtual-U, WBTmanager, WebCT

Page 37: Online Assessment for Individualized Distributed Learning Applications

2002 Review

• Current Web-based systems provide tools for end-users to assemble, administer, and score tests containing mostly conventional item formats

• Little support for how to develop quality tests or how to use test information

• Little support for performance assessments

• Little support for diagnostic information

• Weak support for linking instruction to test results

Page 38: Online Assessment for Individualized Distributed Learning Applications

Results (N = 53)

BN topic            Reasoning   Know. Map   Prior Knowledge   Shot Group   Position   Qual. Score
Overall knowledge   .28*        .08         .76**             .27*         .32*       .22 (p < .10)
Aiming              .35**       .06         .68**             .24          .38*       .20
Breath control      .24         .08         .66**             .48**        .17        .16
Trigger control     .36**       .20         .50**             .30*         .30*       .40**
Position            .17         .14         .59**             .17          .36**      .32*