
National Center for Research in Advanced Information and Digital Technologies

Henry Kelly, Federation of American Scientists

NITRD Briefing, September 16, 2008

The National Center for Research in Advanced Information and Digital Technologies is part of the reauthorization of the Higher Education Act (section 802) approved by Congress on July 31, 2008, and signed into law by President Bush on August 14, 2008.

Purposes
• Research, development, and demonstrations of learning technologies that could include simulations, games, virtual worlds, intelligent tutors, performance-based assessments, and innovative approaches to pedagogy that these tools can implement.
• Design and testing of components needed to build prototype systems.
• Research to determine how these new systems can best be used to build interest and expertise in learners of different ages and backgrounds.

Management:
• An independent, nonprofit organization with its own Board of Directors.
• Can receive funding from any federal agency and from private organizations.
• The bill authorizes expenditure of funding from the Department of Education; $50 million is being requested for Fiscal Year 200
• Center staff will develop a research plan and ask for competitive proposals. The research will be selected by a peer-review process.
• All material resulting from the research will quickly be made freely and nonexclusively available to the public (waivers that “would result in significant public benefits” are possible but require unanimous Board approval).

Instructional Design
• Create authentic challenges; problem-centered learning
• Continuous assessment of expertise (what can the learner do?)
  – Varied and contrasting examples
  – Demonstration
  – Practice opportunities
• Provide relevant information where and when it’s needed (automated & human)
• Reflection
• Feedback
• Assessment
• Skills refreshment

Bransford; Jonassen, D. H.; Hannafin, M. J., Land, S., & Oliver, K.

Game Features Attractive for Learning
• Authentic, motivating challenges increase time on task
• Personalization
• Continuous assessment (and the right to fail)
• Contextual bridging closes the gap between what is learned and its use
• Scaffolding provides cues and hints to keep the learner progressing (see the sketch after this list)
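A minimal sketch of what progressive scaffolding could look like (Python; the hint tiers and function name are illustrative assumptions, not part of the briefing): cues get stronger only as the learner stalls, so the system keeps the learner moving without handing over the answer.

```python
from typing import Optional

# Hypothetical hint tiers, weakest cue first.
HINT_TIERS = [
    "Cue: look again at the part of the challenge you haven't used yet.",
    "Hint: this problem resembles the worked example you just saw.",
    "Stronger hint: apply the demonstrated procedure one step at a time.",
]

def next_scaffold(failed_attempts: int) -> Optional[str]:
    """Return a progressively stronger hint, or None while the learner is
    progressing (or once the tiers run out and a human tutor should step in)."""
    if failed_attempts == 0:
        return None                       # progressing: stay out of the way
    if failed_attempts > len(HINT_TIERS):
        return None                       # escalate to a human tutor instead
    return HINT_TIERS[failed_attempts - 1]

if __name__ == "__main__":
    for attempts in range(5):
        print(attempts, "->", next_scaffold(attempts))
```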

Inquiry Management
• Stimulate deep questions (failing to achieve a compelling goal can do this)
• A good answer depends on:
  – Technical accuracy
  – Knowledge about the person asking
  – Knowledge of the context of the question
  – An instructional strategy (answer with another question?)
• Response includes knowledge of: content, the individual learner, context, and pedagogical strategy (sketched after this list)
• Multimedia questions and responses (e.g. “what’s that?” [points at a cell])
• Mixture of artificial and human intelligence

Graesser and Person; Beck, I. L., McKeown, M. G., Hamilton, R. L., & Kucan, L.; Miyake, N., & Norman, D. A.
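A minimal sketch (Python; every class, field, and example here is a hypothetical assumption, not something the briefing specifies) of how an inquiry-management component might weave the four knowledge sources above (content, the individual learner, context, and a pedagogical strategy) into a single response.

```python
from dataclasses import dataclass

@dataclass
class Learner:
    learner_id: str
    estimated_mastery: float          # 0.0 (novice) .. 1.0 (expert)

@dataclass
class Question:
    text: str
    topic: str
    context: str                      # e.g. "pointing at a cell in the simulation"

def compose_response(question: Question, learner: Learner, content: dict) -> str:
    """Combine content knowledge, the learner model, the question's context,
    and a simple pedagogical strategy into one reply."""
    fact = content.get(question.topic)
    if fact is None:
        return "Routing this one to a human tutor."   # mixture of AI and human support
    # Pedagogical strategy: answer a strong learner with another question;
    # give a novice the fact plus its context.
    if learner.estimated_mastery > 0.6:
        return f"You're {question.context}. What do you think it is, and why?"
    return f"{fact} (You asked while {question.context}.)"

# The slide's multimedia example: "what's that?" [points at a cell]
content = {"cell structure": "That organelle is a mitochondrion."}
novice = Learner("s-042", estimated_mastery=0.3)
print(compose_response(Question("what's that?", "cell structure", "pointing at a cell"),
                       novice, content))
```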

Assessment
• Measures of expertise that can form the basis of competitive approaches
• Measures authentic to learners, employers, and instructors
• Continuous, multi-dimensional assessments of content mastery (how would an expert behave?); see the sketch after this list
• Measures competence using a challenge that makes sense to the learner, instructor, and employers
• Performance based
• Reproducible
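A minimal sketch of continuous, multi-dimensional mastery tracking (Python; the exponential-smoothing update and all names are assumptions for illustration, not the Center's method): each performance-based challenge updates a running score per skill dimension, giving learners, instructors, and employers a readable snapshot at any time.

```python
from collections import defaultdict
from typing import Dict

class MasteryRecord:
    """Running, multi-dimensional mastery estimate: one score per skill,
    updated after every performance-based challenge."""

    def __init__(self, smoothing: float = 0.3):
        self.smoothing = smoothing            # weight given to the newest observation
        self.scores = defaultdict(float)      # skill name -> score in 0.0 .. 1.0

    def record_challenge(self, results: Dict[str, float]) -> None:
        """`results` maps each skill the challenge exercised to an observed 0..1 score."""
        for skill, observed in results.items():
            previous = self.scores[skill]
            self.scores[skill] = (1 - self.smoothing) * previous + self.smoothing * observed

    def report(self) -> Dict[str, float]:
        """Snapshot an instructor or employer could read directly."""
        return dict(self.scores)

# One authentic challenge can exercise several dimensions at once.
record = MasteryRecord()
record.record_challenge({"diagnosis": 0.8, "tool_use": 0.5})
record.record_challenge({"diagnosis": 0.9, "communication": 0.7})
print(record.report())
```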

[Chart: “Time to Train Results” (snapshots from 18 Jun 2004, 30 Sep 2004, and 27 May 2005). Distribution of students (y-axis: 100 to 300 students) against time to train relative to the legacy course (x-axis: 0 to 100%, marks at 25%, 50%, 75%). Two populations are contrasted: active learners (very independent, previous work or higher education, tend toward higher ASVAB scores) and passive learners (desire continuous direction, weak work experience, tend toward lower ASVAB scores); notes flag that each range also contains some learners whose ASVAB scores run counter to the trend. Population n = 11,836 (ATT: 10,554; IC: 283; GM: 334; TM: 140; FC: 271; ET: 254).]

“Two years ago everybody would show up on Monday and they graduate from school two months later. Not anymore.”

“We are moving to performance based testing as quickly as we can.”

VADM Kevin Moran, Commander, Naval Education and Training Command, 2006

Evaluation

• Explore gains in deep expertise
• Know where and when to use the new tools (what groups, what concepts)
• Group AND individual evaluation
• Diverse demographics
• Continuous feedback to research teams

Build a World Using a World

[Diagram: a shared virtual world assembled from many sources. Content inputs (static objects with associated metadata; AI characters with associated metadata, scripts, and behaviors; contributions from experts, museums, and archives; 3D object producers) pass through converters into the world. People in the world include learners, players, visitors, role players, tutors, experts, instructors/team leaders, faculty, and reviewers, represented by avatars. Converters also link the world to learning management systems, exchanging learning modules, performance tests, student records, data on teams, production schedules, and user ratings. A converter-interface sketch follows.]
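A minimal sketch of the “Converters” role in the diagram above (Python; the interface, class names, and the museum-archive example are assumptions for illustration): each external source gets its own converter that normalizes raw records into a common asset form before anything enters the shared world or its learning management systems.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class WorldAsset:
    """Common form that every contribution is converted into."""
    asset_id: str
    kind: str                          # "static_object", "ai_character", "learning_module", ...
    metadata: Dict[str, str] = field(default_factory=dict)

class Converter(ABC):
    """One converter per external source: museum archive, 3D producer, LMS export, ..."""
    @abstractmethod
    def convert(self, raw: Dict[str, str]) -> WorldAsset:
        ...

class MuseumArchiveConverter(Converter):
    def convert(self, raw: Dict[str, str]) -> WorldAsset:
        return WorldAsset(
            asset_id=raw["accession_number"],
            kind="static_object",
            metadata={"title": raw.get("title", "untitled"), "source": "museum_archive"},
        )

def ingest(batch: List[Tuple[Converter, Dict[str, str]]]) -> List[WorldAsset]:
    # The world and the LMS only ever see normalized WorldAssets, never raw source formats.
    return [converter.convert(raw) for converter, raw in batch]

print(ingest([(MuseumArchiveConverter(),
               {"accession_number": "1912.44", "title": "Astrolabe"})]))
```

Under this assumption, adding a new content source (a student-record export, a new 3D producer) means writing one more converter rather than changing the world itself.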

Use Cases