NCSA Session: Theory and Research on Item Response Demands: What Makes Items Difficult? Construct-Relevant? June 20, 2010, Detroit. Frameworks for Considering Item Response Demands and Item Difficulty. Kristen Huff, College Board; Steve Ferrara, CTB/McGraw-Hill.


Page 1

NCSA Session: Theory and Research on Item Response Demands: What Makes Items Difficult? Construct-Relevant?
June 20, 2010, Detroit

FRAMEWORKS FOR CONSIDERING ITEM RESPONSE DEMANDS AND ITEM DIFFICULTY

Kristen Huff, College Board
Steve Ferrara, CTB/McGraw-Hill

Page 2

THESIS

A coherent and comprehensive understanding of the interaction between items and examinees, and of the controllable item features that elicit predictable interactions, is needed to:

- Design items that do a better job of measuring what we are interested in knowing
- Design tests that are better suited to facilitating valid inferences about student performance
- More generally, bridge the gap between large-scale assessment and teaching and learning

Page 3

Page 4

OBJECTIVE

To conduct research (and review past research) that informs the development of a framework, or conceptual structure, of item response demands.

Research questions:

- How well do existing item response demand schemas cover current achievement constructs?
- What features of items influence item difficulty?
- What student responses are triggered by different item features, and how do those responses influence item difficulty?

Page 5


Page 6

Example item features that may influence difficulty:

- Reading comprehension: number of words, reading level, overlap between the key and the text
- Math: number of variables, graphics, fractions vs. whole numbers
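One way to probe how a coded feature relates to difficulty is to correlate it with classical item difficulty (p-values, the proportion of examinees answering correctly). A minimal sketch in Python; the word counts and p-values below are invented for illustration, and the `pearson_r` helper is ours, not part of any framework discussed here:

```python
# Hypothetical data: word counts and classical difficulty (p-values)
# for eight reading comprehension items. Values are illustrative only.
word_counts = [40, 55, 70, 85, 100, 120, 140, 160]
p_values = [0.88, 0.82, 0.79, 0.71, 0.65, 0.58, 0.52, 0.47]

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# A strongly negative r would suggest longer items are harder
# (i.e., have lower p-values) in this hypothetical pool.
r = pearson_r(word_counts, p_values)
print(f"correlation between word count and p-value: {r:.2f}")
```

In practice, feature–difficulty research of the kind described here uses many coded features and regression-style models rather than a single bivariate correlation, but the basic logic is the same.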

Page 7

Page 8

Page 9

FRAMEWORK

Page 10

Page 11

FRAMEWORK

Page 12

Intended cognitive complexity (Simple vs. Complex) crossed with observed item difficulty (Easy vs. Hard):

- Easy item, simple intent: "good" item (good design and OTL)
- Easy item, complex intent: item design flaw? Exceptionally effective instruction? Flawed assumptions? Restriction in range?
- Hard item, simple intent: item design flaw, or no OTL
- Hard item, complex intent: "good" item, or no OTL
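Read as a decision rule, the complexity-by-difficulty table can be sketched as a small lookup function. The cell labels follow the slide; the function name and string encodings are ours:

```python
def classify_item(intended_complexity, observed_difficulty):
    """Map an item into a cell of the complexity-by-difficulty table.

    intended_complexity: "simple" or "complex"
    observed_difficulty: "easy" or "hard"
    Returns the slide's interpretation for that cell.
    """
    table = {
        ("simple", "easy"): '"Good" item (good design and OTL)',
        ("complex", "easy"): ("Item design flaw? Exceptionally effective "
                              "instruction? Flawed assumptions? "
                              "Restriction in range?"),
        ("simple", "hard"): "Item design flaw, or no OTL",
        ("complex", "hard"): '"Good" item, or no OTL',
    }
    return table[(intended_complexity, observed_difficulty)]

print(classify_item("complex", "hard"))
```

The point of the table (and the sketch) is that observed difficulty alone is ambiguous: the same empirical result can signal a design flaw, an instruction effect, or a well-functioning item, depending on the intended complexity.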

Page 13

Opportunity to learn (Taught vs. Not taught) crossed with observed item difficulty (Easy vs. Hard):

- Easy item, taught: effective instruction
- Easy item, not taught: item design flaw
- Hard item, taught: item design flaw, or just tough stuff?
- Hard item, not taught: no OTL

Page 14

CONCLUSIONS AND POINTS OF DISCUSSION

- Mapping the landscape of response demands, item features, and the interactions between the two is messy, difficult work
- Achieving "coherent and comprehensive" frameworks needs to be a higher priority in research; start now by using draft frameworks in operational testing programs
- We need more effective and easier ways to gauge opportunity to learn, for many reasons, but also to inform the work we are recommending