
2004 CCSSO Large-scale Conference

Measurement Purgatory or Best Practice? Alternate Assessment for Students with Significant Cognitive Disabilities

Don Peasley, Ohio Department of Education
Tom Deeter, Iowa Department of Education
Rachel Quenemoen, NCEO


Overview
• What is required for alternate assessments on alternate achievement standards (AA-AAS) in the context of the 1% Rule (and last Saturday's presession)?
• What is required for AA-AAS in the context of Title I Peer Review?
• Where are we now, and where do we have to go?


Alternate Assessments as defined in the "1% Rule"
• Aligned with the State's grade-level content standards.
• Yield results separately in reading/language arts and mathematics.
• Designed and implemented to support use of the results to determine AYP.


Alternate Assessments should have…
• Clearly defined structure
• Guidelines for which students may participate
• Clearly defined scoring criteria and procedures
• A report format that clearly communicates student performance in terms of the academic achievement standards defined by the State


Alternate Assessments must meet the same requirements for high technical quality that apply to regular assessments under NCLB:
• Validity
• Reliability
• Accessibility
• Objectivity
Consistent with nationally recognized professional and technical standards.


States may use more than one alternate assessment:
• Alternate assessment scored against grade-level achievement standards
• Alternate assessment scored against alternate achievement standards
Both must support access to the grade-level curriculum.


Development of Alternate Assessments (Quenemoen, Rigney, & Thurlow, 2002)

1. Careful stakeholder and policymaker development of desired student outcomes for the population, reflecting the best understanding of research and practice, thoughtfully aligned to the same grade-level content expected of all students.
2. Careful development, testing, and refinement of assessment methods.
3. Scoring of evidence of grade-level, content-aligned student work, according to professionally accepted standards, against criteria that reflect the best understanding from research and practice.
4. A standard-setting process that allows use of results in reporting and accountability systems.
5. Continuous improvement of the assessment process.


[Figure: The assessment triangle — Cognition, Observation, Interpretation (Pellegrino et al., 2001)]


Professional Understanding of Learning Goals
Shifting goals for students with significant cognitive disabilities since 1975 (Browder, 2001; Kearns & Kleinert, 2004):
• Developmental Goals – "ready meant never"
• 1980s – Functional Goals
• 1990s – Academic Goals – "general curriculum," leading to developmental traps

NOW WE HAVE REFOCUSED ON: GRADE-LEVEL Academic Content Standards


WHAT IS LEARNING? We must ensure all students have access to, and make progress in, the grade-level academic content, and assess achievement on that content.

What is achievement? What is proficiency?


Title I Peer Review Checklist (MSRRC)


Draft Technical Manual Outline

Section I—Assessment Development

A. Overview
• Principles guiding development
• Partners and process guiding development
• Research base on desired outcomes for this population; clarification of theory of learning – develop draft performance level descriptors
• Documentation of state conceptualization for (expansion/extension) alignment and access to the state grade-level content standards
• Pros and cons of assessment methods considered
• Description of selected approach


TASK: Write draft performance level descriptors for AA-AAS

• Charlie DePascale, Jeff Nellhaus, Barbara Plake, Michael Beck session on Monday – nciea.org – basic information on standard-setting
• Depth of understanding? Differ in substance? Differ in amount? All of the content? Some of the content? Any of the content?
• What does it mean for these students to be proficient in mathematics? In ELA? Are we avoiding developmental traps?


B. Test Development
• Protocol for alignment to grade-level content standards
• Development of draft assessment protocol
• Pilot test design and results
• Field test design and results

C. Test Blueprint
• English Language Arts content specifications
• Mathematics content specifications
• Other (e.g., Science) content specifications


Section II—Test Administration

A. Procedures for administration
• Decision-making process (participation, IEP team role)
• Local responsibility
• Timelines

B. Training
• Test oversight training for administrators
• Educator training for those working directly with students
• Ethical test administration training


Ohio’s Alternate Assessment for Students with Disabilities

• The Ohio Alternate Assessment is based on a Collection of Evidence (COE) model
• Designed to be a measure of student achievement aligned with Ohio's Academic Content Standards
• The alternate assessment is a "snapshot" of achievement during a window of time


Collection of Evidence (for each academic area):
• Cover page
• Entry 1 (Standard)
• Entry 2 (Standard)
• Entry 3 (Standard)
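The COE structure above is simple enough to express as a small data model. Below is a minimal, hypothetical sketch in Python; the class and field names are illustrative assumptions, not Ohio's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Entry:
    """One standard-aligned entry in the collection (illustrative only)."""
    standard: str        # the academic content standard the entry addresses
    evidence: list[str]  # e.g., descriptions of work samples or data sheets

@dataclass
class CollectionOfEvidence:
    """Cover page plus three entries for one academic area, per the slide."""
    student_id: str      # cover-page identification (hypothetical field)
    academic_area: str   # e.g., "Reading" or "Mathematics"
    entries: list[Entry] = field(default_factory=list)

    def is_complete(self) -> bool:
        # The slide shows exactly three entries, each tied to a standard.
        return len(self.entries) == 3 and all(e.standard for e in self.entries)
```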


Ohio's Participation Decision Framework:

1. Does the student have a disability that presents "unique and significant" challenges to participation in district and state assessments, regardless of the accommodations the student could use?
   NO → Participation in regular district and state assessments, with or without accommodations
   YES → continue
2. Does the student have severe motor, sensory, cognitive, or emotional disabilities?
   NO → Participation in regular district and state assessments, with or without accommodations
   YES → continue…


Ohio's Decision Framework (continued). Does the student:
• Require substantial modifications to the general education curriculum (form and substance)? AND
• Require instruction focused on the application of state standards through essential life skills? AND
• Require instruction multiple levels below age/grade level? AND
• Appear unlikely to provide a valid and reliable measure of proficiency in the content areas via standardized assessment, even with accommodations?

NO to any → Participation in regular district and state assessments, with or without accommodations
YES to all → Student participates in the Alternate Assessment
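Read as a procedure, the two framework slides chain a screening stage with four AND-joined criteria. Here is a minimal sketch of that logic in Python; the parameter names are paraphrases of the slide language, and the real decision is of course made by an IEP team, not a program.

```python
def participates_in_alternate_assessment(
    unique_significant_challenges: bool,    # screening question 1
    severe_disability: bool,                # screening question 2 (motor/sensory/cognitive/emotional)
    substantial_curriculum_modifications: bool,
    life_skills_instruction: bool,
    instruction_below_grade_level: bool,
    standardized_measure_not_valid: bool,
) -> bool:
    """Sketch of the participation framework; False means participation in
    regular assessments, with or without accommodations."""
    # Screening: a "no" on either question routes to the regular assessment.
    if not (unique_significant_challenges and severe_disability):
        return False
    # The four criteria are joined by AND on the slide; all must hold.
    return (substantial_curriculum_modifications
            and life_skills_instruction
            and instruction_below_grade_level
            and standardized_measure_not_valid)
```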


Section III— Scoring and Reporting

A. Scoring design
• Quality control
• Benchmarking
• Selecting and training scorers
• Scoring activities
• Inter-scorer reliability


Ohio’s Alternate Assessment for Students with Disabilities

• Scoring: the Collection of Evidence is scored across four domains (scoring criteria):
  • Performance—holistic, by entry
  • Independence/Support—holistic, by entry
  • Context/Complexity—holistic, by entry
  • Settings and Interactions—for the entire collection
• Evidence is scored independently, according to professionally accepted standards, by scoring contractors
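"Inter-scorer reliability" in the scoring design above is typically summarized with statistics such as the exact-agreement rate or Cohen's kappa on the holistic entry scores. A self-contained illustration, using invented scores rather than Ohio data:

```python
from collections import Counter

def exact_agreement(a: list[int], b: list[int]) -> float:
    """Proportion of entries on which two scorers gave the same score."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a: list[int], b: list[int]) -> float:
    """Agreement corrected for chance: (observed - expected) / (1 - expected)."""
    n = len(a)
    observed = exact_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)  # chance agreement
    return (observed - expected) / (1 - expected)

scorer_1 = [3, 2, 4, 3, 1, 2, 3, 4]  # invented holistic entry scores
scorer_2 = [3, 2, 3, 3, 1, 2, 4, 4]  # same entries, second scorer
print(exact_agreement(scorer_1, scorer_2))         # 0.75
print(round(cohens_kappa(scorer_1, scorer_2), 3))  # 0.652
```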


B. Standard-setting
• Documented and validated process used for standard setting (full description in Appendix _)
• Performance level descriptors and exemplars for alternate achievement standards
• Distribution of performance across levels
• Comparison with performance across levels achieved on the general assessment by students with disabilities in comparable implementation years

C. Reporting design
• School/District/State Report
• Parent Letter/Individual Student Report


Ohio Results, Grade 3 Reading Achievement, March 2004
(percent of students at each performance level)

                      Regular        Scored Alternate   Total Alternate
                      Assessments    Assessments        Assessments
Limited               11.5%          5.7%               21.1%
Basic                 13.5%          9.7%               8.1%
Proficient            19.5%          39.8%              33.3%
Accelerated           26.0%          32.5%              27.2%
Advanced              29.6%          12.3%              10.3%
Proficient or Above   75.1%          84.6%              70.8%
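As a quick check on how the table is read: "Proficient or Above" in each column is the sum of the Proficient, Accelerated, and Advanced rows, e.g. in Python:

```python
# Grade 3 Reading, March 2004: (Proficient, Accelerated, Advanced) percentages
columns = {
    "Regular Assessments":          (19.5, 26.0, 29.6),
    "Scored Alternate Assessments": (39.8, 32.5, 12.3),
    "Total Alternate Assessments":  (33.3, 27.2, 10.3),
}
for name, rows in columns.items():
    print(name, round(sum(rows), 1))  # 75.1, 84.6, 70.8 — matching the table
```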


Ohio Graduation Tests (Grade 10) Reading, March 2004
(percent of students at each performance level)

                      Regular        Scored Alternate   Total Alternate
                      Assessments    Assessments        Assessments
Limited               12.1%          15.7%              13.2%
Basic                 10.3%          12.0%              10.0%
Proficient            25.9%          30.4%              25.4%
Accelerated           25.9%          13.9%              11.6%
Advanced              25.9%          28.1%              23.5%
Proficient or Above   77.7%          72.4%              60.5%


Ohio Graduation Tests (Grade 10) Mathematics, March 2004
(percent of students at each performance level)

                      Regular        Scored Alternate   Total Alternate
                      Assessments    Assessments        Assessments
Limited               17.2%          23.0%              19.8%
Basic                 14.9%          15.3%              13.2%
Proficient            31.3%          19.3%              16.6%
Accelerated           19.6%          21.1%              18.1%
Advanced              17.0%          21.3%              18.3%
Proficient or Above   67.9%          61.7%              53.0%


Section IV—Reliability and Validity; Other Technical Considerations

A. Summary of studies for reliability; available data

B. Summary of studies for validity; available data
• Face validity studies
• Concurrent validity studies
• Consequential validity studies

C. Other technical considerations
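As one concrete example of the study types listed above, a concurrent validity study might correlate alternate assessment scores with an independent measure collected for the same students. A minimal sketch with invented data (statistics.correlation requires Python 3.10+):

```python
from statistics import correlation  # Pearson's r

# Invented scores for the same students on two measures
alternate_scores  = [12, 15, 9, 20, 14, 17, 11]
concurrent_scores = [10, 16, 8, 19, 15, 14, 12]
print(round(correlation(alternate_scores, concurrent_scores), 2))
```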


Section V—Appendices

• Appendix A: Documentation of development principles, partners, process, research base
• Appendix B: Documentation of training provided, attendance, quality control
• Appendix C: Documentation of scoring protocols, process, quality control
• Appendix D: Formal evaluation data, if available
• Appendix E: Standard-setting report


• Who are the learners who take alternate assessments? How do the type and size of the population vary in terms of learner characteristics, available response repertoires, and complex medical conditions? How does variation in who the learners are affect the assessment triangle, and ultimately technical adequacy studies?

• What does the literature say about how students in this (these) population(s) learn? How do current theories of learning in the typical population apply to this population?


• How is technical adequacy defined? What is meant by reliability, validity? How do traditional definitions of reliability/validity apply to alternate assessments?

• What are the technical adequacy issues in alternate assessments that cannot be resolved with the current knowledge base in large-scale assessment? What strategies can be used to resolve these issues?


• What consequential validity issues (intended and unintended consequences) challenge the foundational assumptions in an alternate assessment? What is the relationship between foundational assumptions of alternate assessments and technical adequacy issues?

• What lessons learned from alternate assessment need to be addressed for the general assessment as well?


Next Steps

• Define the learners, and determine how this differs across states

• Build consensus on a theory of learning in the academic content domains for these students

• Step out of our specializations and think together about these challenges