Corey D. Pierce, Ph.D. 1
Assessment 101 – Everything you need to know to efficiently collect academic data in your system
CBOCES Professional Development Series 2011/12
Session #2, November 4, 2011
Loveland, CO
Problem Solving
If we can really understand the problem, the answer will come out of it, because the answer is not separate from the problem.
-Krishnamurti
Corey D. Pierce, Ph.D. 3
When You Think About Assessment-
1. What are the questions that need to be answered?
2. What information do you need to obtain from your evaluation to answer the questions you have?
3. What will you do to get the information you need?
4. How will you use the information you gathered for instructional or curricular purposes?
4
Continuous Improvement Cycle
6
[Flowchart contrasting two systems. Historical system: referral leads to SPED eligibility testing; eligible students receive SPED intensive treatment, non-eligible students recycle through the process. Effective educational system: universal screening leads to scientifically validated general education with supplemental treatments (Tiers 2–3); responders are monitored, and non-responders move to a SPED eligibility evaluation, where eligible students receive SPED intensive treatment and non-eligible students receive non-SPED intensive treatment, with continued monitoring of responders and non-responders. Adapted from Fletcher, '05; used with permission.]
Assessments in Effective Educational Systems
• Screening and Benchmark (at grade level): Universal measures that give a quick read on whether students have mastered critical skills.
• Progress Monitoring (typically at instructional level): Determines whether adequate progress is made based on individual goals regarding critical skills.
• Diagnostic: Individually administered to gain more in-depth information and guide appropriate instruction or intervention plans in the area of concern.
• Outcome/Summative: Provides an evaluation of the effectiveness of instruction and indicates student year-end achievement when compared to grade-level performance standards.
7
Corey D. Pierce, Ph.D. 8
Effective systems focus on a process for achieving higher levels of academic and behavioral success for all students through:
High Quality Instructional Practice
Continuous Review of Student Progress (multiple measures)
Collaboration
Corey D. Pierce, Ph.D. 9
A Systemic Approach for Constant Inquiry

Continuous Review of Student Progress

To assess:
• How all students are performing (screening)
• How they are responding to differentiated core instruction (ongoing assessment)
• How they are responding to intervention/additional supports (monitoring progress)
Corey D. Pierce, Ph.D. 10
Balanced Assessment System
Key Components:
• Continuum of assessments
• Multiple users
• Multiple information sources, used to create a complete picture of student progress
• Each assessment type has a primary purpose, as well as strengths and limitations
Corey D. Pierce, Ph.D.
Balanced Assessment System
• Formative: daily ongoing evaluation strategies; immediate feedback; student-centered.
• Benchmark: periodic diagnostic/progress assessments; multiple data points across time; classroom/school-centered.
• Summative: large-scale standardized assessments; annual snapshot; school/district/state-centered.
11
Corey D. Pierce, Ph.D. 12
Classroom, Interim, and Large-Scale Assessment Compared

What do I learn from this assessment?
• Classroom: What learning comes next for this student?
• Interim: How are students progressing? How well is this program working?
• Large Scale: How are schools, districts, and/or states progressing?

Who will use the information?
• Classroom – primary users: students, teachers.
• Interim – primary users: students, teachers/teaching teams, parents, schools/districts.
• Large Scale – primary users: schools/districts, state, federal; secondary users: teachers, parents, students, independent evaluators.

What is the frequency of the assessment?
• Classroom: short cycle – occurring within and between lessons.
• Interim: medium cycle – occurring within and between instructional units.
• Large Scale: long cycle – occurring annually or bi-annually.

What action will be taken with the results?
• Classroom: within the instructional activity, information is used to change or adjust teaching; students receive frequent and meaningful feedback on their performance; teachers engage students in the monitoring of their own learning.
• Interim: re-teach decisions; professional learning communities use data to identify strengths and gaps in instruction and curriculum; curriculum may be changed/refined; teachers may modify instruction for individuals and groups of students based on their progression toward outcome goals (selected, targeted options).
• Large Scale: strategic long-term evaluation of curriculum/programming; determination of Adequate Yearly Progress.

What professional development is needed for the primary users?
• Classroom: a solid understanding of learning progressions and the feedback loop; concrete examples of formative assessment strategies; collaborative work time to share effective teaching strategies; opportunities to practice and perfect usage; collaboration time to plan for assessments.
• Interim: if district- or school-developed, identification of learning outcomes and time to develop items; if using a commercial product, time to learn the features of the tool and how the results link to district curriculum and state standards; time to disaggregate and interpret data; collaborative work time to share effective teaching strategies.
• Large Scale: understanding and interpretation of large-scale assessment; time to disaggregate data and evaluate progress at the building and grade level; opportunities for revising instructional practices and/or curriculum.

What is the connection to content standards?
• Lessons are connected to the broader curriculum; curriculum is linked to grade-level standards; grade-level standards are linked to the framework.

Examples
• Classroom: feedback, questioning, interviews, exit questions, observations, discussions, un-graded class work.
• Interim: end-of-unit assessments, district assessments (6-traits), benchmark assessments, commercial products (see www.studentprogress.org).
• Large Scale: CSAP, NAEP, MAPS?
Corey D. Pierce, Ph.D. 13
Summative/Large-Scale
• Purpose:
• To determine how students in schools, districts, and states are progressing
• To inform curriculum and instruction
• To determine Adequate Yearly Progress (AYP)
Corey D. Pierce, Ph.D. 14
Benchmark Assessment
• Purpose:
• To determine to what extent all students are progressing (screening)
• To determine how well additional supports or services are working before too much time passes (monitoring progress)
Corey D. Pierce, Ph.D. 15
Formative Assessment
• Purpose:
• To consider what learning comes next for students
• To improve learning while there is still time to act – before the graded event
Corey D. Pierce, Ph.D. 16
Benchmark Assessment: Screening
• Definitions
• Purposes/Rationale
• Strengths and Limitations
• Common features
• Research
• Resources for getting started: Academics & Behavior
Corey D. Pierce, Ph.D. 17
Screening: Definition
• Screening is characterized by fast, inexpensive, repeatable data collection about critical skills, beliefs, or behaviors.
• Screening usually identifies students who need further assessment or provides information for future planning activities.
Corey D. Pierce, Ph.D. 18
Screening: Purposes/Rationale
• The purpose of screening is to identify students who are “at-risk” of a poor outcome
• Rationale: Use a screener with strong statistical properties along with other data to identify students you want to learn more about
• Don’t wait until it’s too late – CSAP is a poor screener for this reason, since its results arrive too late to act on.
Corey D. Pierce, Ph.D. 19
Screening: Strengths & Limitations
Strengths:
• By definition: easy, quick, repeatable
• Immediate results
• Guides programming
• Predictive validity

Limitations (how misused):
• Used diagnostically
• Used alone to guide instruction (by administrators or teachers)
• Used absent good progress monitoring and formative assessment
• Statistical limitations
Corey D. Pierce, Ph.D. 20
Selected Research on Screening
• Jenkins, J. R., Hudson, R. F., & Johnson, E. S. (2007). Screening for service delivery in an RTI framework: Candidate measures. School Psychology Review, 36, 560–582.
• Messick, S. (1989). Validity. In R. L. Linn (Ed.), Educational measurement (3rd ed., pp. 13–103). New York: Macmillan.
• Riedel, B. W. (2007). The relationship between DIBELS, reading comprehension, and vocabulary in urban first-grade students. Reading Research Quarterly, 42, 546–567.
• Ritchie, K. D., & Speece, D. L. (2004). Early identification of reading disabilities: Current status and new directions. Assessment for Effective Intervention, 29(4), 13–24.
• Snellen Eye Chart (1862).
Corey D. Pierce, Ph.D. 21
Resources for Screening
Behavioral Screening:
• BASC
• CBCL
• Office referrals
• Teacher nomination

Academic Screening:
• National Center on RtI (http://www.rti4success.org/screeningTools)
• The ABCs of CBM by Hosp et al.
Corey D. Pierce, Ph.D. 22
Formative (Ongoing) Assessment
• Definitions
• Purposes/Rationale
• Strengths and Limitations
• Common features
• Research
• Resources for getting started: Academics & Behavior
Corey D. Pierce, Ph.D. 23
Formative (Ongoing) Assessment
Definition:
“Formative assessment is an intentional and systematic process used by teachers and students during instruction that provides feedback to adjust on-going teaching and learning to improve students’ achievement of the intended instructional outcomes.”
– CCSSO, 2007
Corey D. Pierce, Ph.D. 24
Formative (Ongoing) Assessment
• intentional • systematic • process • feedback • adjust • on-going • intended instructional outcomes
Corey D. Pierce, Ph.D. 25
Formative (Ongoing) Assessment

Purpose:
• To consider what learning comes next for the student
• To improve learning while there is still time to act – before the graded event
Corey D. Pierce, Ph.D. 26
Formative (Ongoing) Assessment
• Examples:
• Teacher observations
• Teacher questioning & class discussions
• Analysis of student work (graded & non-graded)
• Exit questions
• Teacher feedback
• Student self-assessment
• KWLs
• Student journals
Corey D. Pierce, Ph.D. 27
Formative (Ongoing) Assessment

Strengths:
• Informs day-to-day instruction
• Informs intervention
• Instant information
• Student self-assessment
• Provides information about on-going student progress
• Designed & evaluated by those who know the students best
• Provides a huge volume of qualitative, descriptive data
Corey D. Pierce, Ph.D. 28
Formative (Ongoing) Assessment
Limitations:
• Time
• Informal/not standardized
• Overabundance of information
• May be challenging to ‘grade’
• When used to the exclusion of other types of assessment
Corey D. Pierce, Ph.D. 29
Formative (Ongoing) Assessment
Essential components of effective formative assessment:
• Learning Progressions: clearly articulate the sub-goals of the ultimate learning goal
• Learning Goals and Criteria for Success: clearly identified and communicated to students
• Descriptive Feedback: provided to students with evidence-based feedback that is linked to the intended instructional outcomes and criteria for success.
– CCSSO, 2008
Corey D. Pierce, Ph.D. 30
Formative (Ongoing) Assessment
Research
• Inside the Black Box: Raising Standards Through Classroom Assessment, by Paul Black and Dylan Wiliam (1998)
• New assessment beliefs for a new school mission, by Rick Stiggins (2004)
• Implementing Formative Assessment at the District Level: An Annotated Bibliography (New England Comprehensive Center)
Corey D. Pierce, Ph.D. 31
Formative (Ongoing) Assessment
• Getting started: Academics & Behavior
– Set learning goals and criteria for success
– Select assessment techniques (teacher and students)
– Determine how feedback is provided
– Organize information from formative assessment (teacher and students)
Corey D. Pierce, Ph.D. 32
Formative (Ongoing) Assessment
“Assessment FOR learning turns the classroom assessment process and its results into an instructional intervention designed to increase, not merely monitor, student learning.”
Richard Stiggins
Corey D. Pierce, Ph.D. 33
Benchmarks: Progress Monitoring
• Definitions
• Purposes/Rationale
• Strengths and Limitations
• Common features
• Research
• Resources for getting started: Academics & Behavior
Corey D. Pierce, Ph.D. 34
Progress Monitoring: Definition
• Progress monitoring (PM) is a scientifically-based practice used to assess student performance and evaluate the effectiveness of instruction.
Corey D. Pierce, Ph.D. 35
PM: Purposes/Rationale
• PM has two purposes:
– Determine whether students are progressing appropriately with additional supports and interventions
– Build more effective supports and interventions
• Rationale: Use PM to closely monitor whether what we’re doing is effective!
36
[Progress monitoring graph for Bart: words correct per minute by school week (Sept–Feb), scores 20, 18, 22, 21, 24, 22, 25, 30, 26, 28, 30, 28, 31. Tier 2: Strategic – PALS; Tier 3: Intensive – 1:1 instruction, 5x/week, problem-solving model targeting key decoding and comprehension strategies. Aimline = 1.50 words/week; trendline = 0.95 words/week.]
Gap Analysis
• A critical factor in determining whether a student is making sufficient progress is conducting a Gap Analysis.

Example: Benchmark ÷ Current Level of Performance = Gap
90 wpm ÷ 40 wpm = 2.25
A gap of 2+ is significant and signifies a need for intervention to close the gap between the student and peers.
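The arithmetic is simple enough to script when screening a whole roster. A minimal sketch (plain Python; the 2.0 cutoff comes from the slide above, and the names are illustrative):

def gap_ratio(benchmark_wpm, current_wpm):
    """Gap = expected benchmark performance divided by current performance."""
    return benchmark_wpm / current_wpm

ratio = gap_ratio(90, 40)          # 2.25, as in the example above
significant = ratio >= 2.0         # a gap of 2+ signals a need for intervention
print(f"Gap ratio: {ratio:.2f}, significant: {significant}")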
Corey D. Pierce, Ph.D. 38
PM: Strengths & Limitations
Strengths:
• High frequency
• Sensitive to change
• Guides programming more than screening does

Limitations (how misused):
• May have to make your own PM tools
• Improper tools give invalid, unreliable results
• Used in isolation
Corey D. Pierce, Ph.D. 39
Research on Progress Monitoring
• A substantial research literature
• Supports a wide range of educational decisions
• Beginning in 1977 as Data-Based Program Modification (Deno & Mirkin, CEC)
• “Developments in Curriculum-Based Measurement” by S. L. Deno, 2003, The Journal of Special Education, 37(3), 184–192.
Corey D. Pierce, Ph.D. 40
Resources for PM
Behavioral PM:
• Frequency of difficulties in school
• Self-rating
• Parent/teacher rating

Academic PM:
• National Center on RtI (http://www.rti4success.org/progressMonitoringTools)
• The ABCs of CBM by Hosp et al.
41
Available Screening and Progress Monitoring Tools
• AIMSweb – early literacy/reading, spelling, numeracy/math – Pre–8 – www.edformation.com
• DIBELS – early literacy/reading – Pre–6 – dibels.uoregon.edu
• Measuring Basic Skills Progress – reading, mathematics – 1–6 – www.pro-ed.com
• Progress Monitoring – all areas – Pre–12 – www.studentprogress.org
• Intervention Central – reading software – 1–12 – www.interventioncentral.org
• National Center on RtI – ratings and uses for assessments – K–12 – www.rti4success.org
How to Make a CBM/Progress Monitoring Assessment
1. Examine the curriculum to select the slice of material to be covered
2. Plan the test
3. Sequence the content, behaviors, and conditions and arrange them into a table
4. Examine the table grid to determine which cells will be used for objectives
5. You may weight a column, row, or square
6. Decide on the format you will use, then select or write items for the squares you identified on the table
7. Establish a CAP (Criteria for Acceptable Performance)
Example CBM Development Table

Content | Behaviors | Conditions
Single-digit addition | Watch for sign | Vertical and sideways
Single-digit subtraction | Watch for sign | Vertical and sideways
Double-digit addition (under 10) | Right to left |
Double-digit subtraction (under 10) | Right to left |
Double-digit addition (over 10) | Carrying |
Double-digit subtraction (over 10) | Borrowing |
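Steps 3–6 above amount to building a weighted grid and sampling probe items from it. A minimal sketch of that idea (Python; the rows mirror the example table, and the weighting is illustrative, not prescribed):

import random

# (content, behavior, condition, weight) - one entry per cell of the table above
spec = [
    ("single-digit addition",    "watch for sign", "vertical and sideways", 1),
    ("single-digit subtraction", "watch for sign", "vertical and sideways", 1),
    ("double-digit addition (over 10)",    "carrying",  "", 2),  # weighted cell (step 5)
    ("double-digit subtraction (over 10)", "borrowing", "", 1),
]

def sample_items(spec, n_items=25):
    """Draw probe cells in proportion to their weights (steps 5-6)."""
    cells = [row[:3] for row in spec for _ in range(row[3])]
    return [random.choice(cells) for _ in range(n_items)]

for content, behavior, condition in sample_items(spec, n_items=5):
    print(content, "|", behavior, "|", condition)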
Guidelines for Identifying a Good CBM
1. Assessment and curriculum must be aligned
2. Assessment instrument must be easy to use
3. Assessment must have a clearly defined purpose
4. Assessments should be standardized
5. Assessments should sample clearly defined domains
6. Assessments should sample relevant types of information
7. Collect raw data
8. Collect an adequate sample of student performance
9. Tests should use appropriate scoring rules
10. Assessments should be as complex and interactive as possible
Steps to Initial Implementation of Progress Monitoring
1. Identify classrooms or grades to measure
2. Determine the skill/curriculum area to measure
3. Develop/acquire measures
4. Screen in the fall
5. Rank students by grade/class and develop norms
6. Identify at-risk students as determined by capacity and cut-scores
7. Set year-end goals for each student
8. Monitor targeted students’ progress weekly and graph the results
9. Evaluate progress continuously and systematically (same time, same way each time)
10. Make changes to instructional programming/intervention as needed
• Repeat these steps to create winter and spring benchmarks
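Steps 4–6 reduce to ranking a roster and applying a cut score. A sketch of the fall screen (Python; the 20th-percentile cut echoes the 20% rule later in this deck, and all scores and names here are illustrative):

def flag_at_risk(scores, percentile=20):
    """Rank students on the fall screen and flag those at or below the cut score."""
    ranked = sorted(scores.values())
    cut_index = max(1, round(len(ranked) * percentile / 100))
    cut_score = ranked[cut_index - 1]
    return [name for name, s in scores.items() if s <= cut_score]

fall_orf = {"Lisa": 50, "Rita": 20, "Steven": 20, "Bart": 20, "Ana": 65}
print(flag_at_risk(fall_orf))  # ties at the cut score are all flagged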
Corey D. Pierce, Ph.D. 47
Progress Monitoring Self-Assessment
• Review the PM Self-Assessment materials from DWW
Decision Rules Based on Progress Monitoring Data
• 3 to 4 data points below the goal line: make an intervention change
• 6 consecutive data points above the goal line: the goal is too low; revise it upward
• If neither of these applies, continue doing what you are doing
– NOTE: The CDE recommends having a minimum of 8 data points before making educational decisions.
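These rules are mechanical enough to express as a small helper. A sketch under the thresholds stated above (Python; the function and parameter names are mine):

def pm_decision(scores, goal_line, min_points=8):
    """Apply the decision rules above to paired weekly scores and goal values.

    scores and goal_line are ordered oldest to newest.
    """
    if len(scores) < min_points:       # CDE: at least 8 data points first
        return "wait - not enough data for a decision"
    below = [s < g for s, g in zip(scores, goal_line)]
    above = [s > g for s, g in zip(scores, goal_line)]
    if all(below[-4:]):                # last 3-4 points below the goal line
        return "make an intervention change"
    if all(above[-6:]):                # 6 consecutive points above the goal line
        return "goal is too low - revise it upward"
    return "continue doing what you are doing"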
Ongoing Progress Monitoring

Example:
• For a third-grade student with a learning disability and an IEP math goal, curriculum-based measurement (CBM) is collected each week (e.g., 25 problems sampling the 3rd-grade mathematics concepts and applications curriculum).
• Take a baseline and set a year-end goal. Draw a goal line to represent the performance level expected each week of the school year.
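Drawing the goal line and fitting the trendline is slope arithmetic. A minimal sketch (Python; ordinary least-squares slope, names mine; Steven's Tier 2 scores from the case examples later in this deck reproduce the 0.55 words/week trendline shown there):

def aimline_slope(baseline, year_end_goal, weeks=36):
    """Expected gain per week from baseline to the year-end goal.

    Assumes a 36-week school year by default.
    """
    return (year_end_goal - baseline) / weeks

def trendline_slope(scores, weeks_apart=1):
    """Attained gain per week: least-squares slope over equally spaced probes."""
    n = len(scores)
    x_bar, y_bar = (n - 1) / 2, sum(scores) / n
    num = sum((x - x_bar) * (y - y_bar) for x, y in enumerate(scores))
    den = sum((x - x_bar) ** 2 for x in range(n))
    return (num / den) / weeks_apart

# Probes collected once every 2 weeks at Tier 2:
print(round(trendline_slope([20, 18, 22, 21, 24], weeks_apart=2), 2))  # 0.55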
[CBM graph: Michael T., Applications 3 – weekly points (Sep–May, 0–50 scale) plotted against goal line G. Decision message: “Wait. Not enough scores for decision. You need at least 8 scores to make a decision.”]
Ongoing Progress Monitoring (cont.)
• Special education involves five 30-minute math tutoring sessions per week in dyads, targeting calculation and word-problem skills, the student’s deficit areas. The initial program is a validated tutoring protocol. Weekly progress monitoring continues.
Ongoing Progress Monitoring (cont.)
• After 8 weeks, the student’s progress is evaluated against the goal line. The special educator determines whether a revision to the program is needed to boost the student’s rate of improvement. If so, an instructional feature, based on a well-researched instructional principle, is added to the validated protocol.
[CBM graph: Michael T., Applications 3 – weekly points plotted against goal line G and trend line T. Decision message: “Uh-oh! Make a teaching change. Student’s rate of progress is less than the goal line.”]
Ongoing Progress Monitoring (cont.)
• This iterative process recurs over time, so that the teacher uses the progress-monitoring data to formatively design individualized, appropriate instruction (i.e., scientifically based and intensive special education and related services) that addresses the general education curriculum.
[CBM graph: Amanda M., Applications 3 – weekly points plotted against goal line G. Decision message: “Wait. Not enough scores for decision. You need at least 8 scores to make a decision.”]
Gap Analysis
• A critical factor in determining whether a student is making sufficient progress in Tier I is conducting a Gap Analysis.

Example: Benchmark ÷ Current Level of Performance = Gap
90 wpm ÷ 40 wpm = 2.25
A gap of 2+ is significant and signifies a need for Tier II or Tier III intervention to close the gap between the student and peers.
58
Case Examples
• Thanks to George Batsche, Joe Kovaleski and Ed Shapiro for the case examples
• PA State-wide RtI Initiative
60
Lisa
[Graph: Lisa – words correct per minute by school week (Sept–Feb); score of 50 plotted against the benchmark.]
61
Decision Model at Tier 1 – General Education Instruction
• Step 1: Screening
• ORF = 50 wcpm; fall benchmark for some risk = 44 wcpm
• Comprehension skills are judged by her teacher to be at levels equal to ORF
• Is this student at risk? No – current Gen Ed instruction is working.
• Decision for Lisa: continue Tier 1 instruction. (A “Yes” would mean moving to Tier 2: Strategic Interventions.)
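The Tier 1 screening decision reduces to comparing the student’s oral reading fluency with the seasonal risk benchmark. A minimal sketch in Python (the 44-wcpm fall cut score comes from these slides; the function name and message wording are mine):

def tier1_decision(orf_wcpm, risk_benchmark_wcpm=44):
    """Step 1 screening: compare ORF with the seasonal risk benchmark."""
    if orf_wcpm >= risk_benchmark_wcpm:
        return "not at risk - continue Tier 1 instruction"
    return "at risk - move to Tier 2 strategic interventions"

print(tier1_decision(50))  # Lisa: continue Tier 1
print(tier1_decision(20))  # Rita, Steven, and Bart: move to Tier 2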
62
Rita
• Second-grade student
• Beginning of school year
• Regular education
• Scores at 20 wcpm in second-grade material
• Teacher judges (based on in-class observation/evaluation) comprehension not to be substantially different from ORF
63
Rita
[Graph: Rita – words correct per minute by school week (Sept–Feb); score of 20 plotted against the benchmark.]
64
Decision Model at Tier 1 – General Education Instruction
• Step 1: Screening
• ORF = 20 wcpm; fall benchmark for some risk = 44 wcpm
• Comprehension deficits are noted in 4 of 5 areas
• Current Gen Ed instruction is NOT working
• Is this student at risk? Yes – Rita moves to Tier 2: Strategic Interventions. (A “No” would mean continuing Tier 1 instruction.)
65
Decision Model at Tier 2 – Strategic Interventions & Instruction
• Supplemental, small-group instruction (3–4 students with similar skill levels)
• Standard protocol intervention
• 3x per week, 30 minutes each
• Team selects PALS (Peer Tutoring Strategy)
• Implemented by 2 different available instructional personnel
• Implemented for 8 weeks
• Progress monitoring once every 2 weeks
66
Rita – Tier 2
[Graph: Rita, Tier 2 (Strategic – PALS) – scores of 20, 24, 28, 35, 34 words correct per minute by school week (Sept–Feb); aimline = 1.50 words/week; trendline = 1.85 words/week.]
67
Decision Model at Tier 2 – Strategic Intervention & Instruction
• Step 2: Is the student responsive to intervention?
• ORF = 34 wcpm; winter benchmark (still 8 weeks away) for some risk = 52 wcpm
• Target rate of gain over the Tier 1 assessment is 1.5 words/week
• Actual attained rate of gain was 1.85 words/week
• Gains above benchmark in 4 of 5 comprehension areas
• Student on target to attain benchmark
• Decision for Rita: Yes – continue monitoring or return to Tier 1. (A “No” would mean moving to Tier 3: Intensive Interventions.)
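The responsiveness check at Tiers 2 and 3 compares the attained rate of gain with the aimline. A sketch using numbers straight from these cases (Python; naming is mine):

def is_responsive(attained_gain, aimline_gain=1.5):
    """Responsive if the weekly rate of gain meets or beats the aimline."""
    return attained_gain >= aimline_gain

print(is_responsive(1.85))  # Rita at Tier 2: True -> continue monitoring
print(is_responsive(0.55))  # Steven at Tier 2: False -> move to Tier 3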
68
Steven
• Second-grade student
• Beginning of school year
• Regular education
• Scores at 20 wcpm in second-grade material
• Teacher judges (based on in-class observation/evaluation) comprehension not to be substantially different from ORF
69
Steven
[Graph: Steven – words correct per minute by school week (Sept–Feb); score of 20 plotted against the benchmark.]
70
Decision Model at Tier 1 – General Education Instruction
– Step 1: Screening
• ORF = 20 wcpm; fall benchmark for some risk = 44 wcpm
• Comprehension screen also shows deficits in all 5 areas
• Current Gen Ed instruction is NOT working
• Is this student at risk? Yes – Steven moves to Tier 2: Strategic Interventions. (A “No” would mean continuing Tier 1 instruction.)
71
Decision Model at Tier 2 – Strategic Interventions & Instruction
• Supplemental, small-group instruction in Rita’s group (3–4 students with similar skill levels)
• Standard protocol implementation
• 3x per week, 30 minutes each
• Team selects PALS (Peer Tutoring Strategy)
• Implemented by 2 different available instructional personnel
• Implemented for 8 weeks
• Progress monitoring once every 2 weeks
72
Steven
[Graph: Steven, Tier 2 (Strategic – PALS) – scores of 20, 18, 22, 21, 24 words correct per minute by school week (Sept–Feb); aimline = 1.50 words/week; trendline = 0.55 words/week.]
73
Decision Model at Tier 2 – Strategic Intervention & Instruction
– Step 2: Is the student responsive to intervention?
• ORF = 24 wcpm; winter benchmark (still 8 weeks away) for some risk = 52 wcpm
• Target rate of gain over the Tier 1 assessment is 1.5 words/week
• Actual attained rate of gain was 0.55 words/week
• Below comprehension benchmarks in 4 of 5 areas
• Student NOT on target to attain benchmark
• Decision for Steven: No – move to Tier 3: Intensive Interventions. (A “Yes” would mean continuing monitoring or returning to Tier 1.)
74
Decision Model at Tier 3 – Intensive Interventions & Instruction
• Supplemental, 1:3, pull-out instruction
• Individualized problem-solving, targeted instruction
• Specific decoding and analysis strategies
• Emphasis on comprehension strategies
• 5x per week, 30 minutes each
• Implemented by 2 different available instructional personnel
• Implemented for 8 weeks
• Progress monitoring once every week
75
Steven
[Graph: Steven, Tier 2 (Strategic – PALS) then Tier 3 (Intensive – 1:1 instruction, 5x/week, problem-solving model targeting key decoding and comprehension strategies) – scores of 20, 18, 22, 21, 24, 28, 31, 36, 35, 42, 44, 40, 45 words correct per minute by school week (Sept–Feb); aimline = 1.50 words/week; Tier 3 trendline = 2.32 words/week.]
76
Decision Model at Tier 3 – Intensive Intervention & Instruction
– Step 3: Is the student responsive to intervention at Tier 3?
• ORF = 45 wcpm; winter benchmark (still 4 weeks away) for some risk = 52 wcpm
• Target rate of gain over the Tier 2 assessment is 1.5 words/week
• Actual attained rate of gain was 2.32 words/week
• At or above comprehension benchmarks in 4 of 5 areas
• Student on target to attain benchmark
• Decision for Steven: Yes – move the student back to Strategic intervention; continue monitoring or return to Tier 2. (A “No” would mean moving to SpEd eligibility determination.)
77
Bart
• Second-grade student
• Beginning of school year
• Regular education
• Scores at 20 wcpm in second-grade material
• Teacher judges (based on in-class observation/evaluation) comprehension not to be substantially different from ORF
78
Bart
[Graph: Bart, Tier 2 (Strategic – PALS) then Tier 3 (Intensive – 1:1 instruction, 5x/week, problem-solving model targeting key decoding and comprehension strategies) – scores of 20, 18, 22, 21, 24, 22, 25, 30, 26, 28, 30, 28, 31 words correct per minute by school week (Sept–Feb); aimline = 1.50 words/week; trendline = 0.95 words/week.]
79
Decision Model at Tier 3 – Intensive Intervention & Instruction
– Step 3: Is the student responsive to intervention at Tier 3?
• ORF = 31 wcpm; winter benchmark (still 4 weeks away) for some risk = 52 wcpm
• Target rate of gain over the Tier 2 assessment is 1.5 words/week
• Actual attained rate of gain was 0.95 words/week
• Below comprehension benchmarks in all areas
• Student NOT on target to attain benchmark
• Decision for Bart: No – move to SpEd eligibility determination. (A “Yes” would mean continuing monitoring or returning to Tier 2.)
Corey D. Pierce, Ph.D. 80
Diagnostic Assessments
• When to use them
– When you need more in-depth information in a specific area of need
– When further triangulation of data is needed
– To guide instructional decision making
• How to choose the best one
– Match the designed purpose of the test with the information you need
• Which are available through CBOCES?
Corey D. Pierce, Ph.D. 81
How Does it Fit Together?
[Diagram: a four-step assessment flow (courtesy of Dave Tilley, Heartland AEA).
Step 1: Universal screening of all students at a grade level (fall, winter, spring). Students with no additional needs continue with core instruction guided by formative assessment (core instruction serves 80–90% of students).
Step 2: Group diagnostic assessment leads to small-group instruction differentiated by skill, with monitoring 2 times/month (supplemental services, 5–10% of students).
Step 3: Individual diagnostic assessment leads to individualized, intensive instruction, with weekly monitoring (intensive support, 1–5% of students).
Step 4: Additional diagnostic assessment informs instruction, with results monitoring.
Summative measures sit alongside the cycle: grades, discipline, AYP measures.]
Corey D. Pierce, Ph.D. 82
Questions in Test Selection
• How does this map to our data system?
• Does this test have adequate technical properties for our intended use?
– Reliability & validity
– Frequency
– Scale
– Alignment with our curriculum
• How will we use the collected data? Don’t use a test outside of how it was intended to be used.
• If using it for SLD, federal regulations (34 CFR § 300.304) require use “for the purposes for which the assessments or measures are valid and reliable”
Corey D. Pierce, Ph.D. 83
Achieving Balance
Thinking about your assessment system…
• Is your system balanced?
– If not, are you okay with imbalance?
• How do the assessments support and inform one another?
• Do all users know the purpose, strengths and limitations of the assessments?
• What do you do with the results of the different assessments?
• Does everyone play an important role?• How to deal with disagreement? Collaborate!
EBIS Early Identification Process (Tigard-Tualatin School District, Tigard, Oregon)

All students receive quality behavior and academic instruction and support, and all students are screened for additional instructional needs (fall, winter & spring: DIBELS, DORF, TESA, ODRs, etc.).

Decision Rules:
• 80% Decision Rule: If fewer than 80% of students are meeting benchmarks, review core program(s).
• 20% Decision Rule: Students below the 20th percentile in academic skills and/or with chronic behavior needs* are placed in small group instruction. Small group interventions are designed by teacher teams with EBIS support.
• Change Small Group or Individual Interventions Rule: When progress data is below the aimline on three (3) consecutive days, or when six (6) data points produce a flat or decreasing trendline, interventions are further individualized.
• Individualize Instruction Rule: When students fail to progress after two (2) consecutive small group interventions.
• Refer for Special Education Evaluation Rule: When students fail to progress after two (2) consecutive individually-designed interventions.

Teamwork Timelines: EBIS teams meet fall, winter & spring to review data and make decisions about school-wide progress. EBIS teams/grade-level teacher teams meet monthly to review data, plan, and adjust interventions.

*More than 5 absences or more than 3 counseling or discipline referrals in 30 days.
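The 80% and 20% rules above are simple proportions, which makes them easy to check against a data system. A hedged sketch (Python; the data shapes and names are mine, not EBIS artifacts):

def review_core_program(met_benchmark):
    """80% rule: if fewer than 80% of students meet benchmarks, review core."""
    return sum(met_benchmark) / len(met_benchmark) < 0.80

def small_group_candidates(percentile_ranks, cut=20):
    """20% rule: students below the 20th percentile go to small-group instruction."""
    return [name for name, pct in percentile_ranks.items() if pct < cut]

print(review_core_program([True, True, False, True]))    # 75% met -> True (review core)
print(small_group_candidates({"A.B.": 12, "C.D.": 55}))  # ['A.B.']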
Meetings
• Use each professional’s expertise to develop a “Super Plan” of intense service that will benefit the student.
• School-wide success occurs when students receive purposeful and planned instruction focused on one common goal from all professionals.
Corey D. Pierce, Ph.D. 86
Data Meetings
• Review Data Analysis Meeting forms from DWW. (Page 11)
SLD: Evaluation & Body of Evidence
• More focused than when we conducted a “comprehensive evaluation” (e.g., the same battery of assessments for all referred students)
• Student is assessed in all areas related to the suspected disability
• Must be “sufficiently comprehensive” to identify all of the child’s special education needs (whether or not commonly linked to the primary disability)
Full and Individual Evaluation…
Key Question: What is a “Full and Individual” Evaluation that includes the required “Body of Evidence?”
In conducting an evaluation, the public agency must—
• Use a variety of assessment tools and strategies to gather relevant functional, developmental, and academic information…
• Not use any single measure or assessment as the sole criterion for determining whether a child is a child with a disability and for determining an appropriate educational program for the child…
– 34 CFR § 300.304(b)
Evaluation

Building a Body of Evidence: A “variety of assessment tools and strategies” (during RtI & evaluation) may include… (required at some point for SLD determination)
• Interviews (classroom, family)
• Observations in relevant settings (classroom)
• Analysis of work samples
• Performance of informal tasks
• Curriculum Based Measurements and/or
• Other progress monitoring strategies

Building a Body of Evidence (cont.): may also include…
• Results from state and district assessments
• Functional Behavior Assessment
• Behavior Rating Scales
• Vocational assessments
• Norm-referenced assessment (focused, not a full battery for every child referred)
Body of Evidence (Converging Data)

SLD identification should be based on a convergence of data gathered throughout the RtI/problem-solving process, as well as any further assessment data gathered as part of the evaluation for special education.

Specifically, Colorado Rules require “a body of evidence demonstrating: academic skill deficit(s); and insufficient progress in response to scientific, research-based intervention” to determine that these two key criteria for SLD have been met.