
QUICK START

User’s Guide

2014

PART OF THE ASSESSMENT LITERACY SERIES

THE RIA GROUP | 16407 Highland Club Avenue, Baton Rouge, LA 70817


TABLE OF CONTENTS

Introduction
Purpose
Homeroom
Phase I: Designing the Assessment
1.1 Goal Statement
1.2 Objectives
1.3 Guiding Questions
1.4 Resources
1.5 Procedural Steps
STEP 1: Create a Purpose Statement
STEP 2: Select Targeted Content Standards
STEP 3: Develop a Test Blueprint
1.6 Quality Reviews
Phase II: Building the Assessment
2.1 Goal Statement
2.2 Objectives
2.3 Guiding Questions
2.4 Resources
2.5 Procedural Steps
STEP 4: Item Stems/Task Prompts
STEP 5: Scoring Keys/Scoring Rubrics
STEP 6: Test Forms
2.6 Quality Reviews
Phase III: Reviewing the Assessment
3.1 Goal Statement
3.2 Objectives
3.3 Guiding Questions
3.4 Resources
3.5 Procedural Steps
STEP 7: Item/Task Reviews
STEP 8: Alignment and Performance Level Reviews
STEP 9: Data Reviews
STEP 10: Refinements
3.6 Quality Reviews


Quick Start© User’s Guide

Introduction

The purpose of this document is to provide guidance for developing measures of student performance that will meet the criteria within the Performance Measure Rubric. The rubric is a self-assessment tool used to ascertain the technical quality of locally developed performance measures. The process used to "design," "build," and "review" teacher-made performance measures is contained within the Quick Start program. Quick Start delivers a foundational understanding of the procedures necessary to create these performance measures, which teachers may then use to assess their students' skills, knowledge, and concept mastery of targeted content standards.

Figure 1. Process Components

Design: Purpose Statement; Targeted Content Standards; Test Blueprint
Build: Items/Tasks; Scoring Keys & Scoring Rubrics; Test Forms
Review: Item/Task Reviews; Alignment Reviews; Data Reviews; Refinements


Purpose

This document guides educators in the development of performance measures in three phases: Design, Build, and Review. Each phase includes customized training and educator-friendly tools to ensure that the performance measures meet the criteria within the Performance Measure Rubric. This rubric, which helps determine the technical quality of performance measures, follows a structure similar to the training process used in developing student learning objectives (i.e., Design, Build, and Review). Educators have the flexibility to begin anywhere in the process, from Orientation through to the Review phase alone, based upon their needs and experience in assessment development.

Homeroom

Homeroom is the learning platform that brings this effective training right to your fingertips. To access the training and documents necessary for creating high-quality performance measures, visit www.pdehr.riagroup2013.com. The user may access this training from any device, whether a tablet, phone, or PC. When accessing Homeroom for the first time, the user will need to register through the Homeroom login screen. In the event of a lost password or username, or for other questions, the user may contact the Help Desk by email at [email protected] or by calling toll-free at 1.855.787.9446 (see Figure 2 below).

Figure 2. Homeroom Login Screen

The home page offers the user the Quick Start icon option as shown below. The first option, "I am a Teacher," is oriented to teachers completing the SLO Process. The second option, "I am a School Leader," is designed for principals, superintendents, and other administrators. The Quick Start icon expands as shown in Figure 3 below to offer the user options.


Figure 3. User Options

Each phase of the Quick Start Process (Design, Build, and Review) contains the components listed below. The TRAINING > VIEW THE TRAINING component provides PowerPoints and videos instructing the user in assessment creation. The TEMPLATES > CREATE YOUR OWN component provides templates for the user to download and use in developing effective student learning objectives. The RESOURCES > HELPFUL MATERIALS component provides guides and other resources to enhance the Quick Start Process experience (see Figure 4 below).

Figure 4. Quick Start Process Components


Phase I: Designing the Assessment

1.1 Goal Statement

Understand and apply the techniques used to design measures of student performance.

1.2 Objectives

The professional will successfully:

o Create a purpose statement for a specific performance measure.
o Identify content standard(s) that represent the Enduring Understanding/Key Concept within the content area.
o Develop a test blueprint outlining the performance measure's structure.

1.3 Guiding Questions

What is the performance measure intended to measure and at what grade?

What are the developmental characteristics of test-takers?

Which areas will be targeted among the various content standards?

How will educators use the results (overall score and “growth” inferences)?

When will the performance measure be administered?

Do the items/tasks capture the content standards within the key concept?

Is the number of items/tasks sufficient so that students at varying levels can demonstrate their knowledge?

What are the time demands for both teachers and students?

How does the design reflect the areas of emphasis in the standards?

1.4 Resources

Training: M1-Designing the Assessment
Templates: Template #1-Designing the Assessment
Resources: HO #1-Designing the Assessment-Examples; Cognitive Demand Crosswalk


1.5 Procedural Steps

STEP 1: Create a Purpose Statement

Step 1. Individually create a statement about the performance measure in terms of the content standards it will purport to measure.
Step 2. Build consensus by focusing on three components of the statement: What, How, Why.
Step 3. Draft three sentences, one reflecting the group's consensus on each component, and review them.
Step 4. Merge the sentences to create a single-paragraph "statement." Again, review to ensure that the statement reflects the group's intent.
Step 5. Finalize the statement and double-check for editorial soundness.

STEP 2: Select Targeted Content Standards

Step 1. Place the course/subject's name and Enduring Understanding/Key Concept statement above the Targeted Content Standards table.
Step 2. Place the code for each standard/content strand in the Content ID column, along with the description for each content standard in the Content Statement column.
Step 3. Work collaboratively with a subject matter expert to identify an initial (i.e., draft) set of content standards associated with the Enduring Understanding/Key Concept.
Step 4. Review the list of targeted content standards, look for gaps and/or redundancies, and then finalize the list by placing an "X" in the Final column.
Step 5. Verify that the "final" targeted content standards will be those used to develop the test blueprint.

Page 8: Quick Start Users Guide-June 2014-Working Draft

Quick Start©

User’s Guide-June 2014-Working Draft 7

STEP 3: Develop a Test Blueprint

Step 1. Review the targeted content standards identified in STEP 2.
Step 2. Insert the selected Enduring Understanding/Key Concept and targeted content standards (numeric code only) into the test blueprint table.
Step 3. Determine the number of items/tasks across the four cognitive levels.
Step 4. Tally the rows and place the values in the Total column. Tally each cognitive-level column and place the resultant values in the Grand Totals row.
Step 5. Report the total number of items/tasks and the total possible points available (see the sketch below).
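
To make the tallying in Steps 3-5 concrete, the following minimal Python sketch computes the Total column, the Grand Totals row, and the overall item count from a blueprint table. The standard codes, counts, and cognitive-level names are hypothetical placeholders, not values from this guide.

    # A hypothetical test blueprint: items/tasks per targeted content standard
    # (rows) across four cognitive levels (columns).
    blueprint = {
        "CC.2.1.3.B.1": [3, 2, 1, 0],
        "CC.2.2.3.A.4": [2, 3, 2, 1],
        "CC.2.4.3.A.2": [1, 2, 2, 0],
    }
    levels = ["Level 1", "Level 2", "Level 3", "Level 4"]  # placeholder names

    # Step 4: tally each row (Total column) and each cognitive-level column
    # (Grand Totals row).
    row_totals = {code: sum(counts) for code, counts in blueprint.items()}
    grand_totals = [sum(col) for col in zip(*blueprint.values())]

    # Step 5: report the totals.
    print(row_totals)
    print(dict(zip(levels, grand_totals)))
    print("Total items/tasks:", sum(grand_totals))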

1.6 Quality Reviews

The Performance Measure Rubric is designed to help the educator review items/tasks, scoring rubrics, and assessment forms to create high-quality performance measures. Strand 1 of the Performance Measure Rubric evaluates the Design phase of the assessment process (purpose statement, targeted content standards, and test blueprint). Refer to Handout #3-Performance Measure Rubric-Scored Example for more information.

Task ID / Descriptor (Rating and Evidence columns to be completed during the review)

1.1 The purpose of the performance measure is explicitly stated (who, what, why).
1.2 The performance measure has targeted content standards representing a range of knowledge and skills students are expected to know and demonstrate.
1.3 The performance measure's design is appropriate for the intended audience and reflects challenging material needed to develop higher-order thinking skills.
1.4 Specification tables articulate the number of items/tasks, item/task types, passage readability, and other information about the performance measure -OR- blueprints are used to align items/tasks to targeted content standards.
1.5 Items/tasks are rigorous (designed to measure a range of cognitive demands/higher-order thinking skills at developmentally appropriate levels) and of sufficient quantities to measure the depth and breadth of the targeted content standards.

Strand 1 Summary: __ out of 5


Phase II: Building the Assessment

2.1 Goal Statement

Understand and apply the techniques used to build measures of student performance.

2.2 Objectives

The professional will successfully:

o Create the necessary items/tasks to address the test blueprint.

o Develop scoring keys and/or scoring rubrics.

o Organize items/tasks and administration guidelines into a test form.

2.3 Guiding Questions

Are the items aligned with targeted content standards?

Do the selected items/tasks allow students to demonstrate content knowledge by:

o Responding to questions and/or prompts?

o Performing tasks, actions, and/or demonstrations?

Do the items/tasks measure content knowledge, skill, or process, and not an external or environmental factor (e.g., guessing)?

Is the number of items/tasks sufficient to sample the targeted content?

Are the items/tasks developmentally appropriate for the intended test-takers?

Are the correct answers and/or expected responses clearly identified?

Do the performance measure’s directions specify:

o What the test-taker should do, read, or analyze?

o Where and how the test-taker should respond or demonstrate the task?

o How many points a correct/complete response is worth towards the overall score?

Are there directions for different item/task types?

Page 10: Quick Start Users Guide-June 2014-Working Draft

Quick Start©

User’s Guide-June 2014-Working Draft 9

2.4 Resources

Training: M2-Building the Assessment
Templates: Template #2-Building the Assessment
Resources: Performance Task Framework; HO #2-Building the Assessment; Model #1-Art Grade 5-DEMO; Model #2-Grade-Pre-Algebra-DEMO; Model #3-Nutrition Culinary, Level III-DEMO; Cognitive Demand Crosswalk

2.5 Procedural Steps

2.5.1 Item Stems/Task Prompts

Multiple Choice (MC) Items

Step 1. Review the targeted content standard.

Step 2. Determine which aspects of the standard can be measured objectively.

Step 3. Select the focused aspect and determine the cognitive demand reflected in the standard's description.

Step 4. Create a question (stem), one correct answer, and plausible (realistic) distractors.

Step 5. Review the item and answer options for grammatical soundness.

Short Answer (SA) Items

Step 1. Review the targeted content standard(s).
Step 2. Determine which aspects of the standard(s) can be best measured by having students "construct" a short response.
Step 3. Select and list aspects of the targeted content standard(s) to be measured.
Step 4. Create a prompt, select a passage, or develop a scenario for students.
Step 5. Develop a clear statement that articulates specific criteria for the test-taker to provide.


Extended Answer (EA) Tasks

Step 1. Review the targeted content standard(s).
Step 2. Determine which aspects of the standard(s) can be best measured by having students "construct" an extended response to a given prompt, scenario, or passage.
Step 3. Select and list all aspects of the targeted content standard(s) to be measured.
Step 4. Create a prompt, select a passage, or develop a scenario for students.
Step 5. Develop a clear statement for each subordinate task that articulates specific criteria for the test-taker to provide.

Extended Performance (EP) Tasks

Step 1. Review the targeted content standard(s).
Step 2. Determine which aspects of the standard(s) can be best measured by having students develop a complex response, demonstration, or performance over an extended period of time (e.g., two weeks).
Step 3. Select and list all aspects of the targeted content standard(s) to be measured.
Step 4. Create a project, portfolio, or demonstration expectation statement that includes subordinate tasks, which are aligned to the test blueprint.
Step 5. Develop a clear statement for each subordinate task that articulates specific criteria for the test-taker to provide.

2.5.2 Scoring Keys/Scoring Rubrics

MC Items Scoring Key

Step 1. Enter the assessment information at the top of the Scoring Key.
Step 2. Record the item number, item tag (optional), item type, and point value.
Step 3. Record the MC answers in the Answer column.
Step 4. Repeat Steps 2-3 until all items on the test blueprint are reflected within the Scoring Key.
Step 5. Validate that each question-to-answer relationship is recorded correctly (see the sketch below).
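
As an illustration of Steps 2-5, the scoring key can be pictured as a small table keyed by item number. The Python sketch below uses hypothetical items and answers; the validation loop is one way to carry out Step 5's check, and the scoring helper shows how the key produces a raw score.

    # A hypothetical MC scoring key: item number -> (item type, point value, answer).
    # Item tags are optional and omitted here.
    scoring_key = {
        1: ("MC", 1, "B"),
        2: ("MC", 1, "D"),
        3: ("MC", 1, "A"),
    }

    # Step 5: validate each question-to-answer relationship.
    for item, (item_type, points, answer) in scoring_key.items():
        assert points > 0, f"Item {item}: missing point value"
        assert answer in {"A", "B", "C", "D"}, f"Item {item}: invalid key {answer!r}"

    def raw_score(responses):
        """Total raw MC score for one student's responses (item -> answer)."""
        return sum(points
                   for item, (_, points, answer) in scoring_key.items()
                   if responses.get(item) == answer)

    print(raw_score({1: "B", 2: "C", 3: "A"}))  # -> 2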


Short Answer/Extended Answer/Extended Performance Scoring Rubrics

Step 1. Review the SA, EA, or EP task and the criteria articulated in the stem/directions.
Step 2. Select a "generic" rubric structure (see Template #2: Building the Assessment) based upon the scoring criteria and the number of dimensions being measured.
Step 3. Modify the rubric language using the specific criteria expected in a response to award the maximum number of points.
Step 4. Determine how much the response can deviate from "fully correct" in order to earn the next (lower) point value. [Continue until the full range of possible scores is described; see the sketch below.]
Step 5. During the review, ensure the response expectation, scoring rubric, and test blueprint are fully aligned.
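
One way to picture Steps 2-4: a rubric is a descending ladder of score points, each described by criteria stating how far a response may deviate from "fully correct." The sketch below is a hypothetical single-dimension, 4-point rubric in Python; the criteria text is placeholder language that Step 3 would replace with content-specific expectations.

    # A hypothetical single-dimension, 4-point scoring rubric.
    rubric = {
        4: "Fully correct: all required criteria present and supported with evidence.",
        3: "Mostly correct: one required criterion missing or weakly supported.",
        2: "Partially correct: several criteria missing; reasoning incomplete.",
        1: "Minimal: attempts the task but meets almost none of the criteria.",
        0: "No response, or the response is entirely off-topic.",
    }

    # Step 4's stopping condition: every score point from the maximum down
    # to zero has a description.
    assert sorted(rubric) == list(range(max(rubric) + 1))
    for points in sorted(rubric, reverse=True):
        print(points, "-", rubric[points])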

Procedural Steps: Administration Guidelines

Step 1. Create a series of administrative steps for before, during, and after the assessment window.
Step 2. Explain any requirements or equipment necessary, including accommodations. State any ancillary materials (e.g., calculators) needed or allowed by the test-takers.
Step 3. Identify the approximate time afforded to complete the assessment, including each subtask in an EP task.
Step 4. Include detailed "scripts" articulating exactly what is to be communicated to students, especially when administering performance tasks over a long period of time.
Step 5. Include procedures for scoring, administering make-ups, and handling completed assessments.

Procedural Steps: Test Forms

Step 1. Develop a cover page stating the test form developed and include any necessary demographic information (e.g., section number, student name, date administered, etc.).
Step 2. Organize the items/tasks/prompts in a sequence that will maximize student engagement.


Step 3. Add item tags and section dividers (optional).

Step 4. Refine the draft form to minimize "blank space"; verify picture, graph, table, and figure placement in relationship to the item/task; and ensure MC answer options do not drift from one page to the next.
Step 5. Add the scoring rubric or criteria for constructed-response tasks.

2.6 Quality Reviews

The Performance Measure Rubric is designed to help the educator review items/tasks, scoring rubrics, and assessment forms to create high-quality performance measures. Strand 2 of the Performance Measure Rubric evaluates the Build phase of the assessment process (refer to Handout #3-Performance Measure Rubric-Scored Example for more information).

Task ID / Descriptor (Rating and Evidence columns to be completed during the review)

2.1 Items/tasks and score keys are developed using standardized procedures, including scoring rubrics for human-scored, open-ended questions (e.g., short constructed response, writing prompts, performance tasks, etc.).
2.2 Items/tasks are created and reviewed in terms of: (a) alignment to the targeted content standards, (b) content accuracy, (c) developmental appropriateness, (d) cognitive demand, and (e) bias, sensitivity, and fairness.
2.3 Administration guidelines are developed that contain the step-by-step procedures used to administer the performance measure in a consistent manner, including scripts to orally communicate directions to students, day and time constraints, and allowable accommodations/adaptations.
2.4 Scoring guidelines are developed for human-scored items/tasks to promote score consistency across items/tasks and among different scorers. These guidelines articulate point values for each item/task used to combine results into an overall score.
2.5 Summary scores are reported using both raw score points and a performance level. Performance levels reflect the range of scores possible on the assessment and use terms or symbols to denote performance levels (see the sketch following this table).
2.6 The total time to administer the performance measure is developmentally appropriate for the test-taker. Generally, this is 30 minutes or less for young students and up to 60 minutes per session for older students (high school).

Strand 2 Summary: __ out of 6
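
Task 2.5 asks that summary scores be reported as both raw points and a performance level. The minimal Python sketch below shows one way to express that mapping; the cut scores and level names are hypothetical, not values prescribed by the rubric.

    # Hypothetical cut scores on a 30-point measure, highest first.
    cut_scores = [(24, "Advanced"), (18, "Proficient"), (12, "Basic")]

    def performance_level(raw_score):
        """Map a raw score to a performance level label."""
        for cut, level in cut_scores:
            if raw_score >= cut:
                return level
        return "Below Basic"

    for raw in (27, 19, 14, 6):
        print(f"{raw}/30 -> {performance_level(raw)}")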


Phase III: Reviewing the Assessment

3.1 Goal Statement

Understand and apply the techniques used to review and refine measures of student performance.

3.2 Objectives

The professional will successfully:

o Review developed items/tasks for validity threats and for content and cognitive match;
o Examine test alignment to ensure: (a) all items/tasks match the skills, knowledge, and concepts in the targeted content standards; (b) all rubrics match the targeted content standards; and (c) all items/tasks reflect higher-order thinking.

Note: A supplement to this document will address alignment (Step 8), data reviews (Step 9), and refinement (Step 10), which were addressed in the Orientation presentation.

3.3 Guiding Questions

Does each item/task clearly address the standard?

Are the reading difficulty and vocabulary appropriate?

Is the language clear, consistent, and understandable?

Are charts, tables, graphs, and diagrams clear and understandable?

Is there only one (1) correct answer?

Have the items been reviewed for bias and sensitivity?

o Items provide an equal opportunity for all students to demonstrate their knowledge and skills. The stimulus material (e.g., reading passage, artwork, or diagram) does not raise bias and/or sensitivity concerns that would interfere with the performance of a particular group of students.

Are the items developmentally appropriate for test-takers?

Does the blueprint reflect the test form?

Does the scoring rubric provide detailed scoring information?

Does the assessment have at least two (2) performance levels?


3.4 Resources

Training: M3-Reviewing the Assessment
Templates: Template #3-Performance Measure Rubric
Resources: HO #3-Reviewing the Assessment-Scored Example

3.5 Procedural Steps

STEP 7: Item/Task Reviews

Step 1. Identify at least one other teacher to assist in the review (best accomplished by department or grade-level committees).
Step 2. Organize the test form, answer key and/or scoring rubrics, and Handout #3-Reviewing the Assessment-Scored Example.
Step 3. Read each item/task and highlight any "potential" issues in terms of content accuracy, potential bias, sensitive materials, fairness, and developmental appropriateness.
Step 4. After reviewing the entire test form, including scoring rubrics, revisit the highlighted items/tasks. Determine whether each item/task can be rewritten or must be replaced.
Step 5. Print the revised assessment documents and conduct an editorial review, ensuring readability, sentence/passage complexity, and word selection are grammatically sound. Take corrective actions prior to finalizing the documents.

STEP 8: Alignment and Performance Level Reviews

Step 1. Identify at least one other teacher to assist in the alignment review (best accomplished by department or grade-level committees).
Step 2. Organize the items/tasks, test blueprint, and targeted content standards.


Step 3. Read each item/task to verify that it matches the standards in both content and cognitive demand. For SA, EA, and EP tasks, ensure that the scoring rubrics are focused on specific content-based expectations. Refine any identified issues.
Step 4. After reviewing all items/tasks, including scoring rubrics, count the number of item/task points assigned to each targeted content standard. Determine the percentage of item/task points per targeted content standard based upon the total available. Identify any shortfalls in which too few points are assigned to a standard listed in the test blueprint. Refine if the patterns do not reflect those in the standards.
Step 5. Using the item/task distributions, determine whether the assessment has at least five (5) points for each targeted content standard and whether points are attributed only to developmentally appropriate items/tasks. Refine if point sufficiency does not reflect the content standards (see the sketch below).
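
Steps 4-5 reduce to simple arithmetic over the scoring key. The Python sketch below (with hypothetical point assignments) computes each standard's share of the total points and flags any standard falling under the five-point threshold named in Step 5.

    # Hypothetical item/task point values, grouped by targeted content standard.
    points_by_standard = {
        "CC.2.1.3.B.1": [1, 1, 2, 4],
        "CC.2.2.3.A.4": [1, 1, 1],
        "CC.2.4.3.A.2": [1, 2],
    }

    total = sum(sum(pts) for pts in points_by_standard.values())
    for standard, pts in points_by_standard.items():
        subtotal = sum(pts)
        share = 100 * subtotal / total
        flag = "  <-- fewer than 5 points; refine" if subtotal < 5 else ""
        print(f"{standard}: {subtotal} pts ({share:.0f}%){flag}")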

STEP 9: Data Reviews

Step 1. Conduct the data review after test-takers have engaged in the assessment procedures.
Step 2. Focus on data about the items/tasks, performance levels, score distribution, administration guidelines, etc.
Step 3. Evaluate technical quality by examining aspects such as rater reliability, internal consistency, intra-domain correlations, decision consistency, and measurement error (see the sketch below).
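
Of the statistics named in Step 3, internal consistency is the most commonly reported. The sketch below computes Cronbach's alpha, a standard internal-consistency coefficient, over a small hypothetical students-by-items score matrix; an actual data review would use the full set of scored responses.

    # Hypothetical score matrix: one row per student, one column per item/task.
    scores = [
        [1, 1, 0, 2, 1],
        [1, 0, 0, 1, 0],
        [1, 1, 1, 2, 2],
        [0, 0, 0, 1, 0],
    ]

    def variance(xs):
        """Sample variance."""
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(scores[0])                              # number of items/tasks
    item_vars = [variance(col) for col in zip(*scores)]
    total_var = variance([sum(row) for row in scores])

    # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance).
    alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
    print(f"Cronbach's alpha = {alpha:.2f}")  # ~0.91 for this toy data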

STEP 10: Refinements

Step 1. Complete refinements prior to the beginning of the next assessment cycle.
Step 2. Analyze results from the prior assessment to identify areas of improvement.


Step 3. Consider item/task replacement or augmentation to address areas of concern.

Step 4. Strive to include at least 20% new items/tasks, or implement an item/task tryout approach.
Step 5. Create two parallel forms (i.e., Forms A and B) for test security purposes.

3.6 Quality Reviews

The Performance Measure Rubric is designed to help the educator review items/tasks, scoring rubrics, and assessment forms to create high-quality performance measures. Strand 3 of the Performance Measure Rubric evaluates the Review phase of the assessment process (refer to Handout #3-Performance Measure Rubric-Scored Example for more information).

Task ID / Descriptor (Rating and Evidence columns to be completed during the review)

3.1 The performance measures are reviewed in terms of design fidelity:
o Items/tasks are distributed based upon the design properties found within the specification or blueprint documents.
o Item/task and form statistics are used to examine levels of difficulty, complexity, distractor quality, and other properties.
o Items/tasks and forms are rigorous and free of biased, sensitive, or unfair characteristics.

3.2 The performance measures are reviewed in terms of editorial soundness, while ensuring consistency and accuracy of other documents (e.g., administration guidelines):
o Identifies words, text, reading passages, and/or graphics that require copyright permission or acknowledgements.
o Applies Universal Design principles.
o Ensures linguistic demands and/or readability are developmentally appropriate.

3.3 The performance measures are reviewed in terms of alignment characteristics:
o Pattern consistency (within specifications and/or blueprints).
o Matching the targeted content standards.
o Cognitive demand.
o Developmental appropriateness.

3.4 Cut scores are established for each performance level. Performance level descriptors describe the achievement continuum using content-based competencies for each assessed content area.

3.5 As part of the assessment cycle, post-administration analyses are conducted to examine aspects such as item/task performance, scale functioning, overall score distribution, rater drift, content alignment, etc.

3.6 The performance measure has score validity evidence demonstrating that item responses are consistent with the content specifications. Data suggest the scores represent the intended construct by using an adequate sample of items/tasks within the targeted content standards. Other sources of validity evidence, such as the interrelationship of items/tasks and the alignment characteristics of the performance measure, are collected.

3.7 Reliability coefficients are reported for the performance measure, which includes estimating internal consistency. Standard errors are reported for summary scores. When applicable, other reliability statistics such as classification accuracy and rater reliabilities are calculated and reviewed.

Strand 3 Summary: __ out of 7

Note: A supplement to this document will address Tasks 3.5, 3.6, and 3.7 of the Performance Measure Rubric.