
Professional Development Manual on NRS Data Monitoring for Program Improvement

By: Mary Ann Corley

Principal Research Analyst

AMERICAN INSTITUTES FOR RESEARCH
1000 Thomas Jefferson Street, N.W.

Washington, DC 20007

This manual was prepared for the project:

Promoting the Quality and Use of National Reporting System (NRS) Data

Contract #ED-01-CO-0026

U.S. DEPARTMENT OF EDUCATION

Office of Vocational and Adult Education
Division of Adult Education and Literacy

Susan Sclafani, Assistant Secretary for Vocational and Adult Education

Cheryl Keenan, Director
Division of Adult Education and Literacy

Mike Dean, Program Specialist
Division of Adult Education and Literacy

April 2004

Contents

Introduction
    Audience
    Purpose

Workshop Overview
    Preparation Checklist

Workshop Outline

Before the Workshop

Facilitator’s Notes
    Facilitator’s Notes: Day 1
    Facilitator’s Notes: Day 2

Participant’s Handouts
    Workshop Objectives (H-1)
    Workshop Agenda (H-2)
    Why Get Engaged with Data? (H-3)
    Your Own Personal Motivators (H-4)
    Questions for Consideration (H-5)
    Decision for State Teams: Selecting a Standard-Setting Model (H-6)
    Adjusting Local Standards: Sample Scenarios (H-7)
    Reflection on Success of Past Efforts (H-8)
    Variations on a Theme (H-9)
    State Worksheet: Planning for Rewards and Sanctions (H-10)
    Data Carousel (H-11)
    Monitoring Performance Using Indicators of Program Quality (H-12)
    Steps and Guidelines for Monitoring Local Programs (H-13)
    Planning and Implementing Program Improvement (H-14)
    Aha! Experiences and Parking Lot Issues (H-15)
    Workshop Evaluation Form (H-16)

PowerPoint Slides

Supplement
    Possible Questions to Ask When Examining the Data (Answers to H-5) (S-1)
    Glossary (S-2)
    Letter to Send to Participants Prior to Training (S-3)
    Alternative Monitoring Exercise (S-4)

INTRODUCTION TO PROFESSIONAL DEVELOPMENT ON NRS DATA MONITORING FOR PROGRAM IMPROVEMENT

Audience

This professional development sequence has three distinct audiences:

1. State-level staff (administrators, professional development coordinators, and data managers) who are responsible for statewide planning, management, and dissemination of information and procedures related to the NRS;

2. Professional development specialists who are responsible for rolling out the training statewide to local programs; and

3. Local program administrators, professional development coordinators, data managers, and instructors.

Purpose

The purpose of this professional development sequence is to help state and local literacy program personnel identify and define the interrelationships between data and program performance, explore ways to monitor local programs to strengthen the connection between performance and data, and identify and implement program improvement efforts.

This sequence can be adapted to meet the needs of individual states and local programs. It can be used as a train-the-trainers program in which state-level staff offer the workshop to key personnel (e.g., professional development specialists and data facilitators) who will then repeat the training for local program administrative and instructional staff. Information and activities from this training also can be selected and offered to meet the needs of specific audiences. For example, professional development for instructors may focus on using data to inform instruction. Professional development for local program administrators may focus on using data for enhancing performance or for making program improvements. Users of this training sequence are encouraged to adapt and augment activities accordingly.


Workshop Overview

Objectives: By the end of this professional development sequence, participants will be able to:

1. Describe the importance of getting involved with and using data;

2. Identify four models for setting performance standards as well as the policy strategies, advantages, and disadvantages of each model;

3. Determine when and how to adjust standards for local conditions;

4. Set policy for rewards and sanctions for local programs;

5. Identify programmatic and instructional elements underlying the measures of educational gain, NRS follow-up, enrollment, and retention;

6. Distinguish between the uses of desk reviews and on-site reviews for monitoring local programs;

7. Identify steps for monitoring local programs;

8. Identify and apply key elements of a change model; and

9. Work with local programs to plan for and implement changes that will enhance program performance and quality.

Time: The total time required for this workshop is approximately 12 hours of instructional time, or 2 full days of 6 hours each (not including lunch and break times). The 12 hours of training are further divided into 4 discrete segments of 3 hours each. The entire sequence, therefore, may be conducted over 2 consecutive days or delivered in individual 3-hour segments over a 2-week period, thereby affording participants the opportunity to integrate the knowledge and skills gained into their work activities.
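As a quick sanity check of the time breakdown above (a trivial sketch using only the figures given in the paragraph):

```python
# Schedule arithmetic from the Time paragraph: 12 hours of instruction,
# delivered either as 2 full days of 6 hours or as 4 segments of 3 hours.
SEGMENT_HOURS = 3
NUM_SEGMENTS = 4
HOURS_PER_DAY = 6

total_hours = SEGMENT_HOURS * NUM_SEGMENTS       # total instructional time
full_days = total_hours // HOURS_PER_DAY         # consecutive-day delivery
segments_per_day = HOURS_PER_DAY // SEGMENT_HOURS

print(total_hours, full_days, segments_per_day)  # 12 2 2
```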

Materials Checklist:

Overhead projector (for use with transparencies) OR

Laptop and LCD projector (for use with CD-ROM)

Copies of participant handouts for each participant

PowerPoint presentation on CD-ROM or transparencies of PowerPoint slides

Facilitator’s Notes and Supplements

Flipchart, flipchart stand, and marking pens

Blank transparencies and transparency pens


Preparation Checklist

Reserve space for the training.

Duplicate handouts.

Download PowerPoint from the NRS Web site <www.nrsweb.org> and create a CD-ROM for use during the workshop, or make overhead transparencies from the PowerPoint slides. Run copies of the PowerPoint slides as handouts, 3 to a page.

Assemble participant packets with copies of handouts and PowerPoint slides.

Make nametags and/or name tents for participants.

Prepare attendance sheet.

Pre-divide attendees into groups (state teams or local program teams or instructor teams, depending on the audience) for small group activities.

Arrange for food and beverages, as appropriate.

Arrive 1/2 hour before training is scheduled to begin.

Check equipment to ensure that it is working properly.

Pre-label flipchart pages, one of the following headings per page:

    Expectations—Setting Performance Standards
    Expectations—Monitoring
    Parking Lot Issues
    The Motivation Continuum (5 or 6 pages labeled with this same heading, each with an arrow leading from Intrinsic to Extrinsic: Intrinsic → Extrinsic)
    Matrix of States’ Preferred Standard-setting Models
    Reward Structures
    Suggested Sanctions
    May Produce Unintended Effects
    Questions to Ask Local Program About Educational Gain
    Questions to Ask Local Program About NRS Follow-up
    Questions to Ask Local Program About Retention
    Questions to Ask Local Program About Enrollment
    Needs/Resources
    Participant Feedback (this flipchart should have two columns, one labeled pluses (+) and one labeled deltas (Δ))


WORKSHOP OUTLINE

DAY 1

I. Welcome, Introduction, Objectives, Agenda Review (50 minutes)
    A. Welcome and Introductions
    B. Professional Development Objectives/Agenda/Expectations
    C. Parking Lot Issues
    D. Terms and Definitions
    E. Workshop Evaluation Form
    Materials: PPT-1—PPT-5; flipchart pages for Expectations—Setting Performance Standards, Expectations—Monitoring, and Parking Lot Issues; S-2a, b, and c; H-16

II. The Power of Data
    A. Why Get Engaged with Data? (30 minutes)
    Materials: PPT-6; PPT-7; flipchart pages for The Motivation Continuum; H-3; H-4

BREAK (15 minutes)

    B. The Data-driven Program Improvement Model (25 minutes)
    C. Setting Performance Standards for Program Quality (60 minutes)
    Materials: PPT-9—PPT-24; H-5; H-6; S-1 (answers to H-5); flipchart page for Matrix of States’ Preferred Standard-setting Models

LUNCH (60 minutes)

    D. Adjusting Standards for Local Conditions (30 minutes)
    E. Shared Accountability with Appropriate Rewards and Sanctions (60 minutes)
    Materials: PPT-25—PPT-33; H-7—H-10; flipchart pages for Reward Structures, Suggested Sanctions, and May Produce Unintended Effects

BREAK (15 minutes)

III. Getting Under the Data: Performance Measures and Program Processes (60 minutes)
    Materials: PPT-34—PPT-40; H-11a, b, c, and d; 4 flipchart pages: (1) Questions to Ask Local Program about Educational Gain, (2) Questions to Ask Local Program about NRS Follow-up Measures, (3) Questions to Ask Local Program about Retention, (4) Questions to Ask Local Program about Enrollment

IV. Day 1 Evaluation and Wrap-up (15 minutes)
    Materials: flipchart page for Parking Lot Issues; flipchart page of pluses and deltas (+ and Δ)

DAY 2

V. Agenda Review for Day 2 (30 minutes)
    Materials: flipchart page for Parking Lot Issues; PPT-4; PPT-5

VI. Planning for and Implementing Program Monitoring
    A. Presentation and Discussion (20 minutes)
    B. Small Group Work on Data Sources (60 minutes)
    Materials: PPT-41—PPT-55; H-12

BREAK (15 minutes)

    C. Small Group Reports (30 minutes)
    D. Steps and Guidelines for Monitoring Local Programs (25 minutes)
    Materials: PPT-56; H-13

LUNCH (60 minutes)

VII. Planning for and Implementing Program Improvement
    A. A Model of the Program Improvement Process (20 minutes)
    B. Bringing It Home: The Start of a State Action Plan (60 minutes)
    Materials: PPT-57—PPT-66; H-14

BREAK (15 minutes)

    C. Sharing Action Plans (45 minutes)

VIII. Closing and Evaluation (30 minutes)
    A. Review Parking Lot Issues
    B. Identify Additional Resources
    C. Reflection
    D. Workshop Evaluation
    Materials: flipchart page for Needs/Resources; H-16a, b, and c

BEFORE THE WORKSHOP

The following tasks should be completed before the workshop:

Send out flyers announcing the workshop and the dates.

Send out confirmation letters to those who have registered for the workshop. Tell them that, in preparation for the workshop, they should meet with other persons (from their state or local program) who also will be attending the workshop. Ask them to come prepared to give a 5-minute report that highlights their state or program data. (See S-3 for a sample letter.)

Duplicate all handouts for the session (H-1 through H-16) and arrange them into participant packets. By providing a packet of materials to each participant, you can avoid constant distribution and handling of materials during the workshop.

Download the PowerPoint slides from the NRS Web site (www.nrsweb.org) and create a CD-ROM for use during the workshop or make overhead transparencies from the PowerPoint slides (PPT-1 through PPT-69).

Pre-label flipchart pages for activities, as indicated in the Preparation Checklist and in the Facilitator’s Notes.

Order all equipment (overhead projector, screen, flipcharts). If you plan to use a CD-ROM instead of overhead transparencies, be sure that you will have a laptop computer and LCD projector available for the session. Check the equipment to ensure that it is working properly. Also check the size of the screen and the clarity of print from the back of the room.

Prepare nametags or name tents for participants.

Make signs or folded cards for each group (state names if this session is for national training, local program names if this is for state training, class names [e.g., ESL, ABE, GED, Workplace Literacy, Family Literacy, EL Civics] if this is for instructor training).

Arrange for a place to hold the workshop session and ensure that it has sufficient space and moveable chairs for break-out activities. Consider the room arrangement that will best facilitate your activities. For this workshop, it is suggested that, if possible, the room arrangement consist of table rounds that each seat from 5 to 8 persons.

Prepare a participant sign-in sheet to verify attendance. Include spaces for participants’ names, program names, addresses, phone and fax numbers, and e-mail addresses. This will be useful if you need to make future contact with participants.

Arrange for refreshments and lunch, as appropriate.

Read the Facilitator’s Notes for the workshop, pages 1-12.

Review the handouts (H-1 through H-16), the PowerPoint slides (PPT-1 through PPT-70), and the Supplements (S-1 through S-4).


FACILITATOR’S NOTES

FACILITATOR’S NOTES: DAY 1

I. Welcome, Introduction, Objectives, Agenda Review for Day 1 (50 minutes)

A. Welcome and Introductions (30 min.)

Materials: PPT-1

Welcome participants to this professional development workshop on NRS Data Monitoring for Program Improvement (PPT-1). Have each of the facilitators introduce themselves and make a brief statement about their background and expertise in either professional development or the management of data collection and reporting. Then ask participants to introduce themselves. If participants are few in number, they can introduce themselves one by one to the large group, stating their names, programs, and positions. Move the activity along, allowing each person to speak for only a minute. If the group is large, ask participants to pair up and share background information (name, program, position). As an optional activity, ask for a show of hands from those who are local program directors, instructors, or professional development coordinators. Ask whether there are other roles represented in the group and what those roles are.

B. Professional Development Objectives/Agenda/Expectations (15 min.)

Materials: PPT-2—PPT-5; flipchart pages titled Expectations—Setting Performance Standards and Expectations—Monitoring

Show participants PPT-2 and outline for them the workshop objectives for Day 1. Then show PPT-3, Workshop Agenda. Quickly summarize the activities that will be part of this workshop and state their relationship to the expected outcomes. Also show PPT-4 and PPT-5, the objectives and agenda for Day 2, but do not spend as much time on them as you did on Day 1. You will show these again on Day 2; the reason for showing them now is to give participants a sense of the objectives of the full 2-day workshop.

Ask participants to consider one question they want answered about setting performance standards and one question they want answered about monitoring before this workshop is over. After about 5 minutes, sample responses from the group and record them on the flipcharts. Continue listing expectations until there are no more responses.

NOTE: It is not necessary that every participant respond to this question; it’s likely that some participants will have expectations that have already been listed.

Refer to the flipchart list and identify for participants those topics that have been planned for in this workshop, those that have not been planned for but that can be addressed easily during the workshop, and those, if any, that are outside the realm of this workshop. To the extent possible, identify resources that participants can access for information about those content issues that will not be covered in this workshop.

C. Parking Lot Issues (2 min.)

Materials: flipchart page titled Parking Lot Issues

Tell participants that they will keep a “Parking Lot” of issues and questions that arise that are related to the NRS and assessment but not directly related to this workshop on NRS data monitoring for program improvement. To the extent possible, those questions and issues will be addressed at the end of this workshop.

NOTE: Post on the wall a flipchart page marked “Parking Lot Issues.” Also place a Post-It pad on each table. Ask participants, throughout the workshop, to write their questions on Post-It Notes and place them on the flipchart. They may also use H-15 to keep notes of their parking lot issues and to record any revelations or “light bulb” moments they have during the workshop.



D. Terms and Definitions (2 min.)

Materials: S-2a, b, and c

Point out that this professional development sequence at times uses technical terms common to the National Reporting System. A Glossary of these terms (S-2a, b, and c) is provided in the Supplement to make the reading as clear as possible.

E. Evaluation Form (1 min.)

Materials: H-16

Call participants’ attention to the evaluation form (H-16). Remind them that they will be asked to complete it at the end of the workshop.

II. The Power of Data

A. Why Get Engaged with Data? [Warm-up Exercise] (30 minutes)

Materials: PPT-6—PPT-8; Post-It Notes; H-3; H-4; 5-6 flipchart pages of H-3, The Motivation Continuum, posted about the room

Divide participants into teams of 3 to 5 people. Show PPT-6 and PPT-7 and ask each team to consider the question, “Why is it important to be able to produce evidence of what your state (or local) adult education program achieves for its students?” Provide a supply of Post-It Notes to each team and ask the team to record each reason it identifies on a separate Post-It Note. Allow approximately 10 minutes for this activity. (10 min.)

Now show PPT-8. Refer to H-3 and to the large wall charts titled The Motivation Continuum. Ask each team to arrange the factors it has identified on one of the large wall charts, in order from those factors that are internally motivated to those that are driven by external forces. (5 min.)

Ask each team to briefly report to the whole group any problems or questions it had to resolve or consider before placing one or more factors on the Motivation Continuum. Allow time for discussion. (10 min.)

Refer to H-3 again and ask participants individually to record on the arrow the factors they believe are most meaningful. Tell them that these are their own personal motivators for getting engaged with data. Refer them to H-4 and ask them to complete the sentence, “I can be motivated to work with our data if I remember that…” Point out that, when state or local program team members share their motivating factors with one another, this can be a powerful, unifying activity for the team in determining next steps in setting policy, monitoring programs, and initiating program improvement efforts. (5 min.)

BREAK (15 minutes)



B. Overview of the Data-driven Program Improvement Model (25 minutes)

Materials: PPT-9—PPT-14; H-5; Supplemental Handout S-1

Tell participants that the relationship between data and program quality is dynamic: data, in the form of performance standards and other goals, not only measure program performance but can change it as well. Referring to PPT-9, PPT-10, and PPT-11, describe the steps of a model for data-driven program improvement. The process begins with the setting of standards that define acceptable levels of performance. Underlying the performance measures (or data) are the Powerful Ps, or the program elements of policies, procedures, processes, and products. It is these program elements underlying the data that can be observed and monitored with the aim of improving performance. State and local teams, acting collaboratively, can plan and implement program improvements by making changes to policies, procedures, processes, and products. (5 min.)

Now show PPT-12, which displays educational gains for ESL levels and performance standards for one adult education program. Point out that the program exceeded its targets for three of the ESL levels, failed to meet them for two levels, and did not serve any students at the high-advanced ESL level. Ask participants, working in pairs, to use H-5 to identify questions that they, as program monitors, would want to ask of local program staff. Allow 5 minutes for this activity and then sample responses from the whole group. Accept all answers. When responses from the group seem exhausted, show PPT-13 and refer participants to Supplemental Handout S-1, which lists possible questions; invite them to review this list to see whether it includes their questions. Tell them that this list represents only a subset of all possible questions that a visiting team might ask in monitoring program data and performance. (15 min.)

Tell participants that they are now ready to examine more closely the four areas of this workshop: (1) setting performance standards, (2) examining the elements underlying the data, (3) program monitoring, and (4) program change. Before proceeding, ask whether participants have any questions or comments. After responding to any questions, ask participants to get ready to begin their journey into harnessing the power of data… (Show PPT-14.) (5 min.)
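The PPT-12 reading described above (which levels exceeded their targets, which missed them, and which served no students) can be sketched as a small comparison routine. The level names and percentages below are invented for illustration, not the slide’s actual data:

```python
# Hypothetical targets and actuals (percent of students achieving an
# educational gain) for one program, by ESL level. None = no students served.
targets = {
    "ESL Beginning Literacy": 40, "ESL Beginning": 42, "ESL Low Intermediate": 45,
    "ESL High Intermediate": 44, "ESL Low Advanced": 48, "ESL High Advanced": 50,
}
actuals = {
    "ESL Beginning Literacy": 46, "ESL Beginning": 49, "ESL Low Intermediate": 51,
    "ESL High Intermediate": 39, "ESL Low Advanced": 41, "ESL High Advanced": None,
}

def review(targets, actuals):
    """Flag each level as met/exceeded, missed, or not served -- the same
    reading a monitoring team would make before drafting its H-5 questions."""
    status = {}
    for level, target in targets.items():
        actual = actuals.get(level)
        if actual is None:
            status[level] = "not served"
        elif actual >= target:
            status[level] = "met/exceeded"
        else:
            status[level] = "missed"
    return status

for level, verdict in review(targets, actuals).items():
    print(f"{level}: {verdict}")
```

With these invented numbers, the sketch reproduces the pattern on the slide: three levels met or exceeded, two missed, one not served.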

C. Setting Performance Standards for Program Quality (60 minutes)

Materials: PPT-15; PPT-16

Show PPT-15 and tell participants that an accountability system can measure quality accurately only when it contains the following four elements:

    An underlying set of goals that the program is to achieve;
    A common set of measures (either qualitative or quantitative) that reflect the goals;
    Performance standards tied to the measures; and
    Sanctions or rewards for programs, tied to performance.

Now show PPT-16 and explain that the goals of the federally funded adult education program (e.g., literacy skills development, lifelong learning, employment) are reflected in the NRS core outcome measures of educational gain, GED credential attainment, entry into postsecondary education, and employment. (20 min.)


Materials: PPT-17—PPT-24; H-6; flipchart page for Matrix of States’ Preferred Standard-setting Models; each state’s prior performance data

Show PPT-17 while reminding participants that each state sets its performance standards in collaboration with the U.S. Department of Education. A state’s performance, however, is a reflection of the aggregate performance of all the programs it funds. At this point, distribute to each state team a copy of its prior performance. Be sure that you give each state only its own data, and not data from any other state or program. Suggest that each state team review and consider its negotiated performance standards throughout this workshop. Tell them that states may soon be required to set standards for their local programs and to monitor local programs with the aim of meeting or exceeding the state’s performance standards and improving program quality. (20 min.)

Show PPT-18 through PPT-22, which outline four different models for setting performance standards: continuous improvement, relative ranking, external criteria, and return on investment. Discuss each slide, inviting comments from participants. The slides are self-explanatory; however, do not simply read them. Participants will be more engaged in the presentation if you elaborate on some of the slides with comments and anecdotes and also invite comments from participants. Survey the group to determine whether any state currently uses a specific standard-setting model for local programs, and ask for comments on the successes and the challenges faced in the standard-setting process.

Now show PPT-23 and PPT-24 and refer participants to H-6. Ask participants, in their state teams, to consider the questions on H-6. Allow 20 minutes for this. Then ask each state to report the performance standard-setting model(s) it is leaning toward and to state the reason for this decision. Record the states’ responses by checking the selected model(s) on a flipchart matrix similar to the following, and post the chart on a wall for the duration of the workshop. Ensure that participants know that they are not committing to the model(s) they select; this activity is just for the purpose of getting them familiar with the different models and their uses. (15 min.)

Matrix of States’ Preferred Standard-setting Models

    State | Continuous Improvement | Relative Ranking | External Criteria | Return on Investment

Ask participants in what way the standard-setting model(s) they selected represent a policy statement on the relationship between performance and quality that they want to instill in local programs. (Answer: The continuous improvement model means the state wants every program to improve; the relative ranking model implies a more uniform level of quality across programs; etc.) Ask whether there were any Aha! moments or light bulbs that went off during this process. Conduct a brief discussion. Tell participants that, after lunch, they will look at ways to adjust standards for local conditions and will set rewards and sanctions. (5 min.)
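The slides, not this manual, define the four standard-setting models, so the sketch below is one hypothetical way each model could be turned into a concrete target; every formula and number here is an assumption for illustration only:

```python
# Illustrative interpretations of the four standard-setting models
# (continuous improvement, relative ranking, external criteria, return on
# investment). None of these formulas comes from the manual.

def continuous_improvement(prior_rate, increment=2.0):
    # Each program is asked to improve on its own prior performance.
    return prior_rate + increment

def relative_ranking(all_rates, percentile=0.5):
    # Targets pegged to where programs rank against one another,
    # e.g. every program must reach the statewide median.
    ranked = sorted(all_rates)
    return ranked[int(percentile * (len(ranked) - 1))]

def external_criteria(benchmark=45.0):
    # Target taken from an outside benchmark, regardless of past performance.
    return benchmark

def return_on_investment(outcomes, dollars, target_per_10k=30.0):
    # Performance judged per dollar spent: outcomes per $10,000 of funding.
    return (outcomes / dollars) * 10_000 >= target_per_10k

rates = [38.0, 41.0, 44.0, 47.0, 52.0]       # invented prior rates for 5 programs
print(continuous_improvement(41.0))          # 43.0
print(relative_ranking(rates))               # 44.0
print(external_criteria())                   # 45.0
print(return_on_investment(120, 35_000))     # True
```

Seen side by side, the functions make the policy differences concrete: the first rewards any improvement, the second enforces uniformity across programs, the third ignores history, and the fourth ties performance to funding.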

LUNCH 60 minutes



D. Adjusting Standards for Local Conditions (30 minutes)

Materials: PPT-25; PPT-26; H-7

Welcome participants back following the lunch break. Tell them that they now need to consider whether the standard-setting models they have selected will have the desired effect on all programs. Show PPT-25 and tell them that research on the effective use of performance standards suggests that standards often need to be adjusted for local conditions before local programs can work to improve program quality. Ask if anyone can tell you why this is so. (Answer: Standards that are set at the wrong level will not work—they will be either too easy or too difficult for the program to meet, and they will not result in program improvement.) Show PPT-26 and explain that three main factors affect program performance and may require adjusting standards for local conditions:

    student characteristics,
    local program elements, and
    external conditions.

For example, the state standard may be too high for a local program that serves predominantly lower-level students, experiences a sudden influx of refugees, or sees a dramatic increase in student enrollment when the community’s largest employer closes its doors and moves out of state. The state may find it helpful to adjust literacy standards for a program that emphasizes special content, such as workplace skills. Likewise, programs in areas of high unemployment may need lowered standards for “entered and retained employment.” And when natural disasters affect student attendance or the availability of services, standards may need to be adjusted. (10 min.)

Now refer participants to H-7, sample scenarios. Tell them that each scenario represents a local program’s claim that it cannot meet the state-set performance standards. Divide the group into teams of 4-5 people and assign one scenario to each team. Ask each team to consider its scenario and to propose (1) a strategy for verifying the accuracy of the local program’s claim, and (2) a suggested solution or way to respond to the local program. Have each team select a recorder and a reporter. Allow 10 minutes for the team work and then ask each team to report to the whole group its strategy and suggested solution for its assigned scenario. (20 min.)
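A minimal sketch of the adjustment idea above, assuming a state chose to express adjustments as percentage-point deductions with a policy floor. Both assumptions, and all the numbers, are invented; the manual prescribes no formula:

```python
# Hypothetical adjustment of a state standard for documented local conditions.
# The three factor categories come from PPT-26; the values are illustrative.

def adjusted_standard(state_standard, adjustments):
    """Apply percentage-point adjustments for verified local conditions,
    never letting the target fall below an assumed policy floor."""
    floor = 0.5 * state_standard                      # assumed floor: half the standard
    adjusted = state_standard + sum(adjustments.values())
    return max(adjusted, floor)

# A program serving predominantly beginning-level students during a sudden
# influx of refugees (both conditions verified by the monitoring team):
adjustments = {
    "student characteristics (mostly beginning-level)": -4.0,
    "external conditions (refugee influx)": -3.0,
}
print(adjusted_standard(45.0, adjustments))   # 38.0
```

The floor mirrors the point made in the discussion: a standard set too low is as unproductive as one set too high, so adjustments should be bounded.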

E. Shared Accountability with Appropriate Rewards and Sanctions (60 minutes)

Materials: H-8

Tell participants that the section of the training they are now beginning may be one of the most critical to the ultimate success (or failure) of their program improvement and reform efforts: without local involvement and cooperation, every initiative launched by the state will be met with resistance and will be doomed to failure. Refer state teams to H-8 and ask them to consider past efforts their state has initiated and to identify those that were successful and those that local programs resisted. Ask if they can identify elements that led to the success or failure of these initiatives. Allow 10 minutes for the state teams to work and then sample responses from the total group. It is not necessary for each state team to report here. (15 min.)


Materials Activities Times

PPT-27 through PPT-30

PPT-31

H-9PPT-32

Flipchart (Reward Structures)

Flipchart (Suggested Sanctions)Flipchart (May Produce Unintended Effects)

H-10

Show PPT- 27 through PPT-30 on Shared Accountability. The table on PPT-30 has a horizontal and a vertical dimension (or axis), each indicating movement from low to high. The horizontal axis represents state control and the vertical axis represents local involvement. The table cells representing the intersection of state guidance and support with local involvement show possible effects. In other words, low state control coupled with low degree of local involvement will likely result in stagnation; and high state control coupled with low degree of local involvement will likely result in local animosity and resistance, etc. If state control is delivered in the spirit of providing guidance and support, and if local involvement means that locals truly have a hand in identifying and designing program improvement efforts, then the local program will yield ever-increasing quality performance. In this scenario, everybody wins: students achieve their goals, programs win recognition and increased funding, and states increase overall performance. Tell participants that states must consider the use of appropriate rewards and sanctions for local programs. Show PPT-31 and ask them which they think is the more powerful motivator—rewards or sanctions? (Answer: Research clearly indicates that rewards are more effective than sanctions in promoting program improvement.) Ask them how sanctioning might be counter-productive? (Answer: The pressure created from sanctions such as partial loss of funding may prompt undesirable behavior by locals, such as limiting enrollment to higher-level students or placing students in inappropriately low levels.) Explain that such questionable tactics designed to yield high performance results are what is known as unintended effects—that is, the action does not benefit students; it is put in place for the sole purpose of avoiding harsh sanctions on the program.) Refer participants to H-9, Variations on a Theme, and show PPT-32. 
Ask them to work in groups of 3 or 4 to brainstorm possible reward structures and possible sanctions for local programs that meet or fail to meet their performance standards. Each group should select a recorder and a reporter. Ensure that there is an ample supply of Post-It Notes and marking pens on each table. Instruct the recorder to write one response per Post-It Note, using one color of Post-It Note for the reward structures and a different color for the suggested sanctions. Allow 10 to 15 minutes for this activity and monitor the groups to determine when to call “Time.”

Invite one group to read one of its rewards Post-It Notes and to place the note on the Reward Structures flipchart. Ask other groups whether they also had “variations on the theme” (e.g., monetary rewards) and to post these on the Reward Structures flipchart in a column under the first note. When all groups have posted their rewards related to monetary incentives, invite another group to read one of its remaining rewards (e.g., a published honor roll of programs that met or exceeded their performance standards) and repeat the process. Continue until there are no remaining rewards to be posted.

Then repeat the process for the sanctions, inviting groups to place their Post-It Notes on the Suggested Sanctions flipchart. For each sanction that is read, ask the whole group to serve as an applause-o-meter, clapping for the gentle sanctions and gonging for sanctions that may be too harsh. Place those sanctions that the group considers too harsh on the flipchart labeled May Produce Unintended Effects.

Conduct a discussion of the various sanctions and rewards. Tell each state team that, following the break, it is to complete the planning worksheet on H-10, in which the state makes preliminary decisions about the reward structures and sanctions it will put in place. The team should also identify the stakeholders it will include, the process for making final decisions on rewards and sanctions, and how it will turn those decisions into policy.

(10 min.)

(15 min.)

(15 min.)

(5 min.)

BREAK 15 minutes


Materials Activities Times

H-10 Allow 15 minutes for state teams to complete the activity described before the break, using H-10. Then ask whether any team wants to share its strategies and perhaps spark ideas for other states.

20 minutes

PPT-33 In summary, show PPT-33, telling participants that the state process of setting local performance standards consists of the following five steps:

1. Select a standard-setting model;
2. Set rewards and sanctions policy;
3. Review performance levels for local adjustment;
4. Provide technical assistance to locals in an atmosphere of shared accountability; and
5. Monitor performance often.

5 minutes

III. Getting Under the Data: Performance Measures and Program Processes 60 minutes
PPT-34, PPT-35

PPT-36 through PPT-39
PPT-40
4 flipchart pages posted about the room:

(1) Questions to Ask Local Program about Educational Gain

(2) Questions to Ask Local Program about NRS Follow-up Measures

(3) Questions to Ask Local Program about Retention

(4) Questions to Ask Local Program about Enrollment

Next to each flipchart, post one of the data displays from H-11a, b, c, d, and e.

Show PPT-34 and PPT-35. Tell participants that, for the purpose of this workshop, we will consider four sets of measures:

1. Educational gain;
2. The NRS follow-up measures of obtained a secondary credential, entered and retained employment, and entered postsecondary education;
3. Retention; and
4. Enrollment.

Under each set of measures lie the programmatic and instructional decisions and procedures that affect program performance and quality. Show the corresponding data pyramids on PPT-36 through PPT-39.

Post the flipchart pages indicated at the left at various places in the room. Next to each flipchart, post a copy of one of the data displays found on H-11 (a through e).

Now show PPT-40 and refer participants to the Data Carousel activity (H-11a, b, c, d, and e). Have participants, in groups of 4 or 5, visit each of the carousel displays and develop questions about the data and the underlying elements. Each group should identify as many underlying elements as possible that affect the data and the program’s performance and write its questions on the flipchart provided. Then each group should make recommendations for improving the program’s performance.

Inform participants that they should spend approximately 15 minutes at each display (for a total of 1 hour) and that each group should be prepared to report on its findings in the morning. As each group rotates on the carousel to different stops in the room, it should read the questions generated by other groups and add its own questions.

Note: Keep these carousel stops with the questions posted for the duration of the workshop.


Materials Activities Times

IV. Day 1 Evaluation and Wrap-up 15 minutes

Tell participants that they have now reached the mid-point in the workshop sequence on NRS Data Monitoring for Program Improvement. Review for them the content that they covered and the activities that they engaged in during this first day of the workshop on getting to understand and appreciate the value of data:

 Warming up to data through the Why Get Engaged with Data exercise;

 Selecting standard-setting models (from the four models of continuous improvement, relative ranking, external criteria, and return on investment);

 Adjusting performance levels to meet local circumstances of student characteristics, local program elements, and external conditions;

 Determining ways to share accountability with locals;

 Setting policy for rewards and sanctions; and

 Examining the programmatic and instructional decisions and procedures underlying the data.

Flipchart (Parking Lot Issues)

Ask if there are any questions about the day’s workshop. Respond to questions that you can answer on the spot. If there are questions that will take some research before you can answer or policy questions that you must refer to another source, be sure to add the questions to the “Parking Lot Issues.” These will be addressed at the end of the workshop sequence.

Flipchart page of pluses and deltas, + and Δ

Now tell participants that you would like to “take the temperature” of the group concerning Day 1 activities and content by doing an informal pluses-and-deltas exercise. On a flipchart page, make two columns, one with a plus sign [+] and one with a delta [Δ]. Ask them to call out those things that they liked about today’s workshop. Accept all comments and write them under the [+] column. When there are no more responses, ask them to identify those things that they felt could have been improved about today’s workshop. Again, accept all comments and write them under the [Δ] column. Tell them that you appreciate and take their comments seriously and that, to the extent possible, you will attempt to address those items in the [Δ] column that are under your control throughout the remainder of the workshop sequence.

Thank them for their participation and enthusiasm, tell them that you look forward to seeing them tomorrow (or at the next scheduled workshop), and give them the date and location for the next workshop.


FACILITATOR’S NOTES: DAY 2

Materials Activities Times

V. Agenda Review for Day 2 30 minutes
Flipchart (Parking Lot Issues)

Welcome participants back and ask if there are any questions or residual issues from Day 1 for which participants would like clarification before moving on to Day 2 activities. Respond to those questions and issues that relate to the content of this workshop on NRS Data Monitoring for Program Improvement. Use the “Parking Lot Issues” page to list those issues that are outside the realm of this workshop. Tell participants that the parking lot issues will be addressed at the end of this workshop.

(5 min.)

PPT-4

PPT- 5

Show PPT-4, Workshop Objectives for Day 2. Tell them that this slide reviews the same objectives that were stated at the beginning of this workshop sequence, but that you are specifically highlighting the objectives for today’s activities. Also show PPT-5, Workshop Agenda for Day 2. Quickly summarize the activities that will be part of this workshop and state their relationship to the expected outcomes.

Ask for feedback on yesterday’s Carousel activity. Have one team report its findings and invite other teams to comment or to add questions about the data and suggestions for program improvement. Continue until each of the four carousel stops has been discussed, approximately 5 minutes per stop.

Now tell participants that all they have accomplished so far in this workshop will serve as background information for the next important responsibility of states: local program monitoring.

(5 min.)

(20 min.)

VI. Planning for and Implementing Program Monitoring

PPT-41
PPT-42

PPT-43
PPT-44

PPT-45

A. Presentation and Discussion

Show PPT-41 and ask participants how or why they think local programs might benefit from a state’s policy of conducting regular monitoring of local program performance and measuring it against performance standards. Sample a few responses and then show PPT-42, discussing each of the points. Encourage participants to share stories and experiences they have had related to these points in conducting local monitoring.

Show PPT-43 and PPT-44. Tell them that you understand that the idea of monitoring may seem overwhelming to states with already overburdened staff, but that monitoring can be manageable and can provide an excellent opportunity to work with locals on program improvement.

When showing PPT-45, ask whether any states currently include in their monitoring visits a discussion of whether locals are meeting performance standards and whether the monitoring is structured to encourage program improvement. If no states currently measure performance against performance standards, ask whether and how they currently monitor any aspect of local programs. Sample responses, and then ask whether the process of measuring performance against performance standards could fit into their existing monitoring structures.

NOTE: Be prepared to hear that some states conduct no monitoring at all. Accept all responses without making judgmental statements. The purpose of this workshop is to provide tools for states to collaboratively set performance standards with locals, to measure local performance against those standards, and to plan and initiate programmatic improvement.

20 minutes


Materials Activities Times
PPT-46
PPT-47, PPT-48
PPT-49 through PPT-54
PPT-52

PPT-55
H-12

Show PPT-46 and discuss the difference between desk reviews and on-site monitoring, and then the advantages and disadvantages of each (PPT-47 and PPT-48). Then, showing PPT-49 through PPT-54, discuss the various data collection strategies for monitoring (Program Self-Reviews, Document Reviews, Observations, and Interviews). PPT-52 shows the difference between quantitative data (collected primarily via desk reviews) and qualitative data (collected primarily via on-site reviews).

B. Small Group Work on Data Sources

Now show PPT-55 and ask participants, in groups of 4 or 5, to review H-12 and to fill in the data sources that they would use for each indicator and the strategies they would use if they were conducting (1) a desk review, and (2) an on-site review. Ask all groups to consider all nine indicators, but assign each group only one or two indicators to report on to the whole group. Ask each group to select a recorder and a reporter.

Allow 1 hour for this activity before reconvening the group for the reports. Tell them that they also have the scheduled break time (an additional 15 minutes) to use if they need it. Each group will have 2 minutes to report on one indicator (or, if groups have each been assigned two indicators, a total of 4 minutes).

60 minutes

BREAK 15 minutes

PPT-56
H-13

NRS Data Monitoring for Program Improvement Guide, pp. 50-55

C. Small Group Reports

Ask each group to report on its indicator(s) and the data sources and monitoring strategies for on-site and desk reviews. Allow only 2 minutes per report on each indicator. Following the report on each indicator, ask the whole group to comment or, if they wish, to suggest additional NRS data sources and other vehicles for conducting the desk and on-site reviews.

D. Steps and Guidelines for Monitoring Local Programs

Show PPT-56 and conclude this section on Planning for and Implementing Program Monitoring with a review of the steps and guidelines for monitoring local programs, H-13. The steps are as follows:

1. Identify state policy for monitoring and gather support from those who have a stake in the monitoring results;

2. Consider past practices when specifying the scope of work for monitoring;
3. Identify persons to lead and participate in monitoring;
4. Identify resources available for monitoring locals;
5. Determine the process for collecting data, with clearly defined criteria for rating;
6. Conduct monitoring;
7. Report findings and recommendations; and
8. Follow up on results.

More information about each of these steps can be found on pages 50 through 54 of the guide, NRS Data Monitoring for Program Improvement (2004). Point out other states’ models and procedures for monitoring (e.g., Pennsylvania, Tennessee) beginning on page 55 in the guide.

Tell participants that, following lunch, state teams will have time to consider these steps and the guidelines listed on H-13 and to begin to plan a process for local monitoring. Use this time before lunch to clarify any issues related to the workshop content presented thus far and to allow participants to share their concerns about, as well as their past experiences in, monitoring local programs. Now is also a good time to review any of the issues or questions posted on the Parking Lot.

30 minutes

25 minutes

LUNCH 60 minutes

VII. Planning for and Implementing Program Improvement


Materials Activities Times

PPT-57
PPT-58

PPT-59
PPT-60
PPT-61

PPT-62

PPT-63 through PPT-66
H-14a and b

A. A Model of the Program Improvement Process

Tell participants that they now are entering the final phase of this workshop series: planning and implementing program improvement based on what they learn through program monitoring. Remind participants that, so far, they have reviewed strategies for setting performance standards for program quality, adjusting standards for local conditions, and setting policy for appropriate rewards and sanctions. They also have examined the programmatic components and policy decisions underlying the measures of educational gain, NRS follow-up, retention, and enrollment, and they have considered strategies for conducting monitoring through both desk reviews and on-site reviews, as well as the stakeholders they need to include in setting policy related to data monitoring.

Show PPT-57 and remind participants once again that data can be of considerable use to state and local programs. However, as indicated in PPT-58, data are useful only if they are valid and reliable, if the state and locals ask appropriate questions after reviewing the data, and if data analysis leads to wise decisions.

Before moving on, remind participants about the change process: it is a process, not an event, which means that change does not happen overnight. Show PPT-59, the factors that allow us to accept change; PPT-60, the stages of change; and PPT-61, a word of caution from the State Superintendent of Schools in Spokane, Washington.

Now show PPT-62 and describe the four steps of a program improvement process:

1. Planning;
2. Implementing;
3. Evaluating; and
4. Documenting lessons learned and making adjustments, as needed.

B. Bringing It Home: The Start of a State Action Plan

Show PPT-63 through PPT-66 and refer participants to H-14a and b. Ask them to work in their state teams to consider the questions on H-14a and b as they begin to plan a model for monitoring local programs. Tell them that they will have 1 hour for planning. They also may use the 15 minutes of scheduled break time, if they need it.

When the group reconvenes following the break, each state is to report on the team’s plans, the potential problems it anticipates, and the strategies it plans to use to mitigate those problems. Teams are to be prepared to make 5-minute reports on their planned changes to the whole group.

20 minutes

60 minutes

BREAK 15 minutes

C. Sharing Action Plans

Ask representatives from each team to report on the team’s plans for implementing monitoring and program improvement processes. Allow 5 minutes for each report and encourage questions from the other teams. After all teams have reported their plans, ask if anyone would like to make any general observations about the reports. For example: Do states anticipate similar obstacles in setting policy for implementing a program monitoring and improvement process? Have states come up with vastly different models for involving local programs? Ask participants if the various state reports have generated ideas that they can use, and whether state teams will modify their plans based on the reports made by other states.

45 minutes

VIII. Closing and Evaluation 30 minutes

A. Review Parking Lot Issues


Materials Activities Times

Collect the flipchart pages with the Parking Lot Issues that were posted at the beginning of the workshop. Review the lists to determine whether these questions have been answered during the workshop. Provide answers to unanswered questions or, if the questions need to be referred to others or require research, give participants an approximate date by which they can expect to receive either the answers or referrals to other information sources. Also ask whether participants have any questions, items, or issues that still need to be clarified.

B. Identify Additional Resources

Flipchart (Needs/Resources)

Mark a flipchart page “Needs/Resources.” Ask participants to name additional resources that they need to implement the changes they identified in their action plans. These might include additional training, online resources, policy changes, and so on. The purpose of this activity is to provide information to the drivers of change and policy at the federal and state levels to help them in their planning. Tell participants that their brainstormed list of needs and additional resources will be compiled and mailed to all workshop participants. Allow approximately 20 minutes for this activity.

C. Reflection

Provide closure to the workshop by asking participants to reflect on what they have learned and how they can apply the information they have discussed or acquired. Refer participants to the workshop objectives:

Describe the importance of getting involved with and using data;

Identify four models for setting performance standards as well as the policy strategies, advantages, and disadvantages of each model;

Determine when and how to adjust standards for local conditions;

Set policy for rewards and sanctions for local programs;

Identify programmatic and instructional elements underlying the measures of educational gain, NRS follow-up, enrollment, and retention;

Distinguish between the uses of desk reviews and on-site reviews for monitoring local programs;

Identify steps for monitoring local programs;

Identify and apply key elements of a change model; and

Work with local programs to plan for and implement changes that will enhance program performance and quality.

H-16a, b, and c

D. Workshop Evaluation

Direct participants’ attention to H-16a, b, and c (Workshop Evaluation). Ask participants to complete the evaluation. Thank them for attending and participating, and tell them that you look forward to seeing them at the next workshop.


PARTICIPANT’S HANDOUTS

WORKSHOP OBJECTIVES

By the end of this professional development sequence, participants will be able to:

Day 1

1. Describe the importance of getting involved with and using data;

2. Identify four models for setting performance standards as well as the policy strategies, advantages, and disadvantages of each model;

3. Determine when and how to adjust standards for local conditions;

4. Set policy for rewards and sanctions for local programs; and

5. Identify programmatic and instructional elements underlying the measures of educational gain, NRS follow-up, enrollment, and retention.

Day 2

1. Distinguish between the uses of desk reviews and on-site reviews for monitoring local programs;

2. Identify steps for monitoring local programs;

3. Identify and apply key elements of a change model; and

4. Work with local programs to plan for and implement changes that will enhance program performance and quality.

WORKSHOP AGENDA

H-1

Day 1

I. Introductions, Objectives, Agenda Review for Day 1

II. The Power of Data

Why Get Engaged with Data? (exercise and the motivation continuum)

Overview of the Data-driven Program Improvement Model (questioning data exercise)

Setting Performance Standards for Program Quality (presentation on four models for setting standards and exercise)

Adjusting Standards for Local Conditions (scenarios)

Shared Accountability with Appropriate Rewards and Sanctions (variations on a theme exercise)

III. Getting Under the Data: Performance Measures and Program Processes (data pyramids and the underlying program components, decisions, and processes)

IV. Day 1 Evaluation and Wrap-up

Day 2

V. Agenda Review for Day 2

VI. Planning for and Implementing Program Monitoring

Differences between Desk Reviews and On-site Reviews

Data Sources for Each (exercise)

Steps and Guidelines for Local Program Monitoring

VII. Planning for and Implementing Program Improvement

The Change Process

A Model of the Program Improvement Process

State Action Planning

Sharing Reports

VIII. Closing and Evaluation

WHY GET ENGAGED WITH DATA?

H-2

Directions: Form a team of 3 to 5 people and consider the following question:

Why is it important to be able to produce evidence of what your

state (or local) adult education program achieves for its students?

Jot down as many responses as you can think of, writing one response on each Post-It Note.

Your workshop facilitator will provide you with a flipchart page titled The Motivation Continuum, similar to the chart that appears below. When your team can think of no more responses, take all the Post-It Notes your team has created and place them on The Motivation Continuum flipchart, ranging in order from those factors that are internally motivated to those factors that are driven by external forces.

The Motivation Continuum

Intrinsic Extrinsic

H-3

YOUR OWN PERSONAL MOTIVATORS

Directions: Complete the following sentence with as many things as apply to you personally.

I can be motivated to work with our data if I remember that…

1. ____________________________________________________________________________________________________________________________________________

2. ____________________________________________________________________________________________________________________________________________

3. ____________________________________________________________________________________________________________________________________________

4. ____________________________________________________________________________________________________________________________________________

5. ____________________________________________________________________________________________________________________________________________

6. ____________________________________________________________________________________________________________________________________________

When state or local program team members share their motivating factors with one another, it can be a powerful, unifying activity for the team in determining next steps in setting policy, monitoring programs, and initiating program improvement efforts.

H-4

QUESTIONS FOR CONSIDERATION

Directions: Examine the data in the following graph.

List all the questions you can think of to ask about this local program’s data and what the underlying reasons might be for the results.

____________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________

H-5

DECISION FOR STATE TEAMS: SELECTING A STANDARD-SETTING MODEL

Directions: In your state teams, consider the following questions:

1. Which model do you favor for setting standards for/with local programs? Why?____________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________

2. Is it appropriate for your state to use one statewide model or will you need to use different models for different programs?

____________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________

3. How will you involve the locals in setting the standards to which they will be held accountable?

____________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________

Consider question #4, but do not include it in your state report. We will discuss this with the entire group following the state reports.

4. How do the standard-setting models that states select represent a policy statement on the relationship between performance and quality that states want to instill in local programs?

____________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________

H-6

ADJUSTING LOCAL STANDARDS: SAMPLE SCENARIOS

Directions: Read the following scenarios. Each represents a local program’s claim that it cannot meet the state-set performance standards. In your small group, discuss how you would handle each claim by (1) proposing a strategy for verifying the accuracy of the claim, and (2) proposing a solution to the problem. Be prepared to report your team’s strategy and proposed solution to the whole group.

1. Continuous Improvement Model

Using a continuous improvement model, one state set performance standards for GED attainment for each local program at levels slightly higher than the previous year’s. However, in the previous year, several local programs had received grants to offer an extensive amount of “fast-track GED” instruction prior to the release of GED 2002, and, consequently, their secondary completion and GED rates soared. The “fast-track” grant is now over, and the programs think the current levels set by the state are too high and should be lowered, based on the levels they attained before the grant.

Your Strategy for Verifying the Accuracy of the Local Program’s Claim: ________________________________________________________________________________________________________________________________________________________________________

Your Solution to the Problem and Response to the Local Program: ________________________________________________________________________________________________________________________________________________________________________

2. Relative Ranking Model

Another state uses a relative ranking model to set local performance standards. In reviewing its student demographic data, one local program that failed to meet its educational gain performance standards found that it serves a high proportion of older learners. The state average age of ABE learners is 33, but the local program’s average student age is 49. The program requests that the state adjust its standards lower, based on the common belief that older learners do not make gains as quickly.

Your Strategy for Verifying the Accuracy of the Local Program’s Claim: ________________________________________________________________________________________________________________________________________________________________________

Your Solution to the Problem and Response to the Local Program: ________________________________________________________________________________________________________________________________________________________________________

3. External Criteria Model

The state legislature requires all adult education programs to show at least a 20 percent increase in the percentage of participants who get jobs. In response, the state increases the ‘entered employment’ standard for adult education programs by 25 percent over previous years. Several adult education programs claim that they cannot meet this standard because they serve significant numbers of learners who are already working, and the number of students with the goal of ‘obtain employment’ is low. If a program does not focus on employment skills, it cannot substantially increase its ‘entered employment’ rate.

Your Strategy for Verifying the Accuracy of the Local Program’s Claim: ________________________________________________________________________________________________________________________________________________________________________

Your Solution to the Problem and Response to the Local Program: ________________________________________________________________________________________________________________________________________________________________________

H-7

REFLECTION ON SUCCESS OF PAST EFFORTS

Directions: In your state teams, consider past policy changes that your state adult education office has initiated and asked local programs to comply with. Select one policy change that was well received by the locals and one policy change that was met with resistance from the locals. Then identify some of the factors that you think contributed to the success or failure of these initiatives. You have approximately 10 minutes for this exercise. Be prepared to share your responses with the whole group.

1. List a policy change (imposed by the state office on local programs) that was successful and well received by local programs. ____________________________________________________________________________________________________________________________________________

What factors contributed to the success of this effort?

______________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________

2. List a policy change (imposed by the state office on local programs) that was not successful and was met with resistance from local programs. ____________________________________________________________________________________________________________________________________________

What factors contributed to the poor reception of this effort?

______________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________

H-8

VARIATIONS ON A THEME

Directions: In a small group of 3 or 4 people, brainstorm as many possible rewards or incentives as you can for recognizing local programs that meet their performance standards. Write these in the left column below. Then brainstorm sanctions that the state might impose on local programs that do not meet their performance standards. Select a recorder for your group to write one reward per Post-It Note and one sanction per Post-It Note. When you have finished, wait for further instructions from the facilitator.

H-9

Reward Structures

Sanctions


STATE WORKSHEET: PLANNING FOR REWARDS AND SANCTIONS

Directions: In your state team, make some preliminary decisions about the rewards and sanctions that you might use to reward and motivate local programs to meet their performance standards. Consider the following questions.

1. What reward structures are you thinking of putting in place for local programs that meet their performance standards?__________________________________________________________________________________________________________________________________________________

2. What sanctions are you thinking of putting in place for local programs that fail to meet their performance standards?__________________________________________________________________________________________________________________________________________________

3. What timeline are you thinking of for putting the rewards and sanctions policy in place?__________________________________________________________________________________________________________________________________________________

4. What stakeholders will you include in the decision-making process about rewards and sanctions when you get back to your home state?__________________________________________________________________________________________________________________________________________________

5. Who in the state office will have primary responsibility for the following:

a. Announcing the new policy _____________________________________________

b. Reviewing local programs’ data _________________________________________

c. Determining the reward or sanction for each program ________________________

d. Providing support and technical assistance to the programs that need to improve ___________________________________________________________________

e. What will be the nature of the support and technical assistance provided by the state? ______________________________________________________________________________________________________________________________________

6. What obstacles or resistance do you foresee in putting this policy in place? How might you plan in advance to lessen these obstacles? _______________________________________________________________________________________________________________

7. What other factors, if any, do you need to consider that are specific to your state? __________________________________________________________________________________________________________________________________________________

H-10

DATA CAROUSEL

Directions: In teams of 4-5, visit each of the four carousel stops around the room. Each stop represents one of your local programs. Review the data table or chart displayed at each stop (also displayed on H-11b, c, d, and e). As you review the data at the first stop, ask yourselves: What, if anything, in these data is a cause for concern? What questions do you want to ask the local program about these results? Note: Your questions should try to get at the underlying elements that may be causing the performance problem(s). Write your questions on the flipchart provided at this stop, then proceed to the next carousel stop and repeat the process. At each stop you will see questions that other review teams have written as they rotate through the data carousel. Add your questions if they differ from those already on the flipchart. You may spend up to 15 minutes at each stop.

H-11a

DATA CAROUSEL—STOP #1

EDUCATIONAL GAIN: ABE LEVELS

Definition: Students who advance an NRS Level

                Number Advancing Level
                and Total Enrolled              % Completing Level    State
NRS Level       Program         State           Program    State      Performance
                Adv.   Total    Adv.    Total                         Standard
ABE Beg. Lit.     2     14       169      761     14%       22%         35%
ABE Beg.          9     26       339    1,284     35%       26%         29%
ABE Int. Low      6     31       589    2,060     19%       29%         30%
ABE Int. High    40     58       683    3,339     69%       20%         19%
ASE Low          35     46       385    2,044     76%       19%         21%
ASE High         24     34       307    1,062     71%       29%          —
TOTAL           116    209     2,470   10,550     55%       23%
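The “% Completing Level” figures above are simply the number of students advancing a level divided by total enrollment at that level. As a quick check of the arithmetic, here is a minimal Python sketch (the function name `completion_rate` is illustrative, not part of the NRS materials), using the counts from the ABE Beg. Lit. row:

```python
# Recompute the "% Completing Level" column: students advancing an NRS
# level divided by total enrolled at that level, rounded to a whole percent.
def completion_rate(advanced, enrolled):
    """Percent (rounded) of enrolled students who advanced a level."""
    return round(100 * advanced / enrolled)

program_rate = completion_rate(2, 14)     # program: 2 of 14 advanced -> 14
state_rate = completion_rate(169, 761)    # state: 169 of 761 advanced -> 22
standard = 35                             # state performance standard (%)

# A program falls short when its rate is below the standard.
meets_standard = program_rate >= standard
```

Reviewers can apply the same check to every row before asking the local program about levels that fall below standard.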

H-11b

DATA CAROUSEL—STOP #2

FOLLOW-UP: RECEIPT OF SECONDARY SCHOOL DIPLOMA OR GED (2001-2003)

Definition: The number of students who received a secondary school diploma or GED divided by the number of students who had that as a goal.

Year   Number with Goal   Number Achieved Goal   Percent Achieved Goal   Performance Standard
2001         150                   75                     50%                    60%
2002         120                   40                     33%                    70%
2003         110                   40                     36%                    70%
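The definition above is a simple ratio, and a minimal Python sketch (function and variable names are illustrative only) shows how the 2001 row works out and how far it falls below the performance standard:

```python
# Goal-achievement rate: students who received a diploma or GED divided
# by students who had that as a goal, rounded to a whole percent.
def achievement_rate(achieved, with_goal):
    return round(100 * achieved / with_goal)

rate_2001 = achievement_rate(75, 150)   # 75 of 150 achieved the goal -> 50
standard_2001 = 60                      # performance standard (%)
shortfall = standard_2001 - rate_2001   # percentage points below standard
```

Repeating the calculation for 2002 and 2003 shows the gap widening, which is exactly the kind of pattern a desk review should flag.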

H-11c

DATA CAROUSEL—STOP #3

RETENTION: AVERAGE HOURS ATTENDED

Definition: Total attended hours divided by number of enrolled students

                Average Attended Hours    State Performance
NRS Level       Program       State       Standard
ABE Beg. Lit.      40           90           70
ABE Beg.          120          120          100
ABE Int. Low       60          120           80
ABE Int. High     161          110          100
ASE Low            50          130           60
ASE High          150          140          100
TOTAL              80          120            —
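Reviewing this table means comparing the program’s average attended hours with the state standard, level by level. A minimal Python sketch of that comparison (the figures are the program/standard pairs from the table; the data structure is illustrative):

```python
# For each NRS level, compare the program's average attended hours with
# the state performance standard and flag levels that fall short.
levels = {
    "ABE Beg. Lit.": (40, 70),    # (program average hours, standard)
    "ABE Beg.": (120, 100),
    "ABE Int. Low": (60, 80),
    "ABE Int. High": (161, 100),
    "ASE Low": (50, 60),
    "ASE High": (150, 100),
}

# Levels where retention is below the state standard.
below_standard = [name for name, (avg, std) in levels.items() if avg < std]
```

The flagged levels are the ones to raise with the local program during the carousel discussion.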

H-11d

DATA CAROUSEL—STOP #4

Enrollment of Student Sub-Populations

                              Actual               Performance Standard
Target Population             Number   % of Total  Number   % of Total
ABE Beginning Literacy          80        19%        90        20%
ESL Beginning Literacy          52        12%        60        13%
On Public Assistance            90        21%        80        18%
Immigrant/Refugee               21         5%        35         8%
Aged 16-24                      25         6%        30         7%
Total Enrollment in Program    430         —        450         —
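The question at this stop is which sub-populations are under-enrolled relative to the standard, and by how many students. A minimal Python sketch (names are illustrative) using the actual/standard counts from the table:

```python
# For each target sub-population, compute how far actual enrollment
# falls short of the performance-standard count.
targets = {
    "ABE Beginning Literacy": (80, 90),   # (actual, standard)
    "ESL Beginning Literacy": (52, 60),
    "On Public Assistance": (90, 80),
    "Immigrant/Refugee": (21, 35),
    "Aged 16-24": (25, 30),
}

# Populations enrolled below standard, mapped to the size of the shortfall.
shortfalls = {pop: std - actual
              for pop, (actual, std) in targets.items()
              if actual < std}
```

Note that one population (students on public assistance) exceeds its standard and so does not appear in the shortfall list; the recruitment questions belong with the other four.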

H-11e

MONITORING PERFORMANCE USING INDICATORS OF PROGRAM QUALITY

Directions: In groups of 4-5, for each indicator in the table below, fill in the data sources that you would use, the questions you would ask in monitoring local programs, and the strategies you would use to conduct (1) a desk review and (2) an on-site review. Complete all 9 indicators in the table below. Select a recorder and a reporter for your group and be prepared to report to the whole group on the indicator(s) assigned to your group by the facilitator. You have 1 hour for this activity, plus 15 minutes for the scheduled break. Your group will have 2 minutes to report on each indicator assigned.

Table columns: Program Area in Indicators of Program Quality | NRS Data Source | Questions to Pose of Locals re: this Indicator | Strategies for Desk Review | Strategies for On-site Monitoring

Program Management (Data Reporting)

Recruitment

Goal Setting (Intake and Orientation)

Educational Gains

Assessment

Curriculum and Instruction

Persistence

Support Services

Professional Development

H-12

STEPS AND GUIDELINES FOR MONITORING LOCAL PROGRAMS

Table columns: Monitoring Steps | Implementation Guidelines | Examples

1. Identify state policy for monitoring.

Gather support from those who have a stake in the results.

Provide clear written guidelines to all levels of stakeholders on the scope of the monitoring activities (including process and timelines).

The state plan should be open to the public and shared at all levels. State plans often specify outcome measures and the frequency of evaluation.

2. Specify the scope of work for monitoring.

Use quantitative and qualitative data for effective monitoring.

Quantitative = look at outcome measures
Qualitative = look for evidence using program quality indicators

3. Identify individuals to lead and to participate in monitoring activities.

Consider the unique program features when identifying who should be involved from the local program and who should be part of monitoring team. Consider strength in diversity.

Local staff: practitioners, administrators, partners
External team members: content specialists, other educators, and staff from partnering agencies

4. Identify resources available for monitoring local programs.

With competing demands for resources (staff, time, and money for monitoring), consider formalizing a two-stage monitoring approach.

Desk reviews look at program data from a distance.
Onsite reviews look at data in context, to see first-hand how processes and operations lead to positive outcome measures.

5. Determine process for collecting data with clearly stated criteria for rating. Conduct monitoring activities.

Create and use standard tools for data collection and analysis. Monitors (state staff and team) need to fully understand the tools, their use, and the rating criteria.

Desk Reviews can include data, proposals, plans, reports, and program self-review.

Onsite reviews can include discussion of self-review, observations, interviews, and a review of files and documents.

6. Report on the findings, including recommendations.

Conclude onsite monitoring visits with a verbal debriefing followed by a written report.

The report might include a short description of the monitoring activities with supporting qualitative descriptions and quantitative information.

7. Follow up on the results. Given that the major purpose of monitoring is program improvement, follow-up is essential and should include an ongoing exchange between the state office and the local program.

Follow-up activities might include reviewing performance standards and program improvement, rewarding or sanctioning, and the beginning of technical assistance.

H-13

PLANNING AND IMPLEMENTING PROGRAM IMPROVEMENT

Directions: With your state team members, consider the following questions and begin to plan a program improvement policy and process. Be prepared to report on your plans to the whole group.

1. Who should be included on your program improvement team? List the positions from the state office as well as the local program. Anyone else, such as community members, learners, etc.?

2. How will you prioritize areas needing improvement when you review a local program’s data and find several areas that may need to be addressed?

3. How will you gain cooperation from locals in this process?

4. What type of training or professional development will be needed to get local buy-in?

5. How will you identify and select strategies for effecting improvement?

6. Who will be responsible for taking the lead on ensuring that the change is implemented?

7. How will expectations for the change be promoted and nurtured?

8. How will the change be monitored?

H-14a

9. How will the changes that are implemented be evaluated?

10. Who will interpret the results?

11. Who will be on the lookout for unintended consequences?

12. Who will document the process of what worked, what didn’t, and lessons learned?

13. What problems do you anticipate facing as you plan for and implement policy related to data monitoring for program improvement?

14. What solutions or precautions can you suggest to avoid having these problems become major ones?

15. What is your timeline or expected completion date for the following activities:

A. Setting performance standards?__________________________________________

B. Announcing the standards and asking locals to comply? ______________________

C. Developing a policy for adjusting standards for local conditions?________________

D. Developing a policy for rewards and sanctions?_____________________________

E. Developing a policy and process for monitoring local programs?________________

F. Developing a policy and process for effecting program improvements?___________

G. Other ______________________________________________________________

H-14b

H-15

Aha! Experiences
Parking Lot Issues

NRS DATA MONITORING FOR PROGRAM IMPROVEMENT WORKSHOP EVALUATION FORM

Date________________________ Location of Workshop______________________________________

State or Local Program Name_________________________________________________________________

Your Position (check all that apply):
___ Instructor   ___ Local Administrator   ___ Data Facilitator
___ Professional Development Coordinator   ___ State Director or State Staff
___ Other (identify) ___________________________________

1. The objectives of the NRS professional development packet were met

(not at all) 1 2 3 4 (completely)

The Power of Data

2. The “Why Get Engaged with Data?” exercise was

(not effective) 1 2 3 4 (highly effective)

3. The concepts and information presented in “The Data-driven Program Improvement Model” were

(not useful) 1 2 3 4 (highly useful)

4. The “Setting Performance Standards for Program Quality” exercise was useful in identifying standard-setting models we might use in our state.

(not at all) 1 2 3 4 (extremely)

5. The concepts and information presented in “Adjusting Standards for Local Conditions” were

(not helpful) 1 2 3 4 (extremely helpful)

6. The “Shared Accountability with Appropriate Rewards and Sanctions” exercise was useful for making decisions about the rewards and sanctions my state might put in place.

(not at all) 1 2 3 4 (extremely)

General comments about the Power of Data Section:

H-16a

Getting Under the Data: Performance Measures and Program Processes

7. The directions for the “Data Carousel” exercise were

(confusing) 1 2 3 4 (clear)

8. The concepts and information presented in the data pyramids were

(not helpful) 1 2 3 4 (extremely helpful)

General comments about the Getting Under the Data Section:

Planning for and Implementing Program Monitoring

9. The concepts and information presented in the Planning and Implementing Program Monitoring section were

(not useful) 1 2 3 4 (very useful)

10. The small group work on Data Sources was helpful in understanding the differences in monitoring local programs using desk reviews versus on-site reviews.

(not at all) 1 2 3 4 (extremely)

General comments about the Planning for and Implementing Program Monitoring Section:

Planning for and Implementing Program Improvement

11. The concepts and information presented in the “Model for Change” presentation and discussion were

(not useful) 1 2 3 4 (very useful)

12. The state action planning was helpful in getting my state started toward data monitoring for program improvement.

(not at all) 1 2 3 4 (extremely)

General comments about the Planning for and Implementing Program Improvement Section:

H-16b

Overall Comments

1. What were the most helpful features of the workshop? Please be specific.

2. What do you think were the least helpful features of the workshop?

3. What suggestions do you have for improving this professional development activity?


H-26

POWERPOINT SLIDES

NRS DATA MONITORING FOR PROGRAM IMPROVEMENT(PowerPoint Slides)

[Slides 1 through 70 are printed here as handout thumbnails, each accompanied by blank note-taking lines; the slide images are not reproduced in this text version.]
___________________________________

___________________________________

24

SUPPLEMENT

POSSIBLE QUESTIONS TO ASK WHEN EXAMINING THE DATA ON H-5

1. How were the performance standards set for this program? Were they based on past performance or some other criteria?

2. Are these standards appropriate given the pattern of performance the program has shown? For example, are the standards too low at the higher levels where performance greatly exceeded targets?

3. Is this performance pattern similar to that observed in previous years? If not, what has caused it to change? Will this affect setting of performance standards in the future?

4. What are the program’s assessment and placement procedures? What assessments are used for pre- and posttesting?

5. Is the program using the same assessment methods for high- and low-level ESL? If so, is this appropriate given the performance pattern?

6. What type of curriculum and instruction is the program offering? How does it differ by instructional level?

7. What are student retention patterns by level? Is retention affecting the differences in performance among students at different levels?

8. Could the program’s recruitment practices have had an influence on performance? How many students is it serving at each level?

9. Why are no students enrolled at the highest ESL level? Is this a result of recruitment, the type of classes the program offers, or placement procedures? Does the program need to change its recruitment practices?

S-1

GLOSSARY

Advancement: Learner advances from one educational functioning level to the next, based on the learner's performance on state-designated assessments.

Aggregation (or data aggregation): The process of combining reports from one level of administration into a single report at the next (e.g., combining local program reports into one statewide report).

Alternative assessments: Procedures and techniques used as an alternative to traditional testing. The focus tends to be on individual student growth over time, rather than on comparing students with one another.

Assessment: Measures of student progress, including standardized testing, teacher assessment, portfolios, checklists, etc.

Class level: The educational functioning level in which students are placed.

Contact hours: Hours of instruction or instructional activity the learner receives from the program. Instructional activity includes any program-sponsored activity designed to promote student learning in the program curriculum, such as classroom instruction, assessment, tutoring, or participation in a learning lab.

Continuous improvement: Model that uses past performance to set standards for the future and to plan for program improvement.

Data forms: A written or electronic document for collecting student information.

Data items: Individual questions or pieces of information contained on data forms.

Data quality: All states use the same definitions and coding categories for every data element in the NRS and follow the same step-by-step instructions on how and when to collect each data element.

Desk review: A structured way to look at program information related to outcomes. May include review of data, proposals, reports, budget, etc.

Descriptive measures: For the purposes of the NRS, descriptive measures may include student demographics, educational status, and goals.

Earn a high school diploma or achieve a GED: Obtaining a state-accredited secondary diploma/credential or passing the General Educational Development (GED) Tests.

Educational gain: Learner completes or advances one or more educational functioning levels from the starting level as measured at program entry or at the beginning of an instructional cycle.

Employed: Learners who work as paid employees, work in their own business or farm, or who work 15 hours or more per week as unpaid workers on a farm or in a business operated by a member of the family. Also included are learners who are not currently working but who have jobs or businesses from which they are temporarily absent.

English as a Second Language programs: Programs for limited English proficient students that focus on improving English communication skills such as speaking, reading, writing, and listening.

Enters employment: The learner obtains full- or part-time paid employment before the end of the first quarter after the program exit quarter.

S-2a

Enters post-secondary education or training program: The learner enters another education or training program, such as a community college, trade school, or four-year college or university.

Evidence: Data and documentation to support findings.

External criteria: Model for setting standards with a formula based on factors not directly related to a program's performance in the past or in relation to others.

Family literacy programs: A program with a literacy component for parents and children or other intergenerational literacy components.

GED: Certificate given to learners who attain passing scores on the General Educational Development (GED) Tests.

Generalizable: The extent to which a finding can be generalized to other populations or situations.

Goals: Information collected at intake about the main reason(s) a student is enrolling in the adult education program. Consider both long- and short-term goals. For NRS purposes, report goals that can be reached within the fiscal year.

Indicators of program quality: Measures that define policies and practices for effective adult education programs.

Improved employment: The learner maintains his or her current employment and receives an increase in pay, additional responsibilities, or improved job-related skills.

Level benchmarks: Guidelines for determining students' educational functioning levels based on performance on standardized tests.

Longitudinal data: Data measured consistently from year to year in order to track learner progress over time.

Mandatory program: A local, state, or federal program that requires a student to attend adult education classes, for example welfare, NAFTA, or probation.

Mandatory students: Students who are required to attend adult education classes because of their participation in some other local, state, or federal program, including welfare, NAFTA, job training, or probation. Mandatory students do not include students required to attend classes by their employer.

Mean: The arithmetic average of a set of scores, or the sum of observations divided by the number of observations.

Median: The middle score of a set of scores.

Mode: The most frequently occurring score in a set of scores.

NAFTA program: A federal program to assist workers displaced by the North American Free Trade Agreement (NAFTA).

Norm-referenced tests: Tests on which performance is interpreted in the context of the performance of a group with whom it is reasonable to compare the individual (for example, achieving at a 3.4 grade level).

On-site review: Monitoring local programs on site to verify data by looking at the processes and procedures being used. May include a review of files and recruiting materials, observations, interviews, etc.

Outcome measures: For the purposes of the NRS, core and secondary outcomes of adult education include learning gains, entry into post-secondary education and training, obtaining high school credentials, entering or advancing in employment, and other gains related to family, education, and community participation.

Performance standards (for states): Numeric levels established for outcome measures in the state plan indicating what proportion of students at each level will achieve each outcome.

Performance standards (for students): Statement that indicates how well or to what extent a student will demonstrate knowledge or skills.

Persistence: Student's ability to continue learning over time; the length of time a student remains engaged with learning.

Post-test: A test administered to a student at designated intervals during a program. It is usually used to measure progress or advancement in the program.

Pre-test: A test administered to a student upon entry into a program. It is usually used to determine level for placement.

S-2b

Probation: A situation in which a student is under the supervision of a court and may be required to attend classes.

Program (or program area): The main emphasis of instruction for a set of classes. Examples of program areas are ABE, GED, workplace literacy, ESL, family literacy, etc.

P.I.P. or P.I.T.: Program Improvement Plan or Program Improvement Team.

Qualitative data: Detailed data collected in the form of words or images that are analyzed for description and themes.

Quantitative data: Data used to describe trends and relationships among variables. Analysis of the data entails the use of statistics.

Relative ranking: Model for setting standards based on a program's rank relative to the state mean or median rating or score.

Reliability: The extent to which others would arrive at similar results if they studied the same case using the same procedures; evidence of consistency of a measure.

Researchable question: A research question that can realistically be answered with the skills and resources available.

Retain employment: The learner remains employed in the third quarter after the exit quarter.

Return on investment (ROI): Net value in relation to cost.

Rubric: A guide to evaluate a program (or student performance) on a scale with clearly defined criteria. Scales may be numeric (1 to 5) or descriptive (not evident to exemplary).

Standard deviation: A measure of the variability or spread of scores; the square root of the average of the squared deviations of the scores from the mean of the set of scores.

Student performance: Student attainment formally measured by some assessment method.

Student record system: A computerized or paper-based system for tracking student information related to intake information, goals, educational levels, attendance, achievements, and outcomes.

Student retention: Student attends the program long enough (persists) to show learning gains.

TANF: Temporary Assistance for Needy Families, a federal public assistance program.

Uniform system for collecting measures: All states and programs use the same methodology for collecting data on the measures. States certify validity through "data quality checklists."

Validity: The extent to which a research instrument measures what it purports to measure.

Variance: A measure of the variability of the scores in a frequency distribution; more specifically, the square of the standard deviation.

Voluntary students: Students who attend adult education classes of their own free will; they are not required to attend by any state agency.

Work-based project learner activity: A short-term course (at least 12 hours but no more than 30 hours) in which instruction is designed to teach work-based skills and in which the educational outcomes and standards for achievement are specified.

Workplace literacy programs: A program designed to improve the literacy skills needed to perform a job, conducted at least partly under the auspices of an employer.
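The statistical terms defined above (mean, median, mode, standard deviation, variance) can be illustrated with a short Python sketch. The test scores below are hypothetical; the population functions in Python's standard `statistics` module match the glossary's definitions (variance as the average of squared deviations from the mean, standard deviation as its square root).

```python
# Illustration of the statistical terms defined in this glossary,
# applied to a hypothetical set of five student test scores.
from statistics import mean, median, mode, pstdev, pvariance

scores = [72, 75, 75, 80, 88]

print(mean(scores))       # arithmetic average: 390 / 5 = 78
print(median(scores))     # middle score: 75
print(mode(scores))       # most frequently occurring score: 75
print(pvariance(scores))  # average of squared deviations from the mean: 31.6
print(pstdev(scores))     # square root of the variance (about 5.62)
```

Note that `pvariance` and `pstdev` treat the scores as the whole population, which is the sense used in the glossary; the sample versions (`variance`, `stdev`) divide by n - 1 instead of n.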

S-2c


LETTER TO SEND TO PARTICIPANTS PRIOR TO TRAINING

The Department of Education invites adult education staff to attend NRS Data Monitoring for Program Improvement. This workshop is one in a series of trainings designed to promote the quality and use of data collected under the National Reporting System (NRS). The goal of the workshop is to train staff on how to meet requirements for setting local program performance standards and how to better monitor and continuously improve their local programs using NRS data.

TOPICS

Adult education is facing a time of change, as new legislation replacing the Workforce Investment Act is under consideration. As part of these changes, states may soon be developing new state plans and setting performance standards for program accountability. States will be facing new challenges to monitor and continuously improve local program performance. To assist in these efforts, this workshop will cover the following topics:

Setting local performance standards – learn about performance standard setting models and how to select the model that best meets your state policies and promotes local program improvement.

Local monitoring – explore how standards can reflect program instruction, assessment, retention and other procedures and how to use performance standards as indicators of local performance.

Making change – learn about change models and how to make real changes in your program to improve program performance and student outcomes.

WHO SHOULD ATTEND

The intended audience for the training is adult education staff who are or will be responsible for setting performance standards, monitoring local programs, and providing technical assistance for program improvement. Staff responsible for conducting professional development on these topics may also wish to attend. To provide a rich training experience, we encourage you to send a team of up to three persons to this training.

COST

The cost of this training is $ , which covers…

LOCATION AND DATES

Locations and dates of the trainings are:

Attendance is limited to xx persons, so please register as soon as possible.

To register for the training, please return the attached registration form no later than … For questions about the training, please contact:

S-3

Alternative Monitoring Exercise: Facilitator’s Notes

GENERAL NOTES FOR FACILITATORS:

This alternative session design allows participants to use the spiral-bound guide, Data Monitoring for Program Improvement, to find information related to the purposes and strategies for monitoring. During Part One, participants can work in state teams or across state teams, using the guide to find information on different monitoring strategies that will link data collection with the Indicators of Program Quality. In Part Two, participants return to state teams to look at the monitoring steps the state already has in place and the steps that might require updating in the future.

Part One:

Time: 90 minutes

A. Introduce local program monitoring for program improvement. (10 min)
Materials: PPT 41-45

B. Small Group Work: use the monitoring guide for questions one and two on SH-12a, followed by more intensive practice using either SH-12b (a focus on three Indicators of Program Quality in more depth) or H-12 (a broader focus on nine Indicators of Program Quality). (60 min)
Materials: SH-12a; SH-12b or H-12

C. Small Group Report: share perspectives on the benefits of monitoring and identify the advantages and disadvantages of both desk and on-site reviews (5 min); allow each group to report how they might monitor one indicator (15 min). (20 min)

Part Two:

Time: 25 minutes

A. Review steps and guidelines for a state monitoring system. (5 min)
Materials: PPT 56

B. State Team Work: identify what the state already has in place and what steps need to be developed or revised. (15 min)
Materials: SH-12c

C. Whole Group: wrap up the monitoring section by asking participants to share any salient ideas, strategies, or challenges that surfaced during this segment of training. (5 min)

S-4a

Alternative Monitoring Exercise: Monitoring Local Programs

Directions for Search and Find in the spiral guidebook - NRS Data Monitoring for Program Improvement:

Work in state teams. Scan Chapter 4 for information to help complete the following exercises.

1. Identify 2-3 benefits of monitoring local programs (pages 37-38; suggested time 10 minutes).

FROM A STATE PERSPECTIVE… FROM A LOCAL PROGRAM PERSPECTIVE

2. The Monitoring Guide suggests that states can use a two-pronged approach to monitoring – desk reviews and on-site reviews (pages 38-40; suggested time 10 minutes).

Scan the information about the approaches and the advantages/disadvantages for each approach.

a. Brainstorm ways that your state could use desk reviews and on-site reviews.
b. Brainstorm any obstacles that your state might encounter with each approach.

Approach | Useful in our state | Possible obstacles in our state

Desk Reviews

On-Site Reviews

S-4b

Alternative Monitoring Exercise: Using Indicators of Program Quality to Monitor Programs

Refer to the NRS Guide to Data Monitoring, Table 4-4 on pages 46-48 and the Pennsylvania sample on pages 55-56. (Suggested time 40 minutes)

a. Select several of your state’s Indicators of Program Quality - exact wording is not necessary. Alternatively, select an indicator from one of the sample states included in the Appendix pages 73-78.

b. Complete the chart below for monitoring the outcomes and processes: identify your data sources and the questions to be answered, then outline effective strategies for desk reviews and on-site reviews.

1. Indicator of Program Quality

Data Sources (NRS and Local)

Questions to Pose with Local Staff

Strategies: Desk Review

Strategies: On-site Review

2. Indicator of Program Quality

Data Sources (NRS and Local)

Questions to Pose with Local Staff

Strategies: Desk Review

Strategies: On-site Review

3. Indicator of Program Quality

Data Sources (NRS and Local)

Questions to Pose with Local Staff

Strategies: Desk Review

Strategies: On-site Review

S-4c

Alternative Monitoring Exercise: Steps and Guidelines for Monitoring Local Programs

Directions: Scan the steps and identify what your state has in place and what needs to be done.

Column 4: outline the processes and products in place. Column 5: identify what needs to be developed.

Column headings: 1. Monitoring Steps | 2. Implementation Guidelines | 3. Examples | 4. Process/Products in Place | 5. To Be Developed

Step 1: Identify state policy for monitoring. Gather support from those who have a stake in the results.
Guidelines: Provide clear written guidelines to all levels of stakeholders on the scope of monitoring activities (including process and timelines).
Examples: The state plan should be open to the public and shared at all levels. State plans often specify (1) outcome measures and (2) frequency.

Step 2: Specify the scope of work for monitoring.
Guidelines: Use quantitative and qualitative data for effective monitoring.
Examples: Quantitative = look at outcome measurements; qualitative = look for evidence using program quality indicators.

Step 3: Identify individuals to lead and to participate in monitoring activities.
Guidelines: Consider the unique program features when identifying who should be involved from the local program and who should be part of the monitoring team. Consider strength in diversity.
Examples: Local staff: practitioners, administrators, partners. External team members: content specialists, other educators, and staff from partnering agencies.

Step 4: Identify resources available for monitoring local programs.
Guidelines: With competing demands for resources (staff, time, and money for monitoring), consider formalizing a two-stage monitoring approach.
Examples: Desk reviews look at program data from a distance. On-site reviews look at data in context, to see first-hand how processes and operations lead to positive outcome measures.

Step 5: Determine the process for collecting data, with clearly stated criteria for rating, and conduct monitoring activities.
Guidelines: Create and use standard tools for data collection and analysis. Monitors (state staff and team) need to fully understand the tools, their use, and the rating criteria.
Examples: Desk reviews can include data, proposals, plans, reports, and program self-review. On-site reviews can include discussion of the self-review, observations, interviews, and a review of files and documents.

Step 6: Report on the findings, including recommendations.
Guidelines: Conclude on-site monitoring visits with a verbal debriefing followed by a written report.
Examples: The report might include a short description of the monitoring activities with supporting (a) qualitative description and (b) quantitative data.

Step 7: Follow up on the results.
Guidelines: Given that the major purpose of monitoring is program improvement, follow-up should include an ongoing exchange between the state office and the local program.
Examples: Follow-up activities might include reviewing performance standards and program improvement, rewarding or sanctioning, and technical assistance.

S-4d