
Page 1

2013-2014 Webinar series

February 26, 2014

3:30 – 4:30 p.m.

This training is supported by a Statewide Longitudinal Data Systems grant from the U.S. Department of Education.

Making it Stick: Going from Training to Implementation Practice

Page 2

• Review implementation areas of the Direct Access to Achievement rubric.

• Dig deeper into the descriptors from several implementation areas.

• Extend your understanding of the rubric by using the Operationalizing and Optimizing descriptors and process tools to assess teams’ current effectiveness and set improvement goals.

Webinar Focus

Page 3

Direct Access to Achievement Implementation Rubric

Implementation Areas:

• Leadership
• Problem-solving through data analysis
• Curriculum and instruction
• Assessment
• Positive school climate
• Family and community partnering

Indicators Grouped as:

• Structures
• Processes and procedures
• Professional development

Overview

Page 4

Why are the implementation descriptions grouped under these indicators?

• Structures
• Processes and Procedures
• Professional Development

Synthesis of the research!

Cosner, 2012; DuFour, DuFour, Eaker, & Karhanek, 2004; Hall & Hord, 2011; White, 2005

2012-2013 Webinar 1: A Leader's Role in Developing and Enhancing Collaborative Data Practices

Overview

Page 5

Rubric Descriptors

Each phase includes and extends the prior phase:
• Emerging: Establishing Consensus
• Developing: Building Infrastructure
• Operationalizing: Gaining Consistency
• Optimizing: Innovating and Sustaining

Overview

Page 6

Leadership—Structures

Administrative structures are important because they may help or hinder the systems that support learning conditions for teachers and students.

Dig Deeper into the Rubric

Page 7

Leadership Structures

Anchor and Guiding Question:

3. How are current policies and structures aligned with Direct Access to Achievement?

Dig Deeper into the Rubric

Page 8

Which statement(s) describe(s) an optimizing level of implementation?

A. Leadership and teams routinely use common observation tools to provide teams with actionable feedback.

B. Leadership and staff establish data/PLC teams. Existing teams and/or roles may be re-structured or combined to increase efficiency and effectiveness.

C. Leadership examines purposes and overlap among current school teams.

D. An effective team design is in place to maximize efficiency and to holistically analyze data to improve teacher effectiveness and student learning.


Dig Deeper into the Rubric

Page 9

Problem-solving through data analysis

A six-step process used to solve identified concerns.

Dig Deeper into the Rubric

Page 10

USE THE RUBRIC TO ASSESS THE LEVEL OF IMPLEMENTATION OF PROBLEM-SOLVING THROUGH DATA ANALYSIS. Establishing a baseline.

Using the Rubric

Page 11

How would you classify most of the data team meetings you observe?

A. Meetings where teachers talk about data and about assigning students to interventions, but only rarely about how they will change their core instruction.

B. Meetings where data are discussed and then blame is attributed to any number of factors, but rarely to instruction.

C. Meetings where teachers mostly discuss workday logistics and other issues, and only rarely reflect on data, interventions, or instruction.

D. Meetings where teachers regularly confront their prior assumptions about the effectiveness of their teaching against the evidence (data), share their prior instructional actions, and seek or offer help (as appropriate) to modify future instruction.

Using the Rubric—establishing teams’ baselines

Page 12

Leader observation—Most meetings are ones where teachers talk about data and about assigning students to interventions, but only rarely about how they will change their core instruction.

Initial scoring: observe team meetings.

Use rubric descriptors to differentiate where teams are in their progress toward ‘Optimizing’.

Work with teams to establish goals to move from where they are to the next level.

Using the Rubric—establishing teams’ baselines

Page 13

EVERYONE DOES THE BEST THEY CAN UNTIL THEY KNOW BETTER, AND THEN THEY DO BETTER.

If we expect teams to work toward increased effectiveness, then it is critical to identify where they are now and show them what “better” (increased effectiveness) looks like.

Using the Rubric—establishing teams’ baselines

Page 14

Assess current processes using the rubric

2. How is the 6-step data/PLC team process used by educators to improve outcomes for students?

Teams self-assess using the descriptions in the implementation rubric.

Leader or designated process observer assesses teams using the descriptions in the implementation rubric.

Using the Rubric—establishing teams’ baselines

Page 15

Mechanical to Mastery

Using the Rubric and Observation Tool

Mechanical Implementation of Direct Access to Achievement

Assess, Reflect, Develop, Adjust, Monitor, Reassess…

Mastery Implementation of Direct Access to Achievement

Page 16

Team Observation Tool

Team steps:

• Agenda and Minutes
• Norms and Participation
• Data Organization and Analysis
• Analysis of Strengths and Obstacles
• Goals
• Instructional Strategies
• Results Indicators
• Meeting Self-Reflection

Indicators provided:

• Descriptions of team actions that indicate Proficient or Exemplary behaviors.

Using the Rubric and Observation Tool

Page 17

Collect some process data!

• Use the observation tool to objectively collect information about the focus of data/PLC team meetings.

• Minutes & agendas reveal the focus and content of team discussions.

• The presence of SMART goals, strategies, and measures of fidelity of implementation reveals how much and how well teams are connecting data to instruction (see the sketch below).
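As one concrete (and hypothetical) illustration, the Python sketch below tallies how often selected indicators appear across observed meetings. The indicator names and meeting records are invented for illustration; they are not the actual Team Observation Tool categories or real observation data.

```python
# A minimal sketch of turning meeting observations into process data.
# Indicator names and tallies below are hypothetical examples.

from collections import Counter

# One entry per observed meeting: which indicators were evident.
observations = [
    {"SMART goal present", "data discussed", "instructional strategy named"},
    {"data discussed"},
    {"SMART goal present", "data discussed", "fidelity measure named"},
]

tally = Counter(i for meeting in observations for i in meeting)
total = len(observations)
for indicator, count in tally.most_common():
    print(f"{indicator}: {count}/{total} meetings ({100 * count // total}%)")
```

Counts like these give a simple, objective baseline that can be revisited after goal setting.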

Using the Rubric and Observation Tool

Page 18

Connecting what we’ve learned from analysis to changes in strategies used by the adults

Analysis of achievement, growth, behavior and climate data.

Changes in team processes, instructional strategies, management strategies and structures as needed.

• We’ve identified the problem areas; now how do we make them better?

Reflect on the Problem Identified

Page 19

Problem-Solving Through Data Analysis

A team that is operationalizing:
• Uses what they’ve learned from data analysis to guide changes in curriculum, instruction, and assessment.
• Collects and analyzes data on fidelity of curriculum and intervention implementation.

A team that is optimizing:
• Routinely uses the 6-step data process to adjust programming.
• Evaluates the quality of core instructional strategies, not just interventions.
• Evaluates systemic trends.

Reflect on the Problem Identified

Page 20

What percentage of the teams you observe are actually collecting data on the fidelity of their implementation?

A. 0-20%
B. 21-40%
C. 41-60%
D. 61-80%
E. 81-100%

What about you?

Page 21

System of PLCs: Goal
• Create a system of instruction & assessment that serves 80-90% of students (Tier 1).
• Create a system of intervention to support students in accessing grade-level instruction (Tier 2 & Tier 3).
• Create a system of data collection and analysis used for ongoing reflection on the following:
  - Student Growth
  - Student Learning/Effectiveness of Instruction
  - Effectiveness of Interventions

One School’s Example

Page 22

Year 1 of Implementation of PLC Teams

• Each grade level determined the data sources to analyze.
• Teachers continually contributed to the questions used in the analysis.
• Teachers used a variety of formative, interim, and summative assessments to develop a ‘whole’ picture:
  * State Exam Student Profile
  * Interim Exams
  * Classroom artifacts—work samples, writing samples, reading responses, reflections

One School’s Example

Page 23

PLC Routines—Fall
• Used multiple sources of data to determine a delivery system for each student.
• The delivery system guided decision making in the use of time, resources, personnel, and schedule.
• Criteria for “at risk” were determined based on triangulation of data.
• Students were placed in interventions and their progress monitored.

One School’s Example

Page 24

PLC Teams—Ongoing

Weekly PLC meetings used multiple sources of formative data to:
• Align curriculum and instruction horizontally and vertically.
• Identify what was working and what was not.
• Keep reflections and use them to improve the guaranteed and viable curriculum.

One School’s Example

Page 25

Were the interventions working?

One School’s Example

Page 26

PLC Teams—Mid-Year Progress Check

• Was the system of service working for each student?
• Focus on Growth.
• Goal: a minimum of 1 year of growth in 1 year.
• Use Quadrant Analysis: Growth and Level of Intervention (Instructional Group).

One School’s Example

Page 27

Procedure:

1. Choose two related variables and collect data.

2. Use the data to group students into quadrants (a brief sketch follows the chart below).

Quadrant Analysis

[Quadrant chart: vertical axis = Growth (High/Low); horizontal axis = Level of Intervention (Low/High). The four quadrants group students as High Growth/Low Intervention, High Growth/High Intervention, Low Growth/Low Intervention, and Low Growth/High Intervention.]
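For readers who want to see the grouping step concretely, here is a minimal Python sketch of the two-step procedure. The growth measure, intervention scale, cut points, and student records are all assumed for illustration; they are not the school's actual data or decision rules.

```python
# A minimal sketch of the quadrant-analysis procedure above.
# All names, thresholds, and records are hypothetical illustrations.

# Step 1: two related variables per student (assumed measures).
students = [
    {"name": "A", "growth_percentile": 72, "intervention_level": 1},
    {"name": "B", "growth_percentile": 35, "intervention_level": 3},
    {"name": "C", "growth_percentile": 80, "intervention_level": 3},
    {"name": "D", "growth_percentile": 20, "intervention_level": 1},
]

GROWTH_CUTOFF = 50       # assumed: at/above median growth counts as "High"
INTERVENTION_CUTOFF = 2  # assumed: Tier 2 or higher counts as "High"

def quadrant(student):
    """Step 2: use the two variables to assign a quadrant label."""
    growth = "High" if student["growth_percentile"] >= GROWTH_CUTOFF else "Low"
    support = "High" if student["intervention_level"] >= INTERVENTION_CUTOFF else "Low"
    return f"{growth} Growth / {support} Intervention"

groups = {}
for s in students:
    groups.setdefault(quadrant(s), []).append(s["name"])

for label, names in sorted(groups.items()):
    print(label, "->", names)
```

In this school's case, it was the low-growth/high-intervention group that raised the fidelity questions described on the next slides.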

One School’s Example

Page 28

WHAT WAS DISCOVERED? Students receiving interventions were not demonstrating growth as expected—little or no acceleration of their learning.

One School’s Example

Page 29

WHY WERE RESULTS NOT ALIGNED WITH EXPECTATIONS? The team looked at the fidelity of implementation of the delivery system.

They discovered the unexpected!

One School’s Example

Page 30

LESSON LEARNED: Walk in the shoes of the most “at-risk” students to ensure a cohesive plan vs. a disjointed day.

One School’s Example

Page 31

Curriculum and Instruction

2. How is the curriculum and instruction differentiated to meet student needs?

In this example, the curriculum and instruction were differentiated, but they weren’t meeting student needs!

Use the Rubric to Set Goals

Page 32

A. Meetings where teachers talk about data and about assigning students to interventions, but only rarely about how they will change their core instruction.

B. Meetings where data are discussed and then blame is attributed to any number of factors, but rarely to instruction.

C. Meetings where teachers mostly discuss workday logistics and other issues, and only rarely reflect on data, interventions, or instruction.

D. Meetings where teachers regularly confront their prior assumptions about the effectiveness of their teaching against the evidence (data), share their prior instructional actions, and seek or offer help (as appropriate) to modify future instruction.

What do we want it to look like?

Page 33

Force Field Analysis

Procedure:

1. Define the desired change

2. Brainstorm driving and restraining forces

3. Prioritize forces

4. Identify action steps
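As a minimal sketch of steps 2 and 3, the Python below records forces with strength ratings and sorts them. The force descriptions and the 1-5 rating scheme are illustrative assumptions; the tool itself does not prescribe a scoring scale.

```python
# A minimal sketch of force field analysis as a prioritization step.
# Force descriptions and 1-5 strength ratings are hypothetical examples.

desired_change = "Mature PLC discussions focused on improving core instruction"

# Step 2: brainstormed (description, strength) pairs.
driving = [("Dedicated meeting time", 4), ("6-step data process", 5)]
restraining = [("Trust issues on teams", 5), ("No common work-sample assessments", 3)]

def prioritize(forces):
    """Step 3: order forces from strongest to weakest."""
    return sorted(forces, key=lambda f: f[1], reverse=True)

print("Desired change:", desired_change)
print("Driving forces:", prioritize(driving))
print("Restraining forces:", prioritize(restraining))
# Step 4: action steps target the strongest restraining forces first.
```

Action steps (step 4) would then address the strongest restraining forces first, as the worked example on the next page does.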

Use a Process Tool to Inform Goal Setting

Page 34

Desired Change: Mature PLC team discussions focused on strategies to support improved core instruction.

Driving Forces:
• Dedicated time for meetings.
• 6-step process for using data to inform decisions about instruction.
• System of supports in place to promote effective team meetings.
• Specialists participate in team meetings to provide expertise for differentiating strategies for students.

Restraining Forces:
• Team interactions & trust issues make it hard to share concerns.
• Hard to find strategies for differentiating that are aligned to quality core instruction.
• Lack of common work sample assessments to form a basis for discussion of strategies anchored in student work.

Action Steps:
• Process observer uses the observation tool to observe team interactions.
• Align professional development to support development of teachers’ strategies.
• Provide half-day team time to develop a unit of strategies and common work sample assessments.
• Determine indicators to monitor implementation.
• Use the rubric and observation tool to reassess.

Force Field Analysis

Page 35

What have you learned from this school's example that applies to teams with which you are currently working?

• Add your comments in the polling window.

What about you?

Page 36

• The Direct Access to Achievement Implementation Rubric provides a tool for integrating initiatives and process tools within a system of continuous improvement.

• The rubric is useful for assessing team needs and setting goals for improvement of implementation.

Webinar Review

Page 37

Looking Ahead

Webinar 3: “What difference is this making? Evaluating program effectiveness and fidelity of implementation”

April 23, 2014

3:30–4:30 p.m.