
Quality Improvement Foundational Webinar Series

Webinar 4:

Testing Changes and Measuring Improvement

April 23, 2015

Presenters: Amanda Cornett, MPH, and Greg Randolph, MD, MPH

Objectives

• Review the QI framework and previous tools

• Discuss the importance of testing changes using the Plan-Do-Study-Act cycle

• Discuss the importance of measuring for improvement

• Apply the concepts to an STD-related issue

• Share ideas for using the QI tools in your work

General QI Problem Solving Method

[Diagram: a big, vague problem is worked through five steps to improved outcomes: assess the current condition; prioritize issues and set a target; clarify the problem; define possible changes; and test, implement, and sustain changes. Adapted from The Toyota Way (The 8 Steps of the Toyota Business Process).]

QI tools used across these steps:

• Value Stream Map
• Process Flow Diagram
• Swimlane Diagram
• Data Collection
• Measurement Plan
• Fishbone
• 5 Whys
• Evidence-Based Strategies
• Brainwriting
• Impact Matrix
• Plan-Do-Study-Act cycles

QI In Action

| Measure | Baseline | Goal |
| % of applications with errors | 96% | 48% |
| Average # of returns per application | 2 | 0 |
| # of days to process an application | 63 days | 30 days |

1. Conducted Gemba Walk & created VSM

2. Reviewed VSM & prioritized areas for improvement

3. Identified measures and goals


4. Identified potential changes to improve identified gaps (>30 changes)

5. Prioritized changes:
• Instructions for Completing App
• Instructions for Reviewing App
• Adapt Database to Track Apps
• Weekly Huddle Agenda
• Consultant on Call
• Electronic Budget (with error proofing)
• Staff/Customer Satisfaction Survey


6. Created sub-teams to develop and test changes:

• Cassandra and Angie: Instructions for Completing App; Instructions for Reviewing App

• Valarie and Lori: Database for Apps; Consultant on Call

• Arnett and Amanda: Weekly Huddle Agenda; Staff and Customer Survey

• Cathy and Donna: Electronic Budget


Prioritize Changes: Impact Matrix

• Used to prioritize changes and identify areas of focus

• Helps identify the areas that may have the biggest impact on goals quickly (a small scoring sketch follows below)
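The slides show the matrix as a grid on a flip chart, but the underlying prioritization logic can be sketched in a few lines of code. The snippet below is not part of the webinar: it scores a few of the candidate changes from the QI In Action example on impact and effort (a common pairing of axes; the webinar's own axes may differ), and the numeric scores are invented purely for illustration.

```python
# Illustrative only: rank candidate changes the way an impact matrix does,
# favoring high impact and low effort. The 1-5 scores are invented examples.
changes = [
    {"name": "Instructions for completing app", "impact": 5, "effort": 2},
    {"name": "Electronic budget with error proofing", "impact": 5, "effort": 4},
    {"name": "Weekly huddle agenda", "impact": 3, "effort": 1},
    {"name": "Consultant on call", "impact": 2, "effort": 3},
]

# Sort by impact (descending), then effort (ascending): the top of the list
# corresponds to the "do first" quadrant of the matrix.
prioritized = sorted(changes, key=lambda c: (-c["impact"], c["effort"]))

for rank, c in enumerate(prioritized, start=1):
    print(f'{rank}. {c["name"]} (impact {c["impact"]}, effort {c["effort"]})')
```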

Example: Impact Matrix


Impact Matrix on a Flip Chart


Testing Changes: PDSA Cycle

Plan:
• Objective of the cycle
• Questions/predictions
• Plan to carry out the cycle (who, what, where, when)

Do:
• Carry out the plan
• Document problems/unexpected observations
• Begin analysis of data

Study:
• Complete the analysis of data
• Compare data to predictions
• Summarize what was learned

Act:
• What changes are to be made?
• Adapt? Or abandon?
• Next cycle?


Rapid Tests of Change

[Diagram: hunches, theories, and ideas are refined through repeated PDSA cycles, moving from a very small-scale test, to follow-up tests, to wide-scale tests of change, to implementation of the change, yielding changes that result in improvement.]

*Langley GL, Nolan KM, Nolan TW, Norman CL, Provost LP.

Why Use the PDSA Cycle?

Helps you adapt good ideas to your specific situation:

– Forces us to think small

– Forces us to be methodical, make predictions, and learn

– Allows rapid adaptation and implementation of changes


PDSAs vs PDCAs

PDCA (Plan-Do-Check-Act)

– Originally proposed by Walter Shewhart

– Dr. Deming later modified it, changing "check" to "study"

– Check: analyze what happened

– Study: build knowledge; compare the data with the predictions and study the results

– Both are consistent with rapid-cycle testing and are often used interchangeably

– Sometimes used as a framework

• Graban, M. Lean Hospitals. 2009.

• Langley, et al. The Improvement Guide. 2009.

• Mears, P. Quality Improvement Tools & Techniques. 1995.

PDSA Cycle Example

Plan: If I use apple sauce instead of butter, my brownies will taste just as good as regular brownies.

Do: Made a batch of brownies for my husband and me, using apple sauce instead of butter.

Study: The texture of the brownies was the same. They did not taste bad, but they did not taste like regular brownies.

Act: Going to use apple sauce, but next time I'm going to use cinnamon apple sauce.


PDSA Cycle Example #2

[Diagram: the electronic budget form was refined through successive PDSA cycles, tested first with 2 CPHQ team members, then with a QI team member, then at 10 sites, then at 25 sites, ending in a change that results in improvement.]

*Langley GL, Nolan KM, Nolan TW, Norman CL, Provost LP.

Key Points for Successful Tests

• Design PDSAs for success! Take time to plan!

• Run initial PDSA cycles on the smallest scale possible.

• As you move to implementation, test under as many conditions as possible

– Special situations (e.g., busy days, “It will not work on Wednesdays”)

– Factors that could lead to breakdowns (e.g., different staff involved)


PDSA Tip #1: Scale Down

• Years

• Quarters

• Months

• Weeks

• Days

• Hours

• Minutes

• Number of clients

"Drop 2": when planning a test, drop down two levels on this scale (e.g., from months to days)


PDSA Tip #2: Tests of Changes in Parallel

Examples of changes tested in parallel:

• Express visits for STI screening

• Standardizing a sexual risk assessment tool

• Signing up patients for online reminders for re-testing

• Leaving out test kits and pre-printed labels

• Handing out pre-packaged medication for expedited partner therapy

PDSA Template


Key points:

• Using the template may seem tedious, but it pays off in the end! (A rough sketch of one possible template structure follows below.)

• Keeps the team focused and on track

• Provides an easy way to communicate to the rest of the SC

• Practice makes perfect!
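The webinar's PDSA template itself is not reproduced in this text, but a minimal sketch of the kind of fields such a template captures, following the Plan/Do/Study/Act prompts from the earlier slide, might look like the Python below. The class and field names are assumptions for illustration, not the official template.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PDSACycle:
    """One rapid test of change, mirroring the Plan/Do/Study/Act prompts."""
    objective: str                     # Plan: objective of the cycle
    predictions: List[str]             # Plan: questions and predictions
    who_what_where_when: str           # Plan: logistics of carrying out the cycle
    observations: List[str] = field(default_factory=list)  # Do: problems, surprises
    results_vs_predictions: str = ""   # Study: compare data to predictions
    lessons: str = ""                  # Study: summarize what was learned
    decision: str = ""                 # Act: adapt or abandon; next cycle

# Example, filled in with the brownie test from the earlier slide:
cycle = PDSACycle(
    objective="Test whether apple sauce can replace butter in brownies",
    predictions=["Brownies will taste just as good as regular brownies"],
    who_what_where_when="One batch, baked at home, tasted by two people",
)
cycle.observations.append("Texture was the same; taste was not quite the same")
cycle.decision = "Adapt: try cinnamon apple sauce in the next cycle"
print(cycle.decision)
```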

Value Stream Map

Fishbone

Brainwriting

Breakout Instructions

• As a small group:

– Review the changes identified to improve screenings

– Prioritize changes using the Impact Matrix

– Use the PDSA cycle template to think through how to test one of the changes

• Report out to the larger group:

– What were the "aha" moments?

– How can you use these tools in your daily work?


Measuring for Improvement

“All improvement is change, but not all change is an improvement.”

Why am I measuring? Improvement? Accountability? Research?

What's the Purpose?


| | Improvement | Accountability | Research |
| Purpose | Understand process; spur change; evaluate change | Comparison; assurance; spur change | New knowledge |
| Scope | Individual program or site | Entire organization or multiple sites | Universal |
| Measures | Few; easy collection; approximate | Very few; complex collection; precise & valid | Many; complex collection; very precise & valid |
| Time Period | Short | Long | Long |
| Sample Size | Small | Large | Large |
| Flexibility of Hypothesis | Flexible hypothesis; changes as learning takes place | No hypothesis | Fixed hypothesis |
| Testing | Sequential tests | No tests | One large test |
| Confounders | Consider but rarely measure | Describe and try to measure | Measure or control |
| Determining if Change is Improvement | Run charts or control charts | Not focused on change | Hypothesis, statistical tests (t-test, F-test, chi-square), p-values |

Sources:
• Institute for Healthcare Improvement. Science of Improvement: Establishing Measures.
• Solberg L, et al. The Three Faces of Performance Measurement: Improvement, Accountability, and Research.
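The table above notes that improvement work judges change with run charts or control charts rather than formal hypothesis tests. As a rough illustration only (the weekly screening percentages below are invented, not webinar data), a basic run chart with a median line could be produced like this:

```python
# Illustrative run chart: weekly STD screening rate with a median line.
# The data points are invented; a real chart would use the project's own measure.
import statistics
import matplotlib.pyplot as plt

weeks = list(range(1, 13))
screening_rate = [40, 42, 41, 45, 47, 52, 55, 54, 60, 63, 66, 70]  # percent

median = statistics.median(screening_rate)

plt.plot(weeks, screening_rate, marker="o", label="% screened")
plt.axhline(median, linestyle="--", label=f"median = {median:.0f}%")
plt.xlabel("Week")
plt.ylabel("Females 16-24 screened (%)")
plt.title("Run chart: STD screening rate")
plt.legend()
plt.show()
```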

Purposes of Measurement in QI Projects

• Identify gaps/needs for QI project(s)

• Monitor progress toward project goals/aim

• Generate ideas for improvement

• Evaluate rapid tests of change (PDSAs)

• Monitor for sustainability after improvement


Monitoring Progress Toward Goals

• Usually requires more than one measure

• A balanced set of measures helps ensure that the whole system is improved:

– Linked to measurable goals in aim statement

– Show improvement quickly and include outcomes

– Monitor for unintended consequences


Types of Project Measures

• Outcome – Ultimate results we are trying to achieve

• Process – What we do to achieve the outcome

• Balancing – What we could "mess up" while trying to improve the process and outcome; monitors for unintended consequences (an illustrative measure set follows below)
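To make the three measure types concrete, here is a small, purely illustrative measure set for an STD screening project like the one used later in the Try It! exercise. The specific measures are assumptions for illustration, not the webinar's answer key.

```python
# Illustrative only: a balanced measure set for an STD screening project.
project_measures = {
    "outcome": [
        "% of females ages 16-24 screened for STDs (goal: 40% -> 80%)",
    ],
    "process": [
        "% of eligible visits with a completed sexual risk assessment",
        "% of eligible patients offered an express screening visit",
    ],
    "balancing": [
        "Average visit cycle time (did adding screening slow the clinic down?)",
        "Staff satisfaction with the new workflow",
    ],
}

for measure_type, measures in project_measures.items():
    print(measure_type.upper())
    for m in measures:
        print(f"  - {m}")
```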

Balancing Measures

• Purpose: Address the question: “Are we improving some parts of system at expense of others?”

• Example: staff satisfaction in the clinic when the overall project outcome is to decrease the cycle time of a visit

• Sources of balancing measures: skeptics who say, "Great idea, BUT…this could mess up X"


Key Features of Good Project Measures

• Include quantitative measures (outcome, process, and balancing) plus qualitative data and stories

• Meaningful and understandable to stakeholders

• Baseline levels not too high (no room to improve) or too low (a turn-off)

• Data must be perceived by your stakeholders, especially leaders, as valid


Tip for success!

Minimize measurement burden

• 3-5 measures (related to the goals in your aim statement)

• Keep data collection as simple as possible

• Measure frequently using small sample sizes (see the sketch below)
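As a rough sketch of what measuring frequently with small sample sizes can look like in practice, the snippet below estimates a weekly screening rate from a small random sample of eligible visits rather than a full chart audit. The sample size, the 55% screening probability, and the visit records themselves are all invented for illustration.

```python
# Illustrative only: estimate a weekly measure from a small random sample
# rather than auditing every chart. Sample size and records are invented.
import random

def weekly_screening_rate(eligible_visits, sample_size=20):
    """Estimate % screened from a small random sample of eligible visits."""
    sample = random.sample(eligible_visits, min(sample_size, len(eligible_visits)))
    screened = sum(1 for visit in sample if visit["screened"])
    return 100 * screened / len(sample)

# Fake week of visits: each record just notes whether screening occurred.
visits = [{"screened": random.random() < 0.55} for _ in range(180)]
print(f"Estimated screening rate this week: {weekly_screening_rate(visits):.0f}%")
```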


“You can’t fatten a cow by weighing it”

-Palestinian Proverb

Try It!

The ABC Health Department aims to improve STD screening for females ages 16-24 from 40% to 80% by December 2015.

• Outcome – What outcome measures should the team track?

• Process – What process measures should the team track?

• Balancing – What balancing measures should the team track?


Give Us Feedback!

Click Here to Evaluate
