
Program and Compliance Management Workshop:
Common Measures NOT
VIRTUALLY


Outline

• Not Management Tools

• Intermediate Metrics

• Master Your MIS

• Share the Accountability


Not Management Tools … that is, common measures aren’t

• Accountability and comparison tools

• Performance is assessed annually, but States and Locals must assess their operations on an ongoing basis

• Common measures focus on outcomes, not operations or strategies

– Translation: Common measures focus on ‘bottom line’ results, not the drivers of performance


Not Management Tools (2) … that is, common measures aren’t

• The timing of data availability precludes utility for day-to-day management

– State and local staff should be able to respond to issues, as opposed to having to react

– Local staff need to answer the question: What changes in service design or delivery would enhance performance (including common measure outcomes)?


Managing Performance in Lieu of Federal Measures

• Managers and staff need measures that:

– Provide real-time information

– Deploy agency’s strategic plan and focus/align agency activities and efforts

– Test cause/effect relationships among program activities

– Evaluate center and system performance (not just program performance)


Developing Intermediate Measures

[Logic model: INPUT → PROCESS → OUTPUT → OUTCOME → IMPACT; national (federal) measures focus on the OUTCOME stage]

But many things happen before “the outcome” that can be measured. Some might even predict the ultimate outcome!


Developing Meaningful Metrics
Keep It in Mind!

• “Measurement is the first step that leads to control and eventually to improvement. If you can’t measure something, you can’t understand it. If you can’t understand it, you can’t control it. If you can’t control it, you can’t improve it.” (Harrington, The Improvement Process)

– Measure the right things, the right way, at the right time

• Many metrics are just waiting to be crafted based on your State’s strategic plan!


Examples

• Input Measures
– Measures related to outreach and recruitment
– Enrollment rates
– Measures related to the percentage of accepted referrals from other partners

• Process Measures
– % of individuals who don’t receive services for >30 days (see the sketch after this list)
– Extent of partnering/referrals for co-enrollments
– Timeliness of reports (internal or external)
– Employer repeat usage
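For illustration only, here is a minimal sketch of how a process measure such as the service-gap indicator above might be computed from an MIS extract. The file name (services.csv), the column names (participant_id, service_date), and the assumption that the extract covers only currently active participants are hypothetical placeholders, not features of any particular MIS (AJL, VOS, or otherwise).

```python
# Minimal sketch: share of active participants whose most recent service is
# more than 30 days old. Assumes a hypothetical MIS extract "services.csv"
# with one row per service and columns participant_id, service_date.
import pandas as pd

services = pd.read_csv("services.csv", parse_dates=["service_date"])

# Most recent service date for each participant
last_service = services.groupby("participant_id")["service_date"].max()

# Days elapsed since that last service, measured from today
days_since = (pd.Timestamp.today().normalize() - last_service).dt.days

gap_rate = (days_since > 30).mean()
print(f"Participants with no service in >30 days: {gap_rate:.1%}")
```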


Examples (2)

• Output Measures
– Completion rates
– # of exits with positive outcomes by ‘x’ time period
– Percentage of referrals to registered apprenticeships

• Outcome Measures
– Employment, Retention, Average Earnings, Earnings Change, Wage Replacement, Customer Satisfaction, Credential Attainment (program-specific or system-wide)
– Percentage of customers employed at exit and 30 days post-exit (see the sketch after this list)

• Impact Measures
– Measures of self-sufficiency, etc.
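Likewise, a hedged sketch of the outcome rates mentioned above, computed from a hypothetical exit-record extract. The file name (exits.csv) and the 1/0 flag columns (employed_at_exit, employed_30_days_post_exit) are illustrative and would need to be mapped to your state’s actual MIS fields and the federal reporting definitions.

```python
# Minimal sketch: employment rates at exit and 30 days post-exit from a
# hypothetical extract "exits.csv" with one row per exiter and 1/0 flag
# columns employed_at_exit and employed_30_days_post_exit.
import pandas as pd

exits = pd.read_csv("exits.csv")

rate_at_exit = exits["employed_at_exit"].mean()
rate_30_days = exits["employed_30_days_post_exit"].mean()

print(f"Employed at exit: {rate_at_exit:.1%}")
print(f"Employed 30 days post-exit: {rate_30_days:.1%}")
```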


You Must Master Your MIS
No Kidding!

• Whether it’s AJL, VOS, or another system

– Know what you have, what it can produce, how to get key information out of it, and understand any data issues (e.g., qualifications)

– Remember that “every ONE counts” (Session #1)

– Know your data-related policies (e.g., maximum timeframe for data entry; see the timeliness sketch after this list)

– MIS training and retraining are both necessary
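As a rough illustration of checking a data-entry timeliness policy, the sketch below flags records entered more than a set number of days after the activity occurred. The extract name (activity.csv), the column names (activity_date, entry_date), and the 10-day threshold are assumptions made for the example, not any state’s actual policy or schema.

```python
# Minimal sketch: flag records whose data entry exceeded an assumed policy
# threshold. Uses a hypothetical extract "activity.csv" with columns
# activity_date and entry_date.
import pandas as pd

MAX_ENTRY_DAYS = 10  # illustrative threshold, not an actual policy value

records = pd.read_csv("activity.csv", parse_dates=["activity_date", "entry_date"])

# Lag between when the activity happened and when it was keyed into the MIS
records["entry_lag_days"] = (records["entry_date"] - records["activity_date"]).dt.days
late = records[records["entry_lag_days"] > MAX_ENTRY_DAYS]

print(f"Records entered outside the {MAX_ENTRY_DAYS}-day window: "
      f"{len(late)} of {len(records)} ({len(late) / len(records):.1%})")
```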


The ‘Best’ MIS Training

• Hands on

• Marries data entry and staff “interaction” with the MIS to case management and program management (keep it real!)

• Demonstrates policies “in action”

• Individual and Group Exercises (e.g., case studies, mock participants)

• Reference materials for post-training


Share the Accountability
Contribution vs. Attribution

• The intent is for various partners to contribute their resources and services to meet the needs of employer and job-seeking customers

• We share the outcomes, so why not share the accountability that goes along with them?

– Push the accountability downward to the extent you can!

We discussed this in Session #1


Sharing Accountability … how?

• With system partners – through MOUs, for instance

• With One-Stop Operators – through RFP and contracting process, through local reviews

• With service providers – through RFP and subsequent contract provisions

• Within centers – through public sharing of data about other offices within the same LWIA

• …These are but a few examples


Example of Not Sharing Accountability

Remember: Contract Management is part of Performance Management

• Many contract vehicles lack appropriate protections, which vary depending on the context (e.g., probationary provisions for declining performance?)

– A state workforce agency could no longer finance a certain youth provider under WIA. When the state took over operations, the youth case files were not returned, leaving the state without key information by the time of a DV review.


In An Ideal World . . . (?)

• State/Local staff already collect the necessary data (consistently) to develop meaningful metrics (e.g., completion info/updates)

• Data entry is timely and accurate and staff understand the impact of timely/accurate reporting

• The data are part of the statewide MIS or another system that processes the data (not Hotel California)

• Management reports are readily produced/available to all staff

• Performance data are routinely discussed at staff meetings

• Data management is a priority
